Sample records for technical baseline database

  1. TWRS technical baseline database manager definition document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Acree, C.D.

    1997-08-13

    This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). This document will provide a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and the present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager.

  2. Owning the program technical baseline for future space systems acquisition: program technical baseline tracking tool

    NASA Astrophysics Data System (ADS)

    Nguyen, Tien M.; Guillen, Andy T.; Hant, James J.; Kizer, Justin R.; Min, Inki A.; Siedlak, Dennis J. L.; Yoh, James

    2017-05-01

    The U.S. Air Force (USAF) has recognized the need to own the program and technical knowledge within the Air Force concerning the systems being acquired in order to ensure success. This paper extends the authors' previous work [1-2] on the "Resilient Program Technical Baseline Framework for Future Space Systems" and the "Portfolio Decision Support Tool (PDST)" to the development and implementation of the Program and Technical Baseline (PTB) Tracking Tool (PTBTL) for the DOD acquisition life cycle. The paper describes the "simplified" PTB tracking model with a focus on the pre-award phases and discusses how to implement this model in PDST.

  3. TAPIR--Finnish national geochemical baseline database.

    PubMed

    Jarva, Jaana; Tarvainen, Timo; Reinikainen, Jussi; Eklund, Mikael

    2010-09-15

    In Finland, a Government Decree on the Assessment of Soil Contamination and Remediation Needs has generated a need for reliable and readily accessible data on geochemical baseline concentrations in Finnish soils. According to the Decree, baseline concentrations, referring both to the natural geological background concentrations and to the diffuse anthropogenic input of substances, shall be taken into account in the soil contamination assessment process. This baseline information is provided in a national geochemical baseline database, TAPIR, that is publicly available via the Internet. Geochemical provinces with elevated baseline concentrations were delineated to provide regional geochemical baseline values. The nationwide geochemical datasets were used to divide Finland into geochemical provinces. Several metals (Co, Cr, Cu, Ni, V, and Zn) showed anomalous concentrations in seven regions that were defined as metal provinces. Arsenic did not follow a distribution similar to that of any other element, and four arsenic provinces were determined separately. Nationwide geochemical datasets were not available for some other important elements such as Cd and Pb. Although these elements are included in the TAPIR system, their distribution does not necessarily follow the ones pre-defined for the metal and arsenic provinces. Regional geochemical baseline values, presented as the upper limit of geochemical variation within the region, can be used as trigger values to assess potential soil contamination. Baseline values have also been used to determine upper and lower guideline values that must be taken into account as a tool in basic risk assessment. If regional geochemical baseline values are available, the national guideline values prescribed in the Decree based on ecological risks can be modified accordingly. The national geochemical baseline database provides scientifically sound, easily accessible and generally accepted information on the baseline values, and it can be used in various
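
    Regional baseline values expressed as an "upper limit of geochemical variation" are typically a high percentile of a province's concentration distribution. A minimal sketch on hypothetical data (nearest-rank 95th percentile; the actual TAPIR methodology may differ):

```python
# Sketch: derive a regional geochemical baseline (trigger) value as the
# upper limit of variation, taken here as the nearest-rank 95th
# percentile. All concentrations below are hypothetical.

def baseline_upper_limit(concentrations, percentile=95):
    """Nearest-rank percentile of a list of concentrations."""
    values = sorted(concentrations)
    rank = max(1, round(percentile / 100 * len(values)))
    return values[rank - 1]

def exceeds_baseline(measured, baseline):
    """Flag a site measurement exceeding the regional trigger value."""
    return measured > baseline

# Hypothetical Ni concentrations (mg/kg) from one metal province
ni_samples = [12, 15, 9, 22, 31, 18, 14, 27, 40, 16, 11, 25]
trigger_value = baseline_upper_limit(ni_samples)
```

    A site whose measured concentration exceeds the regional trigger value would then be a candidate for the fuller assessment prescribed in the Decree.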

  4. Solid Waste Program technical baseline description

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlson, A.B.

    1994-07-01

    The system engineering approach has been taken to describe the technical baseline under which the Solid Waste Program is currently operating. The document contains a mission analysis, function analysis, system definition, documentation requirements, facility and project bases, and uncertainties facing the program.

  5. A publication database for optical long baseline interferometry

    NASA Astrophysics Data System (ADS)

    Malbet, Fabien; Mella, Guillaume; Lawson, Peter; Taillifet, Esther; Lafrasse, Sylvain

    2010-07-01

    Optical long baseline interferometry is a technique that has generated almost 850 refereed papers to date. The targets span a large variety of objects from planetary systems to extragalactic studies and all branches of stellar physics. We have created a database hosted by the JMMC and connected to the Optical Long Baseline Interferometry Newsletter (OLBIN) web site using MySQL and a collection of XML and PHP scripts in order to store and classify these publications. Each entry is identified by its ADS bibcode and includes basic ADS information and metadata. The metadata are specified by tags sorted into categories: interferometric facilities, instrumentation, wavelength of operation, spectral resolution, type of measurement, target type, and paper category, for example. The whole OLBIN publication list has been processed, and we present how the database is organized and can be accessed. We use this tool to generate statistical plots of interest for the community in optical long baseline interferometry.
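
    The tag-based publication store described above can be sketched in a few lines of SQL; SQLite stands in for MySQL so the example is self-contained, and the table and column names are illustrative rather than the actual JMMC/OLBIN schema:

```python
import sqlite3

# Illustrative schema: publications keyed by ADS bibcode, with metadata
# tags (facility, wavelength, target type, ...) in a many-to-many link.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE publication (
    bibcode TEXT PRIMARY KEY,   -- ADS bibcode identifies each entry
    title   TEXT,
    year    INTEGER
);
CREATE TABLE tag (
    id       INTEGER PRIMARY KEY,
    category TEXT,              -- e.g. 'facility', 'wavelength'
    value    TEXT
);
CREATE TABLE publication_tag (
    bibcode TEXT REFERENCES publication(bibcode),
    tag_id  INTEGER REFERENCES tag(id)
);
""")
conn.execute("INSERT INTO publication VALUES (?, ?, ?)",
             ("2010SPIE.7734E..99M", "Example interferometry paper", 2010))
conn.execute("INSERT INTO tag (category, value) VALUES ('facility', 'VLTI')")
conn.execute("INSERT INTO publication_tag VALUES ('2010SPIE.7734E..99M', 1)")

# The kind of statistical query behind the community plots: papers per year
papers_per_year = conn.execute(
    "SELECT year, COUNT(*) FROM publication GROUP BY year ORDER BY year"
).fetchall()
```

    The bibcode above is a made-up placeholder; in practice each row would carry a real ADS bibcode, which also lets the metadata be refreshed from ADS.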

  6. Environment Online: The Greening of Databases. Part 2. Scientific and Technical Databases.

    ERIC Educational Resources Information Center

    Alston, Patricia Gayle

    1991-01-01

    This second in a series of articles about online sources of environmental information describes scientific and technical databases that are useful for searching environmental data. Topics covered include chemicals and hazardous substances; agriculture; pesticides; water; forestry, oil, and energy resources; air; environmental and occupational…

  7. COMSATCOM service technical baseline strategy development approach using PPBW concept

    NASA Astrophysics Data System (ADS)

    Nguyen, Tien M.; Guillen, Andy T.

    2016-05-01

    This paper presents an innovative approach to developing a Commercial Satellite Communications (COMSATCOM) service Technical Baseline (TB) and the associated Program Baseline (PB) strategy using the Portable Pool Bandwidth (PPBW) concept. The concept involves trading the purchased commercial transponders' bandwidths (BWs) with existing commercial satellites' bandwidths participating in a "designated pool bandwidth" according to agreed terms and conditions. The Space and Missile Systems Center (SMC) has been implementing the Better Buying Power (BBP 3.0) directive and recommending that System Program Offices (SPOs) own the Program and Technical Baseline (PTB) [1, 2] to develop flexible acquisition strategies and achieve affordability and increased competition. This paper defines and describes the critical PTB parameters and associated requirements that are important to the government SPO for "owning" an affordable COMSATCOM services contract using the PPBW trading concept. The paper describes a step-by-step approach to optimally performing the PPBW trading to meet DoD and stakeholder (i) affordability requirements, and (ii) fixed and variable bandwidth requirements by optimizing communications performance, cost and PPBW accessibility in terms of Quality of Service (QoS), Bandwidth Sharing Ratio (BSR), Committed Information Rate (CIR), Burstable Information Rate (BIR), transponder equivalent bandwidth (TPE) and transponder Net Present Value (NPV). The affordable optimal solution that meets variable bandwidth requirements will consider the operating and trading terms and conditions described in the Fair Access Policy (FAP).

  8. IDENTIFICATION OF BIOLOGICALLY RELEVANT GENES USING A DATABASE OF RAT LIVER AND KIDNEY BASELINE GENE EXPRESSION

    EPA Science Inventory

    Microarray data from independent labs and studies can be compared to potentially identify toxicologically and biologically relevant genes. The Baseline Animal Database working group of HESI was formed to assess baseline gene expression from microarray data derived from control or...

  9. Production and distribution of scientific and technical databases - Comparison among Japan, US and Europe

    NASA Astrophysics Data System (ADS)

    Onodera, Natsuo; Mizukami, Masayuki

    This paper estimates several quantitative indices on the production and distribution of scientific and technical databases based on various recent publications and attempts to compare the indices internationally. Raw data used for the estimation are drawn mainly from the Database Directory (published by MITI) for database production and from some domestic and foreign study reports for database revenues. The ratios of the indices among Japan, the US and Europe for database usage are similar to those for general scientific and technical activities such as population and R&D expenditures. But Japanese contributions to the production, revenue and cross-country distribution of databases are still lower than those of the US and European countries. An international comparison of relative database activities between the public and private sectors is also discussed.

  10. TECHNICAL REPORTS DATABASE

    EPA Science Inventory

    The Defense Technical Information Center (DTIC®) is the central facility for the collection and dissemination of scientific and technical information for the Department of Defense (DoD). Much of this information is made available by DTIC in the form of technical reports about com...

  11. Development and operation of NEW-KOTIS : In-house technical information database of Nippon Kokan Corp.

    NASA Astrophysics Data System (ADS)

    Yagi, Yukio; Takahashi, Kaei

    The purpose of this report is to describe how activities for managing technical information have been, and are now being, conducted by the Engineering department of Nippon Kokan Corp. In addition, as a practical example of database generation promoted by the department, the report covers all aspects of NEW-KOTIS (the background of its development, its history, features, functional details, control and operation methods, use in search operations, and so forth). NEW-KOTIS (the third-term system) is an "in-house technical information database system" that began operation in May 1987. The database now contains approximately 65,000 information items (research reports, investigation reports, technical reports, etc.) generated within the company, and this information is available to anyone in any department through the network connecting all of the company's sites.

  12. The Consolidated Human Activity Database — Master Version (CHAD-Master) Technical Memorandum

    EPA Pesticide Factsheets

    This technical memorandum contains information about the Consolidated Human Activity Database -- Master version, including CHAD contents, inventory of variables: Questionnaire files and Event files, CHAD codes, and references.

  13. Technical implementation of an Internet address database with online maintenance module.

    PubMed

    Mischke, K L; Bollmann, F; Ehmer, U

    2002-01-01

    The article describes the technical implementation and management of the Internet address database of the ZMK center (University of Münster Dental School), which is integrated into the "ZMK-Web" website. The editorially maintained system stays up to date primarily through an electronically organized division of labor, aided by an online maintenance module programmed in JavaScript/PHP, as well as a database-driven feedback function for website visitors via configuration-independent direct-mail windows, also programmed in JavaScript/PHP.

  14. Technical Work Plan for: Thermodynamic Database for Chemical Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jove-Colon, C.F.

    The objective of the work scope covered by this Technical Work Plan (TWP) is to correct and improve the Yucca Mountain Project (YMP) thermodynamic databases, to update their documentation, and to ensure reasonable consistency among them. In addition, the work scope will continue to generate database revisions, which are organized and named so as to be transparent to internal and external users and reviewers. Regarding consistency among databases, it is noted that aqueous speciation and mineral solubility data for a given system may differ according to how solubility was determined, and the method used for subsequent retrieval of thermodynamic parameter values from measured data. Of particular concern are the details of the determination of "infinite dilution" constants, which involve the use of specific methods for activity coefficient corrections. That is, equilibrium constants developed for a given system for one set of conditions may not be consistent with constants developed for other conditions, depending on the species considered in the chemical reactions and the methods used in the reported studies. Hence, there will be some differences (for example in log K values) between the Pitzer and "B-dot" database parameters for the same reactions or species.
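
    The "B-dot" activity coefficient model named in the abstract can be written down directly; the parameter values below are representative 25 °C constants from the literature, not values taken from the YMP databases:

```python
import math

# Extended Debye-Huckel ("B-dot") correction: log10(gamma) for an ion
# of given charge and size at ionic strength I (molal). Constants are
# representative 25 degC values.
A_GAMMA = 0.5092   # Debye-Huckel A parameter
B_GAMMA = 0.3283   # Debye-Huckel B parameter (per Angstrom per sqrt(molal))
B_DOT = 0.041      # empirical B-dot term

def log10_gamma_bdot(charge, ion_size_angstrom, ionic_strength):
    """log10 of the single-ion activity coefficient via the B-dot equation."""
    sqrt_i = math.sqrt(ionic_strength)
    debye = -A_GAMMA * charge ** 2 * sqrt_i / (
        1.0 + ion_size_angstrom * B_GAMMA * sqrt_i)
    return debye + B_DOT * ionic_strength
```

    At infinite dilution (I -> 0) the correction vanishes, which is why "infinite dilution" equilibrium constants depend on the correction model used to extrapolate measured data back to I = 0, and why Pitzer and B-dot parameterizations of the same reaction can yield different log K values.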

  15. Advanced Neonatal Medicine in China: A National Baseline Database.

    PubMed

    Liao, Xiang-Peng; Chipenda-Dansokho, Selma; Lewin, Antoine; Abdelouahab, Nadia; Wei, Shu-Qin

    2017-01-01

    each unit had more than 20 admissions of ELBW infants in 2010; and the median hospital cost for a single hospital stay in ELBW infants was US$8,613 (IQR 8,153-9,216), which was 3.0 times (IQR 2.0-3.2) the average per-capita disposable income, or 63 times (IQR 40.3-72.1) the average per-capita health expenditure of local urban residents in 2011. Our national database provides baseline data on the status of advanced neonatal medicine in China, gathering valuable information for quality improvement, decision making, longitudinal studies and horizontal comparisons.

  16. Advanced Neonatal Medicine in China: A National Baseline Database

    PubMed Central

    Chipenda-Dansokho, Selma; Lewin, Antoine; Abdelouahab, Nadia; Wei, Shu-Qin

    2017-01-01

    five hospitals where each unit had more than 20 admissions of ELBW infants in 2010; and the median hospital cost for a single hospital stay in ELBW infants was US$8,613 (IQR 8,153–9,216), which was 3.0 times (IQR 2.0–3.2) the average per-capita disposable income, or 63 times (IQR 40.3–72.1) the average per-capita health expenditure of local urban residents in 2011. Our national database provides baseline data on the status of advanced neonatal medicine in China, gathering valuable information for quality improvement, decision making, longitudinal studies and horizontal comparisons. PMID:28099450

  17. The International Linear Collider Technical Design Report - Volume 3.II: Accelerator Baseline Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adolphsen, Chris

    2013-06-26

    The International Linear Collider Technical Design Report (TDR) describes in four volumes the physics case and the design of a 500 GeV centre-of-mass energy linear electron-positron collider based on superconducting radio-frequency technology using niobium cavities as the accelerating structures. The accelerator can be extended to 1 TeV and also run as a Higgs factory at around 250 GeV and on the Z0 pole. A comprehensive value estimate of the accelerator is given, together with associated uncertainties. It is shown that no significant technical issues remain to be solved. Once a site is selected and the necessary site-dependent engineering is carried out, construction can begin immediately. The TDR also gives baseline documentation for two high-performance detectors that can share the ILC luminosity by being moved into and out of the beam line in a "push-pull" configuration. These detectors, ILD and SiD, are described in detail. They form the basis for a world-class experimental programme that promises to increase significantly our understanding of the fundamental processes that govern the evolution of the Universe.

  18. Development of a general baseline toxicity QSAR model for the fish embryo acute toxicity test.

    PubMed

    Klüver, Nils; Vogs, Carolina; Altenburger, Rolf; Escher, Beate I; Scholz, Stefan

    2016-12-01

    Fish embryos have become a popular model in ecotoxicology and toxicology. The fish embryo acute toxicity test (FET) with the zebrafish embryo was recently adopted by the OECD as technical guideline TG 236, and a large database of concentrations causing 50% lethality (LC50) is available in the literature. Quantitative Structure-Activity Relationships (QSARs) of baseline toxicity (also called narcosis) are helpful to estimate the minimum toxicity of chemicals to be tested and to identify excess toxicity in existing data sets. Here, we analyzed an existing fish embryo toxicity database and established a QSAR for fish embryo LC50 using chemicals that were independently classified to act according to the non-specific mode of action of baseline toxicity. The octanol-water partition coefficient Kow is commonly applied to discriminate between non-polar and polar narcotics. Replacing the Kow by the liposome-water partition coefficient Klipw yielded a common QSAR for polar and non-polar baseline toxicants. This baseline toxicity QSAR was applied to compare the final mode of action (MOA) assignment of 132 chemicals. Further, we included the analysis of internal lethal concentrations (ILC50) and chemical activity (La50) as complementary approaches to evaluate the robustness of the FET baseline toxicity. The analysis of the FET dataset revealed that specifically acting and reactive chemicals converged towards the baseline toxicity QSAR with increasing hydrophobicity. The developed FET baseline toxicity QSAR can be used to identify specifically acting or reactive compounds by determination of the toxic ratio and, in combination with appropriate endpoints, to infer the MOA for chemicals. Copyright © 2016 Elsevier Ltd. All rights reserved.
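
    The toxic-ratio screening described above reduces to a linear QSAR plus one division. A sketch with hypothetical slope and intercept (not the fitted values from this study):

```python
# Baseline-toxicity QSAR sketch: log(1/LC50) = slope * log(Klipw) + b,
# with the toxic ratio TR = LC50(baseline-predicted) / LC50(observed).
# Slope and intercept are hypothetical placeholders.
SLOPE, INTERCEPT = 0.9, 0.8

def predicted_log_inv_lc50(log_klipw):
    """Baseline QSAR prediction of log(1/LC50) from log Klipw."""
    return SLOPE * log_klipw + INTERCEPT

def toxic_ratio(log_klipw, observed_lc50_molar):
    """TR = baseline-predicted LC50 divided by the observed LC50."""
    baseline_lc50 = 10 ** (-predicted_log_inv_lc50(log_klipw))
    return baseline_lc50 / observed_lc50_molar

def is_excess_toxic(log_klipw, observed_lc50_molar, threshold=10.0):
    # TR near 1: baseline (narcotic) action; TR well above the
    # threshold: specific or reactive mode of action.
    return toxic_ratio(log_klipw, observed_lc50_molar) >= threshold
```

    A chemical whose observed LC50 is far below the baseline prediction (large TR) would be flagged for a specific or reactive mode of action rather than narcosis.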

  19. War-gaming application for future space systems acquisition part 1: program and technical baseline war-gaming modeling and simulation approaches

    NASA Astrophysics Data System (ADS)

    Nguyen, Tien M.; Guillen, Andy T.

    2017-05-01

    This paper describes static Bayesian game models with "Pure" and "Mixed" games for the development of an optimum Program and Technical Baseline (PTB) solution for affordable acquisition of future space systems. The paper discusses System Engineering (SE) frameworks and analytical and simulation modeling approaches for developing the optimum PTB solutions from both the government and contractor perspectives.

  20. Ankylosing Spondylitis Patients Commencing Biologic Therapy Have High Baseline Levels of Comorbidity: A Report from the Australian Rheumatology Association Database

    PubMed Central

    Oldroyd, John; Schachna, Lionel; Buchbinder, Rachelle; Staples, Margaret; Murphy, Bridie; Bond, Molly; Briggs, Andrew; Lassere, Marissa; March, Lyn

    2009-01-01

    Aims. To compare the baseline characteristics of a population-based cohort of patients with ankylosing spondylitis (AS) commencing biologic therapy with the reported characteristics of participants in bDMARD randomised controlled trials (RCTs). Methods. Descriptive analysis of AS participants in the Australian Rheumatology Association Database (ARAD) who were commencing bDMARD therapy. Results. Up to December 2008, 389 patients with AS were enrolled in ARAD. 354 (91.0%) had taken bDMARDs at some time, and 198 (55.9%) completed their entry questionnaire prior to or within 6 months of commencing bDMARDs. 131 (66.1%) had at least one comorbid condition, and 24 (6.8%) had a previous malignancy (15 nonmelanoma skin, 4 melanoma, 2 prostate, and 1 each of breast, cervix, and bowel). Compared with RCT participants, ARAD participants were older and had longer disease duration and higher baseline disease activity. Conclusions. AS patients commencing bDMARDs in routine care differ significantly from RCT participants and have significant baseline comorbidities. PMID:20107564

  1. Fullerene data mining using bibliometrics and database tomography

    PubMed

    Kostoff; Braun; Schubert; Toothman; Humenik

    2000-01-01

    Database tomography (DT) is a textual database analysis system consisting of two major components: (1) algorithms for extracting multiword phrase frequencies and phrase proximities (physical closeness of the multiword technical phrases) from any type of large textual database, to augment (2) interpretative capabilities of the expert human analyst. DT was used to derive technical intelligence from a fullerenes database derived from the Science Citation Index and the Engineering Compendex. Phrase frequency analysis by the technical domain experts provided the pervasive technical themes of the fullerenes database, and phrase proximity analysis provided the relationships among the pervasive technical themes. Bibliometric analysis of the fullerenes literature supplemented the DT results with author/journal/institution publication and citation data. Comparisons of fullerenes results with past analyses of similarly structured near-earth space, chemistry, hypersonic/supersonic flow, aircraft, and ship hydrodynamics databases are made. One important finding is that many of the normalized bibliometric distribution functions are extremely consistent across these diverse technical domains and could reasonably be expected to apply to broader chemical topics than fullerenes that span multiple structural classes. Finally, lessons learned about integrating the technical domain experts with the data mining tools are presented.
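
    The two core operations named in the abstract, multiword phrase frequency and phrase proximity, can be sketched briefly; the tokenization and the fixed word window are simplifications of the original DT system:

```python
import re
from collections import Counter

def phrase_frequencies(texts, n=2):
    """Count n-word phrases across a collection of abstracts."""
    counts = Counter()
    for text in texts:
        words = re.findall(r"[a-z0-9]+", text.lower())
        counts.update(tuple(words[i:i + n])
                      for i in range(len(words) - n + 1))
    return counts

def phrase_proximity(text, phrase_a, phrase_b, window=10):
    """True if the two phrases occur within `window` words of each other."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    def positions(phrase):
        p = phrase.lower().split()
        return [i for i in range(len(words) - len(p) + 1)
                if words[i:i + len(p)] == p]
    return any(abs(i - j) <= window
               for i in positions(phrase_a) for j in positions(phrase_b))

docs = ["Carbon nanotubes and fullerene films were deposited.",
        "Fullerene films show novel electronic properties."]
fullerene_films_count = phrase_frequencies(docs)[("fullerene", "films")]
```

    High-frequency phrases surface the pervasive technical themes of a corpus; proximity between two such phrases suggests a relationship between the themes, which the human analyst then interprets.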

  2. The Structural Ceramics Database: Technical Foundations

    PubMed Central

    Munro, R. G.; Hwang, F. Y.; Hubbard, C. R.

    1989-01-01

    The development of a computerized database on advanced structural ceramics can play a critical role in fostering the widespread use of ceramics in industry and in advanced technologies. A computerized database may be the most effective means of accelerating technology development by enabling new materials to be incorporated into designs far more rapidly than would have been possible with traditional information transfer processes. Faster, more efficient access to critical data is the basis for creating this technological advantage. Further, a computerized database provides the means for a more consistent treatment of data, greater quality control and product reliability, and improved continuity of research and development programs. A preliminary system has been completed as phase one of an ongoing program to establish the Structural Ceramics Database system. The system is designed to be used on personal computers. Developed in a modular design, the preliminary system is focused on the thermal properties of monolithic ceramics. The initial modules consist of materials specification, thermal expansion, thermal conductivity, thermal diffusivity, specific heat, thermal shock resistance, and a bibliography of data references. Query and output programs also have been developed for use with these modules. The latter program elements, along with the database modules, will be subjected to several stages of testing and refinement in the second phase of this effort. The goal of the refinement process will be the establishment of this system as a user-friendly prototype. Three primary considerations provide the guidelines to the system’s development: (1) The user’s needs; (2) The nature of materials properties; and (3) The requirements of the programming language. The present report discusses the manner and rationale by which each of these considerations leads to specific features in the design of the system. PMID:28053397

  3. Solar central electric power generation - A baseline design

    NASA Technical Reports Server (NTRS)

    Powell, J. C.

    1976-01-01

    The paper presents the conceptual technical baseline design of a solar electric power plant using the central receiver concept, and derives credible cost estimates from the baseline design. The major components of the plant - heliostats, tower, receiver, tower piping, and thermal storage - are discussed in terms of technical and cost information. The assumed peak plant output is 215 MW(e), over 4000 daylight hours. The contribution of total capital investment to energy cost is estimated to be about 55 mills per kwh in mid-1974 dollars.
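
    The quoted 55 mills/kWh capital contribution is an annualized-capital calculation. A sketch using the plant size and operating hours from the abstract, with a hypothetical capital cost and fixed charge rate (the report's actual financial inputs are not reproduced here):

```python
# Capital contribution to energy cost, in mills (tenths of a cent) per
# kWh. Plant size and hours are from the abstract; the capital cost and
# fixed charge rate below are hypothetical.
PLANT_MW = 215          # peak electrical output, MW(e)
HOURS_PER_YEAR = 4000   # daylight operating hours per year

def capital_mills_per_kwh(capital_dollars, fixed_charge_rate):
    """Annualized capital cost spread over annual energy output."""
    annual_kwh = PLANT_MW * 1000 * HOURS_PER_YEAR
    annual_cost = capital_dollars * fixed_charge_rate
    return annual_cost / annual_kwh * 1000  # $/kWh -> mills/kWh

# e.g. a hypothetical $315M investment at a 15% fixed charge rate lands
# near the 55 mills/kWh figure quoted above.
cost = capital_mills_per_kwh(315e6, 0.15)
```

    Any combination of capital cost and fixed charge rate with the same annualized cost gives the same mills/kWh figure, so the two hypothetical inputs here are not uniquely implied by the abstract.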

  4. Chesapeake Bay Program Water Quality Database

    EPA Pesticide Factsheets

    The Chesapeake Information Management System (CIMS), designed in 1996, is an integrated, accessible information management system for the Chesapeake Bay Region. CIMS is an organized, distributed library of information and software tools designed to increase basin-wide public access to Chesapeake Bay information. The information delivered by CIMS includes technical and public information, educational material, environmental indicators, policy documents, and scientific data. Through the use of relational databases, web-based programming, and web-based GIS a large number of Internet resources have been established. These resources include multiple distributed on-line databases, on-demand graphing and mapping of environmental data, and geographic searching tools for environmental information. Baseline monitoring data, summarized data and environmental indicators that document ecosystem status and trends, confirm linkages between water quality, habitat quality and abundance, and the distribution and integrity of biological populations are also available. One of the major features of the CIMS network is the Chesapeake Bay Program's Data Hub, providing users access to a suite of long-term water quality and living resources databases. Chesapeake Bay mainstem and tidal tributary water quality, benthic macroinvertebrates, toxics, plankton, and fluorescence data can be obtained for a network of over 800 monitoring stations.

  5. Geochemical baseline studies of soil in Finland

    NASA Astrophysics Data System (ADS)

    Pihlaja, Jouni

    2017-04-01

    Soil element concentrations vary considerably from region to region in Finland. This is mostly caused by the different bedrock types, which are reflected in soil quality. The Geological Survey of Finland (GTK) is carrying out geochemical baseline studies in Finland. In the current phase, the research is focusing on urban areas and mine environments. The information can, for example, be used to determine the need for soil remediation, to assess environmental impacts, or to measure the natural state of soil in industrial areas or mine districts. Field work is done by taking soil samples, typically at depths of 0-10 cm. Sampling sites are chosen to represent the areas most vulnerable to human impact from potentially toxic soil element contents: playgrounds, day-care centers, schools, parks and residential areas. In the mine districts the samples are taken from areas located outside those affected by airborne dust. Element contents of the soil samples are then analyzed with ICP-AES and ICP-MS, and Hg with CV-AAS. The results of the geochemical baseline studies are published in the Finnish national geochemical baseline database (TAPIR). The geochemical baseline map service is free for all users via an internet browser. Through this map service it is possible to calculate regional soil baseline values using the geochemical data stored in the map service database. Baseline data for 17 elements in total are provided in the map service, which can be viewed on GTK's web pages (http://gtkdata.gtk.fi/Tapir/indexEN.html).

  6. Integrated Baseline Review (IBR) Handbook

    NASA Technical Reports Server (NTRS)

    2013-01-01

    An Integrated Baseline Review (IBR) is a review of a supplier's Performance Measurement Baseline (PMB). It is conducted by Program/Project Managers and their technical staffs on contracts and in-house work requiring compliance with NASA Earned Value Management System (EVMS) policy as defined in program/project policy, NPR 7120.5, or in NASA Federal Acquisition Regulations. The IBR Handbook may also be of use to those responsible for preparing the Terms of Reference for internal project reviews. While risks may be identified and actions tracked as a result of the IBR, it is important to note that an IBR cannot be failed.

  7. Materials And Processes Technical Information System (MAPTIS) LDEF materials database

    NASA Technical Reports Server (NTRS)

    Davis, John M.; Strickland, John W.

    1992-01-01

    The Materials and Processes Technical Information System (MAPTIS) is a collection of materials data which has been computerized and is available to engineers in the aerospace community involved in the design and development of spacecraft and related hardware. Consisting of various database segments, MAPTIS provides the user with information such as material properties, test data derived from tests specifically conducted for qualification of materials for use in space, verification and control, project management, material information, and various administrative requirements. A recent addition to the project management segment consists of materials data derived from the LDEF flight. This tremendous quantity of data includes both pre-flight and post-flight data in such diverse areas as optical/thermal, mechanical and electrical properties, and atomic concentration surface analysis data, as well as general data such as sample placement on the satellite, A-O flux, equivalent sun hours, etc. Each data point is referenced to the primary investigator(s) and the published paper from which the data were taken. The MAPTIS system is envisioned to become the central location for all LDEF materials data. This paper comprises a general overview of the MAPTIS system and the types of data contained within it, together with a description of the specific LDEF data element and the data contained in that segment.

  8. 48 CFR 1034.202 - Integrated Baseline Reviews.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... which the management process provides effective and integrated technical/schedule/cost planning and... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Integrated Baseline... SPECIAL CATEGORIES OF CONTRACTING MAJOR SYSTEM ACQUISITION Earned Value Management System 1034.202...

  9. 48 CFR 34.202 - Integrated Baseline Reviews.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... inherent risks in offerors'/contractors' performance plans and the underlying management control systems...) The degree to which the management process provides effective and integrated technical/schedule/cost... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Integrated Baseline...

  10. On-Line Databases in Mexico.

    ERIC Educational Resources Information Center

    Molina, Enzo

    1986-01-01

    Use of online bibliographic databases in Mexico is provided through Servicio de Consulta a Bancos de Informacion, a public service that provides information retrieval, document delivery, translation, technical support, and training services. Technical infrastructure is based on a public packet-switching network and institutional users may receive…

  11. Mathematical Notation in Bibliographic Databases.

    ERIC Educational Resources Information Center

    Pasterczyk, Catherine E.

    1990-01-01

    Discusses ways in which using mathematical symbols to search online bibliographic databases in scientific and technical areas can improve search results. The representations used for Greek letters, relations, binary operators, arrows, and miscellaneous special symbols in the MathSci, Inspec, Compendex, and Chemical Abstracts databases are…

  12. Annual Review of Database Developments: 1993.

    ERIC Educational Resources Information Center

    Basch, Reva

    1993-01-01

    Reviews developments in the database industry for 1993. Topics addressed include scientific and technical information; environmental issues; social sciences; legal information; business and marketing; news services; documentation; databases and document delivery; electronic bulletin boards and the Internet; and information industry organizational…

  13. Global Oil & Gas Features Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelly Rose; Jennifer Bauer; Vic Baker

    This submission contains a zip file with the developed Global Oil & Gas Features Database (as an ArcGIS geodatabase). Access the technical report describing how this database was produced using the following link: https://edx.netl.doe.gov/dataset/development-of-an-open-global-oil-and-gas-infrastructure-inventory-and-geodatabase

  14. Technical Communication, Knowledge Management, and XML.

    ERIC Educational Resources Information Center

    Applen, J. D.

    2002-01-01

    Describes how technical communicators can become involved in knowledge management. Examines how technical communicators can teach organizations to design, access, and contribute to databases; alert them to new information; and facilitate trust and sharing. Concludes that successful technical communicators would do well to establish a culture that…

  15. Updating a Searchable Database of Dropout Prevention Programs and Policies in Nine Low-Income Urban School Districts in the Northeast and Islands Region. REL Technical Brief. REL 2012-No. 020

    ERIC Educational Resources Information Center

    Myint-U, Athi; O'Donnell, Lydia; Phillips, Dawna

    2012-01-01

    This technical brief describes updates to a database of dropout prevention programs and policies in 2006/07 created by the Regional Education Laboratory (REL) Northeast and Islands and described in the Issues & Answers report, "Piloting a searchable database of dropout prevention programs in nine low-income urban school districts in the…

  16. Analysis of baseline gene expression levels from ...

    EPA Pesticide Factsheets

    The use of gene expression profiling to predict chemical mode of action would be enhanced by better characterization of variance due to individual, environmental, and technical factors. Meta-analysis of microarray data from untreated or vehicle-treated animals within the control arm of toxicogenomics studies has yielded useful information on baseline fluctuations in gene expression. A dataset of control-animal microarray expression data was assembled by a working group of the Health and Environmental Sciences Institute's Technical Committee on the Application of Genomics in Mechanism Based Risk Assessment in order to provide a public resource for assessments of variability in baseline gene expression. Data from over 500 Affymetrix microarrays from control rat liver and kidney were collected from 16 different institutions. Thirty-five biological and technical factors were obtained for each animal, describing a wide range of study characteristics, and a subset was evaluated in detail for its contribution to total variability using multivariate statistical and graphical techniques. The study factors that emerged as key sources of variability included gender, organ section, strain, and fasting state. These and other study factors were identified as key descriptors that should be included in the minimal information about a toxicogenomics study needed for interpretation of results by an independent source. Genes that are the most and least variable, gender-selective…

  17. Draft secure medical database standard.

    PubMed

    Pangalos, George

    2002-01-01

    Medical database security is a particularly important issue for all healthcare establishments. Medical information systems are intended to support a wide range of pertinent health issues today, for example: assuring the quality of care, supporting effective management of health services institutions, monitoring and containing the cost of care, implementing technology into care without violating social values, ensuring the equity and availability of care, and preserving humanity despite the proliferation of technology. In this context, medical database security aims primarily to support high availability, accuracy, and consistency of the stored data, medical professional secrecy and confidentiality, and protection of the privacy of the patient. These properties, though of a technical nature, basically require that the system is actually helpful for medical care and not harmful to patients. The latter properties require in turn not only that fundamental ethical principles are not violated by employing database systems, but that they are effectively enforced by technical means. This document reviews existing and emerging work on the security of medical database systems. It presents in detail the problems and requirements of medical database security, and addresses medical database security policies, secure design methodologies, and implementation techniques. It also describes the current legal framework and regulatory requirements for medical database security, and examines in detail the issue of medical database security guidelines. Current national and international efforts in the area are studied, and an overview of research work in the area is given. The document also presents in detail the most complete set of security guidelines, to our knowledge, for the development and operation of medical database systems.

  18. Hydroponics Database and Handbook for the Advanced Life Support Test Bed

    NASA Technical Reports Server (NTRS)

    Nash, Allen J.

    1999-01-01

    During the summer of 1998, I assisted Dr. Daniel J. Barta, chief plant growth expert at NASA's Johnson Space Center. We established the preliminary stages of a hydroponic crop growth database for the Advanced Life Support Systems Integration Test Bed, otherwise referred to as BIO-Plex (Biological Planetary Life Support Systems Test Complex). The database summarizes information from published technical papers by plant growth experts, and it includes bibliographical, environmental, and harvest information based on plant growth under varying environmental conditions. I collected 84 lettuce entries, 14 soybean, 49 sweet potato, 16 wheat, 237 white potato, and 26 mixed-crop entries. The list will grow as new research is published. This database will be integrated with a search and systems analysis computer program that will cross-reference multiple parameters to determine optimum edible yield under varying conditions. We have also made a preliminary effort to put together a crop handbook for BIO-Plex plant growth management. It will be a collection of information obtained from experts who provided recommendations on each crop's growing conditions, including bibliographic, environmental, nutrient solution, potential yield, harvest nutritional, and propagation procedure information. This handbook will define the baseline growth conditions for the first set of experiments in the BIO-Plex facility.

  19. The IUGS/IAGC Task Group on Global Geochemical Baselines

    USGS Publications Warehouse

    Smith, David B.; Wang, Xueqiu; Reeder, Shaun; Demetriades, Alecos

    2012-01-01

    The Task Group on Global Geochemical Baselines, operating under the auspices of both the International Union of Geological Sciences (IUGS) and the International Association of Geochemistry (IAGC), has the long-term goal of establishing a global geochemical database to document the concentration and distribution of chemical elements in the Earth’s surface or near-surface environment. The database and accompanying element distribution maps represent a geochemical baseline against which future human-induced or natural changes to the chemistry of the land surface may be recognized and quantified. In order to accomplish this long-term goal, the activities of the Task Group include: (1) developing partnerships with countries conducting broad-scale geochemical mapping studies; (2) providing consultation and training in the form of workshops and short courses; (3) organizing periodic international symposia to foster communication among the geochemical mapping community; (4) developing criteria for certifying those projects whose data are acceptable in a global geochemical database; (5) acting as a repository for data collected by those projects meeting the criteria for standardization; (6) preparing complete metadata for the certified projects; and (7) preparing, ultimately, a global geochemical database. This paper summarizes the history and accomplishments of the Task Group since its first predecessor project was established in 1988.

  20. Lessons Learned and Technical Standards: A Logical Marriage

    NASA Technical Reports Server (NTRS)

    Gill, Paul; Vaughan, William W.; Garcia, Danny; Gill, Maninderpal S. (Technical Monitor)

    2001-01-01

    A comprehensive database of lessons learned that corresponds with relevant technical standards would be a boon to technical personnel and standards developers. The authors discuss the emergence of one such database within NASA and show how and why the incorporation of lessons learned into technical standards databases can be an indispensable tool for government and industry. Passed down from parent to child, teacher to pupil, and senior to junior employees, lessons learned have been the basis for our accomplishments throughout the ages. Government and industry, too, have long recognized the need to systematically document and utilize the knowledge gained from past experiences in order to avoid the repetition of failures and mishaps. The use of lessons learned is a principal component of any organizational culture committed to continuous improvement. Lessons learned have formed the foundation for discoveries, inventions, improvements, textbooks, and technical standards, and technical standards are a very logical way to communicate them. Using the time-honored tradition of passing on lessons learned while employing the newest information technology, the National Aeronautics and Space Administration (NASA) has launched an intensive effort to link lessons learned with specific technical standards through various Internet databases. This article discusses the importance of lessons learned to engineers, the difficulty of finding relevant lessons learned while engaged in an engineering project, and the new NASA project that can help alleviate this difficulty. The article concludes with recommendations for broader cross-sectoral use of lessons learned with reference to technical standards.

  1. Optimizing the NASA Technical Report Server

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Maa, Ming-Hokng

    1996-01-01

    The NASA Technical Report Server (NTRS), a World Wide Web distribution service for NASA technical publications, was modified for performance enhancement, broader protocol support, and human-interface optimization. Results include: parallel database queries, which decreased user access times by an average factor of 2.3; access from clients behind firewalls and/or proxies that truncate excessively long Uniform Resource Locators (URLs); access to non-Wide Area Information Server (WAIS) databases and compatibility with the Z39.50 protocol; and a streamlined user interface.
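The parallel-query optimization described in the abstract above can be sketched in a few lines. This is not NTRS code: `search_database`, the backend names, and the simulated latency are all hypothetical stand-ins, and the actual speedup depends on the latency of the slowest backend.

```python
import concurrent.futures
import time

def search_database(name, query, delay=0.2):
    """Stand-in for one backend full-text search; `delay` simulates network/disk latency."""
    time.sleep(delay)
    return [f"{name}: hit for '{query}'"]

def parallel_search(databases, query):
    # Issue all backend queries at once; total wall time approaches the
    # slowest single query instead of the sum of all of them.
    with concurrent.futures.ThreadPoolExecutor(max_workers=len(databases)) as pool:
        futures = [pool.submit(search_database, db, query) for db in databases]
        results = []
        for f in concurrent.futures.as_completed(futures):
            results.extend(f.result())
    return results

databases = ["NTRS-main", "NTRS-images", "NTRS-legacy"]
hits = parallel_search(databases, "ion propulsion")
```

With three backends of equal latency, the parallel version finishes in roughly one query's time instead of three, which is the kind of constant-factor improvement the abstract reports.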

  2. Wilder-Naifeh Technical Skills Grant Program Report: A Baseline Evaluation

    ERIC Educational Resources Information Center

    Tennessee Higher Education Commission, 2010

    2010-01-01

    The Wilder-Naifeh Technical Skills Grant, introduced in Winter 2004, grants awards of up to $2,000 to students who attend one of the 27 Tennessee Technology Centers. Since the inception of this program, approximately 50,000 students have received grants, and the state of Tennessee has spent roughly $47.5 million on the program over the last four…

  3. Organization's Orderly Interest Exploration: Inception, Development and Insights of AIAA's Topics Database

    NASA Technical Reports Server (NTRS)

    Marshall, Joseph R.; Morris, Allan T.

    2007-01-01

    Since 2003, AIAA's Computer Systems and Software Systems Technical Committees (TCs) have developed a database that helps technical committee management map technical topics to their members. This Topics/Interest (T/I) database grew out of a collection of charts and spreadsheets maintained by the TCs. Since its inception, the tool has evolved into a multi-dimensional database whose dimensions include the importance, interest, and expertise of TC members and whether or not a member and/or a TC is actively involved with a topic. In 2005, the database was expanded to include the TCs in AIAA's Information Systems Group, and then expanded further to include all AIAA TCs. It was field-tested at an AIAA Technical Activities Committee (TAC) Workshop in early 2006 through live access by over 80 users. Through the use of the topics database, TC and program committee (PC) members can accomplish relevant tasks such as identifying topic experts (for Aerospace America articles or external contacts), determining the interests of members, identifying overlapping topics between diverse TCs and PCs, guiding new member drives, and revealing emerging topics. This paper describes the origins, inception, initial development, field test, and current version of the tool, and elucidates the benefits and insights gained by using the database to aid the management of various TC functions. Suggestions are provided to guide future development of the database toward dynamic, system-level benefits to AIAA that currently do not exist in any technical organization.

  4. Configuration management program plan for Hanford site systems engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kellie, C.L.

    This plan establishes the integrated management program for the evolving technical baseline developed through the systems engineering process. This configuration management program aligns with the criteria identified in the DOE Standard, DOE-STD-1073-93. Included are specific requirements for control of the systems engineering RDD-100 database, and electronic data incorporated in the database that establishes the Hanford Site Technical Baseline.

  5. Configuration management program plan for Hanford site systems engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, A.G.

    This plan establishes the integrated configuration management program for the evolving technical baseline developed through the systems engineering process. This configuration management program aligns with the criteria identified in the DOE Standard, DOE-STD-1073-93. Included are specific requirements for control of the systems engineering RDD-100 database, and electronic data incorporated in the database that establishes the Hanford site technical baseline.

  6. Construction of In-house Databases in a Corporation

    NASA Astrophysics Data System (ADS)

    Dezaki, Kyoko; Saeki, Makoto

    Rapid progress in advanced information technology has increased the need to strengthen documentation activities in industry. In response, Tokin Corporation has been engaged in constructing databases for patent information, technical reports, and other material accumulated inside the company. Two systems have resulted: TOPICS, an in-house patent information management system, and TOMATIS, a management and technical information system built with personal computers and general-purpose relational database software. These systems aim to compile databases of patent and technological management information generated inside and outside the company at low labor cost, and to provide comprehensive information company-wide. This paper introduces the outline of these systems and how they are actually used.

  7. International forensic automotive paint database

    NASA Astrophysics Data System (ADS)

    Bishea, Gregory A.; Buckle, Joe L.; Ryland, Scott G.

    1999-02-01

    The Technical Working Group for Materials Analysis (TWGMAT) is supporting an international forensic automotive paint database. The Federal Bureau of Investigation and the Royal Canadian Mounted Police (RCMP) are collaborating on this effort through TWGMAT. This paper outlines the support and further development of the RCMP's Automotive Paint Database, `Paint Data Query'. This cooperative agreement augments and supports a current, validated, searchable, automotive paint database that is used to identify make(s), model(s), and year(s) of questioned paint samples in hit-and-run fatalities and other associated investigations involving automotive paint.

  8. Summary Report for the Technical Interchange Meeting on Development of Baseline Material Properties and Design Guidelines for In-Space Manufacturing Activities

    NASA Technical Reports Server (NTRS)

    Prater, T. J.; Bean, Q. A.; Werkheiser, N. J.; Johnston, M. M.; Ordonez, E. A.; Ledbetter, F. E.; Risdon, D. L.; Stockman, T. J.; Sandridge, S. K. R.; Nelson, G. M.

    2016-01-01

    NASA Marshall Space Flight Center (MSFC) and the Agency as a whole are currently engaged in a number of in-space manufacturing (ISM) activities that have the potential to reduce launch costs, enhance crew safety, and provide the capabilities needed to undertake long-duration spaceflight. The recent 3D Printing in Zero-G experiment conducted on board the International Space Station (ISS) demonstrated that parts of acrylonitrile butadiene styrene (ABS) plastic can be manufactured in microgravity using fused deposition modeling (FDM). This project represents the beginning of the development of a capability that is critical to future NASA missions. Current and future ISM activities will require the development of baseline material properties to facilitate design, analysis, and certification of materials manufactured using in-space techniques. The purpose of this technical interchange meeting (TIM) was to bring together MSFC practitioners and experts in materials characterization and development of baseline material properties for emerging technologies to advise the ISM team as we progress toward the development of material design values, standards, and acceptance criteria for materials manufactured in space. The overall objective of the TIM was to leverage MSFC's shared experiences and collective knowledge in advanced manufacturing and materials development to construct a path forward for the establishment of baseline material properties, standards development, and certification activities related to ISM. Participants were asked to help identify research and development activities that will (1) accelerate acceptance and adoption of ISM techniques among the aerospace design community; (2) benefit future NASA programs, commercial technology developments, and national needs; and (3) provide opportunities and avenues for further collaboration.

  9. Construction of In-house Databases in a Corporation

    NASA Astrophysics Data System (ADS)

    Fujii, Yohzo

    The author outlines the in-house technical information system OSTI of the Osaka Research Institute, Sumitomo Chemical Company, as an example of in-house database construction and use in a chemical company. The system compiles a database of technical information generated inside the laboratory and provides online searching as well as title lists of the latest data, aiming at effective use of information among departments, prevention of duplicated research themes, and support of research activities. The system outline, characteristics, materials covered, input items, and search examples are described.

  10. Construction of In-house Databases in a Corporation

    NASA Astrophysics Data System (ADS)

    Tamura, Haruki; Mezaki, Koji

    This paper describes the fundamental ideas behind technical information management at Mitsubishi Heavy Industries, Ltd., and the present status of those activities. It then introduces the background and history of the development of the Mitsubishi Heavy Industries Technical Information Retrieval System (called MARON), which started service in May 1985, along with problems encountered and countermeasures against them. The system deals with databases that cover information common to the whole company (in-house research and technical reports, holdings information for books, journals, and so on) and local information held in each business division or department. Anybody from any division can access these databases through the company-wide network. An in-house interlibrary loan subsystem called Orderentry is available, which supports the acquisition of original materials.

  11. NREL: U.S. Life Cycle Inventory Database - Advisory Committee

    Science.gov Websites

    The U.S. Life Cycle Inventory (LCI) Database established an advisory committee to provide technical and financial guidance to the NREL database management team. The committee's responsibilities include assessing and responding to user feedback to ensure that the database meets the needs of data providers…

  12. CB Database: A change blindness database for objects in natural indoor scenes.

    PubMed

    Sareen, Preeti; Ehinger, Krista A; Wolfe, Jeremy M

    2016-12-01

    Change blindness has been a topic of interest in the cognitive sciences for decades. Change detection experiments are frequently used for studying research topics such as attention and perception. However, creating change detection stimuli is tedious, and there is no open repository of such stimuli using natural scenes. We introduce the Change Blindness (CB) Database with object changes in 130 colored images of natural indoor scenes. The size and eccentricity are provided for all changes, as well as reaction-time data from a baseline experiment. In addition, we have two specialized satellite databases that are subsets of the 130 images. In one set, changes are seen in rooms or in mirrors in those rooms (Mirror Change Database). In the other, changes occur in a room or out a window (Window Change Database). Both sets have controlled background, change size, and eccentricity. The CB Database is intended to provide researchers with a stimulus set of natural scenes with defined stimulus parameters that can be used for a wide range of experiments. The CB Database can be found at http://search.bwh.harvard.edu/new/CBDatabase.html

  13. Coordinating Council. First Meeting: NASA/RECON database

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Management representatives of NASA Headquarters, the American Institute of Aeronautics and Astronautics (AIAA), and the NASA Scientific and Technical Information (STI) Facility met (1) to review and discuss issues of NASA concern, and (2) to promote new and better ways to collect and disseminate scientific and technical information. Topics mentioned for study and discussion at subsequent meetings included the pros and cons of transferring the NASA/RECON database to the commercial sector, the quality of the database, and ways to increase foreign acquisitions. The input systems at AIAA and the STI Facility were described. Also discussed were the proposed RECON II retrieval system, the transmittal of document orders received by the Facility and sent to AIAA, and the handling of multimedia input by the Departments of Defense and Commerce. A second meeting was scheduled for six weeks later to discuss database quality and international foreign input.

  14. A Framework for Mapping User-Designed Forms to Relational Databases

    ERIC Educational Resources Information Center

    Khare, Ritu

    2011-01-01

    In the quest for database usability, several applications enable users to design custom forms using a graphical interface, and forward engineer the forms into new databases. The path-breaking aspect of such applications is that users are completely shielded from the technicalities of database creation. Despite this innovation, the process of…

  15. Coordinating Council. Fourth Meeting: NACA Documents Database Project

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This NASA Scientific and Technical Information Coordination Council meeting dealt with the topic 'NACA Documents Database Project'. The following presentations were made and reported on: NACA documents database project study plan, AIAA study, the Optimal NACA database, Deficiencies in online file, NACA documents: Availability and Preservation, the NARA Collection: What is in it? and What to do about it?, and NACA foreign documents and availability. Visuals are available for most presentations.

  16. Method of identification of patent trends based on descriptions of technical functions

    NASA Astrophysics Data System (ADS)

    Korobkin, D. M.; Fomenkov, S. A.; Golovanchikov, A. B.

    2018-05-01

    The use of the global patent space to determine scientific and technological priorities for technical systems development (identifying patent trends) allows one to forecast the direction of technical systems development and, accordingly, to select patents on priority technical subjects as a source for updating the technical functions database and the physical effects database. The authors propose an original method that uses as trend terms not individual unigrams or n-grams (as is usual for existing methods and systems) but structured descriptions of technical functions in the form "Subject-Action-Object" (SAO), which in the authors' opinion are the basis of an invention.
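To make the SAO idea concrete, the sketch below pulls "Subject-Action-Object" triples from simple active-voice sentences using a deliberately naive rule: the noun immediately before a known action verb is taken as the subject head, and the first noun after it as the object head. A real implementation would rely on a dependency parser; the verb list and example sentences here are invented for illustration.

```python
import re

# Toy action-verb lexicon; a production system would use a dependency parser
# and a much larger patent-domain verb list.
ACTION_VERBS = {"measures", "converts", "filters", "transmits"}

def extract_sao(sentence):
    """Return a (subject, action, object) triple from a simple
    active-voice sentence, or None if no known action verb is found."""
    tokens = re.findall(r"[A-Za-z]+", sentence.lower())
    for i, tok in enumerate(tokens):
        if tok in ACTION_VERBS and 0 < i < len(tokens) - 1:
            # Drop determiners so only the head nouns remain.
            subj = [t for t in tokens[:i] if t not in {"the", "a", "an"}][-1]
            obj = [t for t in tokens[i + 1:] if t not in {"the", "a", "an"}][0]
            return (subj, tok, obj)
    return None

triples = [extract_sao(s) for s in [
    "The sensor measures the pressure.",
    "The antenna transmits the signal.",
]]
```

Counting and clustering such triples across a patent corpus, rather than raw n-grams, is the kind of trend signal the method describes.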

  17. Clinical Databases for Chest Physicians.

    PubMed

    Courtwright, Andrew M; Gabriel, Peter E

    2018-04-01

    A clinical database is a repository of patient medical and sociodemographic information focused on one or more specific health conditions or exposures. Although clinical databases may be used for research purposes, their primary goal is to collect and track patient data for quality improvement, quality assurance, and/or actual clinical management. This article aims to provide an introduction and practical advice on the development of small-scale clinical databases for chest physicians and practice groups. Through example projects, we discuss the pros and cons of available technical platforms, including Microsoft Excel and Access, relational database management systems such as Oracle and PostgreSQL, and Research Electronic Data Capture (REDCap). We consider approaches to deciding the base unit of data collection, creating consensus around variable definitions, and structuring routine clinical care to complement database aims. We conclude with an overview of regulatory and security considerations for clinical databases. Copyright © 2018 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.
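As a concrete illustration of the small-scale, condition-focused design the abstract describes, here is a minimal relational sketch using Python's built-in sqlite3 module. The schema (a hypothetical COPD visit registry) and all table and column names are invented for the example, not taken from the article.

```python
import sqlite3

# In-memory database for illustration; a practice group would use a file
# or a server-backed RDBMS such as PostgreSQL.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE patient (
    patient_id INTEGER PRIMARY KEY,
    mrn        TEXT UNIQUE NOT NULL,   -- medical record number
    birth_year INTEGER
);
CREATE TABLE copd_visit (              -- base unit of collection: one clinic visit
    visit_id   INTEGER PRIMARY KEY,
    patient_id INTEGER NOT NULL REFERENCES patient(patient_id),
    visit_date TEXT NOT NULL,          -- ISO 8601 date
    fev1_pct   REAL                    -- FEV1 % predicted, per a consensus definition
);
""")
conn.execute("INSERT INTO patient (mrn, birth_year) VALUES (?, ?)", ("A-1001", 1954))
conn.execute(
    "INSERT INTO copd_visit (patient_id, visit_date, fev1_pct) "
    "VALUES (1, '2018-03-02', 62.5)"
)
rows = conn.execute(
    "SELECT p.mrn, v.visit_date, v.fev1_pct "
    "FROM copd_visit v JOIN patient p USING (patient_id)"
).fetchall()
```

Choosing the visit (rather than the patient) as the base unit of collection, as sketched here, is exactly the kind of design decision the article asks groups to settle before building the database.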

  18. Long-term patterns in CD4 response are determined by an interaction between baseline CD4 cell count, viral load, and time: The Asia Pacific HIV Observational Database (APHOD).

    PubMed

    Egger, Sam; Petoumenos, Kathy; Kamarulzaman, Adeeba; Hoy, Jennifer; Sungkanuparph, Somnuek; Chuah, John; Falster, Kathleen; Zhou, Jialun; Law, Matthew G

    2009-04-15

    Random-effects models were used to explore how the shape of CD4 cell count responses after commencing combination antiretroviral therapy (cART) develops over time and, in particular, the role of baseline and follow-up covariates. Patients in the Asia Pacific HIV Observational Database who first commenced cART after January 1, 1997, and who had baseline CD4 cell count and viral load measures and at least one follow-up measure between 6 and 24 months were included. CD4 cell counts were determined at every 6-month period after the commencement of cART for up to 6 years. A total of 1638 patients fulfilled the inclusion criteria, with a median follow-up time of 58 months. Lower post-cART mean CD4 cell counts were found to be associated with increasing age (P < 0.001), pre-cART hepatitis C coinfection (P = 0.038), prior AIDS (P = 0.019), baseline viral load ≤100,000 copies per milliliter (P < 0.001), and the Asia Pacific region compared with Australia (P = 0.005). A highly significant 3-way interaction between the effects of time, baseline CD4 cell count, and post-cART viral burden (P < 0.0001) was demonstrated. Higher long-term mean CD4 cell counts were associated with lower baseline CD4 cell count and consistently undetectable viral loads. Among patients with consistently detectable viral load, CD4 cell counts seemed to converge for all baseline CD4 levels. Our analyses suggest that the long-term shape of post-cART CD4 cell count changes depends only on a 3-way interaction between baseline CD4 cell count, viral load response, and time.
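The random-effects structure summarized in the abstract can be written schematically as follows. The abstract does not give the exact covariate coding, so this is only an indicative form; lower-order interaction terms and any random-slope structure are omitted for brevity.

```latex
% y_{ij}: CD4 count of patient i at follow-up time t_{ij} (indicative form only)
y_{ij} = \beta_0 + b_{0i}
       + \beta_1 t_{ij}
       + \beta_2\,\mathrm{CD4}^{0}_{i}           % baseline CD4 cell count
       + \beta_3\,\mathrm{VL}_{ij}               % post-cART viral load status
       + \beta_{123}\, t_{ij}\,\mathrm{CD4}^{0}_{i}\,\mathrm{VL}_{ij}  % 3-way interaction
       + \boldsymbol{\gamma}^{\top}\mathbf{x}_{i} + \varepsilon_{ij},
\qquad b_{0i}\sim\mathcal{N}(0,\sigma^2_b),\quad \varepsilon_{ij}\sim\mathcal{N}(0,\sigma^2)
```

Here \(b_{0i}\) is the patient-level random intercept, \(\mathbf{x}_{i}\) collects the other baseline covariates the abstract lists (age, hepatitis C coinfection, prior AIDS, region), and the significant \(\beta_{123}\) term is what the abstract reports as the 3-way interaction.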

  19. Technical and Organizational Considerations for the Long-Term Maintenance and Development of Digital Brain Atlases and Web-Based Databases

    PubMed Central

    Ito, Kei

    2010-01-01

    A digital brain atlas is a kind of image database that specifically provides information about neurons and glial cells in the brain. It has various advantages unmatched by conventional paper-based atlases. Such advantages, however, may become disadvantages if appropriate care is not taken. Because digital atlases can provide an unlimited amount of data, they should be designed to minimize redundancy and keep the records consistent as they are added incrementally by different staff members. The fact that digital atlases can easily be revised necessitates a system to assure that users can access previous versions that may have been cited in papers at a particular period. To pass our knowledge on to our descendants, such databases should be maintained for a very long period, well over 100 years, like printed books and papers. Technical and organizational measures to enable long-term archiving should therefore be considered seriously. Compared with the initial development of a database, subsequent efforts to increase the quality and quantity of its contents are not regarded highly, because such tasks do not materialize in the form of publications. This fact strongly discourages continuous expansion of, and external contributions to, digital atlases after their initial launch. To solve these problems, the role of biocurators is vital. Appreciation of the scientific achievements of people who do not write papers, and establishment of a secure academic career path for them, are indispensable for recruiting talent for this very important job. PMID:20661458

  20. Database Software Selection for the Egyptian National STI Network.

    ERIC Educational Resources Information Center

    Slamecka, Vladimir

    The evaluation and selection of information/data management system software for the Egyptian National Scientific and Technical (STI) Network are described. An overview of the state-of-the-art of database technology elaborates on the differences between information retrieval and database management systems (DBMS). The desirable characteristics of…

  1. Open access intrapartum CTG database.

    PubMed

    Chudáček, Václav; Spilka, Jiří; Burša, Miroslav; Janků, Petr; Hruban, Lukáš; Huptych, Michal; Lhotská, Lenka

    2014-01-13

    Cardiotocography (CTG) is the monitoring of fetal heart rate and uterine contractions; since the 1960s it has been used routinely by obstetricians to assess fetal well-being. Many attempts to introduce methods of automatic signal processing and evaluation have appeared during the last 20 years, yet no progress comparable to that in the domain of adult heart rate variability, where open-access databases such as MIT-BIH are available, is visible. Based on a thorough review of the relevant publications, presented in this paper, the shortcomings of the current state are obvious: a lack of common ground for clinicians and technicians in the field hinders clinically usable progress. Our open-access database of digital intrapartum cardiotocographic recordings aims to change that. The intrapartum CTG database consists of 552 intrapartum recordings acquired between April 2010 and August 2012 at the obstetrics ward of the University Hospital in Brno, Czech Republic. All recordings were stored in electronic form in the OB TraceVue® system. The recordings were selected from 9164 intrapartum recordings with both clinical and technical considerations in mind. All recordings are at most 90 minutes long and start a maximum of 90 minutes before delivery. The time relation of the CTG to delivery is known, as is the length of the second stage of labor, which does not exceed 30 minutes. The majority of recordings (all but 46 cesarean sections) are intentionally from vaginal deliveries. All recordings have available biochemical markers as well as some more general clinical features. A full description of the database and the reasoning behind the selection of parameters is presented in the paper. A new open-access CTG database is introduced which should give the research community common ground for comparison of results on a reasonably large database. We anticipate that after reading the paper, the reader will understand the context of the field from clinical and…

  2. A complete database for the Einstein imaging proportional counter

    NASA Technical Reports Server (NTRS)

    Helfand, David J.

    1991-01-01

    A complete database for the Einstein Imaging Proportional Counter (IPC) was completed. Described here are the original data making up the archive, the structure of the database, the Op-Ed analysis system, the technical advances achieved in the analysis of IPC data, the data products produced, and some uses to which the database has been put by scientists outside Columbia University over the past year.

  3. Development of a database for Louisiana highway bridge scour data : technical summary.

    DOT National Transportation Integrated Search

    1999-10-01

    The objectives of the project included: 1) develop a database with manipulation capabilities such as data retrieval, visualization, and update; and 2) input the existing scour data from DOTD files into the database.

  4. NASA scientific and technical publications: A catalog of special publications, reference publications, conference publications, and technical papers, 1989

    NASA Technical Reports Server (NTRS)

    1990-01-01

    This catalog lists 190 citations of all NASA Special Publications, NASA Reference Publications, NASA Conference Publications, and NASA Technical Papers that were entered into the NASA scientific and technical information database during accession year 1989. The entries are grouped by subject category. Indexes of subject terms, personal authors, and NASA report numbers are provided.

  5. NASA scientific and technical publications: A catalog of Special Publications, Reference Publications, Conference Publications, and Technical Papers, 1987

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This catalog lists 239 citations of all NASA Special Publications, NASA Reference Publications, NASA Conference Publications, and NASA Technical Papers that were entered in the NASA scientific and technical information database during accession year 1987. The entries are grouped by subject category. Indexes of subject terms, personal authors, and NASA report numbers are provided.

  6. technical Hexachlorocyclohexane (t-HCH)

    Integrated Risk Information System (IRIS)

    technical Hexachlorocyclohexane (t-HCH); CASRN 608-73-1. Human health assessment information on a chemical substance is included in the IRIS database only after a comprehensive review of toxicity data, as outlined in the IRIS assessment development process. Sections I (Health Hazard Asses…

  7. A Database for Compounding Stress Intensity Factors.

    DTIC Science & Technology

    1985-04-01

    Royal Aircraft Establishment Technical Report 85046 (TR 85046, UNLIMITED), April 1985: A Database for Compounding Stress Intensity Factors, by A. M. Prior, D. P. Rooke, and D. J. Cartwright. SUMMARY: The compounding method enables…

  8. Technical Aspects of Interfacing MUMPS to an External SQL Relational Database Management System

    PubMed Central

    Kuzmak, Peter M.; Walters, Richard F.; Penrod, Gail

    1988-01-01

    This paper describes an interface connecting InterSystems MUMPS (M/VX) to an external relational DBMS, the SYBASE Database Management System. The interface enables MUMPS to operate in a relational environment and gives the MUMPS language full access to a complete set of SQL commands. MUMPS generates SQL statements as ASCII text and sends them to the RDBMS. The RDBMS executes the statements and returns ASCII results to MUMPS. The interface suggests that the language features of MUMPS make it an attractive tool for use in the relational database environment. The approach described in this paper separates MUMPS from the relational database. Positioning the relational database outside of MUMPS promotes data sharing and permits a number of different options to be used for working with the data. Other languages like C, FORTRAN, and COBOL can access the RDBMS database. Advanced tools provided by the relational database vendor can also be used. SYBASE is an advanced high-performance transaction-oriented relational database management system for the VAX/VMS and UNIX operating systems. SYBASE is designed using a distributed open-systems architecture, and is relatively easy to interface with MUMPS.
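The text-based bridge described in this abstract (SQL sent as ASCII text, ASCII results returned) can be sketched in a few lines. This toy uses Python with an in-memory SQLite database as a stand-in for the MUMPS/SYBASE pairing; the table, the tab delimiter, and the function names are assumptions for illustration only:

```python
import sqlite3

# Toy stand-in for the MUMPS-to-RDBMS bridge: the "client" side hands
# over a SQL statement as plain ASCII text, and the "server" side
# returns the result set as tab-delimited ASCII lines.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE patients (id INTEGER, name TEXT)")
con.executemany("INSERT INTO patients VALUES (?, ?)",
                [(1, "Adams"), (2, "Baker")])

def execute_ascii(sql_text: str) -> str:
    """Execute a SQL statement received as text; return ASCII rows."""
    rows = con.execute(sql_text).fetchall()
    return "\n".join("\t".join(str(v) for v in row) for row in rows)

reply = execute_ascii("SELECT id, name FROM patients ORDER BY id")
```

The appeal of this pattern, as the paper notes, is that the database sits outside the host language, so any client able to emit SQL text and parse delimited replies can share the same data.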

  9. In-Space Manufacturing Baseline Property Development

    NASA Technical Reports Server (NTRS)

    Stockman, Tom; Schneider, Judith; Prater, Tracie; Bean, Quincy; Werkheiser, Nicki

    2016-01-01

    The In-Space Manufacturing (ISM) project at NASA Marshall Space Flight Center currently operates a 3D FDM (fused deposition modeling) printer onboard the International Space Station. In order to enable utilization of this capability by designers, the project needs to establish characteristic material properties for materials produced using the process. This is difficult for additive manufacturing, since standards and specifications do not yet exist for these technologies. Limited availability of crew time restricts the sample size, which in turn limits the application of traditional design-allowables approaches to developing a materials property database for designers. In this study, various approaches to the development of material databases were evaluated for use by designers of space systems who wish to leverage in-space manufacturing capabilities. The study focuses on alternative statistical techniques for baseline property development to support in-space manufacturing.

  10. 77 FR 12234 - Changes in Hydric Soils Database Selection Criteria

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-29

    ... Conservation Service [Docket No. NRCS-2011-0026] Changes in Hydric Soils Database Selection Criteria AGENCY... Changes to the National Soil Information System (NASIS) Database Selection Criteria for Hydric Soils of the United States. SUMMARY: The National Technical Committee for Hydric Soils (NTCHS) has updated the...

  11. [The theme of disaster in health care: profile of technical and scientific production in the specialized database on disasters of the Virtual Health Library - VHL].

    PubMed

    Rocha, Vania; Ximenes, Elisa Francioli; Carvalho, Mauren Lopes de; Alpino, Tais de Moura Ariza; Freitas, Carlos Machado de

    2014-09-01

    In the specialized databases of the Virtual Health Library (VHL), the DISASTER database highlights the importance of the theme for the health sector. The scope of this article is to identify the profile of technical and scientific publications in this specialized database. Based on systematic searches and the analysis of results, it is possible to determine: the type of publication; the main topics addressed; the types of disaster most commonly mentioned in published materials; the countries and regions covered; the historic periods with the most publications; and the current trend of publications. When examining the specialized data in detail, it soon becomes clear that the number of major topics is very high, making a specific search in this database a challenging exercise. On the other hand, it is encouraging that the disaster topic is discussed and assessed in a broad and diversified manner, associated with different aspects of the natural and social sciences. The disaster issue requires interdisciplinary knowledge production to reduce the impacts of disasters and to support risk management. Since the health sector is an interdisciplinary area, it can contribute to this knowledge production.

  12. Information Management Tools for Classrooms: Exploring Database Management Systems. Technical Report No. 28.

    ERIC Educational Resources Information Center

    Freeman, Carla; And Others

    In order to understand how the database software or online database functioned in the overall curricula, the use of database management (DBMs) systems was studied at eight elementary and middle schools through classroom observation and interviews with teachers and administrators, librarians, and students. Three overall areas were addressed:…

  13. National INFOSEC technical baseline: multi-level secure systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, J P

    1998-09-28

    The purpose of this report is to provide a baseline description of the state of multilevel processor/processing to the INFOSEC Research Council and, at their discretion, to the R&D community at large. From the information in the report, it is hoped that the members of the IRC will be aware of gaps in MLS research. A primary purpose is to bring IRC and research community members up to date on what is happening in the MLS arena. The review attempts to cover what MLS products are still available and to identify companies who still offer MLS products. We have also attempted to identify requirements for MLS by interviewing senior officers of the Intelligence community as well as those elements of DoD and DOE who are or may be interested in procuring MLS products for various applications. The balance of the report consists of the following sections: a background review of the highlights of the development of MLS; a quick summary of where we are today in terms of products, installations, and companies still in the business of supplying MLS systems (or developing them); the requirements as expressed by senior members of the Intelligence community, DoD, and DOE; issues and unmet R&D challenges surrounding MLS; and finally a set of recommended research topics.

  14. Baseline estimation in flame's spectra by using neural networks and robust statistics

    NASA Astrophysics Data System (ADS)

    Garces, Hugo; Arias, Luis; Rojas, Alejandro

    2014-09-01

    This work presents a baseline estimation method for flame spectra based on an artificial intelligence structure, a neural network, combining robust statistics with multivariate analysis to automatically discriminate measured wavelengths belonging to the continuous feature for model adaptation, thereby removing the restriction of having to measure the target baseline for training. The main contributions of this paper are: to analyze a flame spectra database by computing Jolliffe statistics from Principal Components Analysis, detecting wavelengths not correlated with most of the measured data and therefore corresponding to the baseline; to systematically determine the optimal number of neurons in hidden layers based on Akaike's Final Prediction Error; to estimate the baseline over the full wavelength range of the sampled spectra; and to train a neural network that generalizes the relation between measured and baseline spectra. The main application of our research is to compute total radiation with baseline information, allowing the state of the combustion process to be diagnosed for optimization in early stages.

  15. Weight change by baseline BMI from three-year observational data: findings from the Worldwide Schizophrenia Outpatient Health Outcomes Database.

    PubMed

    Bushe, Chris J; Slooff, Cees J; Haddad, Peter M; Karagianis, Jamie L

    2013-04-01

    The aim was to explore weight and body mass index (BMI) changes by baseline BMI in patients completing three years of monotherapy with various first- and second-generation antipsychotics in a large cohort in a post hoc analysis of three-year observational data. Data were analyzed by antipsychotic and three baseline BMI bands: underweight/normal weight (BMI <25 kg/m²), overweight (25-30 kg/m²) and obese (>30 kg/m²). Baseline BMI was associated with subsequent weight change irrespective of the antipsychotic given. Specifically, a smaller proportion of patients gained ≥7% baseline bodyweight, and a greater proportion of patients lost ≥7% baseline bodyweight with increasing baseline BMI. For olanzapine (the antipsychotic associated with highest mean weight gain in the total drug cohort), the percentage of patients gaining ≥7% baseline weight was 45% (95% CI: 43-48) in the underweight/normal weight BMI cohort and 20% (95% CI: 15-27) in the obese BMI cohort; 7% (95% CI: 6-8) of the underweight/normal cohort and 19% (95% CI: 13-27) of the obese cohort lost ≥7% baseline weight. BMI has an association with the likelihood of weight gain or loss and should be considered in analyses of antipsychotic weight change.

  16. NASA scientific and technical publications: A catalog of special publications, reference publications, conference publications, and technical papers, 1987-1990

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This catalog lists 783 citations of all NASA Special Publications, NASA Reference Publications, NASA Conference Publications, and NASA Technical Papers that were entered into the NASA Scientific and Technical Information Database during the years 1987 through 1990. The entries are grouped by subject category. Indexes of subject terms, personal authors, and NASA report numbers are provided.

  17. New baseline correction algorithm for text-line recognition with bidirectional recurrent neural networks

    NASA Astrophysics Data System (ADS)

    Morillot, Olivier; Likforman-Sulem, Laurence; Grosicki, Emmanuèle

    2013-04-01

    Many preprocessing techniques have been proposed for isolated word recognition. However, recently, recognition systems have dealt with text blocks and their compound text lines. In this paper, we propose a new preprocessing approach to efficiently correct baseline skew and fluctuations. Our approach is based on a sliding window within which the vertical position of the baseline is estimated. Segmentation of text lines into subparts is, thus, avoided. Experiments conducted on a large publicly available database (Rimes), with a BLSTM (bidirectional long short-term memory) recurrent neural network recognition system, show that our baseline correction approach highly improves performance.
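The sliding-window idea of this abstract (estimate a local baseline position within each window instead of segmenting the line into subparts) can be sketched with a one-dimensional toy. The input here is the lowest ink coordinate per image column, and the window size and median estimator are illustrative assumptions, not the authors' exact method:

```python
# Minimal sketch of sliding-window baseline estimation for a text
# line: for each horizontal position, take a robust local estimate
# (here the median over a small window) as the baseline height.
def sliding_baseline(ys, window=5):
    """ys: lowest ink y-coordinate per column of a text-line image."""
    half = window // 2
    baseline = []
    for i in range(len(ys)):
        chunk = sorted(ys[max(0, i - half): i + half + 1])
        baseline.append(chunk[len(chunk) // 2])  # local median
    return baseline

# A gently skewed line with one outlier column (e.g. a descender).
ys = [10, 11, 11, 12, 30, 13, 13, 14, 15, 15]
base = sliding_baseline(ys)
```

The local median follows the skew of the line while ignoring the outlier, which is the behavior a baseline-correction preprocessor needs before feeding normalized line images to the BLSTM recognizer.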

  18. NASA scientific and technical publications: A catalog of special publications, reference publications, conference publications, and technical papers, 1991-1992

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This catalog lists 458 citations of all NASA Special Publications, NASA Reference Publications, NASA Conference Publications, and NASA Technical Papers that were entered into the NASA Scientific and Technical Information Database during accession years 1991 and 1992. The entries are grouped by subject category. Indexes of subject terms, personal authors, and NASA report numbers are provided.

  19. Launch Vehicle Design Process: Characterization, Technical Integration, and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Blair, J. C.; Ryan, R. S.; Schutzenhofer, L. A.; Humphries, W. R.

    2001-01-01

    Engineering design is a challenging activity for any product. Since launch vehicles are highly complex and interconnected and have extreme energy densities, their design represents a challenge of the highest order. The purpose of this document is to delineate and clarify the design process associated with the launch vehicle for space flight transportation. The goal is to define and characterize a baseline for the space transportation design process. This baseline can be used as a basis for improving effectiveness and efficiency of the design process. The baseline characterization is achieved via compartmentalization and technical integration of subsystems, design functions, and discipline functions. First, a global design process overview is provided in order to show responsibility, interactions, and connectivity of overall aspects of the design process. Then design essentials are delineated in order to emphasize necessary features of the design process that are sometimes overlooked. Finally the design process characterization is presented. This is accomplished by considering project technical framework, technical integration, process description (technical integration model, subsystem tree, design/discipline planes, decision gates, and tasks), and the design sequence. Also included in the document are a snapshot relating to process improvements, illustrations of the process, a survey of recommendations from experienced practitioners in aerospace, lessons learned, references, and a bibliography.

  20. The GraVent DDT database

    NASA Astrophysics Data System (ADS)

    Boeck, Lorenz R.; Katzy, Peter; Hasslberger, Josef; Kink, Andreas; Sattelmayer, Thomas

    2016-09-01

    An open-access online platform containing data from experiments on deflagration-to-detonation transition conducted at the Institute of Thermodynamics, Technical University of Munich, has been developed and is accessible at http://www.td.mw.tum.de/ddt. The database provides researchers working on explosion dynamics with data for theoretical analyses and for the validation of numerical simulations.

  1. Technical Basis for PNNL Beryllium Inventory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Michelle Lynn

    2014-07-09

    The Department of Energy (DOE) issued Title 10 of the Code of Federal Regulations Part 850, “Chronic Beryllium Disease Prevention Program” (the Beryllium Rule) in 1999 and required full compliance by no later than January 7, 2002. The Beryllium Rule requires the development of a baseline beryllium inventory of the locations of beryllium operations and other locations of potential beryllium contamination at DOE facilities. The baseline beryllium inventory is also required to identify workers exposed or potentially exposed to beryllium at those locations. Prior to DOE issuing 10 CFR 850, Pacific Northwest Nuclear Laboratory (PNNL) had documented the beryllium characterization and worker exposure potential for multiple facilities in compliance with DOE’s 1997 Notice 440.1, “Interim Chronic Beryllium Disease.” After DOE’s issuance of 10 CFR 850, PNNL developed an implementation plan to be compliant by 2002. In 2014, an internal self-assessment (ITS #E-00748) of PNNL’s Chronic Beryllium Disease Prevention Program (CBDPP) identified several deficiencies. One deficiency is that the technical basis for establishing the baseline beryllium inventory when the Beryllium Rule was implemented was either not documented or not retrievable. In addition, the beryllium inventory itself had not been adequately documented and maintained since PNNL established its own CBDPP, separate from Hanford Site’s program. This document reconstructs PNNL’s baseline beryllium inventory as it would have existed when it achieved compliance with the Beryllium Rule in 2001 and provides the technical basis for the baseline beryllium inventory.

  2. CALS Baseline Architecture Analysis of Weapons System. Technical Information: Army, Draft. Volume 8

    DOT National Transportation Integrated Search

    1989-09-01

    This effort was performed to provide a common framework for analysis and planning of CALS initiatives across the military services, leading eventually to the development of a common DoD-wide architecture for CALS. This study addresses Army technical ...

  3. Potentials of Advanced Database Technology for Military Information Systems

    DTIC Science & Technology

    2001-04-01

    UNCLASSIFIED. Defense Technical Information Center Compilation Part Notice ADP010866. TITLE: Potentials of Advanced Database Technology for Military Information Systems. Sunil Choenni and Ben Bruggeman, National Aerospace Laboratory NLR, P.O. Box 90502, 1006 BM Amsterdam. … application of advanced information technology, including database technology, as underpinning …

  4. GEOMAGIA50: An archeointensity database with PHP and MySQL

    NASA Astrophysics Data System (ADS)

    Korhonen, K.; Donadini, F.; Riisager, P.; Pesonen, L. J.

    2008-04-01

    The GEOMAGIA50 database stores 3798 archeomagnetic and paleomagnetic intensity determinations dated to the past 50,000 years. It also stores details of the measurement setup for each determination, which are used for ranking the data according to prescribed reliability criteria. The ranking system aims to alleviate the data reliability problem inherent in this kind of data. GEOMAGIA50 is based on two popular open-source technologies: the MySQL database management system is used for storing the data, whereas the functionality and user interface are provided by server-side PHP scripts. This brief gives a detailed technical description of GEOMAGIA50.
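The ranking idea, filtering intensity determinations by properties of their measurement setup, can be illustrated with a toy relational query. The schema, column names, and criteria below are invented for illustration and are not GEOMAGIA50's actual MySQL schema or ranking rules (SQLite stands in for MySQL here):

```python
import sqlite3

# Toy stand-in for an intensity-determination table.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE determinations (
    id INTEGER PRIMARY KEY,
    intensity_ut REAL,        -- paleointensity in microtesla
    n_specimens INTEGER,      -- number of specimens averaged
    alteration_check INTEGER  -- 1 if an alteration check was performed
)""")
con.executemany("INSERT INTO determinations VALUES (?,?,?,?)", [
    (1, 55.2, 5, 1),
    (2, 61.0, 1, 0),
    (3, 48.7, 3, 1),
])

# "Higher-rank" data in this sketch: several specimens averaged and an
# alteration check performed.
rows = con.execute("""SELECT id FROM determinations
                      WHERE n_specimens >= 3 AND alteration_check = 1
                      ORDER BY id""").fetchall()
```

In the real database the PHP layer builds such queries from user-selected reliability criteria, so users can retrieve only the subset of determinations that meets their quality bar.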

  5. Scale out databases for CERN use cases

    NASA Astrophysics Data System (ADS)

    Baranowski, Zbigniew; Grzybek, Maciej; Canali, Luca; Lanza Garcia, Daniel; Surdy, Kacper

    2015-12-01

    Data generation rates are expected to grow very fast for some database workloads going into LHC run 2 and beyond. In particular this is expected for data coming from controls, logging and monitoring systems. Storing, administering and accessing big data sets in a relational database system can quickly become a very hard technical challenge, as the size of the active data set and the number of concurrent users increase. Scale-out database technologies are a rapidly developing set of solutions for deploying and managing very large data warehouses on commodity hardware and with open source software. In this paper we will describe the architecture and tests on database systems based on Hadoop and the Cloudera Impala engine. We will discuss the results of our tests, including tests of data loading and integration with existing data sources and in particular with relational databases. We will report on query performance tests done with various data sets of interest at CERN, notably data from the accelerator log database.

  6. Measuring cognitive change with ImPACT: the aggregate baseline approach.

    PubMed

    Bruce, Jared M; Echemendia, Ruben J; Meeuwisse, Willem; Hutchison, Michael G; Aubry, Mark; Comper, Paul

    2017-11-01

    The Immediate Post-Concussion Assessment and Cognitive Test (ImPACT) is commonly used to assess baseline and post-injury cognition among athletes in North America. Despite this, several studies have questioned the reliability of ImPACT when given at intervals employed in clinical practice. Poor test-retest reliability reduces test sensitivity to cognitive decline, increasing the likelihood that concussed athletes will be returned to play prematurely. We recently showed that the reliability of ImPACT can be increased when using a new composite structure and the aggregate of two baselines to predict subsequent performance. The purpose of the present study was to confirm our previous findings and determine whether the addition of a third baseline would further increase the test-retest reliability of ImPACT. Data from 97 English speaking professional hockey players who had received at least 4 ImPACT baseline evaluations were extracted from a National Hockey League Concussion Program database. Linear regression was used to determine whether each of the first three testing sessions accounted for unique variance in the fourth testing session. Results confirmed that the aggregate baseline approach improves the psychometric properties of ImPACT, with most indices demonstrating adequate or better test-retest reliability for clinical use. The aggregate baseline approach provides a modest clinical benefit when recent baselines are available - and a more substantial benefit when compared to approaches that obtain baseline measures only once during the course of a multi-year playing career. Pending confirmation in diverse samples, neuropsychologists are encouraged to use the aggregate baseline approach to best quantify cognitive change following sports concussion.
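The statistical intuition behind the aggregate-baseline approach, averaging several parallel baseline measurements to obtain a more reliable composite, can be sketched with the classical Spearman-Brown prophecy formula. This is a textbook illustration of why aggregation helps, not the paper's actual regression analysis, and the reliability value is made up:

```python
# Spearman-Brown: averaging k parallel measurements, each with
# single-trial test-retest reliability r, yields a composite with
# reliability k*r / (1 + (k - 1)*r).
def aggregate_reliability(r: float, k: int) -> float:
    return k * r / (1 + (k - 1) * r)

r1 = 0.60                          # illustrative single-baseline reliability
r2 = aggregate_reliability(r1, 2)  # two aggregated baselines
r3 = aggregate_reliability(r1, 3)  # three aggregated baselines
```

A more reliable baseline composite means smaller reliable-change thresholds, i.e., greater sensitivity to genuine post-concussion decline, which is the clinical payoff the abstract describes.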

  7. NASA scientific and technical information for the 1990s

    NASA Technical Reports Server (NTRS)

    Cotter, Gladys A.

    1990-01-01

    Projections for NASA scientific and technical information (STI) in the 1990s are outlined. NASA STI for the 1990s will maintain a quality bibliographic and full-text database, emphasizing electronic input and products supplemented by networked access to a wide variety of sources, particularly numeric databases.

  8. Nato-Pco Database

    NASA Astrophysics Data System (ADS)

    Wtv Gmbh

    This new CD-ROM is a reference database. It covers almost twenty years of non-military scientific/technical meetings and publications sponsored by the NATO Science Committee. It contains full references (with keywords and/or abstracts) to more than 30,000 contributions from scientists all over the world and is published in more than 1,000 volumes. With the easy-to-follow menu options of the retrieval software, access to the data is simple and fast. Updates are planned on a yearly basis.

  9. Methods for using clinical laboratory test results as baseline confounders in multi-site observational database studies when missing data are expected.

    PubMed

    Raebel, Marsha A; Shetterly, Susan; Lu, Christine Y; Flory, James; Gagne, Joshua J; Harrell, Frank E; Haynes, Kevin; Herrinton, Lisa J; Patorno, Elisabetta; Popovic, Jennifer; Selvan, Mano; Shoaibi, Azadeh; Wang, Xingmei; Roy, Jason

    2016-07-01

    Our purpose was to quantify missing baseline laboratory results, assess predictors of missingness, and examine performance of missing data methods. Using the Mini-Sentinel Distributed Database from three sites, we selected three exposure-outcome scenarios with laboratory results as baseline confounders. We compared hazard ratios (HRs) or risk differences (RDs) and 95% confidence intervals (CIs) from models that omitted laboratory results, included only available results (complete cases), and included results after applying missing data methods (multiple imputation [MI] regression, MI predictive mean matching [PMM], indicator). Scenario 1 considered glucose among second-generation antipsychotic users and diabetes. Across sites, glucose was available for 27.7-58.9%. Results differed between complete case and missing data models (e.g., olanzapine: HR 0.92 [CI 0.73, 1.12] vs 1.02 [0.90, 1.16]). Across-site models employing different MI approaches provided similar HRs and CIs; site-specific models provided differing estimates. Scenario 2 evaluated creatinine among individuals starting high- versus low-dose lisinopril and hyperkalemia. Creatinine availability: 44.5-79.0%. Results differed between complete case and missing data models (e.g., HR 0.84 [CI 0.77, 0.92] vs. 0.88 [0.83, 0.94]). HRs and CIs were identical across MI methods. Scenario 3 examined international normalized ratio (INR) among warfarin users starting interacting versus noninteracting antimicrobials and bleeding. INR availability: 20.0-92.9%. Results differed between ignoring INR versus including INR using missing data methods (e.g., RD 0.05 [CI -0.03, 0.13] vs 0.09 [0.00, 0.18]). Indicator and PMM methods gave similar estimates. Multi-site studies must consider site variability in missing data. Different missing data methods performed similarly. Copyright © 2016 John Wiley & Sons, Ltd.
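The contrast between complete-case analysis and imputation can be made concrete with a tiny example. The sketch below uses simple single mean imputation, which is deliberately cruder than the multiple-imputation methods the study evaluated; the lab values are invented. It shows the well-known drawback that mean imputation preserves the mean but understates variability, one reason MI regression or PMM is preferred in practice:

```python
from statistics import mean, pvariance

glucose = [95, None, 110, None, 102, 88]   # baseline lab values, some missing
observed = [g for g in glucose if g is not None]

# Complete-case analysis: simply drop rows with missing values.
complete_case_mean = mean(observed)

# Single mean imputation: fill gaps with the observed mean.
imputed = [g if g is not None else complete_case_mean for g in glucose]
```

Multiple imputation instead draws several plausible values per gap and pools the resulting estimates, so the extra uncertainty from missingness is reflected in the confidence intervals.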

  10. ECOS E-MATRIX Methane and Volatile Organic Carbon (VOC) Emissions Best Practices Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parisien, Lia

    2016-01-31

    This final scientific/technical report on the ECOS e-MATRIX Methane and Volatile Organic Carbon (VOC) Emissions Best Practices Database provides a disclaimer and acknowledgement, table of contents, executive summary, description of project activities, and briefing/technical presentation link.

  11. Lessons learned while building the Deepwater Horizon Database: Toward improved data sharing in coastal science

    NASA Astrophysics Data System (ADS)

    Thessen, Anne E.; McGinnis, Sean; North, Elizabeth W.

    2016-02-01

    Process studies and coupled-model validation efforts in geosciences often require integration of multiple data types across time and space. For example, improved prediction of hydrocarbon fate and transport is an important societal need which fundamentally relies upon synthesis of oceanography and hydrocarbon chemistry. Yet, there are no publicly accessible databases which integrate these diverse data types in a georeferenced format, nor are there guidelines for developing such a database. The objective of this research was to analyze the process of building one such database to provide baseline information on data sources and data sharing and to document the challenges and solutions that arose during this major undertaking. The resulting Deepwater Horizon Database was approximately 2.4 GB in size and contained over 8 million georeferenced data points collected from industry, government databases, volunteer networks, and individual researchers. The major technical challenges that were overcome were reconciliation of terms, units, and quality flags, which was necessary to effectively integrate the disparate data sets. Assembling this database required the development of relationships with individual researchers and data managers, which often involved extensive e-mail contacts. The average number of e-mails exchanged per data set was 7.8. Of the 95 relevant data sets that were discovered, 38 (40%) were obtained, either in whole or in part. Over one third (36%) of the requests for data went unanswered. The majority of responses were received after the first request (64%) and within the first week of the first request (67%). Although fewer than half of the potentially relevant datasets were incorporated into the database, the level of sharing (40%) was high compared to some other disciplines where sharing can be as low as 10%. Our suggestions for building integrated databases include budgeting significant time for e-mail exchanges, being cognizant of the cost versus

  12. Core Technical Capability Laboratory Management System

    NASA Technical Reports Server (NTRS)

    Shaykhian, Linda; Dugger, Curtis; Griffin, Laurie

    2008-01-01

    The Core Technical Capability Laboratory Management System (CTCLMS) consists of dynamically generated Web pages used to access a database containing detailed CTC lab data, with the software hosted on a server that allows users to have remote access.

  13. Functions and requirements document for interim store solidified high-level and transuranic waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith-Fewell, M.A., Westinghouse Hanford

    1996-05-17

    The functions, requirements, interfaces, and architectures contained within the Functions and Requirements (F&R) Document are based on the information currently contained within the TWRS Functions and Requirements database. The database also documents the set of technically defensible functions and requirements associated with the solidified waste interim storage mission. The F&R Document provides a snapshot in time of the technical baseline for the project. The F&R Document is the product of functional analysis, requirements allocation, and architectural structure definition. The technical baseline described in this document is traceable to TWRS function 4.2.4.1, Interim Store Solidified Waste, and its related requirements, architecture, and interfaces.

  14. Logistics Operations Management Center: Maintenance Support Baseline (LOMC-MSB)

    NASA Technical Reports Server (NTRS)

    Kurrus, R.; Stump, F.

    1995-01-01

    The Logistics Operations Management Center Maintenance Support Baseline is defined. A historical record of systems applied to, and deleted from, designs is provided in support of future management and/or technical analysis. All Flight elements, Ground Support Equipment, Facility Systems and Equipment, and Test Support Equipment for which LOMC has responsibilities at Kennedy Space Center and other locations are listed. International Space Station Alpha Program documentation is supplemented. The responsibility of the Space Station Launch Site Support Office is established.

  15. Experimental Database with Baseline CFD Solutions: 2-D and Axisymmetric Hypersonic Shock-Wave/Turbulent-Boundary-Layer Interactions

    NASA Technical Reports Server (NTRS)

    Marvin, Joseph G.; Brown, James L.; Gnoffo, Peter A.

    2013-01-01

    A database compilation of hypersonic shock-wave/turbulent boundary layer experiments is provided. The experiments selected for the database are either 2D or axisymmetric, and include both compression-corner and impinging-type SWTBL interactions. The strength of the interactions ranges from attached to incipient separation to fully separated flows. The experiments were chosen based on criteria to ensure the quality of the datasets, to be relevant to NASA's missions, and to be useful for validation and uncertainty assessment of CFD Navier-Stokes predictive methods, both now and in the future. The emphasis in the selected datasets is on surface pressures and surface heating throughout the interaction, but some wall shear stress distributions and flowfield profiles are included. For selected cases, example CFD grids and setup information are included, along with surface pressure and wall heating results from simulations using current NASA real-gas Navier-Stokes codes, against which future CFD investigators can compare and evaluate physics modeling improvements and perform validation and uncertainty assessments of future CFD code developments. The experimental database is presented in tabulated form in the Appendices describing each experiment, and is also provided in computer-readable ASCII files located on a companion DVD.

  16. Bias error reduction using ratios to baseline experiments. Heat transfer case study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chakroun, W.; Taylor, R.P.; Coleman, H.W.

    1993-10-01

    Precision uncertainties in experiments can be reduced by repeated trials and averaging, but bias errors cannot. Employing a set of experiments devoted to examining the effect of surface finish (riblets) on convective heat transfer as an example, this technical note explores a scheme for bias error reduction that can give considerable advantage when parametric effects are investigated experimentally. When the results of an experiment are presented as a ratio with the baseline results, a large reduction in the overall uncertainty can be achieved when all the bias limits in the variables of the experimental result are fully correlated with those of the baseline case. 4 refs.
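    The cancellation of fully correlated bias limits in a ratio can be sketched numerically. The following Python snippet (illustrative numbers only, not data from the note) propagates bias and precision limits through the ratio r = x / xb by first-order uncertainty analysis:

```python
import math

def ratio_uncertainty(x, xb, bx, bxb, px, pxb, rho):
    """Overall uncertainty of the ratio r = x / xb by first-order propagation.

    bx, bxb: bias limits; px, pxb: precision limits; rho: correlation
    coefficient between the two bias limits (1.0 = fully correlated).
    Sensitivity coefficients: dr/dx = 1/xb, dr/dxb = -x/xb**2.
    """
    tx, txb = 1.0 / xb, -x / xb**2
    bias_sq = (tx * bx)**2 + (txb * bxb)**2 + 2.0 * rho * tx * txb * bx * bxb
    prec_sq = (tx * px)**2 + (txb * pxb)**2
    return math.sqrt(max(bias_sq, 0.0) + prec_sq)

# Same instrumentation for test and baseline: equal 2% relative bias limits
# and 0.5% relative precision limits (placeholder values).
x, xb = 110.0, 100.0
u_corr = ratio_uncertainty(x, xb, 0.02 * x, 0.02 * xb, 0.005 * x, 0.005 * xb, 1.0)
u_uncorr = ratio_uncertainty(x, xb, 0.02 * x, 0.02 * xb, 0.005 * x, 0.005 * xb, 0.0)
# With rho = 1 the correlated bias terms cancel exactly, leaving only the
# precision contribution; with rho = 0 the bias terms dominate.
```

    In the fully correlated case only the precision contribution survives, which is the large reduction in overall uncertainty the note describes.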

  17. The role of non-technical skills in surgery

    PubMed Central

    Agha, Riaz A.; Fowler, Alexander J.; Sevdalis, Nick

    2015-01-01

    Non-technical skills are of increasing importance in surgery and surgical training. A traditional focus on technical skills acquisition and competence is no longer enough for the delivery of a modern, safe surgical practice. This review discusses the importance of non-technical skills and the values that underpin successful modern surgical practice. This narrative review drew on a range of written and online sources; no specific search strategy across defined databases was used. Modern surgical practice requires technical and non-technical skills, evidence-based practice, an emphasis on lifelong learning, monitoring of outcomes, and a supportive institutional and health service framework. Finally, these requirements need to be combined with a number of personal and professional values, including integrity, professionalism, and compassionate, patient-centred care. PMID:26904193

  18. Transport and Environment Database System (TRENDS): Maritime air pollutant emission modelling

    NASA Astrophysics Data System (ADS)

    Georgakaki, Aliki; Coffey, Robert A.; Lock, Graham; Sorenson, Spencer C.

    This paper reports the development of the maritime module within the framework of the Transport and Environment Database System (TRENDS) project. A detailed database has been constructed for the calculation of energy consumption and air pollutant emissions. Based on an in-house database of commercial vessels kept at the Technical University of Denmark, relationships between the fuel consumption and size of different vessels have been developed, taking into account the fleet's age and service speed. The technical assumptions and factors incorporated in the database are presented, including changes from findings reported in Methodologies for Estimating air pollutant Emissions from Transport (MEET). The database operates on statistical data provided by Eurostat, which describe vessel and freight movements from and towards EU 15 major ports. Data are at port to Maritime Coastal Area (MCA) level, so a bottom-up approach is used. A port-to-MCA distance database has also been constructed for the purpose of the study. This was the first attempt to use Eurostat maritime statistics for emission modelling, and the problems encountered are mentioned, since the statistical data collection was not undertaken with this purpose in view. Examples of the results obtained by the database are presented. These include detailed air pollutant emission calculations for bulk carriers entering the port of Helsinki, as an example of the database operation, and aggregate results for different types of movements for France. Overall estimates of SOx and NOx emissions caused by shipping traffic between the EU 15 countries are in the area of 1 and 1.5 million tonnes, respectively.
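    The bottom-up pattern described above (size- and speed-dependent fuel consumption applied over port-to-MCA distances, then an emission factor) can be sketched as follows. All vessel parameters and the emission factor are placeholder values, not the coefficients used in TRENDS:

```python
def voyage_emissions_tonnes(distance_nm, service_speed_kn,
                            fuel_rate_t_per_day, ef_kg_per_t_fuel):
    """Bottom-up estimate for one port-to-MCA movement.

    Hours at sea -> fuel burned -> pollutant mass, following the generic
    fuel-based approach; all coefficients here are illustrative only.
    """
    days_at_sea = distance_nm / service_speed_kn / 24.0
    fuel_t = days_at_sea * fuel_rate_t_per_day
    return fuel_t * ef_kg_per_t_fuel / 1000.0  # kg -> tonnes

# Hypothetical bulk carrier: 1000 nm leg at 14 kn, burning 30 t fuel/day,
# with an SOx emission factor of 54 kg per tonne of fuel (placeholder).
sox = voyage_emissions_tonnes(1000, 14, 30, 54)
```

    Summing such per-movement estimates over all Eurostat-reported movements is what yields the aggregate national and EU 15 totals quoted above.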

  19. Very long baseline interferometry using a radio telescope in Earth orbit

    NASA Technical Reports Server (NTRS)

    Ulvestad, J. S.; Edwards, C. D.; Linfield, R. P.

    1987-01-01

    Successful Very Long Baseline Interferometry (VLBI) observations at 2.3 GHz were made using an antenna aboard an Earth-orbiting spacecraft as one of the receiving telescopes. These observations employed the first deployed satellite (TDRSE; E for East) of the NASA Tracking and Data Relay Satellite System (TDRSS). Fringes were found for 3 radio sources on baselines between TDRSE and telescopes in Australia and Japan. The purpose of this experiment and the characteristics of the spacecraft that are related to the VLBI observations are described. The technical obstacles to maintaining phase coherence between the orbiting antenna and the ground stations, as well as the calibration schemes for the communication link between TDRSE and its ground station at White Sands, New Mexico, are explored. System coherence results and scientific results for the radio source observations are presented. Using all available calibrations, a coherence of 84% over 700 seconds was achieved for baselines to the orbiting telescope.

  20. Database of Renewable Energy and Energy Efficiency Incentives and Policies Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lips, Brian

    The Database of State Incentives for Renewables and Efficiency (DSIRE) is an online resource that provides summaries of all financial incentives and regulatory policies that support the use of renewable energy and energy efficiency across all 50 states. This project involved making enhancements to the database and website, and the ongoing research and maintenance of the policy and incentive summaries.

  1. NASA aerospace database subject scope: An overview

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Outlined here is the subject scope of the NASA Aerospace Database, a publicly available subset of the NASA Scientific and Technical (STI) Database. Topics of interest to NASA are outlined and placed within the framework of the following broad aerospace subject categories: aeronautics, astronautics, chemistry and materials, engineering, geosciences, life sciences, mathematical and computer sciences, physics, social sciences, space sciences, and general. A brief discussion of the subject scope is given for each broad area, followed by a similar explanation of each of the narrower subject fields that follow. The subject category code is listed for each entry.

  2. Recent improvements in the NASA technical report server

    NASA Technical Reports Server (NTRS)

    Maa, Ming-Hokng; Nelson, Michael L.

    1995-01-01

    The NASA Technical Report Server (NTRS), a World Wide Web (WWW) report distribution service, has been modified to allow parallel database queries (significantly decreasing user access time, by an average factor of 2.3), access from clients behind firewalls and/or proxies that truncate excessively long Uniform Resource Locators (URLs), access to non-Wide Area Information Server (WAIS) databases, and compatibility with the Z39.50 protocol.
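    The parallel-query change is easy to motivate: when each backend search is dominated by I/O latency, a concurrent fan-out finishes in roughly the time of the slowest database rather than the sum over all of them. A generic Python sketch (database names and latencies are invented for illustration; the NTRS of 1995 obviously predates this API):

```python
import concurrent.futures
import time

def query(db_name, delay):
    """Stand-in for one backend database search (simulated I/O latency)."""
    time.sleep(delay)
    return f"{db_name}: results"

databases = {"LaRC": 0.2, "ARC": 0.2, "GSFC": 0.2}  # hypothetical backends

start = time.perf_counter()
with concurrent.futures.ThreadPoolExecutor() as pool:
    futures = [pool.submit(query, name, d) for name, d in databases.items()]
    results = [f.result() for f in futures]
elapsed = time.perf_counter() - start
# Fan-out: elapsed is close to max(delays) = 0.2 s, not sum(delays) = 0.6 s.
```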

  3. 76 FR 60031 - Notice of Order: Revisions to Enterprise Public Use Database Incorporating High-Cost Single...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-28

    ... Database Incorporating High-Cost Single-Family Securitized Loan Data Fields and Technical Data Field... single-family matrix in FHFA's Public Use Database (PUDB) to include data fields for the high-cost single... of loan attributes in FHFA's databases that could be used, singularly or in some combination, to...

  4. The development of a prototype intelligent user interface subsystem for NASA's scientific database systems

    NASA Technical Reports Server (NTRS)

    Campbell, William J.; Roelofs, Larry H.; Short, Nicholas M., Jr.

    1987-01-01

    The National Space Science Data Center (NSSDC) has initiated an Intelligent Data Management (IDM) research effort which has as one of its components the development of an Intelligent User Interface (IUI). The intent of the latter is to develop a friendly and intelligent user interface service based on expert systems and natural language processing technologies. The purpose is to support the large number of potential scientific and engineering users who presently need space- and land-related research and technical data but who have little or no experience with query languages or understanding of the information content or architecture of the databases involved. This technical memorandum presents a prototype Intelligent User Interface Subsystem (IUIS) using the Crustal Dynamics Project Database as a test bed for the implementation of CRUDDES (the Crustal Dynamics Expert System). The knowledge base has more than 200 rules and represents a single application view and the architectural view. Operational performance using CRUDDES has allowed non-database users to obtain useful information from the database previously accessible only to an expert database user or the database designer.

  5. A Knowledge Database on Thermal Control in Manufacturing Processes

    NASA Astrophysics Data System (ADS)

    Hirasawa, Shigeki; Satoh, Isao

    A prototype version of a knowledge database on thermal control in manufacturing processes, specifically molding, semiconductor manufacturing, and micro-scale manufacturing, has been developed. The knowledge database has search functions for technical data, evaluated benchmark data, academic papers, and patents. The database also displays trends and future roadmaps for research topics, and it has quick-calculation functions for basic design. This paper summarizes present research topics and future research on thermal control in manufacturing engineering in order to collate the information in the knowledge database. In the molding process, the initial mold and melt temperatures are very important parameters. Thermal control is also related to many semiconductor processes, where the main parameter is temperature variation across wafers; accurate in-situ temperature measurement of wafers is important. Many technologies are also being developed to manufacture micro-structures. Accordingly, the knowledge database will help further advance these technologies.

  6. US and foreign alloy cross-reference database

    NASA Technical Reports Server (NTRS)

    Springer, John M.; Morgan, Steven H.

    1991-01-01

    Marshall Space Flight Center and other NASA installations have a continuing requirement for materials data from other countries involved with the development of joint international Spacelab experiments and other hardware. This need includes collecting data for common alloys to ascertain composition, physical properties, specifications, and designations. These data are scattered throughout a large number of specification statements, standards, handbooks, and other technical literature, making a manual search both tedious and often limited in extent. In recognition of this problem, a computerized database of information on alloys was developed, along with the software necessary to provide the desired functions to access the data. The intention was to produce an initial database covering aluminum alloys, along with the program to provide a user interface to the data, and then later to extend and refine the database to include other nonferrous and ferrous alloys.

  7. The prognostic utility of baseline alpha-fetoprotein for hepatocellular carcinoma patients.

    PubMed

    Silva, Jack P; Gorman, Richard A; Berger, Nicholas G; Tsai, Susan; Christians, Kathleen K; Clarke, Callisia N; Mogal, Harveshp; Gamblin, T Clark

    2017-12-01

    Alpha-fetoprotein (AFP) has a valuable role in postoperative surveillance for hepatocellular carcinoma (HCC) recurrence, but the utility of pretreatment or baseline AFP remains controversial. The present study hypothesized that elevated baseline AFP levels are associated with worse overall survival in HCC patients. Adult HCC patients were identified using the National Cancer Database (2004-2013). Patients were stratified according to baseline AFP measurements into the following groups: Negative (<20), Borderline (20-199), Elevated (200-1999), and Highly Elevated (>2000). The primary outcome was overall survival (OS), which was analyzed by log-rank test and graphed using the Kaplan-Meier method. Multivariate regression modeling was used to determine hazard ratios (HR) for OS. Of 41,107 patients identified, 15,809 (33.6%) were Negative. Median overall survival was highest in the Negative group, followed by Borderline, Elevated, and Highly Elevated (28.7 vs 18.9 vs 8.8 vs 3.2 months; P < 0.001). On multivariate analysis, overall survival hazard ratios for the Borderline, Elevated, and Highly Elevated groups were 1.18 (P = 0.267), 1.94 (P < 0.001), and 1.77 (P = 0.007), respectively (reference: Negative). Baseline AFP independently predicted overall survival in HCC patients regardless of treatment plan. A baseline AFP value is a simple and effective method to assist in estimating expected survival for HCC patients. © 2017 Wiley Periodicals, Inc.
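    The stratification used in the study can be written directly from the cutoffs given above (units ng/mL). The abstract leaves values strictly between 1999 and 2000 unspecified, so this sketch assigns them to the Elevated group as an assumption:

```python
def afp_group(afp_ng_ml):
    """Map a baseline AFP value (ng/mL) to the study's four strata."""
    if afp_ng_ml < 20:
        return "Negative"
    if afp_ng_ml < 200:
        return "Borderline"
    if afp_ng_ml < 2000:
        return "Elevated"        # abstract: 200-1999
    return "Highly Elevated"     # abstract: >2000

groups = [afp_group(v) for v in (5, 150, 500, 2500)]
```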

  8. Human Variome Project Quality Assessment Criteria for Variation Databases.

    PubMed

    Vihinen, Mauno; Hancock, John M; Maglott, Donna R; Landrum, Melissa J; Schaafsma, Gerard C P; Taschner, Peter

    2016-06-01

    Numerous databases containing information about DNA, RNA, and protein variations are available. Gene-specific variant databases (locus-specific variation databases, LSDBs) are typically curated and maintained for single genes or groups of genes for a certain disease or diseases. These databases are widely considered the most reliable information source for a particular gene/protein/disease, but it should also be made clear that they may have widely varying contents, infrastructure, and quality. Quality is very important to evaluate because these databases may affect health decision-making, research, and clinical practice. The Human Variome Project (HVP) established a Working Group for Variant Database Quality Assessment. The basic principle was to develop a simple system that nevertheless provides a good overview of the quality of a database. The resulting HVP quality evaluation criteria are divided into four main components: data quality, technical quality, accessibility, and timeliness. This report elaborates on the developed quality criteria and how implementation of the quality scheme can be achieved. Examples are provided of the current status of the quality items in two different databases: BTKbase, an LSDB, and ClinVar, a central archive of submissions about variants and their clinical significance. © 2016 Wiley Periodicals, Inc.

  9. A long baseline global stereo matching based upon short baseline estimation

    NASA Astrophysics Data System (ADS)

    Li, Jing; Zhao, Hong; Li, Zigang; Gu, Feifei; Zhao, Zixin; Ma, Yueyang; Fang, Meiqi

    2018-05-01

    In global stereo vision, matching efficiency and computing accuracy are difficult to balance because they contradict each other; in the case of a long baseline, this contradiction becomes more prominent. To address this problem, this paper proposes a novel idea for improving both the efficiency and accuracy of global stereo matching with a long baseline. Reference images located between the long-baseline image pairs are first chosen to form new image pairs with short baselines. The relationship between the disparities of pixels in image pairs with different baselines is revealed by considering the quantized error, so that the disparity search range under the long baseline can be reduced by guidance from the short baseline to gain matching efficiency. This idea is then integrated into graph cuts (GCs) to form a multi-step GC algorithm based on short-baseline estimation, by which the disparity map under the long baseline can be calculated iteratively on the basis of the previous matching. Furthermore, image information from pixels that are non-occluded under the short baseline but occluded under the long baseline can be employed to improve the matching accuracy. Although the time complexity of the proposed method depends on the locations of the chosen reference images, it is usually much lower for long-baseline stereo matching than that of the traditional GC algorithm. Finally, the validity of the proposed method is examined by experiments on benchmark datasets. The results show that the proposed method is superior to the traditional GC method in terms of efficiency and accuracy, and thus it is suitable for long-baseline stereo matching.
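    The core pruning idea can be sketched independently of the GC machinery. Under the pinhole model, disparity d = f·B/Z is proportional to the baseline B, so a short-baseline estimate bounds the long-baseline search window. The variable names and the one-pixel quantization margin below are illustrative, not the paper's exact formulation:

```python
def long_baseline_search_range(d_short, b_short, b_long, quant_err=1.0):
    """Disparity search window under the long baseline, guided by the
    short-baseline estimate.

    For fixed depth Z and focal length f, d = f * B / Z, so disparity
    scales linearly with baseline B. The +/- margin absorbs the quantized
    (integer-pixel) error of the short-baseline estimate, scaled likewise.
    """
    scale = b_long / b_short
    center = scale * d_short
    margin = scale * quant_err
    return center - margin, center + margin

# Short-baseline disparity of 8 px, long baseline 4x longer: the search
# collapses to an 8-px window instead of a full 0..d_max sweep.
lo, hi = long_baseline_search_range(d_short=8, b_short=1.0, b_long=4.0)
```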

  10. Speech Databases of Typical Children and Children with SLI

    PubMed Central

    Grill, Pavel; Tučková, Jana

    2016-01-01

    The extent of research on children's speech in general, and on disordered speech specifically, is very limited. In this article, we describe the process of creating databases of children's speech and the possibilities for using such databases, which have been created by the LANNA research group in the Faculty of Electrical Engineering at Czech Technical University in Prague. These databases have been compiled principally for medical research but also for use in other areas, such as linguistics. Two databases were recorded: one of healthy children's speech (recorded in kindergarten and at the first level of elementary school) and the other of pathological speech of children with a Specific Language Impairment (recorded at speech and language therapists' surgeries and at the hospital). Both databases were subdivided according to the specific demands of medical research. They can also be used more broadly, in particular for linguistic research and pedagogical purposes as well as for studies of speech-signal processing. PMID:26963508

  11. International Linear Collider Technical Design Report (Volumes 1 through 4)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrison M.

    2013-03-27

    The design report consists of four volumes: Volume 1, Executive Summary; Volume 2, Physics; Volume 3, Accelerator (Part I, R and D in the Technical Design Phase, and Part II, Baseline Design); and Volume 4, Detectors.

  12. Energy science and technology database (on the internet). Online data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    The Energy Science and Technology Database (EDB) is a multidisciplinary file containing worldwide references to basic and applied scientific and technical research literature. The information is collected for use by government managers, researchers at the national laboratories, and other research efforts sponsored by the U.S. Department of Energy, and the results of this research are transferred to the public. Abstracts are included for records from 1976 to the present. The EDB also contains Nuclear Science Abstracts, a comprehensive abstract and index collection of the international nuclear science and technology literature for the period 1948 through 1976. Included are scientific and technical reports of the U.S. Atomic Energy Commission, the U.S. Energy Research and Development Administration and its contractors, other agencies, universities, and industrial and research organizations. Approximately 25% of the records in the file contain abstracts. Nuclear Science Abstracts contains over 900,000 bibliographic records; the entire Energy Science and Technology Database contains over 3 million bibliographic records. The database is now available for searching through the GOV.Research-Center (GRC) service, a single online Web-based search service for well-known Government databases. Featuring powerful search and retrieval software, GRC is an important research tool. The GRC web site is at http://grc.ntis.gov.

  13. Construction of In-house Databases in a Corporation

    NASA Astrophysics Data System (ADS)

    Senoo, Tetsuo

    As computer technology, communication technology, and others have progressed, many corporations have placed the construction and use of their own databases at the center of their information activities, aiming to develop those activities anew. This paper considers how information management in a corporation is affected by changing management and technology environments, and clarifies and generalizes what in-house databases should be constructed and used, from the viewpoints of the requirements to be met, the types and forms of information handled, indexing, use type and frequency, evaluation method, and so on. The author outlines Matsushita's information system, MATIS (Matsushita Technical Information System), as an actual example, and describes the present status and some points to keep in mind when constructing and using databases of REP, BOOK and SYMP.

  14. 40 CFR 74.20 - Data for baseline and alternative baseline.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 17 2012-07-01 2012-07-01 false Data for baseline and alternative... PROGRAMS (CONTINUED) SULFUR DIOXIDE OPT-INS Allowance Calculations for Combustion Sources § 74.20 Data for baseline and alternative baseline. (a) Acceptable data. (1) The designated representative of a combustion...

  15. Whither the White Knight: CDROM in Technical Services.

    ERIC Educational Resources Information Center

    Campbell, Brian

    1987-01-01

    Outlines evaluative criteria and compares optical data disk products used in library technical processes, including bibliographic records for cataloging, acquisition databases, and local public access catalogs. An extensive table provides information on specific products, including updates, interfaces, edit screens, installation help, manuals,…

  16. The Primate Life History Database: A unique shared ecological data resource

    PubMed Central

    Strier, Karen B.; Altmann, Jeanne; Brockman, Diane K.; Bronikowski, Anne M.; Cords, Marina; Fedigan, Linda M.; Lapp, Hilmar; Liu, Xianhua; Morris, William F.; Pusey, Anne E.; Stoinski, Tara S.; Alberts, Susan C.

    2011-01-01

    The importance of data archiving, data sharing, and public access to data has received considerable attention. Awareness is growing among scientists that collaborative databases can facilitate these activities. We provide a detailed description of the collaborative life history database developed by our Working Group at the National Evolutionary Synthesis Center (NESCent) to address questions about life history patterns and the evolution of mortality and demographic variability in wild primates. Examples from each of the seven primate species included in our database illustrate the range of data incorporated and the challenges, decision-making processes, and criteria applied to standardize data across diverse field studies. In addition to the descriptive and structural metadata associated with our database, we also describe the process metadata (how the database was designed and delivered) and the technical specifications of the database. Our database provides a useful model for other researchers interested in developing similar types of databases for other organisms, while our process metadata may be helpful to other groups of researchers interested in developing databases for other types of collaborative analyses. PMID:21698066

  17. Two baselines are better than one: Improving the reliability of computerized testing in sports neuropsychology.

    PubMed

    Bruce, Jared; Echemendia, Ruben; Tangeman, Lindy; Meeuwisse, Willem; Comper, Paul; Hutchison, Michael; Aubry, Mark

    2016-01-01

    Computerized neuropsychological tests are frequently used to assist in return-to-play decisions following sports concussion. However, due to concerns about test reliability, the Centers for Disease Control and Prevention recommends yearly baseline testing. The standard practice that has developed in baseline/postinjury comparisons is to examine the difference between the most recent baseline test and postconcussion performance. Drawing from classical test theory, the present study investigated whether temporal stability could be improved by an alternate approach that uses the aggregate of two baselines to more accurately estimate baseline cognitive ability. One hundred fifteen English-speaking professional hockey players with three consecutive Immediate Post-Concussion Assessment and Cognitive Testing (ImPACT) baseline tests were extracted from a clinical program evaluation database overseen by the National Hockey League and the National Hockey League Players' Association. The temporal stability of ImPACT composite scores was significantly increased by aggregating test performance during Sessions 1 and 2 to predict performance during Session 3. Using this approach, the two-factor Memory (r = .72) and Speed (r = .79) composites of ImPACT showed acceptable long-term reliability. Using the aggregate of two baseline scores significantly improves temporal stability and allows for more accurate predictions of cognitive change following concussion. Clinicians are encouraged to estimate baseline abilities by taking into account all of an athlete's previous baseline scores.
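    The classical-test-theory rationale the study draws on is captured by the Spearman-Brown prophecy formula: averaging k parallel measurements raises reliability from r to kr / (1 + (k - 1)r). A minimal sketch (the .60/.75 figures illustrate the formula and are not values reported in the study):

```python
def spearman_brown(r_single, k=2):
    """Predicted reliability of the mean of k parallel measurements."""
    return k * r_single / (1 + (k - 1) * r_single)

# Aggregating two baselines lifts a composite's predicted test-retest
# reliability, e.g. r = .60 for one baseline -> .75 for the mean of two.
r_two = spearman_brown(0.60, k=2)
```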

  18. Air/Superfund national technical guidance study series, Volume 2. Estimation of baseline air emission at Superfund sites. Interim report(Final)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1989-01-01

    This volume is one in a series of manuals prepared for EPA to assist its Remedial Project Managers in assessing the air contaminant pathway and developing input data for risk assessment. The manual provides guidance on developing baseline-emission estimates (BEEs) from hazardous waste sites. BEEs are defined as emission rates estimated for a site in its undisturbed state. Specifically, the manual is intended to: present a protocol for selecting the appropriate level of effort to characterize baseline air emissions; assist site managers in designing an approach for BEEs; describe useful technologies for developing site-specific BEEs; and help site managers select the appropriate technologies for generating site-specific BEEs.

  19. The Lung Image Database Consortium (LIDC) and Image Database Resource Initiative (IDRI): A Completed Reference Database of Lung Nodules on CT Scans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2011-02-15

    Purpose: The development of computer-aided diagnostic (CAD) methods for lung nodule detection, classification, and quantitative assessment can be facilitated through a well-characterized repository of computed tomography (CT) scans. The Lung Image Database Consortium (LIDC) and Image Database Resource Initiative (IDRI) completed such a database, establishing a publicly available reference for the medical imaging research community. Initiated by the National Cancer Institute (NCI), further advanced by the Foundation for the National Institutes of Health (FNIH), and accompanied by the Food and Drug Administration (FDA) through active participation, this public-private partnership demonstrates the success of a consortium founded on a consensus-based process. Methods: Seven academic centers and eight medical imaging companies collaborated to identify, address, and resolve challenging organizational, technical, and clinical issues to provide a solid foundation for a robust database. The LIDC/IDRI Database contains 1018 cases, each of which includes images from a clinical thoracic CT scan and an associated XML file that records the results of a two-phase image annotation process performed by four experienced thoracic radiologists. In the initial blinded-read phase, each radiologist independently reviewed each CT scan and marked lesions belonging to one of three categories ("nodule ≥3 mm," "nodule <3 mm," and "non-nodule ≥3 mm"). In the subsequent unblinded-read phase, each radiologist independently reviewed their own marks along with the anonymized marks of the three other radiologists to render a final opinion. The goal of this process was to identify as completely as possible all lung nodules in each CT scan without requiring forced consensus. Results: The Database contains 7371 lesions marked "nodule" by at least one radiologist; 2669 of these lesions were marked "nodule ≥3 mm" by at least one radiologist, of which 928 (34.7%) received such marks

  20. Landslide incidence in the North of Portugal: Analysis of a historical landslide database based on press releases and technical reports

    NASA Astrophysics Data System (ADS)

    Pereira, Susana; Zêzere, José Luís; Quaresma, Ivânia Daniela; Bateira, Carlos

    2014-06-01

    This work presents and explores the Northern Portugal Landslide Database (NPLD) for the period 1900-2010. NPLD was compiled from press releases (regional and local newspapers) and technical reports (reports by civil protection authorities and academic works); it includes 628 landslides, corresponding to 5.7 landslides per year on average. Although 50% of landslides occurred in the last 35 years of the series, the temporal distribution of landslides does not show any regular increase with time. The relationship between annual precipitation and landslide occurrence shows that reported landslides tend to be more frequent in wetter years. Moreover, landslides occur mostly in the wettest months of the year (December, January and February), which reflects the importance of rainfall in triggering slope instability. Most landslides cause damage that affects people and/or structures; 69.4% of the landslides in Northern Portugal caused 136 fatalities, 173 injured and left 460 persons homeless. More than half of the total landslides (321 landslides) led to railway or motorway closures and 49 landslides destroyed 126 buildings. The NPLD is compared with a landslide database for the whole of Portugal constructed from a single daily national newspaper covering the same reference period. It will be demonstrated that the regional and local newspapers are more effective than the national newspaper in reporting damaging landslides in the North of Portugal. Like other documentary-based landslide inventories, the NPLD does not accurately report non-damaging landslides. Therefore, NPLD was found unsuitable to validate municipal-scale landslide susceptibility models derived from detailed geomorphology-based landslide inventories.

  1. State Practices in the Assessment of Outcomes for Students with Disabilities. Technical Report.

    ERIC Educational Resources Information Center

    Shriner, James G.; And Others

    This technical report describes the methodology, results, and conclusions of a 1991 survey, which was conducted to determine state efforts to develop systems to assess educational outcomes, states' needs for solutions to technical/implementation problems, existing databases, and efforts of states to design a comprehensive system of indicators in…

  2. Linear MALDI-ToF simultaneous spectrum deconvolution and baseline removal.

    PubMed

    Picaud, Vincent; Giovannelli, Jean-Francois; Truntzer, Caroline; Charrier, Jean-Philippe; Giremus, Audrey; Grangeat, Pierre; Mercier, Catherine

    2018-04-05

    Thanks to its reasonable cost and simple sample-preparation procedure, linear MALDI-ToF spectrometry is a growing technology for clinical microbiology. With appropriate spectrum databases, this technology can be used for early identification of pathogens in body fluids. However, due to the low resolution of linear MALDI-ToF instruments, robust and accurate peak picking remains a challenging task. In this context we propose a new algorithm for extracting peaks from a raw spectrum. With this method the spectrum baseline and spectrum peaks are processed jointly. The approach relies on an additive model consisting of a smooth baseline plus a sparse peak list convolved with a known peak shape, fitted under a Gaussian noise model. The proposed method is well suited to processing low-resolution spectra with a prominent baseline and unresolved peaks. We developed a new peak deconvolution procedure. The paper describes the method's derivation and discusses some of its interpretations. The algorithm is then described in pseudo-code form, with the required optimization procedure detailed. On synthetic data the method is compared to a more conventional approach: the new method reduces the artifacts caused by the usual two-step procedure of baseline removal followed by peak extraction. Finally, results on real linear MALDI-ToF spectra are provided. We introduced a new method for peak picking in which peak deconvolution and baseline computation are performed jointly. On simulated data we showed that this global approach performs better than a classical one in which baseline and peaks are processed sequentially. A dedicated experiment was conducted on real spectra: a collection of spectra of spiked proteins was acquired and analyzed. Better performance of the proposed method, in terms of accuracy and reproducibility, was observed and validated by an extended statistical analysis.
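
    The additive model described above can be sketched in a few lines: the observed spectrum is modeled as a smooth baseline plus peak amplitudes convolved with a known (here Gaussian) peak shape, and both parts are estimated in a single penalised least-squares fit. This is only an illustrative stand-in: the paper uses a sparsity-promoting prior on the peak list, whereas the sketch below substitutes a simple ridge penalty, and all function names and parameter values are invented for the example.

    ```python
    import numpy as np

    def gaussian_kernel_matrix(n, sigma):
        # Convolution matrix K: column j holds a Gaussian peak centred at index j.
        idx = np.arange(n)
        return np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / sigma) ** 2)

    def joint_fit(y, sigma=3.0, degree=3, lam=1.0):
        """Jointly estimate a smooth polynomial baseline and peak amplitudes by
        penalised least squares (an L2 stand-in for the paper's sparse prior)."""
        n = len(y)
        K = gaussian_kernel_matrix(n, sigma)
        t = np.linspace(-1, 1, n)
        P = np.vander(t, degree + 1)          # smooth baseline basis
        A = np.hstack([K, P])
        R = np.zeros(A.shape[1])
        R[:n] = lam                           # penalise peak amplitudes only
        theta = np.linalg.solve(A.T @ A + np.diag(R), A.T @ y)
        x, c = theta[:n], theta[n:]
        return x, P @ c                       # peak amplitudes, baseline estimate

    # Synthetic spectrum: two peaks sitting on a sloping baseline
    n = 200
    y = 0.5 + 0.002 * np.arange(n)
    for pos, amp in [(60, 4.0), (130, 2.5)]:
        y += amp * np.exp(-0.5 * ((np.arange(n) - pos) / 3.0) ** 2)
    x, base = joint_fit(y)
    print(int(np.argmax(x)))  # strongest recovered peak, near index 60
    ```

    A real implementation would replace the ridge penalty with an L1 (sparse) penalty and a measured peak shape, but the joint structure (baseline and peaks estimated together rather than sequentially) is what distinguishes the method from the classical two-step procedure.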

  3. Baseline Geochemical Data for Medical Researchers in Kentucky

    NASA Astrophysics Data System (ADS)

    Anderson, W.

    2017-12-01

    According to the Centers for Disease Control, Kentucky has the highest cancer incidence and death rates in the country. New efforts by geochemists and medical researchers are examining ways to diagnose the origin and sources of carcinogenesis. To determine whether naturally occurring geochemical or mineral elements contribute to cancer causation, the Kentucky Geological Survey has established a Minerals and Geochemical Database that medical researchers can use to examine baseline geochemistry and assess whether naturally occurring mineral or chemical elements contribute to the high rate of cancers in the state. Cancer causation is complex, so if natural sources can be accounted for, researchers can focus on the true causation. Naturally occurring minerals, metals and elements occur in many parts of the state, and their presence is valuable for evaluating causation. For example, the database contains maps showing (a) statewide elemental geochemistry, (b) areas of black shale oxidation, which releases metals into soil and surface waters, (c) clay deposits in the state that can contain high concentrations of rare-earth elements, and (d) site-specific uranium occurrences. Knowing the locations of major ore deposits in the state can also provide information on mineral and chemical anomalies, such as those for base metals and mercury. Radionuclide data from soil and water analyses are limited, so future research may involve obtaining more analyses to determine radon potential. The database also contains information on faulting and geology in the state. Although the metal content of trees may not seem relevant, the ash and humus content of degraded trees affects soil, stream-sediment and water geochemistry; many rural homes heat with wood, releasing metals into the surrounding biosphere. Stressed-vegetation techniques can be used to explore for ore deposits and to look for high metal contents in soils and rocks. These

  4. 76 FR 77533 - Notice of Order: Revisions to Enterprise Public Use Database Incorporating High-Cost Single...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-13

    ... FEDERAL HOUSING FINANCE AGENCY [No. 2011-N-13] Notice of Order: Revisions to Enterprise Public Use Database Incorporating High-Cost Single-Family Securitized Loan Data Fields and Technical Data Field..., regarding FHFA's adoption of an Order revising FHFA's Public Use Database matrices to include certain data...

  5. NoSQL technologies for the CMS Conditions Database

    NASA Astrophysics Data System (ADS)

    Sipos, Roland

    2015-12-01

    With the restart of the LHC in 2015, the growth of the CMS Conditions dataset will continue; the need for consistent and highly available access to the Conditions therefore makes a strong case for revisiting different aspects of the current data storage solutions. We present a study of alternative data storage backends for the Conditions Databases, evaluating some of the most popular NoSQL databases to support a key-value representation of the CMS Conditions. The definition of the database infrastructure is based on the need to store the conditions as BLOBs. Because of this, a condition can reach a size that may require special treatment (splitting) in these NoSQL databases. As large binary objects may be problematic in several database systems, and also to give an accurate baseline, a testing-framework extension was implemented to measure how these databases handle arbitrary binary data. Based on the evaluation, prototypes of a document store, a column-oriented store and a plain key-value store were deployed. An adaptation layer for accessing the backends in the CMS Offline software was developed to provide transparent support for these NoSQL databases in the CMS context. Additional data modelling approaches and considerations in the software layer, as well as deployment and automation of the databases, are also covered. In this paper we present the results of the evaluation together with a performance comparison of the prototypes studied.
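
    Splitting oversized BLOBs across multiple key-value entries, as mentioned above, can be sketched as follows. The chunk size, key naming scheme and manifest layout here are invented for illustration; a production system would use backend-appropriate chunk sizes (for example around 1 MiB) and persist the manifest in the same store.

    ```python
    import hashlib

    CHUNK = 4  # bytes per chunk; tiny on purpose so the example splits

    def split_blob(key, blob, chunk=CHUNK):
        """Split a conditions payload into chunk-sized parts plus a manifest,
        so each stored value stays below the backend's size limit."""
        parts = {f"{key}:{i}": blob[i * chunk:(i + 1) * chunk]
                 for i in range((len(blob) + chunk - 1) // chunk)}
        manifest = {"key": key, "n_parts": len(parts),
                    "sha1": hashlib.sha1(blob).hexdigest()}
        return manifest, parts

    def join_blob(manifest, store):
        # Reassemble the parts in order and verify integrity via the digest.
        blob = b"".join(store[f"{manifest['key']}:{i}"]
                        for i in range(manifest["n_parts"]))
        assert hashlib.sha1(blob).hexdigest() == manifest["sha1"]
        return blob

    store = {}
    manifest, parts = split_blob("ecal_pedestals_v1", b"0123456789ABCDEF!")
    store.update(parts)
    assert join_blob(manifest, store) == b"0123456789ABCDEF!"
    ```

    The digest in the manifest matters: once a payload is scattered over several keys, a missing or stale chunk must be detectable at read time.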

  6. JDD, Inc. Database

    NASA Technical Reports Server (NTRS)

    Miller, David A., Jr.

    2004-01-01

    JDD, Inc. is a maintenance and custodial contracting company whose mission is to provide its clients in the private and government sectors "quality construction, construction management and cleaning services in the most efficient and cost effective manners" (JDD, Inc. Mission Statement). The company provides facilities support for Fort Riley in Fort Riley, Kansas and the NASA John H. Glenn Research Center at Lewis Field here in Cleveland, Ohio. JDD, Inc. is owned and operated by James Vaughn, who started as a painter at NASA Glenn and has been working here for the past seventeen years. This summer I worked under Devan Anderson, the safety manager for JDD, Inc. in the Logistics and Technical Information Division (LTID) at Glenn Research Center. The LTID provides all transportation, secretarial and security needs, and manages the contracts for these various services for the center. As safety manager, my mentor ensures Occupational Safety and Health Administration (OSHA) compliance for all JDD, Inc. employees and handles all other issues related to job safety (Environmental Protection Agency issues, workers' compensation, and safety and health training). My summer assignment was not considered "groundbreaking research" like the work of many other summer interns, but it is just as important and beneficial to JDD, Inc. I initially created a database in Microsoft Excel to classify and categorize data pertaining to the numerous safety training certification courses instructed by our safety manager during the course of the fiscal year. This early portion of the database consisted only of data from the training certification courses (a training field index and which employees were present at or absent from each course). Once I completed this phase, I decided to expand the database and add as many dimensions to it as possible. Throughout the last seven weeks, I have been compiling more data from day to day operations and been adding the

  7. NASA STI program database: Journal coverage (1990-1992)

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Data are given in tabular form on the extent of recent journal accessions (1990-1992) to the NASA Scientific and Technical Information (STI) Database. Journals are presented by country in two ways: first by an alphabetical listing; and second, by the decreasing number of citations extracted from these journals during this period. An appendix containing a statistical summary is included.

  8. Indexing of Patents of Pharmaceutical Composition in Online Databases

    NASA Astrophysics Data System (ADS)

    Online searching for patents of pharmaceutical composition is generally considered to be very difficult. This is because patent databases include extensive technical information as well as legal information, so they are unlikely to have an index specific to pharmaceutical composition; even where such an index exists, the scope and coverage of the indexing is ambiguous. This paper discusses how patents of pharmaceutical composition are indexed in online databases such as WPI, CA, CLAIMS, USP and PATOLIS. Online searching for such patents is also discussed in some detail.

  9. An intermediary's perspective of online databases for local governments

    NASA Technical Reports Server (NTRS)

    Jack, R. F.

    1984-01-01

    Numerous public administration studies have indicated that local government agencies for a variety of reasons lack access to comprehensive information resources; furthermore, such entities are often unwilling or unable to share information regarding their own problem-solving innovations. The NASA/University of Kentucky Technology Applications Program devotes a considerable effort to providing scientific and technical information and assistance to local agencies, relying on its access to over 500 distinct online databases offered by 20 hosts. The author presents a subjective assessment, based on his own experiences, of several databases which may prove useful in obtaining information for this particular end-user community.

  10. Lessons Learned and Technical Standards: A Logical Marriage for Future Space Systems Design

    NASA Technical Reports Server (NTRS)

    Gill, Paul S.; Garcia, Danny; Vaughan, William W.; Parker, Nelson C. (Technical Monitor)

    2002-01-01

    A comprehensive database of engineering lessons learned that corresponds with relevant technical standards will be a valuable asset to those engaged in studies on future space vehicle developments, especially for structures, materials, propulsion, control, operations and associated elements. In addition, this will enable the capturing of technology developments applicable to the design, development, and operation of future space vehicles as planned in the Space Launch Initiative. Using the time-honored tradition of passing on lessons learned while utilizing the newest information technology, NASA has launched an intensive effort to link lessons learned acquired through various Internet databases with applicable technical standards. This paper will discuss the importance of lessons learned, the difficulty in finding relevant lessons learned while engaged in a space vehicle development, and the new NASA effort to relate them to technical standards that can help alleviate this difficulty.

  11. BIOLEFF: three databases on air pollution effects on vegetation.

    PubMed

    Bennett, J P; Buchen, M J

    1995-01-01

    Three databases on air pollution effects on vegetation were developed by storing bibliographic and abstract data for technical literature on the subject in a free-form database program, 'askSam'. Approximately 4 000 journal articles have been computerized in three separate database files: BIOLEFF, LICHENS and METALS. BIOLEFF includes over 2 800 articles on the effects of approximately 25 gaseous and particulate pollutants on over 2 000 species of vascular plants. LICHENS includes almost 400 papers on the effects of gaseous and heavy metal pollutants on over 735 species of lichens and mosses. METALS includes over 465 papers on the effects of heavy metals on over 830 species of vascular plants. The combined databases include articles from about 375 different journals spanning 1905 to the present. Picea abies and Phaseolus vulgaris are the most studied vascular plants in BIOLEFF, while Hypogymnia physodes is the most studied lichen species in LICHENS. Ozone and sulfur dioxide are the most studied gaseous pollutants with about two thirds of the records in BIOLEFF. The combined size of the databases is now about 5.5 megabytes.

  12. The Office of Environmental Management technical reports: a bibliography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1997-07-01

    The Office of Environmental Management's (EM) technical reports bibliography is an annual publication that contains information on scientific and technical reports sponsored by the Office of Environmental Management and added to the Energy Science and Technology Database from July 1, 1995 through September 30, 1996. This information is divided into the following categories: Focus Areas and Crosscutting Programs; Support Programs, Technology Integration and International Technology Exchange are now included in the General category. EM's Office of Science and Technology sponsors this bibliography.

  13. Construction of In-house Databases in a Corporation

    NASA Astrophysics Data System (ADS)

    Kato, Toshio

    Osaka Gas Co., Ltd. constructed the Osaka Gas Technical Information System (OGTIS) in 1979, which stores and retrieves the company's in-house technical information and, by unifying optical disk files, a facsimile system and so on, provides even the primary materials. The major information sources are technical materials, survey materials, planning documents, design materials, research reports and business tour reports, all generated inside the company. The system currently holds about 25,000 items in total, growing by some 1,000 items annually. The data file is updated once a month, and the abstract journal OGTIS Report is also published monthly. In 1983 the company constructed the System for International Exchange of Personal Information (SIP) as a subsystem of OGTIS, in order to compile an SIP database covering the outlines of exchanges with overseas enterprises and organizations. The SIP database holds about 2,600 records in total, growing by about 500 annually, with monthly data updates.

  14. WaveNet: A Web-Based Metocean Data Access, Processing, and Analysis Tool. Part 3 - CDIP Database

    DTIC Science & Technology

    2014-06-01

    and Analysis Tool; Part 3 – CDIP Database, by Zeki Demirbilek, Lihwa Lin, and Derek Wilson. PURPOSE: This Coastal and Hydraulics Engineering Technical Note (CHETN) describes coupling of the Coastal Data Information Program (CDIP) database to WaveNet, the first module of MetOcnDat (Meteorological... provides a step-by-step procedure to access, process, and analyze wave and wind data from the CDIP database. BACKGROUND: WaveNet addresses a basic

  15. Can We Predict Technical Aptitude?: A Systematic Review.

    PubMed

    Louridas, Marisa; Szasz, Peter; de Montbrun, Sandra; Harris, Kenneth A; Grantcharov, Teodor P

    2016-04-01

    To identify background characteristics and cognitive tests that may predict surgical trainees' future technical performance, and therefore be used to supplement existing surgical residency selection criteria. Assessment of technical skills is not commonly incorporated as part of the selection process for surgical trainees in North America. Emerging evidence, however, suggests that not all trainees are capable of reaching technical competence. Therefore, incorporating technical aptitude into selection processes may prove useful. A systematic search was carried out of the MEDLINE, PsycINFO, and Embase online databases to identify all studies that assessed associations between surrogate markers of innate technical abilities in surgical trainees, and whether these abilities correlate with technical performance. The quality of each study was evaluated using the Grading of Recommendations, Assessment, Development, and Evaluation system. A total of 8035 records were identified. After screening by title, abstract, and full text, 52 studies were included. Very few surrogate markers were found to predict technical performance. Significant associations with technical performance were seen for 1 of 23 participant-reported surrogate markers, 2 of 25 visual spatial tests, and 2 of 19 dexterity tests. The assessment of trainee Basic Performance Resources predicted technical performance in 62% and 75% of participants. To date, no single test has been shown to reliably predict the technical performance of surgical trainees. Strategies that rely on assessing multiple innate abilities, their interaction, and their relationship with technical skill may ultimately be more likely to serve as reliable predictors of future surgical performance.

  16. Technical Assessment of Maglev System Concepts

    DTIC Science & Technology

    1998-10-01

    pressurizes the loop but retains sufficient heat capacity for the day's cooling needs. Magneplane uses a cryorefrigerator to keep its supercritical helium in... the technical and economic viability of maglev in the U.S. and to recommend... * Apply this process to alternative U.S. maglev... comparative baselines... (output/joules-input). In effect, applying this factor implies that... the same data as in Figure 119 with the aforementioned efficiencies applied

  17. Healthy People 2010 and Asian Americans/Pacific Islanders: defining a baseline of information.

    PubMed

    Ghosh, Chandak

    2003-12-01

    Healthy People 2010: Understanding and Improving Health lists 6 areas of disparity in minority health services: infant mortality, cancer, cardiovascular disease, HIV/AIDS, diabetes, and immunizations. This study compiles existing Asian American and Pacific Islander (AAPI) health data to establish a baseline. For federally sponsored research (1986-2000), the Computer Retrieval of Information on Specific Projects (CRISP) database was analyzed, and AAPI initiatives were divided by subpopulation and disparity area. MEDLINE articles (1966-2000) were similarly scrutinized. Few federal health-related grants (0.2%) and MEDLINE articles (0.01%) mention AAPIs. For the 6 disparity areas, significant AAPI data gaps remain. To reach the Healthy People 2010 goals and have useful data, researchers and grant makers must focus on obtaining baseline data for disaggregated AAPI subgroups.

  18. Recovery of baseline lung function after pulmonary exacerbation in children with primary ciliary dyskinesia.

    PubMed

    Sunther, Meera; Bush, Andrew; Hogg, Claire; McCann, Lauren; Carr, Siobhán B

    2016-12-01

    Spirometry in children with cystic fibrosis (CF) frequently fails to return to baseline after treatment for a pulmonary exacerbation. It is unclear whether the same is true for children with primary ciliary dyskinesia (PCD). The aims were to determine, in children with PCD treated with intravenous antibiotics for a pulmonary exacerbation, (1) the proportion who recover to their baseline forced expiratory volume in 1 s (FEV1) within 3 months after treatment, and (2) the factors associated with failure to regain pre-exacerbation FEV1. Cohort study using the PCD database for children at the Royal Brompton Hospital, 2003-2013. We selected the first pulmonary exacerbation treated with intravenous antibiotics. The best FEV1 within 3 months after treatment was compared to the best FEV1 in the 12 months before treatment (baseline). Recovery to baseline was defined as any FEV1 after treatment that was greater than or equal to 90% of the baseline FEV1. 32/150 children (21%) had at least one pulmonary exacerbation; 23/30 (77%) regained baseline spirometry within 3 months of treatment. There was no difference between responders and non-responders in any baseline characteristic. Around 25% of children with PCD fail to recover to baseline lung function within 3 months following treatment for a pulmonary exacerbation, similar to CF. Better treatment strategies are needed, and the results also suggest that prevention of exacerbations would be a useful end-point in clinical trials. Pediatr Pulmonol. 2016;51:1362-1366. © 2016 Wiley Periodicals, Inc.
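
    The recovery criterion used in the study (post-treatment FEV1 reaching at least 90% of the best pre-treatment FEV1) is simple enough to state directly in code; the function name and the example values are ours, not the study's:

    ```python
    def recovered(baseline_best_fev1, post_best_fev1, threshold=0.90):
        """Recovery rule from the study: any post-treatment FEV1 within 3 months
        that reaches >= 90% of the best FEV1 in the 12 months before treatment."""
        return post_best_fev1 >= threshold * baseline_best_fev1

    print(recovered(2.0, 1.85))  # 1.85 L >= 90% of 2.0 L, so recovered
    print(recovered(2.0, 1.70))  # 1.70 L falls short of the 1.80 L threshold
    ```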

  19. Translation from the collaborative OSM database to cartography

    NASA Astrophysics Data System (ADS)

    Hayat, Flora

    2018-05-01

    The OpenStreetMap (OSM) database includes original items that are very useful for geographical analysis and for creating thematic maps. Contributors record in the open database various themes regarding amenities, leisure, transport, buildings and boundaries. The Michelin mapping department develops map prototypes to test the feasibility of OSM-based mapping. A research project is under way to translate the OSM database structure into a database structure that fits Michelin graphic guidelines; it aims at defining the right structure for Michelin's uses. The project relies on an analysis of semantic and geometric heterogeneities in OSM data. To that end, Michelin implements methods to transform the input geographical database into a cartographic image dedicated to specific uses (routing and tourist maps). The paper focuses on the mapping tools available to produce a personalised spatial database. Based on the processed data, paper and Web maps can be displayed. Two prototypes are described in this article: a vector-tile web map and a mapping method to produce paper maps at a regional scale. The vector-tile mapping method offers easy navigation within the map and within graphic and thematic guidelines. Paper maps can be partly drawn automatically; drawing automation and data management are part of the map creation, as is the final hand-drawing phase. Both prototypes have been set up using the OSM technical ecosystem.
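
    Translating OSM's free-form tags into a fixed cartographic schema is, at its core, a rule-matching problem. The sketch below is purely illustrative: the actual Michelin style schema is not public, so the rules and category names are invented.

    ```python
    # Hypothetical tag-to-style translation rules; first match wins.
    STYLE_RULES = [
        ({"highway": "motorway"}, "road_major"),
        ({"highway": "residential"}, "road_minor"),
        ({"amenity": "hospital"}, "poi_health"),
    ]

    def classify(tags):
        """Map an OSM tag dict onto a cartographic category."""
        for required, category in STYLE_RULES:
            if all(tags.get(k) == v for k, v in required.items()):
                return category
        return None  # unmapped features are dropped from the map

    print(classify({"highway": "motorway", "ref": "A6"}))  # matches road_major
    print(classify({"building": "yes"}))                   # no rule matches
    ```

    In practice the rule table is large and ordered by specificity, and the semantic heterogeneity mentioned in the abstract (contributors tagging the same feature in different ways) is exactly what makes curating it non-trivial.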

  20. Integrated Space Asset Management Database and Modeling

    NASA Technical Reports Server (NTRS)

    MacLeod, Todd; Gagliano, Larry; Percy, Thomas; Mason, Shane

    2015-01-01

    Effective Space Asset Management is one key to addressing the ever-growing issue of space congestion. It is imperative that agencies around the world have access to data regarding the numerous active assets and pieces of space junk currently tracked in orbit around the Earth. At the center of this issue is the effective management of many types of data related to orbiting objects. As the population of tracked objects grows, so too should the data management structure used to catalog technical specifications, orbital information, and metadata related to those populations. Marshall Space Flight Center's Space Asset Management Database (SAM-D) was implemented in order to effectively catalog a broad set of data related to known objects in space by ingesting information from a variety of databases and processing that data into useful technical information. Using the universal NORAD number as a unique identifier, SAM-D processes two-line element data into orbital characteristics and cross-references this technical data with metadata on functional status, country of ownership, and application category. SAM-D began as an Excel spreadsheet and was later upgraded to an Access database. While SAM-D performs its task very well, it is limited by its current platform and is not available outside of the local user base. Further, while modeling and simulation (M&S) can be powerful tools to exploit the information contained in SAM-D, the current system does not allow proper integration options for combining the data with both legacy and new M&S tools. This paper provides a summary of SAM-D development efforts to date and outlines a proposed data management infrastructure that extends SAM-D to support the larger data sets to be generated. A service-oriented architecture model using an information sharing platform named SIMON will allow it to easily expand to incorporate new capabilities, including advanced analytics, M&S tools, fusion techniques and user interface for
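
    The core processing step described above (turning two-line element sets keyed by NORAD number into orbital characteristics) can be illustrated with the standard fixed-column TLE layout. This is a generic sketch, not SAM-D's actual code; the element set shown is an ISS-like example used only for illustration.

    ```python
    import math

    MU = 398600.4418  # Earth's gravitational parameter, km^3/s^2

    def parse_tle_line2(line2):
        """Extract a few orbital fields from a TLE's second line using the
        standard fixed column layout."""
        return {
            "norad": int(line2[2:7]),
            "inclination_deg": float(line2[8:16]),
            "eccentricity": float("0." + line2[26:33]),  # implied decimal point
            "mean_motion_rev_per_day": float(line2[52:63]),
        }

    def semi_major_axis_km(mean_motion_rev_per_day):
        # Kepler's third law: a = (mu / n^2)^(1/3), with n in rad/s.
        n = mean_motion_rev_per_day * 2 * math.pi / 86400.0
        return (MU / n ** 2) ** (1.0 / 3.0)

    # ISS-like element set, line 2 only
    line2 = "2 25544  51.6442 147.1064 0004607  95.6506 264.5199 15.49181153215284"
    elems = parse_tle_line2(line2)
    print(elems["norad"], semi_major_axis_km(elems["mean_motion_rev_per_day"]))
    ```

    Cross-referencing then amounts to joining these derived characteristics against metadata tables on the NORAD number.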

  1. STI Handbook: Guidelines for Producing, Using, and Managing Scientific and Technical Information in the Department of the Navy. A Handbook for Navy Scientists and Engineers on the Use of Scientific and Technical Information

    DTIC Science & Technology

    1992-02-01

    6 What Information Should Be Included in the TR Database? 2-6 What Types of Media Can Be Used to Submit Information to the TR Database? 2-9 How Is... reports. Contract administration documents. Regulations. Commercially published books. WHAT TYPES OF MEDIA CAN BE USED TO SUBMIT INFORMATION TO THE TR... TOWARD DTIC'S WUIS DATABASE? The WUIS database, used to control and report technical and management data, summarizes ongoing research and technology

  2. Analysis of factors affecting baseline SF-36 Mental Component Summary in Adult Spinal Deformity and its impact on surgical outcomes.

    PubMed

    Mmopelwa, Tiro; Ayhan, Selim; Yuksel, Selcen; Nabiyev, Vugar; Niyazi, Asli; Pellise, Ferran; Alanay, Ahmet; Sanchez Perez Grueso, Francisco Javier; Kleinstuck, Frank; Obeid, Ibrahim; Acaroglu, Emre

    2018-03-01

    To identify the factors that affect the SF-36 mental component summary (MCS) in patients with adult spinal deformity (ASD) at the time of presentation, and to analyse the effect of SF-36 MCS on clinical outcomes in surgically treated patients. Prospectively collected data from a multicentric ASD database were analysed for baseline parameters. The same database was then analysed for surgically treated patients with a minimum of 1-year follow-up to assess the effect of baseline SF-36 MCS on treatment results. A clinically useful SF-36 MCS cut-off was determined by ROC curve analysis. A total of 229 patients with baseline parameters were analysed. Strong correlations between SF-36 MCS and SRS-22, ODI, gender, and diagnosis were found (p < 0.05). For the second part of the study, a total of 186 surgically treated patients were analysed. Only for SF-36 PCS did the unimproved cohort, defined by minimum clinically important differences, have a significantly lower mean baseline SF-36 MCS (p < 0.001). SF-36 MCS was found to have an odds ratio of 0.914 for improving the SF-36 PCS score (unit by unit) (p < 0.001). A cut-off point of 43.97 for SF-36 MCS was found to be predictive of SF-36 PCS (AUC = 0.631; p < 0.001). The factors affecting baseline SF-36 MCS in an ASD population are other HRQOL parameters such as SRS-22 and ODI, as well as baseline thoracic kyphosis and gender. This study also demonstrated that baseline SF-36 MCS does not necessarily affect the results of surgical treatment as assessed by SRS-22 or ODI. Level III, prognostic study. Copyright © 2018 Turkish Association of Orthopaedics and Traumatology. Production and hosting by Elsevier B.V. All rights reserved.
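
    A cut-off such as the 43.97 reported above is typically read off the ROC curve by maximising the Youden index (sensitivity + specificity - 1). A minimal sketch with toy data follows; the scores and labels are invented, not the study's.

    ```python
    import numpy as np

    def youden_cutoff(scores, labels):
        """Pick the score threshold maximising sensitivity + specificity - 1,
        the usual way a clinically useful cut-off is chosen from a ROC curve."""
        scores = np.asarray(scores, float)
        labels = np.asarray(labels, bool)
        best_t, best_j = None, -1.0
        for t in np.unique(scores):
            pred = scores >= t
            sens = (pred & labels).sum() / labels.sum()
            spec = (~pred & ~labels).sum() / (~labels).sum()
            if sens + spec - 1 > best_j:
                best_j, best_t = sens + spec - 1, t
        return best_t

    # Toy data: higher baseline MCS tends to go with improvement
    scores = [30, 35, 40, 44, 45, 50, 55, 60]
    labels = [0,  0,  0,  1,  0,  1,  1,  1]
    print(youden_cutoff(scores, labels))
    ```

    Real analyses would also report the AUC and a confidence interval for the chosen threshold rather than the point estimate alone.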

  3. NASA Scientific and Technical Publications: A Catalog of Special Publications, Reference Publications, Conference Publications, and Technical Papers 1991-1992

    DTIC Science & Technology

    1993-02-01

    Scientific and Technical Information EXOBIOLOGY. HEALTH. MICROBIOLOGY . MICROOR- System during September 1990. Subject coverage includes: GANISMS...Houston. TX N91-24731 National Aeronautics and Space Administration. MICROBIOLOGY ON SPACE STATION FREEDOM Washington, DCr DUANE L. PIERSON, ed...and solution INASA-SP-7011(345)) p 37 N91-16547 Beyond the Baseline 1991ý Proceedings of the Space [NASA-TP-3242) p 43 N92-33483 Microbiology on Space

  4. PMAG: Relational Database Definition

    NASA Astrophysics Data System (ADS)

    Keizer, P.; Koppers, A.; Tauxe, L.; Constable, C.; Genevey, A.; Staudigel, H.; Helly, J.

    2002-12-01

    site location (latitude, longitude, elevation), geography (continent, country, region), geological setting (lithospheric plate or block, tectonic setting), geological age (age range, timescale name, stratigraphic position) and materials (rock type, classification, alteration state). Each data point and method description is also related to its peer-reviewed reference [citation ID] as archived in the EarthRef Reference Database (ERR). This guarantees direct traceability all the way to its original source, where the user can find the bibliography of each PMAG reference along with every abstract, data table, technical note and/or appendix that are available in digital form and that can be downloaded as PDF/JPEG images and Microsoft Excel/Word data files. This may help scientists and teachers in performing their research since they have easy access to all the scientific data. It also allows for checking potential errors during the digitization process. Please visit the PMAG website at http://earthref.org/PMAG/ for more information.

  5. [Technical improvement of cohort constitution in administrative health databases: Providing a tool for integration and standardization of data applicable in the French National Health Insurance Database (SNIIRAM)].

    PubMed

    Ferdynus, C; Huiart, L

    2016-09-01

    Administrative health databases such as the French National Health Insurance Database (SNIIRAM) are a major tool for answering numerous public health research questions. However, the use of such data requires complex and time-consuming data management. Our objective was to develop and make available a tool to optimize cohort constitution within administrative health databases. We developed a process to extract, transform and load (ETL) data from various heterogeneous sources into a standardized data warehouse. This data warehouse is structured as a star schema corresponding to the i2b2 star schema model. We then evaluated the performance of this ETL using data from a pharmacoepidemiology research project conducted in the SNIIRAM database. The ETL we developed comprises a set of functionalities for creating SAS scripts, allowing data to be integrated into a standardized data warehouse. As part of the performance assessment, we integrated a dataset from the SNIIRAM comprising more than 900 million rows in less than three hours using a desktop computer. This enables patient selection from the standardized data warehouse within seconds of the request. The ETL described in this paper provides an effective tool compatible with all administrative health databases, without requiring complex database servers. It should simplify cohort constitution in health databases, and the standardization of warehouse data facilitates collaborative work between research teams. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
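
    The target of such an ETL, an i2b2-style star schema, can be illustrated with a minimal in-memory example. The table layouts follow the i2b2 convention (an observation_fact table joined to dimension tables), but the columns are trimmed and the source rows are invented; the original work generates SAS scripts against SNIIRAM rather than using SQLite.

    ```python
    import sqlite3

    # Minimal i2b2-style star schema: one fact table, two dimension tables.
    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE patient_dimension (patient_num INTEGER PRIMARY KEY,
                                    sex TEXT, birth_year INTEGER);
    CREATE TABLE concept_dimension (concept_cd TEXT PRIMARY KEY, name_char TEXT);
    CREATE TABLE observation_fact (patient_num INTEGER REFERENCES patient_dimension,
                                   concept_cd TEXT REFERENCES concept_dimension,
                                   start_date TEXT, nval_num REAL);
    """)

    # Load step: hypothetical rows already extracted from a claims source
    con.executemany("INSERT INTO patient_dimension VALUES (?,?,?)",
                    [(1, "F", 1980), (2, "M", 1975)])
    con.executemany("INSERT INTO concept_dimension VALUES (?,?)",
                    [("ATC:N02BE01", "paracetamol dispensing")])
    con.executemany("INSERT INTO observation_fact VALUES (?,?,?,?)",
                    [(1, "ATC:N02BE01", "2015-03-02", 1.0),
                     (2, "ATC:N02BE01", "2015-04-11", 2.0)])

    # Once the warehouse is standardized, cohort selection is a simple star-join.
    rows = con.execute("""
    SELECT p.patient_num FROM observation_fact f
    JOIN patient_dimension p USING (patient_num)
    WHERE f.concept_cd = 'ATC:N02BE01' AND p.sex = 'F'
    """).fetchall()
    print(rows)
    ```

    The point of the standardization is visible in the last query: whatever the source database, cohort definitions reduce to joins against the same fact and dimension tables.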

  6. The Identification of People with Disabilities in National Databases: A Failure to Communicate. Technical Report 6.

    ERIC Educational Resources Information Center

    McGrew, Kevin; And Others

    This research analyzes similarities and differences in how students with disabilities are identified in national databases, through examination of 19 national data collection programs in the U.S. Departments of Education, Commerce, Justice, and Health and Human Services, as well as databases from the National Science Foundation. The study found…

  7. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

    DOE Data Explorer

    Joe Iovenitti

    2013-05-15

    The Engineered Geothermal System (EGS) Exploration Methodology Project is developing an exploration approach for EGS through the integration of geoscientific data. The Project chose the Dixie Valley Geothermal System in Nevada as a field laboratory site for methodology calibration purposes because it is one of the most highly characterized geothermal systems in the Basin and Range, with a considerable amount of geoscience data and, most importantly, well data in the public domain. This Baseline Conceptual Model report summarizes the results of the first three project tasks: (1) collect and assess the existing public domain geoscience data, (2) design and populate a GIS database, and (3) develop a baseline (existing data) geothermal conceptual model, evaluate geostatistical relationships, and generate baseline, coupled EGS favorability/trust maps from +1 km above sea level (asl) to -4 km asl for the Calibration Area (Dixie Valley Geothermal Wellfield) to identify EGS drilling targets at a scale of 5 km x 5 km. It presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region.

  8. School-to-Work Transition of Career and Technical Education Graduates

    ERIC Educational Resources Information Center

    Packard, Becky Wai-Ling; Leach, Miki; Ruiz, Yedalis; Nelson, Consuelo; DiCocco, Hannah

    2012-01-01

    This study analyzed the career development of career and technical education (CTE) high school graduates during their school-to-work transition, specifically their adaptability in the face of barriers. Forty graduates (22 men, 18 women) from working-class backgrounds participated in baseline surveys at graduation and phenomenological interviews 1…

  9. Adapting an Agent-Based Model of Socio-Technical Systems to Analyze Security Failures

    DTIC Science & Technology

    2016-10-17

    the total number of non-blackouts differed from the total number in the baseline data to a statistically significant extent with a p-value < 0.0003 ... I. Nikolic, and Z. Lukszo, Eds., Agent-based modelling of socio-technical systems. Springer Science & Business Media, 2013, vol. 9. [12] A. P. Shaw

  10. Linking Bibliographic Data Bases: A Discussion of the Battelle Technical Report.

    ERIC Educational Resources Information Center

    Jones, C. Lee

    This document establishes the context, summarizes the contents, and discusses the Battelle technical report, noting certain constraints of the study. Further steps for the linking of bibliographic databases for use by academic and public libraries are suggested. (RAA)

  11. PRODORIC2: the bacterial gene regulation database in 2018

    PubMed Central

    Dudek, Christian-Alexander; Hartlich, Juliane; Brötje, David; Jahn, Dieter

    2018-01-01

    Abstract Bacteria adapt to changes in their environment via differential gene expression mediated by DNA binding transcriptional regulators. The PRODORIC2 database hosts one of the largest collections of DNA binding sites for prokaryotic transcription factors. It is the result of the thoroughly redesigned PRODORIC database. PRODORIC2 is more intuitive and user-friendly. Besides significant technical improvements, the new update offers more than 1000 new transcription factor binding sites and 110 new position weight matrices for genome-wide pattern searches with the Virtual Footprint tool. Moreover, binding sites deduced from high-throughput experiments were included. Data for 6 new bacterial species including bacteria of the Rhodobacteraceae family were added. Finally, a comprehensive collection of sigma- and transcription factor data for the nosocomial pathogen Clostridium difficile is now part of the database. PRODORIC2 is publicly available at http://www.prodoric2.de. PMID:29136200
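
    The genome-wide pattern searches mentioned above rest on position weight matrices built from aligned binding sites. The sketch below is an assumed illustration of the general PWM technique (hypothetical sites, pseudocount-of-1 log-odds scoring against a uniform background), not PRODORIC2's or Virtual Footprint's actual algorithm or data:

```python
import math

# Aligned example binding sites (hypothetical, for illustration only).
sites = ["TTGACA", "TTGATA", "TTGACT", "TAGACA"]
bases = "ACGT"
n = len(sites)

# Position weight matrix: log2-odds of the observed base frequency (with a
# pseudocount of 1) against a uniform 0.25 background.
pwm = []
for pos in range(len(sites[0])):
    col = [s[pos] for s in sites]
    pwm.append({b: math.log2((col.count(b) + 1) / (n + 4) / 0.25) for b in bases})

def score(seq):
    """Sum the per-position log-odds for one window."""
    return sum(pwm[i][b] for i, b in enumerate(seq))

def scan(genome, threshold):
    """Return (offset, window, score) for windows scoring above threshold."""
    w = len(pwm)
    hits = []
    for i in range(len(genome) - w + 1):
        s = score(genome[i:i + w])
        if s >= threshold:
            hits.append((i, genome[i:i + w], round(s, 2)))
    return hits

hits = scan("ACGTTGACATTTT", threshold=4.0)
print(hits)  # -> [(3, 'TTGACA', 6.97)]
```

    Only the window matching the consensus scores above the threshold; mismatched windows accumulate negative log-odds and fall out.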

  12. STP 4-06 Model-Based Technical Data in Procurement, 3D PDF Technology Data Demonstration Project. Phase 1 Summary

    DTIC Science & Technology

    2015-07-01

    STP 4-06 Model-Based Technical Data in Procurement, 3D PDF Technology Data Demonstration Project, Phase 1 Summary Report (DL309T2), July 2015, prepared under LMI's ISO-certified quality management procedures. Contents include Model-Based Technical Data (p. 5) and the 3D PDF Demonstration Team.

  13. [A systematic evaluation of application of the web-based cancer database].

    PubMed

    Huang, Tingting; Liu, Jialin; Li, Yong; Zhang, Rui

    2013-10-01

    In order to support the theory and practice of web-based cancer database development in China, we carried out a systematic evaluation of the state of web-based cancer databases at home and abroad. We performed computer-based retrieval of the Ovid-MEDLINE, Springerlink, EBSCOhost, Wiley Online Library and CNKI databases for papers published between Jan. 1995 and Dec. 2011, and hand-searched the references of these papers. We selected qualified papers according to pre-established inclusion and exclusion criteria, and carried out information extraction and analysis. The database search yielded 1244 papers, and checking the reference lists identified 19 more. Thirty-one articles met the inclusion and exclusion criteria; we extracted and assessed the evidence they provided. The analysis showed that the U.S.A. ranked first, accounting for 26% of the databases. Thirty-nine percent of the web-based cancer databases are comprehensive cancer databases. Among single-cancer databases, breast cancer and prostate cancer rank highest, each accounting for 10%. Thirty-two percent of the cancer databases include cancer gene information. As for technical platforms, MySQL and PHP are the most widely applied, at nearly 23% each.

  14. BBN technical memorandum W1291 infrasound model feasibility study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farrell, T., BBN Systems and Technologies

    1998-05-01

    The purpose of this study is to determine the need and level of effort required to add existing atmospheric databases and infrasound propagation models to the DOE's Hydroacoustic Coverage Assessment Model (HydroCAM) [1,2]. The rationale for the study is that the performance of the infrasound monitoring network will be an important factor for both the International Monitoring System (IMS) and US national monitoring capability. Many of the technical issues affecting the design and performance of the infrasound network are directly related to the variability of the atmosphere and the corresponding uncertainties in infrasound propagation. It is clear that the study of these issues will be enhanced by the availability of software tools for easy manipulation and interfacing of various atmospheric databases and infrasound propagation models. In addition, since there are many similarities between propagation in the oceans and in the atmosphere, it is anticipated that much of the software infrastructure developed for hydroacoustic database manipulation and propagation modeling in HydroCAM will be directly extendible to an infrasound capability. The study approach was to talk to the acknowledged domain experts in the infrasound monitoring area to determine: 1. The major technical issues affecting infrasound monitoring network performance. 2. The need for an atmospheric database/infrasound propagation modeling capability similar to HydroCAM. 3. The state of existing infrasound propagation codes and atmospheric databases. 4. A recommended approach for developing the required capabilities. A list of the people who contributed information to this study is provided in Table 1. We also relied on our knowledge of oceanographic and meteorological data sources to determine the availability of atmospheric databases and the feasibility of incorporating this information into the existing HydroCAM geographic database software. This report presents a summary of the need for an

  15. Study of space shuttle orbiter system management computer function. Volume 1: Analysis, baseline design

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A system analysis of the shuttle orbiter baseline system management (SM) computer function is performed. This analysis results in an alternative SM design which is also described. The alternative design exhibits several improvements over the baseline, some of which are increased crew usability, improved flexibility, and improved growth potential. The analysis consists of two parts: an application assessment and an implementation assessment. The former is concerned with the SM user needs and design functional aspects. The latter is concerned with design flexibility, reliability, growth potential, and technical risk. The system analysis is supported by several topical investigations. These include: treatment of false alarms, treatment of off-line items, significant interface parameters, and a design evaluation checklist. An in-depth formulation of techniques, concepts, and guidelines for design of automated performance verification is discussed.

  16. A national database for essential drugs in South Africa.

    PubMed

    Zweygarth, M; Summers, R S

    2000-06-01

    In the process of drafting standard treatment guidelines for adults and children at hospital level, the Secretariat of the National Essential Drugs List Committee made use of a database designed with technical support from the School of Pharmacy, MEDUNSA. The database links the current 697 drugs on the Essential Drugs List with Standard Treatment Guidelines for over 400 conditions. It served to streamline the inclusion of different drugs and dosage forms in the various guidelines, and provided concise, updated information to other departments involved in drug procurement. From information on drug prices and morbidity, it can also be used to calculate drug consumption and cost estimates and compare them with actual figures.

  17. Constructing Benchmark Databases and Protocols for Medical Image Analysis: Diabetic Retinopathy

    PubMed Central

    Kauppi, Tomi; Kämäräinen, Joni-Kristian; Kalesnykiene, Valentina; Sorri, Iiris; Uusitalo, Hannu; Kälviäinen, Heikki

    2013-01-01

    We address the performance evaluation practices for developing medical image analysis methods, in particular, how to establish and share databases of medical images with verified ground truth and solid evaluation protocols. Such databases support the development of better algorithms, execution of profound method comparisons, and, consequently, technology transfer from research laboratories to clinical practice. For this purpose, we propose a framework consisting of reusable methods and tools for the laborious task of constructing a benchmark database. We provide a software tool for medical image annotation helping to collect class label, spatial span, and expert's confidence on lesions and a method to appropriately combine the manual segmentations from multiple experts. The tool and all necessary functionality for method evaluation are provided as public software packages. As a case study, we utilized the framework and tools to establish the DiaRetDB1 V2.1 database for benchmarking diabetic retinopathy detection algorithms. The database contains a set of retinal images, ground truth based on information from multiple experts, and a baseline algorithm for the detection of retinopathy lesions. PMID:23956787
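
    The abstract mentions a method to combine manual segmentations from multiple experts into a single ground truth. A minimal sketch of one such fusion rule follows; the confidence-weighted vote with a 0.5 threshold is an assumed illustration, not the exact DiaRetDB1 combination algorithm:

```python
import numpy as np

# Fuse binary lesion masks from several experts into one ground-truth mask.
# Each expert supplies a mask and a confidence in [0, 1]; a pixel enters the
# fused ground truth when its confidence-weighted vote reaches the threshold.
def fuse_masks(masks, confidences, threshold=0.5):
    masks = np.asarray(masks, dtype=float)        # shape (experts, H, W)
    w = np.asarray(confidences, dtype=float)
    vote = np.tensordot(w, masks, axes=1) / w.sum()  # weighted mean, (H, W)
    return vote >= threshold

expert_masks = [
    [[1, 1, 0, 0],
     [0, 1, 0, 0]],
    [[1, 0, 0, 0],
     [0, 1, 1, 0]],
    [[1, 1, 0, 0],
     [0, 1, 0, 0]],
]
ground_truth = fuse_masks(expert_masks, confidences=[1.0, 0.5, 1.0])
print(ground_truth.astype(int))
```

    Pixels marked by only the low-confidence expert fall below the threshold, while pixels supported by the two high-confidence experts survive into the fused mask.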

  18. Project Manager’s Guide to the Scientific and Technical Information (STINFO) Program and Technical Publications Process

    DTIC Science & Technology

    1993-12-01

    Reports may be definitive for the subject presented, exploratory in nature, or an evaluation of critical subsystems or of technical problems ... International Security 9 Social and Natural Science Studies Field 41 Edit: (Type 3) - Entry of an invalid code when Performance Type is "C" or "M" will ... analysis SF Foreign area social science research SP Foreign area policy planning research BF Identifies databases with data on foreign forces or

  19. Neuroinformatics Database (NiDB) – A Modular, Portable Database for the Storage, Analysis, and Sharing of Neuroimaging Data

    PubMed Central

    Anderson, Beth M.; Stevens, Michael C.; Glahn, David C.; Assaf, Michal; Pearlson, Godfrey D.

    2013-01-01

    We present a modular, high performance, open-source database system that incorporates popular neuroimaging database features with novel peer-to-peer sharing, and a simple installation. An increasing number of imaging centers have created a massive amount of neuroimaging data since fMRI became popular more than 20 years ago, with much of that data unshared. The Neuroinformatics Database (NiDB) provides a stable platform to store and manipulate neuroimaging data and addresses several of the impediments to data sharing presented by the INCF Task Force on Neuroimaging Datasharing, including 1) motivation to share data, 2) technical issues, and 3) standards development. NiDB solves these problems by 1) minimizing PHI use, providing a cost effective simple locally stored platform, 2) storing and associating all data (including genome) with a subject and creating a peer-to-peer sharing model, and 3) defining a sample, normalized definition of a data storage structure that is used in NiDB. NiDB not only simplifies the local storage and analysis of neuroimaging data, but also enables simple sharing of raw data and analysis methods, which may encourage further sharing. PMID:23912507

  20. Advancing Precambrian palaeomagnetism with the PALEOMAGIA and PINT(QPI) databases

    PubMed Central

    Veikkolainen, Toni H.; Biggin, Andrew J.; Pesonen, Lauri J.; Evans, David A.; Jarboe, Nicholas A.

    2017-01-01

    State-of-the-art measurements of the direction and intensity of Earth’s ancient magnetic field have made important contributions to our understanding of the geology and palaeogeography of Precambrian Earth. The PALEOMAGIA and PINT(QPI) databases provide thorough public collections of important palaeomagnetic data of this kind. They comprise more than 4,100 observations in total and have been essential in supporting our international collaborative efforts to understand Earth's magnetic history on a timescale far longer than that of the present Phanerozoic Eon. Here, we provide an overview of the technical structure and applications of both databases, paying particular attention to recent improvements and discoveries. PMID:28534869

  1. Advancing Precambrian palaeomagnetism with the PALEOMAGIA and PINT(QPI) databases.

    PubMed

    Veikkolainen, Toni H; Biggin, Andrew J; Pesonen, Lauri J; Evans, David A; Jarboe, Nicholas A

    2017-05-23

    State-of-the-art measurements of the direction and intensity of Earth's ancient magnetic field have made important contributions to our understanding of the geology and palaeogeography of Precambrian Earth. The PALEOMAGIA and PINT(QPI) databases provide thorough public collections of important palaeomagnetic data of this kind. They comprise more than 4,100 observations in total and have been essential in supporting our international collaborative efforts to understand Earth's magnetic history on a timescale far longer than that of the present Phanerozoic Eon. Here, we provide an overview of the technical structure and applications of both databases, paying particular attention to recent improvements and discoveries.

  2. HLLV avionics requirements study and electronic filing system database development

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This final report provides a summary of achievements and activities performed under Contract NAS8-39215. The contract's objective was to explore a new way of delivering, storing, accessing, and archiving study products and information and to define top level system requirements for Heavy Lift Launch Vehicle (HLLV) avionics that incorporate Vehicle Health Management (VHM). This report includes technical objectives, methods, assumptions, recommendations, sample data, and issues as specified by DPD No. 772, DR-3. The report is organized into two major subsections, one specific to each of the two tasks defined in the Statement of Work: the Index Database Task and the HLLV Avionics Requirements Task. The Index Database Task resulted in the selection and modification of a commercial database software tool to contain the data developed during the HLLV Avionics Requirements Task. All summary information is addressed within each task's section.

  3. Uniformly Processed Strong Motion Database for Himalaya and Northeast Region of India

    NASA Astrophysics Data System (ADS)

    Gupta, I. D.

    2018-03-01

    This paper presents the first uniformly processed comprehensive database of strong motion acceleration records for the extensive regions of western Himalaya, northeast India, and the alluvial plains juxtaposing the Himalaya. It includes 146 three-component analog records corrected for instrument response and baseline distortions, and 471 three-component digital records corrected for baseline errors. The paper first provides a background on the evolution of strong motion data in India and the seismotectonics of the areas of recording, then describes the details of the recording stations and the contributing earthquakes, and finally presents the methodology used to obtain baseline-corrected data in a uniform and consistent manner. Two schemes in common use for baseline correction are based on the application of the Ormsby filter without zero pads (Trifunac 1971) and the Butterworth filter with zero pads at both the start and the end (Converse and Brady 1992). To integrate the advantages of both schemes, the Ormsby filter with zero pads at the start only is used in the present study. A large number of typical example results are presented to illustrate that the adopted methodology provides realistic velocity and displacement records with a much smaller number of zero pads. The present strong motion database of corrected acceleration records will be useful for analyzing ground motion characteristics of engineering importance, developing prediction equations for various strong motion parameters, and calibrating the seismological source model approach for ground motion simulation in seismically active and risk-prone areas of India.
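
    The zero-padding idea behind these correction schemes can be sketched on synthetic data. A simple first-order recursive high-pass (DC-blocking) filter stands in here for the Ormsby/Butterworth filters the paper discusses, and the record is synthetic; leading zero pads give the recursive filter a well-defined quiet run-in before the record begins:

```python
import numpy as np

# One-pole DC-blocking high-pass filter (stand-in for Ormsby/Butterworth).
def highpass(x, alpha=0.995):
    y = np.zeros_like(x)
    for i in range(1, len(x)):
        y[i] = alpha * (y[i - 1] + x[i] - x[i - 1])
    return y

dt = 0.01
t = np.arange(0, 20, dt)
accel = np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.2 * t)  # synthetic ground motion
accel += 0.05                                           # baseline offset error

npad = 500                               # zero pads at the start only
padded = np.concatenate([np.zeros(npad), accel])
corrected = highpass(padded)[npad:]      # strip the pad after filtering

# The spurious constant offset (which would make integrated displacement grow
# quadratically) is removed, while the oscillatory motion passes through.
print(abs(accel[-1000:].mean()) > 0.04)       # offset present in raw record
print(abs(corrected[-1000:].mean()) < 0.005)  # offset removed after correction
```

    With an uncorrected 0.05 m/s^2 offset, double integration over 20 s would produce about 10 m of spurious displacement; after high-pass correction the residual mean acceleration is negligible.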

  4. The Israeli National Genetic database: a 10-year experience.

    PubMed

    Zlotogora, Joël; Patrinos, George P

    2017-03-16

    The Israeli National and Ethnic Mutation database ( http://server.goldenhelix.org/israeli ) was launched in September 2006 on the ETHNOS software to include clinically relevant genomic variants reported among Jewish and Arab Israeli patients. In 2016, the database was reviewed and corrected according to ClinVar ( https://www.ncbi.nlm.nih.gov/clinvar ) and ExAC ( http://exac.broadinstitute.org ) database entries. The present article summarizes some key aspects of the development and continuous update of the database over a 10-year period, which could serve as a paradigm of successful database curation for other similar resources. In September 2016, there were 2444 entries in the database, 890 among Jews, 1376 among Israeli Arabs, and 178 among Palestinian Arabs, corresponding to an ~4× increase in data content compared to when originally launched. While the Israeli Arab population is much smaller than the Jewish population, the number of pathogenic variants causing recessive disorders reported in the database is higher among Arabs (934) than among Jews (648). Nevertheless, the number of pathogenic variants classified as founder mutations in the database is smaller among Arabs (175) than among Jews (192). In 2016, the entire database content was compared to that of other databases such as ClinVar and ExAC. We show that a significant difference in the percentage of pathogenic variants from the Israeli genetic database that were present in ExAC was observed between the Jewish population (31.8%) and the Israeli Arab population (20.6%). The Israeli genetic database was launched in 2006 on the ETHNOS software and has been available online ever since. It allows querying by disorder and ethnicity; however, many other features are not available, in particular the ability to search by gene name.
    In addition, due to the technical limitations of the previous ETHNOS software, new features and data are not included in the

  5. A global baseline for spawning aggregations of reef fishes.

    PubMed

    Sadovy De Mitcheson, Yvonne; Cornish, Andrew; Domeier, Michael; Colin, Patrick L; Russell, Martin; Lindeman, Kenyon C

    2008-10-01

    Species that periodically and predictably congregate on land or in the sea can be extremely vulnerable to overexploitation. Many coral reef fishes form spawning aggregations that are increasingly the target of fishing. Although serious declines are well known for a few species, the extent of this behavior among fishes and the impacts of aggregation fishing are not widely appreciated. To profile aggregating species globally, establish a baseline for future work, and strengthen the case for protection, we (as members of the Society for the Conservation of Reef Fish Aggregations) developed a global database on the occurrence, history, and management of spawning aggregations. We complemented the database with information from interviews with over 300 fishers in Asia and the western Pacific. Sixty-seven species, mainly commercial, in 9 families aggregate to spawn in the 29 countries or territories considered in the database. Ninety percent of aggregation records were from reef pass channels, promontories, and outer reef-slope drop-offs. Multispecies aggregation sites were common, and spawning seasons of most species typically lasted <3 months. The best-documented species in the database, the Nassau grouper (Epinephelus striatus), has undergone substantial declines in aggregations throughout its range and is now considered threatened. Our findings have important conservation and management implications for aggregating species given that exploitation pressures on them are increasing, there is little effective management, and 79% of the aggregations that were sufficiently well documented were reported to be in decline. Nonetheless, a few success stories demonstrate the benefits of aggregation management. A major shift in perspective on spawning aggregations of reef fish, from being seen as opportunities for exploitation to acknowledging them as important life-history phenomena in need of management, is urgently needed.

  6. Cardiovascular (CV) Risk after Initiation of Abatacept versus TNF Inhibitors in Rheumatoid Arthritis Patients with and without Baseline CV Disease.

    PubMed

    Jin, Yinzhu; Kang, Eun Ha; Brill, Gregory; Desai, Rishi J; Kim, Seoyoung C

    2018-05-15

    To evaluate the cardiovascular safety of abatacept (ABA) versus tumor necrosis factor inhibitors (TNFi) in rheumatoid arthritis (RA) patients with and without underlying cardiovascular disease (CVD). We identified RA patients with and without baseline CVD who initiated ABA or TNFi by using data from 2 large US insurance claims databases: Medicare (2008-2013) and Truven MarketScan (2006-2015). After stratifying by baseline CVD, ABA initiators were 1:1 propensity score (PS) matched to TNFi initiators to control for > 60 baseline covariates. Cox proportional hazards regression estimated the HR and 95% CI for a composite endpoint of CVD including myocardial infarction, stroke/transient ischemic attack, or coronary revascularization in the PS-matched cohorts. HRs from the 2 databases were combined through an inverse variance-weighted fixed-effects model. We included 6102 PS-matched pairs of ABA and TNFi initiators from Medicare and 6934 pairs from MarketScan. Of these, 35.3% in Medicare and 14.0% in MarketScan had baseline CVD. The HR (95% CI) for composite CVD in the overall ABA group versus TNFi was 0.67 (0.55-0.81) in Medicare and 1.08 (0.83-1.41) in MarketScan, with a combined HR of 0.79 (0.67-0.92). Among patients with baseline CVD, the HR (95% CI) was 0.71 (0.55-0.92) in Medicare and 1.02 (0.68-1.51) in MarketScan, with a combined HR of 0.79 (0.64-0.98). In this large cohort of publicly or privately insured patients with RA in the United States, ABA was associated with a 20% reduced risk of CVD versus TNFi. While this observational study is subject to potential residual confounding, our results were consistent in patients with baseline CVD.
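
    The inverse-variance-weighted fixed-effects pooling used here is a short calculation: each database's standard error is recovered from its 95% CI on the log scale (CI width = 2 × 1.96 × SE), and the log hazard ratios are averaged with weights 1/SE². A sketch, using the two HRs reported in the abstract:

```python
import math

def pooled_hr(results):
    """Fixed-effects pooling of hazard ratios given as (hr, ci_low, ci_high)."""
    num = den = 0.0
    for hr, lo, hi in results:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE of log HR
        w = 1.0 / se ** 2                                # inverse-variance weight
        num += w * math.log(hr)
        den += w
    return math.exp(num / den)

hr = pooled_hr([(0.67, 0.55, 0.81),    # Medicare
                (1.08, 0.83, 1.41)])   # MarketScan
print(round(hr, 2))  # -> 0.79, matching the combined HR in the abstract
```

    The narrower Medicare CI gives that estimate roughly twice the weight of MarketScan's, which is why the pooled HR sits much closer to 0.67 than to 1.08.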

  7. Comparative Analysis of a Principal Component Analysis-Based and an Artificial Neural Network-Based Method for Baseline Removal.

    PubMed

    Carvajal, Roberto C; Arias, Luis E; Garces, Hugo O; Sbarbaro, Daniel G

    2016-04-01

    This work presents a non-parametric method based on principal component analysis (PCA) and a parametric one based on artificial neural networks (ANN) to remove continuous baseline features from spectra. The non-parametric method estimates the baseline from a set of sampled basis vectors obtained by applying PCA to a previously composed learning matrix of continuous spectra. The parametric method instead uses an ANN to filter out the baseline; previous studies have demonstrated that this is one of the most effective methods for baseline removal. Both methods were evaluated using a synthetic database designed for benchmarking baseline removal algorithms, containing 100 synthetic composed spectra at different signal-to-baseline ratios (SBR), signal-to-noise ratios (SNR), and baseline slopes. To demonstrate the utility of the proposed methods and compare them in a real application, a spectral data set measured from a flame radiation process was also used. Several performance metrics, such as the correlation coefficient, chi-square value, and goodness-of-fit coefficient, were calculated to quantify and compare the two algorithms. Results demonstrate that the PCA-based method outperforms the ANN-based one in terms of both performance and simplicity. © The Author(s) 2016.
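
    The PCA-based idea can be sketched in a few lines: learn principal components from a training matrix of baseline-only spectra, project an observed spectrum onto them, and subtract the projection as the baseline estimate. The synthetic spectra below are illustrative assumptions, not the benchmarking database from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 200)

# Training matrix of smooth continuous baselines (random offset/slope/curvature).
train = np.array([a + b * x + c * x ** 2
                  for a, b, c in rng.uniform(-1, 1, size=(50, 3))])
mean = train.mean(axis=0)
_, _, vt = np.linalg.svd(train - mean, full_matrices=False)
basis = vt[:3]                       # top principal components of the baselines

# Observed spectrum = narrow Gaussian peak + unknown smooth baseline.
peak = np.exp(-0.5 * ((x - 0.4) / 0.01) ** 2)
baseline = 0.3 + 0.8 * x - 0.5 * x ** 2
spectrum = peak + baseline

# Project onto the baseline subspace and subtract the projection.
est = mean + basis.T @ (basis @ (spectrum - mean))
corrected = spectrum - est
```

    The smooth baseline lies in the span of the learned components and is removed almost exactly, while the narrow peak has little overlap with the smooth basis and survives nearly intact.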

  8. Incremental Aerodynamic Coefficient Database for the USA2

    NASA Technical Reports Server (NTRS)

    Richardson, Annie Catherine

    2016-01-01

    From March through May of 2016, a wind tunnel test was conducted by the Aerosciences Branch (EV33) to visually study the unsteady aerodynamic behavior over multiple transition geometries for the Universal Stage Adapter 2 (USA2) in the MSFC Aerodynamic Research Facility's Trisonic Wind Tunnel (TWT). The purpose of the test was to make a qualitative comparison of the transonic flow field in order to recommend a minimum transition radius for manufacturing. Additionally, six-degree-of-freedom force and moment data were acquired for each configuration tested in order to determine the geometric effects on the longitudinal aerodynamic coefficients (normal force, axial force, and pitching moment). In order to make a quantitative comparison of the aerodynamic effects of the USA2 transition geometry, the aerodynamic coefficient data collected during the test were parsed and incorporated into a database for each USA2 configuration tested. An incremental aerodynamic coefficient database was then developed from the databases for each USA2 geometry as a function of Mach number and angle of attack. The final USA2 coefficient increments will be applied to the aerodynamic coefficients of the baseline geometry to adjust the Space Launch System (SLS) integrated launch vehicle force and moment database based on the transition geometry of the USA2.
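
    Applying an incremental coefficient database typically means interpolating a table of increments on a (Mach, angle-of-attack) grid and adding the result to the baseline coefficient. The sketch below shows that pattern with bilinear interpolation; the grid values and increments are made-up illustrations, not data from the wind tunnel test:

```python
import numpy as np

mach_grid = np.array([0.8, 0.9, 1.1, 1.2])
alpha_grid = np.array([-4.0, 0.0, 4.0, 8.0])          # degrees
# delta_cn[i, j]: normal-force coefficient increment at
# (mach_grid[i], alpha_grid[j]); values are illustrative only.
delta_cn = np.array([[0.00, 0.01, 0.02, 0.03],
                     [0.01, 0.02, 0.04, 0.05],
                     [0.02, 0.03, 0.05, 0.07],
                     [0.01, 0.02, 0.04, 0.06]])

def increment(mach, alpha):
    """Bilinear interpolation of the increment table."""
    i = int(np.clip(np.searchsorted(mach_grid, mach) - 1, 0, len(mach_grid) - 2))
    j = int(np.clip(np.searchsorted(alpha_grid, alpha) - 1, 0, len(alpha_grid) - 2))
    tm = (mach - mach_grid[i]) / (mach_grid[i + 1] - mach_grid[i])
    ta = (alpha - alpha_grid[j]) / (alpha_grid[j + 1] - alpha_grid[j])
    return ((1 - tm) * (1 - ta) * delta_cn[i, j]
            + tm * (1 - ta) * delta_cn[i + 1, j]
            + (1 - tm) * ta * delta_cn[i, j + 1]
            + tm * ta * delta_cn[i + 1, j + 1])

cn_baseline = 0.85   # baseline-geometry coefficient (illustrative)
cn_adjusted = cn_baseline + increment(mach=1.0, alpha=2.0)
print(round(float(cn_adjusted), 3))  # -> 0.885
```

    Keeping the increments separate from the baseline database means the baseline force-and-moment data stay untouched and each transition geometry contributes only its own delta table.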

  9. Meta-Analysis of the Relation of Baseline Right Ventricular Function to Response to Cardiac Resynchronization Therapy.

    PubMed

    Sharma, Abhishek; Bax, Jerome J; Vallakati, Ajay; Goel, Sunny; Lavie, Carl J; Kassotis, John; Mukherjee, Debabrata; Einstein, Andrew; Warrier, Nikhil; Lazar, Jason M

    2016-04-15

    Right ventricular (RV) dysfunction has been associated with adverse clinical outcomes in patients with heart failure (HF). Cardiac resynchronization therapy (CRT) improves left ventricular (LV) size and function in patients with markedly abnormal electrocardiogram QRS duration. However, the relation of baseline RV function to response to CRT has not been well described. In this study, we investigated the relation of baseline RV function to response to CRT as assessed by change in LV ejection fraction (EF). A systematic search of studies published from 1966 to May 31, 2015 was conducted using the PubMed, CINAHL, Cochrane CENTRAL, and Web of Science databases. Studies were included if they reported (1) parameters of baseline RV function (tricuspid annular plane systolic excursion [TAPSE], RVEF, RV basal strain, or RV fractional area change [FAC]) and (2) LVEF before and after CRT. Random-effects metaregression was used to evaluate the association between baseline RV function parameters and change in LVEF. Sixteen studies (n = 1,764) were selected for final analysis. Random-effects metaregression analysis showed no significant association between the magnitude of the difference in EF before and after CRT and baseline TAPSE (β = 0.005, p = 0.989), baseline RVEF (β = 0.270, p = 0.493), baseline RVFAC (β = -0.367, p = 0.06), or baseline basal strain (β = -0.342, p = 0.462) after a mean follow-up period of 10.5 months. In conclusion, baseline RV function as assessed by TAPSE, FAC, basal strain, or RVEF does not determine response to CRT as assessed by change in LVEF. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. 40 CFR 80.915 - How are the baseline toxics value and baseline toxics volume determined?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false How are the baseline toxics value and... AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Toxics Baseline Determination § 80.915 How are the baseline toxics value and baseline toxics volume determined? (a...

  11. 40 CFR 80.915 - How are the baseline toxics value and baseline toxics volume determined?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 16 2011-07-01 2011-07-01 false How are the baseline toxics value and... AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Toxics Baseline Determination § 80.915 How are the baseline toxics value and baseline toxics volume determined? (a...

  12. Development and application of basis database for materials life cycle assessment in china

    NASA Astrophysics Data System (ADS)

    Li, Xiaoqing; Gong, Xianzheng; Liu, Yu

    2017-03-01

    Materials life cycle assessment (MLCA) is a data-intensive method: high-quality environmental burden data is an essential prerequisite for carrying it out, and the reliability of the data directly influences the reliability of the assessment results and their usefulness in application. Building a Chinese MLCA database therefore provides the basic data and technical support needed to carry out and improve LCA practice. Firstly, recent progress on databases related to materials life cycle assessment research and development is introduced. Secondly, following the requirements of the ISO 14040 series of standards, the database framework and the main datasets of the materials life cycle assessment are studied. Thirdly, an MLCA data platform based on big data is developed. Finally, future research directions are proposed and discussed.

  13. Effectiveness, safety and costs of thromboembolic prevention in patients with non-valvular atrial fibrillation: phase I ESC-FA protocol study and baseline characteristics of a cohort from a primary care electronic database

    PubMed Central

    Vedia Urgell, Cristina; Roso-Llorach, Albert; Morros, Rosa; Capellà, Dolors; Castells, Xavier; Ferreira-González, Ignacio; Troncoso Mariño, Amelia; Diògene, Eduard; Elorza, Josep Mª; Casajuana, Marc; Bolíbar, Bonaventura; Violan, Concepció

    2016-01-01

    Purpose Atrial fibrillation is the most common arrhythmia. Its management aims to reduce symptoms and to prevent complications through rate and rhythm control, management of concomitant cardiac diseases and prevention of related complications, mainly stroke. The main objective of the Effectiveness, Safety and Costs in Atrial Fibrillation (ESC-FA) study is to analyse the drugs used for the management of the disease in real-use conditions, particularly the antithrombotic agents for stroke prevention. The aim of this work is to present the study protocol of phase I of the ESC-FA study and the baseline characteristics of newly diagnosed patients with atrial fibrillation in Catalonia, Spain. Participants The data source is the System for the Improvement of Research in Primary Care (SIDIAP) database. The population comprises all patients with a non-valvular atrial fibrillation diagnosis registered in the electronic health records during 2007–2012. Findings to date A total of 22 585 patients with non-valvular atrial fibrillation were included in the baseline description. Their mean age was 72.8 years and 51.6% were men. The most commonly prescribed antithrombotics were vitamin K antagonists (40.1% of patients) and platelet aggregation inhibitors (32.9%); 25.3% had not been prescribed antithrombotic treatment. Age, gender, comorbidities and co-medication at baseline were similar to those reported for previous studies. Future plans The next phase in the ESC-FA study will involve assessing the effectiveness and safety of antithrombotic treatments, analysing stroke event and bleeding episode rates in our patients (rest of phase I), describing the current management of the disease and its costs in our setting, and assessing how the introduction of new oral anticoagulants changes stroke prevention in non-valvular atrial fibrillation. PMID:26823179

  14. 2017 Annual Technology Baseline (ATB): Cost and Performance Data for Electricity Generation Technologies

    DOE Data Explorer

    Hand, Maureen; Augustine, Chad; Feldman, David; Kurup, Parthiv; Beiter, Philipp; O'Connor, Patrick

    2017-08-21

    Each year since 2015, NREL has presented the Annual Technology Baseline (ATB) in a spreadsheet that contains detailed cost and performance data (both current and projected) for renewable and conventional technologies. The spreadsheet includes a workbook for each technology. This spreadsheet provides data for the 2017 ATB. In this edition of the ATB, offshore wind power has been updated to include 15 technical resource groups, and two options are now provided for representing market conditions for project financing: current market conditions and long-term historical conditions. For more information, see https://atb.nrel.gov/.

  15. Concepts and data model for a co-operative neurovascular database.

    PubMed

    Mansmann, U; Taylor, W; Porter, P; Bernarding, J; Jäger, H R; Lasjaunias, P; Terbrugge, K; Meisel, J

    2001-08-01

    The clinical management of neurovascular diseases is highly complex, owing to the chronic character of the diseases, long histories of symptoms and diverse treatments. If patients are to benefit from treatment, treatment decisions must rely on reliable and accurate knowledge of the natural history of the disease and of the various treatments. Recent developments in statistical methodology and experience with electronic patient records are used to establish an information infrastructure based on a centralized register. A protocol for collecting data on neurovascular diseases is described, together with the technical and logistical aspects of implementing a database for neurovascular diseases. The database is designed as a co-operative tool for audit and research available to co-operating centres. When a database is linked to systematic patient follow-up, it can be used to study prognosis. Careful analysis of patient outcome is valuable for decision-making.

  16. Technical evaluation of methods for identifying chemotherapy-induced febrile neutropenia in healthcare claims databases.

    PubMed

    Weycker, Derek; Sofrygin, Oleg; Seefeld, Kim; Deeter, Robert G; Legg, Jason; Edelsberg, John

    2013-02-13

    Healthcare claims databases have been used in several studies to characterize the risk and burden of chemotherapy-induced febrile neutropenia (FN) and the effectiveness of colony-stimulating factors against FN. The accuracy of methods previously used to identify FN in such databases has not been formally evaluated. Data comprised linked electronic medical records from Geisinger Health System and healthcare claims data from Geisinger Health Plan. Subjects were classified into subgroups based on whether or not they were hospitalized for FN per the presumptive "gold standard" (ANC <1.0×10^9/L, and body temperature ≥38.3°C or receipt of antibiotics) and the claims-based definition (diagnosis codes for neutropenia, fever, and/or infection). Accuracy was evaluated principally based on positive predictive value (PPV) and sensitivity. Among 357 study subjects, 82 (23%) met the gold standard for hospitalized FN. For the claims-based definition including diagnosis codes for neutropenia plus fever in any position (n=28), PPV was 100% and sensitivity was 34% (95% CI: 24-45). For the definition including neutropenia in the primary position (n=54), PPV was 87% (78-95) and sensitivity was 57% (46-68). For the definition including neutropenia in any position (n=71), PPV was 77% (68-87) and sensitivity was 67% (56-77). Patients hospitalized for chemotherapy-induced FN can be identified in healthcare claims databases, with an acceptable level of misclassification, using diagnosis codes for neutropenia, or neutropenia plus fever.
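
    The accuracy metrics in this abstract follow directly from the 2×2 table of claims-based classification versus the gold standard. A small Python sketch using only the counts reported above (28 claims-positive, all confirmed; 82 gold-standard FN cases) with a normal-approximation confidence interval (the abstract does not state which interval method the authors used, so the CI method is an assumption):

```python
import math

def ppv_sensitivity(tp, fp, fn, z=1.96):
    """PPV and sensitivity with normal-approximation 95% CIs.
    Returns two (estimate, lower, upper) tuples."""
    def prop_ci(k, n):
        p = k / n
        half = z * math.sqrt(p * (1 - p) / n)
        return p, max(0.0, p - half), min(1.0, p + half)
    ppv = prop_ci(tp, tp + fp)    # among claims-positive, truly FN
    sens = prop_ci(tp, tp + fn)   # among gold-standard FN, claims-positive
    return ppv, sens

# "Neutropenia plus fever in any position" definition from the abstract:
# 28 claims-positive, all confirmed (PPV 100%); 82 gold-standard FN cases.
ppv, sens = ppv_sensitivity(tp=28, fp=0, fn=82 - 28)
```

    With these counts the sketch reproduces the abstract's PPV of 100% and sensitivity of 28/82 ≈ 34%.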

  17. Baseline information development for energy smart schools -- applied research, field testing and technology integration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Tengfang; Piette, Mary Ann

    2004-08-05

    The original scope of work was to obtain and analyze existing and emerging data in four states: California, Florida, New York, and Wisconsin. The goal of this data collection was to deliver a baseline database, or recommendations for such a database, that could contain window and daylighting features and energy performance characteristics of Kindergarten through 12th grade (K-12) school buildings (or of classrooms when available). In particular, data analyses were performed on the California Commercial End-Use Survey (CEUS) databases to understand school energy use, features of window glazing, and availability of daylighting in California K-12 schools. The outcomes from this baseline task can be used to assist in establishing a database of school energy performance, assessing applications of existing technologies relevant to window and daylighting design, and identifying future R&D needs. These are in line with the overall project goals as outlined in the proposal. Through the review and analysis of this data, it is clear that there are many compounding factors impacting energy use in K-12 school buildings in the U.S., and that there are various challenges in understanding the impact of K-12 classroom energy use associated with design features of window glazing and skylights. First, the energy data in the existing CEUS databases has, at most, provided the aggregated electricity and/or gas usage for building establishments that include other school facilities in addition to the classroom spaces. Although the percentage of classroom floor area in schools is often available from the databases, there is no additional information that can be used to quantitatively segregate the EUI for classroom spaces. In order to quantify the EUI for classrooms, sub-metering of energy usage by classrooms must be obtained. Second, magnitudes of energy use for electric lighting are not attainable from the existing databases, nor are the lighting levels

  18. Development of the Orion Crew Module Static Aerodynamic Database. Part 2; Supersonic/Subsonic

    NASA Technical Reports Server (NTRS)

    Bibb, Karen L.; Walker, Eric L.; Brauckmann, Gregory J.; Robinson, Phil

    2011-01-01

    This work describes the process of developing the nominal static aerodynamic coefficients and associated uncertainties for the Orion Crew Module for Mach 8 and below. The database was developed from wind tunnel test data and computational simulations of the smooth Crew Module geometry, with no asymmetries or protuberances. The database covers the full range of Reynolds numbers seen in both entry and ascent abort scenarios. The basic uncertainties were developed as functions of Mach number and total angle of attack from variations in the primary data as well as computations at lower Reynolds numbers, on the baseline geometry, and using different flow solvers. The resulting aerodynamic database represents the Crew Exploration Vehicle Aerosciences Project's best estimate of the nominal aerodynamics for the current Crew Module vehicle.

  19. Database for chemical contents of streams on the White Mountain National Forest.

    Treesearch

    James W. Hornbeck; Michelle M. Alexander; Christopher Eagar; Joan Y. Carlson; Robert B. Smith

    2001-01-01

    Producing and protecting high-quality streamwater requires background or baseline data from which one can evaluate the impacts of natural and human disturbances. A database was created for chemical analyses of streamwater samples collected during the past several decades from 446 locations on the White Mountain National Forest (304,000 ha in New Hampshire and Maine)....

  20. The baseline pressure of intracranial pressure (ICP) sensors can be altered by electrostatic discharges.

    PubMed

    Eide, Per K; Bakken, André

    2011-08-22

    The monitoring of intracranial pressure (ICP) has a crucial role in the surveillance of patients with brain injury. During long-term monitoring of ICP, we have seen spontaneous shifts in baseline pressure (ICP sensor zero point), which are of technical and not physiological origin. The aim of the present study was to explore whether or not the baseline pressures of ICP sensors can be affected by electrostatic discharges (ESDs) when ESDs are delivered at clinically relevant magnitudes. We performed bench-testing of a set of commercial ICP sensors. In our experimental setup, the ICP sensor was placed in a container with 0.9% NaCl solution. A test person was charged to 0.5-10 kV, and then delivered ESDs to the sensor by touching a metal rod located in the container. The pressure signals were recorded continuously before/after the ESDs, and the pressure readings were stored digitally using a computerized system. A total of 57 sensors were tested, including 25 Codman ICP sensors and 32 Raumedic sensors. When charging the test person in the range 0.5-10 kV, ESDs in the range 0.5-5 kV peak pulse were typically delivered to the ICP sensor. Alterations in baseline pressure ≥2 mmHg were seen in 24 of 25 (96%) Codman sensors and in 17 of 32 (53%) Raumedic sensors. Lasting changes in baseline pressure >10 mmHg, which in the clinical setting would affect patient management, were seen frequently for both sensor types. The changes were characterized by either sudden shifts or gradual drifts in baseline pressure. The baseline pressures of commercial solid ICP sensors can be altered by ESDs at discharge magnitudes that are clinically relevant. Shifts in baseline pressure change the ICP levels visualised to the physician on the monitor screen, and thereby present wrong ICP values, which likely represents a severe risk to the patient.

  1. The baseline pressure of intracranial pressure (ICP) sensors can be altered by electrostatic discharges

    PubMed Central

    2011-01-01

    Background The monitoring of intracranial pressure (ICP) has a crucial role in the surveillance of patients with brain injury. During long-term monitoring of ICP, we have seen spontaneous shifts in baseline pressure (ICP sensor zero point), which are of technical and not physiological origin. The aim of the present study was to explore whether or not the baseline pressures of ICP sensors can be affected by electrostatic discharges (ESDs) when ESDs are delivered at clinically relevant magnitudes. Methods We performed bench-testing of a set of commercial ICP sensors. In our experimental setup, the ICP sensor was placed in a container with 0.9% NaCl solution. A test person was charged to 0.5-10 kV, and then delivered ESDs to the sensor by touching a metal rod located in the container. The pressure signals were recorded continuously before/after the ESDs, and the pressure readings were stored digitally using a computerized system. Results A total of 57 sensors were tested, including 25 Codman ICP sensors and 32 Raumedic sensors. When charging the test person in the range 0.5-10 kV, ESDs in the range 0.5-5 kV peak pulse were typically delivered to the ICP sensor. Alterations in baseline pressure ≥2 mmHg were seen in 24 of 25 (96%) Codman sensors and in 17 of 32 (53%) Raumedic sensors. Lasting changes in baseline pressure >10 mmHg, which in the clinical setting would affect patient management, were seen frequently for both sensor types. The changes were characterized by either sudden shifts or gradual drifts in baseline pressure. Conclusions The baseline pressures of commercial solid ICP sensors can be altered by ESDs at discharge magnitudes that are clinically relevant. Shifts in baseline pressure change the ICP levels visualised to the physician on the monitor screen, and thereby present wrong ICP values, which likely represents a severe risk to the patient. PMID:21859487

  2. Technical writing versus technical writing

    NASA Technical Reports Server (NTRS)

    Dillingham, J. W.

    1981-01-01

    Two terms, two job categories, 'technical writer' and 'technical author' are discussed in terms of industrial and business requirements and standards. A distinction between 'technical writing' and technical 'writing' is made. The term 'technical editor' is also considered. Problems inherent in the design of programs to prepare and train students for these jobs are discussed. A closer alliance between industry and academia is suggested as a means of preparing students with competent technical communication skills (especially writing and editing skills) and good technical skills.

  3. Technical issues for the eye image database creation at distance

    NASA Astrophysics Data System (ADS)

    Oropesa Morales, Lester Arturo; Maldonado Cano, Luis Alejandro; Soto Aldaco, Andrea; García Vázquez, Mireya Saraí; Zamudio Fuentes, Luis Miguel; Rodríguez Vázquez, Manuel Antonio; Pérez Rosas, Osvaldo Gerardo; Rodríguez Espejo, Luis; Montoya Obeso, Abraham; Ramírez Acosta, Alejandro Álvaro

    2016-09-01

    Biometrics refers to identifying people through physical or behavioral characteristics such as fingerprints, face, DNA, hand geometry, and retina and iris patterns. Typically, the iris pattern is acquired at short range to recognize a person; in the past few years, however, identifying a person by the iris pattern at a distance in non-cooperative environments has become a challenge. This challenge comprises: 1) high-quality iris images, 2) light variation, 3) blur reduction, 4) reduction of specular reflections, 5) the distance from the acquisition system to the user, and 6) standardizing the iris size and the pixel density of the iris texture. Solving these problems will add robustness and enhance iris recognition rates. For this reason, we describe the technical issues that must be considered during iris acquisition. Some of these considerations are the camera sensor, the lens, and the mathematical analysis of depth of field (DOF) and field of view (FOV) for iris recognition. Finally, based on these issues, we present experiments comparing captures obtained with our camera at a distance against captures obtained with cameras at very short range.
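
    The DOF and FOV analysis mentioned above can be illustrated with the standard thin-lens formulas; the sketch below is not the authors' actual analysis, and the parameter values used in the comments (focal length, f-number, circle of confusion) are purely illustrative.

```python
def field_of_view(sensor_mm, distance_mm, focal_mm):
    """Linear field of view at the subject plane (thin-lens approximation):
    FOV = sensor size * subject distance / focal length."""
    return sensor_mm * distance_mm / focal_mm

def depth_of_field(focal_mm, f_number, distance_mm, coc_mm):
    """Near/far limits of acceptable focus via the hyperfocal distance H.
    coc_mm is the acceptable circle of confusion on the sensor."""
    h = focal_mm ** 2 / (f_number * coc_mm) + focal_mm     # hyperfocal
    near = distance_mm * (h - focal_mm) / (h + distance_mm - 2 * focal_mm)
    far = distance_mm * (h - focal_mm) / (h - distance_mm)  # valid for s < H
    return near, far

# Illustrative long-range iris setup: 100 mm lens, f/2.8, subject at 3 m,
# 36 mm sensor width, 0.01 mm circle of confusion.
fov = field_of_view(36, 3000, 100)            # 1080 mm wide at 3 m
near, far = depth_of_field(100, 2.8, 3000, 0.01)
```

    The narrow DOF at long focal lengths is exactly why focus control dominates iris capture at a distance.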

  4. Methods and means used in programming intelligent searches of technical documents

    NASA Technical Reports Server (NTRS)

    Gross, David L.

    1993-01-01

    In order to meet the data research requirements of the Safety, Reliability & Quality Assurance activities at Kennedy Space Center (KSC), a new computer search method for technical data documents was developed. By their very nature, technical documents are partially encrypted because of the author's use of acronyms, abbreviations, and shortcut notations. This problem of computerized searching is compounded at KSC by the volume of documentation produced during normal Space Shuttle operations. The Centralized Document Database (CDD) is designed to solve this problem. It provides a common interface to an unlimited number of files of various sizes, with the capability to perform diverse types and levels of data searches. The heart of the CDD is the nature and capability of its search algorithms. The most complex form of search the program uses relies on a domain-specific database of acronyms, abbreviations, synonyms, and word frequency tables. This database, along with basic sentence parsing, is used to convert a request for information into a relational network. This network is used as a filter on the original document file to determine the most likely locations for the data requested. This type of search will locate information that traditional techniques (i.e., Boolean structured keyword searching) would not find.
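
    As a rough illustration of the acronym/synonym expansion the CDD performs before filtering documents, the sketch below expands query words through a small lookup table and ranks documents by expanded-term frequency. The table entries and the scoring rule are hypothetical, not the actual CDD algorithm.

```python
# Hypothetical domain-specific expansion table (illustrative entries only).
SYNONYMS = {
    "srqa": {"safety", "reliability", "quality", "assurance"},
    "ksc": {"kennedy", "space", "center"},
}

def expand(query):
    """Expand each query word with its acronym/synonym entries."""
    terms = set()
    for word in query.lower().split():
        terms.add(word)
        terms |= SYNONYMS.get(word, set())
    return terms

def rank(documents, query):
    """Rank documents by total frequency of expanded query terms."""
    terms = expand(query)
    scored = []
    for doc_id, text in documents.items():
        words = text.lower().split()
        score = sum(words.count(t) for t in terms)
        if score:
            scored.append((score, doc_id))
    return [doc_id for _, doc_id in sorted(scored, reverse=True)]
```

    A query for "SRQA report" would then match a document containing "safety reliability report" even though the literal string "SRQA" never appears in it, which is the behavior a plain Boolean keyword search misses.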

  5. Technical evaluation of methods for identifying chemotherapy-induced febrile neutropenia in healthcare claims databases

    PubMed Central

    2013-01-01

    Background Healthcare claims databases have been used in several studies to characterize the risk and burden of chemotherapy-induced febrile neutropenia (FN) and the effectiveness of colony-stimulating factors against FN. The accuracy of methods previously used to identify FN in such databases has not been formally evaluated. Methods Data comprised linked electronic medical records from Geisinger Health System and healthcare claims data from Geisinger Health Plan. Subjects were classified into subgroups based on whether or not they were hospitalized for FN per the presumptive “gold standard” (ANC <1.0×10^9/L, and body temperature ≥38.3°C or receipt of antibiotics) and the claims-based definition (diagnosis codes for neutropenia, fever, and/or infection). Accuracy was evaluated principally based on positive predictive value (PPV) and sensitivity. Results Among 357 study subjects, 82 (23%) met the gold standard for hospitalized FN. For the claims-based definition including diagnosis codes for neutropenia plus fever in any position (n=28), PPV was 100% and sensitivity was 34% (95% CI: 24–45). For the definition including neutropenia in the primary position (n=54), PPV was 87% (78–95) and sensitivity was 57% (46–68). For the definition including neutropenia in any position (n=71), PPV was 77% (68–87) and sensitivity was 67% (56–77). Conclusions Patients hospitalized for chemotherapy-induced FN can be identified in healthcare claims databases, with an acceptable level of misclassification, using diagnosis codes for neutropenia, or neutropenia plus fever. PMID:23406481

  6. EPA Office of Water (OW): 2002 Impaired Waters Baseline NHDPlus Indexed Dataset

    EPA Pesticide Factsheets

    This dataset consists of geospatial and attribute data identifying the spatial extent of state-reported impaired waters (EPA's Integrated Reporting categories 4a, 4b, 4c and 5)* available in EPA's Reach Address Database (RAD) at the time of extraction. For the 2002 baseline reporting year, EPA compiled state-submitted GIS data to create a seamless and nationally consistent picture of the Nation's impaired waters for measuring progress. EPA's Assessment and TMDL Tracking and Implementation System (ATTAINS) is a national compilation of states' 303(d) listings and TMDL development information, spanning several years of tracking over 40,000 impaired waters.

  7. The London low emission zone baseline study.

    PubMed

    Kelly, Frank; Armstrong, Ben; Atkinson, Richard; Anderson, H Ross; Barratt, Ben; Beevers, Sean; Cook, Derek; Green, Dave; Derwent, Dick; Mudway, Ian; Wilkinson, Paul

    2011-11-01

    On February 4, 2008, the world's largest low emission zone (LEZ) was established. At 2644 km2, the zone encompasses most of Greater London. It restricts the entry of the oldest and most polluting diesel vehicles, including heavy-goods vehicles (haulage trucks), buses and coaches, larger vans, and minibuses. It does not apply to cars or motorcycles. The LEZ scheme will introduce increasingly stringent Euro emissions standards over time. The creation of this zone presented a unique opportunity to estimate the effects of a stepwise reduction in vehicle emissions on air quality and health. Before undertaking such an investigation, robust baseline data were gathered on air quality and the oxidative activity and metal content of particulate matter (PM) from air pollution monitors located in Greater London. In addition, methods were developed for using databases of electronic primary-care records in order to evaluate the zone's health effects. Our study began in 2007, using information about the planned restrictions in an agreed-upon LEZ scenario and year-on-year changes in the vehicle fleet in models to predict air pollution concentrations in London for the years 2005, 2008, and 2010. Based on this detailed emissions and air pollution modeling, the areas in London were then identified that were expected to show the greatest changes in air pollution concentrations and population exposures after the implementation of the LEZ. Using these predictions, the best placement of a pollution monitoring network was determined and the feasibility of evaluating the health effects using electronic primary-care records was assessed. To measure baseline pollutant concentrations before the implementation of the LEZ, a comprehensive monitoring network was established close to major roadways and intersections. 
Output-difference plots from statistical modeling for 2010 indicated seven key areas likely to experience the greatest change in concentrations of nitrogen dioxide (NO2) (at least 3

  8. Baseline Tumor Lipiodol Uptake after Transarterial Chemoembolization for Hepatocellular Carcinoma: Identification of a Threshold Value Predicting Tumor Recurrence.

    PubMed

    Matsui, Yusuke; Horikawa, Masahiro; Jahangiri Noudeh, Younes; Kaufman, John A; Kolbeck, Kenneth J; Farsad, Khashayar

    2017-12-01

    The aim of the study was to evaluate the association of baseline Lipiodol uptake in hepatocellular carcinoma (HCC) after transarterial chemoembolization (TACE) with early tumor recurrence, and to identify a threshold baseline uptake value predicting tumor response. A single-institution retrospective database of HCC treated with Lipiodol-TACE was reviewed. Forty-six tumors in 30 patients treated with a Lipiodol-chemotherapy emulsion and no additional particle embolization were included. Baseline Lipiodol uptake was measured as the mean Hounsfield units (HU) on a CT within one week after TACE. Washout rate was calculated by dividing the difference in HU between the baseline CT and follow-up CT by time (HU/month). Cox proportional hazard models were used to correlate baseline Lipiodol uptake and other variables with tumor response. A receiver operating characteristic (ROC) curve was used to identify the optimal threshold for baseline Lipiodol uptake predicting tumor response. During the follow-up period (mean 5.6 months), 19 (41.3%) tumors recurred (mean time to recurrence = 3.6 months). In a multivariate model, low baseline Lipiodol uptake and higher washout rate were significant predictors of early tumor recurrence (P = 0.001 and P < 0.0001, respectively). On ROC analysis, a threshold Lipiodol uptake of 270.2 HU was significantly associated with tumor response (95% sensitivity, 93% specificity). Baseline Lipiodol uptake and washout rate on follow-up were independent predictors of early tumor recurrence. A threshold value of baseline Lipiodol uptake > 270.2 HU was highly sensitive and specific for tumor response. These findings may prove useful for determining subsequent treatment strategies after Lipiodol TACE.
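
    The washout-rate definition and an ROC-derived threshold can be sketched as follows. The Youden-index rule used here is one common way to pick an ROC operating point; the abstract does not state which criterion the authors used, so treat the cutoff-selection step (and all values) as illustrative.

```python
def washout_rate(hu_baseline, hu_followup, months):
    """Lipiodol washout rate in HU/month, per the abstract's definition."""
    return (hu_baseline - hu_followup) / months

def youden_threshold(uptake_hu, responded):
    """Pick the uptake cutoff maximizing Youden's J = sens + spec - 1.
    uptake_hu: baseline uptake values; responded: True if tumor responded."""
    pos = [v for v, r in zip(uptake_hu, responded) if r]
    neg = [v for v, r in zip(uptake_hu, responded) if not r]
    best_t, best_j = None, -1.0
    for t in sorted(set(uptake_hu)):
        sens = sum(v >= t for v in pos) / len(pos)   # responders above cutoff
        spec = sum(v < t for v in neg) / len(neg)    # recurrences below cutoff
        j = sens + spec - 1
        if j > best_j:
            best_t, best_j = t, j
    return best_t
```

    On a toy sample where responders cluster at high uptake, the rule recovers the separating value, mirroring how a cutoff like 270.2 HU would emerge from real data.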

  9. Satellite Communications Technology Database. Part 2

    NASA Technical Reports Server (NTRS)

    2001-01-01

    The Satellite Communications Technology Database is a compilation of data on state-of-the-art Ka-band technologies current as of January 2000. Most U.S. organizations have not published much of their Ka-band technology data, and so the great majority of this data is drawn largely from Japanese, European, and Canadian publications and Web sites. The data covers antennas, high power amplifiers, low noise amplifiers, MMIC devices, microwave/IF switch matrices, SAW devices, ASIC devices, power and data storage. The data herein is raw, and is often presented simply as the download of a table or figure from a site, showing specified technical characteristics, with no further explanation.

  10. High Rates of Baseline Drug Resistance and Virologic Failure Among ART-naive HIV-infected Children in Mali.

    PubMed

    Crowell, Claudia S; Maiga, Almoustapha I; Sylla, Mariam; Taiwo, Babafemi; Kone, Niaboula; Oron, Assaf P; Murphy, Robert L; Marcelin, Anne-Geneviève; Traore, Ban; Fofana, Djeneba B; Peytavin, Gilles; Chadwick, Ellen G

    2017-11-01

    Limited data exist on drug resistance and antiretroviral treatment (ART) outcomes in HIV-1-infected children in West Africa. We determined the prevalence of baseline resistance and correlates of virologic failure (VF) in a cohort of ART-naive HIV-1-infected children <10 years of age initiating ART in Mali. Reverse transcriptase and protease genes were sequenced at baseline (before ART) and at 6 months. Resistance was defined according to the Stanford HIV Genotypic Resistance database. VF was defined as viral load ≥1000 copies/mL after 6 months of ART. Logistic regression was used to evaluate factors associated with VF or death >1 month after enrollment. Post hoc, antiretroviral concentrations were assayed on baseline samples of participants with baseline resistance. One hundred twenty children with a median age of 2.6 years (interquartile range: 1.6-5.0) were included. Eighty-eight percent reported no prevention of mother-to-child transmission exposure. At baseline, 27 (23%), 4 (3%) and none had non-nucleoside reverse transcriptase inhibitor (NNRTI), nucleoside reverse transcriptase inhibitor or protease inhibitor resistance, respectively. Thirty-nine (33%) developed VF and 4 died >1 month post-ART initiation. In multivariable analyses, poor adherence [odds ratio (OR): 6.1, P = 0.001], baseline NNRTI resistance among children receiving NNRTI-based ART (OR: 22.9, P < 0.001) and protease inhibitor-based ART initiation among children without baseline NNRTI resistance (OR: 5.8, P = 0.018) were significantly associated with VF/death. Ten (38%) with baseline resistance had detectable levels of nevirapine or efavirenz at baseline; 7 were currently breastfeeding, but only 2 reported maternal antiretroviral use. Baseline NNRTI resistance was common in children without reported NNRTI exposure and was associated with increased risk of treatment failure. Detectable NNRTI concentrations were present despite few reports of maternal/infant antiretroviral use.

  11. [A Terahertz Spectral Database Based on Browser/Server Technique].

    PubMed

    Zhang, Zhuo-yong; Song, Yue

    2015-09-01

    With the solution of key scientific and technical problems and the development of instrumentation, the application of terahertz technology in various fields has received more and more attention. Owing to its unique advantages, terahertz technology is showing a broad future in fast, non-damaging detection, as well as many other fields. Terahertz technology combined with complementary methods can be used to tackle many difficult practical problems that could not be solved before. Further development of practical terahertz detection methods depends critically on a good and reliable terahertz spectral database. We recently developed a BS (browser/server)-based terahertz spectral database. We designed its main structure and functions to fulfill practical requirements. The database now includes more than 240 items, with spectral information collected from three sources: (1) collection and citation from other terahertz spectral databases abroad; (2) published literature; and (3) spectral data measured in our laboratory. The present paper introduces the basic structure and fundamental functions of the terahertz spectral database developed in our laboratory. One of its key functions is the calculation of optical parameters: some optical parameters, including the absorption coefficient, refractive index, etc., can be calculated from the input THz time-domain spectra. The other main functions and searching methods of the browser/server-based terahertz spectral database are also discussed. The database search system provides users with convenient functions including user registration, inquiry, display of spectral figures and molecular structures, spectral matching, etc. The THz database system provides an on-line searching function for registered users.
Registered users can compare the input THz spectrum with the spectra in the database, according to
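
    Optical-parameter extraction from THz time-domain spectra, as mentioned above, is commonly done with textbook thick-sample formulas like the ones below. The database's exact algorithm is not described, so this is an assumption-laden sketch: it takes the amplitude ratio and unwrapped phase difference between sample and reference pulses as inputs, and the function name is hypothetical.

```python
import math

C = 299_792_458.0  # speed of light (m/s)

def thz_optical_params(freq_hz, amp_ratio, phase_rad, thickness_m):
    """Textbook thick-sample THz-TDS extraction of refractive index and
    absorption coefficient (not necessarily the database's implementation).
    amp_ratio: |E_sample/E_reference|; phase_rad: unwrapped phase delay."""
    omega = 2 * math.pi * freq_hz
    n = 1 + C * phase_rad / (omega * thickness_m)       # refractive index
    fresnel = 4 * n / (n + 1) ** 2                      # interface losses
    alpha = -(2 / thickness_m) * math.log(amp_ratio / fresnel)  # 1/m
    return n, alpha
```

    For a lossless sample of index 2, the phase delay alone determines n and the amplitude ratio, once Fresnel losses are removed, yields zero absorption, which is a convenient sanity check on any implementation.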

  12. Fox Valley Technical College New Occupational Markets: Market Research Project Results.

    ERIC Educational Resources Information Center

    Mishler, Carol

    This is a report on the results of a market research project conducted by Fox Valley Technical College (FVTC) (Wisconsin) to identify new occupational markets in the college's service area. The college scanned national job lists, employment databases, and northeastern Wisconsin employer advertisements. The college also surveyed employers in the…

  13. CryoSat SAR/SARin Level1b products: assessment of BaselineC and improvements towards BaselineD

    NASA Astrophysics Data System (ADS)

    Scagliola, Michele; Fornari, Marco; Bouffard, Jerome; Parrinello, Tommaso

    2017-04-01

    CryoSat was launched on 8 April 2010 and is the first European ice mission dedicated to monitoring precise changes in the thickness of polar ice sheets and floating sea ice. CryoSat carries an innovative radar altimeter called the Synthetic Aperture Interferometric Altimeter (SIRAL), which transmits pulses at a high pulse repetition frequency, making the received echoes phase coherent and suitable for azimuth processing. This allows a significantly improved along-track resolution with respect to traditional pulse-width-limited altimeters. CryoSat is the first altimetry mission operating in SAR mode, and continuous improvements in the Level1 Instrument Processing Facility (IPF1) are being identified, tested and validated in order to improve the quality of the Level1b products. The current IPF, Baseline C, was released into operations in April 2015, and the second CryoSat reprocessing campaign was jointly initiated, benefiting from the upgrades implemented in the IPF1 processing chain as well as from some specific configurations for the calibration corrections. In particular, the CryoSat Level1b BaselineC products generated in the framework of the second reprocessing campaign include refined information concerning the mispointing angles and the calibration corrections. This poster will detail the evolutions currently planned for the CryoSat BaselineD SAR/SARin Level1b products and the corresponding quality improvements that are expected.

  14. The Russian effort in establishing large atomic and molecular databases

    NASA Astrophysics Data System (ADS)

    Presnyakov, Leonid P.

    1998-07-01

    Database activities in Russia have been developed in connection with UV and soft X-ray spectroscopic studies of extraterrestrial and laboratory (magnetically confined and laser-produced) plasmas. Two forms of database production are used: (i) a set of computer programs to calculate radiative and collisional data for a general atom or ion, and (ii) the development of numeric database systems with the data stored in the computer. The first form is preferable for collisional data. At the Lebedev Physical Institute, an appropriate set of codes has been developed. It includes all electronic processes at collision energies from the threshold up to the relativistic limit. The ion-atom (and ion-ion) collisional data are calculated with recently developed methods. The program for the calculation of level populations and line intensities is used for spectral diagnostics of transparent plasmas. The second form of database production is widely used at the Institute of Physico-Technical Measurements (VNIIFTRI) and the Troitsk Center: the Institute of Spectroscopy and TRINITI. The main results obtained at these centers are reviewed. Plans for future developments, jointly with international collaborations, are discussed.

  15. An engineering database management system for spacecraft operations

    NASA Technical Reports Server (NTRS)

    Cipollone, Gregorio; Mckay, Michael H.; Paris, Joseph

    1993-01-01

    Studies at ESOC have demonstrated the feasibility of a flexible and powerful Engineering Database Management System (EDMS) in support of spacecraft operations documentation. The objectives set out were threefold: first, an analysis of the problems encountered by the operations team in obtaining and managing operations documents; secondly, the definition of a concept for operations documentation and the implementation of a prototype to prove the feasibility of the concept; and thirdly, the definition of the standards and protocols required for the exchange of data between the top-level partners in a satellite project. The EDMS prototype was populated with ERS-1 satellite design data and has been used by the operations team at ESOC to gather operational experience. An operational EDMS would be implemented at the satellite prime contractor's site as a common database for all technical information surrounding a project and would be accessible by the co-contractors' and ESA teams.

  16. DeitY-TU face database: its design, multiple camera capturing, characteristics, and evaluation

    NASA Astrophysics Data System (ADS)

    Bhowmik, Mrinal Kanti; Saha, Kankan; Saha, Priya; Bhattacharjee, Debotosh

    2014-10-01

    The development of the latest face databases is providing researchers with different and realistic problems that play an important role in the development of efficient algorithms for the automatic recognition of human faces. This paper presents the creation of a new visual face database, named the Department of Electronics and Information Technology-Tripura University (DeitY-TU) face database. It contains face images of 524 persons belonging to different non-tribes and Mongolian tribes of north-east India, with their anthropometric measurements for identification. Database images are captured within a room with controlled variations in illumination, expression, and pose, along with variability in age, gender, accessories, make-up, and partial occlusion. Each image contains the combined primary challenges of face recognition, i.e., illumination, expression, and pose. The database also includes some new features: soft biometric traits such as moles, freckles, scars, etc., and facial anthropometric variations that may be helpful for biometric recognition research. It also provides a comparative study of existing two-dimensional face image databases. The database has been tested using two baseline algorithms, linear discriminant analysis and principal component analysis, whose results other researchers may use as control algorithm performance scores.

  17. Quality Attribute-Guided Evaluation of NoSQL Databases: An Experience Report

    DTIC Science & Technology

    2014-10-18

    detailed technical evaluations of NoSQL databases specifically, and big data systems in general, that have become apparent during our study... big data, software systems [Agarwal 2011]. Internet-born organizations such as Google and Amazon are at the cutting edge of this revolution...Chang 2008], along with those of numerous other big data innovators, have made a variety of open source and commercial data management technologies

  18. Ambiguity and variability of database and software names in bioinformatics.

    PubMed

    Duck, Geraint; Kovacevic, Aleksandar; Robertson, David L; Stevens, Robert; Nenadic, Goran

    2015-01-01

    There are numerous options available to achieve various tasks in bioinformatics, but until recently, there were no tools that could systematically identify mentions of databases and tools within the literature. In this paper we explore the variability and ambiguity of database and software name mentions and compare dictionary and machine learning approaches to their identification. Through the development and analysis of a corpus of 60 full-text documents manually annotated at the mention level, we report high variability and ambiguity in database and software mentions. On a test set of 25 full-text documents, a baseline dictionary look-up achieved an F-score of 46 %, highlighting not only variability and ambiguity but also the extensive number of new resources introduced. A machine learning approach achieved an F-score of 63 % (with precision of 74 %) and 70 % (with precision of 83 %) for strict and lenient matching respectively. We characterise the issues with various mention types and propose potential ways of capturing additional database and software mentions in the literature. Our analyses show that identification of mentions of databases and tools is a challenging task that cannot be achieved by relying on current manually-curated resource repositories. Although machine learning shows improvement and promise (primarily in precision), more contextual information needs to be taken into account to achieve a good degree of accuracy.
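    The strict-versus-lenient evaluation described above is ordinary precision/recall arithmetic over mention matches. A minimal sketch, using invented match counts chosen only to land near the reported scores (the paper's actual counts are not given here):

    ```python
    def prf(tp, fp, fn):
        """Precision, recall and F1 from true/false positive and false negative counts."""
        p = tp / (tp + fp)
        r = tp / (tp + fn)
        f1 = 2 * p * r / (p + r)
        return p, r, f1

    # Hypothetical counts: strict matching requires exact mention boundaries,
    # lenient matching accepts overlapping spans, so lenient yields more true
    # positives (and fewer errors) for the same system output.
    strict = prf(tp=520, fp=183, fn=410)
    lenient = prf(tp=580, fp=123, fn=350)
    print("strict  P=%.2f R=%.2f F=%.2f" % strict)
    print("lenient P=%.2f R=%.2f F=%.2f" % lenient)
    ```

    With these invented counts, strict matching gives roughly P = 0.74 and F = 0.64, in the same neighborhood as the figures reported in the abstract.
    
    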

  19. U.S. Air Force Scientific and Technical Information Program - The STINFO Program

    NASA Technical Reports Server (NTRS)

    Blados, Walter R.

    1991-01-01

    The U.S. Air Force STINFO (Scientific and Technical Information) program has as its main goal the proper use of all available scientific and technical information in the development of programs. The organization of STINFO databases, the use of STINFO in the development and advancement of aerospace science and technology and the acquisition of superior systems at lowest cost, and the application to public and private sectors of technologies developed for military uses are examined. STINFO user training is addressed. A project for aerospace knowledge diffusion is discussed.

  20. Applying cognitive load theory to the redesign of a conventional database systems course

    NASA Astrophysics Data System (ADS)

    Mason, Raina; Seton, Carolyn; Cooper, Graham

    2016-01-01

    Cognitive load theory (CLT) was used to redesign a Database Systems course for Information Technology students. The redesign was intended to address poor student performance and low satisfaction, and to provide a more relevant foundation in database design and use for subsequent studies and industry. The original course followed the conventional structure for a database course, covering database design first, then database development. Analysis showed the conventional course content was appropriate but the instructional materials used were too complex, especially for novice students. The redesign of instructional materials applied CLT to remove split attention and redundancy effects, to provide suitable worked examples and sub-goals, and included an extensive re-sequencing of content. The approach was primarily directed towards mid- to lower performing students and results showed a significant improvement for this cohort with the exam failure rate reducing by 34% after the redesign on identical final exams. Student satisfaction also increased and feedback from subsequent study was very positive. The application of CLT to the design of instructional materials is discussed for delivery of technical courses.

  1. Searching fee and non-fee toxicology information resources: an overview of selected databases.

    PubMed

    Wright, L L

    2001-01-12

    Toxicology profiles organize information by broad subjects, the first of which affirms the identity of the agent studied. Studies here show that two non-fee databases (ChemFinder and ChemIDplus) verify the identity of compounds with high efficiency (63% and 73%, respectively), with the fee-based Chemical Abstracts Registry file serving well to fill data gaps (100%). Continued searching proceeds using knowledge of structure, scope, and content to select databases. Valuable sources of information are factual databases that collect data and facts in special subject areas, organized in formats available for analysis or use. Some sources representative of factual files are RTECS, CCRIS, HSDB, GENE-TOX and IRIS. Numerous factual databases offer a wealth of reliable information; however, exhaustive searches probe information published in journal articles and/or technical reports, with records residing in bibliographic databases such as BIOSIS, EMBASE, MEDLINE, TOXLINE and Web of Science. Numerous factual and bibliographic databases supplied by 11 producers are listed with descriptions. Given the multitude of options and resources, it is often necessary to seek service desk assistance. Questions were posed by telephone and e-mail to service desks at DIALOG, ISI, MEDLARS, Micromedex and STN International. Results of the survey are reported.

  2. Historical seismometry database project: A comprehensive relational database for historical seismic records

    NASA Astrophysics Data System (ADS)

    Bono, Andrea

    2007-01-01

    The recovery and preservation of the patrimony of instrumental recordings of historical earthquakes is without doubt a subject of great interest. This interest, besides being purely historical, must necessarily also be scientific. In fact, the availability of a great amount of parametric information on the seismic activity in a given area is a doubtless help to the seismological researcher's activities. In this article the new database project of the Sismos group of the National Institute of Geophysics and Volcanology of Rome is presented. The structure of the new scheme summarizes the experience matured over five years of activity. We consider it useful for those who are approaching "recovery and reprocessing" computer-based facilities. In past years several attempts on Italian seismicity have followed one another; these have almost never been real databases. Some had positive success because they were well designed and organized; others were limited to supplying lists of events with their hypocentral parameters. What makes this project more interesting compared with previous work is the completeness and generality of the managed information. For example, it will be possible to view the hypocentral information for a given historical earthquake; to retrieve the seismograms in raster, digital or digitized format, the arrival times of the phases at the various stations, the instrumental characteristics, and so on. The modern relational logic on which the archive is based allows all these operations to be carried out with little effort. The database described below will completely replace Sismos' current data bank. Some of the organizational principles of this work are similar to those that inspire the databases for real-time monitoring of seismicity used in the principal international research centres.
    A modern planning logic in a distinctly historical

  3. The Regional Structure of Technical Innovation

    NASA Astrophysics Data System (ADS)

    O'Neale, Dion

    2014-03-01

    There is strong evidence that the productivity per capita of cities and regions increases with population. One likely explanation for this phenomenon is that densely populated regions bring together otherwise unlikely combinations of individuals and organisations with diverse, specialised capabilities, leading to increased innovation and productivity. We have used the REGPAT patent database to construct a bipartite network of geographic regions and the patent classes for which those regions display a revealed comparative advantage. By analysing this network, we can infer relationships between different types of patent classes - and hence the structure of (patentable) technology. The network also provides a novel perspective for studying the combinations of technical capabilities in different geographic regions. We investigate measures such as the diversity and ubiquity of innovations within regions and find that diversity (resp. ubiquity) is positively (resp. negatively) correlated with population. We also find evidence of a nested structure for technical innovation. That is, specialised innovations tend to occur only when other more general innovations are already present.
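    The revealed comparative advantage (RCA) used to build the region-class bipartite network described above can be sketched directly; the counts and region/class names below are hypothetical, and RCA > 1 is the conventional threshold for drawing an edge:

    ```python
    from collections import defaultdict

    def rca(patents):
        """Revealed comparative advantage of region r in patent class c:
        RCA(r, c) = (x_rc / x_r) / (x_c / x_total), i.e. the share of class c
        in region r's patents relative to c's share of all patents."""
        region_tot = defaultdict(float)
        class_tot = defaultdict(float)
        total = 0.0
        for (r, c), x in patents.items():
            region_tot[r] += x
            class_tot[c] += x
            total += x
        return {
            (r, c): (x / region_tot[r]) / (class_tot[c] / total)
            for (r, c), x in patents.items()
        }

    # Toy counts (hypothetical): two regions, two patent classes.
    counts = {("R1", "optics"): 8, ("R1", "textiles"): 2,
              ("R2", "optics"): 2, ("R2", "textiles"): 8}
    adv = rca(counts)

    # The bipartite region-class network links region r to class c where RCA > 1.
    edges = [rc for rc, v in adv.items() if v > 1]
    ```

    Here R1 specializes in optics (RCA 1.6) and R2 in textiles, so each region gets exactly one edge in the bipartite network.
    
    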

  4. A Circular Dichroism Reference Database for Membrane Proteins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wallace,B.; Wien, F.; Stone, T.

    2006-01-01

    Membrane proteins are a major product of most genomes and the target of a large number of current pharmaceuticals, yet little information exists on their structures because of the difficulty of crystallising them; hence for the most part they have been excluded from structural genomics programme targets. Furthermore, even methods such as circular dichroism (CD) spectroscopy which seek to define secondary structure have not been fully exploited because of technical limitations to their interpretation for membrane embedded proteins. Empirical analyses of circular dichroism (CD) spectra are valuable for providing information on secondary structures of proteins. However, the accuracy of the results depends on the appropriateness of the reference databases used in the analyses. Membrane proteins have different spectral characteristics than do soluble proteins as a result of the low dielectric constants of membrane bilayers relative to those of aqueous solutions (Chen & Wallace (1997) Biophys. Chem. 65:65-74). To date, no CD reference database exists exclusively for the analysis of membrane proteins, and hence empirical analyses based on current reference databases derived from soluble proteins are not adequate for accurate analyses of membrane protein secondary structures (Wallace et al (2003) Prot. Sci. 12:875-884). We have therefore created a new reference database of CD spectra of integral membrane proteins whose crystal structures have been determined. To date it contains more than 20 proteins, and spans the range of secondary structures from mostly helical to mostly sheet proteins. This reference database should enable more accurate secondary structure determinations of membrane embedded proteins and will become one of the reference database options in the CD calculation server DICHROWEB (Whitmore & Wallace (2004) NAR 32:W668-673).

  5. Recent Developments of the GLIMS Glacier Database

    NASA Astrophysics Data System (ADS)

    Raup, B. H.; Berthier, E.; Bolch, T.; Kargel, J. S.; Paul, F.; Racoviteanu, A.

    2017-12-01

    Earth's glaciers are shrinking almost without exception, leading to changes in water resources, timing of runoff, sea level, and hazard potential. Repeat mapping of glacier outlines, lakes, and glacier topography, along with glacial processes, is critically needed to understand how glaciers will react to a changing climate, and how those changes will impact humans. To understand the impacts and processes behind the observed changes, it is crucial to monitor glaciers through time by mapping their areal extent, snow lines, ice flow velocities, associated water bodies, and thickness changes. The glacier database of the Global Land Ice Measurements from Space (GLIMS) initiative is the only multi-temporal glacier database capable of tracking all these glacier measurements and providing them to the scientific community and broader public. Recent developments in GLIMS include improvements in the database and web applications and new activities in the international GLIMS community. The coverage of the GLIMS database has recently grown geographically and temporally by drawing on the Randolph Glacier Inventory (RGI) and other new data sets. The GLIMS database is globally complete, and approximately one third of glaciers have outlines from more than one time. New tools for visualizing and downloading GLIMS data in a choice of formats and data models have been developed, and a new data model for handling multiple glacier records through time while avoiding double-counting of glacier number or area is nearing completion. A GLIMS workshop was held in Boulder, Colorado this year to facilitate two-way communication with the greater community on future needs. The result of this work is a more complete and accurate glacier data repository that shows both the current state of glaciers on Earth and how they have changed in recent decades. Needs for future scientific and technical developments were identified and prioritized at the GLIMS Workshop, and are reported here.
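    The double-counting problem mentioned above arises because a multi-temporal database stores several outlines per glacier. A minimal sketch of one resolution strategy, keeping only the most recent outline per glacier ID when aggregating (the records and fields are hypothetical, not the GLIMS data model):

    ```python
    from datetime import date

    # Hypothetical multi-temporal records: (glacier_id, acquisition_date, area_km2).
    records = [
        ("G001", date(1999, 8, 1), 12.4),
        ("G001", date(2015, 8, 1), 11.1),   # same glacier, newer outline
        ("G002", date(2010, 7, 15), 3.2),
    ]

    # To report a total area without counting G001 twice, keep only the most
    # recent outline for each glacier ID.
    latest = {}
    for gid, when, area in records:
        if gid not in latest or when > latest[gid][0]:
            latest[gid] = (when, area)

    total_area = sum(area for _, area in latest.values())  # 11.1 + 3.2
    ```
    
    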

  6. Modeling Powered Aerodynamics for the Orion Launch Abort Vehicle Aerodynamic Database

    NASA Technical Reports Server (NTRS)

    Chan, David T.; Walker, Eric L.; Robinson, Philip E.; Wilson, Thomas M.

    2011-01-01

    Modeling the aerodynamics of the Orion Launch Abort Vehicle (LAV) has presented many technical challenges to the developers of the Orion aerodynamic database. During a launch abort event, the aerodynamic environment around the LAV is very complex as multiple solid rocket plumes interact with each other and the vehicle. It is further complicated by vehicle separation events such as between the LAV and the launch vehicle stack or between the launch abort tower and the crew module. The aerodynamic database for the LAV was developed mainly from wind tunnel tests involving powered jet simulations of the rocket exhaust plumes, supported by computational fluid dynamic simulations. However, limitations in both methods have made it difficult to properly capture the aerodynamics of the LAV in experimental and numerical simulations. These limitations have also influenced decisions regarding the modeling and structure of the aerodynamic database for the LAV and led to compromises and creative solutions. Two database modeling approaches are presented in this paper (incremental aerodynamics and total aerodynamics), with examples showing strengths and weaknesses of each approach. In addition, the unique problems presented to the database developers by the large data space required for modeling a launch abort event illustrate the complexities of working with multi-dimensional data.

  7. Aviation Trends Related to Atmospheric Environment Safety Technologies Project Technical Challenges

    NASA Technical Reports Server (NTRS)

    Reveley, Mary S.; Withrow, Colleen A.; Barr, Lawrence C.; Evans, Joni K.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

    Current and future aviation safety trends related to the National Aeronautics and Space Administration's Atmospheric Environment Safety Technologies Project's three technical challenges (engine icing characterization and simulation capability; airframe icing simulation and engineering tool capability; and atmospheric hazard sensing and mitigation technology capability) were assessed by examining the National Transportation Safety Board (NTSB) accident database (1989 to 2008), incidents from the Federal Aviation Administration (FAA) accident/incident database (1989 to 2006), and literature from various industry and government sources. The accident and incident data were examined for events involving fixed-wing airplanes operating under Federal Aviation Regulation (FAR) Parts 121, 135, and 91 for atmospheric conditions related to airframe icing, ice-crystal engine icing, turbulence, clear air turbulence, wake vortex, lightning, and low visibility (fog, low ceiling, clouds, precipitation, and low lighting). Five future aviation safety risk areas associated with the three AEST technical challenges were identified after an exhaustive survey of a variety of sources and include: approach and landing accident reduction, icing/ice detection, loss of control in flight, super density operations, and runway safety.

  8. The Amordad database engine for metagenomics.

    PubMed

    Behnam, Ehsan; Smith, Andrew D

    2014-10-15

    Several technical challenges in metagenomic data analysis, including assembling metagenomic sequence data or identifying operational taxonomic units, are both significant and well known. These forms of analysis are increasingly cited as conceptually flawed, given the extreme variation within traditionally defined species and rampant horizontal gene transfer. Furthermore, computational requirements of such analysis have hindered content-based organization of metagenomic data at large scale. In this article, we introduce the Amordad database engine for alignment-free, content-based indexing of metagenomic datasets. Amordad places the metagenome comparison problem in a geometric context, and uses an indexing strategy that combines random hashing with a regular nearest neighbor graph. This framework allows refinement of the database over time by continual application of random hash functions, with the effect of each hash function encoded in the nearest neighbor graph. This eliminates the need to explicitly maintain the hash functions in order for query efficiency to benefit from the accumulated randomness. Results on real and simulated data show that Amordad can support logarithmic query time for identifying similar metagenomes even as the database size reaches into the millions. Source code, licensed under the GNU general public license (version 3), is freely available for download from http://smithlabresearch.org/amordad. Contact: andrewds@usc.edu. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
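    Amordad combines random hashing with a regular nearest neighbor graph; the random-hashing half can be illustrated with a stdlib-only sketch. The k-mer profiles, random hyperplanes, and bucket scheme below are illustrative assumptions, not Amordad's actual implementation:

    ```python
    import random
    from collections import defaultdict

    def kmer_profile(seq, k=3):
        """Alignment-free representation of a sequence: normalized k-mer counts."""
        counts = defaultdict(float)
        for i in range(len(seq) - k + 1):
            counts[seq[i:i + k]] += 1.0
        n = sum(counts.values())
        return {kmer: c / n for kmer, c in counts.items()}

    def hash_bits(profile, planes):
        """Sign of the dot product with each random hyperplane gives one bit;
        geometrically similar profiles tend to share many bits."""
        return tuple(
            int(sum(w.get(kmer, 0.0) * v for kmer, v in profile.items()) >= 0)
            for w in planes
        )

    random.seed(0)
    alphabet = [a + b + c for a in "ACGT" for b in "ACGT" for c in "ACGT"]
    planes = [{kmer: random.gauss(0, 1) for kmer in alphabet} for _ in range(8)]

    # Index toy "metagenomes" by hash bucket; querying a bucket yields
    # candidate neighbors without any pairwise alignment.
    buckets = defaultdict(list)
    for name, seq in {"m1": "ACGTACGTACGT", "m2": "ACGTACGTACGA",
                      "m3": "TTTTGGGGCCCC"}.items():
        buckets[hash_bits(kmer_profile(seq), planes)].append(name)
    ```

    In Amordad the effect of each such hash function is folded into a nearest neighbor graph rather than kept as explicit buckets; this sketch shows only the geometric hashing idea.
    
    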

  9. Integrated sequence and immunology filovirus database at Los Alamos

    PubMed Central

    Yoon, Hyejin; Foley, Brian; Feng, Shihai; Macke, Jennifer; Dimitrijevic, Mira; Abfalterer, Werner; Szinger, James; Fischer, Will; Kuiken, Carla; Korber, Bette

    2016-01-01

    The Ebola outbreak of 2013–15 infected more than 28 000 people and claimed more lives than all previous filovirus outbreaks combined. Governmental agencies, clinical teams, and the world scientific community pulled together in a multifaceted response ranging from prevention and disease control, to evaluating vaccines and therapeutics in human trials. As this epidemic is finally coming to a close, refocusing on long-term prevention strategies becomes paramount. Given the very real threat of future filovirus outbreaks, and the inherent uncertainty of the next outbreak virus and geographic location, it is prudent to consider the extent and implications of known natural diversity in advancing vaccines and therapeutic approaches. To facilitate such consideration, we have updated and enhanced the content of the filovirus portion of Los Alamos Hemorrhagic Fever Viruses Database. We have integrated and performed baseline analysis of all family Filoviridae sequences deposited into GenBank, with associated immune response data, and metadata, and we have added new computational tools with web-interfaces to assist users with analysis. Here, we (i) describe the main features of updated database, (ii) provide integrated views and some basic analyses summarizing evolutionary patterns as they relate to geo-temporal data captured in the database and (iii) highlight the most conserved regions in the proteome that may be useful for a T cell vaccine strategy. Database URL: www.hfv.lanl.gov PMID:27103629

  10. JICST Factual Database JICST DNA Database

    NASA Astrophysics Data System (ADS)

    Shirokizawa, Yoshiko; Abe, Atsushi

    Japan Information Center of Science and Technology (JICST) has started the on-line service of DNA database in October 1988. This database is composed of EMBL Nucleotide Sequence Library and Genetic Sequence Data Bank. The authors outline the database system, data items and search commands. Examples of retrieval session are presented.

  11. EML Regional Baseline Station at Chester, NJ, 1987--1990

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-12-01

    The Environmental Measurements Laboratory (EML) has maintained a field station at Chester, NJ, since 1976. Located 64 kilometers west of EML, the site is on the property of Bell Communications Research and is a little more than 900 square meters in area. The Chester site is a rural facility which provides us with an opportunity to carry out "regional baseline" research and to test field instruments. In order to meet our special needs, the technical criteria for such a field station are the following. The site should represent a "regional" pollution condition; that is, no significant local pollution sources should be present. The terrain should be relatively flat with minimal rock outcroppings. The soil cover should not have been disturbed for at least 30 years. This report updates the various programs underway at Chester and presents data that have become available since the last report.

  13. A Web-based database for pathology faculty effort reporting.

    PubMed

    Dee, Fred R; Haugen, Thomas H; Wynn, Philip A; Leaven, Timothy C; Kemp, John D; Cohen, Michael B

    2008-04-01

    To ensure appropriate mission-based budgeting and equitable distribution of funds for faculty salaries, our compensation committee developed a pathology-specific effort reporting database. Principles included the following: (1) measurement should be done by web-based databases; (2) most entry should be done by departmental administration or be relational to other databases; (3) data entry categories should be aligned with funding streams; and (4) units of effort should be equal across categories of effort (service, teaching, research). MySQL was used for all data transactions (http://dev.mysql.com/downloads), and scripts were constructed using PERL (http://www.perl.org). Data are accessed with forms that correspond to fields in the database. The committee's work resulted in a novel database using pathology value units (PVUs) as a standard quantitative measure of effort for activities in an academic pathology department. The most common calculation was to estimate the number of hours required for a specific task, divide by 2080 hours (a Medicare year) and then multiply by 100. Other methods included assigning a baseline PVU for program, laboratory, or course directorship with an increment for each student or staff in that unit. With these methods, a faculty member should acquire approximately 100 PVUs. Some outcomes include (1) plotting PVUs versus salary to identify outliers for salary correction, (2) quantifying effort in activities outside the department, (3) documenting salary expenditure for unfunded research, (4) evaluating salary equity by plotting PVUs versus salary by sex, and (5) aggregating data by category of effort for mission-based budgeting and long-term planning.
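    The PVU arithmetic described above (hours divided by a 2080-hour Medicare year, times 100, plus a baseline-and-increment credit for directorships) can be sketched directly; the workload numbers below are hypothetical:

    ```python
    MEDICARE_YEAR_HOURS = 2080  # hours in a Medicare year, per the scheme above

    def pvu_from_hours(hours_per_year):
        """Pathology value units: fraction of a Medicare year, times 100."""
        return hours_per_year / MEDICARE_YEAR_HOURS * 100

    def pvu_directorship(base, per_head, heads):
        """Baseline PVU for a directorship plus an increment per student or staff."""
        return base + per_head * heads

    # Hypothetical workload: a full effort profile should total about 100 PVUs.
    effort = (pvu_from_hours(1040)          # service: 1040 h -> 50 PVUs
              + pvu_from_hours(624)         # teaching: 624 h -> 30 PVUs
              + pvu_directorship(10, 2, 5)) # lab directorship with 5 staff -> 20 PVUs
    ```

    Plotting such per-faculty totals against salary is then enough to flag outliers for correction, as the abstract describes.
    
    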

  14. Second thoughts on the final rule: An analysis of baseline participant characteristics reports on ClinicalTrials.gov.

    PubMed

    Cahan, Amos; Anand, Vibha

    2017-01-01

    ClinicalTrials.gov is valuable for aggregate-level analysis of trials. The recently published final rule aims to improve reporting of trial results. We aimed to assess variability in ClinicalTrials.gov records reporting participants' baseline measures. The September 2015 edition of the database for Aggregate Analysis of ClinicalTrials.gov (AACT) was used in this study. To date, AACT contains 186,941 trials, of which 16,660 trials reporting baseline (participant) measures were analyzed. We also analyzed a subset of 13,818 Highly Likely Applicable Clinical Trials (HLACT), for which reporting of results is likely mandatory, and compared a random sample of 30 trial records to their journal articles. We report counts for each mandatory baseline measure and variability in their reporting formats. The AACT dataset contains 8,161 baseline measures with 1,206 unique measurement units. However, 6,940 (85%) of these variables appear only once in the dataset. Age and Gender are reported using many different formats (178 and 49, respectively). "Age" as the variable name is reported in 60 different formats. The HLACT subset reports measures using 3,931 variables. The most frequent Age format (i.e., mean (years) ± sd) is found in only 45% of trials. Overall, only 4 baseline measures (Region of Enrollment, Age, Number of Participants, and Gender) are reported by more than 10% of trials. Discrepancies are found in both the types and formats of ClinicalTrials.gov records and their corresponding journal articles. On average, journal articles include twice the number of baseline measures (13.6 ± 7.1 (sd) vs. 6.6 ± 7.6) when compared to the ClinicalTrials.gov records that report any results. We found marked variability in baseline measures reporting, which is not addressed by the final rule. To support secondary use of ClinicalTrials.gov, a uniform format for baseline measures reporting is warranted.

  15. The opportunities and obstacles in developing a vascular birthmark database for clinical and research use.

    PubMed

    Sharma, Vishal K; Fraulin, Frankie Og; Harrop, A Robertson; McPhalen, Donald F

    2011-01-01

    Databases are useful tools in clinical settings. The authors review the benefits and challenges associated with the development and implementation of an efficient electronic database for the multidisciplinary Vascular Birthmark Clinic at the Alberta Children's Hospital, Calgary, Alberta. The content and structure of the database were designed using the technical expertise of a data analyst from the Calgary Health Region. Relevant clinical and demographic data fields were included with the goal of documenting ongoing care of individual patients, and facilitating future epidemiological studies of this patient population. After completion of this database, 10 challenges encountered during development were retrospectively identified. Practical solutions for these challenges are presented. The challenges identified during the database development process included: identification of relevant data fields; balancing simplicity and user-friendliness with complexity and comprehensive data storage; database expertise versus clinical expertise; software platform selection; linkage of data from the previous spreadsheet to a new data management system; ethics approval for the development of the database and its utilization for research studies; ensuring privacy and limited access to the database; integration of digital photographs into the database; adoption of the database by support staff in the clinic; and maintaining up-to-date entries in the database. There are several challenges involved in the development of a useful and efficient clinical database. Awareness of these potential obstacles, in advance, may simplify the development of clinical databases by others in various surgical settings.

  16. Ultra Pure Water Cleaning Baseline Study on NASA JSC Astromaterial Curation Gloveboxes

    NASA Technical Reports Server (NTRS)

    Calaway, Michael J.; Burkett, P. J.; Allton, J. H.; Allen, C. C.

    2013-01-01

    Future sample return missions will require strict protocols and procedures for reducing inorganic and organic contamination in isolation containment systems. In 2012, a baseline study was orchestrated to establish the current state of organic cleanliness in gloveboxes used by NASA JSC astromaterials curation labs [1, 2]. As part of this in-depth organic study, the current curatorial technical support procedure (TSP) 23 was used for cleaning the gloveboxes with ultra pure water (UPW) [3-5]. Particle counts and identification were obtained that could be used as a benchmark for future mission designs that require glovebox decontamination. The UPW baseline study demonstrates that TSP 23 works well for gloveboxes that have been thoroughly degreased. However, TSP 23 could be augmented to provide even better glovebox decontamination. JSC 03243 could be used as a starting point for further investigating optimal cleaning techniques and procedures. DuPont Vertrel XF or other chemical substitutes to replace Freon-113, mechanical scrubbing, and newer technology could be used to enhance glovebox cleanliness in addition to high purity UPW final rinsing. Future sample return missions will significantly benefit from further cleaning studies to reduce inorganic and organic contamination.

  17. The Relationship of Baseline Prostate Specific Antigen and Risk of Future Prostate Cancer and Its Variance by Race.

    PubMed

    Verges, Daniel P; Dani, Hasan; Sterling, William A; Weedon, Jeremy; Atallah, William; Mehta, Komal; Schreiber, David; Weiss, Jeffrey P; Karanikolas, Nicholas T

    2017-01-01

    Several studies suggest that a baseline prostate specific antigen (PSA) measured in young men predicts future risk of prostate cancer. Considering recent recommendations against PSA screening, high-risk populations (e.g. black men, men with a high baseline PSA) may be particularly vulnerable in the coming years. Thus, we investigated the relationship between baseline PSA and future prostate cancer in a black majority-minority urban population. A retrospective analysis was performed of the prostate biopsy database (n = 994) at the Brooklyn Veterans Affairs Hospital. These men were referred to urology clinic for elevated PSA and biopsied between 2007 and 2014. Multivariate logistic regression was used to predict positive prostate biopsy from log-transformed baseline PSA, race (black, white, or other), and several other variables. The majority of men identified as black (50.2%). Median ages at the time of baseline PSA measurement and at biopsy were 58.6 and 64.8 years, respectively. Median baseline PSA was similar among black men and white men (2.70 vs 2.91 for black men vs white men, p = 0.232). Even so, black men were more likely than white men to be diagnosed with prostate cancer (OR 1.62, p < 0.0001). Black men less than age 70 were at particularly greater risk than their white counterparts. Baseline PSA was not a statistically significant predictor of future prostate cancer (p = 0.101). Black men were more likely to be diagnosed with prostate cancer than were white men, despite comparable baseline PSA. In our pre-screened population at the urology clinic, a retrospective examination of baseline PSA did not predict future prostate cancer. Copyright © 2016 National Medical Association. Published by Elsevier Inc. All rights reserved.
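
    With a single binary predictor, the logistic-regression odds ratio reduces to the familiar 2x2 cross-tabulation. A minimal sketch with made-up counts (not the study's data; the OR 1.62 above came from a multivariate model with additional covariates):

```python
import math

# Hypothetical 2x2 counts, illustration only: (cancer, no cancer) by race.
black = (180, 220)
white = (120, 240)

odds_black = black[0] / black[1]
odds_white = white[0] / white[1]
or_ = odds_black / odds_white                  # unadjusted odds ratio

# Wald 95% CI on the log-odds-ratio scale.
se_log_or = math.sqrt(sum(1 / n for n in (*black, *white)))
ci = (math.exp(math.log(or_) - 1.96 * se_log_or),
      math.exp(math.log(or_) + 1.96 * se_log_or))
print(round(or_, 2), tuple(round(c, 2) for c in ci))
```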

  18. An Introduction to Database Structure and Database Machines.

    ERIC Educational Resources Information Center

    Detweiler, Karen

    1984-01-01

    Enumerates principal management objectives of database management systems (data independence, quality, security, multiuser access, central control) and criteria for comparison (response time, size, flexibility, other features). Conventional database management systems, relational databases, and database machines used for backend processing are…

  19. Technical and policy approaches to balancing patient privacy and data sharing in clinical and translational research.

    PubMed

    Malin, Bradley; Karp, David; Scheuermann, Richard H

    2010-01-01

    Clinical researchers need to share data to support scientific validation and information reuse and to comply with a host of regulations and directives from funders. Various organizations are constructing informatics resources in the form of centralized databases to ensure reuse of data derived from sponsored research. The widespread use of such open databases is contingent on the protection of patient privacy. We review privacy-related problems associated with data sharing for clinical research from technical and policy perspectives. We investigate existing policies for secondary data sharing and privacy requirements in the context of data derived from research and clinical settings. In particular, we focus on policies specified by the US National Institutes of Health and the Health Insurance Portability and Accountability Act and touch on how these policies are related to current and future use of data stored in public database archives. We address aspects of data privacy and identifiability from a technical, although approachable, perspective and summarize how biomedical databanks can be exploited and seemingly anonymous records can be reidentified using various resources without hacking into secure computer systems. We highlight which clinical and translational data features, specified in emerging research models, are potentially vulnerable or exploitable. In the process, we recount a recent privacy-related concern associated with the publication of aggregate statistics from pooled genome-wide association studies that have had a significant impact on the data sharing policies of National Institutes of Health-sponsored databanks. Based on our analysis and observations we provide a list of recommendations that cover various technical, legal, and policy mechanisms that open clinical databases can adopt to strengthen data privacy protection as they move toward wider deployment and adoption.

  20. Real-world Canagliflozin Utilization: Glycemic Control Among Patients With Type 2 Diabetes Mellitus-A Multi-Database Synthesis.

    PubMed

    Chow, Wing; Miyasato, Gavin; Kokkotos, Fotios K; Bailey, Robert A; Buysman, Erin K; Henk, Henry J

    2016-09-01

    Randomized controlled trials have found that treatment of type 2 diabetes mellitus with canagliflozin, a sodium glucose co-transporter 2 inhibitor, is associated with significant reductions in glycosylated hemoglobin (HbA1c) levels. However, very few studies have evaluated the effectiveness of sodium glucose co-transporter 2 inhibitors in a real-world context. This data synthesis aims to examine the demographic characteristics and glycemic control among patients treated with canagliflozin in clinical practice, using results obtained from 2 US-specific retrospective administrative claims databases. Data included in the synthesis were derived from 2 large claims databases (the Optum Research Database and the Inovalon MORE(2) Registry, Research Edition) and were obtained from 3 recently published retrospective observational studies of adult patients with type 2 diabetes mellitus who were treated with canagliflozin. Two of the studies used the Optum database (3-month and 6-month follow-up) and 1 study used the Inovalon database (mean follow-up of 4 months). Patient demographic characteristics, clinical characteristics, treatment utilization, and achievement of glycemic goals at baseline and after canagliflozin treatment were evaluated across the 3 studies. Results were assessed using univariate descriptive statistics. Baseline demographic characteristics were generally similar between the Optum and Inovalon cohorts. Mean baseline HbA1c was 8.7% in the Optum and 8.3% in the Inovalon cohort. Seventy-five percent of the Optum (3-month study) cohort and 74% of the Inovalon cohort used 2 or more antihyperglycemic agents. During follow-up, in both cohorts, the proportion of patients who achieved tight glycemic control (HbA1c <7.0%) more than doubled, while the proportion who had poor control (HbA1c ≥9.0%) decreased by approximately 50%. Among patients who had baseline HbA1c ≥7.0%, 21% of the Optum cohort and 24% of the Inovalon cohort achieved tight glycemic control (Hb

  1. Development of a database system for operational use in the selection of titanium alloys

    NASA Astrophysics Data System (ADS)

    Han, Yuan-Fei; Zeng, Wei-Dong; Sun, Yu; Zhao, Yong-Qing

    2011-08-01

    The selection of titanium alloys has become a complex decision-making task as the number of titanium alloys being created and utilized grows, each with its own characteristics, advantages, and limitations. In choosing the most appropriate titanium alloy, it is essential to offer a reasonable and intelligent service to technical engineers. One possible solution to this problem is to develop a database system (DS) that helps retrieve rational proposals from different databases and information sources and analyzes them to provide useful and explicit information. For this purpose, a design strategy based on fuzzy set theory is proposed, and a distributed database system is developed. Through ranking of the candidate titanium alloys, the most suitable material is determined. The selection results are found to be in good agreement with the practical situation.
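
    A common fuzzy-set ranking scheme normalizes each property to a [0, 1] membership value and aggregates with weights. The sketch below illustrates the idea with hypothetical alloy data; it is not necessarily the authors' exact formulation:

```python
# Hypothetical property values (strength in MPa, density in g/cm3, relative cost).
alloys = {
    "Ti-6Al-4V": {"strength": 950,  "density": 4.43, "cost": 0.6},
    "Ti-5553":   {"strength": 1200, "density": 4.65, "cost": 0.9},
    "CP-Ti":     {"strength": 480,  "density": 4.51, "cost": 0.3},
}
weights = {"strength": 0.5, "density": 0.2, "cost": 0.3}
maximize = {"strength": True, "density": False, "cost": False}

def membership(values, larger_is_better):
    """Linear membership function mapping each value into [0, 1]."""
    lo, hi = min(values), max(values)
    return {v: ((v - lo) if larger_is_better else (hi - v)) / (hi - lo)
            for v in values}

scores = {}
for prop, w in weights.items():
    mu = membership([a[prop] for a in alloys.values()], maximize[prop])
    for name, props in alloys.items():
        scores[name] = scores.get(name, 0.0) + w * mu[props[prop]]

ranked = sorted(scores, key=scores.get, reverse=True)
print(ranked[0])  # highest aggregate membership wins
```

Changing the weights shifts the ranking, which is how such a system can encode different application requirements.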

  2. Technical structure of the global nanoscience and nanotechnology literature

    NASA Astrophysics Data System (ADS)

    Kostoff, Ronald N.; Koytcheff, Raymond G.; Lau, Clifford G. Y.

    2007-10-01

    Text mining was used to extract technical intelligence from the open source global nanotechnology and nanoscience research literature. An extensive nanotechnology/nanoscience-focused query was applied to the Science Citation Index/Social Science Citation Index (SCI/SSCI) databases. The nanotechnology/nanoscience research literature technical structure (taxonomy) was obtained using computational linguistics/document clustering and factor analysis. The infrastructure (prolific authors, key journals/institutions/countries, most cited authors/journals/documents) for each of the clusters generated by the document clustering algorithm was obtained using bibliometrics. Another novel addition was the use of phrase auto-correlation maps to show technical thrust areas based on phrase co-occurrence in Abstracts, and the use of phrase-phrase cross-correlation maps to show technical thrust areas based on phrase relations due to the sharing of common co-occurring phrases. The ˜400 most cited nanotechnology papers since 1991 were grouped, and their characteristics generated. Whereas the main analysis provided technical thrusts of all nanotechnology papers retrieved, analysis of the most cited papers allowed their characteristics to be displayed. Finally, most cited papers from selected time periods were extracted, along with all publications from those time periods, and the institutions and countries were compared based on their representation in the most cited documents list relative to their representation in the most publications list.
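
    The phrase co-occurrence counting underlying the auto-correlation maps can be illustrated on toy data (the phrases and documents below are invented, and the actual maps involve further normalization):

```python
from itertools import combinations

# Toy abstracts represented as sets of extracted phrases (illustrative only).
docs = [
    {"carbon nanotube", "field emission"},
    {"carbon nanotube", "quantum dot"},
    {"quantum dot", "photoluminescence"},
    {"carbon nanotube", "field emission", "photoluminescence"},
]
phrases = sorted(set().union(*docs))

def cooccurrence(p, q):
    """Number of documents in which both phrases appear."""
    return sum(1 for d in docs if p in d and q in d)

# An auto-correlation map weights phrase pairs by their co-occurrence count.
pairs = {(p, q): cooccurrence(p, q) for p, q in combinations(phrases, 2)}
top = max(pairs, key=pairs.get)
print(top, pairs[top])
```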

  3. E-KIT: An Electronic-Knowledge Information Tool for Organizing Site Information and Improving Technical Communication with Stakeholders - 13082

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kautsky, Mark; Findlay, Richard C.; Hodges, Rex A.

    2013-07-01

    Managing technical references for projects that have long histories is hampered by the large collection of documents, each of which might contain discrete pieces of information relevant to the site conceptual model. A database application has been designed to improve the efficiency of retrieving technical information for a project. Although many databases are currently used for accessing analytical and geo-referenced data, applications designed specifically to manage technical reference material for projects are scarce. Retrieving site data from the array of available references becomes an increasingly inefficient use of labor. The electronic-Knowledge Information Tool (e-KIT) is designed as a project-level resource to access and communicate technical information. The e-KIT is a living tool that grows as new information becomes available, and its value to the project increases as the volume of site information increases. Having all references assembled in one location with complete reference citations and links to elements of the site conceptual model offers a way to enhance communication with outside groups. The published and unpublished references are incorporated into the e-KIT, while the compendium of references serves as a complete bibliography for the project. (authors)

  4. Baseline geochemistry of soil and bedrock Tshirege Member of the Bandelier Tuff at MDA-P

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warren, R.G.; McDonald, E.V.; Ryti, R.T.

    1997-08-01

    This report provides baseline geochemistry for soils (including fill), and for bedrock within three specific areas that are planned for use in the remediation of Material Disposal Area P (MDA-P) at Technical Area 16 (TA-16). The baseline chemistry includes leachable element concentrations for both soils and bedrock and total element concentrations for all soil samples and for two selected bedrock samples. MDA-P operated from the early 1950s to 1984 as a landfill for rubble and debris generated by the burning of high explosives (HE) at the TA-16 Burning Ground, HE-contaminated equipment and material, barium nitrate sand, building materials, and trash. The aim of this report is to establish causes for recognizable chemical differences between the background and baseline data sets. In many cases, the authors conclude that recognizable differences represent natural enrichments. In other cases, differences are best attributed to analytical problems. But most importantly, the comparison of background and baseline geochemistry demonstrates significant contamination for several elements not only at the two remedial sites near the TA-16 Burning Ground, but also within the entire region of the background study. This contamination is highly localized very near to the surface in soil and fill, and probably also in bedrock; consequently, upper tolerance limits (UTLs) calculated as upper 95% confidence limits of the 95th percentile are of little value and thus are not provided. This report instead provides basic statistical summaries and graphical comparisons for background and baseline samples to guide strategies for remediation of the three sites to be used in the restoration of MDA-P.

  5. The Mouse Heart Attack Research Tool (mHART) 1.0 Database.

    PubMed

    DeLeon-Pennell, Kristine Y; Iyer, Rugmani Padmanabhan; Ma, Yonggang; Yabluchanskiy, Andriy; Zamilpa, Rogelio; Chiao, Ying Ann; Cannon, Presley; Cates, Courtney; Flynn, Elizabeth R; Halade, Ganesh V; de Castro Bras, Lisandra E; Lindsey, Merry L

    2018-05-18

    The generation of Big Data has enabled systems-level dissections into the mechanisms of cardiovascular pathology. Integration of genetic, proteomic, and pathophysiological variables across platforms and laboratories fosters discoveries through multidisciplinary investigations and minimizes unnecessary redundancy in research efforts. The Mouse Heart Attack Research Tool (mHART) consolidates a large dataset of over 10 years of experiments from a single laboratory for cardiovascular investigators to generate novel hypotheses and identify new predictive markers of progressive left ventricular remodeling following myocardial infarction (MI) in mice. We designed the mHART REDCap database using our own data to integrate cardiovascular community participation. We generated physiological, biochemical, cellular, and proteomic outputs from plasma and left ventricles obtained from post-MI and no MI (naïve) control groups. We included both male and female mice ranging in age from 3 to 36 months old. After variable collection, data underwent quality assessment for data curation (e.g. eliminate technical errors, check for completeness, remove duplicates, and define terms). Currently, mHART 1.0 contains >888,000 data points and includes results from >2,100 unique mice. Database performance was tested and an example provided to illustrate database utility. This report explains how the first version of the mHART database was established and provides researchers with a standard framework to aid in the integration of their data into our database or in the development of a similar database.
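
    The curation steps named above (removing duplicates, checking completeness) can be sketched as a simple pass over toy records; the field names are hypothetical, not the actual mHART schema:

```python
# Toy curation pass mirroring the described quality-assessment steps.
records = [
    {"mouse_id": "M1", "sex": "F", "age_mo": 12,   "ef_pct": 58.0},
    {"mouse_id": "M1", "sex": "F", "age_mo": 12,   "ef_pct": 58.0},  # exact duplicate
    {"mouse_id": "M2", "sex": "M", "age_mo": None, "ef_pct": 41.5},  # incomplete row
    {"mouse_id": "M3", "sex": "M", "age_mo": 24,   "ef_pct": 49.0},
]

seen, curated, flagged = set(), [], []
for rec in records:
    key = tuple(sorted(rec.items()))
    if key in seen:
        continue                  # drop exact duplicates
    seen.add(key)
    if any(v is None for v in rec.values()):
        flagged.append(rec)       # route incomplete rows to manual review
    else:
        curated.append(rec)

print(len(curated), len(flagged))
```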

  6. Baseline practices and user needs for Web dissemination of geotechnical data

    USGS Publications Warehouse

    Turner, L.L.; Brown, M.P.; Chambers, D.; Davis, C.A.; Diehl, J.; Hitchcock, C.S.; Holzer, T.L.; Nigbor, R.L.; Plumb, C.; Real, C.; Reimer, M.; Steidl, J.H.; Sun, J.I.; Tinsley, J.C.; Vaughn, D.; ,

    2004-01-01

    This paper presents the findings and recommendations of the User Scenario Work Group (USWG) in identifying a baseline of current practices within the geo-professional community and prioritizing desired functional requirements in the development of a comprehensive geotechnical information management system. This work was conducted as an initial phase of a larger project to demonstrate the effectiveness of a web based virtual data center for the dissemination of geotechnical data from multiple linked databases of various government and private sector organizations. An online survey was administered over the course of several months to practitioners across the nation. The results from the survey were compiled and examined to provide direction to the other project teams in the development of user-driven prototype data system.

  7. Environmental Baseline Survey Parcel E2, F, and I, Military Housing Areas Nellis Air Force Base, Nevada. Phase 1

    DTIC Science & Technology

    2011-09-01

    Final Phase I Environmental Baseline Survey, Parcel E2, F, and I, Military Housing Areas (September 2011). The record text consists of report-documentation-page fragments and an acronym list: lead-based paint; LUST, leaking underground storage tank; M.D.M., Mount Diablo Meridian; MFH, military family housing; MHPI, Military Housing Privatization Initiative; NW, northwest; OWS, oil/water separator; PADS, PCB Activity Database; PCB, polychlorinated biphenyl; PCR, Physical Condition Report; PDF, portable document format.

  8. Phase 1 Environmental Baseline Survey Parcels E2, F, and I, Military Housing Areas, Nellis Air Force Base, Nevada

    DTIC Science & Technology

    2011-09-01

    Final Phase I Environmental Baseline Survey, Parcels E2, F, and I (21 September 2011). The record text consists of report-documentation-page fragments and an acronym list: LUST, leaking underground storage tank; M.D.M., Mount Diablo Meridian; MFH, military family housing; MHPI, Military Housing Privatization Initiative; MSL; OWS, oil/water separator; PADS, PCB Activity Database; PCB, polychlorinated biphenyl; PCR, Physical Condition Report; PDF, portable document format; PPV.

  9. Nuclear science abstracts (NSA) database 1948--1974 (on the Internet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    Nuclear Science Abstracts (NSA) is a comprehensive abstract and index collection of the International Nuclear Science and Technology literature for the period 1948 through 1976. Included are scientific and technical reports of the US Atomic Energy Commission, US Energy Research and Development Administration and its contractors, other agencies, universities, and industrial and research organizations. Coverage of the literature since 1976 is provided by Energy Science and Technology Database. Approximately 25% of the records in the file contain abstracts. These are from the following volumes of the print Nuclear Science Abstracts: Volumes 12--18, Volume 29, and Volume 33. The database contains over 900,000 bibliographic records. All aspects of nuclear science and technology are covered, including: Biomedical Sciences; Metals, Ceramics, and Other Materials; Chemistry; Nuclear Materials and Waste Management; Environmental and Earth Sciences; Particle Accelerators; Engineering; Physics; Fusion Energy; Radiation Effects; Instrumentation; Reactor Technology; Isotope and Radiation Source Technology. The database includes all records contained in Volume 1 (1948) through Volume 33 (1976) of the printed version of Nuclear Science Abstracts (NSA). This worldwide coverage includes books, conference proceedings, papers, patents, dissertations, engineering drawings, and journal literature. This database is now available for searching through the GOV.Research Center (GRC) service. GRC is a single online web-based search service to well known Government databases. Featuring powerful search and retrieval software, GRC is an important research tool. The GRC web site is at http://grc.ntis.gov.

  10. How can the research potential of the clinical quality databases be maximized? The Danish experience.

    PubMed

    Nørgaard, M; Johnsen, S P

    2016-02-01

    In Denmark, the need for monitoring of clinical quality and patient safety with feedback to the clinical, administrative and political systems has resulted in the establishment of a network of more than 60 publicly financed nationwide clinical quality databases. Although primarily devoted to monitoring and improving quality of care, the potential of these databases as data sources in clinical research is increasingly being recognized. In this review, we describe these databases focusing on their use as data sources for clinical research, including their strengths and weaknesses as well as future concerns and opportunities. The research potential of the clinical quality databases is substantial but has so far only been explored to a limited extent. Efforts related to technical, legal and financial challenges are needed in order to take full advantage of this potential. © 2016 The Association for the Publication of the Journal of Internal Medicine.

  11. Natural radioactivity in building materials in the European Union: a database and an estimate of radiological significance.

    PubMed

    Trevisi, R; Risica, S; D'Alessandro, M; Paradiso, D; Nuccetelli, C

    2012-02-01

    The authors set up a database of activity concentration measurements of natural radionuclides (²²⁶Ra, ²³²Th and ⁴⁰K) in building material. It contains about 10,000 samples of both bulk material (bricks, concrete, cement, natural- and phosphogypsum, sedimentary and igneous bulk stones) and superficial material (igneous and metamorphic stones) used in the construction industry in most European Union Member States. The database allowed the authors to calculate the activity concentration index I--suggested by a European technical guidance document and recently used as a basis for elaborating the draft Euratom Basic Safety Standards Directive--for bricks, concrete and phosphogypsum used in the European Union. Moreover, the percentage could be assessed of materials possibly subject to restrictions, if either of the two dose criteria proposed by the technical guidance were to be adopted. Copyright © 2011 Elsevier Ltd. All rights reserved.
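
    For reference, the index I in the EC technical guidance (Radiation Protection 112) takes the form I = C_Ra/300 + C_Th/200 + C_K/3000, with activity concentrations in Bq/kg; a minimal sketch (the sample values are hypothetical):

```python
def activity_index(c_ra226, c_th232, c_k40):
    """Activity concentration index per EC Radiation Protection 112 guidance.

    Concentrations in Bq/kg; I <= 1 is the screening level commonly applied
    to bulk building materials (1 mSv/y dose criterion).
    """
    return c_ra226 / 300 + c_th232 / 200 + c_k40 / 3000

# Illustrative concrete sample (values hypothetical, not from the database).
i = activity_index(40, 30, 400)
print(round(i, 3))
```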

  12. Examining the Factors That Contribute to Successful Database Application Implementation Using the Technology Acceptance Model

    ERIC Educational Resources Information Center

    Nworji, Alexander O.

    2013-01-01

    Most organizations spend millions of dollars due to the impact of improperly implemented database application systems as evidenced by poor data quality problems. The purpose of this quantitative study was to use, and extend, the technology acceptance model (TAM) to assess the impact of information quality and technical quality factors on database…

  13. Sex Differences and Self-Reported Attention Problems During Baseline Concussion Testing.

    PubMed

    Brooks, Brian L; Iverson, Grant L; Atkins, Joseph E; Zafonte, Ross; Berkner, Paul D

    2016-01-01

    Amateur athletic programs often use computerized cognitive testing as part of their concussion management programs. There is evidence that athletes with preexisting attention problems will have worse cognitive performance and more symptoms at baseline testing. The purpose of this study was to examine whether attention problems affect assessments differently for male and female athletes. Participants were drawn from a database that included 6,840 adolescents from Maine who completed Immediate Postconcussion Assessment and Cognitive Testing (ImPACT) at baseline (primary outcome measure). The final sample included 249 boys and 100 girls with self-reported attention problems. Each participant was individually matched for sex, age, number of past concussions, and sport to a control participant (249 boys, 100 girls). Boys with attention problems had worse reaction time than boys without attention problems. Girls with attention problems had worse visual-motor speed than girls without attention problems. Boys with attention problems reported more total symptoms, including more cognitive-sensory and sleep-arousal symptoms, compared with boys without attention problems. Girls with attention problems reported more cognitive-sensory, sleep-arousal, and affective symptoms than girls without attention problems. When considering the assessment, management, and outcome from concussions in adolescent athletes, it is important to consider both sex and preinjury attention problems regarding cognitive test results and symptom reporting.

  14. PENNSYLVANIA BASELINE

    EPA Science Inventory

    This report was prepared as part of the Ohio River Basin Energy Study (ORBES), a multidisciplinary policy research program supported by the Environmental Protection Agency. Its purpose is to provide baseline information on Pennsylvania, one of the six states included partly or to...

  15. Jet aircraft engine exhaust emissions database development: Year 1990 and 2015 scenarios

    NASA Technical Reports Server (NTRS)

    Landau, Z. Harry; Metwally, Munir; Vanalstyne, Richard; Ward, Clay A.

    1994-01-01

    Studies relating to environmental emissions associated with the High Speed Civil Transport (HSCT), military jet, and charter jet aircraft were conducted by McDonnell Douglas Aerospace Transport Aircraft. The report includes engine emission results for the baseline 1990 charter and military scenarios and projected jet engine emission results for a 2015 scenario for a Mach 1.6 HSCT charter and military fleet. Discussions of the methodology used in formulating these databases are provided.

  16. Oscillation Baselining and Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PNNL developed a new tool for oscillation analysis and baselining. This tool was developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) project (GM0072, "Suite of open-source applications and models for advanced synchrophasor analysis") and is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies modes of oscillation (frequency, damping, energy, and shape). The tool also performs oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).
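
    Mode frequency and damping of the kind OBAT reports can be estimated from a ringdown with the classical logarithmic decrement. The sketch below uses a synthetic signal with invented parameters, not OBAT's actual algorithms:

```python
import math

# Synthetic ringdown: x(t) = exp(-sigma*t) * cos(2*pi*f*t); values hypothetical.
f, sigma, dt = 0.7, 0.05, 0.01            # Hz, 1/s, sample period in s
x = [math.exp(-sigma * i * dt) * math.cos(2 * math.pi * f * i * dt)
     for i in range(3000)]

# Locate successive positive peaks, then apply the logarithmic decrement.
peaks = [i for i in range(1, len(x) - 1) if x[i - 1] < x[i] > x[i + 1]]
t0, t1 = peaks[0] * dt, peaks[1] * dt
freq_est = 1.0 / (t1 - t0)                       # mode frequency from peak spacing
delta = math.log(x[peaks[0]] / x[peaks[1]])      # log decrement over one period
damping_ratio = delta / math.sqrt(4 * math.pi ** 2 + delta ** 2)
print(round(freq_est, 2), round(damping_ratio, 3))
```

Production tools typically fit the mode (e.g. Prony or matrix-pencil methods) rather than relying on peak picking, but the recovered quantities are the same.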

  17. Mining of high utility-probability sequential patterns from uncertain databases

    PubMed Central

    Zhang, Binbin; Fournier-Viger, Philippe; Li, Ting

    2017-01-01

    High-utility sequential pattern mining (HUSPM) has become an important issue in the field of data mining. Several HUSPM algorithms have been designed to mine high-utility sequential patterns (HUSPs). They have been applied in several real-life situations such as consumer behavior analysis and event detection in sensor networks. Nonetheless, most studies on HUSPM have focused on mining HUSPs in precise data. But in real life, uncertainty is an important factor, as data is collected using various types of sensors that are more or less accurate. Hence, data collected in a real-life database can be annotated with existence probabilities. This paper presents a novel pattern mining framework called high utility-probability sequential pattern mining (HUPSPM) for mining high utility-probability sequential patterns (HUPSPs) in uncertain sequence databases. A baseline algorithm with three optional pruning strategies is presented to mine HUPSPs. Moreover, to speed up the mining process, a projection mechanism is designed to create a database projection for each processed sequence, which is smaller than the original database. Thus, the number of unpromising candidates can be greatly reduced, as well as the execution time for mining HUPSPs. Substantial experiments on both real-life and synthetic datasets show that the designed algorithm performs well in terms of runtime, number of candidates, memory usage, and scalability for different minimum utility and minimum probability thresholds. PMID:28742847
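
    The utility and probability of a candidate pattern in an uncertain sequence database can be computed as in the simplified sketch below; the toy data and the aggregation are illustrative, not the paper's algorithm:

```python
# Toy uncertain sequence database: (sequence, existence probability),
# where a sequence is a list of (item, utility) pairs. Values invented.
db = [
    ([("a", 4), ("b", 2), ("c", 1)], 0.9),
    ([("a", 5), ("c", 3)],           0.6),
    ([("b", 1), ("c", 2)],           0.8),
]

def pattern_utility(pattern, seq):
    """Utility of `pattern` if it occurs as a subsequence of `seq`, else 0."""
    util, i = 0, 0
    for item, u in seq:
        if i < len(pattern) and item == pattern[i]:
            util += u
            i += 1
    return util if i == len(pattern) else 0

def utility_and_probability(pattern):
    """Total utility and summed existence probability over supporting sequences."""
    util = prob = 0.0
    for seq, p in db:
        u = pattern_utility(pattern, seq)
        if u:
            util += u
            prob += p
    return util, prob

u, p = utility_and_probability(["a", "c"])
print(u, p)  # a pattern qualifies only if both values pass their thresholds
```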

  18. Assisting Scientific and Technical Research Through Subject Oriented Bibliographies of NTIS Reports.

    ERIC Educational Resources Information Center

    Schwarzwalder, Robert N., Jr.

    A program combining cost-free searching of the National Technical Information Service (NTIS) database and document delivery to faculty members was offered at the Kansas State University Libraries. NTIS report usage was monitored from May 1987, five months prior to the onset of the study, until May 1988, at which time the program was terminated.…

  19. The International Linear Collider Technical Design Report - Volume 3.I: Accelerator R&D in the Technical Design Phase

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adolphsen, Chris

    2013-06-26

    The International Linear Collider Technical Design Report (TDR) describes in four volumes the physics case and the design of a 500 GeV centre-of-mass energy linear electron-positron collider based on superconducting radio-frequency technology using niobium cavities as the accelerating structures. The accelerator can be extended to 1 TeV and also run as a Higgs factory at around 250 GeV and on the Z0 pole. A comprehensive value estimate of the accelerator is given, together with associated uncertainties. It is shown that no significant technical issues remain to be solved. Once a site is selected and the necessary site-dependent engineering is carried out, construction can begin immediately. The TDR also gives baseline documentation for two high-performance detectors that can share the ILC luminosity by being moved into and out of the beam line in a "push-pull" configuration. These detectors, ILD and SiD, are described in detail. They form the basis for a world-class experimental programme that promises to increase significantly our understanding of the fundamental processes that govern the evolution of the Universe.

  20. Catalog of databases and reports

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burtis, M.D.

    1997-04-01

    This catalog provides information about the many reports and materials made available by the US Department of Energy's (DOE's) Global Change Research Program (GCRP) and the Carbon Dioxide Information Analysis Center (CDIAC). The catalog is divided into nine sections plus the author and title indexes: Section A--US Department of Energy Global Change Research Program Research Plans and Summaries; Section B--US Department of Energy Global Change Research Program Technical Reports; Section C--US Department of Energy Atmospheric Radiation Measurement (ARM) Program Reports; Section D--Other US Department of Energy Reports; Section E--CDIAC Reports; Section F--CDIAC Numeric Data and Computer Model Distribution; Section G--Other Databases Distributed by CDIAC; Section H--US Department of Agriculture Reports on Response of Vegetation to Carbon Dioxide; and Section I--Other Publications.

  1. [International bibliographic databases--Current Contents on disk and in FTP format (Internet): presentation and guide].

    PubMed

    Bloch-Mouillet, E

    1999-01-01

    This paper aims to provide technical and practical advice about finding references using Current Contents on disk (Macintosh or PC) or via the Internet (FTP). Seven editions are published each week. They are all organized in the same way and have the same search engine. The Life Sciences edition, extensively used in medical research, is presented here in detail, as an example. This methodological note explains, in French, how to use this reference database. It is designed to be a practical guide for browsing and searching the database, and particularly for creating search profiles adapted to the needs of researchers.

  2. Advancing Commercialization of Algal Biofuel through Increased Biomass Productivity and Technical Integration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anton, David

    The proposed project built on the foundation of several years of intensive and ground-breaking R&D work at Cellana's Kona Demonstration Facility (KDF). Phycological and engineering solutions were provided to tackle key cultivation issues and technical barriers limiting algal biomass productivity identified through work conducted outdoors at industrial (1 acre) scale. The objectives of this project were to significantly improve algal biomass productivity and reduce operational cost in a seawater-based system, using results obtained from two top-performing algal strains as the baseline while technically advancing and, more importantly, integrating the various unit operations involved in algal biomass production, processing, and refining.

  3. Technical and Policy Approaches to Balancing Patient Privacy and Data Sharing in Clinical and Translational Research

    PubMed Central

    Malin, Bradley; Karp, David; Scheuermann, Richard H.

    2010-01-01

    Clinical researchers need to share data to support scientific validation and information reuse, and to comply with a host of regulations and directives from funders. Various organizations are constructing informatics resources in the form of centralized databases to ensure widespread availability of data derived from sponsored research. The widespread use of such open databases is contingent on the protection of patient privacy. In this paper, we review several aspects of the privacy-related problems associated with data sharing for clinical research from technical and policy perspectives. We begin with a review of existing policies for secondary data sharing and privacy requirements in the context of data derived from research and clinical settings. In particular, we focus on policies specified by the U.S. National Institutes of Health and the Health Insurance Portability and Accountability Act and touch upon how these policies are related to current, as well as future, use of data stored in public database archives. Next, we address aspects of data privacy and “identifiability” from a more technical perspective, and review how biomedical databanks can be exploited and seemingly anonymous records can be “re-identified” using various resources without compromising or hacking into secure computer systems. We highlight which data features specified in clinical research data models are potentially vulnerable or exploitable. In the process, we recount a recent privacy-related concern associated with the publication of aggregate statistics from pooled genome-wide association studies that has had a significant impact on the data sharing policies of NIH-sponsored databanks. Finally, we conclude with a list of recommendations that cover various technical, legal, and policy mechanisms that open clinical databases can adopt to strengthen data privacy protections as they move toward wider deployment and adoption. PMID:20051768

  4. Using linked administrative and disease-specific databases to study end-of-life care on a population level.

    PubMed

    Maetens, Arno; De Schreye, Robrecht; Faes, Kristof; Houttekier, Dirk; Deliens, Luc; Gielen, Birgit; De Gendt, Cindy; Lusyne, Patrick; Annemans, Lieven; Cohen, Joachim

    2016-10-18

    The use of full-population databases is under-explored to study the use, quality and costs of end-of-life care. Using the case of Belgium, we explored: (1) which full-population databases provide valid information about end-of-life care, (2) what procedures are there to use these databases, and (3) what is needed to integrate separate databases. Technical and privacy-related aspects of linking and accessing Belgian administrative databases and disease registries were assessed in cooperation with the database administrators and privacy commission bodies. For all relevant databases, we followed procedures in cooperation with database administrators to link the databases and to access the data. We identified several databases as fitting for end-of-life care research in Belgium: the InterMutualistic Agency's national registry of health care claims data, the Belgian Cancer Registry including data on incidence of cancer, and databases administrated by Statistics Belgium including data from the death certificate database, the socio-economic survey and fiscal data. To obtain access to the data, approval was required from all database administrators, supervisory bodies and two separate national privacy bodies. Two Trusted Third Parties linked the databases via a deterministic matching procedure using multiple encrypted social security numbers. In this article we describe how various routinely collected population-level databases and disease registries can be accessed and linked to study patterns in the use, quality and costs of end-of-life care in the full population and in specific diagnostic groups.
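The deterministic matching step described above, in which Trusted Third Parties link databases on encrypted social security numbers, can be sketched as follows. This is a minimal illustration, not the Belgian procedure: the field names, the sample records, and the use of a keyed HMAC-SHA-256 as the pseudonymisation function are all assumptions for the example.

```python
import hmac
import hashlib

def pseudonymize(national_id: str, secret_key: bytes) -> str:
    """Derive a stable pseudonym from an identifier with a keyed hash,
    so record sets can be joined without exposing the identifier itself."""
    return hmac.new(secret_key, national_id.encode(), hashlib.sha256).hexdigest()

def link(claims: list, registry: list, key: bytes) -> list:
    """Deterministic linkage: join two record sets on the pseudonymized ID."""
    by_pid = {pseudonymize(r["ssn"], key): r for r in registry}
    linked = []
    for record in claims:
        pid = pseudonymize(record["ssn"], key)
        if pid in by_pid:
            merged = {**by_pid[pid], **record}
            merged.pop("ssn")  # the clear identifier never leaves the linking party
            merged["pid"] = pid
            linked.append(merged)
    return linked

key = b"ttp-secret"  # hypothetical key held only by the Trusted Third Party
claims = [{"ssn": "850101-123", "care_days": 12}]
registry = [{"ssn": "850101-123", "diagnosis": "C34"}]
print(link(claims, registry, key))
```

Because the hash is deterministic, the same identifier yields the same pseudonym in every database, which is what makes exact joins possible without any party seeing the clear identifiers of another.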

  5. Advanced light source: Compendium of user abstracts and technical reports,1993-1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    1997-04-01

    This compendium contains abstracts written by users summarizing research completed or in progress from 1993-1996, ALS technical reports describing ongoing efforts related to improvement in machine operations and research and development projects, and information on ALS beamlines planned through 1998. Two tables of contents organize the user abstracts by beamline and by area of research, and an author index makes abstracts accessible by author and by principal investigator. Technical details for each beamline, including whom to contact for additional information, can be found in the beamline information section. Separate abstracts have been indexed into the database for contributions to this compendium.

  6. A Database of Historical Information on Landslides and Floods in Italy

    NASA Astrophysics Data System (ADS)

    Guzzetti, F.; Tonelli, G.

    2003-04-01

    For the past 12 years we have maintained and updated a database of historical information on landslides and floods in Italy, known as the National Research Council's AVI (Damaged Urban Areas) Project archive. The database was originally designed to respond to a specific request of the Minister of Civil Protection, and was aimed at helping the regional assessment of landslide and flood risk in Italy. The database was first constructed in 1991-92 to cover the period 1917 to 1990. Information on damaging landslide and flood events was collected by searching archives, by screening thousands of newspaper issues, by reviewing the existing technical and scientific literature on landslides and floods in Italy, and by interviewing landslide and flood experts. The database was then updated chiefly through the analysis of hundreds of newspaper articles, and it now covers systematically the period 1900 to 1998, and non-systematically the periods 1900 to 1916 and 1999 to 2002. Non-systematic information on landslide and flood events predating the 20th century is also present in the database. The database currently contains information on more than 32,000 landslide events that occurred at more than 25,700 sites, and on more than 28,800 flood events that occurred at more than 15,600 sites. After a brief outline of the history and evolution of the AVI Project archive, we present and discuss: (a) the present structure of the database, including the hardware and software solutions adopted to maintain, manage, use and disseminate the information stored in the database, (b) the type and amount of information stored in the database, including an estimate of its completeness, and (c) examples of recent applications of the database, including a web-based GIS system to show the location of sites historically affected by landslides and floods, and an estimate of geo-hydrological (i.e., landslide and flood) risk in Italy based on the available historical information.

  7. Analysis of Users' Searches of CD-ROM Databases in the National and University Library in Zagreb.

    ERIC Educational Resources Information Center

    Jokic, Maja

    1997-01-01

    Investigates the search behavior of CD-ROM database users in Zagreb (Croatia) libraries: one group needed a minimum of technical assistance, and the other was completely independent. Highlights include the use of questionnaires and transaction log analysis and the need for end-user education. The questionnaire and definitions of search process…

  8. Evaluating Technical Efficiency of Nursing Care Using Data Envelopment Analysis and Multilevel Modeling.

    PubMed

    Min, Ari; Park, Chang Gi; Scott, Linda D

    2016-05-23

    Data envelopment analysis (DEA) is an advantageous non-parametric technique for evaluating relative efficiency of performance. This article describes use of DEA to estimate technical efficiency of nursing care and demonstrates the benefits of using multilevel modeling to identify characteristics of efficient facilities in the second stage of analysis. Data were drawn from LTCFocUS.org, a secondary database including nursing home data from the Online Survey Certification and Reporting System and Minimum Data Set. In this example, 2,267 non-hospital-based nursing homes were evaluated. Use of DEA with nurse staffing levels as inputs and quality of care as outputs allowed estimation of the relative technical efficiency of nursing care in these facilities. In the second stage, multilevel modeling was applied to identify organizational factors contributing to technical efficiency. Use of multilevel modeling avoided biased estimation of findings for nested data and provided comprehensive information on differences in technical efficiency among counties and states. © The Author(s) 2016.

  9. Technical literature review.

    PubMed

    Nußbeck, Gunnar; Gök, Murat

    2013-01-01

    This review gives a comprehensive overview on the technical perspective of personal health monitoring. It is designed to build a mutual basis for the project partners of the PHM-Ethics project. A literature search was conducted to screen pertinent literature databases for relevant publications. All review papers that were retrieved were analyzed. The increasing number of publications per year shows that the field of personal health monitoring is of growing interest in the research community. Most publications deal with telemonitoring, which thus forms the core technology of personal health monitoring. Measured parameters, fields of application, participants and stakeholders are described. Moreover, an outlook is provided on information and communication technology that fosters the integration of personal health monitoring into decision making and the remote monitoring of individuals' health. The removal of the technological barriers opens new perspectives in health and health care delivery using home monitoring applications.

  10. Improvement of medication event interventions through use of an electronic database.

    PubMed

    Merandi, Jenna; Morvay, Shelly; Lewe, Dorcas; Stewart, Barb; Catt, Char; Chanthasene, Phillip P; McClead, Richard; Kappeler, Karl; Mirtallo, Jay M

    2013-10-01

    Patient safety enhancements achieved through the use of an electronic Web-based system for responding to adverse drug events (ADEs) are described. A two-phase initiative was carried out at an academic pediatric hospital to improve processes related to "medication event huddles" (interdisciplinary meetings focused on ADE interventions). Phase 1 of the initiative entailed a review of huddles and interventions over a 16-month baseline period during which multiple databases were used to manage the huddle process and staff interventions were assigned via manually generated e-mail reminders. Phase 1 data collection included ADE details (e.g., medications and staff involved, location and date of event) and the types and frequencies of interventions. Based on the phase 1 analysis, an electronic database was created to eliminate the use of multiple systems for huddle scheduling and documentation and to automatically generate e-mail reminders on assigned interventions. In phase 2 of the initiative, the impact of the database during a 5-month period was evaluated; the primary outcome was the percentage of interventions documented as completed after database implementation. During the postimplementation period, 44.7% of assigned interventions were completed, compared with a completion rate of 21% during the preimplementation period, and interventions documented as incomplete decreased from 77% to 43.7% (p < 0.0001). Process changes, education, and medication order improvements were the most frequently documented categories of interventions. Implementation of a user-friendly electronic database improved intervention completion and documentation after medication event huddles.

  11. Federated web-accessible clinical data management within an extensible neuroimaging database.

    PubMed

    Ozyurt, I Burak; Keator, David B; Wei, Dingying; Fennema-Notestine, Christine; Pease, Karen R; Bockholt, Jeremy; Grethe, Jeffrey S

    2010-12-01

    Managing vast datasets collected throughout multiple clinical imaging communities has become critical with the ever-increasing and diverse nature of datasets. Development of data management infrastructure is further complicated by technical and experimental advances that drive modifications to existing protocols and acquisition of new types of research data to be incorporated into existing data management systems. In this paper, an extensible data management system for clinical neuroimaging studies is introduced: The Human Clinical Imaging Database (HID) and Toolkit. The database schema is constructed to support the storage of new data types without changes to the underlying schema. The complex infrastructure allows management of experiment data, such as image protocol and behavioral task parameters, as well as subject-specific data, including demographics, clinical assessments, and behavioral task performance metrics. Of significant interest, embedded clinical data entry and management tools enhance both consistency of data reporting and automatic entry of data into the database. The Clinical Assessment Layout Manager (CALM) allows users to create on-line data entry forms for use within and across sites, through which data is pulled into the underlying database via the generic clinical assessment management engine (GAME). Importantly, the system is designed to operate in a distributed environment, serving both human users and client applications in a service-oriented manner. Querying capabilities use a built-in multi-database parallel query builder/result combiner, allowing web-accessible queries within and across multiple federated databases. The system along with its documentation is open-source and available from the Neuroimaging Informatics Tools and Resource Clearinghouse (NITRC) site.
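Storing new data types without altering the underlying schema, as the abstract describes, is characteristic of an entity-attribute-value (EAV) layout. The sketch below illustrates that general idea with SQLite; it is not the HID's actual schema, and every table and column name here is invented for the example.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE assessment (id INTEGER PRIMARY KEY, subject TEXT)")
# One generic value table instead of one column per measure: adding a new
# assessment type later requires no ALTER TABLE.
con.execute("""CREATE TABLE assessment_value (
    assessment_id INTEGER REFERENCES assessment(id),
    attribute TEXT,
    value TEXT)""")

con.execute("INSERT INTO assessment (id, subject) VALUES (1, 'S001')")
rows = [(1, "PANSS_total", "62"), (1, "handedness", "right")]
con.executemany("INSERT INTO assessment_value VALUES (?, ?, ?)", rows)

# A newly introduced instrument is just new attribute rows; the schema is untouched.
con.execute("INSERT INTO assessment_value VALUES (1, 'MMSE', '28')")

for attr, val in con.execute(
        "SELECT attribute, value FROM assessment_value WHERE assessment_id = 1"):
    print(attr, val)
```

The trade-off of this pattern is that type checking and constraints move from the database into application code, which is one reason systems like the HID pair such a schema with data entry tools that enforce consistency at capture time.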

  12. 4. International reservoir characterization technical conference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1997-04-01

    This volume contains the Proceedings of the Fourth International Reservoir Characterization Technical Conference held March 2-4, 1997 in Houston, Texas. The theme for the conference was Advances in Reservoir Characterization for Effective Reservoir Management. On March 2, 1997, the DOE Class Workshop kicked off with tutorials by Dr. Steve Begg (BP Exploration) and Dr. Ganesh Thakur (Chevron). Tutorial presentations are not included in these Proceedings but may be available from the authors. The conference consisted of the following topics: data acquisition; reservoir modeling; scaling reservoir properties; and managing uncertainty. Selected papers have been processed separately for inclusion in the Energy Science and Technology database.

  13. Genetics and Forensics: Making the National DNA Database

    PubMed Central

    Johnson, Paul; Williams, Robin; Martin, Paul

    2005-01-01

    This paper is based on a current study of the growing police use of the epistemic authority of molecular biology for the identification of criminal suspects in support of crime investigation. It discusses the development of DNA profiling and the establishment and development of the UK National DNA Database (NDNAD) as an instance of the ‘scientification of police work’ (Ericson and Shearing 1986) in which the police uses of science and technology have a recursive effect on their future development. The NDNAD, owned by the Association of Chief Police Officers of England and Wales, is the first of its kind in the world and currently contains the genetic profiles of more than 2 million people. The paper provides a framework for the examination of this socio-technical innovation, begins to tease out the dense and compact history of the database and accounts for the way in which changes and developments across disparate scientific, governmental and policing contexts, have all contributed to the range of uses to which it is put. PMID:16467921

  14. An Evaluation of Selected NASA Scientific and Technical Information Products: Results of a Pilot Study.

    ERIC Educational Resources Information Center

    Pinelli, Thomas E.; Glassman, Myron

    A pilot study was conducted to evaluate selected NASA (National Aeronautics and Space Administration) scientific and technical information (STI) products. The study, which utilized survey research in the form of a self-administered mail questionnaire, had a two-fold purpose--to gather baseline data on the use and perceived usefulness of selected…

  15. Using Online Databases to Determine the Correlation between Ranked Lists of Journals.

    DTIC Science & Technology

    1984-12-30

    …Communications Agency. The purpose of the study was to use citation analysis and statistical testing in journal selection. Bibliographic databases… sources to justify the cost of the journals selected. The research procedures used in this study included the compilation of a list of 157 technical…

  16. An online database for informing ecological network models: http://kelpforest.ucsc.edu.

    PubMed

    Beas-Luna, Rodrigo; Novak, Mark; Carr, Mark H; Tinker, Martin T; Black, August; Caselle, Jennifer E; Hoban, Michael; Malone, Dan; Iles, Alison

    2014-01-01

    Ecological network models and analyses are recognized as valuable tools for understanding the dynamics and resiliency of ecosystems, and for informing ecosystem-based approaches to management. However, few databases exist that can provide the life history, demographic and species interaction information necessary to parameterize ecological network models. Faced with the difficulty of synthesizing the information required to construct models for kelp forest ecosystems along the West Coast of North America, we developed an online database (http://kelpforest.ucsc.edu/) to facilitate the collation and dissemination of such information. Many of the database's attributes are novel, yet the structure is applicable and adaptable to other ecosystem modeling efforts. Information for each taxonomic unit includes stage-specific life history, demography, and body-size allometries. Species interactions include trophic, competitive, facilitative, and parasitic forms. Each data entry is temporally and spatially explicit. The online data entry interface allows researchers anywhere to contribute and access information. Quality control is facilitated by attributing each entry to unique contributor identities and source citations. The database has proven useful as an archive of species and ecosystem-specific information in the development of several ecological network models, for informing management actions, and for education purposes (e.g., undergraduate and graduate training). To facilitate adaptation of the database by other researchers for other ecosystems, the code and technical details on how to customize this database and apply it to other ecosystems are freely available at the following link (https://github.com/kelpforest-cameo/databaseui).

  17. Establishing a store baseline during interim storage of waste packages and a review of potential technologies for base-lining

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McTeer, Jennifer; Morris, Jenny; Wickham, Stephen

    Interim storage is an essential component of the waste management lifecycle, providing a safe, secure environment for waste packages awaiting final disposal. In order to be able to monitor and detect change or degradation of the waste packages, storage building or equipment, it is necessary to know the original condition of these components (the 'waste storage system'). This paper presents an approach to establishing the baseline for a waste-storage system, and provides guidance on the selection and implementation of potential base-lining technologies. The approach is made up of two sections; assessment of base-lining needs and definition of base-lining approach. During the assessment of base-lining needs a review of available monitoring data and store/package records should be undertaken (if the store is operational). Evolutionary processes (affecting safety functions), and their corresponding indicators, that can be measured to provide a baseline for the waste-storage system should then be identified in order for the most suitable indicators to be selected for base-lining. In defining the approach, identification of opportunities to collect data and constraints is undertaken before selecting the techniques for base-lining and developing a base-lining plan. Base-lining data may be used to establish that the state of the packages is consistent with the waste acceptance criteria for the storage facility and to support the interpretation of monitoring and inspection data collected during store operations. Opportunities and constraints are identified for different store and package types. Technologies that could potentially be used to measure baseline indicators are also reviewed. (authors)

  18. Preliminary development of a diabetic foot ulcer database from a wound electronic medical record: a tool to decrease limb amputations.

    PubMed

    Golinko, Michael S; Margolis, David J; Tal, Adit; Hoffstad, Ole; Boulton, Andrew J M; Brem, Harold

    2009-01-01

    Our objective was to create a practical standardized database of clinically relevant variables in the care of patients with diabetes and foot ulcers. Numerical clinical variables such as age, baseline laboratory values, and wound area were extracted from the wound electronic medical record (WEMR). A coding system was developed to translate narrative data, culture, and pathology reports into discrete, quantifiable variables. Using data extracted from the WEMR, a diabetic foot ulcer-specific database incorporated the following tables: (1) demographics, medical history, and baseline laboratory values; (2) vascular testing data; (3) radiology data; (4) wound characteristics; and (5) wound debridement data including pathology, culture results, and amputation data. The database contains variables that can be easily exported for analysis. Amputation was studied in 146 patients who had at least two visits (e.g., two entries in the database). Analysis revealed that 19 (13%) patients underwent 32 amputations (nine major and 23 minor) in 23 limbs. Analysis using a proportional hazards model revealed a decreased risk of amputation, 0.87 (0.78, 1.00), associated with an increased number of visits and entries in the WEMR. Further analysis revealed no significant difference in age, gender, HbA1c%, cholesterol, white blood cell count, or prealbumin at baseline, whereas hemoglobin and albumin were significantly lower in the amputee group (p<0.05) than the nonamputee group. Fifty-nine percent of amputees had histological osteomyelitis based on operating room biopsy vs. 45% of nonamputees. In conclusion, tracking patients with a WEMR is a tool that could potentially increase patient safety and quality of care, allowing clinicians to more easily identify a nonhealing wound and intervene. This report describes a method of capturing data relevant to clinical care of a patient with a diabetic foot ulcer, and may enable clinicians to adapt such a system to their own patient population.

  19. PSD Increment Baseline

    EPA Pesticide Factsheets

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  20. PSD Determination - Baseline

    EPA Pesticide Factsheets

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  1. Mental practice enhances surgical technical skills: a randomized controlled study.

    PubMed

    Arora, Sonal; Aggarwal, Rajesh; Sirimanna, Pramudith; Moran, Aidan; Grantcharov, Teodor; Kneebone, Roger; Sevdalis, Nick; Darzi, Ara

    2011-02-01

    To assess the effects of mental practice on surgical performance. Increasing concerns for patient safety have highlighted a need for alternative training strategies outside the operating room. Mental practice (MP), "the cognitive rehearsal of a task before performance," has been successful in sport and music to enhance skill. This study investigates whether MP enhances performance in laparoscopic surgery. After baseline skills testing, 20 novice surgeons underwent training on an evidence-based virtual reality curriculum. After randomization using the closed envelope technique, all participants performed 5 Virtual Reality (VR) laparoscopic cholecystectomies (LC). Mental practice participants performed 30 minutes of MP before each LC; control participants viewed an online lecture. Technical performance was assessed using video Objective Structured Assessment of Technical Skills-based global ratings scale (scored from 7 to 35). Mental imagery was assessed using a previously validated Mental Imagery Questionnaire. Eighteen participants completed the study. There were no intergroup differences in baseline technical ability. Learning curves were demonstrated for both MP and control groups. Mental practice was superior to control (global ratings) for the first LC (median 20 vs 15, P = 0.005), second LC (20.5 vs 13.5, P = 0.001), third LC (24 vs 15.5, P < 0.001), fourth LC (25.5 vs 15.5, P < 0.001) and the fifth LC (27.5 vs 19.5, P = 0.00). The imagery for the MP group was also significantly superior to the control group across all sessions (P < 0.05). Improved imagery significantly correlated with better quality of performance (ρ 0.51–0.62, Ps < 0.05). This is the first randomized controlled study to show that MP enhances the quality of performance based on VR laparoscopic cholecystectomy. This may be a time- and cost-effective strategy to augment traditional training in the OR, thus potentially improving patient care.

  2. Collaborative and Multilingual Approach to Learn Database Topics Using Concept Maps

    PubMed Central

    Calvo, Iñaki

    2014-01-01

    Authors report on a study using the concept mapping technique in computer engineering education for learning theoretical introductory database topics. In addition, the learning of multilingual technical terminology by means of the collaborative drawing of a concept map is also pursued in this experiment. The main characteristics of a study carried out in the database course at the University of the Basque Country during the 2011/2012 academic year are described. This study contributes to the field of concept mapping, as these kinds of cognitive tools have proved to be valid to support learning in computer engineering education. It contributes to the field of computer engineering education, providing a technique that can be incorporated with several educational purposes within the discipline. Results reveal the potential that a collaborative concept map editor offers to fulfil the above-mentioned objectives. PMID:25538957

  3. NASA's experience in the international exchange of scientific and technical information in the aerospace field

    NASA Technical Reports Server (NTRS)

    Thibideau, Philip A.

    1989-01-01

    The early NASA international scientific and technical information (STI) exchange arrangements were usually detailed in correspondence with the librarians of the institutions involved. While this type of exchange, which involved only hardcopy (paper) products, grew to include some 220 organizations in 43 countries, NASA's main focus shifted substantially to the STI relationship with the European Space Agency (ESA) which began in 1964. The NASA/ESA Tripartite Exchange Program, which now has more than 500 participants, provides more than 4,000 highly relevant technical reports, fully processed, for the NASA-produced 'Aerospace Database'. In turn, NASA provides an updated copy of this Database, known in Europe as the 'NASA File', for access, through ESA's Information Retrieval Service, by participating European organizations. Our experience in the evolving cooperation with ESA has established the 'model' for our more recent exchange agreements with Israel, Australia, Canada, and one under negotiation with Japan. The results of these agreements are made available to participating European organizations through the NASA File.

  4. Assessing Technical Competence in Surgical Trainees: A Systematic Review.

    PubMed

    Szasz, Peter; Louridas, Marisa; Harris, Kenneth A; Aggarwal, Rajesh; Grantcharov, Teodor P

    2015-06-01

    To systematically examine the literature describing the methods by which technical competence is assessed in surgical trainees. The last decade has witnessed an evolution away from time-based surgical education. In response, governing bodies worldwide have implemented competency-based education paradigms. The definition of competence, however, remains elusive, and the impact of these education initiatives in terms of assessment methods remains unclear. A systematic review examining the methods by which technical competence is assessed was conducted by searching MEDLINE, EMBASE, PsychINFO, and the Cochrane database of systematic reviews. Abstracts of retrieved studies were reviewed and those meeting inclusion criteria were selected for full review. Data were retrieved in a systematic manner, the validity and reliability of the assessment methods were evaluated, and quality was assessed using the Grading of Recommendations Assessment, Development and Evaluation classification. Of the 6814 studies identified, 85 studies involving 2369 surgical residents were included in this review. The methods used to assess technical competence were categorized into 5 groups: Likert scales (37), benchmarks (31), binary outcomes (11), novel tools (4), and surrogate outcomes (2). Their validity and reliability were mostly previously established. The overall Grading of Recommendations Assessment, Development and Evaluation rating for randomized controlled trials was high and low for the observational studies. The definition of technical competence continues to be debated within the medical literature. The methods used to evaluate technical competence predominantly include instruments that were originally created to assess technical skill. Very few studies identify standard-setting approaches that differentiate competent versus noncompetent performers; subsequently, this has been identified as an area with great research potential.

  5. NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box

    NASA Technical Reports Server (NTRS)

    O'Neil, D. A.; Craig, D. A.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The objective of this Technical Interchange Meeting was to increase the quantity and quality of technical, cost, and programmatic data used to model the impact of investing in different technologies. The focus of this meeting was the Technology Tool Box (TTB), a database of performance, operations, and programmatic parameters provided by technologists and used by systems engineers. The TTB is the data repository used by a system of models known as the Advanced Technology Lifecycle Analysis System (ATLAS). This report describes the result of the November meeting, and also provides background information on ATLAS and the TTB.

  6. FAQs about Baseline Testing among Young Athletes

    MedlinePlus

    ... your league prepare for concussions both pre- and post-season. What is baseline testing? Baseline testing is a pre-season exam conducted by a trained health care professional. Baseline tests are used to assess an athlete’s balance and ...

  7. Effects of technical editing in biomedical journals: a systematic review.

    PubMed

    Wager, Elizabeth; Middleton, Philippa

    2002-06-05

    Technical editing supposedly improves the accuracy and clarity of journal articles. We examined evidence of its effects on research reports in biomedical journals. Subset of a systematic review using Cochrane methods, searching MEDLINE, EMBASE, and other databases from earliest entries to February 2000 by using inclusive search terms; hand searching relevant journals. We selected comparative studies of the effects of editorial processes on original research articles between acceptance and publication in biomedical journals. Two reviewers assessed each study and performed independent data extraction. The 11 studies on technical editing indicate that it improves the readability of articles slightly (as measured by Gunning Fog and Flesch reading ease scores), may improve other aspects of their quality, can increase the accuracy of references and quotations, and raises the quality of abstracts. Supplying authors with abstract preparation instructions had no discernible effect. Considering the time and resources devoted to technical editing, remarkably little is known about its effects or the effects of imposing different house styles. Studies performed at 3 journals employing relatively large numbers of professional technical editors suggest that their editorial processes are associated with increases in readability and quality of articles, but these findings may not be generalizable to other journals.
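
The review above measures readability with Gunning Fog and Flesch reading ease scores. As an illustration of what such a score computes, here is a minimal sketch of the Flesch reading ease formula; the vowel-group syllable counter is a crude heuristic, not the instrument used in the cited studies.

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count runs of consecutive vowels (minimum one syllable).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    """Flesch reading ease:
    206.835 - 1.015 * (words/sentences) - 84.6 * (syllables/words).
    Higher scores indicate easier-to-read text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))
```

Short sentences built from monosyllables score high; long sentences of polysyllabic words score low, which is the direction of the readability improvements the review reports.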

  8. Database for earthquake strong motion studies in Italy

    USGS Publications Warehouse

    Scasserra, G.; Stewart, J.P.; Kayen, R.E.; Lanzo, G.

    2009-01-01

    We describe an Italian database of strong ground motion recordings and databanks delineating conditions at the instrument sites and characteristics of the seismic sources. The strong motion database consists of 247 corrected recordings from 89 earthquakes and 101 recording stations. Uncorrected recordings were drawn from public web sites and processed on a record-by-record basis using a procedure utilized in the Next-Generation Attenuation (NGA) project to remove instrument resonances, minimize noise effects through low- and high-pass filtering, and apply baseline correction. The number of available uncorrected recordings was reduced by 52% (mostly because of S-triggers) to arrive at the 247 recordings in the database. The site databank includes for every recording site the surface geology, a measurement or estimate of average shear wave velocity in the upper 30 m (Vs30), and information on instrument housing. Of the 89 sites, 39 have on-site velocity measurements (17 of which were performed as part of this study using SASW techniques). For the remaining sites, we estimate Vs30 based on measurements made in similar geologic conditions where available. Where no local velocity measurements are available, correlations with surface geology are used. Source parameters are drawn from databanks maintained (and recently updated) by the Istituto Nazionale di Geofisica e Vulcanologia and include hypocenter location and magnitude for small events (M ≲ 5.5) and finite source parameters for larger events. © 2009 A.S. Elnashai & N.N. Ambraseys.
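
The record-by-record processing described above includes a baseline-correction step. As a hedged illustration, the sketch below removes a least-squares linear trend from a synthetic acceleration trace; it is a simplified stand-in for the full NGA-style procedure, which also removes instrument resonances and applies low- and high-pass filters.

```python
import numpy as np

def baseline_correct(accel: np.ndarray, dt: float) -> np.ndarray:
    """Remove a least-squares linear trend from an acceleration record.

    A simplified stand-in for the baseline-correction step of strong-motion
    processing; production pipelines do considerably more (instrument
    response removal, band-pass filtering, per-record review).
    """
    t = np.arange(accel.size) * dt
    slope, intercept = np.polyfit(t, accel, 1)  # fit a straight line
    return accel - (slope * t + intercept)      # subtract the trend

# Synthetic record: a 2 Hz sinusoid contaminated with a linear drift.
dt = 0.01
t = np.arange(2000) * dt
record = np.sin(2 * np.pi * 2.0 * t) + 0.05 * t + 0.2
corrected = baseline_correct(record, dt)
```

Because the residuals of a least-squares linear fit are orthogonal to both the constant and linear terms, the corrected trace has zero mean and zero residual slope up to floating-point error.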

  9. Environmental baseline conditions for impact assessment of unconventional gas exploitation: the G-Baseline project

    NASA Astrophysics Data System (ADS)

    Kloppmann, Wolfram; Mayer, Berhard; Millot, Romain; Parker, Beth L.; Gaucher, Eric; Clarkson, Christopher R.; Cherry, John A.; Humez, Pauline; Cahill, Aaron

    2015-04-01

    A major scientific challenge and an indispensable prerequisite for environmental impact assessment in the context of unconventional gas development is the determination of the baseline conditions against which potential environmental impacts on shallow freshwater resources can be accurately and quantitatively tested. Groundwater and surface water resources overlying the low-permeability hydrocarbon host rocks containing shale gas may be impacted to different extents by naturally occurring saline fluids and by natural gas emanations. Baseline assessments in areas of previous conventional hydrocarbon production may also reveal anthropogenic impacts from these activities not related to unconventional gas development. Once unconventional gas exploitation has started, the baseline may be irrevocably lost through the intricate superposition of geogenic and potential anthropogenic contamination by stray gas, formation waters, and chemicals used during hydraulic fracturing. The objective of the Franco-Canadian NSERC-ANR project G-Baseline is to develop an innovative and comprehensive methodology of geochemical and isotopic characterization of the environmental baseline for water and gas samples from all three essential zones: (1) the production zone, including flowback waters, (2) the intermediate zone comprising the overlying formations, and (3) shallow aquifers and surface water systems where contamination may result from diverse natural or human impacts. The outcome will be the establishment of a methodology based on innovative tracer and monitoring techniques, including traditional and non-traditional isotopes (C, H, O, S, B, Sr, Cl, Br, N, U, Li, Cu, Zn, CSIA...) for detecting, quantifying, and modeling potential leakage of stray gas and of saline formation water mixed with flowback fluids into fresh groundwater resources and surface waters, taking into account the pathways and mechanisms of fluid and gas migration. Here we present an outline of the project as well as first

  10. Technical and tactical skills related to performance levels in tennis: A systematic review.

    PubMed

    Kolman, Nikki S; Kramer, Tamara; Elferink-Gemser, Marije T; Huijgen, Barbara C H; Visscher, Chris

    2018-06-11

    The aim of this systematic review is to provide an overview of outcome measures and instruments identified in the literature for examining technical and tactical skills in tennis related to performance levels. Such instruments can be used to identify talent or the specific skill development training needs of particular players. Searches for this review were conducted using the PubMed, Web of Science, and PsycInfo databases. Out of 733 publications identified through these searches, 40 articles were considered relevant and included in this study. They were divided into three categories: (1) technical skills, (2) tactical skills and (3) integrated technical and tactical skills. There was strong evidence that technical skills (ball velocity and to a lesser extent ball accuracy) and tactical skills (decision making, anticipation, tactical knowledge and visual search strategies) differed among players according to their performance levels. However, integrated measurement of these skills is required, because winning a point largely hinges on a tactical decision to perform a particular stroke (i.e., technical execution). Therefore, future research should focus on examining the relationship between these skills and tennis performance and on the development of integrated methods for measuring these skills.

  11. The technical communication practices of Russian and U.S. aerospace engineers and scientists

    NASA Technical Reports Server (NTRS)

    Pinelli, Thomas E.; Barclay, Rebecca O.; Keene, Michael L.; Flammia, Madelyn; Kennedy, John M.

    1993-01-01

    As part of Phase 4 of the NASA/DoD Aerospace Knowledge Diffusion Research Project, two studies were conducted that investigated the technical communication practices of Russian and U.S. aerospace engineers and scientists. Both studies had the same five objectives: first, to solicit the opinions of aerospace engineers and scientists regarding the importance of technical communication to their professions; second, to determine the use and production of technical communication by aerospace engineers and scientists; third, to seek their views about the appropriate content of the undergraduate course in technical communication; fourth, to determine aerospace engineers' and scientists' use of libraries, technical information centers, and on-line databases; and fifth, to determine the use and importance of computer and information technology to them. A self-administered questionnaire was distributed to Russian aerospace engineers and scientists at the Central Aero-Hydrodynamic Institute (TsAGI) and to their U.S. counterparts at the NASA Ames Research Center and the NASA Langley Research Center. The completion rates for the Russian and U.S. surveys were 64 and 61 percent, respectively. Responses of the Russian and U.S. participants to selected questions are presented in this paper.

  12. Sortie laboratory, phase B technical summary. [design and operational requirements

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The design and operational requirements that evolved from Sortie Lab (SL) analysis are summarized. A source of requirements for systems is given along with experimental support for the SL baseline. Basic design data covered include configuration definition, mission analysis, experimental integration, safety, and logistics. A technical summary outlines characteristics that reflect the influence of the growth in SL capability and the results of the mission and operational analysis. Each of the selected areas is described in terms of objectives, equipment, operational concept, and support requirements.

  13. Integrated sequence and immunology filovirus database at Los Alamos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yusim, Karina; Yoon, Hyejin; Foley, Brian

    The Ebola outbreak of 2013–15 infected more than 28,000 people and claimed more lives than all previous filovirus outbreaks combined. Governmental agencies, clinical teams, and the world scientific community pulled together in a multifaceted response ranging from prevention and disease control to evaluating vaccines and therapeutics in human trials. We report that as this epidemic is finally coming to a close, refocusing on long-term prevention strategies becomes paramount. Given the very real threat of future filovirus outbreaks, and the inherent uncertainty of the next outbreak virus and geographic location, it is prudent to consider the extent and implications of known natural diversity in advancing vaccines and therapeutic approaches. To facilitate such consideration, we have updated and enhanced the content of the filovirus portion of the Los Alamos Hemorrhagic Fever Viruses Database. We have integrated and performed baseline analysis of all family Filoviridae sequences deposited into GenBank, with associated immune response data and metadata, and we have added new computational tools with web interfaces to assist users with analysis. Here, we (i) describe the main features of the updated database, (ii) provide integrated views and some basic analyses summarizing evolutionary patterns as they relate to geo-temporal data captured in the database, and (iii) highlight the most conserved regions in the proteome that may be useful for a T cell vaccine strategy.

  14. Integrated sequence and immunology filovirus database at Los Alamos

    DOE PAGES

    Yusim, Karina; Yoon, Hyejin; Foley, Brian; ...

    2016-01-01

    The Ebola outbreak of 2013–15 infected more than 28,000 people and claimed more lives than all previous filovirus outbreaks combined. Governmental agencies, clinical teams, and the world scientific community pulled together in a multifaceted response ranging from prevention and disease control to evaluating vaccines and therapeutics in human trials. We report that as this epidemic is finally coming to a close, refocusing on long-term prevention strategies becomes paramount. Given the very real threat of future filovirus outbreaks, and the inherent uncertainty of the next outbreak virus and geographic location, it is prudent to consider the extent and implications of known natural diversity in advancing vaccines and therapeutic approaches. To facilitate such consideration, we have updated and enhanced the content of the filovirus portion of the Los Alamos Hemorrhagic Fever Viruses Database. We have integrated and performed baseline analysis of all family Filoviridae sequences deposited into GenBank, with associated immune response data and metadata, and we have added new computational tools with web interfaces to assist users with analysis. Here, we (i) describe the main features of the updated database, (ii) provide integrated views and some basic analyses summarizing evolutionary patterns as they relate to geo-temporal data captured in the database, and (iii) highlight the most conserved regions in the proteome that may be useful for a T cell vaccine strategy.

  15. Research of flaw image collecting and processing technology based on multi-baseline stereo imaging

    NASA Astrophysics Data System (ADS)

    Yao, Yong; Zhao, Jiguang; Pang, Xiaoyan

    2008-03-01

    Aiming at the practical demands of gun bore flaw image collection, such as accurate optimal design, complex algorithms, and precise technical requirements, this paper presents the design framework of a 3-D image collecting and processing system based on multi-baseline stereo imaging. The system mainly comprises a computer, an electrical control box, a stepping motor, and a CCD camera, and it performs image collection, stereo matching, 3-D information reconstruction, and post-processing. Theoretical analysis and experimental results show that the images collected by this system are precise and that the multi-baseline approach efficiently resolves the matching ambiguity produced by uniform or repetitive textures. At the same time, the system offers faster measurement speed and higher measurement precision.

  16. 48 CFR 1334.202 - Integrated baseline reviews.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... CATEGORIES OF CONTRACTING MAJOR SYSTEM ACQUISITION Earned Value Management System 1334.202 Integrated baseline reviews. An Integrated Baseline Review shall be conducted when an Earned Value Management System... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Integrated baseline...

  17. 48 CFR 1334.202 - Integrated baseline reviews.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... CATEGORIES OF CONTRACTING MAJOR SYSTEM ACQUISITION Earned Value Management System 1334.202 Integrated baseline reviews. An Integrated Baseline Review shall be conducted when an Earned Value Management System... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Integrated baseline...

  18. 48 CFR 1334.202 - Integrated baseline reviews.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CATEGORIES OF CONTRACTING MAJOR SYSTEM ACQUISITION Earned Value Management System 1334.202 Integrated baseline reviews. An Integrated Baseline Review shall be conducted when an Earned Value Management System... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Integrated baseline...

  19. 48 CFR 1334.202 - Integrated baseline reviews.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... CATEGORIES OF CONTRACTING MAJOR SYSTEM ACQUISITION Earned Value Management System 1334.202 Integrated baseline reviews. An Integrated Baseline Review shall be conducted when an Earned Value Management System... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Integrated baseline...

  20. 48 CFR 1334.202 - Integrated baseline reviews.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... CATEGORIES OF CONTRACTING MAJOR SYSTEM ACQUISITION Earned Value Management System 1334.202 Integrated baseline reviews. An Integrated Baseline Review shall be conducted when an Earned Value Management System... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Integrated baseline...

  1. Information support of monitoring of technical condition of buildings in construction risk area

    NASA Astrophysics Data System (ADS)

    Skachkova, M. E.; Lepihina, O. Y.; Ignatova, V. V.

    2018-05-01

    The paper presents the results of research devoted to the development of a model of information support for monitoring the technical condition of buildings located in a construction risk area. Based on a visual and instrumental survey, as well as an analysis of existing approaches and techniques, attributive and cartographic databases have been created. These databases allow monitoring of defects and damage to buildings located within a 30-meter risk zone around the object under construction. A classification of the structures and defects of the surveyed buildings is presented. The functional capabilities of the developed model and the scope of its practical application are determined.

  2. Baseline Caesium-137 and Plutonium-239+240 inventory assessment for Central Europe

    NASA Astrophysics Data System (ADS)

    Meusburger, Katrin; Borelli, Pasquale; Evrard, Olivier; Ketterer, Michael; Mabit, Lionel; van Oost, Kristof; Alewell, Christine; Panagos, Panos

    2017-04-01

    Artificial fallout radionuclides (FRNs) such as Caesium-137 and Plutonium-239+240, released as products of the thermonuclear weapons testing that took place from the mid-1950s to the early 1980s and from nuclear power plant accidents (e.g. Chernobyl), are useful tools to quantify soil redistribution. In combination with geostatistics, FRNs may have the potential to bridge the gap between small-scale process-oriented studies and modelling that simplifies processes and effects over large spatial scales. An essential requirement for the application of FRNs as soil erosion tracers is the establishment of the baseline fallout at undisturbed sites before its comparison to the inventories found at sites undergoing erosion/accumulation. For this purpose, undisturbed topsoil (0-20 cm) samples collected in 2009 within the framework of the Land Use/Cover Area frame Survey (LUCAS) have been measured by gamma-spectrometry and ICP-MS to determine 137Cs (n=145) and 239+240Pu (n=108) activities. To restrict the analysis to undisturbed reference sites, a geospatial database query was applied, selecting only sites having a slope angle < 2 degrees, lying outside riparian zones (to avoid depositional sites), and under permanent grassland cover (according to CORINE Land Cover and Landsat). This study reports preliminary results on the feasibility of establishing a 137Cs and 239+240Pu baseline inventory map for Central Europe. The 137Cs/239+240Pu activity ratios will further allow assessing the rate and the spatial variability of 137Cs Chernobyl fallout. The establishment of such a baseline inventory map will provide a unique opportunity to assess soil redistribution for a comparable time-frame (1953-2009) following a harmonised methodological protocol across national boundaries.
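
The reference-site query described above (slope angle < 2 degrees, outside riparian zones, permanent grassland cover) can be sketched as a plain attribute filter. The field names below are illustrative placeholders, not those of the LUCAS database.

```python
def select_reference_sites(sites):
    """Filter candidate sampling sites down to undisturbed reference sites.

    Criteria follow the study's query: gentle slopes (< 2 degrees, to avoid
    erosion/deposition), outside riparian zones, and under permanent
    grassland cover. Field names are illustrative placeholders.
    """
    return [
        s for s in sites
        if s["slope_deg"] < 2.0
        and not s["riparian"]
        and s["land_cover"] == "permanent_grassland"
    ]

candidates = [
    {"id": 1, "slope_deg": 1.0, "riparian": False, "land_cover": "permanent_grassland"},
    {"id": 2, "slope_deg": 5.0, "riparian": False, "land_cover": "permanent_grassland"},
    {"id": 3, "slope_deg": 1.5, "riparian": True,  "land_cover": "permanent_grassland"},
    {"id": 4, "slope_deg": 0.5, "riparian": False, "land_cover": "cropland"},
]
reference = select_reference_sites(candidates)  # only site 1 passes all three criteria
```

In practice such a query would run against a spatial database with slope rasters and land-cover layers, but the selection logic reduces to exactly this conjunction of attribute tests.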

  3. An Online Database for Informing Ecological Network Models: http://kelpforest.ucsc.edu

    PubMed Central

    Beas-Luna, Rodrigo; Novak, Mark; Carr, Mark H.; Tinker, Martin T.; Black, August; Caselle, Jennifer E.; Hoban, Michael; Malone, Dan; Iles, Alison

    2014-01-01

    Ecological network models and analyses are recognized as valuable tools for understanding the dynamics and resiliency of ecosystems, and for informing ecosystem-based approaches to management. However, few databases exist that can provide the life history, demographic and species interaction information necessary to parameterize ecological network models. Faced with the difficulty of synthesizing the information required to construct models for kelp forest ecosystems along the West Coast of North America, we developed an online database (http://kelpforest.ucsc.edu/) to facilitate the collation and dissemination of such information. Many of the database's attributes are novel, yet the structure is applicable and adaptable to other ecosystem modeling efforts. Information for each taxonomic unit includes stage-specific life history, demography, and body-size allometries. Species interactions include trophic, competitive, facilitative, and parasitic forms. Each data entry is temporally and spatially explicit. The online data entry interface allows researchers anywhere to contribute and access information. Quality control is facilitated by attributing each entry to unique contributor identities and source citations. The database has proven useful as an archive of species and ecosystem-specific information in the development of several ecological network models, for informing management actions, and for education purposes (e.g., undergraduate and graduate training). To facilitate adaptation of the database by other researchers for other ecosystems, the code and technical details on how to customize this database and apply it to other ecosystems are freely available and located at the following link (https://github.com/kelpforest-cameo/databaseui). PMID:25343723

  4. An online database for informing ecological network models: http://kelpforest.ucsc.edu

    USGS Publications Warehouse

    Beas-Luna, Rodrigo; Tinker, M. Tim; Novak, Mark; Carr, Mark H.; Black, August; Caselle, Jennifer E.; Hoban, Michael; Malone, Dan; Iles, Alison C.

    2014-01-01

    Ecological network models and analyses are recognized as valuable tools for understanding the dynamics and resiliency of ecosystems, and for informing ecosystem-based approaches to management. However, few databases exist that can provide the life history, demographic and species interaction information necessary to parameterize ecological network models. Faced with the difficulty of synthesizing the information required to construct models for kelp forest ecosystems along the West Coast of North America, we developed an online database (http://kelpforest.ucsc.edu/) to facilitate the collation and dissemination of such information. Many of the database's attributes are novel, yet the structure is applicable and adaptable to other ecosystem modeling efforts. Information for each taxonomic unit includes stage-specific life history, demography, and body-size allometries. Species interactions include trophic, competitive, facilitative, and parasitic forms. Each data entry is temporally and spatially explicit. The online data entry interface allows researchers anywhere to contribute and access information. Quality control is facilitated by attributing each entry to unique contributor identities and source citations. The database has proven useful as an archive of species and ecosystem-specific information in the development of several ecological network models, for informing management actions, and for education purposes (e.g., undergraduate and graduate training). To facilitate adaptation of the database by other researchers for other ecosystems, the code and technical details on how to customize this database and apply it to other ecosystems are freely available and located at the following link (https://github.com/kelpforest-cameo/databaseui).

  5. Federated Web-accessible Clinical Data Management within an Extensible NeuroImaging Database

    PubMed Central

    Keator, David B.; Wei, Dingying; Fennema-Notestine, Christine; Pease, Karen R.; Bockholt, Jeremy; Grethe, Jeffrey S.

    2010-01-01

    Managing vast datasets collected throughout multiple clinical imaging communities has become critical with the ever-increasing and diverse nature of datasets. Development of data management infrastructure is further complicated by technical and experimental advances that drive modifications to existing protocols and acquisition of new types of research data to be incorporated into existing data management systems. In this paper, an extensible data management system for clinical neuroimaging studies is introduced: The Human Clinical Imaging Database (HID) and Toolkit. The database schema is constructed to support the storage of new data types without changes to the underlying schema. The complex infrastructure allows management of experiment data, such as image protocol and behavioral task parameters, as well as subject-specific data, including demographics, clinical assessments, and behavioral task performance metrics. Of significant interest, embedded clinical data entry and management tools enhance both consistency of data reporting and automatic entry of data into the database. The Clinical Assessment Layout Manager (CALM) allows users to create on-line data entry forms for use within and across sites, through which data is pulled into the underlying database via the generic clinical assessment management engine (GAME). Importantly, the system is designed to operate in a distributed environment, serving both human users and client applications in a service-oriented manner. Querying capabilities use a built-in multi-database parallel query builder/result combiner, allowing web-accessible queries within and across multiple federated databases. The system along with its documentation is open-source and available from the Neuroimaging Informatics Tools and Resource Clearinghouse (NITRC) site. PMID:20567938
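
Storing new data types without changes to the underlying schema, as the HID does, is commonly achieved with an entity-attribute-value (EAV) layout. The sqlite3 sketch below illustrates the general idea only; it is not the actual HID schema, and the table and field names are hypothetical.

```python
import sqlite3

# Minimal entity-attribute-value (EAV) layout: each row stores one named
# value for one subject, so a brand-new assessment type needs no ALTER TABLE.
# This is a sketch of the general pattern, not the actual HID schema.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE assessment (subject_id TEXT, name TEXT, value TEXT)"
)

# Demographics and a newly introduced task metric share the same table.
rows = [
    ("subj01", "age", "34"),
    ("subj01", "handedness", "right"),
    ("subj01", "stroop_rt_ms", "612"),  # new metric: no schema change needed
]
conn.executemany("INSERT INTO assessment VALUES (?, ?, ?)", rows)

cur = conn.execute(
    "SELECT value FROM assessment WHERE subject_id = ? AND name = ?",
    ("subj01", "stroop_rt_ms"),
)
print(cur.fetchone()[0])  # prints 612
```

The trade-off of EAV is well known: flexibility at ingest, at the cost of typed validation and more elaborate queries, which is why systems like the one described pair such storage with managed entry forms.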

  6. Database Access Systems.

    ERIC Educational Resources Information Center

    Dalrymple, Prudence W.; Roderer, Nancy K.

    1994-01-01

    Highlights the changes that have occurred from 1987-93 in database access systems. Topics addressed include types of databases, including CD-ROMs; enduser interface; database selection; database access management, including library instruction and use of primary literature; economic issues; database users; the search process; and improving…

  7. Satellite-Surface Perspectives of Air Quality and Aerosol-Cloud Effects on the Environment: An Overview of 7-SEAS BASELInE

    NASA Technical Reports Server (NTRS)

    Tsay, Si-Chee; Maring, Hal B.; Lin, Neng-Huei; Buntoung, Sumaman; Chantara, Somporn; Chuang, Hsiao-Chi; Gabriel, Philip M.; Goodloe, Colby S.; Holben, Brent N.; Hsiao, Ta-Chih

    2016-01-01

    The objectives of the 7-SEAS BASELInE (Seven SouthEast Asian Studies Biomass-burning Aerosols and Stratocumulus Environment: Lifecycles and Interactions Experiment) campaigns in spring 2013-2015 were to synergize measurements from uniquely distributed ground-based networks (e.g., AERONET (AErosol RObotic NETwork) and MPLNET (NASA Micro-Pulse Lidar Network)) and sophisticated platforms (e.g., SMARTLabs (Surface-based Mobile Atmospheric Research and Testbed Laboratories) and regional contributing instruments), along with satellite observations and retrievals and regional atmospheric transport and chemistry models, to establish a critically needed database and to advance our understanding of biomass-burning aerosols and trace gases in Southeast Asia (SEA). We present a satellite-surface perspective of 7-SEAS BASELInE and highlight scientific findings concerning: (1) regional meteorology of moisture fields conducive to the production and maintenance of low-level stratiform clouds over land; (2) atmospheric composition in a biomass-burning environment, particularly tracers and markers that serve as important indicators for assessing the state and evolution of atmospheric constituents; (3) applications of remote sensing to air quality and its impact on radiative energetics, examining the effect of diurnal variability of boundary-layer height on aerosol loading; (4) aerosol hygroscopicity and ground-based cloud radar measurements in aerosol-cloud processes studied by advanced cloud ensemble models; and (5) implications of air quality, in terms of the toxicity of nanoparticles and trace gases, for human health. This volume is the third 7-SEAS special issue (after Atmospheric Research, vol. 122, 2013; and Atmospheric Environment, vol. 78, 2013) and includes 27 published papers, with emphasis on air quality and aerosol-cloud effects on the environment. BASELInE observations of stratiform clouds over SEA are unique: such clouds are embedded in a heavy aerosol-laden environment and feature characteristically greater

  8. "Mr. Database" : Jim Gray and the History of Database Technologies.

    PubMed

    Hanwahr, Nils C

    2017-12-01

    Although the widespread use of the term "Big Data" is comparatively recent, it invokes a phenomenon in the developments of database technology with distinct historical contexts. The database engineer Jim Gray, known as "Mr. Database" in Silicon Valley before his disappearance at sea in 2007, was involved in many of the crucial developments since the 1970s that constitute the foundation of exceedingly large and distributed databases. Jim Gray was involved in the development of relational database systems based on the concepts of Edgar F. Codd at IBM in the 1970s before he went on to develop principles of Transaction Processing that enable the parallel and highly distributed performance of databases today. He was also involved in creating forums for discourse between academia and industry, which influenced industry performance standards as well as database research agendas. As a co-founder of the San Francisco branch of Microsoft Research, Gray increasingly turned toward scientific applications of database technologies, e.g., leading the TerraServer project, an online database of satellite images. Inspired by Vannevar Bush's idea of the memex, Gray laid out his vision of a Personal Memex as well as a World Memex, eventually postulating a new era of data-based scientific discovery termed "Fourth Paradigm Science". This article gives an overview of Gray's contributions to the development of database technology as well as his research agendas and shows that central notions of Big Data have been occupying database engineers for much longer than the actual term has been in use.

  9. 40 CFR 80.92 - Baseline auditor requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Baseline auditor requirements. 80.92... (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Anti-Dumping § 80.92 Baseline auditor requirements. (a... determination methodology, resulting baseline fuel parameter, volume and emissions values verified by an auditor...

  10. 10 CFR 850.20 - Baseline beryllium inventory.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Baseline beryllium inventory. 850.20 Section 850.20 Energy... Baseline beryllium inventory. (a) The responsible employer must develop a baseline inventory of the locations of beryllium operations and other locations of potential beryllium contamination, and identify the...

  11. Database on unstable rock slopes in Norway

    NASA Astrophysics Data System (ADS)

    Oppikofer, Thierry; Nordahl, Bo; Bunkholt, Halvor; Nicolaisen, Magnus; Hermanns, Reginald L.; Böhme, Martina; Yugsi Molina, Freddy X.

    2014-05-01

    Several large rockslides have occurred in historic times in Norway, causing many casualties. Most of these casualties were due to displacement waves triggered by a rock avalanche, affecting the coastlines of entire lakes and fjords. The Geological Survey of Norway performs systematic mapping of unstable rock slopes in Norway and has so far detected more than 230 unstable slopes with significant postglacial deformation. This systematic mapping aims to detect future rock avalanches before they occur. The registered unstable rock slopes are stored in a database on unstable rock slopes developed and maintained by the Geological Survey of Norway. The main aims of this database are (1) to serve as a national archive for unstable rock slopes in Norway; (2) to serve for data collection and storage during field mapping; (3) to provide decision-makers with hazard zones and other necessary information on unstable rock slopes for land-use planning and mitigation; and (4) to inform the public through an online map service. The database is organized hierarchically, with a main point for each unstable rock slope to which several feature classes and tables are linked. This main point feature class includes several general attributes of the unstable rock slopes, such as site name, general and geological descriptions, executed works, recommendations, technical parameters (volume, lithology, mechanism and others), displacement rates, possible consequences, hazard and risk classification and so on. Feature classes and tables linked to the main feature class include the run-out area, the area affected by secondary effects, the hazard and risk classification, subareas and scenarios of an unstable rock slope, field observation points, displacement measurement stations, URL links for further documentation, and references. The database on unstable rock slopes in Norway will be publicly consultable through the online map service on www.skrednett.no in 2014. Only publicly relevant parts of

  12. Airport databases for 3D synthetic-vision flight-guidance displays: database design, quality assessment, and data generation

    NASA Astrophysics Data System (ADS)

    Friedrich, Axel; Raabe, Helmut; Schiefele, Jens; Doerr, Kai Uwe

    1999-07-01

    In future aircraft cockpit designs, SVS (Synthetic Vision System) databases will be used to display 3D physical and virtual information to pilots. In contrast to pure warning systems (TAWS, MSAW, EGPWS), SVS serve to enhance pilot spatial awareness through 3-dimensional perspective views of the objects in the environment. Therefore, all kinds of aeronautically relevant data have to be integrated into the SVS database: navigation data, terrain data, obstacles, and airport data. For the integration of all these data, the concept of a GIS (Geographical Information System) based HQDB (High-Quality-Database) has been created at the TUD (Technical University Darmstadt). To enable database certification, quality-assessment procedures according to ICAO Annex 4, 11, 14 and 15 and RTCA DO-200A/EUROCAE ED76 were established in the concept. They can be differentiated into object-related quality-assessment methods following the keywords accuracy, resolution, timeliness, traceability, assurance level, completeness and format, and GIS-related quality-assessment methods with the keywords system tolerances, logical consistency and visual quality assessment. An airport database is integrated in the concept as part of the High-Quality-Database. The contents of the HQDB are chosen so that they support both Flight-Guidance-SVS and other aeronautical applications such as SMGCS (Surface Movement and Guidance Systems) and flight simulation. Most airport data are not available. Even though data for runways, thresholds, taxilines and parking positions were to be generated by the end of 1997 (ICAO Annex 11 and 15), only a few countries fulfilled these requirements. For that reason, methods of creating and certifying airport data have to be found. Remote sensing and digital photogrammetry serve as means to acquire large amounts of airport objects with high spatial resolution and accuracy in much shorter time than with classical surveying methods. Remotely sensed images can be acquired from satellite

  13. Integrated Electronic Health Record Database Management System: A Proposal.

    PubMed

    Schiza, Eirini C; Panos, George; David, Christiana; Petkov, Nicolai; Schizas, Christos N

    2015-01-01

    eHealth has attained significant importance as a new mechanism for health management and medical practice. However, the technological growth of eHealth is still limited by the technical expertise needed to develop appropriate products. Researchers are constantly developing and testing new software for building and handling clinical medical records, now termed Electronic Health Record (EHR) systems; EHRs take full advantage of technological developments and at the same time provide increased diagnostic and treatment capabilities to doctors. A step to be considered for facilitating this aim is to involve the doctor more actively in building the fundamental steps for creating the EHR system and database. A global clinical patient record database management system can be created electronically by simulating real-life medical practice health record taking, and by utilizing and analyzing the recorded parameters. This proposed approach demonstrates the effective implementation of a universal classic medical record in electronic form, a procedure by which clinicians are led to utilize algorithms and intelligent systems for their differential diagnosis, final diagnosis, and treatment strategies.

  14. 33 CFR 2.20 - Territorial sea baseline.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false Territorial sea baseline. 2.20... JURISDICTION Jurisdictional Terms § 2.20 Territorial sea baseline. Territorial sea baseline means the line defining the shoreward extent of the territorial sea of the United States drawn according to the principles...

  15. 33 CFR 2.20 - Territorial sea baseline.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Territorial sea baseline. 2.20... JURISDICTION Jurisdictional Terms § 2.20 Territorial sea baseline. Territorial sea baseline means the line defining the shoreward extent of the territorial sea of the United States drawn according to the principles...

  16. 33 CFR 2.20 - Territorial sea baseline.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false Territorial sea baseline. 2.20... JURISDICTION Jurisdictional Terms § 2.20 Territorial sea baseline. Territorial sea baseline means the line defining the shoreward extent of the territorial sea of the United States drawn according to the principles...

  17. 33 CFR 2.20 - Territorial sea baseline.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false Territorial sea baseline. 2.20... JURISDICTION Jurisdictional Terms § 2.20 Territorial sea baseline. Territorial sea baseline means the line defining the shoreward extent of the territorial sea of the United States drawn according to the principles...

  18. 33 CFR 2.20 - Territorial sea baseline.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 1 2014-07-01 2014-07-01 false Territorial sea baseline. 2.20... JURISDICTION Jurisdictional Terms § 2.20 Territorial sea baseline. Territorial sea baseline means the line defining the shoreward extent of the territorial sea of the United States drawn according to the principles...

  19. A new estimator for VLBI baseline length repeatability

    NASA Astrophysics Data System (ADS)

    Titov, O.

    2009-11-01

    The goal of this paper is to introduce a more effective technique to approximate the “repeatability-baseline length” relationship that is used to evaluate the quality of geodetic VLBI results. Traditionally, this relationship is approximated by a quadratic function of baseline length over all baselines. The new model incorporates the mean number of observed group delays of the reference radio sources (i.e. those estimated as global parameters) used in the estimation of each baseline. It is shown that the new method provides a better approximation of the “repeatability-baseline length” relationship than the traditional model. A further development of the new approach comes down to modeling the repeatability as a function of two parameters: baseline length and baseline slewing rate. Within the framework of this new approach, the station vertical and horizontal uncertainties can be treated as a function of baseline length. While the previous relationship indicated that the station vertical uncertainties are generally 4-5 times larger than the horizontal uncertainties, the vertical uncertainties as determined by the new method are only larger by a factor of 1.44 over all baseline lengths.
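    The traditional approximation described above, a quadratic function of baseline length fitted over all baselines, can be sketched as follows. The baseline lengths and repeatability values here are synthetic illustrations, not actual geodetic VLBI results:

```python
import numpy as np

# Synthetic (illustrative) baseline lengths in km and RMS repeatabilities
# in mm; these numbers are assumptions, not measured VLBI data.
lengths = np.array([500.0, 1500.0, 3000.0, 5000.0, 7000.0, 9000.0, 11000.0])
repeat = np.array([2.1, 3.0, 4.8, 7.5, 11.0, 15.8, 21.9])

# Traditional model: repeatability ~ c0 + c1*L + c2*L**2 over all baselines.
c2, c1, c0 = np.polyfit(lengths, repeat, deg=2)
model = np.polyval([c2, c1, c0], lengths)

# RMS of the fit residuals, one way to compare candidate models.
rms = float(np.sqrt(np.mean((repeat - model) ** 2)))
```

    An estimator like the paper's, which adds the mean number of observed group delays per baseline, would replace this single-variable polynomial fit with a two-variable least-squares design matrix.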

  20. Baseline program

    NASA Technical Reports Server (NTRS)

    Roberts, Barney B.; Vonputtkamer, Jesco

    1992-01-01

    This assumed program was developed from several sources of information and is extrapolated over future decades using a set of reasonable assumptions based on incremental growth. The assumptions for the NASA baseline program are as follows: balanced emphasis in four domains; a constant level of activity; low to moderate real budget growth; maximum use of commonality; and realistic and practical technology development. The first domain is low Earth Orbit (LEO). Activities there are concentrated on the space station but extend on one side to Earth-pointing sensors for unmanned platforms and on the other to the launch and staging of unmanned solar system exploration missions. The second domain is geosynchronous Earth orbit (GEO) and cislunar space. Activities here include all GEO missions and operations, both unmanned and manned, and all transport of materials and crews between LEO and the vicinity of the Moon. The third domain is the Moon itself. Lunar activities are to include both orbiting and landing missions; the landings may be either unmanned or manned. The last domain is Mars. Missions to Mars will initially be unmanned but they will eventually be manned. Program elements and descriptions are discussed as are critiques of the NASA baseline.

  1. Pathogen Research Databases

    Science.gov Websites

    The Hepatitis C Virus (HCV) database project is funded by the Division of Microbiology and Infectious Diseases of the National Institute of Allergy and Infectious Diseases (NIAID). The HCV database project started as a spin-off from the HIV database project. There are two databases for HCV, a sequence database

  2. Statistical baseline assessment in cardiotocography.

    PubMed

    Agostinelli, Angela; Braccili, Eleonora; Marchegiani, Enrico; Rosati, Riccardo; Sbrollini, Agnese; Burattini, Luca; Morettini, Micaela; Di Nardo, Francesco; Fioretti, Sandro; Burattini, Laura

    2017-07-01

    Cardiotocography (CTG) is the most common non-invasive diagnostic technique to evaluate fetal well-being. It consists of the recording of fetal heart rate (FHR; bpm) and maternal uterine contractions. Among the main parameters characterizing FHR, the baseline (BL) is fundamental to determining fetal hypoxia and distress. In computerized applications, BL is typically computed as mean FHR±ΔFHR, with ΔFHR=8 bpm or ΔFHR=10 bpm, both values being experimentally fixed. In this context, the present work aims: to propose a statistical procedure for ΔFHR assessment; to quantitatively determine the ΔFHR value by applying this procedure to clinical data; and to compare the statistically determined ΔFHR value against the experimentally determined ΔFHR values. To these ends, the 552 recordings of the "CTU-UHB intrapartum CTG database" from Physionet were submitted to an automatic procedure, which consisted of an FHR preprocessing phase and a statistical BL assessment. During preprocessing, FHR time series were divided into 20-min sliding windows, in which missing data were removed by linear interpolation. Only windows with a correction rate lower than 10% were further processed for BL assessment, in which ΔFHR was computed as the FHR standard deviation. The total number of accepted windows was 1192 (38.5%) over 383 recordings (69.4%) with at least one accepted window. The statistically determined ΔFHR value was 9.7 bpm. This value was statistically different from 8 bpm (P<10^-19) but not from 10 bpm (P=0.16). Thus, ΔFHR=10 bpm is preferable to 8 bpm because it is both experimentally and statistically validated.
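    The stated procedure (20-min windows, linear interpolation of missing samples, rejection of windows with 10% or more corrections, ΔFHR taken as the FHR standard deviation) can be sketched as follows. This is an illustrative reimplementation, not the authors' code, and the 4 Hz sampling rate is an assumption:

```python
import numpy as np

def delta_fhr_per_window(fhr, fs=4.0, win_min=20, max_missing=0.10):
    """Return one Delta-FHR estimate (FHR standard deviation, bpm)
    per accepted 20-min window; missing samples are marked as NaN."""
    win = int(win_min * 60 * fs)               # samples per window
    deltas = []
    for start in range(0, len(fhr) - win + 1, win):
        seg = np.array(fhr[start:start + win], dtype=float)
        missing = np.isnan(seg)
        if missing.mean() >= max_missing:      # reject heavily corrected windows
            continue
        if missing.any():                      # linear interpolation of gaps
            idx = np.arange(win)
            seg[missing] = np.interp(idx[missing], idx[~missing], seg[~missing])
        deltas.append(float(seg.std()))        # Delta-FHR = standard deviation
    return deltas
```

    On a synthetic FHR trace with 10 bpm of variation around a 140 bpm mean, the returned values cluster near 10 bpm, by construction.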

  3. Infrared Database for Process Support Materials

    NASA Technical Reports Server (NTRS)

    Bennett, K. E.; Boothe, R. E.; Burns, H. D.

    2003-01-01

    Process support materials' compatibility with cleaning processes is critical to ensure final hardware cleanliness and that performance requirements are met. The previous discovery of potential contaminants in process materials shows the need for incoming materials testing and the establishment of a process materials database. The Contamination Control Team of the Materials, Processes, and Manufacturing (MP&M) Department at Marshall Space Flight Center (MSFC) has initiated the development of such an infrared (IR) database, called the MSFC Process Materials IR database, of the common process support materials used at MSFC. These process support materials include solvents, wiper cloths, gloves, bagging materials, etc. Testing includes evaluation of the potential of gloves, wiper cloths, and other items to transfer contamination to handled articles in the absence of solvent exposure, and the potential for solvent exposure to induce material degradation. This Technical Memorandum (TM) summarizes the initial testing completed through December 2002. It is anticipated that additional testing will be conducted, with updates provided in future TMs. Materials were analyzed using two different IR techniques: (1) dry transference and (2) liquid extraction testing. The first of these techniques utilized the Nicolet Magna 750 IR spectrometer outfitted with a horizontal attenuated total reflectance (HATR) crystal accessory. The region from 650 to 4,000 wavenumbers was analyzed, and 50 scans were performed per IR spectrum. A dry transference test was conducted by applying each sample with hand pressure to the HATR crystal to first obtain a spectrum of the parent material. The material was then removed from the HATR crystal and analyzed to determine the presence of any residues. Liquid samples, if volatile, were examined both prior to and following evaporation. The second technique was to perform an extraction test with each sample in five different solvents. Once the scans were complete for

  4. Impact of Medicare Advantage penetration and hospital competition on technical efficiency of nursing care in US intensive care units.

    PubMed

    Min, Ari; Scott, Linda D; Park, Chang; Vincent, Catherine; Ryan, Catherine J; Lee, Taewha

    2018-04-10

    This study aimed to evaluate technical efficiency of US intensive care units and determine the effects of environmental factors on technical efficiency in providing quality of nursing care. Data were obtained from the 2014 National Database of Nursing Quality Indicators and the Centers for Medicare and Medicaid Services. Data envelopment analysis was used to estimate technical efficiency for each intensive care unit. Multilevel modeling was used to determine the effects of environmental factors on technical efficiency. Overall, Medicare Advantage penetration and hospital competition in a market did not create pressure for intensive care units to become more efficient by reducing their inputs. However, these 2 environmental factors showed positive influences on technical efficiency in intensive care units with certain levels of technical efficiency. The implications of the study results for management strategies and health policy may vary according to the levels of technical efficiency in intensive care units. Further studies are needed to examine why and how intensive care units with particular levels of technical efficiency are differently affected by certain environmental factors. Copyright © 2018 John Wiley & Sons, Ltd.

  5. Contact lens assessment in youth: methods and baseline findings.

    PubMed

    Lam, Dawn Y; Kinoshita, Beth T; Jansen, Meredith E; Mitchell, G Lynn; Chalmers, Robin L; McMahon, Timothy T; Richdale, Kathryn; Sorbara, Luigina; Wagner, Heidi

    2011-06-01

    To describe the Contact Lens Assessment in Youth (CLAY) Study design and report baseline data for a multicenter, retrospective, observational chart review of child, teenage, and young adult soft contact lens (SCL) wearers. Clinical charts of SCL wearers aged 8 to 33 years were reviewed at six colleges of optometry. Data were captured retrospectively for eye care visits from January 2006 through September 2009. Patient demographics, SCL parameters, wearing schedules, care systems, and biomicroscopy findings and complications that interrupted SCL wear were entered into an online database. Charts from 3549 patients (14,276 visits) were reviewed; 78.8% were current SCL wearers and 21.2% were new fits. The age distribution was 8 to <13 years (n = 260, 7.3%), 13 to <18 years (n = 879, 24.8%), 18 to <26 years (n = 1,274, 36.0%), and 26 to <34 years (n = 1,136, 32.0%). The sample was 63.2% female and 37.7% college students. At baseline, 85.2% wore spherical SCLs, 13.5% torics, and 0.1% multifocals. Silicone hydrogel lenses were worn by 39.3% of the cohort. Daily wear was reported by 82.1%, whereas 17.9% reported any or occasional overnight wear. Multipurpose care systems were used by 78.1%, whereas another 9.9% indicated hydrogen peroxide solution use. These data represent the SCL prescribing and wearing patterns of child, teenage, and young adult SCL wearers who presented for eye care in North American academic clinics. They will provide insight into SCL utilization, change in SCL refractive correction, and risk factors for SCL-related complications by age group.

  6. Aptamer Database

    PubMed Central

    Lee, Jennifer F.; Hesselberth, Jay R.; Meyers, Lauren Ancel; Ellington, Andrew D.

    2004-01-01

    The aptamer database is designed to contain comprehensive sequence information on aptamers and unnatural ribozymes that have been generated by in vitro selection methods. Such data are not normally collected in ‘natural’ sequence databases, such as GenBank. Besides serving as a storehouse of sequences that may have diagnostic or therapeutic utility, the database serves as a valuable resource for theoretical biologists who describe and explore fitness landscapes. The database is updated monthly and is publicly available at http://aptamer.icmb.utexas.edu/. PMID:14681367

  7. Solving Relational Database Problems with ORDBMS in an Advanced Database Course

    ERIC Educational Resources Information Center

    Wang, Ming

    2011-01-01

    This paper introduces how to use the object-relational database management system (ORDBMS) to solve relational database (RDB) problems in an advanced database course. The purpose of the paper is to provide a guideline for database instructors who desire to incorporate the ORDB technology in their traditional database courses. The paper presents…

  8. Generalized Database Management System Support for Numeric Database Environments.

    ERIC Educational Resources Information Center

    Dominick, Wayne D.; Weathers, Peggy G.

    1982-01-01

    This overview of potential for utilizing database management systems (DBMS) within numeric database environments highlights: (1) major features, functions, and characteristics of DBMS; (2) applicability to numeric database environment needs and user needs; (3) current applications of DBMS technology; and (4) research-oriented and…

  9. The research infrastructure of Chinese foundations, a database for Chinese civil society studies

    PubMed Central

    Ma, Ji; Wang, Qun; Dong, Chao; Li, Huafang

    2017-01-01

    This paper provides technical details and user guidance on the Research Infrastructure of Chinese Foundations (RICF), a database of Chinese foundations, civil society, and social development in general. The structure of the RICF is deliberately designed and normalized according to the Three Normal Forms. The database schema consists of three major themes: foundations’ basic organizational profile (i.e., basic profile, board member, supervisor, staff, and related party tables), program information (i.e., program information, major program, program relationship, and major recipient tables), and financial information (i.e., financial position, financial activities, cash flow, activity overview, and large donation tables). The RICF’s data quality can be measured by four criteria: data source reputation and credibility, completeness, accuracy, and timeliness. Data records are properly versioned, allowing verification and replication for research purposes. PMID:28742065
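    A normalized layout of the kind described, with a basic-profile table and linked board-member and yearly financial tables, can be sketched in SQL. The table and column names below are illustrative assumptions, not the actual RICF schema:

```python
import sqlite3

# Illustrative normalized schema; names are assumptions, not the RICF schema.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE foundation (                 -- basic organizational profile
    foundation_id INTEGER PRIMARY KEY,
    name          TEXT NOT NULL,
    province      TEXT
);
CREATE TABLE board_member (               -- 1:N link back to the profile
    member_id     INTEGER PRIMARY KEY,
    foundation_id INTEGER NOT NULL REFERENCES foundation(foundation_id),
    member_name   TEXT NOT NULL
);
CREATE TABLE financial_position (         -- versioned by fiscal year
    foundation_id INTEGER NOT NULL REFERENCES foundation(foundation_id),
    fiscal_year   INTEGER NOT NULL,
    total_assets  REAL,
    PRIMARY KEY (foundation_id, fiscal_year)
);
""")
con.execute("INSERT INTO foundation VALUES (1, 'Example Foundation', 'Beijing')")
con.execute("INSERT INTO board_member VALUES (1, 1, 'A. Example')")
con.execute("INSERT INTO financial_position VALUES (1, 2016, 1.5e6)")
rows = con.execute(
    "SELECT f.name, b.member_name FROM foundation AS f "
    "JOIN board_member AS b USING (foundation_id)").fetchall()
```

    Keeping board members and yearly financials in their own tables, keyed back to the foundation, is the kind of decomposition the Three Normal Forms call for: no repeating groups in the profile row, and no fields depending on anything but the key.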

  10. Sources of Variance in Baseline Gene Expression in the Rodent Liver

    PubMed Central

    Corton, J. Christopher; Bushel, Pierre R.; Fostel, Jennifer; O'Lone, Raegan B.

    2012-01-01

    The use of gene expression profiling in both clinical and laboratory settings would be enhanced by better characterization of variation due to individual, environmental, and technical factors. Analysis of microarray data from untreated or vehicle-treated animals within the control arm of toxicogenomics studies has yielded useful information on baseline fluctuations in liver gene expression in the rodent. Here, studies that highlight the contributions of different factors to gene expression variability in the rodent liver are discussed, including a large meta-analysis of rat liver, which identified genes that vary in control animals in the absence of chemical treatment. The genes and pathways that are the most and least variable were identified in a number of these studies. Life stage, fasting, sex, diet, circadian rhythm, and liver lobe source can profoundly influence gene expression in the liver. Recognition of the biological and technical factors that contribute to variability of background gene expression can help the investigator design an experiment that maximizes sensitivity and reduces the influence of confounders that may lead to misinterpretation of genomic changes. The factors that contribute to variability in liver gene expression in rodents are likely analogous to those contributing to human interindividual variability in drug response and chemical toxicity. Identification of batteries of genes that are altered under a variety of background conditions could be used to predict responses to drugs and chemicals in appropriate models of the human liver. PMID:22230429

  11. Creating a literature database of low-calorie sweeteners and health studies: evidence mapping.

    PubMed

    Wang, Ding Ding; Shams-White, Marissa; Bright, Oliver John M; Parrott, J Scott; Chung, Mei

    2016-01-05

    Evidence mapping is an emerging tool used to systematically identify, organize and summarize the quantity and focus of scientific evidence on a broad topic, but there are currently no methodological standards. Using the topic of low-calorie sweeteners (LCS) and selected health outcomes, we describe the process of creating an evidence-map database and demonstrate several example descriptive analyses using this database. The process of creating an evidence-map database is described in detail. The steps include: developing a comprehensive literature search strategy, establishing study eligibility criteria and a systematic study selection process, extracting data, developing outcome groups with input from expert stakeholders and tabulating data using descriptive analyses. The database was uploaded onto SRDR™ (Systematic Review Data Repository), an open public data repository. Our final LCS evidence-map database included 225 studies, of which 208 were interventional studies and 17 were cohort studies. An example bubble plot was produced to display the evidence-map data and visualize research gaps according to four parameters: comparison types, population baseline health status, outcome groups, and study sample size. This plot indicated a lack of studies assessing appetite and dietary intake related outcomes using LCS with a sugar intake comparison in people with diabetes. Evidence mapping is an important tool for the contextualization of in-depth systematic reviews within broader literature and identifies gaps in the evidence base, which can be used to inform future research. An open evidence-map database has the potential to promote knowledge translation from nutrition science to policy.
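    The descriptive tabulation behind a bubble plot like the one described, with cells keyed by comparison type and outcome group and bubble size given by the summed sample size, can be sketched as follows. The records and field names are toy assumptions, not the SRDR export format:

```python
from collections import Counter

# Toy evidence-map records; fields mirror the four bubble-plot parameters.
studies = [
    {"design": "interventional", "comparison": "sugar",
     "outcome": "appetite", "n": 45},
    {"design": "interventional", "comparison": "placebo",
     "outcome": "body weight", "n": 120},
    {"design": "cohort", "comparison": "none",
     "outcome": "body weight", "n": 3400},
]

counts = Counter()   # number of studies per (comparison, outcome) cell
sizes = Counter()    # summed sample size per cell -> bubble area
for s in studies:
    key = (s["comparison"], s["outcome"])
    counts[key] += 1
    sizes[key] += s["n"]

# Cells absent from the cross-tabulation are the research gaps.
```

    Plotting `counts` on a comparison-by-outcome grid, with marker area scaled by `sizes`, reproduces the gap-finding view: empty cells mark combinations no study has addressed.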

  12. Large-baseline InSAR for precise topographic mapping: a framework for TanDEM-X large-baseline data

    NASA Astrophysics Data System (ADS)

    Pinheiro, Muriel; Reigber, Andreas; Moreira, Alberto

    2017-09-01

    The global Digital Elevation Model (DEM) resulting from the TanDEM-X mission provides information about the world topography with outstanding precision. In fact, performance analysis carried out with the already available data have shown that the global product is well within the requirements of 10 m absolute vertical accuracy and 2 m relative vertical accuracy for flat to moderate terrain. The mission's science phase took place from October 2014 to December 2015. During this phase, bistatic acquisitions with across-track separation between the two satellites up to 3.6 km at the equator were commanded. Since the relative vertical accuracy of InSAR derived elevation models is, in principle, inversely proportional to the system baseline, the TanDEM-X science phase opened the doors for the generation of elevation models with improved quality with respect to the standard product. However, the interferometric processing of the large-baseline data is troublesome due to the increased volume decorrelation and very high frequency of the phase variations. Hence, in order to fully profit from the increased baseline, sophisticated algorithms for the interferometric processing, and, in particular, for the phase unwrapping have to be considered. This paper proposes a novel dual-baseline region-growing framework for the phase unwrapping of the large-baseline interferograms. Results from two experiments with data from the TanDEM-X science phase are discussed, corroborating the expected increased level of detail of the large-baseline DEMs.

  13. Flight Technical Error Analysis of the SATS Higher Volume Operations Simulation and Flight Experiments

    NASA Technical Reports Server (NTRS)

    Williams, Daniel M.; Consiglio, Maria C.; Murdoch, Jennifer L.; Adams, Catherine H.

    2005-01-01

    This paper provides an analysis of Flight Technical Error (FTE) from recent SATS experiments, called the Higher Volume Operations (HVO) Simulation and Flight experiments, which NASA conducted to determine pilot acceptability of the HVO concept under normal operating conditions. Reported are FTE results from simulation and flight experiment data indicating the SATS HVO concept is viable and acceptable to low-time instrument-rated pilots when compared with today's system (baseline). Described is the comparative FTE analysis of lateral, vertical, and airspeed deviations from the baseline and SATS HVO experimental flight procedures. Based on the FTE analysis, all evaluation subjects, low-time instrument-rated pilots, flew the HVO procedures safely and proficiently in comparison to today's system. In all cases, the results of the flight experiment validated the results of the simulation experiment and confirm the utility of the simulation platform for comparative Human-in-the-Loop (HITL) studies of SATS HVO and baseline operations.

  14. Alaska Geochemical Database - Mineral Exploration Tool for the 21st Century - PDF of presentation

    USGS Publications Warehouse

    Granitto, Matthew; Schmidt, Jeanine M.; Labay, Keith A.; Shew, Nora B.; Gamble, Bruce M.

    2012-01-01

    The U.S. Geological Survey has created a geochemical database of geologic material samples collected in Alaska. This database is readily accessible to anyone with access to the Internet. Designed as a tool for mineral or environmental assessment, land management, or mineral exploration, the initial version of the Alaska Geochemical Database - U.S. Geological Survey Data Series 637 - contains geochemical, geologic, and geospatial data for 264,158 samples collected from 1962-2009: 108,909 rock samples; 92,701 sediment samples; 48,209 heavy-mineral-concentrate samples; 6,869 soil samples; and 7,470 mineral samples. In addition, the Alaska Geochemical Database contains mineralogic data for 18,138 nonmagnetic-fraction heavy mineral concentrates, making it the first U.S. Geological Survey database of this scope that contains both geochemical and mineralogic data. Examples from the Alaska Range will illustrate potential uses of the Alaska Geochemical Database in mineral exploration. Data from the Alaska Geochemical Database have been extensively checked for accuracy of sample media description, sample site location, and analytical method using U.S. Geological Survey sample-submittal archives and U.S. Geological Survey publications (plus field notebooks and sample site compilation base maps from the Alaska Technical Data Unit in Anchorage, Alaska). The database is also the repository for nearly all previously released U.S. Geological Survey Alaska geochemical datasets. Although the Alaska Geochemical Database is a fully relational database in Microsoft® Access 2003 and 2010 formats, these same data are also provided as a series of spreadsheet files in Microsoft® Excel 2003 and 2010 formats, and as ASCII text files. A DVD version of the Alaska Geochemical Database was released in October 2011, as U.S. Geological Survey Data Series 637, and data downloads are available at http://pubs.usgs.gov/ds/637/. Also, all Alaska Geochemical Database data have been incorporated into

  15. WEST VIRGINIA BASELINE

    EPA Science Inventory

    This report was prepared as part of the Ohio River Basin Energy Study (ORBES), a multidisciplinary policy research program supported by the Environmental Protection Agency. Its purpose is to provide baseline information on West Virginia, one of six states included partly or total...

  16. Database Administrator

    ERIC Educational Resources Information Center

    Moore, Pam

    2010-01-01

    The Internet and electronic commerce (e-commerce) generate lots of data. Data must be stored, organized, and managed. Database administrators, or DBAs, work with database software to find ways to do this. They identify user needs, set up computer databases, and test systems. They ensure that systems perform as they should and add people to the…

  17. FIREMON Database

    Treesearch

    John F. Caratti

    2006-01-01

    The FIREMON database software allows users to enter, store, analyze, and summarize plot data, photos, and related documents. The FIREMON database software consists of a Java application and a Microsoft® Access database. The Java application provides the user interface with FIREMON data through data entry forms, data summary reports, and other data management tools...

  18. Temporal and Fine-Grained Pedestrian Action Recognition on Driving Recorder Database

    PubMed Central

    Satoh, Yutaka; Aoki, Yoshimitsu; Oikawa, Shoko; Matsui, Yasuhiro

    2018-01-01

The paper presents the emerging issue of fine-grained pedestrian action recognition, which supports advanced pre-crash safety by estimating a pedestrian's intention in advance. Fine-grained pedestrian actions include visually slight differences (e.g., walking straight and crossing), which are difficult to distinguish from each other. Fine-grained action recognition is believed to enable the pedestrian intention estimation needed for helpful advanced driver-assistance systems (ADAS). The following difficulties must be addressed to achieve fine-grained and accurate pedestrian action recognition: (i) to analyze the fine-grained motion of a pedestrian appearing in a vehicle-mounted drive recorder, a method is needed that describes subtle changes of motion characteristics occurring in a short time; (ii) even when the background moves greatly due to the driving of the vehicle, subtle changes in the pedestrian's motion must still be detected; (iii) the collection of large-scale fine-grained actions is very difficult, and therefore a relatively small database must suffice. We show how to learn an effective recognition model with only a small-scale database, thoroughly evaluating several types of configurations to explore an effective approach to fine-grained pedestrian action recognition without a large-scale database. Moreover, two different datasets have been collected in order to raise the issue. Finally, our proposal attained 91.01% on the National Traffic Science and Environment Laboratory database (NTSEL) and 53.23% on the near-miss driving recorder database (NDRDB), improvements of +8.28% and +6.53% over baseline two-stream fusion convnets. PMID:29461473

  19. INDOT Technical Training Plan : [Technical Summary

    DOT National Transportation Integrated Search

    2012-01-01

A wide range of job classifications, increasing technical performance expectations, licensing and certification requirements, budget restrictions and frequent department reorganization has made technical training of employees more difficult, ...

  20. Baseline tests for arc melter vitrification of INEL buried wastes. Volume II: Baseline test data appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Oden, L.L.; O'Conner, W.K.; Turner, P.C.

    1993-11-19

This report presents field results and raw data from the Buried Waste Integrated Demonstration (BWID) Arc Melter Vitrification Project Phase 1 baseline test series conducted by the Idaho National Engineering Laboratory (INEL) in cooperation with the U.S. Bureau of Mines (USBM). The baseline test series was conducted using the electric arc melter facility at the USBM Albany Research Center in Albany, Oregon. Five different surrogate waste feed mixtures were tested that simulated thermally-oxidized, buried, TRU-contaminated, mixed wastes and soils present at the INEL. The USBM Arc Furnace Integrated Waste Processing Test Facility includes a continuous feed system, the arc melting furnace, an offgas control system, and utilities. The melter is a sealed, 3-phase alternating current (ac) furnace approximately 2 m high and 1.3 m wide. The furnace has a capacity of 1 metric ton of steel and can process as much as 1,500 lb/h of soil-type waste materials. The surrogate feed materials included five mixtures designed to simulate incinerated TRU-contaminated buried waste materials mixed with INEL soil. Process samples, melter system operations data and offgas composition data were obtained during the baseline tests to evaluate the melter performance and meet test objectives. Samples and data gathered during this program included (a) automatically and manually logged melter systems operations data, (b) process samples of slag, metal and fume solids, and (c) offgas composition, temperature, velocity, flowrate, moisture content, particulate loading and metals content. This report consists of 2 volumes: Volume I summarizes the baseline test operations. It includes an executive summary, system and facility description, review of the surrogate waste mixtures, and a description of the baseline test activities, measurements, and sample collection. Volume II contains the raw test data and sample analyses from samples collected during the baseline tests.

  1. A Non-technical User-Oriented Display Notation for XACML Conditions

    NASA Astrophysics Data System (ADS)

    Stepien, Bernard; Felty, Amy; Matwin, Stan

Ideally, access control to resources in complex IT systems ought to be handled by business decision makers who own a given resource (e.g., the pay and benefits section of an organization should decide and manage the access rules to the payroll system). To make this happen, the security and database communities need to develop vendor-independent access management tools, usable by decision makers rather than by technical personnel detached from a given business function. We have developed and implemented such a tool, based on XACML. XACML is an important emerging standard for managing complex access control applications. As a formal notation, based on an XML schema representing the grammar of a given application, XACML is precise and non-ambiguous. But this very property puts it out of reach of non-technical users. We propose a new notation for displaying and editing XACML rules that is independent of XML, and we develop an editor for it. Our notation combines a tree representation of logical expressions with an accessible natural language layer. Our early experience indicates that such rules can be grasped by non-technical users wishing to develop and control rules for accessing their own resources.
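The notation described above, a tree of logical connectives rendered in an accessible natural-language layer, can be illustrated with a small sketch. Everything below (class names, operator labels, the example rule) is invented for illustration and is not the paper's implementation or actual XACML:

```python
# Hypothetical sketch: rendering an access-control condition tree as
# indented natural language. Names and phrasing are illustrative only.

class Cond:
    def __init__(self, op, children=None, text=""):
        self.op = op          # "and", "or", or "leaf"
        self.children = children or []
        self.text = text      # natural-language phrase for a leaf condition

def render(node, depth=0):
    """Walk the condition tree and emit one indented line per node."""
    pad = "  " * depth
    if node.op == "leaf":
        return pad + node.text
    label = {"and": "ALL of:", "or": "ANY of:"}[node.op]
    lines = [pad + label]
    lines += [render(c, depth + 1) for c in node.children]
    return "\n".join(lines)

rule = Cond("and", [
    Cond("leaf", text="requester belongs to the pay-and-benefits section"),
    Cond("or", [
        Cond("leaf", text="request is made during business hours"),
        Cond("leaf", text="requester has manager approval"),
    ]),
])

print(render(rule))
```

A non-technical user sees only the indented "ALL of:/ANY of:" phrasing, while the tree structure keeps the logic unambiguous, the property the paper attributes to its XML-independent display notation.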

  2. Structure and needs of global loss databases about natural disaster

    NASA Astrophysics Data System (ADS)

    Steuer, Markus

    2010-05-01

Global loss databases are used for trend analyses and statistics in scientific projects, in studies for governmental and nongovernmental organizations, and in the insurance and finance industry as well. At the moment three global data sets are established: EM-DAT (CRED), Sigma (Swiss Re) and NatCatSERVICE (Munich Re). Together with the Asian Disaster Reduction Center (ADRC) and the United Nations Development Program (UNDP), these organizations started a collaborative initiative in 2007 with the aim to agree on and implement a common "Disaster Category Classification and Peril Terminology for Operational Databases". This common classification has been established through several technical meetings and working groups and represents a first and important step in the development of a standardized international classification of disasters and terminology of perils. Concretely, this means setting up a common hierarchy and terminology for all global and regional databases on natural disasters and establishing a common and agreed definition of disaster groups, main types and sub-types of events. Georeferencing, temporal aspects, methodology and sourcing are other issues that have been identified and will be discussed. The new structure for global loss databases has already been implemented for Munich Re NatCatSERVICE. In the following oral session we will show the structure of the global databases as defined and, in addition, give more transparency on the data sets behind published statistics and analyses. The special focus will be on the catastrophe classification from a moderate loss event up to a great natural catastrophe, on the quality of sources, and on inside information about the assessment of overall and insured losses. Keywords: disaster category classification, peril terminology, overall and insured losses, definition

  3. An integrated data-analysis and database system for AMS 14C

    NASA Astrophysics Data System (ADS)

    Kjeldsen, Henrik; Olsen, Jesper; Heinemeier, Jan

    2010-04-01

    AMSdata is the name of a combined database and data-analysis system for AMS 14C and stable-isotope work that has been developed at Aarhus University. The system (1) contains routines for data analysis of AMS and MS data, (2) allows a flexible and accurate description of sample extraction and pretreatment, also when samples are split into several fractions, and (3) keeps track of all measured, calculated and attributed data. The structure of the database is flexible and allows an unlimited number of measurement and pretreatment procedures. The AMS 14C data analysis routine is fairly advanced and flexible, and it can be easily optimized for different kinds of measuring processes. Technically, the system is based on a Microsoft SQL server and includes stored SQL procedures for the data analysis. Microsoft Office Access is used for the (graphical) user interface, and in addition Excel, Word and Origin are exploited for input and output of data, e.g. for plotting data during data analysis.

  4. Comparison of hybrid and baseline ELMy H-mode confinement in JET with the carbon wall

    NASA Astrophysics Data System (ADS)

    Beurskens, M. N. A.; Frassinetti, L.; Challis, C.; Osborne, T.; Snyder, P. B.; Alper, B.; Angioni, C.; Bourdelle, C.; Buratti, P.; Crisanti, F.; Giovannozzi, E.; Giroud, C.; Groebner, R.; Hobirk, J.; Jenkins, I.; Joffrin, E.; Leyland, M. J.; Lomas, P.; Mantica, P.; McDonald, D.; Nunes, I.; Rimini, F.; Saarelma, S.; Voitsekhovitch, I.; de Vries, P.; Zarzoso, D.; Contributors, JET-EFDA

    2013-01-01

The confinement in JET baseline type I ELMy H-mode plasmas is compared to that in so-called hybrid H-modes in a database study of 112 plasmas in JET with the carbon fibre composite (CFC) wall. The baseline plasmas typically have βN ~ 1.5-2, H98 ~ 1, whereas the hybrid plasmas have βN ~ 2.5-3, H98 < 1.5. The database study contains both low- (δ ~ 0.2-0.25) and high-triangularity (δ ~ 0.4) hybrid and baseline H-mode plasmas from the last JET operational campaigns in the CFC wall from the period 2008-2009. Based on a detailed confinement study of the global as well as the pedestal and core confinement, there is no evidence that the hybrid and baseline plasmas form separate confinement groups; it emerges that the transition between the two scenarios is of a gradual kind rather than demonstrating a bifurcation in the confinement. The elevated confinement enhancement factor H98 in the hybrid plasmas may possibly be explained by the density dependence in the τ98 scaling as n^0.41 and the fact that the hybrid plasmas operate at low plasma density compared to the baseline ELMy H-mode plasmas. A separate regression on the confinement data in this study shows a reduction in the density dependence as n^(0.09±0.08). Furthermore, inclusion of the plasma toroidal rotation in the confinement regression provides a scaling with the toroidal Alfvén Mach number as Mach_A^(0.41±0.07) and again a reduced density dependence as n^(0.15±0.08). The differences in pedestal confinement can be explained on the basis of linear MHD stability through a coupling of the total and pedestal poloidal pressure and the pedestal performance can be improved through plasma shaping as well as high β operation. This has been confirmed in a comparison with the EPED1 predictive pedestal code which shows a good agreement between the predicted and measured pedestal pressure within 20-30% for a wide range of βN ~ 1.5-3.5. The core profiles show a strong degree of pressure profile consistency. No
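A power-law scaling such as the density dependence quoted above is typically obtained by ordinary least squares in log space. A minimal sketch on synthetic data (the generating constant, exponent and noise level below are invented, not the paper's measurements):

```python
# Sketch: fitting a power-law scaling tau = C * n**alpha by ordinary
# least squares in log space. All data are synthetic, generated with
# alpha = 0.4; nothing here comes from the JET database.

import numpy as np

rng = np.random.default_rng(0)
n = rng.uniform(1.0, 10.0, 50)                       # density (arbitrary units)
tau = 2.0 * n**0.4 * rng.lognormal(0.0, 0.05, 50)    # noisy synthetic tau

# log(tau) = log(C) + alpha * log(n) is linear in log(n)
alpha, logC = np.polyfit(np.log(n), np.log(tau), 1)
print(round(alpha, 2))   # close to the generating exponent 0.4
```

The multi-variable regressions in the abstract (density plus toroidal Alfvén Mach number) follow the same idea with several log-transformed regressors instead of one.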

  5. Computer-Aided Systems Engineering for Flight Research Projects Using a Workgroup Database

    NASA Technical Reports Server (NTRS)

    Mizukami, Masahi

    2004-01-01

    An online systems engineering tool for flight research projects has been developed through the use of a workgroup database. Capabilities are implemented for typical flight research systems engineering needs in document library, configuration control, hazard analysis, hardware database, requirements management, action item tracking, project team information, and technical performance metrics. Repetitive tasks are automated to reduce workload and errors. Current data and documents are instantly available online and can be worked on collaboratively. Existing forms and conventional processes are used, rather than inventing or changing processes to fit the tool. An integrated tool set offers advantages by automatically cross-referencing data, minimizing redundant data entry, and reducing the number of programs that must be learned. With a simplified approach, significant improvements are attained over existing capabilities for minimal cost. By using a workgroup-level database platform, personnel most directly involved in the project can develop, modify, and maintain the system, thereby saving time and money. As a pilot project, the system has been used to support an in-house flight experiment. Options are proposed for developing and deploying this type of tool on a more extensive basis.

  6. Comparison of the NCI open database with seven large chemical structural databases.

    PubMed

    Voigt, J H; Bienfait, B; Wang, S; Nicklaus, M C

    2001-01-01

    Eight large chemical databases have been analyzed and compared to each other. Central to this comparison is the open National Cancer Institute (NCI) database, consisting of approximately 250 000 structures. The other databases analyzed are the Available Chemicals Directory ("ACD," from MDL, release 1.99, 3D-version); the ChemACX ("ACX," from CamSoft, Version 4.5); the Maybridge Catalog and the Asinex database (both as distributed by CamSoft as part of ChemInfo 4.5); the Sigma-Aldrich Catalog (CD-ROM, 1999 Version); the World Drug Index ("WDI," Derwent, version 1999.03); and the organic part of the Cambridge Crystallographic Database ("CSD," from Cambridge Crystallographic Data Center, 1999 Version 5.18). The database properties analyzed are internal duplication rates; compounds unique to each database; cumulative occurrence of compounds in an increasing number of databases; overlap of identical compounds between two databases; similarity overlap; diversity; and others. The crystallographic database CSD and the WDI show somewhat less overlap with the other databases than those with each other. In particular the collections of commercial compounds and compilations of vendor catalogs have a substantial degree of overlap among each other. Still, no database is completely a subset of any other, and each appears to have its own niche and thus "raison d'être". The NCI database has by far the highest number of compounds that are unique to it. Approximately 200 000 of the NCI structures were not found in any of the other analyzed databases.
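The overlap statistics described above reduce to set operations once each compound is keyed by a canonical identifier. A hedged sketch with made-up data (the identifiers and the three-database sample are invented; the real analysis covered eight databases with up to ~250,000 structures each):

```python
# Illustrative sketch, not the paper's code: pairwise overlap and
# per-database unique counts, with compounds keyed by a canonical
# identifier (e.g. an InChIKey-like string). Sample data are made up.

databases = {
    "NCI": {"AAA", "BBB", "CCC", "DDD"},
    "ACD": {"BBB", "CCC", "EEE"},
    "WDI": {"CCC", "FFF"},
}

def overlap(a, b):
    """Number of identical compounds shared by databases a and b."""
    return len(databases[a] & databases[b])

def unique_to(name):
    """Compounds found in `name` and in no other analyzed database."""
    others = set().union(*(s for n, s in databases.items() if n != name))
    return databases[name] - others

print(overlap("NCI", "ACD"))       # shared compounds
print(sorted(unique_to("NCI")))    # compounds unique to NCI
```

In practice the hard part is not the set arithmetic but the normalization step that produces comparable identifiers (tautomers, salts, stereochemistry) before the sets are built.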

  7. PrimateLit Database

    Science.gov Websites

PrimateLit: a bibliographic database for primatology, supported by the National Center for Research Resources (NCRR), National Institutes of Health. The database is a collaborative project of the Wisconsin Primate Research Center. The PrimateLit database is no longer being updated.

  8. Data Model and Relational Database Design for Highway Runoff Water-Quality Metadata

    USGS Publications Warehouse

    Granato, Gregory E.; Tessler, Steven

    2001-01-01

A National highway and urban runoff water-quality metadatabase was developed by the U.S. Geological Survey in cooperation with the Federal Highway Administration as part of the National Highway Runoff Water-Quality Data and Methodology Synthesis (NDAMS). The database was designed to catalog available literature and to document results of the synthesis in a format that would facilitate current and future research on highway and urban runoff. This report documents the design and implementation of the NDAMS relational database, which was designed to provide a catalog of available information and the results of an assessment of the available data. All the citations and the metadata collected during the review process are presented in a stratified metadatabase that contains citations for relevant publications, abstracts (or previa), and report-review metadata for a sample of selected reports that document results of runoff quality investigations. The database is referred to as a metadatabase because it contains information about available data sets rather than a record of the original data. The database contains the metadata needed to evaluate and characterize how valid, current, complete, comparable, and technically defensible published and available information may be when evaluated for application to the different data-quality objectives as defined by decision makers. This database is a relational database, in that all information is ultimately linked to a given citation in the catalog of available reports. The main database file contains 86 tables consisting of 29 data tables, 11 association tables, and 46 domain tables. The data tables all link to a particular citation, and each data table is focused on one aspect of the information collected in the literature search and the evaluation of available information. This database is implemented in the Microsoft (MS) Access database software because it is widely used within and outside of government and is familiar to many
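The data/association/domain table pattern described above can be sketched in miniature with SQLite. The table and column names below are invented for illustration; the actual NDAMS database comprises 86 MS Access tables:

```python
# Minimal sketch of a citation-linked data/association/domain schema.
# Names are hypothetical, not the NDAMS schema.

import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
-- domain table: controlled vocabulary
CREATE TABLE matrix_domain (
    matrix_id   INTEGER PRIMARY KEY,
    matrix_name TEXT NOT NULL UNIQUE     -- e.g. 'runoff', 'sediment'
);
-- data table: the catalog every record ultimately links back to
CREATE TABLE citation (
    citation_id INTEGER PRIMARY KEY,
    title       TEXT NOT NULL
);
-- association table: many-to-many link between citations and matrices
CREATE TABLE citation_matrix (
    citation_id INTEGER REFERENCES citation(citation_id),
    matrix_id   INTEGER REFERENCES matrix_domain(matrix_id),
    PRIMARY KEY (citation_id, matrix_id)
);
""")
con.execute("INSERT INTO citation VALUES (1, 'Highway runoff study A')")
con.execute("INSERT INTO matrix_domain VALUES (1, 'runoff')")
con.execute("INSERT INTO citation_matrix VALUES (1, 1)")

rows = con.execute("""
    SELECT c.title, m.matrix_name
    FROM citation c
    JOIN citation_matrix cm ON cm.citation_id = c.citation_id
    JOIN matrix_domain m    ON m.matrix_id    = cm.matrix_id
""").fetchall()
print(rows)
```

Domain tables keep vocabularies consistent, association tables carry the many-to-many links, and every data row remains traceable to a citation, which is the property the report emphasizes.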

  9. Phenol-Explorer: an online comprehensive database on polyphenol contents in foods.

    PubMed

    Neveu, V; Perez-Jiménez, J; Vos, F; Crespy, V; du Chaffaut, L; Mennen, L; Knox, C; Eisner, R; Cruz, J; Wishart, D; Scalbert, A

    2010-01-01

    A number of databases on the plant metabolome describe the chemistry and biosynthesis of plant chemicals. However, no such database is specifically focused on foods and more precisely on polyphenols, one of the major classes of phytochemicals. As antioxidants, polyphenols influence human health and may play a role in the prevention of a number of chronic diseases such as cardiovascular diseases, some cancers or type 2 diabetes. To determine polyphenol intake in populations and study their association with health, it is essential to have detailed information on their content in foods. However this information is not easily collected due to the variety of their chemical structures and the variability of their content in a given food. Phenol-Explorer is the first comprehensive web-based database on polyphenol content in foods. It contains more than 37,000 original data points collected from 638 scientific articles published in peer-reviewed journals. The quality of these data has been evaluated before they were aggregated to produce final representative mean content values for 502 polyphenols in 452 foods. The web interface allows making various queries on the aggregated data to identify foods containing a given polyphenol or polyphenols present in a given food. For each mean content value, it is possible to trace all original content values and their literature sources. Phenol-Explorer is a major step forward in the development of databases on food constituents and the food metabolome. It should help researchers to better understand the role of phytochemicals in the technical and nutritional quality of food, and food manufacturers to develop tailor-made healthy foods. Database URL: http://www.phenol-explorer.eu.
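The two query directions the abstract describes, foods containing a given polyphenol and polyphenols present in a given food, with each mean traceable to its original values, can be sketched as follows. All compound names, foods, content values and source labels below are invented:

```python
# Hedged illustration of Phenol-Explorer-style queries over aggregated
# data with traceability to original values. Data are made up.

from statistics import mean

# (polyphenol, food) -> original content values (mg/100 g), each paired
# with a hypothetical literature source
original = {
    ("quercetin", "onion"): [(28.4, "ref 1"), (33.0, "ref 2")],
    ("quercetin", "apple"): [(4.0, "ref 3")],
    ("catechin", "apple"):  [(1.2, "ref 3"), (1.8, "ref 4")],
}

def mean_content(polyphenol, food):
    """Aggregated mean, computed from the traceable original values."""
    return mean(v for v, _ in original[(polyphenol, food)])

def foods_containing(polyphenol):
    return sorted({f for p, f in original if p == polyphenol})

def polyphenols_in(food):
    return sorted({p for p, f in original if f == food})

print(foods_containing("quercetin"))
print(polyphenols_in("apple"))
print(mean_content("quercetin", "onion"))
```

Because the mean is derived on demand from the stored originals, any aggregated value can be traced back to its source articles, the traceability feature the abstract highlights.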

  10. Phenol-Explorer: an online comprehensive database on polyphenol contents in foods

    PubMed Central

    Neveu, V.; Perez-Jiménez, J.; Vos, F.; Crespy, V.; du Chaffaut, L.; Mennen, L.; Knox, C.; Eisner, R.; Cruz, J.; Wishart, D.; Scalbert, A.

    2010-01-01

A number of databases on the plant metabolome describe the chemistry and biosynthesis of plant chemicals. However, no such database is specifically focused on foods and more precisely on polyphenols, one of the major classes of phytochemicals. As antioxidants, polyphenols influence human health and may play a role in the prevention of a number of chronic diseases such as cardiovascular diseases, some cancers or type 2 diabetes. To determine polyphenol intake in populations and study their association with health, it is essential to have detailed information on their content in foods. However this information is not easily collected due to the variety of their chemical structures and the variability of their content in a given food. Phenol-Explorer is the first comprehensive web-based database on polyphenol content in foods. It contains more than 37 000 original data points collected from 638 scientific articles published in peer-reviewed journals. The quality of these data has been evaluated before they were aggregated to produce final representative mean content values for 502 polyphenols in 452 foods. The web interface allows making various queries on the aggregated data to identify foods containing a given polyphenol or polyphenols present in a given food. For each mean content value, it is possible to trace all original content values and their literature sources. Phenol-Explorer is a major step forward in the development of databases on food constituents and the food metabolome. It should help researchers to better understand the role of phytochemicals in the technical and nutritional quality of food, and food manufacturers to develop tailor-made healthy foods. Database URL: http://www.phenol-explorer.eu PMID:20428313

  11. Predicting future learning from baseline network architecture.

    PubMed

    Mattar, Marcelo G; Wymbs, Nicholas F; Bock, Andrew S; Aguirre, Geoffrey K; Grafton, Scott T; Bassett, Danielle S

    2018-05-15

    Human behavior and cognition result from a complex pattern of interactions between brain regions. The flexible reconfiguration of these patterns enables behavioral adaptation, such as the acquisition of a new motor skill. Yet, the degree to which these reconfigurations depend on the brain's baseline sensorimotor integration is far from understood. Here, we asked whether spontaneous fluctuations in sensorimotor networks at baseline were predictive of individual differences in future learning. We analyzed functional MRI data from 19 participants prior to six weeks of training on a new motor skill. We found that visual-motor connectivity was inversely related to learning rate: sensorimotor autonomy at baseline corresponded to faster learning in the future. Using three additional scans, we found that visual-motor connectivity at baseline is a relatively stable individual trait. These results suggest that individual differences in motor skill learning can be predicted from sensorimotor autonomy at baseline prior to task execution. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.

  12. NSR Modeling Emission Baselines

    EPA Pesticide Factsheets

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  13. Genome databases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Courteau, J.

    1991-10-11

Since the Genome Project began several years ago, a plethora of databases have been developed or are in the works. They range from the massive Genome Data Base at Johns Hopkins University, the central repository of all gene mapping information, to small databases focusing on single chromosomes or organisms. Some are publicly available, others are essentially private electronic lab notebooks. Still others limit access to a consortium of researchers working on, say, a single human chromosome. An increasing number incorporate sophisticated search and analytical software, while others operate as little more than data lists. In consultation with numerous experts in the field, a list has been compiled of some key genome-related databases. The list was not limited to map and sequence databases but also included the tools investigators use to interpret and elucidate genetic data, such as protein sequence and protein structure databases. Because a major goal of the Genome Project is to map and sequence the genomes of several experimental animals, including E. coli, yeast, fruit fly, nematode, and mouse, the available databases for those organisms are listed as well. The author also includes several databases that are still under development - including some ambitious efforts that go beyond data compilation to create what are being called electronic research communities, enabling many users, rather than just one or a few curators, to add or edit the data and tag it as raw or confirmed.

  14. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

  15. Historical hydrology and database on flood events (Apulia, southern Italy)

    NASA Astrophysics Data System (ADS)

    Lonigro, Teresa; Basso, Alessia; Gentile, Francesco; Polemio, Maurizio

    2014-05-01

Historical data about floods represent an important tool for the comprehension of hydrological processes, the estimation of hazard scenarios for Civil Protection purposes, and rational land use management, especially in karstic areas where time series of river flows are not available and river drainage is rare. The research shows the importance of improving an existing flood database with an historical approach, aimed at collecting past or historical flood events in order to better assess the occurrence trend of floods, in this case for the Apulia region (southern Italy). The main source of records of flood events for Apulia was the AVI database (the acronym means Italian damaged areas), an existing Italian database that collects data concerning damaging floods from 1918 to 1996. The database was expanded by consulting newspapers, publications, and technical reports from 1996 to 2006. In order to extend the temporal range further, data were collected by searching the archives of regional libraries. About 700 useful news items from 17 different local newspapers were found from 1876 to 1951. From a critical analysis of the 700 news items collected from 1876 to 1952, only 437 were useful for the implementation of the Apulia database. The screening of these items showed the occurrence of about 122 flood events in the entire region. The district of Bari, the regional main town, represents the area in which the greatest number of events occurred; the historical analysis confirms this area as flood-prone. There is an overlapping period (from 1918 to 1952) between the old AVI database and the new historical dataset obtained from newspapers. With regard to this period, the historical research has highlighted new flood events not reported in the existing AVI database and has also allowed more details to be added to the events already recorded. This study shows that the database is a dynamic instrument which allows a continuous implementation of data, even in real time.

  16. NASA's experience in the international exchange of scientific and technical information in the aerospace field

    NASA Technical Reports Server (NTRS)

    Thibideau, Philip A.

    1990-01-01

    The early NASA international scientific and technical information exchange arrangements were usually detailed in correspondence with the librarians of the institutions involved. While this type of exchange grew to include some 200 organizations in 43 countries, NASA's main focus shifted to the relationship with the European Space Agency (ESA), which began in 1964. The NASA/ESA Tripartite Exchange Program provides more than 4000 technical reports from the NASA-produced Aerospace Database. The experience in the evolving cooperation between NASA and ESA has established the model for more recent exchange agreements with Israel, Australia, and Canada. The results of these agreements are made available to participating European organizations through the NASA File.

  17. The EPMI Malay Basin petroleum geology database: Design philosophy and keys to success

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Low, H.E.; Creaney, S.; Fairchild, L.H.

    1994-07-01

Esso Production Malaysia Inc. (EPMI) developed and populated a database containing information collected in the areas of basic well data: stratigraphy, lithology, facies; pressure, temperature, column/contacts; geochemistry, shows and stains, migration, fluid properties; maturation; seal; structure. Paradox was used as the database engine and query language, with links to ZYCOR ZMAP+ for mapping and SAS for data analysis. Paradox has a query language that is simple enough for users. The ability to link to good analytical packages was deemed more important than having the capability in the package. Important elements of design philosophy were included: (1) information on data quality had to be rigorously recorded; (2) raw and interpreted data were kept separate and clearly identified; (3) correlations between rock and chronostratigraphic surfaces were recorded; and (4) queries across technical boundaries had to be seamless.

  18. Spatial abilities and technical skills performance in health care: a systematic review.

    PubMed

    Langlois, Jean; Bellemare, Christian; Toulouse, Josée; Wells, George A

    2015-11-01

    The aim of this study was to conduct a systematic review and meta-analysis of the relationship between spatial abilities and technical skills performance in health care in beginners and to compare this relationship with those in intermediate and autonomous learners. Search criteria included 'spatial abilities' and 'technical skills'. Keywords related to these criteria were defined. A literature search was conducted to 20 December, 2013 in Scopus (including MEDLINE) and in several databases on EBSCOhost platforms (CINAHL Plus with Full Text, ERIC, Education Source and PsycINFO). Citations were obtained and reviewed by two independent reviewers. Articles related to retained citations were reviewed and a final list of eligible articles was determined. Articles were assessed for quality using the Scottish Intercollegiate Guidelines Network-50 assessment instrument. Data were extracted from articles in a systematic way. Correlations between spatial abilities test scores and technical skills performance were identified. A series of 8289 citations was obtained. Eighty articles were retained and fully reviewed, yielding 36 eligible articles. The systematic review found a tendency for spatial abilities to be negatively correlated with the duration of technical skills and positively correlated with the quality of technical skills performance in beginners and intermediate learners. Pooled correlations of studies were -0.46 (p = 0.03) and -0.38 (95% confidence interval [CI] -0.53 to -0.21) for duration and 0.33 (95% CI 0.20-0.44) and 0.41 (95% CI 0.26-0.54) for quality of technical skills performance in beginners and intermediate learners, respectively. However, correlations between spatial abilities test scores and technical skills performance were not statistically significant in autonomous learners. Spatial abilities are an important factor to consider in selecting and training individuals in technical skills in health care. © 2015 John Wiley & Sons Ltd.
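Pooled correlations with confidence intervals, as reported above, are conventionally computed with a sample-size-weighted Fisher z transformation. A sketch under that assumption (the input correlations and sample sizes below are invented, not the review's data):

```python
# Fisher z pooling of study correlations: transform each r with atanh,
# weight by n - 3, pool, then back-transform with tanh.
# The study data here are hypothetical.

import math

def pool_correlations(studies):
    """studies: list of (r, n) pairs; returns pooled r and a 95% CI."""
    weights = [n - 3 for _, n in studies]
    z = sum(w * math.atanh(r) for (r, _), w in zip(studies, weights)) / sum(weights)
    se = 1.0 / math.sqrt(sum(weights))            # SE of the pooled z
    lo, hi = z - 1.96 * se, z + 1.96 * se
    return math.tanh(z), (math.tanh(lo), math.tanh(hi))

r, (lo, hi) = pool_correlations([(0.30, 40), (0.45, 25), (0.35, 60)])
print(round(r, 2), (round(lo, 2), round(hi, 2)))
```

The variance of a Fisher-transformed correlation is approximately 1/(n - 3), which is why each study is weighted by n - 3 before pooling.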

  19. DataBase on Demand

    NASA Astrophysics Data System (ADS)

    Gaspar Aparicio, R.; Gomez, D.; Coterillo Coz, I.; Wojcik, D.

    2012-12-01

    At CERN a number of key database applications run on user-managed MySQL database services. The Database on Demand project was born out of an idea to provide the CERN user community with an environment to develop and run database services outside of the centralised Oracle-based database services. Database on Demand (DBoD) empowers users to perform certain actions that have traditionally been done by database administrators (DBAs), providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines; at present the open community version of MySQL and single-instance Oracle database servers are supported. This article describes the technology approach taken to face this challenge, the service level agreement (SLA) that the project provides, and possible evolution scenarios.

  20. Alignment of high-throughput sequencing data inside in-memory databases.

    PubMed

    Firnkorn, Daniel; Knaup-Gregori, Petra; Lorenzo Bermejo, Justo; Ganzinger, Matthias

    2014-01-01

    In times of high-throughput DNA sequencing techniques, performant analysis of DNA sequences is of high importance. Computer-supported DNA analysis is still a time-intensive task. In this paper we explore the potential of a new in-memory database technology, SAP's High Performance Analytic Appliance (HANA). We focus on read alignment as one of the first steps in DNA sequence analysis. In particular, we examined the widely used Burrows-Wheeler Aligner (BWA) and implemented stored procedures in both HANA and the free database system MySQL to compare execution time and memory management. To ensure that the results are comparable, MySQL was run in memory as well, utilizing its integrated memory engine for database table creation. We implemented stored procedures for exact and inexact searching of DNA reads within the reference genome GRCh37. Due to technical restrictions in SAP HANA concerning recursion, the inexact matching problem could not be implemented on this platform. Hence, performance analysis between HANA and MySQL was made by comparing the execution time of the exact search procedures. Here, HANA was approximately 27 times faster than MySQL, which indicates a high potential in the new in-memory concepts, leading to further developments of DNA analysis procedures in the future.
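
    The exact-search case the study benchmarks can be illustrated with a minimal sketch (plain Python rather than stored procedures, and a toy reference rather than GRCh37; a real aligner such as BWA uses a Burrows-Wheeler index instead of repeated scanning):

```python
def exact_align(reference: str, reads: list[str]) -> dict[str, list[int]]:
    """Find every exact occurrence of each read in the reference.

    A naive stand-in for the 'exact search' procedures the study compares;
    all positions of each read are collected by repeated scanning.
    """
    hits: dict[str, list[int]] = {}
    for read in reads:
        positions = []
        start = reference.find(read)
        while start != -1:
            positions.append(start)
            start = reference.find(read, start + 1)
        hits[read] = positions
    return hits

# Toy reference and reads (illustrative only, not GRCh37)
ref = "ACGTACGTGACCA"
result = exact_align(ref, ["ACGT", "GACC", "TTTT"])
```

    In the databases compared by the authors, the same scan is expressed as a stored procedure over in-memory tables, which is where the engines' execution speeds diverge.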

  1. [The Development and Application of the Orthopaedics Implants Failure Database Software Based on WEB].

    PubMed

    Huang, Jiahua; Zhou, Hai; Zhang, Binbin; Ding, Biao

    2015-09-01

    This article presents a new WEB-based failure database software for orthopaedics implants. The software is based on the B/S (browser/server) mode, with ASP dynamic web technology as its main development language to achieve data interactivity and a Microsoft Access database for storage; these mature technologies make the software easy to extend and upgrade. The design and development approach, the working process and functions of the software, and its main technical features are presented. With this software, many different types of failure events of orthopaedics implants can be stored, and the failure data can be statistically analyzed; at the macroscopic level, it can be used to evaluate the reliability of orthopaedics implants and operations, and ultimately to guide doctors in improving the level of clinical treatment.

  2. Improving Forsyth Technical Community College's Ability to Develop and Maintain Partnerships: Leveraging Technology to Develop Partnerships

    ERIC Educational Resources Information Center

    Murdock, Alan K.

    2017-01-01

    Forsyth Technical Community College (FTCC) faces a shortage of funding to meet the demands of students, faculty, staff and businesses. Through this practitioner research, the utilization of the college's current customer relationship management (CRM) database advanced. By leveraging technology, the researcher assisted the college in meeting the…

  3. Recognizing emotional speech in Persian: a validated database of Persian emotional speech (Persian ESD).

    PubMed

    Keshtiari, Niloofar; Kuhlmann, Michael; Eslami, Moharram; Klann-Delius, Gisela

    2015-03-01

    Research on emotional speech often requires valid stimuli for assessing perceived emotion through prosody and lexical content. To date, no comprehensive emotional speech database for Persian is officially available. The present article reports the process of designing, compiling, and evaluating a comprehensive emotional speech database for colloquial Persian. The database contains a set of 90 validated novel Persian sentences classified in five basic emotional categories (anger, disgust, fear, happiness, and sadness), as well as a neutral category. These sentences were validated in two experiments by a group of 1,126 native Persian speakers. The sentences were articulated by two native Persian speakers (one male, one female) in three conditions: (1) congruent (emotional lexical content articulated in a congruent emotional voice), (2) incongruent (neutral sentences articulated in an emotional voice), and (3) baseline (all emotional and neutral sentences articulated in neutral voice). The speech materials comprise about 470 sentences. The validity of the database was evaluated by a group of 34 native speakers in a perception test. Utterances recognized better than five times chance performance (71.4 %) were regarded as valid portrayals of the target emotions. Acoustic analysis of the valid emotional utterances revealed differences in pitch, intensity, and duration, attributes that may help listeners to correctly classify the intended emotion. The database is designed to be used as a reliable material source (for both text and speech) in future cross-cultural or cross-linguistic studies of emotional speech, and it is available for academic research purposes free of charge. To access the database, please contact the first author.

  4. Using a Semi-Realistic Database to Support a Database Course

    ERIC Educational Resources Information Center

    Yue, Kwok-Bun

    2013-01-01

    A common problem for university relational database courses is to construct effective databases for instructions and assignments. Highly simplified "toy" databases are easily available for teaching, learning, and practicing. However, they do not reflect the complexity and practical considerations that students encounter in real-world…

  5. The impact of nontechnical skills on technical performance in surgery: a systematic review.

    PubMed

    Hull, Louise; Arora, Sonal; Aggarwal, Rajesh; Darzi, Ara; Vincent, Charles; Sevdalis, Nick

    2012-02-01

    Failures in nontechnical and teamwork skills frequently lie at the heart of harm and near-misses in the operating room (OR). The purpose of this systematic review was to assess the impact of nontechnical skills on technical performance in surgery. MEDLINE, EMBASE, PsycINFO databases were searched, and 2,041 articles were identified. After limits were applied, 341 articles were retrieved for evaluation. Of these, 28 articles were accepted for this review. Data were extracted from the articles regarding sample population, study design and setting, measures of nontechnical skills and technical performance, study findings, and limitations. Of the 28 articles that met inclusion criteria, 21 articles assessed the impact of surgeons' nontechnical skills on their technical performance. The evidence suggests that receiving feedback and effectively coping with stressful events in the OR has a beneficial impact on certain aspects of technical performance. Conversely, increased levels of fatigue are associated with detriments to surgical skill. One article assessed the impact of anesthesiologists' nontechnical skills on anesthetic technical performance, finding a strong positive correlation between the 2 skill sets. Finally, 6 articles assessed the impact of multiple nontechnical skills of the entire OR team on surgical performance. A strong relationship between teamwork failure and technical error was empirically demonstrated in these studies. Evidence suggests that certain nontechnical aspects of performance can enhance or, if lacking, contribute to deterioration of surgeons' technical performance. The precise extent of this effect remains to be elucidated. Copyright © 2012 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  6. Creating Pathways for Low-Skill Adults: Lessons for Community and Technical Colleges from a Statewide Longitudinal Study

    ERIC Educational Resources Information Center

    Perry, Carol A.

    2012-01-01

    The purpose of this study was to examine the educational experiences and outcomes of low-skill adults in West Virginia's community and technical colleges, providing a more detailed profile of these students. Data for the variables were obtained from archival databases through a cooperative agreement between state agencies. Descriptive statistics…

  7. Weeks Island brine diffuser site study: baseline conditions and environmental assessment technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1980-12-12

    This technical report presents the results of a study conducted at two alternative brine diffuser sites (A and B) proposed for the Weeks Island salt dome, together with an analysis of the potential physical, chemical, and biological effects of brine disposal for this area of the Gulf of Mexico. Brine would result from either the leaching of salt domes to form or enlarge oil storage caverns, or the subsequent use of these caverns for crude oil storage in the Strategic Petroleum Reserve (SPR) program. Brine leached from the Weeks Island salt dome would be transported through a pipeline which would extend from the salt dome either 27 nautical miles (32 statute miles) for Site A, or 41 nautical miles (47 statute miles) for Site B, into Gulf waters. The brine would be discharged at these sites through an offshore diffuser at a sustained peak rate of 39 ft³/sec. The disposal of large quantities of brine in the Gulf could have a significant impact on the biology and water quality of the area. Physical and chemical measurements of the marine environment at Sites A and B were taken between September 1977 and July 1978 to correlate the existing environmental conditions with the estimated physical extent of the brine discharge as predicted by the MIT model (US Dept. of Commerce, 1977a). Measurements of wind, tide, waves, currents, and stratification (water column structure) were also obtained, since the diffusion and dispersion of the brine plume are a function of the local circulation regime. These data were used to calculate both near- and far-field concentrations of brine, and may also be used in the design criteria for diffuser port configuration and verification of the plume model. Biological samples were taken to characterize the sites and to predict potential areas of impact with regard to the discharge. This sampling focused on benthic organisms and demersal fish. (DMC)

  8. Scientific and Technical Publishing at Goddard Space Flight Center in Fiscal Year 1994

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This publication is a compilation of scientific and technical material that was researched, written, prepared, and disseminated by the Center's scientists and engineers during FY94. It is presented in numerical order of the GSFC author's sponsoring technical directorate; i.e., Code 300 is the Office of Flight Assurance, Code 400 is the Flight Projects Directorate, Code 500 is the Mission Operations and Data Systems Directorate, Code 600 is the Space Sciences Directorate, Code 700 is the Engineering Directorate, Code 800 is the Suborbital Projects and Operations Directorate, and Code 900 is the Earth Sciences Directorate. The publication database contains publication or presentation title, author(s), document type, sponsor, and organizational code. This is the second annual compilation for the Center.

  9. [Construction of chemical information database based on optical structure recognition technique].

    PubMed

    Lv, C Y; Li, M N; Zhang, L R; Liu, Z M

    2018-04-18

    To create a protocol that could be used to construct a chemical information database from scientific literature quickly and automatically. Scientific literature, patents and technical reports from different chemical disciplines were collected and stored in PDF format as fundamental datasets. Chemical structures were transformed from published documents and images into machine-readable data by using name conversion technology and the optical structure recognition tool CLiDE. In the process of molecular structure information extraction, Markush structures were enumerated into well-defined monomer molecules by means of QueryTools in the molecule editor ChemDraw. The document management software EndNote X8 was applied to acquire bibliographical references involving title, author, journal and year of publication. The text mining toolkit ChemDataExtractor was adopted to retrieve information that could be used to populate a structured chemical database from figures, tables, and textual paragraphs. After this step, detailed manual revision and annotation were conducted in order to ensure the accuracy and completeness of the data. In addition to the literature data, the computing simulation platform Pipeline Pilot 7.5 was utilized to calculate physical and chemical properties and predict molecular attributes. Furthermore, the open database ChEMBL was linked to fetch known bioactivities, such as indications and targets. After information extraction and data expansion, five separate metadata files were generated, including the molecular structure data file, molecular information, bibliographical references, predictable attributes and known bioactivities. With canonical simplified molecular-input line-entry specification (SMILES) as the primary key, the metadata files were associated through common key nodes, including molecular number and PDF number, to construct an integrated chemical information database. A reasonable construction protocol for a chemical information database was created successfully. A total of 174 research…
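
    The final integration step, joining separate metadata tables through shared keys as the abstract describes, can be sketched as follows. The schema, column names and sample rows are illustrative assumptions, not taken from the original database, and SQLite stands in for whatever engine was actually used:

```python
import sqlite3

# Hypothetical tables for three of the five metadata files, linked through
# a canonical-SMILES primary key and a molecule-number key (all names and
# rows below are made up for illustration).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE molecules     (smiles TEXT PRIMARY KEY, mol_number TEXT, name TEXT);
CREATE TABLE references_   (mol_number TEXT, pdf_number TEXT, title TEXT);
CREATE TABLE bioactivities (smiles TEXT, target TEXT, indication TEXT);
""")
conn.execute("INSERT INTO molecules VALUES ('CCO', 'M001', 'ethanol')")
conn.execute("INSERT INTO references_ VALUES ('M001', 'PDF042', 'Some paper')")
conn.execute("INSERT INTO bioactivities VALUES ('CCO', 'ADH1B', 'n/a')")

# Integrated view: the metadata files associated through the common keys
rows = conn.execute("""
SELECT m.name, r.title, b.target
FROM molecules m
JOIN references_ r ON r.mol_number = m.mol_number
JOIN bioactivities b ON b.smiles = m.smiles
""").fetchall()
```

    Using the canonical structure representation as the primary key is what lets literature-derived, predicted and external (ChEMBL) records about the same molecule collapse into one integrated entry.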

  10. 40 CFR 80.90 - Conventional gasoline baseline emissions determination.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Conventional gasoline baseline... gasoline baseline emissions determination. (a) Annual average baseline values. For any facility of a refiner or importer of conventional gasoline, the annual average baseline values of the facility's exhaust...

  11. 40 CFR 80.90 - Conventional gasoline baseline emissions determination.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 16 2011-07-01 2011-07-01 false Conventional gasoline baseline... gasoline baseline emissions determination. (a) Annual average baseline values. For any facility of a refiner or importer of conventional gasoline, the annual average baseline values of the facility's exhaust...

  12. 40 CFR 80.90 - Conventional gasoline baseline emissions determination.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 17 2012-07-01 2012-07-01 false Conventional gasoline baseline... gasoline baseline emissions determination. (a) Annual average baseline values. For any facility of a refiner or importer of conventional gasoline, the annual average baseline values of the facility's exhaust...

  13. 40 CFR 80.90 - Conventional gasoline baseline emissions determination.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 17 2013-07-01 2013-07-01 false Conventional gasoline baseline... gasoline baseline emissions determination. (a) Annual average baseline values. For any facility of a refiner or importer of conventional gasoline, the annual average baseline values of the facility's exhaust...

  14. 40 CFR 80.90 - Conventional gasoline baseline emissions determination.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 17 2014-07-01 2014-07-01 false Conventional gasoline baseline... gasoline baseline emissions determination. (a) Annual average baseline values. For any facility of a refiner or importer of conventional gasoline, the annual average baseline values of the facility's exhaust...

  15. Desktop Access to Full-Text NACA and NASA Reports: Systems Developed by NASA Langley Technical Library

    NASA Technical Reports Server (NTRS)

    Ambur, Manjula Y.; Adams, David L.; Trinidad, P. Paul

    1997-01-01

    NASA Langley Technical Library has been involved in developing systems for full-text information delivery of NACA/NASA technical reports since 1991. This paper describes the two prototypes it has developed and the present production system configuration. The prototype systems are a NACA CD-ROM of thirty-three classic paper NACA reports and a network-based Full-text Electronic Reports Documents System (FEDS) constructed from both paper and electronic formats of NACA and NASA reports. The production system is the DigiDoc System (DIGItal Documents), presently being developed based on the experience gained from the two prototypes. The DigiDoc configuration integrates the on-line catalog database, a World Wide Web interface, and PDF technology to provide a powerful and flexible search and retrieval system. The paper describes in detail significant achievements and lessons learned in terms of data conversion, storage technologies, full-text searching and retrieval, and image databases. Conclusions from the digitization and full-text access experience and future plans for DigiDoc system implementation are discussed.

  16. NATIVE HEALTH DATABASES: NATIVE HEALTH RESEARCH DATABASE (NHRD)

    EPA Science Inventory

    The Native Health Databases contain bibliographic information and abstracts of health-related articles, reports, surveys, and other resource documents pertaining to the health and health care of American Indians, Alaska Natives, and Canadian First Nations. The databases provide i...

  17. Are general surgeons able to accurately self-assess their level of technical skills?

    PubMed

    Rizan, C; Ansell, J; Tilston, T W; Warren, N; Torkington, J

    2015-11-01

    Self-assessment is a way of improving technical capabilities without the need for trainer feedback. It can identify areas for improvement and promote professional medical development. The aim of this review was to identify whether self-assessment is an accurate form of technical skills appraisal in general surgery. The PubMed, MEDLINE®, Embase™ and Cochrane databases were searched for studies assessing the reliability of self-assessment of technical skills in general surgery. For each study, we recorded the skills assessed and the evaluation methods used. Common endpoints between studies were compared to provide recommendations based on the levels of evidence. Twelve studies met the inclusion criteria from 22,292 initial papers. There was no level 1 evidence published. All papers compared self-appraisal against an expert score but differed in the technical skills assessed and the evaluation tools used. The accuracy of self-assessment improved with increasing experience (level 2 recommendation), age (level 3 recommendation) and the use of video playback (level 3 recommendation). Accuracy was reduced by stressful learning environments (level 2 recommendation), lack of familiarity with assessment tools (level 3 recommendation) and in advanced surgical procedures (level 3 recommendation). Evidence exists to support the reliability of self-assessment of technical skills in general surgery. Several variables have been shown to affect the accuracy of self-assessment of technical skills. Future work should focus on evaluating the reliability of self-assessment during live operating procedures.

  18. Predictive value of background experiences and visual spatial ability testing on laparoscopic baseline performance among residents entering postgraduate surgical training.

    PubMed

    Louridas, Marisa; Quinn, Lauren E; Grantcharov, Teodor P

    2016-03-01

    Emerging evidence suggests that despite dedicated practice, not all surgical trainees have the ability to reach technical competency in minimally invasive techniques. While selecting residents who have the ability to reach technical competence is important, evidence to guide the incorporation of technical ability into selection processes is limited. Therefore, the purpose of the present study was to evaluate whether background experiences and 2D-3D visual spatial test results are predictive of baseline laparoscopic skill for the novice surgical trainee. First-year residents were studied. Demographic data and background surgical and non-surgical experiences were obtained using a questionnaire. Visual spatial ability was evaluated using the PicSOr, cube comparison (CC) and card rotation (CR) tests. Technical skill was assessed using the camera navigation (LCN) task and laparoscopic circle cut (LCC) task. Resident performance on these technical tasks was compared and correlated with the questionnaire and visual spatial findings. Previous experience in observing laparoscopic procedures was associated with significantly better LCN performance, and experience in navigating the laparoscopic camera was associated with significantly better LCC task results. Residents who scored higher on the CC test demonstrated more accurate path length (r_s(PL) = -0.36, p = 0.03) and angle path (r_s(AP) = -0.426, p = 0.01) scores when completing the LCN task. No other significant correlations were found between the visual spatial tests (PicSOr, CC or CR) and LCC performance. While identifying selection tests for incoming surgical trainees that predict technical skill performance is appealing, the surrogate markers evaluated correlate with specific metrics of surgical performance related to a single task but do not appear to reliably predict technical performance of different laparoscopic tasks. Predicting the acquisition of technical skills will require the development

  19. Technical report analysis and design: Study of solid rocket motors for a space shuttle booster, volume 2, book 1, supplement 1

    NASA Technical Reports Server (NTRS)

    1972-01-01

    An analysis and design effort was conducted as part of the study of solid rocket motors for a space shuttle booster. The 156-inch-diameter, parallel-burn solid rocket motor was selected as the baseline because it is transportable and is the most cost-effective, reliable system that has been developed and demonstrated. The basic approach was to concentrate on the selected baseline design, and to draw from the baseline sufficient data to describe the alternate approaches also studied. The following conclusions were reached with respect to the technical feasibility of using solid rocket booster motors for the space shuttle vehicle: (1) The 156-inch, parallel-burn baseline SRM design meets NASA's study requirements while incorporating conservative safety factors. (2) The solid rocket motor booster represents a cost-effective approach. (3) Baseline costs are conservative and are based on a demonstrated design. (4) Recovery and reuse are feasible and offer substantial cost savings. (5) Abort can be accomplished successfully. (6) Ecological effects are acceptable.

  20. Database Development for Electrical, Electronic, and Electromechanical (EEE) Parts for the International Space Station Alpha

    NASA Technical Reports Server (NTRS)

    Wassil-Grimm, Andrew D.

    1997-01-01

    More effective electronic communication processes are needed to transfer contractor and international partner data into NASA and prime contractor baseline database systems. It is estimated that the International Space Station Alpha (ISSA) parts database will contain up to one million parts, each of which may require database capabilities for approximately one thousand bytes of data per part. The resulting gigabyte database must provide easy access to users who will be preparing multiple analyses and reports in order to verify as-designed, as-built, launch, on-orbit, and return configurations for up to 45 missions associated with the construction of the ISSA. Additionally, Internet access to this database is strongly indicated, to allow multiple-user access from clients located in many foreign countries. This summer's project involved familiarization with and evaluation of the ISSA Electrical, Electronic, and Electromechanical (EEE) parts data and the process of electronically managing these data. Particular attention was devoted to improving the interfaces among the many elements of the ISSA information system and its global customers and suppliers. Additionally, prototype queries were developed to facilitate the identification of data changes in the database, verification that the designs used only approved parts, and certification that the flight hardware containing EEE parts was ready for flight. This project also resulted in specific recommendations to NASA for further development in the area of EEE parts database development and usage.
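
    One of the prototype queries described, verifying that a design uses only approved parts, can be sketched as an anti-join. The schema and part numbers below are hypothetical, and SQLite stands in for the actual ISSA database system:

```python
import sqlite3

# Hypothetical approved-parts list and as-designed parts usage
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE approved_parts (part_no TEXT PRIMARY KEY);
CREATE TABLE design_parts  (part_no TEXT, assembly TEXT);
""")
conn.executemany("INSERT INTO approved_parts VALUES (?)",
                 [("EEE-100",), ("EEE-200",)])
conn.executemany("INSERT INTO design_parts VALUES (?, ?)",
                 [("EEE-100", "PSU"), ("EEE-999", "PSU")])

# Flag design entries that reference a part not on the approved list
# (LEFT JOIN + NULL filter is the standard anti-join idiom)
unapproved = conn.execute("""
SELECT d.part_no FROM design_parts d
LEFT JOIN approved_parts a ON a.part_no = d.part_no
WHERE a.part_no IS NULL
""").fetchall()
```

    The same pattern extends to the other verifications mentioned, such as diffing two configuration snapshots to identify data changes between missions.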

  1. On baseline corrections and uncertainty in response spectrafor baseline variations commonly encountered in digital accelerograph records

    USGS Publications Warehouse

    Akkar, Sinan; Boore, David M.

    2009-01-01

    Most digital accelerograph recordings are plagued by long-period drifts, best seen in the velocity and displacement time series obtained from integration of the acceleration time series. These drifts often result in velocity values that are nonzero near the end of the record. This is clearly unphysical and can lead to inaccurate estimates of peak ground displacement and long-period spectral response. The source of the long-period noise seems to be variations in the acceleration baseline in many cases. These variations could be due to true ground motion (tilting and rotation, as well as local permanent ground deformation), instrumental effects, or analog-to-digital conversion. Very often the trends in velocity are well approximated by a linear trend after the strong shaking subsides. The linearity of the trend in velocity implies that no variations in the baseline could have occurred after the onset of linearity in the velocity time series. This observation, combined with the lack of any trends in the pre-event motion, allows us to compute the time interval in which any baseline variations could occur. We then use several models of the variations in a Monte Carlo procedure to derive a suite of baseline-corrected accelerations for each noise model, using records from the 1999 Chi-Chi earthquake and several earthquakes in Turkey. Comparisons of the mean values of the peak ground displacements, spectral displacements, and residual displacements computed from these corrected accelerations for the different noise models can be used as a guide to the accuracy of the baseline corrections. For many of the records considered here the mean values are similar for each noise model, giving confidence in the estimation of the mean values. The dispersion of the ground-motion measures increases with period and is noise-model dependent. The dispersion of inelastic spectra is greater than that of elastic spectra at short periods but approaches that of the elastic spectra at longer periods.
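
    The core observation, that a linear trend in post-shaking velocity implies a constant shift in the acceleration baseline, can be sketched in a simplified form (a single constant-offset noise model rather than the Monte Carlo suite of models the authors use; the function names and synthetic record below are illustrative):

```python
def cumtrapz(y, dt):
    """Cumulative trapezoidal integration, with the first value set to 0."""
    out = [0.0]
    for i in range(1, len(y)):
        out.append(out[-1] + 0.5 * (y[i - 1] + y[i]) * dt)
    return out

def baseline_correct(acc, dt, i_tail):
    """Remove a constant baseline shift inferred from the linear velocity
    trend after sample index i_tail (end of strong shaking)."""
    vel = cumtrapz(acc, dt)
    t = [k * dt for k in range(len(acc))]
    tt, vv = t[i_tail:], vel[i_tail:]
    # Least-squares slope of the velocity tail; a linear velocity trend
    # corresponds to a constant offset in acceleration of the same size
    n = len(tt)
    tbar, vbar = sum(tt) / n, sum(vv) / n
    slope = (sum((a - tbar) * (b - vbar) for a, b in zip(tt, vv))
             / sum((a - tbar) ** 2 for a in tt))
    corrected = [a - slope if k >= i_tail else a for k, a in enumerate(acc)]
    return corrected, slope

# Synthetic record: zero true motion plus a 0.05 m/s^2 baseline shift
# beginning at sample 100
dt = 0.01
acc = [0.0] * 100 + [0.05] * 400
corr, shift = baseline_correct(acc, dt, 100)
```

    The recovered shift matches the injected 0.05 m/s² offset, and the corrected acceleration integrates to a velocity that stays near zero, which is the physical check the authors apply to real records.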

  2. Physiological and Performance Measures for Baseline Concussion Assessment.

    PubMed

    Dobney, Danielle M; Thomas, Scott G; Taha, Tim; Keightley, Michelle

    2017-05-17

    Baseline testing is a common strategy for concussion assessment and management. Research continues to evaluate novel measures for their potential to improve baseline testing methods. The primary objectives were to: (1) determine the feasibility of including physiological, neuromuscular and mood measures in a baseline concussion testing protocol; (2) describe typical values in a varsity athlete sample; and (3) estimate the influence of concussion history on these baseline measures. Prospective observational study. University Athletic Therapy Clinic. 100 varsity athletes. Frequency- and time-domain measures of heart rate variability (HRV), blood pressure (BP), grip strength, the Profile of Mood States and the Sport Concussion Assessment Tool-2. Physiological, neuromuscular performance and mood measures were feasible at baseline. Participants with a history of two or more previous concussions displayed significantly higher diastolic blood pressure. Females reported higher total mood disturbance than males. Physiological and neuromuscular performance measures are safe and feasible as baseline concussion assessment outcomes. A history of concussion may influence diastolic blood pressure.

  3. Database for content of mercury in Polish brown coal

    NASA Astrophysics Data System (ADS)

    Jastrząb, Krzysztof

    2018-01-01

    Poland is rated among the countries with the largest mercury emissions in Europe. According to information provided by the National Centre for Balancing and Management of Emissions (KOBiZE), more than 10.5 tons of mercury and its compounds were emitted into the atmosphere from the area of Poland in 2015. Within the scope of the BazaHg project, lasting from 2014 to 2015 and co-financed by the National Centre for Research and Development (NCBiR), a database was set up specifying the mercury content of Polish hard steam coal, coking coal and brown coal (lignite) grades. With regard to domestic brown coal, the database comprises information on coal grades from the 'Bełchatów', 'Adamów', 'Turów' and 'Sieniawa' brown coal mines. Currently the database contains 130 records with parameters of brown coal, where each record covers a technical analysis (content of moisture, ash and volatile matter), an elemental analysis (CHNS), the content of chlorine and mercury, as well as the net calorific value and heat of combustion. The mercury content in the brown coal samples under test ranged from 44 to 985 μg of Hg/kg, with an average of 345 μg of Hg/kg. The database constitutes a reliable and trustworthy source of information about the mercury content of Polish fossil fuels. These details, completed with information about the consumption of coal by individual electric power stations and multiplied by appropriate emission coefficients, may serve as the basis for establishing the loads of mercury emitted into the atmosphere by individual stations and by the entire power engineering sector in total. It will also enable Polish central organizations and individual business entities to implement a reasonable policy with respect to mercury emissions into the atmosphere.
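
    The emission estimate described in the closing sentences is a simple product of coal consumption, mercury content and an emission coefficient. A minimal sketch follows; the 345 μg/kg average comes from the abstract, while the tonnage and emission factor are hypothetical:

```python
def mercury_load_kg(coal_tonnes: float,
                    hg_content_ug_per_kg: float,
                    emission_factor: float) -> float:
    """Mercury emitted (kg) from burning `coal_tonnes` of coal with the
    given mercury content, scaled by an emission factor (the fraction of
    the coal's mercury that reaches the atmosphere).

    Unit conversions: 1 tonne = 1000 kg, 1 ug = 1e-9 kg.
    """
    hg_in_coal_kg = coal_tonnes * 1000 * hg_content_ug_per_kg * 1e-9
    return hg_in_coal_kg * emission_factor

# Illustrative numbers: 1 Mt of lignite at the 345 ug/kg average reported
# in the abstract, with an assumed emission factor of 0.9
load = mercury_load_kg(1_000_000, 345, 0.9)
```

    Summing such per-station loads over all plants is the sector-wide accounting the database is meant to support.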

  4. The 2015 Nucleic Acids Research Database Issue and molecular biology database collection.

    PubMed

    Galperin, Michael Y; Rigden, Daniel J; Fernández-Suárez, Xosé M

    2015-01-01

    The 2015 Nucleic Acids Research Database Issue contains 172 papers that include descriptions of 56 new molecular biology databases and updates on 115 databases whose descriptions have previously been published in NAR or other journals. Following the classification introduced last year to simplify navigation of the entire issue, these articles are divided into eight subject categories. This year's highlights include RNAcentral, an international community portal to various databases on noncoding RNA; ValidatorDB, a validation database for protein structures and their ligands; SASBDB, a primary repository for small-angle scattering data of various macromolecular complexes; MoonProt, a database of 'moonlighting' proteins; and two new databases of protein-protein and other macromolecular complexes, ComPPI and the Complex Portal. This issue also includes an unusually high number of cancer-related databases and other databases dedicated to the genomic basics of disease and potential drugs and drug targets. The size of the NAR online Molecular Biology Database Collection, http://www.oxfordjournals.org/nar/database/a/, remained approximately the same, following the addition of 74 new resources and removal of 77 obsolete web sites. The entire Database Issue is freely available online on the Nucleic Acids Research web site (http://nar.oxfordjournals.org/). Published by Oxford University Press on behalf of Nucleic Acids Research 2014. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  5. Energy Consumption Database

    Science.gov Websites

    The California Energy Commission has created this online database for informal reporting ... classifications. The database also provides easy downloading of energy consumption data into Microsoft Excel (XLSX) format.

  6. Technical Guidance for Assessing Environmental Justice in ...

    EPA Pesticide Factsheets

    The Technical Guidance for Assessing Environmental Justice in Regulatory Analysis (also referred to as the Environmental Justice Technical Guidance or EJTG) is intended for use by Agency analysts, including risk assessors, economists, and other analytic staff who conduct analyses to evaluate EJ concerns in the context of regulatory actions. Senior EPA managers and decision makers may also find this document useful for understanding analytic expectations and for ensuring that EJ concerns are appropriately considered in the development of analyses that support regulatory actions under EPA's action development process. Specifically, the document outlines approaches and methods to help Agency analysts evaluate EJ concerns. The document provides overarching direction to analysts by outlining a series of questions that ensure the decision maker has appropriate information about baseline risks across population groups, and about how those risks are distributed under the options being considered. In addition, the document provides a set of recommendations and requirements, as well as best practices, for analyzing and reporting results from the consideration of EJ concerns. These principles will help ensure consistency, quality, and transparency across regulatory actions, while allowing for the flexibility needed across different regulatory actions. The purpose of the EJTG is to ensure consistency, quality, and transparency in considering environmental justice, while allowing for needed flexibility.

  7. 75 FR 66748 - Notice of Baseline Filings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-29

    ...- 000] Notice of Baseline Filings October 22, 2010. ONEOK Gas Transportation, L.L.C Docket No. PR11-68... above submitted a revised baseline filing of their Statement of Operating Conditions for services...

  8. Human Mitochondrial Protein Database

    National Institute of Standards and Technology Data Gateway

    SRD 131 Human Mitochondrial Protein Database (Web, free access)   The Human Mitochondrial Protein Database (HMPDb) provides comprehensive data on mitochondrial and human nuclear encoded proteins involved in mitochondrial biogenesis and function. This database consolidates information from SwissProt, LocusLink, Protein Data Bank (PDB), GenBank, Genome Database (GDB), Online Mendelian Inheritance in Man (OMIM), Human Mitochondrial Genome Database (mtDB), MITOMAP, Neuromuscular Disease Center and Human 2-D PAGE Databases. This database is intended as a tool not only to aid in studying the mitochondrion but in studying the associated diseases.

  9. Recent advances in the compilation of holocene relative Sea-level database in North America

    NASA Astrophysics Data System (ADS)

    Horton, B.; Vacchi, M.; Engelhart, S. E.; Nikitina, D.

    2015-12-01

    Reconstruction of relative sea level (RSL) has implications for the investigation of crustal movements, the calibration of earth rheology models and the reconstruction of ice sheets. In recent years, efforts have been made to create RSL databases following a standardized methodology. These regional databases provided a framework for developing our understanding of the primary mechanisms of RSL change since the Last Glacial Maximum, and a long-term baseline against which to gauge changes in sea level during the 20th century and forecasts for the 21st. Here we present two quality-controlled Holocene RSL databases compiled for North America. Along the Pacific coast of North America (British Columbia, Canada to California, USA), our re-evaluation of sea-level indicators from geological and archaeological investigations yields 841 RSL data-points, mainly from salt and freshwater wetlands or adjacent estuarine sediment as well as from isolation basins. Along the Atlantic coast of North America (Hudson Bay, Canada to South Carolina, USA), we are currently compiling a database including more than 2000 RSL data-points from isolation basins, salt and freshwater wetlands, beach ridges and intertidal deposits. We outline the difficulties encountered and the solutions adopted in compiling databases across such different depositional environments. We address the complex tectonics and the framework needed to compare such a large variety of RSL data-points. We discuss the implications of our results for glacio-isostatic adjustment (GIA) models in the two study regions.

  10. Development of a Data Citations Database for an Interdisciplinary Data Center

    NASA Astrophysics Data System (ADS)

    Chen, R. S.; Downs, R. R.; Schumacher, J.; Gerard, A.

    2017-12-01

    The scientific community has long depended on consistent citation of the scientific literature to enable traceability, support replication, and facilitate analysis and debate about scientific hypotheses, theories, assumptions, and conclusions. However, only in the past few years has the community focused on consistent citation of scientific data, e.g., through the application of Digital Object Identifiers (DOIs) to data, the development of peer-reviewed data publications, community principles and guidelines, and other mechanisms. This means that, moving ahead, it should be easier to identify and track data citations and conduct systematic bibliometric studies. However, this still leaves the problem that many legacy datasets and past citations lack DOIs, making it difficult to develop a historical baseline or assess trends. With this in mind, the NASA Socioeconomic Data and Applications Center (SEDAC) has developed a searchable citations database, containing more than 3,400 citations of SEDAC data and information products over the past 20 years. These citations were collected through various indices and search tools and in some cases through direct contacts with authors. The citations come from a range of natural, social, health, and engineering science journals, books, reports, and other media. The database can be used to find and extract citations filtered by a range of criteria, enabling quantitative analysis of trends, intercomparisons between data collections, and categorization of citations by type. We present a preliminary analysis of citations for selected SEDAC data collections, in order to establish a baseline and assess options for ongoing metrics to track the impact of SEDAC data on interdisciplinary science. We also present an analysis of the uptake of DOIs within data citations reported in published studies that used SEDAC data.

  11. 2016 Annual Technology Baseline (ATB) - Webinar Presentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, Wesley; Kurup, Parthiv; Hand, Maureen

    2016-09-13

    This deck was presented for the 2016 Annual Technology Baseline Webinar. The presentation describes the Annual Technology Baseline, which is a compilation of current and future cost and performance data for electricity generation technologies.

  12. Predictors of employer satisfaction: technical and non-technical skills.

    PubMed

    Danielson, Jared A; Wu, Tsui-Feng; Fales-Williams, Amanda J; Kirk, Ryan A; Preast, Vanessa A

    2012-01-01

    Employers of 2007-2009 graduates from Iowa State University College of Veterinary Medicine were asked to respond to a survey regarding their overall satisfaction with their new employees as well as their new employees' preparation in several technical and non-technical skill areas. Seventy-five responses contained complete data and were used in the analysis. Four technical skill areas (data collection, data interpretation, planning, and taking action) and five non-technical skill areas (interpersonal skills, ability to deal with legal issues, business skills, making referrals, and problem solving) were identified. All of the skill area subscales listed above had appropriate reliability (Cronbach's alpha>0.70) and were positively and significantly correlated with overall employer satisfaction. Results of two simultaneous regression analyses indicated that of the four technical skill areas, taking action is the most salient predictor of employer satisfaction. Of the five non-technical skill areas, interpersonal skills, business skills, making referrals, and problem solving were the most important skills in predicting employer satisfaction. Hierarchical regression analysis revealed that all technical skills explained 25% of the variation in employer satisfaction; non-technical skills explained an additional 42% of the variation in employer satisfaction.

  13. A SURVEY OF ASTRONOMICAL RESEARCH: A BASELINE FOR ASTRONOMICAL DEVELOPMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ribeiro, V. A. R. M.; Russo, P.; Cárdenas-Avendaño, A., E-mail: vribeiro@ast.uct.ac.za, E-mail: russo@strw.leidenuniv.nl

    Measuring scientific development is a difficult task. Different metrics have been put forward to evaluate scientific development; in this paper we explore a metric that uses the number of peer-reviewed, and when available non-peer-reviewed, research articles as an indicator of development in the field of astronomy. We analyzed the available publication record, using the Smithsonian Astrophysical Observatory/NASA Astrophysics Data System, by country affiliation in the time span between 1950 and 2011 for countries with a gross national income of less than 14,365 USD in 2010. This represents 149 countries. We propose that this metric identifies countries in ``astronomical development'' with a culture of research publishing. We also propose that for a country to develop in astronomy, it should invest in outside expert visits, send its staff abroad to study, and establish a culture of scientific publishing. Furthermore, we propose that this paper may be used as a baseline to measure the success of major international projects, such as the International Year of Astronomy 2009.

  14. Salton Sea sampling program: baseline studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tullis, R.E.; Carter, J.L.; Langlois, G.W.

    1981-04-13

    Baseline data are provided on three species of fish from the Salton Sea, California. The fishes considered were the orangemouth corvina (Cynoscion xanthulus), gulf croaker (Bairdiella icistius) and sargo (Anisotremus davidsonii). Morphometric and meristic data are presented as a baseline to aid in the evaluation of any physiological stress the fish may experience as a result of geothermal development. Analyses were made on muscle, liver, and bone of the fishes sampled to provide baseline data on elemental tissue burdens. The elements measured were: As, Br, Ca, Cu, Fe, Ga, K, Mn, Ni, Pb, Rb, Se, Sr, Zn, and Zr. These data are important if an environmentally sound progression of geothermal power production is to occur at the Salton Sea.

  15. Air traffic control system baseline methodology guide.

    DOT National Transportation Integrated Search

    1999-06-01

    The Air Traffic Control System Baseline Methodology Guide serves as a reference in the design and conduct of baseline studies. : Engineering research psychologists are the intended audience for the Methodology Guide, which focuses primarily on techni...

  16. First GPS baseline results from the North Andes

    NASA Technical Reports Server (NTRS)

    Kellogg, James N.; Freymueller, Jeffrey T.; Dixon, Timothy H.; Neilan, Ruth E.; Ropain, Clemente

    1990-01-01

    The CASA Uno GPS experiment (January-February 1988) has provided the first-epoch baseline measurements for the study of plate motions and crustal deformation in and around the North Andes. Two-dimensional horizontal baseline repeatabilities are as good as 5 parts in 10^8 for short baselines (100-1000 km), and better than 3 parts in 10^8 for long baselines (greater than 1000 km). Vertical repeatabilities are typically 4-6 cm, with a weak dependence on baseline length. The expected rate of plate convergence across the Colombia Trench is 6-8 cm/yr, which should be detectable by the repeat experiment planned for 1991. Expected deformation rates within the North Andes are of the order of 1 cm/yr, which may be detectable with the 1991 experiment.
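The fractional repeatabilities quoted above translate directly into absolute uncertainties, which shows why 6-8 cm/yr of convergence should be detectable after a few years. The conversion below is a trivial unit-conversion sketch, not part of the original analysis.

```python
# Sketch: converting fractional baseline repeatability (parts in 10^8)
# into an absolute uncertainty in centimeters.

def repeatability_cm(baseline_km: float, parts_in_1e8: float) -> float:
    """Absolute repeatability (cm) for a given fractional precision."""
    baseline_cm = baseline_km * 1e5  # km -> cm
    return baseline_cm * parts_in_1e8 * 1e-8

# A 500 km baseline at 5 parts in 10^8 is uncertain by 2.5 cm,
# well below the ~6-8 cm accumulated by one year of plate convergence.
print(repeatability_cm(500, 5))   # 2.5
print(repeatability_cm(2000, 3))  # 6.0
```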

  17. A Quality-Control-Oriented Database for a Mesoscale Meteorological Observation Network

    NASA Astrophysics Data System (ADS)

    Lussana, C.; Ranci, M.; Uboldi, F.

    2012-04-01

    In the operational context of a local weather service, data accessibility and quality issues must be managed by taking into account a wide set of user needs. This work describes the structure and the operational choices made for the operational implementation of a database system storing data from highly automated observing stations, together with metadata and information on data quality. Lombardy's environmental protection agency, ARPA Lombardia, manages a highly automated mesoscale meteorological network. A Quality Assurance System (QAS) ensures that reliable observational information is collected and disseminated to the users. The weather unit in ARPA Lombardia, at once an important QAS component and an intensive data user, has developed a database specifically aimed at: (1) providing quick access to data for operational activities and (2) ensuring data quality for real-time applications, by means of an Automatic Data Quality Control (ADQC) procedure. Quantities stored in the archive include hourly aggregated observations of precipitation amount, temperature, wind, relative humidity, pressure, and global and net solar radiation. The ADQC performs several independent tests on raw data and compares their results in a decision-making procedure. An important ADQC component is the Spatial Consistency Test based on Optimal Interpolation. Interpolated and cross-validation analysis values are also stored in the database, providing further information to human operators and useful estimates in case of missing data. The technical solution adopted is based on a LAMP (Linux, Apache, MySQL and PHP) system, constituting an open-source environment suitable for both development and operational practice. The ADQC procedure itself is performed by R scripts interacting directly with the MySQL database. Users and network managers can access the database through a set of web-based PHP applications.
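The decision-making structure of an ADQC step like the one described (several independent tests whose results are combined into a flag) can be sketched in a few lines. Everything here is an illustrative stand-in: the thresholds are invented, and the simple neighbor average replaces the Optimal Interpolation analysis used operationally.

```python
# Minimal ADQC sketch: each observation passes several independent tests,
# and the flags are combined in a decision. Thresholds are illustrative.

def range_test(value: float, lo: float, hi: float) -> bool:
    """Plausibility check against climatological limits."""
    return lo <= value <= hi

def spatial_consistency_test(value: float, neighbors: list[float],
                             tolerance: float) -> bool:
    """Compare an observation with an estimate built from its neighbors
    (a crude stand-in for an Optimal Interpolation analysis value)."""
    if not neighbors:
        return True  # no basis to reject
    estimate = sum(neighbors) / len(neighbors)
    return abs(value - estimate) <= tolerance

def adqc_flag(temp_c: float, neighbor_temps: list[float]) -> str:
    ok_range = range_test(temp_c, -40.0, 50.0)
    ok_spatial = spatial_consistency_test(temp_c, neighbor_temps, 5.0)
    return "good" if (ok_range and ok_spatial) else "suspect"

print(adqc_flag(12.3, [11.8, 12.9, 13.1]))  # good
print(adqc_flag(25.0, [11.8, 12.9, 13.1]))  # suspect: far from neighbors
```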

  18. Biofuel Database

    National Institute of Standards and Technology Data Gateway

    Biofuel Database (Web, free access)   This database brings together structural, biological, and thermodynamic data for enzymes that are either in current use or are being considered for use in the production of biofuels.

  19. Analysis of User Need with CD-ROM Databases: A Case Study Based on Work Sampling at One University Library.

    ERIC Educational Resources Information Center

    Wells, Amy Tracy

    Analysis of the needs of users of Compact Disk-Read Only Memory (CD-ROM) was performed at the Tampa campus of the University of South Florida. A review of the literature indicated that problems associated with selecting the appropriate database, searching, and requiring technical assistance were the probable areas of user need. The library has 17…

  20. Feasibility and utility of applications of the common data model to multiple, disparate observational health databases

    PubMed Central

    Makadia, Rupa; Matcho, Amy; Ma, Qianli; Knoll, Chris; Schuemie, Martijn; DeFalco, Frank J; Londhe, Ajit; Zhu, Vivienne; Ryan, Patrick B

    2015-01-01

    Objectives To evaluate the utility of applying the Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM) across multiple observational databases within an organization and to apply standardized analytics tools for conducting observational research. Materials and methods Six deidentified patient-level datasets were transformed to the OMOP CDM. We evaluated the extent of information loss that occurred through the standardization process. We developed a standardized analytic tool to replicate the cohort construction process from a published epidemiology protocol and applied the analysis to all 6 databases to assess time-to-execution and comparability of results. Results Transformation to the CDM resulted in minimal information loss across all 6 databases. Patients and observations were excluded due to identified data quality issues in the source system; 96% to 99% of condition records and 90% to 99% of drug records were successfully mapped into the CDM using the standard vocabulary. The full cohort replication and descriptive baseline summary was executed for 2 cohorts in 6 databases in less than 1 hour. Discussion The standardization process improved data quality, increased efficiency, and facilitated cross-database comparisons to support a more systematic approach to observational research. Comparisons across data sources showed consistency in the impact of inclusion criteria when using the protocol, and identified differences in patient characteristics and coding practices across databases. Conclusion Standardizing data structure (through a CDM), content (through a standard vocabulary with source code mappings), and analytics can enable an institution to apply a network-based approach to observational research across multiple, disparate observational health databases. PMID:25670757
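The source-to-standard-vocabulary mapping step, whose success rate the study reports as 90-99% of records, can be sketched as a lookup from (vocabulary, source code) pairs to standard concept identifiers. The codes, concept IDs, and mapping table below are hypothetical toy values, not actual OMOP vocabulary content.

```python
# Toy sketch of mapping source codes to a standard vocabulary, as done
# when loading records into a CDM. All codes/IDs here are invented.

source_to_standard = {
    ("ICD9CM", "250.00"): 201826,       # hypothetical concept_id
    ("NDC", "00093101901"): 1503297,    # hypothetical concept_id
}

def map_record(vocabulary: str, source_code: str):
    """Return (concept_id, mapped?) for one source record."""
    concept_id = source_to_standard.get((vocabulary, source_code))
    return (concept_id, concept_id is not None)

records = [("ICD9CM", "250.00"), ("NDC", "00093101901"), ("ICD9CM", "999.99")]
mapped = [map_record(v, c) for v, c in records]
rate = sum(ok for _, ok in mapped) / len(mapped)
print(f"mapping rate: {rate:.0%}")  # mapping rate: 67%
```

Unmapped records like the last one are what the study would flag as data quality issues in the source system.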

  1. Teaching Case: Adapting the Access Northwind Database to Support a Database Course

    ERIC Educational Resources Information Center

    Dyer, John N.; Rogers, Camille

    2015-01-01

    A common problem encountered when teaching database courses is that few large illustrative databases exist to support teaching and learning. Most database textbooks have small "toy" databases that are chapter objective specific, and thus do not support application over the complete domain of design, implementation and management concepts…

  2. DR HAGIS-a fundus image database for the automatic extraction of retinal surface vessels from diabetic patients.

    PubMed

    Holm, Sven; Russell, Greg; Nourrit, Vincent; McLoughlin, Niall

    2017-01-01

    A database of retinal fundus images, the DR HAGIS database, is presented. This database consists of 39 high-resolution color fundus images obtained from a diabetic retinopathy screening program in the UK. The NHS screening program uses service providers that employ different fundus and digital cameras, resulting in a range of image sizes and resolutions. Furthermore, patients enrolled in such programs often display other comorbidities in addition to diabetes. Therefore, in an effort to replicate the normal range of images examined by grading experts during screening, the DR HAGIS database consists of images of varying sizes and resolutions and four comorbidity subgroups, collectively defined as the diabetic retinopathy, hypertension, age-related macular degeneration, and glaucoma image set (DR HAGIS). For each image, the vasculature has been manually segmented to provide a realistic set of images on which to test automatic vessel extraction algorithms. Modified versions of two previously published vessel extraction algorithms were applied to this database to provide some baseline measurements. A method based purely on the intensity of image pixels resulted in a mean segmentation accuracy of 95.83% ([Formula: see text]), whereas an algorithm based on Gabor filters generated an accuracy of 95.71% ([Formula: see text]).
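The pixel-wise segmentation accuracy used as a baseline metric above is simply the fraction of pixels on which an automatic vessel mask agrees with the manual ground truth. A minimal sketch, using tiny toy masks rather than DR HAGIS data:

```python
# Pixel-wise segmentation accuracy: fraction of pixels where the predicted
# vessel mask agrees with the manual ground-truth mask. Toy data only.

def segmentation_accuracy(predicted: list[int], manual: list[int]) -> float:
    """predicted, manual: equal-length flat lists of 0/1 vessel labels."""
    agree = sum(p == m for p, m in zip(predicted, manual))
    return agree / len(manual)

manual    = [0, 0, 1, 1, 1, 0, 0, 1, 0, 0]
predicted = [0, 0, 1, 1, 0, 0, 0, 1, 0, 1]  # one miss, one false positive
print(f"{segmentation_accuracy(predicted, manual):.0%}")  # 80%
```

Note that with vessels occupying only a small fraction of the retina, high pixel accuracy can coexist with imperfect vessel detection, which is one reason papers also report the bracketed per-class statistics elided above.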

  3. Applications of GIS and database technologies to manage a Karst Feature Database

    USGS Publications Warehouse

    Gao, Y.; Tipping, R.G.; Alexander, E.C.

    2006-01-01

    This paper describes the management of a Karst Feature Database (KFD) in Minnesota. Two sets of applications in both GIS and Database Management System (DBMS) have been developed for the KFD of Minnesota. These applications were used to manage and to enhance the usability of the KFD. Structured Query Language (SQL) was used to manipulate transactions of the database and to facilitate the functionality of the user interfaces. The Database Administrator (DBA) authorized users with different access permissions to enhance the security of the database. Database consistency and recovery are accomplished by creating data logs and maintaining backups on a regular basis. The working database provides guidelines and management tools for future studies of karst features in Minnesota. The methodology of designing this DBMS is applicable to develop GIS-based databases to analyze and manage geomorphic and hydrologic datasets at both regional and local scales. The short-term goal of this research is to develop a regional KFD for the Upper Mississippi Valley Karst and the long-term goal is to expand this database to manage and study karst features at national and global scales.
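The transaction handling and consistency guarantees the paper attributes to SQL can be sketched compactly. This is a hedged illustration using SQLite rather than the production DBMS, and the table and column names are invented, not taken from the Minnesota KFD schema.

```python
# Sketch of SQL transaction handling for a karst-feature table: related
# inserts are wrapped in one transaction so the database stays consistent
# (both rows recorded, or neither). Schema and names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE karst_feature (
    id INTEGER PRIMARY KEY, name TEXT NOT NULL,
    lat REAL, lon REAL, feature_type TEXT)""")

try:
    with conn:  # commits on success, rolls back on exception
        conn.execute("INSERT INTO karst_feature VALUES "
                     "(1, 'Sink A', 44.1, -92.3, 'sinkhole')")
        conn.execute("INSERT INTO karst_feature VALUES "
                     "(2, 'Spring B', 44.2, -92.4, 'spring')")
except sqlite3.Error:
    pass  # the rollback left the table unchanged

count = conn.execute("SELECT COUNT(*) FROM karst_feature").fetchone()[0]
print(count)  # 2
```

Data logs and scheduled backups, as described in the paper, complement this per-transaction consistency with recovery across sessions.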

  4. MISSE in the Materials and Processes Technical Information System (MAPTIS )

    NASA Technical Reports Server (NTRS)

    Burns, DeWitt; Finckenor, Miria; Henrie, Ben

    2013-01-01

    Materials International Space Station Experiment (MISSE) data is now being collected and distributed through the Materials and Processes Technical Information System (MAPTIS) at Marshall Space Flight Center in Huntsville, Alabama. MISSE data has been instrumental in many programs and continues to be an important source of data for the space community. To facilitate greater access to the MISSE data, the International Space Station (ISS) program office and MAPTIS are working to gather this data into a central location. The MISSE database contains information about materials, samples, and flights, along with pictures, PDFs, Excel files, Word documents, and other file types. Major capabilities of the system are: access control, browsing, searching, reports, and record comparison. The search capability looks inside any searchable file, so data can still be retrieved even if the desired metadata has not been associated with it. Other functionality will continue to be added to the MISSE database as the Athena Platform is expanded.

  5. Life Support Baseline Values and Assumptions Document

    NASA Technical Reports Server (NTRS)

    Anderson, Molly S.; Ewert, Michael K.; Keener, John F.

    2018-01-01

    The Baseline Values and Assumptions Document (BVAD) provides analysts, modelers, and other life support researchers with a common set of values and assumptions which can be used as a baseline in their studies. This baseline, in turn, provides a common point of origin from which many studies in the community may depart, making research results easier to compare and providing researchers with reasonable values to assume for areas outside their experience. This document identifies many specific physical quantities that define life support systems, serving as a general reference for spacecraft life support system technology developers.

  6. Life Support Baseline Values and Assumptions Document

    NASA Technical Reports Server (NTRS)

    Anderson, Molly S.; Ewert, Michael K.; Keener, John F.; Wagner, Sandra A.

    2015-01-01

    The Baseline Values and Assumptions Document (BVAD) provides analysts, modelers, and other life support researchers with a common set of values and assumptions which can be used as a baseline in their studies. This baseline, in turn, provides a common point of origin from which many studies in the community may depart, making research results easier to compare and providing researchers with reasonable values to assume for areas outside their experience. With the ability to accurately compare different technologies' performance for the same function, managers will be able to make better decisions regarding technology development.

  7. Open Geoscience Database

    NASA Astrophysics Data System (ADS)

    Bashev, A.

    2012-04-01

    Currently there is an enormous number of geoscience databases. Unfortunately, the only users of most of these databases are their creators. There are several reasons for this: incompatibility, specificity of tasks and objects, and so on. However, the main obstacles to wide usage of geoscience databases are their complexity for developers and complication for users. Complex architecture leads to high costs that block public access, while complication prevents users from understanding when and how to use the database. Only databases associated with Google Maps lack these drawbacks, but they can hardly be called "geoscience" databases. Nevertheless, an open and simple geoscience database is necessary, at least for educational purposes (see our abstract for ESSI20/EOS12). We have developed a database and a web interface to work with it, now accessible at maps.sch192.ru. In this database a result is the value of a parameter (no matter which) at a station with a certain position, associated with metadata: the date when the result was obtained; the type of station (lake, soil, etc.); and the contributor who sent the result. Each contributor has a profile, which allows the reliability of the data to be estimated. The results can be represented on a Google Maps space image as a point at a certain position, coloured according to the value of the parameter. There are default colour scales, and each registered user can create their own. The results can also be extracted to a *.csv file. For both types of representation one can select the data by date, object type, parameter type, area and contributor. The data are uploaded in *.csv format: Name of the station; Lattitude(dd.dddddd); Longitude(ddd.dddddd); Station type; Parameter type; Parameter value; Date(yyyy-mm-dd). The contributor is recognised on entry. This is the minimal set of features required to connect the value of a parameter with a position and see the results. All the complicated data
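The upload format quoted above (semicolon-separated: station name, latitude, longitude, station type, parameter type, parameter value, date) is simple enough to parse in a few lines. A minimal sketch; the sample record is invented, and the dictionary keys are chosen here, not defined by the database.

```python
# Sketch: parsing one record in the semicolon-separated upload format
# described in the abstract. The sample line below is invented.
import csv
import io
import datetime

def parse_record(line: str) -> dict:
    name, lat, lon, st_type, p_type, p_value, date = next(
        csv.reader(io.StringIO(line), delimiter=";"))
    return {
        "station": name.strip(),
        "lat": float(lat),                   # dd.dddddd
        "lon": float(lon),                   # ddd.dddddd
        "station_type": st_type.strip(),
        "parameter": p_type.strip(),
        "value": float(p_value),
        "date": datetime.date.fromisoformat(date.strip()),  # yyyy-mm-dd
    }

rec = parse_record("Lake Svetloe;61.123456;035.654321;lake;pH;7.4;2011-08-15")
print(rec["value"], rec["date"])  # 7.4 2011-08-15
```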

  8. 40 CFR 80.850 - How is the compliance baseline determined?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false How is the compliance baseline... Requirements § 80.850 How is the compliance baseline determined? (a) The compliance baseline to which annual... equation: ER29mr01.001 Where: TCBase = Compliance baseline toxics value. TBase = Baseline toxics value for...

  9. The MAR databases: development and implementation of databases specific for marine metagenomics

    PubMed Central

    Klemetsen, Terje; Raknes, Inge A; Fu, Juan; Agafonov, Alexander; Balasundaram, Sudhagar V; Tartari, Giacomo; Robertsen, Espen

    2018-01-01

    Abstract We introduce the marine databases MarRef, MarDB and MarCat (https://mmp.sfb.uit.no/databases/), which are publicly available resources that promote marine research and innovation. These data resources, which have been implemented in the Marine Metagenomics Portal (MMP) (https://mmp.sfb.uit.no/), are collections of richly annotated and manually curated contextual (metadata) and sequence databases representing three tiers of accuracy. While MarRef is a database of completely sequenced marine prokaryotic genomes, representing a marine prokaryote reference genome database, MarDB includes all incompletely sequenced marine prokaryotic genomes regardless of level of completeness. The last database, MarCat, represents a gene (protein) catalog of uncultivable (and cultivable) marine genes and proteins derived from marine metagenomics samples. The first versions of MarRef and MarDB contain 612 and 3726 records, respectively. Each record is built up of 106 metadata fields, including attributes for sampling, sequencing, assembly and annotation in addition to organism and taxonomic information. Currently, MarCat contains 1227 records with 55 metadata fields. Ontologies and controlled vocabularies are used in the contextual databases to enhance consistency. The user-friendly web interface lets visitors browse, filter and search the contextual databases and perform BLAST searches against the corresponding sequence databases. All contextual and sequence databases are freely accessible and downloadable from https://s1.sfb.uit.no/public/mar/. PMID:29106641

  10. Consultation-Liaison Psychiatry Literature Database (2003 update). Part I: Consultation - Liaison Literature Database: 2003 update and national lists.

    PubMed

    Strain, James J; Strain, Jay J; Mustafa, Shawkat; Flores, Luis Ruiz Guillermo; Smith, Graeme; Mayou, Richard; Carvalho, Serafim; Chiu, Niem Mu; Zimmermann, Paulo; Fragras, Renerio; Lyons, John; Tsopolis, Nicholas; Malt, Ulrik

    2003-01-01

    Every day 10,000 scientific articles are published. Since the Consultation-Liaison ("C-L") psychiatrist may be asked to consult on a patient with any medical illness, e.g., severe acute respiratory syndrome (SARS), malaria, cancer, stroke, or amyotrophic lateral sclerosis, and on a patient who may be taking any medical drug, methods need to be developed to review the recent literature and maintain an awareness of key current findings. At the same time, teachers need to develop a current listing of seminal papers for trainees and practitioners of this newest cross-over subspecialty of psychiatry, now called Psychosomatic Medicine. Experts, selected because of their writings and acknowledged contributions to a specific clinical area or problem, have examined thousands of citations to choose those articles, chapters, books, or letters that they regard as most important to Psychosomatic Medicine. In addition, psychiatric specialists in six countries have provided their national Psychosomatic Medicine (Consultation-Liaison) lists as examples of what they regard as the most important teaching materials: Australia, Brazil, Greece, Mexico, Portugal, and Taiwan. It is our belief that a cogent, international, systematic review will provide the greatest success in creating a "regionally appropriate" teaching and consultation literature database with world-wide applicability. We review our current progress on this literature database and software, the technical system and data organization involved, the approach used to populate the literature system, and ongoing development plans to bring this system to the physician via mobile technologies.

  11. 75 FR 74706 - Notice of Baseline Filings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-01

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Notice of Baseline Filings November 24, 2010. Centana Intrastate Pipeline, LLC. Docket No. PR10-84-001. Centana Intrastate Pipeline, LLC... applicants listed above submitted a revised baseline filing of their Statement of Operating Conditions for...

  12. Atmospheric phase characteristics of the ALMA long baseline

    NASA Astrophysics Data System (ADS)

    Matsushita, Satoki; Asaki, Yoshiharu; Fomalont, Edward B.; Barkats, Denis; Corder, Stuartt A.; Hills, Richard E.; Kawabe, Ryohei; Maud, Luke T.; Morita, Koh-Ichiro; Nikolic, Bojan; Tilanus, Remo P. J.; Vlahakis, Catherine

    2016-07-01

    Atacama Large Millimeter/submillimeter Array (ALMA) is the world's largest millimeter/submillimeter (mm/submm) interferometer. Along with science observations, ALMA has performed several long baseline campaigns over the last 6 years to characterize and optimize its long baseline capabilities. To achieve the full long baseline capability of ALMA, it is important to understand the characteristics of atmospheric phase fluctuation at long baselines, since it is believed to be the main cause of mm/submm image degradation. For the first time, we present detailed properties of atmospheric phase fluctuation at mm/submm wavelengths from baselines up to 15 km in length. Atmospheric phase fluctuation increases as a function of baseline length with a power-law slope close to 0.6, and much of the data display a shallower slope (0.2-0.3) at baseline lengths greater than about 1 km. Some of the data, on the other hand, show a single slope up to the maximum baseline length of around 15 km. The phase correction method based on water vapor radiometers (WVRs) works well, especially for cases with precipitable water vapor (PWV) greater than 1 mm, typically yielding a 50% or greater decrease in the degree of phase fluctuation. However, a significant amount of atmospheric phase fluctuation remains after the WVR phase correction: about 200 microns rms excess path length (rms phase fluctuation expressed as length) even at PWV less than 1 mm. This result suggests the existence of non-water-vapor sources of phase fluctuation and emphasizes the need for additional phase correction methods, such as band-to-band and/or fast switching.
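The baseline-length scaling described above can be illustrated as a broken power law: rms fluctuation growing as L^0.6 up to some break scale, then more slowly beyond it. The reference amplitude and the break length in this sketch are illustrative parameters, not fitted values from the campaign data.

```python
# Toy broken power law for rms excess path length vs baseline length.
# ref_um (amplitude at the break) and break_m are illustrative parameters.

def rms_path_um(baseline_m: float, ref_um: float = 100.0,
                break_m: float = 1000.0,
                slope_short: float = 0.6, slope_long: float = 0.25) -> float:
    """Rms excess path length (microns) as a function of baseline (m)."""
    if baseline_m <= break_m:
        return ref_um * (baseline_m / break_m) ** slope_short
    return ref_um * (baseline_m / break_m) ** slope_long

for b in (100, 1000, 15000):
    print(b, round(rms_path_um(b), 1))
```

With these illustrative parameters the curve rises steeply below the break and flattens above it, reaching roughly the ~200 micron level near the longest baselines.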

  13. The MAR databases: development and implementation of databases specific for marine metagenomics.

    PubMed

    Klemetsen, Terje; Raknes, Inge A; Fu, Juan; Agafonov, Alexander; Balasundaram, Sudhagar V; Tartari, Giacomo; Robertsen, Espen; Willassen, Nils P

    2018-01-04

    We introduce the marine databases MarRef, MarDB and MarCat (https://mmp.sfb.uit.no/databases/), which are publicly available resources that promote marine research and innovation. These data resources, which have been implemented in the Marine Metagenomics Portal (MMP) (https://mmp.sfb.uit.no/), are collections of richly annotated and manually curated contextual (metadata) and sequence databases representing three tiers of accuracy. While MarRef is a database of completely sequenced marine prokaryotic genomes, serving as a marine prokaryote reference genome database, MarDB includes all incompletely sequenced prokaryotic genomes regardless of their level of completeness. The last database, MarCat, represents a gene (protein) catalog of uncultivable (and cultivable) marine genes and proteins derived from marine metagenomics samples. The first versions of MarRef and MarDB contain 612 and 3726 records, respectively. Each record is built up of 106 metadata fields, including attributes for sampling, sequencing, assembly and annotation in addition to organism and taxonomic information. Currently, MarCat contains 1227 records with 55 metadata fields. Ontologies and controlled vocabularies are used in the contextual databases to enhance consistency. The user-friendly web interface lets visitors browse, filter and search the contextual databases and perform BLAST searches against the corresponding sequence databases. All contextual and sequence databases are freely accessible and downloadable from https://s1.sfb.uit.no/public/mar/. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  14. Databases for LDEF results

    NASA Technical Reports Server (NTRS)

    Bohnhoff-Hlavacek, Gail

    1992-01-01

    One of the objectives of the team supporting the LDEF Systems and Materials Special Investigative Groups is to develop databases of experimental findings. These databases identify the hardware flown, summarize results and conclusions, and provide a system for acknowledging investigators, tracing sources of data, and recording future design suggestions. To date, databases covering the optical experiments and thermal control materials (chromic acid anodized aluminum, silverized Teflon blankets, and paints) have been developed at Boeing. We used the FileMaker Pro software, the database manager for the Macintosh computer produced by the Claris Corporation. It is a flat, text-retrievable database that provides access to the data via an intuitive user interface, without tedious programming. Though this software is available only for the Macintosh computer at this time, copies of the databases can be saved to a format that is readable on a personal computer as well. Further, the data can be exported to more powerful relational databases. This paper summarizes the content, capabilities, and use of the LDEF databases and describes how to get copies of the databases for your own research.

  15. Directory of On-Line Networks, Databases and Bulletin Boards on Assistive Technology. Second Edition. RESNA Technical Assistance Project.

    ERIC Educational Resources Information Center

    RESNA: Association for the Advancement of Rehabilitation Technology, Washington, DC.

    This resource directory provides a selective listing of electronic networks, online databases, and bulletin boards that highlight technology-related services and products. For each resource, the following information is provided: name, address, and telephone number; description; target audience; hardware/software needs to access the system;…

  16. The Effect of Technical Performance on Patient Outcomes in Surgery: A Systematic Review.

    PubMed

    Fecso, Andras B; Szasz, Peter; Kerezov, Georgi; Grantcharov, Teodor P

    2017-03-01

    Systematic review of the effect of intraoperative technical performance on patient outcomes. The operating room is a high-stakes, high-risk environment. As a result, the quality of surgical interventions affecting patient outcomes has been the subject of discussion and research for years. MEDLINE, EMBASE, PsycINFO, and Cochrane databases were searched. All surgical specialties were eligible for inclusion. Data were reviewed with regard to the methods by which technical performance was measured, what patient outcomes were assessed, and how intraoperative technical performance affected patient outcomes. Quality of evidence was assessed using the Medical Education Research Study Quality Instrument (MERSQI). Of the 12,758 studies initially identified, 24 articles (7775 total participants) were ultimately included in this review. Seventeen studies assessed the performance of the faculty alone, 2 assessed both the faculty and trainees, 1 assessed trainees alone, and in 4 studies, the level of the operating surgeon was not specified. In 18 studies, a performance assessment tool was used. Patient outcomes were evaluated using intraoperative complications, short-term morbidity, long-term morbidity, short-term mortality, and long-term mortality. The average MERSQI score was 11.67 (range 9.5-14.5). Twenty-one studies demonstrated that superior technical performance was related to improved patient outcomes. The results of this systematic review demonstrated that superior technical performance positively affects patient outcomes. Despite this initial evidence, more robust research is needed to directly assess intraoperative technical performance and its effect on postoperative patient outcomes using meaningful assessment instruments and reliable processes.

  17. Databases for Microbiologists

    DOE PAGES

    Zhulin, Igor B.

    2015-05-26

    Databases play an increasingly important role in biology. They archive, store, maintain, and share information on genes, genomes, expression data, protein sequences and structures, metabolites and reactions, interactions, and pathways. All these data are critically important to microbiologists. Furthermore, microbiology has its own databases that deal with model microorganisms, microbial diversity, physiology, and pathogenesis. Thousands of biological databases are currently available, and it becomes increasingly difficult to keep up with their development. The purpose of this minireview is to provide a brief survey of current databases that are of interest to microbiologists.

  18. Databases for Microbiologists

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhulin, Igor B.

    Databases play an increasingly important role in biology. They archive, store, maintain, and share information on genes, genomes, expression data, protein sequences and structures, metabolites and reactions, interactions, and pathways. All these data are critically important to microbiologists. Furthermore, microbiology has its own databases that deal with model microorganisms, microbial diversity, physiology, and pathogenesis. Thousands of biological databases are currently available, and it becomes increasingly difficult to keep up with their development. The purpose of this minireview is to provide a brief survey of current databases that are of interest to microbiologists.

  19. Databases for Microbiologists

    PubMed Central

    2015-01-01

    Databases play an increasingly important role in biology. They archive, store, maintain, and share information on genes, genomes, expression data, protein sequences and structures, metabolites and reactions, interactions, and pathways. All these data are critically important to microbiologists. Furthermore, microbiology has its own databases that deal with model microorganisms, microbial diversity, physiology, and pathogenesis. Thousands of biological databases are currently available, and it becomes increasingly difficult to keep up with their development. The purpose of this minireview is to provide a brief survey of current databases that are of interest to microbiologists. PMID:26013493

  20. 75 FR 49918 - Notice of Baseline Filings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-16

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Notice of Baseline Filings August 6... submitted their baseline filing of its Statement of Operating Conditions for services provided under section... an original and 14 copies of the protest or intervention to the Federal Energy Regulatory Commission...

  1. Online Databases in Physics.

    ERIC Educational Resources Information Center

    Sievert, MaryEllen C.; Verbeck, Alison F.

    1984-01-01

    This overview of 47 online sources for physics information available in the United States--including sub-field databases, transdisciplinary databases, and multidisciplinary databases-- notes content, print source, language, time coverage, and databank. Two discipline-specific databases (SPIN and PHYSICS BRIEFS) are also discussed. (EJS)

  2. Development of a personalized training system using the Lung Image Database Consortium and Image Database Resource Initiative Database.

    PubMed

    Lin, Hongli; Wang, Weisheng; Luo, Jiawei; Yang, Xuedong

    2014-12-01

    The aim of this study was to develop a personalized training system using the Lung Image Database Consortium (LIDC) and Image Database Resource Initiative (IDRI) database, because collecting, annotating, and marking a large number of appropriate computed tomography (CT) scans, and providing the capability of dynamically selecting suitable training cases based on the performance levels of trainees and the characteristics of cases, are critical for developing an efficient training system. A novel approach is proposed to develop a personalized radiology training system for the interpretation of lung nodules in CT scans using the LIDC/IDRI database. It provides a Content-Boosted Collaborative Filtering (CBCF) algorithm for predicting the difficulty level of each case for each trainee when selecting suitable cases to meet individual needs, and a diagnostic simulation tool to enable trainees to analyze and diagnose lung nodules with the help of an image processing tool and a nodule retrieval tool. Preliminary evaluation of the system shows that developing a personalized training system for the interpretation of lung nodules is needed and useful for enhancing the professional skills of trainees. The approach of developing personalized training systems using the LIDC/IDRI database is a feasible solution to the challenges of constructing specific training programs in terms of cost and training efficiency. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
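
    The paper's CBCF implementation is not given here, but the general content-boosted collaborative filtering idea can be sketched: fill the sparse trainee-by-case difficulty matrix with content-based predictions, then blend a collaborative estimate with the content score. The function, the blending weight `alpha`, and all numbers below are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def cbcf_predict(ratings, content_pred, alpha=0.5):
    """Content-boosted CF sketch.
    ratings: trainee-by-case difficulty matrix, np.nan where unobserved.
    content_pred: same shape, difficulty predicted from case content alone."""
    # Densify the sparse matrix with content-based predictions.
    filled = np.where(np.isnan(ratings), content_pred, ratings)
    # Crude collaborative estimate: each trainee's mean perceived difficulty.
    user_mean = filled.mean(axis=1, keepdims=True)
    # Blend the collaborative and content-based signals.
    return alpha * user_mean + (1 - alpha) * content_pred

R = np.array([[3.0, np.nan],
              [np.nan, 4.0]])     # observed difficulty ratings (toy data)
C = np.array([[3.0, 5.0],
              [2.0, 4.0]])        # content-based difficulty predictions
P = cbcf_predict(R, C, alpha=0.5)
```

    A real system would replace the user-mean step with neighborhood- or factorization-based CF, but the densify-then-blend structure is the same.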

  3. Fossil-Fuel CO2 Emissions Database and Exploration System

    NASA Astrophysics Data System (ADS)

    Krassovski, M.; Boden, T.

    2012-04-01

    The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL) quantifies the release of carbon from fossil-fuel use and cement production each year at global, regional, and national spatial scales. These estimates are vital to climate change research given the strong evidence suggesting fossil-fuel emissions are responsible for unprecedented levels of carbon dioxide (CO2) in the atmosphere. The CDIAC fossil-fuel emissions time series are based largely on annual energy statistics published for all nations by the United Nations (UN). Publications containing historical energy statistics make it possible to estimate fossil-fuel CO2 emissions back to 1751, before the Industrial Revolution. From these core fossil-fuel CO2 emission time series, CDIAC has developed a number of additional data products to satisfy modeling needs and to address other questions aimed at improving our understanding of the global carbon cycle budget. For example, CDIAC also produces a time series of gridded fossil-fuel CO2 emission estimates and isotopic (e.g., C13) emission estimates. The gridded data are generated using the methodology described in Andres et al. (2011) and provide monthly and annual estimates for 1751-2008 at 1° latitude by 1° longitude resolution. These gridded emission estimates are being used in the latest IPCC Scientific Assessment (AR4). Isotopic estimates are possible thanks to detailed information for individual nations regarding the carbon content of select fuels (e.g., the carbon signature of natural gas from Russia). CDIAC has recently developed a relational database to house these baseline emission estimates and associated derived products, and a web-based interface to help users worldwide query these data holdings. Users can identify, explore and download desired CDIAC
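
    A 1° x 1° emission grid of the kind described can be aggregated over a latitude/longitude box with simple index arithmetic. The sketch below uses a synthetic grid; the storage convention (row 0 at 90°S, column 0 at 180°W) and all values are assumptions for illustration, not the CDIAC product format.

```python
import numpy as np

# Hypothetical 1-degree annual emission grid: 180 latitude rows x 360
# longitude columns, synthetic values in arbitrary units.
grid = np.zeros((180, 360))
grid[120:130, 200:220] = 1.5     # one emitting region (lat 30-40 N, lon 20-40 E)

def box_total(grid, lat_min, lat_max, lon_min, lon_max):
    """Sum emissions inside a lat/lon box, degrees in [-90, 90] / [-180, 180].
    Assumes row 0 is the cell starting at 90 S and column 0 at 180 W."""
    r0, r1 = lat_min + 90, lat_max + 90
    c0, c1 = lon_min + 180, lon_max + 180
    return float(grid[r0:r1, c0:c1].sum())

total = box_total(grid, 30, 40, 20, 40)   # covers the region filled above
```

    National totals, as opposed to box totals, additionally need a country mask assigning each grid cell to a nation.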

  4. Mycobacteriophage genome database.

    PubMed

    Joseph, Jerrine; Rajendran, Vasanthi; Hassan, Sameer; Kumar, Vanaja

    2011-01-01

    Mycobacteriophage genome database (MGDB) is an exclusive repository of the 64 completely sequenced mycobacteriophages with annotated information. It is a comprehensive compilation of the various gene parameters captured from several databases pooled together to empower mycobacteriophage researchers. The MGDB (Version 1.0) comprises 6086 genes from 64 mycobacteriophages classified into 72 families based on the ACLAME database. Manual curation was aided by information available from public databases, which was enriched further by analysis. Its web interface allows browsing as well as querying the classification. The main objective is to collect and organize the complexity inherent to mycobacteriophage protein classification in a rational way. The other objective is to browse the existing and new genomes and describe their functional annotation. The database is available for free at http://mpgdb.ibioinformatics.org/mpgdb.php.

  5. Intrafractional baseline drift during free breathing breast cancer radiation therapy.

    PubMed

    Jensen, Christer Andre; Acosta Roa, Ana María; Lund, Jo-Åsmund; Frengen, Jomar

    2017-06-01

    Intrafraction motion in breast cancer radiation therapy (BCRT) has not yet been thoroughly described in the literature. It has been observed that baseline drift occurs as part of the intrafraction motion. This study aims to measure baseline drift and its incidence in free-breathing BCRT patients using an in-house developed laser system for tracking the position of the sternum. Baseline drift was monitored in 20 right-sided breast cancer patients receiving free-breathing 3D-conformal RT by using an in-house developed laser system that measures one-dimensional distance in the AP direction. A total of 357 patient respiratory traces from treatment sessions were logged and analysed. Baseline drift was compared to patient positioning error measured from in-field portal imaging. The mean overall baseline drift at the end of treatment sessions was -1.3 mm for the patient population. Relatively small baseline drift was observed during the first fraction; however, it was clearly detected already at the second fraction. Over 90% of the baseline drift occurs during the first 3 min of each treatment session. The baseline drift rate for the population was -0.5 ± 0.2 mm/min in the posterior direction during the first minute after localization. Only 4% of the treatment sessions had a 5 mm or larger baseline drift at 5 min, all towards the posterior direction. Mean baseline drift in the posterior direction in free-breathing BCRT was observed in 18 of 20 patients over all treatment sessions. This study shows that there is a substantial baseline drift in free-breathing BCRT patients. No clear baseline drift was observed during the first treatment session; however, baseline drift was markedly present in the rest of the sessions. Intrafraction motion due to baseline drift should be accounted for in margin calculations.
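
    Separating a slow baseline drift from the respiratory cycle can be sketched by smoothing the position trace with a one-breath moving average and fitting a line to the result. The following is an illustrative sketch on a synthetic trace, not the authors' analysis; the sampling rate, breathing period, and drift rate are assumed.

```python
import numpy as np

def baseline_drift_rate(t_s, pos_mm, breath_period_s=4.0, fit_window_s=60.0):
    """Estimate baseline drift in mm/min: average out the respiratory cycle
    with a one-period moving average, then fit a line to the first minute."""
    dt = t_s[1] - t_s[0]
    k = max(1, int(round(breath_period_s / dt)))
    baseline = np.convolve(pos_mm, np.ones(k) / k, mode="same")
    # Skip the first breath to avoid the moving-average edge effect.
    mask = (t_s >= breath_period_s) & (t_s <= fit_window_s)
    slope_mm_per_s, _ = np.polyfit(t_s[mask], baseline[mask], 1)
    return slope_mm_per_s * 60.0

# Synthetic 10 Hz sternum trace: +/-2 mm breathing (4 s period)
# superimposed on a -0.5 mm/min posterior drift.
t = np.linspace(0.0, 300.0, 3001)
trace = -0.5 / 60.0 * t + 2.0 * np.sin(2 * np.pi * t / 4.0)
rate = baseline_drift_rate(t, trace)
```

    With the averaging window matched to the breathing period, the respiratory component cancels and the fitted slope recovers the drift.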

  6. Maize databases

    USDA-ARS?s Scientific Manuscript database

    This chapter is a succinct overview of maize data held in the species-specific database MaizeGDB (the Maize Genomics and Genetics Database), and selected multi-species data repositories, such as Gramene/Ensembl Plants, Phytozome, UniProt and the National Center for Biotechnology Information (NCBI), ...

  7. Extracting Baseline Electricity Usage Using Gradient Tree Boosting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Taehoon; Lee, Dongeun; Choi, Jaesik

    To understand how specific interventions affect a process observed over time, we need to control for the other factors that influence outcomes. Such a model that captures all factors other than the one of interest is generally known as a baseline. In our study of how different pricing schemes affect residential electricity consumption, the baseline would need to capture the impact of outdoor temperature along with many other factors. In this work, we examine a number of different data mining techniques and demonstrate Gradient Tree Boosting (GTB) to be an effective method to build the baseline. We train GTB on data prior to the introduction of new pricing schemes, and apply the known temperature following the introduction of new pricing schemes to predict electricity usage with the expected temperature correction. Our experiments and analyses show that the baseline models generated by GTB capture the core characteristics over the two years with the new pricing schemes. In contrast to the majority of regression based techniques which fail to capture the lag between the peak of daily temperature and the peak of electricity usage, the GTB generated baselines are able to correctly capture the delay between the temperature peak and the electricity peak. Furthermore, subtracting this temperature-adjusted baseline from the observed electricity usage, we find that the resulting values are more amenable to interpretation, which demonstrates that the temperature-adjusted baseline is indeed effective.
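
    A baseline of this kind can be sketched by training a gradient-boosted regressor on pre-intervention data, with lagged temperature features so the model can represent the temperature/usage delay. Everything below is synthetic and illustrative: scikit-learn's GradientBoostingRegressor stands in for the authors' exact configuration, and the temperature/usage relationship is invented.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Synthetic pre-intervention data: usage rises with cooling demand above 22 C.
temp = 20 + 10 * np.sin(np.linspace(0, 20 * np.pi, 2000))
usage = 1.0 + 0.05 * np.maximum(temp - 22, 0) + rng.normal(0, 0.02, temp.size)

# Lagged temperature columns let the trees express a delayed response.
X = np.column_stack([temp, np.roll(temp, 1), np.roll(temp, 2)])[2:]
y = usage[2:]

model = GradientBoostingRegressor(n_estimators=200, max_depth=3)
model.fit(X, y)

baseline = model.predict(X)   # temperature-adjusted baseline
residual = y - baseline       # an intervention effect would show up here
```

    In the post-intervention period, feeding observed temperatures through `model.predict` gives the counterfactual usage to subtract from the measurements.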

  8. An evaluation of selected NASA scientific and technical information products: Results of a pilot study

    NASA Technical Reports Server (NTRS)

    Pinelli, Thomas E.; Glassman, Myron

    1989-01-01

    A pilot study was conducted to evaluate selected NASA scientific and technical information (STI) products. The study, which utilized survey research in the form of a self-administered mail questionnaire, had a two-fold purpose -- to gather baseline data regarding the use and perceived usefulness of selected NASA STI products and to develop/validate questions that could be used in a future study concerned with the role of the U.S. government technical report in aeronautics. The sample frame consisted of 25,000 members of the American Institute of Aeronautics and Astronautics in the U.S. with academic, government or industrial affiliation. Simple random sampling was used to select 2000 individuals to participate in the study. Three hundred fifty-three usable questionnaires (17 percent response rate) were received by the established cutoff date. The findings indicate that: (1) NASA STI is used and is generally perceived as being important; (2) the use rate for NASA-authored conference/meeting papers, journal articles, and technical reports is fairly uniform; (3) a considerable number of respondents are unfamiliar with STAR (Scientific and Technical Aerospace Reports), IAA (International Aerospace Abstracts), SCAN (Selected Current Aerospace Notices), and the RECON on-line retrieval system; (4) a considerable number of respondents who are familiar with these media do not use them; and (5) the perceived quality of NASA-authored journal articles and technical reports is very good.

  9. Specialist Bibliographic Databases

    PubMed Central

    2016-01-01

    Specialist bibliographic databases offer essential online tools for researchers and authors who work on specific subjects and perform comprehensive and systematic syntheses of evidence. This article presents examples of the established specialist databases, which may be of interest to those engaged in multidisciplinary science communication. Access to most specialist databases is through subscription schemes and membership in professional associations. Several aggregators of information and database vendors, such as EBSCOhost and ProQuest, facilitate advanced searches supported by specialist keyword thesauri. Searches of items through specialist databases are complementary to those through multidisciplinary research platforms, such as PubMed, Web of Science, and Google Scholar. Becoming familiar with the functional characteristics of biomedical and nonbiomedical bibliographic search tools is mandatory for researchers, authors, editors, and publishers. The database users are offered updates of the indexed journal lists, abstracts, author profiles, and links to other metadata. Editors and publishers may find the source selection criteria particularly useful and apply for coverage of their peer-reviewed journals and grey literature sources. These criteria are aimed at accepting relevant sources with established editorial policies and quality controls. PMID:27134485

  10. Specialist Bibliographic Databases.

    PubMed

    Gasparyan, Armen Yuri; Yessirkepov, Marlen; Voronov, Alexander A; Trukhachev, Vladimir I; Kostyukova, Elena I; Gerasimov, Alexey N; Kitas, George D

    2016-05-01

    Specialist bibliographic databases offer essential online tools for researchers and authors who work on specific subjects and perform comprehensive and systematic syntheses of evidence. This article presents examples of the established specialist databases, which may be of interest to those engaged in multidisciplinary science communication. Access to most specialist databases is through subscription schemes and membership in professional associations. Several aggregators of information and database vendors, such as EBSCOhost and ProQuest, facilitate advanced searches supported by specialist keyword thesauri. Searches of items through specialist databases are complementary to those through multidisciplinary research platforms, such as PubMed, Web of Science, and Google Scholar. Becoming familiar with the functional characteristics of biomedical and nonbiomedical bibliographic search tools is mandatory for researchers, authors, editors, and publishers. The database users are offered updates of the indexed journal lists, abstracts, author profiles, and links to other metadata. Editors and publishers may find the source selection criteria particularly useful and apply for coverage of their peer-reviewed journals and grey literature sources. These criteria are aimed at accepting relevant sources with established editorial policies and quality controls.

  11. RTO Technical Publications: A Quarterly Listing

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This is a listing of recent unclassified RTO technical publications processed by the NASA Center for AeroSpace Information covering the period from July 1, 2005 to September 30, 2005; and available in the NASA Aeronautics and Space Database. Contents include: Aeroelastic Deformation: Adaptation of Wind Tunnel Measurement Concepts to Full-Scale Vehicle Flight Testing; Actively Controlling Buffet-Induced Excitations; Modelling and Simulation to Address NATO's New and Existing Military Requirements; Latency in Visionic Systems: Test Methods and Requirements; Personal Hearing Protection including Active Noise Reduction; Virtual Laboratory Enabling Collaborative Research in Applied Vehicle Technologies; A Method to Analyze Tail Buffet Loads of Aircraft; Particle Image Velocimetry Measurements to Evaluate the Effectiveness of Deck-Edge Columnar Vortex Generators on Aircraft Carriers; Introduction to Flight Test Engineering, Volume 14; Pathological Aspects and Associated Biodynamics in Aircraft Accident Investigation;

  12. Creating Your Own Database.

    ERIC Educational Resources Information Center

    Blair, John C., Jr.

    1982-01-01

    Outlines the important factors to be considered in selecting a database management system for use with a microcomputer and presents a series of guidelines for developing a database. General procedures, report generation, data manipulation, information storage, word processing, data entry, database indexes, and relational databases are among the…

  13. NASA STI Database, Aerospace Database and ARIN coverage of 'space law'

    NASA Technical Reports Server (NTRS)

    Buchan, Ronald L.

    1992-01-01

    The space-law coverage provided by the NASA STI Database, the Aerospace Database, and ARIN is briefly described. Particular attention is given to the space law content of the two databases and of ARIN, the NASA Thesaurus space law terminology, space law publication forms, and the availability of the space law literature.

  14. Integrated application of active controls (IAAC) technology to an advanced subsonic transport project. Conventional baseline configuration study

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Characteristics of the U.S. domestic fleet were evaluated to determine the mission characteristics that would have the most impact on U.S. transport fuel use in the future. This resulted in selection of a 197-passenger (plus cargo), about 3710-km (2000 nmi) mission. The existing database was reviewed and additional analysis was conducted as necessary to complete the technical descriptions. The resulting baseline configuration utilizes a double-lobe, but nearly circular, body with seven-abreast seating. External characteristics feature an 8.71 aspect ratio, 31.5-degree sweep wing, a T-tail empennage, and a dual CF6-6D2, wing-mounted engine arrangement. It provides for 22 LD-2 or 11 LD-3 containers plus bulk cargo in the lower lobe. Passenger/cargo loading, servicing provisions, taxi/takeoff speeds, and field length characteristics are all compatible with accepted airline operations and regulatory provisions. The baseline configuration construction uses conventional aluminum structure except for advanced aluminum alloys and a limited amount of graphite epoxy secondary structure. Modern systems are used, including advanced guidance, navigation, and controls which emphasize application of digital electronics and advanced displays.

  15. Databases: Beyond the Basics.

    ERIC Educational Resources Information Center

    Whittaker, Robert

    This paper offers an elementary description of database characteristics and then provides a survey of databases that may be useful to the teacher and researcher in Slavic and East European languages and literatures. The survey focuses on commercial databases that are available, usable, and needed. Individual databases discussed include:…

  16. 75 FR 47291 - Notice of Baseline Filings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-05

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Notice of Baseline Filings July 29, 2010. ONEOK Gas Storage, L.L.C Docket No. PR10-67-000. Atmos Energy--Kentucky/Mid-States Division Docket No... applicants listed above submitted their baseline filing of its Statement of Operating Conditions for services...

  17. Database Dictionary for Ethiopian National Ground-Water Database (ENGDA) Data Fields

    USGS Publications Warehouse

    Kuniansky, Eve L.; Litke, David W.; Tucci, Patrick

    2007-01-01

    This document describes the data fields that are used for both field forms and the Ethiopian National Ground-water Database (ENGDA) tables associated with information stored about production wells, springs, test holes, test wells, and water level or water-quality observation wells. Several different words are used in this database dictionary and in the ENGDA database to describe a narrow shaft constructed in the ground. The most general term is borehole, which is applicable to any type of hole. A well is a borehole specifically constructed to extract water from the ground; however, for this data dictionary and for the ENGDA database, the words well and borehole are used interchangeably. A production well is defined as any well used for water supply and includes hand-dug wells, small-diameter bored wells equipped with hand pumps, or large-diameter bored wells equipped with large-capacity motorized pumps. Test holes are borings made to collect information about the subsurface with continuous core or non-continuous core and/or where geophysical logs are collected. Test holes are not converted into wells. A test well is a well constructed for hydraulic testing of an aquifer in order to plan a larger ground-water production system. A water-level or water-quality observation well is a well that is used to collect information about an aquifer and not used for water supply. A spring is any naturally flowing, local, ground-water discharge site. The database dictionary is designed to help define all fields on both the field data collection forms (provided in attachment 2 of this report) and the ENGDA software screen entry forms (described in Litke, 2007). The data entered into each screen entry field are stored in relational database tables within the computer database. The organization of the database dictionary is based on field data collection and the field forms, because this is what the majority of people will use. After each field, however, the

  18. Reflective Database Access Control

    ERIC Educational Resources Information Center

    Olson, Lars E.

    2009-01-01

    "Reflective Database Access Control" (RDBAC) is a model in which a database privilege is expressed as a database query itself, rather than as a static privilege contained in an access control list. RDBAC aids the management of database access controls by improving the expressiveness of policies. However, such policies introduce new interactions…
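
    The core RDBAC idea, storing a privilege as a query rather than as a static access control list entry, can be sketched with SQLite: the policy below is itself a query over the protected table, so access decisions automatically reflect the current data. The schema, policy, and data are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE employees (name TEXT, manager TEXT, salary INTEGER);
    INSERT INTO employees VALUES ('alice', NULL, 90),
                                 ('bob',   'alice', 70),
                                 ('carol', 'bob',   50);
""")

# Reflective policy: a user may read salary rows of the employees they
# manage. The privilege is expressed as a query, not a static grant.
POLICY = "SELECT name, salary FROM employees WHERE manager = ?"

def visible_rows(user):
    """Evaluate the stored policy query for the given user."""
    return conn.execute(POLICY, (user,)).fetchall()

rows = visible_rows("alice")   # alice manages bob, so she sees bob's row
```

    If carol is later reassigned to alice, alice's visible set changes with no ACL update, which is the expressiveness gain the model claims.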

  19. Knowledge Discovery in Biological Databases for Revealing Candidate Genes Linked to Complex Phenotypes.

    PubMed

    Hassani-Pak, Keywan; Rawlings, Christopher

    2017-06-13

    Genetics and "omics" studies designed to uncover genotype to phenotype relationships often identify large numbers of potential candidate genes, among which the causal genes are hidden. Scientists generally lack the time and technical expertise to review all relevant information available from the literature, from key model species and from a potentially wide range of related biological databases in a variety of data formats with variable quality and coverage. Computational tools are needed for the integration and evaluation of heterogeneous information in order to prioritise candidate genes and components of interaction networks that, if perturbed through potential interventions, have a positive impact on the biological outcome in the whole organism without producing negative side effects. Here we review several bioinformatics tools and databases that play an important role in biological knowledge discovery and candidate gene prioritization. We conclude with several key challenges that need to be addressed in order to facilitate biological knowledge discovery in the future.

  20. STRBase: a short tandem repeat DNA database for the human identity testing community

    PubMed Central

    Ruitberg, Christian M.; Reeder, Dennis J.; Butler, John M.

    2001-01-01

    The National Institute of Standards and Technology (NIST) has compiled and maintained a Short Tandem Repeat DNA Internet Database (http://www.cstl.nist.gov/biotech/strbase/) since 1997, commonly referred to as STRBase. This database is an information resource for the forensic DNA typing community with details on commonly used short tandem repeat (STR) DNA markers. STRBase consolidates and organizes the abundant literature on this subject to facilitate on-going efforts in DNA typing. Observed alleles and annotated sequence for each STR locus are described along with a review of STR analysis technologies. Additionally, commercially available STR multiplex kits are described, published polymerase chain reaction (PCR) primer sequences are reported, and validation studies conducted by a number of forensic laboratories are listed. To supplement the technical information, addresses for scientists and hyperlinks to organizations working in this area are available, along with a comprehensive reference list of over 1300 publications on STRs used for DNA typing purposes. PMID:11125125

  1. Feasibility and utility of applications of the common data model to multiple, disparate observational health databases.

    PubMed

    Voss, Erica A; Makadia, Rupa; Matcho, Amy; Ma, Qianli; Knoll, Chris; Schuemie, Martijn; DeFalco, Frank J; Londhe, Ajit; Zhu, Vivienne; Ryan, Patrick B

    2015-05-01

    To evaluate the utility of applying the Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM) across multiple observational databases within an organization and to apply standardized analytics tools for conducting observational research. Six deidentified patient-level datasets were transformed to the OMOP CDM. We evaluated the extent of information loss that occurred through the standardization process. We developed a standardized analytic tool to replicate the cohort construction process from a published epidemiology protocol and applied the analysis to all 6 databases to assess time-to-execution and comparability of results. Transformation to the CDM resulted in minimal information loss across all 6 databases. Patients and observations were excluded due to identified data quality issues in the source system; 96% to 99% of condition records and 90% to 99% of drug records were successfully mapped into the CDM using the standard vocabulary. The full cohort replication and descriptive baseline summary was executed for 2 cohorts in 6 databases in less than 1 hour. The standardization process improved data quality, increased efficiency, and facilitated cross-database comparisons to support a more systematic approach to observational research. Comparisons across data sources showed consistency in the impact of the protocol's inclusion criteria and identified differences in patient characteristics and coding practices across databases. Standardizing data structure (through a CDM), content (through a standard vocabulary with source code mappings), and analytics can enable an institution to apply a network-based approach to observational research across multiple, disparate observational health databases. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
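The vocabulary-mapping step described in this abstract can be illustrated with a minimal sketch. All names, codes, and concept IDs below are hypothetical stand-ins, not actual OMOP vocabulary content: source condition codes are looked up in a source-to-standard table, mapped records become CDM rows, and unmapped records are counted so that information loss can be reported.

```python
# Toy source-to-standard mapping (real mappings come from the OMOP
# vocabulary tables; these concept IDs are illustrative only).
SOURCE_TO_CONCEPT = {
    "ICD9:250.00": 201826,
    "ICD9:401.9": 320128,
}

def map_condition_records(records):
    """Return (mapped CDM rows, fraction of records mapped)."""
    mapped = []
    for rec in records:
        concept_id = SOURCE_TO_CONCEPT.get(rec["source_code"])
        if concept_id is not None:
            mapped.append({
                "person_id": rec["person_id"],
                "condition_concept_id": concept_id,
                "condition_source_value": rec["source_code"],
            })
    rate = len(mapped) / len(records) if records else 0.0
    return mapped, rate

records = [
    {"person_id": 1, "source_code": "ICD9:250.00"},
    {"person_id": 1, "source_code": "ICD9:401.9"},
    {"person_id": 2, "source_code": "ICD9:V99.9"},  # no standard mapping
]
rows, rate = map_condition_records(records)
```

Reporting `rate` per table is one way an institution could quantify the "minimal information loss" claim for its own transform.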

  2. Methodological Issues Surrounding the Use of Baseline Health-Related Quality of Life Data to Inform Trial-Based Economic Evaluations of Interventions Within Emergency and Critical Care Settings: A Systematic Literature Review.

    PubMed

    Dritsaki, Melina; Achana, Felix; Mason, James; Petrou, Stavros

    2017-05-01

    Trial-based cost-utility analyses require health-related quality of life data that generate utility values in order to express health outcomes in terms of quality-adjusted life years (QALYs). Assessments of baseline health-related quality of life are problematic where trial participants are incapacitated or critically ill at the time of randomisation. This review aims to identify and critique methods for handling non-availability of baseline health-related quality of life data in trial-based cost-utility analyses within emergency and critical illness settings. A systematic literature review was conducted, following PRISMA guidelines, to identify trial-based cost-utility analyses of interventions within emergency and critical care settings. Databases searched included the National Institute for Health Research (NIHR) Journals Library (1991-July 2016), the Cochrane Library (all years), the National Health Service (NHS) Economic Evaluation Database (all years) and Ovid MEDLINE/Embase (without time restriction). Strategies employed to handle non-availability of baseline health-related quality of life data in final QALY estimations were identified and critiqued. A total of 4224 published reports were screened, 19 of which met the study inclusion criteria (mean trial size 1670): 14 (74%) from the UK, four (21%) from other European countries and one (5%) from India. Twelve studies (63%) were based in emergency departments and seven (37%) in intensive care units. Only one study was able to elicit patient-reported health-related quality of life at baseline. To overcome the lack of baseline data when estimating QALYs, eight studies (42%) assigned a fixed utility weight corresponding to either death, an unconscious health state or a country-specific norm to patients at baseline, four (21%) ignored baseline utilities, three (16%) applied values from another study, one (5%) generated utility values via retrospective recall and one (5%) elicited utilities from experts. A preliminary
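The sensitivity of QALY estimates to the assumed baseline utility can be shown with a small worked sketch. Times, utilities, and the "population norm" value below are hypothetical, not taken from the review: QALYs are the trapezoidal area under the utility curve, so swapping the fixed baseline weight (death vs. a country-specific norm) shifts the estimate.

```python
def qaly(times, utilities):
    """Trapezoidal area under the utility curve, in QALYs.
    Times in years; utilities on the 0 (death) to 1 (full health) scale."""
    total = 0.0
    for i in range(1, len(times)):
        total += (utilities[i - 1] + utilities[i]) / 2.0 * (times[i] - times[i - 1])
    return total

followup = [0.0, 0.25, 0.5, 1.0]   # baseline, 3, 6, and 12 months
measured = [0.55, 0.70, 0.75]      # utilities actually observed post-baseline

# Strategy A: fixed baseline utility of 0 (death / unconscious state).
qaly_zero = qaly(followup, [0.0] + measured)
# Strategy B: fixed baseline utility at a hypothetical population norm.
qaly_norm = qaly(followup, [0.85] + measured)
```

With these made-up numbers the two strategies differ by about 0.11 QALYs for the same patient, which is why the choice of imputation strategy matters for trial conclusions.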

  3. Multi-baseline bootstrapping at the Navy precision optical interferometer

    NASA Astrophysics Data System (ADS)

    Armstrong, J. T.; Schmitt, H. R.; Mozurkewich, D.; Jorgensen, A. M.; Muterspaugh, M. W.; Baines, E. K.; Benson, J. A.; Zavala, Robert T.; Hutter, D. J.

    2014-07-01

    The Navy Precision Optical Interferometer (NPOI) was designed from the beginning to support baseline bootstrapping with equally spaced array elements. The motivation was the desire to image the surfaces of resolved stars with the maximum resolution possible with a six-element array. Bootstrapping two baselines together to track fringes on a third baseline has been used at the NPOI for many years, but the capabilities of the fringe tracking software did not permit us to bootstrap three or more baselines together. Recently, both a new backend (VISION; Tennessee State Univ.) and new hardware and firmware (AZ Embedded Systems and New Mexico Tech, respectively) for the current hybrid backend have made multi-baseline bootstrapping possible.

  4. 76 FR 5797 - Notice of Baseline Filings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-02

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR10-114-001; Docket No. PR10-129-001; Docket No. PR10-131- 001; Docket No. PR10-68-002 Not Consolidated] Notice of Baseline... applicants listed above submitted a revised baseline filing of their Statement of Operating Conditions for...

  5. Short and Long Baseline Neutrino Experiments

    NASA Astrophysics Data System (ADS)

    Autiero, Dario

    2005-04-01

    These two lectures discuss the past and current neutrino oscillation experiments performed with man-made neutrino sources, such as accelerators and nuclear reactors. The search for neutrino oscillations is a remarkable effort, which has been performed over three decades. It is therefore interesting to discuss the short and long baseline neutrino experiments in their historical context and to see how this line of research evolved up to the present generation of experiments, looking at what was learnt from past experiments and how this experience is used in the current ones. The first lecture focuses on the past generation of short baseline experiments (NOMAD and CHORUS) performed at CERN and ends with LSND and MINIBOONE. The second lecture discusses how the line of long baseline experiments developed after the CHOOZ and atmospheric neutrino results, and presents in detail the K2K and MINOS experiments and the CNGS program.

  6. HormoneBase, a population-level database of steroid hormone levels across vertebrates

    PubMed Central

    Vitousek, Maren N.; Johnson, Michele A.; Donald, Jeremy W.; Francis, Clinton D.; Fuxjager, Matthew J.; Goymann, Wolfgang; Hau, Michaela; Husak, Jerry F.; Kircher, Bonnie K.; Knapp, Rosemary; Martin, Lynn B.; Miller, Eliot T.; Schoenle, Laura A.; Uehling, Jennifer J.; Williams, Tony D.

    2018-01-01

    Hormones are central regulators of organismal function and flexibility that mediate a diversity of phenotypic traits from early development through senescence. Yet despite these important roles, basic questions about how and why hormone systems vary within and across species remain unanswered. Here we describe HormoneBase, a database of circulating steroid hormone levels and their variation across vertebrates. This database aims to provide all available data on the mean, variation, and range of plasma glucocorticoids (both baseline and stress-induced) and androgens in free-living and un-manipulated adult vertebrates. HormoneBase (www.HormoneBase.org) currently includes >6,580 entries from 476 species, reported in 648 publications from 1967 to 2015, and unpublished datasets. Entries are associated with data on the species and population, sex, year and month of study, geographic coordinates, life history stage, method and latency of hormone sampling, and analysis technique. This novel resource could be used for analyses of the function and evolution of hormone systems, and the relationships between hormonal variation and a variety of processes including phenotypic variation, fitness, and species distributions. PMID:29786693

  7. Method and apparatus for reliable inter-antenna baseline determination

    NASA Technical Reports Server (NTRS)

    Wilson, John M. (Inventor)

    2001-01-01

    Disclosed is a method for inter-antenna baseline determination that uses an antenna configuration comprising a pair of relatively closely spaced antennas and other pairs of distant antennas. The closely spaced pair provides a short baseline having an integer ambiguity that may be searched exhaustively to identify the correct set of integers. This baseline is then used as a priori information to aid the determination of longer baselines that, once determined, may be used for accurate run-time attitude determination.

  8. Image Databases.

    ERIC Educational Resources Information Center

    Pettersson, Rune

    Different kinds of pictorial databases are described with respect to aims, user groups, search possibilities, storage, and distribution. Some specific examples are given for databases used for the following purposes: (1) labor markets for artists; (2) document management; (3) telling a story; (4) preservation (archives and museums); (5) research;…

  9. ALMA Long Baseline Campaigns: Phase Characteristics of Atmosphere at Long Baselines in the Millimeter and Submillimeter Wavelengths

    NASA Astrophysics Data System (ADS)

    Matsushita, Satoki; Asaki, Yoshiharu; Fomalont, Edward B.; Morita, Koh-Ichiro; Barkats, Denis; Hills, Richard E.; Kawabe, Ryohei; Maud, Luke T.; Nikolic, Bojan; Tilanus, Remo P. J.; Vlahakis, Catherine; Whyborn, Nicholas D.

    2017-03-01

    We present millimeter- and submillimeter-wave phase characteristics measured between 2012 and 2014 during Atacama Large Millimeter/submillimeter Array long baseline campaigns. This paper presents the first detailed investigation of the characteristics of phase fluctuation and phase correction methods obtained with baseline lengths up to ˜15 km. The basic phase fluctuation characteristics can be expressed with the spatial structure function (SSF). Most of the SSFs show that the phase fluctuation increases as a function of baseline length, with a power-law slope of ˜0.6. In many cases, we find that the slope becomes shallower (average of ˜0.2-0.3) at baseline lengths longer than ˜1 km, namely showing a turn-over in the SSF. These power-law slopes do not change with the amount of precipitable water vapor (PWV), but the fitted constants have a weak correlation with PWV, so that the phase fluctuation at a baseline length of 10 km also increases as a function of PWV. The phase correction method using water vapor radiometers (WVRs) works well, especially for cases where PWV > 1 mm, reducing the degree of phase fluctuations by a factor of two in many cases. However, phase fluctuations still remain after the WVR phase correction, suggesting the existence of another turbulent constituent that causes phase fluctuation. This is supported by occasional SSFs that do not exhibit any turn-over; these are only seen when the PWV is low (i.e., when the WVR phase correction works less effectively) or after WVR phase correction. This means that the phase fluctuation caused by this turbulent constituent is inherently smaller than that caused by water vapor. Since in these rare cases there is no turn-over in the SSF up to the maximum baseline length of ˜15 km, this turbulent constituent must have a scale height of 10 km or more, and thus cannot be water vapor, whose scale height is around 1 km. Based on these characteristics, this large scale height turbulent constituent is likely
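The power-law slope of a structure function like the one above can be estimated by a least-squares fit in log-log space. The sketch below uses synthetic data generated with a slope of 0.6 (the typical value the abstract reports below the ~1 km turn-over); all numbers are illustrative, not ALMA measurements.

```python
import math

def powerlaw_slope(baselines_m, phase_rms_deg):
    """Least-squares slope of log(phase rms) vs. log(baseline length),
    i.e. the exponent alpha in phase_rms ∝ baseline**alpha."""
    xs = [math.log(b) for b in baselines_m]
    ys = [math.log(p) for p in phase_rms_deg]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Synthetic SSF samples following a pure 0.6 power law.
baselines = [50, 100, 200, 400, 800]                  # metres
phases = [10.0 * (b / 100.0) ** 0.6 for b in baselines]  # degrees rms
slope = powerlaw_slope(baselines, phases)
```

A turn-over like the one described in the abstract would show up as a smaller fitted slope when the fit is restricted to baselines longer than ~1 km.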

  10. 75 FR 70732 - Notice of Baseline Filings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-18

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR11-71-000; Docket No. PR11-72-000; Docket No. PR11-73- 000] Notice of Baseline Filings November 10, 2010. Docket No. PR11-71-000..., 2010, the applicants listed above submitted their baseline filing of their Statement of Operating...

  11. Does teaching non-technical skills to medical students improve those skills and simulated patient outcome?

    PubMed

    Hagemann, Vera; Herbstreit, Frank; Kehren, Clemens; Chittamadathil, Jilson; Wolfertz, Sandra; Dirkmann, Daniel; Kluge, Annette; Peters, Jürgen

    2017-03-29

    The purpose of this study is to evaluate the effects of a tailor-made, non-technical skills seminar on medical students' behaviour, attitudes, and performance during simulated patient treatment. Seventy-seven students were randomized to either a non-technical skills seminar (NTS group, n=43) or a medical seminar (control group, n=34). Human patient simulation was used as an evaluation tool. Before the seminars, all students performed the same simulated emergency scenario to provide baseline measurements. After the seminars, all students were exposed to a second scenario, and behavioural markers for evaluating their non-technical skills were rated. Furthermore, teamwork-relevant attitudes were measured before and after the scenarios, and perceived stress was measured following each simulation. All simulations were also evaluated for various medical endpoints. Non-technical skills concerning situation awareness (p<.01, r=0.5) and teamwork (p<.01, r=0.45) improved from simulation I to II in the NTS group. Decision making improved in both groups (NTS: p<.01, r=0.39; control: p<.01, r=0.46). The attitude 'handling errors' improved significantly in the NTS group (p<.05, r=0.34). Perceived stress decreased from simulation I to II in both groups. Medical endpoints and patient outcomes did not differ significantly between the groups in simulation II. This study highlights the effectiveness of a single brief seminar on non-technical skills in improving students' non-technical skills. As a next step, to improve students' handling of emergencies and patient outcomes, non-technical skills seminars should be accompanied by exercises and more broadly embedded in the medical school curriculum.

  12. Baseline Response Levels Are a Nuisance in Infant Contingency Learning

    ERIC Educational Resources Information Center

    Millar, W. S.; Weir, Catherine

    2015-01-01

    The impact of differences in level of baseline responding on contingency learning in the first year was examined by considering the response acquisition of infants classified into baseline response quartiles. Whereas the three lower baseline groups showed the predicted increment in responding to a contingency, the highest baseline responders did…

  13. Pan-cancer analysis reveals technical artifacts in TCGA germline variant calls.

    PubMed

    Buckley, Alexandra R; Standish, Kristopher A; Bhutani, Kunal; Ideker, Trey; Lasken, Roger S; Carter, Hannah; Harismendy, Olivier; Schork, Nicholas J

    2017-06-12

    Cancer research to date has largely focused on somatically acquired genetic aberrations. In contrast, the degree to which germline, or inherited, variation contributes to tumorigenesis remains unclear, possibly due to a lack of accessible germline variant data. Here we called germline variants on 9618 cases from The Cancer Genome Atlas (TCGA) database representing 31 cancer types. We identified batch effects affecting loss of function (LOF) variant calls that can be traced back to differences in the way the sequence data were generated both within and across cancer types. Overall, LOF indel calls were more sensitive to technical artifacts than LOF Single Nucleotide Variant (SNV) calls. In particular, whole genome amplification of DNA prior to sequencing led to an artificially increased burden of LOF indel calls, which confounded association analyses relating germline variants to tumor type despite stringent indel filtering strategies. The samples affected by these technical artifacts include all acute myeloid leukemia and practically all ovarian cancer samples. We demonstrate how technical artifacts induced by whole genome amplification of DNA can lead to false positive germline-tumor type associations and suggest TCGA whole genome amplified samples be used with caution. This study draws attention to the need to be sensitive to problems associated with a lack of uniformity in data generation in TCGA data.

  14. Updating the 2001 National Land Cover Database Impervious Surface Products to 2006 using Landsat imagery change detection methods

    USGS Publications Warehouse

    Xian, George; Homer, Collin G.

    2010-01-01

    A prototype method was developed to update the U.S. Geological Survey (USGS) National Land Cover Database (NLCD) 2001 to a nominal date of 2006. NLCD 2001 is widely used as a baseline for national land cover and impervious cover conditions. To enable the updating of this database in an optimal manner, methods are designed to be accomplished by individual Landsat scene. Using conservative change thresholds based on land cover classes, areas of change and no-change were segregated from change vectors calculated from normalized Landsat scenes from 2001 and 2006. By sampling from NLCD 2001 impervious surface in unchanged areas, impervious surface predictions were estimated for changed areas within an urban extent defined by a companion land cover classification. Methods were developed and tested for national application across six study sites containing a variety of urban impervious surface. Results show the vast majority of impervious surface change associated with urban development was captured, with overall RMSE from 6.86 to 13.12% for these areas. Changes of urban development density were also evaluated by characterizing the categories of change by percentile for impervious surface. This prototype method provides a relatively low cost, flexible approach to generate updated impervious surface using NLCD 2001 as the baseline.
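The per-pixel change labelling at the heart of the update method can be sketched roughly as follows. The band values and the threshold below are invented for illustration (the actual method uses conservative, land-cover-class-specific thresholds on normalized Landsat scenes): a change-vector magnitude is computed between the two dates and pixels exceeding the threshold are flagged as changed.

```python
import math

def change_magnitude(bands_t1, bands_t2):
    """Euclidean change-vector magnitude between two dates, per pixel."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(bands_t1, bands_t2)))

def flag_change(bands_t1, bands_t2, threshold):
    """True if the pixel's change magnitude exceeds the class threshold."""
    return change_magnitude(bands_t1, bands_t2) > threshold

# Toy reflectance-like values for one pixel in 2001 and 2006.
unchanged = flag_change([0.12, 0.20, 0.18], [0.13, 0.21, 0.18], threshold=0.05)
changed   = flag_change([0.12, 0.20, 0.18], [0.25, 0.10, 0.30], threshold=0.05)
```

In the prototype, impervious-surface values are then predicted only inside the flagged (changed) areas, with unchanged areas carried forward from NLCD 2001.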

  15. 40 CFR 80.93 - Individual baseline submission and approval.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... of its individual baseline to EPA (Fuel Studies and Standards Branch, Baseline Submission, U.S. EPA... Studies and Standards Branch, Baseline Petition, U.S. EPA, 2565 Plymouth Road, Ann Arbor, Michigan 48105..., used in the determination of a given fuel parameter; (iii) Identification of test method. If not per...

  16. 40 CFR 80.93 - Individual baseline submission and approval.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... of its individual baseline to EPA (Fuel Studies and Standards Branch, Baseline Submission, U.S. EPA... Studies and Standards Branch, Baseline Petition, U.S. EPA, 2565 Plymouth Road, Ann Arbor, Michigan 48105..., used in the determination of a given fuel parameter; (iii) Identification of test method. If not per...

  17. 40 CFR 80.93 - Individual baseline submission and approval.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... of its individual baseline to EPA (Fuel Studies and Standards Branch, Baseline Submission, U.S. EPA... Studies and Standards Branch, Baseline Petition, U.S. EPA, 2565 Plymouth Road, Ann Arbor, Michigan 48105..., used in the determination of a given fuel parameter; (iii) Identification of test method. If not per...

  18. 40 CFR 80.93 - Individual baseline submission and approval.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... of its individual baseline to EPA (Fuel Studies and Standards Branch, Baseline Submission, U.S. EPA... Studies and Standards Branch, Baseline Petition, U.S. EPA, 2565 Plymouth Road, Ann Arbor, Michigan 48105..., used in the determination of a given fuel parameter; (iii) Identification of test method. If not per...

  19. The Chicago Thoracic Oncology Database Consortium: A Multisite Database Initiative.

    PubMed

    Won, Brian; Carey, George B; Tan, Yi-Hung Carol; Bokhary, Ujala; Itkonen, Michelle; Szeto, Kyle; Wallace, James; Campbell, Nicholas; Hensing, Thomas; Salgia, Ravi

    2016-03-16

    An increasing amount of clinical data is available to biomedical researchers, but specifically designed database and informatics infrastructures are needed to handle this data effectively. Multiple research groups should be able to pool and share this data in an efficient manner. The Chicago Thoracic Oncology Database Consortium (CTODC) was created to standardize data collection and facilitate the pooling and sharing of data at institutions throughout Chicago and across the world. We assessed the CTODC by conducting a proof of principle investigation on lung cancer patients who took erlotinib. This study does not look into epidermal growth factor receptor (EGFR) mutations and tyrosine kinase inhibitors, but rather it discusses the development and utilization of the database involved.  We have implemented the Thoracic Oncology Program Database Project (TOPDP) Microsoft Access, the Thoracic Oncology Research Program (TORP) Velos, and the TORP REDCap databases for translational research efforts. Standard operating procedures (SOPs) were created to document the construction and proper utilization of these databases. These SOPs have been made available freely to other institutions that have implemented their own databases patterned on these SOPs. A cohort of 373 lung cancer patients who took erlotinib was identified. The EGFR mutation statuses of patients were analyzed. Out of the 70 patients that were tested, 55 had mutations while 15 did not. In terms of overall survival and duration of treatment, the cohort demonstrated that EGFR-mutated patients had a longer duration of erlotinib treatment and longer overall survival compared to their EGFR wild-type counterparts who received erlotinib. The investigation successfully yielded data from all institutions of the CTODC. While the investigation identified challenges, such as the difficulty of data transfer and potential duplication of patient data, these issues can be resolved with greater cross-communication between

  20. Relationship between intraoperative non-technical performance and technical events in bariatric surgery.

    PubMed

    Fecso, A B; Kuzulugil, S S; Babaoglu, C; Bener, A B; Grantcharov, T P

    2018-03-30

    The operating theatre is a unique environment with complex team interactions, where technical and non-technical performance affect patient outcomes. The correlation between technical and non-technical performance, however, remains underinvestigated. The purpose of this study was to explore these interactions in the operating theatre. A prospective single-centre observational study was conducted at a tertiary academic medical centre. One surgeon and three fellows participated as main operators. All patients who underwent a laparoscopic Roux-en-Y gastric bypass and had the procedures captured using the Operating Room Black Box® platform were included. Technical assessment was performed using the Objective Structured Assessment of Technical Skills and Generic Error Rating Tool instruments. For non-technical assessment, the Non-Technical Skills for Surgeons (NOTSS) and Scrub Practitioners' List of Intraoperative Non-Technical Skills (SPLINTS) tools were used. Spearman rank-order correlation and N-gram statistics were conducted. Fifty-six patients were included in the study and 90 procedural steps (gastrojejunostomy and jejunojejunostomy) were analysed. There was a moderate to strong correlation between technical adverse events (rs = 0.417-0.687), rectifications (rs = 0.380-0.768) and non-technical performance of the surgical and nursing teams (NOTSS and SPLINTS). N-gram statistics showed that after technical errors, events and prior rectifications, the staff surgeon and the scrub nurse exhibited the most positive non-technical behaviours, irrespective of operator (staff surgeon or fellow). This study demonstrated that technical and non-technical performances are related, on both an individual and a team level. Valuable data can be obtained around intraoperative errors, events and rectifications. © 2018 BJS Society Ltd Published by John Wiley & Sons Ltd.

  1. Interpretative Ruling: Allowable Emissions Baseline

    EPA Pesticide Factsheets

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  2. Simulator training and non-technical factors improve laparoscopic performance among OBGYN trainees.

    PubMed

    Ahlborg, Liv; Hedman, Leif; Nisell, Henry; Felländer-Tsai, Li; Enochsson, Lars

    2013-10-01

    To investigate how simulator training and non-technical factors affect laparoscopic performance among residents in obstetrics and gynecology. In this prospective study, trainees were randomized into three groups. The first group was allocated to proficiency-based training in the LapSimGyn(®) virtual reality simulator. The second group received additional structured mentorship during subsequent laparoscopies. The third group served as control group. At baseline an operation was performed and visuospatial ability, flow and self-efficacy were assessed. All groups subsequently performed three tubal occlusions. Self-efficacy and flow were assessed before and/or after each operation. Simulator training was conducted at the Center for Advanced Medical Simulation and Training, Karolinska University Hospital. Sterilizations were performed at each trainee's home clinic. Twenty-eight trainees/residents from 21 hospitals in Sweden were included. Visuospatial ability was tested by the Mental Rotation Test-A. Flow and self-efficacy were assessed by validated scales and questionnaires. Laparoscopic performance was measured as the duration of surgery. Visuospatial ability, self-efficacy and flow were correlated to the laparoscopic performance using Spearman's correlations. Differences between groups were analyzed by the Mann-Whitney U-test. No differences across groups were detected at baseline. Self-efficacy scores before and flow scores after the third operation were significantly higher in the trained groups. Duration of surgery was significantly shorter in the trained groups. Flow and self-efficacy correlate positively with laparoscopic performance. Simulator training and non-technical factors appear to improve the laparoscopic performance among trainees/residents in obstetrics and gynecology. © 2013 Nordic Federation of Societies of Obstetrics and Gynecology.

  3. The importance of archiving baseline wilderness data

    Treesearch

    David N. Cole

    2007-01-01

    Baseline wilderness data are of considerable importance for several reasons. One of the primary values of wilderness is as a reference that contrasts with those lands where humans dominate the landscape. Leopold (1941) called wilderness "a base-datum of normality, a picture of how healthy land maintains itself." To realize this value, baseline data on...

  4. Baseline budgeting for continuous improvement.

    PubMed

    Kilty, G L

    1999-05-01

    This article is designed to introduce the techniques used to convert traditionally maintained department budgets to baseline budgets. This entails identifying key activities, evaluating for value-added, and implementing continuous improvement opportunities. Baseline Budgeting for Continuous Improvement was created as a result of a newly named company president's request to implement zero-based budgeting. The president was frustrated with the mind-set of the organization, namely, "Next year's budget should be 10 to 15 percent more than this year's spending." Zero-based budgeting was not the answer, but combining the principles of activity-based costing and the Just-in-Time philosophy of eliminating waste and continuous improvement did provide a solution to the problem.

  5. Passive range estimation using dual baseline triangulation

    NASA Astrophysics Data System (ADS)

    Pieper, Ronald J.; Cooper, Alfred W.; Pelegris, G.

    1996-03-01

    Modern combat systems based on active radar sensing suffer disadvantages against low-flying targets in cluttered backgrounds. Use of passive infrared sensors with these systems, either in cooperation or as an alternative, shows potential for improving target detection and declaration range for targets crossing the horizon. Realization of this potential requires fusion of target position data from dissimilar sensors, or passive sensor measurement of target range. The availability of passive sensors that can supply both range and bearing data on such targets would significantly extend the robustness of an integrated ship self-defense system. This paper considers a new method of range determination with passive sensors based on the principle of triangulation, extending the principle to two orthogonal baselines. The performance of single or double baseline triangulation depends on sensor bearing precision and direction to target. An expression for maximum triangulation range at a required accuracy is derived as a function of polar angle relative to the center of the dual-baseline system. Limitations in the dual-baseline model due to the geometrically assessed horizon are also considered.
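The single-baseline case underlying the method can be sketched with the law of sines: two sensors a distance b apart each measure the interior angle between the baseline and the target bearing, and the range follows from the triangle they form. The geometry below (sensor positions, target location) is a made-up worked example, not from the paper.

```python
import math

def triangulate_range(b, alpha, beta):
    """Range from sensor 1 to the target via the law of sines.
    b: baseline length; alpha, beta: interior angles (radians) at
    sensors 1 and 2, measured from the baseline toward the target."""
    return b * math.sin(beta) / math.sin(alpha + beta)

# Sensors at (0, 0) and (1000, 0) m; target at (3000, 4000) m,
# i.e. a true range of 5000 m from sensor 1.
alpha = math.atan2(4000, 3000)            # interior angle at sensor 1
beta = math.pi - math.atan2(4000, 2000)   # interior angle at sensor 2
r1 = triangulate_range(1000.0, alpha, beta)
```

As the target moves toward the baseline extension, alpha + beta approaches pi, the sine in the denominator approaches zero, and small bearing errors produce large range errors; the two orthogonal baselines in the paper mitigate exactly this angular dependence.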

  6. Preliminary design study of a baseline MIUS

    NASA Technical Reports Server (NTRS)

    Wolfer, B. M.; Shields, V. E.; Rippey, J. O.; Roberts, H. L.; Wadle, R. C.; Wallin, S. P.; Gill, W. L.; White, E. H.; Monzingo, R.

    1977-01-01

    Results of a conceptual design study to establish a baseline design for a modular integrated utility system (MIUS) are presented. The system concept developed a basis for evaluating possible projects to demonstrate an MIUS. For the baseline study, climate conditions for the Washington, D.C., area were used. The baseline design is for a high density apartment complex of 496 dwelling units with a planned full occupancy of approximately 1200 residents. Environmental considerations and regulations for the MIUS installation are discussed. Detailed cost data for the baseline MIUS are given together with those for design and operating variations under climate conditions typified by Las Vegas, Nevada, Houston, Texas, and Minneapolis, Minnesota. In addition, results of an investigation of size variation effects, for 300 and 1000 unit apartment complexes, are presented. Only conceptual aspects of the design are discussed. Results regarding energy savings and costs are intended only as trend information and for use in relative comparisons. Alternate heating, ventilation, and air conditioning concepts are considered in the appendix.

  7. Safe and sensible preprocessing and baseline correction of pupil-size data.

    PubMed

    Mathôt, Sebastiaan; Fabius, Jasper; Van Heusden, Elle; Van der Stigchel, Stefan

    2018-02-01

    Measurement of pupil size (pupillometry) has recently gained renewed interest from psychologists, but there is little agreement on how pupil-size data is best analyzed. Here we focus on one aspect of pupillometric analyses: baseline correction, i.e., analyzing changes in pupil size relative to a baseline period. Baseline correction is useful in experiments that investigate the effect of some experimental manipulation on pupil size. In such experiments, baseline correction improves statistical power by taking into account random fluctuations in pupil size over time. However, we show that baseline correction can also distort data if unrealistically small pupil sizes are recorded during the baseline period, which can easily occur due to eye blinks, data loss, or other distortions. Divisive baseline correction (corrected pupil size = pupil size/baseline) is affected more strongly by such distortions than subtractive baseline correction (corrected pupil size = pupil size - baseline). We discuss the role of baseline correction as a part of preprocessing of pupillometric data, and make five recommendations: (1) before baseline correction, perform data preprocessing to mark missing and invalid data, but assume that some distortions will remain in the data; (2) use subtractive baseline correction; (3) visually compare your corrected and uncorrected data; (4) be wary of pupil-size effects that emerge faster than the latency of the pupillary response allows (within ±220 ms after the manipulation that induces the effect); and (5) remove trials on which baseline pupil size is unrealistically small (indicative of blinks and other distortions).
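The contrast between the two correction schemes discussed above is easy to demonstrate. The trace and baseline values below are hypothetical: when an artifact (e.g. a blink) produces an unrealistically small baseline, divisive correction blows up while subtractive correction stays bounded, which is the paper's reason for recommendation (2).

```python
def subtractive(trace, baseline):
    """Corrected pupil size = pupil size - baseline."""
    return [p - baseline for p in trace]

def divisive(trace, baseline):
    """Corrected pupil size = pupil size / baseline."""
    return [p / baseline for p in trace]

trace = [5.1, 5.3, 5.6]   # hypothetical pupil sizes (mm) after the manipulation
bad_baseline = 0.05       # artifact: near-zero size recorded during a blink

sub_bad = subtractive(trace, bad_baseline)   # modest shift, still ~5 mm scale
div_bad = divisive(trace, bad_baseline)      # inflated ~100x, clearly distorted
```

This also motivates recommendation (5): trials whose baseline pupil size is implausibly small should be removed before either correction is applied.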

  8. Efficient Wide Baseline Structure from Motion

    NASA Astrophysics Data System (ADS)

    Michelini, Mario; Mayer, Helmut

    2016-06-01

    This paper presents a Structure from Motion approach for complex unorganized image sets. To achieve high accuracy and robustness, image triplets are employed and (an approximate) camera calibration is assumed to be known. The focus lies on a complete linking of images even in case of large image distortions, e.g., caused by wide baselines, as well as weak baselines. A method for embedding image descriptors into Hamming space is proposed for fast image similarity ranking. The latter is employed to limit the number of pairs to be matched by a wide baseline method. An iterative graph-based approach is proposed formulating image linking as the search for a terminal Steiner minimum tree in a line graph. Finally, additional links are determined and employed to improve the accuracy of the pose estimation. By this means, loops in long image sequences are implicitly closed. The potential of the proposed approach is demonstrated by results for several complex image sets also in comparison with VisualSFM.
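    Once descriptors are embedded into Hamming space, similarity ranking reduces to XOR-and-popcount over packed bit codes, which is what makes the pair-selection step fast. A minimal NumPy sketch, assuming the binary codes are already computed (the embedding itself is not reproduced here, and the codes below are invented):

    ```python
    import numpy as np

    def hamming_rank(query, codes):
        """Rank database images by Hamming distance to a query binary code.

        query: packed uint8 code for one image; codes: 2-D uint8 array,
        one packed code per database image.
        """
        # XOR leaves exactly the differing bits set; unpack and count them.
        dists = np.unpackbits(codes ^ query, axis=1).sum(axis=1)
        return np.argsort(dists, kind="stable"), dists

    codes = np.array([[0b11110000], [0b11111111], [0b00000000]], dtype=np.uint8)
    query = np.array([0b11110000], dtype=np.uint8)
    order, dists = hamming_rank(query, codes)  # image 0 is an exact match
    ```

    Only the top-ranked candidates per image would then be passed to the expensive wide-baseline matcher.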

  9. Detecting chronic kidney disease in population-based administrative databases using an algorithm of hospital encounter and physician claim codes.

    PubMed

    Fleet, Jamie L; Dixon, Stephanie N; Shariff, Salimah Z; Quinn, Robert R; Nash, Danielle M; Harel, Ziv; Garg, Amit X

    2013-04-05

    Large, population-based administrative healthcare databases can be used to identify patients with chronic kidney disease (CKD) when serum creatinine laboratory results are unavailable. We examined the validity of algorithms that used combined hospital encounter and physician claims database codes for the detection of CKD in Ontario, Canada. We accrued 123,499 patients over the age of 65 from 2007 to 2010. All patients had a baseline serum creatinine value to estimate glomerular filtration rate (eGFR). We developed an algorithm of physician claims and hospital encounter codes to search administrative databases for the presence of CKD. We determined the sensitivity, specificity, positive and negative predictive values of this algorithm to detect our primary threshold of CKD, an eGFR <45 mL/min per 1.73 m² (15.4% of patients). We also assessed serum creatinine and eGFR values in patients with and without CKD codes (algorithm positive and negative, respectively). Our algorithm required evidence of at least one of eleven CKD codes and 7.7% of patients were algorithm positive. The sensitivity was 32.7% [95% confidence interval: (95% CI): 32.0 to 33.3%]. Sensitivity was lower in women compared to men (25.7 vs. 43.7%; p <0.001) and in the oldest age category (over 80 vs. 66 to 80; 28.4 vs. 37.6 %; p < 0.001). All specificities were over 94%. The positive and negative predictive values were 65.4% (95% CI: 64.4 to 66.3%) and 88.8% (95% CI: 88.6 to 89.0%), respectively. In algorithm positive patients, the median [interquartile range (IQR)] baseline serum creatinine value was 135 μmol/L (106 to 179 μmol/L) compared to 82 μmol/L (69 to 98 μmol/L) for algorithm negative patients. Corresponding eGFR values were 38 mL/min per 1.73 m² (26 to 51 mL/min per 1.73 m²) vs. 69 mL/min per 1.73 m² (56 to 82 mL/min per 1.73 m²), respectively. Patients with CKD as identified by our database algorithm had distinctly higher baseline serum creatinine values and lower eGFR values
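    The validation metrics reported above all follow from a 2x2 table of algorithm result against the serum-creatinine reference standard. A minimal sketch with invented counts (not the study's data):

    ```python
    def diagnostic_metrics(tp, fp, fn, tn):
        """Validation metrics for a database case-finding algorithm,
        computed from 2x2-table counts against a reference standard."""
        return {
            "sensitivity": tp / (tp + fn),  # algorithm-positive among true cases
            "specificity": tn / (tn + fp),  # algorithm-negative among non-cases
            "ppv": tp / (tp + fp),          # true cases among algorithm-positives
            "npv": tn / (tn + fn),          # non-cases among algorithm-negatives
        }

    # Hypothetical counts chosen for illustration only:
    m = diagnostic_metrics(tp=80, fp=20, fn=120, tn=780)
    ```

    As in the study, a low sensitivity can coexist with a high specificity and a high negative predictive value when the condition is uncommon in the cohort.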

  10. Detecting chronic kidney disease in population-based administrative databases using an algorithm of hospital encounter and physician claim codes

    PubMed Central

    2013-01-01

    Background Large, population-based administrative healthcare databases can be used to identify patients with chronic kidney disease (CKD) when serum creatinine laboratory results are unavailable. We examined the validity of algorithms that used combined hospital encounter and physician claims database codes for the detection of CKD in Ontario, Canada. Methods We accrued 123,499 patients over the age of 65 from 2007 to 2010. All patients had a baseline serum creatinine value to estimate glomerular filtration rate (eGFR). We developed an algorithm of physician claims and hospital encounter codes to search administrative databases for the presence of CKD. We determined the sensitivity, specificity, positive and negative predictive values of this algorithm to detect our primary threshold of CKD, an eGFR <45 mL/min per 1.73 m2 (15.4% of patients). We also assessed serum creatinine and eGFR values in patients with and without CKD codes (algorithm positive and negative, respectively). Results Our algorithm required evidence of at least one of eleven CKD codes and 7.7% of patients were algorithm positive. The sensitivity was 32.7% [95% confidence interval: (95% CI): 32.0 to 33.3%]. Sensitivity was lower in women compared to men (25.7 vs. 43.7%; p <0.001) and in the oldest age category (over 80 vs. 66 to 80; 28.4 vs. 37.6 %; p < 0.001). All specificities were over 94%. The positive and negative predictive values were 65.4% (95% CI: 64.4 to 66.3%) and 88.8% (95% CI: 88.6 to 89.0%), respectively. In algorithm positive patients, the median [interquartile range (IQR)] baseline serum creatinine value was 135 μmol/L (106 to 179 μmol/L) compared to 82 μmol/L (69 to 98 μmol/L) for algorithm negative patients. Corresponding eGFR values were 38 mL/min per 1.73 m2 (26 to 51 mL/min per 1.73 m2) vs. 69 mL/min per 1.73 m2 (56 to 82 mL/min per 1.73 m2), respectively. Conclusions Patients with CKD as identified by our database algorithm had distinctly higher baseline serum

  11. The International Database of Efficient Appliances (IDEA): A New Resource for Global Efficiency Policy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerke, Brian F; McNeil, Michael A; Tu, Thomas

    A major barrier to effective appliance efficiency program design and evaluation is a lack of data for determination of market baselines and cost-effective energy savings potential. The data gap is particularly acute in developing countries, which may have the greatest savings potential per unit GDP. To address this need, we are developing the International Database of Efficient Appliances (IDEA), which automatically compiles data from a wide variety of online sources to create a unified repository of information on efficiency, price, and features for a wide range of energy-consuming products across global markets. This paper summarizes the database framework and demonstrates the power of IDEA as a resource for appliance efficiency research and policy development. Using IDEA data for refrigerators in China and India, we develop robust cost-effectiveness indicators that allow rapid determination of savings potential within each market, as well as comparison of that potential across markets and appliance types. We discuss implications for future energy efficiency policy development.

  12. The extent of intestinal failure-associated liver disease in patients referred for intestinal rehabilitation is associated with increased mortality: an analysis of the pediatric intestinal failure consortium database.

    PubMed

    Javid, Patrick J; Oron, Assaf P; Duggan, Christopher; Squires, Robert H; Horslen, Simon P

    2017-09-05

    The advent of regional multidisciplinary intestinal rehabilitation programs has been associated with improved survival in pediatric intestinal failure. Yet, the optimal timing of referral for intestinal rehabilitation remains unknown. We hypothesized that the degree of intestinal failure-associated liver disease (IFALD) at initiation of intestinal rehabilitation would be associated with overall outcome. The multicenter, retrospective Pediatric Intestinal Failure Consortium (PIFCon) database was used to identify all subjects with baseline bilirubin data. Conjugated bilirubin (CBili) was used as a marker for IFALD, and we stratified baseline bilirubin values as CBili<2 mg/dL, CBili 2-4 mg/dL, and CBili>4 mg/dL. The association between baseline CBili and mortality was examined using Cox proportional hazards regression. Of 272 subjects in the database, 191 (70%) children had baseline bilirubin data collected. 38% and 28% of patients had CBili >4 mg/dL and CBili <2 mg/dL, respectively, at baseline. All-cause mortality was 23%. On univariate analysis, mortality was associated with CBili 2-4 mg/dL, CBili >4 mg/dL, prematurity, race, and small bowel atresia. On regression analysis controlling for age, prematurity, and diagnosis, the risk of mortality was increased by 3-fold for baseline CBili 2-4 mg/dL (HR 3.25 [1.07-9.92], p=0.04) and 4-fold for baseline CBili >4 mg/dL (HR 4.24 [1.51-11.92], p=0.006). On secondary analysis, CBili >4 mg/dL at baseline was associated with a lower chance of attaining enteral autonomy. In children with intestinal failure treated at intestinal rehabilitation programs, more advanced IFALD at referral is associated with increased mortality and decreased prospect of attaining enteral autonomy. Early referral of children with intestinal failure to intestinal rehabilitation programs should be strongly encouraged. Treatment Study, Level III. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Chemical Kinetics Database

    National Institute of Standards and Technology Data Gateway

    SRD 17 NIST Chemical Kinetics Database (Web, free access)   The NIST Chemical Kinetics Database includes essentially all reported kinetics results for thermal gas-phase chemical reactions. The database is designed to be searched for kinetics data based on the specific reactants involved, for reactions resulting in specified products, for all the reactions of a particular species, or for various combinations of these. In addition, the bibliography can be searched by author name or combination of names. The database contains in excess of 38,000 separate reaction records for over 11,700 distinct reactant pairs. These data have been abstracted from over 12,000 papers with literature coverage through early 2000.

  14. Database Search Strategies & Tips. Reprints from the Best of "ONLINE" [and]"DATABASE."

    ERIC Educational Resources Information Center

    Online, Inc., Weston, CT.

    Reprints of 17 articles presenting strategies and tips for searching databases online appear in this collection, which is one in a series of volumes of reprints from "ONLINE" and "DATABASE" magazines. Edited for information professionals who use electronically distributed databases, these articles address such topics as: (1)…

  15. Smoothness of In vivo Spectral Baseline Determined by Mean Squared Error

    PubMed Central

    Zhang, Yan; Shen, Jun

    2013-01-01

    Purpose A nonparametric smooth line is usually added to the spectral model to account for background signals in in vivo magnetic resonance spectroscopy (MRS). The assumed smoothness of the baseline significantly influences quantitative spectral fitting. In this paper, a method is proposed to minimize baseline influences on estimated spectral parameters. Methods The non-parametric baseline function with a given smoothness was treated as a function of spectral parameters. Its uncertainty was measured by root-mean-squared error (RMSE). The proposed method was demonstrated with a simulated spectrum and with in vivo spectra of both short echo time (TE) and averaged echo times. The estimated in vivo baselines were compared with the metabolite-nulled spectra and the LCModel-estimated baselines. The accuracies of the estimated baseline and metabolite concentrations were further verified by cross-validation. Results An optimal smoothness condition was found that led to the minimal baseline RMSE. In this condition, the best fit was balanced against minimal baseline influences on metabolite concentration estimates. Conclusion Baseline RMSE can be used to indicate estimated baseline uncertainties and serve as the criterion for determining the baseline smoothness of in vivo MRS. PMID:24259436

  16. Association of baseline sex hormone levels with baseline and longitudinal changes in waist-to-hip ratio: Multi-Ethnic Study of Atherosclerosis.

    PubMed

    Vaidya, D; Dobs, A; Gapstur, S M; Golden, S H; Cushman, M; Liu, K; Ouyang, P

    2012-12-01

    Waist-to-hip ratio (WHR) is strongly associated with prevalent atherosclerosis. We analyzed the associations of baseline serum levels of testosterone (T), estradiol (E2), sex-hormone-binding globulin (SHBG) and dehydroepiandrosterone (DHEA) with WHR in the Multi-Ethnic Study of Atherosclerosis (MESA) cohort. Baseline data was available for 3144 men and 2038 postmenopausal women, who were non-users of hormone therapy, who were 45-84 years of age, and of White, Chinese, Black or Hispanic racial/ethnic groups. Of these, 2708 men and 1678 women also had longitudinal measurements of WHR measured at the second and/or the third study visits (median follow-up 578 days and 1135 days, respectively). In cross-sectional analyses adjusted for age, race and cardiovascular disease risk factors, T was negatively associated with baseline WHR in men, whereas in both sexes, E2 was positively associated and SHBG was negatively associated with WHR (all P<0.001). In longitudinal analyses, further adjusted for follow-up time and baseline WHR, baseline T was negatively associated with WHR at follow-up (P=0.001) in men, whereas in both sexes, E2 was positively associated (P=0.004) and SHBG was negatively associated with WHR (P<0.001). The longitudinal association of E2, but not T, was independent of SHBG. In cross-sectional or longitudinal analyses, there were no associations between DHEA and WHR in either men or women. Sex hormones are associated with WHR at baseline and also during follow-up above and beyond their baseline association. Future research is needed to determine if manipulation of hormones is associated with changes in central obesity.

  17. Technical opportunities to reduce global anthropogenic emissions of nitrous oxide

    NASA Astrophysics Data System (ADS)

    Winiwarter, Wilfried; Höglund-Isaksson, Lena; Klimont, Zbigniew; Schöpp, Wolfgang; Amann, Markus

    2018-01-01

    We describe a consistent framework developed to quantify current and future anthropogenic emissions of nitrous oxide and the available technical abatement options by source sector for 172 regions globally. About 65% of the current emissions derive from agricultural soils, 8% from waste, and 4% from the chemical industry. Low-cost abatement options are available in industry, wastewater, and agriculture, where they are limited to large industrial farms. We estimate that by 2030, emissions can be reduced by about 6% ±2% applying abatement options at a cost lower than 10 €/t CO2-eq. The largest abatement potential at higher marginal costs is available from agricultural soils, employing precision fertilizer application technology as well as chemical treatment of fertilizers to suppress conversion processes in soil (nitrification inhibitors). At marginal costs of up to 100 €/t CO2-eq, about 18% ±6% of baseline emissions can be removed and when considering all available options, the global abatement potential increases to about 26% ±9%. Due to expected future increase in activities driving nitrous oxide emissions, the limited technical abatement potential available means that even at full implementation of reduction measures by 2030, global emissions can be at most stabilized at the pre-2010 level. In order to achieve deeper reductions in emissions, considerable technological development will be required as well as non-technical options like adjusting human diets towards moderate animal protein consumption.

  18. Preliminary Assessment of the Proposed Closure of the National Technical Information Service (NTIS): A Report to the President and the Congress.

    ERIC Educational Resources Information Center

    Horton, Forest Woody, Ed.; Kadec, Sarah T., Ed.

    This document reports on an NCLIS (National Commission on Libraries and Information Science) study of the proposal to close NTIS (National Technical Information Service) and shift its paper, microfiche, digital archives, and bibliographic database to the Library of Congress. The report documents the results of research, interviews, public…

  19. 76 FR 64083 - Reliability Technical Conference; Notice of Technical Conference

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-17

    ... Technical Conference; Notice of Technical Conference Take notice that the Federal Energy Regulatory Commission will hold a Technical Conference on Tuesday, November 29, 2011, from 1 p.m. to 5 p.m. and... System. The conference will explore the progress made on the priorities for addressing risks to...

  20. An Automated Baseline Correction Method Based on Iterative Morphological Operations.

    PubMed

    Chen, Yunliang; Dai, Liankui

    2018-05-01

    Raman spectra usually suffer from baseline drift caused by fluorescence or other reasons. Therefore, baseline correction is a necessary and crucial step that must be performed before subsequent processing and analysis of Raman spectra. An automated baseline correction method based on iterative morphological operations is proposed in this work. The method can adaptively determine the structuring element first and then gradually remove the spectral peaks during iteration to get an estimated baseline. Experiments on simulated data and real-world Raman data show that the proposed method is accurate, fast, and flexible for handling different kinds of baselines in various practical situations. The comparison of the proposed method with some state-of-the-art baseline correction methods demonstrates its advantages over the existing methods in terms of accuracy, adaptability, and flexibility. Although only Raman spectra are investigated in this paper, the proposed method is hopefully to be used for the baseline correction of other analytical instrumental signals, such as IR spectra and chromatograms.
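    The core of such a method, grey-scale opening (erosion followed by dilation) iterated against the spectrum, can be sketched as follows. This is an illustrative NumPy version with a fixed structuring-element width; the paper's adaptive determination of the structuring element is not reproduced here:

    ```python
    import numpy as np

    def morphological_baseline(spectrum, width=15, n_iter=5):
        """Estimate a slowly varying baseline by iterated grey-scale opening
        with a flat structuring element. `width` and `n_iter` are
        illustrative defaults, not values from the paper."""
        half = width // 2

        def erode(x):
            pad = np.pad(x, half, mode="edge")
            return np.array([pad[i:i + width].min() for i in range(len(x))])

        def dilate(x):
            pad = np.pad(x, half, mode="edge")
            return np.array([pad[i:i + width].max() for i in range(len(x))])

        baseline = np.asarray(spectrum, dtype=float)
        for _ in range(n_iter):
            opened = dilate(erode(baseline))         # opening removes narrow peaks
            baseline = np.minimum(baseline, opened)  # never rise above the signal
        return baseline

    # Synthetic Raman-like signal: a slow linear drift plus one sharp peak.
    x = np.linspace(0.0, 1.0, 200)
    signal = (2.0 + x) + 5.0 * np.exp(-((x - 0.5) / 0.01) ** 2)
    corrected = signal - morphological_baseline(signal)
    ```

    Because opening never exceeds the input, the corrected trace is non-negative by construction; on this synthetic signal it sits near zero away from the peak while retaining most of the peak height.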

  1. Naval Ship Database: Database Design, Implementation, and Schema

    DTIC Science & Technology

    2013-09-01

    incoming data. The solution allows database users to store and analyze data collected by navy ships in the Royal Canadian Navy (RCN). The data...understanding RCN jargon and common practices on a typical RCN vessel. This experience led to the development of several error detection methods to...data to be stored in the database. Mr. Massel has also collected data pertaining to day-to-day activities on RCN vessels that has been imported into

  2. Simulation fails to replicate stress in trainees performing a technical procedure in the clinical environment.

    PubMed

    Baker, B G; Bhalla, A; Doleman, B; Yarnold, E; Simons, S; Lund, J N; Williams, J P

    2017-01-01

    Simulation-based training (SBT) has become an increasingly important method by which doctors learn. Stress has an impact upon learning, performance, technical, and non-technical skills. However, there are currently no studies that compare stress in the clinical and simulated environment. We aimed to compare objective (heart rate variability, HRV) and subjective (state trait anxiety inventory, STAI) measures of stress in theatre with those in a simulated environment. HRV recordings were obtained from eight anesthetic trainees performing an uncomplicated rapid sequence induction at pre-determined procedural steps using a wireless Polar RS800CX© monitor in an emergency theatre setting. This was repeated in the simulated environment. Participants completed an STAI before and after the procedure. Eight trainees completed the study. The theatre environment caused an increase in objective stress vs baseline (p = .004). There was no significant difference between average objective stress levels across all time points (p = .20) between environments. However, there was a significant interaction between the variables of objective stress and environment (p = .045). There was no significant difference in subjective stress (p = .27) between environments. Simulation was unable to accurately replicate the stress of the technical procedure. This is the first study that compares the stress during SBT with the theatre environment and has implications for the assessment of simulated environments for use in examinations, rating of technical and non-technical skills, and stress management training.

  3. Hawaii bibliographic database

    USGS Publications Warehouse

    Wright, T.L.; Takahashi, T.J.

    1998-01-01

    The Hawaii bibliographic database has been created to contain all of the literature, from 1779 to the present, pertinent to the volcanological history of the Hawaiian-Emperor volcanic chain. References are entered in a PC- and Macintosh-compatible EndNote Plus bibliographic database with keywords and abstracts or (if no abstract) with annotations as to content. Keywords emphasize location, discipline, process, identification of new chemical data or age determinations, and type of publication. The database is updated approximately three times a year and is available to upload from an ftp site. The bibliography contained 8460 references at the time this paper was submitted for publication. Use of the database greatly enhances the power and completeness of library searches for anyone interested in Hawaiian volcanism.

  4. Hanford Waste Vitrification Plant technical manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larson, D.E.; Watrous, R.A.; Kruger, O.L.

    1996-03-01

    A key element of the Hanford waste management strategy is the construction of a new facility, the Hanford Waste Vitrification Plant (HWVP), to vitrify existing and future liquid high-level waste produced by defense activities at the Hanford Site. The HWVP mission is to vitrify pretreated waste in borosilicate glass, cast the glass into stainless steel canisters, and store the canisters at the Hanford Site until they are shipped to a federal geological repository. The HWVP Technical Manual (Manual) documents the technical bases of the current HWVP process and provides a physical description of the related equipment and the plant. The immediate purpose of the document is to provide the technical bases for preparation of project baseline documents that will be used to direct the Title 1 and Title 2 design by the A/E, Fluor. The content of the Manual is organized in the following manner. Chapter 1.0 contains the background and context within which the HWVP was designed. Chapter 2.0 describes the site, plant, equipment and supporting services and provides the context for application of the process information in the Manual. Chapter 3.0 provides plant feed and product requirements, which are primary process bases for plant operation. Chapter 4.0 summarizes the technology for each plant process. Chapter 5.0 describes the engineering principles for designing major types of HWVP equipment. Chapter 6.0 describes the general safety aspects of the plant and process to assist in safe and prudent facility operation. Chapter 7.0 includes a description of the waste form qualification program and data. Chapter 8.0 indicates the current status of quality assurance requirements for the Manual. The Appendices provide data that are too extensive to be placed in the main text, such as extensive tables and sets of figures. The Manual is a revision of the 1987 version.

  5. Treadmill Kinematics Baseline Data Collection

    NASA Image and Video Library

    2011-05-12

    PHOTO DATE: 5-12-11 LOCATION: Building 261 - Room 138 SUBJECT: Expedition 29 Preflight Training with Dan Burbank during Treadmill Kinematics Baseline Data Collection. WORK ORDER: 2011-1214 PHOTOGRAPHER: Lauren Harnett

  6. The Chicago Thoracic Oncology Database Consortium: A Multisite Database Initiative

    PubMed Central

    Carey, George B; Tan, Yi-Hung Carol; Bokhary, Ujala; Itkonen, Michelle; Szeto, Kyle; Wallace, James; Campbell, Nicholas; Hensing, Thomas; Salgia, Ravi

    2016-01-01

    Objective: An increasing amount of clinical data is available to biomedical researchers, but specifically designed database and informatics infrastructures are needed to handle this data effectively. Multiple research groups should be able to pool and share this data in an efficient manner. The Chicago Thoracic Oncology Database Consortium (CTODC) was created to standardize data collection and facilitate the pooling and sharing of data at institutions throughout Chicago and across the world. We assessed the CTODC by conducting a proof of principle investigation on lung cancer patients who took erlotinib. This study does not look into epidermal growth factor receptor (EGFR) mutations and tyrosine kinase inhibitors, but rather it discusses the development and utilization of the database involved. Methods:  We have implemented the Thoracic Oncology Program Database Project (TOPDP) Microsoft Access, the Thoracic Oncology Research Program (TORP) Velos, and the TORP REDCap databases for translational research efforts. Standard operating procedures (SOPs) were created to document the construction and proper utilization of these databases. These SOPs have been made available freely to other institutions that have implemented their own databases patterned on these SOPs. Results: A cohort of 373 lung cancer patients who took erlotinib was identified. The EGFR mutation statuses of patients were analyzed. Out of the 70 patients that were tested, 55 had mutations while 15 did not. In terms of overall survival and duration of treatment, the cohort demonstrated that EGFR-mutated patients had a longer duration of erlotinib treatment and longer overall survival compared to their EGFR wild-type counterparts who received erlotinib. Discussion: The investigation successfully yielded data from all institutions of the CTODC. While the investigation identified challenges, such as the difficulty of data transfer and potential duplication of patient data, these issues can be resolved

  7. The Pregnancy in Polycystic Ovary Syndrome Study II: Baseline Characteristics and Effects of Obesity from a Multi-Center Randomized Clinical Trial

    PubMed Central

    Legro, Richard S.; Brzyski, Robert G.; Diamond, Michael P.; Coutifaris, Christos; Schlaff, William D.; Alvero, Ruben; Casson, Peter; Christman, Gregory M.; Huang, Hao; Yan, Qingshang; Haisenleder, Daniel J.; Barnhart, Kurt T.; Bates, G. Wright; Usadi, Rebecca; Lucidi, Richard; Baker, Valerie; Trussell, J.C.; Krawetz, Stephen A.; Snyder, Peter; Ohl, Dana; Santoro, Nanette; Eisenberg, Esther; Zhang, Heping

    2014-01-01

    Objective To summarize baseline characteristics from a large multi-center infertility clinical trial. Design Cross-sectional baseline data from a double-blind randomized trial of 2 treatment regimens (letrozole vs. clomiphene). Setting Academic Health Centers throughout the U.S. Interventions None Main Outcome Measure(s) Historical, biometric, biochemical and questionnaire parameters. Participants 750 women with PCOS and their male partners took part in the study. Results Females averaged ~30 years old and were obese (BMI 35) with ~20% from a racial/ethnic minority. Most (87%) were hirsute and nulligravid (63%). Most of the females had an elevated antral follicle count and enlarged ovarian volume on ultrasound. Women had elevated mean circulating androgens, LH:FSH ratio (~2), and AMH levels (8.0 ng/mL). Additionally, women had evidence for metabolic dysfunction with elevated mean fasting insulin and dyslipidemia. Increasing obesity was associated with decreased LH:FSH levels, AMH levels and antral follicle counts but increasing cardiovascular risk factors, including prevalence of the metabolic syndrome. Males were obese (BMI 30) and had normal mean semen parameters. Conclusions The treatment groups were well-matched at baseline. Obesity exacerbates select female reproductive and most metabolic parameters. We have also established a database and sample repository that will eventually be accessible to investigators. PMID:24156957

  8. [NASA/DOD Aerospace Knowledge Diffusion Research Project. Paper 4:] Technical communications in aerospace: An analysis of the practices reported by US and European aerospace engineers and scientists

    NASA Technical Reports Server (NTRS)

    Pinelli, Thomas E.; Barclay, Rebecca O.; Kennedy, John M.; Glassman, Myron

    1990-01-01

    Two pilot studies were conducted that investigated the technical communications practices of U.S. and European aerospace engineers and scientists. Both studies had the same five objectives: (1) solicit opinions regarding the importance of technical communications; (2) determine the use and production of technical communications; (3) seek views about the appropriate content of an undergraduate course in technical communications; (4) determine use of libraries, information centers, and online databases; (5) determine the use and importance of computer and information technology to them. A self-administered questionnaire was mailed to randomly selected aerospace engineers and scientists, with a slightly modified version sent to European colleagues. Their responses to selected questions are presented in this paper.

  9. Kentucky geotechnical database.

    DOT National Transportation Integrated Search

    2005-03-01

    Development of a comprehensive dynamic, geotechnical database is described. Computer software selected to program the client/server application in windows environment, components and structure of the geotechnical database, and primary factors cons...

  10. DTIC (Defense Technical Information Center) Model Action Plan for Incorporating DGIS (DOD Gateway Information System) Capabilities.

    DTIC Science & Technology

    1986-05-01

    Information System (DGIS) is being developed to provide the DoD community with a modern tool to access diverse databases and extract information products...this community with a modern tool for accessing these databases and extracting information products from them. Since the Defense Technical Information...adjunct to DROLS results. The study, therefore, centered around obtaining background information inside the unit on that unit's users who request DROLS

  11. NREL: U.S. Life Cycle Inventory Database - About the LCI Database Project

    Science.gov Websites

    About the LCI Database Project: The U.S. Life Cycle Inventory (LCI) Database is a publicly available ... data collection and analysis methods. Finding consistent and transparent LCI data for life cycle ... and maintain the database. The 2009 U.S. Life Cycle Inventory (LCI) Data Stakeholder meeting was an ...

  12. The EpiSLI Database: A Publicly Available Database on Speech and Language

    ERIC Educational Resources Information Center

    Tomblin, J. Bruce

    2010-01-01

    Purpose: This article describes a database that was created in the process of conducting a large-scale epidemiologic study of specific language impairment (SLI). As such, this database will be referred to as the EpiSLI database. Children with SLI have unexpected and unexplained difficulties learning and using spoken language. Although there is no…

  13. Open-access databases as unprecedented resources and drivers of cultural change in fisheries science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McManamay, Ryan A; Utz, Ryan

    2014-01-01

    Open-access databases with utility in fisheries science have grown exponentially in quantity and scope over the past decade, with profound impacts to our discipline. The management, distillation, and sharing of an exponentially growing stream of open-access data represents several fundamental challenges in fisheries science. Many of the currently available open-access resources may not be universally known among fisheries scientists. We therefore introduce many national- and global-scale open-access databases with applications in fisheries science and provide an example of how they can be harnessed to perform valuable analyses without additional field efforts. We also discuss how the development, maintenance, and utilization of open-access data are likely to pose technical, financial, and educational challenges to fisheries scientists. Such cultural implications that will coincide with the rapidly increasing availability of free data should compel the American Fisheries Society to actively address these problems now to help ease the forthcoming cultural transition.

  14. Drinking Water Database

    NASA Technical Reports Server (NTRS)

    Murray, ShaTerea R.

    2004-01-01

    This summer I had the opportunity to work in the Environmental Management Office (EMO) under the Chemical Sampling and Analysis Team, or CS&AT. This team's mission is to support Glenn Research Center (GRC) and EMO by providing chemical sampling and analysis services and expert consulting. Services include sampling and chemical analysis of water, soil, fuels, oils, paint, insulation materials, etc. One of this team's major projects is the Drinking Water Project, which is performed on Glenn's water coolers and ten percent of its sinks every two years. For the past two summers an intern had been putting together a database for this team to record the tests they had performed. She had successfully created a database but hadn't worked out all the quirks. So this summer William Wilder (an intern from Cleveland State University) and I worked together to perfect her database. We began by finding out exactly what every member of the team thought about the database and what, if anything, they would change. After collecting this feedback, we both took courses in Microsoft Access in order to fix the problems. Next we looked at exactly how the database worked from the outside inward. Then we began trying to change the database, but we quickly found out that this would be virtually impossible.

  15. Database for chemical weapons detection: first results

    NASA Astrophysics Data System (ADS)

    Bellecci, C.; Gaudio, P.; Gelfusa, M.; Martellucci, S.; Richetta, M.; Ventura, P.; Antonucci, A.; Pasquino, F.; Ricci, V.; Sassolini, A.

    2008-10-01

    The rapid rise of terrorism and asymmetric warfare is creating new defense and security needs. Nowadays we have to face several kinds of threats, and the use of chemical weapons against civil or military objectives is one of the most dangerous. It is therefore necessary to find equipment, know-how, and information useful for detecting and identifying dangerous molecules as quickly and from as far away as possible, so as to minimize damage. Lidar/DIAL are among the most powerful optical technologies for this purpose. DIAL technology uses two different wavelengths in order to measure the concentration profile of an investigated molecule. This requires a "fingerprint" database consisting of an exhaustive collection of absorption-coefficient data, so that each molecule can be identified without confusion with interfering ones. No such collection currently exists in the scientific and technical literature. We used an FT-IR spectrometer and a CO2 laser source for absorption spectroscopy measurements on cells filled with the investigated molecules. The CO2 source is the transmitter of our DIAL facility, so in this way we can build a proper "fingerprint" database for identifying dangerous molecules. The CO2 laser was chosen because it is eye-safe and, mainly, because it covers a spectral band where this kind of molecule absorbs well. In this paper, IR spectra of mustard are presented and compared with other substances that may interfere and produce a false alarm. The methodology, experimental setup, and first results are described.
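The two-wavelength retrieval that such a fingerprint database supports can be sketched with the standard DIAL equation, which recovers a concentration profile from the ratio of returns at the absorbed ("on") and reference ("off") wavelengths. Everything below is an illustrative sketch with invented numbers; in practice the differential cross-section would come from the kind of spectroscopic database described in the record.

```python
import numpy as np

def dial_concentration(p_on, p_off, delta_sigma, delta_r):
    """Range-resolved concentration from DIAL returns, per range bin:
    N = ln[(P_off(R2) P_on(R1)) / (P_on(R2) P_off(R1))] / (2 dsigma dR).

    p_on, p_off : backscattered power per range bin at the absorbed
                  ("on") and reference ("off") wavelengths
    delta_sigma : differential absorption cross-section (m^2), taken
                  from a spectroscopic "fingerprint" database
    delta_r     : range-bin width (m)
    """
    ratio = (p_off[1:] * p_on[:-1]) / (p_on[1:] * p_off[:-1])
    return np.log(ratio) / (2.0 * delta_sigma * delta_r)

# Synthetic check: a uniform concentration of 1e20 molecules/m^3
n_true, dsig, dr = 1e20, 1e-24, 30.0
r = np.arange(1, 101) * dr                        # range bins (m)
p_off = 1.0 / r**2                                # geometric falloff only
p_on = p_off * np.exp(-2.0 * n_true * dsig * r)   # plus two-way absorption
n_est = dial_concentration(p_on, p_off, dsig, dr) # recovers ~1e20 per bin
```

The bin-to-bin ratio cancels the geometric and aerosol terms common to both wavelengths, which is why the retrieval needs only the differential cross-section from the database.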

  16. Aviation Safety Issues Database

    NASA Technical Reports Server (NTRS)

    Morello, Samuel A.; Ricks, Wendell R.

    2009-01-01

    The aviation safety issues database was instrumental in the refinement and substantiation of the National Aviation Safety Strategic Plan (NASSP). The issues database is a comprehensive set of issues from an extremely broad base of aviation functions, personnel, and vehicle categories, both nationally and internationally. Several aviation safety stakeholders such as the Commercial Aviation Safety Team (CAST) have already used the database. This broader interest was the genesis to making the database publically accessible and writing this report.

  17. A National Survey on Teaching and Assessing Technical Proficiency in Vascular Surgery in Canada.

    PubMed

    Drudi, Laura; Hossain, Sajjid; Mackenzie, Kent S; Corriveau, Marc-Michel; Abraham, Cherrie Z; Obrand, Daniel I; Vassiliou, Melina; Gill, Heather; Steinmetz, Oren K

    2016-05-01

    This survey aims to explore trainees' perspectives on how Canadian vascular surgery training programs are using simulation in teaching and assessing technical skills, through a cross-sectional national survey. A 10-min online questionnaire was sent to Program Directors of vascular surgery training programs approved by the Royal College of Physicians and Surgeons of Canada, and was distributed among residents and fellows who were studying in the 2013-2014 academic year. Twenty-eight (58%) of the 48 Canadian vascular surgery trainees completed the survey. A total of 68% of the respondents were part of the 0 + 5 integrated vascular surgery training program. The use of simulation in the assessment of technical skills at the beginning of training was reported by only 3 (11%) respondents, whereas 43% reported that simulation was used in their programs in the assessment of technical skills at some time during their training. Training programs most often provided simulation as a method of teaching and learning endovascular abdominal aortic or thoracic aneurysm repair (64%). Furthermore, 96% of trainees reported that the most common resource to learn and enhance technical skills was dialog with vascular surgery staff. Surveyed vascular surgery trainees in Canada report that simulation is rarely used as a tool to assess baseline technical skills at the beginning of training. Less than half of surveyed trainees in vascular surgery programs in Canada report that simulation is being used for skills acquisition. Currently, in Canadian training programs, simulation is most commonly used to teach endovascular skills. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. BDVC (Bimodal Database of Violent Content): A database of violent audio and video

    NASA Astrophysics Data System (ADS)

    Rivera Martínez, Jose Luis; Mijes Cruz, Mario Humberto; Rodríguez Vázqu, Manuel Antonio; Rodríguez Espejo, Luis; Montoya Obeso, Abraham; García Vázquez, Mireya Saraí; Ramírez Acosta, Alejandro Álvaro

    2017-09-01

    Nowadays there is a trend towards the use of unimodal databases for multimedia content description, organization, and retrieval applications dealing with a single type of content, such as text, voice, or images; bimodal databases, in contrast, allow two different types of content, such as audio-video or image-text, to be associated semantically. Generating a bimodal audio-video database implies creating a connection between the multimedia content through the semantic relation that associates the actions in both types of information. This paper describes in detail the characteristics and methodology used to create the bimodal database of violent content; the semantic relationship is established by the proposed concepts that describe the audiovisual information. Using bimodal databases in applications related to audiovisual content processing increases semantic performance if, and only if, those applications process both types of content. This bimodal database contains 580 annotated audiovisual segments, with a total duration of 28 minutes, divided into 41 classes. Bimodal databases are a tool for building applications for the semantic web.

  19. The International Linear Collider Technical Design Report - Volume 2: Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baer, Howard; Barklow, Tim; Fujii, Keisuke

    2013-06-26

    The International Linear Collider Technical Design Report (TDR) describes in four volumes the physics case and the design of a 500 GeV centre-of-mass energy linear electron-positron collider based on superconducting radio-frequency technology using Niobium cavities as the accelerating structures. The accelerator can be extended to 1 TeV and also run as a Higgs factory at around 250 GeV and on the Z0 pole. A comprehensive value estimate of the accelerator is given, together with associated uncertainties. It is shown that no significant technical issues remain to be solved. Once a site is selected and the necessary site-dependent engineering is carried out, construction can begin immediately. The TDR also gives baseline documentation for two high-performance detectors that can share the ILC luminosity by being moved into and out of the beam line in a "push-pull" configuration. These detectors, ILD and SiD, are described in detail. They form the basis for a world-class experimental programme that promises to increase significantly our understanding of the fundamental processes that govern the evolution of the Universe.

  20. The International Linear Collider Technical Design Report - Volume 4: Detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Behnke, Ties

    2013-06-26

    The International Linear Collider Technical Design Report (TDR) describes in four volumes the physics case and the design of a 500 GeV centre-of-mass energy linear electron-positron collider based on superconducting radio-frequency technology using Niobium cavities as the accelerating structures. The accelerator can be extended to 1 TeV and also run as a Higgs factory at around 250 GeV and on the Z0 pole. A comprehensive value estimate of the accelerator is given, together with associated uncertainties. It is shown that no significant technical issues remain to be solved. Once a site is selected and the necessary site-dependent engineering is carried out, construction can begin immediately. The TDR also gives baseline documentation for two high-performance detectors that can share the ILC luminosity by being moved into and out of the beam line in a "push-pull" configuration. These detectors, ILD and SiD, are described in detail. They form the basis for a world-class experimental programme that promises to increase significantly our understanding of the fundamental processes that govern the evolution of the Universe.

  1. Baseline estimation from simultaneous satellite laser tracking

    NASA Technical Reports Server (NTRS)

    Dedes, George C.

    1987-01-01

    Simultaneous Range Differences (SRDs) to Lageos are obtained by dividing the observing stations into pairs with quasi-simultaneous observations. For each pair the station with the smaller number of observations is identified, and at its observing epochs interpolated ranges for the alternate station are generated. The SRD observables are obtained by subtracting the actually observed laser range of the station having the smaller number of observations from the interpolated ranges of the alternate station. On the basis of these observables, semidynamic single-baseline solutions were performed. The aim of these solutions is to further develop and implement the SRD method in a real-data environment and to assess its accuracy and its advantages and disadvantages relative to the range dynamic-mode methods when the baselines are the only parameters of interest. Baselines were also estimated through the purely geometric method using simultaneous laser range observations to Lageos. These baselines formed the standards of comparison in the accuracy assessment of the SRD method against the range dynamic-mode methods. On the basis of this comparison it was concluded that for baselines of regional extent the SRD method is very effective, efficient, and at least as accurate as the range dynamic-mode methods, and this with simple orbital modeling and a limited orbit adjustment. The SRD method is insensitive to the inconsistencies affecting the terrestrial reference frame, and simultaneous adjustment of the Earth Rotation Parameters (ERPs) is not necessary.
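The SRD construction just described (interpolate the denser station's ranges to the sparser station's epochs, then difference) can be sketched as follows. The pass geometry and numbers are invented for illustration, and simple linear interpolation stands in for whatever interpolator the actual analysis used.

```python
import numpy as np

def srd_observables(t_a, rho_a, t_b, rho_b):
    """Simultaneous Range Difference observables for a station pair.

    Station A is taken as the one with fewer observations; station B's
    ranges are interpolated to A's epochs, and A's observed ranges are
    subtracted, giving one SRD observable per epoch of A.
    """
    rho_b_at_a = np.interp(t_a, t_b, rho_b)  # linear interp (a stand-in)
    return rho_b_at_a - rho_a

# Invented pass: smooth range signatures sampled at different epochs
t_b = np.linspace(0.0, 600.0, 61)     # station B: dense epochs (s)
t_a = np.linspace(30.0, 570.0, 10)    # station A: sparse epochs (s)
rho_b = 6.0e6 + 800.0 * (t_b - 300.0) ** 2 / 300.0   # ranges (m)
rho_a = 5.9e6 + 790.0 * (t_a - 300.0) ** 2 / 300.0
srd = srd_observables(t_a, rho_a, t_b, rho_b)
```

Differencing quasi-simultaneous ranges cancels errors common to both stations, including much of the orbit error, which is why the method tolerates simple orbital modeling and a limited orbit adjustment.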

  2. Learning lessons from Natech accidents - the eNATECH accident database

    NASA Astrophysics Data System (ADS)

    Krausmann, Elisabeth; Girgin, Serkan

    2016-04-01

    When natural hazards impact industrial facilities that house or process hazardous materials, fires, explosions and toxic releases can occur. This type of accident is commonly referred to as Natech accident. In order to prevent the recurrence of accidents or to better mitigate their consequences, lessons-learned type studies using available accident data are usually carried out. Through post-accident analysis, conclusions can be drawn on the most common damage and failure modes and hazmat release paths, particularly vulnerable storage and process equipment, and the hazardous materials most commonly involved in these types of accidents. These analyses also lend themselves to identifying technical and organisational risk-reduction measures that require improvement or are missing. Industrial accident databases are commonly used for retrieving sets of Natech accident case histories for further analysis. These databases contain accident data from the open literature, government authorities or in-company sources. The quality of reported information is not uniform and exhibits different levels of detail and accuracy. This is due to the difficulty of finding qualified information sources, especially in situations where accident reporting by the industry or by authorities is not compulsory, e.g. when spill quantities are below the reporting threshold. Data collection has then to rely on voluntary record keeping often by non-experts. The level of detail is particularly non-uniform for Natech accident data depending on whether the consequences of the Natech event were major or minor, and whether comprehensive information was available for reporting. In addition to the reporting bias towards high-consequence events, industrial accident databases frequently lack information on the severity of the triggering natural hazard, as well as on failure modes that led to the hazmat release. 
This makes it difficult to reconstruct the dynamics of the accident and renders the development of

  3. Baseline Optimization for the Measurement of CP Violation, Mass Hierarchy, and $$\\theta_{23}$$ Octant in a Long-Baseline Neutrino Oscillation Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bass, M.; Bishai, M.; Cherdack, D.

    2015-03-19

    Next-generation long-baseline electron neutrino appearance experiments will seek to discover CP violation, determine the mass hierarchy and resolve the θ23 octant. In light of the recent precision measurements of θ13, we consider the sensitivity of these measurements in a study to determine the optimal baseline, including practical considerations regarding beam and detector performance. We conclude that a detector at a baseline of at least 1000 km in a wide-band muon neutrino beam is the optimal configuration.

  4. The Danish Testicular Cancer database.

    PubMed

    Daugaard, Gedske; Kier, Maria Gry Gundgaard; Bandak, Mikkel; Mortensen, Mette Saksø; Larsson, Heidi; Søgaard, Mette; Toft, Birgitte Groenkaer; Engvad, Birte; Agerbæk, Mads; Holm, Niels Vilstrup; Lauritsen, Jakob

    2016-01-01

    The nationwide Danish Testicular Cancer database consists of a retrospective research database (DaTeCa database) and a prospective clinical database (Danish Multidisciplinary Cancer Group [DMCG] DaTeCa database). The aim is to improve the quality of care for patients with testicular cancer (TC) in Denmark, that is, by identifying risk factors for relapse, toxicity related to treatment, and focusing on late effects. All Danish male patients with a histologically verified germ cell cancer diagnosis in the Danish Pathology Registry are included in the DaTeCa databases. Data collection has been performed from 1984 to 2007 and from 2013 onward, respectively. The retrospective DaTeCa database contains detailed information with more than 300 variables related to histology, stage, treatment, relapses, pathology, tumor markers, kidney function, lung function, etc. A questionnaire related to late effects has been conducted, which includes questions regarding social relationships, life situation, general health status, family background, diseases, symptoms, use of medication, marital status, psychosocial issues, fertility, and sexuality. TC survivors alive on October 2014 were invited to fill in this questionnaire including 160 validated questions. Collection of questionnaires is still ongoing. A biobank including blood/sputum samples for future genetic analyses has been established. Both samples related to DaTeCa and DMCG DaTeCa database are included. The prospective DMCG DaTeCa database includes variables regarding histology, stage, prognostic group, and treatment. The DMCG DaTeCa database has existed since 2013 and is a young clinical database. It is necessary to extend the data collection in the prospective database in order to answer quality-related questions. Data from the retrospective database will be added to the prospective data. This will result in a large and very comprehensive database for future studies on TC patients.

  5. Drinking Water Treatability Database (Database)

    EPA Science Inventory

    The drinking Water Treatability Database (TDB) will provide data taken from the literature on the control of contaminants in drinking water, and will be housed on an interactive, publicly-available USEPA web site. It can be used for identifying effective treatment processes, rec...

  6. Scopus database: a review.

    PubMed

    Burnham, Judy F

    2006-03-08

    The Scopus database provides access to STM journal articles and the references included in those articles, allowing the searcher to search both forward and backward in time. The database can be used for collection development as well as for research. This review provides information on the key points of the database and compares it to Web of Science. Neither database is all-inclusive; rather, they complement each other. If a library can afford only one, the choice must be based on institutional needs.

  7. Scopus database: a review

    PubMed Central

    Burnham, Judy F

    2006-01-01

    The Scopus database provides access to STM journal articles and the references included in those articles, allowing the searcher to search both forward and backward in time. The database can be used for collection development as well as for research. This review provides information on the key points of the database and compares it to Web of Science. Neither database is all-inclusive; rather, they complement each other. If a library can afford only one, the choice must be based on institutional needs. PMID:16522216

  8. Public variant databases: liability?

    PubMed

    Thorogood, Adrian; Cook-Deegan, Robert; Knoppers, Bartha Maria

    2017-07-01

    Public variant databases support the curation, clinical interpretation, and sharing of genomic data, thus reducing harmful errors or delays in diagnosis. As variant databases are increasingly relied on in the clinical context, there is concern that negligent variant interpretation will harm patients and attract liability. This article explores the evolving legal duties of laboratories, public variant databases, and physicians in clinical genomics and recommends a governance framework for databases to promote responsible data sharing.Genet Med advance online publication 15 December 2016.

  9. An experimental investigation of masking in the US FDA adverse event reporting system database.

    PubMed

    Wang, Hsin-wei; Hochberg, Alan M; Pearson, Ronald K; Hauben, Manfred

    2010-12-01

    adjudication was available from a previous study. The original disproportionality analysis identified 8719 SDRs for the 63 PTs. The SU-based unmasking protocols generated variable numbers of masked SDRs ranging from 38 to 156, representing a 0.43-1.8% increase over the number of baseline SDRs. A significant number of baseline SDRs were also lost in the course of our experiments. The trend in the number of gained SDRs per report removed was inversely related to the number of lost SDRs per protocol. Both the number and nature of the reports removed influenced the number of gained SDRs observed. The purely empirical protocols unmasked up to ten times as many SDRs. None of the masked SDRs had strong external evidence supporting a causal association. Most involved associations for which there was no external supporting evidence or were in the original product label. For two masked SDRs, there was external evidence of a possible causal association. We documented masking in the FDA AERS database. Attempts at unmasking SDRs using practically implementable protocols produced only small changes in the output of SDRs in our analysis. This is undoubtedly related to the large size and diversity of the database, but the complex interdependencies between drugs and events in authentic spontaneous reporting system (SRS) databases, and the impact of measures of statistical variability that are typically used in real-world disproportionality analysis, may be additional factors that constrain the discovery of masked SDRs and which may also operate in pharmaceutical company databases. Empirical determination of the most influential drugs may uncover significantly more SDRs than protocols based on predetermined statistical selection rules but are impractical except possibly for evaluating specific events. Routine global exercises to elicit masking, especially in large health authority databases are not justified based on results available to date. 
Exercises to elicit unmasking should be driven by
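The masking effect discussed above can be illustrated with the proportional reporting ratio (PRR), one common disproportionality measure (the study's own measure may differ). The counts below are invented purely to show the mechanism: a heavily reported "masking" drug inflates the event's background rate, and removing its reports lifts another drug's PRR above a conventional signalling threshold.

```python
def prr(a, b, c, d):
    """Proportional reporting ratio for a drug-event pair:
    (a / (a + b)) / (c / (c + d)), where a, b are reports for the drug
    of interest with/without the event, and c, d are reports for all
    other drugs with/without the event."""
    return (a / (a + b)) / (c / (c + d))

# Invented counts: a masking drug contributes many background reports
# of the event, diluting the signal for the drug of interest.
masked = prr(a=20, b=980, c=600, d=19400)    # background includes masker
unmasked = prr(a=20, b=980, c=100, d=14900)  # masker's reports removed
# masked ~0.67 (no signal); unmasked = 3.0 (above a common PRR >= 2 rule)
```

This toy example also hints at why unmasking gains are hard to realize in a database as large and diverse as AERS: the background counts c and d are dominated by many drugs, so removing any one of them usually shifts the ratio only slightly.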

  10. 324 Building Baseline Radiological Characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R.J. Reeder, J.C. Cooper

    This report documents the analysis of radiological data collected as part of the characterization study performed in 1998. The study was performed to create a baseline of the radiological conditions in the 324 Building.

  11. Technical Facilities Management, Loan Pool, and Calibration

    NASA Technical Reports Server (NTRS)

    Smith, Jacob

    2011-01-01

    My work at JPL for the SURF program began on June 11, 2012 with the Technical Facilities Management group (TFM). As well as TFM, I worked with Loan Pool and Metrology to help them with various tasks. Unlike a lot of other interns, I did not have a specific project, but rather many different tasks to be completed over the course of the 10 weeks. The first task to be completed was to sort through old certification reports in six boxes to locate reports that needed to be archived into a digital database. None of the reports in these boxes needed to be archived; rather, they were to be shredded. The reports went back to the early 1980s and up to the early 2000s. I was looking for reports dated from 2002 to 2012.

  12. THE ECOTOX DATABASE

    EPA Science Inventory

    The database provides chemical-specific toxicity information for aquatic life, terrestrial plants, and terrestrial wildlife. ECOTOX is a comprehensive ecotoxicology database and is therefore essential for providing and suppoirting high quality models needed to estimate population...

  13. The use of a medico economic database as a part of French apheresis registry.

    PubMed

    Kanouni, T; Aubas, P; Heshmati, F

    2017-02-01

    An apheresis registry is a part of each learned apheresis society. The interest in this is obvious, in terms of knowledge of the practice of apheresis, adverse events, and technical issues. However, because of the weight of data entry it could never be exhaustive and some data will be missing. While continuing our registry efforts and our efforts to match with other existing registries, we decided to extend the data collection to a medico-economic database that is available in France, the Programme de Médicalisation du Système d'Information (PMSI) that has covered reimbursement information for each public or private hospital since 2007. It contains almost all apheresis procedures in all apheresis fields, demographic patient data, and primary and related diagnoses, among other data. Although this data does not include technical apheresis issues or other complications of the procedures, its interest is great and it is complementary to the registry. From 2003-2014, we have recorded 250,585 apheresis procedures, for 48,428 patients. We showed that the data are reliable and exhaustive. The information shows a perfect real life practice in apheresis, regarding indications, the rhythm and the duration of apheresis treatment. This prospective data collection is sustainable and allows us to assess the impact of healthcare guidelines. Our objective is to extend the data collection and match it to other existing databases; this will allow us to conduct, for example, a cohort study specifically for ECP. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Public variant databases: liability?

    PubMed Central

    Thorogood, Adrian; Cook-Deegan, Robert; Knoppers, Bartha Maria

    2017-01-01

    Public variant databases support the curation, clinical interpretation, and sharing of genomic data, thus reducing harmful errors or delays in diagnosis. As variant databases are increasingly relied on in the clinical context, there is concern that negligent variant interpretation will harm patients and attract liability. This article explores the evolving legal duties of laboratories, public variant databases, and physicians in clinical genomics and recommends a governance framework for databases to promote responsible data sharing. Genet Med advance online publication 15 December 2016 PMID:27977006

  15. Database for propagation models

    NASA Astrophysics Data System (ADS)

    Kantak, Anil V.

    1991-07-01

    A propagation researcher or a systems engineer who intends to use the results of a propagation experiment is generally faced with various database tasks, such as selecting the computer software and hardware and writing the programs to pass the data through the models of interest. This task is repeated every time a new experiment is conducted or the same experiment is carried out at a different location, generating different data. Thus the users of this data have to spend a considerable portion of their time learning how to implement the computer hardware and software towards the desired end. This situation could be eased considerably if an easily accessible propagation database were created that held all the accepted (standardized) propagation phenomena models approved by the propagation research community. The handling of data would also become easier for the user. Such a database can stimulate the growth of propagation research only if it is available to all researchers, so that the results of an experiment conducted by one researcher can be examined independently by another without different hardware and software being used. The database may be made flexible so that researchers need not be confined only to its contents. The database can also help researchers in that they will not have to document the software and hardware tools used in their research, since the propagation research community will already know the database. The following sections show a possible database construction, as well as properties of the database for propagation research.

  16. First Grade Baseline Evaluation

    ERIC Educational Resources Information Center

    Center for Innovation in Assessment (NJ1), 2013

    2013-01-01

    The First Grade Baseline Evaluation is an optional tool that can be used at the beginning of the school year to help teachers get to know the reading and language skills of each student. The evaluation is composed of seven screenings. Teachers may use the entire evaluation or choose to use those individual screenings that they find most beneficial…

  17. Precise regional baseline estimation using a priori orbital information

    NASA Technical Reports Server (NTRS)

    Lindqwister, Ulf J.; Lichten, Stephen M.; Blewitt, Geoffrey

    1990-01-01

    A solution using GPS measurements acquired during the CASA Uno campaign has resulted in 3-4 mm horizontal daily baseline repeatability and 13 mm vertical repeatability for a 729 km baseline located in North America. The agreement with VLBI is at the level of 10-20 mm for all components. The results were obtained with the GIPSY orbit determination and baseline estimation software and are based on five single-day data arcs spanning the 20, 21, 25, 26, and 27 of January, 1988. The estimation strategy included resolving the carrier phase integer ambiguities, utilizing an optimal set of fixed reference stations, and constraining GPS orbit parameters by applying a priori information. A multiday GPS orbit and baseline solution has yielded similar 2-4 mm horizontal daily repeatabilities for the same baseline, consistent with the constrained single-day arc solutions. The application of weak constraints to the orbital state for single-day data arcs produces solutions which approach the precise orbits obtained with unconstrained multiday arc solutions.
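The daily-repeatability figures quoted above are conventionally the RMS scatter of independent single-day estimates of a baseline component about their mean. A minimal version of that computation, with invented numbers rather than the CASA Uno results:

```python
import numpy as np

def baseline_repeatability(daily_estimates):
    """RMS scatter of single-day baseline-component estimates about
    their mean -- the usual repeatability measure in GPS geodesy."""
    est = np.asarray(daily_estimates, dtype=float)
    return float(np.sqrt(np.mean((est - est.mean()) ** 2)))

# Five single-day estimates of one horizontal component, expressed as
# offsets in mm from a nominal value (invented, not the CASA Uno data)
east_offsets_mm = [2.1, -1.4, 0.3, -2.7, 1.7]
rep = baseline_repeatability(east_offsets_mm)   # ~1.8 mm scatter
```

Computing this separately for the horizontal and vertical components over the five daily arcs is what yields figures of the "3-4 mm horizontal, 13 mm vertical" form reported above.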

  18. FishTraits Database

    USGS Publications Warehouse

    Angermeier, Paul L.; Frimpong, Emmanuel A.

    2009-01-01

    The need for integrated and widely accessible sources of species traits data to facilitate studies of ecology, conservation, and management has motivated development of traits databases for various taxa. In spite of the increasing number of traits-based analyses of freshwater fishes in the United States, no consolidated database of traits of this group exists publicly, and much useful information on these species is documented only in obscure sources. The largely inaccessible and unconsolidated traits information makes large-scale analysis involving many fishes and/or traits particularly challenging. FishTraits is a database of >100 traits for 809 (731 native and 78 exotic) fish species found in freshwaters of the conterminous United States, including 37 native families and 145 native genera. The database contains information on four major categories of traits: (1) trophic ecology, (2) body size and reproductive ecology (life history), (3) habitat associations, and (4) salinity and temperature tolerances. Information on geographic distribution and conservation status is also included. Together, we refer to the traits, distribution, and conservation status information as attributes. Descriptions of attributes are available here. Many sources were consulted to compile attributes, including state and regional species accounts and other databases.

  19. The 2014 Nucleic Acids Research Database Issue and an updated NAR online Molecular Biology Database Collection.

    PubMed

    Fernández-Suárez, Xosé M; Rigden, Daniel J; Galperin, Michael Y

    2014-01-01

    The 2014 Nucleic Acids Research Database Issue includes descriptions of 58 new molecular biology databases and recent updates to 123 databases previously featured in NAR or other journals. For convenience, the issue is now divided into eight sections that reflect major subject categories. Among the highlights of this issue are six databases of the transcription factor binding sites in various organisms and updates on such popular databases as CAZy, Database of Genomic Variants (DGV), dbGaP, DrugBank, KEGG, miRBase, Pfam, Reactome, SEED, TCDB and UniProt. There is a strong block of structural databases, which includes, among others, the new RNA Bricks database, updates on PDBe, PDBsum, ArchDB, Gene3D, ModBase, Nucleic Acid Database and the recently revived iPfam database. An update on the NCBI's MMDB describes VAST+, an improved tool for protein structure comparison. Two articles highlight the development of the Structural Classification of Proteins (SCOP) database: one describes SCOPe, which automates assignment of new structures to the existing SCOP hierarchy; the other one describes the first version of SCOP2, with its more flexible approach to classifying protein structures. This issue also includes a collection of articles on bacterial taxonomy and metagenomics, which includes updates on the List of Prokaryotic Names with Standing in Nomenclature (LPSN), Ribosomal Database Project (RDP), the Silva/LTP project and several new metagenomics resources. The NAR online Molecular Biology Database Collection, http://www.oxfordjournals.org/nar/database/c/, has been expanded to 1552 databases. The entire Database Issue is freely available online on the Nucleic Acids Research website (http://nar.oxfordjournals.org/).

  20. GPS baseline configuration design based on robustness analysis

    NASA Astrophysics Data System (ADS)

    Yetkin, M.; Berber, M.

    2012-11-01

    The robustness analysis results obtained from a Global Positioning System (GPS) network are dramatically influenced by the configuration of the observed baselines. The selection of optimal GPS baselines may allow for a cost-effective survey campaign and a sufficiently robust network. Furthermore, using the approach described in this paper, the required number of sessions, the baselines to be observed, and the significance levels for statistical testing and robustness analysis can be determined even before the GPS campaign starts. In this study, we propose a robustness criterion for the optimal design of geodetic networks, and present a very simple and efficient algorithm based on this criterion for the selection of optimal GPS baselines. We also show the relationship between the number of sessions and the non-centrality parameter. Finally, a numerical example is given to verify the efficacy of the proposed approach.