Sample records for modern database management

  1. Database Management

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Management of the data within a planetary data system (PDS) is addressed. Principles of modern data management are described and several large NASA scientific database systems are examined. Data management in PDS is outlined and the major data management issues are introduced.

  2. Development of expert systems for analyzing electronic documents

    NASA Astrophysics Data System (ADS)

    Al-Azzawi, Abeer Yassin; Shidlovskiy, S.; Jamal, A. A.

    2018-05-01

    The paper analyses a Database Management System (DBMS). Expert systems, databases, and database technology have become an essential component of everyday life in modern society. As databases are widely used in every organization with a computer system, data resource control and data management are very important [1]. A DBMS is the most significant tool developed to serve multiple users in a database environment; it consists of programs that enable users to create and maintain a database. This paper focuses on the development of a database management system for the General Directorate for Education of Diyala in Iraq (GDED) using CLIPS, Java NetBeans and Alfresco, together with system components previously developed at Tomsk State University at the Faculty of Innovative Technology.

  3. Planning the future of JPL's management and administrative support systems around an integrated database

    NASA Technical Reports Server (NTRS)

    Ebersole, M. M.

    1983-01-01

    JPL's management and administrative support systems have been developed piecemeal, without consistency in design approach, over the past twenty years. These systems are now proving inadequate to support effective management of tasks and administration of the Laboratory. New approaches are needed. Modern database management technology has the potential to provide the foundation for more effective administrative tools for JPL managers and administrators. Plans for upgrading JPL's management and administrative systems over a six-year period, evolving around the development of an integrated management and administrative database, are discussed.

  4. Taking control of your digital library: how modern citation managers do more than just referencing.

    PubMed

    Mahajan, Amit K; Hogarth, D Kyle

    2013-12-01

    Physicians are constantly navigating the overwhelming body of medical literature available on the Internet. Although early citation managers were capable of limited searching of index databases and tedious bibliography production, modern versions of citation managers such as EndNote, Zotero, and Mendeley are powerful web-based tools for searching, organizing, and sharing medical literature. Effortless point-and-click functions provide physicians with the ability to develop robust digital libraries filled with literature relevant to their fields of interest. In addition to easily creating manuscript bibliographies, various citation managers allow physicians to readily access medical literature, share references for teaching purposes, collaborate with colleagues, and even participate in social networking. If physicians are willing to invest the time to familiarize themselves with modern citation managers, they will reap great benefits in the future.

  5. Organizing, exploring, and analyzing antibody sequence data: the case for relational-database managers.

    PubMed

    Owens, John

    2009-01-01

    Technological advances in the acquisition of DNA and protein sequence information and the resulting onrush of data can quickly overwhelm the scientist unprepared for the volume of information that must be evaluated and carefully dissected to discover its significance. Few laboratories have the luxury of dedicated personnel to organize, analyze, or consistently record a mix of arriving sequence data. A methodology based on a modern relational-database manager is presented that is both a natural storage vessel for antibody sequence information and a conduit for organizing and exploring sequence data and accompanying annotation text. The expertise necessary to implement such a plan is equal to that required by electronic word processors or spreadsheet applications. Antibody sequence projects maintained as independent databases are selectively unified by the relational-database manager into larger database families that contribute to local analyses, reports, and interactive HTML pages, or are exported to facilities dedicated to sophisticated sequence analysis techniques. Database files are transposable among current versions of Microsoft, Macintosh, and UNIX operating systems.
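
    The "database families" idea above can be illustrated with Python's built-in sqlite3 module, whose ATTACH DATABASE statement lets several independent database files be queried as one. The table and column names below are assumptions for illustration, not the author's actual schema.

    ```python
    import sqlite3

    # Each antibody project lives in its own database file; the relational
    # manager unifies them into a larger "database family" for cross-project
    # queries (throwaway files, illustrative schema).
    con = sqlite3.connect("projectA.db")
    con.execute("""CREATE TABLE IF NOT EXISTS sequences (
                       clone_id TEXT PRIMARY KEY,
                       chain    TEXT,      -- 'heavy' or 'light'
                       sequence TEXT,
                       note     TEXT)""")
    con.execute("INSERT OR REPLACE INTO sequences VALUES "
                "('mAb-17', 'heavy', 'EVQLVESGGG', 'hypothetical clone')")

    # Attach a second, independent project database and query both as one family.
    con.execute("ATTACH DATABASE 'projectB.db' AS b")
    con.execute("""CREATE TABLE IF NOT EXISTS b.sequences (
                       clone_id TEXT PRIMARY KEY,
                       chain TEXT, sequence TEXT, note TEXT)""")

    rows = con.execute("""SELECT clone_id, sequence FROM sequences   WHERE chain = 'heavy'
                          UNION ALL
                          SELECT clone_id, sequence FROM b.sequences WHERE chain = 'heavy'""").fetchall()
    print(rows)
    con.close()
    ```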

  6. Managing vulnerabilities and achieving compliance for Oracle databases in a modern ERP environment

    NASA Astrophysics Data System (ADS)

    Hölzner, Stefan; Kästle, Jan

    In this paper we summarize good practices on how to achieve compliance for an Oracle database in combination with an ERP system. We use an integrated approach to cover both the management of vulnerabilities (preventive measures) and the use of logging and auditing features (detective controls). This concise overview focuses on the combination of Oracle and SAP and its dependencies, but also outlines security issues that arise with other ERP systems. Using practical examples, we demonstrate common vulnerabilities and countermeasures as well as guidelines for the use of auditing features.

  7. NNDC Stand: Activities and Services of the National Nuclear Data Center

    NASA Astrophysics Data System (ADS)

    Pritychenko, B.; Arcilla, R.; Burrows, T. W.; Dunford, C. L.; Herman, M. W.; McLane, V.; Obložinský, P.; Sonzogni, A. A.; Tuli, J. K.; Winchell, D. F.

    2005-05-01

    The National Nuclear Data Center (NNDC) collects, evaluates, and disseminates nuclear physics data for basic nuclear research and applied nuclear technologies, including energy, shielding, medical and homeland security applications. In 2004, to meet the needs of the nuclear data user community, NNDC completed a project to modernize the data storage and management of its databases and began offering new nuclear data Web services. The principles of database and Web application development as well as related nuclear reaction and structure database services are briefly described.

  8. Contraception supply chain challenges: a review of evidence from low- and middle-income countries.

    PubMed

    Mukasa, Bakali; Ali, Moazzam; Farron, Madeline; Van de Weerdt, Renee

    2017-10-01

    To identify and assess factors determining the functioning of supply chain systems for modern contraception in low- and middle-income countries (LMICs), and to identify challenges contributing to contraception stockouts that may lead to unmet need. Scientific databases and grey literature were searched, including the Database of Abstracts of Reviews of Effectiveness (DARE), PubMed, MEDLINE, POPLINE, CINAHL, Academic Search Complete, Science Direct, Web of Science, Cochrane Central, Google Scholar, WHO databases and websites of key international organisations. Studies indicated that supply chain system inefficiencies significantly affect availability of modern FP and contraception commodities in LMICs, especially in rural public facilities where distribution barriers may be acute. Supply chain failures or bottlenecks may be attributed to: weak and poorly institutionalized logistics management information systems (LMIS), poor physical infrastructure in LMICs, lack of trained and dedicated staff for supply chain management, inadequate funding, and rigid government policies on task sharing. However, there is evidence that implementing effective LMISs and involving public and private providers in distribution channels resulted in reductions in medical commodities' stockout rates. Supply chain bottlenecks contribute significantly to persistent high stockout rates for modern contraceptives in LMICs. Interventions aimed at enhancing uptake of contraceptives to reduce the problem of unmet need in LMICs should make strong commitments towards strengthening these countries' health commodities supply chain management systems. Current evidence is limited, and additional well-designed implementation research on contraception supply chain systems is warranted to gain further understanding and insights on the determinants of supply chain bottlenecks and their impact on stockouts of contraception commodities.

  9. A new Volcanic managEment Risk Database desIgn (VERDI): Application to El Hierro Island (Canary Islands)

    NASA Astrophysics Data System (ADS)

    Bartolini, S.; Becerril, L.; Martí, J.

    2014-11-01

    One of the most important issues in modern volcanology is the assessment of volcanic risk, which will depend - among other factors - on both the quantity and quality of the available data and an optimum storage mechanism. This will require the design of purpose-built databases that take into account data format and availability and afford easy data storage and sharing, and will provide for a more complete risk assessment that combines different analyses but avoids any duplication of information. Data contained in any such database should facilitate spatial and temporal analysis that will (1) produce probabilistic hazard models for future vent opening, (2) simulate volcanic hazards and (3) assess their socio-economic impact. We describe the design of a new spatial database structure, VERDI (Volcanic managEment Risk Database desIgn), which allows different types of data, including geological, volcanological, meteorological, monitoring and socio-economic information, to be manipulated, organized and managed. A central aim is to ensure that VERDI will serve as a tool for connecting different kinds of data sources, GIS platforms and modeling applications. We present an overview of the database design, its components and the attributes that play an important role in the database model. The potential of the VERDI structure and the possibilities it offers for data organization are shown here through its application on El Hierro (Canary Islands). The VERDI database will provide scientists and decision makers with a useful tool that will assist in conducting volcanic risk assessment and management.

  10. Laboratory Information Systems.

    PubMed

    Henricks, Walter H

    2015-06-01

    Laboratory information systems (LISs) supply mission-critical capabilities for the vast array of information-processing needs of modern laboratories. LIS architectures include mainframe, client-server, and thin client configurations. The LIS database software manages a laboratory's data. LIS dictionaries are database tables that a laboratory uses to tailor an LIS to the unique needs of that laboratory. Anatomic pathology LIS (APLIS) functions play key roles throughout the pathology workflow, and laboratories rely on LIS management reports to monitor operations. This article describes the structure and functions of APLISs, with emphasis on their roles in laboratory operations and their relevance to pathologists. Copyright © 2015 Elsevier Inc. All rights reserved.
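
    The "dictionary" concept is concrete enough for a small sketch: a site-specific table maps the laboratory's own test codes to names, units, and reference ranges, against which results are interpreted. A minimal sketch in Python's sqlite3, with an invented schema and entries, not drawn from any particular LIS:

    ```python
    import sqlite3

    # An LIS dictionary as a database table: local test codes with the
    # units and reference ranges this laboratory uses (illustrative values).
    lis = sqlite3.connect(":memory:")
    lis.execute("""CREATE TABLE test_dictionary (
                       code     TEXT PRIMARY KEY,
                       name     TEXT,
                       units    TEXT,
                       ref_low  REAL,
                       ref_high REAL)""")
    lis.execute("INSERT INTO test_dictionary VALUES ('K', 'Potassium', 'mmol/L', 3.5, 5.1)")

    def flag(code, value):
        """Flag a result as Low/Normal/High against the lab's own dictionary entry."""
        lo, hi = lis.execute("SELECT ref_low, ref_high FROM test_dictionary WHERE code = ?",
                             (code,)).fetchone()
        return "L" if value < lo else "H" if value > hi else "N"

    print(flag("K", 5.8))   # -> 'H'
    ```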

  11. That Over-Used and Much-Abused 4-Letter Word: DATA

    NASA Astrophysics Data System (ADS)

    Griffin, Elizabeth M.

    2015-08-01

    In its prime state, DATA is a Latin word meaning "[things] given", a plural noun derived from the verb "To Give". Its singular form is DATUM. Modern conversation equates DATA with "Information", while modern philosophies on information management are getting entwined with parallel philosophies on knowledge management. In some ways that is a positive development, and is greatly assisted by Open Access and Internet policies, but in others it is more detrimental, by threatening to blur the essential distinction between objectivity and subjectivity in our science. We examine that essential distinction from the view-points of observers, authors (and publishers), and database managers, and suggest where, when and how the distinctiveness of their fundamental contributions to the communication and validation of research results should be respected and upheld.

  12. That over-used and much abused 4-letter word: DATA

    NASA Astrophysics Data System (ADS)

    Griffin, Elizabeth

    2016-10-01

    In its prime state, DATA is a Latin word meaning "[things] given", a plural noun derived from the verb "To Give". Its singular form is DATUM. Modern conversation equates DATA with "Information", while modern philosophies on information management are getting entwined with parallel philosophies on knowledge management. In some ways that is a positive development, and is greatly assisted by Open Access and Internet policies, but in others it is more detrimental, by threatening to blur the essential distinction between objectivity and subjectivity in our science. We examine that essential distinction from the view-points of observers, authors (and publishers), and database managers, and suggest where, when and how the distinctiveness of their fundamental contributions to the communication and validation of research results should be respected and upheld.

  13. 36 CFR 1225.24 - When can an agency apply previously approved schedules to electronic records?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT SCHEDULING RECORDS § 1225.24 When... must notify the National Archives and Records Administration, Modern Records Programs (NWM), 8601... authority reference; and (v) Format of the records (e.g., database, scanned images, digital photographs, etc...

  14. 36 CFR 1225.24 - When can an agency apply previously approved schedules to electronic records?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT SCHEDULING RECORDS § 1225.24 When... must notify the National Archives and Records Administration, Modern Records Programs (NWM), 8601... authority reference; and (v) Format of the records (e.g., database, scanned images, digital photographs, etc...

  15. 36 CFR 1225.24 - When can an agency apply previously approved schedules to electronic records?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT SCHEDULING RECORDS § 1225.24 When... must notify the National Archives and Records Administration, Modern Records Programs (NWM), 8601... authority reference; and (v) Format of the records (e.g., database, scanned images, digital photographs, etc...

  16. The National Deep-Sea Coral and Sponge Database: A Comprehensive Resource for United States Deep-Sea Coral and Sponge Records

    NASA Astrophysics Data System (ADS)

    Dornback, M.; Hourigan, T.; Etnoyer, P.; McGuinn, R.; Cross, S. L.

    2014-12-01

    Research on deep-sea corals has expanded rapidly over the last two decades, as scientists began to realize their value as long-lived structural components of high biodiversity habitats and archives of environmental information. The NOAA Deep Sea Coral Research and Technology Program's National Database for Deep-Sea Corals and Sponges is a comprehensive resource for georeferenced data on these organisms in U.S. waters. The National Database currently includes more than 220,000 deep-sea coral records representing approximately 880 unique species. Database records from museum archives, commercial and scientific bycatch, and from journal publications provide baseline information with relatively coarse spatial resolution dating back as far as 1842. These data are complemented by modern, in-situ submersible observations with high spatial resolution, from surveys conducted by NOAA and NOAA partners. Management of high volumes of modern high-resolution observational data can be challenging. NOAA is working with our data partners to incorporate this occurrence data into the National Database, along with images and associated information related to geoposition, time, biology, taxonomy, environment, provenance, and accuracy. NOAA is also working to link associated datasets collected by our program's research, to properly archive them to the NOAA National Data Centers, to build a robust metadata record, and to establish a standard protocol to simplify the process. Access to the National Database is provided through an online mapping portal. The map displays point based records from the database. Records can be refined by taxon, region, time, and depth. The queries and extent used to view the map can also be used to download subsets of the database. The database, map, and website are already in use by NOAA, regional fishery management councils, and regional ocean planning bodies, but we envision it as a model that can expand to accommodate data on a global scale.
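
    The refine-and-download workflow described above (filtering records by taxon, region, time, and depth) reduces to a parameterized query whose clauses are added only for the filters a user actually sets. A minimal sketch in Python's sqlite3; the field names and sample record are illustrative assumptions, not NOAA's published schema.

    ```python
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE records (
                      record_id TEXT, taxon TEXT, region TEXT,
                      obs_year INTEGER, depth_m REAL)""")
    db.execute("INSERT INTO records VALUES ('R1', 'Octocorallia', 'Gulf of Mexico', 1992, 350.0)")

    def subset(taxon=None, region=None, year_min=None, depth_max=None):
        """Build the WHERE clause only from the filters the user set."""
        clauses, params = ["1=1"], []
        if taxon is not None:     clauses.append("taxon = ?");     params.append(taxon)
        if region is not None:    clauses.append("region = ?");    params.append(region)
        if year_min is not None:  clauses.append("obs_year >= ?"); params.append(year_min)
        if depth_max is not None: clauses.append("depth_m <= ?");  params.append(depth_max)
        sql = "SELECT record_id, taxon, depth_m FROM records WHERE " + " AND ".join(clauses)
        return db.execute(sql, params).fetchall()

    # e.g. modern, relatively shallow octocoral records from one region:
    print(subset(taxon="Octocorallia", region="Gulf of Mexico", year_min=1980, depth_max=500))
    ```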

  17. LARCRIM user's guide, version 1.0

    NASA Technical Reports Server (NTRS)

    Davis, John S.; Heaphy, William J.

    1993-01-01

    LARCRIM is a relational database management system (RDBMS) which performs the conventional duties of an RDBMS with the added feature that it can store attributes which consist of arrays or matrices. This makes it particularly valuable for scientific data management. It is accessible as a stand-alone system and through an application program interface. The stand-alone system may be executed in two modes: menu or command. The menu mode prompts the user for the input required to create, update, and/or query the database. The command mode requires the direct input of LARCRIM commands. Although LARCRIM is an update of an old database family, its performance on modern computers is quite satisfactory. LARCRIM is written in FORTRAN 77 and runs under the UNIX operating system. Versions have been released for the following computers: SUN (3 & 4), Convex, IRIS, Hewlett-Packard, CRAY 2 & Y-MP.

  18. Ultra-Structure database design methodology for managing systems biology data and analyses

    PubMed Central

    Maier, Christopher W; Long, Jeffrey G; Hemminger, Bradley M; Giddings, Morgan C

    2009-01-01

    Background Modern, high-throughput biological experiments generate copious, heterogeneous, interconnected data sets. Research is dynamic, with frequently changing protocols, techniques, instruments, and file formats. Because of these factors, systems designed to manage and integrate modern biological data sets often end up as large, unwieldy databases that become difficult to maintain or evolve. The novel rule-based approach of the Ultra-Structure design methodology presents a potential solution to this problem. By representing both data and processes as formal rules within a database, an Ultra-Structure system constitutes a flexible framework that enables users to explicitly store domain knowledge in both a machine- and human-readable form. End users themselves can change the system's capabilities without programmer intervention, simply by altering database contents; no computer code or schemas need be modified. This provides flexibility in adapting to change, and allows integration of disparate, heterogeneous data sets within a small core set of database tables, facilitating joint analysis and visualization without becoming unwieldy. Here, we examine the application of Ultra-Structure to our ongoing research program for the integration of large proteomic and genomic data sets (proteogenomic mapping). Results We transitioned our proteogenomic mapping information system from a traditional entity-relationship design to one based on Ultra-Structure. Our system integrates tandem mass spectrum data, genomic annotation sets, and spectrum/peptide mappings, all within a small, general framework implemented within a standard relational database system. General software procedures driven by user-modifiable rules can perform tasks such as logical deduction and location-based computations. The system is not tied specifically to proteogenomic research, but is rather designed to accommodate virtually any kind of biological research. Conclusion We find Ultra-Structure offers substantial benefits for biological information systems, the largest being the integration of diverse information sources into a common framework. This facilitates systems biology research by integrating data from disparate high-throughput techniques. It also enables us to readily incorporate new data types, sources, and domain knowledge with no change to the database structure or associated computer code. Ultra-Structure may be a significant step towards solving the hard problem of data management and integration in the systems biology era. PMID:19691849

  19. 36 CFR 1225.24 - When can an agency apply previously approved schedules to electronic records?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT SCHEDULING RECORDS § 1225.24 When... must notify the National Archives and Records Administration, Modern Records Programs (NWM), 8601... authority reference; and (v) Format of the records (e.g., database, scanned images, digital photographs, etc...

  20. Computer networks for financial activity management, control and statistics of databases of economic administration at the Joint Institute for Nuclear Research

    NASA Astrophysics Data System (ADS)

    Tyupikova, T. V.; Samoilov, V. N.

    2003-04-01

    Modern information technologies spur the natural sciences to further development. This, however, must be accompanied by an evolution of infrastructures that creates favorable conditions for the development of science and its financial base, and that legally validates and protects new research. Any scientific development entails accounting and legal protection. In this report, we consider a new direction in the software, organization and control of shared databases, using the example of the electronic document handling system that functions in several departments of the Joint Institute for Nuclear Research.

  1. An image database management system for conducting CAD research

    NASA Astrophysics Data System (ADS)

    Gruszauskas, Nicholas; Drukker, Karen; Giger, Maryellen L.

    2007-03-01

    The development of image databases for CAD research is not a trivial task. The collection and management of images and their related metadata from multiple sources is a time-consuming but necessary process. By standardizing and centralizing the methods in which these data are maintained, one can generate subsets of a larger database that match the specific criteria needed for a particular research project in a quick and efficient manner. A research-oriented management system of this type is highly desirable in a multi-modality CAD research environment. An online, web-based database system for the storage and management of research-specific medical image metadata was designed for use with four modalities of breast imaging: screen-film mammography, full-field digital mammography, breast ultrasound and breast MRI. The system was designed to consolidate data from multiple clinical sources and provide the user with the ability to anonymize the data. Input concerning the type of data to be stored as well as desired searchable parameters was solicited from researchers in each modality. The backbone of the database was created using MySQL. A robust and easy-to-use interface for entering, removing, modifying and searching information in the database was created using HTML and PHP. This standardized system can be accessed using any modern web-browsing software and is fundamental for our various research projects on computer-aided detection, diagnosis, cancer risk assessment, multimodality lesion assessment, and prognosis. Our CAD database system stores large amounts of research-related metadata and successfully generates subsets of cases that match the user's desired search criteria.
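
    As a rough illustration of that backbone, the sketch below uses Python's sqlite3 as a stand-in for MySQL: records are anonymized on ingest and research subsets are generated by criteria queries. The table, fields, and anonymization rule are assumptions for illustration only.

    ```python
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE images (
                      image_id INTEGER PRIMARY KEY,
                      modality TEXT,      -- e.g. 'FFDM', 'ultrasound', 'MRI'
                      finding  TEXT,
                      biopsy   TEXT,
                      source   TEXT)""")

    def ingest(record):
        """Drop identifying fields before storage, then insert the metadata."""
        record = {k: v for k, v in record.items() if k not in ("patient_name", "mrn")}
        db.execute("INSERT INTO images (modality, finding, biopsy, source) VALUES (?, ?, ?, ?)",
                   (record["modality"], record["finding"], record["biopsy"], record["source"]))

    ingest({"patient_name": "REDACTED", "mrn": "0000",
            "modality": "ultrasound", "finding": "mass",
            "biopsy": "benign", "source": "clinic A"})

    # Generate the subset of cases matching a study's search criteria:
    print(db.execute("""SELECT image_id FROM images
                        WHERE modality = 'ultrasound' AND biopsy = 'benign'""").fetchall())
    ```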

  2. Evolution of the use of relational and NoSQL databases in the ATLAS experiment

    NASA Astrophysics Data System (ADS)

    Barberis, D.

    2016-09-01

    The ATLAS experiment used for many years a large database infrastructure based on Oracle to store several different types of non-event data: time-dependent detector configuration and conditions data, calibrations and alignments, configurations of Grid sites, catalogues for data management tools, job records for distributed workload management tools, run and event metadata. The rapid development of "NoSQL" databases (structured storage services) in the last five years allowed an extended and complementary usage of traditional relational databases and new structured storage tools in order to improve the performance of existing applications and to extend their functionalities using the possibilities offered by the modern storage systems. The trend is towards using the best tool for each kind of data, separating, for example, the intrinsically relational metadata from payload storage, and records that are frequently updated and benefit from transactions from archived information. Access to all components has to be orchestrated by specialised services that run on front-end machines and shield the user from the complexity of the data storage infrastructure. This paper describes this technology evolution in the ATLAS database infrastructure and presents a few examples of large database applications that benefit from it.
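
    The "best tool for each kind of data" split can be sketched as relational metadata holding a pointer into a separate payload store. Below, sqlite3 plays the relational side and a plain dict stands in for the structured-storage service; all names are illustrative, and ATLAS's actual services are far richer.

    ```python
    import sqlite3, json

    meta = sqlite3.connect(":memory:")          # relational side: transactional metadata
    meta.execute("""CREATE TABLE runs (
                        run_number  INTEGER PRIMARY KEY,
                        start_time  TEXT,
                        payload_key TEXT)""")   # payload_key points into the payload store

    payloads = {}                               # stand-in for a NoSQL/structured store

    def record_run(run_number, start_time, conditions):
        key = f"run/{run_number}/conditions"
        payloads[key] = json.dumps(conditions)          # bulky, rarely-updated payload
        meta.execute("INSERT INTO runs VALUES (?, ?, ?)",
                     (run_number, start_time, key))     # small, queryable, transactional

    record_run(1234, "2016-01-01T00:00:00Z", {"magnet": "on", "beam": "stable"})
    key = meta.execute("SELECT payload_key FROM runs WHERE run_number = 1234").fetchone()[0]
    print(json.loads(payloads[key]))
    ```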

  3. MPD3: a useful medicinal plants database for drug designing.

    PubMed

    Mumtaz, Arooj; Ashfaq, Usman Ali; Ul Qamar, Muhammad Tahir; Anwar, Farooq; Gulzar, Faisal; Ali, Muhammad Amjad; Saari, Nazamid; Pervez, Muhammad Tariq

    2017-06-01

    Medicinal plants are the main natural pools for the discovery and development of new drugs. In the modern era of computer-aided drug designing (CADD), there is a need for prompt efforts to design and construct a useful database management system that allows proper data storage, retrieval and management with a user-friendly interface. An inclusive database having information about classification, activity and a ready-to-dock library of medicinal plants' phytochemicals is therefore required to assist researchers in the field of CADD. The present work was designed to merge the activities of phytochemicals from medicinal plants, their targets and literature references into a single comprehensive database named the Medicinal Plants Database for Drug Designing (MPD3). The newly designed online and downloadable MPD3 contains information about more than 5000 phytochemicals from around 1000 medicinal plants with 80 different activities, more than 900 literature references and 200-plus targets. The designed database is deemed to be very useful for researchers who are engaged in medicinal plants research, CADD and drug discovery/development, with ease of operation and increased efficiency. The designed MPD3 is a comprehensive database which provides most of the information related to medicinal plants on a single platform. MPD3 is freely available at: http://bioinform.info .

  4. Complementary approaches to diagnosing marine diseases: a union of the modern and the classic

    PubMed Central

    Burge, Colleen A.; Friedman, Carolyn S.; Getchell, Rodman; House, Marcia; Mydlarz, Laura D.; Prager, Katherine C.; Renault, Tristan; Kiryu, Ikunari; Vega-Thurber, Rebecca

    2016-01-01

    Linking marine epizootics to a specific aetiology is notoriously difficult. Recent diagnostic successes show that marine disease diagnosis requires both modern, cutting-edge technology (e.g. metagenomics, quantitative real-time PCR) and more classic methods (e.g. transect surveys, histopathology and cell culture). Here, we discuss how this combination of traditional and modern approaches is necessary for rapid and accurate identification of marine diseases, and emphasize how sole reliance on any one technology or technique may lead disease investigations astray. We present diagnostic approaches at different scales, from the macro (environment, community, population and organismal scales) to the micro (tissue, organ, cell and genomic scales). We use disease case studies from a broad range of taxa to illustrate diagnostic successes from combining traditional and modern diagnostic methods. Finally, we recognize the need for increased capacity of centralized databases, networks, data repositories and contingency plans for diagnosis and management of marine disease. PMID:26880839

  5. Complementary approaches to diagnosing marine diseases: a union of the modern and the classic

    USGS Publications Warehouse

    Burge, Colleen A.; Friedman, Carolyn S.; Getchell, Rodman G.; House, Marcia; Lafferty, Kevin D.; Mydlarz, Laura D.; Prager, Katherine C.; Sutherland, Kathryn P.; Renault, Tristan; Kiryu, Ikunari; Vega-Thurber, Rebecca

    2016-01-01

    Linking marine epizootics to a specific aetiology is notoriously difficult. Recent diagnostic successes show that marine disease diagnosis requires both modern, cutting-edge technology (e.g. metagenomics, quantitative real-time PCR) and more classic methods (e.g. transect surveys, histopathology and cell culture). Here, we discuss how this combination of traditional and modern approaches is necessary for rapid and accurate identification of marine diseases, and emphasize how sole reliance on any one technology or technique may lead disease investigations astray. We present diagnostic approaches at different scales, from the macro (environment, community, population and organismal scales) to the micro (tissue, organ, cell and genomic scales). We use disease case studies from a broad range of taxa to illustrate diagnostic successes from combining traditional and modern diagnostic methods. Finally, we recognize the need for increased capacity of centralized databases, networks, data repositories and contingency plans for diagnosis and management of marine disease.

  6. Remote sensing and geographic database management systems applications for the protection and conservation of cultural heritage

    NASA Astrophysics Data System (ADS)

    Palumbo, Gaetano; Powlesland, Dominic

    1996-12-01

    The Getty Conservation Institute is exploring the feasibility of using remote sensing associated with a geographic database management system (GDBMS) in order to provide archaeological and historic site managers with sound evaluations of the tools available for site and information management. The World Heritage Site of Chaco Canyon, New Mexico, a complex of archaeological sites dating to the 10th to the 13th centuries AD, was selected as a test site. Information from excavations conducted there since the 1930s, and a range of documentation generated by the National Park Service, was gathered. NASA's John C. Stennis Space Center contributed multispectral data of the area, and the Jet Propulsion Laboratory contributed data from ATLAS (airborne terrestrial applications sensor) and CAMS (calibrated airborne multispectral scanner) scanners. Initial findings show that while 'automatic monitoring systems' will probably never be a reality, excellent results are possible through careful comparison of historic and modern photographs and digital analysis of remotely sensed data.

  7. DataHub: Knowledge-based data management for data discovery

    NASA Astrophysics Data System (ADS)

    Handley, Thomas H.; Li, Y. Philip

    1993-08-01

    Currently available database technology is largely designed for business data-processing applications, and seems inadequate for scientific applications. The research described in this paper, the DataHub, will address the issues associated with this shortfall in technology utilization and development. The DataHub development is addressing the key issues in scientific data management: scientific database models and resource sharing in a geographically distributed, multi-disciplinary, science research environment. Thus, the DataHub will be a server between the data suppliers and data consumers to facilitate data exchanges, to assist science data analysis, and to provide a systematic approach to science data management. More specifically, the DataHub's objectives are to provide support for (1) exploratory data analysis (i.e., data-driven analysis); (2) data transformations; (3) data semantics capture and usage; (4) analysis-related knowledge capture and usage; and (5) data discovery, ingestion, and extraction. Applying technologies that vary from deductive databases, semantic data models, data discovery, knowledge representation and inferencing, exploratory data analysis techniques and modern man-machine interfaces, DataHub will provide a prototype, integrated environment to support research scientists' needs in multiple disciplines (i.e., oceanography, geology, and atmospheric science) while addressing the more general science data management issues. Additionally, the DataHub will provide data management services to exploratory data analysis applications such as LinkWinds and NCSA's XIMAGE.

  8. 3MdB: the Mexican Million Models database

    NASA Astrophysics Data System (ADS)

    Morisset, C.; Delgado-Inglada, G.

    2014-10-01

    The 3MdB is an original effort to construct a large multipurpose database of photoionization models. It is a more modern version of a previous attempt based on Cloudy3D and IDL tools, and is accessed via MySQL requests. The models are obtained using the well-known and widely used Cloudy photoionization code (Ferland et al., 2013). The database is intended to host grids of models, with different references to identify each project and to facilitate the extraction of the desired data. We present here a description of the way the database is managed and some of the projects that use 3MdB. Anybody can ask for a grid to be run and stored in 3MdB, increasing the visibility of the grid and its potential side applications.
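
    Access "by MySQL requests" might look like the following from Python. The host, credentials, and table and column names here are invented for illustration; the real connection details and schema are published by the 3MdB project.

    ```python
    import mysql.connector  # pip install mysql-connector-python

    # Hypothetical connection and schema, for illustration only.
    cnx = mysql.connector.connect(host="3mdb.example.org", user="guest",
                                  password="guest", database="3MdB")
    cur = cnx.cursor()

    # Pull one project's grid of models, filtered on an input parameter.
    cur.execute("""SELECT model_id, logU, OIII_5007
                   FROM models
                   WHERE ref = %s AND logU BETWEEN %s AND %s""",
                ("some_project", -3.5, -2.5))
    for model_id, logU, oiii in cur:
        print(model_id, logU, oiii)
    cnx.close()
    ```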

  9. Incorporating client-server database architecture and graphical user interface into outpatient medical records.

    PubMed Central

    Fiacco, P. A.; Rice, W. H.

    1991-01-01

    Computerized medical record systems require structured database architectures for information processing. However, the data must be able to be transferred across heterogeneous platform and software systems. Client-server architecture allows for distributive processing of information among networked computers and provides the flexibility needed to link diverse systems together effectively. We have incorporated this client-server model with a graphical user interface into an outpatient medical record system, known as SuperChart, for the Department of Family Medicine at SUNY Health Science Center at Syracuse. SuperChart was developed using SuperCard and Oracle. SuperCard uses modern object-oriented programming to support a hypermedia environment. Oracle is a powerful relational database management system that incorporates a client-server architecture. This provides both a distributed database and distributed processing, which improves performance. PMID:1807732

  10. Database Design and Management in Engineering Optimization.

    DTIC Science & Technology

    1988-02-01

    [OCR-garbled abstract; the recoverable fragments mention methods that arose in the mid-1950s along with modern digital computers, scientific and engineering applications, application software calling standard subroutines from a DBMS library, data definition language (DDL) requirements, and the matrix-type data usually encountered in engineering applications.]

  11. Design research about coastal zone planning and management information system based on GIS and database technologies

    NASA Astrophysics Data System (ADS)

    Huang, Pei; Wu, Sangyun; Feng, Aiping; Guo, Yacheng

    2008-10-01

    With their concentrated populations, abundant resources, developed industry and active economies, coastal areas are bound to become the forward positions and key supporting regions for marine exploitation. In the 21st century, the pressures that coastal zones face include population growth and urbanization, sea-level rise and coastal erosion, freshwater shortage and deterioration of water quality, and degradation of fishery resources. The resources of coastal zones should therefore be planned and used rationally to ensure the sustainable development of economy and environment. This paper presents design research on the construction of a coastal zone planning and management information system based on GIS and database technologies. With this system, the planning results for coastal zones can be queried and displayed conveniently through the system interface. It is concluded that the integrated application of GIS and database technologies provides a new, modern method for the management of coastal zone resources, and makes it possible to ensure the rational development and utilization of coastal zone resources, along with the sustainable development of economy and environment.

  12. PHASE I MATERIALS PROPERTY DATABASE DEVELOPMENT FOR ASME CODES AND STANDARDS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Weiju; Lin, Lianshan

    2013-01-01

    To support the ASME Boiler and Pressure Vessel Code (BPVC) in the modern information era, development of a web-based materials property database was initiated under the supervision of the ASME Committee on Materials. To achieve efficiency, the project draws heavily upon experience from development of the Gen IV Materials Handbook and the Nuclear System Materials Handbook. The effort is divided into two phases. Phase I is planned to deliver a materials data file warehouse that offers a depository for various files containing raw data and background information, and Phase II will provide a relational digital database with advanced features facilitating digital data processing and management. Population of the database will start with materials property data for nuclear applications and expand to data covering the entire ASME Code and Standards, including the piping codes, as the database structure is continuously optimized. The ultimate goal of the effort is to establish a sound cyber infrastructure that supports ASME Codes and Standards development and maintenance.

  13. Experiment Management System for the SND Detector

    NASA Astrophysics Data System (ADS)

    Pugachev, K.

    2017-10-01

    We present a new experiment management system for the SND detector at the VEPP-2000 collider (Novosibirsk). An important part to report on is access to the experimental databases (configuration, conditions and metadata). The system is designed in a client-server architecture, with user interaction through a web interface. The server side includes several logical layers: user interface templates; template variable description and initialization; and implementation details. The templates are meant to require as little IT knowledge as possible. Experiment configuration, conditions and metadata are stored in a database. Node.js, a modern JavaScript framework, was chosen to implement the server side, and a new template engine with an interesting feature was designed. Part of the system has been put into production, including templates for showing and editing the first-level trigger and equipment configurations, as well as for showing experiment metadata and the experiment conditions data index.

  14. Developing genomic knowledge bases and databases to support clinical management: current perspectives.

    PubMed

    Huser, Vojtech; Sincan, Murat; Cimino, James J

    2014-01-01

    Personalized medicine, the ability to tailor diagnostic and treatment decisions for individual patients, is seen as the evolution of modern medicine. We characterize here the informatics resources available today or envisioned in the near future that can support clinical interpretation of genomic test results. We assume a clinical sequencing scenario (germline whole-exome sequencing) in which a clinical specialist, such as an endocrinologist, needs to tailor patient management decisions within his or her specialty (targeted findings) but relies on a genetic counselor to interpret off-target incidental findings. We characterize the genomic input data and list various types of knowledge bases that provide genomic knowledge for generating clinical decision support. We highlight the need for patient-level databases with detailed lifelong phenotype content in addition to genotype data and provide a list of recommendations for personalized medicine knowledge bases and databases. We conclude that no single knowledge base can currently support all aspects of personalized recommendations and that consolidation of several current resources into larger, more dynamic and collaborative knowledge bases may offer a future path forward.

  15. Developing genomic knowledge bases and databases to support clinical management: current perspectives

    PubMed Central

    Huser, Vojtech; Sincan, Murat; Cimino, James J

    2014-01-01

    Personalized medicine, the ability to tailor diagnostic and treatment decisions for individual patients, is seen as the evolution of modern medicine. We characterize here the informatics resources available today or envisioned in the near future that can support clinical interpretation of genomic test results. We assume a clinical sequencing scenario (germline whole-exome sequencing) in which a clinical specialist, such as an endocrinologist, needs to tailor patient management decisions within his or her specialty (targeted findings) but relies on a genetic counselor to interpret off-target incidental findings. We characterize the genomic input data and list various types of knowledge bases that provide genomic knowledge for generating clinical decision support. We highlight the need for patient-level databases with detailed lifelong phenotype content in addition to genotype data and provide a list of recommendations for personalized medicine knowledge bases and databases. We conclude that no single knowledge base can currently support all aspects of personalized recommendations and that consolidation of several current resources into larger, more dynamic and collaborative knowledge bases may offer a future path forward. PMID:25276091

  16. DCMS: A data analytics and management system for molecular simulation.

    PubMed

    Kumar, Anand; Grupcev, Vladimir; Berrada, Meryem; Fogarty, Joseph C; Tu, Yi-Cheng; Zhu, Xingquan; Pandit, Sagar A; Xia, Yuni

    Molecular Simulation (MS) is a powerful tool for studying physical/chemical features of large systems and has seen applications in many scientific and engineering domains. During the simulation process, experiments generate a very large number of atoms and aim to observe their spatial and temporal relationships for scientific analysis. The sheer data volumes and their intensive interactions impose significant challenges for data accessing, managing, and analysis. To date, existing MS software systems fall short on storage and handling of MS data, mainly because they lack a platform to support applications that involve intensive data access and analytical processing. In this paper, we present the database-centric molecular simulation (DCMS) system our team developed in the past few years. The main idea behind DCMS is to store MS data in a relational database management system (DBMS) to take advantage of the declarative query interface (i.e., SQL), data access methods, query processing, and optimization mechanisms of modern DBMSs. A unique challenge is to handle the analytical queries that are often compute-intensive. For that, we developed novel indexing and query processing strategies (including algorithms running on modern co-processors) as integrated components of the DBMS. As a result, researchers can upload and analyze their data using efficient functions implemented inside the DBMS. Index structures are generated to store analysis results that may be interesting to other users, so that the results are readily available without duplicating the analysis. We have developed a prototype of DCMS based on the PostgreSQL system, and experiments using real MS data and workloads show that DCMS significantly outperforms existing MS software systems. We also used it as a platform to test other data management issues such as security and compression.
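
    The database-centric idea can be sketched with trajectory frames stored relationally and a spatial range query expressed declaratively in SQL (sqlite3 here standing in for PostgreSQL; the table layout is an assumption, not the DCMS schema):

    ```python
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE atoms (
                      frame INTEGER, atom_id INTEGER,
                      x REAL, y REAL, z REAL)""")
    db.executemany("INSERT INTO atoms VALUES (?, ?, ?, ?, ?)",
                   [(0, 1, 0.1, 0.2, 0.3), (0, 2, 5.0, 5.1, 5.2),
                    (1, 1, 0.2, 0.2, 0.3), (1, 2, 4.9, 5.0, 5.3)])

    # Spatial range query: atoms inside a box at frame 0 -- the kind of
    # analytical query DCMS accelerates with custom indexes and co-processors.
    box = dict(x0=0.0, x1=1.0, y0=0.0, y1=1.0, z0=0.0, z1=1.0)
    rows = db.execute("""SELECT atom_id FROM atoms
                         WHERE frame = 0
                           AND x BETWEEN :x0 AND :x1
                           AND y BETWEEN :y0 AND :y1
                           AND z BETWEEN :z0 AND :z1""", box).fetchall()
    print(rows)
    ```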

  17. Geoinformatics paves the way for a zoo information system

    NASA Astrophysics Data System (ADS)

    Michel, Ulrich

    2008-10-01

    The use of modern electronic media offers new ways of (environmental) knowledge transfer. All kinds of information can be made quickly available and queryable, and can be processed individually. The Institute for Geoinformatics and Remote Sensing (IGF), in collaboration with the Osnabrueck Zoo, is developing a zoo information system, especially for new media (e.g. mobile devices), which provides information about the animals living there, their natural habitat and endangerment status. In this way, multimedia information is offered to zoo visitors. The implementation of the 2D/3D components is realized with modern database and Mapserver technologies. Among other technologies, the VRML (Virtual Reality Modeling Language) standard is used for the realization of the 3D visualization so that it can be viewed in every conventional web browser. Also, a mobile information system for Pocket PCs, Smartphones and Ultra Mobile PCs (UMPC) is being developed. All contents, including the coordinates, are stored in a PostgreSQL database. The data input, the processing and other administrative operations are executed by a content management system (CMS).

  18. Providing R-Tree Support for Mongodb

    NASA Astrophysics Data System (ADS)

    Xiang, Longgang; Shao, Xiaotian; Wang, Dehao

    2016-06-01

    Supporting large amounts of spatial data is a significant characteristic of modern databases. However, unlike some mature relational databases, such as Oracle and PostgreSQL, most current burgeoning NoSQL databases are not well designed for storing geospatial data, which is becoming increasingly important in various fields. In this paper, we propose a novel method to provide an R-tree index, as well as corresponding spatial range query and nearest neighbour query functions, for MongoDB, one of the most prevalent NoSQL databases. First, after in-depth analysis of MongoDB's features, we devise an efficient tabular document structure which flattens the R-tree index into MongoDB collections. Further, relevant mechanisms of R-tree operations are issued, and then we discuss in detail how to integrate the R-tree into MongoDB. Finally, we present experimental results which show that our proposed method outperforms the built-in spatial index of MongoDB. Our research will greatly facilitate big data management issues with MongoDB in a variety of geospatial information applications.
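
    The flattening idea can be sketched as one MongoDB document per R-tree node, each listing its entries' bounding boxes, with a recursive range search descending from the root. This assumes a running MongoDB with pymongo, and a hypothetical document layout rather than the authors' exact structure.

    ```python
    from pymongo import MongoClient  # pip install pymongo; assumes a local mongod

    nodes = MongoClient("localhost", 27017).geo.rtree_nodes  # one document per node

    def intersects(a, b):
        """Axis-aligned MBR overlap test; boxes are [xmin, ymin, xmax, ymax]."""
        return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

    def range_query(node_id, box, hits):
        """Descend the flattened tree, collecting ids of objects whose MBRs intersect box."""
        node = nodes.find_one({"_id": node_id})
        for entry in node["entries"]:
            if intersects(entry["mbr"], box):
                if node["is_leaf"]:
                    hits.append(entry["oid"])               # stored spatial object id
                else:
                    range_query(entry["child"], box, hits)  # child node's _id
        return hits

    # e.g. all objects overlapping a lon/lat window, starting from the root node:
    # results = range_query("root", [114.0, 30.0, 115.0, 31.0], [])
    ```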

  19. Relax with CouchDB - Into the non-relational DBMS era of Bioinformatics

    PubMed Central

    Manyam, Ganiraju; Payton, Michelle A.; Roth, Jack A.; Abruzzo, Lynne V.; Coombes, Kevin R.

    2012-01-01

    With the proliferation of high-throughput technologies, genome-level data analysis has become common in molecular biology. Bioinformaticians are developing extensive resources to annotate and mine biological features from high-throughput data. The underlying database management systems for most bioinformatics software are based on a relational model. Modern non-relational databases offer an alternative that has flexibility, scalability, and a non-rigid design schema. Moreover, with an accelerated development pace, non-relational databases like CouchDB can be ideal tools to construct bioinformatics utilities. We describe CouchDB by presenting three new bioinformatics resources: (a) geneSmash, which collates data from bioinformatics resources and provides automated gene-centric annotations, (b) drugBase, a database of drug-target interactions with a web interface powered by geneSmash, and (c) HapMap-CN, which provides a web interface to query copy number variations from three SNP-chip HapMap datasets. In addition to the web sites, all three systems can be accessed programmatically via web services. PMID:22609849
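
    Because CouchDB exposes documents over plain HTTP as JSON, a resource like geneSmash can be read and written with nothing more than an HTTP client. The database name and document below are illustrative, not the published interface of the tools above.

    ```python
    import requests  # CouchDB speaks plain HTTP + JSON

    BASE = "http://localhost:5984/genes"  # illustrative local CouchDB database

    # Store a schema-free, gene-centric annotation document.
    doc = {"symbol": "TP53", "chromosome": "17", "aliases": ["p53", "LFS1"]}
    requests.put(BASE)                      # create the database if it does not exist
    requests.put(f"{BASE}/TP53", json=doc)  # the document id doubles as the lookup key

    # Retrieve it back through the same REST interface.
    print(requests.get(f"{BASE}/TP53").json())
    ```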

  20. Modernization and multiscale databases at the U.S. Geological Survey

    USGS Publications Warehouse

    Morrison, J.L.

    1992-01-01

    The U.S. Geological Survey (USGS) has begun a digital cartographic modernization program. Keys to that program are the creation of a multiscale database, a feature-based file structure that is derived from a spatial data model, and a series of "templates" or rules that specify the relationships between instances of entities in reality and features in the database. The database will initially hold data collected from the USGS standard map products at scales of 1:24,000, 1:100,000, and 1:2,000,000. The spatial data model is called the digital line graph-enhanced model, and the comprehensive rule set consists of collection rules, product generation rules, and conflict resolution rules. This modernization program will affect the USGS mapmaking process because both digital and graphic products will be created from the database. In addition, non-USGS map users will have more flexibility in uses of the databases. These remarks are those of the session discussant made in response to the six papers and the keynote address given in the session. © 1992.

  1. The New Zealand Tsunami Database: historical and modern records

    NASA Astrophysics Data System (ADS)

    Barberopoulou, A.; Downes, G. L.; Cochran, U. A.; Clark, K.; Scheele, F.

    2016-12-01

    A database of historical (pre-instrumental) and modern (instrumentally recorded) tsunamis that have impacted or been observed in New Zealand has been compiled and published online. New Zealand's tectonic setting, astride an obliquely convergent tectonic boundary on the Pacific Rim, means that it is vulnerable to local, regional and circum-Pacific tsunamis. Despite New Zealand's comparatively short written historical record of c. 200 years there is a wealth of information about the impact of past tsunamis. The New Zealand Tsunami Database currently has 800+ entries that describe >50 high-validity tsunamis. Sources of historical information include witness reports recorded in diaries, notes, newspapers, books, and photographs. Information on recent events comes from tide gauges and other instrumental recordings such as DART® buoys, and media of greater variety, for example, video and online surveys. The New Zealand Tsunami Database is an ongoing project with information added as further historical records come to light. Modern tsunamis are also added to the database once the relevant data for an event has been collated and edited. This paper briefly overviews the procedures and tools used in the recording and analysis of New Zealand's historical tsunamis, with emphasis on database content.

  2. The BioImage Database Project: organizing multidimensional biological images in an object-relational database.

    PubMed

    Carazo, J M; Stelzer, E H

    1999-01-01

    The BioImage Database Project collects and structures multidimensional data sets recorded by various microscopic techniques relevant to modern life sciences. It provides, as precisely as possible, the circumstances in which the sample was prepared and the data were recorded. It grants access to the actual data and maintains links between related data sets. In order to promote the interdisciplinary approach of modern science, it offers a large set of key words, which covers essentially all aspects of microscopy. Nonspecialists can, therefore, access and retrieve significant information recorded and submitted by specialists in other areas. A key issue of the undertaking is to exploit the available technology and to provide a well-defined yet flexible structure for dealing with data. Its pivotal element is, therefore, a modern object relational database that structures the metadata and ameliorates the provision of a complete service. The BioImage database can be accessed through the Internet. Copyright 1999 Academic Press.

  3. Discovering Knowledge from AIS Database for Application in VTS

    NASA Astrophysics Data System (ADS)

    Tsou, Ming-Cheng

    The widespread use of the Automatic Identification System (AIS) has had a significant impact on maritime technology. AIS enables the Vessel Traffic Service (VTS) not only to offer commonly known functions such as identification, tracking and monitoring of vessels, but also to provide rich real-time information that is useful for marine traffic investigation, statistical analysis and theoretical research. However, due to the rapid accumulation of AIS observation data, the VTS platform is often unable quickly and effectively to absorb and analyze it. Traditional observation and analysis methods are becoming less suitable for the modern AIS generation of VTS. In view of this, we applied the same data mining technique used for business intelligence discovery (in Customer Relationship Management (CRM) business marketing) to the analysis of AIS observation data. This recasts the marine traffic problem as a business-marketing problem and integrates technologies such as Geographic Information Systems (GIS), database management systems, data warehousing and data mining to facilitate the discovery of hidden and valuable information in a huge amount of observation data. Consequently, this provides the marine traffic managers with a useful strategic planning resource.

  4. TheHiveDB image data management and analysis framework.

    PubMed

    Muehlboeck, J-Sebastian; Westman, Eric; Simmons, Andrew

    2014-01-06

    The hive database system (theHiveDB) is a web-based brain imaging database, collaboration, and activity system which has been designed as an imaging workflow management system capable of handling cross-sectional and longitudinal multi-center studies. It can be used to organize and integrate existing data from heterogeneous projects as well as data from ongoing studies. It has been conceived to guide and assist the researcher throughout the entire research process, integrating all relevant types of data across modalities (e.g., brain imaging, clinical, and genetic data). TheHiveDB is a modern activity and resource management system capable of scheduling image processing on both private compute resources and the cloud. The activity component supports common image archival and management tasks as well as established pipeline processing (e.g., Freesurfer for extraction of scalar measures from magnetic resonance images). Furthermore, via theHiveDB activity system algorithm developers may grant access to virtual machines hosting versioned releases of their tools to collaborators and the imaging community. The application of theHiveDB is illustrated with a brief use case based on organizing, processing, and analyzing data from the publicly available Alzheimer's Disease Neuroimaging Initiative.

  5. TheHiveDB image data management and analysis framework

    PubMed Central

    Muehlboeck, J-Sebastian; Westman, Eric; Simmons, Andrew

    2014-01-01

    The hive database system (theHiveDB) is a web-based brain imaging database, collaboration, and activity system which has been designed as an imaging workflow management system capable of handling cross-sectional and longitudinal multi-center studies. It can be used to organize and integrate existing data from heterogeneous projects as well as data from ongoing studies. It has been conceived to guide and assist the researcher throughout the entire research process, integrating all relevant types of data across modalities (e.g., brain imaging, clinical, and genetic data). TheHiveDB is a modern activity and resource management system capable of scheduling image processing on both private compute resources and the cloud. The activity component supports common image archival and management tasks as well as established pipeline processing (e.g., Freesurfer for extraction of scalar measures from magnetic resonance images). Furthermore, via theHiveDB activity system algorithm developers may grant access to virtual machines hosting versioned releases of their tools to collaborators and the imaging community. The application of theHiveDB is illustrated with a brief use case based on organizing, processing, and analyzing data from the publicly available Alzheimer's Disease Neuroimaging Initiative. PMID:24432000

  6. Bioinformatics.

    PubMed

    Moore, Jason H

    2007-11-01

    Bioinformatics is an interdisciplinary field that blends computer science and biostatistics with biological and biomedical sciences such as biochemistry, cell biology, developmental biology, genetics, genomics, and physiology. An important goal of bioinformatics is to facilitate the management, analysis, and interpretation of data from biological experiments and observational studies. The goal of this review is to introduce some of the important concepts in bioinformatics that must be considered when planning and executing a modern biological research study. We review database resources as well as data mining software tools.

  7. SISYPHUS: A high performance seismic inversion factory

    NASA Astrophysics Data System (ADS)

    Gokhberg, Alexey; Simutė, Saulė; Boehm, Christian; Fichtner, Andreas

    2016-04-01

    In recent years, massively parallel high-performance computers have become the standard instruments for solving forward and inverse problems in seismology. Software packages dedicated to forward and inverse waveform modelling and specially designed for such computers (SPECFEM3D, SES3D) have become mature and widely available. These packages achieve significant computational performance and provide researchers with an opportunity to solve larger problems at higher resolution within a shorter time. However, a typical seismic inversion process contains various activities that are beyond the common solver functionality. They include management of information on seismic events and stations, 3D models, observed and synthetic seismograms, pre-processing of the observed signals, computation of misfits and adjoint sources, minimization of misfits, and process workflow management. These activities are time consuming, seldom sufficiently automated, and therefore represent a bottleneck that can substantially offset the performance benefits provided by even the most powerful modern supercomputers. Furthermore, the typical system architecture of modern supercomputing platforms is oriented towards maximum computational performance and provides limited standard facilities for automating the supporting activities. We present a prototype solution that automates all aspects of the seismic inversion process and is tuned for modern massively parallel high-performance computing systems. We address several major aspects of the solution architecture: (1) design of an inversion state database for tracing all relevant aspects of the entire solution process, (2) design of an extensible workflow management framework, (3) integration with wave propagation solvers, (4) integration with optimization packages, (5) computation of misfits and adjoint sources, and (6) process monitoring. The inversion state database is a hierarchical structure with branches for the static process setup, inversion iterations, and solver runs, each branch specifying information at the event, station and channel levels. The workflow management framework is based on an embedded scripting engine that allows definition of various workflow scenarios in a high-level scripting language and provides access to all available inversion components represented as standard library functions. At present the SES3D wave propagation solver is integrated in the solution; work is in progress on interfacing with SPECFEM3D. A separate framework is designed for interoperability with an optimization module; the workflow manager and the optimization process run in parallel and cooperate by exchanging messages according to a specially designed protocol. A library of high-performance modules implementing signal pre-processing and misfit and adjoint computations according to established good practices is included. Monitoring is based on information stored in the inversion state database and at present offers a command-line interface; design of a graphical user interface is in progress. The software design fits well into the common massively parallel system architecture featuring a large number of computational nodes running distributed applications under the control of batch-oriented resource managers. The solution prototype has been implemented on the "Piz Daint" supercomputer provided by the Swiss National Supercomputing Centre (CSCS).
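
    The iteration structure that such a workflow manager automates can be sketched as follows (assumed object interfaces, not the actual SISYPHUS scripting library): each event is run through the forward solver, misfits and adjoint sources are computed against the observed records, and the accumulated gradients are handed to the optimizer, with progress traced in the inversion state database.

      # One nonlinear inversion iteration over all events (illustrative).
      def run_iteration(events, model, solver, misfit_fn, optimizer, state_db):
          total_misfit, gradients = 0.0, []
          for event in events:
              synth = solver.forward(model, event)       # e.g. an SES3D run
              misfit, adjoint_src = misfit_fn(synth, state_db.observed(event))
              gradients.append(solver.adjoint(model, event, adjoint_src))
              state_db.record(event, misfit)             # inversion state database
              total_misfit += misfit
          new_model = optimizer.step(model, gradients)   # separate optimization module
          return new_model, total_misfit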

  8. [Design and establishment of modern literature database about acupuncture Deqi].

    PubMed

    Guo, Zheng-rong; Qian, Gui-feng; Pan, Qiu-yin; Wang, Yang; Xin, Si-yuan; Li, Jing; Hao, Jie; Hu, Ni-juan; Zhu, Jiang; Ma, Liang-xiao

    2015-02-01

    A search on acupuncture Deqi was conducted in four Chinese-language biomedical databases (CNKI, Wanfang, VIP and CBM) and in PubMed, using keywords such as "Deqi", "needle sensation", "needling feeling", "needle feel" and "obtaining qi". A "Modern Literature Database for Acupuncture Deqi" was then established using Microsoft SQL Server 2005 Express Edition, defining the contents, data types, information structure and logical constraints of the system table fields. From this database, detailed inquiries can be made about general information from clinical trials, acupuncturists' experience, ancient medical works, comprehensive literature, etc. The present databank lays a foundation for subsequent evaluation of the quality of the Deqi literature and for data mining of as yet undetected Deqi knowledge.
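
    A schema of this kind might look like the sketch below; the table and column names are assumptions, and sqlite3 merely stands in for the SQL Server 2005 Express Edition used by the authors so that the example stays self-contained.

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.executescript("""
      CREATE TABLE study (
          study_id   INTEGER PRIMARY KEY,
          title      TEXT NOT NULL,
          source_db  TEXT,      -- CNKI, Wanfang, VIP, CBM, PubMed
          study_type TEXT,      -- clinical trial, acupuncturist experience, ...
          pub_year   INTEGER
      );
      CREATE TABLE deqi_record (
          record_id  INTEGER PRIMARY KEY,
          study_id   INTEGER REFERENCES study(study_id),
          acupoint   TEXT,
          sensation  TEXT       -- soreness, numbness, distension, ...
      );
      """)
      # Example inquiry: clinical-trial records reporting a given sensation.
      rows = conn.execute("""
          SELECT s.title, d.acupoint
          FROM deqi_record d JOIN study s ON s.study_id = d.study_id
          WHERE s.study_type = 'clinical trial' AND d.sensation = 'distension'
      """).fetchall()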

  9. Improving machine operation management efficiency via improving the vehicle park structure and using the production operation information database

    NASA Astrophysics Data System (ADS)

    Koptev, V. Yu

    2017-02-01

    The work presents the results of studying the basic interconnected criteria of individual equipment units in a transport-network machine fleet, as they depend on production and mining factors, with the aim of improving transport systems management. Justifying the selection of a control system requires new methodologies and models, augmented with stability and transport flow criteria and accounting for the dynamics of mining work at mining sites. A necessary condition is accounting for the technical and operating parameters related to vehicle operation; modern open-pit mining dispatching systems must include this kind of information database. An algorithm for forming a machine fleet is presented, based on a multi-variant task solution for defining reasonable operating features of a machine working as part of a complex. The proposals in the work may apply to mining machines (drilling equipment, excavators), construction equipment (bulldozers, cranes, pile-drivers), city transport and other types of production activity that use a machine fleet.

  10. Cadastral Positioning Accuracy Improvement: a Case Study in Malaysia

    NASA Astrophysics Data System (ADS)

    Hashim, N. M.; Omar, A. H.; Omar, K. M.; Abdullah, N. M.; Yatim, M. H. M.

    2016-09-01

    A cadastral map is parcel-based information specifically designed to define the limits of boundaries. In Malaysia, the cadastral map is under the authority of the Department of Surveying and Mapping Malaysia (DSMM). With the growth of spatial technology, especially Geographic Information Systems (GIS), DSMM decided to modernize and reform its cadastral legacy datasets by generating an accurate digital representation of cadastral parcels. These legacy databases are usually derived from paper parcel maps known as certified plans. As a result of cadastral modernization, the new cadastral database is no longer based on single, static parcel paper maps, but on a global digital map. Despite the strict process of cadastral modernization, this reform has raised unexpected issues that remain essential to address. The main focus of this study is to review the issues that have been generated by this transition. The transformed cadastral database should be additionally treated to minimize inherent errors and to fit it to the new satellite-based coordinate system with high positional accuracy. The results of this review will serve as a foundation for investigating a systematic and effective method for Positional Accuracy Improvement (PAI) in cadastral database modernization.

  11. Relax with CouchDB--into the non-relational DBMS era of bioinformatics.

    PubMed

    Manyam, Ganiraju; Payton, Michelle A; Roth, Jack A; Abruzzo, Lynne V; Coombes, Kevin R

    2012-07-01

    With the proliferation of high-throughput technologies, genome-level data analysis has become common in molecular biology. Bioinformaticians are developing extensive resources to annotate and mine biological features from high-throughput data. The underlying database management systems for most bioinformatics software are based on a relational model. Modern non-relational databases offer an alternative that has flexibility, scalability, and a non-rigid design schema. Moreover, with an accelerated development pace, non-relational databases like CouchDB can be ideal tools to construct bioinformatics utilities. We describe CouchDB by presenting three new bioinformatics resources: (a) geneSmash, which collates data from bioinformatics resources and provides automated gene-centric annotations, (b) drugBase, a database of drug-target interactions with a web interface powered by geneSmash, and (c) HapMap-CN, which provides a web interface to query copy number variations from three SNP-chip HapMap datasets. In addition to the web sites, all three systems can be accessed programmatically via web services. Copyright © 2012 Elsevier Inc. All rights reserved.
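
    For illustration, a resource like geneSmash could be queried over CouchDB's standard HTTP view API roughly as follows (the database, design document and view names here are hypothetical):

      import requests

      def gene_annotations(couch_url: str, symbol: str) -> list:
          # /db/_design/ddoc/_view/view is CouchDB's standard view URL shape;
          # the key parameter must be JSON-encoded, hence the quoted symbol.
          r = requests.get(
              f"{couch_url}/genesmash/_design/genes/_view/by_symbol",
              params={"key": f'"{symbol}"', "include_docs": "true"},
              timeout=30,
          )
          r.raise_for_status()
          return [row["doc"] for row in r.json()["rows"]]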

  12. Prognosis and management of myocardial infarction: Comparisons between the French FAST-MI 2010 registry and the French public health database.

    PubMed

    Massoullié, Grégoire; Wintzer-Wehekind, Jérome; Chenaf, Chouki; Mulliez, Aurélien; Pereira, Bruno; Authier, Nicolas; Eschalier, Alain; Clerfond, Guillaume; Souteyrand, Géraud; Tabassome, Simon; Danchin, Nicolas; Citron, Bernard; Lusson, Jean-René; Puymirat, Étienne; Motreff, Pascal; Eschalier, Romain

    2016-05-01

    Multicentre registries of myocardial infarction management show a steady improvement in prognosis and greater access to myocardial revascularization in a more timely manner. While French registries are the standard references, the question arises: are data stemming solely from the activity of French cardiac intensive care units (ICUs) a true reflection of the entire French population with ST-segment elevation myocardial infarction (STEMI)? To compare data on patients hospitalized for STEMI from two French registries: the French registry of acute ST-elevation or non-ST-elevation myocardial infarction (FAST-MI) and the Échantillon généraliste des bénéficiaires (EGB) database. We compared patients treated for STEMI listed in the FAST-MI 2010 registry (n=1716) with those listed in the EGB database, which comprises a sample of 1/97th of the French population, also from 2010 (n=403). Compared with the FAST-MI 2010 registry, the EGB database population was older (67.2±15.3 vs 63.3±14.5 years; P<0.001), had a higher percentage of women (36.0% vs 24.7%; P<0.001), was less likely to undergo emergency coronary angiography (75.2% vs 96.3%; P<0.001) and was less often treated in university hospitals (27.1% vs 37.0%; P=0.001). There were no significant differences between the two registries in terms of cardiovascular risk factors, comorbidities and drug treatment at admission. Thirty-day mortality was higher in the EGB database (10.2% vs 4.4%; P<0.001). Registries such as FAST-MI are indispensable, not only for assessing epidemiological changes over time, but also for evaluating the prognostic effect of modern STEMI management. Meanwhile, exploitation of data from general databases, such as EGB, provides additional relevant information, as they include a broader population not routinely admitted to cardiac ICUs. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  13. [Characteristics of acupoint application for the sub-healthy condition treated with ancient and modern acupuncture based on data mining exploration].

    PubMed

    Cai, Liyan; Wu, Jie; Ma, Tingting; Yang, Lijie

    2015-10-01

    Reports of acupoint selection were retrieved from the ancient and modern literature on the treatment of sub-healthy conditions with acupuncture, and the patterns of acupoint application were analyzed so as to provide a reference for determining acupoint prescriptions in clinical acupuncture. The ancient literature was retrieved from the Chinese basic ancient literature database. The modern literature was retrieved from the Cochrane Library, Medline, PubMed, the Ovid evidence-based medicine database, the Chinese biomedical literature database, the China journal full-text database, the VIP journal full-text database and the Wanfang database. Data mining software was used to explore the patterns of acupoint application in the treatment of sub-healthy conditions with ancient and modern acupuncture; acupoint use frequency, compatibility association rules, patterns of meridian use and the use of specific points were analyzed. In the ancient treatment of sub-healthy conditions, the five most commonly used acupoints are Shenmen (HT 7), Zhaohai (KI 6), Taibai (SP 3), Daling (PC 7) and Taixi (KI 3); the most commonly combined points are Zhangmen (LR 13), Taibai (SP 3) and Zhaohai (KI 6); the most commonly used meridians are the bladder meridian of foot-taiyang, the kidney meridian of foot-shaoyin and the liver meridian of foot-jueyin; the most commonly used specific points are the five-shu points; and the most commonly used acupoints are located in the lower limbs. In the modern treatment, the five most commonly used acupoints are Zusanli (ST 36), Sanyinjiao (SP 6), Baihui (GV 20), Shenshu (BL 23) and Guanyuan (CV 4); the most commonly supplemented points are Hegu (LI 4) and Taichong (LR 3); the most commonly used meridians are the bladder meridian of foot-taiyang, the conception vessel and the governor vessel; the most commonly used specific points are the back-shu points; and the most commonly used acupoints are located in the lower limbs. Taking the relevant ancient and modern literature together, the most commonly used acupoints are selected along the bladder meridian of foot-taiyang, the most commonly used specific points are the back-shu points, the five-shu points and the front-mu points, and the acupoints are mostly located in the lower limbs.

  14. Reusable Objects Software Environment (ROSE): Introduction to Air Force Software Reuse Workshop

    NASA Technical Reports Server (NTRS)

    Cottrell, William L.

    1994-01-01

    The Reusable Objects Software Environment (ROSE) is a common, consistent, consolidated implementation of software functionality using modern object oriented software engineering including designed-in reuse and adaptable requirements. ROSE is designed to minimize abstraction and reduce complexity. A planning model for the reverse engineering of selected objects through object oriented analysis is depicted. Dynamic and functional modeling are used to develop a system design, the object design, the language, and a database management system. The return on investment for a ROSE pilot program and timelines are charted.

  15. Deploying anaerobic digesters: Current status and future possibilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lusk, P.; Wheeler, P.; Rivard, C.

    1996-01-01

    Unmanaged pollutants from putrescible farm, industrial, and municipal wastes degrade in the environment, and methane emitted from their decomposition may contribute to global climate change. Under modern environmental regulations, these wastes are becoming difficult to dispose of using traditional means. One waste management system, anaerobic digestion or AD, not only provides pollution prevention but can also convert a disposal problem into a new profit center. This report is drawn from a special session of the Second Biomass Conference of the Americas. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  16. GIS-based spatial decision support system for grain logistics management

    NASA Astrophysics Data System (ADS)

    Zhen, Tong; Ge, Hongyi; Jiang, Yuying; Che, Yi

    2010-07-01

    Grain logistics is an important component of social logistics, characterized by frequent circulation and great quantities. At present there is no modern grain logistics distribution management system, and logistics costs are high. Geographic Information Systems (GIS) have been widely used for spatial data manipulation and model operations, and provide effective decision support through their spatial database management capabilities and cartographic visualization. In the present paper, a spatial decision support system (SDSS) is proposed to support policy makers and to reduce the cost of grain logistics. The system is composed of two major components, a grain logistics goods tracking model and a vehicle routing problem optimization model, and also allows the incorporation of data coming from external sources. The proposed system is an effective tool for managing grain logistics in order to increase the speed of grain logistics and reduce grain circulation costs.
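
    The paper does not specify its routing algorithm, but the simplest form of the vehicle routing component can be sketched with a nearest-neighbour heuristic (planar coordinates for brevity):

      # Greedy route construction: repeatedly visit the closest unvisited
      # delivery point, then return to the depot.
      from math import dist

      def nearest_neighbour_route(depot, stops):
          route, current, remaining = [depot], depot, list(stops)
          while remaining:
              nxt = min(remaining, key=lambda p: dist(current, p))
              remaining.remove(nxt)
              route.append(nxt)
              current = nxt
          route.append(depot)
          return route

      print(nearest_neighbour_route((0, 0), [(2, 1), (5, 4), (1, 3)]))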

  17. Architecture design of the national plant treasure management information system based on GIS: a case study of Gugong Date Garden in Hebei province

    NASA Astrophysics Data System (ADS)

    Shen, Shaoling; Li, Renjie; Shen, Dongdong; Tong, Chunyan; Fu, Xueqing

    2007-06-01

    "Gugong Date Garden", lies in Juguan Village, Qijiawu County, Huanghua City, China. It is the largest forest of winter date in this world, which is the longest in history, largest in area and best in quality and it is also included in the first group of national main protected units of botanic cultural relics. However, it is lacking of uniform management platform and modes. According to the specific characteristics of botanic cultural relics preservation, the author sets up the "Plant Treasure Management Information System" for "Gugong Date Garden", based on the Geographic information system (GIS), Internet, database and virtual reality technologies, along with the idea of modern customer management systems. This system is designed for five types of users, named system administrators, cultural relic supervisors, researchers, farmers and tourists, with the aim of realizing integrated managements of ancient trees' protection, scientific researches, tourism and explorations altogether, so as to make better management, protection, and utilizations.

  18. A geo-spatial data management system for potentially active volcanoes—GEOWARN project

    NASA Astrophysics Data System (ADS)

    Gogu, Radu C.; Dietrich, Volker J.; Jenny, Bernhard; Schwandner, Florian M.; Hurni, Lorenz

    2006-02-01

    Integrated studies of active volcanic systems for the purpose of long-term monitoring and forecast and short-term eruption prediction require large numbers of data-sets from various disciplines. A modern database concept has been developed for managing and analyzing multi-disciplinary volcanological data-sets. The GEOWARN project (choosing the "Kos-Yali-Nisyros-Tilos volcanic field, Greece" and the "Campi Flegrei, Italy" as test sites) is oriented toward potentially active volcanoes situated in regions of high geodynamic unrest. This article describes the volcanological database of the spatial and temporal data acquired within the GEOWARN project. As a first step, a spatial database embedded in a Geographic Information System (GIS) environment was created. Digital data of different spatial resolution, and time-series data collected at different intervals or periods, were unified in a common, four-dimensional representation of space and time. The database scheme comprises various information layers containing geographic data (e.g. seafloor and land digital elevation model, satellite imagery, anthropogenic structures, land-use), geophysical data (e.g. from active and passive seismicity, gravity, tomography, SAR interferometry, thermal imagery, differential GPS), geological data (e.g. lithology, structural geology, oceanography), and geochemical data (e.g. from hydrothermal fluid chemistry and diffuse degassing features). As a second step based on the presented database, spatial data analysis has been performed using custom-programmed interfaces that execute query scripts resulting in a graphical visualization of data. These query tools were designed and compiled following scenarios of known "behavior" patterns of dormant volcanoes and first candidate signs of potential unrest. The spatial database and query approach is intended to facilitate scientific research on volcanic processes and phenomena, and volcanic surveillance.
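
    A query of the kind those interfaces script might, in much simplified form, look like the following sketch (assumed schema; sqlite3 keeps the example self-contained, and the coordinates are merely indicative of the Nisyros area):

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute("""CREATE TABLE seismic_event (
          event_id INTEGER PRIMARY KEY, lat REAL, lon REAL,
          depth_km REAL, magnitude REAL, occurred_at TEXT)""")

      def events_near(conn, lat, lon, box_deg, t0, t1):
          # Bounding-box and time-window screening, the core of an
          # unrest-pattern query over the spatio-temporal database.
          return conn.execute("""
              SELECT event_id, magnitude, occurred_at FROM seismic_event
              WHERE lat BETWEEN ? AND ? AND lon BETWEEN ? AND ?
                AND occurred_at BETWEEN ? AND ?
              ORDER BY occurred_at""",
              (lat - box_deg, lat + box_deg, lon - box_deg, lon + box_deg, t0, t1),
          ).fetchall()

      events_near(conn, 36.58, 27.16, 0.25, "2001-01-01", "2001-12-31")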

  19. The Archive Solution for Distributed Workflow Management Agents of the CMS Experiment at LHC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuznetsov, Valentin; Fischer, Nils Leif; Guo, Yuyi

    The CMS experiment at the CERN LHC developed the Workflow Management Archive system to persistently store unstructured framework job report documents produced by distributed workflow management agents. In this paper we present its architecture, implementation, deployment, and integration with the CMS and CERN computing infrastructures, such as central HDFS and Hadoop Spark cluster. The system leverages modern technologies such as a document oriented database and the Hadoop eco-system to provide the necessary flexibility to reliably process, store, and aggregate O(1M) documents on a daily basis. We describe the data transformation, the short- and long-term storage layers, the query language, along with the aggregation pipeline developed to visualize various performance metrics to assist CMS data operators in assessing the performance of the CMS computing system.
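
    As an illustration of the kind of Spark aggregation such an archive enables (the HDFS path and the site, exit_code and timestamp fields are assumptions, not the actual CMS job report schema), a PySpark sketch:

      from pyspark.sql import SparkSession
      from pyspark.sql import functions as F

      spark = SparkSession.builder.appName("fwjr-archive-demo").getOrCreate()

      # Framework job report documents stored as JSON on HDFS.
      fwjr = spark.read.json("hdfs:///cms/wma/archive/2018/03/*.json.gz")

      # Daily failure counts per site, one plausible operator-facing metric.
      daily = (fwjr.where(F.col("exit_code") != 0)
                   .groupBy("site", F.to_date("timestamp").alias("day"))
                   .count()
                   .orderBy("day", "site"))
      daily.show()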

  20. The Archive Solution for Distributed Workflow Management Agents of the CMS Experiment at LHC

    DOE PAGES

    Kuznetsov, Valentin; Fischer, Nils Leif; Guo, Yuyi

    2018-03-19

    The CMS experiment at the CERN LHC developed the Workflow Management Archive system to persistently store unstructured framework job report documents produced by distributed workflow management agents. In this paper we present its architecture, implementation, deployment, and integration with the CMS and CERN computing infrastructures, such as central HDFS and Hadoop Spark cluster. The system leverages modern technologies such as a document oriented database and the Hadoop eco-system to provide the necessary flexibility to reliably process, store, and aggregate O(1M) documents on a daily basis. We describe the data transformation, the short- and long-term storage layers, the query language, along with the aggregation pipeline developed to visualize various performance metrics to assist CMS data operators in assessing the performance of the CMS computing system.

  1. Benchmarking distributed data warehouse solutions for storing genomic variant information

    PubMed Central

    Wiewiórka, Marek S.; Wysakowicz, Dawid P.; Okoniewski, Michał J.

    2017-01-01

    Genomic-based personalized medicine encompasses storing, analysing and interpreting genomic variants as its central issues. At a time when thousands of patients' sequenced exomes and genomes are becoming available, there is a growing need for efficient database storage and querying. The answer could be the application of modern distributed storage systems and query engines. However, the application of large genomic variant databases to this problem has not yet been sufficiently explored in the literature. To investigate the effectiveness of modern columnar storage [column-oriented Database Management System (DBMS)] and query engines, we have developed a prototypic genomic variant data warehouse, populated with large generated content of genomic variants and phenotypic data. Next, we have benchmarked the performance of a number of combinations of distributed storage and query engines on a set of SQL queries that address biological questions essential for both research and medical applications. In addition, a non-distributed analytical database (MonetDB) has been used as a baseline. Comparison of query execution times confirms that distributed data warehousing solutions outperform classic relational DBMSs. Moreover, pre-aggregation and further denormalization of data, which reduce the number of distributed join operations, significantly improve query performance, by several orders of magnitude. Most of the distributed back-ends offer good performance for complex analytical queries, while the Optimized Row Columnar (ORC) format paired with Presto and Parquet with Spark 2 query engines provide, on average, the lowest execution times. Apache Kudu, on the other hand, is the only solution that guarantees sub-second performance for simple genome range queries returning a small subset of data, where a low-latency response is expected, while still offering decent performance for running analytical queries. In summary, research and clinical applications that require the storage and analysis of variants from thousands of samples can benefit from the scalability and performance of distributed data warehouse solutions. Database URL: https://github.com/ZSI-Bio/variantsdwh PMID:29220442
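
    A typical genome range query of the kind benchmarked might look like this sketch (table and column names are assumptions based on the abstract; the coordinates are the BRCA1 locus on GRCh37):

      # Simple range scan of the kind for which Kudu delivered sub-second latency.
      RANGE_QUERY = """
      SELECT sample_id, chrom, pos, ref, alt
      FROM variants
      WHERE chrom = 'chr17'
        AND pos BETWEEN 41196312 AND 41277500
      """
      # On a distributed engine this could run via e.g. PyHive's Presto client:
      # from pyhive import presto
      # cur = presto.connect(host="coordinator", port=8080).cursor()
      # cur.execute(RANGE_QUERY); rows = cur.fetchall()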

  2. Pharmacology Portal: An Open Database for Clinical Pharmacologic Laboratory Services.

    PubMed

    Karlsen Bjånes, Tormod; Mjåset Hjertø, Espen; Lønne, Lars; Aronsen, Lena; Andsnes Berg, Jon; Bergan, Stein; Otto Berg-Hansen, Grim; Bernard, Jean-Paul; Larsen Burns, Margrete; Toralf Fosen, Jan; Frost, Joachim; Hilberg, Thor; Krabseth, Hege-Merete; Kvan, Elena; Narum, Sigrid; Austgulen Westin, Andreas

    2016-01-01

    More than 50 Norwegian public and private laboratories provide one or more analyses for therapeutic drug monitoring or testing for drugs of abuse. Practices differ among laboratories, and analytical repertoires can change rapidly as new substances become available for analysis. The Pharmacology Portal was developed to provide an overview of these activities and to standardize the practices and terminology among laboratories. The Pharmacology Portal is a modern dynamic web database comprising all available analyses within therapeutic drug monitoring and testing for drugs of abuse in Norway. Content can be retrieved by using the search engine or by scrolling through substance lists. The core content is a substance registry updated by a national editorial board of experts within the field of clinical pharmacology. This ensures quality and consistency regarding substance terminologies and classification. All laboratories publish their own repertoires in a user-friendly workflow, adding laboratory-specific details to the core information in the substance registry. The user management system ensures that laboratories are restricted from editing content in the database core or in repertoires within other laboratory subpages. The portal is for nonprofit use, and has been fully funded by the Norwegian Medical Association, the Norwegian Society of Clinical Pharmacology, and the 8 largest pharmacologic institutions in Norway. The database server runs an open-source content management system that ensures flexibility with respect to further development projects, including the potential expansion of the Pharmacology Portal to other countries. Copyright © 2016 Elsevier HS Journals, Inc. All rights reserved.

  3. An Investigation of the Applicability of Modern Management Processes by Industrial Managers in Turkey.

    ERIC Educational Resources Information Center

    Lauter, Geza Peter

    This study noted American concepts of modern management which Turkish industrial managers tend to find difficult; identified cultural, economic, and other factors that impede application of modern management processes; and compared the practices of American overseas managers with those of Turkish managers of domestic firms. Managerial performance…

  4. New perspectives in toxicological information management, and the role of ISSTOX databases in assessing chemical mutagenicity and carcinogenicity.

    PubMed

    Benigni, Romualdo; Battistelli, Chiara Laura; Bossa, Cecilia; Tcheremenskaia, Olga; Crettaz, Pierre

    2013-07-01

    Currently, the public has access to a variety of databases containing mutagenicity and carcinogenicity data. These resources are crucial for the toxicologists and regulators involved in the risk assessment of chemicals, which necessitates access to all the relevant literature, and the capability to search across toxicity databases using both biological and chemical criteria. Towards the larger goal of screening chemicals for a wide range of toxicity end points of potential interest, publicly available resources across a large spectrum of biological and chemical data space must be effectively harnessed with current and evolving information technologies (i.e. systematised, integrated and mined), if long-term screening and prediction objectives are to be achieved. A key to rapid progress in the field of chemical toxicity databases is that of combining information technology with the chemical structure as identifier of the molecules. This permits an enormous range of operations (e.g. retrieving chemicals or chemical classes, describing the content of databases, finding similar chemicals, crossing biological and chemical interrogations, etc.) that other more classical databases cannot allow. This article describes the progress in the technology of toxicity databases, including the concepts of Chemical Relational Database and Toxicological Standardized Controlled Vocabularies (Ontology). Then it describes the ISSTOX cluster of toxicological databases at the Istituto Superiore di Sanità. It consists of freely available databases characterised by the use of modern information technologies and by curation of the quality of the biological data. Finally, this article provides examples of analyses and results made possible by ISSTOX.

  5. A Support Database System for Integrated System Health Management (ISHM)

    NASA Technical Reports Server (NTRS)

    Schmalzel, John; Figueroa, Jorge F.; Turowski, Mark; Morris, John

    2007-01-01

    The development, deployment, operation and maintenance of Integrated Systems Health Management (ISHM) applications require the storage and processing of tremendous amounts of low-level data. This data must be shared in a secure and cost-effective manner between developers, and processed within several heterogeneous architectures. Modern database technology allows this data to be organized efficiently, while ensuring the integrity and security of the data. The extensibility and interoperability of the current database technologies also allows for the creation of an associated support database system. A support database system provides additional capabilities by building applications on top of the database structure. These applications can then be used to support the various technologies in an ISHM architecture. This presentation and paper propose a detailed structure and application description for a support database system, called the Health Assessment Database System (HADS). The HADS provides a shared context for organizing and distributing data as well as a definition of the applications that provide the required data-driven support to ISHM. This approach provides another powerful tool for ISHM developers, while also enabling novel functionality. This functionality includes: automated firmware updating and deployment, algorithm development assistance and electronic datasheet generation. The architecture for the HADS has been developed as part of the ISHM toolset at Stennis Space Center for rocket engine testing. A detailed implementation has begun for the Methane Thruster Testbed Project (MTTP) in order to assist in developing health assessment and anomaly detection algorithms for ISHM. The structure of this implementation is shown in Figure 1. The database structure consists of three primary components: the system hierarchy model, the historical data archive and the firmware codebase. The system hierarchy model replicates the physical relationships between system elements to provide the logical context for the database. The historical data archive provides a common repository for sensor data that can be shared between developers and applications. The firmware codebase is used by the developer to organize the intelligent element firmware into atomic units which can be assembled into complete firmware for specific elements.
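
    The three components named above map naturally onto a small relational schema. The following sketch uses assumed table and column names, with sqlite3 standing in for the actual database engine:

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.executescript("""
      CREATE TABLE element (               -- system hierarchy model
          element_id INTEGER PRIMARY KEY,
          parent_id  INTEGER REFERENCES element(element_id),
          name       TEXT NOT NULL
      );
      CREATE TABLE sensor_reading (        -- historical data archive
          element_id INTEGER REFERENCES element(element_id),
          ts         TEXT,
          value      REAL
      );
      CREATE TABLE firmware_unit (         -- firmware codebase
          element_id INTEGER REFERENCES element(element_id),
          version    TEXT,
          blob_path  TEXT                  -- atomic unit assembled per element
      );
      """)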

  6. [Trauma and accident documentation in Germany compared with elsewhere in Europe].

    PubMed

    Probst, C; Richter, M; Haasper, C; Lefering, R; Otte, D; Oestern, H J; Krettek, C; Hüfner, T

    2008-07-01

    The role of trauma documentation has grown continuously since the 1970s. Prevention and management of injuries were adapted according to the results of many analyses. Since 1993 there have been two different trauma databases in Germany: the German trauma registry (TR) and the database of the Accident Research Unit (UFO). Modern computer applications improved the data processing. Our study analysed the pros and cons of each system and compared them with those of our European neighbours. We compared the TR and the UFO databases with respect to aims and goals, advantages and disadvantages, and current status. Results were reported as means +/- standard errors of the mean. The level of significance was set at P<0.05. There were differences between the two databases concerning number and types of items, aims and goals, and demographics. The TR documents care for severely injured patients and the clinical course of different types of accidents. The UFO describes traffic accidents, accident conditions, and interrelations. The German and British systems are similar, and the French system shows interesting differences. The German trauma documentation systems focus on different points. Therefore both can be used for substantiated analyses of different hypotheses. Certain intersections of both databases may help to answer very special questions in the future.

  7. Intermediate Palomar Transient Factory: Realtime Image Subtraction Pipeline

    DOE PAGES

    Cao, Yi; Nugent, Peter E.; Kasliwal, Mansi M.

    2016-09-28

    A fast-turnaround pipeline for realtime data reduction plays an essential role in discovering and permitting follow-up observations of young supernovae and fast-evolving transients in modern time-domain surveys. In this paper, we present the realtime image subtraction pipeline in the intermediate Palomar Transient Factory. By using high-performance computing, efficient databases, and machine-learning algorithms, this pipeline manages to reliably deliver transient candidates within 10 minutes of images being taken. Our experience in using high-performance computing resources to process big data in astronomy serves as a trailblazer for dealing with data from large-scale time-domain facilities in the near future.

  8. Intermediate Palomar Transient Factory: Realtime Image Subtraction Pipeline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cao, Yi; Nugent, Peter E.; Kasliwal, Mansi M.

    A fast-turnaround pipeline for realtime data reduction plays an essential role in discovering and permitting follow-up observations of young supernovae and fast-evolving transients in modern time-domain surveys. In this paper, we present the realtime image subtraction pipeline in the intermediate Palomar Transient Factory. By using high-performance computing, efficient databases, and machine-learning algorithms, this pipeline manages to reliably deliver transient candidates within 10 minutes of images being taken. Our experience in using high-performance computing resources to process big data in astronomy serves as a trailblazer for dealing with data from large-scale time-domain facilities in the near future.

  9. Historical seismometry database project: A comprehensive relational database for historical seismic records

    NASA Astrophysics Data System (ADS)

    Bono, Andrea

    2007-01-01

    The recovery and preservation of the patrimony of instrumental recordings of historical earthquakes is without doubt a subject of great interest. This interest, besides being purely historical, must also be scientific: the availability of a great amount of parametric information on the seismic activity in a given area is an undoubted help to the seismological researcher's activities. In this article the new database project of the Sismos group of the National Institute of Geophysics and Volcanology of Rome is presented. The structure of the new scheme summarizes the experience matured over five years of activity, and we consider it useful for those who are approaching "recovery and reprocessing" computer-based facilities. In past years several attempts on Italian seismicity have followed one another, but they have almost never been real databases. Some had positive success because they were well conceived and organized; others were limited to supplying lists of events with their relative hypocentral parameters. What makes this project more interesting than previous work is the completeness and generality of the information managed. For example, it will be possible to view the hypocentral information regarding a given historical earthquake, to search for seismograms in raster, digital or digitized format, and to retrieve information on the arrival times of the phases at the various stations, the instrumental parameters, and so on. The modern relational logic on which the archive is based allows all these operations to be carried out with little effort. The database described below will completely replace Sismos' current data bank. Some of the organizational principles of this work are similar to those that inspire databases for real-time monitoring of seismicity in use at the principal offices of international research. A modern planning logic is thus introduced into a distinctly historical context. Descriptions follow of the various design phases, from the conceptual level to the physical implementation of the scheme, highlighting the guiding principles, rules and considerations of a technical-scientific nature that lead to the final result: a state-of-the-art relational scheme for historical data.

  10. Scoping of Flood Hazard Mapping Needs for Androscoggin County, Maine

    USGS Publications Warehouse

    Schalk, Charles W.; Dudley, Robert W.

    2007-01-01

    Background: The Federal Emergency Management Agency (FEMA) developed a plan in 1997 to modernize the FEMA flood mapping program. FEMA flood maps delineate flood hazard areas in support of the National Flood Insurance Program (NFIP). FEMA's plan outlined the steps necessary to update FEMA's flood maps for the nation to a seamless digital format and streamline FEMA's operations in raising public awareness of the importance of the maps and responding to requests to revise them. The modernization of flood maps involves conversion of existing information to digital format and integration of improved flood hazard data as needed and as funds allow. To determine flood mapping modernization needs, FEMA has established specific scoping activities to be done on a county-by-county basis for identifying and prioritizing requisite flood-mapping activities for map modernization. The U.S. Geological Survey (USGS), in cooperation with FEMA and the Maine Floodplain Management Program (MFMP) State Planning Office, began scoping work in 2006 for Androscoggin County. Scoping activities included assembling existing data and map needs information for communities in Androscoggin County, documentation of data, contacts, community meetings, and prioritized mapping needs in a final scoping report (this document), and updating the Mapping Needs Update Support System (MNUSS) database with information gathered during the scoping process. The average age of the FEMA floodplain maps in Androscoggin County, Maine, is at least 17 years. Most studies were published in the early 1990s, and some towns have partial maps that are more recent than their study date. Since the studies were done, development has occurred in many of the watersheds and the characteristics of the watersheds have changed with time. Therefore, many of the older studies may not depict current conditions nor accurately estimate risk in terms of flood heights or flood mapping.

  11. Scoping of Flood Hazard Mapping Needs for Lincoln County, Maine

    USGS Publications Warehouse

    Schalk, Charles W.; Dudley, Robert W.

    2007-01-01

    Background: The Federal Emergency Management Agency (FEMA) developed a plan in 1997 to modernize the FEMA flood mapping program. FEMA flood maps delineate flood hazard areas in support of the National Flood Insurance Program (NFIP). FEMA's plan outlined the steps necessary to update FEMA's flood maps for the nation to a seamless digital format and streamline FEMA's operations in raising public awareness of the importance of the maps and responding to requests to revise them. The modernization of flood maps involves conversion of existing information to digital format and integration of improved flood hazard data as needed. To determine flood mapping modernization needs, FEMA has established specific scoping activities to be done on a county-by-county basis for identifying and prioritizing requisite flood-mapping activities for map modernization. The U.S. Geological Survey (USGS), in cooperation with FEMA and the Maine Floodplain Management Program (MFMP) State Planning Office, began scoping work in 2006 for Lincoln County. Scoping activities included assembling existing data and map needs information for communities in Lincoln County, documentation of data, contacts, community meetings, and prioritized mapping needs in a final scoping report (this document), and updating the Mapping Needs Update Support System (MNUSS) database with information gathered during the scoping process. The average age of the FEMA floodplain maps in Lincoln County, Maine, is at least 17 years. Many of these studies were published in the mid- to late-1980s, and some towns have partial maps that are more recent than their study. However, in the ensuing 15-20 years, development has occurred in many of the watersheds, and the characteristics of the watersheds have changed with time. Therefore, many of the older studies may not depict current conditions nor accurately estimate risk in terms of flood heights or flood mapping.

  12. Distributed computing for macromolecular crystallography

    PubMed Central

    Krissinel, Evgeny; Uski, Ville; Lebedev, Andrey; Ballard, Charles

    2018-01-01

    Modern crystallographic computing is characterized by the growing role of automated structure-solution pipelines, which represent complex expert systems utilizing a number of program components, decision makers and databases. They also require considerable computational resources and regular database maintenance, which is increasingly difficult to provide at the level of individual desktop-based CCP4 setups. On the other hand, there is significant growth in the data processed in the field, which brings up the issue of centralized facilities for keeping both the data collected and structure-solution projects. The paradigm of distributed computing and data management offers a convenient approach to tackling these problems, which has become more attractive in recent years owing to the popularity of mobile devices such as tablets and ultra-portable laptops. In this article, an overview is given of developments by CCP4 aimed at bringing distributed crystallographic computations to a wide crystallographic community. PMID:29533240

  13. Distributed computing for macromolecular crystallography.

    PubMed

    Krissinel, Evgeny; Uski, Ville; Lebedev, Andrey; Winn, Martyn; Ballard, Charles

    2018-02-01

    Modern crystallographic computing is characterized by the growing role of automated structure-solution pipelines, which represent complex expert systems utilizing a number of program components, decision makers and databases. They also require considerable computational resources and regular database maintenance, which is increasingly difficult to provide at the level of individual desktop-based CCP4 setups. On the other hand, there is significant growth in the data processed in the field, which brings up the issue of centralized facilities for keeping both the data collected and structure-solution projects. The paradigm of distributed computing and data management offers a convenient approach to tackling these problems, which has become more attractive in recent years owing to the popularity of mobile devices such as tablets and ultra-portable laptops. In this article, an overview is given of developments by CCP4 aimed at bringing distributed crystallographic computations to a wide crystallographic community.

  14. Major technology issues in surgical data collection.

    PubMed

    Kirschenbaum, I H

    1995-10-01

    Surgical scheduling and data collection is a field that has a long history as well as a bright future. Historically, surgical cases have always involved some amount of data collection. Surgical cases are scheduled and then reviewed. The classic method, that large black surgical log, actually still exists in many hospitals. In fact, there is nothing new about the recording or reporting of surgical cases. If we only needed to record the information and produce a variety of reports on the data, then modern electronic technology would function as a glorified fast index card box--or, in computer database terms, a simple flat file database. But, this is not the future of technology in surgical case management. This article makes the general case for integrating surgical data systems. Instead of reviewing specific software, it essentially addresses the issues of strategic planning related to this important aspect of medical information systems.

  15. Management information system of medical equipment using mobile devices

    NASA Astrophysics Data System (ADS)

    Núñez, C.; Castro, D.

    2011-09-01

    The large number of technologies currently incorporated into mobile devices transforms them into excellent tools for capturing and managing information, because their increasing computing power and storage allow many miscellaneous applications to be added. In order to obtain the benefits of these technologies in the biomedical engineering field, a mobile information system for medical equipment management was developed. The central platform for the system is a mobile phone, which, through a connection with a web server, is capable of sending and receiving information about any item of medical equipment. By decoding a type of barcode known as QR codes, the management process is simplified and improved. These barcodes identify the medical equipment in a database; when the codes are photographed and decoded with the mobile device, relevant information about the medical equipment in question can be accessed. In its current state, this project is a basic support tool for the maintenance of medical equipment. It is also a modern, competitive and economical alternative in the present market.
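
    The lookup flow can be sketched as follows (the endpoint and payload format are illustrative assumptions): the QR code decoded on the phone yields an equipment identifier, which keys the equipment record on the web server.

      import requests

      def equipment_info(server: str, qr_payload: str) -> dict:
          # qr_payload is assumed to carry the inventory ID, e.g. "EQ-00431".
          r = requests.get(f"{server}/api/equipment/{qr_payload}", timeout=10)
          r.raise_for_status()
          return r.json()  # e.g. model, location, maintenance history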

  16. Hawai`i forest bird monitoring database: Database dictionary

    USGS Publications Warehouse

    Camp, Richard J.; Genz, Ayesha

    2017-01-01

    Between 1976 and 1981, the U.S. Fish and Wildlife Service (now U.S. Geological Survey – Pacific Island Ecosystems Research Center [USGS-PIERC]) conducted systematic surveys of forest birds and plant communities on all the main Hawaiian Islands, except O‘ahu, as part of the Hawai‘i Forest Bird Surveys (HFBS). Results of this monumental effort have guided conservation efforts and provided the basis for many plant and bird recovery plans and land acquisition decisions in Hawai‘i. Unfortunately, these estimates and range maps are now seriously outdated, hindering modern conservation decision-making and recovery planning. HFBIDP staff work closely with land managers and others to identify the location of bird populations in need of protection. In addition, HFBIDP is able to assess field collection methods, census areas, and survey frequency for their effectiveness. Survey and geographical data are refined and released in successive versions, each more inclusive, detailed, and accurate than the previous release. Incrementally releasing data gives land managers and survey coordinators reasonably good data to work with early on rather than waiting for the release of ‘perfect’ data, ‘perfectly’ analyzed. Consequently, summary results are available in a timely manner.

  17. A web-based, relational database for studying glaciers in the Italian Alps

    NASA Astrophysics Data System (ADS)

    Nigrelli, G.; Chiarle, M.; Nuzzi, A.; Perotti, L.; Torta, G.; Giardino, M.

    2013-02-01

    Glaciers are among the best terrestrial indicators of climate change and thus glacier inventories have attracted a growing, worldwide interest in recent years. In Italy, the first official glacier inventory was completed in 1925 and 774 glacial bodies were identified. As the amount of data continues to increase, and new techniques become available, there is a growing demand for computer tools that can efficiently manage the collected data. The Research Institute for Geo-hydrological Protection of the National Research Council, in cooperation with the Departments of Computer Science and Earth Sciences of the University of Turin, created a database that provides a modern tool for storing, processing and sharing glaciological data. The database was developed according to the need of storing heterogeneous information, which can be retrieved through a set of web search queries. The database's architecture is server-side, and was designed by means of an open source software. The website interface, simple and intuitive, was intended to meet the needs of a distributed public: through this interface, any type of glaciological data can be managed, specific queries can be performed, and the results can be exported in a standard format. The use of a relational database to store and organize a large variety of information about Italian glaciers collected over the last hundred years constitutes a significant step forward in ensuring the safety and accessibility of such data. Moreover, the same benefits also apply to the enhanced operability for handling information in the future, including new and emerging types of data formats, such as geographic and multimedia files. Future developments include the integration of cartographic data, such as base maps, satellite images and vector data. The relational database described in this paper will be the heart of a new geographic system that will merge data, data attributes and maps, leading to a complete description of Italian glacial environments.

  18. Application of advanced data collection and quality assurance methods in open prospective study - a case study of PONS project.

    PubMed

    Wawrzyniak, Zbigniew M; Paczesny, Daniel; Mańczuk, Marta; Zatoński, Witold A

    2011-01-01

    Large-scale epidemiologic studies can assess health indicators that differentiate social groups, as well as important health outcomes such as the incidence and mortality of cancer, cardiovascular disease and other conditions, establishing a solid knowledge base for managing the prevention of causes of premature morbidity and mortality. This study presents new, advanced methods of data collection and data management, with ongoing data quality control and security, to ensure high-quality assessment of health indicators in the large epidemiologic PONS study (the Polish-Norwegian Study). The material for the experiment is the data management design of the large-scale population study in Poland (PONS), and the managed processes are applied to establishing high-quality and solid knowledge. The functional requirements of PONS study data collection, supported by advanced web-based IT methods, are fulfilled and shared by the IT system, resulting in medical data of high quality, with data security, quality assessment, process control and evolution monitoring. Data from disparate, distributed sources of information are integrated into databases via software interfaces and archived by a secure multi-task server. The implemented solution of modern, advanced database technologies and a remote software/hardware structure successfully supports the research of the large PONS study project. Development and implementation of follow-up control of the consistency and quality of the data analysis and of the PONS sub-database processes show excellent measurement properties, with data consistency above 99%. Through its tailored hardware/software application, the project itself shows the positive impact of Quality Assurance (QA) on the quality of outcome analysis results and on effective data management within a shorter time. This efficiency ensures the quality of the epidemiological data and health indicators by eliminating common errors in research questionnaires and medical measurements.

  19. Multidisciplinary analysis and design of printed wiring boards

    NASA Astrophysics Data System (ADS)

    Fulton, Robert E.; Hughes, Joseph L.; Scott, Waymond R., Jr.; Umeagukwu, Charles; Yeh, Chao-Pin

    1991-04-01

    Modern printed wiring board design depends on electronic prototyping using computer-based simulation and design tools. Existing electrical computer-aided design (ECAD) tools emphasize circuit connectivity with only rudimentary analysis capabilities. This paper describes a prototype integrated PWB design environment denoted Thermal Structural Electromagnetic Testability (TSET) being developed at Georgia Tech in collaboration with companies in the electronics industry. TSET provides design guidance based on enhanced electrical and mechanical CAD capabilities including electromagnetic modeling testability analysis thermal management and solid mechanics analysis. TSET development is based on a strong analytical and theoretical science base and incorporates an integrated information framework and a common database design based on a systematic structured methodology.

  20. Modernized Techniques for Dealing with Quality Data and Derived Products

    NASA Astrophysics Data System (ADS)

    Neiswender, C.; Miller, S. P.; Clark, D.

    2008-12-01

    "I just want a picture of the ocean floor in this area" is expressed all too often by researchers, educators, and students in the marine geosciences. As more sophisticated systems are developed to handle data collection and processing, the demand for quality data, and standardized products continues to grow. Data management is an invisible bridge between science and researchers/educators. The SIOExplorer digital library presents more than 50 years of ocean-going research. Prior to publication, all data is checked for quality using standardized criterion developed for each data stream. Despite the evolution of data formats and processing systems, SIOExplorer continues to present derived products in well- established formats. Standardized products are published for each cruise, and include a cruise report, MGD77 merged data, multi-beam flipbook, and underway profiles. Creation of these products is made possible by processing scripts, which continue to change with ever-evolving data formats. We continue to explore the potential of database-enabled creation of standardized products, such as the metadata-rich MGD77 header file. Database-enabled, automated processing produces standards-compliant metadata for each data and derived product. Metadata facilitates discovery and interpretation of published products. This descriptive information is stored both in an ASCII file, and a searchable digital library database. SIOExplorer's underlying technology allows focused search and retrieval of data and products. For example, users can initiate a search of only multi-beam data, which includes data-specific parameters. This customization is made possible with a synthesis of database, XML, and PHP technology. The combination of standardized products and digital library technology puts quality data and derived products in the hands of scientists. Interoperable systems enable distribution these published resources using technology such as web services. By developing modernized strategies to deal with data, Scripps Institution of Oceanography is able to produce and distribute well-formed, and quality-tested derived products, which aid research, understanding, and education.

  1. Technology and the Modern Library.

    ERIC Educational Resources Information Center

    Boss, Richard W.

    1984-01-01

    Overview of the impact of information technology on libraries highlights turnkey vendors, bibliographic utilities, commercial suppliers of records, state and regional networks, computer-to-computer linkages, remote database searching, terminals and microcomputers, building local databases, delivery of information, digital telefacsimile,…

  2. Department of Defense Healthcare Management System Modernization (DHMSM)

    DTIC Science & Technology

    2016-03-01

    2016 Major Automated Information System Annual Report for the Department of Defense Healthcare Management System Modernization (DHMSM) program. Date assigned: November 16, 2015. The acquiring DoD Component is the Program Executive Office (PEO), Department of Defense (DoD) Healthcare Management Systems (DHMS).

  3. CUSTOMS SERVICE MODERNIZATION: Management Improvements Needed on High-Risk Automated Commercial Environment Project

    DTIC Science & Technology

    2002-05-01

    United States General Accounting Office (GAO) report to congressional committees, May 2002: Customs Service Modernization: Management Improvements Needed on High-Risk Automated Commercial Environment Project. The report addresses Customs management of ACE. Report classification: unclassified.

  4. [Current status and trends in the health of the Moscow population].

    PubMed

    Tishuk, E A; Plavunov, N F; Soboleva, N P

    1997-01-01

    Based on a vast, comprehensive medical statistical database, the authors analyze the health status of the population and the efficacy of the public health service in Moscow. The pre-crisis tendencies and the current state of public health under modern socioeconomic conditions are noted.

  5. Ice Accretion Test Results for Three Large-Scale Swept-Wing Models in the NASA Icing Research Tunnel

    NASA Technical Reports Server (NTRS)

    Broeren, Andy; Potapczuk, Mark; Lee, Sam; Malone, Adam; Paul, Ben; Woodard, Brian

    2016-01-01

    The design and certification of modern transport airplanes for flight in icing conditions increasingly relies on three-dimensional numerical simulation tools for ice accretion prediction. There is currently no publicly available, high-quality ice accretion database upon which to evaluate the performance of icing simulation tools for large-scale swept wings that are representative of modern commercial transport airplanes. The purpose of this presentation is to present the results of a series of icing wind tunnel test campaigns whose aim was to provide an ice accretion database for large-scale, swept wings.

  6. United States Army Medical Materiel Development Activity: 1997 Annual Report.

    DTIC Science & Technology

    1997-01-01

    Business planning and execution information management systems, including the Project Management Division Database (PMDD), the Product Management Database System (PMDS), and the Special Users Database System, as well as the existing Financial Management System (FMS), were investigated. New Product Managers and Project Managers were added into PMDS and PMDD. A separate division, Support, was

  7. Modernization and new technologies: Coping with the information explosion

    NASA Technical Reports Server (NTRS)

    Blados, Walter R.; Cotter, Gladys A.

    1993-01-01

    Information has become a valuable and strategic resource in all societies and economies. Scientific and technical information is especially important in developing and maintaining a strong national science and technology base. The expanding use of information technology, the growth of interdisciplinary research, and an increase in international collaboration are changing the characteristics of information. This modernization effort applies new technology to current processes to provide near-term benefits to the user. At the same time, we are developing a long-term modernization strategy designed to transition the program to a multimedia, global 'library without walls'. Notwithstanding this modernization program, it is recognized that no one information center can hope to collect all the relevant data. We see information and information systems changing and becoming more international in scope. We are finding that many nations are expending resources on national systems which duplicate each other. At the same time that this duplication exists, many useful sources of aerospace information are not being collected. This paper reviews the NASA modernization program and raises for consideration new possibilities for unification of the various aerospace database efforts toward a cooperative international aerospace database initiative, one that can optimize the cost/benefit equation for all participants.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Edward J., Jr.; Henry, Karen Lynne

    Sandia National Laboratories develops technologies to: (1) sustain, modernize, and protect our nuclear arsenal; (2) prevent the spread of weapons of mass destruction; (3) provide new capabilities to our armed forces; (4) protect our national infrastructure; (5) ensure the stability of our nation's energy and water supplies; and (6) defend our nation against terrorist threats. We identified the need for a single overarching Integrated Workplace Management System (IWMS) that would enable us to focus on customer missions and improve FMOC processes. Our team selected highly configurable commercial-off-the-shelf (COTS) software with out-of-the-box workflow processes that integrate strategic planning, project management, facility assessments, and space management, and can interface with existing systems, such as Oracle, PeopleSoft, Maximo, Bentley, and FileNet. We selected the Integrated Workplace Management System (IWMS) from Tririga, Inc. Facility Management System (FMS) benefits are: (1) create a single reliable source for facility data; (2) improve transparency with oversight organizations; (3) streamline FMOC business processes with a single, integrated facility-management tool; (4) give customers simple tools and real-time information; (5) reduce indirect costs; (6) replace approximately 30 FMOC systems and 60 homegrown tools (such as Microsoft Access databases); and (7) integrate with FIMS.

  9. Evolution of a Structure-Searchable Database into a Prototype for a High-Fidelity SmartPhone App for 62 Common Pesticides Used in Delaware.

    PubMed

    D'Souza, Malcolm J; Barile, Benjamin; Givens, Aaron F

    2015-05-01

    Synthetic pesticides are widely used in the modern world for human benefit. They are usually classified according to their intended pest target. In Delaware (DE), approximately 42 percent of the arable land is used for agriculture. In order to manage pests such as insects, weeds, nematodes, and rodents, pesticides are used extensively to control the pest at the appropriate stage of its life cycle. In this undergraduate project, we first created a usable relational database containing 62 agricultural pesticides that are common in Delaware. Chemically pertinent quantitative and qualitative information was first stored in Bio-Rad's KnowItAll® Informatics System. Next, we extracted the data out of the KnowItAll® system and created additional sections on a Microsoft® Excel spreadsheet detailing pesticide use(s) and safety and handling information. Finally, in an effort to promote good agricultural practices, to increase efficiency in business decisions, and to make pesticide data globally accessible, we developed a mobile application for smartphones that displayed the pesticide database using Appery.io™, a cloud-based HyperText Markup Language 5 (HTML5), jQuery Mobile, and hybrid mobile app builder.
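    A minimal sketch of how such a pesticide catalog could be laid out relationally, classified by target pest as the abstract describes; the column names and sample row are hypothetical, not the project's actual schema.

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE pesticide (
        name        TEXT PRIMARY KEY,   -- common name
        cas_number  TEXT,               -- CAS registry number
        target      TEXT,               -- insecticide, herbicide, rodenticide...
        crop_uses   TEXT,               -- typical crop applications
        safety_note TEXT)""")
    db.execute("INSERT INTO pesticide VALUES (?, ?, ?, ?, ?)",
               ("atrazine", "1912-24-9", "herbicide", "corn",
                "wear gloves; avoid runoff to waterways"))
    # Classification by target drives simple, business-ready queries:
    for row in db.execute(
            "SELECT name, crop_uses FROM pesticide WHERE target = 'herbicide'"):
        print(row)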

  10. Semantic World Modelling and Data Management in a 4d Forest Simulation and Information System

    NASA Astrophysics Data System (ADS)

    Roßmann, J.; Hoppen, M.; Bücken, A.

    2013-08-01

    Various types of 3D simulation applications benefit from realistic forest models. They range from flight simulators for entertainment to harvester simulators for training and tree growth simulations for research and planning. Our 4D forest simulation and information system integrates the necessary methods for data extraction, modelling and management. Using modern methods of semantic world modelling, tree data can efficiently be extracted from remote sensing data. The derived forest models contain position, height, crown volume, type and diameter of each tree. This data is modelled using GML-based data models to assure compatibility and exchangeability. A flexible approach for database synchronization is used to manage the data and provide caching, persistence, a central communication hub for change distribution, and a versioning mechanism. Combining various simulation techniques and data versioning, the 4D forest simulation and information system can provide applications with "both directions" of the fourth dimension. Our paper outlines the current state, new developments, and integration of tree extraction, data modelling, and data management. It also shows several applications realized with the system.
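    The synchronization ideas named above, a central hub distributing change notifications plus a versioning mechanism, can be illustrated with a toy store. This is not the authors' implementation; all names and values are invented.

    from collections import defaultdict

    class VersionedStore:
        def __init__(self):
            self._versions = defaultdict(list)   # tree_id -> [rev0, rev1, ...]
            self._subscribers = []

        def subscribe(self, callback):
            self._subscribers.append(callback)

        def update(self, tree_id, record):
            self._versions[tree_id].append(dict(record))   # keep every revision
            for notify in self._subscribers:               # change distribution
                notify(tree_id, record)

        def get(self, tree_id, rev=-1):
            return self._versions[tree_id][rev]            # any past revision

    store = VersionedStore()
    store.subscribe(lambda tid, rec: print("changed:", tid, rec))
    store.update("tree42", {"height_m": 17.5, "crown_m3": 88.0})
    store.update("tree42", {"height_m": 18.1, "crown_m3": 92.5})
    print(store.get("tree42", rev=0))   # the first revision is still available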

  11. Column Store for GWAC: A High-cadence, High-density, Large-scale Astronomical Light Curve Pipeline and Distributed Shared-nothing Database

    NASA Astrophysics Data System (ADS)

    Wan, Meng; Wu, Chao; Wang, Jing; Qiu, Yulei; Xin, Liping; Mullender, Sjoerd; Mühleisen, Hannes; Scheers, Bart; Zhang, Ying; Nes, Niels; Kersten, Martin; Huang, Yongpan; Deng, Jinsong; Wei, Jianyan

    2016-11-01

    The ground-based wide-angle camera array (GWAC), a part of the SVOM space mission, will search for various types of optical transients by continuously imaging a field of view (FOV) of 5000 square degrees every 15 s. Each exposure consists of 36 × 4k × 4k pixels, typically resulting in 36 × ~175,600 extracted sources. For a modern time-domain astronomy project like GWAC, which produces massive amounts of data with a high cadence, it is challenging to search for short timescale transients in both real-time and archived data, and to build long-term light curves for variable sources. Here, we develop a high-cadence, high-density light curve pipeline (HCHDLP) to process the GWAC data in real-time, and design a distributed shared-nothing database to manage the massive amount of archived data which will be used to generate a source catalog with more than 100 billion records during 10 years of operation. First, we develop HCHDLP based on the column-store DBMS of MonetDB, taking advantage of MonetDB’s high performance when applied to massive data processing. To realize the real-time functionality of HCHDLP, we optimize the pipeline in its source association function, including both time and space complexity from outside the database (SQL semantic) and inside (RANGE-JOIN implementation), as well as in its strategy of building complex light curves. The optimized source association function is accelerated by three orders of magnitude. Second, we build a distributed database using a two-level time partitioning strategy via the MERGE TABLE and REMOTE TABLE technology of MonetDB. Intensive tests validate that our database architecture is able to achieve both linear scalability in response time and concurrent access by multiple users. In summary, our studies provide guidance for a solution to GWAC in real-time data processing and management of massive data.
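    The two-level time partitioning described above can be sketched with MonetDB's MERGE TABLE and REMOTE TABLE statements, the features the paper names, here issued through the pymonetdb client. Table names, hosts, and credentials are placeholders, not GWAC's actual deployment.

    import pymonetdb

    conn = pymonetdb.connect(username="monetdb", password="monetdb",
                             hostname="localhost", database="gwac")
    cur = conn.cursor()

    # Parent table: a single query surface over all time partitions.
    cur.execute("CREATE MERGE TABLE lightcurve "
                "(src_id BIGINT, jd DOUBLE, mag REAL)")

    # A time partition living on another node of the shared-nothing cluster,
    # reachable through MonetDB's MAPI protocol.
    cur.execute("CREATE REMOTE TABLE lightcurve_2016w01 "
                "(src_id BIGINT, jd DOUBLE, mag REAL) "
                "ON 'mapi:monetdb://node2:50000/gwac'")

    # A local time partition on this node.
    cur.execute("CREATE TABLE lightcurve_2016w02 "
                "(src_id BIGINT, jd DOUBLE, mag REAL)")

    # Attach both partitions; queries on 'lightcurve' now span the cluster.
    cur.execute("ALTER TABLE lightcurve ADD TABLE lightcurve_2016w01")
    cur.execute("ALTER TABLE lightcurve ADD TABLE lightcurve_2016w02")
    conn.commit()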

  12. Scoping of flood hazard mapping needs for Kennebec County, Maine

    USGS Publications Warehouse

    Dudley, Robert W.; Schalk, Charles W.

    2006-01-01

    This report was prepared by the U.S. Geological Survey (USGS) Maine Water Science Center as the deliverable for scoping of flood hazard mapping needs for Kennebec County, Maine, under Federal Emergency Management Agency (FEMA) Inter-Agency Agreement Number HSFE01-05-X-0018. This section of the report explains the objective of the task and the purpose of the report. The Federal Emergency Management Agency (FEMA) developed a plan in 1997 to modernize the FEMA flood mapping program. FEMA flood maps delineate flood hazard areas in support of the National Flood Insurance Program (NFIP). FEMA's plan outlined the steps necessary to update FEMA's flood maps for the nation to a seamless digital format and streamline FEMA's operations in raising public awareness of the importance of the maps and responding to requests to revise them. The modernization of flood maps involves conversion of existing information to digital format and integration of improved flood hazard data as needed. To determine flood mapping modernization needs, FEMA has established specific scoping activities to be done on a county-by-county basis for identifying and prioritizing requisite flood-mapping activities for map modernization. The U.S. Geological Survey (USGS), in cooperation with FEMA and the Maine State Planning Office Floodplain Management Program, began scoping work in 2005 for Kennebec County. Scoping activities included assembling existing data and map needs information for communities in Kennebec County (efforts were made not to duplicate those of pre-scoping completed in March 2005), documentation of data, contacts, community meetings, and prioritized mapping needs in a final scoping report (this document), and updating the Mapping Needs Update Support System (MNUSS) Database or its successor with information gathered during the scoping process. The average age of the FEMA floodplain maps in Kennebec County, Maine, is 16 years. Most of these studies were done in the late 1970s to the mid-1980s. However, in the ensuing 20-30 years, development has occurred in many of the watersheds, and the characteristics of the watersheds have changed with time. Therefore, many of the older studies may not depict current conditions or accurately estimate risk in terms of flood heights. The following is the scope of work as defined in the FEMA/USGS Statement of Work: Task 1: Collect data from a variety of sources including community surveys, other Federal and State Agencies, National Flood Insurance Program (NFIP) State Coordinators, Community Assistance Visits (CAVs) and FEMA archives. Lists of mapping needs will be obtained from the MNUSS database, community surveys, and CAVs, if available. FEMA archives will be inventoried for effective FIRM panels, FIS reports, and other flood-hazard data or existing study data. Best available base map information, topographic data, flood-hazard data, and hydrologic and hydraulic data will be identified. Data from the Maine Floodplain Management Program database also will be utilized. Task 2: Contact communities in Kennebec County to notify them that FEMA and the State have selected them for a map update, and that a project scope will be developed with their input. 
Topics to be reviewed with the communities include (1) Purpose of the Flood Map Project (for example, the update needs that have prompted the map update); (2) The community's mapping needs; (3) The community's available mapping, hydrologic, hydraulic, and flooding information; (4) target schedule for completing the project; and (5) The community's engineering, planning, and geographic information system (GIS) capabilities. On the basis of the collected information from Task 1 and community contacts/meetings in Task 2, the USGS will develop a Draft Project Scope for the identified mapping needs of the communities in Kennebec County. The following items will be addressed in the Draft Project Scope: review of available information, determine if and how e

  13. Scoping of flood hazard mapping needs for Somerset County, Maine

    USGS Publications Warehouse

    Dudley, Robert W.; Schalk, Charles W.

    2006-01-01

    This report was prepared by the U.S. Geological Survey (USGS) Maine Water Science Center as the deliverable for scoping of flood hazard mapping needs for Somerset County, Maine, under Federal Emergency Management Agency (FEMA) Inter-Agency Agreement Number HSFE01-05-X-0018. This section of the report explains the objective of the task and the purpose of the report. The Federal Emergency Management Agency (FEMA) developed a plan in 1997 to modernize the FEMA flood mapping program. FEMA flood maps delineate flood hazard areas in support of the National Flood Insurance Program (NFIP). FEMA's plan outlined the steps necessary to update FEMA's flood maps for the nation to a seamless digital format and streamline FEMA's operations in raising public awareness of the importance of the maps and responding to requests to revise them. The modernization of flood maps involves conversion of existing information to digital format and integration of improved flood hazard data as needed. To determine flood mapping modernization needs, FEMA has established specific scoping activities to be done on a county-by-county basis for identifying and prioritizing requisite flood-mapping activities for map modernization. The U.S. Geological Survey (USGS), in cooperation with FEMA and the Maine State Planning Office Floodplain Management Program, began scoping work in 2005 for Somerset County. Scoping activities included assembling existing data and map needs information for communities in Somerset County (efforts were made not to duplicate those of pre-scoping completed in March 2005), documentation of data, contacts, community meetings, and prioritized mapping needs in a final scoping report (this document), and updating the Mapping Needs Update Support System (MNUSS) Database or its successor with information gathered during the scoping process. The average age of the FEMA floodplain maps in Somerset County, Maine, is 18.1 years. Most of these studies were done in the late 1970s to the mid-1980s. However, in the ensuing 20-30 years, development has occurred in many of the watersheds, and the characteristics of the watersheds have changed with time. Therefore, many of the older studies may not depict current conditions or accurately estimate risk in terms of flood heights. The following is the scope of work as defined in the FEMA/USGS Statement of Work: Task 1: Collect data from a variety of sources including community surveys, other Federal and State Agencies, National Flood Insurance Program (NFIP) State Coordinators, Community Assistance Visits (CAVs) and FEMA archives. Lists of mapping needs will be obtained from the MNUSS database, community surveys, and CAVs, if available. FEMA archives will be inventoried for effective FIRM panels, FIS reports, and other flood-hazard data or existing study data. Best available base map information, topographic data, flood-hazard data, and hydrologic and hydraulic data will be identified. Data from the Maine Floodplain Management Program database also will be utilized. Task 2: Contact communities in Somerset County to notify them that FEMA and the State have selected them for a map update, and that a project scope will be developed with their input. 
Topics to be reviewed with the communities include (1) Purpose of the Flood Map Project (for example, the update needs that have prompted the map update); (2) The community's mapping needs; (3) The community's available mapping, hydrologic, hydraulic, and flooding information; (4) target schedule for completing the project; and (5) The community's engineering, planning, and geographic information system (GIS) capabilities. On the basis of the collected information from Task 1 and community contacts/meetings in Task 2, the USGS will develop a Draft Project Scope for the identified mapping needs of the communities in Somerset County. The following items will be addressed in the Draft Project Scope: review of available information, determine if and ho

  14. Scoping of flood hazard mapping needs for Cumberland County, Maine

    USGS Publications Warehouse

    Dudley, Robert W.; Schalk, Charles W.

    2006-01-01

    This report was prepared by the U.S. Geological Survey (USGS) Maine Water Science Center as the deliverable for scoping of flood hazard mapping needs for Cumberland County, Maine, under Federal Emergency Management Agency (FEMA) Inter-Agency Agreement Number HSFE01-05-X-0018. This section of the report explains the objective of the task and the purpose of the report. The Federal Emergency Management Agency (FEMA) developed a plan in 1997 to modernize the FEMA flood mapping program. FEMA flood maps delineate flood hazard areas in support of the National Flood Insurance Program (NFIP). FEMA's plan outlined the steps necessary to update FEMA's flood maps for the nation to a seamless digital format and streamline FEMA's operations in raising public awareness of the importance of the maps and responding to requests to revise them. The modernization of flood maps involves conversion of existing information to digital format and integration of improved flood hazard data as needed. To determine flood mapping modernization needs, FEMA has established specific scoping activities to be done on a county-by-county basis for identifying and prioritizing requisite flood-mapping activities for map modernization. The U.S. Geological Survey (USGS), in cooperation with FEMA and the Maine State Planning Office Floodplain Management Program, began scoping work in 2005 for Cumberland County. Scoping activities included assembling existing data and map needs information for communities in Cumberland County, documentation of data, contacts, community meetings, and prioritized mapping needs in a final scoping report (this document), and updating the Mapping Needs Update Support System (MNUSS) Database or its successor with information gathered during the scoping process. The average age of the FEMA floodplain maps in Cumberland County, Maine, is 21 years. Most of these studies were done in the early to mid-1980s. However, in the ensuing 20-25 years, development has occurred in many of the watersheds, and the characteristics of the watersheds have changed with time. Therefore, many of the older studies may not depict current conditions or accurately estimate risk in terms of flood heights. The following is the scope of work as defined in the FEMA/USGS Statement of Work: Task 1: Collect data from a variety of sources including community surveys, other Federal and State Agencies, National Flood Insurance Program (NFIP) State Coordinators, Community Assistance Visits (CAVs) and FEMA archives. Lists of mapping needs will be obtained from the MNUSS database, community surveys, and CAVs, if available. FEMA archives will be inventoried for effective FIRM panels, FIS reports, and other flood-hazard data or existing study data. Best available base map information, topographic data, flood-hazard data, and hydrologic and hydraulic data will be identified. Data from the Maine Floodplain Management Program database also will be utilized. Task 2: Contact communities in Cumberland County to notify them that FEMA and the State have selected them for a map update, and that a project scope will be developed with their input. 
Topics to be reviewed with the communities include (1) Purpose of the Flood Map Project (for example, the update needs that have prompted the map update); (2) The community's mapping needs; (3) The community's available mapping, hydrologic, hydraulic, and flooding information; (4) target schedule for completing the project; and (5) The community's engineering, planning, and geographic information system (GIS) capabilities. On the basis of the collected information from Task 1 and community contacts/meetings in Task 2, the USGS will develop a Draft Project Scope for the identified mapping needs of the communities in Cumberland County. The following items will be addressed in the Draft Project Scope: review of available information, determine if and how effective FIS data can be used in new project, and identify other data needed to

  15. Scoping of Flood Hazard Mapping Needs for Penobscot County, Maine

    USGS Publications Warehouse

    Schalk, Charles W.; Dudley, Robert W.

    2007-01-01

    Background The Federal Emergency Management Agency (FEMA) developed a plan in 1997 to modernize the FEMA flood mapping program. FEMA flood maps delineate flood hazard areas in support of the National Flood Insurance Program (NFIP). FEMA's plan outlined the steps necessary to update FEMA's flood maps for the nation to a seamless digital format and streamline FEMA's operations in raising public awareness of the importance of the maps and responding to requests to revise them. The modernization of flood maps involves conversion of existing information to digital format and integration of improved flood hazard data as needed. To determine flood mapping modernization needs, FEMA has established specific scoping activities to be done on a county-by-county basis for identifying and prioritizing requisite flood-mapping activities for map modernization. The U.S. Geological Survey (USGS), in cooperation with FEMA and the Maine State Planning Office Floodplain Management Program (MFMP), began scoping work in 2006 for Penobscot County. Scoping activities included assembling existing data and map needs information for communities in Penobscot County, documentation of data, contacts, community meetings, and prioritized mapping needs in a final scoping report (this document), and updating the Mapping Needs Update Support System (MNUSS) Database with information gathered during the scoping process. As of 2007, the average age of the FEMA floodplain maps in Penobscot County, Maine, is 22 years, based on the most recent revisions to the maps. Because the revisions did not affect all the map panels in each town, however, the true average age probably is more than 22 years. Many of the studies were published in the mid-1980s. Since the studies were completed, development has occurred in many of the watersheds, and the characteristics of the watersheds have changed with time. Therefore, many of the older studies may not depict current conditions or accurately estimate risk in terms of flood heights or flood mapping.

  16. Scoping of Flood Hazard Mapping Needs for Hancock County, Maine

    USGS Publications Warehouse

    Schalk, Charles W.; Dudley, Robert W.

    2007-01-01

    Background The Federal Emergency Management Agency (FEMA) developed a plan in 1997 to modernize the FEMA flood mapping program. FEMA flood maps delineate flood hazard areas in support of the National Flood Insurance Program (NFIP). FEMA's plan outlined the steps necessary to update FEMA's flood maps for the nation to a seamless digital format and streamline FEMA's operations in raising public awareness of the importance of the maps and responding to requests to revise them. The modernization of flood maps involves conversion of existing information to digital format and integration of improved flood hazard data as needed. To determine flood mapping modernization needs, FEMA has established specific scoping activities to be done on a county-by-county basis for identifying and prioritizing requisite flood-mapping activities for map modernization. The U.S. Geological Survey (USGS), in cooperation with FEMA and the Maine Floodplain Management Program (MFMP) State Planning Office, began scoping work in 2006 for Hancock County. Scoping activities included assembling existing data and map needs information for communities in Hancock County, documentation of data, contacts, community meetings, and prioritized mapping needs in a final scoping report (this document), and updating the Mapping Needs Update Support System (MNUSS) database with information gathered during the scoping process. The average age of the FEMA floodplain maps (all types) in Hancock County, Maine, is at least 19 years. Most of these studies were published in the late 1980s and early 1990s, and no study is more recent than 1992. Some towns have partial maps that are more recent than their study, indicating that the true average age of the data is probably more than 19 years. Since the studies were done, development has occurred in some of the watersheds and the characteristics of the watersheds have changed. Therefore, many of the older studies may not depict current conditions or accurately estimate risk in terms of flood heights or flood mapping.

  17. New Trends in E-Science: Machine Learning and Knowledge Discovery in Databases

    NASA Astrophysics Data System (ADS)

    Brescia, Massimo

    2012-11-01

    Data mining, or Knowledge Discovery in Databases (KDD), while being the main methodology for extracting the scientific information contained in Massive Data Sets (MDS), must tackle crucial problems, since it has to orchestrate the complex challenges posed by transparent access to different computing environments, the scalability of algorithms, and the reusability of resources. To achieve a leap forward for the progress of e-science in the data-avalanche era, the community needs to implement an infrastructure capable of performing data access, processing and mining in a distributed but integrated context. The increasing complexity of modern technologies has brought about a huge production of data, and the associated warehouse management, together with the need to optimize analysis and mining procedures, is changing the very conception of modern science. Classical data exploration, based on users' own local data storage and limited computing infrastructures, is no longer efficient in the case of MDS spread worldwide over inhomogeneous data centres and requiring teraflop processing power. In this context, modern experimental and observational science requires a good understanding of computer science, network infrastructures, data mining, etc., i.e. of all those techniques which fall into the domain of so-called e-science (recently assessed also by the Fourth Paradigm of Science). Such understanding is almost completely absent in the older generations of scientists, and this is reflected in the inadequacy of most academic and research programs. A paradigm shift is needed: statistical pattern recognition, object-oriented programming, distributed computing, and parallel programming need to become an essential part of the scientific background. A possible practical solution is to provide the research community with easy-to-understand, easy-to-use tools, based on Web 2.0 technologies and Machine Learning methodology: tools where almost all the complexity is hidden from the final user, but which are still flexible and able to produce efficient and reliable scientific results. All these considerations will be described in detail in the chapter. Moreover, examples of modern applications offering a wide variety of e-science communities a large spectrum of computational facilities to exploit the wealth of available massive data sets and powerful machine learning and statistical algorithms will also be introduced.

  18. Large Scale Landslide Database System Established for the Reservoirs in Southern Taiwan

    NASA Astrophysics Data System (ADS)

    Tsai, Tsai-Tsung; Tsai, Kuang-Jung; Shieh, Chjeng-Lun

    2017-04-01

    Typhoon Morakot's severe attack on southern Taiwan awakened public awareness of large-scale landslide disasters. Large-scale landslide disasters produce large quantities of sediment, which have negative effects on the operating functions of reservoirs. In order to reduce the risk of these disasters within the study area, the establishment of a database for hazard mitigation and disaster prevention is necessary. Real-time data and numerous archives of engineering data, environmental information, photos, and video will not only help people make appropriate decisions, but also pose the challenge of processing the data and adding value to them. The study tried to define basic data formats/standards for the various types of data collected about these reservoirs and to provide a management platform based on these formats/standards. Meanwhile, for practicality and convenience, the large-scale landslide disaster database system is built with the ability both to provide and to receive information, so that users can work with it on different types of devices. IT technology progresses extremely quickly, and even the most modern system might be out of date at any time; in order to provide long-term service, the system therefore reserves the possibility of user-defined data formats/standards and a user-defined system structure. The system established by this study is based on the HTML5 standard language and uses responsive web design technology, which allows users to easily operate and extend this large-scale landslide disaster database system.

  19. Medicinal Plants for the Treatment of Asthma: A Traditional Persian Medicine Perspective.

    PubMed

    Javadi, Behjat; Sahebkar, Amirhossein; Emami, Seyed Ahmad

    2017-01-01

    To search major Traditional Persian Medicine (TPM) textbooks for medicinal plants used to treat asthma. The conformity of the TPM findings on the anti-asthmatic efficacy of plants with the findings of pharmacological studies was also explored. Major TPM textbooks were hand searched to find medicinal plants used for the treatment of asthma. Scientific names of TPM-suggested plants were determined using botanical databases and were used for a multidatabase electronic search in PubMed, Scopus, ScienceDirect and Google Scholar databases. Then, the antiasthmatic effectiveness of TPM-recommended plants was verified in view of the findings from modern pharmacological investigations. According to the main TPM texts, Adianthum capillus-veneris, Boswellia oleogumresin, Crocus sativus, Glycyrrhiza glabra, Hyssopus officinalis and Ruta graveolens were the most efficacious medicinal plants for the treatment of asthma. This finding was confirmed by pharmacological studies which showed counterbalancing effects of the above-mentioned plants on inflammation, oxidative stress, allergic response, tracheal smooth muscle cell constriction and airway remodeling. The strong ethnobotanical background of plants used in TPM could be a valuable tool to find new anti-asthmatic medications. In this review, TPM-suggested anti-asthmatic plants were found to possess several mechanisms relevant to the treatment of respiratory diseases according to the information retrieved from modern pharmacological studies. This high degree of conformity suggested further proof-of-concept trials to ascertain the role of these plants in the routine management of asthmatic patients. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  20. Pre-hospital management of mass casualty civilian shootings: a systematic literature review.

    PubMed

    Turner, Conor D A; Lockey, David J; Rehn, Marius

    2016-11-08

    Mass casualty civilian shootings present an uncommon but recurring challenge to emergency services around the world and produce unique management demands. Against the background of a rising threat of transnational terrorism worldwide, emergency response strategies are of critical importance. This study aims to systematically identify, describe and appraise the quality of indexed and non-indexed literature on the pre-hospital management of modern civilian mass shootings to guide future practice. Systematic literature searches of PubMed, the Cochrane Database of Systematic Reviews and Scopus were conducted in conjunction with simple searches of non-indexed databases: Web of Science, OpenDOAR and Evidence Search. The searches were last carried out on 20 April 2016 and only identified papers published after 1 January 1980. Included documents had to contain descriptions, discussions or experiences of the pre-hospital management of civilian mass shootings. From the 494 identified manuscripts, 73 were selected on abstract and title, and after full-text reading 47 were selected for inclusion in the analysis. The search yielded reports of 17 mass shooting events, the majority from the USA with additions from France, Norway, the UK and Kenya. Between 1994 and 2015, the shooting of 1649 people with 578 deaths at 17 separate events is described. Quality appraisal demonstrated considerable heterogeneity in reporting and revealed limited data on mass shootings globally. Key themes were identified to improve future practice: tactical emergency medical support may harmonise inner cordon interventions; there is a need for inter-service education on effective haemorrhage control; senior triage operators are valuable; and regular mass casualty incident simulation is needed.

  1. Innovative work behavior of managers: Implications regarding stressful challenges of modernized public- and private-sector organizations

    PubMed Central

    Mukherjee, Sudeshna Basu; Ray, Anjali

    2009-01-01

    Background: The present study was firstly aimed to find out the nature of stressful life events arising out of the innovative challenges in modernized organizations; and secondly, it tried to identify the relationship between innovative work behavior of managers and the levels of stress arising out of stressful events in modernized organizations (public and private) in West Bengal. Materials and Methods: Data was collected from a sample of 200 managers, by using 3 tools (General Information Schedule, Life Event Inventory and Innovative Work Behavior Scale) through a face-to-face interview. Responses were subjected to both quantitative and qualitative analyses. The data was statistically treated for ‘t’ and ANOVA. Results: Data highlighted the fact that the qualitative profile of stressful events in the lives of managers expressed specificity in terms of their organizational type (public- and private-sector modernized organizations), and levels of stress from stressful life events were significantly higher among the modernized private-sector managers than those among public-sector managers. The prevalence of innovative work behavior was moderately higher among managers of private-sector modernized organizations than their counterparts in public-sector organizations. The trends of innovative work behavior of the managers indicated much variability due to interaction of their level of perceived stressful challenges for innovation and the global forces of change that have unleashed dynamic, systematic and higher expectation level from them. PMID:21180486

  2. Innovative work behavior of managers: Implications regarding stressful challenges of modernized public- and private-sector organizations.

    PubMed

    Mukherjee, Sudeshna Basu; Ray, Anjali

    2009-07-01

    The present study was firstly aimed to find out the nature of stressful life events arising out of the innovative challenges in modernized organizations; and secondly, it tried to identify the relationship between innovative work behavior of managers and the levels of stress arising out of stressful events in modernized organizations (public and private) in West Bengal. Data was collected from a sample of 200 managers, by using 3 tools (General Information Schedule, Life Event Inventory and Innovative Work Behavior Scale) through a face-to-face interview. Responses were subjected to both quantitative and qualitative analyses. The data was statistically treated for 't' and ANOVA. Data highlighted the fact that the qualitative profile of stressful events in the lives of managers expressed specificity in terms of their organizational type (public- and private-sector modernized organizations), and levels of stress from stressful life events were significantly higher among the modernized private-sector managers than those among public-sector managers. The prevalence of innovative work behavior was moderately higher among managers of private-sector modernized organizations than their counterparts in public-sector organizations. The trends of innovative work behavior of the managers indicated much variability due to interaction of their level of perceived stressful challenges for innovation and the global forces of change that have unleashed dynamic, systematic and higher expectation level from them.

  3. An Introduction to Database Structure and Database Machines.

    ERIC Educational Resources Information Center

    Detweiler, Karen

    1984-01-01

    Enumerates principal management objectives of database management systems (data independence, quality, security, multiuser access, central control) and criteria for comparison (response time, size, flexibility, other features). Conventional database management systems, relational databases, and database machines used for backend processing are…

  4. Migrating a lecture in nursing informatics to a blended learning format--A bottom-up approach to implement an open-source web-based learning management system.

    PubMed

    Schrader, Ulrich

    2006-01-01

    At a university of applied sciences in Germany, a learning management system has been implemented. The migration of classic courses to a web-enhanced curriculum can be categorized into three phases, independent of the technology used. The first two phases, "dedicated website" and "database-supported content management system", are mainly concerned with bringing the learning material and current information online and making it available to the students. The goal here is to make the maintenance of the learning material easier. The third phase, characterized by the use of a learning management system, offers support for more modern didactic principles such as social constructionism or problem-oriented learning. In this paper, the phases as they occurred during the migration of a nursing informatics course are described and the experiences discussed. The absence of institutional goals associated with the use of a learning management system led to a bottom-up approach, triggered by faculty activities, that can be described by a promoter model rather than by a process management model. The use of an open-source learning management system made this process easier to realize, since no financial commitment is required up front.

  5. SIOExplorer: Modern IT Methods and Tools for Digital Library Management

    NASA Astrophysics Data System (ADS)

    Sutton, D. W.; Helly, J.; Miller, S.; Chase, A.; Clark, D.

    2003-12-01

    With more geoscience disciplines becoming data-driven, it is increasingly important to utilize modern techniques for data, information and knowledge management. SIOExplorer is a new digital library project with 2 terabytes of oceanographic data collected over the last 50 years on 700 cruises by the Scripps Institution of Oceanography. It is built using a suite of information technology tools and methods that allow for an efficient and effective digital library management system. The library consists of a number of independent collections, each with corresponding metadata formats. The system architecture allows each collection to be built and uploaded based on a collection-dependent metadata template file (MTF). This file is used to create the hierarchical structure of the collection, to create metadata tables in a relational database, and to populate object metadata files and the collection as a whole. Collections comprise arbitrary digital objects stored at the San Diego Supercomputer Center (SDSC) High Performance Storage System (HPSS) and managed using the Storage Resource Broker (SRB), data-handling middleware developed at SDSC. SIOExplorer interoperates with other collections as a data provider through the Open Archives Initiative (OAI) protocol. The user services for SIOExplorer are accessed from CruiseViewer, a Java application served using Java Web Start from the SIOExplorer home page. CruiseViewer is an advanced tool for data discovery and access. It implements general keyword and interactive geospatial search methods for the collections. It georeferences search results on user-selected basemaps such as global topography or crustal age. User services include metadata viewing, opening of digital objects of selected MIME types (such as images, documents and grid files), and downloading of objects (including the brokering of proprietary hold restrictions).
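    The template-driven collection building described above might look roughly like the following sketch, where an invented miniature template format drives CREATE TABLE generation; the real MTF format is not reproduced here.

    import sqlite3

    # A tiny, invented stand-in for a collection-dependent template:
    template = """collection: bathymetry
    field: cruise_id TEXT
    field: instrument TEXT
    field: min_depth_m REAL
    field: max_depth_m REAL"""

    def table_from_template(text, db):
        """Create one metadata table from 'key: value' template lines."""
        lines = [l.strip().split(":", 1) for l in text.splitlines()]
        name = next(v.strip() for k, v in lines if k == "collection")
        cols = [v.strip() for k, v in lines if k == "field"]
        db.execute(f"CREATE TABLE {name} ({', '.join(cols)})")
        return name

    db = sqlite3.connect(":memory:")
    print(table_from_template(template, db))  # -> bathymetry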

  6. Intrusion Detection in Database Systems

    NASA Astrophysics Data System (ADS)

    Javidi, Mohammad M.; Sohrabi, Mina; Rafsanjani, Marjan Kuchaki

    Data today represent a valuable asset for organizations and companies and must be protected. Ensuring the security and privacy of data assets is a crucial and very difficult problem in our modern networked world. Despite the necessity of protecting information stored in database systems (DBS), existing security models are insufficient to prevent misuse, especially insider abuse by legitimate users. One mechanism to safeguard the information in these databases is to use an intrusion detection system (IDS). The purpose of intrusion detection in database systems is to detect transactions that access data without permission. In this paper, several database intrusion detection approaches are evaluated.
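    In its simplest form, the kind of misuse detection discussed in this survey compares what a transaction touched against what the user was granted. The sketch below illustrates only that baseline idea, with invented users, tables, and an invented audit log.

    # Hypothetical per-user table grants and a toy transaction log.
    GRANTS = {"alice": {"orders", "customers"}, "bob": {"orders"}}

    def audit(user, tables_touched):
        """Return the set of tables a transaction accessed without permission."""
        return set(tables_touched) - GRANTS.get(user, set())

    log = [("alice", ["orders"]), ("bob", ["orders", "salaries"])]
    for user, tables in log:
        bad = audit(user, tables)
        if bad:
            print(f"ALERT: {user} accessed {sorted(bad)} without permission")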

  7. Applications of GIS and database technologies to manage a Karst Feature Database

    USGS Publications Warehouse

    Gao, Y.; Tipping, R.G.; Alexander, E.C.

    2006-01-01

    This paper describes the management of a Karst Feature Database (KFD) in Minnesota. Two sets of applications, in a GIS and in a Database Management System (DBMS), have been developed for the KFD of Minnesota. These applications were used to manage and to enhance the usability of the KFD. Structured Query Language (SQL) was used to manipulate transactions of the database and to facilitate the functionality of the user interfaces. The Database Administrator (DBA) authorized users with different access permissions to enhance the security of the database. Database consistency and recovery are accomplished by creating data logs and maintaining backups on a regular basis. The working database provides guidelines and management tools for future studies of karst features in Minnesota. The methodology of designing this DBMS is applicable to developing GIS-based databases to analyze and manage geomorphic and hydrologic datasets at both regional and local scales. The short-term goal of this research is to develop a regional KFD for the Upper Mississippi Valley Karst, and the long-term goal is to expand this database to manage and study karst features at national and global scales.
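    The SQL transaction handling and DBA-granted permissions mentioned above can be sketched as follows; the connection string, table, and role names are placeholders, not the actual KFD configuration.

    import psycopg2

    conn = psycopg2.connect("dbname=kfd user=dba")
    with conn:                      # commits on success, rolls back on error
        with conn.cursor() as cur:
            cur.execute(
                "INSERT INTO karst_feature (name, feature_type, county) "
                "VALUES (%s, %s, %s)", ("Mystery Cave", "cave", "Fillmore"))
            # Different access permissions for different classes of users:
            cur.execute("GRANT SELECT ON karst_feature TO public_user")
            cur.execute("GRANT SELECT, INSERT, UPDATE ON karst_feature "
                        "TO field_geologist")
    conn.close()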

  8. KAGLVis - On-line 3D Visualisation of Earth-observing-satellite Data

    NASA Astrophysics Data System (ADS)

    Szuba, Marek; Ameri, Parinaz; Grabowski, Udo; Maatouki, Ahmad; Meyer, Jörg

    2015-04-01

    One of the goals of the Large-Scale Data Management and Analysis project is to provide a high-performance framework facilitating the management of data acquired by Earth-observing satellites such as Envisat. On the client-facing facet of this framework, we strive to provide a visualisation and basic analysis tool which can be used by scientists with minimal to no knowledge of the underlying infrastructure. Our tool, KAGLVis, is a JavaScript client-server Web application which leverages modern Web technologies to provide three-dimensional visualisation of satellite observables on a wide range of client systems. It takes advantage of the WebGL API to employ locally available GPU power for 3D rendering; this approach has been demonstrated to perform well even on relatively weak hardware, such as the integrated graphics chipsets found in modern laptop computers, and with some user-interface tuning could even be usable on embedded devices such as smartphones or tablets. Data are fetched from the database back-end using a ReST API and cached locally, both in memory and using HTML5 Web Storage, to minimise network use. Computations, for instance the calculation of cloud altitude from cloud-index measurements, can, depending on configuration, be performed on either the client or the server side. Keywords: satellite data, Envisat, visualisation, 3D graphics, Web application, WebGL, MEAN stack.
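    The fetch-and-cache pattern described above (ReST retrieval plus a local cache) can be sketched outside the browser as well; the endpoint URL and JSON layout below are invented placeholders, since the real client is JavaScript.

    import requests

    _cache = {}  # stands in for the browser's in-memory / Web Storage cache

    def get_observable(orbit, variable):
        """Fetch one observable via ReST, hitting the network only on a miss."""
        key = (orbit, variable)
        if key not in _cache:
            resp = requests.get(
                f"https://example.org/api/orbits/{orbit}/{variable}",
                timeout=30)
            resp.raise_for_status()
            _cache[key] = resp.json()
        return _cache[key]

    # Derived quantities can then be computed client-side, per configuration:
    cloud_index = get_observable(12345, "cloud_index")
    altitudes = [s["altitude_km"] for s in cloud_index["samples"]]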

  9. The "Prediflood" database of historical floods in Catalonia (NE Iberian Peninsula) AD 1035-2013, and its potential applications in flood analysis

    NASA Astrophysics Data System (ADS)

    Barriendos, M.; Ruiz-Bellet, J. L.; Tuset, J.; Mazón, J.; Balasch, J. C.; Pino, D.; Ayala, J. L.

    2014-07-01

    "Prediflood" is a database of historical floods occurred in Catalonia (NE Iberian Peninsula), between 10th Century and 21th Century. More than 2700 flood cases are catalogued, and more than 1100 flood events. This database contains information acquired under modern historiographical criteria and it is, therefore, apt to be used in multidisciplinary flood analysis techniques, as meteorological or hydraullic reconstructions.

  10. Negative Effects of Learning Spreadsheet Management on Learning Database Management

    ERIC Educational Resources Information Center

    Vágner, Anikó; Zsakó, László

    2015-01-01

    A lot of students learn spreadsheet management before database management. Their similarities can cause a lot of negative effects when learning database management. In this article, we consider these similarities and explain what can cause problems. First, we analyse the basic concepts such as table, database, row, cell, reference, etc. Then, we…

  11. An automated calibration laboratory - Requirements and design approach

    NASA Technical Reports Server (NTRS)

    O'Neil-Rood, Nora; Glover, Richard D.

    1990-01-01

    NASA's Dryden Flight Research Facility (Ames-Dryden) operates a diverse fleet of research aircraft which are heavily instrumented to provide both real-time data for in-flight monitoring and recorded data for postflight analysis. Ames-Dryden's existing automated calibration (AUTOCAL) laboratory is a computerized facility which tests aircraft sensors to certify accuracy for anticipated harsh flight environments. Recently, a major AUTOCAL lab upgrade was initiated; the goal of this modernization is to enhance productivity and improve configuration management for both software and test data. The new system will have multiple testing stations employing distributed processing linked by a local area network to a centralized database. The baseline requirements for the new AUTOCAL lab and the design approach being taken for its mechanization are described.

  12. 76 FR 59170 - Hartford Financial Services, Inc., Corporate/EIT/CTO Database Management Division, Hartford, CT...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-23

    ... Services, Inc., Corporate/EIT/CTO Database Management Division, Hartford, CT; Notice of Negative... Services, Inc., Corporate/EIT/CTO Database Management Division, Hartford, Connecticut (The Hartford, Corporate/EIT/CTO Database Management Division). The negative determination was issued on August 19, 2011...

  13. Rebound effect of modern drugs: serious adverse event unknown by health professionals.

    PubMed

    Teixeira, Marcus Zulian

    2013-01-01

    Grounded in the Hippocratic aphorism primum non nocere, the bioethical principle of non-maleficence requires that the medical act cause the least possible damage or injury to the health of the patient, leaving it to the doctor to assess the risks of a particular therapy through knowledge of the possible adverse events of drugs. Among these, the rebound effect represents a common adverse event of numerous classes of modern drugs and may cause serious and fatal disorders in patients. This review aims to enlighten health professionals on the clinical and epidemiological aspects of the rebound phenomenon. A qualitative, exploratory, bibliographic review was carried out in the PubMed database using the keywords 'rebound', 'withdrawal', 'paradoxical', 'acetylsalicylic acid', 'anti-inflammatory', 'bronchodilator', 'antidepressant', 'statin', 'proton pump inhibitor' and 'bisphosphonate'. The rebound effect occurs after discontinuation of numerous classes of drugs that act contrary to the disorders of the disease, exacerbating them to levels above those prior to treatment. Regardless of the disease, the drug and the duration of treatment, the phenomenon manifests itself in a small proportion of susceptible individuals. However, since it may cause serious and fatal adverse events, it should be considered a public health problem in view of the enormous consumption of drugs by the population. In the face of a growing and unquestionable body of evidence, the physician needs to know the consequences of the rebound effect and how to minimize it, increasing safety in the management of modern drugs. On the other hand, this rebound can be used in a curative way, broadening the spectrum of modern therapeutics. Copyright © 2012 Elsevier Editora Ltda. All rights reserved.

  14. Machine Learning and Decision Support in Critical Care

    PubMed Central

    Johnson, Alistair E. W.; Ghassemi, Mohammad M.; Nemati, Shamim; Niehaus, Katherine E.; Clifton, David A.; Clifford, Gari D.

    2016-01-01

    Clinical data management systems typically provide caregiver teams with useful information, derived from large, sometimes highly heterogeneous, data sources that are often changing dynamically. Over the last decade there has been a significant surge in interest in using these data sources, from simply re-using the standard clinical databases for event prediction or decision support, to including dynamic and patient-specific information into clinical monitoring and prediction problems. However, in most cases, commercial clinical databases have been designed to document clinical activity for reporting, liability and billing reasons, rather than for developing new algorithms. With increasing excitement surrounding “secondary use of medical records” and “Big Data” analytics, it is important to understand the limitations of current databases and what needs to change in order to enter an era of “precision medicine.” This review article covers many of the issues involved in the collection and preprocessing of critical care data. The three challenges in critical care are considered: compartmentalization, corruption, and complexity. A range of applications addressing these issues are covered, including the modernization of static acuity scoring; on-line patient tracking; personalized prediction and risk assessment; artifact detection; state estimation; and incorporation of multimodal data sources such as genomic and free text data. PMID:27765959

  15. Why different countries manage death differently: a comparative analysis of modern urban societies.

    PubMed

    Walter, Tony

    2012-03-01

    The sociology of death, dying and bereavement tends to take as its implicit frame either the nation state or a homogenous modernity. Between-nation differences in the management of death and dying are either ignored or untheorized. This article seeks to identify the factors that can explain both similarities and differences in the management of death between different modern western nations. Structural factors which affect all modern nations include urbanization and the division of labour leading to the dominance of professionals, migration, rationality and bureaucracy, information technology and the risk society. How these sociologically familiar structural features are responded to, however, depends on national histories, institutions and cultures. Historically, key transitional periods to modernity, different in different nations, necessitated particular institutional responses in the management of dying and dead bodies. Culturally, key factors include individualism versus collectivism, religion, secularization, boundary regulation, and expressivism. Global flows of death practices depend significantly on subjugated nations' perceptions of colonialism, neo-colonialism and modernity, which can lead to a dominant power's death practices being either imitated or rejected. © London School of Economics and Political Science 2012.

  16. The "Prediflood" database of historical floods in Catalonia (NE Iberian Peninsula) AD 1035-2013, and its potential applications in flood analysis

    NASA Astrophysics Data System (ADS)

    Barriendos, M.; Ruiz-Bellet, J. L.; Tuset, J.; Mazón, J.; Balasch, J. C.; Pino, D.; Ayala, J. L.

    2014-12-01

    "Prediflood" is a database of historical floods that occurred in Catalonia (NE Iberian Peninsula), between the 11th century and the 21st century. More than 2700 flood cases are catalogued, and more than 1100 flood events. This database contains information acquired under modern historiographical criteria and it is, therefore, suitable for use in multidisciplinary flood analysis techniques, such as meteorological or hydraulic reconstructions.

  17. MOSAIC: An organic geochemical and sedimentological database for marine surface sediments

    NASA Astrophysics Data System (ADS)

    Tavagna, Maria Luisa; Usman, Muhammed; De Avelar, Silvania; Eglinton, Timothy

    2015-04-01

    Modern ocean sediments serve as the interface between the biosphere and the geosphere, play a key role in biogeochemical cycles, and provide a window on how contemporary processes are written into the sedimentary record. Research over past decades has resulted in a wealth of information on the content and composition of organic matter in marine sediments, with ever-more sophisticated techniques continuing to yield information of greater detail at an accelerating pace. However, there has been no attempt to synthesize this wealth of information. We are establishing a new database that incorporates information relevant to local, regional and global-scale assessment of the content, source and fate of organic materials accumulating in contemporary marine sediments. In the MOSAIC (Modern Ocean Sediment Archive and Inventory of Carbon) database, particular emphasis is placed on molecular and isotopic information, coupled with contextual information (e.g., sedimentological properties) relevant to elucidating factors that influence the efficiency and nature of organic matter burial. The main features of MOSAIC include: (i) emphasis on continental margin sediments as major loci of carbon burial and as the interface between terrestrial and oceanic realms; (ii) bulk to molecular-level organic geochemical properties and parameters, including concentrations and isotopic compositions; (iii) extensive contextual data on the depositional setting, in particular with respect to sedimentological and redox characteristics. The ultimate goal is to create an open-access, web-based instrument to be utilized for research and education by the international community, which can both contribute to and interrogate the database. Submissions will be made by means of a pre-configured table available on the MOSAIC webpage; the submitted tables will be checked and then imported, via Structured Query Language (SQL), into MOSAIC. MOSAIC is implemented in PostgreSQL, an open-source database management system. To geolocate the data, each element/datum is associated with a latitude, longitude and depth, facilitating creation of a geospatial database that can easily be interfaced to a Geographic Information System (GIS). To make the database broadly accessible, an HTML/PHP-based website will ultimately be created and linked to the database. Consulting the website will allow both data visualization and export of data in txt format for use with common software (e.g. ODV, Excel, Matlab, Python, Word, PPT, Illustrator…). At this early stage, MOSAIC contains approximately 10000 analyses conducted on more than 1800 samples collected from over 1600 geographical locations around the world. Through participation of the international research community, MOSAIC will rapidly develop into a rich archive and versatile tool for investigating the distribution and composition of organic matter accumulating in seafloor sediments. The present contribution outlines the structure of MOSAIC, provides examples of data output, and solicits feedback on desirable features to be included in the database and associated software tools.
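
    As a rough illustration of the geospatial keying described above, the following sketch creates a minimal two-table schema in which every sample carries a latitude, longitude and depth, and runs the kind of bounding-box query a GIS front end might issue. It uses Python's built-in sqlite3 in place of PostgreSQL, and all table and column names are hypothetical, not MOSAIC's actual schema.

```python
import sqlite3

# Minimal, hypothetical schema in the spirit of MOSAIC: every sample is
# keyed by latitude, longitude and water depth so it can be geolocated.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE samples (
    sample_id   INTEGER PRIMARY KEY,
    latitude    REAL NOT NULL,   -- decimal degrees
    longitude   REAL NOT NULL,   -- decimal degrees
    depth_m     REAL NOT NULL    -- water depth in meters
);
CREATE TABLE analyses (
    analysis_id INTEGER PRIMARY KEY,
    sample_id   INTEGER REFERENCES samples(sample_id),
    property    TEXT,            -- e.g. 'TOC', 'd13C' (invented labels)
    value       REAL
);
""")
conn.execute("INSERT INTO samples VALUES (1, 43.7, 7.3, 2150.0)")
conn.execute("INSERT INTO analyses VALUES (1, 1, 'TOC', 0.82)")

# Bounding-box retrieval: the simplest geospatial query a GIS-facing
# front end might translate a user's map selection into.
rows = conn.execute("""
    SELECT s.latitude, s.longitude, s.depth_m, a.property, a.value
    FROM samples s JOIN analyses a ON a.sample_id = s.sample_id
    WHERE s.latitude BETWEEN 40 AND 45 AND s.longitude BETWEEN 5 AND 10
""").fetchall()
print(rows)
```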

  18. TWRS technical baseline database manager definition document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Acree, C.D.

    1997-08-13

    This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). This document will provide a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and the present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager.

  19. [Discussion on logistics management of medical consumables].

    PubMed

    Deng, Sutong; Wang, Miao; Jiang, Xiali

    2011-09-01

    Management of medical consumables is an important part of modern hospital management. In modern medical practice, drugs and medical devices act directly on the patient and are important factors affecting the quality of medical care. Based on practical experience with the increasing use of medical materials, this article proposes a management model for medical consumables and discusses the essence of medical materials logistics management.

  20. Current role of modern radiotherapy techniques in the management of breast cancer

    PubMed Central

    Ozyigit, Gokhan; Gultekin, Melis

    2014-01-01

    Breast cancer is the most common type of malignancy in females. Advances in systemic therapies and radiotherapy (RT) have provided long survival in breast cancer patients, and RT has a major role in the management of the disease. During the past 15 years several developments have taken place in the fields of imaging and irradiation techniques, intensity-modulated RT, hypofractionation and partial-breast irradiation. Currently, improvements in RT technology allow a decrease in treatment-related complications such as fibrosis and long-term cardiac toxicity while improving loco-regional control rates and cosmetic results. It is therefore crucial that modern radiotherapy techniques be carried out with maximum care and efficiency. Several randomized trials have provided evidence for the feasibility of modern radiotherapy techniques in the management of breast cancer; however, their role will continue to be defined by the mature results of these trials. The current review provides up-to-date, evidence-based data on the role of modern radiotherapy techniques in the management of breast cancer. PMID:25114857

  1. An object-oriented, technology-adaptive information model

    NASA Technical Reports Server (NTRS)

    Anyiwo, Joshua C.

    1995-01-01

    The primary objective was to develop a computer information system for effectively presenting NASA's technologies to American industries, for appropriate commercialization. To this end a comprehensive information management model, applicable to a wide variety of situations, and immune to computer software/hardware technological gyrations, was developed. The model consists of four main elements: a DATA_STORE, a data PRODUCER/UPDATER_CLIENT and a data PRESENTATION_CLIENT, anchored to a central object-oriented SERVER engine. This server engine facilitates exchanges among the other model elements and safeguards the integrity of the DATA_STORE element. It is designed to support new technologies, as they become available, such as Object Linking and Embedding (OLE), on-demand audio-video data streaming with compression (such as is required for video conferencing), World Wide Web (WWW) and other information services and browsing, fax-back data requests, presentation of information on CD-ROM, and regular in-house database management, regardless of the data model in place. The four components of this information model interact through a system of intelligent message agents which are customized to specific information exchange needs. This model is at the leading edge of modern information management models. It is independent of technological changes and can be implemented in a variety of ways to meet the specific needs of any communications situation. This summer a partial implementation of the model has been achieved. The structure of the DATA_STORE has been fully specified and successfully tested using Microsoft's FoxPro 2.6 database management system. Data PRODUCER/UPDATER and PRESENTATION architectures have been developed and also successfully implemented in FoxPro; and work has started on a full implementation of the SERVER engine. The model has also been successfully applied to a CD-ROM presentation of NASA's technologies in support of Langley Research Center's TAG efforts.
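
    The four-element architecture described above lends itself to a compact sketch. The following Python skeleton is an illustrative reading of the model, not NASA's implementation: a central server engine mediates every exchange through simple message tuples, so the two client types never touch the data store directly. All class and method names are invented.

```python
class DataStore:
    """Holds records; only the Server may touch it."""
    def __init__(self):
        self._records = {}

    def put(self, key, value):
        self._records[key] = value

    def get(self, key):
        return self._records.get(key)


class Server:
    """Central engine: mediates all exchanges, guards store integrity."""
    def __init__(self, store):
        self._store = store

    def handle(self, message):
        # A crude stand-in for an 'intelligent message agent':
        # dispatch on the message kind.
        kind, key, payload = message
        if kind == "update":
            if payload is None:          # integrity safeguard
                raise ValueError("refusing to store an empty record")
            self._store.put(key, payload)
        elif kind == "query":
            return self._store.get(key)


class ProducerUpdaterClient:
    """Publishes and updates records, always via the server."""
    def __init__(self, server):
        self._server = server

    def publish(self, key, record):
        self._server.handle(("update", key, record))


class PresentationClient:
    """Renders records for users, always via the server."""
    def __init__(self, server):
        self._server = server

    def show(self, key):
        print(self._server.handle(("query", key, None)))


store = DataStore()
server = Server(store)
ProducerUpdaterClient(server).publish("tech-001", "heat-pipe technology")
PresentationClient(server).show("tech-001")
```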

  2. Does adoption of electronic health records improve the quality of care management in France? Results from the French e-SI (PREPS-SIPS) study.

    PubMed

    Plantier, Morgane; Havet, Nathalie; Durand, Thierry; Caquot, Nicolas; Amaz, Camille; Biron, Pierre; Philip, Irène; Perrier, Lionel

    2017-06-01

    Electronic health records (EHR) are increasingly being adopted by healthcare systems worldwide. In France, the "Hôpital numérique 2012-2017" program was implemented as part of a strategic plan to modernize health information technology (HIT), including the promotion of widespread EHR use. With significant upfront investment costs as well as ongoing operational expenses, it is important to assess this system in terms of its ability to improve hospital performance. The aim of this study was to evaluate the impact of EHR use on the quality of care management in acute care hospitals throughout France. This retrospective study was based on data derived from three national databases for the year 2011: IPAQSS (indicators of improvement in the quality and management of healthcare), Hospi-Diag (French hospital performance indicators), and the national accreditation database. Several multivariate models were used to examine the association of EHR use, and of specific EHR features, with four quality indicators: the quality of patient records, the delay in sending information at hospital discharge, the evaluation of pain status, and the evaluation of nutritional status, while adjusting for hospital characteristics. The models revealed a significant positive impact of EHR use on all four quality indicators, and showed a differential impact according to which element of the health record was computerized. All four quality indicators were also affected by the type of hospital, the geographical region, and the severity of the pathology. These results suggest that EHR adoption represents an important lever for improving the quality of care management in hospitals. They complement previous work on EHR and the organizational performance of hospital surgical units. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. [Quality management and participation into clinical database].

    PubMed

    Okubo, Suguru; Miyata, Hiroaki; Tomotaki, Ai; Motomura, Noboru; Murakami, Arata; Ono, Minoru; Iwanaka, Tadashi

    2013-07-01

    Quality management is necessary for establishing a useful clinical database in cooperation with healthcare professionals and facilities. The main management tasks are 1) progress management of data entry, 2) liaison with database participants (healthcare professionals), and 3) modification of the data collection form. In addition, healthcare facilities joining clinical databases are expected to consider ethical issues and information security. Database participants should check with their ethical review boards and provide a consultation service for patients.

  4. Read Code Quality Assurance

    PubMed Central

    Schulz, Erich; Barrett, James W.; Price, Colin

    1998-01-01

    As controlled clinical vocabularies assume an increasing role in modern clinical information systems, so the issue of their quality demands greater attention. In order to meet the resulting stringent criteria for completeness and correctness, a quality assurance system comprising a database of more than 500 rules is being developed and applied to the Read Thesaurus. The authors discuss the requirement to apply quality assurance processes to their dynamic editing database in order to ensure the quality of exported products. Sources of errors include human, hardware, and software factors as well as new rules and transactions. The overall quality strategy includes prevention, detection, and correction of errors. The quality assurance process encompasses simple data specification, internal consistency, inspection procedures and, eventually, field testing. The quality assurance system is driven by a small number of tables and UNIX scripts, with “business rules” declared explicitly as Structured Query Language (SQL) statements. Concurrent authorship, client-server technology, and an initial failure to implement robust transaction control have all provided valuable lessons. The feedback loop for error management needs to be short. PMID:9670131

  5. The Role of Free/Libre and Open Source Software in Learning Health Systems.

    PubMed

    Paton, C; Karopka, T

    2017-08-01

    Objective: To give an overview of the role of Free/Libre and Open Source Software (FLOSS) in the context of secondary use of patient data to enable Learning Health Systems (LHSs). Methods: We conducted an environmental scan of the academic and grey literature utilising the MedFLOSS database of open source systems in healthcare to inform a discussion of the role of open source in developing LHSs that reuse patient data for research and quality improvement. Results: A wide range of FLOSS is identified that contributes to the information technology (IT) infrastructure of LHSs including operating systems, databases, frameworks, interoperability software, and mobile and web apps. The recent literature around the development and use of key clinical data management tools is also reviewed. Conclusions: FLOSS already plays a critical role in modern health IT infrastructure for the collection, storage, and analysis of patient data. The nature of FLOSS systems to be collaborative, modular, and modifiable may make open source approaches appropriate for building the digital infrastructure for a LHS. Georg Thieme Verlag KG Stuttgart.

  6. Read Code quality assurance: from simple syntax to semantic stability.

    PubMed

    Schulz, E B; Barrett, J W; Price, C

    1998-01-01

    As controlled clinical vocabularies assume an increasing role in modern clinical information systems, so the issue of their quality demands greater attention. In order to meet the resulting stringent criteria for completeness and correctness, a quality assurance system comprising a database of more than 500 rules is being developed and applied to the Read Thesaurus. The authors discuss the requirement to apply quality assurance processes to their dynamic editing database in order to ensure the quality of exported products. Sources of errors include human, hardware, and software factors as well as new rules and transactions. The overall quality strategy includes prevention, detection, and correction of errors. The quality assurance process encompasses simple data specification, internal consistency, inspection procedures and, eventually, field testing. The quality assurance system is driven by a small number of tables and UNIX scripts, with "business rules" declared explicitly as Structured Query Language (SQL) statements. Concurrent authorship, client-server technology, and an initial failure to implement robust transaction control have all provided valuable lessons. The feedback loop for error management needs to be short.
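
    As a concrete, entirely hypothetical sketch of the pattern both of these abstracts describe, the snippet below stores "business rules" as SQL statements in a table and executes each rule against the editing database, treating any returned rows as violations. The table and rule names are invented for illustration, and sqlite3 stands in for the actual system.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE terms (code TEXT PRIMARY KEY, description TEXT);
-- Business rules declared as data: each rule is a SELECT that
-- returns the rows violating it.
CREATE TABLE qa_rules (rule_name TEXT, sql TEXT);
""")
conn.execute("INSERT INTO terms VALUES ('X10', 'Asthma')")
conn.execute("INSERT INTO terms VALUES ('X11', '')")  # faulty entry
conn.execute(
    "INSERT INTO qa_rules VALUES (?, ?)",
    ("non_empty_description",
     "SELECT code FROM terms WHERE description = ''"),
)

# Run every declared rule and report violations, keeping the
# error-management feedback loop short.
for name, sql in conn.execute("SELECT rule_name, sql FROM qa_rules"):
    violations = conn.execute(sql).fetchall()
    if violations:
        print(f"rule {name} failed for: {violations}")
```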

  7. Database Objects vs Files: Evaluation of alternative strategies for managing large remote sensing data

    NASA Astrophysics Data System (ADS)

    Baru, Chaitan; Nandigam, Viswanath; Krishnan, Sriram

    2010-05-01

    Increasingly, the geoscience user community expects modern IT capabilities to be available in service of their research and education activities, including the ability to easily access and process large remote sensing datasets via online portals such as GEON (www.geongrid.org) and OpenTopography (opentopography.org). However, serving such datasets via online data portals presents a number of challenges. In this talk, we will evaluate the pros and cons of alternative storage strategies for management and processing of such datasets, using binary large object (BLOB) implementations in database systems versus implementation as Hadoop files using the Hadoop Distributed File System (HDFS). The storage and I/O requirements for providing online access to large datasets dictate the need for declustering data across multiple disks, for capacity as well as for bandwidth and response-time performance. This requires partitioning larger files into a set of smaller files, and is accompanied by the concomitant requirement of managing large numbers of files. Storing these sub-files as BLOBs in a shared-nothing database implemented across a cluster provides the advantage that all the distributed storage management is done by the DBMS. Furthermore, subsetting and processing routines can be implemented as user-defined functions (UDFs) on these BLOBs and would run in parallel across the set of nodes in the cluster. On the other hand, such an implementation brings both storage overheads and constraints, and software licensing dependencies. Another approach is to store the files in an external filesystem, with pointers to them from within database tables. The filesystem may be a regular UNIX filesystem, a parallel filesystem, or HDFS. In the HDFS case, HDFS would provide the file management capability, while the subsetting and processing routines would be implemented as Hadoop programs using the MapReduce model. Hadoop and its related software libraries are freely available. Another consideration is the strategy used for partitioning large data collections, and large datasets within collections, using round-robin, hash, or range partitioning methods. Each has different characteristics in terms of spatial locality of data and the resultant degree of declustering of the computations on the data. Furthermore, we have observed that, in practice, there can be large variations in the frequency of access to different parts of a large data collection and/or dataset, thereby creating "hotspots" in the data. We will evaluate the ability of the different approaches to deal effectively with such hotspots, and alternative strategies for handling them.
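
    The three partitioning methods mentioned above differ only in how a tile or sub-file is mapped to a storage node. The toy functions below illustrate the locality trade-off; the node count, key names and range boundaries are arbitrary assumptions, not values from the talk.

```python
import hashlib

NODES = 4  # number of storage nodes in the hypothetical cluster

def round_robin(tile_index):
    # Even spread by arrival order, but no spatial locality at all.
    return tile_index % NODES

def hash_partition(tile_key):
    # Deterministic pseudo-random spread; destroys spatial locality,
    # which incidentally helps smooth out access "hotspots".
    digest = hashlib.md5(tile_key.encode()).hexdigest()
    return int(digest, 16) % NODES

def range_partition(x_coordinate, boundaries=(1000, 2000, 3000)):
    # Preserves spatial locality: neighbouring tiles share a node,
    # good for spatial subsetting but prone to hotspots.
    for node, bound in enumerate(boundaries):
        if x_coordinate < bound:
            return node
    return len(boundaries)

print(round_robin(7), hash_partition("tile_7_3"), range_partition(2500))
```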

  8. [Research of regional medical consumables reagent logistics system in the modern hospital].

    PubMed

    Wu, Jingjiong; Zhang, Yanwen; Luo, Xiaochen; Zhang, Qing; Zhu, Jianxin

    2013-09-01

    To explore logistics system management for medical consumable reagents in modern hospitals and their regions. Regional logistics is characterized by cooperation between medical institutions within the region and the organization of a wide range of specialized logistics activities, so as to rationalize the regional logistics of medical consumable reagents. A regional management system, dynamic management system, supply-chain information management system, after-sales service system, and assessment system were established. Based on research into the existing medical market and medical resources, a regional directory of medical consumable reagents and its initial data were established. The emphasis is on the centralized dispatch of medical supplies and reagents, introducing a qualified logistics company for dispatching, improving hospital management efficiency, and reducing costs. A regional medical center and regional community health service centers constitute a regional logistics network; the introduction of logistics services for medical consumable reagents fully embodies the integrity, relevance, purposefulness, and environmental adaptability characteristic of regional distribution. Modern logistics distribution systems can increase the efficiency of regional medical consumable and reagent management and reduce costs.

  9. Comparative Analysis of Data Structures for Storing Massive Tins in a Dbms

    NASA Astrophysics Data System (ADS)

    Kumar, K.; Ledoux, H.; Stoter, J.

    2016-06-01

    Point cloud data are an important source of 3D geoinformation. Modern 3D data acquisition and processing techniques, such as airborne laser scanning and multi-beam echosounding, generate billions of 3D points for an area of just a few square kilometers. With the size of point clouds exceeding the billion mark even for a small area, there is a need for their efficient storage and management; these point clouds are sometimes associated with attributes and constraints as well. Storing billions of 3D points is currently possible, as confirmed by the initial implementations in Oracle Spatial SDO_PC and the PostgreSQL Point Cloud extension. But to analyse and extract useful information from point clouds, we need more than just points, i.e., we require the surface defined by these points in space. There are different ways to represent surfaces in GIS, including grids, TINs, boundary representations, etc. In this study, we investigate database solutions for the storage and management of massive TINs. The classical (face- and edge-based) and compact (star-based) data structures are discussed at length with reference to their structure, advantages, and limitations in handling massive triangulations, and are compared with the current solution of PostGIS Simple Feature. The main test dataset is the TIN generated from the third national elevation model of the Netherlands (AHN3), with a point density of over 10 points/m². A PostgreSQL/PostGIS DBMS is used for storing the generated TIN. The data structures are tested with the generated TIN models to account for their geometry, topology, storage, indexing, and loading time in a database. Our study is useful in identifying the limitations of the existing data structures for storing massive TINs and what is required to optimise these structures for managing massive triangulations in a database.
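
    As a rough illustration of the compact structure evaluated above, the sketch below stores a tiny triangulation star-based: each vertex keeps an ordered list of its adjacent vertices (its "star"), and triangles are reconstructed on the fly from consecutive star entries. This encoding is a simplification for illustration, not the exact schema tested in the study.

```python
# Star-based storage of a toy TIN: four vertices, two triangles
# (0,1,2) and (1,3,2). Coordinates are (x, y, z).
vertices = {
    0: (0.0, 0.0, 5.2),
    1: (1.0, 0.0, 5.9),
    2: (0.0, 1.0, 6.1),
    3: (1.0, 1.0, 6.4),
}

# For each vertex, its adjacent vertices in counter-clockwise order.
stars = {
    0: [1, 2],
    1: [3, 2, 0],
    2: [0, 1, 3],
    3: [2, 1],
}

def triangles_from_stars(stars):
    """Reconstruct each triangle exactly once from consecutive
    entries in the vertex stars; no explicit face table is stored."""
    seen = set()
    for v, star in stars.items():
        for a, b in zip(star, star[1:]):
            tri = tuple(sorted((v, a, b)))
            if tri not in seen:
                seen.add(tri)
                yield tri

print(list(triangles_from_stars(stars)))  # [(0, 1, 2), (1, 2, 3)]
```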

  10. Chemical and mineralogical data and processing methods management system prototype with application to study of the North Caucasus Blybsky Metamorphic Complexes metamorphism PT-condition

    NASA Astrophysics Data System (ADS)

    Ivanov, Stanislav; Kamzolkin, Vladimir; Konilov, Aleksandr; Aleshin, Igor

    2014-05-01

    There are many different methods of assessing the conditions of rock formation based on determining the composition of the constituent minerals. Our objective was to create a universal tool for processing mineral chemical analysis results and solving geothermobarometry problems, by creating a database of existing sensors and providing a user-friendly standard interface. Similar computer-assisted tools based upon large collections of sensors (geothermometers and geobarometers) are known; for example, the TPF project (Konilov A.N., 1999), a text-based sensor collection tool written in PASCAL. That application contained more than 350 different sensors and has been used widely in petrochemical studies (see A.N. Konilov, A.A. Grafchikov, V.I. Fonarev 2010 for review). Our prototype uses the TPF project concept and is designed with modern application development techniques, which allows better flexibility. The main components of the designed system are three connected datasets: a sensor collection (geothermometers, geobarometers, oxygen geobarometers, etc.), petrochemical data, and modeling results. All data are maintained by special management and visualization tools and reside in an SQL database. System utilities allow the user to import and export data in various file formats, edit records, and plot graphs. The sensor database contains up-to-date collections of known methods, and new sensors may be added by the user. The measurement database is filled in by the researcher. A user-friendly interface gives access to all available data and sensors, automates routine work, reduces the risk of common user mistakes, and simplifies information exchange between research groups. We used the prototype to evaluate peak pressure during the formation of garnet-amphibolite apoeclogites, gneisses and schists of the Blybsky metamorphic complex of the Front Range of the Northern Caucasus. In particular, our estimate of the formation pressure range (18 ± 4 kbar) agrees with independent research results. The reported study was partially supported by RFBR, research project No. 14-05-00615.

  11. Development of Innovative Business Model of Modern Manager's Qualities

    ERIC Educational Resources Information Center

    Yashkova, Elena V.; Sineva, Nadezda L.; Shkunova, Angelika A.; Bystrova, Natalia V.; Smirnova, Zhanna V.; Kolosova, Tatyana V.

    2016-01-01

    The paper defines a complex of manager's qualities based on theoretical and methodological analysis and synthesis methods, available national and world literature, research papers and publications. The complex approach methodology was used, which provides an innovative view of the development of modern manager's qualities. The methodological…

  12. Gingival Retraction Methods for Fabrication of Fixed Partial Denture: Literature Review

    PubMed Central

    S, Safari; Ma, Vossoghi Sheshkalani; Mi, Vossoghi Sheshkalani; F, Hoseini Ghavam; M, Hamedi

    2016-01-01

    Fixed dental prosthesis success requires appropriate impression taking of the prepared finish line. This is critical in both tooth-supported fixed prostheses (crowns and bridges) and implant-supported fixed prostheses (solid abutments). If the prepared finish line is adjacent to the gingival sulcus, gingival retraction techniques should be used to decrease the marginal discrepancy between the restoration and the prepared abutment. Accurate marginal positioning of the restoration on the prepared finish line of the abutment is required for therapeutic, preventive and aesthetic purposes. In this article, conventional and modern methods of gingival retraction in fixed tooth-supported and fixed implant-supported prostheses are presented. PubMed and Google Scholar databases were searched manually for studies on gingival tissue management prior to impression making in fixed dental prosthesis since 1975. Conclusions were extracted and summarized. Keywords were impression making, gingival retraction, cordless retraction, and implant. Gingival retraction techniques can be classified as mechanical, chemical or surgical. In this article, different gingival management techniques are discussed. PMID:28959744

  13. Gingival Retraction Methods for Fabrication of Fixed Partial Denture: Literature Review.

    PubMed

    S, Safari; Ma, Vossoghi Sheshkalani; Mi, Vossoghi Sheshkalani; F, Hoseini Ghavam; M, Hamedi

    2016-06-01

    Fixed dental prosthesis success requires appropriate impression taking of the prepared finish line. This is critical in both tooth-supported fixed prostheses (crowns and bridges) and implant-supported fixed prostheses (solid abutments). If the prepared finish line is adjacent to the gingival sulcus, gingival retraction techniques should be used to decrease the marginal discrepancy between the restoration and the prepared abutment. Accurate marginal positioning of the restoration on the prepared finish line of the abutment is required for therapeutic, preventive and aesthetic purposes. In this article, conventional and modern methods of gingival retraction in fixed tooth-supported and fixed implant-supported prostheses are presented. PubMed and Google Scholar databases were searched manually for studies on gingival tissue management prior to impression making in fixed dental prosthesis since 1975. Conclusions were extracted and summarized. Keywords were impression making, gingival retraction, cordless retraction, and implant. Gingival retraction techniques can be classified as mechanical, chemical or surgical. In this article, different gingival management techniques are discussed.

  14. A Cloud-Based Global Flood Disaster Community Cyber-Infrastructure: Development and Demonstration

    NASA Technical Reports Server (NTRS)

    Wan, Zhanming; Hong, Yang; Khan, Sadiq; Gourley, Jonathan; Flamig, Zachary; Kirschbaum, Dalia; Tang, Guoqiang

    2014-01-01

    Flood disasters have significant impacts on the development of communities globally. This study describes a public cloud-based flood cyber-infrastructure (CyberFlood) that collects, organizes, visualizes, and manages several global flood databases for authorities and the public in real time, providing location-based event visualization as well as statistical analysis and graphing capabilities. In order to expand and update the existing flood inventory, a crowdsourcing data collection methodology is employed that allows members of the public with smartphones or Internet access to report new flood events; it is also intended to engage citizen-scientists so that they may become motivated and educated about the latest developments in satellite remote sensing and hydrologic modeling technologies. Our shared vision is to better serve the global water community with comprehensive flood information, aided by state-of-the-art cloud computing and crowdsourcing technology. CyberFlood presents an opportunity to eventually modernize the existing paradigm used to collect, manage, analyze, and visualize water-related disasters.

  15. Short Fiction on Film: A Relational DataBase.

    ERIC Educational Resources Information Center

    May, Charles

    Short Fiction on Film is a database that was created and will run on DataRelator, a relational database manager created by Bill Finzer for the California State Department of Education in 1986. DataRelator was designed for use in teaching students database management skills and to provide teachers with examples of how a database manager might be…

  16. 47 CFR 0.241 - Authority delegated.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... database functions for unlicensed devices operating in the television broadcast bands (TV bands) as set... methods that will be used to designate TV bands database managers, to designate these database managers; to develop procedures that these database managers will use to ensure compliance with the...

  17. 47 CFR 0.241 - Authority delegated.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... database functions for unlicensed devices operating in the television broadcast bands (TV bands) as set... methods that will be used to designate TV bands database managers, to designate these database managers; to develop procedures that these database managers will use to ensure compliance with the...

  18. 47 CFR 0.241 - Authority delegated.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... database functions for unlicensed devices operating in the television broadcast bands (TV bands) as set... methods that will be used to designate TV bands database managers, to designate these database managers; to develop procedures that these database managers will use to ensure compliance with the...

  19. Report of the matrix of biological knowledge workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morowitz, H.J.; Smith, T.

    1987-10-30

    Current understanding of biology involves complex relationships rooted in enormous amounts of data. These data include entries from biochemistry, ecology, genetics, human and veterinary medicine, molecular structure studies, agriculture, embryology, systematics, and many other disciplines. The present wealth of biological data goes beyond past accumulations and now includes new understandings from molecular biology. Several important biological databases are currently being supported, and more are planned; however, major problems of interdatabase communication and management efficiency abound. Few scientists are currently capable of keeping up with this ever-increasing wealth of knowledge, let alone searching it efficiently for new or unsuspected links and important analogies. Yet this is what is required if the continued rapid generation of such data is to lead most effectively to the major conceptual, medical, and agricultural advances anticipated over the coming decades in the United States. The opportunity exists to combine the potential of modern computer science, database management, and artificial intelligence in a major effort to organize the vast wealth of biological and clinical data. The time is right because the amount of data is still manageable even in its current highly fragmented form; important hardware and computer science tools have been greatly improved; and there have been recent fundamental advances in our comprehension of biology. The latter is particularly true at the molecular level, where the information for nearly all higher structure and function is encoded. The organization of all biological experimental data coordinately within a structure incorporating our current understanding - the Matrix of Biological Knowledge - will provide the data and structure for the major advances foreseen in the years ahead.

  20. Providing Database Services in a Nationwide Research Organisation--Coexistence of Traditional Information Services and a Modern CD-ROM/Online Hybrid Solution.

    ERIC Educational Resources Information Center

    Bowman, Benjamin F.

    For the past two decades the central Information Retrieval Services of the Max Planck Society has been providing database searches for scientists in Max Planck Institutes and Research Groups throughout Germany. As a supplement to traditional search services offered by professional intermediaries, they have recently fostered the introduction of a…

  1. Microcomputer Database Management Systems for Bibliographic Data.

    ERIC Educational Resources Information Center

    Pollard, Richard

    1986-01-01

    Discusses criteria for evaluating microcomputer database management systems (DBMS) used for storage and retrieval of bibliographic data. Two popular types of microcomputer DBMS--file management systems and relational database management systems--are evaluated with respect to these criteria. (Author/MBR)

  2. The Data Base and Decision Making in Public Schools.

    ERIC Educational Resources Information Center

    Hedges, William D.

    1984-01-01

    Describes generic types of databases--file management systems, relational database management systems, and network/hierarchical database management systems--with their respective strengths and weaknesses; discusses factors to be considered in determining whether a database is desirable; and provides evaluative criteria for use in choosing…

  3. Perioperative management of endocrine insufficiency after total pancreatectomy for neoplasia.

    PubMed

    Maker, Ajay V; Sheikh, Raashid; Bhagia, Vinita

    2017-09-01

    Indications for total pancreatectomy (TP) have increased, including for diffuse main-duct intraductal papillary mucinous neoplasms of the pancreas and malignancy; therefore, the need persists for surgeons to develop appropriate endocrine post-operative management strategies. The brittle diabetes after TP differs from type 1/2 diabetes in that patients have an absolute deficiency of insulin and functional glucagon. This makes glucose management challenging, complicates recovery, and predisposes to hospital readmissions. This article aims to define the disease, describe the cause of its occurrence, review the anatomy of the endocrine pancreas, and explain how this condition differs from diabetes mellitus in the setting of post-operative management. The morbidity and mortality of post-TP endocrine insufficiency and practical treatment strategies are systematically reviewed from the literature. Finally, an evidence-based treatment algorithm is created for the practicing pancreatic surgeon and their care team of endocrinologists to aid in managing these complex patients. A PubMed, Science Citation Index/Social Sciences Citation Index, and Cochrane Evidence-Based Medicine database search was undertaken, along with an extensive backward search of the references of published articles, to identify studies evaluating endocrine morbidity and treatment after TP and to establish an evidence-based treatment strategy. Indications for TP and the etiology of pancreatogenic diabetes are reviewed. After TP, ~80% of patients develop hypoglycemic episodes and 40% experience severe hypoglycemia, resulting in 0-8% mortality and 25-45% morbidity. Referral to a nutritionist and endocrinologist for patient education before surgery, followed by surgical reevaluation to determine whether the patient has the appropriate understanding, support, and resources preoperatively, has significantly reduced morbidity and mortality. The use of modern recombinant long-acting insulin analogues, continuous subcutaneous insulin infusion, and glucagon rescue therapy has greatly improved management in the modern era and constitutes the current standard of care. A simple immediate post-operative algorithm was constructed. Successful perioperative management of total pancreatectomy and the resulting pancreatogenic diabetes is critical to achieving acceptable post-operative outcomes; we review the pertinent literature and provide a simple, evidence-based algorithm for immediate post-resection glycemic control.

  4. Socioeconomic thresholds that affect use of customary fisheries management tools.

    PubMed

    Cinner, Joshua E; Sutton, Stephen G; Bond, Trevor G

    2007-12-01

    Customary forms of resource management, such as taboos, have received considerable attention as a potential basis for conservation initiatives in the Indo-Pacific. Yet little is known about how socioeconomic factors influence the ability of communities to use customary management practices and whether socioeconomic transformations within communities will weaken conservation initiatives with a customary foundation. We used a comparative approach to examine how socioeconomic factors may influence whether communities use customary fisheries management in Papua New Guinea. We examined levels of material wealth (modernization), dependence on marine resources, population, and distance to market in 15 coastal communities. We compared these socioeconomic conditions in 5 communities that used a customary method of closing their fishing ground with 10 communities that did not use this type of management. There were apparent threshold levels of dependence on marine resources, modernization, distance to markets (<16.5 km), and population (>600 people) beyond which communities did not use customary fisheries closures. Nevertheless, economic inequality, rather than mean modernization levels seemed to influence the use of closures. Our results suggest that customary management institutions are not resilient to factors such as population growth and economic modernization. If customary management is to be used as a basis for modern conservation initiatives, cross-scale institutional arrangements such as networks and bridging organizations may be required to help filter the impacts of socioeconomic transformations.

  5. 47 CFR 52.101 - General definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...

  6. 47 CFR 52.101 - General definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...

  7. 47 CFR 52.101 - General definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...

  8. 47 CFR 52.101 - General definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...

  9. 47 CFR 52.101 - General definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...

  10. 47 CFR 0.241 - Authority delegated.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... individual database managers; and to perform other functions as needed for the administration of the TV bands... database functions for unlicensed devices operating in the television broadcast bands (TV bands) as set... methods that will be used to designate TV bands database managers, to designate these database managers...

  11. Flexible use and technique extension of logistics management

    NASA Astrophysics Data System (ADS)

    Xiong, Furong

    2011-10-01

    As is well known, modern logistics originated in the United States, developed in Japan, matured in Europe, and is expanding in China; this is the recognized historical development track of modern logistics. Owing to China's economic and technological development, and with the construction of the Shanghai International Shipping Center and the Shanghai Yangshan international deepwater port, China's modern logistics industry will attain leap-forward development at a strong pace and catch up with the level of modern logistics in developed Western countries. In this paper, the author explores the flexible use and extension of modern logistics management techniques in China, which has practical and guiding significance.

  12. A review of post-modern management techniques as currently applied to Turkish forestry.

    PubMed

    Dölarslan, Emre Sahin

    2009-01-01

    This paper reviews the effects of six post-modern management concepts as applied to Turkish forestry. Up to now, Turkish forestry has been constrained, both in terms of its operations and internal organization, by a highly bureaucratic system. The application of new thinking in forestry management, however, has recently resulted in new organizational and production concepts that promise to address problems specific to this Turkish industry and bring about positive changes. This paper will elucidate these specific issues and demonstrate how post-modern management thinking is influencing the administration and operational capacity of Turkish forestry within its current structure.

  13. 77 FR 16551 - Standards for Private Laboratory Analytical Packages and Introduction to Laboratory Related...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-21

    ... Food Safety Modernization Act for Private Laboratory Managers AGENCY: Food and Drug Administration, HHS... Food Safety Modernization Act for Private Laboratory Managers.'' The topic to be discussed is the...

  14. The Neotoma Paleoecology Database

    NASA Astrophysics Data System (ADS)

    Grimm, E. C.; Ashworth, A. C.; Barnosky, A. D.; Betancourt, J. L.; Bills, B.; Booth, R.; Blois, J.; Charles, D. F.; Graham, R. W.; Goring, S. J.; Hausmann, S.; Smith, A. J.; Williams, J. W.; Buckland, P.

    2015-12-01

    The Neotoma Paleoecology Database (www.neotomadb.org) is a multiproxy, open-access, relational database that includes fossil data for the past 5 million years (the late Neogene and Quaternary Periods). Modern distributional data for various organisms are also being made available for calibration and paleoecological analyses. The project is a collaborative effort among individuals from more than 20 institutions worldwide, including domain scientists representing a spectrum of Pliocene-Quaternary fossil data types, as well as experts in information technology. Working groups are active for diatoms, insects, ostracodes, pollen and plant macroscopic remains, testate amoebae, rodent middens, vertebrates, age models, geochemistry and taphonomy. Groups are also active in developing online tools for data analyses and for developing modules for teaching at different levels. A key design concept of NeotomaDB is that stewards for various data types are able to remotely upload and manage data. Cooperatives for different kinds of paleo data, or from different regions, can appoint their own stewards. Over the past year, much progress has been made on development of the steward software-interface that will enable this capability. The steward interface uses web services that provide access to the database. More generally, these web services enable remote programmatic access to the database, which both desktop and web applications can use and which provide real-time access to the most current data. Use of these services can alleviate the need to download the entire database, which can be out-of-date as soon as new data are entered. In general, the Neotoma web services deliver data either from an entire table or from the results of a view. Upon request, new web services can be quickly generated. Future developments will likely expand the spatial and temporal dimensions of the database. NeotomaDB is open to receiving new datasets and stewards from the global Quaternary community. Research is supported by NSF EAR-0622349.
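
    Remote programmatic access of the kind described might look like the sketch below. The endpoint path and query parameter are assumptions made for illustration rather than a documented contract, so the live API documentation at www.neotomadb.org should be consulted before relying on them.

```python
import json
import urllib.parse
import urllib.request

# Hypothetical example of calling a Neotoma-style web service that
# returns JSON for a site search; the exact path and parameter names
# are assumptions for this sketch.
BASE = "https://api.neotomadb.org/v2.0/data"

def fetch_sites(sitename):
    """Query the (assumed) sites endpoint and decode the JSON reply."""
    url = f"{BASE}/sites?sitename={urllib.parse.quote(sitename)}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

# Uncomment to run against the live service:
# payload = fetch_sites("Bear Lake")
# print(payload)
```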

  15. Database Management Systems: New Homes for Migrating Bibliographic Records.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.; Bierbaum, Esther G.

    1987-01-01

    Assesses bibliographic databases as part of visionary text systems such as hypertext and scholars' workstations. Downloading is discussed in terms of the capability to search records and to maintain unique bibliographic descriptions, and relational database management systems, file managers, and text databases are reviewed as possible hosts for…

  16. Through Kazan ASPERA to Modern Projects

    NASA Astrophysics Data System (ADS)

    Gusev, Alexander; Kitiashvili, Irina; Petrova, Natasha

    The European Union is now forming the Sixth Framework Programme. One of the objectives of the EU Programme is the opening up of national research and training programmes. Russian PhD students and young astronomers face administrative and financial difficulties in accessing modern databases and astronomical projects, and so they have not been included in the European overview of priorities. Modern requirements for organizing observation projects on powerful telescopes assume painstaking computational preparation of the application; rigid competition for observation time assumes preliminary computer modeling of the target object if the application is to succeed. Kazan AstroGeoPhysics Partnership

  17. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...

  18. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...

  19. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...

  20. Selecting Data-Base Management Software for Microcomputers in Libraries and Information Units.

    ERIC Educational Resources Information Center

    Pieska, K. A. O.

    1986-01-01

    Presents a model for the evaluation of database management systems software from the viewpoint of librarians and information specialists. The properties of data management systems, database management systems, and text retrieval systems are outlined and compared. (10 references) (CLB)

  1. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...

  2. Construction of databases: advances and significance in clinical research.

    PubMed

    Long, Erping; Huang, Bingjie; Wang, Liming; Lin, Xiaoyu; Lin, Haotian

    2015-12-01

    Widely used in clinical research, the database is a new type of data management automation technology and the most efficient tool for data management. In this article, we first explain some basic concepts, such as the definition, classification, and establishment of databases. Afterward, the workflow for establishing databases, inputting data, verifying data, and managing databases is presented. Meanwhile, by discussing the application of databases in clinical research, we illuminate the important role of databases in clinical research practice. Lastly, we introduce the reanalysis of randomized controlled trials (RCTs) and cloud computing techniques, showing the most recent advancements of databases in clinical research.
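
    As one small illustration of the data verification step in the workflow above, a standard technique (chosen here for illustration, not necessarily the article's method) is double data entry: two independently keyed copies of each record are compared field by field, and discrepancies are flagged for manual review. Record and field names below are hypothetical.

```python
# Double-entry verification: compare two independently keyed copies of
# the same records and flag mismatching fields for manual review.
entry_a = {"pt001": {"age": 54, "sbp": 132}, "pt002": {"age": 61, "sbp": 118}}
entry_b = {"pt001": {"age": 54, "sbp": 123}, "pt002": {"age": 61, "sbp": 118}}

def discrepancies(a, b):
    """Yield (record, field, first value, second value) for mismatches."""
    for record_id in sorted(set(a) | set(b)):
        fields_a, fields_b = a.get(record_id, {}), b.get(record_id, {})
        for field in sorted(set(fields_a) | set(fields_b)):
            if fields_a.get(field) != fields_b.get(field):
                yield record_id, field, fields_a.get(field), fields_b.get(field)

for rec, field, va, vb in discrepancies(entry_a, entry_b):
    print(f"{rec}.{field}: first entry {va!r} vs second entry {vb!r}")
```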

  3. An automated calibration laboratory for flight research instrumentation: Requirements and a proposed design approach

    NASA Technical Reports Server (NTRS)

    Oneill-Rood, Nora; Glover, Richard D.

    1990-01-01

    NASA's Dryden Flight Research Facility (Ames-Dryden) operates a diverse fleet of research aircraft which are heavily instrumented to provide both real-time data for in-flight monitoring and recorded data for postflight analysis. Ames-Dryden's existing automated calibration (AUTOCAL) laboratory is a computerized facility which tests aircraft sensors to certify accuracy for anticipated harsh flight environments. Recently, a major AUTOCAL lab upgrade was initiated; the goal of this modernization is to enhance productivity and improve configuration management for both software and test data. The new system will have multiple testing stations employing distributed processing linked by a local area network to a centralized database. The baseline requirements for the new AUTOCAL lab and the design approach being taken for its mechanization are described.

  4. [The future of clinical laboratory database management system].

    PubMed

    Kambe, M; Imidy, D; Matsubara, A; Sugimoto, Y

    1999-09-01

    To assess the present status of the clinical laboratory database management system, the difference between the Clinical Laboratory Information System and Clinical Laboratory System was explained in this study. Although three kinds of database management systems (DBMS) were shown including the relational model, tree model and network model, the relational model was found to be the best DBMS for the clinical laboratory database based on our experience and developments of some clinical laboratory expert systems. As a future clinical laboratory database management system, the IC card system connected to an automatic chemical analyzer was proposed for personal health data management and a microscope/video system was proposed for dynamic data management of leukocytes or bacteria.

  5. Plantago major in Traditional Persian Medicine and modern phytotherapy: a narrative review.

    PubMed

    Najafian, Younes; Hamedi, Shokouh Sadat; Farshchi, Masoumeh Kaboli; Feyzabadi, Zohre

    2018-02-01

    Plantago major has been used widely since ancient times to manage a wide range of diseases including constipation, coughs and wounds. The aim of this study is to review the traditional applications, botanical characterization, pharmacological activities, phytochemistry and toxicity of Plantago major. In this review study, medicinal properties of Plantago major are collected from credible pharmacopeias and textbooks of traditional Persian medicine (TPM) belonging to the 10th-18th centuries AD, such as "The Canon of Medicine", "Makhzan-Al-Advia" and so on. Moreover, electronic databases including Scopus, Medline and Web of Science were explored for this purpose. Plantago major has been prescribed in various forms, such as roasted seeds, decoction, syrup, liniment, gargle, rectal enema, vaginal suppository, and eye and nasal drops, for each illness by TPM scholars. Some of its traditional properties, including wound healing, antipyretic, antitussive, anti-infective, anti-hemorrhagic, anti-inflammatory, diuretic, laxative, astringent and hemostatic effects, have been confirmed in recent research. Phytochemical investigations showed that Plantago major contains volatile compounds, triterpenoids, phenolic acids and flavonoids. Modern pharmacological studies have proven some of the traditional applications of Plantago major. Nevertheless, more investigations are required on this plant, because it has the potential to be used to produce various natural medications.

  6. 23 CFR 971.204 - Management systems requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...

  7. 23 CFR 971.204 - Management systems requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...

  8. 23 CFR 971.204 - Management systems requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...

  9. 23 CFR 971.204 - Management systems requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...

  10. MIRO and IRbase: IT Tools for the Epidemiological Monitoring of Insecticide Resistance in Mosquito Disease Vectors

    PubMed Central

    Dialynas, Emmanuel; Topalis, Pantelis; Vontas, John; Louis, Christos

    2009-01-01

    Background Monitoring of insect vector populations with respect to their susceptibility to one or more insecticides is a crucial element of the strategies used for the control of arthropod-borne diseases. This management task can nowadays be achieved more efficiently when assisted by IT (Information Technology) tools, ranging from modern integrated databases to GIS (Geographic Information System). Here we describe an application ontology that we developed de novo, and a specially designed database that, based on this ontology, can be used for the purpose of controlling mosquitoes and, thus, the diseases that they transmit. Methodology/Principal Findings The ontology, named MIRO for Mosquito Insecticide Resistance Ontology, developed using the OBO-Edit software, describes all pertinent aspects of insecticide resistance, including specific methodology and mode of action. MIRO, then, forms the basis for the design and development of a dedicated database, IRbase, constructed using open source software, which can be used to retrieve data on mosquito populations in a temporally and spatially separate way, as well as to map the output using a Google Earth interface. The dependency of the database on the MIRO allows for a rational and efficient hierarchical search possibility. Conclusions/Significance The fact that the MIRO complies with the rules set forward by the OBO (Open Biomedical Ontologies) Foundry introduces cross-referencing with other biomedical ontologies and, thus, both MIRO and IRbase are suitable as parts of future comprehensive surveillance tools and decision support systems that will be used for the control of vector-borne diseases. MIRO is downloadable from and IRbase is accessible at VectorBase, the NIAID-sponsored open access database for arthropod vectors of disease. PMID:19547750
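
    The rational, hierarchical search that grounding the database in an ontology enables can be sketched in a few lines: a query for a broad concept is expanded over is_a links to all of its descendants before the database is consulted. The term names below are invented stand-ins, not actual MIRO identifiers.

```python
# Toy is_a hierarchy in the spirit of an insecticide-resistance
# ontology; the term names are invented for illustration.
IS_A = {
    "pyrethroid resistance": "insecticide resistance",
    "DDT resistance": "insecticide resistance",
    "kdr mutation": "pyrethroid resistance",
}

def descendants(term):
    """Return `term` plus all terms that are transitively a kind of it."""
    found = {term}
    changed = True
    while changed:
        changed = False
        for child, parent in IS_A.items():
            if parent in found and child not in found:
                found.add(child)
                changed = True
    return found

# A query for the broad concept automatically covers specific records.
print(descendants("insecticide resistance"))
# {'insecticide resistance', 'pyrethroid resistance',
#  'DDT resistance', 'kdr mutation'}
```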

  11. Information Management Tools for Classrooms: Exploring Database Management Systems. Technical Report No. 28.

    ERIC Educational Resources Information Center

    Freeman, Carla; And Others

    In order to understand how database software or online databases functioned in the overall curricula, the use of database management systems (DBMS) was studied at eight elementary and middle schools through classroom observation and interviews with teachers and administrators, librarians, and students. Three overall areas were addressed:…

  12. 78 FR 28756 - Defense Federal Acquisition Regulation Supplement: System for Award Management Name Changes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... Excluded Parties Listing System (EPLS) databases into the System for Award Management (SAM) database. DATES... combined the functional capabilities of the CCR, ORCA, and EPLS procurement systems into the SAM database... identification number and the type of organization from the System for Award Management database. 3. Revise the...

  13. An Examination of Selected Software Testing Tools: 1992

    DTIC Science & Technology

    1992-12-01

    Figure 27-17. Metrics Manager Database Full Report... historical test database: the test management and problem reporting tools were examined using the sample test database provided by each supplier. ...track the impact of new methods, organizational structures, and technologies. Metrics Manager is supported by an industry database that allows

  14. Crustose coralline algae increased framework and diversity on ancient coral reefs.

    PubMed

    Weiss, Anna; Martindale, Rowan C

    2017-01-01

    Crustose coralline algae (CCA) are key producers of carbonate sediment on reefs today. Despite their importance in modern reef ecosystems, the long-term relationship of CCA with reef development has not been quantitatively assessed in the fossil record. This study includes data from 128 Cenozoic coral reefs collected from the Paleobiology Database, the Paleoreefs Database, as well as the original literature and assesses the correlation of CCA abundance with taxonomic diversity (both corals and reef dwellers) and framework of fossil coral reefs. Chi-squared tests show reef type is significantly correlated with CCA abundance and post-hoc tests indicate higher involvement of CCA is associated with stronger reef structure. Additionally, general linear models show coral reefs with higher amounts of CCA had a higher diversity of reef-dwelling organisms. These data have important implications for paleoecology as they demonstrate that CCA increased building capacity, structural integrity, and diversity of ancient coral reefs. The analyses presented here demonstrate that the function of CCA on modern coral reefs is similar to their function on Cenozoic reefs; thus, studies of ancient coral reef collapse are even more meaningful as modern analogues.

  15. NNDC Data Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuli, J. K.; Sonzogni, A.

    The National Nuclear Data Center (NNDC) has provided remote access to the nuclear physics databases it maintains and to other resources since 1986. The major databases and other resources currently available through the NNDC Web site are summarized. With considerable innovation, access is now mostly through the Web. The NNDC Web pages have been modernized to provide a consistent, state-of-the-art style. The improved database services and other resources available from the NNDC site at www.nndc.bnl.gov will be described.

  16. Prevention and Treatment of Flatulence From a Traditional Persian Medicine Perspective.

    PubMed

    Larijani, Bagher; Esfahani, Mohammad Medhi; Moghimi, Maryam; Shams Ardakani, Mohammad Reza; Keshavarz, Mansoor; Kordafshari, Gholamreza; Nazem, Esmaiel; Hasani Ranjbar, Shirin; Mohammadi Kenari, Hoorieh; Zargaran, Arman

    2016-04-01

    The feeling of abdominal fullness, bloating, and movement of gas in the abdomen is a very uncomfortable sensation termed flatulence. Since flatulence is one of the most common gastrointestinal symptoms that is bothersome to patients, it is important to identify effective methods to resolve this issue. In modern medicine, management of flatulence is often not satisfactory. On the other hand, traditional systems of medicine can be considered good potential sources to find new approaches for preventing and treating flatulence. The aim of this study is to review flatulence treatments from a traditional Persian medicine (TPM) viewpoint. In this study, the reasons for flatulence and methods for its prevention and treatment are reviewed in traditional Persian medicine (TPM) texts and then related with evidence from modern medicine by searching in databases, including PubMed, Scopus, Google Scholar, and IranMedex. From a traditional Persian scholar viewpoint, one of the most important causes of flatulence is an incorrect manner of eating; valuable advice to correct bad eating habits will be illustrated. In addition, traditional practitioners describe some herbs and vegetables as well as herbal compounds that are effective food additives to relieve flatulence. The anti-flatulent effect of most of these herbs has been experimentally verified using modern medicine. Attention to TPM can lead to the identification of new preventive and curative approaches to avoid and treat flatulence. In addition, Persian viewpoints from the medieval era regarding flatulence are historically important.

  17. Prevention and Treatment of Flatulence From a Traditional Persian Medicine Perspective

    PubMed Central

    Larijani, Bagher; Esfahani, Mohammad Medhi; Moghimi, Maryam; Shams Ardakani, Mohammad Reza; Keshavarz, Mansoor; Kordafshari, Gholamreza; Nazem, Esmaiel; Hasani Ranjbar, Shirin; Mohammadi Kenari, Hoorieh; Zargaran, Arman

    2016-01-01

    Context The feeling of abdominal fullness, bloating, and movement of gas in the abdomen is a very uncomfortable sensation termed flatulence. Since flatulence is one of the most common gastrointestinal symptoms that is bothersome to patients, it is important to identify effective methods to resolve this issue. In modern medicine, management of flatulence is often not satisfactory. On the other hand, traditional systems of medicine can be considered good potential sources to find new approaches for preventing and treating flatulence. The aim of this study is to review flatulence treatments from a traditional Persian medicine (TPM) viewpoint. Evidence Acquisition In this study, the reasons for flatulence and methods for its prevention and treatment are reviewed in traditional Persian medicine (TPM) texts and then related with evidence from modern medicine by searching in databases, including PubMed, Scopus, Google Scholar, and IranMedex. Results From a traditional Persian scholar viewpoint, one of the most important causes of flatulence is an incorrect manner of eating; valuable advice to correct bad eating habits will be illustrated. In addition, traditional practitioners describe some herbs and vegetables as well as herbal compounds that are effective food additives to relieve flatulence. The anti-flatulent effect of most of these herbs has been experimentally verified using modern medicine. Conclusions Attention to TPM can lead to the identification of new preventive and curative approaches to avoid and treat flatulence. In addition, Persian viewpoints from the medieval era regarding flatulence are historically important. PMID:27275398

  18. Modernization and unification: Strategic goals for NASA STI program

    NASA Technical Reports Server (NTRS)

    Blados, W.; Cotter, Gladys A.

    1993-01-01

    Information is increasingly becoming a strategic resource in all societies and economies. The NASA Scientific and Technical Information (STI) Program has initiated a modernization program to address the strategic importance and changing characteristics of information. This modernization effort applies new technology to current processes to provide near-term benefits to the user. At the same time, we are developing a long-term modernization strategy designed to transition the program to a multimedia, global 'library without walls.' Notwithstanding this modernization program, it is recognized that no one information center can hope to collect all the relevant data. We see information and information systems changing and becoming more international in scope. We are finding that many nations are expending resources on national systems which duplicate each other. At the same time that this duplication exists, many useful sources of aerospace information are not being collected because of resource limitations. If nations cooperate to develop an international aerospace information system, resources can be used efficiently to cover expanded sources of information. We must consider forming a coalition to collect and provide access to disparate, multidisciplinary sources of information, and to develop standardized tools for documenting and manipulating this data and information. In view of recent technological developments in information science and technology, as well as the reality of scarce resources in all nations, it is time to explore the mutually beneficial possibilities offered by cooperation and international resource sharing. International resources need to be mobilized in a coordinated manner to move us towards this goal. This paper reviews the NASA modernization program and raises for consideration new possibilities for unification of the various aerospace database efforts toward a cooperative international aerospace database initiative that can optimize the cost/benefit equation for all participants.

  19. Database Searching by Managers.

    ERIC Educational Resources Information Center

    Arnold, Stephen E.

    Managers and executives need the easy and quick access to business and management information that online databases can provide, but many have difficulty articulating their search needs to an intermediary. One possible solution would be to encourage managers and their immediate support staff members to search textual databases directly as they now…

  20. Information technology and public health management of disasters--a model for South Asian countries.

    PubMed

    Mathew, Dolly

    2005-01-01

    This paper highlights the use of information technology (IT) in disaster management and public health management of disasters. Effective health response to disasters will depend on three important lines of action: (1) disaster preparedness; (2) emergency relief; and (3) management of disasters. This is facilitated by the presence of modern communication and space technology, especially the Internet and remote sensing satellites. This has made the use of databases, knowledge bases, geographic information systems (GIS), management information systems (MIS), information transfer, and online connectivity possible in the area of disaster management and medicine. This paper suggests a conceptual model called "The Model for Public Health Management of Disasters for South Asia". This Model envisages the use of IT in the public health management of disasters by setting up the Health and Disaster Information Network and Internet Community Centers, which will facilitate cooperation among all those working in the areas of disaster management and emergency medicine. The suggested infrastructure would benefit the governments, non-government organizations, and institutions working in the areas of disaster and emergency medicine, as well as professionals, the community, and all others associated with disaster management and emergency medicine. The creation of such an infrastructure will enable the rapid transfer of information, data, knowledge, and online connectivity from top officials to grassroots organizations, and also among these countries regionally. This Model may be debated, modified, and tested further in the field to suit national and local conditions. It is hoped that this exercise will result in a viable and practical model for use in public health management of disasters by South Asian countries.

  1. Generalized Database Management System Support for Numeric Database Environments.

    ERIC Educational Resources Information Center

    Dominick, Wayne D.; Weathers, Peggy G.

    1982-01-01

    This overview of potential for utilizing database management systems (DBMS) within numeric database environments highlights: (1) major features, functions, and characteristics of DBMS; (2) applicability to numeric database environment needs and user needs; (3) current applications of DBMS technology; and (4) research-oriented and…

  2. STORM WATER MANAGEMENT MODEL (SWMM) MODERNIZATION

    EPA Science Inventory

    The U.S. Environmental Protection Agency's Water Supply and Water Resources Division in partnership with the consulting firm of CDM to redevelop and modernize the Storm Water Management Model (SWMM). In the initial phase of this project EPA rewrote SWMM's computational engine usi...

  3. Tsunami.gov: NOAA's Tsunami Information Portal

    NASA Astrophysics Data System (ADS)

    Shiro, B.; Carrick, J.; Hellman, S. B.; Bernard, M.; Dildine, W. P.

    2014-12-01

    We present the new Tsunami.gov website, which delivers a single authoritative source of tsunami information for the public and emergency management communities. The site efficiently merges information from NOAA's Tsunami Warning Centers (TWCs) by way of a comprehensive XML feed called Tsunami Event XML (TEX). The resulting unified view allows users to quickly see the latest tsunami alert status in geographic context without having to understand complex TWC areas of responsibility. The new site provides for the creation of a wide range of products beyond the traditional ASCII-based tsunami messages. The publication of modern formats such as Common Alerting Protocol (CAP) can drive geographically aware emergency alert systems like FEMA's Integrated Public Alert and Warning System (IPAWS). Supported are other popular information delivery systems, including email, text messaging, and social media updates. The Tsunami.gov portal allows NOAA staff to easily edit content and provides the facility for users to customize their viewing experience. In addition to access by the public, emergency managers and government officials may be offered the capability to log into the portal for special access rights to decision-making and administrative resources relevant to their respective tsunami warning systems. The site follows modern HTML5 responsive design practices for optimized use on mobile as well as non-mobile platforms. It meets all federal security and accessibility standards. Moving forward, we hope to expand Tsunami.gov to encompass tsunami-related content currently offered on separate websites, including the NOAA Tsunami Website, National Tsunami Hazard Mitigation Program, NOAA Center for Tsunami Research, National Geophysical Data Center's Tsunami Database, and National Data Buoy Center's DART Program. This project is part of the larger Tsunami Information Technology Modernization Project, which is consolidating the software architectures of NOAA's existing TWCs into a single system. We welcome your feedback to help Tsunami.gov become an effective public resource for tsunami information and a medium to enable better global tsunami warning coordination.
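
    A minimal sketch of reading the key fields of a CAP 1.2 alert of the kind such a portal can publish. The sample message is invented, not an actual TWC product, though the namespace and element names follow the CAP 1.2 standard.

```python
import xml.etree.ElementTree as ET

# Invented sample message; real TWC CAP products carry many more elements.
SAMPLE_CAP = """\
<alert xmlns="urn:oasis:names:tc:emergency:cap:1.2">
  <identifier>EXAMPLE-0001</identifier>
  <sent>2014-12-01T10:15:00-00:00</sent>
  <status>Exercise</status>
  <msgType>Alert</msgType>
  <info>
    <event>Tsunami Warning</event>
    <severity>Extreme</severity>
    <area><areaDesc>Example coastal zone</areaDesc></area>
  </info>
</alert>
"""

NS = {"cap": "urn:oasis:names:tc:emergency:cap:1.2"}

root = ET.fromstring(SAMPLE_CAP)
info = root.find("cap:info", NS)
print(root.findtext("cap:identifier", namespaces=NS))
print(info.findtext("cap:event", namespaces=NS),
      "/", info.findtext("cap:severity", namespaces=NS))
print(info.findtext("cap:area/cap:areaDesc", namespaces=NS))
```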

  4. Alternative treatment strategies for neuropathic pain: Role of Indian medicinal plants and compounds of plant origin-A review.

    PubMed

    Singh, Hasandeep; Bhushan, Sakshi; Arora, Rohit; Singh Buttar, Harpal; Arora, Saroj; Singh, Balbir

    2017-08-01

    Neuropathic pain is a complex, chronic pain state accompanied by tissue injury and nerve damage. This important health issue constitutes a challenge for modern medicine worldwide. The management of neuropathic pain remains a major clinical challenge, owing to an inadequate understanding of its pathophysiological mechanisms. Various classes of drugs have been reported effective for the management of neuropathic pain, viz. opiates, tricyclic antidepressants, and antiepileptic agents. However, the adverse effects associated with these drugs hinder their confident prescription in people with neuropathic pain. Recently, various medicinal plants have been reported effective for the management of neuropathic pain, so it may be prudent to look beyond synthetic drugs to agents with promising pharmacotherapeutic effects and fewer adverse effects. An extensive literature review was carried out using databases such as ScienceDirect, SciFinder, Wiley Online Library, PubMed, ResearchGate, Google Scholar, and Chemical Abstracts. A list of Traditional Indian Medicinal Plants (TIMPs) and isolated compounds that have been reported effective as alternative therapies for the management of neuropathic pain has been compiled. This should help researchers discover novel therapeutic agents against neuropathic pain. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  5. Software Application Profile: Opal and Mica: open-source software solutions for epidemiological data management, harmonization and dissemination

    PubMed Central

    Doiron, Dany; Marcon, Yannick; Fortier, Isabel; Burton, Paul; Ferretti, Vincent

    2017-01-01

    Motivation Improving the dissemination of information on existing epidemiological studies and facilitating the interoperability of study databases are essential to maximizing the use of resources and accelerating improvements in health. To address this, Maelstrom Research proposes Opal and Mica, two inter-operable open-source software packages providing out-of-the-box solutions for epidemiological data management, harmonization and dissemination. Implementation Opal and Mica are two standalone but inter-operable web applications written in Java, JavaScript and PHP. They provide web services and modern user interfaces to access them. General features Opal allows users to import, manage, annotate and harmonize study data. Mica is used to build searchable web portals disseminating study and variable metadata. When used conjointly, Mica users can securely query and retrieve summary statistics on geographically dispersed Opal servers in real-time. Integration with the DataSHIELD approach allows conducting more complex federated analyses involving statistical models. Availability Opal and Mica are open-source and freely available at [www.obiba.org] under a General Public License (GPL) version 3, and the metadata models and taxonomies that accompany them are available under a Creative Commons licence. PMID:29025122
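
    The federated pattern described here, retrieving summary statistics from geographically dispersed servers without moving row-level data, can be sketched abstractly. The per-site numbers below are invented, and no actual Opal API calls are shown.

```python
# Minimal sketch of the federated idea behind DataSHIELD-style analysis:
# each site reports only aggregates (n, sum), never row-level data, and
# the client pools them into an overall mean. Site data are invented.
SITES = {
    "site_a": [61.2, 58.9, 63.4],   # stands in for one remote data server
    "site_b": [57.1, 60.3],
}

def local_aggregate(values):
    """What a site would expose: counts and sums, not raw records."""
    return {"n": len(values), "sum": sum(values)}

def pooled_mean(aggregates):
    total_n = sum(a["n"] for a in aggregates)
    total_sum = sum(a["sum"] for a in aggregates)
    return total_sum / total_n

aggs = [local_aggregate(v) for v in SITES.values()]
print(round(pooled_mean(aggs), 2))  # 60.18, computed without pooling raw data
```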

  6. Psychology, technology, and diabetes management.

    PubMed

    Gonder-Frederick, Linda A; Shepard, Jaclyn A; Grabman, Jesse H; Ritterband, Lee M

    2016-10-01

    Use of technology in diabetes management is rapidly advancing and has the potential to help individuals with diabetes achieve optimal glycemic control. Over the past 40 years, several devices have been developed and refined, including the blood glucose meter, insulin pump, and continuous glucose monitor. When used in tandem, the insulin pump and continuous glucose monitor have prompted the Artificial Pancreas initiative, aimed at developing a control system for fully automating glucose monitoring and insulin delivery. In addition to devices, modern technology, such as the Internet and mobile phone applications, has been used to promote patient education, support, and intervention to address the behavioral and emotional challenges of diabetes management. These state-of-the-art technologies not only have the potential to improve clinical outcomes, but there are possible psychological benefits, such as improved quality of life, as well. However, practical and psychosocial limitations related to advanced technology exist and, in the context of several technology-related theoretical frameworks, can influence patient adoption and continued use. It is essential for future diabetes technology research to address these barriers given that the clinical benefits appear to largely depend on patient engagement and consistency of technology use. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  7. 24 CFR 901.15 - Indicator #2, modernization.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Indicator #2, modernization. 901.15... DEVELOPMENT PUBLIC HOUSING MANAGEMENT ASSESSMENT PROGRAM § 901.15 Indicator #2, modernization. This indicator is automatically excluded if a PHA does not have a modernization program. This indicator examines the...

  8. Modernizing the MagIC Paleomagnetic and Rock Magnetic Database Technology Stack to Encourage Code Reuse and Reproducible Science

    NASA Astrophysics Data System (ADS)

    Minnett, R.; Koppers, A. A. P.; Jarboe, N.; Jonestrask, L.; Tauxe, L.; Constable, C.

    2016-12-01

    The Magnetics Information Consortium (https://earthref.org/MagIC/) develops and maintains a database and web application for supporting the paleo-, geo-, and rock magnetic scientific community. Historically, this objective has been met with an Oracle database and a Perl web application at the San Diego Supercomputer Center (SDSC). The Oracle Enterprise Cluster at SDSC, however, was decommissioned in July of 2016 and the cost for MagIC to continue using Oracle became prohibitive. This provided MagIC with a unique opportunity to reexamine the entire technology stack and data model. MagIC has developed an open-source web application using the Meteor (http://meteor.com) framework and a MongoDB database. The simplicity of the open-source full-stack framework that Meteor provides has improved MagIC's development pace and the increased flexibility of the data schema in MongoDB encouraged the reorganization of the MagIC Data Model. As a result of incorporating actively developed open-source projects into the technology stack, MagIC has benefited from their vibrant software development communities. This has translated into a more modern web application that has significantly improved the user experience for the paleo-, geo-, and rock magnetic scientific community.
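
    A minimal pymongo sketch of the flexible-schema benefit described above. The database, collection, and field names are assumptions for illustration, not MagIC's actual schema.

```python
# Requires a MongoDB server (e.g., on localhost) and the pymongo package.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client["magic_demo"]               # hypothetical database name
contributions = db["contributions"]     # hypothetical collection name

# Flexible schema: documents need not share identical fields, so the
# data model can be reorganized without migrations.
contributions.insert_one({
    "contributor": "example_lab",
    "location": "example site",
    "ages": [{"age": 12.3, "unit": "Ma"}],
})

# Query a nested field directly, with no prior schema declaration.
for doc in contributions.find({"ages.age": {"$gt": 10}}):
    print(doc["contributor"])
```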

  9. Building Databases for Education. ERIC Digest.

    ERIC Educational Resources Information Center

    Klausmeier, Jane A.

    This digest provides a brief explanation of what a database is; explains how a database can be used; identifies important factors that should be considered when choosing database management system software; and provides citations to sources for finding reviews and evaluations of database management software. The digest is concerned primarily with…

  10. Integrating heterogeneous databases in clustered medical care environments using object-oriented technology

    NASA Astrophysics Data System (ADS)

    Thakore, Arun K.; Sauer, Frank

    1994-05-01

    The organization of modern medical care environments into disease-related clusters, such as a cancer center, a diabetes clinic, etc., has the side-effect of introducing multiple heterogeneous databases, often containing similar information, within the same organization. This heterogeneity fosters incompatibility and prevents the effective sharing of data amongst applications at different sites. Although integration of heterogeneous databases is now feasible, in the medical arena this is often an ad hoc process, not founded on proven database technology or formal methods. In this paper we illustrate the use of a high-level object- oriented semantic association method to model information found in different databases into an integrated conceptual global model that integrates the databases. We provide examples from the medical domain to illustrate an integration approach resulting in a consistent global view, without attacking the autonomy of the underlying databases.
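
    The global-view idea can be sketched as follows: two invented site schemas holding similar patient information are mapped onto one integrated conceptual record shape, without altering the underlying site databases.

```python
# Sketch of a global conceptual view over two heterogeneous site databases.
# Both site schemas below are invented for illustration.

def from_cancer_center(row):
    # Cancer-center schema: pat_id, surname, dob
    return {"patient_id": row["pat_id"], "name": row["surname"],
            "birth_date": row["dob"], "source": "cancer_center"}

def from_diabetes_clinic(row):
    # Diabetes-clinic schema: id, last_name, birth
    return {"patient_id": row["id"], "name": row["last_name"],
            "birth_date": row["birth"], "source": "diabetes_clinic"}

def global_view(cancer_rows, diabetes_rows):
    """Integrated view: one record shape regardless of the source database."""
    return ([from_cancer_center(r) for r in cancer_rows] +
            [from_diabetes_clinic(r) for r in diabetes_rows])

patients = global_view(
    [{"pat_id": 1, "surname": "Doe", "dob": "1950-02-01"}],
    [{"id": 7, "last_name": "Roe", "birth": "1962-08-21"}],
)
for p in patients:
    print(p["patient_id"], p["name"], p["source"])
```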

  11. Judicious use of custom development in an open source component architecture

    NASA Astrophysics Data System (ADS)

    Bristol, S.; Latysh, N.; Long, D.; Tekell, S.; Allen, J.

    2014-12-01

    Modern software engineering is not as much programming from scratch as innovative assembly of existing components. Seamlessly integrating disparate components into scalable, performant architecture requires sound engineering craftsmanship and can often result in increased cost efficiency and accelerated capabilities if software teams focus their creativity on the edges of the problem space. ScienceBase is part of the U.S. Geological Survey scientific cyberinfrastructure, providing data and information management, distribution services, and analysis capabilities in a way that strives to follow this pattern. ScienceBase leverages open source NoSQL and relational databases, search indexing technology, spatial service engines, numerous libraries, and one proprietary but necessary software component in its architecture. The primary engineering focus is cohesive component interaction, including construction of a seamless Application Programming Interface (API) across all elements. The API allows researchers and software developers alike to leverage the infrastructure in unique, creative ways. Scaling the ScienceBase architecture and core API with increasing data volume (more databases) and complexity (integrated science problems) is a primary challenge addressed by judicious use of custom development in the component architecture. Other data management and informatics activities in the earth sciences have independently resolved to a similar design of reusing and building upon established technology and are working through similar issues for managing and developing information (e.g., U.S. Geoscience Information Network; NASA's Earth Observing System Clearing House; GSToRE at the University of New Mexico). Recent discussions facilitated through the Earth Science Information Partners are exploring potential avenues to exploit the implicit relationships between similar projects for explicit gains in our ability to more rapidly advance global scientific cyberinfrastructure.
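
    The seamless-API idea above can be made concrete with a small, fully hypothetical sketch: building a catalog-style search request and reading a JSON response. The endpoint, parameter names, and response shape are invented for illustration, not taken from the actual ScienceBase API.

```python
import json
import urllib.parse

# Hypothetical catalog-style search request of the kind a unified API
# exposes; the endpoint and parameter names are invented.
base = "https://example.gov/catalog/items"
query = urllib.parse.urlencode({"q": "stream gauge", "format": "json"})
print(base + "?" + query)

# Canned response standing in for what such an endpoint might return.
canned = '{"items": [{"title": "Example item", "id": "abc123"}]}'
for item in json.loads(canned)["items"]:
    print(item["id"], item["title"])
```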

  12. Intra-reach headwater fish assemblage structure

    USGS Publications Warehouse

    McKenna, James E.

    2017-01-01

    Large-scale conservation efforts can take advantage of modern large databases and regional modeling and assessment methods. However, these broad-scale efforts often assume uniform average habitat conditions and/or species assemblages within stream reaches.

  13. Toward an integrated knowledge environment to support modern oncology.

    PubMed

    Blake, Patrick M; Decker, David A; Glennon, Timothy M; Liang, Yong Michael; Losko, Sascha; Navin, Nicholas; Suh, K Stephen

    2011-01-01

    Around the world, teams of researchers continue to develop a wide range of systems to capture, store, and analyze data including treatment, patient outcomes, tumor registries, next-generation sequencing, single-nucleotide polymorphism, copy number, gene expression, drug chemistry, drug safety, and toxicity. Scientists mine, curate, and manually annotate growing mountains of data to produce high-quality databases, while clinical information is aggregated in distant systems. Databases are currently scattered, and relationships between variables coded in disparate datasets are frequently invisible. The challenge is to evolve oncology informatics from a "systems" orientation of standalone platforms and silos into an "integrated knowledge environment" that will connect "knowable" research data with patient clinical information. The aim of this article is to review progress toward an integrated knowledge environment to support modern oncology with a focus on supporting scientific discovery and improving cancer care.

  14. Plug Into "The Modernizing Machine"! Danish University Reform and Its Transformable Academic Subjectivities

    ERIC Educational Resources Information Center

    Krejsler, John Benedicto

    2013-01-01

    "The modernizing machine" codes individual bodies, things, and symbols with images from New Public Management, neo-liberal, and Knowledge Economy discourses. Drawing on Deleuze and Guattari's concept of machines, this article explores how "the modernizing machine" produces neo-liberal modernization of the public sector. Taking…

  15. Database Management: Building, Changing and Using Databases. Collected Papers and Abstracts of the Mid-Year Meeting of the American Society for Information Science (15th, Portland, Oregon, May 1986).

    ERIC Educational Resources Information Center

    American Society for Information Science, Washington, DC.

    This document contains abstracts of papers on database design and management which were presented at the 1986 mid-year meeting of the American Society for Information Science (ASIS). Topics considered include: knowledge representation in a bilingual art history database; proprietary database design; relational database design; in-house databases;…

  16. The Network Configuration of an Object Relational Database Management System

    NASA Technical Reports Server (NTRS)

    Diaz, Philip; Harris, W. C.

    2000-01-01

    The networking and implementation of the Oracle Database Management System (ODBMS) require developers to have knowledge of the UNIX operating system as well as all the features of the Oracle Server. The server is an object-relational database management system (DBMS). By using distributed processing, work is split between the database server and client application programs. The DBMS handles all the responsibilities of the server, while the workstations running the database application concentrate on the interpretation and display of data.
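
    A minimal client-side sketch of that split, using the python-oracledb driver: the server runs the DBMS, and the client program only sends SQL and displays results. The connection details, table, and columns are placeholders, not a real deployment.

```python
import oracledb

# Placeholder credentials and DSN; a real deployment would supply its own.
conn = oracledb.connect(user="scott", password="tiger",
                        dsn="dbhost.example.com/orclpdb")
try:
    cur = conn.cursor()
    # The server executes the query; this table is hypothetical.
    cur.execute("SELECT station_id, name FROM stations WHERE active = 1")
    for station_id, name in cur.fetchall():   # client: display only
        print(station_id, name)
finally:
    conn.close()
```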

  17. Digital games as an effective approach for cancer management: Opportunities and challenges.

    PubMed

    Ghazisaeidi, Marjan; Safdari, Reza; Goodini, Azadeh; Mirzaiee, Mahboobeh; Farzi, Jebraeil

    2017-01-01

    Cancer is one of the most preventable and common chronic diseases and carries an economic, social, and psychological burden for patients, families, and society. Cancer can be monitored with new information technology. Digital games, as a uniquely powerful interaction tool, support the operation of optimal care management programs in all dimensions. The aim of this review article is to describe the opportunities and challenges this modern technology brings to the delivery of cancer care services across the domains of cancer management. This study was an unsystematic (narrative) review article. Fifty full-text papers and reports were retrieved, examined closely, and arranged according to the study aims. We searched for papers using specific, relevant keywords in research databases including PubMed, ScienceDirect, Scopus, and Google Scholar. In the cancer management domain, digital games are an effective medium for health education and intervention, disease self-management training, attention distraction for pain relief, improved clinical outcomes, lifestyle improvement, and the promotion of physical and psychosocial activity when active participation and behavior rehearsal are required of the cancer patient. Despite the potential benefits of this new technology, users sometimes confront challenges such as social isolation, unusual anxiety, disruption of the body's physiological rhythms, low physical activity, decreased academic performance, increased aggressive behavior, and physical pain. These problems can be partly overcome by proper planning, good design, and suitable, continuous monitoring.

  18. DTIC (Defense Technical Information Center) Model Action Plan for Incorporating DGIS (DOD Gateway Information System) Capabilities.

    DTIC Science & Technology

    1986-05-01

    Information System (DGIS) is being developed to provide the DoD community with a modern tool to access diverse databases and extract information products... this community with a modern tool for accessing these databases and extracting information products from them. Since the Defense Technical Information... adjunct to DROLS results. The study, therefore, centered around obtaining background information inside the unit on that unit's users who request DROLS

  19. A Higher Level Classification of All Living Organisms

    PubMed Central

    Ruggiero, Michael A.; Gordon, Dennis P.; Orrell, Thomas M.; Bailly, Nicolas; Bourgoin, Thierry; Brusca, Richard C.; Cavalier-Smith, Thomas; Guiry, Michael D.; Kirk, Paul M.

    2015-01-01

    We present a consensus classification of life to embrace the more than 1.6 million species already provided by more than 3,000 taxonomists’ expert opinions in a unified and coherent, hierarchically ranked system known as the Catalogue of Life (CoL). The intent of this collaborative effort is to provide a hierarchical classification serving not only the needs of the CoL’s database providers but also the diverse public-domain user community, most of whom are familiar with the Linnaean conceptual system of ordering taxon relationships. This classification is neither phylogenetic nor evolutionary but instead represents a consensus view that accommodates taxonomic choices and practical compromises among diverse expert opinions, public usages, and conflicting evidence about the boundaries between taxa and the ranks of major taxa, including kingdoms. Certain key issues, some not fully resolved, are addressed in particular. Beyond its immediate use as a management tool for the CoL and ITIS (Integrated Taxonomic Information System), it is immediately valuable as a reference for taxonomic and biodiversity research, as a tool for societal communication, and as a classificatory “backbone” for biodiversity databases, museum collections, libraries, and textbooks. Such a modern comprehensive hierarchy has not previously existed at this level of specificity. PMID:25923521

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bower, J.C.; Burford, M.J.; Downing, T.R.

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool that is being developed under the direction of the US Army Nuclear and Chemical Agency (USANCA). The IBS Data Management Guide provides the background, as well as the operations and procedures, needed to generate and maintain a site-specific map database. Data and system managers use this guide to manage the data files and database that support the administrative, user-environment, database management, and operational capabilities of the IBS. This document provides a description of the data files and structures necessary for running the IBS software and using the site map database.

  1. [Selected aspects of computer-assisted literature management].

    PubMed

    Reiss, M; Reiss, G

    1998-01-01

    We report our experiences with a bibliographic database manager. Bibliographic database managers are used to manage information resources: specifically, to maintain a database of references and to create bibliographies and reference lists for written works. A database manager allows the user to enter summary information (a record) for articles, book sections, books, dissertations, conference proceedings, and so on. Other features may include the ability to import references from different sources, such as MEDLINE. The word-processing component can generate reference lists and bibliographies in a variety of styles directly from a word-processor manuscript. The function and use of the software package EndNote 2 for Windows are described. Its advantages in fulfilling different requirements for citation style and the sort order of reference lists are emphasized.
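
    The two core jobs described, maintaining a database of reference records and rendering them in different citation styles, can be sketched generically. The record and style templates below are illustrative, not EndNote's own formats.

```python
# Sketch of a bibliographic database manager's two jobs: storing summary
# records and rendering them in more than one citation style.
RECORD = {
    "authors": "Reiss M, Reiss G",
    "year": 1998,
    "title": "Selected aspects of computer-assisted literature management",
    "journal": "Example Journal",   # illustrative journal name
}

STYLES = {
    "author-year": "{authors} ({year}). {title}. {journal}.",
    "vancouver":   "{authors}. {title}. {journal}. {year}.",
}

def format_reference(record, style):
    """Render one stored record in the requested citation style."""
    return STYLES[style].format(**record)

for style in STYLES:
    print(format_reference(RECORD, style))
```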

  2. GeoBuilder: a geometric algorithm visualization and debugging system for 2D and 3D geometric computing.

    PubMed

    Wei, Jyh-Da; Tsai, Ming-Hung; Lee, Gen-Cher; Huang, Jeng-Hung; Lee, Der-Tsai

    2009-01-01

    Algorithm visualization is a unique research topic that integrates engineering skills such as computer graphics, system programming, database management, computer networks, etc., to facilitate algorithmic researchers in testing their ideas, demonstrating new findings, and teaching algorithm design in the classroom. Within the broad applications of algorithm visualization, there still remain performance issues that deserve further research, e.g., system portability, collaboration capability, and animation effect in 3D environments. Using modern technologies of Java programming, we develop an algorithm visualization and debugging system, dubbed GeoBuilder, for geometric computing. The GeoBuilder system features Java's promising portability, engagement of collaboration in algorithm development, and automatic camera positioning for tracking 3D geometric objects. In this paper, we describe the design of the GeoBuilder system and demonstrate its applications.

  3. Examples of Use of SINBAD Database for Nuclear Data and Code Validation

    NASA Astrophysics Data System (ADS)

    Kodeli, Ivan; Žerovnik, Gašper; Milocco, Alberto

    2017-09-01

    The SINBAD database currently contains compilations and evaluations of over 100 shielding benchmark experiments. The SINBAD database is widely used for code and data validation. Materials covered include: air, N, O, H2O, Al, Be, Cu, graphite, concrete, Fe, stainless steel, Pb, Li, Ni, Nb, SiC, Na, W, V, and mixtures thereof. Over 40 organisations from 14 countries and 2 international organisations have contributed data and work in support of SINBAD. Examples of the use of the database in the scope of different international projects, such as the Working Party on Evaluation Cooperation of the OECD and the European Fusion Programme, demonstrate the merit and possible usage of the database for the validation of modern nuclear data evaluations and new computer codes.

  4. The ESID Online Database network.

    PubMed

    Guzman, D; Veit, D; Knerr, V; Kindle, G; Gathmann, B; Eades-Perner, A M; Grimbacher, B

    2007-03-01

    Primary immunodeficiencies (PIDs) belong to the group of rare diseases. The European Society for Immunodeficiencies (ESID) is establishing an innovative European patient and research database network for continuous long-term documentation of patients, in order to improve the diagnosis, classification, prognosis and therapy of PIDs. The ESID Online Database is a web-based system aimed at data storage, data entry, reporting and the import of pre-existing data sources in an enterprise business-to-business integration (B2B). The online database is based on the Java 2 Enterprise Edition (J2EE) platform, with high-standard security features that comply with data protection laws and the demands of a modern research platform. The ESID Online Database is accessible via the official website (http://www.esid.org/). Supplementary data are available at Bioinformatics online.

  5. A 21st Century Training Model for Flexible, Quick, and Life-Long Workforce Development

    DTIC Science & Technology

    2016-02-01

    ...specialty code. Differentiation and tailored training are made possible through modern talent management. When Joslin entered Initial Skills... associated with the IST pipeline, but also identified five overarching themes: talent management, asynchronous training, modularity (coaching... augmented reality). Figure 1: The combination of modern recruitment, talent management, and modular training both in the schoolhouse and online speed

  6. The Howling Prescribed Natural Fire - long-term effects on the modernization of planning and implementation of wildland fire management

    Treesearch

    Tom Zimmerman; Laurie Kurth; Mitchell Burgard

    2011-01-01

    Wildland fire management policy and practices have long been driven by the occurrence of significant events. The Howling Prescribed Natural Fire in Glacier National Park in 1994 is a prime example of a significant historical fire event that provided the impetus for program changes and modifications that modernized wildland fire management at the local, regional, and...

  7. Keeping Track of Our Treasures: Managing Historical Data with Relational Database Software.

    ERIC Educational Resources Information Center

    Gutmann, Myron P.; And Others

    1989-01-01

    Describes the way a relational database management system manages a large historical data collection project. Shows that such databases are practical to construct. States that the programing tasks involved are not for beginners, but the rewards of having data organized are worthwhile. (GG)

  8. Content Independence in Multimedia Databases.

    ERIC Educational Resources Information Center

    de Vries, Arjen P.

    2001-01-01

    Investigates the role of data management in multimedia digital libraries, and its implications for the design of database management systems. Introduces the notions of content abstraction and content independence. Proposes a blueprint of a new class of database technology, which supports the basic functionality for the management of both content…

  9. Resident database interfaces to the DAVID system, a heterogeneous distributed database management system

    NASA Technical Reports Server (NTRS)

    Moroh, Marsha

    1988-01-01

    A methodology for building interfaces of resident database management systems to a heterogeneous distributed database management system under development at NASA, the DAVID system, was developed. The feasibility of that methodology was demonstrated by construction of the software necessary to perform the interface task. The interface terminology developed in the course of this research is presented. The work performed and the results are summarized.

  10. Modernization of Management: Social and Socio-Cultural Aspects

    ERIC Educational Resources Information Center

    Vinogradova, Marina V.; Babakaev, Sergy V.; Larionova, Anna A.; Kobyak, Marina V.; Layko, Mikhail Y.

    2016-01-01

    The relevance of the topic is determined by the new challenges faced by the Russian state in modern conditions that have a significant impact on public administration, which entails the need for its comprehensive modernization. In this regard, this article is aimed at the disclosure of social and socio-cultural aspects of the modernization of…

  11. Compilation of Abstracts of Theses Submitted by Candidates for Degrees

    DTIC Science & Technology

    1987-09-30

    Parallel, Multiple Backend Database Systems. Feudo, C.V., MAJ, USA: Modern Hardware Technologies and Software Techniques for Online Database Storage... and its Application in the Wargaming, Research and Analysis (W.A.R.) Lab. Waltensberger, G.M., CPT, USAF: On Limited War, Escalation Control, and... MODERN HARDWARE TECHNOLOGIES AND SOFTWARE TECHNIQUES FOR ONLINE DATABASE STORAGE AND ACCESS. Christopher V. Feudo, Major, United States Army; B.S., United States Military Academy, 1972

  12. Data Analysis Challenges

    DTIC Science & Technology

    2008-12-01

    projects have either resorted to partitioned smaller databases, or to a hybrid scheme where metadata are stored in the database, along with pointers to... comes from the briefing of Dr. Mark Duchaineau from LLNL. If we assume that a pixel from a modern airborne sensor covers a square meter, then one can... airborne platform. After surveillance is complete, the data (in fact the disks themselves) are sent to a ground station for processing. Despite the

  13. Proposals for Changes in Surveying-Legal Procedures for the Needs of Cadastre in Poland

    NASA Astrophysics Data System (ADS)

    Mika, Monika

    2016-12-01

    The aim of this paper is to present the need for changes in geodetic-legal procedures for the cadastre and real estate management. This problem was analyzed in both theoretical and practical terms. In order to better present the technical and legal procedures analyzed, a study of several cases of surveying documentation was made. These cases illustrate the problems associated with surveying services, and the formal and legal procedures on the basis of which the described surveying works were carried out were verified. The problem presented is current and important not only for the convenience of the surveyor's work, but also from the point of view of the structure and modernization of the real estate cadastre, which constitutes the backbone of real estate management. The article emphasizes the need to unify the databases of state registers and to digitize the National Geodetic and Cartographic Resources (PZDGiK). Research has shown that despite continuous changes in legislation, there are still many shortcomings and gaps, which often complicate surveying work. The surveyor must analyze and verify all materials he uses, including those obtained from the Centre of Geodetic and Cartographic Documentation (ODGiK). The quality of a geodetic and cartographic elaboration depends largely on the work of the Centre of Geodetic and Cartographic Documentation. The need to modernize the Land and Buildings Registry, which acts as the cadastre in Poland, has been demonstrated. Furthermore, the unification of data used as reference systems, both for plane coordinates and elevation, has been proposed.

  14. Computer Security Products Technology Overview

    DTIC Science & Technology

    1988-10-01

    3. DATABASE MANAGEMENT SYSTEMS: Definition... this paper addresses fall into the areas of multi-user hosts, database management systems (DBMS), workstations, networks, guards and gateways, and... provide a portion of that protection, for example, a password scheme, a file protection mechanism, a secure database management system, or even a

  15. An Introduction to Database Management Systems.

    ERIC Educational Resources Information Center

    Warden, William H., III; Warden, Bette M.

    1984-01-01

    Description of database management systems for microcomputers highlights system features and factors to consider in microcomputer system selection. A method for ranking database management systems is explained and applied to a defined need, i.e., software support for indexing a weekly newspaper. A glossary of terms and 32-item bibliography are…
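
    A ranking method of the kind mentioned can be made concrete as a weighted-criteria score; the criteria, weights, and ratings below are invented for illustration, not the digest's actual method.

```python
# Invented weighted-criteria ranking of candidate database management
# systems against a defined need (e.g., indexing a weekly newspaper).
WEIGHTS = {"indexing": 0.5, "ease_of_use": 0.3, "cost": 0.2}

CANDIDATES = {
    "dbms_a": {"indexing": 9, "ease_of_use": 6, "cost": 5},
    "dbms_b": {"indexing": 7, "ease_of_use": 8, "cost": 8},
}

def score(ratings):
    """Weighted sum of the criterion ratings."""
    return sum(WEIGHTS[c] * ratings[c] for c in WEIGHTS)

for name, ratings in sorted(CANDIDATES.items(), key=lambda kv: -score(kv[1])):
    print(name, round(score(ratings), 2))   # dbms_b 7.5, dbms_a 7.3
```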

  16. DOD Business Systems Modernization: Progress in Establishing Corporate Management Controls Needs to Be Replicated Within Military Departments

    DTIC Science & Technology

    2008-05-01

    management, and continue to address the act's requirements relative to business system budgetary disclosure and certification and approval of systems costing... DOD continues to take steps to comply with legislative requirements and related guidance pertaining to its business systems modernization high-risk

  17. Laser applications in surgery

    PubMed Central

    Azadgoli, Beina

    2016-01-01

    In modern medicine, lasers are increasingly utilized for treatment of a variety of pathologies as interest in less invasive treatment modalities intensifies. The physics behind lasers allows the same basic principles to be applied to a multitude of tissue types using slight modifications of the system. Multiple laser systems have been studied within each field of medicine. The term “laser” was combined with “surgery,” “ablation,” “lithotripsy,” “cancer treatment,” “tumor ablation,” “dermatology,” “skin rejuvenation,” “lipolysis,” “cardiology,” “atrial fibrillation (AF),” and “epilepsy” during separate searches in the PubMed database. Original articles that studied the application of laser energy for these conditions were reviewed and included. A review of laser therapy is presented. Laser energy can be safely and effectively used for lithotripsy, for the treatment of various types of cancer, for a multitude of cosmetic and reconstructive procedures, and for the ablation of abnormal conductive pathways. For each of these conditions, management with lasers is comparable to, and potentially superior to, management with more traditional methods. PMID:28090508

  18. [Role and management of cancer clinical database in the application of gastric cancer precision medicine].

    PubMed

    Li, Yuanfang; Zhou, Zhiwei

    2016-02-01

    Precision medicine is a new medical concept and medical model based on personalized medicine, the rapid progress of genome sequencing technology, and the cross-application of bioinformatics and big data science. Precision medicine improves the diagnosis and treatment of gastric cancer through deeper analyses of its characteristics, pathogenesis, and other core issues. A cancer clinical database is important for promoting the development of precision medicine, so close attention must be paid to its construction and management. The clinical database of the Sun Yat-sen University Cancer Center is composed of a medical record database, a blood specimen bank, a tissue bank, and a medical imaging database. To ensure the good quality of the database, its design and management follow a strict standard operating procedure (SOP) model. Data sharing is an important way to improve medical research in the era of medical big data, so the construction and management of clinical databases must also be strengthened and innovated.

  19. A Critical Evaluation of Phrónêsis as a Key Tool for Professional Excellence for Modern Managers

    ERIC Educational Resources Information Center

    Thomas, Shinto

    2017-01-01

    Phrónêsis or practical wisdom is an important element of Aristotelian virtue ethics. This paper is an attempt to study what is meant by Phrónêsis, how it might be understood, reinterpreted, applied, and extended in contemporary professional management practice and its role in enhancing professional excellence in modern managers. Phrónêsis can…

  20. Ecological selectivity of the emerging mass extinction in the oceans.

    PubMed

    Payne, Jonathan L; Bush, Andrew M; Heim, Noel A; Knope, Matthew L; McCauley, Douglas J

    2016-09-16

    To better predict the ecological and evolutionary effects of the emerging biodiversity crisis in the modern oceans, we compared the association between extinction threat and ecological traits in modern marine animals to associations observed during past extinction events using a database of 2497 marine vertebrate and mollusc genera. We find that extinction threat in the modern oceans is strongly associated with large body size, whereas past extinction events were either nonselective or preferentially removed smaller-bodied taxa. Pelagic animals were victimized more than benthic animals during previous mass extinctions but are not preferentially threatened in the modern ocean. The differential importance of large-bodied animals to ecosystem function portends greater future ecological disruption than that caused by similar levels of taxonomic loss in past mass extinction events. Copyright © 2016, American Association for the Advancement of Science.

  1. Engineering-Geological Data Model - The First Step to Build National Polish Standard for Multilevel Information Management

    NASA Astrophysics Data System (ADS)

    Ryżyński, Grzegorz; Nałęcz, Tomasz

    2016-10-01

    Efficient geological data management in Poland is necessary to support multilevel decision processes for government and local authorities in the areas of spatial planning, mineral resources and groundwater supply, and the rational use of the subsurface. The vast amount of geological information gathered in the digital archives and databases of the Polish Geological Survey (PGS) is a basic resource for multi-scale national subsurface management. Data integration is the key factor enabling the development of GIS and web tools for decision makers; however, the main barrier to efficient geological information management is the heterogeneity of data in the resources of the Polish Geological Survey. The engineering-geological database is the first PGS thematic domain addressed in the overall data integration plan. The solutions developed within this area will facilitate the creation of procedures and standards for multilevel data management in PGS. Twenty years of experience in delivering digital engineering-geological mapping at 1:10 000 scale, and in the acquisition and digitisation of archival geotechnical reports, allowed the gathering of a database of more than 300 thousand engineering-geological boreholes as well as a set of 10 thematic spatial layers (including a foundation conditions map, depth to the first groundwater level, bedrock level, and geohazards). Historically, these data were stored in desktop systems, resulting in multiple uncorrelated datasets. The need to create a domain data model emerged, and an object-oriented modelling (UML) scheme has been developed. The aim of this development was to merge all datasets on one centralised Oracle server and prepare a unified spatial data structure for efficient web presentation and application development. The presented approach is a milestone toward the creation of a Polish national standard for engineering-geological information management. The paper presents the approach and methodology of data unification, the harmonisation of thematic vocabularies, the assumptions and results of data modelling, and the process of integrating the domain model with the enterprise architecture implemented in PGS. Currently, there is no geological data standard in Poland. The lack of guidelines for borehole and spatial data management results in increasing data dispersion and a growing barrier to multilevel data management and the implementation of efficient decision support tools. Building a national geological data standard would make geotechnical information accessible to institutions, universities, administration, and research organisations, and would gather their data in the same unified digital form according to the presented data model. Such an approach is compliant with current digital trends and the idea of a Spatial Data Infrastructure. Efficient geological data management is essential to support sustainable development and economic growth, as it allows geological information to serve the idea of Smart Cities, deliver information for Building Information Management (BIM), and support modern spatial planning. The engineering-geological domain data model presented in the paper is a scalable solution; future implementation of the developed procedures on other domains of PGS geological data is possible.
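
    A minimal sketch of a domain model in the spirit of the UML scheme described above: boreholes with ordered layers in one unified structure. The class and field names are assumptions for illustration, not the PGS standard.

```python
from dataclasses import dataclass, field

@dataclass
class Layer:
    top_m: float          # depth to layer top, metres below ground
    bottom_m: float
    soil_type: str

@dataclass
class Borehole:
    borehole_id: str
    x: float              # coordinates in a national reference system
    y: float
    layers: list[Layer] = field(default_factory=list)

    def depth(self) -> float:
        """Total borehole depth: the bottom of the deepest layer."""
        return max((l.bottom_m for l in self.layers), default=0.0)

bh = Borehole("BH-0001", x=637500.0, y=486200.0)
bh.layers.append(Layer(0.0, 2.5, "clay"))
bh.layers.append(Layer(2.5, 7.0, "sand"))
print(bh.borehole_id, bh.depth())   # BH-0001 7.0
```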

  2. The future application of GML database in GIS

    NASA Astrophysics Data System (ADS)

    Deng, Yuejin; Cheng, Yushu; Jing, Lianwen

    2006-10-01

    In 2004, the Geography Markup Language (GML) Implementation Specification (version 3.1.1) was published by the Open Geospatial Consortium, Inc. More and more applications in geospatial data sharing and interoperability now depend on GML. The primary purpose of GML is the exchange and transport of geo-information through standard modeling and encoding of geographic phenomena. However, applications face the problem of how to organize and access large volumes of GML data effectively. Research on GML databases focuses on these problems. The effective storage of GML data is a hot topic in the GIS community today. A GML Database Management System (GDBMS) mainly deals with the storage and management of GML data. Two types of XML database are distinguished: native XML databases and XML-enabled databases. Since GML is an application of the XML standard to geographic data, XML database systems can also be used for the management of GML. In this paper, we review the state of the art of XML databases, including storage, indexing, query languages, and management systems, and then move on to GML databases. Finally, the future prospects of GML databases in GIS applications are presented.
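
    A minimal sketch of reading GML-encoded geometry with a standard XML parser. The feature wrapper below is invented, while the gml:Point/gml:pos encoding follows the GML specification.

```python
import xml.etree.ElementTree as ET

# Invented feature wrapper around a standard GML point encoding.
SAMPLE = """\
<feature xmlns:gml="http://www.opengis.net/gml">
  <gml:Point srsName="EPSG:4326">
    <gml:pos>39.91 116.39</gml:pos>
  </gml:Point>
</feature>
"""

NS = {"gml": "http://www.opengis.net/gml"}
root = ET.fromstring(SAMPLE)
# gml:pos holds whitespace-separated coordinates.
lat, lon = map(float, root.findtext("gml:Point/gml:pos", namespaces=NS).split())
print(lat, lon)
```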

  3. Adding EUNIS and VAULT rocket data to the VSO with Modern Perl frameworks

    NASA Astrophysics Data System (ADS)

    Mansky, Edmund

    2017-08-01

    A new Perl code is described that uses the modern object-oriented Moose framework to add EUNIS and VAULT rocket data to the Virtual Solar Observatory website. The code permits easy repair of FITS headers in cases where required FITS fields are missing from the original data files. The code makes novel use of the Moose extensions "before" and "after" to build in dependencies, so that database creation of tables occurs before the loading of data and the validation of file-dependent tables occurs after the loading is completed. Also described is the computation and loading of the deferred FITS field CHECKSUM into the database following the loading and validation of the file-dependent tables. The loading of the EUNIS 2006 and 2007 flight data and the VAULT 2.0 flight data is described in detail as an illustrative example.
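
    The record describes Perl Moose method modifiers; the following is a loose Python analogue of the "before"/"after" dependency pattern, invented for illustration (the table name is hypothetical).

```python
# Python analogue of Moose-style "before"/"after" method modifiers,
# sketched with decorators; not the actual Perl implementation.
def before(hook):
    def wrap(fn):
        def inner(*args, **kwargs):
            hook(*args, **kwargs)          # runs before the wrapped step
            return fn(*args, **kwargs)
        return inner
    return wrap

def after(hook):
    def wrap(fn):
        def inner(*args, **kwargs):
            result = fn(*args, **kwargs)
            hook(*args, **kwargs)          # runs after the wrapped step
            return result
        return inner
    return wrap

@before(lambda table: print(f"create table {table}"))   # schema must exist first
@after(lambda table: print(f"validate table {table}"))  # check after loading
def load(table):
    print(f"load data into {table}")

load("eunis_2006")   # prints: create -> load -> validate
```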

  4. Telling Modernization: Three Voices. Life History, Gender and the Discourse of Modernization. Roskilde University Life History Project Paper.

    ERIC Educational Resources Information Center

    Anderson, Linda

    The relationship between life history, gender, and the discourse of modernization was examined from the perspective of a researcher with extensive experience performing evaluations of modernization within human services in Denmark. Three stories about site-based management in two human service institutions (a youth center and a boarding school)…

  5. Database Systems. Course Three. Information Systems Curriculum.

    ERIC Educational Resources Information Center

    O'Neil, Sharon Lund; Everett, Donna R.

    This course is the third of seven in the Information Systems curriculum. The purpose of the course is to familiarize students with database management concepts and standard database management software. Databases and their roles, advantages, and limitations are explained. An overview of the course sets forth the condition and performance standard…

  6. 23 CFR 972.204 - Management systems requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... to operate and maintain the management systems and their associated databases; and (5) A process for... systems will use databases with a geographical reference system that can be used to geolocate all database...

  7. 23 CFR 972.204 - Management systems requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... to operate and maintain the management systems and their associated databases; and (5) A process for... systems will use databases with a geographical reference system that can be used to geolocate all database...

  8. 23 CFR 972.204 - Management systems requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... to operate and maintain the management systems and their associated databases; and (5) A process for... systems will use databases with a geographical reference system that can be used to geolocate all database...

  9. 23 CFR 972.204 - Management systems requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... to operate and maintain the management systems and their associated databases; and (5) A process for... systems will use databases with a geographical reference system that can be used to geolocate all database...

  10. Scientific Evaluation of Edible Fruits and Spices Used for the Treatment of Peptic Ulcer in Traditional Iranian Medicine

    PubMed Central

    Farzaei, Mohammad Hosein; Shams-Ardekani, Mohammad Reza; Abbasabadi, Zahra; Rahimi, Roja

    2013-01-01

    In traditional Iranian medicine (TIM), several edible fruits and spices are thought to have protective and healing effects on peptic ulcer (PU). The present study was conducted to verify the anti-PU activity of these remedies. For this purpose, edible fruits and spices proposed for the management of PU were collected from TIM sources and searched in modern medical databases to find studies confirming their efficacy. Findings from modern investigations support the claims of TIM about the efficacy of many fruits and spices in PU. The fruit of Phyllanthus emblica, a beneficial remedy for PU in TIM, has been demonstrated to have antioxidant, wound-healing, angiogenic, anti-H. pylori, cytoprotective, antisecretory, and anti-inflammatory properties. The fruit of Vitis vinifera has been found to be anti-H. pylori, anti-inflammatory, wound-healing, angiogenic, cytoprotective, and antioxidant. The fruit of Myristica fragrans and the aril of its seed exert their beneficial effects in PU by increasing prostaglandin levels, modulating nitric oxide and inflammatory mediators, improving angiogenesis, and exhibiting wound-healing, antisecretory, antacid, antioxidant, and anti-H. pylori activities. Further pharmacological and clinical studies evaluating the efficacy of all TIM fruits and spices in PU and their possible mechanisms of action are recommended. PMID:24066235

  11. Plantago major in Traditional Persian Medicine and modern phytotherapy: a narrative review

    PubMed Central

    Najafian, Younes; Hamedi, Shokouh Sadat; Farshchi, Masoumeh Kaboli

    2018-01-01

    Plantago major has been used widely since ancient times to manage a wide range of diseases including constipation, coughs and wounds. The aim of this study is to review the traditional applications, botanical characteristics, pharmacological activities, phytochemistry and toxicity of Plantago major. In this review study, the medicinal properties of Plantago major were collected from credible pharmacopoeias and textbooks of traditional Persian medicine (TPM) belonging to the 10th–18th centuries AD, such as "The Canon of Medicine" and "Makhzan-al-Advia", among others. Moreover, electronic databases including Scopus, Medline and Web of Science were explored for this purpose. Plantago major has been prescribed by TPM scholars in various forms, such as roasted seeds, decoction, syrup, liniment, gargle, rectal enema, vaginal suppository, and eye and nasal drops, for each illness. Some of its traditional properties, including wound healing, antipyretic, antitussive, anti-infective, anti-hemorrhagic, anti-inflammatory, diuretic, laxative, astringent and hemostatic effects, have been confirmed in recent research. Phytochemical investigations showed that Plantago major contains volatile compounds, triterpenoids, phenolic acids and flavonoids. Modern pharmacological studies have proven some of the traditional applications of Plantago major. Nevertheless, more investigation of this plant is required, because it has the potential to be used to produce various natural medications. PMID:29629064

  12. A Tony Thomas-Inspired Guide to INSPIRE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Connell, Heath B.; /Fermilab

    2010-04-01

    The SPIRES database was created in the late 1960s to catalogue the high energy physics preprints received by the SLAC Library. In the early 1990s it became the first database on the web and the first website outside of Europe. Although indispensable to the HEP community, its aging software infrastructure is becoming a serious liability. In a joint project involving CERN, DESY, Fermilab and SLAC, a new database, INSPIRE, is being created to replace SPIRES using CERN's modern, open-source Invenio database software. INSPIRE will maintain the content and functionality of SPIRES plus many new features. I describe this evolution from the birth of SPIRES to the current day, noting that the career of Tony Thomas spans this timeline.

  13. Development of a North American paleoclimate pollen-based reconstruction database application

    NASA Astrophysics Data System (ADS)

    Ladd, Matthew; Mosher, Steven; Viau, Andre

    2013-04-01

    Recent efforts to synthesize paleoclimate records across the globe have prompted an effort to standardize the different paleoclimate archives currently available, in order to facilitate data-model comparisons and hence improve our estimates of future climate change. The methodology and programs behind a reconstruction often make it challenging for other researchers to reproduce the results, so there is a need to standardize paleoclimate reconstruction databases in an application specific to proxy data. Here we present a methodology, implemented in the open-source R language and using North American pollen databases (e.g., NAPD, NEOTOMA), in which the application can easily be used to perform new reconstructions and to quickly analyze, output and plot the data. The application was developed to easily test methodological and spatial/temporal issues that might affect reconstruction results, and it allows users to spend more time analyzing and interpreting results instead of on data management and processing. Unique features of this R program include two menu-driven modules that put the user at ease with the program, the ability to use different pollen sums, a choice among 70 available climate variables, substitution of an appropriate modern climate dataset, a user-friendly regional target domain, temporal resolution criteria, linear interpolation, and many other features for thorough exploratory data analysis. The application will be available for North American pollen-based reconstructions and will eventually be made available as a package through the CRAN repository by late 2013.
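
    The reconstruction step such an application performs can be illustrated with the modern-analog technique in a few lines. The actual program is written in R; this Python sketch with fabricated data only shows the underlying idea:

      # Modern-analog sketch: find the modern pollen samples most similar
      # to a fossil sample (squared-chord distance) and average their
      # observed climate. All numbers are fabricated.
      import numpy as np

      modern_pollen = np.array([[0.50, 0.30, 0.20],    # proportions per
                                [0.10, 0.60, 0.30],    # modern site
                                [0.40, 0.40, 0.20]])
      modern_july_temp = np.array([14.0, 19.5, 16.0])  # paired climate

      fossil_sample = np.array([0.45, 0.35, 0.20])

      # Squared-chord distance between fossil and each modern spectrum
      d = ((np.sqrt(fossil_sample) - np.sqrt(modern_pollen)) ** 2).sum(axis=1)

      k = 2                                # number of analogs retained
      best = np.argsort(d)[:k]
      print(f"reconstructed July temperature: {modern_july_temp[best].mean():.1f} C")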

  14. Classification of Ancient Mammal Individuals Using Dental Pulp MALDI-TOF MS Peptide Profiling

    PubMed Central

    Tran, Thi-Nguyen-Ny; Aboudharam, Gérard; Gardeisen, Armelle; Davoust, Bernard; Bocquet-Appel, Jean-Pierre; Flaudrops, Christophe; Belghazi, Maya; Raoult, Didier; Drancourt, Michel

    2011-01-01

    Background The classification of ancient animal corpses at the species level remains a challenging task for forensic scientists and anthropologists. Severe damage and mixed, tiny pieces originating from several skeletons may render morphological classification virtually impossible. Standard approaches are based on sequencing mitochondrial and nuclear targets. Methodology/Principal Findings We present a method that can accurately classify mammalian species using dental pulp and mass spectrometry peptide profiling. Our work was organized into three successive steps. First, after extracting proteins from the dental pulp collected from 37 modern individuals representing 13 mammalian species, trypsin-digested peptides were used for matrix-assisted laser desorption/ionization time-of-flight mass spectrometry analysis. The resulting peptide profiles accurately classified every individual at the species level, in agreement with the parallel cytochrome b gene sequencing gold standard. Second, using a database of 279 modern spectra, we blindly classified 33 of the 37 teeth collected from modern individuals (89.1%). Third, we classified 10 of 18 teeth (56%) collected from 15 ancient individuals representing five mammal species, including human, from five burial sites dating back 8,500 years. Further comparison with an upgraded database comprising ancient specimen profiles yielded 100% classification in ancient teeth. Peptide sequencing yielded 4 and 16 different non-keratin proteins, including collagen (alpha-1 type I and alpha-2 type I), in human ancient and modern dental pulp, respectively. Conclusions/Significance Mass spectrometry peptide profiling of the dental pulp is a new approach that can be added to the arsenal of species classification tools for forensics and anthropology as a complementary method to DNA sequencing. The dental pulp is a new source of collagen and other proteins for the species classification of modern and ancient mammal individuals. PMID:21364886
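
    The database-matching step can be caricatured as nearest-neighbour comparison of binned peak intensities; the study's actual scoring pipeline is more elaborate, and the vectors below are fabricated:

      # Toy classification of a query mass spectrum against a reference
      # database by cosine similarity of binned peak intensities.
      import numpy as np

      reference = {                      # binned, normalised peak vectors
          "Homo sapiens": np.array([0.9, 0.1, 0.4, 0.0]),
          "Bos taurus":   np.array([0.1, 0.8, 0.2, 0.5]),
          "Ovis aries":   np.array([0.2, 0.7, 0.1, 0.6]),
      }
      query = np.array([0.85, 0.15, 0.35, 0.05])

      def cosine(a, b):
          return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

      species = max(reference, key=lambda s: cosine(query, reference[s]))
      print("best match:", species)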

  15. Combining new technologies for effective collection development: a bibliometric study using CD-ROM and a database management program.

    PubMed Central

    Burnham, J F; Shearer, B S; Wall, J C

    1992-01-01

    Librarians have used bibliometrics for many years to assess collections and to provide data for making selection and deselection decisions. With the advent of new technology--specifically, CD-ROM databases and reprint file database management programs--new cost-effective procedures can be developed. This paper describes a recent multidisciplinary study conducted by two library faculty members and one allied health faculty member to test a bibliometric method that used the MEDLINE and CINAHL databases on CD-ROM and the Papyrus database management program to produce a new collection development methodology. PMID:1600424

  16. Creating databases for biological information: an introduction.

    PubMed

    Stein, Lincoln

    2013-06-01

    The essence of bioinformatics is dealing with large quantities of information. Whether it be sequencing data, microarray data files, mass spectrometric data (e.g., fingerprints), the catalog of strains arising from an insertional mutagenesis project, or even large numbers of PDF files, there inevitably comes a time when the information can simply no longer be managed with files and directories. This is where databases come into play. This unit briefly reviews the characteristics of several database management systems, including flat file, indexed file, relational databases, and NoSQL databases. It compares their strengths and weaknesses and offers some general guidelines for selecting an appropriate database management system. Copyright 2013 by John Wiley & Sons, Inc.
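
    The tipping point the unit describes, when files and directories no longer suffice, is where even a small embedded relational database pays off. A minimal sketch with the Python standard library (the strain catalog is invented):

      # A strain catalog in SQLite: indexed lookups and ad hoc queries
      # that are painful over plain files. Table layout is illustrative.
      import sqlite3

      con = sqlite3.connect(":memory:")
      con.execute("""CREATE TABLE strain (
          strain_id TEXT PRIMARY KEY,
          gene      TEXT,
          phenotype TEXT)""")
      con.executemany("INSERT INTO strain VALUES (?, ?, ?)", [
          ("mut-001", "unc-22", "twitcher"),
          ("mut-002", "dpy-10", "dumpy"),
      ])
      for (strain_id,) in con.execute(
              "SELECT strain_id FROM strain WHERE gene = ?", ("unc-22",)):
          print(strain_id)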

  17. Dam Removal Information Portal (DRIP)—A map-based resource linking scientific studies and associated geospatial information about dam removals

    USGS Publications Warehouse

    Duda, Jeffrey J.; Wieferich, Daniel J.; Bristol, R. Sky; Bellmore, J. Ryan; Hutchison, Vivian B.; Vittum, Katherine M.; Craig, Laura; Warrick, Jonathan A.

    2016-08-18

    The removal of dams has recently increased over historical levels due to aging infrastructure, changing societal needs, and modern safety standards rendering some dams obsolete. Where the possibilities for river restoration or improved safety exceed the benefits of retaining a dam, removal is more often being considered as a viable option. Yet, as this is a relatively new development in the history of river management, science is just beginning to guide our understanding of the physical and ecological implications of dam removal. Ultimately, the "lessons learned" from previous scientific studies on the outcomes of dam removal could inform future scientific understanding of ecosystem outcomes, as well as aid in decision-making by stakeholders. We created a database visualization tool, the Dam Removal Information Portal (DRIP), to display map-based, interactive information about the scientific studies associated with dam removals. Serving both as a bibliographic source and as a link to other existing databases like the National Hydrography Dataset, the derived National Dam Removal Science Database serves as the foundation for a web-based application that synthesizes the existing scientific studies associated with dam removals. Thus, using the DRIP application, users can explore information about completed dam removal projects (for example, their location, height, and date removed), as well as discover the sources and details of associated scientific studies. As such, DRIP is intended to be a dynamic collection of scientific information related to dams that have been removed in the United States and elsewhere. This report describes the architecture and concepts of this "metaknowledge" database and the DRIP visualization tool.

  18. Federated Web-accessible Clinical Data Management within an Extensible NeuroImaging Database

    PubMed Central

    Keator, David B.; Wei, Dingying; Fennema-Notestine, Christine; Pease, Karen R.; Bockholt, Jeremy; Grethe, Jeffrey S.

    2010-01-01

    Managing vast datasets collected throughout multiple clinical imaging communities has become critical with the ever increasing and diverse nature of datasets. Development of data management infrastructure is further complicated by technical and experimental advances that drive modifications to existing protocols and acquisition of new types of research data to be incorporated into existing data management systems. In this paper, an extensible data management system for clinical neuroimaging studies is introduced: The Human Clinical Imaging Database (HID) and Toolkit. The database schema is constructed to support the storage of new data types without changes to the underlying schema. The complex infrastructure allows management of experiment data, such as image protocol and behavioral task parameters, as well as subject-specific data, including demographics, clinical assessments, and behavioral task performance metrics. Of significant interest, embedded clinical data entry and management tools enhance both consistency of data reporting and automatic entry of data into the database. The Clinical Assessment Layout Manager (CALM) allows users to create on-line data entry forms for use within and across sites, through which data is pulled into the underlying database via the generic clinical assessment management engine (GAME). Importantly, the system is designed to operate in a distributed environment, serving both human users and client applications in a service-oriented manner. Querying capabilities use a built-in multi-database parallel query builder/result combiner, allowing web-accessible queries within and across multiple federated databases. The system along with its documentation is open-source and available from the Neuroimaging Informatics Tools and Resource Clearinghouse (NITRC) site. PMID:20567938
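
    One common way to support "new data types without changes to the underlying schema", as the abstract puts it, is an entity-attribute-value (EAV) layout. The sketch below shows that general idea only; it is not necessarily HID's actual design:

      # Entity-attribute-value sketch: a brand-new clinical form becomes
      # plain rows, with no ALTER TABLE needed. Not HID's actual schema.
      import sqlite3

      con = sqlite3.connect(":memory:")
      con.execute("""CREATE TABLE assessment_value (
          subject_id TEXT,
          assessment TEXT,   -- e.g. a form defined at runtime
          item       TEXT,   -- question or field name
          value      TEXT)""")

      con.executemany("INSERT INTO assessment_value VALUES (?, ?, ?, ?)", [
          ("subj-01", "PANSS", "P1_delusions", "3"),
          ("subj-01", "NewScale2024", "item_1", "7"),
      ])
      print(con.execute("SELECT item, value FROM assessment_value "
                        "WHERE subject_id = 'subj-01'").fetchall())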

  19. Federated web-accessible clinical data management within an extensible neuroimaging database.

    PubMed

    Ozyurt, I Burak; Keator, David B; Wei, Dingying; Fennema-Notestine, Christine; Pease, Karen R; Bockholt, Jeremy; Grethe, Jeffrey S

    2010-12-01

    Managing vast datasets collected throughout multiple clinical imaging communities has become critical with the ever increasing and diverse nature of datasets. Development of data management infrastructure is further complicated by technical and experimental advances that drive modifications to existing protocols and acquisition of new types of research data to be incorporated into existing data management systems. In this paper, an extensible data management system for clinical neuroimaging studies is introduced: The Human Clinical Imaging Database (HID) and Toolkit. The database schema is constructed to support the storage of new data types without changes to the underlying schema. The complex infrastructure allows management of experiment data, such as image protocol and behavioral task parameters, as well as subject-specific data, including demographics, clinical assessments, and behavioral task performance metrics. Of significant interest, embedded clinical data entry and management tools enhance both consistency of data reporting and automatic entry of data into the database. The Clinical Assessment Layout Manager (CALM) allows users to create on-line data entry forms for use within and across sites, through which data is pulled into the underlying database via the generic clinical assessment management engine (GAME). Importantly, the system is designed to operate in a distributed environment, serving both human users and client applications in a service-oriented manner. Querying capabilities use a built-in multi-database parallel query builder/result combiner, allowing web-accessible queries within and across multiple federated databases. The system along with its documentation is open-source and available from the Neuroimaging Informatics Tools and Resource Clearinghouse (NITRC) site.

  20. Implementation of a data management software system for SSME test history data

    NASA Technical Reports Server (NTRS)

    Abernethy, Kenneth

    1986-01-01

    The implementation of a software system for managing Space Shuttle Main Engine (SSME) test/flight historical data is presented. The software system uses the database management system RIM7 for primary data storage and routine data management, but includes several FORTRAN programs, described here, which provide customized access to the RIM7 database. The consolidation, modification, and transfer of data from the database THIST to the RIM7 database THISRM is discussed. The RIM7 utility modules for generating some standard reports from THISRM and performing routine updating and maintenance are briefly described. The FORTRAN accessing programs described include programs for the initial loading of large data sets into the database, capturing data from files for database inclusion, and producing specialized statistical reports which cannot be provided by the RIM7 report generator utility. An expert system tutorial, constructed using the expert system shell product INSIGHT2, is described. Finally, a potential expert system, which would analyze data in the database, is outlined. This system could also use INSIGHT2 and would take advantage of RIM7's compatibility with the microcomputer database system RBase 5000.

  1. Global Drainage Patterns to Modern Terrestrial Sedimentary Basins and its Influence on Large River Systems

    NASA Astrophysics Data System (ADS)

    Nyberg, B.; Helland-Hansen, W.

    2017-12-01

    Long-term preservation of alluvial sediments is dependent on the hydrological processes that deposit sediments solely within an area that has available accommodation space and net subsidence, known as a sedimentary basin. An understanding of the river processes contributing to terrestrial sedimentary basins is essential to fundamentally constrain and quantify controls on the modern terrestrial sink. Furthermore, the terrestrial source-to-sink controls place constraints on the entire coastal, shelf and deep marine sediment routing systems. In addition, the geographical importance of modern terrestrial sedimentary basins for agriculture and human settlements has resulted in significant upstream anthropogenic catchment modification for irrigation and energy needs. Yet, to our knowledge, a global catchment model depicting the drainage patterns to modern terrestrial sedimentary basins has not previously been established to address these challenging issues. Here we present a new database of 180,737 global catchments that shows the surface drainage patterns to modern terrestrial sedimentary basins. This is achieved by using high-resolution river networks derived from digital elevation models, in relation to newly acquired maps of global modern sedimentary basins, to identify terrestrial sinks. The results show that active tectonic regimes are typically characterized by larger terrestrial sedimentary basins, numerous smaller source catchments, and a high source-to-sink relief ratio. By contrast, passive margins drain to smaller terrestrial sedimentary basins and are composed of fewer but relatively larger source catchments with a lower source-to-sink relief ratio. The different geomorphological characteristics of source catchments by tectonic setting influence the spatial and temporal patterns of fluvial architecture within sedimentary basins and the anthropogenic methods of exploiting those rivers. The new digital database is intended to help the geoscientific community contribute further to our quantitative understanding of source-to-sink systems and their allogenic and autogenic controls, geomorphological characteristics, terrestrial sediment transit times, and the anthropogenic impact on those systems.

  2. The JANA calibrations and conditions database API

    NASA Astrophysics Data System (ADS)

    Lawrence, David

    2010-04-01

    Calibrations and conditions databases can be accessed from within the JANA Event Processing framework through the API defined in its JCalibration base class. The API is designed to support everything from databases to web services to flat files as the backend. A web service backend using the gSOAP toolkit has been implemented, which is particularly interesting since it addresses many modern cybersecurity issues, including support for SSL. The API allows constants to be retrieved through a single line of C++ code, with most of the context, including the transport mechanism, implied by the run currently being analyzed and by the environment, relieving developers of implementing such details.
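
    JANA itself is C++; the Python sketch below only illustrates the pattern the abstract describes: a one-call constants lookup whose backend (flat file, web service, database) is implied by the environment rather than spelled out by the caller. All names are invented:

      # Pluggable-backend calibration lookup in the spirit of JCalibration.
      import os

      class FlatFileBackend:
          def get(self, run, key):
              table = {("ADC/pedestals", 1234): [99.5, 101.2, 100.1]}
              return table[(key, run)]

      class WebServiceBackend:          # stand-in for a gSOAP/SSL backend
          def get(self, run, key):
              raise NotImplementedError("would issue an HTTPS request here")

      def make_backend():
          kind = os.environ.get("CALIB_BACKEND", "file")
          return WebServiceBackend() if kind == "web" else FlatFileBackend()

      class Calibration:
          def __init__(self, run):
              self.run, self.backend = run, make_backend()
          def get(self, key):           # the single line seen by user code
              return self.backend.get(self.run, key)

      print(Calibration(run=1234).get("ADC/pedestals"))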

  3. Internet Portal For A Distributed Management of Groundwater

    NASA Astrophysics Data System (ADS)

    Meissner, U. F.; Rueppel, U.; Gutzke, T.; Seewald, G.; Petersen, M.

    The management of groundwater resources for the supply of German cities and suburban areas has become a matter of public interest during the last years. Negative headlines in the Rhein-Main area dealt with cracks in buildings as well as damaged woodlands and inundated agricultural areas as an effect of varying groundwater levels. Usually a holistic management of groundwater resources does not exist because of the complexity of the geological system, the large number of involved groups with divergent interests, and a lack of essential information. The development of a network-based information system for efficient groundwater management was the target of the project "Grundwasser-Online" [1]. The management of groundwater resources has to take into account various hydrogeological, climatic, water-economical, chemical and biological interrelations [2]. Thus, the traditional approaches to information retrieval, which are characterised by high personnel and time expenditure, are not sufficient. Furthermore, the efficient control of groundwater cultivation requires direct communication between the different water supply companies, consultant engineers, scientists, governmental agencies and the public, using computer networks. The presented groundwater information system consists of different components, especially for the collection, storage, evaluation and visualisation of groundwater-relevant information, and builds on network-based technologies [3]. For the collection of time-dependent groundwater-relevant information, modern technologies of mobile computing have been analysed in order to provide an integrated approach to the management of large groundwater systems. The aggregated information is stored within a distributed geo-scientific database system which enables direct integration of simulation programs for the evaluation of interactions in groundwater systems; thus, even a prognosis for the evolution of groundwater states can be given. Appropriate technologies are utilised to generate reports automatically. The visualisation of geo-scientific databases in the internet, considering their geographic reference, is performed with internet map servers. For the communication of the map server with the underlying geo-scientific database, it is necessary that the demanded data can be filtered interactively in the internet browser using chronological and logical criteria. With regard to public use, the security aspects within the described distributed system are of major importance. Therefore, security methods for the modelling of access rights in combination with digital signatures have been analysed and implemented in order to provide secure data exchange and communication between the different partners in the network.

  4. Graph Databases for Large-Scale Healthcare Systems: A Framework for Efficient Data Management and Data Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Yubin; Shankar, Mallikarjun; Park, Byung H.

    Designing a database system for both efficient data management and data services has been one of the enduring challenges in the healthcare domain. In many healthcare systems, data services and data management are often viewed as two orthogonal tasks; data services refer to retrieval and analytic queries such as search, joins, statistical data extraction, and simple data mining algorithms, while data management refers to building error-tolerant and non-redundant database systems. The gap between service and management has resulted in rigid database systems and schemas that do not support effective analytics. We compose a rich graph structure from an abstracted healthcare RDBMS to illustrate how we can fill this gap in practice. We show how a healthcare graph can be automatically constructed from a normalized relational database using the proposed 3NF Equivalent Graph (3EG) transformation. We discuss a set of real-world graph queries, such as finding self-referrals, shared providers, and collaborative filtering, and evaluate their performance over a relational database and its 3EG-transformed graph. Experimental results show that the graph representation serves as multiple de-normalized tables, thus reducing complexity in a database and enhancing data accessibility for users. Based on this finding, we propose an ensemble framework of databases for healthcare applications.
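
    The flavour of the 3EG idea, relational rows becoming nodes and edges so that joins turn into neighbourhood queries, can be sketched briefly (toy schema, not the paper's transformation; requires the networkx package):

      # Patients and providers become nodes, visits become edges; the
      # "shared providers" query is then a one-hop common-neighbour
      # lookup instead of a multi-table join.
      import networkx as nx

      patient_rows  = [("p1",), ("p2",)]
      provider_rows = [("drA",), ("drB",)]
      visit_rows    = [("p1", "drA"), ("p1", "drB"), ("p2", "drA")]

      G = nx.Graph()
      G.add_nodes_from(p for (p,) in patient_rows)
      G.add_nodes_from(d for (d,) in provider_rows)
      G.add_edges_from(visit_rows)

      print(sorted(nx.common_neighbors(G, "p1", "p2")))   # ['drA']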

  5. Multi-agent system as a new approach to effective chronic heart failure management: key considerations.

    PubMed

    Mohammadzadeh, Niloofar; Safdari, Reza; Rahimi, Azin

    2013-09-01

    Given the importance of the follow-up of chronic heart failure (CHF) patients to reduce common causes of re-admission and deterioration of their status, which impose spiritual and physical costs on patients and society, modern technology tools should be used to the best advantage. The aim of this article is to explain key points which should be considered in designing an appropriate multi-agent system to improve CHF management. In this literature review, articles were searched using keywords such as multi-agent system, heart failure, and chronic disease management in the Science Direct, Google Scholar and PubMed databases, without regard to the year of publication. Agents are an innovation in the field of artificial intelligence. Because agents are capable of solving complex and dynamic health problems, the healthcare system must take steps to make use of this technology in order to take full advantage of e-Health. Key factors in CHF management through a multi-agent system approach, such as organization and confidentiality in general aspects and design and architecture points in specific aspects, must be considered. Note that use of agent systems with only a technical view is associated with many problems. Hence, in delivering healthcare to CHF patients, considering social and human aspects is essential. It is obvious that identifying and resolving technical and non-technical challenges is vital to the successful implementation of this technology.

  6. Multi-Agent System as a New Approach to Effective Chronic Heart Failure Management: Key Considerations

    PubMed Central

    Mohammadzadeh, Niloofar; Rahimi, Azin

    2013-01-01

    Objectives Given the importance of the follow-up of chronic heart failure (CHF) patients to reduce common causes of re-admission and deterioration of their status, which impose spiritual and physical costs on patients and society, modern technology tools should be used to the best advantage. The aim of this article is to explain key points which should be considered in designing an appropriate multi-agent system to improve CHF management. Methods In this literature review, articles were searched using keywords such as multi-agent system, heart failure, and chronic disease management in the Science Direct, Google Scholar and PubMed databases, without regard to the year of publication. Results Agents are an innovation in the field of artificial intelligence. Because agents are capable of solving complex and dynamic health problems, the healthcare system must take steps to make use of this technology in order to take full advantage of e-Health. Key factors in CHF management through a multi-agent system approach, such as organization and confidentiality in general aspects and design and architecture points in specific aspects, must be considered. Conclusions Note that use of agent systems with only a technical view is associated with many problems. Hence, in delivering healthcare to CHF patients, considering social and human aspects is essential. It is obvious that identifying and resolving technical and non-technical challenges is vital to the successful implementation of this technology. PMID:24195010

  7. System to improve the Understanding of Collected Logistic Data, to Optimize Cycle-Time and Delivery Performance

    NASA Astrophysics Data System (ADS)

    van Rooijen, Wim-Jan; Rodriguez, Ben

    2002-12-01

    A complex production mask-house faces the issue of handling and understanding the logistics information from the mask production process. We managed to control key performance indicators like cycle-time, flow-factor, line-speed, WIP, etc. To improve the line flow, we set up rules for optimising batching at operations and forbade batching between operations; we defined maximum and minimum WIP at the operations, scheduled the urgency of the different lots, and built rules for bottleneck management. We also restricted the number of "hot lots". By migrating to the modern MES (manufacturing execution system) MaTISSe, which manages shopfloor control, and to a reporting database, we are able to eliminate the time deviations within our data caused by data extraction for different reports at different moments. This gives us a better understanding of our fixed bottleneck and faster recognition of the temporary bottlenecks caused by missing availability of machines or staff. In this paper we describe the features and advantages of our new MES, as well as the migration process. We have already achieved considerable benefits. Our plan is to extend decision support within the MES, to help both managers and operators make the right decisions. The project behind this paper reaped the major benefits described here, and we look forward to further challenges and successes.

  8. Security Framework for Pervasive Healthcare Architectures Utilizing MPEG-21 IPMP Components.

    PubMed

    Fragopoulos, Anastasios; Gialelis, John; Serpanos, Dimitrios

    2009-01-01

    In modern, ubiquitous computing environments, the deployment of pervasive healthcare architectures is more imperative than ever. In such architectures the patient is the central point, surrounded by different types of embedded and small computing devices which measure sensitive physical indications and interact with hospital databases, thus allowing urgent medical response when critical situations occur. Such environments must be developed to satisfy the basic security requirements: real-time secure data communication, protection of sensitive medical data and measurements, data integrity and confidentiality, and protection of the monitored patient's privacy. In this work, we argue that the MPEG-21 Intellectual Property Management and Protection (IPMP) components can be used to protect transmitted medical information and enhance the patient's privacy, since access to medical data sent toward the hospital's servers is selective and controlled.

  9. Big Biomedical data as the key resource for discovery science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toga, Arthur W.; Foster, Ian; Kesselman, Carl

    Modern biomedical data collection is generating exponentially more data in a multitude of formats. This flood of complex data poses significant opportunities to discover and understand the critical interplay among such diverse domains as genomics, proteomics, metabolomics, and phenomics, including imaging, biometrics, and clinical data. The Big Data for Discovery Science Center is taking an “-ome to home” approach to discover linkages between these disparate data sources by mining existing databases of proteomic and genomic data, brain images, and clinical assessments. In support of this work, the authors developed new technological capabilities that make it easy for researchers to manage, aggregate, manipulate, integrate, and model large amounts of distributed data. Guided by biological domain expertise, the Center’s computational resources and software will reveal relationships and patterns, aiding researchers in identifying biomarkers for the most confounding conditions and diseases, such as Parkinson’s and Alzheimer’s.

  10. [A new concept for integration of image databanks into a comprehensive patient documentation].

    PubMed

    Schöll, E; Holm, J; Eggli, S

    2001-05-01

    Image processing and archiving are of increasing importance in the practice of modern medicine. Particularly due to the introduction of computer-based investigation methods, physicians are dealing with a wide variety of analogue and digital picture archives. On the other hand, clinical information is stored in various text-based information systems without integration of image components. The link between such traditional medical databases and picture archives is a prerequisite for efficient data management as well as for continuous quality control and medical education. At the Department of Orthopedic Surgery, University of Berne, a software program was developed to create a complete multimedia electronic patient record. The client-server system contains all patients' data, questionnaire-based quality control, and a digital picture archive. Different interfaces guarantee the integration into the hospital's data network. This article describes our experiences in the development and introduction of a comprehensive image archiving system at a large orthopedic center.

  11. Big biomedical data as the key resource for discovery science

    PubMed Central

    Toga, Arthur W; Foster, Ian; Kesselman, Carl; Madduri, Ravi; Chard, Kyle; Deutsch, Eric W; Price, Nathan D; Glusman, Gustavo; Heavner, Benjamin D; Dinov, Ivo D; Ames, Joseph; Van Horn, John; Kramer, Roger; Hood, Leroy

    2015-01-01

    Modern biomedical data collection is generating exponentially more data in a multitude of formats. This flood of complex data poses significant opportunities to discover and understand the critical interplay among such diverse domains as genomics, proteomics, metabolomics, and phenomics, including imaging, biometrics, and clinical data. The Big Data for Discovery Science Center is taking an “-ome to home” approach to discover linkages between these disparate data sources by mining existing databases of proteomic and genomic data, brain images, and clinical assessments. In support of this work, the authors developed new technological capabilities that make it easy for researchers to manage, aggregate, manipulate, integrate, and model large amounts of distributed data. Guided by biological domain expertise, the Center’s computational resources and software will reveal relationships and patterns, aiding researchers in identifying biomarkers for the most confounding conditions and diseases, such as Parkinson’s and Alzheimer’s. PMID:26198305

  12. 23 CFR 971.204 - Management systems requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS HIGHWAYS FOREST SERVICE... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases...

  13. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS HIGHWAYS NATIONAL PARK... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases...

  14. Enabling heterogenous multi-scale database for emergency service functions through geoinformation technologies

    NASA Astrophysics Data System (ADS)

    Bhanumurthy, V.; Venugopala Rao, K.; Srinivasa Rao, S.; Ram Mohan Rao, K.; Chandra, P. Satya; Vidhyasagar, J.; Diwakar, P. G.; Dadhwal, V. K.

    2014-11-01

    Geographical Information Science (GIS) has now graduated from traditional desktop systems to Internet systems. Internet GIS is emerging as one of the most promising technologies for addressing emergency management, and web services with different privileges play an important role in disseminating emergency services to decision makers. The spatial database is one of the most important components in the successful implementation of emergency management. It contains spatial data in the form of raster and vector layers linked with non-spatial information. Comprehensive data are required to handle emergency situations in their different phases. The database elements comprise core data, hazard-specific data, corresponding attribute data, and live data coming from remote locations. Core datasets are the minimum required data, including base, thematic and infrastructure layers, needed to handle disasters. Disaster-specific information is required to handle particular disaster situations such as floods, cyclones, forest fires, earthquakes, landslides and droughts. In addition, emergency management requires many types of data with spatial and temporal attributes that should be made available to the key players in the right format at the right time. The vector database needs to be complemented with satellite imagery of the required resolution for visualisation and analysis in disaster management. The database must therefore be interconnected and comprehensive to meet the requirements of emergency management: an integrated, comprehensive and structured database is required to deliver the right information at the right time to the right people. However, building a spatial database for emergency management is a challenging task because of key issues such as availability of data, sharing policies, compatible geospatial standards, and data interoperability. Therefore, to facilitate using, sharing and integrating spatial data, there is a need to define standards for building emergency database systems. These include aspects such as i) data integration procedures, namely a standard coding scheme, schema, metadata format and spatial format; ii) a database organisation mechanism covering data management, catalogues and data models; and iii) database dissemination through a suitable environment as a standard service for effective service dissemination. The National Database for Emergency Management (NDEM) is such a comprehensive database for addressing disasters in India at the national level. This paper explains standards for integrating and organising multi-scale and multi-source data and for enabling effective emergency response using customised user interfaces for NDEM. It presents a standard procedure for building comprehensive emergency information systems that enable emergency-specific functions through geospatial technologies.
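
    As an illustration of item i), a standard metadata record with a common coding scheme might look like the following; every field name here is hypothetical, not the NDEM specification:

      # Hypothetical standard metadata record for one hazard layer.
      import json

      layer_metadata = {
          "layer_code": "HAZ-FLD-010",     # standard coding scheme
          "theme": "flood hazard",
          "scale": "1:50000",
          "spatial_format": "GML",
          "reference_system": "EPSG:4326",
          "source_agency": "state disaster management authority",
          "last_updated": "2014-06-30",
      }
      print(json.dumps(layer_metadata, indent=2))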

  15. Software Application Profile: Opal and Mica: open-source software solutions for epidemiological data management, harmonization and dissemination.

    PubMed

    Doiron, Dany; Marcon, Yannick; Fortier, Isabel; Burton, Paul; Ferretti, Vincent

    2017-10-01

    Improving the dissemination of information on existing epidemiological studies and facilitating the interoperability of study databases are essential to maximizing the use of resources and accelerating improvements in health. To address this, Maelstrom Research proposes Opal and Mica, two inter-operable open-source software packages providing out-of-the-box solutions for epidemiological data management, harmonization and dissemination. Opal and Mica are two standalone but inter-operable web applications written in Java, JavaScript and PHP, providing web services and modern user interfaces. Opal allows users to import, manage, annotate and harmonize study data. Mica is used to build searchable web portals disseminating study and variable metadata. When the two are used conjointly, Mica users can securely query and retrieve summary statistics from geographically dispersed Opal servers in real time. Integration with the DataSHIELD approach allows conducting more complex federated analyses involving statistical models. Opal and Mica are open-source and freely available at [www.obiba.org] under a General Public License (GPL) version 3, and the metadata models and taxonomies that accompany them are available under a Creative Commons licence. © The Author 2017; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association
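
    The federated pattern that Opal, Mica and DataSHIELD enable, each server returning only aggregates while row-level data stay put, can be mocked in a few lines (invented servers and numbers, not the real Opal/Mica API):

      # Federated summary statistics: combine per-site aggregates into a
      # pooled estimate; no individual record leaves a site.
      servers = {
          "cohort_A": {"n": 1200, "sum_bmi": 31440.0},
          "cohort_B": {"n":  800, "sum_bmi": 21760.0},
      }

      def remote_summary(name):
          # In reality, an authenticated web-service call to an Opal server.
          return servers[name]

      totals = [remote_summary(s) for s in servers]
      pooled = sum(t["sum_bmi"] for t in totals) / sum(t["n"] for t in totals)
      print(f"pooled mean BMI: {pooled:.2f}")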

  16. Non-chemotherapy drug-induced neutropenia - an update.

    PubMed

    Andrès, Emmanuel; Mourot-Cottet, Rachel

    2017-11-01

    To date, non-chemotherapy drug-induced severe neutropenia (neutrophil count of ≤0.5 × 10^9/L), also called idiosyncratic drug-induced agranulocytosis, is little discussed in the literature. In the present paper, we report and discuss the clinical data and management of this rare disorder. Areas covered: To do this, we carried out a review of the literature using the PubMed database of the US National Library of Medicine. We also used data from the American Society of Hematology educational books, textbooks of hematology and internal medicine, and information gleaned from international meetings. Expert opinion: Idiosyncratic agranulocytosis remains a potentially serious adverse event due to the frequency of severe sepsis, with severe deep tissue infections (e.g., pneumonia), septicemia, and septic shock in approximately two-thirds of all hospitalized patients. In this context, several prognostic factors have been identified that may be helpful when identifying 'susceptible' patients. Old age (>65 years), septicemia or shock, renal failure, and a neutrophil count ≤0.1 × 10^9/L have been consensually accepted as poor prognostic factors. In our experience, modern management with pre-established procedures, intravenous broad-spectrum antibiotics and hematopoietic growth factors (particularly G-CSF) is likely to improve the prognosis. Thus, with appropriate management, the mortality rate is currently between 5 and 10%.

  17. Paresthesia: A Review of Its Definition, Etiology and Treatments in View of the Traditional Medicine.

    PubMed

    Emami, Seyed Ahmad; Sahebkar, Amirhossein; Javadi, Behjat

    2016-01-01

    To search major Islamic Traditional Medicine (ITM) textbooks for definition, etiology and medicinal plants used to manage 'khadar' or 'paresthesia', a common sensory symptom of multiple sclerosis (MS) and peripheral neuropathies. In addition, the conformity of the efficacy of ITM-suggested plants with the findings from modern pharmacological research on MS will be discussed. Data on the medicinal plants used to treat 'khadar' were obtained from major ITM texts. A detailed search in PubMed, ScienceDirect, Scopus and Google Scholar databases was performed to confirm the effects of ITM-mentioned medicinal plants on MS in view of identified pharmacological actions. Moringa oleifera Lam., Aloe vera (L.) Burm.f., Euphorbia species, Citrullus colocynthis (L.) Schrad., and Costus speciosus (Koen ex. Retz) Sm. are among the most effective ITM plants for the management of 'khadar'. Recent experimental evidence confirms the effectiveness of the mentioned plants in ameliorating MS symptoms. Moreover, according to ITM, prolonged exposure to cold and consuming foodstuff with cold temperament might be involved in the etiopathogenesis of MS. The use of traditional knowledge can help finding neglected risk factors as well as effective and safe therapeutic approaches, phytomedicines and dietary habits for the management of paresthesia and related disorders such as MS.

  18. Alternatives to relational databases in precision medicine: Comparison of NoSQL approaches for big data storage using supercomputers

    NASA Astrophysics Data System (ADS)

    Velazquez, Enrique Israel

    Improvements in medical and genomic technologies have dramatically increased the production of electronic data over the last decade. As a result, data management is rapidly becoming a major determinant, and an urgent challenge, for the development of Precision Medicine. Although successful data management is achievable using Relational Database Management Systems (RDBMS), exponential data growth is a significant contributor to failure scenarios. Growing amounts of data can also be observed in other sectors, such as economics and business, which, together with the previous facts, suggests that alternative database approaches (NoSQL) may soon be required for the efficient storage and management of big databases. However, this hypothesis has been difficult to test in the Precision Medicine field, since alternative database architectures are complex to assess and means to integrate heterogeneous electronic health records (EHR) with dynamic genomic data are not easily available. In this dissertation, we present a novel set of experiments for identifying NoSQL database approaches that enable effective data storage and management in Precision Medicine, using patients' clinical and genomic information from The Cancer Genome Atlas (TCGA). The first experiment draws on performance and scalability from biologically meaningful queries with differing complexity and database sizes. The second experiment measures performance and scalability in database updates without schema changes. The third experiment assesses performance and scalability in database updates with schema modifications due to dynamic data. We have identified two NoSQL approaches, based on Cassandra and Redis, which seem to be ideal database management systems for our precision medicine queries in terms of performance and scalability. We present these NoSQL approaches and show how they can be used to manage clinical and genomic big data. Our research is relevant to public health since we focus on one of the main challenges to the development of Precision Medicine and, consequently, investigate a potential solution to the progressively increasing demands on health care.
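
    The data layout (rather than the engines themselves) that stores like Redis and Cassandra encourage is a denormalised document or row keyed for the query that must be fast. Plain dictionaries stand in for the store so this sketch runs anywhere; the record content is fabricated:

      # One denormalised document per patient: "which genes are mutated
      # in this patient" becomes a single key lookup, not a join.
      import json

      store = {}                      # key -> JSON document

      def put(key, doc):
          store[key] = json.dumps(doc)

      def get(key):
          return json.loads(store[key])

      put("patient:TCGA-01", {
          "diagnosis": "BRCA",
          "variants": [{"gene": "TP53", "type": "missense"},
                       {"gene": "PIK3CA", "type": "amplification"}],
      })
      print([v["gene"] for v in get("patient:TCGA-01")["variants"]])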

  19. Databases for multilevel biophysiology research available at Physiome.jp.

    PubMed

    Asai, Yoshiyuki; Abe, Takeshi; Li, Li; Oka, Hideki; Nomura, Taishin; Kitano, Hiroaki

    2015-01-01

    Physiome.jp (http://physiome.jp) is a portal site inaugurated in 2007 to support model-based research in physiome and systems biology. At Physiome.jp, several tools and databases are available to support construction of physiological, multi-hierarchical, large-scale models. There are three databases in Physiome.jp, housing mathematical models, morphological data, and time-series data. In late 2013, the site was fully renovated, and in May 2015, new functions were implemented to provide information infrastructure to support collaborative activities for developing models and performing simulations within the database framework. This article describes updates to the databases implemented since 2013, including cooperation among the three databases, interactive model browsing, user management, version management of models, management of parameter sets, and interoperability with applications.

  1. Speech Processing in Realistic Battlefield Environments (Le Traitement de la Parole en Environnement de Combat Realiste)

    DTIC Science & Technology

    2009-04-01

    Available Military Speech Databases: the FELIN Database (overview, technical specifications, limitations) … emotion, confusion due to conflicting information, psychological tension, pain, and other typical conditions encountered in the modern battlefield … too, the number of possible language combinations scales with N^3. It is clear that in a field of research that has only recently started and with so

  2. Implementation of Three Text to Speech Systems for Kurdish Language

    NASA Astrophysics Data System (ADS)

    Bahrampour, Anvar; Barkhoda, Wafa; Azami, Bahram Zahir

    Nowadays, the concatenative method is used in most modern TTS systems to produce artificial speech. The most important challenge in this method is choosing an appropriate unit for creating the database. This unit must guarantee smooth, high-quality speech, and creating a database for it must be reasonable and inexpensive. For example, syllables, phonemes, allophones, and diphones are appropriate units for all-purpose systems. In this paper, we implement three synthesis systems for the Kurdish language, based on the syllable, the allophone, and the diphone, and compare their quality using subjective testing.
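
    The concatenative idea is simple enough to caricature: units are looked up in a database and joined with a short crossfade. The arrays below are fabricated stand-ins for recorded speech, not a real Kurdish unit inventory:

      # Toy diphone concatenation with a crossfaded join.
      import numpy as np

      unit_db = {                     # diphone -> waveform samples
          "s-a": np.sin(np.linspace(0, 880 * np.pi, 4000)),
          "a-z": np.sin(np.linspace(0, 660 * np.pi, 4000)),
      }

      def synthesize(diphones, xfade=160):
          out = unit_db[diphones[0]]
          for d in diphones[1:]:
              unit = unit_db[d]
              ramp = np.linspace(0.0, 1.0, xfade)
              joint = out[-xfade:] * (1 - ramp) + unit[:xfade] * ramp
              out = np.concatenate([out[:-xfade], joint, unit[xfade:]])
          return out

      word = synthesize(["s-a", "a-z"])        # roughly the syllable /saz/
      print(len(word) / 16000, "seconds at 16 kHz")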

  3. Alcoa Massena Modernization Project and Request for a Single Source Determination

    EPA Pesticide Factsheets

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  4. Modern Hardware Technologies and Software Techniques for On-Line Database Storage and Access.

    DTIC Science & Technology

    1985-12-01

    …of the information in a message narrative. This method employs artificial intelligence techniques to extract information. In simplest terms, an … distribution (tape replacement) systems; database distribution; on-line mass storage; videogame ROM (juke-box); media cost … training of the great intelligence for the analyst would be required. If, on the other hand, a sentence analysis scheme simple enough for the low-level

  5. Offset Requirements for U.S. Steel's Fairfield Modernization

    EPA Pesticide Factsheets

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  6. Creating databases for biological information: an introduction.

    PubMed

    Stein, Lincoln

    2002-08-01

    The essence of bioinformatics is dealing with large quantities of information. Whether it be sequencing data, microarray data files, mass spectrometric data (e.g., fingerprints), the catalog of strains arising from an insertional mutagenesis project, or even large numbers of PDF files, there inevitably comes a time when the information can simply no longer be managed with files and directories. This is where databases come into play. This unit briefly reviews the characteristics of several database management systems, including flat file, indexed file, and relational databases, as well as ACeDB. It compares their strengths and weaknesses and offers some general guidelines for selecting an appropriate database management system.

  7. TOWARDS A CORE DATA SET FOR LANDSCAPE ASSESSMENTS

    EPA Science Inventory

    One of the primary goals of the NATO Committee on Challenges to Modern Society (CCMS) Landscape Pilot Study is to further develop, apply, and share landscape assessment technologies and spatial databases among participating countries, with the ultimate aim of sustaining environme...

  8. Surgical Management of Benign Biliary Stricture in Chronic Pancreatitis: A Single-Center Experience.

    PubMed

    Ray, Sukanta; Ghatak, Supriyo; Das, Khaunish; Dasgupta, Jayanta; Ray, Sujay; Khamrui, Sujan; Sonar, Pankaj Kumar; Das, Somak

    2015-12-01

    Biliary stricture in chronic pancreatitis (CP) is not uncommon. Previously, all cases were managed by surgery. Nowadays, the three important modes of treatment in these patients are observation, endoscopic therapy, and surgery. In the modern era, surgery is recommended only in the subset of patients who develop biliary symptoms or who have an asymptomatic biliary stricture and require surgery for intractable abdominal pain. We report our experience with the surgical management of CP-induced benign biliary stricture. Over a period of 5 years, we managed 340 cases of CP at our institution. A bile duct stricture was found in 62 patients; surgical intervention was required in 44 of them, and the remaining 18 patients were managed conservatively. Demographic data, operative procedures, postoperative complications, and follow-up parameters of these patients were collected from our prospective database. A total of 44 patients were operated on for biliary obstruction in the background of CP. Three patients were excluded, so the final analysis was based on 41 patients. The indication for surgery was a symptomatic biliary stricture in 27 patients and an asymptomatic biliary stricture with intractable abdominal pain in 14 patients. The most commonly performed operation was Frey's procedure. There was no in-hospital mortality. Thirty-five patients were well at a mean follow-up of 24.4 months (range 3 to 54 months). Surgery is still the best option for CP-induced benign biliary stricture, and Frey's procedure is a versatile operation unless malignancy is suspected as the cause of biliary obstruction.

  9. Digital games as an effective approach for cancer management: Opportunities and challenges

    PubMed Central

    Ghazisaeidi, Marjan; Safdari, Reza; Goodini, Azadeh; Mirzaiee, Mahboobeh; Farzi, Jebraeil

    2017-01-01

    OBJECTIVE: Cancer is one of the most preventable and common chronic diseases, with economic, social, and psychological burdens for patients, families, and society. Cancer can be monitored by new information technology. Digital games, as a uniquely powerful interaction tool, support the operation of optimal care management programs in all dimensions. The aim of this review article is to describe the opportunities and challenges of this modern technology for the delivery of cancer care services across the domains of cancer management. METHODS: This study was an unsystematic (narrative) review. Fifty full-text papers and reports were retrieved, studied closely, and arranged according to the study aims. We searched for papers using specific, relevant keywords in the research databases PubMed, ScienceDirect, Scopus, and Google Scholar. CONCLUSION: In the cancer management domain, digital games are an effective medium for health education and intervention, disease self-management training, attention distraction to relieve pain, enhancement of clinical outcomes, lifestyle improvement, and promotion of physical and psychosocial activity when active participation and behavior rehearsal are required of the cancer patient. In spite of the potential benefits of the new technology, people sometimes confront challenges such as social isolation, unusual anxiety, disruption of the body's physiological rhythms, low physical activity, decreased academic performance, increased aggressive behavior, and physical pain. These problems can be partly overcome by proper planning, good design, and suitable, continuous monitoring. PMID:28584830

  10. Mass-storage management for distributed image/video archives

    NASA Astrophysics Data System (ADS)

    Franchi, Santina; Guarda, Roberto; Prampolini, Franco

    1993-04-01

    The realization of an image/video database requires a specific design for both database structures and mass storage management. This issue has been addressed in the digital image/video database system designed at the IBM SEMEA Scientific & Technical Solution Center. Proper database structures have been defined to catalog image/video coding techniques with their related parameters, and the description of image/video contents. User workstations and servers are distributed along a local area network. Image/video files are not managed directly by the DBMS server. Because of their large size, they are stored outside the database on network devices. The database contains the pointers to the image/video files and the description of the storage devices. The system can use different kinds of storage media, organized in a hierarchical structure. Three levels of functions are available to manage the storage resources. The functions of the lower level provide media management; they allow the user to catalog devices and to modify device status and device network location. The medium level manages image/video files on a physical basis; it manages file migration between high-capacity media and low-access-time media. The functions of the upper level work on image/video files on a logical basis, as they archive, move, and copy image/video data selected by user-defined queries. These functions are used to support the implementation of a storage management strategy. The database information about the characteristics of both storage devices and coding techniques is used by the third-level functions to fit delivery/visualization requirements and to reduce archiving costs.
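
    A rough sketch of this pointer-based arrangement, using SQLite as a stand-in DBMS (table names, device levels, and the migrate function are illustrative assumptions, not the IBM system's API): the database catalogs the devices and holds only pointers to externally stored files, and a medium-level function re-points a file when it migrates between storage levels:

      import sqlite3

      db = sqlite3.connect(":memory:")
      db.executescript("""
      CREATE TABLE device (name TEXT PRIMARY KEY, level TEXT);   -- e.g. 'disk', 'optical'
      CREATE TABLE media_file (
          file_id TEXT PRIMARY KEY,
          device  TEXT REFERENCES device(name),                  -- pointer, not the file itself
          path    TEXT,
          coding  TEXT                                           -- e.g. 'JPEG', 'MPEG-1'
      );
      """)
      db.execute("INSERT INTO device VALUES ('fast_disk', 'low_access_time')")
      db.execute("INSERT INTO device VALUES ('jukebox', 'high_capacity')")
      db.execute("INSERT INTO media_file VALUES ('img42', 'jukebox', '/vol3/img42.jpg', 'JPEG')")

      def migrate(file_id, target_device):
          """Medium-level function: move a file between storage levels and
          update its pointer (the physical copy itself is elided here)."""
          db.execute("UPDATE media_file SET device = ? WHERE file_id = ?",
                     (target_device, file_id))

      migrate("img42", "fast_disk")   # stage a frequently viewed image onto fast media
      print(db.execute("SELECT device, path FROM media_file WHERE file_id='img42'").fetchone())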

  11. The role of non-technical skills in surgery

    PubMed Central

    Agha, Riaz A.; Fowler, Alexander J.; Sevdalis, Nick

    2015-01-01

    Non-technical skills are of increasing importance in surgery and surgical training. A traditional focus on technical skills acquisition and competence is no longer enough for the delivery of a modern, safe surgical practice. This review discusses the importance of non-technical skills and the values that underpin successful modern surgical practice. This narrative review used a number of written and online sources; there was no specific search strategy of defined databases. Modern surgical practice requires technical and non-technical skills, evidence-based practice, an emphasis on lifelong learning, monitoring of outcomes, and a supportive institutional and health service framework. Finally, these requirements need to be combined with a number of personal and professional values, including integrity, professionalism, and compassionate, patient-centred care. PMID:26904193

  12. A spatial-temporal system for dynamic cadastral management.

    PubMed

    Nan, Liu; Renyi, Liu; Guangliang, Zhu; Jiong, Xie

    2006-03-01

    A practical spatio-temporal database (STDB) technique for dynamic urban land management is presented. One of the STDB models, the expanded model of Base State with Amendments (BSA), is selected as the basis for developing the dynamic cadastral management technique. Two approaches, Section Fast Indexing (SFI) and Storage Factors of Variable Granularity (SFVG), are used to improve the efficiency of the BSA model. Both spatial graphic data and attribute data, through a succinct engine, are stored in standard relational database management systems (RDBMS) for the actual implementation of the BSA model. The spatio-temporal database is divided into three interdependent sub-databases: the present DB, the history DB, and the procedures-tracing DB. The efficiency of database operation is improved by the database connection in the bottom layer of Microsoft SQL Server. The spatio-temporal system can be provided at low cost while satisfying the basic needs of urban land management in China. The approaches presented in this paper may also be of significance to countries where land patterns change frequently or to agencies whose financial resources are limited.
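
    The core of the BSA model can be shown with a short, self-contained sketch (parcel data and field names are invented; the SFI and SFVG optimizations are omitted): a full base state is stored once, amendments are stored as timestamped deltas, and any historical state is reconstructed by replaying the amendments up to the requested time:

      # Toy illustration of Base State with Amendments (BSA).
      base_time = 0
      base_state = {"parcel_17": {"owner": "Li", "use": "farmland"}}

      # Each amendment: (time, parcel_id, changed_attributes)
      amendments = [
          (3, "parcel_17", {"use": "residential"}),
          (7, "parcel_17", {"owner": "Wang"}),
      ]

      def state_at(t):
          state = {k: dict(v) for k, v in base_state.items()}   # copy the base state
          for when, parcel, change in amendments:
              if when <= t:
                  state.setdefault(parcel, {}).update(change)   # replay deltas up to t
          return state

      print(state_at(2))   # {'parcel_17': {'owner': 'Li', 'use': 'farmland'}}
      print(state_at(9))   # {'parcel_17': {'owner': 'Wang', 'use': 'residential'}}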

  13. Options of system integrated environment modelling in the predicated dynamic cyberspace

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janková, Martina; Dvořák, Jiří

    In this article some selected options of the contemporary conception of cybernetic system models are briefly mentioned, in an environment that can be integrated with modern system-dynamics thinking, all within the cyberspace of possible projection of predicted system characteristics. The key to new capabilities of system integration modelling in the considered cyberspace is mainly the ability to improve the environment and the system integration options, with the aim of modern control in the hierarchically arranged dynamic cyberspace, e.g. in the currently desired electronic business with information. The aim of this article is to assess in general terms the trends in the use of modern modelling methods, considering cybernetics applications verified in practice, the modern concept of project management, and also the potential integration of artificial intelligence into the design and project management of integrable and intelligent models, e.g. with optimal structures and adaptable behaviour. The article results from the solution of a specific partial research task at the faculty; in particular, it stresses that the new economics will be based more and more on information and knowledge in the system-defined cyberspace of modern management.

  14. Anatomy and evolution of database search engines-a central component of mass spectrometry based proteomic workflows.

    PubMed

    Verheggen, Kenneth; Raeder, Helge; Berven, Frode S; Martens, Lennart; Barsnes, Harald; Vaudel, Marc

    2017-09-13

    Sequence database search engines are bioinformatics algorithms that identify peptides from tandem mass spectra using a reference protein sequence database. Two decades of development, notably driven by advances in mass spectrometry, have provided scientists with more than 30 published search engines, each with its own properties. In this review, we present the common paradigm behind the different implementations, and its limitations for modern mass spectrometry datasets. We also detail how the search engines attempt to alleviate these limitations, and provide an overview of the different software frameworks available to the researcher. Finally, we highlight alternative approaches for the identification of proteomic mass spectrometry datasets, either as a replacement for, or as a complement to, sequence database search engines. © 2017 Wiley Periodicals, Inc.
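
    The shared paradigm the review describes can be reduced to a few lines: digest each database protein in silico, compute theoretical peptide masses, and report peptides whose mass matches the observed precursor within a tolerance. The sketch below is a deliberately stripped-down illustration (real engines also score fragment-ion spectra against theoretical fragments; the protein sequences here are invented):

      import re

      # Monoisotopic residue masses for the amino acids used below.
      RESIDUE = {"G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
                 "V": 99.06841, "T": 101.04768, "L": 113.08406, "N": 114.04293,
                 "D": 115.02694, "K": 128.09496, "E": 129.04259, "R": 156.10111}
      WATER = 18.01056

      def tryptic_peptides(protein):
          # Trypsin cleaves after K or R (the proline rule is ignored for brevity).
          return [p for p in re.split(r"(?<=[KR])", protein) if p]

      def mass(peptide):
          return sum(RESIDUE[aa] for aa in peptide) + WATER

      database = {"protA": "GASPKVTLNR", "protB": "DKEKGGR"}

      def search(observed_mass, tol=0.02):
          hits = []
          for prot, seq in database.items():
              for pep in tryptic_peptides(seq):
                  if abs(mass(pep) - observed_mass) <= tol:
                      hits.append((prot, pep))
          return hits

      print(search(mass("VTLNR")))   # -> [('protA', 'VTLNR')]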

  15. The development of a dynamic software for the user interaction from the geographic information system environment with the database of the calibration site of the satellite remote electro-optic sensors

    NASA Astrophysics Data System (ADS)

    Zyelyk, Ya. I.; Semeniv, O. V.

    2015-12-01

    The state of the problem of the post-launch calibration of the satellite electro-optic remote sensors and its solutions in Ukraine is analyzed. The database is improved and dynamic services for user interaction with database from the environment of open geographical information system Quantum GIS for information support of calibration activities are created. A dynamic application under QGIS is developed, implementing these services in the direction of the possibility of data entering, editing and extraction from the database, using the technology of object-oriented programming and of modern complex program design patterns. The functional and algorithmic support of this dynamic software and its interface are developed.

  16. The research and realization of digital management platform for ultra-precision optical elements within life-cycle

    NASA Astrophysics Data System (ADS)

    Wang, Juan; Wang, Jian; Li, Lijuan; Zhou, Kun

    2014-08-01

    In order to solve the information fusion, process integration, collaborative design and manufacturing for ultra-precision optical elements within life-cycle management, this paper presents a digital management platform which is based on product data and business processes by adopting the modern manufacturing technique, information technique and modern management technique. The architecture and system integration of the digital management platform are discussed in this paper. The digital management platform can realize information sharing and interaction for information-flow, control-flow and value-stream from user's needs to offline in life-cycle, and it can also enhance process control, collaborative research and service ability of ultra-precision optical elements.

  17. Computer Science and Technology: Modeling and Measurement Techniques for Evaluation of Design Alternatives in the Implementation of Database Management Software. Final Report.

    ERIC Educational Resources Information Center

    Deutsch, Donald R.

    This report describes a research effort that was carried out over a period of several years to develop and demonstrate a methodology for evaluating proposed Database Management System designs. The major proposition addressed by this study is embodied in the thesis statement: Proposed database management system designs can be evaluated best through…

  18. Database Management System

    NASA Technical Reports Server (NTRS)

    1990-01-01

    In 1981 Wayne Erickson founded Microrim, Inc., a company originally focused on marketing a microcomputer version of RIM (Relational Information Manager). Dennis Comfort joined the firm and is now vice president, development. The team developed an advanced spinoff of the NASA system they had originally created, a microcomputer database management system known as R:BASE 4000. Microrim added many enhancements and developed a series of R:BASE products for various environments. R:BASE is now the second largest selling line of microcomputer database management software in the world.

  19. NBIC: National Ballast Information Clearinghouse

    Science.gov Websites

    Smithsonian Environmental Research Center / US Coast Guard. Database Manager: Tami Huber; Senior Analyst / Ecologist: Mark Minton; Data Managers: Ashley Arnwine, Jessica Hardee, Amanda Reynolds; Database Design and Programming / Application Programming: Paul Winterbauer

  20. AGRICULTURAL BEST MANAGEMENT PRACTICE EFFECTIVENESS DATABASE

    EPA Science Inventory

    Resource Purpose: The Agricultural Best Management Practice Effectiveness Database contains the results of research projects which have collected water quality data for the purpose of determining the effectiveness of agricultural management practices in reducing pollutants ...

  1. Modern pollen data from North America and Greenland for multi-scale paleoenvironmental applications

    USGS Publications Warehouse

    Whitmore, J.; Gajewski, K.; Sawada, M.; Williams, J.W.; Shuman, B.; Bartlein, P.J.; Minckley, T.; Viau, A.E.; Webb, T.; Shafer, S.; Anderson, P.; Brubaker, L.

    2005-01-01

    The modern pollen network in North America and Greenland is presented as a database for use in quantitative calibration studies and paleoenvironmental reconstructions. The georeferenced database includes 4634 samples from all regions of the continent and 134 pollen taxa that range from ubiquitous to regionally diagnostic taxa. Climate data and vegetation characteristics were assigned to every site. Automated and manual procedures were used to verify the accuracy of geographic coordinates and identify duplicate records among datasets, incomplete pollen sums, and other potential errors. Data are currently available for almost all of North America, with variable density. Pollen taxonomic diversity, as measured by the Shannon-Wiener coefficient, varies as a function of location, as some vegetation regions are dominated by one or two major pollen producers, while other regions have a more even composition of pollen taxa. Squared-chord distances computed between samples show that most modern pollen samples find analogues within their own vegetation zone. Both temperature and precipitation inferred from best analogues are highly correlated with observed values, but temperature exhibits the stronger relation. Maps of the contemporary distribution of several pollen types in relation to the range of the plant taxon illustrate the correspondence between plant and pollen ranges. © 2005 Elsevier Ltd. All rights reserved.
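
    As an illustration of the analogue matching used here, the squared-chord distance between two pollen spectra p and q is the sum over taxa of (sqrt(p_i) - sqrt(q_i))^2, and the closest modern analogue supplies the inferred climate value. A toy sketch with invented data follows:

      def squared_chord(p, q):
          # p and q are pollen proportions over the same taxon list.
          return sum((pi**0.5 - qi**0.5) ** 2 for pi, qi in zip(p, q))

      # Modern samples: (pollen proportions, observed July temperature, deg C)
      modern = [
          ([0.70, 0.20, 0.10], 12.5),   # spruce-dominated site
          ([0.10, 0.60, 0.30], 18.0),   # oak-dominated site
          ([0.30, 0.30, 0.40], 15.5),
      ]

      def best_analogue_temperature(fossil_spectrum):
          dists = [(squared_chord(fossil_spectrum, spec), temp) for spec, temp in modern]
          return min(dists)[1]          # temperature of the closest modern sample

      print(best_analogue_temperature([0.65, 0.25, 0.10]))   # -> 12.5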

  2. Development and testing of transfer functions for generating quantitative climatic estimates from Australian pollen data

    NASA Astrophysics Data System (ADS)

    Cook, Ellyn J.; van der Kaars, Sander

    2006-10-01

    We review attempts to derive quantitative climatic estimates from Australian pollen data, including the climatic envelope, climatic indicator and modern analogue approaches, and outline the need to pursue alternatives for use as input to, or validation of, simulations by models of past, present and future climate patterns. To this end, we have constructed and tested modern pollen-climate transfer functions for mainland southeastern Australia and Tasmania using the existing southeastern Australian pollen database and for northern Australia using a new pollen database we are developing. After testing for statistical significance, 11 parameters were selected for mainland southeastern Australia, seven for Tasmania and six for northern Australia. The functions are based on weighted-averaging partial least squares regression and their predictive ability evaluated against modern observational climate data using leave-one-out cross-validation. Functions for summer, annual and winter rainfall and temperatures are most robust for southeastern Australia, while in Tasmania functions for minimum temperature of the coldest period, mean winter and mean annual temperature are the most reliable. In northern Australia, annual and summer rainfall and annual and summer moisture indexes are the strongest. The validation of all functions means all can be applied to Quaternary pollen records from these three areas with confidence.
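
    Leave-one-out cross-validation, as used to evaluate these transfer functions, refits the model n times with one sample withheld each time and predicts that sample. The sketch below applies the procedure to a simple weighted-averaging transfer function, a simpler relative of the WA-PLS regression actually used in the study; all numbers are invented:

      def wa_fit(spectra, climate):
          # Taxon optimum = abundance-weighted mean of the climate variable.
          n_taxa = len(spectra[0])
          optima = []
          for j in range(n_taxa):
              w = sum(s[j] for s in spectra)
              optima.append(sum(s[j] * c for s, c in zip(spectra, climate)) / w)
          return optima

      def wa_predict(optima, spectrum):
          return sum(o * a for o, a in zip(optima, spectrum)) / sum(spectrum)

      spectra = [[0.7, 0.2, 0.1], [0.1, 0.6, 0.3], [0.3, 0.3, 0.4], [0.5, 0.4, 0.1]]
      rainfall = [1200.0, 600.0, 800.0, 1000.0]

      # Leave-one-out: refit without each sample, predict it, accumulate the error.
      errors = []
      for i in range(len(spectra)):
          train_s = spectra[:i] + spectra[i + 1:]
          train_c = rainfall[:i] + rainfall[i + 1:]
          pred = wa_predict(wa_fit(train_s, train_c), spectra[i])
          errors.append(pred - rainfall[i])

      rmsep = (sum(e * e for e in errors) / len(errors)) ** 0.5
      print(round(rmsep, 1))   # root mean squared error of prediction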

  3. Evaluation of relational and NoSQL database architectures to manage genomic annotations.

    PubMed

    Schulz, Wade L; Nelson, Brent G; Felker, Donn K; Durant, Thomas J S; Torres, Richard

    2016-12-01

    While the adoption of next generation sequencing has rapidly expanded, the informatics infrastructure used to manage the data generated by this technology has not kept pace. Historically, relational databases have provided much of the framework for data storage and retrieval. Newer technologies based on NoSQL architectures may provide significant advantages in storage and query efficiency, thereby reducing the cost of data management. But their relative advantage when applied to biomedical data sets, such as genetic data, has not been characterized. To this end, we compared the storage, indexing, and query efficiency of a common relational database (MySQL), a document-oriented NoSQL database (MongoDB), and a relational database with NoSQL support (PostgreSQL). When used to store genomic annotations from the dbSNP database, we found the NoSQL architectures to outperform traditional, relational models for speed of data storage, indexing, and query retrieval in nearly every operation. These findings strongly support the use of novel database technologies to improve the efficiency of data management within the biological sciences. Copyright © 2016 Elsevier Inc. All rights reserved.
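
    The contrast between the two storage models can be sketched compactly. The example below uses SQLite (Python's standard library) as a stand-in so it is self-contained: a normalized relational table beside a document-style table holding one JSON blob per record, as a document store such as MongoDB would; the schema is a loose, invented imitation of a dbSNP-style annotation:

      import json
      import sqlite3

      db = sqlite3.connect(":memory:")

      # Relational: fixed columns, queried with SQL predicates.
      db.execute("""CREATE TABLE variant (
          rsid TEXT PRIMARY KEY, chrom TEXT, pos INTEGER, ref TEXT, alt TEXT)""")
      db.execute("INSERT INTO variant VALUES ('rs12345', '7', 117559590, 'G', 'A')")

      # Document-oriented: one JSON blob per record, flexible nested schema.
      db.execute("CREATE TABLE variant_doc (rsid TEXT PRIMARY KEY, doc TEXT)")
      doc = {"rsid": "rs12345", "chrom": "7", "pos": 117559590,
             "ref": "G", "alt": "A", "clinical": {"significance": "benign"}}
      db.execute("INSERT INTO variant_doc VALUES (?, ?)", (doc["rsid"], json.dumps(doc)))

      print(db.execute(
          "SELECT rsid FROM variant WHERE chrom='7' AND pos BETWEEN 117559000 AND 117560000"
      ).fetchall())
      print(json.loads(db.execute(
          "SELECT doc FROM variant_doc WHERE rsid='rs12345'").fetchone()[0])["clinical"])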

  4. Technologies and problems of reengineering of the business processes of company

    NASA Astrophysics Data System (ADS)

    Silka, Dmitriy

    2017-10-01

    Management of the combination of business processes is a modern approach in the field of business management. Together with many other management approaches, business processes allow us to identify all the resultant actions. The article reveals the modern view of the essence of business processes as well as general approaches to their allocation. Principles of construction and business-process re-engineering are proposed. Recommendations are provided on how to perform re-engineering under the highly cyclic dynamics of business activity.

  5. Should pediatric neurosurgeons still manage neurotrauma today?

    PubMed

    Peter, Jonathan C

    2010-02-01

    Neurotrauma remains a major global burden of injury, especially for young patients, and will consequently always be a condition that pediatric neurosurgeons are called upon to treat. However, the face of modern neurotrauma management is changing, presenting important challenges to today's pediatric neurosurgeons. This article summarizes some of the issues in neurotrauma facing clinicians whose responsibility it is to treat these children. It is up to the individual neurosurgeon to familiarize him- or herself with the emerging literature on the modern management of pediatric neurotrauma.

  6. Impact of Recent Hardware and Software Trends on High Performance Transaction Processing and Analytics

    NASA Astrophysics Data System (ADS)

    Mohan, C.

    In this paper, I survey briefly some of the recent and emerging trends in hardware and software features which impact high performance transaction processing and data analytics applications. These features include multicore processor chips, ultra large main memories, flash storage, storage class memories, database appliances, field programmable gate arrays, transactional memory, key-value stores, and cloud computing. While some applications, e.g., Web 2.0 ones, were initially built without traditional transaction processing functionality in mind, slowly system architects and designers are beginning to address such previously ignored issues. The availability, analytics and response time requirements of these applications were initially given more importance than ACID transaction semantics and resource consumption characteristics. A project at IBM Almaden is studying the implications of phase change memory on transaction processing, in the context of a key-value store. Bitemporal data management has also become an important requirement, especially for financial applications. Power consumption and heat dissipation properties are also major considerations in the emergence of modern software and hardware architectural features. Considerations relating to ease of configuration, installation, maintenance and monitoring, and improvement of total cost of ownership have resulted in database appliances becoming very popular. The MapReduce paradigm is now quite popular for large scale data analysis, in spite of the major inefficiencies associated with it.
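
    As a reminder of how the MapReduce paradigm mentioned above works, the toy sketch below runs the three phases (map, shuffle, reduce) in-process over a handful of invented log lines; real frameworks distribute the same phases across machines, which is also the source of the inefficiencies the author notes:

      from collections import defaultdict

      log_lines = [
          "GET /index.html 200",
          "GET /data.csv 404",
          "GET /index.html 200",
      ]

      def map_phase(line):
          # Emit (status_code, 1) for each request.
          yield line.split()[-1], 1

      def reduce_phase(key, values):
          return key, sum(values)

      # Shuffle: group mapped pairs by key.
      groups = defaultdict(list)
      for line in log_lines:
          for k, v in map_phase(line):
              groups[k].append(v)

      print(dict(reduce_phase(k, vs) for k, vs in groups.items()))
      # -> {'200': 2, '404': 1}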

  7. Application of cloud database in the management of clinical data of patients with skin diseases.

    PubMed

    Mao, Xiao-fei; Liu, Rui; DU, Wei; Fan, Xue; Chen, Dian; Zuo, Ya-gang; Sun, Qiu-ning

    2015-04-01

    To evaluate the needs and applications of a cloud database in the daily practice of a dermatology department. A cloud database was established for systemic scleroderma and localized scleroderma. Paper forms were used to record the original data, including personal information, pictures, specimens, blood biochemical indicators, skin lesions, and scores of self-rating scales. The results were input into the cloud database. The applications of the cloud database in the dermatology department were summarized and analyzed. The personal and clinical information of 215 systemic scleroderma patients and 522 localized scleroderma patients was included and analyzed using the cloud database. The disease status, quality of life, and prognosis were obtained by statistical calculations. The cloud database can efficiently and rapidly store and manage the data of patients with skin diseases. As a simple, prompt, safe, and convenient tool, it can be used in patient information management, clinical decision-making, and scientific research.

  8. Does leadership style of modern matrons contribute to safer and more effective clinical services?

    PubMed

    Hill, Barry

    2017-03-30

    At the time of writing, the author was a modern matron in a surgical division of an NHS teaching hospital in London. This article considers the differences between leadership and management, and discusses the skills required by modern matrons to lead safe and successful clinical services. It also examines three leadership styles - transactional, transformational and situational - and their relevance to the role of modern matron.

  9. 23 CFR 972.204 - Management systems requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS HIGHWAYS FISH AND... to operate and maintain the management systems and their associated databases; and (5) A process for... systems will use databases with a geographical reference system that can be used to geolocate all database...

  10. A web based relational database management system for filariasis control

    PubMed Central

    Murty, Upadhyayula Suryanarayana; Kumar, Duvvuri Venkata Rama Satya; Sriram, Kumaraswamy; Rao, Kadiri Madhusudhan; Bhattacharyulu, Chakravarthula Hayageeva Narasimha Venakata; Praveen, Bhoopathi; Krishna, Amirapu Radha

    2005-01-01

    The present study describes a RDBMS (relational database management system) for the effective management of filariasis, a vector-borne disease. Filariasis infects 120 million people in 83 countries. The possible re-emergence of the disease and the complexity of existing control programs warrant the development of new strategies. A database containing comprehensive data associated with filariasis finds utility in disease control. We have developed a database containing information on the socio-economic status of patients, mosquito collection procedures, mosquito dissection data, filariasis survey reports and mass blood data. The database can be searched using a user-friendly web interface. Availability http://www.webfil.org (login and password can be obtained from the authors) PMID:17597846

  11. Integrating RFID technique to design mobile handheld inventory management system

    NASA Astrophysics Data System (ADS)

    Huang, Yo-Ping; Yen, Wei; Chen, Shih-Chung

    2008-04-01

    An RFID-based mobile handheld inventory management system is proposed in this paper. Differing from the manual inventory management method, the proposed system works on the personal digital assistant (PDA) with an RFID reader. The system identifies electronic tags on the properties and checks the property information in the back-end database server through a ubiquitous wireless network. The system also provides a set of functions to manage the back-end inventory database and assigns different levels of access privilege according to various user categories. In the back-end database server, to prevent improper or illegal accesses, the server not only stores the inventory database and user privilege information, but also keeps track of the user activities in the server including the login and logout time and location, the records of database accessing, and every modification of the tables. Some experimental results are presented to verify the applicability of the integrated RFID-based mobile handheld inventory management system.
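
    The back-end behavior described above, privilege checking plus activity tracking, can be sketched as follows (table names, privilege levels, and the lookup function are illustrative assumptions, not the paper's actual schema):

      import sqlite3
      from datetime import datetime, timezone

      db = sqlite3.connect(":memory:")
      db.executescript("""
      CREATE TABLE property (tag_id TEXT PRIMARY KEY, name TEXT, location TEXT);
      CREATE TABLE user (name TEXT PRIMARY KEY, privilege INTEGER);  -- higher = more access
      CREATE TABLE audit (at TEXT, user TEXT, tag_id TEXT, action TEXT);
      """)
      db.execute("INSERT INTO property VALUES ('E2001234', 'Projector', 'Room 301')")
      db.execute("INSERT INTO user VALUES ('clerk', 1)")
      db.execute("INSERT INTO user VALUES ('manager', 2)")

      def lookup(user, tag_id):
          priv = db.execute("SELECT privilege FROM user WHERE name=?", (user,)).fetchone()
          # Every attempt is recorded, mirroring the paper's activity tracking.
          db.execute("INSERT INTO audit VALUES (?, ?, ?, 'lookup')",
                     (datetime.now(timezone.utc).isoformat(), user, tag_id))
          if priv is None or priv[0] < 1:
              return None               # unknown or under-privileged user
          return db.execute("SELECT name, location FROM property WHERE tag_id=?",
                            (tag_id,)).fetchone()

      print(lookup("clerk", "E2001234"))        # ('Projector', 'Room 301')
      print(db.execute("SELECT user, action FROM audit").fetchall())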

  12. Alternatives to relational database: comparison of NoSQL and XML approaches for clinical data storage.

    PubMed

    Lee, Ken Ka-Yin; Tang, Wai-Choi; Choi, Kup-Sze

    2013-04-01

    Clinical data are dynamic in nature, often arranged hierarchically and stored as free text and numbers. Effective management of clinical data and the transformation of the data into structured format for data analysis are therefore challenging issues in electronic health records development. Despite the popularity of relational databases, the scalability of the NoSQL database model and the document-centric data structure of XML databases appear to be promising features for effective clinical data management. In this paper, three database approaches--NoSQL, XML-enabled and native XML--are investigated to evaluate their suitability for structured clinical data. The database query performance is reported, together with our experience in the databases development. The results show that NoSQL database is the best choice for query speed, whereas XML databases are advantageous in terms of scalability, flexibility and extensibility, which are essential to cope with the characteristics of clinical data. While NoSQL and XML technologies are relatively new compared to the conventional relational database, both of them demonstrate potential to become a key database technology for clinical data management as the technology further advances. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  13. Analysis and preliminary design of Kunming land use and planning management information system

    NASA Astrophysics Data System (ADS)

    Li, Li; Chen, Zhenjie

    2007-06-01

    This article analyzes the Kunming land use planning and management information system in terms of system building objectives and requirements, and pins down the system's users, functional requirements, and construction requirements. On this basis, a three-tier system architecture based on C/S and B/S is defined: the user interface layer, the business logic layer, and the data services layer. According to the requirements for the construction of the land use planning and management information database, derived from standards of the Ministry of Land and Resources and the construction program of the Golden Land Project, this paper divides the system databases into a planning document database, a planning implementation database, a working map database, and a system maintenance database. In the design of the system interface, various methods and data formats are used for data transmission and sharing between upper and lower levels. According to the system analysis results, the main modules of the system are designed as follows: planning data management, planning and annual plan preparation and control, day-to-day planning management, planning revision management, decision-making support, thematic query statistics, planning public participation, and so on; in addition, the system realization technologies are discussed in terms of system operation mode, development platform, and other aspects.

  14. Map and digital database of sedimentary basins and indications of petroleum in the Central Alaska Province

    USGS Publications Warehouse

    Troutman, Sandra M.; Stanley, Richard G.

    2003-01-01

    This database and accompanying text depict historical and modern reported occurrences of petroleum both in wells and at the surface within the boundaries of the Central Alaska Province. These data were compiled from previously published and unpublished sources and were prepared for use in the 2002 U.S. Geological Survey petroleum assessment of Central Alaska, Yukon Flats region. Indications of petroleum are described as oil or gas shows in wells, oil or gas seeps, or outcrops of oil shale or oil-bearing rock and include confirmed and unconfirmed reports. The scale of the source map limits the spatial resolution (scale) of the database to 1:2,500,000 or smaller.

  15. Serials Management by Microcomputer: The Potential of DBMS.

    ERIC Educational Resources Information Center

    Vogel, J. Thomas; Burns, Lynn W.

    1984-01-01

    Describes serials management at the Philadelphia College of Textiles and Science library via a microcomputer, a file manager called PFS, and a relational database management system called dBase II. Check-in procedures, programming with dBase II, "static" and "active" databases, and claim procedures are discussed. Check-in forms are…

  16. Modern Approaches to Training Competitive Locksmiths-Electricians in the EU Countries

    ERIC Educational Resources Information Center

    Roskvas, Ihor

    2017-01-01

    The article deals with the issues of modern approaches to training competitive locksmiths-electricians and the influence of effective management on the process of vocational training. The modern labour market needs concerning vocational training of highly qualified workers have been analyzed. The concept of a competitive worker has been revealed…

  17. Some Reliability Issues in Very Large Databases.

    ERIC Educational Resources Information Center

    Lynch, Clifford A.

    1988-01-01

    Describes the unique reliability problems of very large databases that necessitate specialized techniques for hardware problem management. The discussion covers the use of controlled partial redundancy to improve reliability, issues in operating systems and database management systems design, and the impact of disk technology on very large…

  18. Tufts Health Sciences Database: Lessons, Issues, and Opportunities.

    ERIC Educational Resources Information Center

    Lee, Mary Y.; Albright, Susan A.; Alkasab, Tarik; Damassa, David A.; Wang, Paul J.; Eaton, Elizabeth K.

    2003-01-01

    Describes a seven-year experience with developing the Tufts Health Sciences Database, a database-driven information management system that combines the strengths of a digital library, content delivery tools, and curriculum management. Identifies major effects on teaching and learning. Also addresses issues of faculty development, copyright and…

  19. Herbal traditional Chinese medicine and its evidence base in gastrointestinal disorders

    PubMed Central

    Teschke, Rolf; Wolff, Albrecht; Frenzel, Christian; Eickhoff, Axel; Schulze, Johannes

    2015-01-01

    Herbal traditional Chinese medicine (TCM) is used to treat several ailments, but its efficiency is poorly documented and hence debated, as opposed to modern medicine commonly providing effective therapies. The aim of this review article is to present a practical reference guide on the role of herbal TCM in managing gastrointestinal disorders, supported by systematic reviews and evidence based trials. A literature search using herbal TCM combined with terms for gastrointestinal disorders in PubMed and the Cochrane database identified publications of herbal TCM trials. Results were analyzed for study type, inclusion criteria, and outcome parameters. Quality of placebo controlled, randomized, double-blind clinical trials was poor, mostly neglecting stringent evidence based diagnostic and therapeutic criteria. Accordingly, appropriate Cochrane reviews and meta-analyses were limited and failed to support valid, clinically relevant evidence based efficiency of herbal TCM in gastrointestinal diseases, including gastroesophageal reflux disease, gastric or duodenal ulcer, dyspepsia, irritable bowel syndrome, ulcerative colitis, and Crohn’s disease. In conclusion, the use of herbal TCM to treat various diseases has an interesting philosophical background with a long history, but it received increasing skepticism due to the lack of evidence based efficiency as shown by high quality trials; this has now been summarized for gastrointestinal disorders, with TCM not recommended for most gastrointestinal diseases. Future studies should focus on placebo controlled, randomized, double-blind clinical trials, herbal product quality and standard criteria for diagnosis, treatment, outcome, and assessment of adverse herb reactions. This approach will provide figures of risk/benefit profiles that hopefully are positive for at least some treatment modalities of herbal TCM. Proponents of modern herbal TCM best face these promising challenges of pragmatic modern medicine by bridging the gap between the two medicinal cultures. PMID:25914456

  20. Herbal traditional Chinese medicine and its evidence base in gastrointestinal disorders.

    PubMed

    Teschke, Rolf; Wolff, Albrecht; Frenzel, Christian; Eickhoff, Axel; Schulze, Johannes

    2015-04-21

    Herbal traditional Chinese medicine (TCM) is used to treat several ailments, but its efficiency is poorly documented and hence debated, as opposed to modern medicine commonly providing effective therapies. The aim of this review article is to present a practical reference guide on the role of herbal TCM in managing gastrointestinal disorders, supported by systematic reviews and evidence based trials. A literature search using herbal TCM combined with terms for gastrointestinal disorders in PubMed and the Cochrane database identified publications of herbal TCM trials. Results were analyzed for study type, inclusion criteria, and outcome parameters. Quality of placebo controlled, randomized, double-blind clinical trials was poor, mostly neglecting stringent evidence based diagnostic and therapeutic criteria. Accordingly, appropriate Cochrane reviews and meta-analyses were limited and failed to support valid, clinically relevant evidence based efficiency of herbal TCM in gastrointestinal diseases, including gastroesophageal reflux disease, gastric or duodenal ulcer, dyspepsia, irritable bowel syndrome, ulcerative colitis, and Crohn's disease. In conclusion, the use of herbal TCM to treat various diseases has an interesting philosophical background with a long history, but it received increasing skepticism due to the lack of evidence based efficiency as shown by high quality trials; this has now been summarized for gastrointestinal disorders, with TCM not recommended for most gastrointestinal diseases. Future studies should focus on placebo controlled, randomized, double-blind clinical trials, herbal product quality and standard criteria for diagnosis, treatment, outcome, and assessment of adverse herb reactions. This approach will provide figures of risk/benefit profiles that hopefully are positive for at least some treatment modalities of herbal TCM. Proponents of modern herbal TCM best face these promising challenges of pragmatic modern medicine by bridging the gap between the two medicinal cultures.

  1. Characterization of Orbital Debris via Hyper-Velocity Ground-Based Tests

    NASA Technical Reports Server (NTRS)

    Cowardin, H.

    2015-01-01

    Existing DoD and NASA satellite breakup models are based on a key laboratory-based test, the Satellite Orbital debris Characterization Impact Test (SOCIT), which has supported many applications and matched on-orbit events involving older satellite designs reasonably well over the years. In order to update and improve the breakup models and the NASA Size Estimation Model (SEM) for events involving more modern satellite designs, the NASA Orbital Debris Program Office has worked in collaboration with the University of Florida to replicate a hypervelocity impact using a satellite built with modern-day spacecraft materials and construction techniques. The spacecraft, called DebriSat, was intended to be representative of modern LEO satellites, and all major design decisions were reviewed and approved by subject matter experts at the Aerospace Corporation. DebriSat is composed of 7 major subsystems, including the attitude determination and control system (ADCS), command and data handling (C&DH), electrical power system (EPS), payload, propulsion, telemetry tracking and command (TT&C), and thermal management. To reduce cost, most components are emulated based on existing designs of flight hardware and fabricated with the same materials. All fragments down to 2 mm in size will be characterized by material, size, shape, and bulk density, and the associated data will be stored in a database for multiple users to access. Laboratory radar and optical measurements will be performed on a subset of fragments to provide a better understanding of the data products from orbital debris acquired by ground-based radars and telescopes. The resulting data analysis from DebriSat will be used to update breakup models and develop the first optical SEM in conjunction with updates to the current NASA SEM. The characterization of the fragmentation will be discussed in the subsequent presentation.

  2. [Establishment of a regional pelvic trauma database in Hunan Province].

    PubMed

    Cheng, Liang; Zhu, Yong; Long, Haitao; Yang, Junxiao; Sun, Buhua; Li, Kanghua

    2017-04-28

    To establish a database for pelvic trauma in Hunan Province, and to start the work of a multicenter pelvic trauma registry.
 Methods: To establish the database, literature relevant to pelvic trauma was screened, experience from trauma databases already established in China and abroad was drawn upon, and the actual situation of pelvic trauma rescue in Hunan Province was considered. The database for pelvic trauma was established based on PostgreSQL and the advanced programming language Java 1.6.
 Results: The complex procedure for pelvic trauma rescue was described structurally. The contents of the database include general patient information, injury condition, prehospital rescue, condition on admission, treatment in hospital, status on discharge, diagnosis, classification, complications, trauma scoring, and therapeutic effect. The database can be accessed through the internet by browser/server. The functions of the database include patient information management, data export, history query, progress reporting, video-image management, and personal information management.
 Conclusion: A whole-life-cycle pelvic trauma database has been successfully established for the first time in China. It is scientific, functional, practical, and user-friendly.

  3. Database on Demand: insight how to build your own DBaaS

    NASA Astrophysics Data System (ADS)

    Gaspar Aparicio, Ruben; Coterillo Coz, Ignacio

    2015-12-01

    At CERN, a number of key database applications run on user-managed MySQL, PostgreSQL and Oracle database services. The Database on Demand (DBoD) project was born out of an idea to provide the CERN user community with an environment to develop and run database services as a complement to the central Oracle-based database service. Database on Demand empowers users to perform certain actions traditionally done by database administrators, providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines; at present, engines from three major RDBMS (relational database management system) vendors are offered. In this article we show the status of the service after almost three years of operations, give some insight into our redesigned software engineering, and outline its near-future evolution.

  4. A storage scheme for the real-time database supporting the on-line commitment

    NASA Astrophysics Data System (ADS)

    Dai, Hong-bin; Jing, Yu-jian; Wang, Hui

    2013-07-01

    Modern SCADA (Supervisory Control and Data Acquisition) systems have been applied to many aspects of everyday life. As time goes on, the requirements of the systems' applications change, so the data structure of the real-time database, which is the core of a SCADA system, often needs modification. As a result, a commitment consisting of a sequence of configuration operations modifying the data structure of the real-time database is performed from time to time. Although it is simple to perform the commitment off-line, by first stopping and then restarting the system and reconstructing all the data in the real-time database, it is much preferred, and in some cases even necessary, to perform the commitment on-line, during which the real-time database can still provide real-time service and the system continues working normally. In this paper, a storage scheme for the data in the real-time database is proposed that helps the real-time database support on-line commitment while real-time service remains available.
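
    One common way to achieve such an on-line commitment, shown here only as a generic illustration and not as the paper's actual storage scheme, is to apply the configuration operations to a shadow copy while readers keep using the live structure, then publish the new structure with a single atomic reference swap:

      live = {"points": {"pump1": {"value": 3.7}}}   # real-time tags being served

      def read(tag):
          return live["points"][tag]                  # readers always see a complete structure

      def commit(configure):
          """Apply configuration operations to a shadow copy, then swap it in."""
          global live
          shadow = {"points": {k: dict(v) for k, v in live["points"].items()}}
          configure(shadow)                           # may take long; reads continue on `live`
          live = shadow                               # single reference assignment = commit

      # Configuration change: add an alarm-limit field to every point.
      commit(lambda db: [v.setdefault("hi_limit", 100.0) for v in db["points"].values()])
      print(read("pump1"))                            # {'value': 3.7, 'hi_limit': 100.0}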

  5. Insertion algorithms for network model database management systems

    NASA Astrophysics Data System (ADS)

    Mamadolimov, Abdurashid; Khikmat, Saburov

    2017-12-01

    The network model is a database model conceived as a flexible way of representing objects and their relationships. Its distinguishing feature is that the schema, viewed as a graph in which object types are nodes and relationship types are arcs, forms a partial order. When a database is large and a query comparison is expensive, the efficiency requirement for managing algorithms is to minimize the number of query comparisons. We consider the updating operation for network model database management systems and develop a new sequential algorithm for it. We also suggest a distributed version of the algorithm.
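
    One way the number of comparisons can be reduced is by exploiting the partial order's transitivity: if the new item does not fall under a node, none of that node's descendants need to be tested. The sketch below is a generic illustration of this pruning idea over invented data, not the authors' algorithm; the is_below test stands in for the expensive query comparison the abstract mentions:

      children = {"record": ["document", "image"], "document": ["report"],
                  "image": ["photo"], "report": [], "photo": []}

      # Toy membership oracle: each item lists the types it belongs to.
      memberships = {"annual_report.pdf": {"record", "document", "report"}}

      calls = [0]
      def is_below(item, node):
          calls[0] += 1                  # each call models one expensive query comparison
          return node in memberships[item]

      def find_parents(new_item, root):
          """Lowest schema nodes the new item falls under (root assumed to subsume all)."""
          parents, stack, seen = [], [root], set()
          while stack:
              node = stack.pop()
              if node in seen:
                  continue
              seen.add(node)
              lower = [c for c in children[node] if is_below(new_item, c)]
              if lower:
                  stack.extend(lower)    # descend only where the test holds;
              else:                      # failed branches and their subtrees are skipped
                  parents.append(node)
          return parents

      print(find_parents("annual_report.pdf", "record"))   # ['report']
      print("comparisons:", calls[0])    # 'photo' was never tested: its parent failed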

  6. Modern Functions of a Textbook on Social Sciences and Humanities as an Informational Management Tool of University Education

    ERIC Educational Resources Information Center

    Nikonova, Elina I.; Sharonov, Ivan A.; Sorokoumova, Svetlana N.; Suvorova, Olga V.; Sorokoumova, Elena A.

    2016-01-01

    The relevance of the study is conditioned by the changes in the content of socio-humanitarian education, aimed at the acquisition of knowledge, the development of tolerance, civic and moral education. The purpose of the paper is to identify the modern functions of a textbook on social sciences and humanities as an informational management tool of…

  7. Historical and modern disturbance regimes, stand structures, and landscape dynamics in pinon-juniper vegetation of the western U.S.

    Treesearch

    William H. Romme; Craig D. Allen; John D. Bailey; William L. Baker; Brandon T. Bestelmeyer; Peter M. Brown; Karen S. Eisenhart; Lisa Floyd-Hanna; Dustin W. Huffman; Brian F. Jacobs; Richard F. Miller; Esteban H. Muldavin; Thomas W. Swetnam; Robin J. Tausch; Peter J. Weisberg

    2008-01-01

    Pinon-juniper is one of the major vegetation types in western North America. It covers a huge area, provides many resources and ecosystem services, and is of great management concern. Management of pinon-juniper vegetation has been hindered, especially where ecological restoration is a goal, by inadequate understanding of the variability in historical and modern...

  8. Using Statistics for Database Management in an Academic Library.

    ERIC Educational Resources Information Center

    Hyland, Peter; Wright, Lynne

    1996-01-01

    Collecting statistical data about database usage by library patrons aids in the management of CD-ROM and database offerings, collection development, and evaluation of training programs. Two approaches to data collection are presented which should be used together: an automated or nonintrusive method which monitors search sessions while the…

  9. Database Software Selection for the Egyptian National STI Network.

    ERIC Educational Resources Information Center

    Slamecka, Vladimir

    The evaluation and selection of information/data management system software for the Egyptian National Scientific and Technical (STI) Network are described. An overview of the state-of-the-art of database technology elaborates on the differences between information retrieval and database management systems (DBMS). The desirable characteristics of…

  10. Teaching Database Management System Use in a Library School Curriculum.

    ERIC Educational Resources Information Center

    Cooper, Michael D.

    1985-01-01

    Description of database management systems course being taught to students at School of Library and Information Studies, University of California, Berkeley, notes course structure, assignments, and course evaluation. Approaches to teaching concepts of three types of database systems are discussed and systems used by students in the course are…

  11. The land management and operations database (LMOD)

    USDA-ARS?s Scientific Manuscript database

    This paper presents the design, implementation, deployment, and application of the Land Management and Operations Database (LMOD). LMOD is the single authoritative source for land management and operations reference data within the USDA enterprise data warehouse. LMOD supports modeling appl...

  12. Development of bilateral data transferability in the Virginia Department of Transportation's Geotechnical Database Management System Framework.

    DOT National Transportation Integrated Search

    2006-01-01

    An Internet-based, spatiotemporal Geotechnical Database Management System (GDBMS) Framework was designed, developed, and implemented at the Virginia Department of Transportation (VDOT) in 2002 to retrieve, manage, archive, and analyze geotechnical da...

  13. Modernization of the USGS Hawaiian Volcano Observatory Seismic Processing Infrastructure

    NASA Astrophysics Data System (ADS)

    Antolik, L.; Shiro, B.; Friberg, P. A.

    2016-12-01

    The USGS Hawaiian Volcano Observatory (HVO) operates a Tier 1 Advanced National Seismic System (ANSS) seismic network to monitor, characterize, and report on volcanic and earthquake activity in the State of Hawaii. Upgrades at the observatory since 2009 have improved the digital telemetry network, computing resources, and seismic data processing with the adoption of the ANSS Quake Management System (AQMS). HVO aims to build on these efforts by further modernizing its seismic processing infrastructure, strengthening its ability to meet ANSS performance standards. Most notably, this will also allow HVO to support redundant systems, both onsite and offsite, in order to provide better continuity of operation during intermittent power and network outages. We are in the process of implementing a number of upgrades and improvements to HVO's seismic processing infrastructure, including: 1) virtualization of AQMS physical servers; 2) migration of server operating systems from Solaris to Linux; 3) consolidation of AQMS real-time and post-processing services to a single server; 4) upgrading the database from Oracle 10 to Oracle 12; and 5) upgrading to the latest Earthworm and AQMS software. These improvements will make server administration more efficient, minimize the hardware resources required by AQMS, simplify the Oracle replication setup, and provide better integration with HVO's existing state-of-health monitoring tools and backup system. Ultimately, they will provide HVO with the latest and most secure software available while making the software easier to deploy and support.

  14. Transaction Processing Performance Council (TPC): State of the Council 2010

    NASA Astrophysics Data System (ADS)

    Nambiar, Raghunath; Wakou, Nicholas; Carman, Forrest; Majdalany, Michael

    The Transaction Processing Performance Council (TPC) is a non-profit corporation founded to define transaction processing and database benchmarks and to disseminate objective, verifiable performance data to the industry. Established in August 1988, the TPC has been integral in shaping the landscape of modern transaction processing and database benchmarks over the past twenty-two years. This paper provides an overview of the TPC's existing benchmark standards and specifications, introduces two new TPC benchmarks under development, and examines the TPC's active involvement in the early creation of additional future benchmarks.

  15. XML Storage for Magnetotelluric Transfer Functions: Towards a Comprehensive Online Reference Database

    NASA Astrophysics Data System (ADS)

    Kelbert, A.; Blum, C.

    2015-12-01

    Magnetotelluric Transfer Functions (MT TFs) represent most of the information about Earth electrical conductivity found in the raw electromagnetic data, providing inputs for further inversion and interpretation. To be useful for scientific interpretation, they must also contain carefully recorded metadata. Making these data available in a discoverable and citable fashion would provide the most benefit to the scientific community, but such a development requires that the metadata is not only present in the file but is also searchable. The most commonly used MT TF format to date, the historical Society of Exploration Geophysicists Electromagnetic Data Interchange Standard 1987 (EDI), no longer supports some of the needs of modern magnetotellurics, most notably accurate error bars recording. Moreover, the inherent heterogeneity of EDI's and other historic MT TF formats has mostly kept the community away from healthy data sharing practices. Recently, the MT team at Oregon State University in collaboration with IRIS Data Management Center developed a new, XML-based format for MT transfer functions, and an online system for long-term storage, discovery and sharing of MT TF data worldwide (IRIS SPUD; www.iris.edu/spud/emtf). The system provides a query page where all of the MT transfer functions collected within the USArray MT experiment and other field campaigns can be searched for and downloaded; an automatic on-the-fly conversion to the historic EDI format is also included. To facilitate conversion to the new, more comprehensive and sustainable, XML format for MT TFs, and to streamline inclusion of historic data into the online database, we developed a set of open source format conversion tools, which can be used for rotation of MT TFs as well as a general XML <-> EDI converter (https://seiscode.iris.washington.edu/projects/emtf-fcu). Here, we report on the newly established collaboration between the USGS Geomagnetism Program and the Oregon State University to gather and convert both historic and modern-day MT or related transfer functions into the searchable database at the IRIS DMC. The more complete and free access to these previously collected MT TFs will be of great value to MT scientists both in planning future surveys, and then to leverage the value of the new data at the inversion and interpretation stage.

  16. Leading with charisma.

    PubMed

    Davidhizar, R

    1993-04-01

    Traditionally, leaders have used characteristics related to authority, control, competition and logic. Such approaches are more autocratic, and task-oriented. With changes in society, employers are focusing less on tasks and more on job satisfaction. Leaders are focusing on co-operation versus competition. Human relations and recognition are being used as motivators. Charisma is an important characteristic for leaders who wish to motivate by interpersonal characteristics. Transformational leadership is an emerging paradigm for modern management and can be important to the modern nurse manager as well. This paper describes charisma and how it can be useful to the nurse manager.

  17. Modern management of juvenile recurrent parotitis.

    PubMed

    Capaccio, P; Sigismund, P E; Luca, N; Marchisio, P; Pignataro, L

    2012-12-01

    To evaluate modern diagnostic and therapeutic management of juvenile recurrent parotitis, and to show the benefits of operative sialoendoscopy on the basis of our experience in 14 patients and the results of others. Ultrasonography is sensitive in detecting the pathological features of juvenile recurrent parotitis. Interventional sialoendoscopy is a safe and effective method of treating the disease. In our case series, after a mean follow-up time of 30 months only 5 patients experienced recurrence of symptoms, with a mean symptom-free period of 20 months. The use of modern, minimally invasive diagnostic tools such as colour Doppler ultrasonography, magnetic resonance sialography and sialoendoscopy represents a new frontier in the management of juvenile recurrent parotitis. Operative sialoendoscopy also has the important therapeutic benefit of reducing the number of recurrences of acute episodes of parotitis, thus giving patients a better quality of life until puberty.

  18. Kristin Munch | NREL

    Science.gov Websites

    Presentations: Information Management System, Materials Research Society Fall Meeting (2013); Photovoltaics Informatics. Expertise: scientific data management, database and data systems design, database clusters, storage systems integration, and distributed data analytics. She has used her experience in laboratory data management systems, lab

  19. Development of the interconnectivity and enhancement (ICE) module in the Virginia Department of Transportation's Geotechnical Database Management System Framework.

    DOT National Transportation Integrated Search

    2007-01-01

    An Internet-based, spatiotemporal Geotechnical Database Management System (GDBMS) Framework was implemented at the Virginia Department of Transportation (VDOT) in 2002 to manage geotechnical data using a distributed Geographical Information System (G...

  20. 23 CFR 973.204 - Management systems requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS HIGHWAYS MANAGEMENT... system; (2) A process to operate and maintain the management systems and their associated databases; (3... systems shall use databases with a common or coordinated reference system that can be used to geolocate...

  1. 23 CFR 973.204 - Management systems requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS HIGHWAYS MANAGEMENT... system; (2) A process to operate and maintain the management systems and their associated databases; (3... systems shall use databases with a common or coordinated reference system that can be used to geolocate...

  2. Developing Modern Information Systems and Services: Africa's Challenges for the Future.

    ERIC Educational Resources Information Center

    Chowdhury, G. G.

    1996-01-01

    Discusses the current state of information systems and services in Africa, examines future possibilities, and suggests areas for improvement. Topics include the lack of automation; CD-ROM databases for accessibility to information sources; developing low-cost electronic communication facilities; Internet connectivity; dependence on imported…

  3. Salary Management System for Small and Medium-sized Enterprises

    NASA Astrophysics Data System (ADS)

    Hao, Zhang; Guangli, Xu; Yuhuan, Zhang; Yilong, Lei

    In the past, wage entry, calculation, and totaling in Small and Medium-sized Enterprises (SMEs) had to be done manually; the data volume is quite large, processing speed is low, and errors are easy to make, resulting in low efficiency. The main purpose of this paper is to present the basis of a salary management system: establishing a scientific database and a computerized payroll system, using the computer in place of much of the past manual work in order to reduce duplicated staff labor and improve working efficiency. The system combines the actual needs of SMEs; through in-depth study and practice of the C/S mode, the PowerBuilder 10.0 development tool, databases, and the SQL language, the needs analysis, database design, and application design and development of the payroll system were completed. Wage, department, unit, and personnel database files are included in the system, which has data management, department management, personnel management, and other functions; through control and management of the database, query, add, delete, modify, and other functions can be realized. The system has a reasonable design and relatively complete functions, and testing has shown that its stable operation meets the basic needs of the work.

  4. High-performance Negative Database for Massive Data Management System of The Mingantu Spectral Radioheliograph

    NASA Astrophysics Data System (ADS)

    Shi, Congming; Wang, Feng; Deng, Hui; Liu, Yingbo; Liu, Cuiyin; Wei, Shoulin

    2017-08-01

    As a dedicated synthetic aperture radio interferometer in China, the MingantU SpEctral Radioheliograph (MUSER), initially known as the Chinese Spectral RadioHeliograph (CSRH), has entered the stage of routine observation. More than 23 million data records per day need to be effectively managed to provide high-performance data query and retrieval for scientific data reduction. In light of these massive amounts of data generated by the MUSER, this paper proposes a novel data management technique called the negative database (ND) and uses it to implement a data management system for the MUSER. Built on a key-value database, the ND technique makes full use of the complement set of observational data to derive the requisite information. Experimental results showed that the proposed ND can significantly reduce storage volume in comparison with a relational database management system (RDBMS). Even when considering the time needed to derive records that were absent, its overall performance, including querying and deriving the data of the ND, is comparable with that of an RDBMS. The ND technique effectively solves the problem of massive data storage for the MUSER and is a valuable reference for the massive data management required by next-generation telescopes.
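
    The core trick the abstract describes can be sketched in a few lines: when the full key space of expected records is enumerable (as with a fixed-cadence instrument), storing only the complement, i.e. the records that are absent, lets presence be derived by a negative lookup. The following is a minimal illustration of that idea over an in-memory set standing in for a key-value store; the (frame, antenna) key layout is purely an assumption, not the MUSER schema.

```python
# Minimal negative-database (ND) sketch. Assumption: every expected
# observation is identified by a (frame, antenna) key drawn from a
# known, enumerable key space.
FRAMES = range(10)      # hypothetical observation frames
ANTENNAS = range(4)     # hypothetical antenna ids

# Positive view: which records actually arrived (normally huge).
arrived = {(f, a) for f in FRAMES for a in ANTENNAS} - {(3, 1), (7, 2)}

# Negative database: store only the complement, i.e. what is missing.
nd = {(f, a) for f in FRAMES for a in ANTENNAS} - arrived
assert nd == {(3, 1), (7, 2)}   # tiny compared with the positive set

def record_exists(frame, antenna):
    # Presence is *derived*: a record exists iff its key is not in the ND.
    return (frame, antenna) not in nd

def missing_in_frame(frame):
    # Deriving absent records for a query is a scan of the small ND.
    return sorted(a for (f, a) in nd if f == frame)

print(record_exists(3, 1))      # False: this record never arrived
print(record_exists(0, 0))      # True: derived without storing the record
print(missing_in_frame(3))      # [1]
```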

  5. Looking ahead in systems engineering

    NASA Technical Reports Server (NTRS)

    Feigenbaum, Donald S.

    1966-01-01

    Five areas that are discussed in this paper are: (1) the technological characteristics of systems engineering; (2) the analytical techniques that are giving modern systems work its capability and power; (3) the management, economics, and effectiveness dimensions that now frame the modern systems field; (4) systems engineering's future impact upon automation, computerization and managerial decision-making in industry - and upon aerospace and weapons systems in government and the military; and (5) modern systems engineering's partnership with modern quality control and reliability.

  6. 48 CFR 52.232-33 - Payment by Electronic Funds Transfer-System for Award Management.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... contained in the System for Award Management (SAM) database. In the event that the EFT information changes, the Contractor shall be responsible for providing the updated information to the SAM database. (c... 210. (d) Suspension of payment. If the Contractor's EFT information in the SAM database is incorrect...

  7. 48 CFR 52.232-33 - Payment by Electronic Funds Transfer-System for Award Management.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... contained in the System for Award Management (SAM) database. In the event that the EFT information changes, the Contractor shall be responsible for providing the updated information to the SAM database. (c... 210. (d) Suspension of payment. If the Contractor's EFT information in the SAM database is incorrect...

  8. Microcomputer-Based Access to Machine-Readable Numeric Databases.

    ERIC Educational Resources Information Center

    Wenzel, Patrick

    1988-01-01

    Describes the use of microcomputers and relational database management systems to improve access to numeric databases by the Data and Program Library Service at the University of Wisconsin. The internal records management system, in-house reference tools, and plans to extend these tools to the entire campus are discussed. (3 references) (CLB)

  9. How Database Management Systems Can Be Used To Evaluate Program Effectiveness in Small School Districts.

    ERIC Educational Resources Information Center

    Hoffman, Tony

    Sophisticated database management systems (DBMS) for microcomputers are becoming increasingly easy to use, allowing small school districts to develop their own autonomous databases for tracking enrollment and student progress in special education. DBMS applications can be designed for maintenance by district personnel with little technical…

  10. 47 CFR 101.1523 - Sharing and coordination among non-government licensees and between non-government and government...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Wireless Telecommunications Bureau announces by public notice the implementation of a third-party database...) Provide an electronic copy of an interference analysis to the third-party database manager which...-party database managers shall receive and retain the interference analyses electronically and make them...

  11. 47 CFR 101.1523 - Sharing and coordination among non-government licensees and between non-government and government...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Wireless Telecommunications Bureau announces by public notice the implementation of a third-party database...) Provide an electronic copy of an interference analysis to the third-party database manager which...-party database managers shall receive and retain the interference analyses electronically and make them...

  12. 47 CFR 101.1523 - Sharing and coordination among non-government licensees and between non-government and government...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Wireless Telecommunications Bureau announces by public notice the implementation of a third-party database...) Provide an electronic copy of an interference analysis to the third-party database manager which...-party database managers shall receive and retain the interference analyses electronically and make them...

  13. 47 CFR 101.1523 - Sharing and coordination among non-government licensees and between non-government and government...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Wireless Telecommunications Bureau announces by public notice the implementation of a third-party database...) Provide an electronic copy of an interference analysis to the third-party database manager which...-party database managers shall receive and retain the interference analyses electronically and make them...

  14. 47 CFR 101.1523 - Sharing and coordination among non-government licensees and between non-government and government...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Wireless Telecommunications Bureau announces by public notice the implementation of a third-party database...) Provide an electronic copy of an interference analysis to the third-party database manager which...-party database managers shall receive and retain the interference analyses electronically and make them...

  15. A Database Design and Development Case: NanoTEK Networks

    ERIC Educational Resources Information Center

    Ballenger, Robert M.

    2010-01-01

    This case provides a real-world project-oriented case study for students enrolled in a management information systems, database management, or systems analysis and design course in which database design and development are taught. The case consists of a business scenario to provide background information and details of the unique operating…

  16. Database Access Systems.

    ERIC Educational Resources Information Center

    Dalrymple, Prudence W.; Roderer, Nancy K.

    1994-01-01

    Highlights the changes that have occurred from 1987-93 in database access systems. Topics addressed include types of databases, including CD-ROMs; enduser interface; database selection; database access management, including library instruction and use of primary literature; economic issues; database users; the search process; and improving…

  17. BoreholeAR: A mobile tablet application for effective borehole database visualization using an augmented reality technology

    NASA Astrophysics Data System (ADS)

    Lee, Sangho; Suh, Jangwon; Park, Hyeong-Dong

    2015-03-01

    Boring logs are widely used in geological field studies since the data describe various attributes of underground and surface environments. However, it is difficult to manage multiple boring logs in the field, as conventional management and visualization methods are not suitable for integrating and combining large data sets. We developed an iPad application that enables its user to search boring logs rapidly and visualize them using augmented reality (AR). For the development of the application, a standard borehole database appropriate for a mobile-based borehole database management system was designed. The application consists of three modules: an AR module, a map module, and a database module. The AR module superimposes borehole data on camera imagery as viewed by the user and provides intuitive visualization of borehole locations. The map module shows the locations of the corresponding borehole data on a 2D map with additional map layers. The database module provides data management functions over large borehole databases for the other modules. A field survey was also carried out using more than 100,000 borehole records.

  18. Facilitating quality control for spectra assignments of small organic molecules: nmrshiftdb2--a free in-house NMR database with integrated LIMS for academic service laboratories.

    PubMed

    Kuhn, Stefan; Schlörer, Nils E

    2015-08-01

    With its laboratory information management system, nmrshiftdb2 supports the integration of electronic lab administration and management into academic NMR facilities. It also offers the setup of a local database while granting full access to nmrshiftdb2's World Wide Web database. For lab users, this freely available system allows the submission of orders for measurement, transfers recorded data automatically or manually, and enables download of spectra via a web interface, as well as integrated access to the prediction, search, and assignment tools of the NMR database. For staff and lab administration, the flow of all orders can be supervised; administrative tools also include user and hardware management, statistics functionality for accounting purposes, and a 'QuickCheck' function for assignment control, to facilitate quality control of assignments submitted to the (local) database. The laboratory information management system and database are based on a web interface as front end and are therefore independent of the operating system in use. Copyright © 2015 John Wiley & Sons, Ltd.

  19. Data management in the modern structural biology and biomedical research environment.

    PubMed

    Zimmerman, Matthew D; Grabowski, Marek; Domagalski, Marcin J; Maclean, Elizabeth M; Chruszcz, Maksymilian; Minor, Wladek

    2014-01-01

    Modern high-throughput structural biology laboratories produce vast amounts of raw experimental data. The traditional method of data reduction is very simple: results are summarized in peer-reviewed publications, which are hopefully published in high-impact journals. By their nature, publications include only the most important results derived from experiments that may have been performed over the course of many years. The main content of the published paper is a concise compilation of these data, an interpretation of the experimental results, and a comparison of these results with those obtained by other scientists. Due to an avalanche of structural biology manuscripts submitted to scientific journals, in many recent cases descriptions of experimental methodology (and sometimes even experimental results) are pushed to supplementary materials that are only published online and sometimes may not be reviewed as thoroughly as the main body of a manuscript. Trouble may arise when experimental results contradict those obtained by other scientists, which requires (in the best case) reexamination of the original raw data or independent repetition of the experiment according to the published description. There are reports that a significant fraction of results obtained in academic laboratories cannot be repeated in an industrial environment (Begley CG & Ellis LM, Nature 483(7391):531-3, 2012). This is not an indication of scientific fraud but rather reflects the inadequate description of experiments performed on different equipment and on biological samples produced with disparate methods. For that reason, the goal of a modern data management system is not only the simple replacement of the laboratory notebook by an electronic one but also the creation of a sophisticated, internally consistent, scalable data management system that combines data obtained from a variety of experiments performed by various individuals on diverse equipment. All data should be stored in a core database that can be used by custom applications to prepare internal reports and statistics and to perform other functions specific to the research pursued in a particular laboratory. This chapter presents a general overview of the methods of data management and analysis used by structural genomics (SG) programs. In addition to a review of the existing literature on the subject, experience in the development of two SG data management systems, UniTrack and LabDB, is also presented. The description is targeted at a general audience, as some technical details have been (or will be) published elsewhere. The focus is on "data management," meaning the process of gathering, organizing, and storing data; also briefly discussed is "data mining," the process of analysis ideally leading to an understanding of the data. In other words, data mining is the conversion of data into information. Clearly, effective data management is a precondition for any useful data mining. If done properly, gathering details on millions of experiments on thousands of proteins and making them publicly available for analysis, even after the projects themselves have ended, may turn out to be one of the most important benefits of SG programs.

  20. Modern Initial Management of Severe Limbs Trauma in War Surgery: Orthopaedic Damage Control

    DTIC Science & Technology

    2010-04-01

    …avoid fat embolism, allow optimal nursing and medical evacuation without any secondary functional consequences [3]. 2.2.1 Indications: The… decrease the risk of fat embolism. …injuries. Orthopaedic imperatives: multiple open shaft fractures with blood loss, complex epiphyseal fractures requiring long, difficult, bloody surgery

  1. Network Configuration of Oracle and Database Programming Using SQL

    NASA Technical Reports Server (NTRS)

    Davis, Melton; Abdurrashid, Jibril; Diaz, Philip; Harris, W. C.

    2000-01-01

    A database can be defined as a collection of information organized in such a way that it can be retrieved and used. A database management system (DBMS) can further be defined as the tool that enables us to manage and interact with the database. The Oracle 8 Server is a state-of-the-art information management environment. It is a repository for very large amounts of data, and gives users rapid access to that data. The Oracle 8 Server allows for sharing of data between applications; the information is stored in one place and used by many systems. My research will focus primarily on SQL (Structured Query Language) programming. SQL is the way you define and manipulate data in Oracle's relational database. SQL is the industry standard adopted by all database vendors. When programming with SQL, you work on sets of data (i.e., information is not processed one record at a time).
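
    The set-at-a-time point in the abstract is easiest to see next to its record-at-a-time equivalent. Below is a small sketch using Python's bundled sqlite3 rather than Oracle 8, with an invented employees table; only the contrast between the looped and the declarative update is the point.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, dept TEXT, salary REAL)")
conn.executemany("INSERT INTO employees VALUES (?, ?, ?)",
                 [("Ada", "ENG", 100.0), ("Bo", "ENG", 90.0),
                  ("Cy", "OPS", 80.0)])

# Record-at-a-time style (what SQL spares you): fetch rows, loop, update.
#   for (name,) in conn.execute("SELECT name FROM employees WHERE dept='ENG'"):
#       conn.execute("UPDATE employees SET salary = salary*1.1 WHERE name=?",
#                    (name,))

# Set-at-a-time style: one declarative statement transforms the whole set.
conn.execute("UPDATE employees SET salary = salary * 1.1 WHERE dept = 'ENG'")

for row in conn.execute("SELECT name, salary FROM employees ORDER BY name"):
    print(row)   # ENG salaries raised in a single statement
```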

  2. A geospatial database model for the management of remote sensing datasets at multiple spectral, spatial, and temporal scales

    NASA Astrophysics Data System (ADS)

    Ifimov, Gabriela; Pigeau, Grace; Arroyo-Mora, J. Pablo; Soffer, Raymond; Leblanc, George

    2017-10-01

    In this study, the development and implementation of a geospatial database model for the management of multiscale datasets encompassing airborne imagery and associated metadata is presented. To develop the multi-source geospatial database, we used a Relational Database Management System (RDBMS) on a Structured Query Language (SQL) server, which was then integrated into ArcGIS and implemented as a geodatabase. The acquired datasets were compiled, standardized, and integrated into the RDBMS, where logical associations between different types of information were linked (e.g., location, date, and instrument). Airborne data at different processing levels (digital numbers through geocorrected reflectance) were implemented in the geospatial database, where the datasets are linked spatially and temporally. An example dataset is presented consisting of airborne hyperspectral imagery collected for inter- and intra-annual vegetation characterization and detection of potential hydrocarbon seepage events over pipeline areas. Our work provides a model for the management of airborne imagery, which is a challenging aspect of data management in remote sensing, especially when large volumes of data are collected.

  3. How I do it: a practical database management system to assist clinical research teams with data collection, organization, and reporting.

    PubMed

    Lee, Howard; Chapiro, Julius; Schernthaner, Rüdiger; Duran, Rafael; Wang, Zhijun; Gorodetski, Boris; Geschwind, Jean-François; Lin, MingDe

    2015-04-01

    The objective of this study was to demonstrate that an intra-arterial liver therapy clinical research database system is a more workflow-efficient and robust tool for clinical research than a spreadsheet storage system, and that the database system can easily generate clinical research study populations with custom search and retrieval criteria. A questionnaire was designed and distributed to 21 board-certified radiologists to assess current data storage problems and clinician reception to a database management system. Based on the questionnaire findings, a customized database and user interface system was created that performs automatic calculations of clinical scores, including staging systems such as Child-Pugh and Barcelona Clinic Liver Cancer, and facilitates data input and output. Questionnaire participants were favorable to a database system. The interface retrieved study-relevant data accurately and effectively, and the database produced easy-to-read, study-specific patient populations with custom-defined inclusion/exclusion criteria. The database management system is workflow-efficient and robust in retrieving, storing, and analyzing data. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
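
    As one concrete example of the "automatic calculations of clinical scores" mentioned above, a Child-Pugh calculator reduces to a banded table lookup. The sketch below uses the commonly published cutoffs; it is an illustration only, not the paper's implementation, and any real use would need verification against an authoritative clinical reference.

```python
def child_pugh(bilirubin_mg_dl, albumin_g_dl, inr, ascites, encephalopathy):
    """Sketch of an automatic Child-Pugh calculation using commonly
    published cutoffs (verify against a clinical reference before use).
    ascites: 'none' | 'mild' | 'severe'
    encephalopathy: 'none' | 'grade 1-2' | 'grade 3-4'
    """
    def band(value, low, high):           # 1, 2, or 3 points per criterion
        return 1 if value < low else (2 if value <= high else 3)

    points = (
        band(bilirubin_mg_dl, 2.0, 3.0)                               # mg/dL
        + (1 if albumin_g_dl > 3.5 else 2 if albumin_g_dl >= 2.8 else 3)
        + band(inr, 1.7, 2.3)
        + {"none": 1, "mild": 2, "severe": 3}[ascites]
        + {"none": 1, "grade 1-2": 2, "grade 3-4": 3}[encephalopathy]
    )
    grade = "A" if points <= 6 else "B" if points <= 9 else "C"
    return points, grade

print(child_pugh(1.5, 3.8, 1.2, "none", "none"))   # (5, 'A')
```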

  4. Universal Index System

    NASA Technical Reports Server (NTRS)

    Kelley, Steve; Roussopoulos, Nick; Sellis, Timos; Wallace, Sarah

    1993-01-01

    The Universal Index System (UIS) is an index management system that uses a uniform interface to solve the heterogeneity problem among database management systems. UIS not only provides an easy-to-use common interface for accessing all underlying data, but also accommodates different underlying database management systems, storage representations, and access methods.
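
    The "uniform interface over heterogeneous back ends" idea is essentially the adapter pattern. Below is a minimal sketch of that pattern; the class and method names are my own for illustration and are not taken from the UIS paper.

```python
from abc import ABC, abstractmethod
import bisect

class IndexBackend(ABC):
    """Uniform index interface; each storage engine adapts to it."""
    @abstractmethod
    def insert(self, key, row_id): ...
    @abstractmethod
    def lookup(self, key): ...

class HashBackend(IndexBackend):
    """Hash-style access method."""
    def __init__(self):
        self._buckets = {}
    def insert(self, key, row_id):
        self._buckets.setdefault(key, []).append(row_id)
    def lookup(self, key):
        return self._buckets.get(key, [])

class SortedBackend(IndexBackend):
    """Stand-in for a B-tree-style engine; kept sorted for range scans."""
    def __init__(self):
        self._pairs = []
    def insert(self, key, row_id):
        bisect.insort(self._pairs, (key, row_id))
    def lookup(self, key):
        return [r for k, r in self._pairs if k == key]

def find(index: IndexBackend, key):
    # Caller code is written once, against the uniform interface only.
    return index.lookup(key)

for backend in (HashBackend(), SortedBackend()):
    backend.insert("smith", 42)
    print(type(backend).__name__, find(backend, "smith"))   # [42] twice
```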

  5. Maintaining Research Documents with Database Management Software.

    ERIC Educational Resources Information Center

    Harrington, Stuart A.

    1999-01-01

    Discusses taking notes for research projects and organizing them into card files; reviews the literature on personal filing systems; introduces the basic process of database management; and offers a plan for managing research notes. Describes field groups and field definitions, data entry, and creating reports. (LRW)

  6. 76 FR 15953 - Agency Information Collection Activities; Announcement of Office of Management and Budget...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-22

    ... CONSUMER PRODUCT SAFETY COMMISSION Agency Information Collection Activities; Announcement of Office of Management and Budget Approval; Publicly Available Consumer Product Safety Information Database... Product Safety Information Database has been approved by the Office of Management and Budget (OMB) under...

  7. Modern Design of Resonant Edge-Slot Array Antennas

    NASA Technical Reports Server (NTRS)

    Gosselin, R. B.

    2006-01-01

    Resonant edge-slot (slotted-waveguide) array antennas can now be designed very accurately following a modern computational approach like that followed for some other microwave components. This modern approach makes it possible to design superior antennas at lower cost than was previously possible. Heretofore, the physical and engineering knowledge of resonant edge-slot array antennas had remained immature since they were introduced during World War II. This is because despite their mechanical simplicity, high reliability, and potential for operation with high efficiency, the electromagnetic behavior of resonant edge-slot antennas is very complex. Because engineering design formulas and curves for such antennas are not available in the open literature, designers have been forced to implement iterative processes of fabricating and testing multiple prototypes to derive design databases, each unique for a specific combination of operating frequency and set of waveguide tube dimensions. The expensive, time-consuming nature of these processes has inhibited the use of resonant edge-slot antennas. The present modern approach reduces costs by making it unnecessary to build and test multiple prototypes. As an additional benefit, this approach affords a capability to design an array of slots having different dimensions to taper the antenna illumination to reduce the amplitudes of unwanted side lobes. The heart of the modern approach is the use of the latest commercially available microwave-design software, which implements finite-element models of electromagnetic fields in and around waveguides, antenna elements, and similar components. Instead of building and testing prototypes, one builds a database and constructs design curves from the results of computational simulations for sets of design parameters. The figure shows a resonant edge-slot antenna designed following this approach. Intended for use as part of a radiometer operating at a frequency of 10.7 GHz, this antenna was fabricated from dimensions defined exclusively by results of computational simulations. The final design was found to be well optimized and to yield performance exceeding that initially required.

  8. Design of Student Information Management Database Application System for Office and Departmental Target Responsibility System

    NASA Astrophysics Data System (ADS)

    Zhou, Hui

    Carrying out an office and departmental target responsibility system is an inevitable outcome of higher education reform, and statistical processing of student information is an important part of student performance review within it. On the basis of an analysis of student evaluation, this paper designs a student information management database application system using relational database management system software. To implement the student information management functions, the functional requirements, overall structure, data sheets and fields, data sheet associations, and software code are designed in detail.

  9. Federal Aviation Administration : challenges in modernizing the agency

    DOT National Transportation Integrated Search

    2000-02-01

    FAA's efforts to implement initiatives in five key areas-air traffic control modernization, procurement and personnel reform, aviation safety, aviation and computer security, and financial management-have met with limited success. For example, FAA ha...

  10. searchSCF: Using MongoDB to Enable Richer Searches of Locally Hosted Science Data Repositories

    NASA Astrophysics Data System (ADS)

    Knosp, B.

    2016-12-01

    Science teams today are in the unusual position of almost having too much data available to them. Modern sensors and models are capable of outputting terabytes of data per day, which can make it difficult to find specific subsets of data. The sheer size of files can also make it time consuming to retrieve this big data from national data archive centers. Thus, many science teams choose to store what data they can on their local systems, but they are not always equipped with tools to help them intelligently organize and search their data. In its local data repository, the Aura Microwave Limb Sounder (MLS) science team at NASA's Jet Propulsion Laboratory has collected over 300 TB of atmospheric science data from 71 missions/models that aid in validation, algorithm development, and research activities. When the project began, the team developed a MySQL database to aid in data queries, but this database was only designed to keep track of MLS and a few ancillary data sets, leaving much of the data uncatalogued. The team has also seen database query times rise over the life of the mission. Even though the MLS science team's data holdings are not the size of a national data center's, team members still need tools to help them discover and utilize the data they have on hand. Over the past year, members of the science team have been looking for solutions to (1) store information on all the data sets they have collected in a single database, (2) store more metadata about each data file, (3) develop queries that can find relationships among these disparate data types, and (4) plug any new functions developed around this database into existing analysis, visualization, and web tools, transparently to users. In this presentation, I will discuss the searchSCF package that is currently under development. This package includes a NoSQL database management system (MongoDB) and a set of Python tools that both ingest data into the database and support user queries. I will also highlight case studies of how this system could be used by the MLS science team and how it could be implemented by other science teams with local data repositories.
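
    In the spirit of the package described above, the ingest-then-query pattern with MongoDB and pymongo looks roughly like the sketch below. The database, collection, and field names are illustrative assumptions, not the actual searchSCF schema, and a MongoDB server is assumed to be running locally.

```python
from pymongo import MongoClient, ASCENDING

client = MongoClient("mongodb://localhost:27017")   # assumed local server
files = client["repository"]["granules"]            # illustrative names

# Ingest: one document per data file, with free-form per-mission metadata.
files.insert_one({
    "mission": "MLS",
    "product": "L2GP-Temperature",
    "start": "2016-07-01T00:00:00Z",    # ISO strings compare in date order
    "path": "/data/mls/2016/07/temp.he5",
    "extra": {"version": "v4.2"},       # schema-free: fields vary by mission
})
files.create_index([("mission", ASCENDING), ("start", ASCENDING)])

# Query: find every holding, from any mission, overlapping a date of interest.
for doc in files.find({"start": {"$gte": "2016-07-01", "$lt": "2016-07-02"}},
                      {"_id": 0, "mission": 1, "path": 1}):
    print(doc)
```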

  11. Testing, Requirements, and Metrics

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda; Hyatt, Larry; Hammer, Theodore F.; Huffman, Lenore; Wilson, William

    1998-01-01

    The criticality of correct, complete, testable requirements is a fundamental tenet of software engineering. Also critical is complete requirements-based testing of the final product. Modern tools for managing requirements allow new metrics to be used in support of both of these critical processes. Using these tools, potential problems with the quality of the requirements and the test plan can be identified early in the life cycle. Some of these quality factors include: ambiguous or incomplete requirements, poorly designed requirements databases, excessive or insufficient test cases, and incomplete linkage of tests to requirements. This paper discusses how metrics can be used to evaluate the quality of the requirements and tests to avoid problems later. Requirements management and requirements-based testing have always been critical in the implementation of high-quality software systems. Recently, automated tools have become available to support requirements management. At NASA's Goddard Space Flight Center (GSFC), automated requirements management tools are being used on several large projects. The use of these tools opens the door to innovative uses of metrics in characterizing test plan quality and assessing overall testing risks. In support of these projects, the Software Assurance Technology Center (SATC) is working to develop and apply a metrics program that utilizes the information now available through the application of requirements management tools. Metrics based on this information provide real-time insight into the testing of requirements and assist the Project Quality Office in its testing oversight role. This paper discusses three facets of the SATC's efforts to evaluate the quality of the requirements and test plan early in the life cycle, thus preventing costly errors and time delays later.

  12. Syndesmosis and lateral ankle sprains in the National Football League.

    PubMed

    Osbahr, Daryl C; Drakos, Mark C; O'Loughlin, Padhraig F; Lyman, Stephen; Barnes, Ronnie P; Kennedy, John G; Warren, Russell F

    2013-11-01

    Syndesmosis sprains in the National Football League (NFL) can be a persistent source of disability, especially compared with lateral ankle injuries. This study evaluated syndesmosis and lateral ankle sprains in NFL players to allow for better identification and management of these injuries. Syndesmosis and lateral ankle sprains from a single NFL team database were reviewed over a 15-year period, and 32 NFL team physicians completed a questionnaire detailing their management approach. A comparative analysis was performed analyzing several variables, including diagnosis, treatment methods, and time lost from sports participation. Thirty-six syndesmosis and 53 lateral ankle sprains occurred in the cohort of NFL players. The injury mechanism typically resulted from direct impact in the syndesmosis group and torsion in the lateral ankle sprain group (P = .034). All players were managed nonoperatively. The mean time lost from participation was 15.4 days in the syndesmosis and 6.5 days in the lateral ankle sprain groups (P ≤ .001). National Football League team physicians varied treatment for syndesmosis sprains depending on the category of diastasis but recommended nonoperative management for lateral ankle sprains. Syndesmosis sprains in the NFL can be a source of significant disability compared with lateral ankle sprains. Successful return to play with nonoperative management is frequently achieved for syndesmosis and lateral ankle sprains, depending on injury severity. With modern treatment algorithms for syndesmosis sprains, more aggressive nonoperative treatment is advocated. Although the current study shows that syndesmosis injuries require longer rehabilitation periods than lateral ankle sprains, the time lost from participation may not be as prolonged as previously reported. Copyright 2013, SLACK Incorporated.

  13. Digital Geodata Traces--New Challenges for Geographic Education

    ERIC Educational Resources Information Center

    Hohnle, Steffen; Michel, Boris; Glasze, Georg; Uphues, Rainer

    2013-01-01

    Young people in modern societies consciously (e.g. Facebook) or unconsciously (e.g. some Google services) produce a vast amount of geodata. Using relational databases, private companies are capable of creating very precise profiles of the individual user and his/her spatial practices from this data. This almost inevitably prompts questions…

  14. On Research Methodology in Applied Linguistics in 2002-2008

    ERIC Educational Resources Information Center

    Martynychev, Andrey

    2010-01-01

    This dissertation examined the status of data-based research in applied linguistics through an analysis of published research studies in nine peer-reviewed applied linguistics journals ("Applied Language Learning, The Canadian Modern Language Review / La Revue canadienne des langues vivantes, Current Issues in Language Planning, Dialog on Language…

  15. Library Users: How They Adapt to Changing Roles.

    ERIC Educational Resources Information Center

    Miido, Helis

    Traditional library tasks, for example database searching, are increasingly performed by library users, forcing both the librarian and the user to assume at times dichotomous roles of teacher and student. Modern librarians install new software and guide organizations in multimedia applications. Librarians need to be cognizant of the human factor,…

  16. Development of the ageing management database of PUSPATI TRIGA reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramli, Nurhayati, E-mail: nurhayati@nm.gov.my; Tom, Phongsakorn Prak; Husain, Nurfazila

    Since its first criticality in 1982, the PUSPATI TRIGA Reactor (RTP) has been operated for more than 30 years. As RTP becomes older, ageing problems have emerged as prominent issues. To address them, an Ageing Management (AgeM) database for managing related ageing matters was systematically developed. This paper presents the development of the AgeM database, taking into account all major RTP Systems, Structures and Components (SSCs) and the ageing mechanisms of these SSCs, through the system surveillance program.

  17. The Design and Implementation of a Relational to Network Query Translator for a Distributed Database Management System.

    DTIC Science & Technology

    1985-12-01

    Relational to Network Query Translator for a Distributed Database Management System. Thesis, Kevin H. Mahoney, Captain, USAF. AFIT/GCS/ENG/85D-7. … Presented to the Faculty of the School of Engineering of the Air Force Institute of Technology, Air University, in partial fulfillment of the requirements for the degree of Master of Science in Computer Systems.

  18. Randomized Approaches for Nearest Neighbor Search in Metric Space When Computing the Pairwise Distance Is Extremely Expensive

    NASA Astrophysics Data System (ADS)

    Wang, Lusheng; Yang, Yong; Lin, Guohui

    Finding the closest object for a query in a database is a classical problem in computer science. For some modern biological applications, computing the similarity between two objects might be very time consuming. For example, it takes a long time to compute the edit distance between two whole chromosomes or the alignment cost of two 3D protein structures. In this paper, we study the nearest neighbor search problem in metric space, where the pairwise distance between two objects in the database is known and we want to minimize the number of distances computed online between the query and objects in the database in order to find the closest object. We have designed two randomized approaches for indexing metric space databases, where objects are purely described by their distances with each other. Analysis and experiments show that our approaches only need to compute the distance to O(log n) objects in order to find the closest object, where n is the total number of objects in the database.
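
    The setting can be illustrated with a generic pivot-based pruning scheme: pivot-object distances are known offline, and the triangle inequality gives a lower bound |d(q,p) - d(p,x)| on d(q,x) that lets many online distance computations be skipped. Note this simple sketch is a stand-in for the flavor of the problem, not the paper's two randomized algorithms or their O(log n) guarantee.

```python
import random

def build_index(objects, dist, n_pivots=3, seed=0):
    """Offline: precompute pivot-object distances (the pairwise distances
    the paper assumes are already known within the database)."""
    rng = random.Random(seed)
    pivots = rng.sample(objects, n_pivots)
    table = {p: {x: dist(p, x) for x in objects} for p in pivots}
    return pivots, table

def nearest(query, objects, dist, pivots, table):
    """Online: use the triangle inequality to skip distance computations;
    |d(q,p) - d(p,x)| is a lower bound on d(q,x)."""
    d_qp = {p: dist(query, p) for p in pivots}      # n_pivots computations
    best, best_d, computed = None, float("inf"), len(pivots)
    for x in objects:
        lower = max(abs(d_qp[p] - table[p][x]) for p in pivots)
        if lower >= best_d:
            continue                                # pruned: d(q,x) not needed
        d = dist(query, x)
        computed += 1
        if d < best_d:
            best, best_d = x, d
    return best, best_d, computed

# Toy metric space: integers under absolute difference.
objs = list(range(0, 1000, 7))
dist = lambda a, b: abs(a - b)
pivots, table = build_index(objs, dist)
print(nearest(500, objs, dist, pivots, table))   # closest object, few dist calls
```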

  19. A curated database of cyanobacterial strains relevant for modern taxonomy and phylogenetic studies.

    PubMed

    Ramos, Vitor; Morais, João; Vasconcelos, Vitor M

    2017-04-25

    The dataset herein described lays the groundwork for an online database of relevant cyanobacterial strains, named CyanoType (http://lege.ciimar.up.pt/cyanotype). It is a database that includes categorized cyanobacterial strains useful for taxonomic, phylogenetic or genomic purposes, with associated information obtained by means of a literature-based curation. The dataset lists 371 strains and represents the first version of the database (CyanoType v.1). Information for each strain includes strain synonymy and/or co-identity, strain categorization, habitat, accession numbers for molecular data, taxonomy and nomenclature notes according to three different classification schemes, hierarchical automatic classification, phylogenetic placement according to a selection of relevant studies (including this), and important bibliographic references. The database will be updated periodically, namely by adding new strains meeting the criteria for inclusion and by revising and adding up-to-date metadata for strains already listed. A global 16S rDNA-based phylogeny is provided in order to assist users when choosing the appropriate strains for their studies.

  20. A curated database of cyanobacterial strains relevant for modern taxonomy and phylogenetic studies

    PubMed Central

    Ramos, Vitor; Morais, João; Vasconcelos, Vitor M.

    2017-01-01

    The dataset herein described lays the groundwork for an online database of relevant cyanobacterial strains, named CyanoType (http://lege.ciimar.up.pt/cyanotype). It is a database that includes categorized cyanobacterial strains useful for taxonomic, phylogenetic or genomic purposes, with associated information obtained by means of a literature-based curation. The dataset lists 371 strains and represents the first version of the database (CyanoType v.1). Information for each strain includes strain synonymy and/or co-identity, strain categorization, habitat, accession numbers for molecular data, taxonomy and nomenclature notes according to three different classification schemes, hierarchical automatic classification, phylogenetic placement according to a selection of relevant studies (including this), and important bibliographic references. The database will be updated periodically, namely by adding new strains meeting the criteria for inclusion and by revising and adding up-to-date metadata for strains already listed. A global 16S rDNA-based phylogeny is provided in order to assist users when choosing the appropriate strains for their studies. PMID:28440791

  1. Postpartum Visit Attendance Increases the Use of Modern Contraceptives

    PubMed Central

    Cha, Susan; Charles, RaShel; McGee, Elizabeth; Karjane, Nicole; Hines, Linda; Kornstein, Susan G.

    2016-01-01

    Background. Delays in postpartum contraceptive use may increase risk for unintended or rapid repeat pregnancies. The postpartum care visit (PPCV) is a good opportunity for women to discuss family planning options with their health care providers. This study examined the association between PPCV attendance and modern contraceptive use using data from a managed care organization. Methods. Claims and demographic and administrative data came from a nonprofit managed care organization in Virginia (2008–2012). Information on the most recent delivery for mothers with singleton births was analyzed (N = 24,619). Routine PPCV (yes, no) and modern contraceptive use were both dichotomized. Descriptive analyses provided percentages, frequencies, and means. Multiple logistic regression was conducted and ORs and 95% CIs were calculated. Results. More than half of the women did not attend their PPCV (50.8%) and 86.9% had no modern contraceptive use. After controlling for the effects of confounders, women with PPCV were 50% more likely to use modern contraceptive methods than women with no PPCV (OR = 1.50, 95% CI = 1.31, 1.72). Conclusions. These findings highlight the importance of PPCV in improving modern contraceptive use and guide health care policy in the effort of reducing unintended pregnancy rates. PMID:28070422

  2. Tautomerism in chemical information management systems

    NASA Astrophysics Data System (ADS)

    Warr, Wendy A.

    2010-06-01

    Tautomerism has an impact on many of the processes in chemical information management systems including novelty checking during registration into chemical structure databases; storage of structures; exact and substructure searching in chemical structure databases; and depiction of structures retrieved by a search. The approaches taken by 27 different software vendors and database producers are compared. It is hoped that this comparison will act as a discussion document that could ultimately improve databases and software for researchers in the future.

  3. Phynx: an open source software solution supporting data management and web-based patient-level data review for drug safety studies in the general practice research database and other health care databases.

    PubMed

    Egbring, Marco; Kullak-Ublick, Gerd A; Russmann, Stefan

    2010-01-01

    The goal was to develop a software solution that supports management and clinical review of patient data from electronic medical records databases or claims databases for pharmacoepidemiological drug safety studies. We used open source software to build a data management system and an internet application with a Flex client on a Java application server with a MySQL database backend. The application is hosted on Amazon Elastic Compute Cloud. This solution, named Phynx, supports data management, web-based display of electronic patient information, and interactive review of patient-level information in the individual clinical context. The system was applied to a dataset from the UK General Practice Research Database (GPRD). Our solution can be set up and customized with limited programming resources, and there is almost no extra cost for software. Access times are short, the displayed information is structured in chronological order and visually attractive, and selected information such as drug exposure can be blinded. External experts can review patient profiles and save evaluations and comments via a common web browser. Phynx provides a flexible and economical solution for patient-level review of electronic medical information from databases in the individual clinical context, and can therefore make an important contribution to efficient validation of outcome assessment in drug safety database studies.

  4. A hierarchical spatial framework and database for the national river fish habitat condition assessment

    USGS Publications Warehouse

    Wang, L.; Infante, D.; Esselman, P.; Cooper, A.; Wu, D.; Taylor, W.; Beard, D.; Whelan, G.; Ostroff, A.

    2011-01-01

    Fisheries management programs, such as the National Fish Habitat Action Plan (NFHAP), urgently need a nationwide spatial framework and database for health assessment and policy development to protect and improve riverine systems. To meet this need, we developed a spatial framework and database using the National Hydrography Dataset Plus (1:100,000 scale; http://www.horizon-systems.com/nhdplus). This framework uses interconfluence river reaches and their local and network catchments as fundamental spatial river units, and a series of ecological and political spatial descriptors as hierarchical structures that allow users to extract or analyze information at spatial scales they define. The database consists of variables describing channel characteristics, network position/connectivity, climate, elevation, gradient, and size. It contains a series of natural and human-induced catchment factors that are known to influence river characteristics. Our framework and database assemble all river reaches and their descriptors in one place for the first time for the conterminous United States, and provide users with the capability of adding data, conducting analyses, developing management scenarios and regulations, and tracking management progress at a variety of spatial scales. This database provides the essential data needed for achieving the objectives of NFHAP and other management programs. The downloadable beta version database is available at http://ec2-184-73-40-15.compute-1.amazonaws.com/nfhap/main/.

  5. Building a Rich Community-Contributed Phenology Dataset: Lessons Learned from Winegrapes in California’s Napa Valley

    NASA Astrophysics Data System (ADS)

    Cahill, K. N.; Cayan, D. R.; Dettinger, M.

    2009-12-01

    In the wine industry of California’s Napa Valley, there is great community interest in using legacy and modern observations of grapevine phenological stages to track trends in both phenology and climate. Although management such as changing pruning and winemaking preferences can affect phenological records, grapevines can serve as sensitive climate indicators. In collaboration with a local vintners’ organization, we conducted an outreach campaign to solicit contributions of climate and phenological data from winegrowers and winemakers. We received nearly 10,000 phenological records dating from 1940 to the present, including data on budbreak, bloom, véraison (color change), and harvest dates for 68 minor grape varieties (15% of the data) and 10 major varieties (85% of the data). Compiling a unified database from records collected by different individuals in different ways presented a challenge, and we developed several new approaches to using data from our newly compiled records to develop empirical corrections to standardize observations across the dataset. The time series of phenological observations, along with a companion set of regional climate observations, reveal, expectedly, a strong linkage to seasonal temperature and, unexpectedly, a significant association with winter precipitation. The series of harvest timing reports contains influences of both management and climate. We will also present lessons learned on data management, confidentiality, and science-stakeholder partnerships relevant for others interested in conducting community phenological partnerships.

  6. Patient monitoring in mobile health: opportunities and challenges.

    PubMed

    Mohammadzadeh, Niloofar; Safdari, Reza

    2014-01-01

    In most countries, chronic diseases lead to high health care costs and reduced productivity. The best way to reduce health-sector costs and increase the empowerment of people is prevention of chronic diseases and appropriate management of health activities through patient monitoring. To enjoy the full benefits of e-health, the use of modern methods and technologies is very important. For this literature review, articles were searched with keywords such as patient monitoring, mobile health, and chronic disease in the Science Direct, Google Scholar, and PubMed databases, without regard to year of publication. Applying remote medical diagnosis and monitoring systems based on mobile health can help significantly to reduce health care costs and to manage performance correctly, particularly in chronic disease management. There are also challenges in patient monitoring, both general, such as threats to confidentiality and privacy, technology acceptance, lack of system interoperability with electronic health records and other IT tools, decreased face-to-face communication between doctor and patient, and sudden interruptions of telecommunication networks, and specific, such as device and sensor type. Identifying the opportunities and challenges of mobile technology, reducing barriers, and strengthening the positive points will play a significant role in appropriate planning, will promote the achievements of mobile-based health care systems, and will help to design a roadmap for the improvement of mobile health.

  7. Common Database Interface for Heterogeneous Software Engineering Tools.

    DTIC Science & Technology

    1987-12-01

    Subject terms: Database Management Systems; Programming (Computers); Computer Files; Information Transfer; Interfaces. … Thesis presented to the Faculty of the Air Force Institute of Technology, Air University, in partial fulfillment of the requirements for the degree of Master of Science in Information Systems. Contents include: Literature; System 690 Configuration; Database Functions; Software Engineering Environments; Data Manager.

  8. Modernizing Information Technology in the Office of Economic Adjustment

    DTIC Science & Technology

    1993-07-01

    AD-A286 036. Modernizing Information Technology in the Office of Economic Adjustment. Jeffrey S. Frost and Michael P. McEwen. Logistics Management Institute, Bethesda, Maryland 20817-5886. Report FPIIOR1, July 1993.

  9. Improving data management and dissemination in web based information systems by semantic enrichment of descriptive data aspects

    NASA Astrophysics Data System (ADS)

    Gebhardt, Steffen; Wehrmann, Thilo; Klinger, Verena; Schettler, Ingo; Huth, Juliane; Künzer, Claudia; Dech, Stefan

    2010-10-01

    The German-Vietnamese water-related information system for the Mekong Delta (WISDOM) project supports business processes in Integrated Water Resources Management in Vietnam. Multiple disciplines bring together earth and ground based observation themes, such as environmental monitoring, water management, demographics, economy, information technology, and infrastructural systems. This paper introduces the components of the web-based WISDOM system including data, logic and presentation tier. It focuses on the data models upon which the database management system is built, including techniques for tagging or linking metadata with the stored information. The model also uses ordered groupings of spatial, thematic and temporal reference objects to semantically tag datasets to enable fast data retrieval, such as finding all data in a specific administrative unit belonging to a specific theme. A spatial database extension is employed by the PostgreSQL database. This object-oriented database was chosen over a relational database to tag spatial objects to tabular data, improving the retrieval of census and observational data at regional, provincial, and local areas. While the spatial database hinders processing raster data, a "work-around" was built into WISDOM to permit efficient management of both raster and vector data. The data model also incorporates styling aspects of the spatial datasets through styled layer descriptions (SLD) and web mapping service (WMS) layer specifications, allowing retrieval of rendered maps. Metadata elements of the spatial data are based on the ISO19115 standard. XML structured information of the SLD and metadata are stored in an XML database. The data models and the data management system are robust for managing the large quantity of spatial objects, sensor observations, census and document data. The operational WISDOM information system prototype contains modules for data management, automatic data integration, and web services for data retrieval, analysis, and distribution. The graphical user interfaces facilitate metadata cataloguing, data warehousing, web sensor data analysis and thematic mapping.
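
    The tagging mechanism described above, in which every dataset is linked to spatial, thematic, and temporal reference objects so that a request like "all data for a theme within an administrative unit" becomes a join, can be sketched as a schema. The table and column names below are invented for illustration, and SQLite stands in for the project's PostgreSQL back end.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Reference objects (illustrative stand-ins for WISDOM's groupings).
    CREATE TABLE admin_unit (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE theme      (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dataset    (id INTEGER PRIMARY KEY, title TEXT,
                             obs_date TEXT);
    -- Semantic tags: many-to-many links between datasets and references.
    CREATE TABLE tag (dataset_id INTEGER REFERENCES dataset(id),
                      admin_id   INTEGER REFERENCES admin_unit(id),
                      theme_id   INTEGER REFERENCES theme(id));
""")
conn.execute("INSERT INTO admin_unit VALUES (1, 'Can Tho province')")
conn.execute("INSERT INTO theme VALUES (1, 'water management')")
conn.execute("INSERT INTO dataset VALUES (1, 'Gauge readings', '2009-06-01')")
conn.execute("INSERT INTO tag VALUES (1, 1, 1)")

# "All water-management data in a given administrative unit" is one join.
rows = conn.execute("""
    SELECT d.title, d.obs_date
    FROM dataset d
    JOIN tag t        ON t.dataset_id = d.id
    JOIN admin_unit a ON a.id = t.admin_id AND a.name = 'Can Tho province'
    JOIN theme th     ON th.id = t.theme_id AND th.name = 'water management'
""").fetchall()
print(rows)   # [('Gauge readings', '2009-06-01')]
```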

  10. Scandinavian systems monitoring the oral health in children and adolescents; an evaluation of their quality and utility in the light of modern perspectives of caries management.

    PubMed

    Skeie, Marit S; Klock, Kristin S

    2014-04-30

    Recording reliable oral health data is a challenge. The aims were (a) to outline different Scandinavian systems for monitoring oral health and (b) to evaluate the quality and utility of the collected data in the light of modern concepts of disease management and to suggest improvements. The information in this study related to (a) children and adolescents, (b) oral health data, and (c) routines for monitoring such data. This meant information available on the official web sites of the "KOSTRA-data" (Municipality-State-Report) in Norway, the Swedish National Board of Health and Welfare ("Socialstyrelsen"), and the Oral Health Register (the SCOR system, National Board of Health) in Denmark. A potential for increasing the reliability and validity of the data existed. Routines for monitoring oral diseases other than caries were limited. Compared with the other Scandinavian countries, the data collection system in Denmark appeared more functional and had adopted more modern concepts of disease management. In the light of modern concepts of caries management, data collected elsewhere had limited utility. The Scandinavian systems of health reporting had much in common, but some essential differences existed. If the quality of the epidemiological data were enhanced, it would be possible to use the data for planning oral health care. Routines and procedures should be improved and updated in accordance with modern ideas about caries prevention and therapy. For appropriate oral health planning in an organised dental service, reporting of enamel caries is essential.

  11. Design and deployment of a large brain-image database for clinical and nonclinical research

    NASA Astrophysics Data System (ADS)

    Yang, Guo Liang; Lim, Choie Cheio Tchoyoson; Banukumar, Narayanaswami; Aziz, Aamer; Hui, Francis; Nowinski, Wieslaw L.

    2004-04-01

    An efficient database is an essential component of organizing diverse information on image metadata and patient information for research in medical imaging. This paper describes the design, development and deployment of a large database system serving as a brain image repository that can be used across different platforms in various medical researches. It forms the infrastructure that links hospitals and institutions together and shares data among them. The database contains patient-, pathology-, image-, research- and management-specific data. The functionalities of the database system include image uploading, storage, indexing, downloading and sharing as well as database querying and management with security and data anonymization concerns well taken care of. The structure of database is multi-tier client-server architecture with Relational Database Management System, Security Layer, Application Layer and User Interface. Image source adapter has been developed to handle most of the popular image formats. The database has a user interface based on web browsers and is easy to handle. We have used Java programming language for its platform independency and vast function libraries. The brain image database can sort data according to clinically relevant information. This can be effectively used in research from the clinicians' points of view. The database is suitable for validation of algorithms on large population of cases. Medical images for processing could be identified and organized based on information in image metadata. Clinical research in various pathologies can thus be performed with greater efficiency and large image repositories can be managed more effectively. The prototype of the system has been installed in a few hospitals and is working to the satisfaction of the clinicians.

  12. SEVEN PILLARS OF ECOSYSTEM MANAGEMENT

    EPA Science Inventory

    Ecosystem management is widely proposed in the popular and professional literature as the modern and preferred way of managing natural resources and ecosystems. Advocates glowingly describe ecosystem management as an approach that will protect the environment, maintain healthy ec...

  13. Documentation of a spatial data-base management system for monitoring pesticide application in Washington

    USGS Publications Warehouse

    Schurr, K.M.; Cox, S.E.

    1994-01-01

    The Pesticide-Application Data-Base Management System was created as a demonstration project and was tested with data submitted to the Washington State Department of Agriculture by pesticide applicators from a small geographic area. These data were entered into the Department's relational data-base system and uploaded into the system's ARC/INFO files. Locations for pesticide applications are assigned within the Public Land Survey System grids, and ARC/INFO programs in the Pesticide-Application Data-Base Management System can subdivide each survey section into sixteen idealized quarter-quarter sections for display map grids. The system provides data retrieval and geographic information system plotting capabilities from a menu of seven basic retrieval options. Additionally, ARC/INFO coverages can be created from the retrieved data when required for particular applications. The Pesticide-Application Data-Base Management System, or the general principles used in the system, could be adapted to other applications or to other states.
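
    The "sixteen idealized quarter-quarter sections" step is simple arithmetic: a one-mile-square section splits into a 4x4 grid of roughly 40-acre cells. The following sketch shows how such an idealized subdivision might be computed; the coordinate conventions are assumptions for illustration, not taken from the USGS report.

```python
def quarter_quarter_cells(x0, y0, section_size=1609.34):
    """Split an idealized one-mile-square PLSS section whose southwest
    corner is (x0, y0), in metres, into a 4x4 grid of quarter-quarter
    cells. Returns {(col, row): (xmin, ymin, xmax, ymax)}. Real surveyed
    sections deviate from the ideal square; this mirrors the report's
    idealization for display map grids."""
    step = section_size / 4.0            # quarter-quarter edge (~402 m)
    return {
        (col, row): (x0 + col * step, y0 + row * step,
                     x0 + (col + 1) * step, y0 + (row + 1) * step)
        for col in range(4) for row in range(4)
    }

cells = quarter_quarter_cells(0.0, 0.0)
print(len(cells))        # 16 cells of ~40 acres each
print(cells[(0, 0)])     # (0.0, 0.0, 402.335, 402.335)
```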

  14. Network-based statistical comparison of citation topology of bibliographic databases

    PubMed Central

    Šubelj, Lovro; Fiala, Dalibor; Bajec, Marko

    2014-01-01

    Modern bibliographic databases provide the basis for scientific research and its evaluation. While their content and structure differ substantially, there exist only informal notions on their reliability. Here we compare the topological consistency of citation networks extracted from six popular bibliographic databases including Web of Science, CiteSeer and arXiv.org. The networks are assessed through a rich set of local and global graph statistics. We first reveal statistically significant inconsistencies between some of the databases with respect to individual statistics. For example, the introduced field bow-tie decomposition of DBLP Computer Science Bibliography substantially differs from the rest due to the coverage of the database, while the citation information within arXiv.org is the most exhaustive. Finally, we compare the databases over multiple graph statistics using the critical difference diagram. The citation topology of DBLP Computer Science Bibliography is the least consistent with the rest, while, not surprisingly, Web of Science is significantly more reliable from the perspective of consistency. This work can serve either as a reference for scholars in bibliometrics and scientometrics or a scientific evaluation guideline for governments and research agencies. PMID:25263231

  15. SoyFN: a knowledge database of soybean functional networks.

    PubMed

    Xu, Yungang; Guo, Maozu; Liu, Xiaoyan; Wang, Chunyu; Liu, Yang

    2014-01-01

    Many databases for soybean genomic analysis have been built and made publicly available, but few of them contain knowledge specifically targeting the omics-level gene-gene, gene-microRNA (miRNA) and miRNA-miRNA interactions. Here, we present SoyFN, a knowledge database of soybean functional gene networks and miRNA functional networks. SoyFN provides user-friendly interfaces to retrieve, visualize, analyze and download the functional networks of soybean genes and miRNAs. In addition, it incorporates extensive information on KEGG pathways, gene ontology annotations and 3'-UTR sequences, as well as many useful tools including SoySearch, ID mapping, Genome Browser, eFP Browser and a promoter motif scan. SoyFN is a schema-free database that can be accessed as a Web service from any modern programming language using a simple Hypertext Transfer Protocol call. The Web site is implemented in Java, JavaScript, PHP, HTML and Apache, with all major browsers supported. We anticipate that this database will be useful for members of research communities in both soybean experimental science and bioinformatics. Database URL: http://nclab.hit.edu.cn/SoyFN.
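
    Since the abstract states that SoyFN can be called from any modern language with a plain HTTP request, a minimal Java sketch of such a call follows. Only the base URL comes from the abstract; the query path and parameter name are hypothetical.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    // Minimal sketch of calling a SoyFN-style web service over plain HTTP.
    public class SoyFNClient {
        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder(
                    // "/search?gene=..." is a hypothetical endpoint, for illustration
                    URI.create("http://nclab.hit.edu.cn/SoyFN/search?gene=Glyma01g01000"))
                .GET()
                .build();
            HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode());
            System.out.println(response.body());
        }
    }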

  16. Impact of selection on maize root traits and rhizosphere interactions

    NASA Astrophysics Data System (ADS)

    Schmidt, J. E.; Gaudin, A. C. M.

    2017-12-01

    Effects of domestication and breeding on maize have been well-characterized aboveground, but impacts on root traits and rhizosphere processes remain unclear. Breeding in high-inorganic-input environments may have negatively affected the ability of modern maize to acquire nutrients through foraging and microbial interactions in marginal and/or organically managed soils. Twelve maize genotypes representing a selection gradient (teosintes, landraces, open-pollinated parents of modern elite germplasm, and modern hybrids released 1934-2015) were grown in three soils varying in intensity of long-term management (unfertilized, organic, conventional) in the greenhouse. Recruitment of rhizosphere microbial communities, nutrient acquisition, and plant productivity were affected by genotype-by-soil interactions. Maize genotypes exhibit significant variation in their ability to obtain nutrients from soils of different management history, indicating the potential for re-integration of beneficial root and rhizosphere traits to increase adaptation to low-input agroecosystems.

  17. From experimental imaging techniques to virtual embryology.

    PubMed

    Weninger, Wolfgang J; Tassy, Olivier; Darras, Sébastien; Geyer, Stefan H; Thieffry, Denis

    2004-01-01

    Modern embryology increasingly relies on descriptive and functional three-dimensional (3D) and four-dimensional (4D) analysis of physically, optically, or virtually sectioned specimens. To cope with the technical requirements, new methods for highly detailed in vivo imaging, as well as for generating high-resolution digital volume data sets for the accurate visualisation of transgene activity and gene product presence in the context of embryo morphology, have recently been developed or are under development. These methods profoundly change the scientific applicability, appearance and style of modern embryo representations. In this paper, we present an overview of the emerging techniques to create, visualise and administrate embryo representations (databases, digital data sets, 3-4D embryo reconstructions, models, etc.), and discuss the implications of these new methods for the work of modern embryologists, including research, teaching, the selection of specific model organisms, and potential collaborators.

  18. The future of medical diagnostics: large digitized databases.

    PubMed

    Kerr, Wesley T; Lau, Edward P; Owens, Gwen E; Trefler, Aaron

    2012-09-01

    The electronic health record mandate within the American Recovery and Reinvestment Act of 2009 will have a far-reaching effect on medicine. In this article, we provide an in-depth analysis of how this mandate is expected to stimulate the production of large-scale, digitized databases of patient information. There is evidence to suggest that millions of patients and the National Institutes of Health will fully support the mining of such databases to better understand the process of diagnosing patients. This data mining likely will reaffirm and quantify known risk factors for many diagnoses. This quantification may be leveraged to further develop computer-aided diagnostic tools that weigh risk factors and provide decision support for health care providers. We expect that creation of these databases will stimulate the development of computer-aided diagnostic support tools that will become an integral part of modern medicine.
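
    As a generic illustration (not taken from the article) of how a decision-support tool can "weigh risk factors", the following Java sketch applies a logistic model to a weighted sum of factors; all factor names and weights are invented for the example.

    // A logistic model maps a weighted sum of risk factors to a probability.
    public class RiskScore {
        static double logistic(double z) { return 1.0 / (1.0 + Math.exp(-z)); }

        public static void main(String[] args) {
            double intercept = -4.0;              // hypothetical baseline log-odds
            double[] weights = {0.04, 0.8, 1.1};  // age, smoker, family history (all hypothetical)
            double[] patient = {62, 1, 0};        // one patient's factor values

            double z = intercept;
            for (int i = 0; i < weights.length; i++) z += weights[i] * patient[i];

            System.out.printf("estimated risk = %.1f%%%n", 100 * logistic(z));
        }
    }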

  19. The quest for the perfect gravity anomaly: Part 1 - New calculation standards

    USGS Publications Warehouse

    Li, X.; Hildenbrand, T.G.; Hinze, W. J.; Keller, Gordon R.; Ravat, D.; Webring, M.

    2006-01-01

    The North American gravity database together with databases from Canada, Mexico, and the United States are being revised to improve their coverage, versatility, and accuracy. An important part of this effort is revision of procedures and standards for calculating gravity anomalies taking into account our enhanced computational power, modern satellite-based positioning technology, improved terrain databases, and increased interest in more accurately defining different anomaly components. The most striking revision is the use of one single internationally accepted reference ellipsoid for the horizontal and vertical datums of gravity stations as well as for the computation of the theoretical gravity. The new standards hardly impact the interpretation of local anomalies, but do improve regional anomalies. Most importantly, such new standards can be consistently applied to gravity database compilations of nations, continents, and even the entire world. © 2005 Society of Exploration Geophysicists.
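
    One concrete piece of such a standard is the computation of theoretical (normal) gravity on a single reference ellipsoid. The following Java sketch uses the closed-form Somigliana equation with GRS80 constants, a common choice; the paper's exact adopted formula and constants are not reproduced here.

    // Normal gravity on the GRS80 ellipsoid via the closed-form Somigliana equation.
    public class NormalGravity {
        static final double GAMMA_E = 9.7803267715;   // equatorial normal gravity, m/s^2 (GRS80)
        static final double K = 0.001931851353;       // Somigliana constant (GRS80)
        static final double E2 = 0.00669438002290;    // first eccentricity squared (GRS80)

        static double normalGravity(double latDeg) {
            double s2 = Math.pow(Math.sin(Math.toRadians(latDeg)), 2);
            return GAMMA_E * (1 + K * s2) / Math.sqrt(1 - E2 * s2);
        }

        public static void main(String[] args) {
            for (double lat : new double[]{0, 45, 90})
                System.out.printf("lat %5.1f -> gamma = %.7f m/s^2%n", lat, normalGravity(lat));
        }
    }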

  20. Informatics in radiology: use of CouchDB for document-based storage of DICOM objects.

    PubMed

    Rascovsky, Simón J; Delgado, Jorge A; Sanz, Alexander; Calvo, Víctor D; Castrillón, Gabriel

    2012-01-01

    Picture archiving and communication systems traditionally have depended on schema-based Structured Query Language (SQL) databases for imaging data management. To optimize database size and performance, many such systems store a reduced set of Digital Imaging and Communications in Medicine (DICOM) metadata, discarding informational content that might be needed in the future. As an alternative to traditional database systems, document-based key-value stores recently have gained popularity. These systems store documents containing key-value pairs that facilitate data searches without predefined schemas. Document-based key-value stores are especially suited to archive DICOM objects because DICOM metadata are highly heterogeneous collections of tag-value pairs conveying specific information about imaging modalities, acquisition protocols, and vendor-supported postprocessing options. The authors used an open-source document-based database management system (Apache CouchDB) to create and test two such databases; CouchDB was selected for its overall ease of use, capability for managing attachments, and reliance on HTTP and Representational State Transfer standards for accessing and retrieving data. A large database was created first in which the DICOM metadata from 5880 anonymized magnetic resonance imaging studies (1,949,753 images) were loaded by using a Ruby script. To provide the usual DICOM query functionality, several predefined "views" (standard queries) were created by using JavaScript. For performance comparison, the same queries were executed in both the CouchDB database and a SQL-based DICOM archive. The capabilities of CouchDB for attachment management and database replication were separately assessed in tests of a similar, smaller database. Results showed that CouchDB allowed efficient storage and interrogation of all DICOM objects; with the use of information retrieval algorithms such as map-reduce, all the DICOM metadata stored in the large database were searchable with only a minimal increase in retrieval time over that with the traditional database management system. Results also indicated possible uses for document-based databases in data mining applications such as dose monitoring, quality assurance, and protocol optimization. RSNA, 2012
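
    The access pattern the authors describe, JavaScript map views queried over plain HTTP/REST, can be exercised from any language. A minimal Java sketch follows; the database, design-document and view names are hypothetical, not the authors'.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    // Query a (hypothetical) CouchDB view mapping StudyInstanceUID -> document.
    public class CouchDicomQuery {
        public static void main(String[] args) throws Exception {
            String url = "http://localhost:5984/dicom/_design/queries/_view/by_study_uid"
                       + "?key=%221.2.840.113619.2.55.3%22"; // URL-encoded JSON string key
            HttpResponse<String> response = HttpClient.newHttpClient().send(
                HttpRequest.newBuilder(URI.create(url)).GET().build(),
                HttpResponse.BodyHandlers.ofString());
            System.out.println(response.body()); // JSON rows emitted by the map function
        }
    }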

  1. Kingfisher: a system for remote sensing image database management

    NASA Astrophysics Data System (ADS)

    Bruzzo, Michele; Giordano, Ferdinando; Dellepiane, Silvana G.

    2003-04-01

    At present, retrieval methods in remote sensing image databases are mainly based on spatio-temporal information. The increasing amount of images collected by the ground stations of earth observing systems emphasizes the need for database management with intelligent data retrieval capabilities. The purpose of the proposed method is to realize a new content-based retrieval system for remote sensing image databases with an innovative search tool based on image similarity. This methodology is quite innovative for this application: many systems exist for photographic images, for example QBIC and IKONA, but they cannot properly extract and describe remote sensing image content. The target database is an archive of images originating from an X-SAR sensor (spaceborne mission, 1994). The best content descriptors, mainly texture parameters, guarantee high retrieval performance and can be extracted without loss independently of image resolution. The latter property allows the DBMS (Database Management System) to process a small amount of information, as in the case of quick-look images, improving time performance and memory access without reducing retrieval accuracy. The matching technique has been designed to enable image management (database population and retrieval) independently of dimensions (width and height). Local and global content descriptors are compared with the query image during the retrieval phase, and results seem very encouraging.
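
    The matching step reduces to ranking archive images by the distance between precomputed content descriptors. A minimal Java sketch of that ranking follows; the descriptor values and image ids are illustrative, since the paper's actual texture parameters are not listed in the abstract.

    import java.util.Comparator;
    import java.util.Map;

    // Rank archive images by Euclidean distance from the query image's
    // feature vector (a stand-in for the paper's texture descriptors).
    public class ContentRetrieval {
        static double distance(double[] a, double[] b) {
            double s = 0;
            for (int i = 0; i < a.length; i++) s += (a[i] - b[i]) * (a[i] - b[i]);
            return Math.sqrt(s);
        }

        public static void main(String[] args) {
            double[] query = {0.42, 1.7, 0.08};               // query image descriptor
            Map<String, double[]> archive = Map.of(           // image id -> descriptor
                "xsar_001", new double[]{0.40, 1.9, 0.05},
                "xsar_002", new double[]{0.90, 0.3, 0.31},
                "xsar_003", new double[]{0.45, 1.6, 0.09});

            archive.entrySet().stream()
                .sorted(Comparator.comparingDouble(
                    (Map.Entry<String, double[]> e) -> distance(query, e.getValue())))
                .forEach(e -> System.out.printf("%s  d=%.3f%n",
                    e.getKey(), distance(query, e.getValue())));
        }
    }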

  2. Medical uses of Carthamus tinctorius L. (Safflower): a comprehensive review from Traditional Medicine to Modern Medicine.

    PubMed

    Delshad, Elahe; Yousefi, Mahdi; Sasannezhad, Payam; Rakhshandeh, Hasan; Ayati, Zahra

    2018-04-01

    Carthamus tinctorius L., known as Kafesheh (Persian) and safflower (English), is widely used in Traditional Medicine for various medical conditions, namely dysmenorrhea, amenorrhea, postpartum abdominal pain and mass, trauma and pain of joints. It is largely used for flavoring and coloring purposes among the local population. Recent reviews have addressed the uses of the plant in various ethnomedical systems. This review is an update providing a summary of the botanical features, uses in Iranian folklore and modern medical applications of safflower. A database on the ethnopharmacology and modern pharmacology of C. tinctorius was assembled from important early texts written in Persian together with electronic papers. The literature review covered the years 1937 to 2016 in Web of Science, PubMed, Scientific Information Database, Google Scholar, and Scopus for the terms "Kafesheh", "safflower", "Carthamus tinctorius", and so forth. Safflower is an indispensable element of Iranian folklore medicine, with a variety of applications including its laxative effects. It was also recommended as treatment for rheumatism and paralysis, vitiligo and black spots, psoriasis, mouth ulcers, phlegm humor, poisoning, numb limbs, melancholy humor, and the like. According to modern pharmacological and clinical examinations, safflower provides promising opportunities for the amelioration of myocardial ischemia, coagulation, thrombosis, inflammation, toxicity, cancer, and so forth. However, there have been some reports of undesirable effects on male and female fertility. Most of these beneficial therapeutic effects have been attributed to hydroxysafflor yellow A. More attention should be drawn to the lack of a thorough phytochemical investigation. The potential implications of safflower based on Persian traditional medicine, such as the treatment of rheumatism and paralysis, vitiligo and black spots, psoriasis, mouth ulcers, phlegm humor, poisoning, numb limbs, and melancholy humor, warrant further consideration.

  3. The ID Database: Managing the Instructional Development Process

    ERIC Educational Resources Information Center

    Piña, Anthony A.; Sanford, Barry K.

    2017-01-01

    Management is evolving as a foundational domain to the field of instructional design and technology. However, there are few tools dedicated to the management of instructional design and development projects and activities. In this article, we describe the development, features and implementation of an instructional design database--built from a…

  4. Expansion of the MANAGE database with forest and drainage studies

    USDA-ARS?s Scientific Manuscript database

    The “Measured Annual Nutrient loads from AGricultural Environments” (MANAGE) database was published in 2006 to expand an early 1980’s compilation of nutrient export (load) data from agricultural land uses at the field or farm spatial scale. Then in 2008, MANAGE was updated with 15 additional studie...

  5. 76 FR 12617 - Airworthiness Directives; The Boeing Company Model 777-200 and -300 Series Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-08

    ... installing new operational software for the electrical load management system and configuration database... the electrical load management system operational software and configuration database software, in... Management, P.O. Box 3707, MC 2H-65, Seattle, Washington 98124-2207; telephone 206- 544-5000, extension 1...

  6. Advanced Distribution Management Systems | Grid Modernization | NREL

    Science.gov Websites

    Electric utilities are investing in updated grid technologies such as advanced distribution management systems; NREL provides a testbed for evaluating these systems, including cyber security in power systems.

  7. Managing today's complex healthcare business enterprise: reflections on distinctive requirements of healthcare management education.

    PubMed

    Welton, William E

    2004-01-01

    In early 2001, the community of educational programs offering master's-level education in healthcare management began an odyssey to modernize its approach to the organization and delivery of healthcare management education. The community recognized that cumulative long-term changes within healthcare management practice required a careful examination of healthcare management context and manpower requirements. This article suggests an evidence-based rationale for defining the distinctive elements of healthcare management, thus suggesting a basis for review and transformation of master's-level healthcare management curricula. It also suggests ways to modernize these curricula in a manner that recognizes the distinctiveness of the healthcare business enterprise as well as the changing management roles and careers within these complex organizations and systems. Through such efforts, the healthcare management master's-level education community would be better prepared to meet current and future challenges, to increase its relevance to the management practice community, and to allocate scarce faculty and program resources more effectively.

  8. Solving Relational Database Problems with ORDBMS in an Advanced Database Course

    ERIC Educational Resources Information Center

    Wang, Ming

    2011-01-01

    This paper introduces how to use the object-relational database management system (ORDBMS) to solve relational database (RDB) problems in an advanced database course. The purpose of the paper is to provide a guideline for database instructors who desire to incorporate the ORDB technology in their traditional database courses. The paper presents…

  9. Migration from relational to NoSQL database

    NASA Astrophysics Data System (ADS)

    Ghotiya, Sunita; Mandal, Juhi; Kandasamy, Saravanakumar

    2017-11-01

    Data generated by real-time applications, social networking sites and sensor devices is huge in volume and unstructured, which makes it difficult for relational database management systems to handle. Data is a precious component of any application and needs to be analysed after being arranged in some structure. Relational databases can only deal with structured data, so there is a need for NoSQL database management systems, which can also deal with semi-structured data. Relational databases provide the easiest way to manage data, but as the use of NoSQL increases it is becoming necessary to migrate data from relational to NoSQL databases. Various frameworks have been proposed previously that provide mechanisms for migrating data stored in SQL warehouses, as well as middle-layer solutions that allow data to be stored in NoSQL databases to handle unstructured data. This paper provides a literature review of some of the recent approaches proposed by various researchers to migrate data from relational to NoSQL databases. Some researchers have proposed mechanisms for the co-existence of NoSQL and relational databases. The paper summarises mechanisms that can be used for mapping data stored in relational databases to NoSQL databases, along with various techniques for data transformation and middle-layer solutions.
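
    The core mapping step such migrations share is turning each relational row into a self-contained key-value document. A minimal Java/JDBC sketch of that step follows, with a hypothetical connection string and table; a real migration would also serialize the document as JSON and insert it into the target store.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.ResultSetMetaData;
    import java.sql.Statement;
    import java.util.LinkedHashMap;
    import java.util.Map;

    // Turn each row into a document keyed by column name.
    public class RowToDocument {
        public static void main(String[] args) throws Exception {
            try (Connection con = DriverManager.getConnection(
                     "jdbc:mysql://localhost:3306/appdb", "user", "secret");
                 Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery("SELECT * FROM customers")) {
                ResultSetMetaData md = rs.getMetaData();
                while (rs.next()) {
                    Map<String, Object> doc = new LinkedHashMap<>();
                    for (int c = 1; c <= md.getColumnCount(); c++)
                        doc.put(md.getColumnLabel(c), rs.getObject(c));
                    // A real migration would now insert `doc` into the target
                    // NoSQL store (and may embed rows from child tables).
                    System.out.println(doc);
                }
            }
        }
    }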

  10. An update on the management of breast cancer in Africa.

    PubMed

    Vanderpuye, V; Grover, S; Hammad, N; PoojaPrabhakar; Simonds, H; Olopade, F; Stefan, D C

    2017-01-01

    There is limited information about the challenges of cancer management and attempts at improving outcomes in Africa. Even though South and North Africa are better resourced to tackle the burden of breast cancer, similar poor prognostic factors are common to all countries. The five-year overall survival rate for breast cancer patients does not exceed 60% in any low- and middle-income country (LMIC) in Africa. In spite of the gains achieved over the past decade, certain characteristics remain unchanged, such as limited availability of breast conservation therapies, inadequate access to drugs, few oncology specialists and adherence to harmful socio-cultural practices. This review on managing breast cancer in Africa is authored by African oncologists who practice or collaborate in Africa and have hands-on experience of the realities. A search of electronic databases (PubMed/Medline, African Journals Online) was performed for all literature from 1999 to 2016, in English or translated into English, covering the terms "breast cancer in Africa and developing countries". One hundred ninety articles were deemed appropriate. Breast tumors are diagnosed at earlier ages and later stages than in high-income countries. There is a higher prevalence of triple-negative cancers. The limitations of poor nursing care and surgery, inadequate access to radiotherapy and poor availability of basic and modern systemic therapies translate into lower survival rates. Positive strides in breast cancer management in Africa include increased adoption of treatment guidelines, improved pathology services including immunohistochemistry, and expansion and upgrading of radiotherapy equipment across the continent, in addition to more research opportunities. This review is an update on the management of breast cancer in Africa, examining the epidemiology, pathology, management resources, outcomes, research and limitations in Africa from the perspective of oncologists with local experience.

  11. National Library of Medicine

    MedlinePlus

    ... Disasters and Public Health Emergencies The NLM Disaster Information Management Research Center has tools, guides, and databases to ...

  12. Construction of In-house Databases in a Corporation

    NASA Astrophysics Data System (ADS)

    Dezaki, Kyoko; Saeki, Makoto

    Rapid progress in information technology has increased the need to strengthen documentation activities in industry. In response, Tokin Corporation has been engaged in database construction for patent information, technical reports and other material accumulated inside the company. Two systems have resulted: TOPICS, an in-house patent information management system, and TOMATIS, a management and technical information system built on personal computers and general-purpose relational database software. These systems aim at compiling databases of patent and technological management information generated internally and externally, with little labor and at low cost, and at providing comprehensive information company-wide. This paper introduces the outline of these systems and how they are actually used.

  13. An integrated photogrammetric and spatial database management system for producing fully structured data using aerial and remote sensing images.

    PubMed

    Ahmadi, Farshid Farnood; Ebadi, Hamid

    2009-01-01

    3D spatial data acquired from aerial and remote sensing images by photogrammetric techniques is one of the most accurate and economical data sources for GIS, map production, and spatial data updating. However, there are still many problems concerning the storage, structuring and appropriate management of spatial data obtained using these techniques. Given the capabilities of spatial database management systems (SDBMSs), direct integration of photogrammetric and spatial database management systems can save time and cost in producing and updating digital maps. This integration is accomplished by replacing digital maps with a single spatial database. Applying spatial databases overcomes the problem of managing spatial and attribute data in a coupled approach; this management approach is one of the main problems in GISs when using map products of photogrammetric workstations. Also, by means of these integrated systems, structured spatial data based on OGC (Open GIS Consortium) standards and topological relations between different feature classes can be provided at the time of the feature digitizing process. In this paper, the integration of photogrammetric systems and SDBMSs is evaluated. Then, different levels of integration are described. Finally, the design, implementation and testing of a software package called Integrated Photogrammetric and Oracle Spatial Systems (IPOSS) is presented.

  14. Personal Database Management System I TRIAS

    NASA Astrophysics Data System (ADS)

    Yamamoto, Yoneo; Kashihara, Akihiro; Kawagishi, Keisuke

    The current paper presents TRIAS (TRIple Associative System), a database management system for personal use. To implement TRIAS, we have developed an associative database whose format is (e,a,v): e for entity, a for attribute, v for value. ML-TREE is used to construct the (e,a,v) store; ML-TREE is a variant of the B+-tree, which is a multiway balanced tree. The paper focuses mainly on the usage of the associative database, demonstrating how to use its basic commands, primary functions and applications.
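
    A minimal Java sketch of the (e,a,v) associative idea follows, with an ordered map standing in for ML-TREE: triples are keyed by entity and attribute so that all attributes of an entity form one contiguous key range. The API shown is illustrative, not TRIAS's actual command set.

    import java.util.Map;
    import java.util.TreeMap;

    // An (entity, attribute) -> value store over an ordered map; a TreeMap
    // plays the role of the paper's ML-TREE balanced multiway tree.
    public class TripleStore {
        private final TreeMap<String, String> triples = new TreeMap<>();

        private static String key(String e, String a) { return e + "\u0000" + a; }

        void put(String e, String a, String v) { triples.put(key(e, a), v); }

        String get(String e, String a) { return triples.get(key(e, a)); }

        Map<String, String> attributesOf(String e) {
            // All keys starting with "e\u0000" form one contiguous range.
            return triples.subMap(e + "\u0000", e + "\u0001");
        }

        public static void main(String[] args) {
            TripleStore db = new TripleStore();
            db.put("patient42", "name", "A. Example");
            db.put("patient42", "ward", "B3");
            System.out.println(db.get("patient42", "ward"));   // B3
            System.out.println(db.attributesOf("patient42"));  // both attributes
        }
    }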

  15. The ADAMS interactive interpreter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rietscha, E.R.

    1990-12-17

    The ADAMS (Advanced DAta Management System) project is exploring next generation database technology. Database management does not follow the usual programming paradigm. Instead, the database dictionary provides an additional name space environment that should be interactively created and tested before writing application code. This document describes the implementation and operation of the ADAMS Interpreter, an interactive interface to the ADAMS data dictionary and runtime system. The Interpreter executes individual statements of the ADAMS Interface Language, providing a fast, interactive mechanism to define and access persistent databases. 5 refs.

  16. An Extensible "SCHEMA-LESS" Database Framework for Managing High-Throughput Semi-Structured Documents

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Tran, Peter B.

    2003-01-01

    An object-relational database management system is an integrated hybrid approach that combines the best practices of the relational model, with its SQL queries, and the object-oriented, semantic paradigm that supports complex data creation. In this paper, a highly scalable, information-on-demand database framework called NETMARK is introduced. NETMARK takes advantage of the Oracle 8i object-relational database, using physical-address data types for very efficient keyword search of records spanning both context and content. NETMARK was originally developed in early 2000 as a research and development prototype to deal with the vast amounts of unstructured and semi-structured documents existing within NASA enterprises. Today, NETMARK is a flexible, high-throughput open database framework for managing, storing, and searching unstructured or semi-structured arbitrary hierarchical models, such as XML and HTML.
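
    A minimal Java sketch of the general schema-less idea, not NETMARK's Oracle-based implementation, follows: decompose a markup document into flat (path, text) node records, then keyword-search across both the context (element path) and the content (text).

    import java.util.ArrayList;
    import java.util.List;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    // Flatten markup into node records and search context OR content.
    public class NodeSearch {
        record Node(String path, String text) {}

        public static void main(String[] args) {
            String xml = "<report><title>Wind tunnel test</title>"
                       + "<body>Flow separation observed</body></report>";

            // Naive single-level parse for illustration (no nesting handled).
            List<Node> nodes = new ArrayList<>();
            Matcher m = Pattern.compile("<(\\w+)>([^<]*)</\\1>").matcher(xml);
            while (m.find()) nodes.add(new Node("/report/" + m.group(1), m.group(2)));

            String keyword = "title";
            for (Node n : nodes)
                if (n.path().contains(keyword) || n.text().contains(keyword))
                    System.out.println(n); // matches context or content
        }
    }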

  17. An Extensible Schema-less Database Framework for Managing High-throughput Semi-Structured Documents

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Tran, Peter B.; La, Tracy; Clancy, Daniel (Technical Monitor)

    2002-01-01

    An object-relational database management system is an integrated hybrid approach that combines the best practices of the relational model, with its SQL queries, and the object-oriented, semantic paradigm that supports complex data creation. In this paper, a highly scalable, information-on-demand database framework called NETMARK is introduced. NETMARK takes advantage of the Oracle 8i object-relational database, using physical-address data types for very efficient keyword searches of records for both context and content. NETMARK was originally developed in early 2000 as a research and development prototype to deal with the vast amounts of unstructured and semi-structured documents existing within NASA enterprises. Today, NETMARK is a flexible, high-throughput open database framework for managing, storing, and searching unstructured or semi-structured arbitrary hierarchical models such as XML and HTML.

  18. NETMARK: A Schema-less Extension for Relational Databases for Managing Semi-structured Data Dynamically

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Tran, Peter B.

    2003-01-01

    An object-relational database management system is an integrated hybrid approach that combines the best practices of the relational model, with its SQL queries, and the object-oriented, semantic paradigm that supports complex data creation. In this paper, a highly scalable, information-on-demand database framework called NETMARK is introduced. NETMARK takes advantage of the Oracle 8i object-relational database, using physical-address data types for very efficient keyword search of records spanning both context and content. NETMARK was originally developed in early 2000 as a research and development prototype to deal with the vast amounts of unstructured and semi-structured documents existing within NASA enterprises. Today, NETMARK is a flexible, high-throughput open database framework for managing, storing, and searching unstructured or semi-structured arbitrary hierarchical models, such as XML and HTML.

  19. Novel statistical tools for management of public databases facilitate community-wide replicability and control of false discovery.

    PubMed

    Rosset, Saharon; Aharoni, Ehud; Neuvirth, Hani

    2014-07-01

    Issues of publication bias, lack of replicability, and false discovery have long plagued the genetics community. Proper utilization of public and shared data resources presents an opportunity to ameliorate these problems. We present an approach to public database management that we term Quality Preserving Database (QPD). It enables perpetual use of the database for testing statistical hypotheses while controlling false discovery and avoiding publication bias on the one hand, and maintaining testing power on the other hand. We demonstrate it on a use case of a replication server for GWAS findings, underlining its practical utility. We argue that a shift to using QPD in managing current and future biological databases will significantly enhance the community's ability to make efficient and statistically sound use of the available data resources. © 2014 WILEY PERIODICALS, INC.

  20. VIEWCACHE: An incremental pointer-based access method for autonomous interoperable databases

    NASA Technical Reports Server (NTRS)

    Roussopoulos, N.; Sellis, Timos

    1992-01-01

    One of the biggest problems facing NASA today is to provide scientists with efficient access to a large number of distributed databases. Our pointer-based incremental database access method, VIEWCACHE, provides such an interface for accessing distributed data sets and directories. VIEWCACHE allows database browsing and searching, performing inter-database cross-referencing with no actual data movement between database sites. This organization and processing is especially suitable for managing astrophysics databases which are physically distributed all over the world. Once the search is complete, the set of collected pointers to the desired data is cached. VIEWCACHE includes spatial access methods for accessing image data sets, which provide much easier query formulation by referring directly to the image and a very efficient search for objects contained within a two-dimensional window. We will develop and optimize a VIEWCACHE External Gateway Access to database management systems to facilitate distributed database search.
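
    A minimal Java sketch of the pointer-based idea follows: a cross-database search returns and caches only pointers (site, object id), and data move only when a cached pointer is dereferenced. All names and structures are illustrative, not the VIEWCACHE implementation.

    import java.util.List;
    import java.util.Map;

    // Search cheaply with pointers; move data only on dereference.
    public class PointerCache {
        record Pointer(String site, long id) {}

        // Pretend remote search: returns pointers, never the records themselves.
        static List<Pointer> search(String query) {
            return List.of(new Pointer("archive-eu", 17), new Pointer("archive-us", 42));
        }

        // Dereference on demand; this is the only step that moves actual data.
        static String fetch(Pointer p) {
            Map<String, String> fakeRemote = Map.of("archive-eu/17", "spectrum A",
                                                    "archive-us/42", "image B");
            return fakeRemote.get(p.site() + "/" + p.id());
        }

        public static void main(String[] args) {
            List<Pointer> cached = search("quasar");   // cheap: pointers only
            System.out.println(fetch(cached.get(0)));  // data move on demand
        }
    }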

  1. Osteoporosis therapies: evidence from health-care databases and observational population studies.

    PubMed

    Silverman, Stuart L

    2010-11-01

    Osteoporosis is a well-recognized disease with severe consequences if left untreated. Randomized controlled trials are the most rigorous method for determining the efficacy and safety of therapies. Nevertheless, randomized controlled trials underrepresent the real-world patient population and are costly in both time and money. Modern technology has enabled researchers to use information gathered from large health-care or medical-claims databases to assess the practical utilization of available therapies in appropriate patients. Observational database studies lack randomization but, if carefully designed and successfully completed, can provide valuable information that complements results obtained from randomized controlled trials and extends our knowledge to real-world clinical patients. Randomized controlled trials comparing fracture outcomes among osteoporosis therapies are difficult to perform. In this regard, large observational database studies could be useful in identifying clinically important differences among therapeutic options. Database studies can also provide important information with regard to osteoporosis prevalence, health economics, and compliance and persistence with treatment. This article describes the strengths and limitations of both randomized controlled trials and observational database studies, discusses considerations for observational study design, and reviews a wealth of information generated by database studies in the field of osteoporosis.

  2. A case study for a digital seabed database: Bohai Sea engineering geology database

    NASA Astrophysics Data System (ADS)

    Tianyun, Su; Shikui, Zhai; Baohua, Liu; Ruicai, Liang; Yanpeng, Zheng; Yong, Wang

    2006-07-01

    This paper discusses the design of the ORACLE-based Bohai Sea engineering geology database, covering requirements analysis, conceptual structure, logical structure, physical structure and security design. In the study, we used the object-oriented Unified Modeling Language (UML) to model the conceptual structure of the database, and used the powerful data management functions provided by ORACLE, an object-relational database, to organize and manage the storage space and improve security. By these means, the database provides rapid and highly effective performance in data storage, maintenance and query, satisfying the application requirements of the Bohai Sea Oilfield Paradigm Area Information System.

  3. Geoscience research databases for coastal Alabama ecosystem management

    USGS Publications Warehouse

    Hummell, Richard L.

    1995-01-01

    Effective management of complex coastal ecosystems necessitates access to scientific knowledge that can be acquired through a multidisciplinary approach involving Federal and State scientists, one that takes advantage of agency expertise and resources for the benefit of all participants working toward a set of common research and management goals. Cooperative geoscience investigations have led toward building databases of fundamental scientific knowledge that can be utilized to manage coastal Alabama's natural resources and future development. These databases have been used to assess the occurrence and economic potential of hard mineral resources in the Alabama EEZ, and to support oil spill contingency planning and environmental analysis for coastal Alabama.

  4. Using Virtual Servers to Teach the Implementation of Enterprise-Level DBMSs: A Teaching Note

    ERIC Educational Resources Information Center

    Wagner, William P.; Pant, Vik

    2010-01-01

    One of the areas where demand has remained strong for MIS students is in the area of database management. Since the early days, this topic has been a mainstay in the MIS curriculum. Students of database management today typically learn about relational databases, SQL, normalization, and how to design and implement various kinds of database…

  5. Towards G2G: Systems of Technology Database Systems

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Bell, David

    2005-01-01

    We present an approach and methodology for developing Government-to-Government (G2G) systems of technology database systems. G2G will deliver technologies for distributed and remote integration of technology data for internal use in analysis and planning as well as for external communications. G2G enables NASA managers, engineers, operational teams and information systems to "compose" technology roadmaps and plans by selecting, combining, extending, specializing and modifying components of technology database systems. G2G will allow information and knowledge distributed across the organizational entities involved to interoperate, which is ideal for NASA's future Exploration Enterprise. Key contributions of the G2G system will include the creation of an integrated approach to sustain effective management of technology investments while supporting the ability of the various technology database systems to be independently managed. The integration technology will comply with emerging open standards. Applications can thus be customized for local needs while enabling an integrated management-of-technology approach that serves the global needs of NASA. The G2G capabilities will use NASA's breakthrough in database "composition" and integration technology, will use and advance emerging open standards, and will use commercial information technologies to enable effective systems of technology database systems.

  6. The SysMan monitoring service and its management environment

    NASA Astrophysics Data System (ADS)

    Debski, Andrzej; Janas, Ekkehard

    1996-06-01

    Management of modern information systems is becoming more and more complex. There is a growing need for powerful, flexible and affordable management tools to assist system managers in maintaining such systems. It is at the same time evident that effective management should integrate network management, system management and application management in a uniform way. The object-oriented OSI management architecture, with its four basic modelling concepts (information, organization, communication and functional models), together with widely accepted distribution platforms such as ANSA/CORBA, constitutes a reliable and modern framework for the implementation of a management toolset. This paper focuses on the concepts and implementation results of an object-oriented management toolset developed and implemented within the framework of the ESPRIT project 7026 SysMan. An overview is given of the implemented SysMan management services, including the System Management Service, Monitoring Service, Network Management Service, Knowledge Service, Domain and Policy Service, and the User Interface. Special attention is paid to the Monitoring Service, which incorporates the architectural key entity responsible for event management. Its architecture and building components, especially filters, are emphasized and presented in detail.

  7. Economics of pressure-ulcer care: review of the literature on modern versus traditional dressings.

    PubMed

    San Miguel, L; Torra i Bou, J E; Verdú Soriano, J

    2007-01-01

    Published evidence suggests that some of the benefits of modern dressings--longer wear times and less frequent dressing changes--make them more cost-effective than traditional gauze dressings in pressure ulcer management.

  8. Developing a web-based forecasting tool for nutrient management

    USDA-ARS?s Scientific Manuscript database

    Modern nutrient management planning tools provide strategic guidance that, in the best cases, educates farmers and others involved in nutrient management to make prudent management decisions. The strategic guidance provided by nutrient management plans does not provide the day-to-day support require...

  9. Postmodern Program Management

    DTIC Science & Technology

    2008-06-01

    Charlie Chaplin, in 1936, to make the aptly titled film Modern Times. Watch the movie to see what we mean. Postmodernism: The Humanist Reaction Along...PM, plus more. Further, because Pomo PMs do not insist on standardization to the degree Modern PMs do, they spend much less time producing the...voluminous, detailed documentation that Modern PMs require to ensure precise repeatability, and much more time on actually doing things (perhaps

  10. Adopting a corporate perspective on databases. Improving support for research and decision making.

    PubMed

    Meistrell, M; Schlehuber, C

    1996-03-01

    The Veterans Health Administration (VHA) is at the forefront of designing and managing health care information systems that accommodate the needs of clinicians, researchers, and administrators at all levels. Rather than using one single-site, centralized corporate database, VHA has constructed several large databases with different configurations to meet the needs of users with different perspectives. The largest VHA database is the Decentralized Hospital Computer Program (DHCP), a multisite, distributed data system that uses decoupled hospital databases. The centralization of DHCP policy has promoted data coherence, whereas the decentralization of DHCP management has permitted system development to be done with maximum relevance to the users' local practices. A more recently developed VHA data system, the Event Driven Reporting system (EDR), uses multiple, highly coupled databases to provide workload data at facility, regional, and national levels. The EDR automatically posts a subset of DHCP data to local and national VHA management. The development of the EDR illustrates how adoption of a corporate perspective can offer significant database improvements at reasonable cost and with modest impact on the legacy system.

  11. Technical Aspects of Interfacing MUMPS to an External SQL Relational Database Management System

    PubMed Central

    Kuzmak, Peter M.; Walters, Richard F.; Penrod, Gail

    1988-01-01

    This paper describes an interface connecting InterSystems MUMPS (M/VX) to an external relational DBMS, the SYBASE Database Management System. The interface enables MUMPS to operate in a relational environment and gives the MUMPS language full access to a complete set of SQL commands. MUMPS generates SQL statements as ASCII text and sends them to the RDBMS. The RDBMS executes the statements and returns ASCII results to MUMPS. The interface suggests that the language features of MUMPS make it an attractive tool for use in the relational database environment. The approach described in this paper separates MUMPS from the relational database. Positioning the relational database outside of MUMPS promotes data sharing and permits a number of different options to be used for working with the data. Other languages like C, FORTRAN, and COBOL can access the RDBMS database. Advanced tools provided by the relational database vendor can also be used. SYBASE is an advanced high-performance transaction-oriented relational database management system for the VAX/VMS and UNIX operating systems. SYBASE is designed using a distributed open-systems architecture, and is relatively easy to interface with MUMPS.

  12. SeTES, a Self-Teaching Expert System for the analysis, design and prediction of gas production from shales and a prototype for a new generation of Expert Systems in the Earth Sciences

    NASA Astrophysics Data System (ADS)

    Kuzma, H. A.; Boyle, K.; Pullman, S.; Reagan, M. T.; Moridis, G. J.; Blasingame, T. A.; Rector, J. W.; Nikolaou, M.

    2010-12-01

    A Self Teaching Expert System (SeTES) is being developed for the analysis, design and prediction of gas production from shales. An Expert System is a computer program designed to answer questions or clarify uncertainties that its designers did not necessarily envision which would otherwise have to be addressed by consultation with one or more human experts. Modern developments in computer learning, data mining, database management, web integration and cheap computing power are bringing the promise of expert systems to fruition. SeTES is a partial successor to Prospector, a system to aid in the identification and evaluation of mineral deposits developed by Stanford University and the USGS in the late 1970s, and one of the most famous early expert systems. Instead of the text dialogue used in early systems, the web user interface of SeTES helps a non-expert user to articulate, clarify and reason about a problem by navigating through a series of interactive wizards. The wizards identify potential solutions to queries by retrieving and combining together relevant records from a database. Inferences, decisions and predictions are made from incomplete and noisy inputs using a series of probabilistic models (Bayesian Networks) which incorporate records from the database, physical laws and empirical knowledge in the form of prior probability distributions. The database is mainly populated with empirical measurements, however an automatic algorithm supplements sparse data with synthetic data obtained through physical modeling. This constitutes the mechanism for how SeTES self-teaches. SeTES’ predictive power is expected to grow as users contribute more data into the system. Samples are appropriately weighted to favor high quality empirical data over low quality or synthetic data. Finally, a set of data visualization tools digests the output measurements into graphical outputs.

  13. GIS Toolsets for Planetary Geomorphology and Landing-Site Analysis

    NASA Astrophysics Data System (ADS)

    Nass, Andrea; van Gasselt, Stephan

    2015-04-01

    Modern Geographic Information Systems (GIS) allow expert and lay users alike to load and position geographic data and perform simple to highly complex surface analyses. For many applications, dedicated and ready-to-use GIS tools are available in standard software systems, while other applications require the modular combination of available basic tools to answer more specific questions. This also applies to analyses in modern planetary geomorphology, where many such basic tools can be used to build complex analysis tools, e.g., in image and terrain-model analysis. Apart from the simple application of sets of different tools, many complex tasks require a more sophisticated design for storing and accessing data using databases (e.g., ArcHydro for hydrological data analysis). In planetary sciences, complex database-driven models are often required to efficiently analyse potential landing sites or store rover data, and geologic mapping data can also be stored and accessed more efficiently using database models rather than stand-alone shapefiles. For landing-site analyses, relief and surface roughness estimates are two concepts of particular interest, and for both a number of different definitions co-exist. We here present an advanced toolset for the analysis of image and terrain-model data with an emphasis on extracting landing-site characteristics using established criteria. We provide working examples and focus particularly on the concepts of terrain roughness as interpreted in geomorphology and engineering studies.
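
    As an example of the co-existing definitions, one common roughness measure in landing-site work is the standard deviation of elevations within a moving window of a digital terrain model. A minimal Java sketch follows; other definitions (RMS slope, detrended RMS height) plug into the same loop structure.

    // Roughness as the population standard deviation of elevations inside a
    // (2*half+1)-cell window of a DEM, centered on cell (r0, c0).
    public class Roughness {
        static double windowRoughness(double[][] dem, int r0, int c0, int half) {
            double sum = 0, sumSq = 0;
            int n = 0;
            for (int r = r0 - half; r <= r0 + half; r++)
                for (int c = c0 - half; c <= c0 + half; c++) {
                    sum += dem[r][c];
                    sumSq += dem[r][c] * dem[r][c];
                    n++;
                }
            double mean = sum / n;
            return Math.sqrt(sumSq / n - mean * mean);
        }

        public static void main(String[] args) {
            double[][] dem = {
                {101, 102, 101},
                {103, 110, 102},   // central spike raises local roughness
                {101, 102, 101},
            };
            System.out.printf("roughness = %.3f m%n", windowRoughness(dem, 1, 1, 1));
        }
    }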

  14. Data base management system for lymphatic filariasis--a neglected tropical disease.

    PubMed

    Upadhyayula, Suryanaryana Murty; Mutheneni, Srinivasa Rao; Kadiri, Madhusudhan Rao; Kumaraswamy, Sriram; Nelaturu, Sarat Chandra Babu

    2012-01-01

    Researchers working in the area of public health are confronted with large volumes of data on various aspects of entomology and epidemiology. Obtaining the relevant information from these data requires a dedicated database management system. In this paper, we describe the use of the database we developed on lymphatic filariasis. The database application was developed using the Model View Controller (MVC) architecture, with MySQL as the database and a web-based interface. We have collected and incorporated data on filariasis from the Karimnagar, Chittoor, and East and West Godavari districts of Andhra Pradesh, India. The importance of this database lies in storing the collected data, retrieving the information and producing various combined reports on filarial aspects, which in turn help public health officials understand the burden of the disease in a particular locality. This information is likely to play an imperative role in decision making for effective control of filarial disease and integrated vector management operations.

  15. [The therapeutic drug monitoring network server of tacrolimus for Chinese renal transplant patients].

    PubMed

    Deng, Chen-Hui; Zhang, Guan-Min; Bi, Shan-Shan; Zhou, Tian-Yan; Lu, Wei

    2011-07-01

    This study aimed to develop a therapeutic drug monitoring (TDM) network server of tacrolimus for Chinese renal transplant patients, which can help doctors manage patients' information and provide three levels of prediction. The database management system MySQL was employed to build and manage the database of patient and doctor information, and hypertext mark-up language (HTML) and Java Server Pages (JSP) technology were employed to construct the network server for database management. Based on the population pharmacokinetic model of tacrolimus for Chinese renal transplant patients, the above languages were used to construct the population prediction and subpopulation prediction modules. Based on the Bayesian principle and maximization of the posterior probability function, an objective function was established and minimized by an optimization algorithm to estimate a patient's individual pharmacokinetic parameters. The network server provides the basic functions of database management and three levels of prediction to help doctors optimize the regimen of tacrolimus for Chinese renal transplant patients.
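
    In general form, the Bayesian individualization step described here amounts to minimizing a penalized least-squares objective. A sketch in generic notation (the symbols are illustrative, not the paper's):

    \[
    \hat{\theta} \;=\; \arg\min_{\theta}\left[\sum_{j}\frac{\bigl(c_j - f(t_j,\theta)\bigr)^2}{\sigma_j^2} \;+\; \bigl(\theta-\mu_{\mathrm{pop}}\bigr)^{\mathsf{T}}\,\Omega^{-1}\,\bigl(\theta-\mu_{\mathrm{pop}}\bigr)\right]
    \]

    Here c_j are the patient's measured tacrolimus concentrations, f(t_j, theta) the pharmacokinetic model predictions, sigma_j^2 the residual error variance, and mu_pop and Omega the population mean and covariance of the parameters; minimizing this objective is equivalent to maximizing the posterior probability of theta given the patient's data.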

  16. 78 FR 43890 - Privacy Act of 1974; Department of Homeland Security, Federal Emergency Management Agency-006...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-22

    ... titled, ``Department of Homeland Security/Federal Emergency Management Agency--006 Citizen Corps Database...) authorities; (5) purpose; (6) routine uses of information; (7) system manager and address; (8) notification... Database'' and retitle it ``DHS/FEMA--006 Citizen Corps Program System of Records.'' FEMA administers the...

  17. NREL: U.S. Life Cycle Inventory Database - Project Management Team

    Science.gov Websites

    Information about the U.S. Life Cycle Inventory (LCI) Database project management team is listed on this page, along with additional project information about the U.S. LCI.

  18. Development of a Relational Database for Learning Management Systems

    ERIC Educational Resources Information Center

    Deperlioglu, Omer; Sarpkaya, Yilmaz; Ergun, Ertugrul

    2011-01-01

    In today's world, Web-based distance education systems have great importance. Web-based distance education systems are usually known as Learning Management Systems (LMS). In this article, a database design developed to implement a Learning Management System for an educational institution is described. In this sense, developed Learning…

  19. An Overview of the Object Protocol Model (OPM) and the OPM Data Management Tools.

    ERIC Educational Resources Information Center

    Chen, I-Min A.; Markowitz, Victor M.

    1995-01-01

    Discussion of database management tools for scientific information focuses on the Object Protocol Model (OPM) and data management tools based on OPM. Topics include the need for new constructs for modeling scientific experiments, modeling object structures and experiments in OPM, queries and updates, and developing scientific database applications…

  20. The challenge of designing a database for auditing surgical in-patients.

    PubMed

    Branday, J M; Crandon, I; Carpenter, R; Rhoden, A; Meeks-Aitken, N

    1999-12-01

    Surgical audit is imperative in modern practice, particularly in the developing world where resources are limited and efficient allocation important. The structure, process and outcome of surgical care can be determined for quality assurance or for research. Improved efficiency and reduction of morbidity and mortality are additional goals which may be accomplished. However, computerization, medical staff cooperation and the availability of dedicated staff are among the hurdles which may be encountered. We report the challenge of designing and establishing a database for auditing surgical inpatients in a developing country and the difficulties which were encountered.

  1. 48 CFR 1034.201 - Policy.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... CONTRACTING MAJOR SYSTEM ACQUISITION Earned Value Management System 1034.201 Policy. (a) (1) An Earned Value Management System (EVMS) is required for major acquisitions for development/modernization/enhancement (DME..., Earned Value Management System; and, as appropriate, 1052.234-4, Earned Value Management System Alternate...

  2. 48 CFR 1034.201 - Policy.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... CONTRACTING MAJOR SYSTEM ACQUISITION Earned Value Management System 1034.201 Policy. (a) (1) An Earned Value Management System (EVMS) is required for major acquisitions for development/modernization/enhancement (DME..., Earned Value Management System; and, as appropriate, 1052.234-4, Earned Value Management System Alternate...

  3. 48 CFR 1034.201 - Policy.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... CONTRACTING MAJOR SYSTEM ACQUISITION Earned Value Management System 1034.201 Policy. (a) (1) An Earned Value Management System (EVMS) is required for major acquisitions for development/modernization/enhancement (DME..., Earned Value Management System; and, as appropriate, 1052.234-4, Earned Value Management System Alternate...

  4. Graphical user interfaces for symbol-oriented database visualization and interaction

    NASA Astrophysics Data System (ADS)

    Brinkschulte, Uwe; Siormanolakis, Marios; Vogelsang, Holger

    1997-04-01

    In this approach, two basic services designed for the engineering of computer-based systems are combined: a symbol-oriented man-machine service and a high-speed database service. The man-machine service is used to build graphical user interfaces (GUIs) for the database service; these interfaces are stored using the database service. The idea is to create a GUI-builder and a GUI-manager for the database service, based upon the man-machine service, using the concept of symbols. With user-definable and predefined symbols, database contents can be visualized and manipulated in a very flexible and intuitive way. Using the GUI-builder and GUI-manager, users can build and operate their own graphical user interface for a given database according to their needs without writing a single line of code.

  5. From BIM to GIS at the Smithsonian Institution

    NASA Astrophysics Data System (ADS)

    Günther-Diringer, Detlef

    2018-05-01

    BIM files (Building Information Models) are a basic prerequisite in modern architecture and building management for the successful execution of construction engineering projects. The facilities department of the Smithsonian Institution maintains more than six hundred buildings. All facilities are digitally available in an ESRI ArcGIS environment, with connections to database information about individual rooms, their usage and maintenance. These data are available organization-wide through an intranet viewer, but only in a two-dimensional representation. The goal of the project carried out was to develop a workflow from the available BIM models to the given GIS structure. The test environment comprised the BIM models of the Smithsonian museum buildings along the Washington Mall. Based on new software editions of Autodesk Revit, FME and ArcGIS Pro, the workflow from BIM to the GIS data structure of the Smithsonian was successfully developed and may be applied to the setup of a future 3D intranet viewer.

  6. Natural Treatments for Fissure in Ano Used by Traditional Persian Scholars, Razi (Rhazes) and Ibn Sina (Avicenna)

    PubMed Central

    Derakhshan, Ali Reza

    2016-01-01

    Most cases of chronic fissure do not respond to medical treatment. Razi and Ibn Sina were 2 of the best-known scientists of ancient Persia. The purpose of this study was to find new scientific evidence in modern medicine about their recommendations, in order to identify clues for useful research in the future. First, the treatments of anal fissure mentioned by Razi and Ibn Sina were reviewed. Then, a literature search was made in electronic databases including PubMed, Scopus, and Google Scholar. Management of anal fissure according to Razi’s and Ibn Sina’s practices rests on 3 interventions: lifestyle modifications, drug treatments, and manual procedures. Almost all remedies suggested by Razi and Ibn Sina have shown effects on fissure in ano via several mechanisms of action in many in vitro and in vivo studies; still, there is a lack of human studies on the subject. PMID:27279645

  7. PACS-based model for telemedicine

    NASA Astrophysics Data System (ADS)

    Bastos de Figueiredo, Julio C.; Furuie, Sergio S.; Gutierrez, Marco A.; Melo, Candido P.

    2001-08-01

    In this work we present a practical model of telemedicine usage based on a HIS/PACS system that was developed and is being tested at the Heart Institute of Sao Paulo, Brazil. The objective of this project is to allow hospitals that are distant from specialized medical centers to have access to the services of other medical institutions using a low-cost telemedicine solution supported by an appropriate architecture for the storage and management of medical information. The services that can be accessed using this solution include, for example, second medical opinions, medical image databases and reports of clinical exams. With a simple architecture and easy operation, this project has proved to be an efficient way to bridge modern medical centers and others located in places without specialized medical assistance. The system described is still a prototype in an experimental phase, operating at the Heart Institute of Sao Paulo with good results, and will shortly equip other hospitals (auxiliary units of the Heart Institute).

  8. Natural Treatments for Fissure in Ano Used by Traditional Persian Scholars, Razi (Rhazes) and Ibn Sina (Avicenna).

    PubMed

    Derakhshan, Ali Reza

    2016-06-08

    Most cases of chronic fissure do not respond to medical treatment. Razi and Ibn Sina were 2 of the best-known scientists of ancient Persia. The purpose of this study was to find new scientific evidence in modern medicine about their recommendations, in order to identify clues for useful future research. First, the treatments of anal fissure mentioned by Razi and Ibn Sina were reviewed. Then, a literature search was made in electronic databases including PubMed, Scopus, and Google Scholar. Management of anal fissure according to Razi's and Ibn Sina's practices is based on 3 interventions: lifestyle modifications, drug treatments, and manual procedures. Almost all remedies suggested by Razi and Ibn Sina have shown effects on fissure in ano via several mechanisms of action in many in vitro and in vivo studies; still, there is a lack of human studies on the subject. © The Author(s) 2016.

  9. Big biomedical data as the key resource for discovery science.

    PubMed

    Toga, Arthur W; Foster, Ian; Kesselman, Carl; Madduri, Ravi; Chard, Kyle; Deutsch, Eric W; Price, Nathan D; Glusman, Gustavo; Heavner, Benjamin D; Dinov, Ivo D; Ames, Joseph; Van Horn, John; Kramer, Roger; Hood, Leroy

    2015-11-01

    Modern biomedical data collection is generating exponentially more data in a multitude of formats. This flood of complex data presents significant opportunities to discover and understand the critical interplay among such diverse domains as genomics, proteomics, metabolomics, and phenomics, including imaging, biometrics, and clinical data. The Big Data for Discovery Science Center is taking an "-ome to home" approach to discover linkages between these disparate data sources by mining existing databases of proteomic and genomic data, brain images, and clinical assessments. In support of this work, the authors developed new technological capabilities that make it easy for researchers to manage, aggregate, manipulate, integrate, and model large amounts of distributed data. Guided by biological domain expertise, the Center's computational resources and software will reveal relationships and patterns, aiding researchers in identifying biomarkers for the most confounding conditions and diseases, such as Parkinson's and Alzheimer's. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  10. THE ROLE OF FORENSIC DENTIST FOLLOWING MASS DISASTER

    PubMed Central

    Kolude, B.; Adeyemi, B.F.; Taiwo, J.O.; Sigbeku, O.F.; Eze, U.O.

    2010-01-01

    This review article focuses on mass disaster situations that may arise from natural or manmade circumstances and the significant role of forensic dental personnel in human identification following such occurrences. The various forensic dental modalities of identification were considered, including matching techniques, postmortem profiling, genetic fingerprinting, dental fossil assessment, and dental biometrics with digital subtraction. The varying extent of use of forensic dental techniques and the resulting positive impact on human identification were considered. The importance of preparation by way of special training for forensic dental personnel, mock disaster rehearsals, and the use of modern-day technology was stressed. The need for international standardization of identification through the use of Interpol Disaster Victim Identification (DVI) forms was further emphasized. Recommendations for improved human identification in the Nigerian situation include reform of the National Emergency Management Agency (NEMA), incorporation of dental care in primary health care to facilitate a proper antemortem database of the populace, and commencement of identification at the site of disaster. PMID:25161478

  11. Respiratory distress syndrome and birth order in premature twins

    PubMed Central

    Hacking, D; Watkins, A; Fraser, S; Wolfe, R; Nolan, T

    2001-01-01

    OBJECTIVE—To determine the effect of birth order on respiratory distress syndrome (RDS) in the outcome of twins in a large premature population managed in a modern neonatal intensive care unit.
METHODS—An historical cohort study design was used to analyse the neonatal outcomes of 301 premature liveborn twin sibling pairs of between 23 and 31 weeks gestation from the Australia and New Zealand Neonatal Network 1995 database.
RESULTS—Among the 56 twin sibling pairs who were discordant for RDS, the second twin was affected in 41 cases (odds ratio (OR) 2.7, 95% confidence interval (CI) 1.5 to 5.3). The excess risk of RDS in the second twin increased with gestation and was statistically significant for twins above 29 weeks gestation (OR 4.4, 95% CI 1.6 to 15).
CONCLUSIONS—There is a significant increased risk of RDS associated with being the second born of premature twins, which appears to depend on gestation.

 PMID:11207228
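    (A reading note on the odds ratio quoted above: for the 56 discordant pairs it is consistent with the standard matched-pair estimate, the ratio of pairs in which only the second twin was affected to pairs in which only the first was; that this is the estimator the authors used is an assumption.)

$$\widehat{\mathrm{OR}} = \frac{41}{56 - 41} = \frac{41}{15} \approx 2.73$$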

  12. Computational methods in the pricing and risk management of modern financial derivatives

    NASA Astrophysics Data System (ADS)

    Deutsch, Hans-Peter

    1999-09-01

    In the last 20 years modern finance has developed into a complex, mathematically challenging field. Very complicated risks exist in financial markets which need very advanced methods to measure and/or model them. The financial instruments invented by the market participants to trade these risks, the so-called derivatives, are usually even more complicated than the risks themselves and sometimes generate new risks. Topics like random walks, stochastic differential equations, martingale measures, time series analysis, implied correlations, etc. are of common use in the field. This is why more and more people with a science background, such as physicists, mathematicians, or computer scientists, are entering the field of finance. The measurement and management of all these risks is the key to the continuing success of banks. This talk gives insight into today's common methods of modern market risk management such as variance-covariance, historical simulation, Monte Carlo, “Greek” ratios, etc., including the statistical concepts on which they are based. Derivatives are at the same time the main reason for and the most effective means of conducting risk management. As such, they stand at the beginning and end of risk management. The valuation of derivatives and structured financial instruments is therefore the prerequisite, the condition sine qua non, for all risk management. This talk introduces some of the important valuation methods used in modern derivatives pricing such as present value, Black-Scholes, binomial trees, Monte Carlo, etc. In summary this talk highlights an area outside physics where there is a lot of interesting work to do, especially for physicists. Or as one of our consultants said: The fascinating thing about this job is that Arthur Andersen hired me not ALTHOUGH I am a physicist but BECAUSE I am a physicist.
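    To make two of the pricing methods named above concrete, here is a minimal sketch (not from the talk; parameter values are illustrative assumptions) that prices a European call both with the Black-Scholes closed form and with a Monte Carlo simulation of geometric Brownian motion, so the two numbers can be compared:

```python
import math
import random

# Hedged illustration: a Monte Carlo pricer for a European call under
# geometric Brownian motion, checked against the Black-Scholes closed
# form. Parameter values are illustrative, not from the talk.

def black_scholes_call(S, K, r, sigma, T):
    """Black-Scholes price of a European call."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # standard normal CDF
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

def monte_carlo_call(S, K, r, sigma, T, n_paths=100_000, seed=42):
    """Monte Carlo estimate: simulate terminal prices, discount the mean payoff."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        ST = S * math.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * z)
        total += max(ST - K, 0.0)
    return math.exp(-r * T) * total / n_paths

if __name__ == "__main__":
    args = dict(S=100.0, K=105.0, r=0.05, sigma=0.2, T=1.0)
    print("Black-Scholes:", round(black_scholes_call(**args), 4))
    print("Monte Carlo:  ", round(monte_carlo_call(**args), 4))
```

    The two prices agree to within the Monte Carlo sampling error, which is the usual sanity check before moving on to payoffs that have no closed form.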

  13. Do Apprentices' Communities of Practice Block Unwelcome Knowledge?

    ERIC Educational Resources Information Center

    Sligo, Frank; Tilley, Elspeth; Murray, Niki

    2011-01-01

    Purpose: This study aims to examine how well print-literacy support being provided to New Zealand Modern Apprentices (MAs) is supporting their study and practical work. Design/methodology/approach: The authors undertook a qualitative analysis of a database of 191 MAs in the literacy programme, then in 14 case studies completed 46 interviews with…

  14. Learners' Reflections in Technological Learning Environments: Why To Promote and How To Evaluate.

    ERIC Educational Resources Information Center

    Rimor, Rikki; Kozminsky, Ely

    In this study, 24 9th-grade students investigated several issues related to modern Israeli society. In their investigation, students were engaged in activities such as data search, data sorting, making inquiries, project writing, and construction of a new computerized database related to the subjects of their investigations. Students were…

  15. The Design and Realization of Net Testing System on Campus Network

    ERIC Educational Resources Information Center

    Ren, Zhanying; Liu, Shijie

    2005-01-01

    According to the requirements of modern teaching theory and technology, and based on software engineering, database theory, network information security techniques, and system integration, a network testing system on a local network was designed and realized. The system supports the separation of testing and teaching and addresses the problems of random…

  16. Educational System Efficiency Improvement Using Knowledge Discovery in Databases

    ERIC Educational Resources Information Center

    Lukaš, Mirko; Leškovic, Darko

    2007-01-01

    This study describes one possible way of using ICT in an education system. We treated the educational system like a business company and developed an appropriate model for clustering the student population. Modern educational systems are forced to extract the most necessary and purposeful information from a large amount of available data. Clustering…

  17. Modern representation of databases on the example of the Catalog of Solar Proton Events in the 23rd Cycle of Solar Activity

    NASA Astrophysics Data System (ADS)

    Ishkov, V. N.; Zabarinskaya, L. P.; Sergeeva, N. A.

    2017-11-01

    The development of studies of solar sources and their effects on the state of near-Earth space required the systematization of the corresponding information in the form of databases and catalogs covering the entire period of observation of any geoeffective phenomenon, including, where possible at the time of creation, all the characteristics of the phenomena themselves and of their sources on the Sun. A uniform presentation of information in the form of a series of similar catalogs covering long time intervals is of particular importance. The large amount of information collected in such catalogs makes it necessary to use modern methods of organization and presentation that allow transitions between individual parts of a catalog and a quick search for events and their characteristics. This is implemented in the presented Catalog of Solar Proton Events in the 23rd Cycle of Solar Activity, part of a sequence of six catalogs covering the period from 1970 to 2009 (solar cycles 20-23).

  18. Creating Your Own Database.

    ERIC Educational Resources Information Center

    Blair, John C., Jr.

    1982-01-01

    Outlines the important factors to be considered in selecting a database management system for use with a microcomputer and presents a series of guidelines for developing a database. General procedures, report generation, data manipulation, information storage, word processing, data entry, database indexes, and relational databases are among the…

  19. A Summary of the Naval Postgraduate School Research Program

    DTIC Science & Technology

    1989-08-30

    Topics include: Fundamental Theory for Automatically Combining Changes to Software Systems; Database-System Approach to Software Engineering Environments (SEEs); Multilevel Database Security; Temporal Database Management and Real-Time Database Computers; The Multi-lingual, Multi-Model, Multi-Backend Database

  20. Difficulties and challenges associated with literature searches in operating room management, complete with recommendations.

    PubMed

    Wachtel, Ruth E; Dexter, Franklin

    2013-12-01

    The purpose of this article is to teach operating room managers, financial analysts, and those with a limited knowledge of search engines, including PubMed, how to locate articles they need in the areas of operating room and anesthesia group management. Many physicians are unaware of current literature in their field and evidence-based practices. The most common source of information is colleagues. Many people making management decisions do not read published scientific articles. Databases such as PubMed are available to search for such articles. Other databases, such as citation indices and Google Scholar, can be used to uncover additional articles. Nevertheless, most people who do not know how to use these databases are reluctant to utilize help resources when they do not know how to accomplish a task. Most people are especially reluctant to use on-line help files. Help files and search databases are often difficult to use because they have been designed for users already familiar with the field. The help files and databases have specialized vocabularies unique to the application. MeSH terms in PubMed are not useful alternatives for operating room management, an important limitation, because MeSH is the default when search terms are entered in PubMed. Librarians or those trained in informatics can be valuable assets for searching unusual databases, but they must possess the domain knowledge relative to the subject they are searching. The search methods we review are especially important when the subject area (e.g., anesthesia group management) is so specific that only 1 or 2 articles address the topic of interest. The materials are presented broadly enough that the reader can extrapolate the findings to other areas of clinical and management issues in anesthesiology.
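    As a practical complement to the search advice above, the sketch below queries PubMed programmatically through the NCBI E-utilities esearch endpoint with a free-text phrase rather than MeSH terms, in line with the article's caution that MeSH coverage is weak for operating room management; the search phrase itself is an illustrative assumption.

```python
import json
import urllib.parse
import urllib.request

# Hedged sketch: querying PubMed through the NCBI E-utilities esearch
# endpoint with a free-text phrase instead of MeSH terms. The search
# phrase here is illustrative.
BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_search(term, retmax=20):
    """Return a list of PubMed IDs matching a free-text search term."""
    params = urllib.parse.urlencode({
        "db": "pubmed",
        "term": term,
        "retmax": retmax,
        "retmode": "json",
    })
    with urllib.request.urlopen(f"{BASE}?{params}") as resp:
        data = json.load(resp)
    return data["esearchresult"]["idlist"]

if __name__ == "__main__":
    ids = pubmed_search('"operating room management" AND staffing')
    print(ids)
```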

  1. The Perfect Marriage: Integrated Word Processing and Data Base Management Programs.

    ERIC Educational Resources Information Center

    Pogrow, Stanley

    1983-01-01

    Discussion of database integration and how it operates includes recommendations on compatible brand name word processing and database management programs, and a checklist for evaluating essential and desirable features of the available programs. (MBR)

  2. Liz Torres | NREL

    Science.gov Websites

    Areas of Expertise: Customer service; Technically savvy; Event planning; Word processing/desktop publishing; Database management. Research Interests: Website design; Database design; Computational science. Technology Consulting, Westminster, CO (2007-2012); Administrative Assistant, Source One Management, Denver, CO (2005

  3. Teaching Case: Adapting the Access Northwind Database to Support a Database Course

    ERIC Educational Resources Information Center

    Dyer, John N.; Rogers, Camille

    2015-01-01

    A common problem encountered when teaching database courses is that few large illustrative databases exist to support teaching and learning. Most database textbooks have small "toy" databases that are chapter objective specific, and thus do not support application over the complete domain of design, implementation and management concepts…

  4. 76 FR 41792 - Information Collection Being Submitted for Review and Approval to the Office of Management and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-15

    ... administrator from the private sector to create and operate TV band databases. The TV band database... database administrator will be responsible for operation of their database and coordination of the overall functioning of the database with other administrators, and will provide database access to TVBDs. The...

  5. Small Business Innovations (Integrated Database)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Because of the diversity of NASA's information systems, it was necessary to develop DAVID as a central database management system. Under a Small Business Innovation Research (SBIR) grant, Ken Wanderman and Associates, Inc. designed software tools enabling scientists to interface with DAVID and commercial database management systems, as well as artificial intelligence programs. The software has been installed at a number of data centers and is commercially available.

  6. Are Bibliographic Management Software Search Interfaces Reliable?: A Comparison between Search Results Obtained Using Database Interfaces and the EndNote Online Search Function

    ERIC Educational Resources Information Center

    Fitzgibbons, Megan; Meert, Deborah

    2010-01-01

    The use of bibliographic management software and its internal search interfaces is now pervasive among researchers. This study compares the results between searches conducted in academic databases' search interfaces versus the EndNote search interface. The results show mixed search reliability, depending on the database and type of search…

  7. Operational Monitoring of GOME-2 and IASI Level 1 Product Processing at EUMETSAT

    NASA Astrophysics Data System (ADS)

    Livschitz, Yakov; Munro, Rosemary; Lang, Rüdiger; Fiedler, Lars; Dyer, Richard; Eisinger, Michael

    2010-05-01

    The growing complexity of operational level 1 radiance products from Low Earth Orbiting (LEO) platforms like EUMETSAT's Metop series makes near-real-time monitoring of product quality a challenging task. The main challenge is to provide a monitoring system flexible and robust enough to identify and react to anomalies that may be previously unknown to the system, as well as to provide all means and parameters necessary to support efficient ad-hoc analysis of an incident. The operational monitoring system developed at EUMETSAT for GOME-2 and IASI level 1 data supports near-real-time monitoring of operational products and instrument health in a robust and flexible fashion. For effective information management, the system is based on a relational database (Oracle). An Extract, Transform, Load (ETL) process transforms products in EUMETSAT Polar System (EPS) format into relational data structures. The identification of commonalities between products and instruments allows the database structure to be designed in such a way that different data can be analyzed using the same business intelligence functionality. Interactive analysis software implementing modern data mining techniques is also provided for a detailed look into the data. The system is effectively used for day-to-day monitoring, long-term reporting, and instrument degradation analysis, as well as for ad-hoc queries in case of unexpected instrument or processing behaviour. Having data from different sources on a single instrument, and even from different instruments, platforms, or numerical weather prediction, within the same database allows effective cross-comparison and searching for correlated parameters. Automatic alarms, raised by checking for deviations of certain parameters, data losses, and other events, significantly reduce the time necessary to monitor the processing on a day-to-day basis.
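    A minimal sketch of the Extract, Transform, Load pattern described above, with SQLite standing in for the Oracle database and an invented line-based record format in place of the EPS format (both assumptions for illustration):

```python
import sqlite3

# Hedged sketch of the ETL idea: parse product records and load them into
# one relational table so different instruments can be analyzed with the
# same queries. SQLite stands in for Oracle; the record format is invented.

def extract(lines):
    """Extract: one monitoring record per line (instrument, parameter, value)."""
    for line in lines:
        instrument, parameter, value = line.strip().split(",")
        yield instrument, parameter, float(value)

def load(conn, records):
    """Transform + Load: normalize names and insert into a common table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS measurements ("
        "instrument TEXT, parameter TEXT, value REAL)"
    )
    conn.executemany(
        "INSERT INTO measurements VALUES (?, ?, ?)",
        ((i.upper(), p.lower(), v) for i, p, v in records),
    )

conn = sqlite3.connect(":memory:")
load(conn, extract(["gome-2,radiance,1.02", "iasi,radiance,0.98"]))
# The same query now works across instruments - the commonality the text describes.
for row in conn.execute(
    "SELECT instrument, AVG(value) FROM measurements GROUP BY instrument"
):
    print(row)
```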

  8. Operational Monitoring of GOME-2 and IASI Level 1 Product Processing at EUMETSAT

    NASA Astrophysics Data System (ADS)

    Livschitz, Y.; Munro, R.; Lang, R.; Fiedler, L.; Dyer, R.; Eisinger, M.

    2009-12-01

    The growing complexity of operational level 1 radiance products from Low Earth Orbiting (LEO) platforms like EUMETSAT's Metop series makes near-real-time monitoring of product quality a challenging task. The main challenge is to provide a monitoring system flexible and robust enough to identify and react to anomalies that may be previously unknown to the system, as well as to provide all means and parameters necessary to support efficient ad-hoc analysis of an incident. The operational monitoring system developed at EUMETSAT for GOME-2 and IASI level 1 data supports near-real-time monitoring of operational products and instrument health in a robust and flexible fashion. For effective information management, the system is based on a relational database (Oracle). An Extract, Transform, Load (ETL) process transforms products in EUMETSAT Polar System (EPS) format into relational data structures. The identification of commonalities between products and instruments allows the database structure to be designed in such a way that different data can be analyzed using the same business intelligence functionality. Interactive analysis software implementing modern data mining techniques is also provided for a detailed look into the data. The system is effectively used for day-to-day monitoring, long-term reporting, and instrument degradation analysis, as well as for ad-hoc queries in case of unexpected instrument or processing behaviour. Having data from different sources on a single instrument, and even from different instruments, platforms, or numerical weather prediction, within the same database allows effective cross-comparison and searching for correlated parameters. Automatic alarms, raised by checking for deviations of certain parameters, data losses, and other events, significantly reduce the time necessary to monitor the processing on a day-to-day basis.

  9. Scandinavian systems monitoring the oral health in children and adolescents; an evaluation of their quality and utility in the light of modern perspectives of caries management

    PubMed Central

    2014-01-01

    Background Recording reliable oral health data is a challenge. The aims were a) to outline different Scandinavian systems of oral health monitoring and b) to evaluate the quality and utility of the collected data in the light of modern concepts of disease management and to suggest improvements. Material and methods The information for this study related to (a) children and adolescents, (b) oral health data, and (c) routines for monitoring such data. This meant information available on the official web sites of the “KOSTRA-data” (Municipality-State-Report) in Norway, the Swedish National Board of Health and Welfare (“Socialstyrelsen”), and the Oral Health Register (the SCOR system, National Board of Health) in Denmark. Results A potential for increasing the reliability and validity of the data existed. Routines for monitoring oral diseases other than caries were limited. Compared with the other Scandinavian countries, the data collection system in Denmark appeared more functional and had adopted more modern concepts of disease management than the other systems. In the light of modern concepts of caries management, data collected elsewhere had limited utility. Conclusions The Scandinavian systems of health reporting had much in common, but some essential differences existed. If the quality of epidemiological data were enhanced, it would be possible to use the data for planning oral health care. Routines and procedures should be improved and updated in accordance with modern ideas about caries prevention and therapy. For appropriate oral health planning in an organised dental service, reporting of enamel caries is essential. PMID:24885243

  10. Technologies and standards in the information systems of the soil-geographic database of Russia

    NASA Astrophysics Data System (ADS)

    Golozubov, O. M.; Rozhkov, V. A.; Alyabina, I. O.; Ivanov, A. V.; Kolesnikova, V. M.; Shoba, S. A.

    2015-01-01

    The achievements, problems, and challenges of the modern stage of the development of the Soil-Geographic Database of Russia (SGDBR) and the history of this project are outlined. The structure of the information system of the SGDBR as an internet-based resource to collect data on soil profiles and to integrate the geographic and attribute databases on the same platform is described. The pilot project in Rostov oblast illustrates the inclusion of regional information in the SGDBR and its application for solving practical problems. For the first time in Russia, the GeoRSS standard based on the structured hypertext representation of the geographic and attribute information has been applied in the state system for the agromonitoring of agricultural lands in Rostov oblast and information exchange through the internet.

  11. Mathematical modeling of project management in logistics systems based on two-dimensional random vector

    NASA Astrophysics Data System (ADS)

    Glushkova, Yu O.; Gordashnukova, O. Yu; Pahomova, A. V.; Shatohina, S. P.; Filippov, D. V.

    2018-05-01

    The modern markets are characterized by fierce competition, constantly changing demand, increasing consumer expectations, and the shortening of the life cycle of goods and services in connection with scientific and technological progress. Therefore, to survive, the modern logistic systems of industrial enterprises must be constantly improved. The modern economic literature contains a large volume of publications on various aspects of the studied issues. They consider the issues of project management in logistics systems, which inevitably encounter the triple constraint: the balance between project scope, cost, and time. It was later suggested either to replace scope with quality or to add a fourth criterion, so the limitation may be called a triple or four-criteria constraint.

  12. VIEWCACHE: An incremental pointer-based access method for autonomous interoperable databases

    NASA Technical Reports Server (NTRS)

    Roussopoulos, N.; Sellis, Timos

    1993-01-01

    One of the biggest problems facing NASA today is to provide scientists efficient access to a large number of distributed databases. Our pointer-based incremental database access method, VIEWCACHE, provides such an interface for accessing distributed datasets and directories. VIEWCACHE allows database browsing and searching, performing inter-database cross-referencing with no actual data movement between database sites. This organization and processing is especially suitable for managing astrophysics databases which are physically distributed all over the world. Once the search is complete, the set of collected pointers pointing to the desired data is cached. VIEWCACHE includes spatial access methods for accessing image datasets, which provide much easier query formulation by referring directly to the image and very efficient search for objects contained within a two-dimensional window. We will develop and optimize a VIEWCACHE External Gateway Access to database management systems to facilitate database search.
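    A toy sketch of the pointer-caching idea (illustrative structures only, not the VIEWCACHE implementation): a cross-database search collects (site, record-id) pointers without moving data, and the cached pointers are dereferenced only when the data is actually needed.

```python
# Hedged sketch: two "distributed" datasets, keyed by record id. A search
# caches pointers instead of copying rows between sites.
SITE_A = {1: {"object": "M31", "type": "galaxy"}, 2: {"object": "M42", "type": "nebula"}}
SITE_B = {7: {"object": "M31", "band": "X-ray"}}

DATABASES = {"siteA": SITE_A, "siteB": SITE_B}

def search(predicate):
    """Cross-reference all sites; collect (site, id) pointers, move no data."""
    return [
        (site, rec_id)
        for site, table in DATABASES.items()
        for rec_id, row in table.items()
        if predicate(row)
    ]

def fetch(pointer):
    """Dereference a cached pointer only when the data is actually needed."""
    site, rec_id = pointer
    return DATABASES[site][rec_id]

view_cache = search(lambda row: row.get("object") == "M31")
print(view_cache)                      # [('siteA', 1), ('siteB', 7)] - pointers only
print([fetch(p) for p in view_cache])  # data moved only on demand
```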

  13. District of Columbia: D.C. Public Schools' Modernization Program Faces Major Challenges. Testimony before the Subcommittee on the District of Columbia, Committee on Appropriations, House of Representatives.

    ERIC Educational Resources Information Center

    Cooper, David E.

    This Congressional testimony focuses on the challenges faced by the District of Columbia in modernizing its public schools. Specifically, it addresses: (1) increases in the cost of modernizing the schools; (2) delays in completing the schools; (3) quality inspection problems; and (4) concerns about managing asbestos hazards. The testimony…

  14. Effects of Budget Reductions on Army Acquisition Support of Equipping and Modernization Goals

    DTIC Science & Technology

    2015-04-16

    overall Army budgets are significantly reduced (34% since 2008), maintaining the entire equipment portfolio reduces the funding available to meet...the Mission Command portfolio, examine their impact on equipping and modernization, and make recommendations on how to divest the equipment no longer... portfolio of equipment being managed and the link to the new Defense guidance and Army equipping guidance and modernization plans. Any systems or programs

  15. Evidence generation from healthcare databases: recommendations for managing change.

    PubMed

    Bourke, Alison; Bate, Andrew; Sauer, Brian C; Brown, Jeffrey S; Hall, Gillian C

    2016-07-01

    There is an increasing reliance on databases of healthcare records for pharmacoepidemiology and other medical research, and such resources are often accessed over a long period of time so it is vital to consider the impact of changes in data, access methodology and the environment. The authors discuss change in communication and management, and provide a checklist of issues to consider for both database providers and users. The scope of the paper is database research, and changes are considered in relation to the three main components of database research: the data content itself, how it is accessed, and the support and tools needed to use the database. Copyright © 2016 John Wiley & Sons, Ltd.

  16. Flight Deck Interval Management Display. [Elements, Information and Annunciations Database User Guide

    NASA Technical Reports Server (NTRS)

    Lancaster, Jeff; Dillard, Michael; Alves, Erin; Olofinboba, Olu

    2014-01-01

    The User Guide details the Access Database provided with the Flight Deck Interval Management (FIM) Display Elements, Information, & Annunciations program. The goal of this User Guide is to support ease of use and the ability to quickly retrieve and select items of interest from the Database. The Database includes FIM Concepts identified in a literature review preceding the publication of this document. Only items that are directly related to FIM (e.g., spacing indicators), which change or enable FIM (e.g., menu with control buttons), or which are affected by FIM (e.g., altitude reading) are included in the database. The guide has been expanded from previous versions to cover database structure, content, and search features with voiced explanations.

  17. The Future of Asset Management for Human Space Exploration: Supply Classification and an Integrated Database

    NASA Technical Reports Server (NTRS)

    Shull, Sarah A.; Gralla, Erica L.; deWeck, Olivier L.; Shishko, Robert

    2006-01-01

    One of the major logistical challenges in human space exploration is asset management. This paper presents observations on the practice of asset management in support of human space flight to date and discusses a functional-based supply classification and a framework for an integrated database that could be used to improve asset management and logistics for human missions to the Moon, Mars and beyond.

  18. A federated information management system for the Deep Space Network. M.S. Thesis - Univ. of Southern California

    NASA Technical Reports Server (NTRS)

    Dobinson, E.

    1982-01-01

    General requirements for an information management system for the deep space network (DSN) are examined. A concise review of available database management system technology is presented. It is recommended that a federation of logically decentralized databases be implemented for the Network Information Management System of the DSN. Overall characteristics of the federation are specified, as well as reasons for adopting this approach.

  19. Applications of Database Machines in Library Systems.

    ERIC Educational Resources Information Center

    Salmon, Stephen R.

    1984-01-01

    Characteristics and advantages of database machines are summarized and their applications to library functions are described. The ability to attach multiple hosts to the same database and the flexibility to choose operating and database management systems for different functions without losing access to a common database are noted. (EJS)

  20. Conflation and integration of archived geologic maps and associated uncertainties

    USGS Publications Warehouse

    Shoberg, Thomas G.

    2016-01-01

    Old, archived geologic maps are often available with little or no associated metadata. This creates special problems in terms of extracting their data to use with a modern database. This research focuses on some problems and uncertainties associated with conflating older geologic maps in regions where modern geologic maps are, as yet, non-existent as well as vertically integrating the conflated maps with layers of modern GIS data (in this case, The National Map of the U.S. Geological Survey). Ste. Genevieve County, Missouri was chosen as the test area. It is covered by six archived geologic maps constructed in the years between 1928 and 1994. Conflating these maps results in a map that is internally consistent with these six maps, is digitally integrated with hydrography, elevation and orthoimagery data, and has a 95% confidence interval useful for further data set integration.

  1. Should business management training be part of medical education?

    PubMed

    Hornick, P; Hornick, C; Taylor, K; Ratnatunga, C

    1997-09-01

    The introduction of modern business management practices in the National Health Service has exposed different levels of management training and expertise between clinicians and hospital managers. This is undesirable and frequently unproductive. This article puts the case for the inclusion of business management training within undergraduate and postgraduate curricula.

  2. Distribution Management System Volt/VAR Evaluation | Grid Modernization |

    Science.gov Websites

    This project involves building a prototype distribution management system testbed that links a GE Grid Solutions distribution management system to power hardware-in-the-loop testing. This setup is

  3. Approaches of Improving University Assets Management Efficiency

    ERIC Educational Resources Information Center

    Wang, Jingliang

    2015-01-01

    University assets management, as an important content of modern university management, is generally confronted with the issue of low efficiency. Currently, to address the problems exposed in university assets management and take appropriate modification measures is an urgent issue in front of Chinese university assets management sectors. In this…

  4. Heterogeneous distributed query processing: The DAVID system

    NASA Technical Reports Server (NTRS)

    Jacobs, Barry E.

    1985-01-01

    The objective of the Distributed Access View Integrated Database (DAVID) project is the development of an easy to use computer system with which NASA scientists, engineers and administrators can uniformly access distributed heterogeneous databases. Basically, DAVID will be a database management system that sits alongside already existing database and file management systems. Its function is to enable users to access the data in other languages and file systems without having to learn the data manipulation languages. Given here is an outline of a talk on the DAVID project and several charts.

  5. Development of geotechnical analysis and design modules for the Virginia Department of Transportation's geotechnical database.

    DOT National Transportation Integrated Search

    2005-01-01

    In 2003, an Internet-based Geotechnical Database Management System (GDBMS) was developed for the Virginia Department of Transportation (VDOT) using distributed Geographic Information System (GIS) methodology for data management, archival, retrieval, ...

  6. Improving Recall Using Database Management Systems: A Learning Strategy.

    ERIC Educational Resources Information Center

    Jonassen, David H.

    1986-01-01

    Describes the use of microcomputer database management systems to facilitate the instructional uses of learning strategies relating to information processing skills, especially recall. Two learning strategies, cross-classification matrixing and node acquisition and integration, are highlighted. (Author/LRW)

  7. Using Online Databases in Corporate Issues Management.

    ERIC Educational Resources Information Center

    Thomsen, Steven R.

    1995-01-01

    Finds that corporate public relations practitioners felt they were able, using online database and information services, to intercept issues earlier in the "issue cycle" and thus enable their organizations to develop more "proactionary" or "catalytic" issues management response strategies. (SR)

  8. Theoretical bases of project management in conditions of innovative economy based on fuzzy modeling

    NASA Astrophysics Data System (ADS)

    Beilin, I. L.; Khomenko, V. V.

    2018-05-01

    In recent years, more and more Russian enterprises (both private and public) are trying to organize their activities on the basis of modern scientific research in order to improve the management of economic processes. Business planning, financial and investment analysis, modern software products based on the latest scientific developments are introduced everywhere. At the same time, there is a growing demand for market research (both at the microeconomic and macroeconomic levels), for financial and general economic information.

  9. DOD BUSINESS SYSTEMS MODERNIZATION: Key Marine Corps System Acquisition Needs to be Better Justified, Defined, and Managed

    DTIC Science & Technology

    2008-07-01

    recommendations. To view the full product, including the scope and methodology, click on GAO-08... Business Systems Modernization, July 28, 2008. The Honorable Daniel K. Akaka, Chairman; The Honorable John Thune, Ranking Member, Subcommittee on Readiness and Management Support, Committee on Armed Services, United States Senate; The Honorable John Ensign, United States Senate. For decades, the Department of Defense

  10. An incremental database access method for autonomous interoperable databases

    NASA Technical Reports Server (NTRS)

    Roussopoulos, Nicholas; Sellis, Timos

    1994-01-01

    We investigated a number of design and performance issues of interoperable database management systems (DBMSs). The major results of our investigation were obtained in the areas of client-server database architectures for heterogeneous DBMSs, incremental computation models, buffer management techniques, and query optimization. We finished a prototype of an advanced client-server workstation-based DBMS which allows access to multiple heterogeneous commercial DBMSs. Experiments and simulations were then run to compare its performance with the standard client-server architectures. The focus of this research was on adaptive optimization methods for heterogeneous database systems. Adaptive buffer management accounts for the random and object-oriented access methods for which no known characterization of the access patterns exists. Adaptive query optimization means that value distributions and selectivities, which play the most significant role in query plan evaluation, are continuously refined to reflect the actual values as opposed to static ones computed off-line. Query feedback is a concept that was first introduced to the literature by our group. We employed query feedback both for adaptive buffer management and for computing value distributions and selectivities. For adaptive buffer management, we use the page faults of prior executions to achieve more 'informed' management decisions. For the estimation of the distributions and the selectivities, we use curve-fitting techniques, such as least squares and splines, for regressing on these values.
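    A small sketch of the query-feedback idea for selectivity estimation (an illustration of the concept, not the paper's algorithm): observed result sizes from past executions are fit by least squares, so later estimates reflect actual values rather than static statistics.

```python
# Hedged sketch: model the selectivity of a range predicate `col < x` as
# a line s(x) = a*x + b, refit by least squares from query feedback.
class FeedbackSelectivity:
    def __init__(self):
        self.samples = []  # (x, observed selectivity) pairs from past queries

    def record(self, x, rows_returned, table_rows):
        self.samples.append((x, rows_returned / table_rows))

    def estimate(self, x):
        if len(self.samples) < 2:
            return 0.5  # uninformed default before any feedback
        n = len(self.samples)
        sx = sum(p[0] for p in self.samples)
        sy = sum(p[1] for p in self.samples)
        sxx = sum(p[0] ** 2 for p in self.samples)
        sxy = sum(p[0] * p[1] for p in self.samples)
        a = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # least-squares slope
        b = (sy - a * sx) / n                           # least-squares intercept
        return min(1.0, max(0.0, a * x + b))            # clamp to a valid fraction

model = FeedbackSelectivity()
model.record(x=10, rows_returned=100, table_rows=1000)  # observed: 10% selective
model.record(x=50, rows_returned=550, table_rows=1000)  # observed: 55% selective
print(model.estimate(30))  # interpolated estimate, about 0.325
```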

  11. Field surgery on a future conventional battlefield: strategy and wound management.

    PubMed Central

    Ryan, J. M.; Cooper, G. J.; Haywood, I. R.; Milner, S. M.

    1991-01-01

    Most papers appearing in the surgical literature dealing with wound ballistics concern themselves with wound management in the civilian setting. The pathophysiology of modern war wounds is contrasted with ballistic wounds commonly encountered in peacetime, but it should be noted that even in peacetime the modern terrorist may have access to sophisticated military weaponry, and that patients injured by them may fall within the catchment area of any civilian hospital. Management problems associated with both wound types are highlighted; areas of controversy are discussed. The orthodox military surgical approach to ballistic wounds is expounded and defended. PMID:1996857

  12. PACSY, a relational database management system for protein structure and chemical shift analysis.

    PubMed

    Lee, Woonghee; Yu, Wookyung; Kim, Suhkmann; Chang, Iksoo; Lee, Weontae; Markley, John L

    2012-10-01

    PACSY (Protein structure And Chemical Shift NMR spectroscopY) is a relational database management system that integrates information from the Protein Data Bank, the Biological Magnetic Resonance Data Bank, and the Structural Classification of Proteins database. PACSY provides three-dimensional coordinates and chemical shifts of atoms along with derived information such as torsion angles, solvent accessible surface areas, and hydrophobicity scales. PACSY consists of six relational table types linked to one another for coherence by key identification numbers. Database queries are enabled by advanced search functions supported by an RDBMS server such as MySQL or PostgreSQL. PACSY enables users to search for combinations of information from different database sources in support of their research. Two software packages, PACSY Maker for database creation and PACSY Analyzer for database analysis, are available from http://pacsy.nmrfam.wisc.edu.
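    A minimal sketch of the linked-table design described above, with simplified, assumed table and column names (not the actual PACSY schema), showing how key identification numbers let one query combine coordinates and chemical shifts:

```python
import sqlite3

# Hedged sketch: several relational tables linked by key identification
# numbers, so one JOIN can combine structural and chemical-shift data.
# Table and column names are simplified assumptions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE protein (
    protein_id INTEGER PRIMARY KEY,
    pdb_code   TEXT
);
CREATE TABLE coordinate (
    protein_id INTEGER REFERENCES protein(protein_id),
    atom_name  TEXT,
    x REAL, y REAL, z REAL
);
CREATE TABLE chemical_shift (
    protein_id INTEGER REFERENCES protein(protein_id),
    atom_name  TEXT,
    shift_ppm  REAL
);
""")
conn.execute("INSERT INTO protein VALUES (1, '1UBQ')")
conn.execute("INSERT INTO coordinate VALUES (1, 'CA', 26.3, 24.7, 2.4)")
conn.execute("INSERT INTO chemical_shift VALUES (1, 'CA', 54.5)")

# Combine information from different sources through the shared key.
query = """
SELECT p.pdb_code, c.atom_name, c.x, c.y, c.z, s.shift_ppm
FROM protein p
JOIN coordinate c     ON c.protein_id = p.protein_id
JOIN chemical_shift s ON s.protein_id = p.protein_id
                      AND s.atom_name = c.atom_name
"""
for row in conn.execute(query):
    print(row)
```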

  13. The INFN-CNAF Tier-1 GEMSS Mass Storage System and database facility activity

    NASA Astrophysics Data System (ADS)

    Ricci, Pier Paolo; Cavalli, Alessandro; Dell'Agnello, Luca; Favaro, Matteo; Gregori, Daniele; Prosperini, Andrea; Pezzi, Michele; Sapunenko, Vladimir; Zizzi, Giovanni; Vagnoni, Vincenzo

    2015-05-01

    The consolidation of Mass Storage services at the INFN-CNAF Tier1 Storage department during the last 5 years has resulted in a reliable, high-performance and moderately easy-to-manage facility that provides data access, archive, backup and database services to several different use cases. At present, the GEMSS Mass Storage System, developed and installed at CNAF and based upon an integration between the IBM GPFS parallel filesystem and the Tivoli Storage Manager (TSM) tape management software, is one of the largest hierarchical storage sites in Europe. It provides storage resources for about 12% of LHC data, as well as for data of other non-LHC experiments. Files are accessed using standard SRM Grid services provided by the Storage Resource Manager (StoRM), also developed at CNAF. Data access is also provided by XRootD and HTTP/WebDAV endpoints. Besides these services, an Oracle database facility is in production, characterized by an effective level of parallelism, redundancy and availability. This facility runs databases for storing and accessing relational data objects and for providing database services to the currently active use cases. It takes advantage of several Oracle technologies, like Real Application Cluster (RAC), Automatic Storage Manager (ASM) and Enterprise Manager centralized management tools, together with other technologies for performance optimization, ease of management and downtime reduction. The aim of the present paper is to illustrate the state-of-the-art of the INFN-CNAF Tier1 Storage department infrastructures and software services, and to give a brief outlook on forthcoming projects. A description of the administrative, monitoring and problem-tracking tools that play a primary role in managing the whole storage framework is also given.

  14. Reflective Database Access Control

    ERIC Educational Resources Information Center

    Olson, Lars E.

    2009-01-01

    "Reflective Database Access Control" (RDBAC) is a model in which a database privilege is expressed as a database query itself, rather than as a static privilege contained in an access control list. RDBAC aids the management of database access controls by improving the expressiveness of policies. However, such policies introduce new interactions…

  15. Modernization of the International Volcanic Ash Website - a global resource for ashfall preparedness and impact guidance.

    NASA Astrophysics Data System (ADS)

    Wallace, K.; Leonard, G.; Stewart, C.; Wilson, T. M.; Randall, M.; Stovall, W. K.

    2015-12-01

    The internationally collaborative volcanic ash website (http://volcanoes.usgs.gov/ash/) has been an important global information resource for ashfall preparedness and impact guidance since 2004. Recent volcanic ashfalls with significant local, regional, and global impacts highlighted the need to improve the website to make it more accessible and pertinent to users worldwide. Recently, the Volcanic Ash Impacts Working Group (Cities and Volcanoes Commission of IAVCEI) redesigned and modernized the website. Improvements include 1) a database-driven back end, 2) reorganized menu navigation, 3) language translation, 4) increased downloadable content, 5) addition of ash-impact case studies, 6) expanded and updated references, 7) an image database, and 8) inclusion of cooperating organizations' logos. The database-driven platform makes the website more dynamic and efficient to operate and update. New menus provide information about specific impact topics (buildings, transportation, power, health, agriculture, water and waste water, equipment and communications, clean up), and updated content has been added throughout all topics. A new "for scientists" menu includes information on ash collection and analysis. Website translation using Google Translate will significantly increase the user base. Printable resources (e.g. checklists, pamphlets, posters) provide information to people without Internet access. Ash impact studies are used to improve mitigation measures during future eruptions, and links to case studies will assist communities' preparation and response plans. The Case Studies menu is intended to be a living topic area, growing as new case studies are published. A database of all images from the website allows users to access larger resolution images and additional descriptive details. Logos clarify linkages among key contributors and assure users that the site is authoritative and science-based.

  16. The Clinical Next-Generation Sequencing Database: A Tool for the Unified Management of Clinical Information and Genetic Variants to Accelerate Variant Pathogenicity Classification.

    PubMed

    Nishio, Shin-Ya; Usami, Shin-Ichi

    2017-03-01

    Recent advances in next-generation sequencing (NGS) have given rise to new challenges due to the difficulties in variant pathogenicity interpretation and large dataset management, including many kinds of public population databases as well as public or commercial disease-specific databases. Here, we report a new database development tool, named the "Clinical NGS Database," for improving clinical NGS workflow through the unified management of variant information and clinical information. This database software offers a two-feature approach to variant pathogenicity classification. The first of these approaches is a phenotype similarity-based approach. This database allows the easy comparison of the detailed phenotype of each patient with the average phenotype of the same gene mutation at the variant or gene level. It is also possible to browse patients with the same gene mutation quickly. The other approach is a statistical approach to variant pathogenicity classification based on the use of the odds ratio for comparisons between the case and the control for each inheritance mode (families with apparently autosomal dominant inheritance vs. control, and families with apparently autosomal recessive inheritance vs. control). A number of case studies are also presented to illustrate the utility of this database. © 2016 The Authors. Human Mutation published by Wiley Periodicals, Inc.
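    A small sketch of the statistical approach described above (the counts and the zero-cell correction are illustrative assumptions, not the database's published method): an odds ratio comparing carrier counts in case families versus controls, computed separately per inheritance mode.

```python
# Hedged sketch: case-vs-control odds ratio per inheritance mode, with a
# 0.5 continuity correction when any cell is zero. Counts are invented.
def odds_ratio(case_carriers, case_total, ctrl_carriers, ctrl_total):
    """2x2 odds ratio with a Haldane-style 0.5 correction for zero cells."""
    a = case_carriers
    b = case_total - case_carriers
    c = ctrl_carriers
    d = ctrl_total - ctrl_carriers
    if 0 in (a, b, c, d):
        a, b, c, d = a + 0.5, b + 0.5, c + 0.5, d + 0.5
    return (a * d) / (b * c)

# Separate comparisons per inheritance mode, as the text describes.
print(odds_ratio(12, 200, 3, 1000))  # dominant-mode families vs. control
print(odds_ratio(5, 80, 0, 1000))    # recessive-mode families vs. control (corrected)
```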

  17. Next Generation Waste Tracking: Linking Legacy Systems with Modern Networking Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, Randy M.; Resseguie, David R.; Shankar, Mallikarjun

    2010-01-01

    This report describes results from a preliminary analysis to satisfy the Department of Energy (DOE) objective to ensure the safe, secure, efficient packaging and transportation of materials, both hazardous and non-hazardous [1, 2]. The DOE Office of Environmental Management (OEM), through Oak Ridge National Laboratory (ORNL), has embarked on a project to further this objective. OEM and ORNL have agreed to develop, demonstrate and make available modern-day, cost-effective technologies for characterization, identification, tracking, monitoring and disposal of radioactive waste when transported by, or between, motor, air, rail, and water modes. During the past 8 years ORNL has investigated and deployed Web 2.0 compliant sensors into the transportation segment of the supply chain. ORNL has recently demonstrated operational experience with the DOE Oak Ridge Operations Office (ORO) and others in national test beds and applications within this domain of the supply chain. Furthermore, in addition to DOE, these hazardous materials supply chain partners included Federal and State enforcement agencies, international ports, and commercial sector shipping operations in a hazardous/radioactive materials tracking and monitoring program called IntelligentFreight. IntelligentFreight is an ORNL initiative encompassing 5 years of research effort associated with the supply chain. The ongoing ORNL SmartFreight programs include RadSTraM [3], GRadSTraM, Trusted Corridors, SensorPedia [4], SensorNet, the Southeastern Transportation Corridor Pilot (SETCP) and the Trade Data Exchange [5]. The integration of multiple technologies aimed at safer, more secure conveyance has been investigated, with the core research question focused on testing distinctly different distributed supply chain information sharing systems. ORNL, with support from ORO, has demonstrated capabilities when transporting Environmental Management (EM) waste materials for disposal over an onsite haul road. ORNL has unified the operations of existing legacy hazardous, radioactive and related informational databases and systems using emerging Web 2.0 technologies. These capabilities were used to interoperate ORNL's waste generating, packaging, transportation and disposal with other DOE ORO waste management contractors. Importantly, the DOE EM objectives were accomplished in a cost-effective manner without altering existing information systems. A path forward is to demonstrate and share these technologies with DOE EM, contractors and stakeholders. This approach will not alter existing DOE assets, i.e. Automated Traffic Management Systems (ATMS), the Transportation Tracking and Communications System (TRANSCOM), the Argonne National Laboratory (ANL) demonstrated package tracking system, etc.

  18. Management of proximal humeral fractures in the nineteenth century: an historical review of preradiographic sources.

    PubMed

    Brorson, Stig

    2011-04-01

    The diagnosis and treatment of fractures of the proximal humerus have troubled patients and medical practitioners since antiquity. Preradiographic diagnosis relied on surface anatomy, pain localization, crepitus, and impaired function. During the nineteenth century, a more thorough understanding of the pathoanatomy and pathophysiology of proximal humeral fractures was obtained, and new methods of reduction and bandaging were developed. I reviewed nineteenth-century principles of (1) diagnosis, (2) classification, (3) reduction, (4) bandaging, and (5) concepts of displacement in fractures of the proximal humerus. A narrative review of nineteenth-century surgical texts is presented. Sources were identified by searching bibliographic databases, orthopaedic sourcebooks, textbooks in medical history, and a subsequent hand search. Substantial progress in understanding fractures of the proximal humerus is found in nineteenth-century textbooks. A rational approach to understanding fractures of the proximal humerus was made possible by an appreciation of the underlying functional anatomy and subsequent pathoanatomy. Thus, new principles of diagnosis, pathoanatomic classifications, modified methods of reduction, functional bandaging, and advanced concepts of displacement were proposed, challenging the classic management adhered to for more than 2000 years. The principles for modern pathoanatomic and pathophysiologic understanding of proximal humeral fractures and the principles for classification, nonsurgical treatment, and bandaging were established in the preradiographic era.

  19. Environment/Health/Safety (EHS): Databases

    Science.gov Websites

    Hazard Documents Database; Biosafety Authorization System; CATS (Corrective Action Tracking System) (for findings 12/2005 to present); Chemical Management System; Electrical Safety; Ergonomics Database (for new…); Lessons Learned / Best Practices; REMS - Radiation Exposure Monitoring System; SJHA Database - Subcontractor Job

  20. Implementation of Risk Management in NASA's CEV Project- Ensuring Mission Success

    NASA Astrophysics Data System (ADS)

    Perera, Jeevan; Holsomback, Jerry D.

    2005-12-01

    Most project managers know that Risk Management (RM) is essential to good project management. At NASA, standards and procedures to manage risk through a tiered approach have been developed - from the global agency-wide requirements down to a program or project's implementation. The basic methodology for NASA's risk management strategy includes processes to identify, analyze, plan, track, control, communicate and document risks. The identification, characterization, mitigation plan, and mitigation responsibilities associated with specific risks are documented to help communicate, manage, and effectuate appropriate closure. This approach helps to ensure more consistent documentation and assessment and provides a means of archiving lessons learned for future identification or mitigation activities. A new risk database and management tool was developed by NASA in 2002 and has since been used successfully to communicate, document and manage a number of diverse risks for the International Space Station, Space Shuttle, and several other NASA projects and programs, including at the Johnson Space Center. Organizations use this database application to effectively manage and track each risk and gain insight into impacts from other organizations' viewpoints to develop integrated solutions. Schedule, cost, technical and safety issues are tracked in detail through this system. Risks are tagged within the system to ensure proper review, coordination and management at the necessary management level. The database is intended as a day-to-day tool for organizations to manage their risks and elevate those issues that need coordination from above. Each risk is assigned to a managing organization and a specific risk owner who generates mitigation plans as appropriate. In essence, the risk owner is responsible for shepherding the risk through closure. The individual who identifies a new risk does not necessarily get assigned as the risk owner; whoever is in the best position to effectuate comprehensive closure is assigned as the risk owner. Each mitigation plan includes the specific tasks that will be conducted to either decrease the likelihood of the risk occurring and/or lessen the severity of the consequences if they do occur. As each mitigation task is completed, the responsible managing organization records the completion of the task in the risk database and then re-scores the risk considering the task's results. By keeping scores updated, a managing organization's current top risks and risk posture can be readily identified, including the status of any risk in the system. A number of metrics measure risk process trends from data contained in the database. This allows for trend analysis to further identify improvements to the process and assist in the management of all risks. The metrics also scrutinize both the effectiveness of and compliance with risk management requirements. The risk database is an evolving tool and will be continuously improved with capabilities requested by the NASA project community. This paper presents the basic foundations of risk management, the elements necessary for effective risk management, and the capabilities of this new risk database and how it is implemented to support NASA's risk management needs.
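    A minimal sketch of the re-scoring loop described above (the 1-5 scales and field names are illustrative assumptions, not NASA's tool): each risk carries a likelihood and consequence score, an owner, and mitigation tasks; completing a task re-scores the risk, and the current top risks can be listed.

```python
from dataclasses import dataclass, field

# Hedged sketch of a risk record and the re-scoring loop. Scales and
# fields are illustrative assumptions.
@dataclass
class Risk:
    title: str
    owner: str           # the risk owner who shepherds the risk to closure
    likelihood: int      # 1 (remote) .. 5 (near certain)
    consequence: int     # 1 (negligible) .. 5 (catastrophic)
    open_tasks: list = field(default_factory=list)

    @property
    def score(self) -> int:
        return self.likelihood * self.consequence

    def complete_task(self, task: str, new_likelihood: int, new_consequence: int):
        """Record a finished mitigation task, then re-score the risk."""
        self.open_tasks.remove(task)
        self.likelihood, self.consequence = new_likelihood, new_consequence

risks = [
    Risk("Thermal margin shortfall", "Propulsion", 4, 4, ["Re-run thermal model"]),
    Risk("Supplier slip", "Logistics", 3, 2, ["Qualify second source"]),
]
risks[0].complete_task("Re-run thermal model", new_likelihood=2, new_consequence=4)

# Current top risks, highest score first.
for r in sorted(risks, key=lambda r: r.score, reverse=True):
    print(f"{r.score:2d}  {r.title} (owner: {r.owner})")
```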

  1. DB Dehydrogenase: an online integrated structural database on enzyme dehydrogenase.

    PubMed

    Nandy, Suman Kumar; Bhuyan, Rajabrata; Seal, Alpana

    2012-01-01

    Dehydrogenase enzymes are almost indispensable for metabolic processes. Shortage or malfunctioning of dehydrogenases often leads to several acute diseases like cancers, retinal diseases, diabetes mellitus, Alzheimer's disease, hepatitis B & C, etc. With the advancement of modern-day research, huge amounts of sequence, structural and functional data are generated every day, widening the gap between structural attributes and their functional understanding. DB Dehydrogenase is an effort to relate the functionalities of dehydrogenases with their structures. It is a completely web-based structural database, covering almost all dehydrogenases [~150 enzyme classes, ~1200 entries from ~160 organisms] whose structures are known. It was created by extracting and integrating various online resources to provide true and reliable data, and implemented as a MySQL relational database with user-friendly web interfaces using CGI Perl. Flexible search options are provided for data extraction and exploration. To summarize, with the sequence, structure, and function of all dehydrogenases in one place, along with the necessary option of cross-referencing, this database will be useful for researchers carrying out further work in this field. The database is available for free at http://www.bifku.in/DBD/

  2. Authentication Based on Pole-zero Models of Signature Velocity

    PubMed Central

    Rashidi, Saeid; Fallah, Ali; Towhidkhah, Farzad

    2013-01-01

    With the increase of communication and financial transactions over the Internet, online signature verification is an accepted biometric technology for access control and plays a significant role in authentication and authorization in modern society. Fast and precise algorithms for signature verification are therefore very attractive. The goal of this paper is to model the velocity signal, whose pattern and properties are stable for a given person. Using pole-zero models based on the discrete cosine transform, a precise modeling method is proposed, and features are then extracted from strokes. Using linear, Parzen-window, and support vector machine classifiers, the signature verification technique was tested on a large number of authentic and forged signatures and demonstrated good potential. The signatures were collected from three different databases: a proprietary database, and the SVC2004 and Sabanci University (SUSIG) benchmark signature databases. Experimental results on the Persian, SVC2004, and SUSIG databases show that our method achieves equal error rates of 5.91%, 5.62%, and 3.91%, respectively, on skilled forgeries. PMID:24696797
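
    A minimal sketch of the signal-modeling idea described above, assuming a uniformly sampled pen trajectory: differentiate the (x, y) samples to obtain a velocity-magnitude signal, then keep its leading DCT coefficients as a compact descriptor (the coefficient count is arbitrary, and the paper's actual pole-zero fitting is not reproduced here):

        import numpy as np
        from scipy.fft import dct

        def velocity_dct_features(x: np.ndarray, y: np.ndarray, n_coeff: int = 16) -> np.ndarray:
            """Velocity magnitude from a sampled pen trajectory, summarized by its
            first n_coeff DCT coefficients (a crude stand-in for pole-zero modeling)."""
            vx, vy = np.diff(x), np.diff(y)      # finite-difference velocities
            speed = np.hypot(vx, vy)             # velocity magnitude per sample
            coeffs = dct(speed, norm="ortho")    # energy compacts into low-order terms
            return coeffs[:n_coeff]

        # Usage on a synthetic stroke:
        t = np.linspace(0.0, 1.0, 200)
        feats = velocity_dct_features(np.cos(3 * t), np.sin(5 * t))
        print(feats.shape)  # (16,)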

  3. JPEG2000 and dissemination of cultural heritage over the Internet.

    PubMed

    Politou, Eugenia A; Pavlidis, George P; Chamzas, Christodoulos

    2004-03-01

    By applying the latest technologies in image compression to manage the storage of massive image data within cultural heritage databases, and by exploiting the universality of the Internet, we are now able not only to effectively digitize, record, and preserve cultural heritage, but also to promote its dissemination. In this work we present an application of the latest image compression standard, JPEG2000, to managing and browsing image databases, focusing on the image transmission aspect rather than on database management and indexing. We combine JPEG2000 image compression with client-server socket connections and a client browser plug-in to provide an all-in-one package for remote browsing of JPEG2000-compressed image databases, suitable for the effective dissemination of cultural heritage.
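
    JPEG2000's resolution scalability is what makes remote browsing of large images efficient: a client can decode a coarse preview without the full codestream. A local sketch of that property using Pillow, assuming Pillow was built with OpenJPEG support and that artifact.jp2 is a sample file (the paper's own socket protocol and browser plug-in are not reproduced):

        from PIL import Image

        # Open a JPEG2000 image; "artifact.jp2" is a placeholder filename.
        im = Image.open("artifact.jp2")

        # Ask the decoder to discard resolution levels before loading, yielding a
        # coarse preview quickly -- the property JPEG2000 browsing clients exploit.
        # (Pillow's Jpeg2KImageFile exposes a writable `reduce` attribute for this.)
        im.reduce = 2          # drop 2 resolution levels -> roughly 1/4 linear size
        im.load()
        print(im.size)
        im.save("preview.png")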

  4. Configuration management program plan for Hanford site systems engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kellie, C.L.

    This plan establishes the integrated management program for the evolving technical baseline developed through the systems engineering process. This configuration management program aligns with the criteria identified in the DOE Standard, DOE-STD-1073-93. Included are specific requirements for control of the systems engineering RDD-100 database, and electronic data incorporated in the database that establishes the Hanford Site Technical Baseline.

  5. Microgrid Controls | Grid Modernization | NREL

    Science.gov Websites

    Microgrid Controller Interaction with Distribution Management Systems: conducted at NREL's Systems Integration Facility, this project investigates the interaction of distribution management systems with local controllers, including microgrid controllers, and is developing integrated control and management systems for distribution systems.

  6. Case Study: IRS Business System Modernization Process Improvement

    DTIC Science & Technology

    2004-03-01

    [Excerpt; report front matter and figure list omitted.] The case study notes a constant emphasis on training in project management and acquisition, as well as in the SA-CMM (CMU/SEI-2004-TR-002; Figure 1: Managing Organizational Change).

  7. TRENDS: The aeronautical post-test database management system

    NASA Technical Reports Server (NTRS)

    Bjorkman, W. S.; Bondi, M. J.

    1990-01-01

    TRENDS, an engineering-test database operating system developed by NASA to support rotorcraft flight tests, is described. Capabilities and characteristics of the system are presented, with examples of its use in recalling and analyzing rotorcraft flight-test data from a TRENDS database. The importance of system user-friendliness in gaining users' acceptance is stressed, as is the importance of integrating supporting narrative data with numerical data in engineering-test databases. Considerations relevant to the creation and maintenance of flight-test databases are discussed, and TRENDS' solutions to database management problems are described. The requirements, constraints, and other considerations that led to the system's configuration are discussed, and some of the lessons learned during TRENDS' development are presented. Potential applications of TRENDS to a wide range of aeronautical and other engineering tests are identified.

  8. DOE technology information management system database study report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Widing, M.A.; Blodgett, D.W.; Braun, M.D.

    1994-11-01

    To support the missions of the US Department of Energy (DOE) Special Technologies Program, Argonne National Laboratory is defining the requirements for an automated software system that will search electronic databases on technology. This report examines the work done and results to date. Argonne studied existing commercial and government sources of technology databases in five general areas: on-line services, patent database sources, government sources, aerospace technology sources, and general technology sources. First, it conducted a preliminary investigation of these sources to obtain information on the content, cost, frequency of updates, and other aspects of their databases. The Laboratory then performed detailed examinations of at least one source in each area. On this basis, Argonne recommended which databases should be incorporated in DOE's Technology Information Management System.

  9. Netlib services and resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Browne, S.V.; Green, S.C.; Moore, K.

    1994-04-01

    The Netlib repository, maintained by the University of Tennessee and Oak Ridge National Laboratory, contains freely available software, documents, and databases of interest to the numerical, scientific computing, and other communities. This report includes both the Netlib User's Guide and the Netlib System Manager's Guide, and contains information about Netlib's databases, interfaces, and system implementation. The Netlib repository's databases include the Performance Database, the Conferences Database, and the NA-NET mail forwarding and Whitepages Databases. A variety of user interfaces enable users to access the Netlib repository in the manner most convenient and compatible with their networking capabilities. These interfaces include the Netlib email interface, the Xnetlib X Windows client, the netlibget command-line TCP/IP client, anonymous FTP, anonymous RCP, and gopher.
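
    The repository remains reachable over HTTP(S) today, so the interfaces listed above have a modern few-line equivalent. The sketch below fetches a directory index; the path is chosen only as a plausible example of the site's layout, not taken from this report:

        from urllib.request import urlopen

        # Fetch a directory index from the Netlib repository over HTTPS.
        # The exact path is an assumption for illustration; see https://netlib.org/.
        with urlopen("https://netlib.org/blas/") as resp:
            index = resp.read().decode("utf-8", errors="replace")
        print(index[:200])   # first part of the index page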

  10. Concierge: Personal Database Software for Managing Digital Research Resources

    PubMed Central

    Sakai, Hiroyuki; Aoyama, Toshihiro; Yamaji, Kazutsuna; Usui, Shiro

    2007-01-01

    This article introduces a desktop application, named Concierge, for managing personal digital research resources. Using simple operations, it enables storage of various types of files and indexes them based on content descriptions. A key feature of the software is its high level of extensibility: by installing optional plug-ins, users can customize and extend the usability of the software based on their needs. In this paper, we also introduce a few optional plug-ins: literature management, electronic laboratory notebook, and XooNIps client plug-ins. XooNIps is a content management system developed to share digital research resources among neuroscience communities; it has been adopted as the standard database system in Japanese neuroinformatics projects. Concierge therefore offers comprehensive support, from the management of personal digital research resources to their sharing in open-access neuroinformatics databases such as XooNIps. This interaction between personal and open-access neuroinformatics databases is expected to enhance the dissemination of digital research resources. Concierge is developed as an open-source project; Mac OS X and Windows XP versions have been released at the official site (http://concierge.sourceforge.jp). PMID:18974800
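
    The plug-in extensibility described here is commonly implemented as a discover-and-register loop over a plug-ins package. The following generic sketch illustrates that pattern; it is not Concierge's actual API, and the concierge_plugins package name and register() convention are assumptions:

        import importlib
        import pkgutil

        def load_plugins(package_name: str) -> list:
            """Discover and instantiate every module in a plug-ins package that
            exposes a `register()` callable -- a generic extensibility pattern."""
            package = importlib.import_module(package_name)
            plugins = []
            for info in pkgutil.iter_modules(package.__path__):
                module = importlib.import_module(f"{package_name}.{info.name}")
                if hasattr(module, "register"):
                    plugins.append(module.register())
            return plugins

        # Usage (assumes a local package `concierge_plugins` containing modules
        # such as literature.py or labnotebook.py, each defining register()):
        # active = load_plugins("concierge_plugins")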

  11. Foundational Report Series. Advanced Distribution management Systems for Grid Modernization (Importance of DMS for Distribution Grid Modernization)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jianhui

    2015-09-01

    Grid modernization is transforming the operation and management of electric distribution systems from manual, paper-driven business processes to electronic, computer-assisted decision-making. At the center of this business transformation is the distribution management system (DMS), which provides a foundation from which optimal levels of performance can be achieved in an increasingly complex business and operating environment. Electric distribution utilities are facing many new challenges that are dramatically increasing the complexity of operating and managing the electric distribution system: growing customer expectations for service reliability and power quality, pressure to achieve better efficiency and utilization of existing distribution system assets, and reduction of greenhouse gas emissions by accommodating high penetration levels of distributed generating resources powered by renewable energy sources (wind, solar, etc.). Recent “storm of the century” events in the northeastern United States and the lengthy power outages and customer hardships that followed have greatly elevated the need to make power delivery systems more resilient to major storm events and to provide a more effective electric utility response during such regional power grid emergencies. Despite these newly emerging challenges for electric distribution system operators, only a small percentage of electric utilities have actually implemented a DMS. This paper discusses reasons why a DMS is needed and why the DMS may emerge as a mission-critical system that will soon be considered essential as electric utilities roll out their grid modernization strategies.

  12. Surface Transportation Security Priority Assessment

    DTIC Science & Technology

    2010-03-01

    ...intercity buses), and pipelines, and related infrastructure (including roads and highways), that are within the territory of the United States... Modernizing the information technology infrastructure used to vet the identity of travelers and transportation workers... Using terrorist databases to... examination of persons travelling, surface transportation modes tend to operate in a much more open environment, making it difficult to screen workers

  13. Human and biophysical factors influencing modern fire disturbance in northern Wisconsin

    Treesearch

    Brian R. Sturtevant; David T. Cleland

    2007-01-01

    Humans cause most wildfires in northern Wisconsin, but interactions between human and biophysical variables affecting fire starts and size are not well understood. We applied classification tree analyses to a 16-year fire database from northern Wisconsin to evaluate the relative importance of human v. biophysical variables affecting fire occurrence within (1) all cover...

  14. Master's Degree in Management Information Systems with a Supply Chain Management Focus

    ERIC Educational Resources Information Center

    Ramaswamy, Kizhanatham V.; Boyd, Joseph L.; Desai, Mayur

    2007-01-01

    A graduate curriculum in Management Information Systems with a Supply Chain Management focus is presented. The motivation for this endeavor stems from the fact that the global scope of modern business organizations and the competitive environment in which they operate require an information-system-leveraged supply chain management system (SCM)…

  15. 77 FR 20010 - Proposed Information Collection; Comment Request; Online Customer Relationship Management (CRM...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-03

    ... DEPARTMENT OF COMMERCE Minority Business Development Agency Proposed Information Collection; Comment Request; Online Customer Relationship Management (CRM)/Performance Databases, the Online Phoenix... of program goals via the Online CRM/Performance Databases. The data collected through the Online CRM...

  16. Implementing a Microcomputer Database Management System.

    ERIC Educational Resources Information Center

    Manock, John J.; Crater, K. Lynne

    1985-01-01

    Current issues in selecting, structuring, and implementing microcomputer database management systems in research administration offices are discussed, and their capabilities are illustrated with the system used by the University of North Carolina at Wilmington. Trends in microcomputer technology and their likely impact on research administration…

  17. Palliative care and active disease management are synergistic in modern surgical oncology.

    PubMed

    Sadler, Erin M; Hawley, Philippa H; Easson, Alexandra M

    2018-04-01

    Palliative care has long been described in the medical literature but only recently is being discussed in the surgical domain. Mounting evidence suggests that early integration of palliative care improves patient outcomes, and this is especially true of oncology patients. Thus, the pendulum is swinging toward recognizing that palliative care and active disease management are not mutually exclusive but rather synergistic in modern surgical oncology. Here we use a patient vignette to demonstrate the new challenges and possibilities in modern surgical oncology; we then discuss the historical perspective of palliative care and describe how the paradigm is shifting. Finally, we introduce a model that may be beneficial in conceptualizing this new way of thinking about and integrating palliative care into surgical oncology. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Clinical Databases for Chest Physicians.

    PubMed

    Courtwright, Andrew M; Gabriel, Peter E

    2018-04-01

    A clinical database is a repository of patient medical and sociodemographic information focused on one or more specific health conditions or exposures. Although clinical databases may be used for research purposes, their primary goal is to collect and track patient data for quality improvement, quality assurance, and/or actual clinical management. This article aims to provide an introduction and practical advice on the development of small-scale clinical databases for chest physicians and practice groups. Through example projects, we discuss the pros and cons of available technical platforms, including Microsoft Excel and Access, relational database management systems such as Oracle and PostgreSQL, and Research Electronic Data Capture. We consider approaches to deciding the base unit of data collection, creating consensus around variable definitions, and structuring routine clinical care to complement database aims. We conclude with an overview of regulatory and security considerations for clinical databases. Copyright © 2018 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.
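
    To make the choice of a "base unit of data collection" concrete, here is a minimal sketch of a small-scale clinical database in which the encounter, rather than the patient, is the base unit; the schema and variable names are invented for illustration, not recommendations from the article:

        import sqlite3

        con = sqlite3.connect("clinic.db")   # placeholder filename
        con.executescript("""
        CREATE TABLE IF NOT EXISTS patient (
            patient_id INTEGER PRIMARY KEY,
            mrn TEXT UNIQUE,                 -- medical record number
            birth_year INTEGER
        );
        CREATE TABLE IF NOT EXISTS encounter (
            encounter_id INTEGER PRIMARY KEY,
            patient_id INTEGER NOT NULL REFERENCES patient(patient_id),
            visit_date TEXT,                 -- ISO-8601 date
            fev1_pct REAL                    -- example tracked variable (% predicted)
        );
        """)
        # With the encounter as the base unit, longitudinal questions (trend in
        # FEV1, visits per year) become simple per-patient queries over encounters.
        con.execute("INSERT OR IGNORE INTO patient (mrn, birth_year) VALUES ('A-001', 1960)")
        con.commit()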

  19. The ATLAS TAGS database distribution and management - Operational challenges of a multi-terabyte distributed database

    NASA Astrophysics Data System (ADS)

    Viegas, F.; Malon, D.; Cranshaw, J.; Dimitrov, G.; Nowak, M.; Nairz, A.; Goossens, L.; Gallas, E.; Gamboa, C.; Wong, A.; Vinek, E.

    2010-04-01

    The TAG files store summary event quantities that allow a quick selection of interesting events. These data will be produced at a nominal rate of 200 Hz and are uploaded into a relational database for access from websites and other tools. The estimated database volume is 6 TB per year, making this the largest application running on the ATLAS relational databases, at CERN and at other voluntary sites. The sheer volume and high rate of production make this application a challenge to data and resource management in many respects. This paper focuses on the operational challenges of the system. These include: uploading the data from files to CERN's and remote sites' databases; distributing the TAG metadata that is essential to guide the user through event selection; and controlling resource usage of the database, from the user query load to the strategy for cleaning and archiving old TAG data.
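
    The purpose of TAG data is fast pre-selection: summary quantities are queried in the relational store, and only the selected events are then retrieved from the full event store. A schematic sketch of such a selection (table and column names are invented for illustration and are not the ATLAS schema):

        import sqlite3

        # Hypothetical miniature TAG table: one row of summary quantities per event.
        con = sqlite3.connect(":memory:")
        con.execute("""CREATE TABLE tag (
            run_number INTEGER, event_number INTEGER,
            n_muons INTEGER, missing_et REAL)""")
        con.executemany("INSERT INTO tag VALUES (?, ?, ?, ?)",
                        [(201000, 1, 2, 55.0), (201000, 2, 0, 12.3), (201001, 7, 3, 41.8)])

        # Select candidate events cheaply before touching the full event store.
        hits = con.execute("SELECT run_number, event_number FROM tag "
                           "WHERE n_muons >= 2 AND missing_et > 40").fetchall()
        print(hits)  # [(201000, 1), (201001, 7)]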

  20. PACSY, a relational database management system for protein structure and chemical shift analysis

    PubMed Central

    Lee, Woonghee; Yu, Wookyung; Kim, Suhkmann; Chang, Iksoo

    2012-01-01

    PACSY (Protein structure And Chemical Shift NMR spectroscopY) is a relational database management system that integrates information from the Protein Data Bank, the Biological Magnetic Resonance Data Bank, and the Structural Classification of Proteins database. PACSY provides three-dimensional coordinates and chemical shifts of atoms along with derived information such as torsion angles, solvent accessible surface areas, and hydrophobicity scales. PACSY consists of six relational table types linked to one another for coherence by key identification numbers. Database queries are enabled by advanced search functions supported by an RDBMS server such as MySQL or PostgreSQL. PACSY enables users to search for combinations of information from different database sources in support of their research. Two software packages, PACSY Maker for database creation and PACSY Analyzer for database analysis, are available from http://pacsy.nmrfam.wisc.edu. PMID:22903636
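
    Because PACSY's table types are linked by key identification numbers, typical queries are multi-table joins. The following is a schematic example of the kind of combined structure-and-shift query such a layout supports, written against hypothetical table and column names rather than PACSY's actual schema:

        # A join across structure, atom, and chemical-shift tables (names assumed).
        query = """
        SELECT s.pdb_id, a.residue_no, a.atom_name, c.shift_ppm, a.torsion_phi
        FROM structure AS s
        JOIN atom  AS a ON a.structure_key = s.structure_key
        JOIN shift AS c ON c.atom_key = a.atom_key
        WHERE s.scop_class = 'a'      -- e.g., all-alpha proteins
          AND a.atom_name = 'CA'
        ORDER BY s.pdb_id, a.residue_no;
        """
        # Execute against a MySQL or PostgreSQL server hosting a PACSY-like schema,
        # e.g. with psycopg2: cur.execute(query); rows = cur.fetchall()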
