Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-23
... Services, Inc., Corporate/EIT/CTO Database Management Division, Hartford, CT; Notice of Negative... Services, Inc., Corporate/EIT/CTO Database Management Division, Hartford, Connecticut (The Hartford, Corporate/EIT/CTO Database Management Division). The negative determination was issued on August 19, 2011...
47 CFR 52.101 - General definitions.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...
47 CFR 52.101 - General definitions.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...
47 CFR 52.101 - General definitions.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...
47 CFR 52.101 - General definitions.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...
47 CFR 52.101 - General definitions.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Center (“NASC”). The entity that provides user support for the Service Management System database and administers the Service Management System database on a day-to-day basis. (b) Responsible Organization (“Resp... regional databases in the toll free network. (d) Service Management System Database (“SMS Database”). The...
Graphical user interfaces for symbol-oriented database visualization and interaction
NASA Astrophysics Data System (ADS)
Brinkschulte, Uwe; Siormanolakis, Marios; Vogelsang, Holger
1997-04-01
In this approach, two basic services designed for the engineering of computer-based systems are combined: a symbol-oriented man-machine service and a high-speed database service. The man-machine service is used to build graphical user interfaces (GUIs) for the database service; these interfaces are stored using the database service. The idea is to create a GUI-builder and a GUI-manager for the database service, based upon the man-machine service and the concept of symbols. With user-definable and predefined symbols, database contents can be visualized and manipulated in a very flexible and intuitive way. Using the GUI-builder and GUI-manager, users can build and operate their own graphical user interfaces for a given database according to their needs without writing a single line of code.
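The symbol concept described in this abstract can be illustrated with a small sketch. This is not the authors' implementation; all class and field names below are hypothetical, chosen only to show how user-definable symbols might bind database fields to visual renderings.

```python
# Hypothetical sketch of the symbol idea: each database field is bound to a
# "symbol" that knows how to render the underlying value. All names here are
# illustrative, not taken from the paper.
class Symbol:
    """A visual element bound to one database field."""
    def __init__(self, field, render):
        self.field = field      # database column this symbol visualizes
        self.render = render    # function: value -> display string

class SymbolGUI:
    """A tiny GUI-manager: visualizes a record through user-defined symbols."""
    def __init__(self):
        self.symbols = []

    def bind(self, field, render):
        self.symbols.append(Symbol(field, render))

    def show(self, record):
        # Render each bound field with its symbol; the end user writes no code,
        # only chooses symbols and fields.
        return [s.render(record[s.field]) for s in self.symbols]

gui = SymbolGUI()
gui.bind("temperature", lambda v: f"THERMO[{v} C]")
gui.bind("valve_open", lambda v: "VALVE(open)" if v else "VALVE(closed)")
print(gui.show({"temperature": 71, "valve_open": True}))
```

The point of the sketch is the separation the paper describes: the mapping from data to visuals lives in reusable symbol definitions, not in per-application code.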
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Yubin; Shankar, Mallikarjun; Park, Byung H.
Designing a database system for both efficient data management and data services has been one of the enduring challenges in the healthcare domain. In many healthcare systems, data services and data management are often viewed as two orthogonal tasks; data services refer to retrieval and analytic queries such as search, joins, statistical data extraction, and simple data mining algorithms, while data management refers to building error-tolerant and non-redundant database systems. The gap between service and management has resulted in rigid database systems and schemas that do not support effective analytics. We compose a rich graph structure from an abstracted healthcare RDBMS to illustrate how we can fill this gap in practice. We show how a healthcare graph can be automatically constructed from a normalized relational database using the proposed 3NF Equivalent Graph (3EG) transformation. We discuss a set of real world graph queries such as finding self-referrals, shared providers, and collaborative filtering, and evaluate their performance over a relational database and its 3EG-transformed graph. Experimental results show that the graph representation serves as multiple de-normalized tables, thus reducing complexity in a database and enhancing data accessibility of users. Based on this finding, we propose an ensemble framework of databases for healthcare applications.
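The core move in a relational-to-graph transformation of this kind is that rows become nodes and foreign-key references become edges. The paper's actual 3EG algorithm is not reproduced here; the toy tables and column names below are invented purely to illustrate the idea, using only the standard library.

```python
# Illustrative sketch in the spirit of a 3NF-to-graph transformation:
# rows of normalized tables become nodes, foreign keys become edges.
# Table and column names are hypothetical, not from the paper.
patients  = [{"id": 1, "name": "P1"}]
providers = [{"id": 10, "name": "Dr. A"}, {"id": 11, "name": "Dr. B"}]
visits    = [  # linking table: its foreign-key pairs become graph edges
    {"id": 100, "patient_id": 1, "provider_id": 10},
    {"id": 101, "patient_id": 1, "provider_id": 11},
]

nodes, edges = {}, []
for table, rows in [("patient", patients), ("provider", providers)]:
    for r in rows:
        nodes[(table, r["id"])] = r["name"]   # node key = (table, primary key)
for v in visits:
    edges.append((("patient", v["patient_id"]), ("provider", v["provider_id"])))

# A "shared providers"-style traversal: which providers has patient 1 seen?
seen = [nodes[dst] for src, dst in edges if src == ("patient", 1)]
print(seen)
```

A query that would need a multi-way join in the relational form becomes a one-hop edge traversal in the graph form, which is the de-normalization effect the abstract reports.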
Database on Demand: insight how to build your own DBaaS
NASA Astrophysics Data System (ADS)
Gaspar Aparicio, Ruben; Coterillo Coz, Ignacio
2015-12-01
At CERN, a number of key database applications run on user-managed MySQL, PostgreSQL and Oracle database services. The Database on Demand (DBoD) project was born out of an idea to provide the CERN user community with an environment to develop and run database services as a complement to the central Oracle-based database service. Database on Demand empowers users to perform certain actions that had traditionally been done by database administrators, providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines; at present, three major RDBMS (relational database management system) vendors are offered. In this article we show the current status of the service after almost three years of operation, give some insight into our redesigned software engineering, and outline its near-future evolution.
The INFN-CNAF Tier-1 GEMSS Mass Storage System and database facility activity
NASA Astrophysics Data System (ADS)
Ricci, Pier Paolo; Cavalli, Alessandro; Dell'Agnello, Luca; Favaro, Matteo; Gregori, Daniele; Prosperini, Andrea; Pezzi, Michele; Sapunenko, Vladimir; Zizzi, Giovanni; Vagnoni, Vincenzo
2015-05-01
The consolidation of Mass Storage services at the INFN-CNAF Tier1 Storage department over the last 5 years has resulted in a reliable, high-performance and moderately easy-to-manage facility that provides data access, archive, backup and database services to several different use cases. At present, the GEMSS Mass Storage System, developed and installed at CNAF and based upon an integration between the IBM GPFS parallel filesystem and the Tivoli Storage Manager (TSM) tape management software, is one of the largest hierarchical storage sites in Europe. It provides storage resources for about 12% of LHC data, as well as for data of other non-LHC experiments. Files are accessed using standard SRM Grid services provided by the Storage Resource Manager (StoRM), also developed at CNAF. Data access is also provided by XRootD and HTTP/WebDAV endpoints. Besides these services, an Oracle database facility is in production, characterized by an effective level of parallelism, redundancy and availability. This facility runs databases for storing and accessing relational data objects and for providing database services to the currently active use cases. It takes advantage of several Oracle technologies, such as Real Application Cluster (RAC), Automatic Storage Manager (ASM) and the Enterprise Manager centralized management tools, together with other technologies for performance optimization, ease of management and downtime reduction. The aim of the present paper is to illustrate the state of the art of the INFN-CNAF Tier1 Storage department infrastructure and software services, and to give a brief outlook on forthcoming projects. A description of the administrative, monitoring and problem-tracking tools that play a primary role in managing the whole storage framework is also given.
NASA Astrophysics Data System (ADS)
Bhanumurthy, V.; Venugopala Rao, K.; Srinivasa Rao, S.; Ram Mohan Rao, K.; Chandra, P. Satya; Vidhyasagar, J.; Diwakar, P. G.; Dadhwal, V. K.
2014-11-01
Geographical Information Science (GIS) has now graduated from traditional desktop systems to Internet systems. Internet GIS is emerging as one of the most promising technologies for addressing Emergency Management. Web services with different privileges play an important role in disseminating emergency services to decision makers. The spatial database is one of the most important components in the successful implementation of Emergency Management. It contains spatial data in the form of raster and vector layers linked with non-spatial information. Comprehensive data are required to handle emergency situations in their different phases. These database elements comprise core data, hazard-specific data, corresponding attribute data, and live data coming from remote locations. Core data sets are the minimum required data, including base, thematic and infrastructure layers, needed to handle disasters. Disaster-specific information is required to handle a particular disaster situation such as flood, cyclone, forest fire, earthquake, landslide or drought. In addition, Emergency Management requires many types of data with spatial and temporal attributes that should be made available to the key players in the right format at the right time. The vector database needs to be complemented with satellite imagery of suitable resolution for visualisation and analysis in disaster management. The database is therefore interconnected and comprehensive, to meet the requirements of Emergency Management. This kind of integrated, comprehensive and structured database with appropriate information is required to deliver the right information at the right time to the right people. However, building a spatial database for Emergency Management is a challenging task because of key issues such as availability of data, sharing policies, compatible geospatial standards and data interoperability.
Therefore, to facilitate using, sharing, and integrating spatial data, there is a need to define standards for building emergency database systems. These include aspects such as i) data integration procedures, namely a standard coding scheme, schema, metadata format and spatial format; ii) database organisation mechanisms covering data management, catalogues and data models; iii) database dissemination through a suitable environment, as a standard service for effective service dissemination. The National Database for Emergency Management (NDEM) is such a comprehensive database for addressing disasters in India at the national level. This paper explains standards for integrating and organising multi-scale and multi-source data for effective emergency response using customized user interfaces for NDEM. It presents a standard procedure for building comprehensive emergency information systems that enable emergency-specific functions through geospatial technologies.
[Quality management and participation into clinical database].
Okubo, Suguru; Miyata, Hiroaki; Tomotaki, Ai; Motomura, Noboru; Murakami, Arata; Ono, Minoru; Iwanaka, Tadashi
2013-07-01
Quality management is necessary for establishing a useful clinical database in cooperation with healthcare professionals and facilities. The elements of management are 1) progress management of data entry, 2) liaison with database participants (healthcare professionals), and 3) modification of the data collection form. In addition, healthcare facilities are expected to consider ethical issues and information security when joining clinical databases. Database participants should consult ethical review boards and provide a consultation service for patients.
Reef Ecosystem Services and Decision Support Database
This scientific and management information database utilizes systems thinking to describe the linkages between decisions, human activities, and provisioning of reef ecosystem goods and services. This database provides: (1) Hierarchy of related topics - Click on topics to navigat...
Do Librarians Really Do That? Or Providing Custom, Fee-Based Services.
ERIC Educational Resources Information Center
Whitmore, Susan; Heekin, Janet
This paper describes some of the fee-based, custom services provided by National Institutes of Health (NIH) Library to NIH staff, including knowledge management, clinical liaisons, specialized database searching, bibliographic database development, Web resource guide development, and journal management. The first section discusses selecting the…
23 CFR 971.204 - Management systems requirements.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS HIGHWAYS FOREST SERVICE... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases...
DIMA.Tools: An R package for working with the database for inventory, monitoring, and assessment
USDA-ARS?s Scientific Manuscript database
The Database for Inventory, Monitoring, and Assessment (DIMA) is a Microsoft Access database used to collect, store and summarize monitoring data. This database is used by both local and national monitoring efforts within the National Park Service, the Forest Service, the Bureau of Land Management, ...
NASA Astrophysics Data System (ADS)
Gaspar Aparicio, R.; Gomez, D.; Coterillo Coz, I.; Wojcik, D.
2012-12-01
At CERN, a number of key database applications run on user-managed MySQL database services. The Database on Demand project was born out of an idea to provide the CERN user community with an environment to develop and run database services outside of the centralised Oracle-based database services. Database on Demand (DBoD) empowers users to perform certain actions that had traditionally been done by database administrators (DBAs), providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines; presently the open community version of MySQL and single-instance Oracle database servers are offered. This article describes the technology approach taken to face this challenge, the service level agreement (SLA) that the project provides, and possible evolution scenarios.
The Fabric for Frontier Experiments Project at Fermilab
NASA Astrophysics Data System (ADS)
Kirby, Michael
2014-06-01
The FabrIc for Frontier Experiments (FIFE) project is a new, far-reaching initiative within the Fermilab Scientific Computing Division to drive the future of computing services for experiments at FNAL and elsewhere. It is a collaborative effort between computing professionals and experiment scientists to produce an end-to-end, fully integrated set of services for computing on grids and clouds, managing data, accessing databases, and collaborating within experiments. FIFE includes 1) easy-to-use job submission services for processing physics tasks on the Open Science Grid and elsewhere; 2) an extensive data management system for managing local and remote caches and for cataloging, querying, moving, and tracking the use of data; 3) custom and generic database applications for calibrations, beam information, and other purposes; 4) collaboration tools including an electronic log book, a speakers bureau database, and an experiment membership database. All of these aspects will be discussed in detail. FIFE sets the direction of computing at Fermilab experiments now and in the future, and is therefore a major driver in the design of computing services worldwide.
An optical scan/statistical package for clinical data management in C-L psychiatry.
Hammer, J S; Strain, J J; Lyerly, M
1993-03-01
This paper explores aspects of the need for clinical database management systems that permit ongoing service management, measurement of the quality and appropriateness of care, databased administration of consultation liaison (C-L) services, teaching/educational observations, and research. It describes an OPTICAL SCAN databased management system that permits flexible form generation, desktop publishing, and linking of observations in multiple files. This enhanced MICRO-CARES software system--Medical Application Platform (MAP)--permits direct transfer of the data to ASCII and SAS format for mainframe manipulation of the clinical information. The director of a C-L service may now develop his or her own forms, incorporate structured instruments, or develop "branch chains" of essential data to add to the core data set without the effort and expense to reprint forms or consult with commercial vendors.
Solutions in radiology services management: a literature review.
Pereira, Aline Garcia; Vergara, Lizandra Garcia Lupi; Merino, Eugenio Andrés Díaz; Wagner, Adriano
2015-01-01
The present study was aimed at reviewing the literature to identify solutions for problems observed in radiology services. A basic, qualitative, exploratory literature review was conducted in the Scopus and SciELO databases, utilizing the Mendeley and Adobe Illustrator CC software. In the databases, 565 papers were identified, 120 of them available as free PDFs. Problems observed in the radiology sector are related to procedure scheduling, humanization, lack of training, poor knowledge and use of management techniques, and interaction with users. Design management provides services with interesting solutions such as benchmarking, CRM, the Lean approach, service blueprinting and continuing education, among others. Literature review is an important tool to identify problems and their respective solutions. However, considering the small number of studies approaching the management of radiology services, this is a great field for the development of deeper studies.
Enabling On-Demand Database Computing with MIT SuperCloud Database Management System
2015-09-15
arc.liv.ac.uk/trac/SGE) provides these services and is independent of programming language (C, Fortran, Java, Matlab, etc.) or parallel programming... a MySQL database to store DNS records. The DNS records are controlled via a simple web service interface that allows records to be created
The Fabric for Frontier Experiments Project at Fermilab
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kirby, Michael
2014-01-01
The FabrIc for Frontier Experiments (FIFE) project is a new, far-reaching initiative within the Fermilab Scientific Computing Division to drive the future of computing services for experiments at FNAL and elsewhere. It is a collaborative effort between computing professionals and experiment scientists to produce an end-to-end, fully integrated set of services for computing on the grid and clouds, managing data, accessing databases, and collaborating within experiments. FIFE includes 1) easy-to-use job submission services for processing physics tasks on the Open Science Grid and elsewhere, 2) an extensive data management system for managing local and remote caches, cataloging, querying, moving, and tracking the use of data, 3) custom and generic database applications for calibrations, beam information, and other purposes, 4) collaboration tools including an electronic log book, speakers bureau database, and experiment membership database. All of these aspects will be discussed in detail. FIFE sets the direction of computing at Fermilab experiments now and in the future, and therefore is a major driver in the design of computing services worldwide.
Service Management Database for DSN Equipment
NASA Technical Reports Server (NTRS)
Zendejas, Silvino; Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Wolgast, Paul; Allen, Christopher; Luong, Ivy; Chang, George; Sadaqathulla, Syed
2009-01-01
This data- and event-driven persistent storage system leverages commercial software provided by Oracle for portability, ease of maintenance, scalability, and ease of integration with embedded, client-server, and multi-tiered applications. In this role, the Service Management Database (SMDB) is a key component of the overall end-to-end process involved in the scheduling, preparation, and configuration of the Deep Space Network (DSN) equipment needed to perform the various telecommunication services the DSN provides to its customers worldwide. SMDB makes efficient use of triggers, stored procedures, queuing functions, e-mail capabilities, data management, and Java integration features provided by the Oracle relational database management system. SMDB uses a third-normal-form schema design that allows for simple data maintenance procedures and thin layers of integration with client applications. The software provides an integrated event logging system with the ability to publish events to a JMS messaging system for synchronous and asynchronous delivery to subscribed applications. It provides a structured classification of events and application-level messages stored in database tables that are accessible by monitoring applications for real-time monitoring or for troubleshooting and analysis over historical archives.
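The structured, classified event logging described here can be sketched in miniature. The actual SMDB schema is Oracle-based and not published in this abstract; the tiny SQLite schema below is an invented stand-in that only illustrates the pattern of a normalized event table joined to an event-classification table.

```python
import sqlite3

# Minimal sketch of a classified event-logging design in the spirit of SMDB.
# The schema and event names are illustrative, not the actual DSN schema.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE event_class (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
    CREATE TABLE event (
        id       INTEGER PRIMARY KEY,
        class_id INTEGER NOT NULL REFERENCES event_class(id),
        message  TEXT NOT NULL,
        logged   TEXT DEFAULT CURRENT_TIMESTAMP
    );
""")
db.execute("INSERT INTO event_class(name) VALUES ('CONFIG'), ('ALARM')")
db.execute("INSERT INTO event(class_id, message) VALUES (2, 'link margin low')")

# Monitoring applications can query the classified history directly.
row = db.execute("""
    SELECT c.name, e.message
    FROM event e JOIN event_class c ON c.id = e.class_id
""").fetchone()
print(row)
```

Keeping the classification in its own table (rather than a free-text column) is what makes both real-time filtering and historical analysis cheap, which is the property the abstract emphasizes.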
Microcomputer-Based Access to Machine-Readable Numeric Databases.
ERIC Educational Resources Information Center
Wenzel, Patrick
1988-01-01
Describes the use of microcomputers and relational database management systems to improve access to numeric databases by the Data and Program Library Service at the University of Wisconsin. The internal records management system, in-house reference tools, and plans to extend these tools to the entire campus are discussed. (3 references) (CLB)
Solutions in radiology services management: a literature review*
Pereira, Aline Garcia; Vergara, Lizandra Garcia Lupi; Merino, Eugenio Andrés Díaz; Wagner, Adriano
2015-01-01
Objective The present study was aimed at reviewing the literature to identify solutions for problems observed in radiology services. Materials and Methods A basic, qualitative, exploratory literature review was conducted in the Scopus and SciELO databases, utilizing the Mendeley and Adobe Illustrator CC software. Results In the databases, 565 papers were identified, 120 of them available as free PDFs. Problems observed in the radiology sector are related to procedure scheduling, humanization, lack of training, poor knowledge and use of management techniques, and interaction with users. Design management provides services with interesting solutions such as benchmarking, CRM, the Lean approach, service blueprinting and continuing education, among others. Conclusion Literature review is an important tool to identify problems and their respective solutions. However, considering the small number of studies approaching the management of radiology services, this is a great field for the development of deeper studies. PMID:26543281
Using Online Databases in Corporate Issues Management.
ERIC Educational Resources Information Center
Thomsen, Steven R.
1995-01-01
Finds that corporate public relations practitioners felt they were able, using online database and information services, to intercept issues earlier in the "issue cycle" and thus enable their organizations to develop more "proactionary" or "catalytic" issues management response strategies. (SR)
USDA-ARS?s Scientific Manuscript database
Agroecosystem models and conservation planning tools require spatially and temporally explicit input data about agricultural management operations. The USDA Natural Resources Conservation Service is developing a Land Management and Operation Database (LMOD) which contains potential model input, howe...
NNDC Stand: Activities and Services of the National Nuclear Data Center
NASA Astrophysics Data System (ADS)
Pritychenko, B.; Arcilla, R.; Burrows, T. W.; Dunford, C. L.; Herman, M. W.; McLane, V.; Obložinský, P.; Sonzogni, A. A.; Tuli, J. K.; Winchell, D. F.
2005-05-01
The National Nuclear Data Center (NNDC) collects, evaluates, and disseminates nuclear physics data for basic nuclear research and for applied nuclear technologies including energy, shielding, medical and homeland security applications. In 2004, to answer the needs of the nuclear data user community, the NNDC completed a project to modernize the storage and management of its databases and began offering new nuclear data Web services. The principles of database and Web application development, as well as the related nuclear reaction and structure database services, are briefly described.
Lessons Learned With a Global Graph and Ozone Widget Framework (OWF) Testbed
2013-05-01
of operating system and database environments. The following is one example. Requirements are: Java 1.6+ and a Relational Database Management... We originally tried to use MySQL as our database, because we were more familiar with it, but since the database dumps as well as most of the... Global Graph Rest Services In order to set up the Global Graph Rest Services, you will need to have the following dependencies installed: Java 1.6
Health technology management: a database analysis as support of technology managers in hospitals.
Miniati, Roberto; Dori, Fabrizio; Iadanza, Ernesto; Fregonara, Mario M; Gentili, Guido Biffi
2011-01-01
Technology management in healthcare must continually respond and adapt to improvements in medical equipment. Multidisciplinary approaches that consider the interaction of different technologies, their use and user skills are necessary in order to improve safety and quality. An easy and sustainable methodology is vital for Clinical Engineering (CE) services in healthcare organizations in order to define criteria for technology acquisition and replacement. This article underlines the critical aspects of technology management in hospitals by providing appropriate indicators for benchmarking CE services, referring exclusively to the maintenance database of the CE department at the Careggi Hospital in Florence, Italy.
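Benchmarking indicators of the kind this abstract mentions are typically simple aggregates over maintenance work orders. The article does not list its indicators, so the sketch below invents one plausible example, mean downtime per corrective intervention, over made-up records.

```python
# Hedged sketch: one indicator a CE service might derive from a maintenance
# database. Records, field names, and the indicator choice are illustrative;
# the numbers are made up.
work_orders = [
    {"device": "ventilator-01", "type": "corrective", "downtime_h": 6.0},
    {"device": "ventilator-01", "type": "preventive", "downtime_h": 1.0},
    {"device": "monitor-07",    "type": "corrective", "downtime_h": 2.0},
]

# Only unplanned (corrective) work counts toward this indicator.
corrective = [w["downtime_h"] for w in work_orders if w["type"] == "corrective"]
mean_downtime = sum(corrective) / len(corrective)
print(mean_downtime)  # mean hours of downtime per corrective intervention
```

Comparing such per-device or per-department aggregates across hospitals is what turns a plain maintenance log into a benchmarking tool.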
Code of Federal Regulations, 2014 CFR
2014-10-01
... the Service Management System database without having an actual toll free subscriber for whom those... database; or (2) The Responsible Organization does not have an identified toll free subscriber agreeing to... database shall serve as that Responsible Organization's certification that there is an identified toll free...
Code of Federal Regulations, 2011 CFR
2011-10-01
... the Service Management System database without having an actual toll free subscriber for whom those... database; or (2) The Responsible Organization does not have an identified toll free subscriber agreeing to... database shall serve as that Responsible Organization's certification that there is an identified toll free...
Code of Federal Regulations, 2013 CFR
2013-10-01
... the Service Management System database without having an actual toll free subscriber for whom those... database; or (2) The Responsible Organization does not have an identified toll free subscriber agreeing to... database shall serve as that Responsible Organization's certification that there is an identified toll free...
Code of Federal Regulations, 2010 CFR
2010-10-01
... the Service Management System database without having an actual toll free subscriber for whom those... database; or (2) The Responsible Organization does not have an identified toll free subscriber agreeing to... database shall serve as that Responsible Organization's certification that there is an identified toll free...
Code of Federal Regulations, 2012 CFR
2012-10-01
... the Service Management System database without having an actual toll free subscriber for whom those... database; or (2) The Responsible Organization does not have an identified toll free subscriber agreeing to... database shall serve as that Responsible Organization's certification that there is an identified toll free...
Li, Huayan; Fuller, Jeffrey; Sun, Mei; Wang, Yong; Xu, Shuang; Feng, Hui
2014-11-01
To evaluate the situation of chronic disease management in China, and to seek methods for improving collaborative management of chronic diseases in the community. We searched literature published between January 2008 and November 2013 in databases such as the China Academic Journal Full-Text Database and PubMed. Screening was conducted strictly in accordance with the inclusion and exclusion criteria, and the selected literature was summarized on the basis of a collaboration model. We obtained 698 articles after an initial screen and finally selected 33. All studies involved patient self-management support, but only 9 mentioned communication within the team, and 11 showed a clear division of labor within the team. Chronic disease community management in China displays some disadvantages. A general service team with clear roles and responsibilities is needed to improve the service ability of team members and to provide patients with various forms of self-management services.
Intelligent community management system based on the devicenet fieldbus
NASA Astrophysics Data System (ADS)
Wang, Yulan; Wang, Jianxiong; Liu, Jiwen
2013-03-01
With the rapid development of the national economy and the improvement of people's living standards, people are making higher demands on their living environment, and higher requirements are placed on estate management content, management efficiency and service quality. This paper analyzes in depth the structure and composition of the intelligent community. According to user requirements and related specifications, it implements a district management system that includes Basic Information Management (housing information, household information, administrator-level management, password management, etc.), Service Management (standard property costs, collection of property charges, history of arrears and other property expenses), Security Management (household gas, water, electricity and other utilities, and security of the district and other public places), and Systems Management (database backup, database restore, log management). The article also analyzes the Intelligent Community System and proposes an architecture based on B/S (browser/server) technology, achieving global network device management with a friendly, easy-to-use, unified human-machine interface.
SWS: accessing SRS sites contents through Web Services.
Romano, Paolo; Marra, Domenico
2008-03-26
Web Services and Workflow Management Systems can support the creation and deployment of networked systems able to automate data analysis and retrieval processes in biomedical research. Web Services have been implemented at bioinformatics centres, and workflow systems have been proposed for biological data analysis. New databanks are often developed taking these technologies into account, but many existing databases do not allow programmatic access; only a fraction of available databanks can thus be queried through programmatic interfaces. SRS is a well-known indexing and search engine for biomedical databanks, offering public access to many databanks and analysis tools. Unfortunately, these data are not easily and efficiently accessible through Web Services. We have developed 'SRS by WS' (SWS), a tool that makes the information available in SRS sites accessible through Web Services. Information on known sites is maintained in a database, srsdb. SWS consists of a suite of WS that can query both srsdb, for information on sites and databases, and the SRS sites themselves. SWS returns results in a text-only format and can be accessed through a WSDL-compliant client. SWS enables interoperability between workflow systems and SRS implementations, also managing access to alternative sites in order to cope with network and maintenance problems, and selecting the most up-to-date among available systems. The development and implementation of Web Services allowing programmatic access to an exhaustive set of biomedical databases can significantly improve the automation of in-silico analysis. SWS supports this activity by making the biological databanks managed in public SRS sites available through a programmatic interface.
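The abstract says SWS returns results in a text-only format that clients must parse. The exact format is not specified there, so the parser below assumes a simple line-oriented layout (two-letter field codes and `//` record separators, loosely in the style of flat-file databanks SRS indexes) purely for illustration.

```python
# Hypothetical parser for a text-only result stream. The field codes, record
# separator, and sample data are assumptions for illustration, not the actual
# SWS output format.
raw = """ID: P12345
DE: Example protein
OS: Homo sapiens
//
ID: Q67890
DE: Another protein
OS: Mus musculus
//"""

records, current = [], {}
for line in raw.splitlines():
    if line.strip() == "//":          # end-of-record marker
        records.append(current)
        current = {}
    elif ": " in line:
        key, value = line.split(": ", 1)
        current[key] = value

print([r["ID"] for r in records])
```

Whatever the real format is, a workflow system would wrap a parser like this around the WSDL client so that downstream steps receive structured records rather than raw text.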
Bosma, Laine; Balen, Robert M; Davidson, Erin; Jewesson, Peter J
2003-01-01
The development and integration of a personal digital assistant (PDA)-based point-of-care database into an intravenous resource nurse (IVRN) consultation service for the purposes of consultation management and service characterization are described. The IVRN team provides a consultation service 7 days a week in this 1000-bed tertiary adult care teaching hospital. No simple, reliable method for documenting IVRN patient care activity and facilitating IVRN-initiated patient follow-up evaluation was available. Implementation of a PDA database with exportability of data to statistical analysis software was undertaken in July 2001. A Palm IIIXE PDA was purchased and a three-table, 13-field database was developed using HanDBase software. During the 7-month period of data collection, the IVRN team recorded 4868 consultations for 40 patient care areas. Full analysis of service characteristics was conducted using SPSS 10.0 software. Team members adopted the new technology with few problems, and the authors now can efficiently track and analyze the services provided by their IVRN team.
A Brief Assessment of LC2IEDM, MIST and Web Services for use in Naval Tactical Data Management
2004-07-01
server software, messaging between the client and server, and a database. The MIST database is implemented in an open-source DBMS named PostgreSQL ... PostgreSQL had its beginnings at the University of California, Berkeley, in 1986 [11]. The development of PostgreSQL has since evolved into a ... contact history from the database.
The relational clinical database: a possible solution to the star wars in registry systems.
Michels, D K; Zamieroski, M
1990-12-01
In summary, having data from other service areas available in a relational clinical database could resolve many of the problems existing in today's registry systems. Uniting sophisticated information systems into a centralized database system could definitely be a corporate asset in managing the bottom line.
ERIC Educational Resources Information Center
Far West Lab. for Educational Research and Development, San Francisco, CA.
This report is intended as a guide for local comprehensive integrated school-linked services sites and software vendors in developing and implementing case management information systems for the exchange and management of client data. The report is also intended to influence new development and future revisions of data systems, databases, and…
Information integration for a sky survey by data warehousing
NASA Astrophysics Data System (ADS)
Luo, A.; Zhang, Y.; Zhao, Y.
The virtualization service of the data system for the sky survey LAMOST is very important for astronomers. The service needs to integrate information from data collections, catalogs, and references, and to support simple federation of a set of distributed files and associated metadata. Data warehousing has been in existence for several years and has demonstrated superiority over traditional relational database management systems by providing novel indexing schemes that support efficient on-line analytical processing (OLAP) of large databases. Relational database systems such as Oracle now support warehouse capabilities, including extensions to the SQL language for OLAP operations, and a number of metadata management tools have been created. The information integration of LAMOST by applying data warehousing aims to effectively provide data and knowledge on-line.
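The OLAP roll-up the abstract alludes to can be illustrated with a toy fact table aggregated along one dimension; the survey columns and values below are invented for illustration only:

```python
from collections import defaultdict

# Toy fact table: (region, band, flux) rows from a hypothetical survey catalogue.
facts = [
    ("north", "optical", 1.2),
    ("north", "radio",   0.7),
    ("south", "optical", 2.1),
    ("south", "optical", 0.9),
]

def rollup(rows, dim):
    """Aggregate flux along one dimension (0 = region, 1 = band)."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[dim]] += row[2]
    return dict(totals)

by_region = rollup(facts, 0)
by_band = rollup(facts, 1)
```

A warehouse performs the same kind of grouping and aggregation, but with indexing schemes (e.g. bitmap indexes) that keep it efficient over very large tables.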
NASA Astrophysics Data System (ADS)
Gebhardt, Steffen; Wehrmann, Thilo; Klinger, Verena; Schettler, Ingo; Huth, Juliane; Künzer, Claudia; Dech, Stefan
2010-10-01
The German-Vietnamese water-related information system for the Mekong Delta (WISDOM) project supports business processes in Integrated Water Resources Management in Vietnam. Multiple disciplines bring together earth and ground based observation themes, such as environmental monitoring, water management, demographics, economy, information technology, and infrastructural systems. This paper introduces the components of the web-based WISDOM system including data, logic and presentation tier. It focuses on the data models upon which the database management system is built, including techniques for tagging or linking metadata with the stored information. The model also uses ordered groupings of spatial, thematic and temporal reference objects to semantically tag datasets to enable fast data retrieval, such as finding all data in a specific administrative unit belonging to a specific theme. A spatial database extension is employed by the PostgreSQL database. This object-oriented database was chosen over a relational database to tag spatial objects to tabular data, improving the retrieval of census and observational data at regional, provincial, and local areas. While the spatial database hinders processing raster data, a "work-around" was built into WISDOM to permit efficient management of both raster and vector data. The data model also incorporates styling aspects of the spatial datasets through styled layer descriptions (SLD) and web mapping service (WMS) layer specifications, allowing retrieval of rendered maps. Metadata elements of the spatial data are based on the ISO19115 standard. XML structured information of the SLD and metadata are stored in an XML database. The data models and the data management system are robust for managing the large quantity of spatial objects, sensor observations, census and document data. 
The operational WISDOM information system prototype contains modules for data management, automatic data integration, and web services for data retrieval, analysis, and distribution. The graphical user interfaces facilitate metadata cataloguing, data warehousing, web sensor data analysis and thematic mapping.
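The semantic tagging scheme described above (spatial, thematic, and temporal reference objects used to retrieve, say, all data in a specific administrative unit belonging to a specific theme) might look roughly like this in miniature; the dataset ids and tag values are invented, not WISDOM's actual data model:

```python
# Each dataset is tagged with reference objects: spatial (admin unit),
# thematic (theme), and temporal (year). All values are illustrative.
datasets = [
    {"id": "d1", "admin": "Can Tho", "theme": "water", "year": 2009},
    {"id": "d2", "admin": "Can Tho", "theme": "census", "year": 2009},
    {"id": "d3", "admin": "An Giang", "theme": "water", "year": 2010},
]

def find(admin=None, theme=None, year=None):
    """Return ids of datasets matching every tag that was given."""
    hits = []
    for d in datasets:
        if admin is not None and d["admin"] != admin:
            continue
        if theme is not None and d["theme"] != theme:
            continue
        if year is not None and d["year"] != year:
            continue
        hits.append(d["id"])
    return hits
```

In the real system the same lookups run as indexed queries inside the PostgreSQL spatial database rather than as a linear scan.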
University Library Online Reference Service Program Plan, 1986/87.
ERIC Educational Resources Information Center
Koga, James S.
This program plan for online reference service--the individualized assistance provided to a library patron using an online system--at California State Polytechnic University, Pomona, covers the areas of funding, eligibility for online services, search request eligibility, database eligibility, management of online services, reference faculty…
Technology transfer at NASA - A librarian's view
NASA Technical Reports Server (NTRS)
Buchan, Ronald L.
1991-01-01
The NASA programs, publications, and services promoting the transfer and utilization of aerospace technology developed by and for NASA are briefly surveyed. Topics addressed include the corporate sources of NASA technical information and its interest for corporate users of information services; the IAA and STAR abstract journals; NASA/RECON, NTIS, and the AIAA Aerospace Database; the RECON Space Commercialization file; the Computer Software Management and Information Center file; company information in the RECON database; and services to small businesses. Also discussed are the NASA publications Tech Briefs and Spinoff, the Industrial Applications Centers, NASA continuing bibliographies on management and patent abstracts (indexed using the NASA Thesaurus), the Index to NASA News Releases and Speeches, and the Aerospace Research Information Network (ARIN).
77 FR 71177 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-29
... automated Tri-Service, Web- based database containing credentialing, privileging, risk management, and... credentialing, privileging, risk- management and adverse actions capabilities which support medical quality... submitting comments. Mail: Federal Docket Management System Office, 4800 Mark Center Drive, East Tower, 2nd...
Kim, Chang-Gon; Mun, Su-Jeong; Kim, Ka-Na; Shin, Byung-Cheul; Kim, Nam-Kwen; Lee, Dong-Hyo; Lee, Jung-Han
2016-05-13
Manual therapy is the non-surgical conservative management of musculoskeletal disorders using the practitioner's hands on the patient's body for diagnosing and treating disease. The aim of this study is to systematically review trial-based economic evaluations of manual therapy relative to other interventions used for the management of musculoskeletal diseases. Randomised clinical trials (RCTs) on the economic evaluation of manual therapy for musculoskeletal diseases will be included in the review. The following databases will be searched from their inception: Medline, Embase, Cochrane Central Register of Controlled Trials (CENTRAL), Cumulative Index to Nursing and Allied Health Literature (CINAHL), Econlit, Mantis, Index to Chiropractic Literature, Science Citation Index, Social Science Citation Index, Allied and Complementary Medicine Database (AMED), Cochrane Database of Systematic Reviews (CDSR), National Health Service Database of Abstracts of Reviews of Effects (NHS DARE), National Health Service Health Technology Assessment Database (NHS HTA), National Health Service Economic Evaluation Database (NHS EED), CENTRAL, five Korean medical databases (Oriental Medicine Advanced Searching Integrated System (OASIS), Research Information Service System (RISS), DBPIA, Korean Traditional Knowledge Portal (KTKP) and KoreaMed) and three Chinese databases (China National Knowledge Infrastructure (CNKI), VIP and Wanfang). The evidence for the cost-effectiveness, cost-utility and cost-benefit of manual therapy for musculoskeletal diseases will be assessed as the primary outcome. Health-related quality of life and adverse effects will be assessed as secondary outcomes. We will critically appraise the included studies using the Cochrane risk of bias tool and the Drummond checklist. Results will be summarised using Slavin's qualitative best-evidence synthesis approach. The results of the study will be disseminated via a peer-reviewed journal and/or conference presentations. 
PROSPERO CRD42015026757. Published by the BMJ Publishing Group Limited.
Wauchope, R Don; Ahuja, Lajpat R; Arnold, Jeffrey G; Bingner, Ron; Lowrance, Richard; van Genuchten, Martinus T; Adams, Larry D
2003-01-01
We present an overview of USDA Agricultural Research Service (ARS) computer models and databases related to pest-management science, emphasizing current developments in environmental risk assessment and management simulation models. The ARS has a unique national interdisciplinary team of researchers in surface and sub-surface hydrology, soil and plant science, systems analysis and pesticide science, who have networked to develop empirical and mechanistic computer models describing the behavior of pests, pest responses to controls and the environmental impact of pest-control methods. Historically, much of this work has been in support of production agriculture and in support of the conservation programs of our 'action agency' sister, the Natural Resources Conservation Service (formerly the Soil Conservation Service). Because we are a public agency, our software/database products are generally offered without cost, unless they are developed in cooperation with a private-sector cooperator. Because ARS is a basic and applied research organization, with development of new science as our highest priority, these products tend to be offered on an 'as-is' basis with limited user support, except within cooperative R&D relationships with other scientists. However, rapid changes in the technology for information analysis and communication continually challenge our way of doing business.
Federal Emergency Management Information System (FEMIS) system administration guide, version 1.4.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arp, J.A.; Burnett, R.A.; Carter, R.J.
The Federal Emergency Management Information System (FEMIS) is an emergency management planning and response tool that was developed by the Pacific Northwest National Laboratory (PNNL) under the direction of the US Army Chemical Biological Defense Command. The FEMIS System Administration Guide provides information necessary for the system administrator to maintain the FEMIS system. The FEMIS system is designed for a single Chemical Stockpile Emergency Preparedness Program (CSEPP) site that has multiple Emergency Operations Centers (EOCs). Each EOC has personal computers (PCs) that emergency planners and operations personnel use to do their jobs. These PCs are connected via a local area network (LAN) to servers that provide EOC-wide services. Each EOC is interconnected to other EOCs via a Wide Area Network (WAN). Thus, FEMIS is an integrated software product that resides on a client/server computer architecture. The main body of FEMIS software, referred to as the FEMIS Application Software, resides on the PC client(s) and is directly accessible to emergency management personnel. The remainder of the FEMIS software, referred to as the FEMIS Support Software, resides on the UNIX server. The Support Software provides the communication, data distribution, and notification functionality necessary to operate FEMIS in a networked, client/server environment. The UNIX server provides Oracle relational database management system (RDBMS) services, ARC/INFO GIS (optional) capabilities, and basic file management services. PNNL-developed utilities that reside on the server include the Notification Service, the Command Service that executes the evacuation model, and AutoRecovery. To operate FEMIS, the Application Software must have access to a site-specific FEMIS emergency management database. Data that pertains to an individual EOC's jurisdiction is stored on the EOC's local server.
Information that needs to be accessible to all EOCs is automatically distributed by the FEMIS database to the other EOCs at the site.
An architecture for integrating distributed and cooperating knowledge-based Air Force decision aids
NASA Technical Reports Server (NTRS)
Nugent, Richard O.; Tucker, Richard W.
1988-01-01
MITRE has been developing a Knowledge-Based Battle Management Testbed for evaluating the viability of integrating independently-developed knowledge-based decision aids in the Air Force tactical domain. The primary goal for the testbed architecture is to permit a new system to be added to a testbed with little change to the system's software. Each system that connects to the testbed network declares that it can provide a number of services to other systems. When a system wants to use another system's service, it does not address the server system by name, but instead transmits a request to the testbed network asking for a particular service to be performed. A key component of the testbed architecture is a common database which uses a relational database management system (RDBMS). The RDBMS provides a database update notification service to requesting systems. Normally, each system is expected to monitor data relations of interest to it. Alternatively, a system may broadcast an announcement message to inform other systems that an event of potential interest has occurred. Current research is aimed at dealing with issues resulting from integration efforts, such as dealing with potential mismatches of each system's assumptions about the common database, decentralizing network control, and coordinating multiple agents.
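The brokering idea in the testbed architecture above (a system requests a named service from the network rather than addressing a server by name) can be sketched as a toy broker; the service name and handler below are hypothetical, not the testbed's actual decision aids:

```python
class Testbed:
    """Toy broker: systems declare services by name; requesters ask the
    testbed for a service instead of addressing a server directly."""
    def __init__(self):
        self.services = {}

    def declare(self, service, handler):
        """A connecting system announces a service it can provide."""
        self.services[service] = handler

    def request(self, service, *args):
        """Route a request to whichever system declared the service."""
        handler = self.services.get(service)
        if handler is None:
            raise LookupError(f"no provider for {service!r}")
        return handler(*args)

bus = Testbed()
bus.declare("threat-assessment", lambda target: f"assessed {target}")
result = bus.request("threat-assessment", "track-42")
```

Because the requester names only the service, a new provider can be swapped in with no change to the requesting system's software, which is the testbed's stated integration goal.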
ERIC Educational Resources Information Center
May, Abigail
1998-01-01
Offers some key business principles with the hope of helping educational facilities managers improve their operations. Looks at customer service, disparate databases, technological concerns, the mission of facility management, how to improve the bottom line, staffing ideas, future planning, and management suggestions. Lists seven habits of…
EPA Facility Registry Service (FRS): OIL
This dataset contains location and facility identification information from EPA's Facility Registry Service (FRS) for the subset of facilities that link to the Oil database. The Oil database contains information on Spill Prevention, Control, and Countermeasure (SPCC) and Facility Response Plan (FRP) subject facilities to prevent and respond to oil spills. FRP facilities are referred to as substantial harm facilities due to the quantities of oil stored and facility characteristics. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using vigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities. This data set contains the subset of FRS integrated facilities that link to Oil facilities once the Oil data has been integrated into the FRS database. Additional information on FRS is available at the EPA website https://www.epa.gov/enviro/facility-registry-service-frs.
Towards a Global Service Registry for the World-Wide LHC Computing Grid
NASA Astrophysics Data System (ADS)
Field, Laurence; Alandes Pradillo, Maria; Di Girolamo, Alessandro
2014-06-01
The World-Wide LHC Computing Grid encompasses a set of heterogeneous information systems: from central portals such as the Open Science Grid's Information Management System and the Grid Operations Centre Database, to the WLCG information system, where the information sources are the Grid services themselves. Providing a consistent view of the information, which involves synchronising all these information systems, is a challenging activity that has led the LHC virtual organisations to create their own configuration databases. This experience, whereby each virtual organisation's configuration database interfaces with multiple information systems, has resulted in the duplication of effort, especially relating to the use of manual checks for the handling of inconsistencies. The Global Service Registry aims to address this issue by providing a centralised service that aggregates information from multiple information systems. It shows both information on registered resources (i.e. what should be there) and available resources (i.e. what is there). The main purpose is to simplify the synchronisation of the virtual organisations' own configuration databases, which are used for job submission and data management, through the provision of a single interface for obtaining all the information. By centralising the information, automated consistency and validation checks can be performed to improve the overall quality of the information provided. Although internally the GLUE 2.0 information model is used for the purpose of integration, the Global Service Registry is not dependent on any particular information model for ingestion or dissemination. The intention is to allow the virtual organisations' configuration databases to be decoupled from the underlying information systems in a transparent way and hence simplify any possible future migration due to the evolution of those systems.
This paper presents the Global Service Registry architecture, its advantages compared to the current situation and how it can support the evolution of information systems.
77 FR 38581 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-28
...: Minority Business Development Agency. Title: Online Customer Relationship Management (CRM)/Performance... client information, service activities and progress on attainment of program goals via the Online CRM/Performance Databases. The data collected through the Online CRM/Performance Databases is used to regularly...
ERIC Educational Resources Information Center
DeLong, Richard A.
1984-01-01
Unusually hard hit by the 1970s recession, the University of Michigan accumulated more deferred maintenance problems than could be analyzed efficiently either by hand or with existing computer systems. Using an existing microcomputer and a database management software package, the maintenance service developed its own database to support…
Event Driven Messaging with Role-Based Subscriptions
NASA Technical Reports Server (NTRS)
Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Kim, Rachel; Allen, Christopher; Luong, Ivy; Chang, George; Zendejas, Silvino; Sadaqathulla, Syed
2009-01-01
Event Driven Messaging with Role-Based Subscriptions (EDM-RBS) is a framework integrated into the Service Management Database (SMDB) to allow for role-based and subscription-based delivery of synchronous and asynchronous messages over JMS (Java Messaging Service), SMTP (Simple Mail Transfer Protocol), or SMS (Short Messaging Service). This allows for 24/7 operation with users in all parts of the world. The software classifies messages by triggering data type, application source, owner of the data triggering the event (mission), classification, sub-classification, and various other secondary classifying tags. Messages are routed to applications or users based on subscription rules using a combination of the above message attributes. This program provides a framework for identifying connected users and their applications for targeted delivery of messages over JMS to the client applications the user is logged into. EDM-RBS provides the ability to send notifications over e-mail or pager rather than having to rely on a live human to do it. It is implemented as an Oracle application that uses Oracle relational database management system intrinsic functions. It is configurable to use the Oracle AQ JMS API or an external JMS provider for messaging. It fully integrates into the event-logging framework of the Service Management Database (SMDB).
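The subscription matching described above (routing a message on the combination of its mission, classification, and other tags) could be sketched like this; the attribute names, users, and channels are illustrative, not the SMDB schema:

```python
def matches(subscription, message):
    """A subscription matches when every attribute it constrains
    equals the message's value for that attribute."""
    return all(message.get(k) == v for k, v in subscription["filter"].items())

# Toy subscriptions: each names a delivery channel and an attribute filter.
subs = [
    {"user": "ops1", "channel": "jms",  "filter": {"mission": "MRO", "class": "alarm"}},
    {"user": "ops2", "channel": "smtp", "filter": {"class": "alarm"}},
    {"user": "ops3", "channel": "sms",  "filter": {"mission": "Cassini"}},
]

msg = {"mission": "MRO", "class": "alarm", "source": "scheduler"}
recipients = [(s["user"], s["channel"]) for s in subs if matches(s, msg)]
```

The third subscriber is skipped because the message's mission does not match; the other two receive the message over their respective channels.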
[Establishment of a regional pelvic trauma database in Hunan Province].
Cheng, Liang; Zhu, Yong; Long, Haitao; Yang, Junxiao; Sun, Buhua; Li, Kanghua
2017-04-28
To establish a database for pelvic trauma in Hunan Province and to start multicenter pelvic trauma registry work. Methods: To establish the database, the literature relevant to pelvic trauma was screened, the experience of established trauma databases in China and abroad was drawn upon, and the actual conditions of pelvic trauma rescue in Hunan Province were considered. The database for pelvic trauma was built on PostgreSQL and the advanced programming language Java 1.6. Results: The complex procedure of pelvic trauma rescue was described structurally. The contents of the database include general patient information, injury condition, prehospital rescue, condition on admission, treatment in hospital, status on discharge, diagnosis, classification, complications, trauma scoring, and therapeutic effect. The database can be accessed through the internet via browser/server. Its functions include patient information management, data export, history query, progress reporting, video-image management, and personal information management. Conclusion: A whole-life-cycle pelvic trauma database has been established for the first time in China. It is scientific, functional, practical, and user-friendly.
National Weather- RFC Development Management
The RFC Development Management component of the Office of Hydrologic Development provides presentations, projects and plans, the RFC Development Program, RFC Archive Database documentation, workshops, contact information, and related resources and services.
DOE technology information management system database study report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widing, M.A.; Blodgett, D.W.; Braun, M.D.
1994-11-01
To support the missions of the US Department of Energy (DOE) Special Technologies Program, Argonne National Laboratory is defining the requirements for an automated software system that will search electronic databases on technology. This report examines the work done and results to date. Argonne studied existing commercial and government sources of technology databases in five general areas: on-line services, patent database sources, government sources, aerospace technology sources, and general technology sources. First, it conducted a preliminary investigation of these sources to obtain information on the content, cost, frequency of updates, and other aspects of their databases. The Laboratory then performed detailed examinations of at least one source in each area. On this basis, Argonne recommended which databases should be incorporated in DOE's Technology Information Management System.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Browne, S.V.; Green, S.C.; Moore, K.
1994-04-01
The Netlib repository, maintained by the University of Tennessee and Oak Ridge National Laboratory, contains freely available software, documents, and databases of interest to the numerical, scientific computing, and other communities. This report includes both the Netlib User's Guide and the Netlib System Manager's Guide, and contains information about Netlib's databases, interfaces, and system implementation. The Netlib repository's databases include the Performance Database, the Conferences Database, and the NA-NET mail forwarding and Whitepages Databases. A variety of user interfaces enable users to access the Netlib repository in the manner most convenient and compatible with their networking capabilities. These interfaces include the Netlib email interface, the Xnetlib X Windows client, the netlibget command-line TCP/IP client, anonymous FTP, anonymous RCP, and gopher.
Evolution of the use of relational and NoSQL databases in the ATLAS experiment
NASA Astrophysics Data System (ADS)
Barberis, D.
2016-09-01
For many years the ATLAS experiment has used a large database infrastructure based on Oracle to store several different types of non-event data: time-dependent detector configuration and conditions data, calibrations and alignments, configurations of Grid sites, catalogues for data management tools, job records for distributed workload management tools, and run and event metadata. The rapid development of "NoSQL" databases (structured storage services) in the last five years has allowed an extended and complementary usage of traditional relational databases and new structured storage tools in order to improve the performance of existing applications and to extend their functionalities using the possibilities offered by modern storage systems. The trend is towards using the best tool for each kind of data: separating, for example, the intrinsically relational metadata from payload storage, and records that are frequently updated and benefit from transactions from archived information. Access to all components has to be orchestrated by specialised services that run on front-end machines and shield the user from the complexity of the data storage infrastructure. This paper describes this technology evolution in the ATLAS database infrastructure and presents a few examples of large database applications that benefit from it.
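The "best tool for each kind of data" pattern (relational metadata kept queryable, bulk payloads archived in a separate structured store) can be illustrated in miniature, with SQLite standing in for the relational side and a dictionary standing in for a structured storage service; table and key names are invented:

```python
import sqlite3

# Relational side: queryable metadata. "Payload" side: opaque blobs keyed by id.
payload_store = {}                      # stand-in for a structured-storage service
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE runs (run INTEGER PRIMARY KEY, detector TEXT, blob_key TEXT)")

def store(run, detector, payload):
    """Write the bulk payload to the blob store, the metadata to SQL."""
    key = f"run-{run}"
    payload_store[key] = payload        # archived, rarely-updated bulk data
    db.execute("INSERT INTO runs VALUES (?, ?, ?)", (run, detector, key))

store(1001, "pixel", b"\x00" * 16)
store(1002, "muon",  b"\xff" * 16)

# Query the relational metadata first, then fetch the payload by key.
(key,) = db.execute("SELECT blob_key FROM runs WHERE detector = 'muon'").fetchone()
payload = payload_store[key]
```

The front-end service that "shields the user" would wrap exactly this two-step lookup behind a single call.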
Kuhn, Stefan; Schlörer, Nils E
2015-08-01
With its laboratory information management system, nmrshiftdb2 supports the integration of electronic lab administration and management into academic NMR facilities. It also offers the setup of a local database, while granting full access to nmrshiftdb2's World Wide Web database. This freely available system allows, on the one hand, the submission of orders for measurement, transfers recorded data automatically or manually, and enables download of spectra via a web interface, as well as integrated access to the prediction, search, and assignment tools of the NMR database for lab users. On the other hand, for the staff and lab administration, the flow of all orders can be supervised; administrative tools also include user and hardware management, a statistics function for accounting purposes, and a 'QuickCheck' function for assignment control, to facilitate quality control of assignments submitted to the (local) database. The laboratory information management system and database are based on a web interface as front end and are therefore independent of the operating system in use. Copyright © 2015 John Wiley & Sons, Ltd.
The research of network database security technology based on web service
NASA Astrophysics Data System (ADS)
Meng, Fanxing; Wen, Xiumei; Gao, Liting; Pang, Hui; Wang, Qinglin
2013-03-01
Database technology is one of the most widely applied computer technologies, and its security is becoming more and more important. This paper introduces database security and network database security levels, studies network database security technology, analyzes in particular a sub-key encryption algorithm, and applies this algorithm successfully to a campus one-card system. The realization process of the encryption algorithm is discussed; the method can serve as a reference in many fields, particularly in management information system security and e-commerce.
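The paper's actual sub-key algorithm is not given in this abstract; one common way to derive per-field sub-keys from a master key is an HMAC-based construction, sketched below. The master key, labels, and the XOR "cipher" are deliberately toy stand-ins for illustration, not a secure scheme:

```python
import hmac
import hashlib

MASTER_KEY = b"master-secret"  # illustrative only; real keys need proper management

def subkey(table, column):
    """Derive a per-column sub-key; compromising one field key
    does not reveal the master key or other columns' keys."""
    label = f"{table}.{column}".encode()
    return hmac.new(MASTER_KEY, label, hashlib.sha256).digest()

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    """Toy cipher for illustration only -- not secure for real data."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

k = subkey("students", "card_balance")
ciphertext = xor_encrypt(b"42.50", k)
plaintext = xor_encrypt(ciphertext, k)   # XOR with the same key inverts itself
```

Each (table, column) pair gets a distinct key, so different database fields can be encrypted independently under one master secret.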
Consortial IT Services: Collaborating To Reduce the Pain.
ERIC Educational Resources Information Center
Klonoski, Ed
The Connecticut Distance Learning Consortium (CTDLC) provides its 32 members with Information Technologies (IT) services including a portal Web site, course management software, course hosting and development, faculty training, a help desk, online assessment, and a student financial aid database. These services are supplied to two- and four-year…
12 CFR 517.1 - Purpose and scope.
Code of Federal Regulations, 2010 CFR
2010-01-01
..., expert witnesses, customized training, relocation services, information systems technology (computer systems, database management, software and office automation), or micrographic services; or in support of...-Owned Businesses Outreach Program (Outreach Program) is to ensure that firms owned and operated by...
12 CFR 517.1 - Purpose and scope.
Code of Federal Regulations, 2012 CFR
2012-01-01
..., expert witnesses, customized training, relocation services, information systems technology (computer systems, database management, software and office automation), or micrographic services; or in support of...-Owned Businesses Outreach Program (Outreach Program) is to ensure that firms owned and operated by...
12 CFR 517.1 - Purpose and scope.
Code of Federal Regulations, 2011 CFR
2011-01-01
..., expert witnesses, customized training, relocation services, information systems technology (computer systems, database management, software and office automation), or micrographic services; or in support of...-Owned Businesses Outreach Program (Outreach Program) is to ensure that firms owned and operated by...
The CMS dataset bookkeeping service
NASA Astrophysics Data System (ADS)
Afaq, A.; Dolgert, A.; Guo, Y.; Jones, C.; Kosyakov, S.; Kuznetsov, V.; Lueking, L.; Riley, D.; Sekhri, V.
2008-07-01
The CMS Dataset Bookkeeping Service (DBS) has been developed to catalog all CMS event data from Monte Carlo and Detector sources. It provides the ability to identify MC or trigger source, track data provenance, construct datasets for analysis, and discover interesting data. CMS requires processing and analysis activities at various service levels and the DBS system provides support for localized processing or private analysis, as well as global access for CMS users at large. Catalog entries can be moved among the various service levels with a simple set of migration tools, thus forming a loose federation of databases. DBS is available to CMS users via a Python API, Command Line, and a Discovery web page interfaces. The system is built as a multi-tier web application with Java servlets running under Tomcat, with connections via JDBC to Oracle or MySQL database backends. Clients connect to the service through HTTP or HTTPS with authentication provided by GRID certificates and authorization through VOMS. DBS is an integral part of the overall CMS Data Management and Workflow Management systems.
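The movement of catalog entries between service levels in a loose federation of databases, as the DBS abstract describes, can be sketched as follows; the level names and dataset path are illustrative, not DBS's actual scopes or naming scheme:

```python
# Toy federation: one catalogue per service level; migration moves an
# entry between levels, e.g. from private analysis scope to global access.
catalogues = {"private": {}, "local": {}, "global": {}}

def register(level, dataset, origin=None):
    """Add a catalogue entry at a service level, recording its origin."""
    catalogues[level][dataset] = {"origin": origin}

def migrate(dataset, src, dst):
    """Move an entry from one service level's catalogue to another's."""
    entry = catalogues[src].pop(dataset)
    entry["origin"] = src
    catalogues[dst][dataset] = entry

register("private", "/MC/ttbar/v1")
migrate("/MC/ttbar/v1", "private", "global")
```

In the real system the same promotion is performed by the migration tools over the Oracle or MySQL backends, preserving provenance as the entry moves.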
EPA Facility Registry Service (FRS): TRI
This web feature service contains location and facility identification information from EPA's Facility Registry Service (FRS) for the subset of facilities that link to the Toxic Release Inventory (TRI) System. TRI is a publicly available EPA database reported annually by certain covered industry groups, as well as federal facilities. It contains information about more than 650 toxic chemicals that are being used, manufactured, treated, transported, or released into the environment, and includes information about waste management and pollution prevention activities. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using rigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities. This data set contains the subset of FRS integrated facilities that link to TRI facilities once the TRI data has been integrated into the FRS database. Additional information on FRS is available at the EPA website https://www.epa.gov/enviro/facility-registry-service-frs.
LIS Journals in the Knowledge Age.
ERIC Educational Resources Information Center
Breen, Eileen
This paper examines EMERALD LIS and how it facilitates the use of information contained in LIS (library and information science) journals for improvements and progress. EMERALD LIS is a full-text database of journals in information management, library technology, library and information service management, and collection management/development.…
Wollbrett, Julien; Larmande, Pierre; de Lamotte, Frédéric; Ruiz, Manuel
2013-04-15
In recent years, a large amount of "-omics" data have been produced. However, these data are stored in many different species-specific databases that are managed by different institutes and laboratories. Biologists often need to find and assemble data from disparate sources to perform certain analyses. Searching for these data and assembling them is a time-consuming task. The Semantic Web helps to facilitate interoperability across databases. A common approach involves the development of wrapper systems that map a relational database schema onto existing domain ontologies. However, few attempts have been made to automate the creation of such wrappers. We developed a framework, named BioSemantic, for the creation of Semantic Web Services that are applicable to relational biological databases. This framework makes use of both Semantic Web and Web Services technologies and can be divided into two main parts: (i) the generation and semi-automatic annotation of an RDF view; and (ii) the automatic generation of SPARQL queries and their integration into Semantic Web Services backbones. We have used our framework to integrate genomic data from different plant databases. BioSemantic is a framework that was designed to speed integration of relational databases. We present how it can be used to speed the development of Semantic Web Services for existing relational biological databases. Currently, it creates and annotates RDF views that enable the automatic generation of SPARQL queries. Web Services are also created and deployed automatically, and the semantic annotations of our Web Services are added automatically using SAWSDL attributes. BioSemantic is downloadable at http://southgreen.cirad.fr/?q=content/Biosemantic.
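The framework's automatic SPARQL generation from a relational-to-ontology mapping can be sketched in miniature. The mapping dictionary, ontology terms, and `build_sparql` helper below are all invented for illustration; in BioSemantic the annotations live in a semi-automatically annotated RDF view, not a Python dict:

```python
# Hypothetical column-to-ontology mapping for one relational table.
MAPPING = {
    "gene": {
        "name":       "obo:gene_symbol",
        "chromosome": "obo:chromosome",
    }
}

def build_sparql(table, want, filters):
    # Emit a SELECT over the mapped properties of one table, echoing the
    # automatic query-generation step: requested columns become projected
    # variables, filter columns become FILTER clauses.
    props = MAPPING[table]
    body = [f"  ?s {props[c]} ?{c} ." for c in want + list(filters)]
    body += [f'  FILTER (?{c} = "{v}")' for c, v in filters.items()]
    head = "SELECT " + " ".join(f"?{c}" for c in want)
    return head + " WHERE {\n" + "\n".join(body) + "\n}"

q = build_sparql("gene", ["name"], {"chromosome": "5"})
print(q)
```

The generated query can then be wrapped behind a Web Service endpoint, so a biologist asks for gene names on chromosome 5 without knowing either the SQL schema or the SPARQL syntax underneath.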
Analysis and preliminary design of Kunming land use and planning management information system
NASA Astrophysics Data System (ADS)
Li, Li; Chen, Zhenjie
2007-06-01
This article analyzes the Kunming land use planning and management information system in terms of its building objectives and requirements, and identifies the system's users, functional requirements and construction requirements. On this basis, a three-tier architecture based on C/S and B/S is defined: the user interface layer, the business logic layer and the data services layer. According to the requirements for the land use planning and management information database derived from standards of the Ministry of Land and Resources and the construction program of the Golden Land Project, this paper divides the system databases into a planning document database, a planning implementation database, a working map database and a system maintenance database. In the design of the system interface, various methods and data formats are used for data transmission and sharing between upper and lower levels. According to the system analysis results, the main modules of the system are designed as follows: planning data management; planning and annual plan preparation and control; day-to-day planning management; planning revision management; decision-making support; thematic query statistics; and planning public participation. Finally, technologies for realizing the system are discussed, including the system operation mode and development platform.
Marketing Secondary Information Services: How and to Whom.
ERIC Educational Resources Information Center
Wolinsky, Carol Baker
1983-01-01
Discussion of the marketing of bibliographic databases focuses on defining the market, the purchasing process, and the purchase decision process for researchers, managers, and librarians. The application of marketing concepts to the purchase of online information services is noted. (EJS)
UniGene Tabulator: a full parser for the UniGene format.
Lenzi, Luca; Frabetti, Flavia; Facchin, Federica; Casadei, Raffaella; Vitale, Lorenza; Canaider, Silvia; Carinci, Paolo; Zannotti, Maria; Strippoli, Pierluigi
2006-10-15
UniGene Tabulator 1.0 provides a solution for full parsing of the UniGene flat file format; it implements a structured graphical representation of each data field present in UniGene following import into a common database management system usable on a personal computer. This database includes related tables for sequence, protein similarity, sequence-tagged site (STS) and transcript map interval (TXMAP) data, plus a summary table where each record represents a UniGene cluster. UniGene Tabulator enables full local management of UniGene data, allowing parsing, querying, indexing, retrieving, exporting and analysis of UniGene data in relational database form, usable on computers running Macintosh (OS X 10.3.9 or later) or Windows (2000 with Service Pack 4, or XP with Service Pack 2 or later) operating systems. The current release, including both FileMaker runtime applications, is freely available at http://apollo11.isto.unibo.it/software/
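The parsing step the tool performs can be sketched for a UniGene-style flat file: records are key/value lines, repeatable fields (such as SEQUENCE) accumulate into lists, and `//` terminates a record. The sample record and field handling below are illustrative, simplified from the published format rather than a full reimplementation:

```python
RECORD_END = "//"
REPEATED = {"PROTSIM", "STS", "SEQUENCE"}  # fields that may occur many times

def parse_unigene(text):
    # Parse UniGene-style flat-file records into a list of dicts, one per
    # cluster; repeatable fields become lists (ready for child tables).
    records, rec = [], {}
    for line in text.splitlines():
        line = line.rstrip()
        if line == RECORD_END:
            records.append(rec)
            rec = {}
        elif line:
            key, _, value = line.partition(" ")
            value = value.strip()
            if key in REPEATED:
                rec.setdefault(key, []).append(value)
            else:
                rec[key] = value
    return records

sample = """ID          Hs.1
TITLE       alpha-2-macroglobulin
SEQUENCE    ACC=M11313; NID=g178674
SEQUENCE    ACC=BC026246; NID=g20072188
//
"""
recs = parse_unigene(sample)
print(recs[0]["ID"], len(recs[0]["SEQUENCE"]))
# → Hs.1 2
```

Mapping the scalar fields to a summary table and each repeated-field list to its own related table yields exactly the relational shape the abstract describes (sequence, STS, and similarity tables keyed by cluster).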
J.M. Bowker; C.M. Starbuck; D.B.K. English; J.C. Bergstrom; R.S. Rosenburger; D.C. McCollum
2009-01-01
The USDA Forest Service (FS) manages 193 million acres of public land in the United States. These public resources include vast quantities of natural resources including timber, wildlife, watersheds, air sheds, and ecosystems. The Forest Service was established in 1905, and the FS has been directed by Congress to manage the National Forests and Grasslands for the...
75 FR 61761 - Renewal of Charter for the Chronic Fatigue Syndrome Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-06
... professionals, and the biomedical, academic, and research communities about chronic fatigue syndrome advances... accessing the FACA database that is maintained by the Committee Management Secretariat under the General Services Administration. The Web site address for the FACA database is http://fido.gov/facadatabase . Dated...
Katayama, Toshiaki; Arakawa, Kazuharu; Nakao, Mitsuteru; Ono, Keiichiro; Aoki-Kinoshita, Kiyoko F; Yamamoto, Yasunori; Yamaguchi, Atsuko; Kawashima, Shuichi; Chun, Hong-Woo; Aerts, Jan; Aranda, Bruno; Barboza, Lord Hendrix; Bonnal, Raoul Jp; Bruskiewich, Richard; Bryne, Jan C; Fernández, José M; Funahashi, Akira; Gordon, Paul Mk; Goto, Naohisa; Groscurth, Andreas; Gutteridge, Alex; Holland, Richard; Kano, Yoshinobu; Kawas, Edward A; Kerhornou, Arnaud; Kibukawa, Eri; Kinjo, Akira R; Kuhn, Michael; Lapp, Hilmar; Lehvaslaiho, Heikki; Nakamura, Hiroyuki; Nakamura, Yasukazu; Nishizawa, Tatsuya; Nobata, Chikashi; Noguchi, Tamotsu; Oinn, Thomas M; Okamoto, Shinobu; Owen, Stuart; Pafilis, Evangelos; Pocock, Matthew; Prins, Pjotr; Ranzinger, René; Reisinger, Florian; Salwinski, Lukasz; Schreiber, Mark; Senger, Martin; Shigemoto, Yasumasa; Standley, Daron M; Sugawara, Hideaki; Tashiro, Toshiyuki; Trelles, Oswaldo; Vos, Rutger A; Wilkinson, Mark D; York, William; Zmasek, Christian M; Asai, Kiyoshi; Takagi, Toshihisa
2010-08-21
Web services have become a key technology for bioinformatics, since life science databases are globally decentralized and the exponential increase in the amount of available data demands efficient systems without the need to transfer entire databases for every step of an analysis. However, various incompatibilities among database resources and analysis services make it difficult to connect and integrate these into interoperable workflows. To resolve this situation, we invited domain specialists from web service providers, client software developers, Open Bio* projects, the BioMoby project and researchers of emerging areas where a standard exchange data format is not well established, for an intensive collaboration entitled the BioHackathon 2008. The meeting was hosted by the Database Center for Life Science (DBCLS) and Computational Biology Research Center (CBRC) and was held in Tokyo from February 11th to 15th, 2008. In this report we highlight the work accomplished and the common issues that arose from this event, including the standardization of data exchange formats and services in the emerging fields of glycoinformatics, biological interaction networks, text mining, and phyloinformatics. In addition, common shared object development based on BioSQL, as well as technical challenges in large data management, asynchronous services, and security are discussed. Consequently, we improved interoperability of web services in several fields; however, further cooperation among major database centers and continued collaborative efforts between service providers and software developers are still necessary for an effective advance in bioinformatics web service technologies.
Federated Web-accessible Clinical Data Management within an Extensible NeuroImaging Database
Keator, David B.; Wei, Dingying; Fennema-Notestine, Christine; Pease, Karen R.; Bockholt, Jeremy; Grethe, Jeffrey S.
2010-01-01
Managing vast datasets collected throughout multiple clinical imaging communities has become critical with the ever increasing and diverse nature of datasets. Development of data management infrastructure is further complicated by technical and experimental advances that drive modifications to existing protocols and acquisition of new types of research data to be incorporated into existing data management systems. In this paper, an extensible data management system for clinical neuroimaging studies is introduced: The Human Clinical Imaging Database (HID) and Toolkit. The database schema is constructed to support the storage of new data types without changes to the underlying schema. The complex infrastructure allows management of experiment data, such as image protocol and behavioral task parameters, as well as subject-specific data, including demographics, clinical assessments, and behavioral task performance metrics. Of significant interest, embedded clinical data entry and management tools enhance both consistency of data reporting and automatic entry of data into the database. The Clinical Assessment Layout Manager (CALM) allows users to create on-line data entry forms for use within and across sites, through which data is pulled into the underlying database via the generic clinical assessment management engine (GAME). Importantly, the system is designed to operate in a distributed environment, serving both human users and client applications in a service-oriented manner. Querying capabilities use a built-in multi-database parallel query builder/result combiner, allowing web-accessible queries within and across multiple federated databases. The system along with its documentation is open-source and available from the Neuroimaging Informatics Tools and Resource Clearinghouse (NITRC) site. PMID:20567938
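The multi-database parallel query builder/result combiner described above can be sketched as a fan-out/fan-in over several site databases. The subject table, diagnosis values, and thread-pool approach here are invented stand-ins, not HID's actual schema or query engine:

```python
import sqlite3
from concurrent.futures import ThreadPoolExecutor

def make_site(rows):
    # One in-memory database standing in for a federated site's HID instance
    db = sqlite3.connect(":memory:", check_same_thread=False)
    db.execute("CREATE TABLE subjects (id TEXT, diagnosis TEXT)")
    db.executemany("INSERT INTO subjects VALUES (?, ?)", rows)
    return db

def query_site(db, diagnosis):
    return db.execute(
        "SELECT id FROM subjects WHERE diagnosis = ?", (diagnosis,)
    ).fetchall()

sites = [
    make_site([("s1", "control"), ("s2", "schizophrenia")]),
    make_site([("s3", "schizophrenia")]),
]
# Fan the same query out to every site in parallel, then combine results,
# loosely mirroring the parallel query builder/result combiner.
with ThreadPoolExecutor() as pool:
    parts = pool.map(lambda db: query_site(db, "schizophrenia"), sites)
combined = sorted(row[0] for part in parts for row in part)
print(combined)
# → ['s2', 's3']
```

Running the per-site queries concurrently means the federation answers in roughly the time of the slowest site rather than the sum of all sites, which matters once queries span many remote databases.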
Federated web-accessible clinical data management within an extensible neuroimaging database.
Ozyurt, I Burak; Keator, David B; Wei, Dingying; Fennema-Notestine, Christine; Pease, Karen R; Bockholt, Jeremy; Grethe, Jeffrey S
2010-12-01
National Agricultural Statistics Service (NASS): Agricultural Chemical Use
The Agricultural Chemical Use Database is a cooperative effort among USDA, the USDA Regional Pest Management Centers and the NSF Center for Integrated Pest Management (CIPM). All data available have been previously published by NASS and have been consolidated at...
NASA Astrophysics Data System (ADS)
Choi, Sang-Hwa; Kim, Sung Dae; Park, Hyuk Min; Lee, SeungHa
2016-04-01
We established and operate an integrated data system for managing, archiving and sharing marine geology and geophysical data around Korea produced by various research projects and programs at the Korea Institute of Ocean Science & Technology (KIOST). First of all, to keep the data system consistent across continuous data updates, we set up standard operating procedures (SOPs) for data archiving, data processing and conversion, data quality control, data uploading, database maintenance, etc. The system comprises two databases: ARCHIVE DB, which stores archived data in their original forms and formats from data providers, and GIS DB, which manages all other compiled, processed and derived data and information for data services and GIS application services. Oracle 11g was adopted as the relational DBMS, and open-source GIS technologies were applied for the GIS services: OpenLayers for the user interface, GeoServer for the application server, and PostGIS with PostgreSQL for the GIS database. For convenient use of geophysical data in SEG-Y format, a viewer program was developed and embedded in the system. Users can search data through the GIS user interface and save the results as a report.
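A viewer like the one embedded in this system must start by decoding the SEG-Y binary file header. As a minimal sketch (not the actual KIOST viewer), the snippet below reads two common fields at their byte offsets from the SEG-Y rev 1 standard, using a fabricated in-memory file image:

```python
import struct

# Byte offsets (0-based) into the SEG-Y binary file header, which follows
# the 3200-byte textual header; both fields are 2-byte big-endian integers
# per the SEG-Y rev 1 standard.
TEXT_HEADER_LEN = 3200
SAMPLE_INTERVAL_US = 3216   # bytes 3217-3218: sample interval, microseconds
SAMPLES_PER_TRACE = 3220    # bytes 3221-3222: samples per data trace

def read_binary_header(buf):
    # Pull two fields a viewer needs before it can render any traces.
    def u16(offset):
        return struct.unpack_from(">H", buf, offset)[0]
    return {
        "sample_interval_us": u16(SAMPLE_INTERVAL_US),
        "samples_per_trace": u16(SAMPLES_PER_TRACE),
    }

# Fabricate a minimal file image: textual header plus zeroed binary header,
# then patch in a 2 ms sample interval and 1500 samples per trace.
image = bytearray(TEXT_HEADER_LEN + 400)
struct.pack_into(">H", image, SAMPLE_INTERVAL_US, 2000)
struct.pack_into(">H", image, SAMPLES_PER_TRACE, 1500)
print(read_binary_header(image))
# → {'sample_interval_us': 2000, 'samples_per_trace': 1500}
```

With these two values a viewer can compute trace lengths and time axes; everything else (trace headers, sample format) builds on the same fixed-offset decoding pattern.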
Microcomputer Software for Libraries: A Survey.
ERIC Educational Resources Information Center
Nolan, Jeanne M.
1983-01-01
Reports on findings of research done by Nolan Information Management Services concerning availability of microcomputer software for libraries. Highlights include software categories (specific, generic-database management programs, original); number of programs available in 1982 for 12 applications; projections for 1983; and future software…
Wang, Tom Kai Ming; Chow, Kok-Lam; Lin, Aaron; Chataline, Alexei; White, Harvey; Dawes, Matthew; Gamble, Greg; Ellis, Chris
2018-03-09
To review the number, characteristics and clinical management of suspected ACS patients admitted to cardiology and non-cardiology services at Auckland City Hospital, to assess differences between these services, and to assess the number who would potentially be enrolled in the All New Zealand Acute Coronary Syndrome (ACS) Quality Improvement Programme (ANZACS-QI) database. Auckland City Hospital patient data were extracted from the Australia and New Zealand ACS 'SNAPSHOT' audit, performed over 14 days in May 2012. There were 121 suspected ACS admissions to Auckland City Hospital during the audit period, with 45 (37%) patients directly managed by the cardiology service and 76 (63%) patients cared for by non-cardiology services. Based on the subsequent discharge diagnosis, the cardiology service had more patients with definite ACS than the non-cardiology services: 27/45 (60%) compared to 16/76 (21%), difference (95% CI) 39% (22-56), P<0.0001. Cardiology ACS patients were more likely to undergo echocardiography: 15/27 (56%) compared to 2/16 (13%), difference (95% CI) 42% (18-68), P=0.0089; coronary angiography: 21/27 (78%) compared to 3/16 (19%), difference (95% CI) 59% (34-84), P=0.0003; coronary revascularisation: 18/27 (67%) compared to 3/16 (19%), difference (95% CI) 48% (22-74), P=0.004; and to be discharged on two antiplatelet agents: 18/26 (69%) compared to 3/15 (20%), difference (95% CI) 49% (22-76), P=0.0036, or an ACEI/ARB: 20/26 (77%) compared to 5/15 (33%), difference (95% CI) 44% (15-72), P=0.0088. In patients with a discharge diagnosis of definite ACS, those managed by non-cardiology services were less likely to receive guideline-recommended investigations and management in this relatively small cohort study. About one-third of all ACS patients are managed by non-cardiology services and would not be recorded in the ANZACS-QI database.
Information persistence using XML database technology
NASA Astrophysics Data System (ADS)
Clark, Thomas A.; Lipa, Brian E. G.; Macera, Anthony R.; Staskevich, Gennady R.
2005-05-01
The Joint Battlespace Infosphere (JBI) Information Management (IM) services provide information exchange and persistence capabilities that support tailored, dynamic, and timely access to required information, enabling near real-time planning, control, and execution for DoD decision making. JBI IM services will be built on a substrate of network centric core enterprise services and when transitioned, will establish an interoperable information space that aggregates, integrates, fuses, and intelligently disseminates relevant information to support effective warfighter business processes. This virtual information space provides individual users with information tailored to their specific functional responsibilities and provides a highly tailored repository of, or access to, information that is designed to support a specific Community of Interest (COI), geographic area or mission. Critical to effective operation of JBI IM services is the implementation of repositories, where data, represented as information, is persisted for quick and easy retrieval. This paper will address information representation, persistence and retrieval using existing database technologies to manage structured data in Extensible Markup Language (XML) format as well as unstructured data in an IM services-oriented environment. Three basic categories of database technologies will be compared and contrasted: Relational, XML-Enabled, and Native XML. These technologies have diverse properties such as maturity, performance, query language specifications, indexing, and retrieval methods. We will describe our application of these evolving technologies within the context of a JBI Reference Implementation (RI) by providing some hopefully insightful anecdotes and lessons learned along the way.
This paper will also outline future directions, promising technologies and emerging COTS products that can offer more powerful information management representations, better persistence mechanisms and improved retrieval techniques.
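The relational-versus-XML-enabled trade-off the paper compares can be illustrated with a toy example: the same document shredded into element rows (relational) and kept whole as text (XML-enabled). The document content, table names, and schema are invented; `sqlite3` and `xml.etree` stand in for the evaluated products:

```python
import sqlite3
import xml.etree.ElementTree as ET

doc = "<report><unit>3rd BN</unit><status>ready</status></report>"

db = sqlite3.connect(":memory:")
# Relational approach: shred the XML into one row per element
db.execute("CREATE TABLE elements (doc_id INTEGER, tag TEXT, text TEXT)")
# XML-enabled approach: keep the document whole, stored as text
db.execute("CREATE TABLE documents (doc_id INTEGER, xml TEXT)")

db.execute("INSERT INTO documents VALUES (1, ?)", (doc,))
for elem in ET.fromstring(doc).iter():
    db.execute("INSERT INTO elements VALUES (1, ?, ?)", (elem.tag, elem.text))

# Relational retrieval: direct (indexable) lookup by element tag
status = db.execute("SELECT text FROM elements WHERE tag = 'status'").fetchone()[0]
# XML-enabled retrieval: fetch the stored document, then re-parse it
stored = db.execute("SELECT xml FROM documents WHERE doc_id = 1").fetchone()[0]
tree = ET.fromstring(stored)
print(status, tree.find("unit").text)
# → ready 3rd BN
```

Shredding gives fast, indexable field access but loses document order and nesting unless extra columns record them; whole-document storage preserves fidelity but pushes query cost into parse time, which is essentially the tension among the three product categories the paper evaluates.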
ERIC Educational Resources Information Center
Terawaki, Yuki; Takahashi, Yuichi; Kodama, Yasushi; Yana, Kazuo
2011-01-01
This paper describes an integration of different Relational Database Management System (RDBMS) of two Course Management Systems (CMS) called Sakai and the Common Factory for Inspiration and Value in Education (CFIVE). First, when the service of CMS is provided campus-wide, the problems of user support, CMS operation and customization of CMS are…
Adding Hierarchical Objects to Relational Database General-Purpose XML-Based Information Managements
NASA Technical Reports Server (NTRS)
Lin, Shu-Chun; Knight, Chris; La, Tracy; Maluf, David; Bell, David; Tran, Khai Peter; Gawdiak, Yuri
2006-01-01
NETMARK is a flexible, high-throughput software system for managing, storing, and rapid searching of unstructured and semi-structured documents. NETMARK transforms such documents from their original highly complex, constantly changing, heterogeneous data formats into well-structured, common data formats using Hypertext Markup Language (HTML) and/or Extensible Markup Language (XML). The software implements an object-relational database system that combines the best practices of the relational model utilizing Structured Query Language (SQL) with those of the object-oriented, semantic database model for creating complex data. In particular, NETMARK takes advantage of the Oracle 8i object-relational database model using physical-address data types for very efficient keyword searches of records across both context and content. NETMARK also supports multiple international standards such as WebDAV for drag-and-drop file management and SOAP for integrated information management using Web services. The document-organization and -searching capabilities afforded by NETMARK are likely to make this software attractive for use in disciplines as diverse as science, auditing, and law enforcement.
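The distinction between searching "context" and "content" can be made concrete with a toy keyword search over an XML document: context means the structural names (tags and attributes), content means the element text. The document and `keyword_search` helper are invented for illustration and do not reflect NETMARK's actual Oracle-based implementation:

```python
import xml.etree.ElementTree as ET

doc = ET.fromstring(
    "<audit><finding severity='high'>database access unlogged</finding>"
    "<owner>security office</owner></audit>"
)

def keyword_search(root, term):
    # Match a keyword against both context (tag and attribute names/values)
    # and content (element text), returning the tags of matching elements.
    term = term.lower()
    hits = []
    for elem in root.iter():
        context = [elem.tag] + [x for kv in elem.attrib.items() for x in kv]
        content = elem.text or ""
        if any(term in c.lower() for c in context) or term in content.lower():
            hits.append(elem.tag)
    return hits

print(keyword_search(doc, "database"))   # content hit → ['finding']
print(keyword_search(doc, "severity"))   # context hit (attribute name) → ['finding']
```

A search engine that indexes both kinds of match lets a user find a record whether the keyword appears in what the document says or in how the document is structured.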
Experience with Multi-Tier Grid MySQL Database Service Resiliency at BNL
NASA Astrophysics Data System (ADS)
Wlodek, Tomasz; Ernst, Michael; Hover, John; Katramatos, Dimitrios; Packard, Jay; Smirnov, Yuri; Yu, Dantong
2011-12-01
We describe the use of F5's BIG-IP smart switch technology (3600 Series and Local Traffic Manager v9.0) to provide load balancing and automatic fail-over to multiple Grid services (GUMS, VOMS) and their associated back-end MySQL databases. This resiliency is introduced in front of the external application servers and also for the back-end database systems, which is what makes it "multi-tier". The combination of solutions chosen to ensure high availability of the services, in particular the database replication and fail-over mechanism, are discussed in detail. The paper explains the design and configuration of the overall system, including virtual servers, machine pools, and health monitors (which govern routing), as well as the master-slave database scheme and fail-over policies and procedures. Pre-deployment planning and stress testing will be outlined. Integration of the systems with our Nagios-based facility monitoring and alerting is also described. And application characteristics of GUMS and VOMS which enable effective clustering will be explained. We then summarize our practical experiences and real-world scenarios resulting from operating a major US Grid center, and assess the applicability of our approach to other Grid services in the future.
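The health-monitor-governed routing described above can be reduced to a small sketch: a virtual server consults per-backend health checks and fails over when the preferred backend stops responding. The `Backend` class and pool ordering are invented stand-ins; a real BIG-IP monitor issues actual probe queries against the MySQL servers:

```python
class Backend:
    """Stand-in for one MySQL server behind the load balancer."""
    def __init__(self, name, healthy=True):
        self.name, self.healthy = name, healthy

    def ping(self):
        # A real health monitor would run a probe query with a timeout.
        return self.healthy

def route(pool):
    # Pick the first healthy backend in preference order, mimicking a
    # health-monitor-governed virtual server with fail-over.
    for backend in pool:
        if backend.ping():
            return backend
    raise RuntimeError("no healthy backends")

master, replica = Backend("mysql-master"), Backend("mysql-replica")
pool = [master, replica]           # master preferred, replica is fail-over
print(route(pool).name)            # → mysql-master
master.healthy = False             # simulate master failure
print(route(pool).name)            # → mysql-replica
```

The subtle part in production, which the paper's fail-over policies address, is what happens after fail-over: with master-slave replication the old master must not silently rejoin the pool and accept writes, or the two databases diverge.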
ERIC Educational Resources Information Center
Samfundet for Informationstjanst i Finland, Helsinki.
The 54 conference papers compiled in this proceedings include plenary addresses; reviews of Nordic databases; and discussions of documents, systems, services, and products as they relate to information resources management (IRM). Almost half of the presentations are in English: (1) "What Is Information Resources Management?" (Forest…
ERIC Educational Resources Information Center
Berger, Mary C.; Bourne, Charles P.
1988-01-01
The first paper discusses the factors involved in a decision to provide document delivery services, including user needs, competitive climate, business potential, fit with current business, and logistics of providing the service. The second reviews the kinds of additional products that can be developed as a byproduct of conventional database…
Review of the Composability Problem for System Evaluation
2004-11-01
burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to Washington Headquarters Services ...directory services (e.g., the Lightweight Directory Access Protocol (LDAP)), authentication (e.g., Kerberos), databases, user interface (e.g...exemplifies this type of development, by its use of commercial components and systems for authentication, access management, directory services
The USA-NPN Information Management System: A tool in support of phenological assessments
NASA Astrophysics Data System (ADS)
Rosemartin, A.; Vazquez, R.; Wilson, B. E.; Denny, E. G.
2009-12-01
The USA National Phenology Network (USA-NPN) serves science and society by promoting a broad understanding of plant and animal phenology and the relationships among phenological patterns and all aspects of environmental change. Data management and information sharing are central to the USA-NPN mission. The USA-NPN develops, implements, and maintains a comprehensive Information Management System (IMS) to serve the needs of the network, including the collection, storage and dissemination of phenology data, access to phenology-related information, tools for data interpretation, and communication among partners of the USA-NPN. The IMS includes components for data storage, such as the National Phenology Database (NPD), and several online user interfaces to accommodate data entry, data download, data visualization and catalog searches for phenology-related information. The IMS is governed by a set of standards to ensure security, privacy, data access, and data quality. The National Phenology Database is designed to efficiently accommodate large quantities of phenology data, to be flexible to the changing needs of the network, and to provide for quality control. The database stores phenology data from multiple sources (e.g., partner organizations, researchers and citizen observers), and provides for integration with legacy datasets. Several services will be created to provide access to the data, including reports, visualization interfaces, and web services. These services will provide integrated access to phenology and related information for scientists, decision-makers and general audiences. Phenological assessments at any scale will rely on secure and flexible information management systems for the organization and analysis of phenology data. The USA-NPN’s IMS can serve phenology assessments directly, through data management and indirectly as a model for large-scale integrated data management.
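The data-quality standards gating entry into a phenology database can be sketched as a validation step in front of the insert. The phenophase vocabulary, table layout, and checks below are illustrative assumptions, not USA-NPN's actual standards or schema:

```python
import sqlite3
from datetime import date

# Illustrative controlled vocabulary; the real network defines its own.
VALID_PHENOPHASES = {"breaking leaf buds", "open flowers", "ripe fruits"}

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE observations (
    observer TEXT, species TEXT, phenophase TEXT, obs_date TEXT)""")

def record_observation(observer, species, phenophase, obs_date):
    # Apply simple quality-control gates before data enter the database:
    # the phenophase must come from the controlled vocabulary, and the
    # observation cannot be dated in the future.
    if phenophase not in VALID_PHENOPHASES:
        raise ValueError(f"unknown phenophase: {phenophase}")
    if obs_date > date.today():
        raise ValueError("observation dated in the future")
    db.execute("INSERT INTO observations VALUES (?, ?, ?, ?)",
               (observer, species, phenophase, obs_date.isoformat()))

record_observation("citizen-042", "Acer rubrum", "open flowers", date(2009, 5, 1))
print(db.execute("SELECT COUNT(*) FROM observations").fetchone()[0])
# → 1
```

Enforcing a controlled vocabulary at entry time is what lets observations from partner organizations, researchers, and citizen observers be integrated and compared later without per-source cleanup.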
The CMS dataset bookkeeping service
DOE Office of Scientific and Technical Information (OSTI.GOV)
Afaq, Anzar; /Fermilab; Dolgert, Andrew
2007-10-01
The CMS Dataset Bookkeeping Service (DBS) has been developed to catalog all CMS event data from Monte Carlo and Detector sources. It provides the ability to identify MC or trigger source, track data provenance, construct datasets for analysis, and discover interesting data. CMS requires processing and analysis activities at various service levels and the DBS system provides support for localized processing or private analysis, as well as global access for CMS users at large. Catalog entries can be moved among the various service levels with a simple set of migration tools, thus forming a loose federation of databases. DBS is available to CMS users via Python API, command-line, and Discovery web page interfaces. The system is built as a multi-tier web application with Java servlets running under Tomcat, with connections via JDBC to Oracle or MySQL database backends. Clients connect to the service through HTTP or HTTPS with authentication provided by GRID certificates and authorization through VOMS. DBS is an integral part of the overall CMS Data Management and Workflow Management systems.
Facility Registry Service (FRS)
This is a centrally managed database that identifies facilities either subject to environmental regulations or of environmental interest, providing an integrated source of air, water, and waste environmental data.
Registered File Support for Critical Operations Files at (Space Infrared Telescope Facility) SIRTF
NASA Technical Reports Server (NTRS)
Turek, G.; Handley, Tom; Jacobson, J.; Rector, J.
2001-01-01
The SIRTF Science Center's (SSC) Science Operations System (SOS) has to contend with nearly one hundred critical operations files via comprehensive file management services. The management is accomplished via the registered file system (otherwise known as TFS) which manages these files in a registered file repository composed of a virtual file system accessible via a TFS server and a file registration database. The TFS server provides controlled, reliable, and secure file transfer and storage by registering all file transactions and meta-data in the file registration database. An API is provided for application programs to communicate with TFS servers and the repository. A command line client implementing this API has been developed as a client tool. This paper describes the architecture, current implementation, but more importantly, the evolution of these services based on evolving community use cases and emerging information system technology.
Knowledge Management, User Education and Librarianship.
ERIC Educational Resources Information Center
Koenig, Michael E. D.
2003-01-01
Discusses the role of librarians in knowledge management in terms of designing information systems, creating classification systems and taxonomies, and implementing and operating the systems. Suggests the need for librarians to be involved in user education and training, including database searching, using current awareness services, and using…
The Corporate Library and Issues Management.
ERIC Educational Resources Information Center
Lancaster, F. W.; Loescher, Jane
1994-01-01
Discussion of corporate library services and the role of the librarian focuses on the recognition and tracking of issues of potential significance to the corporation, or issues management. Topics addressed include environmental scanning of relevant literature, and the use of databases to track issues. (16 references) (LRW)
FJET Database Project: Extract, Transform, and Load
NASA Technical Reports Server (NTRS)
Samms, Kevin O.
2015-01-01
The Data Mining & Knowledge Management team at Kennedy Space Center is providing data management services to the Frangible Joint Empirical Test (FJET) project at Langley Research Center (LARC). FJET is a project under the NASA Engineering and Safety Center (NESC). The purpose of FJET is to conduct an assessment of mild detonating fuse (MDF) frangible joints (FJs) for human spacecraft separation tasks in support of the NASA Commercial Crew Program. The Data Mining & Knowledge Management team has been tasked with creating and managing a database for the efficient storage and retrieval of FJET test data. This paper details the Extract, Transform, and Load (ETL) process as it is related to gathering FJET test data into a Microsoft SQL relational database, and making that data available to the data users. Lessons learned, procedures implemented, and programming code samples are discussed to help detail the learning experienced as the Data Mining & Knowledge Management team adapted to changing requirements and new technology while maintaining flexibility of design in various aspects of the data management project.
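An ETL pipeline of the kind described — gather raw test data, transform it, and load it into a relational database — can be sketched minimally. The column names and unit conversion below are hypothetical (the actual FJET schema is not given), and SQLite stands in for Microsoft SQL Server:

```python
import csv
import io
import sqlite3

# Hypothetical raw test log; the real FJET columns are not public here.
RAW = """test_id,pressure_psi,result
FJ-001,1450,PASS
FJ-002,1320,FAIL
"""

def extract(text):
    """Extract: parse the raw CSV into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: convert psi to kPa and normalize the result flag."""
    return [(r["test_id"],
             round(float(r["pressure_psi"]) * 6.894757, 1),
             r["result"].upper() == "PASS")
            for r in rows]

def load(db, records):
    """Load: insert the transformed records into the target table."""
    db.execute("CREATE TABLE IF NOT EXISTS fj_test "
               "(test_id TEXT, pressure_kpa REAL, passed INTEGER)")
    db.executemany("INSERT INTO fj_test VALUES (?, ?, ?)", records)

db = sqlite3.connect(":memory:")
load(db, transform(extract(RAW)))
rows = db.execute("SELECT test_id, passed FROM fj_test").fetchall()
print(rows)  # [('FJ-001', 1), ('FJ-002', 0)]
```

Keeping extract, transform, and load as separate functions is what gives such a pipeline the flexibility to adapt to changing requirements, as the abstract notes.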
Abidi, S S
2001-06-01
Worldwide healthcare delivery trends are undergoing a subtle paradigm shift--patient centered services as opposed to provider centered services and wellness maintenance as opposed to illness management. In this paper we present a Tele-Healthcare project TIDE--Tele-Healthcare Information and Diagnostic Environment. TIDE manifests an 'intelligent' healthcare environment that aims to ensure lifelong coverage of person-specific health maintenance decision-support services--i.e., both wellness maintenance and illness management services--ubiquitously available via the Internet/WWW. Taking on an all-encompassing health maintenance role--spanning from wellness to illness issues--the functionality of TIDE involves the generation and delivery of (a) Personalized, Pro-active, Persistent, Perpetual, and Present wellness maintenance services, and (b) remote diagnostic services for managing noncritical illnesses. Technically, TIDE is an amalgamation of diverse computer technologies--Artificial Intelligence, Internet, Multimedia, Databases, and Medical Informatics--to implement a sophisticated healthcare delivery infostructure.
Contract management in USA hospitals: service duplication and access within local markets.
Carey, Kathleen; Dor, Avi
2008-08-01
This paper examines the extent to which hospitals that are under external contract management engage in service duplication, as well as the degree to which the various services they offer contribute to or detract from community access. The study incorporates all USA hospitals using data from the American Hospital Association Annual Survey Database, supplemented by county level measures obtained from the area resource file (ARF). Using data on the 3794 hospitals classified as acute care facilities in 2002, we performed a set of logistic regressions that analyzed whether a hospital offered each of 74 distinct services. For each service (regression), key independent variables measured the number of other hospitals in the local market area that also offered the service. Local area market definitions are the areas circumscribed by the hospital within distances of 10 and 20 miles. Results suggest that contract-managed (CM) hospitals display a more competitive pattern (service duplication) than hospitals in general, but CM hospitals that are the sole provider of services locally are less likely to offer services than traditionally managed sole hospital providers. Contract management does not appear to offer any particular advantages in improving access to hospital services.
Cloud Computing and Your Library
ERIC Educational Resources Information Center
Mitchell, Erik T.
2010-01-01
One of the first big shifts in how libraries manage resources was the move from print-journal purchasing models to database-subscription and electronic-journal purchasing models. Libraries found that this transition helped them scale their resources and provide better service just by thinking a bit differently about their services. Likewise,…
28 CFR 802.29 - Exemption of the Pretrial Services Agency System.
Code of Federal Regulations, 2013 CFR
2013-07-01
... THE DISTRICT OF COLUMBIA DISCLOSURE OF RECORDS Exemption of Records Systems Under the Privacy Act § 802.29 Exemption of the Pretrial Services Agency System. The Privacy Act permits specific systems of... Bail Agency Database (ABADABA) (CSOSA/PSA-1). (ii) Drug Test Management System (DTMS) (CSOSA/PSA-2...
28 CFR 802.29 - Exemption of the Pretrial Services Agency System.
Code of Federal Regulations, 2014 CFR
2014-07-01
... THE DISTRICT OF COLUMBIA DISCLOSURE OF RECORDS Exemption of Records Systems Under the Privacy Act § 802.29 Exemption of the Pretrial Services Agency System. The Privacy Act permits specific systems of... Bail Agency Database (ABADABA) (CSOSA/PSA-1). (ii) Drug Test Management System (DTMS) (CSOSA/PSA-2...
28 CFR 802.29 - Exemption of the Pretrial Services Agency System.
Code of Federal Regulations, 2012 CFR
2012-07-01
... THE DISTRICT OF COLUMBIA DISCLOSURE OF RECORDS Exemption of Records Systems Under the Privacy Act § 802.29 Exemption of the Pretrial Services Agency System. The Privacy Act permits specific systems of... Bail Agency Database (ABADABA) (CSOSA/PSA-1). (ii) Drug Test Management System (DTMS) (CSOSA/PSA-2...
Fleet, Richard; Archambault, Patrick; Légaré, France; Chauny, Jean-Marc; Lévesque, Jean-Frédéric; Ouimet, Mathieu; Dupuis, Gilles; Haggerty, Jeannie; Poitras, Julien; Tanguay, Alain; Simard-Racine, Geneviève; Gauthier, Josée
2013-01-01
Introduction: Emergency departments are important safety nets for people who live in rural areas. Moreover, a serious problem in access to healthcare services has emerged in these regions. The challenges of providing access to quality rural emergency care include recruitment and retention issues, lack of advanced imagery technology, lack of specialist support and the heavy reliance on ambulance transport over great distances. The Quebec Ministry of Health and Social Services published a new version of the Emergency Department Management Guide, a document designed to improve the emergency department management and to humanise emergency department care and services. In particular, the Guide recommends solutions to problems that plague rural emergency departments. Unfortunately, no studies have evaluated the implementation of the proposed recommendations.
Methods and analysis: To develop a comprehensive portrait of all rural emergency departments in Quebec, data will be gathered from databases at the Quebec Ministry of Health and Social Services, the Quebec Trauma Registry and from emergency departments and ambulance services managers. Statistics Canada data will be used to describe populations and rural regions. To evaluate the use of the 2006 Emergency Department Management Guide and the implementation of its various recommendations, an online survey and a phone interview will be administered to emergency department managers. Two online surveys will evaluate quality of work life among physicians and nurses working at rural emergency departments. Quality-of-care indicators will be collected from databases and patient medical files. Data will be analysed using statistical (descriptive and inferential) procedures.
Ethics and dissemination: This protocol has been approved by the CSSS Alphonse-Desjardins research ethics committee (Project MP-HDL-1213-011). The results will be published in peer-reviewed scientific journals and presented at one or more scientific conferences.
PMID:23633423
A Database as a Service for the Healthcare System to Store Physiological Signal Data.
Chang, Hsien-Tsung; Lin, Tsai-Huei
2016-01-01
Wearable devices that measure physiological signals to help develop self-health management habits have become increasingly popular in recent years. These records are conducive for follow-up health and medical care. In this study, based on the characteristics of the observed physiological signal records, namely (1) a large number of users, (2) a large amount of data, (3) low information variability, (4) data privacy authorization, and (5) data access by designated users, we wish to resolve physiological signal record-relevant issues utilizing the advantages of the Database as a Service (DaaS) model. Storing a large amount of data using file patterns can reduce database load, allowing users to access data efficiently; the privacy control settings allow users to store data securely. The results of the experiment show that the proposed system has better database access performance than a traditional relational database, with a small difference in database volume, thus proving that the proposed system can improve data storage performance.
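The file-pattern idea described here — bulk signal samples kept in flat files while the database holds only metadata and access permissions — can be sketched as follows. All class, table, and file names are invented for illustration; the paper's actual implementation is not shown:

```python
import json
import sqlite3
import tempfile
from pathlib import Path

class SignalStore:
    """Sketch of the file-pattern design: bulk samples go to flat files,
    while the database keeps only metadata and an access list."""

    def __init__(self, root):
        self.root = Path(root)
        self.db = sqlite3.connect(":memory:")
        self.db.execute("CREATE TABLE record "
                        "(id INTEGER PRIMARY KEY, owner TEXT, path TEXT)")
        self.db.execute("CREATE TABLE grant_acl (record_id INTEGER, user TEXT)")

    def put(self, owner, samples):
        """Write samples to disk; register path and owner in the database."""
        path = self.root / f"{owner}-{id(samples)}.json"
        path.write_text(json.dumps(samples))        # bulk data on disk
        cur = self.db.execute("INSERT INTO record (owner, path) VALUES (?, ?)",
                              (owner, str(path)))
        self.db.execute("INSERT INTO grant_acl VALUES (?, ?)",
                        (cur.lastrowid, owner))     # owner can always read
        return cur.lastrowid

    def get(self, record_id, user):
        """Return samples only if the user appears in the access list."""
        ok = self.db.execute(
            "SELECT 1 FROM grant_acl WHERE record_id=? AND user=?",
            (record_id, user)).fetchone()
        if not ok:
            raise PermissionError("user not authorized for this record")
        path = self.db.execute("SELECT path FROM record WHERE id=?",
                               (record_id,)).fetchone()[0]
        return json.loads(Path(path).read_text())

store = SignalStore(tempfile.mkdtemp())
rid = store.put("alice", [72, 74, 71])   # e.g. heart-rate samples
print(store.get(rid, "alice"))           # [72, 74, 71]
```

Because the database only routes and authorizes reads, its load stays small even as the sample files grow, which is the performance advantage the abstract claims.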
Code of Federal Regulations, 2011 CFR
2011-10-01
... provision shall be included in the Service Management System tariff and in the local exchange carriers' toll free database access tariffs: [T]he Federal Communications Commission (“FCC”) has concluded that...
Kan, Yao-Chiang; Chen, Kai-Hong; Lin, Hsueh-Chun
2017-06-01
Self-management in healthcare allows patients to manage their health data anytime and anywhere for the prevention of chronic diseases. This study established a prototype ubiquitous health management system (UHMS) with healthy diet control (HDC) for people who need metabolic syndrome healthcare services in Taiwan. The system infrastructure comprises three portals and a database tier with mutually supportive components to achieve functionality of diet diaries, nutrition guides, and health risk assessments for self-health management. With the diet, nutrition, and personal health database, the design enables the analytical diagrams on the interactive interface to support a mobile application for the diet diary, a Web-based platform for health management, and the modules of research and development for medical care. For database integrity, dietary data can be stored in offline mode prior to transfer between the mobile device and the server site in online mode. The UHMS-HDC was developed with open source technology for ubiquitous health management with personalized dietary criteria. The system integrates mobile, internet, and electronic healthcare services with the diet diary functions to manage users' healthy diet behaviors. Virtual patients were used to simulate the self-health management procedure, and the assessment functions were verified by capturing screen snapshots during the procedure. The proposed system was shown to be suitable for practical intervention. This approach details the expandable framework with collaborative components of the self-developed UHMS-HDC. The multi-disciplinary applications for self-health management can support healthcare professionals in reducing medical resources and improving healthcare effects for patients who require monitoring of their personal health condition with diet control. The proposed system can be practiced for intervention in the hospital.
NASA Technical Reports Server (NTRS)
Campbell, William J.
1985-01-01
Intelligent data management is the concept of interfacing a user to a database management system with a value added service that will allow a full range of data management operations at a high level of abstraction using written human language. The development of such a system will be based on expert systems and related artificial intelligence technologies, and will allow the capturing of procedural and relational knowledge about data management operations and the support of a user with such knowledge in an on-line, interactive manner. Such a system will have the following capabilities: (1) the ability to construct a model of the user's view of the database, based on the query syntax; (2) the ability to transform English queries and commands into database instructions and processes; (3) the ability to use heuristic knowledge to rapidly prune the data space in search processes; and (4) the ability to use an on-line explanation system to allow the user to understand what the system is doing and why it is doing it. Additional information is given in outline form.
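Capability (2) — turning an English command into a database instruction — can be illustrated with a deliberately tiny keyword grammar. This is a toy heuristic under assumed names, nowhere near the expert-system techniques the abstract envisions, but it shows the shape of the mapping:

```python
import re

# Toy template grammar: "show <table> where <column> is <value>".
PATTERN = re.compile(r"show (\w+) where (\w+) is (\w+)", re.IGNORECASE)

def english_to_sql(query):
    """Map a fixed English template to a parameterized SQL statement."""
    m = PATTERN.fullmatch(query.strip())
    if not m:
        raise ValueError("query not understood")
    table, column, value = m.groups()
    # Value goes in as a bound parameter, not interpolated text.
    return f"SELECT * FROM {table} WHERE {column} = ?", (value,)

sql, params = english_to_sql("show sensors where status is active")
print(sql)     # SELECT * FROM sensors WHERE status = ?
print(params)  # ('active',)
```

A real system of the kind described would replace the regular expression with a parser backed by a knowledge base of the user's schema, which is exactly where capability (1), the model of the user's view, comes in.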
Information management systems for pharmacogenomics.
Thallinger, Gerhard G; Trajanoski, Slave; Stocker, Gernot; Trajanoski, Zlatko
2002-09-01
The value of high-throughput genomic research is dramatically enhanced by association with key patient data. These data are generally available but of disparate quality and not typically directly associated. A system that could bring these disparate data sources into a common resource connected with functional genomic data would be tremendously advantageous. However, the integration of clinical data and accurate interpretation of the generated functional genomic data require the development of information management systems capable of effectively capturing the data, as well as tools to make that data accessible to the laboratory scientist or to the clinician. In this review these challenges and current information technology solutions associated with the management, storage and analysis of high-throughput data are highlighted. It is suggested that the development of a pharmacogenomic data management system which integrates public and proprietary databases, clinical datasets, and data mining tools embedded in a high-performance computing environment should include the following components: parallel processing systems, storage technologies, network technologies, databases and database management systems (DBMS), and application services.
A secure data outsourcing scheme based on Asmuth-Bloom secret sharing
NASA Astrophysics Data System (ADS)
Idris Muhammad, Yusuf; Kaiiali, Mustafa; Habbal, Adib; Wazan, A. S.; Sani Ilyasu, Auwal
2016-11-01
Data outsourcing is an emerging paradigm for data management in which a database is provided as a service by third-party service providers. One of the major benefits of offering database as a service is to provide organisations, which are unable to purchase expensive hardware and software to host their databases, with efficient data storage accessible online at a cheap rate. Despite that, several issues of data confidentiality, integrity, availability and efficient indexing of users' queries at the server side have to be addressed in the data outsourcing paradigm. Service providers have to guarantee that their clients' data are secured against internal (insider) and external attacks. This paper briefly analyses the existing indexing schemes in data outsourcing and highlights their advantages and disadvantages. Then, this paper proposes a secure data outsourcing scheme based on Asmuth-Bloom secret sharing which tries to address the issues in data outsourcing such as data confidentiality, availability and order preservation for efficient indexing.
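Asmuth-Bloom secret sharing, which the paper builds on, splits a secret into CRT residues so that any k shares reconstruct it. The sketch below uses deliberately small demo moduli (not cryptographically sized) to show the mechanics: the secret is hidden behind y = secret + a·m0, and the shares are y modulo each pairwise-coprime modulus:

```python
import random
from math import prod

def crt(residues, moduli):
    """Chinese Remainder Theorem for pairwise-coprime moduli."""
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # modular inverse of Mi mod m
    return x % M

def share(secret, k, moduli, m0):
    """Asmuth-Bloom: requires secret < m0, moduli pairwise coprime and
    increasing, and prod(smallest k) > m0 * prod(largest k-1)."""
    M = prod(moduli[:k])
    a = random.randrange(0, (M - secret) // m0)  # keeps y below M
    y = secret + a * m0
    return [(y % m, m) for m in moduli]

def reconstruct(shares, m0):
    """Any k shares recover y exactly via CRT; reduce mod m0 for the secret."""
    residues, moduli = zip(*shares)
    return crt(residues, moduli) % m0

# Small demo parameters (illustrative only).
m0, moduli, k = 13, [101, 103, 107, 109, 113], 3
shares = share(7, k, moduli, m0)
print(reconstruct(shares[:k], m0))   # 7, from the first 3 shares
print(reconstruct(shares[2:5], m0))  # 7, from a different 3 shares
```

The threshold condition guarantees that y is below the product of any k moduli (so k shares reconstruct it exactly) yet far above the product of any k-1 moduli (so fewer shares leave it undetermined), which is what makes the scheme usable for the order-preserving outsourced indexing the paper targets.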
Datacube Services in Action, Using Open Source and Open Standards
NASA Astrophysics Data System (ADS)
Baumann, P.; Misev, D.
2016-12-01
Array Databases comprise novel, promising technology for massive spatio-temporal datacubes, extending the SQL paradigm of "any query, anytime" to n-D arrays. On server side, such queries can be optimized, parallelized, and distributed based on partitioned array storage. The rasdaman ("raster data manager") system, which has pioneered Array Databases, is available in open source on www.rasdaman.org. Its declarative query language extends SQL with array operators which are optimized and parallelized on server side. The rasdaman engine, which is part of OSGeo Live, is mature and in operational use on databases individually holding dozens of Terabytes. Further, the rasdaman concepts have strongly impacted international Big Data standards in the field, including the forthcoming MDA ("Multi-Dimensional Array") extension to ISO SQL, the OGC Web Coverage Service (WCS) and Web Coverage Processing Service (WCPS) standards, and the forthcoming INSPIRE WCS/WCPS; in both OGC and INSPIRE, rasdaman is the WCS Core Reference Implementation. In our talk we present concepts, architecture, operational services, and standardization impact of open-source rasdaman, as well as experiences made.
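A WCPS datacube query is a declarative expression shipped to the server inside a standard OGC request. The sketch below assembles such a request URL; the endpoint URL and coverage name are placeholders, not a live service, and the query is a minimal example of the WCPS `for ... return` form:

```python
from urllib.parse import urlencode

def wcps_request(endpoint, query):
    """Build an OGC WCPS key-value-pair request URL
    (endpoint and coverage name are placeholders)."""
    return endpoint + "?" + urlencode({
        "service": "WCS",
        "version": "2.0.1",
        "request": "ProcessCoverages",
        "query": query,
    })

# Server-side aggregation over a time slice of a datacube,
# expressed declaratively in WCPS:
query = ('for $c in (AverageTemperature) '
         'return avg($c[ansi("2014-01")])')
url = wcps_request("https://example.org/rasdaman/ows", query)
print(url)
```

The point of the declarative form is that only the scalar result crosses the network; the aggregation itself runs where the partitioned array storage lives.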
Choi, Tae-Young; Jun, Ji Hee; Lee, Myeong Soo
2018-03-01
Integrative medicine is claimed to improve symptoms of lupus nephritis. No systematic reviews have been performed for the application of integrative medicine for lupus nephritis on patients with systemic lupus erythematosus (SLE). Thus, this review will aim to evaluate the current evidence on the efficacy of integrative medicine for the management of lupus nephritis in patients with SLE. The following electronic databases will be searched for studies published from their dates of inception to February 2018: Medline, EMBASE and the Cochrane Central Register of Controlled Trials (CENTRAL), as well as 6 Korean medical databases (Korea Med, the Oriental Medicine Advanced Search Integrated System [OASIS], DBpia, the Korean Medical Database [KM base], the Research Information Service System [RISS], and the Korean Studies Information Services System [KISS]), and 1 Chinese medical database (the China National Knowledge Infrastructure [CNKI]). Study selection, data extraction, and assessment will be performed independently by 2 researchers. The risk of bias (ROB) will be assessed using the Cochrane ROB tool. This systematic review will be published in a peer-reviewed journal and disseminated both electronically and in print. The review will be updated to inform and guide healthcare practice and policy. PROSPERO 2018 CRD42018085205.
NASA Astrophysics Data System (ADS)
Hortos, William S.
2010-04-01
Broadband wireless access standards, together with advances in the development of commercial sensing and actuator devices, enable the feasibility of a consumer service for a multi-sensor system that monitors the conditions within a residence or office: the environment/infrastructure, patient-occupant health, and physical security. The proposed service is a broadband reimplementation and combination of existing services to allow on-demand reports on and management of the conditions by remote subscribers. The flow of on-demand reports to subscribers and to specialists contracted to mitigate out-of-tolerance conditions is the foreground process. Service subscribers for an over-the-horizon connected home/office (OCHO) monitoring system are the occupant of the premises and agencies, contracted by the service provider, to mitigate or resolve any observed out-of-tolerance condition(s) at the premises. Collectively, these parties are the foreground users of the OCHO system; the implemented wireless standards allow the foreground users to be mobile as they request on-demand situation reports on remote conditions from the OCHO subsystems via wireless devices. An OCHO subscriber, i.e., a foreground user, may select the level of detail found in on-demand reports, i.e., the amount of information displayed in the report of monitored conditions at the premises. This is one context of system operations. While foreground reports are sent only periodically to subscribers, the information generated by the monitored conditions at the premises is continuous and is transferred to a background configuration of servers on which databases reside. These databases are each used, generally, in non-real time, for the assessment and management of situations defined by attributes like those being monitored in the foreground by OCHO. This is the second context of system operations.
Context awareness and management of conditions at the premises by a second group of analysts and decision makers who extract information from the OCHO data in the databases form the foundation of the situation management problem.
2017 Joint Annual NDIA/AIA Industrial Security Committee Fall Conference
2017-11-15
beyond credit data to offer the insights that government professionals need to make informed decisions and ensure citizen safety, manage compliance...business that provides information technology and professional services. We specialize in managing business processes and systems integration for both... Information Security System ISFD Industrial Security Facilities Database OBMS ODAA Business Management System STEPP Security, Training, Education and
NASA Technical Reports Server (NTRS)
Miller, David A., Jr.
2004-01-01
JDD Inc. is a maintenance and custodial contracting company whose mission is to provide their clients in the private and government sectors "quality construction, construction management and cleaning services in the most efficient and cost effective manners" (JDD, Inc. Mission Statement). This company provides facilities support for Fort Riley in Fort Riley, Kansas and the NASA John H. Glenn Research Center at Lewis Field here in Cleveland, Ohio. JDD, Inc. is owned and operated by James Vaughn, who started as a painter at NASA Glenn and has been working here for the past seventeen years. This summer I worked under Devan Anderson, who is the safety manager for JDD Inc. in the Logistics and Technical Information Division at Glenn Research Center. The LTID provides all transportation, secretarial, and security needs and contract management of these various services for the center. As a safety manager, my mentor provides Occupational Safety and Health Administration (OSHA) compliance to all JDD, Inc. employees and handles all other issues (Environmental Protection Agency issues, workers' compensation, safety and health training) involving job safety. My summer assignment was not considered "groundbreaking research" like many other summer interns have done in the past, but it is just as important and beneficial to JDD, Inc. I initially created a database using a Microsoft Excel program to classify and categorize data pertaining to numerous safety training certification courses instructed by our safety manager during the course of the fiscal year. This early portion of the database consisted of only data (training field index, employees who were present at these training courses and who was absent) from the training certification courses. Once I completed this phase of the database, I decided to expand the database and add as many dimensions to it as possible.
Throughout the last seven weeks, I have been compiling more data from day to day operations and adding the information to the database. It now consists of seven different categories of data (carpet cleaning, forms, NASA Event Schedules, training certifications, wall and vent cleaning, work schedules, and miscellaneous). I also did some field inspecting with the supervisors around the site and was present at all of the training certification courses that have been scheduled since June 2004. My future outlook for the JDD, Inc. database is to have all of the company's information, from future contract proposals and weekly inventory to employee timesheets, in this same database.
EPA Facility Registry Service (FRS): CERCLIS
This data provides location and attribute information on Facilities regulated under the Comprehensive Environmental Responsibility Compensation and Liability Information System (CERCLIS) for an intranet web feature service. The data provided in this service are obtained from EPA's Facility Registry Service (FRS). The FRS is an integrated source of comprehensive (air, water, and waste) environmental information about facilities, sites or places. This service connects directly to the FRS database to provide this data as a feature service. FRS creates high-quality, accurate, and authoritative facility identification records through rigorous verification and management procedures that incorporate information from program national systems, state master facility records, data collected from EPA's Central Data Exchange registrations and data management personnel. Additional Information on FRS is available at the EPA website https://www.epa.gov/enviro/facility-registry-service-frs.
The CMS Data Management System
NASA Astrophysics Data System (ADS)
Giffels, M.; Guo, Y.; Kuznetsov, V.; Magini, N.; Wildish, T.
2014-06-01
The data management elements in CMS are scalable, modular, and designed to work together. The main components are PhEDEx, the data transfer and location system; the Dataset Bookkeeping Service (DBS), a metadata catalog; and the Data Aggregation Service (DAS), designed to aggregate views and provide them to users and services. Tens of thousands of samples have been cataloged and petabytes of data have been moved since the run began. The modular system has allowed the optimal use of appropriate underlying technologies. In this contribution we will discuss the use of both Oracle and NoSQL databases to implement the data management elements as well as the individual architectures chosen. We will discuss how the data management system functioned during the first run, and what improvements are planned in preparation for 2015.
EPA Facility Registry Service (FRS): ICIS
This web feature service contains location and facility identification information from EPA's Facility Registry Service (FRS) for the subset of facilities that link to the Integrated Compliance Information System (ICIS). When complete, ICIS will provide a database that will contain integrated enforcement and compliance information across most of EPA's programs. The vision for ICIS is to replace EPA's independent databases that contain enforcement data with a single repository for that information. Currently, ICIS contains all Federal Administrative and Judicial enforcement actions and a subset of the Permit Compliance System (PCS), which supports the National Pollutant Discharge Elimination System (NPDES). ICIS exchanges non-sensitive enforcement/compliance activities, non-sensitive formal enforcement actions and NPDES information with FRS. This web feature service contains the enforcement/compliance activities and formal enforcement action related facilities; the NPDES facilities are contained in the PCS_NPDES web feature service. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using rigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities.
London, Sue; Brahmi, Frances A
2005-01-01
As end-user demand for easy access to electronic full text continues to climb, an increasing number of information providers are combining that access with their other products and services, making navigation of their Web sites more daunting than ever for librarians seeking information on a given product or service. One such provider of a complex array of products and services is Thomson Scientific. This paper looks at some of the many products and tools available from two of Thomson Scientific's businesses, Thomson ISI and Thomson ResearchSoft. Among the items of most interest to health sciences and veterinary librarians and their users are the variety of databases available via the ISI Web of Knowledge platform and the information management products available from ResearchSoft.
NASA Astrophysics Data System (ADS)
Sheldon, W.
2013-12-01
Managing data for a large, multidisciplinary research program such as a Long Term Ecological Research (LTER) site is a significant challenge, but also presents unique opportunities for data stewardship. LTER research is conducted within multiple organizational frameworks (i.e. a specific LTER site as well as the broader LTER network), and addresses both specific goals defined in an NSF proposal as well as broader goals of the network; therefore, every LTER data set can be linked to rich contextual information to guide interpretation and comparison. The challenge is how to link the data to this wealth of contextual metadata. At the Georgia Coastal Ecosystems LTER we developed an integrated information management system (GCE-IMS) to manage, archive and distribute data, metadata and other research products as well as to manage project logistics, administration and governance (figure 1). This system allows us to store all project information in one place, and provide dynamic links through web applications and services to ensure content is always up to date on the web as well as in data set metadata. The database model supports tracking changes over time in personnel roles, projects and governance decisions, allowing these databases to serve as canonical sources of project history. Storing project information in a central database has also allowed us to standardize both the formatting and content of critical project information, including personnel names, roles, keywords, place names, attribute names, units, and instrumentation, providing consistency and improving data and metadata comparability. Lookup services for these standard terms also simplify data entry in web and database interfaces. We have also coupled the GCE-IMS to our MATLAB- and Python-based data processing tools (i.e. through database connections) to automate metadata generation and packaging of tabular and GIS data products for distribution.
Data processing history is automatically tracked throughout the data lifecycle, from initial import through quality control, revision and integration by our data processing system (GCE Data Toolbox for MATLAB), and included in metadata for versioned data products. This high level of automation and system integration has proven very effective in managing the chaos and scalability of our information management program.
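The lookup services for standard terms described above can be sketched in a few lines: a central table of canonical vocabulary (here, units) validates and normalises incoming metadata before it is packaged with a data product. The terms and field names below are invented for illustration; GCE-IMS's actual vocabularies are much larger.

```python
# Sketch of a controlled-vocabulary lookup service: map raw unit
# strings to canonical names and flag anything a curator must review.

STANDARD_UNITS = {
    "celsius": "degree Celsius",
    "psu": "practical salinity unit",
    "m": "meter",
}

def normalise_units(columns):
    """columns: {column_name: raw_unit}; returns (resolved, unknown)."""
    resolved, unknown = {}, []
    for name, unit in columns.items():
        canonical = STANDARD_UNITS.get(unit.strip().lower())
        if canonical is None:
            unknown.append(name)       # needs curator attention
        else:
            resolved[name] = canonical
    return resolved, unknown

resolved, unknown = normalise_units(
    {"temp": "Celsius", "salinity": "PSU", "turbidity": "ntu"})
print(resolved)
print(unknown)
```

Centralising the table is the key design choice: every entry form and metadata generator resolves against the same list, so terms stay comparable across data sets.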
Relay Forward-Link File Management Services (MaROS Phase 2)
NASA Technical Reports Server (NTRS)
Allard, Daniel A.; Wallick, Michael N.; Hy, Franklin H.; Gladden, Roy E.
2013-01-01
This software provides the service-level functionality to manage the delivery of files from a lander mission repository to an orbiter mission repository for eventual spacelink relay by the orbiter asset on a specific communications pass. It provides further functions to deliver and track a set of mission-defined messages detailing lander authorization instructions and orbiter data delivery state. All of the information concerning these transactions is persisted in a database providing a high level of accountability of the forward-link relay process.
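The accountability described above comes from persisting every state change of a relay transaction. A hedged sketch, with an invented schema and state names (the real MaROS states and tables are not given in the abstract):

```python
import sqlite3

# Illustrative forward-link tracking: each file moves through an
# ordered set of states, every transition is validated and persisted.

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE relay_file (
    name  TEXT PRIMARY KEY,
    state TEXT NOT NULL CHECK (state IN
        ('submitted', 'authorized', 'radiated', 'delivered')))""")

VALID_NEXT = {"submitted": "authorized",
              "authorized": "radiated",
              "radiated": "delivered"}

def advance(name, new_state):
    """Move a file to the next state, enforcing the allowed order."""
    (old,) = con.execute("SELECT state FROM relay_file WHERE name=?",
                         (name,)).fetchone()
    if VALID_NEXT.get(old) != new_state:
        raise ValueError(f"illegal transition {old} -> {new_state}")
    con.execute("UPDATE relay_file SET state=? WHERE name=?",
                (new_state, name))

con.execute("INSERT INTO relay_file VALUES ('img_001.dat', 'submitted')")
advance("img_001.dat", "authorized")
```

Because the database is the single source of truth, both the lander and orbiter sides can audit exactly where each file is in the relay process.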
Colliers, Annelies; Bartholomeeusen, Stefaan; Remmen, Roy; Coenen, Samuel; Michiels, Barbara; Bastiaens, Hilde; Van Royen, Paul; Verhoeven, Veronique; Holmgren, Philip; De Ruyck, Bernard; Philips, Hilde
2016-05-04
Primary out-of-hours care is developing throughout Europe. High-quality databases with linked data from primary health services can help to improve research and future health services. In 2014, a central clinical research database infrastructure was established (iCAREdata: Improving Care And Research Electronic Data Trust Antwerp, www.icaredata.eu ) for primary and interdisciplinary health care at the University of Antwerp, linking data from General Practice Cooperatives, Emergency Departments and Pharmacies during out-of-hours care. Medical data are pseudonymised using the services of a Trusted Third Party, which encodes private information about patients and physicians before data is sent to iCAREdata. iCAREdata provides many new research opportunities in the fields of clinical epidemiology, health care management and quality of care. A key aspect will be to ensure the quality of data registration by all health care providers. This article describes the establishment of a research database and the possibilities of linking data from different primary out-of-hours care providers, with the potential to help to improve research and the quality of health care services.
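The pseudonymisation step can be sketched with keyed hashing: identifiers are replaced by a deterministic keyed digest, so records from different providers remain linkable without exposing identities. This is only an illustration of the principle; the actual iCAREdata Trusted Third Party encoding is not specified in the abstract, and the key here is a placeholder.

```python
import hashlib
import hmac

# Hedged sketch of trusted-third-party pseudonymisation with HMAC.
SECRET_KEY = b"held-only-by-the-trusted-third-party"  # placeholder

def pseudonymise(patient_id: str) -> str:
    """Deterministic keyed hash: same patient -> same pseudonym."""
    return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()

# The same patient maps to the same pseudonym, enabling linkage across
# General Practice Cooperative, Emergency Department and pharmacy data.
a = pseudonymise("BE-12345")
b = pseudonymise("BE-12345")
print(a == b)
```

Without the key, the pseudonyms cannot be reversed or regenerated, which is why the key stays with the third party rather than with the research database.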
The QuakeSim Project: Web Services for Managing Geophysical Data and Applications
NASA Astrophysics Data System (ADS)
Pierce, Marlon E.; Fox, Geoffrey C.; Aktas, Mehmet S.; Aydin, Galip; Gadgil, Harshawardhan; Qi, Zhigang; Sayar, Ahmet
2008-04-01
We describe our distributed systems research efforts to build the “cyberinfrastructure” components that constitute a geophysical Grid, or more accurately, a Grid of Grids. Service-oriented computing principles are used to build a distributed infrastructure of Web accessible components for accessing data and scientific applications. Our data services fall into two major categories: Archival, database-backed services based around Geographical Information System (GIS) standards from the Open Geospatial Consortium, and streaming services that can be used to filter and route real-time data sources such as Global Positioning System data streams. Execution support services include application execution management services and services for transferring remote files. These data and execution service families are bound together through metadata information and workflow services for service orchestration. Users may access the system through the QuakeSim scientific Web portal, which is built using a portlet component approach.
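The archival data services above speak OGC standards; a minimal sketch of assembling a Web Feature Service GetFeature request follows the WFS 1.1.0 query convention, though the endpoint and layer name are invented for illustration.

```python
from urllib.parse import urlencode

# Sketch of building an OGC WFS 1.1.0 GetFeature request URL.
def wfs_getfeature_url(endpoint, type_name, bbox):
    """bbox: (min_lon, min_lat, max_lon, max_lat)."""
    params = {
        "service": "WFS",
        "version": "1.1.0",
        "request": "GetFeature",
        "typename": type_name,
        "bbox": ",".join(str(c) for c in bbox),
    }
    return endpoint + "?" + urlencode(params)

url = wfs_getfeature_url("https://example.org/wfs", "quakesim:faults",
                         (-125.0, 32.0, -114.0, 42.0))
print(url)
```

Because the interface is a standard, the same request shape works against any conforming GIS server, which is what lets heterogeneous components interoperate in a Grid of Grids.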
Intelligent databases assist transparent and sound economic valuation of ecosystem services.
Villa, Ferdinando; Ceroni, Marta; Krivov, Sergey
2007-06-01
Assessment and economic valuation of services provided by ecosystems to humans has become a crucial phase in environmental management and policy-making. As primary valuation studies are out of the reach of many institutions, secondary valuation or benefit transfer, where the results of previous studies are transferred to the geographical, environmental, social, and economic context of interest, is becoming increasingly common. This has brought to light the importance of environmental valuation databases, which provide reliable valuation data to inform secondary valuation with enough detail to enable the transfer of values across contexts. This paper describes the role of next-generation, intelligent databases (IDBs) in assisting the activity of valuation. Such databases employ artificial intelligence to inform the transfer of values across contexts, enforcing comparability of values and allowing users to generate custom valuation portfolios that synthesize previous studies and provide aggregated value estimates to use as a base for secondary valuation. After a general introduction, we introduce the Ecosystem Services Database, the first IDB for environmental valuation to be made available to the public, describe its functionalities and the lessons learned from its usage, and outline the remaining needs and expected future developments in the field.
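The benefit-transfer step such a database supports can be made concrete with the common income-adjusted unit-value transfer. The formula is a standard heuristic from the benefit-transfer literature, and the figures and elasticity below are illustrative, not values from the Ecosystem Services Database.

```python
# Hedged sketch of income-adjusted unit benefit transfer:
#   V_target = V_study * (Y_target / Y_study) ** elasticity

def transfer_value(study_value, study_income, target_income, elasticity=1.0):
    """Transfer a per-unit value from a study site to a target context,
    scaling by relative income raised to an income elasticity."""
    return study_value * (target_income / study_income) ** elasticity

# e.g. a wetland valued at $1200/ha/yr where mean income is $30k,
# transferred to a context with $24k mean income:
print(round(transfer_value(1200.0, 30_000, 24_000), 2))
```

An intelligent database adds value precisely here: it supplies the contextual variables (income, biome, scarcity) needed to decide whether two contexts are comparable enough for the transfer to be defensible.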
[Tumor Data Interacted System Design Based on Grid Platform].
Liu, Ying; Cao, Jiaji; Zhang, Haowei; Zhang, Ke
2016-06-01
In order to satisfy the demands of massive, heterogeneous tumor clinical data processing and of multi-center collaborative diagnosis and treatment of tumor diseases, a Tumor Data Interacted System (TDIS) was established on a grid platform, realizing a virtualized platform for tumor diagnosis services that shares tumor information in real time and manages it in a standardized way. The system adopts Globus Toolkit 4.0 to build an open grid service framework and encapsulates data resources based on the Web Services Resource Framework (WSRF). It uses middleware technology to provide a unified access interface for heterogeneous data interaction, optimizing the interactive process with virtualized services so that tumor information resources can be queried and called flexibly. For the massive amounts of heterogeneous tumor data, a federated storage and multiple-authorization mode is adopted as the security service mechanism, with real-time monitoring and load balancing. The system can cooperatively manage multi-center heterogeneous tumor data to support querying, sharing and analysis of tumor patient data, and can compare and match resources against typical clinical databases or the clinical information databases of other service nodes, thus assisting doctors in consulting similar cases and drawing up multidisciplinary treatment plans for tumors. Consequently, the system can improve the efficiency of tumor diagnosis and treatment and promote the development of the collaborative tumor diagnosis model.
Emilyn Sheffield; Leslie Furr; Charles Nelson
1992-01-01
Filevision IV is a multilayer imaging and database management system that combines drawing, filing and extensive report-writing capabilities (Filevision IV, 1988). Filevision IV users access data by attaching graphics to text-oriented database records. Tourist attractions, support services, and geographic features can be located on a base map of an area or region....
Composite Materials Design Database and Data Retrieval System Requirements
1991-08-01
the present time, the majority of expert systems are stand-alone systems, and environments for effectively coupling heuristic data management with...nonheuristic data management remain to be developed. The only available recourse is to resort to traditional DBMS development and use, and to service...Organization for Data Management. Academic Press, 1986. Glaeser, P. S. (ed). "Data for Science and Technology." Proceedings of the Seventh
Integrating GIS, Archeology, and the Internet.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sera White; Brenda Ringe Pace; Randy Lee
2004-08-01
At the Idaho National Engineering and Environmental Laboratory's (INEEL) Cultural Resource Management Office, a newly developed Data Management Tool (DMT) is improving management and long-term stewardship of cultural resources. The fully integrated system links an archaeological database, a historical database, and a research database to spatial data through a customized user interface using ArcIMS and Active Server Pages. Components of the new DMT are tailored specifically to the INEEL and include automated data entry forms for historic and prehistoric archaeological sites, specialized queries and reports that address both yearly and project-specific documentation requirements, and unique field recording forms. The predictive modeling component increases the DMT's value for land use planning and long-term stewardship. The DMT enhances the efficiency of archive searches, improving customer service, oversight, and management of the large INEEL cultural resource inventory. In the future, the DMT will facilitate data sharing with regulatory agencies, tribal organizations, and the general public.
ERIC Educational Resources Information Center
Kurtz, Michael J.; Eichorn, Guenther; Accomazzi, Alberto; Grant, Carolyn S.; Demleitner, Markus; Murray, Stephen S.; Jones, Michael L. W.; Gay, Geri K.; Rieger, Robert H.; Millman, David; Bruggemann-Klein, Anne; Klein, Rolf; Landgraf, Britta; Wang, James Ze; Li, Jia; Chan, Desmond; Wiederhold, Gio; Pitti, Daniel V.
1999-01-01
Includes six articles that discuss a digital library for astronomy; comparing evaluations of digital collection efforts; cross-organizational access management of Web-based resources; searching scientific bibliographic databases based on content-based relations between documents; semantics-sensitive retrieval for digital picture libraries; and…
Identification of the condition of crops based on geospatial data embedded in graph databases
NASA Astrophysics Data System (ADS)
Idziaszek, P.; Mueller, W.; Górna, K.; Okoń, P.; Boniecki, P.; Koszela, K.; Fojud, A.
2017-07-01
The Web application presented here supports plant production and works with the Neo4j graph database to support the assessment of the condition of crops on the basis of geospatial data, including raster and vector data. The adoption of a graph database as a tool to store and manage the data, including geospatial data, is fully justified in the case of those agricultural holdings that have a wide range of crop types and sizes. In addition, the authors tested the option of using Microsoft Cognitive Services technology at the level of the produced application, which enables image analysis using the provided services. The presented application was designed using ASP.NET MVC technology and a wide range of leading IT tools.
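A graph model suits this use case because fields, crops and imagery form a network of relationships. The sketch below only constructs a parameterised Cypher query as a string; the node labels, relationship type and properties are assumptions for illustration, not the paper's actual schema, and no driver or database connection is involved here.

```python
# Hypothetical Cypher query builder: fetch raster tiles covering a
# field, newest acquisition first. Labels/properties are invented.

def crop_condition_query(field_id):
    query = (
        "MATCH (f:Field {id: $field_id})-[:COVERED_BY]->(t:RasterTile) "
        "RETURN t.path, t.acquired ORDER BY t.acquired DESC"
    )
    params = {"field_id": field_id}
    return query, params

q, p = crop_condition_query("field-7")
print(q)
print(p)
```

Passing values as query parameters rather than string-formatting them into the Cypher text is the idiomatic (and injection-safe) pattern with any Neo4j client.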
Morris, Chris; Pajon, Anne; Griffiths, Susanne L.; Daniel, Ed; Savitsky, Marc; Lin, Bill; Diprose, Jonathan M.; Wilter da Silva, Alan; Pilicheva, Katya; Troshin, Peter; van Niekerk, Johannes; Isaacs, Neil; Naismith, James; Nave, Colin; Blake, Richard; Wilson, Keith S.; Stuart, David I.; Henrick, Kim; Esnouf, Robert M.
2011-01-01
The techniques used in protein production and structural biology have been developing rapidly, but techniques for recording the laboratory information produced have not kept pace. One approach is the development of laboratory information-management systems (LIMS), which typically use a relational database schema to model and store results from a laboratory workflow. The underlying philosophy and implementation of the Protein Information Management System (PiMS), a LIMS development specifically targeted at the flexible and unpredictable workflows of protein-production research laboratories of all scales, is described. PiMS is a web-based Java application that uses either Postgres or Oracle as the underlying relational database-management system. PiMS is available under a free licence to all academic laboratories either for local installation or for use as a managed service. PMID:21460443
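The relational approach a LIMS like PiMS takes can be sketched with a toy schema: targets and experiments linked by foreign keys, so an unpredictable workflow can still be reconstructed in order later. The tables and steps below are invented for illustration; PiMS's real schema is far richer.

```python
import sqlite3

# Toy LIMS schema: each experiment row records one workflow step
# against a target, and history is recovered by ordered query.

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE target     (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE experiment (id INTEGER PRIMARY KEY,
                         target_id INTEGER REFERENCES target(id),
                         step TEXT, outcome TEXT);
""")
con.execute("INSERT INTO target (name) VALUES ('lysozyme')")
con.executemany(
    "INSERT INTO experiment (target_id, step, outcome) VALUES (1, ?, ?)",
    [("cloning", "ok"), ("expression", "ok"), ("crystallisation", "failed")])

# Reconstruct the workflow history for one target:
rows = con.execute("""SELECT step, outcome FROM experiment
                      WHERE target_id = 1 ORDER BY id""").fetchall()
print(rows)
```

The flexibility PiMS aims for comes from not hard-coding a fixed pipeline: any sequence of steps can be appended, and the record remains queryable.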
'Am I covered?': an analysis of a national enquiry database on scope of practice.
Brady, Anne-Marie; Fealy, Gerard; Casey, Mary; Hegarty, Josephine; Kennedy, Catriona; McNamara, Martin; O'Reilly, Pauline; Prizeman, Geraldine; Rohde, Daniela
2015-10-01
Analysis of a national database of enquiries to a professional body pertaining to the scope of nursing and midwifery practice. Against a backdrop of healthcare reform is a demand for flexibility in nursing and midwifery roles with unprecedented redefinition of role boundaries and/or expansion. Guidance from professional regulatory bodies is being sought around issues of concern that are arising in practice. Qualitative thematic analysis. The database of telephone enquiries (n = 9818) made by Registered Nurses and midwives to a national regulatory body (2001-2013) was subjected to a cleaning process and examined to detect those concerns that pertained to scope of practice. A total of 978 enquiries were subjected to thematic analysis. Enquiries were concerned with three main areas: medication management, changing and evolving scope of practice and professional role boundaries. The context was service developments, staff shortages and uncertainty about role expansion and professional accountability. Other concerns related to expectations around responsibility and accountability for other support staff. Efforts by employers to maximize the skill mix of their staff and optimally deploy staff to meet service needs and/or address gaps in service represented the primary service context from which many enquiries arose. The greatest concern for nurses arises around medication management but innovation in healthcare delivery and the demands of service are also creating challenges for nurses and midwives. Maintaining and developing competence is a concern among nurses and midwives particularly in an environment of limited resources and where re-deployment is common. © 2015 John Wiley & Sons Ltd.
Managed care and inpatient mortality in adults: effect of primary payer.
Hines, Anika L; Raetzman, Susan O; Barrett, Marguerite L; Moy, Ernest; Andrews, Roxanne M
2017-02-08
Because managed care is increasingly prevalent in health care finance and delivery, it is important to ascertain its effects on health care quality relative to that of fee-for-service plans. Some stakeholders are concerned that basing gatekeeping, provider selection, and utilization management on cost may lower quality of care. To date, research on this topic has been inconclusive, largely because of variation in research methods and covariates. Patient age has been the only consistently evaluated outcome predictor. This study provides a comprehensive assessment of the association between managed care and inpatient mortality for Medicare and privately insured patients. A cross-sectional design was used to examine the association between managed care and inpatient mortality for four common inpatient conditions. Data from the 2009 Healthcare Cost and Utilization Project State Inpatient Databases for 11 states were linked to data from the American Hospital Association Annual Survey Database. Hospital discharges were categorized as managed care or fee for service. A phased approach to multivariate logistic modeling examined the likelihood of inpatient mortality when adjusting for individual patient and hospital characteristics and for county fixed effects. Results showed different effects of managed care for Medicare and privately insured patients. Privately insured patients in managed care had an advantage over their fee-for-service counterparts in inpatient mortality for acute myocardial infarction, stroke, pneumonia, and congestive heart failure; no such advantage was found for the Medicare managed care population. To the extent that the study showed a protective effect of privately insured managed care, it was driven by individuals aged 65 years and older, who had consistently better outcomes than their non-managed care counterparts. Privately insured patients in managed care plans, especially older adults, had better outcomes than those in fee-for-service plans. 
Patients in Medicare managed care had outcomes similar to those in Medicare FFS. Additional research is needed to understand the role of patient selection, hospital quality, and differences among county populations in the decreased odds of inpatient mortality among patients in private managed care and to determine why this result does not hold for Medicare.
A privacy-preserved analytical method for ehealth database with minimized information loss.
Chen, Ya-Ling; Cheng, Bo-Chao; Chen, Hsueh-Lin; Lin, Chia-I; Liao, Guo-Tan; Hou, Bo-Yu; Hsu, Shih-Chun
2012-01-01
Digitizing medical information is an emerging trend that employs information and communication technology (ICT) to manage health records, diagnostic reports, and other medical data more effectively, in order to improve the overall quality of medical services. However, medical information is highly confidential and involves private information; even legitimate access to data raises privacy concerns. Medical records provide health information on an as-needed basis for diagnosis and treatment, and the information is also important for medical research and other health management applications. Traditional privacy risk management systems have focused on reducing re-identification risk, and they do not consider information loss. In addition, such systems cannot identify and isolate data that carries high risk of privacy violations. This paper proposes the Hiatus Tailor (HT) system, which ensures low re-identification risk for medical records, while providing more authenticated information to database users and identifying high-risk data in the database for better system management. The experimental results demonstrate that the HT system achieves much lower information loss than traditional risk management methods, with the same risk of re-identification.
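The trade-off the paper addresses can be made concrete with a standard k-anonymity-style risk score: after generalising quasi-identifiers, worst-case re-identification risk is 1/k for the smallest equivalence class. The records and the generalisation (age to decade) below are invented for illustration; the HT system's own risk model is more sophisticated.

```python
from collections import Counter

# Sketch: score worst-case re-identification risk as 1 / (size of the
# smallest equivalence class under a chosen generalisation).

records = [(34, "F"), (36, "F"), (38, "F"), (52, "M"), (57, "M")]

def risk(rows, generalise):
    classes = Counter(generalise(r) for r in rows)
    return 1.0 / min(classes.values())

raw_risk = risk(records, lambda r: r)                  # every row unique
decade_risk = risk(records, lambda r: (r[0] // 10, r[1]))  # age -> decade
print(raw_risk, decade_risk)
```

Generalising reduces risk but destroys detail; that lost detail is the "information loss" the HT system tries to minimise while holding risk at an acceptable level.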
ERIC Educational Resources Information Center
Gonzalez, Linda
2005-01-01
Catalogers, catalog managers, and others in library technical services have become increasingly interested in, worried over, and excited about FRBR (the acronym for Functional Requirements of Bibliographic Records). Staff outside of the management of the library's bibliographic database may wonder what the fuss is about (FERBER? FURBUR?), assuming…
Migration of the CERN IT Data Centre Support System to ServiceNow
NASA Astrophysics Data System (ADS)
Alvarez Alonso, R.; Arneodo, G.; Barring, O.; Bonfillou, E.; Coelho dos Santos, M.; Dore, V.; Lefebure, V.; Fedorko, I.; Grossir, A.; Hefferman, J.; Mendez Lorenzo, P.; Moller, M.; Pera Mira, O.; Salter, W.; Trevisani, F.; Toteva, Z.
2014-06-01
The large potential and flexibility of the ServiceNow infrastructure based on "best practises" methods is allowing the migration of some of the ticketing systems traditionally used for the monitoring of the servers and services available at the CERN IT Computer Centre. This migration enables the standardization and globalization of the ticketing and control systems implementing a generic system extensible to other departments and users. One of the activities of the Service Management project together with the Computing Facilities group has been the migration of the ITCM structure based on Remedy to ServiceNow within the context of one of the ITIL processes called Event Management. The experience gained during the first months of operation has been instrumental towards the migration to ServiceNow of other service monitoring systems and databases. The usage of this structure is also extended to the service tracking at the Wigner Centre in Budapest.
2009-01-01
Review of Spatial-Database System Usability: Recommendations for the ADDNS Project
2007-12-01
basic GIS background information, with a closer look at spatial databases. A GIS is also a computer-based system designed to capture, manage...foundation for deploying enterprise-wide spatial information systems. According to Oracle® [18], it enables accurate delivery of location-based services...Toronto TR 2007-141 Lanter, D.P. (1991). Design of a lineage-based meta-data base for GIS. Cartography and Geographic Information Systems, 18
David N. Bengston; David P. Fan
1999-01-01
An indicator of the level of conflict over natural resource management was developed and applied to the case of U.S. national forest policy and management. Computer-coded content analysis was used to identify expressions of conflict in a national database of almost 10,000 news media stories about the U.S. Forest Service. Changes in the amount of news media discussion...
Integrative medicine for managing the symptoms of lupus nephritis
Choi, Tae-Young; Jun, Ji Hee; Lee, Myeong Soo
2018-01-01
Abstract Background: Integrative medicine is claimed to improve symptoms of lupus nephritis. No systematic reviews have been performed for the application of integrative medicine for lupus nephritis on patients with systemic lupus erythematosus (SLE). Thus, this review will aim to evaluate the current evidence on the efficacy of integrative medicine for the management of lupus nephritis in patients with SLE. Methods and analyses: The following electronic databases will be searched for studies published from their dates of inception to February 2018: Medline, EMBASE and the Cochrane Central Register of Controlled Trials (CENTRAL), as well as 6 Korean medical databases (Korea Med, the Oriental Medicine Advanced Search Integrated System [OASIS], DBpia, the Korean Medical Database [KM base], the Research Information Service System [RISS], and the Korean Studies Information Services System [KISS]), and 1 Chinese medical database (the China National Knowledge Infrastructure [CNKI]). Study selection, data extraction, and assessment will be performed independently by 2 researchers. The risk of bias (ROB) will be assessed using the Cochrane ROB tool. Dissemination: This systematic review will be published in a peer-reviewed journal and disseminated both electronically and in print. The review will be updated to inform and guide healthcare practice and policy. Trial registration number: PROSPERO 2018 CRD42018085205 PMID:29595669
Dollar, Daniel M; Gallagher, John; Glover, Janis; Marone, Regina Kenny; Crooker, Cynthia
2007-04-01
To support migration from print to electronic resources, the Cushing/Whitney Medical Library at Yale University reorganized its Technical Services Department to focus on managing electronic resources. The library hired consultants to help plan the changes and to present recommendations for integrating electronic resource management into every position. The library task force decided to focus initial efforts on the periodical collection. To free staff time to devote to electronic journals, most of the print subscriptions were switched to online only and new workflows were developed for e-journals. Staff learned new responsibilities such as activating e-journals, maintaining accurate holdings information in the online public access catalog and e-journals database ("electronic shelf reading"), updating the link resolver knowledgebase, and troubleshooting. All of the serials team members now spend significant amounts of time managing e-journals. The serials staff now spends its time managing the materials most important to the library's clientele (e-journals and databases). The team's proactive approach to maintenance work and rapid response to reported problems should improve patrons' experiences using e-journals. The library is taking advantage of new technologies such as an electronic resource management system, and library workflows and procedures will continue to evolve as technology changes.
Development of an electronic database for Acute Pain Service outcomes
Love, Brandy L; Jensen, Louise A; Schopflocher, Donald; Tsui, Ban CH
2012-01-01
BACKGROUND: Quality assurance is increasingly important in the current health care climate. An electronic database can be used for tracking patient information and as a research tool to provide quality assurance for patient care. OBJECTIVE: An electronic database was developed for the Acute Pain Service, University of Alberta Hospital (Edmonton, Alberta) to record patient characteristics, identify at-risk populations, compare treatment efficacies and guide practice decisions. METHOD: Steps in the database development involved identifying the goals for use, relevant variables to include, and a plan for data collection, entry and analysis. Protocols were also created for data cleaning quality control. The database was evaluated with a pilot test using existing data to assess data collection burden, accuracy and functionality of the database. RESULTS: A literature review resulted in an evidence-based list of demographic, clinical and pain management outcome variables to include. Time to assess patients and collect the data was 20 min to 30 min per patient. Limitations were primarily software related, although initial data collection completion was only 65% and accuracy of data entry was 96%. CONCLUSIONS: The electronic database was found to be relevant and functional for the identified goals of data storage and research. PMID:22518364
Data Management Applications for the Service Preparation Subsystem
NASA Technical Reports Server (NTRS)
Luong, Ivy P.; Chang, George W.; Bui, Tung; Allen, Christopher; Malhotra, Shantanu; Chen, Fannie C.; Bui, Bach X.; Gutheinz, Sandy C.; Kim, Rachel Y.; Zendejas, Silvino C.;
2009-01-01
These software applications provide intuitive User Interfaces (UIs) with a consistent look and feel for interaction with, and control of, the Service Preparation Subsystem (SPS). The elements of the UIs described here are the File Manager, Mission Manager, and Log Monitor applications. All UIs provide access to add/delete/update data entities in a complex database schema without requiring technical expertise on the part of the end users. These applications allow for safe, validated, catalogued input of data. Also, the software has been designed in multiple, coherent layers to promote ease of code maintenance and reuse in addition to reducing testing and accelerating maturity.
A Geospatial Database that Supports Derivation of Climatological Features of Severe Weather
NASA Astrophysics Data System (ADS)
Phillips, M.; Ansari, S.; Del Greco, S.
2007-12-01
The Severe Weather Data Inventory (SWDI) at NOAA's National Climatic Data Center (NCDC) provides user access to archives of several datasets critical to the detection and evaluation of severe weather. These datasets include archives of:
· NEXRAD Level-III point features describing general storm structure, hail, mesocyclone and tornado signatures
· National Weather Service Storm Events Database
· National Weather Service Local Storm Reports collected from storm spotters
· National Weather Service Warnings
· Lightning strikes from Vaisala's National Lightning Detection Network (NLDN)
SWDI archives all of these datasets in a spatial database that allows for convenient searching and subsetting. These data are accessible via the NCDC web site, Web Feature Services (WFS) or automated web services. The results of interactive web page queries may be saved in a variety of formats, including plain text, XML, Google Earth's KMZ, standards-based NetCDF and Shapefile. NCDC's Storm Risk Assessment Project (SRAP) uses data from the SWDI database to derive gridded climatology products that show the spatial distributions of the frequency of various events. SRAP also can relate SWDI events to other spatial data such as roads, population, watersheds, and other geographic, sociological, or economic data to derive products that are useful in municipal planning, emergency management, the insurance industry, and other areas where there is a need to quantify and qualify how severe weather patterns affect people and property.
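The gridded-climatology step SRAP performs reduces to counting point events in fixed latitude/longitude cells. A minimal sketch, with an invented cell size and event list (not real SWDI data):

```python
from collections import Counter

# Sketch of gridding severe-weather point reports into a frequency
# surface: count events per fixed-size lat/lon cell.

def grid_counts(events, cell=1.0):
    """events: iterable of (lat, lon); returns {(row, col): count}
    where row/col index cells of `cell` degrees."""
    counts = Counter()
    for lat, lon in events:
        counts[(int(lat // cell), int(lon // cell))] += 1
    return counts

hail_reports = [(35.2, -97.4), (35.8, -97.1), (36.4, -95.9)]
print(grid_counts(hail_reports))
```

Dividing each cell's count by the length of the archive in years turns these counts into the event-frequency climatology used for risk assessment.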
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buche, D. L.; Perry, S.
This report describes Northern Indiana Public Service Co. project efforts to develop an automated energy distribution and reliability system. The purpose of this project was to implement a database-driven GIS solution that would manage all of the company's gas, electric, and landbase objects.
Service management at CERN with Service-Now
NASA Astrophysics Data System (ADS)
Toteva, Z.; Alvarez Alonso, R.; Alvarez Granda, E.; Cheimariou, M.-E.; Fedorko, I.; Hefferman, J.; Lemaitre, S.; Clavo, D. Martin; Martinez Pedreira, P.; Pera Mira, O.
2012-12-01
The Information Technology (IT) and the General Services (GS) departments at CERN have decided to combine their extensive experience in support for IT and non-IT services towards a common goal - to bring the services closer to the end user based on Information Technology Infrastructure Library (ITIL) best practice. The collaborative efforts have so far produced definitions for the incident and the request fulfilment processes, which are based on a unique two-dimensional service catalogue that combines both the user and the support team views of all services. After an extensive evaluation of the available industrial solutions, Service-Now was selected as the tool to implement the CERN Service-Management processes. The initial release of the tool provided an attractive web portal for the users and successfully implemented two basic ITIL processes: the incident management and the request fulfilment processes. It also integrated with the CERN personnel databases and the LHC GRID ticketing system. Subsequent releases continued to integrate with other third-party tools, like the facility management systems of CERN, as well as to implement new processes such as change management. Independently from those new development activities, it was decided to simplify the request fulfilment process in order to achieve easier acceptance by the CERN user community. We believe that, due to the high modularity of the Service-Now tool, the parallel design of ITIL processes (e.g. event management) and non-ITIL processes (e.g. computer centre hardware management) will be easily achieved. This presentation will describe the experience that we have acquired and the techniques that were followed to achieve the CERN customization of the Service-Now tool.
Geoinformatics in the public service: building a cyberinfrastructure across the geological surveys
Allison, M. Lee; Gundersen, Linda C.; Richard, Stephen M.; Keller, G. Randy; Baru, Chaitanya
2011-01-01
Advanced information technology infrastructure is increasingly being employed in the Earth sciences to provide researchers with efficient access to massive central databases and to integrate diversely formatted information from a variety of sources. These geoinformatics initiatives enable manipulation, modeling and visualization of data in a consistent way, and are helping to develop integrated Earth models at various scales, and from the near surface to the deep interior. This book uses a series of case studies to demonstrate computer and database use across the geosciences. Chapters are thematically grouped into sections that cover data collection and management; modeling and community computational codes; visualization and data representation; knowledge management and data integration; and web services and scientific workflows. Geoinformatics is a fascinating and accessible introduction to this emerging field for readers across the solid Earth sciences and an invaluable reference for researchers interested in initiating new cyberinfrastructure projects of their own.
Chen, Zhijun; Zhu, Jing; Zhou, Mingjian
2015-03-01
Building on a social identity framework, our cross-level process model explains how a manager's servant leadership affects frontline employees' service performance, measured as service quality, customer-focused citizenship behavior, and customer-oriented prosocial behavior. Among a sample of 238 hairstylists in 30 salons and 470 of their customers, we found that hairstylists' self-identity embedded in the group, namely, self-efficacy and group identification, partially mediated the positive effect of salon managers' servant leadership on stylists' service performance as rated by the customers, after taking into account the positive influence of transformational leadership. Moreover, group competition climate strengthened the positive relationship between self-efficacy and service performance. PsycINFO Database Record (c) 2015 APA, all rights reserved.
Intelligent distributed medical image management
NASA Astrophysics Data System (ADS)
Garcia, Hong-Mei C.; Yun, David Y.
1995-05-01
The rapid advancements in high performance global communication have accelerated cooperative image-based medical services to a new frontier. Traditional image-based medical services such as radiology and diagnostic consultation can now fully utilize multimedia technologies in order to provide novel services, including remote cooperative medical triage, distributed virtual simulation of operations, as well as cross-country collaborative medical research and training. Fast (efficient) and easy (flexible) retrieval of relevant images remains a critical requirement for the provision of remote medical services. This paper describes the database system requirements, identifies technological building blocks for meeting the requirements, and presents a system architecture for our target image database system, MISSION-DBS, which has been designed to fulfill the goals of Project MISSION (medical imaging support via satellite integrated optical network) -- an experimental high performance gigabit satellite communication network with access to remote supercomputing power, medical image databases, and 3D visualization capabilities in addition to medical expertise anywhere and anytime around the country. The MISSION-DBS design employs a synergistic fusion of techniques in distributed databases (DDB) and artificial intelligence (AI) for storing, migrating, accessing, and exploring images. The efficient storage and retrieval of voluminous image information is achieved by integrating DDB modeling and AI techniques for image processing while the flexible retrieval mechanisms are accomplished by combining attribute-based and content-based retrievals.
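The combination of attribute-based and content-based retrieval can be illustrated with a toy ranking function: filter on metadata first, then rank survivors by feature-vector distance. A sketch only, with invented record fields, not the actual MISSION-DBS interface:

```python
def retrieve(images, attrs, query_vec):
    """Attribute-based filtering followed by content-based ranking:
    keep images whose metadata match, then sort by squared feature
    distance to the query vector."""
    def matches(img):
        return all(img["attrs"].get(k) == v for k, v in attrs.items())
    def distance(img):
        return sum((a - b) ** 2 for a, b in zip(img["features"], query_vec))
    return sorted((i for i in images if matches(i)), key=distance)

# Invented example records: two CT studies and one MR study.
catalog = [
    {"id": "ct-001", "attrs": {"modality": "CT"}, "features": [0.9, 0.1]},
    {"id": "mr-002", "attrs": {"modality": "MR"}, "features": [0.2, 0.8]},
    {"id": "ct-003", "attrs": {"modality": "CT"}, "features": [0.1, 0.9]},
]
hits = retrieve(catalog, {"modality": "CT"}, [0.0, 1.0])
```

In a real system the feature vectors would come from image processing and the distance metric would be far more elaborate, but the two-stage structure is the same.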
Integrating personal medicine into service delivery: empowering people in recovery.
MacDonald-Wilson, Kim L; Deegan, Patricia E; Hutchison, Shari L; Parrotta, Nancy; Schuster, James M
2013-12-01
Illness management and recovery strategies are considered evidence-based practices. The article describes how a web-based application, CommonGround, has been used to support implementation of such strategies in outpatient mental health services and assess its impact. The specific focus of this article is Personal Medicine, self-management strategies that are a salient component of the CommonGround intervention. With support from counties and a not-for-profit managed care organization, CommonGround has been introduced in 10 medication clinics, one Assertive Community Treatment (ACT) team, and one peer support center across Pennsylvania. Methods include analysis of data from the application's database and evaluation of health functioning, symptoms, and progress toward recovery. Health functioning improved over time and use of self-management strategies was associated with fewer concerns about medication side effects, fewer concerns about the impact of mental health medicine on physical health, more reports that mental health medicines were helping, and greater progress in individuals' recovery. Using Personal Medicine empowers individuals to work with their prescribers to find a "right balance" between what they do to be well and what they take to be well. This program helps individuals and their service team focus on individual strengths and resilient self-care strategies. More research is needed to assess factors that may predict changes in outcomes and how a web-based tool focused on self-management strategies may moderate those factors. PsycINFO Database Record (c) 2013 APA, all rights reserved.
Considerations and benefits of implementing an online database tool for business continuity.
Mackinnon, Susanne; Pinette, Jennifer
2016-01-01
In today's challenging climate of ongoing fiscal restraints, limited resources and complex organisational structures there is an acute need to investigate opportunities to facilitate enhanced delivery of business continuity programmes while maintaining or increasing acceptable levels of service delivery. In 2013, Health Emergency Management British Columbia (HEMBC), responsible for emergency management and business continuity activities across British Columbia's health sector, transitioned its business continuity programme from a manual to automated process with the development of a customised online database, known as the Health Emergency Management Assessment Tool (HEMAT). Key benefits to date include a more efficient business continuity input process, immediate situational awareness for use in emergency response and/or advanced planning and streamlined analyses for generation of reports.
Lee, S L
2000-05-01
Nurses, therapists and case managers were spending too much time each week on the phone waiting to read patient reports to live transcriptionists who would then type the reports for storage in VNSNY's clinical management mainframe database. A speech recognition system helped solve the problem by providing the staff 24-hour access to an automated transcription service any day of the week. Nurses and case managers no longer wait in long queues to transmit patient reports or to retrieve information from the database. Everything is done automatically within minutes. VNSNY saved both time and money by updating its transcription strategy. Now nurses can spend more time with patients and less time on the phone transcribing notes. It also means fewer staff members are needed on weekends to do manual transcribing.
NASA Astrophysics Data System (ADS)
Boulanger, Damien; Gautron, Benoit; Thouret, Valérie; Fontaine, Alain
2016-04-01
IAGOS (In-service Aircraft for a Global Observing System) is a European Research Infrastructure which aims at the provision of long-term, regular and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft. The IAGOS database is an essential part of the global atmospheric monitoring network. It contains IAGOS-core data and IAGOS-CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data. The IAGOS Database Portal (http://www.iagos.fr, damien.boulanger@obs-mip.fr) is part of the French atmospheric chemistry data center AERIS (http://www.aeris-data.fr). The new IAGOS Database Portal was released in December 2015. The main improvement is the interoperability implementation with international portals or other databases in order to improve IAGOS data discovery. In the framework of the IGAS project (IAGOS for the Copernicus Atmospheric Service), a data network has been set up. It is composed of three data centers: the IAGOS database in Toulouse; the HALO research aircraft database at DLR (https://halo-db.pa.op.dlr.de); and the CAMS data center in Jülich (http://join.iek.fz-juelich.de). The CAMS (Copernicus Atmospheric Monitoring Service) project is a prominent user of the IGAS data network. The new portal provides improved and new services such as download in NetCDF or NASA Ames formats, plotting tools (maps, time series, vertical profiles, etc.) and user management. Added-value products are available on the portal: back trajectories, origin of air masses, co-location with satellite data, etc. The link with the CAMS data center, through JOIN (Jülich OWS Interface), makes it possible to combine model outputs with IAGOS data for inter-comparison. Finally, IAGOS metadata has been standardized (ISO 19115) and now provides complete information about data traceability and quality.
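The NASA Ames download format mentioned above is a simple self-describing text layout. A minimal Python sketch of a header parser for the most common variant (FFI 1001); the sample content is invented, and real IAGOS files may use other FFIs and carry more header information than this handles:

```python
def parse_ames_1001(text):
    """Parse the header of a NASA Ames FFI=1001 file: fixed header lines,
    then one name line per dependent variable, then the data records."""
    lines = text.splitlines()
    nlhead, ffi = (int(tok) for tok in lines[0].split())
    if ffi != 1001:
        raise ValueError("only FFI 1001 is handled in this sketch")
    nvars = int(lines[9])          # number of dependent variables
    header = {
        "originator": lines[1].strip(),
        "independent_var": lines[8].strip(),
        "dependent_vars": [lines[12 + i].strip() for i in range(nvars)],
    }
    return header, lines[nlhead:]  # header metadata, then the data records

# An invented, minimal FFI 1001 file for illustration.
sample = """16 1001
Boulanger, D.
CNRS
IAGOS
IAGOS-core
1 1
2016 01 01 2016 04 01
0
UTC_time (s)
2
1 1
99999 99999
Ozone (ppb)
CO (ppb)
0
0
1.0 30.0 120.0"""
header, records = parse_ames_1001(sample)
```

A production reader would also honor the scale factors, missing-value flags and comment blocks that this sketch skips over.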
Towards an e-Health Cloud Solution for Remote Regions at Bahia-Brazil.
Sarinho, V T; Mota, A O; Silva, E P
2017-12-19
This paper presents CloudMedic, an e-Health Cloud solution that manages health care services in remote regions of Bahia-Brazil. For that, six main modules (Clinic, Hospital, Supply, Administrative, Billing and Health Business Intelligence) were developed to control the health flow among health actors at health institutions. They provide a database model and procedures for health business rules, a standard gateway for data maintenance between web views and the database layer, and a multi-front-end framework based on web views and web commands configurations. These resources were used by 2042 health actors in 261 health posts covering health demands from 118 municipalities in Bahia state. They also managed approximately 2.4 million health service orders and approximately 13.5 million health exams for more than 1.3 million registered patients. As a result, a collection of health functionalities available in a cloud infrastructure was successfully developed, deployed and validated in more than 28% of Bahia municipalities. The result is a viable e-Health Cloud solution that, despite municipality limitations in remote regions, has decentralized and improved access to health care services in Bahia state.
1992-07-01
database programs, such as dBase or Microsoft Excel, to yield statistical reports that can profile the health care facility. Ladeen (1989) feels that the...service-specific space status report would be beneficial to the specific service(s) under study, it would not provide sufficient data for facility-wide...change in the Master Space Plan. The revised methodology also provides a mechanism and forum for space management education within the facility. The
Implementation of a Computerized Maintenance Management System
NASA Technical Reports Server (NTRS)
Shen, Yong-Hong; Askari, Bruce
1994-01-01
A primer Computerized Maintenance Management System (CMMS) has been established for NASA Ames pressure component certification program. The CMMS takes full advantage of the latest computer technology and SQL relational database to perform periodic services for vital pressure components. The Ames certification program is briefly described and the aspects of the CMMS implementation are discussed as they are related to the certification objectives.
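The kind of periodic-service tracking a CMMS performs can be sketched with a small SQL schema and a due-date query. The table layout below is illustrative only, not the actual Ames certification schema:

```python
import sqlite3

# Toy CMMS table: each pressure component records its last service date
# and its required service interval in days.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE component (
    id INTEGER PRIMARY KEY,
    name TEXT,
    last_serviced TEXT,     -- ISO date
    interval_days INTEGER)""")
con.executemany("INSERT INTO component VALUES (?,?,?,?)", [
    (1, "relief valve RV-12", "1994-01-01", 365),
    (2, "pressure vessel PV-3", "1993-06-15", 180),
])

# Components whose next service date falls on or before the given date.
due = con.execute("""
    SELECT name FROM component
    WHERE date(last_serviced, '+' || interval_days || ' days') <= date(?)
""", ("1994-06-01",)).fetchall()
```

The scheduling logic lives entirely in the relational query, which is the appeal of backing such a system with a SQL database.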
Delayed Instantiation Bulk Operations for Management of Distributed, Object-Based Storage Systems
2009-08-01
source and destination object sets, while they have attribute pages to indicate that history. Fourth, we allow for operations to occur on any objects...client dialogue to the PostgreSQL database where server-side functions implement the service logic for the requests. The translation is done...to satisfy client requests, and performs delayed instantiation bulk operations. It is built around a PostgreSQL database with tables for storing
Integrated cluster management at Manchester
NASA Astrophysics Data System (ADS)
McNab, Andrew; Forti, Alessandra
2012-12-01
We describe an integrated management system using third-party, open source components used in operating a large Tier-2 site for particle physics. This system tracks individual assets and records their attributes such as MAC and IP addresses; derives DNS and DHCP configurations from this database; creates each host's installation and re-configuration scripts; monitors the services on each host according to the records of what should be running; and cross references tickets with asset records and per-asset monitoring pages. In addition, scripts which detect problems and automatically remove hosts record these new states in the database which are available to operators immediately through the same interface as tickets and monitoring.
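Deriving DHCP configuration from the asset database, as described above, is essentially a templating step. A minimal Python sketch that renders ISC dhcpd `host` stanzas from asset records; the field names are invented, not the actual Manchester schema:

```python
def dhcpd_host_stanzas(assets):
    """Render one ISC dhcpd 'host' stanza per asset record, pinning each
    host's IP address to its MAC address."""
    template = ("host {hostname} {{\n"
                "  hardware ethernet {mac};\n"
                "  fixed-address {ip};\n"
                "}}")
    return "\n".join(template.format(**a) for a in assets)

# Invented asset records standing in for rows from the asset database.
conf = dhcpd_host_stanzas([
    {"hostname": "wn001", "mac": "00:16:3e:aa:bb:01", "ip": "10.0.0.11"},
    {"hostname": "wn002", "mac": "00:16:3e:aa:bb:02", "ip": "10.0.0.12"},
])
```

DNS zone files, monitoring target lists and installation scripts can be generated from the same records, which is what keeps the database authoritative.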
VOMS/VOMRS utilization patterns and convergence plan
NASA Astrophysics Data System (ADS)
Ceccanti, A.; Ciaschini, V.; Dimou, M.; Garzoglio, G.; Levshina, T.; Traylen, S.; Venturi, V.
2010-04-01
The Grid community uses two well-established registration services, which allow users to be authenticated under the auspices of Virtual Organizations (VOs). The Virtual Organization Membership Service (VOMS), developed in the context of the Enabling Grid for E-sciencE (EGEE) project, is an Attribute Authority service that issues attributes expressing membership information of a subject within a VO. VOMS allows users to be partitioned into groups and assigned roles and free-form attributes, which are then used to drive authorization decisions. The VOMS administrative application, VOMS-Admin, manages and populates the VOMS database with membership information. The Virtual Organization Management Registration Service (VOMRS), developed at Fermilab, extends the basic registration and management functionalities present in VOMS-Admin. It implements a registration workflow that requires VO usage policy acceptance and membership approval by administrators. VOMRS supports management of multiple grid certificates, handling of users' requests for group and role assignments, and membership status. VOMRS is capable of interfacing to local systems with personnel information (e.g. the CERN Human Resource Database) and of pulling relevant member information from them. VOMRS synchronizes the relevant subset of information with VOMS. The recent development of new features in VOMS-Admin raises the possibility of rationalizing the support and converging on a single solution by continuing and extending existing collaborations between EGEE and OSG. Such a strategy is supported by WLCG, OSG, US CMS, US Atlas, and other stakeholders worldwide. In this paper, we will analyze features in use by major experiments and the use cases for registration addressed by the mature single solution.
NASA Astrophysics Data System (ADS)
Kuznetsov, Valentin; Riley, Daniel; Afaq, Anzar; Sekhri, Vijay; Guo, Yuyi; Lueking, Lee
2010-04-01
The CMS experiment has implemented a flexible and powerful system enabling users to find data within the CMS physics data catalog. The Dataset Bookkeeping Service (DBS) comprises a database and the services used to store and access metadata related to CMS physics data. To this, we have added a generalized query system in addition to the existing web and programmatic interfaces to the DBS. This query system is based on a query language that hides the complexity of the underlying database structure by discovering the join conditions between database tables. This provides a way of querying the system that is simple and straightforward for CMS data managers and physicists to use without requiring knowledge of the database tables or keys. The DBS Query Language uses the ANTLR tool to build the input query parser and tokenizer, followed by a query builder that uses a graph representation of the DBS schema to construct the SQL query sent to the underlying database. We will describe the design of the query system, provide details of the language components, and give an overview of how this component fits into the overall data discovery system architecture.
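The join-discovery idea can be illustrated with a breadth-first search over a graph whose edges are foreign-key links between tables: the query builder only needs the chain of tables connecting what the user asked for. The table names below are a toy stand-in, not the real DBS schema:

```python
from collections import deque

# Toy schema graph: an undirected adjacency map of foreign-key relations.
SCHEMA = {
    "dataset": {"block"},
    "block": {"dataset", "file"},
    "file": {"block", "run"},
    "run": {"file"},
}

def join_path(src, dst):
    """Discover the shortest chain of join conditions between two tables
    by breadth-first search over the schema graph."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in SCHEMA[path[-1]] - seen:
            seen.add(nxt)
            queue.append(path + [nxt])
    return None  # tables are not connected
```

From the returned path, emitting the SQL `JOIN ... ON` clauses is a mechanical step, which is how such a language can hide table and key names from the user.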
The Neotoma Paleoecology Database
NASA Astrophysics Data System (ADS)
Grimm, E. C.; Ashworth, A. C.; Barnosky, A. D.; Betancourt, J. L.; Bills, B.; Booth, R.; Blois, J.; Charles, D. F.; Graham, R. W.; Goring, S. J.; Hausmann, S.; Smith, A. J.; Williams, J. W.; Buckland, P.
2015-12-01
The Neotoma Paleoecology Database (www.neotomadb.org) is a multiproxy, open-access, relational database that includes fossil data for the past 5 million years (the late Neogene and Quaternary Periods). Modern distributional data for various organisms are also being made available for calibration and paleoecological analyses. The project is a collaborative effort among individuals from more than 20 institutions worldwide, including domain scientists representing a spectrum of Pliocene-Quaternary fossil data types, as well as experts in information technology. Working groups are active for diatoms, insects, ostracodes, pollen and plant macroscopic remains, testate amoebae, rodent middens, vertebrates, age models, geochemistry and taphonomy. Groups are also active in developing online tools for data analyses and for developing modules for teaching at different levels. A key design concept of NeotomaDB is that stewards for various data types are able to remotely upload and manage data. Cooperatives for different kinds of paleo data, or from different regions, can appoint their own stewards. Over the past year, much progress has been made on development of the steward software-interface that will enable this capability. The steward interface uses web services that provide access to the database. More generally, these web services enable remote programmatic access to the database, which both desktop and web applications can use and which provide real-time access to the most current data. Use of these services can alleviate the need to download the entire database, which can be out-of-date as soon as new data are entered. In general, the Neotoma web services deliver data either from an entire table or from the results of a view. Upon request, new web services can be quickly generated. Future developments will likely expand the spatial and temporal dimensions of the database. NeotomaDB is open to receiving new datasets and stewards from the global Quaternary community. 
Research is supported by NSF EAR-0622349.
The Vendors' Corner: Biblio-Techniques' Library and Information System (BLIS).
ERIC Educational Resources Information Center
Library Software Review, 1984
1984-01-01
Describes online catalog and integrated library computer system designed to enhance Washington Library Network's software. Highlights include system components; implementation options; system features (integrated library functions, database design, system management facilities); support services (installation and training, software maintenance and…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-10
... Request (ICR), Office of Management and Budget (OMB) control number 1652-0034, abstracted below that we... Air Marshal Service (FAMS) maintenance of a database of all Federal, State and local law enforcement...
Yoshioka, Yoji; Tamiya, Nanako; Kashiwagi, Masayo; Sato, Mikiya; Okubo, Ichiro
2010-01-01
Long-Term Care Insurance (LTCI), which started in April 2000, allowed private business corporations to provide long-term care services which had been provided by social welfare corporations or public agencies in the previous long-term care scheme. This study compared differences in care management plans for community-dwelling frail elderly people between public care management agencies and private care management agencies. The subjects were 309 community-dwelling frail elderly people living in a suburban city with a population of approximately 55,000 and who had been using community-based long-term care services of the LTCI for 6 months from April 2000. The characteristics of the care management agencies (public/private) were identified using a claims database. After comparing profiles of users and their care mix between those managed by public agencies and by private agencies, the effect of the characteristics of care management agencies on LTCI service use was examined. Public care management agencies favored younger subjects (P = 0.003), male subjects (P = 0.006) and people with a higher need for care (P = 0.02) than private agencies. The number of service items used was significantly larger in public agencies than in their private counterparts. In multivariate regression analysis, the utilization of community-based long-term care service was significantly greater among beneficiaries managed by private agencies than those managed by public agencies (P = 0.02). Private care management agencies play an important role in promoting the use of care services, but their quality of care plans might be questionable.
History and use of remote sensing for conservation and management of federal lands in Alaska, USA
Markon, Carl
1995-01-01
Remote sensing has been used to aid land use planning efforts for federal public lands in Alaska since the 1940s. Four federal land management agencies (the U.S. Fish and Wildlife Service, U.S. Bureau of Land Management, U.S. National Park Service, and U.S. Forest Service) have used aerial photography and satellite imagery to document the extent, type, and condition of Alaska's natural resources. Aerial photographs have been used to collect detailed information over small to medium-sized areas. This standard management tool is obtainable using equipment ranging from hand-held 35-mm cameras to precision metric mapping cameras. Satellite data, equally important, provide synoptic views of landscapes, are digitally manipulatable, and are easily merged with other digital databases. To date, over 109.2 million ha (72%) of Alaska's land cover has been mapped via remote sensing. This information has provided a base for conservation, management, and planning on federal public lands in Alaska.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chao, Tian-Jy; Kim, Younghun
An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.
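The data-definition-language step, deriving a table schema from the data model, can be sketched as a simple generator that turns an entity definition into a CREATE TABLE statement. Entity and field names here are invented examples, not the patent's actual model:

```python
def create_table_ddl(entity, fields):
    """Generate a CREATE TABLE statement from a data-model entity
    definition given as an ordered mapping of column name to SQL type."""
    cols = ",\n  ".join(f"{name} {sqltype}" for name, sqltype in fields.items())
    return f"CREATE TABLE {entity} (\n  {cols}\n);"

# Hypothetical BIM entity describing a building.
ddl = create_table_ddl("building", {
    "id": "INTEGER PRIMARY KEY",
    "name": "TEXT",
    "floor_area_m2": "REAL",
})
```

Generating DDL from the model, rather than hand-writing it, keeps the database schema and the data model from drifting apart.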
AGM: A DSL for mobile cloud computing based on directed graph
NASA Astrophysics Data System (ADS)
Tanković, Nikola; Grbac, Tihana Galinac
2016-06-01
This paper summarizes a novel approach for consuming a domain specific language (DSL) by transforming it to a directed graph representation persisted by a graph database. Using such a specialized database enables advanced navigation through the stored model, exposing only relevant subsets of meta-data to different involved services and components. We applied this approach in a mobile cloud computing system and used it to model several mobile applications in the retail, supply chain management and merchandising domains. These applications are distributed in a Software-as-a-Service (SaaS) fashion and used by thousands of customers in Croatia. We report on lessons learned and propose further research on this topic.
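The core transformation, a DSL model stored as labeled nodes and edges that services navigate selectively, can be sketched with an in-memory graph store. The class and its API names are invented stand-ins for the graph database described in the paper:

```python
class ModelGraph:
    """Directed-graph store for a DSL model: nodes carry metadata,
    labeled edges carry the relations between model elements."""
    def __init__(self):
        self.nodes, self.edges = {}, []

    def add_node(self, nid, **meta):
        self.nodes[nid] = meta

    def add_edge(self, src, dst, label):
        self.edges.append((src, dst, label))

    def neighbours(self, nid, label=None):
        """Navigate outward from a node, optionally restricted to one
        edge label, exposing only the relevant subset of the model."""
        return [d for s, d, l in self.edges
                if s == nid and (label is None or l == label)]

# A tiny invented mobile-app model.
g = ModelGraph()
g.add_node("app", kind="MobileApp")
g.add_node("orders", kind="Screen")
g.add_node("sync", kind="Service")
g.add_edge("app", "orders", "contains")
g.add_edge("app", "sync", "uses")
```

A real graph database adds persistence and indexing, but the navigation pattern that different components rely on is the same labeled-edge traversal.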
Cervone, Maria A; Savel, Thomas G
2006-01-01
The National Center on Birth Defects and Developmental Disabilities (NCBDDD) at the Centers for Disease Control and Prevention (CDC) sought to establish a database to proactively manage their partner relationships with external organizations. A user needs analysis was conducted, and CDC's Public Health Information Network Directory (PHINDIR) was evaluated as a possible solution. PHINDIR could sufficiently maintain contact information but did not address customer relationships; however, its flexible architecture allows add-on applications via web services. Thus, NCBDDD's needs could be met via PHINDIR.
Is Case Management Effective for Long-Lasting Suicide Prevention?
Wang, Liang-Jen; Wu, Ya-Wen; Chen, Chih-Ken
2015-01-01
Case management services have been implemented in suicide prevention programs. To investigate whether case management is an effective strategy for reducing the risks of repeated suicide attempts and completed suicides in a city with high suicide rates in northern Taiwan. The Suicide Prevention Center of Keelung City (KSPC) was established in April 2005. Subjects included a consecutive sample of individuals (N = 2,496) registered in KSPC databases between January 1, 2006, and December 31, 2011, with at least one episode of nonfatal self-harm. Subjects were tracked for the duration of the study. Of all the subjects, 1,013 (40.6%) received case management services; 416 (16.7%) had at least one other deliberate self-harm episode and 52 (2.1%) eventually died by suicide. No significant differences were found in the risks of repeated self-harm and completed suicides between suicide survivors who received case management and those who refused the services. However, a significant reduction in suicide rates was found after KSPC was established. Findings suggest that case management services might not reduce the risks of suicide repetition among suicide survivors during long-term follow-up. Future investigation is warranted to determine factors impacting the downward trend of suicide rates.
A GH-Based Ontology to Support Applications for Automating Decision Support
2005-03-01
architecture for a decision support system. For this reason, it obtains data from, and updates, a database. IDA also wanted the prototype's architecture...Chief Information Officer CoABS Control of Agent Based Systems DBMS Database Management System DoD Department of Defense DTD Document Type...Generic Hub, the Moyeu Générique, and the Generische Nabe, specifying each as a separate service description with property names and values of the GH
Wang, Y; Yeo, Q Q; Ko, Y
2016-04-01
To review and evaluate the most recent literature on the economic outcomes of pharmacist-managed services in people with diabetes. The global prevalence of diabetes is increasing. Although pharmacist-managed services have been shown to improve people's health outcomes, the economic impact of these programmes remains unclear. A systematic review was conducted of six databases. Study inclusion criteria were: (1) original research; (2) evaluation of pharmacist-managed services in people with diabetes; (3) an economic evaluation; (4) English-language publication; and (5) full-text, published between January 2006 and December 2014. The quality of the full economic evaluations reviewed was evaluated using the Consolidated Health Economic Evaluation Reporting Standards checklist. A total of 2204 articles were screened and 25 studies were selected. These studies were conducted in a community pharmacy (n = 10), a clinic- /hospital-based outpatient facility (n = 8), or others. Pharmacist-managed services included targeted education (n = 24), general pharmacotherapeutic monitoring (n = 21), health screening or laboratory testing services (n = 9), immunization services (n = 2) and pharmacokinetic monitoring (n = 1). Compared with usual care, pharmacist-managed services resulted in cost savings that varied from $7 to $65,000 ($8 to $85,000 in 2014 US dollars) per person per year, and generated higher quality-adjusted life years with lower costs. Benefit-to-cost ratios ranged from 1:1 to 8.5:1. Among the 25 studies reviewed, 11 were full economic evaluations of moderate quality. Pharmacist-managed services had a positive return in terms of economic viability. With the expanding role of pharmacists in the healthcare sector, alongside increasing health expenditure, future economic studies of high quality are needed to investigate the cost-effectiveness of these services. © 2015 Diabetes UK.
Schools Inc.: An Administrator's Guide to the Business of Education.
ERIC Educational Resources Information Center
McCarthy, Bob; And Others
1989-01-01
This theme issue describes ways in which educational administrators are successfully automating many of their administrative tasks. Articles focus on student management; office automation, including word processing, databases, and spreadsheets; human resources; support services, including supplies, textbooks, and learning resources; financial…
Fifteen hundred guidelines and growing: the UK database of clinical guidelines.
van Loo, John; Leonard, Niamh
2006-06-01
The National Library for Health offers a comprehensive searchable database of nationally approved clinical guidelines, called the Guidelines Finder. This resource, commissioned in 2002, is managed and developed by the University of Sheffield Health Sciences Library. The authors introduce the historical and political dimension of guidelines and the nature of guidelines as a mechanism to ensure clinical effectiveness in practice. The article then outlines the maintenance and organisation of the Guidelines Finder database itself, the criteria for selection, who publishes guidelines and guideline formats, usage of the Guidelines Finder service and finally looks at some lessons learnt from a local library offering a national service. Clinical guidelines are central to effective clinical practice at the national, organisational and individual level. The Guidelines Finder is one of the most visited resources within the National Library for Health and is successful in answering information needs related to specific patient care, clinical research, guideline development and education.
Karp, Peter D; Paley, Suzanne; Romero, Pedro
2002-01-01
Bioinformatics requires reusable software tools for creating model-organism databases (MODs). The Pathway Tools is a reusable, production-quality software environment for creating a type of MOD called a Pathway/Genome Database (PGDB). A PGDB such as EcoCyc (see http://ecocyc.org) integrates our evolving understanding of the genes, proteins, metabolic network, and genetic network of an organism. This paper provides an overview of the four main components of the Pathway Tools: The PathoLogic component supports creation of new PGDBs from the annotated genome of an organism. The Pathway/Genome Navigator provides query, visualization, and Web-publishing services for PGDBs. The Pathway/Genome Editors support interactive updating of PGDBs. The Pathway Tools ontology defines the schema of PGDBs. The Pathway Tools makes use of the Ocelot object database system for data management services for PGDBs. The Pathway Tools has been used to build PGDBs for 13 organisms within SRI and by external users.
NASA Astrophysics Data System (ADS)
Boulanger, Damien; Gautron, Benoit; Schultz, Martin; Brötz, Björn; Rauthe-Schöch, Armin; Thouret, Valérie
2015-04-01
IAGOS (In-service Aircraft for a Global Observing System) aims at the provision of long-term, frequent, regular, accurate, and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft. The IAGOS database is an essential part of the global atmospheric monitoring network. Data access is handled under an open access policy based on the submission of research requests, which are reviewed by the PIs. The IAGOS database (http://www.iagos.fr, damien.boulanger@obs-mip.fr) is part of the French atmospheric chemistry data centre Ether (CNES and CNRS). In the framework of the IGAS project (IAGOS for Copernicus Atmospheric Service), interoperability with international portals and other databases is implemented in order to improve IAGOS data discovery. The IGAS data network is composed of three data centres: the IAGOS database in Toulouse, including IAGOS-core data and IAGOS-CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data since January 2015; the HALO research aircraft database at DLR (https://halo-db.pa.op.dlr.de); and the MACC data centre in Jülich (http://join.iek.fz-juelich.de). The MACC (Monitoring Atmospheric Composition and Climate) project is a prominent user of the IGAS data network. In June 2015 a new version of the IAGOS database will be released, providing improved services such as download in NetCDF or NASA Ames formats; graphical tools (maps, scatter plots, etc.); standardized metadata (ISO 19115); and better user management. The link with the MACC data centre, through JOIN (Jülich OWS Interface), will allow model outputs to be combined with IAGOS data for intercomparison. The interoperability within the IGAS data network, implemented through many web services, will improve the functionalities of the web interfaces of each data centre.
Kang, Young Gon; Suh, Eunkyung; Lee, Jae-woo; Kim, Dong Wook; Cho, Kyung Hee; Bae, Chul-Young
2018-01-01
Purpose A comprehensive health index is needed to measure an individual’s overall health and aging status, predict the risk of death and age-related disease incidence, and evaluate the effect of a health management program. The purpose of this study is to demonstrate the validity of estimated biological age (BA) in relation to all-cause mortality and age-related disease incidence based on a National Sample Cohort database. Patients and methods This study was based on the National Sample Cohort database of the National Health Insurance Service – Eligibility database and the National Health Insurance Service – Medical and Health Examination database for the years 2002 through 2013. A BA model was developed based on the National Health Insurance Service – National Sample Cohort (NHIS – NSC) database, and Cox proportional hazard analysis was done for mortality and major age-related disease incidence. Results For every 1-year increase in the difference between calculated BA and chronological age, the hazard ratio for mortality significantly increased by 1.6% (1.5% in men and 2.0% in women), and the hazard ratios for hypertension, diabetes mellitus, heart disease, stroke, and cancer incidence increased by 2.5%, 4.2%, 1.3%, 1.6%, and 0.4%, respectively (p<0.001). Conclusion Estimated BA by the developed BA model based on the NHIS – NSC database is expected to be used not only as an index for assessing health and aging status and predicting mortality and major age-related disease incidence, but can also be applied to various health care fields. PMID:29593385
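The arithmetic behind these per-year percentages follows from the standard Cox proportional hazards identity HR = exp(β); a minimal sketch, assuming only the 1.6% mortality figure reported in the abstract (the function and variable names are illustrative):

```python
import math

def hazard_ratio(beta: float) -> float:
    # Cox model: hazard ratio for a one-unit covariate increase is exp(beta)
    return math.exp(beta)

def coefficient(hr: float) -> float:
    # Inverse: the log hazard ratio (regression coefficient)
    return math.log(hr)

# A 1.6% increase in mortality hazard per year of (BA - chronological age)
# corresponds to HR = 1.016, i.e. beta = ln(1.016):
beta = coefficient(1.016)

# Hazards multiply, so a 10-year BA excess scales the hazard by
# exp(10 * beta) = 1.016 ** 10, not by 10 * 1.6%:
ten_year = hazard_ratio(10 * beta)
```

This multiplicative behaviour is why small per-year percentages compound into substantial risk differences over larger age gaps.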
Chen, Hsiao-Mei; Han, Tung-Chen; Chen, Ching-Min
2014-04-01
Population aging has caused significant rises in the prevalence of chronic diseases and the utilization of healthcare services in Taiwan. The current healthcare delivery system is fragmented. Integrating medical services may increase the quality of healthcare, enhance patient and patient family satisfaction with healthcare services, and better contain healthcare costs. This article introduces two continuing care models: discharge planning and case management. Further, the effectiveness and essential components of these two models are analyzed using a systematic review method. Articles included in this systematic review were all original articles on discharge-planning or case-management interventions published between February 1999 and March 2013 in any of 6 electronic databases (Medline, PubMed, Cinahl Plus with full Text, ProQuest, Cochrane Library, CEPS and Center for Chinese Studies electronic databases). Of the 70 articles retrieved, only 7 were randomized controlled trial studies. Three types of continuity-of-care models were identified: discharge planning, case management, and a hybrid of these two. All three models used logical and systematic processes to conduct assessment, planning, implementation, coordination, follow-up, and evaluation activities. Both the discharge planning model and the case management model were positively associated with improved self-care knowledge, reduced length of stay, decreased medical costs, and better quality of life. This study cross-referenced all reviewed articles in terms of target clients, content, intervention schedules, measurements, and outcome indicators. Study results may be referenced in future implementations of continuity-care models and may provide a reference for future research.
Managing hydrological measurements for small and intermediate projects: RObsDat
NASA Astrophysics Data System (ADS)
Reusser, Dominik E.
2014-05-01
Hydrological measurements need good management if the data are not to be lost. Multiple, often overlapping files from various loggers with heterogeneous formats need to be merged. Data need to be validated and cleaned and subsequently converted to the format for the hydrological target application. Preferably, all these steps should be easily traceable. RObsDat is an R package designed to support such data management. It comes with a command line user interface to support hydrologists in entering and adjusting their data in a database following the Observations Data Model (ODM) standard by CUAHSI. RObsDat helps in the setup of the database within one of the free database engines MySQL, PostgreSQL or SQLite. It imports the controlled water vocabulary from the CUAHSI web service and provides a smart interface between the hydrologist and the database: already existing data entries are detected and duplicates avoided. The data import function converts different data table designs to make import simple. Cleaning and modifications of data are handled with a simple version control system. Variable and location names are treated in a user-friendly way, accepting and processing multiple versions. A new development is the use of spacetime objects for subsequent processing.
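RObsDat itself is an R package built on the CUAHSI ODM schema; as a rough illustration of the duplicate-avoiding import behaviour described above, here is a minimal Python/SQLite sketch (the table layout and column names are invented and far simpler than the real ODM):

```python
import sqlite3

# Hypothetical minimal ODM-style observations table with a uniqueness
# constraint on (site, variable, timestamp)
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE observations (
        site TEXT, variable TEXT, utc_time TEXT, value REAL,
        UNIQUE (site, variable, utc_time)
    )
""")

def import_rows(rows):
    # Insert logger rows, silently skipping already-present duplicates --
    # the kind of behaviour RObsDat provides for overlapping logger files
    conn.executemany(
        "INSERT OR IGNORE INTO observations VALUES (?, ?, ?, ?)", rows)
    conn.commit()

file_a = [("W1", "discharge", "2014-05-01T00:00", 1.2),
          ("W1", "discharge", "2014-05-01T01:00", 1.3)]
file_b = [("W1", "discharge", "2014-05-01T01:00", 1.3),   # overlaps file_a
          ("W1", "discharge", "2014-05-01T02:00", 1.1)]
import_rows(file_a)
import_rows(file_b)
n = conn.execute("SELECT COUNT(*) FROM observations").fetchone()[0]
```

Despite four inserted rows across the two files, the overlap is detected and only three distinct observations remain.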
NASA Astrophysics Data System (ADS)
Seamon, E.; Gessler, P. E.; Flathers, E.; Sheneman, L.; Gollberg, G.
2013-12-01
The Regional Approaches to Climate Change for Pacific Northwest Agriculture (REACCH PNA) is a five-year USDA/NIFA-funded coordinated agriculture project to examine the sustainability of cereal crop production systems in the Pacific Northwest, in relationship to ongoing climate change. As part of this effort, an extensive data management system has been developed to enable researchers, students, and the public to upload, manage, and analyze various data. The REACCH PNA data management team has developed three core systems to encompass cyberinfrastructure and data management needs: 1) the reacchpna.org portal (https://www.reacchpna.org) is the entry point for all public and secure information, with secure access by REACCH PNA members for data analysis, uploading, and informational review; 2) the REACCH PNA Data Repository is a replicated, redundant database server environment that allows for file and database storage and access to all core data; and 3) the REACCH PNA Libraries, which are functional groupings of data for REACCH PNA members and the public, based on their access level. These libraries are accessible through our https://www.reacchpna.org portal. The developed system is structured in a virtual server environment (data, applications, web) that includes a geospatial database/geospatial web server for web mapping services (ArcGIS Server), use of ESRI's Geoportal Server for data discovery and metadata management (under the ISO 19115-2 standard), Thematic Realtime Environmental Distributed Data Services (THREDDS) for data cataloging, and Interactive Python notebook server (IPython) technology for data analysis. REACCH systems are housed and maintained by the Northwest Knowledge Network project (www.northwestknowledge.net), which provides data management services to support research.
Initial project data harvesting and meta-tagging efforts have resulted in the interrogation and loading of over 10 terabytes of climate model output, regional entomological data, agricultural and atmospheric information, as well as imagery, publications, videos, and other soft content. In addition, the outlined data management approach has focused on the integration and interconnection of hard data (raw data output) with associated publications, presentations, or other narrative documentation - through metadata lineage associations. This harvest-and-consume data management methodology could additionally be applied to other research team environments that involve large and divergent data.
A service-based framework for pharmacogenomics data integration
NASA Astrophysics Data System (ADS)
Wang, Kun; Bai, Xiaoying; Li, Jing; Ding, Cong
2010-08-01
Data are central to scientific research and practices. The advance of experimental methods and information retrieval technologies has led to explosive growth of scientific data and databases. However, due to heterogeneity in data formats, structures and semantics, it is hard to integrate the diversified data that grow explosively and analyse them comprehensively. As more and more public databases become accessible through standard protocols like programmable interfaces and Web portals, Web-based data integration becomes a major trend in managing and synthesising data that are stored in distributed locations. Mashup, a Web 2.0 technique, presents a new way to compose content and software from multiple resources. The paper proposes a layered framework for integrating pharmacogenomics data in a service-oriented approach using the mashup technology. The framework separates the integration concerns into three perspectives: data, process and Web-based user interface. Each layer encapsulates the heterogeneity issues of one aspect. To facilitate the mapping and convergence of data, an ontology mechanism is introduced to provide consistent conceptual models across different databases and experiment platforms. To support user-interactive and iterative service orchestration, a context model is defined to capture information about users, tasks and services, which can be used for service selection and recommendation during a dynamic service composition process. A prototype system is implemented and case studies are presented to illustrate the promising capabilities of the proposed approach.
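The ontology-based mapping layer described above can be caricatured as a per-source field mapping onto one canonical schema; everything below (source names, field names, records) is invented purely for illustration of the idea, not taken from the paper:

```python
# Hypothetical field mappings: each source database exposes the same concept
# (gene, drug, effect) under a different key; an ontology layer maps them
# onto one canonical schema so records become comparable
MAPPINGS = {
    "db_a": {"geneSym": "gene", "compound": "drug", "eff": "effect"},
    "db_b": {"gene_id": "gene", "drugName": "drug", "outcome": "effect"},
}

def to_canonical(source: str, record: dict) -> dict:
    # Rename known fields to canonical names; drop fields with no mapping
    mapping = MAPPINGS[source]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

a = to_canonical("db_a", {"geneSym": "CYP2C9", "compound": "warfarin",
                          "eff": "reduced clearance"})
b = to_canonical("db_b", {"gene_id": "CYP2C9", "drugName": "warfarin",
                          "outcome": "reduced clearance"})
# Records from both sources now share one schema and can be merged or compared
```

The real framework layers this data mapping beneath process orchestration and a Web UI, but the unifying move is the same: heterogeneity is absorbed at the boundary.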
An approach to efficient mobility management in intelligent networks
NASA Technical Reports Server (NTRS)
Murthy, K. M. S.
1995-01-01
Providing personal communication systems supporting full mobility requires intelligent networks for tracking mobile users and facilitating outgoing and incoming calls over different physical and network environments. Databases play a major role in realizing these intelligent network functionalities. Currently proposed network architectures envision using the SS7-based signaling network for linking these DBs and for interconnecting DBs with switches. If the network is to support ubiquitous, seamless mobile services, then it must additionally support the mobile application parts, viz., mobile-origination calls, mobile-destination calls, mobile location updates and inter-switch handovers. These functions will generate a significant amount of data and require very efficient transfer between databases (HLR, VLR) and switches (MSCs). In the future, users (fixed or mobile) may use and communicate with sophisticated CPEs (e.g. multimedia, multipoint and multisession calls), which may require complex signaling functions. This will generate voluminous service-handling data and require efficient transfer of these messages between databases and switches. Consequently, network providers would be able to add new services and capabilities to their networks incrementally, quickly and cost-effectively.
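The HLR/VLR bookkeeping behind location updates and mobile-destination calls can be illustrated with a toy in-memory model (a sketch only; the names and structure are illustrative, not a GSM MAP implementation):

```python
# Home location register: maps each user to the VLR currently serving them
hlr = {}
# Visitor location registers: the set of users present in each switch area
vlrs = {"VLR1": set(), "VLR2": set()}

def location_update(user: str, new_vlr: str) -> None:
    # Mobile moved into a new switch area: deregister at the old VLR,
    # register at the new one, and point the HLR at the new VLR
    old = hlr.get(user)
    if old is not None:
        vlrs[old].discard(user)
    vlrs[new_vlr].add(user)
    hlr[user] = new_vlr

def route_incoming_call(user: str):
    # Mobile-destination call: the HLR says which VLR/MSC serves the user
    return hlr.get(user)

location_update("alice", "VLR1")
location_update("alice", "VLR2")   # inter-switch handover
```

Each handover touches both registers, which is why the abstract stresses efficient database-to-switch transfer: this traffic scales with user movement, not just with call volume.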
Case management redesign in an urban facility.
Almaden, Stefany; Freshman, Brenda; Quaye, Beverly
2011-01-01
To explore strategies for improving patient throughput and to redesign case management processes to facilitate level of care transitions and safe discharges. Large Urban Medical Center in South Los Angeles County, with 384 licensed beds that services poor, underserved communities. Both qualitative and quantitative methods were applied. Combined theoretical frameworks were used for needs assessment, intervention strategies, and change management. Observations, interviews, surveys, and database extraction methods were used. The sample consisted of case management staff members and several other staff from nursing, social work, and emergency department staff. Postintervention measures indicated improvement in reimbursements for services, reduction in length of stay, increased productivity, improved patients' access to care, and avoiding unnecessary readmission or emergency department visits. Effective change management strategies must consider multiple factors that influence daily operations and service delivery. Creating accountability by using performance measures associated with patient transitions is highlighted by the case study results. The authors developed a process model to assist in identifying and tracking outcome measures related to patient throughput, front-end assessments, and effective patient care transitions. This model can be used in future research to further investigate best case management practices.
NASA Astrophysics Data System (ADS)
Halbe, Johannes; Knüppe, Kathrin; Knieper, Christian; Pahl-Wostl, Claudia
2018-04-01
The utilization of ecosystem services in flood management is challenged by the complexity of human-nature interactions and by practical barriers to more ecosystem-based solutions, such as riverine urban areas or technical infrastructure. This paper analyses how flood management has dealt with trade-offs between ecosystem services and practical constraints on more ecosystem-based solutions. To this end, we study the evolution of flood management in four case studies in the Dutch and German Rhine, the Hungarian Tisza, and the Chinese Yangtze basins during the last decades, focusing on the development and implementation of institutions and their link to ecosystem services. The complexity of human-nature interactions is addressed by exploring the impacts on ecosystem services through the lens of three management paradigms: (1) the control paradigm, (2) the ecosystem-based paradigm, and (3) the stakeholder involvement paradigm. Case study data from expert interviews and a literature search were structured using a database approach prior to qualitative interpretation. Results show the growing importance of the ecosystem-based and stakeholder involvement paradigms, which has led to the consideration of a range of regulating and cultural ecosystem services that had previously been neglected. We detected a trend in flood management practice towards the combination of the different paradigms under the umbrella of integrated flood management, which aims at finding the most suitable solution depending on the respective regional conditions.
A knowledge management platform for infrastructure performance modeling
DOT National Transportation Integrated Search
2011-05-10
The ITS/JPO Evaluation Program is requesting ITS costs information in order to update the ITS Costs database with current data and account for new/emerging services and technologies. If you have ITS Costs on recent ITS projects, or if you have ITS co...
System Thinking Tutorial and Reef Link Database Fact Sheets
The sustainable well-being of communities is inextricably linked to both the health of the earth’s ecosystems and the health of humans living in the community. Currently, many policy and management decisions are made without considering the goods and services humans derive from ...
34 CFR 361.23 - Requirements related to the statewide workforce investment system.
Code of Federal Regulations, 2010 CFR
2010-07-01
... technology for individuals with disabilities; (ii) The use of information and financial management systems... statistics, job vacancies, career planning, and workforce investment activities; (iii) The use of customer service features such as common intake and referral procedures, customer databases, resource information...
ERIC Educational Resources Information Center
Feinberg, Rosa Castro
1995-01-01
A school board member guides a tour of what is available for organizing school executives' schedules. Describes time-planner notebooks, computer software, and various academic and commercial online databases. Sidebars list the programs and products mentioned, discuss online services, and describe using a time-management planner in campaigning for…
Integrated Geo Hazard Management System in Cloud Computing Technology
NASA Astrophysics Data System (ADS)
Hanifah, M. I. M.; Omar, R. C.; Khalid, N. H. N.; Ismail, A.; Mustapha, I. S.; Baharuddin, I. N. Z.; Roslan, R.; Zalam, W. M. Z.
2016-11-01
Geo hazards can result in reduced environmental health and huge economic losses, especially in mountainous areas. In order to mitigate geo hazards effectively, cloud computing technology is introduced for managing the geo hazard database. Cloud computing technology and its services are capable of providing stakeholders with geo hazard information in near real time for effective environmental management and decision-making. The UNITEN Integrated Geo Hazard Management System consists of the network management and operation to monitor geo hazard disasters, especially landslides, in our study area at the Kelantan River Basin and the boundary between Hulu Kelantan and Hulu Terengganu. The system will provide an easily managed, flexible measuring system whose data management operates autonomously and which can be controlled remotely by commands, collecting data through the "cloud" computing system. This paper aims to document the above relationship by identifying the special features and needs associated with effective geo hazard database management using a "cloud" system. This system will later be used as part of the development activities, helping to minimize the frequency of geo hazards and the risk in the research area.
Comparison of Online Agricultural Information Services.
ERIC Educational Resources Information Center
Reneau, Fred; Patterson, Richard
1984-01-01
Outlines major online agricultural information services--agricultural databases, databases with agricultural services, educational databases in agriculture--noting services provided, access to the database, and costs. Benefits of online agricultural database sources (availability of agricultural marketing, weather, commodity prices, management…
The development of an intelligent user interface for NASA's scientific databases
NASA Technical Reports Server (NTRS)
Campbell, William J.; Roelofs, Larry H.
1986-01-01
The National Space Science Data Center (NSSDC) has initiated an Intelligent Data Management (IDM) research effort which has as one of its components, the development of an Intelligent User Interface (IUI). The intent of the IUI effort is to develop a friendly and intelligent user interface service that is based on expert systems and natural language processing technologies. This paper presents the design concepts, development approach and evaluation of performance of a prototype Intelligent User Interface Subsystem (IUIS) supporting an operational database.
Supporting reputation based trust management enhancing security layer for cloud service models
NASA Astrophysics Data System (ADS)
Karthiga, R.; Vanitha, M.; Sumaiya Thaseen, I.; Mangaiyarkarasi, R.
2017-11-01
In the existing system, trust between cloud providers and consumers is inadequate to establish the service level agreement, even though consumer feedback is a good basis for assessing the overall reliability of cloud services. Investigators have recognized that trust can be managed and security provided based on feedback collected from participants. In this work, a face recognition system helps to identify the user effectively. We use an image comparison algorithm: the user's face is captured at registration time and stored in the database, and that original image is compared with the sample image already stored in the database. If the two images match, the user is identified effectively. When confidential data are subcontracted to the cloud, data holders become worried about the confidentiality of their data in the cloud. Encrypting the data before subcontracting has been regarded as an important means of keeping user data private against the cloud server, so in order to keep the data secure we use the AES algorithm. Symmetric-key algorithms use a shared key, so keeping data secret requires keeping this key secret; only a user holding the shared secret key can decrypt the data.
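The AES step described above requires a real cryptographic library; purely to illustrate the symmetric shared-key principle it relies on, here is a stdlib-only keystream-XOR sketch. This is NOT AES and NOT secure — it is a stand-in showing that whoever holds the shared key can decrypt, and nobody else can:

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudo-random keystream from the shared key (illustrative only;
    # a real system would use AES from a vetted cryptographic library)
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    # XOR the plaintext with the keystream; the same operation decrypts
    return bytes(a ^ b
                 for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))

decrypt = encrypt  # XOR with the same keystream is its own inverse

key = secrets.token_bytes(32)    # the shared secret both parties must hold
nonce = secrets.token_bytes(16)  # per-message value; may be public
ct = encrypt(key, nonce, b"confidential patient record")
pt = decrypt(key, nonce, ct)
```

The key management burden the abstract alludes to falls entirely on `key`: the ciphertext and nonce can sit on the cloud server, but the key must never reach it.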
Save medical personnel's time by improved user interfaces.
Kindler, H
1997-01-01
Common objectives in the industrial countries are the improvement of quality of care, clinical effectiveness, and cost control. Cost control, in particular, has been addressed through the introduction of case mix systems for reimbursement by social-security institutions. More data are required to enable quality improvement and increases in clinical effectiveness, and for juridical reasons. At first glance, this documentation effort contradicts cost reduction. However, integrated services for resource management based on better documentation should help to reduce costs. The clerical effort for documentation should be decreased by providing a co-operative working environment for healthcare professionals applying sophisticated human-computer interface technology. Additional services, e.g., automatic report generation, increase the efficiency of healthcare personnel. Modelling the medical work flow forms an essential prerequisite for integrated resource management services and for co-operative user interfaces. A user interface aware of the work flow provides intelligent assistance by offering the appropriate tools at the right moment. Nowadays there is a trend towards client/server systems with relational databases or object-oriented databases as the repository. The work flows used for controlling purposes and to steer the user interfaces must be represented in the repository.
Ko, Seung Hyun; Han, Kyungdo; Lee, Yong Ho; Noh, Junghyun; Park, Cheol Young; Kim, Dae Jung; Jung, Chang Hee; Lee, Ki Up; Ko, Kyung Soo
2018-04-01
Korea's National Healthcare Program, the National Health Insurance Service (NHIS), a government-affiliated agency under the Korean Ministry of Health and Welfare, covers the entire Korean population. The NHIS supervises all medical services in Korea and establishes a systematic National Health Information database (DB). A health information DB system including all of the claims, medications, death information, and health check-ups, both in the general population and in patients with various diseases, is not common worldwide. On June 9, 2014, the NHIS signed a memorandum of understanding with the Korean Diabetes Association (KDA) to provide limited open access to its DB. By October 31, 2017, seven papers had been published through this collaborative research project. These studies were conducted to investigate the past and current status of type 2 diabetes mellitus and its complications and management in Korea. This review is a brief summary of the collaborative projects between the KDA and the NHIS over the last 3 years. According to the analysis, the national health check-up DB or claim DB were used, and the age category or study period were differentially applied. Copyright © 2018 Korean Diabetes Association.
Morris, D G; Hayward, T
2000-01-01
The early diagnosis of fetal abnormalities and their consequent management comprise a relatively new specialty, which has developed to a large extent from the advances in and availability of ultrasound imaging techniques. The Women's and Children's antenatal diagnosis and counselling service (ADACS) is composed of a multidisciplinary group of health professionals and deals with issues relating to pre-pregnancy counselling, fetal diagnosis of abnormalities and the management of these conditions, including all aspects of pregnancy loss. Detailed minutes are recorded for each case and key information is stored in a relational database. Weekly meetings are held to discuss selected cases and, where possible, are presented to the group by the referring practitioner, either in person or using the telemedicine facilities. Telemedicine has provided a significant enhancement to the ADACS service.
Mobile Location-Based Services for Trusted Information in Disaster Management
NASA Astrophysics Data System (ADS)
Ragia, Lemonia; Deriaz, Michel; Seigneur, Jean-Marc
The goal of the present chapter is to provide location-based services for disaster management. The application involves services related to the safety of people during an unexpected event. The current prototype is implemented for a specific issue of disaster management, namely road traffic control. Users can submit requests from cell phones or via the Internet and get an answer on a display or in textual form. The data are kept in a central database, and every user can input data via virtual tags. The system is based on spatial messages, which can be sent from any user to any other within a certain distance. In this way all the users, and not a separate source, provide the necessary information about a dangerous situation. To avoid contamination problems, we use trust security to check input to the system and a trust engine model to provide information with considerable reliability.
Chao, Tian-Jy; Kim, Younghun
2015-02-10
An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.
Second-Tier Database for Ecosystem Focus, 2003-2004 Annual Report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
University of Washington, Columbia Basin Research, DART Project Staff,
2004-12-01
The Second-Tier Database for Ecosystem Focus (Contract 00004124) provides direct and timely public access to Columbia Basin environmental, operational, fishery and riverine data resources for federal, state, public and private entities essential to sound operational and resource management. The database also assists with juvenile and adult mainstem passage modeling supporting federal decisions affecting the operation of the FCRPS. The Second-Tier Database, known as Data Access in Real Time (DART), integrates public data for effective access, consideration and application. DART also provides analysis tools and performance measures for evaluating the condition of Columbia Basin salmonid stocks. These services are critical to BPA's implementation of its fish and wildlife responsibilities under the Endangered Species Act (ESA).
Kobayashi, Norio; Ishii, Manabu; Takahashi, Satoshi; Mochizuki, Yoshiki; Matsushima, Akihiro; Toyoda, Tetsuro
2011-07-01
Global cloud frameworks for bioinformatics research databases become huge and heterogeneous; solutions face various diametric challenges comprising cross-integration, retrieval, security and openness. To address this, as of March 2011 organizations including RIKEN published 192 mammalian, plant and protein life sciences databases having 8.2 million data records, integrated as Linked Open or Private Data (LOD/LPD) using SciNetS.org, the Scientists' Networking System. The huge quantity of linked data this database integration framework covers is based on the Semantic Web, where researchers collaborate by managing metadata across public and private databases in a secured data space. This outstripped the data query capacity of existing interface tools like SPARQL. Actual research also requires specialized tools for data analysis using raw original data. To solve these challenges, in December 2009 we developed the lightweight Semantic-JSON interface to access each fragment of linked and raw life sciences data securely under the control of programming languages popularly used by bioinformaticians such as Perl and Ruby. Researchers successfully used the interface across 28 million semantic relationships for biological applications including genome design, sequence processing, inference over phenotype databases, full-text search indexing and human-readable contents like ontology and LOD tree viewers. Semantic-JSON services of SciNetS.org are provided at http://semanticjson.org.
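A Semantic-JSON-style response can be pictured as entities plus typed links; the payload shape below is a guess for illustration only, not the actual SciNetS.org schema, and the entity names are invented:

```python
import json

# Hypothetical linked-data payload: each item carries an id and typed links
# to other items, the general shape a Semantic-JSON interface might return
payload = json.loads("""
{
  "items": [
    {"id": "gene:Abc1",
     "links": [{"rel": "expressed_in", "target": "tissue:liver"}]},
    {"id": "tissue:liver", "links": []}
  ]
}
""")

def neighbours(data: dict, node_id: str, rel: str) -> list:
    # Follow one semantic relationship outward from a node
    for item in data["items"]:
        if item["id"] == node_id:
            return [link["target"] for link in item["links"]
                    if link["rel"] == rel]
    return []

hits = neighbours(payload, "gene:Abc1", "expressed_in")
```

Because the payload is plain JSON, exactly this kind of traversal is what makes such an interface convenient from Perl, Ruby or Python without a SPARQL engine.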
Ocean data management in OMP Data Service
NASA Astrophysics Data System (ADS)
Fleury, Laurence; André, François; Belmahfoud, Nizar; Boichard, Jean-Luc; Brissebrat, Guillaume; Ferré, Hélène; Mière, Arnaud
2014-05-01
The Observatoire Midi-Pyrénées Data Service (SEDOO) is a development team dedicated to setting up environmental data management and dissemination applications in the framework of intensive field campaigns and long-term observation networks. SEDOO has developed some applications dealing with ocean data only, as well as generic databases for storing and distributing multidisciplinary datasets. SEDOO is in charge of the in situ data management and the data portal for international and multidisciplinary programmes as large as African Monsoon Multidisciplinary Analyses (AMMA) and Mediterranean Integrated STudies at Regional And Local Scales (MISTRALS). The AMMA and MISTRALS databases are distributed, and the data portals give access to datasets managed by other data centres (IPSL, CORIOLIS...) through interoperability protocols (OPeNDAP, XML requests...). AMMA and MISTRALS metadata (data descriptions) are standardized and comply with international standards (ISO 19115-19139; the INSPIRE European Directive; the Global Change Master Directory Thesaurus). Most of the AMMA and MISTRALS in situ ocean data sets are homogenized and inserted in a relational database, in order to enable accurate data selection and download of different data sets in a shared format. Data selection criteria include location, period, physical property name, physical property range... The data extraction procedure includes output format selection among CSV, NetCDF, NASA Ames... The AMMA database - http://database.amma-international.org/ - contains field campaign observations in the Gulf of Guinea (EGEE 2005-2007) and the tropical Atlantic Ocean (AEROSE-II 2006...), as well as long-term monitoring data (PIRATA, ARGO...). Operational analyses (MERCATOR) and satellite products (TMI, SSMI...) are managed by the IPSL data centre and can be accessed too. They have been projected onto regular latitude-longitude grids and converted into the NetCDF format.
The MISTRALS data portal - http://mistrals.sedoo.fr/ - gives access to ocean datasets produced by the contributing programmes: Hydrological cycle in the Mediterranean eXperiment (HyMeX), Chemistry-Aerosol Mediterranean eXperiment (ChArMEx), Marine Mediterranean eXperiment (MERMeX)... The programmes include many field campaigns from 2011 to 2015, collecting general and specific properties. Long-term monitoring networks, like the Mediterranean Ocean Observing System on Environment (MOOSE) or the Mediterranean Eurocentre for Underwater Sciences and Technologies (MEUST-SE), contribute to the MISTRALS data portal as well. Relevant model outputs and satellite products managed by external data centres (IPSL, ENEA...) can be accessed too. SEDOO manages the SSS (Sea Surface Salinity) national observation service data: http://sss.sedoo.fr/. SSS aims at collecting, validating, archiving and distributing in situ SSS measurements derived from Voluntary Observing Ship programs. The SSS data user interface enables users to build multicriteria data requests and download the relevant datasets. SEDOO contributes to the SOLWARA project, which aims at understanding the oceanic circulation in the Coral Sea and the Solomon Sea and its role in both the climate system and oceanic chemistry. The research programme includes in situ measurements, numerical modelling and compiled analyses of past data. The website http://thredds.sedoo.fr/solwara/ enables users to access, visualize and download SOLWARA gridded data and model simulations, using THREDDS associated services (OPeNDAP, NCSS and WMS). In order to improve user-friendliness, the SSS and SOLWARA web interfaces are JEE applications built with the GWT framework, and share many modules.
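The multicriteria selection and shared-format export described for these databases (filter by property name, period, and value range, then download as CSV) can be sketched offline as follows. The record schema, station names, and values are invented for illustration and do not reproduce the actual AMMA or SSS database layout.

```python
import csv
import io

# Illustrative in situ records: station, date, property, value.
RECORDS = [
    {"station": "PIRATA-8N38W", "date": "2006-03-14", "property": "SST", "value": 28.1},
    {"station": "PIRATA-8N38W", "date": "2006-07-02", "property": "SST", "value": 26.4},
    {"station": "EGEE-3",       "date": "2006-06-10", "property": "SSS", "value": 35.2},
]

def select(records, prop, date_from, date_to, vmin, vmax):
    """Multicriteria selection: property name, period, and value range."""
    return [r for r in records
            if r["property"] == prop
            and date_from <= r["date"] <= date_to
            and vmin <= r["value"] <= vmax]

def to_csv(records):
    """Export the selection in a shared CSV format."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["station", "date", "property", "value"])
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

hits = select(RECORDS, "SST", "2006-01-01", "2006-12-31", 27.0, 30.0)
csv_text = to_csv(hits)
```

In the real services the selection runs as SQL against the relational database and the export also offers NetCDF and NASA Ames; the CSV path above shows only the shape of the request/extract cycle.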
NASA Astrophysics Data System (ADS)
Wolfgramm, Bettina; Hurni, Hans; Liniger, Hanspeter; Ruppen, Sebastian; Milne, Eleanor; Bader, Hans-Peter; Scheidegger, Ruth; Amare, Tadele; Yitaferu, Birru; Nazarmavloev, Farrukh; Conder, Malgorzata; Ebneter, Laura; Qadamov, Aslam; Shokirov, Qobiljon; Hergarten, Christian; Schwilch, Gudrun
2013-04-01
There is a fundamental mutual interest between enhancing soil organic carbon (SOC) in the world's soils and the objectives of the major global environmental conventions (UNFCCC, UNCBD, UNCCD). While there is evidence at the case study level that sustainable land management (SLM) technologies increase SOC stocks and SOC-related benefits, no quantitative data are available on the potential for increasing SOC benefits from different SLM technologies, especially from case studies in developing countries, and a clear understanding of the trade-offs related to SLM up-scaling is missing. This study aims at assessing the potential increase of SOC under SLM technologies worldwide, evaluating trade-offs and gains in up-scaling SLM for case studies in Tajikistan, Ethiopia and Switzerland. It makes use of the SLM technologies documented in the online database of the World Overview of Conservation Approaches and Technologies (WOCAT). The study consists of three components: 1) identifying SOC benefits contributing to the major global environmental issues for SLM technologies worldwide, as documented in the WOCAT global database; 2) validating SOC storage potentials and SOC benefit predictions for SLM technologies from the WOCAT database using results from existing comparative case studies at the plot level, soil spectral libraries and standardized documentation of ecosystem services from the WOCAT database; 3) understanding trade-offs and win-win scenarios of up-scaling SLM technologies from the plot to the household and landscape level using material flow analysis. This study builds on the premise that the most promising way to increase benefits from land management is to consider already existing sustainable strategies. Such SLM technologies, documented from all over the world, are accessible in a standardized way in the WOCAT online database.
The study thus evaluates SLM technologies from the WOCAT database by calculating the potential SOC storage increase and related benefits, comparing SOC estimates before and after establishment of the SLM technology. These results are validated using comparative case studies of plots with and without SLM technologies (existing SLM systems versus surrounding, degrading systems). In view of up-scaling SLM technologies, it is crucial to understand the trade-offs and gains supporting or hindering their further spread. Systemic biomass management analysis using material flow analysis allows organic carbon flows and storages to be quantified for different land management options at the household as well as the landscape level. The study shows results relevant for science, policy and practice for accounting, monitoring and evaluating SOC-related ecosystem services: - a comprehensive methodology for SLM impact assessments allowing quantification of SOC storage and SOC-related benefits under different SLM technologies, and - improved understanding of up-scaling options for SLM technologies and of trade-offs as well as win-win opportunities for biomass management, SOC content increase, and ecosystem service improvement at the plot and household level.
Discrepancy Reporting Management System
NASA Technical Reports Server (NTRS)
Cooper, Tonja M.; Lin, James C.; Chatillon, Mark L.
2004-01-01
Discrepancy Reporting Management System (DRMS) is a computer program designed for use in the stations of NASA's Deep Space Network (DSN) to help establish the operational history of equipment items; acquire data on the quality of service provided to DSN customers; enable measurement of service performance; provide early insight into the need to improve processes, procedures, and interfaces; and enable the tracing of a data outage to a change in software or hardware. DRMS is a Web-based software system designed to include a distributed database and replication feature to achieve location-specific autonomy while maintaining a consistent high quality of data. DRMS incorporates commercial Web and database software. DRMS collects, processes, replicates, communicates, and manages information on spacecraft data discrepancies, equipment resets, and physical equipment status, and maintains an internal station log. All discrepancy reports (DRs), Master discrepancy reports (MDRs), and Reset data are replicated to a master server at NASA's Jet Propulsion Laboratory; Master DR data are replicated to all the DSN sites; and Station Logs are internal to each of the DSN sites and are not replicated. Data are validated according to several logical mathematical criteria. Queries can be performed on any combination of data.
The crustal dynamics intelligent user interface anthology
NASA Technical Reports Server (NTRS)
Short, Nicholas M., Jr.; Campbell, William J.; Roelofs, Larry H.; Wattawa, Scott L.
1987-01-01
The National Space Science Data Center (NSSDC) has initiated an Intelligent Data Management (IDM) research effort which has, as one of its components, the development of an Intelligent User Interface (IUI). The intent of the IUI is to develop a friendly and intelligent user interface service based on expert systems and natural language processing technologies. The purpose of such a service is to support the large number of potential scientific and engineering users that have need of space and land-related research and technical data, but have little or no experience in query languages or understanding of the information content or architecture of the databases of interest. This document presents the design concepts, development approach and evaluation of the performance of a prototype IUI system for the Crustal Dynamics Project Database, which was developed using a microcomputer-based expert system tool (M.1), the natural language query processor THEMIS, and the graphics software system GSS. The IUI design is based on a multiple view representation of a database from both the user and database perspective, with intelligent processes to translate between the views.
ERIC Educational Resources Information Center
Fischer, Audrey; Cole, John Y.; Tarr, Susan M.; Carey, Len; Mehnert, Robert; Sherman, Andrew M.; Davis, Linda; Leahy, Debra W.; Chute, Adrienne; Willard, Robert S.; Dunn, Christina
2003-01-01
Includes annual reports from 12 federal agencies and libraries that discuss security, budgets, legislation, digital projects, preservation, government role, information management, personnel changes, collections, databases, financial issues, services, administration, Web sites, access to information, customer service, statistics, international…
CMS users data management service integration and first experiences with its NoSQL data storage
NASA Astrophysics Data System (ADS)
Riahi, H.; Spiga, D.; Boccali, T.; Ciangottini, D.; Cinquilli, M.; Hernàndez, J. M.; Konstantinov, P.; Mascheroni, M.; Santocchia, A.
2014-06-01
The distributed data analysis workflow in CMS assumes that jobs run in a different location from where their results are finally stored. Typically the user outputs must be transferred from one site to another by a dedicated CMS service, AsyncStageOut. This new service was originally developed to address the inefficient use of CMS computing resources caused by transferring analysis job outputs synchronously from the job execution node to the remote site as soon as they are produced. The AsyncStageOut is designed as a thin application relying only on a NoSQL database (CouchDB) for input and data storage. It has progressed from a limited prototype to a highly adaptable service which manages and monitors all the user file steps, namely file transfer and publication. The AsyncStageOut is integrated with the Common CMS/ATLAS Analysis Framework. It foresees the management of nearly 200k user files per day from close to 1000 individual users per month with minimal delays, while providing real-time monitoring and reports to users and service operators and remaining highly available. The associated data volume represents a new set of challenges in the areas of database scalability and service performance and efficiency. In this paper, we present an overview of the AsyncStageOut model and the integration strategy with the Common Analysis Framework. The motivations for using NoSQL technology are also presented, as well as the data design and the techniques used for efficient indexing and monitoring of the data. We describe the deployment model for high availability and scalability of the service. We also discuss the hardware requirements and the results achieved, as determined by testing with actual data and realistic loads during the commissioning and initial production phase with the Common Analysis Framework.
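CouchDB indexes documents through map views, which is the kind of mechanism a service like the one above relies on to find, say, all pending transfers for a user. The following pure-Python stand-in sketches that indexing pattern; the document fields ("user", "state") and the view logic are illustrative, not the service's actual schema.

```python
from collections import defaultdict

# Each user file operation is stored as a document.
docs = [
    {"_id": "f1", "user": "alice", "state": "transferring"},
    {"_id": "f2", "user": "alice", "state": "done"},
    {"_id": "f3", "user": "bob",   "state": "transferring"},
]

def map_by_user_state(doc):
    """Map function: emit one row keyed by (user, state)."""
    yield (doc["user"], doc["state"]), doc["_id"]

def build_view(documents, map_fn):
    """Run the map function over every document, as a CouchDB view index does."""
    view = defaultdict(list)
    for doc in documents:
        for key, value in map_fn(doc):
            view[key].append(value)
    return view

view = build_view(docs, map_by_user_state)
pending = view[("alice", "transferring")]
```

In CouchDB itself the map function is JavaScript stored in a design document and the index is persisted and updated incrementally, which is what makes key-range queries over millions of file documents cheap.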
NASA Technical Reports Server (NTRS)
Murray, ShaTerea R.
2004-01-01
This summer I had the opportunity to work in the Environmental Management Office (EMO) under the Chemical Sampling and Analysis Team, or CS&AT. This team's mission is to support Glenn Research Center (GRC) and EMO by providing chemical sampling and analysis services and expert consulting. Services include sampling and chemical analysis of water, soil, fuels, oils, paint, insulation materials, etc. One of this team's major projects is the Drinking Water Project. This is a project that is done on Glenn's water coolers and ten percent of its sinks every two years. For the past two summers an intern had been putting together a database for this team to record the tests they had performed. She had successfully created a database but hadn't worked out all the quirks. So this summer William Wilder (an intern from Cleveland State University) and I worked together to perfect her database. We began by finding out exactly what every member of the team thought about the database and what, if anything, they would change. After collecting this data we both had to take some courses in Microsoft Access in order to fix the problems. Next we began looking at exactly how the database worked from the outside inward. Then we began trying to change the database, but we quickly found out that this would be virtually impossible.
DataHub: Knowledge-based data management for data discovery
NASA Astrophysics Data System (ADS)
Handley, Thomas H.; Li, Y. Philip
1993-08-01
Currently available database technology is largely designed for business data-processing applications, and seems inadequate for scientific applications. The research described in this paper, the DataHub, will address the issues associated with this shortfall in technology utilization and development. The DataHub development is addressing the key issues in scientific data management of scientific database models and resource sharing in a geographically distributed, multi-disciplinary, science research environment. Thus, the DataHub will be a server between the data suppliers and data consumers to facilitate data exchanges, to assist science data analysis, and to provide a systematic approach to science data management. More specifically, the DataHub's objectives are to provide support for (1) exploratory data analysis (i.e., data-driven analysis); (2) data transformations; (3) data semantics capture and usage; (4) analysis-related knowledge capture and usage; and (5) data discovery, ingestion, and extraction. Applying technologies that range from deductive databases, semantic data models, data discovery, knowledge representation and inferencing, and exploratory data analysis techniques to modern man-machine interfaces, DataHub will provide a prototype, integrated environment to support research scientists' needs in multiple disciplines (i.e., oceanography, geology, and atmospheric science) while addressing the more general science data management issues. Additionally, the DataHub will provide data management services to exploratory data analysis applications such as LinkWinds and NCSA's XIMAGE.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buche, D. L.
This report describes Northern Indiana Public Service Co. project efforts to develop an automated energy distribution and reliability system. The purpose of this project was to implement a database-driven GIS solution that would manage all of the company's gas, electric, and landbase objects. This report is the second in a series of reports detailing this effort.
A Virtual "Hello": A Web-Based Orientation to the Library.
ERIC Educational Resources Information Center
Borah, Eloisa Gomez
1997-01-01
Describes the development of Web-based library services and resources available at the Rosenfeld Library of the Anderson Graduate School of Management at University of California at Los Angeles. Highlights include library orientation sessions; virtual tours of the library; a database of basic business sources; and research strategies, including…
Key questions dominating contemporary ecological research and management concern interactions between biodiversity, ecosystem processes, and ecosystem services provision in the face of global change. This is particularly salient for freshwater biodiversity and in the context of r...
Small Business and Defense Acquisitions: A Review of Policies and Current Practices
2011-01-01
Office of Management and Budget xviii Small Business and Defense Acquisitions: A Review of Policies and Current Practices PSC Product and Service Code...themselves as minority-owned, women-owned, veteran-owned, or small disadvantaged businesses. The resulting database gives sourcing managers a tool for...
Suicide prevention in primary care: General practitioners' views on service availability
2010-01-01
Background Primary care may be a key setting for suicide prevention. However, comparatively little is known about the services available in primary care for suicide prevention. The aims of the current study were to describe the services available in general practices for the management of suicidal patients and to examine GPs' views on these services. We carried out a questionnaire and interview study in the North West of England. We collected data on GPs' views of suicide prevention generally as well as local mental health service provision. Findings During the study period (2003-2005) we used the National Confidential Inquiry Suicide database to identify 286 general practitioners (GPs) who had registered patients who had died by suicide. Data were collected from GPs and practice managers in 167 practices. Responses suggested that there was greater availability of services and training for general mental health issues than for suicide prevention specifically. The three key themes which emerged from GP interviews were: barriers to accessing primary or secondary mental health services; obstacles faced when referring a patient to mental health services; and managing change within mental health care services. Conclusions Health professionals have an important role to play in preventing suicide. However, GPs expressed concerns about the quality of primary care mental health service provision and difficulties with access to secondary mental health services. Addressing these issues could facilitate future suicide prevention in primary care. PMID:20920302
Compound Passport Service: supporting corporate collection owners in open innovation.
Andrews, David M; Degorce, Sébastien L; Drake, David J; Gustafsson, Magnus; Higgins, Kevin M; Winter, Jon J
2015-10-01
A growing number of early discovery collaborative agreements are being put in place between large pharma companies and partners in which the rights for assets can reside with a partner, exclusively or jointly. Our corporate screening collection, like many others, was built on the premise that compounds generated in-house and not the subject of paper or patent disclosure were proprietary to the company. Collaborative screening arrangements and medicinal chemistry now make the origin, ownership rights and usage of compounds difficult to determine and manage. The Compound Passport Service is a dynamic database, managed and accessed through a set of reusable services that borrows from social media concepts to allow sample owners to take control of their samples in a much more active way. Copyright © 2015 Elsevier Ltd. All rights reserved.
Carlson, Mary H.; Zientek, Michael L.; Causey, J. Douglas; Kayser, Helen Z.; Spanski, Gregory T.; Wilson, Anna B.; Van Gosen, Bradley S.; Trautwein, Charles M.
2007-01-01
This report compiles selected results from 13 U.S. Geological Survey (USGS) mineral resource assessment studies conducted in Idaho and Montana into consistent spatial databases that can be used in a geographic information system. The 183 spatial databases represent areas of mineral potential delineated in these studies and include attributes on mineral deposit type, level of mineral potential, certainty, and a reference. The assessments were conducted for five 1° x 2° quadrangles (Butte, Challis, Choteau, Dillon, and Wallace), several U.S. Forest Service (USFS) National Forests (including Challis, Custer, Gallatin, Helena, and Payette), and one Bureau of Land Management (BLM) Resource Area (Dillon). The data contained in the spatial databases are based on published information: no new interpretations are made. This digital compilation is part of an ongoing effort to provide mineral resource information formatted for use in spatial analysis. In particular, this is one of several reports prepared to address USFS needs for science information as forest management plans are revised in the Northern Rocky Mountains.
CampusGIS of the University of Cologne: a tool for orientation, navigation, and management
NASA Astrophysics Data System (ADS)
Baaser, U.; Gnyp, M. L.; Hennig, S.; Hoffmeister, D.; Köhn, N.; Laudien, R.; Bareth, G.
2006-10-01
The working group for GIS and Remote Sensing at the Department of Geography of the University of Cologne has established a WebGIS called the CampusGIS of the University of Cologne. The overall task of the CampusGIS is to connect several existing databases at the University of Cologne with spatial data. These existing databases comprise data about staff, buildings, rooms, lectures, and general infrastructure such as bus stops. This information was not yet linked to its spatial location. Therefore, a GIS-based method was developed to link all the different databases to spatial entities. In line with the philosophy of the CampusGIS, an online GUI was programmed which enables users to search for staff, buildings, or institutions. The query results are linked to the GIS database, which allows the visualization of the spatial location of the searched entity. This system was established in 2005 and has been operational since early 2006. In this contribution, the focus is on further developments. First results are presented for (i) including routing services, (ii) programming GUIs for mobile devices, and (iii) including infrastructure management tools in the CampusGIS. Consequently, the CampusGIS is not only available for spatial information retrieval and orientation; it also serves for on-campus navigation and administrative management.
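The core linkage described above — joining a non-spatial record (a staff member) to a spatial entity (a building with coordinates) through a shared key — can be sketched as follows. Both tables, the field names, and the coordinates are invented for illustration; they are not the actual CampusGIS schema.

```python
# Non-spatial table: staff records carry only a building id.
STAFF = [
    {"name": "Meyer",  "building_id": "B12"},
    {"name": "Schulz", "building_id": "B03"},
]

# Spatial table: building id -> name and WGS84 coordinates.
BUILDINGS = {
    "B12": {"name": "Geography", "lat": 50.9281, "lon": 6.9269},
    "B03": {"name": "Main Hall", "lat": 50.9260, "lon": 6.9290},
}

def locate(staff_name):
    """Resolve a staff search to the coordinates of their building."""
    for person in STAFF:
        if person["name"] == staff_name:
            b = BUILDINGS[person["building_id"]]
            return (b["lat"], b["lon"])
    return None

position = locate("Meyer")
```

In the real system this join runs between several administrative databases and the GIS database, and the resulting coordinates drive map visualization and routing rather than being returned as a bare tuple.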
Moving BASISplus and TECHLIBplus from VAX/VMS to UNIX
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dominiak, R.
1993-12-31
BASISplus is used at the Laboratory by the Technical Information Services (TIS) Department, which is part of the Information and Publishing Division at Argonne. TIS operates the Argonne Information Management System (AIM). The AIM System consists of the ANL Libraries On-Line Database (a TECHLIBplus database), the Current Journals Database (IDI's current contents search), the ANL Publications Tracking Database (a TECHLIBplus database), the Powder Diffraction File Database, and several CD-ROM databases available through a Novell network. The AIM System is available from the desktop of ANL staff through modem and network connections, as well as from the 10 science libraries at Argonne. TIS has been a BASISplus and TECHLIBplus site from the start, and never migrated from BASIS K. The decision was made to migrate from the VAX/VMS platform to a UNIX platform. Migrating a product from one platform to another involves many decisions and considerations. These justifications, decisions, and considerations are explored in this report.
Mandellos, George J; Koutelakis, George V; Panagiotakopoulos, Theodor C; Koukias, Andreas M; Koukias, Mixalis N; Lymberopoulos, Dimitrios K
2008-01-01
Early and specialized pre-hospital patient treatment improves outcomes in terms of mortality and morbidity in emergency cases. This paper focuses on the design and implementation of a telemedicine system that supports diverse types of endpoints, including moving transports (MT) (ambulances, ships, planes, etc.), handheld devices and fixed units, over diverse communication networks. The target of this telemedicine system is pre-hospital patient treatment. While vital sign transmission takes priority over the other services provided by the telemedicine system (videoconference, remote management, voice calls, etc.), a predefined algorithm controls the provision and quality of those other services. A distributed database system controlled by a central server manages patient attributes, exams and incidents handled by the different Telemedicine Coordination Centers (TCC).
The Houston Academy of Medicine--Texas Medical Center Library management information system.
Camille, D; Chadha, S; Lyders, R A
1993-01-01
A management information system (MIS) provides a means for collecting, reporting, and analyzing data from all segments of an organization. Such systems are common in business but rare in libraries. The Houston Academy of Medicine-Texas Medical Center Library developed an MIS that operates on a system of networked IBM PCs and Paradox, a commercial database software package. The data collected in the system include monthly reports, client profile information, and data collected at the time of service requests. The MIS assists with enforcement of library policies, ensures that correct information is recorded, and provides reports for library managers. It also can be used to help answer a variety of ad hoc questions. Future plans call for the development of an MIS that could be adapted to other libraries' needs, and a decision-support interface that would facilitate access to the data contained in the MIS databases. PMID:8251972
Beyond the online catalog: developing an academic information system in the sciences.
Crawford, S; Halbrook, B; Kelly, E; Stucki, L
1987-01-01
The online public access catalog consists essentially of a machine-readable database with network capabilities. Like other computer-based information systems, it may be continuously enhanced by the addition of new capabilities and databases. It may also become a gateway to other information networks. This paper reports the evolution of the Bibliographic Access and Control System (BACS) of Washington University in end-user searching, current awareness services, information management, and administrative functions. Ongoing research and development and the future of the online catalog are also discussed. PMID:3315052
Work injury management model and implication in Hong Kong: a literature review.
Chong, Cecilia Suk-Mei; Cheng, Andy Shu-Kei
2010-01-01
The objective of this review is to explore the work injury management models in the literature and the essential components of the different models. The resulting information could be used to develop an integrated holistic model that could be applied to the work injury management system in Hong Kong. A keyword search of the MEDLINE and CINAHL databases was conducted. A total of 68 studies related to the management of an injury were found within the above-mentioned electronic databases. Together with citation tracking, 13 studies remained after exclusion screening. Only 7 of those 13 studies met the inclusion criteria for review. It is notable that the most important component of the injury management models in the reviewed literature is early intervention. Because of limitations in the Employees' Compensation Ordinance in Hong Kong, there is an impetus to have a model and practice guideline for work injury management in Hong Kong to ensure the quality of injury management services. At the end of this paper, the authors propose a work injury management model based on the employees' compensation system in Hong Kong. This model can be used as a reference for countries adopting legislation similar to Hong Kong's.
NASA Astrophysics Data System (ADS)
Palumbo, Gaetano; Powlesland, Dominic
1996-12-01
The Getty Conservation Institute is exploring the feasibility of using remote sensing associated with a geographic database management system (GDBMS) in order to provide archaeological and historic site managers with sound evaluations of the tools available for site and information management. The World Heritage Site of Chaco Canyon, New Mexico, a complex of archaeological sites dating to the 10th to 13th centuries AD, was selected as a test site. Information from excavations conducted there since the 1930s, and a range of documentation generated by the National Park Service, was gathered. NASA's John C. Stennis Space Center contributed multispectral data of the area, and the Jet Propulsion Laboratory contributed data from ATLAS (airborne terrestrial applications sensor) and CAMS (calibrated airborne multispectral scanner) scanners. Initial findings show that while 'automatic monitoring systems' will probably never be a reality, excellent results are possible through careful comparison of historic and modern photographs and digital analysis of remotely sensed data.
A Data Management System for International Space Station Simulation Tools
NASA Technical Reports Server (NTRS)
Betts, Bradley J.; DelMundo, Rommel; Elcott, Sharif; McIntosh, Dawn; Niehaus, Brian; Papasin, Richard; Mah, Robert W.; Clancy, Daniel (Technical Monitor)
2002-01-01
Groups associated with the design, operational, and training aspects of the International Space Station make extensive use of modeling and simulation tools. Users of these tools often need to access and manipulate large quantities of data associated with the station, ranging from design documents to wiring diagrams. Retrieving and manipulating this data directly within the simulation and modeling environment can provide substantial benefit to users. An approach for providing these kinds of data management services, including a database schema and class structure, is presented. Implementation details are also provided as a data management system is integrated into the Intelligent Virtual Station, a modeling and simulation tool developed by the NASA Ames Smart Systems Research Laboratory. One use of the Intelligent Virtual Station is generating station-related training procedures in a virtual environment. The data management component allows users to quickly and easily retrieve information related to objects on the station, enhancing their ability to generate accurate procedures. Users can associate new information with objects and have that information stored in a database.
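The class structure the abstract outlines — station objects that users can look up and attach new information to, backed by a database — might be sketched like this. The class names, fields, and the example object are hypothetical illustrations, not the Intelligent Virtual Station's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class StationObject:
    """A station object with user-attached annotations."""
    object_id: str
    name: str
    annotations: list = field(default_factory=list)

class ObjectStore:
    """Minimal in-memory stand-in for the database backing the system."""
    def __init__(self):
        self._objects = {}

    def add(self, obj):
        self._objects[obj.object_id] = obj

    def annotate(self, object_id, note):
        """Associate new user information with an existing object."""
        self._objects[object_id].annotations.append(note)

    def get(self, object_id):
        return self._objects[object_id]

store = ObjectStore()
store.add(StationObject("PMA-2", "Pressurized Mating Adapter 2"))
store.annotate("PMA-2", "Check seal before procedure step 4")
notes = store.get("PMA-2").annotations
```

A production version would persist the store in a relational database and key objects to the geometry in the virtual environment, so that an annotation retrieved during procedure generation points at the exact 3D object the trainee sees.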
Ajami, Sima; Rajabzadeh, Ahmad; Ketabi, Saeedeh
2014-01-01
Organizations increasingly outsource their Information Technology (IT) activities to avoid problems and make better use of organizational capabilities. The purpose of this paper was, first, to identify the criteria that matter when selecting suppliers of IT services and, second, to explain the advantages and disadvantages of outsourcing IT in hospitals. This study was a narrative review; the search was conducted in July 2013 with the help of libraries, books, conference proceedings, and the Science Direct, PubMed, Proquest, Springer, and SID (Scientific Information Database) databases. In our searches, we employed the following keywords and their combinations: outsourcing, information technology, hospital, decision making, and criteria. The preliminary search yielded 120 articles published between 2000 and 2013. After a careful analysis of the content of each paper, a total of 46 papers were selected based on their relevancy. The criteria and sub-criteria influencing outsourcing decisions in Iranian hospitals fell into six major categories, including administrative issues, issues related to the service/product, technology factors, environmental factors, risks, and economic factors, with 15 sub-criteria: business integration, dependence on suppliers, human resources, focus on core competencies, facilities and physical capital, innovation, quality, speed of service delivery, flexibility, market capabilities, geographical location, security, management control, cost, and financial capability. Identifying the advantages and disadvantages of outsourcing and the criteria that drive supplier selection enables managers to make the most appropriate decision when choosing a supplier of IT services. This is a general review of the criteria that influence the selection of information technology service suppliers in hospitals. PMID:25540781
NASA Astrophysics Data System (ADS)
Graham, Jim; Jarnevich, Catherine S.; Simpson, Annie; Newman, Gregory J.; Stohlgren, Thomas J.
2011-06-01
Invasive species are a universal global problem, but the information to identify them, manage them, and prevent invasions is stored around the globe in a variety of formats. The Global Invasive Species Information Network is a consortium of organizations working toward providing seamless access to these disparate databases via the Internet. A distributed network of databases can be created using the Internet and a standard web service protocol. There are two options to provide this integration. First, federated searches have been proposed to allow users to search "deep" web documents, such as databases, for invasive species. A second method is to create a cache of data from the databases for searching. We compare these two methods and show that federated searches will not provide the performance and flexibility users require, and that a central cache of the data is needed to improve performance.
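The two integration options compared above can be sketched in miniature: a federated search fans out to every source database at query time (one round trip per database, with each source's own schema), while a central cache harvests and normalizes the records once and then serves fast local lookups. All database contents below are invented for illustration:

```python
# Two hypothetical regional databases with different native schemas.
DB_A = {"kudzu": {"status": "invasive", "region": "US-Southeast"}}
DB_B = {"cane toad": {"invasive": True, "where": "Australia"}}

def federated_search(species):
    """Query every source at request time: latency grows with the
    number (and slowness) of the member databases."""
    hits = []
    for db in (DB_A, DB_B):
        if species in db:
            hits.append(db[species])
    return hits

# Central cache: records harvested and normalized ahead of time,
# so a query is a single local lookup against one schema.
CACHE = {
    "kudzu": {"species": "kudzu", "region": "US-Southeast"},
    "cane toad": {"species": "cane toad", "region": "Australia"},
}

def cached_search(species):
    return CACHE.get(species)

print(cached_search("kudzu")["region"])  # → US-Southeast
```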
Example of monitoring measurements in a virtual eye clinic using 'big data'.
Jones, Lee; Bryan, Susan R; Miranda, Marco A; Crabb, David P; Kotecha, Aachal
2017-10-26
To assess the equivalence of measurement outcomes between patients attending a standard glaucoma care service, where patients see an ophthalmologist in a face-to-face setting, and a glaucoma monitoring service (GMS). The average mean deviation (MD) measurement on the visual field (VF) test for 250 patients attending a GMS was compared with a 'big data' repository of patients attending a standard glaucoma care service (reference database). In addition, the speed of VF progression between GMS patients and reference database patients was compared. Reference database patients were used to create expected outcomes against which GMS patients could be compared. For GMS patients falling outside of the expected limits, further analysis was carried out on the clinical management decisions for these patients. The average MD of patients in the GMS ranged from +1.6 dB to -18.9 dB between two consecutive appointments at the clinic. In the first analysis, 12 (4.8%; 95% CI 2.5% to 8.2%) GMS patients scored outside the 90% expected values based on the reference database. In the second analysis, 1.9% (95% CI 0.4% to 5.4%) of GMS patients had VF changes outside of the expected 90% limits. Using 'big data' collected in the standard glaucoma care service, we found that patients attending a GMS have equivalent outcomes on the VF test. Our findings provide support for the implementation of virtual healthcare delivery in the hospital eye service.
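The comparison method described above, checking whether a GMS patient's visual field change falls outside 90% limits derived from the reference database, can be sketched as follows. The reference values below are invented; in the study the limits came from the 'big data' repository:

```python
# Hypothetical reference MD changes (dB) between consecutive visits,
# standing in for the 'big data' repository of standard-care patients.
reference_changes = [-0.4, -0.1, 0.0, 0.2, -0.6, 0.1, -0.2, 0.3, -0.8, 0.0]

def limits_90(data):
    """5th and 95th percentiles of the reference data define the 90% band."""
    s = sorted(data)
    def pct(p):
        k = (len(s) - 1) * p
        lo, hi = int(k), min(int(k) + 1, len(s) - 1)
        return s[lo] + (s[hi] - s[lo]) * (k - lo)  # linear interpolation
    return pct(0.05), pct(0.95)

lo, hi = limits_90(reference_changes)

def outside_expected(md_change):
    """Flag a GMS patient whose MD change falls outside the 90% limits."""
    return md_change < lo or md_change > hi

print(outside_expected(-3.0))  # → True
```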
Hotspot Patterns: The Formal Definition and Automatic Detection of Architecture Smells
2015-01-15
serious question for a project manager or architect: how to determine which parts of the code base should be given higher priority for maintenance and...services framework; Hadoop is a tool for distributed processing of large data sets; HBase is the Hadoop database; Ivy is a dependency management tool...answer this question more rigorously, we conducted Pearson Correlation Analysis to test the dependency between the number of issues a file involves
Oil Spills in U.S. Coastal Waters: Background Governance, and Issues for Congress
2010-04-30
[Figure: spill volumes in gallons through 2009 for OCS Pipelines and OCS Facilities. Source: Prepared by CRS with data from the Minerals Management Service (MMS) spill database, at http...] Fund was particularly vulnerable to a large and costly spill: Fund managers had projected the fund would be completely depleted by FY2009. Recent...which released approximately 11 million gallons of crude oil into Prince William Sound, Alaska. The Exxon Valdez spill—the largest and most expensive
NASA Astrophysics Data System (ADS)
Meyer, Hanna; Authmann, Christian; Dreber, Niels; Hess, Bastian; Kellner, Klaus; Morgenthal, Theunis; Nauss, Thomas; Seeger, Bernhard; Tsvuura, Zivanai; Wiegand, Kerstin
2017-04-01
Bush encroachment is a syndrome of land degradation that occurs in many savannas, including those of southern Africa. The increase in density, cover, or biomass of woody vegetation often has negative effects on a range of ecosystem functions and services, and these effects are hardly reversible. However, despite its importance, neither the causes of bush encroachment nor the consequences of different resource management strategies to combat or mitigate related shifts in savanna states are fully understood. The project "IDESSA" (An Integrative Decision Support System for Sustainable Rangeland Management in Southern African Savannas) aims to improve the understanding of the complex interplay between land use, climate patterns, and vegetation dynamics, and to implement an integrative monitoring and decision-support system for the sustainable management of different savanna types. For this purpose, IDESSA follows an innovative approach that integrates local knowledge, botanical surveys, remote-sensing and machine-learning based time series of atmospheric and land-cover dynamics, spatially explicit simulation modeling, and analytical database management. The integration of the heterogeneous data will be implemented in a user-oriented database infrastructure and scientific workflow system. Accessible via web-based interfaces, this database and analysis system will allow scientists to manage and analyze monitoring data and scenario computations, and will allow stakeholders (e.g., land users, policy makers) to retrieve current ecosystem information and seasonal outlooks. We present the concept of the project and show preliminary results of the realization steps towards the integrative savanna management and decision-support system.
Spaceflight Operations Services Grid Prototype
NASA Technical Reports Server (NTRS)
Bradford, Robert N.; Mehrotra, Piyush; Lisotta, Anthony
2004-01-01
NASA over the years has developed many types of technologies and conducted various types of science, resulting in numerous variations of operations, data, and applications. For example, operations range from deep space projects managed by JPL, Saturn and Shuttle operations managed from JSC and KSC, and ISS science operations managed from MSFC, to numerous low Earth orbit satellites managed from GSFC; these are varied and intrinsically different but require many of the same types of services to fulfill their missions. Also, large data sets (databases) of Shuttle flight data, solar system projects, and Earth observing data exist which, because of their varied and sometimes outdated technologies, have not been fully examined for additional information and knowledge. Many of the applications/systems supporting operational services, e.g., voice, video, telemetry, and commanding, are outdated and obsolete. The vast amounts of data are located in various formats, at various locations, and range over many years. The ability to conduct unified space operations, access disparate data sets, and develop systems and services that can provide operational services does not currently exist in any useful form. In addition, adding new services to existing operations is generally expensive and, with the current budget constraints, not feasible on any broad level of implementation. To understand these services, a discussion of each one follows. The Spaceflight User-based Services are those services required to conduct space flight operations. Grid Services are those services that will be used to overcome, through middleware software, some or all of the problems that currently exist. In addition, Network Services will be discussed briefly. Network Services are crucial to any type of remedy and are evolving adequately to support any technology currently in development.
Kobayashi, Norio; Ishii, Manabu; Takahashi, Satoshi; Mochizuki, Yoshiki; Matsushima, Akihiro; Toyoda, Tetsuro
2011-01-01
As global cloud frameworks for bioinformatics research databases become huge and heterogeneous, solutions face diametric challenges comprising cross-integration, retrieval, security, and openness. To address this, as of March 2011 organizations including RIKEN had published 192 mammalian, plant, and protein life sciences databases holding 8.2 million data records, integrated as Linked Open or Private Data (LOD/LPD) using SciNetS.org, the Scientists' Networking System. The huge quantity of linked data covered by this database integration framework is based on the Semantic Web, where researchers collaborate by managing metadata across public and private databases in a secured data space. This outstripped the data query capacity of existing interface tools such as SPARQL. Actual research also requires specialized tools for data analysis using raw original data. To solve these challenges, in December 2009 we developed the lightweight Semantic-JSON interface to access each fragment of linked and raw life sciences data securely, under the control of programming languages popularly used by bioinformaticians such as Perl and Ruby. Researchers have successfully used the interface across 28 million semantic relationships for biological applications including genome design, sequence processing, inference over phenotype databases, full-text search indexing, and human-readable contents such as ontology and LOD tree viewers. Semantic-JSON services of SciNetS.org are provided at http://semanticjson.org. PMID:21632604
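The Semantic-JSON idea of retrieving individual fragments of linked data under program control can be illustrated with a toy in-memory endpoint. The record structure and paths below are invented for the sketch, not the actual SciNetS.org format:

```python
import json

# Hypothetical Semantic-JSON style fragments: each record is fetched
# piece by piece and carries links to related records, rather than
# arriving as one monolithic dump.
FRAGMENTS = {
    "/records/gene42": json.dumps({
        "id": "gene42", "label": "example gene",
        "links": {"phenotype": "/records/pheno7"},
    }),
    "/records/pheno7": json.dumps({"id": "pheno7", "label": "example phenotype"}),
}

def fetch(path):
    """Stand-in for an HTTP GET against the JSON endpoint."""
    return json.loads(FRAGMENTS[path])

# Follow a semantic link from one fragment to another.
record = fetch("/records/gene42")
linked = fetch(record["links"]["phenotype"])
print(linked["label"])  # → example phenotype
```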
Self-management model in the scheduling of successive appointments in rheumatology.
Castro Corredor, David; Cuadra Díaz, José Luis; Mateos Rodríguez, Javier José; Anino Fernández, Joaquín; Mínguez Sánchez, María Dolores; de Lara Simón, Isabel María; Tébar, María Ángeles; Añó, Encarnación; Sanz, María Dolores; Ballester, María Nieves
2018-01-08
The rheumatology service of Ciudad Real Hospital, located in an autonomous community of the same name near the center of Spain, implemented a self-management model of successive appointments more than 10 years ago. Since then, the physicians of the department have scheduled follow-up visits for their patients depending on the disease, its course, and ancillary tests. The purpose of this study is to evaluate and compare the self-management model for successive appointments in the rheumatology service of Ciudad Real Hospital versus the model of external appointment management implemented in 8 of the hospital's 15 medical services. A comparative and multivariate analysis was performed to identify variables with statistically significant differences in terms of activity and/or performance indicators and quality perceived by users. The comparison involved the self-management model for successive appointments employed in the rheumatology service of Ciudad Real Hospital and the model for external appointment management used in 8 hospital medical services between January 1 and May 31, 2016. In a database with more than 100,000 records of appointments involving the set of services included in the study, the mean waiting time and the numbers of non-appearances and reschedulings of follow-up visits in the rheumatology department were significantly lower than in the other services. The number of individuals treated in outpatient rheumatology services was 7,768, and a total of 280 patients were surveyed (response rate 63.21%). They showed great overall satisfaction, and the incidence rate of claims was low. Our results show that the self-management model of scheduling appointments achieves better results in terms of activity indicators and quality perceived by users, despite the intense activity. Thus, this study could be fundamental for decision making in the management of health care organizations.
Energy science and technology database (on the internet). Online data
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The Energy Science and Technology Database (EDB) is a multidisciplinary file containing worldwide references to basic and applied scientific and technical research literature. The information is collected for use by government managers, researchers at the national laboratories, and other research efforts sponsored by the U.S. Department of Energy, and the results of this research are transferred to the public. Abstracts are included for records from 1976 to the present. The EDB also contains the Nuclear Science Abstracts, a comprehensive abstract and index collection of the international nuclear science and technology literature for the period 1948 through 1976. Included are scientific and technical reports of the U.S. Atomic Energy Commission, U.S. Energy Research and Development Administration and its contractors, other agencies, universities, and industrial and research organizations. Approximately 25% of the records in the file contain abstracts. Nuclear Science Abstracts contains over 900,000 bibliographic records. The entire Energy Science and Technology Database contains over 3 million bibliographic records. This database is now available for searching through the GOV.Research-Center (GRC) service. GRC is a single online web-based search service to well-known Government databases. Featuring powerful search and retrieval software, GRC is an important research tool. The GRC web site is at http://grc.ntis.gov.
78 FR 20669 - National Institute on Aging; Notice of Closed Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-05
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health National Institute on Aging... personal privacy. Name of Committee: National Institute on Aging Special Emphasis Panel; Management of the Primate Aging Database Date: April 23, 2013. Time: 1:00 p.m. to 4:00 p.m. Agenda: To review and evaluate...
Cost-Effectiveness of Social Work Services in Aging: An Updated Systematic Review
ERIC Educational Resources Information Center
Rizzo, Victoria M.; Rowe, Jeannine M.
2016-01-01
Objectives: This study examines the impact of social work interventions in aging on quality of life (QOL) and cost outcomes in four categories (health, mental health, geriatric evaluation and management, and caregiving). Methods: Systematic review methods are employed. Databases were searched for articles published in English between 2004 and 2012…
Secure Cooperative Data Access in Multi-Cloud Environment
ERIC Educational Resources Information Center
Le, Meixing
2013-01-01
In this dissertation, we discuss the problem of enabling cooperative query execution in a multi-cloud environment where the data is owned and managed by multiple enterprises. Each enterprise maintains its own relational database using a private cloud. In order to implement desired business services, parties need to share selected portion of their…
Don't Toss the Floss! The Benefits of Daily Cleaning Between Teeth
... Flossing for the management of periodontal diseases and dental caries in adults. Sambunjak D, Nickerson JW, Poklepovic T, et al., Cochrane Database Syst Rev . 2011 Dec 7;(12):CD008829. doi: 10.1002/14651858.CD008829.pub2. Review. PMID: ... of Health and Human Services
The Database Business: Managing Today--Planning for Tomorrow. Optimizing Human Resource Factors.
ERIC Educational Resources Information Center
Clark, Joseph E.; And Others
1988-01-01
The first paper describes the National Technical Information Service productivity improvement system and its emphasis on human resources development. The second addresses the benefits of telecommuting to employers and employees. The third discusses the problems generated by the baby boom work force pressing for advancement at a time when many…
A neotropical migratory bird prioritization for National Forests and Grasslands
Dick Roth; Richard Peterson
1997-01-01
The Rocky Mountain Region of the USDA Forest Service provides nesting habitat for 146 species of neotropical migratory birds. Interactive, prioritization databases were developed for each National Forest and National Grassland in the Region to assist land managers in making informed decisions about resource allocations. The data was processed using Paradox software....
A Knowledge Base for FIA Data Uses
Victor A. Rudis
2005-01-01
Knowledge management provides a way to capture the collective wisdom of an organization, facilitate organizational learning, and foster opportunities for improvement. This paper describes a knowledge base compiled from uses of field observations made by the U.S. Department of Agriculture Forest Service, Forest Inventory and Analysis program and a citation database of...
Schwartz, Yannick; Barbot, Alexis; Thyreau, Benjamin; Frouin, Vincent; Varoquaux, Gaël; Siram, Aditya; Marcus, Daniel S; Poline, Jean-Baptiste
2012-01-01
As neuroimaging databases grow in size and complexity, the time researchers spend investigating and managing the data increases at the expense of data analysis. As a result, investigators rely more and more heavily on scripting in high-level languages to automate data management and processing tasks. For this, structured and programmatic access to the data store is necessary. Web services are a first step toward this goal. However, they lack functionality and ease of use because they provide only low-level interfaces to databases. We introduce here PyXNAT, a Python module that interacts with The Extensible Neuroimaging Archive Toolkit (XNAT) through native Python calls across multiple operating systems. The choice of Python enables PyXNAT to expose the XNAT Web Services and unify their features with a higher-level and more expressive language. PyXNAT gives XNAT users direct access to all the scientific packages in Python. Finally, PyXNAT aims to be efficient and easy to use, both as a back-end library to build XNAT clients and as an alternative front-end from the command line.
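PyXNAT's selection syntax wraps XNAT's hierarchical REST resource layout, in which projects contain subjects, which contain experiments. A minimal sketch of that path structure (the project, subject, and session names are illustrative):

```python
def xnat_uri(project, subject=None, experiment=None):
    """Build a REST URI following XNAT's project/subject/experiment
    hierarchy -- the layout that PyXNAT's select syntax wraps."""
    uri = f"/data/projects/{project}"
    if subject:
        uri += f"/subjects/{subject}"
    if experiment:
        uri += f"/experiments/{experiment}"
    return uri

print(xnat_uri("DemoProject", "sub001", "mr_session1"))
# → /data/projects/DemoProject/subjects/sub001/experiments/mr_session1
```

In PyXNAT itself the same traversal is expressed through chained `select` calls on an `Interface` object connected to an XNAT server, so scripts can walk the archive without hand-building URIs.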
The research and development of water resources management information system based on ArcGIS
NASA Astrophysics Data System (ADS)
Cui, Weiqun; Gao, Xiaoli; Li, Yuzhi; Cui, Zhencai
Because water resources management involves large amounts of data in many types and formats, we built a water resources calculation model and established a water resources management information system based on the ArcGIS and Visual Studio.NET development platforms. The system can integrate spatial data and attribute data organically and manage them uniformly. It can analyze spatial data, support bidirectional queries by map and by data, generate various charts and report forms automatically, link multimedia information, manage the database, and so on. It can therefore provide spatial and static synthesized information services for the study, management, and decision making of water resources, regional geology, eco-environment, etc.
Past, present, and future of water data delivery from the U.S. Geological Survey
Hirsch, Robert M.; Fisher, Gary T.
2014-01-01
We present an overview of national water databases managed by the U.S. Geological Survey, including surface-water, groundwater, water-quality, and water-use data. These are readily accessible to users through web interfaces and data services. Multiple perspectives of data are provided, including search and retrieval of real-time data and historical data, on-demand current conditions and alert services, data compilations, spatial representations, analytical products, and availability of data across multiple agencies.
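As a hedged illustration of the programmatic access such data services allow, the helper below composes a request URL in the style of the USGS instantaneous-values web service; treat the endpoint and parameter names as assumptions to verify against the current service documentation:

```python
from urllib.parse import urlencode

def nwis_url(site, parameter, fmt="json"):
    """Compose an instantaneous-values request in the style of the USGS
    water data services. Endpoint and parameter names ('sites',
    'parameterCd') are assumptions based on the public API, not a spec."""
    base = "https://waterservices.usgs.gov/nwis/iv/"
    return base + "?" + urlencode({"format": fmt, "sites": site,
                                   "parameterCd": parameter})

# Site and parameter codes are illustrative placeholders.
print(nwis_url("01646500", "00060"))
```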
1990-02-01
in section S3.1, Airspace Management...ORNL 1988 data. S3.10.4 Community Services: Community services provided at the county level include...Under the Base Realignment Project. Mountain Home City Clerk's Office. 1989. Proposed Fiscal 1990 Budget. Oak Ridge National Laboratory (ORNL). 1988. Low...Altitude Airspace Database. Submitted to the U.S. Air Force. ORNL and Consultants. 1988. Reviews of Scientific Literatures on the Environmental
Development of XML Schema for Broadband Digital Seismograms and Data Center Portal
NASA Astrophysics Data System (ADS)
Takeuchi, N.; Tsuboi, S.; Ishihara, Y.; Nagao, H.; Yamagishi, Y.; Watanabe, T.; Yanaka, H.; Yamaji, H.
2008-12-01
There are a number of data centers around the globe where digital broadband seismograms are open to researchers. Those centers use their own user interfaces, and there is no standard for accessing and retrieving seismograms from different data centers through a unified interface. One of the emergent technologies for realizing a unified user interface across data centers is the concept of the WebService and the WebService portal. Here we have developed a prototype data center portal for digital broadband seismograms. This WebService portal uses WSDL (Web Services Description Language) to accommodate differences among the data centers. By using WSDL, alteration and addition of data center user interfaces can be easily managed. This portal, called the NINJA Portal, assumes three WebServices: (1) a database query service, (2) a seismic event data request service, and (3) a seismic continuous data request service. The current system supports station search in the database query service and the seismic continuous data request service. The data centers initially supported by the NINJA portal will be the OHP data center in ERI and the Pacific21 data center in IFREE/JAMSTEC. We have developed a metadata standard for seismological data based on QuakeML for parametric data, which was developed by ETH Zurich, and XML-SEED for waveform data, which was developed by IFREE/JAMSTEC. The prototype of the NINJA portal is now released through the IFREE web page (http://www.jamstec.go.jp/pacific21/).
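Because the portal describes each center's interface in WSDL, a client can discover the available operations mechanically. A toy sketch of that discovery step (the WSDL fragment and operation names below are invented, not the actual NINJA portal definitions):

```python
import xml.etree.ElementTree as ET

# Invented WSDL fragment describing three operations, mirroring the
# three WebServices the portal assumes.
WSDL = """<definitions xmlns="http://schemas.xmlsoap.org/wsdl/">
  <portType name="DataCenterPort">
    <operation name="QueryStations"/>
    <operation name="RequestEventData"/>
    <operation name="RequestContinuousData"/>
  </portType>
</definitions>"""

ns = {"w": "http://schemas.xmlsoap.org/wsdl/"}
root = ET.fromstring(WSDL)
# List every operation the service description advertises.
ops = [op.get("name") for op in root.findall(".//w:operation", ns)]
print(ops)  # → ['QueryStations', 'RequestEventData', 'RequestContinuousData']
```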
Privacy-Aware Location Database Service for Granular Queries
NASA Astrophysics Data System (ADS)
Kiyomoto, Shinsaku; Martin, Keith M.; Fukushima, Kazuhide
Future mobile markets are expected to increasingly embrace location-based services. This paper presents a new system architecture for location-based services, which consists of a location database and distributed location anonymizers. The service is privacy-aware in the sense that the location database always maintains a degree of anonymity. The location database service permits three different levels of query and can thus be used to implement a wide range of location-based services. Furthermore, the architecture is scalable and employs simple functions that are similar to those found in general database systems.
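The three query levels can be pictured as returning progressively coarser coordinates. The sketch below uses simple decimal truncation purely for illustration; the paper's actual anonymization architecture (database plus distributed anonymizers) is more involved:

```python
def anonymized_location(lat, lon, level):
    """Return coordinates coarsened by query level: higher levels reveal
    less precision. A stand-in illustration, not the paper's scheme."""
    digits = {1: 3, 2: 1, 3: 0}[level]  # level 1 = finest granularity
    return round(lat, digits), round(lon, digits)

# Illustrative coordinates queried at the middle granularity level.
print(anonymized_location(35.6581, 139.7017, 2))  # → (35.7, 139.7)
```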
KeyWare: an open wireless distributed computing environment
NASA Astrophysics Data System (ADS)
Shpantzer, Isaac; Schoenfeld, Larry; Grindahl, Merv; Kelman, Vladimir
1995-12-01
Deployment of distributed applications in the wireless domain lacks the equivalent tools, methodologies, architectures, and network management that exist in LAN-based applications. A wireless distributed computing environment (KeyWareTM) based on intelligent agents within a multiple-client, multiple-server scheme was developed to resolve this problem. KeyWare renders concurrent application services to wireline and wireless client nodes encapsulated in multiple paradigms such as message delivery, database access, e-mail, and file transfer. These services and paradigms are optimized to cope with temporal and spatial radio coverage, high latency, limited throughput, and transmission costs. A unified network management paradigm for both wireless and wireline networks facilitates seamless extension of LAN-based management tools to include wireless nodes. A set of object-oriented tools and methodologies enables direct asynchronous invocation of agent-based services, supplemented by tool-sets matched to supported KeyWare paradigms. The open architecture embodiment of KeyWare enables a wide selection of client node computing platforms, operating systems, transport protocols, radio modems, and infrastructures while maintaining application portability.
Health information and communication system for emergency management in a developing country, Iran.
Seyedin, Seyed Hesam; Jamali, Hamid R
2011-08-01
Disasters are fortunately rare occurrences. However, accurate and timely information and communication are vital to adequately prepare individual health organizations for such events. The current article investigates the health-related communication and information systems for emergency management in Iran. A mixed qualitative and quantitative methodology was used in this study. A sample of 230 health service managers was surveyed using a questionnaire, and 65 semi-structured interviews were also conducted with public health and therapeutic affairs managers who were responsible for emergency management. A range of problems was identified, including fragmentation of information, lack of local databases, lack of a clear information strategy, and lack of a formal system for logging disaster-related information at the regional or local level. Recommendations were made for improving the national emergency management information and communication system. The findings have implications for health organizations in developing and developed countries, especially in the Middle East. Creating disaster-related information databases, creating protocols and standards, setting an information strategy, training staff, and hosting a center for the information system in the Ministry of Health to centrally manage and share the data could improve the current information system.
Big issues, small systems: managing with information in medical research.
Jones, J; Preston, H
2000-08-01
The subject of this article is the design of a database system for handling files related to the work of the Molecular Genetics Department of the International Blood Group Reference Laboratory. It examines specialist information needs identified within this organization and indicates how the design of the Rhesus Information Tracking System was able to meet current needs. Rapid Applications Development prototyping forms the basis of the investigation, linked to interview, questionnaire, and observation techniques in order to establish requirements for interoperability. In particular, the place of this specialist database within the much broader information strategy of the National Blood Service is examined. This unique situation is analogous to management activities in broader environments, and a number of generic issues are highlighted by the research.
Jiang, Kaifeng; Hu, Jia; Hong, Ying; Liao, Hui; Liu, Songbo
2016-11-01
Prior research has demonstrated that service climate can enhance unit performance by guiding employees' service behavior to satisfy customers. Extending this literature, we identified ethical climate toward customers as another indispensable organizational climate in service contexts and examined how and when service climate operates in conjunction with ethical climate to enhance business performance of service units. Based on data collected in 2 phases over 6 months from multiple sources of 196 movie theaters, we found that service climate and ethical climate had disparate impacts on business performance, operationalized as an index of customer attendance rate and operating income per labor hour, by enhancing service behavior and reducing unethical behavior, respectively. Furthermore, we found that service behavior and unethical behavior interacted to affect business performance, in such a way that service behavior was more positively related to business performance when unethical behavior was low than when it was high. This interactive effect between service and unethical behaviors was further strengthened by high market turbulence and competitive intensity. These findings provide new insight into theoretical development of service management and offer practical implications about how to maximize business performance of service units by managing organizational climates and employee behaviors synergistically. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
NASA Astrophysics Data System (ADS)
Seufert, V.; Wood, S.; Reid, A.; Gonzalez, A.; Rhemtulla, J.; Ramankutty, N.
2014-12-01
The most important current driver of biodiversity loss is the conversion of natural habitats for human land uses, mostly for the purpose of food production. However, by causing this biodiversity loss, food production is eroding the very same ecosystem services (e.g. pollination and soil fertility) that it depends on. We therefore need to adopt more wildlife-friendly agricultural practices that can contribute to preserving biodiversity. Organic farming has been shown to typically host higher biodiversity than conventional farming. But how is the biodiversity benefit of organic management dependent on the landscape context farms are situated in? To implement organic farming as an effective means for protecting biodiversity and enhancing ecosystem services we need to understand better under what conditions organic management is most beneficial for species. We conducted a meta-analysis of the literature to answer this question, compiling the most comprehensive database to date of studies that monitored biodiversity in organic vs. conventional fields. We also collected information about the landscape surrounding these fields from remote sensing products. Our database consists of 348 study sites across North America and Europe. Our analysis shows that organic management can improve biodiversity in agricultural fields substantially. It is especially effective at preserving biodiversity in homogeneous landscapes that are structurally simplified and dominated by either cropland or pasture. In heterogeneous landscapes conventional agriculture might instead already hold high biodiversity, and organic management does not appear to provide as much of a benefit for species richness as in simplified landscapes. 
Our results suggest that strategies to maintain biodiversity-dependent ecosystem services should include a combination of pristine natural habitats, wildlife-friendly farming systems like organic farming, and high-yielding conventional systems, interspersed in structurally diverse, heterogeneous landscapes.
Trevatt, Alexander E J; Kirkham, Emily N; Allix, Bradley; Greenwood, Rosemary; Coy, Karen; Hollén, Linda I; Young, Amber E R
2016-09-01
There is a paucity of evidence guiding management of small area partial thickness paediatric scalds. This has prevented the development of national management guidelines for these injuries. This research aimed to investigate whether a lack of evidence for national guidelines has resulted in variations in both management and outcomes of paediatric small area scalds across England and Wales (E&W). A national survey of initial management of paediatric scalds ≤5% Total Body Surface Area (%TBSA) was sent to 14 burns services in E&W. Skin graft rates of anonymised burns services over seven years were collected from the international Burns Injury Database (iBID). Average skin grafting rates across services were compared. Length of stay and proportion of patients receiving general anaesthesia for dressing application at each service were also compared. All 14 burns services responded to the survey. Only 50% of services had a protocol in place for the management of small area burns. All protocols varied in how partial thickness paediatric scalds ≤5% TBSA should be managed. There was no consensus as to which scalds should be treated using biosynthetic dressings. Data from iBID for 11,917 patients showed that the average reported skin grafting rate across all burns services was 2.3% (95% CI 2.1, 2.6) but varied from 0.3% to 7.1% (P<0.001). Service provider remained associated with likelihood of skin grafting when variations in the %TBSA case mix seen by each service were controlled for (χ(2)=87.3, P<0.001). The use of general anaesthetics across services varied between 0.6 and 35.5% (P<0.001). The median length of stay across services varied from 1 to 3 days (P<0.001). A lack of evidence guiding management of small-area paediatric scalds has resulted in variation in management of these injuries across E&W. There is also significant variation in outcomes for these injuries. Further research is indicated to determine if care pathways and outcomes are linked.
An evidence-based national policy for the management of small area paediatric scalds would ensure that high quality, standardised care is delivered throughout E&W and variations in outcome are reduced. Copyright © 2016 Elsevier Ltd and ISBI. All rights reserved.
Enabling search over encrypted multimedia databases
NASA Astrophysics Data System (ADS)
Lu, Wenjun; Swaminathan, Ashwin; Varna, Avinash L.; Wu, Min
2009-02-01
Performing information retrieval tasks while preserving data confidentiality is a desirable capability when a database is stored on a server maintained by a third-party service provider. This paper addresses the problem of enabling content-based retrieval over encrypted multimedia databases. Search indexes, along with multimedia documents, are first encrypted by the content owner and then stored onto the server. Through jointly applying cryptographic techniques, such as order preserving encryption and randomized hash functions, with image processing and information retrieval techniques, secure indexing schemes are designed to provide both privacy protection and rank-ordered search capability. Retrieval results on an encrypted color image database and security analysis of the secure indexing schemes under different attack models show that data confidentiality can be preserved while retaining very good retrieval performance. This work has promising applications in secure multimedia management.
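The index construction described above can be illustrated with a minimal sketch: the content owner protects index tokens with a keyed (randomized) hash, so the third-party server can match queries against the index without learning the underlying features. This is only an illustration of the keyed-hash idea, not the paper's actual scheme (which also uses order-preserving encryption and image-feature extraction); the function names and token vocabulary are invented for the example.

```python
import hmac
import hashlib

def protect(token: str, key: bytes) -> str:
    """Keyed hash of an index token; the server only ever sees this digest."""
    return hmac.new(key, token.encode(), hashlib.sha256).hexdigest()

def build_index(documents: dict, key: bytes) -> dict:
    """Map protected tokens -> document ids (built by the content owner)."""
    index = {}
    for doc_id, tokens in documents.items():
        for t in tokens:
            index.setdefault(protect(t, key), []).append(doc_id)
    return index

def search(index: dict, token: str, key: bytes) -> list:
    """The querier hashes the token with the same key; the server matches digests."""
    return index.get(protect(token, key), [])
```

For example, `search(build_index({"img1": ["sunset", "beach"], "img2": ["beach"]}, key), "beach", key)` returns both image ids while the server stores only opaque digests.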
The Joint Committee for Traceability in Laboratory Medicine (JCTLM) - its history and operation.
Jones, Graham R D; Jackson, Craig
2016-01-30
The Joint Committee for Traceability in Laboratory Medicine (JCTLM) was formed to bring together the sciences of metrology, laboratory medicine and laboratory quality management. The aim of this collaboration is to support worldwide comparability and equivalence of measurement results in clinical laboratories for the purpose of improving healthcare. The JCTLM has its origins in the activities of international metrology treaty organizations, professional societies and federations devoted to improving measurement quality in physical, chemical and medical sciences. The three founding organizations, the International Committee for Weights and Measures (CIPM), the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) and the International Laboratory Accreditation Cooperation (ILAC) are the leaders of this activity. The main service of the JCTLM is a web-based database with a list of reference materials, reference methods and reference measurement services meeting appropriate international standards. This database allows manufacturers to select references for assay traceability and provides support for suppliers of these services. As of mid 2015 the database lists 295 reference materials for 162 analytes, 170 reference measurement procedures for 79 analytes and 130 reference measurement services for 39 analytes. There remains a need for the development and implementation of metrological traceability in many areas of laboratory medicine and the JCTLM will continue to promote these activities into the future. Copyright © 2015 Elsevier B.V. All rights reserved.
Design of Integrated Database on Mobile Information System: A Study of Yogyakarta Smart City App
NASA Astrophysics Data System (ADS)
Nurnawati, E. K.; Ermawati, E.
2018-02-01
An integration database is a database that acts as the data store for multiple applications and thus integrates data across these applications (in contrast to an application database). An integration database needs a schema that takes all its client applications into account. The benefit of such a schema is that sharing data among applications does not require an extra layer of integration services on the applications: any change to data made in a single application is made available to all applications at the time of database commit, thus keeping the applications' use of the data better synchronized. This study aims to design and build an integrated database that can be used by various applications on a mobile-device-based system platform within a smart city system. The resulting database can be used by various applications, whether together or separately. The design and development of the database emphasize flexibility, security, and completeness of attributes that can be shared by the various applications to be built. The method used in this study is to choose an appropriate logical database structure (patterns of data) and to build the relational database model. The resulting design was tested with several prototype apps, and system performance was analysed with test data. The integrated database can be utilized by both the admin and the user on an integral and comprehensive platform. This system can help admins, managers, and operators manage the application easily and efficiently. The Android-based app is built on a dynamic client-server model in which data are extracted from an external MySQL database, so if data change in the database, the data in the Android applications also change. The app assists users in searching for information related to Yogyakarta (as a smart city), especially on culture, government, hotels, and transportation.
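The integration-database idea described above, where several applications share one schema and see each other's commits immediately, can be sketched as follows. This uses an in-memory sqlite3 database as a stand-in for the external MySQL database, and all table and column names are hypothetical, not the paper's actual schema:

```python
import sqlite3

# One shared integration database serving multiple client apps.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE place (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    category TEXT NOT NULL      -- 'hotel', 'culture', 'transport', ...
);
CREATE TABLE review (
    id INTEGER PRIMARY KEY,
    place_id INTEGER REFERENCES place(id),
    rating INTEGER CHECK (rating BETWEEN 1 AND 5)
);
""")

# The "hotel app" inserts a record and commits ...
conn.execute("INSERT INTO place (name, category) VALUES (?, ?)",
             ("Hotel Malioboro", "hotel"))
conn.commit()

# ... and the "city guide app", reading the same database, sees it at once:
rows = conn.execute("SELECT name FROM place WHERE category = 'hotel'").fetchall()
```

The same synchronization-at-commit property is what the shared schema buys: no extra integration layer is needed between the two apps.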
Transforming data into action: the Sonoma County Human Services Department.
Harrison, Lindsay
2012-01-01
In order to centralize data-based initiatives, the Director of the Department worked with the Board of Supervisors and the executive team to develop a new Planning, Research, and Evaluation (PRE) division. PRE is establishing rules for data-based decision making and consolidating data collection to ensure quality and consistency. It aims to target resources toward visionary, pro-active program planning and implementation, and inform the public about the role of Human Services in creating a healthy, safe and productive environment. PRE staff spent several months studying the job functions of staff, to determine how they use information to inform practice, consulting other counties about their experiences. The PRE team developed Datascript, outlining two agency aims: (a) foster a decision-making environment that values and successfully uses empirical evidence for strategic change, and (b) manage the role and image of the Human Services Department in the external environment. The case study describes action steps developed to achieve each aim. Copyright © Taylor & Francis Group, LLC
A Conceptual Model and Database to Integrate Data and Project Management
NASA Astrophysics Data System (ADS)
Guarinello, M. L.; Edsall, R.; Helbling, J.; Evaldt, E.; Glenn, N. F.; Delparte, D.; Sheneman, L.; Schumaker, R.
2015-12-01
Data management is critically foundational to doing effective science in our data-intensive research era and, done well, can enhance collaboration, increase the value of research data, and support requirements by funding agencies to make scientific data and other research products available through publicly accessible online repositories. However, there are few examples (but see the Long-term Ecological Research Network Data Portal) of these data being provided in such a manner that allows exploration within the context of the research process: What specific research questions do these data seek to answer? What data were used to answer these questions? What data would have been helpful to answer these questions but were not available? We propose an agile conceptual model and database design, as well as example results, that integrate data management with project management not only to maximize the value of research data products but to enhance collaboration during the project and the process of project management itself. In our project, which we call 'Data Map,' we used agile principles by adopting a user-focused approach and by designing our database to be simple, responsive, and expandable. We initially designed Data Map for the Idaho EPSCoR project "Managing Idaho's Landscapes for Ecosystem Services (MILES)" (see https://www.idahoecosystems.org//) and will present example results for this work. We consulted with our primary users (project managers, data managers, and researchers) to design the Data Map. Results will be useful to project managers and to funding agencies reviewing progress because they will readily provide answers to the questions "For which research projects/questions are data available and/or being generated by MILES researchers?" and "Which research projects/questions are associated with each of the 3 primary questions from the MILES proposal?"
To be responsive to the needs of the project, we chose to streamline our design for the prototype database and build it in a way that is modular and can be changed or expanded to meet user needs. Our hope is that others, especially those managing large collaborative research grants, will be able to use our project model and database design to enhance the value of their project and data management both during and following the active research period.
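A minimal sketch of such a schema links datasets to the research questions they address, so that "which data are available?" and "which are still needed?" can be answered with a single query. The table names and sample rows below are invented for illustration, not the actual Data Map design:

```python
import sqlite3

# Illustrative "Data Map"-style schema: research questions, datasets, and
# the many-to-many link between them.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE question (id INTEGER PRIMARY KEY, text TEXT NOT NULL);
CREATE TABLE dataset  (id INTEGER PRIMARY KEY, title TEXT NOT NULL,
                       status TEXT NOT NULL);   -- 'available' or 'needed'
CREATE TABLE answers  (question_id INTEGER REFERENCES question(id),
                       dataset_id  INTEGER REFERENCES dataset(id));
""")
db.execute("INSERT INTO question VALUES (1, 'How do land-use changes affect ecosystem services?')")
db.execute("INSERT INTO dataset VALUES (1, 'LiDAR vegetation survey', 'available')")
db.execute("INSERT INTO dataset VALUES (2, 'Household water-use records', 'needed')")
db.execute("INSERT INTO answers VALUES (1, 1)")
db.execute("INSERT INTO answers VALUES (1, 2)")

# Which data are available, or still missing, for each research question?
report = db.execute("""
    SELECT q.text, d.title, d.status
    FROM question q JOIN answers a ON a.question_id = q.id
                    JOIN dataset d ON d.id = a.dataset_id
    ORDER BY d.id
""").fetchall()
```

Because the link table is separate, the same dataset can serve several questions, and new questions or datasets can be added without schema changes, in keeping with the modular, expandable design described above.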
Problem reporting management system performance simulation
NASA Technical Reports Server (NTRS)
Vannatta, David S.
1993-01-01
This paper proposes the Problem Reporting Management System (PRMS) model as an effective discrete simulation tool for determining the risks involved during the development phase of a Trouble Tracking Reporting Data Base replacement system. The model considers the type of equipment and networks which will be used in the replacement system, as well as varying user loads, the size of the database, and expected operational availability. The paper discusses the dynamics, stability, and application of the PRMS and suggests concepts to enhance and enrich service performance.
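A discrete simulation of this kind can be sketched in a few lines: problem reports arrive over time, queue for a single shared resource (standing in for the database system), and the model yields completion times from which load and availability figures could be derived. This deterministic toy queue illustrates the approach only; it is not the PRMS model itself:

```python
def simulate(arrivals, service_time):
    """Tiny discrete-event sketch of a trouble-report queue.

    Each report waits for a single server (the database); returns the
    completion time of every report, in arrival order.
    """
    server_free_at = 0.0
    completions = []
    for t in sorted(arrivals):
        start = max(t, server_free_at)          # wait if the server is busy
        server_free_at = start + service_time   # occupy the server
        completions.append(server_free_at)
    return completions
```

For instance, three reports arriving at t = 0, 1, 2 with a service time of 2 complete at t = 2, 4, 6: the growing gap between arrival and completion is exactly the backlog a risk analysis would watch under heavier user loads.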
Evaluating the operations capability of Freedom's Data Management System
NASA Technical Reports Server (NTRS)
Sowizral, Henry A.
1990-01-01
Three areas of Data Management System (DMS) performance are examined: raw processor speed, the subjective speed of the Lynx OS X-Window system, and the operational capacity of the Runtime Object Database (RODB). It is concluded that the proposed processor will operate at its specified rate of speed and that the X-Window system operates within users' subjective needs. It is also concluded that the RODB cannot provide the required level of service, even with a two-order of magnitude (100 fold) improvement in speed.
Issues with medication supply and management in a rural community in Queensland.
Tan, Amy C W; Emmerton, Lynne M; Hattingh, H Laetitia
2012-06-01
To identify the key issues reported by rural health-care providers in their provision of medication supply and related cognitive services, in order to inform health workforce and role development and thus improve the quality use of medicines in rural communities. Exploratory semistructured interview research. A rural community comprising four towns in a rural health service district in Queensland, Australia. Forty-nine health-care providers (medical practitioners, pharmacists, nurses and others) with medication-related roles who serviced the study community, identified through databases and local contacts. Medication-related roles undertaken by the health-care providers, focusing on medication supply and cognitive services; challenges in undertaking these roles. Medical and nursing providers reported challenges in ensuring continuity of supply of medications due to their existing medical workload demands. Local pharmacists were largely involved in medication supply, with limited capacity for extended cognitive roles. Participants identified a lack of support for their medication roles and the potential value of clinically focused pharmacists in medication management services. Medication supply may become more efficient with extended roles for certain health-care providers. The need for cognitive medication management services suggests potential for clinical pharmacists' role development in rural areas. © 2012 The Authors. Australian Journal of Rural Health © National Rural Health Alliance Inc.
Disease management as a performance improvement strategy.
McClatchey, S
2001-11-01
Disease management is a strategy of organizing care and services for a patient population across the continuum. It is characterized by a population database, interdisciplinary and interagency collaboration, and evidence-based clinical information. The effectiveness of a disease management program has been measured by a combination of clinical, financial, and quality of life outcomes. In early 1997, driven by a strategic planning process that established three Centers of Excellence (COE), we implemented disease management as the foundation for a new approach to performance improvement utilizing five key strategies. The five implementation strategies are outlined, in addition to a review of the key elements in outcome achievement.
Information-seeking behavior of basic science researchers: implications for library services.
Haines, Laura L; Light, Jeanene; O'Malley, Donna; Delwiche, Frances A
2010-01-01
This study examined the information-seeking behaviors of basic science researchers to inform the development of customized library services. A qualitative study using semi-structured interviews was conducted on a sample of basic science researchers employed at a university medical school. The basic science researchers used a variety of information resources ranging from popular Internet search engines to highly technical databases. They generally relied on basic keyword searching, using the simplest interface of a database or search engine. They were highly collegial, interacting primarily with coworkers in their laboratories and colleagues employed at other institutions. They made little use of traditional library services and instead performed many traditional library functions internally. Although the basic science researchers expressed a positive attitude toward the library, they did not view its resources or services as integral to their work. To maximize their use by researchers, library resources must be accessible via departmental websites. Use of library services may be increased by cultivating relationships with key departmental administrative personnel. Despite their self-sufficiency, subjects expressed a desire for centralized information about ongoing research on campus and shared resources, suggesting a role for the library in creating and managing an institutional repository.
QoE collaborative evaluation method based on fuzzy clustering heuristic algorithm.
Bao, Ying; Lei, Weimin; Zhang, Wei; Zhan, Yuzhuo
2016-01-01
At present, realizing or improving the quality of experience (QoE) is a major goal for network media transmission services, and QoE evaluation is the basis for adjusting the transmission control mechanism. This paper therefore proposes a QoE collaborative evaluation method based on a fuzzy clustering heuristic algorithm, concentrating on service score calculation at the server side. The server side collects network transmission quality of service (QoS) parameters, node location data, and user expectation values from client feedback information. It then manages the historical data in a database through a "big data" processing mode and predicts user scores according to heuristic rules. On this basis, it completes fuzzy clustering analysis and generates a service QoE score and management message, which are finally fed back to clients. The paper mainly discusses service evaluation generation rules, heuristic evaluation rules, and fuzzy clustering analysis methods, and presents service-based QoE evaluation processes. Simulation experiments have verified the effectiveness of the QoE collaborative evaluation method based on fuzzy clustering heuristic rules.
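The clustering step at the heart of such a method can be illustrated with a minimal, pure-Python fuzzy c-means over one-dimensional scores. This uses the standard FCM membership and centroid update rules; the paper's specific heuristic rules, parameters, and feature set are not reproduced here:

```python
def fuzzy_cmeans(xs, centers, m=2.0, iters=50):
    """One-dimensional fuzzy c-means sketch (standard FCM updates).

    Returns the refined cluster centers and the membership matrix
    u[i][k] of point i in cluster k (rows sum to 1).
    """
    for _ in range(iters):
        # Membership update: inverse distance ratios to every center.
        u = []
        for x in xs:
            row = []
            for c in centers:
                d = abs(x - c) or 1e-12          # guard against zero distance
                s = sum((d / (abs(x - cj) or 1e-12)) ** (2.0 / (m - 1.0))
                        for cj in centers)
                row.append(1.0 / s)
            u.append(row)
        # Center update: membership-weighted mean of the points.
        centers = [
            sum((u[i][k] ** m) * xs[i] for i in range(len(xs)))
            / sum(u[i][k] ** m for i in range(len(xs)))
            for k in range(len(centers))
        ]
    return centers, u
```

Applied to a batch of per-session scores, the soft memberships (rather than hard cluster labels) are what make a "collaborative" aggregate score possible: every session contributes to every cluster's center in proportion to its membership.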
Management of information in distributed biomedical collaboratories.
Keator, David B
2009-01-01
Organizing and annotating biomedical data in structured ways has gained much interest and focus in the last 30 years. Driven by decreases in digital storage costs and advances in genetics sequencing, imaging, electronic data collection, and microarray technologies, data is being collected at an alarming rate. The specialization of fields in biology and medicine demonstrates the need for somewhat different structures for storage and retrieval of data. For biologists, the need for structured information and integration across a number of domains drives development. For clinical researchers and hospitals, the need for a structured medical record accessible to, ideally, any medical practitioner who might require it during the course of research or patient treatment, patient confidentiality, and security are the driving developmental factors. Scientific data management systems generally consist of a few core services: a backend database system, a front-end graphical user interface, and an export/import mechanism or data interchange format to both get data into and out of the database and share data with collaborators. The chapter introduces some existing databases, distributed file systems, and interchange languages used within the biomedical research and clinical communities for scientific data management and exchange.
OntoGene web services for biomedical text mining.
Rinaldi, Fabio; Clematide, Simon; Marques, Hernani; Ellendorff, Tilia; Romacker, Martin; Rodriguez-Esteban, Raul
2014-01-01
Text mining services are rapidly becoming a crucial component of various knowledge management pipelines, for example in the process of database curation, or for exploration and enrichment of biomedical data within the pharmaceutical industry. Traditional architectures, based on monolithic applications, do not offer sufficient flexibility for a wide range of use case scenarios, and therefore open architectures, as provided by web services, are attracting increased interest. We present an approach towards providing advanced text mining capabilities through web services, using a recently proposed standard for textual data interchange (BioC). The web services leverage a state-of-the-art platform for text mining (OntoGene) which has been tested in several community-organized evaluation challenges, with top-ranked results in several of them.
Construction of In-house Databases in a Corporation
NASA Astrophysics Data System (ADS)
Tamura, Haruki; Mezaki, Koji
This paper describes the fundamental idea of technical information management in Mitsubishi Heavy Industries, Ltd., and the present status of the activities. It then introduces the background and history of the development, along with problems encountered and countermeasures against them, regarding the Mitsubishi Heavy Industries Technical Information Retrieval System (called MARON), which started service in May 1985. The system deals with databases covering information common to the whole company (in-house research and technical reports, holdings information for books, journals, and so on) and local information held in each business division or department. Anybody from any division can access these databases through the company-wide network. An in-house interlibrary loan subsystem called Orderentry is available, which supports the acquisition of original materials.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-22
... access. Needs and Uses: Section 15.615 requires entities operating Access BPL systems shall supply to an... systems for inclusion into a publicly available database, within 30 days prior to installation of service... comments to Nicholas A. Fraser, Office of Management and Budget, via fax at 202-395-5167 or via the...
Information Technology and the Evolution of the Library
2009-03-01
Resource Commons/Repository/Federated Search; ILS (GLADIS/Pathfinder - Millenium)/Catalog/Circulation/Acquisitions/Digital Object Content... content management services to help centralize and distribute digital content from across the institution, software to allow for seamless federated searching across multiple databases, and imaging software to allow for daily reimaging of terminals to reduce security concerns that otherwise
United States Army Medical Materiel Development Activity: 1997 Annual Report.
1997-01-01
business planning and execution information management system (Project Management Division Database (PMDD) and Product Management Database System (PMDS)... MANAGEMENT • Project Management Division Database (PMDD), Product Management Database System (PMDS), and Special Users Database System: The existing... System (FMS), were investigated. New Product Managers and Project Managers were added into PMDS and PMDD. A separate division, Support, was
A radiology department intranet: development and applications.
Willing, S J; Berland, L L
1999-01-01
An intranet is a "private Internet" that uses the protocols of the World Wide Web to share information resources within a company or with the company's business partners and clients. The hardware requirements for an intranet begin with a dedicated Web server permanently connected to the departmental network. The heart of a Web server is the hypertext transfer protocol (HTTP) service, which receives a page request from a client's browser and transmits the page back to the client. Although knowledge of hypertext markup language (HTML) is not essential for authoring a Web page, a working familiarity with HTML is useful, as is knowledge of programming and database management. Security can be ensured by using scripts to write information in hidden fields or by means of "cookies." Interfacing databases and database management systems with the Web server and conforming the user interface to HTML syntax can be achieved by means of the common gateway interface (CGI), Active Server Pages (ASP), or other methods. An intranet in a radiology department could include the following types of content: on-call schedules, work schedules and a calendar, a personnel directory, resident resources, memorandums and discussion groups, software for a radiology information system, and databases.
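The request/response cycle described for the HTTP service can be sketched with Python's standard library: a handler receives a page request from a client's browser and transmits the page back. The on-call-schedule page content below is a made-up placeholder, and the handler is a minimal illustration rather than a production intranet server:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading
import urllib.request

# Placeholder page content standing in for a departmental intranet page.
PAGE = b"<html><body><h1>On-call schedule</h1></body></html>"

class IntranetHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The HTTP service receives the page request ...
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        # ... and transmits the page back to the client.
        self.wfile.write(PAGE)

    def log_message(self, *args):
        pass  # keep the demo quiet

server = HTTPServer(("127.0.0.1", 0), IntranetHandler)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/oncall.html"
body = urllib.request.urlopen(url).read()
server.shutdown()
```

Dynamic content such as the databases mentioned above would replace the static `PAGE` with output generated per request, which is exactly the role CGI or ASP plays behind the same request/response cycle.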
NASA Astrophysics Data System (ADS)
Zhu, Wenmin; Jia, Yuanhua
2018-01-01
Based on risk management theory and the PDCA cycle model, the requirements of railway passenger transport safety production are analyzed, and the establishment of a security risk assessment team is proposed to manage risk by FTA combined with the Delphi method, from both qualitative and quantitative aspects. A safety production committee is also established to carry out performance appraisal, further ensuring the correctness of risk management results, optimizing safety management business processes, and improving risk management capabilities. The basic framework and risk information database of the risk management information system for railway passenger transport safety are designed using Ajax, Web Services, and SQL technologies. The system realizes functions for risk management, performance appraisal, and data management, and provides an efficient and convenient information management platform for railway passenger safety managers.
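Assuming FTA here denotes fault tree analysis, the quantitative side of the assessment can be illustrated by propagating basic-event probabilities through AND/OR gates to the top event. Independence of basic events is assumed, and the gate structure and probabilities below are invented for the example:

```python
def fault_tree_prob(node, probs):
    """Probability of the top event of a simple fault tree.

    A node is either a basic-event name (str) or a pair (gate, children)
    with gate "AND" or "OR". Basic events are assumed independent.
    """
    if isinstance(node, str):
        return probs[node]
    gate, children = node
    ps = [fault_tree_prob(c, probs) for c in children]
    if gate == "AND":                 # all children must occur
        result = 1.0
        for p in ps:
            result *= p
        return result
    if gate == "OR":                  # at least one child occurs
        miss = 1.0
        for p in ps:
            miss *= (1.0 - p)
        return 1.0 - miss
    raise ValueError(f"unknown gate: {gate}")
```

In a Delphi-style workflow, the assessment team would supply the basic-event probabilities by expert consensus and let the tree aggregate them into the top-event risk figure stored in the risk information database.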
Wilson, Charlotte; Alam, Rahul; Latif, Saima; Knighting, Katherine; Williamson, Susan; Beaver, Kinta
2012-01-01
A higher risk of diabetes mellitus in South Asian and Black African populations combined with lower reported access and self-management-related health outcomes informed the aims of this study. Our aims were to synthesise and evaluate evidence relating to patient self-management and access to healthcare services for ethnic minority groups living with diabetes. A comprehensive search strategy was developed capturing a full range of study types from 1995-2010, including relevant hand-searched literature pre-dating 1995. Systematic database searches of MEDLINE, Cochrane, DARE, HTA and NHSEED, the British Nursing Index, CAB abstracts, EMBASE, Global Health, Health Management Information Consortium and PsychInfo were conducted, yielding 21,288 abstracts. Following search strategy refinement and the application of review eligibility criteria; 11 randomised controlled trials (RCTs), 18 qualitative studies and 18 quantitative studies were evaluated and principal results extracted. Results suggest that self-management practices are in need of targeted intervention in terms of patients' knowledge and understanding of their illness, inadequacy of information and language and communication difficulties arising from cultural differences. Access to health-care is similarly hindered by a lack of cultural sensitivity in service provision and under use of clinic-based interpreters and community-based services. Recommendations for practice and subsequent intervention primarily rest at the service level but key barriers at patient and provider levels are also identified. © 2011 Blackwell Publishing Ltd.
A Quality-Control-Oriented Database for a Mesoscale Meteorological Observation Network
NASA Astrophysics Data System (ADS)
Lussana, C.; Ranci, M.; Uboldi, F.
2012-04-01
In the operational context of a local weather service, data accessibility and quality related issues must be managed by taking into account a wide set of user needs. This work describes the structure and the operational choices made for the operational implementation of a database system storing data from highly automated observing stations, metadata and information on data quality. Lombardy's environmental protection agency, ARPA Lombardia, manages a highly automated mesoscale meteorological network. A Quality Assurance System (QAS) ensures that reliable observational information is collected and disseminated to the users. The weather unit in ARPA Lombardia, at the same time an important QAS component and an intensive data user, has developed a database specifically aimed at: 1) providing quick access to data for operational activities and 2) ensuring data quality for real-time applications, by means of an Automatic Data Quality Control (ADQC) procedure. Quantities stored in the archive include hourly aggregated observations of precipitation amount, temperature, wind, relative humidity, pressure, and global and net solar radiation. The ADQC performs several independent tests on raw data and compares their results in a decision-making procedure. An important ADQC component is the Spatial Consistency Test based on Optimal Interpolation. Interpolated and Cross-Validation analysis values are also stored in the database, providing further information to human operators and useful estimates in case of missing data. The technical solution adopted is based on a LAMP (Linux, Apache, MySQL and Php) system, constituting an open source environment suitable for both development and operational practice. The ADQC procedure itself is performed by R scripts directly interacting with the MySQL database. Users and network managers can access the database by using a set of web-based Php applications.
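The spatial consistency idea can be sketched in a deliberately simplified form (this is not ARPA Lombardia's actual Optimal Interpolation; the inverse-distance weighting, station coordinates and values below are illustrative assumptions): a station's observation is compared against an estimate built from its neighbours, and large residuals are flagged as suspect.

```python
# Simplified spatial consistency check: flag a station when its value
# departs too far from an inverse-distance-weighted estimate computed
# from neighbouring stations. All numbers are hypothetical.
from math import hypot

def idw_estimate(target, neighbours, power=2):
    """Inverse-distance-weighted estimate at `target` from (x, y, value) tuples."""
    num = den = 0.0
    for x, y, value in neighbours:
        w = 1.0 / hypot(x - target[0], y - target[1]) ** power
        num += w * value
        den += w
    return num / den

def flag_suspect(obs, estimate, tolerance):
    """True when the residual exceeds the allowed tolerance."""
    return abs(obs - estimate) > tolerance

# Hypothetical hourly temperatures (km coordinates, deg C).
neighbours = [(0.0, 5.0, 18.2), (4.0, 0.0, 18.9), (3.0, 3.0, 18.5)]
est = idw_estimate((1.0, 1.0), neighbours)
print(flag_suspect(25.0, est, tolerance=3.0))  # True: 25.0 is far from neighbours
```

A cross-validation variant of the same idea, leaving each station out in turn, yields the fallback estimates mentioned in the abstract for filling missing data.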
Implementing home care in Canada: four critical elements.
Richardson, B
2000-01-01
While MacAdam proposes a "national approach to home care," the obstacles to this are well known and substantial. They are the likely cost and the limitations of the federal government's role in healthcare. Building on MacAdam's assessment, this paper outlines four problems embedded in the various home-care service delivery models in Canada: the lack of factual client outcome information to support decision-making, the limited client choice of provider, the perverse incentive of fee for service and the bias against the for-profit provider. The paper proposes that the assessment, classification and measurement of outcomes for every recipient of home-care services be standardized using a proven assessment instrument, such as OASIS-B or MDS-HC, by healthcare professionals certified in its use. The resulting information would be captured in a regional database and available for analysis and research. CIHI would be contracted to manage a national database and to fund the training and certification of assessors. The paper proposes a new service delivery and funding model, utilizing standard client outcome information, different roles for regional health authorities and service providers, and a prospective payment mechanism replacing fee for service. A national home care program may be an elusive dream, but that shouldn't stop experimentation, evaluation and improvement.
Seliske, Laura; Pickett, William; Bates, Rebecca; Janssen, Ian
2012-01-01
Many studies examining the food retail environment rely on geographic information system (GIS) databases for location information. The purpose of this study was to validate information provided by two GIS databases, comparing the positional accuracy of food service places within a 1 km circular buffer surrounding 34 schools in Ontario, Canada. A commercial database (InfoCanada) and an online database (Yellow Pages) provided the addresses of food service places. Actual locations were measured using a global positioning system (GPS) device. The InfoCanada and Yellow Pages GIS databases provided the locations for 973 and 675 food service places, respectively. Overall, 749 (77.1%) and 595 (88.2%) of these were located in the field. The online database had a higher proportion of food service places found in the field. The GIS locations of 25% of the food service places were located within approximately 15 m of their actual location, 50% were within 25 m, and 75% were within 50 m. This validation study provided a detailed assessment of errors in the measurement of the location of food service places in the two databases. The location information was more accurate for the online database, however, when matching criteria were more conservative, there were no observed differences in error between the databases. PMID:23066385
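The positional-error measurement underlying this kind of validation can be sketched as a great-circle distance between a database-listed coordinate and a GPS ground-truth fix (the coordinates below are hypothetical, not taken from the study):

```python
# Positional error between a database-listed location and a GPS fix,
# via the haversine great-circle distance. Coordinates are invented.
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points (degrees)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

# Hypothetical: database coordinates vs GPS fix for one food service place.
db_lat, db_lon = 44.2312, -76.4860
gps_lat, gps_lon = 44.2314, -76.4857
error_m = haversine_m(db_lat, db_lon, gps_lat, gps_lon)
print(f"positional error: {error_m:.1f} m")
```

Collecting such per-place errors and taking their 25th, 50th and 75th percentiles would reproduce the kind of summary the study reports (errors within roughly 15 m, 25 m and 50 m).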
Brenn, B Randall; Choudhry, Dinesh K; Sacks, Karen; Como-Fluehr, Sandra; Strain, Robert
2016-09-01
Despite increased focus on pediatric pain, uncontrolled pain is still a problem for hospitalized pediatric inpatients. A program was designed to find patients with uncontrolled pain and develop a framework to oversee their pain management. This report details the development of a pain stewardship program with data from the first year of its activity. Hospitalized inpatients in a tertiary care pediatric center in the mid-Atlantic region were included in the study. Pain scores are recorded every 4 hours in the hospital electronic health record. A report was constructed to find all patients with an average pain score ≥7 in the preceding 12 hours. The charts of these patients were reviewed by our anesthesia pain service, and all patients were grouped into 1 of the following action categories: (1) no action required; (2) telephone call to the patient's attending physician; (3) one-time consultation; (4) consultation with ongoing management; or (5) patient was already on the anesthesia pain service. Demographic data, pain regimens, and outcomes were recorded in a prospectively collected database. There were 843 records on 441 unique patients. Only 22% required action to be taken by the anesthesia pain service. The pain stewardship database revealed that patients with sickle cell disease or abdominal pain required more frequent attention. An electronic health record-based pain stewardship program is an important step in identifying all children in the hospital with undermanaged pain, and it provides a warning system that may improve patient care, outcomes, and satisfaction. Copyright © 2016 by the American Academy of Pediatrics.
EPA Facility Registry Service (FRS): CAMDBS
This web feature service contains location and facility identification information from EPA's Facility Registry Service (FRS) for the subset of facilities that link to the Clean Air Markets Division Business System (CAMDBS). Administered by the EPA Clean Air Markets Division, within the Office of Air and Radiation, CAMDBS supports the implementation of market-based air pollution control programs, including the Acid Rain Program and regional programs designed to reduce the transport of ozone. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using rigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities. This data set contains the subset of FRS integrated facilities that link to CAMDBS facilities once the CAMDBS data has been integrated into the FRS database. Additional information on FRS is available at the EPA website https://www.epa.gov/enviro/facility-registry-service-frs.
Barbara, Angela M; Dobbins, Maureen; Brian Haynes, R; Iorio, Alfonso; Lavis, John N; Raina, Parminder; Levinson, Anthony J
2017-07-11
The objective of this work was to provide easy access to reliable health information based on good quality research that will help health care professionals to learn what works best for seniors to stay as healthy as possible, manage health conditions and build supportive health systems. This will help meet the demands of our aging population that clinicians provide high quality care for older adults, that public health professionals deliver disease prevention and health promotion strategies across the life span, and that policymakers address the economic and social need to create a robust health system and a healthy society for all ages. The McMaster Optimal Aging Portal's (Portal) professional bibliographic database contains high quality scientific evidence about optimal aging specifically targeted to clinicians, public health professionals and policymakers. The database content comes from three information services: McMaster Premium LiteratUre Service (MacPLUS™), Health Evidence™ and Health Systems Evidence. The Portal is continually updated, freely accessible online, easily searchable, and provides email-based alerts when new records are added. The database is being continually assessed for value, usability and use. A number of improvements are planned, including French language translation of content, increased linkages between related records within the Portal database, and inclusion of additional types of content. While this article focuses on the professional database, the Portal also houses resources for patients, caregivers and the general public, which may also be of interest to geriatric practitioners and researchers.
Nuclear science abstracts (NSA) database 1948--1974 (on the Internet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
Nuclear Science Abstracts (NSA) is a comprehensive abstract and index collection of the International Nuclear Science and Technology literature for the period 1948 through 1976. Included are scientific and technical reports of the US Atomic Energy Commission, US Energy Research and Development Administration and its contractors, other agencies, universities, and industrial and research organizations. Coverage of the literature since 1976 is provided by the Energy Science and Technology Database. Approximately 25% of the records in the file contain abstracts. These are from the following volumes of the print Nuclear Science Abstracts: Volumes 12--18, Volume 29, and Volume 33. The database contains over 900,000 bibliographic records. All aspects of nuclear science and technology are covered, including: Biomedical Sciences; Metals, Ceramics, and Other Materials; Chemistry; Nuclear Materials and Waste Management; Environmental and Earth Sciences; Particle Accelerators; Engineering; Physics; Fusion Energy; Radiation Effects; Instrumentation; Reactor Technology; Isotope and Radiation Source Technology. The database includes all records contained in Volume 1 (1948) through Volume 33 (1976) of the printed version of Nuclear Science Abstracts (NSA). This worldwide coverage includes books, conference proceedings, papers, patents, dissertations, engineering drawings, and journal literature. This database is now available for searching through the GOV.Research Center (GRC) service. GRC is a single online web-based search service to well known Government databases. Featuring powerful search and retrieval software, GRC is an important research tool. The GRC web site is at http://grc.ntis.gov.
A Content Markup Language for Data Services
NASA Astrophysics Data System (ADS)
Noviello, C.; Acampa, P.; Mango Furnari, M.
Network content delivery and document sharing are possible using a variety of technologies, such as distributed databases, service-oriented applications, and so forth. The development of such systems is a complex job, because the document life cycle involves strong cooperation between domain experts and software developers. Furthermore, emerging software methodologies, such as service-oriented architecture and knowledge organization (e.g., the semantic web), have not really solved the problems faced in a real distributed and cooperating setting. In this chapter the authors' efforts to design and deploy a distributed and cooperating content management system are described. The main features of the system are a user-configurable document type definition and a management middleware layer, which allows CMS developers to orchestrate the composition of specialized software components around the structure of a document. The chapter also reports some of the experience gained in deploying the developed framework in a cultural heritage dissemination setting.
A cyber infrastructure for the SKA Telescope Manager
NASA Astrophysics Data System (ADS)
Barbosa, Domingos; Barraca, João. P.; Carvalho, Bruno; Maia, Dalmiro; Gupta, Yashwant; Natarajan, Swaminathan; Le Roux, Gerhard; Swart, Paul
2016-07-01
The Square Kilometre Array Telescope Manager (SKA TM) will be responsible for assisting the SKA Operations and Observation Management, carrying out system diagnosis and collecting Monitoring and Control data from the SKA subsystems and components. To provide adequate compute resources, scalability, operation continuity and high availability, as well as strict Quality of Service, the TM cyber-infrastructure (embodied in the Local Infrastructure - LINFRA) consists of COTS hardware and infrastructural software (for example: server monitoring software, host operating system, virtualization software, device firmware), providing a specially tailored Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) solution. The TM infrastructure provides services in the form of computational power, software defined networking, power, storage abstractions, and high level, state of the art IaaS and PaaS management interfaces. This cyber platform will be tailored to each of the two SKA Phase 1 telescope instances (SKA_MID in South Africa and SKA_LOW in Australia), each presenting different computational and storage infrastructures and conditioned by location. This cyber platform will provide a compute model enabling TM to manage the deployment and execution of its multiple components (observation scheduler, proposal submission tools, M&C components, forensic tools, several databases, etc.). In this sense, the TM LINFRA is primarily focused towards the provision of isolated instances, mostly resorting to virtualization technologies, while defaulting to bare hardware if specifically required due to performance, security, availability, or other requirements.
Bell, Janice F; Krupski, Antoinette; Joesch, Jutta M; West, Imara I; Atkins, David C; Court, Beverly; Mancuso, David; Roy-Byrne, Peter
2015-06-01
To evaluate outcomes of a registered nurse-led care management intervention for disabled Medicaid beneficiaries with high health care costs. Washington State Department of Social and Health Services Client Outcomes Database, 2008-2011. In a randomized controlled trial with intent-to-treat analysis, outcomes were compared for the intervention (n = 557) and control groups (n = 563). A quasi-experimental subanalysis compared outcomes for program participants (n = 251) and propensity score-matched controls (n = 251). Administrative data were linked to describe costs and use of health services, criminal activity, homelessness, and death. In the intent-to-treat analysis, the intervention group had higher odds of outpatient mental health service use and higher prescription drug costs than controls in the postperiod. In the subanalysis, participants had fewer unplanned hospital admissions and lower associated costs; higher prescription drug costs; higher odds of long-term care service use; higher drug/alcohol treatment costs; and lower odds of homelessness. We found no health care cost savings for disabled Medicaid beneficiaries randomized to intensive care management. Among participants, care management may have the potential to increase access to needed care, slow growth in the number and therefore cost of unplanned hospitalizations, and prevent homelessness. These findings apply to start-up care management programs targeted at high-cost, high-risk Medicaid populations. © Health Research and Educational Trust.
NESDIS OSPO Data Access Policy and CRM
NASA Astrophysics Data System (ADS)
Seybold, M. G.; Donoho, N. A.; McNamara, D.; Paquette, J.; Renkevens, T.
2012-12-01
The Office of Satellite and Product Operations (OSPO) is the NESDIS office responsible for satellite operations, product generation, and product distribution. Access to and distribution of OSPO data was formally established in a Data Access Policy dated February, 2011. An extension of the data access policy is the OSPO Customer Relationship Management (CRM) Database, which has been in development since 2008 and is reaching a critical level of maturity. This presentation will provide a summary of the data access policy and standard operating procedure (SOP) for handling data access requests. The tangential CRM database will be highlighted including the incident tracking system, reporting and notification capabilities, and the first comprehensive portfolio of NESDIS satellites, instruments, servers, applications, products, user organizations, and user contacts. Select examples of CRM data exploitation will show how OSPO is utilizing the CRM database to more closely satisfy the user community's satellite data needs with new product promotions, as well as new data and imagery distribution methods in OSPO's Environmental Satellite Processing Center (ESPC). In addition, user services and outreach initiatives from the Satellite Products and Services Division will be highlighted.
3D Partition-Based Clustering for Supply Chain Data Management
NASA Astrophysics Data System (ADS)
Suhaibah, A.; Uznir, U.; Anton, F.; Mioc, D.; Rahman, A. A.
2015-10-01
Supply Chain Management (SCM) is the management of the flow of products and goods from their point of origin to the point of consumption. During the SCM process, the information and datasets gathered for this application are massive and complex. This is due to its several processes, such as procurement, product development and commercialization, physical distribution, outsourcing and partnerships. For practical application, SCM datasets need to be managed and maintained to provide better service to its three main categories: distributors, customers and suppliers. To manage these datasets, a data constellation structure is used to accommodate the data in a spatial database. However, this situation creates a few problems in the geospatial database; for example, database performance deteriorates, especially during query operations. We strongly believe that a more practical hierarchical tree structure is required for efficient SCM processing. Moreover, a three-dimensional approach is required for the management of SCM datasets, since they involve multi-level locations such as shop lots and residential apartments. The 3D R-Tree has been increasingly used for 3D geospatial database management due to its simplicity and extendibility. However, it suffers from serious overlaps between nodes. In this paper, we propose partition-based clustering for the construction of a hierarchical tree structure. Several datasets are tested using the proposed method, and the percentage of overlapping nodes and the volume coverage are computed and compared with the original 3D R-Tree and other practical approaches. The experiments presented in this paper substantiate that the hierarchical structure produced by the proposed partition-based clustering is capable of preserving minimal overlap and coverage. Query performance was tested using 300,000 points of an SCM dataset and the results are presented in this paper, which also discusses the outlook of the structure for future reference.
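The node-overlap measure at issue here can be illustrated for axis-aligned 3D bounding boxes (the boxes below are hypothetical, not from the paper's datasets): the shared volume of two tree nodes is the product of the per-axis overlaps.

```python
# Overlap volume of two axis-aligned 3D bounding boxes, represented as
# ((xmin, ymin, zmin), (xmax, ymax, zmax)). Large overlap between R-Tree
# nodes forces queries to descend multiple subtrees, degrading performance.
def overlap_volume(box_a, box_b):
    (a_lo, a_hi), (b_lo, b_hi) = box_a, box_b
    vol = 1.0
    for axis in range(3):
        lo = max(a_lo[axis], b_lo[axis])
        hi = min(a_hi[axis], b_hi[axis])
        if hi <= lo:
            return 0.0  # disjoint on this axis -> no shared volume
        vol *= hi - lo
    return vol

# Two hypothetical node bounding boxes (e.g. adjacent blocks of shop lots).
node_a = ((0.0, 0.0, 0.0), (10.0, 10.0, 3.0))
node_b = ((8.0, 8.0, 0.0), (18.0, 18.0, 3.0))
print(overlap_volume(node_a, node_b))  # 2 * 2 * 3 = 12.0
```

A partition-based clustering of the input points aims to produce node boxes for which this quantity, summed over node pairs, stays near zero.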
Web-services-based spatial decision support system to facilitate nuclear waste siting
NASA Astrophysics Data System (ADS)
Huang, L. Xinglai; Sheng, Grant
2006-10-01
The availability of spatial web services enables data sharing among managers, decision and policy makers and other stakeholders in much simpler ways than before and subsequently has created completely new opportunities in the process of spatial decision making. Though generally designed for a certain problem domain, web-services-based spatial decision support systems (WSDSS) can provide a flexible problem-solving environment to explore the decision problem, understand and refine problem definition, and generate and evaluate multiple alternatives for decision. This paper presents a new framework for the development of a web-services-based spatial decision support system. The WSDSS is comprised of distributed web services that either have their own functions or provide different geospatial data and may reside in different computers and locations. WSDSS includes six key components, namely: database management system, catalog, analysis functions and models, GIS viewers and editors, report generators, and graphical user interfaces. In this study, the architecture of a web-services-based spatial decision support system to facilitate nuclear waste siting is described as an example. The theoretical, conceptual and methodological challenges and issues associated with developing web services-based spatial decision support system are described.
Backhouse, Amy; Richards, David A; McCabe, Rose; Watkins, Ross; Dickens, Chris
2017-11-22
Interventions aiming to coordinate services for the community-based dementia population vary in components, organisation and implementation. In this review we aimed to investigate the views of stakeholders on the key components of community-based interventions coordinating care in dementia. We searched four databases from inception to June 2015; Medline, The Cochrane Library, EMBASE and PsycINFO, this was aided by a search of four grey literature databases, and backward and forward citation tracking of included papers. Title and abstract screening was followed by a full text screen by two independent reviewers, and quality was assessed using the CASP appraisal tool. We then conducted thematic synthesis on extracted data. A total of seven papers from five independent studies were included in the review, and encompassed the views of over 100 participants from three countries. Through thematic synthesis we identified 32 initial codes that were grouped into 5 second-order themes: (1) case manager had four associated codes and described preferences for the case manager personal and professional attributes, including a sound knowledge in dementia and availability of local services; (2) communication had five associated codes and emphasized the importance stakeholders placed on multichannel communication with service users, as well as between multidisciplinary teams and across organisations; (3) intervention had 11 associated codes which focused primarily on the practicalities of implementation such as the contact type and frequency between case managers and service users, and the importance of case manager training and service evaluation; (4) resources had five associated codes which outlined stakeholder views on the required resources for coordinating interventions and potential overlap with existing resources, as well as arising issues when available resources do not meet those required for successful implementation; and (5) support had seven associated codes that 
reflect the importance that was placed on the support network around the case manager and the investment of professionals involved directly in care as well as the wider professional network. The synthesis of relevant qualitative studies has shown how various stakeholder groups considered dementia care coordination interventions to be acceptable, useful and appropriate for dementia care, and have clear preferences for components, implementation methods and settings of these interventions. By incorporating stakeholders' perspectives and preferences when planning and developing coordinating interventions we may increase the likelihood of successful implementation and patient benefits.
The role of complaint management in the service recovery process.
Bendall-Lyon, D; Powers, T L
2001-05-01
Patient satisfaction and retention can be influenced by the development of an effective service recovery program that can identify complaints and remedy failure points in the service system. Patient complaints provide organizations with an opportunity to resolve unsatisfactory situations and to track complaint data for quality improvement purposes. Service recovery is an important and effective customer retention tool. One way an organization can ensure repeat business is by developing a strong customer service program that includes service recovery as an essential component. The concept of service recovery involves the service provider taking responsive action to "recover" lost or dissatisfied customers and convert them into satisfied customers. Service recovery has proven to be cost-effective in other service industries. The complaint management process involves six steps that organizations can use to influence effective service recovery: (1) encourage complaints as a quality improvement tool; (2) establish a team of representatives to handle complaints; (3) resolve customer problems quickly and effectively; (4) develop a complaint database; (5) commit to identifying failure points in the service system; and (6) track trends and use information to improve service processes. Customer retention is enhanced when an organization can reclaim disgruntled patients through the development of effective service recovery programs. Health care organizations can become more customer oriented by taking advantage of the information provided by patient complaints, increasing patient satisfaction and retention in the process.
A proposed group management scheme for XTP multicast
NASA Technical Reports Server (NTRS)
Dempsey, Bert J.; Weaver, Alfred C.
1990-01-01
The purpose of a group management scheme is to enable its associated transfer layer protocol to be responsive to user determined reliability requirements for multicasting. Group management (GM) must assist the client process in coordinating multicast group membership, allow the user to express the subset of the multicast group that a particular multicast distribution must reach in order to be successful (reliable), and provide the transfer layer protocol with the group membership information necessary to guarantee delivery to this subset. GM provides services and mechanisms that respond to the need of the client process or process level management protocols to coordinate, modify, and determine attributes of the multicast group, especially membership. XTP GM provides a link between process groups and their multicast groups by maintaining a group membership database that identifies members in a name space understood by the underlying transfer layer protocol. Other attributes of the multicast group useful to both the client process and the data transfer protocol may be stored in the database. Examples include the relative dispersion, most recent update, and default delivery parameters of a group.
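A minimal sketch of this group-membership idea follows (the class and method names are illustrative only, not the XTP GM specification): GM tracks the members of a multicast group together with the user-selected subset that a distribution must reach for the transfer to count as reliable.

```python
# Illustrative group-membership database: the transfer layer consults the
# `required` subset to decide whether a multicast delivery was reliable.
class GroupMembership:
    def __init__(self):
        self.members = set()
        self.required = set()  # subset that must receive each multicast

    def join(self, member, required=False):
        self.members.add(member)
        if required:
            self.required.add(member)

    def leave(self, member):
        self.members.discard(member)
        self.required.discard(member)

    def delivery_successful(self, acked):
        """Reliable iff every required member acknowledged delivery."""
        return self.required <= set(acked)

gm = GroupMembership()
gm.join("node-a", required=True)
gm.join("node-b", required=True)
gm.join("node-c")  # best-effort member
print(gm.delivery_successful({"node-a", "node-b"}))  # True: node-c not required
print(gm.delivery_successful({"node-a", "node-c"}))  # False: node-b missed
```

Other per-group attributes mentioned in the abstract (dispersion, most recent update, default delivery parameters) would be additional fields alongside the membership sets.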
The Virtual Watershed Observatory: Cyberinfrastructure for Model-Data Integration and Access
NASA Astrophysics Data System (ADS)
Duffy, C.; Leonard, L. N.; Giles, L.; Bhatt, G.; Yu, X.
2011-12-01
The Virtual Watershed Observatory (VWO) is a concept where scientists, water managers, educators and the general public can create a virtual observatory from integrated hydrologic model results, national databases and historical or real-time observations via web services. In this paper, we propose a prototype for automated and virtualized web services software using national data products for climate reanalysis, soils, geology, terrain and land cover. The VWO has the broad purpose of making accessible water resource simulations, real-time data assimilation, calibration and archival at the scale of HUC 12 watersheds (Hydrologic Unit Code) anywhere in the continental US. Our prototype for model-data integration focuses on creating tools for fast data storage from selected national databases, as well as the computational resources necessary for a dynamic, distributed watershed simulation. The paper describes cyberinfrastructure tools and a workflow that attempt to resolve the problem of model-data accessibility and scalability such that individuals, research teams, managers and educators can create a VWO in a desired context. Examples are given for the NSF-funded Shale Hills Critical Zone Observatory and the European Critical Zone Observatories within the SoilTrEC project. In the future, implementation of VWO services will benefit from the development of a cloud cyberinfrastructure as the prototype evolves to data- and model-intensive computation for continental-scale water resource predictions.
A Grid Metadata Service for Earth and Environmental Sciences
NASA Astrophysics Data System (ADS)
Fiore, Sandro; Negro, Alessandro; Aloisio, Giovanni
2010-05-01
Critical challenges for climate modeling researchers are strongly connected with the increasingly complex simulation models and the huge quantities of produced datasets. Future trends in climate modeling will only increase computational and storage requirements. For this reason the ability to transparently access both computational and data resources for large-scale complex climate simulations must be considered a key requirement for Earth Science and Environmental distributed systems. From the data management perspective: (i) the quantity of data will continuously increase; (ii) data will become more and more distributed and widespread; (iii) data sharing/federation will represent a key challenging issue among different sites distributed worldwide; (iv) the potential community of users (large and heterogeneous) will be interested in discovering experimental results, searching metadata, browsing collections of files, comparing different results, displaying output, etc. A key element for carrying out data search and discovery, and for managing and accessing huge and distributed amounts of data, is the metadata handling framework. What we propose for the management of distributed datasets is the GRelC service (a data grid solution focusing on metadata management). Unlike classical approaches, the proposed data-grid solution is able to address scalability, transparency, security, efficiency and interoperability. The GRelC service we propose is able to provide access to metadata stored in different and widespread data sources (relational databases running on top of MySQL, Oracle, DB2, etc., leveraging SQL as the query language, as well as XML databases - XIndice, eXist, and libxml2-based documents, adopting either XPath or XQuery), providing a strong data virtualization layer in a grid environment.
This technological solution for distributed metadata management (i) leverages well-known, widely adopted standards (W3C, OASIS, etc.); (ii) supports role-based management (based on VOMS), which increases flexibility and scalability; (iii) provides full support for the Grid Security Infrastructure (authorization, mutual authentication, data integrity, data confidentiality and delegation); (iv) is compatible with existing grid middleware such as gLite and Globus; and finally (v) is currently adopted at the Euro-Mediterranean Centre for Climate Change (CMCC - Italy) to manage the entire CMCC data production activity, as well as in the international Climate-G testbed.
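The XPath-style discovery over XML metadata sources that GRelC supports can be sketched as follows. This is a minimal illustration using Python's standard-library XPath subset; the catalogue schema, element names and dataset ids are invented for the example, not the actual GRelC metadata format.

```python
import xml.etree.ElementTree as ET

# Hypothetical metadata catalogue for two climate datasets; the element
# names are illustrative assumptions, not the real GRelC schema.
CATALOGUE = """
<catalogue>
  <dataset id="cmcc-001">
    <variable>tas</variable>
    <frequency>monthly</frequency>
    <site>CMCC</site>
  </dataset>
  <dataset id="cmcc-002">
    <variable>pr</variable>
    <frequency>daily</frequency>
    <site>CMCC</site>
  </dataset>
</catalogue>
"""

def find_datasets(xml_text, variable):
    """Return ids of datasets providing a given variable.

    ElementTree supports a limited XPath dialect, including a predicate
    on a child element's text content, which is enough for this query.
    """
    root = ET.fromstring(xml_text)
    hits = root.findall(f".//dataset[variable='{variable}']")
    return [d.get("id") for d in hits]

print(find_datasets(CATALOGUE, "tas"))  # ['cmcc-001']
```

A full metadata service would layer authorization and virtualization over many such sources; the query pattern itself stays this simple.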
2005-01-01
Gene expression databases contain a wealth of information, but current data mining tools are limited in their speed and effectiveness in extracting meaningful biological knowledge from them. Online analytical processing (OLAP) can be used as a supplement to cluster analysis for fast and effective data mining of gene expression databases. We used Analysis Services 2000, a product that ships with SQL Server 2000, to construct an OLAP cube that was used to mine a time series experiment designed to identify genes associated with resistance of soybean to the soybean cyst nematode, a devastating pest of soybean. The data for these experiments are stored in the soybean genomics and microarray database (SGMD). A number of candidate resistance genes and pathways were found. Compared to traditional cluster analysis of gene expression data, OLAP was more effective and faster in finding biologically meaningful information. OLAP is available from a number of vendors and can work with any relational database management system through OLE DB. PMID:16046824
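The kind of slice-and-aggregate query an OLAP cube answers over an expression fact table can be sketched in plain SQL. This is a toy illustration via SQLite, not the Analysis Services 2000 setup the authors used; the table, column names and expression values are invented.

```python
import sqlite3

# A minimal OLAP-style "dice" over a gene-expression fact table:
# dimensions (gene, time, treatment) and a measure (log ratio).
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE expression (
    gene TEXT, hours INTEGER, treatment TEXT, log_ratio REAL)""")
rows = [
    ("G1", 6,  "infected",  1.25), ("G1", 24, "infected",  2.25),
    ("G1", 6,  "control",   0.25), ("G1", 24, "control",   0.0),
    ("G2", 6,  "infected", -0.5),  ("G2", 24, "infected", -1.25),
]
con.executemany("INSERT INTO expression VALUES (?,?,?,?)", rows)

# Slice to infected tissue, roll up over time, keep induced genes only.
cur = con.execute("""
    SELECT gene, AVG(log_ratio) AS mean_lr
    FROM expression
    WHERE treatment = 'infected'
    GROUP BY gene
    HAVING AVG(log_ratio) > 1.0
    ORDER BY mean_lr DESC""")
print(cur.fetchall())  # [('G1', 1.75)]
```

A real cube precomputes such aggregates along every dimension hierarchy, which is what makes the interactive mining fast.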
Total quality management: It works for aerospace information services
NASA Technical Reports Server (NTRS)
Erwin, James; Eberline, Carl; Colquitt, Wanda
1993-01-01
Today we are in the midst of information and 'total quality' revolutions. At the NASA STI Program's Center for AeroSpace Information (CASI), we are focused on using continuous improvement techniques to enrich today's services and products and to ensure that tomorrow's technology supports the TQM-based improvement of future STI program products and services. The Continuous Improvements Program at CASI is the foundation for Total Quality Management in products and services. The focus is customer-driven; its goal, to identify processes and procedures that can be improved and new technologies that can be integrated with the processes to gain efficiencies, provide effectiveness, and promote customer satisfaction. This Program seeks to establish quality through an iterative defect prevention approach that is based on the incorporation of standards and measurements into the processing cycle. Four projects are described that utilize cross-functional, problem-solving teams for identifying requirements and defining tasks and task standards, management participation, attention to critical processes, and measurable long-term goals. The implementation of these projects provides the customer with measurably improved access to information that is provided through several channels: the NASA STI Database, document requests for microfiche and hardcopy, and the Centralized Help Desk.
NASA Technical Reports Server (NTRS)
Hu, Chaumin
2007-01-01
IPG Execution Service is a framework that reliably executes complex jobs on a computational grid, and is part of the IPG service architecture designed to support location-independent computing. The new grid service enables users to describe the platform on which they need a job to run, which allows the service to locate the desired platform, configure it for the required application, and execute the job. After a job is submitted, users can monitor it through periodic notifications, or through queries. Each job consists of a set of tasks that performs actions such as executing applications and managing data. Each task is executed based on a starting condition that is an expression of the states of other tasks. This formulation allows tasks to be executed in parallel, and also allows a user to specify tasks to execute when other tasks succeed, fail, or are canceled. The two core components of the Execution Service are the Task Database, which stores tasks that have been submitted for execution, and the Task Manager, which executes tasks in the proper order, based on the user-specified starting conditions, and avoids overloading local and remote resources while executing tasks.
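The starting-condition mechanism described above (a task runs once an expression over other tasks' states is satisfied, allowing run-on-success, run-on-failure and parallel execution) can be sketched compactly. This is a minimal sequential sketch of the idea, with invented task names and a callable-based condition format; the actual Task Manager is a distributed grid service.

```python
# Dependency-driven execution in the spirit of the Task Manager: each task
# runs once its starting condition, evaluated over the other tasks' states
# ('pending'/'succeeded'/'failed'), becomes true.

def run_tasks(tasks):
    """tasks: name -> (condition, action); condition receives the state dict."""
    states = {name: "pending" for name in tasks}
    progressed = True
    while progressed:               # repeat until no task becomes runnable
        progressed = False
        for name, (condition, action) in tasks.items():
            if states[name] == "pending" and condition(states):
                try:
                    action()
                    states[name] = "succeeded"
                except Exception:
                    states[name] = "failed"
                progressed = True
    return states

log = []
tasks = {
    "stage_data": (lambda s: True, lambda: log.append("staged")),
    "run_model":  (lambda s: s["stage_data"] == "succeeded",
                   lambda: log.append("ran")),
    # cleanup runs whether the model succeeded or failed
    "cleanup":    (lambda s: s["run_model"] in ("succeeded", "failed"),
                   lambda: log.append("cleaned")),
}
print(run_tasks(tasks))
```

The "cleanup" task shows the feature the abstract highlights: a starting condition can trigger on failure as well as success, so recovery actions are just ordinary tasks.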
A Study To Increase Computer Applications in Social Work Management.
ERIC Educational Resources Information Center
Lucero, John A.
The purpose of this study was to address the use of computers in social work practice and to survey the field for tools, concepts, and trends that could assist social workers in their practice. In addition to a review of the relevant literature, information was requested from the Social Work Service and Ambulatory Care Database Section at Walter…
Interactive access to forest inventory data for the South Central United States
William H. McWilliams
1990-01-01
On-line access to USDA, Forest Service successive forest inventory data for the South Central United States is provided by two computer systems. The Easy Access to Forest Inventory and Analysis Tables program (EZTAB) produces a set of tables for specific geographic areas. The Interactive Graphics and Retrieval System (INGRES) is a database management system that...
Chris Ringo; Alan A. Ager; Michelle A. Day; Sarah Crim
2016-01-01
Understanding the capacity to reduce wildfire risk and restore dry forests on Western national forests is a key part of prioritizing new accelerated restoration programs initiated by the Forest Service. Although a number of social and biophysical factors influence the ability to implement restoration programs, one key driver is the suite of forest plan land...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steiner, C.K.; Causley, M.C.; Yocke, M.A.
1994-04-01
The 1990 Clean Air Act Amendments require the Minerals Management Service (MMS) to conduct a research study to assess the potential onshore air quality impact from the development of outer continental shelf (OCS) petroleum resources in the Gulf of Mexico. The need for this study arises from concern about the cumulative impacts of current and future OCS emissions on ozone concentrations in nonattainment areas, particularly in Texas and Louisiana. To make quantitative assessments of these impacts, MMS has commissioned an air quality study which includes as a major component the development of a comprehensive emission inventory for photochemical grid modeling. The emission inventories prepared in this study include both onshore and offshore emissions. All relevant emissions from anthropogenic and biogenic sources are considered, with special attention focused on offshore anthropogenic sources, including OCS oil and gas production facilities, crew and supply vessels and helicopters serving OCS facilities, commercial shipping and fishing, recreational boating, intercoastal barge traffic and other sources located in the adjacent state waters. This document describes the database created during this study that contains the activity information collected for the development of the OCS platform, and crew/supply vessel and helicopter emission inventories.
Horvath, E.A.; Fosnight, E.A.; Klingebiel, A.A.; Moore, D.G.; Stone, J.E.; Reybold, W.U.; Petersen, G.W.
1987-01-01
A methodology has been developed to create a spatial database by referencing digital elevation, Landsat multispectral scanner data, and digitized soil premap delineations of a number of adjacent 7.5-min quadrangle areas to a 30-m Universal Transverse Mercator projection. Slope and aspect transformations are calculated from elevation data and grouped according to field office specifications. An unsupervised classification is performed on a brightness and greenness transformation of the spectral data. The resulting spectral, slope, and aspect maps of each of the 7.5-min quadrangle areas are then plotted and submitted to the field office to be incorporated into the soil premapping stages of a soil survey. A tabular database is created from spatial data by generating descriptive statistics for each data layer within each soil premap delineation. The tabular database is then entered into a database management system to be accessed by the field office personnel during the soil survey and to be used for subsequent resource management decisions. Large amounts of data are collected and archived during resource inventories for public land management. Often these data are stored as stacks of maps or folders in a file system in someone's office, with the maps in a variety of formats and scales, and with various standards of accuracy depending on their purpose. This system of information storage and retrieval is cumbersome at best when several categories of information are needed simultaneously for analysis or as input to resource management models. Computers now provide the resource scientist with the opportunity to design increasingly complex models that require even more categories of resource-related information, thus compounding the problem. Recently there has been much emphasis on the use of geographic information systems (GIS) as an alternative method for map data archives and as a resource management tool.
Considerable effort has been devoted to the generation of tabular databases, such as the U.S. Department of Agriculture's SCS/S015 (Soil Survey Staff, 1983), to archive the large amounts of information that are collected in conjunction with mapping of natural resources in an easily retrievable manner. During the past 4 years the U.S. Geological Survey's EROS Data Center, in a cooperative effort with the Bureau of Land Management (BLM) and the Soil Conservation Service (SCS), developed a procedure that uses spatial and tabular databases to generate elevation, slope, aspect, and spectral map products that can be used during soil premapping. The procedure results in tabular data, residing in a database management system, that are indexed to the final soil delineations and help quantify soil map unit composition. The procedure was developed and tested on soil surveys covering over 600 000 ha in Wyoming, Nevada, and Idaho. A transfer of technology from the EROS Data Center to the BLM will enable the Denver BLM Service Center to use this procedure in soil survey operations on BLM lands.
Also underway is a cooperative effort between the EROS Data Center and SCS to define and evaluate maps that can be produced as derivatives of digital elevation data for 7.5-min quadrangle areas, such as those used during the premapping stage of the soil surveys mentioned above, the idea being to make such products routinely available. The procedure emphasizes the applications of digital elevation and spectral data to order-three soil surveys on rangelands, and will: (1) incorporate digital terrain and spectral data into a spatial database for soil surveys; (2) provide hardcopy products (generated from digital elevation model and spectral data) that are useful during the soil premapping process; (3) incorporate soil premaps into a spatial database that can be accessed during the soil survey process along with terrain and spectral data; and (4) summarize useful quantitative information for soil mapping and for making interpretations for resource management.
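The slope and aspect transformations described above are typically computed from the elevation grid with finite differences. A minimal sketch, assuming a regular 30-m grid whose rows index northward (the survey's actual grouping rules and edge handling are not reproduced here):

```python
import math

def slope_aspect(dem, i, j, cell=30.0):
    """Slope (degrees) and aspect (degrees clockwise from north, pointing
    downhill) at an interior cell of an elevation grid, via central
    differences. Assumes columns index eastward and rows index northward."""
    dzdx = (dem[i][j + 1] - dem[i][j - 1]) / (2 * cell)  # east-west gradient
    dzdy = (dem[i + 1][j] - dem[i - 1][j]) / (2 * cell)  # north-south gradient
    slope = math.degrees(math.atan(math.hypot(dzdx, dzdy)))
    # Downhill direction is minus the gradient; convert to a compass azimuth.
    aspect = math.degrees(math.atan2(-dzdx, -dzdy)) % 360
    return slope, aspect

# A plane rising to the east: the cell faces west (aspect 270 degrees).
dem = [[10 * j for j in range(3)] for _ in range(3)]
print(slope_aspect(dem, 1, 1))  # slope ~18.4 degrees, aspect 270 degrees
```

Grouping these values into the field office's slope and aspect classes is then a simple binning step over the grid.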
Analysis, requirements and development of a collaborative social and medical services data model.
Bobroff, R B; Petermann, C A; Beck, J R; Buffone, G J
1994-01-01
In any medical and social service setting, patient data must be readily shared among multiple providers for delivery of expeditious, quality care. This paper describes the development and implementation of a generalized social and medical services data model for an ambulatory population. The model, part of the Collaborative Social and Medical Services System Project, is based on the data needs of the Baylor College of Medicine Teen Health Clinics and follows the guidelines of the ANSI HISPP/MSDS JWG for a Common Data Model. Design details were determined by informal staff interviews, operational observations, and examination of clinic guidelines and forms. The social and medical services data model was designed using object-oriented data modeling techniques and will be implemented in C++ using an object-oriented database management system.
ERIC Educational Resources Information Center
Griffiths, Jose-Marie; And Others
This document contains validated activities and competencies needed by librarians working in a database distributor/service organization. The activities of professionals working in database distributor/service organizations are listed by function: Database Processing; Customer Support; System Administration; and Planning. The competencies are…
Nicholas, Lauren Hersch
2013-01-01
Do differences in rates of use among managed care and Fee-for-Service Medicare beneficiaries reflect selection bias or successful care management by insurers? I demonstrate a new method to estimate the treatment effect of insurance status on health care utilization. Using clinical information and risk-adjustment techniques on data on acute admissions that are unrelated to recent medical care, I create a proxy measure of unobserved health status. I find that positive selection accounts for between one-quarter and one-third of the risk-adjusted differences in rates of hospitalization for ambulatory care sensitive conditions and elective procedures among Medicare managed care and Fee-for-Service enrollees in 7 years of Healthcare Cost and Utilization Project State Inpatient Databases from Arizona, Florida, New Jersey and New York matched to Medicare enrollment data. Beyond selection effects, I find that managed care plans reduce rates of potentially preventable hospitalizations by 12.5 per 1,000 enrollees (compared to a mean of 46 per 1,000) and reduce annual rates of elective admissions by 4 per 1,000 enrollees (mean 18.6 per 1,000). PMID:24533012
Detection and Prevention of Insider Threats in Database Driven Web Services
NASA Astrophysics Data System (ADS)
Chumash, Tzvi; Yao, Danfeng
In this paper, we take the first step toward addressing the gap between the security needs of outsourced hosting services and the protection provided in current practice. We consider both insider and outsider attacks in third-party web hosting scenarios. We present SafeWS, a modular solution inserted between server-side scripts and databases in order to prevent and detect website hijacking and unauthorized access to stored data. To achieve the required security, SafeWS utilizes a combination of lightweight cryptographic integrity and encryption tools, software engineering techniques, and security data management principles. We also describe our implementation of SafeWS and its evaluation. The performance analysis of our prototype shows that the overhead introduced by security verification is small. SafeWS will allow business owners to significantly reduce the security risks and vulnerabilities of outsourcing their sensitive customer data to third-party providers.
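One lightweight cryptographic-integrity idea of the kind SafeWS relies on can be sketched with an HMAC tag bound to each stored row, so that tampering by a compromised host becomes detectable on read. The key handling, field layout and names below are illustrative assumptions, not the paper's actual design:

```python
import hmac
import hashlib

# Secret held by the data owner, not by the hosting provider; a real
# deployment would manage this key outside the database.
KEY = b"owner-held secret, never stored with the data"

def seal(row_id: str, payload: str) -> str:
    """Compute an integrity tag over the row id and its payload, so a tag
    cannot be replayed onto a different row."""
    msg = f"{row_id}|{payload}".encode()
    return hmac.new(KEY, msg, hashlib.sha256).hexdigest()

def verify(row_id: str, payload: str, tag: str) -> bool:
    """Constant-time check that the stored row still matches its tag."""
    return hmac.compare_digest(seal(row_id, payload), tag)

tag = seal("cust-42", "alice@example.com")
print(verify("cust-42", "alice@example.com", tag))    # True
print(verify("cust-42", "mallory@example.com", tag))  # False: row was altered
```

Verification on every read is cheap (one hash per row), which is consistent with the small overhead the authors report for their prototype.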
Autonomous mission planning and scheduling: Innovative, integrated, responsive
NASA Technical Reports Server (NTRS)
Sary, Charisse; Liu, Simon; Hull, Larry; Davis, Randy
1994-01-01
Autonomous mission scheduling, a new concept for NASA ground data systems, is a decentralized and distributed approach to scientific spacecraft planning, scheduling, and command management. Systems and services are provided that enable investigators to operate their own instruments. In autonomous mission scheduling, separate nodes exist for each instrument and one or more operations nodes exist for the spacecraft. Each node is responsible for its own operations which include planning, scheduling, and commanding; and for resolving conflicts with other nodes. One or more database servers accessible to all nodes enable each to share mission and science planning, scheduling, and commanding information. The architecture for autonomous mission scheduling is based upon a realistic mix of state-of-the-art and emerging technology and services, e.g., high performance individual workstations, high speed communications, client-server computing, and relational databases. The concept is particularly suited to the smaller, less complex missions of the future.
Active in-database processing to support ambient assisted living systems.
de Morais, Wagner O; Lundström, Jens; Wickström, Nicholas
2014-08-12
As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare.
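The active-database mechanism described above (a trigger reacting inside the DBMS to incoming sensor data) can be sketched in a few lines of SQL. This toy example uses SQLite from Python; the table names and the bed-exit rule are illustrative assumptions, and the authors' system ran on a full-featured DBMS with in-database machine learning rather than SQLite.

```python
import sqlite3

# An active-database sketch: a trigger fires inside the DBMS when a new
# reading indicates a bed-exit, so the raw data never leaves the database.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE readings (ts TEXT, sensor TEXT, state TEXT);
CREATE TABLE events   (ts TEXT, kind TEXT);
CREATE TRIGGER bed_exit AFTER INSERT ON readings
WHEN NEW.sensor = 'bed' AND NEW.state = 'vacated'
BEGIN
    INSERT INTO events VALUES (NEW.ts, 'bed-exit');
END;
""")
con.execute("INSERT INTO readings VALUES ('02:10', 'bed', 'occupied')")
con.execute("INSERT INTO readings VALUES ('02:14', 'bed', 'vacated')")
print(con.execute("SELECT * FROM events").fetchall())
```

Because detection happens in the trigger, no application-level polling process ever sees the sensitive readings, which is the privacy argument the paper makes.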
DOEDEF Software System, Version 2. 2: Operational instructions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meirans, L.
The DOEDEF (Department of Energy Data Exchange Format) Software System is a collection of software routines written to facilitate the manipulation of IGES (Initial Graphics Exchange Specification) data. Typically, the IGES data has been produced by the IGES processors for a Computer-Aided Design (CAD) system, and the data manipulations are user-defined ''flavoring'' operations. The DOEDEF Software System is used in conjunction with the RIM (Relational Information Management) DBMS from Boeing Computer Services (Version 7, UD18 or higher). The three major pieces of the software system are: the Parser, which reads an ASCII IGES file and converts it to the RIM database equivalent; the Kernel, which provides the user with IGES-oriented interface routines to the database; and the Filewriter, which writes the RIM database to an IGES file.
Review of telehealth stuttering management.
Lowe, Robyn; O'Brian, Sue; Onslow, Mark
2013-01-01
Telehealth is the use of communication technology to provide health care services by means other than typical in-clinic attendance models. Telehealth is increasingly used for the management of speech, language and communication disorders. The aim of this article is to review telehealth applications to stuttering management. We conducted a search of peer-reviewed literature for the past 20 years using the Institute for Scientific Information Web of Science database, PubMed: The Bibliographic Database and a search for articles by hand. Outcomes for telehealth stuttering treatment were generally positive, but there may be a compromise of treatment efficiency with telehealth treatment of young children. Our search found no studies dealing with stuttering assessment procedures using telehealth models. No economic analyses of this delivery model have been reported. This review highlights the need for continued research about telehealth for stuttering management. Evidence from research is needed to inform the efficacy of assessment procedures using telehealth methods as well as guide the development of improved treatment procedures. Clinical and technical guidelines are urgently needed to ensure that the evolving and continued use of telehealth to manage stuttering does not compromise the standards of care afforded with standard in-clinic models.
Research on spatio-temporal database techniques for spatial information service
NASA Astrophysics Data System (ADS)
Zhao, Rong; Wang, Liang; Li, Yuxiang; Fan, Rongshuang; Liu, Ping; Li, Qingyuan
2007-06-01
Geographic data should be described by spatial, temporal and attribute components, but spatio-temporal queries are difficult to answer within current GIS. This paper describes research into the development and application of a spatio-temporal data management system based on the GeoWindows GIS software platform developed by the Chinese Academy of Surveying and Mapping (CASM). Facing the current practical requirements of spatial information applications, and building on the existing GIS platform, we first established a spatio-temporal data model that integrates vector and grid data. Second, we solved the key problem of building temporal data topology and developed a suite of spatio-temporal database management tools using object-oriented methods. The system provides temporal data collection, storage, management, display and query functions. Finally, as a case study, we explored the application of the spatio-temporal data management system using administrative region data from multiple historical periods of China as the base data. With these efforts, the capacity of GIS to manage and manipulate the temporal and attribute aspects of data has been enhanced, and a technical reference has been provided for the further development of temporal geographic information systems (TGIS).
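The basic temporal query such a system must answer (which version of each feature was valid at a given date) can be sketched with validity intervals. The region records and attribute values below are invented for illustration; a real TGIS stores geometry and topology alongside.

```python
from datetime import date

# Each record: (feature, valid_from, valid_to, version attribute).
# An open-ended validity is represented by a far-future sentinel date.
regions = [
    ("Region-A", date(1950, 1, 1), date(1970, 6, 30),  "boundary v1"),
    ("Region-A", date(1970, 7, 1), date(9999, 12, 31), "boundary v2"),
    ("Region-B", date(1960, 1, 1), date(9999, 12, 31), "boundary v1"),
]

def valid_at(when):
    """Temporal slice: the version of each region in force on `when`."""
    return {name: version for name, start, end, version in regions
            if start <= when <= end}

print(valid_at(date(1965, 5, 1)))
# {'Region-A': 'boundary v1', 'Region-B': 'boundary v1'}
```

Indexing these intervals (rather than scanning, as here) is what the temporal data topology mentioned in the abstract provides.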
Collaborative Resource Allocation
NASA Technical Reports Server (NTRS)
Wang, Yeou-Fang; Wax, Allan; Lam, Raymond; Baldwin, John; Borden, Chester
2007-01-01
Collaborative Resource Allocation Networking Environment (CRANE) Version 0.5 is a prototype created to prove the newest concept of using a distributed environment to schedule Deep Space Network (DSN) antenna times in a collaborative fashion. This program is for all space-flight and terrestrial science project users and DSN schedulers to perform scheduling activities and conflict resolution, both synchronously and asynchronously. Project schedulers can, for the first time, participate directly in scheduling their tracking times into the official DSN schedule, and negotiate directly with other projects in an integrated scheduling system. A master schedule covers long-range, mid-range, near-real-time, and real-time scheduling time frames all in one, rather than the current method of separate functions that are supported by different processes and tools. CRANE also provides private workspaces (both dynamic and static), data sharing, scenario management, user control, rapid messaging (based on Java Message Service), data/time synchronization, workflow management, notification (including emails), conflict checking, and a linkage to a schedule generation engine. The data structure with corresponding database design combines object trees with multiple associated mortal instances and relational database to provide unprecedented traceability and simplify the existing DSN XML schedule representation. These technologies are used to provide traceability, schedule negotiation, conflict resolution, and load forecasting from real-time operations to long-range loading analysis up to 20 years in the future. CRANE includes a database, a stored procedure layer, an agent-based middle tier, a Web service wrapper, a Windows Integrated Analysis Environment (IAE), a Java application, and a Web page interface.
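At its core, the conflict checking CRANE performs reduces to detecting overlapping tracking intervals on the same antenna. A standalone sketch with invented track data (not the DSN XML schedule representation):

```python
# Detect schedule conflicts: two tracks on the same antenna whose time
# intervals overlap. Track times are arbitrary units for illustration.

def find_conflicts(tracks):
    """tracks: list of (antenna, start, end, project).
    Returns (project, project) pairs that collide on an antenna."""
    conflicts = []
    latest = {}  # antenna -> track with the latest end time seen so far
    for track in sorted(tracks, key=lambda t: (t[0], t[1])):
        antenna, start, end, project = track
        prev = latest.get(antenna)
        if prev and start < prev[2]:  # starts before the previous track ends
            conflicts.append((prev[3], project))
        if prev is None or end > prev[2]:
            latest[antenna] = track
    return conflicts

tracks = [
    ("DSS-14", 100, 180, "Voyager"),
    ("DSS-14", 170, 240, "MRO"),      # overlaps Voyager on the same antenna
    ("DSS-43", 100, 180, "Cassini"),  # different antenna: no conflict
]
print(find_conflicts(tracks))  # [('Voyager', 'MRO')]
```

In the real system this check runs against the master schedule across all time frames, and flagged pairs feed the negotiation workflow rather than being resolved automatically.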
A Web-Based GIS for Reporting Water Usage in the High Plains Underground Water Conservation District
NASA Astrophysics Data System (ADS)
Jia, M.; Deeds, N.; Winckler, M.
2012-12-01
The High Plains Underground Water Conservation District (HPWD) is the largest and oldest of the Texas water conservation districts, and oversees approximately 1.7 million irrigated acres. Recent rule changes have motivated HPWD to develop a more automated system to allow owners and operators to report well locations, meter locations, meter readings, the association between meters and wells, and contiguous acres. INTERA, Inc. has developed a web-based interactive system for HPWD water users to report water usage and for the district to better manage its water resources. The HPWD web management system utilizes state-of-the-art GIS techniques, including a cloud-based Amazon EC2 virtual machine, ArcGIS Server, ArcSDE and ArcGIS Viewer for Flex, to support web-based water use management. The system enables users to navigate to their area of interest using a well-established base map and perform a variety of operations and inquiries against their spatial features. The application currently has six components: user privilege management, property management, water meter registration, area registration, meter-well association and water use reporting. The system is composed of two main databases: a spatial database and a non-spatial database. With an Adobe Flex application at the front end and ArcGIS Server as middleware, spatial feature geometry and attribute updates are reflected immediately in the back end. As a result, property owners, along with the HPWD staff, collaborate to build up the spatial database. Interactions between the spatial and non-spatial databases are established by Windows Communication Foundation (WCF) services to record water-use reports, user-property associations, owner-area associations, and meter-well associations. Mobile capabilities will be enabled in the near future for field workers to collect data and synchronize it to the spatial database.
The entire solution is built on a highly scalable cloud server to dynamically allocate the computational resources so as to reduce the cost on security and hardware maintenance. In addition to the default capabilities provided by ESRI, customizations include 1) enabling interactions between spatial and non-spatial databases, 2) providing role-based feature editing, 3) dynamically filtering spatial features on the map based on user accounts and 4) comprehensive data validation.
NASA Technical Reports Server (NTRS)
Campbell, William J.; Roelofs, Larry H.; Short, Nicholas M., Jr.
1987-01-01
The National Space Science Data Center (NSSDC) has initiated an Intelligent Data Management (IDM) research effort which has as one of its components the development of an Intelligent User Interface (IUI). The intent of the latter is to develop a friendly and intelligent user interface service based on expert systems and natural language processing technologies. The purpose is to support the large number of potential scientific and engineering users who need space- and land-related research and technical data but who have little or no experience with query languages or understanding of the information content or architecture of the databases involved. This technical memorandum presents a prototype Intelligent User Interface Subsystem (IUIS) using the Crustal Dynamics Project Database as a test bed for the implementation of CRUDDES (the Crustal Dynamics Expert System). The knowledge base has more than 200 rules and represents a single application view and the architectural view. Operational performance using CRUDDES has allowed non-database users to obtain useful information from the database previously accessible only to an expert database user or the database designer.
MetaboLights: towards a new COSMOS of metabolomics data management.
Steinbeck, Christoph; Conesa, Pablo; Haug, Kenneth; Mahendraker, Tejasvi; Williams, Mark; Maguire, Eamonn; Rocca-Serra, Philippe; Sansone, Susanna-Assunta; Salek, Reza M; Griffin, Julian L
2012-10-01
Exciting funding initiatives are emerging in Europe and the US for metabolomics data production, storage, dissemination and analysis. This is based on a rich ecosystem of resources around the world, which has been built during the past ten years, including but not limited to resources such as MassBank in Japan and the Human Metabolome Database in Canada. Now, the European Bioinformatics Institute has launched MetaboLights, a database for metabolomics experiments and the associated metadata (http://www.ebi.ac.uk/metabolights). It is the first comprehensive, cross-species, cross-platform metabolomics database maintained by one of the major open access data providers in molecular biology. In October, the European COSMOS consortium will start its work on metabolomics data standardization, publication and dissemination workflows. The NIH in the US is establishing 6-8 metabolomics services cores as well as a national metabolomics repository. This communication reports on MetaboLights as a new resource for metabolomics research, summarises the related developments and outlines how they may consolidate knowledge management in this third large omics field, next to proteomics and genomics.
Design and implementation of a fault-tolerant and dynamic metadata database for clinical trials
NASA Astrophysics Data System (ADS)
Lee, J.; Zhou, Z.; Talini, E.; Documet, J.; Liu, B.
2007-03-01
In recent imaging-based clinical trials, quantitative image analysis (QIA) and computer-aided diagnosis (CAD) methods are increasing in productivity due to higher-resolution imaging capabilities. A radiology core performing clinical trials has been analyzing more treatment methods, and there is a growing quantity of metadata that needs to be stored and managed. These radiology centers also collaborate with many off-site imaging field sites and need a way to communicate metadata with one another in a secure infrastructure. Our solution is to implement a data storage grid with a fault-tolerant and dynamic metadata database design to unify metadata from different clinical trial experiments and field sites. Although metadata from images follow the DICOM standard, clinical trials also produce metadata specific to regions of interest and quantitative image analysis. We have implemented a data access and integration (DAI) server layer through which multiple field sites can access multiple metadata databases in the data grid via a single web-based grid service. The centralization of metadata database management simplifies the task of adding new databases to the grid and also decreases the risk of configuration errors seen in peer-to-peer grids. In this paper, we address the design and implementation of a data grid metadata store that provides fault tolerance and dynamic integration for imaging-based clinical trials.
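The DAI idea (one entry point fanning a query out to several site metadata databases and merging results) can be sketched in miniature. The schema, site names and region-of-interest values below are invented for illustration; the real layer speaks a web-based grid service protocol, not local SQLite handles.

```python
import sqlite3

# Each "site" gets its own metadata database with the same ROI schema.
def make_site(name, studies):
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE roi (study TEXT, organ TEXT, volume_cc REAL)")
    con.executemany("INSERT INTO roi VALUES (?,?,?)", studies)
    return name, con

SITES = [
    make_site("site_a", [("s1", "liver", 12.5)]),
    make_site("site_b", [("s2", "liver", 9.75), ("s3", "lung", 4.25)]),
]

def query_all(organ):
    """Single access point: run the same query against every site database
    and merge the results, tagged with their site of origin."""
    out = []
    for name, con in SITES:
        for study, vol in con.execute(
                "SELECT study, volume_cc FROM roi WHERE organ = ?", (organ,)):
            out.append((name, study, vol))
    return out

print(query_all("liver"))
# [('site_a', 's1', 12.5), ('site_b', 's2', 9.75)]
```

Centralizing this fan-out in one layer is what makes adding a new site database a configuration change rather than a change at every client, which is the maintenance argument the abstract makes.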
Fisk, Glenda M; Neville, Lukas B
2011-10-01
This exploratory study examines the nature of customer entitlement and its impact on front-line service employees. In an open-ended qualitative inquiry, 56 individuals with waitstaff experience described the types of behaviors entitled customers engage in and the kinds of service-related "perks" these individuals feel deserving of. Participants explained how they responded to entitled customers, how and when managers became involved, and how their dealings with these patrons influenced their subjective physical and psychological well-being. We found that the behaviors of entitled customers negatively impacted waitstaff employees. Participants reported physiological arousal, negative affect, burnout, and feelings of dehumanization as a result of dealing with these patrons. While respondents drew on a variety of strategies to manage their encounters with entitled customers, they indicated workplace support was often informal and described feeling abandoned by management in dealing with this workplace stressor. Approaching customer entitlement as a form of microaggression, we offer recommendations for practice and suggest new directions for future research. (PsycINFO Database Record (c) 2011 APA, all rights reserved).
Gunderson, Michael; Barnard, Jeff; McPherson, John; Kearns, Conrad T
2002-08-01
Pinellas County EMS' Medical Communications Officers provide a wide variety of services to patients, field clinicians, managers and their medical director. The concurrent data collection processes used in the MCO program for performance measurement of resuscitation efforts, intubations, submersion incidents and aeromedical transports for trauma cases have been very effective in integrating data from multiple computer databases with telephone follow-ups with field crews and receiving emergency department staff. This has facilitated significant improvements in the performance of these and many other aspects of our EMS system.
mantisGRID: a grid platform for DICOM medical images management in Colombia and Latin America.
Garcia Ruiz, Manuel; Garcia Chaves, Alvin; Ruiz Ibañez, Carlos; Gutierrez Mazo, Jorge Mario; Ramirez Giraldo, Juan Carlos; Pelaez Echavarria, Alejandro; Valencia Diaz, Edison; Pelaez Restrepo, Gustavo; Montoya Munera, Edwin Nelson; Garcia Loaiza, Bernardo; Gomez Gonzalez, Sebastian
2011-04-01
This paper presents the mantisGRID project, an interinstitutional initiative from Colombian medical and academic centers aiming to provide medical grid services for Colombia and Latin America. The mantisGRID is a grid platform, based on open source grid infrastructure, that provides the necessary services to access and exchange medical images and associated information following the digital imaging and communications in medicine (DICOM) and health level 7 standards. The paper focuses first on the data abstraction architecture, which is achieved via Open Grid Services Architecture Data Access and Integration (OGSA-DAI) services and supported by the Globus Toolkit. The grid currently uses a 30-Mb bandwidth connection on the Colombian High Technology Academic Network, RENATA, which is connected to Internet 2. The paper also includes a discussion of the relational database created to handle the DICOM objects, which were represented using Extensible Markup Language Schema documents, as well as other features implemented such as data security, user authentication, and patient confidentiality. Grid performance was tested using the three current operative nodes, and the results demonstrated comparable query times between the mantisGRID (OGSA-DAI) and distributed MySQL databases, especially for a large number of records.
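The representation of DICOM objects as XML documents stored in a relational database, as mentioned in this abstract, can be sketched in a few lines. The schema below is illustrative only — it is not the mantisGRID schema, and the attribute names and UID are made-up examples.

```python
import sqlite3
import xml.etree.ElementTree as ET

def dicom_to_xml(attrs):
    """Serialize a dict of DICOM attribute name/value pairs to a small
    XML document (a stand-in for an XML Schema-conformant record)."""
    root = ET.Element("DicomObject")
    for name, value in attrs.items():
        ET.SubElement(root, "Attribute", name=name).text = str(value)
    return ET.tostring(root, encoding="unicode")

# Relational side: one row per object, keyed by SOP Instance UID,
# with the XML document stored alongside it.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE dicom_objects (sop_uid TEXT PRIMARY KEY, xml_doc TEXT)")

attrs = {"PatientID": "ANON-001", "Modality": "CT", "StudyDate": "20110401"}
db.execute("INSERT INTO dicom_objects VALUES (?, ?)",
           ("1.2.840.9999.1", dicom_to_xml(attrs)))

# Querying back: parse the stored XML to recover an attribute value.
xml_doc = db.execute("SELECT xml_doc FROM dicom_objects").fetchone()[0]
modality = ET.fromstring(xml_doc).find("Attribute[@name='Modality']").text
```

The design choice this mirrors is keeping the relational layer for indexing and lookup while the full object metadata travels as a self-describing XML document.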
An Introduction to Database Structure and Database Machines.
ERIC Educational Resources Information Center
Detweiler, Karen
1984-01-01
Enumerates principal management objectives of database management systems (data independence, quality, security, multiuser access, central control) and criteria for comparison (response time, size, flexibility, other features). Conventional database management systems, relational databases, and database machines used for backend processing are…
AQUA-USERS: AQUAculture USEr Driven Operational Remote Sensing Information Services
NASA Astrophysics Data System (ADS)
Laanen, Marnix; Poser, Kathrin; Peters, Steef; de Reus, Nils; Ghebrehiwot, Semhar; Eleveld, Marieke; Miller, Peter; Groom, Steve; Clements, Oliver; Kurekin, Andrey; Martinez Vicente, Victor; Brotas, Vanda; Sa, Carolina; Couto, Andre; Brito, Ana; Amorim, Ana; Dale, Trine; Sorensen, Kai; Boye Hansen, Lars; Huber, Silvia; Kaas, Hanne; Andersson, Henrik; Icely, John; Fragoso, Bruno
2015-12-01
The FP7 project AQUA-USERS provides the aquaculture industry with user-relevant and timely information based on the most up-to-date satellite data and innovative optical in-situ measurements. Its key purpose is to develop an application that brings together satellite information on water quality and temperature with in-situ observations as well as relevant weather prediction and met-ocean data. The application and its underlying database are linked to a decision support system that includes a set of (user-determined) management options. Specific focus is on the development of indicators for aquaculture management including indicators for harmful algae bloom (HAB) events. The methods and services developed within AQUA-USERS are tested by the members of the user board, who represent different geographic areas and aquaculture production systems.
2013-01-01
Background Due to the growing number of biomedical entries in data repositories of the National Center for Biotechnology Information (NCBI), it is difficult to collect, manage and process all of these entries in one place by third-party software developers without significant investment in hardware and software infrastructure, its maintenance and administration. Web services allow development of software applications that integrate in one place the functionality and processing logic of distributed software components, without integrating the components themselves and without integrating the resources to which they have access. This is achieved by appropriate orchestration or choreography of available Web services and their shared functions. After the successful application of Web services in the business sector, this technology can now be used to build composite software tools that are oriented towards biomedical data processing. Results We have developed a new tool for efficient and dynamic data exploration in GenBank and other NCBI databases. A dedicated search GenBank system makes use of NCBI Web services and a package of Entrez Programming Utilities (eUtils) in order to provide extended searching capabilities in NCBI data repositories. In search GenBank users can use one of the three exploration paths: simple data searching based on the specified user’s query, advanced data searching based on the specified user’s query, and advanced data exploration with the use of macros. search GenBank orchestrates calls of particular tools available through the NCBI Web service providing requested functionality, while users interactively browse selected records in search GenBank and traverse between NCBI databases using available links. On the other hand, by building macros in the advanced data exploration mode, users create choreographies of eUtils calls, which can lead to the automatic discovery of related data in the specified databases. 
Conclusions search GenBank extends standard capabilities of the NCBI Entrez search engine in querying biomedical databases. The possibility of creating and saving macros in the search GenBank is a unique feature and has a great potential. The potential will further grow in the future with the increasing density of networks of relationships between data stored in particular databases. search GenBank is available for public use at http://sgb.biotools.pl/. PMID:23452691
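The "macro" mechanism described above is a choreography of eUtils calls. A rough sketch of how such a chain could be composed is shown below; the base URL and parameter names follow the public NCBI eUtils interface, but the macro structure and the example query are hypothetical and do not reflect search GenBank's internals.

```python
from urllib.parse import urlencode

# Public entry point of the NCBI Entrez Programming Utilities.
BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/"

def eutils_url(utility, **params):
    """Compose a request URL for one eUtils tool, e.g. esearch.fcgi."""
    return BASE + utility + ".fcgi?" + urlencode(params)

# A hypothetical macro: search a nucleotide database, follow links to
# related protein records, then fetch those records as FASTA. A real
# client would thread the returned IDs from each step into the next.
macro = [
    ("esearch", {"db": "nucleotide", "term": "BRCA1[Gene]"}),
    ("elink",   {"dbfrom": "nucleotide", "db": "protein"}),
    ("efetch",  {"db": "protein", "rettype": "fasta"}),
]

urls = [eutils_url(util, **params) for util, params in macro]
```

Each step is a plain HTTP call, which is what makes this kind of orchestration possible without integrating the underlying components themselves.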
Mrozek, Dariusz; Małysiak-Mrozek, Bożena; Siążnik, Artur
2013-03-01
The National Map - Orthoimagery Layer
2007-01-01
Many Federal, State, and local agencies use a common set of framework geographic information databases as a tool for economic and community development, land and natural resource management, and health and safety services. Emergency management and homeland security applications rely on this information. Private industry, nongovernmental organizations, and individual citizens use the same geographic data. Geographic information underpins an increasingly large part of the Nation's economy. The U.S. Geological Survey (USGS) is developing The National Map to be a seamless, continually maintained, and nationally consistent set of online, public domain, framework geographic information databases. The National Map will serve as a foundation for integrating, sharing, and using data easily and consistently. The data will be the source of revised paper topographic maps. The National Map includes digital orthorectified imagery; elevation data; vector data for hydrography, transportation, boundary, and structure features; geographic names; and land cover information.
Sandhoff, Brian G; Nies, Leslie K; Olson, Kari L; Nash, James D; Rasmussen, Jon R; Merenich, John A
2007-01-01
A clinical pharmacy service for managing the treatment of coronary artery disease in a health maintenance organization is described. Despite the proven benefits of aggressive risk factor modification for patients with coronary artery disease (CAD), there remains a treatment gap between consensus- and evidence-based recommendations and their application in patient care. In 1998, Kaiser Permanente of Colorado developed the Clinical Pharmacy Cardiac Risk Service (CPCRS) to focus on the long-term management of patients with CAD to improve clinical outcomes. The primary goals of the CPCRS are to increase the number of CAD patients on lipid-lowering therapy, manage medications shown to decrease the risk of future CAD-related events, assist in the monitoring and control of other diseases that increase cardiovascular risk, provide patient education and recommendations for nonpharmacologic therapy, and act as a CAD information resource for physicians and other health care providers. Using an electronic medical record and tracking database, the service works in close collaboration with primary care physicians, cardiologists, cardiac rehabilitation nurses, and other health care providers to reduce cardiac risk in the CAD population. Particular attention is given to dyslipidemia, blood pressure, diabetes mellitus, and tobacco cessation. Treatment with evidence-based regimens is initiated and adjusted as necessary. Over 11,000 patients are currently being followed by the CPCRS. A clinical pharmacy service in a large health maintenance organization provides cardiac risk reduction for patients with CAD and helps close treatment gaps that may exist for these patients.
NASA Astrophysics Data System (ADS)
Cavallari, Francesca; de Gruttola, Michele; Di Guida, Salvatore; Govi, Giacomo; Innocente, Vincenzo; Pfeiffer, Andreas; Pierro, Antonio
2011-12-01
Automatic, synchronous and reliable population of the condition databases is critical for the correct operation of the online selection as well as of the offline reconstruction and analysis of data. In this complex infrastructure, monitoring and fast detection of errors is a very challenging task. In this paper, we describe the CMS experiment system to process and populate the Condition Databases and make condition data promptly available both online for the high-level trigger and offline for reconstruction. The data are automatically collected using centralized jobs or are "dropped" by the users in dedicated services (offline and online drop-box), which synchronize them and take care of writing them into the online database. Then they are automatically streamed to the offline database, and thus are immediately accessible offline worldwide. The condition data are managed by different users using a wide range of applications. In normal operation the database monitor is used to provide simple timing information and the history of all transactions for all database accounts, and in the case of faults it is used to return simple error messages and more complete debugging information.
Dollar, Daniel M.; Gallagher, John; Glover, Janis; Marone, Regina Kenny; Crooker, Cynthia
2007-01-01
Objective: To support migration from print to electronic resources, the Cushing/Whitney Medical Library at Yale University reorganized its Technical Services Department to focus on managing electronic resources. Methods: The library hired consultants to help plan the changes and to present recommendations for integrating electronic resource management into every position. The library task force decided to focus initial efforts on the periodical collection. To free staff time to devote to electronic journals, most of the print subscriptions were switched to online only and new workflows were developed for e-journals. Results: Staff learned new responsibilities such as activating e-journals, maintaining accurate holdings information in the online public access catalog and e-journals database (“electronic shelf reading”), updating the link resolver knowledgebase, and troubleshooting. All of the serials team members now spend significant amounts of time managing e-journals. Conclusions: The serials staff now spends its time managing the materials most important to the library's clientele (e-journals and databases). The team's proactive approach to maintenance work and rapid response to reported problems should improve patrons' experiences using e-journals. The library is taking advantage of new technologies such as an electronic resource management system, and library workflows and procedures will continue to evolve as technology changes. PMID:17443247
David C. Chojnacky; Thomas M. Schuler
2004-01-01
Fallen or down dead wood is a key element in healthy forest ecosystems. Although the amount of down wood and shrubs can provide critical information to forest resource managers for assessing fire fuel build up, data on biomass of down woody materials (DWM) are not readily accessible using existing databases. We summarized data collected by the USDA Forest Service'...
Forced Shortsightedness: Security Force Assistance Missions
2014-06-01
legislation, it is therefore the intention of the Congress to promote the peace of the world and the foreign policy, security, and general welfare of the... legislation, Congressional Research Service (CRS) reports, the Defense Institute of Security Assistance Management's (DISAM) Green Book, and interviews with... developed database, there are "184 separate legislative authorities that power the 165 Building Partnership Capacity (BPC) programs managed across
Applications of GIS and database technologies to manage a Karst Feature Database
Gao, Y.; Tipping, R.G.; Alexander, E.C.
2006-01-01
This paper describes the management of a Karst Feature Database (KFD) in Minnesota. Two sets of applications in both GIS and Database Management System (DBMS) have been developed for the KFD of Minnesota. These applications were used to manage and to enhance the usability of the KFD. Structured Query Language (SQL) was used to manipulate transactions of the database and to facilitate the functionality of the user interfaces. The Database Administrator (DBA) authorized users with different access permissions to enhance the security of the database. Database consistency and recovery are accomplished by creating data logs and maintaining backups on a regular basis. The working database provides guidelines and management tools for future studies of karst features in Minnesota. The methodology of designing this DBMS is applicable to develop GIS-based databases to analyze and manage geomorphic and hydrologic datasets at both regional and local scales. The short-term goal of this research is to develop a regional KFD for the Upper Mississippi Valley Karst and the long-term goal is to expand this database to manage and study karst features at national and global scales.
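The abstract above describes using SQL transactions to manipulate the KFD and user interfaces built on top of it. A minimal sketch of that pattern is shown below; the table, column names, and sample rows are hypothetical and do not reproduce the actual Minnesota schema.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE karst_feature (
    feature_id   INTEGER PRIMARY KEY,
    feature_type TEXT,    -- e.g. sinkhole, spring, stream sink
    county       TEXT,
    utm_easting  REAL,
    utm_northing REAL)""")

# SQL transactions keep multi-row edits consistent: either every row
# in the batch is committed, or none is.
with db:
    db.executemany(
        "INSERT INTO karst_feature "
        "(feature_type, county, utm_easting, utm_northing) VALUES (?,?,?,?)",
        [("sinkhole", "Winona",   601200.0, 4872100.0),
         ("spring",   "Fillmore", 582400.0, 4833900.0),
         ("sinkhole", "Fillmore", 583100.0, 4834200.0)])

# A typical management query a user interface might issue:
# count features of each type per county.
counts = db.execute(
    "SELECT county, feature_type, COUNT(*) FROM karst_feature "
    "GROUP BY county, feature_type ORDER BY county, feature_type").fetchall()
```

Access control and backups, also mentioned in the abstract, would live at the DBMS level (user grants, data logs, scheduled dumps) rather than in application code like this.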
Honoré, Paul; Granjeaud, Samuel; Tagett, Rebecca; Deraco, Stéphane; Beaudoing, Emmanuel; Rougemont, Jacques; Debono, Stéphane; Hingamp, Pascal
2006-09-20
High throughput gene expression profiling (GEP) is becoming a routine technique in life science laboratories. With experimental designs that repeatedly span thousands of genes and hundreds of samples, relying on a dedicated database infrastructure is no longer an option. GEP technology is a fast moving target, with new approaches constantly broadening the field diversity. This technology heterogeneity, compounded by the informatics complexity of GEP databases, means that software developments have so far focused on mainstream techniques, leaving less typical yet established techniques such as Nylon microarrays at best partially supported. MAF (MicroArray Facility) is the laboratory database system we have developed for managing the design, production and hybridization of spotted microarrays. Although it can support the widely used glass microarrays and oligo-chips, MAF was designed with the specific idiosyncrasies of Nylon based microarrays in mind. Notably single channel radioactive probes, microarray stripping and reuse, vector control hybridizations and spike-in controls are all natively supported by the software suite. MicroArray Facility is MIAME supportive and dynamically provides feedback on missing annotations to help users estimate effective MIAME compliance. Genomic data such as clone identifiers and gene symbols are also directly annotated by MAF software using standard public resources. The MAGE-ML data format is implemented for full data export. Journalized database operations (audit tracking), data anonymization, material traceability and user/project level confidentiality policies are also managed by MAF. MicroArray Facility is a complete data management system for microarray producers and end-users. Particular care has been devoted to adequately model Nylon based microarrays. The MAF system, developed and implemented in both private and academic environments, has proved a robust solution for shared facilities and industry service providers alike.
Baptiste, B; Dawson, D R; Streiner, D
2015-01-01
To determine factors associated with case management (CM) service use in people with traumatic brain injury (TBI), using a published model for service use. A retrospective cohort with nested case-control design. Correlational and logistic regression analyses of questionnaires from a longitudinal community database. Questionnaires of 203 users of CM services and 273 non-users, complete for all outcome and predictor variables. Individuals with TBI, 15 years of age and older. Out of a dataset of 1,960 questionnaires, 476 met the inclusion criteria. Eight predictor variables and one outcome variable (use or non-use of the service). Predictor variables followed the framework of the Behaviour Model of Health Service Use (BMHSU); specifically, the pre-disposing, need, and enabling factor groups as these relate to health service use and access. Analyses revealed significant differences between users and non-users of CM services. In particular, users were significantly younger than non-users: the older the person, the less likely they were to use the service. Users also had less education, more severe activity limitations, and lower community integration. Persons living alone were less likely to use case management. Funding group also had a significant effect on use. This study advances an empirical understanding of equity of access to health service use in the practice of CM for persons living with TBI, a fairly new area of research, and has direct relevance to Life Care Planning (LCP). Many life care planners are case managers, and the genesis of LCP is CM. The findings relate to health service use and access, rather than health outcomes, and may assist with development of a modified model for predicting use to advance future cost-of-care predictions.
Geospatial Data Management Platform for Urban Groundwater
NASA Astrophysics Data System (ADS)
Gaitanaru, D.; Priceputu, A.; Gogu, C. R.
2012-04-01
Due to the large number of civil works projects and research studies, large quantities of geo-data are produced for urban environments. These data are often redundant, and they are spread across different institutions and private companies. Time-consuming operations like data processing and information harmonisation are the main reasons the re-use of data is systematically avoided. Urban groundwater data show the same complex situation. Underground structures (subway lines, deep foundations, underground parkings, and others), urban facility networks (sewer systems, water supply networks, heating conduits, etc.), drainage systems, surface water works and many others change continuously. As a consequence, their influence on groundwater changes systematically. However, because these activities provide a large quantity of data, aquifer modelling and behaviour prediction can be done using monitored quantitative and qualitative parameters. Due to the rapid evolution of technology in the past few years, transferring large amounts of information through the internet has become a feasible solution for sharing geoscience data. Furthermore, standard platform-independent means to do this have been developed (specific mark-up languages such as GML, GeoSciML, WaterML, GWML, CityML). They allow large geospatial databases to be easily updated and shared through the internet, even between different companies or between research centres that do not necessarily use the same database structures. For Bucharest City (Romania), an integrated platform for groundwater geospatial data management is being developed under the framework of a national research project, "Sedimentary media modeling platform for groundwater management in urban areas" (SIMPA), financed by the National Authority for Scientific Research of Romania.
The platform architecture is based on three components: a geospatial database, a desktop application (a complex set of hydrogeological and geological analysis tools) and a front-end geoportal service. The SIMPA platform makes use of mark-up transfer standards to provide a user-friendly application that can be accessed through the internet to query, analyse, and visualise geospatial data related to urban groundwater. The platform holds the information within the local groundwater geospatial databases, and the user is able to access this data through a geoportal service. The database architecture allows storing accurate and very detailed geological, hydrogeological, and infrastructure information that can be straightforwardly generalized and further upscaled. The geoportal service offers the possibility of querying a dataset from the spatial database. The query is coded in a standard mark-up language and sent to the server through the standard Hypertext Transfer Protocol (HTTP) to be processed by the local application. After the validation of the query, the results are sent back to the user to be displayed by the geoportal application. The main advantage of the SIMPA platform is that it offers the user the possibility to make a primary multi-criteria query, which results in a smaller set of data to be analysed afterwards. This improves both the transfer process parameters and the user's means of creating the desired query.
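The query cycle described above — a multi-criteria query encoded in mark-up, sent to the server, validated, and answered with a reduced dataset — can be sketched as follows. The query markup, field names, and sample records here are invented for illustration; the SIMPA platform's actual schema is not given in this abstract.

```python
import xml.etree.ElementTree as ET

def build_query(layer, **criteria):
    """Client side: encode a multi-criteria query as a small XML
    document (a stand-in for a standard mark-up query language)."""
    root = ET.Element("Query", layer=layer)
    for name, value in criteria.items():
        ET.SubElement(root, "Criterion", name=name).text = str(value)
    return ET.tostring(root, encoding="unicode")

def run_query(xml_text, dataset):
    """Server side: parse the markup and apply each criterion as an
    equality filter. A real service would first validate the document
    against its schema before executing it."""
    root = ET.fromstring(xml_text)
    criteria = {c.get("name"): c.text for c in root.findall("Criterion")}
    return [rec for rec in dataset
            if all(str(rec.get(k)) == v for k, v in criteria.items())]

# Hypothetical well records standing in for the geospatial database.
wells = [
    {"id": "W1", "aquifer": "Fratesti",  "depth_m": 120},
    {"id": "W2", "aquifer": "Mostistea", "depth_m": 45},
]
result = run_query(build_query("wells", aquifer="Mostistea"), wells)
```

The payoff of the primary multi-criteria query is visible even in this toy: only the matching subset travels back to the client for further analysis.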
NASA Astrophysics Data System (ADS)
Gatto, Francesca; Katsanevakis, Stelios; Vandekerkhove, Jochen; Zenetos, Argyro; Cardoso, Ana Cristina
2013-06-01
Europe is severely affected by alien invasions, which impact biodiversity, ecosystem services, economy, and human health. A large number of national, regional, and global online databases provide information on the distribution, pathways of introduction, and impacts of alien species. The sufficiency and efficiency of the current online information systems to assist the European policy on alien species was investigated by a comparative analysis of occurrence data across 43 online databases. Large differences among databases were found which are partially explained by variations in their taxonomical, environmental, and geographical scopes but also by the variable efforts for continuous updates and by inconsistencies on the definition of "alien" or "invasive" species. No single database covered all European environments, countries, and taxonomic groups. In many European countries national databases do not exist, which greatly affects the quality of reported information. To be operational and useful to scientists, managers, and policy makers, online information systems need to be regularly updated through continuous monitoring on a country or regional level. We propose the creation of a network of online interoperable web services through which information in distributed resources can be accessed, aggregated and then used for reporting and further analysis at different geographical and political scales, as an efficient approach to increase the accessibility of information. Harmonization, standardization, conformity on international standards for nomenclature, and agreement on common definitions of alien and invasive species are among the necessary prerequisites.
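The aggregation step the authors propose — pulling occurrence records from distributed databases into one harmonized view — might look like the following toy sketch. The source names and records are invented; real harmonization would also need taxonomic name resolution, not just string normalization.

```python
def aggregate(sources):
    """Merge occurrence records from several databases, keeping one
    entry per harmonized (species, country) key and recording which
    sources reported it."""
    merged = {}
    for source, records in sources.items():
        for rec in records:
            # Crude harmonization: trim and lowercase the species name.
            key = (rec["species"].strip().lower(), rec["country"])
            merged.setdefault(key, {**rec, "sources": []})
            merged[key]["sources"].append(source)
    return merged

# Mock national and regional databases reporting overlapping records.
sources = {
    "db_national": [{"species": "Dreissena polymorpha", "country": "NL"}],
    "db_regional": [{"species": "dreissena polymorpha ", "country": "NL"},
                    {"species": "Caulerpa taxifolia",    "country": "ES"}],
}
merged = aggregate(sources)
```

Tracking which sources reported each record is what makes the inconsistencies between databases, as found in the comparative analysis above, visible and auditable.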
Pre-hospital management of mass casualty civilian shootings: a systematic literature review.
Turner, Conor D A; Lockey, David J; Rehn, Marius
2016-11-08
Mass casualty civilian shootings present an uncommon but recurring challenge to emergency services around the world and produce unique management demands. Against the background of a rising threat of transnational terrorism worldwide, emergency response strategies are of critical importance. This study aims to systematically identify, describe and appraise the quality of indexed and non-indexed literature on the pre-hospital management of modern civilian mass shootings to guide future practice. Systematic literature searches of PubMed, the Cochrane Database of Systematic Reviews and Scopus were conducted in conjunction with simple searches of non-indexed databases: Web of Science, OpenDOAR and Evidence Search. The searches were last carried out on 20 April 2016 and only identified papers published after 1 January 1980. Included documents had to contain descriptions, discussions or experiences of the pre-hospital management of civilian mass shootings. Of the 494 identified manuscripts, 73 were selected on abstract and title, and after full-text reading 47 were included in the analysis. The search yielded reports of 17 mass shooting events, the majority from the USA with additions from France, Norway, the UK and Kenya. The reports describe 17 separate events between 1994 and 2015 in which 1649 people were shot and 578 died. Quality appraisal demonstrated considerable heterogeneity in reporting and revealed limited data on mass shootings globally. Key themes were identified to improve future practice: tactical emergency medical support may harmonise inner cordon interventions, there is a need for inter-service education on effective haemorrhage control, senior triage operators are valuable, and regular mass casualty incident simulation is needed.
What Is Case Management? A Scoping and Mapping Review
Millington, Michael; Salvador-Carulla, Luis
2016-01-01
The description of case management in research and clinical practice is highly variable, which impedes quality analysis, policy and planning. Case management makes a unique contribution towards the integration of health care, social services and other sector services and supports for people with complex health conditions. There are multiple components and variations of case management depending on the context and client population. This paper aims to scope and map case management in the literature to identify how case management is described in the literature for key complex health conditions (e.g., brain injury, diabetes, mental health, spinal cord injury). Following literature searches in multiple databases and grey literature, and exclusion by health condition, community-based setting, and adequacy of description, there were 661 potential papers for data extraction. Data from 79 papers (1988–2013) were analysed to the point of saturation (no new information) and mapped to the model, components and activities. The results included 22 definitions, five models, with 69 activities or tasks of case managers mapped to 17 key components (interventions). The results confirm the significant terminological variance in case management which produces role confusion, ambiguity and hinders comparability across different health conditions and contexts. There is an urgent need for an internationally agreed taxonomy for the coordination, navigation and management of care. PMID:28413368
A new information architecture, website and services for the CMS experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, Lucas; Rusack, Eleanor; Zemleris, Vidmantas
2012-01-01
The age and size of the CMS collaboration at the LHC means it now has many hundreds of inhomogeneous web sites and services, and hundreds of thousands of documents. We describe a major initiative to create a single coherent CMS internal and public web site. This uses the Drupal web Content Management System (now supported by CERN/IT) on top of a standard LAMP stack (Linux, Apache, MySQL, and php/perl). The new navigation, content and search services are coherently integrated with numerous existing CERN services (CDS, EDMS, Indico, phonebook, Twiki) as well as many CMS internal Web services. We describe the information architecture, the system design, implementation and monitoring, the document and content database, security aspects, and our deployment strategy, which ensured continual smooth operation of all systems at all times.
A new Information Architecture, Website and Services for the CMS Experiment
NASA Astrophysics Data System (ADS)
Taylor, Lucas; Rusack, Eleanor; Zemleris, Vidmantas
2012-12-01
The age and size of the CMS collaboration at the LHC means it now has many hundreds of inhomogeneous web sites and services, and hundreds of thousands of documents. We describe a major initiative to create a single coherent CMS internal and public web site. This uses the Drupal web Content Management System (now supported by CERN/IT) on top of a standard LAMP stack (Linux, Apache, MySQL, and php/perl). The new navigation, content and search services are coherently integrated with numerous existing CERN services (CDS, EDMS, Indico, phonebook, Twiki) as well as many CMS internal Web services. We describe the information architecture; the system design, implementation and monitoring; the document and content database; security aspects; and our deployment strategy, which ensured continual smooth operation of all systems at all times.
Negative Effects of Learning Spreadsheet Management on Learning Database Management
ERIC Educational Resources Information Center
Vágner, Anikó; Zsakó, László
2015-01-01
A lot of students learn spreadsheet management before database management. Their similarities can cause a lot of negative effects when learning database management. In this article, we consider these similarities and explain what can cause problems. First, we analyse the basic concepts such as table, database, row, cell, reference, etc. Then, we…
Spacecraft Orbit Design and Analysis (SODA), version 1.0 user's guide
NASA Technical Reports Server (NTRS)
Stallcup, Scott S.; Davis, John S.
1989-01-01
The Spacecraft Orbit Design and Analysis (SODA) computer program, Version 1.0 is described. SODA is a spaceflight mission planning system which consists of five program modules integrated around a common database and user interface. SODA runs on a VAX/VMS computer with an EVANS & SUTHERLAND PS300 graphics workstation. BOEING RIM-Version 7 relational database management system performs transparent database services. In the current version three program modules produce an interactive three dimensional (3D) animation of one or more satellites in planetary orbit. Satellite visibility and sensor coverage capabilities are also provided. One module produces an interactive 3D animation of the solar system. Another module calculates cumulative satellite sensor coverage and revisit time for one or more satellites. Currently Earth, Moon, and Mars systems are supported for all modules except the solar system module.
Effectiveness of case management for homeless persons: a systematic review.
de Vet, Renée; van Luijtelaar, Maurice J A; Brilleslijper-Kater, Sonja N; Vanderplasschen, Wouter; Beijersbergen, Mariëlle D; Wolf, Judith R L M
2013-10-01
We reviewed the literature on standard case management (SCM), intensive case management (ICM), assertive community treatment (ACT), and critical time intervention (CTI) for homeless adults. We searched databases for peer-reviewed English articles published from 1985 to 2011 and found 21 randomized controlled trials or quasi-experimental studies comparing case management to other services. We found little evidence for the effectiveness of ICM. SCM improved housing stability, reduced substance use, and removed employment barriers for substance users. ACT improved housing stability and was cost-effective for mentally ill and dually diagnosed persons. CTI showed promise for housing, psychopathology, and substance use and was cost-effective for mentally ill persons. More research is needed on how case management can most effectively support rapid-rehousing approaches to homelessness.
Effectiveness of Case Management for Homeless Persons: A Systematic Review
de Vet, Renée; van Luijtelaar, Maurice J. A.; Brilleslijper-Kater, Sonja N.; Vanderplasschen, Wouter; Beijersbergen, Mariëlle D.
2013-01-01
We reviewed the literature on standard case management (SCM), intensive case management (ICM), assertive community treatment (ACT), and critical time intervention (CTI) for homeless adults. We searched databases for peer-reviewed English articles published from 1985 to 2011 and found 21 randomized controlled trials or quasi-experimental studies comparing case management to other services. We found little evidence for the effectiveness of ICM. SCM improved housing stability, reduced substance use, and removed employment barriers for substance users. ACT improved housing stability and was cost-effective for mentally ill and dually diagnosed persons. CTI showed promise for housing, psychopathology, and substance use and was cost-effective for mentally ill persons. More research is needed on how case management can most effectively support rapid-rehousing approaches to homelessness. PMID:23947309
Mapping the literature of case management nursing.
White, Pamela; Hall, Marilyn E
2006-04-01
Nursing case management provides a continuum of health care services for defined groups of patients. Its literature is multidisciplinary, emphasizing clinical specialties, case management methodology, and the health care system. This study is part of a project to map the literature of nursing, sponsored by the Nursing and Allied Health Resources Section of the Medical Library Association. The study identifies core journals cited in case management literature and indexing services that access those journals. Three source journals were identified based on established criteria, and cited references from each article published from 1997 to 1999 were analyzed. Nearly two-thirds of the cited references were from journals; others were from books, monographs, reports, government documents, and the Internet. Cited journal references were ranked in descending order, and Bradford's Law of Scattering was applied. The many journals constituting the top two zones reflect the diversity of this field. Zone 1 included journals from nursing administration, case management, general medicine, medical specialties, and social work. Two databases, PubMed/MEDLINE and OCLC ArticleFirst, provided the best indexing coverage. Collections that support case management require a relatively small group of core journals. Students and health care professionals will need to search across disciplines to identify appropriate literature.
EPA Facility Registry Service (FRS): RCRA
This web feature service contains location and facility identification information from EPA's Facility Registry Service (FRS) for the subset of hazardous waste facilities that link to the Resource Conservation and Recovery Act Information System (RCRAInfo). As EPA's comprehensive information system in support of the Resource Conservation and Recovery Act (RCRA) of 1976 and the Hazardous and Solid Waste Amendments (HSWA) of 1984, RCRAInfo tracks many types of information about generators, transporters, treaters, storers, and disposers of hazardous waste. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using vigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities. This data set contains the subset of FRS integrated facilities that link to RCRAInfo hazardous waste facilities once the RCRAInfo data has been integrated into the FRS database. Additional information on FRS is available at the EPA website https://www.epa.gov/enviro/facility-registry-service-frs
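As a hedged sketch of how a client might request features from such a web feature service (the endpoint and layer name below are placeholders invented for the example, not EPA's actual service; the query parameters follow the general OGC WFS 2.0 convention):

```python
from urllib.parse import urlencode

# Placeholder endpoint and layer name -- NOT EPA's real service.
base = "https://example.gov/geoserver/wfs"
params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "frs:rcra_facilities",   # hypothetical layer name
    "outputFormat": "application/json",
    "count": 10,                          # limit the response size
}
url = f"{base}?{urlencode(params)}"
```

Fetching `url` with any HTTP client would then return up to ten facility features as GeoJSON, assuming the service supports that output format.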
The new geographic information system in ETVA VI.PE.
NASA Astrophysics Data System (ADS)
Xagoraris, Zafiris; Soulis, George
2016-08-01
ETVA VI.PE. S.A. is a member of the Piraeus Bank Group of Companies and its activities include designing, developing, exploiting and managing Industrial Areas throughout Greece. Inside ETVA VI.PE.'s thirty-one Industrial Parks there are currently 2,500 manufacturing companies established, with 40,000 employees and € 2.5 billion of invested funds. In each one of the industrial areas ETVA VI.PE. guarantees the companies industrial lots of land (sites) with propitious building codes and complete infrastructure networks of water supply, sewerage, paved roads, power supply, communications, cleansing services, etc. The development of the Geographical Information System for ETVA VI.PE.'s Industrial Parks started at the beginning of 1992 and consists of three subsystems: Cadastre, which manages the information for the land acquisition of Industrial Areas; Street Layout - Sites, which manages the sites sold to manufacturing companies; and Networks, which manages the infrastructure networks (roads, water supply, sewerage, etc.). The mapping of each Industrial Park is made incorporating state-of-the-art photogrammetric, cartographic and surveying methods and techniques. Passing through the phases of initial design (hybrid GIS) and system upgrade (integrated GIS solution with spatial database), the system is currently operating on a new upgrade that includes redesigning and merging the system's database schemas, along with the creation of central security policies, and the development of a new web GIS application for advanced data entry, highly customisable and standard reports, and dynamic interactive maps. The new GIS brings the company to advanced levels of productivity and introduces a new era for decision making and business management.
Blank, Lindsay; Baxter, Susan; Woods, Helen Buckley; Goyder, Elizabeth; Lee, Andrew; Payne, Nick; Rimmer, Melanie
2014-01-01
Background Demand management defines any method used to monitor, direct, or regulate patient referrals. Strategies have been developed to manage the referral of patients to secondary care, with interventions that target primary care, specialist services, or infrastructure. Aim To review the international evidence on interventions to manage referral from primary to specialist care. Design and setting Systematic review. Method Iterative, systematic searches of published and unpublished sources (public health, health management, management, and grey literature databases from health care and other industries) were undertaken to identify recent, relevant studies. A narrative synthesis of the data was completed to structure the evidence into groups of similar interventions. Results The searches generated 8327 unique results, of which 140 studies were included. Interventions were grouped into four intervention categories: GP education (n = 50); process change (n = 49); system change (n = 38); and patient-focused (n = 3). It is clear that there is no ‘magic bullet’ for managing demand for secondary care services, although some groups of interventions may have greater potential for development, given the existing evidence that they can be effective in specific contexts. Conclusions To tackle demand management of primary care services, the focus cannot be on primary care alone; a whole-systems approach is needed because the introduction of interventions in primary care is often just the starting point of the referral process. In addition, more research is needed to develop and evaluate interventions that acknowledge the role of the patient in the referral decision. PMID:25452541
Nelson, Kurtis J.; Long, Donald G.; Connot, Joel A.
2016-02-29
The Landscape Fire and Resource Management Planning Tools (LANDFIRE) 2010 data release provides updated and enhanced vegetation, fuel, and fire regime layers consistently across the United States. The data represent landscape conditions from approximately 2010 and are the latest release in a series of planned updates to maintain currency of LANDFIRE data products. Enhancements to the data products included refinement of urban areas by incorporating the National Land Cover Database 2006 land cover product, refinement of agricultural lands by integrating the National Agriculture Statistics Service 2011 cropland data layer, and improved wetlands delineations using the National Land Cover Database 2006 land cover and the U.S. Fish and Wildlife Service National Wetlands Inventory data. Disturbance layers were generated for years 2008 through 2010 using remotely sensed imagery, polygons representing disturbance events submitted by local organizations, and fire mapping program data such as the Monitoring Trends in Burn Severity perimeters produced by the U.S. Geological Survey and the U.S. Forest Service. Existing vegetation data were updated to account for transitions in disturbed areas and to account for vegetation growth and succession in undisturbed areas. Surface and canopy fuel data were computed from the updated vegetation type, cover, and height and occasionally from potential vegetation. Historical fire frequency and succession classes were also updated. Revised topographic layers were created based on updated elevation data from the National Elevation Dataset. The LANDFIRE program also released a new Web site offering updated content, enhanced usability, and more efficient navigation.
Sunguya, Bruno F; Poudel, Krishna C; Mlunde, Linda B; Urassa, David P; Yasuoka, Junko; Jimba, Masamine
2013-09-24
Medical and nursing education lack adequate practical nutrition training to fit the clinical reality that health workers face in their practices. Such a deficit creates health workers with poor nutrition knowledge and child undernutrition management practices. In-service nutrition training can help to fill this gap. However, no systematic review has examined its collective effectiveness. We thus conducted this study to examine the effectiveness of in-service nutrition training on health workers' nutrition knowledge, counseling skills, and child undernutrition management practices. We conducted a literature search on nutrition interventions from PubMed/MEDLINE, CINAHL, EMBASE, ISI Web of Knowledge, and World Health Organization regional databases. The outcome variables were nutrition knowledge, nutrition-counseling skills, and undernutrition management practices of health workers. Due to heterogeneity, we conducted only descriptive analyses. Out of 3910 retrieved articles, 25 were selected as eligible for the final analysis. A total of 18 studies evaluated health workers' nutrition knowledge and showed improvement after training. A total of 12 studies with nutrition counseling as the outcome variable also showed improvement among the trained health workers. Sixteen studies evaluated health workers' child undernutrition management practices. In all such studies, child undernutrition management practices and competence of health workers improved after the nutrition training intervention. In-service nutrition training improves the quality of health workers by giving them greater knowledge and competence to manage nutrition-related conditions, especially child undernutrition. In-service nutrition training interventions can help to fill the gap created by the lack of adequate nutrition training in the existing medical and nursing education system. In this way, steps can be taken toward improving the overall nutritional status of the child population.
47 CFR 15.713 - TV bands database.
Code of Federal Regulations, 2011 CFR
2011-10-01
... authorized services operating in the TV bands. In addition, a TV bands database must also verify that the FCC identifier (FCC ID) of a device seeking access to its services is valid; under this requirement the TV bands... information will come from the official Commission database. These services include: (i) Digital television...
Code of Federal Regulations, 2011 CFR
2011-10-01
... to have access to its directory assistance services, including directory assistance databases, so... provider, including transfer of the LECs' directory assistance databases in readily accessible magnetic.... Updates to the directory assistance database shall be made in the same format as the initial transfer...
Improving Land Cover Mapping: a Mobile Application Based on ESA Sentinel 2 Imagery
NASA Astrophysics Data System (ADS)
Melis, M. T.; Dessì, F.; Loddo, P.; La Mantia, C.; Da Pelo, S.; Deflorio, A. M.; Ghiglieri, G.; Hailu, B. T.; Kalegele, K.; Mwasi, B. N.
2018-04-01
The increasing availability of satellite data is of real value for the enhancement of environmental knowledge and land management. Possibilities to integrate different sources of geo-data are growing, and methodologies to create thematic databases are becoming very sophisticated. Moreover, access to internet services and, in particular, to web mapping services is well developed and widespread among both expert users and citizens. Web map services, like Google Maps or Open Street Maps, give access to updated optical imagery or topographic maps, but information on land cover/use is still not provided. Therefore, non-specialized users face many failings in the general use of, and access to, those maps. This issue is particularly felt where digital (web) maps could form the basis for land use management, as they are more economic and accessible than paper maps. These conditions are well known in many African countries where, while internet access is becoming open to all, the local map agencies and their products are not widespread.
An Integrated Korean Biodiversity and Genetic Information Retrieval System
Lim, Jeongheui; Bhak, Jong; Oh, Hee-Mock; Kim, Chang-Bae; Park, Yong-Ha; Paek, Woon Kee
2008-01-01
Background On-line biodiversity information databases are growing quickly and being integrated into general bioinformatics systems due to the advances of fast gene sequencing technologies and the Internet. These can reduce the cost and effort of performing biodiversity surveys and genetic searches, which allows scientists to spend more time researching and less time collecting and maintaining data. This will cause an increased rate of knowledge build-up and improve conservation. The biodiversity databases in Korea have been scattered among several institutes and local natural history museums with incompatible data types. Therefore, a comprehensive database and a nationwide web portal for biodiversity information is necessary in order to integrate diverse information resources, including molecular and genomic databases. Results The Korean Natural History Research Information System (NARIS) was built and serviced as the central biodiversity information system to collect and integrate the biodiversity data of various institutes and natural history museums in Korea. This database aims to be an integrated resource that contains additional biological information, such as genome sequences and molecular level diversity. Currently, twelve institutes and museums in Korea are integrated by the DiGIR (Distributed Generic Information Retrieval) protocol, with the Darwin Core 2.0 format as its metadata standard for data exchange. Data quality control and statistical analysis functions have been implemented. In particular, integrating molecular and genetic information from the National Center for Biotechnology Information (NCBI) databases with NARIS was recently accomplished. NARIS can also be extended to accommodate other institutes abroad, and the whole system can be exported to establish local biodiversity management servers. Conclusion A Korean data portal, NARIS, has been developed to efficiently manage and utilize biodiversity data, which includes genetic resources.
NARIS aims to be integral in maximizing bio-resource utilization for conservation, management, research, education, industrial applications, and integration with other bioinformation data resources. It can be found at . PMID:19091024
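The DiGIR exchange described above relies on Darwin Core as its metadata standard. As a minimal sketch of what a Darwin Core-style occurrence record looks like when serialized to XML (the `build_record` helper and the sample values are hypothetical, and a real DiGIR response carries considerably more structure):

```python
import xml.etree.ElementTree as ET

# Darwin Core terms namespace (real); record layout below is illustrative.
DWC_NS = "http://rs.tdwg.org/dwc/terms/"

def build_record(fields):
    """Assemble a minimal occurrence record: one child element per
    Darwin Core term. Hypothetical helper, not part of DiGIR itself."""
    record = ET.Element("record")
    for term, value in fields.items():
        el = ET.SubElement(record, f"{{{DWC_NS}}}{term}")
        el.text = value
    return record

rec = build_record({
    "scientificName": "Hynobius leechii",   # sample values, illustrative only
    "country": "South Korea",
    "basisOfRecord": "PreservedSpecimen",
})
xml_text = ET.tostring(rec, encoding="unicode")
```

Standardizing on shared term names like `scientificName` is what lets heterogeneous museum databases answer the same federated query.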
NASA Astrophysics Data System (ADS)
Madin, Joshua S.; Anderson, Kristen D.; Andreasen, Magnus Heide; Bridge, Tom C. L.; Cairns, Stephen D.; Connolly, Sean R.; Darling, Emily S.; Diaz, Marcela; Falster, Daniel S.; Franklin, Erik C.; Gates, Ruth D.; Hoogenboom, Mia O.; Huang, Danwei; Keith, Sally A.; Kosnik, Matthew A.; Kuo, Chao-Yang; Lough, Janice M.; Lovelock, Catherine E.; Luiz, Osmar; Martinelli, Julieta; Mizerek, Toni; Pandolfi, John M.; Pochon, Xavier; Pratchett, Morgan S.; Putnam, Hollie M.; Roberts, T. Edward; Stat, Michael; Wallace, Carden C.; Widman, Elizabeth; Baird, Andrew H.
2016-03-01
Trait-based approaches advance ecological and evolutionary research because traits provide a strong link to an organism’s function and fitness. Trait-based research might lead to a deeper understanding of the functions of, and services provided by, ecosystems, thereby improving management, which is vital in the current era of rapid environmental change. Coral reef scientists have long collected trait data for corals; however, these are difficult to access and often under-utilized in addressing large-scale questions. We present the Coral Trait Database initiative that aims to bring together physiological, morphological, ecological, phylogenetic and biogeographic trait information into a single repository. The database houses species- and individual-level data from published field and experimental studies alongside contextual data that provide important framing for analyses. In this data descriptor, we release data for 56 traits for 1547 species, and present a collaborative platform on which other trait data are being actively federated. Our overall goal is for the Coral Trait Database to become an open-source, community-led data clearinghouse that accelerates coral reef research.
Madin, Joshua S.; Anderson, Kristen D.; Andreasen, Magnus Heide; Bridge, Tom C.L.; Cairns, Stephen D.; Connolly, Sean R.; Darling, Emily S.; Diaz, Marcela; Falster, Daniel S.; Franklin, Erik C.; Gates, Ruth D.; Hoogenboom, Mia O.; Huang, Danwei; Keith, Sally A.; Kosnik, Matthew A.; Kuo, Chao-Yang; Lough, Janice M.; Lovelock, Catherine E.; Luiz, Osmar; Martinelli, Julieta; Mizerek, Toni; Pandolfi, John M.; Pochon, Xavier; Pratchett, Morgan S.; Putnam, Hollie M.; Roberts, T. Edward; Stat, Michael; Wallace, Carden C.; Widman, Elizabeth; Baird, Andrew H.
2016-01-01
Trait-based approaches advance ecological and evolutionary research because traits provide a strong link to an organism’s function and fitness. Trait-based research might lead to a deeper understanding of the functions of, and services provided by, ecosystems, thereby improving management, which is vital in the current era of rapid environmental change. Coral reef scientists have long collected trait data for corals; however, these are difficult to access and often under-utilized in addressing large-scale questions. We present the Coral Trait Database initiative that aims to bring together physiological, morphological, ecological, phylogenetic and biogeographic trait information into a single repository. The database houses species- and individual-level data from published field and experimental studies alongside contextual data that provide important framing for analyses. In this data descriptor, we release data for 56 traits for 1547 species, and present a collaborative platform on which other trait data are being actively federated. Our overall goal is for the Coral Trait Database to become an open-source, community-led data clearinghouse that accelerates coral reef research. PMID:27023900
Madin, Joshua S; Anderson, Kristen D; Andreasen, Magnus Heide; Bridge, Tom C L; Cairns, Stephen D; Connolly, Sean R; Darling, Emily S; Diaz, Marcela; Falster, Daniel S; Franklin, Erik C; Gates, Ruth D; Harmer, Aaron; Hoogenboom, Mia O; Huang, Danwei; Keith, Sally A; Kosnik, Matthew A; Kuo, Chao-Yang; Lough, Janice M; Lovelock, Catherine E; Luiz, Osmar; Martinelli, Julieta; Mizerek, Toni; Pandolfi, John M; Pochon, Xavier; Pratchett, Morgan S; Putnam, Hollie M; Roberts, T Edward; Stat, Michael; Wallace, Carden C; Widman, Elizabeth; Baird, Andrew H
2016-03-29
Trait-based approaches advance ecological and evolutionary research because traits provide a strong link to an organism's function and fitness. Trait-based research might lead to a deeper understanding of the functions of, and services provided by, ecosystems, thereby improving management, which is vital in the current era of rapid environmental change. Coral reef scientists have long collected trait data for corals; however, these are difficult to access and often under-utilized in addressing large-scale questions. We present the Coral Trait Database initiative that aims to bring together physiological, morphological, ecological, phylogenetic and biogeographic trait information into a single repository. The database houses species- and individual-level data from published field and experimental studies alongside contextual data that provide important framing for analyses. In this data descriptor, we release data for 56 traits for 1547 species, and present a collaborative platform on which other trait data are being actively federated. Our overall goal is for the Coral Trait Database to become an open-source, community-led data clearinghouse that accelerates coral reef research.
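Species-level trait records of the kind this database houses are typically distributed as flat tables. The sketch below is a minimal illustration of filtering such a table; the column names and trait values are invented for the example, not the database's actual schema:

```python
import csv
import io

# Invented trait table in the spirit of species-level records.
raw = """species,trait,value,unit
Acropora hyacinthus,growth_rate,50,mm_per_yr
Acropora hyacinthus,corallite_width,0.8,mm
Porites lobata,growth_rate,11,mm_per_yr
"""

rows = list(csv.DictReader(io.StringIO(raw)))

# Pull one trait across species -- the basic move behind
# large-scale comparative analyses.
growth = {r["species"]: float(r["value"])
          for r in rows if r["trait"] == "growth_rate"}
```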
Relax with CouchDB - Into the non-relational DBMS era of Bioinformatics
Manyam, Ganiraju; Payton, Michelle A.; Roth, Jack A.; Abruzzo, Lynne V.; Coombes, Kevin R.
2012-01-01
With the proliferation of high-throughput technologies, genome-level data analysis has become common in molecular biology. Bioinformaticians are developing extensive resources to annotate and mine biological features from high-throughput data. The underlying database management systems for most bioinformatics software are based on a relational model. Modern non-relational databases offer an alternative that has flexibility, scalability, and a non-rigid design schema. Moreover, with an accelerated development pace, non-relational databases like CouchDB can be ideal tools to construct bioinformatics utilities. We describe CouchDB by presenting three new bioinformatics resources: (a) geneSmash, which collates data from bioinformatics resources and provides automated gene-centric annotations, (b) drugBase, a database of drug-target interactions with a web interface powered by geneSmash, and (c) HapMap-CN, which provides a web interface to query copy number variations from three SNP-chip HapMap datasets. In addition to the web sites, all three systems can be accessed programmatically via web services. PMID:22609849
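The appeal of a schema-less store for gene-centric annotation is that documents need not share fields. The sketch below imitates, in plain Python, the map-function style of a CouchDB view; the document fields are invented for illustration, and a real CouchDB view is a JavaScript function evaluated by the server and queried over its HTTP API:

```python
# Illustrative gene-centric documents; fields vary per document,
# which a fixed relational schema would have to anticipate up front.
docs = [
    {"_id": "gene_A", "type": "gene", "chromosome": "17", "aliases": ["a1", "a2"]},
    {"_id": "gene_B", "type": "gene", "chromosome": "7"},   # no aliases field
    {"_id": "cnv_1", "type": "copy_number", "gene": "gene_A", "cn": 3},
]

def map_by_chromosome(doc):
    """Analog of a CouchDB map function: emit (key, value) pairs
    for the documents that match."""
    if doc.get("type") == "gene" and "chromosome" in doc:
        yield doc["chromosome"], doc["_id"]

# Build the 'view' index in memory.
view = {}
for d in docs:
    for key, value in map_by_chromosome(d):
        view.setdefault(key, []).append(value)
```

Documents that lack the emitted fields simply fall out of the view, so the schema can evolve without migrations.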
Discovering Knowledge from AIS Database for Application in VTS
NASA Astrophysics Data System (ADS)
Tsou, Ming-Cheng
The widespread use of the Automatic Identification System (AIS) has had a significant impact on maritime technology. AIS enables the Vessel Traffic Service (VTS) not only to offer commonly known functions such as identification, tracking and monitoring of vessels, but also to provide rich real-time information that is useful for marine traffic investigation, statistical analysis and theoretical research. However, due to the rapid accumulation of AIS observation data, the VTS platform is often unable quickly and effectively to absorb and analyze it. Traditional observation and analysis methods are becoming less suitable for the modern AIS generation of VTS. In view of this, we applied the same data mining technique used for business intelligence discovery (in Customer Relation Management (CRM) business marketing) to the analysis of AIS observation data. This recasts the marine traffic problem as a business-marketing problem and integrates technologies such as Geographic Information Systems (GIS), database management systems, data warehousing and data mining to facilitate the discovery of hidden and valuable information in a huge amount of observation data. Consequently, this provides the marine traffic managers with a useful strategic planning resource.
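A toy example of the kind of aggregation that data mining over an AIS archive enables: binning position reports into grid cells to surface dense traffic areas. The record format, grid size, and hotspot threshold here are invented for illustration:

```python
from collections import Counter

# Hypothetical AIS position reports: (mmsi, lat, lon).
reports = [
    ("211000001", 25.14, 121.74),
    ("211000002", 25.15, 121.75),
    ("211000003", 25.16, 121.74),
    ("211000004", 24.50, 122.30),
]

def grid_cell(lat, lon, size=0.1):
    """Snap a position to a lat/lon grid cell of `size` degrees."""
    return (round(lat // size * size, 4), round(lon // size * size, 4))

density = Counter(grid_cell(lat, lon) for _, lat, lon in reports)
hotspots = [cell for cell, n in density.items() if n >= 3]
```

Over millions of real reports, the same counting step would run inside a database or data warehouse rather than in memory, but the aggregation logic is the same.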
Robertson, Merryn; Callen, Joanne
The profile of health information managers (HIMs) employed within one metropolitan area health service in New South Wales (NSW) was identified, together with which information technology and health informatics knowledge and skills they possess, and which ones they require in their workplace. The subjects worked in a variety of roles: 26% were employed in the area's Information Systems Division developing and implementing point-of-care clinical systems. Health information managers perceived they needed further continuing and formal education in point-of-care clinical systems, decision support systems, the electronic health record, privacy and security, health data collections, and database applications.
Managing and Querying Image Annotation and Markup in XML.
Wang, Fusheng; Pan, Tony; Sharma, Ashish; Saltz, Joel
2010-01-01
Proprietary approaches for representing annotations and image markup are serious barriers for researchers to share image data and knowledge. The Annotation and Image Markup (AIM) project is developing a standards-based information model for image annotation and markup in health care and clinical trial environments. The complex hierarchical structures of the AIM data model pose new challenges for managing such data in terms of performance and support of complex queries. In this paper, we present our work on managing AIM data through a native XML approach, and supporting complex image and annotation queries through a native extension of the XQuery language. Through integration with xService, AIM databases can now be conveniently shared through caGrid.
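As a rough illustration of querying hierarchical annotation data natively in XML: the element names below are a drastic simplification invented for the example, not the actual AIM schema, and ElementTree's XPath-style matching stands in for the XQuery extension the paper describes:

```python
import xml.etree.ElementTree as ET

# Invented, heavily simplified stand-in for an annotation document;
# the real AIM model is a far richer hierarchy.
doc = ET.fromstring("""
<ImageAnnotation>
  <Annotation name="lesion-1">
    <GeometricShape type="Circle" radiusMm="4.2"/>
  </Annotation>
  <Annotation name="lesion-2">
    <GeometricShape type="Polyline"/>
  </Annotation>
</ImageAnnotation>
""")

# Query analogous in spirit to an XQuery over the native XML store:
# find annotations whose shape is a circle.
circles = [a.get("name")
           for a in doc.findall("Annotation")
           if a.find("GeometricShape").get("type") == "Circle"]
```

Keeping the data in XML means such structural queries run directly against the stored form, without flattening the hierarchy into relational tables first.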
Managing and Querying Image Annotation and Markup in XML
Wang, Fusheng; Pan, Tony; Sharma, Ashish; Saltz, Joel
2010-01-01
Proprietary approaches for representing annotations and image markup are serious barriers for researchers to share image data and knowledge. The Annotation and Image Markup (AIM) project is developing a standard based information model for image annotation and markup in health care and clinical trial environments. The complex hierarchical structures of AIM data model pose new challenges for managing such data in terms of performance and support of complex queries. In this paper, we present our work on managing AIM data through a native XML approach, and supporting complex image and annotation queries through native extension of XQuery language. Through integration with xService, AIM databases can now be conveniently shared through caGrid. PMID:21218167
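The native-XML idea above can be illustrated with a small sketch: hierarchical annotations are kept as XML and queried by path expressions rather than flattened into relational tables. The AIM-like schema, element names and ids below are hypothetical, not the actual AIM model.

```python
# Minimal sketch of path-based querying over hierarchical annotation XML.
# The schema here is invented for illustration; AIM's real model is richer.
import xml.etree.ElementTree as ET

doc = """
<annotations>
  <annotation id="a1">
    <entity>nodule</entity>
    <markup type="circle"><x>120</x><y>88</y><r>14</r></markup>
  </annotation>
  <annotation id="a2">
    <entity>mass</entity>
    <markup type="polygon"/>
  </annotation>
</annotations>
"""

root = ET.fromstring(doc)

def find_by_entity(root, name):
    """Return ids of annotations whose <entity> text matches `name`."""
    return [a.get("id") for a in root.findall("annotation")
            if a.findtext("entity") == name]

print(find_by_entity(root, "nodule"))  # ['a1']
```

A native XML database would evaluate the equivalent XQuery/XPath directly over the stored document, avoiding the object-relational mapping step entirely.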
A Workflow-based Intelligent Network Data Movement Advisor with End-to-end Performance Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Michelle M.; Wu, Chase Q.
2013-11-07
Next-generation eScience applications often generate large amounts of simulation, experimental, or observational data that must be shared and managed by collaborative organizations. Advanced networking technologies and services have been rapidly developed and deployed to facilitate such massive data transfer. However, these technologies and services have not been fully utilized, mainly because their use typically requires significant domain knowledge and, in many cases, application users are not even aware of their existence. By leveraging the functionalities of an existing Network-Aware Data Movement Advisor (NADMA) utility, we propose a new Workflow-based Intelligent Network Data Movement Advisor (WINDMA) with end-to-end performance optimization for this DOE-funded project. This WINDMA system integrates three major components: resource discovery, data movement, and status monitoring, and supports the sharing of common data movement workflows through account and database management. The system provides a web interface and interacts with existing data/space management and discovery services such as Storage Resource Management, transport methods such as GridFTP and GlobusOnline, and network resource provisioning brokers such as ION and OSCARS. We demonstrate the efficacy of the proposed transport-support workflow system in several use cases based on its implementation and deployment in DOE wide-area networks.
The Impact of Online Bibliographic Databases on Teaching and Research in Political Science.
ERIC Educational Resources Information Center
Reichel, Mary
The availability of online bibliographic databases greatly facilitates literature searching in political science. The advantages to searching databases online include combination of concepts, comprehensiveness, multiple database searching, free-text searching, currency, current awareness services, document delivery service, and convenience.…
ERIC Educational Resources Information Center
Godbee, Sara; de Jong, Mark
2007-01-01
The University of Maryland University College (UMUC) serves a dispersed patron base, and its library has developed, over time, a circulation system for distributing physical research materials to its patrons throughout the United States. This article discusses the development of this system and its associated interface/database management system…
C. Dillingham; M.R. Poe; E. Grinspoon; C. Stuart; C. Moseley; R. Mazza; S. Charnley; L. Meierotto; E. Donoghue; N. Toth
2008-01-01
This report examines socioeconomic changes that occurred between 1990 and 2003 associated with implementation of the Northwest Forest Plan (the Plan) in and around lands managed by the Okanogan-Wenatchee National Forest in Washington state. Our findings are based on quantitative data from the U.S. census, the USDA Forest Service and other federal databases, historical...
Collaborative Interactive Visualization Exploratory Concept
2015-06-01
the FIAC concepts. It consists of various intelligence analysis web services (DRDC-RDDC-2015-N004) built on top of big data technologies exploited...sits on the UDS where validated common knowledge is stored. Based on the Lumify software, this important component exploits big data technologies such...interfaces. Above this database resides the Big Data Manager, responsible for transparent data transmission between the UDS and the rest of the S3
47 CFR 69.118 - Traffic sensitive switched services.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Docket 86-10, FCC 93-53 (1993). Moreover, all customers that use basic 800 database service shall be... account revenues from the relevant Basic Service Element or Elements and 800 Database Service Elements in...
47 CFR 69.118 - Traffic sensitive switched services.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Docket 86-10, FCC 93-53 (1993). Moreover, all customers that use basic 800 database service shall be... account revenues from the relevant Basic Service Element or Elements and 800 Database Service Elements in...
47 CFR 69.118 - Traffic sensitive switched services.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Docket 86-10, FCC 93-53 (1993). Moreover, all customers that use basic 800 database service shall be... account revenues from the relevant Basic Service Element or Elements and 800 Database Service Elements in...
47 CFR 69.118 - Traffic sensitive switched services.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Docket 86-10, FCC 93-53 (1993). Moreover, all customers that use basic 800 database service shall be... account revenues from the relevant Basic Service Element or Elements and 800 Database Service Elements in...
47 CFR 69.118 - Traffic sensitive switched services.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Docket 86-10, FCC 93-53 (1993). Moreover, all customers that use basic 800 database service shall be... account revenues from the relevant Basic Service Element or Elements and 800 Database Service Elements in...
Hudon, Catherine; Chouinard, Maud-Christine; Lambert, Mireille; Diadiou, Fatoumata; Bouliane, Danielle; Beaudin, Jérémie
2017-01-01
Objective: To identify the key factors of case management (CM) interventions among frequent users of healthcare services found in empirical studies of effectiveness. Design: Thematic analysis review of CM studies. Methods: We built on a previously published review that aimed to report the effectiveness of CM interventions for frequent users of healthcare services, using the Medline, Scopus and CINAHL databases covering the January 2004–December 2015 period, then updated to July 2017, with the keywords ‘CM’ and ‘frequent use’. We extracted factors of successful (n=7) and unsuccessful (n=6) CM interventions and conducted a mixed thematic analysis to synthesise findings. Chaudoir’s implementation of health innovations framework was used to organise results into four broad levels of factors: (1) environmental/organisational level, (2) practitioner level, (3) patient level and (4) programme level. Results: Access to, and close partnerships with, healthcare providers and community services resources were key factors of successful CM interventions, which should target patients with the greatest needs and promote frequent contacts with the healthcare team. The selection and training of the case manager was also an important factor in fostering patient engagement in CM. Coordination of care, self-management support and assistance with care navigation were key CM activities. The main issues reported by unsuccessful CM interventions were problems with case finding or lack of care integration. Conclusions: CM interventions for frequent users of healthcare services should ensure adequate case finding processes, rigorous selection and training of the case manager, sufficient intensity of the intervention, as well as good care integration among all partners. Other studies could further evaluate the influence of contextual factors on intervention impacts. PMID:29061623
Fennelly, Orna; Blake, Catherine; FitzGerald, Oliver; Breen, Roisin; Ashton, Jennifer; Brennan, Aisling; Caffrey, Aoife; Desmeules, François; Cunningham, Caitriona
2018-06-01
Many people with musculoskeletal (MSK) disorders wait several months or years for Consultant Doctor appointments, despite often not requiring medical or surgical interventions. To allow earlier patient access to orthopaedic and rheumatology services in Ireland, Advanced Practice Physiotherapists (APPs) were introduced at 16 major acute hospitals. This study performed the first national evaluation of APP triage services. Throughout 2014, APPs (n = 22) entered clinical data on a national database. Analysis of these data using descriptive statistics determined patient wait times, Consultant Doctor involvement in clinical decisions, and patient clinical outcomes. Chi square tests were used to compare patient clinical outcomes across orthopaedic and rheumatology clinics. A pilot study at one site identified re-referral rates to orthopaedic/rheumatology services of patients managed by the APPs. In one year, 13,981 new patients accessed specialist orthopaedic and rheumatology consultations via the APP. Median wait time for an appointment was 5.6 months. Patients most commonly presented with knee (23%), lower back (22%) and shoulder (15%) disorders. APPs made autonomous clinical decisions regarding patient management at 77% of appointments, and managed patient care pathways without onward referral to Consultant Doctors in more than 80% of cases. Other onward clinical pathways recommended by APPs were: physiotherapy referrals (42%); clinical investigations (29%); injections administered (4%); and surgical listing (2%). Of those managed by the APP, the pilot study identified that only 6.5% of patients were re-referred within one year. This national evaluation of APP services demonstrated that the majority of patients assessed by an APP did not require onward referral for a Consultant Doctor appointment. Therefore, patients gained earlier access to orthopaedic and rheumatology consultations in secondary care, with most patients conservatively managed.
TWRS technical baseline database manager definition document
DOE Office of Scientific and Technical Information (OSTI.GOV)
Acree, C.D.
1997-08-13
This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). This document will provide a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and the present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager.
A Multi-Purpose Data Dissemination Infrastructure for the Marine-Earth Observations
NASA Astrophysics Data System (ADS)
Hanafusa, Y.; Saito, H.; Kayo, M.; Suzuki, H.
2015-12-01
To open the data from a variety of observations, the Japan Agency for Marine-Earth Science and Technology (JAMSTEC) has developed a multi-purpose data dissemination infrastructure. Although many observations have been made in the earth sciences, not all of the data are fully open. We think data centers can provide researchers with a universal data dissemination service that handles various kinds of observation data with little effort. For this purpose the JAMSTEC Data Management Office has developed the "Information Catalog Infrastructure System (Catalog System)". This is a catalog management system that can create, renew and delete catalogs (= databases) and has the following features:
- The Catalog System does not depend on data types or granularity of data records.
- By registering a new metadata schema to the system, a new database can be created on the same system without system modification.
- As web pages are defined by cascading style sheets, databases can have different look and feel, and operability.
- The Catalog System provides databases with basic search tools: search by text, selection from a category tree, and selection from a time line chart.
- For domestic users it creates Japanese and English pages at the same time and has a dictionary to control terminology and proper nouns.
As of August 2015 JAMSTEC operates 7 databases on the Catalog System. We expect to transfer existing databases to this system, or create new databases on it. In comparison with a dedicated database developed for a specific dataset, the Catalog System is suitable for the dissemination of small datasets at minimum cost. Metadata held in the catalogs may be transferred to other metadata schemas for exchange with global databases or portals.
Examples:
JAMSTEC Data Catalog: http://www.godac.jamstec.go.jp/catalog/data_catalog/metadataList?lang=en
JAMSTEC Document Catalog: http://www.godac.jamstec.go.jp/catalog/doc_catalog/metadataList?lang=en&tab=category
Research Information and Data Access Site of TEAMS: http://www.i-teams.jp/catalog/rias/metadataList?lang=en&tab=list
Narendran, Rajesh C; Duarte, Rui V; Valyi, Andrea; Eldabe, Sam
2015-06-30
The aim of this study was to evaluate changes in the uptake of intrathecal baclofen (ITB) following commissioning of this therapy by the National Health Service (NHS) England in April 2013. The specific objectives of this study were: (i) to explore the gap between the need for and the actual provision of ITB services; and (ii) to compare England figures with those of other European countries with comparable data available. Data for ITB-related procedures were obtained from the Hospital Episode Statistics (HES) database from 2009/2010 to 2013/2014. Participants were patients receiving ITB for the management of spasticity. The available data for implantation of ITB from 2009/2010 to 2013/2014 for the treatment of spasticity due to varied aetiologies show that there has not been an increase in uptake of this therapy. The estimated need for this treatment based on the incidence and prevalence of conditions susceptible to ITB therapy is between 4.6 and 5.7 per million population. Our analysis of the data available from the HES database showed that the actual number of implants is around 3.0 per million population. The same period 2009-2014 has seen an increase in the delivery of other neuromodulation techniques including spinal cord stimulation, deep brain stimulation and sacral nerve stimulation. There is a considerable gap between the need for and the provision of ITB nationally. Additionally, within the same area, we have observed important differences in ITB service delivery between the various trusts. The reasons for this can be multifactorial, including individual experience and opinions, organisational structures, and resource and financial limitations. Further research analysing the efficacy and cost-effectiveness of this treatment in the UK might inform the development of Technology Appraisal Guidance for ITB, potentially leading to an improvement in service provision. Published by the BMJ Publishing Group Limited.
Doing Your Science While You're in Orbit
NASA Astrophysics Data System (ADS)
Green, Mark L.; Miller, Stephen D.; Vazhkudai, Sudharshan S.; Trater, James R.
2010-11-01
Large-scale neutron facilities such as the Spallation Neutron Source (SNS) located at Oak Ridge National Laboratory need easy-to-use access to Department of Energy Leadership Computing Facilities and experiment repository data. The Orbiter thick- and thin-client and its supporting Service Oriented Architecture (SOA) based services (available at https://orbiter.sns.gov) consist of standards-based components that are reusable and extensible for accessing high performance computing, data and computational grid infrastructure, and cluster-based resources easily from a user configurable interface. The primary Orbiter system goals consist of (1) developing infrastructure for the creation and automation of virtual instrumentation experiment optimization, (2) developing user interfaces for thin- and thick-client access, (3) provide a prototype incorporating major instrument simulation packages, and (4) facilitate neutron science community access and collaboration. The secure Orbiter SOA authentication and authorization is achieved through the developed Virtual File System (VFS) services, which use Role-Based Access Control (RBAC) for data repository file access, thin-and thick-client functionality and application access, and computational job workflow management. The VFS Relational Database Management System (RDMS) consists of approximately 45 database tables describing 498 user accounts with 495 groups over 432,000 directories with 904,077 repository files. Over 59 million NeXus file metadata records are associated to the 12,800 unique NeXus file field/class names generated from the 52,824 repository NeXus files. 
Services that enable (a) summary dashboards of data repository status with Quality of Service (QoS) metrics, (b) data repository NeXus file field/class name full text search capabilities within a Google like interface, (c) fully functional RBAC browser for the read-only data repository and shared areas, (d) user/group defined and shared metadata for data repository files, (e) user, group, repository, and web 2.0 based global positioning with additional service capabilities are currently available. The SNS based Orbiter SOA integration progress with the Distributed Data Analysis for Neutron Scattering Experiments (DANSE) software development project is summarized with an emphasis on DANSE Central Services and the Virtual Neutron Facility (VNF). Additionally, the DANSE utilization of the Orbiter SOA authentication, authorization, and data transfer services best practice implementations are presented.
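The RBAC scheme described above can be reduced to a simple sketch: users carry roles, roles grant permissions, and an access check succeeds if any role grants the requested action. The role and user names below are hypothetical, not taken from the Orbiter VFS.

```python
# Minimal role-based access control (RBAC) check, in the spirit of the
# Orbiter VFS services described above. Role/user data are invented.
ROLE_PERMS = {
    "instrument-scientist": {"read", "write"},
    "collaborator": {"read"},
}
USER_ROLES = {
    "alice": {"instrument-scientist"},
    "bob": {"collaborator"},
}

def allowed(user, action):
    """A user may perform `action` if any of their roles grants it."""
    return any(action in ROLE_PERMS.get(role, set())
               for role in USER_ROLES.get(user, set()))

print(allowed("bob", "write"))  # False: collaborators are read-only
```

In a production system such as the one described, the role and permission tables would live in the RDMS rather than in literals, but the check itself has this shape.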
[The role of supply-side characteristics of services in AIDS mortality in Mexico].
Bautista-Arredondo, Sergio; Serván-Mori, Edson; Silverman-Retana, Omar; Contreras-Loya, David; Romero-Martínez, Martín; Magis-Rodríguez, Carlos; Uribe-Zúñiga, Patricia; Lozano, Rafael
2015-01-01
To document the association between supply-side determinants and AIDS mortality in Mexico between 2008 and 2013. We analyzed the SALVAR database (system for antiretroviral management, logistics and surveillance) as well as data collected through a nationally representative survey in health facilities. We used multivariate logit regression models to estimate the association between supply-side characteristics, namely management, training and experience of health care providers, and AIDS mortality, distinguishing early and non-early mortality and controlling for clinical indicators of the patients. The clinical status of patients (initial CD4 count and viral load) explains 44.4% of the variability in early mortality across clinics and 13.8% of the variability in non-early mortality. Supply-side characteristics increase the explanatory power of the models by 16% in the case of early mortality, and 96% in the case of non-early mortality. Aspects of management and implementation of services contribute significantly to explaining AIDS mortality in Mexico. Improving these aspects of the national program can similarly improve its results.
A portal for the ocean biogeographic information system
Zhang, Yunqing; Grassle, J. F.
2002-01-01
Since its inception in 1999 the Ocean Biogeographic Information System (OBIS) has developed into an international science program as well as a globally distributed network of biogeographic databases. An OBIS portal at Rutgers University provides the links and functional interoperability among member database systems. Protocols and standards have been established to support effective communication between the portal and these functional units. The portal provides distributed data searching, a taxonomy name service, a GIS with access to relevant environmental data, biological modeling, and education modules for mariners, students, environmental managers, and scientists. The portal will integrate Census of Marine Life field projects, national data archives, and other functional modules, and provides for network-wide analyses and modeling tools.
An effective XML based name mapping mechanism within StoRM
NASA Astrophysics Data System (ADS)
Corso, E.; Forti, A.; Ghiselli, A.; Magnoni, L.; Zappi, R.
2008-07-01
In a Grid environment the naming capability allows users to refer to specific data resources in a physical storage system using a high-level logical identifier. This logical identifier is typically organized in a file-system-like structure, a hierarchical tree of names. Storage Resource Manager (SRM) services map the logical identifier to the physical location of data by evaluating a set of parameters such as the desired quality of service and the VOMS attributes specified in the requests. StoRM is an SRM service developed by INFN and ICTP-EGRID to manage files and space on standard POSIX and high-performing parallel and cluster file systems. An upcoming requirement in the Grid data scenario is the orthogonality of the logical name and the physical location of data, in order to refer, with the same identifier, to different copies of data archived in various storage areas with different qualities of service. The mapping mechanism proposed in StoRM is based on an XML document that represents the different storage components managed by the service, the storage areas defined by the site administrator, the quality of service they provide and the Virtual Organizations that want to use the storage areas. An appropriate directory tree is realized in each storage component reflecting the XML schema. In this scenario StoRM is able to identify the physical location of requested data by evaluating the logical identifier and the specified attributes following the XML schema, without querying any database service. This paper presents the namespace schema defined, the different entities represented and the technical details of the StoRM implementation.
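The database-free mapping idea can be sketched as follows: an XML namespace document lists storage areas with their VO and quality of service, and a logical name resolves to a physical path by matching attributes against that document. The element and attribute names below are hypothetical, not the actual StoRM namespace schema.

```python
# Sketch of XML-driven logical-to-physical name mapping, in the spirit of
# the StoRM mechanism described above. The schema here is invented.
import xml.etree.ElementTree as ET

config = """
<namespace>
  <storage-area name="atlas-disk" vo="atlas" quality="replica"
                root="/storage/disk/atlas"/>
  <storage-area name="atlas-tape" vo="atlas" quality="custodial"
                root="/storage/tape/atlas"/>
</namespace>
"""

tree = ET.fromstring(config)

def resolve(logical_name, vo, quality):
    """Map a logical name to a physical path without any database lookup."""
    for sa in tree.findall("storage-area"):
        if sa.get("vo") == vo and sa.get("quality") == quality:
            return sa.get("root") + "/" + logical_name.lstrip("/")
    raise KeyError("no storage area for %s/%s" % (vo, quality))

print(resolve("/data/run1/file.root", "atlas", "custodial"))
# /storage/tape/atlas/data/run1/file.root
```

Because the whole mapping lives in one document evaluated in memory, the same logical identifier can point at different replicas simply by varying the requested quality of service.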
Clark, Malissa A; O'Neal, Catherine W; Conley, Kate M; Mancini, Jay A
2018-01-01
Deployment affects not just the service members, but also their family members back home. Accordingly, this study examined how resilient family processes during a deployment (i.e., frequency of communication and household management) were related to the personal reintegration of each family member (i.e., how well each family member begins to "feel like oneself again" after a deployment), as well as several indicators of subjective well-being. Drawing from the family attachment network model (Riggs & Riggs, 2011), the present study collected survey data from 273 service members, their partners, and their adolescent children. Resilient family processes during the deployment itself (i.e., frequency of communication, household management), postdeployment positive and negative personal reintegration, and several indicators of well-being were assessed. Frequency of communication was related to personal reintegration for service members, while household management was related to personal reintegration for nondeployed partners; both factors were related to personal reintegration for adolescents. Negative and positive personal reintegration related to a variety of subjective well-being outcomes for each individual family member. Interindividual (i.e., crossover) effects were also found, particularly between adolescents and nondeployed partners. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
NASA Astrophysics Data System (ADS)
Sipos, Roland; Govi, Giacomo; Franzoni, Giovanni; Di Guida, Salvatore; Pfeiffer, Andreas
2017-10-01
The CMS experiment at the CERN LHC has a dedicated infrastructure to handle the alignment and calibration data. This infrastructure is composed of several services, which take on various data management tasks required for the consumption of the non-event data (also called condition data) in the experiment's activities. The criticality of these tasks imposes tight requirements on the availability and the reliability of the services executing them. In this scope, a comprehensive monitoring and alarm-generating system has been developed. The system has been implemented based on Nagios, the open-source industry standard for monitoring and alerting services, and monitors the database back-end, the hosting nodes and key heart-beat functionalities for all the services involved. This paper describes the design, implementation and operational experience with the monitoring system developed and deployed at CMS in 2016.
Shea, S; Sengupta, S; Crosswell, A; Clayton, P D
1992-01-01
The developing Integrated Academic Information System (IAIMS) at Columbia-Presbyterian Medical Center provides data sharing links between two separate corporate entities, namely Columbia University Medical School and The Presbyterian Hospital, using a network-based architecture. Multiple database servers with heterogeneous user authentication protocols are linked to this network. "One-stop information shopping" implies one log-on procedure per session, not separate log-on and log-off procedures for each server or application used during a session. These circumstances pose challenges, at both the policy and technical levels, for securing data at the network level and ensuring smooth information access for end users of these network-based services. Five activities being conducted as part of our security project are described: (1) policy development; (2) an authentication server for the network; (3) Kerberos as a tool for providing mutual authentication, encryption, and time stamping of authentication messages; (4) a prototype interface using Kerberos services to authenticate users accessing a network database server; and (5) a Kerberized electronic signature.
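The time-stamped authenticator that Kerberos uses for mutual authentication can be illustrated with a minimal keyed-hash sketch: a client proves knowledge of a shared session key and a fresh timestamp, and the verifier rejects stale or tampered messages. This is a simplified HMAC illustration under invented names, not the Kerberos protocol itself.

```python
# Sketch: time-stamped, keyed authenticator in the spirit of the Kerberos
# mechanisms mentioned above. SHARED_KEY is a hypothetical session key.
import hashlib
import hmac
import time

SHARED_KEY = b"session-key"   # would come from the ticket-granting exchange
MAX_SKEW = 300                # seconds of allowed clock skew

def make_authenticator(user, now=None):
    """Produce (user, timestamp, tag) bound together by an HMAC."""
    ts = str(int(now if now is not None else time.time()))
    msg = (user + "|" + ts).encode()
    tag = hmac.new(SHARED_KEY, msg, hashlib.sha256).hexdigest()
    return user, ts, tag

def verify(user, ts, tag, now=None):
    """Accept only a fresh timestamp with a matching HMAC tag."""
    msg = (user + "|" + ts).encode()
    expected = hmac.new(SHARED_KEY, msg, hashlib.sha256).hexdigest()
    fresh = abs((now if now is not None else time.time()) - int(ts)) <= MAX_SKEW
    return fresh and hmac.compare_digest(expected, tag)

u, ts, tag = make_authenticator("alice")
print(verify(u, ts, tag))  # True
```

Real Kerberos additionally encrypts the authenticator and supports server-to-client replies for mutual authentication; the freshness window above plays the role of its clock-skew check.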
Gleason, Robert A.; Tangen, Brian A.; Laubhan, Murray K.; Finocchiaro, Raymond G.; Stamm, John F.
2009-01-01
Long-term accumulation of salts in wetlands at Bowdoin National Wildlife Refuge (NWR), Mont., has raised concern among wetland managers that increasing salinity may threaten plant and invertebrate communities that provide important habitat and food resources for migratory waterfowl. Currently, the U.S. Fish and Wildlife Service (USFWS) is evaluating various water management strategies to help maintain suitable ranges of salinity to sustain plant and invertebrate resources of importance to wildlife. To support this evaluation, the USFWS requested that the U.S. Geological Survey (USGS) provide information on salinity ranges of water and soil for common plants and invertebrates on Bowdoin NWR lands. To address this need, we conducted a search of the literature on occurrences of plants and invertebrates in relation to salinity and pH of the water and soil. The compiled literature was used to (1) provide a general overview of salinity concepts, (2) document published tolerances and adaptations of biota to salinity, (3) develop databases that the USFWS can use to summarize the range of reported salinity values associated with plant and invertebrate taxa, and (4) perform database summaries that describe reported salinity ranges associated with plants and invertebrates at Bowdoin NWR. The purpose of this report is to synthesize information to facilitate a better understanding of the ecological relations between salinity and flora and fauna when developing wetland management strategies. A primary focus of this report is to provide information to help evaluate and address salinity issues at Bowdoin NWR; however, the accompanying databases, as well as concepts and information discussed, are applicable to other areas or refuges. The accompanying databases include salinity values reported for 411 plant taxa and 330 invertebrate taxa. 
The databases are available in Microsoft Excel version 2007 (http://pubs.usgs.gov/sir/2009/5098/downloads/databases_21april2009.xls) and contain 27 data fields that include variables such as taxonomic identification, values for salinity and pH, wetland classification, location of study, and source of data. The databases are not exhaustive of the literature and are biased toward wetland habitats located in the glaciated North-Central United States; however, the databases do encompass a diversity of biota commonly found in brackish and freshwater inland wetland habitats.
Development of Online Database Services in Japan and Perspectives on Asia.
ERIC Educational Resources Information Center
Miyakawa, Takayasu
This paper outlines the market developments, governmental promotion policies, and efforts by private industries for online database services in Japan since the late 1970s. The combination of these efforts over the years has resulted in an online database service market of US$20 billion annually, of which approximately one third is Western online…
A component-based, distributed object services architecture for a clinical workstation.
Chueh, H C; Raila, W F; Pappas, J J; Ford, M; Zatsman, P; Tu, J; Barnett, G O
1996-01-01
Attention to an architectural framework in the development of clinical applications can promote reusability of both legacy systems and newly designed software. We describe one approach to an architecture for a clinical workstation application which is based on a critical middle tier of distributed object-oriented services. This tier of network-based services provides flexibility in the creation of both the user interface and the database tiers. We developed a clinical workstation for ambulatory care using this architecture, defining a number of core services including those for vocabulary, patient index, documents, charting, security, and encounter management. These services can be implemented through proprietary or more standard distributed object interfaces such as CORBA and OLE. Services are accessed over the network by a collection of user interface components which can be mixed and matched to form a variety of interface styles. These services have also been reused with several applications based on World Wide Web browser interfaces. PMID:8947744
Effectiveness of 'rehabilitation in the home' service.
Bharadwaj, Sneha; Bruce, David
2014-11-01
Rehabilitation in the home (RITH) services increasingly provide hospital substitution services. This study examines clinical outcomes in a large metropolitan RITH service in Western Australia. The 2010 database of Fremantle Hospital RITH service was interrogated to identify the clinical profile of cases, length of stay (LOS) and clinical outcomes. Negative outcomes included death or unexpected hospital readmission. Multiple logistic regression modelling was used to explore associations with negative outcomes. This study was reviewed by the Institutional Review Board which deemed it not to require ethics approval. There were 1348 cases managed by RITH: 70.6% were aged≥65 years; elective joint replacement (29.7%), medical conditions (20%), stroke (13%), hip fractures (10%) were major contributors. The majority (93.3%) were discharged after a median of 9 days. Negative outcomes occurred in 90 cases (6.7%), including five deaths (0.4%) and 85 readmissions (6.3%). Independent associations with negative outcomes included older age (odds ratio (OR) (95% CI); 1.02, P=0.006), orthopaedic conditions (OR 1.91, P=0.004) and longer inpatient LOS (OR 1.96, P=0.003). Age above 80 years was independently associated with risk of negative outcome (OR 2.99, P=0.004). RITH had a low rate of negative outcomes. The database proved useful for monitoring quality of service provision. WHAT IS KNOWN ABOUT THE TOPIC?: Rehabilitation in the home environment has proven cost effective for multiple conditions, particularly stroke and elective joint surgery, among others, facilitating better quality of life, with reduced rates of delirium and mortality. Overall there are few negative outcomes and death is rare. WHAT DOES THIS PAPER ADD?: Although RITH services are widely utilised as bed substitution services, there is scant literature on clinical outcomes while within the service. 
This study focuses on frequency of good and poor clinical outcomes in a well-established RITH service in Western Australia, suggesting pattern recognition of an at-risk cohort by identifying potentially useful predictors of poor outcome. WHAT ARE THE IMPLICATIONS FOR PRACTITIONERS?: RITH services are a safe alternative for many, including older people. Health administration databases are useful tools to monitor clinical outcomes. Clinical indicators such as older age, long hospital stay and orthopaedic diagnoses may be useful predictors of poor outcomes in such services.
Short Fiction on Film: A Relational DataBase.
ERIC Educational Resources Information Center
May, Charles
Short Fiction on Film is a database that was created and will run on DataRelator, a relational database manager created by Bill Finzer for the California State Department of Education in 1986. DataRelator was designed for use in teaching students database management skills and to provide teachers with examples of how a database manager might be…
47 CFR 0.241 - Authority delegated.
Code of Federal Regulations, 2012 CFR
2012-10-01
... database functions for unlicensed devices operating in the television broadcast bands (TV bands) as set... methods that will be used to designate TV bands database managers, to designate these database managers; to develop procedures that these database managers will use to ensure compliance with the...
47 CFR 0.241 - Authority delegated.
Code of Federal Regulations, 2013 CFR
2013-10-01
... database functions for unlicensed devices operating in the television broadcast bands (TV bands) as set... methods that will be used to designate TV bands database managers, to designate these database managers; to develop procedures that these database managers will use to ensure compliance with the...
47 CFR 0.241 - Authority delegated.
Code of Federal Regulations, 2011 CFR
2011-10-01
... database functions for unlicensed devices operating in the television broadcast bands (TV bands) as set... methods that will be used to designate TV bands database managers, to designate these database managers; to develop procedures that these database managers will use to ensure compliance with the...
Improving healthcare services using web based platform for management of medical case studies.
Ogescu, Cristina; Plaisanu, Claudiu; Udrescu, Florian; Dumitru, Silviu
2008-01-01
The paper presents a web-based platform for the management of medical cases that supports healthcare specialists in making the best clinical decision. Research has been oriented mostly towards multimedia data management and classification algorithms for querying, retrieving and processing different medical data types (text and images). The medical case studies can be accessed by healthcare specialists, and by students as anonymous case studies, providing trust and confidentiality in the Internet virtual environment. The MIDAS platform develops an intelligent framework to manage sets of medical data (text, static or dynamic images) in order to optimize the diagnosis and decision process, which will reduce medical errors and increase the quality of medical care. MIDAS is an integrated project working on medical information retrieval from heterogeneous, distributed medical multimedia databases.
Management and assimilation of diverse, distributed watershed datasets
NASA Astrophysics Data System (ADS)
Varadharajan, C.; Faybishenko, B.; Versteeg, R.; Agarwal, D.; Hubbard, S. S.; Hendrix, V.
2016-12-01
The U.S. Department of Energy's (DOE) Watershed Function Scientific Focus Area (SFA) seeks to determine how perturbations to mountainous watersheds (e.g., floods, drought, early snowmelt) impact the downstream delivery of water, nutrients, carbon, and metals over seasonal to decadal timescales. We are building a software platform that enables integration of diverse and disparate field, laboratory, and simulation datasets of various types, including hydrological, geological, meteorological, geophysical, geochemical, ecological and genomic datasets, across a range of spatial and temporal scales within the Rifle floodplain and the East River watershed, Colorado. We are using agile data management and assimilation approaches to enable web-based integration of heterogeneous, multi-scale data. Sensor-based observations of water level, vadose-zone and groundwater temperature, water quality, and meteorology, as well as biogeochemical analyses of soil and groundwater samples, have been curated and archived in federated databases. Quality Assurance and Quality Control (QA/QC) are performed on priority datasets needed for ongoing scientific analyses and for hydrological and geochemical modeling. Automated QA/QC methods are used to identify and flag issues in the datasets. Data integration is achieved via a brokering service that dynamically integrates data from distributed databases via web services, based on user queries. The integrated results are presented to users in a portal that enables intuitive search, interactive visualization and download of integrated datasets. The concepts, approaches and codes being used are shared across the data science components of several large DOE-funded projects, such as the Watershed Function SFA, Next Generation Ecosystem Experiment (NGEE) Tropics, Ameriflux/FLUXNET, and Advanced Simulation Capability for Environmental Management (ASCEM), and together contribute towards DOE's cyberinfrastructure for data management and model-data integration.
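The query-driven brokering step can be illustrated with a minimal sketch, where in-memory dictionaries stand in for the remote federated databases. Site names, source names and records are invented for illustration; the real platform issues web-service calls instead:

```python
# Hypothetical in-memory stand-ins for two federated archives.
hydro_db = {"well_3": [("2016-05-01", 2.1), ("2016-05-02", 2.3)]}
geochem_db = {"well_3": [("2016-05-01", 7.8)]}

def broker_query(site, sources):
    """Pull every registered source's records for one site and return them
    merged under the source name, mirroring the brokering pattern: the user
    issues one query, the broker fans it out and integrates the results."""
    return {name: db.get(site, []) for name, db in sources.items()}

result = broker_query("well_3", {"hydrology": hydro_db,
                                 "geochemistry": geochem_db})
```

The integrated `result` is what a portal would then render for search, visualization and download.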
NASA Astrophysics Data System (ADS)
Bagnasco, S.; Berzano, D.; Guarise, A.; Lusso, S.; Masera, M.; Vallero, S.
2015-12-01
The INFN computing centre in Torino hosts a private Cloud, which is managed with the OpenNebula cloud controller. The infrastructure offers Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) services to different scientific computing applications. The main stakeholders of the facility are a grid Tier-2 site for the ALICE collaboration at LHC, an interactive analysis facility for the same experiment and a grid Tier-2 site for the BESIII collaboration, plus an increasing number of other small tenants. The dynamic allocation of resources to tenants is partially automated. This feature requires detailed monitoring and accounting of resource usage. We set up a monitoring framework to inspect the site activities in terms of both the IaaS and the applications running on the hosted virtual instances. For this purpose we used the ElasticSearch, Logstash and Kibana (ELK) stack. The infrastructure relies on a MySQL database back-end for data preservation and to ensure the flexibility to choose a different monitoring solution if needed. The heterogeneous accounting information is transferred from the database to the ElasticSearch engine via a custom Logstash plugin. Each use case is indexed separately in ElasticSearch, and we set up Kibana dashboards with pre-defined queries to monitor the relevant information in each case. For the IaaS metering, we developed sensors for the OpenNebula API. The IaaS-level information gathered through the API is sent to the MySQL database through a RESTful web service developed for this purpose. Moreover, we have developed a billing system for our private Cloud, which relies on the RabbitMQ message queue for asynchronous communication with the database and on the ELK stack for its graphical interface. The Italian Grid accounting framework is also migrating to a similar set-up. Concerning the application level, we used the ROOT plugin TProofMonSenderSQL to collect accounting data from the interactive analysis facility. 
The BESIII virtual instances used to be monitored with Zabbix; as a proof of concept, we also retrieve the information contained in the Zabbix database. In this way we have achieved a uniform monitoring interface for both the IaaS and the scientific applications, mostly leveraging off-the-shelf tools. At present, we are working to define a model for monitoring-as-a-service, based on the tools described above, which the Cloud tenants can easily configure to suit their specific needs.
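The database-to-ElasticSearch transfer described above amounts to a data transformation: accounting rows become bulk-API action/document pairs, with one index per use case. The field names below are hypothetical and the actual component is a custom Logstash plugin; this is only a sketch of the shape of that transformation:

```python
def to_bulk_actions(rows, index_prefix="accounting"):
    """Translate accounting rows (dicts pulled from a relational back-end)
    into Elasticsearch bulk-API action/document pairs, routing each row to
    a per-tenant index as in the per-use-case indexing described above."""
    actions = []
    for row in rows:
        # Action line: which index this document goes to.
        actions.append({"index": {"_index": f"{index_prefix}-{row['tenant']}"}})
        # Document line: everything except the routing field.
        actions.append({k: v for k, v in row.items() if k != "tenant"})
    return actions

rows = [{"tenant": "alice-t2", "cpu_hours": 120.5, "date": "2015-06-01"}]
actions = to_bulk_actions(rows)
```

The resulting list alternates action and document entries, which is the structure the Elasticsearch bulk endpoint consumes.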
Theodoros, Deborah; Aldridge, Danielle; Hill, Anne J; Russell, Trevor
2018-06-19
Communication and swallowing disorders are highly prevalent in people with Parkinson's disease (PD). Maintenance of functional communication and swallowing over time is challenging for the person with PD and their families and may lead to social isolation and reduced quality of life if not addressed. Speech and language therapists (SLTs) face the conundrum of providing sustainable and flexible services to meet the changing needs of people with PD. Motor, cognitive and psychological issues associated with PD, medication regimens and dependency on others often impede attendance at a centre-based service. The access difficulties experienced by people with PD require a disruptive service approach to meet their needs. Technology-enabled management using information and telecommunications technologies to provide services at a distance has the potential to improve access, and enhance the quality of SLT services to people with PD. To report the status and scope of the evidence for the use of technology in the management of the communication and swallowing disorders associated with PD. Studies were retrieved from four major databases (PubMed, CINAHL, EMBASE and Medline via Web of Science). Data relating to the types of studies, level of evidence, context, nature of the management undertaken, participant perspectives and the types of technologies involved were extracted for the review. A total of 17 studies were included in the review, 15 of which related to the management of communication and swallowing disorders in PD with two studies devoted to participant perspectives. The majority of the studies reported on the treatment of the speech disorder in PD using Lee Silverman Voice Treatment (LSVT LOUD ® ). Synchronous and asynchronous technologies were used in the studies with a predominance of the former. There was a paucity of research in the management of cognitive-communication and swallowing disorders. 
Research evidence supporting technology-enabled management of the communication and swallowing disorders in PD is limited and predominantly low in quality. The treatment of the speech disorder online is the most developed aspect of the technology-enabled management pathway. Future research needs to address technology-enabled management of cognitive-communication and swallowing disorders and the use of a more diverse range of technologies and management approaches to optimize SLT service delivery to people with PD. © 2018 Royal College of Speech and Language Therapists.
Microcomputer Database Management Systems for Bibliographic Data.
ERIC Educational Resources Information Center
Pollard, Richard
1986-01-01
Discusses criteria for evaluating microcomputer database management systems (DBMS) used for storage and retrieval of bibliographic data. Two popular types of microcomputer DBMS--file management systems and relational database management systems--are evaluated with respect to these criteria. (Author/MBR)
The Data Base and Decision Making in Public Schools.
ERIC Educational Resources Information Center
Hedges, William D.
1984-01-01
Describes generic types of databases--file management systems, relational database management systems, and network/hierarchical database management systems--with their respective strengths and weaknesses; discusses factors to be considered in determining whether a database is desirable; and provides evaluative criteria for use in choosing…
NASA Astrophysics Data System (ADS)
Boulanger, D.; Thouret, V.
2016-12-01
IAGOS (In-service Aircraft for a Global Observing System) is a European Research Infrastructure that aims to provide long-term, regular and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft and measure aerosols, cloud particles, greenhouse gases, ozone, water vapor and nitrogen oxides from the surface to the lower stratosphere. The IAGOS database is an essential part of the global atmospheric monitoring network. It contains IAGOS-core and IAGOS-CARIBIC data. The IAGOS Data Portal (http://www.iagos.fr) is part of the French atmospheric chemistry data center AERIS (http://www.aeris-data.fr). In 2016 the new IAGOS Data Portal was released. In addition to data download, the portal provides improved and new services such as download in NetCDF or NASA Ames formats and plotting tools (maps, time series, vertical profiles). New added-value products are available through the portal: back trajectories, origin of air masses, and co-location with satellite data. Web services allow users to download IAGOS metadata such as flight and airport information. Administration tools have been implemented for user management and instrument monitoring. A major improvement is interoperability with international portals and other databases in order to improve IAGOS data discovery. In the frame of the IGAS project (IAGOS for the Copernicus Atmospheric Service), a data network has been set up. It is composed of three data centers: the IAGOS database in Toulouse, the HALO research aircraft database at DLR (https://halo-db.pa.op.dlr.de) and the CAMS (Copernicus Atmosphere Monitoring Service) data center in Jülich (http://join.iek.fz-juelich.de). The link with the CAMS data center, through the JOIN interface, makes it possible to combine model outputs with IAGOS data for inter-comparison. The CAMS project is a prominent user of the IGAS data network. 
During the next year IAGOS will improve metadata standardization and dissemination through collaborations with the AERIS data center, with GAW, for which IAGOS is a contributing network, and with the ENVRI+ European project. Measurement traceability and quality metadata will be made available, and DOIs will be implemented.
Joint Battlespace Infosphere: Information Management Within a C2 Enterprise
2005-06-01
...using. In version 1.2, we support both MySQL and Oracle as underlying implementations, where the XML metadata schema is mapped into relational tables in... Identity Servers, Role-Based Access Control, and Policy Representation – Databases: Oracle, MySQL, TigerLogic, Berkeley XML DB... Instrumentation Services... converted to SQL for execution. Invocations are then forwarded to the appropriate underlying IOR core components that have the responsibility of issuing...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ham, Timothy
2008-12-01
The JBEI Registry is software for storing and managing a database of biological parts. It is intended to be used as a web service accessed via a web browser, and it can also run as a desktop program for a single user. The registry software stores, indexes, and categorizes biological parts, and allows users to enter, search, retrieve, and construct biological constructs in silico. It can also communicate with other Registries for data sharing and exchange.
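The store/index/search core of a parts registry can be sketched with an embedded relational store. The table layout, part names, and sequences below are illustrative only, not the JBEI schema:

```python
import sqlite3

# Minimal sketch of a biological-parts registry: store records, then
# search them by name. All fields and entries are invented examples.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parts (name TEXT, type TEXT, sequence TEXT)")
conn.execute("INSERT INTO parts VALUES ('pBAD promoter', 'promoter', 'ACCT...')")
conn.execute("INSERT INTO parts VALUES ('GFPmut3', 'cds', 'ATGCGT...')")

def search(term):
    """Case-insensitive substring search over part names, using a
    parameterized query to avoid SQL injection."""
    cur = conn.execute("SELECT name FROM parts WHERE name LIKE ?",
                       (f"%{term}%",))
    return [r[0] for r in cur.fetchall()]
```

A real registry would add categorization tables, sequence indexing, and the inter-registry exchange API mentioned in the abstract.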
Air Force Civil Engineer, Volume 14, Number 2, 2006
2006-01-01
...homes located off base), reimbursable service agreements are created between the housing development's project owner and both the Air Force... Just as with security, fire protection is provided by the on-base fire department on a reimbursable basis. At a recent fire at Hanscom AFB, Mass... reimbursable clients; and programming functions. Input of this "living record" allows the database to manage the 5-Year Plan so...
Brink-Huis, Anita; van Achterberg, Theo; Schoonhoven, Lisette
2008-08-01
This paper reports a review of the literature conducted to identify organisation models in cancer pain management that contain integrated care processes and describe their effectiveness. Pain is experienced by 30-50% of cancer patients receiving treatment and by 70-90% of those with advanced disease. Efforts to improve pain management have been made through the development and dissemination of clinical guidelines. Early improvements in pain management were focussed on just one or two single processes such as pain assessment and patient education. Little is known about organisational models with multiple integrated processes throughout the course of the disease trajectory and concerning all stages of the care process. Systematic review. The review involved a systematic search of the literature, published between 1986-2006. Subject-specific keywords used to describe patients, disease, pain management interventions and integrated care processes, relevant for this review were selected using the thesaurus of the databases. Institutional models, clinical pathways and consultation services are three alternative models for the integration of care processes in cancer pain management. A clinical pathway is a comprehensive institutionalisation model, whereas a pain consultation service is a 'stand-alone' model that can be integrated in a clinical pathway. Positive patient and process outcomes have been described for all three models, although the level of evidence is generally low. Evaluation of the quality of pain management must involve standardised measurements of both patient and process outcomes. We recommend the development of policies for referrals to a pain consultation service. These policies can be integrated within a clinical pathway. To evaluate the effectiveness of pain management models standardised outcome measures are needed.
47 CFR 0.241 - Authority delegated.
Code of Federal Regulations, 2014 CFR
2014-10-01
... individual database managers; and to perform other functions as needed for the administration of the TV bands... database functions for unlicensed devices operating in the television broadcast bands (TV bands) as set... methods that will be used to designate TV bands database managers, to designate these database managers...
Kilintzis, Vassilis; Beredimas, Nikolaos; Chouvarda, Ioanna
2014-01-01
An integral part of a system that manages medical data is the persistent storage engine. For almost twenty-five years, Relational Database Management Systems (RDBMS) were considered the obvious choice, yet today new technologies have emerged that deserve attention as possible alternatives. Triplestores store information as RDF triples without necessarily binding to a specific predefined structural model. In this paper we present an attempt to compare the performance of the Apache JENA-Fuseki and Virtuoso Universal Server 6 triplestores with that of the MySQL 5.6 RDBMS for storing and retrieving medical information communicated as RDF/XML ontology instances over a RESTful web service. The results show that the performance, calculated as the average time of storing and retrieving instances, is significantly better using Virtuoso Server, while MySQL performed better than Fuseki.
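The comparison metric used here, average time to store and retrieve instances, can be sketched with a generic timing harness. The in-memory dictionary merely stands in for a real store; in the actual study each operation would be a REST call against Fuseki, Virtuoso, or MySQL:

```python
import time

def average_time(op, payloads):
    """Average wall-clock time of one operation over a batch of payloads,
    the per-instance metric used to compare storage back-ends."""
    start = time.perf_counter()
    for p in payloads:
        op(p)
    return (time.perf_counter() - start) / len(payloads)

# Toy back-end: a dict standing in for a triplestore or RDBMS endpoint.
store = {}
payloads = [(i, f"instance-{i}") for i in range(1000)]
t_store = average_time(lambda p: store.update({p[0]: p[1]}), payloads)
t_fetch = average_time(lambda p: store[p[0]], payloads)
```

Running the same harness against each back-end with identical payloads is what makes the averages comparable.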
Medicaid care management: description of high-cost addictions treatment clients.
Neighbors, Charles J; Sun, Yi; Yerneni, Rajeev; Tesiny, Ed; Burke, Constance; Bardsley, Leland; McDonald, Rebecca; Morgenstern, Jon
2013-09-01
High utilizers of alcohol and other drug treatment (AODTx) services are a priority for healthcare cost control. We examine characteristics of Medicaid-funded AODTx clients, comparing three groups: individuals <90th percentile of AODTx expenditures (n=41,054); high-cost clients in the top decile of AODTx expenditures (HC; n=5,718); and 1760 enrollees in a chronic care management (CM) program for HC clients implemented in 22 counties in New York State. Medicaid and state AODTx registry databases were combined to draw demographic, clinical, social needs and treatment history data. HC clients accounted for 49% of AODTx costs funded by Medicaid. As expected, HC clients had significant social welfare needs, comorbid medical and psychiatric conditions, and use of inpatient services. The CM program was successful in enrolling some high-needs, high-cost clients but faced barriers to reaching the most costly and disengaged individuals. Copyright © 2013 Elsevier Inc. All rights reserved.
Surveillance of occupational noise exposures using OSHA's Integrated Management Information System.
Middendorf, Paul J
2004-11-01
Exposure to noise has long been known to cause hearing loss and is a ubiquitous problem in workplaces. Occupational noise exposures for industries stored in the Occupational Safety and Health Administration's (OSHA) Integrated Management Information System (IMIS) can be used to identify temporal and industrial trends of noise exposure to anticipate changes in rates of hearing loss. The noise records in OSHA's IMIS database for 1979-1999 were extracted by major industry division and measurement criteria. The noise exposures were summarized by year, industry, and employment size. The majority of records are from Manufacturing and Services. Exposures in Manufacturing and Services have decreased during the period, except that PEL exposures measured by federal enforcement increased from 1995 to 1999. Noise exposures in manufacturing have been reduced since the late 1970s, except those documented by federal enforcement. Noise exposure data outside manufacturing are not well represented in IMIS. Copyright 2004 Wiley-Liss, Inc.
EPA Facility Registry System (FRS): NCES
This web feature service contains location and facility identification information from EPA's Facility Registry System (FRS) for the subset of facilities that link to the National Center for Education Statistics (NCES). The primary federal database for collecting and analyzing data related to education in the United States and other nations, NCES is located in the U.S. Department of Education, within the Institute of Education Sciences. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using rigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities. This data set contains the subset of FRS integrated facilities that link to NCES school facilities once the NCES data has been integrated into the FRS database. Additional information on FRS is available at the EPA website http://www.epa.gov/enviro/html/fii/index.html.
Researcher and Author Profiles: Opportunities, Advantages, and Limitations
2017-01-01
Currently available online profiling platforms offer various services for researchers and authors. Opening an individual account and filling it with scholarly contents increase visibility of research output and boost its impact. This article overviews some of the widely used and emerging profiling platforms, highlighting their tools for sharing scholarly items, crediting individuals, and facilitating networking. Global bibliographic databases and search platforms, such as Scopus, Web of Science, PubMed, and Google Scholar, are widely used for profiling authors with indexed publications. Scholarly networking websites, such as ResearchGate and Academia.edu, provide indispensable services for researchers poorly visible elsewhere on the Internet. Several specialized platforms are designed to offer profiling along with their main functionalities, such as reference management and archiving. The Open Researcher and Contributor Identification (ORCID) project has offered a solution to the author name disambiguation. It has been integrated with numerous bibliographic databases, platforms, and manuscript submission systems to help research managers and journal editors select and credit the best reviewers, and other scholarly contributors. Individuals with verifiable reviewer and editorial accomplishments are also covered by Publons, which is an increasingly recognized service for publicizing and awarding reviewer comments. Currently available profiling formats have numerous advantages and some limitations. The advantages are related to their openness and chances of boosting the researcher impact. Some of the profiling websites are complementary to each other. The underutilization of various profiling websites and their inappropriate uses for promotion of ‘predatory’ journals are among reported limitations. A combined approach to the profiling systems is advocated in this article. PMID:28960025
Database Management Systems: New Homes for Migrating Bibliographic Records.
ERIC Educational Resources Information Center
Brooks, Terrence A.; Bierbaum, Esther G.
1987-01-01
Assesses bibliographic databases as part of visionary text systems such as hypertext and scholars' workstations. Downloading is discussed in terms of the capability to search records and to maintain unique bibliographic descriptions, and relational database management systems, file managers, and text databases are reviewed as possible hosts for…
Wang, Yun-Tung; Lin, Yi-Jiun
2017-02-01
Purpose The aim of this study is to explore whether/which vocational rehabilitation case manager (VRCMer) factors were significantly associated with the vocational rehabilitation service (VRS) program outcomes in Taiwan. Method This study used the 2011 VRS Program for People with Disabilities Database in a metropolitan city in Taiwan (N = 466) to conduct a secondary data analysis using hierarchical logistic regression. Results This study found that the employment rate and stable employment rate created by the 2011 VRS program in a metropolitan city in Taiwan were 48.7% and 42.1%, respectively. For the predictors of employment/stable employment, the "occurrences of the services provided by the VRCMer" variable was clearly dominant. In addition, "level of the disability" was the second-ranking predictor, and was significantly negatively correlated with both employment and stable employment outcomes. Conclusions Vocational rehabilitation case manager factors in this study were significantly correlated with VRS program outcomes for people with disabilities in Taiwan after controlling for the clients' socio-demographic variables. The results indicate that greater input by VRCMers for people with disabilities equates to better employment outcomes in metropolitan Taiwan. Implications for Rehabilitation This is the first study to build an inferential statistical model in an attempt to explain and predict the association between vocational rehabilitation case manager factors and vocational rehabilitation service program outcomes for people with disabilities in Taiwan. In cases of severe disability, a vocational rehabilitation case manager should seek out more in-kind and in-cash resources, and choose a suitable job coach to cooperate in assisting the client to become employed. Based on the findings, the government should continue providing opportunities for people with disabilities to attain higher and better-quality educational levels, to increase their employment rate. 
Vocational rehabilitation case managers should raise the referral rate and cooperation with job coaches as this directly affects the quality of services and clients' employment rate.
Mapping the literature of case management nursing
White, Pamela; Hall, Marilyn E.
2006-01-01
Objectives: Nursing case management provides a continuum of health care services for defined groups of patients. Its literature is multidisciplinary, emphasizing clinical specialties, case management methodology, and the health care system. This study is part of a project to map the literature of nursing, sponsored by the Nursing and Allied Health Resources Section of the Medical Library Association. The study identifies core journals cited in case management literature and indexing services that access those journals. Methods: Three source journals were identified based on established criteria, and cited references from each article published from 1997 to 1999 were analyzed. Results: Nearly two-thirds of the cited references were from journals; others were from books, monographs, reports, government documents, and the Internet. Cited journal references were ranked in descending order, and Bradford's Law of Scattering was applied. The many journals constituting the top two zones reflect the diversity of this field. Zone 1 included journals from nursing administration, case management, general medicine, medical specialties, and social work. Two databases, PubMed/MEDLINE and OCLC ArticleFirst, provided the best indexing coverage. Conclusion: Collections that support case management require a relatively small group of core journals. Students and health care professionals will need to search across disciplines to identify appropriate literature. PMID:16710470
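Bradford's Law of Scattering, as applied above, partitions the citation-ranked journal list into zones that each account for roughly an equal share of total citations; a small, highly cited core falls into Zone 1 while many rarely cited journals fill the outer zones. A sketch of that partitioning (journal names and counts are invented):

```python
def bradford_zones(journal_counts, zones=3):
    """Rank journals by citation count and split them into `zones` groups,
    each contributing roughly an equal share of the total citations."""
    ranked = sorted(journal_counts.items(), key=lambda kv: -kv[1])
    total = sum(count for _, count in ranked)
    result, acc, zone = [[] for _ in range(zones)], 0, 0
    for name, count in ranked:
        result[zone].append(name)
        acc += count
        # Advance to the next zone once its share of citations is covered.
        if acc >= (zone + 1) * total / zones and zone < zones - 1:
            zone += 1
    return result
```

With invented counts such as `{"A": 50, "B": 30, "C": 10, "D": 5, "E": 3, "F": 2}`, one journal covers the first third of citations, one the second, and four the last — the widening-zone pattern Bradford's Law predicts.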
23 CFR 970.204 - Management systems requirements.
Code of Federal Regulations, 2010 CFR
2010-04-01
... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...
23 CFR 970.204 - Management systems requirements.
Code of Federal Regulations, 2012 CFR
2012-04-01
... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...
23 CFR 970.204 - Management systems requirements.
Code of Federal Regulations, 2011 CFR
2011-04-01
... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...
Selecting Data-Base Management Software for Microcomputers in Libraries and Information Units.
ERIC Educational Resources Information Center
Pieska, K. A. O.
1986-01-01
Presents a model for the evaluation of database management systems software from the viewpoint of librarians and information specialists. The properties of data management systems, database management systems, and text retrieval systems are outlined and compared. (10 references) (CLB)
23 CFR 970.204 - Management systems requirements.
Code of Federal Regulations, 2013 CFR
2013-04-01
... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...
Construction of databases: advances and significance in clinical research.
Long, Erping; Huang, Bingjie; Wang, Liming; Lin, Xiaoyu; Lin, Haotian
2015-12-01
Widely used in clinical research, the database is a new type of data management automation technology and the most efficient tool for data management. In this article, we first explain some basic concepts, such as the definition, classification, and establishment of databases. Afterward, the workflow for establishing databases, inputting data, verifying data, and managing databases is presented. Meanwhile, by discussing the application of databases in clinical research, we illuminate the important role of databases in clinical research practice. Lastly, we introduce the reanalysis of randomized controlled trials (RCTs) and cloud computing techniques, showing the most recent advancements of databases in clinical research.
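The establish/input/verify/manage workflow described above can be sketched with an embedded database. The table, field names, and range check are illustrative only, not a recommended clinical schema:

```python
import sqlite3

# Establish: create the database and define its structure.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE patients (pid TEXT PRIMARY KEY, age INTEGER)")

# Input: enter records; P002 contains a deliberate entry error (age 210).
records = [("P001", 54), ("P002", 210), ("P003", 37)]
db.executemany("INSERT INTO patients VALUES (?, ?)", records)

# Verify: a simple automated check flags implausible values for review.
def verify_ages(conn, low=0, high=120):
    """Return the IDs of records whose age falls outside a plausible range."""
    cur = conn.execute("SELECT pid FROM patients WHERE age < ? OR age > ?",
                       (low, high))
    return [r[0] for r in cur.fetchall()]

flagged = verify_ages(db)
```

Real clinical databases layer double data entry, audit trails, and access control on top of such range checks, but the establish → input → verify cycle is the same.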
[The future of clinical laboratory database management system].
Kambe, M; Imidy, D; Matsubara, A; Sugimoto, Y
1999-09-01
To assess the present status of the clinical laboratory database management system, the difference between the Clinical Laboratory Information System and Clinical Laboratory System was explained in this study. Although three kinds of database management systems (DBMS) were shown including the relational model, tree model and network model, the relational model was found to be the best DBMS for the clinical laboratory database based on our experience and developments of some clinical laboratory expert systems. As a future clinical laboratory database management system, the IC card system connected to an automatic chemical analyzer was proposed for personal health data management and a microscope/video system was proposed for dynamic data management of leukocytes or bacteria.
SEER Linked Databases - SEER Datasets
SEER-Medicare database of elderly persons with cancer is useful for epidemiologic and health services research. SEER-MHOS has health-related quality of life information about elderly persons with cancer. SEER-CAHPS database has clinical, survey, and health services information on people with cancer.
The GMOS cyber(e)-infrastructure: advanced services for supporting science and policy.
Cinnirella, S; D'Amore, F; Bencardino, M; Sprovieri, F; Pirrone, N
2014-03-01
The need for coordinated, systematized and catalogued databases on mercury in the environment is of paramount importance, as improved information can help the assessment of the effectiveness of measures established to phase out and ban mercury. Long-term monitoring sites have been established in a number of regions and countries for the measurement of mercury in ambient air and wet deposition. Long-term measurements of mercury concentration in biota have also produced a huge amount of information, but such initiatives are far from constituting a global, systematic and interoperable approach. To address these weaknesses the on-going Global Mercury Observation System (GMOS) project ( www.gmos.eu ) established a coordinated global observation system for mercury and also retrieved historical data ( www.gmos.eu/sdi ). To manage such a large amount of information, a technological infrastructure was planned. This high-performance back-end resource, associated with sophisticated client applications, enables data storage, computing services, telecommunications networks and all services necessary to support the activity. This paper reports the architecture definition of the GMOS Cyber(e)-Infrastructure and the services developed to support science and policy, including the United Nations Environment Programme. It finally describes new possibilities in data analysis and data management through client applications.
Kubo, Makoto
2014-09-01
The purpose of this paper is to examine the status of care service providers by locality and organisational nature. Questionnaires were sent to 9505 home-based care service providers registered in the databases of 17 prefectures. The prefectures were selected according to population size. Numerous for-profit providers have newly entered the aged care service market and are operating selectively in Tokyo, a typical example of a metropolitan area. Furthermore, both for-profit and non-profit providers have suffered from a shortage of care workers and difficult management conditions, which tend to be more pronounced in Tokyo. The market under long-term care insurance was successful in terms of the volume of services, but most providers were sceptical as to whether competition in the market could facilitate quality care services. © 2013 The Author. Australasian Journal on Ageing © 2013 ACOTA.
23 CFR 971.204 - Management systems requirements.
Code of Federal Regulations, 2011 CFR
2011-04-01
... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...
23 CFR 971.204 - Management systems requirements.
Code of Federal Regulations, 2010 CFR
2010-04-01
... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...
23 CFR 971.204 - Management systems requirements.
Code of Federal Regulations, 2012 CFR
2012-04-01
... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...
23 CFR 971.204 - Management systems requirements.
Code of Federal Regulations, 2013 CFR
2013-04-01
... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...
Jo, Junyoung; Leem, Jungtae; Lee, Jin Moo; Park, Kyoung Sun
2017-06-15
Primary dysmenorrhoea is menstrual pain without pelvic pathology and is the most common gynaecological condition in women. Xuefu Zhuyu decoction (XZD), or Hyeolbuchukeo-tang, a traditional herbal formula, has been used as a treatment for primary dysmenorrhoea. The purpose of this study is to assess the current published evidence regarding XZD as a treatment for primary dysmenorrhoea. The following databases will be searched from their inception until April 2017: MEDLINE (via PubMed), Allied and Complementary Medicine Database (AMED), EMBASE, The Cochrane Library, six Korean medical databases (Korean Studies Information Service System, DBPia, Oriental Medicine Advanced Searching Integrated System, Research Information Service System, KoreaMed and the Korean Traditional Knowledge Portal), three Chinese medical databases (China National Knowledge Infrastructure (CNKI), Wan Fang Database and Chinese Scientific Journals Database (VIP)) and one Japanese medical database (CiNii). Randomised clinical trials (RCTs) that will be included in this systematic review comprise those that used XZD or modified XZD. The control groups in the RCTs include no treatment, placebo, conventional medication or other treatments. Trials testing XZD as an adjunct to other treatments, and studies where the control group received the same treatment as the intervention group, will also be included. Data extraction and risk of bias assessments will be performed by two independent reviewers. The risk of bias will be assessed with the Cochrane risk of bias tool. All statistical analyses will be conducted using Review Manager software (RevMan V.5.3.0). This systematic review will be published in a peer-reviewed journal. The review will also be disseminated electronically and in print. The review will benefit patients and practitioners in the fields of traditional and conventional medicine. CRD42016050447. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. 
All rights reserved. No commercial use is permitted unless otherwise expressly granted.
A mobile trauma database with charge capture.
Moulton, Steve; Myung, Dan; Chary, Aron; Chen, Joshua; Agarwal, Suresh; Emhoff, Tim; Burke, Peter; Hirsch, Erwin
2005-11-01
Charge capture plays an important role in every surgical practice. We have developed and merged a custom mobile database (DB) system with our trauma registry (TRACS) to better understand our billing methods, revenue generators, and areas for improved revenue capture. The mobile database runs on handheld devices using the Windows Compact Edition platform. The front end was written in C# and the back end is SQL. The mobile database operates as a thick client; it includes active and inactive patient lists, billing screens, hot pick lists, and Current Procedural Terminology and International Classification of Diseases, Ninth Revision code sets. Microsoft Internet Information Server provides secure data transaction services between the back ends stored on each device. Traditional, handwritten billing information for three of five adult trauma surgeons was averaged over a 5-month period. Electronic billing information was then collected over a 3-month period using handheld devices and the subject software application. One surgeon used the software for all 3 months, and two surgeons used it for the latter 2 months of the electronic data collection period. This electronic billing information was combined with TRACS data to determine the clinical characteristics of the trauma patients who were and were not captured using the mobile database. Total charges increased by 135%, 148%, and 228% for each of the three trauma surgeons who used the mobile DB application. The majority of additional charges were for evaluation and management services. Patients who were captured and billed at the point of care using the mobile DB had higher Injury Severity Scores, were more likely to undergo an operative procedure, and had longer lengths of stay compared with those who were not captured. Total charges more than doubled using a mobile database to bill at the point of care. 
A subsequent comparison of TRACS data with billing information revealed a large amount of uncaptured patient revenue. Greater familiarity and broader use of mobile database technology holds the potential for even greater revenue capture.
Quantifying the Validity of Routine Neonatal Healthcare Data in the Greater Accra Region, Ghana
Kayode, Gbenga A.; Amoakoh-Coleman, Mary; Brown-Davies, Charles; Grobbee, Diederick E.; Agyepong, Irene Akua; Ansah, Evelyn; Klipstein-Grobusch, Kerstin
2014-01-01
Objectives The District Health Information Management System-2 (DHIMS-2) is the database for storing health service data in Ghana, and, as in other low- and middle-income countries, paper-based data collection is used by the Ghana Health Service. As the DHIMS-2 database had not been validated before, this study aimed to evaluate its validity. Methods Seven out of ten districts in the Greater Accra Region were randomly sampled; the district hospital and a polyclinic in each district were recruited for validation. Seven pre-specified neonatal health indicators were considered for validation: antenatal registrants, deliveries, total births, live births, stillbirths, low birthweight, and neonatal deaths. Data were extracted on these health indicators from the primary data (hospital paper registers) recorded from January to March 2012. We examined all the data captured during this period, as these data had been uploaded to the DHIMS-2 database. The differences between the values of the health indicators obtained from the primary data and those of the facility and DHIMS-2 databases were used to assess the accuracy of the database, while its completeness was estimated by the percentage of missing data in the primary data. Results About 41,000 data points were assessed, and in almost all the districts the error rates of the DHIMS-2 data were less than 2.1% while the percentages of missing data were below 2%. At the regional level, almost all the health indicators had an error rate below 1%, while the overall error rate of the DHIMS-2 database was 0.68% (95% CI = 0.61-0.75) and the percentage of missing data was 3.1% (95% CI = 2.96-3.24). Conclusion This study demonstrated that the percentage of missing data in the DHIMS-2 database was negligible, while its accuracy was close to the acceptable range for high-quality data. PMID:25144222
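An error rate with a confidence interval of this kind can be computed with a standard normal-approximation (Wald) interval for a proportion. The sketch below is a generic illustration, not the study's method; the error count of 279 is hypothetical, picked only because 279/41,000 matches the reported 0.68% rate:

```python
import math

def error_rate_with_ci(n_errors: int, n_records: int, z: float = 1.96):
    """Point estimate and normal-approximation (Wald) 95% CI
    for an error proportion."""
    p = n_errors / n_records
    se = math.sqrt(p * (1 - p) / n_records)
    return p, (max(0.0, p - z * se), p + z * se)

# Hypothetical numerator: 279/41000 is about 0.68%, the rate reported.
rate, (lo, hi) = error_rate_with_ci(279, 41_000)
print(f"error rate = {rate:.2%}, 95% CI = ({lo:.2%}, {hi:.2%})")
```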
A Medical Image Backup Architecture Based on a NoSQL Database and Cloud Computing Services.
Santos Simões de Almeida, Luan Henrique; Costa Oliveira, Marcelo
2015-01-01
The use of digital systems for storing medical images generates a huge volume of data. Digital images are commonly stored and managed on a Picture Archiving and Communication System (PACS), under the DICOM standard. However, PACS is limited because it is strongly dependent on the server's physical space. Alternatively, Cloud Computing arises as an extensive, low-cost, and reconfigurable resource. However, medical images contain patient information that cannot be made available in a public cloud. Therefore, a mechanism to anonymize these images is needed. This poster presents a solution for this issue by taking digital images from PACS, converting the information contained in each image file to a NoSQL database, and using cloud computing to store the digital images.
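One way to realize the anonymization step is to copy the image's header metadata into a plain document, dropping identifying tags and substituting a pseudonymous key, before the document is written to the NoSQL store. The sketch below operates on a metadata dict rather than a real DICOM file; the tag subset and the `PseudoID` field are illustrative assumptions, not the poster's actual implementation:

```python
import json

# Illustrative subset of identifying DICOM attributes; real
# de-identification profiles (e.g. DICOM PS3.15) list many more.
IDENTIFYING_TAGS = {"PatientName", "PatientID", "PatientBirthDate",
                    "PatientAddress", "InstitutionName"}

def anonymize_metadata(record: dict, pseudo_id: str) -> dict:
    """Strip identifying tags and attach a pseudonymous ID, yielding
    a document suitable for a NoSQL (document-oriented) store."""
    doc = {k: v for k, v in record.items() if k not in IDENTIFYING_TAGS}
    doc["PseudoID"] = pseudo_id  # re-identification key kept privately
    return doc

record = {"PatientName": "DOE^JANE", "PatientID": "12345",
          "Modality": "CT", "StudyDate": "20150101"}
print(json.dumps(anonymize_metadata(record, "anon-001"), indent=2))
```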
Trust that binds: the impact of collective felt trust on organizational performance.
Salamon, Sabrina Deutsch; Robinson, Sandra L
2008-05-01
The impact of employees' collective perceptions of being trusted by management was examined with a longitudinal study involving 88 retail stores. Drawing on the appropriateness framework (March, 1994; Weber, Kopelman, & Messick, 2004), the authors develop and test a model showing that when employees in an organization perceive they are trusted by management, increases in the presence of responsibility norms, as well as in the sales performance and customer service performance of the organization, are observed. Moreover, the relationship between perceptions of being trusted and sales performance is fully mediated by responsibility norms. PsycINFO Database Record (c) 2008 APA, all rights reserved.
Baptiste, B.; Dawson, D.R.; Streiner, D.
2015-01-01
Abstract OBJECTIVE: To determine factors associated with case management (CM) service use in people with traumatic brain injury (TBI), using a published model for service use. DESIGN: A retrospective cohort, with nested case-control design. Correlational and logistic regression analyses of questionnaires from a longitudinal community database. STUDY SAMPLE: Questionnaires of 203 users of CM services and 273 non-users, complete for all outcome and predictor variables. Individuals with TBI, 15 years of age and older. Out of a dataset of 1,960 questionnaires, 476 met the inclusion criteria. METHODOLOGY: Eight predictor variables and one outcome variable (use or non-use of the service). Predictor variables followed the framework of the Behaviour Model of Health Service Use (BMHSU); specifically, the pre-disposing, need and enabling factor groups as these relate to health service use and access. RESULTS: Analyses revealed significant differences between users and non-users of CM services. In particular, users were significantly younger than non-users; the older the person, the less likely they were to use the service. Users also had less education, more severe activity limitations and lower community integration. Persons living alone were less likely to use case management. Funding group also significantly affected use. CONCLUSIONS: This study advances an empirical understanding of equity of access to health service usage in the practice of CM for persons living with TBI, a fairly new area of research, and has direct relevance to Life Care Planning (LCP). Many life care planners are CMs, and the genesis of LCP is CM. The findings relate to health service use and access, rather than health outcomes. They may assist with the development of a modified model for prediction of use, to advance future cost-of-care predictions. PMID:26409333
Education and training column: the learning collaborative.
MacDonald-Wilson, Kim L; Nemec, Patricia B
2015-03-01
This column describes the key components of a learning collaborative, with examples from the experience of 1 organization. A learning collaborative is a method for management, learning, and improvement of products or processes, and is a useful approach to implementation of a new service design or approach. This description draws from published material on learning collaboratives and the authors' experiences. The learning collaborative approach offers an effective method to improve service provider skills, provide support, and structure environments to result in lasting change for people using behavioral health services. This approach is consistent with psychiatric rehabilitation principles and practices, and serves to increase the overall capacity of the mental health system by structuring a process for discovering and sharing knowledge and expertise across provider agencies. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
NASA Astrophysics Data System (ADS)
Brissebrat, Guillaume; Fleury, Laurence; Boichard, Jean-Luc; Cloché, Sophie; Eymard, Laurence; Mastrorillo, Laurence; Moulaye, Oumarou; Ramage, Karim; Asencio, Nicole; Favot, Florence; Roussot, Odile
2013-04-01
The AMMA information system aims at expediting data and scientific results communication inside the AMMA community and beyond. It has already been adopted as the data management system by several projects and is meant to become a reference information system about the West Africa area for the whole scientific community. The AMMA database and the associated on-line tools have been developed and are managed by two French teams (IPSL Database Centre, Palaiseau and OMP Data Service, Toulouse). The complete system has been fully duplicated and is operated by the AGRHYMET Regional Centre in Niamey, Niger. The AMMA database contains a wide variety of datasets: - about 250 local observation datasets, covering geophysical components (atmosphere, ocean, soil, vegetation) and human activities (agronomy, health...); they come from either operational networks or scientific experiments, and include historical data in West Africa from 1850; - 1350 outputs of a socio-economics questionnaire; - 60 operational satellite products and several research products; - 10 output sets of meteorological and ocean operational models and 15 of research simulations. Database users can access all the data using either the portal http://database.amma-international.org or http://amma.agrhymet.ne/amma-data. Different modules are available. The complete catalogue provides access to metadata (i.e. information about the datasets) that are compliant with international standards (ISO19115, INSPIRE...). Registration pages enable users to read and sign the data and publication policy, and to apply for a user database account. The data access interface enables users to easily build a data extraction request by selecting various criteria like location, time, parameters... 
At present, the AMMA database counts more than 740 registered users and processes about 80 data requests every month. In order to monitor day-to-day meteorological and environmental information over West Africa, some quick-look and report display websites have been developed. They met the operational needs of the observational teams during the AMMA 2006 (http://aoc.amma-international.org) and FENNEC 2011 (http://fenoc.sedoo.fr) campaigns, but they also enable scientific teams to share physical indices along the monsoon season (http://misva.sedoo.fr from 2011). A collaborative WIKINDX tool has been set on line in order to manage scientific publications and communications of interest to AMMA (http://biblio.amma-international.org). The bibliographic database now counts about 1200 references; it is the most exhaustive document collection about the African Monsoon available to all. Every scientist is invited to make use of the different AMMA on-line tools and data. Scientists or project leaders who have data management needs for existing or future datasets over West Africa are welcome to use the AMMA database framework and to contact ammaAdmin@sedoo.fr .
NASA Astrophysics Data System (ADS)
Boulanger, Damien; Thouret, Valérie; Brissebrat, Guillaume
2017-04-01
IAGOS (In-service Aircraft for a Global Observing System) is a European Research Infrastructure which aims at the provision of long-term, regular and spatially resolved in situ observations of the atmospheric composition. IAGOS observation systems are deployed on a fleet of commercial aircraft and measure aerosols, cloud particles, greenhouse gases, ozone, water vapor and nitrogen oxides from the surface to the lower stratosphere. The IAGOS database is an essential part of the global atmospheric monitoring network. It contains IAGOS-core data and IAGOS-CARIBIC (Civil Aircraft for the Regular Investigation of the Atmosphere Based on an Instrument Container) data. The IAGOS Data Portal (http://www.iagos.org; contact: damien.boulanger@obs-mip.fr) is part of the French atmospheric chemistry data center AERIS (http://www.aeris-data.fr). In 2016 the new IAGOS Data Portal was released. In addition to data download, the portal provides improved and new services such as download in NetCDF or NASA Ames formats and plotting tools (maps, time series, vertical profiles, etc.). New added-value products are or will soon be available through the portal: back trajectories, origin of air masses, co-location with satellite data, etc. Web services allow users to download IAGOS metadata such as flight and airport information. Administration tools have been implemented for user management and instrument monitoring. A major improvement is the interoperability with international portals and other databases in order to improve IAGOS data discovery. In the frame of the IGAS project (IAGOS for the Copernicus Atmospheric Service), a data network has been set up. It is composed of three data centers: the IAGOS database in Toulouse, the HALO research aircraft database at DLR (https://halo-db.pa.op.dlr.de) and the CAMS (Copernicus Atmosphere Monitoring Service) data center in Jülich (http://join.iek.fz-juelich.de). 
The link with the CAMS data center, through the JOIN interface, makes it possible to combine model outputs with IAGOS data for inter-comparison. The CAMS project is a prominent user of the IGAS data network. During the year, IAGOS will improve metadata standardization and dissemination through collaborations with the AERIS data center, GAW (for which IAGOS is a contributing network) and the ENVRI+ European project. Metadata about measurement traceability and quality will be made available, DOIs will be implemented, and interoperability with other European Infrastructures will be set up through standardized web services.
Craig, J; Murray, A; Mitchell, S; Clark, S; Saunders, L; Burleigh, L
2013-11-01
To estimate costs for health and social care services in managing older people in the community who fall. Analyses of predominantly national databases using cost-of-illness methodologies. In Scotland, 294,000 (34%) of people aged over 65 years living in the community fall at least once a year. Of these, 20% (almost 60,000 people) contacted a medical service for assistance. There were almost 30,000 attendances at GP practices, over 36,100 calls to the Scottish Ambulance Service and 46,816 people presenting at A&E, with 16,549 admitted, 30% with a hip fracture. Mortality was high: 7% during the hospital stay, rising to over 12% at 1 year. Over 20% of patients were unable to return to their homes. Associated costs were over £470 million, with 60% incurred by social services, mainly in providing long-term care. The cost per person falling was over £1720, rising to over £8600 for those seeking medical assistance. A hip fracture admission cost £39,490, compared with £21,960 for other falls-related admissions. Transparent, robust cost information demonstrates the substantial burden of falls for health and social care services and should be a driver for implementing evidence-based interventions to reduce falls.
ERIC Educational Resources Information Center
Freeman, Carla; And Others
In order to understand how the database software or online database functioned in the overall curricula, the use of database management (DBMs) systems was studied at eight elementary and middle schools through classroom observation and interviews with teachers and administrators, librarians, and students. Three overall areas were addressed:…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-16
... Excluded Parties Listing System (EPLS) databases into the System for Award Management (SAM) database. DATES... combined the functional capabilities of the CCR, ORCA, and EPLS procurement systems into the SAM database... identification number and the type of organization from the System for Award Management database. 0 3. Revise the...
An Examination of Selected Software Testing Tools: 1992
1992-12-01
... Figure 27-17. Metrics Manager Database Full Report ... historical test database, the test management and problem reporting tools were examined using the sample test database provided by each supplier. ... track the impact of new methods, organizational structures, and technologies. Metrics Manager is supported by an industry database that allows
NASA Astrophysics Data System (ADS)
Raju, P. L. N.; Sarma, K. K.; Barman, D.; Handique, B. K.; Chutia, D.; Kundu, S. S.; Das, R. Kr.; Chakraborty, K.; Das, R.; Goswami, J.; Das, P.; Devi, H. S.; Nongkynrih, J. M.; Bhusan, K.; Singh, M. S.; Singh, P. S.; Saikhom, V.; Goswami, C.; Pebam, R.; Borgohain, A.; Gogoi, R. B.; Singh, N. R.; Bharali, A.; Sarma, D.; Lyngdoh, R. B.; Mandal, P. P.; Chabukdhara, M.
2016-06-01
The North Eastern Region (NER) of India, comprising eight states, is considered one of the most unique and most challenging regions to govern due to its physiographic conditions, rich biodiversity, proneness to disasters and diverse socio-economic characteristics. Operational remote sensing services increased manifold in the region with the establishment of the North Eastern Space Applications Centre (NESAC) in the year 2000. Since inception, NESAC has been providing remote sensing services for generating inventories, planning and developmental activities, management of natural resources and disasters, and dissemination of information and services through geo-web services for the NER. The operational remote sensing services provided by NESAC can be broadly divided into three categories, viz. natural resource planning and developmental services, disaster risk reduction and early warning services, and information dissemination through geo-portal services. As part of natural resource planning and developmental services, NESAC supports the state forest departments in preparing forest working plans by providing geospatial inputs covering the entire NER, identifying suitable culturable wastelands for cultivation of silkworm food plants, and mapping natural resources such as land use/land cover, wastelands and land degradation on a temporal basis. In the area of disaster risk reduction, NESAC has initiated operational services for early warning and post-disaster assessment inputs: a flood early warning system (FLEWS) using satellite remote sensing, numerical weather prediction, hydrological modeling etc.; a forest fire alert system with actionable attribute information; a Japanese Encephalitis Early Warning System (JEWS) based on mosquito vector abundance, pig population and historical disease intensity; and agricultural drought monitoring for the region. 
The large volumes of geospatial databases generated as part of operational services are made available to administrators and local government bodies for better management, prospective planning and sustainable use of available resources. Knowledge dissemination is done through online web portals wherever internet access is available, as well as through offline space-based information kiosks where internet access is unavailable or bandwidth is limited. This paper presents a systematic and comprehensive study of the remote sensing services operational in the NER of India for natural resources management, disaster risk reduction and dissemination of information and services, in addition to outlining future areas and directions of space applications for the region.
Redefining Information Access to Serials Information.
ERIC Educational Resources Information Center
Chen, Ching-chih
1992-01-01
Describes full-text document delivery services that have been introduced in conjunction with available databases in response to economic and technological changes affecting libraries: (1) CARL System's UnCover database and UnCover2 service; (2) Research Libraries Group's CitaDel delivery service; and (3) Faxon Research Service's Faxon Finder and…
NASA Astrophysics Data System (ADS)
Boyer, T.; Sun, L.; Locarnini, R. A.; Mishonov, A. V.; Hall, N.; Ouellet, M.
2016-02-01
The World Ocean Database (WOD) contains systematically quality-controlled historical and recent ocean profile data (temperature, salinity, oxygen, nutrients, carbon cycle variables, biological variables) ranging from Captain Cook's second voyage (1773) to this year's Argo floats. The US National Centers for Environmental Information (NCEI) also hosts the Global Temperature and Salinity Profile Program (GTSPP) Continuously Managed Database (CMD), which provides quality-controlled near-real-time ocean profile data and higher-level quality-controlled temperature and salinity profiles from 1990 to present. Both databases are used extensively for ocean and climate studies. Synchronization of these two databases will allow easier access and use of comprehensive regional and global ocean profile data sets for ocean and climate studies. Synchronization consists of two distinct phases: 1) a retrospective comparison of data in WOD and GTSPP to ensure that the most comprehensive and highest-quality data set is available to researchers without the need to individually combine and contrast the two datasets, and 2) web services to allow the constantly accruing near-real-time data in the GTSPP CMD and the continuous addition and quality control of historical data in WOD to be made available to researchers together, seamlessly.
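The retrospective-comparison phase amounts to deduplicating casts that appear in both archives. A minimal sketch, assuming each profile carries a timestamp, a position, and a numeric quality level (the field names and quality scale are hypothetical, not the actual WOD/GTSPP schema):

```python
def cast_key(p: dict, precision: int = 3) -> tuple:
    """Key a profile on its timestamp and rounded position, so the
    same cast reported by both archives collides on one key."""
    return (p["time"], round(p["lat"], precision), round(p["lon"], precision))

def merge_archives(wod: list, gtspp: list) -> list:
    """Combine two profile collections, keeping the higher-quality
    record whenever both archives hold the same cast."""
    merged = {}
    for p in wod + gtspp:
        k = cast_key(p)
        if k not in merged or p["quality"] > merged[k]["quality"]:
            merged[k] = p
    return list(merged.values())

# The same cast, reported with slightly different positions.
wod = [{"time": "2001-05-02T12:00", "lat": 10.1234, "lon": -30.5678,
        "quality": 2, "source": "WOD"}]
gtspp = [{"time": "2001-05-02T12:00", "lat": 10.1231, "lon": -30.5682,
          "quality": 1, "source": "GTSPP"}]
print(merge_archives(wod, gtspp))  # one record survives, from WOD
```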
Surviving the Glut: The Management of Event Streams in Cyberphysical Systems
NASA Astrophysics Data System (ADS)
Buchmann, Alejandro
Alejandro Buchmann is Professor in the Department of Computer Science, Technische Universität Darmstadt, where he heads the Databases and Distributed Systems Group. He received his MS (1977) and PhD (1980) from the University of Texas at Austin. He was an Assistant/Associate Professor at the Institute for Applied Mathematics and Systems IIMAS/UNAM in Mexico, doing research on databases for CAD, geographic information systems, and object-oriented databases. At Computer Corporation of America (later Xerox Advanced Information Systems) in Cambridge, Mass., he worked in the areas of active databases and real-time databases, and at GTE Laboratories, Waltham, in the areas of distributed object systems and the integration of heterogeneous legacy systems. In 1991 he returned to academia and joined T.U. Darmstadt. His current research interests are at the intersection of middleware, databases, event-based distributed systems, ubiquitous computing, and very large distributed systems (P2P, WSN). Much of the current research is concerned with guaranteeing quality of service and reliability properties in these systems, for example, scalability, performance, transactional behaviour, consistency, and end-to-end security. Many research projects imply collaboration with industry and cover a broad spectrum of application domains. Further information can be found at http://www.dvs.tu-darmstadt.de
Supporting the Establishment of Climate-Resilient Rural Livelihoods in Mongolia with EO Services
NASA Astrophysics Data System (ADS)
Grosso, Nuno; Patinha, Carla; Sainkhuu, Tserendash; Bataa, Mendbayar; Doljinsuren, Nyamdorj
2016-08-01
The work presented here shows the results from the project "Climate-Resilient Rural Livelihoods in Mongolia", included in the EOTAP (Earth Observation for a Transforming Asia Pacific) initiative, a collaboration between the European Space Agency (ESA) and the Asian Development Bank (ADB), developed in cooperation with the Ministry of Food and Agriculture of Mongolia. The EO services developed within this EOTAP project primarily aimed at enriching the existing environmental database maintained by the National Remote Sensing Center (NRSC) in Mongolia and sustaining the collaborative pasture management practices introduced by the teams within the Ministry of Food and Agriculture of Mongolia. The geographic area covered by the EOTAP services is Bayankhongor province, in the western Mongolia region, with two main services: drought monitoring at the provincial level for the year 2014, and Land Use/Land Cover (LULC) and change mapping for three districts of this province (Buutsagaan, Dzag and Khureemaral) for the years 2013 and 2014.
Mavrikakis, I; Mantas, J; Diomidous, M
2007-01-01
This paper is based on research into the possible structure of an information system for occupational health and safety management. We distributed a questionnaire in order to gauge the interest of potential users in the subject of occupational health and safety. Establishing this potential interest is vital both for the software analysis cycle and for development according to previous models. Evaluation of the results supports the creation of pilot applications across different enterprises. Documentation and process improvements, assured quality of services, operational support, and occupational health and safety advice are the basics of these applications. Communication and codified information among interested parties regarding health issues is the other target of the survey. Computer networks can offer such services. The network will consist of nodes responsible for informing executives on occupational health and safety. A web database has been installed for inserting and searching documents. The submission of files to a server and answers to questionnaires through the web help the experts perform their activities. Based on the requirements of enterprises, we have constructed a web file server. We submit files so that users can retrieve those they need. Access is limited to authorized users. Digital watermarks authenticate and protect digital objects.
Database Searching by Managers.
ERIC Educational Resources Information Center
Arnold, Stephen E.
Managers and executives need the easy and quick access to business and management information that online databases can provide, but many have difficulty articulating their search needs to an intermediary. One possible solution would be to encourage managers and their immediate support staff members to search textual databases directly as they now…
Applications of Technology to CAS Data-Base Production.
ERIC Educational Resources Information Center
Weisgerber, David W.
1984-01-01
Reviews the economic importance of applying computer technology to Chemical Abstracts Service database production from 1973 to 1983. Database building, technological applications for editorial processing (online editing, Author Index Manufacturing System), and benefits (increased staff productivity, reduced rate of increase of cost of services,…
Wilson, Fernando A; Rampa, Sankeerth; Trout, Kate E; Stimpson, Jim P
2017-05-01
Telehealth technologies promise to increase access to care, particularly in underserved communities. However, little is known about how private payer reimbursements vary between telehealth and non-telehealth services. We use the largest private claims database in the United States provided by the Health Care Cost Institute to identify telehealth claims and compare average reimbursements to non-telehealth claims. We find average reimbursements for telehealth services are significantly lower than those for non-telehealth for seven of the ten most common services. For example, telehealth reimbursements for office visits for evaluation and management of established patients with low complexity were 30% lower than the corresponding non-telehealth service. Reimbursements by clinical diagnosis code also tended to be lower for telehealth than non-telehealth claims. Widespread adoption of telehealth may be hampered by lower reimbursements for telehealth services relative to face-to-face services. This may result in lower incentives for providers to invest in telehealth technologies that do not result in significant cost savings to their practice, even if telehealth improves patient outcomes.
Relax with CouchDB--into the non-relational DBMS era of bioinformatics.
Manyam, Ganiraju; Payton, Michelle A; Roth, Jack A; Abruzzo, Lynne V; Coombes, Kevin R
2012-07-01
With the proliferation of high-throughput technologies, genome-level data analysis has become common in molecular biology. Bioinformaticians are developing extensive resources to annotate and mine biological features from high-throughput data. The underlying database management systems for most bioinformatics software are based on a relational model. Modern non-relational databases offer an alternative that has flexibility, scalability, and a non-rigid design schema. Moreover, with an accelerated development pace, non-relational databases like CouchDB can be ideal tools to construct bioinformatics utilities. We describe CouchDB by presenting three new bioinformatics resources: (a) geneSmash, which collates data from bioinformatics resources and provides automated gene-centric annotations, (b) drugBase, a database of drug-target interactions with a web interface powered by geneSmash, and (c) HapMap-CN, which provides a web interface to query copy number variations from three SNP-chip HapMap datasets. In addition to the web sites, all three systems can be accessed programmatically via web services. Copyright © 2012 Elsevier Inc. All rights reserved.
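The document-oriented model the abstract describes can be illustrated with a minimal in-memory sketch. The gene records and the "map" function below are invented for illustration; they are not the geneSmash, drugBase, or CouchDB API.

```python
# Schema-free documents, CouchDB-style: records may carry different fields.
gene_docs = [
    {"_id": "TP53", "symbol": "TP53", "chromosome": "17", "aliases": ["P53"]},
    {"_id": "EGFR", "symbol": "EGFR", "chromosome": "7"},  # no aliases field
]

def map_by_chromosome(docs):
    """Analog of a CouchDB map view: group document ids by a chosen key."""
    index = {}
    for doc in docs:
        index.setdefault(doc["chromosome"], []).append(doc["_id"])
    return index

index = map_by_chromosome(gene_docs)
print(index["17"])  # ['TP53']
```

The point of the non-rigid schema is visible above: adding a document with new fields requires no migration, only (at most) a new view function.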
Cost and cost-effectiveness studies in urologic oncology using large administrative databases.
Wang, Ye; Mossanen, Matthew; Chang, Steven L
2018-04-01
Urologic cancers are not only among the most common types of cancers, but also among the most expensive cancers to treat in the United States. This study aimed to review the use of cost-effectiveness analyses (CEAs) and other cost analyses in urologic oncology using large databases, to better understand the value of management strategies for these cancers. We conducted a literature review on CEAs and other cost analyses in urologic oncology using large databases. The options for and costs of diagnosing, treating, and following patients with urologic cancers can be expected to rise in the coming years. There are numerous opportunities in each urologic cancer to use CEAs to both lower costs and provide high-quality services. Improved cancer care must balance the integration of novelty with ensuring reasonable costs to patients and the health care system. With the increasing focus on cost containment, appreciating the value of competing strategies in caring for our patients is pivotal. Leveraging methods such as CEAs and harnessing large databases may help evaluate the merit of established or emerging strategies. Copyright © 2018 Elsevier Inc. All rights reserved.
[Experience with the reference manager EndNote-EndLink].
Reiss, M; Reiss, G
1998-09-01
A good reference management program should make it easy to record the elements of a reference: author's name, year of publication, title of article, etc. It should offer tools that let you find and retrieve references quickly, and it should be able to produce the bibliography in the format required for a particular publication. There are many computer programs, but very few stand out as truly useful, time saving, and work enhancing. One of them is EndNote-EndLink. We want to report our experience with this database manager. The functions and the use of the software package EndNote 2.3 for Windows are described. You can create your own database or download batches of references from one of the popular searching services (e.g. MEDLINE). When you want to cite a reference, you simply paste it wherever you want your in-text citation to appear. To prepare the bibliography, EndNote scans your article, replaces the placeholders with citations, and prints the list of references at the end of the manuscript, according to the style that you have chosen. Altogether EndNote provides an excellent combination of features and ease of use.
Topologically Consistent Models for Efficient Big Geo-Spatio Data Distribution
NASA Astrophysics Data System (ADS)
Jahn, M. W.; Bradley, P. E.; Doori, M. Al; Breunig, M.
2017-10-01
Geo-spatio-temporal topology models are likely to become a key concept to check the consistency of 3D (spatial space) and 4D (spatial + temporal space) models for emerging GIS applications such as subsurface reservoir modelling or the simulation of energy and water supply of mega or smart cities. Furthermore, the data management for complex models consisting of big geo-spatial data is a challenge for GIS and geo-database research. General challenges, concepts, and techniques of big geo-spatial data management are presented. In this paper we introduce a sound mathematical approach for a topologically consistent geo-spatio-temporal model based on the concept of the incidence graph. We redesign DB4GeO, our service-based geo-spatio-temporal database architecture, on the way to the parallel management of massive geo-spatial data. Approaches for a new geo-spatio-temporal and object model of DB4GeO meeting the requirements of big geo-spatial data are discussed in detail. Finally, a conclusion and outlook on our future research are given on the way to support the processing of geo-analytics and -simulations in a parallel and distributed system environment.
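The incidence-graph concept mentioned above can be sketched with a small consistency check: cells are keyed by (dimension, id), and each d-cell is linked to its (d-1)-dimensional boundary cells. This is an illustrative toy, not the DB4GeO data model.

```python
# One triangle as an incidence graph: three vertices, three edges, one face.
cells = {
    (0, "v1"), (0, "v2"), (0, "v3"),   # vertices (dimension 0)
    (1, "e1"), (1, "e2"), (1, "e3"),   # edges (dimension 1)
    (2, "f1"),                          # face (dimension 2)
}
incidence = {
    (1, "e1"): [(0, "v1"), (0, "v2")],
    (1, "e2"): [(0, "v2"), (0, "v3")],
    (1, "e3"): [(0, "v3"), (0, "v1")],
    (2, "f1"): [(1, "e1"), (1, "e2"), (1, "e3")],
}

def is_consistent(cells, incidence):
    """Every boundary cell must exist and be exactly one dimension lower."""
    for (dim, _), boundary in incidence.items():
        for b in boundary:
            if b not in cells or b[0] != dim - 1:
                return False
    return True

print(is_consistent(cells, incidence))  # True
```

Deleting a vertex that an edge still references would make the model inconsistent, which is exactly the kind of violation such a check is meant to catch before data is committed.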
The EarthServer Federation: State, Role, and Contribution to GEOSS
NASA Astrophysics Data System (ADS)
Merticariu, Vlad; Baumann, Peter
2016-04-01
The intercontinental EarthServer initiative has established a European datacube platform with proven scalability: known databases exceed 100 TB, and single queries have been split across more than 1,000 cloud nodes. Its service interface being rigorously based on the OGC "Big Geo Data" standards, Web Coverage Service (WCS) and Web Coverage Processing Service (WCPS), a series of clients can dock into the services, ranging from open-source OpenLayers and QGIS over open-source NASA WorldWind to proprietary ESRI ArcGIS. Datacube fusion in a "mix and match" style is supported by the platform technology, the rasdaman Array Database System, which transparently federates queries so that users simply approach any node of the federation to access any data item, internally optimized for minimal data transfer. Notably, rasdaman is part of the GEOSS GCI. NASA is contributing its Web WorldWind virtual globe for user-friendly data extraction, navigation, and analysis. Integrated datacube / metadata queries are contributed by CITE. Current federation members include ESA (managed by MEEO s.r.l.), Plymouth Marine Laboratory (PML), the European Centre for Medium-Range Weather Forecasts (ECMWF), Australia's National Computational Infrastructure, and Jacobs University (adding in Planetary Science). Further data centers have expressed interest in joining. We present the EarthServer approach, discuss its underlying technology, and illustrate the contribution this datacube platform can make to GEOSS.
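To give a flavor of the WCPS query language the platform is built on, the sketch below composes a query string that extracts a time series at one location from a datacube. The coverage name and axis labels are hypothetical; each deployment defines its own.

```python
# Illustrative WCPS query builder (coverage/axis names are made up).
def wcps_spatial_subset(coverage, lat, lon, t_start, t_end, fmt="csv"):
    """Build a WCPS query extracting a time series at one lat/lon point."""
    return (
        f'for $c in ({coverage}) '
        f'return encode($c[Lat({lat}), Long({lon}), '
        f'ansi("{t_start}":"{t_end}")], "{fmt}")'
    )

query = wcps_spatial_subset("AvgLandTemp", 53.08, 8.80,
                            "2014-01-01", "2014-12-31")
print(query)
```

Such a query would typically be sent to a WCPS endpoint as the `query` parameter of an HTTP request; the server evaluates it against the datacube and returns only the encoded subset, which is what keeps data transfer minimal.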
Generalized Database Management System Support for Numeric Database Environments.
ERIC Educational Resources Information Center
Dominick, Wayne D.; Weathers, Peggy G.
1982-01-01
This overview of potential for utilizing database management systems (DBMS) within numeric database environments highlights: (1) major features, functions, and characteristics of DBMS; (2) applicability to numeric database environment needs and user needs; (3) current applications of DBMS technology; and (4) research-oriented and…
UPM: unified policy-based network management
NASA Astrophysics Data System (ADS)
Law, Eddie; Saxena, Achint
2001-07-01
Besides providing network management to the Internet, it has become essential to offer different Quality of Service (QoS) to users. Policy-based management provides control on network routers to achieve this goal. The Internet Engineering Task Force (IETF) has proposed a two-tier architecture whose implementation is based on the Common Open Policy Service (COPS) protocol and Lightweight Directory Access Protocol (LDAP). However, there are several limitations to this design such as scalability and cross-vendor hardware compatibility. To address these issues, we present a functionally enhanced multi-tier policy management architecture design in this paper. Several extensions are introduced thereby adding flexibility and scalability. In particular, an intermediate entity between the policy server and policy rule database called the Policy Enforcement Agent (PEA) is introduced. By keeping internal data in a common format, using a standard protocol, and by interpreting and translating request and decision messages from multi-vendor hardware, this agent allows a dynamic Unified Information Model throughout the architecture. We have tailor-made this unique information system to save policy rules in the directory server and allow executions of policy rules with dynamic addition of new equipment during run-time.
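The core idea of the Policy Enforcement Agent, translating multi-vendor messages into one unified internal format, can be sketched as follows. The vendor message layouts and field names here are invented for illustration, not taken from the UPM design.

```python
# Two hypothetical vendor-specific policy messages normalized into one
# unified internal representation (the "Unified Information Model" idea).

def from_vendor_a(msg):
    # vendor A sends a dict like {"cmd": "permit", "iface": "eth0", "prio": 5}
    return {"action": msg["cmd"], "target": msg["iface"],
            "priority": msg["prio"]}

def from_vendor_b(msg):
    # vendor B sends a tuple like ("eth0", "permit", 5)
    iface, cmd, prio = msg
    return {"action": cmd, "target": iface, "priority": prio}

rule_a = from_vendor_a({"cmd": "permit", "iface": "eth0", "prio": 5})
rule_b = from_vendor_b(("eth0", "permit", 5))
assert rule_a == rule_b  # same policy, one internal format
print(rule_a)  # {'action': 'permit', 'target': 'eth0', 'priority': 5}
```

Because the policy server and directory only ever see the unified format, adding a new hardware vendor at run time amounts to registering one more translation function, which is the scalability argument the paper makes.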
NASA Astrophysics Data System (ADS)
Sagarminaga, Y.; Galparsoro, I.; Reig, R.; Sánchez, J. A.
2012-04-01
Since 2000, an intense effort has been made in AZTI's Marine Research Division to set up a data management system that could gather all the marine datasets being produced by different in-house research projects. For that, a corporate GIS was designed that included a data and metadata repository, a database, a layer catalog and search application, and an internet map viewer. Several layers, mostly dealing with physical, chemical and biological in-situ sampling, and basic and thematic cartography including bathymetry, geomorphology, habitat maps for different species, and maps of human pressures and activities, were successfully gathered in this system. Very soon it was realised that new marine technologies yielding continuous multidimensional data, sometimes called FES (Fluid Earth System) data, were difficult to handle in this structure. The data affected mainly included numerical oceanographic and meteorological models, remote sensing data, coastal RADAR data, and some in-situ observational systems such as CTD casts and moored or Lagrangian buoys. A management system for gridded multidimensional data was developed using standardized formats (NetCDF following the CF conventions) and tools such as the THREDDS catalog (UNIDATA/UCAR), providing web services such as OPeNDAP, NCSS, and WCS, as well as the ncWMS service developed by the Reading e-Science Centre. At present, a system (ITSASGIS-5D) is being developed, based on OGC standards and open-source tools, to allow interoperability between all the data types mentioned before. On the server side, this system includes PostgreSQL/PostGIS databases and GeoServer for GIS layers, and THREDDS/OPeNDAP and ncWMS services for FES gridded data. Moreover, an online client is being developed to allow joint access, user configuration, data visualisation and query, and data distribution. This client uses the MapFish, ExtJS/GeoExt, and OpenLayers libraries.
Through this presentation the elements of the first released version of this system will be described and shown, together with the topics to be developed in future versions, which include, among others, the integration of GeoNetwork libraries and tools for both FES and GIS metadata management, and the use of the new OGC Sensor Observation Service (SOS) to integrate non-gridded multidimensional data such as time series, depth profiles or trajectories provided by different observational systems. The final aim of this approach is to contribute to the multidisciplinary access and use of marine data for management and research activities, and to facilitate the implementation of integrated ecosystem-based approaches in the fields of fisheries advice and management, marine spatial planning, and the implementation of European policies such as the Water Framework Directive, the Marine Strategy Framework Directive or the Habitats Directive.
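The OGC web services named above are all addressed through parameterized HTTP requests. A minimal sketch of how a client might build a WMS GetMap request against an ncWMS endpoint follows; the host, layer name and bounding box are placeholders, not the actual AZTI endpoints.

```python
# Build a WMS 1.3.0 GetMap URL (endpoint and layer are hypothetical).
from urllib.parse import urlencode

def wms_getmap_url(base, layer, bbox, time, size=(512, 512)):
    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": layer, "CRS": "CRS:84",
        "BBOX": ",".join(map(str, bbox)),   # lon_min, lat_min, lon_max, lat_max
        "WIDTH": size[0], "HEIGHT": size[1],
        "FORMAT": "image/png", "TIME": time,
    }
    return f"{base}?{urlencode(params)}"

url = wms_getmap_url("https://example.org/ncWMS/wms", "sst",
                     (-4.0, 43.0, -1.0, 44.5), "2012-04-01T00:00:00Z")
print(url)
```

The same key-value pattern carries over to WCS GetCoverage and SOS GetObservation requests, which is why standardizing on OGC interfaces lets one client stack (OpenLayers and friends) talk to both the GIS and the FES sides of the system.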
Mapping the literature of nursing administration.
Galganski, Carol J
2006-04-01
As part of Phase I of a project to map the literature of nursing, sponsored by the Nursing and Allied Health Resources Section of the Medical Library Association, this study identifies the core literature cited in nursing administration and the indexing services that provide access to the core journals. The results of this study will assist librarians and end users searching for information related to this nursing discipline, as well as database producers who might consider adding specific titles to their indexing services. Using the common methodology described in the overview article, five source journals for nursing administration were identified and selected for citation analysis over a three-year period, 1996 to 1998, to identify the most frequently cited titles according to Bradford's Law of Scattering. From this core of most productive journal titles, the bibliographic databases that provide the best access to these titles were identified. Results reveal that nursing administration literature relies most heavily on journal articles and on those titles identified as core nursing administration titles. When the indexing coverage of nine services is compared, PubMed/MEDLINE and CINAHL provide the most comprehensive coverage of this nursing discipline, but no single indexing service adequately covers it. Researchers needing comprehensive coverage in this area must search more than one database to effectively research their projects. While PubMed/MEDLINE and CINAHL provide more coverage for this discipline than the other indexing services, neither is sufficiently broad in scope to provide indexing of nursing, health care management, and medical literature in a single file.
Nurse administrators using the literature to research current work issues need to review not only the nursing titles covered by CINAHL but should also include the major weekly medical titles, core titles in health care administration, and general business sources if they wish to adequately cover the many aspects of nursing administration.
The STP (Solar-Terrestrial Physics) Semantic Web based on the RSS1.0 and the RDF
NASA Astrophysics Data System (ADS)
Kubo, T.; Murata, K. T.; Kimura, E.; Ishikura, S.; Shinohara, I.; Kasaba, Y.; Watari, S.; Matsuoka, D.
2006-12-01
In Solar-Terrestrial Physics (STP), it has been pointed out that the circulation and utilization of observation data among researchers are insufficient. To achieve interdisciplinary research, we need to overcome these circulation and utilization problems. Against this background, the authors' group has developed a world-wide database that manages meta-data of satellite and ground-based observation data files. Until now, retrieving meta-data from the observation data and registering them in the database have been carried out by hand. Our goal is to establish the STP Semantic Web. The Semantic Web provides a common framework that allows a variety of data to be shared and reused across applications, enterprises, and communities. We also expect that secondary information related to observations, such as event information and associated news, will be shared over the networks. The most fundamental issue in establishing it is who generates, manages and provides meta-data in the Semantic Web. We developed an automatic meta-data collection system for the observation data using RSS (RDF Site Summary) 1.0. RSS1.0 is an XML-based markup language built on the RDF (Resource Description Framework) and designed for syndicating news and the contents of news-like sites. RSS1.0 is used to describe STP meta-data, such as data file names, file server addresses and observation dates. To describe STP meta-data beyond the RSS1.0 vocabulary, we defined original vocabularies for STP resources using RDF Schema. The RDF describes technical terms of the STP along with the Dublin Core Metadata Element Set, the standard for cross-domain information resource descriptions. Researchers' information on the STP is described with FOAF, an RDF/XML vocabulary for machine-readable metadata about people. Using RSS1.0 as the meta-data distribution method, the workflow from retrieving meta-data to registering them in the database is automated.
This technique has been applied to several database systems, such as the DARTS database system and the NICT Space Weather Report Service. DARTS is a science database managed by ISAS/JAXA in Japan. We succeeded in generating and collecting the meta-data automatically for CDF (Common Data Format) data, such as Reimei satellite data, provided by DARTS. We also created an RDF service for space weather reports and for real-time global MHD simulation 3D data provided by NICT. Our Semantic Web system works as follows: the RSS1.0 documents generated on the data sites (ISAS and NICT) are automatically collected by a meta-data collection agent. The RDF documents are registered, and the agent extracts meta-data and stores them in Sesame, an open-source RDF database with support for RDF Schema inferencing and querying. The RDF database provides advanced retrieval processing that takes properties and relations into account. Finally, the STP Semantic Web provides automatic processing and high-level search not only for observation data but also for space weather news, physical events, technical terms and researcher information related to the STP.
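A minimal RSS1.0 item carrying observation meta-data of the kind described above might look like the template below. The file URL, title and date are invented for illustration; a real feed would also use the project's own RDF Schema vocabulary.

```python
# Generate and parse a minimal RSS1.0 (RDF/XML) item for one data file.
import xml.etree.ElementTree as ET

ITEM_TEMPLATE = """<rdf:RDF
  xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
  xmlns:dc="http://purl.org/dc/elements/1.1/"
  xmlns="http://purl.org/rss/1.0/">
  <item rdf:about="{url}">
    <title>{title}</title>
    <link>{url}</link>
    <dc:date>{date}</dc:date>
  </item>
</rdf:RDF>"""

doc = ITEM_TEMPLATE.format(
    url="http://example.org/data/reimei_20061201.cdf",
    title="Reimei satellite CDF data, 2006-12-01",
    date="2006-12-01")

root = ET.fromstring(doc)  # parses iff the RDF/XML is well-formed
item = root.find("{http://purl.org/rss/1.0/}item")
obs_date = item.find("{http://purl.org/dc/elements/1.1/}date").text
print(obs_date)  # 2006-12-01
```

A collection agent of the sort the abstract describes would fetch such documents periodically, extract the `dc:` fields (plus any domain-specific properties), and load the triples into an RDF store such as Sesame.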
Developing a managed care delivery system in New York State for Medicaid recipients with HIV.
Feldman, I; Cruz, H; DeLorenzo, J; Hidalgo, J; Plavin, H; Whitaker, J
1999-11-01
In the state of New York, models of care known as HIV Special Needs Plans (HIV SNPs) are being developed to meet the unique health and medical needs of Medicaid recipients with HIV. Establishing managed care plans for the 80,000 to 100,000 HIV-infected Medicaid recipients residing in the state has required considerable effort, including distributing planning grants to solicit information and recommendations regarding program and fiscal policy; convening a workgroup to facilitate discussions between the state and the provider and consumer communities; conducting a longitudinal survey to assess the impact of managed care on persons with HIV; and developing a longitudinal, person-based, encounter-level database representing the clinical and service utilization histories of more than 100,000 patients for state fiscal years 1990 to 1996. The key fiscal issues identified and discussed were capitation rates, initial capitalization levels, and risk-adjustment mechanisms. Other pertinent issues included the importance of a benefits package supporting a comprehensive, integrated continuum of state-of-the-art services; marketing and enrollment; attention to provider and consumer training and education needs; and interdependence of financial reimbursement and benefits packages. From our experience in New York State, we conclude that a successful model of Medicaid managed care for persons with HIV should build on the existing infrastructure of services, using a collaborative process among government agencies, healthcare providers, and HIV/AIDS consumer communities. A future challenge lies in the implementation of the HIV SNP model and evaluation of its soundness and ability to ensure quality healthcare services.
Keeys, Christopher; Kalejaiye, Bamidele; Skinner, Michelle; Eimen, Mandana; Neufer, Joann; Sidbury, Gisele; Buster, Norman; Vincent, Joan
2014-12-15
The development, implementation, and pilot testing of a discharge medication reconciliation service managed by pharmacists with offsite telepharmacy support are described. Hospitals' efforts to provide legible, complete, and accurate medication lists to patients prior to discharge continue to be complicated by staffing and time constraints and suboptimal information technology. To address these challenges, the pharmacy department at a 324-bed community hospital initiated a quality-improvement project to optimize patients' discharge medication lists while addressing problems that often resulted in confusing, incomplete, or inaccurate lists. A subcommittee of the hospital's pharmacy and therapeutics committee led the development of a revised medication reconciliation process designed to streamline and improve the accuracy and utility of discharge medication documents, with subsequent implementation of a new service model encompassing both onsite and remote pharmacists. The new process and service were evaluated on selected patient care units in a 19-month pilot project requiring collaboration by physicians, nurses, case managers, pharmacists, and an outpatient prescription drug database vendor. During the pilot testing period, 6402 comprehensive reconciled discharge medication lists were prepared; 634 documented discrepancies or medication errors were detected. The majority of identified problems were in three categories: unreconciled medication orders (31%), order clarification (25%), and duplicate orders (12%). The most problematic medications were the opioids, cardiovascular agents, and anticoagulants. A pharmacist-managed medication reconciliation service including onsite pharmacists and telepharmacy support was successful in improving the final discharge lists and documentation received by patients. Copyright © 2014 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
JAMSTEC DARWIN Database Assimilates GANSEKI and COEDO
NASA Astrophysics Data System (ADS)
Tomiyama, T.; Toyoda, Y.; Horikawa, H.; Sasaki, T.; Fukuda, K.; Hase, H.; Saito, H.
2017-12-01
Introduction: The Japan Agency for Marine-Earth Science and Technology (JAMSTEC) archives data and samples obtained by JAMSTEC research vessels and submersibles. As a common property of human society, the JAMSTEC archive is open to public users for scientific and educational purposes [1]. To publicize its data and samples online, JAMSTEC operates the NUUNKUI data sites [2], a group of several databases for various data and sample types. For years, data and metadata of JAMSTEC rock samples, sediment core samples and cruise/dive observations were publicized through databases named GANSEKI, COEDO, and DARWIN, respectively. However, because they had different user interfaces and data structures, these services were somewhat confusing for unfamiliar users. Maintenance costs for multiple hardware and software systems were also an obstacle to sustainable services and continuous improvement. Database Integration: In 2017, GANSEKI, COEDO and DARWIN were integrated into DARWIN+ [3]. The update also included the implementation of a map-search function as a substitute for the closed portal site. Major functions of the previous systems were incorporated into the new system; users can perform complex searches by thumbnail browsing, map area, keyword filtering, and metadata constraints. As for data handling, the new system is more flexible, allowing the entry of a variety of additional data types. Data Management: Since the DARWIN major update, the JAMSTEC data and sample team has been dealing with minor issues in individual sample data/metadata, which sometimes need manual modification before they can be transferred to the new system. Some new data sets, such as onboard sample photos and surface close-up photos of rock samples, are becoming available online. Geochemical data for sediment core samples are expected to be added in the near future. References: [1] http://www.jamstec.go.jp/e/database/data_policy.html [2] http://www.godac.jamstec.go.jp/jmedia/portal/e/ [3] http://www.godac.jamstec.go.jp/darwin/e/
Hazards of Extreme Weather: Flood Fatalities in Texas
NASA Astrophysics Data System (ADS)
Sharif, H. O.; Jackson, T.; Bin-Shafique, S.
2009-12-01
The Federal Emergency Management Agency (FEMA) considers flooding “America’s Number One Natural Hazard”. Despite flood management efforts in many communities, U.S. flood damages remain high, due in large part to increasing population and property development in flood-prone areas. Floods are the leading cause of fatalities related to natural disasters in Texas, and Texas leads the nation in flash flood fatalities, with more than three times as many (840) as the next-ranked state, Pennsylvania (265). This study examined flood fatalities that occurred in Texas between 1960 and 2008. Flood fatality statistics were extracted from three sources: the flood fatality databases of the National Climatic Data Center, the Spatial Hazard Event and Loss Database for the United States, and the Texas Department of State Health Services. The data collected for each fatality include the date, time, gender, age, location, and weather conditions. Inconsistencies among the three databases were identified and discussed. Analysis reveals that most fatalities result from driving into flood water (about 65%). Spatial analysis indicates that more fatalities occurred in counties containing major urban centers. Hydrologic analysis of a flood event that resulted in five fatalities was performed; a hydrologic model was able to simulate the water level at the location where a vehicle was swept away by flood water, resulting in the death of the driver.
Clinical image processing engine
NASA Astrophysics Data System (ADS)
Han, Wei; Yao, Jianhua; Chen, Jeremy; Summers, Ronald
2009-02-01
Our group provides clinical image processing services to various institutes at NIH. We develop or adapt image processing programs for a variety of applications. However, each program requires a human operator to select a specific set of images and execute the program, as well as store the results appropriately for later use. To improve efficiency, we design a parallelized clinical image processing engine (CIPE) to streamline and parallelize our service. The engine takes DICOM images from a PACS server, sorts and distributes the images to different applications, multithreads the execution of applications, and collects results from the applications. The engine consists of four modules: a listener, a router, a job manager and a data manager. A template filter in XML format is defined to specify the image specification for each application. A MySQL database is created to store and manage the incoming DICOM images and application results. The engine achieves two important goals: reduce the amount of time and manpower required to process medical images, and reduce the turnaround time for responding. We tested our engine on three different applications with 12 datasets and demonstrated that the engine improved the efficiency dramatically.
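The job-manager module described above, queueing incoming work and multithreading the execution of applications while collecting results centrally, can be sketched with the standard library. The application and dataset names are illustrative stand-ins, not the CIPE implementation.

```python
# Toy sketch of a threaded job manager: a queue of (application, dataset)
# jobs consumed by worker threads, with results collected under a lock.
import queue
import threading

jobs = queue.Queue()
results = []
results_lock = threading.Lock()

def worker():
    while True:
        job = jobs.get()
        if job is None:            # sentinel: shut this worker down
            jobs.task_done()
            return
        app, dataset = job
        outcome = f"{app} processed {dataset}"   # stand-in for real work
        with results_lock:
            results.append(outcome)
        jobs.task_done()

threads = [threading.Thread(target=worker) for _ in range(3)]
for t in threads:
    t.start()
for job in [("segmenter", "ct_001"), ("segmenter", "ct_002"),
            ("registration", "mr_001")]:
    jobs.put(job)
for _ in threads:
    jobs.put(None)                  # one sentinel per worker
jobs.join()                         # wait until every job is marked done
for t in threads:
    t.join()
print(len(results))  # 3
```

In the real engine the listener and router would feed this queue with DICOM studies matched against the XML template filters, and the data manager would persist `results` to the MySQL database instead of a list.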
Substance abuse treatment in persons with HIV/AIDS: challenges in managing triple diagnosis.
Durvasula, Ramani; Miller, Theodore R
2014-01-01
Clinical management of HIV must account for the "triple diagnosis" of HIV, psychiatric diagnosis, and substance use disorders, and requires integrated treatment services that go beyond mitigating substance use and psychiatric and medical symptoms to address other health behaviors as well. Because the clinical management of HIV/AIDS shifted significantly with the advent of highly active antiretroviral therapy (HAART) in the mid-1990s, a literature review focusing on work published since 2000 was conducted using relevant keywords across a wide range of literature search databases. This review was complemented by studies expanding on specific treatment modalities for which there was a dearth of literature addressing HIV-infected cohorts, and by a discussion of issues around substance abuse treatment as an HIV prevention tool. Existing models of substance abuse treatment, including cognitive behavioral therapy and motivational interviewing, have proven useful for enhancing adherence and reducing substance use in outpatient populations, while methadone maintenance and directly observed treatment have been useful with specific subgroups of users. Contextualization of services heightens the likelihood of successful outcomes and relapse prevention.
Pharmacology Portal: An Open Database for Clinical Pharmacologic Laboratory Services.
Karlsen Bjånes, Tormod; Mjåset Hjertø, Espen; Lønne, Lars; Aronsen, Lena; Andsnes Berg, Jon; Bergan, Stein; Otto Berg-Hansen, Grim; Bernard, Jean-Paul; Larsen Burns, Margrete; Toralf Fosen, Jan; Frost, Joachim; Hilberg, Thor; Krabseth, Hege-Merete; Kvan, Elena; Narum, Sigrid; Austgulen Westin, Andreas
2016-01-01
More than 50 Norwegian public and private laboratories provide one or more analyses for therapeutic drug monitoring or testing for drugs of abuse. Practices differ among laboratories, and analytical repertoires can change rapidly as new substances become available for analysis. The Pharmacology Portal was developed to provide an overview of these activities and to standardize the practices and terminology among laboratories. The Pharmacology Portal is a modern dynamic web database comprising all available analyses within therapeutic drug monitoring and testing for drugs of abuse in Norway. Content can be retrieved by using the search engine or by scrolling through substance lists. The core content is a substance registry updated by a national editorial board of experts within the field of clinical pharmacology. This ensures quality and consistency regarding substance terminologies and classification. All laboratories publish their own repertoires in a user-friendly workflow, adding laboratory-specific details to the core information in the substance registry. The user management system ensures that laboratories are restricted from editing content in the database core or in repertoires within other laboratory subpages. The portal is for nonprofit use, and has been fully funded by the Norwegian Medical Association, the Norwegian Society of Clinical Pharmacology, and the 8 largest pharmacologic institutions in Norway. The database server runs an open-source content management system that ensures flexibility with respect to further development projects, including the potential expansion of the Pharmacology Portal to other countries. Copyright © 2016 Elsevier HS Journals, Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Candela, L.; Ruggieri, G.; Giancaspro, A.
2004-09-01
Within the Italian Space Agency's "Multi-Mission Ground Segment" project, several innovative technologies such as CORBA[1], Z39.50[2], XML[3], Java[4], JavaServer Pages[4] and C++ have been evaluated. The SSPI system (Space Service Provider Infrastructure) is the prototype of a distributed environment aimed at facilitating access to Earth Observation (EO) data. SSPI allows users to ingest, archive, consolidate, visualize and evaluate these data. Hence, SSPI is not just a database or a data repository, but an application that, by means of a set of protocols, standards and specifications, provides unified access to multi-mission EO data.
Draft secure medical database standard.
Pangalos, George
2002-01-01
Medical database security is a particularly important issue for all healthcare establishments. Medical information systems are intended to support a wide range of pertinent health issues today, for example: assuring the quality of care, supporting effective management of health services institutions, monitoring and containing the cost of care, implementing technology into care without violating social values, ensuring the equity and availability of care, and preserving humanity despite the proliferation of technology. In this context, medical database security aims primarily to support high availability, accuracy and consistency of the stored data, medical professional secrecy and confidentiality, and protection of the privacy of the patient. These properties, though of a technical nature, basically require that the system is actually helpful for medical care and not harmful to patients. These latter properties require in turn not only that fundamental ethical principles are not violated by employing database systems, but that they are effectively enforced by technical means. This document reviews the existing and emerging work on the security of medical database systems. It presents in detail the problems and requirements related to medical database security, and addresses medical database security policies, secure design methodologies and implementation techniques. It also describes the current legal framework and regulatory requirements for medical database security, and examines the issue of medical database security guidelines in detail. The current national and international efforts in the area are studied, and an overview of research work in the area is given. The document also presents in detail the most complete set of security guidelines known to us for the development and operation of medical database systems.
NASA Astrophysics Data System (ADS)
Leavesley, G.; Markstrom, S.; Frevert, D.; Fulp, T.; Zagona, E.; Viger, R.
2004-12-01
Increasing demands for limited fresh-water supplies, and increasing complexity of water-management issues, present the water-resource manager with the difficult task of achieving an equitable balance of water allocation among a diverse group of water users. The Watershed and River System Management Program (WARSMP) is a cooperative effort between the U.S. Geological Survey (USGS) and the Bureau of Reclamation (BOR) to develop and deploy a database-centered, decision-support system (DSS) to address these multi-objective, resource-management problems. The decision-support system couples the USGS Modular Modeling System (MMS) with the BOR RiverWare tools using a shared relational database. MMS is an integrated system of computer software that provides a research and operational framework to support the development and integration of a wide variety of hydrologic and ecosystem models, and their application to water- and ecosystem-resource management. RiverWare is an object-oriented reservoir and river-system modeling framework developed to provide tools for evaluating and applying water-allocation and management strategies. The modeling capabilities of MMS and RiverWare include simulating watershed runoff, reservoir inflows, and the impacts of resource-management decisions on municipal, agricultural, and industrial water users, environmental concerns, power generation, and recreational interests. Forecasts of future climatic conditions are a key component in the application of MMS models to resource-management decisions. Forecast methods applied in MMS include a modified version of the National Weather Service's Extended Streamflow Prediction Program (ESP) and statistical downscaling from atmospheric models. The WARSMP DSS is currently operational in the Gunnison River Basin, Colorado; Yakima River Basin, Washington; Rio Grande Basin in Colorado and New Mexico; and Truckee River Basin in California and Nevada.
Bouchet, Philippe; Boxshall, Geoff; Fauchald, Kristian; Gordon, Dennis; Hoeksema, Bert W.; Poore, Gary C. B.; van Soest, Rob W. M.; Stöhr, Sabine; Walter, T. Chad; Vanhoorne, Bart; Decock, Wim
2013-01-01
The World Register of Marine Species is an over 90% complete open-access inventory of all marine species names. Here we illustrate the scale of the problems with species names, synonyms, and their classification, and describe how WoRMS publishes online quality assured information on marine species. Within WoRMS, over 100 global, 12 regional and 4 thematic species databases are integrated with a common taxonomy. Over 240 editors from 133 institutions and 31 countries manage the content. To avoid duplication of effort, content is exchanged with 10 external databases. At present WoRMS contains 460,000 taxonomic names (from Kingdom to subspecies), 368,000 species level combinations of which 215,000 are currently accepted marine species names, and 26,000 related but non-marine species. Associated information includes 150,000 literature sources, 20,000 images, and locations of 44,000 specimens. Usage has grown linearly since its launch in 2007, with about 600,000 unique visitors to the website in 2011, and at least 90 organisations from 12 countries using WoRMS for their data management. By providing easy access to expert-validated content, WoRMS improves quality control in the use of species names, with consequent benefits to taxonomy, ecology, conservation and marine biodiversity research and management. The service manages information on species names that would otherwise be overly costly for individuals, and thus minimises errors in the application of nomenclature standards. WoRMS' content is expanding to include host-parasite relationships, additional literature sources, locations of specimens, images, distribution range, ecological, and biological data. Species are being categorised as introduced (alien, invasive), of conservation importance, and on other attributes. These developments have a multiplier effect on its potential as a resource for biodiversity research and management. 
As a consequence of WoRMS, we are witnessing improved communication within the scientific community, and anticipate increased taxonomic efficiency and quality control in marine biodiversity research and management. PMID:23505408
Costello, Mark J; Bouchet, Philippe; Boxshall, Geoff; Fauchald, Kristian; Gordon, Dennis; Hoeksema, Bert W; Poore, Gary C B; van Soest, Rob W M; Stöhr, Sabine; Walter, T Chad; Vanhoorne, Bart; Decock, Wim; Appeltans, Ward
2013-01-01
The World Register of Marine Species is an over 90% complete open-access inventory of all marine species names. Here we illustrate the scale of the problems with species names, synonyms, and their classification, and describe how WoRMS publishes online quality assured information on marine species. Within WoRMS, over 100 global, 12 regional and 4 thematic species databases are integrated with a common taxonomy. Over 240 editors from 133 institutions and 31 countries manage the content. To avoid duplication of effort, content is exchanged with 10 external databases. At present WoRMS contains 460,000 taxonomic names (from Kingdom to subspecies), 368,000 species level combinations of which 215,000 are currently accepted marine species names, and 26,000 related but non-marine species. Associated information includes 150,000 literature sources, 20,000 images, and locations of 44,000 specimens. Usage has grown linearly since its launch in 2007, with about 600,000 unique visitors to the website in 2011, and at least 90 organisations from 12 countries using WoRMS for their data management. By providing easy access to expert-validated content, WoRMS improves quality control in the use of species names, with consequent benefits to taxonomy, ecology, conservation and marine biodiversity research and management. The service manages information on species names that would otherwise be overly costly for individuals, and thus minimises errors in the application of nomenclature standards. WoRMS' content is expanding to include host-parasite relationships, additional literature sources, locations of specimens, images, distribution range, ecological, and biological data. Species are being categorised as introduced (alien, invasive), of conservation importance, and on other attributes. These developments have a multiplier effect on its potential as a resource for biodiversity research and management. 
As a consequence of WoRMS, we are witnessing improved communication within the scientific community, and anticipate increased taxonomic efficiency and quality control in marine biodiversity research and management.
Building Databases for Education. ERIC Digest.
ERIC Educational Resources Information Center
Klausmeier, Jane A.
This digest provides a brief explanation of what a database is; explains how a database can be used; identifies important factors that should be considered when choosing database management system software; and provides citations to sources for finding reviews and evaluations of database management software. The digest is concerned primarily with…
E-DECIDER Decision Support Gateway For Earthquake Disaster Response
NASA Astrophysics Data System (ADS)
Glasscoe, M. T.; Stough, T. M.; Parker, J. W.; Burl, M. C.; Donnellan, A.; Blom, R. G.; Pierce, M. E.; Wang, J.; Ma, Y.; Rundle, J. B.; Yoder, M. R.
2013-12-01
Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing capabilities for decision-making utilizing remote sensing data and modeling software in order to provide decision support for earthquake disaster management and response. E-DECIDER incorporates earthquake forecasting methodology and geophysical modeling tools developed through NASA's QuakeSim project in order to produce standards-compliant map data products to aid in decision-making following an earthquake. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools, help provide both long-term planning information for disaster management decision makers as well as short-term information following earthquake events (i.e., identifying areas where the greatest deformation and damage have occurred and emergency services may need to be focused). E-DECIDER utilizes a service-based GIS model for its cyber-infrastructure in order to produce standards-compliant products for different user types with multiple service protocols (such as KML, WMS, WFS, and WCS). The goal is to make complex GIS processing and domain-specific analysis tools more accessible to general users through software services as well as to provide system sustainability through infrastructure services. The system comprises several components, which include: a GeoServer for thematic mapping and data distribution, a geospatial database for storage and spatial analysis, web service APIs, including simple-to-use REST APIs for complex GIS functionalities, and geoprocessing tools, including Python scripts to produce standards-compliant data products. These are then served to the E-DECIDER decision support gateway (http://e-decider.org), the E-DECIDER mobile interface, and to the Department of Homeland Security decision support middleware UICDS (Unified Incident Command and Decision Support).
The E-DECIDER decision support gateway features a web interface that delivers map data products including deformation modeling results (slope change and strain magnitude) and aftershock forecasts, with remote sensing change detection results under development. These products are event triggered (from the USGS earthquake feed) and will be posted to event feeds on the E-DECIDER webpage and accessible via the mobile interface and UICDS. E-DECIDER also features a KML service that provides infrastructure information from the FEMA HAZUS database through UICDS and the mobile interface. The back-end GIS service architecture and front-end gateway components form a decision support system that is designed for ease-of-use and extensibility for end-users.
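One step of the event-triggered pipeline described above — turning an earthquake event record into a standards-compliant map product — can be sketched as follows. The input shape mirrors a GeoJSON feature (as in the USGS earthquake feed); the function and field names are illustrative assumptions, not E-DECIDER's actual product code.

```python
# Hedged sketch: convert one GeoJSON-style earthquake feature into a
# minimal KML placemark, the kind of standards-compliant product the
# gateway serves. Real E-DECIDER products are far richer than this.
def event_to_kml(feature):
    lon, lat, depth = feature["geometry"]["coordinates"]  # GeoJSON order: lon, lat, depth
    title = feature["properties"]["title"]
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2"><Placemark>'
        f'<name>{title}</name>'
        f'<Point><coordinates>{lon},{lat},{depth}</coordinates></Point>'
        '</Placemark></kml>'
    )
```

A downstream viewer (the gateway, the mobile interface, or UICDS) could then render the placemark directly.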
In the Jungle of Astronomical On-line Data Services
NASA Astrophysics Data System (ADS)
Egret, D.
The author tried to survive in the jungle of astronomical on-line data services. In order to find efficient answers to common scientific data retrieval requests, he had to collect many pieces of information, formulate typical user scenarios, and try them against a number of different databases, catalogue services, and information systems. He discovered soon how frustrating treasure coffers may be when their keys are not available, but he realized also that nice widgets and gadgets are of no help when the information is not there. And, before long, he knew he would have to navigate through several systems because no one was yet offering a general answer to all his questions. I will present examples of common user scenarios and show how they were tested against a number of services. I will propose some elements of classification which should help the end-user to evaluate how adequate the different services may be for providing satisfying answers to specific queries. For that, many aspects of the user interaction will be considered: documentation, access, query formulation, functionalities, qualification of the data, overall efficiency, etc. I will also suggest possible improvements to the present situation: the first of them being to encourage system managers to increase collaboration with one another, for the benefit of the whole astronomical community. The subjective review I will present is based on publicly available astronomical on-line services from the U.S. and from Europe, most of which (excepting the newcomers) were described in "Databases and On-Line Data in Astronomy" (Albrecht & Egret, eds., 1991): this includes databases (such as NED and Simbad), catalog services (StarCat, DIRA, XCatScan, etc.), and information systems (ADS and ESIS).
Fu, Yu; McNichol, Elaine; Marczewski, Kathryn; Closs, S José
2016-05-01
Chronic back pain is common, and its self-management may be a lifelong task for many patients. While health professionals can provide a service or support for pain, only patients can actually experience it. It is likely that optimum self-management of chronic back pain may only be achieved when patients and professionals develop effective partnerships which integrate their complementary knowledge and skills. However, at present, there is no evidence to explain how such partnerships can influence patients' self-management ability. This review aimed to explore the influence of patient-professional partnerships on patients' ability to self-manage chronic back pain, and to identify key factors within these partnerships that may influence self-management. A systematic review was undertaken, aiming to retrieve relevant studies using any research method. Five databases were searched for papers published between 1980 and 2014, including Cochrane Library, CINAHL, Medline, EMBASE and PsycINFO. Eligible studies were those reporting on patients being supported by professionals to self-manage chronic back pain; patients being actively involved for self-managing chronic back pain; and the influence of patient-professional partnerships on self-management of chronic back pain. Included studies were critically appraised for quality, and findings were extracted and analysed thematically. A total of 738 studies were screened, producing 10 studies for inclusion, all of which happened to use qualitative methods. Seven themes were identified: communication, mutual understanding, roles of health professionals, information delivery, patients' involvement, individualised care and healthcare service. These themes were developed into a model suggesting how factors within patient-professional partnerships influence self-management. 
Review findings suggest that a partnership between patients and professionals supports patients' self-management ability, and effective communication is a fundamental factor underpinning their partnerships in care. It also calls for the development of individualised healthcare services offering self-referral or telephone consultation to patients with chronic conditions. © 2015 John Wiley & Sons Ltd.
Recommendations for Planning and Managing International Short-term Pharmacy Service Trips
Alsharif, Naser Z.; Rovers, John; Connor, Sharon; White, Nicole D.; Hogue, Michael D.
2017-01-01
International pharmacy service trips by schools and colleges of pharmacy allow students to provide health care to medically underserved areas. A literature review (2000-2016) in databases and Internet searches with specific keywords or terms was performed to assess current practices to establish and maintain successful pharmacy service trips. Educational documents such as syllabi were obtained from pharmacy programs and examined. A preliminary draft was developed and authors worked on sections of interest and expertise. Considerations and current recommendations are provided for the key aspects of the home institution and the host country requirements for pharmacy service trips based on findings from a literature search and the authors’ collective, extensive experience. Evaluation of the trip and ethical considerations are also discussed. This article serves as a resource for schools and colleges of pharmacy that are interested in the development of new pharmacy service trips and provides key considerations for continuous quality improvement of current or future activities. PMID:28381883
Gutiérrez, Miguel F; Cajiao, Alejandro; Hidalgo, José A; Cerón, Jesús D; López, Diego M; Quintero, Víctor M; Rendón, Alvaro
2014-01-01
This article presents the development process of an acquisition and data storage system that manages clinical variables through a cloud storage service and a Personal Health Record (PHR) system. First, the paper explains how a Wireless Body Area Network (WBAN) is designed to capture data from two sensors measuring arterial pressure and heart rate. Second, it illustrates how data collected by the WBAN are transmitted to a cloud storage service, which stores the data persistently in an online database system. Finally, the paper describes how the data stored in the cloud service are sent to the Indivo PHR system, where they are registered and charted for future review by health professionals. The research demonstrated the feasibility of implementing WBANs for the acquisition of clinical data, and particularly of using Web technologies and standards to provide interoperability with PHR systems at the technical and syntactic levels.
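The acquisition path described above (sensor readings batched into a payload and written to persistent cloud storage) can be sketched as below. The in-memory store, field names, and function are illustrative assumptions standing in for the paper's actual cloud service and WBAN gateway.

```python
# Sketch of the WBAN-to-cloud step, under assumed names: readings from
# the two sensors (arterial pressure, heart rate) are serialized and
# written to a store that stands in for the online cloud database.
import json
import time

class CloudStore:
    """In-memory stand-in for the persistent online database service."""
    def __init__(self):
        self.records = []

    def put(self, payload):
        self.records.append(json.loads(payload))

def push_readings(store, patient_id, pressure_mmHg, heart_rate_bpm):
    payload = json.dumps({
        "patient": patient_id,
        "timestamp": time.time(),
        "arterial_pressure": pressure_mmHg,
        "heart_rate": heart_rate_bpm,
    })
    store.put(payload)  # in a real deployment: an HTTPS POST to the cloud API
```

From such a store, a second step would forward the records to the PHR system for charting.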
Newborn screening healthcare information system based on service-oriented architecture.
Hsieh, Sung-Huai; Hsieh, Sheau-Ling; Chien, Yin-Hsiu; Weng, Yung-Ching; Hsu, Kai-Ping; Chen, Chi-Huang; Tu, Chien-Ming; Wang, Zhenyu; Lai, Feipei
2010-08-01
In this paper, we establish a newborn screening system under the HL7/Web Services frameworks. We rebuilt the NTUH Newborn Screening Laboratory's original standalone architecture, in which various heterogeneous systems operated individually, and restructured it into a Service-Oriented Architecture (SOA) distributed platform for further integrity and enhancement of sample collection, testing, diagnosis, evaluation, treatment and follow-up services, screening database management, and collaboration and communication among hospitals; decision support and improved screening accuracy across the Taiwan neonatal systems are also addressed. In addition, the new system not only integrates the newborn screening procedures among phlebotomy clinics, referral hospitals, and the newborn screening center in Taiwan, but also introduces new models of screening procedures for the associated medical practitioners. Furthermore, it reduces the burden of the manual operations, especially the reporting services, that were heavily relied upon previously. The new system accelerates the whole procedure effectively and efficiently, and improves the accuracy and reliability of screening by ensuring quality control during processing.
Recommendations for Planning and Managing International Short-term Pharmacy Service Trips.
Johnson, Kalin L; Alsharif, Naser Z; Rovers, John; Connor, Sharon; White, Nicole D; Hogue, Michael D
2017-03-25
International pharmacy service trips by schools and colleges of pharmacy allow students to provide health care to medically underserved areas. A literature review (2000-2016) in databases and Internet searches with specific keywords or terms was performed to assess current practices to establish and maintain successful pharmacy service trips. Educational documents such as syllabi were obtained from pharmacy programs and examined. A preliminary draft was developed and authors worked on sections of interest and expertise. Considerations and current recommendations are provided for the key aspects of the home institution and the host country requirements for pharmacy service trips based on findings from a literature search and the authors' collective, extensive experience. Evaluation of the trip and ethical considerations are also discussed. This article serves as a resource for schools and colleges of pharmacy that are interested in the development of new pharmacy service trips and provides key considerations for continuous quality improvement of current or future activities.
Hudon, Catherine; Chouinard, Maud-Christine; Lambert, Mireille; Diadiou, Fatoumata; Bouliane, Danielle; Beaudin, Jérémie
2017-10-22
The aim of this paper was to identify the key factors of case management (CM) interventions among frequent users of healthcare services found in empirical studies of effectiveness. Thematic analysis review of CM studies. We built on a previously published review that aimed to report the effectiveness of CM interventions for frequent users of healthcare services, using the Medline, Scopus and CINAHL databases covering the January 2004-December 2015 period, then updated to July 2017, with the keywords 'CM' and 'frequent use'. We extracted factors of successful (n=7) and unsuccessful (n=6) CM interventions and conducted a mixed thematic analysis to synthesise findings. Chaudoir's implementation of health innovations framework was used to organise results into four broad levels of factors: (1) ,environmental/organisational level, (2) practitioner level, (3) patient level and (4) programme level. Access to, and close partnerships with, healthcare providers and community services resources were key factors of successful CM interventions that should target patients with the greatest needs and promote frequent contacts with the healthcare team. The selection and training of the case manager was also an important factor to foster patient engagement in CM. Coordination of care, self-management support and assistance with care navigation were key CM activities. The main issues reported by unsuccessful CM interventions were problems with case finding or lack of care integration. CM interventions for frequent users of healthcare services should ensure adequate case finding processes, rigorous selection and training of the case manager, sufficient intensity of the intervention, as well as good care integration among all partners. Other studies could further evaluate the influence of contextual factors on intervention impacts. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. 
No commercial use is permitted unless otherwise expressly granted.
NASA Technical Reports Server (NTRS)
Hochstadt, Jake
2011-01-01
Ruby on Rails is an open source web application framework for the Ruby programming language. The first application I built was a web application to manage and authenticate other applications. One of the main requirements for this application was a single sign-on service. This allowed authentication to be built in one location and be implemented in many different applications. For example, users would be able to login using their existing credentials, and be able to access other NASA applications without authenticating again. The second application I worked on was an internal qualification plan app. Previously, the viewing of employee qualifications was managed through Excel spread sheets. I built a database driven application to streamline the process of managing qualifications. Employees would be able to login securely to view, edit and update their personal qualifications.
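The single sign-on idea described above — authenticate once, then have other applications accept proof of that login instead of re-authenticating — is commonly built on signed tokens. The report's application was Rails; the Python/HMAC sketch below is an illustrative stand-in, not NASA's implementation, and the shared secret is an assumption.

```python
# Hedged sketch of token-based single sign-on: the auth service issues a
# signed token, and any trusted application verifies the signature rather
# than asking the user to log in again.
import hashlib
import hmac

SECRET = b"shared-sso-secret"  # assumed to be shared among the trusted apps

def issue_token(username):
    """Called by the sign-on service after a successful login."""
    sig = hmac.new(SECRET, username.encode(), hashlib.sha256).hexdigest()
    return f"{username}:{sig}"

def verify_token(token):
    """Called by any other application to accept the existing login."""
    username, _, sig = token.rpartition(":")
    expected = hmac.new(SECRET, username.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)
```

A tampered token fails verification, so applications can trust the username it carries without holding any credentials themselves.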
ERIC Educational Resources Information Center
American Society for Information Science, Washington, DC.
This document contains abstracts of papers on database design and management which were presented at the 1986 mid-year meeting of the American Society for Information Science (ASIS). Topics considered include: knowledge representation in a bilingual art history database; proprietary database design; relational database design; in-house databases;…
The Network Configuration of an Object Relational Database Management System
NASA Technical Reports Server (NTRS)
Diaz, Philip; Harris, W. C.
2000-01-01
The networking and implementation of the Oracle Database Management System (ODBMS) require developers to have knowledge of the UNIX operating system as well as all the features of the Oracle Server. The server is an object-relational database management system (DBMS). By using distributed processing, processes are split up between the database server and client application programs. The DBMS handles all the responsibilities of the server. The workstations running the database application concentrate on the interpretation and display of data.
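The client/server split described above can be illustrated with any SQL database: the server evaluates the query (filtering, grouping, aggregation), and the client program only interprets and displays the result. The sketch below uses Python's built-in sqlite3 as a stand-in for an Oracle connection (with Oracle, the client would connect over the network instead); the table and data are made up for illustration.

```python
# Illustration of distributed processing in a DBMS, with sqlite3 standing
# in for an Oracle server connection. The SQL below runs "server side";
# the dict comprehension at the end is the "client side" display step.
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for a remote database server
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("east", 100.0), ("east", 250.0), ("west", 75.0)])

# Server side: the DBMS evaluates the grouping and SUM, returning only rows.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()

# Client side: the application program merely interprets and displays data.
report = {region: total for region, total in rows}
```

Keeping the aggregation in SQL means only two summary rows cross the client/server boundary, which is the point of the distributed-processing design.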
Ha, Dongmun; Song, Inmyung; Lee, Eui-Kyung; Shin, Ju-Young
2018-05-03
Predicting pharmacy service fees is crucial to sustain the health insurance budget and maintain pharmacy management. However, there is no evidence on how to predict pharmacy service fees at the population level. This study compares the status of pharmacy services and constructs a regression model to project annual pharmacy service fees in Korea. We conducted a time-series analysis using sample data from the national health insurance database from 2006 to 2012. To reflect the latest trend, we categorized pharmacies into general hospital, special hospital, and clinic outpatient pharmacies based on the major source of service fees, using a 1% sample of the 2012 data. We estimated the daily number of prescriptions, pharmacy service fees, and drug costs according to these three types of pharmacy services. To forecast pharmacy service fees, a regression model was constructed to estimate annual fees in the following year (2013). The dependent variable was pharmacy service fees and the independent variables were the number of prescriptions and service fees per pharmacy, ratio of patients (≥ 65 years), conversion factor, change of policy, and types of pharmacy services. Among the 21,283 pharmacies identified, 5.0% (1064), 4.6% (974), and 77.5% (16,340) were general hospital, special hospital, and clinic outpatient pharmacies, respectively, in 2012. General hospital pharmacies showed a higher daily number of prescriptions (111.9), higher pharmacy service fees ($25,546,342), and higher annual drug costs ($215,728,000) per pharmacy than any other pharmacy (p < 0.05). The regression model found the ratio of patients aged 65 years and older and the conversion factor to be associated with an increase in pharmacy service fees. It also estimated the future rate of increase in pharmacy service fees to be between 3.1% and 7.8%. General hospital outpatient pharmacies spent more on annual pharmacy service fees than any other type of pharmacy.
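The forecasting setup described above — fees regressed on pharmacy-level covariates — can be sketched as an ordinary least-squares fit. The synthetic data and coefficients below are purely illustrative assumptions; the study's actual covariates also include policy change and pharmacy type, and its coefficients are not reproduced here.

```python
# Hedged OLS sketch of the fee-projection model, on synthetic data. The
# "true" coefficients below exist only to generate the toy dataset.
import numpy as np

rng = np.random.default_rng(0)
n = 200
prescriptions = rng.uniform(20, 120, n)     # daily prescriptions per pharmacy
elderly_ratio = rng.uniform(0.05, 0.40, n)  # share of patients aged >= 65
conversion = rng.uniform(60, 75, n)         # fee conversion factor

# Assumed noiseless relationship used only to build the example.
fees = 500 + 40 * prescriptions + 9000 * elderly_ratio + 25 * conversion

# Fit: design matrix with an intercept column, solved by least squares.
X = np.column_stack([np.ones(n), prescriptions, elderly_ratio, conversion])
coef, *_ = np.linalg.lstsq(X, fees, rcond=None)
forecast = X @ coef  # fitted (here, in-sample) annual fees
```

With next year's covariate values in place of `X`, the same `X @ coef` product yields the projected fees, which is the structure of the study's 2013 forecast.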
The forecast of annual pharmacy service fees in Korea was similar to that of Australia, but not that of the United Kingdom.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bower, J.C.; Burford, M.J.; Downing, T.R.
The Integrated Baseline System (IBS) is an emergency management planning and analysis tool that is being developed under the direction of the US Army Nuclear and Chemical Agency (USANCA). The IBS Data Management Guide provides the background, as well as the operations and procedures, needed to generate and maintain a site-specific map database. Data and system managers use this guide to manage the data files and database that support the administrative, user-environment, database management, and operational capabilities of the IBS. This document provides a description of the data files and structures necessary for running the IBS software and using the site map database.
Acoustic Metadata Management and Transparent Access to Networked Oceanographic Data Sets
2011-09-30
Roberts in Pat Halpin's lab, integrating the Marine Geospatial Ecology (GeoEco) toolset into our database services. While there is a steep...noise bands. The lower box at each site denotes the 1-6 kHz band while the upper box denotes the 6-96 kHz band. Lad seamount has deployments at two sites...N00014-11-1-0697 http://cetus.ucsd.edu
OASIS: A Data Fusion System Optimized for Access to Distributed Archives
NASA Astrophysics Data System (ADS)
Berriman, G. B.; Kong, M.; Good, J. C.
2002-05-01
The On-Line Archive Science Information Services (OASIS) is accessible as a Java applet through the NASA/IPAC Infrared Science Archive home page. It uses Geographical Information System (GIS) technology to provide data fusion and interaction services for astronomers. These services include the ability to process and display arbitrarily large image files, and user-controlled contouring, overlay regeneration, and multi-table/image interactions. OASIS has been optimized for access to distributed archives and data sets. Its second release (June 2002) provides a mechanism that enables access to OASIS from "third-party" services and data providers. That is, any data provider who creates a query form to an archive containing a collection of data (images, catalogs, spectra) can direct the result files from the query into OASIS. Similarly, data providers who serve links to datasets or remote services on a web page can access all of these data with one instance of OASIS. In this way, any data or service provider is given access to the full suite of capabilities of OASIS. We illustrate the "third-party" access feature with two examples: queries to the high-energy image datasets accessible from GSFC SkyView, and links to data that are returned from a target-based query to the NASA Extragalactic Database (NED). The second release of OASIS also includes a file-transfer manager that reports the status of multiple data downloads from remote sources to the client machine. It is a prototype for a request management system that will ultimately control and manage compute-intensive jobs submitted through OASIS to computing grids, such as requests for large-scale image mosaics and bulk statistical analysis.
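The file-transfer manager described above tracks the status of several concurrent downloads. The following is a generic sketch of that pattern, not OASIS source code: class and method names are invented for illustration, and the transfers themselves are simulated.

```python
# Generic sketch: a minimal transfer manager that runs several requests
# concurrently and reports per-request status, in the spirit of the
# file-transfer manager described in the abstract. Transfers are simulated.
import threading
import time

class TransferManager:
    def __init__(self):
        self.status = {}          # request name -> "queued" | "running" | "done"
        self.lock = threading.Lock()

    def submit(self, name, work):
        """Queue a named transfer and start it on its own thread."""
        with self.lock:
            self.status[name] = "queued"
        t = threading.Thread(target=self._run, args=(name, work))
        t.start()
        return t

    def _run(self, name, work):
        with self.lock:
            self.status[name] = "running"
        work()                    # the actual transfer; simulated here
        with self.lock:
            self.status[name] = "done"

    def report(self):
        """Snapshot of every request's current status."""
        with self.lock:
            return dict(self.status)

mgr = TransferManager()
threads = [mgr.submit(f"dataset-{i}", lambda: time.sleep(0.05)) for i in range(3)]
for t in threads:
    t.join()
print(mgr.report())
```

A real client would replace the simulated `work` callable with an actual HTTP or FTP transfer and poll `report()` to drive a progress display.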
Eakle, Wade; Haggerty, Patti; Fuller, Mark; Phillips, Susan L.
2013-01-01
The purpose of this Data Series report is to provide the occasions, locations, and counts when golden eagles were recorded during the annual Midwinter Bald Eagle Surveys. Golden eagles (Aquila chrysaetos) are protected by Federal statutes including the Bald and Golden Eagle Protection Act (BGEPA) (16 USC 668-668c) and the Migratory Bird Treaty Act (MBTA) (16 USC 703-12). The U.S. Fish and Wildlife Service (Service) manages golden eagles with the goal of maintaining stable or increasing breeding populations (U.S. Fish and Wildlife Service, 2009). Development for the generation of electricity from wind turbines is occurring in much of the range of the golden eagle in the western United States. Development could threaten population stability because golden eagles might be disturbed by construction and operation of facilities and they are vulnerable to mortality from collisions with wind turbines (Smallwood and Thelander, 2008). Therefore, the Service has proposed a process by which wind energy developers can collect information that could lead to Eagle Conservation Plans (ECP), mitigation, and permitting that allow for golden eagle management in areas of wind energy development (U.S. Fish and Wildlife Service, 2011). The Service recommends that ECP be developed in stages, and the first stage is to learn if golden eagles occur at the landscape level where potential wind facilities might be located. Information about where eagles occur can be obtained from technical literature, agency files, and other sources of information including on-line biological databases. The broad North American distribution of golden eagles is known, but there is a paucity of readily available information about intermediate geographic scales and site-specific scales, especially during the winter season (Kochert and others, 2002).
[Selected aspects of computer-assisted literature management].
Reiss, M; Reiss, G
1998-01-01
We report our own experiences with a database manager. Bibliography database managers are used to manage information resources: specifically, to maintain a database of references and to create bibliographies and reference lists for written works. A database manager allows the user to enter summary information (a record) for articles, book sections, books, dissertations, conference proceedings, and so on. Other features may include the ability to import references from different sources, such as MEDLINE. The word-processing components can generate reference lists and bibliographies in a variety of styles, including a reference list built directly from a word-processor manuscript. The function and use of the software package EndNote 2 for Windows are described. Its advantages in fulfilling different requirements for citation style and the sort order of reference lists are emphasized.
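The core idea behind such a bibliography manager can be sketched in a few lines. This is not EndNote's format or API; the field names, style functions, and sample records are illustrative only.

```python
# Sketch of the idea behind a bibliography database manager: structured
# reference records rendered into two citation styles and sorted by year.
# Field names and the two styles are illustrative assumptions.
refs = [
    {"authors": "Reiss M, Reiss G", "year": 1998,
     "title": "Computer-assisted literature management"},
    {"authors": "Smith J", "year": 1995,
     "title": "Managing references"},
]

def vancouver(r):
    # Numbered-style rendering: Authors. Title. Year.
    return f"{r['authors']}. {r['title']}. {r['year']}."

def author_year(r):
    # Author-year rendering: Authors (Year). Title.
    return f"{r['authors']} ({r['year']}). {r['title']}."

# Different sort orders for the final reference list
by_year = sorted(refs, key=lambda r: r["year"])
for r in by_year:
    print(vancouver(r))
print(author_year(refs[0]))
```

The point the abstract makes is exactly this separation: the record is entered once, and the citation style and sort order are applied at output time.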