Sample records for management system DBMS

  1. Application of a data base management system to a finite element model

    NASA Technical Reports Server (NTRS)

    Rogers, J. L., Jr.

    1980-01-01

    In today's software market, much effort is being expended on the development of data base management systems (DBMS). Most commercially available DBMS were designed for business use. However, the need for such systems within the engineering and scientific communities is becoming apparent. A potential DBMS application that appears attractive is the handling of data for finite element engineering models. The application of a commercially available, business-oriented DBMS to a structural engineering finite element model is explored. The model, the DBMS, an approach to using the DBMS, and the advantages and disadvantages are described. Plans for research on a scientific and engineering DBMS are discussed.
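
    The core idea carries over directly to any relational system: nodes and elements become related tables, and model queries become joins. Below is a minimal sketch using Python's standard sqlite3 module; the schema and all names are illustrative assumptions, not taken from the paper (which used a 1980 business-oriented DBMS).

    ```python
    import sqlite3

    # Illustrative schema: a finite element model as two related tables.
    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE node (
            node_id INTEGER PRIMARY KEY,
            x REAL, y REAL, z REAL               -- nodal coordinates
        );
        CREATE TABLE element (
            elem_id INTEGER PRIMARY KEY,
            n1 INTEGER REFERENCES node(node_id),
            n2 INTEGER REFERENCES node(node_id),
            n3 INTEGER REFERENCES node(node_id)  -- triangular element connectivity
        );
    """)
    con.executemany("INSERT INTO node VALUES (?,?,?,?)",
                    [(1, 0.0, 0.0, 0.0), (2, 1.0, 0.0, 0.0), (3, 0.0, 1.0, 0.0)])
    con.execute("INSERT INTO element VALUES (1, 1, 2, 3)")

    # A query the DBMS answers directly: coordinates of every node of element 1.
    for row in con.execute("""
            SELECT n.node_id, n.x, n.y, n.z
            FROM element e JOIN node n ON n.node_id IN (e.n1, e.n2, e.n3)
            WHERE e.elem_id = 1"""):
        print(row)
    ```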

  2. Generalized File Management System or Proto-DBMS?

    ERIC Educational Resources Information Center

    Braniff, Tom

    1979-01-01

    The use of a data base management system (DBMS) as opposed to traditional data processing is discussed. The generalized file concept is viewed as an entry level step to the DBMS. The transition process from one system to the other is detailed. (SF)

  3. The data base management system alternative for computing in the human services.

    PubMed

    Sircar, S; Schkade, L L; Schoech, D

    1983-01-01

    The traditional incremental approach to computerization presents substantial problems as systems develop and grow. The Data Base Management System approach to computerization was developed to overcome the problems resulting from implementing computer applications one at a time. The authors describe the applications approach and the alternative Data Base Management System (DBMS) approach through their developmental history, discuss the technology of DBMS components, and consider the implications of choosing the DBMS alternative. Human service managers need an understanding of the DBMS alternative and its applicability to their agency data processing needs. The basis for a conscious selection of computing alternatives is outlined.

  4. Generalized Database Management System Support for Numeric Database Environments.

    ERIC Educational Resources Information Center

    Dominick, Wayne D.; Weathers, Peggy G.

    1982-01-01

    This overview of potential for utilizing database management systems (DBMS) within numeric database environments highlights: (1) major features, functions, and characteristics of DBMS; (2) applicability to numeric database environment needs and user needs; (3) current applications of DBMS technology; and (4) research-oriented and…

  5. Microcomputer Database Management Systems for Bibliographic Data.

    ERIC Educational Resources Information Center

    Pollard, Richard

    1986-01-01

    Discusses criteria for evaluating microcomputer database management systems (DBMS) used for storage and retrieval of bibliographic data. Two popular types of microcomputer DBMS--file management systems and relational database management systems--are evaluated with respect to these criteria. (Author/MBR)

  6. Office automation: The administrative window into the integrated DBMS

    NASA Technical Reports Server (NTRS)

    Brock, G. H.

    1985-01-01

    In parallel to the evolution of Management Information Systems from simple data files to complex data bases, stand-alone computer systems have been migrating toward fully integrated systems serving the work force. The next major productivity gain may very well be to make these highly sophisticated working-level Data Base Management Systems (DBMS) serve all levels of management with reports of varying levels of detail. Most attempts by the DBMS development organization to provide useful information to management seem to bog down in the quagmire of competing working-level requirements. Most large DBMS development organizations possess three- to five-year backlogs. Perhaps Office Automation is the vehicle that brings to pass the Management Information System that really serves management. A good office automation system manned by a team of facilitators seeking opportunities to serve end users could go a long way toward defining a DBMS that serves management. This paper briefly discusses the problems of the DBMS organization, alternative approaches to solving some of the major problems, a debate about problems that may have no solution, and finally how office automation fits into the development of the manager's Management Information System.

  7. The administrative window into the integrated DBMS

    NASA Technical Reports Server (NTRS)

    Brock, G. H.

    1984-01-01

    A good office automation system manned by a team of facilitators seeking opportunities to serve end users could go a long way toward defining a DBMS that serves management. The problems of DBMS organization, alternative approaches to solving some of the major problems, problems that may have no solution, and how office automation fits into the development of the manager's management information system are discussed.

  8. Data Base Management Systems Panel Workshop: Executive summary

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Data base management systems (DBMS) for space-acquired and associated data are discussed. The full range of DBMS needs is covered, including the acquisition, management, storage, archiving, access, and dissemination of data for an application. Existing bottlenecks in DBMS operations and expected developments in the fields of remote sensing, communications, and computer science are discussed, and an overview of existing conditions and expected problems is presented. The requirements for a proposed spatial information system and the characteristics of a comprehensive browse facility for earth observations applications are included.

  9. A Data Base Management System for Clinical and Epidemiologic Studies In Systemic Lupus Erythematosus: Design and Maintenance

    PubMed Central

    Kosmides, Victoria S.; Hochberg, Marc C.

    1984-01-01

    This report describes the development, design specifications, features and implementation of a data base management system (DBMS) for clinical and epidemiologic studies in SLE. The DBMS is multidimensional with arrays formulated across patients, studies and variables. The major impact of this DBMS has been to increase the efficiency of managing and analyzing vast amounts of clinical and laboratory data and, as a result, to allow for continued growth in research productivity in areas related to SLE.

  10. Comparison of scientific and administrative database management systems

    NASA Technical Reports Server (NTRS)

    Stoltzfus, J. C.

    1983-01-01

    Some characteristics found to differ between scientific and administrative data bases are identified, and some of the corresponding generic requirements for data base management systems (DBMS) are discussed. The requirements discussed are those that are especially stringent for either scientific or administrative data bases. For some, no commercial DBMS is fully satisfactory, and the data base designer must invent a suitable approach. For others, commercial systems are available with elegant solutions, and a wrong choice would mean an expensive work-around to provide the missing features. It is concluded that selection of a DBMS must be based on the requirements for the information system. There is no unique distinction between scientific and administrative data bases or DBMS. The distinction comes from the logical structure of the data, and understanding the data and their relationships is the key to defining the requirements and selecting an appropriate DBMS for a given set of applications.

  11. Interconnecting heterogeneous database management systems

    NASA Technical Reports Server (NTRS)

    Gligor, V. D.; Luckenbaugh, G. L.

    1984-01-01

    It is pointed out that there is still a great need for the development of improved communication between remote, heterogeneous database management systems (DBMS). Problems regarding effective communication between distributed DBMSs are primarily related to significant differences between local data managers, local data models and representations, and local transaction managers. A system of interconnected DBMSs which exhibit such differences is called a network of distributed, heterogeneous DBMSs. In order to achieve effective interconnection of remote, heterogeneous DBMSs, users must have uniform, integrated access to the different DBMSs. The present investigation is mainly concerned with an analysis of the existing approaches to interconnecting heterogeneous DBMSs, taking into account four experimental DBMS projects.

  12. Data Base Management Systems Panel. Third workshop summary

    NASA Technical Reports Server (NTRS)

    Urena, J. L. (Editor)

    1981-01-01

    The discussions and results of a review by a panel of data base management system (DBMS) experts of various aspects of the use of DBMSs within NASA/Office of Space and Terrestrial Applications (OSTA) and related organizations are summarized. The topics discussed included the present status of the use of DBMS technology and of the various ongoing DBMS-related efforts within NASA. The report drafts of a study that seeks to determine the functional requirements for a generalized DBMS for the NASA/OSTA and related data bases are examined. Future problems and possibilities with the use of DBMS technology are also considered. A list of recommendations for NASA/OSTA data systems is included.

  13. ARCADIA: a system for the integration of angiocardiographic data and images by an object-oriented DBMS.

    PubMed

    Pinciroli, F; Combi, C; Pozzi, G

    1995-02-01

    Use of data base techniques to store medical records has been going on for more than 40 years. Some aspects still remain unresolved, e.g., the management of textual data and image data within a single system. Object-orientation techniques applied to a database management system (DBMS) allow the definition of suitable data structures (e.g., to store digital images): some facilities allow the use of predefined structures when defining new ones. Currently available object-oriented DBMS, however, still need improvements both in the schema update and in the query facilities. This paper describes a prototype of a medical record that includes some multimedia features, managing both textual and image data. The prototype here described considers data from the medical records of patients subjected to percutaneous transluminal coronary artery angioplasty. We developed it on a Sun workstation with a Unix operating system and ONTOS as an object-oriented DBMS.

  14. Geometrical model for DBMS: an experimental DBMS using IBM solid modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ali, D.E.D.L.

    1985-01-01

    This research presents a new model for data base management systems (DBMS). The new model, Geometrical DBMS, is based on using solid modelling technology in designing and implementing DBMS. The Geometrical DBMS is implemented using the IBM solid modelling Geometric Design Processor (GDP). Built basically on computer-graphics concepts, Geometrical DBMS is indeed a unique model. Traditionally, researchers start with one of the existing DBMS models and then put a graphical front end on it. In Geometrical DBMS, the graphical aspect of the model is not an alien concept tailored to the model but is, as a matter of fact, the atom around which the model is designed. The main idea in Geometrical DBMS is to allow the user and the system to refer to and manipulate data items as solid objects in 3D space, and to represent a record as a group of logically related solid objects. In Geometrical DBMS, a hierarchical structure is used to present the data relations, and the user sees the data as a group of arrays; yet, for the user and the system together, the data structure is a multidimensional tree.
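
    The central notion, data items as solids in 3D space and records as groups of logically related solids, can be caricatured in a few lines. This is only a toy sketch: every class and field name is invented, and the real system was built on IBM's GDP solid modeller, not Python.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Solid:
        """A data item rendered as a box in 3D space (placement + extent)."""
        name: str
        value: object
        origin: tuple                   # (x, y, z) placement in model space
        size: tuple = (1.0, 1.0, 1.0)

    @dataclass
    class Record:
        """A record is a group of logically related solids."""
        key: str
        solids: list = field(default_factory=list)

        def place(self, name, value, origin):
            self.solids.append(Solid(name, value, origin))

    # One record of an employee file, laid out as three adjacent boxes.
    rec = Record("emp-001")
    rec.place("name",   "Ada", (0.0, 0.0, 0.0))
    rec.place("dept",   "R&D", (1.0, 0.0, 0.0))
    rec.place("salary", 52000, (2.0, 0.0, 0.0))
    print([s.name for s in rec.solids])
    ```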

  15. NASA Administrative Data Base Management Systems, 1984

    NASA Technical Reports Server (NTRS)

    Radosevich, J. D. (Editor)

    1984-01-01

    Strategies for converting to a data base management system (DBMS) and the implementation of the necessary software packages are discussed. Experiences with DBMS at various NASA centers are related, including Langley's ADABAS/NATURAL and the NEMS subsystem of the NASA metrology information system. The value of the integrated workstation with a personal computer is explored.

  16. Data management applications

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Kennedy Space Center's primary institutional computer is a 4-megabyte IBM 4341 with 3.175 billion characters of IBM 3350 disk storage. This system utilizes the Software AG product known as ADABAS, with the online, user-oriented features of NATURAL and COMPLETE, as a Data Base Management System (DBMS). It is operational under OS/VS1 and is currently supporting batch/online applications such as Personnel, Training, Physical Space Management, Procurement, Office Equipment Maintenance, and Equipment Visibility. A third and by far the largest DBMS application is the Shuttle Inventory Management System (SIMS), which is operational on a dedicated Honeywell 6660 computer system utilizing Honeywell Integrated Data Storage I (IDSI) as the DBMS. The SIMS application is designed to provide central supply system acquisition, inventory control, receipt, storage, and issue of spares, supplies, and materials.

  17. Data base systems in electronic design engineering

    NASA Technical Reports Server (NTRS)

    Williams, D.

    1980-01-01

    The concepts of an integrated design data base management system (DBMS) as it might apply to an electronic design company are discussed. Data elements of documentation, project specifications, project tracking, firmware, software, and electronic and mechanical design can be integrated and managed through a single DBMS. Combining the attributes of a DBMS data handler with specialized systems and functional data can provide users with maximum flexibility, reduced redundancy, and increased overall system performance. Although some system overhead is incurred due to redundancy in transitory data, it is believed that combining the two data types is preferable to trying to do all data handling through a single DBMS.

  18. How Database Management Systems Can Be Used To Evaluate Program Effectiveness in Small School Districts.

    ERIC Educational Resources Information Center

    Hoffman, Tony

    Sophisticated database management systems (DBMS) for microcomputers are becoming increasingly easy to use, allowing small school districts to develop their own autonomous databases for tracking enrollment and student progress in special education. DBMS applications can be designed for maintenance by district personnel with little technical…

  19. Developing a database management system to support birth defects surveillance in Florida.

    PubMed

    Salemi, Jason L; Hauser, Kimberlea W; Tanner, Jean Paul; Sampat, Diana; Correia, Jane A; Watkins, Sharon M; Kirby, Russell S

    2010-01-01

    The value of any public health surveillance program is derived from the ways in which data are managed and used to improve the public's health. Although birth defects surveillance programs vary in their case volume, budgets, staff, and objectives, the capacity to operate efficiently and maximize resources remains critical to long-term survival. The development of a fully-integrated relational database management system (DBMS) can enrich a surveillance program's data and improve efficiency. To build upon the Florida Birth Defects Registry--a statewide registry relying solely on linkage of administrative datasets and unconfirmed diagnosis codes-the Florida Department of Health provided funding to the University of South Florida to develop and pilot an enhanced surveillance system in targeted areas with a more comprehensive approach to case identification and diagnosis confirmation. To manage operational and administrative complexities, a DBMS was developed, capable of managing transmission of project data from multiple sources, tracking abstractor time during record reviews, offering tools for defect coding and case classification, and providing reports to DBMS users. Since its inception, the DBMS has been used as part of our surveillance projects to guide the receipt of over 200 case lists and review of 12,924 fetuses and infants (with associated maternal records) suspected of having selected birth defects in over 90 birthing and transfer facilities in Florida. The DBMS has provided both anticipated and unexpected benefits. Automation of the processes for managing incoming case lists has reduced clerical workload considerably, while improving accuracy of working lists for field abstraction. Data quality has improved through more effective use of internal edits and comparisons with values for other data elements, while simultaneously increasing abstractor efficiency in completion of case abstraction. We anticipate continual enhancement to the DBMS in the future. While we have focused on enhancing the capacity of our DBMS for birth defects surveillance, many of the tools and approaches we have developed translate directly to other public health and clinical registries.

  20. SPIRE Data-Base Management System

    NASA Technical Reports Server (NTRS)

    Fuechsel, C. F.

    1984-01-01

    Spacelab Payload Integration and Rocket Experiment (SPIRE) data-base management system (DBMS) based on relational model of data bases. Data bases typically used for engineering and mission analysis tasks and, unlike most commercially available systems, allow data items and data structures stored in forms suitable for direct analytical computation. SPIRE DBMS designed to support data requests from interactive users as well as applications programs.

  1. The Network Configuration of an Object Relational Database Management System

    NASA Technical Reports Server (NTRS)

    Diaz, Philip; Harris, W. C.

    2000-01-01

    The networking and implementation of the Oracle Database Management System (ODBMS) requires developers to have knowledge of the UNIX operating system as well as all the features of the Oracle Server. The server is an object-relational database management system (DBMS). With distributed processing, work is split between the database server and client application programs. The DBMS handles all the responsibilities of the server, while the workstations running the database application concentrate on the interpretation and display of data.

  2. A management information system to study space diets

    NASA Technical Reports Server (NTRS)

    Kang, Sukwon; Both, A. J.; Janes, H. W. (Principal Investigator)

    2002-01-01

    A management information system (MIS), including a database management system (DBMS) and a decision support system (DSS), was developed to dynamically analyze the variable nutritional content of foods grown and prepared in an Advanced Life Support System (ALSS) such as required for long-duration space missions. The DBMS was designed around the known nutritional content of a list of candidate crops and their prepared foods. The DSS was designed to determine the composition of the daily crew diet based on crop and nutritional information stored in the DBMS. Each of the selected food items was assumed to be harvested from a yet-to-be designed ALSS biomass production subsystem and further prepared in accompanying food preparation subsystems. The developed DBMS allows for the analysis of the nutrient composition of a sample 20-day diet for future Advanced Life Support missions and is able to determine the required quantities of food needed to satisfy the crew's daily consumption. In addition, based on published crop growth rates, the DBMS was able to calculate the required size of the biomass production area needed to satisfy the daily food requirements for the crew. Results from this study can be used to help design future ALSS for which the integration of various subsystems (e.g., biomass production, food preparation and consumption, and waste processing) is paramount for the success of the mission.
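
    The biomass-area calculation the abstract describes is straightforward growth-rate arithmetic: required production area = daily requirement divided by areal growth rate. A hedged sketch follows; the crop names and figures are made up for illustration, not the paper's data.

    ```python
    # Hypothetical inputs: daily crew requirement (g dry mass/day) and
    # published crop growth rates (g dry mass per m^2 per day).
    daily_need_g = {"wheat": 600.0, "lettuce": 150.0, "potato": 450.0}
    growth_rate_g_m2_day = {"wheat": 22.7, "lettuce": 7.4, "potato": 18.1}

    # Required production area per crop = daily need / areal growth rate.
    area_m2 = {crop: need / growth_rate_g_m2_day[crop]
               for crop, need in daily_need_g.items()}
    for crop, a in sorted(area_m2.items()):
        print(f"{crop:8s} {a:6.1f} m^2")
    print(f"total    {sum(area_m2.values()):6.1f} m^2")
    ```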

  3. A management information system to study space diets.

    PubMed

    Kang, Sukwon; Both, A J

    2002-01-01

    A management information system (MIS), including a database management system (DBMS) and a decision support system (DSS), was developed to dynamically analyze the variable nutritional content of foods grown and prepared in an Advanced Life Support System (ALSS) such as required for long-duration space missions. The DBMS was designed around the known nutritional content of a list of candidate crops and their prepared foods. The DSS was designed to determine the composition of the daily crew diet based on crop and nutritional information stored in the DBMS. Each of the selected food items was assumed to be harvested from a yet-to-be designed ALSS biomass production subsystem and further prepared in accompanying food preparation subsystems. The developed DBMS allows for the analysis of the nutrient composition of a sample 20-day diet for future Advanced Life Support missions and is able to determine the required quantities of food needed to satisfy the crew's daily consumption. In addition, based on published crop growth rates, the DBMS was able to calculate the required size of the biomass production area needed to satisfy the daily food requirements for the crew. Results from this study can be used to help design future ALSS for which the integration of various subsystems (e.g., biomass production, food preparation and consumption, and waste processing) is paramount for the success of the mission.

  4. Investigation of DBMS for Use in a Research Environment. Rand Paper Series 7002.

    ERIC Educational Resources Information Center

    Rosenfeld, Pilar N.

    This investigation of the use of database management systems (DBMS) in a research environment used the Rand Corporation as a case study. After a general introduction in section 1, eight sections present the major components of the study. Section 2 contains an overview of DBMS terminology and concepts, followed in section 3 by a general description…

  5. [The future of clinical laboratory database management system].

    PubMed

    Kambe, M; Imidy, D; Matsubara, A; Sugimoto, Y

    1999-09-01

    To assess the present status of the clinical laboratory database management system, the difference between the Clinical Laboratory Information System and Clinical Laboratory System was explained in this study. Although three kinds of database management systems (DBMS) were shown including the relational model, tree model and network model, the relational model was found to be the best DBMS for the clinical laboratory database based on our experience and developments of some clinical laboratory expert systems. As a future clinical laboratory database management system, the IC card system connected to an automatic chemical analyzer was proposed for personal health data management and a microscope/video system was proposed for dynamic data management of leukocytes or bacteria.

  6. Information Retrieval System Design Issues in a Microcomputer-Based Relational DBMS Environment.

    ERIC Educational Resources Information Center

    Wolfram, Dietmar

    1992-01-01

    Outlines the file structure requirements for a microcomputer-based information retrieval system using FoxPro, a relational database management system (DBMS). Issues relating to the design and implementation of such systems are discussed, and two possible designs are examined in terms of space economy and practicality of implementation. (15…

  7. Methods and Software for Building Bibliographic Data Bases.

    ERIC Educational Resources Information Center

    Daehn, Ralph M.

    1985-01-01

    This in-depth look at database management systems (DBMS) for microcomputers covers data entry, information retrieval, security, DBMS software and design, and downloading of literature search results. The advantages of in-house systems versus online search vendors are discussed, and specifications of three software packages and 14 sources are…

  8. Using a data base management system for modelling SSME test history data

    NASA Technical Reports Server (NTRS)

    Abernethy, K.

    1985-01-01

    The usefulness of a data base management system (DBMS) for modelling historical test data for the complete series of static test firings for the Space Shuttle Main Engine (SSME) was assessed. From an analysis of user data base query requirements, it became clear that a relational DBMS which included a relationally complete query language would permit a model satisfying the query requirements. Representative models and sample queries are discussed. A list of environment-particular evaluation criteria for the desired DBMS was constructed; these criteria include requirements in the areas of user-interface complexity, program independence, flexibility, modifiability, and output capability. The evaluation process included the construction of several prototype data bases for user assessment. The systems studied, representing the three major DBMS conceptual models, were: MIRADS, a hierarchical system; DMS-1100, a CODASYL-based network system; ORACLE, a relational system; and DATATRIEVE, a relational-type system.
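
    The query requirements here are the classic relational kind. A sketch of a representative test-history query using Python's standard sqlite3 module; the firing-record schema is an illustrative assumption, not the structure of the actual SSME data base.

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE firing (
        test_id INTEGER PRIMARY KEY,
        engine  TEXT,
        date    TEXT,
        duration_s REAL,
        cutoff_reason TEXT)""")
    con.executemany("INSERT INTO firing VALUES (?,?,?,?,?)", [
        (901, "E2004", "1981-03-12", 520.0, "planned"),
        (902, "E2004", "1981-04-02",  18.5, "redline"),
        (903, "E0110", "1981-04-20", 520.0, "planned"),
    ])
    # A typical historical query: premature cutoffs per engine.
    for row in con.execute("""
            SELECT engine, COUNT(*) AS early_cutoffs
            FROM firing WHERE cutoff_reason <> 'planned'
            GROUP BY engine"""):
        print(row)
    ```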

  9. Leveraging Relational Technology through Industry Partnerships.

    ERIC Educational Resources Information Center

    Brush, Leonard M.; Schaller, Anthony J.

    1988-01-01

    Carnegie Mellon University has leveraged its technological expertise with database management systems (DBMS) into joint technological and developmental partnerships with DBMS and application software vendors. Carnegie's relational database strategy, the strategy of partnerships and how they were formed, and how the partnerships are doing are…

  10. Design and utilization of a Flight Test Engineering Database Management System at the NASA Dryden Flight Research Facility

    NASA Technical Reports Server (NTRS)

    Knighton, Donna L.

    1992-01-01

    A Flight Test Engineering Database Management System (FTE DBMS) was designed and implemented at the NASA Dryden Flight Research Facility. The X-29 Forward Swept Wing Advanced Technology Demonstrator flight research program was chosen for the initial system development and implementation. The FTE DBMS greatly assisted in planning and 'mass production' card preparation for an accelerated X-29 research program. Improved Test Plan tracking and maneuver management for a high flight-rate program were proven, and flight rates of up to three flights per day, two times per week were maintained.

  11. Relational Database Design in Information Science Education.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.

    1985-01-01

    Reports on database management system (DBMS) applications designed by library school students for the university community at the University of Iowa. Three DBMS design issues are examined: synthesis of relations, analysis of relations (normalization procedure), and data dictionary usage. Database planning prior to automation using the data dictionary approach…

  12. Effective organizational solutions for implementation of DBMS software packages

    NASA Technical Reports Server (NTRS)

    Jones, D.

    1984-01-01

    The space telescope management information system development effort is a guideline for discussing effective organizational solutions used in implementing DBMS software. Focus is on the importance of strategic planning. The value of constructing an information system architecture to conform to the organization's managerial needs, the need for a senior decision maker, dealing with shifting user requirements, and the establishment of a reliable working relationship with the DBMS vendor are examined. Requirements for a schedule to demonstrate progress against a defined timeline and the importance of continued monitoring for production software control, production data control, and software enhancements are also discussed.

  13. An incremental database access method for autonomous interoperable databases

    NASA Technical Reports Server (NTRS)

    Roussopoulos, Nicholas; Sellis, Timos

    1994-01-01

    We investigated a number of design and performance issues of interoperable database management systems (DBMSs). The major results of our investigation were obtained in the areas of client-server database architectures for heterogeneous DBMSs, incremental computation models, buffer management techniques, and query optimization. We finished a prototype of an advanced client-server workstation-based DBMS which allows access to multiple heterogeneous commercial DBMSs. Experiments and simulations were then run to compare its performance with the standard client-server architectures. The focus of this research was on adaptive optimization methods for heterogeneous database systems. Adaptive buffer management accounts for the random and object-oriented access methods for which no known characterization of the access patterns exists. Adaptive query optimization means that value distributions and selectivities, which play the most significant role in query plan evaluation, are continuously refined to reflect the actual values as opposed to static ones that are computed off-line. Query feedback is a concept that was first introduced to the literature by our group. We employed query feedback both for adaptive buffer management and for computing value distributions and selectivities. For adaptive buffer management, we use the page faults of prior executions to achieve more 'informed' management decisions. For the estimation of the distributions of the selectivities, we use curve-fitting techniques, such as least squares and splines, for regressing on these values.
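
    Query feedback in its simplest form: after each query runs, the observed selectivity is recorded and regressed against the predicate constant, so later cost estimates track the actual value distribution. A minimal straight-line sketch using statistics.linear_regression from the standard library (Python 3.10+); the paper's approach used more general curve fitting such as splines.

    ```python
    from statistics import linear_regression  # Python 3.10+

    # (predicate constant, observed selectivity) pairs collected as feedback
    # from prior executions of queries like: SELECT ... WHERE attr <= c
    feedback = [(10, 0.08), (25, 0.21), (40, 0.37), (60, 0.55), (80, 0.79)]

    xs = [c for c, _ in feedback]
    ys = [s for _, s in feedback]
    fit = linear_regression(xs, ys)   # least-squares slope and intercept

    def estimate_selectivity(c):
        """Refined estimate for WHERE attr <= c, clamped to [0, 1]."""
        return min(1.0, max(0.0, fit.slope * c + fit.intercept))

    # The optimizer would use this refined estimate to cost a query plan.
    print(estimate_selectivity(50))
    ```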

  14. Development of expert systems for analyzing electronic documents

    NASA Astrophysics Data System (ADS)

    Abeer Yassin, Al-Azzawi; Shidlovskiy, S.; Jamal, A. A.

    2018-05-01

    The paper analyzes a database management system (DBMS). Expert systems, databases, and database technology have become an essential component of everyday life in modern society. As databases are widely used in every organization with a computer system, data resource control and data management are very important [1]. A DBMS is the most significant tool developed to serve multiple users in a database environment, consisting of programs that enable users to create and maintain a database. This paper focuses on the development of a database management system for the General Directorate for Education of Diyala in Iraq (GDED) using CLIPS, Java NetBeans, and Alfresco, together with system components previously developed at Tomsk State University at the Faculty of Innovative Technology.

  15. Linking Multiple Databases: Term Project Using "Sentences" DBMS.

    ERIC Educational Resources Information Center

    King, Ronald S.; Rainwater, Stephen B.

    This paper describes a methodology for use in teaching an introductory Database Management System (DBMS) course. Students master basic database concepts through the use of a multiple component project implemented in both relational and associative data models. The associative data model is a new approach for designing multi-user, Web-enabled…

  16. Results of data base management system parameterized performance testing related to GSFC scientific applications

    NASA Technical Reports Server (NTRS)

    Carchedi, C. H.; Gough, T. L.; Huston, H. A.

    1983-01-01

    The results of a variety of tests designed to demonstrate and evaluate the performance of several commercially available data base management system (DBMS) products compatible with the Digital Equipment Corporation VAX 11/780 computer system are summarized. The tests were performed on the INGRES, ORACLE, and SEED DBMS products, employing applications that were similar to scientific applications under development by NASA. The objectives of this testing included determining the strengths and weaknesses of the candidate systems, the performance trade-offs of various design alternatives, and the impact of some installation and environmental (computer-related) influences.

  17. Medical record management systems: criticisms and new perspectives.

    PubMed

    Frénot, S; Laforest, F

    1999-06-01

    The first generation of computerized medical records stored the data as text, but these records did not bring any improvement in information manipulation. The use of a relational database management system (DBMS) has largely solved this problem, as it allows for data requests by using SQL. However, this requires data structuring which is not very appropriate to medicine. Moreover, the use of templates and icon user interfaces has introduced a deviation from the paper-based record (which still exists). The arrival of hypertext user interfaces has proven to be of interest to fill the gap between the paper-based medical record and its electronic version. We think that further improvement can be accomplished by using a fully document-based system. We present the architecture, advantages and disadvantages of classical DBMS-based and Web/DBMS-based solutions. We also present a document-based solution and explain its advantages, which include communication, security, flexibility and genericity.
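
    The structuring tension the authors describe is easy to see side by side: a relational row forces one fixed shape on every encounter, while a document carries whatever structure the visit produced. A toy contrast with invented field names, not the paper's schema:

    ```python
    import sqlite3, json

    # Relational: every encounter must fit one fixed row shape.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE encounter (patient_id, date, bp_systolic, bp_diastolic)")
    con.execute("INSERT INTO encounter VALUES (1, '1999-05-01', 120, 80)")

    # Document-based: each record carries whatever structure the visit produced.
    doc = {"patient_id": 1, "date": "1999-05-02",
           "free_text": "patient reports intermittent chest pain",
           "ecg": {"rhythm": "sinus", "rate_bpm": 72}}
    print(json.dumps(doc, indent=2))
    ```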

  18. General specifications for the development of a USL/DBMS NASA/PC R and D distributed workstation

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Chum, Frank Y.

    1984-01-01

    The general specifications for the development of a PC-based distributed workstation (PCDWS) for an information storage and retrieval systems environment are defined. This research proposes the development of a PCDWS prototype as part of the University of Southwestern Louisiana Data Base Management System (USL/DBMS) NASA/PC R and D project in the PC-based workstation environment.

  19. Report on Approaches to Database Translation. Final Report.

    ERIC Educational Resources Information Center

    Gallagher, Leonard; Salazar, Sandra

    This report describes approaches to database translation (i.e., transferring data and data definitions from a source, either a database management system (DBMS) or a batch file, to a target DBMS), and recommends a method for representing the data structures of newly-proposed network and relational data models in a form suitable for database…

  20. 76 FR 8349 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-14

    ... provides the Military Health System (MHS) with a comprehensive enterprise wide Blood Donor Management System (DBMS) and Blood Transfusion Management System (BTMS) with capabilities to manage blood donors... donors, patients, and products; automated, blood order issue, and transfusion records; manage enterprise...

  1. Asynchronous data change notification between database server and accelerator controls system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, W.; Morris, J.; Nemesure, S.

    2011-10-10

    Database data change notification (DCN) is a commonly used feature. Not all database management systems (DBMS) provide an explicit DCN mechanism. Even for those DBMSs which support DCN (such as Oracle and MS SQL Server), some server-side and/or client-side programming may be required to make the DCN system work. This makes the setup of DCN between a database server and interested clients tedious and time consuming. In accelerator control systems, there are many well-established software client/server architectures (such as CDEV, EPICS, and ADO) that can be used to implement data reflection servers that transfer data asynchronously to any client using the standard SET/GET API. This paper describes a method for using such a data reflection server to set up asynchronous DCN (ADCN) between a DBMS and clients. This method works well for all DBMSs which provide database trigger functionality. Asynchronous data change notification (ADCN) between database server and clients can be realized by combining the use of a database trigger mechanism, which is supported by major DBMSs, with server processes that use client/server software architectures that are familiar in the accelerator controls community (such as EPICS, CDEV or ADO). This approach makes the ADCN system easy to set up and integrate into an accelerator controls system. Several ADCN systems have been set up and used in the RHIC-AGS controls system.
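
    The trigger half of the ADCN pattern can be sketched with Python's standard sqlite3 module: a trigger appends every change to a notification table, which a reflection-server process drains and pushes to its subscribers. All names are illustrative, and polling stands in for the EPICS/CDEV/ADO push machinery described in the paper.

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE setpoint (name TEXT PRIMARY KEY, value REAL);
        CREATE TABLE change_log (            -- written only by the trigger
            seq INTEGER PRIMARY KEY AUTOINCREMENT,
            name TEXT, value REAL
        );
        CREATE TRIGGER notify_update AFTER UPDATE ON setpoint
        BEGIN
            INSERT INTO change_log(name, value) VALUES (NEW.name, NEW.value);
        END;
    """)
    con.execute("INSERT INTO setpoint VALUES ('magnet.current', 1.00)")
    con.execute("UPDATE setpoint SET value = 1.25 WHERE name = 'magnet.current'")

    # The reflection server periodically drains change_log and forwards each
    # entry to subscribed clients via its SET/GET API (polling stands in here).
    for seq, name, value in con.execute("SELECT * FROM change_log"):
        print(f"notify clients: {name} -> {value}")
    con.execute("DELETE FROM change_log")
    ```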

  2. Generic functional requirements for a NASA general-purpose data base management system

    NASA Technical Reports Server (NTRS)

    Lohman, G. M.

    1981-01-01

    Generic functional requirements for a general-purpose, multi-mission data base management system (DBMS) for application to remotely sensed scientific data bases are detailed. The motivation for utilizing DBMS technology in this environment is explained. The major requirements include: (1) a DBMS for scientific observational data; (2) a multi-mission capability; (3) user-friendliness; (4) extensive and integrated information about data; (5) robust languages for defining data structures and formats; (6) scientific data types and structures; (7) flexible physical access mechanisms; (8) ways of representing spatial relationships; (9) a high-level nonprocedural interactive query and data manipulation language; (10) data base maintenance utilities; (11) high-rate input/output and large data volume storage; and (12) adaptability to a distributed data base and/or data base machine configuration. Detailed functions are specified in a top-down hierarchic fashion. Implementation, performance, and support requirements are also given.

  3. An Intelligent Terminal for Access to a Medical Database

    PubMed Central

    Womble, M. E.; Wilson, S. D.; Keiser, H. N.; Tworek, M. L.

    1978-01-01

    Very powerful data base management systems (DBMS) now exist which allow medical personnel access to patient record data bases. DBMSs make it easy to retrieve either complete or abbreviated records of patients with similar characteristics. In addition, statistics on data base records are immediately accessible. However, the price of this power is a large computer with the inherent problems of access, response time, and reliability. If a general-purpose, time-shared computer is used to get this power, the response time to a request can be either rapid or slow, depending upon loading by other users. Furthermore, if the computer is accessed via dial-up telephone lines, there is competition with other users for telephone ports. If either the DBMS or the host machine is replaced, the medical users, who are typically not sophisticated in computer usage, are forced to learn the new system. Microcomputers, because of their low cost and adaptability, lend themselves to a solution of these problems. A microprocessor-based intelligent terminal has been designed and implemented at the USAF School of Aerospace Medicine to provide a transparent interface between the user and his data base. The intelligent terminal system includes multiple microprocessors, floppy disks, a CRT terminal, and a printer. Users interact with the system at the CRT terminal using menu selection (framing). The system translates the menu selection into the query language of the DBMS and handles all actual communication with the DBMS and its host computer, including telephone dialing and sign-on procedures, as well as the actual data base query and response. Retrieved information is stored locally for CRT display, hard copy production, and/or permanent retention. Microprocessor-based communication units provide security for sensitive medical data through encryption/decryption algorithms and high-reliability error detection transmission schemes. Highly modular software design permits adaptation to a different DBMS and/or host computer with only minor localized software changes. Importantly, this portability is completely transparent to system users. Although the terminal system is independent of the host computer and its DBMS, it has been linked to a UNIVAC 1108 computer supporting MRI's SYSTEM 2000 DBMS.
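
    The terminal's key function, translating a menu selection into the host DBMS's query language, is essentially a table-driven mapping. A toy sketch; the menu frames and the generated query syntax are invented stand-ins, not SYSTEM 2000's actual language.

    ```python
    # Each menu frame maps a user's numeric choice to a query template.
    MENU = {
        1: ("Patients by diagnosis", "SELECT * FROM patient WHERE dx = '{v}'"),
        2: ("Visits after date",     "SELECT * FROM visit WHERE date > '{v}'"),
    }

    def build_query(choice: int, value: str) -> str:
        """Translate (menu choice, fill-in value) into host query text."""
        _, template = MENU[choice]
        return template.format(v=value.replace("'", "''"))  # naive quoting

    print(build_query(1, "hypertension"))
    ```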

  4. Changing an automated drug inventory control system to a data base design.

    PubMed

    Bradish, R A

    1982-09-01

    A pharmacy department's change from indexed sequential access files to a data base management system (DBMS) for purposes of automated inventory control is described. The DBMS has three main functional areas: (1) inventory ordering and accountability, (2) charging of interdepartmental and intradepartmental orders, and (3) data manipulation with report design for management control. There are seven files directly related to the inventory ordering and accountability area. Each record can be accessed directly or through another file. Information on the quantity of a drug on hand, the drug(s) supplied by a specific vendor, the status of a purchase order, or the calculation of an estimated order quantity can be retrieved quickly. In the drug master file, two records contain a reorder point and safety-stock level that are determined by searching the entries in the order history file and vendor master file. The intradepartmental and interdepartmental orders section contains five files assigned to record and store information on drug distribution. All items removed from the stockroom and distributed are recorded, and reports can be generated for itemized bills, total cost by area, and as formatted files for the accounts payable department. The design, development, and implementation of the DBMS took approximately a year using a part-time pharmacist and minimal outside help, while the previous system had required the constant, expensive help of a programmer/analyst. The DBMS has given the pharmacy department a flexible inventory management system with increased drug control, decreased operating expenses, increased use of department personnel, and the ability to develop and enhance other systems.
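
    The reorder point and safety-stock level derived from the order history and vendor master files reduce to standard inventory arithmetic. A hedged sketch using the textbook formulas, which may differ from the department's actual rules; all figures are hypothetical.

    ```python
    from statistics import mean, stdev

    # Hypothetical history: units issued per week for one drug.
    weekly_usage = [140, 155, 120, 160, 150, 145, 170, 135]
    lead_time_weeks = 2   # from the vendor master file
    z = 1.65              # ~95% service level

    demand = mean(weekly_usage)
    sigma = stdev(weekly_usage)

    # Textbook formulas: safety stock covers demand variability over lead time.
    safety_stock = z * sigma * lead_time_weeks ** 0.5
    reorder_point = demand * lead_time_weeks + safety_stock
    print(f"safety stock ~ {safety_stock:.0f}, reorder point ~ {reorder_point:.0f}")
    ```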

  5. National Computer Security Conference Proceedings (11th): A Postscript: Computer Security--Into the Future, 17-20 October 1988

    DTIC Science & Technology

    1988-10-20

    The LOCK project, from its very beginnings as an implementation study for the Provably Secure Operating System in 1979... to the security field, can study to gain insight into the evaluation process. The project has developed an innovative format for the DTLS and FTLS... a database management system (DBMS) that is currently being developed under the Advanced... As the system becomes available, the A1 Secure DBMS will be ported to it.

  6. Creation of a Book Order Management System Using a Microcomputer and a DBMS.

    ERIC Educational Resources Information Center

    Neill, Charlotte; And Others

    1985-01-01

    Describes management decisions and resultant technology-based system that allowed a medical library to meet increasing workloads without accompanying increases in resources available. Discussion covers system analysis; capabilities of book-order management system, "BOOKDIRT;" software and training; hardware; data files; data entry;…

  7. Computer Security Products Technology Overview

    DTIC Science & Technology

    1988-10-01

    ...this paper addresses fall into the areas of multi-user hosts, database management systems (DBMS), workstations, networks, guards and gateways, and... provide a portion of that protection, for example, a password scheme, a file protection mechanism, a secure database management system, or even a...

  8. Object-orientated DBMS techniques for time-oriented medical record.

    PubMed

    Pinciroli, F; Combi, C; Pozzi, G

    1992-01-01

    In implementing time-orientated medical record (TOMR) management systems, the relational model has played a big role. Many applications have been developed to extend query and data manipulation languages to the temporal aspects of information. Our experience in developing TOMR revealed some deficiencies of the relational model, such as: (a) abstract data type definition; (b) a unified view of data at the programming level; (c) management of temporal data; (d) management of signals and images. We identified some initial topics to address with an object-orientated approach to database design. This paper describes the first steps in designing and implementing a TOMR with an object-orientated DBMS.
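
    The temporal-data deficiency, item (c), comes down to attaching a valid-time interval to every clinical value. A minimal sketch of the object-orientated fix (Python 3.10+); the class and method names are invented for illustration.

    ```python
    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class TimedValue:
        """A clinical value with its valid-time interval (open if 'until' is None)."""
        value: object
        since: date
        until: date | None = None

    class TimeOrientedRecord:
        def __init__(self):
            self.history: dict[str, list[TimedValue]] = {}

        def record(self, attr, value, since):
            vals = self.history.setdefault(attr, [])
            if vals and vals[-1].until is None:
                vals[-1].until = since      # close the previous interval
            vals.append(TimedValue(value, since))

        def as_of(self, attr, when):
            for tv in self.history.get(attr, []):
                if tv.since <= when and (tv.until is None or when < tv.until):
                    return tv.value

    rec = TimeOrientedRecord()
    rec.record("therapy", "aspirin", date(1991, 1, 10))
    rec.record("therapy", "heparin", date(1991, 3, 2))
    print(rec.as_of("therapy", date(1991, 2, 1)))   # -> aspirin
    ```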

  9. Database Entity Persistence with Hibernate for the Network Connectivity Analysis Model

    DTIC Science & Technology

    2014-04-01

    ...time savings in the Java coding development process. Appendices A and B describe setup procedures for installing the MySQL database... development environment is required: the open-source MySQL Database Management System (DBMS) from Oracle, which is a Java Database Connectivity (JDBC)-compliant DBMS; the MySQL JDBC Driver library that comes as a plug-in with the NetBeans distribution; and the latest Java Development Kit...

  10. Description of data base management systems activities

    NASA Technical Reports Server (NTRS)

    1983-01-01

    One of the major responsibilities of the JPL Computing and Information Services Office is to develop and maintain a JPL plan for providing computing services to the JPL management and administrative community that will lead to improved productivity. The CISO plan to accomplish this objective has been titled 'Management and Administrative Support Systems' (MASS). The MASS plan is based on the continued use of JPL's IBM 3032 computer system for administrative computing and for the MASS functions. The current candidate administrative data base management systems required to support the MASS include ADABAS, Cullinane IDMS, and TOTAL. Previous administrative data base systems have been applied to specific local functions rather than in a centralized manner with elements common to the many user groups. Limited-capacity data base systems have been installed in microprocessor-based office automation systems in a few Project and Management Offices using Ashton-Tate dBASE II. These experiences, plus some other localized in-house DBMS uses, have provided an excellent background for developing user and system requirements for a single DBMS to support the MASS program.

  11. Expert system shell to reason on large amounts of data

    NASA Technical Reports Server (NTRS)

    Giuffrida, Gionanni

    1994-01-01

    Current data base management systems (DBMSs) do not provide a sophisticated environment for developing rule-based expert system applications. Some of the newer DBMSs come with some sort of rule mechanism; these are active and deductive database systems. However, neither is full-featured enough to support a complete rule-based implementation. On the other hand, current expert system shells do not provide any link with external databases. That is, all the data are kept in the system working memory, which is maintained in main memory. For some applications, the limited size of the available working memory can be a development constraint. Typically these are applications which require reasoning on huge amounts of data. All these data do not fit into the computer's main memory. Moreover, in some cases these data may already be available in database systems and continuously updated while the expert system is running. This paper proposes an architecture which employs knowledge-discovery techniques to reduce the amount of data to be stored in main memory; in this architecture a standard DBMS is coupled with a rule-based language. The data are stored in the DBMS. An interface between the two systems is responsible for inducing knowledge from the set of relations. Such induced knowledge is then transferred to the rule-based language's working memory.
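
    The proposed interface layer induces compact summary facts from DBMS relations so the rule engine's working memory never has to hold the raw rows. A toy sketch of that coupling with Python's standard sqlite3 module; the rule, table, and fact names are invented.

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE reading (sensor TEXT, value REAL)")
    con.executemany("INSERT INTO reading VALUES (?, ?)",
                    [("t1", 71.0), ("t1", 74.5), ("t2", 58.2), ("t2", 59.9)])

    # Interface layer: induce one summary fact per sensor instead of
    # loading every raw row into working memory.
    working_memory = [
        {"sensor": s, "mean": m, "n": n}
        for s, m, n in con.execute(
            "SELECT sensor, AVG(value), COUNT(*) FROM reading GROUP BY sensor")
    ]

    # A rule now matches against two compact facts rather than thousands of rows.
    for fact in working_memory:
        if fact["mean"] > 70:
            print(f"rule fired: {fact['sensor']} running hot")
    ```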

  12. An overview of the USL/DBMS NASA/PC R and D project working paper series

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor)

    1984-01-01

    An introduction is given to the University of Southwestern Louisiana Data Base Management System (USL/DBMS) NASA/PC R and D Working Paper Series which has been established to provide a foundation for both a formal and informal information dissemination mechanism concerning PC-based research and development activities being performed pursuant to the NASA contract. This entry also serves as an index to the collection of Working Paper Series reports.

  13. Managing data from remeasured plots: An evaluation of existing systems

    Treesearch

    John C. Byrne; Michael D. Sweet

    1992-01-01

    Proper management of the valuable data from remeasured (or permanent) forest growth plots with data base management systems (DBMS) can greatly add to their utility. Twelve desired features for such a system (activities that facilitate the storage, accuracy, and use of the data for analysis) are described and used to evaluate the 36 systems found by a survey conducted...

  14. The Next Step in Educational Program Budgets and Information Resource Management: Integrated Data Structures.

    ERIC Educational Resources Information Center

    Jackowski, Edward M.

    1988-01-01

    Discusses the role that information resource management (IRM) plays in educational program-oriented budgeting (POB), and presents a theoretical IRM model. Highlights include design considerations for integrated data systems; database management systems (DBMS); and how POB data can be integrated to enhance its value and use within an educational…

  15. Information Management Tools for Classrooms: Exploring Database Management Systems. Technical Report No. 28.

    ERIC Educational Resources Information Center

    Freeman, Carla; And Others

    In order to understand how database software or online databases functioned in the overall curricula, the use of database management systems (DBMS) was studied at eight elementary and middle schools through classroom observation and interviews with teachers and administrators, librarians, and students. Three overall areas were addressed:…

  16. 10 CFR 2.1011 - Management of electronic information.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Management of electronic information. 2.1011 Section 2... High-Level Radioactive Waste at a Geologic Repository § 2.1011 Management of electronic information. (a... Language)-compliant (ANSI X3.135-1992/ISO 9075-1992) database management system (DBMS). Alternatively, the...

  17. Serials Management by Microcomputer: The Potential of DBMS.

    ERIC Educational Resources Information Center

    Vogel, J. Thomas; Burns, Lynn W.

    1984-01-01

    Describes serials management at the Philadelphia College of Textiles and Science library via a microcomputer, a file manager called PFS, and a relational database management system called dBase II. Check-in procedures, programming with dBase II, "static" and "active" databases, and claim procedures are discussed. Check-in forms are…

  18. Database Software Selection for the Egyptian National STI Network.

    ERIC Educational Resources Information Center

    Slamecka, Vladimir

    The evaluation and selection of information/data management system software for the Egyptian National Scientific and Technical (STI) Network are described. An overview of the state-of-the-art of database technology elaborates on the differences between information retrieval and database management systems (DBMS). The desirable characteristics of…

  19. Advances in Data Management in Remote Sensing and Climate Modeling

    NASA Astrophysics Data System (ADS)

    Brown, P. G.

    2014-12-01

    Recent commercial interest in "Big Data" information systems has yielded little more than a sense of deja vu among scientists whose work has always required getting their arms around extremely large databases and writing programs to explore and analyze them. On the flip side, there are commercial DBMS startups building "Big Data" platforms using techniques taken from earth science, astronomy, high energy physics and high performance computing. In this talk, we will introduce one such platform: Paradigm4's SciDB, the first DBMS designed from the ground up to combine the kinds of quality-of-service guarantees made by SQL DBMS platforms (high-level data model, query languages, extensibility, transactions) with the kinds of functionality familiar to scientific users (arrays as structural building blocks, integrated linear algebra, and client language interfaces that minimize the learning curve). We will review how SciDB is used to manage and analyze earth science data by several teams of scientific users.

  20. DCMS: A data analytics and management system for molecular simulation.

    PubMed

    Kumar, Anand; Grupcev, Vladimir; Berrada, Meryem; Fogarty, Joseph C; Tu, Yi-Cheng; Zhu, Xingquan; Pandit, Sagar A; Xia, Yuni

    Molecular Simulation (MS) is a powerful tool for studying physical/chemical features of large systems and has seen applications in many scientific and engineering domains. During the simulation process, the experiments generate a very large number of atoms and aim to observe their spatial and temporal relationships for scientific analysis. The sheer data volumes and their intensive interactions impose significant challenges for data access, management, and analysis. To date, existing MS software systems fall short on storage and handling of MS data, mainly because they lack a platform to support applications that involve intensive data access and analytical processing. In this paper, we present the database-centric molecular simulation (DCMS) system our team developed in the past few years. The main idea behind DCMS is to store MS data in a relational database management system (DBMS) to take advantage of the declarative query interface (i.e., SQL), data access methods, query processing, and optimization mechanisms of modern DBMSs. A unique challenge is to handle the analytical queries that are often compute-intensive. For that, we developed novel indexing and query processing strategies (including algorithms running on modern co-processors) as integrated components of the DBMS. As a result, researchers can upload and analyze their data using efficient functions implemented inside the DBMS. Index structures are generated to store analysis results that may be interesting to other users, so that the results are readily available without duplicating the analysis. We have developed a prototype of DCMS based on the PostgreSQL system, and experiments using real MS data and workloads show that DCMS significantly outperforms existing MS software systems. We also used it as a platform to test other data management issues such as security and compression.
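
    The core DCMS idea, trajectory frames in relational tables queried declaratively, can be sketched with Python's standard sqlite3 module. The schema below is an invented stand-in; the real system extends PostgreSQL with custom index structures and co-processor query kernels.

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE atom (
        frame INTEGER, atom_id INTEGER, x REAL, y REAL, z REAL)""")
    con.executemany("INSERT INTO atom VALUES (?,?,?,?,?)", [
        (0, 1, 0.0, 0.0, 0.0), (0, 2, 1.2, 0.0, 0.0),
        (1, 1, 0.1, 0.0, 0.0), (1, 2, 1.1, 0.1, 0.0),
    ])
    # A declarative analytical query: all atom pairs within 1.5 units in frame 1
    # (compare squared distance to 1.5^2 = 2.25 to avoid a sqrt).
    pairs = con.execute("""
        SELECT a.atom_id, b.atom_id
        FROM atom a JOIN atom b
          ON a.frame = b.frame AND a.atom_id < b.atom_id
        WHERE a.frame = 1
          AND (a.x-b.x)*(a.x-b.x) + (a.y-b.y)*(a.y-b.y)
              + (a.z-b.z)*(a.z-b.z) <= 2.25
    """).fetchall()
    print(pairs)
    ```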

  1. Composite Materials Design Database and Data Retrieval System Requirements

    DTIC Science & Technology

    1991-08-01

    ...the present time, the majority of expert systems are stand-alone systems, and environments for effectively coupling heuristic data management with nonheuristic data management remain to be developed. The only available recourse is to resort to traditional DBMS development and use, and to service...

  2. Optical mass memory system (AMM-13). AMM/DBMS interface control document

    NASA Technical Reports Server (NTRS)

    Bailey, G. A.

    1980-01-01

    The baseline for external interfaces of a 10^13-bit optical archival mass memory system (AMM-13) is established. The types of interfaces addressed include data transfer; the AMM-13/Data Base Management System/NASA End-to-End Data System computer interconnect; data/control input and output interfaces; test input data source; file management; and facilities interface.

  3. DBMS as a Tool for Project Management

    NASA Technical Reports Server (NTRS)

    Linder, H.

    1984-01-01

    Scientific objectives of crustal dynamics are listed as well as the contents of the centralized data information system for the crustal dynamics project. The system provides for project observation schedules, gives project configuration control information and project site information.

  4. A Relational/Object-Oriented Database Management System: R/OODBMS

    DTIC Science & Technology

    1992-09-01

    Concepts In 1968, Dr. Edgar F. Codd had the idea that "predicate logic could be applied to maintaining the logical integrity of the data" in a DBMS [CD90, p...Hall, Inc., Englewood Cliffs, NJ, 1990. [Co70] Codd, E. F., "A Relational Model of Data for Large Shared Data Banks," Communications of the ACM, v. 13, no. 6...pp. 377-387, Jun 1970. [CD90] Interview between E. F. Codd and DBMS, "Relational philosopher: the creator of the relational model talks about his

  5. Integrating Micro-computers with a Centralized DBMS: ORACLE, SEED AND INGRES

    NASA Technical Reports Server (NTRS)

    Hoerger, J.

    1984-01-01

    Users of ADABAS, a relational-like data base management system, and its data base programming language NATURAL are acquiring microcomputers with hopes of solving their individual word processing, office automation, decision support, and simple data processing problems. As processor speeds, memory sizes, and disk storage capacities increase, individual departments begin to maintain "their own" data base on "their own" microcomputer. This situation can adversely affect several of the primary goals set for implementing a centralized DBMS. In order to avoid this potential problem, these microcomputers must be integrated with the centralized DBMS. An easy-to-use and flexible means for transferring logical data base files between the central data base machine and microcomputers must be provided. Some of the problems encountered in an effort to accomplish this integration, and possible solutions, are discussed.

  6. Automated Hierarchical to CODASYL (Conference on Data Systems Languages) Database Interface Schema Translator.

    DTIC Science & Technology

    1983-12-16

    management system (DBMS) is to record and maintain information used by an organization in the organization's decision-making process. Some advantages of a...independence. Database Management Systems are classified into three major models: relational, network, and hierarchical. Each model uses a software...feeling impedes the overall effectiveness of the Acquisition Management Information System (AMIS), which currently uses S2k. The size of the AMIS

  7. Effective spatial database support for acquiring spatial information from remote sensing images

    NASA Astrophysics Data System (ADS)

    Jin, Peiquan; Wan, Shouhong; Yue, Lihua

    2009-12-01

    In this paper, a new approach to maintaining spatial information acquired from remote-sensing images is presented, which is based on an Object-Relational DBMS. Under this approach, the detected and recognized results for targets are stored in an ORDBMS-based spatial database system where they can be further accessed, and users can reach the spatial information through the standard SQL interface. This approach differs from the traditional ArcSDE-based method in that the spatial information management module is totally integrated into the DBMS and becomes one of its core modules. We focus on three issues, namely the general framework for the ORDBMS-based spatial database system, the definitions of the add-in spatial data types and operators, and the process of developing a spatial DataBlade on Informix. The results show that ORDBMS-based spatial database support for image-based target detection and recognition is easy and practical to implement.
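
    The add-in operator idea can be sketched as follows: a user-defined function is registered with the engine and then invoked from ordinary SQL, which is the pattern an Informix DataBlade follows natively. SQLite's create_function and the table below are stand-ins chosen to keep the example self-contained; this is not the paper's actual DataBlade code.

    ```python
    import math
    import sqlite3

    # Sketch of the ORDBMS idea of add-in spatial operators: a user-defined
    # function is registered with the engine and then used from ordinary SQL.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE targets (name TEXT, x REAL, y REAL)")
    conn.executemany("INSERT INTO targets VALUES (?,?,?)",
                     [("airstrip", 2.0, 3.0), ("bridge", 40.0, 1.0)])

    def distance(x1, y1, x2, y2):
        """Planar Euclidean distance; a real DataBlade would offer geodetic forms."""
        return math.hypot(x2 - x1, y2 - y1)

    conn.create_function("distance", 4, distance)

    # Spatial predicate evaluated inside the DBMS, not in application code.
    cur = conn.execute(
        "SELECT name FROM targets WHERE distance(x, y, 0.0, 0.0) < 10.0")
    print([row[0] for row in cur])   # -> ['airstrip']
    ```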

  8. Development of a 3D GIS and its application to karst areas

    NASA Astrophysics Data System (ADS)

    Wu, Qiang; Xu, Hua; Zhou, Wanfang

    2008-05-01

    There is a growing interest in modeling and analyzing karst phenomena in three dimensions. This paper integrates geology, groundwater hydrology, geographic information system (GIS), database management system (DBMS), visualization and data mining to study karst features in Huaibei, China. The 3D geo-objects retrieved from the karst area are analyzed and mapped into different abstract levels. The spatial relationships among the objects are constructed by a dual-linker. The shapes of the 3D objects and the topological models with attributes are stored and maintained in the DBMS. Spatial analysis was then used to integrate the data in the DBMS and the 3D model to form a virtual reality (VR) environment providing analytical functions such as distribution analysis, correlation query, and probability assessment. The research successfully implements 3D modeling and analyses in the karst area, and provides an efficient tool for government policy-makers to set restrictions on water resource development in the area.

  9. Strategies for converting to a DBMS environment

    NASA Technical Reports Server (NTRS)

    Durban, D. M.

    1984-01-01

    The conversion to data base management system processing techniques consists of three different strategies, one for each of the major stages in the development process. Each strategy was chosen for its approach in bringing about a smooth, evolutionary transition from one mode of operation to the next. The initial strategy of the indoctrination stage consisted of: (1) providing maximum access to current administrative data as soon as possible; (2) selecting and developing small prototype systems; (3) establishing a user information center as a central focal point for user training and assistance; and (4) developing a training program for programmers, management, and ad hoc users in DBMS application and utilization. Security, the role of the data dictionary, data base tuning and capacity planning, and the development of a change of attitude in an automated office are issues meriting consideration.

  10. The implementation of POSTGRES

    NASA Technical Reports Server (NTRS)

    Stonebraker, Michael; Rowe, Lawrence A.; Hirohama, Michael

    1990-01-01

    The design and implementation decisions made for the data manager POSTGRES are discussed. Attention is restricted to the DBMS backend functions. The POSTGRES data model and query language, the rules system, the storage system, the POSTGRES implementation, and the current status and performance are discussed.

  11. Relational Information Management Data-Base System

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Erickson, W. J.; Gray, F. P.; Comfort, D. L.; Wahlstrom, S. O.; Von Limbach, G.

    1985-01-01

    RIM is a DBMS with several features particularly useful to scientists and engineers. RIM5 interfaces with any application program written in a language capable of calling FORTRAN routines. Applications include data management for Space Shuttle Columbia tiles, aircraft flight tests, high-pressure piping, atmospheric chemistry, census data, university registration, CAD/CAM geometry, and civil-engineering dam construction.

  12. An Investigation of the Fine Spatial Structure of Meteor Streams Using the Relational Database ``Meteor''

    NASA Astrophysics Data System (ADS)

    Karpov, A. V.; Yumagulov, E. Z.

    2003-05-01

    We have restored and ordered the archive of meteor observations carried out with the meteor radar complex "KGU-M5" since 1986. A relational database has been formed under the control of the Oracle 8 Database Management System (DBMS). We also improved and tested a statistical method for studying the fine spatial structure of meteor streams, taking into account the specific features of the DBMS application. Statistical analysis of the observation results made it possible to obtain information about the substance distribution in the Quadrantid, Geminid, and Perseid meteor streams.

  13. Compilation of the data-base of the star catalogue by ADABAS.

    NASA Astrophysics Data System (ADS)

    Ishikawa, T.

    A data-base of the FK4 Star Catalogue is compiled by using the HITAC M-280H in the Computer Center of Tokyo University and a commercial data-base management system (DBMS), ADABAS. The purpose of this attempt is to examine whether ADABAS, which can be regarded as representative of the currently available DBMSs developed mainly for business and information retrieval purposes, proves itself useful for handling mass numerical data like star catalogue data. It is concluded that the data-base can indeed be a convenient way of storing and utilizing the star catalogue data.

  14. ADVICE--Educational System for Teaching Database Courses

    ERIC Educational Resources Information Center

    Cvetanovic, M.; Radivojevic, Z.; Blagojevic, V.; Bojovic, M.

    2011-01-01

    This paper presents a Web-based educational system, ADVICE, that helps students to bridge the gap between database management system (DBMS) theory and practice. The usage of ADVICE is presented through a set of laboratory exercises developed to teach students conceptual and logical modeling, SQL, formal query languages, and normalization. While…

  15. Cryptanalysis of Password Protection of Oracle Database Management System (DBMS)

    NASA Astrophysics Data System (ADS)

    Koishibayev, Timur; Umarova, Zhanat

    2016-04-01

    This article discusses the encryption algorithms currently available in the Oracle database, as well as a proposed upgraded encryption algorithm, which consists of four steps. In conclusion, we analyze the password encryption of the Oracle database.

  16. Use of a Relational Database to Support Clinical Research: Application in a Diabetes Program

    PubMed Central

    Lomatch, Diane; Truax, Terry; Savage, Peter

    1981-01-01

    A database has been established to support conduct of clinical research and monitor delivery of medical care for 1200 diabetic patients as part of the Michigan Diabetes Research and Training Center (MDRTC). Use of an intelligent microcomputer to enter and retrieve the data and use of a relational database management system (DBMS) to store and manage data have provided a flexible, efficient method of achieving both support of small projects and monitoring overall activity of the Diabetes Center Unit (DCU). Simplicity of access to data, efficiency in providing data for unanticipated requests, ease of manipulation of relations, security, and "logical data independence" were important factors in choosing a relational DBMS. The ability to interface with an interactive statistical program and a graphics program is a major advantage of this system. Our database currently provides support for the operation and analysis of several ongoing research projects.

  17. Improving Collaborative School-Agency Transition Planning: A Statewide DBMS Approach.

    ERIC Educational Resources Information Center

    Peterson, Randolph L.; Roessler, Richard T.

    1997-01-01

    Describes the development and components of a referral database management system developed by the Arkansas Transition Project. The system enables individualized-education-plan team members to refer students with disabilities directly to adult agencies and to receive a monitoring report describing the agency response to the referral. The system is…

  18. Airland Battlefield Environment (ALBE) Tactical Decision Aid (TDA) Demonstration Program,

    DTIC Science & Technology

    1987-11-12

    Management System (DBMS) software, GKS graphics libraries, and user interface software. These components of the ATB system software architecture will be...knowledge base and augment the decision-making process by providing information useful in the formulation and execution of battlefield strategies...Topographic Laboratories as an Engineer. Ms. Capps is managing the software development of the AirLand Battlefield Environment (ALBE) geographic

  19. Management Principles to be Considered for Implementing a Data Base Management System Aboard U.S. (United States) Naval Ships under the Shipboard Non-Tactical ADP (Automated Data Processing) Program.

    DTIC Science & Technology

    1982-12-01

    Data Base Management System Aboard U.S. Naval Ships Under the Shipboard Non-tactical ADP Program, by Robert Harrison Dixon, December 1982, Master's Thesis...Keywords: Data Base Management System, DBMS, SNAP, SNAP I, SNAP II, Information

  20. The Cronus Distributed DBMS (Database Management System) Project

    DTIC Science & Technology

    1989-10-01

    projects, e.g., HiPAC [Dayal 88] and POSTGRES [Stonebraker 86]. Although we expect to use these techniques, they have been developed for centralized...Computing Systems, June 1989. (To appear). [Stonebraker 86] Stonebraker, M. and Rowe, L. A., "The Design of POSTGRES," Proceedings ACM SIGMOD Annual

  1. Android platform based smartphones for a logistical remote association repair framework.

    PubMed

    Lien, Shao-Fan; Wang, Chun-Chieh; Su, Juhng-Perng; Chen, Hong-Ming; Wu, Chein-Hsing

    2014-06-25

    The maintenance of large-scale systems is an important issue for logistics support planning. In this paper, we developed a Logistical Remote Association Repair Framework (LRARF) to aid repairmen in keeping the system available. LRARF includes four subsystems: smart mobile phones, a Database Management System (DBMS), a Maintenance Support Center (MSC) and wireless networks. The repairman uses smart mobile phones to capture QR-codes and the images of faulty circuit boards. The captured QR-codes and images are transmitted to the DBMS so the invalid modules can be recognized via the proposed algorithm. In this paper, the Linear Projective Transform (LPT) is employed for fast QR-code calibration. Moreover, the ANFIS-based data mining system is used for module identification and searching automatically for the maintenance manual corresponding to the invalid modules. The inputs of the ANFIS-based data mining system are the QR-codes and image features; the output is the module ID. DBMS also transmits the maintenance manual back to the maintenance staff. If modules are not recognizable, the repairmen and center engineers can obtain the relevant information about the invalid modules through live video. The experimental results validate the applicability of the Android-based platform in the recognition of invalid modules. In addition, the live video can also be recorded synchronously on the MSC for later use.
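
    The paper does not spell out the Linear Projective Transform in detail, but the general planar projective (homography) mapping used to rectify a skewed QR-code image can be sketched as below. The matrix H and corner coordinates are invented for illustration; in practice the matrix is estimated from the detected corner points of the code.

    ```python
    import numpy as np

    # Illustrative planar projective transform of the kind used to rectify a
    # skewed QR-code image before decoding. The 3x3 matrix H below is made up
    # for demonstration; in practice it is estimated from corner points.
    H = np.array([[1.0,   0.2,   5.0],
                  [0.1,   1.1,   3.0],
                  [0.001, 0.002, 1.0]])

    def project(points, H):
        """Apply a homography to Nx2 points using homogeneous coordinates."""
        pts = np.hstack([points, np.ones((len(points), 1))])   # -> Nx3
        mapped = pts @ H.T
        return mapped[:, :2] / mapped[:, 2:3]                  # divide by w

    corners = np.array([[0.0, 0.0], [21.0, 0.0], [21.0, 21.0], [0.0, 21.0]])
    print(project(corners, H))
    ```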

  2. Applications of GIS and database technologies to manage a Karst Feature Database

    USGS Publications Warehouse

    Gao, Y.; Tipping, R.G.; Alexander, E.C.

    2006-01-01

    This paper describes the management of a Karst Feature Database (KFD) in Minnesota. Two sets of applications in both GIS and Database Management System (DBMS) have been developed for the KFD of Minnesota. These applications were used to manage and to enhance the usability of the KFD. Structured Query Language (SQL) was used to manipulate transactions of the database and to facilitate the functionality of the user interfaces. The Database Administrator (DBA) authorized users with different access permissions to enhance the security of the database. Database consistency and recovery are accomplished by creating data logs and maintaining backups on a regular basis. The working database provides guidelines and management tools for future studies of karst features in Minnesota. The methodology of designing this DBMS is applicable to develop GIS-based databases to analyze and manage geomorphic and hydrologic datasets at both regional and local scales. The short-term goal of this research is to develop a regional KFD for the Upper Mississippi Valley Karst and the long-term goal is to expand this database to manage and study karst features at national and global scales.
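
    A minimal sketch of the transaction-based maintenance described here, assuming a hypothetical feature table rather than the actual Minnesota schema: either every statement in a unit of work commits, or the DBMS rolls all of them back.

    ```python
    import sqlite3

    # Sketch of transactional maintenance like that described for the KFD:
    # either every statement in the unit of work commits, or none does.
    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE karst_features (
        id INTEGER PRIMARY KEY,
        kind TEXT NOT NULL CHECK (kind IN ('sinkhole', 'spring', 'cave')),
        county TEXT NOT NULL)""")

    try:
        with conn:   # opens a transaction; commits on success, rolls back on error
            conn.execute("INSERT INTO karst_features (kind, county) VALUES (?, ?)",
                         ("sinkhole", "Fillmore"))
            conn.execute("INSERT INTO karst_features (kind, county) VALUES (?, ?)",
                         ("volcano", "Fillmore"))   # violates CHECK -> rollback
    except sqlite3.IntegrityError:
        pass

    # The failed batch left no partial state behind.
    print(conn.execute("SELECT COUNT(*) FROM karst_features").fetchone()[0])  # 0
    ```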

  3. The Education of Librarians for Data Administration.

    ERIC Educational Resources Information Center

    Koenig, Michael E. D.; Kochoff, Stephen T.

    1983-01-01

    Argues that the increasing importance of database management systems (DBMS) and recognition of the information dependency of business planning are creating new job opportunities for librarians/information technicians. Highlights include development and functions of DBMSs, data and database administration, potential for librarians, and implications…

  4. Clinical Views: Object-Oriented Views for Clinical Databases

    PubMed Central

    Portoni, Luisa; Combi, Carlo; Pinciroli, Francesco

    1998-01-01

    We present here a prototype of a clinical information system for the archiving and the management of multimedia and temporally-oriented clinical data related to PTCA patients. The system is based on an object-oriented DBMS and supports multiple views and view schemas on patients' data. Remote data access is supported too.

  5. Towards the Interoperability of Web, Database, and Mass Storage Technologies for Petabyte Archives

    NASA Technical Reports Server (NTRS)

    Moore, Reagan; Marciano, Richard; Wan, Michael; Sherwin, Tom; Frost, Richard

    1996-01-01

    At the San Diego Supercomputer Center, a massive data analysis system (MDAS) is being developed to support data-intensive applications that manipulate terabyte-sized data sets. The objective is to support scientific application access to data whether it is located at a Web site, stored as an object in a database, and/or stored in an archival storage system. We are developing a suite of demonstration programs which illustrate how Web, database (DBMS), and archival storage (mass storage) technologies can be integrated. An application presentation interface is being designed that integrates data access to all of these sources. We have developed a data movement interface between the Illustra object-relational database and the NSL UniTree archival storage system running in production mode at the San Diego Supercomputer Center. With this interface, an Illustra client can transparently access data on UniTree under the control of the Illustra DBMS server. The current implementation is based on the creation of a new DBMS storage manager class, and a set of library functions that allow the manipulation and migration of data stored as Illustra 'large objects'. We have extended this interface to allow a Web client application to control data movement between its local disk, the Web server, the Illustra DBMS server, and the UniTree mass storage environment. This paper describes some of the current approaches to successfully integrating these technologies. This framework is measured against a representative sample of environmental data extracted from the San Diego Bay Environmental Data Repository. Practical lessons are drawn and critical research areas are highlighted.

  6. Research in Functionally Distributed Computer Systems Development. Volume XII. Design Considerations in Distributed Data Base Management Systems.

    DTIC Science & Technology

    1977-04-01

    task of data organization, management, and storage has been given to a select group of specialists. These specialists (the Data Base Administrators...report writers, etc.)...distributed DBMS involves first identifying a set of two or more tasks blocking each other from a collection of shared records. Once the set of

  7. Android Platform Based Smartphones for a Logistical Remote Association Repair Framework

    PubMed Central

    Lien, Shao-Fan; Wang, Chun-Chieh; Su, Juhng-Perng; Chen, Hong-Ming; Wu, Chein-Hsing

    2014-01-01

    The maintenance of large-scale systems is an important issue for logistics support planning. In this paper, we developed a Logistical Remote Association Repair Framework (LRARF) to aid repairmen in keeping the system available. LRARF includes four subsystems: smart mobile phones, a Database Management System (DBMS), a Maintenance Support Center (MSC) and wireless networks. The repairman uses smart mobile phones to capture QR-codes and the images of faulty circuit boards. The captured QR-codes and images are transmitted to the DBMS so the invalid modules can be recognized via the proposed algorithm. In this paper, the Linear Projective Transform (LPT) is employed for fast QR-code calibration. Moreover, the ANFIS-based data mining system is used for module identification and searching automatically for the maintenance manual corresponding to the invalid modules. The inputs of the ANFIS-based data mining system are the QR-codes and image features; the output is the module ID. DBMS also transmits the maintenance manual back to the maintenance staff. If modules are not recognizable, the repairmen and center engineers can obtain the relevant information about the invalid modules through live video. The experimental results validate the applicability of the Android-based platform in the recognition of invalid modules. In addition, the live video can also be recorded synchronously on the MSC for later use. PMID:24967603

  8. Advanced Query Formulation in Deductive Databases.

    ERIC Educational Resources Information Center

    Niemi, Timo; Jarvelin, Kalervo

    1992-01-01

    Discusses deductive databases and database management systems (DBMS) and introduces a framework for advanced query formulation for end users. Recursive processing is described, a sample extensional database is presented, query types are explained, and criteria for advanced query formulation from the end user's viewpoint are examined. (31…

  9. Managing geometric information with a data base management system

    NASA Technical Reports Server (NTRS)

    Dube, R. P.

    1984-01-01

    The strategies for managing computer-based geometry are described. The computer model of geometry is the basis for communication, manipulation, and analysis of shape information. The research on integrated programs for aerospace-vehicle design (IPAD) focuses on the use of data base management system (DBMS) technology to manage engineering/manufacturing data. The objective of IPAD is to develop a computer-based engineering complex which automates the storage, management, protection, and retrieval of engineering data. In particular, this facility must manage geometry information as well as associated data. The approach taken on the IPAD project to achieve this objective is discussed. Geometry management in current systems and the approach taken in the early IPAD prototypes are examined.

  10. The USL NASA PC R and D project: General specifications of objectives

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor)

    1984-01-01

    Given here are the general specifications of the objectives of the University of Southwestern Louisiana Data Base Management System (USL/DBMS) NASA PC R and D Project, a project initiated to address future R and D issues related to PC-based processing environments acquired pursuant to the NASA contract work; namely, the IBM PC/XT systems.

  11. Data base management system analysis and performance testing with respect to NASA requirements

    NASA Technical Reports Server (NTRS)

    Martin, E. A.; Sylto, R. V.; Gough, T. L.; Huston, H. A.; Morone, J. J.

    1981-01-01

    Several candidate Data Base Management Systems (DBMSs) that could support the NASA End-to-End Data System's Integrated Data Base Management System (IDBMS) Project, later rescoped and renamed the Packet Management System (PMS), were evaluated. The candidate systems, which had to run on the Digital Equipment Corporation VAX 11/780 computer system, were ORACLE, SEED, and RIM. ORACLE and RIM are both based on the relational data base model while SEED employs a CODASYL network approach. A single data base application which managed stratospheric temperature profiles was studied. The primary reasons for using this application were an insufficient volume of available PMS-like data, a mandate to use actual rather than simulated data, and the abundance of available temperature profile data.

  12. Enriched Title-Based Keyword Index Generation Using dBase II.

    ERIC Educational Resources Information Center

    Rajendran, P. P.

    1986-01-01

    Describes the use of a database management system (DBMS)--dBaseII--to create an enriched title-based keyword index for a collection of news items at the Renewable Energy Resources Information Center of the Asian Institute of Technology. The use of DBMSs in libraries in developing countries is emphasized. (Author/LRW)

  13. Data Administration at a Regional University: A Case Study.

    ERIC Educational Resources Information Center

    Gose, Frank J.

    Data administration (DA) is a position that has emerged with the growth of information technologies. A review of DA literature confirms that, although DA is widely associated with database management systems (DBMS), there is no standard DA job description, DA staffing and location within the organization vary, and DA functions range in description…

  14. The design and implementation of EPL: An event pattern language for active databases

    NASA Technical Reports Server (NTRS)

    Giuffrida, G.; Zaniolo, C.

    1994-01-01

    The growing demand for intelligent information systems requires closer coupling of rule-based reasoning engines, such as CLIPS, with advanced data base management systems (DBMS). For instance, several commercial DBMS now support the notion of triggers that monitor events and transactions occurring in the database and fire induced actions, which perform a variety of critical functions, including safeguarding the integrity of data, monitoring access, and recording volatile information needed by administrators, analysts, and expert systems to perform assorted tasks; examples of these tasks include security enforcement, market studies, knowledge discovery, and link analysis. At UCLA, we designed and implemented the event pattern language (EPL) which is capable of detecting and acting upon complex patterns of events which are temporally related to each other. For instance, a plant manager should be notified when a certain pattern of overheating repeats itself over time in a chemical process; likewise, proper notification is required when a suspicious sequence of bank transactions is executed within a certain time limit. The EPL prototype is built in CLIPS to operate on top of Sybase, a commercial relational DBMS, where actions can be triggered by events such as simple database updates, insertions, and deletions. The rule-based syntax of EPL allows the sequences of goals in rules to be interpreted as sequences of temporal events; each goal can correspond to either (1) a simple event, or (2) a (possibly negated) event/condition predicate, or (3) a complex event defined as the disjunction and repetition of other events. Various extensions have been added to CLIPS in order to tailor the interface with Sybase and its open client/server architecture.
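
    The overheating example can be sketched outside CLIPS as a plain sliding-window check. This mirrors only the semantics of an EPL pattern ("N events within a time window"), not its rule syntax or the Sybase trigger machinery; the event times are invented.

    ```python
    from datetime import datetime, timedelta

    # Sketch of the kind of temporal pattern EPL detects: "three overheating
    # events within ten minutes". EPL itself is a CLIPS-based rule language
    # over Sybase; this plain-Python detector only mirrors the semantics.
    def pattern_fires(events, count=3, window=timedelta(minutes=10)):
        """Return True if `count` events fall inside any sliding time window."""
        times = sorted(events)
        for i in range(len(times) - count + 1):
            if times[i + count - 1] - times[i] <= window:
                return True
        return False

    overheats = [datetime(2024, 1, 1, 12, 0),
                 datetime(2024, 1, 1, 12, 4),
                 datetime(2024, 1, 1, 12, 9),
                 datetime(2024, 1, 1, 13, 30)]
    print(pattern_fires(overheats))   # -> True: three events in ten minutes
    ```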

  15. Natural language query system design for interactive information storage and retrieval systems. Presentation visuals. M.S. Thesis Final Report, 1 Jul. 1985 - 31 Dec. 1987

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Liu, I-Hsiung

    1985-01-01

    This Working Paper Series entry represents a collection of presentation visuals associated with the companion report entitled Natural Language Query System Design for Interactive Information Storage and Retrieval Systems, USL/DBMS NASA/RECON Working Paper Series report number DBMS.NASA/RECON-17.

  16. Selecting a Persistent Data Support Environment for Object-Oriented Applications

    DTIC Science & Technology

    1998-03-01

    key features of most object DBMS products is contained in the DBMS Needs Assessment for Objects from Barry and Associates. The developer should...data structure and behavior in a self-contained module enhances maintainability of the system and promotes reuse of modules for similar domains...considered together, represent a survey of commercial object-oriented database management systems. These references contain detailed information needed

  17. An Analysis of Data Dictionaries and Their Role in Information Resource Management.

    DTIC Science & Technology

    1984-09-01

    management system (DBMS). It manages data by utilizing software routines built into the data dictionary package and thus is not dependent on DBMS software...described as having active or passive interfaces or a combination of the two. An interface is a series of commands which connect the data...carefully conceived examples in the dictionary's reference manuals. A hierarchy of menus can reduce complex operations to a series of smaller

  18. Bone metastasis target redox-responsive micell for the treatment of lung cancer bone metastasis and anti-bone resorption.

    PubMed

    Ye, Wei-Liang; Zhao, Yi-Pu; Cheng, Ying; Liu, Dao-Zhou; Cui, Han; Liu, Miao; Zhang, Bang-Le; Mei, Qi-Bing; Zhou, Si-Yuan

    2018-01-16

    In order to inhibit the growth of lung cancer bone metastasis and reduce the bone resorption at bone metastasis sites, a bone metastasis target micelle DOX@DBMs-ALN was prepared. The size and the zeta potential of DOX@DBMs-ALN were about 60 nm and -15 mV, respectively. DOX@DBMs-ALN exhibited high binding affinity with hydroxyapatite and released DOX in a redox-responsive manner. DOX@DBMs-ALN was effectively taken up by A549 cells and delivered DOX to the nucleus of A549 cells, which resulted in strong cytotoxicity against A549 cells. The in vivo experimental results indicated that DOX@DBMs-ALN specifically delivered DOX to the bone metastasis site and markedly prolonged the retention time of DOX there. Moreover, DOX@DBMs-ALN not only significantly inhibited the growth of the bone metastasis tumour but also markedly reduced bone resorption at bone metastasis sites without causing notable systemic toxicity. Thus, DOX@DBMs-ALN has great potential in the treatment of lung cancer bone metastasis.

  19. Quantitative Evaluation of 3 DBMS: ORACLE, SEED AND INGRES

    NASA Technical Reports Server (NTRS)

    Sylto, R.

    1984-01-01

    Characteristics required for NASA scientific data base management applications are listed, as well as performance testing objectives. Results obtained for the ORACLE, SEED, and INGRES packages are presented in charts. It is concluded that vendor packages can manage 130 megabytes of data at acceptable load and query rates. Performance tests that vary data base designs and various data base management system parameters are valuable to applications for choosing packages and critical to designing effective data bases. An application's productivity increases with the use of a data base management system because of enhanced capabilities such as a screen formatter, a report writer, and a data dictionary.

  20. User interface to administrative DBMS within a distributed environment

    NASA Technical Reports Server (NTRS)

    Martin, L. D.; Kirk, R. D.

    1983-01-01

    The implementation of a data base management system (DBMS) in a communications office to control and report on communication leased service contracts is discussed. The system user executes online programs to update five files residing on a UNIVAC 1100/82, through the forms-mode features of the Tektronix 4025 terminal and an IMSAI 8080 microcomputer. The user can call the appropriate form to the Tektronix 4025 screen and enter new data, update existing data, or discontinue service. Selective online printing of 40 reports is accomplished by the system user to satisfy management, budget, and bill payment reporting requirements.

  1. StarBase: A Firm Real-Time Database Manager for Time-Critical Applications

    DTIC Science & Technology

    1995-01-01

    Mellon University [10]. StarBase differs from previous RT-DBMS work [1, 2, 3] in that a) it relies on a real-time operating system which provides...simulation studies, StarBase uses a real-time operating system to provide basic real-time functionality and deals with issues beyond transaction...resource scheduling provided by the underlying real-time operating system. Issues of data contention are dealt with by use of a priority

  2. Managing Contention and Timing Constraints in a Real-Time Database System

    DTIC Science & Technology

    1995-01-01

    In order to realize many of these goals, StarBase is constructed on top of RT-Mach, a real-time operating system developed at Carnegie Mellon...University [11]. StarBase differs from previous RT-DBMS work [1, 2, 3] in that a) it relies on a real-time operating system which provides priority...CPU and resource scheduling provided by the underlying real-time operating system. Issues of data contention are dealt with by use of a priority

  3. Integrating Real-time Earthquakes into Natural Hazard Courses

    NASA Astrophysics Data System (ADS)

    Furlong, K. P.; Benz, H. M.; Whitlock, J. S.; Bittenbinder, A. N.; Bogaert, B. B.

    2001-12-01

    Natural hazard courses are playing an increasingly important role in college and university earth science curricula. Students' intrinsic curiosity about the subject and the potential to make the course relevant to the interests of both science and non-science students make natural hazards courses popular additions to a department's offerings. However, one vital aspect of "real-life" natural hazard management that has not translated well into the classroom is the real-time nature of both events and response. The lack of a way to entrain students into the event/response mode has made implementing such real-time activities into classroom activities problematic. Although a variety of web sites provide near real-time postings of natural hazards, students essentially learn of the event after the fact. This is particularly true for earthquakes and other events with few precursors. As a result, the "time factor" and personal responsibility associated with natural hazard response is lost to the students. We have integrated the real-time aspects of earthquake response into two natural hazard courses at Penn State (a 'general education' course for non-science majors, and an upper-level course for science majors) by implementing a modification of the USGS Earthworm system. The Earthworm Database Management System (E-DBMS) catalogs current global seismic activity. It provides earthquake professionals with real-time email/cell phone alerts of global seismic activity and access to the data for review/revision purposes. We have modified this system so that real-time response can be used to address specific scientific, policy, and social questions in our classes. As a prototype of using the E-DBMS in courses, we have established an Earthworm server at Penn State. This server receives national and global seismic network data and, in turn, transmits the tailored alerts to "on-duty" students (e-mail, pager/cell phone notification). These students are responsible to react to the alarm real-time, consulting other members of their class and accessing the E-DBMS server and other links to glean information that they will then use to make decisions. Students wrestle with the complications in interpreting natural hazard data, evaluating whether a response is needed, and problems such as those associated with communication between media and the public through these focused exercises. Although earthquakes are targeted at present, similar DBMS systems are envisioned for other natural hazards like flooding, volcanoes, and severe weather. We are testing this system as a prototype intended to be expanded to provide web-based access to classes at both the middle/high school and college/university levels.

  4. General specifications for the development of a USL NASA PC R and D statistical analysis support package

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Bassari, Jinous; Triantafyllopoulos, Spiros

    1984-01-01

    The University of Southwestern Louisiana (USL) NASA PC R and D statistical analysis support package is designed to be a three-level package to allow statistical analysis for a variety of applications within the USL Data Base Management System (DBMS) contract work. The design addresses usage of the statistical facilities as a library package, as an interactive statistical analysis system, and as a batch processing package.

  5. Using a Java Dynamic Tree to manage the terminology in a suite of medical applications.

    PubMed

    Yang, K; Evens, M W; Trace, D A

    2008-01-01

    Now that the National Library of Medicine has made SNOMED-CT widely available, we are trying to manage the terminology of a whole suite of medical applications and map our terminology into that in SNOMED. This paper describes the design and implementation of the Java Dynamic Tree that provides structure to our medical terminology and explains how it functions as the core of our system. The tree was designed to reflect the stages in a patient interview, so it contains components for identifying the patient and the provider, a large set of chief complaints, review of systems, physical examination, several history modules, medications, laboratory tests, imaging, and special procedures. The tree is mirrored in a commercial DBMS, which also stores multi-encounter patient data, disorder patterns for our Bayesian diagnostic system, and the data and rules for other expert systems. The DBMS facilitates the import and export of large terminology files. Our Java Dynamic Tree allows the health care provider to view the entire terminology along with the structure that supports it, as well as the mechanism for the generation of progress notes and other documents, in terms of a single hierarchical structure. Changes in terminology can be propagated through the system under the control of the expert. The import/export facility has been a major help by replacing our original terminology by the terminology in SNOMED-CT.
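
    A sketch of how such a terminology tree can be mirrored in a relational DBMS through a parent-id self-reference, with a recursive query walking one subtree. The rows are invented, and Python with SQLite stands in here for the authors' Java implementation and commercial DBMS.

    ```python
    import sqlite3

    # Mirror a terminology tree in a relational table via a parent_id
    # self-reference; the hierarchy below is a made-up fragment.
    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE term (
        id INTEGER PRIMARY KEY,
        parent_id INTEGER REFERENCES term(id),
        label TEXT)""")
    conn.executemany("INSERT INTO term VALUES (?,?,?)", [
        (1, None, "patient interview"),
        (2, 1, "chief complaints"),
        (3, 1, "review of systems"),
        (4, 2, "chest pain"),
    ])

    # Walk one subtree with a recursive common table expression.
    cur = conn.execute("""
        WITH RECURSIVE subtree(id, label) AS (
            SELECT id, label FROM term WHERE id = 2
            UNION ALL
            SELECT t.id, t.label FROM term t
            JOIN subtree s ON t.parent_id = s.id)
        SELECT label FROM subtree""")
    print([row[0] for row in cur])   # -> ['chief complaints', 'chest pain']
    ```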

  6. Time series patterns and language support in DBMS

    NASA Astrophysics Data System (ADS)

    Telnarova, Zdenka

    2017-07-01

    This contribution focuses on the pattern type Time Series as a semantically rich representation of data. Examples of implementations of this pattern type in traditional Data Base Management Systems are briefly presented. There are many approaches to manipulating and querying patterns. The crucial issue lies in a systematic approach to pattern management and a specific pattern query language that takes the semantics of patterns into consideration. The query language SQL-TS for manipulating patterns is demonstrated on time series data.
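
    As a rough illustration of what a pattern query over a time series expresses, the sketch below matches the pattern "strictly rising run of length at least three" procedurally; a language like SQL-TS would state the same consecutive-row conditions declaratively. The series is invented.

    ```python
    # Match a "strictly rising run" pattern over a time series; SQL-TS-style
    # languages declare this as conditions over consecutive rows.
    def rising_runs(series, min_len=3):
        """Yield (start, end) index pairs of strictly increasing runs."""
        start = 0
        for i in range(1, len(series) + 1):
            if i == len(series) or series[i] <= series[i - 1]:
                if i - start >= min_len:
                    yield (start, i - 1)
                start = i

    prices = [3, 4, 7, 9, 2, 5, 6, 6, 8]
    print(list(rising_runs(prices)))   # -> [(0, 3), (4, 6)]
    ```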

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kogalovskii, M.R.

    This paper presents a review of problems related to statistical database systems, which are widespread in various fields of activity. Statistical databases (SDB) are databases that are used for statistical analysis. Topics under consideration are: SDB peculiarities, properties of data models adequate for SDB requirements, metadata functions, null-value problems, SDB compromise protection problems, stored data compression techniques, and statistical data representation means. Also examined is whether present Database Management Systems (DBMS) satisfy SDB requirements. Some current research directions in SDB systems are considered.

  8. Legacy systems: managing evolution through integration in a distributed and object-oriented computing environment.

    PubMed

    Lemaitre, D; Sauquet, D; Fofol, I; Tanguy, L; Jean, F C; Degoulet, P

    1995-01-01

    Legacy systems are crucial for organizations since they support key functionalities. But they become obsolete with aging and the appearance of new techniques. Managing their evolution is a key issue in software engineering. This paper presents a strategy that has been developed at Broussais University Hospital in Paris to make a legacy system devoted to the management of health care units evolve towards new, up-to-date software. A two-phase evolution pathway is described. The first phase consists in separating the interface from the data storage and application control and in using a communication channel between the individualized components. The second phase proposes to use an object-oriented DBMS in place of the homegrown system. An application example for the management of hypertensive patients is described.

  9. Karst database development in Minnesota: Design and data assembly

    USGS Publications Warehouse

    Gao, Y.; Alexander, E.C.; Tipping, R.G.

    2005-01-01

    The Karst Feature Database (KFD) of Minnesota is a relational GIS-based Database Management System (DBMS). Previous karst feature datasets used inconsistent attributes to describe karst features in different areas of Minnesota. Existing metadata were modified and standardized to represent a comprehensive metadata for all the karst features in Minnesota. Microsoft Access 2000 and ArcView 3.2 were used to develop this working database. Existing county and sub-county karst feature datasets have been assembled into the KFD, which is capable of visualizing and analyzing the entire data set. By November 17, 2002, 11,682 karst features were stored in the KFD of Minnesota. Data tables are stored in a Microsoft Access 2000 DBMS and linked to corresponding ArcView applications. The current KFD of Minnesota has been moved from a Windows NT server to a Windows 2000 Citrix server accessible to researchers and planners through networked interfaces. © Springer-Verlag 2005.

  10. The use of database management systems and artificial intelligence in automating the planning of optical navigation pictures

    NASA Technical Reports Server (NTRS)

    Davis, Robert P.; Underwood, Ian M.

    1987-01-01

    The use of database management systems (DBMS) and AI to minimize human involvement in the planning of optical navigation pictures for interplanetary space probes is discussed, with application to the Galileo mission. Parameters characterizing the desirability of candidate pictures, and the program generating them, are described. How these parameters automatically build picture records in a database, and the definition of the database structure, are then discussed. The various rules, priorities, and constraints used in selecting pictures are also described. An example is provided of an expert system, written in Prolog, for automatically performing the selection process.
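
    The selection logic can be sketched as a filter-then-rank pass over candidate picture records. The field names, constraint, and weights below are invented for illustration; the actual system encoded its rules in Prolog over the database.

    ```python
    # Candidate picture records carry desirability parameters; hard
    # constraints filter, then a priority score ranks the survivors.
    candidates = [
        {"id": "P1", "target": "Io",     "sun_angle": 35, "value": 0.9},
        {"id": "P2", "target": "Europa", "sun_angle": 5,  "value": 0.8},
        {"id": "P3", "target": "Io",     "sun_angle": 60, "value": 0.4},
    ]

    def feasible(pic):
        """Hard constraint: reject pictures taken too close to the sun line."""
        return pic["sun_angle"] >= 10

    def priority(pic):
        """Soft ranking: navigation value, with a nominal bonus for Io passes."""
        return pic["value"] + (0.1 if pic["target"] == "Io" else 0.0)

    schedule = sorted(filter(feasible, candidates), key=priority, reverse=True)
    print([p["id"] for p in schedule])   # -> ['P1', 'P3']
    ```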

  11. Database architectures for Space Telescope Science Institute

    NASA Astrophysics Data System (ADS)

    Lubow, Stephen

    1993-08-01

    At STScI nearly all large applications require database support. A general purpose architecture has been developed and is in use that relies upon an extended client-server paradigm. Processing is in general distributed across three processes, each of which generally resides on its own processor. Database queries are evaluated on one such process, called the DBMS server. The DBMS server software is provided by a database vendor. The application issues database queries and is called the application client. This client uses a set of generic DBMS application programming calls through our STDB/NET programming interface. Intermediate between the application client and the DBMS server is the STDB/NET server. This server accepts generic query requests from the application and converts them into the specific requirements of the DBMS server. In addition, it accepts query results from the DBMS server and passes them back to the application. Typically the STDB/NET server is local to the DBMS server, while the application client may be remote. The STDB/NET server provides additional capabilities such as database deadlock restart and performance monitoring. This architecture is currently in use for some major STScI applications, including the ground support system. We are currently investigating means of providing ad hoc query support to users through the above architecture. Such support is critical for providing flexible user interface capabilities. The Universal Relation advocated by Ullman, Kernighan, and others appears to be promising. In this approach, the user sees the entire database as a single table, thereby freeing the user from needing to understand the detailed schema. A software layer provides the translation between the user and detailed schema views of the database. However, many subtle issues arise in making this transformation. We are currently exploring this scheme for use in the Hubble Space Telescope user interface to the data archive system (DADS).
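
    A toy sketch of the three-process indirection, assuming a made-up adapter interface (STDB/NET's real API is not reproduced here): the application client issues generic queries, and an intermediate server translates them for whichever vendor DBMS sits behind it.

    ```python
    import sqlite3

    # The application speaks a generic query API; an adapter translates to
    # whatever the vendor DBMS requires. SQLite stands in for the vendor.
    class GenericDBServer:
        """Stands between the application client and a vendor DBMS server."""
        def __init__(self, vendor_conn):
            self._conn = vendor_conn

        def query(self, text, params=()):
            # Translation hook: rewrite generic SQL to the vendor dialect here.
            return self._conn.execute(text, params).fetchall()

    vendor = sqlite3.connect(":memory:")
    vendor.execute("CREATE TABLE exposures (id INTEGER, target TEXT)")
    vendor.execute("INSERT INTO exposures VALUES (1, 'M31')")

    server = GenericDBServer(vendor)
    print(server.query("SELECT target FROM exposures WHERE id = ?", (1,)))
    ```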

  12. Geospatial Based Information System Development in Public Administration for Sustainable Development and Planning in Urban Environment

    NASA Astrophysics Data System (ADS)

    Kouziokas, Georgios N.

    2016-09-01

    It is generally agreed that the governmental authorities should actively encourage the development of an efficient framework of information and communication technology initiatives so as to advance and promote sustainable development and planning strategies. This paper presents a prototype Information System for public administration which was designed to facilitate public management and decision making for sustainable development and planning. The system was developed by using several programming languages and programming tools and also a Database Management System (DBMS) for storing and managing urban data of many kinds. Furthermore, geographic information systems were incorporated into the system in order to make possible to the authorities to deal with issues of spatial nature such as spatial planning. The developed system provides a technology based management of geospatial information, environmental and crime data of urban environment aiming at improving public decision making and also at contributing to a more efficient sustainable development and planning.

  13. An innovative, multidisciplinary educational program in interactive information storage and retrieval. Presentation visuals. M.S. Thesis Final Report, 1 Jul. 1985 - 31 Dec. 1987

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Gallagher, Mary C.

    1985-01-01

    This Working Paper Series entry represents a collection of presentation visuals associated with the companion report entitled An Innovative, Multidisciplinary Educational Program in Interactive Information Storage and Retrieval, USL/DBMS NASA/RECON Working Paper Series report number DBMS.NASA/RECON-12. The project objectives are to develop a set of transportable, hands-on, data base management courses for science and engineering students to facilitate their utilization of information storage and retrieval programs.

  14. In vitro and in vivo antitumor effects of doxorubicin loaded with bacterial magnetosomes (DBMs) on H22 cells: the magnetic bio-nanoparticles as drug carriers.

    PubMed

    Sun, Jian-Bo; Duan, Jin-Hong; Dai, Shun-Ling; Ren, Jun; Zhang, Yan-Dong; Tian, Jie-Sheng; Li, Ying

    2007-12-08

    Hepatocellular carcinoma (HCC) is among the most common forms of cancer, although effective therapeutic strategies, especially targeted therapies, are lacking. We recently employed bacterial magnetosomes (BMs) as a magnetic-targeted drug carrier and found an antitumor effect of doxorubicin (DOX)-loaded BMs (DBMs) in EMT-6 and HL60 cell lines. The aim of this study was to evaluate the in vitro and in vivo anti-neoplastic effects of DBMs on hepatic cancer. DBMs, DOX and BMs displayed tumor suppression rates of 86.8%, 78.6% and 4.3%, respectively, in H22 cell-bearing mice. The mortality rates following administration of DBMs, DOX and BMs were 20%, 80% and 0%, respectively. Pathological examination of hearts and tumors revealed that both DBMs and DOX effectively inhibited tumor growth, although DBMs displayed a much lower cardiac toxicity compared with DOX. The DBMs were cytotoxic to H22 cells, manifested as inhibition of cell proliferation and c-myc expression, consistent with DOX. The IC(50) values of DOX, DBMs and BMs in target cells were 5.309 +/- 0.010, 4.652 +/- 0.256 and 22.106 +/- 3.330 microg/ml, respectively. Our data revealed both in vitro and in vivo antitumor properties of DBMs similar to those of DOX. More importantly, the adverse cardiac toxicity was significantly reduced with DBMs compared with DOX. Collectively, our study suggests the therapeutic potential of DBMs in targeted therapy against liver cancer.

  15. Legacy systems: managing evolution through integration in a distributed and object-oriented computing environment.

    PubMed Central

    Lemaitre, D.; Sauquet, D.; Fofol, I.; Tanguy, L.; Jean, F. C.; Degoulet, P.

    1995-01-01

    Legacy systems are crucial for organizations since they support key functionalities. But they become obsolete with aging and the appearance of new techniques. Managing their evolution is a key issue in software engineering. This paper presents a strategy that has been developed at Broussais University Hospital in Paris to make a legacy system devoted to the management of health care units evolve towards new, up-to-date software. A two-phase evolution pathway is described. The first phase consists in separating the interface from the data storage and application control and in using a communication channel between the individualized components. The second phase proposes to use an object-oriented DBMS in place of the homegrown system. An application example for the management of hypertensive patients is described. PMID:8563252

  16. An Expert System Interfaced with a Database System to Perform Troubleshooting of Aircraft Carrier Piping Systems

    DTIC Science & Technology

    1988-12-01

    interval of four feet, and are numbered sequentially bow to stern. A "wing tank" is a tank or void, outboard of the holding bulkhead, away from the center...system and DBMS simultaneously with a multi-processor, allowing queries to the DBMS without terminating the expert system. This method was judged...RECIRC). eductor_strip("Y") :- ask_ques_read_ans(OVBD, "ovbd dis open"), ovbd_dis_open(OVBD). eductor_strip("N") :- ask_ques_read_ans(LINEUP, "strip lineup

  17. Database Reports Over the Internet

    NASA Technical Reports Server (NTRS)

    Smith, Dean Lance

    2002-01-01

    Most of the summer was spent developing software that would permit existing test report forms to be printed over the web on a printer that is supported by Adobe Acrobat Reader. The data is stored in a DBMS (Data Base Management System). The client asks for the information from the database using an HTML (Hyper Text Markup Language) form in a web browser. JavaScript is used with the forms to assist the user and verify the integrity of the entered data. Queries to a database are made in SQL (Structured Query Language), a widely supported standard for making queries to databases. Java servlets, programs written in the Java programming language running under the control of network server software, interrogate the database and complete a PDF form template kept in a file. The completed report is sent to the browser requesting the report. Some errors are sent to the browser in an HTML web page, others are reported to the server. Access to the databases was restricted since the data are being transported to new DBMS software that will run on new hardware. However, the SQL queries were made to Microsoft Access, a DBMS that is available on most PCs (Personal Computers). Access does support the SQL commands that were used, and a database was created with Access that contained typical data for the report forms. Some of the problems and features are discussed below.
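
    The servlet's role can be sketched as: accept validated form input, query the DBMS in SQL, and pour the rows into a report template. The schema and template below are invented, and plain text stands in for the PDF form filling described above.

    ```python
    import sqlite3
    from string import Template

    # Take validated form input, run a SQL query, fill a report template.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE tests (serial TEXT, result TEXT)")
    conn.execute("INSERT INTO tests VALUES ('A-100', 'PASS')")

    REPORT = Template("Test report for $serial\nResult: $result\n")

    def render_report(serial):
        row = conn.execute(
            "SELECT serial, result FROM tests WHERE serial = ?", (serial,)
        ).fetchone()
        if row is None:
            return "No such test record."   # error path reported to the browser
        return REPORT.substitute(serial=row[0], result=row[1])

    print(render_report("A-100"))
    ```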

  18. Comparing NetCDF and SciDB on managing and querying 5D hydrologic dataset

    NASA Astrophysics Data System (ADS)

    Liu, Haicheng; Xiao, Xiao

    2016-11-01

    Efficiently extracting information from high-dimensional hydro-meteorological modelling datasets requires smart solutions. Traditional methods are mostly file-based; files can be edited and accessed handily, but they suffer from efficiency problems due to their contiguous storage structure. Databases have been proposed as an alternative for advantages such as native functionality for manipulating multidimensional (MD) arrays, smart caching strategies, and scalability. In this research, NetCDF file-based solutions and the multidimensional array database management system (DBMS) SciDB, which applies a chunked storage structure, are benchmarked to determine the best solution for storing and querying a large 5D hydrologic modelling dataset. The effect of data storage configurations, including chunk size, dimension order, and compression, on query performance is explored. Results indicate that the dimension order used to organize the storage of 5D data has a significant influence on query performance if the chunk size is very large, but the effect becomes insignificant when the chunk size is properly set. SciDB's compression mostly has a negative influence on query performance. Caching is an advantage but may be influenced by the execution of different query processes. On the whole, the NetCDF solution without compression is in general more efficient than the SciDB DBMS.
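
    Why chunk shape matters can be sketched with a little arithmetic: a range query must read every chunk it overlaps, so the chunk grid fixes the I/O cost. The 5D grid and chunk sizes below are illustrative, not the benchmarked configurations.

    ```python
    from itertools import product

    # A range query must read every chunk it overlaps, so the chunk grid
    # (and, indirectly, the dimension order) sets the read volume.
    def chunks_touched(query_lo, query_hi, chunk_shape):
        """Enumerate chunk coordinates overlapped by an inclusive query box."""
        ranges = [range(lo // c, hi // c + 1)
                  for lo, hi, c in zip(query_lo, query_hi, chunk_shape)]
        return list(product(*ranges))

    # A thin slice along one dimension of a 5D grid touches a row of chunks.
    touched = chunks_touched((0, 0, 0, 10, 10), (0, 0, 99, 19, 19),
                             chunk_shape=(10, 10, 10, 10, 10))
    print(len(touched))   # -> 10 chunks
    ```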

  19. Performance Modeling in CUDA Streams - A Means for High-Throughput Data Processing.

    PubMed

    Li, Hao; Yu, Di; Kumar, Anand; Tu, Yi-Cheng

    2014-10-01

    Push-based database management system (DBMS) is a new type of data processing software that streams large volumes of data to concurrent query operators. The high data rate of such systems requires large computing power provided by the query engine. In our previous work, we built a push-based DBMS named G-SDMS to harness the unrivaled computational capabilities of modern GPUs. A major design goal of G-SDMS is to support concurrent processing of heterogeneous query processing operations and enable resource allocation among such operations. Understanding the performance of operations as a result of resource consumption is thus a premise in the design of G-SDMS. With NVIDIA's CUDA framework as the system implementation platform, we present our recent work on performance modeling of CUDA kernels running concurrently under a runtime mechanism named CUDA stream. Specifically, we explore the connection between performance and resource occupancy of compute-bound kernels and develop a model that can predict the performance of such kernels. Furthermore, we provide an in-depth anatomy of the CUDA stream mechanism and summarize the main kernel scheduling disciplines in it. Our models and derived scheduling disciplines are verified by extensive experiments using synthetic and real-world CUDA kernels.

  20. Quality characteristics of bread and cookies enriched with debittered Moringa oleifera seed flour.

    PubMed

    Ogunsina, B S; Radha, C; Indrani, D

    2011-03-01

    The effects of replacing wheat flour with 0-15% debittered moringa seed (DBMS) flour on the dough rheology of wheat flour and the physical, sensory and chemical properties of bread were studied. Incorporation of an increasing amount of DBMS from 0 to 15% decreased farinograph water absorption, dough stability, amylograph peak viscosity and the overall quality of bread. The bread with 10% DBMS had a typical moringa seed taste and was acceptable. Addition of a combination of additives improved the dough strength and quality of bread with 10% DBMS flour. Replacement of wheat flour with 10%, 20% and 30% DBMS grits was found to affect cookie quality. Cookies with 20% DBMS grits had the nutty taste of moringa seeds and were acceptable. Bread with 10% DBMS flour and cookies with 20% DBMS grits had more protein, iron and calcium. Incorporating moringa seeds in baked foods may be exploited as a means of boosting nutrition in Africa and Asia where malnutrition is prevalent.

  1. Development of a GIS-based spill management information system.

    PubMed

    Martin, Paul H; LeBoeuf, Eugene J; Daniel, Edsel B; Dobbins, James P; Abkowitz, Mark D

    2004-08-30

    Spill Management Information System (SMIS) is a geographic information system (GIS)-based decision support system designed to effectively manage the risks associated with accidental or intentional releases of a hazardous material into an inland waterway. SMIS provides critical planning and impact information to emergency responders in anticipation of, or following such an incident. SMIS couples GIS and database management systems (DBMS) with the 2-D surface water model CE-QUAL-W2 Version 3.1 and the air contaminant model Computer-Aided Management of Emergency Operations (CAMEO) while retaining full GIS risk analysis and interpretive capabilities. Live 'real-time' data links are established within the spill management software to utilize current meteorological information and flowrates within the waterway. Capabilities include rapid modification of modeling conditions to allow for immediate scenario analysis and evaluation of 'what-if' scenarios. The functionality of the model is illustrated through a case study of the Cheatham Reach of the Cumberland River near Nashville, TN.

  2. The design of PC/MISI, a PC-based common user interface to remote information storage and retrieval systems. Presentation visuals. M.S. Thesis Final Report, 1 Jul. 1985 - 31 Dec. 1987

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Hall, Philip P.

    1985-01-01

    This Working Paper Series entry represents a collection of presentation visuals associated with the companion report entitled, The Design of PC/MISI, a PC-Based Common User Interface to Remote Information Storage and Retrieval Systems, USL/DBMS NASA/RECON Working Paper Series report number DBMS.NASA/RECON-15. The paper discusses the following: problem definition; the PC solution; the goals of system design; the design description; future considerations; the research environment; and conclusions.

  3. Information management systems for pharmacogenomics.

    PubMed

    Thallinger, Gerhard G; Trajanoski, Slave; Stocker, Gernot; Trajanoski, Zlatko

    2002-09-01

    The value of high-throughput genomic research is dramatically enhanced by association with key patient data. These data are generally available but of disparate quality and not typically directly associated. A system that could bring these disparate data sources into a common resource connected with functional genomic data would be tremendously advantageous. However, the integration of clinical data and the accurate interpretation of the generated functional genomic data require the development of information management systems capable of effectively capturing the data, as well as tools to make that data accessible to the laboratory scientist or to the clinician. In this review, these challenges and the current information technology solutions associated with the management, storage and analysis of high-throughput data are highlighted. It is suggested that the development of a pharmacogenomic data management system which integrates public and proprietary databases, clinical datasets, and data mining tools embedded in a high-performance computing environment should include the following components: parallel processing systems, storage technologies, network technologies, databases and database management systems (DBMS), and application services.

  4. A Data Definition Language for GLAD (Graphic Language for Databases).

    DTIC Science & Technology

    1986-06-20

    basic premises. These principles state that a DBMS interface must be descriptive, powerful, easy to use and easy to learn. This thesis proposes a data... criteria will be the most successful. If a system is hard to learn, of those capable of mastering the system few may be willing to expend the time and...

  5. Parallel In Situ Indexing for Data-intensive Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Jinoh; Abbasi, Hasan; Chacon, Luis

    2011-09-09

    As computing power increases exponentially, vast amounts of data are created by many scientific research activities. However, the bandwidth for storing the data to disks and reading the data from disks has been improving at a much slower pace. These two trends produce an ever-widening data access gap. Our work brings together two distinct technologies to address this data access issue: indexing and in situ processing. From decades of database research literature, we know that indexing is an effective way to address the data access issue, particularly for accessing a relatively small fraction of data records. As data sets increase in size, more and more analysts need to use selective data access, which makes indexing even more important for improving data access. The challenge is that most implementations of indexing technology are embedded in large database management systems (DBMS), but most scientific datasets are not managed by any DBMS. In this work, we choose to include indexes with the scientific data instead of requiring the data to be loaded into a DBMS. We use compressed bitmap indexes from the FastBit software, which are known to be highly effective for query-intensive workloads common to scientific data analysis. To use the indexes, we need to build them first. The index building procedure needs to access the whole data set and may also require a significant amount of compute time. In this work, we adapt the in situ processing technology to generate the indexes, thus removing the need to read data from disks and enabling the indexes to be built in parallel. The in situ data processing system used is ADIOS, a middleware for high-performance I/O. Our experimental results show that the indexes can improve the data access time up to 200 times depending on the fraction of data selected, and that using an in situ data processing system can effectively reduce the time needed to create the indexes, up to 10 times with our in situ technique when using identical parallel settings.
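
    Bitmap indexes of the kind FastBit provides answer selective queries with bitwise operations. The toy sketch below shows the underlying idea using plain (uncompressed) Python-int bitsets; FastBit itself uses word-aligned compression and far more sophisticated binning, so this is only an illustration of the principle.

```python
# Toy bitmap index: one bitset (a Python int) per value bin, so a
# range query over several bins reduces to bitwise ORs plus a scan
# of the set bits. Bins and data are invented for the example.

def build_bitmap_index(values, bins):
    """Map each bin index to a bitset of matching record positions."""
    index = {b: 0 for b in range(len(bins) - 1)}
    for pos, v in enumerate(values):
        for b in range(len(bins) - 1):
            if bins[b] <= v < bins[b + 1]:
                index[b] |= 1 << pos
                break
    return index

def query(index, bin_ids):
    """Return record positions whose value falls in any selected bin."""
    mask = 0
    for b in bin_ids:
        mask |= index[b]
    return [pos for pos in range(mask.bit_length()) if mask >> pos & 1]

data = [0.1, 3.7, 2.2, 9.5, 4.1]
idx = build_bitmap_index(data, bins=[0, 2, 4, 6, 8, 10])
print(query(idx, bin_ids=[1, 2]))   # records with 2 <= value < 6 -> [1, 2, 4]
```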

  6. Depth to the bottom of magnetic sources (DBMS) from aeromagnetic data of Central India using modified centroid method for fractal distribution of sources

    NASA Astrophysics Data System (ADS)

    Bansal, A. R.; Anand, S. P.; Rajaram, Mita; Rao, V. K.; Dimri, V. P.

    2013-09-01

    The depth to the bottom of the magnetic sources (DBMS) has been estimated from the aeromagnetic data of Central India. The conventional centroid method of DBMS estimation assumes a random, uniform, uncorrelated distribution of sources; to overcome this limitation, a modified centroid method based on scaling distribution has been proposed. Shallower values of the DBMS are found for the south-western region. The DBMS values are found to be as low as 22 km in the south-west Deccan trap covered regions and as deep as 43 km in the Chhattisgarh Basin. In most places the DBMS is much shallower than the Moho depth found earlier from seismic studies, and may represent thermal/compositional/petrological boundaries. The large variation in the DBMS indicates the complex nature of the Indian crust.

  7. Technology-based management of environmental organizations using an Environmental Management Information System (EMIS): Design and development

    NASA Astrophysics Data System (ADS)

    Kouziokas, Georgios N.

    2016-01-01

    The adoption of Information and Communication Technologies (ICT) in environmental management has become a significant demand nowadays with the rapid growth of environmental information. This paper presents a prototype Environmental Management Information System (EMIS) that was developed to provide a systematic way of managing the environmental data and human resources of an environmental organization. The system was designed using programming languages, a Database Management System (DBMS), and other programming technologies and tools, and it combines information from the relational database in order to achieve the principal goals of the environmental organization. The developed application can be used to store and elaborate information regarding: human resources data, environmental projects, observations, reports, data about protected species, environmental measurements of pollutant factors or other kinds of analytical measurements, and the financial data of the organization. Furthermore, the system supports the visualization of spatial data structures by using geographic information systems (GIS) and web mapping technologies. This paper describes this prototype software application, its structure, its functions, and how the system can be utilized to facilitate technology-based environmental management and the decision-making process.

  8. A GH-Based Ontology to Support Applications for Automating Decision Support

    DTIC Science & Technology

    2005-03-01

    architecture for a decision support system. For this reason, it obtains data from, and updates, a database. IDA also wanted the prototype's architecture... [fragment of the report's acronym list: Chief Information Officer; CoABS, Control of Agent Based Systems; DBMS, Database Management System; DoD, Department of Defense; DTD, Document Type...] ...Generic Hub, the Moyeu Générique, and the Generische Nabe, specifying each as a separate service description with property names and values of the GH

  9. Conceptual information processing: A robust approach to KBS-DBMS integration

    NASA Technical Reports Server (NTRS)

    Lazzara, Allen V.; Tepfenhart, William; White, Richard C.; Liuzzi, Raymond

    1987-01-01

    Integrating the respective functionality and architectural features of knowledge base and data base management systems is a topic of considerable interest. Several aspects of this topic and associated issues are addressed. The significance of integration and the problems associated with accomplishing that integration are discussed. The shortcomings of current approaches to integration and the need to fuse the capabilities of both knowledge base and data base management systems motivates the investigation of information processing paradigms. One such paradigm is concept based processing, i.e., processing based on concepts and conceptual relations. An approach to robust knowledge and data base system integration is discussed by addressing progress made in the development of an experimental model for conceptual information processing.

  10. Spatial DBMS Architecture for a Free and Open Source BIM

    NASA Astrophysics Data System (ADS)

    Logothetis, S.; Valari, E.; Karachaliou, E.; Stylianidis, E.

    2017-08-01

    Recent research in the field of Building Information Modelling (BIM) technology revealed that, apart from a few accessible and free BIM viewers, there is a lack of Free & Open Source Software (FOSS) for the complete BIM process. With this in mind, and considering BIM as the technological advancement of Computer-Aided Design (CAD) systems, the current work proposes the use of FOSS CAD software in order to extend its capabilities and transform it gradually into a FOSS BIM platform. Towards this undertaking, a first approach to developing a spatial Database Management System (DBMS) able to store, organize and manage the overall amount of information within a single application is presented.

  11. Prefetching Simulation Objects in a Persistent Simulation Environment

    DTIC Science & Technology

    1989-11-01

    ...DBMS, POSTGRES [Stonebraker and Rowe 1985]. ...parameters (described in section 5) to help determine the amount of speedup we can attain. Nonetheless, factors such... In our implementation of PSE, we replaced POSTGRES with our own file... the sophisticated DBMS facilities offered by POSTGRES... a uniprocessor system, and therefore we would like to eventually test prefetching

  12. Active in-database processing to support ambient assisted living systems.

    PubMed

    de Morais, Wagner O; Lundström, Jens; Wickström, Nicholas

    2014-08-12

    As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare.
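
    The trigger-plus-stored-logic pattern the abstract describes can be sketched with any DBMS that supports triggers. Below is a minimal illustration using Python's built-in SQLite; the table, column, and event names are invented for the example, and the paper's actual platform and detection rules are not reproduced here.

```python
# Sketch of "active in-database processing": a trigger reacts to new
# sensor rows entirely inside the database, so the raw data never
# leaves the DBMS. Schema and rule are illustrative only.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE bed_sensor (ts TEXT, occupied INTEGER);
CREATE TABLE bed_exit_events (ts TEXT);
-- Active rule: record a bed-exit event whenever occupancy drops to 0.
CREATE TRIGGER detect_bed_exit AFTER INSERT ON bed_sensor
WHEN NEW.occupied = 0
BEGIN
    INSERT INTO bed_exit_events VALUES (NEW.ts);
END;
""")
db.execute("INSERT INTO bed_sensor VALUES ('2014-08-12T03:10', 1)")
db.execute("INSERT INTO bed_sensor VALUES ('2014-08-12T03:42', 0)")
print(db.execute("SELECT * FROM bed_exit_events").fetchall())
# -> [('2014-08-12T03:42',)]
```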

  13. Active In-Database Processing to Support Ambient Assisted Living Systems

    PubMed Central

    de Morais, Wagner O.; Lundström, Jens; Wickström, Nicholas

    2014-01-01

    As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare. PMID:25120164

  14. Analysis of Information Requirements and Design of the Consolidated AFIT Database and Information System (CADIS) with an AFIT/CI Implementation Design.

    DTIC Science & Technology

    1982-12-01

    management, plus the comments received from the faculty and staff. A major assumption in this thesis is that automated database techniques offer the... and major advantage of a DBMS is that of real-time, on-line data accessibility. Routine queries, reports and ad hoc queries can be performed... used or as applications programs evolve. Such changes can have a major impact on the organization and storage of data and ultimately on the response

  15. Observational database for studies of nearby universe

    NASA Astrophysics Data System (ADS)

    Kaisina, E. I.; Makarov, D. I.; Karachentsev, I. D.; Kaisin, S. S.

    2012-01-01

    We present the description of a database of galaxies of the Local Volume (LVG), located within 10 Mpc around the Milky Way. It contains more than 800 objects. Based on an analysis of functional capabilities, we used the PostgreSQL DBMS as a management system for our LVG database. Applying semantic modelling methods, we developed a physical ER-model of the database. We describe the developed architecture of the database table structure, and the implemented web-access, available at http://www.sao.ru/lv/lvgdb.

  16. Performance Modeling in CUDA Streams - A Means for High-Throughput Data Processing

    PubMed Central

    Li, Hao; Yu, Di; Kumar, Anand; Tu, Yi-Cheng

    2015-01-01

    Push-based database management systems (DBMS) are a new type of data processing software that streams large volumes of data to concurrent query operators. The high data rate of such systems requires large computing power from the query engine. In our previous work, we built a push-based DBMS named G-SDMS to harness the unrivaled computational capabilities of modern GPUs. A major design goal of G-SDMS is to support concurrent processing of heterogeneous query processing operations and enable resource allocation among such operations. Understanding the performance of operations as a result of resource consumption is thus a premise in the design of G-SDMS. With NVIDIA’s CUDA framework as the system implementation platform, we present our recent work on performance modeling of CUDA kernels running concurrently under a runtime mechanism named CUDA stream. Specifically, we explore the connection between performance and resource occupancy of compute-bound kernels and develop a model that can predict the performance of such kernels. Furthermore, we provide an in-depth anatomy of the CUDA stream mechanism and summarize the main kernel scheduling disciplines in it. Our models and derived scheduling disciplines are verified by extensive experiments using synthetic and real-world CUDA kernels. PMID:26566545

  17. Building a highly available and intrusion tolerant Database Security and Protection System (DSPS).

    PubMed

    Cai, Liang; Yang, Xiao-Hu; Dong, Jin-Xiang

    2003-01-01

    Database Security and Protection System (DSPS) is a security platform for fighting malicious DBMSs. Security and performance are critical to DSPS. The authors suggest a key management scheme that combines the server group structure, to improve availability, with the key distribution structure needed for proactive security. This paper details the implementation of proactive security in DSPS. After a thorough performance analysis, the authors conclude that the performance difference between the replicated mechanism and the proactive mechanism becomes smaller and smaller as the number of concurrent connections increases, and that proactive security is very useful and practical for large, critical applications.

  18. Heterogeneous distributed databases: A case study

    NASA Technical Reports Server (NTRS)

    Stewart, Tracy R.; Mukkamala, Ravi

    1991-01-01

    Alternatives are reviewed for accessing distributed heterogeneous databases, and a recommended solution is proposed. The current study is limited to the Automated Information Systems Center at the Naval Sea Combat Systems Engineering Station at Norfolk, VA. This center maintains two databases located on Digital Equipment Corporation's VAX computers running under the VMS operating system. The first database, ICMS, resides on a VAX 11/780 and has been implemented using VAX DBMS, a CODASYL-based system. The second database, CSA, resides on a VAX 6460 and has been implemented using the ORACLE relational database management system (RDBMS). Both databases are used for configuration management within the U.S. Navy, and each supports a different customer base. ICMS tracks U.S. Navy ships and major systems (anti-sub, sonar, etc.). Even though the major systems on ships and submarines have totally different functions, some of the equipment within the major systems is common to both ships and submarines.

  19. Leucosome distribution in migmatitic paragneisses and orthogneisses: A record of self-organized melt migration and entrapment in a heterogeneous partially-molten crust

    NASA Astrophysics Data System (ADS)

    Yakymchuk, C.; Brown, M.; Ivanic, T. J.; Korhonen, F. J.

    2013-09-01


  20. Bio-optical data integration based on a 4 D database system approach

    NASA Astrophysics Data System (ADS)

    Imai, N. N.; Shimabukuro, M. H.; Carmo, A. F. C.; Alcantara, E. H.; Rodrigues, T. W. P.; Watanabe, F. S. Y.

    2015-04-01

    Bio-optical characterization of water bodies requires spatio-temporal data about Inherent Optical Properties and Apparent Optical Properties, which allow the comprehension of the underwater light field and support the development of models for monitoring water quality. Measurements are taken to represent optical properties along a column of water, and the spectral data must then be related to depth. However, the spatial positions of measurements may differ because collecting instruments vary, and the records may not refer to the same wavelengths. An additional difficulty is that distinct instruments store data in different formats. A data integration approach is needed to make these large, multi-source data sets suitable for analysis, so that semi-empirical models can be evaluated, even automatically, preceded by preliminary quality-control tasks. This work presents a solution for the stated scenario based on a spatial (geographic) database approach, adopting an object-relational Database Management System (DBMS) because of its ability to represent all data collected in the field, together with data obtained by laboratory analysis and remote sensing images taken at the time of field data collection. This integration leads to a 4D representation, since the coordinate system includes 3D spatial coordinates (planimetric and depth) and the time when each datum was taken. The PostgreSQL DBMS, extended by the PostGIS module to manage spatial/geospatial data, was adopted. A prototype was developed that provides the main tools an analyst needs to prepare the data sets for analysis.
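
    A minimal sketch of the 4D keying described above (planimetric position, depth, and time per measurement) follows, with Python's built-in SQLite standing in for PostgreSQL/PostGIS; the table, columns, and sample rows are our own illustration, not the authors' schema.

```python
# Each measurement is keyed by x, y, depth and time, plus the spectral
# coordinate (wavelength). A depth profile is then a simple query.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE radiometry (
    x REAL, y REAL, depth_m REAL, taken_at TEXT,
    wavelength_nm REAL, value REAL)""")
db.executemany("INSERT INTO radiometry VALUES (?,?,?,?,?,?)", [
    (451.2, 7742.9, 0.5, "2014-05-13T10:02", 560.0, 0.031),
    (451.2, 7742.9, 1.0, "2014-05-13T10:03", 560.0, 0.024),
])
# Profile query: one wavelength, ordered by depth.
for row in db.execute("""SELECT depth_m, value FROM radiometry
                         WHERE wavelength_nm = 560.0 ORDER BY depth_m"""):
    print(row)
```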

  1. Mississippi State University Center for Air Sea Technology. FY93 and FY 94 Research Program in Navy Ocean Modeling and Prediction

    DTIC Science & Technology

    1994-09-30

    relational versus object oriented DBMS, knowledge discovery, data models, metadata, data filtering, clustering techniques, and synthetic data. A secondary... The first was the investigation of AI/ES applications (knowledge discovery, data mining, and clustering). Here CAST collaborated with Dr. Fred Petry... knowledge discovery system based on clustering techniques; implemented an on-line data browser to the DBMS; completed preliminary efforts to apply object

  2. Knowledge based systems: A critical survey of major concepts, issues and techniques. Visuals

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Kavi, Srinu

    1984-01-01

    This Working Paper Series entry represents a collection of presentation visuals associated with the companion report entitled, Knowledge Based Systems: A Critical Survey of Major Concepts, Issues, and Techniques, USL/DBMS NASA/RECON Working Paper Series report number DBMS.NASA/RECON-9. The objectives of the report are to: examine various techniques used to build the KBS; to examine at least one KBS in detail, i.e., a case study; to list and identify limitations and problems with the KBS; to suggest future areas of research; and to provide extensive reference materials.

  3. The epiphany of data warehousing technologies in the pharmaceutical industry.

    PubMed

    Barrett, J S; Koprowski, S P

    2002-03-01

    The highly competitive pharmaceutical industry has seen many external changes to its landscape as companies consume each other increasing their pipelines while removing redundant functions and processes. Internally, companies have sought to streamline the discovery and development phases in an attempt to improve candidate selection and reduce the time to regulatory filing. In conjunction with efforts to screen and develop more compounds faster and more efficiently, database management systems (DBMS) have been developed for numerous groups supporting various R&D efforts. An outgrowth of DBMS evolution has been the birth of data warehousing. Often confused with DBMS, data warehousing provides a conduit for data residing across platforms, networks, and in different data structures. Through the use of metadata, the warehouse establishes connectivity of varied data stores (operational detail data, ODD) and permits identification of data ownership, location and transaction history. This evolution has closely mirrored and in some ways been driven by the electronic submission (formerly CANDA). The integration of the electronic submissions and document management with R&D data warehousing initiatives should provide a platform by which companies can address compliance with 21 CFR Part 11. Now more than ever "corporate memory" is being extended to the data itself. The when, why and how of successes and failures are constantly being probed by R&D management teams. The volume of information being generated by today's pharmaceutical companies requires mining of historical data on a routine basis. Data warehousing represents a core technology to assist in this endeavor. New initiatives in this field address the necessity of data portals through which warehouse data can be web-enabled and exploited by diverse data customers both internal and external to the company. The epiphany of data warehousing technologies within the pharmaceutical industry has begun and promises to change the way in which companies process and provide data to regulatory agencies. The improvements in drug discovery and reduction in development timelines remain to be seen but would seem to be rational if such technology is fully exploited.

  4. I&W (Indications and Warning) Data Base Management Systems Analysis.

    DTIC Science & Technology

    1983-06-01

    [Only OCR fragments of the report's tables survive: an actor/country code table (e.g., MAC Macao, TAZ Tanzania, MAW Malawi, TOG Togo), a list of conflict scenarios (Berlin conflict, Sino-Soviet conflicts, Indonesia-Malaysia disputes, India-China conflicts), and a test series summary description (I&W DBMS Analysis, Test ID: F6); no coherent abstract is recoverable.]

  5. Enforcing compatibility and constraint conditions and information retrieval at the design action

    NASA Technical Reports Server (NTRS)

    Woodruff, George W.

    1990-01-01

    The design of complex entities is a multidisciplinary process involving several interacting groups and disciplines. There is a need to integrate the data in such environments to enhance the collaboration between these groups and to enforce compatibility between dependent data entities. This paper discusses the implementation of a workstation-based CAD system that is integrated with a DBMS and an expert system, CLIPS (both implemented on a minicomputer), to provide such collaborative and compatibility-enforcement capabilities. The current implementation allows for a three-way link between the CAD system, the DBMS and CLIPS. The engineering design process associated with the design and fabrication of sheet metal housings for computers in a large computer manufacturing facility provides the basis for this prototype system.

  6. A spatial DB model to simulate the road network efficiency in hydrogeological emergency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michele, Mangiameli, E-mail: michele.mangiameli@dica.unict.it; Giuseppe, Mussumeci

    We deal with the theme of the simulation of risk analysis using a technological approach based on the integration of exclusively free and open source tools: PostgreSQL as the Database Management System (DBMS) and Quantum GIS-GRASS as the Geographic Information System (GIS) platform. The case study is a seismic area in Sicily characterized by steep slopes and frequent instability phenomena. This area includes a city of about 30,000 inhabitants (Enna) that lies on the top of a mountain at about 990 m a.s.l. Access to the city is provided by a few very winding roads that are also highly vulnerable to seismic and hydrogeological hazards. When exceptional rainfall events occur, the loss of efficiency of these roads can compromise the timeliness and effectiveness of rescue operations. The data of the sample area have been structured in the adopted DBMS, and the connection to the GIS functionalities allows such exceptional events to be simulated. We analyzed the hazard, vulnerability and exposure related to these events and calculated the final risk, defining three classes for each scenario: low (L), medium (M) and high (H). This study can be a valuable tool to prioritize risk levels and set priorities for intervention on the main road networks.

  7. A spatial DB model to simulate the road network efficiency in hydrogeological emergency

    NASA Astrophysics Data System (ADS)

    Michele, Mangiameli; Giuseppe, Mussumeci

    2015-12-01

    We deal with the theme of the simulation of risk analysis using a technological approach based on the integration of exclusively free and open source tools: PostgreSQL as the Database Management System (DBMS) and Quantum GIS-GRASS as the Geographic Information System (GIS) platform. The case study is a seismic area in Sicily characterized by steep slopes and frequent instability phenomena. This area includes a city of about 30,000 inhabitants (Enna) that lies on the top of a mountain at about 990 m a.s.l. Access to the city is provided by a few very winding roads that are also highly vulnerable to seismic and hydrogeological hazards. When exceptional rainfall events occur, the loss of efficiency of these roads can compromise the timeliness and effectiveness of rescue operations. The data of the sample area have been structured in the adopted DBMS, and the connection to the GIS functionalities allows such exceptional events to be simulated. We analyzed the hazard, vulnerability and exposure related to these events and calculated the final risk, defining three classes for each scenario: low (L), medium (M) and high (H). This study can be a valuable tool to prioritize risk levels and set priorities for intervention on the main road networks.

  8. Network Configuration of Oracle and Database Programming Using SQL

    NASA Technical Reports Server (NTRS)

    Davis, Melton; Abdurrashid, Jibril; Diaz, Philip; Harris, W. C.

    2000-01-01

    A database can be defined as a collection of information organized in such a way that it can be retrieved and used. A database management system (DBMS) can further be defined as the tool that enables us to manage and interact with the database. The Oracle 8 Server is a state-of-the-art information management environment. It is a repository for very large amounts of data, and gives users rapid access to that data. The Oracle 8 Server allows for sharing of data between applications; the information is stored in one place and used by many systems. My research will focus primarily on SQL (Structured Query Language) programming. SQL is the way you define and manipulate data in Oracle's relational database. SQL is the industry standard adopted by all database vendors. When programming with SQL, you work on sets of data (i.e., information is not processed one record at a time).
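
    The set-at-a-time point in this abstract is easy to show concretely. The sketch below uses Python's built-in SQLite in place of the Oracle 8 Server, with an invented table: a single UPDATE statement operates on every qualifying row at once, with no per-record loop in the client.

```python
# Set-based SQL: one statement updates the whole qualifying set.
# SQLite stands in for Oracle; the employees table is invented.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE employees (name TEXT, dept TEXT, salary REAL)")
db.executemany("INSERT INTO employees VALUES (?,?,?)",
               [("Ada", "ENG", 900.0), ("Grace", "ENG", 950.0),
                ("Linus", "OPS", 800.0)])
# No row-at-a-time loop: the WHERE clause defines the set to modify.
db.execute("UPDATE employees SET salary = salary * 1.10 WHERE dept = 'ENG'")
print(db.execute("SELECT name, salary FROM employees").fetchall())
# -> [('Ada', 990.0), ('Grace', 1045.0), ('Linus', 800.0)]
```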

  9. Planar Diamond-Based Multiarrays to Monitor Neurotransmitter Release and Action Potential Firing: New Perspectives in Cellular Neuroscience.

    PubMed

    Carabelli, Valentina; Marcantoni, Andrea; Picollo, Federico; Battiato, Alfio; Bernardi, Ettore; Pasquarelli, Alberto; Olivero, Paolo; Carbone, Emilio

    2017-02-15

    High biocompatibility, outstanding electrochemical responsiveness, inertness, and transparency make diamond-based multiarrays (DBMs) first-rate biosensors for in vitro detection of electrochemical and electrical signals from excitable cells together, with potential for in vivo applications as neural interfaces and prostheses. Here, we will review the electrochemical and physical properties of various DBMs and how these devices have been employed for recording released neurotransmitter molecules and all-or-none action potentials from living cells. Specifically, we will overview how DBMs can resolve localized exocytotic events from subcellular compartments using high-density microelectrode arrays (MEAs), or monitoring oxidizable neurotransmitter release from populations of cells in culture and tissue slices using low-density MEAs. Interfacing DBMs with excitable cells is currently leading to the promising opportunity of recording electrical signals as well as creating neuronal interfaces through the same device. Given the recent increasingly growing development of newly available DBMs of various geometries to monitor electrical activity and neurotransmitter release in a variety of excitable and neuronal tissues, the discussion will be limited to planar DBMs.

  10. An approach for management of geometry data

    NASA Technical Reports Server (NTRS)

    Dube, R. P.; Herron, G. J.; Schweitzer, J. E.; Warkentine, E. R.

    1980-01-01

    The strategies for managing Integrated Programs for Aerospace Design (IPAD) computer-based geometry are described. The computer model of geometry is the basis for communication, manipulation, and analysis of shape information. IPAD's data base system makes this information available to all authorized departments in a company. A discussion of the data structures and algorithms required to support geometry in IPIP (IPAD's data base management system) is presented. Through the use of IPIP's data definition language, the structure of the geometry components is defined. The data manipulation language is the vehicle by which a user defines an instance of the geometry. The manipulation language also allows a user to edit, query, and manage the geometry. The selection of canonical forms is a very important part of the IPAD geometry. IPAD has a canonical form for each entity and provides transformations to alternate forms; in particular, IPAD will provide a transformation to the ANSI standard. The DBMS schemas required to support IPAD geometry are explained.

  11. DOEDEF Software System, Version 2. 2: Operational instructions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meirans, L.

    The DOEDEF (Department of Energy Data Exchange Format) Software System is a collection of software routines written to facilitate the manipulation of IGES (Initial Graphics Exchange Specification) data. Typically, the IGES data has been produced by the IGES processors of a Computer-Aided Design (CAD) system, and the data manipulations are user-defined 'flavoring' operations. The DOEDEF Software System is used in conjunction with the RIM (Relational Information Management) DBMS from Boeing Computer Services (Version 7, UD18 or higher). The three major pieces of the software system are: the Parser, which reads an ASCII IGES file and converts it to the RIM database equivalent; the Kernel, which provides the user with IGES-oriented interface routines to the database; and the Filewriter, which writes the RIM database to an IGES file.

  12. DBMS UTILIZATION: A Corporate Information System (CIS) development approach

    NASA Technical Reports Server (NTRS)

    Rozett, P.

    1983-01-01

    The Corporate Information System (CIS), an integrated information system intended to tie the corporation together as a functioning entity, is described. In addition to being a major upgraded automated data processing system, the CIS is a management philosophy which recognizes data as a valuable corporate resource and which distinguishes between data and selected data, or information. It further recognizes that different users need different kinds of information. Plans for CIS development are discussed. It will offer its users not just after-the-fact data, but timely information in a format that is meaningful and useful to the particular user, so that the information can be applied in planning, controlling, and decision making by all levels of management. In effect, CIS will help the corporation itself to function as a total, integrated system by tying together administrative activities through information exchange. The CIS supports the operational, tactical control, and strategic planning functions of the corporation. Operational functions are the day-to-day processing necessary to support the corporation's work, such as purchasing and payroll.

  13. Automatic labeling and characterization of objects using artificial neural networks

    NASA Technical Reports Server (NTRS)

    Campbell, William J.; Hill, Scott E.; Cromp, Robert F.

    1989-01-01

    Existing NASA supported scientific data bases are usually developed, managed and populated in a tedious, error prone and self-limiting way in terms of what can be described in a relational Data Base Management System (DBMS). The next generation of Earth remote sensing platforms, i.e., the Earth Observation System (EOS), will be capable of generating data at a rate of over 300 Mb per second from a suite of instruments designed for different applications. What is needed is an innovative approach that creates object-oriented databases that segment, characterize, catalog and are manageable in a domain-specific context and whose contents are available interactively and in near-real-time to the user community. Described here is work in progress that utilizes an artificial neural net approach to characterize satellite imagery of undefined objects into high-level data objects. The characterized data is then dynamically allocated to an object-oriented data base where it can be reviewed and assessed by a user. The definition, development, and evolution of the overall data system model are steps in the creation of an application-driven knowledge-based scientific information system.

  14. The utilization of neural nets in populating an object-oriented database

    NASA Technical Reports Server (NTRS)

    Campbell, William J.; Hill, Scott E.; Cromp, Robert F.

    1989-01-01

    Existing NASA supported scientific data bases are usually developed, managed and populated in a tedious, error prone and self-limiting way in terms of what can be described in a relational Data Base Management System (DBMS). The next generation of Earth remote sensing platforms, i.e., the Earth Observation System (EOS), will be capable of generating data at a rate of over 300 Mb per second from a suite of instruments designed for different applications. What is needed is an innovative approach that creates object-oriented databases that segment, characterize, catalog and are manageable in a domain-specific context and whose contents are available interactively and in near-real-time to the user community. Described here is work in progress that utilizes an artificial neural net approach to characterize satellite imagery of undefined objects into high-level data objects. The characterized data is then dynamically allocated to an object-oriented data base where it can be reviewed and assessed by a user. The definition, development, and evolution of the overall data system model are steps in the creation of an application-driven knowledge-based scientific information system.

  15. The BioIntelligence Framework: a new computational platform for biomedical knowledge computing.

    PubMed

    Farley, Toni; Kiefer, Jeff; Lee, Preston; Von Hoff, Daniel; Trent, Jeffrey M; Colbourn, Charles; Mousses, Spyro

    2013-01-01

    Breakthroughs in molecular profiling technologies are enabling a new data-intensive approach to biomedical research, with the potential to revolutionize how we study, manage, and treat complex diseases. The next great challenge for clinical applications of these innovations will be to create scalable computational solutions for intelligently linking complex biomedical patient data to clinically actionable knowledge. Traditional database management systems (DBMS) are not well suited to representing complex syntactic and semantic relationships in unstructured biomedical information, introducing barriers to realizing such solutions. We propose a scalable computational framework for addressing this need, which leverages a hypergraph-based data model and query language that may be better suited for representing complex multi-lateral, multi-scalar, and multi-dimensional relationships. We also discuss how this framework can be used to create rapid learning knowledge base systems to intelligently capture and relate complex patient data to biomedical knowledge in order to automate the recovery of clinically actionable information.
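
    A hyperedge relates any number of entities in one unit, which is the representational advantage the authors point to over binary relations. The toy Python class below illustrates the data model only; it is not the BioIntelligence Framework's API, and the node labels are invented.

```python
# Toy hypergraph: each hyperedge links an arbitrary set of nodes
# (patient, variant, drug, outcome) in a single relationship.

class Hypergraph:
    def __init__(self):
        self.edges = []                    # each edge: (label, set of nodes)

    def add_edge(self, label, *nodes):
        self.edges.append((label, set(nodes)))

    def incident(self, node):
        """All hyperedges that mention a given node."""
        return [(label, nodes) for label, nodes in self.edges if node in nodes]

hg = Hypergraph()
hg.add_edge("observed_response", "patient:17", "variant:CYP2D6*4",
            "drug:tamoxifen", "outcome:poor_metabolizer")
hg.add_edge("trial_arm", "patient:17", "study:NCT000")
for label, nodes in hg.incident("patient:17"):
    print(label, sorted(nodes))
```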

  16. Executing Complexity-Increasing Queries in Relational (MySQL) and NoSQL (MongoDB and EXist) Size-Growing ISO/EN 13606 Standardized EHR Databases

    PubMed Central

    Sánchez-de-Madariaga, Ricardo; Muñoz, Adolfo; Castro, Antonio L; Moreno, Oscar; Pascual, Mario

    2018-01-01

    This research presents a protocol to assess the computational complexity of querying relational and non-relational (NoSQL (not only Structured Query Language)) standardized electronic health record (EHR) medical information database systems (DBMS). It uses a set of three doubling-sized databases, i.e., databases storing 5000, 10,000 and 20,000 realistic standardized EHR extracts, in three different database management systems (DBMS): relational MySQL object-relational mapping (ORM), document-based NoSQL MongoDB, and native extensible markup language (XML) NoSQL eXist. The average response times to six complexity-increasing queries were computed, and the results showed a linear behavior in the NoSQL cases. In the NoSQL field, MongoDB presents a much flatter linear slope than eXist. NoSQL systems may also be more appropriate for maintaining standardized medical information systems, due to the special nature of the updating policies of medical information, which should not affect the consistency and efficiency of the data stored in NoSQL databases. One limitation of this protocol is the lack of direct results for improved relational systems such as archetype relational mapping (ARM) with the same data. However, the interpolation of doubling-size database results to those presented in the literature and other published results suggests that NoSQL systems might be more appropriate in many specific scenarios and problems to be solved. For example, NoSQL may be appropriate for document-based tasks such as EHR extracts used in clinical practice, for editing and visualization, or for situations where the aim is not only to query medical information but also to restore the EHR in exactly its original form. PMID:29608174
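
    The protocol itself (doubling-sized databases, mean response time per query) is straightforward to reproduce in miniature. The sketch below uses Python with SQLite and synthetic rows in place of the MySQL/MongoDB/eXist installations and real EHR extracts; the query is an arbitrary stand-in.

```python
# Benchmark skeleton: build doubling-sized databases, then record the
# mean response time of a query against each. All names are invented.
import sqlite3, time

def mean_response_time(db, query, repeats=5):
    t0 = time.perf_counter()
    for _ in range(repeats):
        db.execute(query).fetchall()
    return (time.perf_counter() - t0) / repeats

for size in (5000, 10000, 20000):            # doubling-sized databases
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE extracts (id INTEGER, payload TEXT)")
    db.executemany("INSERT INTO extracts VALUES (?,?)",
                   ((i, "ehr") for i in range(size)))
    t = mean_response_time(db, "SELECT COUNT(*) FROM extracts WHERE id % 7 = 0")
    print(size, f"{t * 1e3:.2f} ms")
```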

  17. Executing Complexity-Increasing Queries in Relational (MySQL) and NoSQL (MongoDB and EXist) Size-Growing ISO/EN 13606 Standardized EHR Databases.

    PubMed

    Sánchez-de-Madariaga, Ricardo; Muñoz, Adolfo; Castro, Antonio L; Moreno, Oscar; Pascual, Mario

    2018-03-19

    This research presents a protocol to assess the computational complexity of querying relational and non-relational (NoSQL (not only Structured Query Language)) standardized electronic health record (EHR) medical information database systems (DBMS). It uses a set of three doubling-sized databases, i.e., databases storing 5000, 10,000 and 20,000 realistic standardized EHR extracts, in three different database management systems (DBMS): relational MySQL object-relational mapping (ORM), document-based NoSQL MongoDB, and native extensible markup language (XML) NoSQL eXist. The average response times to six complexity-increasing queries were computed, and the results showed a linear behavior in the NoSQL cases. In the NoSQL field, MongoDB presents a much flatter linear slope than eXist. NoSQL systems may also be more appropriate for maintaining standardized medical information systems, due to the special nature of the updating policies of medical information, which should not affect the consistency and efficiency of the data stored in NoSQL databases. One limitation of this protocol is the lack of direct results for improved relational systems such as archetype relational mapping (ARM) with the same data. However, the interpolation of doubling-size database results to those presented in the literature and other published results suggests that NoSQL systems might be more appropriate in many specific scenarios and problems to be solved. For example, NoSQL may be appropriate for document-based tasks such as EHR extracts used in clinical practice, for editing and visualization, or for situations where the aim is not only to query medical information but also to restore the EHR in exactly its original form.

  18. m-Diethynylbenzene macrocycles: syntheses and self-association behavior in solution.

    PubMed

    Tobe, Yoshito; Utsumi, Naoto; Kawabata, Kazuya; Nagano, Atsushi; Adachi, Kiyomi; Araki, Shunji; Sonoda, Motohiro; Hirose, Keiji; Naemura, Koichiro

    2002-05-15

    m-Diethynylbenzene macrocycles (DBMs), buta-1,3-diyne-bridged [4(n)]metacyclophanes, have been synthesized and their self-association behavior in solution investigated. Cyclic tetramers, hexamers, and octamers of DBMs having exo-annular octyl, hexadecyl, and 3,6,9-trioxadecyl ester groups were prepared by intermolecular oxidative coupling of dimer units or intramolecular cyclization of the corresponding open-chain oligomers. The aggregation properties were investigated by two methods, (1)H NMR spectroscopy and vapor pressure osmometry (VPO). Although some discrepancies were observed between the association constants obtained from the two methods, the qualitative pictures were consistent with each other. The analysis of self-aggregation by VPO revealed unique aggregation behavior of DBMs in acetone and toluene which was not elucidated by the NMR method. Namely, the association constants for infinite association are several times larger than the dimerization constant, suggesting that the aggregation is enhanced by the formation of dimers (a nucleation mechanism). In polar solvents, DBMs aggregate more strongly than in chloroform due to the solvophobic interactions between the macrocyclic framework and the solvents. Moreover, DBMs self-associate in aromatic solvents such as toluene and o-xylene more readily than in chloroform. In particular, the hexameric DBM having a large macrocyclic cavity exhibits extremely large association constants in aromatic solvents. By comparing the aggregation properties of DBMs with those of the corresponding acyclic oligomers, the effect of the macrocyclic structure on the aggregation propensity was clarified. Finally, it turned out that DBMs tend to aggregate more readily than the corresponding phenylacetylene macrocycles, acetylene-bridged [2(n)]metacyclophanes, owing to the withdrawal of electron density from the aromatic rings by the butadiyne linkages, which facilitates pi-pi stacking interactions.

  19. Depth to the bottom of magnetic sources (DBMS) from aeromagnetic data of Central India using modified centroid method for fractal distribution of sources

    NASA Astrophysics Data System (ADS)

    Bansal, A. R.; Anand, S.; Rajaram, M.; Rao, V.; Dimri, V. P.

    2012-12-01

    The depth to the bottom of the magnetic sources (DBMS) may be used as an estimate of the Curie-point depth. The DBMS can also be interpreted in terms of the thermal structure of the crust. The thermal structure of the crust is a sensitive parameter and depends on many properties of the crust, e.g., modes of deformation, depths of brittle and ductile deformation zones, regional heat flow variations, seismicity, subsidence/uplift patterns, and the maturity of organic matter in sedimentary basins. The conventional centroid method of DBMS estimation assumes a random, uniform, uncorrelated distribution of sources; to overcome this limitation, a modified centroid method based on fractal distribution has been proposed. We applied this modified centroid method to the aeromagnetic data of the central Indian region and selected 29 half-overlapping blocks of dimension 200 km x 200 km covering different parts of central India. Shallower values of the DBMS are found for the western and southern portions of the Indian shield. The DBMS values are as shallow as the middle crust in the south-west Deccan trap and probably deeper than the Moho in the Chhattisgarh basin. In a few places the DBMS is close to the Moho depth found from seismic studies, and elsewhere it is shallower than the Moho. The DBMS indicates the complex nature of the Indian crust.
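
    For reference, the conventional centroid relations behind these estimates, as commonly given in the literature (e.g., Okubo et al., 1985; Tanaka et al., 1999). The fractal-source modification rescales the power spectrum by a power of the wavenumber before these fits; the authors' exact correction is not reproduced here.

```latex
% Fits to the radially averaged power spectrum P(k) of the magnetic data:
\ln\sqrt{P(k)} \;\approx\; A - |k|\,Z_t
    % high-|k| slope gives the top depth Z_t
\ln\!\bigl(\sqrt{P(k)}/|k|\bigr) \;\approx\; B - |k|\,Z_0
    % low-|k| slope gives the centroid depth Z_0
Z_b \;=\; 2Z_0 - Z_t
    % depth to the bottom of magnetic sources (DBMS)
```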

  20. Preparation and anti-tumor efficiency evaluation of doxorubicin-loaded bacterial magnetosomes: magnetic nanoparticles as drug carriers isolated from Magnetospirillum gryphiswaldense.

    PubMed

    Sun, Jian-Bo; Duan, Jin-Hong; Dai, Shun-Ling; Ren, Jun; Guo, Lin; Jiang, Wei; Li, Ying

    2008-12-15

    Bacterial magnetosomes (BMs) are commonly used as vehicles for certain enzymes, nucleic acids and antibodies, although they have never been considered drug carriers. To evaluate the clinical potential of BMs extracted from Magnetospirillum gryphiswaldense in cancer therapy, doxorubicin (DOX) was loaded onto the purified BMs at a ratio of 0.87 +/- 0.08 mg/mg using glutaraldehyde. The DOX-coupled BMs (DBMs) and the BMs exhibited uniform sizes and morphology as evaluated by TEM. The diameters of the DBMs and BMs obtained by AFM were 71.02 +/- 6.73 and 34.93 +/- 8.24 nm, respectively. The DBMs released DOX slowly into serum and maintained at least 80% stability following 48 h of incubation. In vitro cytotoxicity tests showed that the DBMs were cytotoxic to HL60 and EMT-6 cells, manifested as inhibition of cell proliferation and suppression of c-myc expression, consistent with DOX. These observations indicate that DBMs have in vitro antitumor properties similar to those of DOX. The approach of coupling DOX to magnetosomes may have clinical potential in antitumor drug delivery.

  1. Leadership Class Configuration Interaction Code - Status and Opportunities

    NASA Astrophysics Data System (ADS)

    Vary, James

    2011-10-01

    With support from SciDAC-UNEDF (www.unedf.org), nuclear theorists have developed and are continuously improving a Leadership Class Configuration Interaction Code (LCCI) for forefront nuclear structure calculations. The aim of this project is to make state-of-the-art nuclear structure tools available to the entire community of researchers, including graduate students. The project includes codes such as NuShellX, MFDn and BIGSTICK that run on a range of computers from laptops to leadership-class supercomputers. Codes, scripts, test cases and documentation have been assembled, are under continuous development, and are scheduled for release to the entire research community in November 2011. A covering script that accesses the appropriate code and supporting files is under development. In addition, a Data Base Management System (DBMS) that records key information from large production runs and archives the results of those runs has been developed (http://nuclear.physics.iastate.edu/info/) and will be released. Following an outline of the project, the code structure, capabilities, the DBMS and current efforts, I will suggest a path forward that would benefit greatly from a significant partnership between researchers who use the codes, code developers and the National Nuclear Data efforts. This research is supported in part by DOE under grant DE-FG02-87ER40371 and grant DE-FC02-09ER41582 (SciDAC-UNEDF).

  2. Distributed and parallel approach for handle and perform huge datasets

    NASA Astrophysics Data System (ADS)

    Konopko, Joanna

    2015-12-01

    Big Data refers to the dynamic, large and disparate volumes of data coming from many different sources (tools, machines, sensors, mobile devices) that are uncorrelated with each other. It requires new, innovative and scalable technology to collect, host and analytically process such vast amounts of data. A proper architecture for systems that process huge data sets is needed. In this paper, a comparison of distributed and parallel system architectures is presented using the example of the MapReduce (MR) Hadoop platform and a parallel database platform (DBMS). This paper also analyzes the problem of extracting valuable information from petabytes of data. Both paradigms, MapReduce and parallel DBMS, are described and compared. A hybrid architecture is also proposed, which could be used to solve the analyzed problem of storing and processing Big Data.
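
    The MapReduce paradigm compared above reduces to three phases. The self-contained Python sketch below runs them in one process; real Hadoop distributes each phase across nodes, and the word-count task is the customary toy example, not the paper's workload.

```python
# Minimal in-process MapReduce: map emits key/value pairs, shuffle
# groups them by key, reduce aggregates each group.
from collections import defaultdict

def map_phase(records):
    for line in records:
        for word in line.split():
            yield word, 1

def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    return {key: sum(values) for key, values in groups.items()}

data = ["big data big volume", "parallel dbms versus mapreduce"]
print(reduce_phase(shuffle(map_phase(data))))   # {'big': 2, 'data': 1, ...}
```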

  3. Technical Aspects of Interfacing MUMPS to an External SQL Relational Database Management System

    PubMed Central

    Kuzmak, Peter M.; Walters, Richard F.; Penrod, Gail

    1988-01-01

    This paper describes an interface connecting InterSystems MUMPS (M/VX) to an external relational DBMS, the SYBASE Database Management System. The interface enables MUMPS to operate in a relational environment and gives the MUMPS language full access to a complete set of SQL commands. MUMPS generates SQL statements as ASCII text and sends them to the RDBMS. The RDBMS executes the statements and returns ASCII results to MUMPS. The interface suggests that the language features of MUMPS make it an attractive tool for use in the relational database environment. The approach described in this paper separates MUMPS from the relational database. Positioning the relational database outside of MUMPS promotes data sharing and permits a number of different options to be used for working with the data. Other languages like C, FORTRAN, and COBOL can access the RDBMS database. Advanced tools provided by the relational database vendor can also be used. SYBASE is an advanced high-performance transaction-oriented relational database management system for the VAX/VMS and UNIX operating systems. SYBASE is designed using a distributed open-systems architecture, and is relatively easy to interface with MUMPS.
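
    The "SQL as ASCII text in, ASCII rows out" pattern described here can be sketched compactly. Below, Python's built-in SQLite stands in for SYBASE and a function call stands in for the MUMPS-side transport; send_sql and the patients table are invented for the illustration.

```python
# Client builds SQL as plain text; the DBMS side executes it and
# returns rows serialized as ASCII lines, as in the MUMPS interface.
import sqlite3

def send_sql(db, sql_text):
    """Execute SQL received as text; return results as ASCII lines."""
    rows = db.execute(sql_text).fetchall()
    return "\n".join("\t".join(str(col) for col in row) for row in rows)

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE patients (id INTEGER, name TEXT)")
db.execute("INSERT INTO patients VALUES (1, 'DOE,JOHN')")
print(send_sql(db, "SELECT id, name FROM patients"))   # 1<TAB>DOE,JOHN
```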

  4. Evaluation of RDBMS packages for use in astronomy

    NASA Technical Reports Server (NTRS)

    Page, C. G.; Davenhall, A. C.

    1992-01-01

    Tabular data sets arise in many areas of astronomical data analysis, from raw data (such as photon event lists) to final results (such as source catalogs). The Starlink catalog access and reporting package, SCAR, was originally developed to handle IRAS data and it has been the principal relational DBMS in the Starlink software collection for several years. But SCAR has many limitations and is VMS-specific, while Starlink is in transition from VMS to Unix. Rather than attempt a major re-write of SCAR for Unix, it seemed more sensible to see whether any existing database packages are suitable for general astronomical use. The authors first drew up a list of desirable properties for such a system and then used these criteria to evaluate a number of packages, both free ones and those commercially available. It is already clear that most commercial DBMS packages are not very well suited to the requirements; for example, most cannot carry out efficiently even fairly basic operations such as joining two catalogs on an approximate match of celestial positions. This paper reports the results of the evaluation exercise and notes the problems in using a standard DBMS package to process scientific data. In parallel with this the authors have started to develop a simple database engine that can handle tabular data in a range of common formats including simple direct-access files (such as SCAR and Exosat DBMS tables) and FITS tables (both ASCII and binary).
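
    The operation the evaluated packages handled poorly, joining two catalogs on an approximate match of celestial positions, is concrete enough to sketch. The Python example below does a brute-force positional cross-match using a spherical law of cosines separation; production tools would use sky indexing rather than this O(n x m) loop, and the catalog rows are invented.

```python
# Approximate join on celestial position: pair rows from two catalogs
# whose angular separation is below a tolerance (default 1 arcsec).
import math

def ang_sep_deg(ra1, dec1, ra2, dec2):
    """Angular separation in degrees via the spherical law of cosines."""
    r1, d1, r2, d2 = map(math.radians, (ra1, dec1, ra2, dec2))
    c = (math.sin(d1) * math.sin(d2)
         + math.cos(d1) * math.cos(d2) * math.cos(r1 - r2))
    return math.degrees(math.acos(max(-1.0, min(1.0, c))))

def cross_match(cat_a, cat_b, tol_deg=1.0 / 3600):
    return [(a[0], b[0]) for a in cat_a for b in cat_b
            if ang_sep_deg(a[1], a[2], b[1], b[2]) <= tol_deg]

a = [("SRC1", 10.6847, 41.2690)]
b = [("CAT7", 10.6848, 41.2691), ("CAT9", 180.0, -30.0)]
print(cross_match(a, b, tol_deg=2.0 / 3600))   # SRC1 pairs with CAT7
```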

  5. USL/DBMS NASA/RECON working paper series. Standards

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Chum, Frank Y.

    1984-01-01

    The USL/DBMS NASA/RECON Working Paper Series contains a collection of reports representing results of activities being conducted by the Computer Science Department of the University of Southwestern Louisiana pursuant to the specifications of NASA Contract number NASw-3846. The work on this portion of the contract is being performed jointly by the University of Southwestern Louisiana and Southern University. This report contains the full set of standards for the development, formatting, reviewing, and issuance of entries within the USL/DBMS NASA/RECON Working Paper Series.

  6. Medical informatics in medical research - the Severe Malaria in African Children (SMAC) Network's experience.

    PubMed

    Olola, C H O; Missinou, M A; Issifou, S; Anane-Sarpong, E; Abubakar, I; Gandi, J N; Chagomerana, M; Pinder, M; Agbenyega, T; Kremsner, P G; Newton, C R J C; Wypij, D; Taylor, T E

    2006-01-01

    Computers are widely used for data management in clinical trials in developed countries, but much less so in developing countries. Dependable systems are vital for data management and medical decision making in clinical research, and monitoring and evaluation of data management are critical. In this paper we describe the database structures and procedures of systems used to implement, coordinate, and sustain data management in Africa. We outline the major lessons, challenges and successes, and give recommendations to improve the application of medical informatics in biomedical research in sub-Saharan Africa. A consortium of research units at five sites in Africa, experienced in studying children with disease, formed a new clinical trials network, Severe Malaria in African Children (SMAC). In December 2000, the network introduced an observational study involving these hospital-based sites. After prototyping, relational database management systems were implemented for data entry and verification, data submission and quality assurance monitoring. Between 2000 and 2005, 25,858 patients were enrolled. Failure to meet data submission deadlines and data entry errors correlated positively (correlation coefficient, r = 0.82), with more errors occurring when data were submitted late. Data submission lateness correlated inversely with hospital admissions (r = -0.62). Developing and sustaining a dependable DBMS, with ongoing modifications to optimize data management, is crucial for clinical studies. Monitoring and communication systems are vital for good data management in multi-center networks. Data timeliness is associated with data quality and hospital admissions.

  7. Kingfisher: a system for remote sensing image database management

    NASA Astrophysics Data System (ADS)

    Bruzzo, Michele; Giordano, Ferdinando; Dellepiane, Silvana G.

    2003-04-01

    At present, retrieval methods in remote sensing image databases are mainly based on spatial-temporal information. The increasing number of images collected by the ground stations of earth observing systems emphasizes the need for database management with intelligent data retrieval capabilities. The purpose of the proposed method is to realize a new content-based retrieval system for remote sensing image databases with an innovative search tool based on image similarity. This methodology is quite innovative for this application: many systems exist for photographic images, for example QBIC and IKONA, but they are not able to extract and describe remote sensing image content properly. The target database is an archive of images originated from an X-SAR sensor (spaceborne mission, 1994). The best content descriptors, mainly texture parameters, guarantee high retrieval performance and can be extracted without loss independently of image resolution. The latter property allows the DBMS (Database Management System) to process a small amount of information, as in the case of quick-look images, improving time performance and memory access without reducing retrieval accuracy. The matching technique has been designed to enable image management (database population and retrieval) independently of image dimensions (width and height). Local and global content descriptors are compared with those of the query image during the retrieval phase, and the results are very encouraging.
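
    The retrieval idea above -- describe each image by compact content features, then rank the archive by feature distance to the query -- can be sketched as follows. The features here (gray-level statistics and mean gradient magnitudes) are crude stand-ins for the paper's texture descriptors:

        import numpy as np

        def texture_features(img):
            """Toy content descriptors for a 2D gray-level image."""
            gy, gx = np.gradient(img.astype(float))
            return np.array([img.mean(), img.std(),
                             np.abs(gx).mean(), np.abs(gy).mean()])

        def retrieve(query_img, archive, k=5):
            """Rank archive images (name -> array) by feature-space distance."""
            q = texture_features(query_img)
            ranked = sorted(archive.items(),
                            key=lambda kv: np.linalg.norm(texture_features(kv[1]) - q))
            return [name for name, _ in ranked[:k]]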

  8. USL/DBMS NASA/PC R and D project system design standards

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Moreau, Dennis R.

    1984-01-01

    A set of system design standards intended to assure the completeness and quality of designs developed for PC research and development projects is established. The standards presented address the areas of problem definition, initial design plan, design specification, and re-evaluation.

  9. Myths and realities: Defining re-engineering for a large organization

    NASA Technical Reports Server (NTRS)

    Yin, Sandra; Mccreary, Julia

    1992-01-01

    This paper describes the background and results of three studies concerning software reverse engineering, re-engineering, and reuse (R3) hosted by the Internal Revenue Service in 1991 and 1992. The situation at the Internal Revenue Service--aging, piecemeal computer systems and outdated technology maintained by a large staff--is familiar to many institutions, especially among management information systems. The IRS is distinctive for the sheer magnitude and diversity of its problems; the country's tax records are processed using assembly language and COBOL and spread across tape and network DBMS files. How do we proceed with replacing legacy systems? The three software re-engineering studies examined methods and CASE tool support, and carried out a prototype project using re-engineering methods and tools. During the course of these projects, we discovered critical issues broader than the mechanical definitions of methods and tool technology.

  10. A context-adaptable approach to clinical guidelines.

    PubMed

    Terenziani, Paolo; Montani, Stefania; Bottrighi, Alessio; Torchio, Mauro; Molino, Gianpaolo; Correndo, Gianluca

    2004-01-01

    One of the most relevant obstacles to the use and dissemination of clinical guidelines is the gap between the generality of guidelines (as defined, e.g., by physicians' committees) and the peculiarities of the specific context of application. In particular, general guidelines do not take into account the fact that the tools needed for laboratory and instrumental investigations might be unavailable at a given hospital. Moreover, computer-based guideline managers must also be integrated with the Hospital Information System (HIS), and different hospitals usually adopt different DBMSs. The GLARE (Guideline Acquisition, Representation and Execution) system addresses these issues by providing a facility for automatic resource-based adaptation of guidelines to the specific context of application, and by providing a modular architecture in which only limited and well-localised changes are needed to integrate the system with the HIS at hand.

  11. An overview on integrated data system for archiving and sharing marine geology and geophysical data in Korea Institute of Ocean Science & Technology (KIOST)

    NASA Astrophysics Data System (ADS)

    Choi, Sang-Hwa; Kim, Sung Dae; Park, Hyuk Min; Lee, SeungHa

    2016-04-01

    We established and have operated an integrated data system for managing, archiving and sharing marine geology and geophysical data around Korea produced from various research projects and programs at the Korea Institute of Ocean Science & Technology (KIOST). First, to keep the data system consistent under continuous data updates, we set up standard operating procedures (SOPs) for data archiving, data processing and conversion, data quality control, data uploading, and DB maintenance. The system comprises two databases, ARCHIVE DB and GIS DB. ARCHIVE DB stores archived data in the original forms and formats supplied by data providers, while GIS DB manages all other compilation, processed and reproduction data and information for data services and GIS application services. The relational database management system Oracle 11g was adopted as the DBMS, and open-source GIS techniques were applied for the GIS services: OpenLayers for the user interface, GeoServer for the application server, and PostGIS on PostgreSQL for the GIS database. For convenient use of geophysical data in SEG-Y format, a viewer program was developed and embedded in the system. Users can search data through the GIS user interface and save the results as a report.

  12. Implications for the crustal Architecture in West Antarctica revealed by the means of depth-to-the-bottom of the magnetic source (DBMS) mapping and 3D FEM geothermal heat flux models

    NASA Astrophysics Data System (ADS)

    Dziadek, Ricarda; Gohl, Karsten; Kaul, Norbert

    2017-04-01

    The West Antarctic Rift System (WARS) is one of the largest rift systems in the world and displays unique coupled relationships between tectonic processes and ice sheet dynamics. Palaeo-ice streams have eroded troughs across the Amundsen Sea Embayment (ASE) that today route warm ocean deep water to the West Antarctic Ice Sheet (WAIS) grounding zone and reinforce dynamic ice sheet thinning. Rift basins, which cut across West Antarctica's landward-sloping shelves, promote ice sheet instability. Young continental rift systems are regions with significantly elevated geothermal heat flux (GHF), because the transient thermal perturbation to the lithosphere caused by rifting requires 100 m.y. to reach long-term thermal equilibrium. The GHF in this region is, especially on small scales, poorly constrained and suspected to be heterogeneous, reflecting the distribution of tectonic and volcanic activity along the complex branching geometry of the WARS, which in turn reflects its multi-stage history and structural inheritance. We investigate the crustal architecture and the possible effects of the WARS rifting history on ASE ice sheet dynamics by using depth-to-the-bottom of the magnetic source (DBMS) estimates. These are based on airborne magnetic anomaly data and provide additional insight into the deeper crustal properties. With the DBMS estimates we reveal spatial changes at the bottom of the igneous crust and in the thickness of the magnetic layer, which can be further incorporated into tectonic interpretations. The DBMS also marks an important temperature transition zone of approximately 580°C and therefore serves as a boundary condition for our numerical FEM models in 2D and 3D. On balance, and by comparison with global values, we find an average GHF of 90 mW m-2 with spatial variations due to crustal heterogeneities and volcanic activity. This estimate is 30% higher than the values commonly used in ice sheet models of the ASE region.

  13. Time Series Discord Detection in Medical Data using a Parallel Relational Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodbridge, Diane; Rintoul, Mark Daniel; Wilson, Andrew T.

    Recent advances in sensor technology have made continuous real-time health monitoring available in both hospital and non-hospital settings. Because high-frequency medical sensors produce very large volumes of data, storing and processing continuous medical data is an emerging big data area. Detecting anomalies in real time is especially important for detecting and preventing patient emergencies. A time series discord is a subsequence that has the maximum difference from the rest of the time series subsequences, meaning that it exhibits abnormal or unusual data trends. In this study, we implemented two versions of a time series discord detection algorithm on a high-performance parallel database management system (DBMS) and applied them to 240 Hz waveform data collected from 9,723 patients. The initial brute-force version of the discord detection algorithm takes each possible subsequence and calculates the distance to its nearest non-self match to find the biggest discords in the time series. For the heuristic version of the algorithm, a combination of an array and a trie structure was applied to order the time series data and improve time efficiency. The study results showed efficient data loading, decoding, and discord searches over a large amount of data, benefiting from the time series discord detection algorithm and the architectural characteristics of the parallel DBMS, including data compression, data pipelining, and task scheduling.
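
    The brute-force discord search described above can be sketched in a few lines. This toy version runs in memory with O(n^2) subsequence comparisons; it shows only the definition being computed, not the paper's parallel DBMS implementation:

        import math

        def brute_force_discord(ts, m):
            """Return (index, score) of the length-m subsequence whose distance
            to its nearest non-self (non-overlapping) match is largest."""
            n = len(ts) - m + 1
            subs = [ts[i:i + m] for i in range(n)]

            def dist(a, b):
                return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

            best_idx, best_score = -1, -1.0
            for i in range(n):
                # only subsequences that do not overlap subs[i] may match it
                nearest = min(dist(subs[i], subs[j])
                              for j in range(n) if abs(i - j) >= m)
                if nearest > best_score:
                    best_idx, best_score = i, nearest
            return best_idx, best_score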

  14. Time Series Discord Detection in Medical Data using a Parallel Relational Database [PowerPoint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodbridge, Diane; Wilson, Andrew T.; Rintoul, Mark Daniel

    Recent advances in sensor technology have made continuous real-time health monitoring available in both hospital and non-hospital settings. Because high-frequency medical sensors produce very large volumes of data, storing and processing continuous medical data is an emerging big data area. Detecting anomalies in real time is especially important for detecting and preventing patient emergencies. A time series discord is a subsequence that has the maximum difference from the rest of the time series subsequences, meaning that it exhibits abnormal or unusual data trends. In this study, we implemented two versions of a time series discord detection algorithm on a high-performance parallel database management system (DBMS) and applied them to 240 Hz waveform data collected from 9,723 patients. The initial brute-force version of the discord detection algorithm takes each possible subsequence and calculates the distance to its nearest non-self match to find the biggest discords in the time series. For the heuristic version of the algorithm, a combination of an array and a trie structure was applied to order the time series data and improve time efficiency. The study results showed efficient data loading, decoding, and discord searches over a large amount of data, benefiting from the time series discord detection algorithm and the architectural characteristics of the parallel DBMS, including data compression, data pipelining, and task scheduling.

  15. The BioIntelligence Framework: a new computational platform for biomedical knowledge computing

    PubMed Central

    Farley, Toni; Kiefer, Jeff; Lee, Preston; Von Hoff, Daniel; Trent, Jeffrey M; Colbourn, Charles

    2013-01-01

    Breakthroughs in molecular profiling technologies are enabling a new data-intensive approach to biomedical research, with the potential to revolutionize how we study, manage, and treat complex diseases. The next great challenge for clinical applications of these innovations will be to create scalable computational solutions for intelligently linking complex biomedical patient data to clinically actionable knowledge. Traditional database management systems (DBMS) are not well suited to representing complex syntactic and semantic relationships in unstructured biomedical information, introducing barriers to realizing such solutions. We propose a scalable computational framework for addressing this need, which leverages a hypergraph-based data model and query language that may be better suited for representing complex multi-lateral, multi-scalar, and multi-dimensional relationships. We also discuss how this framework can be used to create rapid learning knowledge base systems to intelligently capture and relate complex patient data to biomedical knowledge in order to automate the recovery of clinically actionable information. PMID:22859646
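
    To make the hypergraph idea concrete: unlike a relational row or a binary graph edge, one hyperedge can tie together any number of entities in a single relationship. A minimal sketch follows; it is not the BioIntelligence Framework's actual data model or query language.

        from collections import defaultdict

        class Hypergraph:
            """Toy hypergraph: a hyperedge links an arbitrary set of nodes."""
            def __init__(self):
                self.edges = {}                   # edge id -> set of nodes
                self.incident = defaultdict(set)  # node -> ids of its edges

            def add_edge(self, eid, nodes):
                self.edges[eid] = set(nodes)
                for n in nodes:
                    self.incident[n].add(eid)

            def related(self, node):
                """All entities co-occurring with `node` in some hyperedge."""
                out = set()
                for eid in self.incident[node]:
                    out |= self.edges[eid]
                return out - {node}

        hg = Hypergraph()
        # one multi-lateral relationship: patient, variant and drug at once
        hg.add_edge("e1", ["patient42", "BRAF_V600E", "vemurafenib"])
        print(hg.related("patient42"))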

  16. Development, validation and clinical application of a method for the simultaneous quantification of lamivudine, emtricitabine and tenofovir in dried blood and dried breast milk spots using LC-MS/MS.

    PubMed

    Waitt, Catriona; Diliiy Penchala, Sujan; Olagunju, Adeniyi; Amara, Alieu; Else, Laura; Lamorde, Mohammed; Khoo, Saye

    2017-08-15

    To present the validation and clinical application of an LC-MS/MS method for the quantification of lamivudine (3TC), emtricitabine (FTC) and tenofovir (TFV) in dried blood spots (DBS) and dried breast milk spots (DBMS). DBS and DBMS were prepared from 50 and 30 μL of drug-spiked whole blood and human breast milk, respectively. Following extraction with acetonitrile and water, chromatographic separation utilised a Synergi polar column with a gradient mobile phase program consisting of 0.1% formic acid in water and 0.1% formic acid in acetonitrile. Detection and quantification were performed using a TSQ Quantum Ultra triple quadrupole mass spectrometer. The analytical method was used to evaluate NRTI drug levels in HIV-positive nursing mother-infant pairs. The assay was validated over the concentration range of 16.6-5000 ng/mL for 3TC, FTC and TFV in DBS and DBMS, except for TFV in DBMS, where linearity was established from 4.2-1250 ng/mL. Intra- and inter-day precision (%CV) ranged from 3.5 to 8.7, and accuracy was within 15% for all analytes in both matrices. The mean recovery was >61% in DBS and >43% in DBMS for all three analytes. Matrix effects were insignificant. Median AUC0-8 values in maternal DBS and DBMS, respectively, were 4683 (4165-6057) and 6050 (5217-6417) ng h/mL for 3TC, 3312 (2259-4312) and 4853 (4124-6691) ng h/mL for FTC, and 1559 (930-1915) and 56 (45-80) ng h/mL for TFV. 3TC and FTC were quantifiable (>16.6 ng/mL) in DBS from 2/6 and 1/6 infants, respectively, whereas TFV was undetectable in all infants. DBS and DBMS sampling for bioanalysis of 3TC, FTC and TFV is straightforward, robust, accurate and precise, and ideal for use in low-resource settings.

  17. Artificial intelligent decision support for low-cost launch vehicle integrated mission operations

    NASA Astrophysics Data System (ADS)

    Szatkowski, Gerard P.; Schultz, Roger

    1988-11-01

    The feasibility, benefits, and risks associated with Artificial Intelligence (AI) Expert Systems applied to low-cost space expendable launch vehicle systems are reviewed. This study is in support of the joint USAF/NASA effort to define the next generation of a heavy-lift Advanced Launch System (ALS), which will provide economical and routine access to space. The significant technical goals of the ALS program include a 10-fold reduction in cost per pound to orbit, launch processing in under 3 weeks, and higher reliability and safety standards than current expendables. Knowledge-based system techniques are being explored for the purpose of automating decision support processes in onboard and ground systems for pre-launch checkout and in-flight operations. Issues such as satisfying real-time requirements, providing safety validation, hardware and Data Base Management System (DBMS) interfacing, system synergistic effects, human interfaces, and ease of maintainability have an effect on the viability of expert systems as a useful tool.

  18. Artificial intelligent decision support for low-cost launch vehicle integrated mission operations

    NASA Technical Reports Server (NTRS)

    Szatkowski, Gerard P.; Schultz, Roger

    1988-01-01

    The feasibility, benefits, and risks associated with Artificial Intelligence (AI) Expert Systems applied to low-cost space expendable launch vehicle systems are reviewed. This study is in support of the joint USAF/NASA effort to define the next generation of a heavy-lift Advanced Launch System (ALS), which will provide economical and routine access to space. The significant technical goals of the ALS program include a 10-fold reduction in cost per pound to orbit, launch processing in under 3 weeks, and higher reliability and safety standards than current expendables. Knowledge-based system techniques are being explored for the purpose of automating decision support processes in onboard and ground systems for pre-launch checkout and in-flight operations. Issues such as satisfying real-time requirements, providing safety validation, hardware and Data Base Management System (DBMS) interfacing, system synergistic effects, human interfaces, and ease of maintainability have an effect on the viability of expert systems as a useful tool.

  19. Octree-based indexing for 3D pointclouds within an Oracle Spatial DBMS

    NASA Astrophysics Data System (ADS)

    Schön, Bianca; Mosa, Abu Saleh Mohammad; Laefer, Debra F.; Bertolotto, Michela

    2013-02-01

    A large proportion of today's digital datasets have a spatial component whose effective storage and management pose particular challenges, especially with light detection and ranging (LiDAR), where datasets of even small geographic areas may contain several hundred million points. While in the last decade 2.5-dimensional data were prevalent, true 3-dimensional data are increasingly commonplace via LiDAR. They have gained particular popularity for urban applications including the generation of city-scale maps, baseline data for disaster management, and utility planning. Additionally, LiDAR is commonly used for flood-plain identification, coastal-erosion tracking, and forest biomass mapping. Despite growing data availability, current spatial information systems do not provide suitable full support for the data's true 3D nature. Consequently, one system is needed to store the data and another for its processing, thereby necessitating format transformations. The work presented herein aims at a more cost-effective way of managing 3D LiDAR data that allows storage and manipulation within a single system by enabling a new index within existing spatial database management technology. An implementation of an octree index for 3D LiDAR data atop Oracle Spatial 11g is presented, along with an evaluation showing up to an eight-fold improvement compared with the native Oracle R-tree index.
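
    The indexing principle is easy to sketch: each octree node owns a cube of space and splits into eight children when it overflows, so lookups descend only into the cubes that can contain the target point. This toy in-memory version shows the structure only; the paper's contribution is serializing such an index inside Oracle Spatial tables:

        class Octree:
            """Toy point octree; cap is the leaf capacity before splitting."""
            def __init__(self, center, half, cap=8):
                self.center, self.half, self.cap = center, half, cap
                self.points, self.kids = [], None

            def insert(self, p):
                if self.kids is not None:
                    return self._child(p).insert(p)
                self.points.append(p)
                if len(self.points) > self.cap:
                    self._split()

            def _child(self, p):
                cx, cy, cz = self.center
                # one bit per axis selects among the 8 children
                i = (p[0] >= cx) | ((p[1] >= cy) << 1) | ((p[2] >= cz) << 2)
                return self.kids[i]

            def _split(self):
                cx, cy, cz = self.center
                h = self.half / 2.0
                self.kids = [Octree((cx + (h if i & 1 else -h),
                                     cy + (h if i & 2 else -h),
                                     cz + (h if i & 4 else -h)), h, self.cap)
                             for i in range(8)]
                pts, self.points = self.points, []
                for q in pts:
                    self._child(q).insert(q)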

  20. Integrating Scientific Array Processing into Standard SQL

    NASA Astrophysics Data System (ADS)

    Misev, Dimitar; Bachhuber, Johannes; Baumann, Peter

    2014-05-01

    We live in a time that is dominated by data. Data storage is cheap, and more applications than ever accrue vast amounts of data. Storing the emerging multidimensional data sets efficiently, however, and allowing them to be queried by their inherent structure, is a challenge many databases face today. Despite the fact that multidimensional array data are almost always linked to additional non-array information, array databases have mostly developed separately from relational systems, resulting in a disparity between the two database categories. The current SQL standard and SQL DBMSs support arrays -- and, in an extension, also multidimensional arrays -- but do so in a very rudimentary and inefficient way. This poster demonstrates the practicality of an SQL extension for array processing, implemented in a proof-of-concept multi-faceted system that manages a federation of array and relational database systems, providing transparent, efficient and scalable access to the heterogeneous data in them.
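
    To illustrate the kind of query such an extension targets, here is a hypothetical array-SQL statement and the operation a DBMS would perform internally, sketched with numpy (the query syntax and table name are illustrative, not the standard's):

        import numpy as np

        # hypothetical extended SQL:
        #   SELECT avg(a[50:60, 100:120]) FROM temperatures
        temperatures = {"a": np.random.rand(500, 500)}  # stand-in stored array

        def subarray_avg(table, r0, r1, c0, c1):
            """Subset by the array's inherent structure, then aggregate."""
            return float(table["a"][r0:r1, c0:c1].mean())

        print(subarray_avg(temperatures, 50, 60, 100, 120))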

  1. BioImaging Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David Nix, Lisa Simirenko

    2006-10-25

    The BioImaging Database (BID) is a relational database developed to store the data and metadata for 3D gene expression in early Drosophila embryo development at the cellular level. The schema was written to be used with the MySQL DBMS but, with minor modifications, can be used on any SQL-compliant relational DBMS.

  2. Database Design and Management in Engineering Optimization.

    DTIC Science & Technology

    1988-02-01

    ... method in the mid-1950s, along with modern digital computers, have made ... scientific and engineering applications. The paper highlights the difference ... application software can call standard subroutines from the DBMS library to define ... is continuously redefined in an application program, DDL must have ... type data usually encountered in engineering applications. GFDGT: computes the number of digits needed to display ... A user

  3. What Influenced the Development of the Air Force Nurse Corps from 1969 Through 1983

    DTIC Science & Technology

    1999-04-01

    ... and required physician support for these health care extenders was discussed with each Director, Base Medical Services (DBMS) and Chief Nurse. (9:7) ... Education (SGE) addressing the feasibility of starting an Air Reserve Forces Nursing Services Management Course at the School of Health Care Sciences ... administration, mental health, operating room, anesthesia, clinical, and flight nursing. Nursing roles were expanded in 1974 to obstetrics/gynecology

  4. Multimedia data repository for the World Wide Web

    NASA Astrophysics Data System (ADS)

    Chen, Ken; Lu, Dajin; Xu, Duanyi

    1998-08-01

    This paper introduces the design and implementation of a Multimedia Data Repository serving as a multimedia information system, which provides users a Web-accessible, platform-independent interface to query, browse, and retrieve multimedia data such as images, graphics, audio, and video from a large multimedia data repository. By integrating the multimedia DBMS, in which the textual information and samples of the multimedia data are organized and stored, together with the Web server in the Microsoft ActiveX Server Framework, users can access the DBMS and query the information simply by using a Web browser on the client side. The original multimedia data can then be located and transmitted through the Internet from the tertiary storage device, a 400-CD-ROM optical jukebox on the server side, to the client side for further use.

  5. Design and implementation of a biomedical image database (BDIM).

    PubMed

    Aubry, F; Badaoui, S; Kaplan, H; Di Paola, R

    1988-01-01

    We developed a biomedical image database (BDIM) that proposes a standardized representation of value arrays, such as images and curves, and of their associated parameters, independently of their acquisition mode, to make their transmission and processing easier. It includes three kinds of user-oriented interactions. The network concept was kept as a constraint so that the BDIM could be incorporated into a distributed structure, and we maintained compatibility with the ACR/NEMA communication protocol. The management of arrays and their associated parameters involves two distinct object bases linked via a gateway. The first manages arrays according to their storage mode: long-term storage on optionally on-line mass storage devices and, for consultation, partial copies of long-term stored arrays on hard disk. The second manages the associated parameters and the gateway by means of the relational DBMS ORACLE. Parameters are grouped into relations, some of which agree with the groups defined by the ACR/NEMA. The other relations describe objects resulting from processing the initial objects; these new objects are not described by the ACR/NEMA, but they can be inserted as shadow groups of the ACR/NEMA description. The relations describing the storage and their pathnames constitute the gateway. ORACLE distributed tools and the two-level storage technique will allow the integration of the BDIM into a distributed structure. The query and array (single or sequence) retrieval module has access to the relations via a layer that includes a dictionary managed by ORACLE. This dictionary translates ACR/NEMA objects into objects that can be handled by the DBMS. (ABSTRACT TRUNCATED AT 250 WORDS)

  6. USL/DBMS NASA/PC R and D project system testing standards

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Kavi, Srinu; Moreau, Dennis R.; Yan, Lin

    1984-01-01

    A set of system testing standards to be used in the development of all C software within the NASA/PC Research and Development Project is established. Testing will be considered in two phases: the program testing phase and the system testing phase. The objective of these standards is to provide guidelines for the planning and conduct of program and software system testing.

  7. An Approach to Develop 3d Geo-Dbms Topological Operators by Re-Using Existing 2d Operators

    NASA Astrophysics Data System (ADS)

    Xu, D.; Zlatanova, S.

    2013-09-01

    Database systems are continuously extending their capabilities to store, process and analyse 3D data. Topological relationships, which describe how objects interact in space, are one of the important spatial issues. However, spatial operators for 3D objects are still insufficient. In this paper we present the development of a new 3D topological function to distinguish intersections of 3D planar polygons. The development reuses existing 2D functions in the DBMS together with two geometric transformations (rotation and projection). The function is tested on a real dataset to detect overlapping 3D city objects. The paper presents the algorithms and analyses the challenges. Suggestions for improving the current algorithm, as well as possible extensions to handle more 3D topological cases, are discussed at the end.
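
    A sketch of the reuse idea under stated assumptions: both polygons are planar and lie in the same plane, and shapely's 2D predicate stands in for the DBMS's existing 2D operators. The polygons are rotated so the shared plane becomes horizontal (Rodrigues' formula), the z coordinate is dropped, and the 2D test is applied:

        import numpy as np
        from shapely.geometry import Polygon  # stand-in for the DBMS 2D operator

        def rotation_to_xy(coords):
            """Rotation matrix taking the polygon's plane normal onto +z."""
            p = np.asarray(coords, dtype=float)
            n = np.cross(p[1] - p[0], p[2] - p[0])
            n /= np.linalg.norm(n)
            v = np.cross(n, [0.0, 0.0, 1.0])
            c = n[2]                              # cosine between normal and +z
            if np.linalg.norm(v) < 1e-12:         # plane already horizontal
                return np.eye(3)
            vx = np.array([[0, -v[2], v[1]],
                           [v[2], 0, -v[0]],
                           [-v[1], v[0], 0]])
            return np.eye(3) + vx + vx @ vx / (1.0 + c)   # Rodrigues' formula

        def planar_intersects(poly_a, poly_b):
            """2D intersection test after rotating both polygons into xy."""
            R = rotation_to_xy(poly_a)            # one rotation for both inputs
            a2 = (np.asarray(poly_a, float) @ R.T)[:, :2]
            b2 = (np.asarray(poly_b, float) @ R.T)[:, :2]
            return Polygon(a2).intersects(Polygon(b2))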

  8. Web-Based Honorarium Confirmation System Prototype

    NASA Astrophysics Data System (ADS)

    Wisswani, N. W.; Catur Bawa, I. G. N. B.

    2018-01-01

    Services in an academic environment can be improved by streamlining the salary payment process for all employees. As a form of control to maintain financial transparency, employees should have information about the salary payment process. Currently, employees are notified of committee honoraria manually: the payment is deposited to the employee's bank account, and to learn its details the employee must visit the accounting unit. Even then, employees find it difficult to obtain detailed information about the payments received in their accounts, partly because of the large amount of data to be collected and managed. Based on this issue, this research designs a prototype of a web-based system for the accounting unit that provides detailed confirmation of the financial transactions credited to employee bank accounts and reported through the mobile banking system. The prototype is developed with the waterfall method, implemented in PHP with MySQL as the DBMS, and tested with end users.

  9. Distributed Computerized Catalog System

    NASA Technical Reports Server (NTRS)

    Borgen, Richard L.; Wagner, David A.

    1995-01-01

    The DarkStar Distributed Catalog System describes arbitrary data objects in a unified manner, providing end users with a versatile yet simple search mechanism for locating and identifying objects. It provides built-in generic and dynamic graphical user interfaces. The design of the system avoids some of the problems of a standard DBMS, and the system provides more flexibility than conventional relational or object-oriented databases. The data-collection lattice is a partly hierarchical representation of the relationships among collections, subcollections, and data objects.

  10. The unified database for the fixed target experiment BM@N

    NASA Astrophysics Data System (ADS)

    Gertsenberger, K. V.

    2016-09-01

    The article describes the database developed as the comprehensive data storage of the fixed-target experiment BM@N [1] at the Joint Institute for Nuclear Research (JINR) in Dubna. The structure and purposes of the BM@N facility are briefly presented, and the scheme of the unified database and its parameters are described in detail. The BM@N database, implemented on the PostgreSQL database management system (DBMS), provides user access to the current information of the experiment. The interfaces developed for access to the database are also presented: one is implemented as a set of C++ classes for accessing the data without SQL statements, and the other is a Web interface available on the Web page of the BM@N experiment.

  11. Database interfaces on NASA's heterogeneous distributed database system

    NASA Technical Reports Server (NTRS)

    Huang, Shou-Hsuan Stephen

    1989-01-01

    The syntax and semantics of all commands used in the template are described. Template builders should consult this document for proper commands in the template. Previous documents (semiannual reports) described other aspects of this project. Appendix 1 contains all substituting commands used in the system. Appendix 2 includes all repeating commands. Appendix 3 is a collection of DEFINE templates from eight different DBMSs.

  12. ROME (Request Object Management Environment)

    NASA Astrophysics Data System (ADS)

    Kong, M.; Good, J. C.; Berriman, G. B.

    2005-12-01

    Most current astronomical archive services are based on an HTML/CGI architecture in which users submit HTML forms via a browser and CGI programs operating under a web server process the requests. Most services return an HTML result page with URL links to the result files or, for longer jobs, return a message indicating that email will be sent when the job is done. This paradigm has a few serious shortcomings. First, it is all too common for something to go wrong and for the user never to hear about the job again. Second, for long and complicated jobs there is often important intermediate information that would allow the user to adjust the processing. Finally, unless some sort of custom queueing mechanism is used, background jobs are started immediately upon receiving the CGI request; when there are many such requests, the server machine can easily be overloaded and either slow to a crawl or crash. The Request Object Management Environment (ROME) is a collection of middleware components being developed under the National Virtual Observatory Project to provide mechanisms for managing long jobs such as computationally intensive statistical analysis requests or the generation of large-scale mosaic images. Written as EJB objects within the open-source JBoss application server, ROME receives processing requests via a servlet interface, stores them in a DBMS using JDBC, distributes the processing (via queuing mechanisms) across multiple machines and environments (including Grid resources), manages real-time messages from the processing modules, and ensures proper user notification. The request processing modules are identical in structure to standard CGI programs -- though they can optionally implement status messaging -- and can be written in any language. ROME will persist these jobs across failures of processing modules, network outages, and even downtime of ROME and the DBMS, restarting them as necessary.

  13. A DBMS architecture for global change research

    NASA Astrophysics Data System (ADS)

    Hachem, Nabil I.; Gennert, Michael A.; Ward, Matthew O.

    1993-08-01

    The goal of this research is the design and development of an integrated system for the management of very large scientific databases, cartographic/geographic information processing, and exploratory scientific data analysis for global change research. The system will represent both spatial and temporal knowledge about natural and man-made entities on the earth's surface, following an object-oriented paradigm. A user will be able to derive, modify, and apply procedures to perform operations on the data, including comparison, derivation, prediction, validation, and visualization. This work represents an effort to extend database technology with an intrinsic class of operators that is extensible and responds to the growing needs of scientific research. Of significance is the integration of many diverse forms of data into the database, including cartography, geography, hydrography, hypsography, images, and urban planning data. Equally important is the maintenance of metadata, that is, data about the data, such as coordinate transformation parameters, map scales, and audit trails of previous processing operations. This project will impact the fields of geographical information systems and global change research as well as the database community. It will provide an integrated database management testbed for scientific research, and a testbed for the development of analysis tools to understand and predict global change.

  14. Mandatory and Location-Aware Access Control for Relational Databases

    NASA Astrophysics Data System (ADS)

    Decker, Michael

    Access control is concerned with determining which operations a particular user is allowed to perform on a particular electronic resource. For example, an access control decision could say that user Alice is allowed to perform the operation read (but not write) on the resource research report. With conventional access control this decision is based on the user's identity, whereas the basic idea of Location-Aware Access Control (LAAC) is to also evaluate the user's current location when deciding whether a particular request should be granted or denied. LAAC is an interesting approach for mobile information systems because these systems are exposed to specific security threats such as the loss of a device. Some data models for LAAC can be found in the literature, but almost all of them are based on RBAC and none of them is designed especially for Database Management Systems (DBMS). In this paper we therefore propose a LAAC approach for DBMSs and describe a prototypical implementation of that approach based on database triggers.
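
    A minimal sketch of the decision logic such a trigger would evaluate, with an illustrative policy table and planar coordinates (the rule format and all names are assumptions, not the paper's model):

        from math import hypot

        # (role, operation, resource) -> permitted zone as (x, y, radius)
        POLICY = {("nurse", "read", "patient_record"): (10.0, 20.0, 0.5)}

        def check_access(role, op, resource, user_pos):
            """Grant only if a rule exists AND the user is inside its zone --
            the location test a LAAC database trigger would add on top of the
            conventional identity-based check."""
            rule = POLICY.get((role, op, resource))
            if rule is None:
                return False
            cx, cy, r = rule
            return hypot(user_pos[0] - cx, user_pos[1] - cy) <= r

        print(check_access("nurse", "read", "patient_record", (10.2, 20.1)))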

  15. Missense mutations located in structural p53 DNA-binding motifs are associated with extremely poor survival in chronic lymphocytic leukemia.

    PubMed

    Trbusek, Martin; Smardova, Jana; Malcikova, Jitka; Sebejova, Ludmila; Dobes, Petr; Svitakova, Miluse; Vranova, Vladimira; Mraz, Marek; Francova, Hana Skuhrova; Doubek, Michael; Brychtova, Yvona; Kuglik, Petr; Pospisilova, Sarka; Mayer, Jiri

    2011-07-01

    There is a distinct connection between TP53 defects and poor prognosis in chronic lymphocytic leukemia (CLL). It remains unclear whether patients harboring TP53 mutations represent a homogenous prognostic group. We evaluated the survival of patients with CLL and p53 defects identified at our institution by p53 yeast functional assay and complementary interphase fluorescence in situ hybridization analysis detecting del(17p) from 2003 to 2010. A defect of the TP53 gene was identified in 100 of 550 patients. p53 mutations were strongly associated with the deletion of 17p and the unmutated IgVH locus (both P < .001). Survival assessed from the time of abnormality detection was significantly reduced in patients with both missense (P < .001) and nonmissense p53 mutations (P = .004). In addition, patients harboring missense mutation located in p53 DNA-binding motifs (DBMs), structurally well-defined parts of the DNA-binding domain, manifested a clearly shorter median survival (12 months) compared with patients having missense mutations outside DBMs (41 months; P = .002) or nonmissense alterations (36 months; P = .005). The difference in survival was similar in the analysis limited to patients harboring mutation accompanied by del(17p) and was also confirmed in a subgroup harboring TP53 defect at diagnosis. The patients with p53 DBMs mutation (at diagnosis) also manifested a short median time to first therapy (TTFT; 1 month). The substantially worse survival and the short TTFT suggest a strong mutated p53 gain-of-function phenotype in patients with CLL with DBMs mutations. The impact of p53 DBMs mutations on prognosis and response to therapy should be analyzed in investigative clinical trials.

  16. Undergraduate Research Opportunities in OSS

    NASA Astrophysics Data System (ADS)

    Boldyreff, Cornelia; Capiluppi, Andrea; Knowles, Thomas; Munro, James

    Using Open Source Software (OSS) in undergraduate teaching in universities is now commonplace. Students use OSS applications and systems in their courses on programming, operating systems, DBMS, and web development, to name but a few. Studying OSS projects from both a product and a process view also forms part of the software engineering curriculum at various universities. Many students have also taken part in OSS projects as developers.

  17. A Brief Assessment of LC2IEDM, MIST and Web Services for use in Naval Tactical Data Management

    DTIC Science & Technology

    2004-07-01

    ... server software, messaging between the client and server, and a database. The MIST database is implemented in an open-source DBMS named PostgreSQL. ... PostgreSQL had its beginnings at the University of California, Berkeley, in 1986 [11]. The development of PostgreSQL has since evolved into a ... contact history from the database.

  18. Evaluating Non-In-Place Update Techniques for Flash-Based Transaction Processing Systems

    NASA Astrophysics Data System (ADS)

    Wang, Yongkun; Goda, Kazuo; Kitsuregawa, Masaru

    Recently, flash memory has been emerging as a mainstream storage device. With prices sliding fast, its cost per unit capacity is approaching that of SATA disk drives. So far flash memory has been widely deployed in consumer electronics and partly in mobile computing environments; its deployment in enterprise systems has been studied by many researchers and developers. In terms of access performance characteristics, flash memory is quite different from disk drives. Having no mechanical components, flash memory offers very high random read performance, whereas its random write performance is limited by the erase-before-write design and is comparable with, or even worse than, that of disk drives. Because of this performance asymmetry, naive deployment in enterprise systems may not exploit the full potential performance of flash memory. This paper studies the effectiveness of using non-in-place-update (NIPU) techniques through the IO path of flash-based transaction processing systems. Our deliberate experiments using both an open-source DBMS and a commercial DBMS validated the potential benefits: a 3.0x to 6.6x performance improvement was confirmed by incorporating non-in-place-update techniques into the file system without any modification of applications or storage devices.
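
    The essence of a non-in-place-update scheme can be sketched as a log-structured page store: every write becomes a sequential append (which flash handles well) and a page table tracks the newest copy of each page, so no block is ever rewritten in place. This is a toy illustration of the principle, not the paper's actual file-system modification:

        import os

        class LogStructuredStore:
            """Toy NIPU page store: appends instead of in-place page writes."""
            PAGE = 4096

            def __init__(self, path):
                self.f = open(path, "ab+")
                self.page_table = {}                  # page id -> file offset

            def write_page(self, pid, data):
                assert len(data) == self.PAGE
                self.f.seek(0, os.SEEK_END)
                self.page_table[pid] = self.f.tell()  # newest version wins
                self.f.write(data)                    # sequential append only

            def read_page(self, pid):
                self.f.seek(self.page_table[pid])
                return self.f.read(self.PAGE)

    A real implementation would also reclaim obsolete page versions (garbage collection), which this sketch omits.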

  19. Simulation and management games for training command and control in emergencies.

    PubMed

    Levi, Leon; Bregman, David

    2003-01-01

    The aim of our project was to introduce and implement simulation techniques in a problematic field: increasing health care system preparedness for disasters. This field was chosen because the relevant knowledge is held by a few experienced staff members, who must disseminate it to others during the busy routine work of the system's personnel. Knowledge management techniques -- classifying current data, storing organizational knowledge centrally, using it for decision making, and disseminating it through the organization -- were used in this project. In the first stage we analyzed the current system of building a preparedness protocol (set of orders), and we identified the pitfalls of changing personnel and losing knowledge gained through lessons from local and national experience. For this stage we developed a database of resources and objects (casualties) to be used in the simulation under different options, one of which was the differentiation between drills led by a trainer and drills run at computers that let trainees work out the needed solution. Model rules for different scenarios of multi-casualty incidents -- from conventional warfare trauma to combined chemical/toxicological events -- as well as levels of care, both prehospital and in-hospital, were incorporated into the database management system (we used Microsoft Access). The hardware for the management game comprised networked computers with the ability to project scenes. For the prehospital phase, portable PCs connected to a central server were used to assess the bidirectional flow of information. Simulation software (ARENA) and a graphical interface (Visual Basic GUI) were used. We conclude that our system provides solutions that are in use at different levels of the health care system to assess and improve management command and control for different scenarios of multi-casualty incidents.

  20. Implications for Crustal Structures and Heat Fluxes from Depth-to-the-Bottom of the Magnetic Source Estimates in West Antarctica, Amundsen Sea Sector

    NASA Astrophysics Data System (ADS)

    Dziadek, R.; Ferraccioli, F.; Gohl, K.; Spiegel, C.; Kaul, N. E.

    2017-12-01

    The West Antarctic Rift System is one of the least understood rift systems on earth, but displays a unique coupled relationship between tectonic processes and ice sheet dynamics. Geothermal heat flux (GHF) is a poorly constrained parameter in Antarctica and suspected to affect basal conditions of ice sheets, i.e., basal melting and subglacial hydrology. Thermomechanical models demonstrate the influential boundary condition of geothermal heat flux for (paleo) ice sheet stability. Young, continental rift systems are regions with significantly elevated geothermal heat flux (GHF), because the transient thermal perturbation to the lithosphere caused by rifting requires 100 Ma to reach long-term thermal equilibrium. We discuss airborne, high-resolution magnetic anomaly data from the Amundsen Sea Sector, to provide additional insight into deeper crustal structures related to the West Antarctic Rift System in the Amundsen/Bellingshausen sector. With the depth-to-the-bottom of the magnetic source (DBMS) estimates we reveal spatial changes at the bottom of the igneous crust and the thickness of the magnetic layer, which can be further incorporated into tectonic interpretations. The DBMS also marks an important temperature transition zone of approximately 580°C and therefore serves as a boundary condition for our numerical FEM thermal models in 2D and 3D.

  1. Mass-storage management for distributed image/video archives

    NASA Astrophysics Data System (ADS)

    Franchi, Santina; Guarda, Roberto; Prampolini, Franco

    1993-04-01

    The realization of an image/video database requires a specific design for both the database structures and the mass-storage management. This issue was addressed in the project of the digital image/video database system designed at the IBM SEMEA Scientific & Technical Solution Center. Proper database structures have been defined to catalog image/video coding techniques with their related parameters, and the description of image/video contents. User workstations and servers are distributed along a local area network. Image/video files are not managed directly by the DBMS server; because of their large size, they are stored outside the database on network devices. The database contains the pointers to the image/video files and the description of the storage devices. The system can use different kinds of storage media, organized in a hierarchical structure. Three levels of functions are available to manage the storage resources. The functions of the lower level provide media management: they make it possible to catalog devices and to modify device status and device network location. The medium level manages image/video files on a physical basis, handling file migration between high-capacity media and low-access-time media. The functions of the upper level work on image/video files on a logical basis, as they archive, move and copy image/video data selected by user-defined queries. These functions are used to support the implementation of a storage management strategy. The database information about the characteristics of both storage devices and coding techniques is used by the third-level functions to fit delivery/visualization requirements and to reduce archiving costs.

  2. DAS: A Data Management System for Instrument Tests and Operations

    NASA Astrophysics Data System (ADS)

    Frailis, M.; Sartor, S.; Zacchei, A.; Lodi, M.; Cirami, R.; Pasian, F.; Trifoglio, M.; Bulgarelli, A.; Gianotti, F.; Franceschi, E.; Nicastro, L.; Conforti, V.; Zoli, A.; Smart, R.; Morbidelli, R.; Dadina, M.

    2014-05-01

    The Data Access System (DAS) is a data management software system providing a reusable solution for the storage of data acquired both from telescopes and from auxiliary data sources during instrument development phases and operations. It is part of the Customizable Instrument WorkStation system (CIWS-FW), a framework for the storage, processing and quick-look inspection of data acquired from scientific instruments. The DAS provides a data access layer mainly targeted at software applications: quick-look displays, pre-processing pipelines and scientific workflows. It is logically organized in three main components: an intuitive and compact Data Definition Language (DAS DDL) in XML format, aimed at user-defined data types; an Application Programming Interface (DAS API), which automatically adds classes and methods supporting the DDL data types and provides an object-oriented query language; and a data management component, which maps the metadata of the DDL data types onto a relational Data Base Management System (DBMS) and stores the data in a shared (network) file system. With the DAS DDL, developers define the data model for a particular project, specifying for each data type the metadata attributes, the data format and layout (if applicable), and named references to related or aggregated data types. Together with the DDL user-defined data types, the DAS API acts as the only interface to store, query and retrieve the metadata and data in the DAS system, providing both an abstract interface and a data-model-specific one in C, C++ and Python. The mapping of metadata onto the back-end database is automatic and supports several relational DBMSs, including MySQL, Oracle and PostgreSQL.
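
    A sketch of what a DDL fragment and its metadata extraction might look like; the element and attribute names below are invented for illustration and are not the actual DAS DDL schema:

        import xml.etree.ElementTree as ET

        ddl = """
        <dataType name="Image">
          <attribute name="exposure" type="float64"/>
          <attribute name="filter"   type="string"/>
          <data format="binary" layout="2D"/>
        </dataType>"""

        node = ET.fromstring(ddl)
        # metadata attributes would be mapped onto relational tables;
        # the bulk data itself goes to the shared file system
        attrs = {a.get("name"): a.get("type") for a in node.iter("attribute")}
        print(node.get("name"), attrs)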

  3. [A development and evaluation of nursing KMS using QFD in outpatient departments].

    PubMed

    Lee, Han Na; Yun, Eun Kyoung

    2014-02-01

    This study was done to develop and implement a Nursing KMS (knowledge management system) in order to improve knowledge sharing and creation among clinical nurses in outpatient departments. It was a methodological study using the system development life cycle: planning, analysis, design, implementation, and evaluation. Quality Function Deployment (QFD) was applied to establish nurse requirements and to identify important design requirements. Participants were 32 nurses, and evaluation data were collected pre- and post-intervention at K Hospital in Seoul, a tertiary hospital with over 1,000 beds. The Nursing KMS was built using a Linux-based operating system, the Oracle DBMS, and Java 1.6 web programming tools, and was implemented as a sub-system of the hospital information system. There was a statistically significant difference in knowledge sharing, but no statistically significant difference was observed in knowledge creation. In terms of satisfaction with the system, system efficiency ranked first, followed by system convenience, information suitability and information usefulness. The results indicate that use of the Nursing KMS increases nurses' knowledge sharing, can contribute to increased quality of nursing knowledge, and provides more opportunities for nurses to gain expertise from knowledge shared among nurses.

  4. A Secure Web Application Providing Public Access to High-Performance Data Intensive Scientific Resources - ScalaBLAST Web Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis, Darren S.; Peterson, Elena S.; Oehmen, Chris S.

    2008-05-04

    This work presents the ScalaBLAST Web Application (SWA), a web-based application implemented using the PHP scripting language, MySQL DBMS, and Apache web server under a GNU/Linux platform. SWA is an application built as part of the Data Intensive Computer for Complex Biological Systems (DICCBS) project at the Pacific Northwest National Laboratory (PNNL). SWA delivers accelerated throughput of bioinformatics analysis via high-performance computing through a convenient, easy-to-use web interface. This approach greatly enhances emerging fields of study in biology such as ontology-based homology, and multiple whole-genome comparisons which, in the absence of a tool like SWA, require a heroic effort to overcome the computational bottleneck associated with genome analysis. The current version of SWA includes a user account management system, a web-based user interface, and a backend process that generates the files necessary for the Internet scientific community to submit a ScalaBLAST parallel processing job on a dedicated cluster.

  5. Relax with CouchDB - Into the non-relational DBMS era of Bioinformatics

    PubMed Central

    Manyam, Ganiraju; Payton, Michelle A.; Roth, Jack A.; Abruzzo, Lynne V.; Coombes, Kevin R.

    2012-01-01

    With the proliferation of high-throughput technologies, genome-level data analysis has become common in molecular biology. Bioinformaticians are developing extensive resources to annotate and mine biological features from high-throughput data. The underlying database management systems for most bioinformatics software are based on a relational model. Modern non-relational databases offer an alternative that has flexibility, scalability, and a non-rigid design schema. Moreover, with an accelerated development pace, non-relational databases like CouchDB can be ideal tools to construct bioinformatics utilities. We describe CouchDB by presenting three new bioinformatics resources: (a) geneSmash, which collates data from bioinformatics resources and provides automated gene-centric annotations, (b) drugBase, a database of drug-target interactions with a web interface powered by geneSmash, and (c) HapMap-CN, which provides a web interface to query copy number variations from three SNP-chip HapMap datasets. In addition to the web sites, all three systems can be accessed programmatically via web services. PMID:22609849
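
    Because CouchDB exposes everything over plain HTTP/JSON, a resource like those described above needs no driver beyond an HTTP client. A minimal sketch against a local server that permits unauthenticated writes (the database and document names are illustrative):

        import json
        import urllib.request

        BASE = "http://localhost:5984"  # default CouchDB endpoint

        def couch(method, path, doc=None):
            """Issue one CouchDB REST call and decode the JSON reply."""
            req = urllib.request.Request(
                BASE + path, method=method,
                data=None if doc is None else json.dumps(doc).encode(),
                headers={"Content-Type": "application/json"})
            with urllib.request.urlopen(req) as resp:
                return json.load(resp)

        couch("PUT", "/genes")  # create a database (no rigid schema required)
        couch("PUT", "/genes/TP53", {"symbol": "TP53", "chrom": "17p13.1"})
        print(couch("GET", "/genes/TP53"))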

  6. A report on the USL NASA/RECON project. Part 2: PC-based R and D in support of IS and R applications

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Chum, Frank Y.; Hall, Philip P.; Moreau, Dennis R.; Triantafyllopoulos, Spiros

    1984-01-01

    This Working Paper Series entry describes the PC R and D development effort initiated as part of the NASA/RECON Project at the University of Southwestern Louisiana. This effort involves the development of a PC-based environment for the prototyping and evaluation of various tools designed to enhance the interaction between scientists and engineers and remote information systems. The design of PC-based tools for the enhancement of the NASA/RECON university-level courses is described as well as the design of a multi-functional PC-based workstation to support access to and processing of information from local, distributed, and remote sources. Course preparation activities are described in a companion report entitled A Report on the USL NASA/RECON Project: Part 1, the Development of a Transportable, University-Level, IS and R Educational Program, by Suzy Gallagher and Martin Granier, USL/DBMS NASA/RECON Working Paper Series report number DBMS.NASA/RECON-7.

  7. SciDB versus Spark: A Preliminary Comparison Based on an Earth Science Use Case

    NASA Astrophysics Data System (ADS)

    Clune, T.; Kuo, K. S.; Doan, K.; Oloso, A.

    2015-12-01

    We compare two Big Data technologies, SciDB and Spark, for performance, usability, and extensibility, when applied to a representative Earth science use case. SciDB is a new-generation parallel distributed database management system (DBMS) based on the array data model that is capable of handling multidimensional arrays efficiently but requires a lengthy data ingest prior to analysis, whereas Spark is a fast and general engine for large-scale data processing that can immediately process raw data files and thereby avoid the ingest process. Once data have been ingested, SciDB is very efficient in database operations such as subsetting. Spark, on the other hand, provides greater flexibility by supporting a wide variety of high-level tools, including DBMSs. For the performance aspect of this preliminary comparison, we configure Spark to operate directly on text or binary data files and thereby limit the need for additional tools. Arguably, a more appropriate comparison would involve exploring other configurations of Spark that exploit supported high-level tools, but that is beyond our current resources. To make the comparison as "fair" as possible, we export the arrays produced by SciDB into text files (or convert them to binary files) for intake by Spark and thereby avoid any additional file processing penalties. The Earth science use case selected for this comparison is the identification and tracking of snowstorms in the NASA Modern Era Retrospective-analysis for Research and Applications (MERRA) reanalysis data. The identification portion of the use case flags all grid cells of the MERRA high-resolution hourly data that satisfy our criteria for a snowstorm, whereas the tracking portion connects flagged cells adjacent in time and space to form snowstorm episodes. We will report the results of our comparisons at this presentation.
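
    The identification and tracking steps of the use case can be sketched independently of either engine, using numpy and scipy (the threshold and grid sizes are placeholders, not the study's actual snowstorm criteria):

        import numpy as np
        from scipy import ndimage

        data = np.random.rand(24, 90, 180)          # stand-in for hourly MERRA grids

        flagged = data > 0.95                       # identification: flag cells
        # tracking: connect flagged cells adjacent in time and space
        structure = np.ones((3, 3, 3), dtype=int)   # adjacency incl. the time axis
        episodes, n = ndimage.label(flagged, structure=structure)
        print(n, "candidate snowstorm episodes")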

  8. Incorporating the APS Catalog of the POSS I and Image Archive in ADS

    NASA Technical Reports Server (NTRS)

    Humphreys, Roberta M.

    1998-01-01

    The primary purpose of this contract was to develop the software to both create and access an on-line database of images from digital scans of the Palomar Sky Survey. This required modifying our DBMS (called Star Base) to create an image database from the actual raw pixel data from the scans. The digitized images are processed into a set of coordinate-reference index and pixel files that are stored in run-length files, thus achieving an efficient lossless compression. For efficiency and ease of referencing, each digitized POSS I plate is then divided into 900 subplates. Our custom DBMS maps each query into the corresponding POSS plate(s) and subplate(s). All images from the appropriate subplates are retrieved from disk with byte-offsets taken from the index files. These are assembled on-the-fly into a GIF image file for browser display, and a FITS format image file for retrieval. The FITS images have a pixel size of 0.33 arcseconds. The FITS header contains astrometric and photometric information. This method keeps the disk requirements manageable while allowing for future improvements. When complete, the APS Image Database will contain over 130 Gb of data. A set of web page query forms is available on-line, as well as an on-line tutorial and documentation. The database is distributed to the Internet by a high-speed SGI server and a high-bandwidth disk system. The URL is http://aps.umn.edu/IDB/. The image database software is written in Perl and C and has been compiled on SGI computers under IRIX 5.3. A copy of the written documentation is included and the software is on the accompanying exabyte tape.

  9. A Fuzzy Query Mechanism for Human Resource Websites

    NASA Astrophysics Data System (ADS)

    Lai, Lien-Fu; Wu, Chao-Chin; Huang, Liang-Tsung; Kuo, Jung-Chih

    Users' preferences often contain imprecision and uncertainty that are difficult for traditional human resource websites to deal with. In this paper, we apply fuzzy logic theory to develop a fuzzy query mechanism for human resource websites. First, a storing mechanism is proposed to store fuzzy data in conventional database management systems without modifying DBMS models. Second, a fuzzy query language is proposed for users to make fuzzy queries on fuzzy databases. A user's fuzzy requirement can be expressed as a fuzzy query consisting of a set of fuzzy conditions. Third, each fuzzy condition is associated with a fuzzy importance to differentiate between fuzzy conditions according to their degrees of importance. Fourth, the fuzzy weighted average is utilized to aggregate all fuzzy conditions based on their degrees of importance and degrees of matching. Through the mutual compensation of all fuzzy conditions, the ordering of query results can be obtained according to the user's preference.
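
    A minimal sketch of the aggregation step described above, assuming hypothetical condition names, membership degrees, and importance weights (the paper's exact fuzzy operators may differ); the fuzzy weighted average is taken here as the importance-weighted mean of the degrees of matching.

      # Sketch: score one job candidate against a fuzzy query whose conditions
      # carry degrees of importance. All names and numbers are illustrative.

      def fuzzy_weighted_average(matching, importance):
          """Aggregate degrees of matching (0..1) by degrees of importance."""
          total = sum(importance.values())
          return sum(matching[c] * importance[c] for c in importance) / total

      # Degrees of matching of the candidate against each fuzzy condition.
      matching = {"salary_around_50k": 0.8, "about_5y_experience": 0.6, "fluent_english": 0.9}
      # Degrees of importance the user attached to each condition.
      importance = {"salary_around_50k": 0.5, "about_5y_experience": 0.3, "fluent_english": 0.2}

      print(round(fuzzy_weighted_average(matching, importance), 2))  # 0.76

    Ranking candidates by this score yields the preference ordering of query results that the abstract describes, with strong conditions compensating for weak ones.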

  10. Designing appropriate blended courses: a students' perspective.

    PubMed

    Tsai, Chia-Wen

    2010-10-01

    The computing education in Taiwan's vocational schools usually focuses on how to help students enhance their professional skills and pass certified examinations. In addition, due to national education policy and universities' regulations, pure online courses are not permitted in Taiwan. In order to design appropriate blended learning (BL) courses, the author explored the effects of web-mediated self-regulated learning (SRL) with variations in online class frequency on enhancing students' computing skills and their perspective of the blended courses. A total of 172 students, divided into four groups, participated in the experiment. The results showed that students in the SRL and BL group with five online classes had the highest scores for using a database management system (DBMS), and the highest pass rate on certified examinations. Students in this group also expressed their positive perspective on the arrangement of their blended course with the intervention of web-mediated SRL.

  11. A Relational Encoding of a Conceptual Model with Multiple Temporal Dimensions

    NASA Astrophysics Data System (ADS)

    Gubiani, Donatella; Montanari, Angelo

    The theoretical interest and the practical relevance of a systematic treatment of multiple temporal dimensions is widely recognized in the database and information system communities. Nevertheless, most relational databases have no temporal support at all. A few of them provide limited support, in terms of temporal data types and predicates, constructors, and functions for the management of time values (borrowed from the SQL standard). One (resp., two) temporal dimensions are supported only by historical and transaction-time (resp., bitemporal) databases. In this paper, we provide a relational encoding of a conceptual model featuring four temporal dimensions, namely, the classical valid and transaction times, plus the event and availability times. We focus our attention on the distinctive technical features of the proposed temporal extension of the relational model. In the last part of the paper, we briefly show how to implement it in a standard DBMS.
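
    As a rough illustration of how four temporal dimensions can be flattened onto a standard DBMS, the sketch below stores valid, transaction, event, and availability times as columns of one relation; the column names, textual dates, and the 'until changed' sentinel are our assumptions, not the authors' encoding.

      import sqlite3

      # One relation carrying four temporal dimensions (a simplified sketch).
      conn = sqlite3.connect(":memory:")
      conn.execute("""
          CREATE TABLE employee_salary (
              emp_id   INTEGER,
              salary   NUMERIC,
              vt_start TEXT, vt_end TEXT,  -- valid time: when the fact holds in reality
              tt_start TEXT, tt_end TEXT,  -- transaction time: when the DB recorded it
              et       TEXT,               -- event time: when the fact came into being
              at_start TEXT, at_end TEXT   -- availability time: when it became known
          )""")
      conn.execute(
          "INSERT INTO employee_salary VALUES (1, 50000,"
          " '2020-01-01', '9999-12-31', '2020-01-10', '9999-12-31',"
          " '2019-12-20', '2020-01-05', '9999-12-31')")
      # A 'current' query must constrain the open-ended temporal dimensions.
      print(conn.execute("""
          SELECT emp_id, salary FROM employee_salary
          WHERE vt_end = '9999-12-31' AND tt_end = '9999-12-31'""").fetchone())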

  12. A Codasyl-Type Schema for Natural Language Medical Records

    PubMed Central

    Sager, N.; Tick, L.; Story, G.; Hirschman, L.

    1980-01-01

    This paper describes a CODASYL (network) database schema for information derived from narrative clinical reports. The goal of this work is to create an automated process that accepts natural language documents as input and maps this information into a database of a type managed by existing database management systems. The schema described here represents the medical events and facts identified through the natural language processing. This processing decomposes each narrative into a set of elementary assertions, represented as MEDFACT records in the database. Each assertion in turn consists of a subject and a predicate classed according to a limited number of medical event types, e.g., signs/symptoms, laboratory tests, etc. The subject and predicate are represented by EVENT records which are owned by the MEDFACT record associated with the assertion. The CODASYL-type network structure was found to be suitable for expressing most of the relations needed to represent the natural language information. However, special mechanisms were developed for storing the time relations between EVENT records and for recording connections (such as causality) between certain MEDFACT records. This schema has been implemented using the UNIVAC DMS-1100 DBMS.
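
    The owner-member structure of the schema can be mimicked in a few lines of Python; the field names and event types below are illustrative guesses at what the MEDFACT and EVENT records carry, not the published record layouts.

      from dataclasses import dataclass, field
      from typing import List

      # Sketch: each MEDFACT record owns the EVENT records (subject and
      # predicate) of one elementary assertion parsed from the narrative.

      @dataclass
      class Event:
          role: str        # 'subject' or 'predicate'
          event_type: str  # e.g. 'sign/symptom', 'laboratory test'
          text: str

      @dataclass
      class MedFact:
          assertion_id: int
          events: List[Event] = field(default_factory=list)
          connected_to: List["MedFact"] = field(default_factory=list)  # e.g. causality

      fact = MedFact(1)
      fact.events.append(Event("subject", "laboratory test", "blood glucose"))
      fact.events.append(Event("predicate", "laboratory test", "elevated"))
      print(fact.assertion_id, [e.text for e in fact.events])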

  13. Distributed data collection for a database of radiological image interpretations

    NASA Astrophysics Data System (ADS)

    Long, L. Rodney; Ostchega, Yechiam; Goh, Gin-Hua; Thoma, George R.

    1997-01-01

    The National Library of Medicine, in collaboration with the National Center for Health Statistics and the National Institute for Arthritis and Musculoskeletal and Skin Diseases, has built a system for collecting radiological interpretations for a large set of x-ray images acquired as part of the data gathered in the second National Health and Nutrition Examination Survey. This system is capable of delivering across the Internet 5- and 10-megabyte x-ray images to Sun workstations equipped with X Window based 2048 X 2560 image displays, for the purpose of having these images interpreted for the degree of presence of particular osteoarthritic conditions in the cervical and lumbar spines. The collected interpretations can then be stored in a database at the National Library of Medicine, under control of the Illustra DBMS. This system is a client/server database application which integrates (1) distributed server processing of client requests, (2) a customized image transmission method for faster Internet data delivery, (3) distributed client workstations with high resolution displays, image processing functions and an on-line digital atlas, and (4) relational database management of the collected data.

  14. Chest wall reconstruction in a canine model using polydioxanone mesh, demineralized bone matrix and bone marrow stromal cells.

    PubMed

    Tang, Hua; Xu, Zhifei; Qin, Xiong; Wu, Bin; Wu, Lihui; Zhao, XueWei; Li, Yulin

    2009-07-01

    Extensive chest wall defect reconstruction remains a challenging problem for surgeons. In the past several years, little progress has been made in this area. In this study, a biodegradable polydioxanone (PDO) mesh and demineralized bone matrix (DBM) seeded with osteogenically induced bone marrow stromal cells (BMSCs) were used to reconstruct a 6 cm x 5.5 cm chest wall defect. Four experimental groups were evaluated (n=6 per group): polydioxanone (PDO) mesh/DBMs/BMSCs group, polydioxanone (PDO) mesh/DBMs group, polydioxanone (PDO) mesh group, and a blank group (no materials) in a canine model. All the animals survived except those in the blank group. In all groups receiving biomaterial implants, the polydioxanone (PDO) mesh completely degraded at 24 weeks and was replaced by fibrous tissue with thickness close to that of the normal intercostal tissue (P>0.05). In the polydioxanone (PDO) mesh/DBMs/BMSCs group, new bone formation and bone-union were observed by radiographic and histological examination. More importantly, the reconstructed rib could maintain its original radian and achieve satisfactory biomechanics close to normal ribs in terms of bending stress (P>0.05). However, in the other two groups, fibrous tissue was observed in the defect and junctions, and the reconstructed ribs were easily distorted under an outer force. Based on these results, a surgical approach utilizing biodegradable polydioxanone (PDO) mesh in combination with DBMs and BMSCs could repair the chest wall defect not only in function but also in structure.

  15. The DEEP-South: Scheduling and Data Reduction Software System

    NASA Astrophysics Data System (ADS)

    Yim, Hong-Suh; Kim, Myung-Jin; Bae, Youngho; Moon, Hong-Kyu; Choi, Young-Jun; Roh, Dong-Goo; the DEEP-South Team

    2015-08-01

    The DEep Ecliptic Patrol of the Southern sky (DEEP-South), started in October 2012, is currently in test runs with the first Korea Microlensing Telescope Network (KMTNet) 1.6 m wide-field telescope located at CTIO in Chile. While the primary objective of the DEEP-South is the physical characterization of small bodies in the Solar System, it is expected to discover a large number of such bodies, many of them previously unknown. An automatic observation planning and data reduction software subsystem called "The DEEP-South Scheduling and Data reduction System" (the DEEP-South SDS) is currently being designed and implemented for observation planning, data reduction, and analysis of huge amounts of data with minimal human interaction. The DEEP-South SDS consists of three software subsystems: the DEEP-South Scheduling System (DSS), the Local Data Reduction System (LDR), and the Main Data Reduction System (MDR). The DSS manages observation targets, makes decisions on target priority and observation methods, schedules nightly observations, and archives data using the Database Management System (DBMS). The LDR is designed to detect moving objects in CCD images, while the MDR conducts photometry and reconstructs lightcurves. Based on analyses made at the LDR and the MDR, the DSS schedules follow-up observations to be conducted at the other KMTNet stations. By the end of 2015, we expect the DEEP-South SDS to achieve stable operation. We also plan to improve the SDS to accomplish a finely tuned observation strategy and more efficient data reduction in 2016.

  16. Real Time Integration of Field Data Into a GIS Platform for the Management of Hydrological Emergencies

    NASA Astrophysics Data System (ADS)

    Mangiameli, M.; Mussumeci, G.

    2013-01-01

    A wide series of events requires the immediate availability of information and field data for decision-makers. An example is the need to quickly transfer the information acquired from monitoring and alerting sensors, or the data from damage reconnaissance after a disastrous event, to an Emergency Operations Center. To this purpose, we developed an integrated GIS and WebGIS system to dynamically create and populate, via the Web, a database with spatial features. In particular, this work concerns the gathering and transmission of spatial data and related information to the desktop GIS so that they can be displayed and analyzed in real time to characterize the operational scenario and to decide on rescue interventions. As basic software, we used only free and open source tools: QuantumGIS and GRASS as desktop GIS, MapServer with the PMapper application for WebGIS functionality, and PostgreSQL/PostGIS as the Database Management System (DBMS). The approach has been designed, developed and successfully tested in the management of GIS-based navigation of an autonomous robot, both to map its trajectories and to assign optimal paths. This paper presents the application of our system to a simulated hydrological event that could affect the province of Catania, in Sicily. In particular, assuming that several teams draw up an inventory of the damage, we highlight the benefits of real-time transmission of the information collected in the field to headquarters.
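
    A sketch of how one field report might be pushed into the spatial database for immediate display in the desktop GIS: the table, columns, coordinates, and connection string are hypothetical, and psycopg2 is our choice of client library; only the PostGIS functions (ST_MakePoint, ST_SetSRID) are standard.

      import psycopg2

      # Hypothetical damage-report table: team, free-text description, point geometry.
      conn = psycopg2.connect("dbname=emergency user=gis host=localhost")
      cur = conn.cursor()
      cur.execute(
          """INSERT INTO damage_reports (team, description, geom)
             VALUES (%s, %s, ST_SetSRID(ST_MakePoint(%s, %s), 4326))""",
          ("team-3", "road washed out", 15.0873, 37.5027),  # lon/lat near Catania
      )
      conn.commit()  # the WebGIS and desktop GIS pick up the feature on refresh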

  17. The Design of Data Disaster Recovery of National Fundamental Geographic Information System

    NASA Astrophysics Data System (ADS)

    Zhai, Y.; Chen, J.; Liu, L.; Liu, J.

    2014-04-01

    With the development of information technology, the data security of information systems is facing more and more challenges. The geographic information produced by surveying and mapping is a fundamental and strategic resource, applied in all areas of national economic, defence and social development, and it is especially vital to national and social interests when such classified geographic information directly concerns Chinese sovereignty. Urgent problems that need to be resolved for surveying and mapping are how to handle mass data storage and backup, how to establish and improve a disaster-backup system (especially after a sudden natural calamity or accident), and how to ensure that all sectors of the information system can be rapidly restored to correct operation. To overcome various disaster risks, protect the security of data and reduce the impact of disasters, the effective way is without doubt to analyse the features of storage, management and security requirements, and to ensure that the design of the data disaster recovery system is suitable for surveying and mapping. This article analyses the features of fundamental geographic information data and its storage-management requirements, and presents a three-site disaster recovery plan for the DBMS based on widely used network, storage and backup, data replication, and remote application-switching technologies. In the LAN, data are replicated synchronously between the database management servers and the local storage backup management systems; simultaneously, data are replicated asynchronously between the local storage backup management systems and the remote database management servers. The core of the system is recovering from a local disaster at the remote site, ensuring data security and business continuity for the local site. This article focuses on the following points: the background, the necessity of a disaster recovery system, the analysis of the data holdings, and the data disaster recovery plan. A feature of this program is the use of hardware-based data hot backup and remote online disaster recovery support for the Oracle database system. The contribution of this paper is to summarize and analyse the common disaster recovery requirements of surveying and mapping business systems and, based on the actual situation of the industry, to design basic GIS disaster recovery solutions; we also give conclusions about the key technologies of RTO and RPO.

  18. Database interfaces on NASA's heterogeneous distributed database system

    NASA Technical Reports Server (NTRS)

    Huang, S. H. S.

    1986-01-01

    The purpose of the ORACLE interface is to enable the DAVID program to submit queries and transactions to databases running under the ORACLE DBMS. The interface package is made up of several modules. The progress of these modules is described below. The two approaches used in implementing the interface are also discussed. Detailed discussion of the design of the templates is shown and concluding remarks are presented.

  19. Secure DBMS Auditor

    DTIC Science & Technology

    1990-07-01

    ...access controls and for thwarting inference and aggregation attacks are generally considered inadequate for high assurance systems. Consequently, there is...requirements was to have been based on a state-of-the-art survey involving interviews with TDBMS researchers and developers and security officers and auditors

  20. Growth factors--BMPs, DBMs, and buffy coat products: are there any proven differences amongst them?

    PubMed

    Veillette, Christian J H; McKee, Michael D

    2007-03-01

    Advances in the understanding of bone repair and improved biotechnology have led to the introduction of new strategies for orthopedic surgeons to control and modulate bone healing using growth factors. However, many orthopedic surgeons are uncertain about the current levels of evidence supporting the use of materials that possess these properties and their therapeutic role in the management of skeletal problems such as fracture, long-bone nonunion, and spine fusion. In particular, the differences amongst osteoinductive factors synthesized by recombinant gene technology, derived from demineralized bone matrix, or derived from platelet-rich plasma require clarification.

  1. A case Study of Applying Object-Relational Persistence in Astronomy Data Archiving

    NASA Astrophysics Data System (ADS)

    Yao, S. S.; Hiriart, R.; Barg, I.; Warner, P.; Gasson, D.

    2005-12-01

    The NOAO Science Archive (NSA) team is developing a comprehensive domain model to capture the science data in the archive. Java and an object model derived from the domain model well address the application layer of the archive system. However, since the RDBMS is the best-proven technology for data management, the challenge is the paradigm mismatch between the object and relational models. Transparent object-relational mapping (ORM) persistence is a successful solution to this challenge. In the data modeling and persistence implementation of NSA, we are using Hibernate, a well-accepted ORM tool, to bridge the object model in the business tier and the relational model in the database tier. Thus, the database is isolated from the Java application. The application queries directly on objects using a DBMS-independent object-oriented query API, which frees the application developers from low-level JDBC and SQL so that they can focus on the domain logic. We present the detailed design of the NSA R3 (Release 3) data model and object-relational persistence, including mapping, retrieving and caching. Persistence layer optimization and performance tuning will be analyzed. The system is being built on J2EE, so the integration of Hibernate into the EJB container and the transaction management are also explored.
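
    For readers unfamiliar with transparent ORM, the pattern can be shown compactly in Python with SQLAlchemy standing in for Hibernate (the tool the NSA team actually uses); the class below is a hypothetical slice of an archive domain model, not NSA's.

      from sqlalchemy import Column, Integer, String, create_engine
      from sqlalchemy.orm import Session, declarative_base

      Base = declarative_base()

      class Exposure(Base):
          """Mapped domain object; the table layout is derived from the class."""
          __tablename__ = "exposure"
          id = Column(Integer, primary_key=True)
          instrument = Column(String)
          exptime = Column(Integer)

      engine = create_engine("sqlite:///:memory:")  # the DBMS choice is isolated here
      Base.metadata.create_all(engine)

      with Session(engine) as session:
          session.add(Exposure(instrument="mosaic", exptime=300))
          session.commit()
          # The application queries objects; the mapper emits the SQL.
          rows = session.query(Exposure).filter(Exposure.exptime >= 60).all()
          print([(e.instrument, e.exptime) for e in rows])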

  2. Collecting data along the continuum of prevention and care: a Continuous Quality Improvement approach.

    PubMed

    Indyk, Leonard; Indyk, Debbie

    2006-01-01

    For the past 14 years, a team of applied social scientists and system analysts has worked with a wide variety of Community-Based Organizations (CBOs), other grassroots agencies and networks, and Medical Center departments to support resource, program, staff and data development and evaluation for hospital- and community-based programs and agencies serving HIV at-risk and affected populations. A by-product of this work has been the development, elaboration and refinement of an approach to Continuous Quality Improvement (CQI) which is appropriate for diverse community-based providers and agencies. A key component of our CQI system involves the installation of a sophisticated relational database management and reporting system (DBMS) which is used to collect, analyze, and report data in an iterative process to provide feedback among the evaluators, agency administration and staff. The database system is designed for two purposes: (1) to support the agency's internal and external administrative reporting requirements; and (2) to support the development of practice-driven health services and early intervention research. This body of work has fostered a unique opportunity for the development of exploratory service-driven research which serves both administrative and research needs.

  3. A Framework for WWW Query Processing

    NASA Technical Reports Server (NTRS)

    Wu, Binghui Helen; Wharton, Stephen (Technical Monitor)

    2000-01-01

    Query processing is the most common operation in a DBMS. Sophisticated query processing has mainly been targeted at a single enterprise environment providing centralized control over data and metadata. Query submission by anonymous users on the web is different in that load balancing and DBMS access control become the key issues. This paper provides a solution by introducing a framework for WWW query processing. The success of this framework lies in the utilization of query optimization techniques and the ontological approach. This methodology has proved to be cost effective at the NASA Goddard Space Flight Center Distributed Active Archive Center (GDAAC).

  4. Relax with CouchDB--into the non-relational DBMS era of bioinformatics.

    PubMed

    Manyam, Ganiraju; Payton, Michelle A; Roth, Jack A; Abruzzo, Lynne V; Coombes, Kevin R

    2012-07-01

    With the proliferation of high-throughput technologies, genome-level data analysis has become common in molecular biology. Bioinformaticians are developing extensive resources to annotate and mine biological features from high-throughput data. The underlying database management systems for most bioinformatics software are based on a relational model. Modern non-relational databases offer an alternative that has flexibility, scalability, and a non-rigid design schema. Moreover, with an accelerated development pace, non-relational databases like CouchDB can be ideal tools to construct bioinformatics utilities. We describe CouchDB by presenting three new bioinformatics resources: (a) geneSmash, which collates data from bioinformatics resources and provides automated gene-centric annotations, (b) drugBase, a database of drug-target interactions with a web interface powered by geneSmash, and (c) HapMap-CN, which provides a web interface to query copy number variations from three SNP-chip HapMap datasets. In addition to the web sites, all three systems can be accessed programmatically via web services. Copyright © 2012 Elsevier Inc. All rights reserved.
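
    Because CouchDB exposes its document store over plain HTTP, a client needs nothing beyond an HTTP library; in the sketch below the database name, document fields, and local server URL are assumptions, while the PUT/GET endpoints are CouchDB's standard REST API.

      import requests

      BASE = "http://localhost:5984"   # default CouchDB address; assumed running
      DB = f"{BASE}/genes_demo"        # hypothetical database name

      requests.put(DB)                 # create the database if it does not exist
      # Documents are schema-free JSON, which suits loosely structured annotations.
      doc = {"symbol": "TP53", "aliases": ["p53", "LFS1"], "chromosome": "17"}
      requests.put(f"{DB}/TP53", json=doc)  # PUT with an explicit document id

      record = requests.get(f"{DB}/TP53").json()
      print(record["symbol"], record["chromosome"])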

  5. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    NASA Astrophysics Data System (ADS)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

    An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Data Base Management Systems (DBMS), etc. in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. The identification and hierarchy of the framework requirements and the corresponding solutions for two reference MDO frameworks, a general one and an aircraft-oriented one, were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for the improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
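
    The AHP step reduces a pairwise-comparison matrix to a priority vector (the principal eigenvector, normalized); a minimal numpy version is sketched below with invented judgments over three hypothetical framework criteria.

      import numpy as np

      # Pairwise comparisons (Saaty scale) among three made-up criteria.
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      # Priorities = principal eigenvector of A, scaled to sum to 1.
      eigvals, eigvecs = np.linalg.eig(A)
      w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
      w = w / w.sum()
      print(dict(zip(["usability", "integration", "extensibility"], w.round(3))))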

  6. Comparative Analysis of Data Structures for Storing Massive Tins in a Dbms

    NASA Astrophysics Data System (ADS)

    Kumar, K.; Ledoux, H.; Stoter, J.

    2016-06-01

    Point cloud data are an important source of 3D geoinformation. Modern 3D data acquisition and processing techniques, such as airborne laser scanning and multi-beam echosounding, generate billions of 3D points for an area of just a few square kilometers. With the size of point clouds exceeding the billion mark even for a small area, there is a need for their efficient storage and management. These point clouds are sometimes associated with attributes and constraints as well. Storing billions of 3D points is currently possible, which is confirmed by the initial implementations in Oracle Spatial SDO_PC and the PostgreSQL Point Cloud extension. But to be able to analyse and extract useful information from point clouds, we need more than just points, i.e. we require the surface defined by these points in space. There are different ways to represent surfaces in GIS, including grids, TINs, boundary representations, etc. In this study, we investigate database solutions for the storage and management of massive TINs. The classical (face- and edge-based) and compact (star-based) data structures are discussed at length with reference to their structure, advantages and limitations in handling massive triangulations, and are compared with the current solution of the PostGIS Simple Feature. The main test dataset is the TIN generated from the third national elevation model of the Netherlands (AHN3), with a point density of over 10 points/m². The PostgreSQL/PostGIS DBMS is used for storing the generated TIN. The data structures are tested with the generated TIN models to account for their geometry, topology, storage, indexing, and loading time in a database. Our study is useful in identifying the limitations of the existing data structures for storing massive TINs and what is required to optimise these structures for managing massive triangulations in a database.
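
    To make the star-based alternative concrete: instead of storing triangles explicitly, each vertex keeps its ordered 'star' of adjacent vertices, and triangles are recovered from consecutive star entries. The toy triangulation below (two triangles over four invented vertices) is ours, not AHN3 data.

      # Star-based storage of a 2-triangle TIN: triangles (1,2,3) and (1,3,4).
      stars = {
          1: [2, 3, 4],  # counter-clockwise ring of neighbours around vertex 1
          2: [3, 1],
          3: [4, 1, 2],
          4: [1, 3],
      }

      def triangles(stars):
          """Recover each triangle once from consecutive star entries
          (simplified: no ring wrap-around, fine for this boundary-only toy)."""
          tris = set()
          for v, star in stars.items():
              for a, b in zip(star, star[1:]):
                  if v in stars[a] and v in stars[b]:  # mutual adjacency check
                      tris.add(tuple(sorted((v, a, b))))
          return sorted(tris)

      print(triangles(stars))  # [(1, 2, 3), (1, 3, 4)]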

  7. Information management and analysis system for groundwater data in Thailand

    NASA Astrophysics Data System (ADS)

    Gill, D.; Luckananurung, P.

    1992-01-01

    The Ground Water Division of the Thai Department of Mineral Resources maintains a large archive of groundwater data with information on some 50,000 water wells. Each well file contains information on well location, well completion, borehole geology, water levels, water quality, and pumping tests. In order to enable efficient use of this information a computer-based system for information management and analysis was created. The project was sponsored by the United Nations Development Program and the Thai Department of Mineral Resources. The system was designed to serve users who lack prior training in automated data processing. Access is through a friendly user/system dialogue. Tasks are segmented into a number of logical steps, each of which is managed by a separate screen. Selective retrieval is possible by four different methods of area definition and by compliance with user-specified constraints on any combination of database variables. The main types of outputs are: (1) files of retrieved data, screened according to users' specifications; (2) an assortment of pre-formatted reports; (3) computed geochemical parameters and various diagrams of water chemistry derived therefrom; (4) bivariate scatter diagrams and linear regression analysis; (5) posting of data and computed results on maps; and (6) hydraulic aquifer characteristics as computed from pumping tests. Data are entered directly from formatted screens. Most records can be copied directly from hand-written documents. The database-management program performs data integrity checks in real time, enabling corrections at the time of input. The system software can be grouped into: (1) database administration and maintenance—these functions are carried out by the SIR/DBMS software package; (2) user communication interface for task definition and execution control—the interface is written in the operating system command language (VMS/DCL) and in FORTRAN 77; and (3) scientific data-processing programs, written in FORTRAN 77. The system was implemented on a DEC MicroVAX II computer.

  8. Partitioned learning of deep Boltzmann machines for SNP data.

    PubMed

    Hess, Moritz; Lenz, Stefan; Blätte, Tamara J; Bullinger, Lars; Binder, Harald

    2017-10-15

    Learning the joint distributions of measurements, and in particular identification of an appropriate low-dimensional manifold, has been found to be a powerful ingredient of deep learning approaches. Yet, such approaches have hardly been applied to single nucleotide polymorphism (SNP) data, probably due to the high number of features typically exceeding the number of studied individuals. After a brief overview of how deep Boltzmann machines (DBMs), a deep learning approach, can be adapted to SNP data in principle, we specifically present a way to alleviate the dimensionality problem by partitioned learning. We propose a sparse regression approach to coarsely screen the joint distribution of SNPs, followed by training several DBMs on SNP partitions that were identified by the screening. Aggregate features representing SNP patterns and the corresponding SNPs are extracted from the DBMs by a combination of statistical tests and sparse regression. In simulated case-control data, we show how this can uncover complex SNP patterns and augment results from univariate approaches, while maintaining type 1 error control. Time-to-event endpoints are considered in an application with acute myeloid leukemia patients, where SNP patterns are modeled after a pre-screening based on gene expression data. The proposed approach identified three SNPs that seem to jointly influence survival in a validation dataset. This indicates the added value of jointly investigating SNPs compared to standard univariate analyses and makes partitioned learning of DBMs an interesting complementary approach when analyzing SNP data. A Julia package is provided at 'http://github.com/binderh/BoltzmannMachines.jl'. binderh@imbi.uni-freiburg.de. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
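
    A rough sketch of the screen-then-partition idea (in Python, not the authors' Julia package): an L1-penalized regression coarsely screens SNPs against the outcome, and the surviving SNPs are split into partitions, each of which would then get its own DBM (the DBM training itself is omitted here).

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      n, p = 200, 500                      # individuals, SNPs (toy sizes)
      X = rng.integers(0, 3, size=(n, p)).astype(float)  # genotypes coded 0/1/2
      y = (X[:, 10] + X[:, 11] + rng.normal(0, 1, n) > 2).astype(int)  # planted signal

      # Step 1: sparse regression screen of the SNP-outcome associations.
      screen = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X, y)
      kept = np.flatnonzero(screen.coef_[0])

      # Step 2: partition the surviving SNPs; one DBM per partition (not shown).
      partitions = [kept[i:i + 20] for i in range(0, len(kept), 20)]
      print(len(kept), "SNPs kept in", len(partitions), "partitions")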

  9. Management of low-level radioactive waste in Israel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shabtai, B.; Brenner, S.; Ne'eman, E.

    1995-12-31

    Radioactive materials are used extensively in Israel in many areas and applications of medicine, industry, agriculture, research and development, and others. Israel's primary concern in waste management is population safety and environmental protection. The Ministry of The Environment (MOE), in cooperation with the Israeli Atomic Energy Commission (IAEC), supervises the disposal system and ensures effective control. The MOE is responsible for granting permits to users of radioactive elements in about 300 plants and institutes, with about 2,200 installations. The MOE operates a computerized database management system (DBMS) on radioactive materials, with data on licensing, import and distribution, waste disposal and transportation. Supervision over the disposal of LLRW has recently been strengthened, and periodic reports, based on the number of drums containing LLRW transferred from all institutes in Israel to the NRWDS, have been prepared. Draft regulations on the disposal of LLRW from institutes of research and education, hospitals, medical laboratories and others have recently been prepared. These regulations include instructions on the disposal of solid and liquid LLRW as well as radioactive gases and vapors. As a general rule, no LLRW of any sort is to be disposed of through the ordinary waste system or general sewage. However, in some extraordinary cases, residues of liquid LLRW are allowed to be disposed of in this manner, if the requirements for disposal are satisfied. There are some conditions under which solid LLRW may be treated as conventional waste, as well as for the safe emission of radioactive gases and aerosols. In light of these considerations, a new and more specific approach to radiation protection organizations and to the management, supervision and optimization of low-level radioactive waste problems is presented.

  10. Meta-All: a system for managing metabolic pathway information.

    PubMed

    Weise, Stephan; Grosse, Ivo; Klukas, Christian; Koschützki, Dirk; Scholz, Uwe; Schreiber, Falk; Junker, Björn H

    2006-10-23

    Many attempts are being made to understand biological subjects at a systems level. A major resource for these approaches are biological databases, storing manifold information about DNA, RNA and protein sequences including their functional and structural motifs, molecular markers, mRNA expression levels, metabolite concentrations, protein-protein interactions, phenotypic traits or taxonomic relationships. The use of these databases is often hampered by the fact that they are designed for special application areas and thus lack universality. Databases on metabolic pathways, which provide an increasingly important foundation for many analyses of biochemical processes at a systems level, are no exception to the rule. Data stored in central databases such as KEGG, BRENDA or SABIO-RK are often limited to read-only access. If experimentalists want to store their own data, possibly still under investigation, there are two possibilities. They can either develop their own information system for managing their own data, which is very time-consuming and costly, or they can try to store their data in existing systems, which is often restricted. Hence, an out-of-the-box information system for managing metabolic pathway data is needed. We have designed META-ALL, an information system that allows the management of metabolic pathways, including reaction kinetics, detailed locations, environmental factors and taxonomic information. Data can be stored together with quality tags and in different parallel versions. META-ALL uses the Oracle DBMS and Oracle Application Express. We provide the META-ALL information system for download and use. In this paper, we describe the database structure and give information about the tools for submitting and accessing the data. As a first application of META-ALL, we show how the information contained in a detailed kinetic model can be stored and accessed. META-ALL is a system for managing information about metabolic pathways. It facilitates the handling of pathway-related data and is designed to help biochemists and molecular biologists in their daily research. It is available on the Web at http://bic-gh.de/meta-all and can be downloaded free of charge and installed locally.

  11. Meta-All: a system for managing metabolic pathway information

    PubMed Central

    Weise, Stephan; Grosse, Ivo; Klukas, Christian; Koschützki, Dirk; Scholz, Uwe; Schreiber, Falk; Junker, Björn H

    2006-01-01

    Background Many attempts are being made to understand biological subjects at a systems level. A major resource for these approaches are biological databases, storing manifold information about DNA, RNA and protein sequences including their functional and structural motifs, molecular markers, mRNA expression levels, metabolite concentrations, protein-protein interactions, phenotypic traits or taxonomic relationships. The use of these databases is often hampered by the fact that they are designed for special application areas and thus lack universality. Databases on metabolic pathways, which provide an increasingly important foundation for many analyses of biochemical processes at a systems level, are no exception to the rule. Data stored in central databases such as KEGG, BRENDA or SABIO-RK are often limited to read-only access. If experimentalists want to store their own data, possibly still under investigation, there are two possibilities. They can either develop their own information system for managing their own data, which is very time-consuming and costly, or they can try to store their data in existing systems, which is often restricted. Hence, an out-of-the-box information system for managing metabolic pathway data is needed. Results We have designed META-ALL, an information system that allows the management of metabolic pathways, including reaction kinetics, detailed locations, environmental factors and taxonomic information. Data can be stored together with quality tags and in different parallel versions. META-ALL uses the Oracle DBMS and Oracle Application Express. We provide the META-ALL information system for download and use. In this paper, we describe the database structure and give information about the tools for submitting and accessing the data. As a first application of META-ALL, we show how the information contained in a detailed kinetic model can be stored and accessed. Conclusion META-ALL is a system for managing information about metabolic pathways. It facilitates the handling of pathway-related data and is designed to help biochemists and molecular biologists in their daily research. It is available on the Web at http://bic-gh.de/meta-all and can be downloaded free of charge and installed locally. PMID:17059592

  12. Development of computer informational system of diagnostics integrated optical materials, elements, and devices

    NASA Astrophysics Data System (ADS)

    Volosovitch, Anatoly E.; Konopaltseva, Lyudmila I.

    1995-11-01

    Well-known methods of optical diagnostics, databases for their storage, and expert systems (ES) for their development are analyzed. A computer informational system is developed, based on a hybrid ES built on a modern DBMS. As an example, the structural and constructional schemes of hybrid integrated-optical devices based on laser diodes, diffusion waveguides, geodetic lenses, package-free linear photodiode arrays, etc. are presented. The features of the methods and test results, as well as promising directions of work related to hybrid integrated-optical devices in the field of metrology, are discussed.

  13. Security concept in 'MyAngelWeb' a website for the individual patient at risk of emergency.

    PubMed

    Pinciroli, F; Nahaissi, D; Boschini, M; Ferrari, R; Meloni, G; Camnasio, M; Spaggiari, P; Carnerone, G

    2000-11-01

    We describe the Security Plan for the 'MyAngelWeb' service. The different actors involved in the service are subject to different security procedures. The core of the security system is implemented at the host site by means of a DBMS and standard Information Technology tools. Hardware requirements for sustainable security are needed at the web-site construction sites. They are not needed at the emergency physician's site. At the emergency physician's site, a two-way authentication system (password and test phrase method) is implemented.

  14. Security concept in 'MyAngelWeb((R))' a website for the individual patient at risk of emergency.

    PubMed

    Pinciroli; Nahaissi; Boschini; Ferrari; Meloni; Camnasio; Spaggiari; Carnerone

    2000-11-01

    We describe the Security Plan for the 'MyAngelWeb' service. The different actors involved in the service are subject to different security procedures. The core of the security system is implemented at the host site by means of a DBMS and standard Information Technology tools. Hardware requirements for sustainable security are needed at the web-site construction sites. They are not needed at the emergency physician's site. At the emergency physician's site, a two-way authentication system (password and test phrase method) is implemented.

  15. An Object-Relational Ifc Storage Model Based on Oracle Database

    NASA Astrophysics Data System (ADS)

    Li, Hang; Liu, Hua; Liu, Yong; Wang, Yuan

    2016-06-01

    As building models become increasingly complicated, the level of collaboration across professionals attracts more attention in the architecture, engineering and construction (AEC) industry. In order to accommodate this change, buildingSMART developed the Industry Foundation Classes (IFC) to facilitate interoperability between software platforms. However, IFC data are currently shared in the form of text files, which has drawbacks. In this paper, considering the object-based inheritance hierarchy of IFC and the storage features of different database management systems (DBMS), we propose a novel object-relational storage model that uses an Oracle database to store IFC data. Firstly, we establish the mapping rules between the data types in the IFC specification and the Oracle database. Secondly, we design the IFC database according to the relationships among IFC entities. Thirdly, we parse the IFC file and extract the IFC data. Lastly, we store the IFC data into the corresponding tables in the IFC database. In experiments, three different building models are selected to demonstrate the effectiveness of our storage model. The comparison of experimental statistics shows that no IFC data are lost during the exchange.

  16. A relational database in neurosurgery.

    PubMed

    Sicurello, F; Marchetti, M R; Cazzaniga, P

    1995-01-01

    This paper describes an automated procedure for clinical record management in a neurosurgery ward. The automated record allows the storage, querying and effective management of clinical data. This is useful during the patient's stay and also for data processing and analysis aimed at clinical research and statistical studies. The clinical record is problem-oriented. It contains a minimum data set regarding every patient and a data set which is defined by a classification nomenclature (using an internal protocol). The main parts of the clinical record are the following tables. PERSONAL DATA: contains the fields relating to the personal and admission data of the patient. The compilation of some fields is compulsory because they serve as input for the automated discharge letter. This table is used as an identifier for patient retrieval. ANAMNESIS: composed of five different tables according to the kind of data, namely family anamnesis, physiological anamnesis, past and recent pathological anamnesis, and trauma anamnesis. GENERAL OBJECTIVITY: contains the general physical information on a patient. The fields hold default values, which quickens compilation and assures the recording of normal values. NEUROLOGICAL EXAMINATION: contains information about the neurological status of the patient. In this table, too, the fields have default values. COMA: contains standardized data and classifications. The multiple choices are automated and guided and belong to homogeneous classes. SURGICAL OPERATIONS: information is recorded by first defining the general kind of operation and then defining the specific kind of operation. INSTRUMENTAL EXAMINATIONS: some examination results are recorded in a free structure, while others (TAC, etc.) follow a codified structure. In order to identify a pathology by means of TAC, it is enough to record three values corresponding to three variables. This classification fully describes a large number of neurosurgical pathologies. DISCHARGE: contains conclusions, therapies, results, and hospital course. Medical language is close to natural language and presents some ambiguities; in order to address this problem, a classification nomenclature was used for diagnosis definition. DISCHARGE LETTER: the document given to the patient when he is discharged. It extracts data from the previously described modules and contains standard headings. The information stored in the database is structured (e.g., diagnosis, name, surname, etc.), and access to this data takes place when the user wants to search the database, using queries in which the identifying data of a patient are given as conditions for the search (SELECT age, name WHERE diagnosis="TRAUMA"). The logical operators and relational algebra of the relational DBMS allow more complex queries ((diagnosis="TRAUMA" AND age="19") OR sex="M"). The queries are deterministic, because data management uses a classification nomenclature. Data retrieval takes place through matching, and the DBMS answers the queries directly. The information retrieval speed depends upon the kind of system that is used; in our case retrieval time is low because the accesses to disk are few, even for big databases. In medicine, clinical records can have a hierarchical structure and/or a relational one. Nevertheless, the hierarchical model presents a disadvantage: it is not very flexible because it is tied to a pre-defined structure; as a matter of fact, the definition of paths is established at the beginning and not during execution.
Thus, a better representation of the system at a logical level requires a relational DBMS which exploits the relationships between entities in both a vertical and a horizontal way. That is why the developers adopted a mixed strategy which exploits the advantages of both models and which is provided by M Technology with the SQL language (M/SQL). For the future, it is important to have at one's disposal multimedia technologies, which integrate different kinds of information (alp

  17. A Simulation Model Of A Picture Archival And Communication System

    NASA Astrophysics Data System (ADS)

    D'Silva, Vijay; Perros, Harry; Stockbridge, Chris

    1988-06-01

    A PACS architecture was simulated to quantify its performance. The model consisted of reading stations, acquisition nodes, communication links, a database management system, and a storage system of magnetic and optical disks. Two levels of storage were simulated: a high-speed magnetic disk system for short-term storage, and optical disk jukeboxes for long-term storage. The communications link was a single bus via which image data were requested and delivered. Real input data for the simulation model were obtained from surveys of radiology procedures (Bowman Gray School of Medicine). From these, the following inputs were calculated: the size of short-term storage necessary, the amount of long-term storage required, the frequency of access of each store, and the distribution of the number of films requested per diagnosis. The performance measures obtained were the mean retrieval time for an image, mean queue lengths, and the utilization of each device. Parametric analysis was done for the bus speed, the packet size for the communications link, the record size on the magnetic disk, the compression ratio, the influx of new images, DBMS time, and diagnosis think times. Plots give the optimum values of input speed and device performance that are sufficient to achieve subsecond image retrieval times.
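
    The queueing behaviour such a model captures can be reproduced in miniature with a discrete-event library; below, simpy is our choice of engine, and the arrival rate and service time are invented stand-ins for the survey-derived inputs.

      import random
      import simpy

      # Toy model: image retrievals queueing on one short-term disk store.
      random.seed(1)
      env = simpy.Environment()
      store = simpy.Resource(env, capacity=1)
      waits = []

      def retrieval(env):
          arrived = env.now
          with store.request() as req:
              yield req
              yield env.timeout(random.expovariate(1 / 0.4))  # ~0.4 s mean service
          waits.append(env.now - arrived)

      def arrivals(env):
          while True:
              yield env.timeout(random.expovariate(1 / 1.0))  # ~1 request/s
              env.process(retrieval(env))

      env.process(arrivals(env))
      env.run(until=3600)  # one simulated hour
      print(f"mean retrieval time: {sum(waits) / len(waits):.2f} s ({len(waits)} requests)")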

  18. Unified Access Architecture for Large-Scale Scientific Datasets

    NASA Astrophysics Data System (ADS)

    Karna, Risav

    2014-05-01

    Data-intensive sciences have to deploy diverse large-scale database technologies for data analytics, as scientists are now dealing with much larger data volumes than ever before. While array databases have bridged many gaps between the needs of data-intensive research fields and DBMS technologies (Zhang 2011), the invocation of other big data tools accompanying these databases is still manual and separate from the database management interface. We identify this as an architectural challenge that will increasingly complicate the user's work flow, owing to the growing number of useful but isolated and niche database tools. Such use of data analysis tools in effect leaves the burden on the user's end to synchronize the results from other data manipulation and analysis tools with the database management system. To this end, we propose a unified access interface for using big data tools within a large-scale scientific array database, using the database queries themselves to embed foreign routines belonging to the big data tools. Such an invocation of foreign data manipulation routines inside a database query can be made possible through a user-defined function (UDF). UDFs that allow such levels of freedom as to call modules from another language and interface back and forth between the query body and the side-loaded functions would be needed for this purpose. For the purpose of this research we attempt the coupling of four widely used tools, Hadoop (hadoop1), Matlab (matlab1), R (r1) and ScaLAPACK (scalapack1), with the UDF feature of rasdaman (Baumann 98), an array-based data manager, to investigate this concept. The native array data model used by an array-based data manager provides compact data storage and high-performance operations on ordered data such as spatial data, temporal data, and matrix-based data for linear algebra operations (scidbusr1). Performance issues arising from the coupling of tools with different paradigms, niche functionalities, separate processes and output data formats have been anticipated and considered during the design of the unified architecture. The research focuses on the feasibility of the designed coupling mechanism and the evaluation of the efficiency and benefits of our proposed unified access architecture. Zhang 2011: Zhang, Ying and Kersten, Martin and Ivanova, Milena and Nes, Niels, SciQL: Bridging the Gap Between Science and Relational DBMS, Proceedings of the 15th Symposium on International Database Engineering Applications, 2011. Baumann 98: Baumann, P., Dehmel, A., Furtado, P., Ritsch, R., Widmann, N., "The Multidimensional Database System RasDaMan", SIGMOD 1998, Proceedings ACM SIGMOD International Conference on Management of Data, June 2-4, 1998, Seattle, Washington, 1998. hadoop1: hadoop.apache.org, "Hadoop", http://hadoop.apache.org/, [Online; accessed 12-Jan-2014]. scalapack1: netlib.org/scalapack, "ScaLAPACK", http://www.netlib.org/scalapack, [Online; accessed 12-Jan-2014]. r1: r-project.org, "R", http://www.r-project.org/, [Online; accessed 12-Jan-2014]. matlab1: mathworks.com, "Matlab Documentation", http://www.mathworks.de/de/help/matlab/, [Online; accessed 12-Jan-2014]. scidbusr1: scidb.org, "SciDB User's Guide", http://scidb.org/HTMLmanual/13.6/scidb_ug, [Online; accessed 01-Dec-2013].
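
    The UDF mechanism the paper leans on can be tried in miniature with SQLite, whose Python driver lets a query call a registered foreign routine; this stands in for rasdaman's UDF feature rather than reproducing it, and the table and function below are invented.

      import sqlite3

      # Miniature analogue of embedding a foreign routine in a query via a UDF.
      conn = sqlite3.connect(":memory:")
      conn.execute("CREATE TABLE pixels (region TEXT, value REAL)")
      conn.executemany("INSERT INTO pixels VALUES (?, ?)",
                       [("a", 1.0), ("a", 3.0), ("b", 10.0), ("b", 14.0)])

      def zscore(value, mean, stdev):
          """'Foreign' routine the query can invoke like a built-in function."""
          return (value - mean) / stdev

      conn.create_function("zscore", 3, zscore)
      print(conn.execute("SELECT region, zscore(value, 7.0, 5.0) FROM pixels").fetchall())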

  19. Addressing BI Transactional Flows in the Real-Time Enterprise Using GoldenGate TDM

    NASA Astrophysics Data System (ADS)

    Pareek, Alok

    It's time to revisit low-latency and reliable real-time (RT) infrastructures to support next-generation BI applications, instead of continually debating the need for and notion of real-time. The last few years have illuminated some key paradigms affecting data management. The arguments put forth to move away from traditional DBMS architectures have proven persuasive, and specialized architectural data stores are being adopted in the industry [1]. The change from traditional database pull methods towards intelligent routing/push models is underway, causing applications to be redesigned, redeployed, and re-architected. One direct result of this is that, despite original warnings about replication [2], enterprises continue to deploy multiple replicas to support both performance and high availability of RT applications, with an added complexity around the manageability of heterogeneous computing systems. The enterprise is overflowing with data streams that require instantaneous processing and integration to deliver faster visibility and invoke conjoined actions for RT decision making, resulting in the deployment of advanced BI applications, as can be seen in stream processing over RT feeds from operational systems for CEP [3]. Given these various paradigms, a multitude of new challenges and requirements have emerged, necessitating different approaches to the management of RT applications for BI. The purpose of this paper is to offer a viewpoint on how RT affects critical operational applications, changes the weight of non-critical applications, and raises availability and data-movement requirements in the underlying infrastructure. I will discuss how the GoldenGate TDM platform is being deployed within the RTE to manage some of these challenges, particularly around RT dissemination of transactional data to reduce latency in data integration flows, to enable real-time reporting/DW, and to increase the availability of underlying operational systems. Real-world case studies will be used to support the various discussion points. The paper is an argument to augment traditional DI flows with a real-time technology (referred to as transactional data management) to support operational BI requirements.

  20. Associations of selected bedding types with incidence rates of subclinical and clinical mastitis in primiparous Holstein dairy cows.

    PubMed

    Rowbotham, R F; Ruegg, P L

    2016-06-01

    The objective of this observational study was to determine the association of exposure to selected bedding types with incidence of subclinical (SM) and clinical mastitis (CM) in primiparous Holstein dairy cows housed in identical pens at a single facility. At parturition, primiparous cows were randomly assigned to pens containing freestalls with 1 of 4 bedding materials: (1) deep-bedded new sand (NES, n=27 cows), (2) deep-bedded recycled sand (RS, n=25 cows), (3) deep-bedded manure solids (DBMS, n=31 cows), and (4) shallow-bedded manure solids over foam-core mattresses (SBMS, n=26 cows). For 12mo, somatic cell counts of quarter milk samples were determined every 28d and duplicate quarter milk samples were collected for microbiological analysis from all quarters with SM (defined as somatic cell count >200,000 cells/mL). During this period, duplicate quarter milk samples were also collected for microbial analysis from all cases of CM. For an additional 16mo, cases of CM were recorded; however, no samples were collected. Quarter days at risk (62,980) were distributed among bedding types and most quarters were enrolled for >150d. Of 135 cases of SM, 63% resulted in nonsignificant growth and 87% of recovered pathogens (n=33) were identified as coagulase-negative staphylococci. The distribution of etiologies of pathogens recovered from cases of SM was associated with bedding type. Coagulase-negative staphylococci were recovered from 12, 38, 11, and 46% of quarters with SM from cows in pens containing NES, RS, DBMS, and SBMS, respectively. A result of nonsignificant growth was obtained for 81, 59, 89, and 46% of quarters with SM from cows in pens containing NES, RS, DBMS, and SBMS, respectively. Quarters of primiparous cows bedded with NES tended to have greater survival time to incidence of CM than quarters of primiparous cows bedded with RS or DBMS. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  1. Design and Implementation of A Backend Multiple-Processor Relational Data Base Computer System.

    DTIC Science & Technology

    1981-12-01

    propagated to other parts of the data base. Cost. As mentioned earlier, a primary motivation for the backend DBMS work is the development of a...uniquely identify the n-tuples of the relation is called the primary key. For example, in Figure 3, the primary key is NUMBER. A primary key is said to...identifying the tuple. For example, in Figure 3, (NUMBER, TITLE) would not be a nonredundant primary key for COURSE. A relation can contain more than one

  2. 'SON-GO-KU' : a dream of automated library

    NASA Astrophysics Data System (ADS)

    Sato, Mamoru; Kishimoto, Juji

    In the process of automating libraries, the retrieval of books through the browsing of shelves has been overlooked. The telematic library is a document-based DBMS which can deliver the content of books by simulating the browsing process. The retrieval simulates the process a person would use in selecting a book in a real library, with a visual presentation on a graphic display substituted for the physical shelves. The characteristics of "Son-Go-Ku", a prototype system for such retrieval implemented in 1988, are described.

  3. A DBMS-based medical teleconferencing system.

    PubMed

    Chun, J; Kim, H; Lee, S; Choi, J; Cho, H

    2001-01-01

    This article presents the design of a medical teleconferencing system that is integrated with a multimedia patient database and incorporates easy-to-use tools and functions to effectively support collaborative work between physicians in remote locations. The design provides a virtual workspace that allows physicians to collectively view various kinds of patient data. By integrating the teleconferencing function into this workspace, physicians are able to conduct conferences using the same interface and have real-time access to the database during conference sessions. The authors have implemented a prototype based on this design. The prototype uses a high-speed network test bed and a manually created substitute for the integrated patient database.

  4. A DBMS-based Medical Teleconferencing System

    PubMed Central

    Chun, Jonghoon; Kim, Hanjoon; Lee, Sang-goo; Choi, Jinwook; Cho, Hanik

    2001-01-01

    This article presents the design of a medical teleconferencing system that is integrated with a multimedia patient database and incorporates easy-to-use tools and functions to effectively support collaborative work between physicians in remote locations. The design provides a virtual workspace that allows physicians to collectively view various kinds of patient data. By integrating the teleconferencing function into this workspace, physicians are able to conduct conferences using the same interface and have real-time access to the database during conference sessions. The authors have implemented a prototype based on this design. The prototype uses a high-speed network test bed and a manually created substitute for the integrated patient database. PMID:11522766

  5. Application of new type of distributed multimedia databases to networked electronic museum

    NASA Astrophysics Data System (ADS)

    Kuroda, Kazuhide; Komatsu, Naohisa; Komiya, Kazumi; Ikeda, Hiroaki

    1999-01-01

    Recently, various kinds of multimedia application systems have actively been developed, based on the achievements of advanced high-speed communication networks, computer processing technologies, and digital content-handling technologies. Against this background, this paper proposes a new distributed multimedia database system which can effectively perform a new function of cooperative retrieval among distributed databases. The proposed system introduces a new concept of a 'retrieval manager', which functions as an intelligent controller so that the user can recognize a set of distributed databases as one logical database. The logical database dynamically generates and executes a preferred combination of retrieval parameters on the basis of both directory data and the system environment. Moreover, a concept of 'domain' is defined in the system as a managing unit of retrieval, and retrieval can be performed effectively by cooperative processing among multiple domains. A communication language and protocols are also defined in the system; these are used in every communication action in the system. A language interpreter in each machine translates the communication language into the internal language used by that machine. Using the language interpreter, internal modules such as the DBMS and user interface modules can be freely selected. A concept of a 'content-set' is also introduced. A content-set is defined as a package of contents that are related to each other, and the system handles a content-set as one object. The user terminal can effectively control the display of retrieved contents by referring to data indicating the relations of the contents within the content-set. In order to verify the function of the proposed system, a networked electronic museum was experimentally built. The results of this experiment indicate that the proposed system can effectively retrieve the target contents under the control of a number of distributed domains, and that the system can work effectively even as it grows large.

  6. Modular System Control Development Model (MSCDM). Design Specification.

    DTIC Science & Technology

    1979-08-01

    with power supply and can be used independently of the loop. The PDU can be used as a general purpose processor. The loop is contained in a separate... inputs to nodes 22 (VSQC), 23 (DSQC), and 26 (BWBSA) will be generated by an LSI-11 microprocessor used as a simulated input generator (SIG). The SIG... who communicate faults to the FIAC module. FIAC generates event reports to the OCRI and DBMS. The PDP-11/40 in loop 2 generates

  7. Impact of data base structure in a successful in vitro-in vivo correlation for pharmaceutical products.

    PubMed

    Roudier, B; Davit, B; Schütz, H; Cardot, J-M

    2015-01-01

    The in vitro-in vivo correlation (IVIVC) (Food and Drug Administration 1997) aims to predict the in vivo performance of a pharmaceutical formulation based on its in vitro characteristics. It is a complex process that (i) incorporates a large amount of information in a gradual and incremental way and (ii) requires information on different properties (formulation, analytical, clinical) and the associated dedicated treatments (statistics, modeling, simulation). This results in many studies that are initiated and integrated into the specifications (quality target product profile, QTPP). The latter defines the appropriate experimental designs (quality by design, QbD) (Food and Drug Administration 2011, 2012), whose main objectives are the determination (i) of key factors of development and manufacturing (critical process parameters, CPPs) and (ii) of critical points of a physicochemical nature relating to active ingredients (APIs) and critical quality attributes (CQAs), whose non-inclusion may have implications in terms of efficacy and safety for the patient. These processes generate a very large amount of data that must be structured. In this context, the storage of information in a database (DB) and the management of this database (database management system, DBMS) become an important issue for the management of IVIVC projects and, more generally, for the development of new pharmaceutical forms. This article describes the implementation of a prototype object-oriented database (OODB), considered as a tool to support decision making, responding in a structured and consistent way to the issues of IVIVC project management (including bioequivalence and bioavailability) (Food and Drug Administration 2003) necessary for the implementation of the QTPP.

  8. Validation and clinical application of a method to quantify nevirapine in dried blood spots and dried breast-milk spots.

    PubMed

    Olagunju, Adeniyi; Amara, Alieu; Waitt, Catriona; Else, Laura; Penchala, Sujan D; Bolaji, Oluseye; Soyinka, Julius; Siccardi, Marco; Back, David; Owen, Andrew; Khoo, Saye

    2015-10-01

    The validation and clinical application of an LC-MS/MS method for the quantification of nevirapine in dried blood spots (DBS) and dried breast-milk spots (DBMS) are presented. DBS and DBMS were prepared from 50 and 30 μL of nevirapine-spiked whole blood and human breast milk, respectively. Chromatographic separation was achieved on a reverse-phase C18 column with 0.1% formic acid in water/acetonitrile using a solvent gradient programme at a flow rate of 400 μL/min, and detection was by a TSQ Quantum Access triple quadrupole mass spectrometer. The clinical application was evaluated in HIV-positive nursing mothers and their breastfed infants. The assay was validated over the concentration range 50-10,000 ng/mL. Accuracy ranged from 93.3% to 113.4% and precision ranged from 1.9% to 12.0%. The mean (percentage coefficient of variation) recovery of nevirapine from DBS and DBMS was ≥ 70.7% (≤ 8.2) and the matrix effect was ≤ 1.04 (≤ 6.1). Nevirapine was stable in DBS and DBMS for ≥ 15 months at room temperature and -80°C. Mean (SD) AUC0-12, Cmax and Cmin in maternal plasma versus breast milk were 57,808 ng · h/mL (24,315) versus 55,817 ng · h/mL (22,368), 6140 ng/mL (2605) versus 5231 ng/mL (2215) and 4334 ng/mL (1880) versus 4342 ng/mL (2245), respectively. The milk-to-plasma concentration ratio over the dosing interval was 0.94 (0.15). Infant plasma concentrations 2 and 8 h after maternal dosing were 580.6 ng/mL (464.7-1607) and 584.1 ng/mL (381.5-1570), respectively. These methods further extend opportunities for conducting clinical pharmacokinetic studies in nursing mother-infant pairs, especially in resource-limited settings. © The Author 2015. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  9. DIDBase: Intelligent, Interactive Archiving Technology for Ionogram Data

    NASA Astrophysics Data System (ADS)

    Reinisch, B. W.; Khmyrov, G.; Galkin, I. A.; Kozlov, A.

    2004-12-01

    Vertical ionospheric sounding data have been used in a variety of scenarios for ionospheric now-casting. The growing need for an accurate real-time specification of the vertical electron density distribution at multiple locations stimulates interest in intelligent data management systems that can arrange concurrent, remote access to the acquired data. This type of data access requires a high level of interaction and organization to support routing of data between ionosondes, data analysts, quality validation experts, end-user applications, data managers, and online data repositories such as the World Data Centers. The Digital Ionogram Database (DIDBase) is a pilot project started at UMASS Lowell in 2001, sponsored in part by the Air Force Research Laboratory, for management of real-time and retrospective data from a network of 50 digisondes. The DIDBase archives hold both raw and derived digisonde data under management of a commercial-strength DBMS, providing convenient means for automated ingestion of real-time data from online digisondes (40 locations worldwide as of September 2004), remote read access to the data over the HTTP Web protocol (http://ulcar.uml.edu/DIDBase/), remote read/write access from SAO Explorer workstations used for data visualization and interactive editing, and an ADRES subsystem for automated management of data requests. DIDBase and ADRES employ cross-platform solutions for all involved software, exchange protocols, and data. The paper briefly describes DIDBase operations during a recent Cal/Val campaign for the SSUSI/SSULI instruments on the DMSP F16 spacecraft, in which 26 online digisondes provided ground-truth NmF2 data for the overhead and limb passes of the spacecraft. Since the start of the campaign in December 2003, the total number of ADRES requests exceeded 9,000 by summer 2004.

  10. Measuring the usefulness of hidden units in Boltzmann machines with mutual information.

    PubMed

    Berglund, Mathias; Raiko, Tapani; Cho, Kyunghyun

    2015-04-01

    Restricted Boltzmann machines (RBMs) and deep Boltzmann machines (DBMs) are important models in deep learning, but it is often difficult to measure their performance in general, or the importance of individual hidden units in particular. We propose to use mutual information to measure the usefulness of individual hidden units in Boltzmann machines. The measure is fast to compute, and serves as an upper bound for the information the neuron can pass on, enabling detection of a particular kind of poor training result. We confirm experimentally that the proposed measure indicates how much the performance of the model drops when some of the units of an RBM are pruned away. We demonstrate the usefulness of the measure for early detection of poor training in DBMs. Copyright © 2014 Elsevier Ltd. All rights reserved.
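
    Since the measure above is described as an upper bound on the information a hidden unit can pass on, the idea can be sketched as follows: for a binary unit, the entropy of its mean activation bounds its mutual information with any other variable. The numpy sketch below is an illustration of that bound under synthetic activations, not the paper's exact estimator.

```python
import numpy as np

def binary_entropy(p, eps=1e-12):
    """Entropy (in bits) of a Bernoulli(p) variable."""
    p = np.clip(p, eps, 1 - eps)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

# Illustrative stand-in for sampled binary activations of one hidden unit
# across a batch of inputs (in a real RBM these would come from the model).
rng = np.random.default_rng(0)
activations = rng.random(10_000) < 0.9  # a nearly-always-on, low-information unit

# H(h) upper-bounds I(h; anything): a unit that is almost always on (or off)
# can pass on very little information and is a candidate for pruning.
usefulness_bound = binary_entropy(activations.mean())
print(f"upper bound on information passed: {usefulness_bound:.3f} bits")
```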

  11. N-terminal segments modulate the α-helical propensities of the intrinsically disordered basic regions of bZIP proteins.

    PubMed

    Das, Rahul K; Crick, Scott L; Pappu, Rohit V

    2012-02-17

    Basic region leucine zippers (bZIPs) are modular transcription factors that play key roles in eukaryotic gene regulation. The basic regions of bZIPs (bZIP-bRs) are necessary and sufficient for DNA binding and specificity. Bioinformatic predictions and spectroscopic studies suggest that unbound monomeric bZIP-bRs are uniformly disordered as isolated domains. Here, we test this assumption through a comparative characterization of conformational ensembles for 15 different bZIP-bRs using a combination of atomistic simulations and circular dichroism measurements. We find that bZIP-bRs have quantifiable preferences for α-helical conformations in their unbound monomeric forms. This helicity varies from one bZIP-bR to another despite a significant sequence similarity of the DNA binding motifs (DBMs). Our analysis reveals that intramolecular interactions between DBMs and eight-residue segments directly N-terminal to DBMs are the primary modulators of bZIP-bR helicities. We test the accuracy of this inference by designing chimeras of bZIP-bRs to have either increased or decreased overall helicities. Our results yield quantitative insights regarding the relationship between sequence and the degree of intrinsic disorder within bZIP-bRs, and might have general implications for other intrinsically disordered proteins. Understanding how natural sequence variations lead to modulation of disorder is likely to be important for understanding the evolution of specificity in molecular recognition through intrinsically disordered regions (IDRs). Copyright © 2011 Elsevier Ltd. All rights reserved.

  12. Efficacy of different bone volume expanders for augmenting lumbar fusions.

    PubMed

    Epstein, Nancy E

    2008-01-01

    A wide variety of bone volume expanders are being used in performing posterolateral lumbar noninstrumented and instrumented fusions. This article presents a review of their efficacy based on fusion rates, complications, and outcomes. Lumbar noninstrumented and instrumented fusions frequently use laminar autografts and different bone graft expanders. This review presents the utility of multiple forms/ratios of allograft-based DBMs. It also discusses the efficacy of artificial bone graft substitutes, including HA and β-TCP. Dynamic x-ray and/or CT examinations were used to document fusion in most series. Outcomes were variously assessed using Odom's criteria or different outcome questionnaires (Oswestry Questionnaire, SF-36, Dallas Pain Questionnaire, and/or Low Back Pain Rating Scale). Performing noninstrumented and instrumented lumbar posterolateral fusions resulted in comparable fusion rates in many series. Similar outcomes were also documented based on Odom's criteria or the multiple patient-based questionnaires. However, in some studies, the addition of spinal instrumentation increased the reoperation rate, operative time, blood loss, and cost. Various forms of DBMs, applied in different ratios to autografts, effectively supplemented spinal fusions in animal models and patient series. β-Tricalcium phosphate, which is used to augment autograft fusions addressing idiopathic scoliosis or lumbar disease, also proved to be effective. Different types of bone volume expanders, including various forms of allograft-based DBMs and artificial bone graft substitutes (HA and β-TCP), effectively promote posterolateral lumbar noninstrumented and instrumented fusions when added to autografts.

  13. PlanetServer: Innovative approaches for the online analysis of hyperspectral satellite data from Mars

    NASA Astrophysics Data System (ADS)

    Oosthoek, J. H. P.; Flahaut, J.; Rossi, A. P.; Baumann, P.; Misev, D.; Campalani, P.; Unnithan, V.

    2014-06-01

    PlanetServer is a WebGIS system, currently under development, enabling the online analysis of Compact Reconnaissance Imaging Spectrometer (CRISM) hyperspectral data from Mars. It is part of the EarthServer project which builds infrastructure for online access and analysis of huge Earth Science datasets. Core functionality consists of the rasdaman Array Database Management System (DBMS) for storage, and the Open Geospatial Consortium (OGC) Web Coverage Processing Service (WCPS) for data querying. Various WCPS queries have been designed to access spatial and spectral subsets of the CRISM data. The client WebGIS, consisting mainly of the OpenLayers javascript library, uses these queries to enable online spatial and spectral analysis. Currently the PlanetServer demonstration consists of two CRISM Full Resolution Target (FRT) observations, surrounding the NASA Curiosity rover landing site. A detailed analysis of one of these observations is performed in the Case Study section. The current PlanetServer functionality is described step by step, and is tested by focusing on detecting mineralogical evidence described in earlier Gale crater studies. Both the PlanetServer methodology and its possible use for mineralogical studies will be further discussed. Future work includes batch ingestion of CRISM data and further development of the WebGIS and analysis tools.

  14. Developing a Web-based system by integrating VGI and SDI for real estate management and marketing

    NASA Astrophysics Data System (ADS)

    Salajegheh, J.; Hakimpour, F.; Esmaeily, A.

    2014-10-01

    The importance of property in its various aspects, especially its impact on various sectors of the economy and on the country's macroeconomy, is clear. Because of the real, multi-dimensional and heterogeneous nature of housing as a commodity, the lack of an integrated system containing comprehensive property information, the lack of awareness among some actors in this field of such comprehensive information, and the lack of clear and comprehensive rules and regulations for trading and pricing, several problems arise for the people involved in this field. This research pursues the implementation of a crowd-sourced Web-based real estate support system. Creating a Spatial Data Infrastructure (SDI) in this system for collecting, updating and integrating all official data about property is also intended in this study. In this system a Web 2.0 broker and technologies such as Web services and service composition have been used. This work aims to provide comprehensive and diverse information about property from different sources. For this purpose a five-level real estate support system architecture is used. The PostgreSQL DBMS is used to implement the desired system. Geoserver is used as the map server and reference implementation of OGC (Open Geospatial Consortium) standards, and an Apache server is used to serve the web pages and user interfaces. Integrating the introduced methods and technologies provides a proper environment for various users to use the system and share their information. This goal can only be achieved through cooperation among all organizations involved in real estate, implementing their required infrastructures as interoperable Web services.

  15. AQBE — QBE Style Queries for Archetyped Data

    NASA Astrophysics Data System (ADS)

    Sachdeva, Shelly; Yaginuma, Daigo; Chu, Wanming; Bhalla, Subhash

    Large-scale adoption of electronic healthcare applications requires semantic interoperability. Recent proposals describe an advanced (multi-level) DBMS architecture for repository services for patients' health records, which also requires query interfaces at multiple levels, including the level of semi-skilled users. In this regard, a high-level user interface for querying the new form of standardized Electronic Health Records system has been examined in this study. It proposes a step-by-step graphical query interface that allows semi-skilled users to write queries. Its aim is to decrease user effort and communication ambiguities, and to increase user friendliness.

  16. Open Source Hbim for Cultural Heritage: a Project Proposal

    NASA Astrophysics Data System (ADS)

    Diara, F.; Rinaudo, F.

    2018-05-01

    Current technologies are changing the ways Cultural Heritage is researched, analysed, conserved and developed, allowing new innovative approaches. The possibility of integrating Cultural Heritage data, such as archaeological information, inside a three-dimensional environment system (like Building Information Modelling) brings huge benefits for its management, monitoring and valorisation. Nowadays there are many commercial BIM solutions. However, these tools are conceived and developed mostly for architectural design or technical installations. A better solution could be a dynamic and open platform that treats Cultural Heritage needs as a priority. Complete data usability and accessibility could be guaranteed by open source protocols. This choice would allow adapting the software to Cultural Heritage needs and not the opposite, thus avoiding methodological stretches. This work focuses on the analysis of, and experimentation with, the specific characteristics of such open source software (DBMS, CAD, servers) applied to a Cultural Heritage example, in order to verify their flexibility and reliability and then to create a dynamic open source HBIM prototype. Indeed, it could be a starting point for the future creation of a complete open source HBIM solution that could be adapted to other Cultural Heritage research and analyses.

  17. The PROTICdb database for 2-DE proteomics.

    PubMed

    Langella, Olivier; Zivy, Michel; Joets, Johann

    2007-01-01

    PROTICdb is a web-based database mainly designed to store and analyze plant proteome data obtained by 2D polyacrylamide gel electrophoresis (2D PAGE) and mass spectrometry (MS). The goals of PROTICdb are (1) to store, track, and query information related to proteomic experiments, i.e., from tissue sampling to protein identification and quantitative measurements; and (2) to integrate information from the user's own expertise and other sources into a knowledge base, used to support data interpretation (e.g., for the determination of allelic variants or products of posttranslational modifications). Data insertion into the relational database of PROTICdb is achieved either by uploading outputs from Mélanie, PDQuest, IM2d, ImageMaster(tm) 2D Platinum v5.0, Progenesis, Sequest, MS-Fit, and Mascot software, or by filling in web forms (experimental design and methods). 2D PAGE-annotated maps can be displayed, queried, and compared through the GelBrowser. Quantitative data can be easily exported in a tabulated format for statistical analyses with any third-party software. PROTICdb is based on the Oracle or the PostgreSQL DataBase Management System (DBMS) and is freely available upon request at http://cms.moulon.inra.fr/content/view/14/44/.

  18. Bacterial counts on teat skin and in new sand, recycled sand, and recycled manure solids used as bedding in freestalls.

    PubMed

    Rowbotham, R F; Ruegg, P L

    2016-08-01

    On modern dairy farms, environmental mastitis pathogens are usually the predominant cause of mastitis, and bedding often serves as a point of exposure to these organisms. The objective of this longitudinal study was to determine bacterial populations of 4 different bedding types [deep-bedded new sand (NES), deep-bedded recycled sand (RS), deep-bedded manure solids (DBMS), and shallow-bedded manure solids over foam core mattresses (SBMS)] and of teat skin swabs of primarily primiparous cows housed in a single facility over all 4 seasons. Samples of bedding were collected weekly (n=49 wk) from pens that each contained 32 lactating dairy cows. Throughout the same period, composite swabs of teat skin were collected weekly from all cows before and after premilking teat sanitation. Median numbers of streptococci and streptococci-like organisms (SSLO) were >8.6×10^6 cfu/g and >6.9×10^3 cfu/teat swab for all bedding types and teat swabs, respectively. Numbers of SSLO were greatest in samples of SBMS (2.1×10^8 cfu/g) and least in samples of NES (8.6×10^6 cfu/g), RS (1.3×10^7 cfu/g), and DBMS (1.7×10^7 cfu/g). Numbers of gram-negative bacteria in bedding (5.5×10^4 to 1.2×10^7 cfu/g) were fewer than numbers of SSLO (8.6×10^6 to 2.1×10^8 cfu/g). Numbers of coliform bacteria were greatest in samples of DBMS (2.2×10^6 cfu/g) and least in samples of NES (3.6×10^3 cfu/g). In general, the relative number of bacteria on teat skin corresponded to exposure in bedding. Numbers of gram-negative bacteria recovered from prepreparation teat swabs were greatest for cows bedded with DBMS (1.0×10^4 cfu/swab) and RS (2.5×10^3 cfu/swab) and least for cows bedded with NES (5.8×10^2 cfu/swab). Median numbers of coliform and Klebsiella spp. recovered from prepreparation teat swabs were below the limit of detection for all cows except those bedded with DBMS. Numbers of SSLO recovered from prepreparation teat swabs were least for cows bedded with DBMS (6.9×10^3 cfu/swab) and greatest for cows bedded with RS (5.1×10^4 cfu/swab) or SBMS (1.6×10^5 cfu/swab). The numbers of all types of measured bacteria (total gram-negative, coliforms, Klebsiella spp., SSLO) on postpreparation teat swabs were reduced by up to 2.6 logs from numbers on prepreparation swabs, verifying effective preparation procedures. Significant correlations between bacterial counts of bedding samples and teat skin swabs were observed for several types of bacteria. Compared with other bedding types, the smallest numbers of gram-negative bacteria were recovered from NES, which may indicate that cows on NES have a reduced risk of exposure to pathogens that are typically a cause of clinical mastitis. In contrast, exposure to large numbers of SSLO was consistent across all bedding types and may indicate that the risk of subclinical mastitis typically associated with streptococci is less influenced by bedding type; however, significantly greater numbers of SSLO were found in SBMS than in other bedding types. These findings indicate that use of different bedding types results in exposure to different distributions of mastitis pathogens that may alter the proportion of etiologies of clinical mastitis, although the incidence rate of clinical mastitis did not differ among bedding types. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  19. Enabling multi-level relevance feedback on PubMed by integrating rank learning into DBMS.

    PubMed

    Yu, Hwanjo; Kim, Taehoon; Oh, Jinoh; Ko, Ilhwan; Kim, Sungchul; Han, Wook-Shin

    2010-04-16

    Finding relevant articles from PubMed is challenging because it is hard to express the user's specific intention in the given query interface, and a keyword query typically retrieves a large number of results. Researchers have applied machine learning techniques to find relevant articles by ranking them according to a learned relevance function. However, the process of learning and ranking is usually done offline, without being integrated with the keyword queries, and users have to provide a large number of training documents to reach a reasonable learning accuracy. This paper proposes a novel multi-level relevance feedback system for PubMed, called RefMed, which supports both ad-hoc keyword queries and multi-level relevance feedback in real time on PubMed. RefMed supports multi-level relevance feedback by using the RankSVM as the learning method, and thus it achieves higher accuracy with less feedback. RefMed "tightly" integrates the RankSVM into the RDBMS to support both keyword queries and multi-level relevance feedback in real time; the tight coupling of the RankSVM and the DBMS substantially improves the processing time. An efficient parameter selection method for the RankSVM is also proposed, which tunes the RankSVM parameter without performing validation. Thereby, RefMed achieves a high learning accuracy in real time without a validation process. RefMed is accessible at http://dm.postech.ac.kr/refmed. RefMed is the first multi-level relevance feedback system for PubMed, and it achieves a high accuracy with less feedback. It effectively learns an accurate relevance function from the user's feedback and efficiently processes the function to return relevant articles in real time.
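
    A common way to train a RankSVM-style ranker, plausibly similar in spirit to the learning step described above, is the pairwise transform: difference vectors of document pairs with different relevance grades are fed to a linear SVM. The sketch below uses scikit-learn; the features, grades, and dimensionality are hypothetical stand-ins, not RefMed's actual in-DBMS pipeline.

```python
import numpy as np
from itertools import combinations
from sklearn.svm import LinearSVC

# Hypothetical feedback: feature vectors for 6 articles and the multi-level
# relevance grades (0-2) a user assigned to them.
rng = np.random.default_rng(42)
X = rng.normal(size=(6, 5))
grades = np.array([2, 1, 0, 2, 0, 1])

# Pairwise transform: for every pair with different grades, the difference
# vector is a positive example if the first document is more relevant.
pairs, labels = [], []
for i, j in combinations(range(len(X)), 2):
    if grades[i] == grades[j]:
        continue
    pairs.append(X[i] - X[j])
    labels.append(1 if grades[i] > grades[j] else -1)

svm = LinearSVC(C=1.0).fit(np.array(pairs), np.array(labels))

# The learned weight vector scores unseen articles; sorting by score ranks them.
scores = X @ svm.coef_.ravel()
print("ranked article indices:", np.argsort(-scores))
```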

  20. Enabling multi-level relevance feedback on PubMed by integrating rank learning into DBMS

    PubMed Central

    2010-01-01

    Background Finding relevant articles from PubMed is challenging because it is hard to express the user's specific intention in the given query interface, and a keyword query typically retrieves a large number of results. Researchers have applied machine learning techniques to find relevant articles by ranking them according to a learned relevance function. However, the process of learning and ranking is usually done offline, without being integrated with the keyword queries, and users have to provide a large number of training documents to reach a reasonable learning accuracy. This paper proposes a novel multi-level relevance feedback system for PubMed, called RefMed, which supports both ad-hoc keyword queries and multi-level relevance feedback in real time on PubMed. Results RefMed supports multi-level relevance feedback by using the RankSVM as the learning method, and thus it achieves higher accuracy with less feedback. RefMed "tightly" integrates the RankSVM into the RDBMS to support both keyword queries and multi-level relevance feedback in real time; the tight coupling of the RankSVM and the DBMS substantially improves the processing time. An efficient parameter selection method for the RankSVM is also proposed, which tunes the RankSVM parameter without performing validation. Thereby, RefMed achieves a high learning accuracy in real time without a validation process. RefMed is accessible at http://dm.postech.ac.kr/refmed. Conclusions RefMed is the first multi-level relevance feedback system for PubMed, and it achieves a high accuracy with less feedback. It effectively learns an accurate relevance function from the user's feedback and efficiently processes the function to return relevant articles in real time. PMID:20406504

  1. Footprint Representation of Planetary Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Walter, S. H. G.; Gasselt, S. V.; Michael, G.; Neukum, G.

    The geometric outline of remote sensing image data, the so-called footprint, can be represented as a number of coordinate tuples. These polygons are associated with corresponding attribute information, such as orbit name, ground and image resolution, solar longitude and illumination conditions, to generate a powerful base for classification of planetary experiment data. Speed, handling and extended capabilities are the reasons for using geodatabases to store and access these data types. Techniques for such a spatial database of footprint data are demonstrated using the Relational Database Management System (RDBMS) PostgreSQL, spatially enabled by the PostGIS extension. As an example, footprints of the HRSC and OMEGA instruments, both onboard ESA's Mars Express orbiter, are generated and connected to attribute information. The aim is to provide high-resolution footprints of the OMEGA instrument to the science community for the first time and make them available for web-based mapping applications like the "Planetary Interactive GIS-on-the-Web Analyzable Database" (PIGWAD) produced by the USGS. Map overlays with HRSC or other instruments like MOC and THEMIS (footprint maps are already available for these instruments and can be integrated into the database) allow on-the-fly intersection and comparison as well as extended statistics of the data. Footprint polygons are generated one by one using standard software provided by the instrument teams, and attribute data is calculated and stored together with the geometric information. In the case of HRSC, the coordinates of the footprints are already available in the VICAR label of each image file. Using the VICAR RTL and PostgreSQL's libpq C library, they are loaded into the database using the Well-Known Text (WKT) notation of the Open Geospatial Consortium, Inc. (OGC). For the OMEGA instrument, image data is read using IDL routines developed and distributed by the OMEGA team. Image outlines are exported together with relevant attribute data to the industry-standard Shapefile format. These files are translated into a Structured Query Language (SQL) command sequence suitable for insertion into the PostGIS/PostgreSQL database using the shp2pgsql data loader provided by the PostGIS software. PostgreSQL's advanced features, such as geometry types, rules, operators and functions, allow complex spatial queries and on-the-fly processing of data at the DBMS level, e.g. generalisation of the outlines. Processing done by the DBMS, visualisation via GIS systems and utilisation for web-based applications like mapservers will be demonstrated.
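
    The WKT-based loading described above can be sketched in a few lines of Python with psycopg2. The connection string, the table name "footprints", and its columns are assumptions for illustration, while ST_GeomFromText and ST_Intersects are standard PostGIS functions; a real deployment would follow the libpq/shp2pgsql paths the abstract names.

```python
import psycopg2

# Hedged sketch of WKT-based footprint insertion into PostGIS. The DSN and
# the "footprints" table schema are hypothetical; a running PostGIS-enabled
# PostgreSQL instance is assumed.
conn = psycopg2.connect("dbname=planetary user=gis")
wkt = "POLYGON((10.1 -4.2, 10.9 -4.2, 10.9 -3.5, 10.1 -3.5, 10.1 -4.2))"

with conn, conn.cursor() as cur:
    cur.execute(
        """
        INSERT INTO footprints (orbit_name, ground_resolution_m, geom)
        VALUES (%s, %s, ST_GeomFromText(%s, 4326))
        """,
        ("h0123_0000", 12.5, wkt),
    )

# A typical follow-up spatial query: which footprints intersect a given one?
with conn, conn.cursor() as cur:
    cur.execute(
        "SELECT orbit_name FROM footprints "
        "WHERE ST_Intersects(geom, ST_GeomFromText(%s, 4326))", (wkt,))
    print(cur.fetchall())
```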

  2. [The RUTA project (Registro UTIC Triveneto ANMCO). An e-network for the coronary care units for acute myocardial infarction].

    PubMed

    Di Chiara, Antonio; Zonzin, Pietro; Pavoni, Daisy; Fioretti, Paolo Maria

    2003-06-01

    In the era of evidence-based medicine, monitoring adherence to guidelines is fundamental in order to verify diagnostic and therapeutic processes. Electronic, paperless databases allow higher data quality, lower costs and timely analysis, with overall advantages over traditional surveys. The RUTA project (acronym of Triveneto Registry of ANMCO CCUs) was designed in 1999 with the aim of creating an electronic network among the coronary care units of a large Italian region, for a permanent survey of patients admitted for acute myocardial infarction. Information ranges from the pre-hospital phase to discharge, including all relevant clinical and management variables. The database uses the Personal Oracle DBMS with PowerBuilder as the user interface, on a Windows platform. Anonymous data are sent to a central server.

  3. Influence of demineralized bone matrix's embryonic origin on bone formation: an experimental study in rats.

    PubMed

    Stavropoulos, Andreas; Kostopoulos, Lambros; Mardas, Nicolaos; Karring, Thorkild

    2003-01-01

    There are results suggesting that differences regarding bone-inducing potential, in terms of amount and/or rate of bone formation, exist between demineralized bone matrices (DBMs) of different embryonic origins. The aim of the present study was to examine whether the embryonic origin of DBM affects bone formation when used as an adjunct to guided tissue regeneration (GTR). Endomembranous (EM) and endochondral (ECH) DBMs were produced from calvarial and long bones of rats, respectively. Prior to the study the osteoinductive properties of the DBMs were confirmed in six rats following intramuscular implantation. Following surgical exposure of the mandibular ramus, a rigid hemispheric Teflon capsule loosely packed with a standardized quantity of DBM was placed with its open part facing the lateral surface of the ramus in both sides of the jaw in 30 rats. In one side of the jaw, chosen at random, the capsule was filled with EM-DBM, whereas in the other side ECH-DBM was used. Groups of 10 animals were sacrificed after healing periods of 1, 2, and 4 months, and undecalcified sections of the capsules were produced and subjected to histologic analysis and computer-assisted planimetric measurements. During the experiment increasing amounts of newly formed bone were observed inside the capsules in both sides of the animals' jaws. Limited bone formation was observed in the 1- and 2-month specimens, but after 4 months of healing, the newly formed bone in the ECH-DBM grafted sides occupied 59.1% (range 45.6-74.7%) of the area created by the capsule versus 46.9% (range 23.0-64.0%) in the EM-DBM grafted sides (p =.01). It is concluded that the embryonic origin of DBM influences bone formation by GTR and that ECH-DBM is superior to EM-DBM.

  4. HBVPathDB: a database of HBV infection-related molecular interaction network.

    PubMed

    Zhang, Yi; Bo, Xiao-Chen; Yang, Jing; Wang, Sheng-Qi

    2005-03-21

    To describe the interactions of molecules and genes between hepatitis B virus (HBV) and its host, in order to understand how viral and host genes and molecules are networked to form a biological system and to elucidate the mechanism of HBV infection. The knowledge of HBV infection-related reactions was organized into various kinds of pathways with carefully drawn graphs in HBVPathDB. Pathway information is stored in a relational database management system (DBMS), which is currently the most efficient way to manage large amounts of data, and querying is implemented with the Structured Query Language (SQL). The search engine is written in PHP with embedded SQL, and a web retrieval interface was developed for searching with Hypertext Markup Language (HTML). We present the first version of HBVPathDB, an HBV infection-related molecular interaction network database composed of 306 pathways involving 1,050 molecules. With carefully drawn graphs, pathway information stored in HBVPathDB can be browsed in an intuitive way. We have developed an easy-to-use interface for flexible access to the details of the database, with convenient software to query and browse its pathway information. Four search options (category search, gene search, description search, and unitized search) are supported by the search engine of the database. The database is freely available at http://www.bio-inf.net/HBVPathDB/HBV/. HBVPathDB already contains a considerable amount of HBV infection-related pathway information, which is suitable for in-depth analysis of the molecular interaction network of virus and host. HBVPathDB integrates pathway datasets with convenient software for query, browsing and visualization, which gives users more opportunity to identify key regulatory molecules as potential drug targets and to explore possible mechanisms of HBV infection based on gene expression datasets.

  5. In vivo bioluminescence imaging of cell differentiation in biomaterials: a platform for scaffold development.

    PubMed

    Bagó, Juli R; Aguilar, Elisabeth; Alieva, Maria; Soler-Botija, Carolina; Vila, Olaia F; Claros, Silvia; Andrades, José A; Becerra, José; Rubio, Nuria; Blanco, Jerónimo

    2013-03-01

    In vivo testing is a mandatory last step in scaffold development. Agile longitudinal noninvasive real-time monitoring of stem cell behavior in biomaterials implanted in live animals should facilitate the development of scaffolds for tissue engineering. We report on a noninvasive bioluminescence imaging (BLI) procedure for simultaneous monitoring of changes in the expression of multiple genes to evaluate scaffold performance in vivo. Adipose tissue-derived stromal mesenchymal cells were dually labeled with Renilla red fluorescent protein and firefly green fluorescent protein chimeric reporters regulated by cytomegalovirus and tissue-specific promoters, respectively. Labeled cells were induced to differentiate in vitro and in vivo by seeding in demineralized bone matrices (DBMs), and monitored by BLI. Imaging results were validated by RT-polymerase chain reaction and histological procedures. The proposed approach improves molecular imaging and measurement of changes in gene expression of cells implanted in live animals. This procedure, applicable to the simultaneous analysis of multiple genes from cells seeded in DBMs, should facilitate engineering of scaffolds for tissue repair.

  6. In Vivo Bioluminescence Imaging of Cell Differentiation in Biomaterials: A Platform for Scaffold Development

    PubMed Central

    Bagó, Juli R.; Aguilar, Elisabeth; Alieva, Maria; Soler-Botija, Carolina; Vila, Olaia F.; Claros, Silvia; Andrades, José A.; Becerra, José; Rubio, Nuria

    2013-01-01

    In vivo testing is a mandatory last step in scaffold development. Agile longitudinal noninvasive real-time monitoring of stem cell behavior in biomaterials implanted in live animals should facilitate the development of scaffolds for tissue engineering. We report on a noninvasive bioluminescence imaging (BLI) procedure for simultaneous monitoring of changes in the expression of multiple genes to evaluate scaffold performance in vivo. Adipose tissue-derived stromal mesenchymal cells were dually labeled with Renilla red fluorescent protein and firefly green fluorescent protein chimeric reporters regulated by cytomegalovirus and tissue-specific promoters, respectively. Labeled cells were induced to differentiate in vitro and in vivo by seeding in demineralized bone matrices (DBMs), and monitored by BLI. Imaging results were validated by RT-polymerase chain reaction and histological procedures. The proposed approach improves molecular imaging and measurement of changes in gene expression of cells implanted in live animals. This procedure, applicable to the simultaneous analysis of multiple genes from cells seeded in DBMs, should facilitate engineering of scaffolds for tissue repair. PMID:23013334

  7. Verification of Security Policy Enforcement in Enterprise Systems

    NASA Astrophysics Data System (ADS)

    Gupta, Puneet; Stoller, Scott D.

    Many security requirements for enterprise systems can be expressed in a natural way as high-level access control policies. A high-level policy may refer to abstract information resources, independent of where the information is stored; it controls both direct and indirect accesses to the information; it may refer to the context of a request, i.e., the request’s path through the system; and its enforcement point and enforcement mechanism may be unspecified. Enforcement of a high-level policy may depend on the system architecture and the configurations of a variety of security mechanisms, such as firewalls, host login permissions, file permissions, DBMS access control, and application-specific security mechanisms. This paper presents a framework in which all of these can be conveniently and formally expressed, a method to verify that a high-level policy is enforced, and an algorithm to determine a trusted computing base for each resource.

  8. [Implementation of a computerized pharmacological database for pediatric use].

    PubMed

    Currò, V; Grimaldi, V; Polidori, G; Cascioli, E; Lanni, R; De Luca, F; D'Atri, A; Bernabei, A

    1990-01-01

    The authors present a pharmacological database to support the teaching and care activity carried out in the Divisional Paediatric Ambulatory of the Catholic University of Rome. This database is included in an integrated system, ARPIA (Ambulatory and Research in Pediatric by Information Assistance), devoted to managing ambulatory paediatric data. ARPIA has been implemented using a relational DBMS that is inexpensive and widely available on personal computers. The database specifies: active ingredient and its code number, clinical uses, doses, contra-indications and precautions, and adverse effects, as well as the packagings available on the market. All this is shown on a single form that appears on the screen and allows a fast reading of the most important elements characterizing every drug. The included drugs can be searched on the basis of three different detailed lists: active ingredient, proprietary preparation and clinical use. It is also possible to obtain a complete report on the drugs requested by the user. The system allows the user, without modifying the program, to interact with the included data and modify each element of the form. The system also contains a quick-consultation handbook listing, for every active ingredient, the complete list of Italian proprietary medicines. This system aims to give better knowledge of the most commonly used drugs, not only to the paediatrician but also to the ambulatory health staff; to improve therapy; to promote more effective use of several pharmacological agents; and, above all, to serve as a training device not only for specialists but also for students.

  9. Optimizing Maintenance of Constraint-Based Database Caches

    NASA Astrophysics Data System (ADS)

    Klein, Joachim; Braun, Susanne

    Caching data reduces user-perceived latency and often enhances availability in case of server crashes or network failures. DB caching aims at local processing of declarative queries in a DBMS-managed cache close to the application. Query evaluation must produce the same results as if done at the remote database backend, which implies that all data records needed to process such a query must be present and controlled by the cache, i.e., to achieve “predicate-specific” loading and unloading of such record sets. Hence, cache maintenance must be based on cache constraints such that “predicate completeness” of the caching units currently present can be guaranteed at any point in time. We explore how cache groups can be maintained to provide the data currently needed. Moreover, we design and optimize loading and unloading algorithms for sets of records keeping the caching units complete, before we empirically identify the costs involved in cache maintenance.
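
    The notion of predicate completeness described above can be illustrated with a toy cache that only answers a query locally once the entire record set satisfying its predicate has been loaded. The sketch below is a deliberately simplified stand-in (the backend is a dict and the only predicate is equality on one column), not the paper's cache-group algorithms.

```python
# Toy sketch of "predicate completeness": a query may be answered from the
# cache only if every record satisfying its predicate is known to be cached.
# The backend is simulated by a dict; real systems query the remote DBMS.

BACKEND = {  # pretend remote table: customer rows keyed by id
    1: {"id": 1, "region": "EU"}, 2: {"id": 2, "region": "EU"},
    3: {"id": 3, "region": "US"},
}

class PredicateCache:
    def __init__(self):
        self.records = {}              # id -> row
        self.complete_regions = set()  # predicate values loaded in full

    def load_region(self, region):
        """Load the *entire* record set for region, making it complete."""
        for row in BACKEND.values():
            if row["region"] == region:
                self.records[row["id"]] = row
        self.complete_regions.add(region)

    def query_region(self, region):
        if region not in self.complete_regions:
            self.load_region(region)   # fault in the whole caching unit
        return [r for r in self.records.values() if r["region"] == region]

cache = PredicateCache()
print(cache.query_region("EU"))  # answered locally, complete by construction
```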

  10. The "Discouraged-Business-Major" Hypothesis: Policy Implications

    ERIC Educational Resources Information Center

    Marangos, John

    2012-01-01

    This paper uses a relatively large dataset of the stated academic major preferences of economics majors at a relatively large, not highly selective, public university in the USA to identify the "discouraged-business-majors" (DBMs). The DBM hypothesis addresses the phenomenon where students who are screened out of the business curriculum often…

  11. Using a probabilistic approach in an ecological risk assessment simulation tool: test case for depleted uranium (DU).

    PubMed

    Fan, Ming; Thongsri, Tepwitoon; Axe, Lisa; Tyson, Trevor A

    2005-06-01

    A probabilistic approach was applied in an ecological risk assessment (ERA) to characterize risk and address uncertainty, employing Monte Carlo simulations to assess probabilistic distributions of parameters and risk. This simulation tool (ERA) includes a Windows-based interface, an interactive and modifiable database management system (DBMS) that addresses a food web at trophic levels, and a comprehensive evaluation of exposure pathways. To illustrate this model, ecological risks from depleted uranium (DU) exposure at the US Army Yuma Proving Ground (YPG) and Aberdeen Proving Ground (APG) were assessed and characterized. Probabilistic distributions showed that at YPG, a reduction in plant root weight is considered likely to occur (98% likelihood) from exposure to DU; for most terrestrial animals, the likelihood of adverse reproductive effects ranges from 0.1% to 44%. However, for the lesser long-nosed bat, effects are expected to occur (>99% likelihood) through the reduction in size and weight of offspring. Based on available DU data for the firing range at APG, DU uptake will not likely affect survival of aquatic plants and animals (<0.1% likelihood). Based on field and laboratory studies conducted at APG and YPG on pocket mice, kangaroo rat, white-throated woodrat, deer, and milfoil, observed body burden concentrations fall within the distributions simulated at both sites.
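
    The Monte Carlo mechanics behind such a tool can be sketched briefly: sample uncertain exposure parameters from assumed distributions, propagate them through a dose model, and report the likelihood that an effects benchmark is exceeded. Every distribution, parameter, and benchmark below is a hypothetical placeholder, not data from the YPG/APG assessments.

```python
import numpy as np

# Minimal Monte Carlo sketch of a probabilistic exposure estimate, in the
# spirit of the tool described above. All numbers are hypothetical; a real
# ERA would draw them from site data and literature.
rng = np.random.default_rng(1)
n = 100_000

soil_conc = rng.lognormal(mean=3.0, sigma=0.8, size=n)   # mg U / kg soil
intake = rng.triangular(0.5, 1.0, 2.0, size=n)           # kg soil/day ingested
body_weight = rng.normal(25.0, 3.0, size=n)              # kg

dose = soil_conc * intake / body_weight                  # mg/kg-day
benchmark = 5.0                                          # hypothetical effects level

risk_prob = (dose > benchmark).mean()
print(f"P(dose exceeds benchmark) = {risk_prob:.1%}")
print(f"dose percentiles 5/50/95: {np.percentile(dose, [5, 50, 95]).round(2)}")
```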

  12. Benchmarking Using Basic DBMS Operations

    NASA Astrophysics Data System (ADS)

    Crolotte, Alain; Ghazal, Ahmad

    The TPC-H benchmark proved to be successful in the decision support area. Many commercial database vendors and their related hardware vendors used these benchmarks to show the superiority and competitive edge of their products. Over time, however, TPC-H became less representative of industry trends as vendors kept tuning their databases to this benchmark-specific workload. In this paper, we present XMarq, a simple benchmark framework that can be used to compare various software/hardware combinations. Our benchmark model is currently composed of 25 queries that measure the performance of basic operations such as scans, aggregations, joins and index access. This benchmark model is based on the TPC-H data model due to its maturity and well-understood data generation capability. We also propose metrics to evaluate single-system performance and compare two systems. Finally we illustrate the effectiveness of this model by showing experimental results comparing two systems under different conditions.
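
    The kind of micro-operation timing such a framework performs can be sketched with a few queries against an in-memory database. The schema and queries below are illustrative stand-ins, not the actual XMarq workload.

```python
import sqlite3, time

# Illustrative timing of the basic operations XMarq-style benchmarks measure
# (scan, aggregation, index lookup); schema and queries are invented.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE lineitem (id INTEGER PRIMARY KEY, qty INT, price REAL)")
con.executemany("INSERT INTO lineitem VALUES (?, ?, ?)",
                ((i, i % 50, i * 0.01) for i in range(200_000)))

def timed(label, sql):
    t0 = time.perf_counter()
    con.execute(sql).fetchall()
    print(f"{label:12s} {time.perf_counter() - t0:.4f}s")

timed("scan",       "SELECT * FROM lineitem WHERE price > 0")
timed("aggregate",  "SELECT qty, SUM(price) FROM lineitem GROUP BY qty")
timed("index seek", "SELECT * FROM lineitem WHERE id = 123456")
```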

  13. Extending GIS Technology to Study Karst Features of Southeastern Minnesota

    NASA Astrophysics Data System (ADS)

    Gao, Y.; Tipping, R. G.; Alexander, E. C.; Alexander, S. C.

    2001-12-01

    This paper summarizes ongoing research on karst feature distribution in southeastern Minnesota. The main goals of this interdisciplinary research are: 1) to look for large-scale patterns in the rate and distribution of sinkhole development; 2) to conduct statistical tests of hypotheses about the formation of sinkholes; 3) to create management tools for land-use managers and planners; and 4) to deliver geomorphic and hydrogeologic criteria for making scientifically valid land-use policies and ethical decisions in karst areas of southeastern Minnesota. Existing county and sub-county karst feature datasets of southeastern Minnesota have been assembled into a large GIS-based database capable of analyzing the entire data set. The central database management system (DBMS) is a relational GIS-based system interacting with three modules: GIS, statistical and hydrogeologic. ArcInfo and ArcView were used to generate a series of 2D and 3D maps depicting karst feature distributions in southeastern Minnesota. IRIS Explorer was used to produce high-quality 3D maps and animations using data exported from the GIS-based database. Nearest-neighbor analysis has been used to test sinkhole distributions in different topographic and geologic settings. All nearest-neighbor analyses to date indicate that sinkholes in southeastern Minnesota are not evenly distributed (i.e., they tend to be clustered). More detailed statistical methods such as cluster analysis, histograms, probability estimation, correlation and regression have been used to study the spatial distributions of some mapped karst features of southeastern Minnesota. A sinkhole probability map for Goodhue County has been constructed based on sinkhole distribution, bedrock geology, depth to bedrock, GIS buffer analysis and nearest-neighbor analysis. A series of karst features for Winona County, including sinkholes, springs, seeps, stream sinks and outcrops, has been mapped and entered into the Karst Feature Database of Southeastern Minnesota, which is being expanded to include all the mapped karst features of southeastern Minnesota. Air photos from the 1930s to the 1990s of the Spring Valley Cavern area in Fillmore County were scanned and geo-referenced into the GIS system. This technique has proved very useful for identifying sinkholes and studying the rate of sinkhole development.
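
    The nearest-neighbor test mentioned above is commonly computed as the Clark-Evans ratio R: the observed mean nearest-neighbor distance divided by the value expected under complete spatial randomness, 0.5/sqrt(density), with R < 1 indicating clustering. A hedged sketch with synthetic "sinkhole" coordinates follows; the study-area size and point pattern are invented.

```python
import numpy as np
from scipy.spatial import cKDTree

# Clark-Evans nearest-neighbor ratio, the kind of test used above to show
# sinkholes are clustered. Points are synthetic stand-ins for mapped
# sinkhole coordinates within a study area of known extent.
rng = np.random.default_rng(7)
centers = rng.uniform(0, 10_000, size=(20, 2))
points = np.vstack([c + rng.normal(0, 150, size=(25, 2)) for c in centers])

area = 10_000.0 ** 2                      # m^2, assumed rectangular study area
density = len(points) / area

# distance to the nearest *other* point (k=2: first neighbor is the point itself)
dists, _ = cKDTree(points).query(points, k=2)
r_obs = dists[:, 1].mean()
r_exp = 0.5 / np.sqrt(density)            # expected under complete spatial randomness

R = r_obs / r_exp
print(f"Clark-Evans R = {R:.2f} ({'clustered' if R < 1 else 'dispersed/random'})")
```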

  14. An Innovative Marketing Model: Promoting Technical Programs by Conducting One-Day Conferences.

    ERIC Educational Resources Information Center

    Petrosian, Anahid

    This document examines an innovative marketing strategy developed by South Texas Community College (STCC) to promote its technical programs. In 2000, STCC organized the "Business Conference Institute" to develop 1-day conferences with the Division of Business, Math & Sciences (DBMS). The creation of this Institute linked the College with the local…

  15. Synthesis and photophysical properties of halogenated derivatives of (dibenzoylmethanato)boron difluoride

    NASA Astrophysics Data System (ADS)

    Kononevich, Yuriy N.; Surin, Nikolay M.; Sazhnikov, Viacheslav A.; Svidchenko, Evgeniya A.; Aristarkhov, Vladimir M.; Safonov, Andrei A.; Bagaturyants, Alexander A.; Alfimov, Mikhail V.; Muzafarov, Aziz M.

    2017-03-01

    A series of (dibenzoylmethanato)boron difluoride (BF2DBM) derivatives with a halogen atom in one of the phenyl rings at the para-position were synthesized and used to elucidate the effects of changing the attached halogen atom on the photophysical properties of BF2DBM. The room-temperature absorption and fluorescence maxima of fluoro-, chloro-, bromo- and iodo-substituted derivatives of BF2DBM in THF are red-shifted by about 2-10 nm relative to the corresponding peaks of the parent BF2DBM. The fluorescence quantum yields of the halogenated BF2DBMs (except the iodinated derivative) are larger than that of the unsubstituted BF2DBM. All the synthesized compounds are able to form fluorescent exciplexes with benzene and toluene (emission maxima at λem = 433 and 445 nm, respectively). The conformational structure and electronic spectral properties of halogenated BF2DBMs have been modeled by DFT/TDDFT calculations at the PBE0/SVP level of theory. The structure and fluorescence spectra of exciplexes were calculated using the CIS method with empirical dispersion correction.

  16. Intervariability and intravariability of bone morphogenetic proteins in commercially available demineralized bone matrix products.

    PubMed

    Bae, Hyun W; Zhao, Li; Kanim, Linda E A; Wong, Pamela; Delamarter, Rick B; Dawson, Edgar G

    2006-05-20

    Enzyme-linked immunosorbent assay was used to detect bone morphogenetic proteins (BMPs) 2, 4, and 7 in 9 commercially available ("off the shelf") demineralized bone matrix (DBM) product formulations, using 3 different manufacturer's production lots of each DBM formulation. To evaluate and compare the quantity of BMPs among several different DBM formulations (inter-product variability), as well as examine the variability of these proteins in different production lots within the same DBM formulation (intra-product variability). DBMs are commonly used to augment available bone graft in spinal fusion procedures. Surgeons are presented with an ever-increasing variety of commercially available human DBMs from which to choose, yet there is limited information on a specific DBM product's osteoinductive efficacy, potency, and constancy. Protein extracts from each DBM sample were separately dialyzed 4 times against distilled water at 4 degrees C for 48 hours. The amounts of BMP-2, BMP-4, and BMP-7 were determined using enzyme-linked immunosorbent assay. The concentrations of detected BMP-2 and BMP-7 were low for all DBM formulations; only nanograms of BMP were extracted from each gram of DBM (20.2-120.6 ng BMP-2/g DBM product; 54.2-226.8 ng BMP-7/g DBM). The variability of BMP concentrations among different lots of the same DBM formulation, intra-product variability, was higher than the variability of concentrations among different DBM formulations, inter-product variability (coefficient of variation range BMP-2 [16.34% to 76.01%], P < 0.01; BMP-7 [3.71% to 82.08%], P < 0.001). BMP-4 was undetectable. The relative quantities of BMPs in DBMs are low, on the order of 1×10^-9 g of BMP/g of DBM. There is higher variability in the concentration of BMPs among 3 different lots of the same DBM formulation than among different DBM formulations. This variability calls into question DBM products' reliability and, possibly, their efficacy in providing consistent osteoinduction.

  17. Geologic data management at AVO: building authoritative coverage with radical availability (Invited)

    NASA Astrophysics Data System (ADS)

    Cameron, C.; Snedigar, S. F.; Nye, C. J.

    2009-12-01

    In 2002, the Alaska Volcano Observatory (AVO) began creating the Geologic Database of Information on Volcanoes in Alaska (GeoDIVA) to create a system that contains complete, flexible, timely, and accurate geologic and geographic information on Pleistocene and younger volcanoes in Alaska. This system was primarily intended to be a tool for scientific investigation, crisis response, and public information - delivered in a dynamic, digital format to both internal and external users. It is now the back-end of the AVO public website. GeoDIVA does not interface with our daily monitoring activities, however -- seismic and satellite data are handled by different database efforts. GeoDIVA also doesn’t store volcanic unrest data, although we hope WOVOdat will. GeoDIVA does include modules for the following datasets: bibliography (every subsequent piece of data in GeoDIVA is tied to a reference), basic volcano information (~137 edifices), historical eruption history information (~550 events), images (~17,000), sample information (~4400), geochemistry (~1500; population in progress), petrography (very early stages of data creation), sample storage (~14,000), and Quaternary vent information (~1200 vents). Modules in progress include GIS data, tephra data, and geochronologic data. In recent years, we have been doing maintenance work on older modules (for example, adding new references to the bibliography, and creating new queries and data fields in response to user feedback) as well as developing, designing, and populating new modules. Population can be quite time consuming, as there are no pre-compiled comprehensive existing sources for most information on Alaskan volcanoes, and we carefully reference each item. Newer modules also require more complex data arrangements than older modules. To meet the needs of a diverse group of users on widely varying computer platforms, GeoDIVA data is primarily stored in a MySQL DBMS; PostGIS/PostgreSQL are currently used to store and search spatial point data such as sample and volcano location. The spatial data storage system is evolving rapidly, and may change to a different DBMS in the future. Data upload is done via a web browser (one-record-at-a-time, tedious), or through automated .csv upload. Because we use open-source software and provide access through web browsers, AVO staff can view and update information from anywhere. In the future, we hope GeoDIVA will be a complete site for all geologic information about Alaskan volcanoes; because all data points are linked together (by references, sample IDs, volcanoes, geologists, etc.) we’ll be able to draw a box on a map and retrieve information on edifices, vents, samples, and all associated metadata, images, references, analytical data, and accompanying GIS files. As we look toward our goals, remaining challenges include: linking our data with other national and international efforts, creating easier ways for all to upload data, GIS development, and balancing the speed of new module development with the need for older module maintenance.

  18. Can we use GIS as a historic city's heritage management system? The case study of Hermoupolis-Syros

    NASA Astrophysics Data System (ADS)

    Chatzigrigoriou, Pavlos

    2016-08-01

    Because of the severe economic crisis, Greek historic heritage is at risk. Historic cities such as Hermoupolis were dealing with this risk years before the crisis. The current situation demanded drastic action, with innovative low-cost ideas. The historic building stock in Hermoupolis counts more than 1,200 buildings. By recording their pathology, the GIS and the DBMS "HERMeS", with the appropriate algorithms, identify the historic buildings at risk. In the first application of the system, 160 such buildings were identified, with 2.4 historic buildings collapsing every year. The prioritization of interventions in these buildings is critical, as it is not possible to lower the collapse risk simultaneously in 160 buildings, nor can the interventions be judged solely by the reactions of local residents. Bearing in mind that, given the current economic conditions, one has to make the best use of the funds available for this purpose, the relevant decision requires a multi-criteria analysis method for prioritizing interventions. Specifically, the analysis takes into account the risk of collapse of each building in connection with a series of other variables, such as the role of the building in Hermoupolis, its position in the city, its influence on other areas of interest, its social impact, etc. The final result is a catalogue of historic buildings with a point system, which reflects the risk of losing each building. The point system leads to a Conservation Plan for the city of Hermoupolis, giving the hierarchy of interventions that must be carried out in order to save the maximum architectural heritage with the minimum funds, deferring the risk of collapse. In 2015, the EU and Europa Nostra awarded the above-mentioned project a prize in the category of "Research and Digitization".
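    The point system described above is, at heart, a weighted multi-criteria score. The following toy sketch shows one way such a score could be computed; the criteria names and weights are invented for illustration and are not the actual HERMeS model:

    # Toy sketch of a weighted multi-criteria score for prioritizing
    # building interventions. Criteria and weights are invented; they are
    # not the actual HERMeS model.
    WEIGHTS = {
        "collapse_risk": 0.5,   # from the recorded building pathology
        "urban_role": 0.2,      # role/position of the building in the city
        "influence": 0.2,       # influence on other areas of interest
        "social_impact": 0.1,
    }

    def priority_score(building):
        # Each criterion is assumed pre-normalized to [0, 1]; higher = more urgent.
        return sum(WEIGHTS[k] * building[k] for k in WEIGHTS)

    buildings = [
        {"id": "A12", "collapse_risk": 0.9, "urban_role": 0.4, "influence": 0.7, "social_impact": 0.5},
        {"id": "B03", "collapse_risk": 0.6, "urban_role": 0.9, "influence": 0.3, "social_impact": 0.8},
    ]
    # Highest score first: the hierarchy of interventions.
    for b in sorted(buildings, key=priority_score, reverse=True):
        print(b["id"], round(priority_score(b), 3))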

  19. Introduction to Data Acquisition 3. Let’s Acquire Data!

    NASA Astrophysics Data System (ADS)

    Nakanishi, Hideya; Okumura, Haruhiko

    In fusion experiments, diagnostic control and logging devices are usually connected through a field bus, e.g. GP-IB. Internet technologies are often applied for their remote operation. All equipment and digitizers are driven by pre-programmed sequences, in which clocks and triggers give the essential timing for data acquisition. The data production rate and volume must be checked against the transfer and storage rates. To store binary raw data safely, journaling file systems are preferably used, together with redundant disks (RAID) or a mirroring mechanism such as “rsync”. A proper choice of data compression method not only reduces the storage size but also improves I/O throughput. A DBMS is also applicable for quick search of, and security around, tabular data.
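    The claim that compression improves I/O throughput follows from trading CPU time for bytes moved: when the disk or network is the bottleneck, writing fewer bytes wins. A minimal sketch, using zlib purely as a stand-in for whatever codec a facility would actually choose:

    # Compressing highly redundant raw data before storage reduces bytes
    # written, which raises effective I/O throughput whenever the disk or
    # network, not the CPU, is the bottleneck. zlib is a stand-in codec.
    import time
    import zlib

    raw = bytes(8 * 1024 * 1024)          # 8 MiB of zero-filled "digitizer" data
    t0 = time.perf_counter()
    packed = zlib.compress(raw, level=1)  # fast, low-ratio setting
    t1 = time.perf_counter()

    print(f"compressed {len(raw)} -> {len(packed)} bytes "
          f"(ratio {len(raw) / len(packed):.0f}x) in {t1 - t0:.3f} s")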

  20. Large Declarative Memories in ACT-R

    DTIC Science & Technology

    2009-12-01

    ...containing the persistent DM of interest; :pdm-user -- username required by the PostgreSQL DBMS for DB access; :pdm-passwd -- password required by the PostgreSQL... "model-v5-DM" :pdm-user "Scott" :pdm-passwd "Open_Seseme" :pdm-resets-clear-db T :pdm-add-dm-serializes T :pdm-active T ... Figure 1: Activating and

  1. Integrated Array/Metadata Analytics

    NASA Astrophysics Data System (ADS)

    Misev, Dimitar; Baumann, Peter

    2015-04-01

    Data comes in various forms and types, and integration usually presents a problem that is often simply ignored or solved with ad-hoc solutions. Multidimensional arrays are a ubiquitous data type that we find at the core of virtually all science and engineering domains, as sensor, model, image, and statistics data. Naturally, arrays are richly described by and intertwined with additional metadata (alphanumeric relational data, XML, JSON, etc.). Database systems, however, a fundamental building block of what we call "Big Data", lack adequate support for modelling and expressing these array data/metadata relationships. Array analytics is hence primitive or entirely absent in modern relational DBMSs. Recognizing this, we extended SQL with a new SQL/MDA part, seamlessly integrating multidimensional array analytics into the standard database query language. We demonstrate the benefits of SQL/MDA with real-world examples executed in ASQLDB, an open-source mediator system based on HSQLDB and rasdaman that already implements SQL/MDA.
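    The array/metadata split the authors describe can be made concrete with a small sketch: metadata sits in a relational table, the array sits outside the DBMS, and application code must stitch the two together by hand; SQL/MDA's contribution is to push that coupling into the query language itself. Schema and names below are purely illustrative:

    # Sketch of the array/metadata integration problem: metadata lives in a
    # relational table, the array lives outside the DBMS, and application
    # code must stitch them together by hand. Schema is illustrative only.
    import sqlite3
    import numpy as np

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE scenes (id INTEGER PRIMARY KEY, sensor TEXT, cloud_pct REAL)")
    conn.executemany("INSERT INTO scenes VALUES (?, ?, ?)",
                     [(1, "sensorA", 3.0), (2, "sensorB", 85.0)])

    # Stand-in for externally stored arrays keyed by scene id.
    arrays = {1: np.random.rand(64, 64), 2: np.random.rand(64, 64)}

    # "Mean value of every low-cloud scene" needs a relational selection
    # followed by array processing outside the database.
    ids = [row[0] for row in
           conn.execute("SELECT id FROM scenes WHERE cloud_pct < 10")]
    for i in ids:
        print(i, arrays[i].mean())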

  2. Query3d: a new method for high-throughput analysis of functional residues in protein structures.

    PubMed

    Ausiello, Gabriele; Via, Allegra; Helmer-Citterich, Manuela

    2005-12-01

    The identification of local similarities between two protein structures can provide clues to a common function. Many different methods exist for searching for similar subsets of residues in proteins of known structure. However, the lack of functional and structural information on single residues, together with the low level of integration of this information in comparison methods, is a limitation that prevents these methods from being fully exploited in high-throughput analyses. Here we describe Query3d, a program that is both a structural DBMS (Database Management System) and a local comparison method. The method maintains a copy of all the residues of the Protein Data Bank, annotated with a variety of functional and structural information. New annotations can be easily added from a variety of methods and known databases. The algorithm makes it possible to create complex queries based on the residues' function and then to compare only subsets of the selected residues. Functional information is also essential to speed up the comparison and the analysis of the results. With Query3d, users can easily obtain statistics on how many and which residues share certain properties in all proteins of known structure. At the same time, the method also finds their structural neighbours in the whole PDB. Programs and data can be accessed through the PdbFun web interface.

  3. Query3d: a new method for high-throughput analysis of functional residues in protein structures

    PubMed Central

    Ausiello, Gabriele; Via, Allegra; Helmer-Citterich, Manuela

    2005-01-01

    Background The identification of local similarities between two protein structures can provide clues to a common function. Many different methods exist for searching for similar subsets of residues in proteins of known structure. However, the lack of functional and structural information on single residues, together with the low level of integration of this information in comparison methods, is a limitation that prevents these methods from being fully exploited in high-throughput analyses. Results Here we describe Query3d, a program that is both a structural DBMS (Database Management System) and a local comparison method. The method maintains a copy of all the residues of the Protein Data Bank, annotated with a variety of functional and structural information. New annotations can be easily added from a variety of methods and known databases. The algorithm makes it possible to create complex queries based on the residues' function and then to compare only subsets of the selected residues. Functional information is also essential to speed up the comparison and the analysis of the results. Conclusion With Query3d, users can easily obtain statistics on how many and which residues share certain properties in all proteins of known structure. At the same time, the method also finds their structural neighbours in the whole PDB. Programs and data can be accessed through the PdbFun web interface. PMID:16351754
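    The "query, then compare" workflow can be pictured as an ordinary relational filter applied before any expensive structural comparison. A toy sketch with an invented residue table (not Query3d's actual schema or API):

    # Toy sketch of the "query, then compare" idea: filter annotated
    # residues relationally before any structural comparison. The table is
    # invented; it is not Query3d's actual schema.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE residues
                    (pdb_id TEXT, chain TEXT, resnum INTEGER,
                     amino TEXT, is_catalytic INTEGER)""")
    conn.executemany("INSERT INTO residues VALUES (?, ?, ?, ?, ?)", [
        ("1abc", "A", 57, "HIS", 1),
        ("1abc", "A", 102, "ASP", 1),
        ("1abc", "A", 195, "SER", 1),
        ("2xyz", "B", 12, "GLY", 0),
    ])

    # Only catalytic residues enter the (expensive) comparison step.
    subset = conn.execute(
        "SELECT pdb_id, chain, resnum FROM residues WHERE is_catalytic = 1"
    ).fetchall()
    print(f"{len(subset)} residues selected for comparison:", subset)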

  4. USL/DBMS NASA/PC R and D project C programming standards

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Moreau, Dennis R.

    1984-01-01

    A set of programming standards intended to promote reliability, readability, and portability of C programs written for PC research and development projects is established. These standards must be adhered to except where reasons for deviation are clearly identified and approved by the PC team. Any approved deviation from these standards must also be clearly documented in the pertinent source code.

  5. Zero tolerance for incorrect data: Best practices in SQL transaction programming

    NASA Astrophysics Data System (ADS)

    Laiho, M.; Skourlas, C.; Dervos, D. A.

    2015-02-01

    DBMS products differ in the way they support even the basic SQL transaction services. In this paper, a framework of best practices in SQL transaction programming is given and discussed. The SQL developers are advised to experiment with and verify the services supported by the DBMS product used. The framework has been developed by DBTechNet, a European network of teachers, trainers and ICT professionals. A course module on SQL transactions, offered by the LLP "DBTech VET Teachers" programme, is also presented and discussed. Aims and objectives of the programme include the introduction of the topics and content of SQL transactions and concurrency control to HE/VET curricula and addressing the need for initial and continuous training on these topics to in-company trainers, VET teachers, and Higher Education students. An overview of the course module, its learning outcomes, the education and training (E&T) content, virtual database labs with hands-on self-practicing exercises, plus instructions for the teacher/trainer on the pedagogy and the usage of the course modules' content are briefly described. The main principle adopted is to "Learn by verifying in practice" and the transactions course motto is: "Zero Tolerance for Incorrect Data".
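    The "zero tolerance" discipline amounts to making every unit of work atomic and verifying the behavior on the specific DBMS product at hand. A minimal sketch of the explicit commit/rollback pattern, using Python's bundled sqlite3 purely for self-containment; the account schema is invented, and the pattern rather than the schema is the point:

    # Minimal sketch of defensive SQL transaction programming: one atomic
    # unit of work with explicit commit/rollback. The schema is invented.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE accounts "
                 "(id INTEGER PRIMARY KEY, balance INTEGER NOT NULL CHECK (balance >= 0))")
    conn.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 100), (2, 50)])
    conn.commit()

    def transfer(conn, src, dst, amount):
        """Move funds atomically; any failure rolls the whole unit back."""
        try:
            cur = conn.cursor()
            cur.execute("UPDATE accounts SET balance = balance - ? WHERE id = ?", (amount, src))
            cur.execute("UPDATE accounts SET balance = balance + ? WHERE id = ?", (amount, dst))
            conn.commit()
        except sqlite3.Error:
            conn.rollback()  # zero tolerance: never leave a half-applied transfer
            raise

    transfer(conn, 1, 2, 30)
    print(conn.execute("SELECT * FROM accounts").fetchall())  # [(1, 70), (2, 80)]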

  6. GeoNetGIS: a Geodetic Network Geographical Information System to manage GPS networks in seismic and volcanic areas

    NASA Astrophysics Data System (ADS)

    Cristofoletti, P.; Esposito, A.; Anzidei, M.

    2003-04-01

    This paper presents the methodologies and issues involved in the use of GIS techniques to manage geodetic information derived from networks in seismic and volcanic areas. The organization and manipulation of different geodetic, geological, and seismic databases pose a new challenge in the interpretation of information that has several dimensions, including spatial and temporal variations; the flexibility and broad range of tools available in GeoNetGIS also make it an attractive platform for earthquake risk assessment. During the last decade, the use of geodetic networks based on the Global Positioning System devoted to geophysical applications, especially for crustal deformation monitoring in seismic and volcanic areas, increased dramatically. The large amount of data provided by these networks, combined with different and independent observations, such as the epicentre distribution of recent and historical earthquakes, geological and structural data, and photo interpretation of aerial and satellite images, can aid in the detection and parameterization of seismogenic sources. In particular, we applied our geodetically oriented GIS to a new GPS network recently set up and surveyed in the Central Apennine region: the CA-GeoNet. GeoNetGIS is designed to analyze GPS sources in three and four dimensions and to improve crustal deformation analysis and interpretation related to tectonic structures and seismicity. It manages several databases (DBMS) consisting of different classes, such as Geodesy, Topography, Seismicity, Geology, Geography, and Raster Images, administered according to Thematic Layers. GeoNetGIS represents a powerful research tool, allowing the analysis of all data layers to be joined and the different databases to be integrated, which aids in identifying the activity of known faults or structures and suggests new evidence of active tectonics. The new approach to data integration offered by GeoNetGIS allows us to create and deliver a wide range of maps and digital, 3-dimensional data analysis applications for geophysical users and civil defense agencies, distributing them on the World Wide Web or over wireless connections to PDA computers. It runs on a PC platform under the Windows 2000 Professional OS and is based on ESRI ArcGIS 8.2 software.

  7. A comparative evaluation plan for the Maintenance, Inventory, and Logistics Planning (MILP) System Human-Computer Interface (HCI)

    NASA Technical Reports Server (NTRS)

    Overmyer, Scott P.

    1993-01-01

    The primary goal of this project was to develop a tailored and effective approach to the design and evaluation of the human-computer interface (HCI) to the Maintenance, Inventory and Logistics Planning (MILP) System in support of the Mission Operations Directorate (MOD). An additional task was to assist in the review of Ground Displays for Space Station Freedom (SSF) by attending the Ground Displays Interface Group (GDIG) and commenting on the preliminary design for these displays. Based upon data gathered over the 10-week period, this project hypothesized that the proper HCI concept for navigating through maintenance databases for large space vehicles is one based upon a spatial, direct-manipulation approach. This dialogue style can then be coupled with a traditional text-based DBMS after the user has determined the general nature and location of the information needed. This conclusion is in contrast with the currently planned HCI for MILP, which uses a traditional form-fill-in dialogue style for all data access and retrieval. In order to resolve this difference in HCI and dialogue styles, it is recommended that a comparative evaluation be performed that combines subjective and objective metrics to determine the optimal (performance-wise) and preferred approach for end users. The proposed plan has been outlined in the previous paragraphs and is available in its entirety in the Technical Report associated with this project. Further, it is suggested that several of the more useful features of the Maintenance Operations Management System (MOMS), especially those developed by the end-users, be incorporated into MILP to save development time and money.

  8. An offline-online Web-GIS Android application for fast data acquisition of landslide hazard and risk

    NASA Astrophysics Data System (ADS)

    Olyazadeh, Roya; Sudmeier-Rieux, Karen; Jaboyedoff, Michel; Derron, Marc-Henri; Devkota, Sanjaya

    2017-04-01

    Regional landslide assessments and mapping have been effectively pursued by research institutions, national and local governments, non-governmental organizations (NGOs), and different stakeholders for some time, and a wide range of methodologies and technologies have consequently been proposed. Land-use maps and hazard event inventories are mostly created from remote-sensing data, which is subject to difficulties, such as accessibility and terrain, that need to be overcome. Likewise, landslide data acquisition with field navigation can improve the accuracy of databases and analyses. Open-source Web and mobile GIS tools can be used for improved ground-truthing of critical areas to improve the analysis of hazard patterns and triggering factors. This paper reviews the implementation and selected results of a secure mobile-mapping application called ROOMA (Rapid Offline-Online Mapping Application) for the rapid collection of landslide hazard and risk data. This prototype assists the quick creation of landslide inventory maps (LIMs) by collecting information on the type, feature, volume, date, and patterns of landslides using open-source Web-GIS technologies such as Leaflet maps, Cordova, GeoServer, PostgreSQL as the DBMS (database management system), and PostGIS as its plug-in for spatial database management. The application comprises Leaflet maps coupled with satellite images as a base layer, drawing tools, geolocation (using GPS and the Internet), photo mapping, and event clustering. All features and information are recorded into a GeoJSON text file in the offline version (Android) and subsequently uploaded in the online mode (using any browser) when Internet connectivity is available. Finally, the events can be accessed and edited after approval by an administrator and then visualized by the general public.
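    The offline step -- record a field observation locally as GeoJSON, upload once connectivity returns -- can be sketched in a few lines; the feature properties below are invented and are not ROOMA's actual attribute schema:

    # Sketch of the offline step: serialize a field observation as a
    # GeoJSON Feature for later upload. Property names are invented.
    import json

    feature = {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [85.3240, 27.7172]},  # lon, lat
        "properties": {
            "hazard": "landslide",
            "volume_m3": 1200,
            "date": "2016-07-14",
            "photo": "IMG_0042.jpg",
        },
    }
    collection = {"type": "FeatureCollection", "features": [feature]}

    with open("pending_upload.geojson", "w") as f:
        json.dump(collection, f, indent=2)
    # Once online, the file's features would be POSTed to the server and,
    # after administrator approval, inserted into PostgreSQL/PostGIS.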

  9. Comparison between two different methods for evaluating rumen papillae measures related to different diets.

    PubMed

    Scocco, Paola; Brusaferro, Andrea; Catorci, Andrea

    2012-07-01

    Although the Geographical Information System (GIS), which integrates computer-assisted design (CAD) drawing and relational databases (database management systems, DBMS), is best known for applications in geographical and planning cartography, it can also use many other kinds of information concerning the territory. Five years ago, a multidisciplinary study was initiated to use GIS to integrate environmental and ecological data with findings on animal health, ethology, and anatomy. The present study is chiefly aimed at comparing two different methods for measuring the absorptive surface of rumen papillae. To this end, 21 female sheep (Ovis aries) on different alimentary regimes (e.g., milk and forage mixed diet, early herbaceous diet, dry hay diet, and fresh hay diet at the maximum of pasture flowering and at the maximum of pasture dryness) were used; after slaughtering, 20 papillae were randomly removed from each sample collected from four indicator regions of the rumen wall, placed near a metric reference, and digitally photographed. The images were processed with the ArcGIS™ software to calculate the area of rumen papillae by means of GIS and to measure their mid-level width and length to calculate the papillae area as previously performed with a different method. Spatial measurements were analyzed using univariate and multivariate methods. This work demonstrates that the GIS methodology can be efficiently used for measuring the absorptive surface of rumen papillae. In addition, GIS proved to be a rapid, precise, and objective tool compared with the previously used method. Copyright © 2012 Wiley Periodicals, Inc.

  10. New Capabilities in the Astrophysics Multispectral Archive Search Engine

    NASA Astrophysics Data System (ADS)

    Cheung, C. Y.; Kelley, S.; Roussopoulos, N.

    The Astrophysics Multispectral Archive Search Engine (AMASE) uses object-oriented database techniques to provide a uniform multi-mission and multi-spectral interface to search for data in the distributed archives. We describe our experience of porting AMASE from the Illustra object-relational DBMS to the Informix Universal Data Server. New capabilities and utilities have been developed, including a spatial datablade that supports nearest-neighbor queries.

  11. Data Warehouse Architecture for Army Installations

    DTIC Science & Technology

    1999-11-01

    Laboratory (CERL). Dr. Moonja Kim is Chief, CN-B and Dr. John Bandy is Chief, CN. The technical editor was Linda L. Wheatley, Information Technology... 1994. Devlin, Barry, Data Warehouse: From Architecture to Implementation (Addison-Wesley, 1997). Inmon, W.H., Building the Data Warehouse (John... Magazine, August 1997. Kimball, Ralph, "Digging into Data Mining," DBMS Magazine, October 1997. Lewison, Lisa, "Data Mining: Intelligent Technology

  12. Anatomical characteristics of teats and premilking bacterial counts of teat skin swabs of primiparous cows exposed to different types of bedding.

    PubMed

    Guarín, J F; Baumberger, C; Ruegg, P L

    2017-02-01

    Bacterial populations of teat skin are associated with risk of intramammary infection and may be influenced by anatomical characteristics of teats. The objective of this study was to evaluate associations of selected anatomical characteristics of teats with bacterial counts of teat skin of cows exposed to different types of bedding. Primarily primiparous Holstein cows (n = 128) were randomly allocated to 4 pens within a single barn. Each pen contained 1 type of bedding [new sand (NES), recycled sand (RS), deep-bedded manure solids (DBMS), and shallow-bedded manure solids over foam core mattresses (SBMS)]. During a single farm visit, udders (n = 112) were scored for hygiene and 1 front (n = 112) and 1 rear teat (n = 111) of each enrolled cow were scored for hyperkeratosis (HK). Teat length, teat barrel diameter, and teat apex diameter were measured and teat skin swabs were systematically collected for microbiological analysis. Linear type evaluation data for the udder of each cow were retrieved. Teat position (front or rear) was associated with occurrence of clinical mastitis during the 12 mo before the farm visit, with more cases occurring in front quarters. The proportion of udders that were classified as clean (score 1 or 2) was 68, 82, 54, and 95% for cows housed in pens containing NES, RS, SBMS, and DBMS, respectively. No association was found between HK score and teat position, and no association was found between HK score and teat skin bacterial count. Bacterial counts of teat skin swabs from front teats of cows in pens containing RS and SBMS were significantly less than those of rear teats of cows in pens containing DBMS or NES. Teat skin bacterial counts were significantly greater for swabs obtained from teats of cows with udder hygiene scores of 3 and 4 as compared with swabs obtained from cows with cleaner udders. Of all udder conformation traits evaluated, only narrower rear teat placement was positively associated with bacterial counts on teat skin. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  13. A Methodology for Benchmarking Relational Database Machines,

    DTIC Science & Technology

    1984-01-01

    user benchmarks is to compare the multiple users to the best-case performance. The data for each query classification coll and the performance... called a benchmark. The term benchmark originates from the markers used by surveyors in establishing common reference points for their measure... formatted databases. In order to further simplify the problem, we restrict our study to those DBMs which support the relational model. A survey

  14. Endochondral vs. intramembranous demineralized bone matrices as implants for osseous defects.

    PubMed

    Nidoli, M C; Nielsen, F F; Melsen, B

    1999-05-01

    This study focuses on the difference in regenerative capacity between endochondral and intramembranous demineralized bone matrices (DBMs) when implanted into bony defects. It also focuses on the possible influence of the type of skeletal recipient site (orthotopic or heterotopic). Of 34 Wistar rats, 10 served as a source of DBM, and 24 were divided into two groups of 12 animals. In group A, identical defects were produced in the parietal bones, whereas in group B the defects were produced in each radius. The right defects were implanted with endochondral DBM and the left defects were implanted with intramembranous DBM. Descriptive and/or histomorphometric analyses were performed by means of light and polarized microscopy, and radiography (group B). Right and left data were compared to disclose differences in bone-healing capacity. The quantitative results demonstrated that endochondral DBM displays a greater regenerative capacity than intramembranous DBM when implanted heterotopically. The different clinical performances of endochondral and intramembranous bone grafts might be explained on the basis of the mechanical rather than the osteoinductive principle. The qualitative results suggest that the type of bone deposition induced by the DBMs is not related to the type of implanted DBM. Recipient site characteristics and/or environmental factors seem decisive in the occurrence of either type of ossification.

  15. Database Objects vs Files: Evaluation of alternative strategies for managing large remote sensing data

    NASA Astrophysics Data System (ADS)

    Baru, Chaitan; Nandigam, Viswanath; Krishnan, Sriram

    2010-05-01

    Increasingly, the geoscience user community expects modern IT capabilities to be available in service of their research and education activities, including the ability to easily access and process large remote sensing datasets via online portals such as GEON (www.geongrid.org) and OpenTopography (opentopography.org). However, serving such datasets via online data portals presents a number of challenges. In this talk, we will evaluate the pros and cons of alternative storage strategies for the management and processing of such datasets, using binary large object (BLOB) implementations in database systems versus implementation as files in the Hadoop Distributed File System (HDFS). The storage and I/O requirements for providing online access to large datasets dictate the need for declustering data across multiple disks, for capacity as well as bandwidth and response-time performance. This requires partitioning larger files into sets of smaller files, and brings the concomitant requirement of managing large numbers of files. Storing these sub-files as BLOBs in a shared-nothing database implemented across a cluster provides the advantage that all the distributed storage management is done by the DBMS. Furthermore, subsetting and processing routines can be implemented as user-defined functions (UDFs) on these BLOBs and would run in parallel across the set of nodes in the cluster. On the other hand, there are both storage overheads and constraints, and software licensing dependencies created by such an implementation. Another approach is to store the files in an external filesystem with pointers to them from within database tables. The filesystem may be a regular UNIX filesystem, a parallel filesystem, or HDFS. In the HDFS case, HDFS would provide the file management capability, while the subsetting and processing routines would be implemented as Hadoop programs using the MapReduce model. Hadoop and its related software libraries are freely available. Another consideration is the strategy used for partitioning large data collections, and large datasets within collections, using round-robin vs hash vs range partitioning methods. Each has different characteristics in terms of spatial locality of data and the resultant degree of declustering of the computations on the data. Furthermore, we have observed that, in practice, there can be large variations in the frequency of access to different parts of a large data collection and/or dataset, thereby creating "hotspots" in the data. We will evaluate the ability of the different approaches and alternative strategies to deal effectively with such hotspots.
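    The partitioning trade-off mentioned above can be made concrete: round-robin and hash placement spread load evenly but scatter spatially adjacent chunks, while range placement preserves locality and is therefore the most prone to hotspots. A toy sketch of the three assignment rules (node count and keys invented):

    # Toy sketch of three declustering rules for placing data chunks on
    # nodes. Round-robin and hash spread load but scatter neighbors; range
    # partitioning preserves spatial locality, hence risks hotspots.
    N_NODES = 4

    def round_robin(i):            # i = arrival order of the chunk
        return i % N_NODES

    def hash_part(key):            # key = e.g. a tile identifier
        # Python's str hash is salted per process; a real system would use
        # a stable hash function instead.
        return hash(key) % N_NODES

    def range_part(x, lo=0.0, hi=100.0):   # x = e.g. the tile's min easting
        width = (hi - lo) / N_NODES
        return min(int((x - lo) // width), N_NODES - 1)

    chunks = [("tile_%03d" % i, float(i)) for i in range(8)]
    for i, (key, x) in enumerate(chunks):
        print(key, "rr:", round_robin(i), "hash:", hash_part(key), "range:", range_part(x))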

  16. GIS embedded hydrological modeling: the SID&GRID project

    NASA Astrophysics Data System (ADS)

    Borsi, I.; Rossetto, R.; Schifani, C.

    2012-04-01

    The SID&GRID research project, started in April 2010 and funded by Regione Toscana (Italy) under the POR FSE 2007-2013, aims to develop a Decision Support System (DSS) for water resource management and planning based on open-source and public-domain solutions. In order to quantitatively assess water availability in space and time and to support planning decision processes, the SID&GRID solution consists of hydrological models (coupling 3D existing and newly developed surface-water, groundwater, and unsaturated-zone modeling codes) embedded in a GIS interface, applications and library, where all the input and output data are managed by means of a DataBase Management System (DBMS). A graphical user interface (GUI) to manage, analyze and run the SID&GRID hydrological models, based on the open-source gvSIG GIS framework (Asociación gvSIG, 2011), and a Spatial Data Infrastructure to share and interoperate with distributed geographical data are being developed. Such a GUI is conceived as a "master control panel" able to guide the user through pre-processing spatial and temporal data, running the hydrological models, and analyzing the outputs. To achieve the above-mentioned goals, the following codes have been selected and are being integrated: 1. PostgreSQL/PostGIS (PostGIS, 2011) for the geo-database management system; 2. gvSIG with the Sextante (Olaya, 2011) geo-algorithm library and GRASS tools (GRASS Development Team, 2011) for the desktop GIS; 3. GeoServer and GeoNetwork to share and discover spatial data on the web according to Open Geospatial Consortium standards; 4. new tools based on the Sextante geo-algorithm framework; 5. the MODFLOW-2005 (Harbaugh, 2005) groundwater modeling code; 6. MODFLOW-LGR (Mehl and Hill, 2005) for local grid refinement; 7. VSF (Thoms et al., 2006) for the variably saturated flow component; 8. newly developed routines for overland flow; 9. new algorithms in Jython, integrated in gvSIG, to compute the net rainfall rate reaching the soil surface as input for the unsaturated/saturated flow model. At this stage of the research (which will end in April 2013), two primary components of the master control panel are being developed: i. a SID&GRID toolbar integrated into the gvSIG map context; ii. a new Sextante set of geo-algorithms to pre- and post-process the spatial data to run the hydrological models. The groundwater part of the code has been fully integrated and tested, and 3D visualization tools are being developed. The LGR capability has been extended to the 3D solution of the Richards equation in order to solve the unsaturated zone in detail where required. To stay updated about the project, please follow us at the website: http://ut11.isti.cnr.it/SIDGRID/

  17. Chapter 11: Web-based Tools - VO Region Inventory Service

    NASA Astrophysics Data System (ADS)

    Good, J. C.

    As the size and number of datasets available through the VO grows, it becomes increasingly critical to have services that aid in locating and characterizing data pertinent to a particular scientific problem. At the same time, this same growth makes that goal more and more difficult to achieve. With a small number of datasets, it is feasible to simply retrieve the data itself (as the NVO DataScope service does). At intermediate scales, "count" DBMS searches (searches of the actual datasets which return record counts rather than full data subsets) sent to each data provider will work. However, neither of these approaches scales as the number of datasets expands into the hundreds or thousands. Dealing with the same problem internally, IRSA developed a compact and extremely fast scheme for determining source counts for positional catalogs (and in some cases image metadata) over arbitrarily large regions for multiple catalogs in a fraction of a second. To show applicability to the VO in general, this service has been extended with indices for all 4000+ catalogs in CDS VizieR (essentially all published catalogs and source tables). In this chapter, we will briefly describe the architecture of this service, and then describe how it can be used in a distributed system to retrieve rapid inventories of all VO holdings in a way that places an insignificant load on any data supplier. Further, we show how this tool can be used in conjunction with VO Registries and catalog services to zero in on those datasets that are appropriate to the user's needs. The initial implementation of this service consolidates custom binary index file structures (external to any DBMS and therefore portable) at a single site to minimize search times and implements the search interface as a simple CGI program. However, the architecture is amenable to distribution. The next phase of development will focus on metadata harvesting from data archives through a standard program interface and distribution of the search processing across multiple service providers for redundancy and parallelization.
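    The flavor of such a precomputed count index can be sketched simply: bin each catalog's sources into coarse sky cells once, then answer a region count query by summing cells rather than scanning rows. The 1-degree cell scheme below is invented; IRSA's actual binary index format is not described in this chapter summary:

    # Sketch of a precomputed count index: bin sources into coarse sky
    # cells once, then answer "how many sources in this region?" by summing
    # cells. The 1-degree cell scheme is invented for illustration.
    import math
    from collections import Counter

    def cell(ra, dec):
        return (math.floor(ra), math.floor(dec))   # 1-degree cells

    def build_index(sources):
        return Counter(cell(ra, dec) for ra, dec in sources)

    def count_in_box(index, ra0, ra1, dec0, dec1):
        return sum(n for (ra, dec), n in index.items()
                   if ra0 <= ra < ra1 and dec0 <= dec < dec1)

    sources = [(10.2, -5.4), (10.7, -5.1), (45.0, 20.0)]
    idx = build_index(sources)
    print(count_in_box(idx, 10, 11, -6, -5))   # -> 2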

  18. LSD: Large Survey Database framework

    NASA Astrophysics Data System (ADS)

    Juric, Mario

    2012-09-01

    The Large Survey Database (LSD) is a Python framework and DBMS for distributed storage, cross-matching and querying of large survey catalogs (>10^9 rows, >1 TB). The primary driver behind its development is the analysis of Pan-STARRS PS1 data. It is specifically optimized for fast queries and parallel sweeps of positionally and temporally indexed datasets. It transparently scales to more than 10^2 nodes, and can be made to function in "shared nothing" architectures.

  19. A Generalized DBMS to Support Diversified Data.

    DTIC Science & Technology

    1987-07-21

    interest on bonds). Hence, they require a definition of subtraction which yields 30 days as the answer to the above computation. Only a user-defined... [STON85]. Alternately, one can follow the standard scheduler model [BERNS1] in which a module is callable by code in the access methods when a... direction for evolution. These could include when to cease investigating alternate plans, and the ability to specify one's own optimizer parameters

  20. Orthobiologics in the augmentation of osteoporotic fractures.

    PubMed

    Watson, J Tracy; Nicolaou, Daemeon A

    2015-02-01

    Many orthobiologic adjuvants are available and widely utilized for general skeletal restoration. Their use for the specific task of osteoporotic fracture augmentation is less well recognized. Common conductive materials are reviewed for their value in this patient population, including the large group of allograft adjuvants categorically known as the demineralized bone matrices (DBMs). Another large group of alloplastic materials is also examined: the calcium phosphate and sulfate ceramics. Both of these materials, when used for the proper indications, demonstrate efficacy for these patients. The inductive properties of bone morphogenic proteins (BMPs) and platelet concentrates show no clear advantages for this group of patients. Systemic agents, including bisphosphonates, receptor activator of nuclear factor κB ligand (RANKL) inhibitors, and parathyroid hormone augmentation, all demonstrate positive effects with this fracture cohort. Newer modalities, such as trace-ion bioceramic augmentation, are also reviewed for their positive effects on osteoporotic fracture healing.

  1. The development of digital library system for drug research information.

    PubMed

    Kim, H J; Kim, S R; Yoo, D S; Lee, S H; Suh, O K; Cho, J H; Shin, H T; Yoon, J P

    1998-01-01

    The sophistication of computer technology and information transmission on the internet has made various cyber information repositories available to information consumers. In the era of the information superhighway, the digital library, which can be accessed from remote sites at any time, is considered the prototype of such an information repository. Using an object-oriented DBMS, the first model of a digital library for pharmaceutical researchers and related professionals in Korea has been developed. Published research papers and researchers' personal information were included in the database. For the research-paper database, 13 domestic journals were abstracted and scanned into full-text image files which can be viewed with Internet web browsers. The database of researchers' personal information was also developed and interlinked with the database of research papers. These databases will be continuously updated and will be combined with worldwide information as a unique digital library in the field of pharmacy.

  2. Preliminary Version: Ada (Trade Name)/SQL: A Standard, Portable Ada-DBMS Interface.

    DTIC Science & Technology

    1987-04-01

    3.3.2) and renaming declarations (see IRM section 8.5). The only declarations that are permitted within a schema package are those that apply directly... end SQLOPERATIONS; ... with SQLDEFINITIONS; use SQLDEFINITIONS; package DATEUNDERLYING is type CELLARTYPE is record STAR, BIN, WINE ..."); WINE : constant FIELD := MAKEFIELD("WINE"); PRODUCER : constant FIELD := MAKEFIELD("PRODUCER"); YEAR : constant FIELD := MAKEFIELD("YEAR"); BOTTLES : constant

  3. National Computer Security Conference (15th) held in Baltimore, Maryland on October 13-16, 1992. Volume 1: Proceedings

    DTIC Science & Technology

    1992-10-16

    The Spread of Viruses. Programs have been developed to attack some of these problems, primarily in the virus arena. Pick up any PC magazine and you... a particular DBMS-specific research area. These documents discuss the research problem, present an overview of relevant research and development... MITRE has developed to date and a brief discussion of those documents currently under development. For each companion document, the problem being

  4. Blind Seer: A Scalable Private DBMS

    DTIC Science & Technology

    2014-05-01

    searchable index terms per DB row, in time comparable to (insecure) MySQL (many practical queries can be privately executed with work 1.2-3 times slower than MySQL, although some queries are costlier). We support a rich query set, including searching on arbitrary boolean formulas on keywords and ranges...

  5. VLBA Archive &Distribution Architecture

    NASA Astrophysics Data System (ADS)

    Wells, D. C.

    1994-01-01

    Signals from the 10 antennas of NRAO's VLBA [Very Long Baseline Array] are processed by a Correlator. The complex fringe visibilities produced by the Correlator are archived on magnetic cartridges using a low-cost architecture which is capable of scaling and evolving. Archive files are copied to magnetic media to be distributed to users in FITS format, using the BINTABLE extension. Archive files are labelled using SQL INSERT statements, in order to bind the DBMS-based archive catalog to the archive media.
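    Labelling media with SQL INSERT statements means the label text itself is executable catalog metadata: replaying the labels rebuilds the catalog. A rough sketch of the idea, with an invented table layout (the actual VLBA catalog schema is not given in the abstract):

    # Sketch of binding archive media to a DBMS catalog by emitting the
    # label as an SQL INSERT statement. Table and column names are
    # invented; the original VLBA schema differs.
    import sqlite3

    def label_for(cartridge_id, fits_file, obs_date):
        # A real system would quote/escape values; the label here is
        # trusted, locally generated text.
        return ("INSERT INTO archive_files (cartridge_id, file_name, obs_date) "
                f"VALUES ('{cartridge_id}', '{fits_file}', '{obs_date}');")

    label = label_for("VLBA-0042", "BW012A.FITS", "1994-03-17")
    print(label)                          # printed on the physical media label

    conn = sqlite3.connect(":memory:")    # replaying the label binds media to catalog
    conn.execute("CREATE TABLE archive_files (cartridge_id TEXT, file_name TEXT, obs_date TEXT)")
    conn.executescript(label)
    print(conn.execute("SELECT * FROM archive_files").fetchall())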

  6. Rule-based topology system for spatial databases to validate complex geographic datasets

    NASA Astrophysics Data System (ADS)

    Martinez-Llario, J.; Coll, E.; Núñez-Andrés, M.; Femenia-Ribera, C.

    2017-06-01

    A rule-based topology software system providing a highly flexible and fast procedure to enforce integrity in spatial relationships among datasets is presented. This improved topology rule system is built over the spatial extension Jaspa. Both projects are open source, freely available software developed by the corresponding author of this paper. Currently, there is no spatial DBMS that implements a rule-based topology engine (considering that the topology rules are designed and performed in the spatial backend). If the topology rules are applied in the frontend (as in many GIS desktop programs), ArcGIS is the most advanced solution. The system presented in this paper has several major advantages over the ArcGIS approach: it can be extended with new topology rules, it has a much wider set of rules, and it can mix feature attributes with topology rules as filters. In addition, the topology rule system can work with various DBMSs, including PostgreSQL, H2 or Oracle, and the logic is performed in the spatial backend. The proposed topology system allows users to check the complex spatial relationships among features (from one or several spatial layers) that require some complex cartographic datasets, such as the data specifications proposed by INSPIRE in Europe and the Land Administration Domain Model (LADM) for Cadastral data.
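    A backend-evaluated topology rule is, at bottom, a spatial predicate query whose result set lists the violations. A rough sketch of how a "parcels must not overlap" rule could be checked against a PostGIS-style backend (table name and rule invented; this is not Jaspa's actual rule syntax):

    # Rough sketch of a backend-evaluated topology rule: "parcels must not
    # overlap", expressed as a spatial self-join whose result set lists the
    # violations. Table and rule are invented, PostGIS-style SQL.
    import psycopg2

    RULE_MUST_NOT_OVERLAP = """
        SELECT a.parcel_id, b.parcel_id
        FROM parcels a
        JOIN parcels b ON a.parcel_id < b.parcel_id
        WHERE ST_Overlaps(a.geom, b.geom);
    """

    def violations(conn):
        with conn.cursor() as cur:
            cur.execute(RULE_MUST_NOT_OVERLAP)
            return cur.fetchall()      # each row is a pair of offending parcels

    if __name__ == "__main__":
        conn = psycopg2.connect("dbname=cadastre_demo")   # hypothetical DSN
        for a, b in violations(conn):
            print(f"rule violated: parcels {a} and {b} overlap")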

  7. Hazardous chemical tracking system (HAZ-TRAC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bramlette, J D; Ewart, S M; Jones, C E

    Westinghouse Idaho Nuclear Company, Inc. (WINCO) developed and implemented a computerized hazardous chemical tracking system, referred to as Haz-Trac, for use at the Idaho Chemical Processing Plant (ICPP). Haz-Trac is designed to provide a means to improve the accuracy and reliability of chemical information, which enhances the overall quality and safety of ICPP operations. The system tracks all chemicals and chemical components from the time they enter the ICPP until the chemical changes form, is used, or becomes a waste. The system runs on a Hewlett-Packard (HP) 3000 Series 70 computer. The system is written in COBOL and uses VIEW/3000, TurboIMAGE/DBMS 3000, OMNIDEX, and SPEEDWARE. The HP 3000 may be accessed throughout the ICPP, and from remote locations, using data communication lines. Haz-Trac went into production in October 1989. Currently, over 1910 chemicals and chemical components are tracked on the system. More than 2500 personnel hours were saved during the first six months of operation. Cost savings have been realized by reducing the time needed to collect and compile reporting information, identifying and disposing of unneeded chemicals, and eliminating duplicate inventories. Haz-Trac maintains information required by the Superfund Amendment Reauthorization Act (SARA), the Comprehensive Environmental Response, Compensation and Liability Act (CERCLA), and the Occupational Safety and Health Administration (OSHA).

  8. Issues in Benchmark Metric Selection

    NASA Astrophysics Data System (ADS)

    Crolotte, Alain

    It is true that a metric can influence a benchmark but will esoteric metrics create more problems than they will solve? We answer this question affirmatively by examining the case of the TPC-D metric which used the much debated geometric mean for the single-stream test. We will show how a simple choice influenced the benchmark and its conduct and, to some extent, DBMS development. After examining other alternatives our conclusion is that the “real” measure for a decision-support benchmark is the arithmetic mean.
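    The distortion at issue is easy to reproduce: a geometric mean rewards shrinking an already-fast query far more than an arithmetic mean does. A small numeric illustration with invented query times:

    # Why the metric choice mattered: under a geometric mean, optimizing
    # one already-fast query moves the overall score disproportionately.
    # Query times below are invented for illustration.
    import math

    def geo_mean(xs):
        return math.exp(sum(math.log(x) for x in xs) / len(xs))

    def arith_mean(xs):
        return sum(xs) / len(xs)

    before = [100.0, 100.0, 100.0, 1.0]    # seconds for four queries
    after  = [100.0, 100.0, 100.0, 0.01]   # only the fast query is improved

    print("arithmetic:", arith_mean(before), "->", arith_mean(after))   # barely moves
    print("geometric: ", round(geo_mean(before), 2), "->", round(geo_mean(after), 2))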

  9. Crystal structure of dibromomethoxyseselin (DBMS), a photobiologically active pyranocoumarin.

    PubMed

    Bauri, A K; Foro, Sabine; Rahman, A F M M

    2017-05-01

    The title compound, C15H14Br2O4 [systematic name: rac-(9S,10R)-3,9-dibromo-10-methoxy-8,8-dimethyl-9,10-dihydropyrano[2,3-h]chromen-2(8H)-one], is a pyranocoumarin derivative formed by the bromination of seselin, which is a naturally occurring angular pyranocoumarin isolated from the Indian herb Trachyspermum stictocarpum. In the molecule, the benzopyran ring system is essentially planar, with a maximum deviation of 0.044 (2) Å for the O atom. The dihydropyran ring is in a half-chair conformation and the four essentially planar atoms of this ring form a dihedral angle of 4.6 (2)° with the benzopyran ring system. In the crystal, molecules are linked by weak C-H⋯O hydrogen bonds, forming chains propagating along [010]. In addition, π-π stacking interactions, with centroid-centroid distances of 3.902 (2) and 3.908 (2) Å, link the hydrogen-bonded chains into layers parallel to (001).

  10. MappERS-C and MappERS-V. The crowd source for prevention and crisis support

    NASA Astrophysics Data System (ADS)

    Frigerio, Simone; Schenato, Luca; Bianchizza, Chiara; Del Bianco, Daniele

    2015-04-01

    The responsibilities within natural hazards at local/regional levels involve citizens and volunteers as the first actors of civil protection and territorial management. Prevention draws on the capacities of professional operators and technical volunteers, but the priority now implies the involvement and awareness of citizens in the territory they inhabit. The involvement of the population creates context-specific strategies of territorial surveillance and management, overcoming the limit of facing risks only when they bear impacts on people's lives. MAppERS (Mobile Application for Emergency Response and Support) is an EU project (funded under the programme 2013-2015 Humanitarian Aid and Civil Protection, ECHO A5) which empowers "crowd-sourced mappers" through smartphone applications and sensors, with geo-tagged information, detailed gathered parameters, and field-checked surveys in a context of geospatial response. The development process includes feedback from citizens, involving them in training courses on monitoring as a long-term objective (raising public awareness and participation). The project deals with the development and testing of the smartphone applications (module MAppERS-V for volunteers, module MAppERS-C for citizens) in the Android SDK environment. A first study was a desk-based investigation of the consequences of disaster impacts and the costs of prevention strategies in the pilot countries. Furthermore, a review of the state of the art of database management systems (DBMS) in the pilot countries and of the involvement of volunteers/citizens in data collection and monitoring gathered basic information on the data structure for the development. A desk-based review proposed communication methods and graphic solutions within mobile technologies for disaster management in the pilot countries and surveyed available smartphone applications linked to centralized web/server databases. Such a technical review is essential as a design line for MAppERS development, and it is linked with on-site feedback about volunteers' and citizens' needs within the pilot-group activities. The app modules will later be re-designed according to the methodological and technical feedback gained during the pilot study. Training curricula for citizens are planned to increase awareness, skills in smartphone utilities, and an efficient jargon for the hazard context. The expected results are: a) an easy-to-use interface for "human data" in crisis support, b) maximised utility of peer-produced data gathering, c) the development of human resources as technical tools, and d) self-based awareness improvement.

  11. C-A1-03: Considerations in the Design and Use of an Oracle-based Virtual Data Warehouse

    PubMed Central

    Bredfeldt, Christine; McFarland, Lela

    2011-01-01

    Background/Aims The amount of clinical data available for research is growing exponentially. As it grows, increasing the efficiency of both data storage and data access becomes critical. Relational database management systems (rDBMS) such as Oracle are ideal solutions for managing longitudinal clinical data because they support large-scale data storage and highly efficient data retrieval. In addition, they can greatly simplify the management of large data warehouses, including security management and regular data refreshes. However, the HMORN Virtual Data Warehouse (VDW) was originally designed based on SAS datasets, and this design choice has a number of implications for both the design and use of an Oracle-based VDW. From a design standpoint, VDW tables are designed as flat SAS datasets, which do not take full advantage of Oracle indexing capabilities. From a data retrieval standpoint, standard VDW SAS scripts do not take advantage of SAS pass-through SQL capabilities to enable Oracle to perform the processing required to narrow datasets to the population of interest. Methods Beginning in 2009, the research department at Kaiser Permanente in the Mid-Atlantic States (KPMA) has developed an Oracle-based VDW according to the HMORN v3 specifications. In order to take advantage of the strengths of relational databases, KPMA introduced an interface layer to the VDW data, using views to provide access to standardized VDW variables. In addition, KPMA has developed SAS programs that provide access to SQL pass-through processing for first-pass data extraction into SAS VDW datasets for processing by standard VDW scripts. Results We discuss both the design and performance considerations specific to the KPMA Oracle-based VDW. We benchmarked performance of the Oracle-based VDW using both standard VDW scripts and an initial pre-processing layer to evaluate speed and accuracy of data return. Conclusions Adapting the VDW for deployment in an Oracle environment required minor changes to the underlying structure of the data. Further modifications of the underlying data structure would lead to performance enhancements. Maximally efficient data access for standard VDW scripts requires an extra step that involves restricting the data to the population of interest at the data server level prior to standard processing.

  12. Providing the Persistent Data Storage in a Software Engineering Environment Using Java/COBRA and a DBMS

    NASA Technical Reports Server (NTRS)

    Dhaliwal, Swarn S.

    1997-01-01

    An investigation was undertaken to build the software foundation for the WHERE (Web-based Hyper-text Environment for Requirements Engineering) project. The TCM (Toolkit for Conceptual Modeling) was chosen as the foundation software for the WHERE project, which aims to provide an environment for facilitating collaboration among geographically distributed people involved in the Requirements Engineering process. The TCM is a collection of diagram and table editors and has been implemented in the C++ programming language. The C++ implementation of the TCM was translated into Java in order to allow the editors to be used for building various functionality of the WHERE project; the WHERE project intends to use the Web as its communication backbone. One of the limitations of the translated software (TcmJava), which militated against its use in the WHERE project, was the persistent data management mechanism it inherited from the original TCM, which was designed for standalone applications. Before TcmJava editors could be used as a part of the multi-user, geographically distributed applications of the WHERE project, a persistent storage mechanism had to be built which would allow data communication over the Internet, using the capabilities of the Web. An approach involving features of Java, CORBA (Common Object Request Broker Architecture), the Web, a middleware layer (Java Relational Binding (JRB)), and a database server was used to build the persistent data management infrastructure for the WHERE project. The developed infrastructure allows a TcmJava editor to be downloaded and run from a network host by using a JDK 1.1 (Java Developer's Kit) compatible Web browser. The editor establishes a connection with a server by using the ORB (Object Request Broker) software and stores/retrieves data in/from the server. The server consists of one or more CORBA objects, depending upon whether the data is to be made persistent on a single server or on multiple servers. The CORBA object providing the persistent data server is implemented using the Java programming language. It uses the JRB to store/retrieve data in/from a relational database server. The persistent data management system provides transaction and user management facilities which allow multi-user, distributed access to the stored data in a secure manner.

  13. When size matters: differences in demineralized bone matrix particles affect collagen structure, mesenchymal stem cell behavior, and osteogenic potential.

    PubMed

    Dozza, B; Lesci, I G; Duchi, S; Della Bella, E; Martini, L; Salamanna, F; Falconi, M; Cinotti, S; Fini, M; Lucarelli, E; Donati, D

    2017-04-01

    Demineralized bone matrix (DBM) is a natural, collagen-based, osteoinductive biomaterial. Nevertheless, there are conflicting reports on the efficacy of this product. The purpose of this study was to evaluate whether DBM collagen structure is affected by particle size and can influence DBM cytocompatibility and osteoinductivity. Sheep cortical bone was ground and the particles were divided into three fractions of different sizes, defined as large (L, 1-2 mm), medium (M, 0.5-1 mm), and small (S, <0.5 mm). After demineralization, chemical-physical analysis clearly showed a particle size-dependent alteration in collagen structure, with DBM-M being altered but not as much as DBM-S. DBM-M displayed a preferable trend in almost all biological characteristics tested, although all DBM particles revealed optimal cytocompatibility. Subcutaneous implantation of DBM particles into immunocompromised mice resulted in bone induction only for DBM-M. When sheep MSC were seeded onto the particles before implantation, all DBM particles were able to induce new bone formation, with the best incidence for DBM-M and DBM-S. In conclusion, the collagen alteration in DBM-M is likely the best condition to promote bone induction in vivo. Furthermore, the choice of 0.5-1 mm particles may make it possible to obtain more efficient and consistent results among different research groups in bone tissue-engineering applications. © 2017 Wiley Periodicals, Inc. J Biomed Mater Res Part A: 105A: 1019-1033, 2017. © 2017 Wiley Periodicals, Inc.

  14. Effects of Raindrop Shape Parameter on the Simulation of Plum Rains

    NASA Astrophysics Data System (ADS)

    Mei, H.; Zhou, L.; Li, X.; Huang, X.; Guo, W.

    2017-12-01

    The raindrop shape parameter of the particle size distribution is generally set constant in double-moment bulk microphysics schemes (DBMS) using the Gamma distribution function, although observations suggest large differences in time and space. Based on the Milbrandt 2-moment (MY) DBMS, four cases during the Plum Rains season were simulated, coupled with four empirical relationships between the shape parameter (μr) and the slope parameter of the raindrop distribution derived from observations. The model results suggest that μr has some influence on rainfall. Introducing diagnostic formulas for μr may reduce systematic biases in 24-h accumulated rainfall and shows some ability to correct local characteristics of the rainfall distribution. Moreover, the tendency to improve strong rainfall can be sensitive to μr. With diagnosis of μr by the empirical formulas, μr generally increases in the middle and lower troposphere and decreases with stronger rainfall. It is concluded that the decline in raindrop water content and the increase in the raindrop mass-weighted terminal velocity, both directly related to μr, are the direct causes of the variations in precipitation. On the other hand, environmental conditions, including relative humidity and dynamical parameters, are the key indirect causes, being closely related to the changes in cloud particles and rainfall distributions. Furthermore, the differences in the degree of improvement between weak and heavy rainfall mainly come from the distinct responses of their respective variable fields. The variation of cloud-particle characteristics in warm clouds of heavy rainfall differs greatly from that of weak rainfall, though they share the same trend. Under weak-rainfall conditions, the response of physical characteristics to μr shows consistent trends and some linear features. However, in heavy precipitation with vigorous cloud systems, relative humidity and dynamical parameters undergo strong and vertically deep adjustments; the microphysical processes and environmental conditions interact in complex ways, and no clear laws could be concluded.

  15. Introducing the Forensic Research/Reference on Genetics knowledge base, FROG-kb.

    PubMed

    Rajeevan, Haseena; Soundararajan, Usha; Pakstis, Andrew J; Kidd, Kenneth K

    2012-09-01

    Online tools and databases based on multi-allelic short tandem repeat polymorphisms (STRPs) are actively used in forensic teaching, research, and investigations. The Fst value of each CODIS marker tends to be low across the populations of the world, and most populations typically have all the common STRP alleles present, diminishing the ability of these systems to discriminate ethnicity. Recently, considerable research has been conducted on single nucleotide polymorphisms (SNPs) to be considered for human identification and description. However, online tools and databases that can be used for forensic research and investigation are limited. The back-end DBMS (Database Management System) for FROG-kb is Oracle version 10. The front end is implemented with specific code using technologies such as Java, Java Servlet, JSP, JQuery, and GoogleCharts. We present an open-access web application, FROG-kb (Forensic Research/Reference on Genetics knowledge base, http://frog.med.yale.edu), that is useful for teaching and research relevant to forensics and can serve as a tool facilitating forensic practice. The underlying data for FROG-kb are provided by the already extensively used and referenced ALlele FREquency Database, ALFRED (http://alfred.med.yale.edu). In addition to displaying data in an organized manner, computational tools that use the underlying allele frequencies with user-provided data are implemented in FROG-kb. These tools are organized by the different published SNP/marker panels available. The web tool currently implements general functions for two types of SNP panels, individual identification and ancestry inference, and a prediction function specific to a phenotype-informative panel for eye color. The current online version of FROG-kb already provides new and useful functionality. We expect FROG-kb to grow and expand in capabilities and welcome input from the forensic community in identifying datasets and functionalities that will be most helpful and useful. Thus, the structure and functionality of FROG-kb will be revised in an ongoing process of improvement. This paper describes its state as of early June 2012.

  16. Assessment of the SFC database for analysis and modeling

    NASA Technical Reports Server (NTRS)

    Centeno, Martha A.

    1994-01-01

    SFC is one of the four clusters that make up the Integrated Work Control System (IWCS), which will integrate the shuttle processing databases at Kennedy Space Center (KSC). The IWCS framework will enable communication among the four clusters and add new data collection protocols. The Shop Floor Control (SFC) module has been operational for two and a half years; however, at this stage, automatic links to the other three modules have not been implemented yet, except for a partial link to IOS (CASPR). SFC revolves around a DB/2 database with PFORMS acting as the database management system (DBMS). PFORMS is an off-the-shelf DB/2 application that provides a set of data entry screens and query forms. The main dynamic entity in the SFC and IOS database is a task; thus, the physical storage location and update privileges are driven by the status of the WAD. As we explored the SFC values, we realized that there was much to do before actually engaging in continuous analysis of the SFC data. Halfway into this effort, it was realized that full-scale analysis would have to be a future third phase of the work. We therefore concentrated on getting to know the contents of the database and on establishing an initial set of tools to start the continuous analysis process. Specifically, we set out to: (1) provide specific procedures for statistical models, so as to enhance the TP-OAO office analysis and modeling capabilities; (2) design a data exchange interface; (3) prototype the interface to provide inputs to SCRAM; and (4) design a modeling database. These objectives were set with the expectation that, if met, they would provide former TP-OAO engineers with tools to help them demonstrate the importance of process-based analyses. The latter, in turn, will help them obtain the cooperation of various organizations in charting out their individual processes.

  17. Benchmarking distributed data warehouse solutions for storing genomic variant information

    PubMed Central

    Wiewiórka, Marek S.; Wysakowicz, Dawid P.; Okoniewski, Michał J.

    2017-01-01

    Genomic-based personalized medicine encompasses storing, analysing and interpreting genomic variants as its central issues. At a time when thousands of patients' sequenced exomes and genomes are becoming available, there is a growing need for efficient database storage and querying. The answer could be the application of modern distributed storage systems and query engines. However, the application of large genomic variant databases to this problem has not yet been sufficiently explored in the literature. To investigate the effectiveness of modern columnar storage [column-oriented Database Management System (DBMS)] and query engines, we have developed a prototypic genomic variant data warehouse, populated with large generated content of genomic variants and phenotypic data. Next, we benchmarked the performance of a number of combinations of distributed storages and query engines on a set of SQL queries that address biological questions essential for both research and medical applications. In addition, a non-distributed, analytical database (MonetDB) was used as a baseline. Comparison of query execution times confirms that distributed data warehousing solutions outperform classic relational DBMSs. Moreover, pre-aggregation and further denormalization of data, which reduce the number of distributed join operations, improve query performance by several orders of magnitude. Most of the distributed back-ends offer good performance for complex analytical queries, while the Optimized Row Columnar (ORC) format paired with Presto and Parquet with Spark 2 query engines provide, on average, the lowest execution times. Apache Kudu, on the other hand, is the only solution that guarantees sub-second performance for simple genome range queries returning a small subset of data, where a low-latency response is expected, while still offering decent performance for analytical queries. In summary, research and clinical applications that require the storage and analysis of variants from thousands of samples can benefit from the scalability and performance of distributed data warehouse solutions. Database URL: https://github.com/ZSI-Bio/variantsdwh PMID:29220442
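
    A genome range query of the kind benchmarked above, against a hypothetical denormalized variants table, might look like the following. The schema, coordinates, and data are assumptions for illustration, and SQLite stands in here only to make the sketch self-contained; the paper's benchmark targets distributed engines such as Presto, Spark, and Kudu with the same SQL shape.

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.execute("""CREATE TABLE variants (
            sample_id TEXT, chrom TEXT, pos INTEGER,
            ref TEXT, alt TEXT, genotype TEXT)""")
        con.execute("INSERT INTO variants VALUES "
                    "('S1', 'chr17', 41200000, 'G', 'A', '0/1')")

        # A low-latency genome range query returning a small subset of data;
        # the coordinate window is illustrative.
        rows = con.execute("""
            SELECT sample_id, pos, ref, alt, genotype
            FROM variants
            WHERE chrom = 'chr17' AND pos BETWEEN 41196312 AND 41277500
        """).fetchall()
        print(rows)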

  18. RED: A Java-MySQL Software for Identifying and Visualizing RNA Editing Sites Using Rule-Based and Statistical Filters.

    PubMed

    Sun, Yongmei; Li, Xing; Wu, Di; Pan, Qi; Ji, Yuefeng; Ren, Hong; Ding, Keyue

    2016-01-01

    RNA editing is one of the post- or co-transcriptional processes that can lead to amino acid substitutions in protein sequences, alternative pre-mRNA splicing, and changes in gene expression levels. Although several methods have been suggested to identify RNA editing sites, challenges remain in distinguishing true RNA editing sites from their genomic counterparts and from technical artifacts. In addition, a software framework to identify and visualize potential RNA editing sites has been lacking. Here, we present RED (RNA Editing sites Detector), a software tool for the identification of RNA editing sites that integrates multiple rule-based and statistical filters. Potential RNA editing sites can be visualized at the genome and site levels through a graphical user interface (GUI). To improve performance, we used the MySQL database management system (DBMS) for high-throughput data storage and querying. We demonstrated the validity and utility of RED by identifying experimentally validated C→U RNA editing sites, in comparison with REDItools, a command-line tool for high-throughput investigation of RNA editing. In an analysis of a sample dataset with 28 experimentally validated C→U RNA editing sites, RED had a sensitivity of 0.64 and a specificity of 0.5; in comparison, REDItools had better sensitivity (0.75) but similar specificity (0.5). RED is an easy-to-use, platform-independent Java-based software tool and can be applied to RNA-seq data with or without DNA sequencing data. The package is freely available under the GPLv3 license at http://github.com/REDetector/RED or https://sourceforge.net/projects/redetector.
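
    As a reminder of how the reported accuracy figures are derived, sensitivity is TP/(TP+FN) and specificity is TN/(TN+FP); with the 28 validated C→U sites, a sensitivity of 0.64 corresponds to about 18 detected sites. The confusion-matrix counts below are back-calculated illustrations, not numbers taken from the paper.

        # Hedged sketch: sensitivity/specificity from a confusion matrix.
        # Counts are illustrative, chosen to reproduce the reported ratios.
        tp, fn = 18, 10   # 18/28 ~ 0.64 sensitivity on the validated sites
        tn, fp = 5, 5     # 5/10 = 0.5 specificity (illustrative counts)

        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")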

  19. RED: A Java-MySQL Software for Identifying and Visualizing RNA Editing Sites Using Rule-Based and Statistical Filters

    PubMed Central

    Sun, Yongmei; Li, Xing; Wu, Di; Pan, Qi; Ji, Yuefeng; Ren, Hong; Ding, Keyue

    2016-01-01

    RNA editing is one of the post- or co-transcriptional processes that can lead to amino acid substitutions in protein sequences, alternative pre-mRNA splicing, and changes in gene expression levels. Although several methods have been suggested to identify RNA editing sites, challenges remain in distinguishing true RNA editing sites from their genomic counterparts and from technical artifacts. In addition, a software framework to identify and visualize potential RNA editing sites has been lacking. Here, we present RED (RNA Editing sites Detector), a software tool for the identification of RNA editing sites that integrates multiple rule-based and statistical filters. Potential RNA editing sites can be visualized at the genome and site levels through a graphical user interface (GUI). To improve performance, we used the MySQL database management system (DBMS) for high-throughput data storage and querying. We demonstrated the validity and utility of RED by identifying experimentally validated C→U RNA editing sites, in comparison with REDItools, a command-line tool for high-throughput investigation of RNA editing. In an analysis of a sample dataset with 28 experimentally validated C→U RNA editing sites, RED had a sensitivity of 0.64 and a specificity of 0.5; in comparison, REDItools had better sensitivity (0.75) but similar specificity (0.5). RED is an easy-to-use, platform-independent Java-based software tool and can be applied to RNA-seq data with or without DNA sequencing data. The package is freely available under the GPLv3 license at http://github.com/REDetector/RED or https://sourceforge.net/projects/redetector. PMID:26930599

  20. Sequence modelling and an extensible data model for genomic database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Peter Wei-Der

    1992-01-01

    The Human Genome Project (HGP) plans to sequence the human genome by the beginning of the next century. It will generate DNA sequences of more than 10 billion bases and complex marker sequences (maps) of more than 100 million markers. All of this information will be stored in database management systems (DBMSs). However, existing data models do not have the abstraction mechanism for modelling sequences, and existing DBMSs do not have operations for complex sequences. This work addresses the problem of sequence modelling in the context of the HGP and the more general problem of an extensible object data model that can incorporate the sequence model as well as existing and future data constructs and operators. First, we proposed a general sequence model that is application and implementation independent. This model is used to capture the sequence information found in the HGP at the conceptual level. In addition, abstract and biological sequence operators are defined for manipulating the modelled sequences. Second, we combined many features of semantic and object oriented data models into an extensible framework, which we called the "Extensible Object Model", to address the need for a modelling framework incorporating the sequence data model with other types of data constructs and operators. This framework is based on the conceptual separation between constructors and constraints. We then used this modelling framework to integrate the constructs for the conceptual sequence model. The Extensible Object Model is also defined with a graphical representation, which is useful as a tool for database designers. Finally, we defined a query language to support this model and implemented the query processor to demonstrate the feasibility of the extensible framework and the usefulness of the conceptual sequence model.

  1. Sequence modelling and an extensible data model for genomic database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Peter Wei-Der

    1992-01-01

    The Human Genome Project (HGP) plans to sequence the human genome by the beginning of the next century. It will generate DNA sequences of more than 10 billion bases and complex marker sequences (maps) of more than 100 million markers. All of this information will be stored in database management systems (DBMSs). However, existing data models do not have the abstraction mechanism for modelling sequences, and existing DBMSs do not have operations for complex sequences. This work addresses the problem of sequence modelling in the context of the HGP and the more general problem of an extensible object data model that can incorporate the sequence model as well as existing and future data constructs and operators. First, we proposed a general sequence model that is application and implementation independent. This model is used to capture the sequence information found in the HGP at the conceptual level. In addition, abstract and biological sequence operators are defined for manipulating the modelled sequences. Second, we combined many features of semantic and object oriented data models into an extensible framework, which we called the "Extensible Object Model", to address the need for a modelling framework incorporating the sequence data model with other types of data constructs and operators. This framework is based on the conceptual separation between constructors and constraints. We then used this modelling framework to integrate the constructs for the conceptual sequence model. The Extensible Object Model is also defined with a graphical representation, which is useful as a tool for database designers. Finally, we defined a query language to support this model and implemented the query processor to demonstrate the feasibility of the extensible framework and the usefulness of the conceptual sequence model.

  2. Fourier-transform-infrared-spectroscopy based spectral-biomarker selection towards optimum diagnostic differentiation of oral leukoplakia and cancer.

    PubMed

    Banerjee, Satarupa; Pal, Mousumi; Chakrabarty, Jitamanyu; Petibois, Cyril; Paul, Ranjan Rashmi; Giri, Amita; Chatterjee, Jyotirmoy

    2015-10-01

    In search of specific label-free biomarkers for differentiation of two oral lesions, namely oral leukoplakia (OLK) and oral squamous-cell carcinoma (OSCC), Fourier-transform infrared (FTIR) spectroscopy was performed on paraffin-embedded tissue sections from 47 human subjects (eight normal (NOM), 16 OLK, and 23 OSCC). Difference between mean spectra (DBMS), Mann-Whitney's U test, and forward feature selection (FFS) techniques were used to optimise spectral-marker selection. Classification of diseases was performed with linear and quadratic support vector machines (SVM) at 10-fold cross-validation, using different combinations of spectral features. Six features obtained through FFS (1782, 1713, 1665, 1545, 1409, and 1161 cm(-1)) enabled differentiation of NOM and OSCC tissue and were the most significant, able to classify OLK and OSCC with 81.3 % sensitivity, 95.7 % specificity, and 89.7 % overall accuracy. The 43 spectral markers extracted through Mann-Whitney's U test were the least significant when a quadratic SVM was used. Given the high sensitivity and specificity of the FFS technique, extracting only six spectral biomarkers was thus most useful for diagnosis of OLK and OSCC, and for overcoming the inter- and intra-observer variability experienced in best-practice histopathological diagnosis. By considering the biochemical assignment of these six spectral signatures, this work also revealed altered glycogen and keratin content in histological sections that could discriminate OLK and OSCC. The method was validated through spectral selection by the DBMS technique. It thus has potential to minimise diagnostic costs for oral lesions through label-free biomarker identification.
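
    The feature-selection-plus-classifier pipeline described above can be sketched with scikit-learn: forward feature selection feeding a linear SVM evaluated by 10-fold cross-validation. The synthetic random spectra below stand in for the FTIR absorbance matrix; this is an illustration of the general method, not the authors' code or data.

        import numpy as np
        from sklearn.feature_selection import SequentialFeatureSelector
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X = rng.normal(size=(39, 200))                 # 39 spectra, 200 wavenumber bins
        y = np.r_[np.zeros(16, int), np.ones(23, int)] # 16 OLK, 23 OSCC labels

        # Forward feature selection down to six features, mirroring the six
        # FFS-selected wavenumbers reported above.
        ffs = SequentialFeatureSelector(SVC(kernel="linear"),
                                        n_features_to_select=6,
                                        direction="forward", cv=5)
        X_sel = ffs.fit_transform(X, y)

        # Evaluate a linear SVM on the selected features with 10-fold CV.
        scores = cross_val_score(SVC(kernel="linear"), X_sel, y, cv=10)
        print(f"10-fold CV accuracy: {scores.mean():.2f}")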

  3. Histological evaluation of an impacted bone graft substitute composed of a combination of mineralized and demineralized allograft in a sheep vertebral bone defect.

    PubMed

    Fujishiro, Takaaki; Bauer, Thomas W; Kobayashi, Naomi; Kobayashi, Hideo; Sunwoo, Moon Hae; Seim, Howard B; Turner, A Simon

    2007-09-01

    Demineralized bone matrix (DBM) preparations are a potential alternative or supplement to autogenous bone graft, but many DBMs have not been adequately tested in clinically relevant animal models. The aim of the current study was to compare the efficacy of a new bone graft substitute composed of a combination of mineralized and demineralized allograft along with hyaluronic acid (AFT Bone Void Filler) with several other bone graft materials in a sheep vertebral bone void model. A drilled defect in the sheep vertebral body was filled with either the new DBM preparation, calcium sulfate (OsteoSet), or autologous bone graft, or left empty. The sheep were euthanized after 6 or 12 weeks, and the defects were examined by histology and quantitative histomorphometry. The morphometry data were analyzed by one-way analysis of variance with the post hoc Tukey-Kramer test or Student's t-test. All of the bone defects in the AFT DBM preparation group showed good new bone formation with variable amounts of residual DBM and mineralized bone graft. The DBM preparation group at 12 weeks contained significantly more new bone than the defects treated with calcium sulfate or left empty (p < 0.05 and p < 0.01, respectively). There was no significant difference between the DBM and autograft groups. No adverse inflammatory reactions were associated with any of the three graft materials. The AFT preparation of a mixture of mineralized and demineralized allograft appears to be an effective autograft substitute as tested in this sheep vertebral bone void model.

  4. The efficacy of different commercially available demineralized bone matrix substances in an athymic rat model.

    PubMed

    Lee, Yu-Po; Jo, Mark; Luna, Mario; Chien, Bobby; Lieberman, Jay R; Wang, Jeffrey C

    2005-10-01

    Bone graft substitutes have been developed because there is a limited supply of autogenous graft and the harvesting of iliac crest bone graft is associated with significant morbidity. Currently, there are a number of commercially available demineralized bone matrix (DBM) products that have been prepared by different methods and have different carriers. The objective of this study was to compare eight of these commercially available DBM products. Eight different DBMs were used to attempt a spinal fusion between the L4-L5 transverse processes in athymic rats. There were 10 rats in each group, and 5 rats were killed at each of 4 and 8 weeks. Radiographic and histologic analyses were performed, along with manual palpation testing. At 4 weeks, Osteofil Paste had the highest radiographic scores, whereas Grafton Putty had the best radiographic scores at 8 weeks. Conversely, the spines implanted with Allomatrix had the lowest radiographic scores at both 4 and 8 weeks. In regard to forming a spinal fusion confirmed by manual palpation, Osteofil Paste was the most effective at 4 weeks, whereas Grafton Flex and Grafton Putty had the highest rates of fusion at 8 weeks. Conversely, the lowest rates of fusion were seen in the Allomatrix and Grafton Crunch groups. Statistical analysis showed significant differences among the groups on radiographs and by manual palpation, and qualitative differences could be appreciated between the groups histologically. Significant differences therefore exist among commercially available DBMs in their ability to form a spinal fusion in an athymic rat.

  5. A Scalable Data Integration and Analysis Architecture for Sensor Data of Pediatric Asthma.

    PubMed

    Stripelis, Dimitris; Ambite, José Luis; Chiang, Yao-Yi; Eckel, Sandrah P; Habre, Rima

    2017-04-01

    According to the Centers for Disease Control, in the United States there are 6.8 million children living with asthma. Despite the importance of the disease, the available prognostic tools are not sufficient for biomedical researchers to thoroughly investigate the potential risks of the disease at scale. To overcome these challenges we present a big data integration and analysis infrastructure developed by our Data and Software Coordination and Integration Center (DSCIC) of the NIBIB-funded Pediatric Research using Integrated Sensor Monitoring Systems (PRISMS) program. Our goal is to help biomedical researchers to efficiently predict and prevent asthma attacks. The PRISMS-DSCIC is responsible for collecting, integrating, storing, and analyzing real-time environmental, physiological and behavioral data obtained from heterogeneous sensor and traditional data sources. Our architecture is based on the Apache Kafka, Spark and Hadoop frameworks and PostgreSQL DBMS. A main contribution of this work is extending the Spark framework with a mediation layer, based on logical schema mappings and query rewriting, to facilitate data analysis over a consistent harmonized schema. The system provides both batch and stream analytic capabilities over the massive data generated by wearable and fixed sensors.
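
    The mediation layer described above maps heterogeneous source schemas onto a harmonized schema and rewrites queries against it. A minimal PySpark sketch of that idea follows; the sensor column names and the mapping are assumptions for illustration, not the PRISMS-DSCIC schema.

        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("mediation-sketch").getOrCreate()

        # Two hypothetical sensor feeds with different native schemas.
        a = spark.createDataFrame([("p1", 1512345600, 41.2)],
                                  ["subj", "ts_unix", "pm25_ugm3"])
        b = spark.createDataFrame([("p2", "2017-04-01T00:00:00", 38.7)],
                                  ["participant_id", "timestamp_iso", "pm2_5"])
        a.createOrReplaceTempView("sensor_a")
        b.createOrReplaceTempView("sensor_b")

        # Logical schema mappings rewrite each source into one harmonized
        # view, so analyses are written once against the harmonized schema.
        harmonized = spark.sql("""
            SELECT subj AS participant,
                   CAST(ts_unix AS TIMESTAMP) AS obs_time,
                   pm25_ugm3 AS pm25
            FROM sensor_a
            UNION ALL
            SELECT participant_id, to_timestamp(timestamp_iso), pm2_5
            FROM sensor_b
        """)
        harmonized.show()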

  6. Developing and Testing a 3d Cadastral Data Model a Case Study in Australia

    NASA Astrophysics Data System (ADS)

    Aien, A.; Kalantari, M.; Rajabifard, A.; Williamson, I. P.; Shojaei, D.

    2012-07-01

    Population growth, urbanization and industrialization place more pressure on land use, with the need for increased space. To extend the use and functionality of the land, complex infrastructures are being built, both vertically and horizontally, layered and stacked. These three-dimensional (3D) developments affect the interests (Rights, Restrictions, and Responsibilities (RRRs)) attached to the underlying land. A 3D cadastre will assist in managing the effects of 3D development on a particular extent of land. Many elements contribute to developing a 3D cadastre, such as the existence of 3D property legislation, a 3D DBMS, and 3D visualization; however, data modelling is one of the most important elements of a successful 3D cadastre. Just as architectural models of houses and high-rise buildings help their users visualize the final product, a 3D cadastral data model helps 3D cadastre users understand the structure or behavior of the system and provides a template that guides them in constructing and implementing the 3D cadastre. Many jurisdictions, organizations and software developers have built their own cadastral data models. The Land Administration Domain Model (DIS-ISO 19152, The Netherlands) and ePlan (Intergovernmental Committee on Surveying and Mapping, Australia) are examples of existing data models. The variation between these data models is the result of different attitudes towards cadastres; however, there is a basic common thread among them all. Current cadastral data models use a 2D land-parcel concept and extend it to support 3D requirements. These data models cannot adequately manage and represent the spatial extent of 3D RRRs. Most of the current cadastral data models have been influenced by a very broad understanding of 3D cadastral concepts; better clarity about what needs to be represented and analysed in the cadastre has yet to be established. This paper presents the first version of a 3D Cadastral Data Model (3DCDM_Version 1.0). 3DCDM models both the legal and physical extent of 3D properties and associated interests. The data model extends the traditional cadastral requirements to cover other applications such as urban planning and land valuation and taxation. A demonstration of a test system based on the proposed data model is also presented. The test is based on a case study in Victoria, Australia, to evaluate the effectiveness of the data model.

  7. Metadata to Support Data Warehouse Evolution

    NASA Astrophysics Data System (ADS)

    Solodovnikova, Darja

    The focus of this chapter is the metadata necessary to support data warehouse evolution. We present a data warehouse framework that is able to track the evolution process and adapt data warehouse schemata and data extraction, transformation, and loading (ETL) processes. We discuss a significant part of the framework, the metadata repository that stores information about the data warehouse, the logical and physical schemata, and their versions. We propose a physical implementation of a multiversion data warehouse in a relational DBMS. For each modification of a data warehouse schema, we outline the changes that need to be made to the repository metadata and in the database.
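
    A minimal relational shape for such a repository, tracking schema versions and the physical tables and ETL mappings attached to each version, might be sketched as below. The table and column names are illustrative assumptions, not the chapter's actual repository design.

        import sqlite3

        ddl = """
        CREATE TABLE schema_version (
            version_id INTEGER PRIMARY KEY,
            valid_from TEXT NOT NULL,   -- when this DW schema version took effect
            valid_to   TEXT             -- NULL while the version is current
        );
        CREATE TABLE dw_table_version (
            table_name    TEXT NOT NULL,
            version_id    INTEGER NOT NULL REFERENCES schema_version(version_id),
            physical_name TEXT NOT NULL -- physical table backing this version
        );
        CREATE TABLE etl_mapping (
            version_id    INTEGER NOT NULL REFERENCES schema_version(version_id),
            source_column TEXT NOT NULL,
            target_column TEXT NOT NULL -- transformation recorded per version
        );
        """
        con = sqlite3.connect(":memory:")
        con.executescript(ddl)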

  8. SQL Triggers Reacting on Time Events: An Extension Proposal

    NASA Astrophysics Data System (ADS)

    Behrend, Andreas; Dorau, Christian; Manthey, Rainer

    Being able to activate triggers when timepoints are reached or after time intervals have elapsed has been acknowledged by many authors as a valuable functionality of a DBMS. Recently, interest in time-based triggers has been renewed in the context of data stream monitoring. However, up till now SQL triggers react to data changes only, even though research proposals and prototypes have long supported several other event types, in particular time-based ones. We therefore propose a seamless extension of the SQL trigger concept by time-based triggers, focusing on the semantic issues arising from such an extension.
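
    Since standard SQL triggers fire only on data changes, the time-based behaviour motivating this proposal must today be emulated by an external scheduler that runs a statement when a timepoint is reached. The small Python emulation below sketches that pattern (the paper's proposed in-DBMS syntax is not reproduced here); the schema and deadline are illustrative.

        import sqlite3
        import threading

        con = sqlite3.connect(":memory:", check_same_thread=False)
        con.execute("CREATE TABLE readings (ts TEXT, value REAL)")
        con.execute("CREATE TABLE alerts (msg TEXT)")

        def time_trigger():
            # Emulates a trigger firing when a time interval has elapsed,
            # rather than on INSERT/UPDATE/DELETE as in standard SQL.
            con.execute("INSERT INTO alerts VALUES ('no reading within deadline')")
            con.commit()

        # Fire once, 5 seconds after arming; a DBMS extension of the kind
        # proposed would evaluate this inside the engine instead.
        timer = threading.Timer(5.0, time_trigger)
        timer.start()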

  9. Integrating the IA2 Astronomical Archive in the VO: The VO-Dance Engine

    NASA Astrophysics Data System (ADS)

    Molinaro, M.; Laurino, O.; Smareglia, R.

    2012-09-01

    Virtual Observatory (VO) protocols and standards are maturing, and the astronomical community is asking for astrophysical data to be easily reachable. This means data centers have to intensify their efforts to provide the data they manage not only through proprietary portals and services but also through interoperable resources developed on the basis of the IVOA (International Virtual Observatory Alliance) recommendations. Here we present the work and ideas developed at the IA2 (Italian Astronomical Archive) data center hosted by INAF-OATs (Italian Institute for Astrophysics - Trieste Astronomical Observatory) to reach this goal. The core is VO-Dance (written in Java), an application that can translate the content of existing DB and archive structures into VO compliant resources. This application, in turn, relies on a database (potentially DBMS independent) to store the translation layer information of each resource and auxiliary content (UCDs, field names, authorizations, policies, etc.). The last component is an administrative interface (currently developed using the Django python framework) that allows the data center administrators to set up and maintain resources. Because this deployment is platform independent, with a highly customizable database and administrative interface, the package, once stable and easily distributable, can also be used by individual astronomers or groups to set up their own resources from their public datasets.

  10. Tissue engineering rib with the incorporation of biodegradable polymer cage and BMSCs/decalcified bone: an experimental study in a canine model.

    PubMed

    Tang, Hua; Wu, Bin; Qin, Xiong; Zhang, Lu; Kretlow, Jim; Xu, Zhifei

    2013-05-20

    The reconstruction of large bone defects, including rib defects, remains a challenge for surgeons. In this study, we used biodegradable polydioxanone (PDO) cages to tissue-engineer ribs for the reconstruction of 4 cm long costal defects. PDO sutures were used to weave cages 6 cm long and 1 cm in diameter. Demineralized bone matrix (DBM), a xenograft, was molded into cuboids and seeded with second-passage bone marrow mesenchymal stem cells (BMSCs) that had been osteogenically induced. Two DBM cuboids seeded with BMSCs were put into the PDO cage and used to reconstruct the costal defects. Radiographic examination including 3D reconstruction, histologic examination and mechanical testing were performed after 24 postoperative weeks. All the experimental subjects survived. In all groups, the PDO cage had completely degraded after 24 weeks and been replaced by fibrous tissue. Better shape and curvature were achieved in PDO cages filled with DBM and BMSCs than in the other two groups (cages alone, or cages filled with acellular DBM cuboids). When the repaired ribs were subjected to an external force, the ribs in the PDO cage/DBM/BMSCs group kept their original shape while the ribs in the other two groups deformed. In the PDO cage/DBM/BMSCs group, we also observed bony union at all the construct interfaces, while there was no bony union in the other two groups; this result was also confirmed by radiographic and histologic examination. This study demonstrates that a biodegradable PDO cage in combination with two short BMSCs/DBM cuboids can repair large rib defects. The satisfactory repair rate suggests that this might be a feasible approach for large bone repair.

  11. Tissue engineering rib with the incorporation of biodegradable polymer cage and BMSCs/decalcified bone: an experimental study in a canine model

    PubMed Central

    2013-01-01

    Background The reconstruction of large bone defects, including rib defects, remains a challenge for surgeons. In this study, we used biodegradable polydioxanone (PDO) cages to tissue-engineer ribs for the reconstruction of 4 cm long costal defects. Methods PDO sutures were used to weave cages 6 cm long and 1 cm in diameter. Demineralized bone matrix (DBM), a xenograft, was molded into cuboids and seeded with second-passage bone marrow mesenchymal stem cells (BMSCs) that had been osteogenically induced. Two DBM cuboids seeded with BMSCs were put into the PDO cage and used to reconstruct the costal defects. Radiographic examination including 3D reconstruction, histologic examination and mechanical testing were performed after 24 postoperative weeks. Results All the experimental subjects survived. In all groups, the PDO cage had completely degraded after 24 weeks and been replaced by fibrous tissue. Better shape and curvature were achieved in PDO cages filled with DBM and BMSCs than in the other two groups (cages alone, or cages filled with acellular DBM cuboids). When the repaired ribs were subjected to an external force, the ribs in the PDO cage/DBM/BMSCs group kept their original shape while the ribs in the other two groups deformed. In the PDO cage/DBM/BMSCs group, we also observed bony union at all the construct interfaces, while there was no bony union in the other two groups; this result was also confirmed by radiographic and histologic examination. Conclusions This study demonstrates that a biodegradable PDO cage in combination with two short BMSCs/DBM cuboids can repair large rib defects. The satisfactory repair rate suggests that this might be a feasible approach for large bone repair. PMID:23688344

  12. An approach for access differentiation design in medical distributed applications built on databases.

    PubMed

    Shoukourian, S K; Vasilyan, A M; Avagyan, A A; Shukurian, A K

    1999-01-01

    A formalized "top to bottom" design approach was described in [1] for distributed applications built on databases, which were considered as a medium between virtual and real user environments for a specific medical application. Merging different components within a unified distributed application posits new essential problems for software. Particularly protection tools, which are sufficient separately, become deficient during the integration due to specific additional links and relationships not considered formerly. E.g., it is impossible to protect a shared object in the virtual operating room using only DBMS protection tools, if the object is stored as a record in DB tables. The solution of the problem should be found only within the more general application framework. Appropriate tools are absent or unavailable. The present paper suggests a detailed outline of a design and testing toolset for access differentiation systems (ADS) in distributed medical applications which use databases. The appropriate formal model as well as tools for its mapping to a DMBS are suggested. Remote users connected via global networks are considered too.

  13. A WebGIS system on the base of satellite data processing system for marine application

    NASA Astrophysics Data System (ADS)

    Gong, Fang; Wang, Difeng; Huang, Haiqing; Chen, Jianyu

    2007-10-01

    From 2002 to 2004, a satellite data processing system for marine applications was built in the State Key Laboratory of Satellite Ocean Environment Dynamics (Second Institute of Oceanography, State Oceanic Administration). The system received satellite data from TERRA, AQUA, NOAA-12/15/16/17/18 and FY-1D and automatically generated Level 3 and Level 4 products (single-orbit products and merged multi-orbit products) from Level 0 data, controlled by an operational control sub-system. Currently, the products created by this system play an important role in marine environment monitoring, disaster monitoring and research. A distribution platform has now been developed on this foundation, namely a WebGIS system for querying and browsing oceanic remote sensing data. The system is based on the large database system Oracle, and additionally uses the ArcSDE spatial database engine and other middleware to perform database operations. The J2EE framework was adopted as the development model, with Oracle 9.2 as the back-end DBMS and server. Using standard browsers (such as IE6.0), users can visit and browse the public service information provided by the system, including browsing oceanic remote sensing data, zooming, panning, refreshing, roaming, further data inquiry, attribute search and data download. The system is still under test. Such a system provides an important distribution platform for Chinese satellite oceanic environment products by topic and category (including sea surface temperature, chlorophyll concentration, and so on), raising the utilization of satellite products and promoting data sharing and oceanic remote sensing research.

  14. Overview of the NASA/RECON educational, research, and development activities of the Computer Science Departments of the University of Southwestern Louisiana and Southern University

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor)

    1984-01-01

    This document presents a brief overview of the scope of activities undertaken by the Computer Science Departments of the University of Southwestern Louisiana (USL) and Southern University (SU) pursuant to a contract with NASA. Presented are only basic identification data concerning the contract activities, since subsequent entries within the Working Paper Series will be oriented specifically toward a detailed development and presentation of the plans, methodologies, and results of each contract activity. Also included is a table of contents of the entire USL/DBMS NASA/RECON Working Paper Series.

  15. Astronomical Data Processing Using SciQL, an SQL Based Query Language for Array Data

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Scheers, B.; Kersten, M.; Ivanova, M.; Nes, N.

    2012-09-01

    SciQL (pronounced as ‘cycle’) is a novel SQL-based array query language for scientific applications, with both tables and arrays as first-class citizens. SciQL lowers the entrance fee of adopting a relational DBMS (RDBMS) in scientific domains, because it includes functionality often found only in mathematics software packages. In this paper, we demonstrate the usefulness of SciQL for astronomical data processing using examples from the Transient Key Project of the LOFAR radio telescope. In particular, we show how the LOFAR light-curve database of all detected sources can be constructed by correlating sources across the spatial, frequency, time and polarisation domains.

  16. A Hybrid Spatio-Temporal Data Indexing Method for Trajectory Databases

    PubMed Central

    Ke, Shengnan; Gong, Jun; Li, Songnian; Zhu, Qing; Liu, Xintao; Zhang, Yeting

    2014-01-01

    In recent years, there has been tremendous growth in the field of indoor and outdoor positioning sensors, which continuously produce huge volumes of trajectory data that are used in many fields such as location-based services and location intelligence. Trajectory data have grown massively and become semantically complicated, which poses a great challenge for spatio-temporal data indexing. This paper proposes a spatio-temporal data indexing method, named HBSTR-tree, which is a hybrid index structure comprising a spatio-temporal R-tree, a B*-tree and a hash table. To improve index generation efficiency, rather than directly inserting trajectory points, we group consecutive trajectory points into nodes according to their spatio-temporal semantics and then insert them into the spatio-temporal R-tree as leaf nodes. The hash table is used to manage the latest leaf nodes to reduce the frequency of insertion. A new spatio-temporal interval criterion and a new node-choosing sub-algorithm are also proposed to optimize spatio-temporal R-tree structures. In addition, a B*-tree sub-index of leaf nodes is built to query the trajectories of targeted objects efficiently. Furthermore, a database storage scheme based on a NoSQL-type DBMS is also proposed for the purpose of cloud storage. Experimental results show that HBSTR-tree outperforms the TB*-tree in aspects such as generation efficiency, query performance and the query types supported. PMID:25051028
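
    The key preprocessing step, grouping consecutive trajectory points into a node before bulk insertion into the R-tree, can be sketched as follows. The fixed thresholds and the simple Euclidean test are illustrative stand-ins for the paper's spatio-temporal interval criterion.

        from math import hypot

        def group_trajectory(points, max_pts=32, max_dist=500.0, max_dt=300.0):
            # Group consecutive (x, y, t) points into leaf-node candidates.
            # A group closes when it is full or the next point is too far
            # from the group's first point in space or time.
            groups, current = [], []
            for p in points:
                if current:
                    x0, y0, t0 = current[0]
                    x, y, t = p
                    if (len(current) >= max_pts
                            or hypot(x - x0, y - y0) > max_dist
                            or t - t0 > max_dt):
                        groups.append(current)
                        current = []
                current.append(p)
            if current:
                groups.append(current)
            return groups  # each group becomes one R-tree leaf node

        nodes = group_trajectory([(0, 0, 0), (40, 30, 10), (900, 800, 20)])
        print(len(nodes), "leaf-node groups")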

  17. A hybrid spatio-temporal data indexing method for trajectory databases.

    PubMed

    Ke, Shengnan; Gong, Jun; Li, Songnian; Zhu, Qing; Liu, Xintao; Zhang, Yeting

    2014-07-21

    In recent years, there has been tremendous growth in the field of indoor and outdoor positioning sensors, which continuously produce huge volumes of trajectory data that are used in many fields such as location-based services and location intelligence. Trajectory data have grown massively and become semantically complicated, which poses a great challenge for spatio-temporal data indexing. This paper proposes a spatio-temporal data indexing method, named HBSTR-tree, which is a hybrid index structure comprising a spatio-temporal R-tree, a B*-tree and a hash table. To improve index generation efficiency, rather than directly inserting trajectory points, we group consecutive trajectory points into nodes according to their spatio-temporal semantics and then insert them into the spatio-temporal R-tree as leaf nodes. The hash table is used to manage the latest leaf nodes to reduce the frequency of insertion. A new spatio-temporal interval criterion and a new node-choosing sub-algorithm are also proposed to optimize spatio-temporal R-tree structures. In addition, a B*-tree sub-index of leaf nodes is built to query the trajectories of targeted objects efficiently. Furthermore, a database storage scheme based on a NoSQL-type DBMS is also proposed for the purpose of cloud storage. Experimental results show that HBSTR-tree outperforms the TB*-tree in aspects such as generation efficiency, query performance and the query types supported.

  18. Introducing the Forensic Research/Reference on Genetics knowledge base, FROG-kb

    PubMed Central

    2012-01-01

    Background Online tools and databases based on multi-allelic short tandem repeat polymorphisms (STRPs) are actively used in forensic teaching, research, and investigations. The Fst value of each CODIS marker tends to be low across the populations of the world, and most populations typically have all the common STRP alleles present, diminishing the ability of these systems to discriminate ethnicity. Recently, considerable research has been conducted on single nucleotide polymorphisms (SNPs) as candidates for human identification and description. However, online tools and databases that can be used for forensic research and investigation are limited. Methods The back-end DBMS (Database Management System) for FROG-kb is Oracle version 10. The front end is implemented with specific code using technologies such as Java, Java Servlet, JSP, JQuery, and GoogleCharts. Results We present an open access web application, FROG-kb (Forensic Research/Reference on Genetics-knowledge base, http://frog.med.yale.edu), that is useful for teaching and research relevant to forensics and can serve as a tool facilitating forensic practice. The underlying data for FROG-kb are provided by the already extensively used and referenced ALlele FREquency Database, ALFRED (http://alfred.med.yale.edu). In addition to displaying data in an organized manner, computational tools that use the underlying allele frequencies with user-provided data are implemented in FROG-kb. These tools are organized by the different published SNP/marker panels available. The web tool currently implements general functions for two types of SNP panels, individual identification and ancestry inference, and a prediction function specific to a phenotype-informative panel for eye color. Conclusion The current online version of FROG-kb already provides new and useful functionality. We expect FROG-kb to grow and expand in capabilities and welcome input from the forensic community in identifying the datasets and functionalities that will be most helpful. Thus, the structure and functionality of FROG-kb will be revised in an ongoing process of improvement. This paper describes its state as of early June 2012. PMID:22938150

  19. CLIPS, AppleEvents, and AppleScript: Integrating CLIPS with commercial software

    NASA Technical Reports Server (NTRS)

    Compton, Michael M.; Wolfe, Shawn R.

    1994-01-01

    Many of today's intelligent systems are composed of several modules, perhaps written in different tools and languages, that together help solve the user's problem. These systems often employ a knowledge-based component that is not accessed directly by the user, but instead operates 'in the background', offering assistance to the user as necessary. In these types of modular systems, an efficient, flexible, and easy-to-use mechanism for sharing data between programs is crucial. To help permit transparent integration of CLIPS with other Macintosh applications, the AI Research Branch at NASA Ames Research Center has extended CLIPS to allow it to communicate transparently with other applications through two popular data-sharing mechanisms provided by the Macintosh operating system: Apple Events (a 'high-level' event mechanism for program-to-program communication) and AppleScript, a recently released scripting language for the Macintosh. This capability permits other applications (running on either the same or a remote machine) to send a command to CLIPS, which then responds as if the command were typed into the CLIPS dialog window. Any result returned by the command is then automatically returned to the program that sent it. Likewise, CLIPS can send several types of Apple Events directly to other local or remote applications. This CLIPS system has been successfully integrated with a variety of commercial applications, including data collection programs, electronic forms packages, DBMSs, and email programs. These mechanisms can permit transparent user access to the knowledge base from within a commercial application, and allow a single copy of the knowledge base to service multiple users in a networked environment.

  20. Plasma and breast milk pharmacokinetics of emtricitabine, tenofovir and lamivudine using dried blood and breast milk spots in nursing African mother–infant pairs

    PubMed Central

    Waitt, Catriona; Olagunju, Adeniyi; Nakalema, Shadia; Kyohaire, Isabella; Owen, Andrew; Lamorde, Mohammed; Khoo, Saye

    2018-01-01

    Background Breast milk transfer of first-line ART from mother to infant is not fully understood. Objectives To determine the concentrations of lamivudine, emtricitabine and tenofovir in maternal blood, breast milk and infant blood from breastfeeding mother–infant pairs. Methods Intensive pharmacokinetic sampling of maternal dried blood spots (DBS), dried breast milk spots (DBMS) and infant DBS from 30 Ugandan and 29 Nigerian mothers receiving first-line ART and their infants was conducted. DBS and DBMS were collected pre-dose and at 5–6 timepoints up to 12 h following observed dosing. Infant DBS were sampled twice during this period. Lamivudine, emtricitabine and tenofovir were quantified using LC-MS/MS, with non-compartmental analysis to calculate key pharmacokinetic parameters. Results Peak concentrations in breast milk from women taking lamivudine and emtricitabine occurred later than in plasma (4–8 h compared with 2 h for lamivudine and 2–4 h for emtricitabine). Consequently, the milk-to-plasma (M:P) ratio of lamivudine taken once daily was 0.95 (0.82–1.15) for AUC0–12, whereas for AUC12–20 it was 3.04 (2.87–4.16). Lamivudine was detectable in 36% (14/39) of infants [median 17.7 (16.3–22.7) ng/mL]. For 200 mg of emtricitabine once daily, the median M:P ratio was 3.01 (2.06–3.38). Three infants (19%) had measurable emtricitabine [median 18.5 (17.6–20.8) ng/mL]. For 300 mg of tenofovir once daily, the median M:P ratio was 0.015 (0–0.03) and no infant had measurable tenofovir concentrations. Conclusions Emtricitabine and lamivudine accumulate in breast milk and were detected in breastfeeding infants. In contrast, tenofovir penetrates breast milk only to a small degree and is undetectable in breastfeeding infants. PMID:29309634
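
    The key non-compartmental quantity here, the milk-to-plasma ratio, is a ratio of AUCs over the same time window, with each AUC computed by the trapezoidal rule from the spot concentrations. A worked sketch with invented concentration-time data follows; it is not the study's dataset.

        # Hedged sketch: trapezoidal AUC0-12 and milk-to-plasma (M:P) ratio.
        # Concentration-time values below are invented for illustration.
        times = [0, 2, 4, 6, 8, 12]                            # h post-dose
        plasma = [120.0, 890.0, 640.0, 420.0, 300.0, 150.0]    # ng/mL
        milk = [100.0, 520.0, 780.0, 700.0, 510.0, 260.0]      # ng/mL, peaks later

        def auc_trapezoid(t, c):
            # Linear trapezoidal rule over the sampling window.
            return sum((t[i + 1] - t[i]) * (c[i] + c[i + 1]) / 2.0
                       for i in range(len(t) - 1))

        mp_ratio = auc_trapezoid(times, milk) / auc_trapezoid(times, plasma)
        print(f"M:P ratio over 0-12 h: {mp_ratio:.2f}")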

  1. Plasma and breast milk pharmacokinetics of emtricitabine, tenofovir and lamivudine using dried blood and breast milk spots in nursing African mother-infant pairs.

    PubMed

    Waitt, Catriona; Olagunju, Adeniyi; Nakalema, Shadia; Kyohaire, Isabella; Owen, Andrew; Lamorde, Mohammed; Khoo, Saye

    2018-04-01

    Breast milk transfer of first-line ART from mother to infant is not fully understood. To determine the concentrations of lamivudine, emtricitabine and tenofovir in maternal blood, breast milk and infant blood from breastfeeding mother-infant pairs. Intensive pharmacokinetic sampling of maternal dried blood spots (DBS), dried breast milk spots (DBMS) and infant DBS from 30 Ugandan and 29 Nigerian mothers receiving first-line ART and their infants was conducted. DBS and DBMS were collected pre-dose and at 5-6 timepoints up to 12 h following observed dosing. Infant DBS were sampled twice during this period. Lamivudine, emtricitabine and tenofovir were quantified using LC-MS/MS, with non-compartmental analysis to calculate key pharmacokinetic parameters. Peak concentrations in breast milk from women taking lamivudine and emtricitabine occurred later than in plasma (4-8 h compared with 2 h for lamivudine and 2-4 h for emtricitabine). Consequently, the milk-to-plasma (M:P) ratio of lamivudine taken once daily was 0.95 (0.82-1.15) for AUC0-12, whereas for AUC12-20 this was 3.04 (2.87-4.16). Lamivudine was detectable in 36% (14/39) of infants [median 17.7 (16.3-22.7) ng/mL]. For 200 mg of emtricitabine once daily, the median M:P ratio was 3.01 (2.06-3.38). Three infants (19%) had measurable emtricitabine [median 18.5 (17.6-20.8) ng/mL]. For 300 mg of tenofovir once daily, the median M:P ratio was 0.015 (0-0.03) and no infant had measurable tenofovir concentrations. Emtricitabine and lamivudine accumulate in breast milk and were detected in breastfeeding infants. In contrast, tenofovir penetrates the breast milk to a small degree, but is undetectable in breastfeeding infants.

  2. A first evaluation of a pedagogical network for medical students at the University Hospital of Rennes.

    PubMed

    Fresnel, A; Jarno, P; Burgun, A; Delamarre, D; Denier, P; Cleret, M; Courtin, C; Seka, L P; Pouliquen, B; Cléran, L; Riou, C; Leduff, F; Lesaux, H; Duvauferrier, R; Le Beux, P

    1998-01-01

    A pedagogical network has been developed at the University Hospital of Rennes since 1996. The challenge is to give medical information and informatics tools to all medical students in the clinical wards of the University Hospital. At first, nine wards were connected to the medical school server, which is linked to the Internet. The client software consists of electronic mail and the Netscape WWW browser on Macintosh computers. The server software is set up on a Unix SUN machine, providing a local homepage with selected pedagogical resources. These documents are stored in an ORACLE DBMS database, and queries can be made by specialty, author or disease. The students can access a set of interactive teaching programs or electronic textbooks and can explore the Internet through the library information system and search engines. The teachers can send URLs and indexation of pedagogical documents and can produce clinical cases; the database updating will be done by the users. This experience of using Web tools generated enthusiasm when we first introduced it to students. The evaluation shows that if the students can use this training early on, they will adapt the resources of the Internet to their own needs.

  3. Security of medical multimedia.

    PubMed

    Tzelepi, S; Pangalos, G; Nikolacopoulou, G

    2002-09-01

    The application of information technology to health care has generated growing concern about the privacy and security of medical information. Furthermore, data and communication security requirements in the field of multimedia are higher. In this paper we first describe the most important security requirements that must be fulfilled by multimedia medical data, and the security measures used to satisfy these requirements. These security measures are based mainly on modern cryptographic and watermarking mechanisms as well as on security infrastructures. The objective of our work is to complete this picture, exploiting the capabilities of multimedia medical data to define and implement an authorization model for regulating access to the data. We describe an extended role-based access control model that considers, within the specification of the role-permission relationship phase, the constraints that must be satisfied in order for the holders of the permission to use those permissions. The use of constraints allows role-based access control to be tailored to specify very fine-grained and flexible content-, context- and time-based access control policies. Other restrictions, such as role entry restrictions, can also be captured. Finally, a system architecture for a secure DBMS is presented.
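
    The extension described attaches constraints to the role-permission relationship, so holding a permission is not enough: its constraint (content-, context- or time-based) must also hold at access time. A compact sketch of that check follows, with hypothetical roles, permissions and a time-window constraint invented for illustration.

        from datetime import datetime

        def within_shift(ctx):
            # Time-based constraint: permission usable only 08:00-20:00.
            return 8 <= ctx["now"].hour < 20

        # role -> list of (permission, constraint) pairs; each constraint is
        # a predicate over the access context, evaluated at access time.
        role_permissions = {
            "radiologist": [("view_mri", within_shift),
                            ("annotate_mri", within_shift)],
            "admin": [("view_audit_log", lambda ctx: True)],
        }

        def check_access(role, permission, ctx):
            # Grant only if the role holds the permission AND the attached
            # constraint is satisfied in the current context.
            return any(p == permission and constraint(ctx)
                       for p, constraint in role_permissions.get(role, []))

        ctx = {"now": datetime(2002, 9, 1, 14, 30)}
        print(check_access("radiologist", "view_mri", ctx))  # True: in shift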

  4. ALMA software architecture

    NASA Astrophysics Data System (ADS)

    Schwarz, Joseph; Raffi, Gianni

    2002-12-01

    The Atacama Large Millimeter Array (ALMA) is a joint project involving astronomical organizations in Europe and North America. ALMA will consist of at least 64 12-meter antennas operating in the millimeter and sub-millimeter range. It will be located at an altitude of about 5000m in the Chilean Atacama desert. The primary challenge to the development of the software architecture is the fact that both its development and runtime environments will be distributed. Groups at different institutes will develop the key elements such as Proposal Preparation tools, Instrument operation, On-line calibration and reduction, and Archiving. The Proposal Preparation software will be used primarily at scientists' home institutions (or on their laptops), while Instrument Operations will execute on a set of networked computers at the ALMA Operations Support Facility. The ALMA Science Archive, itself to be replicated at several sites, will serve astronomers worldwide. Building upon the existing ALMA Common Software (ACS), the system architects will prepare a robust framework that will use XML-encoded entity objects to provide an effective solution to the persistence needs of this system, while remaining largely independent of any underlying DBMS technology. Independence of distributed subsystems will be facilitated by an XML- and CORBA-based pass-by-value mechanism for exchange of objects. Proof of concept (as well as a guide to subsystem developers) will come from a prototype whose details will be presented.

  5. Status of the CDS Services, SIMBAD, VizieR and Aladin

    NASA Astrophysics Data System (ADS)

    Genova, Francoise; Allen, M. G.; Bienayme, O.; Boch, T.; Bonnarel, F.; Cambresy, L.; Derriere, S.; Dubois, P.; Fernique, P.; Landais, G.; Lesteven, S.; Loup, C.; Oberto, A.; Ochsenbein, F.; Schaaff, A.; Vollmer, B.; Wenger, M.; Louys, M.; Davoust, E.; Jasniewicz, G.

    2006-12-01

    Major evolutions have been implemented in the three main CDS databases in 2006. SIMBAD 4, a new version of SIMBAD developed with Java and PostgreSQL, has been released. It is much more flexible than the previous version and offers in particular full search capabilities on all parameters. Wild cards can also be used in object names, which should ease searching for a given object in the frequent case of 'fuzzy' nomenclature. New information is progressively being added, in particular a set of multiwavelength magnitudes (in progress) and other information from the Dictionary of Nomenclature, such as the list of object types attached to each object name (available), or hierarchy and associations (in progress). A new version of VizieR, also in the open source PostgreSQL DBMS, has been completed in order to simplify mirroring. The master database at CDS currently remains in the present Sybase implementation. A new simplified interface will be demonstrated, providing more user-friendly navigation while retaining the multiple browsing capabilities. A new release of the Aladin Sky Atlas offers new capabilities, like the management of multipart FITS files and of data cubes, the construction and execution of macros for processing a list of targets, and improved navigation within an image plane. This new version also allows easy and efficient manipulation of very large (>10^8 pixels) images, support for solar image display, and direct access to SExtractor to perform source extraction on displayed images.

  6. Dual-use bimorph deformable mirrors

    NASA Astrophysics Data System (ADS)

    Griffith, M. S.; Laycock, L. C.; Bagshaw, J. M.; Rowe, D.

    2005-11-01

    Adaptive Optics (AO) is a critical underpinning technology for future optical countermeasures, laser delivery, target illumination and imaging systems. It measures and compensates for optical distortion caused by transmission through the atmosphere, resulting in the ability to deploy smaller lasers and identify targets at greater ranges. AO is also well established in ground-based astronomy, and is finding applications in free space optical communications and ophthalmology. One of the key components in an AO system is the wavefront modifier, which acts on the incoming or outgoing beam to counter the effects of the atmosphere. BAE SYSTEMS ATC is developing multi-element Deformable Bimorph Mirrors (DBMs) for such applications. A traditional bimorph deformable mirror uses a set of edge electrodes outside the active area in order to meet the required boundary conditions for the active aperture. This inflicts a significant penalty in terms of bandwidth, which is inversely proportional to the square of the full mirror diameter. We have devised a number of novel mounting arrangements that reduce dead space and thus provide a much improved trade-off between bandwidth and stroke. These schemes include a novel method for providing vertical displacement at the periphery of the aperture, a method for providing a continuous compliant support underneath the bimorph mirror, and a method for providing a three-point support underneath the bimorph. In all three cases, there is no requirement for edge electrodes to provide the boundary conditions, resulting in devices of much higher bandwidth. The target is to broaden the use of these types of mirror beyond the current low-order/low-bandwidth limits, to address the high-order, high-bandwidth systems required by long range, horizontal path applications. This paper discusses the different mirror designs and presents experimental results for the most recently assembled mirrors.

  7. Column Store for GWAC: A High-cadence, High-density, Large-scale Astronomical Light Curve Pipeline and Distributed Shared-nothing Database

    NASA Astrophysics Data System (ADS)

    Wan, Meng; Wu, Chao; Wang, Jing; Qiu, Yulei; Xin, Liping; Mullender, Sjoerd; Mühleisen, Hannes; Scheers, Bart; Zhang, Ying; Nes, Niels; Kersten, Martin; Huang, Yongpan; Deng, Jinsong; Wei, Jianyan

    2016-11-01

    The ground-based wide-angle camera array (GWAC), a part of the SVOM space mission, will search for various types of optical transients by continuously imaging a field of view (FOV) of 5000 square degrees every 15 s. Each exposure consists of 36 × 4k × 4k pixels, typically resulting in 36 × ~175,600 extracted sources. For a modern time-domain astronomy project like GWAC, which produces massive amounts of data with a high cadence, it is challenging to search for short timescale transients in both real-time and archived data, and to build long-term light curves for variable sources. Here, we develop a high-cadence, high-density light curve pipeline (HCHDLP) to process the GWAC data in real-time, and design a distributed shared-nothing database to manage the massive amount of archived data which will be used to generate a source catalog with more than 100 billion records during 10 years of operation. First, we develop HCHDLP based on the column-store DBMS of MonetDB, taking advantage of MonetDB's high performance when applied to massive data processing. To realize the real-time functionality of HCHDLP, we optimize the pipeline in its source association function, including both time and space complexity from outside the database (SQL semantics) and inside (RANGE-JOIN implementation), as well as in its strategy of building complex light curves. The optimized source association function is accelerated by three orders of magnitude. Second, we build a distributed database using a two-level time partitioning strategy via the MERGE TABLE and REMOTE TABLE technology of MonetDB. Intensive tests validate that our database architecture is able to achieve both linear scalability in response time and concurrent access by multiple users. In summary, our studies provide guidance for a solution to GWAC in real-time data processing and management of massive data.
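
    MonetDB's MERGE TABLE and REMOTE TABLE features combine a partitioned logical table with partitions hosted on other shared-nothing nodes. The statements below sketch the two-level time-partitioning idea; the host names, database names, column layout and monthly granularity are assumptions for illustration, not the GWAC deployment.

        # Hedged sketch of MonetDB time partitioning; names are illustrative.
        statements = [
            # Logical (merge) table spanning all light-curve partitions.
            """CREATE MERGE TABLE lightcurve (
                   src_id BIGINT, obs_ts TIMESTAMP, mag REAL)""",
            # A local partition holding one month of records.
            """CREATE TABLE lightcurve_2016_11 (
                   src_id BIGINT, obs_ts TIMESTAMP, mag REAL)""",
            # A partition physically stored on another node of the cluster.
            """CREATE REMOTE TABLE lightcurve_2016_10 (
                   src_id BIGINT, obs_ts TIMESTAMP, mag REAL)
               ON 'mapi:monetdb://node2:50000/gwac'""",
            # Attach both partitions to the logical table.
            "ALTER TABLE lightcurve ADD TABLE lightcurve_2016_11",
            "ALTER TABLE lightcurve ADD TABLE lightcurve_2016_10",
        ]
        for sql in statements:
            print(sql)  # submit via a MonetDB client in a real deployment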

  8. High Performance Analytics with the R3-Cache

    NASA Astrophysics Data System (ADS)

    Eavis, Todd; Sayeed, Ruhan

    Contemporary data warehouses now represent some of the world’s largest databases. As these systems grow in size and complexity, however, it becomes increasingly difficult for brute force query processing approaches to meet the performance demands of end users. Certainly, improved indexing and more selective view materialization are helpful in this regard. Nevertheless, with warehouses moving into the multi-terabyte range, it is clear that the minimization of external memory accesses must be a primary performance objective. In this paper, we describe the R3-cache, a natively multi-dimensional caching framework designed specifically to support sophisticated warehouse/OLAP environments. The R3-cache is based upon an in-memory version of the R-tree that has been extended to support buffer pages rather than disk blocks. A key strength of the R3-cache is that it is able to utilize multi-dimensional fragments of previous query results so as to significantly minimize the frequency and scale of disk accesses. Moreover, the new caching model directly accommodates the standard relational storage model and provides mechanisms for pro-active updates that exploit the existence of query “hot spots”. The current prototype has been evaluated as a component of the Sidera DBMS, a “shared nothing” parallel OLAP server designed for multi-terabyte analytics. Experimental results demonstrate significant performance improvements relative to simpler alternatives.
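    As an illustration of the caching idea (not the authors' implementation), the sketch below stores multidimensional fragments of previous query results in an in-memory R-tree using the Python rtree package, so later queries can reuse overlapping fragments; all names are ours:

      from rtree import index

      class FragmentCache:
          def __init__(self, dims=2):
              p = index.Property()
              p.dimension = dims
              self.idx = index.Index(properties=p)
              self.fragments = {}          # id -> (bbox, result payload)
              self.next_id = 0

          def put(self, bbox, payload):
              """bbox is interleaved: (min_d1, ..., min_dk, max_d1, ..., max_dk)."""
              self.idx.insert(self.next_id, bbox)
              self.fragments[self.next_id] = (bbox, payload)
              self.next_id += 1

          def probe(self, bbox):
              """Return cached fragments overlapping the query box; the
              caller fetches only the uncovered remainder from disk."""
              return [self.fragments[i] for i in self.idx.intersection(bbox)]

      cache = FragmentCache(dims=2)
      cache.put((0, 0, 10, 10), "aggregate block A")
      print(cache.probe((5, 5, 20, 20)))   # overlaps block A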

  9. A hybrid 3D spatial access method based on quadtrees and R-trees for globe data

    NASA Astrophysics Data System (ADS)

    Gong, Jun; Ke, Shengnan; Li, Xiaomin; Qi, Shuhua

    2009-10-01

    A 3D spatial access method for globe data is a crucial technique for virtual-earth systems. This paper presents a new method to index 3D objects distributed over the whole surface of the earth, which integrates 1:1,000,000-scale topographic map tiles, a quadtree, and an R-tree. When traditional methods such as the 3D R-tree are extended into 3D space, the performance of the spatial index deteriorates badly. In order to solve this difficult problem effectively, a new dynamic R-tree algorithm is put forward, comprising two sub-procedures: node choosing and node splitting. The node-choosing algorithm adopts a new strategy: instead of the traditional top-down traversal, it first works from bottom to top and then from top to bottom, which effectively reduces the negative influence of node overlap. The node-split algorithm substitutes a 2-to-3 split for the traditional 1-to-2 split, which better controls the shape and size of nodes. Because of the resulting well-balanced tree shape, this R-tree method can easily integrate the concept of LOD (level of detail). It is therefore suitable for implementation in a commercial DBMS and for adoption in time-critical 3D GIS systems.
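    The top level of such a hybrid scheme can be pictured with a short sketch: partition the globe by 1:1,000,000 map sheets (6° by 4° under the International Map of the World convention) and keep a small 3D R-tree per sheet, so no single tree spans the whole earth. This is our own simplified rendering, not the paper's algorithm, which additionally modifies node choosing and splitting:

      from rtree import index

      def tile_key(lon, lat):
          # 1:1,000,000 sheets: 6 deg of longitude by 4 deg of latitude.
          return (int((lon + 180) // 6), int((lat + 90) // 4))

      class GlobeIndex:
          def __init__(self):
              self.tiles = {}              # tile key -> per-tile 3D R-tree

          def insert(self, oid, lon, lat, bbox3d):
              """bbox3d = (minx, miny, minz, maxx, maxy, maxz)."""
              key = tile_key(lon, lat)
              if key not in self.tiles:
                  p = index.Property()
                  p.dimension = 3
                  self.tiles[key] = index.Index(properties=p)
              self.tiles[key].insert(oid, bbox3d)

      gi = GlobeIndex()
      gi.insert(1, lon=7.5, lat=45.7, bbox3d=(7.5, 45.7, 0, 7.6, 45.8, 50))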

  10. Silicon photonic integrated circuits with electrically programmable non-volatile memory functions.

    PubMed

    Song, J-F; Lim, A E-J; Luo, X-S; Fang, Q; Li, C; Jia, L X; Tu, X-G; Huang, Y; Zhou, H-F; Liow, T-Y; Lo, G-Q

    2016-09-19

    Conventional silicon photonic integrated circuits do not normally possess memory functions, which require on-chip power in order to maintain circuit states in tuned or field-configured switching routes. In this context, we present an electrically programmable add/drop microring resonator with a wavelength shift of 426 pm between the ON/OFF states. Electrical pulses are used to control the choice of the state. Our experimental results show a wavelength shift of 2.8 pm/ms and a light intensity variation of ~0.12 dB/ms for a fixed wavelength in the OFF state. Theoretically, our device can accommodate up to 65 states of multi-level memory functions. Such memory functions can be integrated into wavelength division multiplexing (WDM) filters and applied to optical routers and computing architectures fulfilling large data downloading demands.

  11. Retrieving high-resolution images over the Internet from an anatomical image database

    NASA Astrophysics Data System (ADS)

    Strupp-Adams, Annette; Henderson, Earl

    1999-12-01

    The Visible Human Data set is an important contribution to the national collection of anatomical images. To enhance the availability of these images, the National Library of Medicine has supported the design and development of a prototype object-oriented image database which imports, stores, and distributes high-resolution anatomical images in both pixel and voxel formats. One of the key database modules is its client-server Internet interface. This Web interface provides a query engine with retrieval access to high-resolution anatomical images that range in size from 100KB for browser-viewable rendered images, to 1GB for anatomical structures in voxel file formats. The Web query and retrieval client-server system is composed of applet GUIs, servlets, and RMI application modules which communicate with each other to allow users to query for specific anatomical structures, and to retrieve image data as well as associated anatomical images from the database. Selected images can be downloaded individually as single files via HTTP, or downloaded in batch mode over the Internet to the user's machine through an applet that uses Netscape's Object Signing mechanism. The image database uses ObjectDesign's object-oriented DBMS, ObjectStore, which has a Java interface. The query and retrieval system has been tested with a Java-CDE window system, and on the x86 architecture using Windows NT 4.0. This paper describes the Java applet client search engine that queries the database; the Java client module that enables users to view anatomical images online; and the Java application server interface to the database, which organizes data returned to the user, and its distribution engine, which allows users to download image files individually and/or in batch mode.

  12. Small-Scale, Local Area, and Transitional Millimeter Wave Propagation for 5G Communications

    NASA Astrophysics Data System (ADS)

    Rappaport, Theodore S.; MacCartney, George R.; Sun, Shu; Yan, Hangsong; Deng, Sijia

    2017-12-01

    This paper studies radio propagation mechanisms that impact handoffs, air interface design, beam steering, and MIMO for 5G mobile communication systems. Knife edge diffraction (KED) and a creeping wave linear model are shown to predict diffraction loss around typical building objects from 10 to 26 GHz, and human blockage measurements at 73 GHz are shown to fit a double knife-edge diffraction (DKED) model which incorporates antenna gains. Small-scale spatial fading of millimeter wave received signal voltage amplitude is generally Ricean-distributed for both omnidirectional and directional receive antenna patterns under both line-of-sight (LOS) and non-line-of-sight (NLOS) conditions in most cases, although the log-normal distribution fits measured data better for the omnidirectional receive antenna pattern in the NLOS environment. Small-scale spatial autocorrelations of received voltage amplitudes are shown to fit sinusoidal exponential and exponential functions for LOS and NLOS environments, respectively, with small decorrelation distances of 0.27 cm to 13.6 cm (smaller than the size of a handset) that are favorable for spatial multiplexing. Local area measurements using cluster and route scenarios show how the received signal changes as the mobile moves and transitions from LOS to NLOS locations, with reasonably stationary signal levels within clusters. Wideband mmWave power levels are shown to fade from 0.4 dB/ms to 40 dB/s, depending on travel speed and surroundings.
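    For readers who want to experiment with the diffraction behaviour described above, the following sketch evaluates the standard single knife-edge diffraction loss approximation from ITU-R P.526; it is a textbook baseline, not the exact model fitted in the paper:

      import math

      def ked_loss_db(v):
          """Diffraction loss J(v) in dB for Fresnel-Kirchhoff parameter v
          (ITU-R P.526 approximation, valid for v > -0.78)."""
          if v <= -0.78:
              return 0.0
          return 6.9 + 20 * math.log10(math.sqrt((v - 0.1) ** 2 + 1) + v - 0.1)

      # v grows as the edge penetrates deeper into the first Fresnel zone.
      for v in (-0.5, 0.0, 1.0, 2.5):
          print(f"v = {v:+.1f}  loss = {ked_loss_db(v):5.1f} dB")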

  13. Implementing CUAHSI and SWE observation data models in the long-term monitoring infrastructure TERENO

    NASA Astrophysics Data System (ADS)

    Klump, J. F.; Stender, V.; Schroeder, M.

    2013-12-01

    Terrestrial Environmental Observatories (TERENO) is an interdisciplinary and long-term research project spanning an Earth observation network across Germany. It includes four test sites within Germany, from the North German lowlands to the Bavarian Alps, and is operated by six research centers of the Helmholtz Association. The contribution of the participating research centers is organized as regional observatories. The challenge for TERENO and its observatories is to integrate all aspects of data management, data workflows, data modeling and visualization into the design of a monitoring infrastructure. TERENO Northeast is one of the sub-observatories of TERENO and is operated by the German Research Centre for Geosciences GFZ in Potsdam. This observatory investigates geoecological processes in the northeastern lowland of Germany by collecting large amounts of environmentally relevant data. The success of long-term projects like TERENO depends on well-organized data management, data exchange between the partners involved, and the availability of the captured data. Data discovery and dissemination are facilitated not only through the data portals of the regional TERENO observatories but also through a common spatial data infrastructure, TEODOOR. TEODOOR bundles the data provided by the different web services of the single observatories, and provides tools for data discovery, visualization and data access. The TERENO Northeast data infrastructure integrates data from more than 200 instruments and makes the data available through standard web services. Data are stored following the CUAHSI observation data model in combination with the 52°North Sensor Observation Service data model. The data model was implemented using the PostgreSQL/PostGIS DBMS. Especially in a long-term project such as TERENO, care has to be taken in the design of the data model. We chose to adopt the CUAHSI observational data model because it is designed to store observations and descriptive information (metadata) about the data values in combination with information about the sensor systems. Also, the CUAHSI model is supported by a large and active international user community. The 52°North SOS data model can be modeled as a sub-set of the CUAHSI data model. In our implementation, the 52°North SWE data model is implemented as database views of the CUAHSI model to avoid redundant data storage. An essential aspect in TERENO Northeast is the use of standard OGC web services to facilitate data exchange and interoperability. A uniform treatment of sensor data can be realized through OGC Sensor Web Enablement (SWE), which makes a number of standards and interface definitions available: the Observations & Measurements (O&M) model for the description of observations and measurements, the Sensor Model Language (SensorML) for the description of sensor systems, the Sensor Observation Service (SOS) for obtaining sensor observations, the Sensor Planning Service (SPS) for tasking sensors, the Web Notification Service (WNS) for asynchronous dialogues, and the Sensor Alert Service (SAS) for sending alerts.
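    A minimal sketch of the view-based mapping described above, assuming the CUAHSI ODM 1.1 table names (DataValues, Sites, Variables); the view name and the exact column mapping are illustrative assumptions, not the actual TERENO schema:

      import psycopg2

      conn = psycopg2.connect("dbname=tereno user=postgres")
      with conn, conn.cursor() as cur:
          # Expose ODM observations as an SOS-style view instead of
          # copying the rows into a second schema.
          cur.execute("""
              CREATE OR REPLACE VIEW sos_observation AS
              SELECT dv.ValueID        AS observation_id,
                     s.SiteCode        AS feature_of_interest,
                     v.VariableCode    AS observed_property,
                     dv.LocalDateTime  AS phenomenon_time,
                     dv.DataValue      AS result
              FROM DataValues dv
              JOIN Sites s     ON s.SiteID = dv.SiteID
              JOIN Variables v ON v.VariableID = dv.VariableID
          """)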

  14. Analyzing CRISM hyperspectral imagery using PlanetServer.

    NASA Astrophysics Data System (ADS)

    Figuera, Ramiro Marco; Pham Huu, Bang; Minin, Mikhail; Flahaut, Jessica; Halder, Anik; Rossi, Angelo Pio

    2017-04-01

    Mineral characterization of planetary surfaces bears great importance for space exploration. In order to perform it, orbital hyperspectral imagery is widely used. In our research we use Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) [1] TRDR L observations with a spectral range of 1 to 4 µm. PlanetServer comprises a server, a web client and a Python client/API. The server side uses the Array DataBase Management System (DBMS) Raster Data Manager (Rasdaman) Community Edition [2]. OGC standards such as the Web Coverage Processing Service (WCPS) [3], an SQL-like language capable of querying information along the image cube, are implemented in the PetaScope component [4]. The client side uses NASA's Web World Wind [5], allowing the user to access the data in an intuitive way. The client consists of a globe where all cubes are deployed, a main menu where projections, base maps and RGB combinations are provided, and a plot dock where the spectral information is shown. The RGB combinator tool allows band combinations, such as the CRISM products [6], to be computed using WCPS. The spectral information is retrieved using WCPS and shown in the plot dock/widget. The USGS splib06a library [7] is available for comparing CRISM spectra with laboratory spectra. The Python API provides an environment to create RGB combinations that can be embedded into existing pipelines. All employed libraries and tools are open source and can be easily adapted to other datasets. PlanetServer stands as a promising tool for spectral analysis on planetary bodies. M3/Moon and OMEGA datasets will soon be available. [1] S. Murchie et al., "Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) on Mars Reconnaissance Orbiter (MRO)," J. Geophys. Res. E Planets, 2007. [2] P. Baumann, A. Dehmel, P. Furtado, R. Ritsch, and N. Widmann, "The multidimensional database system RasDaMan," ACM SIGMOD Rec., vol. 27, no. 2, pp. 575-577, Jun. 1998. [3] P. Baumann, "The OGC web coverage processing service (WCPS) standard," Geoinformatica, vol. 14, no. 4, Jul. 2010. [4] A. Aiordǎchioaie and P. Baumann, "PetaScope: An open-source implementation of the OGC WCS Geo service standards suite," Lect. Notes Comput. Sci., vol. 6187 LNCS, pp. 160-168, Jun. 2010. [5] P. Hogan, C. Maxwell, R. Kim, and T. Gaskins, "World Wind 3D Earth Viewing," Apr. 2007. [6] C. E. Viviano-Beck et al., "Revised CRISM spectral parameters and summary products based on the currently detected mineral diversity on Mars," J. Geophys. Res. E Planets, vol. 119, no. 6, pp. 1403-1431, Jun. 2014. [7] R. N. Clark et al., "USGS digital spectral library splib06a: U.S. Geological Survey, Digital Data Series 231," 2007. [Online]. Available: http://speclab.cr.usgs.gov/spectral.lib06.
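    A hedged example of the kind of WCPS request such a client issues against a rasdaman/PetaScope endpoint; the endpoint URL, coverage name, and axis name below are placeholders rather than actual PlanetServer identifiers:

      import requests

      endpoint = "https://example.org/rasdaman/ows"   # hypothetical PetaScope URL

      # Slice one band out of a hyperspectral cube; coverage and axis
      # names ("crism_cube", "band") are illustrative assumptions.
      wcps = ('for $c in (crism_cube) '
              'return encode($c[band(233)], "csv")')

      resp = requests.get(endpoint, params={
          "service": "WCS", "version": "2.0.1",
          "request": "ProcessCoverages", "query": wcps,
      })
      print(resp.text[:200])   # spectral slice, ready for plotting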

  15. Safety and efficacy of use of demineralised bone matrix in orthopaedic and trauma surgery.

    PubMed

    Dinopoulos, Haralampos T H; Giannoudis, Peter V

    2006-11-01

    Demineralised bone matrix (DBM) acts as an osteoconductive, and possibly as an osteoinductive, material. It is widely used in orthopaedic, neurosurgical, plastic and dental areas. More than 500,000 bone grafting procedures with DBM are performed annually in the US. It does not offer structural support, but it is well suited for filling bone defects and cavities. The osteoinductive nature of DBM is presumably attributed to the presence of matrix-associated bone morphogenetic proteins (BMPs) and growth factors, which are made available to the host environment by the demineralisation process. Clinical results have not been uniformly favourable; however, a variable clinical response is attributed partly to nonuniform processing methods found among numerous bone banks and commercial suppliers. DBMs remain reasonably safe and effective products. The ultimate safe bone-graft substitute, one that is osteoconductive, osteoinductive, osteogenic and mechanically strong, remains elusive.

  16. Repair of massively defected hemi-joints using demineralized osteoarticular allografts with protected cartilage.

    PubMed

    Li, Siming; Yang, Xiaohong; Tang, Shenghui; Zhang, Xunmeng; Feng, Zhencheng; Cui, Shuliang

    2015-08-01

    Surgical replacement of massively defected joints necessarily relies on osteochondral grafts effective for both bone and cartilage. Demineralized bone matrix (DBM) retains osteoconductivity but destroys the viable chondrocytes in the cartilage portion that are essential for successful restoration of defected joints. This study prepared osteochondral grafts of DBM with protected cartilage. The protected cartilage portions were characterized by cellular and molecular biology, and the grafts were used allogenically for grafting. Protected cartilage showed histomorphological structure and protein levels (estimated by total proteins and cartilage-specific proteins) similar to those of fresh controls, while DBM was generated in the bone portions. Such grafts were successfully used for the simultaneous repair of bone and cartilage in massively defected osteoarticular joints within 16 weeks post-surgery. These results present an allograft with clinical potential for the simultaneous restoration of bone and cartilage in defected joints.

  17. Electroantennogram Responses to Plant Volatiles Associated with Fenvalerate Resistance in the Diamondback Moth, Plutella xylostella (Lepidoptera: Plutellidae).

    PubMed

    Houjun, Tian; Lin, Shuo; Chen, Yong; Chen, Yixin; Zhao, Jianwei; Gu, Xiaojun; Wei, Hui

    2018-05-28

    The diamondback moth (DBM), Plutella xylostella (L.) (Lepidoptera: Plutellidae), is the main destructive insect pest of brassica vegetables around the world, and has developed resistance to numerous insecticides. Although host plant volatiles are important in pest control, the mechanism of low-level insecticide resistance in P. xylostella due to plant volatiles has not been examined. Here, electroantennograms (EAGs) were used to compare the responses of adult male and female DBMs of a susceptible strain (S-strain) and a derived resistant strain, Fen-R-strain (6.52-fold more resistant than the S-strain), to different concentrations of nine plant volatiles. We found significantly different relative EAG responses between S-strain and Fen-R-strain males to different concentrations of methyl jasmonate, methyl salicylate, and octanal. The relative EAG responses of S-strain and Fen-R-strain females to different concentrations of β-myrcene, methyl jasmonate, methyl salicylate, and allyl isothiocyanate were significantly different. Fen-R-strain females showed lower EAG responses to most of the tested plant volatiles (at concentrations of 1:10) than males, except for allyl isothiocyanate. A larger difference in relative EAG response to α-farnesene and β-myrcene was found between S-strain and Fen-R-strain females than between males of the two strains. A larger difference in relative EAG response to octanal, nonanal, and octan-1-ol was found between S-strain and Fen-R-strain males than between females of the two strains. These results illustrate the relationship between the function of plant volatiles and resistance in an insect pest species, and provide a scientific basis for resistance evolutionary theory in pest management research.

  18. Complex multidisciplinary system composition for aerospace vehicle conceptual design

    NASA Astrophysics Data System (ADS)

    Gonzalez, Lex

    Although there exists a vast amount of work concerning the analysis, design, and integration of aerospace vehicle systems, there is no standard for how this data and knowledge should be combined in order to create a synthesis system. Each institution creating a synthesis system has in-house vehicle and hardware components it is attempting to model, and proprietary methods with which to model them. As a result, synthesis systems begin as one-off creations meant to answer a specific problem. As the scope of the synthesis system grows to encompass more and more problems, so does its size and complexity; for a single synthesis system to answer multiple questions, the number of methods and method interfaces must increase. To avoid the situation in which any increase in an aircraft synthesis system's capability requires an increase in its size and complexity, this research effort focuses on the idea that each problem in aerospace requires its own analysis framework. By focusing on the creation of a methodology which centers on matching an analysis framework to the problem being solved, the complexity of the analysis framework is decoupled from the complexity of the system that creates it. The derived methodology allows for the composition of complex multi-disciplinary systems (CMDS) through the automatic creation and implementation of system and disciplinary method interfaces. The CMDS Composition process follows a four-step methodology meant to take a problem definition and progress towards the creation of an analysis framework meant to answer said problem. The implementation of the CMDS Composition process takes user-selected disciplinary analysis methods and automatically integrates them to create a syntactically composable analysis framework. As a means of assessing the validity of the CMDS Composition process, a prototype system (AVDDBMS) has been developed. AVDDBMS has been used to model the Generic Hypersonic Vehicle (GHV), an open-source family of hypersonic vehicles originating from the Air Force Research Laboratory. AVDDBMS has been applied in three different ways in order to assess its validity: verification using GHV disciplinary data, validation using selected disciplinary analysis methods, and application of the CMDS Composition process to assess the design solution space for the GHV hardware. The research demonstrates the holistic effect that the selection of individual disciplinary analysis methods has on the structure and integration of the analysis framework.
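    The composition idea can be pictured with a toy sketch of our own: each disciplinary method declares its inputs and outputs, and a framework is assembled automatically by chaining methods in data-dependency order. Method names and formulas below are invented for illustration; this is not the AVDDBMS implementation:

      def compose(methods, given):
          """methods: list of (name, inputs, outputs, fn); given: dict of
          known variables. Greedily run every method whose inputs are
          available (a topological ordering by data dependency)."""
          data = dict(given)
          pending = list(methods)
          while pending:
              runnable = [m for m in pending if set(m[1]) <= set(data)]
              if not runnable:
                  raise ValueError("inputs missing: no composable framework")
              for m in runnable:
                  name, ins, outs, fn = m
                  data.update(zip(outs, fn(*(data[i] for i in ins))))
                  pending.remove(m)
          return data

      # Toy disciplinary methods: aerodynamics feeds performance.
      aero = ("aero", ["mach"], ["l_over_d"], lambda m: (8.0 / m,))
      perf = ("perf", ["l_over_d", "isp"], ["range_km"],
              lambda ld, isp: (ld * isp * 9.81 * 0.2,))
      print(compose([aero, perf], {"mach": 6.0, "isp": 1000.0}))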

  19. On-line applications of numerical models in the Black Sea GIS

    NASA Astrophysics Data System (ADS)

    Zhuk, E.; Khaliulin, A.; Zodiatis, G.; Nikolaidis, A.; Nikolaidis, M.; Stylianou, Stavros

    2017-09-01

    The Black Sea Geographical Information System (GIS) is developed based on cutting-edge information technologies, and provides automated data processing and visualization on-line. MapServer is used as the mapping service; the data are stored in the MySQL DBMS; and PHP and Python modules are utilized for data access, processing, and exchange. New numerical models can be incorporated in the GIS environment as individual software modules, compiled for a server-based operating system, providing interaction with the GIS. A common interface allows setting the input parameters; the model then calculates the output data in specifically predefined files and formats. The calculation results are then passed to the GIS for visualization. Initially, a test scenario of integrating a numerical model into the GIS was performed, using software developed to describe two-dimensional tsunami propagation in a basin of variable depth, based on a linear long surface wave model which is valid for depths greater than 5 m. Furthermore, the well-established 3D oil spill and trajectory model MEDSLIK (http://www.oceanography.ucy.ac.cy/medslik/) was integrated into the GIS with more advanced GIS functionality and capabilities. MEDSLIK is able to forecast and hindcast the trajectories of oil pollution and floating objects, using meteo-ocean data and the state of the oil spill. The MEDSLIK module interface allows a user to enter all the necessary oil spill parameters, i.e. date and time, rate of spill or spill volume, forecasting time, coordinates, oil spill type, currents, wind, and waves, as well as the specification of the output parameters. The entered data are passed on to MEDSLIK; the oil pollution characteristics are then calculated for predefined time steps. The results of the forecast or hindcast are then visualized upon a map.
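    The module contract described above can be pictured with a short, heavily hedged sketch: the GIS writes the user's spill parameters to an input file, invokes the server-side model executable, and reads back per-time-step output files. File names, parameter keys, and the executable name are hypothetical, not the actual MEDSLIK interface:

      import json
      import pathlib
      import subprocess

      # Parameters collected by the web interface (illustrative keys).
      params = {
          "date": "2017-09-01", "time": "12:00",
          "lat": 43.2, "lon": 31.5,
          "spill_rate_t_per_h": 5.0, "forecast_hours": 48,
          "oil_type": "medium crude",
      }
      pathlib.Path("spill_input.json").write_text(json.dumps(params))

      # The model runs as a separate server-side executable (hypothetical
      # name), exactly as in the modular GIS design described above.
      subprocess.run(["./medslik_module", "spill_input.json"], check=True)

      # One output file per predefined time step, ready for visualization.
      for step in range(0, params["forecast_hours"] + 1, 6):
          print(pathlib.Path(f"slick_{step:03d}h.txt"))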

  20. Quantitative measurement of a candidate serum biomarker peptide derived from α2-HS-glycoprotein, and a preliminary trial of multidimensional peptide analysis in females with pregnancy-induced hypertension.

    PubMed

    Hamamura, Kensuke; Yanagida, Mitsuaki; Ishikawa, Hitoshi; Banzai, Michio; Yoshitake, Hiroshi; Nonaka, Daisuke; Tanaka, Kenji; Sakuraba, Mayumi; Miyakuni, Yasuka; Takamori, Kenji; Nojima, Michio; Yoshida, Koyo; Fujiwara, Hiroshi; Takeda, Satoru; Araki, Yoshihiko

    2018-03-01

    Purpose: We previously attempted to develop quantitative enzyme-linked immunosorbent assay (ELISA) systems for the PDA039/044/071 peptides, potential serum disease biomarkers (DBMs) of pregnancy-induced hypertension (PIH), primarily identified by a peptidomic approach (BLOTCHIP®-mass spectrometry (MS)). However, our methodology did not extend to PDA071 (cysteinyl α2-HS-glycoprotein 341-367), due to the difficulty of producing a specific antibody against the peptide. The aim of the present study was to establish an alternative PDA071 quantitation system using liquid chromatography-multiple reaction monitoring (LC-MRM)/MS, to explore the potential utility of PDA071 as a DBM for PIH. Methods: We tested heat/acid denaturation methods in efforts to purify serum PDA071 and developed an LC-MRM/MS method allowing for its specific quantitation. We measured serum PDA071 concentrations, and these results were validated, including by three-dimensional (3D) plotting against PDA039 (kininogen-1 439-456) and PDA044 (kininogen-1 438-456) concentrations, followed by discriminant analysis. Results: PDA071 was successfully extracted from serum using a heat denaturation method. Optimum conditions for quantitation via LC-MRM/MS were developed; the assayed serum PDA071 correlated well with the BLOTCHIP® assay values. Although PDA071 alone did not significantly differ between patients and controls, 3D plotting of PDA039/044/071 peptide concentrations and construction of a Jackknife classification matrix were satisfactory in terms of PIH diagnostic precision. Conclusions: Combined analysis of PDA071 and PDA039/044 concentrations allowed PIH diagnostic accuracy to be attained, and our method will be valuable in future pathophysiological studies of hypertensive disorders of pregnancy.

  1. The VO-Dance web application at the IA2 data center

    NASA Astrophysics Data System (ADS)

    Molinaro, Marco; Knapic, Cristina; Smareglia, Riccardo

    2012-09-01

    The Italian center for Astronomical Archives (IA2, http://ia2.oats.inaf.it) is a national infrastructure project of the Italian National Institute for Astrophysics (Istituto Nazionale di AstroFisica, INAF) that provides services for the astronomical community. Besides data hosting for the Large Binocular Telescope (LBT) Corporation, the Galileo National Telescope (Telescopio Nazionale Galileo, TNG) Consortium and other telescopes and instruments, IA2 offers proprietary and public data access through user portals (both developed and mirrored) and deploys resources complying with the Virtual Observatory (VO) standards. Archiving systems and web interfaces are developed to be extremely flexible about adding new instruments from other telescopes. VO resource publishing, along with data access portals, implements the International Virtual Observatory Alliance (IVOA) protocols, providing astronomers with new ways of analyzing data. Given the large variety of data flavours and IVOA standards, and since the data center aims to grow the number of hosted archives and provided services, the need arises for tools to easily accomplish data ingestion and data publishing. This paper describes the VO-Dance tool, which IA2 started developing to address VO resource publishing in a dynamical way from already existing database tables or views. The tool consists of a Java web application, potentially DBMS- and platform-independent, that stores the services' metadata and information internally, exposes RESTful endpoints to accept VO queries for these services, and dynamically translates calls to these endpoints into SQL queries coherent with the published table or view. In response to the call, VO-Dance translates the database answer back in a VO-compliant way.
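    A minimal sketch of the kind of translation such a tool performs, rewriting a VO cone-search call (RA, DEC, SR in degrees) into SQL with a great-circle predicate; the table and column names are our assumptions, not VO-Dance internals:

      def cone_search_sql(table, ra, dec, sr):
          """Rewrite a cone search into SQL using the spherical law of
          cosines; LEAST guards ACOS against rounding above 1.0."""
          return (
              f"SELECT * FROM {table} "
              f"WHERE DEGREES(ACOS(LEAST(1.0, "
              f"SIN(RADIANS(dec_col)) * SIN(RADIANS({dec})) + "
              f"COS(RADIANS(dec_col)) * COS(RADIANS({dec})) * "
              f"COS(RADIANS(ra_col - {ra}))))) <= {sr}"
          )

      print(cone_search_sql("lbt_images", ra=189.2, dec=62.2, sr=0.05))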

  2. Evaluation of a Soil Moisture Data Assimilation System Over West Africa

    NASA Astrophysics Data System (ADS)

    Bolten, J. D.; Crow, W.; Zhan, X.; Jackson, T.; Reynolds, C.

    2009-05-01

    A crucial requirement of global crop yield forecasts by the U.S. Department of Agriculture (USDA) International Production Assessment Division (IPAD) is the regional characterization of surface and sub-surface soil moisture. However, due to the spatial heterogeneity and dynamic nature of precipitation events and the resulting soil moisture, accurate estimation of regional land surface-atmosphere interactions based on sparse ground measurements is difficult. IPAD estimates global soil moisture using daily estimates of minimum and maximum temperature and precipitation applied to a modified Palmer two-layer soil moisture model, which calculates the daily amount of soil moisture withdrawn by evapotranspiration and replenished by precipitation. We attempt to improve upon the existing system by applying an Ensemble Kalman filter (EnKF) data assimilation system to integrate surface soil moisture retrievals from the NASA Advanced Microwave Scanning Radiometer (AMSR-E) into the USDA soil moisture model. This work aims at evaluating the utility of merging satellite-retrieved soil moisture estimates with the IPAD two-layer soil moisture model used within the DBMS. We present a quantitative analysis of the assimilated soil moisture product over West Africa (9°N-20°N; 20°W-20°E). This region contains many key agricultural areas and has a high agro-meteorological gradient from desert and semi-arid vegetation in the north to grassland, trees and crops in the south, thus providing an ideal location for evaluating the assimilated soil moisture product over multiple land cover types and conditions. A data denial experimental approach is utilized to isolate the added utility of integrating remotely sensed soil moisture, by comparing assimilated soil moisture results obtained using (relatively) low-quality precipitation products derived from real-time satellite imagery to baseline model runs forced with higher quality rainfall. An analysis of root-zone anomalies for each model simulation suggests that the assimilation of AMSR-E surface soil moisture retrievals can add significant value to USDA root-zone predictions derived from real-time satellite precipitation products.
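    A minimal stochastic EnKF analysis step in its textbook form (not the USDA system's code): a surface retrieval updates an ensemble of two-layer model states, and the root zone is corrected through the ensemble covariance. All numbers are toy values:

      import numpy as np

      def enkf_update(X, y, H, R, rng):
          """X: n x N ensemble of states; y: m observations; H: m x n
          observation operator; R: m x m observation error covariance."""
          N = X.shape[1]
          Xm = X.mean(axis=1, keepdims=True)
          A = X - Xm                                    # anomalies
          P = A @ A.T / (N - 1)                         # ensemble covariance
          K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
          Y = y[:, None] + rng.multivariate_normal(
              np.zeros(len(y)), R, size=N).T            # perturbed obs
          return X + K @ (Y - H @ X)

      rng = np.random.default_rng(0)
      X = rng.normal(0.25, 0.05, size=(2, 50))   # [surface, root zone] x members
      H = np.array([[1.0, 0.0]])                 # the sensor sees the surface layer
      Xa = enkf_update(X, np.array([0.30]), H, np.array([[0.02 ** 2]]), rng)
      print(Xa.mean(axis=1))                     # root zone moves via covariance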

  3. Incremental Query Rewriting with Resolution

    NASA Astrophysics Data System (ADS)

    Riazanov, Alexandre; Aragão, Marcelo A. T.

    We address the problem of semantic querying of relational databases (RDB) modulo knowledge bases using very expressive knowledge representation formalisms, such as full first-order logic or its various fragments. We propose to use a resolution-based first-order logic (FOL) reasoner for computing schematic answers to deductive queries, with the subsequent translation of these schematic answers to SQL queries which are evaluated using a conventional relational DBMS. We call our method incremental query rewriting, because an original semantic query is rewritten into a (potentially infinite) series of SQL queries. In this chapter, we outline the main idea of our technique: the use of abstractions of databases and constrained clauses for deriving schematic answers. We also provide completeness and soundness proofs to justify the applicability of this technique to the case of resolution for FOL without equality. The proposed method can be directly used with regular RDBs, including legacy databases. Moreover, we propose it as a potential basis for an efficient Web-scale semantic search technology.
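    A toy instance of the final translation step, ours rather than the authors' code: a schematic answer, here a conjunctive body of atoms with shared variables, is emitted as an SQL query over the corresponding relations:

      def to_sql(head_var, body):
          """body: list of (relation, [vars]); emit a conjunctive SQL query
          in which shared variables become equality join conditions."""
          frm, where, seen = [], [], {}
          for i, (rel, args) in enumerate(body):
              frm.append(f"{rel} t{i}")
              for j, v in enumerate(args):
                  col = f"t{i}.arg{j + 1}"
                  if v in seen:
                      where.append(f"{seen[v]} = {col}")   # shared variable
                  else:
                      seen[v] = col
          cond = " AND ".join(where) or "TRUE"
          return (f"SELECT {seen[head_var]} AS answer "
                  f"FROM {', '.join(frm)} WHERE {cond}")

      # flagged(P) :- patient(P, D), contagious(D)
      print(to_sql("P", [("patient", ["P", "D"]), ("contagious", ["D"])]))
      # SELECT t0.arg1 AS answer FROM patient t0, contagious t1
      # WHERE t0.arg2 = t1.arg1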

  4. VO-Dance: an IVOA tool to easily publish data into the VO, and its extension to planetology requests

    NASA Astrophysics Data System (ADS)

    Smareglia, R.; Capria, M. T.; Molinaro, M.

    2012-09-01

    Data publishing through self-standing portals can be joined with VO resource publishing, i.e. astronomical resources deployed through VO-compliant services. Since the IVOA (International Virtual Observatory Alliance) provides many protocols and standards for the various data flavors (images, spectra, catalogues …), and since the data center has the goal of growing the number of archives it hosts and services it provides, the idea arose to find a way to easily deploy and maintain VO resources. VO-Dance is a Java web application developed at IA2 that addresses this idea by creating, in a dynamical way, VO resources out of database tables or views. It is structured to be potentially DBMS- and platform-independent, and consists of an internal DB to store resource descriptions and data model metadata, and a RESTful web application to deploy the resources to the VO community. Its extension to planetology requirements is under study, to make the best of INAF software development effort and archive efficiency.

  5. Scaling up spike-and-slab models for unsupervised feature learning.

    PubMed

    Goodfellow, Ian J; Courville, Aaron; Bengio, Yoshua

    2013-08-01

    We describe the use of two spike-and-slab models for modeling real-valued data, with an emphasis on their applications to object recognition. The first model, which we call spike-and-slab sparse coding (S3C), is a preexisting model for which we introduce a faster approximate inference algorithm. We introduce a deep variant of S3C, which we call the partially directed deep Boltzmann machine (PD-DBM), and extend our S3C inference algorithm for use on this model. We describe learning procedures for each. We demonstrate that our inference procedure for S3C enables scaling the model to unprecedentedly large problem sizes, and demonstrate that using S3C as a feature extractor results in very good object recognition performance, particularly when the number of labeled examples is low. We show that the PD-DBM generates better samples than its shallow counterpart, and that unlike DBMs or DBNs, the PD-DBM may be trained successfully without greedy layerwise training.

  6. Compression of Probabilistic XML Documents

    NASA Astrophysics Data System (ADS)

    Veldman, Irma; de Keijzer, Ander; van Keulen, Maurice

    Database techniques to store, query and manipulate data that contains uncertainty receive increasing research interest. Such uncertain DBMSs (UDBMSs) can be classified according to their underlying data model: relational, XML, or RDF. We focus on uncertain XML DBMSs, with the Probabilistic XML model (PXML) of [10,9] as a representative example. The size of a PXML document is obviously a factor in performance. There are PXML-specific techniques to reduce the size, such as a push-down mechanism that produces equivalent but more compact PXML documents. It can only be applied, however, where possibilities are dependent. For normal XML documents there also exist several techniques for compressing a document. Since Probabilistic XML is (a special form of) normal XML, it might benefit from these methods even more. In this paper, we show that existing compression mechanisms can be combined with PXML-specific compression techniques. We also show that the best compression rates are obtained with a combination of a PXML-specific technique and a rather simple generic DAG-compression technique.
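    The generic DAG-compression technique mentioned above can be sketched in a few lines: structurally identical subtrees are stored once and shared. This is an illustrative sketch, not the paper's implementation:

      import xml.etree.ElementTree as ET

      def dag_compress(elem, pool):
          """Return an id for elem, sharing ids of identical subtrees."""
          child_ids = tuple(dag_compress(c, pool) for c in elem)
          key = (elem.tag, (elem.text or "").strip(), child_ids)
          if key not in pool:
              pool[key] = len(pool)          # first occurrence: new node
          return pool[key]                   # repeats reuse the same id

      doc = ET.fromstring(
          "<db><rec><a>1</a><b>2</b></rec><rec><a>1</a><b>2</b></rec></db>")
      pool = {}
      dag_compress(doc, pool)
      print(len(pool), "unique nodes for", sum(1 for _ in doc.iter()),
            "elements")   # 4 unique nodes for 7 elements: <rec> is shared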

  7. The use of bathymetric data in society and science: a review from the Baltic Sea.

    PubMed

    Hell, Benjamin; Broman, Barry; Jakobsson, Lars; Jakobsson, Martin; Magnusson, Ake; Wiberg, Patrik

    2012-03-01

    Bathymetry, the underwater topography, is a fundamental property of oceans, seas, and lakes. As such it is important for a wide range of applications, like physical oceanography, marine geology, geophysics and biology or the administration of marine resources. The exact requirements users may have regarding bathymetric data are, however, unclear. Here, the results of a questionnaire survey and a literature review are presented, concerning the use of Baltic Sea bathymetric data in research and for societal needs. It is demonstrated that there is a great need for detailed bathymetric data. Despite the abundance of high-quality bathymetric data that are produced for safety of navigation purposes, the digital bathymetric models publicly available to date cannot satisfy this need. Our study shows that DBMs based on data collected for safety of navigation could substantially improve the base data for administrative decision making as well as the possibilities for marine research in the Baltic Sea.

  8. Human radiation studies: Remembering the early years. Oral history of biochemist John Randolph Totter, Ph.D., January 23, 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-09-01

    This document is a transcript of an interview of Dr. John Randolph Totter by representatives of the US DOE Office of Human Radiation Experiments. Dr. Totter was selected for this interview because of his career with the Atomic Energy Commission Division of Biology and Medicine (DBM), particularly as its director from 1967 to 1972. After a short biographical sketch, Dr. Totter discusses his remembrances on a wide range of topics including nucleic acid and leukemia research at Oak Ridge, AEC biochemistry training in South America, DBM's research focus on radiation effects, early leadership of DBM, relations with the US Public Health Service, controversies on low-level radiation, iodine from fallout, John Gofman, Project Plowshare, funding for AEC research programs and for international research, testicular irradiation of prisoners in Washington State and Oregon, plutonium injections, the ethics of government radiation research, and opinions of public misperceptions about radiation and cancer.

  9. Modern Technologies aspects for Oceanographic Data Management and Dissemination : The HNODC Implementation

    NASA Astrophysics Data System (ADS)

    Lykiardopoulos, A.; Iona, A.; Lakes, V.; Batis, A.; Balopoulos, E.

    2009-04-01

    The development of new technologies aimed at enhancing web applications with dynamic data access was also the starting point for the development of geospatial web applications. By means of these technologies, web applications gained the capability of presenting geographical representations of geo-information. The introduction of state-of-the-art technologies known as Web Services enables web applications to interoperate, i.e., to process requests from each other via a network. In particular, throughout the oceanographic community, modern geographical information systems based on geospatial web services are now being developed, or will be developed in the near future, with capabilities of managing the information fully through web-based geographical interfaces. The exploitation of the HNODC database through a web-based application enhanced with web services built from open-source tools may be considered an ideal case of such an implementation. The Hellenic National Oceanographic Data Center (HNODC), a national public oceanographic data provider and at the same time a member of the international network of oceanographic data centers (IOC/IODE), holds a very large volume of data and relevant information about the marine ecosystem. For the efficient management and exploitation of these data, a relational database has been constructed, storing data for over 300,000 stations concerning physical, chemical and biological oceanographic information. A modern web application now allows end users worldwide to explore and navigate through HNODC data via an interface capable of presenting geographical representations of the geo-information. The application is built from state-of-the-art software components and tools, such as: • geospatial and non-spatial Web Services mechanisms; • geospatial open-source tools for the creation of dynamic geographical representations; • communication protocols (messaging mechanisms) in all layers, such as XML and GML, together with the SOAP protocol via Apache Axis. At the same time, the application may interact with any other SOA application, either sending or receiving geospatial data through geographical layers, since it inherits the big advantage of interoperability between Web Services systems. Roughly, the architecture can be described as follows: • At the back end, the open-source PostgreSQL DBMS stands as the data storage mechanism, with more than one database schema owing to the separation of geospatial and non-geospatial data. • UMN MapServer and GeoServer are the mechanisms for representing geospatial data via the Web Map Service (WMS); for querying and navigating geospatial and metadata information via the Web Feature Service (WFS); and, in the near future, for transacting and processing new or existing geospatial data via the Web Processing Service (WPS); these WPS modules are still under development but will be completed in the near future. • MapBender, a geospatial portal site management software for OGC and OWS architectures, acts as the integration module between the geospatial mechanisms; MapBender comes with an embedded data model capable of managing interfaces for displaying, navigating and querying OGC-compliant web map and feature services (WMS and transactional WFS). • Apache and Tomcat stand as the web service middle layers. • Apache Axis, with its embedded implementation of the SOAP ("Simple Object Access Protocol"), acts as the non-spatial Web Services mechanism. • A new web user interface for the end user is based on an enhanced and customized version of a MapBender GUI, a powerful Web Services client. For HNODC, the interoperability of Web Services is the big advantage of the developed platform, since in the future it can act as both a provider and a consumer of Web Services: either as a data-product provider for external SOA platforms, or as a consumer of data products from external SOA platforms, for new applications to be developed or for existing applications to be enhanced. A great paradigm of data management integration and dissemination via such technologies is the European Union research project SeaDataNet, whose main objective is to develop a standardized distributed system for managing and disseminating large and diverse data sets and to enhance the currently existing infrastructures with Web Services. Furthermore, when the Web Processing Service (WPS) technology is mature enough and applicable for development, the derived data products will be able to carry any kind of GIS functionality to consumers across the network. From this point of view, HNODC joins the global scientific community by providing and consuming application-independent data products.
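    As a hedged illustration of consuming such a WMS stack from a client, the sketch below uses OWSLib; the service URL and layer name are placeholders, not actual HNODC endpoints:

      from owslib.wms import WebMapService

      wms = WebMapService(
          "https://example.org/cgi-bin/mapserv?map=hnodc.map",  # placeholder
          version="1.1.1")
      print(list(wms.contents))            # layers advertised by the server

      # Request a rendered map of a hypothetical station layer.
      img = wms.getmap(layers=["station_data"],
                       srs="EPSG:4326",
                       bbox=(19.0, 30.0, 30.0, 41.0),   # eastern Mediterranean
                       size=(600, 600),
                       format="image/png")
      open("stations.png", "wb").write(img.read())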

  10. Building a database for long-term monitoring of benthic macrofauna in the Pertuis-Charentais (2004-2014).

    PubMed

    Philippe, Anne S; Plumejeaud-Perreau, Christine; Jourde, Jérôme; Pineau, Philippe; Lachaussée, Nicolas; Joyeux, Emmanuel; Corre, Frédéric; Delaporte, Philippe; Bocher, Pierrick

    2017-01-01

    Long-term benthic monitoring is rewarding in terms of science, but labour-intensive, whether in the field, in the laboratory, or behind the computer. Building and managing databases require multiple skills, including consistency over time as well as organisation via a systematic approach. Here, we introduce and share our spatially explicit benthic database, comprising 11 years of benthic data. It is the result of intensive benthic sampling conducted on a regular grid (259 stations) covering the intertidal mudflats of the Pertuis-Charentais (Marennes-Oléron Bay and Aiguillon Bay). Samples were taken on foot or by boat during winter, depending on tidal height, from December 2003 to February 2014. The present dataset includes abundances and biomass densities of all mollusc species of the study regions and the principal polychaetes, as well as their length, accessibility to shorebirds, energy content and shell mass, when appropriate and available. This database has supported many studies dealing with the spatial distribution of benthic invertebrates and temporal variations in food resources for shorebird species, as well as latitudinal comparisons with other databases. In this paper, we introduce our benthos monitoring, share our data, and present a "guide of good practices" for building, cleaning and using the database efficiently, providing examples of results with associated R code. The dataset has been formatted into a geo-referenced relational database, using the open-source PostgreSQL DBMS. We provide density information, measurements, energy content and accessibility for thirteen bivalve, nine gastropod and two polychaete taxa (a total of 66,620 individuals) for 11 consecutive winters. Figures and maps are provided to describe how the dataset was built and cleaned, and how it can be used. This dataset can again support studies concerning spatial and temporal variations in species abundance and interspecific interactions, as well as evaluations of the availability of food resources for small- and medium-sized shorebirds and, potentially, conservation and impact assessment studies.

  11. LBT Distributed Archive: Status and Features

    NASA Astrophysics Data System (ADS)

    Knapic, C.; Smareglia, R.; Thompson, D.; Grede, G.

    2011-07-01

    After the first release of the LBT Distributed Archive, this successful collaboration is continuing within the LBT Corporation. The IA2 (Italian Center for Astronomical Archives) team has updated the LBT DA with new features in order to facilitate user data retrieval while abiding by VO standards. To facilitate the integration of data from any new instruments, we have migrated to a new database, developed new data distribution software, and enhanced features in the LBT User Interface. The DBMS engine has been changed to MySQL. Consequently, the data handling software now uses Java thread technology to update and synchronize the main storage archives on Mt. Graham and in Tucson, as well as the archives in Trieste and Heidelberg, with all metadata and proprietary data. The LBT UI has been updated with additional features allowing users to search by instrument and by some of the more important characteristics of the images. Finally, instead of a simple cone search service over all LBT image data, new instrument-specific SIAP and cone search services have been developed. They will be published in the IVOA framework later this fall.

  12. Use of mutation spectra analysis software.

    PubMed

    Rogozin, I; Kondrashov, F; Glazko, G

    2001-02-01

    The study and comparison of mutation(al) spectra is an important problem in molecular biology, because these spectra often reflect important features of mutations and their fixation. Such features include the interaction of DNA with various mutagens, the function of repair/replication enzymes, and properties of target proteins. It is known that mutability varies significantly along nucleotide sequences, such that mutations often concentrate at certain positions, called "hotspots," in a sequence. In this paper, we discuss in detail two approaches for mutation spectra analysis: the comparison of mutation spectra with the HG-PUBL program (FTP: sunsite.unc.edu/pub/academic/biology/dna-mutations/hyperg) and hotspot prediction with the CLUSTERM program (www.itba.mi.cnr.it/webmutation; ftp.bionet.nsc.ru/pub/biology/dbms/clusterm.zip). Several other approaches for mutational spectra analysis, such as the analysis of target protein structure, identification of hotspot contexts, and multiple spectra comparisons, as well as a number of mutation databases, are briefly described. Mutation spectra in the lacI gene of E. coli and the human p53 gene are used to illustrate various difficulties of such analysis. Copyright 2001 Wiley-Liss, Inc.

  13. Planetary mapping—The datamodel's perspective and GIS framework

    NASA Astrophysics Data System (ADS)

    van Gasselt, S.; Nass, A.

    2011-09-01

    Demands for a broad range of integrated geospatial data-analysis tools and methods for planetary data organization have been growing considerably since the late 1990s, when a plethora of missions equipped with new instruments entered planetary orbits or landed on the surface. They sent back terabytes of new data which soon became accessible to the scientific community and the public, and which needed to be organized. On the terrestrial side, issues of data access, organization and utilization for scientific and economic analyses are handled using a range of well-established geographic information systems (GIS), which also found their way into the field of planetary sciences in the late 1990s. We here address key issues concerning the field of planetary mapping by making use of established GIS environments, and discuss methods of addressing data organization and mapping requirements using an easily integrable data model that is, for the time being, designed as a file geodatabase (FileGDB) environment in ESRI's ArcGIS. A major design-driving requirement for this data model is its extensibility and scalability for growing scientific as well as technical needs, e.g., its utilization for surface mapping of different planetary objects as defined by their respective reference systems and using different instrument data. Furthermore, a major goal is to construct a generic model which allows combined geologic and geomorphologic mapping tasks to be performed following international standards, without loss of information and while maintaining topologic integrity. Integration of such a data model within a geospatial DBMS context can practically be performed by individuals as well as groups, without having to deal with the details of administrative tasks and data ingestion issues. Besides the actual mapping, key components of such a mapping data model deal with the organization of and search for image sensor data and previous mapping efforts, as well as the proper organization of cartographic representations and assignments of geologic/geomorphologic units within their stratigraphic context.

  14. A 3D Information System for the Documentation of Archaeological Excavations

    NASA Astrophysics Data System (ADS)

    Ardissone, P.; Bornaz, L.; Degattis, G.; Domaine, R.

    2013-07-01

    Documentation of archaeological and cultural heritage sites is at the heart of the archaeological process and an important component in cultural heritage research, presentation and restoration. In 2012 the Superintendence of Cultural Heritage of Aosta Valley, Italy (Soprintendenza per i Beni e le Attività Culturali della Regione Autonoma Valle d'Aosta) carried out a complex archaeological excavation in a composite archaeological context situated in an urban setting: the Aosta city centre. This excavation was characterized by the classical urban archaeological issues: little space, short time, complex stratigraphy. Moreover, the investigations brought to light several structures and walls that required safety and conservation measures. Ad Hoc 3D solutions carried out a complete 3D survey of the area at 10 different times/stages of the archaeological dig, chosen in collaboration with the archaeological staff. In this way a multi-temporal 3D description of the site was provided for archaeological analysis and for planning the restoration activities. The 3D surveys were carried out integrating GPS, laser scanning and photogrammetry. In order to meet the needs of the site and its complex logistics, and to obtain products that guarantee the high quality and detail required for archaeological analysis, different procedures and methodologies were developed: HDR imaging for 3D models with correct, consistent and uniform colours; noise filtering and people filtering, for the removal of interference between the laser instrument and the object of the survey; advanced laser scan triangulation, considering both artificial and natural tie points, for a correct registration of a huge number of scans; and single-image orientation on 3D data, in order to integrate the laser data with data coming from digital photogrammetry (faster in the field than the laser scanner survey, and therefore used in certain situations). The results of all these methodologies and procedures are presented and described in the article. For the documentation of the archaeological excavation and for the management of the conservation activities (condition assessment, planning, and conservation work), Ad Hoc 3D solutions customized two special plug-ins of its own software platform Ad Hoc: Ad Hoc Archaeology and Ad Hoc Conservation. The software platform integrates a 3D database management system. All information (measurements, plotting, areas of interest, etc.) is organized according to its correct 3D position, and can be queried using attributes, geometric characteristics or spatial position. The Ad Hoc Archaeology plug-in allows archaeologists to fill out stratigraphic unit (US) sheets in an internal database, place them in the correct location within the 3D model of the site, define the mutual relations between the units, and separate the different archaeological phases. A simple interface facilitates the construction of the stratigraphic chart (matrix), also in a 3D environment (3D matrix). The Ad Hoc Conservation plug-in permits conservators and restorers to create relationships between the different approaches and descriptions of the same parts of the monument, e.g. between stratigraphic units or historical phases and architectural components and/or decay pathologies. The 3D DBMS conservation module uses a codified terminology based on the ICOMOS illustrated glossary of stone deterioration and other glossaries. Specific tools permit restorers to compute surfaces and volumes correctly. In this way decay extension and intensity can be measured with high precision and a high level of detail, for a correct time and cost estimation of each conservation step.

  15. A Science Cloud: OneSpaceNet

    NASA Astrophysics Data System (ADS)

    Morikawa, Y.; Murata, K. T.; Watari, S.; Kato, H.; Yamamoto, K.; Inoue, S.; Tsubouchi, K.; Fukazawa, K.; Kimura, E.; Tatebe, O.; Shimojo, S.

    2010-12-01

    The main methodologies of Solar-Terrestrial Physics (STP) so far have been theoretical, experimental and observational approaches, together with computer simulation. Recently, "informatics" has been expected to become a new (fourth) approach to STP studies. Informatics is a methodology for analyzing large-scale data (observational and computer simulation data) to obtain new findings using a variety of data-processing techniques. At NICT (National Institute of Information and Communications Technology, Japan) we are now developing a new research environment named "OneSpaceNet". OneSpaceNet is a cloud-computing environment specialized for scientific work, which connects many researchers through a high-speed network (JGN: Japan Gigabit Network). JGN is a wide-area backbone network operated by NICT; it provides a 10G network and many access points (APs) over Japan. OneSpaceNet also provides rich computing resources for research, such as supercomputers, large-scale data storage, licensed applications, visualization devices (like a tiled display wall, TDW), databases/DBMS, cluster computers (4-8 nodes) for data processing, and communication devices. What is remarkable about using the science cloud is that a user simply prepares a terminal (a low-cost PC). Once the PC is connected to JGN2plus, the user can make full use of the rich resources of the science cloud. Using communication devices such as a video-conference system, streaming and reflector servers, and media players, users on OneSpaceNet can carry out research communications as if they belonged to a single laboratory: they are members of a virtual laboratory. The specification of the computing resources on OneSpaceNet is as follows. The data storage developed so far amounts to almost 1 PB, and the number of data files managed on the cloud storage is now more than 40,000,000 and growing. Notably, the disks forming the large-scale storage are distributed over 5 data centers across Japan, yet the storage system performs as one disk. Three supercomputers are allocated on the cloud: one in Tokyo, one in Osaka and one in Nagoya. Simulation job data from any of the supercomputers are saved in the same directory of the cloud data storage; it is a kind of virtual computing environment. The tiled display wall has 36 panels acting as one display; its pixel size (resolution) is as large as 18000x4300. This is large enough to preview or analyze large-scale computer simulation data, and it also allows many researchers together to view a large number of images (e.g., 100 pictures) on one screen. In our talk we also present a brief report of initial results of using OneSpaceNet for global MHD simulations as an example of successful use of our science cloud: (i) ultra-high time resolution visualization of global MHD simulations on the large-scale storage and parallel processing system of the cloud, (ii) a database of real-time global MHD simulation and statistical analyses of the data, and (iii) a 3D web service for global MHD simulations.

  16. Analytics-Driven Lossless Data Compression for Rapid In-situ Indexing, Storing, and Querying

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jenkins, John; Arkatkar, Isha; Lakshminarasimhan, Sriram

    2013-01-01

    The analysis of scientific simulations is highly data-intensive and is becoming an increasingly important challenge. Peta-scale data sets require the use of light-weight query-driven analysis methods, as opposed to heavy-weight schemes that optimize for speed at the expense of size. This paper is an attempt in the direction of query processing over losslessly compressed scientific data. We propose a co-designed double-precision compression and indexing methodology for range queries by performing unique-value-based binning on the most significant bytes of double precision data (sign, exponent, and most significant mantissa bits), and inverting the resulting metadata to produce an inverted index over a reduced data representation. Without the inverted index, our method matches or improves compression ratios over both general-purpose and floating-point compression utilities. The inverted index is light-weight, and the overall storage requirement for both reduced column and index is less than 135%, whereas existing DBMS technologies can require 200-400%. As a proof-of-concept, we evaluate univariate range queries that additionally return column values, a critical component of data analytics, against state-of-the-art bitmap indexing technology, showing multi-fold query performance improvements.
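
    A minimal Python sketch of the binning idea named above: values are binned by their most significant bytes (sign, exponent, leading mantissa bits), the bin metadata is inverted into an index from bin key to row ids, and a range query scans only candidate bins. The two-byte bin width, the names, and the restriction to non-negative values are simplifying assumptions, not the authors' implementation, which additionally compresses the reduced representation.

    import struct
    from collections import defaultdict

    def msb_bin(value, nbytes=2):
        # Bin a double by its most significant bytes: the big-endian
        # IEEE-754 encoding puts sign, exponent and leading mantissa
        # bits first, so these bytes define a coarse value bin.
        return struct.pack(">d", value)[:nbytes]

    def build_index(column):
        # Invert the binning metadata: bin key -> list of row ids.
        index = defaultdict(list)
        for row_id, v in enumerate(column):
            index[msb_bin(v)].append(row_id)
        return index

    def range_query(column, index, lo, hi):
        # Answer lo <= x <= hi by scanning only candidate bins.
        # For non-negative doubles, byte order matches value order;
        # handling negatives needs a sign-aware key (omitted here).
        lo_key, hi_key = msb_bin(lo), msb_bin(hi)
        hits = []
        for key, rows in index.items():
            if lo_key <= key <= hi_key:
                # Exact check; only boundary bins strictly need it.
                hits.extend(r for r in rows if lo <= column[r] <= hi)
        return sorted(hits)

    data = [0.5, 1.25, 3.75, 2.0, 1.5, 8.0]
    idx = build_index(data)
    print(range_query(data, idx, 1.0, 4.0))   # -> [1, 2, 3, 4]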

  17. Cytoprotective dibenzoylmethane derivatives protect cells from oxidative stress-induced necrotic cell death.

    PubMed

    Hegedűs, Csaba; Lakatos, Petra; Kiss-Szikszai, Attila; Patonay, Tamás; Gergely, Szabolcs; Gregus, Andrea; Bai, Péter; Haskó, György; Szabó, Éva; Virág, László

    2013-06-01

    Screening of a small in-house library of 1863 compounds identified 29 compounds that protected Jurkat cells from hydrogen peroxide-induced cytotoxicity. Of the cytoprotective compounds, eleven proved to possess antioxidant activity (ABTS radical scavenging) and two were found to inhibit poly(ADP-ribosyl)ation (PARylation), a cytotoxic pathway operating in severely injured cells. Four cytoprotective dibenzoylmethane (DBM) derivatives were investigated in more detail, as they neither scavenged hydrogen peroxide nor inhibited PARylation. These compounds protected cells from necrotic cell death, while caspase activation, a parameter of apoptotic cell death, was not affected. Hydrogen peroxide activated the extracellular signal-regulated kinase (ERK1/2) and p38 MAP kinases but not c-Jun N-terminal kinase (JNK). The cytoprotective DBMs suppressed the activation of ERK1/2 but not that of p38. Cytoprotection was confirmed in another cell type (A549 lung epithelial cells), indicating that the cytoprotective effect is not cell-type specific. In conclusion, we identified DBM analogs as a novel class of cytoprotective compounds that inhibit ERK1/2 kinase and protect from necrotic cell death by a mechanism independent of poly(ADP-ribose) polymerase inhibition. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. A Study of the Efficiency of Spatial Indexing Methods Applied to Large Astronomical Databases

    NASA Astrophysics Data System (ADS)

    Donaldson, Tom; Berriman, G. Bruce; Good, John; Shiao, Bernie

    2018-01-01

    Spatial indexing of astronomical databases generally uses quadrature methods, which partition the sky into cells used to create an index (usually a B-tree) written as a database column. We report the results of a study comparing the performance of two common indexing methods, HTM and HEALPix, on Solaris and Windows database servers running a PostgreSQL database, and on a Windows server running MS SQL Server. The indexing was applied to the 2MASS All-Sky Catalog and to the Hubble Source Catalog. On each server, the study compared indexing performance by submitting 1 million queries at each index level, with random sky positions and random cone-search radii drawn on a logarithmic scale between 1 arcsec and 1 degree, and measuring the time to complete each query and write the output. These simulated queries, intended to model realistic use patterns, were run in a uniform way on many combinations of indexing method and indexing level. The query times in all simulations are strongly I/O-bound and scale linearly with the number of records returned for large numbers of sources. There are, however, considerable differences between simulations, which reveal that hardware I/O throughput is a more important factor in managing the performance of a DBMS than the choice of indexing scheme. The choice of index itself is relatively unimportant: for comparable index levels, the performance is consistent within the scatter of the timings. At small index levels (large cells; e.g. level 4, cell size 3.7 deg), there is large scatter in the timings because of wide variations in the number of sources found in the cells. At larger index levels, performance improves and scatter decreases, but the improvement at level 8 (cell size 14 arcmin) and higher is masked to some extent by the timing scatter caused by the range of query sizes. At very high levels (e.g. level 20; cell size 0.0004 arcsec), the granularity of the cells becomes so high that a large number of extraneous empty cells begins to degrade performance. Thus, for the use patterns studied here, database performance is not critically dependent on the exact choice of index or level.
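
    The index-then-filter query flow shared by HTM and HEALPix can be sketched with a toy uniform grid: each source is assigned a cell index, and a cone search scans only the cells overlapping the cone before applying an exact angular-separation test. The flat grid below (which ignores RA wrap-around and polar geometry) is only a stand-in for the hierarchical quadrature schemes studied in the paper; names and the cell size are illustrative.

    import math
    from collections import defaultdict

    CELL = 1.0  # grid cell size in degrees (the "index level" analogue)

    def cell_of(ra, dec):
        return (int(ra // CELL), int(dec // CELL))

    def build_index(catalog):
        # One index entry per occupied cell: cell -> list of source ids.
        index = defaultdict(list)
        for i, (ra, dec) in enumerate(catalog):
            index[cell_of(ra, dec)].append(i)
        return index

    def ang_sep(ra1, dec1, ra2, dec2):
        # Great-circle separation in degrees (spherical law of cosines).
        r1, d1, r2, d2 = map(math.radians, (ra1, dec1, ra2, dec2))
        c = (math.sin(d1) * math.sin(d2) +
             math.cos(d1) * math.cos(d2) * math.cos(r1 - r2))
        return math.degrees(math.acos(max(-1.0, min(1.0, c))))

    def cone_search(catalog, index, ra0, dec0, radius):
        # Index scan: candidate cells from the cone's bounding box,
        # then an exact separation test on each candidate source.
        hits = []
        for cx in range(int((ra0 - radius) // CELL),
                        int((ra0 + radius) // CELL) + 1):
            for cy in range(int((dec0 - radius) // CELL),
                            int((dec0 + radius) // CELL) + 1):
                for i in index.get((cx, cy), []):
                    ra, dec = catalog[i]
                    if ang_sep(ra0, dec0, ra, dec) <= radius:
                        hits.append(i)
        return hits

    stars = [(10.2, -5.1), (10.25, -5.12), (40.0, 20.0)]
    idx = build_index(stars)
    print(cone_search(stars, idx, 10.2, -5.1, 0.1))   # -> [0, 1]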

  19. 5 CFR 430.304 - SES performance management systems.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 1 2013-01-01 2013-01-01 false SES performance management systems. 430... PERFORMANCE MANAGEMENT Managing Senior Executive Performance § 430.304 SES performance management systems. (a... or more performance management systems for its senior executives. (b) Performance management systems...

  20. 5 CFR 430.304 - SES performance management systems.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 1 2014-01-01 2014-01-01 false SES performance management systems. 430... PERFORMANCE MANAGEMENT Managing Senior Executive Performance § 430.304 SES performance management systems. (a... or more performance management systems for its senior executives. (b) Performance management systems...

  1. 5 CFR 430.304 - SES performance management systems.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 5 Administrative Personnel 1 2012-01-01 2012-01-01 false SES performance management systems. 430... PERFORMANCE MANAGEMENT Managing Senior Executive Performance § 430.304 SES performance management systems. (a... or more performance management systems for its senior executives. (b) Performance management systems...

  2. 5 CFR 430.304 - SES performance management systems.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false SES performance management systems. 430... PERFORMANCE MANAGEMENT Managing Senior Executive Performance § 430.304 SES performance management systems. (a... or more performance management systems for its senior executives. (b) Performance management systems...

  3. 5 CFR 930.301 - Information systems security awareness training program.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... training in system/application life cycle management, risk management, and contingency planning. (4) Chief... security management, system/application life cycle management, risk management, and contingency planning... management; and management and implementation level training in system/application life cycle management...

  4. United States Army Medical Materiel Development Activity: 1997 Annual Report.

    DTIC Science & Technology

    1997-01-01

    business planning and execution information management system (Project Management Division Database (PMDD) and Product Management Database System (PMDS)... MANAGEMENT • Project Management Division Database (PMDD), Product Management Database System (PMDS), and Special Users Database System: The existing... System (FMS), were investigated. New Product Managers and Project Managers were added into PMDS and PMDD. A separate division, Support, was...

  5. Development of Participative Management System in Learning Environment Management for Small Sized Primary Schools

    ERIC Educational Resources Information Center

    Hernthaisong, Prasertsak; Sirisuthi, Chaiyuth; Wisetrinthong, Kanjana

    2017-01-01

    The research aimed to: 1) study the factors of a participative management system in learning environment management, 2) study the current situation, desirable outcomes, and further needs for developing a participative management system in learning management, 3) develop a working participative management system, and 4) assess the system's…

  6. Resource Management for Distributed Parallel Systems

    NASA Technical Reports Server (NTRS)

    Neuman, B. Clifford; Rao, Santosh

    1993-01-01

    Multiprocessor systems should exist in the larger context of distributed systems, allowing multiprocessor resources to be shared by those that need them. Unfortunately, typical multiprocessor resource management techniques do not scale to large networks. The Prospero Resource Manager (PRM) is a scalable resource allocation system that supports the allocation of processing resources in large networks and multiprocessor systems. To manage resources in such distributed parallel systems, PRM employs three types of managers: system managers, job managers, and node managers. There exist multiple independent instances of each type of manager, reducing bottlenecks. The complexity of each manager is further reduced because each is designed to utilize information at an appropriate level of abstraction.
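
    A rough sketch of the three-manager decomposition follows, under the assumption that node managers guard per-node resources, system managers allocate from pools of nodes, and job managers acquire resources for a single job, possibly from several system managers. The class roles and calls are illustrative simplifications, not PRM's actual interfaces.

    # Illustrative three-manager decomposition (system/job/node managers).
    class NodeManager:
        """Manages the resources of a single node."""
        def __init__(self, node_id, cpus):
            self.node_id, self.free_cpus = node_id, cpus

        def reserve(self, n):
            if n <= self.free_cpus:
                self.free_cpus -= n
                return True
            return False

    class SystemManager:
        """Tracks a pool of nodes; many independent instances may exist,
        each covering part of the network (reducing bottlenecks)."""
        def __init__(self, nodes):
            self.nodes = nodes

        def allocate(self, cpus_needed):
            for node in self.nodes:
                if node.reserve(cpus_needed):
                    return node.node_id
            return None

    class JobManager:
        """Acquires resources for one job from the system managers."""
        def __init__(self, system_managers):
            self.system_managers = system_managers

        def run(self, tasks, cpus_per_task):
            placement = {}
            for task in tasks:
                for sm in self.system_managers:
                    node = sm.allocate(cpus_per_task)
                    if node is not None:
                        placement[task] = node
                        break
            return placement

    sm = SystemManager([NodeManager("n0", 4), NodeManager("n1", 4)])
    jm = JobManager([sm])
    print(jm.run(["t0", "t1", "t2"], cpus_per_task=2))
    # -> {'t0': 'n0', 't1': 'n0', 't2': 'n1'}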

  7. 48 CFR 245.105 - Contractors' property management system compliance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... management system compliance. 245.105 Section 245.105 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CONTRACT MANAGEMENT GOVERNMENT PROPERTY General 245.105 Contractors' property management system compliance. (a) Definitions— (1) Acceptable property management system...

  8. A system management methodology for building successful resource management systems

    NASA Technical Reports Server (NTRS)

    Hornstein, Rhoda Shaller; Willoughby, John K.

    1989-01-01

    This paper presents a system management methodology for building successful resource management systems that possess lifecycle effectiveness. This methodology is based on an analysis of the traditional practice of Systems Engineering Management as it applies to the development of resource management systems. The analysis produced fifteen significant findings presented as recommended adaptations to the traditional practice of Systems Engineering Management to accommodate system development when the requirements are incomplete, unquantifiable, ambiguous and dynamic. Ten recommended adaptations to achieve operational effectiveness when requirements are incomplete, unquantifiable or ambiguous are presented and discussed. Five recommended adaptations to achieve system extensibility when requirements are dynamic are also presented and discussed. The authors conclude that the recommended adaptations to the traditional practice of Systems Engineering Management should be implemented for future resource management systems and that the technology exists to build these systems extensibly.

  9. 41 CFR 101-39.104 - Notice of establishment of a fleet management system.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... management system in order to work out any problems pertaining to establishing and operating fleet management... of a fleet management system. 101-39.104 Section 101-39.104 Public Contracts and Property Management Federal Property Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS AVIATION...

  10. 41 CFR 101-39.104 - Notice of establishment of a fleet management system.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... management system in order to work out any problems pertaining to establishing and operating fleet management... of a fleet management system. 101-39.104 Section 101-39.104 Public Contracts and Property Management Federal Property Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS AVIATION...

  11. 41 CFR 101-39.104 - Notice of establishment of a fleet management system.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... management system in order to work out any problems pertaining to establishing and operating fleet management... of a fleet management system. 101-39.104 Section 101-39.104 Public Contracts and Property Management Federal Property Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS AVIATION...

  12. 41 CFR 101-39.104 - Notice of establishment of a fleet management system.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... management system in order to work out any problems pertaining to establishing and operating fleet management... of a fleet management system. 101-39.104 Section 101-39.104 Public Contracts and Property Management Federal Property Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS AVIATION...

  13. 5 CFR 930.301 - Information systems security awareness training program.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... training in system/application life cycle management, risk management, and contingency planning. (4) Chief... security management, system/application life cycle management, risk management, and contingency planning..., risk management, and contingency planning. (b) Provide the Federal information systems security...

  14. 5 CFR 930.301 - Information systems security awareness training program.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... training in system/application life cycle management, risk management, and contingency planning. (4) Chief... security management, system/application life cycle management, risk management, and contingency planning..., risk management, and contingency planning. (b) Provide the Federal information systems security...

  15. 14 CFR 1212.704 - System manager.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 5 2012-01-01 2012-01-01 false System manager. 1212.704 Section 1212.704... Authority and Responsibilities § 1212.704 System manager. (a) Each system manager is responsible for the following with regard to the system of records over which the system manager has cognizance: (1) Overall...

  16. 77 FR 43100 - Privacy Act of 1974; Department of Homeland Security, Federal Emergency Management Agency-009...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-23

    ... Information System (NEMIS)--Mitigation (MT) Electronic Grants Management System of Records,'' and retitle it... Information System (NEMIS)--Mitigation (MT) Electronic Grants Management System of Records (69 FR 75079... Management Information System (NEMIS)--Mitigation (MT) Electronic Grants Management System (NEMIS--MT eGrants...

  17. 14 CFR 1212.705 - System manager.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 5 2013-01-01 2013-01-01 false System manager. 1212.705 Section 1212.705... Authority and Responsibilities § 1212.705 System manager. (a) Each system manager is responsible for the following with regard to the system of records over which the system manager has cognizance: (1) Overall...

  18. 14 CFR 1212.704 - System manager.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 5 2011-01-01 2010-01-01 true System manager. 1212.704 Section 1212.704... Authority and Responsibilities § 1212.704 System manager. (a) Each system manager is responsible for the following with regard to the system of records over which the system manager has cognizance: (1) Overall...

  19. 14 CFR 1212.704 - System manager.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false System manager. 1212.704 Section 1212.704... Authority and Responsibilities § 1212.704 System manager. (a) Each system manager is responsible for the following with regard to the system of records over which the system manager has cognizance: (1) Overall...

  20. Information Security Management - Part Of The Integrated Management System

    NASA Astrophysics Data System (ADS)

    Manea, Constantin Adrian

    2015-07-01

    The international management standards allow an integrated approach, thereby combining aspects of particular importance to the activity of any organization, from quality management systems or environmental management systems to information security systems or business continuity management systems. Although there is no national or international regulation, nor a defined standard, for the Integrated Management System, the need to implement an integrated system arises within organizations that see the opportunity to integrate the management components into a cohesive system, in agreement with their publicly stated purpose and mission. The issues relating to information security in the organization, viewed from the perspective of the management system, raise serious questions for any organization in the current context of electronic information, which is why we consider it not only appropriate but necessary to promote and implement an Integrated Management System covering Quality - Environment - Health and Operational Security - Information Security.

  1. 41 CFR 109-1.5205 - Personal property management system changes.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... management system changes. 109-1.5205 Section 109-1.5205 Public Contracts and Property Management Federal Property Management Regulations System (Continued) DEPARTMENT OF ENERGY PROPERTY MANAGEMENT REGULATIONS... Personal property management system changes. Any proposed significant change to a designated contractor's...

  2. Advanced Distribution Management Systems | Grid Modernization | NREL

    Science.gov Websites

    Electric utilities are investing in updated grid technologies such as advanced distribution management systems... management testbed for cyber security in power systems... The "advanced" elements of advanced...

  3. 20 CFR 632.76 - Program management systems.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 3 2011-04-01 2011-04-01 false Program management systems. 632.76 Section... management systems. (a) All Native American grantees shall establish management information systems to... for the overall management of all programs including: (1) Eligibility verification systems as...

  4. A systems engineering management approach to resource management applications

    NASA Technical Reports Server (NTRS)

    Hornstein, Rhoda Shaller

    1989-01-01

    The author presents a program management response to the following question: How can the traditional practice of systems engineering management, including requirements specification, be adapted, enhanced, or modified to build future planning and scheduling systems for effective operations? The systems engineering management process, as traditionally practiced, is examined. Extensible resource management systems are discussed. It is concluded that extensible systems are a partial solution to problems presented by requirements that are incomplete, partially immeasurable, and often dynamic. There are positive indications that resource management systems have been characterized and modeled sufficiently to allow their implementation as extensible systems.

  5. 41 CFR 101-39.104-1 - Consolidations into a fleet management system.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... fleet management system. 101-39.104-1 Section 101-39.104-1 Public Contracts and Property Management Federal Property Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS AVIATION, TRANSPORTATION, AND MOTOR VEHICLES 39-INTERAGENCY FLEET MANAGEMENT SYSTEMS 39.1-Establishment, Modification, and...

  6. 44 CFR 13.20 - Standards for financial management systems.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... management systems. 13.20 Section 13.20 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT... Standards for financial management systems. (a) A State must expand and account for grant funds in... financial management systems of other grantees and subgrantees must meet the following standards: (1...

  7. 14 CFR § 1212.705 - System manager.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 5 2014-01-01 2014-01-01 false System manager. § 1212.705 Section § 1212... NASA Authority and Responsibilities § 1212.705 System manager. (a) Each system manager is responsible for the following with regard to the system of records over which the system manager has cognizance...

  8. 48 CFR 2452.242-71 - Contract management system.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Contract management system... 2452.242-71 Contract management system. As prescribed in 2442.1107, insert the following clause: Contract Management System (FEB 2006) (a) The contractor shall use contract management baseline planning...

  9. The effect of carrier type on bone regeneration of demineralized bone matrix in vivo.

    PubMed

    Tavakol, Shima; Khoshzaban, Ahad; Azami, Mahmoud; Kashani, Iraj Ragerdi; Tavakol, Hani; Yazdanifar, Mahbube; Sorkhabadi, Seyed Mahdi Rezayat

    2013-11-01

    Demineralized bone matrix (DBM) is a bone-substitute biomaterial used as an excellent grafting material. Factors such as carrier type might affect the healing potential of this material. As background on the present status of the field: albumin, a main protein in blood, and carboxymethyl cellulose (CMC) are frequently used as carriers in DBM gels. We investigated the bone-repairing properties of 2 DBMs with different carriers. Bone regeneration was studied in 3 groups of rat calvaria: treated with DBM from the Iranian Tissue Bank Research and Preparation Center, treated with DBM from Hans Biomed Corporation, or left as an empty cavity. Albumin and CMC were used as carriers. Bone regeneration in the samples was compared after 1, 4, and 8 weeks of implantation. Blocks of the histologic samples were stained with hematoxylin and eosin, and the percentage area of bone formation was calculated using histomorphometry. The in vivo tests showed significantly greater occupation by newly regenerated bone in the DBM with the albumin carrier than in the one with CMC at 8 weeks after implantation. The 2 types of DBM differed significantly in bone regeneration, a difference attributed to the type of carrier. Albumin could improve mineralization and bioactivity compared with CMC.

  10. Comparative evaluation of the osteoinductivity of two formulations of human demineralized bone matrix.

    PubMed

    Takikawa, Satoshi; Bauer, Thomas W; Kambic, Helen; Togawa, Daisuke

    2003-04-01

    In the United States, demineralized bone matrix (DBM) is considered a transplantable tissue and therefore is regulated primarily by the American Association of Tissue Banks. Even though DBM is not subjected to the same regulations relative to performance claims as medical devices are, one would expect different processing methods might yield DBM preparations of different osteoinductive potential. The purpose of this study was to use an established athymic rat model to compare the osteoinductive properties of two commercially available human DBMs prepared using different methods but having essentially identical product claims. Sixteen female athymic rats were used to test equivalent volumes of two lots each of Grafton Putty (Osteotech, Inc., Eatontown, NJ), Osteofil (Regeneration Technologies, Inc., Alachua, FL), and rat DBM. At 28 days after implantation, qualitative and semiquantitative microscopy showed no significant differences in bone formation between the two lots from each source, but rat DBM produced significantly more bone than Grafton, which produced significantly more bone than Osteofil. Our results suggest that methods of graft processing may represent a greater source of variability than do differences among individual donors. Whether these differences relate to methods of demineralization, carrier, dose of DBM per volume, or to some other factor remains to be determined. Copyright 2003 Wiley Periodicals, Inc.

  11. Automating the Air Force Retail-Level Equipment Management Process: An Application of Microcomputer-Based Information Systems Techniques

    DTIC Science & Technology

    1988-09-01

    could use the assistance of a microcomputer-based management information system. However, adequate system design and development requires an in-depth... understanding of the Equipment Management Section and the environment in which it functions were asked and answered. Then, a management information system was... designed, developed, and tested. The management information system is called the Equipment Management Information System (EMIS).

  12. Warrant Officer Orientation Course (WOOC) Evaluation

    DTIC Science & Technology

    1981-10-01

    Army Maintenance Management System, Security Awareness, Organizational Effectiveness, Introduction to Management, Enlisted Personnel Management... Orientation, Introduction to Management, Professional Ethics, USA Officer Evaluation Reporting System (OES), Military Correspondence, Military... Organizational Effectiveness, Introduction to Management, Enlisted Personnel Management System, and The Army Functional Files System and The Army...

  13. 23 CFR 970.214 - Federal lands congestion management system (CMS).

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 23 Highways 1 2014-04-01 2014-04-01 false Federal lands congestion management system (CMS). 970... LANDS HIGHWAYS NATIONAL PARK SERVICE MANAGEMENT SYSTEMS National Park Service Management Systems § 970.214 Federal lands congestion management system (CMS). (a) For purposes of this section, congestion...

  14. 5 CFR 297.103 - Designations of authority by system manager.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Designations of authority by system manager. 297.103 Section 297.103 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE... system manager. The responsible Office system manager having jurisdiction over a system of records may...

  15. 25 CFR 276.7 - Standards for grantee financial management systems.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Standards for grantee financial management systems. 276.7... grantee financial management systems. (a) Grantee financial management systems for grants and subgrantee financial management systems for subgrants shall provide for: (1) Accurate, current, and complete disclosure...

  16. 41 CFR 101-39.104-1 - Consolidations into a fleet management system.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... fleet management system. 101-39.104-1 Section 101-39.104-1 Public Contracts and Property Management..., TRANSPORTATION, AND MOTOR VEHICLES 39-INTERAGENCY FLEET MANAGEMENT SYSTEMS 39.1-Establishment, Modification, and Discontinuance of Interagency Fleet Management Systems § 101-39.104-1 Consolidations into a fleet management...

  17. Distribution Management System Volt/VAR Evaluation | Grid Modernization | NREL

    Science.gov Websites

    This project involves building a prototype distribution management system testbed that links a GE Grid Solutions distribution management system to power hardware-in-the-loop testing. This setup is...

  18. 33 CFR 96.220 - What makes up a safety management system?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... SECURITY VESSEL OPERATING REGULATIONS RULES FOR THE SAFE OPERATION OF VESSELS AND SAFETY MANAGEMENT SYSTEMS Company and Vessel Safety Management Systems § 96.220 What makes up a safety management system? (a) The safety management system must document the responsible person's— (1) Safety and pollution prevention...

  19. 23 CFR 973.210 - Indian lands bridge management system (BMS).

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 23 Highways 1 2014-04-01 2014-04-01 false Indian lands bridge management system (BMS). 973.210... HIGHWAYS MANAGEMENT SYSTEMS PERTAINING TO THE BUREAU OF INDIAN AFFAIRS AND THE INDIAN RESERVATION ROADS PROGRAM Bureau of Indian Affairs Management Systems § 973.210 Indian lands bridge management system (BMS...

  20. 23 CFR 973.214 - Indian lands congestion management system (CMS).

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 23 Highways 1 2014-04-01 2014-04-01 false Indian lands congestion management system (CMS). 973.214... HIGHWAYS MANAGEMENT SYSTEMS PERTAINING TO THE BUREAU OF INDIAN AFFAIRS AND THE INDIAN RESERVATION ROADS PROGRAM Bureau of Indian Affairs Management Systems § 973.214 Indian lands congestion management system...

  1. 48 CFR 52.234-4 - Earned Value Management System.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 2 2013-10-01 2013-10-01 false Earned Value Management....234-4 Earned Value Management System. As prescribed in 34.203(c), insert the following clause: Earned Value Management System (JUL 2006) (a) The Contractor shall use an earned value management system (EVMS...

  2. 48 CFR 52.234-4 - Earned Value Management System.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 2 2012-10-01 2012-10-01 false Earned Value Management....234-4 Earned Value Management System. As prescribed in 34.203(c), insert the following clause: Earned Value Management System (JUL 2006) (a) The Contractor shall use an earned value management system (EVMS...

  3. 48 CFR 52.234-4 - Earned Value Management System.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 2 2014-10-01 2014-10-01 false Earned Value Management....234-4 Earned Value Management System. As prescribed in 34.203(c), insert the following clause: Earned Value Management System (MAY 2014) (a) The Contractor shall use an earned value management system (EVMS...

  4. 48 CFR 52.234-4 - Earned Value Management System.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 2 2011-10-01 2011-10-01 false Earned Value Management....234-4 Earned Value Management System. As prescribed in 34.203(c), insert the following clause: Earned Value Management System (JUL 2006) (a) The Contractor shall use an earned value management system (EVMS...

  5. 78 FR 37676 - Federal Acquisition Regulation; System for Award Management Name Change, Phase 1 Implementation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-21

    ....204-13 System for Award Management Maintenance. * * * * * System for Award Management Maintenance (Jul...; System for Award Management Name Change, Phase 1 Implementation AGENCY: Department of Defense (DoD... System for Award Management (SAM) database. DATES: Effective Date: July 22, 2013. FOR FURTHER INFORMATION...

  6. 48 CFR 52.234-4 - Earned Value Management System.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Earned Value Management....234-4 Earned Value Management System. As prescribed in 34.203(c), insert the following clause: Earned Value Management System (JUL 2006) (a) The Contractor shall use an earned value management system (EVMS...

  7. Information Requirements for a Procurement Management Information System.

    DTIC Science & Technology

    1975-08-01

    Management Information System is... described and some justification for this type of procurement management information system is presented. A literature search was made to determine... information systems. If information requirements are correctly identified and satisfied by a procurement management information system, contract administration and procurement management can be...

  8. Experience Of Implementing The Integrated Management System In Manufacturing Companies In Slovakia

    NASA Astrophysics Data System (ADS)

    Lestyánszka Škůrková, Katarína; Kučerová, Marta; Fidlerová, Helena

    2015-06-01

    In corporate practice, the term Integrated Management System denotes a system whose aim is to manage an organization with regard to quality, the environment, and health and safety at work. In the first phase of the VEGA project No. 1/0448/13 "Transformation of ergonomics program into the company management structure through interaction and utilization QMS, EMS, HSMS", we focused on obtaining information about how integrated management systems are implemented in manufacturing companies in Slovakia. The paper considers the characteristics of an integrated management system, specifies the possibilities for the successive integration of the management systems, and describes the essential aspects of the practical implementation of integrated management systems in companies in Slovakia.

  9. Strategic management of health care information systems: nurse managers' perceptions.

    PubMed

    Lammintakanen, Johanna; Kivinen, Tuula; Saranto, Kaija; Kinnunen, Juha

    2009-01-01

    The aim of this study is to describe nurse managers' perceptions of the strategic management of information systems in health care. Lack of strategic thinking is a typical feature of health care, and this may also extend to information systems. The data for this study were collected through eight focus group interviews with a total of 48 nurse managers from primary and specialised health care. Five main categories described the strategic management of information systems in health care: IT as an emphasis of strategy; lack of strategic management of information systems; the importance of management; problems in privacy protection; and costs of IT. Although IT was emphasised in the strategies of many health care organisations, a typical feature was a lack of strategic management of information systems. This was seen both as an underutilisation of IT opportunities in health care organisations and as an increased workload from the nurse managers' perspective. Furthermore, the nurse managers reported that implementation of IT strengthened their managerial roles but also required stronger management. In conclusion, strategic management of information systems needs to be strengthened in health care, and nurse managers should be more involved in this process.

  10. Hydrophobicity as a design criterion for polymer scaffolds in bone tissue engineering.

    PubMed

    Jansen, Edwin J P; Sladek, Raymond E J; Bahar, Hila; Yaffe, Avinoam; Gijbels, Marion J; Kuijer, Roel; Bulstra, Sjoerd K; Guldemond, Nick A; Binderman, Itzhak; Koole, Leo H

    2005-07-01

    Porous polymeric scaffolds play a key role in most tissue-engineering strategies. A series of non-degrading porous scaffolds was prepared, based on bulk-copolymerisation of 1-vinyl-2-pyrrolidinone (NVP) and n-butyl methacrylate (BMA), followed by a particulate-leaching step to generate porosity. Biocompatibility of these scaffolds was evaluated in vitro and in vivo. Furthermore, the scaffold materials were studied using the so-called demineralised bone matrix (DBM) as an evaluation system in vivo. The DBM, which is essentially a part of a rat femoral bone after processing with mineral acid, provides a suitable environment for ectopic bone formation, provided that the cavity of the DBM is filled with bone marrow prior to subcutaneous implantation in the thoracic region of rats. Various scaffold materials, differing with respect to composition and, hence, hydrophilicity, were introduced into the centre of DBMs. The ends were closed with rat bone marrow, and ectopic bone formation was monitored after 4, 6, and 8 weeks, both through X-ray microradiography and histology. The 50:50 scaffold particles were found to readily accommodate formation of bone tissue within their pores, whereas this was much less the case for the more hydrophilic 70:30 counterpart scaffolds. New healthy bone tissue was encountered inside the pores of the 50:50 scaffold material, not only at the periphery of the constructs but also in the center. Active osteoblast cells were found at the bone-biomaterial interfaces. These data indicate that the hydrophobicity of the biomaterial is, most likely, an important design criterion for polymeric scaffolds which should promote the healing of bone defects. Furthermore, it is argued that stable, non-degrading porous biomaterials, like those used in this study, provide an important tool to expand our comprehension of the role of biomaterials in scaffold-based tissue engineering approaches.

  11. 32 CFR 239.10 - Management Controls.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Systems. Headquarters, USACE has an existing information management system that manages all information related to the HAP program. (1) HAPMIS. The Homeowners Assistance Program Management Information System... to this program and the management information system to protect the privacy information of Expanded...

  12. 32 CFR 239.10 - Management controls.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... systems. Headquarters, USACE has an existing information management system that manages all information related to the HAP program. (1) HAPMIS. The Homeowners Assistance Program Management Information System... to this program and the management information system to protect the privacy of Expanded HAP...

  13. 32 CFR 239.10 - Management controls.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... systems. Headquarters, USACE has an existing information management system that manages all information related to the HAP program. (1) HAPMIS. The Homeowners Assistance Program Management Information System... to this program and the management information system to protect the privacy of Expanded HAP...

  14. 32 CFR 239.10 - Management controls.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... systems. Headquarters, USACE has an existing information management system that manages all information related to the HAP program. (1) HAPMIS. The Homeowners Assistance Program Management Information System... to this program and the management information system to protect the privacy of Expanded HAP...

  15. 23 CFR 973.208 - Indian lands pavement management system (PMS).

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 23 Highways 1 2013-04-01 2013-04-01 false Indian lands pavement management system (PMS). 973.208... PROGRAM Bureau of Indian Affairs Management Systems § 973.208 Indian lands pavement management system (PMS... concepts described in the AASHTO's “Pavement Management Guide.” 1 1 “Pavement Management Guide,” AASHTO...

  16. 23 CFR 973.208 - Indian lands pavement management system (PMS).

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 23 Highways 1 2010-04-01 2010-04-01 false Indian lands pavement management system (PMS). 973.208... PROGRAM Bureau of Indian Affairs Management Systems § 973.208 Indian lands pavement management system (PMS... concepts described in the AASHTO's “Pavement Management Guide.” 1 1 “Pavement Management Guide,” AASHTO...

  17. 23 CFR 973.208 - Indian lands pavement management system (PMS).

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 23 Highways 1 2012-04-01 2012-04-01 false Indian lands pavement management system (PMS). 973.208... PROGRAM Bureau of Indian Affairs Management Systems § 973.208 Indian lands pavement management system (PMS... concepts described in the AASHTO's “Pavement Management Guide.” 1 1 “Pavement Management Guide,” AASHTO...

  18. 23 CFR 973.208 - Indian lands pavement management system (PMS).

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 23 Highways 1 2014-04-01 2014-04-01 false Indian lands pavement management system (PMS). 973.208... PROGRAM Bureau of Indian Affairs Management Systems § 973.208 Indian lands pavement management system (PMS... concepts described in the AASHTO's “Pavement Management Guide.” 1 1 “Pavement Management Guide,” AASHTO...

  19. 23 CFR 973.208 - Indian lands pavement management system (PMS).

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 23 Highways 1 2011-04-01 2011-04-01 false Indian lands pavement management system (PMS). 973.208... PROGRAM Bureau of Indian Affairs Management Systems § 973.208 Indian lands pavement management system (PMS... concepts described in the AASHTO's “Pavement Management Guide.” 1 1 “Pavement Management Guide,” AASHTO...

  20. [Relationship Between Members Satisfaction with Service Club Management Processes and Perception of Club Management System.

    ERIC Educational Resources Information Center

    Dawson, Frances Trigg

    A study was made to determine the relationships between (1) satisfaction of members with service club management processes and member's perception of management systems, (2) perception of service club management system to selected independent variables, and (3) satisfaction to perception of service club management systems with independent…

  1. 41 CFR 302-1.100 - What is a comprehensive, automated relocation management system?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... system? A comprehensive, automated relocation management system is a system that integrates into a single... 41 Public Contracts and Property Management 4 2014-07-01 2014-07-01 false What is a comprehensive, automated relocation management system? 302-1.100 Section 302-1.100 Public Contracts and Property Management...

  2. 48 CFR 1052.234-72 - Core Earned Value Management System AUG 2011

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Management System AUG 2011 1052.234-72 Section 1052.234-72 Federal Acquisition Regulations System DEPARTMENT... and Clauses 1052.234-72 Core Earned Value Management System AUG 2011 As prescribed in DTAR 1034.203... an earned value management system (EVMS). (a) The Contractor shall use an earned value management...

  3. 41 CFR 302-1.100 - What is a comprehensive, automated relocation management system?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... system? A comprehensive, automated relocation management system is a system that integrates into a single... 41 Public Contracts and Property Management 4 2013-07-01 2012-07-01 true What is a comprehensive, automated relocation management system? 302-1.100 Section 302-1.100 Public Contracts and Property Management...

  4. 41 CFR 302-1.100 - What is a comprehensive, automated relocation management system?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... system? A comprehensive, automated relocation management system is a system that integrates into a single... 41 Public Contracts and Property Management 4 2011-07-01 2011-07-01 false What is a comprehensive, automated relocation management system? 302-1.100 Section 302-1.100 Public Contracts and Property Management...

  5. 48 CFR 1052.234-72 - Core Earned Value Management System AUG 2011

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Management System AUG 2011 1052.234-72 Section 1052.234-72 Federal Acquisition Regulations System DEPARTMENT... and Clauses 1052.234-72 Core Earned Value Management System AUG 2011 As prescribed in DTAR 1034.203... an earned value management system (EVMS). (a) The Contractor shall use an earned value management...

  6. 48 CFR 1052.234-72 - Core Earned Value Management System AUG 2011

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Management System AUG 2011 1052.234-72 Section 1052.234-72 Federal Acquisition Regulations System DEPARTMENT... and Clauses 1052.234-72 Core Earned Value Management System AUG 2011 As prescribed in DTAR 1034.203... an earned value management system (EVMS). (a) The Contractor shall use an earned value management...

  7. 41 CFR 302-1.100 - What is a comprehensive, automated relocation management system?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... system? A comprehensive, automated relocation management system is a system that integrates into a single... 41 Public Contracts and Property Management 4 2012-07-01 2012-07-01 false What is a comprehensive, automated relocation management system? 302-1.100 Section 302-1.100 Public Contracts and Property Management...

  8. Environmental Files and Data Bases. Part A. Introduction and Oceanographic Management Information System.

    DTIC Science & Technology

    1981-09-01

    Keywords: Management Information System, Naval Oceanography Program, Naval Oceanographic Requirements, Acoustic Reference Service, Research Vehicle... Contents include: Chapter 2, The Oceanographic Management Information System; Chapter 3, Acoustic Data; Chapter 4, Geological and Geophysical Data.

  9. 76 FR 76917 - Homeless Management Information Systems Requirements

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-09

    ...-P-01] Homeless Management Information Systems Requirements AGENCY: Office of the Assistant Secretary... for the establishment of regulations for Homeless Management Information Systems (HMIS), which are the... community development, Homeless, Information technology system, Management system, Nonprofit organizations...

  10. 20 CFR 632.76 - Program management systems.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... NATIVE AMERICAN EMPLOYMENT AND TRAINING PROGRAMS Program Design and Management § 632.76 Program management systems. (a) All Native American grantees shall establish management information systems to... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Program management systems. 632.76 Section...

  11. A Data-Based Financial Management Information System (FMIS) for Administrative Sciences Department

    DTIC Science & Technology

    1990-12-01

    Financial Management Information System that would result in improved management of financial assets, better use of clerical skills, and more detailed... develops and implements a personal computer-based Management Information System for the management of the many funding accounts controlled by the... different software programs, into a single all-encompassing Management Information System. The system was written using dBASE IV and is currently operational.

  12. The influence of enterprise resource planning (ERP) systems' performance on earnings management

    NASA Astrophysics Data System (ADS)

    Tsai, Wen-Hsien; Lee, Kuen-Chang; Liu, Jau-Yang; Lin, Sin-Jin; Chou, Yu-Wei

    2012-11-01

    We analyse whether there is a linkage between performance measures of enterprise resource planning (ERP) systems and earnings management. We find that earnings management decreases with the higher performance of ERP systems. The empirical result is as expected. We further analyse how the dimension of the DeLone and McLean model of information systems success affects earnings management. We find that the relationship between the performance of ERP systems and earnings management depends on System Quality after ERP implementation. The more System Quality improves, the more earnings management is reduced.

  13. 48 CFR 245.105 - Contractor's property management system compliance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... management system compliance. 245.105 Section 245.105 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CONTRACT MANAGEMENT GOVERNMENT PROPERTY General 245.105 Contractor's property management system compliance. The assigned property administrator shall perform...

  14. Integrated Computer System of Management in Logistics

    NASA Astrophysics Data System (ADS)

    Chwesiuk, Krzysztof

    2011-06-01

    This paper aims at presenting a concept of an integrated computer system of management in logistics, particularly in supply and distribution chains. Consequently, the paper includes the basic idea of the concept of computer-based management in logistics and components of the system, such as CAM and CIM systems in production processes, and management systems for storage, materials flow, and for managing transport, forwarding and logistics companies. The platform which integrates computer-aided management systems is that of electronic data interchange.

  15. The Art World's Concept of Negative Space Applied to System Safety Management

    NASA Technical Reports Server (NTRS)

    Goodin, James Ronald (Ronnie)

    2005-01-01

    Tools from several different disciplines can improve system safety management. This paper relates the Art World with our system safety world, showing useful art schools of thought applied to system safety management, developing an art theory-system safety bridge. This bridge is then used to demonstrate relations with risk management, the legal system, personnel management and basic management (establishing priorities). One goal of this presentation/paper is simply to be a fun diversion from the many technical topics presented during the conference.

  16. Analysis of Service Records Management Systems for Rescue and Retention of Cultural Resource Documents

    DTIC Science & Technology

    2009-06-01

    this information was not migrated to the new database. The responsible offices were told to destroy the old cards, and thus, vast amounts of... then necessary to examine the online service-specific records management systems, namely the Army Records Information Management System (ARIMS), the Air... Force Records Information Management System (AFRIMS), and the Navy Records Management System. Each system...

  17. 41 CFR 102-34.340 - Do we need a fleet management information system?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... management information system? 102-34.340 Section 102-34.340 Public Contracts and Property Management Federal... VEHICLE MANAGEMENT Federal Fleet Report § 102-34.340 Do we need a fleet management information system? Yes, you must have a fleet management information system at the department or agency level that — (a...

  18. 41 CFR 102-34.340 - Do we need a fleet management information system?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... management information system? 102-34.340 Section 102-34.340 Public Contracts and Property Management Federal... VEHICLE MANAGEMENT Federal Fleet Report § 102-34.340 Do we need a fleet management information system? Yes, you must have a fleet management information system at the department or agency level that — (a...

  19. 41 CFR 102-34.340 - Do we need a fleet management information system?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... management information system? 102-34.340 Section 102-34.340 Public Contracts and Property Management Federal... VEHICLE MANAGEMENT Federal Fleet Report § 102-34.340 Do we need a fleet management information system? Yes, you must have a fleet management information system at the department or agency level that — (a...

  20. 41 CFR 102-34.340 - Do we need a fleet management information system?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... management information system? 102-34.340 Section 102-34.340 Public Contracts and Property Management Federal... VEHICLE MANAGEMENT Federal Fleet Report § 102-34.340 Do we need a fleet management information system? Yes, you must have a fleet management information system at the department or agency level that — (a...

  1. 41 CFR 102-34.340 - Do we need a fleet management information system?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... management information system? 102-34.340 Section 102-34.340 Public Contracts and Property Management Federal... VEHICLE MANAGEMENT Federal Fleet Report § 102-34.340 Do we need a fleet management information system? Yes, you must have a fleet management information system at the department or agency level that — (a...

  2. 48 CFR 970.5232-7 - Financial management system.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Financial management... for Management and Operating Contracts 970.5232-7 Financial management system. As prescribed in 970.3270(b)(1), insert the following clause: Financial Management System (DEC 2000) The Contractor shall...

  3. 48 CFR 970.5232-7 - Financial management system.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... maintain and administer a financial management system that is suitable to provide proper accounting in... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Financial management... for Management and Operating Contracts 970.5232-7 Financial management system. As prescribed in 970...

  4. Development of the KOSMS management simulation training system and its application

    NASA Astrophysics Data System (ADS)

    Takatsu, Yoshiki

    The use of games which simulate actual corporate management has recently become more common and is now utilized in various ways for in-house corporate training courses. KOSMS (Kobe Steel Management Simulation System), a training system designed to help improve the management skills of senior management staff, is a unique management simulation training system in which the participants, using personal computers, must make decisions concerning a variety of management activities, in simulated competition with other corporations. This report outlines the KOSMS system, and describes the basic structure and detailed contents of the management simulation models, and actual application of the KOSMS management simulation training.

  5. Evaluation of Microcomputer-Based Operation and Maintenance Management Systems for Army Water/Wastewater Treatment Plant Operation.

    DTIC Science & Technology

    1986-07-01

    Contents include: Computer-Aided Operation Management System; Functions of an Off-Line Computer-Aided Operation Management System; Applications of... System Comparisons. Figures include: Hardware Components; Basic Functions of a Computer-Aided Operation Management System; ... Plant Visits; Computer-Aided Operation Management Systems Reviewed for Analysis of Basic Functions; Progress of Software System Installation and...

  6. 5 CFR 9701.405 - Performance management system requirements.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... performance management systems for DHS employees, subject to the requirements set forth in this subpart. (b) Each DHS performance management system must— (1) Specify the employees covered by the system(s); (2... 5 Administrative Personnel 3 2013-01-01 2013-01-01 false Performance management system...

  7. 5 CFR 9701.405 - Performance management system requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... performance management systems for DHS employees, subject to the requirements set forth in this subpart. (b) Each DHS performance management system must— (1) Specify the employees covered by the system(s); (2... 5 Administrative Personnel 3 2011-01-01 2011-01-01 false Performance management system...

  8. 5 CFR 9701.405 - Performance management system requirements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... performance management systems for DHS employees, subject to the requirements set forth in this subpart. (b) Each DHS performance management system must— (1) Specify the employees covered by the system(s); (2... 5 Administrative Personnel 3 2012-01-01 2012-01-01 false Performance management system...

  9. 5 CFR 9701.405 - Performance management system requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... performance management systems for DHS employees, subject to the requirements set forth in this subpart. (b) Each DHS performance management system must— (1) Specify the employees covered by the system(s); (2... 5 Administrative Personnel 3 2014-01-01 2014-01-01 false Performance management system...

  10. Home and Building Energy Management Systems | Grid Modernization | NREL

    Science.gov Websites

    Home and Building Energy Management Systems. NREL research shows how building assets and energy management systems can provide value to the grid. [Photo caption: a pair of NREL researchers who received a record of invention for a home energy management system in a smart home laboratory.]

  11. Feasibility of Executing MIMS on Interdata 80.

    DTIC Science & Technology

    Keywords: CDC 6500 computers; CDC 6600 computers; MIMS (Medical Information Management System); medical information management systems; file structures; computer ... storage management. The report examines the feasibility of implementing a large information management system on minicomputers. The Medical Information Management System and the Interdata 80 minicomputer were selected as representative systems. The FORTRAN programs currently being used in MIMS

  12. Design distributed simulation platform for vehicle management system

    NASA Astrophysics Data System (ADS)

    Wen, Zhaodong; Wang, Zhanlin; Qiu, Lihua

    2006-11-01

    Next-generation military aircraft place high performance demands on the airborne management system. General-purpose modules, data integration, a high-speed data bus, and related technologies are needed to share and manage subsystem information efficiently. The subsystems include the flight control system, propulsion system, hydraulic power system, environmental control system, fuel management system, electrical power system, and others. The stand-alone or mixed architecture is being replaced by an integrated architecture, in which the whole airborne system is managed as a single system: the physical devices remain distributed, but the system information is integrated and shared. The processing functions of each subsystem are integrated (including general processing modules and dynamic reconfiguration), and the sensors and signal-processing functions are shared; this also lays a foundation for power sharing. A distributed vehicle management system built on a 1553B bus and distributed processors can provide a validation platform for research on integrated management of airborne systems. This paper establishes such a Vehicle Management System (VMS) simulation platform, discusses its software and hardware configuration, and analyzes its communication and fault-tolerance methods.
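
    To make the bus-level exchange concrete, here is a minimal Python sketch of a MIL-STD-1553B-style command/response cycle, with a single retry standing in for fault tolerance. The RemoteTerminal and BusController classes, failure rates, and retry policy are assumptions for illustration only; the platform described in the paper uses real 1553B hardware and distributed processors:

        # Minimal sketch of a 1553B-style polling cycle for a distributed
        # vehicle management system. All names and policies are hypothetical.
        import random
        from typing import Optional

        class RemoteTerminal:
            """Simulated subsystem node (flight control, fuel, electrical, ...)."""
            def __init__(self, address: int, name: str, fail_rate: float = 0.1):
                self.address, self.name, self.fail_rate = address, name, fail_rate

            def respond(self) -> Optional[dict]:
                # A None response models a dropped or garbled bus transaction.
                if random.random() < self.fail_rate:
                    return None
                return {"rt": self.address, "name": self.name, "status": "OK"}

        class BusController:
            """Polls each remote terminal; retries once before declaring a fault."""
            def __init__(self, terminals):
                self.terminals = terminals

            def major_frame(self):
                results = []
                for rt in self.terminals:
                    reply = rt.respond() or rt.respond()   # one retry on failure
                    results.append(reply or {"rt": rt.address, "name": rt.name,
                                             "status": "FAULT"})
                return results

        if __name__ == "__main__":
            bc = BusController([RemoteTerminal(1, "flight control"),
                                RemoteTerminal(2, "fuel management"),
                                RemoteTerminal(3, "electrical power")])
            for frame in range(3):          # run three bus major frames
                print(f"frame {frame}:", bc.major_frame())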

  13. 48 CFR 1034.201 - Policy.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... CONTRACTING MAJOR SYSTEM ACQUISITION Earned Value Management System 1034.201 Policy. (a) (1) An Earned Value Management System (EVMS) is required for major acquisitions for development/modernization/enhancement (DME..., Earned Value Management System; and, as appropriate, 1052.234-4, Earned Value Management System Alternate...

  14. 48 CFR 1034.201 - Policy.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... CONTRACTING MAJOR SYSTEM ACQUISITION Earned Value Management System 1034.201 Policy. (a) (1) An Earned Value Management System (EVMS) is required for major acquisitions for development/modernization/enhancement (DME..., Earned Value Management System; and, as appropriate, 1052.234-4, Earned Value Management System Alternate...

  15. 48 CFR 1034.201 - Policy.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... CONTRACTING MAJOR SYSTEM ACQUISITION Earned Value Management System 1034.201 Policy. (a) (1) An Earned Value Management System (EVMS) is required for major acquisitions for development/modernization/enhancement (DME..., Earned Value Management System; and, as appropriate, 1052.234-4, Earned Value Management System Alternate...

  16. Design of investment management optimization system for power grid companies under new electricity reform

    NASA Astrophysics Data System (ADS)

    Yang, Chunhui; Su, Zhixiong; Wang, Xin; Liu, Yang; Qi, Yongwei

    2017-03-01

    The "new normal" economic situation and the implementation of a new round of electric power system reform place higher demands on the daily operations of power grid companies. As an important part of those daily operations, investment management directly affects a company's operating efficiency and management level. In this context, establishing an investment management optimization system will help power grid companies improve investment management and control, which is of great significance as they adapt quickly to a changing market environment and meet policy requirements. Therefore, the purpose of this paper is to construct an investment management optimization system for power grid companies comprising an investment management system, an investment process control system, an investment structure optimization system, an investment project evaluation system, and an investment management information platform support system.

  17. 75 FR 41798 - Solicitation of Letters of Interest to Participate in Biotechnology Quality Management System...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-19

    ...] Solicitation of Letters of Interest to Participate in Biotechnology Quality Management System Program AGENCY... participate in the APHIS Biotechnology Quality Management System Program. The Biotechnology Quality Management..., audit-based compliance assistance program known as the Biotechnology Quality Management System Program...

  18. 75 FR 61413 - Notice of Availability of Biotechnology Quality Management System Audit Standard and Evaluation...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-05

    ...] Notice of Availability of Biotechnology Quality Management System Audit Standard and Evaluation of... Biotechnology Quality Management System Program (BQMS Program) to assist regulated entities in achieving and... customized biotechnology quality management system (BQMS) to improve their management of domestic research...

  19. 25 CFR 900.54 - Should the property management system prescribe internal controls?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... EDUCATION ASSISTANCE ACT Standards for Tribal or Tribal Organization Management Systems Property Management... contract consistent with the Indian tribe or tribal organization's property management system. ... 25 Indians 2 2010-04-01 2010-04-01 false Should the property management system prescribe internal...

  20. 25 CFR 900.54 - Should the property management system prescribe internal controls?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... EDUCATION ASSISTANCE ACT Standards for Tribal or Tribal Organization Management Systems Property Management... contract consistent with the Indian tribe or tribal organization's property management system. ... 25 Indians 2 2011-04-01 2011-04-01 false Should the property management system prescribe internal...

  1. 41 CFR 105-71.120 - Standards for financial management systems.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... management systems. 105-71.120 Section 105-71.120 Public Contracts and Property Management Federal Property Management Regulations System (Continued) GENERAL SERVICES ADMINISTRATION Regional Offices-General Services... management systems. (a) A State must expend and account for grant funds in accordance with State laws and...

  2. 41 CFR 105-71.120 - Standards for financial management systems.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... management systems. 105-71.120 Section 105-71.120 Public Contracts and Property Management Federal Property Management Regulations System (Continued) GENERAL SERVICES ADMINISTRATION Regional Offices-General Services... management systems. (a) A State must expend and account for grant funds in accordance with State laws and...

  3. 22 CFR 145.21 - Standards for financial management systems.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Standards for financial management systems. 145... Financial and Program Management § 145.21 Standards for financial management systems. (a) The Department... whenever practical. (b) Recipients' financial management systems shall provide for the following. (1...

  4. 29 CFR 95.21 - Standards for financial management systems.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 1 2010-07-01 2010-07-01 true Standards for financial management systems. 95.21 Section 95... Requirements Financial and Program Management § 95.21 Standards for financial management systems. (a... practical. (b) Recipients' financial management systems shall provide for the following: (1) Accurate...

  5. 7 CFR 3019.21 - Standards for financial management systems.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 15 2010-01-01 2010-01-01 false Standards for financial management systems. 3019.21... Requirements Financial and Program Management § 3019.21 Standards for financial management systems. (a) Federal... cost information whenever practical. (b) Recipients' financial management systems shall provide for the...

  6. 75 FR 21264 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-23

    ... Defense. A0030-22 AMC System name: Army Food Management Information System Records System location.... 9397, as amended. Purpose(s): The Army Food Management Information System will be used to automate the...: Supervisor, Army Food Management Information System, Program Manager, 401 First Street, Suite 157, Fort Lee...

  7. Transportation infrastructure : states' implementation of transportation management systems

    DOT National Transportation Integrated Search

    1997-01-13

    This report focuses on the U.S. General Accounting Office's ISTEA update of the states' implementation of pavement management systems, bridges, highway safety, congestion management systems, public transportation, and intermodal management systems. A...

  8. Intelligent community management system based on the devicenet fieldbus

    NASA Astrophysics Data System (ADS)

    Wang, Yulan; Wang, Jianxiong; Liu, Jiwen

    2013-03-01

    With the rapid development of the national economy and the improvement of living standards, people are placing higher demands on their living environment, and correspondingly higher requirements on the scope, efficiency, and service quality of estate management. This paper analyzes in depth the structure and composition of the intelligent community. Based on user requirements and the relevant specifications, it implements a district management system comprising: basic information management (housing information, household information, administrator accounts, password management, etc.); service management (standard property fees, fee collection, arrears history, and other property expenses); security management (household gas, water, electricity, and other utility safety, plus security for the district's public areas); and systems management (database backup, database restore, and log management). The article also analyzes the intelligent community system and proposes an architecture based on browser/server (B/S) technology, achieving network-wide device management through a friendly, easy-to-use, unified human-machine interface.
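
    As one concrete illustration of the systems-management module described above (database backup, restore, and log management), the following Python sketch uses only the standard library's sqlite3 online-backup API and logging module. The file names, table, and the mapping to the paper's system are hypothetical:

        # Hypothetical sketch of a community-management "systems management"
        # module: database backup/restore plus a simple operation log.
        import logging
        import sqlite3

        logging.basicConfig(filename="community.log", level=logging.INFO)

        def backup_database(src_path: str, dst_path: str) -> None:
            """Copy the live community database into a backup file using
            the sqlite3 online-backup API; restore is the same call run
            in the opposite direction."""
            src = sqlite3.connect(src_path)
            dst = sqlite3.connect(dst_path)
            try:
                src.backup(dst)
                logging.info("backed up %s -> %s", src_path, dst_path)
            finally:
                src.close()
                dst.close()

        if __name__ == "__main__":
            con = sqlite3.connect("community.db")
            con.execute("CREATE TABLE IF NOT EXISTS household(id INTEGER, name TEXT)")
            con.commit()
            con.close()
            backup_database("community.db", "community.bak")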

  9. An Approach for Implementation of Project Management Information Systems

    NASA Astrophysics Data System (ADS)

    Běrziša, Solvita; Grabis, Jānis

    Project management is governed by project management methodologies, standards, and other regulatory requirements. This chapter proposes an approach for implementing and configuring project management information systems according to requirements defined by these methodologies. The approach uses a project management specification framework to describe project management methodologies in a standardized manner. This specification is then used to configure the project management information system automatically by applying appropriate transformation mechanisms. Development of the standardized framework is based on analysis of typical project management concepts and processes and on existing XML-based representations of project management. A demonstration example of a project management information system's configuration is provided.
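
    The transformation step can be sketched as follows: read a standardized XML methodology specification and emit a configuration for the information system. The XML schema, element names, and configuration keys below are hypothetical; the chapter's actual specification framework may differ:

        # Hypothetical sketch: transform an XML methodology specification
        # into a PMIS configuration (one workflow stage per phase, one
        # document template per artifact).
        import xml.etree.ElementTree as ET

        SPEC = """
        <methodology name="ExampleMethod">
          <phase id="init"><artifact>charter</artifact></phase>
          <phase id="plan"><artifact>schedule</artifact><artifact>budget</artifact></phase>
        </methodology>
        """

        def configure_pmis(spec_xml: str) -> dict:
            """Derive a PMIS configuration from the methodology spec."""
            root = ET.fromstring(spec_xml)
            return {
                "methodology": root.get("name"),
                "workflow_stages": [p.get("id") for p in root.iter("phase")],
                "document_templates": [a.text for a in root.iter("artifact")],
            }

        if __name__ == "__main__":
            print(configure_pmis(SPEC))

    The point of the sketch is the direction of flow: the methodology specification, rather than hand configuration, drives the information system's setup.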

  10. 5 CFR 430.304 - SES performance management systems.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 1 2011-01-01 2011-01-01 false SES performance management systems. 430.304 Section 430.304 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PERFORMANCE MANAGEMENT Managing Senior Executive Performance § 430.304 SES performance management systems. (a...

  11. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...

  12. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...

  13. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...

  14. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...

  15. The Management and Demonstration System at Murray State University.

    ERIC Educational Resources Information Center

    Schroeder, Gary G.

    The management system in use at the Murray State University Teacher Corps Project is described. The system uses management by objectives and the demonstration approach, and encourages managers to focus on the development and demonstration of ideas, processes, and structures. The system's operating concepts of time management and human resources…

  16. 5 CFR 297.103 - Designations of authority by system manager.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... manager. 297.103 Section 297.103 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE... system manager. The responsible Office system manager having jurisdiction over a system of records may designate in writing an Office employee to evaluate and issue the Office's decision on Privacy Act matters...

  17. 5 CFR 297.103 - Designations of authority by system manager.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... manager. 297.103 Section 297.103 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE... system manager. The responsible Office system manager having jurisdiction over a system of records may designate in writing an Office employee to evaluate and issue the Office's decision on Privacy Act matters...

  18. 5 CFR 297.103 - Designations of authority by system manager.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... manager. 297.103 Section 297.103 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE... system manager. The responsible Office system manager having jurisdiction over a system of records may designate in writing an Office employee to evaluate and issue the Office's decision on Privacy Act matters...

  19. 5 CFR 297.103 - Designations of authority by system manager.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... manager. 297.103 Section 297.103 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE... system manager. The responsible Office system manager having jurisdiction over a system of records may designate in writing an Office employee to evaluate and issue the Office's decision on Privacy Act matters...

  20. Correlation Research of Medical Security Management System Network Platform in Medical Practice

    NASA Astrophysics Data System (ADS)

    Jie, Wang; Fan, Zhang; Jian, Hao; Li-nong, Yu; Jun, Fei; Ping, Hao; Ya-wei, Shen; Yue-jin, Chang

    Objective: To investigate the medical security management system network platform in medical practice. Methods: A network platform for the medical safety management system was established, comprising the medical security network host station, a client/server (C/S) medical security management system, department-level medical security management systems, a comprehensive query function, and a medical security disposal and examination system. Results: In medical safety management, the medical security management system can surface a hospital's medical security problems, achieve real-time detection, and improve the medical security incident detection rate. Conclusion: Applied in hospital management, the system can uncover hidden medical security dangers and sources of medical disputes, help resolve medical disputes promptly, and achieve good work efficiency; it is worth applying in hospital practice.
