Sample records for large information system

  1. An informal paper on large-scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Ho, Y. C.

    1975-01-01

    Large scale systems are defined as systems requiring more than one decision maker for control. Decentralized control and decomposition are discussed for large scale dynamic systems. Information and many-person decision problems are analyzed.

  2. Review and synthesis of problems and directions for large scale geographic information system development

    NASA Technical Reports Server (NTRS)

    Boyle, A. R.; Dangermond, J.; Marble, D.; Simonett, D. S.; Tomlinson, R. F.

    1983-01-01

    Problems and directions for large scale geographic information system development were reviewed and the general problems associated with automated geographic information systems and spatial data handling were addressed.

  3. Support of an Active Science Project by a Large Information System: Lessons for the EOS Era

    NASA Technical Reports Server (NTRS)

    Angelici, Gary L.; Skiles, J. W.; Popovici, Lidia Z.

    1993-01-01

    The ability of large information systems to support the changing data requirements of active science projects is being tested in a NASA collaborative study. This paper briefly profiles both the active science project and the large information system involved in this effort and offers some observations about the effectiveness of the project support. This is followed by lessons that are important for those participating in large information systems that need to support active science projects or that make available the valuable data produced by these projects. We learned in this work that it is difficult for a large information system focused on long term data management to satisfy the requirements of an on-going science project. For example, in order to provide the best service, it is important for all information system staff to keep focused on the needs and constraints of the scientists in the development of appropriate services. If the lessons learned in this and other science support experiences are not applied by those involved with large information systems of the EOS (Earth Observing System) era, then the final data products produced by future science projects may not be robust or of high quality, thereby making the conduct of the project science less efficacious and reducing the value of these unique suites of data for future research.

  4. Implementation of a large-scale hospital information infrastructure for multi-unit health-care services.

    PubMed

    Yoo, Sun K; Kim, Dong Keun; Kim, Jung C; Park, Youn Jung; Chang, Byung Chul

    2008-01-01

    With the increase in demand for high quality medical services, the need for an innovative hospital information system has become essential. An improved system has been implemented in all hospital units of the Yonsei University Health System. Interoperability between multi-units required appropriate hardware infrastructure and software architecture. This large-scale hospital information system encompassed PACS (Picture Archiving and Communications Systems), EMR (Electronic Medical Records) and ERP (Enterprise Resource Planning). It involved two tertiary hospitals and 50 community hospitals. The monthly data production rate by the integrated hospital information system is about 1.8 TByte and the total quantity of data produced so far is about 60 TByte. Large scale information exchange and sharing will be particularly useful for telemedicine applications.

  5. The Application of Large-Scale Hypermedia Information Systems to Training.

    ERIC Educational Resources Information Center

    Crowder, Richard; And Others

    1995-01-01

    Discusses the use of hypermedia in electronic information systems that support maintenance operations in large-scale industrial plants. Findings show that after establishing an information system, the same resource base can be used to train personnel how to use the computer system and how to perform operational and maintenance tasks. (Author/JMV)

  6. Information Systems Integration and Enterprise Application Integration (EAI) Adoption: A Case from Financial Services

    ERIC Educational Resources Information Center

    Lam, Wing

    2007-01-01

    Increasingly, organizations find that they need to integrate a large number of information systems in order to support enterprise-wide business initiatives such as e-business, supply chain management and customer relationship management. To date, organizations have largely tended to address information systems (IS) integration in an ad hoc manner.…

  7. Teach or No Teach: Is Large System Education Resurging?

    ERIC Educational Resources Information Center

    Sharma, Aditya; Murphy, Marianne C.

    2011-01-01

    Legacy or not, mainframe education is being taught at many U.S. universities. Some computer science programs have always had some large system content, but there does appear to be a resurgence of mainframe-related content in business programs such as Management Information Systems (MIS) and Computer Information Systems (CIS). Many companies such as…

  8. Lexical Problems in Large Distributed Information Systems.

    ERIC Educational Resources Information Center

    Berkovich, Simon Ya; Shneiderman, Ben

    1980-01-01

    Suggests a unified concept of a lexical subsystem as part of an information system to deal with lexical problems in local and network environments. The linguistic and control functions of the lexical subsystems in solving problems for large computer systems are described, and references are included. (Author/BK)

  9. Programming secure mobile agents in healthcare environments using role-based permissions.

    PubMed

    Georgiadis, C K; Baltatzis, J; Pangalos, G I

    2003-01-01

    The healthcare environment consists of vast amounts of dynamic and unstructured information distributed over a large number of information systems. Mobile agent technology is having an ever-growing impact on the delivery of medical information: it supports acquiring and manipulating information distributed across many information systems, and it is suitable for medical staff untrained in computers. But the introduction of mobile agents poses serious threats to sensitive healthcare information unless the proper countermeasures are taken. By applying a role-based approach to the authorization problem, we ease the sharing of information between hospital information systems and reduce the administrative burden. Different approaches to initiating the agent's migration result in different methods of assigning roles to the agent.
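    The role-based authorization approach described in this abstract can be illustrated with a minimal sketch. The roles, permissions, and function names below are hypothetical examples for illustration only, not the paper's agent framework:

    ```python
    # Minimal role-based access control (RBAC) sketch: permissions are
    # attached to roles, and an agent is checked against the role it carries.
    # All role and permission names here are invented for illustration.

    ROLE_PERMISSIONS = {
        "physician": {"read_record", "write_record"},
        "nurse": {"read_record"},
        "agent_readonly": {"read_record"},
    }

    def is_permitted(role: str, action: str) -> bool:
        """Return True if the given role grants the requested action."""
        return action in ROLE_PERMISSIONS.get(role, set())

    assert is_permitted("nurse", "read_record")
    assert not is_permitted("nurse", "write_record")
    ```

    Checking the role rather than the individual agent is what eases administration: adding a new agent or staff member requires only assigning an existing role, not editing per-user permission lists in every hospital information system.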

  10. Technology for large space systems: A bibliography with indexes (supplement 08)

    NASA Technical Reports Server (NTRS)

    1983-01-01

    This bibliography lists 414 reports, articles and other documents introduced into the NASA scientific and technical information system. It provides helpful information to the researcher, manager, and designer in technology development and mission design in the area of Large Space System Technology. Subject matter is grouped according to systems, interactive analysis and design, structural and thermal analysis and design, structural concepts and control systems, electronics, advanced materials, assembly concepts, propulsion, and solar power satellite systems.

  11. Technology for large space systems: A bibliography with indexes (supplement 09)

    NASA Technical Reports Server (NTRS)

    1983-01-01

    This bibliography lists 414 reports, articles and other documents introduced into the NASA scientific and technical information system between January 1, 1983 and June 30, 1983. Information on technology development and mission design in the area of Large Space System Technology is provided. Subject matter is grouped according to systems, interactive analysis and design, structural and thermal analysis and design, structural concepts and control systems, electronics, advanced materials, assembly concepts, propulsion, and solar power satellite systems.

  12. Technology for large space systems: A bibliography with indexes (supplement 10)

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The bibliography lists 408 reports, articles and other documents introduced into the NASA scientific and technical information system to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of large space system technology. Subject matter is grouped according to systems, interactive analysis and design, structural and thermal analysis and design, structural concepts and control systems, electronics, advanced materials, assembly concepts, propulsion, and solar power satellite systems.

  13. Feasibility of Executing MIMS on Interdata 80.

    DTIC Science & Technology

    CDC 6500 computers, CDC 6600 computers, MIMS (Medical Information Management System), medical information management systems, file structures, computer...storage management.

    The report examines the feasibility of implementing a large information management system on mini-computers. The Medical Information Management System and the Interdata 80 mini-computer were selected as representative systems. The FORTRAN programs currently being used in MIMS…

  14. Informatics Resources to Support Health Care Quality Improvement in the Veterans Health Administration

    PubMed Central

    Hynes, Denise M.; Perrin, Ruth A.; Rappaport, Steven; Stevens, Joanne M.; Demakis, John G.

    2004-01-01

    Information systems are increasingly important for measuring and improving health care quality. A number of integrated health care delivery systems use advanced information systems and integrated decision support to carry out quality assurance activities, but none as large as the Veterans Health Administration (VHA). The VHA's Quality Enhancement Research Initiative (QUERI) is a large-scale, multidisciplinary quality improvement initiative designed to ensure excellence in all areas where VHA provides health care services, including inpatient, outpatient, and long-term care settings. In this paper, we describe the role of information systems in the VHA QUERI process, highlight the major information systems critical to this quality improvement process, and discuss issues associated with the use of these systems. PMID:15187063

  15. PASSIM--an open source software system for managing information in biomedical studies.

    PubMed

    Viksna, Juris; Celms, Edgars; Opmanis, Martins; Podnieks, Karlis; Rucevskis, Peteris; Zarins, Andris; Barrett, Amy; Neogi, Sudeshna Guha; Krestyaninova, Maria; McCarthy, Mark I; Brazma, Alvis; Sarkans, Ugis

    2007-02-09

    One of the crucial aspects of day-to-day laboratory information management is collection, storage and retrieval of information about research subjects and biomedical samples. An efficient link between sample data and experiment results is absolutely imperative for a successful outcome of a biomedical study. Currently available software solutions are largely limited to large-scale, expensive commercial Laboratory Information Management Systems (LIMS). Acquiring such a LIMS can indeed bring laboratory information management to a higher level, but often requires a considerable investment of time, effort and funds, which are not always available. There is a clear need for lightweight open source systems for patient and sample information management. We present a web-based tool for submission, management and retrieval of sample and research subject data. The system secures confidentiality by separating anonymized sample information from individuals' records. It is simple and generic, and can be customised for various biomedical studies. Information can be both entered and accessed using the same web interface. User groups and their privileges can be defined. The system is open-source and is supplied with an on-line tutorial and necessary documentation. It has proven to be successful in a large international collaborative project. The presented system closes the gap between the need and the availability of lightweight software solutions for managing information in biomedical studies involving human research subjects.

  16. Process and information integration via hypermedia

    NASA Technical Reports Server (NTRS)

    Hammen, David G.; Labasse, Daniel L.; Myers, Robert M.

    1990-01-01

    Success stories for advanced automation prototypes abound in the literature but the deployments of practical large systems are few in number. There are several factors that militate against the maturation of such prototypes into products. Here, the integration of advanced automation software into large systems is discussed. Advanced automation systems tend to be specific applications that need to be integrated and aggregated into larger systems. Systems integration can be achieved by providing expert user-developers with verified tools to efficiently create small systems that interface to large systems through standard interfaces. The use of hypermedia as such a tool in the context of the ground control centers that support Shuttle and space station operations is explored. Hypermedia can be an integrating platform for data, conventional software, and advanced automation software, enabling data integration through the display of diverse types of information and through the creation of associative links between chunks of information. Further, hypermedia enables process integration through graphical invoking of system functions. Through analysis and examples, researchers illustrate how diverse information and processing paradigms can be integrated into a single software platform.

  17. Landscape permeability for large carnivores in Washington: a geographic information system weighted-distance and least-cost corridor assessment.

    Treesearch

    Peter H. Singleton; William L. Gaines; John F. Lehmkuhl

    2002-01-01

    We conducted a regional-scale evaluation of landscape permeability for large carnivores in Washington and adjacent portions of British Columbia and Idaho. We developed geographic information system based landscape permeability models for wolves (Canis lupus), wolverine (Gulo gulo), lynx (Lynx canadensis),...
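    Weighted-distance and least-cost corridor analysis of the kind described in this record is, at its core, a shortest-path computation over a grid of per-cell movement costs. A minimal sketch of that underlying technique (Dijkstra's algorithm on a 4-connected cost grid; illustrative only, not the authors' model):

    ```python
    import heapq

    def least_cost(grid, start, goal):
        """Dijkstra over a 4-connected grid of per-cell traversal costs.

        Returns the minimum accumulated cost of the cells entered on a path
        from start to goal (the start cell's own cost is excluded) -- a
        simplified stand-in for the weighted-distance surfaces used in GIS
        corridor modeling.
        """
        rows, cols = len(grid), len(grid[0])
        dist = {start: 0}
        heap = [(0, start)]
        while heap:
            d, (r, c) = heapq.heappop(heap)
            if (r, c) == goal:
                return d
            if d > dist.get((r, c), float("inf")):
                continue  # stale queue entry
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    nd = d + grid[nr][nc]
                    if nd < dist.get((nr, nc), float("inf")):
                        dist[(nr, nc)] = nd
                        heapq.heappush(heap, (nd, (nr, nc)))
        return float("inf")

    # High-cost middle column (e.g. a highway for a dispersing carnivore):
    # the cheapest route detours around it rather than crossing it.
    grid = [
        [1, 9, 1],
        [1, 9, 1],
        [1, 1, 1],
    ]
    ```

    A corridor map is then typically built by summing two such cost surfaces (one from each habitat patch) and keeping the cells whose combined cost falls under a threshold.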

  18. Technology for large space systems: A special bibliography with indexes (supplement 04)

    NASA Technical Reports Server (NTRS)

    1981-01-01

    This bibliography lists 259 reports, articles, and other documents introduced into the NASA scientific and technical information system between July 1, 1980 and December 31, 1980. Its purpose is to provide information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, solar power satellite systems, and flight experiments.

  19. Identifying Challenges to the Integration of Computer-Based Surveillance Information Systems in a Large City Health Department: A Case Study.

    PubMed

    Jennings, Jacky M; Stover, Jeffrey A; Bair-Merritt, Megan H; Fichtenberg, Caroline; Munoz, Mary Grace; Maziad, Rafiq; Ketemepi, Sherry Johnson; Zenilman, Jonathan

    2009-01-01

    Integrated infectious disease surveillance information systems have the potential to provide important new surveillance capacities and business efficiencies for local health departments. We conducted a case study at a large city health department of the primary computer-based infectious disease surveillance information systems during a 10-year period to identify the major challenges for information integration across the systems. The assessment included key informant interviews and evaluations of the computer-based surveillance information systems used for acute communicable diseases, human immunodeficiency virus/acquired immunodeficiency syndrome, sexually transmitted diseases, and tuberculosis. Assessments were conducted in 1998 with a follow-up in 2008. Assessments specifically identified and described the primary computer-based surveillance information system, any duplicative information systems, and selected variables collected. Persistent challenges to information integration across the information systems included the existence of duplicative data systems, differences in the variables used to collect similar information, and differences in basic architecture. The assessments identified a number of challenges for information integration across the infectious disease surveillance information systems at this city health department. The results suggest that local disease control programs use computer-based surveillance information systems that were not designed for data integration. To the extent that integration provides important new surveillance capacities and business efficiencies, we recommend that patient-centric information systems be designed that meet all the epidemiologic, clinical, and research needs in one system. In addition, the systems should include a standard set of elements and fields across similar surveillance systems.

  20. Technology for large space systems: A bibliography with indexes (supplement 20)

    NASA Technical Reports Server (NTRS)

    1989-01-01

    This bibliography lists 694 reports, articles, and other documents introduced into the NASA Scientific and Technical Information System between July, 1988 and December, 1988. Its purpose is to provide helpful information to the researcher or manager engaged in the development of technologies related to large space systems. Subject areas include mission and program definition, design techniques, structural and thermal analysis, structural dynamics and control systems, electronics, advanced materials, assembly concepts, and propulsion.

  1. An SQL query generator for CLIPS

    NASA Technical Reports Server (NTRS)

    Snyder, James; Chirica, Laurian

    1990-01-01

    As expert systems become more widely used, their access to large amounts of external information becomes increasingly important. This information exists in several forms, such as statistical and tabular data, knowledge gained by experts, and large databases of information maintained by companies. Because many expert systems, including CLIPS, do not provide access to this external information, much of the usefulness of expert systems is left untapped. The scope of this paper is to describe a database extension for the CLIPS expert system shell. The current industry standard database language is SQL. Due to SQL standardization, large amounts of information stored on various computers, potentially at different locations, will be more easily accessible. Expert systems should be able to directly access these existing databases rather than requiring information to be re-entered into the expert system environment. The ORACLE relational database management system (RDBMS) was used to provide a database connection within the CLIPS environment. To facilitate relational database access, a query generation system was developed as a CLIPS user function. Queries are entered in a CLIPS-like syntax and passed to the query generator, which constructs an SQL query and submits it to the ORACLE RDBMS for execution. The query results are asserted as CLIPS facts. The query generator was developed primarily for use within the ICADS project (Intelligent Computer Aided Design System) currently being developed by the CAD Research Unit at California Polytechnic State University (Cal Poly). In ICADS, several parallel or distributed expert systems access a common knowledge base of facts. Each expert system has a narrow domain of interest and therefore needs only certain portions of the information. The query generator provides a common method of accessing this information and allows the expert system to specify what data is needed without specifying how to retrieve it.
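    The query-generation pattern this abstract describes (a structured query spec translated into an SQL statement, with results fed back to the caller) can be sketched in a few lines. All names below are hypothetical illustrations, not the ICADS generator itself:

    ```python
    # Illustrative sketch: build a parameterized SQL SELECT from a simple
    # structured query spec (table, field list, condition triples).
    # build_sql and its argument shapes are invented for this example.

    def build_sql(table, fields, conditions):
        """Construct a parameterized SELECT statement.

        conditions is a list of (column, operator, value) triples; values
        are returned separately so the driver can bind them safely.
        """
        sql = f"SELECT {', '.join(fields)} FROM {table}"
        params = [val for _, _, val in conditions]
        if conditions:
            where = " AND ".join(f"{col} {op} ?" for col, op, _ in conditions)
            sql += f" WHERE {where}"
        return sql, params

    sql, params = build_sql(
        "rooms", ["name", "area"], [("area", ">", 100), ("floor", "=", 2)]
    )
    # sql    -> "SELECT name, area FROM rooms WHERE area > ? AND floor = ?"
    # params -> [100, 2]
    ```

    In the pattern the paper describes, each result row would then be asserted back into the expert system's working memory as a fact, so the rules never need to know how the data was retrieved.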

  2. MIADS2 ... an alphanumeric map information assembly and display system for a large computer

    Treesearch

    Elliot L. Amidon

    1966-01-01

    A major improvement and extension of the Map Information Assembly and Display System (MIADS) developed in 1964 is described. Basic principles remain unchanged, but the computer programs have been expanded and rewritten for a large computer, in Fortran IV and MAP languages. The code system is extended from 99 integers to about 2,200 alphanumeric 2-character codes. Hand-...

  3. Study on GIS-based sport-games information system

    NASA Astrophysics Data System (ADS)

    Peng, Hongzhi; Yang, Lingbin; Deng, Meirong; Han, Yongshun

    2008-10-01

    With the development of the internet and such information technologies as the Information Superhighway, computer technology, Remote Sensing (RS), the Global Positioning System (GPS), digital communication, and the National Information Network (NIN), Geographic Information Systems (GIS) have become more and more popular in science and industry. It is not only feasible but also necessary to apply GIS to large-scale sport games. This paper first discusses GIS technology and its applications, then elaborates on the framework and content of a Sport-Games Geographic Information System (SG-GIS) with functions for gathering, storing, processing, sharing, exchanging and utilizing all kinds of spatio-temporal information about sport games, and finally describes the design and development of a public-service GIS for the 6th Asian Winter Games in Changchun, China (CAWGIS). The application of CAWGIS showed that the established SG-GIS was feasible and that a GIS-based sport-games information system can effectively process a large amount of sport-games information and provide real-time services for organizers, athletes and the public.

  4. Change management and clinical engagement: critical elements for a successful clinical information system implementation.

    PubMed

    Detwiller, Maureen; Petillion, Wendy

    2014-06-01

    Moving a large healthcare organization from an old, nonstandardized clinical information system to a new user-friendly, standards-based system was much more than an upgrade to technology. This project to standardize terminology, optimize key processes, and implement a new clinical information system was a large change initiative over 4 years that affected clinicians across the organization. Effective change management and engagement of clinical stakeholders were critical to the success of the initiative. The focus of this article was to outline the strategies and methodologies used and the lessons learned.

  5. The application of artificial intelligence techniques to large distributed networks

    NASA Technical Reports Server (NTRS)

    Dubyah, R.; Smith, T. R.; Star, J. L.

    1985-01-01

    Data accessibility and information transfer efforts, including the land resources information system pilot, are structured as large computer information networks. These pilot efforts aim to reduce the difficulty of finding and using data, reduce processing costs, and minimize incompatibility between data sources. Artificial intelligence (AI) techniques were suggested as a means to achieve these goals. The applicability of certain AI techniques is explored in the context of distributed problem solving systems and the pilot land data system (PLDS). The topics discussed include: PLDS and its data processing requirements, expert systems and PLDS, distributed problem solving systems, AI problem solving paradigms, query processing, and distributed data bases.

  6. SPIRES (Stanford Public Information REtrieval System). Annual Report (2d, 1968).

    ERIC Educational Resources Information Center

    Parker, Edwin B.; And Others

    During 1968 the name of the project was changed from "Stanford Physics Information Retrieval System" to "Stanford Public Information Retrieval System" to reflect the broadening of perspective and goals due to formal collaboration with Project BALLOTS (Bibliographic Automation of Large Library Operations using a Time-Sharing System).…

  7. Medical Information Management System (MIMS): A generalized interactive information system

    NASA Technical Reports Server (NTRS)

    Alterescu, S.; Friedman, C. A.; Hipkins, K. R.

    1975-01-01

    An interactive information system is described. It is a general purpose, free format system which offers immediate assistance where manipulation of large data bases is required. The medical area is a prime area of application. Examples of the system's operation, commentary on the examples, and a complete listing of the system program are included.

  8. Technology for large space systems: A special bibliography with indexes

    NASA Technical Reports Server (NTRS)

    1979-01-01

    This bibliography lists 460 reports, articles, and other documents introduced into the NASA scientific and technical information system between January 1, 1968 and December 31, 1978. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, and flight experiments.

  9. Technology for large space systems: A special bibliography with indexes (supplement 01)

    NASA Technical Reports Server (NTRS)

    1979-01-01

    This bibliography lists 180 reports, articles, and other documents introduced into the NASA scientific and technical information system between January 1, 1979 and June 30, 1979. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, and flight experiments.

  10. LARGE BUILDING RADON MANUAL

    EPA Science Inventory

    The report summarizes information on how building systems -- especially the heating, ventilating, and air-conditioning (HVAC) system -- influence radon entry into large buildings and can be used to mitigate radon problems. It addresses the fundamentals of large building HVAC syst...

  11. Technology for large space systems: A bibliography with indexes (supplement 22)

    NASA Technical Reports Server (NTRS)

    1990-01-01

    This bibliography lists 1077 reports, articles, and other documents introduced into the NASA Scientific and Technical Information System between July 1, 1989 and December 31, 1989. Its purpose is to provide helpful information to the researcher or manager engaged in the development of technologies related to large space systems. Subject areas include mission and program definition, design techniques, structural and thermal analysis, structural dynamics and control systems, electronics, advanced materials, assembly concepts, and propulsion.

  12. Technology for Large Space Systems: A Special Bibliography with Indexes (Supplement 2)

    NASA Technical Reports Server (NTRS)

    1980-01-01

    This bibliography lists 258 reports, articles, and other documents introduced into the NASA scientific and technical information system between July 1, 1979 and December 31, 1979. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, solar power satellite systems, and flight experiments.

  13. Technology for large space systems: A special bibliography with indexes (supplement 05)

    NASA Technical Reports Server (NTRS)

    1981-01-01

    This bibliography lists 298 reports, articles, and other documents introduced into the NASA scientific and technical information system between January 1, 1981 and June 30, 1981. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, solar power satellite systems, and flight experiments.

  14. Technology for large space systems: A bibliography with indexes (supplement 12)

    NASA Technical Reports Server (NTRS)

    1985-01-01

    A bibliography listing 516 reports, articles and other documents introduced into the NASA scientific and technical information system between July 1, 1984 and December 31, 1984 is presented. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of Large Space System Technology. Subject matter is grouped according to systems, interactive analysis and design, structural and thermal analysis and design, structural concepts and control systems, electronics, advanced materials, assembly concepts, propulsion, and solar power satellite systems.

  15. Technology for large space systems: A special bibliography with indexes (supplement 06)

    NASA Technical Reports Server (NTRS)

    1982-01-01

    This bibliography lists 220 reports, articles and other documents introduced into the NASA scientific and technical information system between July 1, 1981 and December 31, 1981. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the area of the Large Space Systems Technology (LSST) Program. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, solar power satellite systems, and flight experiments.

  16. Position measurement of the direct drive motor of Large Aperture Telescope

    NASA Astrophysics Data System (ADS)

    Li, Ying; Wang, Daxing

    2010-07-01

    Along with the development of space and astronomy science, the production of large aperture and super-large aperture telescopes will become the trend. One method of achieving precise drive of a large aperture telescope is direct-drive technology with a unified electromagnetic and structural design. A direct-drive precision rotary table with a diameter of 2.5 meters, researched and produced by us, is a typical example of integrated mechanical and electrical design. This paper mainly introduces the position measurement control system of the direct-drive motor. In this motor design, the position measurement control system must have high resolution, precisely align with and measure the position of the rotor shaft, and convert the position information into commutation information corresponding to the required number of motor poles. The system uses a high-precision metal-band encoder and an absolute encoder; their outputs are processed in software by a 32-bit RISC CPU, yielding a high-resolution composite encoder. The paper gives relevant laboratory test results at the end, indicating that the position measurement approach can be applied to a large aperture telescope control system. This project is subsidized by the Chinese National Natural Science Funds (10833004).

  17. On the management and processing of earth resources information

    NASA Technical Reports Server (NTRS)

    Skinner, C. W.; Gonzalez, R. C.

    1973-01-01

    The basic concepts of a recently completed large-scale earth resources information system plan are reported. Attention is focused throughout the paper on the information management and processing requirements. After the development of the principal system concepts, a model system for implementation at the state level is discussed.

  18. An Innovative, Multidisciplinary Educational Program in Interactive Information Storage and Retrieval. M.S. Thesis Final Report, 1 Jul. 1985 - 31 Dec. 1987

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Gallagher, Mary C.

    1985-01-01

    There exist a large number of large-scale bibliographic Information Storage and Retrieval (IS&R) systems containing large amounts of valuable data of interest in a wide variety of research applications. These systems are not used to capacity because the end users, i.e., the researchers, have not been trained in the techniques of accessing them. This thesis describes the development of a transportable, university-level course in methods of querying on-line interactive Information Storage and Retrieval systems as a solution to this problem. The course was designed for upper-division science and engineering students, to enable these end users to access such systems directly. It is designed to be taught by instructors who are not specialists in either computer science or research skills, and it is independent of any particular IS&R system or computer hardware. The project is sponsored by NASA and conducted by the University of Southwestern Louisiana and Southern University.

  19. Human Factors in the Large: Experiences from Denmark, Finland and Canada in Moving Towards Regional and National Evaluations of Health Information System Usability. Contribution of the IMIA Human Factors Working Group.

    PubMed

    Kushniruk, A; Kaipio, J; Nieminen, M; Hyppönen, H; Lääveri, T; Nohr, C; Kanstrup, A M; Berg Christiansen, M; Kuo, M-H; Borycki, E

    2014-08-15

    The objective of this paper is to explore approaches to understanding the usability of health information systems at regional and national levels. Several different methods are discussed in case studies from Denmark, Finland and Canada. They range from small scale qualitative studies involving usability testing of systems to larger scale national level questionnaire studies aimed at assessing the use and usability of health information systems by entire groups of health professionals. It was found that regional and national usability studies can complement smaller scale usability studies, and that they are needed in order to understand larger trends regarding system usability. Despite adoption of EHRs, many health professionals rate the usability of the systems as low. A range of usability issues have been noted when data is collected on a large scale through use of widely distributed questionnaires and websites designed to monitor user perceptions of usability. As health information systems are deployed on a widespread basis, studies that examine systems used regionally or nationally are required. In addition, collection of large scale data on the usability of specific IT products is needed in order to complement smaller scale studies of specific systems.

  20. Human Factors in the Large: Experiences from Denmark, Finland and Canada in Moving Towards Regional and National Evaluations of Health Information System Usability

    PubMed Central

    Kaipio, J.; Nieminen, M.; Hyppönen, H.; Lääveri, T.; Nohr, C.; Kanstrup, A. M.; Berg Christiansen, M.; Kuo, M.-H.; Borycki, E.

    2014-01-01

    Objectives: The objective of this paper is to explore approaches to understanding the usability of health information systems at regional and national levels. Methods: Several different methods are discussed in case studies from Denmark, Finland and Canada. They range from small scale qualitative studies involving usability testing of systems to larger scale national level questionnaire studies aimed at assessing the use and usability of health information systems by entire groups of health professionals. Results: It was found that regional and national usability studies can complement smaller scale usability studies, and that they are needed in order to understand larger trends regarding system usability. Despite adoption of EHRs, many health professionals rate the usability of the systems as low. A range of usability issues have been noted when data is collected on a large scale through use of widely distributed questionnaires and websites designed to monitor user perceptions of usability. Conclusion: As health information systems are deployed on a widespread basis, studies that examine systems used regionally or nationally are required. In addition, collection of large scale data on the usability of specific IT products is needed in order to complement smaller scale studies of specific systems. PMID:25123725

  1. Information fusion based optimal control for large civil aircraft system.

    PubMed

    Zhen, Ziyang; Jiang, Ju; Wang, Xinhua; Gao, Chen

    2015-03-01

    Wind disturbance has a great influence on the landing security of Large Civil Aircraft. Simulation research and engineering experience show that PID control is not good enough to restrain wind disturbance. This paper focuses on anti-wind attitude control for Large Civil Aircraft in the landing phase. In order to improve riding comfort and flight security, an information fusion based optimal control strategy is presented to restrain the wind in the landing phase while maintaining attitude and airspeed. Data for the Boeing 707 are used to establish a full-variable nonlinear model of a Large Civil Aircraft, from which two linear models, longitudinal and lateral, are obtained. Based on engineering experience, the longitudinal channel adopts PID control and C inner control to keep the longitudinal attitude constant, and applies an autothrottle system to keep airspeed constant, while an information fusion based optimal regulator in the lateral control channel is designed to achieve lateral attitude holding. According to information fusion estimation, the optimal estimate of the control sequence is derived by fusing the hard constraint information of the system dynamic equations with the soft constraint information of the performance index function. On this basis, an information fusion state regulator is deduced for a discrete-time linear system with disturbance. Simulation results on the nonlinear aircraft model indicate that information fusion optimal control outperforms traditional PID control, LQR control, and LQR control with integral action in anti-wind disturbance performance in the landing phase. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
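
    One of the baselines this record names is LQR control. As a point of reference only, here is a minimal discrete-time LQR regulator; the A, B, Q, R matrices below are an illustrative double-integrator-like toy system, not the paper's Boeing 707 lateral model, and the fixed-point Riccati iteration is one standard way to compute the gain.

    ```python
    import numpy as np

    def dlqr(A, B, Q, R, iters=500):
        """Solve the discrete algebraic Riccati equation by fixed-point
        iteration and return the state-feedback gain K (control u = -K x)."""
        P = Q.copy()
        for _ in range(iters):
            K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
            P = Q + A.T @ P @ (A - B @ K)
        return K

    # Illustrative plant, not the aircraft model from the paper.
    A = np.array([[1.0, 0.1], [0.0, 1.0]])
    B = np.array([[0.0], [0.1]])
    Q = np.eye(2)
    R = np.array([[1.0]])

    K = dlqr(A, B, Q, R)
    # The closed-loop matrix A - B K should be stable (spectral radius < 1).
    rho = max(abs(np.linalg.eigvals(A - B @ K)))
    ```

    The information fusion regulator in the paper generalizes this setup by treating the dynamics as hard constraints and the performance index as soft constraints on the estimated control sequence.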

  2. Multi-Modal Traveler Information System - Gateway Functional Requirements

    DOT National Transportation Integrated Search

    1997-11-17

    The Multi-Modal Traveler Information System (MMTIS) project involves a large number of Intelligent Transportation System (ITS) related tasks. It involves research of all ITS initiatives in the Gary-Chicago-Milwaukee (GCM) Corridor which are currently...

  3. Multi-Modal Traveler Information System - Gateway Interface Control Requirements

    DOT National Transportation Integrated Search

    1997-10-30

    The Multi-Modal Traveler Information System (MMTIS) project involves a large number of Intelligent Transportation System (ITS) related tasks. It involves research of all ITS initiatives in the Gary-Chicago-Milwaukee (GCM) Corridor which are currently...

  4. 2016 pocket guide to large truck and bus statistics.

    DOT National Transportation Integrated Search

    2016-05-01

    FMCSA created and maintains the Motor Carrier Management Information System (MCMIS). MCMIS contains information on the safety performance of commercial motor carriers (large trucks and buses) and hazardous materials (HM) carriers subject to the Feder...

  5. 2017 Pocket Guide to Large Truck and Bus Statistics.

    DOT National Transportation Integrated Search

    2017-06-01

    FMCSA created and maintains the Motor Carrier Management Information System (MCMIS). MCMIS contains information on the safety performance of commercial motor carriers (large trucks and buses) and hazardous materials (HM) carriers subject to the Feder...

  6. The Changing Environment of Management Information Systems.

    ERIC Educational Resources Information Center

    Tagawa, Ken

    1982-01-01

    The promise of mainframe computers in the 1970s for management information systems (MIS) is largely unfulfilled, and newer office automation systems and data communication systems are designed to be responsive to MIS needs. The status of these innovations is briefly outlined. (MSE)

  7. Applications of Artificial Intelligence to Information Search and Retrieval: The Development and Testing of an Intelligent Technical Information System.

    ERIC Educational Resources Information Center

    Harvey, Francis A.

    This paper describes the evolution and development of an intelligent information system, i.e., a knowledge base for steel structures being undertaken as part of the Technical Information Center for Steel Structures at Lehigh University's Center of Advanced Technology for Large Structural Systems (ATLSS). The initial development of the Technical…

  8. A Lean Approach to Improving SE Visibility in Large Operational Systems Evolution

    DTIC Science & Technology

    2013-06-01

    … large health care system of systems. To enhance both visibility and flow, the approach utilizes visualization techniques, pull-scheduling processes … development processes. This paper describes an example implementation of the concept in a large health care system of systems. To enhance both visibility … and then provides the results to the requestor as soon as available. Hospital System Information Support Development: the health care SoS is a set …

  9. Multi-Modal Traveler Information System - GCM Corridor Architecture Interface Control Requirements

    DOT National Transportation Integrated Search

    1997-10-31

    The Multi-Modal Traveler Information System (MMTIS) project involves a large number of Intelligent Transportation System (ITS) related tasks. It involves research of all ITS initiatives in the Gary-Chicago-Milwaukee (GCM) Corridor which are currently...

  10. Multi-Modal Traveler Information System - GCM Corridor Architecture Functional Requirements

    DOT National Transportation Integrated Search

    1997-11-17

    The Multi-Modal Traveler Information System (MMTIS) project involves a large number of Intelligent Transportation System (ITS) related tasks. It involves research of all ITS initiatives in the Gary-Chicago-Milwaukee (GCM) Corridor which are currently...

  11. Large deviation analysis of a simple information engine

    NASA Astrophysics Data System (ADS)

    Maitland, Michael; Grosskinsky, Stefan; Harris, Rosemary J.

    2015-11-01

    Information thermodynamics provides a framework for studying the effect of feedback loops on entropy production. It has enabled the understanding of novel thermodynamic systems such as the information engine, which can be seen as a modern version of "Maxwell's Dæmon," whereby a feedback controller processes information gained by measurements in order to extract work. Here, we analyze a simple model of such an engine that uses feedback control based on measurements to obtain negative entropy production. We focus on the distribution and fluctuations of the information obtained by the feedback controller. Significantly, our model allows an analytic treatment for a two-state system with exact calculation of the large deviation rate function. These results suggest an approximate technique for larger systems, which is corroborated by simulation data.
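
    For readers unfamiliar with the formalism, the rate function referred to above has the standard large-deviation form. This is only the generic definition and its usual Legendre-transform route, not the paper's model-specific result:

    ```latex
    % Generic large-deviation principle for a time-averaged quantity I_t/t:
    P\!\left(\tfrac{I_t}{t} \approx i\right) \asymp e^{-t\,e(i)},
    \qquad
    e(i) = \sup_{k}\,\bigl[\,k\,i - \lambda(k)\,\bigr],
    \qquad
    \lambda(k) = \lim_{t\to\infty} \tfrac{1}{t}\,
    \ln \bigl\langle e^{\,k I_t} \bigr\rangle .
    ```

    Here \(I_t\) is the accumulated information, \(e(i)\) the rate function the paper computes exactly for its two-state model, and \(\lambda(k)\) the scaled cumulant generating function.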

  12. Internet based ECG medical information system.

    PubMed

    James, D A; Rowlands, D; Mahnovetski, R; Channells, J; Cutmore, T

    2003-03-01

    Physiological monitoring of humans for medical applications is well established and ready to be adapted to the Internet. This paper describes the implementation of a Medical Information System (MIS-ECG system) incorporating an Internet based ECG acquisition device. Traditionally clinical monitoring of ECG is largely a labour intensive process with data being typically stored on paper. Until recently, ECG monitoring applications have also been constrained somewhat by the size of the equipment required. Today's technology enables large and fixed hospital monitoring systems to be replaced by small portable devices. With an increasing emphasis on health management a truly integrated information system for the acquisition, analysis, patient particulars and archiving is now a realistic possibility. This paper describes recent Internet and technological advances and presents the design and testing of the MIS-ECG system that utilises those advances.

  13. Information technology security system engineering methodology

    NASA Technical Reports Server (NTRS)

    Childs, D.

    2003-01-01

    A methodology is described for system engineering security into large information technology systems under development. The methodology is an integration of a risk management process and a generic system development life cycle process. The methodology is to be used by Security System Engineers to effectively engineer and integrate information technology security into a target system as it progresses through the development life cycle. The methodology can also be used to re-engineer security into a legacy system.

  14. An Analysis of Information Technology Adoption by IRBs of Large Academic Medical Centers in the United States.

    PubMed

    He, Shan; Botkin, Jeffrey R; Hurdle, John F

    2015-02-01

    The clinical research landscape has changed dramatically in recent years in terms of both volume and complexity. This poses new challenges for Institutional Review Boards' (IRBs) review efficiency and quality, especially at large academic medical centers. This article discusses the technical facets of IRB modernization. We analyzed the information technology used by IRBs in large academic institutions across the United States. We found that large academic medical centers have a high electronic IRB adoption rate; however, the capabilities of electronic IRB systems vary greatly. We discuss potential use-cases of a fully exploited electronic IRB system that promise to streamline the clinical research work flow. The key to that approach utilizes a structured and standardized information model for the IRB application. © The Author(s) 2014.

  15. The Ideal Oriented Co-design Approach Revisited

    NASA Astrophysics Data System (ADS)

    Johnstone, Christina

    There exist a large number of different methodologies for developing information systems on the market. This implies that there are also a large number of "best" ways of developing those information systems. Avison and Fitzgerald (2003) state that every methodology is built on a philosophy. By philosophy they refer to the underlying attitudes and viewpoints, and the different assumptions and emphases to be found within the specific methodology.

  16. Medical-Information-Management System

    NASA Technical Reports Server (NTRS)

    Alterescu, Sidney; Friedman, Carl A.; Frankowski, James W.

    1989-01-01

    Medical Information Management System (MIMS) computer program interactive, general-purpose software system for storage and retrieval of information. Offers immediate assistance where manipulation of large data bases required. User quickly and efficiently extracts, displays, and analyzes data. Used in management of medical data and handling all aspects of data related to care of patients. Other applications include management of data on occupational safety in public and private sectors, handling judicial information, systemizing purchasing and procurement systems, and analyses of cost structures of organizations. Written in Microsoft FORTRAN 77.

  17. Engaging Engineering and Information Systems Students in Advocacy for Individuals with Disabilities through a Disability Film Media Project

    ERIC Educational Resources Information Center

    Lawler, James; Iturralde, Val; Goldstein, Allan; Joseph, Anthony

    2015-01-01

    College curricula of engineering and information systems do not afford frequent engagement with individuals with disabilities. The authors of this research study analyzed the benefits of disability films for a community film festival of largely engineering and information systems students and individuals with developmental and intellectual…

  18. Facilitating access to information in large documents with an intelligent hypertext system

    NASA Technical Reports Server (NTRS)

    Mathe, Nathalie

    1993-01-01

    Retrieving specific information from large amounts of documentation is not an easy task. It could be facilitated if information relevant in the current problem solving context could be automatically supplied to the user. As a first step towards this goal, we have developed an intelligent hypertext system called CID (Computer Integrated Documentation) and tested it on the Space Station Freedom requirement documents. The CID system enables integration of various technical documents in a hypertext framework and includes an intelligent context-sensitive indexing and retrieval mechanism. This mechanism utilizes on-line user information requirements and relevance feedback either to reinforce current indexing in case of success or to generate new knowledge in case of failure. This allows the CID system to provide helpful responses, based on previous usage of the documentation, and to improve its performance over time.
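
    The context-sensitive indexing with relevance feedback described here can be sketched in a few lines. This is an illustrative sketch under assumed data structures, not the CID system's code: it reinforces an index link's weight when the user confirms a retrieval was relevant, and learns a new link (or demotes a bad one) otherwise.

    ```python
    from collections import defaultdict

    class ContextIndex:
        """Maps (query term, context) pairs to weighted document sections."""

        def __init__(self):
            self.links = defaultdict(dict)   # (term, context) -> {section: weight}

        def retrieve(self, term, context):
            """Return candidate sections, best-weighted first."""
            cands = self.links.get((term, context), {})
            return sorted(cands, key=cands.get, reverse=True)

        def feedback(self, term, context, section, relevant):
            """Relevance feedback: reinforce on success, adjust on failure.
            An unseen (term, context, section) link is created on first use."""
            w = self.links[(term, context)].get(section, 0.0)
            if relevant:
                self.links[(term, context)][section] = w + 1.0
            elif section in self.links[(term, context)]:
                self.links[(term, context)][section] = max(0.0, w - 1.0)

    # Hypothetical usage: section identifiers are invented for illustration.
    idx = ContextIndex()
    idx.feedback("power", "assembly", "SSF-req-4.2", relevant=True)
    idx.feedback("power", "assembly", "SSF-req-7.1", relevant=True)
    idx.feedback("power", "assembly", "SSF-req-7.1", relevant=True)
    ```

    After this feedback, retrieval for the same term and context ranks the twice-confirmed section first, which is the sense in which the system "improves its performance over time."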

  19. Attributes and Behaviors of Performance-Centered Systems.

    ERIC Educational Resources Information Center

    Gery, Gloria

    1995-01-01

    Examines attributes, characteristics, and behaviors of performance-centered software packages that are emerging in the consumer software marketplace and compares them with large-scale systems software being designed by internal information systems staffs and vendors of large-scale software designed for financial, manufacturing, processing, and…

  20. Development of a full-text information retrieval system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oyama, Keizo; Miyazawa, Akira; Takasu, Atsuhiro; Shibano, Kouji

    The authors have executed a project to realize a full-text information retrieval system. The system is designed to deal with a document database comprising full text of a large number of documents such as academic papers. The document structures are utilized in searching and extracting appropriate information. The concept of structure handling and the configuration of the system are described in this paper.
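
    The idea of exploiting document structure in search can be illustrated briefly. This is a hedged sketch under an assumed structure (documents flattened into named elements), not the authors' system: a query can be evaluated over the whole document or restricted to one structural element such as the abstract.

    ```python
    # Hypothetical mini-corpus; ids and contents are invented for illustration.
    docs = {
        "paper-1": {"title": "Retrieval systems",
                    "abstract": "full text search over structured documents",
                    "body": "..."},
        "paper-2": {"title": "Databases",
                    "abstract": "storage engines",
                    "body": "full text appears only in the body here"},
    }

    def search(term, element=None):
        """Return doc ids containing `term`; if `element` is given, only
        matches within that structural element count."""
        hits = []
        for doc_id, parts in docs.items():
            fields = [parts[element]] if element else parts.values()
            if any(term in text for text in fields):
                hits.append(doc_id)
        return sorted(hits)
    ```

    A structure-blind search for "full text" matches both papers, while restricting the same query to the abstract element returns only the first, which is the kind of precision gain structure handling aims at.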

  1. Designing a place for automation.

    PubMed

    Bazzoli, F

    1995-05-01

    Re-engineering is a hot topic in health care as market forces increase pressure to cut costs. Providers and payers that are redesigning their business processes are counting on information systems to help achieve simplification and make large gains in efficiency. But these same organizations say they're reluctant to make large upfront investments in information systems until they know exactly what role technology will play in the re-engineered entity.

  2. Concepts for a global resources information system

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.; Urena, J. L.

    1984-01-01

    The objective of the Global Resources Information System (GRIS) is to establish an effective and efficient information management system to meet the data access requirements of NASA and NASA-related scientists conducting large-scale, multi-disciplinary, multi-mission scientific investigations. Using standard interfaces and operating guidelines, diverse data systems can be integrated to provide the capabilities to access and process multiple geographically dispersed data sets and to develop the necessary procedures and algorithms to derive global resource information.

  3. Power quality load management for large spacecraft electrical power systems

    NASA Technical Reports Server (NTRS)

    Lollar, Louis F.

    1988-01-01

    In December, 1986, a Center Director's Discretionary Fund (CDDF) proposal was granted to study power system control techniques in large space electrical power systems. Presented are the accomplishments in the area of power system control by power quality load management. In addition, information concerning the distortion problems in a 20 kHz ac power system is presented.

  4. A Navy Shore Activity Manpower Planning System for Civilians. Technical Report No. 24.

    ERIC Educational Resources Information Center

    Niehaus, R. J.; Sholtz, D.

    This report describes the U.S. Navy Shore Activity Manpower Planning System (SAMPS) advanced development research project. This effort is aimed at large-scale feasibility tests of manpower models for large Naval installations. These local planning systems are integrated with Navy-wide information systems on a data-communications network accessible…

  5. Some aspects of control of a large-scale dynamic system

    NASA Technical Reports Server (NTRS)

    Aoki, M.

    1975-01-01

    Techniques of predicting and/or controlling the dynamic behavior of large scale systems are discussed in terms of decentralized decision making. Topics discussed include: (1) control of large scale systems by dynamic team with delayed information sharing; (2) dynamic resource allocation problems by a team (hierarchical structure with a coordinator); and (3) some problems related to the construction of a model of reduced dimension.

  6. Science Information System in Japan. NIER Occasional Paper 02/83.

    ERIC Educational Resources Information Center

    Matsumura, Tamiko

    This paper describes the development of a proposed Japanese Science Information System (SIS), a nationwide network of research and academic libraries, large-scale computer centers, national research institutes, and other organizations, to be formed for the purpose of sharing information and resources in the natural sciences, technology, the…

  7. Availability of Heat to Drive Hydrothermal Systems in Large Martian Impact Craters

    NASA Technical Reports Server (NTRS)

    Thorsos, I. E.; Newsom, H. E.; Davies, A. G.

    2001-01-01

    The central uplift in large craters on Mars can provide a substantial source of heat, equivalent to heat produced by the impact melt sheet. The heat generated in large impacts could play a significant role in hydrothermal systems on Mars. Additional information is contained in the original extended abstract.

  8. SELECTIVE DISSEMINATION OF INFORMATION--REVIEW OF SELECTED SYSTEMS AND A DESIGN FOR ARMY TECHNICAL LIBRARIES. FINAL REPORT. ARMY TECHNICAL LIBRARY IMPROVEMENT STUDIES (ATLIS), REPORT NO. 8.

    ERIC Educational Resources Information Center

    BIVONA, WILLIAM A.

    This report presents an analysis of over eighteen small, intermediate, and large scale systems for the Selective Dissemination of Information (SDI). Systems are compared and analyzed with respect to design criteria and the following nine system parameters: (1) information input, (2) methods of indexing and abstracting, (3) user interest profile…

  9. The evolution of educational information systems and nurse faculty roles.

    PubMed

    Nelson, Ramona; Meyers, Linda; Rizzolo, Mary Anne; Rutar, Pamela; Proto, Marcia B; Newbold, Susan

    2006-01-01

    Institutions of higher education are purchasing and/or designing sophisticated administrative information systems to manage such functions as the application, admissions, and registration process, grants management, student records, and classroom scheduling. Although faculty also manage large amounts of data, few automated systems have been created to help faculty improve teaching and learning through the management of information related to individual students, the curriculum, educational programs, and program evaluation. This article highlights the potential benefits that comprehensive educational information systems offer nurse faculty.

  10. Intranet and HTML at a major university hospital--experiences from Munich.

    PubMed

    Dugas, M

    1997-01-01

    Intranet technology is the application of Internet tools in local networks. With this technique, electronic information systems for large hospitals can be realized very easily. This technology has been in routine use at the Klinikum Grosshadern for more than one year, on over 50 wards and more than 200 computers. The following clinical application areas are described: drug information, nursing information, electronic literature retrieval systems, multimedia teaching, and laboratory information systems.

  11. Stability of large-scale systems.

    NASA Technical Reports Server (NTRS)

    Siljak, D. D.

    1972-01-01

    The purpose of this paper is to present the results obtained in stability study of large-scale systems based upon the comparison principle and vector Liapunov functions. The exposition is essentially self-contained, with emphasis on recent innovations which utilize explicit information about the system structure. This provides a natural foundation for the stability theory of dynamic systems under structural perturbations.

  12. Large Scale Landslide Database System Established for the Reservoirs in Southern Taiwan

    NASA Astrophysics Data System (ADS)

    Tsai, Tsai-Tsung; Tsai, Kuang-Jung; Shieh, Chjeng-Lun

    2017-04-01

    Typhoon Morakot's severe attack on southern Taiwan awakened public awareness of large-scale landslide disasters. These disasters produce large quantities of sediment, with negative effects on the operating functions of reservoirs. In order to reduce the risk of such disasters within the study area, the establishment of a database for hazard mitigation and disaster prevention is necessary. Real-time data and extensive archives of engineering data, environmental information, photos, and video will not only help people make appropriate decisions, but also offer the greatest opportunity for further processing and adding value. This study defined basic data formats and standards for the various types of data collected about these reservoirs, and then provided a management platform based on these formats and standards. Meanwhile, in order to ensure practicality and convenience, the large-scale landslide disaster database system can both provide and receive information, so that users can work with it on different types of devices. Because IT technology progresses extremely quickly, even the most modern system can become outdated at any time; in order to provide long-term service, the system therefore supports user-defined data formats/standards and a user-defined system structure. The system established by this study is based on the HTML5 standard language and uses responsive web design, which makes the database system easy for users to handle and extend.

  13. What information do people use, trust, and find useful during a disaster? Evidence from five large wildfires

    Treesearch

    Toddi A. Steelman; Sarah M. McCaffrey; Anne-Lise Knox Velez; Jason Alexander Briefel

    2015-01-01

    The communication system through which information flows during a disaster can be conceived of as a set of relationships among sources and recipients who are concerned about key information characteristics. The recipient perspective is often neglected within this system. In this article, we explore recipient perspectives related to what information was used, useful,...

  14. Verification of Space Station Secondary Power System Stability Using Design of Experiment

    NASA Technical Reports Server (NTRS)

    Karimi, Kamiar J.; Booker, Andrew J.; Mong, Alvin C.; Manners, Bruce

    1998-01-01

    This paper describes analytical methods used in the verification of large DC power systems, with applications to the International Space Station (ISS). Large DC power systems contain many switching power converters with negative-resistance characteristics. The ISS power system presents numerous stability challenges, such as complex sources and undefined loads. The Space Station program has developed impedance specifications for sources and loads. The overall approach to system stability consists of specific hardware requirements coupled with extensive system analysis and testing. Testing of large, complex distributed power systems is not practical due to the size and complexity of the system, so computer modeling has been used extensively to develop hardware specifications and to identify system configurations for lab testing. The statistical method of Design of Experiments (DoE) is used as an analysis tool for verification of these large systems. DoE reduces the number of computer runs needed to analyze the performance of a complex power system consisting of hundreds of DC/DC converters. DoE also provides valuable information about the effect of changes in system parameters on the performance of the system, about various operating scenarios, and about identification of the ones with potential for instability. In this paper we describe how we have used computer modeling to analyze a large DC power system. A brief description of DoE is given, and examples applying DoE to the analysis and verification of the ISS power system are provided.
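
    The run-count saving that motivates DoE can be made concrete with a small sketch. This is a generic illustration (factor names and counts are hypothetical, not the ISS study's design): a full two-level factorial in k factors needs 2**k runs, while a half-fraction derives the last factor's setting from the product of the others (defining relation I = AB…K), halving the number of runs.

    ```python
    from itertools import product

    def full_factorial(k):
        """All 2**k combinations of k two-level (-1/+1) factors."""
        return list(product([-1, +1], repeat=k))

    def half_fraction(k):
        """2**(k-1) runs: the k-th factor is aliased to the product of the
        first k-1 factors, the classic half-fraction construction."""
        runs = []
        for base in product([-1, +1], repeat=k - 1):
            last = 1
            for v in base:
                last *= v
            runs.append(base + (last,))
        return runs
    ```

    With, say, seven converter or filter parameters, the full factorial needs 128 simulation runs while the half fraction needs 64, and deeper fractions reduce the count further at the cost of confounding higher-order interactions.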

  15. Computerization of Library and Information Services in Mainland China.

    ERIC Educational Resources Information Center

    Lin, Sharon Chien

    1994-01-01

    Describes two phases of the automation of library and information services in mainland China. From 1974-86, much effort was concentrated on developing computer systems, databases, online retrieval, and networking. From 1986 to the present, practical progress became possible largely because of CD-ROM technology; and large scale networking for…

  16. Advanced techniques for the storage and use of very large, heterogeneous spatial databases

    NASA Technical Reports Server (NTRS)

    Peuquet, Donna J.

    1987-01-01

    Progress is reported in the development of a prototype knowledge-based geographic information system. The overall purpose of this project is to investigate and demonstrate the use of advanced methods in order to greatly improve the capabilities of geographic information system technology in the handling of large, multi-source collections of spatial data in an efficient manner, and to make these collections of data more accessible and usable for the Earth scientist.

  17. The 'cube' meta-model for the information system of large health sector organizations--a (platform neutral) mapping tool to integrate information system development with changing business functions and organizational development.

    PubMed

    Balkányi, László

    2002-01-01

    To develop information systems (IS) in the changing environment of the health sector, a simple but thorough model that avoids the techno-jargon of informatics can be useful to top management. A platform-neutral, extensible, transparent conceptual model should be established. Limitations of current methods led to a simple but comprehensive mapping in the form of a three-dimensional cube. The three 'orthogonal' views are (a) organizational functionality, (b) organizational structures and (c) information technology. Each of the cube's sides is described according to its nature. This approach makes it possible to define any IS component as a certain point/layer/domain of the cube, and enables management to label all IS components independently from any supplier(s) and/or any specific platform. The model handles changes in organizational structure, business functionality and the serving information system independently from each other. Practical applications extend to (a) planning complex new ISs, (b) guiding development of multi-vendor, multi-site ISs, (c) supporting large-scale public procurement procedures and the contracting and implementation phases by establishing a platform-neutral reference, and (d) keeping an exhaustive inventory of an existing large-scale system that handles non-tangible aspects of the IS.
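
    The cube mapping lends itself to a very small data model. This is a minimal sketch with invented example values, not the paper's implementation: each IS component is labelled by one coordinate on each of the three orthogonal views, and an inventory can then be filtered platform-neutrally along any axis.

    ```python
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class CubePoint:
        function: str      # (a) organizational functionality, e.g. "billing"
        structure: str     # (b) organizational structure, e.g. "radiology dept"
        technology: str    # (c) information technology layer, e.g. "database"

    # Hypothetical inventory entries for illustration.
    inventory = [
        CubePoint("billing", "finance dept", "application"),
        CubePoint("imaging", "radiology dept", "database"),
        CubePoint("billing", "finance dept", "database"),
    ]

    def components(axis: str, value: str):
        """All components whose coordinate on `axis` equals `value`."""
        return [c for c in inventory if getattr(c, axis) == value]
    ```

    Because each axis is independent, a change on one view (say, a departmental reorganization) relabels only that coordinate, which is the sense in which the model handles organizational and technological change independently.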

  18. Operating tool for a distributed data and information management system

    NASA Astrophysics Data System (ADS)

    Reck, C.; Mikusch, E.; Kiemle, S.; Wolfmüller, M.; Böttcher, M.

    2002-07-01

    The German Remote Sensing Data Center has developed the Data Information and Management System (DIMS), which provides multi-mission ground system services for Earth observation product processing, archiving, ordering and delivery. DIMS successfully uses the newest technologies within its services. This paper presents the solution taken to simplify operation tasks for this large and distributed system.

  19. The preparedness of hospital Health Information Services for system failures due to internal disasters.

    PubMed

    Lee, Cheens; Robinson, Kerin M; Wendt, Kate; Williamson, Dianne

    The unimpeded functioning of hospital Health Information Services (HIS) is essential for patient care, clinical governance, organisational performance measurement, funding and research. In an investigation of hospital Health Information Services' preparedness for internal disasters, all hospitals in the state of Victoria with the following characteristics were surveyed: they have a Health Information Service/Department; there is a Manager of the Health Information Service/Department; and their inpatient capacity is greater than 80 beds. Fifty percent of the respondents have experienced an internal disaster within the past decade, the majority affecting the Health Information Service. The most commonly occurring internal disasters were computer system failure and floods. Two-thirds of the hospitals have internal disaster plans; the most frequently occurring scenarios provided for are computer system failure, power failure and fire. More large hospitals have established back-up systems than medium- and small-size hospitals. Fifty-three percent of hospitals have a recovery plan for internal disasters. Hospitals typically self-rate as having a 'medium' level of internal disaster preparedness. Overall, large hospitals are better prepared for internal disasters than medium and small hospitals, and preparation for disruption of computer systems and medical record services is relatively high on their agendas.

  20. Distributed Processing of Projections of Large Datasets: A Preliminary Study

    USGS Publications Warehouse

    Maddox, Brian G.

    2004-01-01

    Modern information needs have resulted in very large amounts of data being used in geographic information systems. Problems arise when trying to project these data in a reasonable amount of time and accuracy, however. Current single-threaded methods can suffer from two problems: fast projection with poor accuracy, or accurate projection with long processing time. A possible solution may be to combine accurate interpolation methods and distributed processing algorithms to quickly and accurately convert digital geospatial data between coordinate systems. Modern technology has made it possible to construct systems, such as Beowulf clusters, for a low cost and provide access to supercomputer-class technology. Combining these techniques may result in the ability to use large amounts of geographic data in time-critical situations.
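
    The combination described above, an accurate per-point projection distributed over many workers, can be sketched as follows. This is a minimal illustration, with a thread pool standing in for a cluster's worker nodes and the standard Web Mercator forward formula used as an example projection (not necessarily the one studied).

```python
import math
from concurrent.futures import ThreadPoolExecutor

R = 6378137.0  # WGS84 spherical radius used by Web Mercator, in meters

def project(point):
    """Forward Web Mercator: (lon, lat) in degrees -> (x, y) in meters."""
    lon, lat = point
    x = R * math.radians(lon)
    y = R * math.log(math.tan(math.pi / 4 + math.radians(lat) / 2))
    return x, y

def project_chunk(chunk):
    return [project(p) for p in chunk]

def parallel_project(points, workers=4):
    # Split the dataset into chunks, as a cluster scheduler might distribute
    # tiles of a large raster or vector dataset across nodes.
    size = max(1, len(points) // workers)
    chunks = [points[i:i + size] for i in range(0, len(points), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(project_chunk, chunks)  # order-preserving
    return [p for chunk in results for p in chunk]

pts = [(0.0, 0.0), (180.0, 0.0), (-75.0, 45.0)]
print(parallel_project(pts)[0])  # (0.0, 0.0)
```

    On a real Beowulf-style cluster the chunks would be dispatched to separate machines (e.g. via MPI) rather than threads, but the decomposition pattern is the same.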

  1. A pilot's subjective analysis of a Cockpit Display of Traffic Information (CDTI). [terminal configured vehicle

    NASA Technical Reports Server (NTRS)

    Keyser, G. L., Jr.

    1981-01-01

    Both the advent of electronic displays for cockpit applications and the availability of high-capacity data transmission systems, linking aircraft with ATC ground computers, offer the opportunity of expanding the pilot's role in the distributive management process. A critical element in this process is believed to be the presentation to the pilot of his traffic situation. A representative cockpit display of traffic information (CDTI) system is presented as viewed from the pilot in the cockpit, and the research results from flight tests are presented. The use of advanced controls and displays allows large quantities of information to be presented to the pilot that he has not had before. The real challenge in the design of an operational CDTI system will be satisfying the need for information by presenting all necessary information, and only the necessary information, in a usable format in order to avoid clutter. Even though a reasonably large display was utilized in these tests, display clutter was the primary problem from the standpoint of information assimilation.

  2. Supporting Knowledge Transfer in IS Deployment Projects

    NASA Astrophysics Data System (ADS)

    Schönström, Mikael

    Deploying new information systems is an expensive and complex task, and seldom results in successful usage where the system adds strategic value to the firm (e.g. Sharma et al. 2003). It has been argued that innovation diffusion is a knowledge integration problem (Newell et al. 2000). Knowledge about business processes, deployment processes, information systems and technology is needed in a large-scale deployment of a corporate IS. These deployments can therefore to a large extent be argued to be a knowledge management (KM) problem. An effective deployment requires that knowledge about the system is effectively transferred to the target organization (Ko et al. 2005).

  3. Graduating to Postdoc: Information-Sharing in Support of Organizational Structures and Needs

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Lucas, Paul J.; Compton, Michael M.; Stewart, Helen J.; Baya, Vinod; DelAlto, Martha

    1999-01-01

    The deployment of information-sharing systems in large organizations can significantly impact existing policies and procedures with regard to authority and control over information. Unless information-sharing systems explicitly support organizational structures and needs, these systems will be rejected summarily. The Postdoc system is a deployed Web-based information-sharing system created specifically to address organizational needs. Postdoc contains various organizational support features including a shared, globally navigable document space, as well as specialized access control, distributed administration, and mailing list features built around the key notion of hierarchical group structures. We review successes and difficulties in supporting organizational needs with Postdoc.

  4. Medical Information Management System (MIMS) CareWindows.

    PubMed Central

    Stiphout, R. M.; Schiffman, R. M.; Christner, M. F.; Ward, R.; Purves, T. M.

    1991-01-01

    The demonstration of MIMS/CareWindows will include: (1) a review of the application environment and development history, (2) a demonstration of a very large, comprehensive clinical information system with a cost effective graphic user server and communications interface. PMID:1807755

  5. Practical aspects of handling data protection and data security.

    PubMed

    Louwerse, C P

    1991-01-01

    Looking at practical applications of health care information systems, we must conclude that in the field of data protection there is still too large a gap between what is feasible and necessary on the one hand, and what is achieved in actual implementations on the other. To illustrate this point, we sketch the actual data protection measures in a large hospital information system and describe the effects of changes affecting the system, such as the increasing use of personal computers and the growing intensity of use of the system. Trends in the development of new and additional systems are indicated, a summary of possible weak points and gaps in the security is given, and some suggestions for improvement are made.

  6. IQARIS : a tool for the intelligent querying, analysis, and retrieval from information systems.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hummel, J. R.; Silver, R. B.

    Information glut is one of the primary characteristics of the electronic age. Managing such large volumes of information (e.g., keeping track of the types, where they are, their relationships, who controls them, etc.) can be done efficiently with an intelligent, user-oriented information management system. The purpose of this paper is to describe a concept for managing information resources based on an intelligent information technology system developed by the Argonne National Laboratory for managing digital libraries. The Argonne system, Intelligent Query (IQ), enables users to query digital libraries and view the holdings that match the query from different perspectives.

  7. A clinical information systems strategy for a large integrated delivery network.

    PubMed Central

    Kuperman, G. J.; Spurr, C.; Flammini, S.; Bates, D.; Glaser, J.

    2000-01-01

    Integrated delivery networks (IDNs) are an emerging class of health care institutions. IDNs are formed from the affiliation of individual health care institutions and are intended to be more efficient in the current fiscal health care environment. To realize efficiencies and support their strategic visions, IDNs rely critically on excellent information technology (IT). Because of its importance to the mission of the IDN, strategic decisions about IT are made by the top leadership of the IDN. At Partners HealthCare System, a large IDN in Boston, MA, a clinical information systems strategy has been created to support the Partners clinical vision. In this paper, we discuss the Partners' structure, clinical vision, and current IT initiatives in place to address the clinical vision. The initiatives are: a clinical data repository, inpatient process support, electronic medical records, a portal strategy, referral applications, knowledge resources, support for product lines, patient computing, confidentiality, and clinical decision support. We address several of the issues encountered in trying to bring excellent information technology to a large IDN. PMID:11079921

  8. A Survey of the Current Situation of Clinical Biobanks in China.

    PubMed

    Li, Haiyan; Ni, Mingyu; Wang, Peng; Wang, Xiaomin

    2017-06-01

    The development of biomedical research urgently needs the support of a large number of high-quality clinical biospecimens. Therefore, human biobanks at different levels have been established successively in China and other countries at a significantly increasing pace in recent years. To better understand the general current state of clinical biobanks in China, we surveyed 42 clinical biobanks based in hospitals and collected information involving their management systems, sharing mechanisms, quality control systems, and informational management systems using closed questionnaire methods. Based on our current information, there has not been such a large-scale survey in China. An understanding of the status and challenges current clinical biobanks face will provide valuable insights for the construction and sustainable development of higher quality clinical biobanks.

  9. On the concept of the interactive information and simulation system for gas dynamics and multiphysics problems

    NASA Astrophysics Data System (ADS)

    Bessonov, O.; Silvestrov, P.

    2017-02-01

    This paper describes the general idea and the first implementation of the Interactive information and simulation system - an integrated environment that combines computational modules for modeling the aerodynamics and aerothermodynamics of re-entry space vehicles with a large collection of different information materials on this topic. The internal organization and the composition of the system are described and illustrated. Examples of the computational and information output are presented. The system has a unified implementation for the Windows and Linux operating systems and can be deployed on any modern high-performance personal computer.

  10. Beliefs and Attitudes Associated with ERP Adoption Behaviours: A Grounded Theory Study from IT Manager and End-user Perspectives

    NASA Astrophysics Data System (ADS)

    Arunthari, Santipat; Hasan, Helen

    (1998, p. 121) defines an Enterprise Resource Planning (ERP) system as an enterprise system that promises seamless integration of all information flowing through a company, including financial and accounting information, human resource information, supply chain information, and customer information. ERP systems came on the scene in the early 1990s as a response to the proliferation of standalone business applications servicing these separate information needs in most large organisations. Enterprise-wide projects requiring integrated approaches to organisational operations and information management, such as data warehousing, were inhibited by a proliferation of incompatible off-the-shelf packages, in-house developments and aging legacy systems.

  11. Concepts of Management Information Systems.

    ERIC Educational Resources Information Center

    Emery, J.C.

    The paper attempts to provide a general framework for dealing with management information systems (MIS). An MIS is defined to have the following characteristics: (1) related to ongoing activities of an organization, (2) a man-machine system, (3) composed of a collection of subsystems, and (4) oriented around a large data base. An MIS places a…

  12. The Clinical Practice Library of Medicine (CPLM): An on-line biomedical computer library. System documentation

    NASA Technical Reports Server (NTRS)

    Grams, R. R.

    1982-01-01

    A system designed to access a large range of available medical textbook information in an online interactive fashion is described. A high level query type database manager, INQUIRE, is used. Operating instructions, system flow diagrams, database descriptions, text generation, and error messages are discussed. User information is provided.

  13. Challenges in Managing Trustworthy Large-scale Digital Science

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.

    2017-12-01

    The increased use of large-scale international digital science has opened a number of challenges for managing, handling, using and preserving scientific information. The large volumes of information are driven by three main categories - model outputs, including coupled models and ensembles; data products that have been processed to a level of usability; and increasingly heuristically driven data analysis. These data products are increasingly the ones that are usable by the broad communities, far in excess of the raw instrument outputs. The data, software and workflows are then shared and replicated to allow broad use at an international scale, which places further demands on infrastructure to support reliable management of the information across distributed resources. Users necessarily rely on these underlying "black boxes" so that they can be productive in producing new scientific outcomes. The software for these systems depends on computational infrastructure, interconnected software systems, and information capture systems, ranging from the fundamental reliability of the compute hardware, system software stacks and libraries, to the model software itself. Due to these complexities and the capacity of the infrastructure, there is an increased emphasis on transparency of approach and robustness of methods over full reproducibility. Furthermore, with large-volume data management it is increasingly difficult to store the historical versions of all model and derived data. Instead, the emphasis is on the ability to access the updated products and on the reliability with which previous outcomes remain relevant and can be updated with the new information. We will discuss these challenges and some of the approaches underway to address these issues.

  14. A GENERAL SIMULATION MODEL FOR INFORMATION SYSTEMS: A REPORT ON A MODELLING CONCEPT

    DTIC Science & Technology

    The report is concerned with the design of large-scale management information systems (MIS). A special design methodology was created, along with a design model to complement it. The purpose of the paper is to present the model.

  15. The South Australian Department of Mines and Energy Bibliography Retrieval System.

    ERIC Educational Resources Information Center

    Mannik, Maire

    1980-01-01

    Described is the South Australian Department of Mines and Energy Bibliography Retrieval System which is a repository for a large amount of geological and related information. Instructions for retrieval are outlined, and the coding information procedures are given. (DS)

  16. Automated Methods to Extract Patient New Information from Clinical Notes in Electronic Health Record Systems

    ERIC Educational Resources Information Center

    Zhang, Rui

    2013-01-01

    The widespread adoption of Electronic Health Record (EHR) has resulted in rapid text proliferation within clinical care. Clinicians' use of copying and pasting functions in EHR systems further compounds this by creating a large amount of redundant clinical information in clinical documents. A mixture of redundant information (especially outdated…

  17. User Oriented Techniques to Support Interaction and Decision Making with Large Educational Databases

    ERIC Educational Resources Information Center

    Hartley, Roger; Almuhaidib, Saud M. Y.

    2007-01-01

    Information Technology is developing rapidly and providing policy/decision makers with large amounts of information that require processing and analysis. Decision support systems (DSS) aim to provide tools that not only help such analyses, but enable the decision maker to experiment and simulate the effects of different policies and selection…

  18. Multiresource inventories incorporating GIS, GPS, and database management systems

    Treesearch

    Loukas G. Arvanitis; Balaji Ramachandran; Daniel P. Brackett; Hesham Abd-El Rasol; Xuesong Du

    2000-01-01

    Large-scale natural resource inventories generate enormous data sets. Their effective handling requires a sophisticated database management system. Such a system must be robust enough to efficiently store large amounts of data and flexible enough to allow users to manipulate a wide variety of information. In a pilot project, related to a multiresource inventory of the...

  19. A Semantics-Based Information Distribution Framework for Large Web-Based Course Forum System

    ERIC Educational Resources Information Center

    Chim, Hung; Deng, Xiaotie

    2008-01-01

    We propose a novel data distribution framework for developing a large Web-based course forum system. In the distributed architectural design, each forum server is fully equipped with the ability to support some course forums independently. The forum servers collaborating with each other constitute the whole forum system. Therefore, the workload of…

  20. Creating databases for biological information: an introduction.

    PubMed

    Stein, Lincoln

    2002-08-01

    The essence of bioinformatics is dealing with large quantities of information. Whether it be sequencing data, microarray data files, mass spectrometric data (e.g., fingerprints), the catalog of strains arising from an insertional mutagenesis project, or even large numbers of PDF files, there inevitably comes a time when the information can simply no longer be managed with files and directories. This is where databases come into play. This unit briefly reviews the characteristics of several database management systems, including flat file, indexed file, and relational databases, as well as ACeDB. It compares their strengths and weaknesses and offers some general guidelines for selecting an appropriate database management system.
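
    As a minimal illustration of the relational option the unit reviews, a small strain catalog can be managed with Python's built-in SQLite rather than files and directories. The schema and data below are invented for illustration.

```python
import sqlite3

# In-memory relational database; a flat file would force linear scans instead.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE strain (
        id INTEGER PRIMARY KEY,
        name TEXT NOT NULL,
        gene TEXT,
        phenotype TEXT
    )
""")
conn.executemany(
    "INSERT INTO strain (name, gene, phenotype) VALUES (?, ?, ?)",
    [("S101", "unc-22", "twitcher"),
     ("S102", "dpy-10", "dumpy"),
     ("S103", "unc-22", "twitcher")],
)

# Declarative lookup by gene: the kind of query that outgrows ad hoc files.
rows = conn.execute(
    "SELECT name FROM strain WHERE gene = ? ORDER BY name", ("unc-22",)
).fetchall()
print([r[0] for r in rows])  # ['S101', 'S103']
```

    The same catalog in a flat file would require custom parsing code for every new query, which is the trade-off the unit's comparison of flat, indexed, and relational systems turns on.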

  1. Geospatial Technologies and Higher Education in Argentina

    ERIC Educational Resources Information Center

    Leguizamon, Saturnino

    2010-01-01

    The term "geospatial technologies" encompasses a large area of fields involving cartography, spatial analysis, geographic information system, remote sensing, global positioning systems and many others. These technologies should be expected to be available (as "natural tools") for a country with a large surface and a variety of…

  2. Information technologies for astrophysics circa 2001

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1990-01-01

    It is easy to extrapolate current trends to see where technologies relating to information systems in astrophysics and other disciplines will be by the end of the decade. These technologies include miniaturization, multiprocessing, software technology, networking, databases, graphics, pattern computation, and interdisciplinary studies. It is also easy to see what limits our current paradigms place on our thinking about technologies that will allow us to understand the laws governing very large systems about which we have large datasets. Three limiting paradigms are: saving all the bits collected by instruments or generated by supercomputers; obtaining technology for information compression, storage and retrieval off the shelf; and the linear mode of innovation. We must extend these paradigms to meet our goals for information technology at the end of the decade.

  3. Experimental violation of Bell inequalities for multi-dimensional systems

    PubMed Central

    Lo, Hsin-Pin; Li, Che-Ming; Yabushita, Atsushi; Chen, Yueh-Nan; Luo, Chih-Wei; Kobayashi, Takayoshi

    2016-01-01

    Quantum correlations between spatially separated parts of a d-dimensional bipartite system (d ≥ 2) have no classical analog. Such correlations, also called entanglement, are not only conceptually important, but also have a profound impact on information science. In theory, the violation of Bell inequalities based on local realistic theories for d-dimensional systems provides evidence of quantum nonlocality. Experimental verification is required to confirm whether a quantum system of extremely large dimension can possess this feature; however, it had never been performed for large dimensions. Here, we report that Bell inequalities are experimentally violated for bipartite quantum systems of dimensionality d = 16 with the usual ensembles of polarization-entangled photon pairs. We also estimate that our entanglement source violates Bell inequalities for extremely high dimensionality of d > 4000. The designed scenario offers a possible new method to investigate the entanglement of multipartite systems of large dimensionality and their application in quantum information processing. PMID:26917246
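
    For the simplest case (d = 2), the kind of violation this experiment generalizes can be checked numerically. The sketch below uses the standard CHSH settings and the ideal polarization correlation E(a, b) = cos 2(a - b) for a maximally entangled photon pair; this is textbook material, not this paper's data.

```python
import math

def E(a, b):
    # Ideal polarization correlation for a maximally entangled (Phi+)
    # photon pair: E(a, b) = cos 2(a - b), with angles in radians.
    return math.cos(2 * (a - b))

# Standard CHSH measurement settings (degrees converted to radians).
a, a2 = math.radians(0), math.radians(45)
b, b2 = math.radians(22.5), math.radians(67.5)

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(S)      # ~2.828, i.e. 2*sqrt(2), the Tsirelson bound
assert S > 2  # exceeds the local-realistic bound |S| <= 2
```

    Local realistic theories cap |S| at 2, while quantum mechanics reaches 2√2; the high-dimensional inequalities tested in the paper play the same role with a higher classical bound.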

  4. Research on parallel combinatory spread spectrum communication system with double information matching

    NASA Astrophysics Data System (ADS)

    Xue, Wei; Wang, Qi; Wang, Tianyu

    2018-04-01

    This paper presents an improved parallel combinatory spread spectrum (PC/SS) communication system based on double information matching (DIM). Compared with a conventional PC/SS system, the new model inherits the advantages of high transmission speed, large information capacity and high security. However, the traditional system suffers from a high bit error rate (BER) arising from its data-sequence mapping algorithm. The model presented here achieves a lower BER and higher efficiency through its optimization of the mapping algorithm.

  5. Managing Vocabulary Mapping Services

    PubMed Central

    Che, Chengjian; Monson, Kent; Poon, Kasey B.; Shakib, Shaun C.; Lau, Lee Min

    2005-01-01

    The efficient management and maintenance of large-scale and high-quality vocabulary mapping is an operational challenge. The 3M Health Information Systems (HIS) Healthcare Data Dictionary (HDD) group developed an information management system to provide controlled mapping services, resulting in improved efficiency and quality maintenance. PMID:16779203

  6. Data Representations for Geographic Information Systems.

    ERIC Educational Resources Information Center

    Shaffer, Clifford A.

    1992-01-01

    Surveys the field and literature of geographic information systems (GIS) and spatial data representation as it relates to GIS. Highlights include GIS terms, data types, and operations; vector representations and raster, or grid, representations; spatial indexing; elevation data representations; large spatial databases; and problem areas and future…

  7. Enhanced Information Retrieval Using AJAX

    NASA Astrophysics Data System (ADS)

    Kachhwaha, Rajendra; Rajvanshi, Nitin

    2010-11-01

    Information retrieval deals with the representation, storage, organization of, and access to information items. The representation and organization of information items should provide the user with easy access to the information. With the rapid development of the Internet, large amounts of digitally stored information are readily available on the World Wide Web. This information is so vast that it becomes increasingly difficult and time consuming for users to find the information relevant to their needs. The explosive growth of information on the Internet has greatly increased the need for information retrieval systems. However, most search engines use conventional information retrieval systems. An information system needs to implement sophisticated pattern matching tools to determine contents at a faster rate. AJAX has recently emerged as a new tool with which the information retrieval process can become fast, so that information reaches the user at a faster pace than with conventional retrieval systems.

  8. Software engineering principles applied to large healthcare information systems--a case report.

    PubMed

    Nardon, Fabiane Bizinella; de A Moura, Lincoln

    2007-01-01

    São Paulo is the largest city in Brazil and one of the largest cities in the world. In 2004, the São Paulo City Department of Health decided to implement a Healthcare Information System to support managing healthcare services and provide an ambulatory health record. The resulting information system is one of the largest public healthcare information systems ever built, with more than 2 million lines of code. Although statistics show that most software projects fail, and the risks for the São Paulo initiative were enormous, the information system was completed on time and on budget. In this paper, we discuss the software engineering principles adopted that allowed the project's goals to be accomplished, hoping that sharing the experience of this project will help other healthcare information system initiatives to succeed.

  9. Rapid learning: a breakthrough agenda.

    PubMed

    Etheredge, Lynn M

    2014-07-01

    A "rapid-learning health system" was proposed in a 2007 thematic issue of Health Affairs. The system was envisioned as one that uses evidence-based medicine to quickly determine the best possible treatments for patients. It does so by drawing on electronic health records and the power of big data to access large volumes of information from a variety of sources at high speed. The foundation for a rapid-learning health system was laid during 2007-13 by workshops, policy papers, large public investments in databases and research programs, and developing learning systems. Challenges now include implementing a new clinical research system with several hundred million patients, modernizing clinical trials and registries, devising and funding research on national priorities, and analyzing genetic and other factors that influence diseases and responses to treatment. Next steps also should aim to improve comparative effectiveness research; build on investments in health information technology to standardize handling of genetic information and support information exchange through apps and software modules; and develop new tools, data, and information for clinical decision support. Further advances will require commitment, leadership, and public-private and global collaboration. Project HOPE—The People-to-People Health Foundation, Inc.

  10. Developing Data Systems To Support the Analysis and Development of Large-Scale, On-Line Assessment.

    ERIC Educational Resources Information Center

    Yu, Chong Ho

    Today many data warehousing systems are data rich, but information poor. Extracting useful information from an ocean of data to support administrative, policy, and instructional decisions becomes a major challenge to both database designers and measurement specialists. This paper focuses on the development of a data processing system that…

  11. SHOEBOX: A Personal File Handling System for Textual Data. Information System Language Studies, Number 23.

    ERIC Educational Resources Information Center

    Glantz, Richard S.

    Until recently, the emphasis in information storage and retrieval systems has been towards batch-processing of large files. In contrast, SHOEBOX is designed for the unformatted, personal file collection of the computer-naive individual. Operating through display terminals in a time-sharing, interactive environment on the IBM 360, the user can…

  12. Development of a database system for mapping insertional mutations onto the mouse genome with large-scale experimental data

    PubMed Central

    2009-01-01

    Background: Insertional mutagenesis is an effective method for functional genomic studies in various organisms. It can rapidly generate easily tractable mutations. A large-scale insertional mutagenesis with the piggyBac (PB) transposon is currently being performed in mice at the Institute of Developmental Biology and Molecular Medicine (IDM), Fudan University in Shanghai, China. This project is carried out via collaborations among multiple groups overseeing interconnected experimental steps and generates a large volume of experimental data continuously. Therefore, the project calls for an efficient database system for recording, management, statistical analysis, and information exchange. Results: This paper presents a database application called MP-PBmice (insertional mutation mapping system of PB Mutagenesis Information Center), which was developed to serve the on-going large-scale PB insertional mutagenesis project. A lightweight enterprise-level development framework, Struts-Spring-Hibernate, is used to provide solid and flexible support for the application. The MP-PBmice database system has three major features: strict access control, efficient workflow control, and good expandability. It supports the collaboration among different groups that enter data and exchange information on a daily basis, and is capable of providing real-time progress reports for the whole project. MP-PBmice can be easily adapted for other large-scale insertional mutation mapping projects, and the source code of this software is freely available at http://www.idmshanghai.cn/PBmice. Conclusion: MP-PBmice is a web-based application for large-scale insertional mutation mapping onto the mouse genome, implemented with the widely used Struts-Spring-Hibernate framework. This system is already in use by the on-going genome-wide PB insertional mutation mapping project at IDM, Fudan University. PMID:19958505

  13. Semantic Likelihood Models for Bayesian Inference in Human-Robot Interaction

    NASA Astrophysics Data System (ADS)

    Sweet, Nicholas

    Autonomous systems, particularly unmanned aerial systems (UAS), remain limited in autonomous capabilities largely due to a poor understanding of their environment. Current sensors simply do not match human perceptive capabilities, impeding progress towards full autonomy. Recent work has shown the value of humans as sources of information within a human-robot team; in target applications, communicating human-generated 'soft data' to autonomous systems enables higher levels of autonomy through large, efficient information gains. This requires development of a 'human sensor model' that allows soft data fusion through Bayesian inference to update the probabilistic belief representations maintained by autonomous systems. Current human sensor models that capture linguistic inputs as semantic information are limited in their ability to generalize likelihood functions for semantic statements: they may be learned from dense data; they do not exploit the contextual information embedded within groundings; and they often limit human input to restrictive and simplistic interfaces. This work provides mechanisms to synthesize human sensor models from constraints based on easily attainable a priori knowledge, develops compression techniques to capture information-dense semantics, and investigates the problem of capturing and fusing semantic information contained within unstructured natural language. A robotic experimental testbed is also developed to validate the above contributions.
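
    The Bayesian fusion step described above can be sketched minimally: a semantic human observation (say, "the target is near the tree") enters as a likelihood over discrete states and updates the robot's belief via Bayes' rule. The grid cells and likelihood values below are invented for illustration.

```python
# Discrete Bayesian fusion of a semantic 'soft data' observation into a
# robot's belief over grid cells. Likelihood values are illustrative.
cells = ["tree", "road", "building"]
belief = {c: 1 / 3 for c in cells}  # uniform prior over target location

# Human sensor model: P("near the tree" is said | target in cell).
likelihood = {"tree": 0.8, "road": 0.15, "building": 0.05}

def bayes_update(belief, likelihood):
    posterior = {c: belief[c] * likelihood[c] for c in belief}
    z = sum(posterior.values())  # normalizer P(statement)
    return {c: p / z for c, p in posterior.items()}

belief = bayes_update(belief, likelihood)
print(max(belief, key=belief.get))  # 'tree'
```

    A real human sensor model would define such likelihoods over continuous state spaces and many semantic statements; the update rule itself is unchanged.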

  14. Survey of decentralized control methods. [for large scale dynamic systems]

    NASA Technical Reports Server (NTRS)

    Athans, M.

    1975-01-01

    An overview is presented of the types of problems that are being considered by control theorists in the area of dynamic large scale systems with emphasis on decentralized control strategies. Approaches that deal directly with decentralized decision making for large scale systems are discussed. It is shown that future advances in decentralized system theory are intimately connected with advances in the stochastic control problem with nonclassical information pattern. The basic assumptions and mathematical tools associated with the latter are summarized, and recommendations concerning future research are presented.

  15. Construction of In-house Databases in a Corporation

    NASA Astrophysics Data System (ADS)

    Nishikawa, Takaya

    The author describes the progress and present status of the information management system at the research laboratories of a pharmaceutical R & D organization. The system deals with three fundamental types of information: graphic, numeric, and textual, the last of which can embed the former two. The author and colleagues have constructed the system so that these kinds of information can be processed in an integrated manner. A notable feature is that text in its natural form, mixing Japanese (2-byte) and English (1-byte) characters as produced on personal computers and word processors, can be processed on large-scale computers. The system was designed primarily for research administrators, but it is also useful for researchers. At present seven databases are available, including external databases, and the system is always ready to accept new databases.

  16. Information technology aided exploration of system design spaces

    NASA Technical Reports Server (NTRS)

    Feather, Martin S.; Kiper, James D.; Kalafat, Selcuk

    2004-01-01

    We report on a practical application of information technology techniques to help system engineers effectively explore large design spaces. We make use of heuristic search, visualization, and data mining, the combination of which we have implemented within a risk management tool in use at JPL and NASA.

  17. Online POMDP Algorithms for Very Large Observation Spaces

    DTIC Science & Technology

    2017-06-06

    Luo, Yuanfu, Haoyu Bai, … and Wee Sun Lee. "Adaptive stochastic optimization: From sets to paths." In Advances in Neural Information Processing Systems, pp. 1585-1593. 2015.

  18. Customer Relationship Management Systems - Why Many Large Companies Do Not Have Them?

    NASA Astrophysics Data System (ADS)

    Cunha, Manuela; Varajão, João; Santana, Daniela; Bentes, Isabel

    Today's information technologies are deeply embedded in the reality of organizations. Their role is essential not only in optimizing internal processes, but also in the interaction between the company and its environment. In this context, Customer Relationship Management (CRM) systems are powerful competitive tools in many different sectors of activity. Despite the undeniable importance of these systems, in practice many large companies do not use them. Supported by the results of a survey carried out on a sample of large enterprises, this paper seeks to answer the research question "why many large companies do not have CRM systems".

  19. Detecting crop population growth using chlorophyll fluorescence imaging.

    PubMed

    Wang, Heng; Qian, Xiangjie; Zhang, Lan; Xu, Sailong; Li, Haifeng; Xia, Xiaojian; Dai, Liankui; Xu, Liang; Yu, Jingquan; Liu, Xu

    2017-12-10

    For both field and greenhouse crops, it is challenging to evaluate growth information over a large area and a long time. In this work, we developed a chlorophyll fluorescence imaging-based system for detecting crop population growth information. A modular design was used so that the system provides high-intensity, uniform illumination. The system can perform modulated chlorophyll fluorescence induction kinetics measurements and chlorophyll fluorescence parameter imaging over an area of up to 45 cm × 34 cm, and it can provide different lighting intensities by modulating the duty cycle of its control signal. Results of continuous monitoring of cucumbers under nitrogen deficiency show that the system can reduce errors in judging crop physiological status and improve monitoring efficiency. The system is also promising for high-throughput application scenarios.

  20. A Description of the Text Data Base System TDBS. Stockholm Papers in Library and Information Science.

    ERIC Educational Resources Information Center

    Lofstrom, Mats

    Because experience with large information retrieval (IR) and database management (DBM) systems has shown that they are not adequate for the handling of textual material, two Swedish companies--Paralog and AU-System Network--have joined in a venture to develop a software package which combines features from IR and DBM systems to form a Text Data…

  1. Workflow management in large distributed systems

    NASA Astrophysics Data System (ADS)

    Legrand, I.; Newman, H.; Voicu, R.; Dobre, C.; Grigoras, C.

    2011-12-01

    The MonALISA (Monitoring Agents using a Large Integrated Services Architecture) framework provides a distributed service system capable of controlling and optimizing large-scale, data-intensive applications. An essential part of managing large-scale, distributed data-processing facilities is a monitoring system for computing facilities, storage, networks, and the very large number of applications running on these systems in near realtime. All this monitoring information gathered for all the subsystems is essential for developing the required higher-level services—the components that provide decision support and some degree of automated decisions—and for maintaining and optimizing workflow in large-scale distributed systems. These management and global optimization functions are performed by higher-level agent-based services. We present several applications of MonALISA's higher-level services including optimized dynamic routing, control, data-transfer scheduling, distributed job scheduling, dynamic allocation of storage resource to running jobs and automated management of remote services among a large set of grid facilities.

  2. WebCIS: large scale deployment of a Web-based clinical information system.

    PubMed

    Hripcsak, G; Cimino, J J; Sengupta, S

    1999-01-01

    WebCIS is a Web-based clinical information system. It sits atop the existing Columbia University clinical information system architecture, which includes a clinical repository, the Medical Entities Dictionary, an HL7 interface engine, and an Arden Syntax based clinical event monitor. WebCIS security features include authentication with secure tokens, authorization maintained in an LDAP server, SSL encryption, permanent audit logs, and application time outs. WebCIS is currently used by 810 physicians at the Columbia-Presbyterian center of New York Presbyterian Healthcare to review and enter data into the electronic medical record. Current deployment challenges include maintaining adequate database performance despite complex queries, replacing large numbers of computers that cannot run modern Web browsers, and training users that have never logged onto the Web. Although the raised expectations and higher goals have increased deployment costs, the end result is a far more functional, far more available system.

  3. Automated information retrieval using CLIPS

    NASA Technical Reports Server (NTRS)

    Raines, Rodney Doyle, III; Beug, James Lewis

    1991-01-01

    Expert systems have considerable potential to assist computer users in managing the large volume of information available to them. One possible use of an expert system is to model the information retrieval interests of a human user and then make recommendations to the user as to articles of interest. At Cal Poly, a prototype expert system written in the C Language Integrated Production System (CLIPS) serves as an Automated Information Retrieval System (AIRS). AIRS monitors a user's reading preferences, develops a profile of the user, and then evaluates items returned from the information base. When prompted by the user, AIRS returns a list of items of interest to the user. In order to minimize the impact on system resources, AIRS is designed to run in the background during periods of light system use.

  4. DIALOG: An executive computer program for linking independent programs

    NASA Technical Reports Server (NTRS)

    Glatt, C. R.; Hague, D. S.; Watson, D. A.

    1973-01-01

    A very large scale computer programming procedure called the DIALOG executive system was developed for the CDC 6000 series computers. The executive computer program, DIALOG, controls the sequence of execution and data management function for a library of independent computer programs. Communication of common information is accomplished by DIALOG through a dynamically constructed and maintained data base of common information. Each computer program maintains its individual identity and is unaware of its contribution to the large scale program. This feature makes any computer program a candidate for use with the DIALOG executive system. The installation and uses of the DIALOG executive system are described.

  5. A new method of edge detection for object recognition

    USGS Publications Warehouse

    Maddox, Brian G.; Rhew, Benjamin

    2004-01-01

    Traditional edge detection systems function by returning every edge in an input image. This can result in a large amount of clutter and make certain vectorization algorithms less accurate. Accuracy problems can then have a large impact on automated object recognition systems that depend on edge information. A new method of directed edge detection can be used to limit the number of edges returned based on a particular feature. This results in a cleaner image that is easier for vectorization. Vectorized edges from this process could then feed an object recognition system where the edge data would also contain information as to what type of feature it bordered.
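
    Thresholded edge detection of the kind this directed method builds on can be sketched on a tiny grayscale grid; the gradient operator, threshold, and sample image below are illustrative, not the paper's actual algorithm for limiting edges to a particular feature.

```python
# Sketch: a minimal gradient-magnitude edge detector on a small
# grayscale grid, keeping only edges above a threshold. This is a
# generic illustration of thresholded edge detection, not the
# directed, feature-specific method the paper describes.

def edge_mask(img, threshold):
    """Mark pixels whose horizontal or vertical intensity gradient
    magnitude exceeds `threshold`."""
    h, w = len(img), len(img[0])
    mask = [[0] * w for _ in range(h)]
    for y in range(h - 1):
        for x in range(w - 1):
            gx = img[y][x + 1] - img[y][x]   # horizontal gradient
            gy = img[y + 1][x] - img[y][x]   # vertical gradient
            if (gx * gx + gy * gy) ** 0.5 > threshold:
                mask[y][x] = 1
    return mask

# A vertical step edge between columns 1 and 2.
image = [
    [10, 10, 200, 200],
    [10, 10, 200, 200],
    [10, 10, 200, 200],
]
print(edge_mask(image, 50))  # [[0, 1, 0, 0], [0, 1, 0, 0], [0, 0, 0, 0]]
```

    A directed detector would additionally score edges against a feature model before keeping them, which is what reduces the clutter passed on to vectorization.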

  6. Information Power Grid Posters

    NASA Technical Reports Server (NTRS)

    Vaziri, Arsi

    2003-01-01

    This document is a summary of the accomplishments of the Information Power Grid (IPG). Grids are an emerging technology that provide seamless and uniform access to the geographically dispersed computational, data storage, networking, instrument, and software resources needed for solving large-scale scientific and engineering problems. The goal of the NASA IPG is to use NASA's remotely located computing and data system resources to build distributed systems that can address problems that are too large or complex for a single site. The accomplishments outlined in this poster presentation are: access to distributed data, IPG heterogeneous computing, integration of a large-scale computing node into a distributed environment, remote access to high-data-rate instruments, and an exploratory grid environment.

  7. Seattle wide-area information for travelers (SWIFT) : architecture study

    DOT National Transportation Integrated Search

    1998-10-19

    The SWIFT (Seattle Wide-area Information For Travelers) Field Operational Test was intended to evaluate the performance of a large-scale urban Advanced Traveler Information System (ATIS) deployment in the Seattle area. The unique features of the SWIF...

  8. Configuration Analysis Tool (CAT). System Description and users guide (revision 1)

    NASA Technical Reports Server (NTRS)

    Decker, W.; Taylor, W.; Mcgarry, F. E.; Merwarth, P.

    1982-01-01

    A system description of, and user's guide for, the Configuration Analysis Tool (CAT) are presented. As a configuration management tool, CAT enhances the control of large software systems by providing a repository for information describing the current status of a project. CAT provides an editing capability to update the information and a reporting capability to present the information. CAT is an interactive program available in versions for the PDP-11/70 and VAX-11/780 computers.

  9. Using expert systems to implement a semantic data model of a large mass storage system

    NASA Technical Reports Server (NTRS)

    Roelofs, Larry H.; Campbell, William J.

    1990-01-01

    The successful development of large volume data storage systems will depend not only on the ability of the designers to store data, but on the ability to manage such data once it is in the system. The hypothesis is that mass storage data management can only be implemented successfully on the basis of highly intelligent metadata management services. There now exists a mass store system standard proposed by the IEEE that addresses many of the issues related to the storage of large volumes of data; however, the model does not consider a major technical issue, namely the high-level management of stored data. If the model were expanded to include the semantics and pragmatics of the data domain using a Semantic Data Model (SDM) concept, the result would be data that is expressive of the Intelligent Information Fusion (IIF) concept and that is also organized and classified in the context of its use and purpose. Results are presented from a demonstration prototype SDM implemented using the expert system development tool NEXPERT OBJECT. In the prototype, a simple instance of an SDM was created to support a hypothetical application for the Earth Observing System Data Information System (EOSDIS). The massive amounts of data that EOSDIS will manage require the definition and design of a powerful information management system to support even the most basic needs of the project. The application domain is characterized by a semantic-like network that represents the data content and the relationships between the data, based on user views and the more generalized architectural view of the information world. The data in the domain are represented by objects that define classes, types, and instances of the data. In addition, data properties are selectively inherited between parent and daughter relationships in the domain. Based on the SDM, a simple information system design is developed, from the low-level data storage media, through record management and metadata management, to the user interface.

  10. Organizational technologies of chronic disease management programs in large rural multispecialty group practice systems.

    PubMed

    Gamm, Larry; Bolin, Jane Nelson; Kash, Bita A

    2005-01-01

    Four large rural multispecialty group practice systems employ a mix of organizational technologies to provide chronic disease management with measurable impacts on their patient populations and costs. Four technologies-administrative, clinical, information, and social-are proposed as key dimensions for examining disease management programs. The benefits of disease management are recognized by these systems despite marked variability in the organization of the programs. Committees spanning health plans and clinics in the 4 systems and electronic medical records and/or other disease management information systems are important coordinating mechanisms. Increased reliance on nurses for patient education and care coordination in all 4 systems reflects significant extension of clinical and social technologies in the management of patient care. The promise of disease management as offered by these systems and other auspices are considered.

  11. Structural design of a vertical antenna boresight 18.3 by 18.3-m planar near-field antenna measurement system

    NASA Technical Reports Server (NTRS)

    Sharp, G. R.; Trimarchi, P. A.; Wanhainen, J. S.

    1984-01-01

    A large, very precise near-field planar scanner was proposed for NASA Lewis Research Center. This scanner would permit near-field measurements over a horizontal scan plane measuring 18.3 m by 18.3 m. Large aperture antennas mounted with the antenna boresight vertical could be tested up to 60 GHz. When such a large near-field scanner is used for pattern testing, the antenna or antenna system under test does not have to be moved. Hence, antennas and antenna systems can be positioned and supported to simulate their configuration in zero g, and the very large and heavy machinery that would otherwise be needed to accurately move the antennas is avoided. A preliminary investigation was undertaken to address the mechanical design of such a challenging near-field antenna scanner. The configuration, the structural design, and the results of a parametric NASTRAN structural optimization analysis are presented. Further, the resulting design was dynamically analyzed in order to provide resonant-frequency information to the designers of the scanner's mechanical drive system. If other large near-field scanners of comparable dimensions are to be constructed, this information can be used for their design optimization as well.

  12. Optical position measurement for a Large Gap Magnetic Suspension System

    NASA Technical Reports Server (NTRS)

    Welch, Sharon S.; Shelton, Kevin J.; Clemmons, James I.

    1991-01-01

    This paper describes the design of an optical position measurement system which is being built as part of the NASA Langley Large Gap Magnetic Suspension System (LGMSS). The LGMSS is a five degree-of-freedom, large-gap magnetic suspension system which is being built for Langley Research Center as part of the Advanced Controls Test Facility (ACTF). The LGMSS consists of a planar array of electromagnets which levitate and position a cylindrically shaped model containing a permanent magnet core. The optical position measurement system provides information on the location and orientation of the model to the LGMSS control system to stabilize levitation of the model.

  13. Lost in transportation: Information measures and cognitive limits in multilayer navigation.

    PubMed

    Gallotti, Riccardo; Porter, Mason A; Barthelemy, Marc

    2016-02-01

    Cities and their transportation systems become increasingly complex and multimodal as they grow, and it is natural to wonder whether it is possible to quantitatively characterize our difficulty navigating in them and whether such navigation exceeds our cognitive limits. A transition between different search strategies for navigating in metropolitan maps has been observed for large, complex metropolitan networks. This evidence suggests the existence of a limit associated with cognitive overload and caused by a large amount of information that needs to be processed. In this light, we analyzed the world's 15 largest metropolitan networks and estimated the information limit for determining a trip in a transportation system to be on the order of 8 bits. Similar to the "Dunbar number," which represents a limit to the size of an individual's friendship circle, our cognitive limit suggests that maps should not consist of more than 250 connection points to be easily readable. We also show that including connections with other transportation modes dramatically increases the information needed to navigate in multilayer transportation networks. In large cities such as New York, Paris, and Tokyo, more than 80% of the trips are above the 8-bit limit. Multimodal transportation systems in large cities have thus already exceeded human cognitive limits and, consequently, the traditional view of navigation in cities has to be revised substantially.
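
    The accounting behind the 8-bit estimate can be sketched by summing log2 of the number of alternatives at each decision point of a trip; the branching factors below are illustrative assumptions, not the paper's measured values for real metro networks.

```python
import math

# Sketch: information (in bits) needed to single out one route,
# counted as log2 of the number of alternatives at each decision
# point. The branching factors are illustrative; the paper's measure
# over real multilayer networks is more refined.

def trip_information_bits(choices_per_step):
    """Total bits to specify a route: sum of log2(alternatives)."""
    return sum(math.log2(c) for c in choices_per_step)

# A trip with four decision points offering 4, 2, 8, and 4 alternatives.
bits = trip_information_bits([4, 2, 8, 4])
print(bits)  # 8.0 -- right at the estimated cognitive limit
```

    Adding a multimodal connection multiplies the alternatives at a decision point, so each extra transport layer pushes more trips past the limit, which is the paper's central observation.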

  14. Lost in transportation: Information measures and cognitive limits in multilayer navigation

    PubMed Central

    Gallotti, Riccardo; Porter, Mason A.; Barthelemy, Marc

    2016-01-01

    Cities and their transportation systems become increasingly complex and multimodal as they grow, and it is natural to wonder whether it is possible to quantitatively characterize our difficulty navigating in them and whether such navigation exceeds our cognitive limits. A transition between different search strategies for navigating in metropolitan maps has been observed for large, complex metropolitan networks. This evidence suggests the existence of a limit associated with cognitive overload and caused by a large amount of information that needs to be processed. In this light, we analyzed the world’s 15 largest metropolitan networks and estimated the information limit for determining a trip in a transportation system to be on the order of 8 bits. Similar to the “Dunbar number,” which represents a limit to the size of an individual’s friendship circle, our cognitive limit suggests that maps should not consist of more than 250 connection points to be easily readable. We also show that including connections with other transportation modes dramatically increases the information needed to navigate in multilayer transportation networks. In large cities such as New York, Paris, and Tokyo, more than 80% of the trips are above the 8-bit limit. Multimodal transportation systems in large cities have thus already exceeded human cognitive limits and, consequently, the traditional view of navigation in cities has to be revised substantially. PMID:26989769

  15. Information technologies for astrophysics circa 2001

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1991-01-01

    It is easy to extrapolate current trends to see where technologies relating to information systems in astrophysics and other disciplines will be by the end of the decade. These technologies include miniaturization, multiprocessing, software technology, networking, databases, graphics, pattern computation, and interdisciplinary studies. It is less easy to see what limits our current paradigms place on our thinking about technologies that will allow us to understand the laws governing very large systems about which we have large data sets. Three limiting paradigms are as follows: saving all the bits collected by instruments or generated by supercomputers; obtaining technology for information compression, storage, and retrieval off the shelf; and the linear model of innovation. We must extend these paradigms to meet our goals for information technology at the end of the decade.

  16. Creating databases for biological information: an introduction.

    PubMed

    Stein, Lincoln

    2013-06-01

    The essence of bioinformatics is dealing with large quantities of information. Whether it be sequencing data, microarray data files, mass spectrometric data (e.g., fingerprints), the catalog of strains arising from an insertional mutagenesis project, or even large numbers of PDF files, there inevitably comes a time when the information can simply no longer be managed with files and directories. This is where databases come into play. This unit briefly reviews the characteristics of several database management systems, including flat file, indexed file, relational databases, and NoSQL databases. It compares their strengths and weaknesses and offers some general guidelines for selecting an appropriate database management system. Copyright 2013 by John Wiley & Sons, Inc.
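
    As a minimal sketch of the relational option the unit reviews, Python's built-in sqlite3 module can hold strain records of the kind a mutagenesis project produces; the table layout and sample rows below are illustrative.

```python
import sqlite3

# Sketch: storing and querying strain records from an insertional
# mutagenesis project in a relational database, one of the options
# the unit reviews. Table and column names are illustrative.

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE strains (id INTEGER PRIMARY KEY, name TEXT, locus TEXT)"
)
conn.executemany(
    "INSERT INTO strains (name, locus) VALUES (?, ?)",
    [("PB-0001", "chr4:1203k"),
     ("PB-0002", "chr11:887k"),
     ("PB-0003", "chr4:5561k")],
)

# A query that is painful with flat files: all strains on chromosome 4.
rows = conn.execute(
    "SELECT name FROM strains WHERE locus LIKE 'chr4:%' ORDER BY name"
).fetchall()
print([r[0] for r in rows])  # ['PB-0001', 'PB-0003']
```

    The point of the unit stands out here: once records must be filtered, joined, or sorted, a declarative query replaces ad hoc scripts over directories of files.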

  17. Advanced data structures for the interpretation of image and cartographic data in geo-based information systems

    NASA Technical Reports Server (NTRS)

    Peuquet, D. J.

    1986-01-01

    A growing need to use geographic information systems (GIS) to improve the flexibility and overall performance of very large, heterogeneous data bases was examined. The Vaster structure and the Topological Grid structure were compared to test whether such hybrid structures represent an improvement in performance. The use of artificial intelligence in a geographic/earth sciences data base context is being explored. The architecture of the Knowledge Based GIS (KBGIS) has a dual object/spatial data base and a three-tier hierarchical search subsystem. Quadtree Spatial Spectra (QTSS) are derived, based on the quadtree data structure, to generate and represent spatial distribution information for large volumes of spatial data.
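
    The quadtree underlying the Quadtree Spatial Spectra can be sketched as a point quadtree that subdivides a quadrant whenever it overflows; the bucket capacity and sample points below are illustrative, not the KBGIS implementation.

```python
# Sketch: a minimal point quadtree, the data structure on which the
# Quadtree Spatial Spectra are based. Bucket capacity and the sample
# points are illustrative assumptions.

class QuadTree:
    def __init__(self, x0, y0, x1, y1, capacity=1):
        self.bounds = (x0, y0, x1, y1)   # half-open region [x0,x1) x [y0,y1)
        self.capacity = capacity
        self.points = []
        self.children = None             # four sub-quadrants after a split

    def insert(self, x, y):
        x0, y0, x1, y1 = self.bounds
        if not (x0 <= x < x1 and y0 <= y < y1):
            return False                 # point lies outside this region
        if self.children is None:
            if len(self.points) < self.capacity:
                self.points.append((x, y))
                return True
            self._split()
        return any(c.insert(x, y) for c in self.children)

    def _split(self):
        x0, y0, x1, y1 = self.bounds
        mx, my = (x0 + x1) / 2, (y0 + y1) / 2
        self.children = [
            QuadTree(x0, y0, mx, my, self.capacity),
            QuadTree(mx, y0, x1, my, self.capacity),
            QuadTree(x0, my, mx, y1, self.capacity),
            QuadTree(mx, my, x1, y1, self.capacity),
        ]
        for p in self.points:            # push stored points into quadrants
            any(c.insert(*p) for c in self.children)
        self.points = []

    def depth(self):
        if self.children is None:
            return 1
        return 1 + max(c.depth() for c in self.children)

tree = QuadTree(0, 0, 100, 100)
for pt in [(10, 10), (80, 80), (40, 40)]:
    tree.insert(*pt)
print(tree.depth())  # 3: the two lower-left points force an extra split
```

    Because subdivision tracks point density, tallies of node counts per level characterize a spatial distribution, which is the idea the QTSS build on.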

  18. Problems of Automation and Management Principles Information Flow in Manufacturing

    NASA Astrophysics Data System (ADS)

    Grigoryuk, E. N.; Bulkin, V. V.

    2017-07-01

    Automated process control systems are complex systems characterized by the presence of elements with a common purpose, the systemic nature of the implemented algorithms for the exchange and processing of information, and a large number of functional subsystems. The article gives examples of automatic control systems and automated process control systems, drawing parallels between them by identifying their strengths and weaknesses. A non-standard process control system is also proposed.

  19. Pilot study of the domestic information display system in state and local government

    NASA Technical Reports Server (NTRS)

    1982-01-01

    An interactive computer based system that can retrieve a wide range of data (demographic, environmental, socio-economic, etc.) from a large data base and display these data for different geographic units in the form of choropleth maps was developed. The system was designed to display statistical information in a geographic format for national policy makers.

  20. Information Resource Management for Rural Communities in Zambia: Implications for National Information Policy Formulation.

    ERIC Educational Resources Information Center

    Lundu, Maurice C.

    Arguing that the information needs of the rural community are largely ignored in the establishment of information systems and services, this paper: (1) identifies the unique characteristics of information users in rural areas; (2) identifies and analyzes the types of communication channels most likely to deliver the information related to…

  1. Direct evaluation of free energy for large system through structure integration approach.

    PubMed

    Takeuchi, Kazuhito; Tanaka, Ryohei; Yuge, Koretaka

    2015-09-30

    We propose a new approach, 'structure integration', which enables direct evaluation of the configurational free energy of large systems. The present approach is based on statistical information about the lattice. Through first-principles-based simulation, we find that the present method accurately evaluates the configurational free energy of disordered states above the critical temperature.

  2. Nonterrestrial material processing and manufacturing of large space systems

    NASA Technical Reports Server (NTRS)

    Vontiesenhausen, G. F.

    1978-01-01

    An attempt is made to provide pertinent and readily usable information on the extraterrestrial processing of materials and the manufacturing of components and elements of planned large space systems from preprocessed lunar materials made available at a processing and manufacturing site in space. The required facilities, equipment, machinery, energy, and manpower are defined.

  3. A Method for Capturing and Reconciling Stakeholder Intentions Based on the Formal Concept Analysis

    NASA Astrophysics Data System (ADS)

    Aoyama, Mikio

    Information systems are ubiquitous in our daily life. Thus, information systems need to work appropriately anywhere at any time for everybody. Conventional information systems engineering tends to engineer systems from the viewpoint of systems functionality. However, the diversity of the usage context requires fundamental change compared to our current thinking on information systems; from the functionality the systems provide to the goals the systems should achieve. The intentional approach embraces the goals and related aspects of the information systems. This chapter presents a method for capturing, structuring and reconciling diverse goals of multiple stakeholders. The heart of the method lies in the hierarchical structuring of goals by goal lattice based on the formal concept analysis, a semantic extension of the lattice theory. We illustrate the effectiveness of the presented method through application to the self-checkout systems for large-scale supermarkets.
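
    The formal concept analysis at the heart of the method can be sketched on a tiny stakeholder-goal context: each formal concept pairs a set of stakeholders with exactly the goals they share, and the concepts ordered by inclusion form the goal lattice. The stakeholders and goals below are illustrative, not the paper's self-checkout case study.

```python
from itertools import combinations

# Sketch: enumerating the formal concepts of a small stakeholder-goal
# context, the construction behind a goal lattice. The context
# (stakeholders and goals) is an illustrative assumption.

context = {
    "cashier": {"fast checkout", "low training cost"},
    "shopper": {"fast checkout", "privacy"},
    "manager": {"low training cost", "theft prevention"},
}
objects = set(context)
attributes = set().union(*context.values())

def common_attributes(objs):
    """Goals shared by every stakeholder in `objs`."""
    return set.intersection(*(context[o] for o in objs)) if objs else set(attributes)

def objects_having(attrs):
    """Stakeholders holding every goal in `attrs`."""
    return {o for o in objects if attrs <= context[o]}

# A formal concept is a pair (extent, intent) closed under both maps.
concepts = set()
for r in range(len(objects) + 1):
    for objs in combinations(sorted(objects), r):
        intent = common_attributes(set(objs))
        extent = objects_having(intent)
        concepts.add((frozenset(extent), frozenset(intent)))

for extent, intent in sorted(concepts, key=lambda c: (len(c[0]), sorted(c[0]))):
    print(sorted(extent), "->", sorted(intent))
```

    Ordering these concepts by extent inclusion yields the lattice: for example, the concept ({cashier, shopper}, {fast checkout}) sits above the more specific ({cashier}, {fast checkout, low training cost}), which is how shared goals surface for reconciliation.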

  4. Software engineering risk factors in the implementation of a small electronic medical record system: the problem of scalability.

    PubMed

    Chiang, Michael F; Starren, Justin B

    2002-01-01

    The successful implementation of clinical information systems is difficult. In examining the reasons and potential solutions for this problem, the medical informatics community may benefit from the lessons of a rich body of software engineering and management literature about the failure of software projects. Based on previous studies, we present a conceptual framework for understanding the risk factors associated with large-scale projects. However, the vast majority of existing literature is based on large, enterprise-wide systems, and it is unclear whether those results may be scaled down and applied to smaller projects such as departmental medical information systems. To examine this issue, we discuss the case study of a delayed electronic medical record implementation project in a small specialty practice at Columbia-Presbyterian Medical Center. While the factors contributing to the delay of this small project share some attributes with those found in larger organizations, there are important differences. The significance of these differences for groups implementing small medical information systems is discussed.

  5. Seattle wide-area information for travelers (SWIFT) : consumer acceptance study

    DOT National Transportation Integrated Search

    1998-10-19

    The Seattle Wide-area Information for Travelers (SWIFT) Operational Test was intended to evaluate the performance of a large-scale, urban Advanced Traveler Information System (ATIS) deployment in the Seattle area. With the majority of the SWIFT syste...

  6. Information science team

    NASA Technical Reports Server (NTRS)

    Billingsley, F.

    1982-01-01

    Concerns are expressed about the data handling aspects of system design and about enabling technology for data handling and data analysis. The status, contributing factors, critical issues, and recommended investigations are listed for data handling, rectification and registration, and information extraction. Potential support for individual P.I. research tasks, systematic data system design, and system operation is identified. The need for an airborne spectrometer-class instrument for fundamental research at high spectral and spatial resolution is indicated. Geographic information system formatting and labelling techniques, very large scale integration, and methods for providing multitype data sets must also be developed.

  7. Feasibility evaluation and study of adapting the attitude reference system to the Orbiter camera payload system's large format camera

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A design concept that will implement a mapping capability for the Orbiter Camera Payload System (OCPS) when ground control points are not available is discussed. Through the use of stellar imagery collected by a pair of cameras whose optical axes are structurally related to the optical axis of the large format camera, such pointing information is made available.

  8. A technology ecosystem perspective on hospital management information systems: lessons from the health literature.

    PubMed

    Bain, Christopher A; Standing, Craig

    2009-01-01

    Hospital managers have a large range of information needs, including quality metrics, financial reports, access information, and educational, resourcing, and decision support needs. Currently, meeting these needs requires managers to interact with numerous disparate systems: electronic ones such as SAP, Oracle Financials, patient administration systems (PAS) like HOMER, and relevant websites, as well as paper-based systems. Hospital management information systems (HMIS) can be thought of as sitting within a technology ecosystem (TE). HMIS could benefit from a broader and deeper TE model, and the HMIS environment may in fact represent its own TE (the HMTE). This research examines lessons from the health literature in relation to some of these issues and proposes an extension to the base TE model.

  9. An empirical analysis of executive behaviour with hospital executive information systems in Taiwan.

    PubMed

    Huang, Wei-Min

    2013-01-01

    Existing health information systems largely support only the daily operations of a medical centre, and are unable to generate the information required by executives for decision-making. Building on past research concerning information retrieval behaviour and learning through mental models, this study examines the use of information systems by hospital executives in medical centres. It uses a structural equation model to help find ways hospital executives might use information systems more effectively. The results show that computer self-efficacy directly affects the maintenance of mental models, and that system characteristics directly impact learning styles and information retrieval behaviour. Other results include significant impacts of perceived environmental uncertainty on scan searches; of information retrieval behaviour and focused searches on mental models and perceived efficiency; of scan searches on mental model building; of learning styles and model building on perceived efficiency; and, finally, of mental model maintenance on perceived efficiency and effectiveness.

  10. INFORMATION SCIENCE--OUTLINE, ASSESSMENT, INTERDISCIPLINARY DISCUSSION. REPORT FOR JUNE, 1965-JUNE, 1966.

    ERIC Educational Resources Information Center

    IBERALL, A.S.

    THIS REPORT PROVIDES AN ASSESSMENT AND INTRODUCTION TO THE INTERDISCIPLINARY LITERATURE OF THREE ASPECTS OF INFORMATION SCIENCE, IN ANNOTATED BIBLIOGRAPHY FORM. THESE ARE--COMMUNICATION NETWORKS, HUMAN INFORMATION PROCESSES (PRINCIPALLY LANGUAGE AND INFORMATION RETRIEVAL), AND THE LARGE CYBERNETIC SYSTEMS SUCH AS THE HUMAN BRAIN AND CENTRAL…

  11. Converting information from paper to optical media

    NASA Technical Reports Server (NTRS)

    Deaton, Timothy N.; Tiller, Bruce K.

    1990-01-01

    The technology of converting large amounts of paper into electronic form is described for use in information management systems based on optical disk storage. The space savings and photographic nature of microfiche are combined in these systems with the advantages of computerized data (fast and flexible retrieval of graphics and text, simultaneous instant access for multiple users, and easy manipulation of data). It is noted that electronic imaging systems offer a unique opportunity to dramatically increase the productivity and profitability of information systems. Particular attention is given to the CALS (Computer-aided Acquisition and Logistic Support) system.

  12. The Research of Paper Data Management Information System

    NASA Astrophysics Data System (ADS)

    Zhigang, Ji; Gaifang, Niu; Lingxi, Liu

    Paper management is now becoming an important task in many colleges and universities, and the digitization of paper management is a significant part of the informatization of college management. Taking the development of the paper management system as an opportunity, we have studied a universal framework for a comprehensive management system spanning departments and geographical locations. The framework supports setting up large, complicated distributed applications rapidly, efficiently, extensibly, and securely, and it offers a new approach to standardizing paper information management.

  13. Safety Case Development as an Information Modelling Problem

    NASA Astrophysics Data System (ADS)

    Lewis, Robert

    This paper considers the benefits from applying information modelling as the basis for creating an electronically-based safety case. It highlights the current difficulties of developing and managing large document-based safety cases for complex systems such as those found in Air Traffic Control systems. After a review of current tools and related literature on this subject, the paper proceeds to examine the many relationships between entities that can exist within a large safety case. The paper considers the benefits to both safety case writers and readers from the future development of an ideal safety case tool that is able to exploit these information models. The paper also introduces the idea that the safety case has formal relationships between entities that directly support the safety case argument using a methodology such as GSN, and informal relationships that provide links to direct and backing evidence and to supporting information.

  14. [Wound information management system: a standardized scheme for acquisition, storage and management of wound information].

    PubMed

    Liu, Hu; Su, Rong-jia; Wu, Min-jie; Zhang, Yi; Qiu, Xiang-jun; Feng, Jian-gang; Xie, Ting; Lu, Shu-liang

    2012-06-01

    To form a wound information management scheme with objectivity, standardization, and convenience by means of a wound information management system. A wound information management system was set up with an acquisition terminal, a defined wound description, a data bank, and related software. The efficacy of this system was evaluated in clinical practice. The acquisition terminal was composed of a third-generation mobile phone and its software. It was feasible to access wound information, including descriptions, images, and therapeutic plans, from the data bank by mobile phone. During 4 months, a total of 232 wound treatment records were entered, from which standardized data for 38 patients were formed automatically. This system can provide standardized wound information management through standardized techniques for the acquisition, transmission, and storage of wound information. It can be used widely in hospitals, especially primary medical institutions. The system's data resource makes future epidemiological studies with large sample sizes possible.

  15. A database paradigm for the management of DICOM-RT structure sets using a geographic information system

    NASA Astrophysics Data System (ADS)

    Shao, Weber; Kupelian, Patrick A.; Wang, Jason; Low, Daniel A.; Ruan, Dan

    2014-03-01

    We devise a paradigm for representing DICOM-RT structure sets in a database management system in such a way that secondary calculations of geometric information can be performed quickly from the existing contour definitions. The implementation of this paradigm is achieved using the PostgreSQL database system and the PostGIS extension, a geographic information system commonly used for encoding geographical map data. The proposed paradigm eliminates the overhead of retrieving large data records from the database, as well as the need to implement various numerical and data parsing routines, when additional information related to the geometry of the anatomy is desired.
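    A minimal sketch of the kind of secondary geometric calculation the abstract has in mind: computing the planar area of one axial contour directly from its ordered vertex list (in PostGIS this would be a geometry function call such as ST_Area). The contour coordinates below are invented illustration data, not DICOM-RT content.

    ```python
    # Hypothetical sketch: area of a closed planar contour from DICOM-RT-style
    # ordered contour points, via the shoelace formula.

    def contour_area(points):
        """Shoelace formula; `points` is an ordered list of (x, y) vertices,
        implicitly closed back to the first vertex."""
        n = len(points)
        acc = 0.0
        for i in range(n):
            x1, y1 = points[i]
            x2, y2 = points[(i + 1) % n]
            acc += x1 * y2 - x2 * y1
        return abs(acc) / 2.0

    # A 10 mm x 10 mm square contour, as four ordered vertices.
    square = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0), (0.0, 10.0)]
    print(contour_area(square))  # 100.0
    ```

    Delegating such calculations to a spatial database means the contour never has to be shipped to the client and re-parsed; the query returns only the derived number.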

  16. Spatial inventory integrating raster databases and point sample data. [Geographic Information System for timber inventory

    NASA Technical Reports Server (NTRS)

    Strahler, A. H.; Woodcock, C. E.; Logan, T. L.

    1983-01-01

    A timber inventory of the Eldorado National Forest, located in east-central California, provides an example of the use of a Geographic Information System (GIS) to stratify large areas of land for sampling and the collection of statistical data. The raster-based GIS format of the VICAR/IBIS software system allows simple and rapid tabulation of areas, and facilitates the selection of random locations for ground sampling. Algorithms that simplify the complex spatial pattern of raster-based information, and convert raster format data to strings of coordinate vectors, provide a link to conventional vector-based geographic information systems.
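    The two raster-GIS operations the abstract credits to VICAR/IBIS, area tabulation and random selection of ground-sample locations, can be sketched in a few lines. Everything here (the tiny stratum raster, the 30 m cell size, the stratum codes) is invented for illustration.

    ```python
    # Sketch of raster-based stratification: tabulate area per stratum,
    # then draw random cell locations for ground sampling in one stratum.
    import random
    from collections import Counter

    CELL_AREA_HA = 0.09  # assume 30 m x 30 m cells, i.e. 0.09 ha each

    # A tiny raster of timber stratum codes (rows of cells).
    raster = [
        [1, 1, 2, 2],
        [1, 3, 3, 2],
        [1, 3, 3, 3],
    ]

    # Area tabulation: count cells per stratum, convert to hectares.
    counts = Counter(code for row in raster for code in row)
    areas = {code: n * CELL_AREA_HA for code, n in counts.items()}

    # Two random ground-sample locations within stratum 3.
    rng = random.Random(42)
    stratum3 = [(r, c) for r, row in enumerate(raster)
                for c, code in enumerate(row) if code == 3]
    samples = rng.sample(stratum3, 2)
    print(areas[3], samples)
    ```

    The simplicity of both steps, a count and a filtered random draw over cells, is exactly the advantage the abstract claims for the raster format over vector representations.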

  17. openBIS: a flexible framework for managing and analyzing complex data in biology research

    PubMed Central

    2011-01-01

    Background Modern data generation techniques used in distributed systems biology research projects often create datasets of enormous size and diversity. We argue that in order to overcome the challenge of managing those large quantitative datasets and maximise the biological information extracted from them, a sound information system is required. Ease of integration with data analysis pipelines and other computational tools is a key requirement for it. Results We have developed openBIS, an open source software framework for constructing user-friendly, scalable and powerful information systems for data and metadata acquired in biological experiments. openBIS enables users to collect, integrate, share, publish data and to connect to data processing pipelines. This framework can be extended and has been customized for different data types acquired by a range of technologies. Conclusions openBIS is currently being used by several SystemsX.ch and EU projects applying mass spectrometric measurements of metabolites and proteins, High Content Screening, or Next Generation Sequencing technologies. The attributes that make it interesting to a large research community involved in systems biology projects include versatility, simplicity in deployment, scalability to very large data, flexibility to handle any biological data type and extensibility to the needs of any research domain. PMID:22151573

  18. Technology for large space systems: A bibliography with indexes (supplement 19)

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This bibliography lists 526 reports, articles, and other documents introduced into the NASA scientific and technical information system between January 1, 1988 and June 30, 1988. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design according to system, interactive analysis and design, structural and thermal analysis and design, structural concepts and control systems, electronics, advanced materials, assembly concepts, propulsion, and solar power satellite systems.

  19. Large space structures and systems in the space station era: A bibliography with indexes (supplement 04)

    NASA Astrophysics Data System (ADS)

    1992-10-01

    Bibliographies and abstracts are listed for 1211 reports, articles, and other documents introduced into the NASA scientific and technical information system between 1 Jul. and 30 Dec. 1991. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design according to system, interactive analysis and design, structural concepts and control systems, electronics, advanced materials, assembly concepts, propulsion, and solar power satellite systems.

  20. Technology for large space systems: A bibliography with indexes (supplement 14)

    NASA Technical Reports Server (NTRS)

    1986-01-01

    This bibliography lists 645 reports, articles and other documents introduced into the NASA scientific and technical information system between July 1, 1985 and December 31, 1985. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design according to system, interactive analysis and design, structural and thermal analysis and design, structural concepts and control systems, electronics, advanced materials, assembly concepts, propulsion, and solar power satellite systems.

  1. Large space structures and systems in the space station era: A bibliography with indexes (supplement 04)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Bibliographies and abstracts are listed for 1211 reports, articles, and other documents introduced into the NASA scientific and technical information system between 1 Jul. and 30 Dec. 1991. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design according to system, interactive analysis and design, structural concepts and control systems, electronics, advanced materials, assembly concepts, propulsion, and solar power satellite systems.

  2. Technology for large space systems: A bibliography with indexes (supplement 17)

    NASA Technical Reports Server (NTRS)

    1987-01-01

    This bibliography lists 512 reports, articles, and other documents introduced into the NASA scientific and technical information system between January 1, 1987 and June 30, 1987. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design according to system, interactive analysis and design, structural and thermal analysis and design, structural concepts and control systems, electronics, advanced materials, assembly concepts, propulsion, and solar power satellite systems.

  3. Technology for large space systems: A bibliography with indexes (supplement 13)

    NASA Technical Reports Server (NTRS)

    1986-01-01

    This bibliography lists 399 reports, articles and other documents introduced into the NASA scientific and technical information system between January 1, 1985 and June 30, 1985. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design according to system, interactive analysis and design, structural and thermal analysis and design, structural concepts and control systems, electronics, advanced materials, assembly concepts, propulsion, and solar power satellite systems.

  4. Technology for large space systems: A bibliography with indexes (supplement 18)

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This bibliography lists 569 reports, articles, and other documents introduced into the NASA scientific and technical information system between July 1,1987 and December 31, 1987. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design according to system, interactive analysis and design, structural and thermal analysis and design, structural concepts and control systems, electronics, advanced materials, assembly concepts, propulsion, and solar power satellite systems.

  5. Technology for large space systems: A bibliography with indexes (supplement 16)

    NASA Technical Reports Server (NTRS)

    1987-01-01

    This bibliography lists 673 reports, articles and other documents introduced into the NASA scientific and technical information system between July 1, 1986 and December 31, 1986. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design according to system interactive analysis and design, structural and thermal analysis and design, structural concepts and control systems, electronics, advanced materials, assembly concepts, propulsion, and solar power satellite systems.

  6. Technology for Large Space Systems: a Bibliography with Indexes (Supplement 21)

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This bibliography lists 745 reports, articles, and other documents introduced into the NASA scientific and technical information system between January 1, 1989 and June 30, 1989. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design according to system, interactive analysis and design, structural and thermal analysis and design, structural concepts and control systems, electronics, advanced materials, assembly concepts, propulsion, and solar power satellite systems.

  7. Technology for large space systems: A bibliography with indexes (supplement 15)

    NASA Technical Reports Server (NTRS)

    1987-01-01

    This bibliography lists 594 reports, articles and other documents introduced into the NASA scientific and technical information system between January 1, 1986 and June 30, 1986. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design according to system, interactive analysis and design, structural and thermal analysis and design, structural concepts and control systems, electronics, advanced materials, assembly concepts, propulsion, and solar power satellite systems.

  8. Considering Complex Objectives and Scarce Resources in Information Systems' Analysis.

    ERIC Educational Resources Information Center

    Crowther, Warren

    The low efficacy of many of the library and large-scale information systems that have been implemented in the developing countries has been disappointing, and their appropriateness is often questioned in the governmental and educational institutions of more industrialized countries beset by budget-crunching and a very dynamic transformation of…

  9. Library Information System Time-Sharing (LISTS) Project. Final Report.

    ERIC Educational Resources Information Center

    Black, Donald V.

    The Library Information System Time-Sharing (LISTS) experiment was based on three innovations in data processing technology: (1) the advent of computer time-sharing on third-generation machines, (2) the development of general-purpose file-management software and (3) the introduction of large, library-oriented data bases. The main body of the…

  10. A peer-to-peer music sharing system based on query-by-humming

    NASA Astrophysics Data System (ADS)

    Wang, Jianrong; Chang, Xinglong; Zhao, Zheng; Zhang, Yebin; Shi, Qingwei

    2007-09-01

    Today, the main traffic in peer-to-peer (P2P) networks is still multimedia files, including large numbers of music files. The study of Music Information Retrieval (MIR) has produced many encouraging achievements in the music search area. Nevertheless, research on MIR-based music search in P2P networks is still insufficient. Query by Humming (QBH) is one MIR technology that has been studied for years. In this paper, we present a server-based P2P music sharing system built on QBH and integrated with a Hierarchical Index Structure (HIS) to strengthen the relation between surface data and latent information. The HIS evolves automatically depending on the music-related items carried by each peer, such as MIDI files and lyrics. Instead of adding a large amount of redundancy, the system generates a compact index serving multiple kinds of search input, which greatly improves on the traditional keyword-based text search mode. As network bandwidth, speed, and similar factors cease to be bottlenecks of Internet service, the accessibility and accuracy of the information the Internet provides become the greater concern of end users.
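    As a hedged illustration of the core QBH idea (not this paper's actual algorithm or index), a hummed pitch sequence can be reduced to a melodic contour in Parsons code (U = up, D = down, R = repeat) and matched against stored melodies by edit distance, which makes the query key-invariant. The melodies below are invented.

    ```python
    # Sketch of query-by-humming via Parsons-code contour matching.

    def parsons(pitches):
        """Melodic contour of a pitch sequence as a Parsons-code string."""
        out = []
        for prev, cur in zip(pitches, pitches[1:]):
            out.append("U" if cur > prev else "D" if cur < prev else "R")
        return "".join(out)

    def edit_distance(a, b):
        """Classic Levenshtein distance, rolling-array dynamic programming."""
        dp = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            prev, dp[0] = dp[0], i
            for j, cb in enumerate(b, 1):
                prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1,
                                         prev + (ca != cb))
        return dp[-1]

    library = {
        "tune_a": [60, 62, 64, 62, 60],   # MIDI note numbers (invented)
        "tune_b": [60, 60, 67, 67, 69],
    }
    query = [57, 59, 60, 59, 57]          # same contour as tune_a, other key

    ranked = sorted(library, key=lambda k: edit_distance(parsons(query),
                                                         parsons(library[k])))
    print(ranked[0])  # tune_a ranks first: contours match exactly
    ```

    A hierarchical index like the paper's HIS would sit on top of such a matcher, narrowing the candidate set before any per-melody distance is computed.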

  11. The Design of PC/MISI, a PC-Based Common User Interface to Remote Information Storage and Retrieval Systems. M.S. ThesisFinal Report, 1 Jul. 1985 - 31 Dec. 1987

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Hall, Philip P.

    1985-01-01

    The amount of information contained in the data bases of large-scale information storage and retrieval systems is very large and growing at a rapid rate. The methods available for accessing this information have not been successful in making the information easily available to the people who have the greatest need for it. This thesis describes the design of a personal computer based system which will provide a means for these individuals to retrieve this data through one standardized interface. The thesis identifies each of the major problems associated with providing access to casual users of IS and R systems and describes the manner in which these problems are to be solved by the utilization of the local processing power of a PC. Additional capabilities, not available with standard access methods, are also provided to improve the user's ability to make use of this information. The design of PC/MISI is intended to facilitate its use as a research vehicle. Evaluation mechanisms and possible areas of future research are described. The PC/MISI development effort is part of a larger research effort directed at improving access to remote IS and R systems. This research effort, supported in part by NASA, is also reviewed.

  12. High rate information systems - Architectural trends in support of the interdisciplinary investigator

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Preheim, Larry E.

    1990-01-01

    Data systems requirements in the Earth Observing System (EOS) and Space Station Freedom (SSF) eras indicate increasing data volume, increased discipline interplay, higher complexity, and broader data integration and interpretation. A response to the needs of the interdisciplinary investigator is proposed, considering the increasing complexity and rising costs of scientific investigation. The EOS Data Information System, conceived as a widely distributed system with reliable communication links between central processing and the science user community, is described. Details are provided on the information architecture, system models, intelligent data management of large complex databases, and standards for archiving ancillary data, using a research library, a laboratory, and collaboration services.

  13. Biodiversity informatics: challenges and opportunities for applying biodiversity information to management and conservation

    Treesearch

    James S. Kagan

    2006-01-01

    Researchers, land managers, and the public currently often are unable to obtain useful biodiversity information because the subject represents such a large component of biology and ecology, and systems to compile and organize this information do not exist. Information on vascular plant taxonomy, as addressed by the Global Biodiversity Information Facility and key...

  14. Using external data sources to improve audit trail analysis.

    PubMed

    Herting, R L; Asaro, P V; Roth, A C; Barnes, M R

    1999-01-01

    Audit trail analysis is the primary means of detecting inappropriate use of the medical record. While audit logs contain large amounts of information, the information required to determine useful user-patient relationships is often not present, because most audit trail analysis systems rely on the limited information available within the medical record system. We report a feature of the STAR (System for Text Archive and Retrieval) audit analysis system in which information available in the medical record is augmented with external information sources such as databases, Lightweight Directory Access Protocol (LDAP) servers, and World Wide Web (WWW) databases. We discuss several issues that arise when combining the information from these disparate sources, and explain how the enhanced person-specific information obtained can be used to determine user-patient relationships that might signify a motive for inappropriately accessing a patient's medical record.
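    A simplified sketch of the augmentation idea: join an audit-log entry against external sources (plain dicts below stand in for the database, LDAP, and WWW sources STAR would query) to surface a possible user-patient relationship such as a shared home address. All records here are invented.

    ```python
    # Sketch: augmenting audit-log entries with external directory data
    # to flag user-patient relationships worth a human review.

    audit_log = [
        {"user": "jsmith", "patient_id": "P100", "action": "view_chart"},
    ]

    # External sources: an LDAP-like staff directory and a patient registry.
    staff_directory = {"jsmith": {"dept": "Radiology", "address": "12 Elm St"}}
    patient_registry = {"P100": {"name": "A. Smith", "address": "12 Elm St"}}

    def flag_relationships(log):
        """Yield audit entries where external data suggests a personal link."""
        for entry in log:
            user = staff_directory.get(entry["user"], {})
            patient = patient_registry.get(entry["patient_id"], {})
            if user.get("address") and user.get("address") == patient.get("address"):
                yield {**entry, "flag": "shared_address"}

    flags = list(flag_relationships(audit_log))
    print(flags)
    ```

    The point is not the specific match rule but that none of the joined attributes live in the audit log itself, which is why analysis confined to the medical record system misses them.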

  15. Education of the handicapped child: Status, trend, and issues related to electronic delivery

    NASA Technical Reports Server (NTRS)

    Rothenberg, D.

    1973-01-01

    This study is part of a broader investigation of the role of large-scale educational telecommunications systems. Thus, data are analyzed and trends and issues discussed to provide information useful to the systems designer who wishes to identify and assess the opportunities for large-scale electronic delivery of education for the handicapped.

  16. Validation of Autism Spectrum Disorder Diagnoses in Large Healthcare Systems with Electronic Medical Records

    ERIC Educational Resources Information Center

    Coleman, Karen J.; Lutsky, Marta A.; Yau, Vincent; Qian, Yinge; Pomichowski, Magdalena E.; Crawford, Phillip M.; Lynch, Frances L.; Madden, Jeanne M.; Owen-Smith, Ashli; Pearson, John A.; Pearson, Kathryn A.; Rusinak, Donna; Quinn, Virginia P.; Croen, Lisa A.

    2015-01-01

    To identify factors associated with valid Autism Spectrum Disorder (ASD) diagnoses from electronic sources in large healthcare systems. We examined 1,272 charts from ASD diagnosed youth <18 years old. Expert reviewers classified diagnoses as confirmed, probable, possible, ruled out, or not enough information. A total of 845 were classified with…

  17. A hypertext system that learns from user feedback

    NASA Technical Reports Server (NTRS)

    Mathe, Nathalie

    1994-01-01

    Retrieving specific information from large amounts of documentation is not an easy task. It could be facilitated if information relevant in the current problem solving context could be automatically supplied to the user. As a first step towards this goal, we have developed an intelligent hypertext system called CID (Computer Integrated Documentation). Besides providing a hypertext interface for browsing large documents, the CID system automatically acquires and reuses the context in which previous searches were appropriate. This mechanism utilizes on-line user information requirements and relevance feedback either to reinforce current indexing in case of success or to generate new knowledge in case of failure. Thus, the user continually augments and refines the intelligence of the retrieval system. This allows the CID system to provide helpful responses, based on previous usage of the documentation, and to improve its performance over time. We successfully tested the CID system with users of the Space Station Freedom requirements documents. We are currently extending CID to other application domains (Space Shuttle operations documents, airplane maintenance manuals, and on-line training). We are also exploring the potential commercialization of this technique.

  18. Novel Web-based Education Platforms for Information Communication utilizing Gamification, Virtual and Immersive Reality

    NASA Astrophysics Data System (ADS)

    Demir, I.

    2015-12-01

    Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. This presentation showcases information communication interfaces, games, and virtual and immersive reality applications for supporting teaching and learning of concepts in atmospheric and hydrological sciences. The information communication platforms utilize the latest web technologies and allow accessing and visualizing large-scale data on the web. The simulation system is a web-based 3D interactive learning environment for teaching hydrological and atmospheric processes and concepts. It provides a visually striking platform with realistic terrain, weather information, and water simulation, and an environment for students to learn about earth science processes and the effects of development and human activity on the terrain. Users can access the system in three visualization modes, virtual reality, augmented reality, and immersive reality, using a heads-up display. The system provides various scenarios customized to fit the age and education level of its users.

  19. NASA's Information Power Grid: Large Scale Distributed Computing and Data Management

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Vaziri, Arsi; Hinke, Tom; Tanner, Leigh Ann; Feiereisen, William J.; Thigpen, William; Tang, Harry (Technical Monitor)

    2001-01-01

    Large-scale science and engineering are done through the interaction of people, heterogeneous computing resources, information systems, and instruments, all of which are geographically and organizationally dispersed. The overall motivation for Grids is to facilitate the routine interactions of these resources in order to support large-scale science and engineering. Multi-disciplinary simulations provide a good example of a class of applications that are very likely to require aggregation of widely distributed computing, data, and intellectual resources. Such simulations - e.g. whole system aircraft simulation and whole system living cell simulation - require integrating applications and data that are developed by different teams of researchers, frequently in different locations. The research teams are the only ones that have the expertise to maintain and improve the simulation code and/or the body of experimental data that drives the simulations. This results in an inherently distributed computing and data management environment.

  20. Large-scale educational telecommunications systems for the US: An analysis of educational needs and technological opportunities

    NASA Technical Reports Server (NTRS)

    Morgan, R. P.; Singh, J. P.; Rothenberg, D.; Robinson, B. E.

    1975-01-01

    The needs to be served, the subsectors in which the system might be used, the technology employed, and the prospects for future utilization of an educational telecommunications delivery system are described and analyzed. Educational subsectors are analyzed with emphasis on the current status and trends within each subsector. Issues which affect future development, and prospects for future use of media, technology, and large-scale electronic delivery within each subsector are included. Information on technology utilization is presented. Educational telecommunications services are identified and grouped into categories: public television and radio, instructional television, computer aided instruction, computer resource sharing, and information resource sharing. Technology based services, their current utilization, and factors which affect future development are stressed. The role of communications satellites in providing these services is discussed. Efforts to analyze and estimate future utilization of large-scale educational telecommunications are summarized. Factors which affect future utilization are identified. Conclusions are presented.

  1. Forecasting distribution of numbers of large fires

    Treesearch

    Haiganoush K. Preisler; Jeff Eidenshink; Stephen Howard; Robert E. Burgan

    2015-01-01

    Systems to estimate forest fire potential commonly utilize one or more indexes that relate to expected fire behavior; however, they indicate neither the chance that a large fire will occur, nor the expected number of large fires. That is, they do not quantify the probabilistic nature of fire danger. In this work we use large fire occurrence information from the...

  2. Computer-Assisted Search Of Large Textual Data Bases

    NASA Technical Reports Server (NTRS)

    Driscoll, James R.

    1995-01-01

    "QA" denotes high-speed computer system for searching diverse collections of documents including (but not limited to) technical reference manuals, legal documents, medical documents, news releases, and patents. Incorporates previously available and emerging information-retrieval technology to help user intelligently and rapidly locate information found in large textual data bases. Technology includes provision for inquiries in natural language; statistical ranking of retrieved information; artificial-intelligence implementation of semantics, in which "surface level" knowledge found in text used to improve ranking of retrieved information; and relevance feedback, in which user's judgements of relevance of some retrieved documents used automatically to modify search for further information.

  3. Hierarchical hybrid control of manipulators: Artificial intelligence in large scale integrated circuits

    NASA Technical Reports Server (NTRS)

    Greene, P. H.

    1972-01-01

    Both in practical engineering and in control of muscular systems, low level subsystems automatically provide crude approximations to the proper response. Through low level tuning of these approximations, the proper response variant can emerge from standardized high level commands. Such systems are expressly suited to emerging large scale integrated circuit technology. A computer, using symbolic descriptions of subsystem responses, can select and shape responses of low level digital or analog microcircuits. A mathematical theory that reveals significant informational units in this style of control and software for realizing such information structures are formulated.

  4. Critical Issues in Large-Scale Assessment: A Resource Guide.

    ERIC Educational Resources Information Center

    Redfield, Doris

    The purpose of this document is to provide practical guidance and support for the design, development, and implementation of large-scale assessment systems that are grounded in research and best practice. Information is included about existing large-scale testing efforts, including national testing programs, state testing programs, and…

  5. Toward a Federal Land Information System: Experiences and issues

    USGS Publications Warehouse

    Sturdevant, James A.

    1988-01-01

    From 1983 to 1987, the U.S. Geological Survey conducted research to develop a national resource data base of Federal lands under the auspices of the Federal Land Information System (FLIS) program. The program's goal was to develop the capability to provide information to national mineral-use policymakers. Prototype spatial data bases containing mineral, land status, and base cartographic data were developed for the Medford, Oreg., area, the State of Alaska, and the Silver City, N. Mex., area. Other accomplishments included (1) the preparation of a digital format for U.S. Geological Survey mineral assessment data and (2) the development of a procedure for integrating parcel-level tabular Alaska land status data into a section-level geographic information system. Overall findings indicated that both vector and raster capabilities are required for a FLIS and that nationwide data availability is a limiting factor in FLIS development. As a result of a 1986 interbureau (U.S. Geological Survey, Bureau of Land Management, and Bureau of Mines) review of the FLIS program, activities were redirected to undertake research on large-area geographic information system techniques. Land use and land cover data generalization strategies were tested, and area-filtering software was found to be the optimum type. In addition, a procedure was developed for transferring tabular land status data of surveyed areas in the contiguous 48 States to spatial data for use in geographic information systems. The U.S. Geological Survey FLIS program, as an administrative unit, ended in 1987, but FLIS-related research on large-area geographic information systems continues.

  6. Large space structures and systems in the space station era: A bibliography with indexes

    NASA Technical Reports Server (NTRS)

    Ferrainolo, John J. (Compiler); Lawrence, George F. (Compiler)

    1991-01-01

    Bibliographies and abstracts are listed for 1219 reports, articles, and other documents introduced into the NASA scientific and technical information system between July 1, 1990 and December 31, 1990. The purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design according to system, interactive analysis and design, structural and thermal analysis and design, structural concepts and control systems, electronics, advanced materials, assembly concepts, propulsion, and solar power satellite systems.

  7. Large space structures and systems in the space station era: A bibliography with indexes

    NASA Technical Reports Server (NTRS)

    Ferrainolo, John J. (Editor)

    1990-01-01

    Bibliographies and abstracts are listed for 1372 reports, articles, and other documents introduced into the NASA scientific and technical information system between January 1, 1990 and June 30, 1990. Its purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design according to system, interactive analysis and design, structural and thermal analysis and design, structural concepts and control systems, electronics, advanced materials, assembly concepts, propulsion, and solar power satellite systems.

  8. DIALOG: An executive computer program for linking independent programs

    NASA Technical Reports Server (NTRS)

    Glatt, C. R.; Hague, D. S.; Watson, D. A.

    1973-01-01

    A very large scale computer programming procedure called the DIALOG Executive System has been developed for the Univac 1100 series computers. The executive computer program, DIALOG, controls the sequence of execution and the data management functions for a library of independent computer programs. Communication of common information is accomplished by DIALOG through a dynamically constructed and maintained data base of common information. The unique feature of the DIALOG Executive System is the manner in which computer programs are linked: each program maintains its individual identity and as such is unaware of its contribution to the large scale program. This feature makes any computer program a candidate for use with the DIALOG Executive System. The installation and use of the DIALOG Executive System at the Johnson Space Center are described.
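The linking style described here, independent programs that communicate only through an executive-maintained common data base, can be sketched in miniature. This is a toy model with hypothetical program names, not the actual Univac DIALOG code.

```python
# Each "program" knows nothing about the others; the shared data base (a dict
# here) is the only channel, as in the executive scheme described above.

def trajectory(db):
    db["velocity"] = db["thrust"] / db["mass"]        # writes common data

def report(db):
    db["summary"] = f"velocity={db['velocity']:.1f}"  # reads what trajectory wrote

def executive(programs, db):
    """Run each program in sequence; the data base is the only link between them."""
    for program in programs:
        program(db)
    return db

db = executive([trajectory, report], {"thrust": 100.0, "mass": 4.0})
```

Because the programs never call one another, any self-contained routine that reads and writes the common data base is a candidate for the library, which is the property the abstract emphasizes.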

  9. Library Information-Processing System

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The system works with the Library of Congress MARC II format and is composed of subsystems that provide a wide range of library information-processing capabilities. MARC II is the American National Standards Institute (ANSI) format for machine-readable bibliographic data. The system is adaptable to any medium-to-large library.

  10. On the decentralized control of large-scale systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chong, C.

    1973-01-01

    The decentralized control of stochastic large scale systems was considered. Particular emphasis was given to control strategies which utilize decentralized information and can be computed in a decentralized manner. The deterministic constrained optimization problem is generalized to the stochastic case when each decision variable depends on different information and the constraint is only required to be satisfied on the average. For problems with a particular structure, a hierarchical decomposition is obtained. For the stochastic control of dynamic systems with different information sets, a new kind of optimality is proposed which exploits the coupled nature of the dynamic system. The subsystems are assumed to be uncoupled and then certain constraints are required to be satisfied, either off-line or on-line. For off-line coordination, a hierarchical approach to solving the problem is obtained; the lower level problems are all uncoupled. For on-line coordination, a distinction is made between open loop feedback optimal coordination and closed loop optimal coordination.
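The averaged-constraint formulation described above can be written out explicitly (in notation chosen here for illustration, not necessarily the thesis's own):

```latex
\min_{\gamma_1,\ldots,\gamma_n} \; \mathbb{E}\left[ J\bigl(\gamma_1(y_1),\ldots,\gamma_n(y_n)\bigr) \right]
\quad \text{subject to} \quad
\mathbb{E}\left[ g\bigl(\gamma_1(y_1),\ldots,\gamma_n(y_n)\bigr) \right] \le 0 ,
```

where each decision rule $\gamma_i$ maps decision maker $i$'s own observation $y_i$ to a decision, so the decision variables genuinely depend on different information. Attaching a multiplier to the averaged constraint is what yields the hierarchical decomposition: a coordinator adjusts the multiplier while each lower-level problem is solved using local information only.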

  11. User-Centered Indexing for Adaptive Information Access

    NASA Technical Reports Server (NTRS)

    Chen, James R.; Mathe, Nathalie

    1996-01-01

    We are focusing on information access tasks characterized by a large volume of hypermedia-connected technical documents, a need for rapid and effective access to familiar information, and long-term interaction with evolving information. The problem for technical users is to build and maintain a personalized, task-oriented model of the information in order to quickly access relevant information. We propose a solution which provides user-centered adaptive information retrieval and navigation. This solution supports users in customizing information access over time. It is complementary to information discovery methods which provide access to new information, since it lets users customize future access to previously found information. It relies on a technique, called the Adaptive Relevance Network, which creates and maintains a complex indexing structure to represent a user's personal information access maps organized by concepts. This technique is integrated within the Adaptive HyperMan system, which helps NASA Space Shuttle flight controllers organize and access large amounts of information. It allows users to select and mark any part of a document as interesting, and to index that part with user-defined concepts. Users can then retrieve marked portions of documents. This functionality allows users to define and access personal collections of information, which are dynamically computed. The system also supports collaborative review by letting users share group access maps. The adaptive relevance network provides long-term adaptation based both on usage and on explicit user input. The indexing structure is dynamic and evolves over time. Learning and generalization support flexible retrieval of information under similar concepts. The network is geared towards more recent information access, and automatically manages its size in order to maintain rapid access when scaling up to a large hypermedia space. We present results of simulated learning experiments.
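A loose sketch of the concept-indexing idea follows. The class, weights, and reinforcement rule are assumptions for illustration, not the Adaptive Relevance Network's actual design.

```python
from collections import defaultdict

class ConceptIndex:
    """User-defined concepts linked to marked document passages, with weights
    that adapt through usage (a stand-in for the access-map idea above)."""

    def __init__(self):
        self.links = defaultdict(dict)   # concept -> {passage: weight}

    def mark(self, concept, passage, weight=1.0):
        """The user marks a document passage as relevant to a concept."""
        self.links[concept][passage] = weight

    def retrieve(self, concept):
        """Return passages for a concept, strongest links first, and reinforce
        them slightly so recently used links stay quickly accessible."""
        weights = self.links[concept]
        ranked = sorted(weights, key=weights.get, reverse=True)
        for p in ranked:
            weights[p] *= 1.1            # usage-based reinforcement
        return ranked

idx = ConceptIndex()
idx.mark("ascent procedures", "doc1:sec2", weight=0.5)
idx.mark("ascent procedures", "doc3:sec7", weight=0.9)
top = idx.retrieve("ascent procedures")
```

Retrieval both answers the query and adapts the index, which is the "long-term adaptation based on usage" behavior the abstract attributes to the network.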

  12. Ontology-driven data integration and visualization for exploring regional geologic time and paleontological information

    NASA Astrophysics Data System (ADS)

    Wang, Chengbin; Ma, Xiaogang; Chen, Jianguo

    2018-06-01

    Initiatives of open data promote the online publication and sharing of large amounts of geologic data. How to retrieve information and discover knowledge from such big data is an ongoing challenge. In this paper, we developed an ontology-driven data integration and visualization pilot system for exploring information of regional geologic time, paleontology, and fundamental geology. The pilot system (http://www2.cs.uidaho.edu/%7Emax/gts/)

  13. Tools for Large-Scale Data Analytic Examination of Relational and Epistemic Networks in Engineering Education

    ERIC Educational Resources Information Center

    Madhavan, Krishna; Johri, Aditya; Xian, Hanjun; Wang, G. Alan; Liu, Xiaomo

    2014-01-01

    The proliferation of digital information technologies and related infrastructure has given rise to novel ways of capturing, storing and analyzing data. In this paper, we describe the research and development of an information system called Interactive Knowledge Networks for Engineering Education Research (iKNEER). This system utilizes a framework…

  14. Some Thoughts About Water Analysis in Shipboard Steam Propulsion Systems for Marine Engineering Students.

    ERIC Educational Resources Information Center

    Schlenker, Richard M.; And Others

    Information is presented about the problems involved in using sea water in the steam propulsion systems of large, modern ships. Discussions supply background chemical information concerning the problems of corrosion, scale buildup, and sludge production. Suggestions are given for ways to maintain a good water treatment program to effectively deal…

  15. Application of Open-Source Enterprise Information System Modules: An Empirical Study

    ERIC Educational Resources Information Center

    Lee, Sang-Heui

    2010-01-01

    Although there have been a number of studies on large scale implementation of proprietary enterprise information systems (EIS), open-source software (OSS) for EIS has received limited attention in spite of its potential as a disruptive innovation. Cost saving is the main driver for adopting OSS among the other possible benefits including security…

  16. Implementing an Enterprise Information System to Reengineer and Streamline Administrative Processes in a Distance Learning Unit

    ERIC Educational Resources Information Center

    Abdous, M'hammed; He, Wu

    2009-01-01

    During the past three years, we have developed and implemented an enterprise information system (EIS) to reengineer and facilitate the administrative process for preparing and teaching distance learning courses in a midsized-to-large university (with 23,000 students). The outcome of the implementation has been a streamlined and efficient process…

  17. The Atlanta Project: How One Large School System Responded to Performance Information.

    ERIC Educational Resources Information Center

    White, Bayla F.; And Others

    This report presents the results of a field test, the purpose of which was to determine the effects on school system management, decisions, and operations of the introduction of specially prepared information on the relative achievement levels of schools and grades serving students of similar economic levels. A relatively simple and economical…

  18. The Impacts of Agile Development Methodology Use on Project Success: A Contingency View

    ERIC Educational Resources Information Center

    Tripp, John F.

    2012-01-01

    Agile Information Systems Development Methods have emerged in the past decade as an alternative manner of managing the work and delivery of information systems development teams, with a large number of organizations reporting the adoption & use of agile methods. The practitioners of these methods make broad claims as to the benefits of their…

  19. Information Retrieval (SPIRES) and Library Automation (BALLOTS) at Stanford University.

    ERIC Educational Resources Information Center

    Ferguson, Douglas, Ed.

    At Stanford University, two major projects have been involved jointly in library automation and information retrieval since 1968: BALLOTS (Bibliographic Automation of Large Library Operations) and SPIRES (Stanford Physics Information Retrieval System). In early 1969, two prototype applications were activated using the jointly developed systems…

  20. Interactive Model-Centric Systems Engineering (IMCSE) Phase 5

    DTIC Science & Technology

    2018-02-28

    Conducting Program Team Launches ... Informing Policy ... research advances knowledge relevant to human interaction with models and model-generated information. Figure 1 highlights several questions the ... stakeholders interact using models and model-generated information; facets of human interaction with visualizations and large data sets; and underlying

  1. MFIRE-2: A Multi Agent System for Flow-Based Intrusion Detection Using Stochastic Search

    DTIC Science & Technology

    2012-03-01

    attacks that are distributed in nature, but may not protect individual systems effectively without incurring large bandwidth penalties while collecting... system-level information to help prepare for more significant attacks. The type of information potentially revealed by footprinting includes account... key areas where MAS may be appropriate: • The environment is open, highly dynamic, uncertain, or complex • Agents are a natural metaphor—Many

  2. The broiler meat system in Nairobi, Kenya: Using a value chain framework to understand animal and product flows, governance and sanitary risks.

    PubMed

    Carron, Maud; Alarcon, Pablo; Karani, Maurice; Muinde, Patrick; Akoko, James; Onono, Joshua; Fèvre, Eric M; Häsler, Barbara; Rushton, Jonathan

    2017-11-01

    Livestock food systems play key subsistence and income generation roles in low to middle income countries and are important networks for zoonotic disease transmission. The aim of this study was to use a value chain framework to characterize the broiler chicken meat system of Nairobi, its governance and sanitary risks. A total of 4 focus groups and 8 key informant interviews were used to collect cross-sectional data from: small-scale broiler farmers in selected Nairobi peri-urban and informal settlement areas; medium to large integrated broiler production companies; traders and meat inspectors in live chicken and chicken meat markets in Nairobi. Qualitative data were collected on types of people operating in the system, their interactions, sanitary measures in place, sourcing and selling of broiler chickens and products. Framework analysis was used to identify governance themes and risky sanitary practices present in the system. One large company was identified to supply 60% of Nairobi's day-old chicks to farmers, mainly through agrovet shops. Broiler meat products from integrated companies were sold in high-end retailers whereas their low value products were channelled through independent traders to consumers in informal settlements. Peri-urban small-scale farmers reported to slaughter the broilers on the farm and to sell carcasses to retailers (hotels and butcheries mainly) through brokers (80%), while farmers in the informal settlement reported to sell their broilers live to retailers (butcheries, hotels and hawkers mainly) directly. Broiler heads and legs were sold in informal settlements via roadside vendors. Sanitary risks identified were related to lack of biosecurity, cold chain and access to water, poor hygiene practices, lack of inspection at farm slaughter and limited health inspection in markets.
    Large companies dominated the governance of the broiler system through the control of day-old chick production. Overall, government control was described as relatively weak, leading to minimal official regulatory enforcement. Large companies and brokers were identified as dominant groups in market information dissemination and price setting. Lack of farmer association was found to be system-wide and to limit market access. Other system barriers included lack of space and expertise, leading to poor infrastructure and limited ability to implement effective hygienic measures.
    This study highlights significant structural differences between different broiler chains and inequalities in product quality and market access across the system. It provides a foundation for food safety assessments and disease control programmes, and informs policy-making for the inclusive growth of this fast-evolving sector. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  3. Applicability of computational systems biology in toxicology.

    PubMed

    Kongsbak, Kristine; Hadrup, Niels; Audouze, Karine; Vinggaard, Anne Marie

    2014-07-01

    Systems biology as a research field has emerged within the last few decades. Systems biology, often defined as the antithesis of the reductionist approach, integrates information about individual components of a biological system. In integrative systems biology, large data sets from various sources and databases are used to model and predict effects of chemicals on, for instance, human health. In toxicology, computational systems biology enables identification of important pathways and molecules from large data sets; tasks that can be extremely laborious when performed by a classical literature search. However, computational systems biology offers more advantages than providing a high-throughput literature search; it may form the basis for establishment of hypotheses on potential links between environmental chemicals and human diseases, which would be very difficult to establish experimentally. This is possible due to the existence of comprehensive databases containing information on networks of human protein-protein interactions and protein-disease associations. Experimentally determined targets of the specific chemical of interest can be fed into these networks to obtain additional information that can be used to establish hypotheses on links between the chemical and human diseases. Such information can also be applied for designing more intelligent animal/cell experiments that can test the established hypotheses. Here, we describe how and why to apply an integrative systems biology method in the hypothesis-generating phase of toxicological research. © 2014 Nordic Association for the Publication of BCPT (former Nordic Pharmacological Society).
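The hypothesis-generating step described above, feeding a chemical's known protein targets into interaction and disease-association networks, can be sketched as follows. All data, protein names, and disease links here are invented for illustration.

```python
ppi = {                       # protein -> directly interacting proteins (toy data)
    "ESR1": {"SP1", "JUN"},
    "SP1": {"ESR1", "TP53"},
    "JUN": {"ESR1"},
}
disease_links = {             # protein -> associated diseases (toy data)
    "TP53": {"cancer"},
    "JUN": {"inflammation"},
}

def candidate_diseases(targets, ppi, disease_links):
    """Diseases linked to a chemical's targets or their direct network neighbors,
    i.e. candidate hypotheses for experimental follow-up."""
    proteins = set(targets)
    for t in targets:
        proteins |= ppi.get(t, set())     # one hop into the interaction network
    found = set()
    for p in proteins:
        found |= disease_links.get(p, set())
    return found

hits = candidate_diseases({"ESR1"}, ppi, disease_links)
```

The output is a hypothesis set, not a conclusion: as the abstract notes, such candidates are meant to guide the design of more targeted animal or cell experiments.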

  4. Methods of information geometry in computational system biology (consistency between chemical and biological evolution).

    PubMed

    Astakhov, Vadim

    2009-01-01

    Interest in the simulation of large-scale metabolic networks, species development, and the genesis of various diseases requires new simulation techniques to accommodate the high complexity of realistic biological networks. Information geometry and topological formalisms are proposed to analyze information processes. We analyze the complexity of large-scale biological networks as well as the transition of system functionality due to modifications in the system architecture, system environment, and system components. The dynamic core model is developed. The term dynamic core is used to define a set of causally related network functions. Delocalization of the dynamic core model provides a mathematical formalism to analyze the migration of specific functions in biosystems which undergo structure transitions induced by the environment. The term delocalization is used to describe these processes of migration. We constructed a holographic model with self-poietic dynamic cores which preserves functional properties under those transitions. Topological constraints such as Ricci flow and Pfaff dimension were found for statistical manifolds which represent biological networks. These constraints can provide insight into processes of degeneration and recovery which take place in large-scale networks. We would like to suggest that therapies which are able to effectively implement the estimated constraints will successfully adjust biological systems and recover altered functionality. Also, we mathematically formulate the hypothesis that there is a direct consistency between biological and chemical evolution. Any set of causal relations within a biological network has its dual reimplementation in the chemistry of the system environment.

  5. Design and deployment of a large brain-image database for clinical and nonclinical research

    NASA Astrophysics Data System (ADS)

    Yang, Guo Liang; Lim, Choie Cheio Tchoyoson; Banukumar, Narayanaswami; Aziz, Aamer; Hui, Francis; Nowinski, Wieslaw L.

    2004-04-01

    An efficient database is an essential component for organizing diverse information on image metadata and patient information for research in medical imaging. This paper describes the design, development, and deployment of a large database system serving as a brain image repository that can be used across different platforms in various medical research studies. It forms the infrastructure that links hospitals and institutions together and shares data among them. The database contains patient-, pathology-, image-, research- and management-specific data. The functionalities of the database system include image uploading, storage, indexing, downloading, and sharing, as well as database querying and management, with security and data anonymization concerns well taken care of. The structure of the database is a multi-tier client-server architecture with a Relational Database Management System, Security Layer, Application Layer, and User Interface. An image source adapter has been developed to handle most of the popular image formats. The database has a user interface based on web browsers and is easy to handle. We have used the Java programming language for its platform independence and vast function libraries. The brain image database can sort data according to clinically relevant information, which can be used effectively in research from the clinicians' point of view. The database is suitable for validation of algorithms on large populations of cases. Medical images for processing can be identified and organized based on information in image metadata. Clinical research in various pathologies can thus be performed with greater efficiency, and large image repositories can be managed more effectively. The prototype of the system has been installed in a few hospitals and is working to the satisfaction of the clinicians.

  6. High-Performance Monitoring Architecture for Large-Scale Distributed Systems Using Event Filtering

    NASA Technical Reports Server (NTRS)

    Maly, K.

    1998-01-01

    Monitoring is an essential process to observe and improve the reliability and the performance of large-scale distributed (LSD) systems. In an LSD environment, a large number of events is generated by the system components during execution or interaction with external objects (e.g. users or processes). Monitoring such events is necessary for observing the run-time behavior of LSD systems and providing status information required for debugging, tuning, and managing such applications. However, correlated events are generated concurrently and can be distributed across various locations in the application environment, which complicates the management decision process and thereby makes monitoring LSD systems an intricate task. We propose a scalable high-performance monitoring architecture for LSD systems to detect and classify interesting local and global events and disseminate the monitoring information to the corresponding endpoint management applications, such as debugging and reactive control tools, to improve application performance and reliability. A large volume of events may be generated due to the extensive demands of the monitoring applications and the high interaction of LSD systems. The monitoring architecture employs a high-performance event filtering mechanism to efficiently process the large volume of event traffic generated by LSD systems and minimize the intrusiveness of the monitoring process by reducing the event traffic flow in the system and distributing the monitoring computation. Our architecture also supports dynamic and flexible reconfiguration of the monitoring mechanism via its instrumentation and subscription components. As a case study, we show how our monitoring architecture can be utilized to improve the reliability and the performance of the Interactive Remote Instruction (IRI) system, which is a large-scale distributed system for collaborative distance learning.
    The filtering mechanism represents an intrinsic component integrated with the monitoring architecture to reduce the volume of event traffic flow in the system, and thereby reduce the intrusiveness of the monitoring process. We are developing an event filtering architecture to efficiently process the large volume of event traffic generated by LSD systems (such as distributed interactive applications). This filtering architecture is used to monitor a collaborative distance learning application for obtaining debugging and feedback information. Our architecture supports the dynamic (re)configuration and optimization of event filters in large-scale distributed systems. Our work represents a major contribution by (1) surveying and evaluating existing event filtering mechanisms for monitoring LSD systems and (2) devising an integrated, scalable, high-performance event filtering architecture that spans several key application domains, presenting techniques to improve functionality, performance, and scalability. This paper describes the primary characteristics and challenges of developing high-performance event filtering for monitoring LSD systems. We survey existing event filtering mechanisms and explain the key characteristics of each technique. In addition, we discuss limitations of existing event filtering mechanisms and outline how our architecture will improve key aspects of event filtering.
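The subscription-based filtering idea can be illustrated with a minimal sketch (an assumed structure, not the paper's implementation): subscribers register predicates, and only matching events are forwarded, which is how filtering reduces monitoring traffic.

```python
class EventFilter:
    """Forward each published event only to subscribers whose predicate matches."""

    def __init__(self):
        self.subscriptions = []          # list of (predicate, sink) pairs

    def subscribe(self, predicate, sink):
        self.subscriptions.append((predicate, sink))

    def publish(self, event):
        """Deliver the event to matching sinks; return how many copies went out."""
        delivered = 0
        for predicate, sink in self.subscriptions:
            if predicate(event):
                sink.append(event)
                delivered += 1
        return delivered

f = EventFilter()
errors = []
f.subscribe(lambda e: e["severity"] == "error", errors)

events = [{"severity": "info", "id": i} for i in range(98)] + \
         [{"severity": "error", "id": 98}, {"severity": "error", "id": 99}]
forwarded = sum(f.publish(e) for e in events)
```

Here only 2 of 100 events are forwarded to the error subscriber; pushing such predicates close to the event sources is what cuts traffic flow and monitoring intrusiveness.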

  7. VisIRR: A Visual Analytics System for Information Retrieval and Recommendation for Large-Scale Document Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choo, Jaegul; Kim, Hannah; Clarkson, Edward

    In this paper, we present an interactive visual information retrieval and recommendation system, called VisIRR, for large-scale document discovery. VisIRR effectively combines the paradigms of (1) a passive pull through query processes for retrieval and (2) an active push that recommends items of potential interest to users based on their preferences. Equipped with an efficient dynamic query interface against a large-scale corpus, VisIRR organizes the retrieved documents into high-level topics and visualizes them in a 2D space, representing the relationships among the topics along with their keyword summary. In addition, based on interactive personalized preference feedback with regard to documents, VisIRR provides document recommendations from the entire corpus, which are beyond the retrieved sets. Such recommended documents are visualized in the same space as the retrieved documents, so that users can seamlessly analyze both existing and newly recommended ones. This article presents novel computational methods, which make these integrated representations and fast interactions possible for a large-scale document corpus. We illustrate how the system works by providing detailed usage scenarios. Finally, we present preliminary user study results for evaluating the effectiveness of the system.

  8. VisIRR: A Visual Analytics System for Information Retrieval and Recommendation for Large-Scale Document Data

    DOE PAGES

    Choo, Jaegul; Kim, Hannah; Clarkson, Edward; ...

    2018-01-31

    In this paper, we present an interactive visual information retrieval and recommendation system, called VisIRR, for large-scale document discovery. VisIRR effectively combines the paradigms of (1) a passive pull through query processes for retrieval and (2) an active push that recommends items of potential interest to users based on their preferences. Equipped with an efficient dynamic query interface against a large-scale corpus, VisIRR organizes the retrieved documents into high-level topics and visualizes them in a 2D space, representing the relationships among the topics along with their keyword summary. In addition, based on interactive personalized preference feedback with regard to documents, VisIRR provides document recommendations from the entire corpus, which are beyond the retrieved sets. Such recommended documents are visualized in the same space as the retrieved documents, so that users can seamlessly analyze both existing and newly recommended ones. This article presents novel computational methods, which make these integrated representations and fast interactions possible for a large-scale document corpus. We illustrate how the system works by providing detailed usage scenarios. Finally, we present preliminary user study results for evaluating the effectiveness of the system.
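The pull/push combination VisIRR describes can be caricatured as a single scoring rule. The scoring functions and weights below are illustrative assumptions, not the paper's algorithms.

```python
def combined_score(doc, query_terms, preferred_topics, alpha=0.5):
    """Blend a pull score (query-term overlap) with a push score
    (topic matches the user's learned preferences)."""
    words = set(doc["text"].split())
    pull = len(words & query_terms) / max(len(query_terms), 1)
    push = 1.0 if doc["topic"] in preferred_topics else 0.0
    return alpha * pull + (1 - alpha) * push

corpus = [
    {"text": "visual analytics of document topics", "topic": "visualization"},
    {"text": "database indexing methods", "topic": "visualization"},
    {"text": "protein folding simulation", "topic": "biology"},
]
scores = [combined_score(d, {"document", "topics"}, {"visualization"})
          for d in corpus]
```

The second document matches no query term yet still receives a nonzero score from the preference-based push, mirroring how recommendations can reach beyond the retrieved set.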

  9. Information extraction from multi-institutional radiology reports.

    PubMed

    Hassanpour, Saeed; Langlotz, Curtis P

    2016-01-01

    The radiology report is the most important source of clinical imaging information. It documents critical information about the patient's health and the radiologist's interpretation of medical findings. It also communicates information to the referring physicians and records that information for future clinical and research use. Although efforts to structure some radiology report information through predefined templates are beginning to bear fruit, a large portion of radiology report information is entered in free text. The free text format is a major obstacle for rapid extraction and subsequent use of information by clinicians, researchers, and healthcare information systems. This difficulty is due to the ambiguity and subtlety of natural language, complexity of described images, and variations among different radiologists and healthcare organizations. As a result, radiology reports are used only once by the clinician who ordered the study and rarely are used again for research and data mining. In this work, machine learning techniques and a large multi-institutional radiology report repository are used to extract the semantics of the radiology report and overcome the barriers to the re-use of radiology report information in clinical research and other healthcare applications. We describe a machine learning system to annotate radiology reports and extract report contents according to an information model. This information model covers the majority of clinically significant contents in radiology reports and is applicable to a wide variety of radiology study types. Our automated approach uses discriminative sequence classifiers for named-entity recognition to extract and organize clinically significant terms and phrases consistent with the information model. 
    We evaluated our information extraction system on 150 radiology reports from three major healthcare organizations and compared its results to a commonly used non-machine learning information extraction method. We also evaluated the generalizability of our approach across different organizations by training and testing our system on data from different organizations. Our results show the efficacy of our machine learning approach in extracting the information model's elements (10-fold cross-validation average performance: precision: 87%, recall: 84%, F1 score: 85%) and its superiority and generalizability compared to the common non-machine learning approach (p-value<0.05). Our machine learning information extraction approach provides an effective automatic method to annotate and extract clinically significant information from a large collection of free text radiology reports. This information extraction system can help clinicians better understand the radiology reports and prioritize their review process. In addition, the extracted information can be used by researchers to link radiology reports to information from other data sources such as electronic health records and the patient's genome. Extracted information also can facilitate disease surveillance, real-time clinical decision support for the radiologist, and content-based image retrieval. Copyright © 2015 Elsevier B.V. All rights reserved.
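The system described uses discriminative sequence classifiers for named-entity recognition; as a much simpler stand-in, a toy dictionary lookup can show the shape of mapping free-text report tokens onto an information model's categories. The terms and categories below are invented for illustration and are not the paper's information model.

```python
import re

INFORMATION_MODEL = {          # category -> example terms (assumed, toy data)
    "anatomy": {"lung", "liver", "kidney"},
    "observation": {"nodule", "opacity", "lesion"},
    "modifier": {"small", "large", "calcified"},
}

def annotate(report):
    """Tag each known token in the report with its information-model category."""
    annotations = []
    for token in re.findall(r"[a-z]+", report.lower()):
        for category, terms in INFORMATION_MODEL.items():
            if token in terms:
                annotations.append((token, category))
    return annotations

result = annotate("Small calcified nodule in the right lung.")
```

A real sequence classifier additionally uses context to disambiguate terms and find multi-word phrases, which is exactly what a flat dictionary lookup like this cannot do.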

  10. 3-Dimensional Root Cause Diagnosis via Co-analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Ziming; Lan, Zhiling; Yu, Li

    2012-01-01

    With the growth of system size and complexity, reliability has become a major concern for large-scale systems. Upon the occurrence of a failure, system administrators typically trace the events in Reliability, Availability, and Serviceability (RAS) logs for root cause diagnosis. However, the RAS log contains only limited diagnosis information. Moreover, manual processing is time-consuming, error-prone, and not scalable. To address the problem, in this paper we present an automated root cause diagnosis mechanism for large-scale HPC systems. Our mechanism examines multiple logs to provide a 3-D fine-grained root cause analysis. Here, 3-D means that our analysis will pinpoint the failure layer, the time, and the location of the event that causes the problem. We evaluate our mechanism by means of real logs collected from a production IBM Blue Gene/P system at Oak Ridge National Laboratory. It successfully identifies failure layer information for 219 failures during a 23-month period. Furthermore, it effectively identifies the triggering events with time and location information, even when the triggering events occur hundreds of hours before the resulting failures.
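    As an illustration of co-analysis across time and location, the following sketch scans a toy event log for candidate triggers of a failure. The log schema, node IDs, and time window are invented for illustration; the paper's actual mechanism is far more elaborate:

    ```python
    from datetime import datetime, timedelta

    # Hypothetical log schema: each event has a timestamp, a layer
    # (hardware / system software / application), and a location (node ID).
    events = [
        {"time": datetime(2011, 3, 1, 2, 0), "layer": "hardware",
         "location": "R00-M0-N04", "msg": "correctable memory error"},
        {"time": datetime(2011, 3, 1, 9, 30), "layer": "app",
         "location": "R00-M1-N02", "msg": "job aborted"},
    ]
    failure = {"time": datetime(2011, 3, 2, 1, 0), "location": "R00-M0-N04"}

    def candidate_triggers(failure, events, window_hours=48):
        """Return earlier events at the same location within the window,
        most recent first -- a simple stand-in for 3-D (layer, time,
        location) co-analysis."""
        lo = failure["time"] - timedelta(hours=window_hours)
        hits = [e for e in events
                if lo <= e["time"] < failure["time"]
                and e["location"] == failure["location"]]
        return sorted(hits, key=lambda e: e["time"], reverse=True)

    for e in candidate_triggers(failure, events):
        print(e["layer"], e["msg"])  # → hardware correctable memory error
    ```

    Widening the window is how a mechanism like this can surface triggers that occur many hours before the resulting failure.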

  11. A Computer-Based System Integrating Instruction and Information Retrieval: A Description of Some Methodological Considerations.

    ERIC Educational Resources Information Center

    Selig, Judith A.; And Others

    This report, summarizing the activities of the Vision Information Center (VIC) in the field of computer-assisted instruction from December, 1966 to August, 1967, describes the methodology used to load a large body of information--a programmed text on basic ophthalmology--onto a computer for subsequent information retrieval and computer-assisted…

  12. The Opportunity and Challenge of The Age of Big Data

    NASA Astrophysics Data System (ADS)

    Yunguo, Hong

    2017-11-01

    The arrival of the big data age has gradually expanded the scale of China's information industry, creating favorable conditions for the expansion of information technology and computer networks. Based on big data, computer system service functions are becoming more and more complete, and the efficiency of data processing in these systems is improving, which provides an important guarantee for the implementation of production plans in various industries. At the same time, the rapid development of fields such as the Internet of Things, social tools, and cloud computing, together with the widening of information channels, has increased the amount of data and expanded the influence of the age of big data. We need to face the opportunities and challenges of the age of big data correctly and use data information resources effectively. On this basis, this paper studies the opportunities and challenges of the era of big data.

  13. HPAC info-dex 1 -- Locating a manufacturer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-06-01

    Information in the index includes manufacturer name, address, and telephone and FAX numbers. In this section are more than 200 pages of detailed product information from manufacturers of a wide variety of mechanical systems products. The information details ranges of capacities, sizes, and other data that will assist in the selection and application of these products for mechanical systems in large plants and buildings. Throughout the year, use this section for assistance on current engineering projects. The manufacturers appearing in HPAC Info-dex 6 are listed in boldface in HPAC Info-dex 1, HPAC Info-dex 2, and HPAC Info-dex 3.

  14. Information findability: An informal study to explore options for improving information findability for the systems analysis group

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stoecker, Nora Kathleen

    2014-03-01

    A Systems Analysis Group has existed at Sandia National Laboratories since at least the mid-1950s. Much of the group's work output (reports, briefing documents, and other materials) has been retained, along with large numbers of related documents. Over time the collection has grown to hundreds of thousands of unstructured documents in many formats contained in one or more of several different shared drives or SharePoint sites, with perhaps five percent of the collection still existing in print format. This presents a challenge. How can the group effectively find, manage, and build on information contained somewhere within such a large set of unstructured documents? In response, a project was initiated to identify tools that would be able to meet this challenge. This report documents the results found and recommendations made as of August 2013.

  15. eHealth provides a novel opportunity to exploit the advantages of the Nordic countries in psychiatric genetic research, building on the public health care system, biobanks, and registries.

    PubMed

    Andreassen, Ole A

    2017-07-07

    Nordic countries have played an important role in the recent progress in psychiatric genetics, both with large well-characterized samples and with expertise. The Nordic countries have research advantages due to the organization of their societies, including systems of personal identifiers, national health registries with information about diseases, treatment, and prescriptions, and a public health system with geographical catchment areas. For psychiatric genetic research, the large biobanks and population surveys are a unique added value. Further, the population is motivated to participate in research, and there is trust in the institutions of society. These factors have been important for Nordic contributions to biomedical research, and particularly psychiatric genetics. In the era of eHealth, the situation seems even more advantageous for Nordic countries. The public health care system makes it easy to implement national measures, and most of the Nordic health care sector is already based on electronic information. The potential advantages regarding informed consent, large-scale recruitment and follow-up, and longitudinal cohort studies are tremendous. New precision medicine approaches can be tested within the health care system, with an integrated approach, using large hospitals or regions of the country as test beds. However, data protection and the legal framework have to be clarified. In order to succeed, it is important to keep the people's trust, and maintain high ethical standards and systems for secure data management. Then the full potential of the Nordic countries can be leveraged in the new era of precision medicine, including psychiatric genetics. © 2017 Wiley Periodicals, Inc.

  16. Another HISA--the new standard: health informatics--service architecture.

    PubMed

    Klein, Gunnar O; Sottile, Pier Angelo; Endsleff, Frederik

    2007-01-01

    In addition to the meaning as Health Informatics Society of Australia, HISA is the acronym used for the new European Standard: Health Informatics - Service Architecture. This EN 12967 standard has been developed by CEN - the federation of 29 national standards bodies in Europe. This standard defines the essential elements of a Service Oriented Architecture and a methodology for localization particularly useful for large healthcare organizations. It is based on the Open Distributed Processing (ODP) framework from ISO 10746 and contains the following parts: Part 1: Enterprise viewpoint. Part 2: Information viewpoint. Part 3: Computational viewpoint. This standard is now also the starting point for the consideration for an International standard in ISO/TC 215. The basic principles with a set of health specific middleware services as a common platform for various applications for regional health information systems, or large integrated hospital information systems, are well established following a previous prestandard. Examples of large scale deployments in Sweden, Denmark and Italy are described.

  17. Information Management System Supporting a Multiple Property Survey Program with Legacy Radioactive Contamination.

    PubMed

    Stager, Ron; Chambers, Douglas; Wiatzka, Gerd; Dupre, Monica; Callough, Micah; Benson, John; Santiago, Erwin; van Veen, Walter

    2017-04-01

    The Port Hope Area Initiative is a project mandated and funded by the Government of Canada to remediate properties with legacy low-level radioactive waste contamination in the Town of Port Hope, Ontario. The management and use of large amounts of data from surveys of some 4800 properties is a significant task critical to the success of the project. A large amount of information is generated through the surveys, including scheduling of individual field visits to the properties, capture of field data, laboratory sample tracking, QA/QC, property report generation, and project management reporting. Web-mapping tools were used to track and display temporal progress of various tasks and facilitated consideration of spatial associations of contamination levels. The IM system facilitated the management and integrity of the large amounts of information collected, evaluation of spatial associations, automated report production, and consistent application and traceable execution for this project. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  18. Large space structures and systems in the space station era: A bibliography with indexes (supplement 03)

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Bibliographies and abstracts are listed for 1221 reports, articles, and other documents introduced into the NASA scientific and technical information system between January 1, 1991 and June 30, 1991. Topics covered include large space structures and systems, space stations, extravehicular activity, thermal environments and control, tethering, spacecraft power supplies, structural concepts and control systems, electronics, advanced materials, propulsion, policies and international cooperation, vibration and dynamic controls, robotics and remote operations, data and communication systems, electric power generation, space commercialization, orbital transfer, and human factors engineering.

  19. Robotic System For Greenhouse Or Nursery

    NASA Technical Reports Server (NTRS)

    Gill, Paul; Montgomery, Jim; Silver, John; Heffelfinger, Neil; Simonton, Ward; Pease, Jim

    1993-01-01

    Report presents additional information about robotic system described in "Robotic Gripper With Force Control And Optical Sensors" (MFS-28537). "Flexible Agricultural Robotics Manipulator System" (FARMS) serves as a prototype of robotic systems intended to enhance the productivity of agricultural assembly-line-type facilities in large commercial greenhouses and nurseries.

  20. The Information System at CeSAM

    NASA Astrophysics Data System (ADS)

    Agneray, F.; Gimenez, S.; Moreau, C.; Roehlly, Y.

    2012-09-01

    Modern large observational programmes produce important amounts of data from various origins, and need high-level quality control, fast data access via easy-to-use graphical interfaces, as well as the ability to cross-correlate information coming from different observations. The Centre de donnéeS Astrophysique de Marseille (CeSAM) offers web access to VO-compliant information systems for the data of different projects (VVDS, HeDAM, EXODAT, HST-COSMOS,…), including ancillary data obtained outside Laboratoire d'Astrophysique de Marseille (LAM) control. The CeSAM information systems provide download of catalogues and additional services for searching, extracting, and displaying imaging and spectroscopic data through multi-criteria and Cone Search interfaces.

  1. The development and assessment of Web-based health information for a corporate Intranet--a pilot study.

    PubMed

    Matarrese, P; Helwig, A

    2000-01-01

    Consumers readily use the Internet for medical information, advice, and support. Studies of general clinic populations show that moderated Internet patient education systems can improve patient satisfaction and affect self-help behaviors. Many Americans have Internet access through their employers, and large corporations have often developed Intranets for employee information. There is little study of health information available online to employees through company Intranets. This study relates the development of an employer-sponsored online health education system, the effects of this system on employee satisfaction with their health care, and the potential effects on worker productivity.

  2. Clinical benchmarking enabled by the digital health record.

    PubMed

    Ricciardi, T N; Masarie, F E; Middleton, B

    2001-01-01

    Office-based physicians are often ill equipped to report aggregate information about their patients and practice of medicine, since their practices have relied upon paper records for the management of clinical information. Physicians who do not have access to large-scale information technology support can now benefit from low-cost clinical documentation and reporting tools. We developed a hosted clinical data mart for users of a web-enabled charting tool, targeting the solo or small group practice. The system uses secure Java Server Pages with a dashboard-like menu to provide point-and-click access to simple reports such as case mix, medications, utilization, productivity, and patient demographics in its first release. The system automatically normalizes user-entered clinical terms to enhance the quality of structured data. Individual providers benefit from rapid patient identification for disease management, quality of care self-assessments, drug recalls, and compliance with clinical guidelines. The system provides knowledge integration by linking to trusted sources of online medical information in context. Information derived from the clinical record is clinically more accurate than billing data. Provider self-assessment and benchmarking empowers physicians, who may resent "being profiled" by external entities. In contrast to large-scale data warehouse projects, the current system delivers immediate value to individual physicians who choose an electronic clinical documentation tool.

  3. Engineering research, development and technology FY99

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Langland, R T

    The growth of computer power and connectivity, together with advances in wireless sensing and communication technologies, is transforming the field of complex distributed systems. The ability to deploy large numbers of sensors with a rapid, broadband communication system will enable high-fidelity, near real-time monitoring of complex systems. These technological developments will provide unprecedented insight into the actual performance of engineered and natural environment systems, enable the evolution of many new types of engineered systems for monitoring and detection, and enhance our ability to perform improved and validated large-scale simulations of complex systems. One of the challenges facing engineering is to develop methodologies to exploit the emerging information technologies. Particularly important will be the ability to assimilate measured data into the simulation process in a way which is much more sophisticated than current, primarily ad hoc procedures. The reports contained in this section on the Center for Complex Distributed Systems describe activities related to the integrated engineering of large complex systems. The first three papers describe recent developments for each link of the integrated engineering process for large structural systems. These include (1) the development of model-based signal processing algorithms which will formalize the process of coupling measurements and simulation and provide a rigorous methodology for validation and update of computational models; (2) collaborative efforts with faculty at the University of California at Berkeley on the development of massive simulation models for the earth and large bridge structures; and (3) the development of wireless data acquisition systems which provide a practical means of monitoring large systems like the National Ignition Facility (NIF) optical support structures. 
    These successful developments are coming to a confluence in the next year with applications to NIF structural characterizations and analysis of large bridge structures for the State of California. Initial feasibility investigations into the development of monitoring and detection systems are described in the papers on imaging of underground structures with ground-penetrating radar, and the use of live insects as sensor platforms. These efforts are establishing the basic performance characteristics essential to the decision process for future development of sensor arrays for information gathering related to national security.

  4. Compact Multimedia Systems in Multi-chip Module Technology

    NASA Technical Reports Server (NTRS)

    Fang, Wai-Chi; Alkalaj, Leon

    1995-01-01

    This tutorial paper shows advanced multimedia system designs based on multi-chip module (MCM) technologies that provide essential computing, compression, communication, and storage capabilities for various large-scale information highway applications.

  5. The Role of Advanced Information System Technology in Remote Sensing for NASA's Earth Science Enterprise in the 21st Century

    NASA Technical Reports Server (NTRS)

    Prescott, Glenn; Komar, George (Technical Monitor)

    2001-01-01

    Future NASA Earth observing satellites will carry high-precision instruments capable of producing large amounts of scientific data. The strategy will be to network these instrument-laden satellites into a web-like array of sensors to facilitate the collection, processing, transmission, storage, and distribution of data and data products - the essential elements of what we refer to as "Information Technology." Many of these Information Technologies will enable the satellite and ground information systems to function effectively in real-time, providing scientists with the capability of customizing data collection activities on a satellite or group of satellites directly from the ground. In future systems, extremely large quantities of data collected by scientific instruments will require the fastest processors, the highest communication channel transfer rates, and the largest data storage capacity to ensure that data flows smoothly from the satellite-based instrument to the ground-based archive. Autonomous systems will control all essential processes and play a key role in coordinating the data flow through space-based communication networks. In this paper, we will discuss those critical information technologies for Earth observing satellites that will support the next generation of space-based scientific measurements of planet Earth, and ensure that data and data products provided by these systems will be accessible to scientists and the user community in general.

  6. Residential solar-heating system-design package

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Design package for modular solar heating system includes performance specifications, design data, installation guidelines, and other information that should be valuable to those interested in system (or similar systems) for projected installation. When installed in insulated "energy saver" home, system can supply large percentage of total energy needs of building.

  7. An interactive web-based system using cloud for large-scale visual analytics

    NASA Astrophysics Data System (ADS)

    Kaseb, Ahmed S.; Berry, Everett; Rozolis, Erik; McNulty, Kyle; Bontrager, Seth; Koh, Youngsol; Lu, Yung-Hsiang; Delp, Edward J.

    2015-03-01

    Network cameras have been growing rapidly in recent years. Thousands of public network cameras provide a tremendous amount of visual information about the environment. There is a need to analyze this valuable information for a better understanding of the world around us. This paper presents an interactive web-based system that enables users to execute image analysis and computer vision techniques on a large scale to analyze the data from more than 65,000 worldwide cameras. This paper focuses on how to use both the system's website and Application Programming Interface (API). Given a computer program that analyzes a single frame, the user needs to make only slight changes to the existing program and choose the cameras to analyze. The system handles the heterogeneity of the geographically distributed cameras, e.g., different brands and resolutions. The system allocates and manages Amazon EC2 and Windows Azure cloud resources to meet the analysis requirements.

  8. Mission to Planet Earth

    NASA Technical Reports Server (NTRS)

    Wilson, Gregory S.; Huntress, Wesley T.

    1990-01-01

    The rationale behind Mission to Planet Earth is presented, and the program plan is described in detail. NASA and its interagency and international partners will place satellites carrying advanced sensors in strategic earth orbits to collect multidisciplinary data. A sophisticated data system will process and archive an unprecedentedly large amount of information about the earth and how it functions as a system. Attention is given to the space observatories, the data and information systems, and the interdisciplinary research.

  9. Taking the Mystery Out of Research in Computing Information Systems: A New Approach to Teaching Research Paradigm Architecture.

    ERIC Educational Resources Information Center

    Heslin, J. Alexander, Jr.

    In senior-level undergraduate research courses in Computer Information Systems (CIS), students are required to read and assimilate a large volume of current research literature. One course objective is to demonstrate to the student that there are patterns or models or paradigms of research. A new approach in identifying research paradigms is…

  10. Clicking to Learn: A Case Study of Embedding Radio-Frequency Based Clickers in an Introductory Management Information Systems Course

    ERIC Educational Resources Information Center

    Nelson, Matthew L.; Hauck, Roslin V.

    2008-01-01

    The challenges associated with teaching a core introductory management information systems (MIS) course are well known (large class sizes serving a majority of non-MIS majors, sustaining student interests, encouraging class participation, etc.). This study offers a mechanism towards managing these challenges through the use of a simple and…

  11. Distributed resource allocation under communication constraints

    NASA Astrophysics Data System (ADS)

    Dodin, Pierre; Nimier, Vincent

    2001-03-01

    This paper deals with the multi-sensor management problem for multi-target tracking. Collaboration between several sensors observing the same target means that they can fuse their data during the information process, so this possibility must be taken into account when computing the optimal sensor-target association at each time step. To solve this problem for a real large-scale system, one must consider both the information aspect and the control aspect of the problem. To unify them, one possibility is to use a decentralized filtering algorithm locally driven by an assignment algorithm. The decentralized filtering algorithm we use in our model is Grime's filtering algorithm, which relaxes the usual fully connected hypothesis. By fully connected, one means that information in a fully connected system is totally distributed everywhere at the same moment, which is unacceptable for a real large-scale system. We model the distributed assignment decision with a greedy algorithm: each sensor performs a global optimization in order to estimate the other sensors' information sets. A consequence of relaxing the fully connected hypothesis is that the sensors' information sets are not the same at each time step, producing an information asymmetry in the system. The assignment algorithm uses local knowledge of this asymmetry. By testing the reactions and the coherence of the local assignment decisions of our system against maneuvering targets, we show that decentralized assignment control remains feasible even though the system is not fully connected.
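    The greedy assignment step can be sketched in a few lines, under the assumption that each sensor scores sensor-target pairs with a local utility (the matrix below is invented for illustration, e.g. expected information gain from local track covariances):

    ```python
    # Hypothetical utility matrix: util[s][t] = expected information gain
    # of assigning sensor s to target t.
    util = [
        [0.9, 0.4, 0.1],
        [0.3, 0.8, 0.5],
        [0.2, 0.6, 0.7],
    ]

    def greedy_assign(util):
        """Greedily pair sensors and targets in decreasing utility order,
        taking each sensor and each target at most once."""
        pairs, taken = {}, set()
        ranked = sorted(((u, s, t) for s, row in enumerate(util)
                         for t, u in enumerate(row)), reverse=True)
        for u, s, t in ranked:
            if s not in pairs and t not in taken:
                pairs[s] = t
                taken.add(t)
        return pairs

    print(greedy_assign(util))  # → {0: 0, 1: 1, 2: 2}
    ```

    In a decentralized setting without the fully connected hypothesis, each sensor would run such a rule on its own (possibly stale) estimate of the utilities, which is where the information asymmetry enters.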

  12. The role of health informatics in clinical audit: part of the problem or key to the solution?

    PubMed

    Georgiou, Andrew; Pearson, Michael

    2002-05-01

    The concepts of quality assurance (for which clinical audit is an essential part), evaluation and clinical governance each depend on the ability to derive and record measurements that describe clinical performance. Rapid IT developments have raised many new possibilities for managing health care. They have allowed for easier collection and processing of data in greater quantities. These developments have encouraged the growth of quality assurance as a key feature of health care delivery. In the past most of the emphasis has been on hospital information systems designed predominantly for the administration of patients and the management of financial performance. Large, hi-tech information system capacity does not guarantee quality information. The task of producing information that can be confidently used to monitor the quality of clinical care requires attention to key aspects of the design and operation of the audit. The Myocardial Infarction National Audit Project (MINAP) utilizes an IT-based system to collect and process data on large numbers of patients and make them readily available to contributing hospitals. The project shows that IT systems that employ rigorous health informatics methodologies can do much to improve the monitoring and provision of health care.

  13. Spatial Variation in the Invertebrate Macrobenthos of Three Large Missouri River Reservoirs

    EPA Science Inventory

    Benthic macroinvertebrate assemblages are useful indicators of ecological condition for aquatic systems. This study was conducted to characterize benthic communities of three large reservoirs on the Missouri River. The information collected on abundance, distribution and varia...

  14. BIOME: A browser-aware search and order system

    NASA Technical Reports Server (NTRS)

    Grubb, Jon W.; Jennings, Sarah V.; Yow, Teresa G.; Daughterty, Patricia F.

    1996-01-01

    The Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC), which is associated with NASA's Earth Observing System Data and Information System (EOSDIS), provides access to a large number of tabular and imagery datasets used in ecological and environmental research. Because of its large and diverse data holdings, the challenge for the ORNL DAAC is to help users find data of interest from the hundreds of thousands of files available at the DAAC without overwhelming them. Therefore, the ORNL DAAC developed the Biogeochemical Information Ordering Management Environment (BIOME), a search and order system for the World Wide Web (WWW). The WWW provides a new vehicle that allows a wide range of users access to the data. This paper describes the specialized attributes incorporated into BIOME that allow researchers easy access to an otherwise bewildering array of data products.

  15. Case studies of traffic monitoring programs in large urban areas

    DOT National Transportation Integrated Search

    1997-07-01

    This is one of two documents prepared by the Center for Transportation Information of the Volpe National Transportation Systems Center in support of the Federal Highway Administration's Office of Highway Information Management. This report presents t...

  16. Computerized physician order entry from a chief information officer perspective.

    PubMed

    Cotter, Carole M

    2004-12-01

    Designing and implementing a computerized physician order entry system in the critical care units of a large urban hospital system is an enormous undertaking. With their significant potential to improve health care and significantly reduce errors, the time for computerized physician order entry or physician order management systems is past due. Careful integrated planning is the key to success, requiring multidisciplinary teams at all levels of clinical and administrative management to work together. Articulated from the viewpoint of the Chief Information Officer of Lifespan, a not-for-profit hospital system in Rhode Island, the vision and strategy preceding the information technology plan, understanding the system's current state, the gap analysis between current and future state, and finally, building and implementing the information technology plan are described.

  17. International Radiation Monitoring and Information System (IRMIS)

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Sanjoy; Baciu, Florian; Stowisek, Jan; Saluja, Gurdeep; Kenny, Patrick; Albinet, Franck

    2017-09-01

    This article describes the International Radiation Monitoring Information System (IRMIS), which was developed by the International Atomic Energy Agency (IAEA) with the goal of providing Competent Authorities, the IAEA, and other international organizations with a client-server based web application to share and visualize large quantities of radiation monitoring data. The data maps the areas of potential impact and can assist countries in taking appropriate protective actions in an emergency. Ever since the Chernobyl nuclear power plant accident in April 1986, the European Community (EC) has worked towards collecting routine environmental radiological monitoring data from national networked monitoring systems. The European Radiological Data Exchange Platform (EURDEP) was created in 1995 to that end, to provide radiation monitoring data from most European countries in nearly real time. During the response operations for the Fukushima Dai-ichi nuclear power plant accident (March 2011), the IAEA Incident and Emergency Centre (IEC) managed, harmonized, and shared the large amount of data being generated by different organizations. This task underscored the need for a system which allows sharing large volumes of radiation monitoring data in an emergency. In 2014 EURDEP started the submission of European radiological data to the International Radiation Monitoring Information System (IRMIS) as a European regional hub for IRMIS. IRMIS supports the implementation of the Convention on Early Notification of a Nuclear Accident by providing a web application for the reporting, sharing, visualizing, and analysing of large quantities of environmental radiation monitoring data during nuclear or radiological emergencies. IRMIS is not an early warning system that automatically reports when there are significant deviations in radiation levels or when values are detected above certain levels. 
    However, the configuration of the visualization features offered by IRMIS may help Member States to determine where elevated gamma dose rate measurements during a radiological or nuclear emergency indicate that actions to protect the public are necessary. The data can be used to assist emergency responders in determining where and when to take necessary actions to protect the public. This new web-based tool supports the IAEA's Unified System for Information Exchange in Incidents and Emergencies (USIE), an online tool where competent authorities can access information about all emergency situations, ranging from a lost radioactive source to a full-scale nuclear emergency.

  18. Information Extraction from Unstructured Text for the Biodefense Knowledge Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samatova, N F; Park, B; Krishnamurthy, R

    2005-04-29

    The Bio-Encyclopedia at the Biodefense Knowledge Center (BKC) is being constructed to allow early detection of emerging biological threats to homeland security. It requires highly structured information extracted from a variety of data sources. However, the quantity of new and vital information available from everyday sources cannot be assimilated by hand, so reliable high-throughput information extraction techniques are needed. In support of the BKC, Lawrence Livermore National Laboratory and Oak Ridge National Laboratory, together with the University of Utah, are developing an information extraction system built around the bioterrorism domain. This paper reports two important pieces of our effort integrated in the system: key phrase extraction and semantic tagging. Whereas the two key phrase extraction technologies developed during the course of the project help identify relevant texts, our state-of-the-art semantic tagging system can pinpoint phrases related to emerging biological threats. We are also enhancing and tailoring the Bio-Encyclopedia by augmenting semantic dictionaries and extracting details of important events, such as suspected disease outbreaks. Some of these technologies have already been applied to large corpora of free-text sources vital to the BKC mission, including ProMED-mail, PubMed abstracts, and the DHS's Information Analysis and Infrastructure Protection (IAIP) news clippings. To address the challenges involved in incorporating such large amounts of unstructured text, the overall system focuses on precise extraction of the most relevant information for inclusion in the BKC.
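The key phrase extraction step can be illustrated with a minimal sketch: rank terms that are frequent in a document but rare in a background corpus. The scoring here is a plain TF-IDF heuristic, not the BKC system's actual technology, and all strings are invented.

```python
import math
import re
from collections import Counter

def key_phrases(doc, background, top_n=3):
    """Rank single terms by a TF-IDF-style score: frequent in the
    document, rare in the background corpus."""
    tokens = re.findall(r"[a-z]+", doc.lower())
    tf = Counter(tokens)
    n_docs = len(background)
    scores = {}
    for term, freq in tf.items():
        # Crude substring test for document frequency, for brevity.
        df = sum(1 for d in background if term in d.lower())
        scores[term] = freq * math.log((1 + n_docs) / (1 + df))
    return [t for t, _ in sorted(scores.items(), key=lambda kv: -kv[1])[:top_n]]

# key_phrases("anthrax outbreak reported anthrax spores found",
#             ["weather report for today", "sports scores reported today"])
# ranks "anthrax" first
```

Production systems replace this with phrase-level candidates and trained models, but the frequency-contrast idea is the same.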

  19. The Influence of Organizational Systems on Information Exchange in Long-Term Care Facilities: An Institutional Ethnography.

    PubMed

    Caspar, Sienna; Ratner, Pamela A; Phinney, Alison; MacKinnon, Karen

    2016-06-01

    Person-centered care is heavily dependent on effective information exchange among health care team members. We explored the organizational systems that influence resident care attendants' (RCAs) access to care information in long-term care (LTC) settings. We conducted an institutional ethnography in three LTC facilities. Investigative methods included naturalistic observations, in-depth interviews, and textual analysis. Practical access to texts containing individualized care-related information (e.g., care plans) was dependent on job classification. Regulated health care professionals accessed these texts daily. RCAs lacked practical access to these texts and primarily received and shared information orally. Microsystems of care, based on information exchange formats, emerged. Organizational systems mandated written exchange of information and did not formally support an oral exchange. Thus, oral information exchanges were largely dependent on the quality of workplace relationships. Formal systems are needed to support structured oral information exchange within and between the microsystems of care found in LTC. © The Author(s) 2016.

  20. Automation and hypermedia technology applications

    NASA Technical Reports Server (NTRS)

    Jupin, Joseph H.; Ng, Edward W.; James, Mark L.

    1993-01-01

    This paper is a progress report on HyLite (Hypermedia Library technology), a research and development activity to produce a versatile system as part of NASA's technology thrusts in automation, information sciences, and communications. HyLite can be used as a system or tool to facilitate the creation and maintenance of large distributed electronic libraries. The contents of such a library may be software components, hardware parts or designs, scientific data sets or databases, configuration management information, etc. The proliferation of computer use has made the diversity and quantity of information too large for any single user to sort, process, and utilize effectively. In response to this information deluge, we have created HyLite to enable the user to organize relevant information more efficiently for presentation, retrieval, and readability. To this end, we have incorporated various AI techniques into the HyLite hypermedia engine to enhance the system's capabilities. These techniques include intelligent search tools for the libraries, intelligent retrieval, and navigational assistance based on user histories. HyLite itself is based on an earlier project, the Encyclopedia of Software Components (ESC), which used hypermedia to facilitate and encourage software reuse.

  1. KA-SB: from data integration to large scale reasoning

    PubMed Central

    Roldán-García, María del Mar; Navas-Delgado, Ismael; Kerzazi, Amine; Chniber, Othmane; Molina-Castro, Joaquín; Aldana-Montes, José F

    2009-01-01

    Background The analysis of information in the biological domain is usually focused on data from single on-line data sources. Unfortunately, studying a biological process requires access to disperse, heterogeneous, autonomous data sources. In this context, an analysis of the information is not possible without the integration of such data. Methods KA-SB is a querying and analysis system for end users that combines a data integration solution with a reasoner. The tool operates in two steps: 1) KOMF, the Khaos Ontology-based Mediator Framework, retrieves information from heterogeneous and distributed databases; 2) the integrated information is crystallized in a persistent, high-performance reasoner (DBOWL), where it can be further analyzed by means of querying and reasoning. Results In this paper we present a novel system that combines a mediation system with the reasoning capabilities of a large scale reasoner to provide a way of finding new knowledge and of analyzing the integrated information from different databases, which is retrieved as a set of ontology instances. The tool uses a graphical query interface that shows a graphical representation of the ontology and allows users to build queries by clicking on the ontology concepts. Conclusion These kinds of systems (based on KOMF) will provide users with very large amounts of information (interpreted as ontology instances once retrieved), which cannot be managed using traditional main-memory-based reasoners. We propose a process for creating persistent and scalable knowledge bases from sets of OWL instances obtained by integrating heterogeneous data sources with KOMF. This process has been applied to develop a demo tool, which uses the BioPax Level 3 ontology as the integration schema and integrates the UNIPROT, KEGG, CHEBI, BRENDA and SABIORK databases. PMID:19796402
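The kind of inference such a reasoner performs can be shown in miniature: a query for a concept should also return instances of its subclasses. The class hierarchy and instance identifiers below are invented for illustration; KA-SB itself stores OWL instances in the DBOWL reasoner.

```python
# Invented subclass hierarchy (child -> parent) and typed instances.
subclass = {
    "Enzyme": "Protein",
    "Protein": "PhysicalEntity",
}

instances = {
    "P12345": "Enzyme",
    "P67890": "Protein",
    "C00031": "SmallMolecule",
}

def is_a(cls, target):
    """True if cls equals target or is a (transitive) subclass of it."""
    while cls is not None:
        if cls == target:
            return True
        cls = subclass.get(cls)
    return False

def query(target):
    """All instances whose asserted class entails the target class."""
    return sorted(i for i, c in instances.items() if is_a(c, target))

# query("Protein") -> ['P12345', 'P67890']  (the Enzyme is inferred to be a Protein)
```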

  2. Military helicopter cockpit modernization

    NASA Astrophysics Data System (ADS)

    Hall, Andrew S.

    2001-09-01

    This paper describes some of the initiatives being progressed by Smiths Aerospace to enhance the operational effectiveness of military helicopters, with particular emphasis on the GWHL Lynx and EH Industries EH101 programs. The areas discussed include engine instrumentation, flight instrumentation and the mission system displays. Various crew stations are described which provide a suite of AMLCD displays that: integrate information from the aircraft engine, electrical power and hydraulic systems onto 5ATI displays; integrate primary flight, navigation and mission system sensor information onto large-area (6¼ in square or 6 in by 8 in) displays; and provide standby attitude and air data information on 3ATI displays in the event of a major system failure.

  3. System of Programmed Modules for Measuring Photographs with a Gamma-Telescope

    NASA Technical Reports Server (NTRS)

    Averin, S. A.; Veselova, G. V.; Navasardyan, G. V.

    1978-01-01

    Physical experiments using tracking cameras have produced hundreds of thousands of stereo photographs of events. To process such a large volume of information, automatic and semiautomatic measuring systems are required. At the Institute of Space Research of the Academy of Sciences of the USSR, a system for processing film information from the spark gamma-telescope was developed. The system is based on a BPS-75 projector operating in line with the Elektronika 1001 minicomputer. The report describes this system and discusses the various computer programs available to the operators.

  4. Review of integrated digital systems: evolution and adoption

    NASA Astrophysics Data System (ADS)

    Fritz, Lawrence W.

    The factors influencing the evolution of photogrammetric and remote sensing technology into fully integrated digital systems are reviewed. These factors include societal pressures for new, more timely digital products from the Spatial Information Sciences and the adoption of rapid technological advancements in digital processing hardware and software. Current major developments in leading government mapping agencies of the USA, such as the Digital Production System (DPS) modernization programme at the Defense Mapping Agency, and the Automated Nautical Charting System II (ANCS-II) programme and Integrated Digital Photogrammetric Facility (IDPF) at NOAA/National Ocean Service, illustrate the significant benefits to be realized. These programmes are examples of different levels of integrated systems designed to produce digital products. They provide insights into the management complexities to be considered for very large integrated digital systems. In recognition of computer industry trends, a knowledge-based architecture for managing the complexity of the very large spatial information systems of the future is proposed.

  5. A biomedical information system for retrieval and manipulation of NHANES data.

    PubMed

    Mukherjee, Sukrit; Martins, David; Norris, Keith C; Jenders, Robert A

    2013-01-01

    The retrieval and manipulation of data from large public databases like the U.S. National Health and Nutrition Examination Survey (NHANES) may require sophisticated statistical software and significant expertise that may be unavailable in the university setting. In response, we have developed the Data Retrieval And Manipulation System (DReAMS), an automated information system that handles all processes of data extraction and cleaning and then joins different subsets to produce analysis-ready output. The system is a browser-based data warehouse application in which input data from flat files or operational systems are aggregated in a structured way so that the desired data can be read, recoded, queried and extracted efficiently. The current pilot implementation provides access to a limited portion of the NHANES database. We plan to increase the amount of data available through the system in the near future and to extend the techniques to other large databases from the CDU archive, which currently holds about 53 databases.
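The subset-joining step such a system performs can be sketched as an inner join on NHANES's respondent sequence number (SEQN). The field names follow NHANES conventions, but the records and the implementation are illustrative only, not DReAMS internals.

```python
# Two invented NHANES-style subsets keyed by SEQN:
# demographics (gender, age) and blood pressure (systolic reading).
demo = {73557: {"RIAGENDR": 1, "RIDAGEYR": 69},
        73558: {"RIAGENDR": 2, "RIDAGEYR": 54}}
bp   = {73557: {"BPXSY1": 122}, 73559: {"BPXSY1": 138}}

def inner_join(left, right):
    """Keep respondents present in both subsets, merging their fields."""
    return {seqn: {**left[seqn], **right[seqn]}
            for seqn in sorted(left.keys() & right.keys())}

merged = inner_join(demo, bp)
# merged -> {73557: {'RIAGENDR': 1, 'RIDAGEYR': 69, 'BPXSY1': 122}}
```

An analysis-ready output would then be the merged records exported as a flat table.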

  6. Challenges in Extracting Information From Large Hydrogeophysical-monitoring Datasets

    NASA Astrophysics Data System (ADS)

    Day-Lewis, F. D.; Slater, L. D.; Johnson, T.

    2012-12-01

    Over the last decade, new automated geophysical data-acquisition systems have enabled collection of increasingly large and information-rich geophysical datasets. Concurrent advances in field instrumentation, web services, and high-performance computing have made real-time processing, inversion, and visualization of large three-dimensional tomographic datasets practical. Geophysical-monitoring datasets have provided high-resolution insights into diverse hydrologic processes including groundwater/surface-water exchange, infiltration, solute transport, and bioremediation. Despite the high information content of such datasets, extraction of quantitative or diagnostic hydrologic information is challenging. Visual inspection and interpretation for specific hydrologic processes is difficult for datasets that are large, complex, and (or) affected by forcings (e.g., seasonal variations) unrelated to the target hydrologic process. New strategies are needed to identify salient features in spatially distributed time-series data and to relate temporal changes in geophysical properties to hydrologic processes of interest while effectively filtering unrelated changes. Here, we review recent work using time-series and digital-signal-processing approaches in hydrogeophysics. Examples include applications of cross-correlation, spectral, and time-frequency (e.g., wavelet and Stockwell transforms) approaches to (1) identify salient features in large geophysical time series; (2) examine correlation or coherence between geophysical and hydrologic signals, even in the presence of non-stationarity; and (3) condense large datasets while preserving information of interest. Examples demonstrate analysis of large time-lapse electrical tomography and fiber-optic temperature datasets to extract information about groundwater/surface-water exchange and contaminant transport.
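As a concrete illustration of the cross-correlation approach mentioned above, the sketch below estimates the lag between a hydrologic driving signal and a delayed response. The signals are synthetic and the implementation is a plain Pearson correlation scan, not the authors' workflow.

```python
import math

def pearson(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def best_lag(x, y, max_lag):
    """Lag k (in samples) maximizing corr(x[i], y[i+k]); a positive
    result means y lags behind x."""
    def corr(k):
        pairs = [(x[i], y[i + k]) for i in range(len(x))
                 if 0 <= i + k < len(y)]
        a, b = zip(*pairs)
        return pearson(a, b)
    return max(range(-max_lag, max_lag + 1), key=corr)

# A synthetic river-stage pulse and its delayed subsurface response:
stage = [0, 1, 4, 9, 4, 1, 0, 0, 0, 0]
temp  = [0, 0, 0, 0, 1, 4, 9, 4, 1, 0]   # same pulse, 3 samples later
# best_lag(stage, temp, 5) -> 3
```

Real monitoring datasets would require detrending and handling of non-stationarity (hence the wavelet and Stockwell approaches the abstract cites) before such a scan is meaningful.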

  7. A means to an end: a web-based client management system in palliative care.

    PubMed

    O'Connor, Margaret; Erwin, Trudy; Dawson, Linda

    2009-03-01

    Home-based palliative care (hospice) services require comprehensive and fully integrated information systems to develop and manage the various aspects of their business, incorporating client data and management information. These systems assist in maintaining the quality of client care as well as improved management efficiencies. This article reports on a large not-for-profit home-based palliative care service in Australia, which embarked on a project to develop an electronic data management system specifically designed to meet the needs of the palliative care sector. This web-based client information management system represents a joint venture between the organization and a commercial company and has been a very successful project.

  8. Integrated information in discrete dynamical systems: motivation and theoretical framework.

    PubMed

    Balduzzi, David; Tononi, Giulio

    2008-06-13

    This paper introduces a time- and state-dependent measure of integrated information, phi, which captures the repertoire of causal states available to a system as a whole. Specifically, phi quantifies how much information is generated (uncertainty is reduced) when a system enters a particular state through causal interactions among its elements, above and beyond the information generated independently by its parts. Such mathematical characterization is motivated by the observation that integrated information captures two key phenomenological properties of consciousness: (i) there is a large repertoire of conscious experiences so that, when one particular experience occurs, it generates a large amount of information by ruling out all the others; and (ii) this information is integrated, in that each experience appears as a whole that cannot be decomposed into independent parts. This paper extends previous work on stationary systems and applies integrated information to discrete networks as a function of their dynamics and causal architecture. An analysis of basic examples indicates the following: (i) phi varies depending on the state entered by a network, being higher if active and inactive elements are balanced and lower if the network is inactive or hyperactive. (ii) phi varies for systems with identical or similar surface dynamics depending on the underlying causal architecture, being low for systems that merely copy or replay activity states. (iii) phi varies as a function of network architecture. High phi values can be obtained by architectures that conjoin functional specialization with functional integration. Strictly modular and homogeneous systems cannot generate high phi because the former lack integration, whereas the latter lack information. Feedforward and lattice architectures are capable of generating high phi but are inefficient. 
(iv) In Hopfield networks, phi is low for attractor states and neutral states, but increases if the networks are optimized to achieve tension between local and global interactions. These basic examples appear to match well against neurobiological evidence concerning the neural substrates of consciousness. More generally, phi appears to be a useful metric to characterize the capacity of any physical system to integrate information.
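The notion of information generated by entering a state can be sketched for a toy deterministic network: with a uniform prior over possible prior states, observing state s generates log2(number of states) minus log2(number of states that map to s) bits. This shows only the whole-system term; phi additionally subtracts the information generated independently by the parts. The two-node update rule below is invented.

```python
import math
from itertools import product

def step(state):
    """Toy deterministic update rule (invented): node A copies node B,
    node B computes A AND B."""
    a, b = state
    return (b, a & b)

states = list(product([0, 1], repeat=2))

def info_generated(s):
    """Bits generated when the system is observed in state s:
    log2(#states) - log2(#prior states that lead to s)."""
    causes = [x for x in states if step(x) == s]
    if not causes:
        return None                     # s cannot be reached
    return math.log2(len(states)) - math.log2(len(causes))

# info_generated((1, 1)) -> 2.0  (a unique cause: the state rules out all others)
# info_generated((0, 0)) -> 1.0  (two possible causes remain)
```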

  9. JUICE: a data management system that facilitates the analysis of large volumes of information in an EST project workflow.

    PubMed

    Latorre, Mariano; Silva, Herman; Saba, Juan; Guziolowski, Carito; Vizoso, Paula; Martinez, Veronica; Maldonado, Jonathan; Morales, Andrea; Caroca, Rodrigo; Cambiazo, Veronica; Campos-Vargas, Reinaldo; Gonzalez, Mauricio; Orellana, Ariel; Retamales, Julio; Meisel, Lee A

    2006-11-23

    Expressed sequence tag (EST) analyses provide a rapid and economical means to identify candidate genes that may be involved in a particular biological process. These ESTs are useful in many Functional Genomics studies. However, the large quantity and complexity of the data generated during an EST sequencing project can make the analysis of this information a daunting task. In an attempt to make this task friendlier, we have developed JUICE, an open source data management system (Apache + PHP + MySQL on Linux), which enables the user to easily upload, organize, visualize and search the different types of data generated in an EST project pipeline. In contrast to other systems, the JUICE data management system allows a branched pipeline to be established, modified and expanded, during the course of an EST project. The web interfaces and tools in JUICE enable the users to visualize the information in a graphical, user-friendly manner. The user may browse or search for sequences and/or sequence information within all the branches of the pipeline. The user can search using terms associated with the sequence name, annotation or other characteristics stored in JUICE and associated with sequences or sequence groups. Groups of sequences can be created by the user, stored in a clipboard and/or downloaded for further analyses. Different user profiles restrict the access of each user depending upon their role in the project. The user may have access exclusively to visualize sequence information, access to annotate sequences and sequence information, or administrative access. JUICE is an open source data management system that has been developed to aid users in organizing and analyzing the large amount of data generated in an EST Project workflow. JUICE has been used in one of the first functional genomics projects in Chile, entitled "Functional Genomics in nectarines: Platform to potentiate the competitiveness of Chile in fruit exportation". 
However, due to its ability to organize and visualize data from external pipelines, JUICE is a flexible data management system that should be useful for other EST/Genome projects. The JUICE data management system is released under the Open Source GNU Lesser General Public License (LGPL). JUICE may be downloaded from http://genoma.unab.cl/juice_system/ or http://www.genomavegetal.cl/juice_system/.
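The search facility described above amounts to a term lookup over sequence names and annotations. A minimal sketch, with invented records rather than JUICE's PHP/MySQL implementation:

```python
# Invented EST records; JUICE stores these in MySQL and serves them via PHP.
ests = [
    {"name": "EST_0001", "annotation": "putative cell wall invertase"},
    {"name": "EST_0002", "annotation": "heat shock protein 70"},
    {"name": "EST_0003", "annotation": "vacuolar invertase"},
]

def search(term):
    """Case-insensitive lookup of a term in sequence names or annotations."""
    term = term.lower()
    return [e["name"] for e in ests
            if term in e["name"].lower() or term in e["annotation"].lower()]

# search("invertase") -> ['EST_0001', 'EST_0003']
```

The matching names would then populate a "clipboard" group for download or further analysis, as the abstract describes.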

  10. Computers in imaging and health care: now and in the future.

    PubMed

    Arenson, R L; Andriole, K P; Avrin, D E; Gould, R G

    2000-11-01

    Early picture archiving and communication systems (PACS) were characterized by the use of very expensive hardware devices, cumbersome display stations, duplication of database content, lack of interfaces to other clinical information systems, and immaturity in their understanding of the folder manager concepts and workflow reengineering. They were implemented historically at large academic medical centers by biomedical engineers and imaging informaticists. PACS were nonstandard, home-grown projects with mixed clinical acceptance. However, they clearly showed the great potential for PACS and filmless medical imaging. Filmless radiology is a reality today. The advent of efficient softcopy display of images provides a means for dealing with the ever-increasing number of studies and number of images per study. Computer power has increased, and archival storage cost has decreased to the extent that the economics of PACS is justifiable with respect to film. Network bandwidths have increased to allow large studies of many megabytes to arrive at display stations within seconds of examination completion. PACS vendors have recognized the need for efficient workflow and have built systems with intelligence in the management of patient data. Close integration with the hospital information system (HIS)-radiology information system (RIS) is critical for system functionality. Successful implementation of PACS requires integration or interoperation with hospital and radiology information systems. Besides the economic advantages, secure rapid access to all clinical information on patients, including imaging studies, anytime and anywhere, enhances the quality of patient care, although it is difficult to quantify. Medical image management systems are maturing, providing access outside of the radiology department to images and clinical information throughout the hospital or the enterprise via the Internet. 
Small and medium-sized community hospitals, private practices, and outpatient centers in rural areas will begin realizing the benefits of PACS already realized by large tertiary care academic medical centers and research institutions. Hand-held devices and the World Wide Web are going to change the way people communicate and do business. The impact on health care, including radiology, will be huge. Computer-aided diagnosis, decision support tools, virtual imaging, and guidance systems will transform our practice as value-added applications utilizing the technologies pushed by PACS development efforts. Outcomes data and the electronic medical record (EMR) will drive our interactions with referring physicians, and we expect the radiologist to become the informaticist, a new version of the medical management consultant.

  11. Thermal Environment for Classrooms. Central System Approach to Air Conditioning.

    ERIC Educational Resources Information Center

    Triechler, Walter W.

    This speech compares the air conditioning requirements of high-rise office buildings with those of large centralized school complexes. A description of one particular air conditioning system provides information about the system's arrangement, functions, performance efficiency, and cost effectiveness. (MLF)

  12. Information Retrieval Strategies of Millennial Undergraduate Students in Web and Library Database Searches

    ERIC Educational Resources Information Center

    Porter, Brandi

    2009-01-01

    Millennial students make up a large portion of undergraduate students attending colleges and universities, and they have a variety of online resources available to them to complete academically related information searches, primarily Web based and library-based online information retrieval systems. The content, ease of use, and required search…

  13. HPAC Info-dex 6: Manufacturers' product information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-06-01

    This is a listing of manufacturers' product information published by Heating, Piping, and Air Conditioning magazine. The information details ranges of capacities, sizes, and other data needed for the selection and application of these products in mechanical systems for large plants and buildings. This listing is cross-referenced to other indexes published by HPAC magazine.

  14. Designing an Integrated System of Databases: A Workstation for Information Seekers.

    ERIC Educational Resources Information Center

    Micco, Mary; Smith, Irma

    1987-01-01

    Proposes a framework for the design of a full function workstation for information retrieval based on study of information seeking behavior. A large amount of local storage of the CD-ROM jukebox variety and full networking capability to both local and external databases are identified as requirements of the prototype. (MES)

  15. Mechanisation and Automation of Information Library Procedures in the USSR.

    ERIC Educational Resources Information Center

    Batenko, A. I.

    Scientific and technical libraries represent a fundamental link in a complex information storage and retrieval system. The handling of a large volume of scientific and technical data and provision of information library services requires the utilization of computing facilities and automation equipment, and was started in the Soviet Union on a…

  16. Evolution of Information Systems Curriculum in an Australian University over the Last Twenty-Five Years

    NASA Astrophysics Data System (ADS)

    Tatnall, Arthur; Burgess, Stephen

    Information Systems (IS) courses began in Australia's higher education institutions in the 1960s and have continued to evolve at a rapid rate since then. Spurred by the Australian Commonwealth Government's need for a large number of computer professionals, Information Systems (or Business Computing) courses developed rapidly. The nature and content of these courses in the 1960s and 70s, however, was quite different from that of present courses, and this paper traces this change and the reasons for it. After some brief discussion of the beginnings and early days of Information Systems curriculum, we address in particular how these courses have evolved in one Australian university over the last 25 years. IS curriculum is seen to adapt: new materials are added and emphases change as new technologies and new computing applications emerge. The paper offers a model of how curriculum change in Information Systems takes place.

  17. Agile Data Management with the Global Change Information System

    NASA Astrophysics Data System (ADS)

    Duggan, B.; Aulenbach, S.; Tilmes, C.; Goldstein, J.

    2013-12-01

    We describe experiences applying agile software development techniques to the realm of data management during the development of the Global Change Information System (GCIS), a web service and API for authoritative global change information under development by the US Global Change Research Program. Some of the challenges during system design and implementation have been: (1) balancing the need for a rigorous mechanism for ensuring information quality with the realities of large data sets whose contents are often in flux, (2) utilizing existing data to inform decisions about the scope and nature of new data, and (3) continuously incorporating new knowledge and concepts into a relational data model. The workflow for managing the content of the system has much in common with the development of the system itself. We examine various aspects of agile software development and discuss whether or how we have been able to use them for data curation as well as software development.
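A web service and API of this kind is typically consumed through URL-addressable JSON resources. The sketch below only constructs such request URLs; the endpoint path and parameters are assumptions for illustration, not the documented GCIS routes.

```python
from urllib.parse import urlencode

BASE = "https://data.globalchange.gov"   # GCIS host; the route below is assumed

def report_url(report_id, **params):
    """Build a URL for a hypothetical JSON report resource."""
    query = f"?{urlencode(params)}" if params else ""
    return f"{BASE}/report/{report_id}.json{query}"

# report_url("nca3", page=2)
# -> "https://data.globalchange.gov/report/nca3.json?page=2"
```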

  18. Information System and Geographic Information System Tools in the Data Analyses of the Control Program for Visceral Leishmaniases from 2006 to 2010 in the Sanitary District of Venda Nova, Belo Horizonte, Minas Gerais, Brazil

    PubMed Central

    Saraiva, Lara; Leite, Camila Gonçalves; de Carvalho, Luiz Otávio Alves; Andrade Filho, José Dilermando; de Menezes, Fernanda Carvalho; Fiúza, Vanessa de Oliveira Pires

    2012-01-01

    The aim of this paper is to report a brief history of control actions for Visceral Leishmaniasis (VL) from 2006 to 2010 in the Sanitary District (DS) of Venda Nova, Belo Horizonte, Minas Gerais, Brazil, focusing on the use of information systems and Geographic Information System (GIS) tools. The analyses showed that the use of an automated database allied with geoprocessing tools may favor control measures for VL, especially with regard to the evaluation of the control actions carried out. Descriptive analyses of the control measures indicated that the information system and GIS tools promoted greater efficiency in decision making and activity planning. These analyses also pointed to the necessity of new approaches to the control of VL in large urban centers. PMID:22518168

  19. End-user interest in geotechnical data management systems.

    DOT National Transportation Integrated Search

    2008-12-01

    In conducting geotechnical site investigations, large volumes of subsurface information and associated test data are generated. The current practice relies on paper-based filing systems that are often difficult and cumbersome to access by users. ...

  20. ERP implementation in rural health care.

    PubMed

    Trimmer, Kenneth J; Pumphrey, Lela D; Wiggins, Carla

    2002-01-01

    Enterprise resource planning (ERP) systems provide organizations with the opportunity to integrate individual, functionally-oriented information systems. Although much of the focus in the popular press has been placed on ERP systems in large for-profit organizations, small hospitals and clinics are also candidates for ERP systems. Focusing information systems on critical success factors (CSFs) allows the organization to address a limited number of areas associated with performance. This limited number of factors can provide management with insight into the dimensions of information that must be addressed by a system. This article focuses on CSFs for small health-care organizations and also considers factors critical to the implementation of health-care information systems. Two cases are presented. The results indicate support for the continuing use of CSFs to help focus on the benefits of ERPs. Focusing on groups of tangible and intangible benefits can also assist the rural health-care organization in the use of ERPs.

  1. Meta Data Mining in Earth Remote Sensing Data Archives

    NASA Astrophysics Data System (ADS)

    Davis, B.; Steinwand, D.

    2014-12-01

    Modern search and discovery tools for satellite-based remote sensing data are often catalog based and rely on query systems that use scene- (or granule-) based metadata. While these traditional catalog systems are often robust, very little has been done in the way of metadata mining to aid in the search and discovery process. The recently coined term "Big Data" can be applied to the remote sensing world's efforts to derive information from its vast holdings of satellite-based land remote sensing data. Large catalog-based search and discovery systems such as the United States Geological Survey's Earth Explorer system and the NASA Earth Observing System Data and Information System's Reverb-ECHO system provide comprehensive access to these data holdings but do little to expose the underlying scene-based metadata. These catalog-based systems are extremely flexible but are manually intensive and often require a high level of user expertise. Exposing scene-based metadata to external, web-based services can enable machine-driven queries to aid in the search and discovery process. Furthermore, services that expose additional scene-based content (such as product quality information) are now available and can provide a "deeper look" into remote sensing archives too large for efficient manual search. This presentation shows examples of mining Landsat and ASTER scene-based metadata, and an experimental service using OPeNDAP to extract information from the quality bands of multiple granules in the MODIS archive.
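A machine-driven query over exposed scene-based metadata can be as simple as filtering and ranking on a field such as cloud cover. The records below are invented, though the field names follow common Landsat metadata conventions.

```python
# Invented scene metadata records for one Landsat path/row.
scenes = [
    {"scene_id": "LC80290302014135LGN00", "cloud_cover": 12.4, "path": 29, "row": 30},
    {"scene_id": "LC80290302014151LGN00", "cloud_cover": 67.0, "path": 29, "row": 30},
    {"scene_id": "LC80290302014167LGN00", "cloud_cover": 3.1,  "path": 29, "row": 30},
]

def clearest(scenes, max_cloud=20.0):
    """Scene IDs under the cloud-cover threshold, clearest first."""
    ok = [s for s in scenes if s["cloud_cover"] <= max_cloud]
    return [s["scene_id"] for s in sorted(ok, key=lambda s: s["cloud_cover"])]
```

Running such a filter as a web service over the full archive is what lets a machine, rather than an expert user, shortlist usable scenes.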

  2. DEVELOPMENT OF PROTOCOLS TO IDENTIFY CRITICAL ECOSYSTEMS

    EPA Science Inventory

    Healthy, functioning ecosystems are critical to the sustainability of human and natural communities, but the identification of areas of healthy ecosystems in an area as large as Region 5 is difficult due to time and information constraints. Geographic Information Systems (GIS) a...

  3. Knowledge structure representation and automated updates in intelligent information management systems

    NASA Technical Reports Server (NTRS)

    Corey, Stephen; Carnahan, Richard S., Jr.

    1990-01-01

    A continuing effort to apply rapid prototyping and Artificial Intelligence techniques to problems associated with projected Space Station-era information management systems is examined. In particular, timely updating of the various databases and knowledge structures within the proposed intelligent information management system (IIMS) is critical to support decision making processes. Because of the large amounts of data entering the IIMS on a daily basis, information updates will need to be performed automatically, with some systems requiring that data be incorporated and made available to users within a few hours. Meeting these demands depends, first, on the design and implementation of information structures that are easily modified and expanded, and, second, on the incorporation of intelligent automated update techniques that will allow meaningful information relationships to be established. Potential techniques are studied for developing such an automated update capability, and IIMS update requirements are examined in light of results obtained from the IIMS prototyping effort.

  4. Environmental factor(tm) system: RCRA hazardous waste handler information (on CD-ROM). Data file

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-11-01

    Environmental Factor(trademark) RCRA Hazardous Waste Handler Information on CD-ROM unleashes the invaluable information found in two key EPA data sources on hazardous waste handlers and offers cradle-to-grave waste tracking. It's easy to search and display: (1) Permit status, design capacity, and compliance history for facilities found in the EPA Resource Conservation and Recovery Information System (RCRIS) program tracking database; (2) Detailed information on hazardous waste generation, management, and minimization by companies that are large quantity generators; and (3) Data on the waste management practices of treatment, storage, and disposal (TSD) facilities from the EPA Biennial Reporting System, which is collected every other year. Environmental Factor's powerful database retrieval system lets you: (1) Search for RCRA facilities by permit type, SIC code, waste codes, corrective action or violation information, TSD status, generator and transporter status, and more. (2) View compliance information: dates of evaluation, violation, enforcement, and corrective action. (3) Look up facilities by waste processing categories of marketing, transporting, processing, and energy recovery. (4) Use owner/operator information and names, titles, and telephone numbers of project managers for prospecting. (5) Browse detailed data on TSD facility and large quantity generator activities such as onsite waste treatment, disposal, or recycling, offsite waste received, and waste generation and management. The product contains databases, search and retrieval software on two CD-ROMs, an installation diskette, and a User's Guide. Environmental Factor has online context-sensitive help from any screen and a printed User's Guide describing installation and step-by-step procedures for searching, retrieving, and exporting.

  5. Parallel Visualization Co-Processing of Overnight CFD Propulsion Applications

    NASA Technical Reports Server (NTRS)

    Edwards, David E.; Haimes, Robert

    1999-01-01

    An interactive visualization system pV3 is being developed for the investigation of advanced computational methodologies employing visualization and parallel processing for the extraction of information contained in large-scale transient engineering simulations. Visual techniques for extracting information from the data in terms of cutting planes, iso-surfaces, particle tracing and vector fields are included in this system. This paper discusses improvements to the pV3 system developed under NASA's Affordable High Performance Computing project.

  6. RICIS research

    NASA Technical Reports Server (NTRS)

    Mckay, Charles W.; Feagin, Terry; Bishop, Peter C.; Hallum, Cecil R.; Freedman, Glenn B.

    1987-01-01

    The principal focus of one of the RICIS (Research Institute for Computing and Information Systems) components is computer systems and software engineering in-the-large over the lifecycle of large, complex, distributed systems which: (1) evolve incrementally over a long time; (2) contain non-stop components; and (3) must simultaneously satisfy a prioritized balance of mission- and safety-critical requirements at run time. This focus is extremely important because of the contribution of the scaling-direction problem to the current software crisis. The Computer Systems and Software Engineering (CSSE) component addresses the lifecycle issues of three environments: host, integration, and target.

  7. A Collaboration in Support of LBA Science and Data Exchange: Beija-flor and EOS-WEBSTER

    NASA Astrophysics Data System (ADS)

    Schloss, A. L.; Gentry, M. J.; Keller, M.; Rhyne, T.; Moore, B.

    2001-12-01

    The University of New Hampshire (UNH) has developed a Web-based tool that makes data, information, products, and services concerning terrestrial ecological and hydrological processes available to the Earth Science community. Our WEB-based System for Terrestrial Ecosystem Research (EOS-WEBSTER) provides a GIS-oriented interface to select, subset, reformat and download three main types of data: selected NASA Earth Observing System (EOS) remotely sensed data products, results from a suite of ecosystem and hydrological models, and geographic reference data. The Large Scale Biosphere-Atmosphere Experiment in Amazonia Project (LBA) has implemented a search engine, Beija-flor, that provides a centralized access point to data sets acquired for and produced by LBA researchers. The metadata in the Beija-flor index describe the content of the data sets and contain links to data distributed around the world. The query system returns a list of data sets that meet the user's search criteria. A common problem arises when a user of a system like Beija-flor wants data products located within another system: the user is required to re-specify information, such as spatial coordinates, in the other system. This poster describes a methodology by which Beija-flor generates a unique URL containing the requested search parameters and passes the information to EOS-WEBSTER, thus making the interactive services and large, diverse data holdings in EOS-WEBSTER directly available to Beija-flor users. This "Calling Card" is used by EOS-WEBSTER to generate on-demand custom products tailored to each Beija-flor request. Through a collaborative effort, we have demonstrated the ability to integrate project-specific search engines such as Beija-flor with the products and services of large data systems such as EOS-WEBSTER, to provide very specific information products with a minimal amount of additional programming. This methodology has the potential to greatly facilitate research data exchange by enhancing the interoperability of diverse data systems beyond the two described here.
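    The "Calling Card" mechanism described above can be sketched as simple URL generation that encodes the user's search parameters once so the partner system need not ask for them again. The base URL and parameter names below are hypothetical, not EOS-WEBSTER's actual interface:

```python
from urllib.parse import urlencode

def calling_card(base_url, bbox, start, end):
    """Build a URL that carries one search's parameters to a partner system.

    bbox is (west, south, east, north); parameter names are illustrative.
    """
    params = {
        "west": bbox[0], "south": bbox[1],
        "east": bbox[2], "north": bbox[3],
        "start": start, "end": end,
    }
    return base_url + "?" + urlencode(params)
```

    The receiving system parses the query string and pre-fills its own search form, sparing the user from re-entering coordinates and dates.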

  8. Research on wastewater reuse planning in Beijing central region.

    PubMed

    Jia, H; Guo, R; Xin, K; Wang, J

    2005-01-01

    The need to implement wastewater reuse in Beijing is discussed. Based on an investigation of the wastewater reuse projects built in Beijing, the differences between small and large wastewater reuse systems were analyzed with respect to technical, economic, and social issues. The advantages and disadvantages of the small and large systems were then given. In wastewater reuse planning for the Beijing urban region, the large system was adopted. The rations of reclaimed water for different land-use types, including industrial reuse, municipal reuse, grass irrigation, and scenic water reuse, were determined. Then, according to the land-use information for every block in central Beijing and using GIS techniques, the amounts of reclaimed water needed in every block were calculated, and the main reclaimed-water pipe system was planned.
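    The per-block calculation described above multiplies each land-use area by its reclaimed-water ration and sums the results. A minimal sketch; the ration values and units below are invented for illustration, not the planning figures used in the study:

```python
# Hypothetical rations in cubic meters per hectare per day.
RATIONS = {"industrial": 30.0, "municipal": 20.0, "grass": 15.0, "scenic": 10.0}

def block_demand(areas: dict) -> float:
    """Reclaimed-water demand for one block: sum over land-use types of
    area (ha) times that type's ration (m^3/ha/day)."""
    return sum(RATIONS[use] * area for use, area in areas.items())
```

    Run per block over a GIS land-use layer, this yields the demand map from which the pipe network can be sized.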

  9. Evaluation of immunization data completeness within a large community health care system exchanging data with a state immunization information system.

    PubMed

    Hendrickson, Bryan K; Panchanathan, Sarada S; Petitti, Diana

    2015-01-01

    Information systems are used by most states to maintain registries of immunization data both for monitoring population-level adherence and for use in clinical practice and research. Direct data exchange between such systems and electronic health record systems presents an opportunity to improve the completeness and quality of information available. Our goals were to describe and compare the completeness of the Arizona State Immunization System, the electronic health record at a large community health provider in Arizona exchanging electronic data with the Arizona system, and personal immunization records in an effort to contribute to the discussion on the completeness of state-run immunization registries and data exchange with these registries. Immunization histories from these sources were collected and reviewed sequentially. Unique dates of vaccination administrations were counted for each patient and tagged on the basis of comparisons across sources. We quantified completeness by combining information from all 3 sources and comparing each source with the complete set. We determined that the state registry was 71.8% complete, the hospital electronic health record was 81.9% complete, and personal records were 87.8% complete. Of the 2017 unique vaccination administrations, 65% were present in all 3 sources, 24.6% in 2 of the 3 sources, and 10.4% in only 1 source. Only 11% of patients had records in complete agreement across the 3 sources. This study highlights issues related to data completeness, exchange, and reporting of immunization information to state registries and suggests that there is some degree of deficiency in completeness of immunization registries and other sources. 
This study indicates that there is a need to strengthen links between electronic data sources with immunization information and describes potential improvements in completeness that such efforts could provide, enabling providers to better rely on state immunization registries and to improve research utilization of immunization information systems.
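    The completeness comparison described above can be reproduced in miniature: count each source's unique vaccination dates against the union of all sources. A minimal sketch (the sample dates in the test are invented, not study data):

```python
def completeness(sources: dict) -> dict:
    """Percent of unique vaccination dates each source holds, relative to
    the union of dates across all sources (the 'complete set')."""
    union = set().union(*sources.values())
    return {name: round(100 * len(dates) / len(union), 1)
            for name, dates in sources.items()}
```

    Applied per patient and averaged, this is the same completeness measure the study reports for the registry, the EHR, and personal records.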

  10. 35-GHz radar sensor for automotive collision avoidance

    NASA Astrophysics Data System (ADS)

    Zhang, Jun

    1999-07-01

    This paper describes the development of a radar sensor system used for automotive collision avoidance. Because a heavy truck may have a much larger radar cross section than a motorcyclist, the radar receiver must have a large dynamic range. In addition, multiple targets at different speeds may confuse the echo spectrum, causing ambiguity between the range and speed of a target. To get more information about targets and background, and to cope with the large dynamic range and multiple targets, a continuous-wave radar system with combined frequency modulation and pseudo-random binary sequence phase modulation is described. An analysis of this double-modulation system is given. High-speed signal processing and data processing components are used to process and combine the data and information from echoes at different directions and at every moment.
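    The pseudo-random binary sequence used for phase modulation is conventionally generated by a linear-feedback shift register. A minimal sketch with an assumed 3-bit register and polynomial (the paper does not specify its sequence length or taps):

```python
def prbs(n, seed=0b001, nbits=3):
    """Generate n bits of a pseudo-random binary sequence with a 3-bit
    Fibonacci LFSR, polynomial x^3 + x + 1 (maximal period 2^3 - 1 = 7)."""
    state = seed
    out = []
    for _ in range(n):
        out.append(state & 1)                       # output the low bit
        fb = (state ^ (state >> 1)) & 1             # XOR of tapped bits 0 and 1
        state = (state >> 1) | (fb << (nbits - 1))  # shift right, feed back
    return out
```

    A practical radar would use a much longer register; correlating the received echo against delayed copies of this sequence resolves range unambiguously.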

  11. Energy information systems (EIS): Technology costs, benefit, and best practice uses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Granderson, Jessica; Lin, Guanjing; Piette, Mary Ann

    2013-11-26

    Energy information systems are the web-based software, data acquisition hardware, and communication systems used to store, analyze, and display building energy data. They often include analysis methods such as baselining, benchmarking, load profiling, and energy anomaly detection. This report documents a large-scale assessment of energy information system (EIS) uses, costs, and energy benefits, based on a series of focused case study investigations that are synthesized into generalizable findings. The overall objective is to provide organizational decision makers with the information they need to make informed choices as to whether or not to invest in an EIS, a promising technology that can enable up to 20 percent site energy savings, quick payback, and persistent low-energy performance when implemented as part of best-practice energy management programs.
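    The payback claim above reduces to simple arithmetic: divide EIS cost by the annual savings it enables. The cost figures below are invented inputs for illustration, not numbers from the report:

```python
def simple_payback_years(eis_cost, annual_energy_cost, savings_fraction=0.20):
    """Simple payback period: EIS cost divided by annual energy savings.
    The 20% default is the report's stated upper bound on site savings."""
    return eis_cost / (annual_energy_cost * savings_fraction)
```

    For example, a hypothetical $50k EIS at a site spending $500k/year on energy pays back in half a year at the 20 percent savings bound.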

  12. Music information retrieval in compressed audio files: a survey

    NASA Astrophysics Data System (ADS)

    Zampoglou, Markos; Malamos, Athanasios G.

    2014-07-01

    In this paper, we present an organized survey of the existing literature on music information retrieval systems in which descriptor features are extracted directly from the compressed audio files, without prior decompression to pulse-code modulation format. Avoiding the decompression step and utilizing the readily available compressed-domain information can significantly lighten the computational cost of a music information retrieval system, allowing application to large-scale music databases. We identify a number of systems relying on compressed-domain information and form a systematic classification of the features they extract, the retrieval tasks they tackle, and the degree to which they achieve an actual increase in overall speed, as well as any resulting loss in accuracy. Finally, we discuss recent developments in the field, and the potential research directions they open toward ultra-fast, scalable systems.

  13. Prototype of an Integrated Hurricane Information System for Research: Description and Illustration of its Use in Evaluating WRF Model Simulations

    NASA Astrophysics Data System (ADS)

    Hristova-Veleva, S.; Chao, Y.; Vane, D.; Lambrigtsen, B.; Li, P. P.; Knosp, B.; Vu, Q. A.; Su, H.; Dang, V.; Fovell, R.; Tanelli, S.; Garay, M.; Willis, J.; Poulsen, W.; Fishbein, E.; Ao, C. O.; Vazquez, J.; Park, K. J.; Callahan, P.; Marcus, S.; Haddad, Z.; Fetzer, E.; Kahn, R.

    2007-12-01

    In spite of recent improvements in hurricane track forecast accuracy, there are still many unanswered questions about the physical processes that determine hurricane genesis, intensity, track and impact on the large-scale environment. Furthermore, a significant amount of work remains to be done in validating hurricane forecast models, understanding their sensitivities and improving their parameterizations. None of this can be accomplished without a comprehensive set of multiparameter observations that are relevant to both the large-scale and the storm-scale processes in the atmosphere and in the ocean. To address this need, we have developed a prototype of a comprehensive hurricane information system of high-resolution satellite, airborne and in-situ observations and model outputs pertaining to: i) the thermodynamic and microphysical structure of the storms; ii) the air-sea interaction processes; iii) the larger-scale environment as depicted by the SST, ocean heat content and the aerosol loading of the environment. Our goal was to create a one-stop place that provides researchers with an extensive set of observed hurricane data, and their graphical representation, together with large-scale and convection-resolving model output, all organized to make it easy to determine when coincident observations from multiple instruments are available. Analysis tools will be developed in the next step. The analysis tools will be used to determine spatial, temporal and multiparameter covariances that are needed to evaluate model performance, provide information for data assimilation and characterize and compare observations from different platforms. We envision that the developed hurricane information system will help in the validation of the hurricane models, in the systematic understanding of their sensitivities and in the improvement of the physical parameterizations employed by the models. Furthermore, it will help in studying the physical processes that affect hurricane development and impact on the large-scale environment. This talk will describe the developed prototype of the hurricane information system. Furthermore, we will use a set of WRF hurricane simulations and compare simulated to observed structures to illustrate how the information system can be used to discriminate between simulations that employ different physical parameterizations. The work described here was performed at the Jet Propulsion Laboratory, California Institute of Technology, under contract with the National Aeronautics and Space Administration.

  14. The Impact of Geographic Information Systems on Emergency Management Decision Making at the U.S. Department of Homeland Security

    ERIC Educational Resources Information Center

    King, Steven Gray

    2012-01-01

    Geographic information systems (GIS) reveal relationships and patterns from large quantities of diverse data in the form of maps and reports. The United States spends billions of dollars to use GIS to improve decisions made during responses to natural disasters and terrorist attacks, but precisely how GIS improves or impairs decision making is not…

  15. Afghanistan irrigation system assessment using remote sensing

    NASA Astrophysics Data System (ADS)

    Haack, Barry

    1997-01-01

    The Helmand-Arghandab Valley irrigation system in southern Afghanistan is one of the country's most important capital resources. Prior to the civil and military conflict that has engulfed Afghanistan for more than 15 years, agricultural lands irrigated by the system produced a large proportion of the country's food grains and cotton. This study successfully employed Landsat satellite imagery, Geographic Information Systems (GIS), Global Positioning Systems (GPS), and field surveys to assess changes that have occurred in this system since 1973 as a consequence of the war. This information is a critical step in irrigation rehabilitation for restoration of Afghanistan's agricultural productivity.

  16. Information management in the emergency department.

    PubMed

    Taylor, Todd B

    2004-02-01

    Information system planning for the ED is complex and new to emergency medicine, despite being used in other industries for many years. It has been estimated that less than 15% of EDs have a comprehensive EDIS currently in place. The manner in which administration is approached in large part determines the success in obtaining appropriate institutional support for an EDIS. Active physician and nurse involvement is essential in the process if the new system is to be accepted at the user level. In the ED, large volumes of information are collected, collated, interpreted, and acted on immediately. Effective information management therefore is key to the successful operation of any ED. Although computerized information systems have tremendous potential for improving information management, such systems are often underused or implemented in such a way that they increase the workload on caregivers and staff. This is counterproductive and should be avoided. In developing and implementing an EDIS, one should be careful not to automate poorly designed manual processes. Examples are ED tracking systems that require staff to manually relocate patients in the system. This task probably is completed only when the ED volume is low and "worked around" when the department is busy. Information from such a system is, therefore, flawed; at best useless and at worst counterproductive. Alternatively, systems are available that can track patients automatically through the ED by way of infrared sensors similar to the baggage-tracking systems that have been in place in airports for years. In the automated (computerized) ED, we must have zero-fault-tolerant, enterprise-wide, networked hospital information systems that prevent unnecessary duplication of tasks, assist in tracking and entering data, and ultimately help analyze the information on a minute-to-minute basis. Such systems only reach their potential when they are fully integrated, including legacy systems, rather than stand-alone proprietary EDIS. Further, a modular approach in which individual components are connected to a flexible computer backbone is ideal. Finally, good clinical content is key to virtually every aspect of the EDIS. Much of this content is yet to be developed, and what is available still needs to be adapted to the EDIS environment. Daunting as it may be, an EDIS implementation properly accomplished results in better patient care, improved staff productivity, and a satisfying work environment (Box 3).

  17. Resource Management for Distributed Parallel Systems

    NASA Technical Reports Server (NTRS)

    Neuman, B. Clifford; Rao, Santosh

    1993-01-01

    Multiprocessor systems should exist in the larger context of distributed systems, allowing multiprocessor resources to be shared by those that need them. Unfortunately, typical multiprocessor resource management techniques do not scale to large networks. The Prospero Resource Manager (PRM) is a scalable resource allocation system that supports the allocation of processing resources in large networks and multiprocessor systems. To manage resources in such distributed parallel systems, PRM employs three types of managers: system managers, job managers, and node managers. There exist multiple independent instances of each type of manager, reducing bottlenecks. The complexity of each manager is further reduced because each is designed to utilize information at an appropriate level of abstraction.

  18. Self-Organized Percolation and Critical Sales Fluctuations

    NASA Astrophysics Data System (ADS)

    Weisbuch, Gérard; Solomon, Sorin

    There is a discrepancy between the standard view of equilibrium through price adjustment in economics and the observation of large fluctuations in stock markets. We study here a simple model where agents' decisions depend not only upon their individual preferences but also upon information obtained from their neighbors in a social network. The model shows that information diffusion coupled to the adjustment process drives the system to criticality, with large fluctuations rather than smooth convergence to equilibrium.
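    One plausible reading of the decision rule described above is a threshold dynamic: an agent acts once its individual preference plus the influence of neighbors who have already acted crosses a threshold. A minimal sketch; the functional form, coupling constant, and threshold are assumptions for illustration, not the paper's exact model:

```python
def step(decisions, neighbors, prefs, j=0.5, theta=1.0):
    """One synchronous update: agent a buys (1) if its preference plus
    coupling j times the number of buying neighbors reaches threshold theta."""
    new = {}
    for a, p in prefs.items():
        influence = j * sum(decisions[b] for b in neighbors[a])
        new[a] = 1 if p + influence >= theta else 0
    return new
```

    Iterating this on a network shows the cascades whose sizes, near criticality, produce the large sales fluctuations the abstract refers to.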

  19. The research and development of water resources management information system based on ArcGIS

    NASA Astrophysics Data System (ADS)

    Cui, Weiqun; Gao, Xiaoli; Li, Yuzhi; Cui, Zhencai

    Because water resources management involves large amounts of data of complex types and formats, we built a water resources calculation model and established a water resources management information system based on the advanced ArcGIS and Visual Studio .NET development platforms. The system can integrate spatial data and attribute data organically and manage them uniformly. It can analyze spatial data, support bidirectional queries between maps and data, provide various charts and report forms automatically, link multimedia information, manage the database, etc. It can thus provide spatial and static synthetic information services for the study, management, and decision-making of water resources, regional geology, the eco-environment, etc.

  20. Validating a Geographical Image Retrieval System.

    ERIC Educational Resources Information Center

    Zhu, Bin; Chen, Hsinchun

    2000-01-01

    Summarizes a prototype geographical image retrieval system that demonstrates how to integrate image processing and information analysis techniques to support large-scale content-based image retrieval. Describes an experiment to validate the performance of this image retrieval system against that of human subjects by examining similarity analysis…

  1. A 14 × 14 μm2 footprint polarization-encoded quantum controlled-NOT gate based on hybrid waveguide

    PubMed Central

    Wang, S. M.; Cheng, Q. Q.; Gong, Y. X.; Xu, P.; Sun, C.; Li, L.; Li, T.; Zhu, S. N.

    2016-01-01

    Photonic quantum information processing systems have been widely used in communication, metrology and lithography. The recent emphasis on the miniaturized photonic platform is thus motivated by the urgent need for realizing large-scale information processing and computing. Although integrated quantum logic gates and quantum algorithms based on path encoding have been successfully demonstrated, the technology for handling the other commonly used, polarization-encoded qubits has yet to be fully developed. Here, we show the implementation of a polarization-dependent beam splitter in a hybrid waveguide system. With precise design, the polarization-encoded controlled-NOT gate can be implemented using only a single such polarization-dependent beam splitter, with a significant reduction of the overall device footprint to 14 × 14 μm2. The experimental demonstration of the highly integrated controlled-NOT gate sets the stage for developing large-scale quantum information processing systems. Our hybrid design also establishes new capabilities in controlling the polarization modes in integrated photonic circuits. PMID:27142992

  2. Design for Run-Time Monitor on Cloud Computing

    NASA Astrophysics Data System (ADS)

    Kang, Mikyung; Kang, Dong-In; Yun, Mira; Park, Gyung-Leen; Lee, Junghoon

    Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications as services over the Internet, as well as the infrastructure beneath them. A cloud is a type of parallel and distributed system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. Large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design a Run-Time Monitor (RTM), which is system software that monitors application behavior at run-time, analyzes the collected information, and optimizes resources on cloud computing. RTM monitors application software through library instrumentation, as well as the underlying hardware through performance counters, optimizing its computing configuration based on the analyzed data.

  3. eFarm: A Tool for Better Observing Agricultural Land Systems

    PubMed Central

    Yu, Qiangyi; Shi, Yun; Tang, Huajun; Yang, Peng; Xie, Ankun; Liu, Bin; Wu, Wenbin

    2017-01-01

    Currently, observations of an agricultural land system (ALS) largely depend on remotely-sensed images, focusing on its biophysical features. While social surveys capture the socioeconomic features, this information is inadequately integrated with the biophysical features of an ALS, and applications are limited by the cost and efficiency of carrying out such detailed and comparable social surveys at large spatial coverage. In this paper, we introduce a smartphone-based app, called eFarm: a crowdsourcing and human sensing tool that collects geotagged ALS information at the land parcel level, based on high resolution remotely-sensed images. We illustrate its main functionalities, including map visualization, data management, and data sensing. Results of the trial test suggest the system works well. We believe the tool is able to acquire human-land integrated information that is broadly covered and timely updated, thus presenting great potential for improving sensing, mapping, and modeling in ALS studies. PMID:28245554

  4. A 14 × 14 μm(2) footprint polarization-encoded quantum controlled-NOT gate based on hybrid waveguide.

    PubMed

    Wang, S M; Cheng, Q Q; Gong, Y X; Xu, P; Sun, C; Li, L; Li, T; Zhu, S N

    2016-05-04

    Photonic quantum information processing systems have been widely used in communication, metrology and lithography. The recent emphasis on the miniaturized photonic platform is thus motivated by the urgent need for realizing large-scale information processing and computing. Although integrated quantum logic gates and quantum algorithms based on path encoding have been successfully demonstrated, the technology for handling the other commonly used, polarization-encoded qubits has yet to be fully developed. Here, we show the implementation of a polarization-dependent beam splitter in a hybrid waveguide system. With precise design, the polarization-encoded controlled-NOT gate can be implemented using only a single such polarization-dependent beam splitter, with a significant reduction of the overall device footprint to 14 × 14 μm(2). The experimental demonstration of the highly integrated controlled-NOT gate sets the stage for developing large-scale quantum information processing systems. Our hybrid design also establishes new capabilities in controlling the polarization modes in integrated photonic circuits.

  5. On the Relevancy of Efficient, Integrated Computer and Network Monitoring in HEP Distributed Online Environment

    NASA Astrophysics Data System (ADS)

    Carvalho, D.; Gavillet, Ph.; Delgado, V.; Albert, J. N.; Bellas, N.; Javello, J.; Miere, Y.; Ruffinoni, D.; Smith, G.

    Large scientific equipment is controlled by computer systems whose complexity is growing, driven on the one hand by the volume and variety of the information, its distributed nature, and the sophistication of its treatment, and on the other hand by the fast evolution of the computer and network market. Some people call these systems generically Large-Scale Distributed Data Intensive Information Systems, or Distributed Computer Control Systems (DCCS) for those dealing more with real-time control. Taking advantage of (or forced by) the distributed architecture, the tasks are more and more often implemented as client-server applications. In this framework the monitoring of the computer nodes, the communications network, and the applications becomes of primary importance for ensuring the safe running and guaranteed performance of the system. With the future generation of HEP experiments, such as those at the LHC, in view, it is proposed to integrate the various functions of DCCS monitoring into one general purpose multi-layer system.

  6. Network connectivity paradigm for the large data produced by weather radar systems

    NASA Astrophysics Data System (ADS)

    Guenzi, Diego; Bechini, Renzo; Boraso, Rodolfo; Cremonini, Roberto; Fratianni, Simona

    2014-05-01

    The traffic over the Internet is constantly increasing; this is due in particular to social network activity, but also to the enormous exchange of data caused especially by the so-called "Internet of Things". With this term we refer to every device that has the capability of exchanging information with other devices on the web. In geoscience (and, in particular, in meteorology and climatology) there is a constantly increasing number of sensors used to obtain data from different sources (like weather radars, digital rain gauges, etc.). This information-gathering activity frequently must be followed by a complex data analysis phase, especially with large data sets that can be very difficult to analyze (very long historical series of large data sets, for example), the so-called big data. These activities are particularly intensive in resource consumption, and they lead to new computational models (like cloud computing) and new methods for storing data (like object stores, linked open data, NoSQL or NewSQL). A weather radar system can be seen as one of the sensors mentioned above: it transmits a large amount of raw data over the network (up to 40 megabytes every five minutes), continuously, 24 hours a day, and in any weather condition. Weather radars are often located on peaks and in wild areas where connectivity is poor. For this reason radar measurements are sometimes processed partially on site and reduced in size to adapt them to the limited bandwidth currently available from data transmission systems. With the aim of preserving the maximum flow of information, an innovative network connectivity paradigm for the large data produced by weather radar systems is presented here. The study is focused on the Monte Settepani operational weather radar system, located on a wild peak summit in north-western Italy.
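    The stated data rate implies a modest but sustained bandwidth requirement, which a line of arithmetic makes explicit (assuming 40 MB per five-minute scan, as above, and decimal megabytes):

```python
def required_bandwidth_mbit(volume_mb: float, interval_s: float) -> float:
    """Sustained link rate in Mbit/s needed to move one radar data volume
    per scan interval: megabytes times 8 bits/byte, divided by seconds."""
    return volume_mb * 8 / interval_s

# 40 MB every 5 minutes: a link must sustain roughly 1.07 Mbit/s,
# before any protocol overhead or retransmission margin.
rate = required_bandwidth_mbit(40, 300)
```

    Remote peak sites often cannot guarantee even this continuously, which is why on-site reduction, or the connectivity paradigm proposed here, is needed.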

  7. Automated Induction Of Rule-Based Neural Networks

    NASA Technical Reports Server (NTRS)

    Smyth, Padhraic J.; Goodman, Rodney M.

    1994-01-01

Prototype expert systems, implemented in software and functionally equivalent to neural networks, are set up automatically and placed into operation within minutes, following an information-theoretic approach to automated acquisition of knowledge from large example data bases. The approach is based largely on use of the ITRULE computer program.

  8. Simplified Processing Method for Meter Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fowler, Kimberly M.; Colotelo, Alison H. A.; Downs, Janelle L.

    2015-11-01

A simple, quick metered-data processing method that can be used for Army Metered Data Management System (MDMS) and Logistics Innovation Agency data, but may also be useful for other large data sets. It is intended for large data sets when the analyst has little information about the buildings.

  9. Survey of Knowledge Representation and Reasoning Systems

    DTIC Science & Technology

    2009-07-01

processing large volumes of unstructured information such as natural language documents, email, audio, images and video [Ferrucci et al. 2006]. Using this...information we hope to obtain improved estimation and prediction, data-mining, social network analysis, and semantic search and visualisation. Knowledge

  10. Open for business: private networks create a marketplace for health information exchange.

    PubMed

    Dimick, Chris

    2012-05-01

Large health systems and their IT vendors are creating private information exchange networks at a time when federally funded state operations are gearing up for launch. Is there room for private and public offerings in the new HIE marketplace?

  11. DEVELOPMENT OF PROTOCOLS TO STUDY AND IDENTIFY CRITICAL ECOSYSTEMS

    EPA Science Inventory

    Healthy, functioning ecosystems are critical to the sustainability of human and natural communities, but the identification of areas of healthy ecosystems in an area as large as Region 5 is difficult due to time and information constraints. Geographic Information Systems (GIS) a...

  12. Distributed design approach in persistent identifiers systems

    NASA Astrophysics Data System (ADS)

    Golodoniuc, Pavel; Car, Nicholas; Klump, Jens

    2017-04-01

    The need to identify both digital and physical objects is ubiquitous in our society. Past and present persistent identifier (PID) systems, of which there is a great variety in terms of technical and social implementations, have evolved with the advent of the Internet, which has allowed for globally unique and globally resolvable identifiers. PID systems have catered for identifier uniqueness, integrity, persistence, and trustworthiness, regardless of the identifier's application domain, the scope of which has expanded significantly in the past two decades. Since many PID systems have been largely conceived and developed by small communities, or even a single organisation, they have faced challenges in gaining widespread adoption and, most importantly, the ability to survive change of technology. This has left a legacy of identifiers that still exist and are being used but which have lost their resolution service. We believe that one of the causes of once successful PID systems fading is their reliance on a centralised technical infrastructure or a governing authority. Golodoniuc et al. (2016) proposed an approach to the development of PID systems that combines the use of (a) the Handle system, as a distributed system for the registration and first-degree resolution of persistent identifiers, and (b) the PID Service (Golodoniuc et al., 2015), to enable fine-grained resolution to different information object representations. The proposed approach solved the problem of guaranteed first-degree resolution of identifiers, but left fine-grained resolution and information delivery under the control of a single authoritative source, posing risk to the long-term availability of information resources. Herein, we develop these approaches further and explore the potential of large-scale decentralisation at all levels: (i) persistent identifiers and information resources registration; (ii) identifier resolution; and (iii) data delivery. 
To achieve large-scale decentralisation, we propose using Distributed Hash Tables (DHT), Peer Exchange networks (PEX), Magnet Links, and peer-to-peer (P2P) file sharing networks - the technologies that enable applications such as BitTorrent (Wu et al., 2010). The proposed approach introduces reliable information replication and caching mechanisms, eliminating the need for a central PID data store, and increases overall system fault tolerance due to the lack of a single point of failure. The proposed PID system's design aims to ensure trustworthiness of the system and incorporates important aspects of governance, such as the notion of the authoritative source, data integrity, caching, and data replication control.
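As a rough illustration of how a DHT removes the central registry, here is a toy consistent-hashing ring in Python. The node names and the Handle-style identifier are invented, and real DHTs such as Kademlia add routing tables, replication and churn handling on top of this basic idea:

```python
import hashlib
from bisect import bisect_right

def _h(key: str) -> int:
    """Map any string into a 64-bit position on the hash ring."""
    return int.from_bytes(hashlib.sha1(key.encode()).digest()[:8], "big")

class ConsistentHashRing:
    """Toy consistent-hash ring: each node owns an arc of the key space,
    so any participant can resolve an identifier without a central registry."""

    def __init__(self, nodes):
        self._ring = sorted((_h(n), n) for n in nodes)

    def resolve(self, identifier: str) -> str:
        """Return the node responsible for storing/resolving this identifier."""
        keys = [k for k, _ in self._ring]
        i = bisect_right(keys, _h(identifier)) % len(self._ring)
        return self._ring[i][1]

ring = ConsistentHashRing(["node-a", "node-b", "node-c"])
owner = ring.resolve("hdl:10273/ABC123")   # hypothetical Handle-style PID
backup = ring.resolve("hdl:10273/XYZ789")
print(owner, backup)
```

The attraction for long-lived identifier systems is that losing or adding a node only remaps the identifiers on the adjacent arc of the ring, rather than invalidating the whole resolution service.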

  13. BIOME: A browser-aware search and order system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grubb, J.W.; Jennings, S.V.; Yow, T.G.

    1996-05-01

The Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC), which is associated with NASA's Earth Observing System Data and Information System (EOSDIS), provides access to a large number of tabular and imagery datasets used in ecological and environmental research. Because of its large and diverse data holdings, the challenge for the ORNL DAAC is to help users find data of interest from the hundreds of thousands of files available at the DAAC without overwhelming them. Therefore, the ORNL DAAC developed the Biogeochemical Information Ordering Management Environment (BIOME), a search and order system for the World Wide Web (WWW). The WWW provides a new vehicle that allows a wide range of users access to the data. This paper describes the specialized attributes incorporated into BIOME that allow researchers easy access to an otherwise bewildering array of data products.

  14. Inorganic material profiling using Arn+ cluster: Can we achieve high quality profiles?

    NASA Astrophysics Data System (ADS)

    Conard, T.; Fleischmann, C.; Havelund, R.; Franquet, A.; Poleunis, C.; Delcorte, A.; Vandervorst, W.

    2018-06-01

Retrieving molecular information by sputtering of organic systems has become practical in recent years due to the introduction of sputtering by large gas clusters, which drastically reduced compound degradation during analysis and led to strong improvements in depth resolution. Rapidly, however, a limitation was observed for heterogeneous systems in which inorganic layers or structures needed to be profiled concurrently. As opposed to organic material, erosion of the inorganic layer appears very difficult and prone to many artefacts. To shed some light on these problems, we investigated a simple system consisting of aluminum delta layer(s) buried in a silicon matrix in order to define the most favorable beam conditions for practical analysis. We show that, counterintuitively given the small energy per atom used, and unlike monoatomic ion sputtering, the information depth obtained with large cluster ions is typically very large (∼10 nm), and that this can be caused both by large roughness development at early stages of the sputtering process and by a large mixing zone. As a consequence, a large deformation of the Al intensity profile is observed. Using sample rotation during profiling significantly improves the depth resolution, while sample temperature has no significant effect. The determining parameter for high depth resolution remains the total energy of the cluster rather than the energy per atom in the cluster.

  15. Proposal as to Efficient Collection and Exploitation of Earthquake Damage Information and Verification by Field Experiment at Toyohashi City

    NASA Astrophysics Data System (ADS)

    Zama, Shinsaku; Endo, Makoto; Takanashi, Ken'ichi; Araiba, Kiminori; Sekizawa, Ai; Hosokawa, Masafumi; Jeong, Byeong-Pyo; Hisada, Yoshiaki; Murakami, Masahiro

Based on the earlier finding that damage information can be gathered quickly in a municipality with a smaller population, it is proposed that damage information be gathered and analyzed using an area roughly equivalent to a primary school district as the basic unit. The introduction of this type of decentralized system is expected to quickly gather important information on each area. The information gathered by these communal disaster prevention bases is sent to the disaster prevention headquarters, which in turn feeds back more extensive information over a wider area to the communal disaster prevention bases. Concrete systems were developed according to this framework, and we performed large-scale experiments simulating disaster information collection, transmission and utilization for smooth responses to an earthquake disaster, in collaboration with Toyohashi City, Aichi Prefecture, which is considered highly likely to suffer extensive damage from the Tokai and Tonankai Earthquakes. Using disaster information collection and transmission equipment composed of a long-distance wireless LAN, a notebook computer, a Web camera and an IP telephone, city staff could easily input and transmit information such as fires, collapsed houses and impassable roads, collected by the inhabitants participating in the experiment. Headquarters could confirm this information on an automatically plotted map, and also the state of each disaster-prevention facility, by means of Web cameras and IP telephones. Based on the damage information, fire-spreading, evacuation and traffic simulations were automatically executed at the disaster countermeasure office, and their results were displayed on a large screen to support decisions such as residents' evacuation. 
These simulated results were simultaneously displayed at each disaster-prevention facility and served to help people understand the situation of the whole city's damage and the necessity of evacuating with optimum timing. According to the evaluation by the city staff through the experiments, information technology can be used to rationally implement initial responses just after a large earthquake, despite the need for some improvements to the systems used in the experiments.

  16. Information sources in science and technology in Finland

    NASA Technical Reports Server (NTRS)

    Haarala, Arja-Riitta

    1994-01-01

Finland has some problems to overcome in the field of scientific and technical information: a small user community, which makes domestic systems costly; great distances within the country between users and suppliers of information; great distances to international data systems and large libraries abroad; and inadequate collections of scientific and technical information. The national bibliography Fennica includes all books and journals published in Finland. Data base services available in Finland include: reference data bases in science and technology; data banks for decision making, such as statistical time series or legal proceedings; national bibliographies; and library catalogs.

  17. Requirements and principles for the implementation and construction of large-scale geographic information systems

    NASA Technical Reports Server (NTRS)

    Smith, Terence R.; Menon, Sudhakar; Star, Jeffrey L.; Estes, John E.

    1987-01-01

    This paper provides a brief survey of the history, structure and functions of 'traditional' geographic information systems (GIS), and then suggests a set of requirements that large-scale GIS should satisfy, together with a set of principles for their satisfaction. These principles, which include the systematic application of techniques from several subfields of computer science to the design and implementation of GIS and the integration of techniques from computer vision and image processing into standard GIS technology, are discussed in some detail. In particular, the paper provides a detailed discussion of questions relating to appropriate data models, data structures and computational procedures for the efficient storage, retrieval and analysis of spatially-indexed data.

  18. From prototype to production system: lessons learned from the evolution of the SignOut System at Mount Sinai Medical Center.

    PubMed

    Kushniruk, Andre; Karson, Tom; Moore, Carlton; Kannry, Joseph

    2003-01-01

Approaches to the development of information systems in large health care institutions range from prototyping to conventional development of large-scale production systems. This paper discusses the development of the SignOut System at Mount Sinai Medical Center, which was designed in 1997 to capture vital resident information. Local need quickly outstripped the proposed timeline for building a production system, and the prototype quickly became a production system. By the end of 2002 the New SignOut System was built as an integrated application that was a true production system. In this paper we discuss the design and implementation issues in moving from a prototype to a production system. The production system had a number of advantages, including increased organizational visibility, integration into enterprise resource planning, and full-time staff for support. However, the prototype allowed for more rapid design and subsequent changes, less training, and equal or superior help desk support. It is argued that healthcare IT systems may need characteristics of both prototype and production system development to rapidly meet the changing and differing needs of healthcare user populations.

  19. Professional Development Of Junior Full Time Support Aerospace Maintenance Duty Officers

    DTIC Science & Technology

    2017-12-01

management information system NAMP naval aviation maintenance program OCS officer candidate school OOMA optimized organizational maintenance activity...retrieval of information is effective and efficient. Knowledge management solutions broadly fall into two categories, enterprise solutions...designed to manage large amounts of knowledge and information, access by many concurrent users at multiple organization units and locations, and

  20. [A medical consumable material management information system].

    PubMed

    Tang, Guoping; Hu, Liang

    2014-05-01

Medical consumables are essential supplies for carrying out medical work; they come in a wide range of varieties and are used in large quantities. How to manage them feasibly and efficiently has been a topic of general concern. This article discusses how to design a medical consumable material management information system that has a set of standardized processes and brings together medical supplies administrators, suppliers and clinical departments. An advanced management mode, enterprise resource planning (ERP), was applied to the whole system design process.

  1. Artificial Intelligence: Applications in Education.

    ERIC Educational Resources Information Center

    Thorkildsen, Ron J.; And Others

    1986-01-01

Artificial intelligence techniques are used in computer programs to rapidly search out and retrieve information from very large databases. Programming advances have also led to the development of systems that provide expert consultation (expert systems). These systems, as applied to education, are the primary emphasis of this article. (LMO)

  2. Educational System Efficiency Improvement Using Knowledge Discovery in Databases

    ERIC Educational Resources Information Center

    Lukaš, Mirko; Leškovic, Darko

    2007-01-01

This study describes one possible way of using ICT in an education system. We treated the educational system like a business company and developed an appropriate model for clustering the student population. Modern educational systems are forced to extract the most necessary and purposeful information from a large amount of available data. Clustering…
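The truncated abstract does not name the clustering algorithm; a common choice for partitioning a student population is k-means, sketched below in plain Python with invented (grade average, attendance rate) features:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means: assign each point to its nearest centroid,
    then move each centroid to the mean of its assigned points."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda j: sum((a - b) ** 2
                                            for a, b in zip(p, centroids[j])))
            clusters[nearest].append(p)
        # Recompute centroids; keep the old one if a cluster emptied out.
        centroids = [
            tuple(sum(c) / len(c) for c in zip(*cl)) if cl else centroids[j]
            for j, cl in enumerate(clusters)
        ]
    return centroids, clusters

# Invented (grade_avg, attendance_rate) pairs forming two obvious groups.
students = [(4.8, 0.95), (4.6, 0.90), (4.9, 0.97),
            (2.1, 0.55), (2.4, 0.60), (1.9, 0.50)]
centroids, clusters = kmeans(students, k=2)
print(centroids)
```

On real enrollment data the features would of course be richer and normalized first; this only shows the partitioning step the abstract alludes to.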

  3. Design of a decentralized reusable research database architecture to support data acquisition in large research projects.

    PubMed

    Iavindrasana, Jimison; Depeursinge, Adrien; Ruch, Patrick; Spahni, Stéphane; Geissbuhler, Antoine; Müller, Henning

    2007-01-01

The diagnostic and therapeutic processes, as well as the development of new treatments, are hindered by the fragmentation of the information that underlies them. In a multi-institutional research study database, the clinical information system (CIS) contains the primary data input. A significant part of the budget of large-scale clinical studies is often spent on data creation and maintenance. The objective of this work is to design a decentralized, scalable, reusable database architecture with lower maintenance costs for managing and integrating the distributed heterogeneous data required as the basis for a large-scale research project. Technical and legal aspects are taken into account based on various use case scenarios. The architecture contains four layers: data storage and access, decentralized at their production source; a connector acting as a proxy between the CIS and the external world; an information mediator as a data access point; and the client side. The proposed design will be implemented inside six clinical centers participating in the @neurIST project as part of a larger system on data integration and reuse for aneurysm treatment.

  4. Technology for large space systems: A bibliography with indexes (supplement 07)

    NASA Technical Reports Server (NTRS)

    1983-01-01

    This bibliography lists 366 reports, articles and other documents introduced into the NASA scientific and technical information system between January 1, 1982 and June 30, 1982. Subject matter is grouped according to systems, interactive analysis and design, structural concepts, control systems, electronics, advanced materials, assembly concepts, propulsion, solar power satellite systems, and flight experiments.

  5. Reflections on the role of open source in health information system interoperability.

    PubMed

    Sfakianakis, S; Chronaki, C E; Chiarugi, F; Conforti, F; Katehakis, D G

    2007-01-01

    This paper reflects on the role of open source in health information system interoperability. Open source is a driving force in computer science research and the development of information systems. It facilitates the sharing of information and ideas, enables evolutionary development and open collaborative testing of code, and broadens the adoption of interoperability standards. In health care, information systems have been developed largely ad hoc following proprietary specifications and customized design. However, the wide deployment of integrated services such as Electronic Health Records (EHRs) over regional health information networks (RHINs) relies on interoperability of the underlying information systems and medical devices. This reflection is built on the experiences of the PICNIC project that developed shared software infrastructure components in open source for RHINs and the OpenECG network that offers open source components to lower the implementation cost of interoperability standards such as SCP-ECG, in electrocardiography. Open source components implementing standards and a community providing feedback from real-world use are key enablers of health care information system interoperability. Investing in open source is investing in interoperability and a vital aspect of a long term strategy towards comprehensive health services and clinical research.

  6. Security System Software

    NASA Technical Reports Server (NTRS)

    1993-01-01

C Language Integrated Production System (CLIPS), a NASA-developed expert systems program, has enabled a security systems manufacturer to design a new generation of hardware. C.CURESystem 1 Plus, manufactured by Software House, is a software-based system that is used with a variety of access control hardware at installations around the world. Users can manage large amounts of information, solve unique security problems and control entry and time scheduling. CLIPS acts as an information management tool when accessed by C.CURESystem 1 Plus. It asks questions about the hardware and, when given the answers, recommends quick solutions that can be applied by non-expert persons.

  7. Enhancing E-Health Information Systems with Agent Technology

    PubMed Central

    Nguyen, Minh Tuan; Fuhrer, Patrik; Pasquier-Rocha, Jacques

    2009-01-01

Agent technology is an emerging and promising research area in software technology, which increasingly contributes to the development of value-added information systems for large healthcare organizations. Through the MediMAS prototype, resulting from a case study conducted at a local Swiss hospital, this paper aims at presenting the advantages of reinforcing such a complex e-health man-machine information organization with software agents. The latter work on behalf of human agents, taking care of routine tasks and thus increasing the speed, the consistency, and ultimately the reliability of the information exchanges. We further claim that the modeling of the software agent layer can be methodically derived from the actual “classical” laboratory organization and practices, as well as seamlessly integrated with the existing information system. PMID:19096509

  8. High-speed data search

    NASA Technical Reports Server (NTRS)

    Driscoll, James N.

    1994-01-01

The high-speed data search system developed for KSC incorporates existing and emerging information retrieval technology to help a user intelligently and rapidly locate information found in large textual databases. This technology includes: natural language input; statistical ranking of retrieved information; an artificial intelligence concept called semantics, where 'surface level' knowledge found in text is used to improve the ranking of retrieved information; and relevance feedback, where user judgements about viewed information are used to automatically modify the search for further information. Semantics and relevance feedback are features of the system which are not available commercially. The system further demonstrates focus on paragraphs of information to decide relevance, and it can be used (without modification) to intelligently search all kinds of document collections, such as collections of legal documents, medical documents, news stories, patents, and so forth. The purpose of this paper is to demonstrate the usefulness of statistical ranking, our semantic improvement, and relevance feedback.
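Two of the features named above, statistical ranking and relevance feedback, can be sketched with TF-IDF cosine ranking plus a Rocchio-style query update. The toy corpus and the 0.75 feedback weight below are purely illustrative, not the KSC system's actual algorithm:

```python
import math
from collections import Counter

docs = {
    "d1": "shuttle launch pad safety procedures",
    "d2": "legal documents for launch contracts",
    "d3": "pad maintenance and safety inspection",
}

def idf_table(corpus):
    """Inverse document frequency for each term in the corpus."""
    n = len(corpus)
    df = Counter(w for text in corpus.values() for w in set(text.split()))
    return {w: math.log(n / df[w]) for w in df}

def tfidf_vector(text, idf):
    tf = Counter(text.split())
    return {w: tf[w] * idf.get(w, 0.0) for w in tf}

def cosine(a, b):
    dot = sum(a[w] * b.get(w, 0.0) for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

idf = idf_table(docs)
doc_vecs = {d: tfidf_vector(t, idf) for d, t in docs.items()}

def rank(qvec):
    """Statistical ranking: documents ordered by cosine similarity."""
    return sorted(doc_vecs, key=lambda d: cosine(qvec, doc_vecs[d]), reverse=True)

query = tfidf_vector("launch safety", idf)
first_pass = rank(query)

# Rocchio relevance feedback: pull the query toward a document
# the user judged relevant, then re-rank.
relevant = doc_vecs[first_pass[0]]
fed_back = {w: query.get(w, 0.0) + 0.75 * relevant.get(w, 0.0)
            for w in set(query) | set(relevant)}
second_pass = rank(fed_back)
print(first_pass, second_pass)
```

The feedback step is what lets user judgements "automatically modify the search": terms from the judged-relevant document now contribute weight to the query even though the user never typed them.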

  9. Military Standard: Military Training Programs

    DTIC Science & Technology

    1990-12-05

Commander, Naval Sea Systems Command, SEA 55Z3, Department of the Navy, Washington, DC 20362-5101, by using the self-addressed Standardization Document...information to the trainee. 3.63 Instructional media materials (IMM). Instructional materials that present a body of information and are largely self-...computer power and storage in equipment which is self-contained (for example, videodisc player) - not necessarily part of a complete computer system. For

  10. Visual attention mitigates information loss in small- and large-scale neural codes

    PubMed Central

    Sprague, Thomas C; Saproo, Sameer; Serences, John T

    2015-01-01

The visual system transforms complex inputs into robust and parsimonious neural codes that efficiently guide behavior. Because neural communication is stochastic, the amount of encoded visual information necessarily decreases with each synapse. This constraint requires processing sensory signals in a manner that protects information about relevant stimuli from degradation. Such selective processing – or selective attention – is implemented via several mechanisms, including neural gain and changes in tuning properties. However, examining each of these effects in isolation obscures their joint impact on the fidelity of stimulus feature representations by large-scale population codes. Instead, large-scale activity patterns can be used to reconstruct representations of relevant and irrelevant stimuli, providing a holistic understanding about how neuron-level modulations collectively impact stimulus encoding. PMID:25769502
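The reconstruction approach the authors describe, recovering a stimulus feature from large-scale activity patterns, can be caricatured with a linear encoding model inverted by least squares. All tuning curves, noise levels and array sizes below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy encoding model: responses of 32 "units" are a noisy linear mixture
# of 8 feature channels tuned to a circular stimulus (e.g. orientation).
n_units, n_channels = 32, 8
chans = np.linspace(0, 2 * np.pi, n_channels, endpoint=False)

def channel_basis(theta):
    # Raised-cosine tuning curve for each channel.
    return np.cos((theta[:, None] - chans[None, :]) / 2) ** 6

# Simulate training data with known stimuli: R = C @ W + noise.
train_stim = rng.uniform(0, 2 * np.pi, 200)
W = rng.normal(size=(n_channels, n_units))
R_train = channel_basis(train_stim) @ W + 0.1 * rng.normal(size=(200, n_units))

# Step 1: estimate channel-to-unit weights by least squares.
W_hat, *_ = np.linalg.lstsq(channel_basis(train_stim), R_train, rcond=None)

# Step 2: invert the model for a new activity pattern to reconstruct
# the channel response profile, then read out the stimulus angle.
true_theta = 1.0
r_test = channel_basis(np.array([true_theta])) @ W \
         + 0.1 * rng.normal(size=(1, n_units))
c_hat, *_ = np.linalg.lstsq(W_hat.T, r_test.T, rcond=None)
decoded = np.angle(np.sum(c_hat.ravel() * np.exp(1j * chans)))
print(decoded)  # should land near true_theta = 1.0 rad
```

The point of the exercise mirrors the review's argument: the read-out depends on the whole population pattern at once, so gain changes and tuning shifts are assessed jointly rather than unit by unit.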

  11. A multidisciplinary approach to the development of low-cost high-performance lightwave networks

    NASA Technical Reports Server (NTRS)

    Maitan, Jacek; Harwit, Alex

    1991-01-01

Our research focuses on high-speed distributed systems. We anticipate that our results will allow the fabrication of low-cost networks employing multi-gigabit-per-second data links for space and military applications. The recent development of high-speed, low-cost photonic components and new generations of microprocessors creates an opportunity to develop advanced large-scale distributed information systems. These systems currently involve hundreds of thousands of nodes and are made up of components and communications links that may fail during operation. In order to realize these systems, research is needed into technologies that foster adaptability and scalability. Self-organizing mechanisms are needed to integrate a working fabric of large-scale distributed systems. The challenge is to fuse theory, technology, and development methodologies to construct a cost-effective, efficient, large-scale system.

  12. Information Technology in Complex Health Services

    PubMed Central

    Southon, Frank Charles Gray; Sauer, Chris; Dampney, Christopher Noel Grant (Kit)

    1997-01-01

    Abstract Objective: To identify impediments to the successful transfer and implementation of packaged information systems through large, divisionalized health services. Design: A case analysis of the failure of an implementation of a critical application in the Public Health System of the State of New South Wales, Australia, was carried out. This application had been proven in the United States environment. Measurements: Interviews involving over 60 staff at all levels of the service were undertaken by a team of three. The interviews were recorded and analyzed for key themes, and the results were shared and compared to enable a continuing critical assessment. Results: Two components of the transfer of the system were considered: the transfer from a different environment, and the diffusion throughout a large, divisionalized organization. The analyses were based on the Scott-Morton organizational fit framework. In relation to the first, it was found that there was a lack of fit in the business environments and strategies, organizational structures and strategy-structure pairing as well as the management process-roles pairing. The diffusion process experienced problems because of the lack of fit in the strategy-structure, strategy-structure-management processes, and strategy-structure-role relationships. Conclusion: The large-scale developments of integrated health services present great challenges to the efficient and reliable implementation of information technology, especially in large, divisionalized organizations. There is a need to take a more sophisticated approach to understanding the complexities of organizational factors than has traditionally been the case. PMID:9067877

  13. Information technology in complex health services: organizational impediments to successful technology transfer and diffusion.

    PubMed

    Southon, F C; Sauer, C; Grant, C N

    1997-01-01

    To identify impediments to the successful transfer and implementation of packaged information systems through large, divisionalized health services. A case analysis of the failure of an implementation of a critical application in the Public Health System of the State of New South Wales, Australia, was carried out. This application had been proven in the United States environment. Interviews involving over 60 staff at all levels of the service were undertaken by a team of three. The interviews were recorded and analyzed for key themes, and the results were shared and compared to enable a continuing critical assessment. Two components of the transfer of the system were considered: the transfer from a different environment, and the diffusion throughout a large, divisionalized organization. The analyses were based on the Scott-Morton organizational fit framework. In relation to the first, it was found that there was a lack of fit in the business environments and strategies, organizational structures and strategy-structure pairing as well as the management process-roles pairing. The diffusion process experienced problems because of the lack of fit in the strategy-structure, strategy-structure-management processes, and strategy-structure-role relationships. The large-scale developments of integrated health services present great challenges to the efficient and reliable implementation of information technology, especially in large, divisionalized organizations. There is a need to take a more sophisticated approach to understanding the complexities of organizational factors than has traditionally been the case.

  14. An industrial information integration approach to in-orbit spacecraft

    NASA Astrophysics Data System (ADS)

    Du, Xiaoning; Wang, Hong; Du, Yuhao; Xu, Li Da; Chaudhry, Sohail; Bi, Zhuming; Guo, Rong; Huang, Yongxuan; Li, Jisheng

    2017-01-01

To operate an in-orbit spacecraft, the spacecraft status has to be monitored autonomously by collecting and analysing real-time data, and then detecting abnormalities and malfunctions of system components. To develop an information system for spacecraft state detection, we investigate the feasibility of using ontology-based artificial intelligence in the system development. We propose a new modelling technique based on the semantic web, agents, scenarios and ontologies. In the modelling, the subjects of astronautics fields are classified, corresponding agents and scenarios are defined, and they are connected by the semantic web to analyse data and detect failures. We introduce the modelling methodologies and the resulting framework of the status detection information system in this paper. We discuss the system components as well as their interactions in detail. The system has been prototyped and tested to illustrate its feasibility and effectiveness. The proposed modelling technique is generic and can be extended and applied to the development of other large-scale and complex information systems.

  15. Improving Navigation information for the Rotterdam Harbour access through a 3D Model and HF radar

    NASA Astrophysics Data System (ADS)

    Schroevers, Marinus

    2015-04-01

The Port of Rotterdam is one of the largest harbours in the world and a gateway to Europe. For access to Rotterdam harbour, information on hydrodynamic and meteorological conditions is of vital importance for safe and swift navigation. This information focuses on the deep navigation channel in the shallow foreshore, which accommodates large seagoing vessels. Due to a large seaward extension of the Port of Rotterdam area in 2011, current patterns have changed. A re-evaluation of the information needed showed a need for improved accuracy of the cross-channel currents and swell, and an extended forecast horizon. To obtain this, a new information system was designed based on a three-dimensional hydrodynamic model which produces a 72-hour forecast. Furthermore, the system will assimilate HF radar surface currents to optimize the short-term forecast. The project started in 2013 by specifying the data needed from the HF radar. At the same time, (temporary) buoys were deployed to monitor vertical current profiles. The HF radar will be operational in July 2015, while the model development starts at the beginning of 2015. A pre-operational version of the system is presently planned for the end of 2016. A fully operational version which assimilates the HF radar data is planned for 2017.

  16. A Data Analysis Expert System For Large Established Distributed Databases

    NASA Astrophysics Data System (ADS)

    Gnacek, Anne-Marie; An, Y. Kim; Ryan, J. Patrick

    1987-05-01

    The purpose of this work is to analyze the applicability of artificial intelligence techniques for developing a user-friendly, parallel interface to large, isolated, incompatible NASA databases for the purpose of assisting the management decision process. To carry out this work, a survey was conducted to establish the data access requirements of several key NASA user groups. In addition, current NASA database access methods were evaluated. The results of this work are presented in the form of a design for a natural language database interface system, called the Deductively Augmented NASA Management Decision Support System (DANMDS). This design is feasible principally because of recently announced commercial hardware and software product developments which allow cross-vendor compatibility. The goal of the DANMDS system is commensurate with the central dilemma confronting most large companies and institutions in America: the retrieval of information from large, established, incompatible database systems. The DANMDS system implementation would represent a significant first step toward this problem's resolution.

  17. Enhanced situational awareness in the maritime domain: an agent-based approach for situation management

    NASA Astrophysics Data System (ADS)

    Brax, Christoffer; Niklasson, Lars

    2009-05-01

    Maritime Domain Awareness (MDA) is important for both civilian and military applications. An important part of MDA is the detection of unusual vessel activities such as piracy, smuggling, poaching, and collisions. Today's interconnected sensor systems provide huge amounts of information over large geographical areas, which can push operators to the limits of their cognitive capacity so that they begin to miss important events. We propose an agent-based situation management system that automatically analyses sensor information to detect unusual activity and anomalies. The system combines knowledge-based detection with data-driven anomaly detection, and is evaluated using information from both radar and AIS sensors.
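
    The abstract does not give the detection logic, but the combination it describes, knowledge-based rules alongside data-driven anomaly detection on fused sensor tracks, can be sketched roughly as follows (the speed limit, z-score threshold, and track data are all invented for illustration):

```python
import statistics

def knowledge_based_alerts(speeds, speed_limit=30.0):
    """Rule-based check: flag reports exceeding a known speed limit."""
    return {i for i, v in enumerate(speeds) if v > speed_limit}

def data_driven_alerts(speeds, z_threshold=2.0):
    """Statistical check: flag reports whose speed deviates strongly
    from the vessel's own typical behaviour (simple z-score)."""
    mu, sigma = statistics.mean(speeds), statistics.stdev(speeds)
    if sigma == 0:
        return set()
    return {i for i, v in enumerate(speeds) if abs(v - mu) / sigma > z_threshold}

# Speeds (knots) from a fused radar/AIS track; report 5 is anomalous.
speeds = [12.1, 12.3, 11.9, 12.0, 12.2, 35.0, 12.1, 11.8]
alerts = sorted(knowledge_based_alerts(speeds) | data_driven_alerts(speeds))
print(alerts)  # [5]
```

    Combining the two detectors lets a hard rule catch known violations, while the statistical check surfaces behaviour that is unusual only relative to the vessel's own history.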

  18. Automatically identifying health- and clinical-related content in wikipedia.

    PubMed

    Liu, Feifan; Moosavinasab, Soheil; Agarwal, Shashank; Bennett, Andrew S; Yu, Hong

    2013-01-01

    Physicians are increasingly using the Internet to find medical information related to patient care. Wikipedia is a valuable online medical resource that could be integrated into existing clinical question answering (QA) systems. On the other hand, Wikipedia covers the full spectrum of the world's knowledge and therefore contains a large proportion of non-health-related content, which makes disambiguation more challenging and consequently leads to large overhead for existing systems in filtering out irrelevant information. To overcome this, we have developed both unsupervised and supervised approaches to identify health-related articles as well as clinically relevant articles. Furthermore, we explored novel features by extracting a health-related hierarchy from the Wikipedia category network, from which a variety of features were derived and evaluated. Our experiments show promising results and also demonstrate that employing the category hierarchy can effectively improve system performance.

  19. Using LTI Dynamics to Identify the Influential Nodes in a Network

    PubMed Central

    Jorswieck, Eduard; Scheunert, Christian

    2016-01-01

    Networks are used to model numerous technical, social and biological systems. In order to better understand system dynamics, it is of great interest to identify the most important nodes within a network. For a large set of problems, whether it is the optimal use of available resources, spreading information efficiently, or protection from malicious attacks, the most important node is the most influential spreader: the one capable of propagating information in the shortest time to a large portion of the network. Here we propose the Node Imposed Response (NiR), a measure which accurately evaluates node spreading power. It outperforms betweenness, degree, k-shell and h-index centrality in many cases and shows accuracy similar to dynamics-sensitive centrality. We take a system-theoretic approach, considering the network as a linear time-invariant (LTI) system; by observing the system response we can quantify the importance of each node. In addition, our study provides a robust tool set for various protective strategies. PMID:28030548
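
    The NiR measure itself is defined in the paper rather than the abstract, but the underlying idea, treating the network as a damped discrete-time LTI system and scoring each node by the cumulative response to a unit impulse injected there, can be sketched as follows (the damping factor, horizon, and example graph are illustrative choices, not the paper's):

```python
import numpy as np

def spreading_power(A, alpha=0.4, steps=20):
    """Score each node by the cumulative network response to a unit
    impulse injected at that node, with the network viewed as a damped
    discrete-time LTI system x[k+1] = alpha * A @ x[k]."""
    n = A.shape[0]
    scores = np.zeros(n)
    for i in range(n):
        x = np.zeros(n)
        x[i] = 1.0                      # unit impulse at node i
        total = 0.0
        for _ in range(steps):
            x = alpha * (A @ x)         # one damped propagation step
            total += x.sum()            # accumulate the system response
        scores[i] = total
    return scores

# Star graph: hub (node 0) linked to three leaves.
A = np.array([[0, 1, 1, 1],
              [1, 0, 0, 0],
              [1, 0, 0, 0],
              [1, 0, 0, 0]], dtype=float)
print(spreading_power(A).argmax())  # 0: the hub is the strongest spreader
```

    On the star graph the hub accumulates the largest total response, matching the intuition that it is the most influential spreader.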

  20. Using Semantic Templates to Study Vulnerabilities Recorded in Large Software Repositories

    ERIC Educational Resources Information Center

    Wu, Yan

    2011-01-01

    Software vulnerabilities allow an attacker to reduce a system's Confidentiality, Availability, and Integrity by exposing information, executing malicious code, and undermining the system functionalities that contribute to the overall system purpose and need. With new vulnerabilities discovered every day in a variety of applications and user environments,…

  1. Risk Management for Enterprise Resource Planning System Implementations in Project-Based Firms

    ERIC Educational Resources Information Center

    Zeng, Yajun

    2010-01-01

    Enterprise Resource Planning (ERP) systems have been regarded as one of the most important information technology developments of the past decades. While ERP systems provide the potential to bring substantial benefits, their implementations are characterized by large capital outlay, long duration, and high risks of failure including…

  2. Computer Security Primer: Systems Architecture, Special Ontology and Cloud Virtual Machines

    ERIC Educational Resources Information Center

    Waguespack, Leslie J.

    2014-01-01

    With the increasing proliferation of multitasking and Internet-connected devices, security has reemerged as a fundamental design concern in information systems. The shift of IS curricula toward a largely organizational perspective of security leaves little room for focus on its foundation in systems architecture, the computational underpinnings of…

  3. Role of the ATLAS Grid Information System (AGIS) in Distributed Data Analysis and Simulation

    NASA Astrophysics Data System (ADS)

    Anisenkov, A. V.

    2018-03-01

    In modern high-energy physics experiments, particular attention is paid to the global integration of information and computing resources into a unified system for efficient storage and processing of experimental data. Annually, the ATLAS experiment performed at the Large Hadron Collider at the European Organization for Nuclear Research (CERN) produces tens of petabytes of raw data from the recording electronics and several petabytes of data from the simulation system. For processing and storage of such very large volumes of data, the computing model of the ATLAS experiment is based on a heterogeneous, geographically distributed computing environment, which includes the Worldwide LHC Computing Grid (WLCG) infrastructure and is able to meet the experiment's requirements for processing huge data sets and providing a high degree of accessibility (hundreds of petabytes). The paper considers the ATLAS Grid Information System (AGIS), used by the ATLAS collaboration to describe the topology and resources of the computing infrastructure; to configure and connect the high-level software systems of computer centers; and to describe and store all parameters, control, configuration, and other auxiliary information required for the effective operation of ATLAS distributed computing applications and services. The role of the AGIS system in developing a unified description of the computing resources provided by grid sites, supercomputer centers, and cloud computing within a consistent information model for the ATLAS experiment is outlined. This approach has allowed the collaboration to extend the computing capabilities of the WLCG project and to integrate supercomputers and cloud computing platforms into the software components of the production and distributed analysis workload management system (PanDA, ATLAS).

  4. Environmental Factor(tm) system: RCRA hazardous waste handler information (on cd-rom). Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-04-01

    Environmental Factor(tm) RCRA Hazardous Waste Handler Information on CD-ROM unleashes the invaluable information found in two key EPA data sources on hazardous waste handlers and offers cradle-to-grave waste tracking. It's easy to search and display: (1) Permit status, design capacity and compliance history for facilities found in the EPA Resource Conservation and Recovery Information System (RCRIS) program tracking database; (2) Detailed information on hazardous waste generation, management and minimization by companies who are large quantity generators; and (3) Data on the waste management practices of treatment, storage and disposal (TSD) facilities from the EPA Biennial Reporting System, which is collected every other year. Environmental Factor's powerful database retrieval system lets you: (1) Search for RCRA facilities by permit type, SIC code, waste codes, corrective action or violation information, TSD status, generator and transporter status and more; (2) View compliance information - dates of evaluation, violation, enforcement and corrective action; (3) Look up facilities by waste processing categories of marketing, transporting, processing and energy recovery; (4) Use owner/operator information and names, titles and telephone numbers of project managers for prospecting; and (5) Browse detailed data on TSD facility and large quantity generators' activities such as onsite waste treatment, disposal, or recycling, offsite waste received, and waste generation and management. The product contains databases, search and retrieval software on two CD-ROMs, an installation diskette and User's Guide. Environmental Factor has online context-sensitive help from any screen and a printed User's Guide describing installation and step-by-step procedures for searching, retrieving and exporting. Hotline support is also available for no additional charge.

  5. Environmental Factor{trademark} system: RCRA hazardous waste handler information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1999-03-01

    Environmental Factor{trademark} RCRA Hazardous Waste Handler Information on CD-ROM unleashes the invaluable information found in two key EPA data sources on hazardous waste handlers and offers cradle-to-grave waste tracking. It's easy to search and display: (1) Permit status, design capacity and compliance history for facilities found in the EPA Resource Conservation and Recovery Information System (RCRIS) program tracking database; (2) Detailed information on hazardous waste generation, management and minimization by companies who are large quantity generators; and (3) Data on the waste management practices of treatment, storage and disposal (TSD) facilities from the EPA Biennial Reporting System, which is collected every other year. Environmental Factor's powerful database retrieval system lets you: (1) Search for RCRA facilities by permit type, SIC code, waste codes, corrective action or violation information, TSD status, generator and transporter status and more; (2) View compliance information -- dates of evaluation, violation, enforcement and corrective action; (3) Look up facilities by waste processing categories of marketing, transporting, processing and energy recovery; (4) Use owner/operator information and names, titles and telephone numbers of project managers for prospecting; and (5) Browse detailed data on TSD facility and large quantity generators' activities such as onsite waste treatment, disposal, or recycling, offsite waste received, and waste generation and management. The product contains databases, search and retrieval software on two CD-ROMs, an installation diskette and User's Guide. Environmental Factor has online context-sensitive help from any screen and a printed User's Guide describing installation and step-by-step procedures for searching, retrieving and exporting. Hotline support is also available for no additional charge.

  6. JUICE: a data management system that facilitates the analysis of large volumes of information in an EST project workflow

    PubMed Central

    Latorre, Mariano; Silva, Herman; Saba, Juan; Guziolowski, Carito; Vizoso, Paula; Martinez, Veronica; Maldonado, Jonathan; Morales, Andrea; Caroca, Rodrigo; Cambiazo, Veronica; Campos-Vargas, Reinaldo; Gonzalez, Mauricio; Orellana, Ariel; Retamales, Julio; Meisel, Lee A

    2006-01-01

    Background: Expressed sequence tag (EST) analyses provide a rapid and economical means to identify candidate genes that may be involved in a particular biological process. These ESTs are useful in many Functional Genomics studies. However, the large quantity and complexity of the data generated during an EST sequencing project can make the analysis of this information a daunting task. Results: In an attempt to make this task friendlier, we have developed JUICE, an open source data management system (Apache + PHP + MySQL on Linux), which enables the user to easily upload, organize, visualize and search the different types of data generated in an EST project pipeline. In contrast to other systems, the JUICE data management system allows a branched pipeline to be established, modified and expanded, during the course of an EST project. The web interfaces and tools in JUICE enable the users to visualize the information in a graphical, user-friendly manner. The user may browse or search for sequences and/or sequence information within all the branches of the pipeline. The user can search using terms associated with the sequence name, annotation or other characteristics stored in JUICE and associated with sequences or sequence groups. Groups of sequences can be created by the user, stored in a clipboard and/or downloaded for further analyses. Different user profiles restrict the access of each user depending upon their role in the project. The user may have access exclusively to visualize sequence information, access to annotate sequences and sequence information, or administrative access. Conclusion: JUICE is an open source data management system that has been developed to aid users in organizing and analyzing the large amount of data generated in an EST Project workflow. JUICE has been used in one of the first functional genomics projects in Chile, entitled "Functional Genomics in nectarines: Platform to potentiate the competitiveness of Chile in fruit exportation". However, due to its ability to organize and visualize data from external pipelines, JUICE is a flexible data management system that should be useful for other EST/Genome projects. The JUICE data management system is released under the Open Source GNU Lesser General Public License (LGPL). JUICE may be downloaded from or . PMID:17123449

  7. Mapping of beef, sheep and goat food systems in Nairobi - A framework for policy making and the identification of structural vulnerabilities and deficiencies.

    PubMed

    Alarcon, Pablo; Fèvre, Eric M; Murungi, Maurice K; Muinde, Patrick; Akoko, James; Dominguez-Salas, Paula; Kiambi, Stella; Ahmed, Sohel; Häsler, Barbara; Rushton, Jonathan

    2017-03-01

    Nairobi is a large rapidly-growing city whose demand for beef, mutton and goat products is expected to double by 2030. The study aimed to map the Nairobi beef, sheep and goat systems structure and flows to identify deficiencies and vulnerabilities to shocks. Cross-sectional data were collected through focus group discussions and interviews with people operating in Nairobi ruminant livestock and meat markets and in the large processing companies. Qualitative and quantitative data were obtained about the type of people, animals, products and value adding activities in the chains, and their structural, spatial and temporal interactions. Mapping analysis was done in three different dimensions: people and product profiling (interactions of people and products), geographical (routes of animals and products) and temporal mapping (seasonal fluctuations). The results obtained were used to identify structural deficiencies and vulnerability factors in the system. Results for the beef food system showed that 44-55% of the city's beef supply flows through the 'local terminal markets', but that 54-64% of total supply is controlled by one 'meat market'. Numerous informal chains were identified, with independent livestock and meat traders playing a pivotal role in the functionality of these systems, and where most activities are conducted with inefficient quality control and under scarce and inadequate infrastructure and organisation, generating wastage and potential food safety risks in low quality meat products. Geographical and temporal analysis showed the critical areas influencing the different markets, with larger markets increasing their market share in the low season. Large processing companies, partly integrated, operate with high quality infrastructures, but with up to 60% of their beef supply depending on similar routes as the informal markets. 
Only these companies were involved in value-addition activities, reaching high-end markets but also dominating the distribution of popular products, such as beef sausages, to middle- and low-end markets. For the small ruminant food system, 73% of the low-season supply flows through a single large informal market, Kiamaiko, located in an urban informal settlement. No grading is done for these animals or the meat produced. Large companies were reported to export up to 90% of their products. Lack of traceability and control of animal production was a common feature in all chains. The mapping presented provides a framework for policy makers and institutions to understand and design improvement plans for the Nairobi ruminant food system. The structural deficiencies and vulnerabilities identified here indicate the areas where intervention is needed.

  8. AGIS: The ATLAS Grid Information System

    NASA Astrophysics Data System (ADS)

    Anisenkov, Alexey; Belov, Sergey; Di Girolamo, Alessandro; Gayazov, Stavro; Klimentov, Alexei; Oleynik, Danila; Senchenko, Alexander

    2012-12-01

    ATLAS is a particle physics experiment at the Large Hadron Collider at CERN. The experiment produces petabytes of data annually through simulation production and tens of petabytes of data per year from the detector itself. The ATLAS computing model embraces the Grid paradigm, with a high degree of decentralization and computing resources able to meet the ATLAS requirements of petabyte-scale data operations. In this paper we present the ATLAS Grid Information System (AGIS), designed to integrate configuration and status information about the resources, services and topology of the whole ATLAS Grid, as needed by ATLAS Distributed Computing applications and services.

  9. AGIS: The ATLAS Grid Information System

    NASA Astrophysics Data System (ADS)

    Anisenkov, A.; Di Girolamo, A.; Klimentov, A.; Oleynik, D.; Petrosyan, A.; Atlas Collaboration

    2014-06-01

    ATLAS, a particle physics experiment at the Large Hadron Collider at CERN, produces petabytes of data annually through simulation production and tens of petabytes of data per year from the detector itself. The ATLAS computing model embraces the Grid paradigm, with a high degree of decentralization and computing resources able to meet the ATLAS requirements of petabyte-scale data operations. In this paper we describe the ATLAS Grid Information System (AGIS), designed to integrate configuration and status information about the resources, services and topology of the computing infrastructure used by the ATLAS Distributed Computing applications and services.

  10. Military Health System Transformation Implications on Health Information Technology Modernization.

    PubMed

    Khan, Saad

    2018-03-01

    With the recent passage of the National Defense Authorization Act for Fiscal Year 2017, Congress has triggered groundbreaking Military Health System organizational restructuring, with the Defense Health Agency assuming responsibility for managing all hospitals and clinics owned by the Army, Navy, and Air Force. This is a major shift toward a modern value-based managed care system, which will require much greater military-civilian health care delivery integration to be in place by October 2018. Just before the Act's passage, the Department of Defense had already begun a seismic shift and awarded a contract for the new Military Health System-wide electronic health record system. In this perspective, we discuss the implications of the intersection of two large-scope and large-scale initiatives, health system transformation and information technology modernization, being rolled out in the largest and most complex federal agency, along with potential risk-mitigating steps. The Military Health System will require expanded, unified clinical leadership to spearhead the short-term transformation; furthermore, a cadre of informatics expertise must still be developed, organized, and grown to expand the use and diffusion of novel solutions, such as health information exchanges and data analytics, that transcend organizational barriers, in order to achieve the long-term aim of health system reform as envisioned by the National Defense Authorization Act for Fiscal Year 2017.

  11. Method of developing all-optical trinary JK, D-type, and T-type flip-flops using semiconductor optical amplifiers.

    PubMed

    Garai, Sisir Kumar

    2012-04-10

    To meet the demands of very fast and agile optical networks, the optical processors in a network system should have a very fast execution rate and large information-handling and information-storage capacities. Multivalued logic operations and multistate optical flip-flops are the basic building blocks of such fast optical computing and data processing systems. In the past two decades, many methods of implementing all-optical flip-flops have been proposed. Most of these suffer from speed limitations because of the slow switching response of active devices. The frequency encoding technique has been used here because of its many advantages: a frequency-encoded signal preserves its identity throughout data communication irrespective of the loss of light energy due to reflection, refraction, attenuation, etc. The polarization-rotation-based, very fast switching action of semiconductor optical amplifiers increases processing speed, while tristate optical flip-flops increase information-handling capacity.

  12. Tool Use Within NASA Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Shigeta, Denise; Port, Dan; Nikora, Allen P.; Wilf, Joel

    2013-01-01

    As space mission software systems become larger and more complex, it is increasingly important for the software assurance effort to have the ability to effectively assess both the artifacts produced during software system development and the development process itself. Conceptually, assurance is a straightforward idea - it is the result of activities carried out by an organization independent of the software developers to better inform project management of potential technical and programmatic risks, and thus increase management's confidence in the decisions they ultimately make. In practice, effective assurance for large, complex systems often entails assessing large, complex software artifacts (e.g., requirements specifications, architectural descriptions) as well as substantial amounts of unstructured information (e.g., anomaly reports resulting from testing activities during development). In such an environment, assurance engineers can benefit greatly from appropriate tool support. In order to do so, an assurance organization will need accurate and timely information on the tool support available for various types of assurance activities. In this paper, we investigate the current use of tool support for assurance organizations within NASA, and describe on-going work at JPL for providing assurance organizations with the information about tools they need to use them effectively.

  13. Health system reform and the role of field sites based upon demographic and health surveillance.

    PubMed Central

    Tollman, S. M.; Zwi, A. B.

    2000-01-01

    Field sites for demographic and health surveillance have made well-recognized contributions to the evaluation of new or untested interventions, largely through efficacy trials involving new technologies or the delivery of selected services, e.g. vaccines, oral rehydration therapy and alternative contraceptive methods. Their role in health system reform, whether national or international, has, however, proved considerably more limited. The present article explores the characteristics and defining features of such field sites in low-income and middle-income countries and argues that many currently active sites have a largely untapped potential for contributing substantially to national and subnational health development. Since the populations covered by these sites often correspond with the boundaries of districts or subdistricts, the strategic use of information generated by demographic surveillance can inform the decentralization efforts of national and provincial health authorities. Among the areas of particular importance are the following: making population-based information available and providing an information resource; evaluating programmes and interventions; and developing applications to policy and practice. The question is posed as to whether their potential contribution to health system reform justifies arguing for adaptations to these field sites and expanded investment in them. PMID:10686747

  14. A decision support system for map projections of small scale data

    USGS Publications Warehouse

    Finn, Michael P.; Usery, E. Lynn; Posch, Stephan T.; Seong, Jeong Chang

    2004-01-01

    The use of commercial geographic information system software to process large raster datasets of terrain elevation, population, land cover, vegetation, soils, temperature, and rainfall requires both projection from spherical coordinates to plane coordinate systems and transformation from one plane system to another. Decision support systems deliver information resulting in knowledge that assists in policies, priorities, or processes. This paper presents an approach to handling the problems of raster dataset projection and transformation through the development of a Web-enabled decision support system to aid users of transformation processes with the selection of appropriate map projections based on data type, areal extent, location, and preservation properties.
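
    The abstract names the selection criteria (data type, areal extent, location, and preservation properties) without giving the rules themselves; a tiny, hypothetical rule base over two of those criteria might look like this (the thresholds and projection choices below are illustrative only, not the system's actual logic):

```python
def suggest_projection(min_lat, max_lat, property_needed):
    """Toy rule base in the spirit of a projection decision support
    system: choose by latitudinal extent, location, and whether area
    or shape must be preserved. (Illustrative rules only.)"""
    mid = (min_lat + max_lat) / 2.0
    if max_lat - min_lat > 90:                      # near-global extent
        return "Mollweide" if property_needed == "area" else "Mercator"
    if abs(mid) > 60:                               # polar region
        return ("Lambert Azimuthal Equal-Area" if property_needed == "area"
                else "Polar Stereographic")
    if abs(mid) < 25:                               # tropical band
        return ("Cylindrical Equal-Area" if property_needed == "area"
                else "Mercator")
    return ("Albers Equal-Area Conic" if property_needed == "area"
            else "Lambert Conformal Conic")

print(suggest_projection(25, 50, "area"))  # Albers Equal-Area Conic
```

    A real decision support system would also weigh data type and longitudinal extent, but even this sketch shows how a few explicit rules can encode cartographic guidance for users of transformation processes.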

  15. 48 CFR 24.203 - Policy.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Policy. 24.203 Section 24... PROTECTION OF PRIVACY AND FREEDOM OF INFORMATION Freedom of Information Act 24.203 Policy. (a) The Act... a large and increasing body of court rulings and policy guidance, contracting officers are cautioned...

  16. Fuelling a National Innovation System in Colombia

    ERIC Educational Resources Information Center

    Lucio-Arias, Diana

    2006-01-01

    This presentation of the innovation-driven environment in Colombia derives from important national efforts to gather and store pertinent information. Two large surveys have tested the "innovative behaviour" of Colombian manufacturing firms--the more recent of these was in 2005. Another information source is the Scienti platform, an…

  17. How to Search the ERIC File.

    ERIC Educational Resources Information Center

    Mathies, Lorraine

    1972-01-01

    The ERIC information system is designed for computerized information storage and retrieval. While the computer can play an increasingly more vital role in facilitating reference searches of large literature collections, experience shows that manual searching gives the user skills and expertise that are essential to effectively use the computerized…

  18. Program review presentation to Level 1, Interagency Coordination Committee

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Progress in the development of crop inventory technology is reported. Specific topics include the results of a thematic mapper analysis, variable selection studies/early season estimator improvements, the agricultural information system simulator, large unit proportion estimation, and development of common features for multi-satellite information extraction.

  19. Documents Similarity Measurement Using Field Association Terms.

    ERIC Educational Resources Information Center

    Atlam, El-Sayed; Fuketa, M.; Morita, K.; Aoe, Jun-ichi

    2003-01-01

    Discussion of text analysis and information retrieval and measurement of document similarity focuses on a new text manipulation system called FA (field association)-Sim that is useful for retrieving information in large heterogeneous texts and for recognizing content similarity in text excerpts. Discusses recall and precision, automatic indexing…

  20. Learning of spatio-temporal codes in a coupled oscillator system.

    PubMed

    Orosz, Gábor; Ashwin, Peter; Townley, Stuart

    2009-07-01

    In this paper, we consider a learning strategy that allows one to transmit information between two coupled phase oscillator systems (called teaching and learning systems) via frequency adaptation. The dynamics of these systems can be modeled with reference to a number of partially synchronized cluster states and transitions between them. Forcing the teaching system by steady but spatially nonhomogeneous inputs produces cyclic sequences of transitions between the cluster states, that is, information about inputs is encoded via a "winnerless competition" process into spatio-temporal codes. The large variety of codes can be learned by the learning system that adapts its frequencies to those of the teaching system. We visualize the dynamics using "weighted order parameters (WOPs)" that are analogous to "local field potentials" in neural systems. Since spatio-temporal coding is a mechanism that appears in olfactory systems, the developed learning rules may help to extract information from these neural ensembles.
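
    The frequency-adaptation idea in the abstract can be illustrated with the smallest possible case: one teaching and one learning phase oscillator, where the learner's natural frequency drifts according to the phase difference. (The full study works with clusters of coupled oscillators and spatio-temporal codes; the gains and step size below are arbitrary choices for this sketch.)

```python
import math

def learn_frequency(omega_teacher=1.0, omega0=0.3, K=0.5, eps=0.05,
                    dt=0.01, steps=50000):
    """Single teacher/learner phase-oscillator pair with frequency
    adaptation (Euler integration):
        theta_l' = omega_l + K * sin(theta_t - theta_l)
        omega_l' = eps * sin(theta_t - theta_l)
    The learner's natural frequency drifts toward the teacher's."""
    th_t = th_l = 0.0
    om_l = omega0
    for _ in range(steps):
        d = math.sin(th_t - th_l)       # phase-difference signal
        th_t += omega_teacher * dt      # teacher advances at a fixed rate
        th_l += (om_l + K * d) * dt     # learner: own frequency + coupling
        om_l += eps * d * dt            # frequency adaptation ("learning")
    return om_l

print(round(learn_frequency(), 2))  # 1.0: the learner adopts the teacher's frequency
```

    Once the pair phase-locks, the adaptation term drives the frequency mismatch to zero, which is the mechanism that lets the learning system reproduce the teaching system's behaviour.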

  1. Exploring nursing e-learning systems success based on information system success model.

    PubMed

    Chang, Hui-Chuan; Liu, Chung-Feng; Hwang, Hsin-Ginn

    2011-12-01

    E-learning is regarded as an innovative approach to enhancing nurses' care-service knowledge. Extensive research has provided rich information on system development, course design, and nurses' satisfaction with e-learning systems. However, a comprehensive view of what makes a nursing e-learning system successful is an important but less-studied topic. The purpose of this research was to explore the net benefits of nursing e-learning systems based on the updated DeLone and McLean Information System Success Model. The study used a self-administered questionnaire to collect 208 valid responses from nurses at 21 of Taiwan's medium- and large-scale hospitals that have implemented nursing e-learning systems. The results confirm that the model is sufficient to explain nurses' use of e-learning systems in terms of intention to use, user satisfaction, and net benefits. However, while the three exogenous quality factors (system quality, information quality, and service quality) were all found to be critical factors affecting user satisfaction, only information quality showed a direct effect on intention to use. This study provides useful insights for evaluating nursing e-learning system quality, as well as an understanding of nurses' intentions and satisfaction related to performance benefits.

  2. Promoting meaningful use of health information technology in Israel: ministry of health vision.

    PubMed

    Gerber, Ayala; Topaz, Maxim Max

    2014-01-01

    The Ministry of Health (MOH) of Israel has overall responsibility for the healthcare system. In recent years the MOH has developed strong capabilities in the areas of technology assessment and prioritization of new technologies. Israel completed the transition to computerized medical records a decade ago in most care settings; however, the process was spontaneous, without government control or standards setting, so large variations arose among systems and among organizations. Currently, the main challenge is to convert the information scattered across different systems into organized, visible information and to make it available to the various levels of health management. The MOH's solution is to implement a single information system from a specific vendor at all hospitals and all HMO clinics in order to achieve interoperability. The system will enable access to the patient's medical record history from any location.

  3. Making automated computer program documentation a feature of total system design

    NASA Technical Reports Server (NTRS)

    Wolf, A. W.

    1970-01-01

    It is pointed out that in large-scale computer software systems, program documents are too often fraught with errors, out of date, poorly written, and sometimes nonexistent in whole or in part. The means are described by which many of these typical system documentation problems were overcome in a large and dynamic software project. A systems approach was employed which encompassed such items as: (1) configuration management; (2) standards and conventions; (3) collection of program information into central data banks; (4) interaction among executive, compiler, central data banks, and configuration management; and (5) automatic documentation. A complete description of the overall system is given.

  4. Remote sensing information sciences research group

    NASA Technical Reports Server (NTRS)

    Estes, John E.; Smith, Terence; Star, Jeffrey L.

    1988-01-01

    Research conducted under this grant was used to extend and expand existing remote sensing activities at the University of California, Santa Barbara in the areas of georeferenced information systems, machine-assisted information extraction from image data and large spatial data bases, artificial intelligence, and vegetation analysis and modeling. The research thrusts during the past year are summarized. The projects are discussed in some detail.

  5. Using Process Redesign and Information Technology to Improve Procurement

    DTIC Science & Technology

    1994-04-01

    contractor. Many large-volume contractors have automated order processing tied to accounting, manufacturing, and shipping subsystems. Currently...the contractor must receive the mailed order, analyze it, extract pertinent information, and enter that information into the automated order...processing system. Almost all orders for small purchases are unilateral documents that do not require acceptance or acknowledgment by the contractor. For

  6. Computer User's Guide to the Protection of Information Resources. NIST Special Publication 500-171.

    ERIC Educational Resources Information Center

    Helsing, Cheryl; And Others

    Computers have changed the way information resources are handled. Large amounts of information are stored in one central place and can be accessed from remote locations. Users have a personal responsibility for the security of the system and the data stored in it. This document outlines the user's responsibilities and provides security and control…

  7. Information Equity and Information Technology: Some Preliminary Findings from a Videotex Field Trial.

    ERIC Educational Resources Information Center

    Ettema, James S.

    A study was conducted to determine who, within a target user group, used and benefitted from a videotex system. The subjects were large-scale farmers who agreed to have videotex terminals installed in their homes to receive a wide range of informational and commercial transaction services provided by a bank holding company. At the end of an…

  8. Information Visualization and Proposing New Interface for Movie Retrieval System (IMDB)

    ERIC Educational Resources Information Center

    Etemadpour, Ronak; Masood, Mona; Belaton, Bahari

    2010-01-01

    This research studies the development of a new prototype visualization in support of movie retrieval. The goal of information visualization is to reveal large amounts of data or abstract data sets through visual presentation. With this in mind, the main goal is to develop a 2D presentation of information on movies from the IMDB (Internet Movie…

  9. Bioinspired principles for large-scale networked sensor systems: an overview.

    PubMed

    Jacobsen, Rune Hylsberg; Zhang, Qi; Toftegaard, Thomas Skjødeberg

    2011-01-01

    Biology has often been used as a source of inspiration in computer science and engineering. Bioinspired principles have found their way into network node design and research due to the appealing analogies between biological systems and large networks of small sensors. This paper provides an overview of bioinspired principles and methods such as swarm intelligence, natural time synchronization, artificial immune system and intercellular information exchange applicable for sensor network design. Bioinspired principles and methods are discussed in the context of routing, clustering, time synchronization, optimal node deployment, localization and security and privacy.

  10. HIPAA and the military health system: organizing technological and organizational reform in large enterprises

    NASA Astrophysics Data System (ADS)

    Collmann, Jeff R.

    2001-08-01

    The global scale, multiple units, diverse operating scenarios and complex authority structure of the Department of Defense Military Health System (MHS) create social boundaries that tend to reduce communication and collaboration about data security. Under auspices of the Defense Health Information Assurance Program (DHIAP), the Telemedicine and Advanced Technology Research Center (TATRC) is contributing to the MHS's efforts to prepare for and comply with the Health Insurance Portability and Accountability Act (HIPAA) of 1996 through organizational and technological innovations that bridge such boundaries. Building interdisciplinary (clinical, administrative and information technology) medical information security readiness teams (MISRT) at each military treatment facility (MTF) constitutes the heart of this process. DHIAP is equipping and training MISRTs to use new tools including 'OCTAVE', a self-directed risk assessment instrument and 'RIMR', a web-enabled Risk Information Management Resource. DHIAP sponsors an interdisciplinary, triservice workgroup for review and revision of relevant DoD and service policies and participates in formal DoD health information assurance activities. These activities help promote a community of proponents across the MHS supportive of improved health information assurance. The MHS HIPAA-compliance effort teaches important general lessons about organizational reform in large civilian or military enterprises.

  11. Phenotype-information-phenotype cycle for deconvolution of combinatorial antibody libraries selected against complex systems.

    PubMed

    Zhang, Hongkai; Torkamani, Ali; Jones, Teresa M; Ruiz, Diana I; Pons, Jaume; Lerner, Richard A

    2011-08-16

    Use of large combinatorial antibody libraries and next-generation sequencing of nucleic acids are two of the most powerful methods in modern molecular biology. The libraries are screened using the principles of evolutionary selection, albeit in real time, to enrich for members with a particular phenotype. This selective process necessarily results in the loss of information about less-fit molecules. On the other hand, sequencing of the library, by itself, gives information that is mostly unrelated to phenotype. If the two methods could be combined, the full potential of very large molecular libraries could be realized. Here we report the implementation of a phenotype-information-phenotype cycle that integrates information and gene recovery. After selection for phage-encoded antibodies that bind to targets expressed on the surface of Escherichia coli, the information content of the selected pool is obtained by pyrosequencing. Sequences that encode specific antibodies are identified by a bioinformatic analysis and recovered by a stringent affinity method that is uniquely suited for gene isolation from a highly degenerate collection of nucleic acids. This approach can be generalized for selection of antibodies against targets that are present as minor components of complex systems.

  12. Channelling information flows from observation to decision; or how to increase certainty

    NASA Astrophysics Data System (ADS)

    Weijs, S. V.

    2015-12-01

    To make adequate decisions in an uncertain world, information needs to reach the decision problem, so that the full consequences of each possible decision can be overseen. On its way from the physical world to a decision problem, information is transferred through the physical processes that influence the sensor, then through processes inside the sensor, and onward through wires or electromagnetic waves. Over the last decade, most information has come to be digitized at some point; from the moment of digitization, information can in principle be transferred losslessly. Information about the physical world is often also stored, sometimes in compressed form, as physical laws, concepts, or models of specific hydrological systems. It is important to note, however, that all information about a physical system eventually has to originate from observation (although inevitably coloured by some prior assumptions). This colouring makes the compression lossy, but is effectively the only way to exploit similarities in time and space that enable predictions while measuring only a few macro-states of a complex hydrological system. Adding physical process knowledge to a hydrological model can thus be seen as a convenient way to transfer information from observations at a different time or place, to make predictions about another situation, assuming the same dynamics are at work. The key challenge in achieving more certainty in hydrological prediction can therefore be formulated as a challenge to tap and channel information flows from the environment. For tapping more information flows, new measurement techniques, large-scale campaigns, historical data sets, and large-sample hydrology and regionalization efforts can bring progress. For channelling the information flows with minimum loss, model calibration and model formulation techniques should be critically investigated. Some experience from research in a Swiss high alpine catchment is used as an illustration.
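    The abstract's framing, that observations channel information to a decision problem and thereby reduce uncertainty, can be made concrete with Shannon entropy: the information gained is the drop in entropy from the prior to the posterior belief. A toy sketch with hypothetical streamflow classes and invented probabilities:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# prior belief over three streamflow classes (low / medium / high)
prior = [1/3, 1/3, 1/3]
# posterior after an upstream gauge observation narrows things down
posterior = [0.8, 0.15, 0.05]

gain = entropy(prior) - entropy(posterior)
print(f"uncertainty before: {entropy(prior):.3f} bits")   # log2(3) = 1.585
print(f"uncertainty after:  {entropy(posterior):.3f} bits")
print(f"information gained: {gain:.3f} bits")
```

    A lossy compression of past observations (a model) plays the same role: it shifts the decision-maker's distribution away from the uninformed prior, and the entropy difference quantifies how much of the tapped information flow actually arrived.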

  13. Large, horizontal-axis wind turbines

    NASA Technical Reports Server (NTRS)

    Linscott, B. S.; Perkins, P.; Dennett, J. T.

    1984-01-01

    The development of the technology for safe, reliable, environmentally acceptable large wind turbines that have the potential to generate significant amounts of electricity at costs competitive with conventional electric generating systems is presented. In addition, these large wind turbines must be fully compatible with electric utility operations and interface requirements. There are several ongoing large wind system development projects and applied research efforts directed toward meeting the technology requirements for utility applications. Detailed information on these projects is provided. The Mod-O research facility and current applied research efforts in aerodynamics, structural dynamics and aeroelasticity, composite and hybrid composite materials, and multiple system interaction are described. A chronology of component research and technology development for large, horizontal-axis wind turbines is presented. Wind characteristics, wind turbine economics, and the impact of wind turbines on the environment are reported. The need for continued wind turbine research and technology development is explored. Over 40 references are cited and a bibliography is included.

  14. Development and management of a geographic information system for health research in a developing-country setting: a case study from Bangladesh.

    PubMed

    Sugimoto, Jonathan D; Labrique, Alain B; Ahmad, Salahuddin; Rashid, Mahbubur; Klemm, Rolf D W; Christian, Parul; West, Keith P

    2007-12-01

    In the last decade, geographic information systems (GIS) have become accessible to researchers in developing countries, yet guidance remains sparse for developing a GIS. Drawing on experience in developing a GIS for a large community trial in rural Bangladesh, six stages for constructing, maintaining, and using a GIS for health research purposes were outlined. The system contains 0.25 million landmarks, including 150,000 houses, in an area of 435 sq km with over 650,000 people. Assuming access to reasonably accurate paper boundary maps of the intended working area and the absence of pre-existing digital local-area maps, the six stages are: to (a) digitize and update existing paper maps, (b) join the digitized maps into a large-area map, (c) reference this large-area map to a geographic coordinate system, (d) insert location landmarks of interest, (e) maintain the GIS, and (f) link it to other research databases. These basic steps can produce a household-level, updated, scalable GIS that can both enhance field efficiency and support epidemiologic analyses of demographic patterns, diseases, and health outcomes.
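    Stage (c), referencing the joined map to a geographic coordinate system, is commonly done by fitting an affine transform to ground-control points whose positions are known in both systems. A minimal numpy sketch; the digitizer and geographic coordinates below are invented stand-ins for GPS-surveyed landmarks:

```python
import numpy as np

# Hypothetical ground-control points: digitizer (paper-map) coordinates
# paired with their known geographic coordinates (lon, lat in degrees).
digitizer = np.array([[0.0, 0.0], [10.0, 0.0], [10.0, 8.0], [0.0, 8.0]])
geographic = np.array([[89.10, 23.50], [89.20, 23.50], [89.20, 23.58], [89.10, 23.58]])

# Fit x' = a*x + b*y + c and y' = d*x + e*y + f by least squares.
A = np.column_stack([digitizer, np.ones(len(digitizer))])
coef, *_ = np.linalg.lstsq(A, geographic, rcond=None)

def to_geographic(xy):
    """Map digitized coordinates into the geographic coordinate system."""
    xy = np.atleast_2d(xy)
    return np.column_stack([xy, np.ones(len(xy))]) @ coef

print(to_geographic([5.0, 4.0]))   # centre of the sheet -> [[89.15, 23.54]]
```

    With more than three well-spread control points the least-squares fit also averages out digitizing error, which is why field protocols collect redundant control points per map sheet.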

  15. Development and Management of a Geographic Information System for Health Research in a Developing-country Setting: A Case Study from Bangladesh

    PubMed Central

    Sugimoto, Jonathan D.; Labrique, Alain B.; Salahuddin, Ahmad; Rashid, Mahbubur; Klemm, Rolf D.W.; Christian, Parul; West, Keith P.

    2007-01-01

    In the last decade, geographic information systems (GIS) have become accessible to researchers in developing countries, yet guidance remains sparse for developing a GIS. Drawing on experience in developing a GIS for a large community trial in rural Bangladesh, six stages for constructing, maintaining, and using a GIS for health research purposes were outlined. The system contains 0.25 million landmarks, including 150,000 houses, in an area of 435 sq km with over 650,000 people. Assuming access to reasonably accurate paper boundary maps of the intended working area and the absence of pre-existing digital local-area maps, the six stages are: to (a) digitize and update existing paper maps, (b) join the digitized maps into a large-area map, (c) reference this large-area map to a geographic coordinate system, (d) insert location landmarks of interest, (e) maintain the GIS, and (f) link it to other research databases. These basic steps can produce a household-level, updated, scalable GIS that can both enhance field efficiency and support epidemiologic analyses of demographic patterns, diseases, and health outcomes. PMID:18402187

  16. Geoinformation web-system for processing and visualization of large archives of geo-referenced data

    NASA Astrophysics Data System (ADS)

    Gordov, E. P.; Okladnikov, I. G.; Titov, A. G.; Shulgina, T. M.

    2010-12-01

    A working model of an information-computational system aimed at scientific research in the area of climate change is presented. The system allows processing and analysis of large archives of geophysical data obtained both from observations and from modeling. Accumulated experience in developing information-computational web-systems providing computational processing and visualization of large archives of geo-referenced data was used during the implementation (Gordov et al, 2007; Okladnikov et al, 2008; Titov et al, 2009). The functional capabilities of the system comprise a set of procedures for mathematical and statistical analysis, processing, and visualization of data. At present five data archives are available for processing: the 1st and 2nd editions of the NCEP/NCAR Reanalysis, the ECMWF ERA-40 Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, and the NOAA-CIRES XX Century Global Reanalysis Version I. To provide data processing functionality, a computational modular kernel and a class library providing data access for computational modules were developed. Currently a set of computational modules for climate change indices approved by WMO is available. A special module providing visualization of results and export to Encapsulated PostScript, GeoTIFF, and ESRI shape files was also developed. As a technological basis for the representation of cartographical information on the Internet, the GeoServer software conforming to OpenGIS standards is used. Integration of GIS functionality with web-portal software provides a basis for the web-portal's development as part of the geoinformation web-system. Such a geoinformation web-system is a next step in the development of applied information-telecommunication systems, offering specialists from various scientific fields unique opportunities to perform reliable analysis of heterogeneous geophysical data using approved computational algorithms.
    It will allow a wide range of researchers to work with geophysical data without specific programming knowledge and to concentrate on solving their specific tasks. The system would be of special importance for education in the climate change domain. This work is partially supported by RFBR grant #10-07-00547, SB RAS Basic Program Projects 4.31.1.5 and 4.31.2.7, and SB RAS Integration Projects 4 and 9.
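    The WMO-approved climate change indices mentioned above include very simple per-grid-cell counts; one of the standard ETCCDI indices, frost days (FD), can be sketched in a few lines. The gridded input here is synthetic, standing in for a reanalysis archive:

```python
import numpy as np

def frost_days(tmin_daily):
    """ETCCDI 'FD' index: annual count of days with daily minimum
    temperature below 0 degrees C, computed per grid cell.

    tmin_daily: array of shape (days, lat, lon), in degrees C.
    """
    return (tmin_daily < 0.0).sum(axis=0)

rng = np.random.default_rng(1)
# synthetic year of daily minima on a 3x4 grid: seasonal cycle plus noise,
# coldest around the start of the (Northern Hemisphere) year
days = np.arange(365)
cycle = -10.0 * np.cos(2 * np.pi * days / 365)
tmin = cycle[:, None, None] + rng.normal(5.0, 3.0, (365, 3, 4))

fd = frost_days(tmin)
print(fd.shape, int(fd.min()), int(fd.max()))
```

    In a system like the one described, such a module would read its input slab through the data-access class library and hand the resulting 2D field to the visualization module for map rendering and GeoTIFF export.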

  17. A web-based laboratory information system to improve quality of care of tuberculosis patients in Peru: functional requirements, implementation and usage statistics.

    PubMed

    Blaya, Joaquin A; Shin, Sonya S; Yagui, Martin J A; Yale, Gloria; Suarez, Carmen Z; Asencios, Luis L; Cegielski, J Peter; Fraser, Hamish S F

    2007-10-28

    Multi-drug resistant tuberculosis patients in resource-poor settings experience large delays in starting appropriate treatment and may not be monitored appropriately due to an overburdened laboratory system, delays in communication of results, and missing or error-prone laboratory data. The objective of this paper is to describe an electronic laboratory information system implemented to alleviate these problems and its expanding use by the Peruvian public sector, as well as examine the broader issues of implementing such systems in resource-poor settings. A web-based laboratory information system "e-Chasqui" has been designed and implemented in Peru to improve the timeliness and quality of laboratory data. It was deployed in the national TB laboratory, two regional laboratories and twelve pilot health centres. Using needs assessment and workflow analysis tools, e-Chasqui was designed to provide for improved patient care, increased quality control, and more efficient laboratory monitoring and reporting. Since its full implementation in March 2006, 29,944 smear microscopy, 31,797 culture and 7,675 drug susceptibility test results have been entered. Over 99% of these results have been viewed online by the health centres. High user satisfaction and heavy use have led to the expansion of e-Chasqui to additional institutions. In total, e-Chasqui will serve a network of institutions providing medical care for over 3.1 million people. The cost to maintain this system is approximately US$0.53 per sample or 1% of the National Peruvian TB program's 2006 budget. Electronic laboratory information systems have a large potential to improve patient care and public health monitoring in resource-poor settings. Some of the challenges faced in these settings, such as lack of trained personnel, limited transportation, and large coverage areas, are obstacles that a well-designed system can overcome. e-Chasqui has the potential to provide a national TB laboratory network in Peru. 
    Furthermore, the core functionality of e-Chasqui has been implemented in the open source medical record system OpenMRS http://www.openmrs.org for other countries to use.

  18. A web-based laboratory information system to improve quality of care of tuberculosis patients in Peru: functional requirements, implementation and usage statistics

    PubMed Central

    Blaya, Joaquin A; Shin, Sonya S; Yagui, Martin JA; Yale, Gloria; Suarez, Carmen Z; Asencios, Luis L; Cegielski, J Peter; Fraser, Hamish SF

    2007-01-01

    Background Multi-drug resistant tuberculosis patients in resource-poor settings experience large delays in starting appropriate treatment and may not be monitored appropriately due to an overburdened laboratory system, delays in communication of results, and missing or error-prone laboratory data. The objective of this paper is to describe an electronic laboratory information system implemented to alleviate these problems and its expanding use by the Peruvian public sector, as well as examine the broader issues of implementing such systems in resource-poor settings. Methods A web-based laboratory information system "e-Chasqui" has been designed and implemented in Peru to improve the timeliness and quality of laboratory data. It was deployed in the national TB laboratory, two regional laboratories and twelve pilot health centres. Using needs assessment and workflow analysis tools, e-Chasqui was designed to provide for improved patient care, increased quality control, and more efficient laboratory monitoring and reporting. Results Since its full implementation in March 2006, 29,944 smear microscopy, 31,797 culture and 7,675 drug susceptibility test results have been entered. Over 99% of these results have been viewed online by the health centres. High user satisfaction and heavy use have led to the expansion of e-Chasqui to additional institutions. In total, e-Chasqui will serve a network of institutions providing medical care for over 3.1 million people. The cost to maintain this system is approximately US$0.53 per sample or 1% of the National Peruvian TB program's 2006 budget. Conclusion Electronic laboratory information systems have a large potential to improve patient care and public health monitoring in resource-poor settings. Some of the challenges faced in these settings, such as lack of trained personnel, limited transportation, and large coverage areas, are obstacles that a well-designed system can overcome. 
    e-Chasqui has the potential to provide a national TB laboratory network in Peru. Furthermore, the core functionality of e-Chasqui has been implemented in the open source medical record system OpenMRS for other countries to use. PMID:17963522

  19. Impact of Thailand universal coverage scheme on the country's health information systems and health information technology.

    PubMed

    Kijsanayotin, Boonchai

    2013-01-01

    Thailand achieved universal healthcare coverage with the implementation of the Universal Coverage Scheme (UCS) in 2001. This study employed qualitative methods to explore the impact of the UCS on the country's health information systems (HIS) and health information technology (HIT) development. The results show that the health insurance beneficiary registration system helps improve providers' service workflow and the country's vital statistics. Implementation of the casemix financing tool, Thai Diagnosis-Related Groups, has stimulated health providers' HIS and HIT capacity building, improved data and medical record quality, and encouraged adoption of national administrative data standards. The system called "Disease Management Information Systems", aimed at reimbursement for selected diseases, increased the fragmentation of HIS and the data management burden on providers. The financial incentives of the outpatient data quality improvement project enhanced providers' HIS and HIT investment but also induced a tendency toward data fraud. Overall, implementation of the UCS has had a largely favorable impact on the country's HIS and HIT development; however, unfavorable effects are also evident.

  20. Enabling Controlling Complex Networks with Local Topological Information.

    PubMed

    Li, Guoqi; Deng, Lei; Xiao, Gaoxi; Tang, Pei; Wen, Changyun; Hu, Wuhua; Pei, Jing; Shi, Luping; Stanley, H Eugene

    2018-03-15

    Complex networks characterize the nature of internal/external interactions in real-world systems, including social, economic, biological, ecological, and technological networks. Two issues remain obstacles to achieving control of large-scale networks: structural controllability, which describes the ability to guide a dynamical system from any initial state to any desired final state in finite time with a suitable choice of inputs; and optimal control, which is a typical control approach to minimize the cost of driving the network to a predefined state with a given number of control inputs. For large complex networks without global information about network topology, both problems remain essentially open. Here we combine graph theory and control theory to tackle the two problems in one go, using only local network topology information. For the structural controllability problem, a distributed local-game matching method is proposed, in which every node plays a simple Bayesian game with local information and local interactions with adjacent nodes, ensuring a suboptimal solution at linear complexity. Starting from any structural controllability solution, a minimizing-longest-control-path method can efficiently reach a good solution for optimal control in large networks. Our results provide solutions for distributed complex network control and demonstrate a way to link structural controllability and optimal control together.
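    For intuition, the centralized baseline that such distributed matching methods approximate is maximum matching on the network's bipartite representation (Liu et al., Nature 2011): the unmatched nodes form a minimum driver-node set. A pure-Python augmenting-path sketch on a small hypothetical digraph:

```python
def max_matching(n, edges):
    """Maximum matching on the bipartite representation of a digraph:
    left side = source copies, right side = target copies of the n nodes."""
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
    match = [-1] * n                       # match[v] = left node matched to v

    def augment(u, seen):
        # Kuhn's algorithm: try to match u, recursively rerouting conflicts
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                if match[v] == -1 or augment(match[v], seen):
                    match[v] = u
                    return True
        return False

    return sum(augment(u, set()) for u in range(n))

edges = [(0, 1), (1, 2), (2, 3), (1, 4)]   # hypothetical 5-node directed network
m = max_matching(5, edges)
# minimum number of driver nodes: unmatched nodes, or 1 for a perfect matching
print(5 - m if m < 5 else 1)               # -> 2
```

    This global computation needs the full edge list, which is exactly what the paper's local-game formulation avoids by letting each node negotiate its matched status with neighbors only.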

  1. Factors shaping the evolution of electronic documentation systems

    NASA Technical Reports Server (NTRS)

    Dede, Christopher J.; Sullivan, Tim R.; Scace, Jacque R.

    1990-01-01

    The main goal is to prepare the space station technical and managerial structure for likely changes in the creation, capture, transfer, and utilization of knowledge. By anticipating advances, the design of Space Station Project (SSP) information systems can be tailored to facilitate a progression of increasingly sophisticated strategies as the space station evolves. Future generations of advanced information systems will use increases in power to deliver environmentally meaningful, contextually targeted, interconnected data (knowledge). The concept of a Knowledge Base Management System is emerging when the problem is focused on how information systems can perform such a conversion of raw data. Such a system would include traditional management functions for large space databases. Added artificial intelligence features might encompass co-existing knowledge representation schemes; effective control structures for deductive, plausible, and inductive reasoning; means for knowledge acquisition, refinement, and validation; explanation facilities; and dynamic human intervention. The major areas covered include: alternative knowledge representation approaches; advanced user interface capabilities; computer-supported cooperative work; the evolution of information system hardware; standardization, compatibility, and connectivity; and organizational impacts of information intensive environments.

  2. Learning to leverage existing information systems: Part 1. Principles.

    PubMed

    Neil, Nancy; Nerenz, David

    2003-10-01

    The success of performance improvement efforts depends on effective measurement and feedback regarding clinical processes and outcomes. Yet most health care organizations have fragmented rather than integrated data systems. Methods and practical guidance are provided for leveraging available information sources to obtain and create valid performance improvement-related information for use by clinicians and administrators. At Virginia Mason Health System (VMHS; Seattle), a vertically integrated hospital and multispecialty group practice, patient records are paper based and are supplemented with electronic reporting for laboratory and radiology services. Despite growth in the resources and interest devoted to organization-wide performance measurement, quality improvement, and evidence-based tools, VMHS's information systems consist of largely stand-alone, legacy systems organized around the ability to retrieve information on patients, one at a time. By 2002, without any investment in technology, VMHS had developed standardized, clinic-wide key indicators of performance updated and reported regularly at the patient, provider, site, and organizational levels. On the basis of VMHS's experience, principles can be suggested to guide other organizations to explore solutions using their own information systems: for example, start simply, but start; identify information needs; tap multiple data streams; and improve incrementally.

  3. An interactive web application for the dissemination of human systems immunology data.

    PubMed

    Speake, Cate; Presnell, Scott; Domico, Kelly; Zeitner, Brad; Bjork, Anna; Anderson, David; Mason, Michael J; Whalen, Elizabeth; Vargas, Olivia; Popov, Dimitry; Rinchai, Darawan; Jourde-Chiche, Noemie; Chiche, Laurent; Quinn, Charlie; Chaussabel, Damien

    2015-06-19

    Systems immunology approaches have proven invaluable in translational research settings. The current rate at which large-scale datasets are generated presents unique challenges and opportunities. Mining aggregates of these datasets could accelerate the pace of discovery, but new solutions are needed to integrate the heterogeneous data types with the contextual information that is necessary for interpretation. In addition, enabling tools and technologies facilitating investigators' interaction with large-scale datasets must be developed in order to promote insight and foster knowledge discovery. State of the art application programming was employed to develop an interactive web application for browsing and visualizing large and complex datasets. A collection of human immune transcriptome datasets were loaded alongside contextual information about the samples. We provide a resource enabling interactive query and navigation of transcriptome datasets relevant to human immunology research. Detailed information about studies and samples are displayed dynamically; if desired the associated data can be downloaded. Custom interactive visualizations of the data can be shared via email or social media. This application can be used to browse context-rich systems-scale data within and across systems immunology studies. This resource is publicly available online at [Gene Expression Browser Landing Page ( https://gxb.benaroyaresearch.org/dm3/landing.gsp )]. The source code is also available openly [Gene Expression Browser Source Code ( https://github.com/BenaroyaResearch/gxbrowser )]. We have developed a data browsing and visualization application capable of navigating increasingly large and complex datasets generated in the context of immunological studies. This intuitive tool ensures that, whether taken individually or as a whole, such datasets generated at great effort and expense remain interpretable and a ready source of insight for years to come.

  4. [Revelation of purchase system of developed nation to large medical equipment group purchase in our country].

    PubMed

    Tao, Lin; Guan, Bing; Liu, Shan

    2011-01-01

    Purchase systems in developed nations share several features, such as clear purchase objectives, flexible methods, standardized procedures, an emphasis on competition, and an open process. The suggested measures include bringing the role of competitive purchasing into play; establishing a modern e-business purchasing information system; establishing a legislative system; and improving group purchasing practice.

  5. Northeast Artificial Intelligence Consortium (NAIC). Volume 12. Computer Architecture for Very Large Knowledge Bases

    DTIC Science & Technology

    1990-12-01

    data rate to the electronics would be much lower on the average and the data much "richer" in information. Intelligent use of...system bottleneck, a high data rate should be provided by I/O systems. 2. machines with intelligent storage management specially designed for logic...management information processing, surveillance sensors, intelligence data collection and handling, solid state sciences, electromagnetics, and propagation, and electronic reliability/maintainability and compatibility.

  6. Information driving force and its application in agent-based modeling

    NASA Astrophysics Data System (ADS)

    Chen, Ting-Ting; Zheng, Bo; Li, Yan; Jiang, Xiong-Fei

    2018-04-01

    Exploring the scientific impact of online big data has attracted much attention from researchers in different fields in recent years. Complex financial systems are typical open systems profoundly influenced by external information. Based on large-scale data from the public media and stock markets, we first define an information driving force and analyze how it affects the complex financial system. The information driving force is observed to be asymmetric between the bull and bear market states. As an application, we then propose an agent-based model driven by the information driving force. Notably, all the key parameters are determined from the empirical analysis rather than from statistical fitting of the simulation results. With our model, both the stationary properties and non-stationary dynamic behaviors are simulated. Considering the mean-field effect of the external information, we also propose a few-body model to simulate the financial market in the laboratory.
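    The agent-based setup described above can be sketched as a minimal toy model (a sketch under stated assumptions: the agent count, the logistic response, and the Gaussian stand-in for the information signal are illustrative choices, not parameters from the paper):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate(info, n_agents=100, sensitivity=0.5):
        """Toy agent-based market driven by an external information signal.

        info: array of external information values (e.g. news intensity);
              positive values tilt agents toward buying, negative toward selling.
        Returns the simulated return series (mean agent decision per step).
        """
        returns = []
        for x in info:
            # Each agent buys (+1) with a probability tilted by the signal.
            p_buy = 1.0 / (1.0 + np.exp(-sensitivity * x))
            decisions = np.where(rng.random(n_agents) < p_buy, 1, -1)
            returns.append(decisions.mean())
        return np.array(returns)

    info = rng.normal(size=1000)   # stand-in information driving force
    r = simulate(info)
    print(r.shape)
    ```

    In this toy version the simulated returns correlate positively with the driving signal, which is the qualitative behavior such a model is built to reproduce.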

  7. Renormalization of concurrence: The application of the quantum renormalization group to quantum-information systems

    NASA Astrophysics Data System (ADS)

    Kargarian, M.; Jafari, R.; Langari, A.

    2007-12-01

    We have combined the ideas of the renormalization group and quantum-information theory. We show how the entanglement, measured by concurrence, evolves as the size of the system becomes large, i.e., we obtain its finite-size scaling. Moreover, we show how the renormalization-group approach can be implemented to obtain the quantum-information properties of a many-body system. We obtain the concurrence as a measure of entanglement, its derivatives, and their scaling behavior versus system size for the one-dimensional Ising model in a transverse field. We find that the derivative of the concurrence between two blocks, each containing half of the system, diverges at the critical point with an exponent that is directly associated with the divergence of the correlation length.
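    For reference, the concurrence used above as the entanglement measure has, for a two-qubit density matrix $\rho$, the standard Wootters closed form:

    ```latex
    C(\rho) = \max\{0,\ \lambda_1 - \lambda_2 - \lambda_3 - \lambda_4\},
    \qquad
    \tilde{\rho} = (\sigma_y \otimes \sigma_y)\,\rho^{*}\,(\sigma_y \otimes \sigma_y),
    ```

    where $\lambda_1 \ge \lambda_2 \ge \lambda_3 \ge \lambda_4$ are the square roots of the eigenvalues of $\rho\tilde{\rho}$ and $\rho^{*}$ is the complex conjugate of $\rho$ in the standard basis.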

  8. Study of Tools for Command and Telemetry Dictionaries

    NASA Technical Reports Server (NTRS)

    Pires, Craig; Knudson, Matthew D.

    2017-01-01

    The Command and Telemetry Dictionary is at the heart of space missions. The C&T Dictionary represents all of the information that is exchanged between the various systems both in space and on the ground. Large amounts of ever-changing information have to be disseminated across the various systems and sub-systems throughout all phases of the mission. The typical approach of having each sub-system manage its own information flow results in a patchwork of methods within a mission, leading to significant duplication of effort and potential errors. More centralized methods have been developed to manage this data flow. This presentation will compare two tools developed for this purpose, CCDD and SCIMI, both designed to work with the Core Flight System (cFS).

  9. Unification - An international aerospace information issue

    NASA Technical Reports Server (NTRS)

    Cotter, Gladys A.; Lahr, Thomas F.

    1992-01-01

    Scientific and Technical Information (STI) represents the results of large investments in research and development (R&D) and the expertise of a nation and is a valuable resource. For more than four decades, NASA and its predecessor organizations have developed and managed the preeminent aerospace information system. NASA obtains foreign materials through its international exchange relationships, continually increasing the comprehensiveness of the NASA Aerospace Database (NAD). The NAD is de facto the international aerospace database. This paper reviews current NASA goals and activities with a view toward maintaining compatibility among international aerospace information systems, eliminating duplication of effort, and sharing resources through international cooperation wherever possible.

  10. Automated information-analytical system for thunderstorm monitoring and early warning alarms using modern physical sensors and information technologies with elements of artificial intelligence

    NASA Astrophysics Data System (ADS)

    Boldyreff, Anton S.; Bespalov, Dmitry A.; Adzhiev, Anatoly Kh.

    2017-05-01

    Methods of artificial intelligence are well suited to forecasting weather phenomena because they can process large amounts of diverse data. In this paper, Recirculation Neural Networks are applied to the prediction of thunderstorm events. Large amounts of experimental data from lightning sensors and electric-field-mill networks are received and analyzed, and the average recognition accuracy of the sensor signals is calculated. Recirculation Neural Networks are shown to be a promising approach to forecasting thunderstorms and related weather phenomena: they recognize elements of the sensor signals with high efficiency, and they can compress images and highlight their characteristic features for subsequent recognition.
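    The compress-then-recognize idea can be illustrated with a minimal sketch (a plain linear autoencoder trained by gradient descent stands in here; the actual recirculation learning rule and the thunderstorm sensor data differ):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Toy compression sketch: a tied-weight linear autoencoder reduces
    # 16-dim "sensor signals" to a 4-dim code and reconstructs them.
    X = rng.normal(size=(200, 16))
    X[:, :4] *= 5.0                 # a few dominant signal components

    W = rng.normal(scale=0.1, size=(16, 4))   # encoder; decoder is W.T
    lr = 0.001
    for _ in range(500):
        code = X @ W                # compress
        recon = code @ W.T          # reconstruct
        err = recon - X
        # Gradient of 0.5*||X W W^T - X||^2 with tied weights.
        W -= lr * (X.T @ err @ W + err.T @ X @ W) / len(X)

    final_err = np.mean((X @ W @ W.T - X) ** 2)
    print(final_err)
    ```

    After training, the 4-dim code captures the dominant components, so the reconstruction error falls well below the trivial (all-zero code) baseline.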

  11. Public Health Surveillance and Meaningful Use Regulations: A Crisis of Opportunity

    PubMed Central

    Sundwall, David N.

    2012-01-01

    The Health Information Technology for Economic and Clinical Health Act is intended to enhance reimbursement of health care providers for meaningful use of electronic health records systems. This presents both opportunities and challenges for public health departments. To earn incentive payments, clinical providers must exchange specified types of data with the public health system, such as immunization and syndromic surveillance data and notifiable disease reporting. However, a crisis looms because public health’s information technology systems largely lack the capabilities to accept the types of data proposed for exchange. Cloud computing may be a solution for public health information systems. Through shared computing resources, public health departments could reap the benefits of electronic reporting within federal funding constraints. PMID:22390523

  12. Achieving cost reductions in EOSDIS operations through technology evolution

    NASA Technical Reports Server (NTRS)

    Newsome, Penny; Moe, Karen; Harberts, Robert

    1996-01-01

    The Earth Observing System (EOS) Data and Information System (EOSDIS) mission includes the cost-effective management and distribution of large amounts of data to the Earth science community. The effect of the introduction of new information system technologies on the evolution of EOSDIS is considered. One of the steps taken by NASA to enable the introduction of new information system technologies into EOSDIS is the funding of technology development through prototyping. Recent and ongoing prototyping efforts and their potential impact on the performance and cost-effectiveness of EOSDIS are discussed. The technology evolution process as it relates to the effective operation of EOSDIS is described, and methods are identified for supporting the transfer of relevant technology to EOSDIS components.

  13. Public health surveillance and meaningful use regulations: a crisis of opportunity.

    PubMed

    Lenert, Leslie; Sundwall, David N

    2012-03-01

    The Health Information Technology for Economic and Clinical Health Act is intended to enhance reimbursement of health care providers for meaningful use of electronic health records systems. This presents both opportunities and challenges for public health departments. To earn incentive payments, clinical providers must exchange specified types of data with the public health system, such as immunization and syndromic surveillance data and notifiable disease reporting. However, a crisis looms because public health's information technology systems largely lack the capabilities to accept the types of data proposed for exchange. Cloud computing may be a solution for public health information systems. Through shared computing resources, public health departments could reap the benefits of electronic reporting within federal funding constraints.

  14. Biomedical data mining in clinical routine: expanding the impact of hospital information systems.

    PubMed

    Müller, Marcel; Markó, Kornel; Daumke, Philipp; Paetzold, Jan; Roesner, Arnold; Klar, Rüdiger

    2007-01-01

    In this paper we describe how the promising technology of biomedical data mining can improve the use of hospital information systems: a large set of unstructured, narrative clinical documents from a dermatological university hospital, such as discharge letters and other dermatological reports, was processed through a morpho-semantic text-retrieval engine ("MorphoSaurus"), integrated with other clinical data via a web-based interface, and brought into daily clinical routine. The user evaluation showed very high user acceptance; the system appears to meet clinicians' requirements for vertical data mining in electronic patient records. What emerges is the need to integrate biomedical data mining into hospital information systems for clinical, scientific, educational and economic reasons.

  15. Developing a Large Lexical Database for Information Retrieval, Parsing, and Text Generation Systems.

    ERIC Educational Resources Information Center

    Conlon, Sumali Pin-Ngern; And Others

    1993-01-01

    Important characteristics of lexical databases and their applications in information retrieval and natural language processing are explained. An ongoing project using various machine-readable sources to build a lexical database is described, and detailed designs of individual entries with examples are included. (Contains 66 references.) (EAM)

  16. Using Computers in Early Childhood Classrooms: Teachers' Attitudes, Skills and Practices

    ERIC Educational Resources Information Center

    Chen, Jie-Qi; Chang, Charles

    2006-01-01

    To better prepare early childhood teachers for computer use, more information about their current skills and classroom practices is needed. Sampling from a large metropolitan public school system in the USA, the study surveyed 297 state pre-kindergarten teachers, gathering information about their attitudes, skills, and instructional methods…

  17. Data Sharing to Inform School-Based Asthma Services

    ERIC Educational Resources Information Center

    Portwood, Sharon G.; Nelson, Elissa B.

    2013-01-01

    Background: This article examines results and lessons learned from a collaborative project involving a large urban school district, its county health department, multiple community partners, and the local university to establish an effective system for data sharing to inform monitoring and evaluation of the Charlotte Mecklenburg Schools (CMS)…

  18. A Hierarchic System for Information Usage.

    ERIC Educational Resources Information Center

    Lu, John; Markham, David

    This paper demonstrates an approach which enables one to reduce in a systematic way the immense complexity of a large body of knowledge. This approach provides considerable insight into what is known and unknown in a given academic field by systematically and pragmatically ordering the information. As a case study, the authors selected…

  19. 75 FR 1757 - Office of Special Education and Rehabilitative Services (OSERS); Overview Information; National...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-13

    ... as artificial intelligence or information technology devices, software, and systems. For more... in an accessible format (e.g., braille, large print, audiotape, or computer diskette) by contacting... electronic application, you may wish to print a copy of it for your records. After you electronically submit...

  20. Early childhood education: Status trends, and issues related to electronic delivery

    NASA Technical Reports Server (NTRS)

    Rothenberg, D.

    1973-01-01

    The status of, and trends and issues within, early childhood education related to the possibilities of electronic delivery of educational services are considered as part of a broader investigation of the role of large-scale, satellite-based educational telecommunications systems. Data are analyzed, and trends and issues are discussed, to provide information useful to the system designer who wishes to identify and assess the opportunities for large-scale electronic delivery in early childhood education.

  1. The T.M.R. Data Dictionary: A Management Tool for Data Base Design

    PubMed Central

    Ostrowski, Maureen; Bernes, Marshall R.

    1984-01-01

    In January 1981, a dictionary-driven ambulatory care information system known as TMR (The Medical Record) was installed at a large private medical group practice in Los Angeles. TMR's data dictionary has enabled the medical group to adapt the software to meet changing user needs largely without programming support. For top management, the dictionary is also a tool for navigating through the system's complexity and assuring the integrity of management goals.

  2. Developing an automated speech-recognition telephone diabetes intervention.

    PubMed

    Goldman, Roberta E; Sanchez-Hernandez, Maya; Ross-Degnan, Dennis; Piette, John D; Trinacty, Connie Mah; Simon, Steven R

    2008-08-01

    Many patients do not receive guideline-recommended care for diabetes and other chronic conditions. Automated speech-recognition telephone outreach to supplement in-person physician-patient communication may enhance patient care for chronic illness. We conducted this study to inform the development of an automated telephone outreach intervention for improving diabetes care among members of a large, not-for-profit health plan. Design: In-depth telephone interviews with qualitative analysis. Participants: Individuals with diabetes (n=36) enrolled in a large regional health plan in the USA. Main outcome measure: Patients' opinions about automated speech-recognition telephone technology. Results: Patients who were recently diagnosed with diabetes and some with diabetes for a decade or more expressed basic informational needs. While most would prefer to speak with a live person rather than a computer-recorded voice, many felt that the automated system could successfully supplement the information they receive from their physicians and could serve as an integral part of their care. Patients suggested that such a system could provide specific dietary advice, information about diabetes and its self-care, a call-in menu of information topics, reminders about laboratory test results and appointments, tracking of personal laboratory results and feedback about their self-monitoring. While some patients expressed negative attitudes toward automated speech-recognition telephone systems generally, most felt that a variety of functions of such a system could be beneficial to their diabetes care. In-depth interviews resulted in substantive input from health plan members for the design of an automated telephone outreach system to supplement in-person physician-patient communication in this population.

  3. Organisation of biotechnological information into knowledge.

    PubMed

    Boh, B

    1996-09-01

    The success of biotechnological research, development and marketing depends to a large extent on the international transfer of information and on the ability to organise biotechnology information into knowledge. To increase the efficiency of information-based approaches, an information strategy has been developed and consists of the following stages: definition of the problem, its structure and sub-problems; acquisition of data by targeted processing of computer-supported bibliographic, numeric, textual and graphic databases; analysis of data and building of specialized in-house information systems; information processing for structuring data into systems, recognition of trends and patterns of knowledge, particularly by information synthesis using the concept of information density; design of research hypotheses; testing hypotheses in the laboratory and/or pilot plant; repeated evaluation and optimization of hypotheses by information methods and testing them by further laboratory work. The information approaches are illustrated by examples from the university-industry joint projects in biotechnology, biochemistry and agriculture.

  4. A cooperative strategy for parameter estimation in large scale systems biology models.

    PubMed

    Villaverde, Alejandro F; Egea, Jose A; Banga, Julio R

    2012-06-22

    Mathematical models play a key role in systems biology: they summarize the currently available knowledge in a way that allows experimentally verifiable predictions to be made. Model calibration consists of finding the parameters that give the best fit to a set of experimental data, which entails minimizing a cost function that measures the goodness of this fit. Most mathematical models in systems biology present three characteristics which make this problem very difficult to solve: they are highly non-linear, they have a large number of parameters to be estimated, and the information content of the available experimental data is frequently scarce. Hence, there is a need for global optimization methods capable of solving this problem efficiently. A new approach for parameter estimation of large-scale models, called Cooperative Enhanced Scatter Search (CeSS), is presented. Its key feature is cooperation between different programs ("threads") that run in parallel on different processors. Each thread implements a state-of-the-art metaheuristic, the enhanced Scatter Search algorithm (eSS). Cooperation, meaning information sharing between threads, modifies the systemic properties of the algorithm and speeds up performance. Two parameter estimation problems involving models related to the central carbon metabolism of E. coli, which include different regulatory levels (metabolic and transcriptional), are used as case studies. The performance and capabilities of the method are also evaluated using benchmark problems of large-scale global optimization, with excellent results. The cooperative CeSS strategy is a general-purpose technique that can be applied to any model calibration problem. Its capability has been demonstrated by calibrating two large-scale models of different characteristics, improving the performance of previously existing methods in both cases. The cooperative metaheuristic presented here can be easily extended to incorporate other global and local search solvers and specific structural information for particular classes of problems.
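    The cooperation mechanism can be illustrated with a minimal sketch (this is a generic random local search on a toy cost function, not the actual eSS metaheuristic; the thread count, sharing interval, and step sizes are illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def cost(x):
        # Toy cost: sphere function standing in for a model-fit objective.
        return float(np.sum(x ** 2))

    def cooperative_search(n_threads=4, dim=5, iters=200, share_every=20):
        # Each "thread" starts from its own random point.
        xs = [rng.normal(size=dim) * 10 for _ in range(n_threads)]
        for t in range(iters):
            for i in range(n_threads):
                # Simple local move: accept only improvements.
                candidate = xs[i] + rng.normal(scale=0.5, size=dim)
                if cost(candidate) < cost(xs[i]):
                    xs[i] = candidate
            if t % share_every == 0:
                # Cooperation step: share the best solution with all threads.
                best = min(xs, key=cost)
                xs = [best.copy() for _ in xs]
        return min(cost(x) for x in xs)

    best_cost = cooperative_search()
    print(best_cost)
    ```

    The periodic sharing step is the essential idea: each searcher restarts its exploration from the best point found by any of them, which typically reaches a low cost faster than the same searchers running in isolation.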

  5. A cooperative strategy for parameter estimation in large scale systems biology models

    PubMed Central

    2012-01-01

    Background: Mathematical models play a key role in systems biology: they summarize the currently available knowledge in a way that allows experimentally verifiable predictions to be made. Model calibration consists of finding the parameters that give the best fit to a set of experimental data, which entails minimizing a cost function that measures the goodness of this fit. Most mathematical models in systems biology present three characteristics which make this problem very difficult to solve: they are highly non-linear, they have a large number of parameters to be estimated, and the information content of the available experimental data is frequently scarce. Hence, there is a need for global optimization methods capable of solving this problem efficiently. Results: A new approach for parameter estimation of large-scale models, called Cooperative Enhanced Scatter Search (CeSS), is presented. Its key feature is cooperation between different programs (“threads”) that run in parallel on different processors. Each thread implements a state-of-the-art metaheuristic, the enhanced Scatter Search algorithm (eSS). Cooperation, meaning information sharing between threads, modifies the systemic properties of the algorithm and speeds up performance. Two parameter estimation problems involving models related to the central carbon metabolism of E. coli, which include different regulatory levels (metabolic and transcriptional), are used as case studies. The performance and capabilities of the method are also evaluated using benchmark problems of large-scale global optimization, with excellent results. Conclusions: The cooperative CeSS strategy is a general-purpose technique that can be applied to any model calibration problem. Its capability has been demonstrated by calibrating two large-scale models of different characteristics, improving the performance of previously existing methods in both cases. The cooperative metaheuristic presented here can be easily extended to incorporate other global and local search solvers and specific structural information for particular classes of problems. PMID:22727112

  6. Cloud-enabled large-scale land surface model simulations with the NASA Land Information System

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Vaughan, G.; Clark, M. P.; Peters-Lidard, C. D.; Nijssen, B.; Nearing, G. S.; Rheingrover, S.; Kumar, S.; Geiger, J. V.

    2017-12-01

    Developed by the Hydrological Sciences Laboratory at NASA Goddard Space Flight Center (GSFC), the Land Information System (LIS) is a high-performance software framework for terrestrial hydrology modeling and data assimilation. LIS provides the ability to integrate satellite and ground-based observational products and advanced modeling algorithms to extract land surface states and fluxes. Through a partnership with the National Center for Atmospheric Research (NCAR) and the University of Washington, the LIS model is currently being extended to include the Structure for Unifying Multiple Modeling Alternatives (SUMMA). With the addition of SUMMA, LIS will enable meaningful simulations containing a large multi-model ensemble and can provide advanced probabilistic continental-domain modeling capabilities at spatial scales relevant for water managers. The resulting LIS/SUMMA application framework is difficult for non-experts to install due to its large number of dependencies on specific versions of operating systems, libraries, and compilers. This has created a significant barrier to entry for domain scientists who are interested in using the software on their own systems or in the cloud. In addition, the requirement to support multiple runtime environments across the LIS community has placed a significant burden on the NASA team. To overcome these challenges, LIS/SUMMA has been deployed using Linux containers, which allow an entire software package along with all of its dependencies to be installed within a working runtime environment, and Kubernetes, which orchestrates the deployment of a cluster of containers. Within a cloud environment, users can now easily create a cluster of virtual machines and run large-scale LIS/SUMMA simulations. Installations that once took weeks or months can now be performed in minutes.
This presentation will discuss the steps required to create a cloud-enabled large-scale simulation, present examples of its use, and describe the potential deployment of this information technology with other NASA applications.
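    The containerized deployment pattern described above can be illustrated with a generic, hypothetical Kubernetes Job manifest (the name, image, entrypoint, and resource figures below are placeholders for illustration, not the actual LIS/SUMMA artifacts):

    ```yaml
    apiVersion: batch/v1
    kind: Job
    metadata:
      name: lis-summa-ensemble          # hypothetical name
    spec:
      parallelism: 8                    # one container per ensemble member
      template:
        spec:
          containers:
          - name: lis-summa
            image: example.org/lis-summa:latest   # placeholder image
            command: ["./run_lis.sh"]             # placeholder entrypoint
            resources:
              requests:
                cpu: "4"
                memory: 16Gi
          restartPolicy: OnFailure
    ```

    Packaging the model and all of its version-pinned dependencies into the container image is what removes the installation burden; Kubernetes then handles scheduling the ensemble members across the cluster.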

  7. An expert system for diagnosing environmentally induced spacecraft anomalies

    NASA Technical Reports Server (NTRS)

    Rolincik, Mark; Lauriente, Michael; Koons, Harry C.; Gorney, David

    1992-01-01

    A new rule-based, machine-independent analytical tool was designed for diagnosing spacecraft anomalies using an expert system. Expert systems provide an effective method for preserving knowledge, allow computers to sift through large amounts of data to pinpoint the significant parts and, most importantly, use heuristics in addition to algorithms, enabling approximate reasoning and inference and the ability to attack problems that are not rigidly defined. The knowledge base consists of over two hundred (200) rules and provides links to historical and environmental databases. The environmental causes considered are bulk charging, single event upsets (SEU), surface charging, and total radiation dose. The system's driver translates forward-chaining rules into a backward-chaining sequence, prompting the user for information pertinent to the causes considered. The use of heuristics frees the user from searching through large amounts of irrelevant information and allows the user to input partial information (varying degrees of confidence in an answer) or 'unknown' to any question. The modularity of the expert system allows for easy updates and modifications. It not only provides scientists with needed risk analysis and confidence not found in algorithmic programs, but is also an effective learning tool, and the window implementation makes it very easy to use. The system currently runs on a MicroVAX II at Goddard Space Flight Center (GSFC). The inference engine used is NASA's C Language Integrated Production System (CLIPS).
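    The backward-chaining-with-uncertainty behavior can be illustrated with a generic sketch (the rule, fact names, and the min-based confidence combination here are hypothetical illustrations, not taken from the actual CLIPS knowledge base):

    ```python
    # Generic backward-chaining sketch: rules map a conclusion to a list of
    # premises; facts carry a confidence in [0, 1], with None meaning "unknown".
    RULES = {
        "surface_charging": ["eclipse_exit", "anomaly_at_high_latitude"],
    }

    def confidence(goal, facts):
        """Return the confidence in `goal`, chaining backward through RULES.

        Unknown premises (None) are skipped rather than failing the query,
        mirroring the ability to answer 'unknown' to any question.
        """
        if goal in facts:
            return facts[goal]
        if goal not in RULES:
            return None  # no rule and no fact: unknown
        known = [confidence(p, facts) for p in RULES[goal]]
        known = [c for c in known if c is not None]
        if not known:
            return None
        return min(known)  # conservative AND over the known premises

    facts = {"eclipse_exit": 0.9, "anomaly_at_high_latitude": None}
    print(confidence("surface_charging", facts))  # 0.9: unknown premise skipped
    ```

    The point of the sketch is the handling of partial information: a query succeeds with reduced confidence instead of failing when some premises are unknown.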

  8. Medical sieve: a cognitive assistant for radiologists and cardiologists

    NASA Astrophysics Data System (ADS)

    Syeda-Mahmood, T.; Walach, E.; Beymer, D.; Gilboa-Solomon, F.; Moradi, M.; Kisilev, P.; Kakrania, D.; Compas, C.; Wang, H.; Negahdar, R.; Cao, Y.; Baldwin, T.; Guo, Y.; Gur, Y.; Rajan, D.; Zlotnick, A.; Rabinovici-Cohen, S.; Ben-Ari, R.; Guy, Amit; Prasanna, P.; Morey, J.; Boyko, O.; Hashoul, S.

    2016-03-01

    Radiologists and cardiologists today have to view large amounts of imaging data relatively quickly, leading to eye fatigue. Further, they have only limited access to clinical information, relying mostly on their visual interpretation of imaging studies for their diagnostic decisions. In this paper, we present Medical Sieve, an automated cognitive assistant for radiologists and cardiologists designed to help in their clinical decision-making. The Sieve is a clinical informatics system that collects clinical, textual and imaging data of patients from electronic health record systems. It then analyzes multimodal content to detect anomalies, if any, and summarizes the patient record, collecting all relevant information pertinent to a chief complaint. The results of anomaly detection are then fed into a reasoning engine which uses evidence from both patient-independent clinical knowledge and large-scale patient-driven similar-patient statistics to arrive at a potential differential diagnosis to help in clinical decision making. In compactly summarizing all relevant information to the clinician per chief complaint, the system still retains links to the raw data for detailed review, providing holistic summaries of patient conditions. Results of clinical studies in the domains of cardiology and breast radiology have already shown the promise of the system in differential diagnosis and imaging study summarization.

  9. MendeLIMS: a web-based laboratory information management system for clinical genome sequencing.

    PubMed

    Grimes, Susan M; Ji, Hanlee P

    2014-08-27

    Large clinical genomics studies using next-generation DNA sequencing require the ability to select and track samples from a large population of patients through many experimental steps. With the number of clinical genome sequencing studies increasing, it is critical to maintain adequate laboratory information management systems to manage the thousands of patient samples that are subject to this type of genetic analysis. To meet the needs of clinical population studies using genome sequencing, we developed a web-based laboratory information management system (LIMS) with a flexible configuration that is adaptable to the continuously evolving experimental protocols of next-generation DNA sequencing technologies. Our system, referred to as MendeLIMS, is easily implemented with open source tools and is highly configurable and extensible. MendeLIMS has been invaluable in the management of our clinical genome sequencing studies. We maintain a publicly available demonstration version of the application for evaluation purposes at http://mendelims.stanford.edu. MendeLIMS is programmed in Ruby on Rails (RoR) and accesses data stored in SQL-compliant relational databases. Software is freely available for non-commercial use at http://dna-discovery.stanford.edu/software/mendelims/.

  10. Development of a Scalable Testbed for Mobile Olfaction Verification.

    PubMed

    Zakaria, Syed Muhammad Mamduh Syed; Visvanathan, Retnam; Kamarudin, Kamarulzaman; Yeon, Ahmad Shakaff Ali; Md Shakaff, Ali Yeon; Zakaria, Ammar; Kamarudin, Latifah Munirah

    2015-12-09

    The lack of ground-truth gas dispersion and experiment verification information has impeded the development of mobile olfaction systems, especially for real-world conditions. In this paper, an integrated testbed for mobile gas sensing experiments is presented. The integrated 3 m × 6 m testbed was built to provide real-time ground truth information for mobile olfaction system development. The testbed consists of a 72-gas-sensor array, namely the Large Gas Sensor Array (LGSA), a camera-based localization system and a wireless communication backbone for robot communication and integration into the testbed system. Furthermore, the data collected from the testbed may be streamed into a simulation environment to expedite development. Calibration results using ethanol have shown that using a large number of gas sensors in the LGSA is feasible and can produce coherent signals when the sensors are exposed to the same concentrations. The results have shown that the testbed was able to capture the time-varying characteristics and the variability of a gas plume in a 2 h experiment, thus providing time-dependent ground truth concentration maps. The authors have demonstrated the ability of the mobile olfaction testbed to monitor, verify and thus provide insight into gas distribution mapping experiments.

  11. Amalgamation of management information system into anaesthesiology practice: A boon for the modern anaesthesiologists

    PubMed Central

    Bajwa, Sukhminder Jit Singh

    2014-01-01

    Traditional anaesthesia record keeping has been the backbone of anaesthesiology ever since its introduction in the 1890s by Dr. Harvey Cushing and Dr. Ernest A. Codman. Besides providing important information on patients’ vital physiologic parameters, paper records have been a reliable source for various clinical research activities. The introduction of electronic monitoring gadgets and electronic record keeping systems has revolutionised anaesthesiology practice to a large extent. Recently, the introduction of the anaesthesia information management system (AIMS), which incorporates all the features of monitoring gadgets, such as electronic storage of large, accurate datasets, quality assurance in anaesthesia, enhanced patient safety, legal protection, improved billing services and support for organisational change, is almost a revolution in modern-day anaesthesiology practice. The clinical research activities that have taken the discipline of anaesthesiology to greater heights have also been boosted by the adoption of AIMS, which enables multicentre studies and the sharing of clinical data. Barring a few concerns regarding its installation, cost and functional aspects, the future of AIMS seems bright and will definitely prove to be a boon for modern-day anaesthesiology practice. PMID:24963173

  12. Development of a Scalable Testbed for Mobile Olfaction Verification

    PubMed Central

    Syed Zakaria, Syed Muhammad Mamduh; Visvanathan, Retnam; Kamarudin, Kamarulzaman; Ali Yeon, Ahmad Shakaff; Md. Shakaff, Ali Yeon; Zakaria, Ammar; Kamarudin, Latifah Munirah

    2015-01-01

    The lack of ground truth gas dispersion and experiment verification information has impeded the development of mobile olfaction systems, especially for real-world conditions. In this paper, an integrated testbed for mobile gas sensing experiments is presented. The integrated 3 m × 6 m testbed was built to provide real-time ground truth information for mobile olfaction system development. The testbed consists of a 72-gas-sensor array, namely the Large Gas Sensor Array (LGSA), a camera-based localization system and a wireless communication backbone for robot communication and integration into the testbed system. Furthermore, the data collected from the testbed may be streamed into a simulation environment to expedite development. Calibration results using ethanol have shown that using a large number of gas sensors in the LGSA is feasible and that they produce coherent signals when exposed to the same concentrations. The results have shown that the testbed was able to capture the time-varying characteristics and the variability of the gas plume in a 2 h experiment, thus providing time-dependent ground truth concentration maps. The authors have demonstrated the ability of the mobile olfaction testbed to monitor, verify and thus provide insight into gas distribution mapping experiments. PMID:26690175

  13. Use of Semantic Technology to Create Curated Data Albums

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Kulkarni, Ajinkya; Li, Xiang; Sainju, Roshan; Bakare, Rohan; Basyal, Sabin

    2014-01-01

    One of the continuing challenges in any Earth science investigation is the discovery and access of useful science content from the increasingly large volumes of Earth science data and related information available online. Current Earth science data systems are designed with the assumption that researchers access data primarily by instrument or geophysical parameter. Those who know exactly the data sets they need can obtain the specific files using these systems. However, in cases where researchers are interested in studying an event of research interest, they must manually assemble a variety of relevant data sets by searching the different distributed data systems. Consequently, there is a need to design and build specialized search and discovery tools in Earth science that can filter through large volumes of distributed online data and information and only aggregate the relevant resources needed to support climatology and case studies. This paper presents a specialized search and discovery tool that automatically creates curated Data Albums. The tool was designed to enable key elements of the search process such as dynamic interaction and sense-making. The tool supports dynamic interaction via different modes of interactivity and visual presentation of information. The compilation of information and data into a Data Album is analogous to a shoebox within the sense-making framework. This tool automates most of the tedious information/data gathering tasks for researchers. Data curation by the tool is achieved via an ontology-based, relevancy ranking algorithm that filters out non-relevant information and data. The curation enables better search results as compared to the simple keyword searches provided by existing data systems in Earth science.

  14. Use of Semantic Technology to Create Curated Data Albums

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Kulkarni, Ajinkya; Li, Xiang; Sainju, Roshan; Bakare, Rohan; Basyal, Sabin; Fox, Peter (Editor); Norack, Tom (Editor)

    2014-01-01

    One of the continuing challenges in any Earth science investigation is the discovery and access of useful science content from the increasingly large volumes of Earth science data and related information available online. Current Earth science data systems are designed with the assumption that researchers access data primarily by instrument or geophysical parameter. Those who know exactly the data sets they need can obtain the specific files using these systems. However, in cases where researchers are interested in studying an event of research interest, they must manually assemble a variety of relevant data sets by searching the different distributed data systems. Consequently, there is a need to design and build specialized search and discovery tools in Earth science that can filter through large volumes of distributed online data and information and only aggregate the relevant resources needed to support climatology and case studies. This paper presents a specialized search and discovery tool that automatically creates curated Data Albums. The tool was designed to enable key elements of the search process such as dynamic interaction and sense-making. The tool supports dynamic interaction via different modes of interactivity and visual presentation of information. The compilation of information and data into a Data Album is analogous to a shoebox within the sense-making framework. This tool automates most of the tedious information/data gathering tasks for researchers. Data curation by the tool is achieved via an ontology-based, relevancy ranking algorithm that filters out non-relevant information and data. The curation enables better search results as compared to the simple keyword searches provided by existing data systems in Earth science.
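    The curation step described above can be pictured as a weighted relevance filter. The sketch below is illustrative only: it assumes a flat dictionary of ontology terms and weights (the paper's actual ontology structure and ranking algorithm are not detailed in the abstract), and all names and numbers are hypothetical.

    ```python
    # Hypothetical sketch of ontology-weighted relevancy ranking: terms
    # drawn from an event ontology carry weights, and resources scoring
    # below a threshold are filtered out of the curated Data Album.

    def relevancy_score(text, ontology_weights):
        """Sum the weight of every ontology term found in the text."""
        words = set(text.lower().split())
        return sum(w for term, w in ontology_weights.items() if term in words)

    def curate(resources, ontology_weights, threshold=1.0):
        """Keep resources whose score clears the threshold, best first."""
        scored = [(relevancy_score(r, ontology_weights), r) for r in resources]
        kept = [(s, r) for s, r in scored if s >= threshold]
        return [r for s, r in sorted(kept, reverse=True)]

    # Toy event ontology for a hurricane case study:
    hurricane_ontology = {"hurricane": 2.0, "precipitation": 1.5, "wind": 1.0}
    resources = [
        "hurricane wind and precipitation retrievals",
        "lunar crater catalogue",
    ]
    print(curate(resources, hurricane_ontology, threshold=1.5))
    ```

    A real system would match against ontology concepts and their synonyms rather than raw tokens, but the rank-then-threshold shape is the same.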

  15. Visual attention mitigates information loss in small- and large-scale neural codes.

    PubMed

    Sprague, Thomas C; Saproo, Sameer; Serences, John T

    2015-04-01

    The visual system transforms complex inputs into robust and parsimonious neural codes that efficiently guide behavior. Because neural communication is stochastic, the amount of encoded visual information necessarily decreases with each synapse. This constraint requires that sensory signals are processed in a manner that protects information about relevant stimuli from degradation. Such selective processing--or selective attention--is implemented via several mechanisms, including neural gain and changes in tuning properties. However, examining each of these effects in isolation obscures their joint impact on the fidelity of stimulus feature representations by large-scale population codes. Instead, large-scale activity patterns can be used to reconstruct representations of relevant and irrelevant stimuli, thereby providing a holistic understanding about how neuron-level modulations collectively impact stimulus encoding. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Understanding managerial behaviour during initial steps of a clinical information system adoption

    PubMed Central

    2011-01-01

    Background While the study of the information technology (IT) implementation process and its outcomes has received considerable attention, the pre-adoption and pre-implementation stages of configurable IT uptake appear largely under-investigated. This paper explores managerial behaviour during the periods prior to the effective implementation of a clinical information system (CIS) by two Canadian university multi-hospital centers. Methods Adopting a structurationist theoretical stance and a case study research design, the processes by which CIS managers' patterns of discourse contribute to the configuration of the new technology in their respective organizational contexts were longitudinally examined over 33 months. Results Although managers seemed to be aware of the risks and organizational impact of the adoption of a new clinical information system, their decisions and actions over the periods examined appeared rather to be driven by financial constraints and power struggles between different groups involved in the process. Furthermore, they largely emphasized technological aspects of the implementation, with organizational dimensions being put aside. In view of these results, the notion of 'rhetorical ambivalence' is proposed. Results are further discussed in relation to the significance of initial decisions and actions for the subsequent implementation phases of the technology being configured. Conclusions Theoretically and empirically grounded, the paper contributes to the underdeveloped body of literature on information system pre-implementation processes by revealing the crucial role played by managers during the initial phases of a CIS adoption. PMID:21682885

  17. Adapting forest health assessments to changing perspectives on threats--a case example from Sweden.

    PubMed

    Wulff, Sören; Lindelöw, Åke; Lundin, Lars; Hansson, Per; Axelsson, Anna-Lena; Barklund, Pia; Wijk, Sture; Ståhl, Göran

    2012-04-01

    A revised Swedish forest health assessment system is presented. The assessment system is composed of several interacting components which target information needs for strategic and operational decision making and accommodate a continuously expanding knowledge base. The main motivation for separating information for strategic and operational decision making is that major damage outbreaks are often scattered throughout the landscape. Generally, large-scale inventories (such as national forest inventories) cannot provide adequate information for mitigation measures. In addition to broad monitoring programs that provide time-series information on known damaging agents and their effects, there is also a need for local and regional inventories adapted to specific damage events. While information for decision making is the major focus of the health assessment system, the system also contributes to expanding the knowledge base of forest conditions. For example, the integrated monitoring programs provide a better understanding of ecological processes linked to forest health. The new health assessment system should be able to respond to the need for quick and reliable information and thus will be an important part of the future monitoring of Swedish forests.

  18. Large-scale flow experiments for managing river systems

    USGS Publications Warehouse

    Konrad, Christopher P.; Olden, Julian D.; Lytle, David A.; Melis, Theodore S.; Schmidt, John C.; Bray, Erin N.; Freeman, Mary C.; Gido, Keith B.; Hemphill, Nina P.; Kennard, Mark J.; McMullen, Laura E.; Mims, Meryl C.; Pyron, Mark; Robinson, Christopher T.; Williams, John G.

    2011-01-01

    Experimental manipulations of streamflow have been used globally in recent decades to mitigate the impacts of dam operations on river systems. Rivers are challenging subjects for experimentation, because they are open systems that cannot be isolated from their social context. We identify principles to address the challenges of conducting effective large-scale flow experiments. Flow experiments have both scientific and social value when they help to resolve specific questions about the ecological action of flow with a clear nexus to water policies and decisions. Water managers must integrate new information into operating policies for large-scale experiments to be effective. Modeling and monitoring can be integrated with experiments to analyze long-term ecological responses. Experimental design should include spatially extensive observations and well-defined, repeated treatments. Large-scale flow manipulations are only a part of dam operations that affect river systems. Scientists can ensure that experimental manipulations continue to be a valuable approach for the scientifically based management of river systems.

  19. Changes, disruption and innovation: An investigation of the introduction of new health information technology in a microbiology laboratory.

    PubMed

    Toouli, George; Georgiou, Andrew; Westbrook, Johanna

    2012-01-01

    It is expected that health information technology (HIT) will deliver a safer, more efficient and effective health care system. The aim of this study was to undertake a qualitative and video-ethnographic examination of the impact of information technologies on work processes in the reception area of a Microbiology Department, to ascertain what changed, how it changed and the impact of the change. The setting for this study was the microbiology laboratory of a large tertiary hospital in Sydney. The study consisted of qualitative (interview and focus group) data and observation sessions for the period August 2005 to October 2006, along with video footage shot in three sessions covering the original system and the two stages of the Cerner implementation. Data analysis was assisted by NVivo software, and process maps were produced from the video footage. Two laboratory information systems were observed in the video footage, with computerized provider order entry introduced four months later. Process maps highlighted the large number of pre-data-entry steps in the original system, whereas the newer system incorporated many of these steps into the data entry stage. However, any time saved with the new system was offset by the requirement to complete some data entry of patient information not previously required. Other changes noted included the change of responsibilities for the reception staff and the physical changes required to accommodate the increased activity around the data entry area. Implementing a new HIT is always an exciting time for any organisation, but ensuring that the implementation goes smoothly and with minimal trouble requires the administrator and their team to plan well in advance for staff training, physical layout and possible staff resource reallocation.

  20. Comparing Objective Measures and Perceptions of Cognitive Learning in an ERP Simulation Game: A Research Note

    ERIC Educational Resources Information Center

    Cronan, Timothy Paul; Leger, Pierre-Majorique; Robert, Jacques; Babin, Gilbert; Charland, Patrick

    2012-01-01

    Enterprise Resource Planning (ERP) systems have had a significant impact on business organizations. These large systems offer opportunities for companies regarding the integration and functionality of information technology systems; in effect, companies can realize a competitive advantage that is necessary in today's global companies. However,…

  1. The CD-ROM Services of SilverPlatter Information, Inc.

    ERIC Educational Resources Information Center

    Allen, Robert J.

    1985-01-01

    The SilverPlatter system is a complete, stand-alone system, consisting of an IBM (or compatible) personal computer, compact disc with read-only memory (CD-ROM) drive, software, and one or more databases. Large databases (e.g., ERIC, PsycLIT) will soon be available on the system for "local" installation in schools, libraries, and…

  2. Risk Management Technique for design and operation of facilities and equipment

    NASA Technical Reports Server (NTRS)

    Fedor, O. H.; Parsons, W. N.; Coutinho, J. De S.

    1975-01-01

    The Risk Management System collects information from engineering, operating, and management personnel to identify potentially hazardous conditions. This information is used in risk analysis, problem resolution, and contingency planning. The resulting hazard accountability system enables management to monitor all identified hazards. Data from this system are examined in project reviews so that management can decide to eliminate or accept these risks. This technique is particularly effective in improving the management of risks in large, complex, high-energy facilities. These improvements are needed for increased cooperation among industry, regulatory agencies, and the public.

  3. Outlier Detection for Sensor Systems (ODSS): A MATLAB Macro for Evaluating Microphone Sensor Data Quality.

    PubMed

    Vasta, Robert; Crandell, Ian; Millican, Anthony; House, Leanna; Smith, Eric

    2017-10-13

    Microphone sensor systems provide information that may be used for a variety of applications. Such systems generate large amounts of data. One concern is with microphone failure and unusual values that may be generated as part of the information collection process. This paper describes methods and a MATLAB graphical interface that provides rapid evaluation of microphone performance and identifies irregularities. The approach and interface are described. An application to a microphone array used in a wind tunnel is used to illustrate the methodology.
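    The abstract does not specify which detection rule the MATLAB macro applies; as a hedged illustration of the general task, the sketch below flags failed channels with a robust median/MAD rule, one common choice for spotting sensor readings that deviate strongly from the array consensus. All numbers are made up.

    ```python
    # Robust outlier flagging for a sensor array using the modified
    # z-score (median and median absolute deviation instead of mean/std,
    # so a single failed channel cannot mask itself).
    import statistics

    def mad_outliers(values, cutoff=3.5):
        """Return indices whose modified z-score exceeds the cutoff."""
        med = statistics.median(values)
        mad = statistics.median(abs(v - med) for v in values)
        if mad == 0:
            return []
        # 0.6745 scales the MAD to be consistent with the standard
        # deviation under normality (Iglewicz & Hoaglin's statistic).
        return [i for i, v in enumerate(values)
                if abs(0.6745 * (v - med) / mad) > cutoff]

    # Channel 3 of this hypothetical microphone array has failed high:
    readings = [0.98, 1.02, 1.01, 9.50, 0.99, 1.00]
    print(mad_outliers(readings))  # -> [3]
    ```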

  4. Teleoperation with large time delay using a prevision system

    NASA Astrophysics Data System (ADS)

    Bergamasco, Massimo; De Paolis, Lucio; Ciancio, Stefano; Pinna, Sebastiano

    1997-12-01

    In teleoperation technology, various techniques have been proposed to alleviate the effects of time-delayed communication and to avoid instability of the system. This paper describes a different approach to robotic teleoperation with large time delay: a teleoperation system based on the teleprogramming paradigm has been developed with the intent to improve the slave's autonomy and to decrease the amount of information exchanged between the master and slave systems. The goal concept, specific to AI, has been used. In order to minimize the total task completion time, a prevision system called Merlino has been introduced, able to anticipate the slave's choices by taking into account both the operator's actions and the information about the remote environment. In case of environment changes, the prevision system makes it possible to understand whether the slave can achieve the goal; otherwise, Merlino signals a 'fail situation.' Some experiments have been carried out by means of an advanced human-machine interface with force feedback, designed at the PERCRO Laboratory of Scuola Superiore S. Anna, which gives a better sensation of presence in the remote environment.

  5. Design and Development of a Run-Time Monitor for Multi-Core Architectures in Cloud Computing

    PubMed Central

    Kang, Mikyung; Kang, Dong-In; Crago, Stephen P.; Park, Gyung-Leen; Lee, Junghoon

    2011-01-01

    Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications as services over the Internet as well as infrastructure. A cloud is a type of parallel and distributed system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. The large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design and develop a Run-Time Monitor (RTM), which is system software to monitor application behavior at run-time, analyze the collected information, and optimize cloud computing resources for multi-core architectures. RTM monitors application software through library instrumentation as well as underlying hardware through a performance counter, optimizing its computing configuration based on the analyzed data. PMID:22163811

  6. Design and development of a run-time monitor for multi-core architectures in cloud computing.

    PubMed

    Kang, Mikyung; Kang, Dong-In; Crago, Stephen P; Park, Gyung-Leen; Lee, Junghoon

    2011-01-01

    Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications as services over the Internet as well as infrastructure. A cloud is a type of parallel and distributed system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. The large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design and develop a Run-Time Monitor (RTM), which is system software to monitor application behavior at run-time, analyze the collected information, and optimize cloud computing resources for multi-core architectures. RTM monitors application software through library instrumentation as well as underlying hardware through a performance counter, optimizing its computing configuration based on the analyzed data.
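    The "library instrumentation" idea can be pictured as wrapping the calls of interest so the monitor records their behaviour at run-time. The sketch below is purely illustrative: the class and method names are ours, not the RTM's, and the real system also reads hardware performance counters, which this toy omits.

    ```python
    # Minimal run-time monitoring via function wrapping: the monitor
    # records call counts and cumulative wall time, the kind of data an
    # adaptive system could analyze to adjust its configuration.
    import functools
    import time

    class RunTimeMonitor:
        def __init__(self):
            self.calls = {}  # function name -> (count, total seconds)

        def instrument(self, fn):
            """Decorator that records call count and cumulative wall time."""
            @functools.wraps(fn)
            def wrapper(*args, **kwargs):
                start = time.perf_counter()
                try:
                    return fn(*args, **kwargs)
                finally:
                    count, total = self.calls.get(fn.__name__, (0, 0.0))
                    self.calls[fn.__name__] = (
                        count + 1, total + time.perf_counter() - start)
            return wrapper

    rtm = RunTimeMonitor()

    @rtm.instrument
    def compute(n):
        return sum(i * i for i in range(n))

    for _ in range(3):
        compute(1000)
    print(rtm.calls["compute"][0])  # -> 3 monitored calls
    ```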

  7. Cross-Organizational Knowledge Sharing: Information Reuse in Small Organizations

    ERIC Educational Resources Information Center

    White, Kevin Forsyth

    2010-01-01

    Despite the potential value of leveraging organizational memory and expertise, small organizations have been unable to capitalize on its promised value. Existing solutions have largely side-stepped the unique needs of these organizations, which are relegated to systems designed to take advantage of large pools of experts or to use Internet sources…

  8. Fire management over large landscapes: a hierarchical approach

    Treesearch

    Kenneth G. Boykin

    2008-01-01

    Management planning for fires becomes increasingly difficult as scale increases. Stratification provides land managers with multiple scales in which to prepare plans. Using statistical techniques, Geographic Information Systems (GIS), and meetings with land managers, we divided a large landscape of over 2 million acres (White Sands Missile Range) into parcels useful in...

  9. Reflections on CD-ROM: Bridging the Gap between Technology and Purpose.

    ERIC Educational Resources Information Center

    Saviers, Shannon Smith

    1987-01-01

    Provides a technological overview of CD-ROM (Compact Disc-Read Only Memory), an optically-based medium for data storage offering large storage capacity, computer-based delivery system, read-only medium, and economic mass production. CD-ROM database attributes appropriate for information delivery are also reviewed, including large database size,…

  10. VALUING ACID MINE DRAINAGE REMEDIATION IN WEST VIRGINIA: A HEDONIC MODELING APPROACH INCORPORATING GEOGRAPHIC INFORMATION SYSTEMS

    EPA Science Inventory

    States with active and abandoned mines face large private and public costs to remediate damage to streams and rivers from acid mine drainage (AMD). Appalachian states have an especially large number of contaminated streams and rivers, and the USGS places AMD as the primary source...

  11. Large-scale neuromorphic computing systems

    NASA Astrophysics Data System (ADS)

    Furber, Steve

    2016-10-01

    Neuromorphic computing covers a diverse range of approaches to information processing, all of which demonstrate some degree of neurobiological inspiration that differentiates them from mainstream conventional computing systems. The philosophy behind neuromorphic computing has its origins in the seminal work carried out by Carver Mead at Caltech in the late 1980s. This early work influenced others to carry developments forward, and advances in VLSI technology supported steady growth in the scale and capability of neuromorphic devices. Recently, a number of large-scale neuromorphic projects have emerged, taking the approach to unprecedented scales and capabilities. These large-scale projects are associated with major new funding initiatives for brain-related research, creating a sense that the time and circumstances are right for progress in our understanding of information processing in the brain. In this review we present a brief history of neuromorphic engineering and then focus on some of the principal current large-scale projects: their main features, how their approaches are complementary and distinct, and their advantages and drawbacks, highlighting the sorts of capabilities that each can deliver to neural modellers.

  12. Efficient feature extraction from wide-area motion imagery by MapReduce in Hadoop

    NASA Astrophysics Data System (ADS)

    Cheng, Erkang; Ma, Liya; Blaisse, Adam; Blasch, Erik; Sheaff, Carolyn; Chen, Genshe; Wu, Jie; Ling, Haibin

    2014-06-01

    Wide-Area Motion Imagery (WAMI) feature extraction is important for applications such as target tracking, traffic management and accident discovery. With the increasing amount of WAMI collections and feature extraction from the data, a scalable framework is needed to handle the large amount of information. Cloud computing is one of the approaches recently applied to large-scale and big-data problems. In this paper, MapReduce in Hadoop is investigated for large scale feature extraction tasks for WAMI. Specifically, a large dataset of WAMI images is divided into several splits. Each split has a small subset of WAMI images. The feature extractions of WAMI images in each split are distributed to slave nodes in the Hadoop system. Feature extraction of each image is performed individually in the assigned slave node. Finally, the feature extraction results are sent to the Hadoop File System (HDFS) to aggregate the feature information over the collected imagery. Experiments of feature extraction with and without MapReduce are conducted to illustrate the effectiveness of our proposed Cloud-Enabled WAMI Exploitation (CAWE) approach.
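    The split/map/reduce workflow described above can be sketched in a single process. This is a schematic rendition only: Hadoop would distribute the map step across slave nodes and aggregate via HDFS, and the feature here (mean intensity over a toy "frame") stands in for whatever extractor the real pipeline runs.

    ```python
    # Schematic map/reduce feature extraction: divide the imagery into
    # splits, extract one feature per image in each split (map), then
    # merge the per-split results into one collection (reduce).

    def make_splits(images, n_splits):
        """Divide the dataset into n_splits roughly equal subsets."""
        return [images[i::n_splits] for i in range(n_splits)]

    def map_extract(split):
        """Map step: extract one feature (mean intensity) per image."""
        return [sum(img) / len(img) for img in split]

    def reduce_aggregate(partials):
        """Reduce step: merge per-split feature lists into one collection."""
        return sorted(f for part in partials for f in part)

    images = [[10, 20], [30, 30], [0, 4], [8, 8]]  # toy "frames"
    partials = [map_extract(s) for s in make_splits(images, 2)]
    print(reduce_aggregate(partials))  # -> [2.0, 8.0, 15.0, 30.0]
    ```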

  13. Integrating language models into classifiers for BCI communication: a review

    NASA Astrophysics Data System (ADS)

    Speier, W.; Arnold, C.; Pouratian, N.

    2016-06-01

    Objective. The present review systematically examines the integration of language models to improve classifier performance in brain-computer interface (BCI) communication systems. Approach. The domain of natural language has been studied extensively in linguistics and has been used in the natural language processing field in applications including information extraction, machine translation, and speech recognition. While these methods have been used for years in traditional augmentative and assistive communication devices, information about the output domain has largely been ignored in BCI communication systems. Over the last few years, BCI communication systems have started to leverage this information through the inclusion of language models. Main results. Although this movement began only recently, studies have already shown the potential of language integration in BCI communication and it has become a growing field in BCI research. BCI communication systems using language models in their classifiers have progressed down several parallel paths, including: word completion; signal classification; integration of process models; dynamic stopping; unsupervised learning; error correction; and evaluation. Significance. Each of these methods has shown significant progress, but they have largely been addressed separately. Combining these methods could exploit the full potential of language models, yielding further performance improvements. This integration should be a priority as the field works to create a BCI system that meets the needs of the amyotrophic lateral sclerosis population.

  14. Integrating language models into classifiers for BCI communication: a review.

    PubMed

    Speier, W; Arnold, C; Pouratian, N

    2016-06-01

    The present review systematically examines the integration of language models to improve classifier performance in brain-computer interface (BCI) communication systems. The domain of natural language has been studied extensively in linguistics and has been used in the natural language processing field in applications including information extraction, machine translation, and speech recognition. While these methods have been used for years in traditional augmentative and assistive communication devices, information about the output domain has largely been ignored in BCI communication systems. Over the last few years, BCI communication systems have started to leverage this information through the inclusion of language models. Although this movement began only recently, studies have already shown the potential of language integration in BCI communication and it has become a growing field in BCI research. BCI communication systems using language models in their classifiers have progressed down several parallel paths, including: word completion; signal classification; integration of process models; dynamic stopping; unsupervised learning; error correction; and evaluation. Each of these methods has shown significant progress, but they have largely been addressed separately. Combining these methods could exploit the full potential of language models, yielding further performance improvements. This integration should be a priority as the field works to create a BCI system that meets the needs of the amyotrophic lateral sclerosis population.
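    The core idea of the signal-classification path can be sketched as Bayesian fusion: the neural classifier's likelihood over candidate letters is multiplied by a language-model prior, so linguistically probable letters need weaker neural evidence. The likelihoods and the after-"t" prior below are made-up numbers, not from any cited system.

    ```python
    # Fusing a noisy signal classifier with a language-model prior:
    # posterior(letter) ∝ likelihood(signal | letter) * prior(letter | context).

    def fuse(signal_likelihood, lm_prior):
        """Posterior over letters: likelihood x prior, renormalized."""
        post = {c: signal_likelihood[c] * lm_prior.get(c, 1e-6)
                for c in signal_likelihood}
        z = sum(post.values())
        return {c: p / z for c, p in post.items()}

    # Noisy evidence slightly favours 'q', but after "t" the language
    # model strongly favours 'h' (as in "th"), flipping the decision:
    signal = {"q": 0.4, "h": 0.35, "e": 0.25}
    prior_after_t = {"h": 0.5, "e": 0.3, "q": 0.001}
    posterior = fuse(signal, prior_after_t)
    print(max(posterior, key=posterior.get))  # -> h
    ```

    Dynamic stopping fits the same picture: collect evidence until the posterior mass on one letter clears a confidence threshold.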

  15. Electric chiller handbook. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-02-01

    Electric chillers have dominated the market for large commercial cooling systems due to their history of reliable, economical operation. The phaseout of CFCs and deregulation of the utility industry are two factors that significantly impact the chiller market. The CFC phaseout is resulting in the upgrading or replacement of thousands of electric chillers nationwide. In a deregulated environment, utilities are finding increasing need to provide services that can win and retain new customers. Utility representatives need current information on applying and selecting cost-effective chiller systems. The objective of this report was to develop a comprehensive handbook that helps utility technical and marketing staff, their customers, and design professionals evaluate and select the best options for chilled-water systems in commercial buildings. Investigators used a variety of industry data sources to develop market-share information for electric and gas chiller systems and to determine applications according to building age, type, and region. Discussions with chiller manufacturers provided information on product availability, performance, and ownership cost. Using EPRI's COMTECH software, investigators performed comprehensive cost analyses for placement of large and small chillers in three representative cities. Case studies of actual installations support these analyses. Electric Chiller Handbook provides a single source of current information on all major issues associated with chiller selection and application. Key issues include chiller availability and markets, rated performance, future viability of various refrigerant options, the cost-effectiveness of alternative chillers, and chilled-water system optimization. The Handbook also describes available hardware, outlines the features and costs of gas-fired competitive systems, and provides methods and comparisons of life-cycle costing of various chiller system options. Analyses of chiller features and economics show that electric chillers are preferable to gas chillers in the large majority of applications, consistent with current market trends. Furthermore, today's chillers offer a wide range of efficiencies and refrigerant options to serve cooling system needs for the 20-year lifetime of the chiller. Finally, new higher-efficiency models of electric chillers offer very attractive paybacks.
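    The payback claim at the end rests on the kind of simple arithmetic the handbook's life-cycle costing supports. The back-of-envelope sketch below is illustrative only: every number is hypothetical, not taken from the report.

    ```python
    # Simple payback for a chiller efficiency premium: the extra first
    # cost divided by the annual energy savings it produces.

    def simple_payback(extra_cost, kw_per_ton_saved, tons, hours, rate):
        """Years to recover an efficiency premium from energy savings."""
        annual_savings = kw_per_ton_saved * tons * hours * rate
        return extra_cost / annual_savings

    # Hypothetical case: 500-ton chiller, 0.10 kW/ton efficiency gain,
    # 2000 equivalent full-load hours/year, $0.08/kWh, $30,000 premium.
    years = simple_payback(30_000, 0.10, 500, 2000, 0.08)
    print(round(years, 2))  # -> 3.75 years
    ```

    A full life-cycle comparison would also discount future savings and include maintenance and refrigerant-related costs, but the simple payback is the first screen most evaluations apply.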

  16. Energy carries information

    NASA Astrophysics Data System (ADS)

    Ilgin, Irfan; Yang, I.-Sheng

    2014-08-01

    We show that for every qubit of quantum information, there is a well-defined notion of "the amount of energy that carries it," because it is a conserved quantity. This generalizes to larger systems and any conserved quantities: the eigenvalue spectrum of conserved charges has to be preserved while transferring quantum information. It is possible to "apparently" violate these conservations by losing a small fraction of information, but that must invoke a specific process which requires a large scale coherence. We discuss its implication regarding the black hole information paradox.
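    The conservation statement above can be written compactly. In the sketch below the notation is ours, not the paper's: Q is a conserved charge with spectral projectors P_q, ρ is the state carrying the information, and U is any transfer unitary that respects the symmetry.

    ```latex
    [U, Q] = 0 \;\Rightarrow\; [U, P_q] = 0
    \quad\Longrightarrow\quad
    \operatorname{Tr}\!\left(P_q\, U \rho\, U^\dagger\right)
      = \operatorname{Tr}\!\left(P_q\, \rho\right) \quad \text{for all } q,
    ```

    so the distribution of charge eigenvalues is unchanged by the transfer, which is the sense in which the eigenvalue spectrum of conserved charges must be preserved while quantum information is moved.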

  17. Identification of informative subgraphs in brain networks

    NASA Astrophysics Data System (ADS)

    Marinazzo, D.; Wu, G.; Pellicoro, M.; Stramaglia, S.

    2013-01-01

    Measuring directed interactions in the brain in terms of information flow is a promising approach, mathematically tractable and able to encompass several methods. Here we present a formal expansion of the transfer entropy that highlights irreducible sets of variables providing information about the future state of each assigned target. Multiplets with a large contribution to the expansion correspond to informational circuits present in the system, whose informational character (synergetic or redundant) can be inferred from the sign of the contribution.
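    The pairwise transfer entropy that the expansion builds on can be sketched with a plug-in (histogram) estimator. The binary series, history length of one, and estimator choice below are illustrative assumptions, not the authors' method.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of T(X -> Y) for two discrete time series,
    with history length 1: I(Y_{t+1}; X_t | Y_t)."""
    trip = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
    pair_yx = Counter(zip(y[:-1], x[:-1]))
    pair_yy = Counter(zip(y[1:], y[:-1]))
    single_y = Counter(y[:-1])
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in trip.items():
        p_trip = c / n
        p_cond_full = c / pair_yx[(y0, x0)]          # p(y1 | y0, x0)
        p_cond_self = pair_yy[(y1, y0)] / single_y[y0]  # p(y1 | y0)
        te += p_trip * np.log2(p_cond_full / p_cond_self)
    return te

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 5000)
y = np.roll(x, 1)                # y copies x with one step of lag
print(transfer_entropy(x, y))    # large: strong flow X -> Y (about 1 bit)
print(transfer_entropy(y, x))    # near zero: no flow Y -> X
```

The lagged-copy example shows the directionality: knowing x_t fully determines y_{t+1}, while y_t adds nothing about the i.i.d. x_{t+1}.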

  18. Diffeomorphometry and geodesic positioning systems for human anatomy.

    PubMed

    Miller, Michael I; Younes, Laurent; Trouvé, Alain

    2014-03-01

    The Computational Anatomy project has largely been a study of large deformations within a Riemannian framework as an efficient point of view for generating metrics between anatomical configurations. This approach turns D'Arcy Thompson's comparative morphology of human biological shape and form into a metrizable space. Since the metric is constructed from the geodesic length of the flows of diffeomorphisms connecting the forms, we call it diffeomorphometry. Just as importantly, since the flows describe algebraic group actions on anatomical submanifolds and associated functional measurements, they become the basis for positioning information, which we term geodesic positioning. The geodesic connections also provide Riemannian coordinates for locating forms in the anatomical orbit, which we call geodesic coordinates. These three components taken together - the metric, geodesic positioning of information, and geodesic coordinates - we term the geodesic positioning system. We illustrate via several examples in human and biological coordinate systems and machine learning of the statistical representation of shape and form.

  19. Automated biosurveillance data from England and Wales, 1991-2011.

    PubMed

    Enki, Doyo G; Noufaily, Angela; Garthwaite, Paul H; Andrews, Nick J; Charlett, André; Lane, Chris; Farrington, C Paddy

    2013-01-01

    Outbreak detection systems for use with very large multiple surveillance databases must be suited both to the data available and to the requirements of full automation. To inform the development of more effective outbreak detection algorithms, we analyzed 20 years of data (1991-2011) from a large laboratory surveillance database used for outbreak detection in England and Wales. The data relate to 3,303 distinct types of infectious pathogens, with a frequency range spanning 6 orders of magnitude. Several hundred organism types were reported each week. We describe the diversity of seasonal patterns, trends, artifacts, and extra-Poisson variability to which an effective multiple laboratory-based outbreak detection system must adjust. We provide empirical information to guide the selection of simple statistical models for automated surveillance of multiple organisms, in the light of the key requirements of such outbreak detection systems, namely, robustness, flexibility, and sensitivity.
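    A heavily simplified exceedance test conveys the flavor of such automated surveillance: fit a baseline mean with an over-dispersion (extra-Poisson) correction and flag counts above a threshold. Real systems of the kind the authors study also adjust for trend, seasonality, and past outbreaks; this sketch, with made-up counts, does not.

```python
import math

def exceedance_flag(baseline, current, z=3.0):
    """Flag `current` weekly count as aberrant if it exceeds a
    quasi-Poisson threshold fitted to historical baseline counts.
    Simplified sketch: no trend or seasonality adjustment."""
    n = len(baseline)
    mu = sum(baseline) / n
    # Over-dispersion factor: variance-to-mean ratio, floored at 1 (pure Poisson)
    var = sum((c - mu) ** 2 for c in baseline) / (n - 1)
    phi = max(1.0, var / mu)
    threshold = mu + z * math.sqrt(phi * mu)
    return current > threshold, threshold

baseline = [4, 6, 5, 7, 3, 5, 6, 4, 5, 6]   # hypothetical weekly counts
print(exceedance_flag(baseline, 14))  # flagged
print(exceedance_flag(baseline, 7))   # not flagged
```

Flooring the dispersion at 1 keeps the test from becoming over-sensitive for organisms whose counts happen to be under-dispersed in the baseline window.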

  20. The role of digital cartographic data in the geosciences

    USGS Publications Warehouse

    Guptill, S.C.

    1983-01-01

    The increasing demand of the Nation's natural resource developers for the manipulation, analysis, and display of large quantities of earth-science data has necessitated the use of computers and the building of geoscience information systems. These systems require, in digital form, the spatial data on map products. The basic cartographic data shown on quadrangle maps provide a foundation for the addition of geological and geophysical data. If geoscience information systems are to realize their full potential, large amounts of digital cartographic base data must be available. A major goal of the U.S. Geological Survey is to create, maintain, manage, and distribute a national cartographic and geographic digital database. This unified database will contain numerous categories (hydrography, hypsography, land use, etc.) that, through the use of standardized data-element definitions and formats, can be used easily and flexibly to prepare cartographic products and perform geoscience analysis. © 1983.

  1. Automated Biosurveillance Data from England and Wales, 1991–2011

    PubMed Central

    Enki, Doyo G.; Noufaily, Angela; Garthwaite, Paul H.; Andrews, Nick J.; Charlett, André; Lane, Chris

    2013-01-01

    Outbreak detection systems for use with very large multiple surveillance databases must be suited both to the data available and to the requirements of full automation. To inform the development of more effective outbreak detection algorithms, we analyzed 20 years of data (1991–2011) from a large laboratory surveillance database used for outbreak detection in England and Wales. The data relate to 3,303 distinct types of infectious pathogens, with a frequency range spanning 6 orders of magnitude. Several hundred organism types were reported each week. We describe the diversity of seasonal patterns, trends, artifacts, and extra-Poisson variability to which an effective multiple laboratory-based outbreak detection system must adjust. We provide empirical information to guide the selection of simple statistical models for automated surveillance of multiple organisms, in the light of the key requirements of such outbreak detection systems, namely, robustness, flexibility, and sensitivity. PMID:23260848

  2. Next-Generation Pathology.

    PubMed

    Caie, Peter D; Harrison, David J

    2016-01-01

    The field of pathology is rapidly transforming from a semiquantitative and empirical science toward a big data discipline. Large data sets from across multiple omics fields may now be extracted from a patient's tissue sample. Tissue is, however, complex, heterogeneous, and prone to artifact. A reductionist view of tissue and disease progression, which does not take this complexity into account, may lead to single biomarkers failing in clinical trials. The integration of standardized multi-omics big data and the retention of valuable information on spatial heterogeneity are imperative to model complex disease mechanisms. Mathematical modeling through systems pathology approaches is the ideal medium to distill the significant information from these large, multi-parametric, and hierarchical data sets. Systems pathology may also predict the dynamical response of disease progression or response to therapy regimens from a static tissue sample. Next-generation pathology will incorporate big data with systems medicine in order to personalize clinical practice for both prognostic and predictive patient care.

  3. 3D exploitation of large urban photo archives

    NASA Astrophysics Data System (ADS)

    Cho, Peter; Snavely, Noah; Anderson, Ross

    2010-04-01

    Recent work in computer vision has demonstrated the potential to automatically recover camera and scene geometry from large collections of uncooperatively-collected photos. At the same time, aerial ladar and Geographic Information System (GIS) data are becoming more readily accessible. In this paper, we present a system for fusing these data sources in order to transfer 3D and GIS information into outdoor urban imagery. Applying this system to 1000+ pictures shot of the lower Manhattan skyline and the Statue of Liberty, we present two proof-of-concept examples of geometry-based photo enhancement which are difficult to perform via conventional image processing: feature annotation and image-based querying. In these examples, high-level knowledge projects from 3D world-space into georegistered 2D image planes and/or propagates between different photos. Such automatic capabilities lay the groundwork for future real-time labeling of imagery shot in complex city environments by mobile smart phones.
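    The core operation, projecting 3D world knowledge into a georegistered 2D image plane, reduces to a pinhole camera model; the intrinsics and pose below are made-up values for a minimal sketch, not parameters from the paper's system.

```python
import numpy as np

def project(K, R, t, X_world):
    """Project a 3D world point into pixel coordinates with a pinhole
    camera model: x ~ K [R | t] X."""
    X_cam = R @ X_world + t        # world -> camera coordinates
    x = K @ X_cam                  # camera -> homogeneous pixel coordinates
    return x[:2] / x[2]            # perspective divide

# Hypothetical calibration: 800-pixel focal length, 640x480 image center
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
R = np.eye(3)                      # identity pose for the sketch
t = np.zeros(3)
print(project(K, R, t, np.array([0.0, 0.0, 10.0])))  # [320. 240.]
```

A point on the optical axis lands at the principal point, which is the sanity check above; annotating real imagery would use per-photo poses recovered by structure from motion and ladar/GIS-derived world points.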

  4. Evolution of a Patient Information Management System in a Local Area Network Environment at Loyola University of Chicago Medical Center

    PubMed Central

    Price, Ronald N; Chandrasekhar, Arcot J; Tamirisa, Balaji

    1990-01-01

    The Department of Medicine at Loyola University Medical Center (LUMC) of Chicago has implemented a local area network (LAN) based Patient Information Management System (PIMS) as part of its integrated departmental database management system. PIMS consists of related database applications encompassing demographic information, current medications, problem lists, clinical data, prior events, and on-line procedure results. Integration into the existing departmental database system permits PIMS to capture and manipulate data in other departmental applications. Standardization of clinical data is accomplished through three data tables that verify diagnosis codes, procedure codes, and a standardized set of clinical data elements. The modularity of the system, coupled with standardized data formats, allowed the development of a Patient Information Protocol System (PIPS). PIPS, a user-definable protocol processor, provides physicians with individualized data entry or review screens customized for their specific research protocols or practice habits. Physician feedback indicates that the PIMS/PIPS combination enhances their ability to collect and review specific patient information by filtering large amounts of clinical data.

  5. Extracting spatial information from large aperture exposures of diffuse sources

    NASA Technical Reports Server (NTRS)

    Clarke, J. T.; Moos, H. W.

    1981-01-01

    The spatial properties of large aperture exposures of diffuse emission can be used both to investigate spatial variations in the emission and to filter out camera noise in exposures of weak emission sources. Spatial imaging can be accomplished both parallel and perpendicular to dispersion with a resolution of 5-6 arc sec, and a narrow median filter running perpendicular to dispersion across a diffuse image selectively filters out point source features, such as reseaux marks and fast particle hits. Spatial information derived from observations of solar system objects is presented.
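    The point-source rejection step can be sketched as a one-dimensional running median: a narrow spike (a reseau mark or fast particle hit) is removed while the smooth diffuse profile survives. The profile values and window width below are illustrative.

```python
import numpy as np

def median_filter_1d(a, width=3):
    """Running median over a 1D intensity profile: narrow spikes are
    replaced by local medians; smooth structure is nearly unchanged."""
    pad = width // 2
    padded = np.pad(a, pad, mode="edge")
    return np.array([np.median(padded[i:i + width]) for i in range(len(a))])

# Smooth diffuse profile with one particle-hit spike at index 3
profile = np.array([5.0, 5.2, 4.9, 25.0, 5.1, 5.0, 4.8, 5.1])
print(median_filter_1d(profile))  # spike suppressed to the local level
```

A median (rather than a mean) is what makes the filter selective: a single outlier inside the window never reaches the middle of the sorted values.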

  6. Design of an intelligent information system for in-flight emergency assistance

    NASA Technical Reports Server (NTRS)

    Feyock, Stefan; Karamouzis, Stamos

    1991-01-01

    The present research has as its goal the development of AI tools to help flight crews cope with in-flight malfunctions. The relevant tasks in such situations include diagnosis, prognosis, and recovery plan generation. Investigation of the information requirements of these tasks has shown that the determination of paths figures largely: what components or systems are connected to what others, how are they connected, whether connections satisfying certain criteria exist, and a number of related queries. The formulation of such queries frequently requires capabilities of the second-order predicate calculus. An information system is described that features second-order logic capabilities, and is oriented toward efficient formulation and execution of such queries.
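    The path queries described (which components connect to which, via connections satisfying given criteria) can be sketched as a reachability search parameterized by an edge predicate; quantifying over the predicate itself is where the second-order flavor comes from. The component graph below is hypothetical.

```python
from collections import deque

# Hypothetical aircraft component graph: each edge carries a connection type
EDGES = [
    ("battery", "fuel_pump", "electrical"),
    ("fuel_pump", "engine_1", "fuel"),
    ("battery", "avionics", "electrical"),
    ("hydraulic_pump", "rudder", "hydraulic"),
]

def path_exists(src, dst, edge_ok):
    """Does a path exist from src to dst using only connections
    for which the predicate edge_ok holds? (breadth-first search)"""
    adj = {}
    for a, b, kind in EDGES:
        if edge_ok(kind):
            adj.setdefault(a, []).append(b)
    seen, queue = {src}, deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            return True
        for nxt in adj.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

print(path_exists("battery", "engine_1", lambda k: True))               # True
print(path_exists("battery", "engine_1", lambda k: k == "electrical"))  # False
```

The same search answers both diagnosis-style queries ("can this fault propagate here?") and recovery-planning queries ("is there a usable route avoiding the failed subsystem?") just by swapping the predicate.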

  7. geophylobuilder 1.0: an arcgis extension for creating 'geophylogenies'.

    PubMed

    Kidd, David M; Liu, Xianhua

    2008-01-01

    Evolution is inherently a spatiotemporal process; however, despite this, phylogenetic and geographical data and models remain largely isolated from one another. Geographical information systems provide a ready-made spatial modelling, analysis and dissemination environment within which phylogenetic models can be explicitly linked with their associated spatial data and subsequently integrated with other georeferenced data sets describing the biotic and abiotic environment. geophylobuilder 1.0 is an extension for the arcgis geographical information system that builds a 'geophylogenetic' data model from a phylogenetic tree and associated geographical data. Geophylogenetic database objects can subsequently be queried, spatially analysed and visualized in both 2D and 3D within a geographical information system. © 2007 The Authors.

  8. Instruction in Pharmacokinetics: A Computer-Assisted Demonstration System

    ERIC Educational Resources Information Center

    Kahn, Norman; Bigger, J. Thomas

    1974-01-01

    The emerging discipline of clinical pharmacology is generating an ever increasing data base on the physiological disposition of a large number of drugs in man. Presents a system which would render this information readily understandable to students, regardless of their mathematical facility. (Author/PG)

  9. NASA/Howard University Large Space Structures Institute

    NASA Technical Reports Server (NTRS)

    Broome, T. H., Jr.

    1984-01-01

    Basic research on the engineering behavior of large space structures is presented. Methods of structural analysis, control, and optimization of large flexible systems are examined. Topics of investigation include the Load Correction Method (LCM) modeling technique, stabilization of flexible bodies by feedback control, mathematical refinement of analysis equations, optimization of the design of structural components, deployment dynamics, and the use of microprocessors in attitude and shape control of large space structures. Information on key personnel, budgeting, support plans and conferences is included.

  10. Systems Analysis for Large Army Formations.

    DTIC Science & Technology

    1984-06-01

    Science & Technology Division, April 1982. 8. Deitel, H.M., An Introduction to Operating Systems, Addison-Wesley Systems Programming Series, 1982, pp...information about enemy units and their structure. The corresponding application program should provide the user with the capability to enter, maintain and...corresponding application program should provide the Operations Subsystem personnel the capability to enter, retrieve, and modify proposed changes to

  11. The ATLAS EventIndex: data flow and inclusion of other metadata

    NASA Astrophysics Data System (ADS)

    Barberis, D.; Cárdenas Zárate, S. E.; Favareto, A.; Fernandez Casani, A.; Gallas, E. J.; Garcia Montoro, C.; Gonzalez de la Hoz, S.; Hrivnac, J.; Malon, D.; Prokoshin, F.; Salt, J.; Sanchez, J.; Toebbicke, R.; Yuan, R.; ATLAS Collaboration

    2016-10-01

    The ATLAS EventIndex is the catalogue of the event-related metadata for the information collected from the ATLAS detector. The basic unit of this information is the event record, containing the event identification parameters, pointers to the files containing this event as well as trigger decision information. The main use case for the EventIndex is event picking, as well as data consistency checks for large production campaigns. The EventIndex employs the Hadoop platform for data storage and handling, as well as a messaging system for the collection of information. The information for the EventIndex is collected both at Tier-0, when the data are first produced, and from the Grid, when various types of derived data are produced. The EventIndex uses various types of auxiliary information from other ATLAS sources for data collection and processing: trigger tables from the condition metadata database (COMA), dataset information from the data catalogue AMI and the Rucio data management system and information on production jobs from the ATLAS production system. The ATLAS production system is also used for the collection of event information from the Grid jobs. EventIndex developments started in 2012 and in the middle of 2015 the system was commissioned and started collecting event metadata, as a part of ATLAS Distributed Computing operations.

  12. Cockpit displayed traffic information and distributed management in air traffic control

    NASA Technical Reports Server (NTRS)

    Kreifeldt, J. G.

    1980-01-01

    A graphical display of information (such as surrounding aircraft and navigation routes) in the cockpit on a cathode ray tube has been proposed for improving the safety, orderliness, and expeditiousness of the air traffic control system. An investigation of this method at NASA-Ames indicated a large reduction in controller verbal workload without increasing pilot verbal workload, although visual workload may increase. The cockpit-displayed traffic and navigation information system reduced response delays, permitting pilots to maintain their spacing more closely and precisely than when depending entirely on controller-issued radar vectors and speed commands.

  13. Orienting health care information systems toward quality: how Group Health Cooperative of Puget Sound did it.

    PubMed

    Goverman, I L

    1994-11-01

    Group Health Cooperative of Puget Sound (GHC), a large staff-model health maintenance organization based in Seattle, is redesigning its information systems to provide the systems and information needed to support its quality agenda. Long-range planning for GHC's information resources was done in three phases. In assessment, interviews, surveys, and a benchmarking effort identified strengths and weaknesses of the existing information systems. We concluded that we needed to improve clinical care and patient management systems and enhance health plan applications. In direction setting, we developed six objectives (for example, approach information systems in a way that is consistent with quality improvement principles). Detailed planning was used to define projects, timing, and resource allocations. Some of the most important efforts in the resulting five-year plan include the development of (1) a computerized patient record; (2) a provider-based clinical workstation for access to patient information, order entry, results reporting, guidelines, and reminders; (3) a comprehensive set of patient management and service quality systems; (4) reengineered structures, policies, and processes within the health plan, supported by a complete set of integrated information systems; (5) a standardized, high-capacity communications network to provide linkages both within GHC and among its business partners; and (6) a revised oversight structure for information services, which forms partnerships with users. A quality focus ensured that each project not only produced its own benefits but also supported the larger organizational goals associated with "total" quality.

  14. [The computer assisted pacemaker clinic at the regional hospital of Udine (author's transl)].

    PubMed

    Feruglio, G A; Lestuzzi, L; Carminati, D

    1978-01-01

    For close follow-up of large groups of pacemaker patients and for evaluation of long-term pacing on a reliable statistical basis, many pacemaker centers in the world now use computer systems. A patient data system with structured display records, designed to give complete, comprehensive and surveyable information that is immediately retrievable 24 hours a day, on display or in printed form, seems to offer an ideal solution. The pacemaker clinic at the Regional Hospital of Udine has adopted this type of system. The clinic is linked to a live, on-line patient data system (G/3, Informatica Friuli-Venezia Giulia). The input and retrieval of information are made through a conventional keyboard. The input formats have fixed headings with coded alternatives and a limited space for comments in free text. The computer edits the coded information into surveyable reviews. Searches can be made on coded information and data of interest.

  15. Human interface to large multimedia databases

    NASA Astrophysics Data System (ADS)

    Davis, Ben; Marks, Linn; Collins, Dave; Mack, Robert; Malkin, Peter; Nguyen, Tam

    1994-04-01

    The emergence of high-speed networking for multimedia will have the effect of turning the computer screen into a window on a very large information space. As this information space increases in size and complexity, providing users with easy and intuitive means of accessing information will become increasingly important. Providing access to large amounts of text has been the focus of work for hundreds of years and has resulted in the evolution of a set of standards, from the Dewey Decimal System for libraries to the recently proposed ANSI standards for representing information on-line: KIF, the Knowledge Interchange Format, and CGs, Conceptual Graphs. Certain problems remain unsolved by these efforts, though: how to let users know the contents of the information space, so that they know whether or not they want to search it in the first place; how to facilitate browsing; and, more specifically, how to facilitate visual browsing. These issues are particularly important for users in educational contexts and have been the focus of much of our recent work. In this paper we discuss some of the solutions we have prototyped: specifically, visual means, visual browsers, and visual definitional sequences.

  16. Large Deployable Reflector (LDR) system concept and technology definition study. Volume 1: Executive summary, analyses and trades, and system concepts

    NASA Technical Reports Server (NTRS)

    Agnew, Donald L.; Jones, Peter A.

    1989-01-01

    A study was conducted to define reasonable and representative large deployable reflector (LDR) system concepts for the purpose of defining a technology development program aimed at providing the requisite technological capability necessary to start LDR development by the end of 1991. This volume includes the executive summary for the total study, a report of thirteen system analysis and trades tasks (optical configuration, aperture size, reflector material, segmented mirror, optical subsystem, thermal, pointing and control, transportation to orbit, structures, contamination control, orbital parameters, orbital environment, and spacecraft functions), and descriptions of three selected LDR system concepts. Supporting information is contained in appendices.

  17. Reconciling disparate information in continuity of care documents: Piloting a system to consolidate structured clinical documents.

    PubMed

    Hosseini, Masoud; Jones, Josette; Faiola, Anthony; Vreeman, Daniel J; Wu, Huanmei; Dixon, Brian E

    2017-10-01

    Due to the nature of information generation in health care, clinical documents contain duplicate and sometimes conflicting information. Recent implementation of Health Information Exchange (HIE) mechanisms in which clinical summary documents are exchanged among disparate health care organizations can proliferate duplicate and conflicting information. To reduce information overload, a system to automatically consolidate information across multiple clinical summary documents was developed for an HIE network. The system receives any number of Continuity of Care Documents (CCDs) and outputs a single, consolidated record. To test the system, a randomly sampled corpus of 522 CCDs representing 50 unique patients was extracted from a large HIE network. The automated methods were compared to manual consolidation of information for three key sections of the CCD: problems, allergies, and medications. Manual consolidation of 11,631 entries was completed in approximately 150 h. The same data were automatically consolidated in 3.3 min. The system successfully consolidated 99.1% of problems, 87.0% of allergies, and 91.7% of medications. Almost all of the inaccuracies were caused by issues involving the use of standardized terminologies within the documents to represent individual information entries. This study represents a novel, tested tool for de-duplication and consolidation of CDA documents, which is a major step toward improving information access and the interoperability among information systems. While more work is necessary, automated systems like the one evaluated in this study will be necessary to meet the informatics needs of providers and health systems in the future. Copyright © 2017 Elsevier Inc. All rights reserved.
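    The consolidation step can be sketched as keyed de-duplication on standardized terminology codes; the entry layout, codes, and merge policy below are simplifying assumptions for illustration, not the evaluated system's logic.

```python
# Hypothetical entries extracted from multiple CCDs for one patient;
# `code` stands in for a standardized terminology code (e.g. RxNorm, SNOMED CT)
entries = [
    {"section": "medications", "code": "197361", "name": "Amlodipine 5 MG", "source": "hospital_A"},
    {"section": "medications", "code": "197361", "name": "amlodipine 5mg tab", "source": "clinic_B"},
    {"section": "problems", "code": "38341003", "name": "Hypertension", "source": "hospital_A"},
]

def consolidate(entries):
    """Merge entries that share (section, code), keeping the first
    name seen and accumulating the contributing source systems."""
    merged = {}
    for e in entries:
        key = (e["section"], e["code"])
        if key not in merged:
            merged[key] = {**e, "source": [e["source"]]}
        else:
            merged[key]["source"].append(e["source"])
    return list(merged.values())

result = consolidate(entries)
print(len(result))  # 2 consolidated entries from 3 raw ones
```

Keying on codes rather than display names is exactly why the reported inaccuracies cluster around entries where the documents used terminology inconsistently: without a shared code, duplicates cannot be recognized.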

  18. The agent-based spatial information semantic grid

    NASA Astrophysics Data System (ADS)

    Cui, Wei; Zhu, YaQiong; Zhou, Yong; Li, Deren

    2006-10-01

    Analyzing the characteristics of multi-agent systems and geographic ontology, the concept of the Agent-based Spatial Information Semantic Grid (ASISG) is defined and its architecture is advanced. ASISG is composed of multi-agents and geographic ontology. The multi-agent systems comprise User Agents, a General Ontology Agent, Geo-Agents, Broker Agents, Resource Agents, Spatial Data Analysis Agents, Spatial Data Access Agents, a Task Execution Agent and a Monitor Agent. The architecture of ASISG has three layers: the fabric layer, the grid management layer and the application layer. The fabric layer, composed of the Data Access Agent, Resource Agent and Geo-Agent, encapsulates the data of spatial information systems and exhibits a conceptual interface to the grid management layer. The grid management layer, composed of the General Ontology Agent, Task Execution Agent, Monitor Agent and Data Analysis Agent, uses a hybrid method to manage all resources registered in a General Ontology Agent, which is described by a General Ontology System. The hybrid method combines resource dissemination and resource discovery: dissemination pushes resources from Local Ontology Agents to the General Ontology Agent, and discovery pulls resources from the General Ontology Agent to Local Ontology Agents. A Local Ontology Agent is derived from a specific domain and describes the semantic information of a local GIS. The Local Ontology Agents can be filtered to construct a virtual organization that provides a global scheme. The virtual organization lightens the burden on users because they need not search information site by site manually. The application layer, composed of the User Agent, Geo-Agent and Task Execution Agent, supplies a corresponding interface to a domain user.
The functions that ASISG should provide are: 1) It integrates different spatial information systems on the semantic level; the grid management layer establishes a virtual environment that seamlessly integrates all GIS nodes. 2) When the resource management system searches data across different spatial information systems, it transfers the meaning of different Local Ontology Agents rather than accessing data directly, so search and query operate on the semantic level. 3) The data access procedure is transparent to users: they can access information from a remote site as if from a local disk, because the General Ontology Agent automatically links data through the Data Agents that tie ontology concepts to GIS data. 4) The capability of processing massive spatial data: storing, accessing and managing massive spatial data from TB to PB; efficiently analyzing and processing spatial data to produce models, information and knowledge; and providing 3D and multimedia visualization services. 5) The capability of high-performance computing and processing of spatial information: solving spatial problems with high precision, high quality, and on a large scale; and processing spatial information in real time or on time, with high speed and high efficiency. 6) The capability of sharing spatial resources: distributed heterogeneous spatial information resources are shared, integrated and interoperated on the semantic level, so as to make the best use of spatial information resources such as computing resources, storage devices, spatial data (integrated from GIS, RS and GPS), spatial applications and services, and GIS platforms. 7) The capability of integrating legacy GIS systems: ASISG can not only be used to construct new advanced spatial application systems but also integrate legacy GIS systems, so as to preserve extensibility and inheritance and protect users' investment. 8) The capability of collaboration.
Large-scale spatial information applications and services always involve different departments in different geographic places, so remote and uniform services are needed. 9) The capability of supporting integration of heterogeneous systems: large-scale spatial information systems are always synthetic applications, so ASISG should provide interoperation and consistency by adopting open, applied technology standards. 10) The capability of adapting to dynamic changes: business requirements, application patterns, management strategies and IT products change endlessly in any department, so ASISG should be self-adaptive. Two examples are provided in this paper; they show in detail how to design a semantic grid based on multi-agent systems and ontology. In conclusion, the semantic grid of spatial information systems could improve the integration and interoperability of the spatial information grid.

  19. Information processing using a single dynamical node as complex system

    PubMed Central

    Appeltant, L.; Soriano, M.C.; Van der Sande, G.; Danckaert, J.; Massar, S.; Dambre, J.; Schrauwen, B.; Mirasso, C.R.; Fischer, I.

    2011-01-01

    Novel methods for information processing are highly desired in our information-driven society. Inspired by the brain's ability to process information, the recently introduced paradigm known as 'reservoir computing' shows that complex networks can efficiently perform computation. Here we introduce a novel architecture that reduces the usually required large number of elements to a single nonlinear node with delayed feedback. Through an electronic implementation, we experimentally and numerically demonstrate excellent performance in a speech recognition benchmark. Complementary numerical studies also show excellent performance for a time series prediction benchmark. These results prove that delay-dynamical systems, even in their simplest manifestation, can perform efficient information processing. This finding paves the way to feasible and resource-efficient technological implementations of reservoir computing. PMID:21915110
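    A discrete-time caricature of the architecture: one tanh nonlinearity whose delay loop is split into "virtual nodes", with a fixed random mask time-multiplexing the input across them. The real system couples virtual nodes through the node's finite response time and feeds the states to a trained linear readout; both are omitted from this sketch, and all parameter values are illustrative.

```python
import numpy as np

def delay_reservoir(inputs, n_virtual=50, eta=0.5, gamma=0.05, seed=0):
    """Single nonlinear node with delayed feedback: the delay line is
    divided into n_virtual 'virtual nodes' whose values together form
    the reservoir state. Each virtual node mixes its own delayed value
    (feedback strength eta) with the masked input (scale gamma)."""
    rng = np.random.default_rng(seed)
    mask = rng.choice([-1.0, 1.0], size=n_virtual)  # fixed input mask
    state = np.zeros(n_virtual)
    states = []
    for u in inputs:
        state = np.tanh(eta * state + gamma * mask * u)
        states.append(state.copy())
    return np.array(states)  # shape (len(inputs), n_virtual)

X = delay_reservoir(np.sin(np.linspace(0, 8 * np.pi, 200)))
print(X.shape)  # (200, 50)
```

The point of the mask is diversity: a single physical node, sampled at n_virtual points along its delay line, behaves like a network of n_virtual differently driven units.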

  20. A Survey of Librarian Perceptions of Information Literacy Techniques

    ERIC Educational Resources Information Center

    Yearwood, Simone L.; Foasberg, Nancy M.; Rosenberg, Kenneth D.

    2015-01-01

    Teaching research competencies and information literacy is an integral part of the academic librarian's role. There has long been debate among librarians over which methods of instruction are most effective for college students. Library Faculty members at a large urban university system were surveyed to determine their perceptions of the…

  1. New Telecommunication Uses in Tama, New Town. Summary.

    ERIC Educational Resources Information Center

    Komatsuzaki, Seisuke

    This paper deals with a Test Information Service which is being tried out in Tama, Japan. The large scale field experiment of the new telecommunication system is called Coaxial Cable Information Service (CCIS) and was started in 1976. Preliminary findings included that: 1) users are interested in the experiment; 2) two-way communication systems…

  2. The Convergence of Information Technology, Data, and Management in a Library Imaging Program

    ERIC Educational Resources Information Center

    France, Fenella G.; Emery, Doug; Toth, Michael B.

    2010-01-01

    Integrating advanced imaging and processing capabilities in libraries, archives, and museums requires effective systems and information management to ensure that the large amounts of digital data about cultural artifacts can be readily acquired, stored, archived, accessed, processed, and linked to other data. The Library of Congress is developing…

  3. America's New Deficit: The Shortage of Information Technology Workers.

    ERIC Educational Resources Information Center

    Office of Technology Policy (DOC), Washington, DC.

    According to a recent survey of midsized and large U.S. companies, approximately 190,000 information technology (IT) jobs are unfilled because of a shortage of qualified workers. The formal, four-year education system is producing only a small proportion of the workers required. IT workers can also obtain skills from two-year associate…

  4. Quantum Search in Hilbert Space

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2003-01-01

    A proposed quantum-computing algorithm would perform a search for an item of information in a database stored in a Hilbert-space memory structure. The algorithm is intended to make it possible to search relatively quickly through a large database under conditions in which available computing resources would otherwise be considered inadequate to perform such a task. The algorithm would apply, more specifically, to a relational database in which information would be stored in a set of N complex orthonormal vectors, each of N dimensions (where N can be exponentially large). Each vector would constitute one row of a unitary matrix, from which one would derive the Hamiltonian operator (and hence the evolutionary operator) of a quantum system. In other words, all the stored information would be mapped onto a unitary operator acting on a quantum state that would represent the item of information to be retrieved. Then one could exploit quantum parallelism: one could pose all search queries simultaneously by performing a quantum measurement on the system. In so doing, one would effectively solve the search problem in one computational step. One could exploit the direct- and inner-product decomposability of the unitary matrix to make the dimensionality of the memory space exponentially large by use of only linear resources. However, inasmuch as the necessary preprocessing (the mapping of the stored information into a Hilbert space) could be exponentially expensive, the proposed algorithm would likely be most beneficial in applications in which the resources available for preprocessing were much greater than those available for searching.
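The mapping described above, records stored as rows of a unitary matrix and retrieval via a single application of the operator, can be illustrated with classical linear algebra. The sketch below is a toy under stated assumptions: it builds a random unitary by QR decomposition and shows that querying with the conjugate of row k concentrates all measurement probability on basis state k; it does not model the (possibly exponential) cost of preparing such states.

```python
import numpy as np

# Toy classical illustration: store N "records" as the rows of a unitary
# U; querying with the conjugate of row k maps the query state onto the
# basis vector |k>, so a single measurement reveals k.
rng = np.random.default_rng(1)
N = 8
A = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
U, _ = np.linalg.qr(A)              # rows and columns are orthonormal

k = 5                               # index of the record to retrieve
query = np.conj(U[k])               # query state built from record k
amplitudes = U @ query              # one application of the operator
probs = np.abs(amplitudes) ** 2     # measurement probabilities
print(int(np.argmax(probs)), round(float(probs[k]), 6))  # -> 5 1.0
```

Because the rows are orthonormal, the amplitude vector is exactly the k-th basis vector, so the "search" succeeds in one step in this idealized setting.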

  5. A mobile Nursing Information System based on human-computer interaction design for improving quality of nursing.

    PubMed

    Su, Kuo-Wei; Liu, Cheng-Li

    2012-06-01

A conventional Nursing Information System (NIS), which supports the nurse's role in some areas, is typically deployed as an immobile system. However, such a traditional information system cannot respond to patients' conditions in real time, delaying the availability of this information. With the advances of information technology, mobile devices are increasingly being used to extend the human mind's limited capacity to recall and process large numbers of relevant variables and to support information management, general administration, and clinical practice. Unfortunately, there have been few studies on combining a well-designed small-screen interface with a personal digital assistant (PDA) in clinical nursing. Some researchers have found that user interface design is an important factor in determining the usability and potential use of a mobile system. Therefore, this study proposed a systematic approach to the development of a mobile nursing information system (MNIS) based on Mobile Human-Computer Interaction (M-HCI) for use in clinical nursing. The system combines principles of small-screen interface design with user-specified requirements. In addition, the iconic functions were designed around a metaphor concept that helps users learn the system more quickly with less working memory. An experiment involving learnability testing, thinking aloud, and a questionnaire investigation was conducted to evaluate the effect of the MNIS on a PDA. The results show that the proposed MNIS is easy to learn and achieves high satisfaction scores for its symbols, terminology, and system information.

  6. AI in medicine on its way from knowledge-intensive to data-intensive systems.

    PubMed

    Horn, W

    2001-08-01

The last 20 years of research and development in the field of artificial intelligence in medicine (AIM) show a path from knowledge-intensive systems, which try to capture the essential knowledge of experts in a knowledge-based system, to the data-intensive systems available today. Nowadays enormous amounts of information are accessible electronically. Large datasets are collected by continuously monitoring the physiological parameters of patients. Knowledge-based systems are needed to make use of all these available data and to help us cope with the information explosion. In addition, temporal data analysis and intelligent information visualization can help us obtain a summarized view of how clinical parameters change over time. Integrating AIM modules into the daily-routine software environment of our care providers gives us a great chance for maintaining and improving quality of care.

  7. Applying a framework for assessing the health system challenges to scaling up mHealth in South Africa

    PubMed Central

    2012-01-01

    Background Mobile phone technology has demonstrated the potential to improve health service delivery, but there is little guidance to inform decisions about acquiring and implementing mHealth technology at scale in health systems. Using the case of community-based health services (CBS) in South Africa, we apply a framework to appraise the opportunities and challenges to effective implementation of mHealth at scale in health systems. Methods A qualitative study reviewed the benefits and challenges of mHealth in community-based services in South Africa, through a combination of key informant interviews, site visits to local projects and document reviews. Using a framework adapted from three approaches to reviewing sustainable information and communication technology (ICT), the lessons from local experience and elsewhere formed the basis of a wider consideration of scale up challenges in South Africa. Results Four key system dimensions were identified and assessed: government stewardship and the organisational, technological and financial systems. In South Africa, the opportunities for successful implementation of mHealth include the high prevalence of mobile phones, a supportive policy environment for eHealth, successful use of mHealth for CBS in a number of projects and a well-developed ICT industry. However there are weaknesses in other key health systems areas such as organisational culture and capacity for using health information for management, and the poor availability and use of ICT in primary health care. The technological challenges include the complexity of ensuring interoperability and integration of information systems and securing privacy of information. Finally, there are the challenges of sustainable financing required for large scale use of mobile phone technology in resource limited settings. 
Conclusion Against a background of a health system with a weak ICT environment and limited implementation capacity, it remains uncertain that the potential benefits of mHealth for CBS would be retained with immediate large-scale implementation. Applying a health systems framework facilitated a systematic appraisal of potential challenges to scaling up mHealth for CBS in South Africa and may be useful for policy and practice decision-making in other low- and middle-income settings. PMID:23126370

  8. Bioinspired Principles for Large-Scale Networked Sensor Systems: An Overview

    PubMed Central

    Jacobsen, Rune Hylsberg; Zhang, Qi; Toftegaard, Thomas Skjødeberg

    2011-01-01

    Biology has often been used as a source of inspiration in computer science and engineering. Bioinspired principles have found their way into network node design and research due to the appealing analogies between biological systems and large networks of small sensors. This paper provides an overview of bioinspired principles and methods such as swarm intelligence, natural time synchronization, artificial immune system and intercellular information exchange applicable for sensor network design. Bioinspired principles and methods are discussed in the context of routing, clustering, time synchronization, optimal node deployment, localization and security and privacy. PMID:22163841

  9. A vision for an ultra-high resolution integrated water cycle observation and prediction system

    NASA Astrophysics Data System (ADS)

    Houser, P. R.

    2013-05-01

    Society's welfare, progress, and sustainable economic growth—and life itself—depend on the abundance and vigorous cycling and replenishing of water throughout the global environment. The water cycle operates on a continuum of time and space scales and exchanges large amounts of energy as water undergoes phase changes and is moved from one part of the Earth system to another. We must move toward an integrated observation and prediction paradigm that addresses broad local-to-global science and application issues by realizing synergies associated with multiple, coordinated observations and prediction systems. A central challenge of a future water and energy cycle observation strategy is to progress from single variable water-cycle instruments to multivariable integrated instruments in electromagnetic-band families. The microwave range in the electromagnetic spectrum is ideally suited for sensing the state and abundance of water because of water's dielectric properties. Eventually, a dedicated high-resolution water-cycle microwave-based satellite mission may be possible based on large-aperture antenna technology that can harvest the synergy that would be afforded by simultaneous multichannel active and passive microwave measurements. A partial demonstration of these ideas can even be realized with existing microwave satellite observations to support advanced multivariate retrieval methods that can exploit the totality of the microwave spectral information. The simultaneous multichannel active and passive microwave retrieval would allow improved-accuracy retrievals that are not possible with isolated measurements. Furthermore, the simultaneous monitoring of several of the land, atmospheric, oceanic, and cryospheric states brings synergies that will substantially enhance understanding of the global water and energy cycle as a system. 
The multichannel approach also affords advantages to some constituent retrievals; for instance, simultaneous retrieval of vegetation biomass would improve soil-moisture retrieval by avoiding the need for auxiliary vegetation information. This multivariable water-cycle observation system must be integrated with high-resolution, application-relevant prediction systems to optimize their information content and utility in addressing critical water cycle issues. One such vision is a real-time, ultra-high resolution, locally-mosaicked global land modeling and assimilation system that overlays regional high-fidelity information over a baseline global land prediction system. Such a system would provide the best possible local information for use in applications, while integrating and sharing information globally for diagnosing larger water cycle variability. In a sense, this would constitute a hydrologic telecommunication system, where data from the best local in-situ gages, Doppler radars, and weather stations can be shared internationally and integrated in a consistent manner with global observation platforms like the multivariable water cycle mission. To realize such a vision, large issues must be addressed, such as international data sharing policy, model-observation integration approaches that maintain local extremes while achieving global consistency, and methods for establishing error estimates and uncertainty.

  10. The Impact of Information Technology on the Design, Development, and Implementation of a Lunar Exploration Mission

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R.; Sims, Michael H.; Briggs, Geoffrey A.

    1996-01-01

From the beginning to the present, expeditions to the Moon have involved a large investment of human labor. This has been true for all aspects of the process, from the initial design of the mission, whether scientific or technological, through the development of the instruments and the spacecraft, to the flight and operational phases. In addition to the time constraints that this situation imposes, there is also the significant expense associated with such large labor requirements. As a result, lunar expeditions have been limited to a few robotic missions and the manned Apollo program missions of the 1970s. With the rapid rise of the new information technologies, new paradigms are emerging that promise to greatly reduce both the time and cost of such missions. With the rapidly increasing capabilities of computer hardware and software systems, as well as networks and communication systems, a new balance of work is being developed between the human and the machine system. This new balance holds the promise of greatly increased exploration capability, along with dramatically reduced design, development, and operating costs. These new information technologies, utilizing knowledge-based software and very high-speed computer systems, will provide new design and development tools, scheduling mechanisms, and vehicle and system health monitoring capabilities that have hitherto been unavailable to the mission and spacecraft designer and the system operator. This paper will utilize typical lunar missions, both robotic and crewed, as a basis to describe and illustrate how these new information system technologies could be applied to all aspects of such missions. In particular, new system design tradeoff tools will be described, along with technologies that will allow a much greater degree of autonomy for exploration vehicles than has heretofore been possible. In addition, new information technologies that will significantly reduce human operational requirements will be discussed.

  11. Information Flows? A Critique of Transfer Entropies

    NASA Astrophysics Data System (ADS)

    James, Ryan G.; Barnett, Nix; Crutchfield, James P.

    2016-06-01

    A central task in analyzing complex dynamics is to determine the loci of information storage and the communication topology of information flows within a system. Over the last decade and a half, diagnostics for the latter have come to be dominated by the transfer entropy. Via straightforward examples, we show that it and a derivative quantity, the causation entropy, do not, in fact, quantify the flow of information. At one and the same time they can overestimate flow or underestimate influence. We isolate why this is the case and propose several avenues to alternate measures for information flow. We also address an auxiliary consequence: The proliferation of networks as a now-common theoretical model for large-scale systems, in concert with the use of transferlike entropies, has shoehorned dyadic relationships into our structural interpretation of the organization and behavior of complex systems. This interpretation thus fails to include the effects of polyadic dependencies. The net result is that much of the sophisticated organization of complex systems may go undetected.
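For readers unfamiliar with the quantity being critiqued, a minimal plug-in estimator of the transfer entropy T(Y→X) = I(X_{t+1}; Y_t | X_t) for binary series is sketched below. The driving example and thresholds are illustrative assumptions; this sketch does not reproduce the paper's counterexamples, only the standard diagnostic itself.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of T(Y->X) = I(X_{t+1}; Y_t | X_t) in bits (binary series)."""
    triples = list(zip(x[1:], x[:-1], y[:-1]))      # (x_{t+1}, x_t, y_t)
    n = len(triples)
    p_abc = Counter(triples)
    p_ab = Counter((a, b) for a, b, c in triples)   # (x_{t+1}, x_t)
    p_bc = Counter((b, c) for a, b, c in triples)   # (x_t, y_t)
    p_b = Counter(b for a, b, c in triples)         # x_t
    te = 0.0
    for (a, b, c), cnt in p_abc.items():
        # p(a,b,c) * log2[ p(a,b,c) p(b) / (p(a,b) p(b,c)) ], counts cancel n
        ratio = cnt * p_b[b] / (p_ab[(a, b)] * p_bc[(b, c)])
        te += (cnt / n) * np.log2(ratio)
    return te

rng = np.random.default_rng(2)
y = rng.integers(0, 2, 10000)
x = np.empty_like(y)
x[0] = 0
x[1:] = y[:-1]              # X copies Y with a one-step lag: genuine driving
te_driven = transfer_entropy(x, y)
te_shuffled = transfer_entropy(x, rng.permutation(y))  # destroys the timing
print(round(te_driven, 3), round(te_shuffled, 3))      # ~1 bit vs ~0
```

In the copy-with-lag case the estimate approaches 1 bit, the full entropy rate of X; the shuffled control collapses toward zero, which is the behavior the transfer entropy is meant to exhibit and whose interpretation as "information flow" the paper questions.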

  12. [Introduction of hospital information system and anesthesia information management system into the perianesthetic practice at Osaka City University Hospital].

    PubMed

    Shimizu, Motoko; Tanaka, Katsuaki; Hagiwara, Chie; Ikenaga, Kazutake; Yoshioka, Miwako; Asada, Akira

    2011-06-01

Recently, hospital information systems (HIS) and anesthesia information management systems (AIMS) have improved rapidly and have been widely introduced into clinical practice in Japan; however, few reports have detailed their influence on clinical practice. We here report our experience. We introduced an HIS (EGMAIN-EX, Fujitsu Co., Ltd.) in our preoperative evaluation clinic and in the postoperative care unit. An AIMS (ORSYS, Philips Electronics Japan) was introduced almost exclusively for intraoperative management. It became easy for us to acquire patients' information and to share it with the medical staff of other departments. However, we had to devote substantial human resources to the introduction and maintenance of the HIS and the AIMS. Although AIMS is more useful than HIS for anesthetic management, HIS seems more suitable than AIMS for coordinating perioperative management with the medical staff of other departments.

  13. Modelling End-User of Electronic-Government Service: The Role of Information quality, System Quality and Trust

    NASA Astrophysics Data System (ADS)

    Witarsyah Jacob, Deden; Fudzee, Mohd Farhan Md; Aizi Salamat, Mohamad; Kasim, Shahreen; Mahdin, Hairulnizam; Azhar Ramli, Azizul

    2017-08-01

Many governments around the world increasingly use internet technologies such as electronic government to provide public services. These services range from providing the most basic informational website to deploying sophisticated tools for managing interactions between government agencies and beyond government. Electronic government (e-government) aims to provide more accurate, easily accessible, cost-effective and time-saving services for the community. In this study, we develop a new model of e-government service adoption by extending the Unified Theory of Acceptance and Use of Technology (UTAUT) through the incorporation of the variables System Quality, Information Quality and Trust. The model is then tested using a large-scale, multi-site survey of 237 Indonesian citizens and validated using Structural Equation Modeling (SEM). The results indicate that the System Quality, Information Quality and Trust variables significantly affect user behavior. This study extends the current understanding of the influence of the System Quality, Information Quality and Trust factors for researchers, practitioners, and policy makers.

  14. Air System Information Management

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.

    2004-01-01

I flew to Washington last week, a trip rich in distributed information management. Buying tickets, at the gate, in flight, landing and at the baggage claim, myriad messages about my reservation, the weather, our flight plans, gates, bags and so forth flew among a variety of travel agency, airline and Federal Aviation Administration (FAA) computers and personnel. By and large, each kind of information ran on a particular application, often specialized to its own data formats and communications network. I went to Washington to attend an FAA meeting on System-Wide Information Management (SWIM) for the National Airspace System (NAS) (http://www.nasarchitecture.faa.gov/Tutorials/NAS101.cfm). NAS (and its information infrastructure, SWIM) is an attempt to bring greater regularity, efficiency and uniformity to the collection of stovepipe applications now used to manage air traffic. Current systems hold information about flight plans, flight trajectories, current and forecast weather, air turbulence, radar summaries, hazardous condition warnings, airport and airspace capacity constraints, temporary flight restrictions, and so forth. Information moving among these stovepipe systems is usually mediated by people (for example, air traffic controllers) or single-purpose applications. People, whose intelligence is critical for difficult tasks and unusual circumstances, are not as efficient as computers for tasks that can be automated. Better information sharing can lead to higher system capacity, more efficient utilization and safer operations. Better information sharing through greater automation is possible though not necessarily easy.

  15. Information Technology Supports Integration of Satellite Imagery with Irrigation Management in California's Central Valley

    USDA-ARS?s Scientific Manuscript database

    Remotely sensed data can potentially be used to develop crop coefficient estimates over large areas and make irrigation scheduling more practical, convenient, and accurate. A demonstration system is being developed under NASA's Terrestrial Observation and Prediction System (TOPS) to automatically r...

  16. Alex Swindler | NREL

    Science.gov Websites

Research interests include distributed computing, Web information systems engineering, software engineering, and computer graphics. Projects include a Dashboard, the NREL Energy Story visualization, and Green Button data integration, as well as a large number of Web projects, with work recognized by an R&D 100 Award. Prior to joining NREL, Alex worked as a system administrator and Web developer

  17. Decision-Guided Recommenders with Composite Alternatives

    ERIC Educational Resources Information Center

    Alodhaibi, Khalid

    2011-01-01

    Recommender systems aim to support users in their decision-making process while interacting with large information spaces and recommend items of interest to users based on preferences they have expressed, either explicitly or implicitly. Recommender systems are increasingly used with product and service selection over the Internet. Although…

  18. Systems and Cascades in Cognitive Development and Academic Achievement

    ERIC Educational Resources Information Center

    Bornstein, Marc H.; Hahn, Chun-Shin; Wolke, Dieter

    2013-01-01

    A large-scale ("N" = 552) controlled multivariate prospective 14-year longitudinal study of a developmental cascade embedded in a developmental system showed that information-processing efficiency in infancy (4 months), general mental development in toddlerhood (18 months), behavior difficulties in early childhood (36 months),…

  19. Characteristics of Urbanization in Five Watersheds of Anchorage, Alaska: Geographic Information System Data

    USGS Publications Warehouse

    Moran, Edward H.

    2002-01-01

The report contains environmental and urban geographic information system data for 14 sites in 5 watersheds in Anchorage, Alaska. These sites were examined during summer in 1999 and 2000 to determine effects of urbanization on water quality. The data sets are Environmental Systems Research Institute, Inc., shapefiles, coverages, and images. Also included are an elevation grid and a triangulated irregular network. Although the data are intended for users with advanced geographic information system capabilities, simple images of the data also are available. An ArcView 3.2 project, an ArcGIS project, and 16 ArcExplorer2 projects are linked to the PDF file-based report. Some of these coverages are large files over 10 MB. The largest coverage, impervious cover, is 208 MB.

  20. Integrating remote sensing, geographic information systems and global positioning system techniques with hydrological modeling

    NASA Astrophysics Data System (ADS)

    Thakur, Jay Krishna; Singh, Sudhir Kumar; Ekanthalu, Vicky Shettigondahalli

    2017-07-01

Integration of remote sensing (RS), geographic information systems (GIS) and global positioning systems (GPS) is an emerging research area in the fields of groundwater hydrology, resource management, environmental monitoring and emergency response. Recent advancements in RS, GIS, GPS and higher levels of computation will help in providing and handling a range of data simultaneously in a time- and cost-efficient manner. This review paper deals with hydrological modeling, the uses of remote sensing and GIS in hydrological modeling, and models of integration and the need for them, and closes with conclusions. After dealing with these issues conceptually and technically, we can develop better methods and novel approaches to handle large data sets and to better communicate information about a rapidly diminishing societal resource, i.e. groundwater.

  1. Probabilistic load simulation: Code development status

    NASA Astrophysics Data System (ADS)

    Newell, J. F.; Ho, H.

    1991-05-01

    The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are included in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.
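The two-part structure described above, a knowledge base supplying load information and a simulation module doing the numerics, can be caricatured in a few lines. The distributions and load-source names below are hypothetical stand-ins, not CLS data.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical "knowledge base": distributions for individual load sources
# (names and numbers are illustrative, not from the CLS project).
LOAD_INFO = {
    "thrust_oscillation": {"mean": 12.0, "std": 2.0},   # kN
    "thermal_transient":  {"mean": 5.0,  "std": 1.5},
    "vibration":          {"mean": 3.0,  "std": 0.8},
}

def simulate_composite_load(n_samples):
    """Simulation module: sample each source and sum into a composite load."""
    total = np.zeros(n_samples)
    for params in LOAD_INFO.values():
        total += rng.normal(params["mean"], params["std"], n_samples)
    return total

loads = simulate_composite_load(20000)
p99 = np.percentile(loads, 99)        # a design-relevant exceedance level
print(f"mean {loads.mean():.1f} kN, 99th percentile {p99:.1f} kN")
```

The point of the split is that the probabilistic machinery stays generic while all engineering knowledge lives in the (here trivial) knowledge base, which in the real system also sets up the simulation runs.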

  2. The Systems Analysis and Design Course: An Educators' Assessment of the Importance and Coverage of Topics

    ERIC Educational Resources Information Center

    Guidry, Brandi N.; Stevens, David P.; Totaro, Michael W.

    2011-01-01

    This study examines instructors' perceptions regarding the skills and topics that are most important in the teaching of a Systems Analysis and Design ("SAD") course and the class time devoted to each. A large number of Information Systems ("IS") educators at AACSB accredited schools across the United States were surveyed.…

  3. Regional crop yield forecasting: a probabilistic approach

    NASA Astrophysics Data System (ADS)

    de Wit, A.; van Diepen, K.; Boogaard, H.

    2009-04-01

Information on the outlook for yield and production of crops over large regions is essential for government services dealing with import and export of food crops, for agencies with a role in food relief, for international organizations with a mandate in monitoring the world food production and trade, and for commodity traders. Process-based mechanistic crop models are an important tool for providing such information, because they can integrate the effect of crop management, weather and soil on crop growth. When properly integrated in a yield forecasting system, the aggregated model output can be used to predict crop yield and production at regional, national and continental scales. Nevertheless, given the scales at which these models operate, the results are subject to large uncertainties due to poorly known weather conditions and crop management. Current yield forecasting systems are generally deterministic in nature and provide no information about the uncertainty bounds on their output. To improve on this situation, we present an ensemble-based approach where uncertainty bounds can be derived from the dispersion of results in the ensemble. The probabilistic information provided by this ensemble-based system can be used to quantify uncertainties (risk) on regional crop yield forecasts and can therefore be an important support to quantitative risk analysis in a decision making process.
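The ensemble-based idea can be sketched as follows: run a crop model over many plausible weather realizations and report percentile bounds rather than a single deterministic number. The model form and every parameter value below are invented for illustration; a real system would use a process-based model and calibrated weather ensembles.

```python
import numpy as np

rng = np.random.default_rng(3)

def toy_yield_model(rainfall_mm, temperature_c):
    """Hypothetical stand-in for a process-based crop model."""
    water_factor = np.clip(rainfall_mm / 500.0, 0, 1)
    heat_penalty = np.clip((temperature_c - 25.0) * 0.05, 0, 0.5)
    return 8.0 * water_factor * (1.0 - heat_penalty)   # t/ha

# Ensemble of plausible weather realizations for a poorly observed region.
n_members = 1000
rain = rng.normal(450, 80, n_members)      # uncertain seasonal rainfall
temp = rng.normal(24, 2, n_members)        # uncertain mean temperature

yields = toy_yield_model(rain, temp)
p10, p50, p90 = np.percentile(yields, [10, 50, 90])
print(f"median forecast {p50:.2f} t/ha, 80% interval [{p10:.2f}, {p90:.2f}]")
```

The interval width is the payoff: it carries the risk information that a single deterministic forecast discards.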

  4. Extinguishing agent for magnesium fire, phases 5 and 6

    NASA Astrophysics Data System (ADS)

    Beeson, H. D.; Tapscott, R. E.; Mason, B. E.

    1987-07-01

    This report documents the validation testing of the extinguishing system for metal fires developed as part of Phases 1 to 4. The results of this validation testing form the basis of information from which draft military specifications necessary to procure the agent and the agent delivery system may be developed. The developed system was tested against a variety of large-scale metal fire scenarios and the capabilities of the system were assessed. In addition the response of the system to storage and to changes in ambient conditions was tested. Results of this testing revealed that the developed system represented a reliable metal fire extinguishing system that could control and extinguish very large metal fires. The specifications developed for the agent and for the delivery system are discussed in detail.

  5. National information network and database system of hazardous waste management in China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma Hongchang

    1996-12-31

Industries in China generate large volumes of hazardous waste, which makes it essential for the nation to pay more attention to hazardous waste management. National laws and regulations, waste surveys, and manifest tracking and permission systems have been initiated. Some centralized hazardous waste disposal facilities are under construction. China's National Environmental Protection Agency (NEPA) has also obtained valuable information on hazardous waste management from developed countries. To effectively share this information with local environmental protection bureaus, NEPA developed a national information network and database system for hazardous waste management. This information network will have such functions as information collection, inquiry, and connection. The long-term objective is to establish and develop a national and local hazardous waste management information network. This network will significantly help decision makers and researchers because it will be easy to obtain information (e.g., experiences of developed countries in hazardous waste management) to enhance hazardous waste management in China. The information network consists of five parts: technology consulting, import-export management, regulation inquiry, waste survey, and literature inquiry.

  6. A novel architecture for information retrieval system based on semantic web

    NASA Astrophysics Data System (ADS)

    Zhang, Hui

    2011-12-01

Nowadays, the web has enabled an explosive growth of information sharing (there are currently over 4 billion pages covering most areas of human endeavor), so the web faces a new challenge of information overload. The challenge now before us is not only to help people locate relevant information precisely but also to access and aggregate a variety of information from different resources automatically. Current web documents are in human-oriented formats; they are suitable for presentation, but machines cannot understand the meaning of a document. To address this issue, Berners-Lee proposed the concept of the semantic web. With semantic web technology, web information can be understood and processed by machines, providing new possibilities for automatic web information processing. A main problem of semantic web information retrieval is that when the system lacks sufficient knowledge, it returns a large number of meaningless results to users. In this paper, we present the architecture of an information retrieval system based on the semantic web. In addition, our system employs an inference engine to check whether a query should be posed to the keyword-based search engine or to the semantic search engine.
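The query-dispatch step described above can be sketched as a stub: a toy "inference engine" checks whether the knowledge base covers the query and routes it accordingly. The ontology, corpus, and all names below are illustrative assumptions, not the paper's system.

```python
# Toy knowledge base: concept -> known relations (illustrative only).
ONTOLOGY = {
    "protein": {"is_a": "molecule", "studied_in": "biochemistry"},
    "enzyme": {"is_a": "protein", "function": "catalysis"},
}

def semantic_search(query):
    """Answer from the ontology as subject-relation-object strings."""
    term = query.lower()
    facts = ONTOLOGY.get(term, {})
    return [f"{term} {rel} {obj}" for rel, obj in facts.items()]

def keyword_search(query, corpus):
    """Fallback: plain substring match over the document corpus."""
    return [doc for doc in corpus if query.lower() in doc.lower()]

def route_query(query, corpus):
    """Prefer the semantic engine when the ontology covers the query."""
    if query.lower() in ONTOLOGY:
        return "semantic", semantic_search(query)
    return "keyword", keyword_search(query, corpus)

corpus = ["Enzyme kinetics overview", "Quantum dots in displays"]
print(route_query("enzyme", corpus)[0])    # -> semantic
print(route_query("quantum", corpus)[0])   # -> keyword
```

Routing to the keyword engine when knowledge is missing is exactly the guard against the "large number of meaningless results" failure mode the abstract describes.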

  7. Information Resource Management Planning in the Office of the Under Secretary of Defense (Acquisition)

    DTIC Science & Technology

    1989-08-01

Include in this plan the role of the Defense Technical Information Center (DTIC), the Defense Technology Security Administration (DTSA), and ODDR&E's...DTIC = Defense Technical Information Center DTSA = Defense Technology Security Administration DUSD = Deputy Under Secretary of Defense DUSD...technologically sensitive requests. The Defense Technology Security Administration (DTSA) is developing a large system to track foreign military sales

  8. Plan Debugging Using Approximate Domain Theories.

    DTIC Science & Technology

    1995-03-01

compelling suggestion that generative planning systems solving large problems will need to exploit the control information implicit in uncertain...control information implicit in uncertain information may well lead the planner to expand one portion of a plan at one point, and a separate portion of...solutions that have been proposed are to abandon declarativism (as suggested in the work on situated automata theory and its variants [1, 16, 56, 72

  9. Implementing Large-Scale Instructional Technology in Kenya: Changing Instructional Practice and Developing Accountability in a National Education System

    ERIC Educational Resources Information Center

    Piper, Benjamin; Oyanga, Arbogast; Mejia, Jessica; Pouezevara, Sarah

    2017-01-01

    Previous large-scale education technology interventions have shown only modest impacts on student achievement. Building on results from an earlier randomized controlled trial of three different applications of information and communication technologies (ICTs) on primary education in Kenya, the Tusome Early Grade Reading Activity developed the…

  10. An On-Line Nutrition Information System for the Clinical Dietitian

    PubMed Central

    Petot, Grace J.; Houser, Harold B.; Uhrich, Roberta V.

    1980-01-01

    A university based computerized nutrient data base has been integrated into an on-line nutrition information system in a large acute care hospital. Key elements described in the design and installation of the system are the addition of hospital menu items to the existing nutrient data base, the creation of a unique recipe file in the computer, production of a customized menu/nutrient handbook, preparation of forms and establishment of output formats. Standardization of nutrient calculations in the clinical and food production areas, variety and purposes of various format options, the advantages of timesharing and plans for expansion of the system are discussed.

  11. Cogeneration technology alternatives study. Volume 6: Computer data

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The potential technical capabilities of energy conversion systems in the 1985 - 2000 time period were defined with emphasis on systems using coal, coal-derived fuels or alternate fuels. Industrial process data developed for the large energy consuming industries serve as a framework for the cogeneration applications. Ground rules for the study were established and other necessary equipment (balance-of-plant) was defined. This combination of technical information, energy conversion system data ground rules, industrial process information and balance-of-plant characteristics was analyzed to evaluate energy consumption, capital and operating costs and emissions. Data in the form of computer printouts developed for 3000 energy conversion system-industrial process combinations are presented.

  12. Large Scale Portability of Hospital Information System Software

    PubMed Central

    Munnecke, Thomas H.; Kuhn, Ingeborg M.

    1986-01-01

    As part of its Decentralized Hospital Computer Program (DHCP) the Veterans Administration installed new hospital information systems in 169 of its facilities during 1984 and 1985. The application software for these systems is based on the ANS MUMPS language, is public domain, and is designed to be operating system and hardware independent. The software, developed by VA employees, is built upon a layered approach, where application packages layer on a common data dictionary which is supported by a Kernel of software. Communications between facilities are based on public domain Department of Defense ARPA net standards for domain naming, mail transfer protocols, and message formats, layered on a variety of communications technologies.

  13. Axiope tools for data management and data sharing.

    PubMed

    Goddard, Nigel H; Cannon, Robert C; Howell, Fred W

    2003-01-01

    Many areas of biological research generate large volumes of very diverse data. Managing this data can be a difficult and time-consuming process, particularly in an academic environment where there are very limited resources for IT support staff such as database administrators. The most economical and efficient solutions are those that enable scientists with minimal IT expertise to control and operate their own desktop systems. Axiope provides one such solution, Catalyzer, which acts as a flexible cataloging system for creating structured records describing digital resources. The user is able to specify both the content and structure of the information included in the catalog. Information and resources can be shared by a variety of means, including automatically generated sets of web pages. Federation and integration of this information, where needed, is handled by Axiope's Mercat server. Where there is a need for standardization or compatibility of the structures used by different researchers, this can be achieved later by applying user-defined mappings in Mercat. In this way, large-scale data sharing can be achieved without imposing unnecessary constraints or interfering with the way in which individual scientists choose to record and catalog their work. We summarize the key technical issues involved in scientific data management and data sharing, describe the main features and functionality of Axiope Catalyzer and Axiope Mercat, and discuss future directions and requirements for an information infrastructure to support large-scale data sharing and scientific collaboration.
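    A minimal sketch of the mapping idea described above: researcher-defined record structures reconciled after the fact by user-defined field mappings. The field names are hypothetical, not Axiope's actual schema or API.

```python
# Illustrative sketch: researcher-defined record structures for cataloging
# resources, plus a later user-defined mapping that reconciles field names
# across catalogs. Field names are hypothetical.

def apply_mapping(record, field_map):
    """Rename a record's fields according to a user-defined mapping."""
    return {field_map.get(key, key): value for key, value in record.items()}

# Two labs catalogued the same kind of resource under different field names;
# a mapping applied afterwards brings lab A's records into the shared form.
lab_a_record = {"sample_id": "S-01", "img": "neuron.tif"}
mapping_for_a = {"img": "image_file"}
print(apply_mapping(lab_a_record, mapping_for_a))
```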

  14. Modeling systems-level dynamics: Understanding without mechanistic explanation in integrative systems biology.

    PubMed

    MacLeod, Miles; Nersessian, Nancy J

    2015-02-01

    In this paper we draw upon rich ethnographic data of two systems biology labs to explore the roles of explanation and understanding in large-scale systems modeling. We illustrate practices that depart from the goal of dynamic mechanistic explanation for the sake of more limited modeling goals. These processes use abstract mathematical formulations of bio-molecular interactions and data fitting techniques which we call top-down abstraction to trade away accurate mechanistic accounts of large-scale systems for specific information about aspects of those systems. We characterize these practices as pragmatic responses to the constraints many modelers of large-scale systems face, which in turn generate more limited pragmatic non-mechanistic forms of understanding of systems. These forms aim at knowledge of how to predict system responses in order to manipulate and control some aspects of them. We propose that this analysis of understanding provides a way to interpret what many systems biologists are aiming for in practice when they talk about the objective of a "systems-level understanding." Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Multiparty quantum mutual information: An alternative definition

    NASA Astrophysics Data System (ADS)

    Kumar, Asutosh

    2017-07-01

    Mutual information is the reciprocal information that is common to or shared by two or more parties. Quantum mutual information for bipartite quantum systems is non-negative, and bears the interpretation of total correlation between the two subsystems. This may, however, no longer be true for three or more party quantum systems. In this paper, we propose an alternative definition of multipartite information, taking into account the shared information between two and more parties. It is non-negative, observes monotonicity under partial trace as well as completely positive maps, and equals the multipartite information measure in literature for pure states. We then define multiparty quantum discord, and give some examples. Interestingly, we observe that quantum discord increases when a measurement is performed on a large number of subsystems. Consequently, the symmetric quantum discord, which involves a measurement on all parties, reveals the maximal quantumness. This raises a question on the interpretation of measured mutual information as a classical correlation.
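    For context, the standard definitions the abstract builds on can be written as follows; the paper's proposed alternative multipartite definition differs from the naive generalization in the last line.

```latex
% Von Neumann entropy of a state rho:
S(\rho) = -\operatorname{Tr}(\rho \log \rho)
% Bipartite quantum mutual information (non-negative; total correlation):
I(A\!:\!B) = S(A) + S(B) - S(AB) \ge 0
% Naive multipartite generalization, which the paper's alternative
% definition is designed to improve upon:
I(A_1\!:\!\cdots\!:\!A_n) = \sum_{i=1}^{n} S(A_i) - S(A_1 \cdots A_n)
```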

  16. Development of an Ada package library

    NASA Technical Reports Server (NTRS)

    Burton, Bruce; Broido, Michael

    1986-01-01

    A usable prototype Ada package library was developed and is currently being evaluated for use in large software development efforts. The library system is comprised of an Ada-oriented design language used to facilitate the collection of reuse information, a relational data base to store reuse information, a set of reusable Ada components and tools, and a set of guidelines governing the system's use. The prototyping exercise is discussed and the lessons learned from it have led to the definition of a comprehensive tool set to facilitate software reuse.

  17. Climate in Earth history

    NASA Technical Reports Server (NTRS)

    Berger, W. H.; Crowell, J. C.

    1982-01-01

    Complex atmosphere-ocean-land interactions govern the climate system and its variations. During the course of Earth history, nature has performed a large number of experiments involving climatic change; the geologic record contains much information regarding these experiments. This information should result in an increased understanding of the climate system, including climatic stability and factors that perturb climate. In addition, the paleoclimatic record has been demonstrated to be useful in interpreting the origin of important resources: petroleum, natural gas, coal, phosphate deposits, and many others.

  18. Design and Development of a Prototype Organizational Effectiveness Information System

    DTIC Science & Technology

    1984-11-01

    information from a large number of people. The existing survey support process for the GOQ is not satisfactory. Most OESOs elect not to use it, because...reporting process uses screen queries and menus to simplify data entry, it is estimated that only 4-6 hours of data entry time would be required for ...description for the file named EVEDIR. The Resource System allows users of the Event Directory to select from the following processing options. o Add a new

  19. Electrostatic camera system functional design study

    NASA Technical Reports Server (NTRS)

    Botticelli, R. A.; Cook, F. J.; Moore, R. F.

    1972-01-01

    A functional design study for an electrostatic camera system for application to planetary missions is presented. The electrostatic camera can produce and store a large number of pictures and provide for transmission of the stored information at arbitrary times after exposure. Preliminary configuration drawings and circuit diagrams for the system are illustrated. The camera system's size, weight, power consumption, and performance are characterized. Tradeoffs between system weight, power, and storage capacity are identified.

  20. An initiative to improve the management of clinically significant test results in a large health care network.

    PubMed

    Roy, Christopher L; Rothschild, Jeffrey M; Dighe, Anand S; Schiff, Gordon D; Graydon-Baker, Erin; Lenoci-Edwards, Jennifer; Dwyer, Cheryl; Khorasani, Ramin; Gandhi, Tejal K

    2013-11-01

    The failure of providers to communicate and follow up clinically significant test results (CSTR) is an important threat to patient safety. The Massachusetts Coalition for the Prevention of Medical Errors has endorsed the creation of systems to ensure that results can be received and acknowledged. In 2008 a task force was convened that represented clinicians, laboratories, radiology, patient safety, risk management, and information systems in a large health care network with the goals of providing recommendations and a road map for improvement in the management of CSTR and of implementing this improvement plan during the subsequent five years. In drafting its charter, the task force broadened the scope from "critical" results to "clinically significant" ones; clinically significant was defined as any result that requires further clinical action to avoid morbidity or mortality, regardless of the urgency of that action. The task force recommended four key areas for improvement--(1) standardization of policies and definitions, (2) robust identification of the patient's care team, (3) enhanced results management/tracking systems, and (4) centralized quality reporting and metrics. The task force faced many challenges in implementing these recommendations, including disagreements on definitions of CSTR and on who should have responsibility for CSTR, changes to established work flows, limitations of resources and of existing information systems, and definition of metrics. This large-scale effort to improve the communication and follow-up of CSTR in a health care network continues with ongoing work to address implementation challenges, refine policies, prepare for a new clinical information system platform, and identify new ways to measure the extent of this important safety problem.

  1. A search for applications of Fiber Optics in early warning systems for natural hazards.

    NASA Astrophysics Data System (ADS)

    Wenker, Koen; Bogaard, Thom

    2013-04-01

    In order to reduce the societal risk associated with natural hazards, novel technologies could help advance early warning systems. In our study we evaluate the use of multi-sensor technologies as possible early-warning systems for landslides and man-made structures, and the integration of the information in a simple Decision Support System (DSS). In this project, particular attention will be paid to new possibilities in the field of distributed monitoring of parameters relevant to landslides and man-made structures (such as large dams and bridges), among them the distributed monitoring of temperature, strain and acoustic signals by fiber optic (FO) cables. Fiber optic measurements are becoming increasingly popular. Fiber optic cables were developed in the telecommunication business to send large amounts of information over large distances at the speed of light. Because of this commercial application, production costs are relatively low. Using fiber optics for measurements has several advantages: the technology is immune to electromagnetic interference, appears stable, is very accurate, and has the potential to measure several independent physical properties in a distributed manner. The high-resolution, spatially and temporally distributed information on, e.g., temperature or strain (or both) makes fiber optics an interesting measurement technique. Applications have been developed in both engineering and science, and the possibilities seem numerous. We will present a thorough literature review conducted to assess the applicability and limitations of FO cable technology. The review was focused on, but not limited to, applications in landslide research. Several examples of current practice will be shown, including some from outside natural hazard practice, and possible applications will be discussed.

  2. An intermediate level of abstraction for computational systems chemistry.

    PubMed

    Andersen, Jakob L; Flamm, Christoph; Merkle, Daniel; Stadler, Peter F

    2017-12-28

    Computational techniques are required for narrowing down the vast space of possibilities to plausible prebiotic scenarios, because precise information on the molecular composition, the dominant reaction chemistry and the conditions of that era is scarce. The exploration of large chemical reaction networks is a central aspect in this endeavour. While quantum chemical methods can accurately predict the structures and reactivities of small molecules, they are not efficient enough to cope with large-scale reaction systems. The formalization of chemical reactions as graph grammars provides a generative system, well grounded in category theory, at the right level of abstraction for the analysis of large and complex reaction networks. An extension of the basic formalism into the realm of integer hyperflows allows for the identification of complex reaction patterns, such as autocatalysis, in large reaction networks using optimization techniques. This article is part of the themed issue 'Reconceptualizing the origins of life'. © 2017 The Author(s).

  3. D and D Knowledge Management Information Tool - 2012 - 12106

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Upadhyay, H.; Lagos, L.; Quintero, W.

    2012-07-01

    Deactivation and decommissioning (D and D) work is a high priority activity across the Department of Energy (DOE) complex. Subject matter specialists (SMS) associated with the different ALARA (As-Low-As-Reasonably-Achievable) Centers, DOE sites, Energy Facility Contractors Group (EFCOG) and the D and D community have gained extensive knowledge and experience over the years in the cleanup of the legacy waste from the Manhattan Project. To prevent the D and D knowledge and expertise from being lost over time from the evolving and aging workforce, DOE and the Applied Research Center (ARC) at Florida International University (FIU) proposed to capture and maintain this valuable information in a universally available and easily usable system. D and D KM-IT provides single point access to all D and D related activities through its knowledge base. It is a community driven system. D and D KM-IT makes D and D knowledge available to the people who need it at the time they need it and in a readily usable format. It uses the World Wide Web as the primary source for content in addition to information collected from subject matter specialists and the D and D community. It brings information in real time through web based custom search processes and its dynamic knowledge repository. Future developments include developing a document library, providing D and D information access on mobile devices for the Technology module and Hotline, and coordinating multiple subject matter specialists to support the Hotline. The goal is to deploy a high-end sophisticated and secured system to serve as a single large knowledge base for all the D and D activities. The system consolidates a large amount of information available on the web and presents it to users in the simplest way possible. (authors)

  4. Formation Flight of Multiple UAVs via Onboard Sensor Information Sharing.

    PubMed

    Park, Chulwoo; Cho, Namhoon; Lee, Kyunghyun; Kim, Youdan

    2015-07-17

    To monitor large areas or simultaneously measure multiple points, multiple unmanned aerial vehicles (UAVs) must be flown in formation. To perform such flights, sensor information generated by each UAV should be shared via communications. Although a variety of studies have focused on the algorithms for formation flight, these studies have mainly demonstrated the performance of formation flight using numerical simulations or ground robots, which do not reflect the dynamic characteristics of UAVs. In this study, an onboard sensor information sharing system and formation flight algorithms for multiple UAVs are proposed. The communication delays of radiofrequency (RF) telemetry are analyzed to enable the implementation of the onboard sensor information sharing system. Using the sensor information sharing, the formation guidance law for multiple UAVs, which includes both a circular and close formation, is designed. The hardware system, which includes avionics and an airframe, is constructed for the proposed multi-UAV platform. A numerical simulation is performed to demonstrate the performance of the formation flight guidance and control system for multiple UAVs. Finally, a flight test is conducted to verify the proposed algorithm for the multi-UAV system.
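    A toy illustration of formation guidance driven by shared position information, under the simplifying assumptions of a delay-free link and single-integrator dynamics; this sketches the idea only and is not the paper's guidance law:

```python
# A toy sketch of formation guidance driven by shared position data: each
# follower steers toward the leader's position plus a fixed offset (its
# "slot"), assuming a delay-free link and single-integrator dynamics.
# Illustrative only; not the paper's guidance law.

def formation_step(leader_pos, follower_pos, offset, gain=0.5):
    """One guidance update: move a fraction of the way toward the slot."""
    target = (leader_pos[0] + offset[0], leader_pos[1] + offset[1])
    return (follower_pos[0] + gain * (target[0] - follower_pos[0]),
            follower_pos[1] + gain * (target[1] - follower_pos[1]))

pos = (0.0, 0.0)
for _ in range(20):  # repeated updates converge on the slot
    pos = formation_step((10.0, 5.0), pos, offset=(-2.0, 0.0))
print(pos)  # approaches (8.0, 5.0)
```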

  5. Formation Flight of Multiple UAVs via Onboard Sensor Information Sharing

    PubMed Central

    Park, Chulwoo; Cho, Namhoon; Lee, Kyunghyun; Kim, Youdan

    2015-01-01

    To monitor large areas or simultaneously measure multiple points, multiple unmanned aerial vehicles (UAVs) must be flown in formation. To perform such flights, sensor information generated by each UAV should be shared via communications. Although a variety of studies have focused on the algorithms for formation flight, these studies have mainly demonstrated the performance of formation flight using numerical simulations or ground robots, which do not reflect the dynamic characteristics of UAVs. In this study, an onboard sensor information sharing system and formation flight algorithms for multiple UAVs are proposed. The communication delays of radiofrequency (RF) telemetry are analyzed to enable the implementation of the onboard sensor information sharing system. Using the sensor information sharing, the formation guidance law for multiple UAVs, which includes both a circular and close formation, is designed. The hardware system, which includes avionics and an airframe, is constructed for the proposed multi-UAV platform. A numerical simulation is performed to demonstrate the performance of the formation flight guidance and control system for multiple UAVs. Finally, a flight test is conducted to verify the proposed algorithm for the multi-UAV system. PMID:26193281

  6. How JCAHO, WEDI, ANSI, HCFA, and Hillary Clinton will turn your systems upside down.

    PubMed

    Howe, R C

    1994-01-01

    JCAHO, WEDI, ANSI, HCFA, the Clinton Administration health care reform task force, and other local, state, and national organizations are having a major impact on the health care system. Health care providers will become part of larger health care organizations, such as accountable health plans (AHPs), to provide health care services under a managed care or contracted fee-for-service basis. Information systems that were designed under the old health care model will no longer be applicable to the new health care reform system. The new information systems will have to be patient-centered, operate under a managed care environment, and function to handle patients throughout the continuum of care across a multiple-provider organization. The new information system will require extensive network infrastructures operating at high speeds, integration of LANs and WANs across large geographic areas, sophisticated interfacing tools, consolidation of core patient data bases, and consolidation of the supporting IS infrastructure (applications, data centers, staff, etc.). The changes associated with the health care reform initiatives may, indeed, turn current information systems upside down.

  7. Benefit-cost assessment of the commercial vehicle information systems and networks (CVISN) in Maryland

    DOT National Transportation Integrated Search

    1998-11-01

    The objective of this study is to answer questions regarding the net benefits of CVISN deployment by the State of Maryland. The hypothesis is that the net benefits of CVISN deployment are positive and large but vary among system components and betwee...

  8. Development of Rural Road Bridge Weigh-in-Motion System to Assess Weight and Configuration of Farm-to-Market Vehicles

    DOT National Transportation Integrated Search

    2018-05-01

    The weights and configurations of large vehicles traveling the primary interstate system are known with relative certainty due to the information collected at numerous weigh stations. It is uncommon, however, that farm-to-market vehicles and other im...

  9. Information Systems and Performance Measures in Schools.

    ERIC Educational Resources Information Center

    Coleman, James S.; Karweit, Nancy L.

    Large school systems face various administrative problems in handling scheduling and records while avoiding making red-tape casualties of students. The authors review a portion of the current use of computers to handle these problems and examine the range of activities for which computer processing could provide aid. Since automation always brings…

  10. Enabling the Interoperability of Large-Scale Legacy Systems

    DTIC Science & Technology

    2008-01-01

    information retrieval systems (Salton and McGill 1983). We use this method because, in the schema mapping task, only one instance per class is...2001). A survey of approaches to automatic schema matching. The VLDB Journal, 10, 334-350. Salton, G., & McGill, M.J. (1983). Introduction to

  11. Soil classification and carbon storage in cacao agroforestry farming systems of Bahia, Brazil

    USDA-ARS?s Scientific Manuscript database

    Information concerning the classification of soils and their properties under cacao agroforestry systems of the Atlantic rain forest biome region in the Southeast of Bahia Brazil is largely unknown. Soil and climatic conditions in this region are favorable for high soil carbon storage. This study is...

  12. APPLYING OPERATIONAL ANALYSIS TO URBAN EDUCATIONAL SYSTEMS, A WORKING PAPER.

    ERIC Educational Resources Information Center

    SISSON, ROGER L.

    OPERATIONS RESEARCH CONCEPTS ARE POTENTIALLY USEFUL FOR STUDY OF SUCH LARGE URBAN SCHOOL DISTRICT PROBLEMS AS INFORMATION FLOW, PHYSICAL STRUCTURE OF THE DISTRICT, ADMINISTRATIVE DECISION MAKING, BOARD POLICY FUNCTIONS, AND THE BUDGET STRUCTURE. OPERATIONAL ANALYSIS REQUIRES (1) IDENTIFICATION OF THE SYSTEM UNDER STUDY, (2) IDENTIFICATION OF…

  13. The Intellectual Assembly Line is Already Here

    ERIC Educational Resources Information Center

    Vanderburg, Willem H.

    2004-01-01

    The universal attempt to link computers by means of business process reengineering, enterprise integration, and the management of technology is creating large systems that structure and control the flows of information within institutions. Human work associated with these systems must be reorganized in the image of these technologies. The…

  14. Importance sampling large deviations in nonequilibrium steady states. I.

    PubMed

    Ray, Ushnish; Chan, Garnet Kin-Lic; Limmer, David T

    2018-03-28

    Large deviation functions contain information on the stability and response of systems driven into nonequilibrium steady states and in such a way are similar to free energies for systems at equilibrium. As with equilibrium free energies, evaluating large deviation functions numerically for all but the simplest systems is difficult because by construction they depend on exponentially rare events. In this first paper of a series, we evaluate different trajectory-based sampling methods capable of computing large deviation functions of time integrated observables within nonequilibrium steady states. We illustrate some convergence criteria and best practices using a number of different models, including a biased Brownian walker, a driven lattice gas, and a model of self-assembly. We show how two popular methods for sampling trajectory ensembles, transition path sampling and diffusion Monte Carlo, suffer from exponentially diverging correlations in trajectory space as a function of the bias parameter when estimating large deviation functions. Improving the efficiencies of these algorithms requires introducing guiding functions for the trajectories.
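    The difficulty described above can be seen in a small numerical sketch: a naive Monte Carlo estimate of the scaled cumulant generating function psi(k) = (1/T) log E[exp(k A_T)] for an unbiased +/-1 random walker, whose exact answer is log cosh(k). The estimator is dominated by exponentially rare trajectories, which is what motivates importance sampling; all code here is illustrative, not the paper's method.

```python
import math
import random

# Naive Monte Carlo estimate of the scaled cumulant generating function
# psi(k) = (1/T) * log E[exp(k * A_T)], where A_T is the time-integrated
# displacement of an unbiased +/-1 random walker. The expectation is
# dominated by exponentially rare trajectories, so naive sampling
# degrades as |k| or T grows, motivating importance sampling.

def naive_scgf(k, T=50, samples=20000, seed=1):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        a = sum(rng.choice((-1, 1)) for _ in range(T))  # one trajectory
        total += math.exp(k * a)
    return math.log(total / samples) / T

# For +/-1 steps the exact answer is psi(k) = log(cosh(k)).
print(naive_scgf(0.2), math.log(math.cosh(0.2)))
```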

  15. Importance sampling large deviations in nonequilibrium steady states. I

    NASA Astrophysics Data System (ADS)

    Ray, Ushnish; Chan, Garnet Kin-Lic; Limmer, David T.

    2018-03-01

    Large deviation functions contain information on the stability and response of systems driven into nonequilibrium steady states and in such a way are similar to free energies for systems at equilibrium. As with equilibrium free energies, evaluating large deviation functions numerically for all but the simplest systems is difficult because by construction they depend on exponentially rare events. In this first paper of a series, we evaluate different trajectory-based sampling methods capable of computing large deviation functions of time integrated observables within nonequilibrium steady states. We illustrate some convergence criteria and best practices using a number of different models, including a biased Brownian walker, a driven lattice gas, and a model of self-assembly. We show how two popular methods for sampling trajectory ensembles, transition path sampling and diffusion Monte Carlo, suffer from exponentially diverging correlations in trajectory space as a function of the bias parameter when estimating large deviation functions. Improving the efficiencies of these algorithms requires introducing guiding functions for the trajectories.

  16. The application of similar image retrieval in electronic commerce.

    PubMed

    Hu, YuPing; Yin, Hua; Han, Dezhi; Yu, Fei

    2014-01-01

    Traditional online shopping platforms (OSPs), which search product information by keywords, face three problems: an indirect search mode, a large search space, and inaccurate search results. To address these problems, we investigate the application of similar image retrieval in electronic commerce. Aiming to improve the customer experience and to provide merchants with accurate advertising, we design a reasonable and extensible electronic commerce application system comprising three subsystems: an image search display subsystem, an image search subsystem, and a product information collecting subsystem. This system provides a seamless connection between the information platform and the OSP, on which consumers can automatically and directly search for similar images based on pictures from the information platform. At the same time, it can be used to provide accurate internet marketing for enterprises. Experiments show the efficiency of the constructed system.
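    One minimal way to realize the image search subsystem's core ranking step is to compare fixed-length feature vectors; the toy histograms and cosine similarity below are illustrative assumptions, not the paper's actual features or method.

```python
import math

# Minimal sketch of the image-search ranking step: represent each image
# as a feature vector (here a toy 3-bin color histogram) and rank catalog
# items by cosine similarity to the query image. Real systems use far
# richer features; all data here is illustrative.

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

catalog = {
    "red_shoe": [0.9, 0.1, 0.0],  # toy RGB histograms
    "blue_bag": [0.1, 0.1, 0.8],
}
query = [0.8, 0.2, 0.0]  # histogram of the picture the shopper supplied
best = max(catalog, key=lambda name: cosine(query, catalog[name]))
print(best)  # red_shoe
```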

  17. The Application of Similar Image Retrieval in Electronic Commerce

    PubMed Central

    Hu, YuPing; Yin, Hua; Han, Dezhi; Yu, Fei

    2014-01-01

    Traditional online shopping platforms (OSPs), which search product information by keywords, face three problems: an indirect search mode, a large search space, and inaccurate search results. To address these problems, we investigate the application of similar image retrieval in electronic commerce. Aiming to improve the customer experience and to provide merchants with accurate advertising, we design a reasonable and extensible electronic commerce application system comprising three subsystems: an image search display subsystem, an image search subsystem, and a product information collecting subsystem. This system provides a seamless connection between the information platform and the OSP, on which consumers can automatically and directly search for similar images based on pictures from the information platform. At the same time, it can be used to provide accurate internet marketing for enterprises. Experiments show the efficiency of the constructed system. PMID:24883411

  18. Impacts of e-health on the outcomes of care in low- and middle-income countries: where do we go from here?

    PubMed

    Piette, John D; Lun, K C; Moura, Lincoln A; Fraser, Hamish S F; Mechael, Patricia N; Powell, John; Khoja, Shariq R

    2012-05-01

    E-health encompasses a diverse set of informatics tools that have been designed to improve public health and health care. Little information is available on the impacts of e-health programmes, particularly in low- and middle-income countries. We therefore conducted a scoping review of the published and non-published literature to identify data on the effects of e-health on health outcomes and costs. The emphasis was on the identification of unanswered questions for future research, particularly on topics relevant to low- and middle-income countries. Although e-health tools supporting clinical practice have growing penetration globally, there is more evidence of benefits for tools that support clinical decisions and laboratory information systems than for those that support picture archiving and communication systems. Community information systems for disease surveillance have been implemented successfully in several low- and middle-income countries. Although information on outcomes is generally lacking, a large project in Brazil has documented notable impacts on health-system efficiency. Meta-analyses and rigorous trials have documented the benefits of text messaging for improving outcomes such as patients' self-care. Automated telephone monitoring and self-care support calls have been shown to improve some outcomes of chronic disease management, such as glycaemia and blood pressure control, in low- and middle-income countries. Although large programmes for e-health implementation and research are being conducted in many low- and middle-income countries, more information on the impacts of e-health on outcomes and costs in these settings is still needed.

  19. Impacts of e-health on the outcomes of care in low- and middle-income countries: where do we go from here?

    PubMed Central

    Lun, KC; Moura, Lincoln A; Fraser, Hamish SF; Mechael, Patricia N; Powell, John; Khoja, Shariq R

    2012-01-01

    E-health encompasses a diverse set of informatics tools that have been designed to improve public health and health care. Little information is available on the impacts of e-health programmes, particularly in low- and middle-income countries. We therefore conducted a scoping review of the published and non-published literature to identify data on the effects of e-health on health outcomes and costs. The emphasis was on the identification of unanswered questions for future research, particularly on topics relevant to low- and middle-income countries. Although e-health tools supporting clinical practice have growing penetration globally, there is more evidence of benefits for tools that support clinical decisions and laboratory information systems than for those that support picture archiving and communication systems. Community information systems for disease surveillance have been implemented successfully in several low- and middle-income countries. Although information on outcomes is generally lacking, a large project in Brazil has documented notable impacts on health-system efficiency. Meta-analyses and rigorous trials have documented the benefits of text messaging for improving outcomes such as patients’ self-care. Automated telephone monitoring and self-care support calls have been shown to improve some outcomes of chronic disease management, such as glycaemia and blood pressure control, in low- and middle-income countries. Although large programmes for e-health implementation and research are being conducted in many low- and middle-income countries, more information on the impacts of e-health on outcomes and costs in these settings is still needed. PMID:22589570

  20. Improving biomedical information retrieval by linear combinations of different query expansion techniques.

    PubMed

    Abdulla, Ahmed AbdoAziz Ahmed; Lin, Hongfei; Xu, Bo; Banbhrani, Santosh Kumar

    2016-07-25

    Biomedical literature retrieval is becoming increasingly complex, creating a fundamental need for advanced information retrieval systems. Information Retrieval (IR) programs scour unstructured materials, such as text documents, in large data repositories that are usually stored on computers. IR concerns the representation, storage, and organization of, as well as access to, information items. One of the main problems in IR is determining which documents are relevant to the user's needs and which are not. Under the current regime, users cannot construct queries precisely enough to retrieve particular pieces of data from large repositories, and basic information retrieval systems produce low-quality search results. In this paper we present a new technique that refines searches to better represent the user's information need: we apply different query expansion techniques and combine them linearly, pairing two expansion results at a time. Query expansion enlarges the search query, for example by finding synonyms and reweighting original terms, and yields significantly more focused, particularized search results than basic search queries do. Retrieval performance is measured by several variants of MAP (Mean Average Precision). According to our experimental results, the combination of the best query expansion results enhances the retrieved documents, outperforming our baseline by 21.06 % and a previous study by 7.12 %. We propose several query expansion techniques and their linear combinations to make user queries more cognizable to search engines and to produce higher-quality search results.

Top