Science.gov

Sample records for information processing system

  1. Advanced information processing system

    NASA Technical Reports Server (NTRS)

    Lala, J. H.

    1984-01-01

    Design and performance details of the advanced information processing system (AIPS) for fault and damage tolerant data processing on aircraft and spacecraft are presented. AIPS comprises several computers distributed throughout the vehicle and linked by a damage tolerant data bus. Most I/O functions are available to all the computers, which run in a TDMA mode. Each computer performs separate specific tasks in normal operation and assumes other tasks in degraded modes. Redundant software assures that all fault monitoring, logging and reporting are automated, together with control functions. Redundant duplex links and damage-spread limitation provide the fault tolerance. Details of an advanced design of a laboratory-scale proof-of-concept system are described, including functional operations.

  2. Advanced Information Processing System (AIPS)

    NASA Technical Reports Server (NTRS)

    Pitts, Felix L.

    1993-01-01

    Advanced Information Processing System (AIPS) is a computer systems philosophy, a set of validated hardware building blocks, and a set of validated services as embodied in system software. The goal of AIPS is to provide the knowledge base which will allow achievement of validated fault-tolerant distributed computer system architectures, suitable for a broad range of applications, having failure probability requirements of 10^-9 at 10 hours. A background and description are given, followed by program accomplishments, the current focus, applications, technology transfer, FY92 accomplishments, and funding.

  3. Information Processing in Living Systems

    NASA Astrophysics Data System (ADS)

    Tkačik, Gašper; Bialek, William

    2016-03-01

    Life depends as much on the flow of information as on the flow of energy. Here we review the many efforts to make this intuition precise. Starting with the building blocks of information theory, we explore examples where it has been possible to measure, directly, the flow of information in biological networks, or more generally where information-theoretic ideas have been used to guide the analysis of experiments. Systems of interest range from single molecules (the sequence diversity in families of proteins) to groups of organisms (the distribution of velocities in flocks of birds), and all scales in between. Many of these analyses are motivated by the idea that biological systems may have evolved to optimize the gathering and representation of information, and we review the experimental evidence for this optimization, again across a wide range of scales.

  4. Advanced information processing system: Local system services

    NASA Technical Reports Server (NTRS)

    Burkhardt, Laura; Alger, Linda; Whittredge, Roy; Stasiowski, Peter

    1989-01-01

    The Advanced Information Processing System (AIPS) is a multi-computer architecture composed of hardware and software building blocks that can be configured to meet a broad range of application requirements. The hardware building blocks are fault-tolerant, general-purpose computers, fault- and damage-tolerant networks (both computer and input/output), and interfaces between the networks and the computers. The software building blocks are the major software functions: local system services, input/output system services, inter-computer system services, and the system manager. The foundation of the local system services is an operating system with the functions required for a traditional real-time multi-tasking computer, such as task scheduling, inter-task communication, memory management, interrupt handling, and time maintenance. Resting on this foundation are the redundancy management functions necessary in a redundant computer and the status reporting functions required for an operator interface. The functional requirements, functional design and detailed specifications for all the local system services are documented.

  5. Processing information system for highly specialized information in corporate networks

    NASA Astrophysics Data System (ADS)

    Petrosyan, M. O.; Kovalev, I. V.; Zelenkov, P. V.; Brezitskaya, VV; Prohorovich, G. A.

    2016-11-01

    A new structure for the formation and management of highly specialized information in corporate systems is proposed. The main distinguishing feature of this structure is that it supports the processing of multilingual information within a single user request.

  6. Library Information-Processing System

    NASA Technical Reports Server (NTRS)

    1985-01-01

    System works with Library of Congress MARC II format. System composed of subsystems that provide wide range of library information-processing capabilities. Format is American National Standards Institute (ANSI) format for machine-readable bibliographic data. Adaptable to any medium-to-large library.

  7. Information processing capacity of dynamical systems.

    PubMed

    Dambre, Joni; Verstraeten, David; Schrauwen, Benjamin; Massar, Serge

    2012-01-01

    Many dynamical systems, both natural and artificial, are stimulated by time dependent external signals, somehow processing the information contained therein. We demonstrate how to quantify the different modes in which information can be processed by such systems and combine them to define the computational capacity of a dynamical system. This is bounded by the number of linearly independent state variables of the dynamical system, equaling it if the system obeys the fading memory condition. It can be interpreted as the total number of linearly independent functions of its stimuli the system can compute. Our theory combines concepts from machine learning (reservoir computing), system modeling, stochastic processes, and functional analysis. We illustrate our theory by numerical simulations for the logistic map, a recurrent neural network, and a two-dimensional reaction diffusion system, uncovering universal trade-offs between the non-linearity of the computation and the system's short-term memory.
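
    The capacity measure described above can be illustrated with a short Python/NumPy sketch (an illustrative reconstruction, not the authors' code): an input-driven reservoir-style system is simulated, a linear readout is fitted to reproduce delayed copies of the input, and the capacity of each reconstruction is accumulated. The full measure in the paper also uses nonlinear (e.g., Legendre-polynomial) target functions; only the linear memory part is sketched here, and all parameter values are arbitrary.

      import numpy as np

      rng = np.random.default_rng(0)
      T, washout, N = 6000, 200, 30

      # i.i.d. input sequence driving the dynamical system
      u = rng.uniform(-1.0, 1.0, T)

      # illustrative driven dynamical system: a small echo-state-style network
      W = rng.normal(0.0, 1.0, (N, N))
      W *= 0.9 / max(abs(np.linalg.eigvals(W)))      # scale spectral radius below 1
      w_in = rng.uniform(-0.5, 0.5, N)
      x = np.zeros((T, N))
      for t in range(1, T):
          x[t] = np.tanh(W @ x[t - 1] + w_in * u[t])

      X = np.hstack([x[washout:], np.ones((T - washout, 1))])   # states + bias

      # capacity for reconstructing u(t - d): 1 - NMSE of the optimal linear readout
      total = 0.0
      for d in range(1, 2 * N):
          y = u[washout - d:T - d]
          w, *_ = np.linalg.lstsq(X, y, rcond=None)
          nmse = np.mean((y - X @ w) ** 2) / np.var(y)
          total += max(1.0 - nmse, 0.0)

      print(f"estimated linear memory capacity ~ {total:.2f} (bounded by N = {N})")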

  8. Information Processing Capacity of Dynamical Systems

    PubMed Central

    Dambre, Joni; Verstraeten, David; Schrauwen, Benjamin; Massar, Serge

    2012-01-01

    Many dynamical systems, both natural and artificial, are stimulated by time dependent external signals, somehow processing the information contained therein. We demonstrate how to quantify the different modes in which information can be processed by such systems and combine them to define the computational capacity of a dynamical system. This is bounded by the number of linearly independent state variables of the dynamical system, equaling it if the system obeys the fading memory condition. It can be interpreted as the total number of linearly independent functions of its stimuli the system can compute. Our theory combines concepts from machine learning (reservoir computing), system modeling, stochastic processes, and functional analysis. We illustrate our theory by numerical simulations for the logistic map, a recurrent neural network, and a two-dimensional reaction diffusion system, uncovering universal trade-offs between the non-linearity of the computation and the system's short-term memory. PMID:22816038

  9. Teaching Information Systems Development via Process Variants

    ERIC Educational Resources Information Center

    Tan, Wee-Kek; Tan, Chuan-Hoo

    2010-01-01

    Acquiring the knowledge to assemble an integrated Information System (IS) development process that is tailored to the specific needs of a project has become increasingly important. It is therefore necessary for educators to impart to students this crucial skill. However, Situational Method Engineering (SME) is an inherently complex process that…

  10. Teaching Information Systems Development via Process Variants

    ERIC Educational Resources Information Center

    Tan, Wee-Kek; Tan, Chuan-Hoo

    2010-01-01

    Acquiring the knowledge to assemble an integrated Information System (IS) development process that is tailored to the specific needs of a project has become increasingly important. It is therefore necessary for educators to impart to students this crucial skill. However, Situational Method Engineering (SME) is an inherently complex process that…

  11. Information Processing in Decision-Making Systems

    PubMed Central

    van der Meer, Matthijs; Kurth-Nelson, Zeb; Redish, A. David

    2015-01-01

    Decisions result from an interaction between multiple functional systems acting in parallel to process information in very different ways, each with strengths and weaknesses. In this review, the authors address three action-selection components of decision-making: The Pavlovian system releases an action from a limited repertoire of potential actions, such as approaching learned stimuli. Like the Pavlovian system, the habit system is computationally fast but, unlike the Pavlovian system, permits arbitrary stimulus-action pairings. These associations are a “forward” mechanism; when a situation is recognized, the action is released. In contrast, the deliberative system is flexible but takes time to process. The deliberative system uses knowledge of the causal structure of the world to search into the future, planning actions to maximize expected rewards. Deliberation depends on the ability to imagine future possibilities, including novel situations, and it allows decisions to be taken without having previously experienced the options. Various anatomical structures have been identified that carry out the information processing of each of these systems: hippocampus constitutes a map of the world that can be used for searching/imagining the future; dorsal striatal neurons represent situation-action associations; and ventral striatum maintains value representations for all three systems. Each system presents vulnerabilities to pathologies that can manifest as psychiatric disorders. Understanding these systems and their relation to neuroanatomy opens up a deeper way to treat the structural problems underlying various disorders. PMID:22492194

  12. Empathizers and systemizers process social information differently.

    PubMed

    Riekki, Tapani; Svedholm-Häkkinen, Annika M; Lindeman, Marjaana

    2017-08-24

    Using the empathizing-systemizing theory as our framework, we investigated how people with high self-reported empathizing (having good social skills and being interested in people) and systemizing (being interested in physical things and processes) differ in the social information processing of emotionally negative photographs of people during "spontaneous watching" and emotional and cognitive empathy tasks. Empathizers evaluated the pictures as more emotionally touching and the reactions in the photographs as more understandable than the systemizers did. Compared to the empathizers, systemizers had stronger activations in the posterior cingulate cortex, an area related to cognitive empathy, as well as in the left superior temporal gyrus and middle frontal gyrus when watching emotional photographs spontaneously. During guided emotional and cognitive empathy tasks, these differences disappeared. However, during the emotional empathy task, higher systemizing was associated with weaker activation of the right inferior frontal gyrus/insula. Furthermore, during emotional and cognitive empathy tasks, empathizing was related to increased activations of the amygdala, which were in turn related to higher behavioral ratings of emotional and cognitive empathy. The results suggest that empathizers and systemizers engage in social information processing differently: systemizers in more cognitive terms and empathizers with stronger automatic emotional reactions.

  13. Hydrocarbon Processing's Advanced control and information systems '95

    SciTech Connect

    1995-09-01

    This special report presents control strategies and information systems for most hydrocarbon processes and plants. Each summary (76 in all) contains information on application, control strategy, economics, commercial installations, and licensor. The processes include NGL recovery, alkylation, blending, catalytic reforming, caustic treating, cryogenic separation, delayed coking, fractionation, hydrocracking, hydrogen production, isomerization, lube oil extraction, oil transport and storage, pipeline management, information management, sulfur recovery, waste water treatments, and others.

  14. Information processing in the mammalian olfactory system.

    PubMed

    Lledo, Pierre-Marie; Gheusi, Gilles; Vincent, Jean-Didier

    2005-01-01

    Recently, modern neuroscience has made considerable progress in understanding how the brain perceives, discriminates, and recognizes odorant molecules. This growing knowledge took hold once the sense of smell was no longer considered merely a matter for poetry or the perfume industry. Over the last decades, chemical senses captured the attention of scientists who started to investigate the different stages of olfactory pathways. Distinct fields such as genetics, biochemistry, cellular biology, neurophysiology, and behavior have contributed to providing a picture of how odor information is processed in the olfactory system as it moves from the periphery to higher areas of the brain. So far, the combination of these approaches has been most effective at the cellular level, but there are already signs, and even greater hope, that the same is gradually happening at the systems level. This review summarizes the current ideas concerning the cellular mechanisms and organizational strategies used by the olfactory system to process olfactory information. We present findings that exemplify the high degree of olfactory plasticity, with special emphasis on the first central relay of the olfactory system. Recent observations supporting the necessity of such plasticity for adult brain functions are also discussed. Due to space constraints, this review focuses mainly on the olfactory systems of vertebrates, and primarily those of mammals.

  15. Advances in Neural Information Processing Systems

    NASA Astrophysics Data System (ADS)

    Jordan, Michael I.; Lecun, Yann; Solla, Sara A.

    2001-11-01

    The annual conference on Neural Information Processing Systems (NIPS) is the flagship conference on neural computation. The conference is interdisciplinary, with contributions in algorithms, learning theory, cognitive science, neuroscience, vision, speech and signal processing, reinforcement learning and control, implementations, and diverse applications. Only about 30 percent of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. This CD-ROM contains the entire proceedings of the twelve Neural Information Processing Systems conferences from 1988 to 1999. The files are available in the DjVu image format developed by Yann LeCun and his group at AT&T Labs. The CD-ROM includes free browsers for all major platforms. Michael I. Jordan is Professor of Computer Science and of Statistics at the University of California, Berkeley. Yann LeCun is Head of the Image Processing Research Department at AT&T Labs-Research. Sara A. Solla is Professor of Physics and of Physiology at Northwestern University.

  16. The standards process: X3 information processing systems

    NASA Technical Reports Server (NTRS)

    Emard, Jean-Paul

    1993-01-01

    The topics are presented in viewgraph form and include the following: International Organization for Standardization (ISO); International Electrotechnical Commission (IEC); ISO/IEC Joint Technical Committee 1 (JTC-1); U.S. interface to JTC-1; ANSI; national organizations; U.S. standards development processes; national and international standards developing organizations; regional organizations; and X3 information processing systems.

  17. Latency Minimizing Tasking for Information Processing Systems

    SciTech Connect

    Horey, James L; Lagesse, Brent J

    2011-01-01

    Real-time cyber-physical systems and information processing clusters require system designers to consider the total latency involved in collecting and aggregating data. For example, applications such as wild-fire monitoring require data to be presented to users in a timely manner. However, most models and algorithms for sensor networks have focused on alternative metrics such as energy efficiency. In this paper, we present a new model of sensor network aggregation that focuses on total latency. Our model is flexible and enables users to configure varying transmission and computation time on a node-by-node basis, and thus enables the simulation of complex computational phenomena. In addition, we present results from three tasking algorithms that trade off local communication for overall latency performance. These algorithms are evaluated in simulated networks of up to 200 nodes. We've presented an aggregation-focused model of sensor networks that can be used to study the trade-offs between computational coverage and total latency. Our model explicitly takes into account transmission and computation times, and enables users to define different values for the base station. In addition, we've presented three different tasking algorithms that operate over the model to produce aggregation schedules of varying quality. In the future, we expect to continue exploring distributed tasking algorithms for information processing systems. We've shown that the gap between highly optimized schedules that use global information and our distributed algorithms is quite large. This gives us encouragement that future distributed tasking algorithms can still make large gains.
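
    The kind of latency model described here can be sketched in a few lines of Python (a hypothetical toy model written for illustration, not the authors' simulator): each node carries its own computation and transmission times, and the latency of an aggregation tree is the node's computation time plus the slowest child branch, evaluated recursively up to the base station.

      from dataclasses import dataclass, field

      @dataclass
      class Node:
          name: str
          compute: float            # seconds to aggregate/process locally
          transmit: float           # seconds to send the result to the parent
          children: list = field(default_factory=list)

      def aggregation_latency(node: Node) -> float:
          """Total time until `node` has finished aggregating its subtree."""
          if not node.children:
              return node.compute
          slowest_branch = max(aggregation_latency(c) + c.transmit for c in node.children)
          return slowest_branch + node.compute

      # a tiny example tree rooted at the base station (all times are made up)
      leaf_a = Node("a", compute=0.2, transmit=0.5)
      leaf_b = Node("b", compute=0.1, transmit=0.3)
      relay  = Node("relay", compute=0.4, transmit=0.6, children=[leaf_a, leaf_b])
      base   = Node("base", compute=0.2, transmit=0.0, children=[relay])

      print(f"end-to-end aggregation latency: {aggregation_latency(base):.2f} s")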

  18. Information processing systems, reasoning modules, and reasoning system design methods

    SciTech Connect

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.
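
    The described architecture, a shared semantic graph in working memory plus reasoning modules dispatched by the ontology classification type of each abstraction, can be paraphrased as a small Python sketch (illustrative only; the class names, classification types, and example data are invented and do not come from the patent):

      from dataclasses import dataclass

      @dataclass
      class Abstraction:
          individual: str            # the entity described by this abstraction
          classification: str        # its type according to the ontology

      class WorkingMemory:
          """Holds the semantic graph as a flat list of abstractions (toy model)."""
          def __init__(self):
              self.abstractions = []

          def add(self, abstraction: Abstraction):
              self.abstractions.append(abstraction)

      class ReasoningModule:
          """Processes only abstractions of one ontology classification type."""
          def __init__(self, classification: str):
              self.classification = classification

          def process(self, memory: WorkingMemory):
              for a in memory.abstractions:
                  if a.classification == self.classification:
                      print(f"[{self.classification} module] reasoning about {a.individual}")

      memory = WorkingMemory()
      memory.add(Abstraction("badge-reader event 17", "PhysicalAccessEvent"))
      memory.add(Abstraction("user jdoe", "Person"))

      modules = [ReasoningModule("PhysicalAccessEvent"), ReasoningModule("Person")]
      for m in modules:
          m.process(memory)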

  19. Information processing systems, reasoning modules, and reasoning system design methods

    SciTech Connect

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  20. Information processing systems, reasoning modules, and reasoning system design methods

    SciTech Connect

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  1. Atmospheric and Oceanographic Information Processing System (AOIPS) system description

    NASA Technical Reports Server (NTRS)

    Bracken, P. A.; Dalton, J. T.; Billingsley, J. B.; Quann, J. J.

    1977-01-01

    The development of hardware and software for an interactive, minicomputer based processing and display system for atmospheric and oceanographic information extraction and image data analysis is described. The major applications of the system are discussed as well as enhancements planned for the future.

  2. Gathering Information from Transport Systems for Processing in Supply Chains

    NASA Astrophysics Data System (ADS)

    Kodym, Oldřich; Unucka, Jakub

    2016-12-01

    The paper deals with a complex system for processing information from means of transport acting as parts of a train (rail or road). It focuses on automated information gathering using AutoID technology, information transmission via Internet of Things networks, and information usage in the information systems of logistics firms to support selected processes at the MES and ERP levels. Different kinds of information gathered from the whole transport chain are discussed. Compliance with existing standards is mentioned. Security of information over the full life cycle is an integral part of the presented system. The design of a fully equipped system based on synthesized functional nodes is presented.

  3. Advanced information processing system: Input/output system services

    NASA Technical Reports Server (NTRS)

    Masotto, Tom; Alger, Linda

    1989-01-01

    The functional requirements and detailed specifications for the Input/Output (I/O) Systems Services of the Advanced Information Processing System (AIPS) are discussed. The introductory section is provided to outline the overall architecture and functional requirements of the AIPS system. Section 1.1 gives a brief overview of the AIPS architecture as well as a detailed description of the AIPS fault tolerant network architecture, while section 1.2 provides an introduction to the AIPS systems software. Sections 2 and 3 describe the functional requirements and design and detailed specifications of the I/O User Interface and Communications Management modules of the I/O System Services, respectively. Section 4 illustrates the use of the I/O System Services, while Section 5 concludes with a summary of results and suggestions for future work in this area.

  4. IMPRINT Analysis of an Unmanned Air System Geospatial Information Process

    DTIC Science & Technology

    2008-07-01

    [Record text consists of report front matter only: IMPRINT Analysis of an Unmanned Air System Geospatial Information Process, by Bruce P. Hunn, Kristin M. Schweitzer, John A. Cahir, and Mary M..., report ARL-TR-4513, July 2008.]

  5. Large Scale Information Processing System. Volume I. Compiler, Natural Language, and Information Processing.

    ERIC Educational Resources Information Center

    Peterson, Philip L.; And Others

    This volume, the first of three dealing with a number of investigations and studies into the formal structure, advanced technology and application of large scale information processing systems, is concerned with the areas of compiler languages, natural languages and information storage and retrieval. The first report is entitled "Semantics and…

  6. Model for Process Description: From Picture to Information System

    NASA Technical Reports Server (NTRS)

    Zak, A.

    1996-01-01

    A new model for the development of process information systems is proposed. It is robust and inexpensive, capable of providing timely, necessary information to the user by integrating Products, Instructions, Examples, Tools, and Process.

  7. Model for Process Description: From Picture to Information System

    NASA Technical Reports Server (NTRS)

    Zak, A.

    1996-01-01

    A new model for the development of process information systems is proposed. It is robust and inexpensive, capable of providing timely, necessary information to the user by integrating Products, Instructions, Examples, Tools, and Process.

  8. Clinical information process units (CIPUs) - a system ergonomic approach to medical information systems.

    PubMed

    Friesdorf, W; Groß-Alltag, F; Konichezky, S; Arndt, K

    1994-01-01

    This article constitutes an introduction to the basic tools necessary to understand Systems Ergonomics applied to the development of clinical systems. A basic description of clinical patient care in the system ergonomics language is provided, and the current situation found in hospital information management is criticized from an ergonomic point of view. We have laid out a model of the information flow in the clinical environment, which breaks the complex process of patient care into clearly defined elements: the Clinical Information Process Units. Presented here as an example of the application of Systems Ergonomics to the clinical working processes, the Clinical Information Process Units constitute the central element in the system ergonomic model of the information flow in the clinical environment.

  9. Study on advanced information processing system

    NASA Technical Reports Server (NTRS)

    Shin, Kang G.; Liu, Jyh-Charn

    1992-01-01

    Issues related to the reliability of a redundant system with large main memory are addressed. In particular, the Fault-Tolerant Processor (FTP) for Advanced Launch System (ALS) is used as a basis for our presentation. When the system is free of latent faults, the probability of system crash due to nearly-coincident channel faults is shown to be insignificant even when the outputs of computing channels are infrequently voted on. In particular, using channel error maskers (CEMs) is shown to improve reliability more effectively than increasing the number of channels for applications with long mission times. Even without using a voter, most memory errors can be immediately corrected by CEMs implemented with conventional coding techniques. In addition to their ability to enhance system reliability, CEMs, with a low hardware overhead, can be used to reduce not only the need for memory realignment, but also the time required to realign channel memories in the rare case that such a need arises. Using CEMs, we have developed two schemes, called Scheme 1 and Scheme 2, to solve the memory realignment problem. In both schemes, most errors are corrected by CEMs, and the remaining errors are masked by a voter.

  10. Distributive Processing Issues in Education Information Systems.

    ERIC Educational Resources Information Center

    Ender, Philip B.

    This is one of a series of reports based on an ongoing reality test of systemic evaluation for instructional decision making. This feasibility study is being carried out by the Center for the Study of Evaluation with the Laboratory in School and Community Relations at a suburban Los Angeles high school (called Site A). Viewing a school as a…

  11. Very Large Scale Distributed Information Processing Systems

    DTIC Science & Technology

    1991-09-27

    [Record text consists of reference-list fragments only: "Reliable Distributed Database Management", Proc. of the IEEE, May 1987, pp. 601-620; Gottlob, Georg, and Roberto Zicari, "Closed World Databases..."; Gottlob, Georg, and Gio Wiederhold, "Interfacing Relational Databases and Prolog Efficiently," Proceedings 2nd Expert Database Systems Conference, pp. 141.]

  12. Nature computes: information processing in quantum dynamical systems.

    PubMed

    Wiesner, Karoline

    2010-09-01

    Nature intrinsically computes. It has been suggested that the entire universe is a computer, in particular, a quantum computer. To corroborate this idea we require tools to quantify the information processing. Here we review a theoretical framework for quantifying information processing in a quantum dynamical system. So-called intrinsic quantum computation combines tools from dynamical systems theory, information theory, quantum mechanics, and computation theory. We will review how far the framework has been developed and what some of the main open questions are. On the basis of this framework we discuss upper and lower bounds for intrinsic information storage in a quantum dynamical system.
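
    As a minimal numerical companion to the idea of quantifying information storage in a quantum system, the Python/NumPy sketch below computes the von Neumann entropy of a density matrix. This is only one ingredient of the intrinsic-computation framework reviewed above (which also draws on dynamical-systems and computation-theoretic tools), and the example state is arbitrary.

      import numpy as np

      def von_neumann_entropy(rho: np.ndarray) -> float:
          """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
          eigvals = np.linalg.eigvalsh(rho)
          eigvals = eigvals[eigvals > 1e-12]          # drop numerical zeros
          return float(-np.sum(eigvals * np.log2(eigvals)))

      # an arbitrary single-qubit mixed state: 75% |0><0| + 25% |+><+|
      ket0 = np.array([1.0, 0.0])
      ketp = np.array([1.0, 1.0]) / np.sqrt(2.0)
      rho = 0.75 * np.outer(ket0, ket0) + 0.25 * np.outer(ketp, ketp)

      print(f"von Neumann entropy: {von_neumann_entropy(rho):.3f} bits (max 1 bit for a qubit)")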

  13. Data Processing: The Need for Programs in Business Information Systems.

    ERIC Educational Resources Information Center

    Hunter, James

    1980-01-01

    There is a demand in industry for both computer science graduates and business information systems graduates. Educators need to start this training at the secondary level with an introduction to all phases of data processing for any interested student. (CT)

  14. Scrapping Patched Computer Systems: Integrated Data Processing for Information Management.

    ERIC Educational Resources Information Center

    Martinson, Linda

    1991-01-01

    Colleges and universities must find a way to streamline and integrate information management processes across the organization. The Georgia Institute of Technology responded to an acute problem of dissimilar operating systems with a campus-wide integrated administrative system using a machine independent relational database management system. (MSE)

  15. Scrapping Patched Computer Systems: Integrated Data Processing for Information Management.

    ERIC Educational Resources Information Center

    Martinson, Linda

    1991-01-01

    Colleges and universities must find a way to streamline and integrate information management processes across the organization. The Georgia Institute of Technology responded to an acute problem of dissimilar operating systems with a campus-wide integrated administrative system using a machine independent relational database management system. (MSE)

  16. Process approach in developing or improvement of student information systems

    NASA Astrophysics Data System (ADS)

    Jaskowska, Małgorzata

    2015-02-01

    The aim of the research described in the article was to evaluate the usefulness of the university information system prior to its reorganization. The study was conducted among representatives of all stakeholders (system users): candidates, students and university authorities. The study revealed a need expressed by system users: a change of approach in the system's construction, from purely informational to procedural. This is consistent with the current process approach in systems design, reinforced by the popular service-oriented architecture (SOA). This thread was developed by conducting literature research and an analysis of best practices in student information systems. As a result, processes were selected and described whose implementation may assist the university system. The research results can be used by system designers to improve the system.

  17. Survey of geographical information system and image processing software

    USGS Publications Warehouse

    Vanderzee, D.; Singh, A.

    1995-01-01

    The Global Resource Information Database—a part of the United Nations Environment Programme—conducts a bi-annual survey of geographical information system (GIS) and image processing (IP) software. This survey makes information about software products available in developing countries. The 1993 survey showed that the number of installations of GIS, IP, and related software products increased dramatically from 1991 to 1993, mostly in North America and Europe.

  18. Advanced Information Processing System - Fault detection and error handling

    NASA Technical Reports Server (NTRS)

    Lala, J. H.

    1985-01-01

    The Advanced Information Processing System (AIPS) is designed to provide a fault tolerant and damage tolerant data processing architecture for a broad range of aerospace vehicles, including tactical and transport aircraft, and manned and autonomous spacecraft. A proof-of-concept (POC) system is now in the detailed design and fabrication phase. This paper gives an overview of a preliminary fault detection and error handling philosophy in AIPS.

  19. The processing of information from sensors in intelligent systems

    NASA Astrophysics Data System (ADS)

    Kokovin, V. A.; Sytin, A. N.

    2017-01-01

    The article describes the processing of information obtained from sensors in intelligent systems. It analyzes the need for advanced processing using a calculator that parallelizes operations, which reduces the response time to input events. A speculative processing algorithm is realized in an FPGA with streaming control, based on a data-flow model. This solution can be used in applications related to the telecommunication networks of distributed control systems.

  20. Information processing using a single dynamical node as complex system

    PubMed Central

    Appeltant, L.; Soriano, M.C.; Van der Sande, G.; Danckaert, J.; Massar, S.; Dambre, J.; Schrauwen, B.; Mirasso, C.R.; Fischer, I.

    2011-01-01

    Novel methods for information processing are highly desired in our information-driven society. Inspired by the brain's ability to process information, the recently introduced paradigm known as 'reservoir computing' shows that complex networks can efficiently perform computation. Here we introduce a novel architecture that reduces the usually required large number of elements to a single nonlinear node with delayed feedback. Through an electronic implementation, we experimentally and numerically demonstrate excellent performance in a speech recognition benchmark. Complementary numerical studies also show excellent performance for a time series prediction benchmark. These results prove that delay-dynamical systems, even in their simplest manifestation, can perform efficient information processing. This finding paves the way to feasible and resource-efficient technological implementations of reservoir computing. PMID:21915110
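
    The single-node, delayed-feedback idea can be imitated in a compact Python/NumPy sketch (an illustrative reconstruction under arbitrary parameters, not the authors' electronic implementation): one nonlinear node is time-multiplexed over N "virtual nodes" spread along its delay line, and a linear readout trained by ridge regression recalls a delayed copy of the input.

      import numpy as np

      rng = np.random.default_rng(1)
      N, K = 50, 3000                       # virtual nodes per delay, input samples
      mask = rng.uniform(-1.0, 1.0, N)      # fixed random input mask (time multiplexing)
      u = rng.uniform(-0.5, 0.5, K)         # scalar input stream
      leak, gain, scale = 0.5, 0.9, 0.5

      # one nonlinear node: each virtual node mixes the previous virtual node in the
      # chain (node inertia) with feedback from the same virtual node one delay ago
      states = np.zeros((K, N))
      prev_delay = np.zeros(N)              # virtual-node states one full delay ago
      x = 0.0                               # instantaneous node state
      for k in range(K):
          for i in range(N):
              drive = gain * prev_delay[i] + scale * mask[i] * u[k]
              x = (1.0 - leak) * x + leak * np.tanh(drive)
              states[k, i] = x
          prev_delay = states[k]

      # linear readout trained by ridge regression to recall u(k - 2)
      delay, reg = 2, 1e-6
      X, y = states[delay:], u[:-delay]
      W = np.linalg.solve(X.T @ X + reg * np.eye(N), X.T @ y)
      nmse = np.mean((X @ W - y) ** 2) / np.var(y)
      print(f"NMSE for recalling u(k-{delay}): {nmse:.3f}")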

  1. Risk Informed Design as Part of the Systems Engineering Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This slide presentation reviews the importance of Risk Informed Design (RID) as an important feature of the systems engineering process. RID is based on the principle that risk is a design commodity such as mass, volume, cost or power. It also reviews Probabilistic Risk Assessment (PRA) as it is used in the product life cycle in the development of NASA's Constellation Program.

  2. Information systems for material flow management in construction processes

    NASA Astrophysics Data System (ADS)

    Mesároš, P.; Mandičák, T.

    2015-01-01

    The article describes the options for the management of material flows in the construction process. Management and resource planning is one of the key factors influencing the effectiveness of construction project. It is very difficult to set these flows correctly. The current period offers several options and tools to do this. Information systems and their modules can be used just for the management of materials in the construction process.

  3. Advanced information processing system: Input/output network management software

    NASA Technical Reports Server (NTRS)

    Nagle, Gail; Alger, Linda; Kemp, Alexander

    1988-01-01

    The purpose of this document is to provide the software requirements and specifications for the Input/Output Network Management Services for the Advanced Information Processing System. This introduction and overview section is provided to briefly outline the overall architecture and software requirements of the AIPS system before discussing the details of the design requirements and specifications of the AIPS I/O Network Management software. A brief overview of the AIPS architecture is followed by a more detailed description of the network architecture.

  4. A new Wellsite Information System to aid the drilling process

    SciTech Connect

    Grenadier, J.A.; McCann, D.; Koch, S.; Schlumberger, A.

    1994-12-31

    The IDEAL Wellsite Information System acquires data to monitor the drilling process. It interprets the realtime data flow from both surface and downhole and displays useful information on high resolution color screens to the key decision makers on and off the wellsite. The IDEAL Wellsite Information System can support four classes of users simultaneously: the driller, the directional driller on the rig floor, logging specialists in the unit and the company representative in the customer's office. Color displays have been customized to the specialized needs of each class of user. In particular, the IDEAL Driller's Display is a pressurized unit located on the rig floor. The driller can select from a number of screens with a minimum number of keystrokes. This information network improves drilling efficiency, geological evaluation and subsequent production through enhanced geological steering. Data is continually stored in both the time and depth domains. These databases can be exported into a variety of formats. Data can also be transmitted in realtime to the customer's office offsite. Backup system components allow for redundancy so that system downtime is virtually eliminated. By having system developers concentrate on making the workstation easy to operate, the users can focus on the drilling process and not on the computer system. Custom graphic displays were designed by drillers for drillers. "Smart Alarms" have been designed to alert the user of potential problems such as kicks, sticking pipe and drillpipe washout.

  5. Specifications for a Federal Information Processing Standard Data Dictionary System

    NASA Technical Reports Server (NTRS)

    Goldfine, A.

    1984-01-01

    The development of a software specification that Federal agencies may use in evaluating and selecting data dictionary systems (DDS) is discussed. To supply the flexibility needed by widely different applications and environments in the Federal Government, the Federal Information Processing Standard (FIPS) specifies a core DDS together with an optional set of modules. The focus and status of the development project are described. Functional specifications for the FIPS DDS are examined for the dictionary, the dictionary schema, and the dictionary processing system. The DDS user interfaces and DDS software interfaces are discussed as well as dictionary administration.

  6. Neural Mechanisms and Information Processing in Recognition Systems

    PubMed Central

    Ozaki, Mamiko; Hefetz, Abraham

    2014-01-01

    Nestmate recognition is a hallmark of social insects. It is based on the match/mismatch of an identity signal carried by members of the society with that of the perceiving individual. While the behavioral response, amicable or aggressive, is very clear, the neural systems underlying recognition are not fully understood. Here we contrast two alternative hypotheses for the neural mechanisms that are responsible for the perception and information processing in recognition. We focus on recognition via chemical signals, as the common modality in social insects. The first, classical, hypothesis states that upon perception of recognition cues by the sensory system the information is passed as is to the antennal lobes and to higher brain centers where the information is deciphered and compared to a neural template. Match or mismatch information is then transferred to some behavior-generating centers where the appropriate response is elicited. An alternative hypothesis, that of “pre-filter mechanism”, posits that the decision as to whether to pass on the information to the central nervous system takes place in the peripheral sensory system. We suggest that, through sensory adaptation, only alien signals are passed on to the brain, specifically to an “aggressive-behavior-switching center”, where the response is generated if the signal is above a certain threshold. PMID:26462936

  7. MISSE in the Materials and Processes Technical Information System (MAPTIS )

    NASA Technical Reports Server (NTRS)

    Burns, DeWitt; Finckenor, Miria; Henrie, Ben

    2013-01-01

    Materials International Space Station Experiment (MISSE) data is now being collected and distributed through the Materials and Processes Technical Information System (MAPTIS) at Marshall Space Flight Center in Huntsville, Alabama. MISSE data has been instrumental in many programs and continues to be an important source of data for the space community. To facilitate greater access to the MISSE data, the International Space Station (ISS) program office and MAPTIS are working to gather this data into a central location. The MISSE database contains information about materials, samples, and flights along with pictures, PDFs, Excel files, Word documents, and other file types. Major capabilities of the system are: access control, browsing, searching, reports, and record comparison. The search capabilities will search within any searchable files, so even if the desired metadata has not been associated, data can still be retrieved. Other functionality will continue to be added to the MISSE database as the Athena Platform is expanded.

  8. Applications of the generalized information processing system (GIPSY)

    USGS Publications Warehouse

    Moody, D.W.; Kays, Olaf

    1972-01-01

    The Generalized Information Processing System (GIPSY) stores and retrieves variable-field, variable-length records consisting of numeric data, textual data, or codes. A particularly noteworthy feature of GIPSY is its ability to search records for words, word stems, prefixes, and suffixes as well as for numeric values. Moreover, retrieved records may be printed on pre-defined formats or formatted as fixed-field, fixed-length records for direct input to other programs, which facilitates the exchange of data with other systems. At present there are some 22 applications of GIPSY falling in the general areas of bibliography, natural resources information, and management science. This report presents a description of each application including a sample input form, dictionary, and a typical formatted record. It is hoped that these examples will stimulate others to experiment with innovative uses of computer technology.

  9. Evaluation methodologies for an advanced information processing system

    NASA Technical Reports Server (NTRS)

    Schabowsky, R. S., Jr.; Gai, E.; Walker, B. K.; Lala, J. H.; Motyka, P.

    1984-01-01

    The system concept and requirements for an Advanced Information Processing System (AIPS) are briefly described, but the emphasis of this paper is on the evaluation methodologies being developed and utilized in the AIPS program. The evaluation tasks include hardware reliability, maintainability and availability, software reliability, performance, and performability. Hardware RMA and software reliability are addressed with Markov modeling techniques. The performance analysis for AIPS is based on queueing theory. Performability is a measure of merit which combines system reliability and performance measures. The probability laws of the performance measures are obtained from the Markov reliability models. Scalar functions of this law such as the mean and variance provide measures of merit in the AIPS performability evaluations.
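
    To make the Markov-modelling step concrete, the sketch below builds a deliberately simple continuous-time Markov reliability model in Python (NumPy/SciPy) for a generic triplex computer and evaluates its unreliability at a 10-hour mission. The failure rate, state structure, and absence of coverage effects are invented for illustration and are not the AIPS models.

      import numpy as np
      from scipy.linalg import expm

      lam = 1e-4          # assumed per-channel failure rate (failures/hour), illustrative only

      # states: 0 = three good channels, 1 = two good channels (faults still masked),
      #         2 = system failed (absorbing)
      Q = np.array([
          [-3 * lam,  3 * lam,      0.0],   # any of 3 channels fails
          [     0.0, -2 * lam,  2 * lam],   # a second failure defeats majority voting
          [     0.0,      0.0,      0.0],   # absorbing failure state
      ])

      t = 10.0                              # mission time in hours
      p = np.array([1.0, 0.0, 0.0]) @ expm(Q * t)
      print(f"P(system failure by {t:.0f} h) = {p[2]:.3e}")
      print(f"reliability R({t:.0f} h) = {1.0 - p[2]:.9f}")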

  10. Evaluation methodologies for an advanced information processing system

    NASA Technical Reports Server (NTRS)

    Schabowsky, R. S., Jr.; Gai, E.; Walker, B. K.; Lala, J. H.; Motyka, P.

    1984-01-01

    The system concept and requirements for an Advanced Information Processing System (AIPS) are briefly described, but the emphasis of this paper is on the evaluation methodologies being developed and utilized in the AIPS program. The evaluation tasks include hardware reliability, maintainability and availability, software reliability, performance, and performability. Hardware RMA and software reliability are addressed with Markov modeling techniques. The performance analysis for AIPS is based on queueing theory. Performability is a measure of merit which combines system reliability and performance measures. The probability laws of the performance measures are obtained from the Markov reliability models. Scalar functions of this law such as the mean and variance provide measures of merit in the AIPS performability evaluations.

  11. Materials And Processes Technical Information System (MAPTIS) LDEF materials database

    NASA Technical Reports Server (NTRS)

    Davis, John M.; Strickland, John W.

    1992-01-01

    The Materials and Processes Technical Information System (MAPTIS) is a collection of materials data which was computerized and is available to engineers in the aerospace community involved in the design and development of spacecraft and related hardware. Consisting of various database segments, MAPTIS provides the user with information such as material properties, test data derived from tests specifically conducted for qualification of materials for use in space, verification and control, project management, material information, and various administrative requirements. A recent addition to the project management segment consists of materials data derived from the LDEF flight. This tremendous quantity of data consists of both pre-flight and post-flight data in such diverse areas as optical/thermal, mechanical and electrical properties, atomic concentration surface analysis data, as well as general data such as sample placement on the satellite, A-O flux, equivalent sun hours, etc. Each data point is referenced to the primary investigator(s) and the published paper from which the data was taken. The MAPTIS system is envisioned to become the central location for all LDEF materials data. This paper consists of multiple parts, comprising a general overview of the MAPTIS System and the types of data contained within, and the specific LDEF data element and the data contained in that segment.

  12. Research on information security system of waste terminal disposal process

    NASA Astrophysics Data System (ADS)

    Zhou, Chao; Wang, Ziying; Guo, Jing; Guo, Yajuan; Huang, Wei

    2017-05-01

    Informatization has penetrated the whole process of production and operation in electric power enterprises. It not only improves the level of lean management and quality of service, but also faces severe security risks. The internal network terminal is the outermost layer and the most vulnerable node of the internal network boundary: terminals are widely distributed, deeply layered, and numerous. The technical skill and security awareness of users and of operation and maintenance personnel are uneven, which makes the internal network terminal the weakest link in information security. Through the implementation of management, technical, and physical security measures, an internal network terminal security protection system should be established, so as to fully protect internal network terminal information security.

  13. Cell/tissue processing information system for regenerative medicine.

    PubMed

    Iwayama, Daisuke; Yamato, Masayuki; Tsubokura, Tetsuya; Takahashi, Minoru; Okano, Teruo

    2016-11-01

    When conducting clinical studies of regenerative medicine, compliance with good manufacturing practice (GMP) is mandatory, and thus much time is needed for manufacturing and quality management. It is therefore desired to introduce the manufacturing execution system (MES), which is being adopted by factories manufacturing pharmaceutical products. Meanwhile, in manufacturing human cell/tissue processing autologous products, it is necessary to protect patients' personal information, prevent patients from being identified and obtain information for cell/tissue identification. We therefore considered it difficult to adopt conventional MES to regenerative medicine-related clinical trials, and so developed novel software for production/quality management to be used in cell-processing centres (CPCs), conforming to GMP. Since this system satisfies the requirements of regulations in Japan and the USA for electronic records and electronic signatures (ER/ES), the use of ER/ES has been allowed, and the risk of contamination resulting from the use of recording paper has been eliminated, thanks to paperless operations within the CPC. Moreover, to reduce the risk of mix-up and cross-contamination due to contact during production, we developed a touchless input device with built-in radio frequency identification (RFID) reader-writer devices and optical sensors. The use of this system reduced the time to prepare and issue manufacturing instructions by 50% or more, compared to the conventional handwritten system. The system contributes to larger-scale production and to reducing production costs for cell and tissue products in regenerative medicine. Copyright © 2014 John Wiley & Sons, Ltd.

  14. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    NASA Astrophysics Data System (ADS)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
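
    For readers unfamiliar with the underlying formalism, a plain (low-level) Petri net can be simulated in a few lines of Python; the sketch below is a generic illustration of places, transitions, and the firing rule, not of XML nets or of the authors' toolset, and the example process is invented.

      class PetriNet:
          def __init__(self, marking):
              self.marking = dict(marking)          # place -> number of tokens
              self.transitions = {}                 # name -> (input places, output places)

          def add_transition(self, name, inputs, outputs):
              self.transitions[name] = (inputs, outputs)

          def enabled(self, name):
              inputs, _ = self.transitions[name]
              return all(self.marking.get(p, 0) >= 1 for p in inputs)

          def fire(self, name):
              if not self.enabled(name):
                  raise ValueError(f"transition {name} is not enabled")
              inputs, outputs = self.transitions[name]
              for p in inputs:
                  self.marking[p] -= 1              # consume one token per input place
              for p in outputs:
                  self.marking[p] = self.marking.get(p, 0) + 1   # produce tokens

      # toy business process: an order must be checked before it can be approved
      net = PetriNet({"order received": 1, "checked": 0, "approved": 0})
      net.add_transition("check order", ["order received"], ["checked"])
      net.add_transition("approve order", ["checked"], ["approved"])

      net.fire("check order")
      net.fire("approve order")
      print(net.marking)      # {'order received': 0, 'checked': 0, 'approved': 1}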

  15. Advanced information processing system: Inter-computer communication services

    NASA Technical Reports Server (NTRS)

    Burkhardt, Laura; Masotto, Tom; Sims, J. Terry; Whittredge, Roy; Alger, Linda S.

    1991-01-01

    The purpose is to document the functional requirements and detailed specifications for the Inter-Computer Communications Services (ICCS) of the Advanced Information Processing System (AIPS). An introductory section is provided to outline the overall architecture and functional requirements of the AIPS and to present an overview of the ICCS. An overview of the AIPS architecture as well as a brief description of the AIPS software is given. The guarantees of the ICCS are provided, and the ICCS is described as a seven-layered International Standards Organization (ISO) Model. The ICCS functional requirements, functional design, and detailed specifications as well as each layer of the ICCS are also described. A summary of results and suggestions for future work are presented.

  16. Enzyme-based logic systems for information processing.

    PubMed

    Katz, Evgeny; Privman, Vladimir

    2010-05-01

    In this critical review we review enzymatic systems which involve biocatalytic reactions utilized for information processing (biocomputing). Extensive ongoing research in biocomputing, mimicking Boolean logic gates has been motivated by potential applications in biotechnology and medicine. Furthermore, novel sensor concepts have been contemplated with multiple inputs processed biochemically before the final output is coupled to transducing "smart-material" electrodes and other systems. These applications have warranted recent emphasis on networking of biocomputing gates. First few-gate networks have been experimentally realized, including coupling, for instance, to signal-responsive electrodes for signal readout. In order to achieve scalable, stable network design and functioning, considerations of noise propagation and control have been initiated as a new research direction. Optimization of single enzyme-based gates for avoiding analog noise amplification has been explored, as were certain network-optimization concepts. We review and exemplify these developments, as well as offer an outlook for possible future research foci. The latter include design and uses of non-Boolean network elements, e.g., filters, as well as other developments motivated by potential novel sensor and biotechnology applications (136 references).
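
    The Boolean-gate idea can be illustrated with a deliberately crude numerical toy model in Python (invented for illustration; the kinetic form, constants, and threshold do not come from the review): two chemical inputs are mapped through saturating response curves to an analog output, and a threshold "filter" converts that output into the logic value of an AND gate.

      import numpy as np

      def saturating(c, k=0.2):
          """Toy Michaelis-Menten-style response of a biocatalytic step (arbitrary constant)."""
          return c / (k + c)

      def enzymatic_and(a, b, threshold=0.5):
          """Analog output requires both inputs; a threshold filter yields the Boolean value."""
          analog = saturating(a) * saturating(b)
          return analog, analog > threshold

      # truth table with '0' encoded as a small residual concentration (analog noise)
      for a, b in [(0.02, 0.02), (0.02, 1.0), (1.0, 0.02), (1.0, 1.0)]:
          analog, logic = enzymatic_and(a, b)
          print(f"inputs ({a:.2f}, {b:.2f}) -> analog {analog:.3f}, logic {int(logic)}")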

  17. New Information Processing Techniques for Military Systems (les Nouvelles techniques de traitement de l’information pour les systemes militaires)

    DTIC Science & Technology

    2001-04-01

    [Record text consists of fragments of the report's topic list: data dissemination; situation analysis, including real-time interpretation of large amounts of battlefield information on extended/global battlefields; processing demands; and the requirement that cooperating systems agree to exchange and interpret information in a standardised (unambiguous) way.]

  18. BOOK REVIEW: Theory of Neural Information Processing Systems

    NASA Astrophysics Data System (ADS)

    Galla, Tobias

    2006-04-01

    It is difficult not to be amazed by the ability of the human brain to process, to structure and to memorize information. Even by the toughest standards the behaviour of this network of about 10^11 neurons qualifies as complex, and both the scientific community and the public take great interest in the growing field of neuroscience. The scientific endeavour to learn more about the function of the brain as an information processing system is here a truly interdisciplinary one, with important contributions from biology, computer science, physics, engineering and mathematics as the authors quite rightly point out in the introduction of their book. The role of the theoretical disciplines here is to provide mathematical models of information processing systems and the tools to study them. These models and tools are at the centre of the material covered in the book by Coolen, Kühn and Sollich. The book is divided into five parts, providing basic introductory material on neural network models as well as the details of advanced techniques to study them. A mathematical appendix complements the main text. The range of topics is extremely broad, still the presentation is concise and the book well arranged. To stress the breadth of the book let me just mention a few keywords here: the material ranges from the basics of perceptrons and recurrent network architectures to more advanced aspects such as Bayesian learning and support vector machines; Shannon's theory of information and the definition of entropy are discussed, and a chapter on Amari's information geometry is not missing either. Finally the statistical mechanics chapters cover Gardner theory and the replica analysis of the Hopfield model, not without being preceded by a brief introduction of the basic concepts of equilibrium statistical physics. The book also contains a part on effective theories of the macroscopic dynamics of neural networks. Many dynamical aspects of neural networks are usually hard to find in the

  19. A coherent optical feedback system for optical information processing

    NASA Technical Reports Server (NTRS)

    Jablonowski, D. P.; Lee, S. H.

    1975-01-01

    A unique optical feedback system for coherent optical data processing is described. With the introduction of feedback, the well-known transfer function for feedback systems is obtained in two dimensions. Operational details of the optical feedback system are given. Experimental results of system applications in image restoration, contrast control and analog computation are presented.

  20. Management Information System Project. Data Processors Manual to the Program Oriented Accounting System: The Budgetary Process.

    ERIC Educational Resources Information Center

    Foley, Walter; Harr, Gordon

    The purpose of this manual is to serve the needs of a data processing facility in the operation of a management information system (MIS). Included in the manual are system flowcharts, job control language, and system documentation. The system has been field tested and operates under IBM System 360/Model 65-05-MVT-HASP. The programming language is…

  1. Information Processing.

    ERIC Educational Resources Information Center

    Jennings, Carol Ann; McDonald, Sandy

    This publication contains instructional materials for teacher and student use for a course in information processing. The materials are written in terms of student performance using measurable objectives. The course includes 10 units. Each instructional unit contains some or all of the basic components of a unit of instruction: performance…

  2. Information processing in the primate visual system - An integrated systems perspective

    NASA Technical Reports Server (NTRS)

    Van Essen, David C.; Anderson, Charles H.; Felleman, Daniel J.

    1992-01-01

    The primate visual system contains dozens of distinct areas in the cerebral cortex and several major subcortical structures. These subdivisions are extensively interconnected in a distributed hierarchical network that contains several intertwined processing streams. A number of strategies are used for efficient information processing within this hierarchy. These include linear and nonlinear filtering, passage through information bottlenecks, and coordinated use of multiple types of information. In addition, dynamic regulation of information flow within and between visual areas may provide the computational flexibility needed for the visual system to perform a broad spectrum of tasks accurately and at high resolution.

  4. Information theory and signal transduction systems: from molecular information processing to network inference.

    PubMed

    Mc Mahon, Siobhan S; Sim, Aaron; Filippi, Sarah; Johnson, Robert; Liepe, Juliane; Smith, Dominic; Stumpf, Michael P H

    2014-11-01

    Sensing and responding to the environment are two essential functions that all biological organisms need to master for survival and successful reproduction. Developmental processes are marshalled by a diverse set of signalling and control systems, ranging from systems with simple chemical inputs and outputs to complex molecular and cellular networks with non-linear dynamics. Information theory provides a powerful and convenient framework in which such systems can be studied; but it also provides the means to reconstruct the structure and dynamics of molecular interaction networks underlying physiological and developmental processes. Here we supply a brief description of its basic concepts and introduce some useful tools for systems and developmental biologists. Along with a brief but thorough theoretical primer, we demonstrate the wide applicability and biological application-specific nuances by way of different illustrative vignettes. In particular, we focus on the characterisation of biological information processing efficiency, examining cell-fate decision making processes, gene regulatory network reconstruction, and efficient signal transduction experimental design. Copyright © 2014 Elsevier Ltd. All rights reserved.
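
    As a minimal illustration of the information-theoretic quantities the review builds on, the sketch below computes the mutual information (in bits) between a two-level input signal and a noisy two-level response from their joint probability table. The probabilities are made-up numbers for illustration, not data from any of the systems discussed.

```python
import numpy as np

def mutual_information_bits(p_xy):
    """I(X;Y) in bits from a discrete joint probability table p(x, y)."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_xy = p_xy / p_xy.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    nz = p_xy > 0
    return float(np.sum(p_xy[nz] * np.log2(p_xy[nz] / (p_x @ p_y)[nz])))

# Illustrative noisy signalling channel: rows = ligand level (low/high),
# columns = downstream response (off/on)
p_joint = [[0.40, 0.10],
           [0.05, 0.45]]
print(mutual_information_bits(p_joint))   # ~0.40 bits for this made-up channel
```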

  5. Man in Command Information Processing Systems--A Research Program,

    DTIC Science & Technology

    1963-07-01

    Submitted by Joseph Zeidner, Chief, Support Systems … Related references include: Ringel, S., and Hammer, C. H., Information assimilation from alphanumeric displays--amount and density of information (in press); Ringel, S., and Smith …

  6. Monitoring of the Process of System Information Broadcasting in Time.

    PubMed

    Mironowicz, P; Korbicz, J K; Horodecki, P

    2017-04-14

    One of the problems of quantum physics is how a measurement turns quantum, noncopyable data into copyable classical knowledge. We use quantum state discrimination in a central system model to show how its evolution leads to the broadcasting of the information, and how orthogonalization and decoherence factors allow us to monitor the distance of the state in question from the one that perfectly broadcasts the information, at any moment of time, not just asymptotically. We illustrate this in the spin-spin model, where the distance is shown to be typically small, and provide the related time scales.

  7. 10 CFR 1017.28 - Processing on Automated Information Systems (AIS).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Processing on Automated Information Systems (AIS). 1017.28... UNCLASSIFIED CONTROLLED NUCLEAR INFORMATION Physical Protection Requirements § 1017.28 Processing on Automated Information Systems (AIS). UCNI may be processed or produced on any AIS that complies with the guidance in...

  8. 10 CFR 1017.28 - Processing on Automated Information Systems (AIS).

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 4 2014-01-01 2014-01-01 false Processing on Automated Information Systems (AIS). 1017.28... UNCLASSIFIED CONTROLLED NUCLEAR INFORMATION Physical Protection Requirements § 1017.28 Processing on Automated Information Systems (AIS). UCNI may be processed or produced on any AIS that complies with the guidance in...

  9. 10 CFR 1017.28 - Processing on Automated Information Systems (AIS).

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Processing on Automated Information Systems (AIS). 1017.28... UNCLASSIFIED CONTROLLED NUCLEAR INFORMATION Physical Protection Requirements § 1017.28 Processing on Automated Information Systems (AIS). UCNI may be processed or produced on any AIS that complies with the guidance in...

  10. 10 CFR 1017.28 - Processing on Automated Information Systems (AIS).

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 4 2012-01-01 2012-01-01 false Processing on Automated Information Systems (AIS). 1017.28... UNCLASSIFIED CONTROLLED NUCLEAR INFORMATION Physical Protection Requirements § 1017.28 Processing on Automated Information Systems (AIS). UCNI may be processed or produced on any AIS that complies with the guidance in...

  11. 10 CFR 1017.28 - Processing on Automated Information Systems (AIS).

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 4 2013-01-01 2013-01-01 false Processing on Automated Information Systems (AIS). 1017.28... UNCLASSIFIED CONTROLLED NUCLEAR INFORMATION Physical Protection Requirements § 1017.28 Processing on Automated Information Systems (AIS). UCNI may be processed or produced on any AIS that complies with the guidance in...

  12. 45 CFR 205.35 - Mechanized claims processing and information retrieval systems; definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... claims processing and information retrieval systems; definitions. Section 205.35 through 205.38 contain State plan requirements for an automated statewide management information system, conditions for FFP and...: (a) A mechanized claims processing and information retrieval system, hereafter referred to as...

  13. 42 CFR 433.116 - FFP for operation of mechanized claims processing and information retrieval systems.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... and information retrieval systems. 433.116 Section 433.116 Public Health CENTERS FOR MEDICARE... FISCAL ADMINISTRATION Mechanized Claims Processing and Information Retrieval Systems § 433.116 FFP for operation of mechanized claims processing and information retrieval systems. (a) Subject to paragraph (j)...

  14. 42 CFR 433.116 - FFP for operation of mechanized claims processing and information retrieval systems.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... and information retrieval systems. 433.116 Section 433.116 Public Health CENTERS FOR MEDICARE... FISCAL ADMINISTRATION Mechanized Claims Processing and Information Retrieval Systems § 433.116 FFP for operation of mechanized claims processing and information retrieval systems. (a) Subject to paragraph (j)...

  15. 42 CFR 433.116 - FFP for operation of mechanized claims processing and information retrieval systems.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... and information retrieval systems. 433.116 Section 433.116 Public Health CENTERS FOR MEDICARE... FISCAL ADMINISTRATION Mechanized Claims Processing and Information Retrieval Systems § 433.116 FFP for operation of mechanized claims processing and information retrieval systems. (a) Subject to paragraph (j)...

  16. 42 CFR 433.116 - FFP for operation of mechanized claims processing and information retrieval systems.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... and information retrieval systems. 433.116 Section 433.116 Public Health CENTERS FOR MEDICARE... FISCAL ADMINISTRATION Mechanized Claims Processing and Information Retrieval Systems § 433.116 FFP for operation of mechanized claims processing and information retrieval systems. (a) Subject to paragraph (j)...

  17. 42 CFR 433.116 - FFP for operation of mechanized claims processing and information retrieval systems.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... and information retrieval systems. 433.116 Section 433.116 Public Health CENTERS FOR MEDICARE... FISCAL ADMINISTRATION Mechanized Claims Processing and Information Retrieval Systems § 433.116 FFP for operation of mechanized claims processing and information retrieval systems. (a) Subject to 42 CFR...

  18. 45 CFR 205.35 - Mechanized claims processing and information retrieval systems; definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... claims processing and information retrieval systems; definitions. Section 205.35 through 205.38 contain State plan requirements for an automated statewide management information system, conditions for FFP and...: (a) A mechanized claims processing and information retrieval system, hereafter referred to as an...

  19. 45 CFR 205.35 - Mechanized claims processing and information retrieval systems; definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... claims processing and information retrieval systems; definitions. Section 205.35 through 205.38 contain State plan requirements for an automated statewide management information system, conditions for FFP and...: (a) A mechanized claims processing and information retrieval system, hereafter referred to as an...

  20. 45 CFR 205.35 - Mechanized claims processing and information retrieval systems; definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... claims processing and information retrieval systems; definitions. Section 205.35 through 205.38 contain State plan requirements for an automated statewide management information system, conditions for FFP and...: (a) A mechanized claims processing and information retrieval system, hereafter referred to as an...

  1. 45 CFR 205.35 - Mechanized claims processing and information retrieval systems; definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... claims processing and information retrieval systems; definitions. Section 205.35 through 205.38 contain State plan requirements for an automated statewide management information system, conditions for FFP and...: (a) A mechanized claims processing and information retrieval system, hereafter referred to as an...

  2. An information theoretic model of information processing in the Drosophila olfactory system: the role of inhibitory neurons for system efficiency.

    PubMed

    Faghihi, Faramarz; Kolodziejski, Christoph; Fiala, André; Wörgötter, Florentin; Tetzlaff, Christian

    2013-12-20

    Fruit flies (Drosophila melanogaster) rely on their olfactory system to process environmental information. This information has to be transmitted without system-relevant loss by the olfactory system to deeper brain areas for learning. Here we study the role of several parameters of the fly's olfactory system and the environment and how they influence olfactory information transmission. We have designed an abstract model of the antennal lobe, the mushroom body and the inhibitory circuitry. Mutual information between the olfactory environment, simulated in terms of different odor concentrations, and a sub-population of intrinsic mushroom body neurons (Kenyon cells) was calculated to quantify the efficiency of information transmission. With this method we study, on the one hand, the effect of different connectivity rates between olfactory projection neurons and firing thresholds of Kenyon cells. On the other hand, we analyze the influence of inhibition on mutual information between environment and mushroom body. Our simulations show an expected linear relation between the connectivity rate between the antennal lobe and the mushroom body and firing threshold of the Kenyon cells to obtain maximum mutual information for both low and high odor concentrations. However, contradicting all-day experiences, high odor concentrations cause a drastic, and unrealistic, decrease in mutual information for all connectivity rates compared to low concentration. But when inhibition on the mushroom body is included, mutual information remains at high levels independent of other system parameters. This finding points to a pivotal role of inhibition in fly information processing without which the system efficiency will be substantially reduced.

  3. An information theoretic model of information processing in the Drosophila olfactory system: the role of inhibitory neurons for system efficiency

    PubMed Central

    Faghihi, Faramarz; Kolodziejski, Christoph; Fiala, André; Wörgötter, Florentin; Tetzlaff, Christian

    2013-01-01

    Fruit flies (Drosophila melanogaster) rely on their olfactory system to process environmental information. This information has to be transmitted without system-relevant loss by the olfactory system to deeper brain areas for learning. Here we study the role of several parameters of the fly's olfactory system and the environment and how they influence olfactory information transmission. We have designed an abstract model of the antennal lobe, the mushroom body and the inhibitory circuitry. Mutual information between the olfactory environment, simulated in terms of different odor concentrations, and a sub-population of intrinsic mushroom body neurons (Kenyon cells) was calculated to quantify the efficiency of information transmission. With this method we study, on the one hand, the effect of different connectivity rates between olfactory projection neurons and firing thresholds of Kenyon cells. On the other hand, we analyze the influence of inhibition on mutual information between environment and mushroom body. Our simulations show an expected linear relation between the connectivity rate between the antennal lobe and the mushroom body and firing threshold of the Kenyon cells to obtain maximum mutual information for both low and high odor concentrations. However, contradicting all-day experiences, high odor concentrations cause a drastic, and unrealistic, decrease in mutual information for all connectivity rates compared to low concentration. But when inhibition on the mushroom body is included, mutual information remains at high levels independent of other system parameters. This finding points to a pivotal role of inhibition in fly information processing without which the system efficiency will be substantially reduced. PMID:24391579

  4. Intrinsic Information Processing and Energy Dissipation in Stochastic Input-Output Dynamical Systems

    DTIC Science & Technology

    2015-07-09

    processes, which play a key role in limiting computation. Fortunately, computational mechanics accounts for this diversity, but in autonomous dynamical systems. Thermodynamic cycles that perform useful information processing are nonautonomous systems. To analyze information processing in nonautonomous systems, computational mechanics must be extended …

  5. Long-term care information systems: an overview of the selection process.

    PubMed

    Nahm, Eun-Shim; Mills, Mary Etta; Feege, Barbara

    2006-06-01

    Under the current Medicare Prospective Payment System and the ever-changing managed care environment, the long-term care information system is vital to providing quality care and to surviving in business. The system selection process should be an interdisciplinary effort involving all necessary stakeholders for the proposed system. The system selection process can be modeled following the Systems Development Life Cycle: identifying problems, opportunities, and objectives; determining information requirements; analyzing system needs; designing the recommended system; and developing and documenting software.

  6. 7 CFR 277.18 - Establishment of an Automated Data Processing (ADP) and Information Retrieval System.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...) and Information Retrieval System. 277.18 Section 277.18 Agriculture Regulations of the Department of... Data Processing (ADP) and Information Retrieval System. (a) Scope and application. This section.... System specifications means information about the new ADP systems, such as: Workload descriptions, input...

  7. How to build an information gathering and processing system: lessons from naturally and artificially intelligent systems.

    PubMed

    Chappell, Jackie; Demery, Zoe P; Arriola-Rios, Veronica; Sloman, Aaron

    2012-02-01

    Imagine a situation in which you had to design a physical agent that could collect information from its environment, then store and process that information to help it respond appropriately to novel situations. What kinds of information should it attend to? How should the information be represented so as to allow efficient use and re-use? What kinds of constraints and trade-offs would there be? There are no unique answers. In this paper, we discuss some of the ways in which the need to be able to address problems of varying kinds and complexity can be met by different information processing systems. We also discuss different ways in which relevant information can be obtained, and how different kinds of information can be processed and used, by both biological organisms and artificial agents. We analyse several constraints and design features, and show how they relate both to biological organisms, and to lessons that can be learned from building artificial systems. Our standpoint overlaps with Karmiloff-Smith (1992) in that we assume that a collection of mechanisms geared to learning and developing in biological environments are available in forms that constrain, but do not determine, what can or will be learnt by individuals.

  8. Microelectronic Information Processing Systems: Computing Systems. Summary of Awards Fiscal Year 1994.

    ERIC Educational Resources Information Center

    National Science Foundation, Arlington, VA. Directorate for Computer and Information Science and Engineering.

    The purpose of this summary of awards is to provide the scientific and engineering communities with a summary of the grants awarded in 1994 by the National Science Foundation's Division of Microelectronic Information Processing Systems. Similar areas of research are grouped together. Grantee institutions and principal investigators are identified…

  9. Advanced information processing system for advanced launch system: Avionics architecture synthesis

    NASA Technical Reports Server (NTRS)

    Lala, Jaynarayan H.; Harper, Richard E.; Jaskowiak, Kenneth R.; Rosch, Gene; Alger, Linda S.; Schor, Andrei L.

    1991-01-01

    The Advanced Information Processing System (AIPS) is a fault-tolerant distributed computer system architecture that was developed to meet the real-time computational needs of advanced aerospace vehicles. One such vehicle is the Advanced Launch System (ALS) being developed jointly by NASA and the Department of Defense to launch heavy payloads into low earth orbit at one tenth the cost (per pound of payload) of the current launch vehicles. An avionics architecture that utilizes the AIPS hardware and software building blocks was synthesized for ALS. The AIPS for ALS architecture synthesis process, starting with the ALS mission requirements and ending with an analysis of the candidate ALS avionics architecture, is described.

  10. Customer information and the quality improvement process: developing a customer information system.

    PubMed

    Orme, C N; Parsons, R J; McBride, G Z

    1992-01-01

    As growing numbers of health care organizations institute quality improvement programs, the demand within these organizations for reliable information about customers increases. By establishing a customer information system (CIS)--a model for collecting, archiving, and accessing customer information--health care organizations can eliminate the duplication of research, ensure that customer information is properly collected and interpreted, and provide decision makers access to better, more reliable customer information. Customer-supplier relationships are defined, guidelines for determining information needs are provided, and ways to set up and manage a CIS are suggested.

  11. Explainable expert systems: A research program in information processing

    NASA Technical Reports Server (NTRS)

    Paris, Cecile L.

    1993-01-01

    Our work in Explainable Expert Systems (EES) had two goals: to extend and enhance the range of explanations that expert systems can offer, and to ease their maintenance and evolution. As suggested in our proposal, these goals are complementary because they place similar demands on the underlying architecture of the expert system: they both require the knowledge contained in a system to be explicitly represented, in a high-level declarative language and in a modular fashion. With these two goals in mind, the Explainable Expert Systems (EES) framework was designed to remedy limitations to explainability and evolvability that stem from related fundamental flaws in the underlying architecture of current expert systems.

  12. A 'user friendly' geographic information system in a color interactive digital image processing system environment

    NASA Technical Reports Server (NTRS)

    Campbell, W. J.; Goldberg, M.

    1982-01-01

    NASA's Eastern Regional Remote Sensing Applications Center (ERRSAC) has recognized the need to accommodate spatial analysis techniques in its remote sensing technology transfer program. A computerized Geographic Information System to incorporate remotely sensed data, specifically Landsat, with other relevant data was considered a realistic approach to address a given resource problem. Questions arose concerning the selection of a suitable available software system to demonstrate, train, and undertake demonstration projects with ERRSAC's user community. The very specific requirements for such a system are discussed. The solution found involved the addition of geographic information processing functions to the Interactive Digital Image Manipulation System (IDIMS). Details regarding the functions of the new integrated system are examined along with the characteristics of the software.

  14. Information Processing as a Key Factor for Modern Federations of Combat Information Systems

    DTIC Science & Technology

    2001-04-01

    … and vice versa becomes possible. Thus, data modeling experts of almost all NATO nations … collaborative information processing … Data Harmonization in the Integration Process: Local Schema, ATCCIS Schema, Database System. The data mediation implementation described in this document favors …

  15. The role of the microprocessor in onboard image processing for the information adaptive system

    NASA Technical Reports Server (NTRS)

    Kelly, W. L., IV; Meredith, B. D.

    1980-01-01

    The preliminary design of the Information Adaptive System is presented. The role of the microprocessor in the implementation of the individual processing elements is discussed. Particular emphasis is placed on multispectral image data processing.

  16. On AIPS++, a new astronomical information processing system

    NASA Technical Reports Server (NTRS)

    Croes, G. A.

    1992-01-01

    The AIPS system that has served the needs of the radio astronomical community remarkably well during the last 15 years is showing signs of age and is being replaced by a more modern system, AIPS++. As the name implies, AIPS++ will be developed in an object-oriented fashion and will use C++ as its main programming language. The work is being done by a consortium of seven organizations, with coordinated activities worldwide. After a review of the history of the project to date from management, astronomical and technical viewpoints, and of the current state of the project, the paper concentrates on the tradeoffs implied by the choice of implementation style and the lessons we have learned, good and bad.

  17. Advanced information processing system: Fault injection study and results

    NASA Technical Reports Server (NTRS)

    Burkhardt, Laura F.; Masotto, Thomas K.; Lala, Jaynarayan H.

    1992-01-01

    The objective of the AIPS program is to achieve a validated fault-tolerant distributed computer system. The goals of the AIPS fault injection study were: (1) to present the fault injection study components addressing the AIPS validation objective; (2) to obtain feedback for fault removal from the design implementation; (3) to obtain statistical data regarding fault detection, isolation, and reconfiguration responses; and (4) to obtain data regarding the effects of faults on system performance. The parameters that must be varied to create a comprehensive set of fault injection tests are described, along with the subset of test cases selected, the test case measurements, and the test case execution. Both pin-level hardware faults injected with a hardware fault injector and software-injected memory mutations were used to test the system. An overview is provided of the hardware fault injector and the associated software used to carry out the experiments. Detailed specifications of faults and test results are given for the I/O Network and the AIPS Fault Tolerant Processor, respectively. The results are summarized and conclusions are given.
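
    Fault-tolerant computers of this kind typically detect and mask injected faults by comparing and voting among redundant channels. The toy sketch below illustrates that general idea on a triplicated data word, injecting a random bit flip as a crude software analogue of the memory mutations mentioned above; it is a didactic stand-in, not the AIPS fault detection, isolation, and reconfiguration logic.

```python
import random

def inject_bit_flip(word, width=32, rng=random):
    """Simulate a memory mutation by flipping one random bit of a data word."""
    return word ^ (1 << rng.randrange(width))

def majority_vote(a, b, c):
    """Bitwise 2-of-3 majority vote, used to mask a single faulty copy."""
    return (a & b) | (a & c) | (b & c)

def fault_detected(a, b, c):
    """A fault is detected whenever the three redundant copies disagree."""
    return not (a == b == c)

value = 0xDEADBEEF
copies = [value, value, value]
copies[random.randrange(3)] = inject_bit_flip(value)   # fault injection

assert fault_detected(*copies)                         # fault detected
assert majority_vote(*copies) == value                 # fault masked by voting
print("injected fault detected and masked; voted value:", hex(majority_vote(*copies)))
```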

  18. Certifying single-system steering for quantum-information processing

    NASA Astrophysics Data System (ADS)

    Li, Che-Ming; Chen, Yueh-Nan; Lambert, Neill; Chiu, Ching-Yi; Nori, Franco

    2015-12-01

    Einstein-Podolsky-Rosen (EPR) steering describes how different ensembles of quantum states can be remotely prepared by measuring one particle of an entangled pair. Here, we investigate quantum steering for single quantum d -dimensional systems (qudits) and devise efficient conditions to certify the steerability therein, which we find are applicable both to single-system steering and EPR steering. In the single-system case our steering conditions enable the unambiguous ruling out of generic classical means of mimicking steering. Ruling out "false-steering" scenarios has implications for securing channels against both cloning-based individual attack and coherent attacks when implementing quantum key distribution using qudits. We also show that these steering conditions also have applications in quantum computation, in that they can serve as an efficient criterion for the evaluation of quantum logic gates of arbitrary size. Finally, we describe how the nonlocal EPR variant of these conditions also function as tools for identifying faithful one-way quantum computation, secure entanglement-based quantum communication, and genuine multipartite EPR steering.

  19. Enhancements and Algorithms for Avionic Information Processing System Design Methodology.

    DTIC Science & Technology

    1982-06-16

    when a task finishes executing it may write its results to a file. Another task reads the file, then uses the data to perform its job. The file must be...tasks that write to it or read from it. A certain amount of memory is needed for each file. We can then allocate the files along with the tasks in the...Optimal Scheduling of Multi- Processor Systems. Ph.D. Dissertation . Department of Computer Science, Stanford University. [35] KEMENY, J.G. and SNELL, J.L

  20. Advanced information processing system for advanced launch system: Hardware technology survey and projections

    NASA Technical Reports Server (NTRS)

    Cole, Richard

    1991-01-01

    The major goals of this effort are as follows: (1) to examine technology insertion options to optimize Advanced Information Processing System (AIPS) performance in the Advanced Launch System (ALS) environment; (2) to examine the AIPS concepts to ensure that valuable new technologies are not excluded from the AIPS/ALS implementations; (3) to examine advanced microprocessors applicable to AIPS/ALS, (4) to examine radiation hardening technologies applicable to AIPS/ALS; (5) to reach conclusions on AIPS hardware building blocks implementation technologies; and (6) reach conclusions on appropriate architectural improvements. The hardware building blocks are the Fault-Tolerant Processor, the Input/Output Sequencers (IOS), and the Intercomputer Interface Sequencers (ICIS).

  1. Advanced information processing system: Authentication protocols for network communication

    NASA Technical Reports Server (NTRS)

    Harper, Richard E.; Adams, Stuart J.; Babikyan, Carol A.; Butler, Bryan P.; Clark, Anne L.; Lala, Jaynarayan H.

    1994-01-01

    In safety-critical I/O and intercomputer communication networks, reliable message transmission is an important concern. Difficulties of communication and fault identification in networks arise primarily because the sender of a transmission cannot be identified with certainty, an intermediate node can corrupt a message without certainty of detection, and a babbling node cannot be identified and silenced without lengthy diagnosis and reconfiguration. Authentication protocols use digital signature techniques to verify the authenticity of messages with high probability. Such protocols appear to provide an efficient solution to many of these problems. The objective of this program is to develop, demonstrate, and evaluate intercomputer communication architectures which employ authentication. As a context for the evaluation, the authentication protocol-based communication concept was demonstrated under this program by hosting a real-time flight-critical guidance, navigation and control algorithm on a distributed, heterogeneous, mixed-redundancy system of workstations and embedded fault-tolerant computers.
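
    To make the idea concrete, the sketch below signs and verifies a message with Ed25519 using the third-party `cryptography` package. It is only a generic illustration of digital-signature authentication, not the protocol, key-management scheme, or signature algorithm developed under the AIPS program.

```python
# pip install cryptography
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Each sending node holds a private key; receivers hold the matching public key.
sender_key = Ed25519PrivateKey.generate()
sender_pub = sender_key.public_key()

message = b"guidance update: frame 1042"     # illustrative payload
signature = sender_key.sign(message)         # transmitted alongside the message

# Receiver: accept the message only if the signature verifies, which ties the
# message to the claimed sender and detects corruption in transit.
try:
    sender_pub.verify(signature, message)
    print("message authenticated")
except InvalidSignature:
    print("message rejected: bad signature or corrupted data")
```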

  2. Dynamic systems and inferential information processing in human communication.

    PubMed

    Grammer, Karl; Fink, Bernhard; Renninger, LeeAnn

    2002-12-01

    Research in human communication on an ethological basis is almost obsolete. The reasons for this are manifold and lie partially in methodological problems connected to the observation and description of behavior, as well as in the nature of human behavior itself. In this chapter, we present a new, non-intrusive, technical approach to the analysis of human non-verbal behavior, which could help to solve the problem of categorization that plagues the traditional approaches. We utilize evolutionary theory to propose a new theory-driven methodological approach to the 'multi-unit multi-channel modulation' problem of human nonverbal communication. Within this concept, communication is seen as context-dependent (the meaning of a signal is adapted to the situation), as a multi-channel and a multi-unit process (a string of many events interrelated in 'communicative' space and time), and as related to the function it serves. Such an approach can be utilized to successfully bridge the gap between evolutionary psychological research, which focuses on social cognition adaptations, and human ethology, which describes everyday behavior in an objective, systematic way.

  3. Advanced Information Processing System (AIPS) proof-of-concept system functional design I/O network system services

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The functional design of the Input/Output (I/O) services for the Advanced Information Processing System (AIPS) proof-of-concept system is described. Data flow diagrams, which show the functional processes in I/O services and the data that flow among them, are included. A complete list of the data identified on the data flow diagrams and in the process descriptions is provided.

  4. Medicaid Program; Mechanized Claims Processing and Information Retrieval Systems (90/10). Final rule.

    PubMed

    2015-12-04

    This final rule will extend enhanced funding for Medicaid eligibility systems as part of a state's mechanized claims processing system, and will update conditions and standards for such systems, including adding to and updating current Medicaid Management Information Systems (MMIS) conditions and standards. These changes will allow states to improve customer service and support the dynamic nature of Medicaid eligibility, enrollment, and delivery systems.

  5. SURVEY OF HIGHLY PARALLEL INFORMATION PROCESSING TECHNOLOGY AND SYSTEMS. PHASE I OF AN IMPLICATIONS STUDY,

    DTIC Science & Technology

    The purpose of this report is to present the results of a survey of highly parallel information processing technology and systems. Completion of this study will require a survey of naval systems to determine which can benefit from the technology.

  6. Perioperative blood ordering optimization process using information from an anesthesia information management system.

    PubMed

    Rinehart, Joseph B; Lee, Tiffany C; Kaneshiro, Kayleigh; Tran, Minh-Ha; Sun, Coral; Kain, Zeev N

    2016-04-01

    As part of an ongoing perioperative surgical home implementation process, we applied a previously published algorithm for the creation of a maximum surgical blood order schedule (MSBOS) to our operating rooms. We hypothesized that using the MSBOS we could show a reduction in unnecessary preoperative blood testing and associated costs. Data regarding all surgical cases done at UC Irvine Health's operating rooms from January 1, 2011, to January 1, 2014 were extracted from the anesthesia information management system (AIMS). After the data were organized into surgical specialties and operative sites, blood order recommendations were generated based on five specific case characteristics of the group. Next, we assessed current ordering practices in comparison to actual blood utilization to identify potential areas of wastage, and performed a cost analysis comparing the annual hospital costs from preoperative blood orders if the blood order schedule were followed with those from historical practices. Of the 19,138 patients who were categorized by the MSBOS as needing no blood sample, 2694 (14.0%) had a type and screen (T/S) ordered and 1116 (5.8%) had a type and crossmatch ordered. Of the 6073 procedures where the MSBOS recommended only a T/S, 2355 (38.8%) had blood crossmatched. The cost analysis demonstrated an annual reduction in actual hospital costs of $57,335 with the MSBOS compared to historical blood ordering practices. We showed that the algorithm for development of a multispecialty blood order schedule is transferable and yielded reductions in preoperative blood product screening at our institution. © 2016 AABB.
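
    A highly simplified sketch of the kind of comparison described above: group historical cases by procedure, compare how often blood was actually transfused or crossmatched, and flag procedures whose routine preoperative orders exceed what utilization supports. The column names, thresholds, and decision rule are hypothetical placeholders, not the published MSBOS algorithm or the institution's data.

```python
import pandas as pd

# Hypothetical AIMS extract: one row per surgical case
cases = pd.DataFrame({
    "procedure":  ["lap chole", "lap chole", "lap chole",
                   "hip replacement", "hip replacement", "hip replacement"],
    "ts_ordered": [1, 1, 0, 1, 1, 1],    # type & screen ordered preoperatively
    "xm_ordered": [0, 1, 0, 1, 1, 1],    # crossmatch ordered preoperatively
    "transfused": [0, 0, 0, 1, 0, 1],    # blood actually given
})

summary = cases.groupby("procedure").agg(
    n=("transfused", "size"),
    transfusion_rate=("transfused", "mean"),
    xm_rate=("xm_ordered", "mean"),
)

# Hypothetical rule: "no sample" below 5% transfusion rate,
# "type & screen" below 30%, otherwise "type & crossmatch".
def recommend(rate):
    if rate < 0.05:
        return "no sample"
    if rate < 0.30:
        return "type & screen"
    return "type & crossmatch"

summary["recommended_order"] = summary["transfusion_rate"].apply(recommend)
summary["over_ordering"] = summary["xm_rate"] > summary["transfusion_rate"]
print(summary)
```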

  7. Quantify information system benefits

    SciTech Connect

    Koppel, L.B.

    1995-06-01

    What are information systems and how do they relate to control systems? How do information systems produce benefits in hydrocarbon processing? What are some examples of benefit-generating information system applications? Information System Benefits (ISBEN) is a structured methodology for estimating information system benefits in hydrocarbon processing. The paper discusses information and control systems, information system benefits and applications, objectives, strategies and measures of ISBEN, ISBEN business drivers, ISBEN database, ISBEN methodology, and implementation.

  8. Evolution of the archaeal and mammalian information processing systems: towards an archaeal model for human disease.

    PubMed

    Lyu, Zhe; Whitman, William B

    2017-01-01

    Current evolutionary models suggest that Eukaryotes originated from within Archaea instead of being a sister lineage. To test this model of ancient evolution, we review recent studies and compare the three major information processing subsystems of replication, transcription and translation in the Archaea and Eukaryotes. Our hypothesis is that if the Eukaryotes arose within the archaeal radiation, their information processing systems will appear to be of one kind and not wholly original. Within the Eukaryotes, the mammalian or human systems are emphasized because of their importance in understanding health. Biochemical as well as genetic studies provide strong evidence for the functional similarity of archaeal homologs to the mammalian information processing system and their dissimilarity to the bacterial systems. In many independent instances, a simple archaeal system is functionally equivalent to more elaborate eukaryotic homologs, suggesting that evolution of complexity is likely a central feature of the eukaryotic information processing system. Because fewer components are often involved, biochemical characterizations of the archaeal systems are often easier to interpret. Similarly, the archaeal cell provides a genetically and metabolically simpler background, enabling convenient studies on the complex information processing system. Therefore, Archaea could serve as a parsimonious and tractable host for studying human diseases that arise in the information processing systems.

  9. Research and Development in the Computer and Information Sciences. Volume 2, Processing, Storage, and Output Requirements in Information Processing Systems: A Selective Literature Review.

    ERIC Educational Resources Information Center

    Stevens, Mary Elizabeth

    Areas of concern with respect to processing, storage, and output requirements of a generalized information processing system are considered. Special emphasis is placed on multiple-access systems. Problems of system management and control are discussed, including hierarchies of storage levels. Facsimile, digital, and mass random access storage…

  10. SPECIAL ISSUE ON OPTICAL PROCESSING OF INFORMATION: Optical signal-processing systems based on anisotropic media

    NASA Astrophysics Data System (ADS)

    Kiyashko, B. V.

    1995-10-01

    Partially coherent optical systems for signal processing are considered. The transfer functions are formed in these systems by interference of polarised light transmitted by an anisotropic medium. It is shown that such systems can perform various integral transformations of both optical and electric signals, in particular, two-dimensional Fourier and Fresnel transformations, as well as spectral analysis of weak light sources. It is demonstrated that such systems have the highest luminosity and vibration immunity among the systems with interference formation of transfer functions. An experimental investigation is reported of the application of these systems in the processing of signals from a linear hydroacoustic antenna array, and in measurements of the optical spectrum and of the intrinsic noise.
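
    As a numerical point of reference for the integral transformations mentioned above, the sketch below computes the two-dimensional Fourier spectrum of a sampled aperture and applies the standard Fresnel transfer function in the spatial-frequency domain. This is the usual scalar-diffraction textbook formulation, not a model of the anisotropic-media systems themselves, and the wavelength, propagation distance, and sampling values are arbitrary.

```python
import numpy as np

def fresnel_propagate(field, wavelength, z, dx):
    """Propagate a sampled complex field a distance z using the standard
    Fresnel transfer function applied in the spatial-frequency domain."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    k = 2 * np.pi / wavelength
    H = np.exp(1j * k * z) * np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Square aperture illuminated by a unit-amplitude plane wave
aperture = np.zeros((256, 256), dtype=complex)
aperture[112:144, 112:144] = 1.0

spectrum = np.fft.fftshift(np.fft.fft2(aperture))              # 2-D Fourier transform
diffracted = fresnel_propagate(aperture, 633e-9, 0.1, 10e-6)   # Fresnel diffraction at 10 cm
print(abs(spectrum).max(), abs(diffracted).max())
```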

  11. Understanding how replication processes can maintain systems away from equilibrium using Algorithmic Information Theory.

    PubMed

    Devine, Sean D

    2016-02-01

    Replication can be envisaged as a computational process that is able to generate and maintain order far from equilibrium. Replication processes can self-regulate, as the drive to replicate can counter degradation processes that impact on a system. The capability of replicated structures to access high-quality energy and eject disorder allows Landauer's principle, in conjunction with Algorithmic Information Theory, to quantify the entropy requirements to maintain a system far from equilibrium. Using Landauer's principle, where destabilising processes, operating under the second law of thermodynamics, change the information content or the algorithmic entropy of a system by ΔH bits, replication processes can access order, eject disorder, and counter the change without outside interventions. Both diversity in replicated structures and the coupling of different replicated systems increase the ability of the system (or systems) to self-regulate in a changing environment as adaptation processes select those structures that use resources more efficiently. At the level of the structure, as selection processes minimise the information loss, the irreversibility is minimised. While each structure that emerges can be said to be more entropically efficient, as such replicating structures proliferate, the dissipation of the system as a whole is higher than would be the case for inert or simpler structures. While a detailed application to most real systems would be difficult, the approach may well be useful in understanding incremental changes to real systems and provide broad descriptions of system behaviour. Copyright © 2016 The Author. Published by Elsevier Ireland Ltd. All rights reserved.
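
    The quantitative link invoked above is Landauer's bound: lowering the algorithmic entropy of a system by ΔH bits requires dissipating at least

    $$ E_{\min} = k_B T \ln 2 \cdot \Delta H $$

    of free energy to the surroundings, about 2.9 × 10⁻²¹ J per bit at T = 300 K. A replicating structure can therefore only maintain itself far from equilibrium by importing at least this much high-quality energy per bit of disorder it ejects.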

  12. IBIS - A geographic information system based on digital image processing and image raster datatype

    NASA Technical Reports Server (NTRS)

    Bryant, N. A.; Zobrist, A. L.

    1976-01-01

    IBIS (Image Based Information System) is a geographic information system which makes use of digital image processing techniques to interface existing geocoded data sets and information management systems with thematic maps and remotely sensed imagery. The basic premise is that geocoded data sets can be referenced to a raster scan that is equivalent to a grid cell data set. The first applications (St. Tammany Parish, Louisiana, and Los Angeles County) have been restricted to the design of a land resource inventory and analysis system. It is thought that the algorithms and the hardware interfaces developed will be readily applicable to other Landsat imagery.
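
    The core IBIS idea of referencing geocoded attributes to an image raster can be mimicked in a few lines: once a land-use layer and a classified Landsat scene share the same grid, thematic overlay reduces to a per-cell cross-tabulation. The tiny arrays and the nominal cell area below are invented stand-ins for registered raster layers, not data from the cited applications.

```python
import numpy as np

# Two co-registered rasters on the same grid-cell geometry (values are class codes)
land_use = np.array([[0, 0, 1],
                     [1, 1, 2],
                     [2, 2, 2]])        # e.g. 0=urban, 1=agriculture, 2=forest
landsat  = np.array([[0, 1, 1],
                     [1, 1, 2],
                     [2, 2, 1]])        # e.g. classified spectral classes

# Cross-tabulate cell counts: rows = land-use class, columns = Landsat class
n_lu, n_ls = land_use.max() + 1, landsat.max() + 1
crosstab = np.zeros((n_lu, n_ls), dtype=int)
np.add.at(crosstab, (land_use.ravel(), landsat.ravel()), 1)

cell_area_ha = 0.45                      # nominal area per grid cell (illustrative)
print(crosstab)                          # cell counts per class combination
print(crosstab * cell_area_ha)           # tabulated areas
```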

  14. The Effectiveness of Information Systems Teams as Change Agents in the Implementation of Business Process Reengineering

    ERIC Educational Resources Information Center

    Griffith, Gary L.

    2009-01-01

    Changes to information systems and technology (IS/IT) are happening faster than ever before. A literature review suggested that, within business process reengineering (BPR), there is limited information on what an IS/IT team could do to reduce resistance to change and increase user acceptance. The purpose of this ethnographic case study was to explore…

  16. A KPI framework for process-based benchmarking of hospital information systems.

    PubMed

    Jahn, Franziska; Winter, Alfred

    2011-01-01

    Benchmarking is a major topic for monitoring, directing and elucidating the performance of hospital information systems (HIS). Current approaches neglect the outcome of the processes that are supported by the HIS and their contribution to the hospital's strategic goals. We suggest to benchmark HIS based on clinical documentation processes and their outcome. A framework consisting of a general process model and outcome criteria for clinical documentation processes is introduced.

  17. Creating Trauma-Informed Child Welfare Systems Using a Community Assessment Process

    ERIC Educational Resources Information Center

    Hendricks, Alison; Conradi, Lisa; Wilson, Charles

    2011-01-01

    This article describes a community assessment process designed to evaluate a specific child welfare jurisdiction based on the current definition of trauma-informed child welfare and its essential elements. This process has recently been developed and pilot tested within three diverse child welfare systems in the United States. The purpose of the…

  19. A Coding System for Qualitative Studies of the Information-Seeking Process in Computer Science Research

    ERIC Educational Resources Information Center

    Moral, Cristian; de Antonio, Angelica; Ferre, Xavier; Lara, Graciela

    2015-01-01

    Introduction: In this article we propose a qualitative analysis tool--a coding system--that can support the formalisation of the information-seeking process in a specific field: research in computer science. Method: In order to elaborate the coding system, we have conducted a set of qualitative studies, more specifically a focus group and some…

  20. Measures and Metrics of Information Processing in Complex Systems: A Rope of Sand

    ERIC Educational Resources Information Center

    James, Ryan Gregory

    2013-01-01

    How much information do natural systems store and process? In this work we attempt to answer this question in multiple ways. We first establish a mathematical framework where natural systems are represented by a canonical form of edge-labeled hidden Markov models called ε-machines. Then, utilizing this framework, a variety of measures are defined and…
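
    For context, the standard quantities defined over such ε-machine representations include the entropy rate, the statistical complexity, and the excess entropy; in the usual computational-mechanics notation (given here from the general literature, not from the dissertation itself):

    $$ h_\mu = \lim_{L\to\infty} H[X_0 \mid X_{-L} \cdots X_{-1}], \qquad C_\mu = H[\mathcal{S}], \qquad \mathbf{E} = I[\overleftarrow{X};\, \overrightarrow{X}] $$

    where h_μ measures the irreducible per-symbol randomness, C_μ is the Shannon entropy of the causal-state distribution (the memory the process stores), and E is the mutual information between the past and the future of the process.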

  2. Low-cost Landsat digital processing system for state and local information systems

    NASA Technical Reports Server (NTRS)

    Hooper, N. J.; Spann, G. W.; Faust, N. L.; Paludan, C. T. N.

    1979-01-01

    The paper details a minicomputer-based system which is well within the budget of many state, regional, and local agencies that previously could not afford digital processing capability. In order to achieve this goal a workable small-scale Landsat system is examined to provide low-cost automated processing. It is anticipated that the alternative systems will be based on a single minicomputer, but that the peripherals will vary depending on the capability emphasized in a particular system.

  3. Natural language processing systems for capturing and standardizing unstructured clinical information: A systematic review.

    PubMed

    Kreimeyer, Kory; Foster, Matthew; Pandey, Abhishek; Arya, Nina; Halford, Gwendolyn; Jones, Sandra F; Forshee, Richard; Walderhaug, Mark; Botsis, Taxiarchis

    2017-09-01

    We followed a systematic approach based on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses to identify existing clinical natural language processing (NLP) systems that generate structured information from unstructured free text. Seven literature databases were searched with a query combining the concepts of natural language processing and structured data capture. Two reviewers screened all records for relevance during two screening phases, and information about clinical NLP systems was collected from the final set of papers. A total of 7149 records (after removing duplicates) were retrieved and screened, and 86 were determined to fit the review criteria. These papers contained information about 71 different clinical NLP systems, which were then analyzed. The NLP systems address a wide variety of important clinical and research tasks. Certain tasks are well addressed by the existing systems, while others remain as open challenges that only a small number of systems attempt, such as extraction of temporal information or normalization of concepts to standard terminologies. This review has identified many NLP systems capable of processing clinical free text and generating structured output, and the information collected and evaluated here will be important for prioritizing development of new approaches for clinical NLP. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Polarization information processing and software system design for simultaneously imaging polarimetry

    NASA Astrophysics Data System (ADS)

    Wang, Yahui; Liu, Jing; Jin, Weiqi; Wen, Renjie

    2015-08-01

    Simultaneous imaging polarimetry can realize real-time polarization imaging of a dynamic scene, which has wide application prospects. This paper first briefly describes the design of the double separate Wollaston prism simultaneous imaging polarimeter, and then emphasis is placed on the polarization information processing methods and the software system design for the designed polarimeter. The polarization information processing methods consist of adaptive image segmentation, high-accuracy image registration, and instrument matrix calibration. Morphological image processing (dilation) was used for image segmentation; the accuracy of image registration can reach 0.1 pixel based on spatial- and frequency-domain cross-correlation; and instrument matrix calibration adopted a four-point calibration method. The software system was implemented under Windows in the C++ programming language, and realizes synchronous polarization image acquisition and storage, image processing, and polarization information extraction and display. Polarization data obtained with the designed polarimeter show that the polarization information processing methods and the software system effectively realize real-time measurement of the four Stokes parameters of a scene, and that the processing methods effectively improve the polarization detection accuracy.
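
    For reference, the linear Stokes parameters that a four-channel polarimeter typically recovers can be computed per pixel from the four registered intensity images. The sketch below uses the standard 0°/45°/90°/135° analyzer convention, which is an assumption about the channel layout rather than a description of this particular instrument, and it omits the circular component S3.

```python
import numpy as np

def linear_stokes(i0, i45, i90, i135):
    """Per-pixel linear Stokes parameters from four co-registered intensity
    images taken behind analyzers at 0, 45, 90 and 135 degrees."""
    s0 = i0 + i90                                    # total intensity
    s1 = i0 - i90
    s2 = i45 - i135
    dolp = np.sqrt(s1**2 + s2**2) / np.clip(s0, 1e-9, None)   # degree of linear polarization
    aop = 0.5 * np.arctan2(s2, s1)                             # angle of polarization (radians)
    return s0, s1, s2, dolp, aop

# Example on synthetic 2x2 images
i0, i45, i90, i135 = (np.full((2, 2), v, dtype=float) for v in (0.9, 0.5, 0.1, 0.5))
print(linear_stokes(i0, i45, i90, i135))
```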

  5. Evaluation of a gene information summarization system by users during the analysis process of microarray datasets.

    PubMed

    Yang, Jianji; Cohen, Aaron; Hersh, William

    2009-02-05

    Summarization of gene information in the literature has the potential to help genomics researchers translate basic research into clinical benefits. Gene expression microarrays have been used to study biomarkers for disease and discover novel types of therapeutics, and the task of finding information in journal articles on sets of genes is common for translational researchers working with microarray data. However, manually searching and scanning the literature references returned from PubMed is a time-consuming task for scientists. We built and evaluated an automatic summarizer of information on genes studied in microarray experiments. The Gene Information Clustering and Summarization System (GICSS) is a system that integrates two related steps of the microarray data analysis process: functional gene clustering and gene information gathering. The system evaluation was conducted while genomics researchers analyzed their own experimental microarray datasets. The clusters generated by GICSS were validated by scientists during their microarray analysis process. In addition, presenting sentences from the abstract provided significantly more important information to the users than just showing the title in the default PubMed format. The evaluation results suggest that GICSS can be useful for researchers in the genomics area. In addition, the hybrid evaluation method, partway between intrinsic and extrinsic system evaluation, may enable researchers to gauge the true usefulness of the tool for scientists in their natural analysis workflow and also elicit suggestions for future enhancements. GICSS can be accessed online at: http://ir.ohsu.edu/jianji/index.html.

  6. Considerations in developing geographic informations systems based on low-cost digital image processing

    NASA Technical Reports Server (NTRS)

    Henderson, F. M.; Dobson, M. W.

    1981-01-01

    The potential of digital image processing systems costing $20,000 or less for geographic information systems is assessed with the emphasis on the volume of data to be handled, the commercial hardware systems available, and the basic software for: (1) data entry, conversion and digitization; (2) georeferencing and geometric correction; (3) data structuring; (4) editing and updating; (5) analysis and retrieval; (6) output drivers; and (7) data management. Costs must also be considered as tangible and intangible factors.

  8. How Process Helps You in Developing a High Quality Medical Information System

    NASA Astrophysics Data System (ADS)

    Akiyama, Yoshihiro

    A medical information system is an extreme case of reliance on tacit knowledge: practitioners and medical experts such as physicians use a great deal of knowledge that is grounded in experience and is not explicitly formulated. This is quite different from other discipline areas such as embedded engineering systems. Developing such a system critically depends on how effectively this varied knowledge is organized and integrated when implementing the system. Consequently, the development process in which customers, management, engineers, and teams are involved must be evaluated from this point of view. The existence of tacit knowledge may not be recognized well enough at the beginning of a project, yet accounting for it is necessary for project success. This paper describes these problems and how the Personal Software Process (PSP) and the Team Software Process (TSP) manage them, and then typical performance results are discussed. It may be said that a PSP-trained individual and a TSP team operate as CMMI level 4 units, respectively.

  9. Methodologies for automating the collection and processing of GPS-GIS information for transportation systems

    NASA Astrophysics Data System (ADS)

    Zhao, Bingyan

    All transportation departments have large amounts of data and information that are needed for planning and operation of their systems. This information can be textual, graphical or spatial in nature. Spatial information is generally in the form of maps and these maps are increasingly being stored and processed as digital GIS files that can be linked to other types of information generally referred to as attribute information. In the NYSDOT database, there are many kinds of features for which information must be maintained. For example, there are about 22,500 bridges within the New York State road systems. The current spatial location for these bridges may not have the level of accuracy that would be desired by today's standards and that can be achieved with new spatial measuring techniques. Although the updating of bridge locations and the location of other features can be done using new techniques such as GPS, if this is done manually it presents a forbidding task. The main objective of this study is to find a way to automatically collect feature location data using GPS equipment and to automate the transfer of this information into archival databases. Among the objectives of this dissertation are: how to automatically download information from the DOT database; how to collect field data following a uniform procedure; how to convert the surveying results into Arc/View shape files and how to update the DOT feature location map information using field data. The end goal is to develop feasible methodologies to automate updating of mapping information using GPS by creating a systems design for the process and to create the scripts and programming needed to make this system work. This has been accomplished and is demonstrated in a sample program. Details of the Automated Acquisition System are described in this dissertation.
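
    As a rough illustration of the automated conversion step described above, the sketch below turns field-collected GPS fixes into a GIS-consumable point file. The dissertation targets Arc/View shapefiles; GeoJSON is used here only as a stand-in format writable with the standard library, and the file and field names are invented.

```python
# Illustrative sketch only: converting field-collected GPS fixes into a
# GIS-consumable point file for a feature-location update. GeoJSON stands in
# for the dissertation's Arc/View shapefile output; names are invented.
import csv
import json

def gps_csv_to_geojson(csv_path, geojson_path):
    """Read rows of (feature_id, latitude, longitude) and write point features."""
    features = []
    with open(csv_path, newline="") as fh:
        for row in csv.DictReader(fh):
            features.append({
                "type": "Feature",
                "geometry": {
                    "type": "Point",
                    "coordinates": [float(row["longitude"]), float(row["latitude"])],
                },
                "properties": {"feature_id": row["feature_id"]},
            })
    with open(geojson_path, "w") as out:
        json.dump({"type": "FeatureCollection", "features": features}, out, indent=2)

# Example: gps_csv_to_geojson("bridge_fixes.csv", "bridge_locations.geojson")
```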

  10. Prioritizing factors influencing nurses' satisfaction with hospital information systems: a fuzzy analytic hierarchy process approach.

    PubMed

    Kimiafar, Khalil; Sadoughi, Farahnaz; Sheikhtaheri, Abbas; Sarbaz, Masoumeh

    2014-04-01

    Our aim was to use the fuzzy analytic hierarchy process approach to prioritize the factors that influence nurses' satisfaction with a hospital information system. First, we reviewed the related literature to identify and select possible factors. Second, we developed an analytic hierarchy process framework with three main factors (quality of services, of systems, and of information) and 22 subfactors. Third, we developed a questionnaire based on pairwise comparisons and invited 10 experienced nurses who were identified through snowball sampling to rate these factors. Finally, we used Chang's fuzzy extent analysis method to compute the weights of these factors and prioritize them. We found that information quality was the most important factor (58%), followed by service quality (22%) and then system quality (19%). In conclusion, although their weights were not similar, all factors were important and should be considered in evaluating nurses' satisfaction.
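
    As a simplified, crisp illustration of how priority weights emerge from pairwise comparisons, the sketch below normalizes row geometric means of a comparison matrix. The study itself applies Chang's fuzzy extent analysis to triangular fuzzy judgments, which this sketch does not reproduce; the comparison values below are invented.

```python
# Simplified, crisp illustration of deriving priority weights from a pairwise
# comparison matrix (row geometric means, normalized). Not Chang's fuzzy
# extent analysis used in the study; the judgments are invented.
import numpy as np

def ahp_weights(pairwise):
    """Approximate AHP priority weights from a reciprocal comparison matrix."""
    pairwise = np.asarray(pairwise, dtype=float)
    geo_means = pairwise.prod(axis=1) ** (1.0 / pairwise.shape[1])
    return geo_means / geo_means.sum()

# Hypothetical judgments over the three main factors
# (information quality, service quality, system quality):
comparisons = [
    [1.0, 3.0, 3.0],
    [1 / 3.0, 1.0, 1.0],
    [1 / 3.0, 1.0, 1.0],
]
print(ahp_weights(comparisons))  # roughly [0.6, 0.2, 0.2]
```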

  11. PREFACE: Quantum information processing

    NASA Astrophysics Data System (ADS)

    Briggs, Andrew; Ferry, David; Stoneham, Marshall

    2006-05-01

    Microelectronics and the classical information technologies transformed the physics of semiconductors. Photonics has given optical materials a new direction. Quantum information technologies, we believe, will have immense impact on condensed matter physics. The novel systems of quantum information processing need to be designed and made. Their behaviours must be manipulated in ways that are intrinsically quantal and generally nanoscale. Both in this special issue and in previous issues (see e.g., Spiller T P and Munro W J 2006 J. Phys.: Condens. Matter 18 V1-10) we see the emergence of new ideas that link the fundamentals of science to the pragmatism of market-led industry. We hope these papers will be followed by many others on quantum information processing in the Journal of Physics: Condensed Matter.

  12. Advanced information processing system - Status report. [for fault tolerant and damage tolerant data processing for aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Brock, L. D.; Lala, J.

    1986-01-01

    The Advanced Information Processing System (AIPS) is designed to provide a fault tolerant and damage tolerant data processing architecture for a broad range of aerospace vehicles. The AIPS architecture also has attributes to enhance system effectiveness such as graceful degradation, growth and change tolerance, integrability, etc. Two key building blocks being developed by the AIPS program are a fault and damage tolerant processor and communication network. A proof-of-concept system is now being built and will be tested to demonstrate the validity and performance of the AIPS concepts.

  14. International Ultraviolet Explorer New Spectral Image Processing System Information Manual: Version 2.0

    NASA Astrophysics Data System (ADS)

    Garhart, M. P.; Smith, M. A.; Turnrose, B. E.; Levay, K. L.; Thompson, R. W.

    1997-10-01

    This document is intended for use by researchers who wish to analyze data acquired by the International Ultraviolet Explorer (IUE) and processed for the IUE Final Archive with the New Spectral Image Processing System (NEWSIPS) at either Goddard Space Flight Center (GSFC) or the European Space Agency (ESA) Villafranca del Castillo IUE Observatory (VILSPA). The information contained in this document explains the instrument characteristics and the processing methodology and calibration techniques used in the NEWSIPS system to produce the output products available to researchers. This second version of the IUE NEWSIPS Information Manual has been updated to include the processing techniques for LWR low-dispersion and LWP, LWR, and SWP high-dispersion data.

  15. Implementing an Enterprise Information System to Reengineer and Streamline Administrative Processes in a Distance Learning Unit

    ERIC Educational Resources Information Center

    Abdous, M'hammed; He, Wu

    2009-01-01

    During the past three years, we have developed and implemented an enterprise information system (EIS) to reengineer and facilitate the administrative process for preparing and teaching distance learning courses in a midsized-to-large university (with 23,000 students). The outcome of the implementation has been a streamlined and efficient process…

  16. Information-Processing Architectures in Multidimensional Classification: A Validation Test of the Systems Factorial Technology

    ERIC Educational Resources Information Center

    Fific, Mario; Nosofsky, Robert M.; Townsend, James T.

    2008-01-01

    A growing methodology, known as the systems factorial technology (SFT), is being developed to diagnose the types of information-processing architectures (serial, parallel, or coactive) and stopping rules (exhaustive or self-terminating) that operate in tasks of multidimensional perception. Whereas most previous applications of SFT have been in…

  17. SDI-based business processes: A territorial analysis web information system in Spain

    NASA Astrophysics Data System (ADS)

    Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.

    2012-09-01

    Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services in their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to the Spanish citizens' knowledge about their territory.

  18. A Collaborative Knowledge Management Process for Implementing Healthcare Enterprise Information Systems

    NASA Astrophysics Data System (ADS)

    Cheng, Po-Hsun; Chen, Sao-Jie; Lai, Jin-Shin; Lai, Feipei

    This paper illustrates a feasible health informatics domain knowledge management process which helps gather useful technology information and reduce many knowledge misunderstandings among engineers who have participated in the IBM mainframe rightsizing project at National Taiwan University (NTU) Hospital. We design an asynchronous sharing mechanism to facilitate knowledge transfer, and our health informatics domain knowledge management process can be used to publish and retrieve documents dynamically. It effectively creates an acceptable discussion environment and even lessens the traditional meeting burden among development engineers. An overall description of the current software development status is presented. Then, the knowledge management implementation of health information systems is proposed.

  19. Automated system function allocation and display format: Task information processing requirements

    NASA Technical Reports Server (NTRS)

    Czerwinski, Mary P.

    1993-01-01

    An important consideration when designing the interface to an intelligent system concerns function allocation between the system and the user. The display of information could be held constant, or 'fixed', leaving the user with the task of searching through all of the available information, integrating it, and classifying the data into a known system state. On the other hand, the system, based on its own intelligent diagnosis, could display only relevant information in order to reduce the user's search set. The user would still be left the task of perceiving and integrating the data and classifying it into the appropriate system state. Finally, the system could display the patterns of data. In this scenario, the task of integrating the data is carried out by the system, and the user's information processing load is reduced, leaving only the tasks of perception and classification of the patterns of data. Humans are especially adept at this form of display processing. Although others have examined the relative effectiveness of alphanumeric and graphical display formats, it is interesting to reexamine this issue together with the function allocation problem. Currently, Johnson Space Center is the test site for an intelligent Thermal Control System (TCS), TEXSYS, being tested for use with Space Station Freedom. Expert TCS engineers, as well as novices, were asked to classify several displays of TEXSYS data into various system states (including nominal and anomalous states). Three different display formats were used: fixed, subset, and graphical. The hypothesis tested was that the graphical displays would provide for fewer errors and faster classification times by both experts and novices, regardless of the kind of system state represented within the display. The subset displays were hypothesized to be the second most effective display format/function allocation condition, based on the fact that the search set is reduced in these displays. Both the subset and the

  20. [Use of the automated system of information processing in tuberculosis control].

    PubMed

    Podolinnyĭ, G I; Degtiarev, V P; Brumar', A G; Krivenko, G T; Mokan, B G

    1991-01-01

    A notification variant has been designed on the basis of the No. 089 form "Notification of a patient with newly diagnosed active tuberculosis", which is used for feeding information into the computer on persons with newly diagnosed tuberculosis and on those registered in a dispensary; it is also used as a corrective coupon. The automated system for information processing "Tuberculosis" has been developed, which allows centralized control over the early detection and dispensarization of patients, efficient assessment of the accumulated information, and proper decision making. The effectiveness of the electronic devices used in dispensary work can be improved on the condition that phthisiologists are provided with personal computers.

  1. Systems Factorial Technology provides new insights on global–local information processing in autism spectrum disorders

    PubMed Central

    Johnson, Shannon A.; Blaha, Leslie M.; Houpt, Joseph W.; Townsend, James T.

    2010-01-01

    Previous studies of global–local processing in autism spectrum disorders (ASDs) have indicated mixed findings, with some evidence of a local processing bias, or preference for detail-level information, and other results suggesting typical global advantage, or preference for the whole or gestalt. Findings resulting from this paradigm have been used to argue for or against a detail focused processing bias in ASDs, and thus have important theoretical implications. We applied Systems Factorial Technology, and the associated Double Factorial Paradigm (both defined in the text), to examine information processing characteristics during a divided attention global–local task in high-functioning individuals with an ASD and typically developing controls. Group data revealed global advantage for both groups, contrary to some current theories of ASDs. Information processing models applied to each participant revealed that task performance, although showing no differences at the group level, was supported by different cognitive mechanisms in ASD participants compared to controls. All control participants demonstrated inhibitory parallel processing and the majority demonstrated a minimum-time stopping rule. In contrast, ASD participants showed exhaustive parallel processing with mild facilitatory interactions between global and local information. Thus our results indicate fundamental differences in the stopping rules and channel dependencies in individuals with an ASD. PMID:23750050
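
    The Double Factorial Paradigm mentioned above diagnoses architecture and stopping rule from contrasts of survivor functions across the four salience conditions. The sketch below computes the standard survivor interaction contrast and mean interaction contrast; it is a generic illustration rather than the authors' analysis code, and the response-time arrays it expects are assumed inputs.

```python
# Sketch of the survivor interaction contrast (SIC) used in Systems Factorial
# Technology's double factorial paradigm. The RT arrays would come from the
# four salience conditions (low/high on each channel); data are assumed.
import numpy as np

def survivor(rts, t_grid):
    """Empirical survivor function S(t) = P(RT > t) on a common time grid."""
    rts = np.asarray(rts, dtype=float)
    return np.array([(rts > t).mean() for t in t_grid])

def sic(rt_ll, rt_lh, rt_hl, rt_hh, t_grid):
    """SIC(t) = S_LL(t) - S_LH(t) - S_HL(t) + S_HH(t)."""
    return (survivor(rt_ll, t_grid) - survivor(rt_lh, t_grid)
            - survivor(rt_hl, t_grid) + survivor(rt_hh, t_grid))

def mic(rt_ll, rt_lh, rt_hl, rt_hh):
    """Mean interaction contrast over the same four conditions."""
    return (np.mean(rt_ll) - np.mean(rt_lh)
            - np.mean(rt_hl) + np.mean(rt_hh))
```

    The sign and shape of SIC(t), together with MIC, are what distinguish serial, parallel, and coactive architectures and their stopping rules.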

  2. Practical Applications of Space Systems, Supporting Paper 13: Information Services and Information Processing.

    ERIC Educational Resources Information Center

    National Academy of Sciences - National Research Council, Washington, DC. Assembly of Engineering.

    This report summarizes the findings of one of fourteen panels that studied progress in space science applications and defined user needs potentially capable of being met by space-system applications. The study was requested by the National Aeronautics and Space Administration (NASA) and was conducted by the Space Applications Board. The panels…

  3. Geographic information system programs for use in the water-supply-allocation permitting process

    USGS Publications Warehouse

    Dunne, Paul; Price, C.V.

    1995-01-01

    Computer programs designed for use in a geographic information system as an aid in the water-supply- allocation permitting process are described. These programs were developed by the U.S. Geological Survey during a project conducted in cooperation with the New Jersey Department of Environmental Protection. The programs enable a user to display proposed water-supply-allocation sites in a defined area together with present sites and important hydrologic and geographic features on a computer screen or on hardcopy plots. The programs are menu-driven and do not require familiarity with geographic information systems. Source codes for the programs are included in appendixes.

  4. Application of symbolic processing to command and control: An Advanced Information Presentation System. Volume 1: Overview

    NASA Astrophysics Data System (ADS)

    Zdybel, F.; Gibbons, J.; Greenfeld, N.; Yonke, M.

    1981-08-01

    This report describes the work performed in the second year of the three-year contract to explore the application of symbolic processing to command and control (C2); specifically, the graphics interface between the C2 user and a complex C2 decision support system. In Volume 1, the goals and approaches used in the design of the prototype system, AIPS (Advanced Information Presentation System), are discussed, as well as the year's efforts to extend the prototype. An overview of the current AIPS system is also provided.

  5. A CMOS Visual Sensing System for Welding Control and Information Acquirement in SMAW Process

    NASA Astrophysics Data System (ADS)

    Anren, Yao; Zhen, Luo; Sansan, Ao

    Sequential research on the visual information of manual arc welding pool dynamics is presented in this paper. An optical inspection system for monitoring the shielded manual arc welding (SMAW) process is described. The system consists of a vision sensor comprising a Complementary Metal Oxide Semiconductor (CMOS) camera and lenses, image processing algorithms, and a computer controller. During welding, an image of the weld pool and its vicinity is captured during the base-current phase of the welding power. Experimental results showed that the temperature signal varies greatly when instabilities of the weld pool cause weld defects. The visual information acquisition methods focus on computer vision sensing, image processing, and extraction of weld pool surface characteristics from single-frame pool images by particular algorithms; control strategies are then developed to control weld pool dynamics during SMAW.
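
    A minimal sketch of one characteristic-extraction step of the kind described above is given below: it estimates the apparent weld pool width from a grayscale frame by thresholding. The threshold value and the image source are assumptions; this is not the authors' algorithm.

```python
# Minimal sketch (not the authors' algorithm) of extracting a simple weld pool
# characteristic, its apparent width, from a grayscale frame by thresholding
# and measuring the widest bright run per row.
import numpy as np

def weld_pool_width(frame, threshold=200):
    """Return the maximum horizontal extent (in pixels) of the bright pool region."""
    mask = np.asarray(frame) >= threshold          # bright molten-pool pixels
    widths = []
    for row in mask:
        cols = np.flatnonzero(row)
        if cols.size:
            widths.append(cols[-1] - cols[0] + 1)  # extent of bright pixels in this row
    return max(widths) if widths else 0
```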

  6. Information services and information processing

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Attempts made to design and extend space system capabilities are reported. Special attention was given to establishing user needs for information or services which might be provided by space systems. Data given do not attempt to detail scientific, technical, or economic bases for the needs expressed by the users.

  7. Potential capabilities for compression of information of certain data processing systems

    NASA Technical Reports Server (NTRS)

    Khodarev, Y. K.; Yevdokimov, V. P.; Pokras, V. M.

    1974-01-01

    This article studies a generalized block diagram of a spacecraft data collection and processing system in which a number of sensors or outputs of scientific instruments are cyclically interrogated by a commutator, methods of writing the supplementary information into a frame using the example of a hypothetical telemetry system, and the influence of the statistics of the number of active channels in a frame on the frame compression factor. Separating the data compression factor of the spacecraft collection and processing system into two parts, as done in this work, allows the compression factor of an active frame to be determined as a function not only of the statistics of channel activity in the telemetry frame, but also of the method used to introduce the additional address and time information into each frame.
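
    A worked form of such a two-part decomposition (with notation invented here, not taken from the article) could read

    \[
    K = K_{\mathrm{act}}\, K_{\mathrm{adr}}, \qquad
    K_{\mathrm{act}} = \frac{N}{\langle n_{\mathrm{act}}\rangle}, \qquad
    K_{\mathrm{adr}} = \frac{\langle n_{\mathrm{act}}\rangle\, b}{\langle n_{\mathrm{act}}\rangle\, b + b_{\mathrm{adr}} + b_{\mathrm{time}}},
    \]

    where N is the number of channels in a full frame, \(\langle n_{\mathrm{act}}\rangle\) the mean number of active channels, b the word length per channel, and \(b_{\mathrm{adr}}\), \(b_{\mathrm{time}}\) the bits of address and time information added to each compressed frame; the product reduces to the ratio of full-frame bits to active-frame bits.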

  8. Integration of health care process analysis in the design of a clinical information system: applying to the blood transfusion process.

    PubMed Central

    Staccini, P.; Joubert, M.; Quaranta, J. F.; Fieschi, D.; Fieschi, M.

    2000-01-01

    Hospital information systems have to support quality improvement objectives. The requirements of the system have to meet users' needs in relation to both the quality (efficacy, conformity, safety) and the monitoring of all health care activities (traceability). Information analysts need complementary methods to conceptualize clinical information systems that provide actors with immediate individual benefits and guide collective behavioral changes. A methodology is proposed to elicit users' needs using a process-oriented analysis, and it is applied to the field of blood transfusion. We defined a process data model, the main components of which are: activities, resources, constraints, guidelines and indicators. Although some aspects of activity, such as "where", "what else", and "why" are poorly represented by the data model alone, this method of requirement elicitation fits the dynamic of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for this approach to be generalised within the organisation, for the processes to be interrelated, and for their characteristics to be assessed. PMID:11079999
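
    The five-component process data model lends itself to a simple data-structure sketch. The classes below are hypothetical (attribute and example names are invented); they only illustrate how activities, resources, constraints, guidelines and indicators might be grouped.

```python
# Hypothetical sketch of the paper's process data model as plain data classes;
# names beyond the five listed components are assumptions for illustration.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Activity:
    name: str                                             # e.g. "issue blood unit"
    resources: List[str] = field(default_factory=list)    # people, devices, blood products
    constraints: List[str] = field(default_factory=list)  # e.g. "ABO compatibility checked"
    guidelines: List[str] = field(default_factory=list)   # references to protocols
    indicators: List[str] = field(default_factory=list)   # traceability / quality measures

@dataclass
class CareProcess:
    name: str
    activities: List[Activity] = field(default_factory=list)

transfusion = CareProcess(
    name="blood transfusion",
    activities=[Activity(name="pre-transfusion bedside check",
                         constraints=["patient identity verified"],
                         indicators=["time of check recorded"])],
)
```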

  10. Weather Information Processing

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Science Communications International (SCI), formerly General Science Corporation, has developed several commercial products based upon experience acquired as a NASA Contractor. Among them are METPRO, a meteorological data acquisition and processing system, which has been widely used, RISKPRO, an environmental assessment system, and MAPPRO, a geographic information system. METPRO software is used to collect weather data from satellites, ground-based observation systems and radio weather broadcasts to generate weather maps, enabling potential disaster areas to receive advance warning. GSC's initial work for NASA Goddard Space Flight Center resulted in METPAK, a weather satellite data analysis system. METPAK led to the commercial METPRO system. The company also provides data to other government agencies, U.S. embassies and foreign countries.

  11. The impacts of standardized information management processes on NDA and NDE waste characterization systems

    SciTech Connect

    Lemons, C.J.; Conrad, K.W.

    1995-12-31

    The evolution of standards-based information management has not fully penetrated nondestructive assay and nondestructive examination operating platforms. Interoperability concepts, which are fundamental to successful information management architecture and structure, are sweeping the markets and redefining the computing industries. The need for the federal government to improve its effectiveness from an informed position has invoked Information Management (IM) concepts into the federal government's strategies and policies. These strategies and policies are becoming regulatory mandates which are to be imposed contractually, directly delegating responsibility to the contractors to ensure compliance. Participants in the waste clean-up arena will need to ensure that their analysis systems and reporting practices fulfill these emerging life-cycle information management requirements, both to meet the customer's need and to protect themselves from legal liability. The challenge faced today by the NDA/NDE industry is to adopt these IM concepts and utilize them in the assay systems and structures. The robust systems developed to perform the NDA/NDE analyses must be equally robust in addressing these regulatory and contractual mandates. The greatest impact of the regulations in the NDA/NDE arena will be to standardize and produce standards-based analysis reports that include integration capability with all the elements of NDA/NDE processes and are interchangeable with all other ancillary processes.

  12. A salient information processing system for bionic eye with application to obstacle avoidance.

    PubMed

    Stacey, Ashley; Li, Yi; Barnes, Nick

    2011-01-01

    In this paper we present a visual processing system for a bionic eye with a focus on obstacle avoidance. The bionic eye aims at restoring the sense of vision to people living with blindness and low vision. However, current hardware implant technology limits the image resolution of the electrical stimulation device to very low values (e.g., a 100-electrode array, approximately 12 × 9 pixels). Therefore, we need a visual processing unit that extracts salient information in an unknown environment for assisting patients in daily tasks such as obstacle avoidance. We implemented a fully portable system that includes a camera for capturing videos, a laptop for processing information using a state-of-the-art saliency detection algorithm, and a head-mounted display to visualize results. The experimental environment consists of a number of objects, such as shoes, boxes, and foot stands, on a textured ground plane. Our results show that the system efficiently processes the images, effectively identifies the obstacles, and eventually provides useful information for obstacle avoidance.
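
    One way to picture the resolution constraint discussed above is to reduce a saliency map to the electrode grid. The sketch below block-averages a saliency map onto an assumed 12 × 9 grid and thresholds it; the grid size, threshold, and function names are assumptions, and this is not the device's actual processing chain.

```python
# Illustrative sketch: reducing a high-resolution saliency map to the very low
# resolution of a stimulating electrode array (assumed 12 x 9), so that the
# brightest phosphenes mark likely obstacles.
import numpy as np

def saliency_to_electrodes(saliency, grid=(9, 12), threshold=0.5):
    """Block-average a saliency map onto the electrode grid and binarize it."""
    saliency = np.asarray(saliency, dtype=float)
    h, w = saliency.shape
    gh, gw = grid
    # Crop so the image tiles evenly, then average each block.
    saliency = saliency[: (h // gh) * gh, : (w // gw) * gw]
    blocks = saliency.reshape(gh, h // gh, gw, w // gw).mean(axis=(1, 3))
    blocks = (blocks - blocks.min()) / (np.ptp(blocks) + 1e-9)  # normalize to [0, 1]
    return blocks >= threshold                                  # True = stimulate electrode
```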

  13. Mobile mammography: An evaluation of organizational, process, and information systems challenges.

    PubMed

    Browder, Casey; Eberth, Jan M; Schooley, Benjamin; Porter, Nancy R

    2015-03-01

    The purpose of this case study was to evaluate the information systems, personnel, and processes involved in mobile mammography settings, and offer recommendations to improve efficiency and satisfaction among patients and staff. Data include on-site observations, interviews, and an electronic medical record review of a hospital that offers both mobile and fixed facility mammography services to its community. The optimal expectations for the process of mobile mammography from multiple perspectives were defined as (1) patient receives mammogram the day of their visit, (2) patient has efficient intake process with little wait time, (3) follow-up is completed and timely, (4) site contact and van staff are satisfied with van visit and choose to schedule future visits, and (5) the MMU is able to assess its performance and set goals for improvement. Challenges that prevent the realization of those expectations include a low patient pre-registration rate, difficulty obtaining required physician orders, frequent information system downtime/Internet connectivity issues, ill-defined organizational communication/roles, insufficient site host/patient education, and disparate organizational and information systems. Our recommendations include employing a dedicated mobile mammography team for end-to-end oversight, mitigating for system connectivity issues, allowing for patient self-referrals, integrating scheduling and registration processes, and a focused approach to educating site hosts and respective patients about expectations for the day of the visit. The MMU is an important community resource; we recommend simple process improvements and information flow improvements to further enable the MMU's goals.

  14. On-line systems in the information process of INSERM researchers.

    PubMed

    Sturge-Moore, L

    1980-01-01

    The process by which biomedical researchers at INSERM search for the information they need in the course of their research work is described from the evidence of a two-part questionnaire-based survey. In particular, the specific problems involved in the use of on-line retrieval systems, such as MEDLINE, are investigated, with a view to improving and expanding the use of such systems. Owing to the very nature of documentation practices, the situation seems likely to evolve towards greater use of these systems in the future, but this evolution will remain slow.

  15. Realization of quantum information processing in quantum star network constituted by superconducting hybrid systems

    NASA Astrophysics Data System (ADS)

    Li, Wenlin; Li, Chong; Song, Heshan

    2016-12-01

    In the framework of superconducting hybrid systems, we construct a star quantum network in which a superconducting transmission line resonator serves as a quantum bus and multiple units, each constituted by a transmission line resonator and superconducting qubits, serve as the carriers of quantum information. We further propose and analyze a theoretical scheme to realize quantum information processing in this quantum network. The coupling between the bus and any two superconducting qubits can be selectively implemented based on the dark state resonances of the highly dissipative transmission line resonators, and quantum information processing between any two units can be completed in one step. As examples, the transmission of unknown quantum states and the preparation of quantum entanglement in this quantum network are investigated. Finally, we present our simulation results and the relevant discussion in order to show the advantages of this kind of quantum network.

  16. Baselining the New GSFC Information Systems Center: The Foundation for Verifiable Software Process Improvement

    NASA Technical Reports Server (NTRS)

    Parra, A.; Schultz, D.; Boger, J.; Condon, S.; Webby, R.; Morisio, M.; Yakimovich, D.; Carver, J.; Stark, M.; Basili, V.; Kraft, S.

    1999-01-01

    This paper describes a study performed at the Information System Center (ISC) in NASA Goddard Space Flight Center. The ISC was set up in 1998 as a core competence center in information technology. The study aims at characterizing people, processes and products of the new center, to provide a basis for proposing improvement actions and comparing the center before and after these actions have been performed. The paper presents the ISC, goals and methods of the study, results and suggestions for improvement, through the branch-level portion of this baselining effort.

  18. How gamma radiation processing systems are benefiting from the latest advances in information technology

    NASA Astrophysics Data System (ADS)

    Gibson, Wayne H.; Levesque, Daniel

    2000-03-01

    This paper discusses how gamma irradiation plants are putting the latest advances in computer and information technology to use for better process control, cost savings, and strategic advantages. Some irradiator operations are gaining significant benefits by integrating computer technology and robotics with real-time information processing, multi-user databases, and communication networks. The paper reports on several irradiation facilities that are making good use of client/server LANs, user-friendly graphics interfaces, supervisory control and data acquisition (SCADA) systems, distributed I/O with real-time sensor devices, trending analysis, real-time product tracking, dynamic product scheduling, and automated dosimetry reading. These plants are lowering costs by fast and reliable reconciliation of dosimetry data, easier validation to GMP requirements, optimizing production flow, and faster release of sterilized products to market. There is a trend in the manufacturing sector towards total automation using "predictive process control". Real-time verification of process parameters "on-the-run" allows control parameters to be adjusted appropriately, before the process strays out of limits. Applying this technology to the gamma radiation process, control will be based on monitoring the key parameters such as time, and making adjustments during the process to optimize quality and throughput. Dosimetry results will be used as a quality control measurement rather than as a final monitor for the release of the product. Results are correlated with the irradiation process data to quickly and confidently reconcile variations. Ultimately, a parametric process control system utilizing responsive control, feedback and verification will not only increase productivity and process efficiency, but can also result in operating within tighter dose control set points.
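
    A hedged sketch of the parametric check the paper alludes to is shown below: the delivered dose is estimated from monitored parameters (dwell time and a calibrated dose rate) and compared against release limits while the run is still in progress. All numbers and names are invented for illustration.

```python
# Hedged sketch of an in-process "parametric process control" check: estimate
# the delivered dose from monitored parameters and compare it to release
# limits before the run strays out of specification. Values are invented.
def parametric_dose_check(dwell_time_s, dose_rate_kGy_per_h,
                          min_dose_kGy, max_dose_kGy):
    """Return (estimated_dose, action) for an in-process parametric check."""
    estimated_dose = dose_rate_kGy_per_h * dwell_time_s / 3600.0
    if estimated_dose < min_dose_kGy:
        action = "extend dwell time"
    elif estimated_dose > max_dose_kGy:
        action = "shorten dwell time / flag for review"
    else:
        action = "within limits"
    return estimated_dose, action

# Example: a 2.5 h dwell at 10 kGy/h against a 22-28 kGy specification window.
print(parametric_dose_check(2.5 * 3600, 10.0, 22.0, 28.0))  # (25.0, 'within limits')
```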

  19. Implementation of a configurable laboratory information management system for use in cellular process development and manufacturing.

    PubMed

    Russom, Diana; Ahmed, Amira; Gonzalez, Nancy; Alvarnas, Joseph; DiGiusto, David

    2012-01-01

    Regulatory requirements for the manufacturing of cell products for clinical investigation require a significant level of record-keeping, starting early in process development and continuing through to the execution and requisite follow-up of patients on clinical trials. Central to record-keeping is the management of documentation related to patients, raw materials, processes, assays and facilities. To support these requirements, we evaluated several laboratory information management systems (LIMS), including their cost, flexibility, regulatory compliance, ongoing programming requirements and ability to integrate with laboratory equipment. After selecting a system, we performed a pilot study to develop a user-configurable LIMS for our laboratory in support of our pre-clinical and clinical cell-production activities. We report here on the design and utilization of this system to manage accrual with a healthy blood-donor protocol, as well as manufacturing operations for the production of a master cell bank and several patient-specific stem cell products. The system was used successfully to manage blood donor eligibility, recruiting, appointments, billing and serology, and to provide annual accrual reports. Quality management reporting features of the system were used to capture, report and investigate process and equipment deviations that occurred during the production of a master cell bank and patient products. Overall the system has served to support the compliance requirements of process development and phase I/II clinical trial activities for our laboratory and can be easily modified to meet the needs of similar laboratories.

  20. Designing web services in health information systems: from process to application level.

    PubMed

    Mykkänen, Juha; Riekkinen, Annamari; Sormunen, Marko; Karhunen, Harri; Laitinen, Pertti

    2007-01-01

    Service-oriented architectures (SOAs) and web service technologies have been proposed to respond to some central interoperability challenges of heterogeneous health information systems (HIS). We propose a model which we are using to define services and solutions for healthcare applications from the requirements in the healthcare processes. Focusing on the transition from the process level of the model to the application level, we also present some central design considerations, which can be used to guide the design of service-based interoperability. We illustrate these aspects with examples from our current work from the service-enabled HIS.

  1. Designing web services in health information systems: from process to application level.

    PubMed

    Mykkänen, Juha; Riekkinen, Annamari; Laitinen, Pertti; Karhunen, Harri; Sormunen, Marko

    2005-01-01

    Service-oriented architectures (SOA) and web service technologies have been proposed to respond to some central interoperability challenges of heterogeneous health information systems (HIS). We propose a model, which we are using to define services and solutions for healthcare applications from the requirements in the healthcare processes. Focusing on the transition from the process level of the model to the application level, we also present some central design considerations, which can be used to guide the design of service-based interoperability and illustrate these aspects with examples from our current work in service-enabled HIS.

  2. User's manual for the National Water Information System of the U.S. Geological Survey: Automated Data Processing System (ADAPS)

    USGS Publications Warehouse

    ,

    2003-01-01

    The Automated Data Processing System (ADAPS) was developed for the processing, storage, and retrieval of water data, and is part of the National Water Information System (NWIS) developed by the U.S. Geological Survey. NWIS is a distributed water database in which data can be processed over a network of computers at U.S. Geological Survey offices throughout the United States. NWIS comprises four subsystems: ADAPS, the Ground-Water Site Inventory System (GWSI), the Water-Quality System (QWDATA), and the Site-Specific Water-Use Data System (SWUDS). This section of the NWIS User's Manual describes the automated data processing of continuously recorded water data, which primarily are surface-water data; however, the system also allows for the processing of water-quality and ground-water data. This manual describes various components and features of the ADAPS, and provides an overview of the data processing system and a description of the system framework. The components and features included are: (1) data collection and processing, (2) ADAPS menus and programs, (3) command line functions, (4) steps for processing station records, (5) postprocessor programs control files, (6) the standard format for transferring and entering unit and daily values, and (7) relational database (RDB) formats.
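
    The tab-delimited RDB transfer format mentioned in item (6) can be read with a few lines of code. The sketch below skips '#' comment lines, takes the first non-comment row as the header and the next as the column-definition row, and returns the remaining rows as dictionaries; treat it as an illustration rather than a complete ADAPS-compatible reader, and the column name in the usage comment is an assumption.

```python
# Sketch of reading the tab-delimited RDB format used for transferring unit
# and daily values: '#' comment lines, a tab-separated header row, a
# column-definition row, then data rows. Illustration only.
import csv

def read_rdb(path):
    """Return a list of dicts, one per data row of an RDB file."""
    with open(path, newline="") as fh:
        lines = [ln for ln in fh if not ln.startswith("#")]
    reader = csv.reader(lines, delimiter="\t")
    header = next(reader)
    next(reader)  # skip the column-definition row (e.g. "5s  15d  ...")
    return [dict(zip(header, row)) for row in reader]

# Example (column name assumed): rows = read_rdb("daily_values.rdb"); print(rows[0]["datetime"])
```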

  3. Conceptual models of information processing

    NASA Technical Reports Server (NTRS)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  5. The Information Adaptive System - A demonstration of real-time onboard image processing

    NASA Technical Reports Server (NTRS)

    Thomas, G. L.; Carney, P. C.; Meredith, B. D.

    1983-01-01

    The Information Adaptive System (IAS) program has the objective of developing and demonstrating, at the brassboard level, an architecture which can be used to perform advanced signal processing functions on board the spacecraft. Particular attention is given to the processing of high-speed multispectral imaging data in real time, and the development of advanced technology which could be employed for future space applications. An IAS functional description is provided, and questions of radiometric correction are examined. Problems of data packetization are considered along with data selection, a distortion coefficient processor, an adaptive system controller, an image processing demonstration system, a sensor simulator and output data buffer, a test support and demonstration controller, and IAS demonstration operating modes.
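
    One of the IAS preprocessing steps named above, sensor response nonuniformity correction, reduces to a per-detector gain and offset adjustment. The sketch below is a generic illustration with placeholder calibration values, not the IAS implementation.

```python
# Generic illustration of radiometric (detector response nonuniformity)
# correction: each detector's raw counts are corrected with per-detector gain
# and offset coefficients. Calibration values below are placeholders.
import numpy as np

def radiometric_correction(raw, gain, offset):
    """Apply per-detector gain/offset correction to a (detectors x samples) array."""
    raw = np.asarray(raw, dtype=float)
    return gain[:, None] * (raw - offset[:, None])

# Example with three detectors and four samples of invented data:
raw = np.array([[101.0,  99.0, 100.0, 102.0],
                [ 52.0,  50.0,  51.0,  49.0],
                [200.0, 198.0, 202.0, 199.0]])
gain = np.array([1.00, 2.00, 0.50])   # equalize detector responsivities
offset = np.array([1.0, 2.0, 0.0])    # dark-level offsets
print(radiometric_correction(raw, gain, offset))
```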

  6. Kuhlthau's Information Search Process.

    ERIC Educational Resources Information Center

    Shannon, Donna

    2002-01-01

    Explains Kuhlthau's Information Search Process (ISP) model which is based on a constructivist view of learning and provides a framework for school library media specialists for the design of information services and instruction. Highlights include a shift from library skills to information skills; attitudes; process approach; and an interview with…

  7. Controlling Atomic, Solid-State and Hybrid Systems for Quantum Information Processing

    NASA Astrophysics Data System (ADS)

    Gullans, Michael John

    Quantum information science involves the use of precise control over quantum systems to explore new technologies. However, as quantum systems are scaled up they require an ever deeper understanding of many-body physics to achieve the required degree of control. Current experiments are entering a regime which requires active control of a mesoscopic number of coupled quantum systems or quantum bits (qubits). This thesis describes several approaches to this goal and shows how mesoscopic quantum systems can be controlled and utilized for quantum information tasks. The first system we consider is the nuclear spin environment of GaAs double quantum dots containing two electrons. We show that through appropriate control of dynamic nuclear polarization one can prepare the nuclear spin environment in three distinct collective quantum states which are useful for quantum information processing with electron spin qubits. We then investigate a hybrid system in which an optical lattice is formed in the near field scattering off an array of metallic nanoparticles by utilizing the plasmonic resonance of the nanoparticles. We show that such a system would realize new regimes of dense, ultra-cold quantum matter and can be used to create a quantum network of atoms and plasmons. Finally we investigate quantum nonlinear optical systems. We show that the intrinsic nonlinearity for plasmons in graphene can be large enough to make a quantum gate for single photons. We also consider two nonlinear optical systems based on ultracold gases of atoms. In one case, we demonstrate an all-optical single photon switch using cavity quantum electrodynamics (QED) and slow light. In the second case, we study few photon physics in strongly interacting Rydberg polariton systems, where we demonstrate the existence of two and three photon bound states and study their properties.

  8. Intelligent information system: for automation of airborne early warning crew decision processes

    NASA Astrophysics Data System (ADS)

    Chin, Hubert H.

    1991-03-01

    This paper describes an automation of AEW crew decision processes implemented in an intelligent information system for an advanced AEW aircraft platform. The system utilizes the existing AEW aircraft database and knowledge base such that the database can provide sufficient data to solve the sizable AEW problems. A database management system is recommended for managing the large amount of data. In order to expand a conventional expert system so that it has the capacity to solve the sizable problems, a cooperative model is required to coordinate the five expert systems involved in the cooperative decision process. The proposed model partitions the traditional knowledge base into a set of disjoint portions which cover the needs of and are shared by the expert systems. Internal communications take place on common shared portions. A cooperative algorithm is required for update synchronization and concurrency control. The purpose of this paper is to present a cooperative model for enhancing standard rule-based expert systems to make cooperative decisions and to superimpose the global knowledge base and database in a more natural fashion. The tools being used for developing the prototype are the Ada programming language and the ORACLE relational database management system.

  9. Language Processing in Information Retrieval.

    ERIC Educational Resources Information Center

    Doszkocs, Tamase

    1986-01-01

    Examines role and contributions of natural-language processing in information retrieval and artificial intelligence research in context of large operational information retrieval systems and services. State-of-the-art information retrieval systems combining the functional capabilities of conventional inverted file term adjacency approach with…

  10. Evaluation of an intravenous preparation information system for improving the reconstitution and dilution process.

    PubMed

    Jo, Yun Hee; Shin, Wan Gyoon; Lee, Ju-Yeun; Yang, Bo Ram; Yu, Yun Mi; Jung, Sun Hoi; Kim, Hyang Sook

    2016-10-01

    There are very few studies reporting the impact of providing intravenous (IV) preparation information on quality use of antimicrobials, particularly regarding their reconstitution and dilution. Therefore, to improve these processes in IV antimicrobial administration, an IV preparation information system (IPIS) was implemented in a hospital. We aimed to evaluate the effect of improving reconstitution and dilution by implementing an IPIS in the electronic medical record (EMR) system. Prescriptions and activity records of nurses for injectable antimicrobials that required reconstitution and dilution for IV preparation from January 2008 to December 2013 were retrieved from the EMR, and assessed based on packaging label information for reconstituting and diluting solutions. We defined proper reconstitution and dilution as occurring when the reconstitution and dilution solutions prescribed were consistent with the nurses' acting records. The types of intervention in the IPIS were as follows: a pop-up alert for proper reconstitution and passive guidance for proper dilution. We calculated the monthly proper reconstitution rate (PRR) and proper dilution rate (PDR) and evaluated the changes in these rates and trends using interrupted time series analyses. Prior to the initiation of the reconstitution alert and dilution information, the PRR and PDR were 12.7 and 46.1%, respectively. The reconstitution alert of the IPIS rapidly increased the PRR by 41% (p<0.001); after several months, the PRR then declined by 0.9% per month (p=0.013). However, there was no significant change in the rate or trend of the PDR during the study period. This study demonstrated that the provision of reconstitution alerts by the IPIS contributed to improving the reconstitution process of IV antimicrobial injection administration. However, providing passive information on dilution solutions was ineffective. Furthermore, solutions to ensure the continuous effectiveness of alert systems are warranted.
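
    The interrupted time series analysis referred to above is commonly set up as a segmented regression with a level-change and a slope-change term at the intervention point. The sketch below is a generic illustration of that form, not the study's analysis code, and the monthly rates in the example are invented.

```python
# Generic segmented-regression form often used for interrupted time series:
# level change (b2) and slope change (b3) after the intervention month.
import numpy as np

def segmented_regression(rates, intervention_index):
    """Fit rate ~ b0 + b1*t + b2*post + b3*(t - t0)*post by ordinary least squares."""
    rates = np.asarray(rates, dtype=float)
    t = np.arange(len(rates), dtype=float)
    post = (t >= intervention_index).astype(float)
    time_after = np.where(post > 0, t - intervention_index, 0.0)
    X = np.column_stack([np.ones_like(t), t, post, time_after])
    coeffs, *_ = np.linalg.lstsq(X, rates, rcond=None)
    return dict(zip(["baseline", "pre_trend", "level_change", "slope_change"], coeffs))

# Example with invented monthly rates around an alert introduced at month 12:
rates = [12 + 0.1 * m for m in range(12)] + [53 - 0.9 * m for m in range(12)]
print(segmented_regression(rates, intervention_index=12))
```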

  11. Materials And Processes Technical Information System (MAPTIS) LDEF materials data base

    NASA Technical Reports Server (NTRS)

    Funk, Joan G.; Strickland, John W.; Davis, John M.

    1993-01-01

    A preliminary Long Duration Exposure Facility (LDEF) Materials Data Base was developed by the LDEF Materials Special Investigation Group (MSIG). The LDEF Materials Data Base is envisioned to eventually contain the wide variety and vast quantity of materials data generated from LDEF. The data is searchable by optical, thermal, and mechanical properties, exposure parameters (such as atomic oxygen flux) and author(s) or principal investigator(s). The LDEF Materials Data Base was incorporated into the Materials and Processes Technical Information System (MAPTIS). MAPTIS is a collection of materials data which has been computerized and is available to engineers, designers, and researchers in the aerospace community involved in the design and development of spacecraft and related hardware. The LDEF Materials Data Base is described and step-by-step example searches using the data base are included. Information on how to become an authorized user of the system is included.

  12. Markov and non-Markov processes in complex systems by the dynamical information entropy

    NASA Astrophysics Data System (ADS)

    Yulmetyev, R. M.; Gafarov, F. M.

    1999-12-01

    We consider the Markov and non-Markov processes in complex systems by the dynamical information Shannon entropy (DISE) method. The influence and important role of the two mutually dependent channels of entropy alternation (creation or generation of correlation) and anti-correlation (destroying or annihilation of correlation) have been discussed. The developed method has been used for the analysis of complex systems of various natures: slow neutron scattering in liquid cesium; psychology (short-time numeral and pattern human memory and the effect of stress on the dynamical tapping test); random dynamics of RR-intervals in human ECG (the problem of diagnosing various diseases of the human cardiovascular system); and chaotic dynamics of the parameters of financial markets and ecological systems.
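
    For reference, the Shannon entropy underlying such information-theoretic measures (the paper's specific time-dependent construction is not reproduced here) is

    \[ S = -\sum_{i} p_i \ln p_i , \]

    where \(p_i\) denotes the probability of the i-th state of the observed dynamical variable.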

  13. MILITARY INFORMATION SYSTEMS,

    DTIC Science & Technology

    upward are usually indications of how effectively the system is developing or operating. The use of computers in information systems tends to increase...computers into information systems must always begin at the lowest level of aggregation in the job hierarchy. Only those information-processing jobs

  14. A highly reliable, autonomous data communication subsystem for an advanced information processing system

    NASA Technical Reports Server (NTRS)

    Nagle, Gail; Masotto, Thomas; Alger, Linda

    1990-01-01

    The need to meet the stringent performance and reliability requirements of advanced avionics systems has frequently led to implementations which are tailored to a specific application and are therefore difficult to modify or extend. Furthermore, many integrated flight critical systems are input/output intensive. By using a design methodology which customizes the input/output mechanism for each new application, the cost of implementing new systems becomes prohibitively expensive. One solution to this dilemma is to design computer systems and input/output subsystems which are general purpose, but which can be easily configured to support the needs of a specific application. The Advanced Information Processing System (AIPS), currently under development, has these characteristics. The design and implementation of the prototype I/O communication system for AIPS is described. AIPS addresses reliability issues related to data communications by the use of reconfigurable I/O networks. When a fault or damage event occurs, communication is restored to functioning parts of the network and the failed or damaged components are isolated. Performance issues are addressed by using a parallelized computer architecture which decouples Input/Output (I/O) redundancy management and I/O processing from the computational stream of an application. The autonomous nature of the system derives from the highly automated and independent manner in which I/O transactions are conducted for the application as well as from the fact that the hardware redundancy management is entirely transparent to the application.
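
    A toy illustration of the reconfiguration idea, not the AIPS algorithm, is sketched below: after links fail, the set of nodes still reachable over healthy links is recomputed so that communication can be restored to the functioning part of the network and the failed elements isolated. Node and link names are invented.

```python
# Toy illustration (not the AIPS algorithm) of network reconfiguration:
# recompute which nodes remain reachable over healthy links after failures,
# so traffic can be rerouted around the isolated components.
from collections import deque

def reachable_nodes(nodes, links, failed_links, root):
    """Breadth-first search over healthy links only; returns the reachable set."""
    healthy = {frozenset(l) for l in links} - {frozenset(l) for l in failed_links}
    adjacency = {n: set() for n in nodes}
    for a, b in (tuple(l) for l in healthy):
        adjacency[a].add(b)
        adjacency[b].add(a)
    seen, queue = {root}, deque([root])
    while queue:
        for nxt in adjacency[queue.popleft()] - seen:
            seen.add(nxt)
            queue.append(nxt)
    return seen

# Example: with link B-C failed, node C stays reachable via D.
nodes = ["A", "B", "C", "D"]
links = [("A", "B"), ("B", "C"), ("A", "D"), ("D", "C")]
print(reachable_nodes(nodes, links, failed_links=[("B", "C")], root="A"))
```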

  16. Behavioural evidence for parallel information processing in the visual system of insects.

    PubMed

    Zhang, S; Srinivasan, M V

    1993-01-01

    Many flying insects display remarkable visual agility in capturing prey or pursuing a potential mate. They are capable of detecting, recognising, tracking and capturing a rapidly moving object on the wing. These manoeuvres are usually completed in a couple of seconds. The interval of time between the absorption of light quanta by the photoreceptors and the generation of an appropriate behavioural response is very short, encompassing only a few tens of milliseconds. In this time the visual nervous system has abstracted the essential features of the object, and recognized it (where appropriate), or measured its movement and computed an interception course. As an elementary unit of computation, we know that a neuron in the nervous system is considerably slower than, say, a flip-flop in the CPU of a modern computer. However, it is evident from the visual performance of an insect that the nervous system as a whole processes optical information much faster than a modern computer does. Rapid processing of visual information by animals therefore has to be attributed to the structure and the modus operandi of the nervous system.

  17. NASA End-to-End Data System /NEEDS/ information adaptive system - Performing image processing onboard the spacecraft

    NASA Technical Reports Server (NTRS)

    Kelly, W. L.; Howle, W. M.; Meredith, B. D.

    1980-01-01

    The Information Adaptive System (IAS) is an element of the NASA End-to-End Data System (NEEDS) Phase II and is focused toward onboard image processing. Since the IAS is a data preprocessing system which is closely coupled to the sensor system, it serves as a first step in providing a 'Smart' imaging sensor. Some of the functions planned for the IAS include sensor response nonuniformity correction, geometric correction, data set selection, data formatting, packetization, and adaptive system control. The inclusion of these sensor data preprocessing functions onboard the spacecraft will significantly improve the extraction of information from the sensor data in a timely and cost-effective manner and provide the opportunity to design sensor systems which can be reconfigured in near real time for optimum performance. The purpose of this paper is to present the preliminary design of the IAS and the plans for its development.
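
    As an illustration of the kind of sensor data preprocessing listed above, the sketch below (Python/NumPy, my own toy example rather than the IAS design) applies a per-detector nonuniformity correction using assumed gain and offset calibration tables and then packetizes the corrected scan line into fixed-size packets.

        # Illustrative sketch (not the IAS design): per-detector nonuniformity
        # correction followed by packetization of a corrected scan line.
        import numpy as np

        def nonuniformity_correction(raw_line, gain, offset):
            """Correct detector-to-detector response: corrected = gain * (raw - offset)."""
            return gain * (raw_line - offset)

        def packetize(data, packet_size=64):
            """Split a 1-D array of samples into fixed-size packets (last may be short)."""
            flat = np.asarray(data).ravel()
            return [flat[i:i + packet_size] for i in range(0, flat.size, packet_size)]

        rng = np.random.default_rng(0)
        raw = rng.integers(100, 200, size=512).astype(float)   # one simulated scan line
        gain = rng.normal(1.0, 0.02, size=512)                 # assumed calibration tables
        offset = rng.normal(5.0, 1.0, size=512)

        corrected = nonuniformity_correction(raw, gain, offset)
        packets = packetize(corrected, packet_size=64)
        print(len(packets), packets[0].shape)                  # 8 packets of 64 samples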

  18. Information Processing - Administrative Data Processing

    NASA Astrophysics Data System (ADS)

    Bubenko, Janis

    A three-semester, 60-credit course package in the topic of Administrative Data Processing (ADP), offered in 1966 at Stockholm University (SU) and the Royal Institute of Technology (KTH), is described. The package had an information systems engineering orientation. The first semester focused on datalogical topics, while the second semester focused on infological topics. The third semester aimed to deepen the students' knowledge of different parts of ADP and to have them write a bachelor thesis. The concluding section of this paper discusses various aspects of the department's first course effort. The course package led to a concretisation of our discipline and gave our discipline an identity. Our education seemed modern, "just in time", and well adapted to practical needs. The course package formed the first concrete activity of a group of young teachers and researchers. In a forty-year perspective, these people have further developed the department and the topic into an internationally well-reputed body of knowledge and research. The department has produced more than thirty professors and more than one hundred doctoral degrees.

  19. Improving waste management through a process of learning: the South African waste information system.

    PubMed

    Godfrey, Linda; Scott, Dianne

    2011-05-01

    Piloting of the South African Waste Information System (SAWIS) provided an opportunity to research whether the collection of data for a national waste information system could, through a process of learning, change the way that waste is managed in the country, such that there is a noticeable improvement. The interviews with officials from municipalities and private waste companies, conducted as part of the piloting of the SAWIS, highlighted that certain organizations, typically private waste companies, have been successful in collecting waste data. Through a process of learning, these organizations have utilized this waste data to inform and manage their operations. The drivers of such data collection efforts were seen to be financial (business) sustainability and environmental reporting obligations, particularly where the company had an international parent company. However, participants highlighted a number of constraints, particularly within public (municipal) waste facilities, which hindered both the collection of waste data and the utilization of this data to effect change in the way waste is managed. These constraints included a lack of equipment and institutional capacity for the collection of data. The utilization of this data in effecting change was further hindered by governance challenges such as politics, bureaucracy and procurement, evident in a developing-country context such as South Africa. The results show that while knowledge is a necessary condition for resultant action, a theoretical framework of learning does not account for all observed factors, particularly external influences.

  20. A flexible statistics web processing service--added value for information systems for experiment data.

    PubMed

    Heimann, Dennis; Nieschulze, Jens; König-Ries, Birgitta

    2010-04-20

    Data management in the life sciences has evolved from simple storage of data to complex information systems providing additional functionalities like analysis and visualization capabilities, demanding the integration of statistical tools. In many cases the statistical tools used are hard-coded within the system. That leads to expensive integration, substitution, or extension of tools, because all changes have to be made in program code. Other systems use generic solutions for tool integration, but adapting them to another system usually requires extensive work. This paper shows a way to provide statistical functionality through a statistics web service, which can be easily integrated in any information system and set up using XML configuration files. The statistical functionality is extendable by simply adding the description of a new application to a configuration file. The service architecture, the data exchange process between client and service, and the addition of analysis applications to the underlying service provider are described. Furthermore, a practical example demonstrates the functionality of the service.
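
    The paper's actual configuration schema is not given in the abstract; the sketch below (Python, with a made-up XML layout) illustrates the general idea of describing statistical applications declaratively and dispatching to them by name, so that adding an analysis means adding a configuration entry rather than changing program code.

        # Sketch of configuration-driven dispatch (the XML layout here is invented,
        # not the schema from the paper): each <application> entry names a statistic
        # that the service can run on submitted data.
        import statistics
        import xml.etree.ElementTree as ET

        CONFIG = """
        <service>
          <application name="mean"   callable="mean"/>
          <application name="median" callable="median"/>
          <application name="stdev"  callable="stdev"/>
        </service>
        """

        def load_registry(xml_text):
            """Map application names from the config to callables in the statistics module."""
            root = ET.fromstring(xml_text)
            return {app.get("name"): getattr(statistics, app.get("callable"))
                    for app in root.findall("application")}

        registry = load_registry(CONFIG)
        data = [4.1, 4.4, 3.9, 4.6, 4.0]
        print({name: round(fn(data), 3) for name, fn in registry.items()})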

  1. Towards information processing from nonlinear physical chemistry: a synthetic electrochemical cognitive system.

    PubMed

    Sadeghi, Saman; Thompson, Michael

    2010-01-01

    It is evident that complex animate materials, which operate far from equilibrium, exhibit sensory responses to the environment through emergent patterns. Formation of such patterns is often the underlying mechanism of an active response to environmental changes and can be interpreted as a result of the distributed parallel information processing taking place within the material. Such emergent patterns are not limited to biological entities; indeed there is a wide range of complex nonlinear dissipative systems which exhibit interesting emergent patterns within a range of parameters. As one example, the present paper describes the detection of emergent phenomena associated with surface electrochemical processes that allow the system to respond to input information through evolving patterns in space and time. Associative mapping of this sort offers the opportunity to devise an electrochemical cognitive system (ECS), where pattern formation can be looked at as a macroscopic phenomenon resulting from the extensive distributive computing that occurs at the microscopic level. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  2. Challenges in Scheduling Aggregation in CyberPhysical Information Processing Systems

    SciTech Connect

    Horey, James L; Lagesse, Brent J

    2011-01-01

    Data aggregation (a.k.a. reduce operations) is an important element in information processing systems, including MapReduce clusters and cyberphysical networks. Unlike simple sensor networks, all the data in information processing systems must eventually be aggregated. Our goal is to lower overall latency in these systems by intelligently scheduling aggregation on intermediate routing nodes. Unlike previous models, our model explicitly takes into account link latency and computation time. Our model also considers heterogeneous computing capabilities. In order to understand the potential challenges associated with constructing a distributed scheduler that minimizes latency, we've developed a simulation of our model and tested the results of randomly scheduling nodes. Although these experiments were designed to provide data for a null model, preliminary results have yielded a few interesting observations. We show that in cases where the computation time is larger than the transmission time, in-network aggregation can have a large effect (reducing latency by 50% or more), but that naive scheduling can have a detrimental effect. Specifically, we show that when the root node (a.k.a. the basestation) is faster than the other nodes, the latency can increase with increased coverage, and that these effects vary with the number of nodes present.
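
    The authors' simulation is not reproduced here; the toy model below (Python, with my own simplifying assumptions: a perfect binary tree, one data item per leaf, fixed per-hop link latency, fixed per-aggregation computation time, and no link contention) illustrates the observation quoted above, namely that in-network aggregation pays off most when computation time dominates transmission time.

        # Toy latency model (assumptions mine, not the paper's).  "Centralized" ships
        # raw items to the root, which aggregates them serially; "in-network"
        # aggregates pairwise at every interior node, so each tree level computes in
        # parallel.  Link contention is ignored to keep the comparison readable.
        def centralized_latency(depth, t_link, t_comp):
            n_leaves = 2 ** depth
            return depth * t_link + (n_leaves - 1) * t_comp

        def in_network_latency(depth, t_link, t_comp):
            return depth * (t_link + t_comp)

        for t_link, t_comp in [(1.0, 0.1), (1.0, 1.0), (0.1, 1.0)]:
            c = centralized_latency(6, t_link, t_comp)
            i = in_network_latency(6, t_link, t_comp)
            print(f"t_link={t_link}, t_comp={t_comp}: centralized={c:.1f}, "
                  f"in-network={i:.1f}, saving={100 * (1 - i / c):.0f}%")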

  3. Hybrid quantum information processing

    SciTech Connect

    Furusawa, Akira

    2014-12-04

    I will briefly explain the definition and advantage of hybrid quantum information processing, which is the hybridization of qubit and continuous-variable technologies. The final goal would be the realization of universal gate sets both for qubit and continuous-variable quantum information processing with the hybrid technologies. For that purpose, qubit teleportation with a continuous-variable teleporter is one of the most important ingredients.

  4. The Contextualized Technology Adaptation Process (CTAP): Optimizing Health Information Technology to Improve Mental Health Systems.

    PubMed

    Lyon, Aaron R; Wasse, Jessica Knaster; Ludwig, Kristy; Zachry, Mark; Bruns, Eric J; Unützer, Jürgen; McCauley, Elizabeth

    2016-05-01

    Health information technologies have become a central fixture in the mental healthcare landscape, but few frameworks exist to guide their adaptation to novel settings. This paper introduces the contextualized technology adaptation process (CTAP) and presents data collected during Phase 1 of its application to measurement feedback system development in school mental health. The CTAP is built on models of human-centered design and implementation science and incorporates repeated mixed methods assessments to guide the design of technologies to ensure high compatibility with a destination setting. CTAP phases include: (1) Contextual evaluation, (2) Evaluation of the unadapted technology, (3) Trialing and evaluation of the adapted technology, (4) Refinement and larger-scale implementation, and (5) Sustainment through ongoing evaluation and system revision. Qualitative findings from school-based practitioner focus groups are presented, which provided information for CTAP Phase 1, contextual evaluation, surrounding education sector clinicians' workflows, types of technologies currently available, and influences on technology use. Discussion focuses on how findings will inform subsequent CTAP phases, as well as their implications for future technology adaptation across content domains and service sectors.

  5. The Contextualized Technology Adaptation Process (CTAP): Optimizing Health Information Technology to Improve Mental Health Systems

    PubMed Central

    Lyon, Aaron R.; Wasse, Jessica Knaster; Ludwig, Kristy; Zachry, Mark; Bruns, Eric J.; Unützer, Jürgen; McCauley, Elizabeth

    2015-01-01

    Health information technologies have become a central fixture in the mental healthcare landscape, but few frameworks exist to guide their adaptation to novel settings. This paper introduces the Contextualized Technology Adaptation Process (CTAP) and presents data collected during Phase 1 of its application to measurement feedback system development in school mental health. The CTAP is built on models of human-centered design and implementation science and incorporates repeated mixed methods assessments to guide the design of technologies to ensure high compatibility with a destination setting. CTAP phases include: (1) Contextual evaluation, (2) Evaluation of the unadapted technology, (3) Trialing and evaluation of the adapted technology, (4) Refinement and larger-scale implementation, and (5) Sustainment through ongoing evaluation and system revision. Qualitative findings from school-based practitioner focus groups are presented, which provided information for CTAP Phase 1, contextual evaluation, surrounding education sector clinicians’ workflows, types of technologies currently available, and influences on technology use. Discussion focuses on how findings will inform subsequent CTAP phases, as well as their implications for future technology adaptation across content domains and service sectors. PMID:25677251

  6. Strategic Information Systems Planning.

    ERIC Educational Resources Information Center

    Rowley, Jennifer

    1995-01-01

    Strategic Information Systems Planning (SISP) is the process of establishing a program for implementation and use of information systems in ways that will optimize effectiveness of information resources and use them to support the objectives of the organization. Basic steps in SISP methodology are outlined. (JKP)

  7. Reengineering the biomedical-equipment procurement process through an integrated management information system.

    PubMed

    Larios, Y G; Matsopoulos, G K; Askounis, D T; Nikita, K S

    2000-01-01

    The design of each new hospital site is typically preceded by decisions on the most appropriate level of biomedical equipment, which significantly influences the layout of the hospital departments under construction. The most appropriate biomedical equipment should ideally be decided upon by considering a series of demographic and social parameters of the hospital as well as international regulations and standards. This information should ultimately be distilled into proper technical specifications. This paper proposes a streamlined management process related to the procurement of biomedical equipment for new hospital sites or for those under expansion. The new management process aims to increase the efficiency of the experts involved in the definition of the most appropriate level of equipment and its technical specifications. It also addresses all aspects of the biomedical equipment-selection cycle, including the evaluation of the bids submitted by the equipment suppliers. The proposed process is assisted by a management information system, which integrates all related data-handling operations. It provides extensive decision-support facilities to the expert and a platform for the support of knowledge re-use in the field of biomedical-equipment selection.

  8. Evaluation of availability of cluster distributed disaster tolerant systems for control and information processing based on a cluster quorum

    NASA Astrophysics Data System (ADS)

    Tsarev, R. Yu; Gruzenkin, D. V.; Kovalev, I. V.; Prokopenko, A. V.; Knyazkov, A. N.

    2016-11-01

    Control and information processing systems, which often execute critical functions, must satisfy requirements not only of fault tolerance but also of disaster tolerance. A cluster architecture is a reasonable way to provide disaster tolerance for these systems. In this case the clusters are separate control and information processing centers united by communication channels; together they form a single hardware resource whose parts interact to achieve system objectives. Remote cluster positioning allows system availability and disaster tolerance to be ensured even when some units fail or a whole cluster crashes. A technique for evaluating the availability of cluster distributed systems for control and information processing based on a cluster quorum is presented in the paper. This technique can be applied to different cluster distributed control and information processing systems that are claimed to be based on disaster tolerance principles. In the article we discuss a communications satellite system as an example of a cluster distributed disaster tolerant control and information processing system. An evaluation of the availability of the communications satellite system is provided. Possible failure scenarios of the cluster-based components of the communications satellite system were analyzed. The analysis made it possible to choose the best way to implement the cluster structure for a distributed control and information processing system.
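
    The paper's evaluation technique is only summarized in the abstract; as a minimal illustration of a quorum-based availability calculation (assuming n identical, independent clusters, each available with probability p, and a system that remains available while a majority quorum of clusters survives), the binomial sum can be computed directly:

        # Minimal quorum-availability sketch (assumptions mine): n independent clusters,
        # each available with probability p; the distributed system is considered
        # available while a majority quorum of clusters is still reachable.
        from math import comb

        def quorum_availability(n, p, quorum=None):
            """P(at least `quorum` of n clusters are up), default majority quorum."""
            if quorum is None:
                quorum = n // 2 + 1
            return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(quorum, n + 1))

        for n in (3, 5, 7):
            print(n, round(quorum_availability(n, p=0.99), 8))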

  9. Business Marketing Information Systems Skills. Voc-Ed Project. Business Data Processing Career Area. Report.

    ERIC Educational Resources Information Center

    Milwaukee Area Technical Coll., WI.

    This report and research analysis relate to the Milwaukee Area Technical College Research Project, a study undertaken to determine a curriculum to meet the information processing/management training needs of individuals entering or continuing careers in the information marketing and business data processing occupational clusters. The report of…

  10. Optical-mechanical line-scan imaging process - Its information capacity and efficiency. [satellite multispectral sensing systems application

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Park, S. K.

    1975-01-01

    Optical-mechanical line-scan techniques have been applied to earth satellite multispectral imaging systems. The capability of the imaging system is generally assessed by its information capacity. An approach based on information theory is developed to formulate the capacity of the line-scan process. Included are the effects of blurring of spatial detail, photosensor noise, aliasing, and quantization. The information efficiency is shown to be dependent on sampling rate, system frequency response shape, SNR, and quantization interval.
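
    The paper's information-theoretic formulation is not reproduced in the abstract; the sketch below (Python, my own simplification) combines just two of the listed effects, photosensor noise and quantization, by taking the per-sample information as the smaller of the Shannon-Hartley capacity 0.5*log2(1+SNR) and the log2 of the number of quantization levels, then scaling by the sampling rate.

        # Simplified per-sample information estimate (my assumptions, not the paper's
        # formulation): capacity is limited both by photosensor SNR and by quantization.
        import math

        def info_rate(snr, quant_levels, samples_per_second):
            """Approximate information rate in bits/s for a line-scan channel."""
            bits_noise = 0.5 * math.log2(1.0 + snr)   # Shannon-Hartley bits per sample
            bits_quant = math.log2(quant_levels)      # quantizer resolution in bits
            return samples_per_second * min(bits_noise, bits_quant)

        # Example: SNR of 100 (20 dB), 6-bit quantizer, 10^5 samples per second.
        print(round(info_rate(snr=100.0, quant_levels=64, samples_per_second=1e5)))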

  12. The Naval Enlisted Professional Development Information System (NEPDIS): Front End Analysis (FEA) Process. Technical Report 159.

    ERIC Educational Resources Information Center

    Aagard, James A.; Ansbro, Thomas M.

    The Naval Enlisted Professional Development Information System (NEPDIS) was designed to function as a fully computerized information assembly and analysis system to support labor force, personnel, and training management. The NEPDIS comprises separate training development, instructional, training record and evaluation, career development, and…

  13. TEX-SIS FOLLOW-UP: Student Follow-up Management Information System. Data Processing Manual.

    ERIC Educational Resources Information Center

    Tarrant County Junior Coll. District, Ft. Worth, TX.

    Project FOLLOW-UP was conducted to develop, test, and validate a statewide management information system for follow-up of Texas public junior and community college students. The result of this project was a student information system (TEX-SIS) consisting of seven subsystems: (1) Student's Educational Intent, (2) Nonreturning Student Follow-up, (3)…

  15. Du Pont Information Flow System

    ERIC Educational Resources Information Center

    Hoffman, Warren S.

    1972-01-01

    The Information Flow System is a large-scale information retrieval system developed for processing of Du Pont information files. As currently implemented, the system stores and retrieves information on company technical reports. Extensions of the system for handling chemical structure information and on-line processing are also discussed. (3…

  16. Information systems definition architecture

    SciTech Connect

    Calapristi, A.J.

    1996-06-20

    The Tank Waste Remediation System (TWRS) Information Systems Definition architecture evaluated Information Management (IM) processes in several key organizations. The intent of the study is to identify improvements in TWRS IM processes that will enable better support of the TWRS mission and accommodate changes in the TWRS business environment. The ultimate goals of the study are to reduce IM costs, manage the configuration of TWRS IM elements, and improve IM-related process performance.

  17. Information processing during phagocytosis

    PubMed Central

    Underhill, David M.; Goodridge, Helen S.

    2017-01-01

    Phagocytosis, the process by which macrophages, dendritic cells and other myeloid phagocytes internalize diverse particulate targets, is a key mechanism of innate immunity. The molecular and cellular events underlying the binding and engulfment of targets in phagosomes have been extensively studied. More recent data suggest that the process of phagocytosis itself provides information to myeloid phagocytes about the nature of the targets they are eating and that this helps tailor inflammatory responses. In this Review, we discuss how such information is acquired during phagocytosis, and how it is processed to coordinate an immune response. PMID:22699831

  18. Neural Analog Information Processing

    NASA Astrophysics Data System (ADS)

    Hecht-Nielsen, Robert

    1982-07-01

    Neural Analog Information Processing (NAIP) is an effort to develop general purpose pattern classification architectures based upon biological information processing principles. This paper gives an overview of NAIP and its relationship to the previous work in neural modeling from which its fundamental principles are derived. It also presents a theorem concerning the stability of response of a slab (a two dimensional array of identical simple processing units) to time-invariant (spatial) patterns. An experiment (via computer emulation) demonstrating classification of a spatial pattern by a simple, but complete NAIP architecture is described. A concept for hardware implementation of NAIP architectures is briefly discussed.

  19. Stochastic thermodynamics of information processing

    NASA Astrophysics Data System (ADS)

    Cardoso Barato, Andre

    2015-03-01

    We consider two recent advancements on theoretical aspects of thermodynamics of information processing. First we show that the theory of stochastic thermodynamics can be generalized to include information reservoirs. These reservoirs can be seen as a sequence of bits which has its Shannon entropy changed due to the interaction with the system. Second we discuss bipartite systems, which provide a convenient description of Maxwell's demon. Analyzing a special class of bipartite systems we show that they can be used to study cellular information processing, allowing for the definition of an entropic rate that quantifies how much a cell learns about a fluctuating external environment and that is bounded by the thermodynamic entropy production.

  20. Hospital information system institutionalization processes in Indonesian public, government-owned and privately owned hospitals.

    PubMed

    Handayani, P W; Hidayanto, A N; Ayuningtyas, Dumilah; Budi, Indra

    2016-11-01

    The Hospital Information System (HIS) could help hospitals as a public entity to provide optimal health services. One of the main challenges of HIS implementation is an institutional change. Using institutional theory as the analytical lens, this study aims to explain the institutionalization of HIS as an instance of e-health initiatives in Indonesia. Furthermore, this paper aims for hospital management and researchers to improve the understanding of the social forces that influence hospital personnel's HIS acceptance within an organizational context. We use case studies from four public, government-owned hospitals and four privately owned (public and specialty) hospitals to explain the HIS institutionalization process by exploring the three concepts of institutional theory: institutional isomorphism, institutional logic, and institutional entrepreneurship. This study reveals that differences exist between public, government-owned and private hospitals with regard to the institutionalization process: public, government-owned hospitals' management is more motivated to implement HIS to comply with the regulations, while private hospitals' management views HIS as an urgent requirement that must be achieved. The study findings also reveal that various institutional isomorphism mechanisms and forms of institutional logic emerge during the process. Finally, three factors-self-efficacy, social influence, and management support-have a significant influence on the individual acceptance of HIS. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  1. A Transducer/Equipment System for Capturing Speech Information for Subsequent Processing by Computer Systems

    DTIC Science & Technology

    1994-01-07

    Interest has been shown in a speech capture system that would operate on a noisy lobby, casino, airport, or shopping mall floor for access to automated ... control or selection. Vending machines, shopping dispenser kiosks, and entertainment virtual reality games of the future will all be voice activated ... High speech recognition accuracy for commercial applications; automated drive-thru fast food ordering.

  2. Advanced information processing system: The Army Fault-Tolerant Architecture detailed design overview

    NASA Technical Reports Server (NTRS)

    Harper, Richard E.; Babikyan, Carol A.; Butler, Bryan P.; Clasen, Robert J.; Harris, Chris H.; Lala, Jaynarayan H.; Masotto, Thomas K.; Nagle, Gail A.; Prizant, Mark J.; Treadwell, Steven

    1994-01-01

    The Army Avionics Research and Development Activity (AVRADA) is pursuing programs that would enable effective and efficient management of the large amounts of situational data that occur during tactical rotorcraft missions. The Computer Aided Low Altitude Night Helicopter Flight Program has identified automated Terrain Following/Terrain Avoidance, Nap of the Earth (TF/TA, NOE) operation as a key enabling technology for advanced tactical rotorcraft to enhance mission survivability and mission effectiveness. The processing of critical information at low altitudes with short reaction times is life-critical and mission-critical, necessitating an ultra-reliable, high-throughput computing platform for dependable service for flight control, fusion of sensor data, route planning, near-field/far-field navigation, and obstacle avoidance operations. To address these needs the Army Fault Tolerant Architecture (AFTA) is being designed and developed. This computer system is based upon the Fault Tolerant Parallel Processor (FTPP) developed by the Charles Stark Draper Laboratory (CSDL). AFTA is a hard real-time, Byzantine fault-tolerant parallel processor which is programmed in the Ada language. This document describes the results of the Detailed Design (Phases 2 and 3 of a 3-year project) of the AFTA development. It contains detailed descriptions of the program objectives, the TF/TA NOE application requirements, the architecture, the hardware design, the operating system design, systems performance measurements, and analytical models.
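
    As a toy illustration of the kind of redundancy management such a processor performs (a sketch in Python, not the AFTA/FTPP design itself), the snippet below majority-votes the outputs of redundant computing channels and flags any dissenting channel; a true Byzantine-resilient exchange also requires 3f+1 channels and interactive consistency rounds, which are omitted here.

        # Toy redundancy-management sketch (not the AFTA design): majority-vote the
        # outputs of redundant computing channels and report dissenting channels.
        # Tolerating f arbitrary (Byzantine) faults classically requires 3f+1 channels.
        from collections import Counter

        def vote(channel_outputs):
            """Return (voted value, list of channel ids that disagreed)."""
            counts = Counter(channel_outputs.values())
            value, n_agree = counts.most_common(1)[0]
            if n_agree <= len(channel_outputs) // 2:
                raise RuntimeError("no majority - unrecoverable disagreement")
            dissenters = [ch for ch, out in channel_outputs.items() if out != value]
            return value, dissenters

        outputs = {"ch0": 42, "ch1": 42, "ch2": 17, "ch3": 42}   # ch2 is faulty
        print(vote(outputs))    # (42, ['ch2'])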

  3. Processes in construction of failure management expert systems from device design information

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Lance, Nick

    1987-01-01

    This paper analyzes the tasks and problem-solving methods used by an engineer in constructing a failure management expert system from design information about the device to be diagnosed. An expert test engineer developed a troubleshooting expert system based on device design information and experience with similar devices, rather than on specific expert knowledge gained from operating the device or troubleshooting its failures. The construction of the expert system was intensively observed and analyzed. This paper characterizes the knowledge, tasks, methods, and design decisions involved in constructing this type of expert system, and makes recommendations concerning tools for aiding and automating the construction of such systems.

  5. Medical Information Management System

    NASA Technical Reports Server (NTRS)

    Alterescu, S.; Hipkins, K. R.; Friedman, C. A.

    1979-01-01

    On-line interactive information processing system easily and rapidly handles all aspects of data management related to patient care. General purpose system is flexible enough to be applied to other data management situations found in areas such as occupational safety data, judicial information, or personnel records.

  6. Physics as Information Processing

    NASA Astrophysics Data System (ADS)

    D'Ariano, Giacomo Mauro

    2011-03-01

    I review some recent advances in foundational research at Pavia QUIT group. The general idea is that there is only Quantum Theory without quantization rules, and the whole Physics—including space-time and relativity—is emergent from the quantum-information processing. And since Quantum Theory itself is axiomatized solely on informational principles, the whole Physics must be reformulated in information-theoretical terms: this is the It from bit of J. A. Wheeler. The review is divided into four parts: a) the informational axiomatization of Quantum Theory; b) how space-time and relativistic covariance emerge from quantum computation; c) what is the information-theoretical meaning of inertial mass and of ℏ, and how the quantum field emerges; d) an observational consequence of the new quantum field theory: a mass-dependent refraction index of vacuum. I will conclude with the research lines that will follow in the immediate future.

  7. Integrated Disbursing and Accounting Financial Information Processing Systems (IDAFIPS) Telecommunications Subsystem Project Plan (TSPP).

    DTIC Science & Technology

    1984-12-01

    Region 7 user community: there are presently 55 different commands/activities/offices scheduled to participate as remote online/dial-up subscribers ... processing sites will provide online interactive transaction-driven data processing support to their associated Financial Information Processing ... terminal devices for online update of financial files. The data base will contain the information required to support the financial management

  8. Advanced Information Processing System (AIPS)-based fault tolerant avionics architecture for launch vehicles

    NASA Technical Reports Server (NTRS)

    Lala, Jaynarayan H.; Harper, Richard E.; Jaskowiak, Kenneth R.; Rosch, Gene; Alger, Linda S.; Schor, Andrei L.

    1990-01-01

    An avionics architecture for the Advanced Launch System (ALS) that uses validated hardware and software building blocks developed under the Advanced Information Processing System program is presented. The AIPS for ALS architecture defined here is preliminary, and its reliability requirements can be met by AIPS hardware and software building blocks built using the state-of-the-art technology available in the 1992-93 time frame. The level of detail in the architecture definition reflects the level of detail available in the ALS requirements. As the avionics requirements are refined, the architecture can also be refined and defined in greater detail with the help of analysis and simulation tools. A useful methodology is demonstrated for investigating the impact of the avionics suite on the recurring cost of the ALS. It is shown that allowing the vehicle to launch with selected detected failures can potentially reduce the recurring launch costs. A comparative analysis shows that validated fault-tolerant avionics built out of Class B parts can result in lower life-cycle cost in comparison to simplex avionics built out of Class S parts or other redundant architectures.

  9. PEET: a Matlab tool for estimating physical gate errors in quantum information processing systems

    NASA Astrophysics Data System (ADS)

    Hocker, David; Kosut, Robert; Rabitz, Herschel

    2016-09-01

    A Physical Error Estimation Tool (PEET) is introduced in Matlab for predicting physical gate errors of quantum information processing (QIP) operations by constructing and then simulating gate sequences for a wide variety of user-defined, Hamiltonian-based physical systems. PEET is designed to accommodate the interdisciplinary needs of quantum computing design by assessing gate performance for users familiar with the underlying physics of QIP, as well as those interested in higher-level computing operations. The structure of PEET separates the bulk of the physical details of a system into Gate objects, while the construction of quantum computing gate operations is contained in GateSequence objects. Gate errors are estimated by Monte Carlo sampling of noisy gate operations. The main utility of PEET, though, is the implementation of QuantumControl methods that act to generate and then test gate sequence and pulse-shaping techniques for QIP performance. This work details the structure of PEET and gives instructive examples for its operation.
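
    PEET's internals are not described beyond the abstract; the snippet below is a generic Monte Carlo gate-error estimate in Python/NumPy (my own simplification, not PEET code): noisy realizations of a single-qubit X rotation are sampled with a small Gaussian over-rotation, and the mean infidelity against the ideal gate is reported.

        # Generic Monte Carlo estimate of a single-qubit gate error (my simplification,
        # not PEET itself): the ideal gate is an X rotation by pi; noisy samples add a
        # small Gaussian over-rotation, and the error is 1 - |Tr(U_ideal^dag U_noisy)/2|^2.
        import numpy as np

        X = np.array([[0, 1], [1, 0]], dtype=complex)
        I2 = np.eye(2, dtype=complex)

        def rx(theta):
            """Single-qubit rotation about X: exp(-i*theta*X/2)."""
            return np.cos(theta / 2) * I2 - 1j * np.sin(theta / 2) * X

        def mc_gate_error(theta_ideal=np.pi, sigma=0.02, n_samples=5000, seed=1):
            rng = np.random.default_rng(seed)
            u_ideal = rx(theta_ideal)
            errors = []
            for _ in range(n_samples):
                u_noisy = rx(theta_ideal + rng.normal(0.0, sigma))   # over/under-rotation
                overlap = np.trace(u_ideal.conj().T @ u_noisy) / 2.0
                errors.append(1.0 - abs(overlap) ** 2)
            return float(np.mean(errors))

        print(f"mean gate error ~ {mc_gate_error():.2e}")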

  11. An Information System Development Method Combining Business Process Modeling with Executable Modeling and its Evaluation by Prototyping

    NASA Astrophysics Data System (ADS)

    Okawa, Tsutomu; Kaminishi, Tsukasa; Hirabayashi, Syuichi; Suzuki, Ryo; Mitsui, Hiroyasu; Koizumi, Hisao

    Business in the enterprise is so closely related to the information system that business activities are difficult to carry out without it. A system design technique that properly takes the business process into account and enables quick system development is therefore required. In addition, demands on development cost are more severe than before. To cope with this situation, the modeling technology named BPM (Business Process Management/Modeling) is drawing attention and becoming important as a key technology. BPM is a technology for modeling business activities as business processes and visualizing them to improve business efficiency. However, a general methodology for developing an information system using the analysis results of BPM does not exist, and only a few development cases have been reported. This paper proposes an information system development method combining business process modeling with executable modeling. We describe a guideline that supports consistency and efficiency of development, and a framework that enables the information system to be developed from the model. We have prototyped an information system with the proposed method, and our experience has shown that the methodology is valuable.

  12. The Complex Information Process

    NASA Astrophysics Data System (ADS)

    Taborsky, Edwina

    2000-09-01

    This paper examines the semiosic development of energy to information within a dyadic reality that operates within the contradictions of both classical and quantum physics. These two realities are examined within the three Peircean modal categories of Firstness, Secondness and Thirdness. The paper concludes that our world cannot operate within either of the two physical realities but instead filiates the two to permit a semiosis or information-generation of complex systems.

  13. Modeling the Retrieval Process for an Information Retrieval System Using an Ordinal Fuzzy Linguistic Approach.

    ERIC Educational Resources Information Center

    Herrera-Viedma, E.

    2001-01-01

    Proposes a linguistic model for an Information Retrieval System (IRS) defined using an ordinal fuzzy linguistic approach. The query subsystem accepts Boolean queries with terms weighted by ordinal linguistic values and the evaluation subsystem returns documents arranged in relevance classes labeled with ordinal linguistic values. The system gives…

  14. Performance assessment and adoption processes of an information monitoring and diagnostic system prototype

    SciTech Connect

    Piette, Mary Ann

    1999-10-01

    This report addresses the problem that buildings do not perform as well as anticipated during design. We partnered with an innovative building operator to evaluate a prototype Information Monitoring and Diagnostic System (IMDS). The IMDS consists of high-quality measurements archived each minute, a data visualization tool, and a web-based capability. The operators recommend that similar technology be adopted in other buildings. The IMDS has been used to identify and correct a series of control problems. It has also allowed the operators to make more effective use of the building control system, freeing up time to take care of other tenant needs. They believe they have significantly improved building comfort, potentially improving tenant health and productivity. The reduction in hours required to operate the building is worth about $20,000 per year, which could pay for the IMDS in about five years. A control system retrofit based on findings from the IMDS is expected to reduce energy use by 20 percent over the next year, worth over $30,000 per year. The main conclusion of the model-based chiller fault detection work is that steady-state models can be used as reference models to monitor chiller operation and detect faults. The ability of the IMDS to measure cooling load and chiller power to one-percent accuracy with a one-minute sampling interval permits detection of additional faults. Evolutionary programming techniques were also evaluated, showing promise in the detection of patterns in building data. We also evaluated two technology adoption processes, radical and routine. In routine adoption, managers enhance features of existing products that are already well understood. In radical adoption, innovative building managers introduce novel technology into their organizations without using the rigorous payback criteria applied to routine innovations.

  15. Development of Energy Models for Production Systems and Processes to Inform Environmentally Benign Decision-Making

    NASA Astrophysics Data System (ADS)

    Diaz-Elsayed, Nancy

    Between 2008 and 2035 global energy demand is expected to grow by 53%. While most industry-level analyses of manufacturing in the United States (U.S.) have traditionally focused on high energy consumers such as the petroleum, chemical, paper, primary metal, and food sectors, the remaining sectors account for the majority of establishments in the U.S. Specifically, of the establishments participating in the Energy Information Administration's Manufacturing Energy Consumption Survey in 2006, the "non-energy intensive" sectors still consumed 4×10^9 GJ of energy, i.e., one-quarter of the energy consumed by the manufacturing sectors, which is enough to power 98 million homes for a year. The increasing use of renewable energy sources and the introduction of energy-efficient technologies in manufacturing operations support the advancement towards a cleaner future, but having a good understanding of how the systems and processes function can reduce the environmental burden even further. To facilitate this, methods are developed to model the energy of manufacturing across three hierarchical levels: production equipment, factory operations, and industry; these methods are used to accurately assess the current state and provide effective recommendations to further reduce energy consumption. First, the energy consumption of production equipment is characterized to provide machine operators and product designers with viable methods to estimate the environmental impact of the manufacturing phase of a product. The energy model of production equipment is tested and found to have an average accuracy of 97% for a product requiring machining with a variable material removal rate profile. However, changing the use of production equipment alone will not result in an optimal solution since machines are part of a larger system. Which machines to use, how to schedule production runs while accounting for idle time, the design of the factory layout to facilitate production, and even the

  16. Terminal Information Processing System (TIPS) Consolidated CAB Display (CCD) Comparative Analysis.

    DTIC Science & Technology

    1982-04-01

    ... specification form, vendor responses to the specifications, and the recommendation to include flight data management in the Consolidated Cab Display (CCD) system. The report's contents cover system concepts, an information management language, reliability, expandability, data display, and real-time operational topics for the terminal data management system. It briefly outlines a history of both and then presents the basic working guidelines used in developing the

  17. Process Information and Evolution.

    PubMed

    Chastain, Erick; Smith, Cameron

    2016-12-01

    Universal Semantic Communication (USC) is a theory that models communication among agents without the assumption of a fixed protocol. We demonstrate a connection, via a concept we refer to as process information, between a special case of USC and evolutionary processes. In this context, one agent attempts to interpret a potentially arbitrary signal produced within its environment. Sources of this effective signal can be modeled as a single alternative agent. Given a set of common underlying concepts that may be symbolized differently by different sources in the environment, any given entity must be able to correlate intrinsic information with input it receives from the environment in order to accurately interpret the ambient signal and ultimately coordinate its own actions. This scenario encapsulates a class of USC problems that provides insight into the semantic aspect of a model of evolution proposed by Rivoire and Leibler. Through this connection, we show that evolution corresponds to a means of solving a special class of USC problems, can be viewed as a special case of the Multiplicative Weights Updates algorithm, and that infinite population selection with no mutation and no recombination conforms to the Rivoire-Leibler model. Finally, using process information we show that evolving populations implicitly internalize semantic information about their respective environments.
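
    The paper's formal development is not reproduced in the abstract; the snippet below shows a standard Multiplicative Weights Updates step in Python (an illustration of the textbook algorithm the abstract names, not the authors' construction), whose repeated application concentrates a frequency distribution on the highest-gain type, much as selection does.

        # Standard Multiplicative Weights Updates step (illustrative; this is the
        # textbook algorithm the abstract refers to, not the authors' construction).
        import numpy as np

        def mwu_step(weights, gains, eta=0.5):
            """Multiply each weight by (1 + eta*gain) and renormalize to a distribution."""
            w = weights * (1.0 + eta * np.asarray(gains))
            return w / w.sum()

        # Three "types" with constant relative gains; repeated updates concentrate
        # the distribution on the highest-gain type, as selection would.
        freqs = np.array([1 / 3, 1 / 3, 1 / 3])
        gains = np.array([0.10, 0.05, 0.01])
        for _ in range(50):
            freqs = mwu_step(freqs, gains)
        print(np.round(freqs, 3))    # mass shifts toward the first type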

  18. Utilizing geographic information systems technology in the Wyoming cumulative hydrologic impact assessment modeling process

    SciTech Connect

    Hamerlinck, J.D.; Oakleaf, J.R.

    1997-12-31

    The coal-permitting process places heavy demands on both permit applicants and regulatory authorities with respect to the management and analysis of hydrologic data. Currently, this correlation is being addressed for the Powder River Basin, Wyoming by the ongoing Cumulative Hydrologic Impact Assessment (CHIA) efforts at the University of Wyoming. One critical component of the CHIA is the use of a Geographic Information System (GIS) for support, management, manipulation, pre-analysis, and display of data associated with the chosen groundwater and surface water models. This paper will discuss the methodology in using of GIS technology as an integrated tool with the MODFLOW and HEC-1 hydrologic models. Pre-existing GIS links associated with these two models served as a foundation for this effort. However, due to established standards and site specific factors, substantial modifications were performed on existing tools to obtain adequate results. The groundwater-modeling effort required the use of a refined grid in which cell sizes varied based on the relative locations of ongoing mining activities. Surface water modeling was performed in a semi-arid region with very limited topographic relief and predominantly ephemeral stream channels. These were substantial issues that presented challenges for effective GIS/model integration.

  19. Organizing Information Systems.

    ERIC Educational Resources Information Center

    Thomas, Charles R.

    The development of information systems is described with regard to the roles of the system user and the data processing specialist. Institutional needs are best served by coordination efforts, usually handled by a management systems office, which is also responsible for the maintenance and production of an institutional data element dictionary and…

  20. A CLASSIFICATION SYSTEM FOR ANY DATA BANKING (INFORMATION STORAGE AND RETRIEVAL) PROCESS

    DTIC Science & Technology

    Study was started to discover and state explicitly the fundamentals of data banking (more commonly called information storage and retrieval). A clear...framework or hierarchical tree is displayed that includes all possible data banking processes and shows their similarities and differences. The basis of

  1. CSI-ISC--Concepts for smooth integration of health care information system components into established processes of patient care.

    PubMed

    Garde, S; Wolff, A C; Kutscha, U; Wetter, T; Knaup, P

    2006-01-01

    The introduction of information system components (ISCs) usually leads to a change in existing processes, e.g. processes of patient care. These processes might become even more complex and variable than before. An early participation of end users and a better understanding of human factors during design and introduction of ISCs are key factors for a successful introduction of ISCs in health care. Nonetheless no specialized methods have been developed until now to systematically support the integration of ISCs in existing processes of patient care while taking into account these requirements. In this paper, therefore, we introduce a procedure model to implement Concepts for Smooth Integration of ISCs (CSI-ISC). Established theories from economics and social sciences have been applied in our model, among them the stress-strain-concept, the contrastive task analysis (KABA), and the phase model for the management of information systems. CSI-ISC is based on the fact that while introducing new information system components, users experience additional workload. One essential aim during the introduction process therefore should be to systematically identify, prioritize and ameliorate workloads that are being imposed on human beings by information technology in health care. To support this, CSI-ISC consists of a static part (workload framework) and a dynamic part (guideline for the introduction of information system components into existing processes of patient care). The application of CSI-ISC offers the potential to minimize additional workload caused by information system components systematically. CSI-ISC rationalizes decisions and supports the integration of the information system component into existing processes of patient care.

  2. Improvement of Organizational Performance and Instructional Design: An Analogy Based on General Principles of Natural Information Processing Systems

    ERIC Educational Resources Information Center

    Darabi, Aubteen; Kalyuga, Slava

    2012-01-01

    The process of improving organizational performance through designing systemic interventions has remarkable similarities to designing instruction for improving learners' performance. Both processes deal with subjects (learners and organizations correspondingly) with certain capabilities that are exposed to novel information designed for producing…

  4. Test processing system (SEE)

    NASA Technical Reports Server (NTRS)

    Gaulene, P.

    1986-01-01

    The SEE data processing system, developed in 1985, manages and processes test results. General information is provided on the SEE system: objectives, characteristics, basic principles, general organization, and operation. Full documentation is accessible by computer using the HELP SEE command.

  5. Quantum Information Processing

    DTIC Science & Technology

    2007-11-02

    ... preparation, indicating, to our surprise, that standard quantum teleportation is *not* optimal for the transmission of states from Alice to Bob if ... quantum entanglement in which the transmitted quantum state is known to Alice. Very recently, with A. Winter, a new, more efficient protocol for RSP (remote state preparation) has

  6. Modeling Business Processes of the Social Insurance Fund in Information System Runa WFE

    NASA Astrophysics Data System (ADS)

    Kataev, M. Yu; Bulysheva, L. A.; Xu, Li D.; Loseva, N. V.

    2016-08-01

    Business processes are gradually becoming a tool that allows organizations to organize employees' work at a new level and to make document management systems more efficient. Most of the existing work, and the largest number of publications, concern these directions. However, business processes are still poorly implemented in public institutions, where it is very difficult to formalize the main existing processes. We attempt to build a system of business processes for a state agency, the Russian Social Insurance Fund (SIF), where virtually all processes, given different inputs, have the same output: a public service. The parameters of the state services (as a rule, time limits) are set by state laws and regulations. The article provides a brief overview of the SIF, the formulation of requirements for business processes, the justification of the choice of software for modeling business processes, the creation of models in the Runa WFE system, and the optimization of a model of one of the main SIF business processes. The result of the work in Runa WFE is an optimized model of an SIF business process.

  7. Optimum complex processing of informative and measuring signals of navigation systems

    NASA Astrophysics Data System (ADS)

    Marciszel, Jan

    A version of the optimum complex processing of the signals of a two-channel measuring system with delay in one of the channels, based on Wiener filtering, is proposed. A correction for Doppler-course counting is also presented, based on hyperbolic radio navigation sets and using a Kalman filter included in the error compensation system.
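
    The abstract gives no equations; as a minimal illustration of the kind of error compensation it mentions, the sketch below (Python, a toy one-dimensional setup of my own, not the paper's model) runs a Kalman filter that corrects a drifting dead-reckoning channel with noisy position fixes from a second channel.

        # Minimal 1-D Kalman filter sketch (toy setup, not the paper's model): a
        # dead-reckoning channel with slow drift is corrected by noisy position fixes.
        import numpy as np

        def kalman_1d(dr_increments, fixes, q=0.04, r=1.0):
            """q: process (drift) variance per step, r: fix measurement variance."""
            x, p, est = 0.0, 1.0, []
            for u, z in zip(dr_increments, fixes):
                x, p = x + u, p + q                   # predict with the dead-reckoning increment
                k = p / (p + r)                       # Kalman gain
                x, p = x + k * (z - x), (1 - k) * p   # correct with the position fix
                est.append(x)
            return np.array(est)

        rng = np.random.default_rng(2)
        true_pos = np.cumsum(np.full(100, 1.0))            # true motion: 1 unit per step
        dr = 1.0 + 0.05 + rng.normal(0, 0.02, 100)         # biased dead-reckoning increments
        fixes = true_pos + rng.normal(0, 1.0, 100)         # noisy radio-navigation fixes
        est = kalman_1d(dr, fixes)
        print(f"final error: {abs(est[-1] - true_pos[-1]):.2f} (uncorrected drift ~ 5.0)")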

  8. Blogs and Social Network Sites as Activity Systems: Exploring Adult Informal Learning Process through Activity Theory Framework

    ERIC Educational Resources Information Center

    Heo, Gyeong Mi; Lee, Romee

    2013-01-01

    This paper uses an Activity Theory framework to explore adult user activities and informal learning processes as reflected in their blogs and social network sites (SNS). Using the assumption that a web-based space is an activity system in which learning occurs, typical features of the components were investigated and each activity system then…

  9. A data and information system for processing, archival, and distribution of data for global change research

    NASA Technical Reports Server (NTRS)

    Graves, Sara J.

    1994-01-01

    Work on this project was focused on information management techniques for Marshall Space Flight Center's EOSDIS Version 0 Distributed Active Archive Center (DAAC). The centerpiece of this effort has been participation in EOSDIS catalog interoperability research, the result of which is a distributed Information Management System (IMS) allowing the user to query the inventories of all the DAAC's from a single user interface. UAH has provided the MSFC DAAC database server for the distributed IMS, and has contributed to definition and development of the browse image display capabilities in the system's user interface. Another important area of research has been in generating value-based metadata through data mining. In addition, information management applications for local inventory and archive management, and for tracking data orders were provided.

  10. Selecting an Information System for the '90s: Can a User Driven Process Work?

    ERIC Educational Resources Information Center

    Jonas, Stephen; And Others

    In 1988, Sinclair Community College (SCC) began a comprehensive study of the need for a new administrative information system that would improve the college's effectiveness and flexibility in providing educational and administrative services. A planning committee provided college-wide coordination for the development of a request for proposal…

  11. Applications of Social Science to Management Information Systems and Evaluation Process: A Peace Corps Model.

    ERIC Educational Resources Information Center

    Lassey, William R.; And Others

    This study discusses some of the central concepts, assumptions and methods used in the development and design of a Management Information and Evaluation System for the Peace Corps in Colombia. Methodological problems encountered are reviewed. The model requires explicit project or program objectives, individual staff behavioral objectives, client…

  12. A study on airborne integrated display system and human information processing

    NASA Technical Reports Server (NTRS)

    Mizumoto, K.; Iwamoto, H.; Shimizu, S.; Kuroda, I.

    1983-01-01

    The cognitive behavior of pilots was examined in an experiment involving mock ups of an eight display electronic attitude direction indicator for an airborne integrated display. Displays were presented in digital, analog digital, and analog format to experienced pilots. Two tests were run, one involving the speed of memorization in a single exposure and the other comprising two five second exposures spaced 30 sec apart. Errors increased with the speed of memorization. Generally, the analog information was assimilated faster than the digital data, with regard to the response speed. Information processing was quantified as 25 bits for the first five second exposure and 15 bits during the second.
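
    For context only (the abstract does not state how the bit counts were derived), memorization performance in such display studies is commonly quantified with the Shannon measure; a minimal form, assuming N equally likely display alternatives, is:

```latex
% Illustrative only: Shannon information for N equally likely alternatives
I = \log_2 N \ \text{bits},
\qquad\text{so } I = 25 \text{ bits} \iff N = 2^{25} \text{ distinguishable display states.}
```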

  13. Transaction Processing Using Remote Procedure Calls (RPC) for a Heterogeneous Distributed Clinical Information System

    PubMed Central

    Tolchin, Stephen G.; Bergan, Eric S.; Arseniev, Marina; Kuzmak, Peter; Nordquist, Roger; Siegel, Dennis

    1986-01-01

    The Johns Hopkins Hospital is developing a distributed clinical information system that integrates functionally several UNIX, IBM MVS/CICS and MUMPS computer systems. Distributed applications development is accomplished by interprocess communications across Ethernet using remote procedure calls. The remote procedure call (RPC) protocol provides a standard approach to the development of distributed applications using the metaphor of a subroutine call. The Sun Microsystems RPC and XDR (external data representation) protocols have been implemented in these environments. The systems, the distributed model, RPC implementations and applications examples are discussed.
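
    The Sun RPC/XDR interfaces themselves are not reproduced here; as a hedged, analogous illustration of the subroutine-call metaphor described above, the sketch below uses Python's standard-library XML-RPC. The service port, procedure name, and return fields are hypothetical.

```python
# Server side: expose a procedure that remote clients call like a local subroutine.
from xmlrpc.server import SimpleXMLRPCServer

def get_patient_summary(patient_id):
    # Hypothetical lookup; a real system would query a clinical database.
    return {"id": patient_id, "status": "admitted"}

server = SimpleXMLRPCServer(("localhost", 8000), allow_none=True)
server.register_function(get_patient_summary, "get_patient_summary")
# server.serve_forever()  # uncomment to run the server

# Client side: the remote call reads like an ordinary subroutine call.
# import xmlrpc.client
# proxy = xmlrpc.client.ServerProxy("http://localhost:8000/")
# print(proxy.get_patient_summary("12345"))
```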

  14. Organizations as Information Processing Systems. Environmental Characteristics, Company Performance, and Chief Executive Scanning: An Empirical Study.

    DTIC Science & Technology

    1986-04-01

    perhaps more than any other factor, affects organizational structure, internal processes, and managerial decision making (Duncan, 1972; Pfeffer and...company, call an immediate subordinate, or read an internal report or professional journal. Top management scanning tends to be irregular rather than...Mode is derived from Aguilar's (1967) designation of information sources as personal, impersonal, internal or external. Scanning frequency. Research

  15. Information technology standards process guide

    SciTech Connect

    Not Available

    1994-04-01

    This document presents a logical and realistic approach to implementation of the Information Technology Standards Program throughout the Department and its management and operating contractors, as described in Department of Energy 1360.3C, Information Technology Standards, dated October 19, 1992. To take better advantage of commercial advances and investments in information technology resources, it is paramount that the Department of Energy move as rapidly as programmatically feasible to an open systems environment. The process revolves around the selection of interface standards in areas such as multi-system and multi-processor interconnects, operating and database management systems, graphics, and security. This new approach will result in reduced production, operation, and maintenance costs, and more effective system integration. The Information Technology Standards Process Guide provides a model that may be tailored to Department of Energy sites. It also assists sites in understanding the Information Technology Standards Program. It is not an architectural description to be used in implementing a corporate information systems environment.

  16. Physical limits in information processing

    NASA Astrophysics Data System (ADS)

    Keyes, Robert W.

    Fundamental physical principles limiting improvements in the speed, fabrication and operating costs, and degree of miniaturization of electronic components for information processing are explored. Topics addressed include the representation of information for communication and computation, system architectures, the nature of three- and two-terminal devices, voltage, bipolar transistors, FETs, MESFETs, soft errors; chip wiring, electromigration, and chip interconnects; fabrication methods; energy dissipation; and power-supply cooling. Extensive diagrams and graphs are provided.

  17. Information of Open Systems

    NASA Astrophysics Data System (ADS)

    Klimontovich, Yuri L.

    In the theory of communication, two definitions of the concept "information" are known. One of them coincides in form with the Boltzmann entropy. The second defines information as the difference between unconditional and conditional entropies. In the present work the latter is used to define the information about states of open systems at various values of the control parameter. Two kinds of open systems are considered. The first class comprises systems that are in an equilibrium state when the control parameter is zero. The information about the equilibrium state is equal to zero. During self-organization, as the system departs from the equilibrium state, the information increases. For open systems of this class, a conservation law for the sum of information and entropy is proved for all values of the control parameter. For open systems of the second class, an equilibrium state is impossible. For them, the concept of a "norm of chaoticity" is introduced, which allows two kinds of self-organization processes to be considered and the corresponding definitions of information to be given. The treatment is illustrated with a number of classical and quantum examples of physical systems; the example of a medico-biological system is also considered.
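
    In standard notation (not quoted from the paper itself), the two definitions and the conservation law mentioned in the abstract can be written as:

```latex
% Boltzmann-entropy form and the conditional-entropy (difference) form of information
S = -\sum_i p_i \ln p_i ,
\qquad
I = H(X) - H(X \mid Y).
% Conservation law claimed for the first class of open systems, a = control parameter
I(a) + S(a) = \mathrm{const} \quad \text{for all values of } a .
```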

  18. Intelligent Information Systems.

    ERIC Educational Resources Information Center

    Zabezhailo, M. I.; Finn, V. K.

    1996-01-01

    An Intelligent Information System (IIS) uses data warehouse technology to facilitate the cycle of data and knowledge processing, including input, standardization, storage, representation, retrieval, calculation, and delivery. This article provides an overview of IIS products and artificial intelligence systems, illustrates examples of IIS…

  19. Novel, Web-based, information-exploration approach for improving operating room logistics and system processes.

    PubMed

    Nagy, Paul G; Konewko, Ramon; Warnock, Max; Bernstein, Wendy; Seagull, Jacob; Xiao, Yan; George, Ivan; Park, Adrian

    2008-03-01

    Routine clinical information systems now have the ability to gather large amounts of data that surgical managers can access to create a seamless and proactive approach to streamlining operations and minimizing delays. The challenge lies in aggregating and displaying these data in an easily accessible format that provides useful, timely information on current operations. A Web-based, graphical dashboard is described in this study, which can be used to interpret clinical operational data, allow managers to see trends in data, and help identify inefficiencies that were not apparent with more traditional, paper-based approaches. The dashboard provides a visual decision support tool that assists managers in pinpointing areas for continuous quality improvement. The limitations of paper-based techniques, the development of the automated display system, and key performance indicators in analyzing aggregate delays, time, specialties, and teamwork are reviewed. Strengths, weaknesses, opportunities, and threats associated with implementing such a program in the perioperative environment are summarized.
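
    As a minimal sketch of the kind of aggregate delay indicator such a dashboard might compute (the column names and values are hypothetical, not taken from the described system):

```python
import pandas as pd

# Hypothetical operating-room case log
cases = pd.DataFrame({
    "specialty": ["General", "Ortho", "General", "Cardiac", "Ortho"],
    "scheduled_start": pd.to_datetime([
        "2008-03-01 08:00", "2008-03-01 08:00", "2008-03-01 10:30",
        "2008-03-01 07:30", "2008-03-01 12:00"]),
    "actual_start": pd.to_datetime([
        "2008-03-01 08:20", "2008-03-01 08:05", "2008-03-01 11:10",
        "2008-03-01 07:45", "2008-03-01 12:00"]),
})

# Key performance indicator: case start delay, aggregated by specialty
cases["delay_min"] = (
    cases["actual_start"] - cases["scheduled_start"]
).dt.total_seconds() / 60
kpi = cases.groupby("specialty")["delay_min"].agg(["mean", "max"])
print(kpi)
```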

  20. Information Processing Research

    DTIC Science & Technology

    1988-01-01

    developing demonstrable vision systems. This report reviews our progress since the October 1984 workshop proceedings. The highlights in our Program in...A new class of algorithms based on the Boltzmann Machine is introduced and compared to previously developed algorithms. The report includes a review of...The "compression analysis" method embodies a post-processing strategy that rewrites learned control rules, increasing their readability and reducing their match

  1. Data Systems vs. Information Systems

    PubMed Central

    Amatayakul, Margret K.

    1982-01-01

    This paper examines the current status of “hospital information systems” with respect to the distinction between data systems and information systems. It is proposed that the systems currently existing are incomplete data systems, resulting in ineffective information systems.

  2. Formalize clinical processes into electronic health information systems: Modelling a screening service for diabetic retinopathy.

    PubMed

    Eguzkiza, Aitor; Trigo, Jesús Daniel; Martínez-Espronceda, Miguel; Serrano, Luis; Andonegui, José

    2015-08-01

    Most healthcare services use information and communication technologies to reduce and redistribute the workload associated with follow-up of chronic conditions. However, the lack of normalization of the information handled in and exchanged between such services hinders the scalability and extendibility. The use of medical standards for modelling and exchanging information, especially dual-model based approaches, can enhance the features of screening services. Hence, the approach of this paper is twofold. First, this article presents a generic methodology to model patient-centered clinical processes. Second, a proof of concept of the proposed methodology was conducted within the diabetic retinopathy (DR) screening service of the Health Service of Navarre (Spain) in compliance with a specific dual-model norm (openEHR). As a result, a set of elements required for deploying a model-driven DR screening service has been established, namely: clinical concepts, archetypes, termsets, templates, guideline definition rules, and user interface definitions. This model fosters reusability, because those elements are available to be downloaded and integrated in any healthcare service, and interoperability, since from then on such services can share information seamlessly.

  3. Engineering Review Information System

    NASA Technical Reports Server (NTRS)

    Grems, III, Edward G. (Inventor); Henze, James E. (Inventor); Bixby, Jonathan A. (Inventor); Roberts, Mark (Inventor); Mann, Thomas (Inventor)

    2015-01-01

    A disciplinal engineering review computer information system and method by defining a database of disciplinal engineering review process entities for an enterprise engineering program, opening a computer supported engineering item based upon the defined disciplinal engineering review process entities, managing a review of the opened engineering item according to the defined disciplinal engineering review process entities, and closing the opened engineering item according to the opened engineering item review.

  4. Altered visual information processing systems in bipolar disorder: evidence from visual MMN and P3

    PubMed Central

    Maekawa, Toshihiko; Katsuki, Satomi; Kishimoto, Junji; Onitsuka, Toshiaki; Ogata, Katsuya; Yamasaki, Takao; Ueno, Takefumi; Tobimatsu, Shozo; Kanba, Shigenobu

    2013-01-01

    Objective: Mismatch negativity (MMN) and P3 are unique ERP components that provide objective indices of human cognitive functions such as short-term memory and prediction. Bipolar disorder (BD) is an endogenous psychiatric disorder characterized by extreme shifts in mood, energy, and ability to function socially. BD patients usually show cognitive dysfunction, and the goal of this study was to assess their altered visual information processing via visual MMN (vMMN) and P3 using windmill pattern stimuli. Methods: Twenty patients with BD and 20 healthy controls matched for age, gender, and handedness participated in this study. Subjects were seated in front of a monitor and listened to a story via earphones. Two types of windmill patterns (standard and deviant) and white circle (target) stimuli were randomly presented on the monitor. All stimuli were presented in random order at 200-ms durations with an 800-ms inter-stimulus interval. Stimuli were presented at 80% (standard), 10% (deviant), and 10% (target) probabilities. The participants were instructed to attend to the story and press a button as soon as possible when the target stimuli were presented. Event-related potentials (ERPs) were recorded throughout the experiment using 128-channel EEG equipment. vMMN was obtained by subtracting standard from deviant stimuli responses, and P3 was evoked from the target stimulus. Results: Mean reaction times for target stimuli in the BD group were significantly higher than those in the control group. Additionally, mean vMMN-amplitudes and peak P3-amplitudes were significantly lower in the BD group than in controls. Conclusions: Abnormal vMMN and P3 in patients indicate a deficit of visual information processing in BD, which is consistent with their increased reaction time to visual target stimuli. Significance: Both bottom-up and top-down visual information processing are likely altered in BD. PMID:23898256

  5. Altered visual information processing systems in bipolar disorder: evidence from visual MMN and P3.

    PubMed

    Maekawa, Toshihiko; Katsuki, Satomi; Kishimoto, Junji; Onitsuka, Toshiaki; Ogata, Katsuya; Yamasaki, Takao; Ueno, Takefumi; Tobimatsu, Shozo; Kanba, Shigenobu

    2013-01-01

    Mismatch negativity (MMN) and P3 are unique ERP components that provide objective indices of human cognitive functions such as short-term memory and prediction. Bipolar disorder (BD) is an endogenous psychiatric disorder characterized by extreme shifts in mood, energy, and ability to function socially. BD patients usually show cognitive dysfunction, and the goal of this study was to assess their altered visual information processing via visual MMN (vMMN) and P3 using windmill pattern stimuli. Twenty patients with BD and 20 healthy controls matched for age, gender, and handedness participated in this study. Subjects were seated in front of a monitor and listened to a story via earphones. Two types of windmill patterns (standard and deviant) and white circle (target) stimuli were randomly presented on the monitor. All stimuli were presented in random order at 200-ms durations with an 800-ms inter-stimulus interval. Stimuli were presented at 80% (standard), 10% (deviant), and 10% (target) probabilities. The participants were instructed to attend to the story and press a button as soon as possible when the target stimuli were presented. Event-related potentials (ERPs) were recorded throughout the experiment using 128-channel EEG equipment. vMMN was obtained by subtracting standard from deviant stimuli responses, and P3 was evoked from the target stimulus. Mean reaction times for target stimuli in the BD group were significantly higher than those in the control group. Additionally, mean vMMN-amplitudes and peak P3-amplitudes were significantly lower in the BD group than in controls. Abnormal vMMN and P3 in patients indicate a deficit of visual information processing in BD, which is consistent with their increased reaction time to visual target stimuli. Both bottom-up and top-down visual information processing are likely altered in BD.
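
    As a minimal sketch of the subtraction described in both records above (not the authors' analysis code), the following computes a vMMN difference wave from hypothetical epoch arrays; the trial counts, channel count, sampling rate, and latency window are assumptions.

```python
import numpy as np

# Hypothetical epoched EEG: (n_trials, n_channels, n_samples)
rng = np.random.default_rng(1)
standard_epochs = rng.normal(0.0, 1.0, (160, 128, 250))
deviant_epochs = rng.normal(0.0, 1.0, (20, 128, 250))

# Average the ERPs per condition, then subtract standard from deviant
erp_standard = standard_epochs.mean(axis=0)
erp_deviant = deviant_epochs.mean(axis=0)
vmmn = erp_deviant - erp_standard                  # visual mismatch negativity wave

# Most negative vMMN amplitude per channel within an assumed latency window
fs = 250                                           # assumed sampling rate, Hz
window = slice(int(0.15 * fs), int(0.35 * fs))     # ~150-350 ms post-stimulus
peak_amplitude = vmmn[:, window].min(axis=1)
print(peak_amplitude.shape)                        # (128,) one value per channel
```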

  6. SU-F-P-01: Changing Your Oncology Information System: A Detailed Process and Lessons Learned

    SciTech Connect

    Abing, C

    2016-06-15

    Purpose: Radiation Oncology departments are faced with many options for pairing their treatment machines with record-and-verify systems. Recently, there has been a push toward a single-vendor solution. To achieve this, the department must go through an intense and rigorous transition process. Our department has recently completed this process, and we now offer a detailed description of the process along with lessons learned. Methods: Our cancer center transitioned from a multi-vendor department to a single-vendor department over the 2015 calendar year. Our staff was partitioned into superuser, interface, migration, and go-live teams. Six months after successful implementation, a detailed survey was sent to the radiation oncology department to determine areas for improvement as well as successes in the process. Results: The transition between record-and-verify systems was considered a complete success. The survey results did point out some staff inefficiencies to address, both in interactions among staff and with the vendors. Conclusion: Though this process was intricate and lengthy, it can be made easier with careful planning and detailed designation of project responsibilities. Our survey results and retrospective analysis of the transition are valuable to those wishing to make this change.

  7. An Ada implementation of the network manager for the advanced information processing system

    NASA Technical Reports Server (NTRS)

    Nagle, Gail A.

    1986-01-01

    From an implementation standpoint, the Ada language provided many features which facilitated the data and procedure abstraction process. The language supported a design which was dynamically flexible (despite strong typing), modular, and self-documenting. Adequate training of programmers requires access to an efficient compiler which supports full Ada. When the performance issues for real time processing are finally addressed by more stringent requirements for tasking features and the development of efficient run-time environments for embedded systems, the full power of the language will be realized.

  8. Next generation information systems

    SciTech Connect

    Limback, Nathan P; Medina, Melanie A; Silva, Michelle E

    2010-01-01

    The Information Systems Analysis and Development (ISAD) Team of the Safeguards Systems Group at Los Alamos National Laboratory (LANL) has been developing web based information and knowledge management systems for sixteen years. Our vision is to rapidly and cost effectively provide knowledge management solutions in the form of interactive information systems that help customers organize, archive, post and retrieve nonproliferation and safeguards knowledge and information vital to their success. The team has developed several comprehensive information systems that assist users in the betterment and growth of their organizations and programs. Through our information systems, users are able to streamline operations, increase productivity, and share and access information from diverse geographic locations. The ISAD team is also producing interactive visual models. Interactive visual models provide many benefits to customers beyond the scope of traditional full-scale modeling. We have the ability to simulate a vision that a customer may propose, without the time constraints of traditional engineering modeling tools. Our interactive visual models can be used to access specialized training areas, controlled areas, and highly radioactive areas, as well as review site-specific training for complex facilities, and asset management. Like the information systems that the ISAD team develops, these models can be shared and accessed from any location with access to the internet. The purpose of this paper is to elaborate on the capabilities of information systems and interactive visual models as well as consider the possibility of combining the two capabilities to provide the next generation of information systems. The collection, processing, and integration of data in new ways can contribute to the security of the nation by providing indicators and information for timely action to decrease the traditional and new nuclear threats. Modeling and simulation tied to comprehensive

  9. Modelling health care processes for eliciting user requirements: a way to link a quality paradigm and clinical information system design.

    PubMed

    Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M

    2000-01-01

    Hospital information systems have to support quality improvement objectives. The design issues of health care information systems can be classified into three categories: 1) time-oriented and event-labelled storage of patient data; 2) contextual support of decision-making; 3) capabilities for modular upgrading. The elicitation of the requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualize clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the field of blood transfusion. An object-oriented data model of a process has been defined in order to identify its main components: activity, sub-process, resources, constraints, guidelines, parameters and indicators. Although some aspects of activity, such as "where", "what else", and "why" are poorly represented by the data model alone, this method of requirement elicitation fits the dynamic of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for this approach to be generalised within the organisation, for the processes to be interrelated, and for their characteristics to be shared.
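
    To make the listed components concrete, here is a hedged Python sketch of such an object-oriented process model; the class and field names follow the abstract's vocabulary but are otherwise illustrative, not the authors' schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Activity:
    name: str
    location: str                                   # the "where" of the activity
    resources: List[str] = field(default_factory=list)
    constraints: List[str] = field(default_factory=list)
    guidelines: List[str] = field(default_factory=list)
    parameters: dict = field(default_factory=dict)
    indicators: dict = field(default_factory=dict)

@dataclass
class Process:
    name: str
    activities: List[Activity] = field(default_factory=list)
    sub_processes: List["Process"] = field(default_factory=list)

# Example: a fragment of a blood transfusion process (illustrative values only)
issue_unit = Activity(
    name="Issue blood unit",
    location="Blood bank",
    resources=["Technologist", "Blood unit"],
    constraints=["Compatibility test completed"],
    guidelines=["Transfusion safety guideline"],
    indicators={"traceability": "unit-to-patient link recorded"},
)
transfusion = Process(name="Blood transfusion", activities=[issue_unit])
print(len(transfusion.activities))
```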

  10. Examination of the Nonlinear Dynamic Systems Associated with Science Student Cognition While Engaging in Science Information Processing

    ERIC Educational Resources Information Center

    Lamb, Richard; Cavagnetto, Andy; Akmal, Tariq

    2016-01-01

    A critical problem with the examination of learning in education is that there is an underlying assumption that the dynamic systems associated with student information processing can be measured using static linear assessments. This static linear approach does not provide sufficient ability to characterize learning. Much of the modern research…

  11. Examination of the Nonlinear Dynamic Systems Associated with Science Student Cognition While Engaging in Science Information Processing

    ERIC Educational Resources Information Center

    Lamb, Richard; Cavagnetto, Andy; Akmal, Tariq

    2016-01-01

    A critical problem with the examination of learning in education is that there is an underlying assumption that the dynamic systems associated with student information processing can be measured using static linear assessments. This static linear approach does not provide sufficient ability to characterize learning. Much of the modern research…

  12. DOD Financial Management: Improved Controls, Processes, and Systems Are Needed for Accurate and Reliable Financial Information

    DTIC Science & Technology

    2011-09-23

    effective oversight, • well-defined enterprise architecture, and • successful implementation of the enterprise resource planning systems. View GAO...commercial off-the-shelf (COTS) software consisting of multiple, integrated functional modules that perform a variety of business-related tasks such as...Remediation Plan consists of a written plan covering the initial 11 financial statement process notices of findings and recommendations (NFR) to comply

  13. Further argument for the existence of a pacemaker in the human information processing system.

    PubMed

    Burle, B; Bonnet, M

    1997-11-01

    To support the idea that temporal information processing may depend on an internal clock, Treisman et al. proposed a pacemaker model (Treisman, M., Faulkner, A., Naish, P.L.N., Brogan, D., 1990. The internal clock: Evidence for a temporal oscillator underlying time perception with some estimates of its characteristic frequency. Perception 19, 705-743.) and a technique for interfering with it by introducing an external periodic phenomenon. Experimental results obtained by these authors on time estimation and production tasks support this model. In another study, Treisman et al. established that the pacemaker also affects reaction times (RT) (Treisman, M., Faulkner, A., Naish, P.L.N., 1992. On the relation between time perception and the timing of motor action: Evidence for a temporal oscillator controlling the timing of movement. Quarterly Journal of Experimental Psychology 45A, 235-263.). In the present study, we addressed the question as to which information processing stage (Sanders, A.F., 1980. Stage analysis of reaction process. In: Stelmach, G.E., Requin, J. (Eds.). Tutorials in motor behavior. North-Holland, Amsterdam, pp. 331-354.) is affected by this internal clock. For this purpose, we used the Additive Factors Method (Sternberg, S., 1969. The discovery of processing stages: Extension of Donders' method. In: Koster, W.G. (Ed.). Attention and Performance II. Acta Psychologica 30, 276-315.). To vary sensorial processing time, we used two visual stimulus intensities. Stimulus-response mapping was manipulated to enhance central processing time. To modify the duration of the motor stages, the two responses could be given by two fingers on the same hand (right ring vs. middle finger) or by two fingers of different hands (right ring vs. left middle finger). Intensity of the stimulus, stimulus-response mapping, and repertoire of responses were found to be additive. We obtained RT modulations similar to those obtained by Treisman et al. in 1992. No first order

  14. [A Medical Devices Management Information System Supporting Full Life-Cycle Process Management].

    PubMed

    Tang, Guoping; Hu, Liang

    2015-07-01

    Medical equipment is an essential resource for carrying out medical work. How to ensure the safety and reliability of medical equipment in diagnosis, while reducing procurement and maintenance costs, is a topic of broad concern. In this paper, product lifecycle management (PLM) and enterprise resource planning (ERP) concepts are applied to establish a lifecycle management information system. Through integration and analysis of the relevant data from each stage of the life cycle, the system helps ensure the safety and reliability of medical equipment in operation and provides convincing data for meticulous management.

  15. Development of a prototype spatial information processing system for hydrologic research

    NASA Technical Reports Server (NTRS)

    Sircar, Jayanta K.

    1991-01-01

    Significant advances have been made in the last decade in the areas of Geographic Information Systems (GIS) and spatial analysis technology, both in hardware and software. Science user requirements are so problem-specific that currently no single system can satisfy all of the needs. The work presented here forms part of a conceptual framework for an all-encompassing science-user workstation system. While definition and development of the system as a whole will take several years, it is intended that small-scale projects such as the current work will address some of the more short-term needs. Such projects can provide a quick mechanism to integrate tools into the workstation environment, forming a larger, more complete hydrologic analysis platform. Two components that are very important to the practical use of remote sensing and digital map data in hydrology are described here: a graph-theoretic technique to rasterize elevation contour maps, and a system to manipulate synthetic aperture radar (SAR) data files and extract soil moisture data.

  16. Application of digital image processing techniques and information systems to water quality monitoring of Lake Tahoe

    NASA Technical Reports Server (NTRS)

    Smith, A. Y.; Blackwell, R. J.

    1981-01-01

    The Tahoe basin occupies over 500 square miles of territory located in a graben straddling the boundary between California and Nevada. Lake Tahoe contains 126 million acre-feet of water. Since the 1950's the basin has experienced an ever increasing demand for land development at the expense of the natural watershed. Discharge of sediment to the lake has greatly increased owing to accelerated human interference, and alterations to the natural drainage patterns are evident in some areas. In connection with an investigation of the utility of a comprehensive system that takes into account the causes as well as the effects of lake eutrophication, it has been attempted to construct an integrated and workable data base, comprised of currently available data sources for the Lake Tahoe region. Attention is given to the image based information system (IBIS), the construction of the Lake Tahoe basin data base, and the application of the IBIS concept to the Lake Tahoe basin.

  17. Application of digital image processing techniques and information systems to water quality monitoring of Lake Tahoe

    NASA Technical Reports Server (NTRS)

    Smith, A. Y.; Blackwell, R. J.

    1981-01-01

    The Tahoe basin occupies over 500 square miles of territory located in a graben straddling the boundary between California and Nevada. Lake Tahoe contains 126 million acre-feet of water. Since the 1950's the basin has experienced an ever increasing demand for land development at the expense of the natural watershed. Discharge of sediment to the lake has greatly increased owing to accelerated human interference, and alterations to the natural drainage patterns are evident in some areas. In connection with an investigation of the utility of a comprehensive system that takes into account the causes as well as the effects of lake eutrophication, it has been attempted to construct an integrated and workable data base, comprised of currently available data sources for the Lake Tahoe region. Attention is given to the image based information system (IBIS), the construction of the Lake Tahoe basin data base, and the application of the IBIS concept to the Lake Tahoe basin.

  18. A Review of Shipboard Uniform Automated Data Processing System (SUADPS) as a Financial Information and Control System for OPTAR Funds.

    DTIC Science & Technology

    1983-06-01

    Data Processing Line Printer (U-1569)--the primary output unit for printed format computer information with an average print speed of 450 lines per...accomplished in FY 1981/82, replacing the Digital Data Recorder/Reproducer and the Data Processing Line Printer with newer technology and increased...in the Line Printers’ average print speed to 1000 lines per minute. During the formulation of this initial program, SUADPS users expressed concern

  19. Evaluation of the clinical process in a critical care information system using the Lean method: a case study

    PubMed Central

    2012-01-01

    Background There are numerous applications for Health Information Systems (HIS) that support specific tasks in the clinical workflow. The Lean method has been used increasingly to optimize clinical workflows, by removing waste and shortening the delivery cycle time. There are a limited number of studies on Lean applications related to HIS. Therefore, we applied the Lean method to evaluate the clinical processes related to HIS, in order to evaluate its efficiency in removing waste and optimizing the process flow. This paper presents the evaluation findings of these clinical processes, with regards to a critical care information system (CCIS), known as IntelliVue Clinical Information Portfolio (ICIP), and recommends solutions to the problems that were identified during the study. Methods We conducted a case study under actual clinical settings, to investigate how the Lean method can be used to improve the clinical process. We used observations, interviews, and document analysis, to achieve our stated goal. We also applied two tools from the Lean methodology, namely the Value Stream Mapping and the A3 problem-solving tools. We used eVSM software to plot the Value Stream Map and A3 reports. Results We identified a number of problems related to inefficiency and waste in the clinical process, and proposed an improved process model. Conclusions The case study findings show that the Value Stream Mapping and the A3 reports can be used as tools to identify waste and integrate the process steps more efficiently. We also proposed a standardized and improved clinical process model and suggested an integrated information system that combines database and software applications to reduce waste and data redundancy. PMID:23259846

  20. Evaluation of the clinical process in a critical care information system using the Lean method: a case study.

    PubMed

    Yusof, Maryati Mohd; Khodambashi, Soudabeh; Mokhtar, Ariffin Marzuki

    2012-12-21

    There are numerous applications for Health Information Systems (HIS) that support specific tasks in the clinical workflow. The Lean method has been used increasingly to optimize clinical workflows, by removing waste and shortening the delivery cycle time. There are a limited number of studies on Lean applications related to HIS. Therefore, we applied the Lean method to evaluate the clinical processes related to HIS, in order to evaluate its efficiency in removing waste and optimizing the process flow. This paper presents the evaluation findings of these clinical processes, with regards to a critical care information system (CCIS), known as IntelliVue Clinical Information Portfolio (ICIP), and recommends solutions to the problems that were identified during the study. We conducted a case study under actual clinical settings, to investigate how the Lean method can be used to improve the clinical process. We used observations, interviews, and document analysis, to achieve our stated goal. We also applied two tools from the Lean methodology, namely the Value Stream Mapping and the A3 problem-solving tools. We used eVSM software to plot the Value Stream Map and A3 reports. We identified a number of problems related to inefficiency and waste in the clinical process, and proposed an improved process model. The case study findings show that the Value Stream Mapping and the A3 reports can be used as tools to identify waste and integrate the process steps more efficiently. We also proposed a standardized and improved clinical process model and suggested an integrated information system that combines database and software applications to reduce waste and data redundancy.

  1. A Pressure Injection System for Investigating the Neuropharmacology of Information Processing in Awake Behaving Macaque Monkey Cortex

    PubMed Central

    Veith, Vera K.; Quigley, Cliodhna; Treue, Stefan

    2016-01-01

    The top-down modulation of feed-forward cortical information processing is functionally important for many cognitive processes, including the modulation of sensory information processing by attention. However, little is known about which neurotransmitter systems are involved in such modulations. A practical way to address this question is to combine single-cell recording with local and temporary neuropharmacological manipulation in a suitable animal model. Here we demonstrate a technique combining acute single-cell recordings with the injection of neuropharmacological agents in the direct vicinity of the recording electrode. The video shows the preparation of the pressure injection/recording system, including preparation of the substance to be injected. We show a rhesus monkey performing a visual attention task and the procedure of single-unit recording with block-wise pharmacological manipulations. PMID:27023110

  2. Automating the Air Force Retail-Level Equipment Management Process: An Application of Microcomputer-Based Information Systems Techniques

    DTIC Science & Technology

    1988-09-01

    could use the assistance of a microcomputer-based management information system. However, adequate system design and development requires an in-depth...understanding of the Equipment Management Section and the environment in which it functions were asked and answered. Then, a management information system was...designed, developed, and tested. The management information system is called the Equipment Management Information System (EMIS).

  3. A multichip neuromorphic system for spike-based visual information processing.

    PubMed

    Vogelstein, R Jacob; Mallik, Udayan; Culurciello, Eugenio; Cauwenberghs, Gert; Etienne-Cummings, Ralph

    2007-09-01

    We present a multichip, mixed-signal VLSI system for spike-based vision processing. The system consists of an 80 x 60 pixel neuromorphic retina and a 4800 neuron silicon cortex with 4,194,304 synapses. Its functionality is illustrated with experimental data on multiple components of an attention-based hierarchical model of cortical object recognition, including feature coding, salience detection, and foveation. This model exploits arbitrary and reconfigurable connectivity between cells in the multichip architecture, achieved by asynchronously routing neural spike events within and between chips according to a memory-based look-up table. Synaptic parameters, including conductance and reversal potential, are also stored in memory and are used to dynamically configure synapse circuits within the silicon neurons.
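
    A hedged sketch of the memory-based look-up routing described above: each presynaptic spike address maps through a table to target synapses and their stored parameters. The table contents, update rule, and numeric values are invented for illustration and do not reproduce the chip's firmware.

```python
# Address-event routing via a lookup table (illustrative sketch only).
# Each source neuron address maps to a list of (target_neuron, conductance, e_rev).
routing_table = {
    0: [(101, 0.8, 0.0), (102, 0.3, -70.0)],   # one excitatory, one inhibitory target
    1: [(101, 0.5, 0.0)],
}

membrane = {101: -65.0, 102: -65.0}            # hypothetical membrane potentials (mV)

def route_spike(source_address):
    """Deliver a spike event to the targets programmed for this source address."""
    for target, g, e_rev in routing_table.get(source_address, []):
        # Simple conductance-based step toward the synapse's reversal potential
        membrane[target] += g * (e_rev - membrane[target]) * 0.1

for spike in [0, 1, 0]:                        # a short stream of address events
    route_spike(spike)
print(membrane)
```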

  4. Management Information Systems.

    ERIC Educational Resources Information Center

    Finlayson, Jean, Ed.

    1989-01-01

    This collection of papers addresses key questions facing college managers and others choosing, introducing, and living with big, complex computer-based systems. "What Use the User Requirement?" (Tony Coles) stresses the importance of an information strategy driven by corporate objectives, not technology. "Process of Selecting a…

  5. NEEDS - Information Adaptive System

    NASA Technical Reports Server (NTRS)

    Kelly, W. L.; Benz, H. F.; Meredith, B. D.

    1980-01-01

    The Information Adaptive System (IAS) is an element of the NASA End-to-End Data System (NEEDS) Phase II and is focused toward onboard image processing. The IAS is a data preprocessing system which is closely coupled to the sensor system. Some of the functions planned for the IAS include sensor response nonuniformity correction, geometric correction, data set selection, data formatting, packetization, and adaptive system control. The inclusion of these sensor data preprocessing functions onboard the spacecraft will significantly improve the extraction of information from the sensor data in a timely and cost effective manner, and provide the opportunity to design sensor systems which can be reconfigured in near real-time for optimum performance. The purpose of this paper is to present the preliminary design of the IAS and the plans for its development.
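
    One of the preprocessing functions listed above, sensor response nonuniformity correction, is often implemented as a two-point (dark/flat-field) correction; the sketch below is a generic illustration under that assumption, not the IAS design itself.

```python
import numpy as np

def two_point_nonuniformity_correction(raw, dark, flat):
    """Generic two-point correction removing per-pixel offset and gain variation.

    raw  : raw sensor frame
    dark : dark (zero-illumination) reference frame
    flat : uniform-illumination reference frame
    """
    gain = flat - dark
    gain[gain == 0] = 1e-9                     # guard against division by zero
    corrected = (raw - dark) / gain
    return corrected * np.mean(gain)           # restore the overall scale

# Hypothetical 4x4 frames
rng = np.random.default_rng(2)
dark = rng.normal(10.0, 1.0, (4, 4))
flat = dark + rng.normal(100.0, 5.0, (4, 4))
raw = dark + 0.5 * (flat - dark)               # scene at half of full response
print(np.round(two_point_nonuniformity_correction(raw, dark, flat), 1))
```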

  6. HS3 Information System

    NASA Astrophysics Data System (ADS)

    Maskey, M.; Conover, H.; Ramachandran, R.; Kulkarni, A.; Mceniry, M.; Stone, B.

    2015-12-01

    The Global Hydrology Resource Center (GHRC) is developing an enterprise information system to manage and better serve data for Hurricane and Severe Storm Sentinel (HS3), a NASA airborne field campaign. HS3 is a multiyear campaign aimed at helping scientists understand the physical processes that contribute to hurricane intensification. For in-depth analysis, HS3 encompasses not only airborne data but also a variety of in-situ, satellite, simulation, and flight report data. Thus, HS3 presents a unique challenge in information system design, one the GHRC team is prepared to handle through its experience with previous airborne campaigns. Much of the supplementary information and many of the reports collected during the mission contain information-rich content that provides mission snapshots. In particular, flight information, instrument status, weather reports, and summary statistics offer vital knowledge about the corresponding science data. Furthermore, such information helps narrow the science data of interest. Therefore, the GHRC team is building an HS3 information system that augments the current GHRC data management framework to support search and discovery of airborne science data with interactive visual exploration. Specifically, the HS3 information system includes a tool to visually play back mission flights alongside other traditional search and discovery interfaces. This playback capability allows users to follow the flight in time and visualize the collected data. The flight summary and analyzed information are also presented during the playback. If the observed data are of interest, users can order the data from GHRC using the interface, and they will be able to order just the data for the part of the flight that interests them. This presentation will demonstrate the system's components, from visual exploration to data download.

  7. Training Management Information System

    SciTech Connect

    Rackley, M.P.

    1989-01-01

    The Training Management Information System (TMIS) is an integrated information system for all training related activities. TMIS is at the leading edge of training information systems used in the nuclear industry. The database contains all the necessary records to confirm the department's adherence to accreditation criteria and houses all test questions, student records and information needed to evaluate the training process. The key to the TMIS system is that the impact of any change (i.e., procedure change, new equipment, safety incident in the commercial nuclear industry, etc.) can be tracked throughout the training process. This ensures the best training can be performed that meets the needs of the employees. TMIS is comprised of six functional areas: Job and Task Analysis, Training Materials Design and Development, Exam Management, Student Records/Scheduling, Evaluation, and Commitment Tracking. The system consists of a VAX 6320 Cluster with IBM and Macintosh computers tied into an ethernet with the VAX. Other peripherals are also tied into the system: Exam Generation Stations to include mark sense readers for test grading, Production PC's for Desk-Top Publishing of Training Material, and PC Image Workstations. 5 figs.

  8. Modelling health care processes for eliciting user requirements: a way to link a quality paradigm and clinical information system design.

    PubMed

    Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M

    2001-12-01

    Healthcare institutions are looking at ways to increase their efficiency by reducing costs while providing care services with a high level of safety. Thus, hospital information systems have to support quality improvement objectives. The elicitation of the requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualise clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the blood transfusion process. An object-oriented data model of a process has been defined in order to organise the data dictionary. Although some aspects of activity, such as 'where', 'what else', and 'why' are poorly represented by the data model alone, this method of requirement elicitation fits the dynamic of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for the processes to be interrelated, and for their characteristics to be shared, in order to avoid data redundancy and to fit the gathering of data with the provision of care.

  9. The Value of Mechanistic Biophysical Information for Systems-Level Understanding of Complex Biological Processes Such as Cytokinesis

    PubMed Central

    Pollard, Thomas D.

    2014-01-01

    This review illustrates the value of quantitative information including concentrations, kinetic constants and equilibrium constants in modeling and simulating complex biological processes. Although much has been learned about some biological systems without these parameter values, they greatly strengthen mechanistic accounts of dynamical systems. The analysis of muscle contraction is a classic example of the value of combining an inventory of the molecules, atomic structures of the molecules, kinetic constants for the reactions, reconstitutions with purified proteins and theoretical modeling to account for the contraction of whole muscles. A similar strategy is now being used to understand the mechanism of cytokinesis using fission yeast as a favorable model system. PMID:25468329

  10. Introduction to focus issue: intrinsic and designed computation: information processing in dynamical systems--beyond the digital hegemony.

    PubMed

    Crutchfield, James P; Ditto, William L; Sinha, Sudeshna

    2010-09-01

    How dynamical systems store and process information is a fundamental question that touches a remarkably wide set of contemporary issues: from the breakdown of Moore's scaling laws--that predicted the inexorable improvement in digital circuitry--to basic philosophical problems of pattern in the natural world. It is a question that also returns one to the earliest days of the foundations of dynamical systems theory, probability theory, mathematical logic, communication theory, and theoretical computer science. We introduce the broad and rather eclectic set of articles in this Focus Issue that highlights a range of current challenges in computing and dynamical systems.

  11. Introduction to Focus Issue: Intrinsic and Designed Computation: Information Processing in Dynamical Systems-Beyond the Digital Hegemony

    NASA Astrophysics Data System (ADS)

    Crutchfield, James P.; Ditto, William L.; Sinha, Sudeshna

    2010-09-01

    How dynamical systems store and process information is a fundamental question that touches a remarkably wide set of contemporary issues: from the breakdown of Moore's scaling laws—that predicted the inexorable improvement in digital circuitry—to basic philosophical problems of pattern in the natural world. It is a question that also returns one to the earliest days of the foundations of dynamical systems theory, probability theory, mathematical logic, communication theory, and theoretical computer science. We introduce the broad and rather eclectic set of articles in this Focus Issue that highlights a range of current challenges in computing and dynamical systems.

  12. A stochastic model and a functional central limit theorem for information processing in large systems of neurons.

    PubMed

    Höpfner, Reinhard; Brodda, Klaus

    2006-04-01

    The paper deals with information transmission in large systems of neurons. We model the membrane potential in a single neuron belonging to a cell tissue by a non-time-homogeneous Cox-Ingersoll-Ross type diffusion; in terms of its time-varying expectation, this stochastic process can convey deterministic signals. We model the spike train emitted by this neuron as a Poisson point process compensated by the occupation time of the membrane potential process beyond the excitation threshold. In a large system of neurons 1 ≤ i ≤ N processing the same deterministic signal independently, we prove a functional central limit theorem for the pooled spike train collected from the N neurons. This pooled spike train allows the deterministic signal to be recovered, up to an explicit shape transformation.
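
    In standard notation (the paper's exact parametrization may differ), a non-time-homogeneous Cox-Ingersoll-Ross type membrane potential and the compensated spike-counting process described above can be sketched as:

```latex
% CIR-type diffusion for the membrane potential V_t, with time-varying drift level m(t)
dV_t = \bigl( m(t) - \beta V_t \bigr)\, dt + \sigma \sqrt{V_t}\, dW_t .
% Spike train N_t compensated by the occupation time above the excitation threshold K
N_t - \lambda \int_0^t \mathbf{1}\{ V_s > K \}\, ds \quad \text{is a martingale.}
```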

  13. Materials management information systems.

    PubMed

    1996-01-01

    MMIS Selection Process: Outlines steps to follow and describes factors to consider when selecting an MMIS. Also includes our Materials Management Process Evaluation and Needs Assessment Worksheet (which is also available online through ECRInet(TM)) and a list of suggested interview questions to be used when gathering user experience information for systems under consideration. Section 3A. MMIS Vendor Profiles: Presents information for the evaluated systems in a standardized, easy-to-compare format. Profiles include an Executive Summary describing our findings, a discussion of user comments, a listing of MMIS specifications, and information on the vendor's business background. Section 3B. Discussion of Vendor Profile Conclusions and Ratings: Presents our ratings and summarizes our rationale for all evaluated systems. Also includes a blank Vendor Profile Template to be used when gathering information on other vendors and systems. We found that, in general, all of the evaluated systems are able to meet most of the functional needs of a materials management department. However, we did uncover significant differences in the quality of service and support provided by each vendor, and our ratings reflect these differences: we rated two of the systems Acceptable--Preferred and four of the systems Acceptable. We have not yet rated the seventh system because our user experience information may not reflect the vendor's new ownership and management. When this vendor provides the references we requested, we will interview users and supply a rating. We caution readers against basing purchasing decisions solely on our ratings. Each hospital must consider the unique needs of its users and its overall strategic plans--a process that can be aided by using our Process Evaluation and Needs Assessment Worksheet. Our conclusions can then be used to narrow down the number of vendors under consideration...

  14. Information processing of motion in facial expression and the geometry of dynamical systems

    NASA Astrophysics Data System (ADS)

    Assadi, Amir H.; Eghbalnia, Hamid; McMenamin, Brenton W.

    2004-12-01

    An interesting problem in analysis of video data concerns design of algorithms that detect perceptually significant features in an unsupervised manner, for instance methods of machine learning for automatic classification of human expression. A geometric formulation of this genre of problems could be modeled with help of perceptual psychology. In this article, we outline one approach for a special case where video segments are to be classified according to expression of emotion or other similar facial motions. The encoding of realistic facial motions that convey expression of emotions for a particular person P forms a parameter space XP whose study reveals the "objective geometry" for the problem of unsupervised feature detection from video. The geometric features and discrete representation of the space XP are independent of subjective evaluations by observers. While the "subjective geometry" of XP varies from observer to observer, levels of sensitivity and variation in perception of facial expressions appear to share a certain level of universality among members of similar cultures. Therefore, statistical geometry of invariants of XP for a sample of population could provide effective algorithms for extraction of such features. In cases where frequency of events is sufficiently large in the sample data, a suitable framework could be provided to facilitate the information-theoretic organization and study of statistical invariants of such features. This article provides a general approach to encode motion in terms of a particular genre of dynamical systems and the geometry of their flow. An example is provided to illustrate the general theory.

  15. Information processing of motion in facial expression and the geometry of dynamical systems

    NASA Astrophysics Data System (ADS)

    Assadi, Amir H.; Eghbalnia, Hamid; McMenamin, Brenton W.

    2005-01-01

    An interesting problem in analysis of video data concerns design of algorithms that detect perceptually significant features in an unsupervised manner, for instance methods of machine learning for automatic classification of human expression. A geometric formulation of this genre of problems could be modeled with help of perceptual psychology. In this article, we outline one approach for a special case where video segments are to be classified according to expression of emotion or other similar facial motions. The encoding of realistic facial motions that convey expression of emotions for a particular person P forms a parameter space XP whose study reveals the "objective geometry" for the problem of unsupervised feature detection from video. The geometric features and discrete representation of the space XP are independent of subjective evaluations by observers. While the "subjective geometry" of XP varies from observer to observer, levels of sensitivity and variation in perception of facial expressions appear to share a certain level of universality among members of similar cultures. Therefore, statistical geometry of invariants of XP for a sample of population could provide effective algorithms for extraction of such features. In cases where frequency of events is sufficiently large in the sample data, a suitable framework could be provided to facilitate the information-theoretic organization and study of statistical invariants of such features. This article provides a general approach to encode motion in terms of a particular genre of dynamical systems and the geometry of their flow. An example is provided to illustrate the general theory.

  16. Information processes in visual and object buffers of scene understanding system for reliable target detection, separation from background, and identification

    NASA Astrophysics Data System (ADS)

    Kuvich, Gary

    2006-05-01

    Modern target recognition systems suffer from the lack of human-like abilities to understand the visual scene and to detect, unambiguously identify, and recognize objects. As a result, target recognition systems become dysfunctional if the target does not exhibit remarkably distinctive and contrasting features that allow unambiguous separation from the background and identification based on such features. This is somewhat similar to the visual systems of primitive animals like frogs, which can separate and recognize only moving objects. Human vision, however, unambiguously separates any object from its background. It combines a rough but wide peripheral system and a narrow but precise foveal system with visual intelligence that utilizes both scene and object contexts and resolves ambiguity and uncertainty in the visual information. Perceptual grouping is one of the most important processes in human vision; it binds visual information into meaningful patterns and structures. Unlike traditional computer vision models, biologically inspired Network-Symbolic models convert image information into an "understandable" Network-Symbolic format, which is similar to relational knowledge models. The equivalent of the interaction between peripheral and foveal systems in the network-symbolic system is achieved via interaction between the Visual and Object Buffers and the top-level system of Visual Intelligence. This interaction provides recursive, rough context identification of regions of interest in the visual scene and their analysis in the Object Buffer for precise and unambiguous separation of the object from background and clutter, followed by recognition of the target.

  17. Integrated Optical Information Processing

    DTIC Science & Technology

    1988-08-01

    dimensional processing to be performed with inherently one-dimensional signal processing devices. This permits the monolithic or hybrid integration of...effective time delays; 5. Selective partial waveguide outcoupling of channelized light from an integrated optical chip; 6. Potential monolithic integration of optical waveguides with two-dimensional opto-electronic detector technology. The integrated

  18. The evolutionary function of conscious information processing is revealed by its task-dependency in the olfactory system.

    PubMed

    Keller, Andreas

    2014-01-01

    Although many responses to odorous stimuli are mediated without olfactory information being consciously processed, some olfactory behaviors require conscious information processing. I will here contrast situations in which olfactory information is processed consciously to situations in which it is processed non-consciously. This contrastive analysis reveals that conscious information processing is required when an organism is faced with tasks in which there are many behavioral options available. I therefore propose that it is the evolutionary function of conscious information processing to guide behaviors in situations in which the organism has to choose between many possible responses.

  19. 7 CFR 277.18 - Establishment of an Automated Data Processing (ADP) and Information Retrieval System.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...(f), signed by the State or local agency and the State or local central data processing facility whenever a central data processing facility provides ADP services to the State or local agency. Software... central data processing facility or another State or local agency. Service agreements shall be kept on...

  20. 7 CFR 277.18 - Establishment of an Automated Data Processing (ADP) and Information Retrieval System.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...(f), signed by the State or local agency and the State or local central data processing facility whenever a central data processing facility provides ADP services to the State or local agency. Software... central data processing facility or another State or local agency. Service agreements shall be kept on...

  1. 7 CFR 277.18 - Establishment of an Automated Data Processing (ADP) and Information Retrieval System.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... by a system of electronic or electrical machines so interconnected and interacting as to minimize the... organizations other than the State agency to perform such tasks as feasibility studies, system studies, system.... Functional Requirements Specification means an initial definition of the proposed system, which documents...

  2. Computerization of workflows, guidelines, and care pathways: a review of implementation challenges for process-oriented health information systems

    PubMed Central

    Roudsari, Abdul

    2011-01-01

    Objective There is a need to integrate the various theoretical frameworks and formalisms for modeling clinical guidelines, workflows, and pathways, in order to move beyond providing support for individual clinical decisions and toward the provision of process-oriented, patient-centered, health information systems (HIS). In this review, we analyze the challenges in developing process-oriented HIS that formally model guidelines, workflows, and care pathways. Methods A qualitative meta-synthesis was performed on studies published in English between 1995 and 2010 that addressed the modeling process and reported the exposition of a new methodology, model, system implementation, or system architecture. Thematic analysis, principal component analysis (PCA) and data visualisation techniques were used to identify and cluster the underlying implementation 'challenge' themes. Results One hundred and eight relevant studies were selected for review. Twenty-five underlying 'challenge' themes were identified. These were clustered into 10 distinct groups, from which a conceptual model of the implementation process was developed. Discussion and conclusion We found that the development of systems supporting individual clinical decisions is evolving toward the implementation of adaptable care pathways on the semantic web, incorporating formal, clinical, and organizational ontologies, and the use of workflow management systems. These architectures now need to be implemented and evaluated on a wider scale within clinical settings. PMID:21724740

  3. A web accessible scientific workflow system for transparent and reproducible generation of information on subsurface processes from autonomously sensed data

    NASA Astrophysics Data System (ADS)

    Versteeg, R.; Richardson, A.; Thomas, S.; Lu, B.; Neto, J.; Wheeler, M.; Rowe, T.; Parashar, M.; Ankeny, M.

    2005-12-01

    Information on subsurface processes is required for a broad range of applications, including site remediation, groundwater management, fossil fuel production and CO2 sequestration. Data on these processes are obtained from diverse sensor networks, which include physical, hydrological and chemical sensors and semi-permanent geophysical sensors (mainly seismic and resistivity). Currently, processing is done by specialists through the use of commercial and research software packages such as numerical inverse and forward models, statistical data analysis software and visualization and data presentation packages. Information is presented to stakeholders as tables, images and reports. Processing steps, data and assumptions used for information generation are mostly opaque to end users. As data migrate between applications, the steps taken in each application (e.g. in data reduction) are often only partly documented, resulting in irreproducible results. In this approach, interactive tuning of data processing in a systematic way (e.g. changing model parameters, visualization parameters or data used) or using data processing as a discovery tool is de facto impossible. We implemented a web accessible scientific workflow system for subsurface performance monitoring. This system integrates distributed, automated data acquisition from autonomous sensor networks with server-side data management and information visualization through flexible browser-based data access tools. Webservices are used for communication with the sensor networks and interaction with applications. This system was originally developed for a monitoring network at the Gilt Edge Mine Superfund site, but has now been implemented for a range of different sensor networks of different complexity. The workflow framework allows a multitude of existing applications for data analysis and processing to be integrated rapidly and easily in a modular, transparent and reproducible manner. By embedding applications in webservice

  4. MARKETING WESTERN WATER: CAN A PROCESS BASED GEOGRAPHIC INFORMATION SYSTEM IMPROVE REALLOCATION DECISIONS? (R828070)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  6. Laboratory Information Systems.

    PubMed

    Henricks, Walter H

    2015-06-01

    Laboratory information systems (LISs) supply mission-critical capabilities for the vast array of information-processing needs of modern laboratories. LIS architectures include mainframe, client-server, and thin client configurations. The LIS database software manages a laboratory's data. LIS dictionaries are database tables that a laboratory uses to tailor an LIS to the unique needs of that laboratory. Anatomic pathology LIS (APLIS) functions play key roles throughout the pathology workflow, and laboratories rely on LIS management reports to monitor operations. This article describes the structure and functions of APLISs, with emphasis on their roles in laboratory operations and their relevance to pathologists.

  7. Information Processing Research.

    DTIC Science & Technology

    1986-09-01

    extracting phonetic features in a feature-based recognition system. [Thorpe 84] Thorpe, C. E., Fido: Vision and Navigation for a Robot Rover. PhD thesis... work blends into the User Interface Research discussed in Chapter 7. See section 7.5 for a discussion of the voice message system. Acoustic/Phonetic... defined vocabulary that contains many fine phonetic distinctions, as in the set B, D, E, P, T, G, V, Z, C [Waibel 81]. We decided to build an expert

  8. Information Processing Theory: Classroom Applications.

    ERIC Educational Resources Information Center

    Slate, John R.; Charlesworth, John R., Jr.

    The information processing model, a theoretical framework of how humans think, reason, and learn, views human cognitive functioning as analogous to the operation of a computer. This paper uses the increased understanding of the information processing model to provide teachers with suggestions for improving the teaching-learning process. Major…

  9. Information Processing Research.

    DTIC Science & Technology

    1988-05-01

    Architectures for Data-dependent Algorithms 8-15 8.3.4. Chess Machine 8-16 8.4. Bibliography 8-19 APPENDIX I. GLOSSARY A-1 INDEX I-1... structured data, and a distributed object reference/method invocation system. The design of FOG was largely influenced by the Matchmaker remote procedure... Flamingo system addressed these key objectives through an object-oriented strategy that can associate data objects with operations, or "methods"

  10. Optoelectronic Information Processing

    DTIC Science & Technology

    2012-03-07

    Very fine pointing, tracking, and stabilization control; Ultra-lightweight reconfigurable antennas; THz & Microwave/Millimeter Wave photonics... plasmon waveguide for highest coupling efficiency • Process is compatible with standard CMOS reactive ion etching – no complex 3D structures or... Silicon Nanomembranes for Optical Phased Array (OPA) and Optical True Time Delay (TTD) Applications; Texas-led MURI Center for Silicon Nanomembranes – PI

  11. EARTH SYSTEM ATLAS: A Platform for Access to Peer-Reviewed Information about process and change in the Earth System

    NASA Astrophysics Data System (ADS)

    Sahagian, D.; Prentice, C.

    2004-12-01

    A great deal of time, effort and resources have been expended on global change research to date, but dissemination and visualization of the key pertinent data sets have been problematic. Toward that end, we are constructing an Earth System Atlas which will serve as a single compendium describing the state of the art in our understanding of the Earth system and how it has responded to and is likely to respond to natural and anthropogenic perturbations. The Atlas is an interactive web-based system of databases and data manipulation tools and so is much more than a collection of pre-made maps posted on the web. It represents a tool for assembling, manipulating, and displaying specific data as selected and customized by the user. Maps are created "on the fly" according to user-specified instructions. The information contained in the Atlas represents the growing body of data assembled by the broader Earth system research community, and can be displayed in the form of maps and time series of the various relevant parameters that drive and are driven by changes in the Earth system at various time scales. This will serve to provide existing data to the community, but will also help to highlight data gaps that may hinder our understanding of critical components of the Earth system. This new approach to handling Earth system data is unique in several ways. First and foremost, data must be peer-reviewed. Further, it is designed to draw on the expertise and products of extensive international research networks rather than on a limited number of projects or institutions. It provides explanations targeted to the user's needs, and the display of maps and time series can be customized by the user. In general, the Atlas is

  12. 76 FR 52581 - Automated Data Processing and Information Retrieval System Requirements

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-23

    ... share of costs in consolidated information technology (IT) operations to specify that the threshold for... 12866 and 13563 direct agencies to assess all costs and benefits of available regulatory alternatives...). Executive Order 13563 emphasizes the importance of quantifying both costs and benefits, of reducing...

  13. Business Process Design Method Based on Business Event Model for Enterprise Information System Integration

    NASA Astrophysics Data System (ADS)

    Kobayashi, Takashi; Komoda, Norihisa

    Traditional business process design methods, of which the use case is the most typical, provide no useful framework for designing the activity sequence. Therefore, design efficiency and quality vary widely according to the designer's experience and skill. In this paper, to solve this problem, we propose a model of business events and their state transitions (a basic business event model) based on the language/action perspective, a result from the cognitive science domain. In business process design, using this model, we decide event occurrence conditions so that the events synchronize with one another. We also propose a design pattern for deciding the event occurrence conditions (a business event improvement strategy). Lastly, we apply the business process design method based on the business event model and the business event improvement strategy to the credit card issue process and estimate its effect.
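    As a rough illustration of the event-synchronization idea (not the paper's formal model), the sketch below represents each business event as a small state machine whose occurrence condition refers to the states of other events. The event names, states, and conditions are invented, loosely following the credit card issue example mentioned above.

```python
# Minimal sketch: business events with occurrence conditions that tie one
# event's firing to the state of other events, so that events stay synchronized.
# All names and states are hypothetical illustrations, not the paper's model.
from dataclasses import dataclass, field

@dataclass
class BusinessEvent:
    name: str
    state: str = "not_occurred"            # simple lifecycle: not_occurred -> occurred
    # Occurrence condition: list of (event_name, required_state) pairs.
    requires: list = field(default_factory=list)

def try_occur(event, events):
    """Fire the event only if its occurrence condition is satisfied."""
    if all(events[name].state == state for name, state in event.requires):
        event.state = "occurred"
    return event.state

if __name__ == "__main__":
    events = {
        "application_received": BusinessEvent("application_received"),
        "credit_checked": BusinessEvent("credit_checked",
                                        requires=[("application_received", "occurred")]),
        "card_issued": BusinessEvent("card_issued",
                                     requires=[("credit_checked", "occurred")]),
    }
    print(try_occur(events["card_issued"], events))          # blocked: not_occurred
    print(try_occur(events["application_received"], events)) # occurred
    print(try_occur(events["credit_checked"], events))       # occurred
    print(try_occur(events["card_issued"], events))          # now occurred
```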

  14. Information-Theoretic Evidence for Predictive Coding in the Face-Processing System.

    PubMed

    Brodski-Guerniero, Alla; Paasch, Georg-Friedrich; Wollstadt, Patricia; Özdemir, Ipek; Lizier, Joseph T; Wibral, Michael

    2017-08-23

    Predictive coding suggests that the brain infers the causes of its sensations by combining sensory evidence with internal predictions based on available prior knowledge. However, the neurophysiological correlates of (pre)activated prior knowledge serving these predictions are still unknown. Based on the idea that such preactivated prior knowledge must be maintained until needed, we measured the amount of maintained information in neural signals via the active information storage (AIS) measure. AIS was calculated on whole-brain beamformer-reconstructed source time courses from MEG recordings of 52 human subjects during the baseline of a Mooney face/house detection task. Preactivation of prior knowledge for faces showed as α-band-related and β-band-related AIS increases in content-specific areas; these AIS increases were behaviorally relevant in the brain's fusiform face area. Further, AIS allowed decoding of the cued category on a trial-by-trial basis. Our results support accounts indicating that activated prior knowledge and the corresponding predictions are signaled in low-frequency activity (<30 Hz).SIGNIFICANCE STATEMENT Our perception is not only determined by the information our eyes/retina and other sensory organs receive from the outside world, but strongly depends also on information already present in our brains, such as prior knowledge about specific situations or objects. A currently popular theory in neuroscience, predictive coding theory, suggests that this prior knowledge is used by the brain to form internal predictions about upcoming sensory information. However, neurophysiological evidence for this hypothesis is rare, mostly because this kind of evidence requires strong a priori assumptions about the specific predictions the brain makes and the brain areas involved. Using a novel, assumption-free approach, we find that face-related prior knowledge and the derived predictions are represented in low-frequency brain activity. Copyright © 2017 the
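    The active information storage (AIS) measure used above quantifies how much of a signal's next value is predictable from its own recent past, formally the mutual information between a length-k past and the present sample. The sketch below is only a toy plug-in estimator for a single discretized time series; the published analysis uses beamformer source reconstruction, continuous-valued estimators, and statistical testing, none of which is shown, and all parameters here are arbitrary.

```python
# Minimal sketch: plug-in estimate of active information storage
# AIS = I(X_{t-k..t-1}; X_t) in bits for a binned scalar time series.
import numpy as np

def discretize(x, n_bins=4):
    """Map a real-valued signal onto integer bins of roughly equal occupancy."""
    ranks = np.argsort(np.argsort(x))
    return (ranks * n_bins) // len(x)

def active_information_storage(x, k=2, n_bins=4):
    """Histogram-based estimate of AIS in bits."""
    s = discretize(np.asarray(x, dtype=float), n_bins)
    # Encode each length-k past as a single integer symbol.
    past = np.zeros(len(s) - k, dtype=int)
    for i in range(k):
        past = past * n_bins + s[i:len(s) - k + i]
    present = s[k:]
    joint = np.zeros((n_bins ** k, n_bins))
    for p, c in zip(past, present):
        joint[p, c] += 1
    joint /= joint.sum()
    p_past = joint.sum(axis=1, keepdims=True)
    p_pres = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (p_past @ p_pres)[nz])))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    noise = rng.normal(size=5000)          # little storage expected
    ar = np.zeros(5000)                    # an AR(1) signal "stores" its past
    for t in range(1, 5000):
        ar[t] = 0.9 * ar[t - 1] + rng.normal()
    print("AIS noise:", active_information_storage(noise))
    print("AIS AR(1):", active_information_storage(ar))
```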

  15. 7 CFR 277.18 - Establishment of an Automated Data Processing (ADP) and Information Retrieval System.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... implementation. This includes system training, systems development, site preparation, data entry, and personal... if the procurement strategy is not adequately described and justified in an APD. The State agency... or if the procurement strategy is not adequately described and justified in an APD. The State...

  16. Using Fuzzy Analytic Hierarchy Process multicriteria and Geographical information system for coastal vulnerability analysis in Morocco: The case of Mohammedia

    NASA Astrophysics Data System (ADS)

    Tahri, Meryem; Maanan, Mohamed; Hakdaoui, Mustapha

    2016-04-01

    This paper presents a method to assess vulnerability to coastal risks such as coastal erosion or marine submersion by applying the Fuzzy Analytic Hierarchy Process (FAHP) and spatial analysis techniques within a Geographic Information System (GIS). The coast of Mohammedia, located in Morocco, was chosen as the study site to implement and validate the proposed framework by applying a GIS-FAHP based methodology. The coastal risk vulnerability mapping considers multi-parametric causative factors such as sea level rise, significant wave height, tidal range, coastal erosion, elevation, geomorphology and distance to an urban area. The Fuzzy Analytic Hierarchy Process enables the calculation of the corresponding criteria weights. The results show that the coastline of Mohammedia is characterized by moderate, high and very high levels of vulnerability to coastal risk. The high vulnerability areas are situated in the east at Monika and Sablette beaches. This technical approach combines the efficiency of GIS tools with the Fuzzy Analytic Hierarchy Process to help decision makers find optimal strategies to minimize coastal risks.
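    The criteria-weighting step mentioned above can be illustrated with the classical (crisp) AHP eigenvector method; the fuzzy extension used in the paper builds on the same idea but works with fuzzy pairwise judgments, which are not reproduced here. The criteria subset and the comparison values below are invented for illustration and are not taken from the study.

```python
# Minimal sketch: deriving criteria weights from a Saaty-style pairwise
# comparison matrix via its principal eigenvector, plus a consistency check.
import numpy as np

criteria = ["sea level rise", "wave height", "tidal range", "erosion", "elevation"]

# A[i, j] = judged importance of criterion i relative to criterion j (illustrative).
A = np.array([
    [1,   3,   5,   3,   2],
    [1/3, 1,   3,   2,   1],
    [1/5, 1/3, 1,   1/2, 1/3],
    [1/3, 1/2, 2,   1,   1],
    [1/2, 1,   3,   1,   1],
])

# Principal right eigenvector of A, normalized to sum to 1, gives the weights.
eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()

# Consistency ratio (CR < 0.1 is the usual acceptability threshold).
n = len(criteria)
ci = (eigvals.real[principal] - n) / (n - 1)
ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]   # random index table
print(dict(zip(criteria, weights.round(3))), "CR =", round(ci / ri, 3))
```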

  17. A business process modeling experience in a complex information system re-engineering.

    PubMed

    Bernonville, Stéphanie; Vantourout, Corinne; Fendeler, Geneviève; Beuscart, Régis

    2013-01-01

    This article aims to share a business process modeling experience in a re-engineering project of a medical records department in a 2,965-bed hospital. It presents the modeling strategy, an extract of the results and the feedback experience.

  18. Scaling-up Process-Oriented Guided Inquiry Learning Techniques for Teaching Large Information Systems Courses

    ERIC Educational Resources Information Center

    Trevathan, Jarrod; Myers, Trina; Gray, Heather

    2014-01-01

    Promoting engagement during lectures becomes significantly more challenging as class sizes increase. Therefore, lecturers need to experiment with new teaching methodologies to embolden deep learning outcomes and to develop interpersonal skills amongst students. Process Oriented Guided Inquiry Learning is a teaching approach that uses highly…

  19. Process evaluation distributed system

    NASA Technical Reports Server (NTRS)

    Moffatt, Christopher L. (Inventor)

    2006-01-01

    The distributed system includes a database server, an administration module, a process evaluation module, and a data display module. The administration module is in communication with the database server for providing observation criteria information to the database server. The process evaluation module is in communication with the database server for obtaining the observation criteria information from the database server and collecting process data based on the observation criteria information. The process evaluation module utilizes a personal digital assistant (PDA). The data display module, in communication with the database server, includes a website for viewing the collected process data in a desired metrics form and also provides for editing and modification of the collected process data. The connectivity established by the database server to the administration module, the process evaluation module, and the data display module minimizes the requirement for manual input of the collected process data.

  20. The value of mechanistic biophysical information for systems-level understanding of complex biological processes such as cytokinesis.

    PubMed

    Pollard, Thomas D

    2014-12-02

    This review illustrates the value of quantitative information including concentrations, kinetic constants and equilibrium constants in modeling and simulating complex biological processes. Although much has been learned about some biological systems without these parameter values, they greatly strengthen mechanistic accounts of dynamical systems. The analysis of muscle contraction is a classic example of the value of combining an inventory of the molecules, atomic structures of the molecules, kinetic constants for the reactions, reconstitutions with purified proteins and theoretical modeling to account for the contraction of whole muscles. A similar strategy is now being used to understand the mechanism of cytokinesis using fission yeast as a favorable model system. Copyright © 2014 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  1. Newborn Screening Information System (NBSIS)

    PubMed Central

    Dayhoff, R. E.; Ledley, R. S.; Rotolo, L. S.

    1984-01-01

    A Newborn Screening Information System (NBSIS) has been developed to handle the information processing needs of State Newborn Screening Laboratories. Systems have been customized for use by the States of Maryland and Florida. These systems track clients (babies) from their first contact with the Screening Center through their last follow-up test, producing worksheets, result reports, letters, and summaries for archival storage.

  2. Bibliographic Post-Processing with the TIS (Technology Information System) Intelligent Gateway: Analytical and Communication Capabilities.

    DTIC Science & Technology

    1985-09-01

    nucleosynthesis - Gateway to the very early universe AUTH: A/TURNER, M. S. PAA: A/(Chicago University, Chicago, IL) (American Institute of Physics, NASA, NSF... TITLE> Big bang nucleosynthesis - Gateway to the very early universe <AUTHORS> TURNER, M. S. <PAA> A/(Chicago University, Chicago, IL) <PUB DESC... of the electronic states involved in the one- and two-photon processes is also discussed. Big bang nucleosynthesis - Gateway to the very early

  3. Mission Medical Information System

    NASA Technical Reports Server (NTRS)

    Johnson-Throop, Kathy A.; Joe, John C.; Follansbee, Nicole M.

    2008-01-01

    This viewgraph presentation gives an overview of the Mission Medical Information System (MMIS). The topics include: 1) What is MMIS?; 2) MMIS Goals; 3) Terrestrial Health Information Technology Vision; 4) NASA Health Information Technology Needs; 5) Mission Medical Information System Components; 6) Electronic Medical Record; 7) Longitudinal Study of Astronaut Health (LSAH); 8) Methods; and 9) Data Submission Agreement (example).

  4. Object-Oriented Development Process for Department of Defense Information Systems.

    DTIC Science & Technology

    1995-07-01

    Incremental: total system specification, increments 1..n, demonstration, development, production, operations, definition and validation... Assuming that mutual inheritance and looping are excluded. A-24 Object: Physical-object / Abstract-object; Animate-object / Inanimate-object

  5. Analysis of hydrological processes across the Northern Eurasia with recently re-developed online informational system

    NASA Astrophysics Data System (ADS)

    Shiklomanov, A. I.; Proussevitch, A. A.; Gordov, E. P.; Okladnikov, I.; Titov, A. G.

    2016-12-01

    The volume of georeferenced datasets used for hydrology and climate research is growing immensely due to recent advances in modeling, high-performance computers, and sensor networks, as well as the initiation of a set of large-scale complex global and regional monitoring experiments. To facilitate the management and analysis of these extensive data pools we developed a Web-based data management, visualization, and analysis system - RIMS (Rapid Integrated Mapping and Analysis System), http://earthatlas.sr.unh.edu/ - with a focus on hydrological applications. Recently, in collaboration with Russian colleagues from the Institute of Monitoring of Climatic and Ecological Systems SB RAS, Russia, we significantly redesigned RIMS to include the latest Web and GIS technologies in compliance with the Open Geospatial Consortium (OGC) standards. The upgraded RIMS can be successfully applied to address multiple research problems using an extensive data archive and embedded tools for data computation, visualization and distribution. We will demonstrate the current capabilities of the system with several results of applied data analysis carried out for the territory of Northern Eurasia. These results will include the analysis of historical, contemporary and future changes in climate and hydrology based on station and gridded data, investigations of recent extreme hydrological events, their anomalies, causes and potential impacts, and the creation and analysis of new data sets through integration of social and geophysical data.

  6. Business Information Processing Curriculum Guide.

    ERIC Educational Resources Information Center

    Sullivan, Carol

    This curriculum guide is designed to train students in the competencies necessary to meet the needs of the automated office in entry-level information processing positions. The guide is organized into 16 units that are correlated with the essential elements for the business information processing course. Introductory materials include a scope and…

  7. All-Union Conference on Information Retrieval Systems and Automatic Processing of Scientific and Technical Information, 3rd, Moscow, 1967, Transactions. (Selected Articles).

    ERIC Educational Resources Information Center

    Air Force Systems Command, Wright-Patterson AFB, OH. Foreign Technology Div.

    The role and place of the machine in scientific and technical information is explored including: basic trends in the development of information retrieval systems; preparation of engineering and scientific cadres with respect to mechanization and automation of information works; the logic of descriptor retrieval systems; the 'SETKA-3' automated…

  9. Versatile microwave-driven trapped ion spin system for quantum information processing.

    PubMed

    Piltz, Christian; Sriarunothai, Theeraphot; Ivanov, Svetoslav S; Wölk, Sabine; Wunderlich, Christof

    2016-07-01

    Using trapped atomic ions, we demonstrate a tailored and versatile effective spin system suitable for quantum simulations and universal quantum computation. By simply applying microwave pulses, selected spins can be decoupled from the remaining system and, thus, can serve as a quantum memory, while simultaneously, other coupled spins perform conditional quantum dynamics. Also, microwave pulses can change the sign of spin-spin couplings, as well as their effective strength, even during the course of a quantum algorithm. Taking advantage of the simultaneous long-range coupling between three spins, a coherent quantum Fourier transform-an essential building block for many quantum algorithms-is efficiently realized. This approach, which is based on microwave-driven trapped ions and is complementary to laser-based methods, opens a new route to overcoming technical and physical challenges in the quest for a quantum simulator and a quantum computer.

  10. Versatile microwave-driven trapped ion spin system for quantum information processing

    PubMed Central

    Piltz, Christian; Sriarunothai, Theeraphot; Ivanov, Svetoslav S.; Wölk, Sabine; Wunderlich, Christof

    2016-01-01

    Using trapped atomic ions, we demonstrate a tailored and versatile effective spin system suitable for quantum simulations and universal quantum computation. By simply applying microwave pulses, selected spins can be decoupled from the remaining system and, thus, can serve as a quantum memory, while simultaneously, other coupled spins perform conditional quantum dynamics. Also, microwave pulses can change the sign of spin-spin couplings, as well as their effective strength, even during the course of a quantum algorithm. Taking advantage of the simultaneous long-range coupling between three spins, a coherent quantum Fourier transform - an essential building block for many quantum algorithms - is efficiently realized. This approach, which is based on microwave-driven trapped ions and is complementary to laser-based methods, opens a new route to overcoming technical and physical challenges in the quest for a quantum simulator and a quantum computer. PMID:27419233
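    The quantum Fourier transform realized in the experiment above can be written down directly as a unitary matrix; the sketch below builds that textbook matrix for three qubits with numpy and checks two basic properties. It illustrates only the abstract transform, not the microwave pulse sequence or ion-trap hardware described in the records.

```python
# Minimal sketch: the n-qubit QFT unitary with entries F[j, k] = w^(j*k)/sqrt(N),
# where w = exp(2*pi*i/N) and N = 2^n.
import numpy as np

def qft_matrix(n_qubits: int) -> np.ndarray:
    """Return the 2^n x 2^n quantum Fourier transform unitary."""
    dim = 2 ** n_qubits
    j, k = np.meshgrid(np.arange(dim), np.arange(dim), indexing="ij")
    return np.exp(2j * np.pi * j * k / dim) / np.sqrt(dim)

if __name__ == "__main__":
    F = qft_matrix(3)                      # three spins, as in the experiment
    # Unitarity check: F F^dagger should be the identity.
    print(np.allclose(F @ F.conj().T, np.eye(8)))
    # Acting on |000> yields the uniform superposition, as expected.
    basis_000 = np.zeros(8)
    basis_000[0] = 1.0
    print(np.round(F @ basis_000, 3))
```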

  11. Searching System Call Information for Clues: The Effects of Intrusions on Processes

    DTIC Science & Technology

    2003-03-01

    executed, their intrusion detection system can determine the particular user operations performed. This is a prime example of how internal... re-seeded with the system's time every one hundred samples to address any issues with sequence repetitiveness. The Mersenne Twister [Mats02] was also... World Wide Web Site, URL http://www.ll.mit.edu/IST/ideval/. [Mats02] Matsumoto, Makoto, "Mersenne Twister Home Page." World Wide Web, 2002

  12. The Fluvial Information System

    NASA Astrophysics Data System (ADS)

    Dugdale, S. J.; Carbonneau, P.; Clough, S.

    2009-12-01

    River ecologists have long been aware that our understanding of lotic ecology is limited by our lack of methods applicable to catchment scale processes. Furthermore, the EU's Water Framework Directive states that surface waters must be managed at catchment scales. This has created a need for a new approach to high-resolution catchment scale data collection in fluvial environments. In response to this, remote sensing has been the focus of increasing interest in river science, and it is now possible to map parameters such as water depth, grain size and habitat type with sub-metric resolutions over large areas. These techniques are capable of yielding unprecedented amounts of information about river systems, and with such levels of information, crucial questions about catchment scale ecology can now be addressed. However, this intensive approach produces vast amounts of raster data leading to significant issues in terms of data management, and extracting spatially explicit information from large image databases poses a significant challenge which must be resolved if fluvial remote sensing methods are to deliver their potential. GIS has already been successfully applied to manage remotely sensed data. Unfortunately, when applied to fluvial remote sensing raster data, traditional GIS appears limited and unsuited to the specific tasks required by river scientists and managers, and experience with GIS packages has shown that they become overwhelmed when faced with datasets comprising thousands of rasters. Another fundamental issue with traditional GIS packages is the use of established Cartesian map projection systems. Given that rivers are curvilinear entities, the use of Cartesian grid map projections is mismatched and curvilinear coordinate systems unique to each river will be required. This paper introduces the Fluvial Information System (FIS), a raster based GIS-type system designed to manage fluvial remote sensing data and automatically extract meaningful

  13. Cellular and Network Mechanisms Underlying Information Processing in a Simple Sensory System

    NASA Technical Reports Server (NTRS)

    Jacobs, Gwen; Henze, Chris; Biegel, Bryan (Technical Monitor)

    2002-01-01

    Realistic, biophysically-based compartmental models were constructed of several primary sensory interneurons in the cricket cercal sensory system. A dynamic atlas of the afferent input to these cells was used to set spatio-temporal parameters for the simulated stimulus-dependent synaptic inputs. We examined the roles of dendritic morphology, passive membrane properties, and active conductances on the frequency tuning of the neurons. The sensitivity of narrow-band low-pass interneurons could be explained entirely by the electrotonic structure of the dendritic arbors and the dynamic sensitivity of the SIZ. The dynamic characteristics of interneurons with higher frequency sensitivity required models with voltage-dependent dendritic conductances.
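    A generic illustration of why passive membrane properties alone can produce low-pass frequency tuning is a single RC compartment: input currents above the cutoff frequency set by the membrane time constant are strongly attenuated. The sketch below uses arbitrary generic parameter values, not the cricket interneuron models described above.

```python
# Minimal sketch: steady-state frequency response of a single passive RC
# membrane compartment; cutoff frequency f_c = 1 / (2*pi*R*C).
import numpy as np

R = 100e6        # membrane resistance, ohms (100 MOhm) - illustrative value
C = 100e-12      # membrane capacitance, farads (100 pF) -> tau = 10 ms

def gain(freq_hz):
    """Voltage gain |V/I| of the RC compartment for a sinusoidal input current."""
    omega = 2 * np.pi * freq_hz
    return R / np.sqrt(1 + (omega * R * C) ** 2)

f_c = 1 / (2 * np.pi * R * C)              # about 15.9 Hz with these values
for f in [1, 10, f_c, 100, 500]:
    print(f"{f:7.1f} Hz  gain = {gain(f)/1e6:6.1f} MOhm")
# Frequencies well above f_c are strongly attenuated, matching the low-pass
# character attributed above to passive (electrotonic) dendritic structure.
```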

  15. The information-processing approach.

    PubMed

    van der Heijden, A H; Stebbins, S

    1990-01-01

    The information-processing (IP) approach to perception and cognition arose as a reaction to behaviourism. This reaction mainly concerned the nature of explanation in scientific psychology. The "standard" account of behaviour, phrased in strictly external terms, was replaced by a "realist" account, phrased in terms of internal entities and processes. An analysis of the theoretical language used in IP psychology shows an undisciplined state of affairs. A great number of languages is simultaneously in use; no level of analysis is unambiguously referred to; and basic concepts such as information and processing remain largely undefined. Nevertheless, over the past 25 years the IP approach has developed into a disciplined and sophisticated experimental science. A look at actual practice hints at the basic reason for its success. The approach is not so much concerned with absolute or intrinsic properties of the human information processor, but with what can be called its relative or differential properties. A further analysis of this feature of the IP approach in terms of the formal language of a logical system makes explicit the basis of its success. The IP approach can be regarded as developing an empirical difference calculus on an unspecified class of objects, phrased in terms of a simulated "theory-neutral" observation language, and with operators that are structurally analogous to logical operators. This reinterpretation of what the IP approach is about brings a number of advantages. It strengthens its position as an independent science, clarifies its relation with other approaches within psychology and other sciences within the cognitive science group, and makes it independent of philosophical subtleties.

  16. PMIS: System Description. PMIS Project. Planning & Management Information System. A Project To Develop a Data Processing System for Support of the Planning and Management Needs of Local School Districts.

    ERIC Educational Resources Information Center

    Council of the Great City Schools, Washington, DC.

    PMIS (Planning and Management Information System) is an information system that supports the decisionmaking process of executive management in local school districts. The system is designed around a comprehensive, longitudinal, and interrelated data base. It utilizes a powerful real-time, interactive data management system for strategic planning;…

  17. Advanced information processing system: Hosting of advanced guidance, navigation and control algorithms on AIPS using ASTER

    NASA Technical Reports Server (NTRS)

    Brenner, Richard; Lala, Jaynarayan H.; Nagle, Gail A.; Schor, Andrei; Turkovich, John

    1994-01-01

    This program demonstrated the integration of a number of technologies that can increase the availability and reliability of launch vehicles while lowering costs. Availability is increased with an advanced guidance algorithm that adapts trajectories in real-time. Reliability is increased with fault-tolerant computers and communication protocols. Costs are reduced by automatically generating code and documentation. This program was realized through the cooperative efforts of academia, industry, and government. The NASA-LaRC coordinated the effort, while Draper performed the integration. Georgia Institute of Technology supplied a weak Hamiltonian finite element method for optimal control problems. Martin Marietta used MATLAB to apply this method to a launch vehicle (FENOC). Draper supplied the fault-tolerant computing and software automation technology. The fault-tolerant technology includes sequential and parallel fault-tolerant processors (FTP & FTPP) and authentication protocols (AP) for communication. Fault-tolerant technology was incrementally incorporated. Development culminated with a heterogeneous network of workstations and fault-tolerant computers using AP. Draper's software automation system, ASTER, was used to specify a static guidance system based on FENOC, navigation, flight control (GN&C), models, and the interface to a user interface for mission control. ASTER generated Ada code for GN&C and C code for models. An algebraic transform engine (ATE) was developed to automatically translate MATLAB scripts into ASTER.

  18. Landfill site selection using geographic information system and analytical hierarchy process: A case study Al-Hillah Qadhaa, Babylon, Iraq.

    PubMed

    Chabuk, Ali; Al-Ansari, Nadhir; Hussain, Hussain Musa; Knutsson, Sven; Pusch, Roland

    2016-05-01

    Al-Hillah Qadhaa is located in the central part of Iraq. It covers an area of 908 km² with a total population of 856,804 inhabitants. This Qadhaa is the capital of Babylon Governorate. Presently, no landfill site exists in that area based on scientific site selection criteria. For this reason, an attempt has been carried out to find the best locations for landfills. A total of 15 variables were considered in this process (groundwater depth, rivers, soil types, agricultural land use, land use, elevation, slope, gas pipelines, oil pipelines, power lines, roads, railways, urban centres, villages and archaeological sites) using a geographic information system. In addition, an analytical hierarchy process was used to identify the weight for each variable. Two suitable candidate landfill sites that fulfil the requirements were determined, with areas of 9.153 km² and 8.204 km². These sites can accommodate solid waste until 2030.
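    As a rough illustration of the GIS overlay that such studies perform after the weighting step, the sketch below combines normalized criterion rasters into a single suitability surface and masks out exclusion zones. The grids, weights, criteria, and exclusion rule are synthetic and are not the study's data.

```python
# Minimal sketch: weighted-overlay suitability analysis on toy raster layers.
import numpy as np

rng = np.random.default_rng(1)
shape = (100, 100)                          # toy raster grid

criteria = {                                # higher value = more suitable (already normalized)
    "groundwater_depth": rng.random(shape),
    "distance_to_urban": rng.random(shape),
    "slope":             rng.random(shape),
}
weights = {"groundwater_depth": 0.5, "distance_to_urban": 0.3, "slope": 0.2}

# Hard constraints: cells inside an exclusion buffer are never candidates.
exclusion_mask = rng.random(shape) < 0.15   # pretend 15% of cells are excluded

suitability = sum(weights[name] * layer for name, layer in criteria.items())
suitability[exclusion_mask] = np.nan

# Rank the remaining cells; the best ones are candidate landfill locations.
best = np.unravel_index(np.nanargmax(suitability), shape)
print("most suitable cell:", best, "score:", round(float(suitability[best]), 3))
```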

  19. The DEFENSE (debris Flows triggEred by storms - nowcasting system): An early warning system for torrential processes by radar storm tracking using a Geographic Information System (GIS)

    NASA Astrophysics Data System (ADS)

    Tiranti, Davide; Cremonini, Roberto; Marco, Federica; Gaeta, Armando Riccardo; Barbero, Secondo

    2014-09-01

    Debris flows, responsible for economic losses and occasionally casualties in the alpine region, are mainly triggered by heavy rains characterized by hourly peaks of varying intensity, depending on the features of the basin under consideration. By integrating a recent classification of alpine basins with the radar storm tracking method, an innovative early warning system called DEFENSE (DEbris Flows triggEred by storms - Nowcasting SystEm) was developed using a Geographical Information System (GIS). Alpine catchments were classified into three main classes based on the weathering capacity of the bedrock into clay or clay-like minerals, the amount of which, in unconsolidated material, directly influences the debris flow rheology, and thus the sedimentary processes, the alluvial fan architecture, as well as the triggering frequency and seasonal occurrence probability of debris flows. Storms were identified and tracked by processing weather radar observations; subsequently, rainfall intensities and storm severity were estimated over each classified basin. Based on rainfall threshold values determined for each basin class from statistical analysis of historical records, an automatic warning could then be issued to the corresponding municipalities.
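    The core warning logic described above can be sketched as a comparison of radar-estimated rainfall intensity over each classified basin with a class-specific threshold. The basin classes, threshold values, basin names, and intensities below are invented for illustration; the operational system derives thresholds from historical records and intensities from radar storm tracking.

```python
# Minimal sketch: issue a warning when basin rainfall exceeds its class threshold.
from dataclasses import dataclass

# Hypothetical hourly rainfall thresholds (mm/h) per basin class.
THRESHOLDS_MM_H = {"clay_rich": 15.0, "intermediate": 25.0, "clay_poor": 35.0}

@dataclass
class Basin:
    name: str
    basin_class: str          # one of the keys in THRESHOLDS_MM_H
    municipality: str

def warnings(basins, radar_intensity_mm_h):
    """Return (municipality, basin, intensity) triples exceeding the class threshold."""
    alerts = []
    for b in basins:
        intensity = radar_intensity_mm_h.get(b.name, 0.0)
        if intensity >= THRESHOLDS_MM_H[b.basin_class]:
            alerts.append((b.municipality, b.name, intensity))
    return alerts

if __name__ == "__main__":
    basins = [
        Basin("Rio A", "clay_rich", "Town X"),
        Basin("Rio B", "clay_poor", "Town Y"),
    ]
    print(warnings(basins, {"Rio A": 18.2, "Rio B": 20.0}))   # only Rio A triggers
```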

  20. Measuring information processing in a client with extreme agitation following traumatic brain injury using the Perceive, Recall, Plan and Perform System of Task Analysis.

    PubMed

    Nott, Melissa T; Chapparo, Christine

    2008-09-01

    Agitation following traumatic brain injury is characterised by a heightened state of activity with disorganised information processing that interferes with learning and achieving functional goals. This study aimed to identify information processing problems during task performance of a severely agitated adult using the Perceive, Recall, Plan and Perform (PRPP) System of Task Analysis. Second, this study aimed to examine the sensitivity of the PRPP System to changes in task performance over a short period of rehabilitation, and third, to evaluate the guidance provided by the PRPP in directing intervention. A case study research design was employed. The PRPP System of Task Analysis was used to assess changes in task embedded information processing capacity during occupational therapy intervention with a severely agitated adult in a rehabilitation context. Performance is assessed on three selected tasks over a one-month period. Information processing difficulties during task performance can be clearly identified when observing a severely agitated adult following a traumatic brain injury. Processing skills involving attention, sensory processing and planning were most affected at this stage of rehabilitation. These processing difficulties are linked to established descriptions of agitated behaviour. Fluctuations in performance across three tasks of differing processing complexity were evident, leading to hypothesised relationships between task complexity, environment and novelty with information processing errors. Changes in specific information processing capacity over time were evident based on repeated measures using the PRPP System of Task Analysis. This lends preliminary support for its utility as an outcome measure, and raises hypotheses about the type of therapy required to enhance information processing in people with severe agitation. The PRPP System is sensitive to information processing changes in severely agitated adults when used to reassess performance

  1. Information processing in the adaptation of Saccharomyces cerevisiae to osmotic stress: an analysis of the phosphorelay system.

    PubMed

    Uschner, Friedemann; Klipp, Edda

    2014-12-01

    Cellular signaling is key for organisms to survive immediate stresses from fluctuating environments as well as relaying important information about external stimuli. Effective mechanisms have evolved to ensure appropriate responses for an optimal adaptation process. For them to be functional despite the noise that occurs in biochemical transmission, the cell needs to be able to infer reliably what was sensed in the first place. For example, Saccharomyces cerevisiae is able to adjust its response to osmotic shock depending on the severity of the shock and initiate responses that lead to near-perfect adaptation of the cell. We investigate the Sln1-Ypd1-Ssk1 phosphorelay as a module in the high-osmolarity glycerol pathway by incorporating a stochastic model. Within this framework, we can imitate the noisy perception of the cell and interpret the phosphorelay as an information-transmitting channel in the sense of C.E. Shannon's "Information Theory". We refer to the channel capacity as a measure to quantify and investigate the transmission properties of this system, enabling us to draw conclusions on viable parameter sets for modeling the system.
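    The channel capacity invoked above is, for a discrete memoryless channel described by a conditional distribution p(y|x), the maximum mutual information over input distributions, which the Blahut-Arimoto iteration computes. The sketch below is generic; the 2x2 "noisy osmotic state" channel is purely illustrative and is not derived from the paper's phosphorelay model.

```python
# Minimal sketch: Blahut-Arimoto computation of discrete channel capacity.
import numpy as np

def blahut_arimoto(p_y_given_x, iters=500):
    """Return (channel capacity in bits, capacity-achieving input distribution)."""
    n_x = p_y_given_x.shape[0]
    r = np.full(n_x, 1.0 / n_x)                    # current input distribution
    for _ in range(iters):
        q = r[:, None] * p_y_given_x               # unnormalized joint p(x, y)
        q /= q.sum(axis=0, keepdims=True)          # posterior p(x | y)
        w = np.exp(np.sum(p_y_given_x * np.log(q + 1e-300), axis=1))
        r = w / w.sum()                            # re-estimated input distribution
    q = r[:, None] * p_y_given_x
    q /= q.sum(axis=0, keepdims=True)
    capacity = np.sum(r[:, None] * p_y_given_x * np.log2(q / r[:, None] + 1e-300))
    return float(capacity), r

# Noisy binary "osmotic state" channel: rows = true input, columns = sensed output.
channel = np.array([[0.9, 0.1],
                    [0.2, 0.8]])
cap, r_opt = blahut_arimoto(channel)
print("capacity (bits):", round(cap, 4), "optimal input:", r_opt.round(3))
```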

  2. Web platform using digital image processing and geographic information system tools: a Brazilian case study on dengue.

    PubMed

    Brasil, Lourdes M; Gomes, Marília M F; Miosso, Cristiano J; da Silva, Marlete M; Amvame-Nze, Georges D

    2015-07-16

    Dengue fever is endemic in Asia, the Americas, the East of the Mediterranean and the Western Pacific. According to the World Health Organization, it is one of the diseases of greatest impact on health, affecting millions of people each year worldwide. A fast detection of increases in populations of the transmitting vector, the Aedes aegypti mosquito, is essential to avoid dengue outbreaks. Unfortunately, in several countries, such as Brazil, the current methods for detecting population changes and disseminating this information are too slow to allow efficient allocation of resources to fight outbreaks. To reduce the delay in providing the information regarding A. aegypti population changes, we propose, develop, and evaluate a system for counting the eggs found in special traps and for providing the collected data using a web structure with geographical location resources. One of the most useful tools for the detection and surveillance of arthropods is the ovitrap, a special trap built to collect the mosquito eggs. This allows for an egg counting process, which is still usually performed manually in countries such as Brazil. We implement and evaluate a novel system for automatically counting the eggs found in the ovitraps' cardboards. The system we propose is based on digital image processing (DIP) techniques, as well as a Web-based Semi-Automatic Counting System (SCSA-WEB). All data collected are geographically referenced in a geographic information system (GIS) and made available on a Web platform. The work was developed in Gama's administrative region, in Brasília/Brazil, with the aid of the Environmental Surveillance Directory (DIVAL-Gama) and Brasília's Board of Health (SSDF), in partnership with the University of Brasília (UnB). The system was built based on a field survey carried out over three months with the aid of health professionals. These professionals provided 84 cardboards from 84 ovitraps, sized 15 × 5 cm. In developing the system, we conducted
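    A generic digital image processing pipeline for the egg-counting task is sketched below: threshold a grayscale ovitrap image so dark egg spots become foreground, label connected components, and discard blobs whose area is implausible for an egg. This is only an illustrative sketch, not the SCSA-WEB implementation described above; the threshold and size limits are arbitrary.

```python
# Minimal sketch: count dark egg-like blobs in a grayscale ovitrap image.
import numpy as np
from scipy import ndimage

def count_eggs(gray, dark_threshold=80, min_area=15, max_area=400):
    """Count dark connected blobs whose pixel area falls in a plausible range."""
    foreground = gray < dark_threshold                 # eggs are darker than the cardboard
    labels, n_blobs = ndimage.label(foreground)
    areas = ndimage.sum(foreground, labels, index=np.arange(1, n_blobs + 1))
    return int(np.count_nonzero((areas >= min_area) & (areas <= max_area)))

if __name__ == "__main__":
    # Synthetic test image: light background with three dark square "eggs".
    img = np.full((200, 200), 200, dtype=np.uint8)
    for r, c in [(30, 40), (90, 120), (150, 60)]:
        img[r:r + 6, c:c + 6] = 30
    print(count_eggs(img))                              # expected: 3
```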

  3. Information Systems in Dentistry

    PubMed Central

    Masic, Fedja

    2012-01-01

    Introduction: Almost all human creative activity today, from the standpoint of efficiency and expediency, is conditioned on the existence of information systems. Most information systems are oriented toward management and decision-making, including health information systems. The health and health insurance systems together form one of the most important segments of society and of its functioning as a compact unit. Increasing requirements for reducing health care costs while preserving or improving the quality of services provided represent a difficult task for the health system. Material and methods: Using descriptive methods and a literature review, we analyzed the latest solutions in information and telecommunications technology as the basis for building an effective and efficient health system. The primary objective of computerization is not saving money as such, but the rationalization of spending in health care. It is estimated that at least 20-30% of the money spent in health care can be utilized more rationally. Computerization should provide the necessary data and indicators for this rationalization. Also important are the goals of this project and the achievement of other uses and benefits: improving overall care for patients and policyholders, and increasing the speed and accuracy of diagnosis and treatment decisions using electronic diagnostic and therapeutic guidelines. Results and discussion: Computerization in dentistry began similarly as in other human activities, by recording large amounts of data on digital media and by replacing manual data processing with machine processing. But the specifics of the dental profession have led to specifics in the application of information technology (IT), and continue to require special development of dentistry-oriented and applied IT. Harmonization of dental software with global standards will enable doctors and dentists, with a few mouse clicks via the internet, to reach the general medical information about their patients from the central

  4. Information processing, computation, and cognition.

    PubMed

    Piccinini, Gualtiero; Scarantino, Andrea

    2011-01-01

    Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both - although others disagree vehemently. Yet different cognitive scientists use 'computation' and 'information processing' to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism, connectionism, and computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates' empirical aspects.

  5. Semantic processing in information retrieval.

    PubMed Central

    Rindflesch, T. C.; Aronson, A. R.

    1993-01-01

    Intuition suggests that one way to enhance the information retrieval process would be the use of phrases to characterize the contents of text. A number of researchers, however, have noted that phrases alone do not improve retrieval effectiveness. In this paper we briefly review the use of phrases in information retrieval and then suggest extensions to this paradigm using semantic information. We claim that semantic processing, which can be viewed as expressing relations between the concepts represented by phrases, will in fact enhance retrieval effectiveness. The availability of the UMLS domain model, which we exploit extensively, significantly contributes to the feasibility of this processing. PMID:8130547

  6. Information Processing Using Quantum Probability

    NASA Astrophysics Data System (ADS)

    Behera, Laxmidhar

    2006-11-01

    This paper presents an information processing paradigm that introduces collective response of multiple agents (computational units) while the level of intelligence associated with the information processing has been increased manifold. It is shown that if the potential field of the Schroedinger wave equation is modulated using a self-organized learning scheme, then the probability density function associated with the stochastic data is transferred to the probability amplitude function which is the response of the Schroedinger wave equation. This approach illustrates that information processing of data with stochastic behavior can be efficiently done using quantum probability instead of classical probability. The proposed scheme has been demonstrated through two applications: denoising and adaptive control.
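    The link exploited above between a potential and a probability amplitude can be illustrated with a standard numerical exercise: discretize the one-dimensional time-independent Schroedinger equation and take the ground-state amplitude squared as the probability density shaped by the potential. The self-organized learning scheme that would modulate the potential from data is not shown, and the Gaussian-well potential below is purely illustrative.

```python
# Minimal sketch: finite-difference solution of (-1/2 d^2/dx^2 + V(x)) psi = E psi,
# using the ground-state |psi|^2 as a probability density determined by V.
import numpy as np

x = np.linspace(-5, 5, 400)
dx = x[1] - x[0]
V = -2.0 * np.exp(-x**2)                   # illustrative potential well

# Finite-difference Hamiltonian: -1/2 * second derivative + diag(V).
main = 1.0 / dx**2 + V
off = -0.5 / dx**2 * np.ones(len(x) - 1)
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

energies, states = np.linalg.eigh(H)
psi0 = states[:, 0]
density = psi0**2 / (np.sum(psi0**2) * dx) # normalized probability density

print("ground-state energy:", round(float(energies[0]), 4))
print("density peaks near x =", round(float(x[np.argmax(density)]), 2))
```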

  7. Neural processing of gravity information

    NASA Technical Reports Server (NTRS)

    Schor, Robert H.

    1992-01-01

    The goal of this project was to use the linear acceleration capabilities of the NASA Vestibular Research Facility (VRF) at Ames Research Center to directly examine encoding of linear accelerations in the vestibular system of the cat. Most previous studies, including my own, have utilized tilt stimuli, which at very low frequencies (e.g., 'static tilt') can be considered a reasonably pure linear acceleration (e.g., 'down'); however, higher frequencies of tilt, necessary for understanding the dynamic processing of linear acceleration information, necessarily involves rotations which can stimulate the semicircular canals. The VRF, particularly the Long Linear Sled, has promise to provide controlled pure linear accelerations at a variety of stimulus frequencies, with no confounding angular motion.

  8. Mobile Student Information System

    ERIC Educational Resources Information Center

    Asif, Muhammad; Krogstie, John

    2011-01-01

    Purpose: A mobile student information system (MSIS) based on mobile computing and context-aware application concepts can provide more user-centric information services to students. The purpose of this paper is to describe a system for providing relevant information to students on a mobile platform. Design/methodology/approach: The research…

  9. Community Information Systems.

    ERIC Educational Resources Information Center

    Freeman, Andrew

    Information is provided on technological and social trends as background for a workshop designed to heighten the consciousness of workers in community information systems. Initially, the basic terminology is considered in its implications for an integrated perspective of community information systems, with particular attention given to the meaning…

  11. Information processing, computation, and cognition

    PubMed Central

    Scarantino, Andrea

    2010-01-01

    Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both - although others disagree vehemently. Yet different cognitive scientists use 'computation' and 'information processing' to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism, connectionism, and computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates' empirical aspects. PMID:22210958

  12. Spitzer Telemetry Processing System

    NASA Technical Reports Server (NTRS)

    Stanboli, Alice; Martinez, Elmain M.; McAuley, James M.

    2013-01-01

    The Spitzer Telemetry Processing System (SirtfTlmProc) was designed to address objectives of JPL's Multi-mission Image Processing Lab (MIPL) in processing spacecraft telemetry and distributing the resulting data to the science community. To minimize costs and maximize operability, the software design focused on automated error recovery, performance, and information management. The system processes telemetry from the Spitzer spacecraft and delivers Level 0 products to the Spitzer Science Center. SirtfTlmProc is a unique system with automated error notification and recovery, with a real-time continuous service that can go quiescent after periods of inactivity. The software can process 2 GB of telemetry and deliver Level 0 science products to the end user in four hours. It provides analysis tools so the operator can manage the system and troubleshoot problems. It automates telemetry processing in order to reduce staffing costs.

  13. Cognition: Human Information Processing. Introduction.

    ERIC Educational Resources Information Center

    Griffith, Belver C.

    1981-01-01

    Summarizes the key research issues and developments in cognitive science, especially with respect to the similarities, differences, and interrelationships between human and machine information processing. Nine references are listed. (JL)

  14. Application of a Micro Computer-Based Management Information System to Improve the USAF Service Reporting Process

    DTIC Science & Technology

    1990-09-01

    This study conducted research into the development, implementation and evaluation of a personal computer based Service Reporting (SR) Management Information System (MIS)... MIS experts and System Program Office (SPO) acquisition managers; and software prototyping with an Aeronautical Systems Division (ASD) SPO. The Service Reporting Management Information System (SRMIS) was implemented and evaluated in the Air Force One (AF-1) Replacement Aircraft SPO during a five month trial period.

  15. Learning to Use Geographic Information Systems and Image Processing and Analysis to Teach Ocean Science to Middle School Students

    NASA Astrophysics Data System (ADS)

    Moore, S. D.; Martin, J.; Kinzel, M.

    2004-12-01

    This presentation will provide a middle school teacher's perspective on Ocean Explorers, a three-year project directed at teachers and schools in California. Funded by the Information Technology Experiences for Students and Teachers (ITEST) program at the National Science Foundation, Ocean Explorers is giving support to teams of teachers that will serve as local user groups for the exploration of geographic information systems (GIS) and image processing and analysis (IPA) as educational technologies for studying ocean science. Conducted as a collaboration between the nonprofit Center for Image Processing in Education and the Channel Islands National Marine Sanctuary, the project is providing mentoring, software, equipment, funding, and training on how to design inquiry-based activities that support achievement of California's standards for science, technology, mathematics, and reading education. During year two of Ocean Explorers, the teams of teachers will begin to use GIS and IPA as tools for involving their students in original research on issues of interest to their home communities. With assistance from the Ocean Explorers project, the teachers will create inquiry-based activities for their students that will help their school achieve targeted standards. This presentation will focus on plans by one teacher for involving students from St. Mary's Middle School, Fullerton, California, in tracking of ocean pollution and beach closures along the Southern California coast.

  16. Effects of age on spatial information processing: relationship to senescent changes in brain noradrenergic and opioid systems

    SciTech Connect

    Rapp, P.R.

    1985-01-01

    A major focus in current research on aging is the identification of senescent changes in cognitive function in laboratory animals. This literature indicates that the processing of spatial information may be particularly impaired during senescence. The degree to which nonspecific factors (e.g., sensory or motor deficits) contribute to behavioral impairments in aging, however, remains largely uninvestigated. In addition, few studies have attempted to identify senescent changes in brain structure and function which might underlie the behavioral manifestations of aging. In the behavioral experiments reported here, the authors tested young, middle-age, and senescent rats in several versions of a spatial memory task, the Morris water maze. The results of these investigations demonstrate that aged rats are significantly impaired in the Morris task compared to young or middle-age animals. In addition, these studies indicate that age-related deficits in the water maze reflect a specific dysfunction in the ability of older animals to effectively process spatial information rather than a senescent decline in sensory or motor functions. Using the subjects from the behavioral studies, additional investigations assessed whether age-dependent changes in neurochemical and neuroanatomical systems which are known to mediate spatial learning in young animals were related to the behavioral deficits exhibited by aged rats. The results of these studies demonstrate that a portion of senescent animals exhibit significant increases in lateral septal ³H-desmethylimipramine binding and decreases in ³H-naloxone binding in this same region as assessed by quantitative in vitro autoradiography.

  17. Information-computational system for storage, search and analytical processing of environmental datasets based on the Semantic Web technologies

    NASA Astrophysics Data System (ADS)

    Titov, A.; Gordov, E.; Okladnikov, I.

    2009-04-01

    a step in the process of development of a distributed collaborative information-computational environment to support multidisciplinary investigations of Earth regional environment [4]. Partial support of this work by SB RAS Integration Project 34, SB RAS Basic Program Project 4.5.2.2, APN Project CBA2007-08NSY and FP6 Enviro-RISKS project (INCO-CT-2004-013427) is acknowledged. References 1. E.P. Gordov, V.N. Lykosov, and A.Z. Fazliev. Web portal on environmental sciences "ATMOS" // Advances in Geosciences. 2006. Vol. 8. p. 33 - 38. 2. Gordov E.P., Okladnikov I.G., Titov A.G. Development of elements of web based information-computational system supporting regional environment processes investigations // Journal of Computational Technologies, Vol. 12, Special Issue #3, 2007, pp. 20 - 28. 3. Okladnikov I.G., Titov A.G. Melnikova V.N., Shulgina T.M. Web-system for processing and visualization of meteorological and climatic data // Journal of Computational Technologies, Vol. 13, Special Issue #3, 2008, pp. 64 - 69. 4. Gordov E.P., Lykosov V.N. Development of information-computational infrastructure for integrated study of Siberia environment // Journal of Computational Technologies, Vol. 12, Special Issue #2, 2007, pp. 19 - 30.

  18. Nanophotonics for information systems

    NASA Astrophysics Data System (ADS)

    Nezhad, M.; Abashin, M.; Ikeda, K.; Pang, L.; Kim, H. C.; Levy, U.; Tetz, K.; Rokitski, R.; Fainman, Y.

    2007-02-01

    Optical technology plays an increasingly important role in numerous application areas, including communications, information processing, and data storage. However, as optical technology develops, it is evident that there is a growing need to develop reliable photonic integration technologies. This will include the development of passive as well as active optical components that can be integrated into functional optical circuits and systems, including filters, switching fabrics that can be controlled either electrically or optically, optical sources, detectors, amplifiers, etc. We explore the unique capabilities and advantages of nanotechnology in developing next generation integrated photonic chips. Our long-range goal is to develop a range of photonic nanostructures including artificially birefringent and resonant devices, photonic crystals, and photonic crystals with defects to tailor spectral filters, and nanostructures for spatial field localization to enhance optical nonlinearities, to facilitate on-chip system integration through compatible materials and fabrication processes. The design of artificial nanostructured materials, PCs and integrated photonic systems is one of the most challenging tasks as it not only involves the accurate solution of electromagnetic optics equations, but also the need to incorporate the material and quantum physics equations. Near-field interactions in artificial nanostructured materials provide a variety of functionalities useful for optical systems integration. Recently, the inclusion of surface plasmon photonics in this area has opened up a host of new possibilities. Finally, and most importantly, nanophotonics may enable easier integration with other nanotechnologies: electronics, magnetics, mechanics, chemistry, and biology. We will address some of these areas in this paper.

  19. Spatial Analysis in Determination Of Flood Prone Areas Using Geographic Information System and Analytical Hierarchy Process at Sungai Sembrong's Catchment

    NASA Astrophysics Data System (ADS)

    Bukari, S. M.; Ahmad, M. A.; Wai, T. L.; Kaamin, M.; Alimin, N.

    2016-07-01

    The floods that struck Johor state in 2006 and 2007 and the East Coast in 2014 had a major impact on flood management in Malaysia. Accordingly, this study was conducted to determine potential flooding areas, especially in Batu Pahat district, which has experienced severe flooding. This objective is achieved by applying Geographic Information Systems (GIS) to flood risk locations in the watershed area of Sungai Sembrong. The spatial analysis functions of GIS can produce new information based on analysis of data stored in the system, while the Analytical Hierarchy Process (AHP) was used as a method for structuring the decision making concerning the existing data. By using the AHP method, the preparation and ranking of the criteria and parameters required in GIS are more orderly and easier to analyze. Through this study, flood prone areas in the watershed of Sungai Sembrong were identified with the help of GIS and AHP. The analysis tested two different cell sizes, 30 and 5, and flood prone areas were assessed for both cell sizes with two different water levels, with the results displayed in GIS. The use of AHP and GIS is therefore effective and able to determine the potential flood plain areas in the watershed area of Sungai Sembrong.
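
    As a rough illustration of the AHP step described above (not code from the study), the sketch below derives criterion weights from a Saaty-style pairwise comparison matrix and checks their consistency; the four criteria and all judgment values are hypothetical.

    ```python
    import numpy as np

    # Hypothetical pairwise comparison matrix for four flood criteria
    # (rainfall, elevation, land use, drainage density); the values are
    # illustrative Saaty-scale judgments, not those used in the study.
    criteria = ["rainfall", "elevation", "land_use", "drainage"]
    A = np.array([
        [1.0, 3.0, 5.0, 3.0],
        [1/3, 1.0, 3.0, 1.0],
        [1/5, 1/3, 1.0, 1/2],
        [1/3, 1.0, 2.0, 1.0],
    ])

    # The principal eigenvector of A gives the AHP priority weights.
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()

    # Consistency ratio (CR) checks that the judgments are acceptably
    # consistent; CR < 0.1 is the usual threshold. RI is Saaty's random
    # index for a 4x4 matrix.
    lambda_max = eigvals.real[k]
    CI = (lambda_max - len(A)) / (len(A) - 1)
    RI = 0.90
    CR = CI / RI

    for name, weight in zip(criteria, w):
        print(f"{name:10s} {weight:.3f}")
    print(f"consistency ratio = {CR:.3f}")
    ```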

  20. AIM for Allostery: Using the Ising Model to Understand Information Processing and Transmission in Allosteric Biomolecular Systems

    PubMed Central

    LeVine, Michael V.; Weinstein, Harel

    2015-01-01

    In performing their biological functions, molecular machines must process and transmit information with high fidelity. Information transmission requires dynamic coupling between the conformations of discrete structural components within the protein positioned far from one another on the molecular scale. This type of biomolecular “action at a distance” is termed allostery. Although allostery is ubiquitous in biological regulation and signal transduction, its treatment in theoretical models has mostly eschewed quantitative descriptions involving the system's underlying structural components and their interactions. Here, we show how Ising models can be used to formulate an approach to allostery in a structural context of interactions between the constitutive components by building simple allosteric constructs we termed Allosteric Ising Models (AIMs). We introduce the use of AIMs in analytical and numerical calculations that relate thermodynamic descriptions of allostery to the structural context, and then show that many fundamental properties of allostery, such as the multiplicative property of parallel allosteric channels, are revealed from the analysis of such models. The power of exploring mechanistic structural models of allosteric function in more complex systems by using AIMs is demonstrated by building a model of allosteric signaling for an experimentally well-characterized asymmetric homodimer of the dopamine D2 receptor. PMID:26594108

  1. AIM for Allostery: Using the Ising Model to Understand Information Processing and Transmission in Allosteric Biomolecular Systems.

    PubMed

    LeVine, Michael V; Weinstein, Harel

    2015-05-01

    In performing their biological functions, molecular machines must process and transmit information with high fidelity. Information transmission requires dynamic coupling between the conformations of discrete structural components within the protein positioned far from one another on the molecular scale. This type of biomolecular "action at a distance" is termed allostery. Although allostery is ubiquitous in biological regulation and signal transduction, its treatment in theoretical models has mostly eschewed quantitative descriptions involving the system's underlying structural components and their interactions. Here, we show how Ising models can be used to formulate an approach to allostery in a structural context of interactions between the constitutive components by building simple allosteric constructs we termed Allosteric Ising Models (AIMs). We introduce the use of AIMs in analytical and numerical calculations that relate thermodynamic descriptions of allostery to the structural context, and then show that many fundamental properties of allostery, such as the multiplicative property of parallel allosteric channels, are revealed from the analysis of such models. The power of exploring mechanistic structural models of allosteric function in more complex systems by using AIMs is demonstrated by building a model of allosteric signaling for an experimentally well-characterized asymmetric homodimer of the dopamine D2 receptor.
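
    As a hedged illustration of the Ising-model framing (not code from the paper), the following sketch computes how a minimal two-site Ising construct transmits the state of an "allosteric" site to a "functional" site through a coupling J; all parameter values are made up.

    ```python
    import numpy as np
    from itertools import product

    # Minimal two-site toy: spin s1 (allosteric site) and s2 (functional
    # site) coupled with strength J; h1 biases s1 (e.g., ligand binding).
    # Parameters are illustrative, not taken from the paper.
    def boltzmann(J, h1, h2, beta=1.0):
        states = list(product([-1, 1], repeat=2))
        weights = np.array([np.exp(beta * (J*s1*s2 + h1*s1 + h2*s2))
                            for s1, s2 in states])
        return dict(zip(states, weights / weights.sum()))

    def p_active_given_site1(J, h1, h2, s1_fixed):
        """P(s2 = +1 | s1 = s1_fixed): response of the functional site."""
        p = boltzmann(J, h1, h2)
        num = sum(pr for (s1, s2), pr in p.items() if s1 == s1_fixed and s2 == +1)
        den = sum(pr for (s1, s2), pr in p.items() if s1 == s1_fixed)
        return num / den

    # With coupling, the state of site 1 shifts the activity of site 2;
    # without coupling (J = 0) the two conditional probabilities coincide.
    for J in (0.0, 1.0):
        on = p_active_given_site1(J, h1=0.5, h2=0.0, s1_fixed=+1)
        off = p_active_given_site1(J, h1=0.5, h2=0.0, s1_fixed=-1)
        print(f"J={J}: P(active|site1=+1)={on:.3f}, P(active|site1=-1)={off:.3f}")
    ```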

  2. Integrating NASA's Land Analysis System (LAS) image processing software with an appropriate Geographic Information System (GIS): A review of candidates in the public domain

    NASA Technical Reports Server (NTRS)

    Rochon, Gilbert L.

    1989-01-01

    A user requirements analysis (URA) was undertaken to determine an appropriate public domain Geographic Information System (GIS) software package for potential integration with NASA's LAS (Land Analysis System) 5.0 image processing system. The necessity for a public domain system was underscored by the perceived need for source code access and flexibility in tailoring the GIS system to the needs of a heterogeneous group of end-users, and by specific constraints imposed by LAS and its user interface, the Transportable Applications Executive (TAE). Subsequently, a review was conducted of a variety of public domain GIS candidates, including GRASS 3.0, MOSS, IEMIS, and two university-based packages, IDRISI and KBGIS. The review method was a modified version of the GIS evaluation process developed by the Federal Interagency Coordinating Committee on Digital Cartography. One IEMIS-derivative product, the ALBE (AirLand Battlefield Environment) GIS, emerged as the most promising candidate for integration with LAS. IEMIS (Integrated Emergency Management Information System) was developed by the Federal Emergency Management Agency (FEMA). ALBE GIS is currently under development at the Pacific Northwest Laboratory under contract with the U.S. Army Corps of Engineers' Engineering Topographic Laboratory (ETL). Accordingly, recommendations are offered with respect to a potential LAS/ALBE GIS linkage and with respect to further system enhancements, including coordination with the development of the Spatial Analysis and Modeling System (SAMS) GIS and with IDM (Intelligent Data Management) developments in Goddard's National Space Science Data Center.

  3. Integrating NASA's Land Analysis System (LAS) image processing software with an appropriate Geographic Information System (GIS): A review of candidates in the public domain

    NASA Technical Reports Server (NTRS)

    Rochon, Gilbert L.

    1989-01-01

    A user requirements analysis (URA) was undertaken to determine an appropriate public domain Geographic Information System (GIS) software package for potential integration with NASA's LAS (Land Analysis System) 5.0 image processing system. The necessity for a public domain system was underscored by the perceived need for source code access and flexibility in tailoring the GIS system to the needs of a heterogeneous group of end-users, and by specific constraints imposed by LAS and its user interface, the Transportable Applications Executive (TAE). Subsequently, a review was conducted of a variety of public domain GIS candidates, including GRASS 3.0, MOSS, IEMIS, and two university-based packages, IDRISI and KBGIS. The review method was a modified version of the GIS evaluation process developed by the Federal Interagency Coordinating Committee on Digital Cartography. One IEMIS-derivative product, the ALBE (AirLand Battlefield Environment) GIS, emerged as the most promising candidate for integration with LAS. IEMIS (Integrated Emergency Management Information System) was developed by the Federal Emergency Management Agency (FEMA). ALBE GIS is currently under development at the Pacific Northwest Laboratory under contract with the U.S. Army Corps of Engineers' Engineering Topographic Laboratory (ETL). Accordingly, recommendations are offered with respect to a potential LAS/ALBE GIS linkage and with respect to further system enhancements, including coordination with the development of the Spatial Analysis and Modeling System (SAMS) GIS and with IDM (Intelligent Data Management) developments in Goddard's National Space Science Data Center.

  4. Information extraction system

    DOEpatents

    Lemmond, Tracy D; Hanley, William G; Guensche, Joseph Wendell; Perry, Nathan C; Nitao, John J; Kidwell, Paul Brandon; Boakye, Kofi Agyeman; Glaser, Ron E; Prenger, Ryan James

    2014-05-13

    An information extraction system and methods of operating the system are provided. In particular, an information extraction system for performing meta-extraction of named entities of people, organizations, and locations, as well as relationships and events, from text documents is described herein.

  5. Regional Health Information Systems

    PubMed Central

    Fuller, Sherrilynne

    1997-01-01

    In general, there is agreement that robust integrated information systems are the foundation for building successful regional health care delivery systems. Integrated Advanced Information Management System (IAIMS) institutions that, over the years, have developed strategies for creating cohesive institutional information systems and services are finding that IAIMS strategies work well in the even more complex regional environment. The key elements of IAIMS planning are described and lessons learned are discussed in the context of regional health information systems development. The challenges of aligning the various information agencies and agendas in support of a regional health information system are complex; however, the potential rewards for health care in quality, efficacy, and cost savings are enormous. PMID:9067887

  6. ECONOMICS OF INFORMATION SYSTEMS

    DTIC Science & Technology

    The paper presents a study of the rational choice-making of an individual from among available information systems, or available components of such...components, of information systems. The available set depends on the choices made by suppliers. Joint choices by demanders and suppliers would...determine which information systems are in fact produced and used under given external conditions. These conditions include the technological knowledge of those concerned.

  7. Management Information Systems Research.

    DTIC Science & Technology

    Research on management information systems is elusive in many respects. Part of the basic research problem in MIS stems from the absence of standard...definitions and the lack of a unified body of theory. Organizations continue to develop large and often very efficient information systems, but...decision making. But the transition from these results to the realization of 'satisfactory' management information systems remains difficult indeed. The

  8. Weather Information System

    NASA Technical Reports Server (NTRS)

    1995-01-01

    WxLink is an aviation weather system based on advanced airborne sensors, precise positioning available from the satellite-based Global Positioning System, cockpit graphics and a low-cost datalink. It is a two-way system that uplinks weather information to the aircraft and downlinks automatic pilot reports of weather conditions aloft. Manufactured by ARNAV Systems, Inc., the original technology came from Langley Research Center's cockpit weather information system, CWIN (Cockpit Weather INformation). The system creates radar maps of storms, lightning and reports of surface observations, offering improved safety, better weather monitoring and substantial fuel savings.

  9. Information retrieval system

    NASA Technical Reports Server (NTRS)

    Berg, R. F.; Holcomb, J. E.; Kelroy, E. A.; Levine, D. A.; Mee, C., III

    1970-01-01

    Generalized information storage and retrieval system capable of generating and maintaining a file, gathering statistics, sorting output, and generating final reports for output is reviewed. File generation and file maintenance programs written for the system are general purpose routines.

  10. Information processing in miniature brains.

    PubMed

    Chittka, L; Skorupski, P

    2011-03-22

    Since a comprehensive understanding of brain function and evolution in vertebrates is often hobbled by the sheer size of the nervous system, as well as ethical concerns, major research efforts have been made to understand the neural circuitry underpinning behaviour and cognition in invertebrates, and its costs and benefits under natural conditions. This special feature of Proceedings of the Royal Society B contains an idiosyncratic range of current research perspectives on neural underpinnings and adaptive benefits (and costs) of such diverse phenomena as spatial memory, colour vision, attention, spontaneous behaviour initiation, memory dynamics, relational rule learning and sleep, in a range of animals from marine invertebrates with exquisitely simple nervous systems to social insects forming societies with many thousands of individuals working together as a 'superorganism'. This introduction provides context and history to tie the various approaches together, and concludes that there is an urgent need to understand the full neuron-to-neuron circuitry underlying various forms of information processing-not just to explore brain function comprehensively, but also to understand how (and how easily) cognitive capacities might evolve in the face of pertinent selection pressures. In the invertebrates, reaching these goals is becoming increasingly realistic.

  11. Anesthesia information management systems.

    PubMed

    Williams, Joe R

    2005-06-01

    Documentation is the last component of anesthesia patient management to be affected by technology. Anesthesia information management systems (AIMS) have been introduced in a limited number of practice sites. The automated systems provide unbiased reporting of most patient information. This results in improved patient care and possible medical legal advantages. AIMS also allow anesthesia departments to monitor their business-related activity.

  12. Environmental geographic information system.

    SciTech Connect

    Peek, Dennis W; Helfrich, Donald Alan; Gorman, Susan

    2010-08-01

    This document describes how the Environmental Geographic Information System (EGIS) was used, along with externally received data, to create maps for the Site-Wide Environmental Impact Statement (SWEIS) Source Document project. Data quality among the various classes of geographic information system (GIS) data is addressed. A complete listing of map layers used is provided.

  13. Information Retrieval System.

    ERIC Educational Resources Information Center

    Mahle, Jack D., Jr.

    The Fort Detrick Information Retrieval System is a system of computer programs written in COBOL for a CDC 3150 to store and retrieve information about the scientific and technical reports and documents of the Fort Detrick Technical Library. The documents and reports have been abstracted and indexed. This abstract, the subject matter descriptors,…

  14. Archival Information Management System.

    DTIC Science & Technology

    1995-02-01

    This report presents a prototype information management system named Archival Information Management System (AIMS), designed to meet the audit trail requirement for studies completed under the...are to be archived to the extent that future reproducibility and interrogation of results will exist.

  15. The Benefits and Challenges of an Interfaced Electronic Health Record and Laboratory Information System: Effects on Laboratory Processes.

    PubMed

    Petrides, Athena K; Bixho, Ida; Goonan, Ellen M; Bates, David W; Shaykevich, Shimon; Lipsitz, Stuart R; Landman, Adam B; Tanasijevic, Milenko J; Melanson, Stacy E F

    2017-03-01

    A recent government regulation incentivizes implementation of an electronic health record (EHR) with computerized order entry and structured results display. Many institutions have also chosen to interface their EHR with their laboratory information system (LIS). The objective was to determine the impact of an interfaced EHR-LIS on laboratory processes. We analyzed several different processes before and after implementation of an interfaced EHR-LIS: the turnaround time, the number of stat specimens received, venipunctures per patient per day, preanalytic errors in phlebotomy, the number of add-on tests using a new electronic process, and the number of wrong test codes ordered. Data were gathered through the LIS and/or EHR. The turnaround time for potassium and hematocrit decreased significantly (P = .047 and P = .004, respectively). The number of stat orders also decreased significantly, from 40% to 7% for potassium and hematocrit, respectively (P < .001 for both). Even though the average number of inpatient venipunctures per day increased from 1.38 to 1.62 (P < .001), the average number of preanalytic errors per month decreased from 2.24 to 0.16 per 1000 specimens (P < .001). Overall there was a 16% increase in add-on tests. The number of wrong test codes ordered was high and it was challenging for providers to correctly order some common tests. An interfaced EHR-LIS significantly improved within-laboratory turnaround time and decreased stat requests and preanalytic phlebotomy errors. Despite increasing the number of add-on requests, an electronic add-on process increased efficiency and improved provider satisfaction. Laboratories implementing an interfaced EHR-LIS should be cautious of its effects on test ordering and patient venipunctures per day.
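
    A minimal sketch of the kind of before/after turnaround-time comparison reported above, using simulated data and a nonparametric test; the numbers here are illustrative and are not the study's measurements.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Hypothetical within-laboratory turnaround times (minutes) for potassium,
    # before and after an EHR-LIS interface; values are simulated, not the
    # study's data.
    tat_before = rng.gamma(shape=4.0, scale=12.0, size=500)
    tat_after = rng.gamma(shape=4.0, scale=10.0, size=500)

    # Turnaround-time distributions are skewed, so compare with a
    # nonparametric test rather than a t-test.
    u, p = stats.mannwhitneyu(tat_before, tat_after, alternative="two-sided")
    print(f"median before: {np.median(tat_before):.1f} min")
    print(f"median after:  {np.median(tat_after):.1f} min")
    print(f"Mann-Whitney U p-value: {p:.4f}")
    ```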

  16. Enabling Business Processes through Information Management and IT Systems: The FastFit and Winter Gear Distributors Case Studies

    ERIC Educational Resources Information Center

    Kesner, Richard M.; Russell, Bruce

    2009-01-01

    The "FastFit Case Study" and its companion, the "Winter Gear Distributors Case Study" provide undergraduate business students with a suitable and even familiar business context within which to initially consider the role of information management (IM) and to a lesser extent the role of information technology (IT) systems in enabling a business.…

  17. Enabling Business Processes through Information Management and IT Systems: The FastFit and Winter Gear Distributors Case Studies

    ERIC Educational Resources Information Center

    Kesner, Richard M.; Russell, Bruce

    2009-01-01

    The "FastFit Case Study" and its companion, the "Winter Gear Distributors Case Study" provide undergraduate business students with a suitable and even familiar business context within which to initially consider the role of information management (IM) and to a lesser extent the role of information technology (IT) systems in enabling a business.…

  18. Transparent materials processing system

    NASA Technical Reports Server (NTRS)

    Hetherington, J. S.

    1977-01-01

    A zero gravity processing furnace system was designed that will allow acquisition of photographic or other visual information while the sample is being processed. A low temperature (30 to 400 C) test model with a flat specimen heated by quartz-halide lamps was constructed. A high temperature (400 to 1000 C) test model heated by resistance heaters, utilizing a cylindrical specimen and optics, was also built. Each of the test models is discussed in detail. Recommendations are given.

  19. Optical Hybrid Quantum Information Processing

    NASA Astrophysics Data System (ADS)

    Takeda, Shuntaro; Furusawa, Akira

    Historically, two complementary approaches to optical quantum information processing have been pursued: qubits and continuous-variables, each exploiting either particle or wave nature of light. However, both approaches have pros and cons. In recent years, there has been a significant progress in combining both approaches with a view to realizing hybrid protocols that overcome the current limitations. In this chapter, we first review the development of the two approaches with a special focus on quantum teleportation and its applications. We then introduce our recent research progress in realizing quantum teleportation by a hybrid scheme, and mention its future applications to universal and fault-tolerant quantum information processing.

  20. Computer Aided Management for Information Processing Projects.

    ERIC Educational Resources Information Center

    Akman, Ibrahim; Kocamustafaogullari, Kemal

    1995-01-01

    Outlines the nature of information processing projects and discusses some project management programming packages. Describes an in-house interface program developed to utilize a selected project management package (TIMELINE) by using Oracle Data Base Management System tools and Pascal programming language for the management of information system…

  1. Computer Aided Management for Information Processing Projects.

    ERIC Educational Resources Information Center

    Akman, Ibrahim; Kocamustafaogullari, Kemal

    1995-01-01

    Outlines the nature of information processing projects and discusses some project management programming packages. Describes an in-house interface program developed to utilize a selected project management package (TIMELINE) by using Oracle Data Base Management System tools and Pascal programming language for the management of information system…

  2. A Microbiology Information System

    PubMed Central

    Peebles, James E.; Ryan, Kenneth J.

    1980-01-01

    This paper describes a microbiology information system which is integrated into a general purpose laboratory information system as well as into the normal workflow of the microbiology laboratory. Data entry using “customized” terminal keyboards greatly simplifies technologists' interaction with the system, allowing direct entry of results at each workstation. Results are reported in a user-oriented format utilizing full English descriptions of all terms.

  3. NMCS Information Processing System 360 Formatted File System (NIPS 360 FFS). Users Manual. Volume 1. Introduction to File Concepts

    DTIC Science & Technology

    1978-09-01

    may be utilized from the batched job stream as well as from terminals. The third major subset of this component is Source Data Automation (SODA...segment. 3.10.1.3 Data Generation Group The DSNAME parameter of the DD cards in the JCL stream for the user and system libraries (SLIB), sequential data fil...fields and groups of the same type. SODA System component -- Source Data Automation. Stop Word Table A user-defined table for the Keyword Indexing

  4. Systemic factors of errors in the case identification process of the national routine health information system: a case study of Modified Field Health Services Information System in the Philippines.

    PubMed

    Murai, Shinsuke; Lagrada, Leizel P; Gaite, Julita T; Uehara, Naruo

    2011-10-14

    The quality of data in national health information systems has been questionable in most developing countries. However, the mechanisms of errors in the case identification process are not fully understood. This study aimed to investigate the mechanisms of errors in the case identification process in the existing routine health information system (RHIS) in the Philippines by measuring the risk of committing errors for health program indicators used in the Field Health Services Information System (FHSIS 1996), and characterizing those indicators accordingly. A structured questionnaire on the definitions of 12 selected indicators in the FHSIS was administered to 132 health workers in 14 selected municipalities in the province of Palawan. A proportion of correct answers (difficulty index) and a disparity of two proportions of correct answers between higher and lower scored groups (discrimination index) were calculated, and the patterns of wrong answers for each of the 12 items were abstracted from 113 valid responses. None of 12 items reached a difficulty index of 1.00. The average difficulty index of 12 items was 0.266 and the discrimination index that showed a significant difference was 0.216 and above. Compared with these two cut-offs, six items showed non-discrimination against lower difficulty indices of 0.035 (4/113) to 0.195 (22/113), two items showed a positive discrimination against lower difficulty indices of 0.142 (16/113) and 0.248 (28/113), and four items showed a positive discrimination against higher difficulty indices of 0.469 (53/113) to 0.673 (76/113). The results suggest three characteristics of definitions of indicators such as those that are (1) unsupported by the current conditions in the health system, i.e., (a) data are required from a facility that cannot directly generate the data and, (b) definitions of indicators are not consistent with its corresponding program; (2) incomplete or ambiguous, which allow several interpretations; and (3

  5. Systemic factors of errors in the case identification process of the national routine health information system: A case study of Modified Field Health Services Information System in the Philippines

    PubMed Central

    2011-01-01

    Background The quality of data in national health information systems has been questionable in most developing countries. However, the mechanisms of errors in the case identification process are not fully understood. This study aimed to investigate the mechanisms of errors in the case identification process in the existing routine health information system (RHIS) in the Philippines by measuring the risk of committing errors for health program indicators used in the Field Health Services Information System (FHSIS 1996), and characterizing those indicators accordingly. Methods A structured questionnaire on the definitions of 12 selected indicators in the FHSIS was administered to 132 health workers in 14 selected municipalities in the province of Palawan. A proportion of correct answers (difficulty index) and a disparity of two proportions of correct answers between higher and lower scored groups (discrimination index) were calculated, and the patterns of wrong answers for each of the 12 items were abstracted from 113 valid responses. Results None of 12 items reached a difficulty index of 1.00. The average difficulty index of 12 items was 0.266 and the discrimination index that showed a significant difference was 0.216 and above. Compared with these two cut-offs, six items showed non-discrimination against lower difficulty indices of 0.035 (4/113) to 0.195 (22/113), two items showed a positive discrimination against lower difficulty indices of 0.142 (16/113) and 0.248 (28/113), and four items showed a positive discrimination against higher difficulty indices of 0.469 (53/113) to 0.673 (76/113). Conclusions The results suggest three characteristics of definitions of indicators such as those that are (1) unsupported by the current conditions in the health system, i.e., (a) data are required from a facility that cannot directly generate the data and, (b) definitions of indicators are not consistent with its corresponding program; (2) incomplete or ambiguous, which allow
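
    The difficulty and discrimination indices used above are simple proportions; the sketch below computes them for a hypothetical 0/1 response matrix. The 27% high/low split is a common convention and is an assumption here, not necessarily the grouping used in the study.

    ```python
    import numpy as np

    # Hypothetical 0/1 response matrix: rows = respondents, columns = items.
    # The indices below follow the definitions quoted in the abstract.
    rng = np.random.default_rng(1)
    responses = rng.integers(0, 2, size=(113, 12))

    def difficulty_index(item):
        """Proportion of correct answers for one item."""
        return item.mean()

    def discrimination_index(responses, item_col, top_frac=0.27):
        """Difference in proportion correct between high and low scorers."""
        totals = responses.sum(axis=1)
        order = np.argsort(totals)
        n = max(1, int(len(totals) * top_frac))
        low, high = order[:n], order[-n:]
        item = responses[:, item_col]
        return item[high].mean() - item[low].mean()

    for j in range(responses.shape[1]):
        d = difficulty_index(responses[:, j])
        disc = discrimination_index(responses, j)
        print(f"item {j+1:2d}: difficulty={d:.3f}, discrimination={disc:+.3f}")
    ```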

  6. Image-plane processing of visual information

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Fales, C. L.; Park, S. K.; Samms, R. W.

    1984-01-01

    Shannon's theory of information is used to optimize the optical design of sensor-array imaging systems which use neighborhood image-plane signal processing for enhancing edges and compressing dynamic range during image formation. The resultant edge-enhancement, or band-pass-filter, response is found to be very similar to that of human vision. Comparisons of traits in human vision with results from information theory suggest that: (1) Image-plane processing, like preprocessing in human vision, can improve visual information acquisition for pattern recognition when resolving power, sensitivity, and dynamic range are constrained. Improvements include reduced sensitivity to changes in light levels, reduced signal dynamic range, reduced data transmission and processing, and reduced aliasing and photosensor noise degradation. (2) Information content can be an appropriate figure of merit for optimizing the optical design of imaging systems when visual information is acquired for pattern recognition. The design trade-offs involve spatial response, sensitivity, and sampling interval.

  7. Image-plane processing of visual information

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Fales, C. L.; Park, S. K.; Samms, R. W.

    1984-01-01

    Shannon's theory of information is used to optimize the optical design of sensor-array imaging systems which use neighborhood image-plane signal processing for enhancing edges and compressing dynamic range during image formation. The resultant edge-enhancement, or band-pass-filter, response is found to be very similar to that of human vision. Comparisons of traits in human vision with results from information theory suggest that: (1) Image-plane processing, like preprocessing in human vision, can improve visual information acquisition for pattern recognition when resolving power, sensitivity, and dynamic range are constrained. Improvements include reduced sensitivity to changes in light levels, reduced signal dynamic range, reduced data transmission and processing, and reduced aliasing and photosensor noise degradation. (2) Information content can be an appropriate figure of merit for optimizing the optical design of imaging systems when visual information is acquired for pattern recognition. The design trade-offs involve spatial response, sensitivity, and sampling interval.
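
    As a loose illustration of the neighborhood band-pass processing described above (not the authors' design procedure), the sketch below applies a difference-of-Gaussians filter followed by a simple log compression to a synthetic scene; the filter scales are arbitrary.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    # Difference-of-Gaussians band-pass filtering: a common stand-in for
    # neighborhood image-plane processing (edge enhancement plus
    # dynamic-range compression). Sigmas are illustrative.
    def dog_bandpass(image, sigma_center=1.0, sigma_surround=3.0):
        center = gaussian_filter(image, sigma_center)
        surround = gaussian_filter(image, sigma_surround)
        return center - surround

    def compress_dynamic_range(image):
        """Simple signed log compression after band-pass filtering."""
        return np.sign(image) * np.log1p(np.abs(image))

    # Synthetic test scene: a bright square on a dark background.
    scene = np.zeros((128, 128))
    scene[40:90, 40:90] = 1.0

    edges = compress_dynamic_range(dog_bandpass(scene))
    print("band-pass output range:", edges.min(), edges.max())
    ```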

  8. Health Information Systems.

    PubMed

    Sirintrapun, S Joseph; Artz, David R

    2015-06-01

    This article provides surgical pathologists an overview of health information systems (HISs): what they are, what they do, and how such systems relate to the practice of surgical pathology. Much of this article is dedicated to the electronic medical record. Information, in how it is captured, transmitted, and conveyed, drives the effectiveness of such electronic medical record functionalities. So critical is information from pathology in integrated clinical care that surgical pathologists are becoming gatekeepers of not only tissue but also information. Better understanding of HISs can empower surgical pathologists to become stakeholders who have an impact on the future direction of quality integrated clinical care.

  9. Health Information Systems.

    PubMed

    Sirintrapun, S Joseph; Artz, David R

    2016-03-01

    This article provides surgical pathologists an overview of health information systems (HISs): what they are, what they do, and how such systems relate to the practice of surgical pathology. Much of this article is dedicated to the electronic medical record. Information, in how it is captured, transmitted, and conveyed, drives the effectiveness of such electronic medical record functionalities. So critical is information from pathology in integrated clinical care that surgical pathologists are becoming gatekeepers of not only tissue but also information. Better understanding of HISs can empower surgical pathologists to become stakeholders who have an impact on the future direction of quality integrated clinical care.

  10. Earthquake Information System

    NASA Astrophysics Data System (ADS)

    1991-01-01

    IAEMIS (Integrated Automated Emergency Management Information System) is the principal tool of an earthquake preparedness program developed by Martin Marietta and the Mid-America Remote Sensing Center (MARC). It is a two-component set of software, data and procedures to provide information enabling management personnel to make informed decisions in disaster situations. The NASA-developed program ELAS, originally used to analyze Landsat data, provides MARC with a spatially-oriented information management system. Additional MARC projects include land resources management, and development of socioeconomic data.

  11. The AMMA information system

    NASA Astrophysics Data System (ADS)

    Fleury, Laurence; Brissebrat, Guillaume; Boichard, Jean-Luc; Cloché, Sophie; Eymard, Laurence; Mastrorillo, Laurence; Moulaye, Oumarou; Ramage, Karim; Favot, Florence; Roussot, Odile

    2014-05-01

    In the framework of the African Monsoon Multidisciplinary Analyses (AMMA) programme, several tools have been developed in order to facilitate and speed up data and information exchange between researchers from different disciplines. The AMMA information system includes (i) a multidisciplinary user-friendly data management and dissemination system, (ii) report and chart archives associated with display websites and (iii) a scientific paper exchange system. The AMMA information system is enriched by several earlier projects (IMPETUS...) and follow-on projects (FENNEC, ESCAPE, QweCI, DACCIWA…) and is becoming a reference information system for the West African monsoon. (i) The AMMA project includes airborne, ground-based and ocean measurements, satellite data use, modelling studies and value-added product development. Therefore, the AMMA database user interface provides access to a great amount and a large variety of data: - 250 local observation datasets, that cover many geophysical components (atmosphere, ocean, soil, vegetation) and human activities (agronomy, health); they have been collected by operational networks from 1850 to present, long term monitoring research networks (CATCH, IDAF, PIRATA...) or scientific campaigns; - 1350 outputs of a socio-economics questionnaire; - 60 operational satellite products and several research products; - 10 output sets of meteorological and ocean operational models and 15 of research simulations. All the data are documented in compliance with international metadata standards and delivered in standard formats. The data request user interface takes full advantage of the relational structure of the data and metadata base and enables users to easily build multi-criteria data requests (period, area, property, property value…). The AMMA data portal has around 800 registered users and processes about 50 data requests every month. The AMMA databases and data portal have been developed and are operated jointly by SEDOO and ESPRI in France.
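
    As an illustration only, the snippet below sketches the sort of multi-criteria request (period, area, property) the portal supports; the records, field names and schema are hypothetical and do not reflect the actual AMMA database.

    ```python
    from datetime import date

    # Hypothetical metadata records; field names are illustrative only and
    # do not reflect the actual AMMA database schema.
    datasets = [
        {"name": "rain_gauge_niamey", "property": "precipitation",
         "lat": 13.5, "lon": 2.1, "start": date(2005, 1, 1), "end": date(2010, 12, 31)},
        {"name": "pirata_sst_buoy", "property": "sea_surface_temperature",
         "lat": 0.0, "lon": -23.0, "start": date(1998, 1, 1), "end": date(2012, 12, 31)},
    ]

    def match(rec, prop=None, bbox=None, period=None):
        """Multi-criteria filter: property, bounding box (latmin, latmax, lonmin, lonmax), period."""
        if prop and rec["property"] != prop:
            return False
        if bbox:
            latmin, latmax, lonmin, lonmax = bbox
            if not (latmin <= rec["lat"] <= latmax and lonmin <= rec["lon"] <= lonmax):
                return False
        if period:
            start, end = period
            if rec["end"] < start or rec["start"] > end:
                return False
        return True

    hits = [r["name"] for r in datasets
            if match(r, prop="precipitation",
                     bbox=(5, 20, -10, 10),
                     period=(date(2006, 6, 1), date(2006, 9, 30)))]
    print(hits)
    ```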

  12. White Light Optical Information Processing.

    DTIC Science & Technology

    1985-05-31

    together) incident on the nematic film, after passage through the optical system, was about 0.2 watts. A second beam splitter BSI was placed between... film, a process that is like holography, indeed is often termed image-plane holography, but in fact goes back to Ives. In particular, the use of...slit images became straight, whereupon the system was assumed to be properly adjusted. For the real time, or phase conjugation process, a thin film

  13. Application of the informational reference system OZhUR to the automated processing of data from satellites of the Kosmos series

    NASA Technical Reports Server (NTRS)

    Pokras, V. M.; Yevdokimov, V. P.; Maslov, V. D.

    1978-01-01

    The structure and potential of the information reference system OZhUR, designed for the automated data processing systems of scientific space vehicles (SV), is considered. The system OZhUR ensures control of the extraction phase of processing with respect to a specific SV and the exchange of data between phases. The practical application of the system OZhUR is exemplified in the construction of a data processing system for satellites of the Cosmos series. As a result of automating the operations of exchange and control, the volume of manual data preparation is significantly reduced, and there is no longer any need for individual logs recording the status of data processing. The system OZhUR is included in the automated data processing system Nauka, which is realized in the PL-1 language on a binary one-address, one-state (BOS OS) electronic computer.

  14. Research on probabilistic information processing

    NASA Technical Reports Server (NTRS)

    Edwards, W.

    1973-01-01

    The work accomplished on probabilistic information processing (PIP) is reported. The research proposals and decision analysis are discussed along with the results of research on MSC setting, multiattribute utilities, and Bayesian research. Abstracts of reports concerning the PIP research are included.

  15. Medical Information Processing by Computer.

    ERIC Educational Resources Information Center

    Kleinmuntz, Benjamin

    The use of the computer for medical information processing was introduced about a decade ago. Considerable inroads have now been made toward its applications to problems in medicine. Present uses of the computer, both as a computational and noncomputational device include the following: automated search of patients' files; on-line clinical data…

  16. Dynamic Information and Library Processing.

    ERIC Educational Resources Information Center

    Salton, Gerard

    This book provides an introduction to automated information services: collection, analysis, classification, storage, retrieval, transmission, and dissemination. An introductory chapter is followed by an overview of mechanized processes for acquisitions, cataloging, and circulation. Automatic indexing and abstracting methods are covered, followed…

  17. Information Processing Applications: Curriculum Guidelines.

    ERIC Educational Resources Information Center

    Washington Office of the State Superintendent of Public Instruction, Olympia. Div. of Vocational-Technical and Adult Education Services.

    This guide is intended to serve as a resource for business education instructors who are teaching a course in information processing for the automated office. The following topics are covered: program goals, student learning objectives for production applications, an introduction to production applications, a curriculum outline, student learning…

  18. Air System Information Management

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.

    2004-01-01

    I flew to Washington last week, a trip rich in distributed information management. Buying tickets, at the gate, in flight, landing and at the baggage claim, myriad messages about my reservation, the weather, our flight plans, gates, bags and so forth flew among a variety of travel agency, airline and Federal Aviation Administration (FAA) computers and personnel. By and large, each kind of information ran on a particular application, often specialized to its own data formats and communications network. I went to Washington to attend an FAA meeting on System-Wide Information Management (SWIM) for the National Airspace System (NAS) (http://www.nasarchitecture.faa.gov/Tutorials/NAS101.cfm). NAS (and its information infrastructure, SWIM) is an attempt to bring greater regularity, efficiency and uniformity to the collection of stovepipe applications now used to manage air traffic. Current systems hold information about flight plans, flight trajectories, weather, air turbulence, current and forecast weather, radar summaries, hazardous condition warnings, airport and airspace capacity constraints, temporary flight restrictions, and so forth. Information moving among these stovepipe systems is usually mediated by people (for example, air traffic controllers) or single-purpose applications. People, whose intelligence is critical for difficult tasks and unusual circumstances, are not as efficient as computers for tasks that can be automated. Better information sharing can lead to higher system capacity, more efficient utilization and safer operations. Better information sharing through greater automation is possible though not necessarily easy.

  19. Air System Information Management

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.

    2004-01-01

    I flew to Washington last week, a trip rich in distributed information management. Buying tickets, at the gate, in flight, landing and at the baggage claim, myriad messages about my reservation, the weather, our flight plans, gates, bags and so forth flew among a variety of travel agency, airline and Federal Aviation Administration (FAA) computers and personnel. By and large, each kind of information ran on a particular application, often specialized to its own data formats and communications network. I went to Washington to attend an FAA meeting on System-Wide Information Management (SWIM) for the National Airspace System (NAS) (http://www.nasarchitecture.faa.gov/Tutorials/NAS101.cfm). NAS (and its information infrastructure, SWIM) is an attempt to bring greater regularity, efficiency and uniformity to the collection of stovepipe applications now used to manage air traffic. Current systems hold information about flight plans, flight trajectories, weather, air turbulence, current and forecast weather, radar summaries, hazardous condition warnings, airport and airspace capacity constraints, temporary flight restrictions, and so forth. Information moving among these stovepipe systems is usually mediated by people (for example, air traffic controllers) or single-purpose applications. People, whose intelligence is critical for difficult tasks and unusual circumstances, are not as efficient as computers for tasks that can be automated. Better information sharing can lead to higher system capacity, more efficient utilization and safer operations. Better information sharing through greater automation is possible though not necessarily easy.

  20. Arkansas Technology Information System.

    ERIC Educational Resources Information Center

    VanBiervliet, Alan; Parette, Howard P., Jr.

    The Arkansas Technology Information System (ARTIS) was developed to fill a significant void in existing systems of technical support to Arkansans with disabilities by creating and maintaining a consumer-responsive statewide system of data storage and retrieval regarding assistive technology and services. ARTIS goals also include establishment of a…

  1. The visual information system

    Treesearch

    Merlyn J. Paulson

    1979-01-01

    This paper outlines a project level process (V.I.S.) which utilizes very accurate and flexible computer algorithms in combination with contemporary site analysis and design techniques for visual evaluation, design and management. The process provides logical direction and connecting bridges through problem identification, information collection and verification, visual...

  2. Information System Overview.

    ERIC Educational Resources Information Center

    Burrows, J. H.

    This paper was prepared for distribution to the California Educational Administrators participating in the "Executive Information Systems" Unit of Instruction as part of the instructional program of Operation PEP (Prepare Educational Planners). The purpose of the course was to introduce some basic concepts of information systems…

  3. NASA space information systems overview

    NASA Technical Reports Server (NTRS)

    Hall, Dana L.

    1987-01-01

    A major objective of NASA space missions is the gathering of information that, when analyzed, compared, and interpreted, furthers man's knowledge of his planet and the surrounding universe. A space information system is the combination of data gathering, data processing, and data transport capabilities that interact to provide the underlying services that enable that advancement in understanding. Past space projects have been characterized by rather disjoint data systems that often did not satisfy user requirements. NASA has learned from those experiences, however, and now is conceptualizing a new generation of sophisticated, integrated space information systems suitable to the wide range of near-future space endeavors. This paper examines the characteristics of recent data systems and, based upon that characterization, outlines the scope and attributes of future systems. A description is offered of the information system for the Space Station Program as one real example of such advanced capabilities.

  4. Processing Of Visual Information In Primate Brains

    NASA Technical Reports Server (NTRS)

    Anderson, Charles H.; Van Essen, David C.

    1991-01-01

    Report reviews and analyzes information-processing strategies and pathways in primate retina and visual cortex. Of interest both in biological fields and in such related computational fields as artificial neural networks. Focuses on data from macaque, which has superb visual system similar to that of humans. Authors stress concept of "good engineering" in understanding visual system.

  5. Processing Of Visual Information In Primate Brains

    NASA Technical Reports Server (NTRS)

    Anderson, Charles H.; Van Essen, David C.

    1991-01-01

    Report reviews and analyzes information-processing strategies and pathways in primate retina and visual cortex. Of interest both in biological fields and in such related computational fields as artificial neural networks. Focuses on data from macaque, which has superb visual system similar to that of humans. Authors stress concept of "good engineering" in understanding visual system.

  6. Forest Resource Information System

    NASA Technical Reports Server (NTRS)

    Mrocznyski, R. P.

    1983-01-01

    Twenty-three processing functions aid in utilizing LANDSAT data for forest resource management. The system is designed to work primarily with digital data obtained from measurements recorded by multispectral remote sensors mounted on aerospace platforms. Communication between processing functions, simplicity of control, and commonality of data files in LARSFRIS enhance the usefulness of the system as a tool for research and development of remote sensing systems.

  7. Risk assessment of urban flood disaster in Jingdezhen City based on analytic hierarchy process and geographic information system

    NASA Astrophysics Data System (ADS)

    Sun, D. C.; Huang, J.; Wang, H. M.; Wang, Z. Q.; Wang, W. Q.

    2017-08-01

    Research on urban flood risk assessment and management is of great academic and practical importance and has become a widespread concern throughout the world. It is important to understand the spatial-temporal distribution of flood risk before formulating risk response measures. In this study, the urban region of Jingdezhen City is selected as the study area. The assessment indicators are selected from four aspects: disaster-causing factors, disaster-pregnant environment, disaster-bearing body, and prevention and mitigation ability, in consideration of the formation process of urban flood risk. A small-scale flood disaster risk assessment model is then developed based on the Analytic Hierarchy Process (AHP) and a Geographic Information System (GIS), and the spatial-temporal distribution of flood risk in Jingdezhen City is analysed. The results show that the risk decreases gradually from the centre line of the Changjiang River to the surrounding areas, and that the area of high flood disaster risk decreased from 2010 to 2013 while the risk areas became more concentrated. The flood risk of the areas along the Changjiang River is the largest, followed by the low-lying areas in Changjiang District. The risk is also large in Zhushan District, where the population, industry and commerce are concentrated. The flood risk in the western part of Changjiang District and the north-eastern part of the study area is relatively low. The results can provide scientific support for flood control construction and land development planning in Jingdezhen City.
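
    For illustration, a toy weighted-overlay step of the kind implied by the AHP-plus-GIS approach above: normalized criterion rasters are combined with assumed AHP weights into a risk surface and classified into levels. All rasters, weights and class breaks are synthetic.

    ```python
    import numpy as np

    # Toy weighted overlay: combine normalized criterion rasters with
    # AHP-style weights to produce a flood-risk surface. Rasters and
    # weights are synthetic.
    rng = np.random.default_rng(2)
    shape = (100, 100)

    rasters = {
        "rainfall": rng.random(shape),
        "elevation": rng.random(shape),
        "drainage": rng.random(shape),
        "land_use": rng.random(shape),
    }
    weights = {"rainfall": 0.45, "elevation": 0.25, "drainage": 0.20, "land_use": 0.10}

    def normalize(raster, invert=False):
        """Rescale to [0, 1]; invert criteria where high values mean low risk."""
        r = (raster - raster.min()) / (raster.max() - raster.min())
        return 1.0 - r if invert else r

    risk = sum(weights[k] * normalize(rasters[k], invert=(k == "elevation"))
               for k in rasters)

    # Classify into five risk levels for mapping.
    levels = np.digitize(risk, np.quantile(risk, [0.2, 0.4, 0.6, 0.8]))
    print("risk class counts:", np.bincount(levels.ravel()))
    ```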

  8. Physics as quantum information processing

    NASA Astrophysics Data System (ADS)

    Mauro D'Ariano, Giacomo

    2011-10-01

    The experience from Quantum Information has led us to look at Quantum Theory (QT) and the whole of Physics from a different angle. The information-theoretical paradigm—It from Bit—prophesied by John Archibald Wheeler is relentlessly advancing. Recently it has been shown that QT is derivable from pure informational principles. The possibility that there is only QT at the foundations of Physics has then been considered, with space-time, Relativity, quantization rules and Quantum Field Theory (QFT) emerging from a quantum-information processing. The resulting theory is a discrete version of QFT with automatic relativistic invariance, and without fields, Hamiltonian, and quantization rules. In this paper I review some recent advances along these lines. In particular: i) how space-time and relativistic covariance emerge from the quantum computation; ii) the derivation of the Dirac equation as free information flow, without imposing Lorentz covariance; iii) the information-theoretical meaning of inertial mass and Planck constant; iv) an observable consequence of the theory: a mass-dependent refraction index of vacuum. I will then conclude with two possible routes to Quantum Gravity.
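
    As a rough sketch (not the author's construction), the following toy one-dimensional two-component quantum walk shows the flavor of such discrete dynamics: a local unitary update whose small-mass, long-wavelength limit behaves like 1+1D Dirac propagation. Conventions and parameters are illustrative assumptions.

    ```python
    import numpy as np

    # Toy 1D two-component quantum walk (quantum cellular automaton):
    # each step shifts the two chiralities in opposite directions and
    # mixes them through a mass parameter m; n^2 + m^2 = 1 keeps the
    # update unitary. Values below are illustrative only.
    N = 256                  # lattice sites (periodic boundary)
    m = 0.05                 # dimensionless mass parameter
    n = np.sqrt(1.0 - m**2)

    # Gaussian wave packet placed in the right-moving component.
    x = np.arange(N)
    psi_R = np.exp(-((x - N/4)**2) / 50.0).astype(complex)
    psi_L = np.zeros(N, dtype=complex)
    psi_R /= np.sqrt(np.sum(np.abs(psi_R)**2 + np.abs(psi_L)**2))

    def step(psi_R, psi_L):
        """One update of the walk: shift each chirality and mix them via m."""
        new_R = n * np.roll(psi_R, 1) + 1j * m * psi_L
        new_L = 1j * m * psi_R + n * np.roll(psi_L, -1)
        return new_R, new_L

    for _ in range(200):
        psi_R, psi_L = step(psi_R, psi_L)

    prob = np.abs(psi_R)**2 + np.abs(psi_L)**2
    print("norm after evolution:", prob.sum())            # stays 1 (unitary)
    print("mean position:", (x * prob).sum() / prob.sum())
    ```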

  9. Practicality of quantum information processing

    NASA Astrophysics Data System (ADS)

    Lau, Hoi-Kwan

    Quantum Information Processing (QIP) is expected to bring revolutionary enhancements to various technological areas. However, today's QIP applications are far from practical. The problem involves both hardware issues, i.e., quantum devices are imperfect, and software issues, i.e., the functionality of some QIP applications is not fully understood. Aiming to improve the practicality of QIP, in my PhD research I have studied various topics in quantum cryptography and ion trap quantum computation. In quantum cryptography, I first studied the security of position-based quantum cryptography (PBQC). I discovered a wrong assumption in the previous literature, namely that the cheaters are not allowed to share entangled resources. I proposed entanglement attacks that could cheat all known PBQC protocols. I also studied the practicality of continuous-variable (CV) quantum secret sharing (QSS). While the security of CV QSS had been considered in the literature only in the limit of infinite squeezing, I found that finitely squeezed CV resources can also provide a finite secret-sharing rate. Our work relaxes the stringent resource requirements for implementing QSS. In ion trap quantum computation, I studied the phase error of quantum information induced by the dc Stark effect during ion transportation. I found an optimized ion trajectory for which the phase error is minimized. I also defined a threshold speed, above which ion transportation would induce significant error. In addition, I proposed a new application of ion trap systems as universal bosonic simulators (UBS). I introduced two architectures and discussed their respective strengths and weaknesses. I illustrated the implementation of bosonic state initialization, transformation, and measurement by applying radiation fields or by varying the trap potential. Compared with optical experiments, the ion trap UBS offers higher state-initialization efficiency and higher measurement accuracy. Finally, I

  10. Scalable Networked Information Processing Environment (SNIPE)

    SciTech Connect

    Fagg, G.E.; Moore, K.; Dongarra, J.J. |; Geist, A.

    1997-11-01

    SNIPE is a metacomputing system that aims to provide a reliable, secure, fault-tolerant environment for long-term distributed computing applications and data stores across the global Internet. The system combines global naming and replication of both processing and data to support large-scale information processing applications, yielding better availability and reliability than typical cluster computing or distributed computing environments currently provide.

  11. Efficiency of cellular information processing

    NASA Astrophysics Data System (ADS)

    Barato, Andre C.; Hartich, David; Seifert, Udo

    2014-10-01

    We show that a rate of conditional Shannon entropy reduction, characterizing the learning of an internal process about an external process, is bounded by the thermodynamic entropy production. This approach allows for the definition of an informational efficiency that can be used to study cellular information processing. We analyze three models of increasing complexity inspired by the Escherichia coli sensory network, where the external process is an external ligand concentration jumping between two values. We start with a simple model for which ATP must be consumed so that a protein inside the cell can learn about the external concentration. With a second model for a single receptor, we show that the rate at which the receptor learns about the external environment can be nonzero even without any dissipation inside the cell, since chemical work done by the external process compensates for this learning rate. The third model is more complete, also containing adaptation. For this model we show, inter alia, that a bacterium in an environment that changes on a very slow time scale is quite inefficient, dissipating much more than it learns. Using the concept of a coarse-grained learning rate, we show for the model with adaptation that, while the activity learns about the external signal, the option of changing the methylation level increases the concentration range over which the learning rate is substantial.
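
    Stated schematically (with X the external ligand process, Y the internal process, and following the abstract's wording rather than the paper's full definitions), the central bound and the resulting informational efficiency take the form

    \[ \ell \;=\; -\,\frac{d}{dt}\,H\!\left(X_t \,\middle|\, Y_t\right)\Big|_{\text{due to the dynamics of } Y} \;\le\; \sigma, \qquad \eta \;=\; \frac{\ell}{\sigma} \;\le\; 1, \]

    where \ell is the learning rate (the rate of conditional Shannon entropy reduction), \sigma is the thermodynamic entropy production rate, and \eta is the efficiency used to compare the three models.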

  12. Information processing. [in human performance

    NASA Technical Reports Server (NTRS)

    Wickens, Christopher D.; Flach, John M.

    1988-01-01

    Theoretical models of sensory-information processing by the human brain are reviewed from a human-factors perspective, with a focus on their implications for aircraft and avionics design. The topics addressed include perception (signal detection and selection), linguistic factors in perception (context provision, logical reversals, absence of cues, and order reversals), mental models, and working and long-term memory. Particular attention is given to decision-making problems such as situation assessment, decision formulation, decision quality, selection of action, the speed-accuracy tradeoff, stimulus-response compatibility, stimulus sequencing, dual-task performance, task difficulty and structure, and factors affecting multiple task performance (processing modalities, codes, and stages).
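
    As a concrete illustration of the signal detection ideas mentioned above, the sketch below computes the sensitivity index d' and the response criterion from hit and false-alarm rates using inverse-normal transforms; the rates are made-up numbers, not data from the chapter.

```python
# Toy signal-detection calculation: d' and criterion from hit/false-alarm rates.
from statistics import NormalDist

def dprime_and_criterion(hit_rate: float, fa_rate: float):
    z = NormalDist().inv_cdf                        # inverse standard-normal CDF
    d_prime = z(hit_rate) - z(fa_rate)              # separation of signal and noise distributions
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))   # response bias
    return d_prime, criterion

# Hypothetical operator detecting a warning indication on a cockpit display.
d, c = dprime_and_criterion(hit_rate=0.90, fa_rate=0.20)
print(f"d' = {d:.2f}, criterion c = {c:.2f}")
```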

  13. Occurrence reporting and processing of operations information

    SciTech Connect

    1997-07-21

    DOE O 232.1A, Occurrence Reporting and Processing of Operations Information, and 10 CFR 830.350, Occurrence Reporting and Processing of Operations Information (when it becomes effective), along with this manual, set forth occurrence reporting requirements for Department of Energy (DOE) Departmental Elements and contractors responsible for the management and operation of DOE-owned and -leased facilities. These requirements include categorization of occurrences related to safety, security, environment, health, or operations ("Reportable Occurrences"); DOE notification of these occurrences; and the development and submission of documented follow-up reports. This Manual provides detailed information for categorizing and reporting occurrences at DOE facilities. Information gathered by the Occurrence Reporting and Processing System is used for analysis of the Department's performance in environmental protection, safeguards and security, and safety and health of its workers and the public. This information is also used to develop lessons learned and document events that significantly impact DOE operations.

  14. Management Information System Project.

    ERIC Educational Resources Information Center

    Foley, Walter J.; Harr, Gordon G.

    The Management Information System (MIS) described in this report represents a plan to utilize modern management techniques to facilitate the goal of a learner-responsive school system. The MIS component is being developed to meet the need for the coordination of the resources of staff, facilities, and time with the long range planning and…

  15. The Impact of Hierarchy and Group Structure on Information Processing in Decision Making: Application of a Networks/Systems Approach.

    ERIC Educational Resources Information Center

    Ford, David L., Jr.

    When one engages in organizational diagnosis, it has been suggested that greater understanding of the organization can come through: (1) an identification of all the channels conveying material and information, and (2) a description of the means by which this communication influences the behavior of the organization. A networks/system approach is…

  16. SHRIF, a General-Purpose System for Heuristic Retrieval of Information and Facts, Applied to Medical Knowledge Processing.

    ERIC Educational Resources Information Center

    Findler, Nicholas V.; And Others

    1992-01-01

    Describes SHRIF, a System for Heuristic Retrieval of Information and Facts, and the medical knowledge base that was used in its development. Highlights include design decisions; the user-machine interface, including the language processor; and the organization of the knowledge base in an artificial intelligence (AI) project like this one. (57…

  18. Manufacturing information system

    NASA Astrophysics Data System (ADS)

    Allen, D. K.; Smith, P. R.; Smart, M. J.

    1983-12-01

    The size and cost of manufacturing equipment have made it extremely difficult to perform realistic modeling and simulation of the manufacturing process in university research laboratories. Likewise, size and cost factors, coupled with the many uncontrolled variables of the production environment, have made it difficult to perform adequate manufacturing research in the industrial setting. Only the largest companies can afford manufacturing research laboratories; research results are often held proprietary and seldom find their way into the university classroom to aid in the education and training of new manufacturing engineers. The purpose of this research is to continue the development of miniature prototype equipment suitable for use in an integrated CAD/CAM laboratory. The equipment being developed is capable of actually performing production operations (e.g., drilling, milling, turning, punching) on metallic and non-metallic workpieces. The integrated CAD/CAM Mini-Lab integrates high-resolution computer graphics, parametric design, parametric N/C parts programming, CNC machine control, and automated storage and retrieval with robotic materials handling. The availability of miniature CAD/CAM laboratory equipment will provide the basis for intensive laboratory research on manufacturing information systems.

  19. Implementing Information Assurance - Beyond Process

    DTIC Science & Technology

    2009-01-01

    ...for use on the system, further burden of proof that it isn't installed is required. For example, the Defense Information Systems Agency (DISA) Gold...or Class 4 certificate and a hardware security token, or using an NSA-certified product. A smart card and the Common Access Card (CAC) are examples of the

  20. Emergency department information system implementation and process redesign result in rapid and sustained financial enhancement at a large academic center.

    PubMed

    Shapiro, Jason S; Baumlin, Kevin M; Chawla, Neal; Genes, Nicholas; Godbold, James; Ye, Fen; Richardson, Lynne D

    2010-05-01

    The objectives were to measure the financial impact of implementing a fully integrated emergency department information system (EDIS) and determine the length of time to "break even" on the initial investment. A before-and-after study design was performed using a framework of analysis consisting of four 15-month phases: 1) preimplementation, 2) peri-implementation, 3) postimplementation, and 4) sustained effects. Registration and financial data were reviewed. Costs and rates of professional and facility charges and receipts were calculated for the phases in question and compared against monthly averages for covariates such as volume, collections rates, acuity, age, admission rate, and insurance status with an autoregressive time series analysis using a segmented model. The break-even point was calculated by measuring cumulative monthly receipts for the last three study phases in excess of the average monthly receipts from the preimplementation phase, corrected for change in volume, and then plotting this against cumulative overall cost. Time to break even on the initial EDIS investment was less than 8 months. Total revenue enhancement at the end of the 5-year study period was $16,138,953 with an increase of 69.40% in charges and 70.06% in receipts. This corresponds to an increase in receipts per patient from $50 to $90 for professional services and $131 to $183 for facilities charges. Other than volume, there were no significant changes in trends for covariates between the preimplementation and sustained-effects periods. A comprehensive EDIS implementation with process redesign resulted in sustained increases in professional and facility revenues and a rapid initial break-even point.
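
    The break-even logic described above (cumulative monthly receipts in excess of the volume-corrected preimplementation baseline, compared against cumulative cost) reduces to a simple running comparison. The figures below are invented placeholders used only to show the arithmetic, not the study's data.

```python
# Break-even month: first month where cumulative excess receipts cover cumulative cost.
baseline_monthly_receipts = 1_000_000          # volume-corrected preimplementation average
monthly_receipts = [1_050_000, 1_150_000, 1_250_000, 1_300_000,
                    1_350_000, 1_400_000, 1_450_000, 1_500_000]
monthly_costs =    [  900_000,   150_000,   150_000,   150_000,
                      150_000,   150_000,   150_000,   150_000]   # initial outlay, then upkeep

cum_excess = cum_cost = 0.0
for month, (receipts, cost) in enumerate(zip(monthly_receipts, monthly_costs), start=1):
    cum_excess += receipts - baseline_monthly_receipts
    cum_cost += cost
    if cum_excess >= cum_cost:
        print(f"break-even reached in month {month}")
        break
else:
    print("break-even not reached in the period examined")
```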

  1. Management Information System

    NASA Technical Reports Server (NTRS)

    1984-01-01

    New Automated Management Information Center (AMIC) employs innovative microcomputer techniques to create color charts, viewgraphs, or other data displays in a fraction of the time formerly required. Developed under Kennedy Space Center's contract by Boeing Services International Inc., Seattle, WA, AMIC can produce an entirely new informational chart in 30 minutes, or an updated chart in only five minutes. AMIC also has considerable potential as a management system for business firms.

  2. Quantum communication and information processing

    NASA Astrophysics Data System (ADS)

    Beals, Travis Roland

    Quantum computers enable dramatically more efficient algorithms for solving certain classes of computational problems, but, in doing so, they create new problems. In particular, Shor's Algorithm allows for efficient cryptanalysis of many public-key cryptosystems. As public key cryptography is a critical component of present-day electronic commerce, it is crucial that a working, secure replacement be found. Quantum key distribution (QKD), first developed by C.H. Bennett and G. Brassard, offers a partial solution, but many challenges remain, both in terms of hardware limitations and in designing cryptographic protocols for a viable large-scale quantum communication infrastructure. In Part I, I investigate optical lattice-based approaches to quantum information processing. I look at details of a proposal for an optical lattice-based quantum computer, which could potentially be used for both quantum communications and for more sophisticated quantum information processing. In Part III, I propose a method for converting and storing photonic quantum bits in the internal state of periodically-spaced neutral atoms by generating and manipulating a photonic band gap and associated defect states. In Part II, I present a cryptographic protocol which allows for the extension of present-day QKD networks over much longer distances without the development of new hardware. I also present a second, related protocol which effectively solves the authentication problem faced by a large QKD network, thus making QKD a viable, information-theoretic secure replacement for public key cryptosystems.
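
    As background for the QKD protocols discussed in Part II, the following is a minimal, idealized BB84 sifting simulation (noiseless channel, no eavesdropper, no authentication or privacy amplification); it only illustrates how random basis choices are reconciled into a shared raw key and is not one of the thesis's own protocols.

```python
# Idealized BB84 key sifting (no noise, no eavesdropper) -- illustration only.
import random

n = 32
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("ZX") for _ in range(n)]
bob_bases   = [random.choice("ZX") for _ in range(n)]

# On an ideal channel, Bob's outcome matches Alice's bit whenever the bases
# agree; otherwise his measurement result is random.
bob_bits = [b if ab == bb else random.randint(0, 1)
            for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: publicly compare bases and keep only the positions where they agree.
keep = [i for i in range(n) if alice_bases[i] == bob_bases[i]]
alice_key = [alice_bits[i] for i in keep]
bob_key   = [bob_bits[i] for i in keep]

assert alice_key == bob_key            # holds only in this noiseless model
print(f"raw key length: {len(alice_key)} of {n} transmitted qubits")
```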

  3. CUAHSI Hydrologic Information System

    NASA Astrophysics Data System (ADS)

    Maidment, D. R.

    2005-12-01

    The Consortium of Universities for Advancement of Hydrologic Science, Inc. (CUAHSI) seeks to build a Hydrologic Information System (HIS) in which hydrologic data sources will be assembled in space and time to create a digital representation of atmospheric, surface and subsurface water flow through a watershed or other hydrologic system. A common data window for automatically accessing water observation data from US federal agencies is being developed based on web data services. Together with the related CLEANER program in environmental engineering, a cybercollaboratory is being used to foster remote access to data and shared research concerning its interpretation and modeling. A Digital Library to index hydrologic information within a river basin or aquifer has been developed, and a Digital Watershed to synthesize observations, GIS, weather and climate grids and remote sensing is being designed and prototyped. Examples are presented from the Neuse basin in North Carolina and other locations to illustrate these components of a Hydrologic Information System.

  4. Organizations and Information Processing: A Field Study of Research and Development Units within the United States Air Force Systems Command.

    DTIC Science & Technology

    1984-08-01

    uncertainty (Downey, Hellriegel and Slocum, 1975). Downey and Slocum (1975) conceptualize uncertainty as a psychological state in which the sources of...Child, 1972; Osborn and Hunt, 1974; Downey, Hellriegel, and Slocum, 1975; Huber, O'Connel and Cummings, 1975; Schmidt and Cummings, 1975; Leifer and...uncertainty and the need for external information processing, including: Duncan, 1972; Child, 1972; Osborn and Hunt, 1974; Hellriegel and Slocum, 1975; Huber

  5. Geographic information systems

    NASA Technical Reports Server (NTRS)

    Campbell, W. J.

    1982-01-01

    Information and activities are provided to: (1) enhance the ability to distinguish between a Geographic Information System (GIS) and a data management system; (2) develop understanding of spatial data handling by conventional methods versus the automated approach; (3) promote awareness of GIS design and capabilities; (4) foster understanding of the concepts and problems of data base development and management; (5) facilitate recognition of how a computerized GIS can model conditions in the present "real world" to project conditions in the future; and (6) appreciate the utility of integrating LANDSAT and other remotely sensed data into the GIS.

  7. Global Land Information System

    USGS Publications Warehouse

    ,

    1999-01-01

    The Global Land Information System (GLIS) is a World Wide Web-based query tool developed by the U.S. Geological Survey (USGS) to provide data and information about the Earth's land surface. Examples of holdings available through the GLIS include cartographic data, topographic data, soils data, aerial photographs, and satellite images from various agencies and cooperators located around the world. Both hard copy and digital data collections are represented in the GLIS, and preview images are available for millions of the products in the system.

  8. The Phobos information system

    NASA Astrophysics Data System (ADS)

    Karachevtseva, I. P.; Oberst, J.; Zubarev, A. E.; Nadezhdina, I. E.; Kokhanov, A. A.; Garov, A. S.; Uchaev, D. V.; Uchaev, Dm. V.; Malinnikov, V. A.; Klimkin, N. D.

    2014-11-01

    We have developed a Geo-information system (GIS) for Phobos, based on data from the Mars Express and Viking Orbiter missions, which includes orthoimages, global maps, and terrain and gravity field models, all referenced to the Phobos coordinate system. The data are conveniently stored in the ArcGIS software system, which provides an environment for mapping and which allows us to carry out joint data analysis and miscellaneous data cross-comparisons. We have compiled catalogs of Phobos craters using manual and automated techniques, which include about 5500 and 6400 craters, respectively. While crater numbers are biased by the available image resolution and illumination, we estimate that our catalog of manually detected craters contains all Phobos craters with diameters D>250 m (1072 craters in total), and that the catalog of automatically detected craters is complete for craters with D>400 m (360 craters). Statistical analysis of these large craters reveals a surplus of craters on the anti-Mars hemisphere, whereas differences in crater abundance between the leading and trailing hemispheres cannot be confirmed. This is in contrast to previous work, where no such asymmetry was found (Schmedemann et al., 2014). However, we cannot rule out remaining biases due to resolution, viewing angles or illumination effects. Using a digital terrain model (DTM) derived from photogrammetric image processing, we estimate the depths of 25 craters larger than 2 km using geometric and dynamic heights (for a discussion of Phobos crater morphometry see Kokhanov et al., 2014). We have also compiled catalogs of lineaments and boulders. In particular, we mapped 546 individual grooves or crater chains, which range in length from 0.3 km to 16.2 km. We identified and determined the sizes and locations of 1379 boulders near the crater Stickney. Cross-comparisons of gravity field models against distribution patterns of grooves and boulders are currently under way and may shed light on their possible origins. Finally, we have developed

  9. An accountability model for integrating information systems, evaluation mechanisms, and decision making processes in alcohol and drug abuse agencies.

    PubMed

    Duncan, F H; Link, A D

    1979-01-01

    This article has attempted to demonstrate that decision making and evaluation can be carried out in a systematic fashion only if agencies make a commitment to do so, and only if adequate systems are established. The management information system is the most expensive and most sophisticated component of the integrated model presented here. Its existence, in some fashion, is essential to the operation of the model. Contrary to what many managers may believe and practice, the management information system is not in itself the final solution to evaluation. Neither is evaluation a panacea for all program ills. Evaluation can provide the information required to meet the ever-increasing demands for agency or program accountability; evaluation can also provide insights for future decisions to change or alter the allocation of resources. Such evaluation must be carefully planned and implemented and, at the state level, can be successful only if executed in a systematic manner as suggested here. Regardless of the degree of sophistication of any system, it will work only when supported by users in the local treatment centers. If the model employed does little to serve them, it is not a model worth considering. It is with these needs in mind that this model was developed.

  10. Space Station information systems

    NASA Technical Reports Server (NTRS)

    Swingle, W. L.; Mckay, C. W.

    1983-01-01

    The space operations information system is defined and characterized in a wide perspective. Interactive subsets of the total system are defined and discussed. Particular attention is paid to the concept of end-to-end systems and their repetitive population within the total system. High level program goals are reviewed and related to more explicit system requirements and user needs. Emphasis is placed on the utility and cost effectiveness of data system services from a user standpoint. Productivity, as a quantitative goal, in both development and operational phases is also addressed. Critical aspects of the approach to successful development of the data management system are discussed along with recommendations important to advanced development activities. Current and planned activity in both technology and advanced development areas are reviewed with emphasis on their importance to program success.

  11. Application of symbolic processing to command and control: An Advanced Information Presentation System. Volume 2: Knowledge base

    NASA Astrophysics Data System (ADS)

    Zdybel, F.; Gibbons, J.; Greenfeld, N.; Yonke, M.

    1981-08-01

    This report describes the work performed in the second year of the three-year contract to explore the application of symbolic processing to command and control (C2); specifically, the graphics interface between the C2 user and a complex C2 decision support system. Volume 2 contains the complete AIPS knowledge base. This document provides the fully-inherited structure that the system sees during operation.

  12. Application of symbolic processing to command and control: An Advanced Information Presentation System. Volume 3: Program source files

    NASA Astrophysics Data System (ADS)

    Zdybel, F.; Gibbons, J.; Greenfeld, N.; Yonke, M.

    1981-08-01

    This report describes the work performed in the second year of the three-year contract to explore the application of symbolic processing to command and control (C2); specifically, the graphics interface between the C2 user and a complex C2 decision support system. Volume 3 contains the programs that manipulate the knowledge base and provide the active behavioral component of the system.

  13. Insect barcode information system.

    PubMed

    Pratheepa, Maria; Jalali, Sushil Kumar; Arokiaraj, Robinson Silvester; Venkatesan, Thiruvengadam; Nagesh, Mandadi; Panda, Madhusmita; Pattar, Sharath

    2014-01-01

    The Insect Barcode Information System, called Insect Barcode Informática (IBIn), is an online database resource developed by the National Bureau of Agriculturally Important Insects, Bangalore. This database provides acquisition, storage, analysis and publication of DNA barcode records of agriculturally important insects, for researchers in India and other countries. It bridges a gap in bioinformatics by integrating molecular, morphological and distribution details of agriculturally important insects. IBIn was developed in PHP/MySQL using relational database management concepts. The database is based on a client-server architecture, in which many clients can access data simultaneously. IBIn is freely available online and is user-friendly. IBIn allows registered users to input new information and to search and view information related to DNA barcodes of agriculturally important insects. This paper provides the current status of insect barcoding in India and a brief introduction to the IBIn database. http://www.nabg-nbaii.res.in/barcode.
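
    The integration of molecular, morphological and distribution details in a relational database can be illustrated with a schematic schema. The tables and columns below are hypothetical (the abstract does not publish IBIn's actual PHP/MySQL schema); the same idea is expressed in SQLite so the example is self-contained.

```python
# Schematic relational layout for barcode records (hypothetical columns; SQLite for portability).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE species (
    species_id        INTEGER PRIMARY KEY,
    scientific_name   TEXT NOT NULL,
    family            TEXT,
    morphology_notes  TEXT
);
CREATE TABLE occurrences (
    occurrence_id INTEGER PRIMARY KEY,
    species_id    INTEGER REFERENCES species(species_id),
    locality      TEXT,
    latitude      REAL,
    longitude     REAL
);
CREATE TABLE barcodes (
    barcode_id  INTEGER PRIMARY KEY,
    species_id  INTEGER REFERENCES species(species_id),
    marker      TEXT DEFAULT 'COI-5P',
    sequence    TEXT NOT NULL
);
""")

conn.execute("INSERT INTO species VALUES (1, 'Helicoverpa armigera', 'Noctuidae', 'larval host: cotton')")
conn.execute("INSERT INTO barcodes (species_id, sequence) VALUES (1, 'AACATTATATTTTATTTTTGG')")

# Join the molecular and morphological views of the same insect.
for row in conn.execute("""SELECT s.scientific_name, b.marker, length(b.sequence)
                           FROM species s JOIN barcodes b USING (species_id)"""):
    print(row)
```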

  14. Pharmacology Information System Ready

    ERIC Educational Resources Information Center

    Chemical and Engineering News, 1973

    1973-01-01

    Discusses the development and future of "Prophet," a specialized information handling system for pharmacology research. It is designed to facilitate the acquisition and dissemination of knowledge about mechanisms of drug action, and it is hoped that it will aid in converting pharmacology research from an empirical to a predictive science. (JR)

  16. Communication and Information Systems.

    ERIC Educational Resources Information Center

    Wheeler, Peter

    1982-01-01

    Discusses the Microelectronics Education Programme's work in the communication and information systems domain, suggesting that teachers understand the new technologies and incorporate them into regular classroom instruction. Focuses on computers in the classroom, economy of time, keyboard skills, life skills, and vocational training. (Author/JN)

  17. Statistical Information Retrieval System.

    ERIC Educational Resources Information Center

    DiFondi, Nicholas M.

    An information retrieval system was developed using technical word occurrences as a basis for classification. A set of words, designated a vocabulary, was selected from the middle range of a frequency listing of words occurring in an experimental sample of 94 documents. The selection produced 115 non-function words with technical definition that did…

  18. Geographic information systems

    USGS Publications Warehouse

    ,

    1992-01-01

    Geographic information systems (GIS) technology can be used for scientific investigations, resource management, and developmental planning. For example, a GIS might allow emergency planners to easily calculate emergency response times in the event of a natural disaster, or a GIS might be used to find wetlands that need protection from pollution.

  19. PMIS Project. Planning & Management Information System. A Project To Develop a Data Processing System for Support of the Planning and Management Needs of Local School Districts. Final Report, Year 2.

    ERIC Educational Resources Information Center

    Council of the Great City Schools, Washington, DC.

    This document examines the design and structure of PMIS (Planning and Management Information System), an information system that supports the decisionmaking process of executive management in local school districts. The system is designed around a comprehensive, longitudinal, and interrelated data base. It utilizes a powerful real-time,…

  20. FLEXIBLE APPLICATION OF THE JLAB PANSOPHY INFORMATION SYSTEM FOR PROJECT REPORTS, PROCESS MONITORING, AND R&D SAMPLE TRACKING

    SciTech Connect

    Valerie Bookwalter; Bonnie Madre; Charles Reece

    2008-02-12

    The use and features of the JLab SRF Institute IT system Pansophy [1,2] continue to expand. In support of the cryomodule rework project for CEBAF, a full set of web-based travelers has been implemented and an integrated set of live summary reports has been created. A graphical user interface within the reports enables navigation to either higher-level summaries or drill-down to the original source data. In addition to the collection of episodic data, Pansophy is now used to capture, coordinate, and display continuously logged process parameters that relate to technical water systems and clean room environmental conditions. In a new expansion, Pansophy is being used to collect and track process and analytical data sets associated with SRF material samples that are part of the surface creation, processing, and characterization R&D program.

  1. Information processing for aerospace structural health monitoring

    NASA Astrophysics Data System (ADS)

    Lichtenwalner, Peter F.; White, Edward V.; Baumann, Erwin W.

    1998-06-01

    Structural health monitoring (SHM) technology provides a means to significantly reduce the life-cycle cost of aerospace vehicles by eliminating unnecessary inspections, minimizing inspection complexity, and providing accurate diagnostics and prognostics to support vehicle life extension. In order to accomplish this, a comprehensive SHM system will need to acquire data from a wide variety of diverse sensors, including strain gages, accelerometers, acoustic emission sensors, crack growth gages, corrosion sensors, and piezoelectric transducers. Significant amounts of computer processing will then be required to convert this raw sensor data into meaningful information that indicates both the current structural integrity (diagnostics) and the projections necessary for planning and managing the future health of the structure in a cost-effective manner (prognostics). This paper provides a description of the key types of information processing technologies required in an effective SHM system. These include artificial intelligence techniques such as neural networks, expert systems, and fuzzy logic for nonlinear modeling, pattern recognition, and complex decision making; signal processing techniques such as Fourier and wavelet transforms for spectral analysis and feature extraction; statistical algorithms for optimal detection, estimation, prediction, and fusion; and a wide variety of other algorithms for data analysis and visualization. The intent of this paper is to provide an overview of the role of information processing for SHM, discuss various technologies which can contribute to accomplishing this role, and present some example applications of information processing for SHM implemented at the Boeing Company.
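
    To make the signal-processing portion concrete, here is a small, generic sketch of spectral feature extraction from a vibration-like sensor record using synthetic data; it is not Boeing's SHM pipeline, only an example of the Fourier-based feature extraction the paper lists. The band energies produced at the end are the kind of features a downstream neural network or statistical classifier could consume.

```python
# Generic spectral feature extraction for an SHM-style vibration signal (synthetic data).
import numpy as np

fs = 2000.0                                    # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1.0 / fs)
# Synthetic sensor record: two structural modes plus broadband noise.
signal = (np.sin(2 * np.pi * 120 * t)
          + 0.4 * np.sin(2 * np.pi * 475 * t)
          + 0.2 * np.random.randn(t.size))

# Windowed FFT -> power spectrum.
spectrum = np.abs(np.fft.rfft(signal * np.hanning(t.size))) ** 2
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)

# Features: energy in a few fixed frequency bands.
bands = [(0, 200), (200, 400), (400, 600), (600, 1000)]
features = [spectrum[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands]
for (lo, hi), energy in zip(bands, features):
    print(f"{lo:4d}-{hi:4d} Hz band energy: {energy:10.1f}")
```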

  2. NLP Meets the Jabberwocky: Natural Language Processing in Information Retrieval.

    ERIC Educational Resources Information Center

    Feldman, Susan

    1999-01-01

    Focuses on natural language processing (NLP) in information retrieval. Defines the seven levels at which people extract meaning from text/spoken language. Discusses the stages of information processing; how an information retrieval system works; advantages to adding full NLP to information retrieval systems; and common problems with information…

  3. NLP Meets the Jabberwocky: Natural Language Processing in Information Retrieval.

    ERIC Educational Resources Information Center

    Feldman, Susan

    1999-01-01

    Focuses on natural language processing (NLP) in information retrieval. Defines the seven levels at which people extract meaning from text/spoken language. Discusses the stages of information processing; how an information retrieval system works; advantages to adding full NLP to information retrieval systems; and common problems with information…

  4. Information Management of a Structured Admissions Interview Process in a Medical College with an Apple II System

    PubMed Central

    O'Reilly, Robert; Fedorko, Steve; Nicholson, Nigel

    1983-01-01

    This paper describes a structured interview process for medical school admissions supported by an Apple II computer system which provides feedback to interviewers and the College admissions committee. Presented are the rationale for the system, the preliminary results of analysis of some of the interview data, and a brief description of the computer program and output. The present data show that the structured interview yields very high interrater reliability coefficients, is acceptable to the medical school faculty, and results in quantitative data useful in the admission process. The system continues in development at this time, a second year of data will be shortly available, and further refinements are being made to the computer program to enhance its utilization and exportability.

  5. Processing multilevel secure test and evaluation information

    NASA Astrophysics Data System (ADS)

    Hurlburt, George; Hildreth, Bradley; Acevedo, Teresa

    1994-07-01

    The Test and Evaluation Community Network (TECNET) is building a Multilevel Secure (MLS) system. This system features simultaneous access to classified and unclassified information and easy access through widely available communications channels. It provides the necessary separation of classification levels, assured through the use of trusted system design techniques, security assessments and evaluations. This system enables cleared T&E users to view and manipulate classified and unclassified information resources either using a single terminal interface or multiple windows in a graphical user interface. TECNET is in direct partnership with the National Security Agency (NSA) to develop and field the MLS TECNET capability in the near term. The centerpiece of this partnership is a state-of-the-art Concurrent Systems Security Engineering (CSSE) process. In developing the MLS TECNET capability, TECNET and NSA are providing members, with various expertise and diverse backgrounds, to participate in the CSSE process. The CSSE process is founded on the concepts of both Systems Engineering and Concurrent Engineering. Systems Engineering is an interdisciplinary approach to evolve and verify an integrated and life cycle balanced set of system product and process solutions that satisfy customer needs (ASD/ENS-MIL STD 499B 1992). Concurrent Engineering is design and development using the simultaneous, applied talents of a diverse group of people with the appropriate skills. Harnessing diverse talents to support CSSE requires active participation by team members in an environment that both respects and encourages diversity.

  6. SRS Research Information System Thesaurus.

    ERIC Educational Resources Information Center

    Schultz, Claire K., Ed.

    For information storage and retrieval, a thesaurus is used during indexing and searching processes to translate from natural language into a more restricted retrieval system language. The purpose of this thesaurus is to control the language used to index and retrieve documents of interest to Social and Rehabilitation Service (SRS) and the…

  7. Toward intelligent information system

    NASA Astrophysics Data System (ADS)

    Takano, Fumio; Hinatsu, Ken'ichi

    This article describes the indexing aid systems and projects at JICST, API, NLM and BIOSIS. These organizations deal with very broad domains of scientific, medical and technological literature; indexing is done with controlled terms and is routinely performed by highly skilled indexers. Because of the high cost of controlled indexing of bibliographic information, they have designed automated indexing systems and/or expert-like systems that take advantage of many years of indexing experience, using knowledge bases and/or thesauri.

  8. Terminal chaos for information processing in neurodynamics.

    PubMed

    Zak, M

    1991-01-01

    A new nonlinear phenomenon, terminal chaos, caused by failure of the Lipschitz condition at equilibrium points of dynamical systems, is introduced. It is shown that terminal chaos has a well-organized probabilistic structure which can be predicted and controlled. This gives an opportunity to exploit the phenomenon for information processing. It appears that chaotic states of neuron activity are associated with higher-level cognitive processes such as generalization and abstraction.
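
    The role of the Lipschitz condition can be seen in a textbook one-dimensional example of this class of dynamics (a generic illustration, not necessarily the specific system studied in the paper):

    \[ \dot{x} \;=\; -\,x^{1/3}, \qquad x(0) = x_0 > 0 \quad\Longrightarrow\quad x(t) \;=\; \Bigl(x_0^{2/3} - \tfrac{2}{3}\,t\Bigr)^{3/2}, \]

    so the trajectory reaches the equilibrium x = 0 in the finite time t* = (3/2) x_0^{2/3}. Because d\dot{x}/dx = -(1/3) x^{-2/3} diverges there, the Lipschitz condition fails at the equilibrium and uniqueness of solutions is lost, which is the mechanism that makes such "terminal" dynamics attractive for fast, controllable switching in information processing.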

  9. Uniform Library Information Systems in Yugoslavia.

    ERIC Educational Resources Information Center

    Zivkovic, Bogomila

    1979-01-01

    The process followed by Yugoslavia in developing a national interdisciplinary information system included surveys of existing library practices, identification of potential databases, processing of library materials, production of national bibliographies, production of catalogs, and use of the system. (FM)

  10. Nuclear criticality information system

    SciTech Connect

    Koponen, B.L.; Hampel, V.E.

    1981-11-30

    The nuclear criticality safety program at LLNL began in the 1950's with a critical measurements program which produced benchmark data until the late 1960's. This same time period saw the rapid development of computer technology useful for both computer modeling of fissile systems and for computer-aided management and display of the computational benchmark data. Database management grew in importance as the amount of information increased and as experimental programs were terminated. Within the criticality safety program at LLNL we began at that time to develop a computer library of benchmark data for validation of computer codes and cross sections. As part of this effort, we prepared a computer-based bibliography of criticality measurements on relatively simple systems. However, it is only now that some of these computer-based resources can be made available to the nuclear criticality safety community at large. This technology transfer is being accomplished by the DOE Technology Information System (TIS), a dedicated, advanced information system. The NCIS database is described.

  11. A Web Information Retrieval System

    NASA Astrophysics Data System (ADS)

    Kim, Tae-Hyun; Park, Dong-Chul; Huh, Woong; Kim, Hyen-Ug; Yoon, Chung-Hwa; Park, Chong-Dae; Woo, Dong-Min; Jeong, Taikyeong; Cho, Il-Hwan; Lee, Yunsik

    An approach for the retrieval of price information from internet sites is applied to real-world problems in this paper. The Web Information Retrieval System (WIRS) utilizes a Hidden Markov Model (HMM) for its powerful capability to process temporal information. The HMM is an extremely flexible tool and has been successfully applied to a wide variety of stochastic modeling tasks. In order to compare the prices and features of products from various web sites, the WIRS extracts prices and descriptions of various products within web pages. The WIRS is evaluated on real-world problems and compared with a conventional method, and the results are reported in this paper.
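
    The HMM-based extraction idea can be illustrated with a tiny hand-set model in which tokens from a product page are emitted either by a PRICE state or an OTHER state and the Viterbi algorithm recovers the most likely labeling. All states, probabilities and token categories below are invented for illustration; they are not the WIRS model.

```python
# Tiny Viterbi decoder labeling page tokens as PRICE vs OTHER (illustrative model only).
import math

states = ["OTHER", "PRICE"]
start  = {"OTHER": 0.9, "PRICE": 0.1}
trans  = {"OTHER": {"OTHER": 0.85, "PRICE": 0.15},
          "PRICE": {"OTHER": 0.70, "PRICE": 0.30}}
emit   = {"OTHER": {"word": 0.80, "number": 0.15, "currency": 0.05},
          "PRICE": {"word": 0.05, "number": 0.55, "currency": 0.40}}

def categorize(token: str) -> str:
    """Map a raw token to a coarse observation symbol."""
    if token in {"$", "USD", "EUR"}:
        return "currency"
    return "number" if token.replace(".", "", 1).isdigit() else "word"

def viterbi(tokens):
    obs = [categorize(t) for t in tokens]
    V = [{s: math.log(start[s]) + math.log(emit[s][obs[0]]) for s in states}]
    back = []
    for o in obs[1:]:
        row, ptr = {}, {}
        for s in states:
            prev = max(states, key=lambda p: V[-1][p] + math.log(trans[p][s]))
            row[s] = V[-1][prev] + math.log(trans[prev][s]) + math.log(emit[s][o])
            ptr[s] = prev
        V.append(row)
        back.append(ptr)
    # Backtrack from the most likely final state.
    path = [max(states, key=lambda s: V[-1][s])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))

tokens = ["Acme", "laptop", "stand", "$", "39.99", "free", "shipping"]
print(list(zip(tokens, viterbi(tokens))))
```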

  12. Process and information integration via hypermedia

    NASA Technical Reports Server (NTRS)

    Hammen, David G.; Labasse, Daniel L.; Myers, Robert M.

    1990-01-01

    Success stories for advanced automation prototypes abound in the literature but the deployments of practical large systems are few in number. There are several factors that militate against the maturation of such prototypes into products. Here, the integration of advanced automation software into large systems is discussed. Advanced automation systems tend to be specific applications that need to be integrated and aggregated into larger systems. Systems integration can be achieved by providing expert user-developers with verified tools to efficiently create small systems that interface to large systems through standard interfaces. The use of hypermedia as such a tool in the context of the ground control centers that support Shuttle and space station operations is explored. Hypermedia can be an integrating platform for data, conventional software, and advanced automation software, enabling data integration through the display of diverse types of information and through the creation of associative links between chunks of information. Further, hypermedia enables process integration through graphical invoking of system functions. Through analysis and examples, researchers illustrate how diverse information and processing paradigms can be integrated into a single software platform.

  13. Information sciences experiment system

    NASA Technical Reports Server (NTRS)

    Katzberg, Stephen J.; Murray, Nicholas D.; Benz, Harry F.; Bowker, David E.; Hendricks, Herbert D.

    1990-01-01

    The rapid expansion of remote sensing capability over the last two decades will take another major leap forward with the advent of the Earth Observing System (Eos). An approach is presented that will permit experiments and demonstrations in onboard information extraction. The approach is a non-intrusive, eavesdropping mode in which a small amount of spacecraft real estate is allocated to an onboard computation resource. How such an approach allows the evaluation of advanced technology in the space environment, advanced techniques in information extraction for both Earth science and information science studies, direct to user data products, and real-time response to events, all without affecting other on-board instrumentation is discussed.

  14. Procedures for using Geographic Information Systems for the handling and processing of scientific data from the planetary surfaces.

    NASA Astrophysics Data System (ADS)

    Frigeri, A.; Federico, C.; Pauselli, C.; Minelli, G.

    The availability of large volumes of data from instruments on board scientific planetary missions justifies the use of Geographic Information System (GIS) procedures for the study of terrestrial planets and their satellites. As mission data volumes increase, GIS techniques offer the planetary scientist a way for fast retrieval, storage and analysis of heterogeneous data and allow comparative analysis between different datasets that would otherwise be difficult to perform. Although GIS systems have already been used for planetary research, none provides native generic support for studying the surfaces of terrestrial planets and satellites. The work presented here describes the development of a pool of procedures, in the form of computer codes and supporting files, produced to provide generic support for handling, analyzing and visualizing planetary remote-sensing data in a selected GIS, allowing comparative analysis of different geological and geophysical planetary data. The procedures developed were applied to aggregate maps from different missions to Mars in order to investigate the geologic context of an area of Mars and to correlate this information with the first subsurface signals of the Mars Advanced Radar for Subsurface and Ionosphere Sounding (MARSIS).

  15. Safeguards Information Management Systems (SIMS)

    SciTech Connect

    Sorenson, R.J.; Sheely, K.B.; Brown, J.B.; Horton, R.D.; Strittmatter, R.; Manatt, D.R.

    1994-04-01

    The requirements for the management of information at the International Atomic Energy Agency (IAEA) and its Department of Safeguards are rapidly changing. Historically, the Department of Safeguards has had the requirement to process large volumes of conventional safeguards information. An information management system is currently in place that adequately handles the IAEA's conventional safeguards data needs. In the post-Iraq environment, however, there is a growing need to expand the IAEA information management capability to include unconventional forms of information. These data include environmental sampling results, photographs, video film, lists of machine tools, and open-source materials such as unclassified publications. The US Department of Energy (DOE) has responded to this information management need by implementing the Safeguards Information Management Systems (SIMS) initiative. SIMS was created by the DOE to anticipate and respond to IAEA information management needs through a multilaboratory initiative that will utilize an integrated approach to develop and deploy technology in a timely and cost-effective manner. The DOE will use the SIMS initiative to coordinate US information management activities that support the IAEA Department of Safeguards.

  16. Interactive Development Environments for Information Systems

    PubMed Central

    Wasserman, Anthony I.

    1986-01-01

    Most medical information systems are interactive information systems, since they provide their users with conversational access to data. The design of an interactive information system requires attention to data design, process design, and user interface design so that the resulting system will be easy to use and reliable. This paper describes some automated tools aimed at assisting software designers and developers in creating interactive information systems, with emphasis on the Software through Pictures environment and the User Software Engineering (USE) methodology.

  17. CUAHSI Hydrologic Information Systems

    NASA Astrophysics Data System (ADS)

    Maidment, D.; Zaslavsky, I.; Tarboton, D.; Piasecki, M.; Goodall, J.

    2006-12-01

    The Consortium of Universities for the Advancement of Hydrologic Science, Inc (CUAHSI) has a Hydrologic Information System (HIS) project, which is supported by NSF to develop infrastructure and services to support the advance of hydrologic science in the United States. This paper provides an overview of the HIS project. A set of web services called WaterOneFlow is being developed to provide better access to water observations data (point measurements of streamflow, water quality, climate and groundwater levels) from government agencies and individual investigator projects. Successful partnerships have been created with the USGS National Water Information System, EPA Storet and the NCDC Climate Data Online. Observations catalogs have been created for stations in the measurement networks of each of these data systems so that they can be queried in a uniform manner through CUAHSI HIS, and data delivered from them directly to the user via web services. A CUAHSI Observations Data Model has been designed for storing individual investigator data and an equivalent set of web services created for that so that individual investigators can publish their data onto the internet in the same format CUAHSI is providing for the federal agency data. These data will be accessed through HIS Servers hosted at the national level by CUAHSI and also by research centers and academic departments for regional application of HIS. An individual user application called HIS Analyst will enable individual hydrologic scientists to access the information from the network of HIS Servers. The present focus is on water observations data but later development of this system will include weather and climate grid information, GIS data, remote sensing data and linkages between data and hydrologic simulation models.

  18. Building a hospital information system: design considerations based on results from a Europe-wide vendor selection process.

    PubMed

    Kuhn, K A; Lenz, R; Blaser, R

    1999-01-01

    A number of research and development projects in the U.S. and in Europe have shown that novel technologies can open significant perspectives for hospital information systems (HIS). The selection of software products for a HIS, however, is still nontrivial. Generalist vendors promise a broad scope of functionality and integration, while specialist vendors promise elaborated and highly adapted functionality. In 1997, the university hospital Marburg, a 1,250 bed teaching hospital, decided to introduce a new large-scale HIS. The objectives of the project included support of clinical workflows, cost effectiveness and a maximum standard of medical care. In 1997/98 a formal Europe-wide vendor contest was performed. 15 vendors, including several from the U.S., participated. Systems were checked against the hospital's objectives, functionality, and technological criteria. One of the results of both technology and market assessment was the identification of fundamental technological and design aspects strongly influencing functionality and flexibility.

  19. Category identification of changed land-use polygons in an integrated image processing/geographic information system

    NASA Technical Reports Server (NTRS)

    Westmoreland, Sally; Stow, Douglas A.

    1992-01-01

    A framework is proposed for analyzing ancillary data and developing procedures for incorporating ancillary data to aid interactive identification of land-use categories in land-use updates. The procedures were developed for use within an integrated image processing/geographic information system (GIS) that permits simultaneous display of digital image data with the vector land-use data to be updated. With such systems and procedures, automated techniques are integrated with visual-based manual interpretation to exploit the capabilities of both. The procedural framework developed was applied as part of a case study to update a portion of the land-use layer in a regional-scale GIS. About 75 percent of the area in the study site that experienced a change in land use was correctly labeled into 19 categories using the combination of automated and visual interpretation procedures developed in the study.

  20. Symposium on Geographic Information Systems.

    ERIC Educational Resources Information Center

    Felleman, John, Ed.

    1990-01-01

    Six papers on geographic information systems cover the future of geographic information systems, land information systems modernization in Wisconsin, the Topologically Integrated Geographic Encoding and Referencing (TIGER) System of the U.S. Bureau of the Census, satellite remote sensing, geographic information systems and sustainable development,…

  1. Precisely timing dissipative quantum information processing.

    PubMed

    Kastoryano, M J; Wolf, M M; Eisert, J

    2013-03-15

    Dissipative engineering constitutes a framework within which quantum information processing protocols are powered by system-environment interaction rather than by unitary dynamics alone. This framework embraces noise as a resource and, consequently, offers a number of advantages compared to one based on unitary dynamics alone, e.g., that the protocols are typically independent of the initial state of the system. However, the time independent nature of this scheme makes it difficult to imagine precisely timed sequential operations, conditional measurements, or error correction. In this work, we provide a path around these challenges, by introducing basic dissipative gadgets which allow us to precisely initiate, trigger, and time dissipative operations while keeping the system Liouvillian time independent. These gadgets open up novel perspectives for thinking of timed dissipative quantum information processing. As an example, we sketch how measurement-based computation can be simulated in the dissipative setting.
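
    For readers outside the field, the time-independent Liouvillian referred to here is the generator of a Markovian open-system evolution. In the standard (GKSL/Lindblad) form, quoted as general background rather than as this paper's specific gadget construction, it reads

    \[ \frac{d\rho}{dt} \;=\; \mathcal{L}(\rho) \;=\; -\,i\,[H,\rho] \;+\; \sum_k \Bigl( L_k\,\rho\,L_k^{\dagger} \;-\; \tfrac{1}{2}\,\bigl\{ L_k^{\dagger} L_k,\; \rho \bigr\} \Bigr), \]

    where H is the system Hamiltonian and the jump operators L_k encode the engineered system-environment coupling. Dissipative quantum information processing designs the L_k so that the desired state preparation or computation is driven by this fixed \mathcal{L}; the gadgets introduced in the paper add the ability to initiate, trigger and time operations without making \mathcal{L} itself time dependent.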

  2. Precisely Timing Dissipative Quantum Information Processing

    NASA Astrophysics Data System (ADS)

    Kastoryano, M. J.; Wolf, M. M.; Eisert, J.

    2013-03-01

    Dissipative engineering constitutes a framework within which quantum information processing protocols are powered by system-environment interaction rather than by unitary dynamics alone. This framework embraces noise as a resource and, consequently, offers a number of advantages compared to one based on unitary dynamics alone, e.g., that the protocols are typically independent of the initial state of the system. However, the time independent nature of this scheme makes it difficult to imagine precisely timed sequential operations, conditional measurements, or error correction. In this work, we provide a path around these challenges, by introducing basic dissipative gadgets which allow us to precisely initiate, trigger, and time dissipative operations while keeping the system Liouvillian time independent. These gadgets open up novel perspectives for thinking of timed dissipative quantum information processing. As an example, we sketch how measurement-based computation can be simulated in the dissipative setting.

  3. A Study to Determine the Optimal Strategic Planning Process for Controlling and Coordinating the In-House Development of an Integrated Computer- Supported Hospital Information System

    DTIC Science & Technology

    1982-05-01

    This study examines Strategic Planning concepts and how they relate to the development of Hospital Information Systems. The author recommends that... Strategic Planning methods be utilized in the development of Hospital Information Systems, and provides guidance on how to do so. Keywords: Theses...Integrated information systems; Hospital administration; Computer networks; Information exchange; Health care; Strategic planning ; Information systems.

  4. Dynamic information theory and information description of dynamic systems

    NASA Astrophysics Data System (ADS)

    Xing, Xiusan

    2010-04-01

    In this paper, we develop dynamic statistical information theory established by the author. Starting from the ideas that the state variable evolution equations of stochastic dynamic systems, classical and quantum nonequilibrium statistical physical systems and special electromagnetic field systems can be regarded as their information symbol evolution equations and the definitions of dynamic information and dynamic entropy, we derive the evolution equations of dynamic information and dynamic entropy that describe the evolution laws of dynamic information. These four kinds of evolution equations are of the same mathematical type. They show in unison when information transmits in coordinate space outside the systems that the time rate of change of dynamic information densities originates from their drift, diffusion and dissipation in state variable space inside the systems and coordinate space in the transmission processes, and that the time rate of change of dynamic entropy densities is caused by their drift, diffusion and production in state variable space inside the systems and coordinate space in the transmission processes. When space noise can be neglected, an information wave will appear. If we only consider the information change inside the systems, dynamic information evolution equations reduce to information equations corresponding to the dynamic equations which describe evolution laws of the above dynamic systems. This reveals that the evolution laws of respective dynamic systems can be described by information equations in a unified fashion. Hence, the evolution processes of these dynamic systems can be abstracted as the evolution processes of information. Furthermore, we present the formulas for information flow, information dissipation rate, and entropy production rate. We prove that the information production probably emerges in a dynamic system with internal attractive interaction between the elements, and derive a formula for this information

  5. Information processing in miniature brains

    PubMed Central

    Chittka, L.; Skorupski, P.

    2011-01-01

    Since a comprehensive understanding of brain function and evolution in vertebrates is often hobbled by the sheer size of the nervous system, as well as ethical concerns, major research efforts have been made to understand the neural circuitry underpinning behaviour and cognition in invertebrates, and its costs and benefits under natural conditions. This special feature of Proceedings of the Royal Society B contains an idiosyncratic range of current research perspectives on neural underpinnings and adaptive benefits (and costs) of such diverse phenomena as spatial memory, colour vision, attention, spontaneous behaviour initiation, memory dynamics, relational rule learning and sleep, in a range of animals from marine invertebrates with exquisitely simple nervous systems to social insects forming societies with many thousands of individuals working together as a ‘superorganism’. This introduction provides context and history to tie the various approaches together, and concludes that there is an urgent need to understand the full neuron-to-neuron circuitry underlying various forms of information processing—not just to explore brain function comprehensively, but also to understand how (and how easily) cognitive capacities might evolve in the face of pertinent selection pressures. In the invertebrates, reaching these goals is becoming increasingly realistic. PMID:21227971

  6. Proprioceptive information processing in schizophrenia.

    PubMed

    Arnfred, Sidse M H

    2012-03-01

    This doctoral thesis focuses on brain activity in response to proprioceptive stimulation in schizophrenia. The works encompass methodological developments substantiated by investigations of healthy volunteers and two clinical studies of schizophrenia spectrum patients. American psychiatrist Sandor Rado (1890-1972) suggested that one of two un-reducible deficits in schizophrenia was a disorder of proprioception. Exploration of proprioceptive information processing is possible through the measurement of evoked and event related potentials. Event related EEG can be analyzed as conventional time-series averages or as oscillatory averages transformed into the frequency domain. Gamma activity evoked by electricity or by another type of somatosensory stimulus has not been reported before in schizophrenia. Gamma activity is considered to be a manifestation of perceptual integration. A new load stimulus was constructed that stimulated the proprioceptive dimension of recognition of applied force. This load stimulus was tested both in simple and several types of more complex stimulus paradigms, with and without tasks, in total in 66 healthy volunteers. The evoked potential (EP) resulting from the load stimulus was named the proprioceptive EP. The later components of the proprioceptive EP (> 150 ms) were modulated similarly to previously reported electrical somatosensory EPs by repetition and cognitive task. The earlier activity was further investigated through decomposition of the time-frequency transformed data by a new non-negative matrix analysis, and previous research and visual inspection validated these results. Several time-frequency components emerged in the proprioceptive EP. The contra-lateral parietal gamma component (60-70 ms; 30-41 Hz) had not previously been described in the somatosensory modality without electrical stimulation. The parietal beta component (87-103 ms; 19-22 Hz) was increased when the proprioceptive stimulus appeared in a predictable sequence in
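
    A minimal Python sketch of the general non-negative matrix factorization idea used for such time-frequency decompositions (the thesis's specific algorithm is not reproduced here; all array shapes and parameters below are illustrative assumptions):

        import numpy as np
        from sklearn.decomposition import NMF

        # Hypothetical non-negative time-frequency power matrix: frequency bins x time points
        rng = np.random.default_rng(0)
        tf_power = rng.random((40, 200))

        # Factorize into a few components, each a spectral profile with its own time course
        model = NMF(n_components=4, init="nndsvda", max_iter=500, random_state=0)
        spectra = model.fit_transform(tf_power)   # shape (40, 4): spectral profile per component
        timecourses = model.components_           # shape (4, 200): activation of each component over time

        print(spectra.shape, timecourses.shape)

    In such a decomposition, a component concentrated around 30-40 Hz with an early time course would correspond to the kind of evoked gamma activity discussed above.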

  7. Quantum process discrimination with information from environment

    NASA Astrophysics Data System (ADS)

    Wang, Yuan-Mei; Li, Jun-Gang; Zou, Jian; Xu, Bao-Ming

    2016-12-01

    In quantum metrology we usually extract information from the reduced probe system but ignore the information lost inevitably into the environment. However, K. Mølmer [Phys. Rev. Lett. 114, 040401 (2015)] showed that the information lost into the environment has an important effect on improving the successful probability of quantum process discrimination. Here we reconsider the model of a driven atom coupled to an environment and distinguish which of two candidate Hamiltonians governs the dynamics of the whole system. We mainly discuss two measurement methods, one of which obtains only the information from the reduced atom state and the other obtains the information from both the atom and its environment. Interestingly, for the two methods the optimal initial states of the atom, used to improve the successful probability of the process discrimination, are different. By comparing the two methods we find that the partial information from the environment is very useful for the discriminations. Project supported by the National Natural Science Foundation of China (Grant Nos. 11274043, 11375025, and 11005008).

  8. Information technology security system engineering methodology

    NASA Technical Reports Server (NTRS)

    Childs, D.

    2003-01-01

    A methodology is described for system engineering security into large information technology systems under development. The methodology is an integration of a risk management process and a generic system development life cycle process. The methodology is to be used by Security System Engineers to effectively engineer and integrate information technology security into a target system as it progresses through the development life cycle. The methodology can also be used to re-engineer security into a legacy system.

  10. The Government View of Information Processes

    ERIC Educational Resources Information Center

    Burchinal, L. G.

    1970-01-01

    Article focuses on government information systems and resources, alternatives to computer storage and retrieval and large-scale information systems, and techniques to improve the utilization of existing information systems in colleges and universities. (Editor)

  11. An evaluation process for an electronic bar code medication administration information system in an acute care unit.

    PubMed

    Bargren, Michelle; Lu, Der-Fa

    2009-01-01

    The purpose of this case study is to present an evaluation process and recommendations for addressing the gaps found with the implementation of a new bar code medication administration (BCMA) technology in a busy acute care hospital unit. The case study analyzes workflow procedures associated with administration of medications in an inpatient labor and delivery care unit before and one year after implementation of BCMA technology. The comparison reveals a twofold increase in workflow procedures for nursing staff because of the new technology. System gaps are identified from a nursing user's perspective, and recommendations are offered to close those gaps.

  12. Quantum information processing: science & technology.

    SciTech Connect

    Horton, Rebecca; Carroll, Malcolm S.; Tarman, Thomas David

    2010-09-01

    Qubits have been demonstrated using GaAs double quantum dots (DQD). The qubit basis states are the (1) singlet and (2) triplet stationary states. Long spin decoherence times in silicon spur translation of the GaAs qubit into silicon. In the near term the goals are: (1) Develop surface-gate enhancement-mode double quantum dots (MOS & strained-Si/SiGe) to demonstrate few electrons and spin read-out, and to examine impurity-doped quantum dots as an alternative architecture; (2) Use mobility, C-V, ESR, quantum dot performance & modeling to feed back and improve upon processing; this includes development of atomic-precision fabrication at SNL; (3) Examine integrated electronics approaches to RF-SET; (4) Use combinations of numerical packages for multi-scale simulation of quantum dot systems (NEMO3D, EMT, TCAD, SPICE); and (5) Continue micro-architecture evaluation for different device and transport architectures.

  13. An Overview of Geographic Information System and its Role and Applicability in Environmental Monitoring and Process Modeling

    NASA Astrophysics Data System (ADS)

    Rusko, Miroslav; Chovanec, Roman; Rošková, Dana

    2010-01-01

    The geographical information system (GIS) is a tool used generically for any computer-based capability for manipulating geographical data. The hardware and software functions of GIS include data input, data storage, data management (data manipulation, updating, changing, exchange) and data reporting (retrieval, presentation, analysis, combination, etc.). All of these actions and operations are applied to GIS as a tool that forms its database. The paper describes the types of GIS data formats (vector, raster), database object definitions, relationships, geometric features, and the data organization structure. Some GIS applications and examples are given for a better understanding of how GIS data can be used in GIS applications, with respect to data formats, including surface elevation and slope from digital elevation model (DEM) data, with applicability in the water industry.
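
    As a small illustration of the DEM-derived slope mentioned at the end of the abstract (a minimal sketch, not tied to any particular GIS package; the elevation array and cell size are assumed values):

        import numpy as np

        def dem_slope_degrees(dem: np.ndarray, cell_size: float) -> np.ndarray:
            """Approximate the slope (in degrees) of a DEM raster from finite differences."""
            dz_dy, dz_dx = np.gradient(dem, cell_size)   # elevation change per metre, row- and column-wise
            return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

        # Illustrative 4x4 elevation grid (metres) with 30 m cells, rising to the east
        dem = np.array([[100.0, 103.0, 106.0, 109.0],
                        [100.0, 103.0, 106.0, 109.0],
                        [100.0, 103.0, 106.0, 109.0],
                        [100.0, 103.0, 106.0, 109.0]])
        print(dem_slope_degrees(dem, cell_size=30.0))    # roughly 5.7 degrees everywhere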

  14. MANAGEMENT INFORMATION SYSTEM,

    DTIC Science & Technology

    A Management Information System is being developed for the Institute of Cybernetics of the Academy of Sciences of the Ukrainian SSR. The work is being done at the suggestion of Academician V. M. Glushkov under the leadership of Candidate of Physico-Mathematical Sciences A. A. Stognii. Project reports prepared in various departments of the Institute of Cybernetics in 1963-64 were used in writing this paper. Among them, the works of V. N. Afanas'ev, V. G. Bodnarchuk, E. F. Skorokhod'ko, and V. I. Shurikhin should be mentioned. A great deal of factual

  15. Interstellar reddening information system

    NASA Astrophysics Data System (ADS)

    Burnashev, V. I.; Grigorieva, E. A.; Malkov, O. Yu.

    2013-10-01

    We describe an electronic bibliographic information system, based on a card catalog, containing some 2500 references (publications of 1930-2009) on interstellar extinction. We have classified the articles according to their content. We present here a list of articles devoted to two categories: maps of total extinction and variation of interstellar extinction with the distance to the object. The catalog is tested using published data on open clusters, and conclusions on the applicability of different maps of interstellar extinctions for various distances are made.

  16. Centralized Storm Information System (CSIS)

    NASA Technical Reports Server (NTRS)

    Norton, C. C.

    1985-01-01

    A final progress report is presented on the Centralized Storm Information System (CSIS). The primary purpose of the CSIS is to demonstrate and evaluate real time interactive computerized data collection, interpretation and display techniques as applied to severe weather forecasting. CSIS objectives pertaining to improved severe storm forecasting and warning systems are outlined. The positive impact that CSIS has had on the National Severe Storms Forecast Center (NSSFC) is discussed. The benefits of interactive processing systems on the forecasting ability of the NSSFC are described.

  17. BBIS: Beacon Bus Information System

    NASA Astrophysics Data System (ADS)

    Kasim, Shahreen; Hafit, Hanayanti; Pei Juin, Kong; Afizah Afif, Zehan; Hashim, Rathiah; Ruslai, Husni; Jahidin, Kamaruzzaman; Syafwan Arshad, Mohammad

    2016-11-01

    Lack of bus information, for example bus timetables, bus status, and messy advertisements on the bulletin board at the bus stop, gives a negative impression to tourists. Therefore, a real-time bus information bulletin board that provides all the information needed allows passengers to save time searching for bus information. Supported on Android and iOS, the Beacon Bus Information System (BBIS) provides bus information for the Batu Pahat and Kluang area. BBIS is a system that implements physical web technology and interaction on demand. It is built on a Backend-as-a-Service cloud solution, with the Firebase non-relational database as the data persistence backend, and syncs with user clients in real time. People walking past a bus stop with a smart device do not require any application. Bluetooth beacons are used to achieve the smart device's best data-sharing performance. Intellij IDEA 15 is one of the tools used to develop the BBIS system. A multi-language integrated development environment (IDE) supporting both front end and back end helped to speed up the integration process.

  18. Fluidic microchemomechanical integrated circuits processing chemical information.

    PubMed

    Greiner, Rinaldo; Allerdissen, Merle; Voigt, Andreas; Richter, Andreas

    2012-12-07

    Lab-on-a-chip (LOC) technology has blossomed into a major new technology fundamentally influencing the sciences of life and nature. From a systemic point of view however, microfluidics is still in its infancy. Here, we present the concept of a microfluidic central processing unit (CPU) which shows remarkable similarities to early electronic Von Neumann microprocessors. It combines both control and execution units and, moreover, the complete power supply on a single chip and introduces the decision-making ability regarding chemical information into fluidic integrated circuits (ICs). As a consequence of this system concept, the ICs process chemical information completely in a self-controlled manner and energetically self-sustaining. The ICs are fabricated by layer-by-layer deposition of several overlapping layers based on different intrinsically active polymers. As examples we present two microchips carrying out long-term monitoring of critical parameters by around-the-clock sampling.

  19. The AMMA information system

    NASA Astrophysics Data System (ADS)

    Fleury, Laurence; Brissebrat, Guillaume; Boichard, Jean-Luc; Cloché, Sophie; Mière, Arnaud; Moulaye, Oumarou; Ramage, Karim; Favot, Florence; Boulanger, Damien

    2015-04-01

    In the framework of the African Monsoon Multidisciplinary Analyses (AMMA) programme, several tools have been developed in order to boost the data and information exchange between researchers from different disciplines. The AMMA information system includes (i) a user-friendly data management and dissemination system, (ii) quasi real-time display websites and (iii) a scientific paper exchange collaborative tool. The AMMA information system is enriched by past and ongoing projects (IMPETUS, FENNEC, ESCAPE, QweCI, ACASIS, DACCIWA...) addressing meteorology, atmospheric chemistry, extreme events, health, adaptation of human societies... It is becoming a reference information system on environmental issues in West Africa. (i) The projects include airborne, ground-based and ocean measurements, social science surveys, satellite data use, modelling studies and value-added product development. Therefore, the AMMA data portal enables access to a great amount and a large variety of data: - 250 local observation datasets that cover many geophysical components (atmosphere, ocean, soil, vegetation) and human activities (agronomy, health). They have been collected by operational networks since 1850, long-term monitoring research networks (CATCH, IDAF, PIRATA...) and intensive scientific campaigns; - 1350 outputs of a socio-economic questionnaire; - 60 operational satellite products and several research products; - 10 output sets of meteorological and ocean operational models and 15 of research simulations. Data documentation complies with international metadata standards, and data are delivered in standard formats. The data request interface takes full advantage of the database relational structure and enables users to build multi-criteria requests (period, area, property, property value…). The AMMA data portal counts about 900 registered users, and 50 data requests every month. The AMMA databases and data portal have been developed and are operated jointly by SEDOO and

  20. Ada (Trade Name) Foundation Technology. Volume 4. Software Requirements for WIS (WWMCCS (World Wide Military Command and Control System) Information System) Text Processing Prototypes

    DTIC Science & Technology

    1986-12-01

    type fonts and user-oriented "help" messages tailored to the operations being performed and user expertise. In general, critical design issues... other volumes include command language, software design, description and analysis tools, database management systems, operating systems; planning and... summary information. d. Detailed planning capabilities must provide improved methods for designating specific units and associated sustainment

  1. TWRS information locator database system design description

    SciTech Connect

    Knutson, B.J.

    1996-09-13

    This document gives an overview and description of the Tank Waste Remediation System (TWRS) Information Locator Database (ILD) system design. The TWRS ILD system is an inventory of information used in the TWRS Systems Engineering process to represent the TWRS Technical Baseline. The inventory is maintained in the form of a relational database developed in Paradox 4.5.

  2. Layers of Information: Geographic Information Systems (GIS).

    ERIC Educational Resources Information Center

    Lucking, Robert A.; Christmann, Edwin P.

    2003-01-01

    Describes the Geographic Information System (GIS) which is capable of storing, manipulating, and displaying data allowing students to explore complex relationships through scientific inquiry. Explains applications of GIS in middle school classrooms and includes assessment strategies. (YDS)

  4. NEIS (NASA Environmental Information System)

    NASA Technical Reports Server (NTRS)

    Cook, Beth

    1995-01-01

    The NASA Environmental Information System (NEIS) is a tool to support the functions of the NASA Operational Environment Team (NOET). The NEIS is designed to provide a central environmental technology resource drawing on all NASA centers' capabilities, and to support program managers who must ultimately deliver hardware compliant with performance specifications and environmental requirements. The NEIS also tracks environmental regulations, usages of materials and processes, and new technology developments. It has proven to be a useful instrument for channeling information throughout the aerospace community, NASA, other federal agencies, educational institutions, and contractors. The associated paper will discuss the dynamic databases within the NEIS, and the usefulness it provides for environmental compliance efforts.

  5. Geographic Information Systems.

    PubMed

    Wieczorek, William F; Delmerico, Alan M

    2009-01-01

    This chapter presents an overview of the development, capabilities, and utilization of geographic information systems (GIS). There are nearly an unlimited number of applications that are relevant to GIS because virtually all human interactions, natural and man-made features, resources, and populations have a geographic component. Everything happens somewhere and the location often has a role that affects what occurs. This role is often called spatial dependence or spatial autocorrelation, which exists when a phenomenon is not randomly geographically distributed. GIS has a number of key capabilities that are needed to conduct a spatial analysis to assess this spatial dependence. This chapter presents these capabilities (e.g., georeferencing, adjacency/distance measures, overlays) and provides a case study to illustrate how GIS can be used for both research and planning. Although GIS has developed into a relatively mature application for basic functions, development is needed to more seamlessly integrate spatial statistics and models. The issue of location, especially the geography of human activities, interactions between humanity and nature, and the distribution and location of natural resources and features, is one of the most basic elements of scientific inquiry. Conceptualizations and physical maps of geographic space have existed since the beginning of time because all human activity takes place in a geographic context. Representing objects in space, basically where things are located, is a critical aspect of the natural, social, and applied sciences. Throughout history there have been many methods of characterizing geographic space, especially maps created by artists, mariners, and others eventually leading to the development of the field of cartography. It is no surprise that the digital age has launched a major effort to utilize geographic data, but not just as maps. A geographic information system (GIS) facilitates the collection, analysis, and reporting of

  6. Geographic Information Systems

    PubMed Central

    Wieczorek, William F.; Delmerico, Alan M.

    2009-01-01

    This chapter presents an overview of the development, capabilities, and utilization of geographic information systems (GIS). There are nearly an unlimited number of applications that are relevant to GIS because virtually all human interactions, natural and man-made features, resources, and populations have a geographic component. Everything happens somewhere and the location often has a role that affects what occurs. This role is often called spatial dependence or spatial autocorrelation, which exists when a phenomenon is not randomly geographically distributed. GIS has a number of key capabilities that are needed to conduct a spatial analysis to assess this spatial dependence. This chapter presents these capabilities (e.g., georeferencing, adjacency/distance measures, overlays) and provides a case study to illustrate how GIS can be used for both research and planning. Although GIS has developed into a relatively mature application for basic functions, development is needed to more seamlessly integrate spatial statistics and models. The issue of location, especially the geography of human activities, interactions between humanity and nature, and the distribution and location of natural resources and features, is one of the most basic elements of scientific inquiry. Conceptualizations and physical maps of geographic space have existed since the beginning of time because all human activity takes place in a geographic context. Representing objects in space, basically where things are located, is a critical aspect of the natural, social, and applied sciences. Throughout history there have been many methods of characterizing geographic space, especially maps created by artists, mariners, and others eventually leading to the development of the field of cartography. It is no surprise that the digital age has launched a major effort to utilize geographic data, but not just as maps. A geographic information system (GIS) facilitates the collection, analysis, and reporting of

  7. In vitro Reconstruction of Visual Information-Processing System Using Co-Cultured Retina and Superior Colliculus

    NASA Astrophysics Data System (ADS)

    Hirota, Shinya; Moriguchi, Hiroyuki; Takayama, Yuzo; Jimbo, Yasuhiko

    Retinotectal projection, together with the well-known retinogeniculocortical pathways, plays important roles in visual information processing. In this study, we attempt in vitro reconstruction of retina-superior colliculus (SC) pathways on microelectrode arrays (MEAs). First, retinal tissue and SC slices were prepared from newborn rats and individually cultured on MEA substrates. Spontaneous electrical activity was recorded in both retina and SC cultures. Continuous firing was observed in cultured retina, where the frequency increased from a few Hz to more than 10 Hz with culture duration. Evoked responses were also recorded from cultured retinal tissue. A single biphasic pulse successfully elicited spike trains. Spontaneous activity of SC was observed in the 6 days in vitro (DIV) cultures. Finally, retina and SC were co-cultured under the conditions established for SC-slice cultures. Within a few days, we could observe neurite outgrowth from both tissues, and connections were established morphologically. Spontaneous activity was recorded from both retina and SC areas in 11 DIV cultures. The next step will be spatio-temporal analysis of signal-propagation patterns of spontaneous activity as well as SC responses to retinal stimulation.

  8. Iowa Flood Information System

    NASA Astrophysics Data System (ADS)

    Demir, I.; Krajewski, W. F.; Goska, R.; Mantilla, R.; Weber, L. J.; Young, N.

    2011-12-01

    The Iowa Flood Information System (IFIS) is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to flood inundation maps, real-time flood conditions, flood forecasts (both short-term and seasonal), flood-related data, information and interactive visualizations for communities in Iowa. The key element of the system's architecture is the notion of community. Locations of the communities, those near streams and rivers, define basin boundaries. The IFIS provides community-centric watershed and river characteristics, weather (rainfall) conditions, and streamflow data and visualization tools. Interactive interfaces allow access to inundation maps for different stage and return period values, and flooding scenarios with contributions from multiple rivers. Real-time and historical data of water levels, gauge heights, and rainfall conditions are available in the IFIS by streaming data from automated IFC bridge sensors, USGS stream gauges, NEXRAD radars, and NWS forecasts. Simple 2D and 3D interactive visualizations in the IFIS make the data more understandable to the general public. Users are able to filter data sources for their communities and selected rivers. The data and information on the IFIS are also accessible through web services and mobile applications. The IFIS is optimized for various browsers and screen sizes to provide access through multiple platforms including tablets and mobile devices. The IFIS includes a rainfall-runoff forecast model to provide a five-day flood risk estimate for around 500 communities in Iowa. Multiple view modes in the IFIS accommodate different user types from the general public to researchers and decision makers by providing different levels of tools and detail. River view mode allows users to visualize data from multiple IFC bridge sensors and USGS stream gauges to follow flooding conditions along a river. The IFIS will help communities make better-informed decisions on the occurrence of floods, and will alert communities

  9. Visual Information Processing for Television and Telerobotics

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O. (Editor); Park, Stephen K. (Editor)

    1989-01-01

    This publication is a compilation of the papers presented at the NASA conference on Visual Information Processing for Television and Telerobotics. The conference was held at the Williamsburg Hilton, Williamsburg, Virginia on May 10 to 12, 1989. The conference was sponsored jointly by NASA Offices of Aeronautics and Space Technology (OAST) and Space Science and Applications (OSSA) and the NASA Langley Research Center. The presentations were grouped into three sessions: Image Gathering, Coding, and Advanced Concepts; Systems; and Technologies. The program was organized to provide a forum in which researchers from industry, universities, and government could be brought together to discuss the state of knowledge in image gathering, coding, and processing methods.

  10. Intelligent Work Process Engineering System

    NASA Technical Reports Server (NTRS)

    Williams, Kent E.

    2003-01-01

    Optimizing performance on work activities and processes requires metrics of performance for management to monitor and analyze in order to support further improvements in efficiency, effectiveness, safety, reliability and cost. Information systems are therefore required to assist management in making timely, informed decisions regarding these work processes and activities. Currently, information systems regarding Space Shuttle maintenance and servicing do not exist to support such timely decisions. The work to be presented details a system which incorporates various automated and intelligent processes and analysis tools to capture, organize, and analyze work-process-related data in order to make the decisions necessary to meet KSC organizational goals. The advantages and disadvantages of design alternatives for the development of such a system will be discussed, including technologies which would need to be designed, prototyped, and evaluated.

  11. Center for Information Services, Phase II: Detailed System Design and Programming, Part 7 - Text Processing, Phase IIA Final Report.

    ERIC Educational Resources Information Center

    Silva, Georgette M.

    Libraries, as well as larger information networks, are necessarily based upon the storage of information files consisting in many cases of written materials and texts such as books, serials, abstracts, manuscripts and archives. At the present stage of the "information explosion" no librarian can afford to ignore the contribution of…

  13. The ideal laboratory information system.

    PubMed

    Sepulveda, Jorge L; Young, Donald S

    2013-08-01

    Laboratory information systems (LIS) are critical components of the operation of clinical laboratories. However, the functionalities of LIS have lagged significantly behind the capacities of current hardware and software technologies, while the complexity of the information produced by clinical laboratories has been increasing over time and will soon undergo rapid expansion with the use of new, high-throughput and high-dimensionality laboratory tests. In the broadest sense, LIS are essential to manage the flow of information between health care providers, patients, and laboratories and should be designed to optimize not only laboratory operations but also personalized clinical care. The objective is to list suggestions for designing LIS with the goal of optimizing the operation of clinical laboratories while improving clinical care by intelligent management of laboratory information, drawing on literature review, interviews with laboratory users, and personal experience and opinion. Laboratory information systems can improve laboratory operations and improve patient care. Specific suggestions for improving the function of LIS are listed under the following sections: (1) Information Security, (2) Test Ordering, (3) Specimen Collection, Accessioning, and Processing, (4) Analytic Phase, (5) Result Entry and Validation, (6) Result Reporting, (7) Notification Management, (8) Data Mining and Cross-sectional Reports, (9) Method Validation, (10) Quality Management, (11) Administrative and Financial Issues, and (12) Other Operational Issues.

  14. Social Information Processing in Deaf Adolescents

    ERIC Educational Resources Information Center

    Torres, Jesús; Saldaña, David; Rodríguez-Ortiz, Isabel R.

    2016-01-01

    The goal of this study was to compare the processing of social information in deaf and hearing adolescents. A task was developed to assess social information processing (SIP) skills of deaf adolescents based on Crick and Dodge's (1994; A review and reformulation of social information-processing mechanisms in children's social adjustment.…

  16. [Geographic information systems].

    PubMed

    Hernández-Vásquez, Akram; Azañedo, Diego; Bendezú-Quispe, Guido; Pacheco-Mendoza, Josmel; Chaparro, R Martín

    2016-01-01

    The aim of this study was to geospatially explore the occurrence rates of car accidents involving pedestrians in Cercado de Lima (Lima District), Peru. Car accidents involving pedestrians recorded in the 2015 National Police Station Census of the National Statistics and Information Institute were described and georeferenced. Subsequently, a Kernel Density analysis was carried out to locate areas with high-, medium-, and low-density events. Records of 171 car accidents involving pedestrians were studied: the types of vehicles involved were automobiles (56.7%) and smaller vehicles (22.8%). The highest percentage of car accidents involving pedestrians (38.6%) took place between 12:00 p.m. and 5:00 p.m. There were two high-density areas and two intermediate-density areas for car accidents involving pedestrians, locations that had previously been reported as critical due to their deficiencies and high probability of traffic accidents. The use of geographic information systems offers a quick overview of the occurrence rates of car accidents involving pedestrians to make comparisons and enable the local implementation of strategies.
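
    The study's own GIS workflow is not reproduced here; the following is a minimal Python sketch of the kernel-density idea applied to point events, with synthetic coordinates standing in for the georeferenced accident records (all values are illustrative):

        import numpy as np
        from scipy.stats import gaussian_kde

        # Synthetic (x, y) coordinates for 171 point events, in projected metres
        rng = np.random.default_rng(0)
        points = rng.normal(loc=0.0, scale=500.0, size=(2, 171))

        kde = gaussian_kde(points)   # fit a 2-D kernel density estimate to the events

        # Evaluate the density on a regular grid to map high-, medium-, and low-density areas
        xs = np.linspace(points[0].min(), points[0].max(), 100)
        ys = np.linspace(points[1].min(), points[1].max(), 100)
        gx, gy = np.meshgrid(xs, ys)
        density = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)

        # Flag grid cells above the 90th percentile of density as hot spots
        hot = density > np.percentile(density, 90)
        print(f"{hot.sum()} of {hot.size} grid cells flagged as high-density")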

  17. Buying a healthcare information system.

    PubMed

    Clegg, T A

    1998-01-01

    Replacing an antiquated computer system with state-of-the-art equipment and software is a lengthy, at times frustrating process, and never an easy decision. At Wesley Woods Center on Aging, Atlanta, an integrated provider of healthcare for the elderly affiliated with Emory University, the process consumed more than two and a half years. This article takes the reader through the entire process, from the initial decision to replace an existing system, through the final purchase and installation. It looks candidly at the problems that were encountered, including turnover among key personnel, difficulties with involving all of the user groups, changes in the technology and coordination with the University. The lessons Wesley Woods learned in its experience can be of benefit to any healthcare facility contemplating an information system change.

  18. Estimating Prediction Uncertainty from Geographical Information System Raster Processing: A User's Manual for the Raster Error Propagation Tool (REPTool)

    USGS Publications Warehouse

    Gurdak, Jason J.; Qi, Sharon L.; Geisler, Michael L.

    2009-01-01

    The U.S. Geological Survey Raster Error Propagation Tool (REPTool) is a custom tool for use with the Environmental System Research Institute (ESRI) ArcGIS Desktop application to estimate error propagation and prediction uncertainty in raster processing operations and geospatial modeling. REPTool is designed to introduce concepts of error and uncertainty in geospatial data and modeling and provide users of ArcGIS Desktop a geoprocessing tool and methodology to consider how error affects geospatial model output. Similar to other geoprocessing tools available in ArcGIS Desktop, REPTool can be run from a dialog window, from the ArcMap command line, or from a Python script. REPTool consists of public-domain, Python-based packages that implement Latin Hypercube Sampling within a probabilistic framework to track error propagation in geospatial models and quantitatively estimate the uncertainty of the model output. Users may specify error for each input raster or model coefficient represented in the geospatial model. The error for the input rasters may be specified as either spatially invariant or spatially variable across the spatial domain. Users may specify model output as a distribution of uncertainty for each raster cell. REPTool uses the Relative Variance Contribution method to quantify the relative error contribution from the two primary components in the geospatial model - errors in the model input data and coefficients of the model variables. REPTool is appropriate for many types of geospatial processing operations, modeling applications, and related research questions, including applications that consider spatially invariant or spatially variable error in geospatial data.
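
    REPTool itself is an ArcGIS geoprocessing tool, but the core Latin Hypercube error-propagation idea can be sketched in a few lines of Python; the toy model, error magnitudes, and array sizes below are illustrative assumptions, not REPTool's implementation:

        import numpy as np
        from scipy.stats import norm, qmc

        # Toy geospatial model: output = coefficient * (input raster + error)
        input_raster = np.array([[10.0, 12.0],
                                 [14.0, 16.0]])
        coeff_mean, coeff_sd = 2.0, 0.1   # assumed error in a model coefficient
        raster_sd = 0.5                   # assumed spatially invariant raster error

        # Latin Hypercube samples over the two error sources, mapped to normal distributions
        sampler = qmc.LatinHypercube(d=2, seed=42)
        u = sampler.random(n=500)
        coeff_samples = norm.ppf(u[:, 0], loc=coeff_mean, scale=coeff_sd)
        raster_errors = norm.ppf(u[:, 1], loc=0.0, scale=raster_sd)

        # Run the model for every sample and summarize the per-cell output distribution
        outputs = np.stack([c * (input_raster + e)
                            for c, e in zip(coeff_samples, raster_errors)])
        print("per-cell mean:\n", outputs.mean(axis=0))
        print("per-cell std (prediction uncertainty):\n", outputs.std(axis=0))

    Comparing how much of the output variance each error source explains is, loosely, what the Relative Variance Contribution step described above quantifies.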

  19. Tropical Cyclone Information System

    NASA Technical Reports Server (NTRS)

    Li, P. Peggy; Knosp, Brian W.; Vu, Quoc A.; Yi, Chao; Hristova-Veleva, Svetla M.

    2009-01-01

    The JPL Tropical Cyclone Information System (TCIS) is a Web portal (http://tropicalcyclone.jpl.nasa.gov) that provides researchers with an extensive set of observed hurricane parameters together with large-scale and convection resolving model outputs. It provides a comprehensive set of high-resolution satellite, airborne, and in-situ observations in both image and data formats. Large-scale datasets depict the surrounding environmental parameters such as SST (Sea Surface Temperature) and aerosol loading. Model outputs and analysis tools are provided to evaluate model performance and compare observations from different platforms. The system pertains to the thermodynamic and microphysical structure of the storm, the air-sea interaction processes, and the larger-scale environment as depicted by ocean heat content and the aerosol loading of the environment. Currently, the TCIS is populated with satellite observations of all tropical cyclones observed globally during 2005. There is a plan to extend the database both forward in time to the present as well as backward to 1998. The portal is powered by a MySQL database and an Apache/Tomcat Web server on a Linux system. The interactive graphic user interface is provided by Google Map.

  20. Prenatal cocaine exposure and prolonged focus attention. Poor infant information processing ability or precocious maturation of attentional systems?

    PubMed

    Chiriboga, Claudia A; Starr, Denise; Kuhn, Louise; Wasserman, Gail A

    2009-01-01

    In experimental models, prenatal cocaine exposure has been found to perturb monoaminergic development of systems implicated in modulating attention. To determine whether prenatal cocaine exposure affects infant attention, we assessed visual recognition memory and focused attention during free play. We enrolled 380 infants at birth, 113 of them cocaine-exposed, using multiple biomarkers to assess drug exposure. Behavior was videotaped and coded off-line for sustained looking time (i.e. focused attention), banging and intrusion. Prenatal cocaine exposure was not associated with visual recognition memory, but was significantly associated with longer sustained looking times (average focused attention) at ages 6 months (p = 0.02) and 12 months (p = 0.04) in analyses that adjusted for variables, including maternal intelligence, education, depressive scores and other exposures (alcohol, tobacco and marijuana). Cocaine-exposed infants at age 12 months also spent significantly less time in banging activity (p = 0.02) after adjusting for confounding variables. This finding was not explained through cocaine effects on motor development, neurological findings or time spent in focused attention. Prenatal cocaine exposure was significantly associated with longer periods of sustained looking or focused attention in infancy, a finding that could be interpreted as a measure of poor processing efficiency, or alternatively as precocious maturation of attentional systems. Either interpretation has implications for later cognitive development. Lower banging activity among the cocaine-exposed infants was not explained through cocaine effects on motor development or neurological findings, suggesting that activity level itself is diminished in these infants. Whether the focused attention findings impact long-term development awaits further study.

  1. Reduced functional integration and segregation of distributed neural systems underlying social and emotional information processing in autism spectrum disorders.

    PubMed

    Rudie, Jeffrey D; Shehzad, Zarrar; Hernandez, Leanna M; Colich, Natalie L; Bookheimer, Susan Y; Iacoboni, Marco; Dapretto, Mirella

    2012-05-01

    A growing body of evidence suggests that autism spectrum disorders (ASDs) are related to altered communication between brain regions. Here, we present findings showing that ASD is characterized by a pattern of reduced functional integration as well as reduced segregation of large-scale brain networks. Twenty-three children with ASD and 25 typically developing matched controls underwent functional magnetic resonance imaging while passively viewing emotional face expressions. We examined whole-brain functional connectivity of two brain structures previously implicated in emotional face processing in autism: the amygdala bilaterally and the right pars opercularis of the inferior frontal gyrus (rIFGpo). In the ASD group, we observed reduced functional integration (i.e., less long-range connectivity) between amygdala and secondary visual areas, as well as reduced segregation between amygdala and dorsolateral prefrontal cortex. For the rIFGpo seed, we observed reduced functional integration with parietal cortex and increased integration with right frontal cortex as well as right nucleus accumbens. Finally, we observed reduced segregation between rIFGpo and the ventromedial prefrontal cortex. We propose that a systems-level approach, whereby the integration and segregation of large-scale brain networks in ASD are examined in relation to typical development, may provide a more detailed characterization of the neural basis of ASD.
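
    A minimal sketch of the seed-based functional-connectivity computation this kind of study relies on (synthetic data; not the authors' preprocessing or statistical pipeline):

        import numpy as np

        # Hypothetical BOLD data: time points x voxels, plus a seed region of interest
        n_t, n_vox = 120, 5000
        rng = np.random.default_rng(1)
        voxels = rng.standard_normal((n_t, n_vox))
        seed = voxels[:, :10].mean(axis=1)   # mean time series of an illustrative seed ROI

        # Seed-based functional connectivity: Pearson correlation of the seed with every voxel
        seed_z = (seed - seed.mean()) / seed.std()
        vox_z = (voxels - voxels.mean(axis=0)) / voxels.std(axis=0)
        connectivity = vox_z.T @ seed_z / n_t
        print(connectivity.shape, float(connectivity.min()), float(connectivity.max()))

    Reduced long-range connectivity of the kind reported above would appear as weaker correlations between the seed and anatomically distant voxels.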

  2. Information Processing and Human Abilities

    ERIC Educational Resources Information Center

    Kirby, John R.; Das, J. P.

    1978-01-01

    The simultaneous and successive processing model of cognitive abilities was compared to a traditional primary mental abilities model. Simultaneous processing was found to be primarily related to spatial ability and, to a lesser extent, to memory and inductive reasoning. Subjects were 104 fourth-grade urban males. (Author/GDC)

  3. An information typology for understanding living systems.

    PubMed

    Banathy, B A

    1998-04-01

    It is argued that we can improve our understanding of living systems by focusing on their informational processes. Recent developments, primarily in evolutionary biology, cybernetics and systems theory, suggest that informational processes are of at least two, and probably three, different types, and that the interaction of these types can be seen as a basis for the self-construction of living systems. Following the work of Csanyi and Kampis, a distinction is drawn between referential and nonreferential information. This typology is further extended to include state-referential information. The state-referential type serves to lend stability to informational arrangements (organization) that are viable so that they may be propagated in space and time.

  4. Defense Energy Information System. Manual

    SciTech Connect

    Carnes, J.

    1990-02-01

    The Manual provides clear, reliable, timely, accurate, and objective energy information; prescribes instructions for the preparation and submission of energy data to support the Defense Energy Information System (DEIS); and furnishes information regarding the use of the DEIS.

  5. Information Processing in Medical Imaging Meeting (IPMI)

    DTIC Science & Technology

    1993-09-30

    Information Processing in Medical Imaging Meeting (IPMI), grant F49620-93-1-0352; author: Professor Harrison H. Barrett. Distribution unlimited. Final Report of the 1993 Information Processing in Medical Imaging Meeting. The 1993 Information Processing in Medical Imaging (IPMI)... that the extracted information is correct? Although the emphasis of the meeting was clearly on medical imaging, the techniques and issues discussed

  6. See the Sea: multi-user information system for investigating processes and phenomena in coastal zones via satellite remotely sensed data, particularly hyperspectral data

    NASA Astrophysics Data System (ADS)

    Mityagina, Marina I.; Lavrova, Olga Yu.; Uvarov, Ivan A.

    2014-10-01

    The functionality, the goals and the current state of the distributed information system "See the Sea" (STS) are presented and discussed. This system is designed for investigating various processes and phenomena in the ocean and marine atmosphere using different types of satellite remotely sensed data. The STS system provides researchers with the possibilities to deal with the satellite remote sensing data as well as with the result of its analysis. The key feature of STS is the ability to work simultaneously with satellite information of different types. STS provides tools for joint analysis of different types of satellite data, as well as data of ground meteorological stations, cartographic data etc. This paper gives an overview of the system and data processing use cases. Some example cases are described including processing and joint analysis of various satellite data. The data from different sensors (obtained by Envisat ASAR, Landsat-8 OLI, Landsat-7 ETM+, Landsat-5 TM, Terra/Aqua MODIS as well as Hyperion and HICO hyperspectral data) was analyzed jointly for differentiation between different types of coastal waters, and for reconstruction of suspended matter distribution in the test areas of the Azov and Black Seas.

  7. Implementation of Alabama Resources Information System, ARIS

    NASA Technical Reports Server (NTRS)

    Herring, B. E.

    1978-01-01

    Development of ARIS, the Alabama Resources Information System, is summarized. Development of databases, system simplification for user access, and making information available to personnel who need to use ARIS or who are developing ARIS-type systems are discussed.

  8. Federal Energy Information Systems.

    ERIC Educational Resources Information Center

    Coyne, Joseph G.; Moneyhun, Dora H.

    1979-01-01

    Describes the Energy Information Administration (EIA) and the Technical Information Center (TIC), and lists databases accessible online to the Department of Energy and its contractors through DOE/RECON. (RAA)

  9. Information Processing Technology. Final Report.

    ERIC Educational Resources Information Center

    Choate, Larry; And Others

    A tech prep/associate degree program in information technology was developed to prepare workers for entry into and advancement in occupations entailing applications of scientific principles and higher mathematics in situations involving various office machines. According to the articulation agreement reached, students from five county regional…

  10. ECONOMIC COMPARABILITY OF INFORMATION SYSTEMS.

    DTIC Science & Technology

    not only on the probability distributions of channel inputs and outputs (events and messages) characterizing the information systems. This remains true when... information systems are interpreted as statistical experiments used to test hypotheses. Some pairs of information systems are, however, comparable... in the sense that one is preferable to another irrespective of the payoff function. There thus exists a partial ordering of information systems according

  11. Display system for imaging scientific telemetric information

    NASA Technical Reports Server (NTRS)

    Zabiyakin, G. I.; Rykovanov, S. N.

    1979-01-01

    A system for imaging scientific telemetric information, based on the M-6000 minicomputer and the SIGD graphic display, is described. Two-dimensional graphic display of telemetric information is provided, along with interaction with the computer for analysis and processing of the telemetric parameters displayed on the screen. The running-parameter information output method is presented. User capabilities in the analysis and processing of telemetric information imaged on the display screen, and the user language, are discussed and illustrated.

  12. The risk assessment information system

    SciTech Connect

    Kerr, S.B.; Bonczek, R.R.; McGinn, C.W.; Land, M.L.; Bloom, L.D.; Sample, B.E.; Dolislager, F.G.

    1998-06-01

    In an effort to provide service-oriented environmental risk assessment expertise, the Department of Energy (DOE) Center for Risk Excellence (CRE) and DOE Oak Ridge Operations Office (ORO) are sponsoring Oak Ridge National Laboratory (ORNL) to develop a web-based system for disseminating risk tools and information to its users. This system, the Risk Assessment Information System (RAIS), was initially developed to support the site-specific needs of the DOE-ORO Environmental Restoration Risk Assessment Program. With support from the CRE, the system is currently being expanded to benefit all DOE risk information users and can be tailored to meet site-specific needs. Taking advantage of searchable and executable databases, menu-driven queries, and data downloads, using the latest World Wide Web technologies, the RAIS offers essential tools that are used in the risk assessment process or anywhere from project scoping to implementation. The RAIS tools can be located directly at http://risk.lsd.ornl.gov/homepage/rap_tool.htm or through the CRE's homepage at http://www.doe.gov/riskcenter/home.html.

  13. Information Fusion and Cognitive Processing

    DTIC Science & Technology

    2010-09-22

    ADA560467. Indo-US Science and Technology Round Table Meeting (4th Annual) - Power, Energy and Cognitive Science, held in Bangalore, India, on September 21-23... First Workshop on Information Fusion: the Office of Naval Research was the lead sponsor, together with the National Science Foundation and Department of Energy. Brought together scientists from engineering, computer science, mathematics, econometrics, and bioinformatics.

  14. A Device for Logic Information Processing.

    ERIC Educational Resources Information Center

    Levinskiy, L. S.; Vissonova, I. A.

    Two essential components of the information-logic problem are: (1) choosing some known part of the total information block for parallel review of the entire block and (2) parallel logic processing of a sequence of codes. The described device fulfills these essential components thereby improving information processing and increasing the speed of…

  15. Value Driven Information Processing and Fusion

    DTIC Science & Technology

    2016-03-01

    The objective of the project is to develop a general framework for value-driven decentralized information processing... information value metrics as called for by different inference tasks. Major theoretical breakthroughs have been obtained under this effort... (Sep-2012 to 22-Oct-2015.) Approved for public release; distribution unlimited. Final Report: Value Driven Information Processing and Fusion.

  16. Information for Successful Interaction with Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Johnson, Kathy A.

    2003-01-01

    Interaction in heterogeneous mission operations teams is not well matched to classical models of coordination with autonomous systems. We describe methods of loose coordination and information management in mission operations. We describe an information agent and information management tool suite for managing information from many sources, including autonomous agents. We present an integrated model of levels of complexity of agent and human behavior, which shows types of information processing and points of potential error in agent activities. We discuss the types of information needed for diagnosing problems and planning interactions with an autonomous system. We discuss types of coordination for which designs are needed for autonomous system functions.

  17. Information Selection in Intelligence Processing

    DTIC Science & Technology

    2011-12-01

    Thesis: Information Selection in Intelligence Processing, by Yuval Nevo, December 2011. Thesis Advisors: Moshe Kress and Nedialko B. Dimitrov.

  18. Mathematics of Information Processing and the Internet

    ERIC Educational Resources Information Center

    Hart, Eric W.

    2010-01-01

    The mathematics of information processing and the Internet can be organized around four fundamental themes: (1) access (finding information easily); (2) security (keeping information confidential); (3) accuracy (ensuring accurate information); and (4) efficiency (data compression). In this article, the author discusses each theme with reference to…

  19. Applied Information Systems. Course Five. Information Systems Curriculum.

    ERIC Educational Resources Information Center

    O'Neil, Sharon Lund; Everett, Donna R.

    This course is the fifth of seven in the Information Systems curriculum. The purpose of the course is to build on skills acquired in the earlier courses. It reviews the importance of information to management and the organization and information systems concepts within an office. These components are provided for each task area: behavioral…

  20. A chronometric functional sub-network in the thalamo-cortical system regulates the flow of neural information necessary for conscious cognitive processes.

    PubMed

    León-Domínguez, Umberto; Vela-Bueno, Antonio; Froufé-Torres, Manuel; León-Carrión, Jose

    2013-06-01

    The thalamo-cortical system has been defined as a neural network associated with consciousness. While there seems to be wide agreement that the thalamo-cortical system directly intervenes in vigilance and arousal, a divergence of opinion persists regarding its intervention in the control of other cognitive processes necessary for consciousness. In the present manuscript, we provide a review of recent scientific findings on the thalamo-cortical system and its role in the control and regulation of the flow of neural information necessary for conscious cognitive processes. We suggest that the axis formed by the medial prefrontal cortex and different thalamic nuclei (reticular nucleus, intralaminar nucleus, and midline nucleus), represents a core component for consciousness. This axis regulates different cerebral structures which allow basic cognitive processes like attention, arousal and memory to emerge. In order to produce a synchronized coherent response, neural communication between cerebral structures must have exact timing (chronometry). Thus, a chronometric functional sub-network within the thalamo-cortical system keeps us in an optimal and continuous functional state, allowing high-order cognitive processes, essential to awareness and qualia, to take place.