Science.gov

Sample records for information processing system

  1. Advanced Information Processing System (AIPS)

    NASA Technical Reports Server (NTRS)

    Pitts, Felix L.

    1993-01-01

    Advanced Information Processing System (AIPS) is a computer systems philosophy, a set of validated hardware building blocks, and a set of validated services as embodied in system software. The goal of AIPS is to provide the knowledge base that will allow achievement of validated fault-tolerant distributed computer system architectures, suitable for a broad range of applications, having failure probability requirements of 10^-9 at 10 hours. A background and description are given, followed by program accomplishments, the current focus, applications, technology transfer, FY92 accomplishments, and funding.
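
    The quoted reliability target (failure probability of 10^-9 over a 10-hour mission) can be made concrete with a standard redundancy calculation. The sketch below is not from the AIPS program; it only illustrates, under the usual assumptions of independent, exponentially distributed channel failures and an ideal voter, how a hypothetical triplex configuration compares against such a target. All rate values are made up.

    ```python
    import math

    def channel_unreliability(lam, t):
        """Probability that a single channel fails within t hours,
        assuming an exponential failure law with rate lam (failures/hour)."""
        return 1.0 - math.exp(-lam * t)

    def triplex_failure_probability(lam, t):
        """A 2-out-of-3 voted triplex fails only if two or more channels fail.
        Assumes independent channel failures and an ideal voter."""
        q = channel_unreliability(lam, t)
        return 3 * q**2 * (1 - q) + q**3

    if __name__ == "__main__":
        mission_time = 10.0          # hours, as in the AIPS requirement
        target = 1e-9                # required failure probability
        for lam in (1e-4, 1e-5, 1e-6):   # hypothetical per-channel failure rates
            p = triplex_failure_probability(lam, mission_time)
            print(f"lambda={lam:.0e}/h  P(system failure in 10 h)={p:.3e}  "
                  f"meets 1e-9 target: {p <= target}")
    ```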

  2. Information Processing in Living Systems

    NASA Astrophysics Data System (ADS)

    Tkačik, Gašper; Bialek, William

    2016-03-01

    Life depends as much on the flow of information as on the flow of energy. Here we review the many efforts to make this intuition precise. Starting with the building blocks of information theory, we explore examples where it has been possible to measure, directly, the flow of information in biological networks, or more generally where information-theoretic ideas have been used to guide the analysis of experiments. Systems of interest range from single molecules (the sequence diversity in families of proteins) to groups of organisms (the distribution of velocities in flocks of birds), and all scales in between. Many of these analyses are motivated by the idea that biological systems may have evolved to optimize the gathering and representation of information, and we review the experimental evidence for this optimization, again across a wide range of scales.

  3. Advanced information processing system: Local system services

    NASA Technical Reports Server (NTRS)

    Burkhardt, Laura; Alger, Linda; Whittredge, Roy; Stasiowski, Peter

    1989-01-01

    The Advanced Information Processing System (AIPS) is a multi-computer architecture composed of hardware and software building blocks that can be configured to meet a broad range of application requirements. The hardware building blocks are fault-tolerant, general-purpose computers, fault-and damage-tolerant networks (both computer and input/output), and interfaces between the networks and the computers. The software building blocks are the major software functions: local system services, input/output, system services, inter-computer system services, and the system manager. The foundation of the local system services is an operating system with the functions required for a traditional real-time multi-tasking computer, such as task scheduling, inter-task communication, memory management, interrupt handling, and time maintenance. Resting on this foundation are the redundancy management functions necessary in a redundant computer and the status reporting functions required for an operator interface. The functional requirements, functional design and detailed specifications for all the local system services are documented.

  4. Library Information-Processing System

    NASA Technical Reports Server (NTRS)

    1985-01-01

    System works with Library of Congress MARC II format. System composed of subsystems that provide wide range of library information-processing capabilities. Format is American National Standards Institute (ANSI) format for machine-readable bibliographic data. Adaptable to any medium-to-large library.

  5. Information Processing Capacity of Dynamical Systems

    PubMed Central

    Dambre, Joni; Verstraeten, David; Schrauwen, Benjamin; Massar, Serge

    2012-01-01

    Many dynamical systems, both natural and artificial, are stimulated by time dependent external signals, somehow processing the information contained therein. We demonstrate how to quantify the different modes in which information can be processed by such systems and combine them to define the computational capacity of a dynamical system. This is bounded by the number of linearly independent state variables of the dynamical system, equaling it if the system obeys the fading memory condition. It can be interpreted as the total number of linearly independent functions of its stimuli the system can compute. Our theory combines concepts from machine learning (reservoir computing), system modeling, stochastic processes, and functional analysis. We illustrate our theory by numerical simulations for the logistic map, a recurrent neural network, and a two-dimensional reaction diffusion system, uncovering universal trade-offs between the non-linearity of the computation and the system's short-term memory. PMID:22816038
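
    The capacity measure described above can be estimated numerically: for each target function of the input, fit the optimal linear readout of the system's state variables and record the fraction of the target's variance it reproduces; summing over an orthogonal family of targets gives a total capacity bounded by the number of independent state variables. The sketch below follows that recipe for a small driven logistic-map reservoir with Legendre-polynomial targets. It is only an illustration; the map parameter, input weights, delays, and polynomial degrees are assumptions, not values from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    T, washout = 20000, 1000
    u = rng.uniform(-1, 1, T)                      # i.i.d. input signal

    # Driven logistic-map "reservoir": a few coupled state variables.
    N = 10
    x = np.zeros((T, N))
    state = rng.uniform(0.2, 0.8, N)
    w_in = rng.uniform(0.05, 0.2, N)               # hypothetical input weights
    for t in range(T):
        state = 3.7 * state * (1 - state) + w_in * u[t]
        state = np.clip(state, 0.0, 1.0)
        x[t] = state

    def capacity(states, target):
        """Fraction of the target's variance captured by the best linear readout."""
        X = np.column_stack([states, np.ones(len(states))])
        w, *_ = np.linalg.lstsq(X, target, rcond=None)
        pred = X @ w
        return 1.0 - np.mean((target - pred) ** 2) / np.var(target)

    # Targets: Legendre polynomials (degree d) of the input delayed by k steps.
    total = 0.0
    for d in (1, 2, 3):
        for k in range(5):
            z = np.polynomial.legendre.Legendre.basis(d)(np.roll(u, k))
            total += max(capacity(x[washout:], z[washout:]), 0.0)
    print(f"summed capacity over 15 Legendre targets: {total:.2f} "
          f"(the full capacity is bounded by N = {N})")
    ```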

  6. Teaching Information Systems Development via Process Variants

    ERIC Educational Resources Information Center

    Tan, Wee-Kek; Tan, Chuan-Hoo

    2010-01-01

    Acquiring the knowledge to assemble an integrated Information System (IS) development process that is tailored to the specific needs of a project has become increasingly important. It is therefore necessary for educators to impart to students this crucial skill. However, Situational Method Engineering (SME) is an inherently complex process that…

  7. Information Processing in Decision-Making Systems

    PubMed Central

    van der Meer, Matthijs; Kurth-Nelson, Zeb; Redish, A. David

    2015-01-01

    Decisions result from an interaction between multiple functional systems acting in parallel to process information in very different ways, each with strengths and weaknesses. In this review, the authors address three action-selection components of decision-making: The Pavlovian system releases an action from a limited repertoire of potential actions, such as approaching learned stimuli. Like the Pavlovian system, the habit system is computationally fast but, unlike the Pavlovian system, permits arbitrary stimulus-action pairings. These associations are a “forward” mechanism; when a situation is recognized, the action is released. In contrast, the deliberative system is flexible but takes time to process. The deliberative system uses knowledge of the causal structure of the world to search into the future, planning actions to maximize expected rewards. Deliberation depends on the ability to imagine future possibilities, including novel situations, and it allows decisions to be taken without having previously experienced the options. Various anatomical structures have been identified that carry out the information processing of each of these systems: the hippocampus constitutes a map of the world that can be used for searching/imagining the future; dorsal striatal neurons represent situation-action associations; and the ventral striatum maintains value representations for all three systems. Each system presents vulnerabilities to pathologies that can manifest as psychiatric disorders. Understanding these systems and their relation to neuroanatomy opens up a deeper way to treat the structural problems underlying various disorders. PMID:22492194

  8. Information processing in the mammalian olfactory system.

    PubMed

    Lledo, Pierre-Marie; Gheusi, Gilles; Vincent, Jean-Didier

    2005-01-01

    Recently, modern neuroscience has made considerable progress in understanding how the brain perceives, discriminates, and recognizes odorant molecules. This growing knowledge took over when the sense of smell was no longer considered only as a matter for poetry or the perfume industry. Over the last decades, chemical senses captured the attention of scientists who started to investigate the different stages of olfactory pathways. Distinct fields such as genetics, biochemistry, cellular biology, neurophysiology, and behavior have contributed to providing a picture of how odor information is processed in the olfactory system as it moves from the periphery to higher areas of the brain. So far, the combination of these approaches has been most effective at the cellular level, but there are already signs, and even greater hope, that the same is gradually happening at the systems level. This review summarizes the current ideas concerning the cellular mechanisms and organizational strategies used by the olfactory system to process olfactory information. We present findings that exemplify the high degree of olfactory plasticity, with special emphasis on the first central relay of the olfactory system. Recent observations supporting the necessity of such plasticity for adult brain functions are also discussed. Due to space constraints, this review focuses mainly on the olfactory systems of vertebrates, and primarily those of mammals.

  9. Latency Minimizing Tasking for Information Processing Systems

    SciTech Connect

    Horey, James L; Lagesse, Brent J

    2011-01-01

    Real-time cyber-physical systems and information processing clusters require system designers to consider the total latency involved in collecting and aggregating data. For example, applications such as wild-fire monitoring require data to be presented to users in a timely manner. However, most models and algorithms for sensor networks have focused on alternative metrics such as energy efficiency. In this paper, we present a new model of sensor network aggregation that focuses on total latency. Our model is flexible and enables users to configure varying transmission and computation time on a node-by-node basis, and thus enables the simulation of complex computational phenomena. In addition, we present results from three tasking algorithms that trade off local communication for overall latency performance. These algorithms are evaluated in simulated networks of up to 200 nodes. We've presented an aggregation-focused model of sensor networks that can be used to study the trade-offs between computational coverage and total latency. Our model explicitly takes into account transmission and computation times, and enables users to define different values for the base station. In addition, we've presented three different tasking algorithms that operate over the model to produce aggregation schedules of varying quality. In the future, we expect to continue exploring distributed tasking algorithms for information processing systems. We've shown that the gap between highly optimized schedules that use global information and the schedules produced by our distributed algorithms is quite large. This gives us encouragement that future distributed tasking algorithms can still make large gains.
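
    The latency model described (per-node transmission and computation times composed over an aggregation tree) can be captured by a short recursion: a node can forward its aggregate only after its slowest child has delivered and it has finished its own computation. The sketch below is a hypothetical re-implementation of that idea, not the authors' simulator; the tree shape and timing values are invented.

    ```python
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Node:
        compute_time: float            # seconds to aggregate incoming data
        transmit_time: float           # seconds to send the result to the parent
        children: List["Node"] = field(default_factory=list)

    def completion_latency(node: Node) -> float:
        """Time until this node's aggregate has been transmitted to its parent.
        The node must wait for its slowest child before it can compute."""
        wait = max((completion_latency(c) for c in node.children), default=0.0)
        return wait + node.compute_time + node.transmit_time

    # Toy 7-node aggregation tree rooted at the base station (hypothetical values).
    leaves = [Node(0.01, 0.05) for _ in range(4)]
    relays = [Node(0.02, 0.05, leaves[:2]), Node(0.02, 0.05, leaves[2:])]
    base   = Node(0.05, 0.0, relays)     # the base station does not transmit further
    print(f"end-to-end aggregation latency: {completion_latency(base):.3f} s")
    ```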

  10. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  11. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  12. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  13. Atmospheric and Oceanographic Information Processing System (AOIPS) system description

    NASA Technical Reports Server (NTRS)

    Bracken, P. A.; Dalton, J. T.; Billingsley, J. B.; Quann, J. J.

    1977-01-01

    The development of hardware and software for an interactive, minicomputer based processing and display system for atmospheric and oceanographic information extraction and image data analysis is described. The major applications of the system are discussed as well as enhancements planned for the future.

  14. Advanced information processing system: Input/output system services

    NASA Technical Reports Server (NTRS)

    Masotto, Tom; Alger, Linda

    1989-01-01

    The functional requirements and detailed specifications for the Input/Output (I/O) Systems Services of the Advanced Information Processing System (AIPS) are discussed. The introductory section is provided to outline the overall architecture and functional requirements of the AIPS system. Section 1.1 gives a brief overview of the AIPS architecture as well as a detailed description of the AIPS fault tolerant network architecture, while section 1.2 provides an introduction to the AIPS systems software. Sections 2 and 3 describe the functional requirements and design and detailed specifications of the I/O User Interface and Communications Management modules of the I/O System Services, respectively. Section 4 illustrates the use of the I/O System Services, while Section 5 concludes with a summary of results and suggestions for future work in this area.

  15. Model for Process Description: From Picture to Information System

    NASA Technical Reports Server (NTRS)

    Zak, A.

    1996-01-01

    A new model for the development of process information systems is proposed. It is robust and inexpensive, capable of providing timely, necessary information to the user by integrating Products, Instructions, Examples, Tools, and Process.

  16. Study on advanced information processing system

    NASA Technical Reports Server (NTRS)

    Shin, Kang G.; Liu, Jyh-Charn

    1992-01-01

    Issues related to the reliability of a redundant system with large main memory are addressed. In particular, the Fault-Tolerant Processor (FTP) for Advanced Launch System (ALS) is used as a basis for our presentation. When the system is free of latent faults, the probability of system crash due to nearly-coincident channel faults is shown to be insignificant even when the outputs of computing channels are infrequently voted on. In particular, using channel error maskers (CEMs) is shown to improve reliability more effectively than increasing the number of channels for applications with long mission times. Even without using a voter, most memory errors can be immediately corrected by CEMs implemented with conventional coding techniques. In addition to their ability to enhance system reliability, CEMs--with a low hardware overhead--can be used to reduce not only the need of memory realignment, but also the time required to realign channel memories in case, albeit rare, such a need arises. Using CEMs, we have developed two schemes, called Scheme 1 and Scheme 2, to solve the memory realignment problem. In both schemes, most errors are corrected by CEMs, and the remaining errors are masked by a voter.
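
    The voting and error-masking ideas in this study rest on the standard mechanism of bit-wise majority voting across redundant channel memories: a transient error in one channel is outvoted by the other two, which is also why realignment can be avoided for most single-channel upsets. The sketch below illustrates only that generic mechanism with hypothetical word values; it is not the Scheme 1 / Scheme 2 design from the report.

    ```python
    def majority_vote(word_a: int, word_b: int, word_c: int) -> int:
        """Bit-wise 2-out-of-3 majority over three redundant memory words.
        A bit is set in the result iff it is set in at least two channels."""
        return (word_a & word_b) | (word_a & word_c) | (word_b & word_c)

    # One channel suffers a transient single-bit upset; the voter masks it.
    golden = 0b1011_0101
    corrupted = golden ^ 0b0000_1000       # bit 3 flipped in channel B
    voted = majority_vote(golden, corrupted, golden)
    assert voted == golden
    print(f"voted word restores the correct value: {voted:#010b}")
    ```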

  17. Living is information processing: from molecules to global systems.

    PubMed

    Farnsworth, Keith D; Nelson, John; Gershenson, Carlos

    2013-06-01

    We extend the concept that life is an informational phenomenon, at every level of organisation, from molecules to the global ecological system. According to this thesis: (a) living is information processing, in which memory is maintained by both molecular states and ecological states as well as the more obvious nucleic acid coding; (b) this information processing has one overall function: to perpetuate itself; and (c) the processing method is filtration (cognition) of, and synthesis of, information at lower levels to appear at higher levels in complex systems (emergence). We show how information patterns are united by the creation of mutual context, generating persistent consequences, to result in 'functional information'. This constructive process forms arbitrarily large complexes of information, the combined effects of which include the functions of life. Molecules and simple organisms have already been measured in terms of functional information content; we show how quantification may be extended to each level of organisation up to the ecological. In terms of a computer analogy, life is both the data and the program, and its biochemical structure is the way the information is embodied. This idea supports the seamless integration of life at all scales with the physical universe. The innovation reported here is essentially to integrate these ideas, basing information on the 'general definition' of information, rather than simply the statistics of information, thereby explaining how functional information operates throughout life. PMID:23456459

  18. Statistical process control based chart for information systems security

    NASA Astrophysics Data System (ADS)

    Khan, Mansoor S.; Cui, Lirong

    2015-07-01

    Intrusion detection systems have a highly significant role in securing computer networks and information systems. To assure the reliability and quality of computer networks and information systems, it is highly desirable to develop techniques that detect intrusions into information systems. We put forward the concept of statistical process control (SPC) for detecting intrusions into computer networks and information systems. In this article we propose an exponentially weighted moving average (EWMA)-type quality monitoring scheme. Our proposed scheme has only one parameter, which differentiates it from past versions. We construct the control limits for the proposed scheme and investigate their effectiveness. We provide an industrial example for the sake of clarity for practitioners. We give a comparison of the proposed scheme with existing EWMA schemes and the p chart; finally, we provide some recommendations for future work.
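
    An EWMA monitoring scheme of the general kind described keeps an exponentially weighted average of an observed statistic (for example, a connection count or alert rate) and signals when it leaves control limits derived from the in-control mean and variance. The sketch below is a generic one-parameter EWMA chart, not the authors' specific scheme; the smoothing weight and the simulated traffic statistics are assumptions.

    ```python
    import math
    import random

    def ewma_chart(xs, mu0, sigma0, lam=0.2, L=3.0):
        """Generic EWMA control chart.
        xs: observed statistic per time step; mu0, sigma0: in-control mean/std;
        lam: smoothing weight in (0, 1]; L: control-limit width in sigmas.
        Yields (t, ewma_value, out_of_control)."""
        z = mu0
        for t, x in enumerate(xs, start=1):
            z = lam * x + (1 - lam) * z
            # Exact variance of the EWMA statistic after t observations.
            var = sigma0**2 * (lam / (2 - lam)) * (1 - (1 - lam) ** (2 * t))
            limit = L * math.sqrt(var)
            yield t, z, abs(z - mu0) > limit

    random.seed(1)
    baseline = [random.gauss(100, 10) for _ in range(50)]        # normal traffic
    attack   = [random.gauss(130, 10) for _ in range(20)]        # shifted mean
    for t, z, alarm in ewma_chart(baseline + attack, mu0=100, sigma0=10):
        if alarm:
            print(f"alarm raised at observation {t} (EWMA = {z:.1f})")
            break
    ```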

  19. Information processing using a single dynamical node as complex system

    PubMed Central

    Appeltant, L.; Soriano, M.C.; Van der Sande, G.; Danckaert, J.; Massar, S.; Dambre, J.; Schrauwen, B.; Mirasso, C.R.; Fischer, I.

    2011-01-01

    Novel methods for information processing are highly desired in our information-driven society. Inspired by the brain's ability to process information, the recently introduced paradigm known as 'reservoir computing' shows that complex networks can efficiently perform computation. Here we introduce a novel architecture that reduces the usually required large number of elements to a single nonlinear node with delayed feedback. Through an electronic implementation, we experimentally and numerically demonstrate excellent performance in a speech recognition benchmark. Complementary numerical studies also show excellent performance for a time series prediction benchmark. These results prove that delay-dynamical systems, even in their simplest manifestation, can perform efficient information processing. This finding paves the way to feasible and resource-efficient technological implementations of reservoir computing. PMID:21915110
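
    The delay-based reservoir idea can be reduced to a few lines: a single nonlinear node is driven by a time-multiplexed, masked version of the input, its delayed responses are collected as "virtual nodes", and only a linear readout is trained. The sketch below is a minimal numerical caricature on a toy one-step prediction task; the tanh nonlinearity, mask, feedback gains, and sizes are assumptions rather than the paper's electronic implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    T, N_virtual = 3000, 50                  # time steps and virtual nodes per step
    u = np.sin(0.2 * np.arange(T)) + 0.1 * rng.standard_normal(T)
    mask = rng.choice([-0.5, 0.5], N_virtual)   # fixed random input mask

    # Single nonlinear node with delayed feedback, time-multiplexed over the mask.
    states = np.zeros((T, N_virtual))
    x_prev = np.zeros(N_virtual)             # node values one delay loop earlier
    for t in range(T):
        x = np.empty(N_virtual)
        prev = x_prev[-1]
        for i in range(N_virtual):
            # Node response: recent output + delayed feedback + masked input.
            prev = np.tanh(0.8 * prev + 0.5 * x_prev[i] + mask[i] * u[t])
            x[i] = prev
        states[t] = x
        x_prev = x

    # Train a linear readout (ridge regression) to predict u[t+1].
    washout = 100
    X = np.column_stack([states[washout:-1], np.ones(T - washout - 1)])
    y = u[washout + 1:]
    W = np.linalg.solve(X.T @ X + 1e-6 * np.eye(X.shape[1]), X.T @ y)
    nmse = np.mean((X @ W - y) ** 2) / np.var(y)
    print(f"one-step prediction NMSE: {nmse:.4f}")
    ```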

  20. Risk Informed Design as Part of the Systems Engineering Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This slide presentation reviews the importance of Risk Informed Design (RID) as an important feature of the systems engineering process. RID is based on the principle that risk is a design commodity such as mass, volume, cost or power. It also reviews Probabilistic Risk Assessment (PRA) as it is used in the product life cycle in the development of NASA's Constellation Program.

  1. Information systems for material flow management in construction processes

    NASA Astrophysics Data System (ADS)

    Mesároš, P.; Mandičák, T.

    2015-01-01

    The article describes the options for the management of material flows in the construction process. Management and resource planning is one of the key factors influencing the effectiveness of a construction project. It is very difficult to set these flows correctly. The current period offers several options and tools for doing so. Information systems and their modules can be used for the management of materials in the construction process.

  2. Advanced information processing system: Input/output network management software

    NASA Technical Reports Server (NTRS)

    Nagle, Gail; Alger, Linda; Kemp, Alexander

    1988-01-01

    The purpose of this document is to provide the software requirements and specifications for the Input/Output Network Management Services for the Advanced Information Processing System. This introduction and overview section is provided to briefly outline the overall architecture and software requirements of the AIPS system before discussing the details of the design requirements and specifications of the AIPS I/O Network Management software. A brief overview of the AIPS architecture is followed by a more detailed description of the network architecture.

  3. Specifications for a Federal Information Processing Standard Data Dictionary System

    NASA Technical Reports Server (NTRS)

    Goldfine, A.

    1984-01-01

    The development of a software specification that Federal agencies may use in evaluating and selecting data dictionary systems (DDS) is discussed. To supply the flexibility needed by widely different applications and environments in the Federal Government, the Federal Information Processing Standard (FIPS) specifies a core DDS together with an optional set of modules. The focus and status of the development project are described. Functional specifications for the FIPS DDS are examined for the dictionary, the dictionary schema, and the dictionary processing system. The DDS user interfaces and DDS software interfaces are discussed as well as dictionary administration.

  4. Neural Mechanisms and Information Processing in Recognition Systems

    PubMed Central

    Ozaki, Mamiko; Hefetz, Abraham

    2014-01-01

    Nestmate recognition is a hallmark of social insects. It is based on the match/mismatch of an identity signal carried by members of the society with that of the perceiving individual. While the behavioral response, amicable or aggressive, is very clear, the neural systems underlying recognition are not fully understood. Here we contrast two alternative hypotheses for the neural mechanisms that are responsible for the perception and information processing in recognition. We focus on recognition via chemical signals, as the common modality in social insects. The first, classical, hypothesis states that upon perception of recognition cues by the sensory system the information is passed as is to the antennal lobes and to higher brain centers where the information is deciphered and compared to a neural template. Match or mismatch information is then transferred to some behavior-generating centers where the appropriate response is elicited. An alternative hypothesis, that of “pre-filter mechanism”, posits that the decision as to whether to pass on the information to the central nervous system takes place in the peripheral sensory system. We suggest that, through sensory adaptation, only alien signals are passed on to the brain, specifically to an “aggressive-behavior-switching center”, where the response is generated if the signal is above a certain threshold. PMID:26462936

  5. Information Systems to Support a Decision Process at Stanford.

    ERIC Educational Resources Information Center

    Chaffee, Ellen Earle

    1982-01-01

    When a rational decision process is desired, information specialists can contribute information and also contribute to the process in which that information is used, thereby promoting rational decision-making. The contribution of Stanford's information specialists to rational decision-making is described. (MLW)

  6. A Practical Approach to Process Support in Health Information Systems

    PubMed Central

    Lenz, Richard; Elstner, Thomas; Siegele, Hannes; Kuhn, Klaus A.

    2002-01-01

    This article describes the design of a generator tool for rapid application development. The generator tool is an integral part of a healthcare information system, and newly developed applications are embedded into the healthcare information system from the very beginning. The tool-generated applications are based on a document oriented user interaction paradigm. A significant feature is the support of intra- and interdepartmental clinical processes by means of providing document flow between different user groups. For flexible storage of newly developed applications, a generic EAV-type (Entity-Attribute-Value) database schema is used. Important aspects of a consequent implementation, like database representation of structured documents, document flow, versioning, and synchronization are presented. Applications generated by this approach are in routine use in more than 200 hospitals in Germany. PMID:12386109

  7. Applications of the generalized information processing system (GIPSY)

    USGS Publications Warehouse

    Moody, D.W.; Kays, Olaf

    1972-01-01

    The Generalized Information Processing System (GIPSY) stores and retrieves variable-field, variable-length records consisting of numeric data, textual data, or codes. A particularly noteworthy feature of GIPSY is its ability to search records for words, word stems, prefixes, and suffixes as well as for numeric values. Moreover, retrieved records may be printed on pre-defined formats or formatted as fixed-field, fixed-length records for direct input to other programs, which facilitates the exchange of data with other systems. At present there are some 22 applications of GIPSY falling in the general areas of bibliography, natural resources information, and management science. This report presents a description of each application including a sample input form, dictionary, and a typical formatted record. It is hoped that these examples will stimulate others to experiment with innovative uses of computer technology.

  8. MISSE in the Materials and Processes Technical Information System (MAPTIS )

    NASA Technical Reports Server (NTRS)

    Burns, DeWitt; Finckenor, Miria; Henrie, Ben

    2013-01-01

    Materials International Space Station Experiment (MISSE) data is now being collected and distributed through the Materials and Processes Technical Information System (MAPTIS) at Marshall Space Flight Center in Huntsville, Alabama. MISSE data has been instrumental in many programs and continues to be an important source of data for the space community. To facilitate greater access to the MISSE data, the International Space Station (ISS) program office and MAPTIS are working to gather this data into a central location. The MISSE database contains information about materials, samples, and flights, along with pictures, PDFs, Excel files, Word documents, and other file types. Major capabilities of the system are: access control, browsing, searching, reports, and record comparison. The search capabilities will search within any searchable files, so data can still be retrieved even if the desired metadata has not been associated. Other functionality will continue to be added to the MISSE database as the Athena Platform is expanded.

  9. Quantum-information processing in disordered and complex quantum systems

    SciTech Connect

    Sen, Aditi; Sen, Ujjwal; Ahufinger, Veronica; Briegel, Hans J.; Sanpera, Anna; Lewenstein, Maciej

    2006-12-15

    We study quantum information processing in complex disordered many-body systems that can be implemented by using lattices of ultracold atomic gases and trapped ions. We demonstrate, first in the short range case, the generation of entanglement and the local realization of quantum gates in a disordered magnetic model describing a quantum spin glass. We show that in this case it is possible to achieve fidelities of quantum gates higher than in the classical case. Complex systems with long range interactions, such as ion chains or dipolar atomic gases, can be used to model neural network Hamiltonians. For such systems, where both long range interactions and disorder appear, it is possible to generate long range bipartite entanglement. We provide an efficient analytical method to calculate the time evolution of a given initial state, which in turn allows us to calculate its quantum correlations.

  10. Evaluation methodologies for an advanced information processing system

    NASA Technical Reports Server (NTRS)

    Schabowsky, R. S., Jr.; Gai, E.; Walker, B. K.; Lala, J. H.; Motyka, P.

    1984-01-01

    The system concept and requirements for an Advanced Information Processing System (AIPS) are briefly described, but the emphasis of this paper is on the evaluation methodologies being developed and utilized in the AIPS program. The evaluation tasks include hardware reliability, maintainability and availability, software reliability, performance, and performability. Hardware RMA and software reliability are addressed with Markov modeling techniques. The performance analysis for AIPS is based on queueing theory. Performability is a measure of merit which combines system reliability and performance measures. The probability laws of the performance measures are obtained from the Markov reliability models. Scalar functions of this law such as the mean and variance provide measures of merit in the AIPS performability evaluations.
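
    The Markov reliability modelling mentioned above amounts to writing a state-transition-rate matrix over fault states and integrating it over the mission time; the probability mass remaining in operational states is the reliability, and weighting each state by a performance level gives a simple performability figure. The sketch below is a generic three-state example (triplex to duplex to failed) with hypothetical rates and coverage, not the actual AIPS models.

    ```python
    import numpy as np
    from scipy.linalg import expm

    # States: 0 = triplex (3 good channels), 1 = duplex, 2 = system failed.
    lam = 1e-4          # hypothetical per-channel failure rate (per hour)
    c   = 0.999         # coverage: probability a fault is detected and masked
    Q = np.array([
        [-3 * lam,  3 * lam * c,  3 * lam * (1 - c)],
        [0.0,      -2 * lam,      2 * lam          ],
        [0.0,       0.0,          0.0              ],
    ])

    t = 10.0                                       # mission time in hours
    p = np.array([1.0, 0.0, 0.0]) @ expm(Q * t)    # state probabilities at time t
    reliability = p[0] + p[1]
    performance = np.array([1.0, 0.7, 0.0])        # hypothetical performance levels
    performability = p @ performance
    print(f"reliability(10 h)    = {reliability:.9f}")
    print(f"performability(10 h) = {performability:.6f}")
    ```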

  11. Cell/tissue processing information system for regenerative medicine.

    PubMed

    Iwayama, Daisuke; Yamato, Masayuki; Tsubokura, Tetsuya; Takahashi, Minoru; Okano, Teruo

    2014-04-01

    When conducting clinical studies of regenerative medicine, compliance with good manufacturing practice (GMP) is mandatory, and thus much time is needed for manufacturing and quality management. It is therefore desirable to introduce the manufacturing execution system (MES), which is being adopted by factories manufacturing pharmaceutical products. Meanwhile, in manufacturing human cell/tissue processing autologous products, it is necessary to protect patients' personal information, prevent patients from being identified and obtain information for cell/tissue identification. We therefore considered it difficult to adopt conventional MES for regenerative medicine-related clinical trials, and so developed novel software for production/quality management to be used in cell-processing centres (CPCs), conforming to GMP. Since this system satisfies the requirements of regulations in Japan and the USA for electronic records and electronic signatures (ER/ES), the use of ER/ES has been allowed, and the risk of contamination resulting from the use of recording paper has been eliminated, thanks to paperless operations within the CPC. Moreover, to reduce the risk of mix-up and cross-contamination due to contact during production, we developed a touchless input device with built-in radio frequency identification (RFID) reader-writer devices and optical sensors. The use of this system reduced the time to prepare and issue manufacturing instructions by 50% or more, compared to the conventional handwritten system. The system contributes to larger-scale production and to reducing production costs for cell and tissue products in regenerative medicine. Copyright © 2014 John Wiley & Sons, Ltd. PMID:24700532

  12. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    NASA Astrophysics Data System (ADS)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
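
    The Petri-net foundation referred to here (places holding tokens, transitions that fire when all of their input places are marked) can be shown with a very small interpreter. The sketch below is a plain place/transition net, not the XML-net variant or the authors' toolset; the example workflow and its names are made up.

    ```python
    class PetriNet:
        def __init__(self, marking):
            self.marking = dict(marking)          # place -> token count
            self.transitions = {}                 # name -> (input places, output places)

        def add_transition(self, name, inputs, outputs):
            self.transitions[name] = (inputs, outputs)

        def enabled(self, name):
            inputs, _ = self.transitions[name]
            return all(self.marking.get(p, 0) >= 1 for p in inputs)

        def fire(self, name):
            if not self.enabled(name):
                raise ValueError(f"transition {name!r} is not enabled")
            inputs, outputs = self.transitions[name]
            for p in inputs:
                self.marking[p] -= 1
            for p in outputs:
                self.marking[p] = self.marking.get(p, 0) + 1

    # Tiny order-handling process (hypothetical place and transition names).
    net = PetriNet({"order_received": 1, "clerk_free": 1})
    net.add_transition("check_order", ["order_received", "clerk_free"],
                       ["order_checked"])
    net.add_transition("ship_order", ["order_checked"],
                       ["order_shipped", "clerk_free"])
    for t in ("check_order", "ship_order"):
        net.fire(t)
    print(net.marking)   # tokens now sit in 'order_shipped' and 'clerk_free'
    ```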

  13. Advanced information processing system: Inter-computer communication services

    NASA Technical Reports Server (NTRS)

    Burkhardt, Laura; Masotto, Tom; Sims, J. Terry; Whittredge, Roy; Alger, Linda S.

    1991-01-01

    The purpose is to document the functional requirements and detailed specifications for the Inter-Computer Communications Services (ICCS) of the Advanced Information Processing System (AIPS). An introductory section is provided to outline the overall architecture and functional requirements of the AIPS and to present an overview of the ICCS. An overview of the AIPS architecture as well as a brief description of the AIPS software is given. The guarantees of the ICCS are provided, and the ICCS is described as a seven-layered International Standards Organization (ISO) Model. The ICCS functional requirements, functional design, and detailed specifications as well as each layer of the ICCS are also described. A summary of results and suggestions for future work are presented.

  14. Enzyme-based logic systems for information processing.

    PubMed

    Katz, Evgeny; Privman, Vladimir

    2010-05-01

    In this critical review we survey enzymatic systems that involve biocatalytic reactions utilized for information processing (biocomputing). Extensive ongoing research in biocomputing, mimicking Boolean logic gates, has been motivated by potential applications in biotechnology and medicine. Furthermore, novel sensor concepts have been contemplated with multiple inputs processed biochemically before the final output is coupled to transducing "smart-material" electrodes and other systems. These applications have warranted recent emphasis on networking of biocomputing gates. The first few-gate networks have been experimentally realized, including coupling, for instance, to signal-responsive electrodes for signal readout. In order to achieve scalable, stable network design and functioning, considerations of noise propagation and control have been initiated as a new research direction. Optimization of single enzyme-based gates for avoiding analog noise amplification has been explored, as were certain network-optimization concepts. We review and exemplify these developments, as well as offer an outlook for possible future research foci. The latter include design and uses of non-Boolean network elements, e.g., filters, as well as other developments motivated by potential novel sensor and biotechnology applications (136 references).
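
    The review's point about suppressing analog noise can be illustrated with the usual modelling device: a gate's output is a smooth function of its chemical inputs, and composing it with a sigmoidal "filter" element pushes intermediate values toward clean logic-0/1 levels so that noise does not accumulate across a network. The sketch below is a generic numerical illustration with made-up response and filter functions; it does not model any specific enzyme system discussed in the review.

    ```python
    import math
    import random

    def and_gate(a: float, b: float) -> float:
        """Idealized analog AND: output tracks the product of normalized inputs."""
        return a * b

    def sigmoid_filter(y: float, threshold: float = 0.5, steepness: float = 12.0) -> float:
        """Hill-like filter that pushes intermediate signals toward 0 or 1."""
        return 1.0 / (1.0 + math.exp(-steepness * (y - threshold)))

    random.seed(0)
    noise = lambda: random.gauss(0.0, 0.08)
    for logic_a, logic_b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        a = min(max(logic_a + noise(), 0.0), 1.0)   # noisy chemical input levels
        b = min(max(logic_b + noise(), 0.0), 1.0)
        raw = and_gate(a, b)
        filtered = sigmoid_filter(raw)
        print(f"inputs ({logic_a},{logic_b}): raw={raw:.2f}  filtered={filtered:.2f}")
    ```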

  15. BOOK REVIEW: Theory of Neural Information Processing Systems

    NASA Astrophysics Data System (ADS)

    Galla, Tobias

    2006-04-01

    It is difficult not to be amazed by the ability of the human brain to process, to structure and to memorize information. Even by the toughest standards the behaviour of this network of about 10^11 neurons qualifies as complex, and both the scientific community and the public take great interest in the growing field of neuroscience. The scientific endeavour to learn more about the function of the brain as an information processing system is here a truly interdisciplinary one, with important contributions from biology, computer science, physics, engineering and mathematics as the authors quite rightly point out in the introduction of their book. The role of the theoretical disciplines here is to provide mathematical models of information processing systems and the tools to study them. These models and tools are at the centre of the material covered in the book by Coolen, Kühn and Sollich. The book is divided into five parts, providing basic introductory material on neural network models as well as the details of advanced techniques to study them. A mathematical appendix complements the main text. The range of topics is extremely broad, still the presentation is concise and the book well arranged. To stress the breadth of the book let me just mention a few keywords here: the material ranges from the basics of perceptrons and recurrent network architectures to more advanced aspects such as Bayesian learning and support vector machines; Shannon's theory of information and the definition of entropy are discussed, and a chapter on Amari's information geometry is not missing either. Finally the statistical mechanics chapters cover Gardner theory and the replica analysis of the Hopfield model, not without being preceded by a brief introduction of the basic concepts of equilibrium statistical physics. The book also contains a part on effective theories of the macroscopic dynamics of neural networks. Many dynamical aspects of neural networks are usually hard to find in the

  16. Two Molecular Information Processing Systems Based on Catalytic Nucleic Acids

    NASA Astrophysics Data System (ADS)

    Stojanovic, Milan

    Mixtures of molecules are capable of powerful information processing [1]. This statement is in the following way self-evident: it is a hierarchically organized complex mixture of molecules that is formulating it to other similarly organized mixtures of molecules. By making such a statement I am not endorsing the extreme forms of reductionism; rather, I am making what I think is a small first step towards harnessing the information processing prowess of molecules and, hopefully, overcoming some limitations of more traditional computing paradigms. There are different ideas on how to understand and use molecular information processing abilities and I will list some below. My list is far from inclusive, and delineations are far from clear-cut; whenever available, I will provide examples from our research efforts. I should stress, for a computer science audience, that I am a chemist. Thus, my approach may have a much different focus and mathematical rigor than it would if taken by a computer scientist.

  17. Information processing in the primate visual system - An integrated systems perspective

    NASA Technical Reports Server (NTRS)

    Van Essen, David C.; Anderson, Charles H.; Felleman, Daniel J.

    1992-01-01

    The primate visual system contains dozens of distinct areas in the cerebral cortex and several major subcortical structures. These subdivisions are extensively interconnected in a distributed hierarchical network that contains several intertwined processing streams. A number of strategies are used for efficient information processing within this hierarchy. These include linear and nonlinear filtering, passage through information bottlenecks, and coordinated use of multiple types of information. In addition, dynamic regulation of information flow within and between visual areas may provide the computational flexibility needed for the visual system to perform a broad spectrum of tasks accurately and at high resolution.

  18. Information Processing.

    ERIC Educational Resources Information Center

    Jennings, Carol Ann; McDonald, Sandy

    This publication contains instructional materials for teacher and student use for a course in information processing. The materials are written in terms of student performance using measurable objectives. The course includes 10 units. Each instructional unit contains some or all of the basic components of a unit of instruction: performance…

  19. Dimension of physical systems, information processing, and thermodynamics

    NASA Astrophysics Data System (ADS)

    Brunner, Nicolas; Kaplan, Marc; Leverrier, Anthony; Skrzypczyk, Paul

    2014-12-01

    We ask how quantum theory compares to more general physical theories from the point of view of dimension. To do so, we first give two model-independent definitions of the dimension of physical systems, based on measurements and the capacity of storing information. While both definitions are equivalent in classical and quantum mechanics, they are different in generalized probabilistic theories. We discuss in detail the case of a theory known as ‘boxworld’, and show that such a theory features systems with dimension mismatch. This dimension mismatch can be made arbitrarily large using an amplification procedure. Furthermore, we show that the dimension mismatch of boxworld has strong consequences on its power for performing information-theoretic tasks, leading to the collapse of communication complexity and to the violation of information causality. Finally, we discuss the consequences of a dimension mismatch from the perspective of thermodynamics, and ask whether this effect could break Landauer's erasure principle and thus the second law.

  20. Nonlinear Information Processing in a Model Sensory System

    PubMed Central

    Chacron, Maurice J.

    2016-01-01

    Understanding the mechanisms by which sensory neurons encode and decode information remains an important goal in neuroscience. We quantified the performance of optimal linear and nonlinear encoding models in a well-characterized sensory system: the electric sense of weakly electric fish. We show that linear encoding models generally perform better under spatially localized stimulation than under spatially diffuse stimulation. Through pharmacological blockade of feedback input and spatial saturation of the receptive field center, we show that there is significantly less synaptic noise under spatially diffuse stimuli as compared with spatially localized stimuli. Modeling results suggest that pyramidal cells nonlinearly encode sensory information through shunting in their dendrites and clarify the influence of synaptic noise on the performance of linear encoding models. Finally, we used information theory to quantify the performance of linear decoders. While the optimal linear decoder for spatially localized stimuli could capture 60% of the information in pyramidal cell spike trains, the optimal linear decoder for spatially diffuse stimuli could only capture 40% of the information. These results show that nonlinear decoders are necessary to fully access information in pyramidal cell spike trains, and we discuss potential mechanisms by which higher-order neurons could decode this information. PMID:16495358

  1. Information theory and signal transduction systems: from molecular information processing to network inference.

    PubMed

    Mc Mahon, Siobhan S; Sim, Aaron; Filippi, Sarah; Johnson, Robert; Liepe, Juliane; Smith, Dominic; Stumpf, Michael P H

    2014-11-01

    Sensing and responding to the environment are two essential functions that all biological organisms need to master for survival and successful reproduction. Developmental processes are marshalled by a diverse set of signalling and control systems, ranging from systems with simple chemical inputs and outputs to complex molecular and cellular networks with non-linear dynamics. Information theory provides a powerful and convenient framework in which such systems can be studied; but it also provides the means to reconstruct the structure and dynamics of molecular interaction networks underlying physiological and developmental processes. Here we supply a brief description of its basic concepts and introduce some useful tools for systems and developmental biologists. Along with a brief but thorough theoretical primer, we demonstrate the wide applicability and biological application-specific nuances by way of different illustrative vignettes. In particular, we focus on the characterisation of biological information processing efficiency, examining cell-fate decision making processes, gene regulatory network reconstruction, and efficient signal transduction experimental design.

  2. 10 CFR 1017.28 - Processing on Automated Information Systems (AIS).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    UNCLASSIFIED CONTROLLED NUCLEAR INFORMATION, Physical Protection Requirements, § 1017.28 Processing on Automated Information Systems (AIS). UCNI may be processed or produced on any AIS that complies with the guidance in...

  3. 45 CFR 205.35 - Mechanized claims processing and information retrieval systems; definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    § 205.35 Mechanized claims processing and information retrieval systems; definitions. Section 205.35 through 205.38 contain State plan requirements for an automated statewide management information system, conditions for FFP and...: (a) A mechanized claims processing and information retrieval system, hereafter referred to as...

  4. 42 CFR 433.116 - FFP for operation of mechanized claims processing and information retrieval systems.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    FISCAL ADMINISTRATION, Mechanized Claims Processing and Information Retrieval Systems, § 433.116 FFP for operation of mechanized claims processing and information retrieval systems. (a) Subject to 42 CFR 433.113(c...

  5. An information theoretic model of information processing in the Drosophila olfactory system: the role of inhibitory neurons for system efficiency.

    PubMed

    Faghihi, Faramarz; Kolodziejski, Christoph; Fiala, André; Wörgötter, Florentin; Tetzlaff, Christian

    2013-12-20

    Fruit flies (Drosophila melanogaster) rely on their olfactory system to process environmental information. This information has to be transmitted without system-relevant loss by the olfactory system to deeper brain areas for learning. Here we study the role of several parameters of the fly's olfactory system and the environment and how they influence olfactory information transmission. We have designed an abstract model of the antennal lobe, the mushroom body and the inhibitory circuitry. Mutual information between the olfactory environment, simulated in terms of different odor concentrations, and a sub-population of intrinsic mushroom body neurons (Kenyon cells) was calculated to quantify the efficiency of information transmission. With this method we study, on the one hand, the effect of different connectivity rates between olfactory projection neurons and firing thresholds of Kenyon cells. On the other hand, we analyze the influence of inhibition on mutual information between environment and mushroom body. Our simulations show an expected linear relation between the connectivity rate between the antennal lobe and the mushroom body and firing threshold of the Kenyon cells to obtain maximum mutual information for both low and high odor concentrations. However, contradicting everyday experience, high odor concentrations cause a drastic, and unrealistic, decrease in mutual information for all connectivity rates compared to low concentration. But when inhibition on the mushroom body is included, mutual information remains at high levels independent of other system parameters. This finding points to a pivotal role of inhibition in fly information processing, without which the system's efficiency would be substantially reduced.
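
    The mutual-information measure used in this model can be reproduced in miniature: simulate a population of threshold units driven by an odor-concentration variable, with and without a global inhibitory term, and compare the mutual information between concentration and population response using a plug-in estimator. The sketch below is a deliberately crude stand-in with invented parameters, meant only to show the computation, not the published model.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def mutual_information(x_labels, y_labels):
        """Plug-in estimate of I(X;Y) in bits from paired discrete samples."""
        xs = np.unique(x_labels, return_inverse=True)[1]
        ys = np.unique(y_labels, return_inverse=True)[1]
        joint = np.zeros((xs.max() + 1, ys.max() + 1))
        for i, j in zip(xs, ys):
            joint[i, j] += 1
        joint /= joint.sum()
        px, py = joint.sum(1, keepdims=True), joint.sum(0, keepdims=True)
        nz = joint > 0
        return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

    def simulate(concentrations, inhibition, n_cells=20, trials=4000):
        """Binary 'Kenyon cell' population; the response is the active-cell count."""
        labels, counts = [], []
        for _ in range(trials):
            c = rng.choice(concentrations)
            drive = c * rng.uniform(0.5, 1.5, n_cells)      # noisy feedforward input
            if inhibition:
                drive = drive - 0.8 * drive.mean()          # global inhibitory term
            active = drive > 0.3                            # firing threshold
            labels.append(c)
            counts.append(int(active.sum()))
        return mutual_information(np.array(labels), np.array(counts))

    concs = [0.2, 0.5, 1.0, 2.0, 4.0]     # hypothetical odor concentrations
    print("MI without inhibition:", round(simulate(concs, False), 2), "bits")
    print("MI with inhibition:   ", round(simulate(concs, True), 2), "bits")
    ```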

  6. Integrated System Technologies for Modular Trapped Ion Quantum Information Processing

    NASA Astrophysics Data System (ADS)

    Crain, Stephen G.

    Although trapped ion technology is well-suited for quantum information science, scalability of the system remains one of the main challenges. One of the challenges associated with scaling the ion trap quantum computer is the ability to individually manipulate the increasing number of qubits. Using micro-mirrors fabricated with micro-electromechanical systems (MEMS) technology, laser beams are focused on individual ions in a linear chain and steer the focal point in two dimensions. Multiple single qubit gates are demonstrated on trapped 171Yb+ qubits and the gate performance is characterized using quantum state tomography. The system features negligible crosstalk to neighboring ions (< 3e-4), and switching speeds comparable to typical single qubit gate times (< 2 μs). In a separate experiment, photons scattered from the 171Yb+ ion are coupled into an optical fiber with 63% efficiency using a high numerical aperture lens (0.6 NA). The coupled photons are directed to superconducting nanowire single photon detectors (SNSPD), which provide a higher detector efficiency (69%) compared to traditional photomultiplier tubes (35%). The total system photon collection efficiency is increased from 2.2% to 3.4%, which allows for fast state detection of the qubit. For a detection beam intensity of 11 mW/cm^2, the average detection time is 23.7 μs with 99.885(7)% detection fidelity. The technologies demonstrated in this thesis can be integrated to form a single quantum register with all of the necessary resources to perform local gates as well as high fidelity readout and provide a photon link to other systems.

  7. 45 CFR 205.35 - Mechanized claims processing and information retrieval systems; definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    § 205.35 Mechanized claims processing and information retrieval systems; definitions. Section 205.35 through 205.38 contain State plan requirements for an automated statewide management information system, conditions for FFP...

  8. Microelectronic Information Processing Systems: Computing Systems. Summary of Awards Fiscal Year 1994.

    ERIC Educational Resources Information Center

    National Science Foundation, Arlington, VA. Directorate for Computer and Information Science and Engineering.

    The purpose of this summary of awards is to provide the scientific and engineering communities with a summary of the grants awarded in 1994 by the National Science Foundation's Division of Microelectronic Information Processing Systems. Similar areas of research are grouped together. Grantee institutions and principal investigators are identified…

  9. Advanced information processing system for advanced launch system: Avionics architecture synthesis

    NASA Technical Reports Server (NTRS)

    Lala, Jaynarayan H.; Harper, Richard E.; Jaskowiak, Kenneth R.; Rosch, Gene; Alger, Linda S.; Schor, Andrei L.

    1991-01-01

    The Advanced Information Processing System (AIPS) is a fault-tolerant distributed computer system architecture that was developed to meet the real time computational needs of advanced aerospace vehicles. One such vehicle is the Advanced Launch System (ALS) being developed jointly by NASA and the Department of Defense to launch heavy payloads into low earth orbit at one tenth the cost (per pound of payload) of the current launch vehicles. An avionics architecture that utilizes the AIPS hardware and software building blocks was synthesized for ALS. The AIPS for ALS architecture synthesis process starting with the ALS mission requirements and ending with an analysis of the candidate ALS avionics architecture is described.

  10. Multimedia information processing in the SWAN mobile networked computing system

    NASA Astrophysics Data System (ADS)

    Agrawal, Prathima; Hyden, Eoin; Krzyzanowsji, Paul; Srivastava, Mani B.; Trotter, John

    1996-03-01

    Anytime anywhere wireless access to databases, such as medical and inventory records, can simplify workflow management in a business, and reduce or even eliminate the cost of moving paper documents. Moreover, continual progress in wireless access technology promises to provide per-user bandwidths of the order of a few Mbps, at least in indoor environments. When combined with the emerging high-speed integrated service wired networks, it enables ubiquitous and tetherless access to and processing of multimedia information by mobile users. To leverage on this synergy an indoor wireless network based on room-sized cells and multimedia mobile end-points is being developed at AT&T Bell Laboratories. This research network, called SWAN (Seamless Wireless ATM Networking), allows users carrying multimedia end-points such as PDAs, laptops, and portable multimedia terminals, to seamlessly roam while accessing multimedia data streams from the wired backbone network. A distinguishing feature of the SWAN network is its use of end-to-end ATM connectivity as opposed to the connectionless mobile-IP connectivity used by present day wireless data LANs. This choice allows the wireless resource in a cell to be intelligently allocated amongst various ATM virtual circuits according to their quality of service requirements. But an efficient implementation of ATM in a wireless environment requires a proper mobile network architecture. In particular, the wireless link and medium-access layers need to be cognizant of the ATM traffic, while the ATM layers need to be cognizant of the mobility enabled by the wireless layers. This paper presents an overview of SWAN's network architecture, briefly discusses the issues in making ATM mobile and wireless, and describes initial multimedia applications for SWAN.

  11. A 'user friendly' geographic information system in a color interactive digital image processing system environment

    NASA Technical Reports Server (NTRS)

    Campbell, W. J.; Goldberg, M.

    1982-01-01

    NASA's Eastern Regional Remote Sensing Applications Center (ERRSAC) has recognized the need to accommodate spatial analysis techniques in its remote sensing technology transfer program. A computerized Geographic Information System to incorporate remotely sensed data, specifically Landsat, with other relevant data was considered a realistic approach to address a given resource problem. Questions arose concerning the selection of a suitable available software system to demonstrate, train, and undertake demonstration projects with ERRSAC's user community. The very specific requirements for such a system are discussed. The solution found involved the addition of geographic information processing functions to the Interactive Digital Image Manipulation System (IDIMS). Details regarding the functions of the new integrated system are examined along with the characteristics of the software.

  12. Explainable expert systems: A research program in information processing

    NASA Technical Reports Server (NTRS)

    Paris, Cecile L.

    1993-01-01

    Our work in Explainable Expert Systems (EES) had two goals: to extend and enhance the range of explanations that expert systems can offer, and to ease their maintenance and evolution. As suggested in our proposal, these goals are complementary because they place similar demands on the underlying architecture of the expert system: they both require the knowledge contained in a system to be explicitly represented, in a high-level declarative language and in a modular fashion. With these two goals in mind, the Explainable Expert Systems (EES) framework was designed to remedy limitations to explainability and evolvability that stem from related fundamental flaws in the underlying architecture of current expert systems.

  13. Advanced Information Processing System (AIPS) proof-of-concept system functional design I/O network system services

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The functional design of the Input/Output (I/O) services for the Advanced Information Processing System (AIPS) proof-of-concept system is described. The data flow diagrams, which show the functional processes in I/O services and the data that flow among them, are included. A complete list of the data identified in the data flow diagrams and in the process descriptions is provided.

  14. Advanced information processing system for advanced launch system: Hardware technology survey and projections

    NASA Technical Reports Server (NTRS)

    Cole, Richard

    1991-01-01

    The major goals of this effort are as follows: (1) to examine technology insertion options to optimize Advanced Information Processing System (AIPS) performance in the Advanced Launch System (ALS) environment; (2) to examine the AIPS concepts to ensure that valuable new technologies are not excluded from the AIPS/ALS implementations; (3) to examine advanced microprocessors applicable to AIPS/ALS; (4) to examine radiation hardening technologies applicable to AIPS/ALS; (5) to reach conclusions on AIPS hardware building block implementation technologies; and (6) to reach conclusions on appropriate architectural improvements. The hardware building blocks are the Fault-Tolerant Processor, the Input/Output Sequencers (IOS), and the Intercomputer Interface Sequencers (ICIS).

  15. Medicaid Program; Mechanized Claims Processing and Information Retrieval Systems (90/10). Final rule.

    PubMed

    2015-12-01

    This final rule will extend enhanced funding for Medicaid eligibility systems as part of a state's mechanized claims processing system, and will update conditions and standards for such systems, including adding to and updating current Medicaid Management Information Systems (MMIS) conditions and standards. These changes will allow states to improve customer service and support the dynamic nature of Medicaid eligibility, enrollment, and delivery systems.

  16. Certifying single-system steering for quantum-information processing

    NASA Astrophysics Data System (ADS)

    Li, Che-Ming; Chen, Yueh-Nan; Lambert, Neill; Chiu, Ching-Yi; Nori, Franco

    2015-12-01

    Einstein-Podolsky-Rosen (EPR) steering describes how different ensembles of quantum states can be remotely prepared by measuring one particle of an entangled pair. Here, we investigate quantum steering for single quantum d-dimensional systems (qudits) and devise efficient conditions to certify the steerability therein, which we find are applicable both to single-system steering and to EPR steering. In the single-system case our steering conditions enable the unambiguous ruling out of generic classical means of mimicking steering. Ruling out "false-steering" scenarios has implications for securing channels against both cloning-based individual attacks and coherent attacks when implementing quantum key distribution using qudits. We also show that these steering conditions have applications in quantum computation, in that they can serve as an efficient criterion for the evaluation of quantum logic gates of arbitrary size. Finally, we describe how the nonlocal EPR variant of these conditions also functions as a tool for identifying faithful one-way quantum computation, secure entanglement-based quantum communication, and genuine multipartite EPR steering.

  17. Advanced information processing system: Fault injection study and results

    NASA Technical Reports Server (NTRS)

    Burkhardt, Laura F.; Masotto, Thomas K.; Lala, Jaynarayan H.

    1992-01-01

    The objective of the AIPS program is to achieve a validated fault-tolerant distributed computer system. The goals of the AIPS fault injection study were: (1) to present the fault injection study components addressing the AIPS validation objective; (2) to obtain feedback for fault removal from the design implementation; (3) to obtain statistical data regarding fault detection, isolation, and reconfiguration responses; and (4) to obtain data regarding the effects of faults on system performance. The parameters that must be varied to create a comprehensive set of fault injection tests are described, along with the subset of test cases selected, the test case measurements, and the test case execution. Both pin-level hardware faults using a hardware fault injector and software-injected memory mutations were used to test the system. An overview is provided of the hardware fault injector and the associated software used to carry out the experiments. Detailed specifications of the faults and test results are given for the I/O Network and the AIPS Fault-Tolerant Processor. The results are summarized and conclusions are given.
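
    As a rough illustration of the software-injected memory mutations mentioned above, the sketch below flips a single bit in a simulated memory image and checks whether a simple checksum-based detector notices the corruption. The memory layout, fault model, and detector are assumptions chosen for brevity, not the AIPS fault injector or its error-detection mechanisms.

```python
import random
import zlib

def inject_bit_flip(memory: bytearray, rng: random.Random) -> int:
    """Flip one randomly chosen bit in the simulated memory image; return its byte address."""
    byte_addr = rng.randrange(len(memory))
    memory[byte_addr] ^= 1 << rng.randrange(8)
    return byte_addr

def run_trial(size=4096, seed=0):
    """One fault-injection trial: corrupt memory, then test whether the fault is detected."""
    rng = random.Random(seed)
    memory = bytearray(rng.getrandbits(8) for _ in range(size))
    reference = zlib.crc32(memory)        # detector's stored checksum of the fault-free image
    addr = inject_bit_flip(memory, rng)   # the injected memory mutation
    detected = zlib.crc32(memory) != reference
    return addr, detected

print(run_trial())   # (fault address, True): a CRC32 check detects any single-bit flip
```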

  18. Advanced information processing system: Authentication protocols for network communication

    NASA Technical Reports Server (NTRS)

    Harper, Richard E.; Adams, Stuart J.; Babikyan, Carol A.; Butler, Bryan P.; Clark, Anne L.; Lala, Jaynarayan H.

    1994-01-01

    In safety critical I/O and intercomputer communication networks, reliable message transmission is an important concern. Difficulties of communication and fault identification in networks arise primarily because the sender of a transmission cannot be identified with certainty, an intermediate node can corrupt a message without certainty of detection, and a babbling node cannot be identified and silenced without lengthy diagnosis and reconfiguration. Authentication protocols use digital signature techniques to verify the authenticity of messages with high probability. Such protocols appear to provide an efficient solution to many of these problems. The objective of this program is to develop, demonstrate, and evaluate intercomputer communication architectures which employ authentication. As a context for the evaluation, the authentication protocol-based communication concept was demonstrated under this program by hosting a real-time flight critical guidance, navigation and control algorithm on a distributed, heterogeneous, mixed redundancy system of workstations and embedded fault-tolerant computers.
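
    The protocols described above rely on digital signatures so that a receiver can verify who sent a message and that it was not altered in transit. The following sketch shows the basic sign/verify pattern using Ed25519 from the third-party cryptography package; it is a generic illustration of the idea, not the signature scheme, key management, or message format used in the AIPS networks.

```python
# pip install cryptography
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Each node would hold its own private key; peers hold the matching public keys.
sender_key = Ed25519PrivateKey.generate()
sender_public = sender_key.public_key()

message = b"guidance update: frame 1042"     # illustrative payload
signature = sender_key.sign(message)         # transmitted alongside the message

# Receiver side: verify authenticity and integrity before acting on the message.
try:
    sender_public.verify(signature, message)
    print("message accepted")
except InvalidSignature:
    print("message rejected: forged or corrupted")
```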

  19. 10 CFR 1017.28 - Processing on Automated Information Systems (AIS).

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 4 2012-01-01 2012-01-01 false Processing on Automated Information Systems (AIS). 1017.28... UNCLASSIFIED CONTROLLED NUCLEAR INFORMATION Physical Protection Requirements § 1017.28 Processing on Automated... Circular No. A-130, Revised, Transmittal No. 4, Appendix III, “Security of Federal Automated...

  20. 10 CFR 1017.28 - Processing on Automated Information Systems (AIS).

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 4 2014-01-01 2014-01-01 false Processing on Automated Information Systems (AIS). 1017.28... UNCLASSIFIED CONTROLLED NUCLEAR INFORMATION Physical Protection Requirements § 1017.28 Processing on Automated... Circular No. A-130, Revised, Transmittal No. 4, Appendix III, “Security of Federal Automated...

  1. 10 CFR 1017.28 - Processing on Automated Information Systems (AIS).

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 4 2013-01-01 2013-01-01 false Processing on Automated Information Systems (AIS). 1017.28... UNCLASSIFIED CONTROLLED NUCLEAR INFORMATION Physical Protection Requirements § 1017.28 Processing on Automated... Circular No. A-130, Revised, Transmittal No. 4, Appendix III, “Security of Federal Automated...

  2. 10 CFR 1017.28 - Processing on Automated Information Systems (AIS).

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Processing on Automated Information Systems (AIS). 1017.28... UNCLASSIFIED CONTROLLED NUCLEAR INFORMATION Physical Protection Requirements § 1017.28 Processing on Automated... Circular No. A-130, Revised, Transmittal No. 4, Appendix III, “Security of Federal Automated...

  3. Research and Development in the Computer and Information Sciences. Volume 2, Processing, Storage, and Output Requirements in Information Processing Systems: A Selective Literature Review.

    ERIC Educational Resources Information Center

    Stevens, Mary Elizabeth

    Areas of concern with respect to processing, storage, and output requirements of a generalized information processing system are considered. Special emphasis is placed on multiple-access systems. Problems of system management and control are discussed, including hierarchies of storage levels. Facsimile, digital, and mass random access storage…

  4. Understanding how replication processes can maintain systems away from equilibrium using Algorithmic Information Theory.

    PubMed

    Devine, Sean D

    2016-02-01

    Replication can be envisaged as a computational process that is able to generate and maintain order far from equilibrium. Replication processes can self-regulate, as the drive to replicate can counter degradation processes that impact on a system. The capability of replicated structures to access high-quality energy and eject disorder allows Landauer's principle, in conjunction with Algorithmic Information Theory, to quantify the entropy requirements to maintain a system far from equilibrium. Using Landauer's principle, where destabilising processes, operating under the second law of thermodynamics, change the information content or the algorithmic entropy of a system by ΔH bits, replication processes can access order, eject disorder, and counter the change without outside intervention. Both diversity in replicated structures and the coupling of different replicated systems increase the ability of the system (or systems) to self-regulate in a changing environment, as adaptation processes select those structures that use resources more efficiently. At the level of the structure, as selection processes minimise the information loss, the irreversibility is minimised. While each structure that emerges can be said to be more entropically efficient, as such replicating structures proliferate, the dissipation of the system as a whole is higher than would be the case for inert or simpler structures. While a detailed application to most real systems would be difficult, the approach may well be useful in understanding incremental changes to real systems and provide broad descriptions of system behaviour. PMID:26723233
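
    The argument above couples Landauer's principle to algorithmic entropy: countering a disturbance that changes the system's algorithmic entropy by ΔH bits has a minimum thermodynamic cost. As a general reminder (not a formula quoted from the paper), the bound is

```latex
E_{\min} = k_B T \ln 2 \,\cdot\, \Delta H ,
```

    so a replicating system maintained far from equilibrium must access at least this much high-quality (low-entropy) energy for every ΔH bits of disorder it ejects.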

  6. IBIS - A geographic information system based on digital image processing and image raster datatype

    NASA Technical Reports Server (NTRS)

    Bryant, N. A.; Zobrist, A. L.

    1976-01-01

    IBIS (Image Based Information System) is a geographic information system which makes use of digital image processing techniques to interface existing geocoded data sets and information management systems with thematic maps and remotely sensed imagery. The basic premise is that geocoded data sets can be referenced to a raster scan that is equivalent to a grid cell data set. The first applications (St. Tammany Parish, Louisiana, and Los Angeles County) have been restricted to the design of a land resource inventory and analysis system. It is thought that the algorithms and the hardware interfaces developed will be readily applicable to other Landsat imagery.
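
    The core IBIS premise is that a geocoded data set referenced to a raster is equivalent to a grid-cell data set, so tabular attributes and classified imagery can be overlaid cell by cell. The sketch below cross-tabulates a hypothetical land-parcel raster against a hypothetical Landsat land-cover classification; the arrays and category codes are invented purely for illustration.

```python
import numpy as np

# Hypothetical co-registered rasters: parcel IDs and land-cover classes (assumed codes).
parcels = np.array([[1, 1, 2],
                    [1, 2, 2],
                    [3, 3, 2]])
landcover = np.array([[0, 1, 1],    # 0 = water, 1 = forest, 2 = urban
                      [1, 1, 2],
                      [2, 2, 2]])

def cross_tabulate(zones, classes):
    """Count cells of each land-cover class inside each parcel (a raster overlay)."""
    table = {}
    for zone in np.unique(zones):
        values, counts = np.unique(classes[zones == zone], return_counts=True)
        table[int(zone)] = dict(zip(values.tolist(), counts.tolist()))
    return table

print(cross_tabulate(parcels, landcover))
# {1: {0: 1, 1: 2}, 2: {1: 1, 2: 3}, 3: {2: 2}}
```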

  7. Process-driven information management system at a biotech company: concept and implementation.

    PubMed

    Gobbi, Alberto; Funeriu, Sandra; Ioannou, John; Wang, Jinyi; Lee, Man-Ling; Palmer, Chris; Bamford, Bob; Hewitt, Robin

    2004-01-01

    While established pharmaceutical companies have chemical information systems in place to manage their compounds and the associated data, new startup companies need to implement these systems from scratch. Decisions made early in the design phase usually have long lasting effects on the expandability, maintenance effort, and costs associated with the information management system. Careful analysis of work and data flows, both inter- and intradepartmental, and identification of existing dependencies between activities are important. This knowledge is required to implement an information management system, which enables the research community to work efficiently by avoiding redundant registration and processing of data and by timely provision of the data whenever needed. This paper first presents the workflows existing at Anadys, then ARISE, the research information management system developed in-house at Anadys. ARISE was designed to support the preclinical drug discovery process and covers compound registration, analytical quality control, inventory management, high-throughput screening, lower throughput screening, and data reporting.

  8. The Effectiveness of Information Systems Teams as Change Agents in the Implementation of Business Process Reengineering

    ERIC Educational Resources Information Center

    Griffith, Gary L.

    2009-01-01

    Changes to information systems and technology (IS/IT) are happening faster than ever before. A literature review suggested within business process reengineering (BPR) there is limited information on what an IS/IT team could do to reduce resistance to change and increase user acceptance. The purpose of this ethnographic case study was to explore…

  9. Parallel Information Processing.

    ERIC Educational Resources Information Center

    Rasmussen, Edie M.

    1992-01-01

    Examines parallel computer architecture and the use of parallel processors for text. Topics discussed include parallel algorithms; performance evaluation; parallel information processing; parallel access methods for text; parallel and distributed information retrieval systems; parallel hardware for text; and network models for information…

  10. Management Information System Project: Administrators Manual to the Program Oriented Accounting System. The Budgetary Process.

    ERIC Educational Resources Information Center

    Iowa Univ., Iowa City. Iowa Center for Research in School Administration.

    This document overviews the supporting relation of a Management Information System (MIS) to a Program Planning Budgeting Evaluation system (PPBE) and then concentrates on the financial tract aspects of an MIS. First, five tract areas in which an MIS provides information are discussed: pupils, personnel, finance, facilities, and community. Then, an…

  11. Creating Trauma-Informed Child Welfare Systems Using a Community Assessment Process

    ERIC Educational Resources Information Center

    Hendricks, Alison; Conradi, Lisa; Wilson, Charles

    2011-01-01

    This article describes a community assessment process designed to evaluate a specific child welfare jurisdiction based on the current definition of trauma-informed child welfare and its essential elements. This process has recently been developed and pilot tested within three diverse child welfare systems in the United States. The purpose of the…

  12. A Coding System for Qualitative Studies of the Information-Seeking Process in Computer Science Research

    ERIC Educational Resources Information Center

    Moral, Cristian; de Antonio, Angelica; Ferre, Xavier; Lara, Graciela

    2015-01-01

    Introduction: In this article we propose a qualitative analysis tool--a coding system--that can support the formalisation of the information-seeking process in a specific field: research in computer science. Method: In order to elaborate the coding system, we have conducted a set of qualitative studies, more specifically a focus group and some…

  13. Measures and Metrics of Information Processing in Complex Systems: A Rope of Sand

    ERIC Educational Resources Information Center

    James, Ryan Gregory

    2013-01-01

    How much information do natural systems store and process? In this work we attempt to answer this question in multiple ways. We first establish a mathematical framework where natural systems are represented by a canonical form of edge-labeled hidden Markov models called e-machines. Then, utilizing this framework, a variety of measures are defined and…

  14. Polarization information processing and software system design for simultaneously imaging polarimetry

    NASA Astrophysics Data System (ADS)

    Wang, Yahui; Liu, Jing; Jin, Weiqi; Wen, Renjie

    2015-08-01

    Simultaneous imaging polarimetry can realize real-time polarization imaging of dynamic scenes, which has wide application prospects. This paper first briefly illustrates the design of a double separate Wollaston prism simultaneous imaging polarimeter, and then emphasis is placed on the polarization information processing methods and software system design for the designed polarimeter. The polarization information processing methods consist of adaptive image segmentation, high-accuracy image registration, and instrument matrix calibration. Morphological image processing was used for image segmentation by taking the dilation of an image; the accuracy of image registration can reach 0.1 pixel based on spatial- and frequency-domain cross-correlation; instrument matrix calibration adopted a four-point calibration method. The software system was implemented under the Windows environment in the C++ programming language and realizes synchronous polarization image acquisition and preservation, image processing, and polarization information extraction and display. Polarization data obtained with the designed polarimeter show that the polarization information processing methods and the accompanying software system effectively perform real-time measurement of the four Stokes parameters of a scene, and that the processing methods effectively improve the polarization detection accuracy.
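
    A simultaneous imaging polarimeter ultimately reduces the registered channel images to per-pixel Stokes parameters. The sketch below shows the textbook linear-Stokes reduction and the derived degree and angle of linear polarization; it is a generic formulation, not the calibrated instrument-matrix pipeline described in the paper.

```python
import numpy as np

def linear_stokes(i0, i45, i90, i135):
    """Reduce four co-registered polarization channels (0, 45, 90, 135 degrees)
    to the linear Stokes parameters per pixel."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)                        # total intensity
    s1 = i0 - i90
    s2 = i45 - i135
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-12)     # degree of linear polarization
    aop = 0.5 * np.arctan2(s2, s1)                            # angle of polarization (radians)
    return s0, s1, s2, dolp, aop

# Example: a uniform scene that is fully polarized at 0 degrees.
ones, zeros = np.ones((4, 4)), np.zeros((4, 4))
s0, s1, s2, dolp, aop = linear_stokes(ones, 0.5 * ones, zeros, 0.5 * ones)
print(float(dolp[0, 0]), float(aop[0, 0]))   # -> 1.0 0.0
```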

  15. Scalable mobile information system to support the treatment process and the workflow of wastewater facilities.

    PubMed

    Schuchardt, L; Steinmetz, H; Ehret, J; Ebert, A; Schmitt, T G

    2004-01-01

    In order to support the operation of wastewater systems and the workflow of sewage systems, a demonstration application has been developed to show how a mobile information system can be transferred into practice and used by the staff. The paper presents a scalable information visualisation system which can be used with mobile devices. The information covered includes not only process data but also general information about buildings and units, work directions, occupational safety regulations, and instructions for first aid in case of a work accident. This is particularly appropriate for use in remote facilities. The implementation is based on, but not limited to, SQL, JSP and HTML.

  16. Evaluation of a gene information summarization system by users during the analysis process of microarray datasets

    PubMed Central

    Yang, Jianji; Cohen, Aaron; Hersh, William

    2009-01-01

    Background: Summarization of gene information in the literature has the potential to help genomics researchers translate basic research into clinical benefits. Gene expression microarrays have been used to study biomarkers for disease and discover novel types of therapeutics, and the task of finding information in journal articles on sets of genes is common for translational researchers working with microarray data. However, manually searching and scanning the literature references returned from PubMed is a time-consuming task for scientists. We built and evaluated an automatic summarizer of information on genes studied in microarray experiments. The Gene Information Clustering and Summarization System (GICSS) is a system that integrates two related steps of the microarray data analysis process: functional gene clustering and gene information gathering. The system evaluation was conducted during the process of genomic researchers analyzing their own experimental microarray datasets. Results: The clusters generated by GICSS were validated by scientists during their microarray analysis process. In addition, presenting sentences in the abstract provided significantly more important information to the users than just showing the title in the default PubMed format. Conclusion: The evaluation results suggest that GICSS can be useful for researchers in the genomic area. In addition, the hybrid evaluation method, partway between intrinsic and extrinsic system evaluation, may enable researchers to gauge the true usefulness of the tool for scientists in their natural analysis workflow and also elicit suggestions for future enhancements. Availability: GICSS can be accessed online at: PMID:19208193

  17. Distributed Processing with a Mainframe-Based Hospital Information System: A Generalized Solution

    PubMed Central

    Kirby, J. David; Pickett, Michael P.; Boyarsky, M. William; Stead, William W.

    1987-01-01

    Over the last two years the Medical Center Information Systems Department at Duke University Medical Center has been developing a systematic approach to distributing the processing and data involved in computerized applications at DUMC. The resulting system has been named MAPS, the Micro-ADS Processing System. A key characteristic of MAPS is that it makes it easy to execute any existing mainframe ADS application with a request from a PC. This extends the functionality of the mainframe application set to the PC without compromising the maintainability of the PC or mainframe systems.

  18. Considerations in developing geographic information systems based on low-cost digital image processing

    NASA Technical Reports Server (NTRS)

    Henderson, F. M.; Dobson, M. W.

    1981-01-01

    The potential of digital image processing systems costing $20,000 or less for geographic information systems is assessed with the emphasis on the volume of data to be handled, the commercial hardware systems available, and the basic software for: (1) data entry, conversion and digitization; (2) georeferencing and geometric correction; (3) data structuring; (4) editing and updating; (5) analysis and retrieval; (6) output drivers; and (7) data management. Costs must also be considered as tangible and intangible factors.

  19. Information-Processing Architectures in Multidimensional Classification: A Validation Test of the Systems Factorial Technology

    ERIC Educational Resources Information Center

    Fific, Mario; Nosofsky, Robert M.; Townsend, James T.

    2008-01-01

    A growing methodology, known as the systems factorial technology (SFT), is being developed to diagnose the types of information-processing architectures (serial, parallel, or coactive) and stopping rules (exhaustive or self-terminating) that operate in tasks of multidimensional perception. Whereas most previous applications of SFT have been in…

  20. Debugging the Conversion Process: Lessons from an Administrative Information System Conversion.

    ERIC Educational Resources Information Center

    Hibbler, Fritz; Mitchell, Linda

    1995-01-01

    The process of reengineering the administrative information system at the University of Idaho required careful planning, development of a set of philosophies, and several strategies for ensuring success. The latter included a humorous symbol (a jitter bug) to express concern, improved institutional communications, and a focus on logistical issues.…

  1. 45 CFR 205.35 - Mechanized claims processing and information retrieval systems; definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 2 2014-10-01 2012-10-01 true Mechanized claims processing and information retrieval systems; definitions. 205.35 Section 205.35 Public Welfare Regulations Relating to Public Welfare OFFICE OF FAMILY ASSISTANCE (ASSISTANCE PROGRAMS), ADMINISTRATION FOR CHILDREN AND FAMILIES, DEPARTMENT OF HEALTH AND HUMAN SERVICES...

  2. SDI-based business processes: A territorial analysis web information system in Spain

    NASA Astrophysics Data System (ADS)

    Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.

    2012-09-01

    Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services in their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to Spanish citizens' knowledge about their territory.
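
    In an OGC Web Processing Service, clients typically discover processes via GetCapabilities and DescribeProcess and then run them with an Execute request. The sketch below issues a simple key-value-pair Execute request with the requests library; the endpoint URL, process identifier, and input names are placeholders invented for illustration and are not the actual IDEE service interface.

```python
# pip install requests
import requests

WPS_URL = "https://example.org/wps"   # placeholder endpoint, not the real IDEE service

params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "landCoverSummary",          # assumed process name
    "datainputs": "municipalityCode=50297",    # assumed input key/value pair
}

response = requests.get(WPS_URL, params=params, timeout=30)
response.raise_for_status()
print(response.text[:500])   # WPS responses are XML documents describing the outputs
```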

  3. Quantum Fisher information flow and non-Markovian processes of open systems

    SciTech Connect

    Lu Xiaoming; Wang Xiaoguang; Sun, C. P.

    2010-10-15

    We establish an information-theoretic approach for quantitatively characterizing the non-Markovianity of open quantum processes. Here, the quantum Fisher information (QFI) flow provides a measure to statistically distinguish Markovian and non-Markovian processes. A basic relation between the QFI flow and non-Markovianity is unveiled for quantum dynamics of open systems. For a class of time-local master equations, the exactly analytic solution shows that for each fixed time the QFI flow is decomposed into additive subflows according to different dissipative channels.
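
    For reference, the quantum Fisher information flow is the rate of change of the QFI of the evolving state with respect to the estimated parameter, and an interval of positive flow witnesses information flowing back from the environment. The relations below follow the standard formulation of this construction and are given as orientation rather than quoted from the paper:

```latex
\mathcal{I}(t) \;=\; \frac{\partial F_\theta(t)}{\partial t},
\qquad
\text{divisible (Markovian) dynamics} \;\Longrightarrow\; \mathcal{I}(t) \le 0 \ \ \forall t ,
```

    and, for time-local master equations, the flow decomposes into additive sub-flows associated with the individual dissipative channels.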

  4. A Collaborative Knowledge Management Process for Implementing Healthcare Enterprise Information Systems

    NASA Astrophysics Data System (ADS)

    Cheng, Po-Hsun; Chen, Sao-Jie; Lai, Jin-Shin; Lai, Feipei

    This paper illustrates a feasible health informatics domain knowledge management process which helps gather useful technology information and reduce many knowledge misunderstandings among engineers who have participated in the IBM mainframe rightsizing project at National Taiwan University (NTU) Hospital. We design an asynchronously sharing mechanism to facilitate the knowledge transfer and our health informatics domain knowledge management process can be used to publish and retrieve documents dynamically. It effectively creates an acceptable discussion environment and even lessens the traditional meeting burden among development engineers. An overall description on the current software development status is presented. Then, the knowledge management implementation of health information systems is proposed.

  5. Automated system function allocation and display format: Task information processing requirements

    NASA Technical Reports Server (NTRS)

    Czerwinski, Mary P.

    1993-01-01

    An important consideration when designing the interface to an intelligent system concerns function allocation between the system and the user. The display of information could be held constant, or 'fixed', leaving the user with the task of searching through all of the available information, integrating it, and classifying the data into a known system state. On the other hand, the system, based on its own intelligent diagnosis, could display only relevant information in order to reduce the user's search set. The user would still be left the task of perceiving and integrating the data and classifying it into the appropriate system state. Finally, the system could display the patterns of data. In this scenario, the task of integrating the data is carried out by the system, and the user's information processing load is reduced, leaving only the tasks of perception and classification of the patterns of data. Humans are especially adept at this form of display processing. Although others have examined the relative effectiveness of alphanumeric and graphical display formats, it is interesting to reexamine this issue together with the function allocation problem. Currently, Johnson Space Center is the test site for an intelligent Thermal Control System (TCS), TEXSYS, being tested for use with Space Station Freedom. Expert TCS engineers, as well as novices, were asked to classify several displays of TEXSYS data into various system states (including nominal and anomalous states). Three different display formats were used: fixed, subset, and graphical. The hypothesis tested was that the graphical displays would provide for fewer errors and faster classification times by both experts and novices, regardless of the kind of system state represented within the display. The subset displays were hypothesized to be the second most effective display format/function allocation condition, based on the fact that the search set is reduced in these displays. Both the subset and the

  6. Information Sharing in the Process Control Systems Forum Assessing Liability Issues

    SciTech Connect

    Ray Fink

    2005-10-01

    The Process Control Systems Forum (http://www.pcsforum.org) is an open, collaborative, voluntary forum established by the Department of Homeland Security. The purpose of the Forum is to accelerate the development of technology that will enhance the security, safety, and reliability of process control systems (PCS) and supervisory control and data acquisition (SCADA) systems. It is intended as a venue for technologists from user sectors, vendors, and academia. The Forum is not a standards body. Within the Forum, there is a variety of working groups and interest groups that are focused on specific subject areas. One such Interest Group is addressing how to create a "safe zone" for critical information sharing. This Interest Group is concerned with topics such as: trade-offs between maintaining security and sharing best practices; secure mechanisms for sharing of critical information; legal issues associated with sharing information; institutional impediments to sharing best practices and relevant incidents; finding a meaningful manner of exchange for sharing process control security events, incidents, audit logs, etc.; and creating a database of relevant industrial cyber events. The purpose of this white paper is to address liability issues that might arise from sharing of critical information such as recommended "best practices". There is a concern that by publishing "best practices" or similar information, the Forum or its members might be inadvertently assuming some liability. The following scenarios illustrate the concerns about potential liability.

  7. PREFACE: Quantum information processing

    NASA Astrophysics Data System (ADS)

    Briggs, Andrew; Ferry, David; Stoneham, Marshall

    2006-05-01

    Microelectronics and the classical information technologies transformed the physics of semiconductors. Photonics has given optical materials a new direction. Quantum information technologies, we believe, will have immense impact on condensed matter physics. The novel systems of quantum information processing need to be designed and made. Their behaviours must be manipulated in ways that are intrinsically quantal and generally nanoscale. Both in this special issue and in previous issues (see e.g., Spiller T P and Munro W J 2006 J. Phys.: Condens. Matter 18 V1-10) we see the emergence of new ideas that link the fundamentals of science to the pragmatism of market-led industry. We hope these papers will be followed by many others on quantum information processing in the Journal of Physics: Condensed Matter.

  8. Geographic information system programs for use in the water-supply-allocation permitting process

    USGS Publications Warehouse

    Dunne, Paul; Price, C.V.

    1995-01-01

    Computer programs designed for use in a geographic information system as an aid in the water-supply-allocation permitting process are described. These programs were developed by the U.S. Geological Survey during a project conducted in cooperation with the New Jersey Department of Environmental Protection. The programs enable a user to display proposed water-supply-allocation sites in a defined area together with present sites and important hydrologic and geographic features on a computer screen or on hardcopy plots. The programs are menu-driven and do not require familiarity with geographic information systems. Source codes for the programs are included in appendixes.

  9. Real-Time Data Processing Systems and Products at the Alaska Earthquake Information Center

    NASA Astrophysics Data System (ADS)

    Ruppert, N. A.; Hansen, R. A.

    2007-05-01

    The Alaska Earthquake Information Center (AEIC) receives data from over 400 seismic sites located within the state boundaries and the surrounding regions and serves as a regional data center. In 2007, the AEIC reported ~20,000 seismic events, with the largest event, of M6.6, occurring in the Andreanof Islands. The real-time earthquake detection and data processing systems at AEIC are based on the Antelope system from BRTT, Inc. This modular and extensible processing platform allows an integrated system complete from data acquisition to catalog production. Multiple additional modules constructed with the Antelope toolbox have been developed to fit particular needs of the AEIC. The real-time earthquake locations and magnitudes are determined within 2-5 minutes of the event occurrence. AEIC maintains a 24/7 seismologist-on-duty schedule. Earthquake alarms are based on the real-time earthquake detections. Significant events are reviewed by the seismologist on duty within 30 minutes of the occurrence, with information releases issued for significant events. This information is disseminated immediately via the AEIC website, the ANSS website via QDDS submissions, through e-mail, cell phone and pager notifications, via fax broadcasts, and recorded voice-mail messages. In addition, automatic regional moment tensors are determined for events with M>=4.0. This information is posted on the public website. ShakeMaps are being calculated in real time, with the information currently accessible via a password-protected website. AEIC is designing an alarm system targeted for the critical lifeline operations in Alaska. AEIC maintains an extensive computer network to provide adequate support for data processing and archival. For real-time processing, AEIC operates two identical, interoperable computer systems in parallel.

  10. [Information systems].

    PubMed

    Rodríguez Maniega, José Antonio; Trío Maseda, Reyes

    2005-03-01

    The arrival of victims of the terrorist attacks of 11 March at the hospital put the efficiency of its information systems to the test. To be most efficient, these systems should be simple and directed, above all, to the follow-up of victims and to providing the necessary information to patients and families. A specific and easy to use system is advisable. PMID:15771852

  11. Adaptive automation of human-machine system information-processing functions.

    PubMed

    Kaber, David B; Wright, Melanie C; Prinzel, Lawrence J; Clamann, Michael P

    2005-01-01

    The goal of this research was to describe the ability of human operators to interact with adaptive automation (AA) applied to various stages of complex systems information processing, defined in a model of human-automation interaction. Forty participants operated a simulation of an air traffic control task. Automated assistance was adaptively applied to information acquisition, information analysis, decision making, and action implementation aspects of the task based on operator workload states, which were measured using a secondary task. The differential effects of the forms of automation were determined and compared with a manual control condition. Results of two 20-min trials of AA or manual control revealed a significant effect of the type of automation on performance, particularly during manual control periods as part of the adaptive conditions. Humans appear to better adapt to AA applied to sensory and psychomotor information-processing functions (action implementation) than to AA applied to cognitive functions (information analysis and decision making), and AA is superior to completely manual control. Potential applications of this research include the design of automation to support air traffic controller information processing.

  12. Practical Applications of Space Systems, Supporting Paper 13: Information Services and Information Processing.

    ERIC Educational Resources Information Center

    National Academy of Sciences - National Research Council, Washington, DC. Assembly of Engineering.

    This report summarizes the findings of one of fourteen panels that studied progress in space science applications and defined user needs potentially capable of being met by space-system applications. The study was requested by the National Aeronautics and Space Administration (NASA) and was conducted by the Space Applications Board. The panels…

  13. Baselining the New GSFC Information Systems Center: The Foundation for Verifiable Software Process Improvement

    NASA Technical Reports Server (NTRS)

    Parra, A.; Schultz, D.; Boger, J.; Condon, S.; Webby, R.; Morisio, M.; Yakimovich, D.; Carver, J.; Stark, M.; Basili, V.; Kraft, S.

    1999-01-01

    This paper describes a study performed at the Information System Center (ISC) in NASA Goddard Space Flight Center. The ISC was set up in 1998 as a core competence center in information technology. The study aims at characterizing people, processes and products of the new center, to provide a basis for proposing improvement actions and comparing the center before and after these actions have been performed. The paper presents the ISC, goals and methods of the study, results and suggestions for improvement, through the branch-level portion of this baselining effort.

  14. Weather Information Processing

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Science Communications International (SCI), formerly General Science Corporation, has developed several commercial products based upon experience acquired as a NASA Contractor. Among them are METPRO, a meteorological data acquisition and processing system, which has been widely used, RISKPRO, an environmental assessment system, and MAPPRO, a geographic information system. METPRO software is used to collect weather data from satellites, ground-based observation systems and radio weather broadcasts to generate weather maps, enabling potential disaster areas to receive advance warning. GSC's initial work for NASA Goddard Space Flight Center resulted in METPAK, a weather satellite data analysis system. METPAK led to the commercial METPRO system. The company also provides data to other government agencies, U.S. embassies and foreign countries.

  15. User's manual for the National Water Information System of the U.S. Geological Survey: Automated Data Processing System (ADAPS)

    USGS Publications Warehouse

    ,

    2003-01-01

    The Automated Data Processing System (ADAPS) was developed for the processing, storage, and retrieval of water data, and is part of the National Water Information System (NWIS) developed by the U.S. Geological Survey. NWIS is a distributed water database in which data can be processed over a network of computers at U.S. Geological Survey offices throughout the United States. NWIS comprises four subsystems: ADAPS, the Ground-Water Site Inventory System (GWSI), the Water-Quality System (QWDATA), and the Site-Specific Water-Use Data System (SWUDS). This section of the NWIS User's Manual describes the automated data processing of continuously recorded water data, which primarily are surface-water data; however, the system also allows for the processing of water-quality and ground-water data. This manual describes various components and features of the ADAPS, and provides an overview of the data processing system and a description of the system framework. The components and features included are: (1) data collection and processing, (2) ADAPS menus and programs, (3) command line functions, (4) steps for processing station records, (5) postprocessor programs control files, (6) the standard format for transferring and entering unit and daily values, and (7) relational database (RDB) formats.

  16. How gamma radiation processing systems are benefiting from the latest advances in information technology

    NASA Astrophysics Data System (ADS)

    Gibson, Wayne H.; Levesque, Daniel

    2000-03-01

    This paper discusses how gamma irradiation plants are putting the latest advances in computer and information technology to use for better process control, cost savings, and strategic advantages. Some irradiator operations are gaining significant benefits by integrating computer technology and robotics with real-time information processing, multi-user databases, and communication networks. The paper reports on several irradiation facilities that are making good use of client/server LANs, user-friendly graphics interfaces, supervisory control and data acquisition (SCADA) systems, distributed I/O with real-time sensor devices, trending analysis, real-time product tracking, dynamic product scheduling, and automated dosimetry reading. These plants are lowering costs by fast and reliable reconciliation of dosimetry data, easier validation to GMP requirements, optimizing production flow, and faster release of sterilized products to market. There is a trend in the manufacturing sector towards total automation using "predictive process control". Real-time verification of process parameters "on-the-run" allows control parameters to be adjusted appropriately, before the process strays out of limits. Applying this technology to the gamma radiation process, control will be based on monitoring the key parameters such as time, and making adjustments during the process to optimize quality and throughput. Dosimetry results will be used as a quality control measurement rather than as a final monitor for the release of the product. Results are correlated with the irradiation process data to quickly and confidently reconcile variations. Ultimately, a parametric process control system utilizing responsive control, feedback and verification will not only increase productivity and process efficiency, but can also result in operating within tighter dose control set points.

  17. Controlling Atomic, Solid-State and Hybrid Systems for Quantum Information Processing

    NASA Astrophysics Data System (ADS)

    Gullans, Michael John

    Quantum information science involves the use of precise control over quantum systems to explore new technologies. However, as quantum systems are scaled up they require an ever deeper understanding of many-body physics to achieve the required degree of control. Current experiments are entering a regime which requires active control of a mesoscopic number of coupled quantum systems or quantum bits (qubits). This thesis describes several approaches to this goal and shows how mesoscopic quantum systems can be controlled and utilized for quantum information tasks. The first system we consider is the nuclear spin environment of GaAs double quantum dots containing two electrons. We show that through appropriate control of dynamic nuclear polarization one can prepare the nuclear spin environment in three distinct collective quantum states which are useful for quantum information processing with electron spin qubits. We then investigate a hybrid system in which an optical lattice is formed in the near-field scattering off an array of metallic nanoparticles by utilizing the plasmonic resonance of the nanoparticles. We show that such a system would realize new regimes of dense, ultra-cold quantum matter and can be used to create a quantum network of atoms and plasmons. Finally we investigate quantum nonlinear optical systems. We show that the intrinsic nonlinearity for plasmons in graphene can be large enough to make a quantum gate for single photons. We also consider two nonlinear optical systems based on ultracold gases of atoms. In one case, we demonstrate an all-optical single photon switch using cavity quantum electrodynamics (QED) and slow light. In the second case, we study few-photon physics in strongly interacting Rydberg polariton systems, where we demonstrate the existence of two- and three-photon bound states and study their properties.

  18. Conceptual models of information processing

    NASA Technical Reports Server (NTRS)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  19. Intelligent information system: for automation of airborne early warning crew decision processes

    NASA Astrophysics Data System (ADS)

    Chin, Hubert H.

    1991-03-01

    This paper describes an automation of AEW crew decision processes implemented in an intelligent information system for an advanced AEW aircraft platform. The system utilizes the existing AEW aircraft database and knowledge base such that the database can provide sufficient data to solve the sizable AEW problems. A database management system is recommended for managing the large amount of data. In order to expand a conventional expert system so that it has the capacity to solve the sizable problems, a cooperative model is required to coordinate five expert systems in the cooperative decision process. The proposed model partitions the traditional knowledge base into a set of disjoint portions which cover the needs of and are shared by the expert systems. Internal communications take place over the common shared portions. A cooperative algorithm is required for update synchronization and concurrency control. The purpose of this paper is to present a cooperative model for enhancing standard rule-based expert systems to make cooperative decisions and to superimpose the global knowledge base and database in a more natural fashion. The tools being used for developing the prototype are the Ada programming language and the ORACLE relational database management system.

  20. NASA End-to-End Data System /NEEDS/ information adaptive system - Performing image processing onboard the spacecraft

    NASA Technical Reports Server (NTRS)

    Kelly, W. L.; Howle, W. M.; Meredith, B. D.

    1980-01-01

    The Information Adaptive System (IAS) is an element of the NASA End-to-End Data System (NEEDS) Phase II and is focused toward onboard image processing. Since the IAS is a data preprocessing system which is closely coupled to the sensor system, it serves as a first step in providing a 'smart' imaging sensor. Some of the functions planned for the IAS include sensor response nonuniformity correction, geometric correction, data set selection, data formatting, packetization, and adaptive system control. The inclusion of these sensor data preprocessing functions onboard the spacecraft will significantly improve the extraction of information from the sensor data in a timely and cost-effective manner and provide the opportunity to design sensor systems which can be reconfigured in near real time for optimum performance. The purpose of this paper is to present the preliminary design of the IAS and the plans for its development.
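
    Among the IAS preprocessing functions listed above, sensor response nonuniformity correction is the most self-contained to illustrate. The sketch below applies the usual two-point (dark frame / flat field) correction per pixel; the calibration frames and gain model are generic assumptions, not the IAS algorithm.

```python
import numpy as np

def nonuniformity_correction(raw, dark, flat):
    """Two-point per-pixel correction: subtract the dark frame, then divide by the
    normalized flat-field response so every detector element has unit gain."""
    flat_response = flat - dark
    gain = flat_response / np.mean(flat_response)      # normalized per-pixel gain
    return (raw - dark) / np.maximum(gain, 1e-6)

# Synthetic example: a uniform scene viewed through a detector with fixed-pattern gain/offset.
rng = np.random.default_rng(1)
scene = np.full((64, 64), 100.0)
gain = rng.normal(1.0, 0.05, scene.shape)
offset = rng.normal(10.0, 1.0, scene.shape)
raw = gain * scene + offset
dark = offset                        # calibration: shutter-closed (dark) frame
flat = gain * 200.0 + offset         # calibration: uniform-illumination (flat) frame
corrected = nonuniformity_correction(raw, dark, flat)
print(round(float(corrected.std()), 6))   # ~0: the fixed-pattern nonuniformity is removed
```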

  1. Materials And Processes Technical Information System (MAPTIS) LDEF materials data base

    NASA Technical Reports Server (NTRS)

    Funk, Joan G.; Strickland, John W.; Davis, John M.

    1993-01-01

    A preliminary Long Duration Exposure Facility (LDEF) Materials Data Base was developed by the LDEF Materials Special Investigation Group (MSIG). The LDEF Materials Data Base is envisioned to eventually contain the wide variety and vast quantity of materials data generated from LDEF. The data are searchable by optical, thermal, and mechanical properties, exposure parameters (such as atomic oxygen flux), and author(s) or principal investigator(s). The LDEF Materials Data Base was incorporated into the Materials and Processes Technical Information System (MAPTIS). MAPTIS is a collection of materials data which has been computerized and is available to engineers, designers, and researchers in the aerospace community involved in the design and development of spacecraft and related hardware. The LDEF Materials Data Base is described, and step-by-step example searches using the data base are included. Information on how to become an authorized user of the system is included.

  2. Markov and non-Markov processes in complex systems by the dynamical information entropy

    NASA Astrophysics Data System (ADS)

    Yulmetyev, R. M.; Gafarov, F. M.

    1999-12-01

    We consider the Markov and non-Markov processes in complex systems by the dynamical information Shannon entropy (DISE) method. The influence and important role of the two mutually dependent channels of entropy, alternation (creation or generation of correlation) and anti-correlation (destruction or annihilation of correlation), have been discussed. The developed method has been used for the analysis of complex systems of various natures: slow neutron scattering in liquid cesium, psychology (short-time numeral and pattern human memory and the effect of stress on the dynamical tapping test), random dynamics of RR-intervals in human ECG (the problem of diagnosis of various diseases of the human cardiovascular system), chaotic dynamics of the parameters of financial markets, and ecological systems.

  3. A highly reliable, autonomous data communication subsystem for an advanced information processing system

    NASA Technical Reports Server (NTRS)

    Nagle, Gail; Masotto, Thomas; Alger, Linda

    1990-01-01

    The need to meet the stringent performance and reliability requirements of advanced avionics systems has frequently led to implementations which are tailored to a specific application and are therefore difficult to modify or extend. Furthermore, many integrated flight-critical systems are input/output intensive. By using a design methodology which customizes the input/output mechanism for each new application, the cost of implementing new systems becomes prohibitively expensive. One solution to this dilemma is to design computer systems and input/output subsystems which are general purpose, but which can be easily configured to support the needs of a specific application. The Advanced Information Processing System (AIPS), currently under development, has these characteristics. The design and implementation of the prototype I/O communication system for AIPS is described. AIPS addresses reliability issues related to data communications by the use of reconfigurable I/O networks. When a fault or damage event occurs, communication is restored to functioning parts of the network and the failed or damaged components are isolated. Performance issues are addressed by using a parallelized computer architecture which decouples input/output (I/O) redundancy management and I/O processing from the computational stream of an application. The autonomous nature of the system derives from the highly automated and independent manner in which I/O transactions are conducted for the application, as well as from the fact that the hardware redundancy management is entirely transparent to the application.

  4. Language Processing in Information Retrieval.

    ERIC Educational Resources Information Center

    Doszkocs, Tamase

    1986-01-01

    Examines role and contributions of natural-language processing in information retrieval and artificial intelligence research in context of large operational information retrieval systems and services. State-of-the-art information retrieval systems combining the functional capabilities of conventional inverted file term adjacency approach with…

  5. A user-centred design approach for introducing computer-based process information systems.

    PubMed

    Kontogiannis, T; Embrey, D

    1997-04-01

    There has been an increasing tendency to use computer-based process information systems as the main interface through which operators interact with complex industrial systems. Although the new technology has produced greater hardware reliability and maintainability, the corresponding potential benefits for operability have not always been achieved. Automation has introduced new forms of design and operating errors. One of the major reasons for this problem has been the lack of human factors advice and user participation early in the design process. This paper discusses a user-centred design approach to increase operability and user acceptance of new technologies and working practices. Application of this approach in the context of a chemical plant indicates its promise, but also highlights the difficulties involved in gaining user participation and management commitment.

  6. Kuhlthau's Information Search Process.

    ERIC Educational Resources Information Center

    Shannon, Donna

    2002-01-01

    Explains Kuhlthau's Information Search Process (ISP) model which is based on a constructivist view of learning and provides a framework for school library media specialists for the design of information services and instruction. Highlights include a shift from library skills to information skills; attitudes; process approach; and an interview with…

  7. A flexible statistics web processing service--added value for information systems for experiment data.

    PubMed

    Heimann, Dennis; Nieschulze, Jens; König-Ries, Birgitta

    2010-01-01

    Data management in the life sciences has evolved from simple storage of data to complex information systems providing additional functionalities like analysis and visualization capabilities, demanding the integration of statistical tools. In many cases the statistical tools used are hard-coded within the system. That leads to expensive integration, substitution, or extension of tools, because all changes have to be made in program code. Other systems use generic solutions for tool integration, but adapting them to another system usually requires extensive work. This paper shows a way to provide statistical functionality through a statistics web service, which can be easily integrated into any information system and set up using XML configuration files. The statistical functionality is extendable by simply adding the description of a new application to a configuration file. The service architecture, the data exchange process between client and service, and the addition of analysis applications to the underlying service provider are described. Furthermore, a practical example demonstrates the functionality of the service. PMID:20410555

  8. Information Processing - Administrative Data Processing

    NASA Astrophysics Data System (ADS)

    Bubenko, Janis

    A three-semester, 60-credit course package on the topic of Administrative Data Processing (ADP), offered in 1966 at Stockholm University (SU) and the Royal Institute of Technology (KTH), is described. The package had an information systems engineering orientation. The first semester focused on datalogical topics, while the second semester focused on infological topics. The third semester aimed at deepening the students’ knowledge of different parts of ADP and at the writing of a bachelor thesis. The concluding section of this paper discusses various aspects of the department’s first course effort. The course package led to a concretisation of our discipline and gave our discipline an identity. Our education seemed modern, “just in time”, and well adapted to practical needs. The course package formed the first concrete activity of a group of young teachers and researchers. In a forty-year perspective, these people have further developed the department and the topic into an internationally well-reputed body of knowledge and research. The department has produced more than thirty professors and more than one hundred doctoral degrees.

  9. Challenges in Scheduling Aggregation in CyberPhysical Information Processing Systems

    SciTech Connect

    Horey, James L; Lagesse, Brent J

    2011-01-01

    Data aggregation (a.k.a. reduce operations) is an important element in information processing systems, including MapReduce clusters and cyberphysical networks. Unlike simple sensor networks, all the data in information processing systems must eventually be aggregated. Our goal is to lower overall latency in these systems by intelligently scheduling aggregation on intermediate routing nodes. Unlike previous models, our model explicitly takes into account link latency and computation time. Our model also considers heterogeneous computing capabilities. In order to understand the potential challenges associated with constructing a distributed scheduler that minimizes latency, we've developed a simulation of our model and tested the results of randomly scheduling nodes. Although these experiments were designed to provide data for a null model, preliminary results have yielded a few interesting observations. We show that in cases where the computation time is larger than transmission time, in-network aggregation can have a large effect (reducing latency by 50% or more), but that naive scheduling can have a detrimental effect. Specifically, we show that when the root node (a.k.a. the base station) is faster than the other nodes, the latency can increase with increased coverage, and that these effects vary with the number of nodes present.
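
    A toy simulation in the spirit of the model described above: each intermediate node either aggregates its inputs (paying per-value computation time) or forwards them unreduced (paying extra transmission time), and the completion time at the root depends on the schedule. The tree, parameter values, and the random schedule are illustrative assumptions, not the authors' simulator.

      # Toy in-network aggregation model: nodes that aggregate pay computation
      # time proportional to the values reduced; nodes that do not pay extra
      # link latency for forwarding more values. All parameters are assumptions.
      import random

      def completion_time(node, children, aggregates, comp_time, link_latency):
          """Return (finish_time, values_forwarded) for `node`."""
          ready, incoming = 0.0, 1                      # the node's own reading
          for child in children.get(node, []):
              t_child, v_child = completion_time(child, children, aggregates,
                                                 comp_time, link_latency)
              ready = max(ready, t_child + link_latency * v_child)
              incoming += v_child
          if aggregates[node]:
              return ready + comp_time * incoming, 1    # reduce to a single value
          return ready, incoming                        # forward everything upward

      children = {"root": ["a", "b"], "a": ["a1", "a2"], "b": ["b1", "b2"]}
      nodes = ["root", "a", "b", "a1", "a2", "b1", "b2"]

      random.seed(0)
      schedule = {n: random.random() < 0.5 for n in nodes}
      schedule["root"] = True                           # the base station always reduces
      latency, _ = completion_time("root", children, schedule,
                                   comp_time=2.0, link_latency=1.0)
      print(f"overall latency under a random schedule: {latency:.1f}")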

  10. [Importance of health information systems in the process of reform and reconstruction of health care].

    PubMed

    Ridanović, Z

    1998-01-01

    Reform and reconstruction of the health care system cannot be carried out without health information systems and modern information and communication technologies. On the other hand, the health information system of the Federation of BiH must itself be an object of both reform and reconstruction. This thesis points out that reform of the health information system is a crucial priority for improving and accelerating the overall reform. There is a paradigmatic question: who provides the service, to whom, at what price, and what is the final result? In order to answer this question, an integral, computer-supported health information system is necessary. For integral work and information exchange, computers must be connected and follow the same operating procedures. Benefits of an integral health information system, as well as factors affecting its implementation, are discussed in the paper. PMID:9623089

  11. Incorporating level set methods in Geographical Information Systems (GIS) for land-surface process modeling

    NASA Astrophysics Data System (ADS)

    Pullar, D.

    2005-08-01

    Land-surface processes include a broad class of models that operate at a landscape scale. Current modelling approaches tend to be specialised towards one type of process, yet it is the interaction of processes that is increasingly seen as important to obtain a more integrated approach to land management. This paper presents a technique and a tool that may be applied generically to landscape processes. The technique tracks moving interfaces across landscapes for processes such as water flow, biochemical diffusion, and plant dispersal. Its theoretical development applies a Lagrangian approach to motion over a Eulerian grid space by tracking quantities across a landscape as an evolving front. An algorithm for this technique, called the level set method, is implemented in a geographical information system (GIS). It fits with a field data model in GIS and is implemented as operators in map algebra. The paper describes an implementation of the level set methods in a map algebra programming language, called MapScript, and gives example program scripts for applications in ecology and hydrology.
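
    A minimal sketch of the kind of raster operation the abstract describes: a front, represented as the zero contour of a field phi, advances over a grid under the level set equation phi_t + F|grad phi| = 0. The grid size, uniform speed field, and first-order upwind scheme are simplifying assumptions and are not the MapScript implementation.

      # Minimal level set front propagation on a raster grid (first-order upwind,
      # F >= 0). Grid, speed field, and numerics are illustrative assumptions.
      import numpy as np

      def evolve_front(phi, speed, dt, steps):
          """Advance phi under phi_t + F * |grad phi| = 0 (speed F >= 0)."""
          for _ in range(steps):
              dxm = phi - np.roll(phi, 1, axis=1)       # backward differences
              dxp = np.roll(phi, -1, axis=1) - phi      # forward differences
              dym = phi - np.roll(phi, 1, axis=0)
              dyp = np.roll(phi, -1, axis=0) - phi
              grad = np.sqrt(np.maximum(dxm, 0)**2 + np.minimum(dxp, 0)**2 +
                             np.maximum(dym, 0)**2 + np.minimum(dyp, 0)**2)
              phi = phi - dt * speed * grad             # upwind update
          return phi

      # A circular front (zero contour of phi) expanding at unit speed.
      y, x = np.mgrid[0:64, 0:64]
      phi0 = np.sqrt((x - 32.0)**2 + (y - 32.0)**2) - 5.0
      phi = evolve_front(phi0, speed=1.0, dt=0.5, steps=20)
      radius = float(np.interp(0.0, phi[32, 32:], np.arange(32)))
      print(f"front radius after evolution: about {radius:.1f} cells (started at 5)")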

  12. Business Marketing Information Systems Skills. Voc-Ed Project. Business Data Processing Career Area. Report.

    ERIC Educational Resources Information Center

    Milwaukee Area Technical Coll., WI.

    This report and research analysis relate to the Milwaukee Area Technical College Research Project, a study undertaken to determine a curriculum to meet the information processing/management training needs of individuals entering or continuing careers in the information marketing and business data processing occupational clusters. The report of…

  13. Hybrid quantum information processing

    SciTech Connect

    Furusawa, Akira

    2014-12-04

    I will briefly explain the definition and advantage of hybrid quantum information processing, which is hybridization of qubit and continuous-variable technologies. The final goal would be realization of universal gate sets both for qubit and continuous-variable quantum information processing with the hybrid technologies. For that purpose, qubit teleportation with a continuous-variable teleporter is one of the most important ingredients.

  14. Hybrid quantum information processing

    NASA Astrophysics Data System (ADS)

    Furusawa, Akira

    2014-12-01

    I will briefly explain the definition and advantage of hybrid quantum information processing, which is hybridization of qubit and continuous-variable technologies. The final goal would be realization of universal gate sets both for qubit and continuous-variable quantum information processing with the hybrid technologies. For that purpose, qubit teleportation with a continuous-variable teleporter is one of the most important ingredients.

  15. Optical-mechanical line-scan imaging process - Its information capacity and efficiency. [satellite multispectral sensing systems application

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Park, S. K.

    1975-01-01

    Optical-mechanical line-scan techniques have been applied to earth satellite multispectral imaging systems. The capability of the imaging system is generally assessed by its information capacity. An approach based on information theory is developed to formulate the capacity of the line-scan process. Included are the effects of blurring of spatial detail, photosensor noise, aliasing, and quantization. The information efficiency is shown to be dependent on sampling rate, system frequency response shape, SNR, and quantization interval.
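
    For orientation, a schematic Shannon-type expression for the per-sample information of such a channel is sketched below. This is standard information theory in simplified form; the paper's formulation additionally folds in blurring of spatial detail, aliasing, and quantization.

      % Schematic only, not the paper's exact derivation: per-sample information
      % bound for an amplitude signal-to-noise ratio SNR, and the image total.
      \[
        H_{\mathrm{sample}} \;\le\; \tfrac{1}{2}\log_{2}\!\bigl(1 + \mathrm{SNR}^{2}\bigr),
        \qquad
        H_{\mathrm{image}} \;\approx\; N_x N_y \, H_{\mathrm{sample}},
      \]
      % where N_x N_y is the number of (near-independent) samples; information
      % efficiency can then be read as the ratio of information conveyed to the
      % capacity expended by the scan.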

  16. TEX-SIS FOLLOW-UP: Student Follow-up Management Information System. Data Processing Manual.

    ERIC Educational Resources Information Center

    Tarrant County Junior Coll. District, Ft. Worth, TX.

    Project FOLLOW-UP was conducted to develop, test, and validate a statewide management information system for follow-up of Texas public junior and community college students. The result of this project was a student information system (TEX-SIS) consisting of seven subsystems: (1) Student's Educational Intent, (2) Nonreturning Student Follow-up, (3)…

  17. The Naval Enlisted Professional Development Information System (NEPDIS): Front End Analysis (FEA) Process. Technical Report 159.

    ERIC Educational Resources Information Center

    Aagard, James A.; Ansbro, Thomas M.

    The Naval Enlisted Professional Development Information System (NEPDIS) was designed to function as a fully computerized information assembly and analysis system to support labor force, personnel, and training management. The NEPDIS comprises separate training development, instructional, training record and evaluation, career development, and…

  18. Application of automation and information systems to forensic genetic specimen processing.

    PubMed

    Leclair, Benoît; Scholl, Tom

    2005-03-01

    During the last 10 years, the introduction of PCR-based DNA typing technologies in forensic applications has been highly successful. This technology has become pervasive throughout forensic laboratories and it continues to grow in prevalence. For many criminal cases, it provides the most probative evidence. Criminal genotype data banking and victim identification initiatives that follow mass-fatality incidents have benefited the most from the introduction of automation for sample processing and data analysis. Attributes of offender specimens including large numbers, high quality and identical collection and processing are ideal for the application of laboratory automation. The magnitude of kinship analysis required by mass-fatality incidents necessitates the application of computing solutions to automate the task. More recently, the development activities of many forensic laboratories are focused on leveraging experience from these two applications to casework sample processing. The trend toward increased prevalence of forensic genetic analysis will continue to drive additional innovations in high-throughput laboratory automation and information systems.

  19. Advanced information processing system: The Army Fault-Tolerant Architecture detailed design overview

    NASA Technical Reports Server (NTRS)

    Harper, Richard E.; Babikyan, Carol A.; Butler, Bryan P.; Clasen, Robert J.; Harris, Chris H.; Lala, Jaynarayan H.; Masotto, Thomas K.; Nagle, Gail A.; Prizant, Mark J.; Treadwell, Steven

    1994-01-01

    The Army Avionics Research and Development Activity (AVRADA) is pursuing programs that would enable effective and efficient management of the large amounts of situational data that occur during tactical rotorcraft missions. The Computer Aided Low Altitude Night Helicopter Flight Program has identified automated Terrain Following/Terrain Avoidance, Nap of the Earth (TF/TA, NOE) operation as a key enabling technology for advanced tactical rotorcraft to enhance mission survivability and mission effectiveness. The processing of critical information at low altitudes with short reaction times is life-critical and mission-critical, necessitating an ultra-reliable, high-throughput computing platform for dependable service for flight control, fusion of sensor data, route planning, near-field/far-field navigation, and obstacle avoidance operations. To address these needs the Army Fault Tolerant Architecture (AFTA) is being designed and developed. This computer system is based upon the Fault Tolerant Parallel Processor (FTPP) developed by Charles Stark Draper Laboratory (CSDL). AFTA is a hard real-time, Byzantine fault-tolerant parallel processor which is programmed in the Ada language. This document describes the results of the Detailed Design (Phases 2 and 3 of a 3-year project) of the AFTA development. This document contains detailed descriptions of the program objectives, the TF/TA NOE application requirements, architecture, hardware design, operating systems design, systems performance measurements, and analytical models.

  20. Stochastic thermodynamics of information processing

    NASA Astrophysics Data System (ADS)

    Cardoso Barato, Andre

    2015-03-01

    We consider two recent advancements on theoretical aspects of thermodynamics of information processing. First we show that the theory of stochastic thermodynamics can be generalized to include information reservoirs. These reservoirs can be seen as a sequence of bits which has its Shannon entropy changed due to the interaction with the system. Second we discuss bipartite systems, which provide a convenient description of Maxwell's demon. Analyzing a special class of bipartite systems we show that they can be used to study cellular information processing, allowing for the definition of an entropic rate that quantifies how much a cell learns about a fluctuating external environment and that is bounded by the thermodynamic entropy production.
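
    For readers unfamiliar with the framework, the two results can be stated schematically as below, in standard stochastic-thermodynamics notation rather than the talk's exact equations.

      % Schematic statements only (standard notation, not the talk's equations).
      % (1) With an information reservoir modelled as a tape of bits, the second
      % law generalizes: entropy written to the tape can compensate dissipation,
      \[
        \sigma \;=\; \dot{S}_{\mathrm{sys}} + \dot{S}_{\mathrm{env}} + \dot{H}_{\mathrm{bits}} \;\ge\; 0 .
      \]
      % (2) For a bipartite system (cell sensing a fluctuating environment), the
      % rate at which the cell acquires information about the environment is
      % bounded by the entropy production attributed to the cell's transitions,
      \[
        l_{\mathrm{cell}} \;\le\; \sigma_{\mathrm{cell}} .
      \]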

  1. Processes in construction of failure management expert systems from device design information

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Lance, Nick

    1987-01-01

    This paper analyzes the tasks and problem solving methods used by an engineer in constructing a failure management expert system from design information about the device to be diagnosed. An expert test engineer developed a trouble-shooting expert system based on device design information and experience with similar devices, rather than on specific expert knowledge gained from operating the device or troubleshooting its failures. The construction of the expert system was intensively observed and analyzed. This paper characterizes the knowledge, tasks, methods, and design decisions involved in constructing this type of expert system, and makes recommendations concerning tools for aiding and automating the construction of such systems.

  2. 45 CFR 205.35 - Mechanized claims processing and information retrieval systems; definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., and conversion plans to install the computer system. (c) The following terms are defined at 45 CFR... preparation of a detailed project plan describing when and how the computer system will be designed and... State plan requirements for an automated statewide management information system, conditions for FFP...

  3. 45 CFR 205.35 - Mechanized claims processing and information retrieval systems; definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., and conversion plans to install the computer system. (c) The following terms are defined at 45 CFR... preparation of a detailed project plan describing when and how the computer system will be designed and... State plan requirements for an automated statewide management information system, conditions for FFP...

  4. Information systems definition architecture

    SciTech Connect

    Calapristi, A.J.

    1996-06-20

    The Tank Waste Remediation System (TWRS) Information Systems Definition architecture evaluated Information Management (IM) processes in several key organizations. The intent of the study is to identify improvements in TWRS IM processes that will enable better support to the TWRS mission and accommodate changes in the TWRS business environment. The ultimate goals of the study are to reduce IM costs, manage the configuration of TWRS IM elements, and improve IM-related process performance.

  5. Neural Analog Information Processing

    NASA Astrophysics Data System (ADS)

    Hecht-Nielsen, Robert

    1982-07-01

    Neural Analog Information Processing (NAIP) is an effort to develop general purpose pattern classification architectures based upon biological information processing principles. This paper gives an overview of NAIP and its relationship to the previous work in neural modeling from which its fundamental principles are derived. It also presents a theorem concerning the stability of response of a slab (a two dimensional array of identical simple processing units) to time-invariant (spatial) patterns. An experiment (via computer emulation) demonstrating classification of a spatial pattern by a simple, but complete NAIP architecture is described. A concept for hardware implementation of NAIP architectures is briefly discussed.

  6. Integration of Geographic Information System frameworks into domain discretisation and meshing processes for geophysical models

    NASA Astrophysics Data System (ADS)

    Candy, A. S.; Avdis, A.; Hill, J.; Gorman, G. J.; Piggott, M. D.

    2014-09-01

    Computational simulations of physical phenomena rely on an accurate discretisation of the model domain. Numerical models have increased in sophistication to a level where it is possible to support terrain-following boundaries that conform accurately to real physical interfaces, and resolve a multiscale of spatial resolutions. Whilst simulation codes are maturing in this area, pre-processing tools have not developed significantly enough to competently initialise these problems in a rigorous, efficient and recomputable manner. In the relatively disjoint field of Geographic Information Systems (GIS) however, techniques and tools for mapping and analysis of geographical data have matured significantly. If data provenance and recomputability are to be achieved, the manipulation and agglomeration of data in the pre-processing of numerical simulation initialisation data for geophysical models should be integrated into GIS. A new approach to the discretisation of geophysical domains is presented, and introduced with a verified implementation. This brings together the technologies of geospatial analysis, meshing and numerical simulation models. This platform enables us to combine and build up features, quickly drafting and updating mesh descriptions with the rigour that established GIS tools provide. This, combined with the systematic workflow, supports a strong provenance for model initialisation and encourages the convergence of standards.

  7. PEET: a Matlab tool for estimating physical gate errors in quantum information processing systems

    NASA Astrophysics Data System (ADS)

    Hocker, David; Kosut, Robert; Rabitz, Herschel

    2016-09-01

    A Physical Error Estimation Tool (PEET) is introduced in Matlab for predicting physical gate errors of quantum information processing (QIP) operations by constructing and then simulating gate sequences for a wide variety of user-defined, Hamiltonian-based physical systems. PEET is designed to accommodate the interdisciplinary needs of quantum computing design by assessing gate performance for users familiar with the underlying physics of QIP, as well as those interested in higher-level computing operations. The structure of PEET separates the bulk of the physical details of a system into Gate objects, while the construction of quantum computing gate operations is contained in GateSequence objects. Gate errors are estimated by Monte Carlo sampling of noisy gate operations. The main utility of PEET, though, is the implementation of QuantumControl methods that act to generate and then test gate sequence and pulse-shaping techniques for QIP performance. This work details the structure of PEET and gives instructive examples for its operation.
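
    PEET itself is a Matlab tool; the Python/numpy sketch below only illustrates the core Monte Carlo idea the abstract mentions: sample noisy realizations of a gate, build the resulting unitaries, and average a fidelity-based error. The single-qubit X gate and Gaussian amplitude noise are assumed stand-ins, not PEET's models.

      # Minimal Monte Carlo gate-error estimate (illustrative analogue of the
      # method described above; the gate and noise model are assumptions).
      import numpy as np

      sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)

      def unitary(amplitude, t=np.pi / 2):
          """Rotation about X; amplitude = 1 gives the ideal gate (up to phase)."""
          return np.cos(amplitude * t) * np.eye(2) - 1j * np.sin(amplitude * t) * sigma_x

      def average_gate_error(noise_std, samples=5000, seed=1):
          rng = np.random.default_rng(seed)
          u_ideal = unitary(1.0)
          errors = []
          for _ in range(samples):
              u_noisy = unitary(1.0 + rng.normal(0.0, noise_std))   # sampled noise
              fidelity = abs(np.trace(u_ideal.conj().T @ u_noisy) / 2) ** 2
              errors.append(1.0 - fidelity)
          return float(np.mean(errors))

      print(f"estimated gate error at 1% amplitude noise: {average_gate_error(0.01):.2e}")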

  8. Advanced Information Processing System (AIPS)-based fault tolerant avionics architecture for launch vehicles

    NASA Technical Reports Server (NTRS)

    Lala, Jaynarayan H.; Harper, Richard E.; Jaskowiak, Kenneth R.; Rosch, Gene; Alger, Linda S.; Schor, Andrei L.

    1990-01-01

    An avionics architecture for the advanced launch system (ALS) that uses validated hardware and software building blocks developed under the advanced information processing system program is presented. The AIPS for ALS architecture defined is preliminary, and reliability requirements can be met by the AIPS hardware and software building blocks that are built using the state-of-the-art technology available in the 1992-93 time frame. The level of detail in the architecture definition reflects the level of detail available in the ALS requirements. As the avionics requirements are refined, the architecture can also be refined and defined in greater detail with the help of analysis and simulation tools. A useful methodology is demonstrated for investigating the impact of the avionics suite on the recurring cost of the ALS. It is shown that allowing the vehicle to launch with selected detected failures can potentially reduce the recurring launch costs. A comparative analysis shows that validated fault-tolerant avionics built out of Class B parts can result in lower life-cycle cost in comparison to simplex avionics built out of Class S parts or other redundant architectures.

  9. Advanced Information Processing System (AIPS)-based fault tolerant avionics architecture for launch vehicles

    NASA Astrophysics Data System (ADS)

    Lala, Jaynarayan H.; Harper, Richard E.; Jaskowiak, Kenneth R.; Rosch, Gene; Alger, Linda S.; Schor, Andrei L.

    An avionics architecture for the advanced launch system (ALS) that uses validated hardware and software building blocks developed under the advanced information processing system program is presented. The AIPS for ALS architecture defined is preliminary, and reliability requirements can be met by the AIPS hardware and software building blocks that are built using the state-of-the-art technology available in the 1992-93 time frame. The level of detail in the architecture definition reflects the level of detail available in the ALS requirements. As the avionics requirements are refined, the architecture can also be refined and defined in greater detail with the help of analysis and simulation tools. A useful methodology is demonstrated for investigating the impact of the avionics suite on the recurring cost of the ALS. It is shown that allowing the vehicle to launch with selected detected failures can potentially reduce the recurring launch costs. A comparative analysis shows that validated fault-tolerant avionics built out of Class B parts can result in lower life-cycle cost in comparison to simplex avionics built out of Class S parts or other redundant architectures.

  10. Medical Information Management System

    NASA Technical Reports Server (NTRS)

    Alterescu, S.; Hipkins, K. R.; Friedman, C. A.

    1979-01-01

    On-line interactive information processing system easily and rapidly handles all aspects of data management related to patient care. General purpose system is flexible enough to be applied to other data management situations found in areas such as occupational safety data, judicial information, or personnel records.

  11. Modeling the Retrieval Process for an Information Retrieval System Using an Ordinal Fuzzy Linguistic Approach.

    ERIC Educational Resources Information Center

    Herrera-Viedma, E.

    2001-01-01

    Proposes a linguistic model for an Information Retrieval System (IRS) defined using an ordinal fuzzy linguistic approach. The query subsystem accepts Boolean queries with terms weighted by ordinal linguistic values and the evaluation subsystem returns documents arranged in relevance classes labeled with ordinal linguistic values. The system gives…

  12. Performance assessment and adoption processes of an information monitoring and diagnostic system prototype

    SciTech Connect

    Piette, Mary Ann

    1999-10-01

    This report addresses the problem that buildings do not perform as well as anticipated during design. We partnered with an innovative building operator to evaluate a prototype Information Monitoring and Diagnostic System (IMDS). The IMDS consists of high-quality measurements archived each minute, a data visualization tool, and a web-based capability. The operators recommend that similar technology be adopted in other buildings. The IMDS has been used to identify and correct a series of control problems. It has also allowed the operators to make more effective use of the building control system, freeing up time to take care of other tenant needs. They believe they have significantly improved building comfort, potentially improving tenant health and productivity. The reduction in hours to operate the building is worth about $20,000 per year, which could pay for the IMDS in about five years. A control system retrofit based on findings from the IMDS is expected to reduce energy use by 20 percent over the next year, worth over $30,000 per year. The main conclusion of the model-based chiller fault detection work is that steady-state models can be used as reference models to monitor chiller operation and detect faults. The ability of the IMDS to measure cooling load and chiller power to one-percent accuracy with a one-minute sampling interval permits detection of additional faults. Evolutionary programming techniques were also evaluated, showing promise in the detection of patterns in building data. We also evaluated two technology adoption processes, radical and routine. In routine adoption, managers enhance features of existing products that are already well understood. In radical adoption, innovative building managers introduce novel technology into their organizations without using the rigorous payback criteria used in routine innovations.
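
    A minimal illustration of steady-state reference-model fault detection of the kind the report describes: predict chiller power from cooling load with a fitted reference model and flag operation whose residual exceeds a threshold. The quadratic model form, coefficients, and 5% threshold are assumptions, not the report's calibrated model.

      # Toy steady-state reference-model fault detector (all numbers assumed).
      def predicted_power_kw(load_kw):
          return 15.0 + 0.18 * load_kw + 2.0e-5 * load_kw**2   # fitted reference model

      def detect_fault(load_kw, measured_power_kw, tolerance=0.05):
          expected = predicted_power_kw(load_kw)
          residual = (measured_power_kw - expected) / expected
          return abs(residual) > tolerance, residual

      for load, power in [(400.0, 91.0), (400.0, 105.0)]:
          fault, res = detect_fault(load, power)
          print(f"load={load:.0f} kW  power={power:.0f} kW  residual={res:+.1%}  fault={fault}")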

  13. Development of Energy Models for Production Systems and Processes to Inform Environmentally Benign Decision-Making

    NASA Astrophysics Data System (ADS)

    Diaz-Elsayed, Nancy

    Between 2008 and 2035 global energy demand is expected to grow by 53%. While most industry-level analyses of manufacturing in the United States (U.S.) have traditionally focused on high energy consumers such as the petroleum, chemical, paper, primary metal, and food sectors, the remaining sectors account for the majority of establishments in the U.S. Specifically, of the establishments participating in the Energy Information Administration's Manufacturing Energy Consumption Survey in 2006, the "non-energy intensive" sectors still consumed 4*10^9 GJ of energy, i.e., one-quarter of the energy consumed by the manufacturing sectors, which is enough to power 98 million homes for a year. The increasing use of renewable energy sources and the introduction of energy-efficient technologies in manufacturing operations support the advancement towards a cleaner future, but having a good understanding of how the systems and processes function can reduce the environmental burden even further. To facilitate this, methods are developed to model the energy of manufacturing across three hierarchical levels: production equipment, factory operations, and industry; these methods are used to accurately assess the current state and provide effective recommendations to further reduce energy consumption. First, the energy consumption of production equipment is characterized to provide machine operators and product designers with viable methods to estimate the environmental impact of the manufacturing phase of a product. The energy model of production equipment is tested and found to have an average accuracy of 97% for a product requiring machining with a variable material removal rate profile. However, changing the use of production equipment alone will not result in an optimal solution since machines are part of a larger system. Which machines to use, how to schedule production runs while accounting for idle time, the design of the factory layout to facilitate production, and even the

  14. Improvement of Organizational Performance and Instructional Design: An Analogy Based on General Principles of Natural Information Processing Systems

    ERIC Educational Resources Information Center

    Darabi, Aubteen; Kalyuga, Slava

    2012-01-01

    The process of improving organizational performance through designing systemic interventions has remarkable similarities to designing instruction for improving learners' performance. Both processes deal with subjects (learners and organizations correspondingly) with certain capabilities that are exposed to novel information designed for producing…

  15. Information-processing architectures in multidimensional classification: a validation test of the systems factorial technology.

    PubMed

    Fific, Mario; Nosofsky, Robert M; Townsend, James T

    2008-04-01

    A growing methodology, known as the systems factorial technology (SFT), is being developed to diagnose the types of information-processing architectures (serial, parallel, or coactive) and stopping rules (exhaustive or self-terminating) that operate in tasks of multidimensional perception. Whereas most previous applications of SFT have been in domains of simple detection and visual-memory search, this research extends the applications to foundational issues in multidimensional classification. Experiments are conducted in which subjects are required to classify objects into a conjunctive-rule category structure. In one case the stimuli vary along highly separable dimensions, whereas in another case they vary along integral dimensions. For the separable-dimension stimuli, the SFT methodology revealed a serial or parallel architecture with an exhaustive stopping rule. By contrast, for the integral-dimension stimuli, the SFT methodology provided clear evidence of coactivation. The research provides a validation of the SFT in the domain of classification and adds to the list of converging operations for distinguishing between separable-dimension and integral-dimension interactions.
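
    For readers unfamiliar with SFT, its standard diagnostic quantities are sketched below in textbook form; these definitions belong to the general methodology and are not reproduced from this article.

      % Standard SFT diagnostics (textbook definitions): with each of the two
      % dimensions factorially sped up (H) or slowed down (L),
      \[
        \mathrm{MIC} \;=\; \bigl(\overline{RT}_{LL} - \overline{RT}_{LH}\bigr)
                     - \bigl(\overline{RT}_{HL} - \overline{RT}_{HH}\bigr),
      \]
      \[
        \mathrm{SIC}(t) \;=\; \bigl[S_{LL}(t) - S_{LH}(t)\bigr]
                        - \bigl[S_{HL}(t) - S_{HH}(t)\bigr],
      \]
      % where S(t) is the survivor function of the response-time distribution;
      % the joint signatures of MIC and SIC(t) diagnose serial, parallel, or
      % coactive architectures and the stopping rule.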

  16. Blogs and Social Network Sites as Activity Systems: Exploring Adult Informal Learning Process through Activity Theory Framework

    ERIC Educational Resources Information Center

    Heo, Gyeong Mi; Lee, Romee

    2013-01-01

    This paper uses an Activity Theory framework to explore adult user activities and informal learning processes as reflected in their blogs and social network sites (SNS). Using the assumption that a web-based space is an activity system in which learning occurs, typical features of the components were investigated and each activity system then…

  17. Modeling Business Processes of the Social Insurance Fund in Information System Runa WFE

    NASA Astrophysics Data System (ADS)

    Kataev, M. Yu; Bulysheva, L. A.; Xu, Li D.; Loseva, N. V.

    2016-08-01

    Introduction - Business processes are gradually becoming a tool that allows organizations to deploy their staff more effectively and to make document management systems more efficient; most existing work, and the largest number of publications, concern these directions. However, business processes are still poorly implemented in public institutions, where it is very difficult to formalize the main existing processes. We attempt to build a system of business processes for a state agency, the Russian Social Insurance Fund (SIF), where virtually all processes, whatever their inputs, have the same output: a public service. The parameters of the state services (as a rule, time limits) are set by state laws and regulations. The article provides a brief overview of the SIF, formulates requirements for its business processes, justifies the choice of software for modeling business processes, and describes the creation of process models in the Runa WFE system and the optimization of one of the main business processes of the SIF. The result of the work in Runa WFE is an optimized model of this SIF business process.

  18. A data and information system for processing, archival, and distribution of data for global change research

    NASA Technical Reports Server (NTRS)

    Graves, Sara J.

    1994-01-01

    Work on this project was focused on information management techniques for Marshall Space Flight Center's EOSDIS Version 0 Distributed Active Archive Center (DAAC). The centerpiece of this effort has been participation in EOSDIS catalog interoperability research, the result of which is a distributed Information Management System (IMS) allowing the user to query the inventories of all the DAACs from a single user interface. UAH has provided the MSFC DAAC database server for the distributed IMS, and has contributed to the definition and development of the browse image display capabilities in the system's user interface. Another important area of research has been in generating value-based metadata through data mining. In addition, information management applications for local inventory and archive management, and for tracking data orders, were provided.

  19. Applications of Social Science to Management Information Systems and Evaluation Process: A Peace Corps Model.

    ERIC Educational Resources Information Center

    Lassey, William R.; And Others

    This study discusses some of the central concepts, assumptions and methods used in the development and design of a Management Information and Evaluation System for the Peace Corps in Colombia. Methodological problems encountered are reviewed. The model requires explicit project or program objectives, individual staff behavioral objectives, client…

  20. Selecting an Information System for the '90s: Can a User Driven Process Work?

    ERIC Educational Resources Information Center

    Jonas, Stephen; And Others

    In 1988, Sinclair Community College (SCC) began a comprehensive study of the need for a new administrative information system that would improve the college's effectiveness and flexibility in providing educational and administrative services. A planning committee provided college-wide coordination for the development of a request for proposal…

  1. A study on airborne integrated display system and human information processing

    NASA Technical Reports Server (NTRS)

    Mizumoto, K.; Iwamoto, H.; Shimizu, S.; Kuroda, I.

    1983-01-01

    The cognitive behavior of pilots was examined in an experiment involving mock-ups of an eight-display electronic attitude direction indicator for an airborne integrated display. Displays were presented in digital, analog-digital, and analog format to experienced pilots. Two tests were run, one involving the speed of memorization in a single exposure and the other comprising two five-second exposures spaced 30 sec apart. Errors increased with the speed of memorization. Generally, analog information was assimilated faster than digital data in terms of response speed. Information processing was quantified as 25 bits for the first five-second exposure and 15 bits during the second.

  2. Formalize clinical processes into electronic health information systems: Modelling a screening service for diabetic retinopathy.

    PubMed

    Eguzkiza, Aitor; Trigo, Jesús Daniel; Martínez-Espronceda, Miguel; Serrano, Luis; Andonegui, José

    2015-08-01

    Most healthcare services use information and communication technologies to reduce and redistribute the workload associated with follow-up of chronic conditions. However, the lack of normalization of the information handled in and exchanged between such services hinders the scalability and extendibility. The use of medical standards for modelling and exchanging information, especially dual-model based approaches, can enhance the features of screening services. Hence, the approach of this paper is twofold. First, this article presents a generic methodology to model patient-centered clinical processes. Second, a proof of concept of the proposed methodology was conducted within the diabetic retinopathy (DR) screening service of the Health Service of Navarre (Spain) in compliance with a specific dual-model norm (openEHR). As a result, a set of elements required for deploying a model-driven DR screening service has been established, namely: clinical concepts, archetypes, termsets, templates, guideline definition rules, and user interface definitions. This model fosters reusability, because those elements are available to be downloaded and integrated in any healthcare service, and interoperability, since from then on such services can share information seamlessly.

  3. Formalize clinical processes into electronic health information systems: Modelling a screening service for diabetic retinopathy.

    PubMed

    Eguzkiza, Aitor; Trigo, Jesús Daniel; Martínez-Espronceda, Miguel; Serrano, Luis; Andonegui, José

    2015-08-01

    Most healthcare services use information and communication technologies to reduce and redistribute the workload associated with follow-up of chronic conditions. However, the lack of normalization of the information handled in and exchanged between such services hinders the scalability and extendibility. The use of medical standards for modelling and exchanging information, especially dual-model based approaches, can enhance the features of screening services. Hence, the approach of this paper is twofold. First, this article presents a generic methodology to model patient-centered clinical processes. Second, a proof of concept of the proposed methodology was conducted within the diabetic retinopathy (DR) screening service of the Health Service of Navarre (Spain) in compliance with a specific dual-model norm (openEHR). As a result, a set of elements required for deploying a model-driven DR screening service has been established, namely: clinical concepts, archetypes, termsets, templates, guideline definition rules, and user interface definitions. This model fosters reusability, because those elements are available to be downloaded and integrated in any healthcare service, and interoperability, since from then on such services can share information seamlessly. PMID:26049092

  4. Examination of the Nonlinear Dynamic Systems Associated with Science Student Cognition While Engaging in Science Information Processing

    ERIC Educational Resources Information Center

    Lamb, Richard; Cavagnetto, Andy; Akmal, Tariq

    2016-01-01

    A critical problem with the examination of learning in education is that there is an underlying assumption that the dynamic systems associated with student information processing can be measured using static linear assessments. This static linear approach does not provide sufficient ability to characterize learning. Much of the modern research…

  5. An Ada implementation of the network manager for the advanced information processing system

    NASA Technical Reports Server (NTRS)

    Nagle, Gail A.

    1986-01-01

    From an implementation standpoint, the Ada language provided many features which facilitated the data and procedure abstraction process. The language supported a design which was dynamically flexible (despite strong typing), modular, and self-documenting. Adequate training of programmers requires access to an efficient compiler which supports full Ada. When the performance issues for real time processing are finally addressed by more stringent requirements for tasking features and the development of efficient run-time environments for embedded systems, the full power of the language will be realized.

  6. Organization of Neural Systems for Aversive Information Processing: Pain, Error, and Punishment

    PubMed Central

    Kobayashi, Shunsuke

    2012-01-01

    The avoidance of aversive events is critically important for the survival of organisms. It has been proposed that the medial pain system, including the amygdala, periaqueductal gray (PAG), and anterior cingulate cortex (ACC), contains the neural circuitry that signals pain affect and negative value. This system appears to have multiple defense mechanisms, such as rapid stereotyped escape, aversive association learning, and cognitive adaptation. These defense mechanisms vary in speed and flexibility, reflecting different strategies of self-protection. Over the course of evolution, the medial pain system appears to have developed primitive, associative, and cognitive solutions for aversive avoidance. There may be a functional grading along the caudal-rostral axis, such that the amygdala-PAG system underlies automatic and autonomic responses, the amygdala-orbitofrontal system contributes to associative learning, and the ACC controls cognitive processes in cooperation with the lateral prefrontal cortex. A review of behavioral and physiological studies on the aversive system is presented, and a conceptual framework for understanding the neural organization of the aversive avoidance system is proposed. PMID:23049496

  7. [A Medical Devices Management Information System Supporting Full Life-Cycle Process Management].

    PubMed

    Tang, Guoping; Hu, Liang

    2015-07-01

    Medical equipment is an essential supply for carrying out medical work. How to ensure the safety and reliability of medical equipment in diagnosis, while reducing procurement and maintenance costs, is a topic of concern to everyone. In this paper, product lifecycle management (PLM) and enterprise resource planning (ERP) are drawn upon to establish a life-cycle management information system. Through the integration and analysis of the relevant data from the various stages of the life-cycle, the system can ensure the safety and reliability of medical equipment in operation and provide convincing data for meticulous management. PMID:26665958

  8. [A Medical Devices Management Information System Supporting Full Life-Cycle Process Management].

    PubMed

    Tang, Guoping; Hu, Liang

    2015-07-01

    Medical equipment is an essential supply for carrying out medical work. How to ensure the safety and reliability of medical equipment in diagnosis, while reducing procurement and maintenance costs, is a topic of concern to everyone. In this paper, product lifecycle management (PLM) and enterprise resource planning (ERP) are drawn upon to establish a life-cycle management information system. Through the integration and analysis of the relevant data from the various stages of the life-cycle, the system can ensure the safety and reliability of medical equipment in operation and provide convincing data for meticulous management.

  9. Development of a prototype spatial information processing system for hydrologic research

    NASA Technical Reports Server (NTRS)

    Sircar, Jayanta K.

    1991-01-01

    Significant advances have been made in the last decade in the areas of Geographic Information Systems (GIS) and spatial analysis technology, both in hardware and software. Science user requirements are so problem-specific that currently no single system can satisfy all of the needs. The work presented here forms part of a conceptual framework for an all-encompassing science-user workstation system. While definition and development of the system as a whole will take several years, it is intended that small-scale projects such as the current work will address some of the more short-term needs. Such projects can provide a quick mechanism to integrate tools into the workstation environment, forming a larger, more complete hydrologic analysis platform. Two components that are very important to the practical use of remote sensing and digital map data in hydrology are described here: a graph-theoretic technique to rasterize elevation contour maps, and a system to manipulate synthetic aperture radar (SAR) data files and extract soil moisture data.

  10. Evaluation of the clinical process in a critical care information system using the Lean method: a case study

    PubMed Central

    2012-01-01

    Background There are numerous applications for Health Information Systems (HIS) that support specific tasks in the clinical workflow. The Lean method has been used increasingly to optimize clinical workflows, by removing waste and shortening the delivery cycle time. There are a limited number of studies on Lean applications related to HIS. Therefore, we applied the Lean method to evaluate the clinical processes related to HIS, in order to evaluate its efficiency in removing waste and optimizing the process flow. This paper presents the evaluation findings of these clinical processes, with regards to a critical care information system (CCIS), known as IntelliVue Clinical Information Portfolio (ICIP), and recommends solutions to the problems that were identified during the study. Methods We conducted a case study under actual clinical settings, to investigate how the Lean method can be used to improve the clinical process. We used observations, interviews, and document analysis, to achieve our stated goal. We also applied two tools from the Lean methodology, namely the Value Stream Mapping and the A3 problem-solving tools. We used eVSM software to plot the Value Stream Map and A3 reports. Results We identified a number of problems related to inefficiency and waste in the clinical process, and proposed an improved process model. Conclusions The case study findings show that the Value Stream Mapping and the A3 reports can be used as tools to identify waste and integrate the process steps more efficiently. We also proposed a standardized and improved clinical process model and suggested an integrated information system that combines database and software applications to reduce waste and data redundancy. PMID:23259846

  11. Application of digital image processing techniques and information systems to water quality monitoring of Lake Tahoe

    NASA Technical Reports Server (NTRS)

    Smith, A. Y.; Blackwell, R. J.

    1981-01-01

    The Tahoe basin occupies over 500 square miles of territory located in a graben straddling the boundary between California and Nevada. Lake Tahoe contains 126 million acre-feet of water. Since the 1950's the basin has experienced an ever increasing demand for land development at the expense of the natural watershed. Discharge of sediment to the lake has greatly increased owing to accelerated human interference, and alterations to the natural drainage patterns are evident in some areas. In connection with an investigation of the utility of a comprehensive system that takes into account the causes as well as the effects of lake eutrophication, an attempt has been made to construct an integrated and workable data base composed of currently available data sources for the Lake Tahoe region. Attention is given to the image based information system (IBIS), the construction of the Lake Tahoe basin data base, and the application of the IBIS concept to the Lake Tahoe basin.

  12. Engineering Review Information System

    NASA Technical Reports Server (NTRS)

    Grems, III, Edward G. (Inventor); Henze, James E. (Inventor); Bixby, Jonathan A. (Inventor); Roberts, Mark (Inventor); Mann, Thomas (Inventor)

    2015-01-01

    A disciplinal engineering review computer information system and method are provided by defining a database of disciplinal engineering review process entities for an enterprise engineering program, opening a computer-supported engineering item based upon the defined disciplinal engineering review process entities, managing a review of the opened engineering item according to the defined disciplinal engineering review process entities, and closing the opened engineering item according to the opened engineering item review.

  13. A Pressure Injection System for Investigating the Neuropharmacology of Information Processing in Awake Behaving Macaque Monkey Cortex

    PubMed Central

    Veith, Vera K.; Quigley, Cliodhna; Treue, Stefan

    2016-01-01

    The top-down modulation of feed-forward cortical information processing is functionally important for many cognitive processes, including the modulation of sensory information processing by attention. However, little is known about which neurotransmitter systems are involved in such modulations. A practical way to address this question is to combine single-cell recording with local and temporary neuropharmacological manipulation in a suitable animal model. Here we demonstrate a technique combining acute single-cell recordings with the injection of neuropharmacological agents in the direct vicinity of the recording electrode. The video shows the preparation of the pressure injection/recording system, including preparation of the substance to be injected. We show a rhesus monkey performing a visual attention task and the procedure of single-unit recording with block-wise pharmacological manipulations. PMID:27023110

  14. Next generation information systems

    SciTech Connect

    Limback, Nathan P; Medina, Melanie A; Silva, Michelle E

    2010-01-01

    The Information Systems Analysis and Development (ISAD) Team of the Safeguards Systems Group at Los Alamos National Laboratory (LANL) has been developing web based information and knowledge management systems for sixteen years. Our vision is to rapidly and cost effectively provide knowledge management solutions in the form of interactive information systems that help customers organize, archive, post and retrieve nonproliferation and safeguards knowledge and information vital to their success. The team has developed several comprehensive information systems that assist users in the betterment and growth of their organizations and programs. Through our information systems, users are able to streamline operations, increase productivity, and share and access information from diverse geographic locations. The ISAD team is also producing interactive visual models. Interactive visual models provide many benefits to customers beyond the scope of traditional full-scale modeling. We have the ability to simulate a vision that a customer may propose, without the time constraints of traditional engineering modeling tools. Our interactive visual models can be used to access specialized training areas, controlled areas, and highly radioactive areas, as well as review site-specific training for complex facilities, and asset management. Like the information systems that the ISAD team develops, these models can be shared and accessed from any location with access to the internet. The purpose of this paper is to elaborate on the capabilities of information systems and interactive visual models as well as consider the possibility of combining the two capabilities to provide the next generation of information systems. The collection, processing, and integration of data in new ways can contribute to the security of the nation by providing indicators and information for timely action to decrease the traditional and new nuclear threats. Modeling and simulation tied to comprehensive

  15. Smoking Abstinence and Depressive Symptoms Modulate the Executive Control System During Emotional Information Processing

    PubMed Central

    Froeliger, Brett; Modlin, Leslie A.; Kozink, Rachel V.; Wang, Lihong; McClernon, F. Joseph

    2011-01-01

    Background Smoking abstinence disrupts affective and cognitive processes. In this study, functional magnetic resonance imaging (fMRI) was used to investigate the effects of smoking abstinence on emotional information processing (EIP). Methods Smokers (n=17) and nonsmokers (n=18) underwent fMRI while performing an emotional distractor oddball task in which rare targets were presented following negative and neutral task-irrelevant distractors. Smokers completed two sessions: once following 24-hr abstinence and once while satiated. The abstinent versus satiated states were compared by evaluating responses to distractor images and to targets following each distractor valence within frontal executive and limbic brain regions. Regression analyses were done to investigate whether self-reported negative affect influences brain response to images and targets. Exploratory regression analyses examined relations between baseline depressive symptoms and smoking state on brain function. Results Smoking state affected response to target detection in the right inferior frontal gyrus (IFG). During satiety, activation was greater in response to targets following negative versus neutral distractors; following abstinence, the reverse was observed. Withdrawal-related negative affect was associated with right insula activation to negative images. Finally, depression symptoms were associated with abstinence-induced hypoactive response to negative emotional distractors and task-relevant targets following negative distractors in frontal brain regions. Conclusions Neural processes related to novelty detection/attention in the right IFG may be disrupted by smoking abstinence and negative stimuli. Reactivity to emotional stimuli and the interfering effects on cognition are moderated by the magnitude of smoking state-dependent negative affect and baseline depressive symptoms. PMID:22081878

  16. Introduction to Focus Issue: Intrinsic and Designed Computation: Information Processing in Dynamical Systems-Beyond the Digital Hegemony

    NASA Astrophysics Data System (ADS)

    Crutchfield, James P.; Ditto, William L.; Sinha, Sudeshna

    2010-09-01

    How dynamical systems store and process information is a fundamental question that touches a remarkably wide set of contemporary issues: from the breakdown of Moore's scaling laws—that predicted the inexorable improvement in digital circuitry—to basic philosophical problems of pattern in the natural world. It is a question that also returns one to the earliest days of the foundations of dynamical systems theory, probability theory, mathematical logic, communication theory, and theoretical computer science. We introduce the broad and rather eclectic set of articles in this Focus Issue that highlights a range of current challenges in computing and dynamical systems.

  17. Process evaluation distributed system

    NASA Technical Reports Server (NTRS)

    Moffatt, Christopher L. (Inventor)

    2006-01-01

    The distributed system includes a database server, an administration module, a process evaluation module, and a data display module. The administration module is in communication with the database server for providing observation criteria information to the database server. The process evaluation module is in communication with the database server for obtaining the observation criteria information from the database server and collecting process data based on the observation criteria information. The process evaluation module utilizes a personal digital assistant (PDA). A data display module in communication with the database server, including a website for viewing collected process data in a desired metrics form, the data display module also for providing desired editing and modification of the collected process data. The connectivity established by the database server to the administration module, the process evaluation module, and the data display module, minimizes the requirement for manual input of the collected process data.

  18. Information processing of motion in facial expression and the geometry of dynamical systems

    NASA Astrophysics Data System (ADS)

    Assadi, Amir H.; Eghbalnia, Hamid; McMenamin, Brenton W.

    2004-12-01

    An interesting problem in analysis of video data concerns design of algorithms that detect perceptually significant features in an unsupervised manner, for instance methods of machine learning for automatic classification of human expression. A geometric formulation of this genre of problems could be modeled with help of perceptual psychology. In this article, we outline one approach for a special case where video segments are to be classified according to expression of emotion or other similar facial motions. The encoding of realistic facial motions that convey expression of emotions for a particular person P forms a parameter space XP whose study reveals the "objective geometry" for the problem of unsupervised feature detection from video. The geometric features and discrete representation of the space XP are independent of subjective evaluations by observers. While the "subjective geometry" of XP varies from observer to observer, levels of sensitivity and variation in perception of facial expressions appear to share a certain level of universality among members of similar cultures. Therefore, statistical geometry of invariants of XP for a sample of population could provide effective algorithms for extraction of such features. In cases where frequency of events is sufficiently large in the sample data, a suitable framework could be provided to facilitate the information-theoretic organization and study of statistical invariants of such features. This article provides a general approach to encode motion in terms of a particular genre of dynamical systems and the geometry of their flow. An example is provided to illustrate the general theory.

  19. Information processing of motion in facial expression and the geometry of dynamical systems

    NASA Astrophysics Data System (ADS)

    Assadi, Amir H.; Eghbalnia, Hamid; McMenamin, Brenton W.

    2005-01-01

    An interesting problem in analysis of video data concerns design of algorithms that detect perceptually significant features in an unsupervised manner, for instance methods of machine learning for automatic classification of human expression. A geometric formulation of this genre of problems could be modeled with help of perceptual psychology. In this article, we outline one approach for a special case where video segments are to be classified according to expression of emotion or other similar facial motions. The encoding of realistic facial motions that convey expression of emotions for a particular person P forms a parameter space XP whose study reveals the "objective geometry" for the problem of unsupervised feature detection from video. The geometric features and discrete representation of the space XP are independent of subjective evaluations by observers. While the "subjective geometry" of XP varies from observer to observer, levels of sensitivity and variation in perception of facial expressions appear to share a certain level of universality among members of similar cultures. Therefore, statistical geometry of invariants of XP for a sample of population could provide effective algorithms for extraction of such features. In cases where frequency of events is sufficiently large in the sample data, a suitable framework could be provided to facilitate the information-theoretic organization and study of statistical invariants of such features. This article provides a general approach to encode motion in terms of a particular genre of dynamical systems and the geometry of their flow. An example is provided to illustrate the general theory.

  20. The evolutionary function of conscious information processing is revealed by its task-dependency in the olfactory system

    PubMed Central

    Keller, Andreas

    2014-01-01

    Although many responses to odorous stimuli are mediated without olfactory information being consciously processed, some olfactory behaviors require conscious information processing. I will here contrast situations in which olfactory information is processed consciously to situations in which it is processed non-consciously. This contrastive analysis reveals that conscious information processing is required when an organism is faced with tasks in which there are many behavioral options available. I therefore propose that it is the evolutionary function of conscious information processing to guide behaviors in situations in which the organism has to choose between many possible responses. PMID:24550876

  1. Hybrid quantum information processing

    NASA Astrophysics Data System (ADS)

    Furusawa, Akira

    2013-03-01

    There are two types of schemes for quantum information processing (QIP). One is based on qubits, and the other is based on continuous variables (CVs), where the computational basis for qubit QIP is { | 0 > , | 1 > } and that for CV QIP is { | x > } (- ∞ < x < ∞). A universal gate set for qubit QIP is {'bit flip' σx, 'phase flip' σz, 'Hadamard gate' H, 'π/8 gate', 'controlled-NOT (CNOT) gate'}. Similarly, a universal gate set for CV QIP is {'x-displacement' D̂(x), 'p-displacement' D̂(ip), 'Fourier gate' F̂, 'cubic phase gate' e^(ikx̂³), 'quantum non-demolition (QND) gate'}. There is a one-to-one correspondence between them. The CV version of the 'bit flip' σx is the 'x-displacement' D̂(x), which changes the value of the computational basis. Similarly, the CV version of the 'phase flip' σz is the 'p-displacement' D̂(ip): the 'phase flip' σz switches the "value" of the qubit conjugate basis { | + > , | - > } (| ± > = (| 0 > ± | 1 >)/√2), and the 'p-displacement' D̂(ip) changes the value of the CV conjugate basis { | p > }. The 'Hadamard' and 'Fourier' gates transform the computational bases to the respective conjugate bases. The CV version of the 'π/8 gate' is the 'cubic phase gate' e^(ikx̂³), and the CV version of the CNOT gate is a QND gate. However, the origin of the nonlinearity needed for QIP is quite different: the most basic nonlinear operation is multiplication, which is of course the heart of information processing. The nonlinearity of qubit QIP comes from a CNOT gate, while that of CV QIP comes from a cubic phase gate. Since nonlinear operations are harder to realize than linear operations, the most difficult operation for qubits is a CNOT gate, while its CV counterpart, a QND gate, is not so difficult. CNOT and QND gates are both entangling gates, so creating entanglement is easier for CV QIP than for qubit QIP. Here, creating entanglement is the heart of QIP, so this is a big advantage of CV QIP. On
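
    The qubit side of this gate-set correspondence can be checked numerically. The following Python sketch (an illustration only, not taken from the article) verifies that conjugating the bit flip by the Hadamard gate yields the phase flip, mirroring how the Fourier gate exchanges x- and p-displacements, and shows the CNOT gate producing entanglement.

      import numpy as np

      # Qubit universal-gate building blocks quoted in the abstract.
      X = np.array([[0, 1], [1, 0]])                    # bit flip (sigma_x)
      Z = np.array([[1, 0], [0, -1]])                   # phase flip (sigma_z)
      H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard gate
      T = np.diag([1, np.exp(1j * np.pi / 4)])          # pi/8 gate (listed for completeness)
      CNOT = np.array([[1, 0, 0, 0],
                       [0, 1, 0, 0],
                       [0, 0, 0, 1],
                       [0, 0, 1, 0]])                   # controlled-NOT (entangling gate)

      # The Hadamard gate maps the computational basis to the conjugate basis,
      # so it turns a bit flip into a phase flip: H X H = Z.
      assert np.allclose(H @ X @ H, Z)

      # CNOT acting on |+>|0> yields the Bell state (|00> + |11>)/sqrt(2),
      # illustrating why the entangling gate is central to the gate set.
      plus = np.array([1, 1]) / np.sqrt(2)
      zero = np.array([1, 0])
      print(np.round(CNOT @ np.kron(plus, zero), 3))    # [0.707 0. 0. 0.707]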

  2. Computerization of workflows, guidelines, and care pathways: a review of implementation challenges for process-oriented health information systems

    PubMed Central

    Roudsari, Abdul

    2011-01-01

    Objective There is a need to integrate the various theoretical frameworks and formalisms for modeling clinical guidelines, workflows, and pathways, in order to move beyond providing support for individual clinical decisions and toward the provision of process-oriented, patient-centered, health information systems (HIS). In this review, we analyze the challenges in developing process-oriented HIS that formally model guidelines, workflows, and care pathways. Methods A qualitative meta-synthesis was performed on studies published in English between 1995 and 2010 that addressed the modeling process and reported the exposition of a new methodology, model, system implementation, or system architecture. Thematic analysis, principal component analysis (PCA) and data visualisation techniques were used to identify and cluster the underlying implementation ‘challenge’ themes. Results One hundred and eight relevant studies were selected for review. Twenty-five underlying ‘challenge’ themes were identified. These were clustered into 10 distinct groups, from which a conceptual model of the implementation process was developed. Discussion and conclusion We found that the development of systems supporting individual clinical decisions is evolving toward the implementation of adaptable care pathways on the semantic web, incorporating formal, clinical, and organizational ontologies, and the use of workflow management systems. These architectures now need to be implemented and evaluated on a wider scale within clinical settings. PMID:21724740

  3. Automated process planning system

    NASA Technical Reports Server (NTRS)

    Mann, W.

    1978-01-01

    Program helps process engineers set up manufacturing plans for machined parts. System allows one to develop and store library of similar parts characteristics, as related to particular facility. Information is then used in interactive system to help develop manufacturing plans that meet required standards.

  4. The information processing organisms.

    PubMed

    Arianova, L

    1996-06-01

    In spite of the tremendous progress of biological science in recent decades, many aspects of the behaviour of organisms in general and of humans in particular still remain somewhat obscure. A new approach towards the study of the behaviour of man was presented by Heisenberg when he emphasized that a Cartesian view of nature as an object "out there" is an illusion in so far as "the observer is always part of the formula, the man viewing nature must be figured in, the experimenter into his experiment and the artist in the scene he paints." (Heisenberg, 1969). The present study is an attempt to make a step forward in this direction by focusing on the ways and means of involvement of the observer which make him an indelible part of the observation. To get a fresh start, let us have a look at the physical universe. Although showing an immense variety, all objects, living and non-living, have some characteristics in common. They all obey the physical laws and they all are engaged in perpetual interactions. How, then, do we tell the difference between living and non-living objects? According to the traditional concept, it is the capacity for reproduction that distinguishes living from non-living objects (Luria et al., 1981). The non-traditional concept presented in this study stresses the way in which objects interact as the crucial point of difference between living and non-living objects. This concept claims that living objects assert themselves as such only when and while interacting in terms of information processing. Under such conditions only, living objects are able to display relative independence of the physical laws, for instance active movement. This display of relative independence is governed by biological laws and defines the behaviour of the living objects as active in principle. All objects that share these characteristics are called living; they behave as wholes, asserting themselves as individuals. The definition suggests that they all share the same internal

  5. MARKETING WESTERN WATER: CAN A PROCESS BASED GEOGRAPHIC INFORMATION SYSTEM IMPROVE REALLOCATION DECISIONS? (R828070)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  6. Spitzer Telemetry Processing System

    NASA Technical Reports Server (NTRS)

    Stanboli, Alice; Martinez, Elmain M.; McAuley, James M.

    2013-01-01

    The Spitzer Telemetry Processing System (SirtfTlmProc) was designed to address objectives of JPL's Multi-mission Image Processing Lab (MIPL) in processing spacecraft telemetry and distributing the resulting data to the science community. To minimize costs and maximize operability, the software design focused on automated error recovery, performance, and information management. The system processes telemetry from the Spitzer spacecraft and delivers Level 0 products to the Spitzer Science Center. SirtfTlmProc is a unique system with automated error notification and recovery, with a real-time continuous service that can go quiescent after periods of inactivity. The software can process 2 GB of telemetry and deliver Level 0 science products to the end user in four hours. It provides analysis tools so the operator can manage the system and troubleshoot problems. It automates telemetry processing in order to reduce staffing costs.

  7. Development of Automatic Live Linux Rebuilding System with Flexibility in Science and Engineering Education and Applying to Information Processing Education

    NASA Astrophysics Data System (ADS)

    Sonoda, Jun; Yamaki, Kota

    We develop an automatic Live Linux rebuilding system for science and engineering education, such as information processing education, numerical analysis, and so on. Our system can easily and automatically rebuild a customized Live Linux from an ISO image of Ubuntu, which is one of the Linux distributions. It also makes it easy to install/uninstall packages and to enable/disable init daemons. When we rebuild a Live Linux CD using our system, the number of operations is 8, and the rebuilding time is about 33 minutes for the CD version and about 50 minutes for the DVD version. Moreover, we have applied the rebuilt Live Linux CD in an information processing education class at our college. The results of a questionnaire survey of the 43 students who used the Live Linux CD show that our Live Linux is useful for about 80 percent of the students. From these results, we conclude that our system can easily and automatically rebuild a useful Live Linux in a short time.

  8. Integrated healthcare information systems.

    PubMed

    Miller, J

    1995-01-01

    When it comes to electronic data processing in healthcare, we offer a guarded, but hopeful, prognosis. To be sure, the age of electronic information processing has hit healthcare. Employers, insurance companies, hospitals, physicians and a host of ancillary service providers are all being ushered into a world of high speed, high tech electronic information. Some are even predicting that the health information business will grow from $20 billion to over $100 billion in a decade. Yet, our industry lags behind other industries in its overall movement to the paperless world. Selecting and installing the most advanced integrated information system isn't a simple task, as we've seen. As in life, compromises can produce less than optimal results. Nevertheless, integrated healthcare systems simply won't achieve their goals without systems designed to support the operation of a continuum of services. That's the reality! It is difficult to read about the wonderful advances in other sectors, while realizing that many trees still fall each year in the name of the health care industry. Yes, there are some outstanding examples of organizations pushing the envelope in a variety of areas. Yet from a very practical standpoint, many (like our physician's office) are still struggling or are on the sidelines wondering what to do. Given the competitive marketplace, organizations without effective systems may not have long to wonder and wait.

  9. Implications of Information Processing to Reading Research.

    ERIC Educational Resources Information Center

    Geyer, John J.

    Information processing is discussed as a rapid coalescing of basic disciplines around a point of view with relevance to the reading processes and ultimately to learning to read. Two types of reading models under information processing are analyzed: the O-type model which delineates the organismic systems operating between input and output at a…

  10. 76 FR 52581 - Automated Data Processing and Information Retrieval System Requirements

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-23

    ... related Notice published at [48 FR 29114 for SNP; 48 FR 29115 for FSP], June 24, 1983, this Program is... information collection requirements that will be merged into OMB Control Number 0584-0083, once approved by... checklist have been met. The certifying agency issues some sort of statement or document attesting to...

  11. Using Fuzzy Analytic Hierarchy Process multicriteria and Geographical information system for coastal vulnerability analysis in Morocco: The case of Mohammedia

    NASA Astrophysics Data System (ADS)

    Tahri, Meryem; Maanan, Mohamed; Hakdaoui, Mustapha

    2016-04-01

    This paper presents a method to assess vulnerability to coastal risks such as coastal erosion or marine submersion by applying the Fuzzy Analytic Hierarchy Process (FAHP) and spatial analysis techniques with a Geographic Information System (GIS). The coast of Mohammedia, located in Morocco, was chosen as the study site to implement and validate the proposed framework by applying a GIS-FAHP based methodology. The coastal risk vulnerability mapping uses multi-parametric causative factors such as sea level rise, significant wave height, tidal range, coastal erosion, elevation, geomorphology and distance to urban areas. The Fuzzy Analytic Hierarchy Process methodology enables the calculation of the corresponding criteria weights. The results show that the coastline of Mohammedia is characterized by moderate, high and very high levels of vulnerability to coastal risk. The high vulnerability areas are situated in the east, at the Monika and Sablette beaches. This technical approach relies on the efficiency of the Geographic Information System tool, combined with the Fuzzy Analytical Hierarchy Process, to help decision makers find optimal strategies to minimize coastal risks.
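
    Once the FAHP weights are known, the mapping step reduces to a weighted overlay of normalized criterion layers. The Python sketch below is a generic illustration with assumed weights and random stand-in rasters, not the data or weights of this study.

      import numpy as np

      # Hypothetical criterion rasters, already normalized to 0..1 (1 = most vulnerable).
      # In the study these would be GIS layers for sea level rise, wave height, etc.
      rng = np.random.default_rng(0)
      names = ["sea_level_rise", "wave_height", "tidal_range", "erosion",
               "elevation", "geomorphology", "urban_distance"]
      criteria = {name: rng.random((100, 100)) for name in names}

      # Illustrative criterion weights summing to 1; in the paper these come from
      # the fuzzy pairwise-comparison (FAHP) procedure.
      weights = {"sea_level_rise": 0.20, "wave_height": 0.18, "tidal_range": 0.10,
                 "erosion": 0.22, "elevation": 0.15, "geomorphology": 0.10,
                 "urban_distance": 0.05}

      # Weighted linear combination: one vulnerability index value per raster cell.
      index = sum(weights[k] * criteria[k] for k in names)

      # Classify the continuous index into five vulnerability levels for mapping.
      levels = np.digitize(index, bins=[0.3, 0.45, 0.55, 0.7])   # 0 = very low .. 4 = very high
      print(np.bincount(levels.ravel(), minlength=5))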

  12. Information Processing Theory: Classroom Applications.

    ERIC Educational Resources Information Center

    Slate, John R.; Charlesworth, John R., Jr.

    The information processing model, a theoretical framework of how humans think, reason, and learn, views human cognitive functioning as analogous to the operation of a computer. This paper uses the increased understanding of the information processing model to provide teachers with suggestions for improving the teaching-learning process. Major…

  13. Laboratory Information Systems.

    PubMed

    Henricks, Walter H

    2015-06-01

    Laboratory information systems (LISs) supply mission-critical capabilities for the vast array of information-processing needs of modern laboratories. LIS architectures include mainframe, client-server, and thin client configurations. The LIS database software manages a laboratory's data. LIS dictionaries are database tables that a laboratory uses to tailor an LIS to the unique needs of that laboratory. Anatomic pathology LIS (APLIS) functions play key roles throughout the pathology workflow, and laboratories rely on LIS management reports to monitor operations. This article describes the structure and functions of APLISs, with emphasis on their roles in laboratory operations and their relevance to pathologists.

  14. 42 CFR 433.116 - FFP for operation of mechanized claims processing and information retrieval systems.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... calendar quarter). Subject to 45 CFR 95.611(a), the State shall obtain prior written approval from CMS when... determination systems that do not meet the standards and conditions by December 31, 2015....

  15. Scaling-up Process-Oriented Guided Inquiry Learning Techniques for Teaching Large Information Systems Courses

    ERIC Educational Resources Information Center

    Trevathan, Jarrod; Myers, Trina; Gray, Heather

    2014-01-01

    Promoting engagement during lectures becomes significantly more challenging as class sizes increase. Therefore, lecturers need to experiment with new teaching methodologies to embolden deep learning outcomes and to develop interpersonal skills amongst students. Process Oriented Guided Inquiry Learning is a teaching approach that uses highly…

  16. The DEFENSE (debris Flows triggEred by storms - nowcasting system): An early warning system for torrential processes by radar storm tracking using a Geographic Information System (GIS)

    NASA Astrophysics Data System (ADS)

    Tiranti, Davide; Cremonini, Roberto; Marco, Federica; Gaeta, Armando Riccardo; Barbero, Secondo

    2014-09-01

    Debris flows, responsible for economic losses and occasionally casualties in the alpine region, are mainly triggered by heavy rains characterized by hourly peaks of varying intensity, depending on the features of the basin under consideration. By integrating a recent classification of alpine basins with the radar storm tracking method, an innovative early warning system called DEFENSE (DEbris Flows triggEred by storms - Nowcasting SystEm) was developed using a Geographical Information System (GIS). Alpine catchments were classified into three main classes based on the weathering capacity of the bedrock into clay or clay-like minerals, the amount of which, in unconsolidated material, directly influences the debris flow rheology, and thus the sedimentary processes, the alluvial fan architecture, as well as the triggering frequency and seasonal occurrence probability of debris flows. Storms were identified and tracked by processing weather radar observations; subsequently, rainfall intensities and storm severity were estimated over each classified basin. Using rainfall threshold values determined for each basin class from statistical analysis of historical records, an automatic warning can then be issued to the corresponding municipalities.
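
    The warning step described above amounts to comparing the radar-estimated rainfall intensity over each basin with a class-specific threshold. The Python sketch below illustrates that logic with made-up threshold values and basin names; the operational DEFENSE thresholds come from the statistical analysis of historical records.

      from dataclasses import dataclass

      # Illustrative rainfall-intensity thresholds (mm/h) per basin class (assumed values).
      THRESHOLDS_MM_H = {"clay_rich": 15.0, "intermediate": 25.0, "clay_poor": 35.0}

      @dataclass
      class Basin:
          name: str
          basin_class: str     # one of the three bedrock-weathering classes

      def check_warnings(basins, radar_intensity_mm_h):
          """Return the basins whose radar-estimated rainfall intensity reaches
          the threshold of their class, i.e. the municipalities to be warned."""
          warned = []
          for b in basins:
              intensity = radar_intensity_mm_h.get(b.name, 0.0)
              if intensity >= THRESHOLDS_MM_H[b.basin_class]:
                  warned.append((b.name, intensity))
          return warned

      basins = [Basin("Basin A", "clay_rich"), Basin("Basin B", "clay_poor")]
      print(check_warnings(basins, {"Basin A": 18.2, "Basin B": 22.0}))
      # -> [('Basin A', 18.2)]  only the clay-rich basin exceeds its (lower) threshold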

  17. 7 CFR 277.18 - Establishment of an Automated Data Processing (ADP) and Information Retrieval System.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... central service cost allocation plan required by OMB Circular A-87 to identify and assign costs incurred... efficiency, economy and effectiveness of the system. Failure to provide full access by appropriate State and... property purchased with Food Stamp Program funds, which appear at 7 CFR 277.13 are applicable to...

  18. 7 CFR 277.18 - Establishment of an Automated Data Processing (ADP) and Information Retrieval System.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... central service cost allocation plan required by OMB Circular A-87 to identify and assign costs incurred... efficiency, economy and effectiveness of the system. Failure to provide full access by appropriate State and... property purchased with Food Stamp Program funds, which appear at 7 CFR 277.13 are applicable to...

  19. 7 CFR 277.18 - Establishment of an Automated Data Processing (ADP) and Information Retrieval System.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... central service cost allocation plan required by OMB Circular A-87 to identify and assign costs incurred... efficiency, economy and effectiveness of the system. Failure to provide full access by appropriate State and... property purchased with Food Stamp Program funds, which appear at 7 CFR 277.13 are applicable to...

  20. 7 CFR 277.18 - Establishment of an Automated Data Processing (ADP) and Information Retrieval System.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... central service cost allocation plan required by OMB Circular A-87 to identify and assign costs incurred... efficiency, economy and effectiveness of the system. Failure to provide full access by appropriate State and... property purchased with Food Stamp Program funds, which appear at 7 CFR 277.13 are applicable to...

  1. 7 CFR 277.18 - Establishment of an Automated Data Processing (ADP) and Information Retrieval System.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... narrative and diagrams describing the generic architecture of a system as opposed to the detailed... if the procurement strategy is not adequately described and justified in an APD. The State agency... or if the procurement strategy is not adequately described and justified in an APD. The State...

  2. Landfill site selection using geographic information system and analytical hierarchy process: A case study Al-Hillah Qadhaa, Babylon, Iraq.

    PubMed

    Chabuk, Ali; Al-Ansari, Nadhir; Hussain, Hussain Musa; Knutsson, Sven; Pusch, Roland

    2016-05-01

    Al-Hillah Qadhaa is located in the central part of Iraq. It covers an area of 908 km² with a total population of 856,804 inhabitants. This Qadhaa is the capital of Babylon Governorate. Presently, no landfill site in that area has been selected on the basis of scientific site selection criteria. For this reason, an attempt has been made to find the best locations for landfills. A total of 15 variables were considered in this process (groundwater depth, rivers, soil types, agricultural land use, land use, elevation, slope, gas pipelines, oil pipelines, power lines, roads, railways, urban centres, villages and archaeological sites) using a geographic information system. In addition, an analytical hierarchy process was used to identify the weight of each variable. Two suitable candidate landfill sites were determined that fulfil the requirements, with areas of 9.153 km² and 8.204 km², respectively. These sites can accommodate solid waste until 2030.
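
    The analytical hierarchy process step assigns a weight to each variable from a pairwise-comparison matrix. The Python sketch below shows the standard computation (principal eigenvector plus a consistency check) for a hypothetical 3x3 comparison of three of the criteria; the judgements are assumptions, not the study's actual matrix.

      import numpy as np

      # Hypothetical pairwise comparisons (Saaty 1-9 scale) for three criteria,
      # e.g. groundwater depth vs. rivers vs. urban centres.
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      # AHP weights: the normalized principal eigenvector of the comparison matrix.
      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()

      # Consistency ratio CR = CI / RI, with random index RI = 0.58 for a 3x3 matrix.
      lam_max = eigvals[k].real
      ci = (lam_max - A.shape[0]) / (A.shape[0] - 1)
      cr = ci / 0.58
      print(np.round(w, 3), round(cr, 3))   # weights ~ [0.65, 0.23, 0.12], CR well below 0.1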

  3. All-Union Conference on Information Retrieval Systems and Automatic Processing of Scientific and Technical Information, 3rd, Moscow, 1967, Transactions. (Selected Articles).

    ERIC Educational Resources Information Center

    Air Force Systems Command, Wright-Patterson AFB, OH. Foreign Technology Div.

    The role and place of the machine in scientific and technical information is explored including: basic trends in the development of information retrieval systems; preparation of engineering and scientific cadres with respect to mechanization and automation of information works; the logic of descriptor retrieval systems; the 'SETKA-3' automated…

  4. Versatile microwave-driven trapped ion spin system for quantum information processing.

    PubMed

    Piltz, Christian; Sriarunothai, Theeraphot; Ivanov, Svetoslav S; Wölk, Sabine; Wunderlich, Christof

    2016-07-01

    Using trapped atomic ions, we demonstrate a tailored and versatile effective spin system suitable for quantum simulations and universal quantum computation. By simply applying microwave pulses, selected spins can be decoupled from the remaining system and, thus, can serve as a quantum memory, while simultaneously, other coupled spins perform conditional quantum dynamics. Also, microwave pulses can change the sign of spin-spin couplings, as well as their effective strength, even during the course of a quantum algorithm. Taking advantage of the simultaneous long-range coupling between three spins, a coherent quantum Fourier transform-an essential building block for many quantum algorithms-is efficiently realized. This approach, which is based on microwave-driven trapped ions and is complementary to laser-based methods, opens a new route to overcoming technical and physical challenges in the quest for a quantum simulator and a quantum computer. PMID:27419233
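
    The three-spin quantum Fourier transform realized in this work has, as its target operation, the standard discrete Fourier transform on an 8-dimensional state space. The short Python sketch below (an illustration of that target unitary only, not of the microwave pulse sequence) constructs the matrix and checks its unitarity.

      import numpy as np

      def qft_matrix(n_qubits):
          """Discrete quantum Fourier transform matrix on n_qubits."""
          dim = 2 ** n_qubits
          omega = np.exp(2j * np.pi / dim)
          j, k = np.meshgrid(np.arange(dim), np.arange(dim))
          return omega ** (j * k) / np.sqrt(dim)

      F = qft_matrix(3)                                  # 8x8 unitary for three spins
      assert np.allclose(F.conj().T @ F, np.eye(8))      # unitarity check
      # Acting on |000>, the QFT produces the uniform superposition over all 8 basis states.
      print(np.round(F @ np.eye(8)[:, 0], 3))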

  5. Versatile microwave-driven trapped ion spin system for quantum information processing.

    PubMed

    Piltz, Christian; Sriarunothai, Theeraphot; Ivanov, Svetoslav S; Wölk, Sabine; Wunderlich, Christof

    2016-07-01

    Using trapped atomic ions, we demonstrate a tailored and versatile effective spin system suitable for quantum simulations and universal quantum computation. By simply applying microwave pulses, selected spins can be decoupled from the remaining system and, thus, can serve as a quantum memory, while simultaneously, other coupled spins perform conditional quantum dynamics. Also, microwave pulses can change the sign of spin-spin couplings, as well as their effective strength, even during the course of a quantum algorithm. Taking advantage of the simultaneous long-range coupling between three spins, a coherent quantum Fourier transform-an essential building block for many quantum algorithms-is efficiently realized. This approach, which is based on microwave-driven trapped ions and is complementary to laser-based methods, opens a new route to overcoming technical and physical challenges in the quest for a quantum simulator and a quantum computer.

  6. Versatile microwave-driven trapped ion spin system for quantum information processing

    PubMed Central

    Piltz, Christian; Sriarunothai, Theeraphot; Ivanov, Svetoslav S.; Wölk, Sabine; Wunderlich, Christof

    2016-01-01

    Using trapped atomic ions, we demonstrate a tailored and versatile effective spin system suitable for quantum simulations and universal quantum computation. By simply applying microwave pulses, selected spins can be decoupled from the remaining system and, thus, can serve as a quantum memory, while simultaneously, other coupled spins perform conditional quantum dynamics. Also, microwave pulses can change the sign of spin-spin couplings, as well as their effective strength, even during the course of a quantum algorithm. Taking advantage of the simultaneous long-range coupling between three spins, a coherent quantum Fourier transform—an essential building block for many quantum algorithms—is efficiently realized. This approach, which is based on microwave-driven trapped ions and is complementary to laser-based methods, opens a new route to overcoming technical and physical challenges in the quest for a quantum simulator and a quantum computer. PMID:27419233

  7. The research process: informed consent.

    PubMed

    Summers, S

    1993-12-01

    Informed consent is a vital part of the research proposal and study. Informed consent is based on principles of autonomy, i.e., individuals have a right to full disclosure of information in order to make an informed decision and to assume responsibility for the consequences of their decision. Informed consent also contains a legal element, whereby failure to obtain it is considered negligence and/or battery. Conducting studies when patients are medicated, ill, or very young or very old requires special steps to verify that subjects are fully informed. It is important that PACU nurses pay particular attention to the informed consent process when planning and conducting research studies.

  8. Advanced information processing system: Hosting of advanced guidance, navigation and control algorithms on AIPS using ASTER

    NASA Technical Reports Server (NTRS)

    Brenner, Richard; Lala, Jaynarayan H.; Nagle, Gail A.; Schor, Andrei; Turkovich, John

    1994-01-01

    This program demonstrated the integration of a number of technologies that can increase the availability and reliability of launch vehicles while lowering costs. Availability is increased with an advanced guidance algorithm that adapts trajectories in real-time. Reliability is increased with fault-tolerant computers and communication protocols. Costs are reduced by automatically generating code and documentation. This program was realized through the cooperative efforts of academia, industry, and government. The NASA-LaRC coordinated the effort, while Draper performed the integration. Georgia Institute of Technology supplied a weak Hamiltonian finite element method for optimal control problems. Martin Marietta used MATLAB to apply this method to a launch vehicle (FENOC). Draper supplied the fault-tolerant computing and software automation technology. The fault-tolerant technology includes sequential and parallel fault-tolerant processors (FTP & FTPP) and authentication protocols (AP) for communication. Fault-tolerant technology was incrementally incorporated. Development culminated with a heterogeneous network of workstations and fault-tolerant computers using AP. Draper's software automation system, ASTER, was used to specify a static guidance system based on FENOC, navigation, flight control (GN&C), models, and the interface to a user interface for mission control. ASTER generated Ada code for GN&C and C code for models. An algebraic transform engine (ATE) was developed to automatically translate MATLAB scripts into ASTER.

  9. Business Information Processing Curriculum Guide.

    ERIC Educational Resources Information Center

    Sullivan, Carol

    This curriculum guide is designed to train students in the competencies necessary to meet the needs of the automated office in entry-level information processing positions. The guide is organized into 16 units that are correlated with the essential elements for the business information processing course. Introductory materials include a scope and…

  10. Using Medical Text Extraction, Reasoning and Mapping System (MTERMS) to Process Medication Information in Outpatient Clinical Notes

    PubMed Central

    Zhou, Li; Plasek, Joseph M; Mahoney, Lisa M; Karipineni, Neelima; Chang, Frank; Yan, Xuemin; Chang, Fenny; Dimaggio, Dana; Goldman, Debora S.; Rocha, Roberto A.

    2011-01-01

    Clinical information is often coded using different terminologies, and therefore is not interoperable. Our goal is to develop a general natural language processing (NLP) system, called Medical Text Extraction, Reasoning and Mapping System (MTERMS), which encodes clinical text using different terminologies and simultaneously establishes dynamic mappings between them. MTERMS applies a modular, pipeline approach flowing from a preprocessor, semantic tagger, terminology mapper, context analyzer, and parser to structure inputted clinical notes. Evaluators manually reviewed 30 free-text and 10 structured outpatient clinical notes and compared them to MTERMS output. MTERMS achieved an overall F-measure of 90.6 for free-text notes and 94.0 for structured notes for medication and temporal information. The local medication terminology had 83.0% coverage, compared to RxNorm's 98.0% coverage, for free-text notes. 61.6% of mappings between the terminologies are exact matches. Capture of duration was significantly improved (91.7% vs. 52.5%) over systems in the third i2b2 challenge. PMID:22195230
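
    The F-measure reported here is the usual harmonic mean of precision and recall computed against a manually reviewed reference standard. The Python sketch below shows the metric with hypothetical counts; the numbers are not the MTERMS evaluation data.

      def precision_recall_f(true_positive, false_positive, false_negative):
          """Precision, recall and F-measure for an extractor scored against a reference."""
          precision = true_positive / (true_positive + false_positive)
          recall = true_positive / (true_positive + false_negative)
          f_measure = 2 * precision * recall / (precision + recall)
          return precision, recall, f_measure

      # Hypothetical counts for medication mentions in a set of free-text notes.
      p, r, f = precision_recall_f(true_positive=90, false_positive=10, false_negative=12)
      print(f"precision={p:.3f} recall={r:.3f} F-measure={100 * f:.1f}")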

  11. Geographic Names Information System

    USGS Publications Warehouse

    U.S. Geological Survey

    1984-01-01

    The Geographic Names Information System (GNIS) is an automated data system developed by the U.S. Geological Survey (USGS) to standardize and disseminate information on geographic names. GNIS provides primary information for all known places, features, and areas in the United States identified by a proper name. The information in the system can be manipulated to meet varied needs. You can incorporate information from GNIS into your own data base for special applications.

  12. Mission Medical Information System

    NASA Technical Reports Server (NTRS)

    Johnson-Throop, Kathy A.; Joe, John C.; Follansbee, Nicole M.

    2008-01-01

    This viewgraph presentation gives an overview of the Mission Medical Information System (MMIS). The topics include: 1) What is MMIS?; 2) MMIS Goals; 3) Terrestrial Health Information Technology Vision; 4) NASA Health Information Technology Needs; 5) Mission Medical Information System Components; 6) Electronic Medical Record; 7) Longitudinal Study of Astronaut Health (LSAH); 8) Methods; and 9) Data Submission Agreement (example).

  13. Medical Information Systems.

    ERIC Educational Resources Information Center

    Smith, Kent A.

    1986-01-01

    Description of information services from the National Library of Medicine (NLM) highlights a new system for retrieving information from NLM's databases (GRATEFUL MED); a formal Regional Medical Library Network; DOCLINE; the Unified Medical Language System; and Integrated Academic Information Management Systems. Research and development and the…

  14. Integrating NASA's Land Analysis System (LAS) image processing software with an appropriate Geographic Information System (GIS): A review of candidates in the public domain

    NASA Technical Reports Server (NTRS)

    Rochon, Gilbert L.

    1989-01-01

    A user requirements analysis (URA) was undertaken to determine an appropriate public domain Geographic Information System (GIS) software package for potential integration with NASA's LAS (Land Analysis System) 5.0 image processing system. The necessity for a public domain system was underscored by the perceived need for source code access and flexibility in tailoring the GIS system to the needs of a heterogeneous group of end-users, and by specific constraints imposed by LAS and its user interface, the Transportable Applications Executive (TAE). Subsequently, a review was conducted of a variety of public domain GIS candidates, including GRASS 3.0, MOSS, IEMIS, and two university-based packages, IDRISI and KBGIS. The review method was a modified version of the GIS evaluation process developed by the Federal Interagency Coordinating Committee on Digital Cartography. One IEMIS-derivative product, the ALBE (AirLand Battlefield Environment) GIS, emerged as the most promising candidate for integration with LAS. IEMIS (Integrated Emergency Management Information System) was developed by the Federal Emergency Management Agency (FEMA). ALBE GIS is currently under development at the Pacific Northwest Laboratory under contract with the U.S. Army Corps of Engineers' Engineering Topographic Laboratory (ETL). Accordingly, recommendations are offered with respect to a potential LAS/ALBE GIS linkage and to further system enhancements, including coordination with the development of the Spatial Analysis and Modeling System (SAMS) GIS and with IDM (Intelligent Data Management) developments in Goddard's National Space Science Data Center.

  15. Effects of age on spatial information processing: relationship to senescent changes in brain noradrenergic and opioid systems

    SciTech Connect

    Rapp, P.R.

    1985-01-01

    A major focus in current research on aging is the identification of senescent changes in cognitive function in laboratory animals. This literature indicates that the processing of spatial information may be particularly impaired during senescence. The degree to which nonspecific factors (e.g., sensory or motor deficits) contribute to behavioral impairments in aging, however, remains largely uninvestigated. In addition, few studies have attempted to identify senescent changes in brain structure and function which might underlie the behavioral manifestations of aging. In the behavioral experiments reported here, the authors tested young, middle-age, and senescent rats in several versions of a spatial memory task, the Morris water maze. The results of these investigations demonstrate that aged rats are significantly impaired in the Morris task compared to young or middle-age animals. In addition, these studies indicate that age-related deficits in the water maze reflect a specific dysfunction in the ability of older animals to effectively process spatial information rather than a senescent decline in sensory or motor functions. Using the subjects from the behavioral studies, additional investigations assessed whether age-dependent changes in neurochemical and neuroanatomical systems which are known to mediate spatial learning in young animals were related to the behavioral deficits exhibited by aged rats. The results of these studies demonstrate that a portion of senescent animals exhibit significant increases in lateral septal ³H-desmethylimipramine binding and a decrease in ³H-naloxone binding in this same region, as assessed by quantitative in vitro autoradiography.

  16. Information-computational system for storage, search and analytical processing of environmental datasets based on the Semantic Web technologies

    NASA Astrophysics Data System (ADS)

    Titov, A.; Gordov, E.; Okladnikov, I.

    2009-04-01

    a step in the process of development of a distributed collaborative information-computational environment to support multidisciplinary investigations of Earth regional environment [4]. Partial support of this work by SB RAS Integration Project 34, SB RAS Basic Program Project 4.5.2.2, APN Project CBA2007-08NSY and FP6 Enviro-RISKS project (INCO-CT-2004-013427) is acknowledged. References 1. E.P. Gordov, V.N. Lykosov, and A.Z. Fazliev. Web portal on environmental sciences "ATMOS" // Advances in Geosciences. 2006. Vol. 8. p. 33 - 38. 2. Gordov E.P., Okladnikov I.G., Titov A.G. Development of elements of web based information-computational system supporting regional environment processes investigations // Journal of Computational Technologies, Vol. 12, Special Issue #3, 2007, pp. 20 - 28. 3. Okladnikov I.G., Titov A.G. Melnikova V.N., Shulgina T.M. Web-system for processing and visualization of meteorological and climatic data // Journal of Computational Technologies, Vol. 13, Special Issue #3, 2008, pp. 64 - 69. 4. Gordov E.P., Lykosov V.N. Development of information-computational infrastructure for integrated study of Siberia environment // Journal of Computational Technologies, Vol. 12, Special Issue #2, 2007, pp. 19 - 30.

  17. Spatial Analysis in Determination Of Flood Prone Areas Using Geographic Information System and Analytical Hierarchy Process at Sungai Sembrong's Catchment

    NASA Astrophysics Data System (ADS)

    Bukari, S. M.; Ahmad, M. A.; Wai, T. L.; Kaamin, M.; Alimin, N.

    2016-07-01

    The floods that struck Johor state in 2006 and 2007 and the East Coast in 2014 have had a great impact on flood management in Malaysia. Accordingly, this study was conducted to determine potential flooding areas, especially in the Batu Pahat district, which has experienced severe flooding. This objective is achieved using Geographic Information Systems (GIS) applied to flood-risk locations in the watershed area of Sungai Sembrong. The spatial analysis functions of GIS are capable of producing new information based on analysis of the data stored in the system, while the Analytical Hierarchy Process (AHP) was used as a decision-making method for the existing data. By using the AHP method, the preparation and prioritization of the criteria and parameters required in GIS are neater and easier to analyze. Through this study, a flood prone area in the watershed of Sungai Sembrong was identified with the help of GIS and AHP. The analysis tested two cell sizes, 30 and 5, each with two different water levels, and the results were displayed in GIS. The use of AHP and GIS is therefore effective and able to determine the potential flood plain areas in the watershed area of Sungai Sembrong.

  18. Semantic processing in information retrieval.

    PubMed Central

    Rindflesch, T. C.; Aronson, A. R.

    1993-01-01

    Intuition suggests that one way to enhance the information retrieval process would be the use of phrases to characterize the contents of text. A number of researchers, however, have noted that phrases alone do not improve retrieval effectiveness. In this paper we briefly review the use of phrases in information retrieval and then suggest extensions to this paradigm using semantic information. We claim that semantic processing, which can be viewed as expressing relations between the concepts represented by phrases, will in fact enhance retrieval effectiveness. The availability of the UMLS domain model, which we exploit extensively, significantly contributes to the feasibility of this processing. PMID:8130547

  19. Information Systems in Dentistry

    PubMed Central

    Masic, Fedja

    2012-01-01

    Introduction: Almost the entire scope of human creativity today, from the standpoint of its efficiency and expediency, is conditioned by the existence of information systems. Most information systems are oriented to management and decision-making, including health information systems. The health system and health insurance together form one of the most important segments of society, functioning as a compact unit. Increasing requirements to reduce health care costs while preserving or improving the quality of services provided represent a difficult task for the health system. Material and methods: Using descriptive methods and a literature review, we analyzed the latest solutions in information and telecommunications technology as the basis for building an effective and efficient health system. The primary objective of computerization is not saving money but the rationalization of spending in health care. It is estimated that at least 20-30% of money spent in health care can be utilized more rationally. Computerization should provide the necessary data and indicators for this rationalization. Also very important are the goals of this project and the achievement of other uses and benefits: improving overall care for patients and policyholders, and increasing the speed and accuracy of diagnosis in determining treatment using electronic diagnostic and therapeutic guidelines. Results and discussion: Computerization in dentistry began much as in other human activities - by recording large amounts of data on digital media and by replacing manual data processing with machine processing. But the specifics of the dental profession have led to specific applications of information technology (IT) and continue to require the special development of dentally oriented applied IT. Harmonization of dental software with global standards will enable doctors and dentists, with a few mouse clicks via the internet, to reach the general medical information about their patients from the central

  20. Neural processing of gravity information

    NASA Technical Reports Server (NTRS)

    Schor, Robert H.

    1992-01-01

    The goal of this project was to use the linear acceleration capabilities of the NASA Vestibular Research Facility (VRF) at Ames Research Center to directly examine encoding of linear accelerations in the vestibular system of the cat. Most previous studies, including my own, have utilized tilt stimuli, which at very low frequencies (e.g., 'static tilt') can be considered a reasonably pure linear acceleration (e.g., 'down'); however, higher frequencies of tilt, necessary for understanding the dynamic processing of linear acceleration information, necessarily involves rotations which can stimulate the semicircular canals. The VRF, particularly the Long Linear Sled, has promise to provide controlled pure linear accelerations at a variety of stimulus frequencies, with no confounding angular motion.

  1. Information processing, computation, and cognition

    PubMed Central

    Scarantino, Andrea

    2010-01-01

    Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both – although others disagree vehemently. Yet different cognitive scientists use ‘computation’ and ‘information processing’ to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism, connectionism, and computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates’ empirical aspects. PMID:22210958

  2. Integrated clinical information system.

    PubMed

    Brousseau, G

    1995-01-01

    SIDOCI (Système Informatisé de DOnnées Cliniques Intégrées) is a Canadian joint venture introducing new operating paradigms into hospitals. The main goal of SIDOCI is to maintain the quality of care in today's tightening economy. SIDOCI is a fully integrated paperless patient-care system which automates and links all information about a patient. Data is available on-line and instantaneously to doctors, nurses, and support staff in the format that best suits their specific requirements. SIDOCI provides a factual and chronological summary of the patient's progress by drawing together clinical information provided by all professionals working with the patient, regardless of their discipline, level of experience, or physical location. It also allows for direct entry of the patient's information at the bedside. Laboratory results, progress notes, patient history and graphs are available instantaneously on screen, eliminating the need for physical file transfers. The system, incorporating a sophisticated clinical information database, an intuitive graphical user interface, and customized screens for each medical discipline, guides the user through standard procedures. Unlike most information systems created for the health care industry, SIDOCI is longitudinal, covering all aspects of the health care process through its link to various vertical systems already in place. A multidisciplinary team has created a clinical dictionary that provides the user with most of the information she would normally use: symptoms, signs, diagnoses, allergies, medications, interventions, etc. This information is structured and displayed in such a manner that health care professionals can document the clinical situation at the touch of a finger. The data is then encoded into the patient's file. Once encoded, the structured data is accessible for research, statistics, education, and quality assurance. This dictionary complies with national and international nomenclatures. It also

  3. Integrated clinical information system.

    PubMed

    Brousseau, G

    1995-01-01

    SIDOCI (Système Informatisé de DOnnées Cliniques Intégrées) is a Canadian joint venture introducing new operating paradigms into hospitals. The main goal of SIDOCI is to maintain the quality of care in today's tightening economy. SIDOCI is a fully integrated paperless patient-care system which automates and links all information about a patient. Data is available on-line and instantaneously to doctors, nurses, and support staff in the format that best suits their specific requirements. SIDOCI provides a factual and chronological summary of the patient's progress by drawing together clinical information provided by all professionals working with the patient, regardless of their discipline, level of experience, or physical location. It also allows for direct entry of the patient's information at the bedside. Laboratory results, progress notes, patient history and graphs are available instantaneously on screen, eliminating the need for physical file transfers. The system, incorporating a sophisticated clinical information database, an intuitive graphical user interface, and customized screens for each medical discipline, guides the user through standard procedures. Unlike most information systems created for the health care industry, SIDOCI is longitudinal, covering all aspects of the health care process through its link to various vertical systems already in place. A multidisciplinary team has created a clinical dictionary that provides the user with most of the information she would normally use: symptoms, signs, diagnoses, allergies, medications, interventions, etc. This information is structured and displayed in such a manner that health care professionals can document the clinical situation at the touch of a finger. The data is then encoded into the patient's file. Once encoded, the structured data is accessible for research, statistics, education, and quality assurance. This dictionary complies with national and international nomenclatures. It also

  4. Dynamic Information Architecture System

    1997-02-12

    The Dynamic Information Architecture System (DIAS) is a flexible object-based software framework for concurrent, multidisciplinary modeling of arbitrary (but related) processes. These processes are modeled as interrelated actions caused by and affecting the collection of diverse real-world objects represented in a simulation. The DIAS architecture allows independent process models to work together harmoniously in the same frame of reference and provides a wide range of data ingestion and output capabilities, including Geographic Information System (GIS) type map-based displays and photorealistic visualization of simulations in progress. In the DIAS implementation of the object-based approach, software objects carry within them not only the data which describe their static characteristics, but also the methods, or functions, which describe their dynamic behaviors. There are two categories of objects: (1) Entity objects, which have real-world counterparts and are the actors in a simulation, and (2) Software infrastructure objects, which make it possible to carry out the simulations. The Entity objects contain lists of Aspect objects, each of which addresses a single aspect of the Entity's behavior. For example, a DIAS Stream Entity representing a section of a river can have many aspects corresponding to its behavior in terms of hydrology (as a drainage system component), navigation (as a link in a waterborne transportation system), meteorology (in terms of moisture, heat, and momentum exchange with the atmospheric boundary layer), and visualization (for photorealistic visualization or map type displays), etc. This makes it possible for each real-world object to exhibit any or all of its unique behaviors within the context of a single simulation.
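
    The Entity/Aspect structure described above maps naturally onto a composition of objects. The Python sketch below is a toy illustration of that structure; the names and methods are assumptions, not DIAS code.

      class Aspect:
          """One facet of an Entity's behaviour (hydrology, navigation, ...)."""
          def __init__(self, name):
              self.name = name

          def step(self, entity, dt):
              # A real aspect would advance its own process model here.
              print(f"{entity.name}: advancing {self.name} by {dt} s")

      class Entity:
          """A simulated real-world object carrying state plus a list of Aspects."""
          def __init__(self, name, aspects):
              self.name = name
              self.aspects = aspects

          def step(self, dt):
              for aspect in self.aspects:
                  aspect.step(self, dt)

      stream = Entity("Stream reach 12",
                      [Aspect("hydrology"), Aspect("navigation"),
                       Aspect("meteorology"), Aspect("visualization")])
      stream.step(dt=3600)   # one simulated hour; each aspect model advances in turn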

  5. Mobile Student Information System

    ERIC Educational Resources Information Center

    Asif, Muhammad; Krogstie, John

    2011-01-01

    Purpose: A mobile student information system (MSIS) based on mobile computing and context-aware application concepts can provide more user-centric information services to students. The purpose of this paper is to describe a system for providing relevant information to students on a mobile platform. Design/methodology/approach: The research…

  6. Information Processing Theory: Classroom Applications.

    ERIC Educational Resources Information Center

    Slate, John R.; Charlesworth, John R., Jr.

    1989-01-01

    Utilizes the information processing model of human memory to provide teachers with suggestions for improving the teaching-learning process. Briefly explains and specifies applications of major theoretical concepts: attention, active learning, meaningfulness, organization, advanced organizers, memory aids, overlearning, automatically, and…

  7. Information extraction system

    DOEpatents

    Lemmond, Tracy D; Hanley, William G; Guensche, Joseph Wendell; Perry, Nathan C; Nitao, John J; Kidwell, Paul Brandon; Boakye, Kofi Agyeman; Glaser, Ron E; Prenger, Ryan James

    2014-05-13

    An information extraction system and methods of operating the system are provided. In particular, an information extraction system for performing meta-extraction of named entities of people, organizations, and locations, as well as relationships and events, from text documents is described herein.

  8. One of the Methods of Organizing the Information Storage Unit of a System of Data Retrieval and Processing (SPOD).

    ERIC Educational Resources Information Center

    Askinazi, R. B.; Papina, I. L.

    The paper deals with one method of organizing the storage unit of a descriptor IPS (information retrieval system) of the SPOD type, the information array of which constitutes the totality of uniform documents with ordered disposition of data within each of them. Three categories of data composing the retrieval form of document were defined…

  9. Enabling Business Processes through Information Management and IT Systems: The FastFit and Winter Gear Distributors Case Studies

    ERIC Educational Resources Information Center

    Kesner, Richard M.; Russell, Bruce

    2009-01-01

    The "FastFit Case Study" and its companion, the "Winter Gear Distributors Case Study" provide undergraduate business students with a suitable and even familiar business context within which to initially consider the role of information management (IM) and to a lesser extent the role of information technology (IT) systems in enabling a business.…

  10. Landfill site selection using geographic information system and analytical hierarchy process: A case study Al-Hillah Qadhaa, Babylon, Iraq.

    PubMed

    Chabuk, Ali; Al-Ansari, Nadhir; Hussain, Hussain Musa; Knutsson, Sven; Pusch, Roland

    2016-05-01

    Al-Hillah Qadhaa is located in the central part of Iraq. It covers an area of 908 km² with a total population of 856,804 inhabitants. This Qadhaa is the capital of Babylon Governorate. Presently, no landfill site in that area has been selected on the basis of scientific site selection criteria. For this reason, an attempt has been made to find the best locations for landfills. A total of 15 variables were considered in this process (groundwater depth, rivers, soil types, agricultural land use, land use, elevation, slope, gas pipelines, oil pipelines, power lines, roads, railways, urban centres, villages and archaeological sites) using a geographic information system. In addition, an analytical hierarchy process was used to identify the weight of each variable. Two suitable candidate landfill sites were determined that fulfil the requirements, with areas of 9.153 km² and 8.204 km², respectively. These sites can accommodate solid waste until 2030. PMID:26965404

  11. Weather Information System

    NASA Technical Reports Server (NTRS)

    1995-01-01

    WxLink is an aviation weather system based on advanced airborne sensors, precise positioning available from the satellite-based Global Positioning System, cockpit graphics and a low-cost datalink. It is a two-way system that uplinks weather information to the aircraft and downlinks automatic pilot reports of weather conditions aloft. Manufactured by ARNAV Systems, Inc., the original technology came from Langley Research Center's cockpit weather information system, CWIN (Cockpit Weather INformation). The system creates radar maps of storms, lightning and reports of surface observations, offering improved safety, better weather monitoring and substantial fuel savings.

  12. Revitalizing executive information systems.

    PubMed

    Crockett, F

    1992-01-01

    As the saying goes, "garbage in, garbage out"--and this is as true for executive information systems as for any other computer system. Crockett presents a methodology he has used with clients to help them develop more useful systems that produce higher quality information. The key is to develop performance measures based on critical success factors and stakeholder expectations and then to link them cross functionally to show how progress is being made on strategic goals. Feedback from the executive information system then informs strategy formulation, business plan development, and operational activities.

  13. Information processing in miniature brains.

    PubMed

    Chittka, L; Skorupski, P

    2011-03-22

    Since a comprehensive understanding of brain function and evolution in vertebrates is often hobbled by the sheer size of the nervous system, as well as ethical concerns, major research efforts have been made to understand the neural circuitry underpinning behaviour and cognition in invertebrates, and its costs and benefits under natural conditions. This special feature of Proceedings of the Royal Society B contains an idiosyncratic range of current research perspectives on neural underpinnings and adaptive benefits (and costs) of such diverse phenomena as spatial memory, colour vision, attention, spontaneous behaviour initiation, memory dynamics, relational rule learning and sleep, in a range of animals from marine invertebrates with exquisitely simple nervous systems to social insects forming societies with many thousands of individuals working together as a 'superorganism'. This introduction provides context and history to tie the various approaches together, and concludes that there is an urgent need to understand the full neuron-to-neuron circuitry underlying various forms of information processing-not just to explore brain function comprehensively, but also to understand how (and how easily) cognitive capacities might evolve in the face of pertinent selection pressures. In the invertebrates, reaching these goals is becoming increasingly realistic.

  14. Computer Aided Management for Information Processing Projects.

    ERIC Educational Resources Information Center

    Akman, Ibrahim; Kocamustafaogullari, Kemal

    1995-01-01

    Outlines the nature of information processing projects and discusses some project management programming packages. Describes an in-house interface program developed to utilize a selected project management package (TIMELINE) by using Oracle Data Base Management System tools and Pascal programming language for the management of information system…

  15. Flawless information system implementation.

    PubMed

    Schmitz, J W

    1993-01-01

    When it was decided to replace the homegrown materiel management information system at Barnes Hospital, a 1,200-bed hospital in St. Louis, Missouri, with a more comprehensive one, the aim was to have a swift, error-free selection, testing and implementation process. The hospital met these goals by dedicating the following resources to the process: 1) a dedicated, full-time user responsible for requirements definition, testing, training and user support, 2) a dedicated IS support team for selection, testing and implementation of the software package, 3) availability of additional personnel in Materiel Management for general assistance, 4) a team approach, both at the project team level and hospital wide, 5) a total commitment to quality at every phase, 6) a thorough approach to testing, both at the system level and at the unit, or program, level and 7) the vendor commitment of extra time, money and energy to help us make the system work to the best of its ability. PMID:10123862

  16. Information retrieval system

    NASA Technical Reports Server (NTRS)

    Berg, R. F.; Holcomb, J. E.; Kelroy, E. A.; Levine, D. A.; Mee, C., III

    1970-01-01

    Generalized information storage and retrieval system capable of generating and maintaining a file, gathering statistics, sorting output, and generating final reports for output is reviewed. File generation and file maintenance programs written for the system are general purpose routines.

  17. Image-plane processing of visual information

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Fales, C. L.; Park, S. K.; Samms, R. W.

    1984-01-01

    Shannon's theory of information is used to optimize the optical design of sensor-array imaging systems which use neighborhood image-plane signal processing for enhancing edges and compressing dynamic range during image formation. The resultant edge-enhancement, or band-pass-filter, response is found to be very similar to that of human vision. Comparisons of traits in human vision with results from information theory suggest that: (1) Image-plane processing, like preprocessing in human vision, can improve visual information acquisition for pattern recognition when resolving power, sensitivity, and dynamic range are constrained. Improvements include reduced sensitivity to changes in light levels, reduced signal dynamic range, reduced data transmission and processing, and reduced aliasing and photosensor noise degradation. (2) Information content can be an appropriate figure of merit for optimizing the optical design of imaging systems when visual information is acquired for pattern recognition. The design trade-offs involve spatial response, sensitivity, and sampling interval.
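
    As a rough, hedged illustration of the kind of neighborhood band-pass filtering described above, a difference-of-Gaussians filter gives an edge-enhancing response qualitatively similar to the one the abstract attributes to image-plane processing. The sigmas, test image and compression step below are illustrative choices, not the authors' design.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        rng = np.random.default_rng(0)
        image = rng.random((128, 128))                # stand-in for a sensed image

        center = gaussian_filter(image, sigma=1.0)    # narrow Gaussian
        surround = gaussian_filter(image, sigma=3.0)  # wide Gaussian
        band_pass = center - surround                 # edges kept, slow intensity changes suppressed

        compressed = np.tanh(2.0 * band_pass)         # crude dynamic-range compression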

  18. Optical Hybrid Quantum Information Processing

    NASA Astrophysics Data System (ADS)

    Takeda, Shuntaro; Furusawa, Akira

    Historically, two complementary approaches to optical quantum information processing have been pursued: qubits and continuous variables, each exploiting either the particle or the wave nature of light. However, both approaches have pros and cons. In recent years, there has been significant progress in combining both approaches with a view to realizing hybrid protocols that overcome the current limitations. In this chapter, we first review the development of the two approaches with a special focus on quantum teleportation and its applications. We then introduce our recent research progress in realizing quantum teleportation by a hybrid scheme, and mention its future applications to universal and fault-tolerant quantum information processing.

  19. Information System Overview.

    ERIC Educational Resources Information Center

    Burrows, J. H.

    This paper was prepared for distribution to the California Educational Administrators participating in the "Executive Information Systems" Unit of Instruction as part of the instructional program of Operation PEP (Prepare Educational Planners). The purpose of the course was to introduce some basic concepts of information systems technology to…

  20. Environmental geographic information system.

    SciTech Connect

    Peek, Dennis W; Helfrich, Donald Alan; Gorman, Susan

    2010-08-01

    This document describes how the Environmental Geographic Information System (EGIS) was used, along with externally received data, to create maps for the Site-Wide Environmental Impact Statement (SWEIS) Source Document project. Data quality among the various classes of geographic information system (GIS) data is addressed. A complete listing of map layers used is provided.

  1. Extraction of motion information from peripheral processes.

    PubMed

    Jain, R

    1981-05-01

    This paper is mainly concerned with low-level processes in machine perception of motion. A motion analysis system should exploit information contained in "early warning signals" during the intensity-based peripheral phase of motion perception. We show that intensity-based difference pictures contain motion information about objects in a dynamic scene, and present methods for the extraction of motion information in the peripheral phase. Some experiments with laboratory-generated and real-world scenes demonstrate the potential of the technique.
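
    A minimal sketch of an intensity-based difference picture is given below; the frames, the moving patch and the threshold are synthetic and only illustrate the peripheral-phase idea, not the paper's actual method.

        import numpy as np

        rng = np.random.default_rng(1)
        frame_prev = rng.integers(0, 200, (120, 160)).astype(np.int16)
        frame_curr = frame_prev.copy()
        frame_curr[40:60, 70:100] += 50               # simulate a bright object that moved in

        diff = np.abs(frame_curr - frame_prev)        # the difference picture
        moving = diff > 25                            # "early warning" change mask

        if moving.any():
            ys, xs = np.nonzero(moving)
            print("changed pixels:", int(moving.sum()),
                  "bounding box:", (int(ys.min()), int(xs.min()), int(ys.max()), int(xs.max())))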

  2. The AMMA information system

    NASA Astrophysics Data System (ADS)

    Brissebrat, Guillaume; Fleury, Laurence; Boichard, Jean-Luc; Cloché, Sophie; Eymard, Laurence; Mastrorillo, Laurence; Moulaye, Oumarou; Ramage, Karim; Asencio, Nicole; Favot, Florence; Roussot, Odile

    2013-04-01

    The AMMA information system aims at expediting data and scientific results communication inside the AMMA community and beyond. It has already been adopted as the data management system by several projects and is meant to become a reference information system about the West Africa area for the whole scientific community. The AMMA database and the associated on-line tools have been developed and are managed by two French teams (IPSL Database Centre, Palaiseau and OMP Data Service, Toulouse). The complete system has been fully duplicated and is operated by AGRHYMET Regional Centre in Niamey, Niger. The AMMA database contains a wide variety of datasets: - about 250 local observation datasets, which cover geophysical components (atmosphere, ocean, soil, vegetation) and human activities (agronomy, health...). They come from either operational networks or scientific experiments, and include historical data in West Africa from 1850; - 1350 outputs of a socio-economics questionnaire; - 60 operational satellite products and several research products; - 10 output sets of meteorological and ocean operational models and 15 of research simulations. Database users can access all the data using either the portal http://database.amma-international.org or http://amma.agrhymet.ne/amma-data. Different modules are available. The complete catalogue gives access to metadata (i.e. information about the datasets) that are compliant with international standards (ISO19115, INSPIRE...). Registration pages enable users to read and sign the data and publication policy, and to apply for a user database account. The data access interface enables users to easily build a data extraction request by selecting various criteria like location, time, parameters... At present, the AMMA database counts more than 740 registered users and processes about 80 data requests every month. In order to monitor day-to-day meteorological and environment information over West Africa, some quick look and report display websites have

  3. Information symmetries in irreversible processes

    NASA Astrophysics Data System (ADS)

    Ellison, Christopher J.; Mahoney, John R.; James, Ryan G.; Crutchfield, James P.; Reichardt, Jörg

    2011-09-01

    We study dynamical reversibility in stationary stochastic processes from an information-theoretic perspective. Extending earlier work on the reversibility of Markov chains, we focus on finitary processes with arbitrarily long conditional correlations. In particular, we examine stationary processes represented or generated by edge-emitting, finite-state hidden Markov models. Surprisingly, we find pervasive temporal asymmetries in the statistics of such stationary processes. As a consequence, the computational resources necessary to generate a process in the forward and reverse temporal directions are generally not the same. In fact, an exhaustive survey indicates that most stationary processes are irreversible. We study the ensuing relations between model topology in different representations, the process's statistical properties, and its reversibility in detail. A process's temporal asymmetry is efficiently captured using two canonical unifilar representations of the generating model, the forward-time and reverse-time ɛ-machines. We analyze example irreversible processes whose ɛ-machine representations change size under time reversal, including one which has a finite number of recurrent causal states in one direction, but an infinite number in the opposite. From the forward-time and reverse-time ɛ-machines, we are able to construct a symmetrized, but nonunifilar, generator of a process—the bidirectional machine. Using the bidirectional machine, we show how to directly calculate a process's fundamental information properties, many of which are otherwise only poorly approximated via process samples. The tools we introduce and the insights we offer provide a better understanding of the many facets of reversibility and irreversibility in stochastic processes.
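
    To make the notion of forward/reverse "computational resources" concrete, here is a hedged summary in the usual computational-mechanics notation (my paraphrase; the paper's own definitions take precedence): for a stationary process the entropy rate is the same in both time directions, while the statistical complexities of the forward-time and reverse-time ɛ-machines, i.e. the entropies of their causal-state sets, need not be equal.

        % Hedged sketch of the relevant quantities (notation may differ from the paper).
        \begin{align*}
          h_\mu &= \lim_{L\to\infty} H[X_0 \mid X_{-L} \ldots X_{-1}]
                 = \lim_{L\to\infty} H[X_0 \mid X_{1} \ldots X_{L}], \\
          C_\mu^{+} &= H[\mathcal{S}^{+}], \qquad C_\mu^{-} = H[\mathcal{S}^{-}], \\
          \Xi &= C_\mu^{+} - C_\mu^{-} \neq 0 \quad \text{in general (a measure of causal irreversibility).}
        \end{align*}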

  4. Dynamic Information and Library Processing.

    ERIC Educational Resources Information Center

    Salton, Gerard

    This book provides an introduction to automated information services: collection, analysis, classification, storage, retrieval, transmission, and dissemination. An introductory chapter is followed by an overview of mechanized processes for acquisitions, cataloging, and circulation. Automatic indexing and abstracting methods are covered, followed…

  5. Research on probabilistic information processing

    NASA Technical Reports Server (NTRS)

    Edwards, W.

    1973-01-01

    The work accomplished on probabilistic information processing (PIP) is reported. The research proposals and decision analysis are discussed along with the results of research on MSC setting, multiattribute utilities, and Bayesian research. Abstracts of reports concerning the PIP research are included.

  6. Global Resources Information System

    NASA Technical Reports Server (NTRS)

    Estes, J. E.; Star, J. L. (Principal Investigator); Cosentino, M. J.; Mann, L. J.

    1984-01-01

    The basic design criteria and operating characteristics of a Global Resources Information System (GRIS) are defined. Researchers are compiling background material and aiding JPL personnel in this project definition phase of GRIS. A bibliography of past studies and current work on large-scale information systems is compiled. The material in this bibliography will be continuously updated throughout the lifetime of this grant. Project management, systems architecture, and user applications are also discussed.

  7. Processing Of Visual Information In Primate Brains

    NASA Technical Reports Server (NTRS)

    Anderson, Charles H.; Van Essen, David C.

    1991-01-01

    Report reviews and analyzes information-processing strategies and pathways in primate retina and visual cortex. Of interest both in biological fields and in such related computational fields as artificial neural networks. Focuses on data from macaque, which has superb visual system similar to that of humans. Authors stress concept of "good engineering" in understanding visual system.

  8. Health Information Systems.

    PubMed

    Sirintrapun, S Joseph; Artz, David R

    2015-06-01

    This article provides surgical pathologists an overview of health information systems (HISs): what they are, what they do, and how such systems relate to the practice of surgical pathology. Much of this article is dedicated to the electronic medical record. Information, in how it is captured, transmitted, and conveyed, drives the effectiveness of such electronic medical record functionalities. So critical is information from pathology in integrated clinical care that surgical pathologists are becoming gatekeepers of not only tissue but also information. Better understanding of HISs can empower surgical pathologists to become stakeholders who have an impact on the future direction of quality integrated clinical care.

  9. Earthquake Information System

    NASA Technical Reports Server (NTRS)

    1991-01-01

    IAEMIS (Integrated Automated Emergency Management Information System) is the principal tool of an earthquake preparedness program developed by Martin Marietta and the Mid-America Remote Sensing Center (MARC). It is a two-component set of software, data and procedures to provide information enabling management personnel to make informed decisions in disaster situations. The NASA-developed program ELAS, originally used to analyze Landsat data, provides MARC with a spatially-oriented information management system. Additional MARC projects include land resources management, and development of socioeconomic data.

  10. The AMMA information system

    NASA Astrophysics Data System (ADS)

    Fleury, Laurence; Brissebrat, Guillaume; Boichard, Jean-Luc; Cloché, Sophie; Eymard, Laurence; Mastrorillo, Laurence; Moulaye, Oumarou; Ramage, Karim; Favot, Florence; Roussot, Odile

    2014-05-01

    In the framework of the African Monsoon Multidisciplinary Analyses (AMMA) programme, several tools have been developed in order to facilitate and speed up data and information exchange between researchers from different disciplines. The AMMA information system includes (i) a multidisciplinary user-friendly data management and dissemination system, (ii) report and chart archives associated with display websites and (iii) a scientific paper exchange system. The AMMA information system is enriched by several previous (IMPETUS...) and following projects (FENNEC, ESCAPE, QweCI, DACCIWA…) and is becoming a reference information system about the West African monsoon. (i) The AMMA project includes airborne, ground-based and ocean measurements, satellite data use, modelling studies and value-added product development. Therefore, the AMMA database user interface gives access to a great amount and a large variety of data: - 250 local observation datasets, which cover many geophysical components (atmosphere, ocean, soil, vegetation) and human activities (agronomy, health). They have been collected by operational networks from 1850 to present, long-term monitoring research networks (CATCH, IDAF, PIRATA...) or scientific campaigns; - 1350 outputs of a socio-economics questionnaire; - 60 operational satellite products and several research products; - 10 output sets of meteorological and ocean operational models and 15 of research simulations. All the data are documented in compliance with international metadata standards and delivered in standard formats. The data request user interface takes full advantage of the relational structure of the data and metadata base and enables users to easily build multi-criteria data requests (period, area, property, property value…). The AMMA data portal counts around 800 registered users and processes about 50 data requests every month. The AMMA databases and data portal have been developed and are operated jointly by SEDOO and ESPRI in France
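
    Purely as an illustration of what a multi-criteria extraction (period, area, property) amounts to on the user side, the following sketch filters a small hypothetical table; the column names and values are invented and do not reflect the actual AMMA schema or interface.

        import pandas as pd

        obs = pd.DataFrame({
            "time": pd.to_datetime(["2006-07-01", "2006-08-15", "2007-01-10"]),
            "lat": [13.5, 9.9, 14.2],
            "lon": [2.1, -1.5, 1.0],
            "parameter": ["rainfall", "rainfall", "temperature"],
            "value": [12.0, 30.5, 28.1],
        })

        mask = (
            obs["time"].between("2006-06-01", "2006-09-30")   # period
            & obs["lat"].between(9.0, 15.0)                   # area (latitude)
            & obs["lon"].between(-2.0, 3.0)                   # area (longitude)
            & (obs["parameter"] == "rainfall")                # property
        )
        print(obs[mask])                                      # rows matching all criteria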

  11. Air System Information Management

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.

    2004-01-01

    I flew to Washington last week, a trip rich in distributed information management. Buying tickets, at the gate, in flight, landing and at the baggage claim, myriad messages about my reservation, the weather, our flight plans, gates, bags and so forth flew among a variety of travel agency, airline and Federal Aviation Administration (FAA) computers and personnel. By and large, each kind of information ran on a particular application, often specialized to its own data formats and communications network. I went to Washington to attend an FAA meeting on System-Wide Information Management (SWIM) for the National Airspace System (NAS) (http://www.nasarchitecture.faa.gov/Tutorials/NAS101.cfm). NAS (and its information infrastructure, SWIM) is an attempt to bring greater regularity, efficiency and uniformity to the collection of stovepipe applications now used to manage air traffic. Current systems hold information about flight plans, flight trajectories, weather, air turbulence, current and forecast weather, radar summaries, hazardous condition warnings, airport and airspace capacity constraints, temporary flight restrictions, and so forth. Information moving among these stovepipe systems is usually mediated by people (for example, air traffic controllers) or single-purpose applications. People, whose intelligence is critical for difficult tasks and unusual circumstances, are not as efficient as computers for tasks that can be automated. Better information sharing can lead to higher system capacity, more efficient utilization and safer operations. Better information sharing through greater automation is possible though not necessarily easy.

  12. Occurrence reporting and processing of operations information

    SciTech Connect

    1997-07-21

    DOE O 232.1A, Occurrence Reporting and Processing of Operations Information, and 10 CFR 830.350, Occurrence Reporting and Processing of Operations Information (when it becomes effective), along with this manual, set forth occurrence reporting requirements for Department of Energy (DOE) Departmental Elements and contractors responsible for the management and operation of DOE-owned and -leased facilities. These requirements include categorization of occurrences related to safety, security, environment, health, or operations ("Reportable Occurrences"); DOE notification of these occurrences; and the development and submission of documented follow-up reports. This Manual provides detailed information for categorizing and reporting occurrences at DOE facilities. Information gathered by the Occurrence Reporting and Processing System is used for analysis of the Department's performance in environmental protection, safeguards and security, and safety and health of its workers and the public. This information is also used to develop lessons learned and document events that significantly impact DOE operations.

  13. Scalable Networked Information Processing Environment (SNIPE)

    SciTech Connect

    Fagg, G.E.; Moore, K.; Dongarra, J.J.; Geist, A.

    1997-11-01

    SNIPE is a metacomputing system that aims to provide a reliable, secure, fault tolerant environment for long term distributed computing applications and data stores across the global Internet. This system combines global naming and replication of both processing and data to support large scale information processing applications leading to better availability and reliability than currently available with typical cluster computing and/or distributed computer environments.

  14. Arkansas Technology Information System.

    ERIC Educational Resources Information Center

    VanBiervliet, Alan; Parette, Howard P., Jr.

    The Arkansas Technology Information System (ARTIS) was developed to fill a significant void in existing systems of technical support to Arkansans with disabilities by creating and maintaining a consumer-responsive statewide system of data storage and retrieval regarding assistive technology and services. ARTIS goals also include establishment of a…

  15. SHRIF, a General-Purpose System for Heuristic Retrieval of Information and Facts, Applied to Medical Knowledge Processing.

    ERIC Educational Resources Information Center

    Findler, Nicholas V.; And Others

    1992-01-01

    Describes SHRIF, a System for Heuristic Retrieval of Information and Facts, and the medical knowledge base that was used in its development. Highlights include design decisions; the user-machine interface, including the language processor; and the organization of the knowledge base in an artificial intelligence (AI) project like this one. (57…

  16. HOPE information system review

    NASA Astrophysics Data System (ADS)

    Suzuki, Yoshiaki; Nishiyama, Kenji; Ono, Shuuji; Fukuda, Kouin

    1992-08-01

    An overview of the review conducted on the H-2 Orbiting Plane (HOPE) is presented. A prototype model was constructed by inputting various technical information proposed by related laboratories. In particular, an operation flow was constructed that clarifies the correlations between the various analysis items, judgement criteria, technical data, and interfaces with other systems. Technical information database and retrieval systems were studied. A Macintosh personal computer was selected for information shaping because of its excellent functionality, performance, operability, and software completeness.

  17. Practicality of quantum information processing

    NASA Astrophysics Data System (ADS)

    Lau, Hoi-Kwan

    Quantum Information Processing (QIP) is expected to bring revolutionary enhancement to various technological areas. However, today's QIP applications are far from being practical. The problem involves both hardware issues, i.e., quantum devices are imperfect, and software issues, i.e., the functionality of some QIP applications is not fully understood. Aiming to improve the practicality of QIP, in my PhD research I have studied various topics in quantum cryptography and ion trap quantum computation. In quantum cryptography, I first studied the security of position-based quantum cryptography (PBQC). I discovered a wrong assumption in the previous literature that the cheaters are not allowed to share entangled resources. I proposed entanglement attacks that could cheat all known PBQC protocols. I also studied the practicality of continuous-variable (CV) quantum secret sharing (QSS). While the security of CV QSS was considered by the literature only in the limit of infinite squeezing, I found that finitely squeezed CV resources could also provide a finite secret sharing rate. Our work relaxes the stringent resource requirements of implementing QSS. In ion trap quantum computation, I studied the phase error of quantum information induced by the dc Stark effect during ion transportation. I found an optimized ion trajectory for which the phase error is minimized. I also defined a threshold speed, above which ion transportation would induce significant error. In addition, I proposed a new application for ion trap systems as universal bosonic simulators (UBS). I introduced two architectures, and discussed their respective strengths and weaknesses. I illustrated the implementations of bosonic state initialization, transformation, and measurement by applying radiation fields or by varying the trap potential. Compared with optical experiments, the ion trap UBS offers higher state-initialization efficiency and higher measurement accuracy. Finally, I

  18. Information processing. [in human performance

    NASA Technical Reports Server (NTRS)

    Wickens, Christopher D.; Flach, John M.

    1988-01-01

    Theoretical models of sensory-information processing by the human brain are reviewed from a human-factors perspective, with a focus on their implications for aircraft and avionics design. The topics addressed include perception (signal detection and selection), linguistic factors in perception (context provision, logical reversals, absence of cues, and order reversals), mental models, and working and long-term memory. Particular attention is given to decision-making problems such as situation assessment, decision formulation, decision quality, selection of action, the speed-accuracy tradeoff, stimulus-response compatibility, stimulus sequencing, dual-task performance, task difficulty and structure, and factors affecting multiple task performance (processing modalities, codes, and stages).

  19. Forest Resource Information System

    NASA Technical Reports Server (NTRS)

    Mrocznyski, R. P.

    1983-01-01

    Twenty-three processing functions aid in utilizing LANDSAT data for forest resource management. Designed to work primarily with digital data obtained from measurements recorded by multispectral remote sensors mounted on aerospace platforms. Communication between processing functions, simplicity of control, and commonality of data files in LARSFRIS enhance the usefulness of the system as a tool for research and development of remote sensing systems.

  20. NLP Meets the Jabberwocky: Natural Language Processing in Information Retrieval.

    ERIC Educational Resources Information Center

    Feldman, Susan

    1999-01-01

    Focuses on natural language processing (NLP) in information retrieval. Defines the seven levels at which people extract meaning from text/spoken language. Discusses the stages of information processing; how an information retrieval system works; advantages to adding full NLP to information retrieval systems; and common problems with information…

  1. Management Information Systems.

    ERIC Educational Resources Information Center

    Crump, Kelvin

    An Australian university architect studying management information systems programs at academic institutions in the United States visited 26 universities and colleges and nine educational and professional associations, including extended visits at the University of Wisconsin and the National Center of Higher Education Management Systems. During…

  2. FLEXIBLE APPLICATION OF THE JLAB PANSOPHY INFORMATION SYSTEM FOR PROJECT REPORTS, PROCESS MONITORING, AND R&D SAMPLE TRACKING

    SciTech Connect

    Valerie Bookwalter; Bonnie Madre; Charles Reece

    2008-02-12

    The use and features of the JLab SRF Institute IT system Pansophy [1,2] continue to expand. In support of the cryomodule rework project for CEBAF, a full set of web-based travelers has been implemented and an integrated set of live summary reports has been created. A graphical user interface within the reports enables navigation to either higher-level summaries or drill-down to the original source data. In addition to the collection of episodic data, Pansophy is now used to capture, coordinate, and display continuously logged process parameters that relate to technical water systems and clean room environmental conditions. In a new expansion, Pansophy is being used to collect and track process and analytical data sets associated with SRF material samples that are part of the surface creation, processing, and characterization R&D program.

  3. Quantum communication and information processing

    NASA Astrophysics Data System (ADS)

    Beals, Travis Roland

    Quantum computers enable dramatically more efficient algorithms for solving certain classes of computational problems, but, in doing so, they create new problems. In particular, Shor's Algorithm allows for efficient cryptanalysis of many public-key cryptosystems. As public key cryptography is a critical component of present-day electronic commerce, it is crucial that a working, secure replacement be found. Quantum key distribution (QKD), first developed by C.H. Bennett and G. Brassard, offers a partial solution, but many challenges remain, both in terms of hardware limitations and in designing cryptographic protocols for a viable large-scale quantum communication infrastructure. In Part I, I investigate optical lattice-based approaches to quantum information processing. I look at details of a proposal for an optical lattice-based quantum computer, which could potentially be used for both quantum communications and for more sophisticated quantum information processing. In Part III, I propose a method for converting and storing photonic quantum bits in the internal state of periodically-spaced neutral atoms by generating and manipulating a photonic band gap and associated defect states. In Part II, I present a cryptographic protocol which allows for the extension of present-day QKD networks over much longer distances without the development of new hardware. I also present a second, related protocol which effectively solves the authentication problem faced by a large QKD network, thus making QKD a viable, information-theoretic secure replacement for public key cryptosystems.

  4. Electrostatic containerless processing system

    NASA Astrophysics Data System (ADS)

    Rulison, Aaron J.; Watkins, John L.; Zambrano, Brian

    1997-07-01

    We introduce a materials science tool for investigating refractory solids and melts: the electrostatic containerless processing system (ESCAPES). ESCAPES maintains refractory specimens of materials in a pristine state by levitating and heating them in a vacuum chamber, thereby avoiding the contaminating influences of container walls and ambient gases. ESCAPES is designed for the investigation of thermophysical properties, phase equilibria, metastable phase formation, undercooling and nucleation, time-temperature-transformation diagrams, and other aspects of materials processing. ESCAPES incorporates several design improvements over prior electrostatic levitation technology. It has an informative and responsive computer control system. It has separate light sources for heating and charging, which prevents runaway discharging. Both the heating and charging light sources are narrow band, which allows the use of optical pyrometry and other diagnostics at all times throughout processing. Heat is provided to the levitated specimens by a 50 W Nd:YAG laser operating at 1.064 μm. A deuterium arc lamp charges the specimen through photoelectric emission. ESCAPES can heat metals, ceramics, and semiconductors to temperatures exceeding 2300 K; specimens range in size from 1 to 3 mm diam. This article describes the design, capabilities, and applications of ESCAPES, focusing on improvements over prior electrostatic levitation technology.

  5. Intelligent Work Process Engineering System

    NASA Technical Reports Server (NTRS)

    Williams, Kent E.

    2003-01-01

    Optimizing performance on work activities and processes requires metrics of performance that management can monitor and analyze in order to support further improvements in efficiency, effectiveness, safety, reliability and cost. Information systems are therefore required to assist management in making timely, informed decisions regarding these work processes and activities. Information systems that would support such timely decisions for Space Shuttle maintenance and servicing do not currently exist. The work presented here details a system that incorporates various automated and intelligent processes and analysis tools to capture, organize and analyze work-process-related data and to make the decisions necessary to meet KSC organizational goals. The advantages and disadvantages of design alternatives for the development of such a system will be discussed, including technologies that would need to be designed, prototyped and evaluated.

  6. Information processing for aerospace structural health monitoring

    NASA Astrophysics Data System (ADS)

    Lichtenwalner, Peter F.; White, Edward V.; Baumann, Erwin W.

    1998-06-01

    Structural health monitoring (SHM) technology provides a means to significantly reduce the life-cycle cost of aerospace vehicles by eliminating unnecessary inspections, minimizing inspection complexity, and providing accurate diagnostics and prognostics to support vehicle life extension. In order to accomplish this, a comprehensive SHM system will need to acquire data from a wide variety of diverse sensors including strain gages, accelerometers, acoustic emission sensors, crack growth gages, corrosion sensors, and piezoelectric transducers. Significant amounts of computer processing will then be required to convert this raw sensor data into meaningful information that indicates both the current structural integrity (diagnostics) and the prognostics necessary for planning and managing the future health of the structure in a cost-effective manner. This paper provides a description of the key types of information processing technologies required in an effective SHM system. These include artificial intelligence techniques such as neural networks, expert systems, and fuzzy logic for nonlinear modeling, pattern recognition, and complex decision making; signal processing techniques such as Fourier and wavelet transforms for spectral analysis and feature extraction; statistical algorithms for optimal detection, estimation, prediction, and fusion; and a wide variety of other algorithms for data analysis and visualization. The intent of this paper is to provide an overview of the role of information processing for SHM, discuss various technologies which can contribute to accomplishing this role, and present some example applications of information processing for SHM implemented at the Boeing Company.
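
    As one small, hedged example of the signal-processing layer mentioned above (spectral analysis for feature extraction), the sketch below computes coarse band energies from a synthetic vibration record; the sampling rate, band edges and signal are assumptions for illustration only.

        import numpy as np

        fs = 1000.0                                   # assumed sampling rate, Hz
        t = np.arange(0, 2.0, 1.0 / fs)
        rng = np.random.default_rng(2)
        signal = np.sin(2 * np.pi * 45 * t) + 0.3 * rng.normal(size=t.size)  # synthetic strain/vibration

        spectrum = np.abs(np.fft.rfft(signal)) ** 2
        freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)

        # Energy in a few coarse bands, usable as features for a classifier
        # or a threshold-based anomaly detector.
        bands = [(0, 20), (20, 80), (80, 200)]
        features = {f"{lo}-{hi} Hz": float(spectrum[(freqs >= lo) & (freqs < hi)].sum())
                    for lo, hi in bands}
        print(features)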

  7. Network Information System

    1996-05-01

    The Network Information System (NWIS) was initially implemented in May 1996 as a system in which computing devices could be recorded so that unique names could be generated for each device. Since then the system has grown into an enterprise-wide information system that is integrated with other systems to provide a seamless flow of data through the enterprise. The system tracks data for two main entities: people and computing devices. For people, NWIS provides source information to the enterprise person data repository for select contractors and visitors; generates and tracks unique usernames and Unix user IDs for every individual granted cyber access; and tracks accounts for centrally managed computing resources, monitoring and controlling the reauthorization of those accounts in accordance with the DOE-mandated interval. For computing devices, NWIS generates unique names for all registered devices; tracks the manufacturer, make, model, Sandia property number, vendor serial number, operating system and version, owner, device location, amount of memory, amount of disk space, and level of support for each device; tracks the hardware address for network cards; tracks the IP address registered to each device along with its canonical and alias names; updates the Dynamic Domain Name Service (DDNS) for canonical and alias names; creates the DHCP configuration files that control the DHCP ranges and allow access only to properly registered computers; tracks and monitors classified security plans for stand-alone computers; tracks the configuration requirements used to set up the machine; tracks the roles people have on machines (system administrator, administrative access, user, etc.); allows system administrators to track hardware and software changes made on the machine; and generates an
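
    Since the record describes generating DHCP configuration so that only registered machines receive addresses, here is a minimal sketch of that idea using the common ISC dhcpd host-entry syntax; the record fields, hostnames and addresses are hypothetical, and the real NWIS output format is not documented here.

        def dhcp_host_entry(record):
            """Render one ISC dhcpd-style host block from a registration record."""
            return (
                f'host {record["name"]} {{\n'
                f'  hardware ethernet {record["mac"]};\n'
                f'  fixed-address {record["ip"]};\n'
                f'}}\n'
            )

        registered = [                                 # hypothetical registry entries
            {"name": "ws-lab-001", "mac": "00:11:22:33:44:55", "ip": "10.0.0.15"},
            {"name": "srv-db-002", "mac": "00:11:22:33:44:66", "ip": "10.0.0.40"},
        ]

        config = "\n".join(dhcp_host_entry(r) for r in registered)
        print(config)                                  # only registered devices appear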

  8. Manufacturing information system

    NASA Astrophysics Data System (ADS)

    Allen, D. K.; Smith, P. R.; Smart, M. J.

    1983-12-01

    The size and cost of manufacturing equipment have made it extremely difficult to perform realistic modeling and simulation of the manufacturing process in university research laboratories. Likewise, the size and cost factors, coupled with the many uncontrolled variables of the production situation, have made it difficult to perform adequate manufacturing research even in the industrial setting. Only the largest companies can afford manufacturing research laboratories; research results are often held proprietary and seldom find their way into the university classroom to aid in the education and training of new manufacturing engineers. The purpose of this research is to continue the development of miniature prototype equipment suitable for use in an integrated CAD/CAM laboratory. The equipment being developed is capable of actually performing production operations (e.g. drilling, milling, turning, punching, etc.) on metallic and non-metallic workpieces. The integrated CAD/CAM Mini-Lab integrates high-resolution computer graphics, parametric design, parametric N/C parts programming, CNC machine control, and automated storage and retrieval with robotic materials handling. The availability of miniature CAD/CAM laboratory equipment will provide the basis for intensive laboratory research on manufacturing information systems.

  9. Management Information System

    NASA Technical Reports Server (NTRS)

    1984-01-01

    New Automated Management Information Center (AMIC) employs innovative microcomputer techniques to create color charts, viewgraphs, or other data displays in a fraction of the time formerly required. Developed under Kennedy Space Center's contract by Boeing Services International Inc., Seattle, WA, AMIC can produce an entirely new informational chart in 30 minutes, or an updated chart in only five minutes. AMIC also has considerable potential as a management system for business firms.

  10. Information extraction during simultaneous motion processing.

    PubMed

    Rideaux, Reuben; Edwards, Mark

    2014-02-01

    When confronted with multiple moving objects the visual system can process them in two stages: an initial stage in which a limited number of signals are processed in parallel (i.e. simultaneously) followed by a sequential stage. We previously demonstrated that during the simultaneous stage, observers could discriminate between presentations containing up to 5 vs. 6 spatially localized motion signals (Edwards & Rideaux, 2013). Here we investigate what information is actually extracted during the simultaneous stage and whether the simultaneous limit varies with the detail of information extracted. This was achieved by measuring the ability of observers to extract varied information from low detail, i.e. the number of signals presented, to high detail, i.e. the actual directions present and the direction of a specific element, during the simultaneous stage. The results indicate that the resolution of simultaneous processing varies as a function of the information which is extracted, i.e. as the information extraction becomes more detailed, from the number of moving elements to the direction of a specific element, the capacity to process multiple signals is reduced. Thus, when assigning a capacity to simultaneous motion processing, this must be qualified by designating the degree of information extraction. PMID:24333279

  11. The Process of Learning from Information.

    ERIC Educational Resources Information Center

    Kuhlthau, Carol Collier

    1995-01-01

    Presents the process of learning from information as the key concept for the library media center in the information age school. The Information Search Process Approach is described as a model for developing information skills fundamental to information literacy, and process learning is discussed. (Author/LRW)

  12. Global Land Information System

    USGS Publications Warehouse

    ,

    1999-01-01

    The Global Land Information System (GLIS) is a World Wide Web-based query tool developed by the U.S. Geological Survey (USGS) to provide data and information about the Earth's land surface. Examples of holdings available through the GLIS include cartographic data, topographic data, soils data, aerial photographs, and satellite images from various agencies and cooperators located around the world. Both hard copy and digital data collections are represented in the GLIS, and preview images are available for millions of the products in the system.

  13. Geographic information systems

    NASA Technical Reports Server (NTRS)

    Campbell, W. J.

    1982-01-01

    Information and activities are provided to: (1) enhance the ability to distinguish between a Geographic Information System (GIS) and a data management system; (2) develop understanding of spatial data handling by conventional methods versus the automated approach; (3) promote awareness of GIS design and capabilities; (4) foster understanding of the concepts and problems of data base development and management; (5) facilitate recognition of how a computerized GIS can model conditions in the present "real world" to project conditions in the future; and (6) appreciate the utility of integrating LANDSAT and other remotely sensed data into the GIS.

  14. The Phobos information system

    NASA Astrophysics Data System (ADS)

    Karachevtseva, I. P.; Oberst, J.; Zubarev, A. E.; Nadezhdina, I. E.; Kokhanov, A. A.; Garov, A. S.; Uchaev, D. V.; Uchaev, Dm. V.; Malinnikov, V. A.; Klimkin, N. D.

    2014-11-01

    We have developed a Geo-information system (GIS) for Phobos, based on data from the Mars Express and Viking Orbiter missions, which includes orthoimages, global maps, terrain and gravity field models, all referenced to the Phobos coordinate system. The data are conveniently stored in the ArcGIS software system, which provides an environment for mapping and allows us to carry out joint data analysis and miscellaneous data cross-comparisons. We have compiled catalogs of Phobos craters using manual and automated techniques, which include about 5500 and 6400 craters, respectively. While crater numbers are biased by the available image resolution and illumination, we estimate that our catalog of manually detected craters contains all Phobos craters with diameters D>250 m (a total of 1072) and that the catalog of automatically detected craters is complete for craters with D>400 m (360 craters). Statistical analysis of these large craters reveals a surplus of craters on the anti-Mars hemisphere, whereas differences in crater abundance between the leading and trailing hemispheres cannot be confirmed. This is in contrast to previous work, where no such asymmetry was found (Schmedemann et al., 2014). But we cannot rule out remaining biases due to resolution, viewing angles or illumination effects. Using a digital terrain model (DTM) derived from photogrammetric image processing, we estimate the depths of 25 craters larger than 2 km using geometric and dynamic heights (for a discussion of Phobos crater morphometry see Kokhanov et al., 2014). We have also compiled catalogs of lineaments and boulders. In particular, we mapped 546 individual grooves or crater chains, which range in length from 0.3 km to 16.2 km. We identified and determined the sizes and locations of 1379 boulders near crater Stickney. Cross-comparisons of gravity field models against distribution patterns of grooves and boulders are currently under way and may shed light on their possible origins. Finally, we have developed

  15. Cardiovascular information systems.

    PubMed

    Marion, Joe

    2012-01-01

    The ARRA/HITECH Act has made electronic medical records a front burner issue, and many believe that EMRs will make departmental systems redundant. Some cardiologists beg to differ, arguing that cardiovascular information systems are deeply clinical and essential to the cardiovascular workflow. Here's a look at the evolution of CVIS, EMR, and their roles as the healthcare landscape is being transformed by meaningful use.

  16. Process and information integration via hypermedia

    NASA Technical Reports Server (NTRS)

    Hammen, David G.; Labasse, Daniel L.; Myers, Robert M.

    1990-01-01

    Success stories for advanced automation prototypes abound in the literature but the deployments of practical large systems are few in number. There are several factors that militate against the maturation of such prototypes into products. Here, the integration of advanced automation software into large systems is discussed. Advanced automation systems tend to be specific applications that need to be integrated and aggregated into larger systems. Systems integration can be achieved by providing expert user-developers with verified tools to efficiently create small systems that interface to large systems through standard interfaces. The use of hypermedia as such a tool in the context of the ground control centers that support Shuttle and space station operations is explored. Hypermedia can be an integrating platform for data, conventional software, and advanced automation software, enabling data integration through the display of diverse types of information and through the creation of associative links between chunks of information. Further, hypermedia enables process integration through graphical invoking of system functions. Through analysis and examples, researchers illustrate how diverse information and processing paradigms can be integrated into a single software platform.

  17. The EH safety representative information system on the safety performance measurement system is where you will find... Word processing and helps with a V-PLUS

    NASA Astrophysics Data System (ADS)

    Loo, P. I.

    What are some of the current environmental, safety, and health problems being found at different DOE facilities? What are some of the latest software products available for HP-3000 on-line applications? How can I meet my customer's ever-changing requirements? These and many other questions are addressed in this review of the Environment, Safety, and Health (EH) Safety Representative Information System (SRIS) located on the Safety Performance Measurement System (SPMS). SPMS is a collection of automated environmental, safety, and health information modules for reference by DOE and DOE contractors. SPMS is operated by the Management Information Systems (MIS) Unit of the System Safety Development Center at EG&G Idaho, Inc. In the following sections an overview of SRIS, an on-line system designed for the HP-3000, is presented along with an analysis of the design methods and software packages used to develop the system.

  18. Quantum Information Processing with Trapped Ions

    NASA Astrophysics Data System (ADS)

    Roos, Christian

    Trapped ions constitute a well-isolated small quantum system that offers low decoherence rates and excellent opportunities for quantum control and measurement by laser-induced manipulation of the ions. These properties make trapped ions an attractive system for experimental investigations of quantum information processing. In the following, the basics of storing, manipulating and measuring quantum information encoded in a string of trapped ions will be discussed. Based on these techniques, entanglement can be created and simple quantum protocols like quantum teleportation be realized. This chapter concludes with a discussion of the use of entangling laser-ion interactions for quantum simulations and quantum logic spectroscopy.

  19. Information processing of earth resources data

    NASA Technical Reports Server (NTRS)

    Zobrist, A. L.; Bryant, N. A.

    1982-01-01

    Current trends in the use of remotely sensed data include integration of multiple data sources of various formats and use of complex models. These trends have placed a strain on information processing systems because an enormous number of capabilities are needed to perform a single application. A solution to this problem is to create a general set of capabilities which can perform a wide variety of applications. General capabilities for the Image-Based Information System (IBIS) are outlined in this report. They are then cross-referenced for a set of applications performed at JPL.

  20. Conceptual Model of Multidimensional Marketing Information System

    NASA Astrophysics Data System (ADS)

    Kriksciuniene, Dalia; Urbanskiene, Ruta

    This article analyses why the information systems at an enterprise do not always satisfy the expectations of marketing management specialists. Computerized systems serve, more and more successfully, the information needs of those areas of enterprise management where they can create an information equivalent of the real management processes. Yet their inability to effectively fulfil marketing needs indicates gaps not only in the ability to structure marketing processes, but also in the conceptual development of marketing information systems (MkIS).

  1. Category identification of changed land-use polygons in an integrated image processing/geographic information system

    NASA Technical Reports Server (NTRS)

    Westmoreland, Sally; Stow, Douglas A.

    1992-01-01

    A framework is proposed for analyzing ancillary data and developing procedures for incorporating ancillary data to aid interactive identification of land-use categories in land-use updates. The procedures were developed for use within an integrated image processing/geographic information system (GIS) that permits simultaneous display of digital image data with the vector land-use data to be updated. With such systems and procedures, automated techniques are integrated with visual-based manual interpretation to exploit the capabilities of both. The procedural framework developed was applied as part of a case study to update a portion of the land-use layer in a regional-scale GIS. About 75 percent of the area in the study site that experienced a change in land use was correctly labeled into 19 categories using the combination of automated and visual interpretation procedures developed in the study.

  2. Human Resource Information System

    ERIC Educational Resources Information Center

    Swinford, Paul

    1978-01-01

    A computer at Valley View Schools, Illinois, is used to collect, store, maintain, and retrieve information about a school district's human resources. The system was designed to increase the efficiency and thoroughness of personnel and payroll record keeping, and reporting functions. (Author/MLF)

  3. Geographic information systems

    USGS Publications Warehouse

    ,

    1992-01-01

    Geographic information systems (GIS) technology can be used for scientific investigations, resource management, and developmental planning. For example, a GIS might allow emergency planners to easily calculate emergency response times in the event of a natural disaster, or a GIS might be used to find wetlands that need protection from pollution.

  4. Communication and Information Systems.

    ERIC Educational Resources Information Center

    Wheeler, Peter

    1982-01-01

    Discusses the Microelectronics Education Programme's work in the communication and information systems domain, suggesting that teachers understand the new technologies and incorporate them into regular classroom instruction. Focuses on computers in the classroom, economy of time, keyboard skills, life skills, and vocational training. (Author/JN)

  5. Studies Probe Information Systems.

    ERIC Educational Resources Information Center

    Lynch, Mary Jo

    1979-01-01

    Summarizes two recent studies of interest to the library community: (1) an A. D. Little analysis of past and present systems for dissemination of scientific and technical information, and (2) Fritz Machlup's economic profile of key disseminators of scholarly, scientific, and intellectual knowledge. (FM)

  6. Energy Information Systems.

    ERIC Educational Resources Information Center

    Hales, Celia E.

    This paper examines the need for accurate, reliable data on energy, flowing upward to the national government from various energy-intensive information systems. Part I explores the need for a national policy coordinating this flow within both the United States and, for comparative purposes, Great Britain. Part II presents in outline form the…

  7. Pharmacology Information System Ready

    ERIC Educational Resources Information Center

    Chemical and Engineering News, 1973

    1973-01-01

    Discusses the development and future of "Prophet," a specialized information handling system for pharmacology research. It is designed to facilitate the acquisition and dissemination of knowledge about mechanisms of drug action, and it is hoped that it will aid in converting pharmacology research from an empirical to a predictive science. (JR)

  8. Demystifying radiology information systems.

    PubMed

    Swearingen, R

    2000-01-01

    Selecting the right radiology information system (RIS) can be a difficult and tedious task for radiology managers. Sometimes the information systems department ends up selecting the RIS. As a radiology manager, you can help yourself and your department greatly by becoming more educated concerning the technology and terminology of radiology information systems. You can then participate in one of the most important decisions that will ever be made regarding your department. There is much confusion about the meanings of the terms interfaced and integrated. Two applications are generally considered integrated if they freely access and update each other's databases. Two applications are generally considered interfaced if they pass data to each other but do not directly access or update each other's databases. Two more terms are centralized and decentralized. Centralized is the concept of "putting all of your eggs in one basket." Decentralization means you spread your resources out. The main difference between centralized and decentralized is that all components of a centralized system share the same fate (good or bad), while decentralized components operate independently and aren't affected directly by failures in another system. Another significant term relevant to RIS systems is HL7, which is a standardized data format that allows one application to pass data to another application in a format that the receiving application understands. RIS vendors generally fall in three categories: single-source vendors, multiproduct vendors and single-product vendors. Single-product vendors include best-of-breed vendors. No one approach is necessarily better than the others; which you choose will depend on your needs. When considering the purchase of an RIS system, there are important questions to ask yourself, the vendor and the vendors' customers as you gather information and prepare to make a decision.
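
    To make the "interfaced" idea concrete, the sketch below splits a pipe-delimited HL7 v2-style message into segments and fields, which is how a receiving application consumes data without touching the sender's database. The message content is invented for illustration, and real HL7 interfaces involve encoding rules, acknowledgements and interface engines not shown here.

        sample = (
            "MSH|^~\\&|RIS|RADIOLOGY|HIS|HOSPITAL|202401011200||ORM^O01|000001|P|2.3\r"
            "PID|1||123456||DOE^JANE\r"
            "OBR|1|||71020^CHEST XRAY 2 VIEWS\r"
        )

        def parse_hl7(message):
            """Group pipe-delimited segments by segment ID (toy parser, not HL7-complete)."""
            segments = {}
            for raw in filter(None, message.split("\r")):
                fields = raw.split("|")
                segments.setdefault(fields[0], []).append(fields[1:])
            return segments

        msg = parse_hl7(sample)
        print(msg["PID"][0][4])   # -> 'DOE^JANE' (patient name field in this toy message)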

  9. Toward intelligent information system

    NASA Astrophysics Data System (ADS)

    Takano, Fumio; Hinatsu, Ken'ichi

    This article describes the indexing aid systems and projects at JICST, API, NLM and BIOSIS. These organizations deal with very broad domains of scientific, medical and technological literature; indexing is done with controlled terms and is routinely performed by highly skilled indexers. Because of the high cost of controlled indexing of bibliographic information, they have designed automated indexing systems and/or expert-like systems that take advantage of many years of indexing experience, using knowledge bases and/or thesauri.

  10. Safeguards Information Management Systems (SIMS)

    SciTech Connect

    Sorenson, R.J.; Sheely, K.B.; Brown, J.B.; Horton, R.D.; Strittmatter, R.; Manatt, D.R.

    1994-04-01

    The requirements for the management of information at the International Atomic Energy Agency (IAEA) and its Department of Safeguards are rapidly changing. Historically, the Department of Safeguards has had the requirement to process large volumes of conventional safeguards information. An information management system is currently in place that adequately handles the IAEA's conventional safeguards data needs. In the post-Iraq environment, however, there is a growing need to expand the IAEA information management capability to include unconventional forms of information. These data include environmental sampling results, photographs, video film, lists of machine tools, and open-source materials such as unclassified publications. The US Department of Energy (DOE) has responded to this information management need by implementing the Safeguards Information Management Systems (SIMS) initiative. SIMS was created by the DOE to anticipate and respond to IAEA information management needs through a multilaboratory initiative that will utilize an integrated approach to develop and deploy technology in a timely and cost-effective manner. The DOE will use the SIMS initiative to coordinate US information management activities that support the IAEA Department of Safeguards.

  11. Symposium on Geographic Information Systems.

    ERIC Educational Resources Information Center

    Felleman, John, Ed.

    1990-01-01

    Six papers on geographic information systems cover the future of geographic information systems, land information systems modernization in Wisconsin, the Topologically Integrated Geographic Encoding and Referencing (TIGER) System of the U.S. Bureau of the Census, satellite remote sensing, geographic information systems and sustainable development,…

  12. Nuclear criticality information system

    SciTech Connect

    Koponen, B.L.; Hampel, V.E.

    1981-11-30

    The nuclear criticality safety program at LLNL began in the 1950's with a critical measurements program which produced benchmark data until the late 1960's. This same time period saw the rapid development of computer technology useful for both computer modeling of fissile systems and for computer-aided management and display of the computational benchmark data. Database management grew in importance as the amount of information increased and as experimental programs were terminated. Within the criticality safety program at LLNL we began at that time to develop a computer library of benchmark data for validation of computer codes and cross sections. As part of this effort, we prepared a computer-based bibliography of criticality measurements on relatively simple systems. However, it is only now that some of these computer-based resources can be made available to the nuclear criticality safety community at large. This technology transfer is being accomplished by the DOE Technology Information System (TIS), a dedicated, advanced information system. The NCIS database is described.

  13. Information technology security system engineering methodology

    NASA Technical Reports Server (NTRS)

    Childs, D.

    2003-01-01

    A methodology is described for system engineering security into large information technology systems under development. The methodology is an integration of a risk management process and a generic system development life cycle process. The methodology is to be used by Security System Engineers to effectively engineer and integrate information technology security into a target system as it progresses through the development life cycle. The methodology can also be used to re-engineer security into a legacy system.

  14. Information sciences experiment system

    NASA Technical Reports Server (NTRS)

    Katzberg, Stephen J.; Murray, Nicholas D.; Benz, Harry F.; Bowker, David E.; Hendricks, Herbert D.

    1990-01-01

    The rapid expansion of remote sensing capability over the last two decades will take another major leap forward with the advent of the Earth Observing System (Eos). An approach is presented that will permit experiments and demonstrations in onboard information extraction. The approach is a non-intrusive, eavesdropping mode in which a small amount of spacecraft real estate is allocated to an onboard computation resource. How such an approach allows the evaluation of advanced technology in the space environment, advanced techniques in information extraction for both Earth science and information science studies, direct to user data products, and real-time response to events, all without affecting other on-board instrumentation is discussed.

  15. Clinical Protocol Information System

    PubMed Central

    Wirtschafter, David D.; Gams, Richard; Ferguson, Carol; Blackwell, William; Boackle, Paul

    1980-01-01

    The Clinical Protocol Information System (CPIS) supports the clinical research and patient care objectives of the SouthEastern Cancer Study Group (SEG). The information system goals are to improve the evaluability of clinical trials, decrease the frequency of adverse patient events, implement drug toxicity surveillance, improve the availability of study data and demonstrate the criteria for computer networks that can impact on the general medical care of the community. Nodes in the network consist of Data General MicroNova MP-100 minicomputers that drive the interactive data dialogue and communicate with the network concentrator (another DG MicroNova) in Birmingham. Functions supported include: source data editing, care “advice,” care “audit,” care “explanation,” and treatment note printing. The complete database is updated nightly and resides on UAB's IBM 370/158-AP.

  16. Proprioceptive information processing in schizophrenia.

    PubMed

    Arnfred, Sidse M H

    2012-03-01

    This doctoral thesis focuses on brain activity in response to proprioceptive stimulation in schizophrenia. The works encompass methodological developments substantiated by investigations of healthy volunteers and two clinical studies of schizophrenia spectrum patients. American psychiatrist Sandor Rado (1890-1972) suggested that one of two un-reducible deficits in schizophrenia was a disorder of proprioception. Exploration of proprioceptive information processing is possible through the measurement of evoked and event related potentials. Event related EEG can be analyzed as conventional time-series averages or as oscillatory averages transformed into the frequency domain. Gamma activity evoked by electricity or by another type of somatosensory stimulus has not been reported before in schizophrenia. Gamma activity is considered to be a manifestation of perceptual integration. A new load stimulus was constructed that stimulated the proprioceptive dimension of recognition of applied force. This load stimulus was tested both in simple and several types of more complex stimulus paradigms, with and without tasks, in total in 66 healthy volunteers. The evoked potential (EP) resulting from the load stimulus was named the proprioceptive EP. The later components of the proprioceptive EP (> 150 ms) were modulated similarly to previously reported electrical somatosensory EPs by repetition and cognitive task. The earlier activity was further investigated through decomposition of the time-frequency transformed data by a new non-negative matrix analysis, and previous research and visual inspection validated these results. Several time-frequency components emerged in the proprioceptive EP. The contra-lateral parietal gamma component (60-70 ms; 30-41 Hz) had not previously been described in the somatosensory modality without electrical stimulation. The parietal beta component (87-103 ms; 19-22 Hz) was increased when the proprioceptive stimulus appeared in a predictable sequence in

  18. Medical image processing system

    NASA Astrophysics Data System (ADS)

    Wang, Dezong; Wang, Jinxiang

    1994-12-01

    A medical image processing system, named the NAI200 Medical Image Processing System and appraised by the Chinese government, is described; principles and cases are provided. Many kinds of pictures are used in modern medical diagnosis, for example B-ultrasound, X-ray, CT, and MRI images. Sometimes the pictures are not good enough for diagnosis because noise obscures the real situation on them, which means image processing is needed. The system has four functions. The first is image processing, involving more than thirty-four programs. The second is calculation: the areas or volumes of single or multiple tissues are calculated. The third is three-dimensional reconstruction: stereo images of organs or tumors are reconstructed from cross-sections. The last is image storage: all pictures can be transformed into digital images and then stored on hard or floppy disk. Both the functions of the system and the basic principles behind them are explained in detail. The system has been applied in hospitals, and images from hundreds of cases have been processed; the functions are described with reference to real cases, and a few examples are given here.

  19. TWRS information locator database system design description

    SciTech Connect

    Knutson, B.J.

    1996-09-13

    This document gives an overview and description of the Tank Waste Remediation System (TWRS) Information Locator Database (ILD) system design. The TWRS ILD system is an inventory of information used in the TWRS Systems Engineering process to represent the TWRS Technical Baseline. The inventory is maintained in the form of a relational database developed in Paradox 4.5.

  20. Quantum information processing : science & technology.

    SciTech Connect

    Horton, Rebecca; Carroll, Malcolm S.; Tarman, Thomas David

    2010-09-01

    Qubits have been demonstrated using GaAs double quantum dots (DQDs); the qubit basis states are the singlet and triplet stationary states. Long spin decoherence times in silicon spur translation of the GaAs qubit into silicon. In the near term the goals are: (1) Develop surface-gate enhancement-mode double quantum dots (MOS & strained-Si/SiGe) to demonstrate few electrons and spin read-out, and to examine impurity-doped quantum dots as an alternative architecture; (2) Use mobility, C-V, ESR, quantum dot performance & modeling to feed back and improve processing, including development of atomic precision fabrication at SNL; (3) Examine integrated electronics approaches to the RF-SET; (4) Use combinations of numerical packages for multi-scale simulation of quantum dot systems (NEMO3D, EMT, TCAD, SPICE); and (5) Continue micro-architecture evaluation for different device and transport architectures.

  1. The information systems heritage

    NASA Astrophysics Data System (ADS)

    Kurzhals, P. R.; Bricker, R. W.; Jensen, A. S.; Smith, A. T.

    1981-05-01

    This paper addresses key developments in the evolution of information systems over the past five decades. Major areas covered include the growth of imaging sensors from such pioneering devices as the iconoscope and orthicon which ushered in television, through a wide range of vidicon tubes, to the solid-state arrays which characterize current systems; the phenomenal expansion of electronic communications from telegraph and telephone wires, through the introduction of broadcast and microwave relay services, to the present era of worldwide satellite communications and data networks; and the key role of digital computers from their ancient precursors like the abacus and the mechanical calculating engines, through the appearance of the first large-scale electronic computers and their transistorized successors, to the rapid proliferation of miniaturized processors which impact every aspect of aerospace systems today.

  2. Center for Information Services, Phase II: Detailed System Design and Programming, Part 7 - Text Processing, Phase IIA Final Report.

    ERIC Educational Resources Information Center

    Silva, Georgette M.

    Libraries, as well as larger information networks, are necessarily based upon the storage of information files consisting in many cases of written materials and texts such as books, serials, abstracts, manuscripts and archives. At the present stage of the "information explosion" no librarian can afford to ignore the contribution of modern…

  3. Centralized Storm Information System (CSIS)

    NASA Technical Reports Server (NTRS)

    Norton, C. C.

    1985-01-01

    A final progress report is presented on the Centralized Storm Information System (CSIS). The primary purpose of the CSIS is to demonstrate and evaluate real time interactive computerized data collection, interpretation and display techniques as applied to severe weather forecasting. CSIS objectives pertaining to improved severe storm forecasting and warning systems are outlined. The positive impact that CSIS has had on the National Severe Storms Forecast Center (NSSFC) is discussed. The benefits of interactive processing systems on the forecasting ability of the NSSFC are described.

  4. SPECIAL ISSUE ON OPTICAL PROCESSING OF INFORMATION: Method of implementation of optoelectronic multiparametric signal processing systems based on multivalued-logic principles

    NASA Astrophysics Data System (ADS)

    Arestova, M. L.; Bykovskii, A. Yu

    1995-10-01

    An architecture is proposed for a specialised optoelectronic multivalued logic processor based on the Allen—Givone algebra. The processor is intended for multiparametric processing of data arriving from a large number of sensors or for tackling spectral analysis tasks. The processor architecture makes it possible to obtain an approximate general estimate of the state of an object being diagnosed on a p-level scale. Optoelectronic systems are proposed for MAXIMUM, MINIMUM, and LITERAL logic gates, based on optical-frequency encoding of logic levels. Corresponding logic gates form a complete set of logic functions in the Allen—Givone algebra.
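
    As a rough software analogue of the gate set described above, the sketch below implements generic p-valued MAXIMUM, MINIMUM, and LITERAL operations (with the LITERAL window returning the highest logic level inside an interval and the lowest outside, a common textbook convention); it is an assumed, purely digital illustration, not the optical-frequency-encoded implementation proposed in the paper.

        # Generic p-valued logic operations (digital illustration only).
        P = 4  # number of logic levels: 0 .. P-1

        def vmax(x: int, y: int) -> int:
            """MAXIMUM gate."""
            return max(x, y)

        def vmin(x: int, y: int) -> int:
            """MINIMUM gate."""
            return min(x, y)

        def literal(x: int, a: int, b: int) -> int:
            """LITERAL window: top level if a <= x <= b, else the lowest level."""
            return P - 1 if a <= x <= b else 0

        # Example: flag whether sensor reading x falls in the band [1, 2],
        # then combine it with a second reading y.
        x, y = 2, 1
        print(vmin(literal(x, 1, 2), vmax(y, 1)))  # -> 1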

  5. Chemical information systems

    SciTech Connect

    Zebora, M.

    1994-12-31

    The growing number of federal and state regulations, from EPA to OSHA, places a large burden on organizations to comply with requirements regarding chemicals in the workplace. The cornerstone of chemical information is the Material Safety Data Sheet (MSDS). The MSDS has been a requirement for chemical manufacturers for over fifteen years. Manufacturers of hazardous materials must provide MSDSs to purchasers. However, more recent OSHA regulations, in particular the Right To Know and the Hazard Communication Standard (Haz-Com), require that employers who use chemicals be capable of providing an MSDS to every employee who requests one for a material they work with. Paper filing systems for managing MSDSs are hard to maintain, costly, and inefficient. In multifacility organizations this can result in delays in distributing MSDSs to employees. At AT&T Bell Laboratories, the Environmental Health and Safety Center has invested over a decade of development work in producing an integrated Chemical Inventory System/MSDS System. That system meets the requirements discussed in this paper and serves six major R&D laboratory facilities in three states. The system resides on a desktop personal computer. Operation of the system relies on teamwork among several diverse organizations involved in managing chemical safety at AT&T Bell Laboratories. The departments represented on that team are Industrial Hygiene and Safety, Environmental Management, Facilities Operations, Purchasing, Health Services, Research, and Environmental Data Management Services.

  6. Educational Management Information Systems: Progress and Prospectives.

    ERIC Educational Resources Information Center

    Evans, John A.

    An educational management information system is a network of communication channels, information sources, computer storage and retrieval devices, and processing routines that provide data to educational managers at different levels, places, and times to facilitate decisionmaking. Management information systems should be differentiated from…

  7. Visual Information Processing for Television and Telerobotics

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O. (Editor); Park, Stephen K. (Editor)

    1989-01-01

    This publication is a compilation of the papers presented at the NASA conference on Visual Information Processing for Television and Telerobotics. The conference was held at the Williamsburg Hilton, Williamsburg, Virginia on May 10 to 12, 1989. The conference was sponsored jointly by NASA Offices of Aeronautics and Space Technology (OAST) and Space Science and Applications (OSSA) and the NASA Langley Research Center. The presentations were grouped into three sessions: Image Gathering, Coding, and Advanced Concepts; Systems; and Technologies. The program was organized to provide a forum in which researchers from industry, universities, and government could be brought together to discuss the state of knowledge in image gathering, coding, and processing methods.

  8. Industrial process surveillance system

    DOEpatents

    Gross, Kenneth C.; Wegerich, Stephan W.; Singer, Ralph M.; Mott, Jack E.

    1998-01-01

    A system and method for monitoring an industrial process and/or industrial data source. The system includes generating time varying data from industrial data sources, processing the data to obtain time correlation of the data, determining the range of data, determining learned states of normal operation and using these states to generate expected values, comparing the expected values to current actual values to identify a current state of the process closest to a learned, normal state; generating a set of modeled data, and processing the modeled data to identify a data pattern and generating an alarm upon detecting a deviation from normalcy.
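
    The claims above describe a generic learn-then-monitor pattern. The sketch below illustrates that pattern in a deliberately simplified form, learning per-signal statistics from normal-operation data, generating expected values, and raising an alarm on deviation; the signals, thresholds, and statistics are assumptions for illustration, and the code is not the patented estimation technique.

        import numpy as np

        def learn_normal_state(training: np.ndarray):
            """Learn per-signal mean and spread from normal-operation data
            (rows = time samples, columns = signals)."""
            return training.mean(axis=0), training.std(axis=0)

        def surveil(current, mean, std, n_sigmas=4.0):
            """Flag each signal whose reading deviates from its expected
            (learned) value by more than n_sigmas standard deviations."""
            return np.abs(current - mean) > n_sigmas * std

        # Hypothetical example: three process signals under normal operation.
        rng = np.random.default_rng(0)
        normal_data = rng.normal(loc=[10.0, 50.0, 0.5],
                                 scale=[0.1, 1.0, 0.01], size=(1000, 3))
        mu, sigma = learn_normal_state(normal_data)
        print(surveil(np.array([10.05, 57.0, 0.5]), mu, sigma))  # alarms on signal 2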

  9. Industrial Process Surveillance System

    DOEpatents

    Gross, Kenneth C.; Wegerich, Stephan W; Singer, Ralph M.; Mott, Jack E.

    2001-01-30

    A system and method for monitoring an industrial process and/or industrial data source. The system includes generating time varying data from industrial data sources, processing the data to obtain time correlation of the data, determining the range of data, determining learned states of normal operation and using these states to generate expected values, comparing the expected values to current actual values to identify a current state of the process closest to a learned, normal state; generating a set of modeled data, and processing the modeled data to identify a data pattern and generating an alarm upon detecting a deviation from normalcy.

  10. Industrial process surveillance system

    DOEpatents

    Gross, K.C.; Wegerich, S.W.; Singer, R.M.; Mott, J.E.

    1998-06-09

    A system and method are disclosed for monitoring an industrial process and/or industrial data source. The system includes generating time varying data from industrial data sources, processing the data to obtain time correlation of the data, determining the range of data, determining learned states of normal operation and using these states to generate expected values, comparing the expected values to current actual values to identify a current state of the process closest to a learned, normal state; generating a set of modeled data, and processing the modeled data to identify a data pattern and generating an alarm upon detecting a deviation from normalcy. 96 figs.

  11. Social Information Processing in Deaf Adolescents

    ERIC Educational Resources Information Center

    Torres, Jesús; Saldaña, David; Rodríguez-Ortiz, Isabel R.

    2016-01-01

    The goal of this study was to compare the processing of social information in deaf and hearing adolescents. A task was developed to assess social information processing (SIP) skills of deaf adolescents based on Crick and Dodge's (1994; A review and reformulation of social information-processing mechanisms in children's social adjustment.…

  12. Estimating Prediction Uncertainty from Geographical Information System Raster Processing: A User's Manual for the Raster Error Propagation Tool (REPTool)

    USGS Publications Warehouse

    Gurdak, Jason J.; Qi, Sharon L.; Geisler, Michael L.

    2009-01-01

    The U.S. Geological Survey Raster Error Propagation Tool (REPTool) is a custom tool for use with the Environmental System Research Institute (ESRI) ArcGIS Desktop application to estimate error propagation and prediction uncertainty in raster processing operations and geospatial modeling. REPTool is designed to introduce concepts of error and uncertainty in geospatial data and modeling and provide users of ArcGIS Desktop a geoprocessing tool and methodology to consider how error affects geospatial model output. Similar to other geoprocessing tools available in ArcGIS Desktop, REPTool can be run from a dialog window, from the ArcMap command line, or from a Python script. REPTool consists of public-domain, Python-based packages that implement Latin Hypercube Sampling within a probabilistic framework to track error propagation in geospatial models and quantitatively estimate the uncertainty of the model output. Users may specify error for each input raster or model coefficient represented in the geospatial model. The error for the input rasters may be specified as either spatially invariant or spatially variable across the spatial domain. Users may specify model output as a distribution of uncertainty for each raster cell. REPTool uses the Relative Variance Contribution method to quantify the relative error contribution from the two primary components in the geospatial model - errors in the model input data and coefficients of the model variables. REPTool is appropriate for many types of geospatial processing operations, modeling applications, and related research questions, including applications that consider spatially invariant or spatially variable error in geospatial data.
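
    As a rough illustration of sampling-based error propagation of the kind REPTool is described as performing, the sketch below draws Latin Hypercube samples of spatially invariant input error, pushes them through a toy two-raster model, and reports the per-cell standard deviation of the output; the model (out = 2*A + B), error magnitudes, and raster sizes are invented for the example, and the code is not part of REPTool.

        import numpy as np
        from scipy.special import ndtri  # inverse of the standard normal CDF

        def latin_hypercube(n_samples, rng):
            """1-D Latin Hypercube sample on (0, 1): one draw per stratum, shuffled."""
            strata = (np.arange(n_samples) + rng.random(n_samples)) / n_samples
            return rng.permutation(strata)

        def propagate_error(raster_a, raster_b, sd_a, sd_b, n_samples=500, seed=1):
            """Toy error propagation for the model out = 2*A + B, with spatially
            invariant normal errors on each input raster. Returns the per-cell
            standard deviation of the model output."""
            rng = np.random.default_rng(seed)
            z_a = ndtri(latin_hypercube(n_samples, rng))  # N(0, 1) draws for raster A
            z_b = ndtri(latin_hypercube(n_samples, rng))  # N(0, 1) draws for raster B
            outputs = np.stack([2 * (raster_a + sd_a * za) + (raster_b + sd_b * zb)
                                for za, zb in zip(z_a, z_b)])
            return outputs.std(axis=0)

        # Hypothetical 4x4 input rasters with assumed error standard deviations.
        a = np.full((4, 4), 10.0)
        b = np.linspace(0.0, 15.0, 16).reshape(4, 4)
        print(propagate_error(a, b, sd_a=0.5, sd_b=1.0))  # approx. sqrt(4*0.25 + 1) everywhere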

  13. The AMMA information system

    NASA Astrophysics Data System (ADS)

    Fleury, Laurence; Brissebrat, Guillaume; Boichard, Jean-Luc; Cloché, Sophie; Mière, Arnaud; Moulaye, Oumarou; Ramage, Karim; Favot, Florence; Boulanger, Damien

    2015-04-01

    In the framework of the African Monsoon Multidisciplinary Analyses (AMMA) programme, several tools have been developed in order to boost the data and information exchange between researchers from different disciplines. The AMMA information system includes (i) a user-friendly data management and dissemination system, (ii) quasi real-time display websites and (iii) a scientific paper exchange collaborative tool. The AMMA information system is enriched by past and ongoing projects (IMPETUS, FENNEC, ESCAPE, QweCI, ACASIS, DACCIWA...) addressing meteorology, atmospheric chemistry, extreme events, health, adaptation of human societies... It is becoming a reference information system on environmental issues in West Africa. (i) The projects include airborne, ground-based and ocean measurements, social science surveys, satellite data use, modelling studies and value-added product development. Therefore, the AMMA data portal provides access to a great amount and a large variety of data: - 250 local observation datasets that cover many geophysical components (atmosphere, ocean, soil, vegetation) and human activities (agronomy, health). They have been collected by operational networks since 1850, long term monitoring research networks (CATCH, IDAF, PIRATA...) and intensive scientific campaigns; - 1350 outputs of a socio-economics questionnaire; - 60 operational satellite products and several research products; - 10 output sets of meteorological and ocean operational models and 15 of research simulations. Data documentation complies with metadata international standards, and data are delivered in standard formats. The data request interface takes full advantage of the database relational structure and enables users to build multi-criteria requests (period, area, property, property value…). The AMMA data portal counts about 900 registered users, and 50 data requests every month. The AMMA databases and data portal have been developed and are operated jointly by SEDOO and

  14. Layers of Information: Geographic Information Systems (GIS).

    ERIC Educational Resources Information Center

    Lucking, Robert A.; Christmann, Edwin P.

    2003-01-01

    Describes the Geographic Information System (GIS) which is capable of storing, manipulating, and displaying data allowing students to explore complex relationships through scientific inquiry. Explains applications of GIS in middle school classrooms and includes assessment strategies. (YDS)

  15. Geographic Information Systems.

    PubMed

    Wieczorek, William F; Delmerico, Alan M

    2009-01-01

    This chapter presents an overview of the development, capabilities, and utilization of geographic information systems (GIS). There are nearly an unlimited number of applications that are relevant to GIS because virtually all human interactions, natural and man-made features, resources, and populations have a geographic component. Everything happens somewhere and the location often has a role that affects what occurs. This role is often called spatial dependence or spatial autocorrelation, which exists when a phenomenon is not randomly geographically distributed. GIS has a number of key capabilities that are needed to conduct a spatial analysis to assess this spatial dependence. This chapter presents these capabilities (e.g., georeferencing, adjacency/distance measures, overlays) and provides a case study to illustrate how GIS can be used for both research and planning. Although GIS has developed into a relatively mature application for basic functions, development is needed to more seamlessly integrate spatial statistics and models.The issue of location, especially the geography of human activities, interactions between humanity and nature, and the distribution and location of natural resources and features, is one of the most basic elements of scientific inquiry. Conceptualizations and physical maps of geographic space have existed since the beginning of time because all human activity takes place in a geographic context. Representing objects in space, basically where things are located, is a critical aspect of the natural, social, and applied sciences. Throughout history there have been many methods of characterizing geographic space, especially maps created by artists, mariners, and others eventually leading to the development of the field of cartography. It is no surprise that the digital age has launched a major effort to utilize geographic data, but not just as maps. A geographic information system (GIS) facilitates the collection, analysis, and reporting of
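
    Because spatial autocorrelation is the central concept here, the sketch below computes Moran's I, a standard spatial autocorrelation statistic, for a small hypothetical grid using binary rook-adjacency weights; the grid values and the weighting choice are assumptions for illustration, and the computation is not tied to any particular GIS.

        import numpy as np

        def morans_i(values, weights):
            """Moran's I = (n / S0) * (z' W z) / (z' z), with z = x - mean(x)."""
            z = values - values.mean()
            n, s0 = values.size, weights.sum()
            return (n / s0) * (z @ weights @ z) / (z @ z)

        def rook_weights(rows, cols):
            """Binary rook-adjacency weight matrix for a rows x cols grid."""
            w = np.zeros((rows * cols, rows * cols))
            for r in range(rows):
                for c in range(cols):
                    i = r * cols + c
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if 0 <= rr < rows and 0 <= cc < cols:
                            w[i, rr * cols + cc] = 1.0
            return w

        # Hypothetical 3x3 grid with a spatial trend (expect positive autocorrelation).
        grid = np.array([[1.0, 1.0, 2.0],
                         [1.0, 2.0, 3.0],
                         [2.0, 3.0, 3.0]])
        print(morans_i(grid.ravel(), rook_weights(3, 3)))  # > 0 for clustered values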

  16. Implementation of Alabama Resources Information System, ARIS

    NASA Technical Reports Server (NTRS)

    Herring, B. E.

    1978-01-01

    The development of ARIS, the Alabama Resources Information System, is summarized. The development of databases, system simplification for user access, and making information available to personnel who need to use ARIS or who are developing ARIS-type systems are discussed.

  17. WEAVE core processing system

    NASA Astrophysics Data System (ADS)

    Walton, Nicholas A.; Irwin, Mike; Lewis, James R.; Gonzalez-Solares, Eduardo; Dalton, Gavin; Trager, Scott; Aguerri, J. Alfonso L.; Allende Prieto, Carlos; Benn, Chris R.; Abrams, Don Carlos; Picó, Sergio; Middleton, Kevin; Lodi, Marcello; Bonifacio, Piercarlo

    2014-07-01

    WEAVE is an approved massive wide-field multi-object optical spectrograph (MOS) currently entering its build phase, destined for use on the 4.2-m William Herschel Telescope (WHT). It will be commissioned and begin survey operations in 2017. This paper describes the core processing system (CPS) being developed to process the bulk data flow from WEAVE. We describe the processes and techniques to be used in producing the scientifically validated 'Level 1' data products from the WEAVE data. CPS outputs will include calibrated one-dimensional spectra and initial estimates of basic parameters such as radial velocities (for stars) and redshifts (for galaxies).

  18. Laser material processing system

    DOEpatents

    Dantus, Marcos

    2015-04-28

    A laser material processing system and method are provided. A further aspect of the present invention employs a laser for micromachining. In another aspect of the present invention, the system uses a hollow waveguide. In another aspect of the present invention, a laser beam pulse is given broad bandwidth for workpiece modification.

  19. Image Processing System

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Mallinckrodt Institute of Radiology (MIR) is using a digital image processing system which employs NASA-developed technology. MIR's computer system is the largest radiology system in the world. It is used in diagnostic imaging. Blood vessels are injected with x-ray dye, and the images which are produced indicate whether arteries are hardened or blocked. A computer program developed by Jet Propulsion Laboratory known as Mini-VICAR/IBIS was supplied to MIR by COSMIC. The program provides the basis for developing the computer imaging routines for data processing, contrast enhancement and picture display.

  20. System for Information Discovery

    SciTech Connect

    Crow, Vern; Nakamura, Grant; Younkin, Chance

    1998-09-25

    SID characterizes natural language based documents so that they may be related and retrieved based on content similarity. This technology processes textual documents, autonomously identifies the major topics of the document set, and constructs an interpretable, high dimensional representation of each document. SID also provides the ability to interactively reweight representations based on user need, so users may analyze the dataset from multiple points of view. The particular advantages SID offers are speed, data compression, flexibility in representation, and incremental processing. SPIRE consists of software for visual analysis of text-based information sources. This technology enables users to make discoveries about the content of very large sets of textual documents without requiring the user to read or presort the documents. It employs algorithms for text and word proximity analysis to identify the key themes within the documents. The results of this analysis are projected onto a visual spatial proximity display (Galaxies or Themescape) where document proximity represents the degree of relatedness of theme.
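
    As a schematic picture of content-based characterization with user-driven reweighting, the sketch below builds simple term-frequency vectors, applies user-supplied topic weights, and ranks documents by cosine similarity to a query; the vocabulary, weighting scheme, and documents are assumptions for illustration and do not reproduce SID's or SPIRE's algorithms.

        import math
        from collections import Counter

        DOCS = {
            "doc1": "satellite image processing and image compression",
            "doc2": "clinical information system for patient records",
            "doc3": "satellite remote sensing data compression methods",
        }

        def vectorize(text, weights):
            """Term-frequency vector, scaled by optional per-term user weights."""
            counts = Counter(text.lower().split())
            return Counter({t: c * weights.get(t, 1.0) for t, c in counts.items()})

        def cosine(u, v):
            dot = sum(u[t] * v[t] for t in u)
            nu = math.sqrt(sum(x * x for x in u.values()))
            nv = math.sqrt(sum(x * x for x in v.values()))
            return dot / (nu * nv) if nu and nv else 0.0

        # Reweighting: a user interested in the "compression" theme boosts that term.
        user_weights = {"compression": 3.0}
        query = vectorize("satellite compression", user_weights)
        rank = sorted(DOCS, reverse=True,
                      key=lambda d: cosine(query, vectorize(DOCS[d], user_weights)))
        print(rank)  # doc3 and doc1 rank above doc2 under these assumptions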

  1. System for Information Discovery

    1998-09-25

    SID characterizes natural language based documents so that they may be related and retrieved based on content similarity. This technology processes textual documents, autonomously identifies the major topics of the document set, and constructs an interpretable, high dimensional representation of each document. SID also provides the ability to interactively reweight representations based on user need, so users may analyze the dataset from multiple points of view. The particular advantages SID offers are speed, data compression, flexibility in representation, and incremental processing. SPIRE consists of software for visual analysis of text-based information sources. This technology enables users to make discoveries about the content of very large sets of textual documents without requiring the user to read or presort the documents. It employs algorithms for text and word proximity analysis to identify the key themes within the documents. The results of this analysis are projected onto a visual spatial proximity display (Galaxies or Themescape) where document proximity represents the degree of relatedness of theme.

  2. Display system for imaging scientific telemetric information

    NASA Technical Reports Server (NTRS)

    Zabiyakin, G. I.; Rykovanov, S. N.

    1979-01-01

    A system for imaging scientific telemetric information, based on the M-6000 minicomputer and the SIGD graphic display, is described. It provides two-dimensional graphic display of telemetric information and interaction with the computer for analysis and processing of the telemetric parameters displayed on the screen. The method for outputting running parameter information is presented. User capabilities in the analysis and processing of telemetric information imaged on the display screen, and the user language, are discussed and illustrated.

  3. Television Viewing vs. Reading: Testing Information Processing Assumptions.

    ERIC Educational Resources Information Center

    Meadowcroft, Jeanne M.; Olson, Beth

    As universities gain access to satellite delivery systems, faculty are asking questions about how information processing varies between print and television delivery systems. A study compared 68 undergraduate adults' information processing activity when the same message is presented in print vs. on television. Results reveal little differences…

  4. Information for Successful Interaction with Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Johnson, Kathy A.

    2003-01-01

    Interaction in heterogeneous mission operations teams is not well matched to classical models of coordination with autonomous systems. We describe methods of loose coordination and information management in mission operations. We describe an information agent and information management tool suite for managing information from many sources, including autonomous agents. We present an integrated model of levels of complexity of agent and human behavior, which shows types of information processing and points of potential error in agent activities. We discuss the types of information needed for diagnosing problems and planning interactions with an autonomous system. We discuss types of coordination for which designs are needed for autonomous system functions.

  5. Information Search Process in Science Education.

    ERIC Educational Resources Information Center

    McNally, Mary Jane; Kuhlthau, Carol C.

    1994-01-01

    Discussion of the development of an information skills curriculum focuses on science education. Topics addressed include information seeking behavior; information skills models; the search process of scientists; science education; a process approach for student activities; and future possibilities. (Contains 15 references.) (LRW)

  6. Image gathering and processing - Information and fidelity

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Fales, C. L.; Halyo, N.; Samms, R. W.; Stacy, K.

    1985-01-01

    In this paper we formulate and use information and fidelity criteria to assess image gathering and processing, combining optical design with image-forming and edge-detection algorithms. The optical design of the image-gathering system revolves around the relationship among sampling passband, spatial response, and signal-to-noise ratio (SNR). Our formulations of information, fidelity, and optimal (Wiener) restoration account for the insufficient sampling (i.e., aliasing) common in image gathering as well as for the blurring and noise that conventional formulations account for. Performance analyses and simulations for ordinary optical-design constraints and random scenes indicate that (1) different image-forming algorithms prefer different optical designs; (2) informationally optimized designs maximize the robustness of optimal image restorations and lead to the highest-spatial-frequency channel (relative to the sampling passband) for which edge detection is reliable (if the SNR is sufficiently high); and (3) combining the informationally optimized design with a 3 by 3 lateral-inhibitory image-plane-processing algorithm leads to a spatial-response shape that approximates the optimal edge-detection response of (Marr's model of) human vision and thus reduces the data preprocessing and transmission required for machine vision.
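
    For readers unfamiliar with the Wiener restoration referred to above, the sketch below applies the textbook frequency-domain Wiener filter, W = H* / (|H|^2 + N/S), to a blurred, noisy one-dimensional signal; the blur kernel, noise level, and test signal are invented for the example, and the paper's own formulation (which also accounts for aliasing) is more elaborate.

        import numpy as np

        def wiener_restore(observed, kernel, noise_to_signal):
            """Textbook frequency-domain Wiener restoration of a 1-D signal."""
            H = np.fft.fft(kernel, observed.size)   # blur transfer function
            G = np.fft.fft(observed)                # spectrum of the observation
            W = np.conj(H) / (np.abs(H) ** 2 + noise_to_signal)
            return np.real(np.fft.ifft(W * G))

        # Hypothetical test: a step edge blurred by a 5-point average, plus noise.
        rng = np.random.default_rng(0)
        signal = np.concatenate([np.zeros(64), np.ones(64)])
        kernel = np.ones(5) / 5.0
        blurred = np.real(np.fft.ifft(np.fft.fft(signal) * np.fft.fft(kernel, signal.size)))
        observed = blurred + 0.01 * rng.standard_normal(signal.size)
        restored = wiener_restore(observed, kernel, noise_to_signal=1e-3)
        print(np.abs(restored - signal).mean(),   # restoration error
              np.abs(observed - signal).mean())   # error of the blurred observation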

  7. Student Satisfaction Process in Virtual Learning System: Considerations Based in Information and Service Quality from Brazil's Experience

    ERIC Educational Resources Information Center

    Machado-Da-Silva, Fábio Nazareno; Meirelles, Fernando de Souza; Filenga, Douglas; Filho, Marino Brugnolo

    2014-01-01

    Distance learning has undergone great changes, especially since the advent of the Internet and communication and information technology. Questions have been asked following the growth of this mode of instructional activity. Researchers have investigated methods to assess the benefits of e-learning from a number of perspectives. This survey…

  8. Mathematics of Information Processing and the Internet

    ERIC Educational Resources Information Center

    Hart, Eric W.

    2010-01-01

    The mathematics of information processing and the Internet can be organized around four fundamental themes: (1) access (finding information easily); (2) security (keeping information confidential); (3) accuracy (ensuring accurate information); and (4) efficiency (data compression). In this article, the author discusses each theme with reference to…

  9. Tropical Cyclone Information System

    NASA Technical Reports Server (NTRS)

    Li, P. Peggy; Knosp, Brian W.; Vu, Quoc A.; Yi, Chao; Hristova-Veleva, Svetla M.

    2009-01-01

    The JPL Tropical Cyclone Information System (TCIS) is a Web portal (http://tropicalcyclone.jpl.nasa.gov) that provides researchers with an extensive set of observed hurricane parameters together with large-scale and convection resolving model outputs. It provides a comprehensive set of high-resolution satellite (see figure), airborne, and in-situ observations in both image and data formats. Large-scale datasets depict the surrounding environmental parameters such as SST (Sea Surface Temperature) and aerosol loading. Model outputs and analysis tools are provided to evaluate model performance and compare observations from different platforms. The system pertains to the thermodynamic and microphysical structure of the storm, the air-sea interaction processes, and the larger-scale environment as depicted by ocean heat content and the aerosol loading of the environment. Currently, the TCIS is populated with satellite observations of all tropical cyclones observed globally during 2005. There is a plan to extend the database both forward in time till present as well as backward to 1998. The portal is powered by a MySQL database and an Apache/Tomcat Web server on a Linux system. The interactive graphic user interface is provided by Google Map.

  10. Information-Processing Models and Curriculum Design

    ERIC Educational Resources Information Center

    Calfee, Robert C.

    1970-01-01

    "This paper consists of three sections--(a) the relation of theoretical analyses of learning to curriculum design, (b) the role of information-processing models in analyses of learning processes, and (c) selected examples of the application of information-processing models to curriculum design problems." (Author)

  11. The risk assessment information system

    SciTech Connect

    Kerr, S.B.; Bonczek, R.R.; McGinn, C.W.; Land, M.L.; Bloom, L.D.; Sample, B.E.; Dolislager, F.G.

    1998-06-01

    In an effort to provide service-oriented environmental risk assessment expertise, the Department of Energy (DOE) Center for Risk Excellence (CRE) and DOE Oak Ridge Operations Office (ORO) are sponsoring Oak Ridge National Laboratory (ORNL) to develop a web-based system for disseminating risk tools and information to its users. This system, the Risk Assessment Information System (RAIS), was initially developed to support the site-specific needs of the DOE-ORO Environmental Restoration Risk Assessment Program. With support from the CRE, the system is currently being expanded to benefit all DOE risk information users and can be tailored to meet site-specific needs. Taking advantage of searchable and executable databases, menu-driven queries, and data downloads, using the latest World Wide Web technologies, the RAIS offers essential tools that are used in the risk assessment process, anywhere from project scoping to implementation. The RAIS tools can be located directly at http://risk.lsd.ornl.gov/homepage/rap_tool.htm or through the CRE's homepage at http://www.doe.gov/riskcenter/home.html.

  12. Natural language processing and advanced information management

    NASA Technical Reports Server (NTRS)

    Hoard, James E.

    1989-01-01

    Integrating diverse information sources and application software in a principled and general manner will require a very capable advanced information management (AIM) system. In particular, such a system will need a comprehensive addressing scheme to locate the material in its docuverse. It will also need a natural language processing (NLP) system of great sophistication. It seems that the NLP system must serve three functions. First, it provides a natural language interface (NLI) for the users. Second, it serves as the core component that understands and makes use of the real-world interpretations (RWIs) contained in the docuverse. Third, it enables the reasoning specialists (RSs) to arrive at conclusions that can be transformed into procedures that will satisfy the users' requests. The best candidate for an intelligent agent that can satisfactorily make use of RSs and transform documents (TDs) appears to be an object oriented data base (OODB). OODBs have, apparently, an inherent capacity to use the large numbers of RSs and TDs that will be required by an AIM system and an inherent capacity to use them in an effective way.

  13. Role of RIS/APC for manufacturing RFG/LSD. [Refinery Information Systems/Advanced Process Control, ReFormulated Gasoline/Low Sulfur Diesels

    SciTech Connect

    Latour, P.R.

    1994-01-01

    Revolutionary changes in quality specifications (number, complexity, uncertainty, economic sensitivity) for reformulated gasolines (RFG) and low-sulfur diesels (LSD) are being addressed by powerful, new, computer-integrated manufacturing technology for Refinery Information Systems and Advanced Process Control (RIS/APC). This paper shows how the five active RIS/APC functions: performance measurement, optimization, scheduling, control and integration are used to manufacture new, clean fuels competitively. With current industry spending for this field averaging 2 to 3 cents/bbl crude, many refineries can capture 50 to 100 cents/bbl if the technology is properly employed and sustained throughout refining operations, organizations, and businesses.

  14. Nanophotonics for integrated information systems

    NASA Astrophysics Data System (ADS)

    Levy, Uriel; Tetz, Kevin; Rokitski, Rostislav; Kim, Hyu-Chang; Tsai, Chia-Ho; Abashin, Maxim; Pang, Lin; Zezhad, Maziar; Fainman, Yeshaiahu

    2006-02-01

    Optical technology plays an increasingly important role in numerous information system applications, including optical communications, storage, signal processing, biology, medicine, and sensing. As optical technology develops, there is a growing need to develop scalable and reliable photonic integration technologies. These include the development of passive and active optical components that can be integrated into functional optical circuits and systems, including filters, electrically or optically controlled switching fabrics, optical sources, detectors, amplifiers, etc. We explore the unique capabilities and advantages of nanotechnology in developing next generation integrated photonic information systems. Our approach includes design, modeling and simulations of selected components and devices, their nanofabrication, followed by validation via characterization and testing of the fabricated devices. The latter exploits our recently constructed near field complex amplitude imaging tool. The understanding of near field interactions in nanophotonic devices and systems is a crucial step as these interactions provide a variety of functionalities useful for optical systems integration. Furthermore, near-field optical devices facilitate miniaturization, and simultaneously enhance multifunctionality, greatly increasing the functional complexity per unit volume of the photonic system. Since the optical properties of near-field materials are controlled by the geometry, there is flexibility in the choice of constituent materials, facilitating the implementation of a wide range of devices using compatible materials for ease of fabrication and integration.

  15. Effects of foveal information processing

    NASA Technical Reports Server (NTRS)

    Harris, R. L., Sr.

    1984-01-01

    The scanning behavior of pilots must be understood so that cockpit displays can be assembled which will provide the most information accurately and quickly to the pilot. The results of seven years of collecting and analyzing pilot scanning data are summarized. The data indicate that pilot scanning behavior is: (1) subconscious; (2) situation dependent; and (3) subject to disruption if pilots are forced to make conscious decisions. Testing techniques and scanning analysis techniques have been developed that are sensitive to pilot workload.

  16. Geographic Information Systems and Libraries: Patrons, Maps, and Spatial Information. Papers presented at the Clinic on Library Applications of Data Processing (Champaign, Illinois, April 10-12, 1995).

    ERIC Educational Resources Information Center

    Smith, Linda C., Ed.; Gluck, Myke, Ed.

    This document assembles conference papers which focus on how electronic technologies are creating new ways of meeting user needs for spatial and cartographic information. Contents include: (1) "Mapping Technology in Transition" (Mark Monmonier); (2) "Cataloging Planetospatial Data in Digital Form: Old Wine, New Bottles--New Wine, Old Bottles"…

  17. Selection and Implementation of New Information Systems.

    PubMed

    Kaplan, Keith J; Rao, Luigi K F

    2015-06-01

    The single most important element to consider when evaluating clinical information systems for a practice is workflow. Workflow can be broadly defined as an orchestrated and repeatable pattern of business activity enabled by the systematic organization of resources into processes that transform materials, provide services, or process information.

  18. Responsibility Factors of Reducing Inefficiencies in Information System Processes and Their Role on Intention to Acquire Six Sigma Certification

    ERIC Educational Resources Information Center

    Hejazi, Sara

    2009-01-01

    Organizations worldwide have been turning to Six Sigma program (SSP) to eliminate the defects in their products or drive out the variability in their processes to attain a competitive advantage in their marketplace. An effective certification program has been touted as a major contributor to successful implementation of SSP. An effective…

  19. Modeling biochemical transformation processes and information processing with Narrator

    PubMed Central

    Mandel, Johannes J; Fuß, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-01-01

    Background: Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of supporting an integrative representation of transport, transformation as well as biological information processing. Results: Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Conclusion: Narrator is a flexible and intuitive systems
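
    Since Gillespie's direct method is named above as one simulation target, the sketch below gives a minimal direct-method stochastic simulation of a hypothetical birth-death system; the reactions and rate constants are invented for the example, and the code is independent of Narrator.

        import math
        import random

        def gillespie_direct(x, k_prod, k_deg, t_end, seed=0):
            """Gillespie's direct method for a birth-death system:
               reaction 1: 0 -> X, propensity k_prod
               reaction 2: X -> 0, propensity k_deg * x"""
            rng = random.Random(seed)
            t, trajectory = 0.0, [(0.0, x)]
            while t < t_end:
                a1, a2 = k_prod, k_deg * x
                a0 = a1 + a2
                if a0 == 0.0:
                    break
                t += -math.log(1.0 - rng.random()) / a0  # exponential waiting time
                if rng.random() * a0 < a1:               # pick which reaction fires
                    x += 1
                else:
                    x -= 1
                trajectory.append((t, x))
            return trajectory

        # Hypothetical rates: production 10, degradation 0.1 (steady state near 100).
        print(gillespie_direct(x=0, k_prod=10.0, k_deg=0.1, t_end=100.0)[-1])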

  20. Quartz resonator processing system

    DOEpatents

    Peters, Roswell D. M.

    1983-01-01

    Disclosed is a single chamber ultra-high vacuum processing system for the production of hermetically sealed quartz resonators wherein electrode metallization and sealing are carried out along with cleaning and bake-out without any air exposure between the processing steps. The system includes a common vacuum chamber in which is located a rotatable wheel-like member which is adapted to move a plurality of individual component sets of a flat pack resonator unit past discretely located processing stations in said chamber whereupon electrode deposition takes place followed by the placement of ceramic covers over a frame containing a resonator element and then to a sealing stage where a pair of hydraulic rams including heating elements effect a metallized bonding of the covers to the frame.

  1. Evaluating geographic information systems technology

    USGS Publications Warehouse

    Guptill, Stephen C.

    1989-01-01

    Computerized geographic information systems (GISs) are emerging as the spatial data handling tools of choice for solving complex geographical problems. However, few guidelines exist for assisting potential users in identifying suitable hardware and software. A process to be followed in evaluating the merits of GIS technology is presented. Related standards and guidelines, software functions, hardware components, and benchmarking are discussed. By making users aware of all aspects of adopting GIS technology, they can decide if GIS is an appropriate tool for their application and, if so, which GIS should be used.

  2. Modelling Information System Dynamics: A Perspective.

    ERIC Educational Resources Information Center

    Oswitch, Pauline

    1983-01-01

    Describes British Library's work on Systems Dynamics, a set of techniques for building simulation models based on analysis of information feedback loops. Highlights include macro-simulation modelling activities of social science disciplines, systems analyses and models of information retrieval processes and library services, policy models, and…

  3. Forced guidance and distribution of practice in sequential information processing.

    NASA Technical Reports Server (NTRS)

    Decker, L. R.; Rogers, C. A., Jr.

    1973-01-01

    Distribution of practice and forced guidance were used in a sequential information-processing task in an attempt to increase the capacity of human information-processing mechanisms. A reaction time index of the psychological refractory period was used as the response measure. Massing of practice lengthened response times while forced guidance shortened them. Interpretation was in terms of load reduction upon the response-selection stage of the information-processing system.

  4. Information processing in convex operational theories

    SciTech Connect

    Barnum, Howard Nelch; Wilce, Alexander G

    2008-01-01

    In order to understand the source and extent of the greater-than-classical information processing power of quantum systems, one wants to characterize both classical and quantum mechanics as points in a broader space of possible theories. One approach to doing this, pioneered by Abramsky and Coecke, is to abstract the essential categorical features of classical and quantum mechanics that support various information-theoretic constraints and possibilities, e.g., the impossibility of cloning in the latter, and the possibility of teleportation in both. Another approach, pursued by the authors and various collaborators, is to begin with a very conservative, and in a sense very concrete, generalization of classical probability theory--which is still sufficient to encompass quantum theory--and to ask which 'quantum' informational phenomena can be reproduced in this much looser setting. In this paper, we review the progress to date in this second programme, and offer some suggestions as to how to link it with the categorical semantics for quantum processes developed by Abramsky and Coecke.

  5. The Use of SQL and Second Generation Database Management Systems for Data Processing and Information Retrieval in Libraries.

    ERIC Educational Resources Information Center

    Leigh, William; Paz, Noemi

    1989-01-01

    Describes Structured Query Language (SQL), the result of an American National Standards Institute effort to standardize language used to query computer databases and a common element in second generation database management systems. The discussion covers implementations of SQL, associated products, and techniques for its use in online catalogs,…

  6. Trapped Atomic Ions and Quantum Information Processing

    SciTech Connect

    Wineland, D. J.; Leibfried, D.; Bergquist, J. C.; Blakestad, R. B.; Bollinger, J. J.; Britton, J.; Chiaverini, J.; Epstein, R. J.; Hume, D. B.; Itano, W. M.; Jost, J. D.; Koelemeij, J. C. J.; Langer, C.; Ozeri, R.; Reichle, R.; Rosenband, T.; Schaetz, T.; Schmidt, P. O.; Seidelin, S.; Shiga, N.

    2006-11-07

    The basic requirements for quantum computing and quantum simulation (single- and multi-qubit gates, long memory times, etc.) have been demonstrated in separate experiments on trapped ions. Construction of a large-scale information processor will require synthesis of these elements and implementation of high-fidelity operations on a very large number of qubits. This is still well in the future. NIST and other groups are addressing part of the scaling issue by trying to fabricate multi-zone arrays of traps that would allow highly-parallel and scalable processing. In the near term, some simple quantum processing protocols are being used to aid in quantum metrology, such as in atomic clocks. As the number of qubits increases, Schroedinger's cat paradox and the measurement problem in quantum mechanics become more apparent; with luck, trapped ion systems might be able to shed light on these fundamental issues.

  7. Four dimensional observations of clouds from geosynchronous orbit using stereo display and measurement techniques on an interactive information processing system

    NASA Technical Reports Server (NTRS)

    Hasler, A. F.; Desjardins, M.; Shenk, W. E.

    1979-01-01

    Simultaneous Geosynchronous Operational Environmental Satellite (GOES) 1 km resolution visible image pairs can provide quantitative three dimensional measurements of clouds. These data have great potential for severe storms research and as a basic parameter measurement source for other areas of meteorology (e.g. climate). These stereo cloud height measurements are not subject to the errors and ambiguities caused by unknown cloud emissivity and temperature profiles that are associated with infrared techniques. This effort describes the display and measurement of stereo data using digital processing techniques.

  8. [Automatic information processing, the frontal system and blunted affect. From clinical dimensions to cognitive processes toward a psychobiological explanation of temperament].

    PubMed

    Partiot, A; Pierson, A; Renault, B; Widlöcher, D; Jouvent, R

    1994-01-01

    Several theorists have drawn a distinction between automatic and attentional or controlled processing. Hasher and Zacks (1979) were the first to argue that effortful processes are reduced under conditions of stress, including depression. They suggested that, in these conditions, no such deficit occurs in automatic processing. Weingartner and co-workers then reported experiments which seemed to support such an interpretation of the cognitive dysfunction in depressed patients. However, some recent data do not fit this widely accepted theoretical framework. The purpose of our article is to try to clarify this issue from both a theoretical and a methodological point of view. First, we critically review the most recent results in three fields of experimentation related to the "automatic versus controlled" topic: 1) The classical neuropsychology of memory, which manipulates the level of effort required to perform the tasks. Confusion arises when theories at the process level are tested with reference to data collected at the task level. The transparency assumption could be false: impairment in an effort-demanding task could be due to a defect in automatic processes, and it is possible to hypothesize that the more automatic processes are deficient, the more controlled processes are saturated and the effort-demanding task impaired. The emergence of controlled processes could depend on the level of automaticity, and the regulation of automatic processes could be determinant for the ability of the subject to make associations. 2) The recent studies on implicit memory in depression.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:7828514

  9. Advanced information processing system: The Army fault tolerant architecture conceptual study. Volume 1: Army fault tolerant architecture overview

    NASA Technical Reports Server (NTRS)

    Harper, R. E.; Alger, L. S.; Babikyan, C. A.; Butler, B. P.; Friend, S. A.; Ganska, R. J.; Lala, J. H.; Masotto, T. K.; Meyer, A. J.; Morton, D. P.

    1992-01-01

    Digital computing systems needed for Army programs such as the Computer-Aided Low Altitude Helicopter Flight Program and the Armored Systems Modernization (ASM) vehicles may be characterized by high computational throughput and input/output bandwidth, hard real-time response, high reliability and availability, and maintainability, testability, and producibility requirements. In addition, such a system should be affordable to produce, procure, maintain, and upgrade. To address these needs, the Army Fault Tolerant Architecture (AFTA) is being designed and constructed under a three-year program comprised of a conceptual study, detailed design and fabrication, and demonstration and validation phases. Described here are the results of the conceptual study phase of the AFTA development. Given here is an introduction to the AFTA program, its objectives, and key elements of its technical approach. A format is designed for representing mission requirements in a manner suitable for first order AFTA sizing and analysis, followed by a discussion of the current state of mission requirements acquisition for the targeted Army missions. An overview is given of AFTA's architectural theory of operation.

  10. Network command processing system overview

    NASA Technical Reports Server (NTRS)

    Nam, Yon-Woo; Murphy, Lisa D.

    1993-01-01

    The Network Command Processing System (NCPS) developed for the National Aeronautics and Space Administration (NASA) Ground Network (GN) stations is a spacecraft command system utilizing a MULTIBUS I/68030 microprocessor. This system was developed and implemented at ground stations worldwide to provide a Project Operations Control Center (POCC) with command capability for support of spacecraft operations such as the LANDSAT, Shuttle, Tracking and Data Relay Satellite, and Nimbus-7. The NCPS consolidates multiple modulation schemes for supporting various manned/unmanned orbital platforms. The NCPS interacts with the POCC and a local operator to process configuration requests, generate modulated uplink sequences, and inform users of the ground command link status. This paper presents the system functional description, hardware description, and the software design.

  11. Analytic Hierarchy Process for Personalising Environmental Information

    ERIC Educational Resources Information Center

    Kabassi, Katerina

    2014-01-01

    This paper presents how a Geographical Information System (GIS) can be incorporated into an intelligent learning software system for environmental matters. The system is called ALGIS and incorporates the GIS in order to present information about the physical and anthropogenic environment of Greece effectively and in a more interactive way. The system…

  12. Environmental information system for visualizing environmental impact assessment information.

    PubMed

    Cserny, Angelika; Kovács, Zsófia; Domokos, Endre; Rédey, Akos

    2009-01-01

    The Institute of Environmental Engineering at the University of Pannonia has undertaken the challenge to develop an online environmental information system. This system is able to receive and process the collected environmental data via the Internet. The authors have attached importance to the presentation of the data and have included other information comprehensible to laymen as well, in order to work out visualisation techniques that are expressive and draw attention to environmental questions through the developed information system. The ways of visualizing physical and chemical parameters of surface water and the effects of motorway construction were examined.

  13. System status display information

    NASA Technical Reports Server (NTRS)

    Summers, L. G.; Erickson, J. B.

    1984-01-01

    The System Status Display is an electronic display system which provides the flight crew with enhanced capabilities for monitoring and managing aircraft systems. Guidelines for the design of the electronic system displays were established. The technical approach involved the application of a system engineering approach to the design of candidate displays and the evaluation of alternative concepts by part-task simulation. The system engineering and selection of candidate displays are covered.

  14. Is Visual Information Processing Related to Reading?

    ERIC Educational Resources Information Center

    Palmer, John C.; And Others

    A large stratified sample of university undergraduate students differing in reading ability performed a diverse set of psychometric and information processing tasks in a study exploring the role of visual information processing skill as a component in reading ability. Using a correlation analysis of individual differences, the interrelationships…

  15. A Clinical Information Display System

    PubMed Central

    Blum, Bruce J.; Lenhard, Raymond E.; Braine, Hayden; Kammer, Anne

    1977-01-01

    A clinical information display system has been implemented as part of a prototype Oncology Clinical Information System for the Johns Hopkins Oncology Center. The information system has been developed to support the management of patient therapy. Capabilities in the prototype include a patient data system, a patient abstract, a tumor registry, an appointment system, a census system, and a clinical information display system. This paper describes the clinical information display component of the prototype. It has the capability of supporting up to 10,000 patient records with online data entry and editing. At the present time, the system is being used only in the Oncology Center. There are plans, however, for trial use by other departments, and the system represents a tool with a potential for more general application.

  16. DESIGN INFORMATION ON FINE PORE AERATION SYSTEMS

    EPA Science Inventory

    Field studies were conducted over several years at municipal wastewater treatment plants employing fine pore diffused aeration systems. These studies were designed to produce reliable information on the performance and operational requirements of fine pore devices under process ...

  17. Advanced information processing system: The Army fault tolerant architecture conceptual study. Volume 2: Army fault tolerant architecture design and analysis

    NASA Technical Reports Server (NTRS)

    Harper, R. E.; Alger, L. S.; Babikyan, C. A.; Butler, B. P.; Friend, S. A.; Ganska, R. J.; Lala, J. H.; Masotto, T. K.; Meyer, A. J.; Morton, D. P.

    1992-01-01

    Described here is the Army Fault Tolerant Architecture (AFTA) hardware architecture and components and the operating system. The architectural and operational theory of the AFTA Fault Tolerant Data Bus is discussed. The test and maintenance strategy developed for use in fielded AFTA installations is presented. An approach to be used in reducing the probability of AFTA failure due to common mode faults is described. Analytical models for AFTA performance, reliability, availability, life cycle cost, weight, power, and volume are developed. An approach is presented for using VHSIC Hardware Description Language (VHDL) to describe and design AFTA's developmental hardware. A plan is described for verifying and validating key AFTA concepts during the Dem/Val phase. Analytical models and partial mission requirements are used to generate AFTA configurations for the TF/TA/NOE and Ground Vehicle missions.

  18. Copying and the Information System

    ERIC Educational Resources Information Center

    Kenyon, Richard L.

    1975-01-01

    Calls on the users and producers and publishers of scientific information to aid in the design of practical systems for information dissemination that will encompass not only copyright law but also computer file access. (GS)

  19. Structured Information Management Using New Techniques for Processing Text.

    ERIC Educational Resources Information Center

    Gibb, Forbes; Smart, Godfrey

    1990-01-01

    Describes the development of a software system, SIMPR (Structured Information Management: Processing and Retrieval), that will process documents by indexing them and classifying their subjects. Topics discussed include information storage and retrieval, file inversion techniques, modelling the user, natural language searching, automatic indexing,…

  20. Holledge gauge failure testing using concurrent information processing algorithm

    SciTech Connect

    Weeks, G.E.; Daniel, W.E.; Edwards, R.E.; Jannarone, R.J.; Joshi, S.N.; Palakodety, S.S.; Qian, D.

    1996-04-11

    For several decades, computerized information processing systems and human information processing models have developed with a good deal of mutual influence. Any comprehensive psychology text in this decade uses terms that originated in the computer industry, such as "cache" and "memory", to describe human information processing. Likewise, many engineers today are using "artificial intelligence" and "artificial neural network" computing tools that originated as models of human thought to solve industrial problems. This paper concerns a recently developed human information processing model, called "concurrent information processing" (CIP), and a related set of computing tools for solving industrial problems. The problem of focus is adaptive gauge monitoring; the application is pneumatic pressure repeaters (Holledge gauges) used to measure liquid level and density in the Defense Waste Processing Facility and the Integrated DWPF Melter System.

  1. Estimation of potential loss of two pesticides in runoff in Fillmore County, Minnesota using a field-scale process-based model and a geographic information system

    USGS Publications Warehouse

    Capel, P.D.; Zhang, H.

    2000-01-01

    In assessing the occurrence, behavior, and effects of agricultural chemicals in surface water, the scales of study (i.e., watershed, county, state, and regional areas) are usually much larger than the scale of agricultural fields, where much of the understanding of processes has been developed. Field-scale areas are characterized by relatively homogeneous conditions. The combination of process-based simulation models and geographic information system technology can be used to help extend our understanding of field processes to water-quality concerns at larger scales. To demonstrate this, the model "Groundwater Loading Effects of Agricultural Management Systems" was used to estimate the potential loss of two pesticides (atrazine and permethrin) in runoff to surface water in Fillmore County in southeastern Minnesota. The county was divided into field-scale areas on the basis of a 100 m by 100 m grid, and the influences of soil type and surface topography on the potential losses of the two pesticides in runoff were evaluated for each individual grid cell. The results could be used as guidance for agricultural management and regulatory decisions, for planning environmental monitoring programs, and as an educational tool for the public.
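
    A minimal sketch of the per-cell evaluation pattern described above, assuming hypothetical soil and slope attributes for each 100 m grid cell and a stand-in scoring function; the study itself used the GLEAMS model, which is far more detailed.

    ```python
    # Hypothetical per-cell evaluation over a 100 m x 100 m grid.
    # The scoring function below is a toy stand-in, not the GLEAMS model.

    grid = [
        {"cell_id": 1, "runoff_curve_number": 78, "slope_pct": 2.0},
        {"cell_id": 2, "runoff_curve_number": 85, "slope_pct": 6.5},
    ]

    def relative_runoff_potential(cell):
        # Toy index: a higher curve number and steeper slope imply more runoff.
        return (cell["runoff_curve_number"] / 100.0) * (1.0 + cell["slope_pct"] / 10.0)

    for cell in grid:
        print(cell["cell_id"], round(relative_runoff_potential(cell), 3))
    ```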

  2. Conditional deletion of ERK5 MAP kinase in the nervous system impairs pheromone information processing and pheromone-evoked behaviors.

    PubMed

    Zou, Junhui; Storm, Daniel R; Xia, Zhengui

    2013-01-01

    ERK5 MAP kinase is highly expressed in the developing nervous system but absent in most regions of the adult brain. It has been implicated in regulating the development of the main olfactory bulb and in odor discrimination. However, whether it plays an essential role in pheromone-based behavior has not been established. Here we report that conditional deletion of the Mapk7 gene which encodes ERK5 in mice in neural stem cells impairs several pheromone-mediated behaviors including aggression and mating in male mice. These deficits were not caused by a reduction in the level of testosterone, by physical immobility, by heightened fear or anxiety, or by depression. Using mouse urine as a natural pheromone-containing solution, we provide evidence that the behavior impairment was associated with defects in the detection of closely related pheromones as well as with changes in their innate preference for pheromones related to sexual and reproductive activities. We conclude that expression of ERK5 during development is critical for pheromone response and associated animal behavior in adult mice.

  3. Cockpit weather information system

    NASA Technical Reports Server (NTRS)

    Tu, Jeffrey Chen-Yu (Inventor)

    2000-01-01

    Weather information, periodically collected from throughout a global region, is periodically assimilated and compiled at a central source and sent via a high speed data link to a satellite communication service, such as COMSAT. That communication service converts the compiled weather information to GSDB format, and transmits the GSDB encoded information to an orbiting broadcast satellite, INMARSAT, transmitting the information at a data rate of no less than 10.5 kilobits per second. The INMARSAT satellite receives that data over its P-channel and rebroadcasts the GSDB encoded weather information, in the microwave L-band, throughout the global region at a rate of no less than 10.5 kilobits per second. The transmission is received aboard an aircraft by means of an onboard SATCOM receiver and the output is furnished to a weather information processor. A touch sensitive liquid crystal panel display allows the pilot to select the weather function by touching a predefined icon overlain on the display's surface and in response a color graphic display of the weather is displayed for the pilot.

  4. Career Information Delivery Systems Inventory.

    ERIC Educational Resources Information Center

    Olson, Gerald T.; Whitman, Patricia D.

    This inventory highlights similarities and differences between 19 computerized career information delivery systems (CIDS) so practitioners may make more informed choices concerning the adoption of such systems, and policymakers may monitor the developing scope of system features and costs. It was developed through a survey of computer products…

  5. Digital TV processing system

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Two digital video data compression systems directly applicable to the Space Shuttle TV Communication System were described: (1) For the uplink, a low rate monochrome data compressor is used. The compression is achieved by using a motion detection technique in the Hadamard domain. To transform the variable source rate into a fixed rate, an adaptive rate buffer is provided. (2) For the downlink, a color data compressor is considered. The compression is achieved first by intra-color transformation of the original signal vector into a vector which has lower information entropy. Then two-dimensional data compression techniques are applied to the Hadamard transformed components of this last vector. Mathematical models and data reliability analyses were also provided for the above video data compression techniques when transmitted, with channel coding, over a Gaussian channel. It was shown that substantial gains can be achieved by the combination of video source and channel coding.
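
    For reference, a minimal fast Walsh-Hadamard transform in Python. The compressors in the record work in the Hadamard domain; the motion-detection and adaptive rate-buffer logic are not shown here, and the unnormalized transform below is only a generic illustration.

    ```python
    def fwht(signal):
        """In-place fast Walsh-Hadamard transform; length must be a power of two."""
        data = list(signal)
        h = 1
        while h < len(data):
            for i in range(0, len(data), h * 2):
                for j in range(i, i + h):
                    x, y = data[j], data[j + h]
                    data[j], data[j + h] = x + y, x - y
            h *= 2
        return data

    # Example: transform an 8-sample scan-line segment.
    print(fwht([4, 2, 2, 0, 0, 2, 2, 4]))
    ```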

  6. The ISO/IEC 9126-1 as a Supporting Means for the System Development Process of a Patient Information Web Service.

    PubMed

    Hörbst, Alexander; Fink, Kerstin; Goebel, Georg

    2005-01-01

    The development of patient information systems faces the major problems of increasing and more complex content as well as the introduction of new techniques of system implementation. An integrated development calls for a method to deal with both aspects. The ISO/IEC 9126-1 offers a framework where both views can be integrated into a general view of the system and can be used as a basis for further development. This article introduces the ISO/IEC 9126-1 as a supporting means for the development of patient information systems, considering the example of a web service for a patient information system.

  7. Global Land Information System (GLIS)

    USGS Publications Warehouse

    ,

    1992-01-01

    The Global Land Information System (GLIS) is an interactive computer system developed by the U.S. Geological Survey (USGS) for scientists seeking sources of information about the Earth's land surfaces. GLIS contains "metadata," that is, descriptive information about data sets. Through GLIS, scientists can evaluate data sets, determine their availability, and place online requests for products. GLIS is more, however, than a mere list of products. It offers online samples of earth science data that may be ordered through the system.

  8. Multitasking Information Seeking and Searching Processes.

    ERIC Educational Resources Information Center

    Spink, Amanda; Ozmutlu, H. Cenk; Ozmutlu, Seda

    2002-01-01

    Presents findings from four studies of the prevalence of multitasking information seeking and searching by Web (via the Excite search engine), information retrieval system (mediated online database searching), and academic library users. Highlights include human information coordinating behavior (HICB); and implications for models of information…

  9. Centralized versus Decentralized Information Systems

    NASA Astrophysics Data System (ADS)

    Hugoson, Mats-Åke

    This paper brings into question whether information systems should be centralized or decentralized in order to provide greater support for different business processes. During the last century companies and organizations have used different approaches for centralization and decentralization; a simple answer to the question does not exist. This paper provides a survey of the evolution of centralized and decentralized approaches, mainly in a Nordic perspective. Based on critical reflections on the situation at the end of the century, we can discuss what we can learn from history to achieve alignment between centralized and decentralized systems and the business structure. The conclusion is that theories, management and practice for decisions on centralization or decentralization of information systems must be improved. Conscious management and control of centralization/decentralization of IT support is a vital question for the company or the organization, and this is not a task that can be handled only by IT specialists. There is a need for business-oriented IT management of centralization/decentralization.

  10. Mapping individual logical processes in information searching

    NASA Technical Reports Server (NTRS)

    Smetana, F. O.

    1974-01-01

    An interactive dialog with a computerized information collection was recorded and plotted in the form of a flow chart. The process permits one to identify the logical processes employed in considerable detail and is therefore suggested as a tool for measuring individual thought processes in a variety of situations. A sample of an actual test case is given.

  11. Medical Information Management System (MIMS): A generalized interactive information system

    NASA Technical Reports Server (NTRS)

    Alterescu, S.; Friedman, C. A.; Hipkins, K. R.

    1975-01-01

    An interactive information system is described. It is a general purpose, free format system which offers immediate assistance where manipulation of large data bases is required. The medical area is a prime area of application. Examples of the system's operation, commentary on the examples, and a complete listing of the system program are included.

  12. Medical Information Management System (MIMS): An automated hospital information system

    NASA Technical Reports Server (NTRS)

    Alterescu, S.; Simmons, P. B.; Schwartz, R. A.

    1971-01-01

    An automated hospital information system that handles all data related to patient-care activities is described. The description is designed to serve as a manual for potential users, nontechnical medical personnel who may use the system. Examples of the system's operation, commentary on the examples, and a complete listing of the system program are included.

  13. Condition Assessment Information System

    2002-09-16

    CAIS2000 records, tracks, and costs maintenance deficiencies associated with condition assessments of real property assets. Cost information is available for 39,000 items in the current RS Means Facilities Construction Manual. These costs can, in turn, be rolled up by asset to produce the summary condition of an asset or site.

  14. Toward intelligent information system

    NASA Astrophysics Data System (ADS)

    Komatsu, Sanzo

    NASA/RECON, the predecessor of the DIALOG System, was originally designed as a user-friendly system for astronauts, so that they would not mis-operate the machine in spite of the tension of working in outer space. Since then, DIALOG has endeavoured to develop a series of user-friendly systems, such as Knowledge Index and an inbound gateway, as well as Version II. In this so-called end-user searching era, DIALOG has released a series of front-end systems in succession: DIALOG Business Connection, DIALOG Medical Connection, and OneSearch in 1986, early 1987, and late 1987 respectively. They are all called expert systems. In this paper, the features of each system are described in some detail and the remaining critical issues are also discussed.

  15. Clementine Sensor Processing System

    NASA Technical Reports Server (NTRS)

    Feldstein, A. A.

    1993-01-01

    The design of the DSPSE Satellite Controller (DSC) is baselined as a single-string satellite controller. The DSC performs two main functions: health and maintenance of the spacecraft; and image capture, storage, and playback. The DSC contains two processors: a radiation-hardened Mil-Std-1750, and a commercial R3000. The Mil-Std-1750 processor performs all housekeeping operations, while the R3000 is mainly used to perform the image processing functions associated with the navigation functions, as well as performing various experiments. The DSC also contains a data handling unit (DHU) used to interface to various spacecraft imaging sensors and to capture, compress, and store selected images onto the solid-state data recorder. The development of the DSC evolved from several key requirements; the DSPSE satellite was to do the following: (1) have a radiation-hardened spacecraft control system and be immune to single-event upsets (SEU's); (2) use an R3000-based processor to run the star tracker software that was developed by SDIO (due to schedule and cost constraints, there was no time to port the software to a radiation-hardened processor); and (3) fly a commercial processor to verify its suitability for use in a space environment. In order to enhance the DSC reliability, the system was designed with multiple processing paths. These multiple processing paths provide for greater tolerance to various component failures. The DSC was designed so that all housekeeping processing functions are performed by either the Mil-Std-1750 processor or the R3000 processor. The image capture and storage is performed either by the DHU or the R3000 processor.

  16. Ontology-driven health information systems architectures.

    PubMed

    Blobel, Bernd; Oemig, Frank

    2009-01-01

    Following an architecture vision such as the Generic Component Model (GCM) architecture framework, health information systems for supporting personalized care have to be based on a component-oriented architecture. Representing concepts and their interrelations, the GCM perspectives (system architecture, domains, and development process) can be described by the domains' ontologies. The paper introduces ontology principles, ontology references to the GCM, as well as some practical aspects of ontology-driven approaches to semantically interoperable and sustainable health information systems.

  17. Information System for Educational Policy and Administration.

    ERIC Educational Resources Information Center

    Clayton, J. C., Jr.

    Educational Information System (EIS) is a proposed computer-based data processing system to help schools solve current educational problems more efficiently. The system would allow for more effective administrative operations in student scheduling, financial accounting, and long range planning. It would also assist school trustees and others in…

  18. Multi-Sensor Distributive On-Line Processing, Visualization, and Analysis Infrastructure for an Agricultural Information System at the NASA Goddard Earth Sciences DAAC

    NASA Technical Reports Server (NTRS)

    Teng, William; Berrick, Steve; Leptuokh, Gregory; Liu, Zhong; Rui, Hualan; Pham, Long; Shen, Suhung; Zhu, Tong

    2004-01-01

    The Goddard Space Flight Center Earth Sciences Data and Information Services Center (GES DISC) Distributed Active Archive Center (DAAC) is developing an Agricultural Information System (AIS), evolved from an existing TRMM On-line Visualization and Analysis System for precipitation and other satellite data products and services. AIS outputs will be integrated into existing operational decision support systems for global crop monitoring, such as that of the U.N. World Food Program. The ability to use the raw data stored in the GES DAAC archives is highly dependent on having a detailed understanding of the data's internal structure and physical implementation. Gaining this understanding is a time-consuming process and not a productive investment of the user's time. This is an especially difficult challenge when users need to deal with multi-sensor data that usually are of different structures and resolutions. The AIS has taken a major step towards meeting this challenge by incorporating an underlying infrastructure, called the GES-DISC Interactive Online Visualization and Analysis Infrastructure or "Giovanni," that integrates various components to support web interfaces that allow users to perform interactive analysis online without downloading any data. Several instances of the Giovanni-based interface have been or are being created to serve users of TRMM precipitation, MODIS aerosol, and SeaWiFS ocean color data, as well as agricultural applications users. Giovanni-based interfaces are simple to use but powerful. The user selects geophysical parameters, the area of interest, and the time period, and the system generates an output on screen in a matter of seconds.

  19. A corporate strategy for the control of information processing.

    PubMed

    Lucas, H C; Turner, J A

    1982-01-01

    Although the use of information processing has become widespread, many organizations have developed systems that are basically independent of the firm's strategy. However, the authors in this article argue that the greatest benefits come when information technology is merged with strategy formulation. The article includes examples of how this has been done and presents a framework for top management direction and control of information processing.

  20. Project Records Information System (PRIS)

    SciTech Connect

    Smith, P.S.; Schwarz, R.K.

    1990-11-01

    The Project Records Information System (PRIS) is an interactive system developed for the Information Services Division (ISD) of Martin Marietta Energy Systems, Inc., to perform indexing, maintenance, and retrieval of information about Engineering project record documents for which they are responsible. This PRIS User's Manual provides instruction on the use of this system. This manual presents an overview of PRIS, describing the system's purpose; the data that it handles; functions it performs; hardware, software, and access; and help and error functions. This manual describes the interactive menu-driven operation of PRIS. Appendixes A, B, C, and D contain the data dictionary, help screens, report descriptions, and a primary menu structure diagram, respectively.

  1. Sandia Explosive Inventory and Information System

    SciTech Connect

    Clements, D.A.

    1994-08-01

    The Explosive Inventory and Information System (EIS) is being developed and implemented by Sandia National Laboratories (SNL) to incorporate a cradle to grave structure for all explosives and explosive containing devices and assemblies at SNL from acquisition through use, storage, reapplication, transfer or disposal. The system does more than track all material inventories. It provides information on material composition, characteristics, shipping requirements; life cycle cost information, plan of use; and duration of ownership. The system also provides for following the processes of explosive development; storage review; justification for retention; Resource, Recovery and Disposition Account (RRDA); disassembly and assembly; and job description, hazard analysis and training requirements for all locations and employees involved with explosive operations. In addition, other information systems will be provided through the system such as the Department of Energy (DOE) and SNL Explosive Safety manuals, the Navy's Department of Defense (DoD) Explosive information system, and the Lawrence Livermore National Laboratory (LLNL) Handbook of Explosives.

  2. Science information systems: Visualization

    NASA Technical Reports Server (NTRS)

    Wall, Ray J.

    1991-01-01

    Future programs in earth science, planetary science, and astrophysics will involve complex instruments that produce data at unprecedented rates and volumes. Current methods for data display, exploration, and discovery are inadequate. Visualization technology offers a means for the user to comprehend, explore, and examine complex data sets. The goal of this program is to increase the effectiveness and efficiency of scientists in extracting scientific information from large volumes of instrument data.

  3. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2011-12-01

    Design knowledge of modern mechatronics products puts information processing at the center of knowledge-intensive engineering; product design innovation is therefore essentially innovation in knowledge and information processing. Based on an analysis of the role of mechatronics product design knowledge and its information management features, a unified XML-based product information processing model is proposed. The information processing model of product design includes functional knowledge, structural knowledge and their relationships. XML-based expressions of product function elements, product structure elements, and the mapping relationship between function and structure are proposed. The information processing of a parallel friction roller is given as an example, which demonstrates that this method is helpful for knowledge-based design systems and product innovation.
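
    A minimal sketch of how function elements, structure elements, and a function-to-structure mapping might be expressed in XML and read with Python's standard library; the element and attribute names are illustrative assumptions, not the paper's schema.

    ```python
    import xml.etree.ElementTree as ET

    # Illustrative product model; element and attribute names are assumptions.
    doc = """
    <product name="parallel_friction_roller">
      <functions>
        <function id="F1" description="transmit torque"/>
      </functions>
      <structures>
        <structure id="S1" description="roller pair"/>
      </structures>
      <mappings>
        <map function="F1" structure="S1"/>
      </mappings>
    </product>
    """

    root = ET.fromstring(doc)
    for m in root.find("mappings"):
        print(m.get("function"), "->", m.get("structure"))
    ```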

  4. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2012-01-01

    Design knowledge of modern mechatronics products puts information processing at the center of knowledge-intensive engineering; product design innovation is therefore essentially innovation in knowledge and information processing. Based on an analysis of the role of mechatronics product design knowledge and its information management features, a unified XML-based product information processing model is proposed. The information processing model of product design includes functional knowledge, structural knowledge and their relationships. XML-based expressions of product function elements, product structure elements, and the mapping relationship between function and structure are proposed. The information processing of a parallel friction roller is given as an example, which demonstrates that this method is helpful for knowledge-based design systems and product innovation.

  5. Information processing by pigeons (Columba livia): incentive as information.

    PubMed

    Shimp, Charles P; Froehlich, Alyson L; Herbranson, Walter T

    2007-02-01

    Experiment 1 showed that the Hick-Hyman law (W. E. Hick, 1952; R. Hyman, 1953) described the effects of anticipated reinforcement, a form of incentive, on pigeons' (Columba livia) reaction time to respond to a target spatial location. Reaction time was an approximately linear function of amount of information interpreted as probability of reinforcement, implying that pigeons processed incentive at a constant rate. Experiment 2 showed that the Hick-Hyman law described effects of incentive even when it varied from moment to moment in a serial reaction time task similar to that of M. J. Nissen and P. Bullemer (1987), and processing information about target spatial location modulated absolute reaction time and not rate of processing incentive. The results support mental continuity and provide comparative support for the idea of the economics of information in economic theory about the incentive value of information.
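
    For reference, the Hick-Hyman law relates mean reaction time linearly to the information (Shannon entropy) of the stimulus set, RT = a + b*H. A small illustration follows; the intercept and slope values are hypothetical placeholders, not fitted values from this study.

    ```python
    import math

    def entropy_bits(probabilities):
        """Shannon entropy H = sum p * log2(1/p), in bits."""
        return sum(p * math.log2(1.0 / p) for p in probabilities if p > 0)

    def hick_hyman_rt(probabilities, intercept_ms=200.0, slope_ms_per_bit=150.0):
        # RT = a + b * H; the coefficients here are placeholders only.
        return intercept_ms + slope_ms_per_bit * entropy_bits(probabilities)

    # Two equally likely reinforcement outcomes carry 1 bit of information.
    print(hick_hyman_rt([0.5, 0.5]))  # 350.0 ms with the placeholder coefficients
    ```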

  6. System for Information Discovery

    1996-10-10

    SID characterizes natural language based documents so that they may be related and retrieved based on content similarity. This technology processes textual documents, autonomously identifies the major topics of the document set, and constructs an interpretable, high dimensional representation of each document. SID also provides the ability to interactively re-weight representations based on user needs, so that users may analyze the data set from multiple points of view. The particular advantages SID offers are speed, data compression, flexibility in representation, and incremental processing.
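
    A minimal sketch of the general idea of building a weighted term representation per document and letting a user re-weight it; this is generic bag-of-words code written under assumed behavior, not the SID implementation.

    ```python
    from collections import Counter

    def represent(document):
        """Simple term-frequency representation of one document."""
        return Counter(document.lower().split())

    def reweight(vector, user_weights):
        """Scale term weights by user-supplied emphasis factors (assumed behavior)."""
        return {term: count * user_weights.get(term, 1.0) for term, count in vector.items()}

    doc_vector = represent("Pump vibration report: vibration exceeded limits")
    print(reweight(doc_vector, {"vibration": 2.0}))
    ```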

  7. Information technology equipment cooling system

    DOEpatents

    Schultz, Mark D.

    2014-06-10

    According to one embodiment, a system for removing heat from a rack of information technology equipment may include a sidecar indoor air to liquid heat exchanger that cools warm air generated by the rack of information technology equipment. The system may also include a liquid to liquid heat exchanger and an outdoor heat exchanger. The system may further include configurable pathways to connect and control fluid flow through the sidecar heat exchanger, the liquid to liquid heat exchanger, the rack of information technology equipment, and the outdoor heat exchanger based upon ambient temperature and/or ambient humidity to remove heat from the rack of information technology equipment.
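
    A minimal sketch of the kind of pathway-selection logic the embodiment describes, choosing how to route coolant based on ambient conditions; the thresholds and mode names are hypothetical, not taken from the patent.

    ```python
    def select_cooling_path(ambient_temp_c, ambient_rh_pct):
        """Choose a coolant routing mode; thresholds are hypothetical placeholders."""
        if ambient_temp_c < 15 and ambient_rh_pct < 80:
            # Cool, dry air outside: reject rack heat directly via the outdoor exchanger.
            return "sidecar -> outdoor heat exchanger"
        # Otherwise isolate the indoor loop behind the liquid-to-liquid exchanger.
        return "sidecar -> liquid-to-liquid -> outdoor heat exchanger"

    print(select_cooling_path(10, 60))
    print(select_cooling_path(30, 70))
    ```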

  8. Implementing Student Information Systems

    ERIC Educational Resources Information Center

    Sullivan, Laurie; Porter, Rebecca

    2006-01-01

    Implementing an enterprise resource planning system is a complex undertaking. Careful planning, management, communication, and staffing can make the difference between a successful and unsuccessful implementation. (Contains 3 tables.)

  9. Forest resource information system

    NASA Technical Reports Server (NTRS)

    Mroczynski, R. P. (Principal Investigator)

    1978-01-01

    The author has identified the following significant results. A benchmark classification evaluation framework was implemented. The FRIS preprocessing activities were refined. Potential geo-based referencing systems were identified as components of FRIS.

  10. Information Systems for Federated Biobanks

    NASA Astrophysics Data System (ADS)

    Eder, Johann; Dabringer, Claus; Schicho, Michaela; Stark, Konrad

    Biobanks store and manage collections of biological material (tissue, blood, cell cultures, etc.) and manage the medical and biological data associated with this material. Biobanks are invaluable resources for medical research. The diversity, heterogeneity and volatility of the domain make information systems for biobanks a challenging application domain. Information systems for biobanks are foremost integration projects over heterogeneous, fast-evolving sources.

  11. Distributing Executive Information Systems through Networks.

    ERIC Educational Resources Information Center

    Penrod, James I.; And Others

    1993-01-01

    Many colleges and universities will soon adopt distributed systems for executive information and decision support. Distribution of shared information through computer networks will improve decision-making processes dramatically on campuses. Critical success factors include administrative support, favorable organizational climate, ease of use,…

  12. Information Processing Theory and Conceptual Development.

    ERIC Educational Resources Information Center

    Schroder, H. M.

    An educational program based upon information processing theory has been developed at Southern Illinois University. The integrating theme was the development of conceptual ability for coping with social and personal problems. It utilized student information search and concept formation as foundations for discussion and judgment and was organized…

  13. Medical-Information-Management System

    NASA Technical Reports Server (NTRS)

    Alterescu, Sidney; Friedman, Carl A.; Frankowski, James W.

    1989-01-01

    Medical Information Management System (MIMS) computer program interactive, general-purpose software system for storage and retrieval of information. Offers immediate assistance where manipulation of large data bases required. User quickly and efficiently extracts, displays, and analyzes data. Used in management of medical data and handling all aspects of data related to care of patients. Other applications include management of data on occupational safety in public and private sectors, handling judicial information, systemizing purchasing and procurement systems, and analyses of cost structures of organizations. Written in Microsoft FORTRAN 77.

  14. Developing Information Systems for Competitive Intelligence Support.

    ERIC Educational Resources Information Center

    Hohhof, Bonnie

    1994-01-01

    Discusses issues connected with developing information systems for competitive intelligence support; defines the elements of an effective competitive information system; and summarizes issues affecting system design and implementation. Highlights include intelligence information; information needs; information sources; decision making; and…

  15. On Roles of Models in Information Systems

    NASA Astrophysics Data System (ADS)

    Sølvberg, Arne

    The increasing penetration of computers into all aspects of human activity makes it desirable that the interplay among software, data and the domains where computers are applied is made more transparent. An approach to this end is to explicitly relate the modeling concepts of the domains, e.g., natural science, technology and business, to the modeling concepts of software and data. This may make it simpler to build comprehensible integrated models of the interactions between computers and non-computers, e.g., interaction among computers, people, physical processes, biological processes, and administrative processes. This chapter contains an analysis of various facets of the modeling environment for information systems engineering. The lack of satisfactory conceptual modeling tools seems to be central to the unsatisfactory state-of-the-art in establishing information systems. The chapter contains a proposal for defining a concept of information that is relevant to information systems engineering.

  16. BIO-Plex Information System Concept

    NASA Technical Reports Server (NTRS)

    Jones, Harry; Boulanger, Richard; Arnold, James O. (Technical Monitor)

    1999-01-01

    This paper describes a suggested design for an integrated information system for the proposed BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) at Johnson Space Center (JSC), including distributed control systems, central control, networks, database servers, personal computers and workstations, applications software, and external communications. The system will have an open commercial computing and networking architecture. The network will provide automatic real-time transfer of information to database server computers which perform data collection and validation. This information system will support integrated, data-sharing applications for everything from system alarms to management summaries. Most existing complex process control systems have information gaps between the different real-time subsystems, between these subsystems and the central controller, between the central controller and system-level planning and analysis application software, and between the system-level applications and management overview reporting. An integrated information system is vitally necessary as the basis for the integration of planning, scheduling, modeling, monitoring, and control, which will allow improved monitoring and control based on timely, accurate and complete data. Data describing the system configuration and the real-time processes can be collected, checked and reconciled, analyzed and stored in database servers that can be accessed by all applications. The required technology is available. The only opportunity to design a distributed, nonredundant, integrated system is before it is built. Retrofit is extremely difficult and costly.

  17. Integrated risk information system (IRIS)

    SciTech Connect

    Tuxen, L.

    1990-12-31

    The Integrated Risk Information System (IRIS) is an electronic information system developed by the US Environmental Protection Agency (EPA) containing information related to health risk assessment. IRIS is the Agency's primary vehicle for communication of chronic health hazard information that represents Agency consensus following comprehensive review by intra-Agency work groups. The original purpose for developing IRIS was to provide guidance to EPA personnel in making risk management decisions. This role has expanded and evolved with wider access and use of the system. IRIS contains chemical-specific information in summary format for approximately 500 chemicals. IRIS is available to the general public on the National Library of Medicine's Toxicology Data Network (TOXNET) and on diskettes through the National Technical Information Service (NTIS).

  18. Mars Aqueous Processing System

    NASA Technical Reports Server (NTRS)

    Berggren, Mark; Wilson, Cherie; Carrera, Stacy; Rose, Heather; Muscatello, Anthony; Kilgore, James; Zubrin, Robert

    2012-01-01

    The goal of the Mars Aqueous Processing System (MAPS) is to establish a flexible process that generates multiple products that are useful for human habitation. Selectively extracting useful components into an aqueous solution, and then sequentially recovering individual constituents, can obtain a suite of refined or semi-refined products. Similarities in the bulk composition (although not necessarily of the mineralogy) of Martian and Lunar soils potentially make MAPS widely applicable. Similar process steps can be conducted on both Mars and Lunar soils while tailoring the reaction extents and recoveries to the specifics of each location. The MAPS closed-loop process selectively extracts, and then recovers, constituents from soils using acids and bases. The emphasis on Mars involves the production of useful materials such as iron, silica, alumina, magnesia, and concrete with recovery of oxygen as a byproduct. On the Moon, similar chemistry is applied with emphasis on oxygen production. This innovation has been demonstrated to produce high-grade materials, such as metallic iron, aluminum oxide, magnesium oxide, and calcium oxide, from lunar and Martian soil simulants. Most of the target products exhibited purities of 80 to 90 percent or more, allowing direct use for many potential applications. Up to one-fourth of the feed soil mass was converted to metal, metal oxide, and oxygen products. The soil residue contained elevated silica content, allowing for potential additional refining and extraction for recovery of materials needed for photovoltaic, semiconductor, and glass applications. A high-grade iron oxide concentrate derived from lunar soil simulant was used to produce a metallic iron component using a novel, combined hydrogen reduction/metal sintering technique. The part was subsequently machined and found to be structurally sound. The behavior of the lunar-simulant-derived iron product was very similar to that produced using the same methods on a Michigan iron

  19. Optical Information Processing for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Current research in optical processing is reviewed. Its role in future aerospace systems is determined. The development of optical devices and components demonstrates that system concepts can be implemented in practical aerospace configurations.

  20. Property Information System

    1998-01-28

    Provides cradle to grave tracking of DOE property (capital, accountable, etc.). Major functional areas include Acquisitions, Management, Inventory, Accounting, Agreements, Excessing, Dispositions, and Reporting. The Accounting module is not used at this time and may not be operational. A major enhancement added here at Lockheed Martin Energy Systems is the Web-based portion of the system, which allows custodians of property to record location and custodial changes, and to provide inventory confirmations. PLEASE NOTE: Customer must contact Ben McMurry, (865) 576-5906, Lockheed Martin Energy Systems, for help with installation of package. The fee for this installation help will be coordinated by customer and Lockheed Martin and is in addition to the cost of the package from ESTSC. Customer should contact Cheri Cross, (865) 574-6046, for user help.

  1. Property Information System

    SciTech Connect

    McMurry, Ben

    1998-01-28

    Provides cradle to grave tracking of DOE property (capital, accountable, etc.). Major functional areas include Acquisitions, Management, Inventory, Accounting, Agreements, Excessing, Dispositions, and Reporting. The Accounting module is not used at this time and may not be operational. A major enhancement added here at Lockheed Martin Energy Systems is the Web-based portion of the system, which allows custodians of property to record location and custodial changes, and to provide inventory confirmations. PLEASE NOTE: Customer must contact Ben McMurry, (865) 576-5906, Lockheed Martin Energy Systems, for help with installation of package. The fee for this installation help will be coordinated by customer and Lockheed Martin and is in addition to the cost of the package from ESTSC. Customer should contact Cheri Cross, (865) 574-6046, for user help.

  2. Multipurpose interactive NASA information system

    NASA Technical Reports Server (NTRS)

    Hill, J. M.; Keefer, R. L.; Sanders, D. R.; Seitz, R. N.

    1979-01-01

    Multipurpose Interactive NASA Information System (MINIS) is a data management system capable of retrieving descriptive data from LANDSAT photos. General enough to be used with other user-defined data bases, the interactive data management and information retrieval system was especially developed for small and medium-sized computers. It uses a free-form data base that allows one to create entirely new and different data bases and to control the format of output products.

  3. Information systems - Issues in global habitability

    NASA Technical Reports Server (NTRS)

    Norman, S. D.; Brass, J. A.; Jones, H.; Morse, D. R.

    1984-01-01

    The present investigation is concerned with fundamental issues, related to information considerations, which arise in an interdisciplinary approach to questions of global habitability. Information system problems and issues are illustrated with the aid of an example involving biochemical cycling and biochemical productivity. The estimation of net primary production (NPP) as an important consideration in the overall global habitability issue is discussed. The NPP model requires three types of data, related to meteorological information, a land surface inventory, and the vegetation structure. Approaches for obtaining and processing these data are discussed. Attention is given to user requirements, information system requirements, workstations, network communications, hardware/software access, and data management.

  4. Welcome to health information science and systems.

    PubMed

    Zhang, Yanchun

    2013-01-01

    Health Information Science and Systems is an exciting, new, multidisciplinary journal that aims to use technologies in computer science to assist in disease diagnosis, treatment, prediction and monitoring through the modeling, design, development, visualization, integration and management of health related information. These computer-science technologies include information systems, web technologies, data mining, image processing, user interaction and interfaces, and sensors and wireless networking, and are applicable to a wide range of health related information including medical data, biomedical data, bioinformatics data, and public health data.

  5. Children's Information Processing of Television Advertising.

    ERIC Educational Resources Information Center

    Wackman, Daniel B.

    This report provides data from a larger study investigating the consumer socialization of children, which focused on the processes by which children acquire knowledge, skills, and attitudes related to consumer behavior. The research has utilized two theoretical perspectives: cognitive development and information processing theories. The data reported are…

  6. Systematic information processing style and perseverative worry.

    PubMed

    Dash, Suzanne R; Meeten, Frances; Davey, Graham C L

    2013-12-01

    This review examines the theoretical rationale for conceiving of systematic information processing as a proximal mechanism for perseverative worry. Systematic processing is characterised by detailed, analytical thought about issue-relevant information, and in this way, is similar to the persistent, detailed processing of information that typifies perseverative worry. We review the key features and determinants of systematic processing, and examine the application of systematic processing to perseverative worry. We argue that systematic processing is a mechanism involved in perseverative worry because (1) systematic processing is more likely to be deployed when individuals feel that they have not reached a satisfactory level of confidence in their judgement and this is similar to the worrier's striving to feel adequately prepared, to have considered every possible negative outcome/detect all potential danger, and to be sure that they will successfully cope with perceived future problems; (2) systematic processing and worry are influenced by similar psychological cognitive states and appraisals; and (3) the functional neuroanatomy underlying systematic processing is located in the same brain regions that are activated during worrying. This proposed mechanism is derived from core psychological processes and offers a number of clinical implications, including the identification of psychological states and appraisals that may benefit from therapeutic interventions for worry-based problems. PMID:24056060

  7. Quantum information processing with atoms and photons.

    PubMed

    Monroe, C

    2002-03-14

    Quantum information processors exploit the quantum features of superposition and entanglement for applications not possible in classical devices, offering the potential for significant improvements in the communication and processing of information. Experimental realization of large-scale quantum information processors remains a long-term vision, as the required nearly pure quantum behaviour is observed only in exotic hardware such as individual laser-cooled atoms and isolated photons. But recent theoretical and experimental advances suggest that cold atoms and individual photons may lead the way towards bigger and better quantum information processors, effectively building mesoscopic versions of 'Schrödinger's cat' from the bottom up.

  8. Engineering Photonic Switches for Quantum Information Processing

    NASA Astrophysics Data System (ADS)

    Oza, Neal N.

    In this dissertation, we describe, characterize, and demonstrate the operation of a dual-in, dual-out, all-optical, fiber-based quantum switch. This "cross-bar" switch is particularly useful for applications in quantum information processing because of its low-loss, high-speed, low-noise, and quantum-state-retention properties. Building upon our lab's prior development of an ultrafast demultiplexer [1-3], the new cross-bar switch can be used as a tunable multiplexer and demultiplexer. In addition to this more functional geometry, we present results demonstrating faster performance with a switching window of ≈45 ps, corresponding to >20-GHz switching rates. We show a switching fidelity of >98%, i.e., switched polarization-encoded photonic qubits are virtually identical to unswitched photonic qubits. We also demonstrate the ability to select one channel from a two-channel quantum data stream with the state of the measured (recovered) quantum channel having >96% relative fidelity with the state of that channel transmitted alone. We separate the two channels of the quantum data stream by 155 ps, corresponding to a 6.5-GHz data stream. Finally, we describe, develop, and demonstrate an application that utilizes the switch's higher-speed, lower-loss, and spatio-temporal-encoding features to perform quantum state tomographies on entangled states in higher-dimensional Hilbert spaces. Since many previous demonstrations show bipartite entanglement of two-level systems, we define "higher" as d > 2, where d represents the dimensionality of a photon. We show that we can generate and measure time-bin-entangled, two-photon, qutrit (d = 3) and ququat (d = 4) states with >85% and >64% fidelity to an ideal maximally entangled state, respectively. Such higher-dimensional states have applications in dense coding [4], loophole-free tests of nonlocality [5], simplifying quantum logic gates [6], and increasing tolerance to noise and loss for quantum information processing [7].
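
    For reference, fidelity with respect to a maximally entangled two-qutrit state can be computed as F = <psi|rho|psi>. The small numpy illustration below uses the ideal state itself as the test density matrix just to exercise the formula; it is not the tomography procedure from the dissertation.

    ```python
    import numpy as np

    d = 3  # qutrits
    # Maximally entangled two-qutrit state |psi> = (1/sqrt(d)) * sum_i |i>|i>.
    psi = np.zeros(d * d)
    for i in range(d):
        psi[i * d + i] = 1.0 / np.sqrt(d)

    rho = np.outer(psi, psi)          # here: the ideal state itself, for illustration
    fidelity = psi @ rho @ psi        # F = <psi| rho |psi>
    print(round(float(fidelity), 6))  # 1.0 for the ideal state
    ```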

  9. Municipal solid waste landfill site selection with geographic information systems and analytical hierarchy process: a case study in Mahshahr County, Iran.

    PubMed

    Alavi, Nadali; Goudarzi, Gholamreza; Babaei, Ali Akbar; Jaafarzadeh, Nemat; Hosseinzadeh, Mohsen

    2013-01-01

    Landfill siting is a complicated process because it must combine social, environmental, and technical factors. In this study, in order to consider all factors and rating criteria, a combination of geographic information systems and the analytical hierarchy process (AHP) was used to determine the best sites for disposal of municipal solid waste (MSW) in Mahshahr County, Iran. For the landfill-siting decision, a structural hierarchy was formed, and the most important criteria (surface water, sensitive ecosystems, land cover, urban and rural areas, land uses, distance to roads, slope, and land type) were chosen according to standards and regulations. Each criterion was evaluated by rating methods. In the next step, the relative importance of the criteria to one another was determined by AHP. Land suitability for landfill was evaluated by the simple additive weighting method. According to the landfill suitability map, the study area was classified into four categories (high, moderate, low, and very low suitability), which represented 18.6%, 20.3%, 1.6%, and 0.8% of the study area, respectively. The remaining 58.7% of the study area was determined to be completely unsuitable for landfill. By considering parameters such as the required landfill area, distance to MSW generation points, and political and management issues, and by consulting with municipal managers in the study area, six sites were chosen for site visits. The field study proved to be a supplementary, and necessary, step in finding the best candidate landfill site among the land with high suitability.
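
    As an illustration of the weighted-overlay step described above, the following is a minimal Python sketch of simple additive weighting with AHP-style weights and a four-class binning of the resulting suitability score. The criterion names follow the abstract, but the weights, ratings, thresholds, and array shapes are invented for illustration and are not the study's values.

        import numpy as np

        # Hypothetical AHP-derived weights for the criteria named in the abstract
        weights = {
            "surface_water": 0.25, "sensitive_ecosystems": 0.20, "land_cover": 0.10,
            "urban_rural_areas": 0.15, "land_use": 0.10, "roads": 0.08,
            "slope": 0.07, "land_type": 0.05,
        }

        def suitability(rated_layers, weights):
            """Simple additive weighting over criterion rasters rated 0-10."""
            total = sum(weights.values())
            score = sum(weights[k] / total * rated_layers[k] for k in weights)
            # Bin the continuous score into four classes (thresholds illustrative):
            # 0 = very low, 1 = low, 2 = moderate, 3 = high suitability
            return np.digitize(score, bins=[2.5, 5.0, 7.5])

        # Random 100x100 rasters stand in for the rated GIS layers
        layers = {k: np.random.randint(0, 11, (100, 100)) for k in weights}
        classes = suitability(layers, weights)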

  10. Multipurpose Interactive NASA Information Systems (MINIS)

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The Multipurpose Interactive NASA Information System was developed to provide remote, interactive information retrieval capability for various types of data bases to be processed on different types of small and medium size computers. Use of the system for three different data bases is described: (1) LANDSAT photo look-up, (2) land use, and (3) census/socioeconomic. Each of the data base elements is shown together with other detailed information that a user would require to contact the system remotely, to transmit inquiries or commands, and to receive the results of the queries or commands.

  11. Earth Science Information System (ESIS)

    USGS Publications Warehouse

    1982-01-01

    The Earth Science Information System (ESIS) was developed in 1981 by the U.S. Geological Survey's Office of the Data Administrator. ESIS serves as a comprehensive data management facility designed to support the coordination, integration, and standardization of scientific, technical, and bibliographic data of the U.S. Geological Survey (USGS). ESIS provides, through an online interactive computer system, referral to information about USGS data bases, data elements which are fields in the records of data bases, and systems. The data bases contain information about many subjects from several scientific disciplines such as: geology, geophysics, geochemistry, hydrology, cartography, oceanography, geography, minerals exploration and conservation, and satellite data sensing.

  12. Reshaping the Enterprise through an Information Architecture and Process Reengineering.

    ERIC Educational Resources Information Center

    Laudato, Nicholas C.; DeSantis, Dennis J.

    1995-01-01

    The approach used by the University of Pittsburgh (Pennsylvania) in designing a campus-wide information architecture and a framework for reengineering the business process included building consensus on a general philosophy for information systems, using pattern-based abstraction techniques, applying data modeling and application prototyping, and…

  13. Maryland Automated Geographic Information System

    NASA Technical Reports Server (NTRS)

    Thomas, E. L.

    1978-01-01

    A computer based system designed for storing geographic data in a consistent and coordinated manner is described. The data are stored, retrieved, and analyzed using a 400 km sq/acre cell. Stored information can be displayed on computer maps in a manner similar to standard map graphics. The data bank contains various information for performing land use analysis in a variety of areas.

  14. System Wide Information Management (SWIM)

    NASA Technical Reports Server (NTRS)

    Hritz, Mike; McGowan, Shirley; Ramos, Cal

    2004-01-01

    This viewgraph presentation lists questions regarding the implementation of System Wide Information Management (SWIM). Some of the questions concern policy issues and strategies, technology issues and strategies, or transition issues and strategies.

  15. Machine Process Capability Information Through Six Sigma

    SciTech Connect

    Lackner, M.F.

    1998-03-13

    A project investigating details concerning machine process capability information and its accessibility has been conducted. The thesis of the project proposed designing a part (denoted as a machine capability workpiece) based on the major machining features of a given machine. Parts are machined and measured to gather representative short-term production variation. The information is utilized to predict the expected defect rate, expressed in terms of a composite sigma level process capability index, for a production part. Presently, decisions concerning process planning, particularly what machine will statistically produce the minimum amount of defects based on machined features and associated tolerances, are rarely made. Six sigma tools and methodology were employed to conduct this investigation at AlliedSignal FM&T. Tools such as the thought process map, factor relationship diagrams, and components of variance were used. This study is progressing toward completion. This research study is an example of how machine process capability information may be gathered for milling planar faces (horizontal) and slot features. The planning method used to determine where and how to gather variation for the part to be designed is known as factor relationship diagramming. Components-of-variation is then applied to the gathered data to arrive at the contributing level of variation illustrated within the factor relationship diagram. The idea of using this capability information beyond process planning in other business enterprise operations is proposed.
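
    The "composite sigma level process capability index" mentioned above aggregates variation across a machine's major features; as a hedged illustration of the per-feature building block only, the sketch below computes the standard short-term capability indices (Cp, Cpk) and a conventional sigma level from sample measurements and two-sided specification limits. The data and tolerance are hypothetical, and the composite aggregation used in the study is not reproduced here.

        import statistics

        def capability(measurements, lsl, usl):
            """Standard short-term capability indices for one machined feature."""
            mu = statistics.mean(measurements)
            sigma = statistics.stdev(measurements)   # short-term sample standard deviation
            cp = (usl - lsl) / (6 * sigma)
            cpk = min(usl - mu, mu - lsl) / (3 * sigma)
            sigma_level = 3 * cpk                    # common convention: Z (sigma level) = 3 * Cpk
            return cp, cpk, sigma_level

        # Hypothetical slot-width measurements (mm) against a 10.00 +/- 0.05 mm tolerance
        print(capability([10.01, 9.99, 10.02, 10.00, 9.98, 10.01], lsl=9.95, usl=10.05))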

  16. Medical Information Management System (MIMS): A Generalized Interactive Information System.

    ERIC Educational Resources Information Center

    Alterescu, Sidney; And Others

    This report describes an interactive information system. It is a general purpose, free format system which can offer immediate assistance where manipulation of large data bases is required. The medical area is a prime area of application. The report is designed to serve as a manual for potential users--nontechnical personnel who will use the…

  17. Structuring Information to Enhance Human Information Processing and Decision Style

    ERIC Educational Resources Information Center

    Harries-Belck, Nancy

    1978-01-01

    A pretest-posttest research design was used to measure changes in three criterion variables of 150 students enrolled in an undergraduate textile class using programmed instructional materials. Findings have implications for the design of organized learning sequences to help individuals process information more efficiently for complex…

  18. RIMS: Resource Information Management System

    NASA Technical Reports Server (NTRS)

    Symes, J.

    1983-01-01

    An overview is given of the capabilities and functions of the Resource Information Management System (RIMS). It is a simple interactive DMS tool which allows users to build, modify, and maintain data management applications. The RIMS minimizes the programmer support required to develop and maintain small data base applications. The RIMS also assists in bringing the United Information Services (UIS) budget system work in-house. Information is also given on the relationship between the RIMS and the user community.

  19. Geographic Information System Data Analysis

    NASA Technical Reports Server (NTRS)

    Billings, Chad; Casad, Christopher; Floriano, Luis G.; Hill, Tracie; Johnson, Rashida K.; Locklear, J. Mark; Penn, Stephen; Rhoulac, Tori; Shay, Adam H.; Taylor, Antone; Thorpe, Karina

    1995-01-01

    Data was collected in order to further NASA Langley Research Center's Geographic Information System(GIS). Information on LaRC's communication, electrical, and facility configurations was collected. Existing data was corrected through verification, resulting in more accurate databases. In addition, Global Positioning System(GPS) points were used in order to accurately impose buildings on digitized images. Overall, this project will help the Imaging and CADD Technology Team (ICTT) prove GIS to be a valuable resource for LaRC.

  20. The minimal work cost of information processing.

    PubMed

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-07-07

    Irreversible information processing cannot be carried out without some inevitable thermodynamical work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes the development of their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditional to the output of the computation. Our formula takes precisely into account the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics.
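
    The central result quoted above can be stated compactly. The lines below are a schematic restatement in LaTeX using notation chosen here (X for the discarded information, Y for the computation's output), shown alongside Landauer's original per-bit bound for comparison; they paraphrase the abstract rather than reproduce the paper's exact formula.

        % Landauer's principle: erasing one bit of information costs at least
        W_{\mathrm{erase}} \ge k_B T \ln 2 \quad \text{per bit erased}

        % Schematic form of the result stated in the abstract: the minimal work for a
        % logical process is set by the entropy of the discarded information X,
        % conditioned on the computation's output Y
        W_{\min} \simeq k_B T \ln 2 \; H(X \mid Y)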

  1. The minimal work cost of information processing

    PubMed Central

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-01-01

    Irreversible information processing cannot be carried out without some inevitable thermodynamical work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes the development of their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditional to the output of the computation. Our formula takes precisely into account the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics. PMID:26151678

  2. Social Information Processing in Deaf Adolescents.

    PubMed

    Torres, Jesús; Saldaña, David; Rodríguez-Ortiz, Isabel R

    2016-07-01

    The goal of this study was to compare the processing of social information in deaf and hearing adolescents. A task was developed to assess social information processing (SIP) skills of deaf adolescents based on Crick and Dodge's (1994; A review and reformulation of social information-processing mechanisms in children's social adjustment. Psychological Bulletin, 115, 74-101) reformulated six-stage model. It consisted of a structured interview after watching 18 scenes of situations depicting participation in a peer group or provocations by peers. Participants included 32 deaf and 20 hearing adolescents and young adults aged between 13 and 21 years. Deaf adolescents and adults had lower scores than hearing participants in all the steps of the SIP model (coding, interpretation, goal formulation, response generation, response decision, and representation). However, deaf girls and women had better scores on social adjustment and on some SIP skills than deaf male participants. PMID:27143715

  3. Integration of health care information systems.

    PubMed

    Cuddeback, J K

    1993-03-01

    Information is at the core of an effective response to virtually all of the new demands that health care institutions will face in the 1990s. New information that is differently organized, more timely, and more conveniently available will facilitate new interactions within the institution. The consistent theme of the new systems requirements introduced by CQI is tighter connection to the processes of patient care and integration of systems and data as those processes cross traditional organizational boundaries. Even the billing requirements are pushing in the same direction. Ironically, the dinosaurs descended from billing systems do not even perform very well as billing systems today, because payers want more clinical detail, in addition to information at very specific points during the patient-care process. This new management model changes our view of our systems. Instead of systems designed to create an after-the-fact record of patient care, we need to think in terms of systems that are part of the patient-care process. This is essential for the continuous monitoring and--when the process gets out of control--rapid intervention that are an intrinsic part of the process in the CQI model. Of course, these systems also produce a complete record as a by-product, but that is not their primary objective. These demands will test the capacities of many of our existing systems and will require the replacement of others. Like all complex processes, however, systems development is performed one step at a time. Each step is taken within the context of an overall goal but also presents an opportunity for learning. CQI is a new management model, and the system requirements are far from clear. Hence, we are likely to need a little continuous improvement in the systems, too.

  4. The Effects of a Concept Map-Based Information Display in an Electronic Portfolio System on Information Processing and Retention in a Fifth-Grade Science Class Covering the Earth's Atmosphere

    ERIC Educational Resources Information Center

    Kim, Paul; Olaciregui, Claudia

    2008-01-01

    An electronic portfolio system, designed to serve as a resource-based learning space, was tested in a fifth-grade science class. The control-group students accessed a traditional folder-based information display in the system and the experimental-group students accessed a concept map-based information display to review a science portfolio. The…

  5. Near real time data processing system

    NASA Astrophysics Data System (ADS)

    Mousessian, Ardvas; Vuu, Christina

    2008-08-01

    Raytheon recently developed and implemented a Near Real Time (NRT) data processing subsystem for the Earth Observing System (EOS) Microwave Limb Sounder (MLS) instrument on the NASA Aura spacecraft. The NRT can be viewed as a customized Science Information Processing System (SIPS) in which the measurements and information provided by the instrument are expeditiously processed, packaged, and delivered. The purpose of the MLS NRT is to process Level 0 data up through Level 2 and distribute standard data products to the customer within 3-5 hours of the arrival of the first set of data.

  6. Handbook on COMTAL's Image Processing System

    NASA Technical Reports Server (NTRS)

    Faulcon, N. D.

    1983-01-01

    An image processing system is the combination of an image processor with other control and display devices plus the necessary software needed to produce an interactive capability to analyze and enhance image data. Such an image processing system installed at NASA Langley Research Center, Instrument Research Division, Acoustics and Vibration Instrumentation Section (AVIS) is described. Although much of the information contained herein can be found in the other references, it is hoped that this single handbook will give the user better access, in concise form, to pertinent information and usage of the image processing system.

  7. System Description and Status Report: California Education Information System.

    ERIC Educational Resources Information Center

    California State Dept. of Education, Sacramento.

    The California Education Information System (CEIS) consists of two subsystems of computer programs designed to process business and pupil data for local school districts. Creating and maintaining records concerning the students in the schools, the pupil subsystem provides for a central repository of school district identification information and a…

  8. MIMS - MEDICAL INFORMATION MANAGEMENT SYSTEM

    NASA Technical Reports Server (NTRS)

    Frankowski, J. W.

    1994-01-01

    MIMS, the Medical Information Management System, is an interactive, general-purpose information storage and retrieval system. It was first designed to be used in medical data management, and can be used to handle all aspects of data related to patient care. Other areas of application for MIMS include: managing occupational safety data in the public and private sectors; handling judicial information where speed and accuracy are high priorities; systemizing purchasing and procurement systems; and analyzing organizational cost structures. Because of its free format design, MIMS can offer immediate assistance where manipulation of large data bases is required. File structures, data categories, field lengths and formats, including alphabetic and/or numeric, are all user defined. The user can quickly and efficiently extract, display, and analyze the data. Three means of extracting data are provided: certain short items of information, such as social security numbers, can be used to uniquely identify each record for quick access; records can be selected which match conditions defined by the user; and specific categories of data can be selected. Data may be displayed and analyzed in several ways which include: generating tabular information assembled from comparison of all the records on the system; generating statistical information on numeric data such as means, standard deviations and standard errors; and displaying formatted listings of output data. The MIMS program is written in Microsoft FORTRAN-77. It was designed to operate on IBM Personal Computers and compatibles running under PC or MS DOS 2.00 or higher. MIMS was developed in 1987.
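
    The three means of extracting data described above (unique-identifier lookup, user-defined conditions, and selection of specific categories) can be illustrated with a toy in-memory sketch; the Python below is purely illustrative, with invented field names, and does not reflect the FORTRAN-77 implementation.

        records = [
            {"ssn": "001-23-4567", "name": "Doe, J.", "dept": "Safety", "exposure_hours": 12},
            {"ssn": "002-34-5678", "name": "Roe, A.", "dept": "Lab", "exposure_hours": 30},
        ]

        def by_key(records, ssn):            # mode 1: short unique identifier
            return next(r for r in records if r["ssn"] == ssn)

        def where(records, predicate):       # mode 2: records matching a user-defined condition
            return [r for r in records if predicate(r)]

        def select(records, *fields):        # mode 3: specific categories (fields) of data
            return [{f: r[f] for f in fields} for r in records]

        print(by_key(records, "001-23-4567"))
        print(where(records, lambda r: r["exposure_hours"] > 20))
        print(select(records, "name", "dept"))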

  9. Geographical information system for flight safety

    NASA Astrophysics Data System (ADS)

    Yamamoto, Hiromichi; Homma, Kohzo; Gomi, Hiromi; Kitagata, Satoru; Kumasaka, Kazuhiro; Oikawa, Tetsuya

    2003-03-01

    This paper proposes a geographical information system for terrain and obstacle awareness and alerting that extracts information from high-resolution satellite images. On-board terrain elevation databases are being increasingly used in aircraft terrain awareness and warning systems (TAWS), offering a step change in capability from the radar altimeter-based ground proximity warning system. However, to enhance the safety of flight of small aircraft and helicopters, in addition to pure topographic information a TAWS database should also contain significant man-made obstacles that present a collision hazard, such as tall buildings and chimneys, communications masts, and electrical power transmission lines. Another issue is keeping the terrain and obstacle database current, reflecting changes to features over time. High-resolution stereoscopic images remotely sensed from Earth orbit have great potential for addressing these issues. In this paper, some critical items are discussed and effective information processing schemes for extracting information relevant to flight safety from satellite images are proposed.

  10. MISS: A Metamodel of Information System Service

    NASA Astrophysics Data System (ADS)

    Arni-Bloch, Nicolas; Ralyté, Jolita

    Integration of different components that compose enterprise information systems (IS) represents a big challenge in the IS development. However, this integration is indispensable in order to avoid IS fragmentation and redundancy between different IS applications. In this work we apply service-oriented development principles to information systems. We define the concept of information system service (ISS) and propose a metamodel of ISS (MISS). We claim that it is not sufficient to consider an ISS as a black box and it is essential to include in the ISS specification the information about service structure, processes and rules shared with other services and thus to make the service transparent. Therefore we define the MISS using three informational spaces (static, dynamic and rule).

  11. [The Mechanisms of Orientation Sensitivity of Human Vision System. Part II: Neural Patterns of Early Processing of Information about Line Orientation].

    PubMed

    Mikhailova, E S; Gerasimenko, N Yu; Krylova, M A; Izyurov, I V; Slavutskaya, A V

    2015-01-01

    The high-density EEG was registered in 41 healthy subjects (20 males, 21 females) in a cardinal (horizontal and vertical) and oblique (45 and 135 deg) line orientation identification task. The analysis of the adaptive amplitude maximum (4 ms averaging) of the P1 and N1 evoked potentials in the symmetrical occipital, parietal, and inferior temporal areas, together with dipole source modelling, showed the anisotropy of cortical responses in the 80-150 ms interval. The amplitude is higher for the oblique orientations in comparison with the cardinal ones. Temporal and regional features of the cortical responses were also found: the earlier selective response (~90 ms latency) is registered in the parietal areas, while the later one (~145 ms latency) is found in the occipital areas. We discovered a number of sex-related differences in the early stages of line orientation detection. In males, the amplitude of the components is higher, and they have a broader area of localisation of their dipole sources: in addition to the occipital and parietal regions, cortex of the temporal regions is involved. The obtained data are discussed in the context of the idea of efficient neural coding (Barlow, 1959) and the features of spatial information processing in the visual system of males and females. PMID:26237944

  12. A Computerized Hospital Patient Information Management System

    PubMed Central

    Wig, Eldon D.

    1982-01-01

    The information processing needs of a hospital are many, with varying degrees of complexity. The prime concern in providing an integrated hospital information management system lies in the ability to process the data relating to the single entity for which every hospital functions - the patient. This paper examines the PRIMIS computer system developed to accommodate hospital needs with respect to a central patient registry, inpatients (i.e., Admission/Transfer/Discharge), and out-patients. Finally, the potential for expansion to permit the incorporation of more hospital functions within PRIMIS is examined.

  13. Central waste processing system

    NASA Technical Reports Server (NTRS)

    Kester, F. L.

    1973-01-01

    A new concept for processing spacecraft type wastes has been evaluated. The feasibility of reacting various waste materials with steam at temperatures of 538 - 760 C in both a continuous and batch reactor with residence times from 3 to 60 seconds has been established. Essentially complete gasification is achieved. Product gases are primarily hydrogen, carbon dioxide, methane, and carbon monoxide. Water soluble synthetic wastes are readily processed in a continuous tubular reactor at concentrations up to 20 weight percent. The batch reactor is able to process wet and dry wastes at steam to waste weight ratios from 2 to 20. Feces, urine, and synthetic wastes have been successfully processed in the batch reactor.

  14. Simulating The SSF Information System

    NASA Technical Reports Server (NTRS)

    Deshpande, Govind K.; Kleine, Henry; Younger, Joseph C.; Sanders, Felicia A.; Smith, Jeffrey L.; Aster, Robert W.; Olivieri, Jerry M.; Paul, Lori L.

    1993-01-01

    The Freedom Operations Simulation Test (FROST) computer program simulates operation of the SSF information system, tracking every packet of data from generation to destination, for both uplinks and downlinks. It collects various statistics concerning operation of the system and provides reports of these statistics at intervals specified by the user. FROST also incorporates a graphical-display capability to enhance interpretation of these statistics. Written in SIMSCRIPT II.5.

  15. Municipal solid waste landfill site selection with geographic information systems and analytical hierarchy process: a case study in Mahshahr County, Iran.

    PubMed

    Alavi, Nadali; Goudarzi, Gholamreza; Babaei, Ali Akbar; Jaafarzadeh, Nemat; Hosseinzadeh, Mohsen

    2013-01-01

    Landfill siting is a complicated process because it must combine social, environmental, and technical factors. In this study, in order to consider all factors and rating criteria, a combination of geographic information systems and the analytical hierarchy process (AHP) was used to determine the best sites for disposal of municipal solid waste (MSW) in Mahshahr County, Iran. For the landfill-siting decision, a structural hierarchy was formed, and the most important criteria (surface water, sensitive ecosystems, land cover, urban and rural areas, land uses, distance to roads, slope, and land type) were chosen according to standards and regulations. Each criterion was evaluated by rating methods. In the next step, the relative importance of the criteria to one another was determined by AHP. Land suitability for landfill was evaluated by the simple additive weighting method. According to the landfill suitability map, the study area was classified into four categories (high, moderate, low, and very low suitability), which represented 18.6%, 20.3%, 1.6%, and 0.8% of the study area, respectively. The remaining 58.7% of the study area was determined to be completely unsuitable for landfill. By considering parameters such as the required landfill area, distance to MSW generation points, and political and management issues, and by consulting with municipal managers in the study area, six sites were chosen for site visits. The field study proved to be a supplementary, and necessary, step in finding the best candidate landfill site among the land with high suitability. PMID:22878933

  16. NASA's Earth Observing System Data and Information System - EOSDIS

    NASA Technical Reports Server (NTRS)

    Ramapriyan, Hampapuram K.

    2011-01-01

    This slide presentation reviews the work of NASA's Earth Observing System Data and Information System (EOSDIS), a petabyte-scale archive of environmental data that supports global climate change research. The Earth Science Data Systems provide end-to-end capabilities to deliver data and information products to users in support of understanding the Earth system. The presentation contains photographs from space of recent events (i.e., the effects of the tsunami in Japan and the wildfires in Australia). It also includes details of the Data Centers that provide the data to EOSDIS and the Science Investigator-led Processing Systems. Information about the Land, Atmosphere Near-real-time Capability for EOS (LANCE) and some of the uses that the system has made possible are reviewed. Also included is information about how to access the data, and evolutionary plans for the future of the system.

  17. Informational analysis involving application of complex information system

    NASA Astrophysics Data System (ADS)

    Ciupak, Clébia; Vanti, Adolfo Alberto; Balloni, Antonio José; Espin, Rafael

    The aim of the present research is to perform an informational analysis for internal audit involving the application of a complex information system based on fuzzy logic. It has been applied to internal audit involving the integration of the accounting field into the information systems field. Technological advancements can provide improvements to the work performed by internal audit. Thus we aim to find, in the complex information systems, priorities for the internal audit work of a high-importance Private Institution of Higher Education. The applied method is quali-quantitative: from the definition of strategic linguistic variables, it was possible to transform them into quantitative variables through matrix intersection. By means of a case study, in which data were collected via an interview with the Administrative Pro-Rector, who takes part in the elaboration of the strategic planning of the institution, it was possible to draw conclusions concerning the points which must be prioritized in the internal audit work. We emphasize that the priorities were identified when processed in a system (of academic use). From the study we can conclude that, starting from these information systems, audit can identify priorities for its work program. Along with the plans and strategic objectives of the enterprise, the internal auditor can define operational procedures that work toward the attainment of the objectives of the organization.

  18. Information Systems Coordinate Emergency Management

    NASA Technical Reports Server (NTRS)

    2012-01-01

    The rescue crews have been searching for the woman for nearly a week. Hurricane Katrina devastated Hancock County, the southernmost point in Mississippi, and the woman had stayed through the storm in her beach house. There is little hope of finding her alive; the search teams know she is gone because the house is gone. Late at night in the art classroom of the school that is serving as the county's emergency operations center, Craig Harvey is discussing the search with the center's commander. Harvey is the Chief Operating Officer of a unique company called NVision Solutions Inc., based at NASA's Stennis Space Center in Bay St. Louis, only a couple of miles away. He and his entire staff have set up a volunteer operation in the art room, supporting the emergency management efforts using technology and capabilities the company developed through its NASA partnerships. As he talks to the commander, Harvey feels an idea taking shape that might lead them to the woman's location. Working with surface elevation data and hydrological principles, Harvey creates a map showing how the floodwaters from the storm would have flowed along the topography of the region around the woman's former home. Using the map, search crews find the woman's body in 15 minutes. Recovering individuals who have been lost is a sad reality of emergency management in the wake of a disaster like Hurricane Katrina in 2005. But the sooner answers can be provided, the sooner a community's overall recovery can take place. When damage is extensive, resources are scattered, and people are in dire need of food, shelter, and medical assistance, the speed and efficiency of emergency operations can be the key to limiting the impact of a disaster and speeding the process of recovery. And a key to quick and effective emergency planning and response is geographic information. With a host of Earth-observing satellites orbiting the globe at all times, NASA generates an unmatched wealth of data about our ever

  19. Development of living body information monitoring system

    NASA Astrophysics Data System (ADS)

    Sakamoto, Hidetoshi; Ohbuchi, Yoshifumi; Torigoe, Ippei; Miyagawa, Hidekazu; Murayama, Nobuki; Hayashida, Yuki; Igasaki, Tomohiko

    2009-12-01

    Easy monitoring systems using contact and non-contact sensing of living body information were proposed for preventing Sudden Infant Death Syndrome (SIDS), as an alternative means of monitoring an infant's vital information. In the contact monitoring system, a respiration sensor, ECG electrodes, a thermistor, and an IC signal processor were integrated into a baby's nappy holder. This contact-monitoring unit has an RF transmission function, and the obtained data are analyzed in real time by a PC. In the non-contact monitoring system, an infrared thermal camera is used. The surroundings of the infant's mouth and nose are monitored, and the respiration rate is obtained by thermal image processing of the temperature changes caused by expired air. The proposed in-sleep infant vital-information monitoring system and unit are effective not only for monitoring the infant's condition but also the nursing person's.

  20. Development of living body information monitoring system

    NASA Astrophysics Data System (ADS)

    Sakamoto, Hidetoshi; Ohbuchi, Yoshifumi; Torigoe, Ippei; Miyagawa, Hidekazu; Murayama, Nobuki; Hayashida, Yuki; Igasaki, Tomohiko

    2010-03-01

    Easy monitoring systems using contact and non-contact sensing of living body information were proposed for preventing Sudden Infant Death Syndrome (SIDS), as an alternative means of monitoring an infant's vital information. In the contact monitoring system, a respiration sensor, ECG electrodes, a thermistor, and an IC signal processor were integrated into a baby's nappy holder. This contact-monitoring unit has an RF transmission function, and the obtained data are analyzed in real time by a PC. In the non-contact monitoring system, an infrared thermal camera is used. The surroundings of the infant's mouth and nose are monitored, and the respiration rate is obtained by thermal image processing of the temperature changes caused by expired air. The proposed in-sleep infant vital-information monitoring system and unit are effective not only for monitoring the infant's condition but also the nursing person's.

  1. Systems Suitable for Information Professionals.

    ERIC Educational Resources Information Center

    Blair, John C., Jr.

    1983-01-01

    Describes computer operating systems applicable to microcomputers, noting hardware components, advantages and disadvantages of each system, local area networks, distributed processing, and a fully configured system. Lists of hardware components (disk drives, solid state disk emulators, input/output and memory components, and processors) and…

  2. A systems process of reinforcement.

    PubMed

    Sudakov, K V

    1997-01-01

    Functional systems theory was used to consider the process of reinforcement, i.e., the action on the body of reinforcing factors: the results of behavior that satisfy the body's original needs. The systems process of reinforcement includes reverse afferentation entering the CNS from receptors acted upon by various parameters of the desired results, and mechanisms for comparing reverse afferentation with the apparatus which accepts the results of the action and the corresponding emotional component. A tight interaction between reinforcement and the dominant motivation is generated on the basis of the hologram principle. Reinforcement forms an apparatus for predicting a desired result, i.e., a result-of-action acceptor. Reinforcement produces significant changes in the activities of individual neurons in the various brain structures involved in dominant motivation, transforming their spike activity from a burst pattern to regular discharges; there are also molecular changes in neuron properties. After preliminary reinforcement, the corresponding motivation induces the ribosomal system of neurons to start synthesizing special effector molecules, which organize molecular engrams of the acceptor of the action's result. Sensory mechanisms of reinforcement are considered, with particular reference to the information role of emotions.

  3. Decentralized Multisensory Information Integration in Neural Systems

    PubMed Central

    Zhang, Wen-hao; Chen, Aihua

    2016-01-01

    the decentralized system can integrate information optimally, with the reciprocal connections between processors determining the extent of cue integration. Our model reproduces known multisensory integration behaviors observed in experiments and sheds new light on our understanding of how information is integrated in the brain. PMID:26758843

  4. Accuracy of Information Processing under Focused Attention.

    ERIC Educational Resources Information Center

    Bastick, Tony

    This paper reports the results of an experiment on the accuracy of information processing during attention focused arousal under two conditions: single estimation and double estimation. The attention of 187 college students was focused by a task requiring high level competition for a monetary prize ($10) under severely limited time conditions. The…

  5. Springfield Processing Plant (SPP) Facility Information

    SciTech Connect

    Leach, Janice; Torres, Teresa M.

    2012-10-01

    The Springfield Processing Plant is a hypothetical facility. It has been constructed for use in training workshops. Information is provided about the facility and its surroundings, particularly security-related aspects such as target identification, threat data, entry control, and response force data.

  6. IDEAL: A methodology for developing information systems

    NASA Technical Reports Server (NTRS)

    Evers, Ken H.; Bachert, Robert F.

    1988-01-01

    As a result of improved capabilities obtained through current computer technologies, application programs and expert systems, Enterprises are being designed or upgraded to be highly integrated and automated information systems. To design or modify Enterprises, it is necessary to first define what functions are to be performed within the Enterprise, identify which functions are potential candidates for automation, and what automated or expert systems are available, or must be developed, to accomplish the selected function. Second, it is necessary to define and analyze the informational requirements for each function along with the informational relationships among the functions so that a database structure can be established to support the Enterprise. To perform this type of system design, an integrated set of analysis tools is required to support the information analysis process. The IDEAL (Integrated Design and Engineering Analysis Languages) methodology provides this integrated set of tools and is discussed.

  7. Thermodynamic Costs of Information Processing in Sensory Adaptation

    PubMed Central

    Sartori, Pablo; Granger, Léo; Lee, Chiu Fan; Horowitz, Jordan M.

    2014-01-01

    Biological sensory systems react to changes in their surroundings. They are characterized by fast response and slow adaptation to varying environmental cues. Insofar as sensory adaptive systems map environmental changes to changes of their internal degrees of freedom, they can be regarded as computational devices manipulating information. Landauer established that information is ultimately physical, and its manipulation subject to the entropic and energetic bounds of thermodynamics. Thus the fundamental costs of biological sensory adaptation can be elucidated by tracking how the information the system has about its environment is altered. These bounds are particularly relevant for small organisms, which unlike everyday computers, operate at very low energies. In this paper, we establish a general framework for the thermodynamics of information processing in sensing. With it, we quantify how during sensory adaptation information about the past is erased, while information about the present is gathered. This process produces entropy larger than the amount of old information erased and has an energetic cost bounded by the amount of new information written to memory. We apply these principles to the E. coli's chemotaxis pathway during binary ligand concentration changes. In this regime, we quantify the amount of information stored by each methyl group and show that receptors consume energy in the range of the information-theoretic minimum. Our work provides a basis for further inquiries into more complex phenomena, such as gradient sensing and frequency response. PMID:25503948
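
    The two bounds stated in the abstract can be written schematically as follows (notation chosen here, with information measured in bits; this is a paraphrase of the abstract's wording, reading "bounded by" as a lower bound, and not the paper's exact expressions).

        % Entropy produced during adaptation exceeds the old information erased
        \Delta S_{\mathrm{tot}} \ge k_B \ln 2 \, I_{\mathrm{erased}}

        % Work needed is bounded below by the new information written to memory
        W \ge k_B T \ln 2 \, I_{\mathrm{written}}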

  8. Conceptual Coordination Bridges Information Processing and Neurophysiology

    NASA Technical Reports Server (NTRS)

    Clancey, William J.; Norvig, Peter (Technical Monitor)

    2000-01-01

    Information processing theories of memory and skills can be reformulated in terms of how categories are physically and temporally related, a process called conceptual coordination. Dreaming can then be understood as a story understanding process in which two mechanisms found in everyday comprehension are missing: conceiving sequences (chunking categories in time as a categorization) and coordinating across modalities (e.g., relating the sound of a word and the image of its meaning). On this basis, we can readily identify isomorphisms between dream phenomenology and neurophysiology, and explain the function of dreaming as facilitating future coordination of sequential, cross-modal categorization (i.e., REM sleep lowers activation thresholds, "unlearning").

  9. Improvements to information management systems simulator

    NASA Technical Reports Server (NTRS)

    Bilek, R. W.

    1972-01-01

    The performance of personnel in the augmentation and improvement of the interactive IMSIM information management simulation model is summarized. With this augmented model, NASA now has even greater capabilities for the simulation of computer system configurations, data processing loads imposed on these configurations, and executive software to control system operations. Through these simulations, NASA has an extremely cost effective capability for the design and analysis of computer-based data management systems.

  10. Information efficiency in hyperspectral imaging systems

    NASA Astrophysics Data System (ADS)

    Reichenbach, Stephen E.; Cao, Luyin; Narayanan, Ram M.

    2002-07-01

    In this work we develop a method for assessing the information density and efficiency of hyperspectral imaging systems that have spectral bands of nonuniform width. Imaging system designs with spectral bands of nonuniform width can efficiently gather information about a scene by allocating bandwidth among the bands according to their information content. The information efficiency is the ratio of information density to data density and is a function of the scene's spectral radiance, hyperspectral system design, and signal-to-noise ratio. The assessment can be used to produce an efficient system design. For example, one approach to determining the number and width of the spectral bands for an information-efficient design is to begin with a design that has a single band and then to iteratively divide a band into two bands until no further division improves the system's efficiency. Two experiments illustrate this approach, one using a simple mathematical model for the scene spectral-radiance autocorrelation function and the other using the deterministic spectral-radiance autocorrelation function of a hyperspectral image from NASA's Advanced Solid-State Array Spectroradiometer. The approach could be used either to determine a fixed system design or to dynamically control a system with variable-width spectral bands (e.g., using on-board processing in a satellite system).
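
    The iterative design loop described above (start with one band, repeatedly split a band while efficiency improves) can be sketched as a greedy search. The Python below assumes a caller-supplied efficiency(bands) function returning information density divided by data density for a given set of band edges; splitting each candidate band at its midpoint is a simplification chosen here, since the abstract does not specify where a band is divided.

        def design_bands(lo, hi, efficiency, max_bands=64):
            """Greedy band splitting driven by an information-efficiency measure."""
            bands = [(lo, hi)]
            while len(bands) < max_bands:
                best, best_eff = None, efficiency(bands)
                for i, (a, b) in enumerate(bands):
                    mid = 0.5 * (a + b)
                    trial = bands[:i] + [(a, mid), (mid, b)] + bands[i + 1:]
                    eff = efficiency(trial)
                    if eff > best_eff:
                        best, best_eff = trial, eff
                if best is None:        # no single split improves efficiency: stop
                    return bands
                bands = best
            return bands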

  11. Optical Information Processing for Aerospace Applications 2

    NASA Technical Reports Server (NTRS)

    Stermer, R. L. (Compiler)

    1984-01-01

    Current research in optical processing, and determination of its role in future aerospace systems was reviewed. It is shown that optical processing offers significant potential for aircraft and spacecraft control, pattern recognition, and robotics. It is demonstrated that the development of optical devices and components can be implemented in practical aerospace configurations.

  12. Information Structure: Linguistic, Cognitive, and Processing Approaches

    PubMed Central

    Arnold, Jennifer E.; Kaiser, Elsi; Kahn, Jason M.; Kim, Lucy Kyoungsook

    2013-01-01

    Language form varies as a result of the information being communicated. Some of the ways in which it varies include word order, referential form, morphological marking, and prosody. The relevant categories of information include the way a word or its referent have been used in context, for example whether a particular referent has been previously mentioned or not, and whether it plays a topical role in the current utterance or discourse. We first provide a broad review of linguistic phenomena that are sensitive to information structure. We then discuss several theoretical approaches to explaining information structure: information status as a part of the grammar; information status as a representation of the speaker’s and listener’s knowledge of common ground and/or the knowledge state of other discourse participants; and the optimal systems approach. These disparate approaches reflect the fact that there is little consensus in the field about precisely which information status categories are relevant, or how they should be represented. We consider possibilities for future work to bring these lines of work together in explicit psycholinguistic models of how people encode information status and use it for language production and comprehension. PMID:26150905

  13. Forest Resource Information System (FRIS)

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The technological and economical feasibility of using multispectral digital image data as acquired from the LANDSAT satellites in an ongoing operational forest information system was evaluated. Computer compatible multispectral scanner data secured from the LANDSAT satellites were demonstrated to be a significant contributor to ongoing information systems by providing the added dimensions of synoptic and repeat coverage of the Earth's surface. Major forest cover types of conifer, deciduous, mixed conifer-deciduous, and non-forest were classified well within the bounds of the statistical accuracy of the ground sample. Further, when overlaid with existing maps, the acreage of cover type retains a high level of positional integrity. Maps were digitized by a graphics design system, then overlaid and registered onto LANDSAT imagery such that the map data with associated attributes were displayed on the image. Once classified, the analysis results were converted back to map form as cover type information. Existing tabular information as represented by inventory is registered geographically to the map base through a vendor-provided data management system. The notion of a geographical reference base (map) providing the framework to which imagery and tabular data bases are registered, and where each of the three functions of imagery, maps, and inventory can be accessed singly or in combination, is the very essence of the forest resource information system design.

  14. Evolution of toxicology information systems

    SciTech Connect

    Wassom, J.S.; Lu, P.Y.

    1990-12-31

    Society today is faced with new health risk situations that have been brought about by recent scientific and technical advances. Federal and state governments are required to assess the many potential health risks to exposed populations from the products (chemicals) and by-products (pollutants) of these advances. Because a sound analysis of any potential health risk should be based on the use of relevant information, it behooves those individuals responsible for making the risk assessments to know where to obtain needed information. This paper reviews the origins of toxicology information systems and explores the specialized information center concept that was proposed in 1963 as a means of providing ready access to scientific and technical information. As a means of illustrating this concept, the operation of one specialized information center (the Environmental Mutagen Information Center at Oak Ridge National Laboratory) will be discussed. Insights into how toxicological information resources came into being, their design and makeup, will be of value to those seeking to acquire information for risk assessment purposes. 7 refs., 1 fig., 4 tabs.

  15. Using Innovative Information Systems Techniques To Teach Information Systems.

    ERIC Educational Resources Information Center

    Chimi, Carl J.; Gordon, Gene M.

    This paper discusses a number of innovative techniques that were used to teach courses in Information Systems to undergraduate and graduate students. While none of these techniques is individually innovative, the combination of techniques provides a true "hands-on" environment for students; because of the way that the components of the courses are…

  16. NICA project management information system

    NASA Astrophysics Data System (ADS)

    Bashashin, M. V.; Kekelidze, D. V.; Kostromin, S. A.; Korenkov, V. V.; Kuniaev, S. V.; Morozov, V. V.; Potrebenikov, Yu. K.; Trubnikov, G. V.; Philippov, A. V.

    2016-09-01

    The growth of science projects and the changing of efficiency criteria during project implementation not only require an increased level of management specialization but also pose the problem of selecting effective methods for planning, for monitoring deadlines, and for coordinating the interaction of participants involved in research projects. This paper is devoted to choosing the project management information system for the new heavy-ion collider NICA (Nuclotron based Ion Collider fAcility). We formulate the requirements for the project management information system, taking into account the specifics of the Joint Institute for Nuclear Research (JINR, Dubna, Russia) as an international intergovernmental research organization, on the basis of which a flexible and effective information system for NICA project management is being developed.

  17. Goal Based Testing: A Risk Informed Process

    NASA Technical Reports Server (NTRS)

    Everline, Chester; Smith, Clayton; Distefano, Sal; Goldin, Natalie

    2014-01-01

    A process for life demonstration testing is developed that can reduce the resources required by conventional sampling theory while still maintaining the same degree of rigor and confidence level. This process incorporates state-of-the-art probabilistic thinking and is consistent with existing NASA guidance documentation. This view of life testing changes the paradigm from testing a system for many hours to show confidence that it will last for the required number of years, to one that focuses efforts and resources on exploring how the system can fail at end of life and on building confidence that the failure mechanisms are understood and well mitigated.
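
    For context on the "conventional sampling theory" baseline mentioned above, the classical zero-failure (success-run) demonstration test requires roughly n = ln(1 - C) / ln(R) independent articles or mission-equivalent test intervals to demonstrate reliability R at confidence C; for example, R = 0.9 at C = 0.9 gives n of about 22. This standard binomial result is included only to illustrate the resource burden that a risk-informed process aims to reduce; it is not the process developed in the paper.

        % Zero-failure demonstration: P(0 failures in n trials | reliability R) = R^n <= 1 - C
        n \ge \frac{\ln(1 - C)}{\ln R}, \qquad R = 0.9,\; C = 0.9 \;\Rightarrow\; n \approx 22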

  18. New Directions in Legal Information Processing.

    ERIC Educational Resources Information Center

    Chien, R. T.; And Others

    The paper discusses some new developments that should evolve during the next decade in automating the handling of legal information. These new developments include: automated question-answering systems to provide quick and inexpensive answers to many non-controversial, but not necessarily simple legal questions to aid lawyers, social and welfare…

  19. Photonic qubits for remote quantum information processing

    NASA Astrophysics Data System (ADS)

    Maunz, P.; Olmschenk, S.; Hayes, D.; Matsukevich, D. N.; Duan, L.-M.; Monroe, C.

    2009-05-01

    Quantum information processing between remote quantum memories relies on a fast and faithful quantum channel. Recent experiments employed both photonic polarization and frequency qubits in order to entangle remote atoms [1, 2], to teleport quantum information [3], and to operate a quantum gate between distant atoms. Here, we compare the different schemes used in these experiments and analyze the advantages of the different choices of atomic and photonic qubits and their coherence properties. [1] D. L. Moehring et al., Nature 449, 68 (2007). [2] D. N. Matsukevich et al., Phys. Rev. Lett. 100, 150404 (2008). [3] S. Olmschenk et al., Science 323, 486 (2009).

  20. MINIS: Multipurpose Interactive NASA Information System

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The Multipurpose Interactive NASA Information Systems (MINIS) was developed in response to the need for a data management system capable of operation on several different minicomputer systems. The desired system had to be capable of performing the functions of a LANDSAT photo descriptive data retrieval system while remaining general in terms of other acceptable user definable data bases. The system also had to be capable of performing data base updates and providing user-formatted output reports. The resultant MINI System provides all of these capabilities and several other features to complement the data management system. The MINI System is currently implemented on two minicomputer systems and is in the process of being installed on another minicomputer system. The MINIS is operational on four different data bases.

  1. Advanced system functions for the office information system

    NASA Astrophysics Data System (ADS)

    Ishikawa, Tetsuya

    The author first describes the functions needed for an information management system in the office. He then discusses the requisites for the enhancement of system functions, stating that in order to enhance system functions it is necessary to examine them comprehensively from every point of view, including processing time and cost. In this paper, he concentrates on the enhancement of the man-machine interface (i.e., the human interface), that is, how to make the system easy to use for office workers.

  2. Marketing in Admissions: The Information System Approach.

    ERIC Educational Resources Information Center

    Wofford, O. Douglas; Timmerman, Ed

    1982-01-01

    A marketing information system approach for college admissions is outlined that includes objectives, information needs and sources, a data collection format, and information evaluation. Coordination with other institutional information systems is recommended. (MSE)

  3. Business Information Systems. Occupational Competency Analysis Profile.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Vocational Instructional Materials Lab.

    This Occupational Competency Analysis Profile (OCAP) for business information systems is an employer-verified competency list that evolved from a modified DACUM (Developing a Curriculum) job analysis process involving business, industry, labor, and community agency representatives throughout Ohio. The competency list consists of 10 units: (1) data…

  4. Distributed processing techniques: interface design for interactive information sharing.

    PubMed

    Wagner, J R; Krumbholz, S D; Silber, L K; Aniello, A J

    1978-01-01

    The Information Systems Division of the University of Iowa Hospitals and Clinics has successfully designed and implemented a set of generalized interface data-handling routines that control message traffic between a satellite minicomputer in a clinical laboratory and a large main-frame computer. A special queue status inquiry transaction has also been developed that displays the current message-processing backlog and other system performance information. The design and operation of these programs are discussed in detail, with special emphasis on the message-queuing and verification techniques required in a distributed processing environment.
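
    The queue-status-inquiry idea described above can be sketched with a toy in-process queue standing in for the satellite-to-mainframe interface; every name and field in the Python below is invented for illustration and none of it reflects the Iowa implementation.

        import collections
        import time

        class InterfaceQueue:
            """Toy message queue that reports its current processing backlog on demand."""
            def __init__(self):
                self._queue = collections.deque()
                self._sent = 0

            def enqueue(self, message):
                self._queue.append((time.time(), message))

            def dequeue(self):
                _, message = self._queue.popleft()
                self._sent += 1
                return message

            def status(self):
                """Status inquiry: backlog size, age of oldest queued message, messages sent."""
                oldest = time.time() - self._queue[0][0] if self._queue else 0.0
                return {"backlog": len(self._queue), "oldest_s": round(oldest, 3), "sent": self._sent}

        q = InterfaceQueue()
        q.enqueue({"test": "CBC", "accession": "A-1001"})
        print(q.status())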

  5. Spatial aspects of intracellular information processing.

    PubMed

    Kinkhabwala, Ali; Bastiaens, Philippe I H

    2010-02-01

    The computational properties of intracellular biochemical networks, for which the cell is assumed to be a 'well-mixed' reactor, have already been widely characterized. What has so far not received systematic treatment is the important role of space in many intracellular computations. Spatial network computations can be divided into two broad categories: those required for essential spatial processes (e.g. polarization, chemotaxis, division, and development) and those for which space is simply used as an extra dimension to expand the computational power of the network. Several pertinent recent examples of each category are discussed that illustrate the often conceptually subtle role of space in the processing of intracellular information. PMID:20096560

  6. [Electronic poison information management system].

    PubMed

    Kabata, Piotr; Waldman, Wojciech; Kaletha, Krystian; Sein Anand, Jacek

    2013-01-01

    We describe the deployment of an electronic toxicological information database in the poison control center of the Pomeranian Center of Toxicology. The system was based on Google Apps technology, by Google Inc., using electronic, web-based forms and data tables. During the first 6 months after system deployment, we used it to archive 1471 poisoning cases, prepare monthly poisoning reports, and facilitate statistical analysis of data. Use of the electronic database made the Poison Center's work much easier. PMID:24466697

  8. Mass Storage Performance Information System

    NASA Technical Reports Server (NTRS)

    Scheuermann, Peter

    2000-01-01

    The purpose of this task is to develop a data warehouse to enable system administrators and their managers to gather information by querying the data logs of the MDSDS. Currently detailed logs capture the activity of the MDSDS internal to the different systems. The elements to be included in the data warehouse are requirements analysis, data cleansing, database design, database population, hardware/software acquisition, data transformation, query and report generation, and data mining.

  9. EOSDIS Science Data Information and Analysis Systems

    NASA Astrophysics Data System (ADS)

    Behnke, J.; Ullman, R.; Pfister, R.

    2002-12-01

    NASA's Earth Science Data Information Systems (ESDIS) Project is committed to operating and maintaining a quality Earth Observing System (EOS) Data and Information System (EOSDIS) which enables research by Earth scientists and fosters data accessibility and application by the broader user community. With the recent launch of Aqua, a few hundred new datasets will be added to the current 1560 datasets available through EOSDIS. One of the core functions at ESDIS is to enable the processing of all science data collected from the various EOS missions including Terra and Aqua, upcoming ICESat and Aura and other missions. There are many EOS Science data producers, data users, planners and managers of available data systems and tools for managing EOS data. There are also many services available through EOSDIS including those that will help scientists process, archive and access data and information for research, applications, planning and management. This paper will describe system services, functionality, access requirements and procedures and the intended user community that work principally with EOSDIS data. It will address analysis tools, data population tools, specific EOSDIS data sets and metadata types, tools for metadata creation and management, tools for distribution, EOSDIS data formats and distribution techniques. New techniques are critical to the success of EOS data manipulation including data mining, intelligent data archiving, data fusion, agent technologies, visualization, and other advanced information system concepts. Data management is key to EOSDIS and our strategic focus areas look to EOSDIS evolution, external integration, data system development and relationship building with our user community.

  10. Processing data base information having nonwhite noise

    DOEpatents

    Gross, Kenneth C.; Morreale, Patricia

    1995-01-01

    A method and system for processing a set of data from an industrial process and/or a sensor. The method and system can include processing data from either real or calculated data related to an industrial process variable. One of the data sets can be an artificial signal data set generated by an autoregressive moving average technique. After obtaining two data sets associated with one physical variable, a difference function data set is obtained by determining the arithmetic difference between the two data sets over time. A frequency domain transformation is made of the difference function data set to obtain Fourier modes describing a composite function data set. A residual function data set is obtained by subtracting the composite function data set from the difference function data set, and the residual function data set (free of nonwhite noise) is analyzed by a statistical probability ratio test to provide a validated data base.
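
    A rough sketch of the described processing chain is given below, under stated assumptions: NumPy's real FFT stands in for the frequency-domain transformation, the composite is built from only the few largest Fourier modes, and a plain Gaussian mean-shift sequential probability ratio test stands in for the statistical test; the signals and parameter values are invented for illustration.

      import numpy as np

      def whiten_difference(x, y, n_modes=3):
          """Difference function -> composite of dominant Fourier modes -> residual."""
          d = np.asarray(x) - np.asarray(y)            # difference function data set
          spec = np.fft.rfft(d)
          keep = np.argsort(np.abs(spec))[::-1][:n_modes]
          composite = np.zeros_like(spec)
          composite[keep] = spec[keep]                 # dominant (serially correlated) structure
          residual = d - np.fft.irfft(composite, n=len(d))
          return residual

      def sprt_mean_shift(residual, mu1=0.5, sigma=1.0, alpha=0.01, beta=0.01):
          """Simple sequential probability ratio test for a mean shift of size mu1."""
          upper, lower = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))
          llr = 0.0
          for r in residual:
              llr += (mu1 * r - 0.5 * mu1**2) / sigma**2   # Gaussian log-likelihood ratio
              if llr >= upper:
                  return "fault"
              if llr <= lower:
                  return "normal"
          return "undecided"

      rng = np.random.default_rng(0)
      t = np.linspace(0, 10, 512)
      sensor = np.sin(2 * np.pi * 0.5 * t) + rng.normal(0, 0.1, t.size)
      model = np.sin(2 * np.pi * 0.5 * t)
      print(sprt_mean_shift(whiten_difference(sensor, model)))   # expected: "normal"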

  11. Music Information Services System (MISS).

    ERIC Educational Resources Information Center

    Rao, Paladugu V.

    Music Information Services System (MISS) was developed at the Eastern Illinois University Library to manage the sound recording collection. Operating in a batch mode, MISS keeps track of the inventory of sound recordings, generates necessary catalogs to facilitate the use of the sound recordings, and provides specialized bibliographies of sound…

  12. Information Systems, Security, and Privacy.

    ERIC Educational Resources Information Center

    Ware, Willis H.

    1984-01-01

    Computer security and computer privacy issues are discussed. Among the areas addressed are technical and human security threats, security and privacy issues for information in electronic mail systems, the need for a national commission to examine these issues, and security/privacy issues relevant to colleges and universities. (JN)

  13. Policy Information System Computer Program.

    ERIC Educational Resources Information Center

    Hamlin, Roger E.; And Others

    The concepts and methodologies outlined in "A Policy Information System for Vocational Education" are presented in a simple computer format in this booklet. It also contains a sample output representing 5-year projections of various planning needs for vocational education. Computerized figures in the eight areas corresponding to those in the…

  14. Learning Information Systems: Theoretical Foundations.

    ERIC Educational Resources Information Center

    Paul, Terrance D.

    This paper uses the conceptual framework of cybernetics to understand why learning information systems such as the "Accelerated Reader" work so successfully, and to examine how this simple yet incisive concept can be used to accelerate learning at every level and in all disciplines. The first section, "Basic Concepts," discusses the cybernetic…

  15. Information Systems: Fact or Fiction.

    ERIC Educational Resources Information Center

    Bearley, William

    Rising costs of programming and program maintenance have caused discussion concerning the need for generalized information systems. These would provide data base functions plus complete report writing and file maintenance capabilities. All administrative applications, including online registration, student records, and financial applications are…

  16. Information processing by simple molecular motifs and susceptibility to noise.

    PubMed

    Mc Mahon, Siobhan S; Lenive, Oleg; Filippi, Sarah; Stumpf, Michael P H

    2015-09-01

    Biological organisms rely on their ability to sense and respond appropriately to their environment. The molecular mechanisms that facilitate these essential processes are however subject to a range of random effects and stochastic processes, which jointly affect the reliability of information transmission between receptors and, for example, the physiological downstream response. Information is mathematically defined in terms of the entropy; and the extent of information flowing across an information channel or signalling system is typically measured by the 'mutual information', or the reduction in the uncertainty about the output once the input signal is known. Here, we quantify how extrinsic and intrinsic noise affects the transmission of simple signals along simple motifs of molecular interaction networks. Even for very simple systems, the effects of the different sources of variability alone and in combination can give rise to bewildering complexity. In particular, extrinsic variability is apt to generate 'apparent' information that can, in extreme cases, mask the actual information that for a single system would flow between the different molecular components making up cellular signalling pathways. We show how this artificial inflation in apparent information arises and how the effects of different types of noise alone and in combination can be understood.
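
    For reference, the quantities named here are the standard Shannon definitions (textbook material, not results of this paper). For a signalling channel with input X and output Y,

      H(Y) = -\sum_{y} p(y)\,\log_2 p(y), \qquad
      I(X;Y) = H(Y) - H(Y \mid X) = \sum_{x,y} p(x,y)\,\log_2 \frac{p(x,y)}{p(x)\,p(y)},

    so the mutual information is the average reduction in uncertainty about the output once the input is known; as the abstract notes, extrinsic (cell-to-cell) variability can inflate the apparent I(X;Y) measured across a population relative to the information that actually flows in any single cell.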

  17. Storage and distribution system for multimedia information

    NASA Astrophysics Data System (ADS)

    Murakami, Tokumichi

    1994-06-01

    Recent advances in technologies such as digital signal processing, LSI devices, and storage media have led to explosive growth of the multimedia environment. Multimedia information services are expected to provide an information-oriented infrastructure that will integrate visual communication, broadcasting, and computer services. International standardization of video/audio coding is accelerating the spread of these services into society. In this paper, starting from trends in R & D and international standardization of video coding techniques, an outline is given of a storage and distribution system for multimedia information, together with a summary of the requirements for digital storage media.

  18. Online information retrieval systems and health professionals.

    PubMed

    Lialiou, Pascalina; Mantas, John

    2014-01-01

    The following paper presents a scientific contribution that explores clinicians' use of online information retrieval systems for clinical decision making. In particular, the research focuses on the ability of doctors and nurses to seek information through MEDLINE and ScienceDirect. Data were collected with an electronic form consisting of five clinical scenarios and an evaluation sheet. The results show that only a small percentage of clinicians use the recommended electronic bibliographic databases and reach the right clinical decision for the scenarios. Health professionals need to be educated in information searching so that they can take advantage of the available literature and obtain more useful and reliable answers to their clinical questions.

  19. Information survey for microcomputer systems integration

    SciTech Connect

    Hake, K.A.

    1991-12-01

    One goal of the PM-AIM is to provide US Army Project Managers (PMs) and Project Executive Officers (PEOs) with a fundamental microcomputing resource to help perform acquisition information management and its concomitant reporting requirements. Providing key application software represents one means of accomplishing this goal. This workstation would furnish a broad range of capabilities needed in the PM and PEO office settings as well as software tools for specific project management and acquisition information. Although still in the conceptual phase, the practical result of this exercise in systems integration will likely be a system called the Project Manager's Information System (PMIS) or the AIM workstation. It would include such software as, Project Manager's System Software (PMSS), Defense Acquisition Executive Summary (DAES), and Consolidated Acquisition Reporting System (CARS) and would conform to open systems architecture as accepted by the Department of Defense. ORNL has assisted PM-AIM in the development of technology ideas for the PMIS workstation concept. This paper represents the compilation of information gained during this process. This information is presented as a body of knowledge (or knowledge domain) defining the complex technology of microcomputing. The concept of systems integration or tying together all hardware and software components reflects the nature of PM-AIM's task in attempting to field a PMIS or AIM workstation.

  1. A proven knowledge-based approach to prioritizing process information

    NASA Technical Reports Server (NTRS)

    Corsberg, Daniel R.

    1991-01-01

    Many space-related processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect is rapid analysis of the changing process information. During a disturbance, this task can overwhelm humans as well as computers. Humans deal with this by applying heuristics in determining significant information. A simple, knowledge-based approach to prioritizing information is described. The approach models those heuristics that humans would use in similar circumstances. The approach described has received two patents and was implemented in the Alarm Filtering System (AFS) at the Idaho National Engineering Laboratory (INEL). AFS was first developed for application in a nuclear reactor control room. It has since been used in chemical processing applications, where it has had a significant impact on control room environments. The approach uses knowledge-based heuristics to analyze data from process instrumentation and respond to that data according to knowledge encapsulated in objects and rules. While AFS cannot perform the complete diagnosis and control task, it has proven to be extremely effective at filtering and prioritizing information. AFS was used for over two years as a first level of analysis for human diagnosticians. Given the approach's proven track record in a wide variety of practical applications, it should be useful in both ground- and space-based systems.
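
    The flavor of such heuristics can be sketched in a few lines of Python. This is illustrative only and not the patented AFS logic: the alarm objects, the cause/consequence field, and the three-level priority scheme are assumptions, showing only how rules applied to alarm objects can suppress expected consequences and highlight likely root causes.

      from dataclasses import dataclass
      from typing import Optional

      @dataclass
      class Alarm:
          tag: str
          description: str
          priority: int = 2                  # 1 = high, 2 = normal, 3 = suppressed
          caused_by: Optional[str] = None    # tag of an alarm that would explain this one

      def prioritize(alarms):
          """Heuristic: suppress expected consequences, highlight likely root causes."""
          active = {a.tag: a for a in alarms}
          for a in alarms:
              if a.caused_by and a.caused_by in active:
                  a.priority = 3                       # expected consequence -> filter out
                  active[a.caused_by].priority = 1     # likely root cause -> highlight
          return sorted(alarms, key=lambda a: a.priority)

      alarms = [
          Alarm("PUMP-101-TRIP", "Feed pump trip"),
          Alarm("FLOW-101-LOW", "Low flow on feed line", caused_by="PUMP-101-TRIP"),
          Alarm("PRESS-202-HI", "High pressure in header"),
      ]
      for a in prioritize(alarms):
          print(a.priority, a.tag, a.description)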

  2. Database Systems. Course Three. Information Systems Curriculum.

    ERIC Educational Resources Information Center

    O'Neil, Sharon Lund; Everett, Donna R.

    This course is the third of seven in the Information Systems curriculum. The purpose of the course is to familiarize students with database management concepts and standard database management software. Databases and their roles, advantages, and limitations are explained. An overview of the course sets forth the condition and performance standard…

  3. Understanding and implementing hospital information systems.

    PubMed

    1995-02-01

    One of a hospital's greatest resources is its information. The hospital's information system, whether computerized or manual, is the means by which data is collected, integrated, and retrieved. However, because optimal patient treatment, financial management, and hospital operation require that decisions be based on current, accurate, complete, and well-organized data, a computerized hospital information system (HIS), when correctly implemented, can be the most effective means of disseminating valuable information to decision makers. Although the systems currently in place in most hospitals are used primarily to manage finances, an integrated HIS is much more than a financial system; it can, in fact, coordinate all of a hospital's information needs. An integrated HIS develops over time, typically several years. Merely automating existing procedures may not provide many of the potential benefits of a new system and may even carry forward most of the drawbacks of the old system. Determining how information is currently processed in the hospital and putting together an effective team to carry out acquisition and implementation of an HIS must precede the purchase of computers, networks, and software applications. In Part 1 of this article, we describe hospitals' general information needs and provide an overview of the current state of HISs and what hospitals can expect to gain from implementing a new system; in Part 2, we describe the steps hospitals can take when putting the system in place. We caution readers that, although we will be discussing many benefits of successful HISs, little documented or quantified evidence exists to show that these benefits are being realized; most evidence is subjective and qualitative, and claims are not thoroughly substantiated. Few, if any, hospitals have achieved the completely integrated system model--or even come close. Nevertheless, this article provides the groundwork for hospitals to make a thoughtful beginning. In upcoming

  4. PROMIS (Procurement Management Information System)

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The PROcurement Management Information System (PROMIS) provides both detailed and summary level information on all procurement actions performed within NASA's procurement offices at Marshall Space Flight Center (MSFC). It provides not only on-line access, but also schedules procurement actions, monitors their progress, and updates Forecast Award Dates. Except for a few computational routines coded in FORTRAN, the majority of the system is coded in a high level language called NATURAL. A relational Data Base Management System called ADABAS is utilized. Certain fields, called descriptors, are set up on each file to allow the selection of records based on a specified value or range of values. The use of like descriptors on different files serves as the link between the files, thus producing a relational data base. Twenty related files are currently being maintained on PROMIS.

  5. INCEPTION, DESIGN AND IMPLEMENTATION OF A MANAGEMENT INFORMATION SYSTEM.

    ERIC Educational Resources Information Center

    LEWIS, DAVID ALFRED

    THE PURPOSE OF THIS PAPER IS TO DEVELOP AN INSTRUCTIONAL AND SYSTEMATIC APPROACH TO THE DESIGN AND IMPLEMENTATION OF A MANAGEMENT INFORMATION SYSTEM. GOALS, OBJECTIVES, STRUCTURE, AND RESPONSIBILITIES FORM THE FRAMEWORK OF A MANAGEMENT INFORMATION SYSTEM. THE TASK OF A MANAGEMENT INFORMATION SYSTEM IS TO PROCESS RAW DATA IN SUCH A WAY AS TO…

  6. Fisher Information in Ecological Systems

    NASA Astrophysics Data System (ADS)

    Frieden, B. Roy; Gatenby, Robert A.

    Fisher information is being increasingly used as a tool of research into ecological systems. For example the information was shown in Chapter 7 to provide a useful diagnostic of the health of an ecology. In other applications to ecology, extreme physical information (EPI) has been used to derive the population-rate (or Lotka-Volterra) equations of ecological systems, both directly [1] and indirectly (Chapter 5) via the quantum Schrodinger wave equation (SWE). We next build on these results, to derive (i) an uncertainty principle (8.3) of biology, (ii) a simple decision rule (8.18) for predicting whether a given ecology is susceptible to a sudden drop in population (Section 8.1), (iii) the probability law (8.57) or (8.59) on the worldwide occurrence of the masses of living creatures from mice to elephants and beyond (Section 8.2), and (iv) the famous quarter-power laws for the attributes of biological and other systems. The latter approach uses EPI to derive the simultaneous quarter-power behavior of all attributes obeyed by the law, such as metabolism rate, brain size, grazing range, etc. (Section 8.3). This maximal breadth of scope is allowed by its basis in information, which of course applies to all types of quantitative data (Section 1.4.3, Chapter 1).
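
    For reference, the Fisher information invoked here is the standard quantity from estimation theory; as a reminder (and not a derivation from this chapter), for a probability density p(x|θ),

      I(\theta) = \int p(x \mid \theta)\left[\frac{\partial}{\partial \theta}\,\ln p(x \mid \theta)\right]^{2} dx,
      \qquad
      I = \int \frac{[p'(x)]^{2}}{p(x)}\,dx = 4\int [q'(x)]^{2}\,dx, \quad q(x) = \sqrt{p(x)},

    where the second, shift-invariant form written in terms of the probability amplitude q is the one typically used in extreme physical information (EPI) derivations.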

  7. Nuclear magnetic resonance quantum information processing

    PubMed Central

    Serra, R. M.; Oliveira, I. S.

    2012-01-01

    For the past decade, nuclear magnetic resonance (NMR) has been established as a main experimental technique for testing quantum protocols in small systems. This Theme Issue presents recent advances and major challenges of NMR quantum information processing (QIP), including contributions by researchers from 10 different countries. In this introduction, after a short comment on NMR-QIP basics, we briefly anticipate the contents of this issue. PMID:22946031

  8. Embedded ubiquitous services on hospital information systems.

    PubMed

    Kuroda, Tomohiro; Sasaki, Hiroshi; Suenaga, Takatoshi; Masuda, Yasushi; Yasumuro, Yoshihiro; Hori, Kenta; Ohboshi, Naoki; Takemura, Tadamasa; Chihara, Kunihiro; Yoshihara, Hiroyuki

    2012-11-01

    Hospital Information Systems (HIS) have turned the hospital into a gigantic computer with huge computational power, huge storage, and a wired/wireless local area network. On the other hand, a modern medical device, such as an echograph, is a computer system with several functional units connected by an internal network named a bus. Therefore, we can embed such a medical device into the HIS by simply replacing the bus with the local area network. This paper designed and developed two embedded systems, a ubiquitous echograph system and a networked digital camera. Evaluations of the developed systems clearly show that the proposed approach, embedding existing clinical systems into the HIS, drastically changes productivity in the clinical field. Once a clinical system becomes a pluggable unit for a gigantic computer system, the HIS, the combination of multiple embedded systems with application software designed under deep consideration of clinical processes may lead to the emergence of disruptive innovation in the clinical field. PMID:22855229

  9. Process gas solidification system

    DOEpatents

    Fort, William G. S.; Lee, Jr., William W.

    1978-01-01

    It has been the practice to (a) withdraw hot, liquid UF6 from various systems, (b) direct the UF6 into storage cylinders, and (c) transport the filled cylinders to another area where the UF6 is permitted to solidify by natural cooling. However, some hazard attends the movement of cylinders containing liquid UF6, which is dense, toxic, and corrosive. As illustrated in terms of one of its applications, the invention is directed to withdrawing hot liquid UF6 from a system including (a) a compressor for increasing the pressure and temperature of a stream of gaseous UF6 to above its triple point and (b) a condenser for liquefying the compressed gas. A network containing block valves and at least first and second portable storage cylinders is connected between the outlet of the condenser and the suction inlet of the compressor. After an increment of liquid UF6 from the condenser has been admitted to the first cylinder, the cylinder is connected to the suction of the compressor to flash off UF6 from the cylinder, thus gradually solidifying UF6 therein. While the first cylinder is being cooled in this manner, an increment of liquid UF6 from the condenser is transferred into the second cylinder. UF6 then is flashed from the second cylinder while another increment of liquid UF6 is being fed to the first. The operations are repeated until both cylinders are filled with solid UF6, after which they can be moved safely. As compared with the previous technique, this procedure is safer, faster, and more economical. The method also provides the additional advantage of removing volatile impurities from the UF6 while it is being cooled.

  10. Information systems vulnerability: A systems analysis perspective

    SciTech Connect

    Wyss, G.D.; Daniel, S.L.; Schriner, H.K.; Gaylor, T.R.

    1996-07-01

    Vulnerability analyses for information systems are complicated because the systems are often geographically distributed. Sandia National Laboratories has assembled an interdisciplinary team to explore the applicability of probabilistic logic modeling (PLM) techniques (including vulnerability and vital area analysis) to examine the risks associated with networked information systems. The authors have found that the reliability and failure modes of many network technologies can be effectively assessed using fault trees and other PLM methods. The results of these models are compatible with an expanded set of vital area analysis techniques that can model both physical locations and virtual (logical) locations to identify both categories of vital areas simultaneously. These results can also be used with optimization techniques to direct the analyst toward the most cost-effective security solution.
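
    As a toy illustration of the fault-tree arithmetic (not Sandia's models; the basic events and probabilities are invented), the top-event probability of a small tree with independent basic events can be composed from OR and AND gates:

      def or_gate(*p):             # failure if any input fails (independent events)
          q = 1.0
          for pi in p:
              q *= (1.0 - pi)
          return 1.0 - q

      def and_gate(*p):            # failure only if all inputs fail
          prod = 1.0
          for pi in p:
              prod *= pi
          return prod

      # Hypothetical annual failure probabilities for basic events
      p_router, p_fiber = 0.02, 0.01
      p_server_a, p_server_b = 0.05, 0.05

      # "Service lost" = (router fails OR fiber cut) OR (both redundant servers fail)
      p_network = or_gate(p_router, p_fiber)
      p_servers = and_gate(p_server_a, p_server_b)
      p_top = or_gate(p_network, p_servers)
      print(f"Top-event probability: {p_top:.4f}")   # approximately 0.0322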

  11. Automated Information System (AIS) Alarm System

    SciTech Connect

    Hunteman, W.

    1997-05-01

    The Automated Information Alarm System is a joint effort between Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratory to demonstrate and implement, on a small-to-medium sized local area network, an automated system that detects and automatically responds to attacks that use readily available tools and methodologies. The Alarm System will sense or detect, assess, and respond to suspicious activities that may be detrimental to information on the network or to continued operation of the network. The responses will allow stopping, isolating, or ejecting the suspicious activities. The number of sensors, the sensitivity of the sensors, the assessment criteria, and the desired responses may be set by the using organization to meet their local security policies.

  12. Engineering Design Information System (EDIS)

    SciTech Connect

    Smith, P.S.; Short, R.D.; Schwarz, R.K.

    1990-11-01

    This manual is a guide to the use of the Engineering Design Information System (EDIS) Phase I. The system runs on the Martin Marietta Energy Systems, Inc., IBM 3081 unclassified computer. This is the first phase in the implementation of EDIS, which is an index, storage, and retrieval system for engineering documents produced at various plants and laboratories operated by Energy Systems for the Department of Energy. This manual presents an overview of EDIS, describing the system's purpose; the functions it performs; hardware, software, and security requirements; and help and error functions. This manual describes how to access EDIS and how to operate system functions using Database 2 (DB2), Time Sharing Option (TSO), Interactive System Productivity Facility (ISPF), and Soft Master viewing features employed by this system. Appendix A contains a description of the Soft Master viewing capabilities provided through the EDIS View function. Appendix B provides examples of the system error screens and help screens for valid codes used for screen entry. Appendix C contains a dictionary of data elements and descriptions.

  13. Central American information system for energy planning

    SciTech Connect

    Fonseca, M.G.; Lyon, P.C.; Heskett, J.C.

    1991-04-01

    SICAPE (Sistema de Information Centroamericano para Planificacion Energetica) is an expandable information system designed for energy planning. Its objective is to satisfy ongoing information requirements by means of a menu-driven operational environment. SICAPE is as easily used by the novice computer user as by those with more experience. Moreover, the system is capable of evolving concurrently with the future requirements of the individual country. The expansion is accomplished by menu restructuring as data and user requirements change; the new menu configurations require no programming effort. The use and modification of SICAPE are separate menu-driven processes that allow for rapid data query, minimal training, and effortless continued growth. SICAPE's data are organized by country or region. Information is available in the following areas: energy balance, macroeconomics, electricity generation capacity, and electricity and petroleum product pricing. (JF)

  14. Keeping Signals Straight: How Cells Process Information and Make Decisions

    PubMed Central

    Laub, Michael T.

    2016-01-01

    As we become increasingly dependent on electronic information-processing systems at home and work, it’s easy to lose sight of the fact that our very survival depends on highly complex biological information-processing systems. Each of the trillions of cells that form the human body has the ability to detect and respond to a wide range of stimuli and inputs, using an extraordinary set of signaling proteins to process this information and make decisions accordingly. Indeed, cells in all organisms rely on these signaling proteins to survive and proliferate in unpredictable and sometimes rapidly changing environments. But how exactly do these proteins relay information within cells, and how do they keep a multitude of incoming signals straight? Here, I describe recent efforts to understand the fidelity of information flow inside cells. This work is providing fundamental insight into how cells function. Additionally, it may lead to the design of novel antibiotics that disrupt the signaling of pathogenic bacteria or it could help to guide the treatment of cancer, which often involves information-processing gone awry inside human cells. PMID:27427909

  16. An analytical approach to customer requirement information processing

    NASA Astrophysics Data System (ADS)

    Zhou, Zude; Xiao, Zheng; Liu, Quan; Ai, Qingsong

    2013-11-01

    'Customer requirements' (CRs) management is a key component of customer relationship management (CRM). By processing customer-focused information, CRs management plays an important role in enterprise systems (ESs). Although two main CRs analysis methods, quality function deployment (QFD) and Kano model, have been applied to many fields by many enterprises in the past several decades, the limitations such as complex processes and operations make them unsuitable for online businesses among small- and medium-sized enterprises (SMEs). Currently, most SMEs do not have the resources to implement QFD or Kano model. In this article, we propose a method named customer requirement information (CRI), which provides a simpler and easier way for SMEs to run CRs analysis. The proposed method analyses CRs from the perspective of information and applies mathematical methods to the analysis process. A detailed description of CRI's acquisition, classification and processing is provided.

  17. Thermodynamics of information processing based on enzyme kinetics: An exactly solvable model of an information pump

    NASA Astrophysics Data System (ADS)

    Cao, Yuansheng; Gong, Zongping; Quan, H. T.

    2015-06-01

    Motivated by the recently proposed models of the information engine [Proc. Natl. Acad. Sci. USA 109, 11641 (2012), 10.1073/pnas.1204263109] and the information refrigerator [Phys. Rev. Lett. 111, 030602 (2013), 10.1103/PhysRevLett.111.030602], we propose a minimal model of the information pump and the information eraser based on enzyme kinetics. This device can either pump molecules against the chemical potential gradient by consuming the information to be encoded in the bit stream or (partially) erase the information initially encoded in the bit stream by consuming the Gibbs free energy. The dynamics of this model is solved exactly, and the "phase diagram" of the operation regimes is determined. The efficiency and the power of the information machine are analyzed. The validity of the second law of thermodynamics within our model is clarified. Our model offers a simple paradigm for investigating the thermodynamics of information processing involving the chemical potential in small systems.
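
    For orientation, the second-law constraint at issue is usually written, in the standard information-thermodynamics form (a general bound from the literature, for example the Sagawa-Ueda inequality, not a result quoted from this paper), as

      \langle W_{\mathrm{ext}} \rangle \;\le\; -\Delta F + k_{B} T\, I,

    so a device that consumes mutual information I (in nats) can extract at most k_B T I of work, or perform at most that much chemical work when pumping against a chemical potential difference; conversely, erasing one nat of information dissipates at least k_B T (Landauer's bound).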

  18. Air Force geographic information and analysis system

    SciTech Connect

    Henney, D.A.; Jansing, D.S.; Durfee, R.C.; Margle, S.M.; Till, L.E.

    1987-01-01

    A microcomputer-based geographic information and analysis system (GIAS) was developed to assist Air Force planners with environmental analysis, natural resources management, and facility and land-use planning. The system processes raster image data, topological data structures, and geometric or vector data similar to that produced by computer-aided design and drafting (CADD) systems, integrating the data where appropriate. Data types included Landsat imagery, scanned images of base maps, digitized point and chain features, topographic elevation data, USGS stream course data, highway networks, railroad networks, and land use/land cover information from USGS interpreted aerial photography. The system is also being developed to provide an integrated display and analysis capability with base maps and facility data bases prepared on CADD systems. 3 refs.

  19. [Information system in the cardio polyclinic].

    PubMed

    Mihajlović, Marina; Zivković, Marija

    2014-03-01

    The cardiologic polyclinic information system ensures effective management of business processes in the polyclinic. The medical nurse provides health care to a patient with the support of the information system, which enables recording of the patient's identity, admission, participation fee charges, billing for the services provided, patients' orders for noninvasive diagnostic methods, and implementation of diagnostic methods. The nurse enters the patient's personal information at every work station, updates the existing records, and has the opportunity to add notes and insights to the results of the patient's diagnostic tests and to doctors' opinions for patients in the polyclinic. Additionally, the nurse records the services and supplies provided, and these entries are used for billing and service charges. This information is accessible at every work station, to authorized persons exclusively. The implementation of the information system enables medical nurses working at the reception desk and in the nurses' consulting room to record administrative data and data related to diagnostic analysis at the moment and at the place they happen. A personal password is required to access these data. In this way, patient admission recording is facilitated, communication with the patient is improved in case he/she needs to be contacted, and, finally, writing reports and analyzing data are simplified. Apart from these advantages, there are also problems, such as inadequate staff education and insufficient reliability of the information infrastructure, which, if overloaded, can slow down the system; this is time consuming for both health workers and patients.

  20. [A medical consumable material management information system].

    PubMed

    Tang, Guoping; Hu, Liang

    2014-05-01

    Medical consumable materials are essential supplies for carrying out medical work; they come in a wide range of varieties and are used in large quantities. How to manage them feasibly and efficiently has been a topic of wide concern. This article discusses how to design a medical consumable material management information system with a set of standardized processes that brings together the medical supplies administrator, suppliers, and clinical departments. An advanced management mode, enterprise resource planning (ERP), is applied throughout the system design process. PMID:25241525

  1. [Development of Hospital Equipment Maintenance Information System].

    PubMed

    Zhou, Zhixin

    2015-11-01

    A hospital equipment maintenance information system plays an important role in improving the quality and efficiency of medical treatment. Based on a requirements analysis of hospital equipment maintenance, the system function diagram is drawn. From an analysis of the input and output data, and of the tables and reports connected with the equipment maintenance process, the relationships between entities and attributes are identified, the E-R diagram is drawn, and the relational database tables are established. The software development should meet the actual process requirements of maintenance and provide a friendly user interface and flexible operation. The software can also analyze failure causes through statistical analysis. PMID:27066680

  3. Effects of noise upon human information processing

    NASA Technical Reports Server (NTRS)

    Cohen, H. H.; Conrad, D. W.; Obrien, J. F.; Pearson, R. G.

    1974-01-01

    Studies of noise effects upon human information processing are described which investigated whether or not effects of noise upon performance are dependent upon specific characteristics of noise stimulation and their interaction with task conditions. The difficulty of predicting noise effects was emphasized. Arousal theory was considered to have explanatory value in interpreting the findings of all the studies. Performance under noise was found to involve a psychophysiological cost, measured by vasoconstriction response, with the degree of response cost being related to scores on a noise annoyance sensitivity scale. Noise sensitive subjects showed a greater autonomic response under noise stimulation.

  4. Image and information management system

    NASA Technical Reports Server (NTRS)

    Robertson, Tina L. (Inventor); Raney, Michael C. (Inventor); Dougherty, Dennis M. (Inventor); Kent, Peter C. (Inventor); Brucker, Russell X. (Inventor); Lampert, Daryl A. (Inventor)

    2007-01-01

    A system and methods through which pictorial views of an object's configuration, arranged in a hierarchical fashion, are navigated by a person to establish a visual context within the configuration. The visual context is automatically translated by the system into a set of search parameters driving retrieval of structured data and content (images, documents, multimedia, etc.) associated with the specific context. The system places hot spots, or actionable regions, on various portions of the pictorials representing the object. When a user interacts with an actionable region, a more detailed pictorial from the hierarchy is presented representing that portion of the object, along with real-time feedback in the form of a popup pane containing information about that region, and counts-by-type reflecting the number of items that are available within the system associated with the specific context and search filters established at that point in time.
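
    A schematic of the data structure implied by this description might look like the Python sketch below. It is not the patented system: the Pictorial class, the hot-spot mapping, the filter dictionaries, and the toy content index are assumptions, meant only to show how navigating hot spots accumulates a visual context that drives retrieval and counts-by-type.

      from dataclasses import dataclass, field

      @dataclass
      class Pictorial:
          name: str
          filters: dict = field(default_factory=dict)    # context contributed by this view
          hotspots: dict = field(default_factory=dict)   # region name -> child Pictorial

      # Hypothetical content index: each item tagged with a type and context attributes
      CONTENT = [
          {"type": "image",    "zone": "wing", "part": "flap"},
          {"type": "document", "zone": "wing", "part": "flap"},
          {"type": "document", "zone": "wing", "part": "aileron"},
      ]

      def navigate(root, path):
          """Follow hot spots along `path`, merging each view's filters into the context."""
          context, node = {}, root
          context.update(node.filters)
          for region in path:
              node = node.hotspots[region]
              context.update(node.filters)
          return node, context

      def counts_by_type(context):
          hits = [c for c in CONTENT if all(c.get(k) == v for k, v in context.items())]
          out = {}
          for c in hits:
              out[c["type"]] = out.get(c["type"], 0) + 1
          return out

      flap = Pictorial("flap detail", {"part": "flap"})
      wing = Pictorial("wing view", {"zone": "wing"}, {"flap": flap})
      aircraft = Pictorial("aircraft overview", {}, {"wing": wing})

      node, ctx = navigate(aircraft, ["wing", "flap"])
      print(node.name, ctx, counts_by_type(ctx))
      # flap detail {'zone': 'wing', 'part': 'flap'} {'image': 1, 'document': 1}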

  5. Image and information management system

    NASA Technical Reports Server (NTRS)

    Robertson, Tina L. (Inventor); Raney, Michael C. (Inventor); Dougherty, Dennis M. (Inventor); Kent, Peter C. (Inventor); Brucker, Russell X. (Inventor); Lampert, Daryl A. (Inventor)

    2009-01-01

    A system and methods through which pictorial views of an object's configuration, arranged in a hierarchical fashion, are navigated by a person to establish a visual context within the configuration. The visual context is automatically translated by the system into a set of search parameters driving retrieval of structured data and content (images, documents, multimedia, etc.) associated with the specific context. The system places ''hot spots'', or actionable regions, on various portions of the pictorials representing the object. When a user interacts with an actionable region, a more detailed pictorial from the hierarchy is presented representing that portion of the object, along with real-time feedback in the form of a popup pane containing information about that region, and counts-by-type reflecting the number of items that are available within the system associated with the specific context and search filters established at that point in time.

  6. Applied Information Systems Research Program Workshop

    NASA Technical Reports Server (NTRS)

    Bredekamp, Joe

    1991-01-01

    Viewgraphs on Applied Information Systems Research Program Workshop are presented. Topics covered include: the Earth Observing System Data and Information System; the planetary data system; Astrophysics Data System project review; OAET Computer Science and Data Systems Programs; the Center of Excellence in Space Data and Information Sciences; and CASIS background.

  7. Information Management System for Site Remediation Efforts.

    PubMed

    Laha; Mukherjee; Nebhrajani

    2000-05-01

    Environmental regulatory agencies are responsible for protecting human health and the environment in their constituencies. Their responsibilities include the identification, evaluation, and cleanup of contaminated sites. Leaking underground storage tanks (USTs) constitute a major source of subsurface and groundwater contamination. A significant portion of a regulatory body's efforts may be directed toward the management of UST-contaminated sites. In order to manage remedial sites effectively, vast quantities of information must be maintained, including analytical data on chemical contaminants, remedial design features, and performance details. Currently, most regulatory agencies maintain such information manually. This makes it difficult to manage the data effectively. Some agencies have introduced automated record-keeping systems. However, the ad hoc approach in these endeavors makes it difficult to efficiently analyze, disseminate, and utilize the data. This paper identifies the information requirements for UST-contaminated site management at the Waste Cleanup Section of the Department of Environmental Resources Management in Dade County, Florida. It presents a viable design for an information management system to meet these requirements. The proposed solution is based on a back-end relational database management system with relevant tools for sophisticated data analysis and data mining. The database is designed with all tables in the third normal form to ensure data integrity, flexible access, and efficient query processing. In addition to all standard reports required by the agency, the system provides answers to ad hoc queries that are typically difficult to answer under the existing system. The database also serves as a repository of information for a decision support system to aid engineering design and risk analysis. The system may be integrated with a geographic information system for effective presentation and dissemination of spatial data.
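
    A hypothetical schema sketch (not the Dade County system; the table and column names and the action level are invented) shows what third-normal-form storage of UST sampling results and an ad hoc exceedance query could look like, here using Python's built-in sqlite3 module:

      import sqlite3

      con = sqlite3.connect(":memory:")
      con.executescript("""
      CREATE TABLE site (
          site_id   INTEGER PRIMARY KEY,
          name      TEXT NOT NULL,
          address   TEXT
      );
      CREATE TABLE contaminant (
          contaminant_id   INTEGER PRIMARY KEY,
          name             TEXT NOT NULL,
          action_level_ppb REAL
      );
      CREATE TABLE sample_result (            -- one row per analysis; no repeating groups
          result_id      INTEGER PRIMARY KEY,
          site_id        INTEGER NOT NULL REFERENCES site(site_id),
          contaminant_id INTEGER NOT NULL REFERENCES contaminant(contaminant_id),
          sampled_on     TEXT NOT NULL,
          value_ppb      REAL NOT NULL
      );
      """)
      con.execute("INSERT INTO site VALUES (1, 'Station 12', '100 Main St')")
      con.execute("INSERT INTO contaminant VALUES (1, 'benzene', 1.0)")
      con.executemany("INSERT INTO sample_result VALUES (NULL, 1, 1, ?, ?)",
                      [("1999-03-01", 4.2), ("1999-09-01", 0.6)])

      # Ad hoc query: results exceeding the contaminant's action level
      rows = con.execute("""
          SELECT s.name, c.name, r.sampled_on, r.value_ppb
          FROM sample_result r
          JOIN site s ON s.site_id = r.site_id
          JOIN contaminant c ON c.contaminant_id = r.contaminant_id
          WHERE r.value_ppb > c.action_level_ppb
      """).fetchall()
      print(rows)   # [('Station 12', 'benzene', '1999-03-01', 4.2)]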

  8. The Process of Systemic Change

    ERIC Educational Resources Information Center

    Duffy, Francis M.; Reigeluth, Charles M.; Solomon, Monica; Caine, Geoffrey; Carr-Chellman, Alison A.; Almeida, Luis; Frick, Theodore; Thompson, Kenneth; Koh, Joyce; Ryan, Christopher D.; DeMars, Shane

    2006-01-01

    This paper presents several brief papers about the process of systemic change. These are: (1) Step-Up-To-Excellence: A Protocol for Navigating Whole-System Change in School Districts by Francis M. Duffy; (2) The Guidance System for Transforming Education by Charles M. Reigeluth; (3) The Schlechty Center For Leadership In School Reform by Monica…

  9. Large-Scale Information Systems

    SciTech Connect

    D. M. Nicol; H. R. Ammerlahn; M. E. Goldsby; M. M. Johnson; D. E. Rhodes; A. S. Yoshimura

    2000-12-01

    Large enterprises are ever more dependent on their Large-Scale Information Systems (LSIS), computer systems that are distinguished architecturally by distributed components--data sources, networks, computing engines, simulations, human-in-the-loop control and remote access stations. These systems provide such capabilities as workflow, data fusion and distributed database access. The Nuclear Weapons Complex (NWC) contains many examples of LSIS components, a fact that motivates this research. However, most LSIS in use grew up from collections of separate subsystems that were not designed to be components of an integrated system. For this reason, they are often difficult to analyze and control. The problem is made more difficult by the size of a typical system, its diversity of information sources, and the institutional complexities associated with its geographic distribution across the enterprise. Moreover, there is no integrated approach for analyzing or managing such systems. Indeed, integrated development of LSIS is an active area of academic research. This work developed such an approach by simulating the various components of the LSIS and allowing the simulated components to interact with real LSIS subsystems. This research demonstrated two benefits. First, applying it to a particular LSIS provided a thorough understanding of the interfaces between the system's components. Second, it demonstrated how more rapid and detailed answers could be obtained to questions significant to the enterprise by interacting with the relevant LSIS subsystems through simulated components designed with those questions in mind. In a final, added phase of the project, investigations were made on extending this research to wireless communication networks in support of telemetry applications.

  10. Quantum Information Processing using Scalable Techniques

    NASA Astrophysics Data System (ADS)

    Hanneke, D.; Bowler, R.; Jost, J. D.; Home, J. P.; Lin, Y.; Tan, T.-R.; Leibfried, D.; Wineland, D. J.

    2011-05-01

    We report progress towards improving our previous demonstrations that combined all the fundamental building blocks required for scalable quantum information processing using trapped atomic ions. Included elements are long-lived qubits; a laser-induced universal gate set; state initialization and readout; and information transport, including co-trapping a second ion species to reinitialize motion without qubit decoherence. Recent efforts have focused on reducing experimental overhead and increasing gate fidelity. Most of the experimental duty cycle was previously used for transport, separation, and recombination of ion chains as well as re-cooling of motional excitation. We have addressed these issues by developing and implementing an arbitrary waveform generator with an update rate far above the ions' motional frequencies. To reduce gate errors, we actively stabilize the position of several UV (313 nm) laser beams. We have also switched the two-qubit entangling gate to one that acts directly on 9Be+ hyperfine qubit states whose energy separation is magnetic-fluctuation insensitive. This work is supported by DARPA, NSA, ONR, IARPA, Sandia, and the NIST Quantum Information Program.

  11. A nursing information model process for interoperability.

    PubMed

    Chow, Marilyn; Beene, Murielle; O'Brien, Ann; Greim, Patricia; Cromwell, Tim; DuLong, Donna; Bedecarré, Diane

    2015-05-01

    The ability to share nursing data across organizations and electronic health records is a key component of improving care coordination and quality outcomes. Currently, substantial organizational and technical barriers limit the ability to share and compare essential patient data that inform nursing care. Nursing leaders at Kaiser Permanente and the U.S. Department of Veterans Affairs collaborated on the development of an evidence-based information model driven by nursing practice to enable data capture, re-use, and sharing between organizations and disparate electronic health records. This article describes a framework with repeatable steps and processes to enable the semantic interoperability of relevant and contextual nursing data. Hospital-acquired pressure ulcer prevention was selected as the prototype nurse-sensitive quality measure to develop and test the model. In a Health 2.0 Developer Challenge program from the Office of the National Coordinator for Health, mobile applications implemented the model to help nurses assess the risk of hospital-acquired pressure ulcers and reduce their severity. The common information model can be applied to other nurse-sensitive measures to enable data standardization supporting patient transitions between care settings, quality reporting, and research.

  12. The Co-Creation of Information Systems

    ERIC Educational Resources Information Center

    Gomillion, David

    2013-01-01

    In information systems development, end-users have shifted in their role: from consumers of information to informants for requirements to developers of systems. This shift in the role of users has also changed how information systems are developed. Instead of systems developers creating specifications for software or end-users creating small…

  13. [Hospital information system--project of implementation of SAP information system at Sveti Duh General Hospital].

    PubMed

    Pale, Ivica

    2005-01-01

    Nowadays, as medical and hospital institutions face a growing need for more efficient provision of healthcare services to patients, together with complete monitoring of the success of business activities, integrated information systems appear as the logical choice for supporting hospital business processes. The integrated business information system implemented at Sveti Duh General Hospital is a comprehensive system that supports all hospital, clinical, and administrative processes, while providing the basis for decision making regarding patients and hospital management. The system also enables transfer of data with specific medical business segments such as laboratory device management. The project for the implementation of the information system was realized in accordance with the requirements of the Ministry of Health, applying a proven methodology for the execution of such complex projects. The project team consisted of a number of consultants from b4b Co. of Zagreb, as well as Hospital employees. The new information system is completely ready to go live; however, the necessary decisions have to be made first. The application of the system gives the medical staff more time for their professional work with patients, and through long-term collection and analysis of data on symptoms, illnesses, and medical treatments, the information system becomes an important tool for improving health and the quality of the healthcare system in general. PMID:16095196

  15. Management Information and Library Management Systems: An Overview.

    ERIC Educational Resources Information Center

    Fisher, Shelagh; Rowley, Jennifer

    1994-01-01

    Provides an overview of the facilities for management information in library management systems. Highlights include the relationship between transaction processing systems, management information systems, and decision support systems; a review of previous work; enquiries and standard reports relating to library operations; report generators; and…

  16. TWRS information locator database system administrator's manual

    SciTech Connect

    Knutson, B.J., Westinghouse Hanford

    1996-09-13

    This document is a guide for use by the Tank Waste Remediation System (TWRS) Information Locator Database (ILD) System Administrator. The TWRS ILD System is an inventory of information used in the TWRS Systems Engineering process to represent the TWRS Technical Baseline. The inventory is maintained in the form of a relational database developed in Paradox 4.5.

  17. Information Security and Integrity Systems

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Viewgraphs from the Information Security and Integrity Systems seminar held at the University of Houston-Clear Lake on May 15-16, 1990 are presented. A tutorial on computer security is presented. The goals of this tutorial are the following: to review security requirements imposed by government and by common sense; to examine risk analysis methods to help keep sight of forest while in trees; to discuss the current hot topic of viruses (which will stay hot); to examine network security, now and in the next year to 30 years; to give a brief overview of encryption; to review protection methods in operating systems; to review database security problems; to review the Trusted Computer System Evaluation Criteria (Orange Book); to comment on formal verification methods; to consider new approaches (like intrusion detection and biometrics); to review the old, low tech, and still good solutions; and to give pointers to the literature and to where to get help. Other topics covered include security in software applications and development; risk management; trust: formal methods and associated techniques; secure distributed operating system and verification; trusted Ada; a conceptual model for supporting a B3+ dynamic multilevel security and integrity in the Ada runtime environment; and information intelligence sciences.

  18. Integrated Bibliographic Information System: Integrating Resources by Integrating Information Technologies.

    ERIC Educational Resources Information Center

    Cotter, Gladys A.; Hartt, Richard W.

    The Defense Technical Information Center (DTIC), an organization charged with providing information services to the Department of Defense (DoD) scientific and technical community, actively seeks ways to promote resource sharing as a means for speeding access to information while reducing the costs of information processing throughout the technical…

  19. Association with emotional information alters subsequent processing of neutral faces

    PubMed Central

    Riggs, Lily; Fujioka, Takako; Chan, Jessica; McQuiggan, Douglas A.; Anderson, Adam K.; Ryan, Jennifer D.

    2014-01-01

    The processing of emotional as compared to neutral information is associated with different patterns in eye movement and neural activity. However, the ‘emotionality’ of a stimulus can be conveyed not only by its physical properties, but also by the information that is presented with it. There is very limited work examining how emotional information may influence the immediate perceptual processing of otherwise neutral information. We examined how presenting an emotion label for a neutral face may influence subsequent processing by using eye movement monitoring (EMM) and magnetoencephalography (MEG) simultaneously. Participants viewed a series of faces with neutral expressions. Each face was followed by a unique negative or neutral sentence to describe that person, and then the same face was presented in isolation again. Viewing of faces paired with a negative sentence was associated with increased early viewing of the eye region and increased neural activity between 600 and 1200 ms in emotion processing regions such as the cingulate, medial prefrontal cortex, and amygdala, as well as posterior regions such as the precuneus and occipital cortex. Viewing of faces paired with a neutral sentence was associated with increased activity in the parahippocampal gyrus during the same time window. By monitoring behavior and neural activity within the same paradigm, these findings demonstrate that emotional information alters subsequent visual scanning and the neural systems that are presumably invoked to maintain a representation of the neutral information along with its emotional details. PMID:25566024

  20. Nuclear Criticality Information System. Database examples

    SciTech Connect

    Foret, C.A.

    1984-06-01

    The purpose of this publication is to provide our users with a guide to using the Nuclear Criticality Information System (NCIS). It is comprised of an introduction, an information and resources section, a how-to-use section, and several useful appendices. The main objective of this report is to present a clear picture of the NCIS project and its available resources as well as assisting our users in accessing the database and using the TIS computer to process data. The introduction gives a brief description of the NCIS project, the Technology Information System (TIS), online user information, future plans and lists individuals to contact for additional information about the NCIS project. The information and resources section outlines the NCIS database and describes the resources that are available. The how-to-use section illustrates access to the NCIS database as well as searching datafiles for general or specific data. It also shows how to access and read the NCIS news section as well as connecting to other information centers through the TIS computer.