Science.gov

Sample records for computer based information

  1. Computer-Based Information Networks: Selected Examples.

    ERIC Educational Resources Information Center

    Hardesty, Larry

    The history, purpose, and operation of six computer-based information networks are described in general and nontechnical terms. In the introduction the many definitions of an information network are explored. Ohio College Library Center's network (OCLC) is the first example. OCLC began in 1963, and since early 1973 has been extending its services…

  3. Secure information transfer based on computing reservoir

    NASA Astrophysics Data System (ADS)

    Szmoski, R. M.; Ferrari, F. A. S.; de S. Pinto, S. E.; Baptista, M. S.; Viana, R. L.

    2013-04-01

    There is a broad area of research devoted to ensuring that information is transmitted securely. Within this scope, chaos-based cryptography takes a prominent role due to its nonlinear properties. Using these properties, we propose a secure mechanism for transmitting data that relies on chaotic networks. We use a nonlinear on-off device to cipher the message, and the transfer entropy to retrieve it. We analyze the system's capability for sending messages, and we obtain expressions for the operating time. We demonstrate the system's efficiency for a wide range of parameters. We find similarities between our method and reservoir computing.
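
    The recovery step above hinges on transfer entropy, a directed measure of information flow from one time series to another. Below is a minimal plug-in estimator for binary sequences, included only to illustrate the quantity involved; it is not the authors' implementation, and the toy data, parameter choices, and function names are assumptions.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of T_{X->Y} (bits) for two equal-length binary sequences."""
    x, y = np.asarray(x), np.asarray(y)
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_{t+1}, y_t)
    singles = Counter(y[:-1])                       # y_t
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]            # p(y_{t+1} | y_t, x_t)
        p_cond_hist = pairs_yy[(y1, y0)] / singles[y0]  # p(y_{t+1} | y_t)
        te += p_joint * np.log2(p_cond_full / p_cond_hist)
    return te

# Example: y copies x with one step of delay, so information flows from x to y.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 10000)
y = np.roll(x, 1)
print(transfer_entropy(x, y))   # close to 1 bit
print(transfer_entropy(y, x))   # close to 0 bits
```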

  4. Computer-based manuals for procedural information

    NASA Technical Reports Server (NTRS)

    Rouse, S. H.; Rouse, W. B.

    1980-01-01

    Display of procedural information as found in aircraft operating manuals is discussed. The problem of converting hardcopy manuals to a computer-based presentation is considered. The trade-off of faster retrieval and display integration possible with a cathode-ray tube (CRT) versus the limited size of a CRT is emphasized. Nine subjects participated in an experimental study of the effectiveness of three alternative displays. Displays were evaluated for the task of retrieving and carrying out emergency procedures in an environment where task interruptions were prevalent. It was found that an on-line manual which provided considerable user assistance was superior to a hardcopy manual in terms of both task completion time and errors. However, an on-line manual without user assistance was inferior to a hardcopy manual in terms of errors.

  5. Computer Based Information Systems and the Middle Manager.

    DTIC Science & Technology

    Why do some computer based information systems succeed while others fail? It concludes with eleven recommended areas that middle management must...understand in order to effectively use computer based information systems. (Modified author abstract)

  6. Computer-Based Assessments. Information Capsule. Volume 0918

    ERIC Educational Resources Information Center

    Blazer, Christie

    2010-01-01

    This Information Capsule reviews research conducted on computer-based assessments. Advantages and disadvantages associated with computer-based testing programs are summarized and research on the comparability of computer-based and paper-and-pencil assessments is reviewed. Overall, studies suggest that for most students, there are few if any…

  7. Handbook of Standards for Computer-Based Career Information Systems.

    ERIC Educational Resources Information Center

    Association of Computer-Based Systems for Career Information, Eugene, OR. Clearinghouse.

    This document presents standards for computer-based career information systems developed by the Association of Computer-Based Systems for Career Information (ACSCI). The adoption of ACSCI standards constitutes a voluntary means for organizations to declare that they subscribe to certain quality measures. These standards can be used to: (1) foster…

  8. Community readiness for a computer-based health information network.

    PubMed

    Ervin, Naomi E; Berry, Michelle M

    2006-01-01

    The need for timely and accurate communication among healthcare providers has prompted the development of computer-based health information networks that allow patient and client information to be shared among agencies. This article reports the findings of a study to assess whether residents of an upstate New York community were ready for a computer-based health information network to facilitate delivery of long term care services. Focus group sessions, which involved both consumers and professionals, revealed that security of personal information was of concern to healthcare providers, attorneys, and consumers. Physicians were the most enthusiastic about the possibility of a computer-based health information network. Consumers and other healthcare professionals, including nurses, indicated that such a network would be helpful to them personally. Nurses and other healthcare professionals need to be knowledgeable about the use of computer-based health information networks and other electronic information systems as this trend continues to spread across the U.S.

  9. Randomised trial of personalised computer based information for cancer patients

    PubMed Central

    Jones, Ray; Pearson, Janne; McGregor, Sandra; Cawsey, Alison J; Barrett, Ann; Craig, Neil; Atkinson, Jacqueline M; Gilmour, W Harper; McEwen, Jim

    1999-01-01

    Objective To compare the use and effect of a computer based information system for cancer patients that is personalised using each patient's medical record with a system providing only general information and with information provided in booklets. Design Randomised trial with three groups. Data collected at start of radiotherapy, one week later (when information provided), three weeks later, and three months later. Participants 525 patients started radical radiotherapy; 438 completed follow up. Interventions Two groups were offered information via computer (personalised or general information, or both) with open access to computer thereafter; the third group was offered a selection of information booklets. Outcomes Patients' views and preferences, use of computer and information, and psychological status; doctors' perceptions; cost of interventions. Results More patients offered the personalised information said that they had learnt something new, thought the information was relevant, used the computer again, and showed their computer printouts to others. There were no major differences in doctors' perceptions of patients. More of the general computer group were anxious at three months. With an electronic patient record system, in the long run the personalised information system would cost no more than the general system. Full access to booklets cost twice as much as the general system. Conclusions Patients preferred computer systems that provided information from their medical records to systems that just provided general information. This has implications for the design and implementation of electronic patient record systems and reliance on general sources of patient information. PMID:10550090

  10. Changing Nature of Computer-Based Information Systems.

    ERIC Educational Resources Information Center

    Brinkman, Paul T.

    1984-01-01

    Computer-based information systems have evolved from emphasizing data processing to providing full and flexible support for management. They have moved from providing mere data to providing a medium for representing knowledge wherein managers can analyze data, formulate ideas, structure arguments, and build models. (Author/MLW)

  11. INFORMATION DISPLAY: CONSIDERATIONS FOR DESIGNING COMPUTER-BASED DISPLAY SYSTEMS.

    SciTech Connect

    O'HARA,J.M.; PIRUS,D.; BELTRATCCHI,L.

    2004-09-19

    This paper discusses the presentation of information in computer-based control rooms. Issues associated with the typical displays currently in use are discussed. It is concluded that these displays should be augmented with new displays designed to better meet the information needs of plant personnel and to minimize the need for interface management tasks (the activities personnel have to do to access and organize the information they need). Several approaches to information design are discussed, specifically addressing: (1) monitoring, detection, and situation assessment; (2) routine task performance; and (3) teamwork, crew coordination, and collaborative work.

  12. Symmetrically private information retrieval based on blind quantum computing

    NASA Astrophysics Data System (ADS)

    Sun, Zhiwei; Yu, Jianping; Wang, Ping; Xu, Lingling

    2015-05-01

    Universal blind quantum computation (UBQC) is a new secure quantum computing protocol which allows a user Alice who does not have any sophisticated quantum technology to delegate her computing to a server Bob without leaking any privacy. Using the features of UBQC, we propose a protocol to achieve symmetrically private information retrieval, which allows a quantum limited Alice to query an item from Bob with a fully fledged quantum computer; meanwhile, the privacy of both parties is preserved. The security of our protocol is based on the assumption that malicious Alice has no quantum computer, which avoids the impossibility proof of Lo. For the honest Alice, she is almost classical and only requires minimal quantum resources to carry out the proposed protocol. Therefore, she does not need any expensive laboratory which can maintain the coherence of complicated quantum experimental setups.

  13. A Spread Willingness Computing-Based Information Dissemination Model

    PubMed Central

    Cui, Zhiming; Zhang, Shukui

    2014-01-01

    This paper constructs a spread-willingness-computing-based information dissemination model for social networks. The model takes into account the impact of node degree and the dissemination mechanism, combines complex network theory with the dynamics of infectious diseases, and establishes the dynamical evolution equations. The equations characterize the evolution of the different types of nodes over time. The spread willingness computation contains three factors that affect a user's spread behavior: the strength of the relationship between nodes, identity of views, and frequency of contact. Simulation results show that nodes of different degrees show the same trend in the network, and even if the degree of a node is very small, a large area of information dissemination is still likely. The weaker the relationship between nodes, the higher the probability of view selection and the higher the frequency of contact with information, so that information spreads rapidly and leads to a wide range of dissemination. As the dissemination probability and immune probability change, the speed of information dissemination changes accordingly. The findings match the features of social networks and can help in mastering user behavior and in understanding and analyzing the characteristics of information dissemination in social networks. PMID:25110738
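
    To make the modelling idea above concrete, the following sketch simulates SIR-style dissemination on a random contact network in which each edge carries a hypothetical spread-willingness score combining relationship strength, view identity, and contact frequency. It is a rough illustration under assumed parameters, not the paper's evolution equations; every name and number here is illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 500
# Random contact network (Erdos-Renyi style adjacency matrix).
adj = rng.random((N, N)) < 0.02
adj = np.triu(adj, 1)
adj = adj | adj.T

# Hypothetical per-edge spread-willingness factors: relationship strength,
# view identity, and contact frequency -- each drawn uniformly here.
strength = rng.random((N, N))
identity = rng.random((N, N))
frequency = rng.random((N, N))
willingness = (strength + identity + frequency) / 3.0   # simple average, an assumption

beta, mu = 0.3, 0.1                     # base spread and immunization probabilities
state = np.zeros(N, dtype=int)          # 0 susceptible, 1 spreading, 2 immune
state[rng.choice(N, 5, replace=False)] = 1

history = []
for t in range(60):
    spreading = np.where(state == 1)[0]
    for i in spreading:
        neighbors = np.where(adj[i] & (state == 0))[0]
        # spread probability scaled by the edge's willingness score
        infect = rng.random(len(neighbors)) < beta * willingness[i, neighbors]
        state[neighbors[infect]] = 1
    # Spreaders stop forwarding (become immune) with probability mu.
    recover = (state == 1) & (rng.random(N) < mu)
    state[recover] = 2
    history.append((state == 1).sum())

print("peak number of spreaders:", max(history))
```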

  14. A computer-based information system for epilepsy and electroencephalography.

    PubMed

    Finnerup, N B; Fuglsang-Frederiksen, A; Røssel, P; Jennum, P

    1999-08-01

    This paper describes a standardised computer-based information system for electroencephalography (EEG) focusing on epilepsy. The system was developed using a prototyping approach. It is based on international recommendations for EEG examination, interpretation and terminology, international guidelines for epidemiological studies on epilepsy and classification of epileptic seizures and syndromes and international classification of diseases. It is divided into: (1) clinical information and epilepsy relevant data; and (2) EEG data, which is hierarchically structured including description and interpretation of EEG. Data is coded but is supplemented with unrestricted text. The resulting patient database can be integrated with other clinical databases and with the patient record system and may facilitate clinical and epidemiological research and development of standards and guidelines for EEG description and interpretation. The system is currently used for teleconsultation between Gentofte and Lisbon.

  15. A Comprehensive Computer-Based Medical Information System

    PubMed Central

    David, Sidney S.

    1977-01-01

    A computer-based medical information system has been developed for patient care and clinical investigation. It is implemented on a large digital computer and employs techniques consistent with general purpose commercially available data management systems. It has been in operation since 1971 and contains the records of approximately 1600 patients. Incoming data are received from patients and clinical staff utilizing specialized forms. A wide diversity of output, including summaries, searches, and statistics, is provided. The system enhances the quality of care provided to patients, optimizes physician time spent on clinical management, improves many aspects of the supporting research, and is applicable to other areas of medicine.

  16. A Cloud Computing Based Patient Centric Medical Information System

    NASA Astrophysics Data System (ADS)

    Agarwal, Ankur; Henehan, Nathan; Somashekarappa, Vivek; Pandya, A. S.; Kalva, Hari; Furht, Borko

    This chapter discusses an emerging concept of a cloud computing based Patient Centric Medical Information System framework that will allow various authorized users to securely access patient records from various Care Delivery Organizations (CDOs) such as hospitals, urgent care centers, doctors, laboratories, imaging centers among others, from any location. Such a system must seamlessly integrate all patient records including images such as CT scans and MRIs which can easily be accessed from any location and reviewed by any authorized user. In such a scenario the storage and transmission of medical records will have to be conducted in a totally secure and safe environment with a very high standard of data integrity, protecting patient privacy and complying with all Health Insurance Portability and Accountability Act (HIPAA) regulations.

  17. THE FUTURE OF COMPUTER-BASED TOXICITY PREDICTION: MECHANISM-BASED MODELS VS. INFORMATION MINING APPROACHES

    EPA Science Inventory


    The Future of Computer-Based Toxicity Prediction:
    Mechanism-Based Models vs. Information Mining Approaches

    When we speak of computer-based toxicity prediction, we are generally referring to a broad array of approaches which rely primarily upon chemical structure ...

  19. 18 CFR 3b.204 - Safeguarding information in manual and computer-based record systems.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... information in manual and computer-based record systems. 3b.204 Section 3b.204 Conservation of Power and Water... Collection of Records § 3b.204 Safeguarding information in manual and computer-based record systems. (a) The administrative and physical controls to protect the information in the manual and computer-based record systems...

  20. 18 CFR 3b.204 - Safeguarding information in manual and computer-based record systems.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... information in manual and computer-based record systems. 3b.204 Section 3b.204 Conservation of Power and Water... Collection of Records § 3b.204 Safeguarding information in manual and computer-based record systems. (a) The administrative and physical controls to protect the information in the manual and computer-based record systems...

  1. 18 CFR 3b.204 - Safeguarding information in manual and computer-based record systems.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... information in manual and computer-based record systems. 3b.204 Section 3b.204 Conservation of Power and Water... Collection of Records § 3b.204 Safeguarding information in manual and computer-based record systems. (a) The administrative and physical controls to protect the information in the manual and computer-based record systems...

  2. 18 CFR 3b.204 - Safeguarding information in manual and computer-based record systems.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... information in manual and computer-based record systems. 3b.204 Section 3b.204 Conservation of Power and Water... Collection of Records § 3b.204 Safeguarding information in manual and computer-based record systems. (a) The administrative and physical controls to protect the information in the manual and computer-based record systems...

  3. 18 CFR 3b.204 - Safeguarding information in manual and computer-based record systems.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... information in manual and computer-based record systems. 3b.204 Section 3b.204 Conservation of Power and Water... Collection of Records § 3b.204 Safeguarding information in manual and computer-based record systems. (a) The administrative and physical controls to protect the information in the manual and computer-based record systems...

  4. Why Adolescents Use a Computer-Based Health Information System.

    ERIC Educational Resources Information Center

    Hawkins, Robert P.; And Others

    The Body Awareness Resource Network (BARN) is a system of interactive computer programs designed to provide adolescents with confidential, nonjudgmental health information, behavior change strategies, and sources of referral. These programs cover five adolescent health areas: alcohol and other drugs, human sexuality, smoking prevention and…

  5. Computer-Based Information Systems Education in the Arab World: A Survey.

    ERIC Educational Resources Information Center

    Noor, A. El Sayed

    1983-01-01

    This state-of-the-art review of computing sciences education in the Arab countries, including computer engineering, computer science, and computer-based management information systems, discusses factors contributing to the widening gap in computing expertise between the Arab countries and the developed countries. (EAO)

  6. Design and evaluation of an onboard computer-based information system for aircraft

    NASA Technical Reports Server (NTRS)

    Rouse, S. H.; Rouse, W. B.; Hammer, J. M.

    1982-01-01

    Information seeking by human operators of technical systems is considered. Types of information and forms of presentation are discussed and important issues reviewed. This broad discussion provides a framework within which flight management is considered. The design of an onboard computer-based information system for aircraft is discussed. The aiding possibilities of a computer-based system are emphasized. Results of an experimental evaluation of a prototype system are presented. It is concluded that a computer-based information system can substantially lessen the frequency of human errors.

  7. A Method for Rating Computer-Based Career Information Delivery Systems.

    ERIC Educational Resources Information Center

    Bloch, Deborah Perlmutter; Kinnison, Joyce Ford

    1989-01-01

    Developed a three-part rating system for computer-based career information delivery systems in the areas of comprehensiveness, accuracy, and effectiveness. Used system to rate five popular computer-based systems (C-LECT, CHOICES, CIS, DISCOVER, and GIS). Four systems were evaluated as being very similar, with CIS receiving highest scores.…

  8. INIS: A Computer-Based International Nuclear Information System.

    ERIC Educational Resources Information Center

    Balakrishnan, M. R.

    1986-01-01

    Description of the International Nuclear Information System includes its history, organizational structure, subject classification scheme, thesaurus, input standards, and various products and services generated by the system. Appendices provide a list of participating countries, subjects covered by the system, and a sample output record.…

  9. Information Retrieval: Presentation and Demonstration of an Interactive Computer-Based Search Program.

    ERIC Educational Resources Information Center

    Spuck, Dennis W.; And Others

    A symposium with four major presentations centering on the topic of computer-based information retrieval. Also highlighted are several features of the Wisconsin Information System for Education (WISE-ONE) and the Educational Resources Information Center (ERIC) system. The first paper in the series discusses the development, current capabilities…

  11. A Computer Based Biomedical Information System. I. Logic Foundation and Techniques

    ERIC Educational Resources Information Center

    Syner, James C.

    A digital computer based biomedical information system was designed to service the needs of physicians engaged in patient care and clinical research, and scientists engaged in laboratory research. The system embraces all functions of information processing which include information collection, storage, retrieval, analyses and display. The…

  12. An interdisciplinary computer-based information tool for palliative severe pain management.

    PubMed

    Kuziemsky, Craig E; Weber-Jahnke, Jens H; Lau, Francis; Downing, G Michael

    2008-01-01

    As patient care becomes more collaborative in nature, there is a need for information technology that supports interdisciplinary practices of care. This study developed and performed usability testing of a standalone computer-based information tool to support the interdisciplinary practice of palliative severe pain management (SPM). A grounded theory-participatory design (GT-PD) approach was used with three distinct palliative data sources to obtain and understand user requirements for SPM practice and how a computer-based information tool could be designed to support those requirements. The GT-PD concepts and categories provided a rich perspective of palliative SPM and the process and information support required for different SPM tasks. A conceptual framework consisting of an ontology and a set of three problem-solving methods was developed to reconcile the requirements of different interdisciplinary team members. The conceptual framework was then implemented as a prototype computer-based information tool that has different modes of use to support both day-to-day case management and education of palliative SPM. Usability testing of the computer tool was performed, and the tool tested favorably in a laboratory setting. An interdisciplinary computer-based information tool can be developed to support the different work practices and information needs of interdisciplinary team members, but explicit requirements must be sought from all prospective users of such a tool. Qualitative methods such as the hybrid GT-PD approach used in this research are particularly helpful for articulating computer tool design requirements.

  13. Facilitating Place-Based Learning in Outdoor Informal Environments with Mobile Computers

    ERIC Educational Resources Information Center

    Zimmerman, Heather Toomey; Land, Susan M.

    2014-01-01

    This paper advocates for place-based education to guide research and design for mobile computers used in outdoor informal environments (e.g., backyards, nature centers and parks). By bringing together research on place-based education with research on location awareness, we developed three design guidelines to support learners to develop robust…

  14. Intention and Usage of Computer Based Information Systems in Primary Health Centers

    ERIC Educational Resources Information Center

    Hosizah; Kuntoro; Basuki N., Hari

    2016-01-01

    The computer-based information system (CBIS) has been adopted in almost all health care settings, including primary health centers in East Java Province, Indonesia. Among the software available are SIMPUS, SIMPUSTRONIK, SIKDA Generik, and e-puskesmas. Unfortunately, most of the primary health centers did not implement them successfully. This…

  15. Computer-Based Storage and Retrieval of Geoscience Information: Bibliography 1970-72.

    ERIC Educational Resources Information Center

    Burk, C. F., Jr.

    The publication of papers describing activity in computer-based storage and retrieval of geoscience information has continued at a vigorous pace since release of the last bibliography, which covered the period 1946-69 (ED 076 203). A total of 211 references are identified, nearly all of which were published during the three-year period 1970-72…

  17. Attention Paid to Feedback Provided by a Computer-Based Assessment for Learning on Information Literacy

    ERIC Educational Resources Information Center

    Timmers, Caroline; Veldkamp, Bernard

    2011-01-01

    Three studies are presented on attention paid to feedback provided by a computer-based assessment for learning on information literacy. Results show that the attention paid to feedback varies greatly. In general the attention focuses on feedback of incorrectly answered questions. In each study approximately fifty percent of the respondents paid…

  18. Computer-Based Systems for Increasing Information Access to School Media Center Materials. Final Report.

    ERIC Educational Resources Information Center

    Hines, Theodore C.; And Others

    The project presented here explored the possibility of using computer-based systems to increase information access to non-text children's materials at the pre-school through elementary (6th grade) school levels. This final report includes an indicative summary as well as ten separate papers that describe a range of applications of proven computer…

  20. The Impact of a Computer Based Information System (CBIS) on Foreign Investments Opportunities.

    ERIC Educational Resources Information Center

    Goodwin, Chester

    The purpose of this paper is to analyze the impact that computer based information systems (CBIS) could have on U.S. multinational corporations operating in Canada, particularly in the province of Quebec, and the implications for the North American Free Trade Agreement (NAFTA) that went into effect on January 1, 1994. The study focused on how the…

  1. Optimal nonlinear information processing capacity in delay-based reservoir computers

    NASA Astrophysics Data System (ADS)

    Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo

    2015-09-01

    Reservoir computing is a recently introduced brain-inspired machine learning paradigm capable of excellent performance in the processing of empirical data. We focus on a particular kind of time-delay-based reservoir computer that has been physically implemented using optical and electronic systems and has shown unprecedented data processing rates. Reservoir computing is well known for the ease of its training scheme but also for the problematic sensitivity of its performance to architecture parameters. This article addresses the reservoir design problem, which remains the biggest challenge to the applicability of this information processing scheme. More specifically, we use the information available regarding the optimal reservoir working regimes to construct a functional link between the reservoir parameters and its performance. This function is used to explore various properties of the device and to choose the optimal reservoir architecture, thus replacing the tedious and time-consuming parameter scans used so far in the literature.
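
    For readers unfamiliar with the architecture, the sketch below shows the basic shape of a time-multiplexed delay reservoir: a single nonlinear node whose delayed state is sampled at a set of virtual nodes, with a linear ridge-regression readout trained on the collected states. It is a simplified software sketch under assumed parameter values, not the optical or electronic implementations discussed in the article, and it omits the inter-node coupling that physical delay systems exhibit.

```python
import numpy as np

rng = np.random.default_rng(2)

def delay_reservoir(u, n_virtual=50, eta=0.5, gamma=0.05):
    """Time-multiplexed delay reservoir: one nonlinear node, n_virtual virtual nodes."""
    mask = rng.uniform(-1, 1, n_virtual)          # random input mask over the delay line
    states = np.zeros((len(u), n_virtual))
    x = np.zeros(n_virtual)                       # virtual-node states (the delay line)
    for t, ut in enumerate(u):
        prev = x.copy()
        for k in range(n_virtual):
            # each virtual node is driven by its delayed state and the masked input
            x[k] = np.tanh(eta * prev[k] + gamma * mask[k] * ut)
        states[t] = x
    return states

# Toy task: one-step-ahead prediction of a noisy sine wave with a ridge readout.
T = 1000
u = np.sin(0.2 * np.arange(T)) + 0.05 * rng.standard_normal(T)
X = delay_reservoir(u[:-1])
y = u[1:]
lam = 1e-6
W = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)   # ridge regression readout
print("training NMSE:", np.mean((X @ W - y) ** 2) / np.var(y))
```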

  4. A Computer-Based System Integrating Instruction and Information Retrieval: A Description of Some Methodological Considerations.

    ERIC Educational Resources Information Center

    Selig, Judith A.; And Others

    This report, summarizing the activities of the Vision Information Center (VIC) in the field of computer-assisted instruction from December, 1966 to August, 1967, describes the methodology used to load a large body of information--a programmed text on basic ophthalmology--onto a computer for subsequent information retrieval and computer-assisted…

  5. A new computational strategy for identifying essential proteins based on network topological properties and biological information.

    PubMed

    Qin, Chao; Sun, Yongqi; Dong, Yadong

    2017-01-01

    Essential proteins are the proteins that are indispensable to the survival and development of an organism. Deleting a single essential protein will cause lethality or infertility. Identifying and analysing essential proteins are key to understanding the molecular mechanisms of living cells. There are two types of methods for predicting essential proteins: experimental methods, which require considerable time and resources, and computational methods, which overcome the shortcomings of experimental methods. However, the prediction accuracy of computational methods for essential proteins requires further improvement. In this paper, we propose a new computational strategy named CoTB for identifying essential proteins based on a combination of topological properties, subcellular localization information and orthologous protein information. First, we introduce several topological properties of the protein-protein interaction (PPI) network. Second, we propose new methods for measuring orthologous information and subcellular localization and a new computational strategy that uses a random forest prediction model to obtain a probability score for the proteins being essential. Finally, we conduct experiments on four different Saccharomyces cerevisiae datasets. The experimental results demonstrate that our strategy for identifying essential proteins outperforms traditional computational methods and the most recently developed method, SON. In particular, our strategy improves the prediction accuracy to 89, 78, 79, and 85 percent on the YDIP, YMIPS, YMBD and YHQ datasets at the top 100 level, respectively.
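
    The strategy described above combines topological and biological features in a random forest. The sketch below shows only that general shape on a synthetic network with made-up labels and placeholder "biological" scores; it is not the CoTB method, and the feature set, datasets, and scores are assumptions (networkx and scikit-learn are assumed to be available).

```python
import numpy as np
import networkx as nx
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)

# Synthetic PPI network and labels stand in for real YDIP/YMIPS-style datasets.
G = nx.erdos_renyi_graph(300, 0.03, seed=3)
essential = {n: rng.random() < 0.2 for n in G.nodes}   # hypothetical labels

degree = dict(G.degree())
clustering = nx.clustering(G)
betweenness = nx.betweenness_centrality(G)

def features(n):
    # topological properties plus placeholder "biological" scores
    ortho_score = rng.random()        # stand-in for orthology conservation
    subcell_score = rng.random()      # stand-in for subcellular localization
    return [degree[n], clustering[n], betweenness[n], ortho_score, subcell_score]

X = np.array([features(n) for n in G.nodes])
y = np.array([essential[n] for n in G.nodes])

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
# Rank proteins by predicted probability of being essential (top-100 style evaluation).
scores = clf.predict_proba(X)[:, 1]
top100 = np.argsort(scores)[::-1][:100]
print("predicted-essential fraction in top 100:", y[top100].mean())
```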

  6. Estimating the mutual information of an EEG-based Brain-Computer Interface.

    PubMed

    Schlögl, A; Neuper, C; Pfurtscheller, G

    2002-01-01

    An EEG-based Brain-Computer Interface (BCI) could be used as an additional communication channel between human thoughts and the environment. The efficacy of such a BCI depends mainly on the transmitted information rate. Shannon's communication theory was used to quantify the information rate of BCI data. For this purpose, experimental EEG data from four BCI experiments were analyzed off-line. Subjects imagined left and right hand movements during EEG recording from the sensorimotor area. Adaptive autoregressive (AAR) parameters were used as features of single-trial EEG and classified with linear discriminant analysis. The intra-trial variation as well as the inter-trial variability, the signal-to-noise ratio, the entropy of information, and the information rate were estimated. The entropy difference was used as a measure of the separability of two classes of EEG patterns.
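
    Two of the quantities mentioned above have simple closed forms under a Gaussian assumption: the Shannon information of a channel with a given signal-to-noise ratio, and the entropy difference between the pooled and class-conditional feature distributions used as a separability measure. The sketch below illustrates both on simulated feature values; it follows the general approach only and is not the authors' estimator, and the simulated features are assumptions.

```python
import numpy as np

def information_rate_from_snr(snr):
    """Shannon information (bits per sample) of a Gaussian channel with the given SNR."""
    return 0.5 * np.log2(1.0 + snr)

def entropy_difference(class1, class2):
    """Separability: entropy of the pooled data minus the mean within-class entropy,
    treating each distribution as univariate Gaussian (entropy = 0.5*log2(2*pi*e*var))."""
    h = lambda x: 0.5 * np.log2(2 * np.pi * np.e * np.var(x))
    pooled = np.concatenate([class1, class2])
    return h(pooled) - 0.5 * (h(class1) + h(class2))

rng = np.random.default_rng(4)
left = rng.normal(-1.0, 1.0, 500)     # e.g., an AAR-derived feature during left-hand imagery
right = rng.normal(+1.0, 1.0, 500)    # the same feature during right-hand imagery
print("entropy difference (bits):", entropy_difference(left, right))   # about 0.5 bits here
print("information at SNR = 1:", information_rate_from_snr(1.0), "bits/sample")
```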

  7. Quantum information and computation

    SciTech Connect

    Bennett, C.H.

    1995-10-01

    A new quantum theory of communication and computation is emerging, in which the stuff transmitted or processed is not classical information, but arbitrary superpositions of quantum states. © 1995 American Institute of Physics.

  8. Selective Dissemination of Information and Retrospective Searches. Computer Based Information Services from RIT.

    ERIC Educational Resources Information Center

    Gluchowicz, Zofia

    The purpose of this guide is to give an up-to-date presentation of the information service offered by the documentation center at the Royal Institute of Technology (RIT) and to facilitate the utilization of the service. The guide gives a general account of the multidisciplinary computerized current awareness service (SDI) and a detailed…

  9. Adaptability and the integration of computer-based information processing into the dynamics of organizations.

    PubMed

    Kampfner, Roberto R

    2006-07-01

    The structure of a system influences its adaptability. An important result of adaptability theory is that subsystem independence increases adaptability [Conrad, M., 1983. Adaptability. Plenum Press, New York]. Adaptability is essential in systems that face an uncertain environment, such as biological systems and organizations. Modern organizations are the product of human design, and so are their structure and the effect it has on their adaptability. In this paper we explore the potential effects of computer-based information processing on the adaptability of organizations. The integration of computer-based processes into the dynamics of the functions they support, and the effect it has on subsystem independence, are especially relevant to our analysis.

  10. Three-dimensional information hierarchical encryption based on computer-generated holograms

    NASA Astrophysics Data System (ADS)

    Kong, Dezhao; Shen, Xueju; Cao, Liangcai; Zhang, Hao; Zong, Song; Jin, Guofan

    2016-12-01

    A novel approach for encrypting three-dimensional (3-D) scene information hierarchically based on computer-generated holograms (CGHs) is proposed. The CGHs of the layer-oriented 3-D scene information are produced by angular-spectrum propagation algorithm at different depths. All the CGHs are then modulated by different chaotic random phase masks generated by the logistic map. Hierarchical encryption encoding is applied when all the CGHs are accumulated one by one, and the reconstructed volume of the 3-D scene information depends on permissions of different users. The chaotic random phase masks could be encoded into several parameters of the chaotic sequences to simplify the transmission and preservation of the keys. Optical experiments verify the proposed method and numerical simulations show the high key sensitivity, high security, and application flexibility of the method.
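
    The two building blocks named in the abstract, logistic-map chaotic phase masks and angular-spectrum propagation, can be illustrated compactly. The sketch below encrypts a single hypothetical scene layer and recovers it with the correct key; the wavelength, pixel pitch, propagation distance, and map parameters are arbitrary illustrative values, and the hierarchical multi-user encoding of the paper is not reproduced.

```python
import numpy as np

def logistic_phase_mask(shape, x0=0.345, r=3.99):
    """Chaotic random phase mask from the logistic map x_{n+1} = r*x_n*(1 - x_n)."""
    n = shape[0] * shape[1]
    seq = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        seq[i] = x
    return np.exp(2j * np.pi * seq.reshape(shape))

def angular_spectrum(field, wavelength, dx, z):
    """Propagate a complex field a distance z with the angular-spectrum method."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    H = np.exp(2j * np.pi * z / wavelength * np.sqrt(np.maximum(arg, 0.0)))
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Hypothetical layer of a 3-D scene: a bright square on a dark background.
layer = np.zeros((256, 256))
layer[96:160, 96:160] = 1.0
mask = logistic_phase_mask(layer.shape)                  # key for this layer / user level
hologram_plane = angular_spectrum(layer * mask, 633e-9, 8e-6, 0.05)

# Decryption with the correct key: propagate back and remove the chaotic mask.
recovered = angular_spectrum(hologram_plane, 633e-9, 8e-6, -0.05) * np.conj(mask)
print("reconstruction error:", np.abs(np.abs(recovered) - layer).mean())
```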

  11. A sequential procedure for implementing a computer-based information system.

    PubMed

    Aldrich, D S; Helbig, L C

    1986-09-01

    A sequential procedure is presented for the foodservice manager to follow when considering initial implementation of a computer-based information system (CBIS). A feasibility study is recommended as a first step to analyze the information desired and resources available in order to determine objectives of the proposed CBIS. Alternative CBIS design plans should then be evaluated against critical success factors to determine the direction of initial CBIS efforts. Application software, which provides needed support, then determines the hardware needed. Information about vendors and the suitability of the CBIS to meet needs should be determined next. The following management procedures are suggested: utilization of project management skills, identification of roles for the project team members, and initiation of a phased implementation strategy. Finally, to ensure control of the CBIS project, evaluation and documentation are advised.

  12. A computer-based education intervention to enhance surrogates' informed consent for genomics research.

    PubMed

    Shelton, Ann K; Freeman, Bradley D; Fish, Anne F; Bachman, Jean A; Richardson, Lloyd I

    2015-03-01

    Many research studies conducted today in critical care have a genomics component. Patients' surrogates asked to authorize participation in genomics research for a loved one in the intensive care unit may not be prepared to make informed decisions about a patient's participation in the research. To examine the effectiveness of a new, computer-based education module on surrogates' understanding of the process of informed consent for genomics research. A pilot study was conducted with visitors in the waiting rooms of 2 intensive care units in a Midwestern tertiary care medical center. Visitors were randomly assigned to the experimental (education module plus a sample genomics consent form; n = 65) or the control (sample genomics consent form only; n = 69) group. Participants later completed a test on informed genomics consent. Understanding the process of informed consent was greater (P = .001) in the experimental group than in the control group. Specifically, compared with the control group, the experimental group had a greater understanding of 8 of 13 elements of informed consent: intended benefits of research (P = .02), definition of surrogate consenter (P = .001), withdrawal from the study (P = .001), explanation of risk (P = .002), purpose of the institutional review board (P = .001), definition of substituted judgment (P = .03), compensation for harm (P = .001), and alternative treatments (P = .004). Computer-based education modules may be an important addition to conventional approaches for obtaining informed consent in the intensive care unit. Preparing patients' family members who may consider serving as surrogate consenters is critical to facilitating genomics research in critical care. ©2015 American Association of Critical-Care Nurses.

  13. Computer/Information Science

    ERIC Educational Resources Information Center

    Birman, Ken; Roughgarden, Tim; Seltzer, Margo; Spohrer, Jim; Stolterman, Erik; Kearsley, Greg; Koszalka, Tiffany; de Jong, Ton

    2013-01-01

    Scholars representing the field of computer/information science were asked to identify what they considered to be the most exciting and imaginative work currently being done in their field, as well as how that work might change our understanding. The scholars included Ken Birman, Jennifer Rexford, Tim Roughgarden, Margo Seltzer, Jim Spohrer, and…

  15. Towards a sustainable framework for computer based health information systems (CHIS) for least developed countries (LDCs).

    PubMed

    Gordon, Abekah Nkrumah; Hinson, Robert Ebo

    2007-01-01

    The purpose of this paper is to argue for a theoretical framework by which the development of computer based health information systems (CHIS) can be made sustainable. Health management and promotion thrive on well-articulated CHIS. There are high levels of risk associated with the development of CHIS in the context of least developed countries (LDCs), making them unsustainable. This paper is based largely on a literature survey of health promotion and information systems. The main factors accounting for the sustainability problem in less developed countries include poor infrastructure, inappropriate donor policies and strategies, and inadequate human resource capacity. To counter these challenges and to ensure that CHIS deployment in LDCs is sustainable, it is proposed that the activities involved in the implementation of these systems be incorporated into organizational routines. This will ensure and secure the needed resources, as well as the relevant support from all stakeholders of the system, on a continuous basis. This paper sets out to look at the issue of CHIS sustainability in LDCs, theoretically explains the factors that account for the sustainability problem, and develops a conceptual model based on theoretical literature and existing empirical findings.

  16. A Feasibility Study of a Computer-Based Manpower Information System for the Construction Industry.

    ERIC Educational Resources Information Center

    Markowitz, Edward A.; And Others

    The result of a Presidential directive, this study was conducted to determine the feasibility and means for implementing a computer-assisted labor market information system designed to improve information flow in and about the construction industry. Data were collected by means of: (1) extensive research reviews, (2) contact with selected…

  17. Design Considerations of Help Options in Computer-Based L2 Listening Materials Informed by Participatory Design

    ERIC Educational Resources Information Center

    Cárdenas-Claros, Mónica Stella

    2015-01-01

    This paper reports on the findings of two qualitative exploratory studies that sought to investigate design features of help options in computer-based L2 listening materials. Informed by principles of participatory design, language learners, software designers, language teachers, and a computer programmer worked collaboratively in a series of…

  19. Enhancing performances of SSVEP-based brain-computer interfaces via exploiting inter-subject information

    NASA Astrophysics Data System (ADS)

    Yuan, Peng; Chen, Xiaogang; Wang, Yijun; Gao, Xiaorong; Gao, Shangkai

    2015-08-01

    Objective. A new training-free framework was proposed for target detection in steady-state visual evoked potential (SSVEP)-based brain-computer interfaces (BCIs) using joint frequency-phase coding. Approach. The key idea is to transfer SSVEP templates from the existing subjects to a new subject to enhance the detection of SSVEPs. Under this framework, transfer template-based canonical correlation analysis (tt-CCA) methods were developed for single-channel and multi-channel conditions respectively. In addition, an online transfer template-based CCA (ott-CCA) method was proposed to update EEG templates by online adaptation. Main results. The efficiency of the proposed framework was proved with a simulated BCI experiment. Compared with the standard CCA method, tt-CCA obtained an 18.78% increase of accuracy with a data length of 1.5 s. A simulated test of ott-CCA further received an accuracy increase of 2.99%. Significance. The proposed simple yet efficient framework significantly facilitates the use of SSVEP BCIs using joint frequency-phase coding. This study also sheds light on the benefits from exploring and exploiting inter-subject information to the electroencephalogram (EEG)-based BCIs.
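
    The tt-CCA methods above build on standard canonical correlation analysis between multichannel EEG and sinusoidal reference signals. The sketch below shows only that baseline CCA detection step on simulated data, assuming scikit-learn is available; the transferred inter-subject templates that define tt-CCA and ott-CCA are not modelled, and all signal parameters are assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

def cca_corr(X, Y):
    """Largest canonical correlation between multichannel EEG X and reference set Y."""
    cca = CCA(n_components=1)
    Xc, Yc = cca.fit_transform(X, Y)
    return np.corrcoef(Xc[:, 0], Yc[:, 0])[0, 1]

def ssvep_references(freq, fs, n_samples, n_harmonics=3):
    """Sine/cosine references at the stimulus frequency and its harmonics."""
    t = np.arange(n_samples) / fs
    refs = []
    for h in range(1, n_harmonics + 1):
        refs += [np.sin(2 * np.pi * h * freq * t), np.cos(2 * np.pi * h * freq * t)]
    return np.column_stack(refs)

# Simulated 8-channel, 1.5 s EEG trial containing a 12 Hz SSVEP plus noise.
fs, n = 250, int(1.5 * 250)
rng = np.random.default_rng(5)
t = np.arange(n) / fs
eeg = 0.5 * np.sin(2 * np.pi * 12 * t)[:, None] + rng.standard_normal((n, 8))

targets = [8.0, 10.0, 12.0, 15.0]
scores = [cca_corr(eeg, ssvep_references(f, fs, n)) for f in targets]
print("detected target:", targets[int(np.argmax(scores))])   # expected: 12.0
```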

  1. Computer and information networks.

    PubMed

    Greenberger, M; Aronofsky, J; McKenney, J L; Massy, W F

    1973-10-05

    The most basic conclusion coming out of the EDUCOM seminars is that computer networking must be acknowledged as an important new mode for obtaining information and computation (15). It is a real alternative that needs to be given serious attention in current planning and decision-making. Yet the fact is that many institutions are not taking account of networks when they confer on whether or how to replace their main computer. Articulation of the possibilities of computer networks goes back to the early 1960's and before, and working networks have been in evidence for several years now, both commercially and in universities. What is new, however, is the unmistakable recognition, bordering on a sense of the inevitable, that networks are finally practical and here to stay. The visionary and promotional phases of computer networks are over. It is time for hard-nosed comparative analysis (16). Another conclusion of the seminars has to do with the factors that hinder the fuller development of networking. The major problems to be overcome in applying networks to research and education are political, organizational, and economic in nature rather than technological. This is not to say that the hardware and software problems of linking computers and information systems are completely solved, but they are not the big bottlenecks at present. Research and educational institutions must find ways to organize themselves as well as their computers to work together for greater resource sharing. The coming of age of networks takes on special significance as a result of widespread dissatisfactions expressed with the present computing situation. There is a feeling that the current mode of autonomous, self-sufficient operation in the provision of computing and information services is frequently wasteful, deficient, and unresponsive to users' needs because of duplication of effort from one installation to another, incompatibilities, and inadequate documentation, program support, and user

  2. Decomposition Storage of Information Based on Computer-Generated Hologram Interference and its Application in Optical Image Encryption

    NASA Astrophysics Data System (ADS)

    Guo, Yongkang; Huang, Qizhong; Du, Jinglei; Zhang, Yixiao

    2001-06-01

    An information-encryption method based on computer-generated hologram (CGH) interference is presented. In this method the original information is decomposed into two parts, and then each part is encoded on a separate CGH. When these two encoded CGHs are aligned and illuminated, a combined interference pattern is formed. The original information is obtained from this pattern. It is impossible to decrypt the original information from one CGH alone; two matched CGHs must be put together to make it available.
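
    A stripped-down numerical analogue of the decomposition idea: split a secret intensity pattern into two pure-phase shares whose interference reproduces it, using the identity |exp(i*phi1) + exp(i*phi2)|^2 = 2 + 2*cos(phi1 - phi2). This is only a sketch of the principle, not the authors' CGH encoding; the image and the phase construction are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical secret image, scaled into the reachable intensity range [0, 4].
secret = np.zeros((128, 128))
secret[32:96, 32:96] = 1.0
target = 4.0 * secret

# Share 1 is a random phase; share 2 is chosen so the two interfere to the target:
# |exp(i*phi1) + exp(i*phi2)|^2 = 2 + 2*cos(phi1 - phi2)  =>  phi2 = phi1 - arccos(I/2 - 1)
phi1 = rng.uniform(0, 2 * np.pi, secret.shape)
phi2 = phi1 - np.arccos(np.clip(target / 2.0 - 1.0, -1.0, 1.0))

recombined = np.abs(np.exp(1j * phi1) + np.exp(1j * phi2)) ** 2
print("max reconstruction error:", np.abs(recombined - target).max())

# Either share alone is a uniform-intensity random-phase field and reveals nothing.
share1_intensity = np.abs(np.exp(1j * phi1)) ** 2
print("intensity of share 1 alone:", share1_intensity.min(), share1_intensity.max())
```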

  3. SIGI: Field Test and Evaluation of a Computer-Based System of Interactive Guidance and Information. Volume II: Appendices.

    ERIC Educational Resources Information Center

    Chapman, Warren; And Others

    The computer-based System of Interactive Guidance and Information (SIGI) was field tested and evaluated at five community colleges and one university. Developed by Educational Testing Service, SIGI assists students in the process of informed and rational career decision making. These appendices to the final evaluation report contain the manuals…

  4. Maximizing Information Transfer in SSVEP-Based Brain-Computer Interfaces.

    PubMed

    Sengelmann, Malte; Engel, Andreas K; Maye, Alexander

    2017-02-01

    Compared to the different brain signals used in brain-computer interface (BCI) paradigms, the steady-state visually evoked potential (SSVEP) features a high signal-to-noise ratio, enabling reliable and fast classification of neural activity patterns without extensive training requirements. In this paper, we present methods to further increase the information transfer rates (ITRs) of SSVEP-based BCIs. Starting with stimulus parameter optimization methods, we develop an improved approach to the use of canonical correlation analysis and analyze properties of the SSVEP when the user fixates a target and during transitions between targets. These transitions show a negative effect on the system's ITR, which we trace back to delays and dead times of the SSVEP. Using two classifier types adapted to continuous and transient SSVEPs and two control modes (fast feedback and fast input), we present a simulated online BCI implementation which addresses the challenges introduced by transient SSVEPs. The resulting system reaches an average ITR of 181 Bits/min and peak ITR values of up to 295 Bits/min for individual users.
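
    The Bits/min figures quoted above are conventionally computed with the Wolpaw information transfer rate formula, reproduced below for reference; the paper may use a refined variant, so treat this as the standard definition rather than the authors' exact computation.

```python
import numpy as np

def wolpaw_itr(n_targets, accuracy, selection_time_s):
    """Information transfer rate in bits/min for an N-target BCI (Wolpaw et al. formula)."""
    p, n = accuracy, n_targets
    bits = np.log2(n)
    if 0 < p < 1:
        bits += p * np.log2(p) + (1 - p) * np.log2((1 - p) / (n - 1))
    return bits * 60.0 / selection_time_s

# e.g. a hypothetical 32-target speller at 90% accuracy with one selection per second
print(round(wolpaw_itr(32, 0.90, 1.0), 1), "bits/min")
```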

  5. A Comparative Assessment of Paper-Based and Computer-Based Maintenance Information Delivery Systems

    DTIC Science & Technology

    1988-07-01

    shipboard data bases have to be considered, such as the shipboard non-tactical ADP program (SNAP) and the consolidated automatic support system (CASS)... (e.g., Brown, 1964; Goff, Schlesinger, & Parlog, 1969; Kieras, Tibbits, & Bovair, 1984; Nugent, 1988). Participation in the testing was voluntary. No... consolidated automatic support system (CASS), etc., to provide maximum usage of available maintenance data and to simplify parts ordering and documentation of

  6. Interactive Computer Based Assessment Tasks: How Problem-Solving Process Data Can Inform Instruction

    ERIC Educational Resources Information Center

    Zoanetti, Nathan

    2010-01-01

    This article presents key steps in the design and analysis of a computer based problem-solving assessment featuring interactive tasks. The purpose of the assessment is to support targeted instruction for students by diagnosing strengths and weaknesses at different stages of problem-solving. The first focus of this article is the task piloting…

  7. Knowledge-based geographic information systems on the Macintosh computer: a component of the GypsES project

    Treesearch

    Gregory Elmes; Thomas Millette; Charles B. Yuill

    1991-01-01

    GypsES, a decision-support and expert system for the management of Gypsy Moth, addresses five related research problems in a modular, computer-based project. The modules are hazard rating, monitoring, prediction, treatment decision and treatment implementation. One common component is a geographic information system designed to function intelligently. We refer to this...

  8. Towards a Model Curriculum in Computer-Based Information Systems for Developing Countries: The Case of Arab Environments.

    ERIC Educational Resources Information Center

    Noor, A. El Sayed

    1984-01-01

    Focuses on the benefits of model core curriculum development in computer-based information systems relevant to the needs of developing countries. Suggested procedures include outlining search rationale, examining existing curricula in developed nations, evaluating such education in Kuwait, presenting a model curriculum, comparing it with relevant…

  9. Application of a Micro Computer-Based Management Information System to Improve the USAF Service Reporting Process

    DTIC Science & Technology

    1990-09-01

    This study conducted research into the development, implementation, and evaluation of a personal computer based Service Reporting (SR) Management Information System (MIS). The work drew on MIS experts and System Program Office (SPO) acquisition managers, and on software prototyping with an Aeronautical Systems Division (ASD) SPO. The Service Reporting Management Information System (SRMIS) was implemented and evaluated in the Air Force One (AF-1) Replacement Aircraft SPO during a five month trial period...

  10. Providing Computer-Based Information Services to an Academic Community. Final Technical Report.

    ERIC Educational Resources Information Center

    Bayer, Bernard

    The Mechanized Information Center (MIC) at the Ohio State University conducts retrospective and current awareness searches for faculty, students, and staff using data bases for agriculture, chemistry, education, psychology, and social sciences, as well as a multidisciplinary data base. The final report includes (1) a description of the background…

  11. American Health Information Management Association. Position statement. Issue: healthcare reform--information systems and the need for computer-based patient records.

    PubMed

    1994-01-01

    Timely, reliable information is a critical part of healthcare reform. The Clinton Administration's current proposal would streamline health information through the use of standard forms and data definitions and establish a nationwide electronic highway to link health records and exchange needed information. Information would be captured, retained, and transmitted as a routine byproduct of patient care. These goals can be achieved only through broad implementation of the computer-based patient record (CPR). The CPR will contribute to more effective and cost-efficient care through (1) ready access to longitudinal (lifetime) health information; (2) support for continuous quality improvement; (3) easy access to clinical knowledge bases; and (4) patient participation in health documentation and disease prevention. The technology exists to implement the CPR, but further work is needed to develop the necessary standards and security mechanisms. The American Health Information Management Association is committed to working with applicable state and federal agencies, professional associations, accrediting agencies, voluntary standards organizations, and the Computer-Based Patient Record Institute (CPRI) to achieve the information management objectives of the current health care reform plan. With their expertise in health information systems and strong commitment to patient privacy, health information management professionals can make significant contributions to the development, implementation, and ongoing security of national and state health information networks.

  12. Understanding Networks of Computing Chemical Droplet Neurons Based on Information Flow.

    PubMed

    Gruenert, Gerd; Gizynski, Konrad; Escuela, Gabi; Ibrahim, Bashar; Gorecki, Jerzy; Dittrich, Peter

    2015-11-01

    In this paper, we present general methods that can be used to explore the information processing potential of a medium composed of oscillating (self-exciting) droplets. Networks of Belousov-Zhabotinsky (BZ) droplets seem especially interesting as chemical reaction-diffusion computers because their time evolution is qualitatively similar to neural network activity. Moreover, such networks can be self-generated in microfluidic reactors. However, it is hard to track and to understand the function performed by a medium composed of droplets due to its complex dynamics. As in recurrent neural networks, the flow of excitations in a network of droplets is not limited to a single direction and spreads throughout the whole medium. In this work, we analyze the operation performed by droplet systems by monitoring the information flow. This is achieved by measuring mutual information and time-delayed mutual information of the discretized time evolution of individual droplets. To link the model with reality, we use experimental results to estimate the parameters of droplet interactions. As an example, we investigate an evolutionarily generated droplet structure that operates as a NOR gate. The presented methods can be applied to networks composed of at least hundreds of droplets.
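
    As a rough illustration of the analysis described above, the sketch below estimates time-delayed mutual information between two discretized droplet time series from their joint histogram. The binarization scheme, bin count, and the toy driver/follower series are our assumptions for illustration, not the authors' experimental settings.

        import numpy as np
        from collections import Counter

        def delayed_mutual_information(x, y, delay=0, n_bins=2):
            """Time-delayed mutual information I(x(t); y(t+delay)) in bits,
            estimated from histograms of two discretized droplet time series."""
            x, y = np.asarray(x, float), np.asarray(y, float)
            if delay > 0:
                x, y = x[:-delay], y[delay:]
            # Discretize each series into n_bins states (BZ droplets are often
            # binarized into excited vs. quiescent).
            xd = np.digitize(x, np.linspace(x.min(), x.max(), n_bins + 1)[1:-1])
            yd = np.digitize(y, np.linspace(y.min(), y.max(), n_bins + 1)[1:-1])
            n = len(xd)
            joint, px, py = Counter(zip(xd, yd)), Counter(xd), Counter(yd)
            mi = 0.0
            for (a, b), c in joint.items():
                p_ab = c / n
                mi += p_ab * np.log2(p_ab / ((px[a] / n) * (py[b] / n)))
            return mi

        # Toy driver/follower pair: droplet B repeats droplet A three steps later,
        # so the delayed MI at delay=3 is high while the instantaneous MI is near zero.
        rng = np.random.default_rng(0)
        drive = (rng.random(2000) > 0.7).astype(float)
        follower = np.roll(drive, 3)
        print(delayed_mutual_information(drive, follower, delay=3))
        print(delayed_mutual_information(drive, follower, delay=0))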

  13. A network-based distributed, media-rich computing and information environment

    SciTech Connect

    Phillips, R.L.

    1995-12-31

    Sunrise is a Los Alamos National Laboratory (LANL) project started in October 1993. It is intended to be a prototype National Information Infrastructure development project. A main focus of Sunrise is to tie together enabling technologies (networking, object-oriented distributed computing, graphical interfaces, security, multi-media technologies, and data-mining technologies) with several specific applications. A diverse set of application areas was chosen to ensure that the solutions developed in the project are as generic as possible. Some of the application areas are materials modeling, medical records and image analysis, transportation simulations, and K-12 education. This paper provides a description of Sunrise and a view of the architecture and objectives of this evolving project. The primary objectives of Sunrise are three-fold: (1) To develop common information-enabling tools for advanced scientific research and its applications to industry; (2) To enhance the capabilities of important research programs at the Laboratory; (3) To define a new way of collaboration between computer science and industrially-relevant research.

  14. Digital Computer Simulation Models of Library-Based Information Retrieval Systems. Reports on File Organization Studies.

    ERIC Educational Resources Information Center

    Reilly, Kevin D.

    A simulation study of library-based information retrieval systems is described. Basic models for each of several important aspects are presented: (1) user behavior, emphasizing response to quality and delays in services; (2) the scheduling of services and the organization of the machine-readable files; and (3) the distribution of conventional…

  15. An algorithm for computing moments-based flood quantile estimates when historical flood information is available

    USGS Publications Warehouse

    Cohn, T.A.; Lane, W.L.; Baier, W.G.

    1997-01-01

    This paper presents the expected moments algorithm (EMA), a simple and efficient method for incorporating historical and paleoflood information into flood frequency studies. EMA can utilize three types of at-site flood information: systematic stream gage record; information about the magnitude of historical floods; and knowledge of the number of years in the historical period when no large flood occurred. EMA employs an iterative procedure to compute method-of-moments parameter estimates. Initial parameter estimates are calculated from systematic stream gage data. These moments are then updated by including the measured historical peaks and the expected moments, given the previously estimated parameters, of the below-threshold floods from the historical period. The updated moments result in new parameter estimates, and the last two steps are repeated until the algorithm converges. Monte Carlo simulations compare EMA, Bulletin 17B's [United States Water Resources Council, 1982] historically weighted moments adjustment, and maximum likelihood estimators when fitting the three parameters of the log-Pearson type III distribution. These simulations demonstrate that EMA is more efficient than the Bulletin 17B method, and that it is nearly as efficient as maximum likelihood estimation (MLE). The experiments also suggest that EMA has two advantages over MLE when dealing with the log-Pearson type III distribution: It appears that EMA estimates always exist and that they are unique, although neither result has been proven. EMA can be used with binomial or interval-censored data and with any distributional family amenable to method-of-moments estimation.
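
    To make the iteration concrete, here is a heavily simplified sketch of an EMA-style loop. It assumes normally distributed log-flows (so only two moments are iterated) rather than the paper's three-parameter log-Pearson type III distribution, and the censored moments use the closed form for a truncated normal; the function names, variable names, and example data are ours.

        import numpy as np
        from scipy.stats import norm

        def ema_fit(systematic, historical_peaks, n_below, threshold,
                    tol=1e-8, max_iter=200):
            """Skeleton of an EMA-style iteration (simplified: normal log-flows).

            systematic       : log-flows from the gaged record
            historical_peaks : measured historical log-peaks above the threshold
            n_below          : historical years with no flood above the threshold
            threshold        : perception threshold in log space
            """
            obs = np.concatenate([np.asarray(systematic, float),
                                  np.asarray(historical_peaks, float)])
            n_total = len(obs) + n_below
            mu, var = obs.mean(), obs.var(ddof=1)          # initial estimates
            for _ in range(max_iter):
                # Expected moments of a below-threshold year given current parameters
                # (closed form for the normal; the paper uses log-Pearson type III).
                sigma = np.sqrt(var)
                a = (threshold - mu) / sigma
                lam = norm.pdf(a) / norm.cdf(a)            # inverse Mills ratio, lower tail
                e1 = mu - sigma * lam                      # E[X | X < threshold]
                e2 = var * (1 - a * lam - lam**2) + e1**2  # E[X^2 | X < threshold]
                # Combine observed data with the expected censored contributions.
                m1 = (obs.sum() + n_below * e1) / n_total
                m2 = ((obs**2).sum() + n_below * e2) / n_total
                mu_new, var_new = m1, m2 - m1**2
                if abs(mu_new - mu) < tol and abs(var_new - var) < tol:
                    break
                mu, var = mu_new, var_new
            return mu, var

        # Hypothetical record: 40 gaged years plus 3 historical peaks known to have
        # exceeded the perception threshold during 110 otherwise quiet years.
        rng = np.random.default_rng(1)
        print(ema_fit(rng.normal(8.0, 0.6, size=40), [9.6, 9.9, 10.2],
                      n_below=110, threshold=9.5))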

  16. An algorithm for computing moments-based flood quantile estimates when historical flood information is available

    NASA Astrophysics Data System (ADS)

    Cohn, T. A.; Lane, W. L.; Baier, W. G.

    This paper presents the expected moments algorithm (EMA), a simple and efficient method for incorporating historical and paleoflood information into flood frequency studies. EMA can utilize three types of at-site flood information: systematic stream gage record; information about the magnitude of historical floods; and knowledge of the number of years in the historical period when no large flood occurred. EMA employs an iterative procedure to compute method-of-moments parameter estimates. Initial parameter estimates are calculated from systematic stream gage data. These moments are then updated by including the measured historical peaks and the expected moments, given the previously estimated parameters, of the below-threshold floods from the historical period. The updated moments result in new parameter estimates, and the last two steps are repeated until the algorithm converges. Monte Carlo simulations compare EMA, Bulletin 17B's [United States Water Resources Council, 1982] historically weighted moments adjustment, and maximum likelihood estimators when fitting the three parameters of the log-Pearson type III distribution. These simulations demonstrate that EMA is more efficient than the Bulletin 17B method, and that it is nearly as efficient as maximum likelihood estimation (MLE). The experiments also suggest that EMA has two advantages over MLE when dealing with the log-Pearson type III distribution: It appears that EMA estimates always exist and that they are unique, although neither result has been proven. EMA can be used with binomial or interval-censored data and with any distributional family amenable to method-of-moments estimation.

  17. Medical Information Processing by Computer.

    ERIC Educational Resources Information Center

    Kleinmuntz, Benjamin

    The use of the computer for medical information processing was introduced about a decade ago. Considerable inroads have now been made toward its applications to problems in medicine. Present uses of the computer, both as a computational and noncomputational device, include the following: automated search of patients' files; on-line clinical data…

  18. Community Information Centers and the Computer.

    ERIC Educational Resources Information Center

    Carroll, John M.; Tague, Jean M.

    Two computer data bases have been developed by the Computer Science Department at the University of Western Ontario for "Information London," the local community information center. One system, called LONDON, permits Boolean searches of a file of 5,000 records describing human service agencies in the London area. The second system,…

  19. Subquantum information and computation

    NASA Astrophysics Data System (ADS)

    Valentini, Antony

    2002-08-01

    It is argued that immense physical resources -- for nonlocal communication, espionage, and exponentially-fast computation -- are hidden from us by quantum noise, and that this noise is not fundamental but merely a property of an equilibrium state in which the universe happens to be at the present time. It is suggested that `non-quantum' or nonequilibrium matter might exist today in the form of relic particles from the early universe. We describe how such matter could be detected and put to practical use. Nonequilibrium matter could be used to send instantaneous signals, to violate the uncertainty principle, to distinguish non-orthogonal quantum states without disturbing them, to eavesdrop on quantum key distribution, and to outpace quantum computation (solving NP-complete problems in polynomial time).

  20. Information processing, computation, and cognition.

    PubMed

    Piccinini, Gualtiero; Scarantino, Andrea

    2011-01-01

    Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both - although others disagree vehemently. Yet different cognitive scientists use 'computation' and 'information processing' to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism, connectionism, and computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates' empirical aspects.

  1. Effective solutions in introducing Server-Based Computing into a hospital information system.

    PubMed

    Kuwata, Shigeki; Teramoto, Kei; Matsumura, Yasushi; Kushniruk, Andre W; Borycki, Elizabeth M; Kondoh, Hiroshi

    2009-01-01

    Server-Based Computing (SBC) is a technology for terminal administration that achieves higher security at lower expense. Use of SBC in large hospitals, however, is not widespread because methods to effectively implement the technology have not been fully established. We present a system design that uses SBC in a large-scale hospital and then discuss the implementation problems and their solutions. With the exception of network traffic estimates, the server size estimates were validated. Three results from an evaluation of an SBC implementation were: 1) security was reinforced by applying multiple-policy adaptation to a single client terminal, 2) cost reduction was realized by having fewer PC failures and lower power consumption, and 3) user roaming was found to be effective in reducing the number of iterative operations performed by users.

  2. Information processing, computation, and cognition

    PubMed Central

    Scarantino, Andrea

    2010-01-01

    Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both – although others disagree vehemently. Yet different cognitive scientists use ‘computation’ and ‘information processing’ to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism, connectionism, and computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates’ empirical aspects. PMID:22210958

  3. Development of software for computing forming information using a component based approach

    NASA Astrophysics Data System (ADS)

    Ko, Kwang Hee; Park, Jiing Seo; Kim, Jung; Kim, Young Bum; Shin, Jong Gye

    2009-12-01

    In the shipbuilding industry, the manufacturing technology has advanced at an unprecedented pace over the last decade. As a result, many automatic systems for cutting, welding, etc. have been developed and employed in the manufacturing process, and accordingly productivity has increased drastically. Despite such improvement in the manufacturing technology, however, development of an automatic system for fabricating a curved hull plate remains at the beginning stage, since hardware and software for the automation of the curved hull fabrication process should be developed differently depending on the dimensions of plates, forming methods and manufacturing processes of each shipyard. To deal with this problem, it is necessary to create a "plug-in" framework, which can adopt various kinds of hardware and software to construct a fully automatic fabrication system. In this paper, a framework for automatic fabrication of curved hull plates is proposed, which consists of four components and related software. In particular, the software module for computing fabrication information is developed by using the ooCBD development methodology, which can interface with other hardware and software with minimum effort. Examples of the proposed framework applied to medium and large shipyards are presented.

  4. Quantum Information, Computation and Communication

    NASA Astrophysics Data System (ADS)

    Jones, Jonathan A.; Jaksch, Dieter

    2012-07-01

    Part I. Quantum Information: 1. Quantum bits and quantum gates; 2. An atom in a laser field; 3. Spins in magnetic fields; 4. Photon techniques; 5. Two qubits and beyond; 6. Measurement and entanglement; Part II. Quantum Computation: 7. Principles of quantum computing; 8. Elementary quantum algorithms; 9. More advanced quantum algorithms; 10. Trapped atoms and ions; 11. Nuclear magnetic resonance; 12. Large scale quantum computers; Part III. Quantum Communication: 13. Basics of information theory; 14. Quantum information; 15. Quantum communication; 16. Testing EPR; 17. Quantum cryptography; Appendixes; References; Index.

  5. Quantum Information and Computing

    NASA Astrophysics Data System (ADS)

    Accardi, L.; Ohya, Masanori; Watanabe, N.

    2006-03-01

    Preface -- Coherent quantum control of [symbol]-atoms through the stochastic limit / L. Accardi, S. V. Kozyrev and A. N. Pechen -- Recent advances in quantum white noise calculus / L. Accardi and A. Boukas -- Control of quantum states by decoherence / L. Accardi and K. Imafuku -- Logical operations realized on the Ising chain of N qubits / M. Asano, N. Tateda and C. Ishii -- Joint extension of states of fermion subsystems / H. Araki -- Quantum filtering and optimal feedback control of a Gaussian quantum free particle / S. C. Edwards and V. P. Belavkin -- On existence of quantum zeno dynamics / P. Exner and T. Ichinose -- Invariant subspaces and control of decoherence / P. Facchi, V. L. Lepore and S. Pascazio -- Clauser-Horner inequality for electron counting statistics in multiterminal mesoscopic conductors / L. Faoro, F. Taddei and R. Fazio -- Fidelity of quantum teleportation model using beam splittings / K.-H. Fichtner, T. Miyadera and M. Ohya -- Quantum logical gates realized by beam splittings / W. Freudenberg ... [et al.] -- Information divergence for quantum channels / S. J. Hammersley and V. P. Belavkin -- On the uniqueness theorem in quantum information geometry / H. Hasegawa -- Noncanonical representations of a multi-dimensional Brownian motion / Y. Hibino -- Some of future directions of white noise theory / T. Hida -- Information, innovation and elemental random field / T. Hida -- Generalized quantum turing machine and its application to the SAT chaos algorithm / S. Iriyama, M. Ohya and I. Volovich -- A Stroboscopic approach to quantum tomography / A. Jamiolkowski -- Positive maps and separable states in matrix algebras / A. Kossakowski -- Simulating open quantum systems with trapped ions / S. Maniscalco -- A purification scheme and entanglement distillations / H. Nakazato, M. Unoki and K. Yuasa -- Generalized sectors and adjunctions to control micro-macro transitions / I. Ojima -- Saturation of an entropy bound and quantum Markov states / D. Petz -- An

  6. Information-theoretic CAD system in mammography: Entropy-based indexing for computational efficiency and robust performance

    SciTech Connect

    Tourassi, Georgia D.; Harrawood, Brian; Singh, Swatee; Lo, Joseph Y.

    2007-08-15

    We have previously presented a knowledge-based computer-assisted detection (KB-CADe) system for the detection of mammographic masses. The system is designed to compare a query mammographic region with mammographic templates of known ground truth. The templates are stored in an adaptive knowledge database. Image similarity is assessed with information theoretic measures (e.g., mutual information) derived directly from the image histograms. A previous study suggested that the diagnostic performance of the system steadily improves as the knowledge database is initially enriched with more templates. However, as the database increases in size, an exhaustive comparison of the query case with each stored template becomes computationally burdensome. Furthermore, blind storing of new templates may result in redundancies that do not necessarily improve diagnostic performance. To address these concerns we investigated an entropy-based indexing scheme for improving the speed of analysis and for satisfying database storage restrictions without compromising the overall diagnostic performance of our KB-CADe system. The indexing scheme was evaluated on two different datasets as (i) a search mechanism to sort through the knowledge database, and (ii) a selection mechanism to build a smaller, concise knowledge database that is easier to maintain but still effective. There were two important findings in the study. First, entropy-based indexing is an effective strategy to quickly identify a subset of templates that are most relevant to a given query. Only this subset could be analyzed in more detail using mutual information for optimized decision making regarding the query. Second, a selective entropy-based deposit strategy may be preferable where only high entropy cases are maintained in the knowledge database. Overall, the proposed entropy-based indexing scheme was shown to reduce the computational cost of our KB-CADe system by 55% to 80% while maintaining the system's diagnostic performance.
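
    A minimal sketch of the two-stage idea, entropy-based pre-selection followed by mutual-information ranking, is given below. The histogram settings, the candidate-list size k, and all function names are illustrative assumptions, not the authors' implementation.

        import numpy as np

        def histogram_entropy(image, bins=256):
            """Shannon entropy (bits) of an image's gray-level histogram."""
            hist, _ = np.histogram(image, bins=bins)
            p = hist[hist > 0] / hist.sum()
            return -np.sum(p * np.log2(p))

        def mutual_information(a, b, bins=64):
            """Mutual information (bits) from the joint gray-level histogram."""
            joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
            pxy = joint / joint.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            nz = pxy > 0
            return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

        def retrieve(query, templates, k=20):
            """Entropy-based pre-selection, then MI ranking of only k candidates."""
            hq = histogram_entropy(query)
            order = np.argsort([abs(histogram_entropy(t) - hq) for t in templates])
            candidates = order[:k]
            scored = [(int(i), mutual_information(query, templates[i])) for i in candidates]
            return sorted(scored, key=lambda s: -s[1])

    In a deployed system the template entropies would be computed once and stored alongside the knowledge database, so only the k detailed histogram comparisons are paid per query.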

  7. Computers in Information Data Centers.

    ERIC Educational Resources Information Center

    Clifton, Joe Ann, Ed.; Helgeson, Duane, Ed.

    This collection of ten conference reports begins with "Historical Impact of Computers on the Operation of Libraries and Information Systems: A Personal Perspective" and is followed by "Tips on Computer Software: Advantages and Methods of Program Dissemination" and "BLOCS--A Unique Multi-Dimensional Approach to On-Line Circulation". Next, libraries…

  8. Optimization of numerical weather/wave prediction models based on information geometry and computational techniques

    NASA Astrophysics Data System (ADS)

    Galanis, George; Famelis, Ioannis; Kalogeri, Christina

    2014-10-01

    In recent years a new and highly demanding framework has been set for environmental sciences and applied mathematics as a result of the needs posed by issues that are of interest not only to the scientific community but to today's society in general: global warming, renewable resources of energy, and natural hazards can be listed among them. The research community today follows two main directions in order to address the above problems: the utilization of environmental observations obtained from in situ or remote sensing sources, and meteorological-oceanographic simulations based on physical-mathematical models. In particular, in trying to reach credible local forecasts the two previous data sources are combined by algorithms that are essentially based on optimization processes. The conventional approaches in this framework usually neglect the topological-geometrical properties of the space of the data under study by adopting least square methods based on classical Euclidean geometry tools. In the present work new optimization techniques are discussed making use of methodologies from a rapidly advancing branch of applied mathematics, Information Geometry. The latter proves that the distributions of data sets are elements of non-Euclidean structures in which the underlying geometry may differ significantly from the classical one. Geometrical entities like Riemannian metrics, distances, curvature and affine connections are utilized in order to define the optimum distributions fitting the environmental data at specific areas and to form differential systems that describe the optimization procedures. The methodology proposed is illustrated by an application to wind speed forecasts for the island of Kefalonia, Greece.
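
    As a concrete example of the kind of non-Euclidean distance such methods rely on, the sketch below evaluates the closed-form Fisher-Rao geodesic distance between two univariate Gaussian distributions, a standard information-geometry result that could stand in for a Euclidean least-squares distance when comparing forecast-error distributions. The wind-speed numbers are hypothetical and this is not the paper's exact procedure.

        import math

        def fisher_rao_distance(mu1, sigma1, mu2, sigma2):
            """Fisher-Rao geodesic distance between two univariate Gaussians.

            The Gaussian family with the Fisher information metric is a hyperbolic
            space; the closed form below follows from the Poincare half-plane
            distance after the change of coordinates (mu/sqrt(2), sigma)."""
            num = (mu1 - mu2) ** 2 + 2.0 * (sigma1 - sigma2) ** 2
            return math.sqrt(2.0) * math.acosh(1.0 + num / (4.0 * sigma1 * sigma2))

        # Hypothetical wind-speed error distributions (m/s) for two candidate
        # calibrations: the geodesic distance, not the Euclidean one, ranks them.
        print(fisher_rao_distance(7.2, 2.1, 6.8, 1.4))
        print(fisher_rao_distance(7.2, 2.1, 8.0, 2.0))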

  9. Institutional computing (IC) information session

    SciTech Connect

    Koch, Kenneth R; Lally, Bryan R

    2011-01-19

    The LANL Institutional Computing Program (IC) will host an information session about the current state of unclassified Institutional Computing at Los Alamos, exciting plans for the future, and the current call for proposals for science and engineering projects requiring computing. Program representatives will give short presentations and field questions about the call for proposals and future planned machines, and discuss technical support available to existing and future projects. Los Alamos has started making a serious institutional investment in open computing available to our science projects, and that investment is expected to increase even more.

  10. A mobile Nursing Information System based on human-computer interaction design for improving quality of nursing.

    PubMed

    Su, Kuo-Wei; Liu, Cheng-Li

    2012-06-01

    A conventional Nursing Information System (NIS), which supports the role of the nurse in some areas, is typically deployed as an immobile system. However, the traditional information system can't respond to patients' conditions in real time, causing delays in the availability of this information. With the advances of information technology, mobile devices are increasingly being used to extend the human mind's limited capacity to recall and process large numbers of relevant variables and to support information management, general administration, and clinical practice. Unfortunately, there have been few studies about the combination of a well-designed small-screen interface with a personal digital assistant (PDA) in clinical nursing. Some researchers found that user interface design is an important factor in determining the usability and potential use of a mobile system. Therefore, this study proposed a systematic approach to the development of a mobile nursing information system (MNIS) based on Mobile Human-Computer Interaction (M-HCI) for use in clinical nursing. The system combines principles of small-screen interface design with user-specified requirements. In addition, the iconic functions were designed with a metaphor concept that helps users learn the system more quickly with less working-memory load. An experiment involving learnability testing, thinking aloud and a questionnaire investigation was conducted to evaluate the effect of the MNIS on a PDA. The results show that the proposed MNIS performs well on learnability and yields higher satisfaction with respect to its symbols, terminology and system information.

  11. An automated tuberculosis screening strategy combining X-ray-based computer-aided detection and clinical information

    NASA Astrophysics Data System (ADS)

    Melendez, Jaime; Sánchez, Clara I.; Philipsen, Rick H. H. M.; Maduskar, Pragnya; Dawson, Rodney; Theron, Grant; Dheda, Keertan; van Ginneken, Bram

    2016-04-01

    Lack of human resources and radiological interpretation expertise impair tuberculosis (TB) screening programmes in TB-endemic countries. Computer-aided detection (CAD) constitutes a viable alternative for chest radiograph (CXR) reading. However, no automated techniques that exploit the additional clinical information typically available during screening exist. To address this issue and optimally exploit this information, a machine learning-based combination framework is introduced. We have evaluated this framework on a database containing 392 patient records from suspected TB subjects prospectively recruited in Cape Town, South Africa. Each record comprised a CAD score, automatically computed from a CXR, and 12 clinical features. Comparisons with strategies relying on either CAD scores or clinical information alone were performed. Our results indicate that the combination framework outperforms the individual strategies in terms of the area under the receiver operating characteristic curve (0.84 versus 0.78 and 0.72), specificity at 95% sensitivity (49% versus 24% and 31%) and negative predictive value (98% versus 95% and 96%). Thus, it is believed that combining CAD and clinical information to estimate the risk of active disease is a promising tool for TB screening.
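
    A sketch of how such a combination framework can be wired up and scored is shown below. The classifier choice (logistic regression), the cross-validation scheme, and the synthetic stand-in data are illustrative assumptions; the paper does not specify them here, and random data will not reproduce the reported figures.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score, roc_curve
        from sklearn.model_selection import cross_val_predict
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        def specificity_at_sensitivity(y_true, scores, target_sens=0.95):
            """Specificity (1 - FPR) at the first threshold reaching the target sensitivity."""
            fpr, tpr, _ = roc_curve(y_true, scores)
            idx = min(np.searchsorted(tpr, target_sens), len(fpr) - 1)
            return 1.0 - fpr[idx]

        # Hypothetical stand-in data: one CAD score per radiograph plus 12 clinical features.
        rng = np.random.default_rng(0)
        n = 392
        X = np.hstack([rng.normal(size=(n, 1)), rng.normal(size=(n, 12))])
        y = rng.integers(0, 2, size=n)

        model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
        scores = cross_val_predict(model, X, y, cv=5, method="predict_proba")[:, 1]
        print("AUC:", round(roc_auc_score(y, scores), 2))
        print("Specificity @ 95% sensitivity:", round(specificity_at_sensitivity(y, scores), 2))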

  12. The diffusion of computer-based information technology into health institutions of Republic of Serbia (FR Yugoslavia).

    PubMed

    Simić, S; Marinković, J; Bjegović, V; Stanisavljević, D

    1996-08-01

    The basic purpose of this study was to analyze the diffusion of computer-based information technology into the health care institutions of the Republic of Serbia in the year 1994, and to compare the results with a similar investigation in 1992 in order to determine the state and progress of its development. The instrument of investigation was a questionnaire with 24 questions, distributed to all the independent health institutions in Serbia (238 in total). The overall response rate was 40.8%. Of the responding health institutions, 92.8% own computers which are in use, six PCs on average, and on average use two software applications, one of which is invariably for accounting and billing. In conclusion, health care institutions in the Republic of Serbia are unsatisfactorily equipped with information technology and lack developed institutional information systems, except at the project level. Careful planning, selection, implementation and management with national coordination will be needed to ensure the appropriate use of technology and information systems in health care.

  13. The Operation of a Specialized Scientific Information and Data Analysis Center With Computer Base and Associated Communications Network.

    ERIC Educational Resources Information Center

    Cottrell, William B.; And Others

    The Nuclear Safety Information Center (NSIC) is a highly sophisticated scientific information center operated at Oak Ridge National Laboratory (ORNL) for the U.S. Atomic Energy Commission. Its information file, which consists of both data and bibliographic information, is computer stored and numerous programs have been developed to facilitate the…

  14. Computer Abuse: Vandalizing the Information Society.

    ERIC Educational Resources Information Center

    Furnell, Steven M.; Warren, Matthew J.

    1997-01-01

    Computing and telecommunications, key to an information-based society, are increasingly targets for criminals and mischief makers. This article examines the effects of malicious computer abuse: hacking and viruses, highlights the apparent increase in incidents, and examines their effect on public perceptions of technology. Presents broad…

  15. Computer Abuse: Vandalizing the Information Society.

    ERIC Educational Resources Information Center

    Furnell, Steven M.; Warren, Matthew J.

    1997-01-01

    Computing and telecommunications, key to an information-based society, are increasingly targets for criminals and mischief makers. This article examines the effects of malicious computer abuse: hacking and viruses, highlights the apparent increase in incidents, and examines their effect on public perceptions of technology. Presents broad…

  16. A Study on the Learning Efficiency of Multimedia-Presented, Computer-Based Science Information

    ERIC Educational Resources Information Center

    Guan, Ying-Hua

    2009-01-01

    This study investigated the effects of multimedia presentations on the efficiency of learning scientific information (i.e. information on basic anatomy of human brains and their functions, the definition of cognitive psychology, and the structure of human memory). Experiment 1 investigated whether the modality effect could be observed when the…

  17. A Study on the Learning Efficiency of Multimedia-Presented, Computer-Based Science Information

    ERIC Educational Resources Information Center

    Guan, Ying-Hua

    2009-01-01

    This study investigated the effects of multimedia presentations on the efficiency of learning scientific information (i.e. information on basic anatomy of human brains and their functions, the definition of cognitive psychology, and the structure of human memory). Experiment 1 investigated whether the modality effect could be observed when the…

  18. User-System Interface Design for Computer-Based Information Systems,

    DTIC Science & Technology

    1982-04-01

    Documentation is needed to permit design review before software implementation, and continuing design coordination thereafter. Potential computer aids are... Replies were received from nine members of the group, including Sara R. Abbott (Union Carbide Corporation) and Christopher J. Arbak (Systems Research Laboratories, Inc.).

  19. Computer-Based Storage and Retrieval of Geoscience Information: Bibliography 1946-69.

    ERIC Educational Resources Information Center

    Hruska, J.; Burk, C. F., Jr.

    The application of computer technology to storage and retrieval of geoscience data has increased markedly since about 1966, and has resulted in the publication of numerous papers describing various aspects of this work. Unfortunately, these are widely dispersed in the literature, many in relatively obscure journals, proceedings and other sources.…

  20. Design Guidelines for the User Interface to Computer-Based Information Systems

    DTIC Science & Technology

    1983-03-01

    ...units of measurement; do not display data in English units, or vice versa. Example: Computer time records that are not in directly usable format should... recommendation is ignored, such as the coding of English and Canadian postal zones. Reference: BB 2.5.1. Meaningful Codes: meaningful codes should be... transaction. Natural Language: a type of dialogue in which users formulate control entries in their natural language, e.g., English, Spanish, French.

  1. An Evaluation of a Computer Based Medical Diagnostic/Information System for Nuclear Submarines,

    DTIC Science & Technology

    1979-11-01

    ...(PERFDU), Cholecystitis (CHOLE), Small Bowel Obstruction (SBO), Renal Colic (R.COLIC), Non-specific Abdominal Pain (NSAP), and Dyspepsia (DYSPP)... Small Bowel Obstruction, Cholecystitis, Diverticulitis and Perforated Duodenal Ulcer. This is not surprising since the sample of cases in this study is... Diagnostic accuracy (%) by disease, computer versus physician: Non-Specific Pain 62.0 vs. 74.6; Dyspepsia 55.0 vs. 30.0; Renal Colic 25.0 vs. 100.0; Small Bowel Obstruction 25.0 vs. 75.0; Cholecystitis 57.1 vs. ...

  2. Policy Information System Computer Program.

    ERIC Educational Resources Information Center

    Hamlin, Roger E.; And Others

    The concepts and methodologies outlined in "A Policy Information System for Vocational Education" are presented in a simple computer format in this booklet. It also contains a sample output representing 5-year projections of various planning needs for vocational education. Computerized figures in the eight areas corresponding to those in the…

  3. Policy Information System Computer Program.

    ERIC Educational Resources Information Center

    Hamlin, Roger E.; And Others

    The concepts and methodologies outlined in "A Policy Information System for Vocational Education" are presented in a simple computer format in this booklet. It also contains a sample output representing 5-year projections of various planning needs for vocational education. Computerized figures in the eight areas corresponding to those in the…

  4. Information-computational system for storage, search and analytical processing of environmental datasets based on the Semantic Web technologies

    NASA Astrophysics Data System (ADS)

    Titov, A.; Gordov, E.; Okladnikov, I.

    2009-04-01

    a step in the process of development of a distributed collaborative information-computational environment to support multidisciplinary investigations of Earth regional environment [4]. Partial support of this work by SB RAS Integration Project 34, SB RAS Basic Program Project 4.5.2.2, APN Project CBA2007-08NSY and FP6 Enviro-RISKS project (INCO-CT-2004-013427) is acknowledged. References 1. E.P. Gordov, V.N. Lykosov, and A.Z. Fazliev. Web portal on environmental sciences "ATMOS" // Advances in Geosciences. 2006. Vol. 8. p. 33 - 38. 2. Gordov E.P., Okladnikov I.G., Titov A.G. Development of elements of web based information-computational system supporting regional environment processes investigations // Journal of Computational Technologies, Vol. 12, Special Issue #3, 2007, pp. 20 - 28. 3. Okladnikov I.G., Titov A.G. Melnikova V.N., Shulgina T.M. Web-system for processing and visualization of meteorological and climatic data // Journal of Computational Technologies, Vol. 13, Special Issue #3, 2008, pp. 64 - 69. 4. Gordov E.P., Lykosov V.N. Development of information-computational infrastructure for integrated study of Siberia environment // Journal of Computational Technologies, Vol. 12, Special Issue #2, 2007, pp. 19 - 30.

  5. A Rural Community's Involvement in the Design and Usability Testing of a Computer-Based Informed Consent Process for the Personalized Medicine Research Project

    PubMed Central

    Mahnke, Andrea N; Plasek, Joseph M; Hoffman, David G; Partridge, Nathan S; Foth, Wendy S; Waudby, Carol J; Rasmussen, Luke V; McManus, Valerie D; McCarty, Catherine A

    2014-01-01

    Many informed consent studies demonstrate that research subjects poorly retain and understand information in written consent documents. Previous research in multimedia consent is mixed in terms of success for improving participants’ understanding, satisfaction, and retention. This failure may be due to a lack of a community-centered design approach to building the interventions. The goal of this study was to gather information from the community to determine the best way to undertake the consent process. Community perceptions regarding different computer-based consenting approaches were evaluated, and a computer-based consent was developed and tested. A second goal was to evaluate whether participants make truly informed decisions to participate in research. Simulations of an informed consent process were videotaped to document the process. Focus groups were conducted to determine community attitudes towards a computer-based informed consent process. Hybrid focus groups were conducted to determine the most acceptable hardware device. Usability testing was conducted on a computer-based consent prototype using a touch-screen kiosk. Based on feedback, a computer-based consent was developed. Representative study participants were able to easily complete the consent, and all were able to correctly answer the comprehension check questions. Community involvement in developing a computer-based consent proved valuable for a population-based genetic study. These findings may translate to other types of informed consents, such as genetic clinical trials consents. A computer-based consent may serve to better communicate consistent, clear, accurate, and complete information regarding the risks and benefits of study participation. Additional analysis is necessary to measure the level of comprehension of the check-question answers by larger numbers of participants. The next step will involve contacting participants to measure whether understanding of what they consented to is

  6. Special data base of Informational - Computational System 'INM RAS - Black Sea' for solving inverse and data assimilation problems

    NASA Astrophysics Data System (ADS)

    Zakharova, Natalia; Piskovatsky, Nicolay; Gusev, Anatoly

    2014-05-01

    Development of Informational-Computational Systems (ICS) for data assimilation procedures is one of multidisciplinary problems. To study and solve these problems one needs to apply modern results from different disciplines and recent developments in: mathematical modeling; theory of adjoint equations and optimal control; inverse problems; numerical methods theory; numerical algebra and scientific computing. The above problems are studied in the Institute of Numerical Mathematics of the Russian Academy of Science (INM RAS) in ICS for personal computers. In this work the results on the Special data base development for ICS "INM RAS - Black Sea" are presented. In the presentation the input information for ICS is discussed, some special data processing procedures are described. In this work the results of forecast using ICS "INM RAS - Black Sea" with operational observation data assimilation are presented. This study was supported by the Russian Foundation for Basic Research (project No 13-01-00753) and by Presidium Program of Russian Academy of Sciences (project P-23 "Black sea as an imitational ocean model"). References 1. V.I. Agoshkov, M.V. Assovskii, S.A. Lebedev, Numerical simulation of Black Sea hydrothermodynamics taking into account tide-forming forces. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No.1, pp. 5-31. 2. E.I. Parmuzin, V.I. Agoshkov, Numerical solution of the variational assimilation problem for sea surface temperature in the model of the Black Sea dynamics. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No.1, pp. 69-94. 3. V.B. Zalesny, N.A. Diansky, V.V. Fomin, S.N. Moshonkin, S.G. Demyshev, Numerical model of the circulation of Black Sea and Sea of Azov. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No.1, pp. 95-111. 4. Agoshkov V.I.,Assovsky M.B., Giniatulin S. V., Zakharova N.B., Kuimov G.V., Parmuzin E.I., Fomin V.V. Informational Computational system of variational assimilation of observation data "INM RAS - Black sea"// Ecological

  7. Evidence-based guidelines for the informal use of computers by children to promote the development of academic, cognitive and social skills.

    PubMed

    Tran, Phuoc; Subrahmanyam, Kaveri

    2013-01-01

    The use of computers in the home has become very common among young children. This paper reviews research on the effects of informal computer use and identifies potential pathways through which computers may impact children's development. Based on the evidence reviewed, we present the following guidelines to arrange informal computer experiences that will promote the development of children's academic, cognitive and social skills: (1) children should be encouraged to use computers for moderate amounts of time (2-3 days a week for an hour or two per day) and (2) children's use of computers should (a) include non-violent action-based computer games as well as educational games, (b) not displace social activities but should instead be arranged to provide opportunities for social engagement with peers and family members and (c) involve content with pro-social and non-violent themes. We conclude the paper with questions that must be addressed in future research. This paper reviews research on the effects of informal computer use on children's academic, cognitive and social skills. Based on the evidence presented, we have presented guidelines to enable parents, teachers and other adults to arrange informal computer experiences so as to maximise their potential benefit for children's development.

  8. A Computer-Based Surveillance System to Support Discharge Planning: An Implementation within a Hospital Information System

    PubMed Central

    Crawford, A.G.; Young, A.J.; Stephens, R.K.; Shinn, D.T.

    1988-01-01

    This paper describes an innovation designed to improve discharge planning and, potentially, reduce the length of inpatient stays. Unlike other approaches, this one has been implemented within a Hospital Information System. The rationale is that reports integrating clinical and non-clinical data gathered upon admission can enable the Social Work Manager to assign staff more effectively, i.e., to the most difficult cases, and can enable staff to perform discharge planning more effectively, i.e., more pro-actively. The paper reviews the use of computers in social work and provides a critique, not of the models proposed, but of the implementations attempted to date. We contend that our approach will prove more successful than those other implementations because ours is based on an integrated and almost totally-electronic medical record keeping system, encompassing observations by both clinical and non-clinical staff.

  9. Measuring the Effectiveness of Information Security Training: A Comparative Analysis of Computer-Based Training and Instructor-Based Training

    ERIC Educational Resources Information Center

    Kim, Philip

    2010-01-01

    Financial institutions are increasingly finding difficulty defending against information security risks and threats, as they are often the number one target for information thieves. An effective information security training and awareness program can be a critical component of protecting an organization's information assets. Many financial…

  10. Measuring the Effectiveness of Information Security Training: A Comparative Analysis of Computer-Based Training and Instructor-Based Training

    ERIC Educational Resources Information Center

    Kim, Philip

    2010-01-01

    Financial institutions are increasingly finding difficulty defending against information security risks and threats, as they are often the number one target for information thieves. An effective information security training and awareness program can be a critical component of protecting an organization's information assets. Many financial…

  11. Computer-supported access to engineering information

    SciTech Connect

    Schwarz, H.; Glock, H.J.; Mueller, F.

    1985-01-01

    A computer-based documentation system is described that provides access to the information stored in written documents and drawings. This system contains the syntax of a documentation language, several computer programs, and special methods. The latter enable users to formulate the semantics of their own documentation language, to employ that language when describing the information content of documents and formulating queries, and to organize the storage and retrieval procedure. The system is explained by its application to nuclear power plant documentation. Finally, a layer model of an integrated software system is presented that is suited to support engineers' work continuously. 7 references.

  12. Model of Procedure Usage – Results from a Qualitative Study to Inform Design of Computer-Based Procedures

    SciTech Connect

    Johanna H Oxstrand; Katya L Le Blanc

    2012-07-01

    The nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned for these previous efforts we are now exploring a more unknown application for computer based procedures - field procedures, i.e. procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory, the Institute for Energy Technology, and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field operators. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how to best design the computer-based procedures to do this. The underlying philosophy in the research effort is “Stop – Start – Continue”, i.e. what features from the use of paper-based procedures should we not incorporate (Stop), what should we keep (Continue), and what new features or work processes should be added (Start). One step in identifying the Stop – Start – Continue was to conduct a baseline study where affordances related to the current usage of paper-based procedures were identified. The purpose of the study was to develop a model of paper based procedure use which will help to identify desirable features for computer based procedure prototypes. Affordances such as note taking, markups

  13. Developing a Standards-Based Information Model for Representing Computable Diagnostic Criteria: A Feasibility Study of the NQF Quality Data Model.

    PubMed

    Jiang, Guoqian; Solbrig, Harold R; Pathak, Jyotishman; Chute, Christopher G

    2015-01-01

    The lack of a standards-based information model has been recognized as a major barrier to representing computable diagnostic criteria. In this paper we describe our efforts in examining the feasibility of the Quality Data Model (QDM), developed by the National Quality Forum (NQF), for representing computable diagnostic criteria. We collected the diagnostic criteria for a number of diseases and disorders (n=12) from textbooks and profiled the data elements of the criteria using the QDM data elements. We identified a number of common patterns informed by the QDM. In conclusion, the common patterns informed by the QDM are useful and feasible in building a standards-based information model for computable diagnostic criteria.

  14. Quantum information density scaling and qubit operation time constraints of CMOS silicon-based quantum computer architectures

    NASA Astrophysics Data System (ADS)

    Rotta, Davide; Sebastiano, Fabio; Charbon, Edoardo; Prati, Enrico

    2017-06-01

    Even the quantum simulation of an apparently simple molecule such as Fe2S2 requires a considerable number of qubits of the order of 10^6, while more complex molecules such as alanine (C3H7NO2) require about a hundred times more. In order to assess such a multimillion scale of identical qubits and control lines, the silicon platform seems to be one of the most indicated routes as it naturally provides, together with qubit functionalities, the capability of nanometric, serial, and industrial-quality fabrication. The scaling trend of microelectronic devices predicting that computing power would double every 2 years, known as Moore's law, according to the new slope set after the 32-nm node of 2009, suggests that the technology roadmap will achieve the 3-nm manufacturability limit proposed by Kelly around 2020. Today, circuital quantum information processing architectures are predicted to take advantage of the scalability ensured by silicon technology. However, the maximum amount of quantum information per unit surface that can be stored in silicon-based qubits and the consequent space constraints on qubit operations have never been addressed so far. This represents one of the key parameters toward the implementation of quantum error correction for fault-tolerant quantum information processing and its dependence on the features of the technology node. The maximum quantum information per unit surface virtually storable and controllable in the compact exchange-only silicon double quantum dot qubit architecture is expressed as a function of the complementary metal-oxide-semiconductor technology node, so the size scale optimizing both physical qubit operation time and quantum error correction requirements is assessed by reviewing the physical and technological constraints. According to the requirements imposed by the quantum error correction method and the constraints given by the typical strength of the exchange coupling, we determine the workable operation frequency

  15. Computing spatial information from Fourier coefficient distributions.

    PubMed

    Heinz, William F; Werbin, Jeffrey L; Lattman, Eaton; Hoh, Jan H

    2011-05-01

    The spatial relationships between molecules can be quantified in terms of information. In the case of membranes, the spatial organization of molecules in a bilayer is closely related to biophysically and biologically important properties. Here, we present an approach to computing spatial information based on Fourier coefficient distributions. The Fourier transform (FT) of an image contains a complete description of the image, and the values of the FT coefficients are uniquely associated with that image. For an image where the distribution of pixels is uncorrelated, the FT coefficients are normally distributed and uncorrelated. Further, the probability distribution for the FT coefficients of such an image can readily be obtained by Parseval's theorem. We take advantage of these properties to compute the spatial information in an image by determining the probability of each coefficient (both real and imaginary parts) in the FT, then using the Shannon formalism to calculate information. By using the probability distribution obtained from Parseval's theorem, an effective distance from the uncorrelated or most uncertain case is obtained. The resulting quantity is an information computed in k-space (kSI). This approach provides a robust, facile and highly flexible framework for quantifying spatial information in images and other types of data (of arbitrary dimensions). The kSI metric is tested on a 2D Ising model, frequently used as a model for lipid bilayers, and the temperature-dependent phase transition is accurately determined from the spatial information in configurations of the system.
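
    The sketch below conveys the flavor of the approach: normalize the FFT coefficients by the Parseval-derived null standard deviation and measure, in bits, how far their empirical distribution departs from the uncorrelated (standard normal) case. Implementing that departure as a binned KL divergence is our simplification, not necessarily the authors' exact kSI estimator.

        import numpy as np
        from scipy.stats import norm

        def k_space_information(image, bins=64):
            """KL divergence (bits) between the empirical distribution of normalized
            FFT coefficients and the N(0, 1) null expected for uncorrelated pixels."""
            x = np.asarray(image, float)
            x = x - x.mean()
            n = x.size
            null_sd = np.sqrt(n * x.var() / 2.0)   # per real/imag coefficient (Parseval)
            f = np.fft.fft2(x)
            z = np.concatenate([f.real.ravel(), f.imag.ravel()]) / null_sd
            edges = np.linspace(-6.0, 6.0, bins + 1)
            counts, _ = np.histogram(np.clip(z, -6.0, 6.0), bins=edges)
            p = counts / counts.sum()
            q = np.diff(norm.cdf(edges))           # null probabilities per bin
            nz = p > 0
            return float(np.sum(p[nz] * np.log2(p[nz] / q[nz])))

        rng = np.random.default_rng(1)
        print(k_space_information(rng.normal(size=(64, 64))))                   # near zero
        stripes = np.tile(np.sin(np.linspace(0, 8 * np.pi, 64)), (64, 1))
        print(k_space_information(stripes + 0.1 * rng.normal(size=(64, 64))))   # larger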

  16. Novel information theory based method for superimposition of lateral head radiographs and cone beam computed tomography images

    PubMed Central

    Jacquet, W; Nyssen, E; Bottenberg, P; de Groen, P; Vande Vannet, B

    2010-01-01

    Objectives: The aim was to introduce a novel alignment criterion, focus mutual information (FMI), for the superimposition of lateral cephalometric radiographs and three-dimensional (3D) cone beam computed tomography images, to assess the alignment characteristics of the new method, and to compare the novel methodology with the region of interest (ROI) approach. Methods: Implementation of an FMI criterion-based methodology that only requires the approximate indication of stable structures in one single image. The robustness of the method was first addressed in a phantom experiment comparing the new technique with a ROI approach. Two consecutive cephalometric radiographs were then obtained, one before and one after functional twin block application. These images were then superimposed using alignment by FMI focused, in several ways, on: (1) the cranial base and acoustic meatus, (2) the palatal plane and (3) the mandibular symphysis. The superimposed images were subtracted and coloured. The applicability to cone beam CT (CBCT) is illustrated by the alignment of CBCT images acquired before and after craniofacial surgery. Results: The phantom experiment clearly shows superior alignment when compared to the ROI approach (Wilcoxon n = 17, Z = −3.290, and P = 0.001), and robustness with respect to the choice of parameters (one-sample t-test n = 50, t = −12.355, and P = 0.000). The treatment effects are revealed clearly in the subtraction image of well-aligned cephalometric radiographs. The colouring scheme of the subtraction image emphasises the areas of change and visualizes the remodelling of the soft tissue. Conclusions: FMI allows for cephalometry without tracing, it avoids the error inherent in the use of landmarks, and the interaction of the practitioner is kept to a minimum. The robustness to focal distribution variations limits the influence of possible examiner inaccuracy. PMID:20395459
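
    A minimal sketch of a focus-weighted mutual information criterion is shown below, restricted to integer 2D translations for brevity. Weighting the joint histogram by a user-drawn focus map follows the idea described above, but the optimizer, interpolation, and transformation model of the published method are not reproduced, and the function names are ours.

        import numpy as np

        def focus_mutual_information(fixed, moving, focus, bins=32):
            """Mutual information of gray levels, with each pixel's contribution to the
            joint histogram weighted by a focus map (values in [0, 1]) drawn over the
            stable structures of the fixed image."""
            joint, _, _ = np.histogram2d(fixed.ravel(), moving.ravel(),
                                         bins=bins, weights=focus.ravel())
            pxy = joint / joint.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            nz = pxy > 0
            return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

        def best_translation(fixed, moving, focus, search=10):
            """Exhaustive search over integer 2D shifts maximizing FMI (np.roll is a
            crude wrap-around stand-in for proper resampling)."""
            best, best_fmi = (0, 0), -np.inf
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    shifted = np.roll(np.roll(moving, dy, axis=0), dx, axis=1)
                    fmi = focus_mutual_information(fixed, shifted, focus)
                    if fmi > best_fmi:
                        best, best_fmi = (dy, dx), fmi
            return best, best_fmi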

  17. Sound and computer information presentation

    SciTech Connect

    Bly, S

    1982-03-01

    This thesis examines the use of sound to present data. Computer graphics currently offers a vast array of techniques for communicating data to analysts. Graphics is limited, however, by the number of dimensions that can be perceived at one time, by the types of data that lend themselves to visual representation, and by the necessary eye focus on the output. Sound offers an enhancement and an alternative to graphic tools. Multivariate, logarithmic, and time-varying data provide examples for aural representation. For each of these three types of data, the thesis suggests a method of encoding the information into sound and presents various applications. Data values were mapped to sound characteristics such as pitch and volume so that information was presented as sets or sequences of notes. In all cases, the resulting sounds conveyed information in a manner consistent with prior knowledge of the data. Experiments showed that sound does convey information accurately and that sound can enhance graphic presentations. Subjects were tested on their ability to distinguish between two sources of test items. In the first phase of the experiments, subjects discriminated between two 6-dimensional data sets represented in sound. In the second phase of the experiment, 75 subjects were selected and assigned to one of three groups. The first group of 25 heard test items, the second group saw test items, and the third group both heard and saw the test items. The average percentage correct was 64.5% for the sound-only group, 62% for the graphics-only group, and 69% for the sound and graphics group. In the third phase, additional experiments focused on the mapping between data values and sound characteristics and on the training methods.
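
    The mapping described above (data values to pitch, with volume held fixed) can be sketched in a few lines; the sampling rate, note length, frequency range, and example series below are arbitrary choices for illustration, not the thesis's settings.

        import wave
        import numpy as np

        def sonify(values, filename="series.wav", rate=44100, note_s=0.25,
                   f_lo=220.0, f_hi=880.0):
            """Map a data series to a sequence of tones: each value becomes a note
            whose pitch scales linearly between f_lo and f_hi, at fixed volume."""
            v = np.asarray(values, float)
            v = (v - v.min()) / (np.ptp(v) or 1.0)               # normalize to [0, 1]
            t = np.linspace(0.0, note_s, int(rate * note_s), endpoint=False)
            notes = [np.sin(2 * np.pi * (f_lo + x * (f_hi - f_lo)) * t) for x in v]
            audio = (np.concatenate(notes) * 0.5 * 32767).astype(np.int16)
            with wave.open(filename, "wb") as w:
                w.setnchannels(1)
                w.setsampwidth(2)                                 # 16-bit samples
                w.setframerate(rate)
                w.writeframes(audio.tobytes())

        rng = np.random.default_rng(2)
        sonify(np.sin(np.linspace(0, 4 * np.pi, 32)) + 0.2 * rng.normal(size=32))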

  18. Information Sources on Computer Literacy.

    ERIC Educational Resources Information Center

    Ossman, Marian R.

    1984-01-01

    Cites books, journals, articles, and speeches covering the gamut from computer literacy as a national crisis to a current listing of popular computer camps, educational computing, library role, and staff training. Primary focus is on microcomputers, but several less recent articles are oriented to computers in general. (MBR)

  19. Pen-based computers: Computers without keys

    NASA Technical Reports Server (NTRS)

    Conklin, Cheryl L.

    1994-01-01

    The National Space Transportation System (NSTS) comprises many diverse and highly complex systems incorporating the latest technologies. Data collection associated with ground processing of the various Space Shuttle system elements is extremely challenging due to the many separate processing locations where data is generated. This presents a significant problem when the timely collection, transfer, collation, and storage of data is required. This paper describes how new technology, referred to as Pen-Based computers, is being used to transform the data collection process at Kennedy Space Center (KSC). Pen-Based computers have streamlined procedures, increased data accuracy, and now provide more complete information than previous methods. The end result is the elimination of Shuttle processing delays associated with data deficiencies.

  20. Detailed requirements document for Stowage List and Hardware Tracking System (SLAHTS). [computer based information management system in support of space shuttle orbiter stowage configuration

    NASA Technical Reports Server (NTRS)

    Keltner, D. J.

    1975-01-01

    The stowage list and hardware tracking system, a computer based information management system, used in support of the space shuttle orbiter stowage configuration and the Johnson Space Center hardware tracking is described. The input, processing, and output requirements that serve as a baseline for system development are defined.

  1. Evaluating the Informative Quality of Documents in SGML Format from Judgements by Means of Fuzzy Linguistic Techniques Based on Computing with Words.

    ERIC Educational Resources Information Center

    Herrera-Viedma, Enrique; Peis, Eduardo

    2003-01-01

    Presents a fuzzy evaluation method of SGML documents based on computing with words. Topics include filtering the amount of information available on the Web to assist users in their search processes; document type definitions; linguistic modeling; user-system interaction; and use with XML and other markup languages. (Author/LRW)

  2. Evaluating the Informative Quality of Documents in SGML Format from Judgements by Means of Fuzzy Linguistic Techniques Based on Computing with Words.

    ERIC Educational Resources Information Center

    Herrera-Viedma, Enrique; Peis, Eduardo

    2003-01-01

    Presents a fuzzy evaluation method of SGML documents based on computing with words. Topics include filtering the amount of information available on the Web to assist users in their search processes; document type definitions; linguistic modeling; user-system interaction; and use with XML and other markup languages. (Author/LRW)

  3. Traffic information computing platform for big data

    SciTech Connect

    Duan, Zongtao Li, Ying Zheng, Xibin Liu, Yan Dai, Jiting Kang, Jun

    2014-10-06

    The big data environment creates the conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through an in-depth analysis of the connotation and technological characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee traffic safety and efficient operation, and enables more intelligent and personalized traffic information services for traffic information users.

  4. Traffic information computing platform for big data

    NASA Astrophysics Data System (ADS)

    Duan, Zongtao; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun

    2014-10-01

    The big data environment creates the conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through an in-depth analysis of the connotation and technological characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee traffic safety and efficient operation, and enables more intelligent and personalized traffic information services for traffic information users.

  5. Information content compression and zero-order elimination of computer-generated hologram based on discrete cosine transform

    NASA Astrophysics Data System (ADS)

    Ren, Zhenbo; Su, Ping; Ma, Jianshe

    2013-11-01

    A hologram always contains redundancy. Its huge information content brings difficulties in storage and transmission. In addition, when a computer-generated hologram (CGH) is reproduced, the zero-order diffraction spot, which occupies most of the energy, reduces the contrast of the reproduced image and disturbs observation and recording. In this paper, the discrete cosine transform (DCT) is used to compress the information content of a CGH and to eliminate the zero-order spot. Numerical simulation results showed that DCT could effectively remove about 64.8% of the redundancy without affecting the quality of reconstruction. It also eliminated the zero-order spot and improved the contrast of the reproduced image. Finally, the optical reconstruction results were shown. Numerical and optical experiments both showed that the reconstructed image after compression lost almost no resolution perceptible to the human eye.
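    The compression step can be illustrated with a generic 2-D DCT thresholding pass: transform the hologram, keep only the strongest coefficients, zero the rest, and invert. This is a hedged sketch of the general technique rather than the paper's pipeline; the keep_fraction value (chosen to discard roughly 65% of the coefficients, echoing the redundancy figure above) and the synthetic test pattern are assumptions.

```python
import numpy as np
from scipy.fft import dctn, idctn

def compress_hologram(hologram, keep_fraction=0.35):
    """Keep only the largest-magnitude DCT coefficients of a real-valued CGH."""
    coeffs = dctn(hologram, norm="ortho")
    threshold = np.quantile(np.abs(coeffs), 1.0 - keep_fraction)
    compressed = np.where(np.abs(coeffs) >= threshold, coeffs, 0.0)
    return idctn(compressed, norm="ortho"), compressed

# Usage on a smooth synthetic pattern standing in for a hologram: most of the
# energy sits in a few low-frequency coefficients, so the error stays small.
rng = np.random.default_rng(0)
pattern = rng.standard_normal((64, 64)).cumsum(axis=0).cumsum(axis=1)
reconstruction, kept = compress_hologram(pattern)
print("relative error:", np.linalg.norm(pattern - reconstruction) / np.linalg.norm(pattern))
```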

  6. Computer Information System For Nuclear Medicine

    NASA Astrophysics Data System (ADS)

    Cahill, P. T.; Knowles, R. J.; Tsen, O.

    1983-12-01

    To meet the complex needs of a nuclear medicine division serving a 1100-bed hospital, a computer information system has been developed in sequential phases. This database management system is based on a time-shared minicomputer linked to a broadband communications network. The database contains information on patient histories, billing, types of procedures, doses of radiopharmaceuticals, times of study, scanning equipment used, and technician performing the procedure. These patient records are cycled through three levels of storage: (a) an active file of 100 studies for those patients currently scheduled, (b) a temporary storage level of 1000 studies, and (c) an archival level of 10,000 studies containing selected information. Merging of this information with reports and various statistical analyses are possible. This first phase has been in operation for well over a year. The second phase is an upgrade of the size of the various storage levels by a factor of ten.

  7. Computer virus information update CIAC-2301

    SciTech Connect

    Orvis, W.J.

    1994-01-15

    While CIAC periodically issues bulletins about specific computer viruses, these bulletins do not cover all the computer viruses that affect desktop computers. The purpose of this document is to identify most of the known viruses for the MS-DOS and Macintosh platforms and give an overview of the effects of each virus. The authors also include information on some Windows, Atari, and Amiga viruses. This document is revised periodically as new virus information becomes available. This document replaces all earlier versions of the CIAC Computer Virus Information Update. The date on the front cover indicates the date on which the information in this document was extracted from CIAC's Virus database.

  8. An Introduction to Computer-Based Learning.

    ERIC Educational Resources Information Center

    Reynolds, Angus

    1983-01-01

    Defines computer-based learning terminology. Describes six modes of computer-assisted instruction (tutorial, drill and practice, instructional game, modeling, simulation, problem solving); three modes of computer-managed instruction (testing, prescription generation, recordkeeping); and computer-supported learning resources (information storage…

  9. Improving Royal Australian Air Force Strategic Airlift Planning by Application of a Computer Based Management Information System

    DTIC Science & Technology

    1991-12-01

    perceived ability of the system to support organisational goals and objectives. Other: With what other information systems should communications be considered... information system? A system is not required or justified. Potential exists for a system but not right now. A system may be justified. However, the environment... IMPROVING ROYAL AUSTRALIAN AIR FORCE STRATEGIC AIRLIFT PLANNING BY APPLICATION OF A COMPUTER BASED MANAGEMENT INFORMATION SYSTEM. THESIS. Neil A. Cooper, Squadron

  10. Space-based Observation System Simulation Experiments for the Global Water Cycle: Information Tradeoffs, Model Diagnostics, and Exascale Computing

    NASA Astrophysics Data System (ADS)

    Reed, P. M.

    2011-12-01

    Global scale issues such as population growth, changing land-use, and climate change place our natural resources at the center of focus for a broad range of interdependent science, engineering, and policy problems. Our ability to mitigate and adapt to the accelerating rate of environmental change is critically dependent on our ability to observe and predict the natural, built, and social systems that define sustainability at the global scale. Despite the risks and challenges posed by global change, we are faced with critical risks to our ability to maintain and improve long term space-based observations of these changes. Despite consensus agreement on the critical importance of space-based Earth science, the fundamental challenge remains: How should we manage the severe tradeoffs and design challenges posed by maximizing the value of existing and proposed space-based Earth observation systems? Addressing this question requires transformative innovations in the design and management of space-based Earth observation systems that effectively take advantage of massively parallel computing architectures to enable the discovery and exploitation of critical mission tradeoffs using high-resolution space-based observation system simulation experiments (OSSEs) that simulate the global water cycle data that would result from sensing innovations and evaluate their merit with carefully constructed prediction and management benchmarks.

  11. New Information Dispersal Techniques for Trustworthy Computing

    ERIC Educational Resources Information Center

    Parakh, Abhishek

    2011-01-01

    Information dispersal algorithms (IDA) are used for distributed data storage because they simultaneously provide security, reliability and space efficiency, constituting a trustworthy computing framework for many critical applications, such as cloud computing, in the information society. In the most general sense, this is achieved by dividing data…
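    For orientation, the sketch below shows the simplest flavour of dispersal: an n-of-n XOR split in which every share is required to rebuild the data. Real information dispersal algorithms such as Rabin's IDA use erasure coding so that any k of n shares suffice with storage overhead n/k; that threshold behaviour is deliberately not implemented here, and the function names are this sketch's own.

```python
import os
from functools import reduce

def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split(data: bytes, n: int = 3) -> list[bytes]:
    """Disperse data into n shares; all n shares are needed for reconstruction."""
    random_shares = [os.urandom(len(data)) for _ in range(n - 1)]
    final_share = reduce(_xor, random_shares, data)   # data XOR r1 XOR ... XOR r_{n-1}
    return random_shares + [final_share]

def combine(shares: list[bytes]) -> bytes:
    """Recover the original data by XOR-ing every share together."""
    return reduce(_xor, shares)

secret = b"patient record #42"
shares = split(secret, n=4)
assert combine(shares) == secret              # any missing share makes recovery impossible
```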

  12. New Information Dispersal Techniques for Trustworthy Computing

    ERIC Educational Resources Information Center

    Parakh, Abhishek

    2011-01-01

    Information dispersal algorithms (IDA) are used for distributed data storage because they simultaneously provide security, reliability and space efficiency, constituting a trustworthy computing framework for many critical applications, such as cloud computing, in the information society. In the most general sense, this is achieved by dividing data…

  13. Computer based satellite design

    NASA Astrophysics Data System (ADS)

    Lashbrook, David D.

    1992-06-01

    A computer program to design geosynchronous spacecraft has been developed. The program consists of four separate but interrelated executable computer programs. The programs are compiled to run on a DOS-based personal computer. The source code is written in the DoD-mandated Ada programming language. The thesis presents the design technique and design equations used in the program. Detailed analysis is performed in the following areas for both dual-spin and three-axis stabilized spacecraft configurations: (1) Mass Propellant Budget and Mass Summary; (2) Battery Cell and Solar Cell Requirements for a Payload Power Requirement; and (3) Passive Thermal Control Requirements. A user's manual is included as Appendix A, and the source code for the computer programs as Appendix B.

  14. Computer Software for Information Management.

    ERIC Educational Resources Information Center

    Lesk, Michael

    1984-01-01

    Discusses software developed to organize and retrieve electronically stored data, examining structure of the databases in which information is stored and the physical structure of the storage medium. Hierarchical and relational databases, unordered files, B-trees, and storage/software for specific purposes (such as weather, stock market, and…

  15. The Laboratory for Information and Computer Science.

    ERIC Educational Resources Information Center

    Jensen, Alton P.; Slamecka, Vladimir

    This document briefly explains the relationship between the School of Information Science and the Laboratory for Information and Computer Science at the Georgia Institute of Technology. The explicit purposes of the information science laboratory are spelled out as well as the specific objectives for the 1969/70, 1970/71, and 1971/72 school years.…

  16. The Primary Care Research Object Model (PCROM): A Computable Information Model for Practice-based Primary Care Research

    PubMed Central

    Speedie, Stuart M.; Taweel, Adel; Sim, Ida; Arvanitis, Theodoros N.; Delaney, Brendan; Peterson, Kevin A.

    2008-01-01

    Objectives Chronic disease prevalence and burden is growing, as is the need for applicable large community-based clinical trials of potential interventions. To support the development of clinical trial management systems for such trials, a community-based primary care research information model is needed. We analyzed the requirements of trials in this environment, and constructed an information model to drive development of systems supporting trial design, execution, and analysis. We anticipate that this model will contribute to a deeper understanding of all the dimensions of clinical research and that it will be integrated with other clinical research modeling efforts, such as the Biomedical Research Integrated Domain Group (BRIDG) model, to complement and expand on current domain models. Design We used unified modeling language modeling to develop use cases, activity diagrams, and a class (object) model to capture components of research in this setting. The initial primary care research object model (PCROM) scope was the performance of a randomized clinical trial (RCT). It was validated by domain experts worldwide, and underwent a detailed comparison with the BRIDG clinical research reference model. Results We present a class diagram and associated definitions that capture the components of a primary care RCT. Forty-five percent of PCROM objects were mapped to BRIDG, 37% differed in class and/or subclass assignment, and 18% did not map. Conclusion The PCROM represents an important link between existing research reference models and the real-world design and implementation of systems for managing practice-based primary care clinical trials. Although the high degree of correspondence between PCROM and existing research reference models provides evidence for validity and comprehensiveness, existing models require object extensions and modifications to serve primary care research. PMID:18579829

  17. Computer-Based Environmental Management

    NASA Astrophysics Data System (ADS)

    Seppelt, Ralf

    2003-11-01

    This book provides professionals in environmental research and management with the information they need for computer modeling. Highlights include: A detailed summary of available software tools Presents cutting-edge mathematical methodology (e.g. fuzzy logic, hybrid Petri nets, optimum control theory) in a clear, understandable way Colour illustrations, flowcharts and worked examples that visualise and explain complex mathematical tasks. Case studies from various fields of application making it easier to apply simulation models for the solution of real-world problems Computer-Based Environmental Management is a unique reference for all environmental chemists, ecologists and agricultural scientists.

  18. Computers and information technologies in psychiatric nursing.

    PubMed

    Repique, Renee John R

    2007-04-01

    There is an assumption that psychiatric nurses are late adopters of technology because psychiatric nursing has been traditionally viewed as a nontechnological nursing specialty. This article will review current nursing literature to outline the value and significance of computers and information technologies to psychiatric nursing. Existing bodies of research literature related to computers and information technology for psychiatric nurses. Three areas of psychiatric nursing are identified and the specific advantages and benefits of computers and information technologies in each of these areas are discussed. In addition, the importance of informatics competencies for psychiatric nursing practice is reiterated in order to accelerate its acquisition.

  19. Integrated Sensing and Information Processing Theme-Based Redesign of the Undergraduate Electrical and Computer Engineering Curriculum at Duke University

    ERIC Educational Resources Information Center

    Ybarra, Gary A.; Collins, Leslie M.; Huettel, Lisa G.; Brown, April S.; Coonley, Kip D.; Massoud, Hisham Z.; Board, John A.; Cummer, Steven A.; Choudhury, Romit Roy; Gustafson, Michael R.; Jokerst, Nan M.; Brooke, Martin A.; Willett, Rebecca M.; Kim, Jungsang; Absher, Martha S.

    2011-01-01

    The field of electrical and computer engineering has evolved significantly in the past two decades. This evolution has broadened the field of ECE, and subfields have seen deep penetration into very specialized areas. Remarkable devices and systems arising from innovative processes, exotic materials, high speed computer simulations, and complex…

  20. Information thermodynamics of near-equilibrium computation

    NASA Astrophysics Data System (ADS)

    Prokopenko, Mikhail; Einav, Itai

    2015-06-01

    In studying fundamental physical limits and properties of computational processes, one is faced with the challenges of interpreting primitive information-processing functions through well-defined information-theoretic as well as thermodynamic quantities. In particular, transfer entropy, characterizing the function of computational transmission and its predictability, is known to peak near critical regimes. We focus on a thermodynamic interpretation of transfer entropy aiming to explain the underlying critical behavior by associating information flows intrinsic to computational transmission with particular physical fluxes. Specifically, in isothermal systems near thermodynamic equilibrium, the gradient of the average transfer entropy is shown to be dynamically related to Fisher information and the curvature of system's entropy. This relationship explicitly connects the predictability, sensitivity, and uncertainty of computational processes intrinsic to complex systems and allows us to consider thermodynamic interpretations of several important extreme cases and trade-offs.
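    Transfer entropy, the central quantity here, can be estimated directly from discrete time series with a simple plug-in estimator. The sketch below uses a history length of one and is only meant to illustrate the quantity being discussed; it does not reproduce the paper's thermodynamic formalism, and the toy example is an assumption of this sketch.

```python
import numpy as np
from collections import Counter

def transfer_entropy(source, target):
    """Plug-in estimate of TE(source -> target) in bits, history length 1:
    T = sum p(x_{t+1}, x_t, y_t) log2[ p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t) ]."""
    x, y = np.asarray(target), np.asarray(source)
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))     # (x_{t+1}, x_t, y_t)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))           # (x_t, y_t)
    pairs_xx = Counter(zip(x[1:], x[:-1]))            # (x_{t+1}, x_t)
    singles = Counter(x[:-1])                         # x_t
    n = len(x) - 1
    te = 0.0
    for (xp, xt, yt), count in triples.items():
        p_joint = count / n
        p_cond_full = count / pairs_xy[(xt, yt)]        # p(x_{t+1} | x_t, y_t)
        p_cond_self = pairs_xx[(xp, xt)] / singles[xt]  # p(x_{t+1} | x_t)
        te += p_joint * np.log2(p_cond_full / p_cond_self)
    return te

# Toy example: y drives x with a one-step delay plus 10% bit flips,
# so TE(y -> x) should be clearly positive (about 0.5 bits).
rng = np.random.default_rng(1)
y = rng.integers(0, 2, 5000)
noise = (rng.random(5000) < 0.1).astype(int)
x = np.roll(y, 1) ^ noise
print(round(transfer_entropy(y, x), 3))
```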

  1. Information thermodynamics of near-equilibrium computation.

    PubMed

    Prokopenko, Mikhail; Einav, Itai

    2015-06-01

    In studying fundamental physical limits and properties of computational processes, one is faced with the challenges of interpreting primitive information-processing functions through well-defined information-theoretic as well as thermodynamic quantities. In particular, transfer entropy, characterizing the function of computational transmission and its predictability, is known to peak near critical regimes. We focus on a thermodynamic interpretation of transfer entropy aiming to explain the underlying critical behavior by associating information flows intrinsic to computational transmission with particular physical fluxes. Specifically, in isothermal systems near thermodynamic equilibrium, the gradient of the average transfer entropy is shown to be dynamically related to Fisher information and the curvature of system's entropy. This relationship explicitly connects the predictability, sensitivity, and uncertainty of computational processes intrinsic to complex systems and allows us to consider thermodynamic interpretations of several important extreme cases and trade-offs.

  2. Using Interactive Computer to Communicate Scientific Information.

    ERIC Educational Resources Information Center

    Selnow, Gary W.

    1988-01-01

    Asks whether the computer is another channel of communication, if its interactive qualities make it an information source, or if it is an undefined hybrid. Concludes that computers are neither the medium nor the source but will in the future provide the possibility of a sophisticated interaction between human intelligence and artificial…

  3. Using Interactive Computer to Communicate Scientific Information.

    ERIC Educational Resources Information Center

    Selnow, Gary W.

    1988-01-01

    Asks whether the computer is another channel of communication, if its interactive qualities make it an information source, or if it is an undefined hybrid. Concludes that computers are neither the medium nor the source but will in the future provide the possibility of a sophisticated interaction between human intelligence and artificial…

  4. Secure medical information sharing in cloud computing.

    PubMed

    Shao, Zhiyi; Yang, Bo; Zhang, Wenzheng; Zhao, Yi; Wu, Zhenqiang; Miao, Meixia

    2015-01-01

    Medical information sharing is one of the most attractive applications of cloud computing, where searchable encryption is a fascinating solution for securely and conveniently sharing medical data among different medical organizers. However, almost all previous works are designed in symmetric key encryption environment. The only works in public key encryption do not support keyword trapdoor security, have long ciphertext related to the number of receivers, do not support receiver revocation without re-encrypting, and do not preserve the membership of receivers. In this paper, we propose a searchable encryption supporting multiple receivers for medical information sharing based on bilinear maps in public key encryption environment. In the proposed protocol, data owner stores only one copy of his encrypted file and its corresponding encrypted keywords on cloud for multiple designated receivers. The keyword ciphertext is significantly shorter and its length is constant without relation to the number of designated receivers, i.e., for n receivers the ciphertext length is only twice the element length in the group. Only the owner knows that with whom his data is shared, and the access to his data is still under control after having been put on the cloud. We formally prove the security of keyword ciphertext based on the intractability of Bilinear Diffie-Hellman problem and the keyword trapdoor based on Decisional Diffie-Hellman problem.

  5. Expert computer systems. 1977-April, 1983 (Citations from the International Information Service for the Physics and Engineering Communities Data Base)

    SciTech Connect

    Not Available

    1983-04-01

    This bibliography contains citations concerning the use of knowledge and reasoning techniques in computer programs designated as expert systems. Expert systems are a form of artificial intelligence, used to solve problems that often require human intervention to apply reasoning and experience. Applications include engineering design, medical diagnosis, management decision support systems, computer assisted instruction, structural analysis, and improvement in man-machine communications. (Contains 98 citations fully indexed and including a title list.)

  6. Computer games and information-processing skills.

    PubMed

    Yuji, H

    1996-10-01

    To assess the association between past use of computer games and parallel-processing skills, as measured by computer-based tests of discrimination perception, 46 boys and girls in kindergarten, aged 4 to 6 years, were classified into player and nonplayer groups of 17 each by their enthusiasm for computer games. There were no significant differences between the two groups in correct responses; however, RTs of players were significantly faster than those of nonplayers. RTs differed for color and shape. Experience with computer games might develop information-processing skills.

  7. AFE base flow computations

    NASA Technical Reports Server (NTRS)

    Venkatapathy, Ethiraj; Prabhu, Dinesh K.; Palmer, Grant

    1991-01-01

    Hypersonic wake flows behind the Aeroassist Flight Experiment (AFE) geometry are analyzed using two Navier-Stokes flow solvers. Many of the AFE wake features observed in ballistic-range shadowgraphs are simulated using a simple, two-dimensional semicylinder geometry at moderate angles of attack. At free-stream conditions corresponding to a Hypersonic Free Flight Facility (HFFF) AFE experiment, the three-dimensional base flow for the AFE geometry is computed using an ideal-gas, Navier-Stokes solver. The computed results agree reasonably well with the shadowgraphs taken at the HFFF. An ideal-gas and a nonequilibrium Navier-Stokes solver have been coupled and applied to the complete flow around the AFE vehicle at the free-stream conditions corresponding to a nominal trajectory point. Limitations of the coupled ideal-gas and nonequilibrium solution are discussed. The nonequilibrium base flow solution is analyzed for the wake radiation and the radiation profiles along various lines of sight are compared. Finally, the wake unsteadiness is predicted using experimental correlations and the numerical solutions. An adaptive grid code, SAGE, has been used in all the simulations to enhance the solution accuracy. The grid adaptation is found to be necessary in obtaining base flow solutions with accurate flow features.

  8. Computing, Information and Communications Technology (CICT) Website

    NASA Technical Reports Server (NTRS)

    Hardman, John; Tu, Eugene (Technical Monitor)

    2002-01-01

    The Computing, Information and Communications Technology Program (CICT) was established in 2001 to ensure NASA's Continuing leadership in emerging technologies. It is a coordinated, Agency-wide effort to develop and deploy key enabling technologies for a broad range of mission-critical tasks. The NASA CICT program is designed to address Agency-specific computing, information, and communications technology requirements beyond the projected capabilities of commercially available solutions. The areas of technical focus have been chosen for their impact on NASA's missions, their national importance, and the technical challenge they provide to the Program. In order to meet its objectives, the CICT Program is organized into the following four technology focused projects: 1) Computing, Networking and Information Systems (CNIS); 2) Intelligent Systems (IS); 3) Space Communications (SC); 4) Information Technology Strategic Research (ITSR).

  9. An Evaluation of a Computer-Based Videotext Information Delivery System for Farmers: The Green Thumb Project.

    ERIC Educational Resources Information Center

    Warner, Paul D.; Clearfield, Frank

    The Green Thumb Project was designed to test the feasibility of operating a computerized system for disseminating weather, market, and other agricultural production and management information on a day-to-day basis; to develop a prototype software support system for the test; and to provide essential project information on conduct of the test to…

  10. Symposium on Human-Computer Information Retrieval.

    PubMed

    Tunkelang, Daniel; Capra, Robert; Golovchinsky, Gene; Kules, Bill; Smith, Catherine; White, Ryen

    2013-03-01

    Human-computer information retrieval (HCIR) is the study of information retrieval techniques that integrate human intelligence and algorithmic search to help people explore, understand, and use information. Since 2007, we have held an annual gathering of researchers and practitioners to advance the state of the art in this field. This meeting report summarizes the history of the HCIR symposium and emphasizes its relevance to the data science community.

  11. Computer Conference in Information Service. Research Report 191.

    ERIC Educational Resources Information Center

    Repo, Aatto J.

    This document describes the development of computer conferencing (CC) and its role within information service communities, particularly in Finland and Sweden. CC is defined as a computer-based messaging (CBM) system providing an asynchronous communications structure for group work. It is noted that CC differs from electronic mail and that CC…

  12. Automated Information Management for Designers: Functional Requirements for Computer Based Associates That Support Access and Use of Technical Information in Design

    DTIC Science & Technology

    1993-03-01

    Support Access and Use of Technical Information in Design. William J. Cody; William B. Rouse. Search Technology, 4898 S. Old Peachtree Road... access and use of such information. Termed Designer's Associates, these supports derive from a thorough understanding of designers' tasks and the services that intelligent...

  13. RESEARCH ON COMPUTER-AUGMENTED INFORMATION MANAGEMENT

    DTIC Science & Technology

    information management. The report is, in itself, a product of the project: with the exception of 'front matter,' the entire report was composed, edited, and produced with on-line and off-line computer aids. For this project, the techniques of computer aids were applied to two areas: task monitoring and program design. The processes and techniques developed offer a promising beginning to computer-aided program design, extending from initial specification to final debugging in a unified design record that grows and evolves into complete final documentation. The

  14. [Patient's Autonomy and Information in Psycho-Oncology: Computer Based Distress Screening for an Interactive Treatment Planning (ePOS-react)].

    PubMed

    Schäffeler, Norbert; Sedelmaier, Jana; Möhrer, Hannah; Ziser, Katrin; Ringwald, Johanna; Wickert, Martin; Brucker, Sara; Junne, Florian; Zipfel, Stephan; Teufel, Martin

    2017-07-01

    Identifying distressed patients in oncology using screening questionnaires is quite challenging in clinical routine. To date there is no evidence-based recommendation as to which instrument is most suitable and how to put screening into practice. Computer-based screening tools offer the possibility to automatically analyse patients' data, inform psycho-oncological and medical staff about the results, and use reactive questionnaires. Studies on how to empower patients in decision making in psycho-oncology are rare. Methods: Women with breast and gynaecological cancer were consecutively included in this study (n=103) at the time of inpatient surgical treatment in a gynaecological clinic. They answered the computer-based screening questionnaire (ePOS-react) for routine distress screening at the time of admission. At the end of the tool an individual recommendation concerning psycho-oncological treatment is given: (i) psycho-oncological counselling, (ii) brief psycho-oncological contact, or (iii) no treatment suggestion. The informed patients could then choose autonomously either the recommended treatment or an individually preferred alternative. Additionally, a clinical interview (approx. 30 min) based on the "Psychoonkologische Basisdiagnostik (PO-Bado)" was carried out as a third-party assessment of patients' need for treatment. Results: 68.9% followed the treatment recommendation, 22.3% asked for a more "intense" intervention (e.g. counselling instead of the recommended brief contact), and 8.7% for a "less intense" intervention than recommended. The agreement between the third-party assessment (clinical interview "PO-Bado") and the treatment recommendation was about 72.8%. The agreement between the third-party assessment and the patient's choice (ePOS-react) was about 58.3%. The latter is smaller because 29.1% asked for a brief psycho-oncological contact for whom, from the third-party assessment's perspective, no indication for treatment existed. Discussion: A direct response of the

  15. (CICT) Computing, Information, and Communications Technology Overview

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.

    2003-01-01

    The goal of the Computing, Information, and Communications Technology (CICT) program is to enable NASA's Scientific Research, Space Exploration, and Aerospace Technology Missions with greater mission assurance, for less cost, with increased science return through the development and use of advanced computing, information and communications technologies. This viewgraph presentation includes diagrams of how the political guidance behind CICT is structured. The presentation profiles each part of the NASA Mission in detail, and relates the Mission to the activities of CICT. CICT's Integrated Capability Goal is illustrated, and hypothetical missions which could be enabled by CICT are profiled. CICT technology development is profiled.

  16. (CICT) Computing, Information, and Communications Technology Overview

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.

    2003-01-01

    The goal of the Computing, Information, and Communications Technology (CICT) program is to enable NASA's Scientific Research, Space Exploration, and Aerospace Technology Missions with greater mission assurance, for less cost, with increased science return through the development and use of advanced computing, information and communications technologies. This viewgraph presentation includes diagrams of how the political guidance behind CICT is structured. The presentation profiles each part of the NASA Mission in detail, and relates the Mission to the activities of CICT. CICT's Integrated Capability Goal is illustrated, and hypothetical missions which could be enabled by CICT are profiled. CICT technology development is profiled.

  17. Reversible Data Hiding Based on DNA Computing

    PubMed Central

    Xie, Yingjie

    2017-01-01

    Biocomputing, and DNA computing in particular, has developed rapidly and is widely used in information security. In this paper, a novel algorithm of reversible data hiding based on DNA computing is proposed. Inspired by the algorithm of histogram modification, which is a classical algorithm for reversible data hiding, we combine it with DNA computing to realize this algorithm based on biological technology. Compared with previous results, our experimental results show a significantly improved ER (embedding rate). Furthermore, the PSNR (peak signal-to-noise ratio) of some test images is also improved. Experimental results show that the scheme is suitable for protecting the copyright of the cover image in DNA-based information security. PMID:28280504
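    The histogram-modification scheme that the paper builds on can be sketched independently of the DNA encoding layer. Below is a minimal, hedged implementation of classical histogram-shifting reversible embedding for an 8-bit grayscale image; the function names, the peak/zero-bin selection, and the capacity checks are assumptions of this sketch, not the authors' code, and the DNA-based encoding itself is not reproduced.

```python
import numpy as np

def embed(image, bits):
    """Minimal histogram-shifting reversible embedding (assumes a peak/zero pair with peak < zero)."""
    flat = image.astype(np.int64).ravel().copy()
    hist = np.bincount(flat, minlength=256)
    peak = int(hist.argmax())                           # most frequent gray level
    zero = peak + 1 + int(np.argmin(hist[peak + 1:]))   # an empty bin above the peak
    assert hist[zero] == 0 and hist[peak] >= len(bits), "sketch's capacity assumptions violated"
    flat[(flat > peak) & (flat < zero)] += 1            # shift to free the bin peak+1
    carriers = np.flatnonzero(flat == peak)[:len(bits)]
    flat[carriers] += np.asarray(bits, dtype=np.int64)  # bit 1 -> peak+1, bit 0 -> peak
    return flat.reshape(image.shape), peak, zero

def extract(marked, peak, zero, n_bits):
    """Recover the hidden bits and restore the cover image exactly."""
    flat = marked.astype(np.int64).ravel().copy()
    carriers = np.flatnonzero((flat == peak) | (flat == peak + 1))[:n_bits]
    bits = (flat[carriers] == peak + 1).astype(int).tolist()
    flat[carriers] = peak                               # undo the embedding
    flat[(flat > peak + 1) & (flat <= zero)] -= 1       # undo the histogram shift
    return bits, flat.reshape(marked.shape)

# Round trip on a synthetic 8-bit cover image.
rng = np.random.default_rng(7)
cover = rng.integers(0, 200, size=(64, 64))
message = [1, 0, 1, 1, 0, 0, 1, 0]
marked, peak, zero = embed(cover, message)
recovered, restored = extract(marked, peak, zero, len(message))
assert recovered == message and np.array_equal(restored, cover)
```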

  18. Reversible Data Hiding Based on DNA Computing.

    PubMed

    Wang, Bin; Xie, Yingjie; Zhou, Shihua; Zhou, Changjun; Zheng, Xuedong

    2017-01-01

    Biocomputing, and DNA computing in particular, has developed rapidly and is widely used in information security. In this paper, a novel algorithm of reversible data hiding based on DNA computing is proposed. Inspired by the algorithm of histogram modification, which is a classical algorithm for reversible data hiding, we combine it with DNA computing to realize this algorithm based on biological technology. Compared with previous results, our experimental results show a significantly improved ER (embedding rate). Furthermore, the PSNR (peak signal-to-noise ratio) of some test images is also improved. Experimental results show that the scheme is suitable for protecting the copyright of the cover image in DNA-based information security.

  19. Network selection, Information filtering and Scalable computation

    NASA Astrophysics Data System (ADS)

    Ye, Changqing

    This dissertation explores two application scenarios of the sparsity pursuit method on large scale data sets. The first scenario is classification and regression in analyzing high dimensional structured data, where predictors correspond to nodes of a given directed graph. This arises in, for instance, identification of disease genes for Parkinson's disease from a network of candidate genes. In such a situation, the directed graph describes dependencies among the genes, where the directions of edges represent certain causal effects. Key to high-dimensional structured classification and regression is how to utilize dependencies among predictors as specified by directions of the graph. In this dissertation, we develop a novel method that fully takes into account such dependencies formulated through certain nonlinear constraints. We apply the proposed method to two applications, feature selection in large margin binary classification and in linear regression. We implement the proposed method through difference convex programming for the cost function and constraints. Finally, theoretical and numerical analyses suggest that the proposed method achieves the desired objectives. An application to disease gene identification is presented. The second application scenario is personalized information filtering which extracts the information specifically relevant to a user, predicting his/her preference over a large number of items, based on the opinions of users who think alike or its content. This problem is cast into the framework of regression and classification, where we introduce novel partial latent models to integrate additional user-specific and content-specific predictors, for higher predictive accuracy. In particular, we factorize a user-over-item preference matrix into a product of two matrices, each representing a user's preference and an item preference by users. Then we propose a likelihood method to seek a sparsest latent factorization, from a class of over
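    The preference-matrix factorization described above can be illustrated with a plain low-rank model. The sketch below fits R ≈ U Vᵀ on observed entries by gradient descent; it deliberately omits the dissertation's partial latent models, user- and content-specific predictors, and sparsity penalty, and every name and hyperparameter here is an assumption of the sketch.

```python
import numpy as np

def factorize(R, rank=2, steps=2000, lr=0.01, reg=0.1, seed=0):
    """Fit R ~= U @ V.T on the observed entries (R > 0) by full-batch gradient descent."""
    rng = np.random.default_rng(seed)
    n_users, n_items = R.shape
    U = 0.1 * rng.standard_normal((n_users, rank))
    V = 0.1 * rng.standard_normal((n_items, rank))
    observed = R > 0                                # 0 marks an unobserved preference
    for _ in range(steps):
        err = observed * (R - U @ V.T)              # residual on observed entries only
        U += lr * (err @ V - reg * U)               # gradient step with L2 regularisation
        V += lr * (err.T @ U - reg * V)
    return U, V

# Toy user-over-item preference matrix (rows: users, columns: items, 0 = unobserved).
R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)
U, V = factorize(R)
print(np.round(U @ V.T, 1))                         # filled-in preference estimates
```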

  20. Computer-assisted diagnosis of mammographic masses using an information-theoretic image retrieval scheme with BIRADs-based relevance feedback

    NASA Astrophysics Data System (ADS)

    Tourassi, Georgia D.; Floyd, Carey E., Jr.

    2004-05-01

    The purpose of the study was to develop and evaluate a content-based image retrieval (CBIR) approach for computer-assisted diagnosis of masses detected in screening mammograms. The system follows an information theoretic retrieval scheme with a BIRADS-based relevance feedback (RF) algorithm. Initially, a knowledge databank of 365 mammographic regions of interest (ROIs) was created. They were all 512x512 pixel ROIs extracted from DDSM mammograms digitized using the Lumisys digitizer. The ROIs were extracted around the known locations of the annotated masses. Specifically, there were 177 ROIs depicting a biopsy-proven malignant mass and 188 ROIs with a benign mass. Subsequently, the CBIR algorithm was implemented using mutual information (MI) as the similarity metric for image retrieval. The CBIR algorithm formed the basis of a knowledge-based CAD system. Given a databank of mammographic masses with known pathology, a query mass was evaluated. Based on their information content, all similar masses in the databank were retrieved. A relevance feedback algorithm based on BIRADS findings was implemented to determine the relevance factor of the retrieved masses. Finally, a decision index was calculated using the query's k best matches. The decision index effectively combined the similarity metric of the retrieved cases and their relevance factor into a prediction regarding the malignancy status of the mass depicted in the query ROI. ROC analysis was used to evaluate diagnostic performance. Performance improved dramatically with the incorporation of the relevance feedback algorithm. Overall, the CAD system achieved ROC area index AZ = 0.86+/-0.02 for the diagnosis of masses in screening mammograms.
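    The decision stage described above combines retrieval similarity with a BIRADS-based relevance factor over the k best matches. The snippet below is one hedged way such a combination could look; the function name, the weighting by similarity times relevance, and the signed benign/malignant labels are illustrative assumptions, not the authors' published formula.

```python
import numpy as np

def decision_index(similarities, labels, relevance, k=10):
    """Combine the k most similar retrieved cases into a signed malignancy score.

    similarities : similarity of each database case to the query (higher = closer)
    labels       : +1 for malignant, -1 for benign database cases
    relevance    : per-case relevance factor in [0, 1] (e.g. from BIRADS agreement)
    """
    similarities = np.asarray(similarities, dtype=float)
    labels = np.asarray(labels, dtype=float)
    relevance = np.asarray(relevance, dtype=float)
    best = np.argsort(similarities)[::-1][:k]        # indices of the k best matches
    weights = similarities[best] * relevance[best]
    return float(np.sum(weights * labels[best]) / np.sum(weights))

# Toy example with six retrieved cases; a positive index leans toward malignancy.
sims = [0.90, 0.85, 0.70, 0.40, 0.30, 0.20]
labels = [+1, +1, -1, -1, +1, -1]
relevance = [1.0, 0.8, 0.9, 0.5, 0.4, 0.2]
print(round(decision_index(sims, labels, relevance, k=3), 2))
```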

  1. Computers, Health Care, and Medical Information Science.

    ERIC Educational Resources Information Center

    Lincoln, Thomas L.; Korpman, Ralph A.

    1980-01-01

    Discusses the new discipline of medical information science (MIS) and examines some problem-solving approaches used in its application in the clinical laboratory, emphasizing automation by computer technology. The health care field is viewed as one having overlapping domains of clinical medicine, health management and statistics, and fundamental…

  2. Informing Mechanistic Toxicology with Computational Molecular Models

    EPA Science Inventory

    Computational molecular models of chemicals interacting with biomolecular targets provide toxicologists with a valuable, affordable, and sustainable source of in silico molecular-level information that augments, enriches, and complements in vitro and in vivo effo...

  3. Informing Mechanistic Toxicology with Computational Molecular Models

    EPA Science Inventory

    Computational molecular models of chemicals interacting with biomolecular targets provide toxicologists with a valuable, affordable, and sustainable source of in silico molecular-level information that augments, enriches, and complements in vitro and in vivo effo...

  4. Information Security: Computer Hacker Information Available on the Internet

    DTIC Science & Technology

    1996-06-05

    2600, the Internet news group for readers of 2600 Magazine. On alt.2600 we discuss telephone (phreaking), computer (hacking), and related topics...collected a variety of documents from various phreaking, cracking, and hacking sources. These publications include information on hacker conferences and

  5. Neural computation by concentrating information in time.

    PubMed Central

    Tank, D W; Hopfield, J J

    1987-01-01

    An analog model neural network that can solve a general problem of recognizing patterns in a time-dependent signal is presented. The networks use a patterned set of delays to collectively focus stimulus sequence information to a neural state at a future time. The computational capabilities of the circuit are demonstrated on tasks somewhat similar to those necessary for the recognition of words in a continuous stream of speech. The network architecture can be understood from consideration of an energy function that is being minimized as the circuit computes. Neurobiological mechanisms are known for the generation of appropriate delays. PMID:3470765

  6. Information theoretic approaches to multidimensional neural computations

    NASA Astrophysics Data System (ADS)

    Fitzgerald, Jeffrey D.

    Many systems in nature process information by transforming inputs from their environments into observable output states. These systems are often difficult to study because they are performing computations on multidimensional inputs with many degrees of freedom using highly nonlinear functions. The work presented in this dissertation deals with some of the issues involved with characterizing real-world input/output systems and understanding the properties of idealized systems using information theoretic methods. Using the principle of maximum entropy, a family of models are created that are consistent with certain measurable correlations from an input/output dataset but are maximally unbiased in all other respects, thereby eliminating all unjustified assumptions about the computation. In certain cases, including spiking neurons, we show that these models also minimize the mutual information. This property gives one the advantage of being able to identify the relevant input/output statistics by calculating their information content. We argue that these maximum entropy models provide a much needed quantitative framework for characterizing and understanding sensory processing neurons that are selective for multiple stimulus features. To demonstrate their usefulness, these ideas are applied to neural recordings from macaque retina and thalamus. These neurons, which primarily respond to two stimulus features, are shown to be well described using only first and second order statistics, indicating that their firing rates encode information about stimulus correlations. In addition to modeling multi-feature computations in the relevant feature space, we also show that maximum entropy models are capable of discovering the relevant feature space themselves. This technique overcomes the disadvantages of two commonly used dimensionality reduction methods and is explored using several simulated neurons, as well as retinal and thalamic recordings. Finally, we ask how neurons in a

  7. Computer Based Education.

    ERIC Educational Resources Information Center

    Fauley, Franz E.

    1980-01-01

    A case study of what one company did to increase the productivity of its sales force and generate cost savings by using computer-assisted instruction to teach salespeople at regional offices. (Editor)

  8. Multimedia, visual computing, and the information superhighway

    NASA Astrophysics Data System (ADS)

    Kitson, Frederick L.

    1996-04-01

    The data types of graphics, images, audio, and video, collectively referred to as multimedia, are becoming standard components of most computer interfaces and applications. Medical imaging in particular will be able to exploit these capabilities in concert with the database engines or 'information furnaces' that will exist as part of the information superhighway. The ability to connect experts with patients electronically enables care delivery from remote diagnostics to remote surgery. Traditional visual computing tasks such as MRI, volume rendering, computer vision or image processing may also be available to more clinics and researchers as they become 'electronically local.' Video is the component of multimedia that provides the greatest sense of presence or visual realism yet has been the most difficult to offer digitally due to its high transmission, storage and computation requirements. Advanced 3D graphics have also been a scarce or at least expensive resource. This paper addresses some of the recent innovations in media processing and client/server technology that will facilitate PCs, workstations or even set-top/TV boxes to process both video and graphics in real-time.

  9. Introduction to Quantum Information/Computing

    DTIC Science & Technology

    2005-06-01

    (mωX + iP)/sqrt(2mħω). BCS Theory – Named for John Bardeen, Leon Cooper, and Robert Schrieffer. According to theory, the... Theory and Reliable Communication, John Wiley & Sons, 1998. 2. M.A. Nielsen, I. L. Chuang, Quantum Computation and Quantum Information, Cambridge... France and by John Wiley & Sons. 6. H. Goldstein, Classical Mechanics, 1950, Addison-Wesley Publishing Company, Inc. 7. L.S. Brown and G

  10. A randomized trial of computer-based communications using imagery and text information to alter representations of heart disease risk and motivate protective behaviour.

    PubMed

    Lee, Tarryn J; Cameron, Linda D; Wünsche, Burkhard; Stevens, Carey

    2011-02-01

    Advances in web-based animation technologies provide new opportunities to develop graphic health communications for dissemination throughout communities. We developed imagery and text contents of brief, computer-based programmes about heart disease risk, with both imagery and text contents guided by the common-sense model (CSM) of self-regulation. The imagery depicts a three-dimensional, beating heart tailored to user-specific information. A 2 × 2 × 4 factorial design was used to manipulate concrete imagery (imagery vs. no imagery) and conceptual information (text vs. no text) about heart disease risk in prevention-oriented programmes and assess changes in representations and behavioural motivations from baseline to 2 days, 2 weeks, and 4 weeks post-intervention. Sedentary young adults (N= 80) were randomized to view one of four programmes: imagery plus text, imagery only, text only, or control. Participants completed measures of risk representations, worry, and physical activity and healthy diet intentions and behaviours at baseline, 2 days post-intervention (except behaviours), and 2 weeks (intentions and behaviours only) and 4 weeks later. The imagery contents increased representational beliefs and mental imagery relating to heart disease, worry, and intentions at post-intervention. Increases in sense of coherence (understanding of heart disease) and worry were sustained after 1 month. The imagery contents also increased healthy diet efforts after 2 weeks. The text contents increased beliefs about causal factors, mental images of clogged arteries, and worry at post-intervention, and increased physical activity 2 weeks later and sense of coherence 1 month later. The CSM-based programmes induced short-term changes in risk representations and behaviour motivation. The combination of CSM-based text and imagery appears to be most effective in instilling risk representations that motivate protective behaviour. ©2010 The British Psychological Society.

  11. Computer Aided Management for Information Processing Projects.

    ERIC Educational Resources Information Center

    Akman, Ibrahim; Kocamustafaogullari, Kemal

    1995-01-01

    Outlines the nature of information processing projects and discusses some project management programming packages. Describes an in-house interface program developed to utilize a selected project management package (TIMELINE) by using Oracle Data Base Management System tools and Pascal programming language for the management of information system…

  12. Computer Aided Management for Information Processing Projects.

    ERIC Educational Resources Information Center

    Akman, Ibrahim; Kocamustafaogullari, Kemal

    1995-01-01

    Outlines the nature of information processing projects and discusses some project management programming packages. Describes an in-house interface program developed to utilize a selected project management package (TIMELINE) by using Oracle Data Base Management System tools and Pascal programming language for the management of information system…

  13. Computer-based communication in support of scientific and technical work. [conferences on management information systems used by scientists of NASA programs

    NASA Technical Reports Server (NTRS)

    Vallee, J.; Wilson, T.

    1976-01-01

    Results are reported of the first experiments for a computer conference management information system at the National Aeronautics and Space Administration. Between August 1975 and March 1976, two NASA projects with geographically separated participants (NASA scientists) used the PLANET computer conferencing system for portions of their work. The first project was a technology assessment of future transportation systems. The second project involved experiments with the Communication Technology Satellite. As part of this project, pre- and postlaunch operations were discussed in a computer conference. These conferences also provided the context for an analysis of the cost of computer conferencing. In particular, six cost components were identified: (1) terminal equipment, (2) communication with a network port, (3) network connection, (4) computer utilization, (5) data storage and (6) administrative overhead.

  14. The computer-based lecture.

    PubMed

    Wofford, M M; Spickard, A W; Wofford, J L

    2001-07-01

    Advancing computer technology, cost-containment pressures, and desire to make innovative improvements in medical education argue for moving learning resources to the computer. A reasonable target for such a strategy is the traditional clinical lecture. The purpose of the lecture, the advantages and disadvantages of "live" versus computer-based lectures, and the technical options in computerizing the lecture deserve attention in developing a cost-effective, complementary learning strategy that preserves the teacher-learner relationship. Based on a literature review of the traditional clinical lecture, we build on the strengths of the lecture format and discuss strategies for converting the lecture to a computer-based learning presentation.

  15. The Lilongwe Central Hospital Patient Management Information System: A Success in Computer-Based Order Entry Where One Might Least Expect It

    PubMed Central

    GP, Douglas; RA, Deula; SE, Connor

    2003-01-01

    Computer-based order entry is a powerful tool for enhancing patient care. A pilot project in the pediatric department of the Lilongwe Central Hospital (LCH) in Malawi, Africa has demonstrated that computer-based order entry (COE): 1) can be successfully deployed and adopted in resource-poor settings, 2) can be built, deployed and sustained at relatively low cost and with local resources, and 3) has a greater potential to improve patient care in developing than in developed countries. PMID:14728338

  16. Computer-Based Hindi Pedagogy.

    ERIC Educational Resources Information Center

    Bhatia, Tej K.

    This paper brings out the structure and salient features of the Computer-Based Hindi Teaching (CBHT) course, which is being developed at the University of Illinois. The following topics are treated specifically: (1) areas of the Hindi Language course that can be efficiently and economically taught with computer-based pedagogy; (2) a demonstration…

  17. Computational inference of neural information flow networks.

    PubMed

    Smith, V Anne; Yu, Jing; Smulders, Tom V; Hartemink, Alexander J; Jarvis, Erich D

    2006-11-24

    Determining how information flows along anatomical brain pathways is a fundamental requirement for understanding how animals perceive their environments, learn, and behave. Attempts to reveal such neural information flow have been made using linear computational methods, but neural interactions are known to be nonlinear. Here, we demonstrate that a dynamic Bayesian network (DBN) inference algorithm we originally developed to infer nonlinear transcriptional regulatory networks from gene expression data collected with microarrays is also successful at inferring nonlinear neural information flow networks from electrophysiology data collected with microelectrode arrays. The inferred networks we recover from the songbird auditory pathway are correctly restricted to a subset of known anatomical paths, are consistent with timing of the system, and reveal both the importance of reciprocal feedback in auditory processing and greater information flow to higher-order auditory areas when birds hear natural as opposed to synthetic sounds. A linear method applied to the same data incorrectly produces networks with information flow to non-neural tissue and over paths known not to exist. To our knowledge, this study represents the first biologically validated demonstration of an algorithm to successfully infer neural information flow networks.

  18. Scalable quantum information processing and the optical topological quantum computer

    NASA Astrophysics Data System (ADS)

    Devitt, S.

    2010-02-01

    Optical quantum computation has represented one of the most successful testbed systems for quantum information processing. Along with ion-traps and nuclear magnetic resonance (NMR), experimentalists have demonstrated control of qubits, multi-qubit gates and small quantum algorithms. However, photonic based qubits suffer from a problematic lack of a large scale architecture for fault-tolerant computation which could conceivably be built in the near future. While optical systems are, in some regards, ideal for quantum computing due to their high mobility and low susceptibility to environmental decoherence, these same properties make the construction of compact, chip based architectures difficult. Here we discuss many of the important issues related to scalable fault-tolerant quantum computation and introduce a feasible architecture design for an optics-based computer. We combine the recent development of topological cluster state computation with the photonic module, simple chip based devices which can be utilized to deterministically entangle photons. The integration of this operational unit with one of the most exciting computational models solves many of the existing problems with other optics-based architectures and leads to a feasible large scale design which can continuously generate a 3D cluster state with a photonic module resource cost linear in the cross sectional size of the cluster.

  19. Agent Based Computing Machine

    DTIC Science & Technology

    2005-12-09

    If <XY> is high, then A OR B is considered to be true. Once again, recall that A and B (and X and Y) are tokens or strings and not algebraic ...variables. There are no algebraic variables in instructions. The data and programs are inseparable as in LISP programming in the conventional computing...shows the promise of performing superior to traditional connectionist architectures for certain classes of problems that can take advantage of

  20. SPECIAL-PURPOSE DIGITAL COMPUTER FOR INFORMATION RETRIEVAL,

    DTIC Science & Technology

    (*DIGITAL COMPUTERS, *INFORMATION RETRIEVAL), COMPUTERS, DATA STORAGE SYSTEMS, CORE STORAGE, MAGNETIC CORES, PUNCHED TAPE, MAGNETIC TAPE, SEARCH THEORY, INDEXES, VOCABULARY, CODING, RELIABILITY, CIRCUITS, USSR

  1. Holographic computations of the quantum information metric

    NASA Astrophysics Data System (ADS)

    Trivella, Andrea

    2017-05-01

    In this paper we show how the quantum information metric can be computed holographically using a perturbative approach. In particular, when the deformation of the conformal field theory state is induced by a scalar operator, the corresponding bulk configuration reduces to a scalar field perturbatively probing the background. We study two concrete examples: a CFT ground state deformed by a primary operator and a thermofield double state in d = 2 deformed by a marginal operator. Finally, we generalize the bulk construction to the case of a multi-dimensional parameter space and show that the quantum information metric coincides with the metric of the non-linear sigma model for the corresponding scalar fields.
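
    For orientation, the object being computed holographically is the standard quantum information (Fubini-Study) metric on a family of pure states |Ψ(λ)⟩. The expansion below is the usual field-theory-side definition, written here as a reminder rather than a formula taken from this record; conventions for the factor of 1/2 vary between papers.

        |\langle \Psi(\lambda) \,|\, \Psi(\lambda + \delta\lambda) \rangle|
            = 1 - \tfrac{1}{2}\, G_{ij}\, \delta\lambda^{i}\, \delta\lambda^{j} + O(\delta\lambda^{3}),
        \qquad
        G_{ij} = \operatorname{Re}\,\langle \partial_i \Psi \,|\, \partial_j \Psi \rangle
               - \langle \partial_i \Psi \,|\, \Psi \rangle \langle \Psi \,|\, \partial_j \Psi \rangle .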

  2. CHESS: a computer-based system for providing information, referrals, decision support and social support to people facing medical and other health-related crises.

    PubMed

    Gustafson, D H; Bosworth, K; Hawkins, R P; Boberg, E W; Bricker, E

    1992-01-01

    CHESS (the Comprehensive Health Enhancement Support System) is an interactive, computer-based system to support people facing health-related crises or concerns. CHESS provides information, referral to service providers, support in making tough decisions and networking to experts and others facing the same concerns. CHESS will improve access to health and human services for people who would otherwise face psychological, social, economic or geographic barriers to receiving services. CHESS has developed programs in five specific topic areas: Academic Crisis, Adult Children of Alcoholics, AIDS/HIV Infection, Breast Cancer and Sexual Assault. The lessons learned, and the structures developed, will serve as a model for future implementation of CHESS programs in a broad range of other topic areas. CHESS is designed around three major desired outcomes: 1) improving the emotional health status of users; 2) increasing the cost-effective use of health and human services; and 3) reducing the incidence of risk-taking behaviors that can lead to injury or illness. Pilot-testing and initial analysis of controlled evaluation data have shown that CHESS is extensively used, is useful and easy to use, and produces positive emotional outcomes. Further evaluation is continuing.

  3. Computing and information sciences preliminary engineering design study

    SciTech Connect

    Schroeder, J O; Pearson, E W; Thomas, J J; Brothers, J W; Campbell, W K; DeVaney, D M; Jones, D R; Littlefield, R J; Peterson, M J

    1991-04-01

    This document presents the preliminary design concept for the integrated computing and information system to be included in the Environmental and Molecular Sciences Laboratory (EMSL) at the Pacific Northwest Laboratory, Richland, Washington, for the US Department of Energy (DOE). The EMSL is scheduled for completion and occupancy in 1994 or 1995 and will support the DOE environmental mission, in particular hazardous waste remediation. The focus of the report is on the Computing and Information Sciences engineering task of providing a fully integrated state-of-the-art computing environment for simulation, experimentation and analysis in support of molecular research. The EMSL will house two major research organizations, the Molecular Sciences Research Center (MSRC) and part of the Environmental Sciences Research Center (ESRC). Included in the report is a preliminary description of the computing and information system to be included. The proposed system architecture is based on a preliminary understanding of the EMSL users' needs for computational resources. As users understand more about the scientific challenges they face, the definition of the functional requirements will change. At the same time, the engineering team will be gaining experience with new computing technologies. Accordingly, the design architecture must evolve to reflect this new understanding of functional requirements and enabling technologies. 3 figs., 2 tabs.

  4. FPGA Based High Performance Computing

    SciTech Connect

    Bennett, Dave; Mason, Jeff; Sundararajan, Prasanna; Dellinger, Erik; Putnam, Andrew; Storaasli, Olaf O

    2008-01-01

    Current high performance computing (HPC) applications are found in many consumer, industrial and research fields. From web searches to auto crash simulations to weather predictions, these applications require large amounts of power by the compute farms and supercomputers required to run them. The demand for more and faster computation continues to increase along with an even sharper increase in the cost of the power required to operate and cool these installations. The ability of standard processor based systems to address these needs has declined in both speed of computation and in power consumption over the past few years. This paper presents a new method of computation based upon programmable logic as represented by Field Programmable Gate Arrays (FPGAs) that addresses these needs in a manner requiring only minimal changes to the current software design environment.

  5. Computer Supported Cooperative Work in Information Search and Retrieval.

    ERIC Educational Resources Information Center

    Twidale, Michael B.; Nichols, David M.

    1998-01-01

    Considers how research in collaborative technologies can inform research and development in library and information science. Topics include computer supported collaborative work; shared drawing; collaborative writing; MUDs; MOOs; workflow; World Wide Web; collaborative learning; computer mediated communication; ethnography; evaluation; remote…

  6. Internet Use for Health-Related Information via Personal Computers and Cell Phones in Japan: A Cross-Sectional Population-Based Survey

    PubMed Central

    Takahashi, Yoshimitsu; Ohura, Tomoko; Ishizaki, Tatsuro; Okamoto, Shigeru; Miki, Kenji; Naito, Mariko; Akamatsu, Rie; Sugimori, Hiroki; Yoshiike, Nobuo; Miyaki, Koichi; Shimbo, Takuro

    2011-01-01

    Background The Internet is known to be used for health purposes by the general public all over the world. However, little is known about the use of, attitudes toward, and activities regarding eHealth among the Japanese population. Objectives This study aimed to measure the prevalence of Internet use for health-related information compared with other sources, and to examine the effects on user knowledge, attitudes, and activities with regard to Internet use for health-related information in Japan. We examined the extent of use via personal computers and cell phones. Methods We conducted a cross-sectional survey of a quasi-representative sample (N = 1200) of the Japanese general population aged 15–79 years in September 2007. The main outcome measures were (1) self-reported rates of Internet use in the past year to acquire health-related information and to contact health professionals, family, friends, and peers specifically for health-related purposes, and (2) perceived effects of Internet use on health care. Results The prevalence of Internet use via personal computer for acquiring health-related information was 23.8% (286/1200) among those surveyed, whereas the prevalence via cell phone was 6% (77). Internet use via both personal computer and cell phone for communicating with health professionals, family, friends, or peers was not common. The Internet was used via personal computer for acquiring health-related information primarily by younger people, people with higher education levels, and people with higher household incomes. The majority of those who used the Internet for health care purposes responded that the Internet improved their knowledge or affected their lifestyle attitude, and that they felt confident in the health-related information they obtained from the Internet. However, less than one-quarter thought it improved their ability to manage their health or affected their health-related activities. Conclusions Japanese moderately used the Internet via

  7. Internet use for health-related information via personal computers and cell phones in Japan: a cross-sectional population-based survey.

    PubMed

    Takahashi, Yoshimitsu; Ohura, Tomoko; Ishizaki, Tatsuro; Okamoto, Shigeru; Miki, Kenji; Naito, Mariko; Akamatsu, Rie; Sugimori, Hiroki; Yoshiike, Nobuo; Miyaki, Koichi; Shimbo, Takuro; Nakayama, Takeo

    2011-12-14

    The Internet is known to be used for health purposes by the general public all over the world. However, little is known about the use of, attitudes toward, and activities regarding eHealth among the Japanese population. This study aimed to measure the prevalence of Internet use for health-related information compared with other sources, and to examine the effects on user knowledge, attitudes, and activities with regard to Internet use for health-related information in Japan. We examined the extent of use via personal computers and cell phones. We conducted a cross-sectional survey of a quasi-representative sample (N = 1200) of the Japanese general population aged 15-79 years in September 2007. The main outcome measures were (1) self-reported rates of Internet use in the past year to acquire health-related information and to contact health professionals, family, friends, and peers specifically for health-related purposes, and (2) perceived effects of Internet use on health care. The prevalence of Internet use via personal computer for acquiring health-related information was 23.8% (286/1200) among those surveyed, whereas the prevalence via cell phone was 6% (77). Internet use via both personal computer and cell phone for communicating with health professionals, family, friends, or peers was not common. The Internet was used via personal computer for acquiring health-related information primarily by younger people, people with higher education levels, and people with higher household incomes. The majority of those who used the Internet for health care purposes responded that the Internet improved their knowledge or affected their lifestyle attitude, and that they felt confident in the health-related information they obtained from the Internet. However, less than one-quarter thought it improved their ability to manage their health or affected their health-related activities. Japanese moderately used the Internet via personal computers for health purposes, and rarely

  8. PREFACE: Quantum Information, Communication, Computation and Cryptography

    NASA Astrophysics Data System (ADS)

    Benatti, F.; Fannes, M.; Floreanini, R.; Petritis, D.

    2007-07-01

    The application of quantum mechanics to information related fields such as communication, computation and cryptography is a fast growing line of research that has been witnessing an outburst of theoretical and experimental results, with possible practical applications. On the one hand, quantum cryptography with its impact on secrecy of transmission is having its first important actual implementations; on the other hand, the recent advances in quantum optics, ion trapping, BEC manipulation, spin and quantum dot technologies allow us to put to direct test a great deal of theoretical ideas and results. These achievements have stimulated a reborn interest in various aspects of quantum mechanics, creating a unique interplay between physics, both theoretical and experimental, mathematics, information theory and computer science. In view of all these developments, it appeared timely to organize a meeting where graduate students and young researchers could be exposed to the fundamentals of the theory, while senior experts could exchange their latest results. The activity was structured as a school followed by a workshop, and took place at The Abdus Salam International Center for Theoretical Physics (ICTP) and The International School for Advanced Studies (SISSA) in Trieste, Italy, from 12-23 June 2006. The meeting was part of the activity of the Joint European Master Curriculum Development Programme in Quantum Information, Communication, Cryptography and Computation, involving the Universities of Cergy-Pontoise (France), Chania (Greece), Leuven (Belgium), Rennes1 (France) and Trieste (Italy). This special issue of Journal of Physics A: Mathematical and Theoretical collects 22 contributions from well known experts who took part in the workshop. They summarize the present day status of the research in the manifold aspects of quantum information. The issue is opened by two review articles, the first by G Adesso and F Illuminati discussing entanglement in continuous variable

  9. Paper-Based and Computer-Based Concept Mappings: The Effects on Computer Achievement, Computer Anxiety and Computer Attitude

    ERIC Educational Resources Information Center

    Erdogan, Yavuz

    2009-01-01

    The purpose of this paper is to compare the effects of paper-based and computer-based concept mappings on the computer hardware achievement, computer anxiety and computer attitude of eighth-grade secondary school students. The students were randomly allocated to three groups and were given instruction on computer hardware. The teaching methods used…

  10. Paper-Based and Computer-Based Concept Mappings: The Effects on Computer Achievement, Computer Anxiety and Computer Attitude

    ERIC Educational Resources Information Center

    Erdogan, Yavuz

    2009-01-01

    The purpose of this paper is to compare the effects of paper-based and computer-based concept mappings on the computer hardware achievement, computer anxiety and computer attitude of eighth-grade secondary school students. The students were randomly allocated to three groups and were given instruction on computer hardware. The teaching methods used…

  11. The Computer-based Lecture

    PubMed Central

    Wofford, Marcia M; Spickard, Anderson W; Wofford, James L

    2001-01-01

    Advancing computer technology, cost-containment pressures, and desire to make innovative improvements in medical education argue for moving learning resources to the computer. A reasonable target for such a strategy is the traditional clinical lecture. The purpose of the lecture, the advantages and disadvantages of “live” versus computer-based lectures, and the technical options in computerizing the lecture deserve attention in developing a cost-effective, complementary learning strategy that preserves the teacher-learner relationship. Based on a literature review of the traditional clinical lecture, we build on the strengths of the lecture format and discuss strategies for converting the lecture to a computer-based learning presentation. PMID:11520384

  12. 5 CFR 838.805 - OPM computation of formulas in computing the designated base.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false OPM computation of formulas in computing... computing the designated base. (a) A court order awarding a former spouse survivor annuity is not a court...) To provide sufficient instructions and information for OPM to compute the amount of a former...

  13. Roots and promises of chemical-based computing.

    PubMed

    Rambidi, N

    2002-01-01

    General principles of information processing by chemical-based biomolecular systems (the pseudobiological information processing paradigm) are discussed. These principles include very large scale parallelism of information processing, high behavioral complexity, complementarity of information features, self-organization, and multilevel architecture. Chemical-based information processing devices using these principles appear able to effectively solve problems of high computational complexity.

  14. Health Information System in a Cloud Computing Context.

    PubMed

    Sadoughi, Farahnaz; Erfannia, Leila

    2017-01-01

    Healthcare as a worldwide industry is experiencing a period of growth based on health information technology. The capabilities of cloud systems make them an option for pursuing eHealth goals. The main objective of the present study, conducted as a systematic review in 2016, was to evaluate the advantages and limitations of health information system implementation in a cloud-computing context. ScienceDirect, Scopus, Web of Science, IEEE, PubMed and Google Scholar were searched according to the study criteria. Among 308 articles initially found, 21 articles were included in the final analysis. All the studies had considered cloud computing a positive tool for advancing health technology, but none had dwelt on its limitations and threats. Electronic health record systems have been mostly studied in the fields of implementation, design, and presentation of models and prototypes. According to this research, the main advantages of cloud-based health information systems could be categorized into the following groups: economic benefits and advantages of information management. The main limitations of the implementation of cloud-based health information systems could be categorized into the 4 groups of security, legal, technical, and human restrictions. Compared to earlier studies, the present research had the advantage of dealing with the issue of health information systems in a cloud platform. The high frequency of studies conducted on the implementation of cloud-based health information systems revealed health industry interest in the application of this technology. Security was a subject discussed in most studies due to health information sensitivity. In this investigation, some mechanisms and solutions were discussed concerning the mentioned systems, which would provide a suitable area for future scientific research on this issue. The limitations and solutions discussed in this systematic study would help healthcare managers and decision

  15. QPSO-based adaptive DNA computing algorithm.

    PubMed

    Karakose, Mehmet; Cigdem, Ugur

    2013-01-01

    DNA (deoxyribonucleic acid) computing, a new computation model that uses DNA molecules for information storage, has been increasingly used for optimization and data analysis in recent years. However, the DNA computing algorithm has some limitations in terms of convergence speed, adaptability, and effectiveness. In this paper, a new approach for improving DNA computing is proposed. This approach aims to run the DNA computing algorithm with adaptive parameters, tuned toward the desired goal using quantum-behaved particle swarm optimization (QPSO). The contributions of the proposed QPSO-based adaptive DNA computing algorithm are as follows: (1) the population size, crossover rate, maximum number of operations, enzyme and virus mutation rates, and fitness function of the DNA computing algorithm are tuned simultaneously for the adaptive process; (2) the adaptive algorithm is performed using QPSO for goal-driven progress, faster operation, and flexibility in data; and (3) a numerical realization of the DNA computing algorithm with the proposed approach is implemented for system identification. Two experiments with different systems were carried out to evaluate the performance of the proposed approach, with comparative results. Experimental results obtained with Matlab and FPGA demonstrate its ability to provide effective optimization, considerable convergence speed, and high accuracy relative to the standard DNA computing algorithm.
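
    To make the QPSO component concrete, here is a minimal quantum-behaved PSO sketch in Python. It is a generic QPSO optimizer under stated assumptions (population size, bounds, a linearly decreasing contraction-expansion factor, and a toy surrogate fitness standing in for the DNA-algorithm parameter tuning), not the authors' implementation.

        import numpy as np

        def qpso(fitness, dim, n_particles=20, iters=200, lb=-5.0, ub=5.0, seed=0):
            # Quantum-behaved PSO: no velocities; each particle is resampled around a
            # local attractor using the mean of personal bests and a contraction factor.
            rng = np.random.default_rng(seed)
            x = rng.uniform(lb, ub, size=(n_particles, dim))
            pbest = x.copy()
            pbest_val = np.array([fitness(p) for p in pbest])
            gbest = pbest[np.argmin(pbest_val)].copy()
            for t in range(iters):
                beta = 1.0 - 0.5 * t / iters          # contraction-expansion factor
                mbest = pbest.mean(axis=0)            # mean of all personal bests
                for i in range(n_particles):
                    phi = rng.random(dim)
                    p = phi * pbest[i] + (1.0 - phi) * gbest   # local attractor
                    u = rng.random(dim)
                    sign = np.where(rng.random(dim) < 0.5, 1.0, -1.0)
                    x[i] = np.clip(p + sign * beta * np.abs(mbest - x[i]) * np.log(1.0 / u), lb, ub)
                    val = fitness(x[i])
                    if val < pbest_val[i]:
                        pbest[i], pbest_val[i] = x[i].copy(), val
                gbest = pbest[np.argmin(pbest_val)].copy()
            return gbest, pbest_val.min()

        # toy use: tune two hypothetical algorithm parameters by minimizing a surrogate error
        best, err = qpso(lambda v: float(np.sum((v - np.array([0.7, 1.5])) ** 2)), dim=2)
        print(best, err)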

  16. Understanding Computer-Based Digital Video.

    ERIC Educational Resources Information Center

    Martindale, Trey

    2002-01-01

    Discussion of new educational media and technology focuses on producing and delivering computer-based digital video. Highlights include video standards, including international standards and aspect ratio; camera formats and features, including costs; shooting digital video; editing software; compression; and a list of informative Web sites. (LRW)

  17. For operation of the Computer Software Management and Information Center (COSMIC)

    NASA Technical Reports Server (NTRS)

    Carmon, J. L.

    1983-01-01

    Computer programs for relational information management data base systems, spherical roller bearing analysis, a generalized pseudoinverse of a rectangular matrix, and software design and documentation language are summarized.

  18. Radiological Worker Computer Based Training

    SciTech Connect

    Butala, Stephen W.; Cullen, James J.; Corsolini, Jams; Zach, Karen; Przyzycki, Edward

    2003-02-06

    Argonne National Laboratory has developed an interactive computer based training (CBT) version of the standardized DOE Radiological Worker training program. This CD-ROM based program utilizes graphics, animation, photographs, sound and video to train users in ten topical areas: radiological fundamentals, biological effects, dose limits, ALARA, personnel monitoring, controls and postings, emergency response, contamination controls, high radiation areas, and lessons learned.

  19. Information-based clustering

    PubMed Central

    Slonim, Noam; Atwal, Gurinder Singh; Tkačik, Gašper; Bialek, William

    2005-01-01

    In an age of increasingly large data sets, investigators in many different disciplines have turned to clustering as a tool for data analysis and exploration. Existing clustering methods, however, typically depend on several nontrivial assumptions about the structure of data. Here, we reformulate the clustering problem from an information theoretic perspective that avoids many of these assumptions. In particular, our formulation obviates the need for defining a cluster “prototype,” does not require an a priori similarity metric, is invariant to changes in the representation of the data, and naturally captures nonlinear relations. We apply this approach to different domains and find that it consistently produces clusters that are more coherent than those extracted by existing algorithms. Finally, our approach provides a way of clustering based on collective notions of similarity rather than the traditional pairwise measures. PMID:16352721
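
    As a rough illustration of the trade-off described (high average within-cluster similarity versus the information the cluster label carries about item identity), the sketch below evaluates such an objective for a given soft assignment. The exact functional form and optimization procedure used in the paper may differ, and the temperature-style trade-off parameter T is an assumption.

        import numpy as np

        def info_clustering_objective(s, q, T=1.0, eps=1e-12):
            # s: (N, N) pairwise similarity matrix; q: (N, C) soft assignments P(c|i),
            # rows summing to 1; items are taken as uniformly distributed, P(i) = 1/N.
            N, C = q.shape
            pc = q.mean(axis=0)                          # P(c)
            p_i_given_c = (q / N) / pc                   # P(i|c), columns sum to 1
            # average within-cluster similarity: sum_c P(c) sum_{i,j} P(i|c) P(j|c) s_ij
            avg_sim = sum(pc[c] * p_i_given_c[:, c] @ s @ p_i_given_c[:, c] for c in range(C))
            # I(C; i): how much the cluster label says about item identity (bits)
            mi = np.sum((q / N) * np.log2((q + eps) / (pc + eps)))
            return avg_sim - T * mi

        # toy check: two well-separated blocks scored with a hard 2-cluster assignment
        s = np.block([[np.ones((3, 3)), np.zeros((3, 3))],
                      [np.zeros((3, 3)), np.ones((3, 3))]])
        q = np.repeat(np.eye(2), 3, axis=0)
        print(info_clustering_objective(s, q))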

  20. A simplified computational memory model from information processing

    PubMed Central

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-01-01

    This paper proposes a computational model of memory from an information processing perspective. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network obtained by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent neurons or brain cortices on the basis of biology and graph theory; an intra-modular network is then developed with the modeling algorithm by mapping nodes and edges, after which the bi-modular network is delineated with intra-modular and inter-modular links. Finally, a polynomial retrieval algorithm is introduced. We simulate the memory phenomena and the functions of memorization and strengthening with information processing algorithms. The theoretical analysis and the simulation results show that the model is in accordance with memory phenomena from an information processing point of view. PMID:27876847

  1. A simplified computational memory model from information processing

    NASA Astrophysics Data System (ADS)

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-11-01

    This paper proposes a computational model of memory from an information processing perspective. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network obtained by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent neurons or brain cortices on the basis of biology and graph theory; an intra-modular network is then developed with the modeling algorithm by mapping nodes and edges, after which the bi-modular network is delineated with intra-modular and inter-modular links. Finally, a polynomial retrieval algorithm is introduced. We simulate the memory phenomena and the functions of memorization and strengthening with information processing algorithms. The theoretical analysis and the simulation results show that the model is in accordance with memory phenomena from an information processing point of view.

  2. Gauging Information and Computer Skills for Curriculum Planning

    ERIC Educational Resources Information Center

    Krueger, Janice M.; Ha, YooJin

    2012-01-01

    Background: All types of librarians are expected to possess information and computer skills to actively assist patrons in accessing information and in recognizing reputable sources. Mastery of information and computer skills is a high priority for library and information science programs since graduate students have varied multidisciplinary…

  3. CICT Computing, Information, and Communications Technology Program

    NASA Technical Reports Server (NTRS)

    Laufenberg, Lawrence; Tu, Eugene (Technical Monitor)

    2002-01-01

    The CICT Program is part of the NASA Aerospace Technology Enterprise's fundamental technology thrust to develop tools, processes, and technologies that enable new aerospace system capabilities and missions. The CICT Program's four key objectives are: provide seamless access to NASA resources (including ground-, air-, and space-based distributed information technology resources) so that NASA scientists and engineers can more easily control missions, make new scientific discoveries, and design the next-generation space vehicles; provide high-data delivery from these assets directly to users for missions; develop goal-oriented, human-centered systems; and research, develop and evaluate revolutionary technology.

  4. CICT Computing, Information, and Communications Technology Program

    NASA Technical Reports Server (NTRS)

    Laufenberg, Lawrence; Tu, Eugene (Technical Monitor)

    2002-01-01

    The CICT Program is part of the NASA Aerospace Technology Enterprise's fundamental technology thrust to develop tools, processes, and technologies that enable new aerospace system capabilities and missions. The CICT Program's four key objectives are: provide seamless access to NASA resources (including ground-, air-, and space-based distributed information technology resources) so that NASA scientists and engineers can more easily control missions, make new scientific discoveries, and design the next-generation space vehicles; provide high-data delivery from these assets directly to users for missions; develop goal-oriented, human-centered systems; and research, develop and evaluate revolutionary technology.

  5. The Computer Subroutine in Information Handling.

    ERIC Educational Resources Information Center

    Riggs, Donald E.

    Generalized computational subroutines can reduce programing repetitions and wasteful computer storage use. The most useful are those that are flexible enough to handle a wide variety of situations. Subroutines may have details open to change in order to blend into the main program. They may be built into the computer library or supplied by the…

  6. Nature computes: information processing in quantum dynamical systems.

    PubMed

    Wiesner, Karoline

    2010-09-01

    Nature intrinsically computes. It has been suggested that the entire universe is a computer, in particular, a quantum computer. To corroborate this idea we require tools to quantify the information processing. Here we review a theoretical framework for quantifying information processing in a quantum dynamical system. So-called intrinsic quantum computation combines tools from dynamical systems theory, information theory, quantum mechanics, and computation theory. We will review how far the framework has been developed and what some of the main open questions are. On the basis of this framework we discuss upper and lower bounds for intrinsic information storage in a quantum dynamical system.
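
    As background for the storage bounds mentioned here, the classical quantities that intrinsic-computation analyses typically generalize are the entropy rate and the excess entropy of the observed process. The definitions below are the standard classical ones, stated for orientation rather than taken from this record:

        h_\mu = \lim_{L \to \infty} \frac{H[X_1, \ldots, X_L]}{L},
        \qquad
        \mathbf{E} = \lim_{L \to \infty} \bigl( H[X_1, \ldots, X_L] - h_\mu L \bigr)
                   = I[\,\text{past}\,;\,\text{future}\,].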

  7. 11 CFR 9003.6 - Production of computer information.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 11 Federal Elections 1 2011-01-01 2011-01-01 false Production of computer information. 9003.6... ELECTION FINANCING ELIGIBILITY FOR PAYMENTS § 9003.6 Production of computer information. (a) Categories of... explaining the computer system's software capabilities, such as user guides, technical manuals, formats...

  8. 11 CFR 9003.6 - Production of computer information.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 11 Federal Elections 1 2012-01-01 2012-01-01 false Production of computer information. 9003.6... ELECTION FINANCING ELIGIBILITY FOR PAYMENTS § 9003.6 Production of computer information. (a) Categories of... explaining the computer system's software capabilities, such as user guides, technical manuals, formats...

  9. 11 CFR 9003.6 - Production of computer information.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 11 Federal Elections 1 2014-01-01 2014-01-01 false Production of computer information. 9003.6... ELECTION FINANCING ELIGIBILITY FOR PAYMENTS § 9003.6 Production of computer information. (a) Categories of... explaining the computer system's software capabilities, such as user guides, technical manuals, formats...

  10. 11 CFR 9003.6 - Production of computer information.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 11 Federal Elections 1 2013-01-01 2012-01-01 true Production of computer information. 9003.6... ELECTION FINANCING ELIGIBILITY FOR PAYMENTS § 9003.6 Production of computer information. (a) Categories of... explaining the computer system's software capabilities, such as user guides, technical manuals, formats...

  11. A system of computer programs (WAT_MOVE) for transferring data among data bases in the US Geological Survey National Water Information System

    SciTech Connect

    Rogers, G.D.; Kerans, B.K.

    1991-11-01

    This report describes WAT_MOVE, a system of computer programs that was developed for moving National Water Information System data between US Geological Survey distributed computer databases. WAT_MOVE has three major sub-systems: one for retrieval, one for loading, and one for purging. The retrieval sub-system creates transaction files of retrieved data for transfer and invokes a file transfer to send the transaction files to the receiving site. The loading sub-system reads the control and transaction files retrieved from the source database and loads the data in the appropriate files. The purging sub-system deletes data from a database. Although WAT_MOVE was developed for use by the Geological Survey's Hydrologic Investigations Program of the Yucca Mountain Project Branch, the software can be beneficial to any office maintaining data in the Site File, ADAPS (Automated Data Processing System), GWSI (Ground-Water Site Inventory), and QW (Quality of Water) sub-systems of the National Water Information System. The software also can be used to move data between databases on a single network node or to modify data within a database.
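
    To make the retrieve/load/purge flow concrete, here is a minimal sketch of moving rows between two databases via an intermediate transaction file. The sqlite tables, column names, and JSON transaction format are illustrative assumptions and do not reflect the actual National Water Information System schemas or the WAT_MOVE programs themselves.

        import json
        import sqlite3

        def retrieve(src, site_ids):
            # retrieval sub-system: dump selected rows into a transaction "file"
            qmarks = ",".join("?" * len(site_ids))
            rows = src.execute("SELECT id, name, value FROM site WHERE id IN (%s)" % qmarks,
                               site_ids).fetchall()
            return json.dumps(rows)

        def load(dst, transaction):
            # loading sub-system: insert transferred rows into the receiving database
            dst.executemany("INSERT OR REPLACE INTO site VALUES (?, ?, ?)", json.loads(transaction))

        def purge(db, site_ids):
            # purging sub-system: delete rows that are no longer needed at the source
            db.executemany("DELETE FROM site WHERE id = ?", [(i,) for i in site_ids])

        src, dst = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
        for db in (src, dst):
            db.execute("CREATE TABLE site (id INTEGER PRIMARY KEY, name TEXT, value REAL)")
        src.executemany("INSERT INTO site VALUES (?, ?, ?)", [(1, "well-A", 12.5), (2, "well-B", 7.25)])
        load(dst, retrieve(src, [1, 2]))
        purge(src, [1, 2])
        print(dst.execute("SELECT * FROM site").fetchall())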

  12. Computer-Based Concept Mapping.

    ERIC Educational Resources Information Center

    Fisher, K. M.; And Others

    1990-01-01

    Described is a computer-based tool, SemNet, which can be used by good and poor students to facilitate, extend, and polish their learning skills. Emphasized are skills associated with meaningful or deep-level learning. The components, use, and theoretical background of the software are discussed. (CW)

  13. Results of Computer Based Training.

    ERIC Educational Resources Information Center

    1978

    This report compares the projected savings of using computer based training to conduct training for newly hired pilots to the results of that application. New Hire training, one of a number of programs conducted continuously at the United Airline Flight Operations Training Center, is designed to assure that any newly hired pilot will be able to…

  14. Computer-Based Concept Mapping.

    ERIC Educational Resources Information Center

    Fisher, K. M.; And Others

    1990-01-01

    Described is a computer-based tool, SemNet, which can be used by good and poor students to facilitate, extend, and polish their learning skills. Emphasized are skills associated with meaningful or deep-level learning. The components, use, and theoretical background of the software are discussed. (CW)

  15. Computer Based Virtual Field Trips.

    ERIC Educational Resources Information Center

    Clark, Kenneth F.; Hosticka, Alice; Schriver, Martha; Bedell, Jackie

    This paper discusses computer based virtual field trips that use technologies commonly found in public schools in the United States. The discussion focuses on the advantages of both using and creating these field trips for an instructional situation. A virtual field trip to Cumberland Island National Seashore, St. Marys, Georgia is used as a point…

  16. Results of Computer Based Training.

    ERIC Educational Resources Information Center

    1978

    This report compares the projected savings of using computer based training to conduct training for newly hired pilots to the results of that application. New Hire training, one of a number of programs conducted continuously at the United Airline Flight Operations Training Center, is designed to assure that any newly hired pilot will be able to…

  17. A dictionary based informational genome analysis

    PubMed Central

    2012-01-01

    Background: In the post-genomic era several methods of computational genomics are emerging to understand how information is structured within whole genomes. The literature of the last five years includes several alignment-free methods, which have arisen as alternative metrics for the dissimilarity of biological sequences. Among others, recent approaches are based on empirical frequencies of DNA k-mers in whole genomes. Results: Any set of words (factors) occurring in a genome provides a genomic dictionary. About sixty genomes were analyzed by means of informational indexes based on genomic dictionaries, where a systemic view replaces local sequence analysis. A software prototype applying the methodology outlined here carried out computations on genomic data. We computed informational indexes and built genomic dictionaries of different sizes, along with frequency distributions. The software performed three main tasks: computation of informational indexes, storage of these in a database, and index analysis and visualization. Validation was done by investigating genomes of various organisms. A systematic analysis of genomic repeats of several lengths, which is of keen interest in biology (for example, to identify over-represented functional sequences such as promoters), was discussed and suggested a method to define synthetic genetic networks. Conclusions: We introduced a methodology based on dictionaries, and an efficient motif-finding software application for comparative genomics. This approach could be extended along many investigation lines, namely exported to other contexts of computational genomics, as a basis for discriminating genomic pathologies. PMID:22985068
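
    To make "genomic dictionary" and "informational index" concrete, the sketch below builds the dictionary of k-mers for a sequence and computes one plausible index, the Shannon entropy of the k-mer distribution. The specific indexes used in the paper are not reproduced here, and the toy sequence and choice of k are assumptions.

        import math
        from collections import Counter

        def kmer_dictionary(genome, k):
            # genomic dictionary: every length-k factor occurring in the sequence, with counts
            return Counter(genome[i:i + k] for i in range(len(genome) - k + 1))

        def kmer_entropy(dictionary):
            # one simple informational index: Shannon entropy of the k-mer distribution (bits)
            total = sum(dictionary.values())
            return -sum((c / total) * math.log2(c / total) for c in dictionary.values())

        genome = "ACGTACGTGGTACCAACGT"      # toy sequence
        d = kmer_dictionary(genome, k=3)
        print(len(d), round(kmer_entropy(d), 3))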

  18. Differentiating Information Skills and Computer Skills: A Factor Analytic Approach

    ERIC Educational Resources Information Center

    Pask, Judith M.; Saunders, E. Stewart

    2004-01-01

    A basic tenet of information literacy programs is that the skills needed to use computers and the skills needed to find and evaluate information are two separate sets of skills. Outside the library this is not always the view. The claim is sometimes made that information skills are acquired by learning computer skills. All that is needed is a…

  19. Defining Information Needs of Computer Users: A Human Communication Problem.

    ERIC Educational Resources Information Center

    Kimbrough, Kenneth L.

    This exploratory investigation of the process of defining the information needs of computer users and the impact of that process on information retrieval focuses on communication problems. Six sites were visited that used computers to process data or to provide information, including the California Department of Transportation, the California…

  20. Defining Information Needs of Computer Users: A Human Communication Problem.

    ERIC Educational Resources Information Center

    Kimbrough, Kenneth L.

    This exploratory investigation of the process of defining the information needs of computer users and the impact of that process on information retrieval focuses on communication problems. Six sites were visited that used computers to process data or to provide information, including the California Department of Transportation, the California…

  1. Computer Human Interaction for Image Information Systems.

    ERIC Educational Resources Information Center

    Beard, David Volk

    1991-01-01

    Presents an approach to developing viable image computer-human interactions (CHI) involving user metaphors for comprehending image data and methods for locating, accessing, and displaying computer images. A medical-image radiology workstation application is used as an example, and feedback and evaluation methods are discussed. (41 references) (LRW)

  2. Computer Human Interaction for Image Information Systems.

    ERIC Educational Resources Information Center

    Beard, David Volk

    1991-01-01

    Presents an approach to developing viable image computer-human interactions (CHI) involving user metaphors for comprehending image data and methods for locating, accessing, and displaying computer images. A medical-image radiology workstation application is used as an example, and feedback and evaluation methods are discussed. (41 references) (LRW)

  3. The effect of format modifications and reading comprehension on recall of informed consent information by low-income parents: a comparison of print, video, and computer-based presentations.

    PubMed

    Campbell, Frances A; Goldman, Barbara D; Boccia, Maria L; Skinner, Martie

    2004-05-01

    A randomized trial comparing the amount of knowledge orally recalled from four different presentations of the same consent information was conducted in a non-clinic sample of 233 low-income parents who displayed a range of reading comprehension skill. The study simulated recruitment of children into one of two actual studies underway at another location: one involved high risk to participants, the other did not. Use of a non-clinic sample controlled for prior knowledge of the conditions, and avoiding discussion of the information further assured that differences in recalled information could be attributed more confidently to the format itself. The formats included the original written forms, enhanced print (simpler language, topic headings, pictures), narrated videotapes, and self-paced PowerPoint presentations via laptop computer with bulleted print information, pictures, and narration. No format-related differences in recalled information were found in the full sample but for the 124 individuals with reading comprehension scores at or below the 8th grade level, the enhanced print version tended to be more effective than either the original form or the video. Across all formats, more information was recalled about the low-risk study. The findings emphasize the necessity for clinicians and researchers to verify understanding of consent information, especially when there is risk of reduced literacy skill. Reliance on video to convey information in preference to well-done print media appeared questionable.

  4. Synchronizing compute node time bases in a parallel computer

    DOEpatents

    Chen, Dong; Faraj, Daniel A; Gooding, Thomas M; Heidelberger, Philip

    2014-12-30

    Synchronizing time bases in a parallel computer that includes compute nodes organized for data communications in a tree network, where one compute node is designated as a root, and, for each compute node: calculating data transmission latency from the root to the compute node; configuring a thread as a pulse waiter; initializing a wakeup unit; and performing a local barrier operation; upon each node completing the local barrier operation, entering, by all compute nodes, a global barrier operation; upon all nodes entering the global barrier operation, sending, to all the compute nodes, a pulse signal; and for each compute node upon receiving the pulse signal: waking, by the wakeup unit, the pulse waiter; setting a time base for the compute node equal to the data transmission latency between the root node and the compute node; and exiting the global barrier operation.
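
    A toy rendering of the idea may help: after a synchronizing pulse, each node's time base is set to its transmission latency from the root, so all nodes share a common origin. The tree, per-hop latency, and recursive walk below are illustrative assumptions, not the patented barrier-and-pulse implementation.

        from dataclasses import dataclass, field

        @dataclass
        class Node:
            rank: int
            children: list = field(default_factory=list)
            time_base: float = 0.0      # seconds of offset relative to the root's pulse

        def set_time_bases(node, hop_latency=1.0e-6, latency_from_root=0.0):
            # each node's time base equals its cumulative transmission latency from the root
            node.time_base = latency_from_root
            for child in node.children:
                set_time_bases(child, hop_latency, latency_from_root + hop_latency)

        # toy two-level tree: root -> two children, one of which has a child of its own
        root = Node(0, [Node(1, [Node(3)]), Node(2)])
        set_time_bases(root)
        nodes = [root] + root.children + root.children[0].children
        print([(n.rank, n.time_base) for n in nodes])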

  5. Synchronizing compute node time bases in a parallel computer

    DOEpatents

    Chen, Dong; Faraj, Daniel A; Gooding, Thomas M; Heidelberger, Philip

    2015-01-27

    Synchronizing time bases in a parallel computer that includes compute nodes organized for data communications in a tree network, where one compute node is designated as a root, and, for each compute node: calculating data transmission latency from the root to the compute node; configuring a thread as a pulse waiter; initializing a wakeup unit; and performing a local barrier operation; upon each node completing the local barrier operation, entering, by all compute nodes, a global barrier operation; upon all nodes entering the global barrier operation, sending, to all the compute nodes, a pulse signal; and for each compute node upon receiving the pulse signal: waking, by the wakeup unit, the pulse waiter; setting a time base for the compute node equal to the data transmission latency between the root node and the compute node; and exiting the global barrier operation.

  6. Information Based Numerical Practice.

    DTIC Science & Technology

    1987-02-01

    characterization by comparative computational studies of various benchmark problems. See e.g. [MacNeal, Harder (1985)], [Robinson, Blackham (1981)] any … FOR NONADAPTIVE METHODS. 2.1. THE QUADRATURE FORMULA. The simplest example studied in detail in the literature is the problem of the optimal quadrature … formulae and the functional analytic prerequisites for the study of optimal formulae, we refer to the large monograph (808 pp.) of [Sobolev (1974)]. Let us …

  7. Computer-Based Information Management System. Project STEEL. A Special Project To Develop and Implement a Computer-Based Special Teacher Education and Evaluation Laboratory. Volume IV. Final Report.

    ERIC Educational Resources Information Center

    Frick, Theodore W.; And Others

    The document is part of the final report on Project STEEL (Special Teacher Education and Evaluation Laboratory) intended to extend the utilization of technology in the training of preservice special education teachers. This volume focuses on the fourth of four project objectives, the development and preliminary evaluation of a computer-based…

  8. The Czechoslovak Computer Information System ASTI

    ERIC Educational Resources Information Center

    Fendrych, Miroslav; Fogl, Ing. Jiri

    1971-01-01

    Discussed is the Czechoslovak program of State Information Policy which aims at the construction of a unified, integrated national system of scientific, technological and economic information. (Author/SJ)

  9. The ABLEDATA System: Instant Information via Computer.

    ERIC Educational Resources Information Center

    Exceptional Parent, 1983

    1983-01-01

    ABLEDATA, a computerized information database, provides information on commercially available rehabilitation products in 13 major categories, including personal care, mobility, communication, and prosthetics. Product information includes manufacturer's name, availability, cost, and formal evaluation results. The network is accessible to anyone via…

  10. The ABLEDATA System: Instant Information via Computer.

    ERIC Educational Resources Information Center

    Exceptional Parent, 1983

    1983-01-01

    ABLEDATA, a computerized information database, provides information on commercially available rehabilitation products in 13 major categories, including personal care, mobility, communication, and prosthetics. Product information includes manufacturer's name, availability, cost, and formal evaluation results. The network is accessible to anyone via…

  11. Computer Software Management and Information Center

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Computer programs for passive anti-roll tank, earth resources laboratory applications, the NIMBUS-7 coastal zone color scanner derived products, transportable applications executive, plastic and failure analysis of composites, velocity gradient method for calculating velocities in an axisymmetric annular duct, an integrated procurement management system, data I/O PRON for the Motorola exorcisor, aerodynamic shock-layer shape, kinematic modeling, hardware library for a graphics computer, and a file archival system are documented.

  12. Hypertext-based computer vision teaching packages

    NASA Astrophysics Data System (ADS)

    Marshall, A. David

    1994-10-01

    The World Wide Web Initiative has provided a means of delivering hypertext- and multimedia-based information across the whole Internet. Many applications have been developed on such HTTP servers. At Cardiff we have developed an HTTP hypertext-based multimedia server, the Cardiff Information Server, using the widely available Mosaic system. The server provides a variety of information, ranging from teaching modules, on-line documentation and timetables for departmental activities to more light-hearted hobby interests. One important and novel addition to the server has been the development of courseware facilities. These range from the provision of on-line lecture notes, exercises and their solutions to more interactive teaching packages. A variety of disciplines have benefited, notably Computer Vision and Image Processing, but also C programming, X Windows, Computer Graphics and Parallel Computing. This paper addresses the implementation of the Computer Vision and Image Processing packages, the advantages gained from using a hypertext-based system, and practical experiences of using the packages in a class environment. It also addresses how best to provide information in such a hypertext-based system and how interactive image processing packages can be developed and integrated into courseware. The suite of tools developed facilitates a flexible and powerful courseware package that has proved popular in the classroom and over the Internet. The paper also details many future developments we see as possible. One of the key points raised in the paper is that Mosaic's hypertext language (html) is extremely powerful and yet relatively straightforward to use. It is also possible to link in Unix calls so that programs and shells can be executed. This provides a powerful suite of utilities that can be exploited to develop many packages.

  13. Bridging the integration gap between imaging and information systems: a uniform data concept for content-based image retrieval in computer-aided diagnosis.

    PubMed

    Welter, Petra; Riesmeier, Jörg; Fischer, Benedikt; Grouls, Christoph; Kuhl, Christiane; Deserno, Thomas M

    2011-01-01

    It is widely accepted that content-based image retrieval (CBIR) can be extremely useful for computer-aided diagnosis (CAD). However, CBIR has not yet been established in clinical practice. A widely unattended gap in integration is the lack of a unified data concept for CBIR-based CAD results and reporting. Picture archiving and communication systems (PACS) and the workflow of radiologists must be considered for successful data integration. We suggest that CBIR systems applied to CAD integrate their results into the PACS environment as Digital Imaging and Communications in Medicine (DICOM) structured reporting documents. A sample DICOM structured reporting template adaptable to CBIR and an appropriate integration scheme are presented. The proposed CBIR data concept may foster the promulgation of CBIR systems in clinical environments and, thereby, improve the diagnostic process.
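
    As a rough illustration of packaging CBIR output in an SR-style content tree (not the paper's actual template), the sketch below assembles a minimal in-memory structure with pydicom; the code values, coding scheme designator, and similarity text are hypothetical placeholders, and a real SR document would need further mandatory attributes.

        from pydicom.dataset import Dataset

        def code_item(value, scheme, meaning):
            # helper for a DICOM code item (the codes below are made up, not standard)
            c = Dataset()
            c.CodeValue, c.CodingSchemeDesignator, c.CodeMeaning = value, scheme, meaning
            return c

        def cbir_report_content(similar_instances):
            # root CONTAINER of an SR-style content tree holding retrieved similar images
            root = Dataset()
            root.ValueType = "CONTAINER"
            root.ConceptNameCodeSequence = [code_item("CBIR00", "99LOCAL", "CBIR result container")]
            root.ContentSequence = []
            for uid, score in similar_instances:
                item = Dataset()
                item.RelationshipType = "CONTAINS"
                item.ValueType = "TEXT"
                item.ConceptNameCodeSequence = [code_item("CBIR01", "99LOCAL", "Similar image")]
                item.TextValue = "instance %s, similarity %.2f" % (uid, score)
                root.ContentSequence.append(item)
            return root

        print(cbir_report_content([("1.2.3.5", 0.91), ("1.2.3.6", 0.87)]))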

  14. Can computational goals inform theories of vision?

    PubMed

    Anderson, Barton L

    2015-04-01

    One of the most lasting contributions of Marr's posthumous book is his articulation of the different "levels of analysis" that are needed to understand vision. Although a variety of work has examined how these different levels are related, there is comparatively little examination of the assumptions on which his proposed levels rest, or of the plausibility of the approach Marr articulated given those assumptions. Marr placed particular significance on computational level theory, which specifies the "goal" of a computation, its appropriateness for solving a particular problem, and the logic by which it can be carried out. The structure of computational level theory is inherently teleological: what the brain does is described in terms of its purpose. I argue that computational level theory, and the reverse-engineering approach it inspires, requires understanding the historical trajectory that gave rise to functional capacities that can be meaningfully attributed with some sense of purpose or goal, that is, a reconstruction of the fitness function on which natural selection acted in shaping our visual abilities. I argue that this reconstruction is required to distinguish abilities shaped by natural selection ("natural tasks") from evolutionary "by-products" (spandrels, co-optations, and exaptations), rather than merely demonstrating that computational goals can be embedded in a Bayesian model that renders a particular behavior or process rational. Copyright © 2015 Cognitive Science Society, Inc.

  15. Fostering an Informal Learning Community of Computer Technologies at School

    ERIC Educational Resources Information Center

    Xiao, Lu; Carroll, John M.

    2007-01-01

    Computer technologies develop at a challenging fast pace. Formal education should not only teach students basic computer skills to meet current computer needs, but also foster student development of informal learning ability for a lifelong learning process. On the other hand, students growing up in the digital world are often more skilled with…

  16. Information-theoretic temporal Bell inequality and quantum computation

    SciTech Connect

    Morikoshi, Fumiaki

    2006-05-15

    An information-theoretic temporal Bell inequality is formulated to contrast classical and quantum computations. Any classical algorithm satisfies the inequality, while quantum ones can violate it. Therefore, the violation of the inequality is an immediate consequence of the quantumness in the computation. Furthermore, this approach suggests a notion of temporal nonlocality in quantum computation.

  17. Remote sensing and geographically based information systems

    NASA Technical Reports Server (NTRS)

    Cicone, R. C.

    1977-01-01

    The incorporation of remotely sensed digital data in a computer based information system is seen to be equivalent to the incorporation of any other spatially oriented layer of data. The growing interest in such systems indicates a need to develop a generalized geographically oriented data base management system that could be made commercially available for a wide range of applications. Some concepts that distinguish geographic information systems were reviewed, and a simple model which can serve as a conceptual framework for the design of a generalized geographic information system was examined.

  18. Computation of Semantic Number from Morphological Information

    ERIC Educational Resources Information Center

    Berent, Iris; Pinker, Steven; Tzelgov, Joseph; Bibi, Uri; Goldfarb, Liat

    2005-01-01

    The distinction between singular and plural enters into linguistic phenomena such as morphology, lexical semantics, and agreement and also must interface with perceptual and conceptual systems that assess numerosity in the world. Three experiments examine the computation of semantic number for singulars and plurals from the morphological…

  19. Computation of Semantic Number from Morphological Information

    ERIC Educational Resources Information Center

    Berent, Iris; Pinker, Steven; Tzelgov, Joseph; Bibi, Uri; Goldfarb, Liat

    2005-01-01

    The distinction between singular and plural enters into linguistic phenomena such as morphology, lexical semantics, and agreement and also must interface with perceptual and conceptual systems that assess numerosity in the world. Three experiments examine the computation of semantic number for singulars and plurals from the morphological…

  20. Applying Human Computation Methods to Information Science

    ERIC Educational Resources Information Center

    Harris, Christopher Glenn

    2013-01-01

    Human Computation methods such as crowdsourcing and games with a purpose (GWAP) have each recently drawn considerable attention for their ability to synergize the strengths of people and technology to accomplish tasks that are challenging for either to do well alone. Despite this increased attention, much of this transformation has been focused on…

  1. Applying Human Computation Methods to Information Science

    ERIC Educational Resources Information Center

    Harris, Christopher Glenn

    2013-01-01

    Human Computation methods such as crowdsourcing and games with a purpose (GWAP) have each recently drawn considerable attention for their ability to synergize the strengths of people and technology to accomplish tasks that are challenging for either to do well alone. Despite this increased attention, much of this transformation has been focused on…

  2. 11 CFR 9003.6 - Production of computer information.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 11 Federal Elections 1 2010-01-01 2010-01-01 false Production of computer information. 9003.6... ELECTION FINANCING ELIGIBILITY FOR PAYMENTS § 9003.6 Production of computer information. (a) Categories of... Commission's audit to review the committee's receipts, disbursements, loans, debts, obligations,...

  3. International Computer and Information Literacy Study: Assessment Framework

    ERIC Educational Resources Information Center

    Fraillon, Julian; Schulz, Wolfram; Ainley, John

    2013-01-01

    The purpose of the International Computer and Information Literacy Study 2013 (ICILS 2013) is to investigate, in a range of countries, the ways in which young people are developing "computer and information literacy" (CIL) to support their capacity to participate in the digital age. To achieve this aim, the study will assess student…

  4. Computing, Information, and Communications Technology (CICT) Program Overview

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.

    2003-01-01

    The Computing, Information and Communications Technology (CICT) Program's goal is to enable NASA's Scientific Research, Space Exploration, and Aerospace Technology Missions with greater mission assurance, for less cost, with increased science return through the development and use of advanced computing, information and communication technologies

  5. Entrepreneurial Health Informatics for Computer Science and Information Systems Students

    ERIC Educational Resources Information Center

    Lawler, James; Joseph, Anthony; Narula, Stuti

    2014-01-01

    Corporate entrepreneurship is a critical area of curricula for computer science and information systems students. Few institutions of computer science and information systems have entrepreneurship in the curricula however. This paper presents entrepreneurial health informatics as a course in a concentration of Technology Entrepreneurship at a…

  6. Delay of Informative Feedback and Computer Managed Instruction.

    ERIC Educational Resources Information Center

    Sturges, Persis T.

    This experiment investigated the effect of a delay in informative feedback in computer-managed instruction upon later retention. Experimental groups of undergraduate college students were given a computer-managed test and received informative feedback (1) immediately, item by item (2 second delay); (2) following the entire test (20 minute delay);…

  8. Informal and Formal Computer Training in a Corporate Setting.

    ERIC Educational Resources Information Center

    Shapiro, Karen Rosenkrantz

    Experiences with computer education at the Bank of America suggest that, when educating nontechnical people to use computers, formal learning environments are not as effective as informal learning environments. It has become increasingly necessary to use informal training to educate the approximately 4,000 people who need the training because of…

  9. ICOHR: intelligent computer based oral health record.

    PubMed

    Peterson, L C; Cobb, D S; Reynolds, D C

    1995-01-01

    The majority of work on computer use in the dental field has focused on non-clinical practice management information needs. Very few computer-based dental information systems provide management support of the clinical care process, particularly with respect to quality management. Traditional quality assurance methods rely on the paper record and provide only retrospective analysis. Today, proactive quality management initiatives are on the rise. Computer-based dental information systems are being integrated into the care environment, actively providing decision support as patient care is being delivered. These new systems emphasize assessment and improvement of patient care at the time of treatment, thus building internal quality management into the caregiving process. The integration of real time quality management and patient care will be expedited by the introduction of an information system architecture that emulates the gathering and storage of clinical care data currently provided by the paper record. As a proposed solution to the problems associated with existing dental record systems, the computer-based patient record has emerged as a possible alternative to the paper dental record. The Institute of Medicine (IOM) recently conducted a study on improving the efficiency and accuracy of patient record keeping. As a result of this study, the IOM advocates the development and implementation of computer-based patient records as the standard for all patient care records. This project represents the ongoing efforts of The University of Iowa College of Dentistry's collaboration with the University of Uppsala Data Center, Uppsala, Sweden, on a computer-based patient dental record model. ICOHR (Intelligent Computer Based Oral Health Record) is an information system which brings together five important parts of the patient's dental record: medical and dental history; oral status; treatment planning; progress notes; and a Patient Care Database, generated from their

  10. Mapping the Information Trace in Local Field Potentials by a Computational Method of Two-Dimensional Time-Shifting Synchronization Likelihood Based on Graphic Processing Unit Acceleration.

    PubMed

    Zhao, Zi-Fang; Li, Xue-Zhu; Wan, You

    2017-09-12

    The local field potential (LFP) is a signal reflecting the electrical activity of neurons surrounding the electrode tip. Synchronization between LFP signals provides important details about how neural networks are organized. Synchronization between two distant brain regions is hard to detect using linear synchronization algorithms like correlation and coherence. Synchronization likelihood (SL) is a non-linear synchronization-detecting algorithm widely used in studies of neural signals from two distant brain areas. One drawback of non-linear algorithms is the heavy computational burden. In the present study, we proposed a graphic processing unit (GPU)-accelerated implementation of an SL algorithm with optional 2-dimensional time-shifting. We tested the algorithm with both artificial data and raw LFP data. The results showed that this method revealed detailed information from original data with the synchronization values of two temporal axes, delay time and onset time, and thus can be used to reconstruct the temporal structure of a neural network. Our results suggest that this GPU-accelerated method can be extended to other algorithms for processing time-series signals (like EEG and fMRI) using similar recording techniques.
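
    A minimal CPU-only sketch of the basic synchronization-likelihood idea underlying this record is given below; it omits the two-dimensional time-shifting and the GPU acceleration described by the authors, and the embedding dimension, lag, and neighbour fraction are illustrative assumptions.

      import numpy as np

      def embed(x, dim=3, lag=1):
          """Time-delay embedding of a 1-D signal into state vectors."""
          n = len(x) - (dim - 1) * lag
          return np.array([x[i:i + (dim - 1) * lag + 1:lag] for i in range(n)])

      def synchronization_likelihood(x, y, dim=3, lag=1, p_ref=0.05):
          """Crude synchronization likelihood between two signals.

          For each time point, find the moments when signal x revisits a
          similar state (its p_ref nearest neighbours), then measure how
          often signal y is simultaneously close to its own state at those
          moments.
          """
          ex, ey = embed(x, dim, lag), embed(y, dim, lag)
          n = len(ex)
          k = max(1, int(p_ref * n))              # neighbours kept per time point
          sl = np.zeros(n)
          for i in range(n):
              dx = np.linalg.norm(ex - ex[i], axis=1)
              dy = np.linalg.norm(ey - ey[i], axis=1)
              dx[i] = dy[i] = np.inf              # exclude trivial self-matches
              neighbours = np.argsort(dx)[:k]     # times where x is close to x[i]
              eps_y = np.sort(dy)[k - 1]          # critical distance for y
              sl[i] = np.mean(dy[neighbours] <= eps_y)
          return float(sl.mean())

      # toy check: two noisy copies of one oscillation are far more synchronized
      # than an oscillation paired with independent noise
      t = np.linspace(0, 10 * np.pi, 1000)
      a = np.sin(t) + 0.1 * np.random.randn(t.size)
      b = np.sin(t) + 0.1 * np.random.randn(t.size)
      c = np.random.randn(t.size)
      print(synchronization_likelihood(a, b), synchronization_likelihood(a, c))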

  11. Computer-Based Medical System

    NASA Technical Reports Server (NTRS)

    1998-01-01

    SYMED, Inc., developed a unique electronic medical records and information management system. The S2000 Medical Interactive Care System (MICS) incorporates both a comprehensive and interactive medical care support capability and an extensive array of digital medical reference materials in either text or high resolution graphic form. The system was designed, in cooperation with NASA, to improve the effectiveness and efficiency of physician practices. The S2000 is an MS (Microsoft) Windows-based software product that combines electronic forms, medical documents, and records management, and features a comprehensive medical information system for medical diagnostic support and treatment. SYMED, Inc. offers access to its medical systems to all companies seeking competitive advantages.

  12. The concept of information in scientific computing

    NASA Astrophysics Data System (ADS)

    Gottlieb, D.; Abarbanel, S.; Tadmor, E.

    The spectral approximation of discontinuous problems is discussed to show that when the solution is a complicated function the grid point value approximation is not acceptable, and to suggest different ways of realizing the solution in such cases. The information contained in the approximation, and how to extract it, is discussed for the case of a linear hyperbolic operator with variable coefficients and a discontinuous function. A smoothing procedure is outlined and its efficiency is demonstrated. A differential method for extracting information is described. It is shown that the use of digital filters to overcome the Gibbs phenomenon is applicable to this type of problem.
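
    As a concrete illustration of the filtering point above, the toy example below (an assumption, not the authors' scheme) reconstructs a discontinuous function from a truncated Fourier series and applies a smooth exponential filter to the retained coefficients, which visibly reduces the Gibbs overshoot.

      import numpy as np

      # Truncated Fourier reconstruction of a square wave shows Gibbs overshoot
      # near the jumps; multiplying the retained coefficients by an exponential
      # filter damps it.  Filter strength, order, and grid sizes are illustrative.
      N, M = 512, 32                                  # grid points, retained modes
      x = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
      f = np.sign(np.sin(x))                          # discontinuous test function

      c = np.fft.fft(f) / N                           # Fourier coefficients
      k = np.abs(np.fft.fftfreq(N, d=1.0 / N))        # integer wavenumbers
      keep = k <= M                                   # truncate the series at mode M

      sigma = np.exp(-36.0 * (k / M) ** 8)            # exponential filter of order 8
      truncated = np.real(np.fft.ifft(np.where(keep, c, 0.0) * N))
      filtered = np.real(np.fft.ifft(np.where(keep, c * sigma, 0.0) * N))

      print("max overshoot without filter:", truncated.max() - 1.0)
      print("max overshoot with filter   :", filtered.max() - 1.0)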

  13. Integrated computational and conceptual solutions for complex environmental information management

    NASA Astrophysics Data System (ADS)

    Rückemann, Claus-Peter

    2016-06-01

    This paper presents the recent results of the integration of computational and conceptual solutions for the complex case of environmental information management. The solution for the major goal of creating and developing long-term multi-disciplinary knowledge resources and conceptual and computational support was achieved by implementing and integrating key components. The key components are: long-term knowledge resources providing the structures required for universal knowledge creation, documentation, and preservation; universal multi-disciplinary and multi-lingual conceptual knowledge and classification, in particular references to the Universal Decimal Classification (UDC); sustainable workflows for environmental information management; and computational support for dynamic use, processing, and advanced scientific computing with Integrated Information and Computing System (IICS) components and High End Computing (HEC) resources.

  14. DNA based computing for understanding complex shapes.

    PubMed

    Ullah, A M M Sharif; D'Addona, Doriana; Arai, Nobuyuki

    2014-03-01

    This study deals with a computing method called DNA based computing (DBC) that takes inspiration from the Central Dogma of Molecular Biology. The proposed DBC uses a set of user-defined rules to create a DNA-like sequence from a given piece of problem-relevant information (e.g., image data) in dry media (i.e., in an ordinary computer). It then uses another set of user-defined rules to create an mRNA-like sequence from the DNA. Finally, it uses the genetic code to translate the mRNA (or directly the DNA) to a protein-like sequence (a sequence of amino acids). The informational characteristics of the protein (entropy, absence, presence, abundance of some selected amino acids, and relationships among their likelihoods) can be used to solve problems (e.g., to understand complex shapes from their image data). Two case studies ((1) the fractal-geometry-generated shape of a fern leaf and (2) the machining-experiment-generated shape of the worn zones of a cutting tool) are presented, elucidating the shape-understanding ability of the proposed DBC in the presence of a great deal of variability in the image data of the respective shapes. The implications of the proposed DBC in the context of Internet-aided manufacturing systems are also described. Further study can be carried out in solving other complex computational problems by using the proposed DBC and its derivatives. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
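
    The sketch below is a loose, illustrative reading of the DBC pipeline described above, not the authors' rule set: data values are quantized into DNA bases, transcribed to mRNA, translated with an excerpt of the standard genetic code, and the resulting amino-acid sequence is summarized by its Shannon entropy.

      import math
      import random
      from collections import Counter

      BASES = "ACGT"
      CODON_TABLE = {  # excerpt of the standard genetic code; unknown codons map to "X"
          "UUU": "F", "UUA": "L", "AUG": "M", "GUU": "V", "GCU": "A",
          "GAU": "D", "GGU": "G", "CCU": "P", "ACU": "T", "AAA": "K",
      }

      def data_to_dna(values):
          """Rule 1 (assumed): quantize 8-bit data values into the four bases."""
          return "".join(BASES[min(v // 64, 3)] for v in values)

      def transcribe(dna):
          """Rule 2: DNA -> mRNA (thymine becomes uracil)."""
          return dna.replace("T", "U")

      def translate(mrna):
          """Rule 3: read non-overlapping codons through the genetic code."""
          codons = (mrna[i:i + 3] for i in range(0, len(mrna) - 2, 3))
          return "".join(CODON_TABLE.get(c, "X") for c in codons)

      def entropy(seq):
          """Shannon entropy (bits/symbol) of the amino-acid sequence."""
          counts = Counter(seq)
          return -sum(c / len(seq) * math.log2(c / len(seq)) for c in counts.values())

      # toy "image" data: a smooth ramp versus a noisy texture yields different
      # informational characteristics of the protein-like sequence
      ramp = list(range(0, 256, 2))
      noise = [random.randrange(256) for _ in range(128)]
      for name, data in (("ramp", ramp), ("noise", noise)):
          protein = translate(transcribe(data_to_dna(data)))
          print(name, round(entropy(protein), 3))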

  15. Industrial information database service by personal computer network 'Saitamaken Industrial Information System'

    NASA Astrophysics Data System (ADS)

    Sugahara, Keiji

    The Saitamaken Industrial Information System provides online database services that do not rely on computers for every operation, but use computers, optical disk files, or facsimiles for particular operations as appropriate. It delivers information through a variety of outputs: image information is sent from optical disk files to facsimiles, while other information is provided from computers to terminals as well as to facsimiles. With computers located at the core of the system, it enables integrated operations. The terminal-side system was developed separately, with functions such as turnkey-style operation and downloading of statistical information and of the newest menu.

  16. Inversion based on computational simulations

    SciTech Connect

    Hanson, K.M.; Cunningham, G.S.; Saquib, S.S.

    1998-09-01

    A standard approach to solving inversion problems that involve many parameters uses gradient-based optimization to find the parameters that best match the data. The authors discuss enabling techniques that facilitate application of this approach to large-scale computational simulations, which are the only way to investigate many complex physical phenomena. Such simulations may not seem to lend themselves to calculation of the gradient with respect to numerous parameters. However, adjoint differentiation allows one to efficiently compute the gradient of an objective function with respect to all the variables of a simulation. When combined with advanced gradient-based optimization algorithms, adjoint differentiation permits one to solve very large problems of optimization or parameter estimation. These techniques will be illustrated through the simulation of the time-dependent diffusion of infrared light through tissue, which has been used to perform optical tomography. The techniques discussed have a wide range of applicability to modeling including the optimization of models to achieve a desired design goal.
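
    The sketch below illustrates the adjoint idea in miniature under assumed toy conditions (a 1-D explicit diffusion model with a single unknown coefficient, not the infrared-light transport problem): one forward sweep plus one backward adjoint sweep yields the gradient of the data misfit, which is then checked against a finite difference.

      import numpy as np

      # Toy adjoint differentiation: forward-march a 1-D diffusion model, sweep
      # the adjoint backwards in time, and obtain dJ/dD (J = data misfit,
      # D = diffusion coefficient) in one extra pass.  All settings are assumed.
      N_X, N_T, DT = 50, 200, 0.1
      LAP = (np.diag(-2.0 * np.ones(N_X)) +
             np.diag(np.ones(N_X - 1), 1) +
             np.diag(np.ones(N_X - 1), -1))            # discrete Laplacian, zero boundaries

      def forward(D, u0):
          """Explicit time stepping; intermediate states are kept for the adjoint."""
          states = [u0]
          for _ in range(N_T):
              states.append(states[-1] + DT * D * (LAP @ states[-1]))
          return states

      def misfit_and_adjoint_gradient(D, u0, data):
          states = forward(D, u0)
          r = states[-1] - data                        # residual at the final time
          J = float(r @ r)
          lam = 2.0 * r                                # adjoint variable at the final time
          dJ_dD = 0.0
          for n in range(N_T - 1, -1, -1):
              dJ_dD += DT * (LAP @ states[n]) @ lam    # sensitivity of step n to D
              lam = lam + DT * D * (LAP.T @ lam)       # adjoint step, backwards in time
          return J, dJ_dD

      x = np.linspace(0.0, 1.0, N_X)
      u0 = np.exp(-100.0 * (x - 0.5) ** 2)
      data = forward(0.02, u0)[-1]                     # synthetic data with D_true = 0.02

      D, eps = 0.005, 1e-7
      J, g_adjoint = misfit_and_adjoint_gradient(D, u0, data)
      g_fd = (misfit_and_adjoint_gradient(D + eps, u0, data)[0] - J) / eps
      print("adjoint gradient:", g_adjoint, "  finite-difference check:", g_fd)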

  17. Information modification and particle collisions in distributed computation.

    PubMed

    Lizier, Joseph T; Prokopenko, Mikhail; Zomaya, Albert Y

    2010-09-01

    Distributed computation can be described in terms of the fundamental operations of information storage, transfer, and modification. To describe the dynamics of information in computation, we need to quantify these operations on a local scale in space and time. In this paper we extend previous work regarding the local quantification of information storage and transfer, to explore how information modification can be quantified at each spatiotemporal point in a system. We introduce the separable information, a measure which locally identifies information modification events where separate inspection of the sources to a computation is misleading about its outcome. We apply this measure to cellular automata, where it is shown to be the first direct quantitative measure to provide evidence for the long-held conjecture that collisions between emergent particles therein are the dominant information modification events.

  18. Design & implementation of distributed spatial computing node based on WPS

    NASA Astrophysics Data System (ADS)

    Liu, Liping; Li, Guoqing; Xie, Jibo

    2014-03-01

    Current research on SIG (Spatial Information Grid) technology mostly emphasizes spatial data sharing in grid environments, while the importance of spatial computing resources is ignored. In order to enable the sharing of and cooperation among spatial computing resources in a grid environment, this paper systematically examines the key technologies for constructing a Spatial Computing Node based on the WPS (Web Processing Service) specification of the OGC (Open Geospatial Consortium). A framework for the Spatial Computing Node is then designed according to the features of spatial computing resources. Finally, a prototype Spatial Computing Node is implemented and verified in this environment.

  19. Computer-based technological applications in psychotherapy training.

    PubMed

    Berger, Thomas

    2004-03-01

    Despite obvious and documented advantages of technological applications in education, psychotherapy training based on information technology is either still rare or limited to technical innovations such as videotape recordings. This article outlines the opportunities that new computer-based learning technologies create for psychotherapy training. First, approaches that include computer-mediated communication between trainees and teachers/supervisors are presented. Then, computer-based learning technology for self-study purposes is discussed in the context of educational approaches to learning. Computer-based tools that have been developed and evaluated in the context of psychotherapy training are described. Evaluations of the tools are discussed. Copyright 2004 Wiley Periodicals, Inc. J Clin Psychol.

  20. Computationally Inexpensive Identification of Non-Informative Model Parameters

    NASA Astrophysics Data System (ADS)

    Mai, J.; Cuntz, M.; Kumar, R.; Zink, M.; Samaniego, L. E.; Schaefer, D.; Thober, S.; Rakovec, O.; Musuuza, J. L.; Craven, J. R.; Spieler, D.; Schrön, M.; Prykhodko, V.; Dalmasso, G.; Langenberg, B.; Attinger, S.

    2014-12-01

    Sensitivity analysis is used, for example, to identify parameters which induce the largest variability in model output and are thus informative during calibration. Variance-based techniques are employed for this purpose, which unfortunately require a large number of model evaluations and are thus impractical for complex environmental models. We therefore developed a computationally inexpensive screening method, based on Elementary Effects, that automatically separates informative and non-informative model parameters. The method was tested using the mesoscale hydrologic model (mHM) with 52 parameters. The model was applied in three European catchments with different hydrological characteristics, i.e. Neckar (Germany), Sava (Slovenia), and Guadalquivir (Spain). The method identified the same informative parameters as the standard Sobol' method but with less than 1% of the model runs. In Germany and Slovenia, 22 of 52 parameters were informative, mostly in the formulations of evapotranspiration, interflow and percolation. In Spain, 19 of 52 parameters were informative, with an increased importance of soil parameters. We showed further that Sobol' indexes calculated for the subset of informative parameters are practically the same as Sobol' indexes before the screening, but the number of model runs was reduced by more than 50%. The model mHM was then calibrated twice in the three test catchments. First, all 52 parameters were taken into account, and then only the informative parameters were calibrated while all others were kept fixed. The Nash-Sutcliffe efficiencies were 0.87 and 0.83 in Germany, 0.89 and 0.88 in Slovenia, and 0.86 and 0.85 in Spain, respectively. This minor loss of at most 4% in model performance comes along with a substantial decrease of at least 65% in model evaluations. In summary, we propose an efficient screening method to identify non-informative model parameters that can be discarded during further applications. We have shown that sensitivity
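
    A minimal sketch of the Elementary Effects screening idea on which the method above builds is shown below; the toy model, trajectory count, and informativeness threshold are illustrative assumptions and do not reproduce the mHM setup.

      import numpy as np

      def elementary_effects(model, n_params, n_trajectories=20, levels=4, seed=0):
          """Morris-style screening: perturb one parameter at a time along random
          trajectories and collect the resulting elementary effects."""
          rng = np.random.default_rng(seed)
          delta = levels / (2.0 * (levels - 1))          # standard Morris step size
          effects = [[] for _ in range(n_params)]
          for _ in range(n_trajectories):
              x = rng.integers(0, levels - 1, n_params) / (levels - 1)  # random base point
              y = model(x)
              for i in rng.permutation(n_params):        # move one parameter at a time
                  x_new = x.copy()
                  x_new[i] = x_new[i] + delta if x_new[i] + delta <= 1.0 else x_new[i] - delta
                  y_new = model(x_new)
                  effects[i].append((y_new - y) / delta)
                  x, y = x_new, y_new
          # mean absolute effect: large values flag informative parameters
          return np.array([np.mean(np.abs(e)) for e in effects])

      # toy model: only the first three of ten parameters matter
      def toy_model(x):
          return 5.0 * x[0] + 3.0 * x[1] ** 2 + 2.0 * np.sin(np.pi * x[2]) + 0.01 * x[3:].sum()

      mu_star = elementary_effects(toy_model, n_params=10)
      print("screened as informative:", np.where(mu_star > 0.1 * mu_star.max())[0])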

  1. Computer Center: BASIC String Models of Genetic Information Transfer.

    ERIC Educational Resources Information Center

    Spain, James D., Ed.

    1984-01-01

    Discusses some of the major genetic information processes which may be modeled by computer program string manipulation, focusing on replication and transcription. Also discusses instructional applications of using string models. (JN)

  3. Information visualization courses for students with a computer science background.

    PubMed

    Kerren, Andreas

    2013-01-01

    Linnaeus University offers two master's courses in information visualization for computer science students with programming experience. This article briefly describes the syllabi, exercises, and practices developed for these courses.

  4. The Information Science Experiment System - The computer for science experiments in space

    NASA Technical Reports Server (NTRS)

    Foudriat, Edwin C.; Husson, Charles

    1989-01-01

    The concept of the Information Science Experiment System (ISES), potential experiments, and system requirements are reviewed. The ISES is conceived as a computer resource in space whose aim is to assist computer, earth, and space science experiments, to develop and demonstrate new information processing concepts, and to provide an experiment base for developing new information technology for use in space systems. The discussion covers system hardware and architecture, operating system software, the user interface, and the ground communication link.

  6. "Computer" and "Information and Communication Technology": Students' Culture Specific Interpretations

    ERIC Educational Resources Information Center

    Elen, Jan; Clarebout, Geraldine; Sarfo, Frederick Kwaku; Louw, Lambertus Philippus; Poysa-Tarhonen, Johanna; Stassens, Nick

    2010-01-01

    Given the use of information and communication technology (ICT) and computer as synonyms in ICT-integration research on the one hand, and the potential problems in doing so on the other, this contribution tries to gain insight in the understanding of the words computer and ICT in different settings. In five different countries (Belgium, Finland,…

  7. Course Syllabus: The Social Impact of Computer Information Technology.

    ERIC Educational Resources Information Center

    Behar, Joseph

    1988-01-01

    This syllabus describes the course background, central themes and issues, texts, resources, and recommended readings. Main topics are the sociology of information technology, computers and social change, telecommunications, computers and human interactions, applications in working, and social issues and political implications. (YP)

  8. Agile Computing for Air Force Information Management Infrastructures

    DTIC Science & Technology

    2008-06-01

    Report front-matter excerpts: acronyms (Component Object Model; FCS – Future Combat Systems; FTP – File Transfer Protocol; MANET – Mobile AdHoc Network; MTOM – Message Transmission Optimization...); abstract fragment: efficient data dissemination and predicate processing in dynamic networks; subject terms: agile computing, data dissemination, predicate processing; section 2.2: Agile Computing for Military Information Networks

  9. Information Security in the Age of Cloud Computing

    ERIC Educational Resources Information Center

    Sims, J. Eric

    2012-01-01

    Information security has been a particularly hot topic since the enhanced internal control requirements of Sarbanes-Oxley (SOX) were introduced in 2002. At about this same time, cloud computing started its explosive growth. Outsourcing of mission-critical functions has always been a gamble for managers, but the advantages of cloud computing are…

  11. Selective Bibliography on the History of Computing and Information Processing.

    ERIC Educational Resources Information Center

    Aspray, William

    1982-01-01

    Lists some of the better-known and more accessible books on the history of computing and information processing, covering: (1) popular general works; (2) more technical general works; (3) microelectronics and computing; (4) artificial intelligence and robotics; (5) works relating to Charles Babbage; (6) other biographical and personal accounts;…

  13. Object-based media and stream-based computing

    NASA Astrophysics Data System (ADS)

    Bove, V. Michael, Jr.

    1998-03-01

    Object-based media refers to the representation of audiovisual information as a collection of objects - the result of scene-analysis algorithms - and a script describing how they are to be rendered for display. Such multimedia presentations can adapt to viewing circumstances as well as to viewer preferences and behavior, and can provide a richer link between content creator and consumer. With faster networks and processors, such ideas become applicable to live interpersonal communications as well, creating a more natural and productive alternative to traditional videoconferencing. This paper outlines examples of object-based media algorithms and applications developed by my group, and presents new hardware architectures and software methods that we have developed to meet the computational requirements of object-based and other advanced media representations. In particular we describe stream-based processing, which enables automatic run-time parallelization of multidimensional signal processing tasks even given heterogeneous computational resources.

  14. Web application for simplifying access to computer center resources and information.

    SciTech Connect

    Long, J. W.

    2013-05-01

    Lorenz is a product of the ASC Scientific Data Management effort. Lorenz is a web-based application designed to help computer centers make information and resources more easily available to their users.

  15. Interactive Computer-Assisted Instruction in Acid-Base Physiology for Mobile Computer Platforms

    ERIC Educational Resources Information Center

    Longmuir, Kenneth J.

    2014-01-01

    In this project, the traditional lecture hall presentation of acid-base physiology in the first-year medical school curriculum was replaced by interactive, computer-assisted instruction designed primarily for the iPad and other mobile computer platforms. Three learning modules were developed, each with ~20 screens of information, on the subjects…

  17. Computational fluid dynamics: Complex flows requiring supercomputers. (Latest citations from the INSPEC: Information Services for the Physics and Engineering Communities data base). Published Search

    SciTech Connect

    Not Available

    1992-09-01

    The bibliography contains citations concerning computational fluid dynamics (CFD), a new method in computational science to perform complex flow simulations in three dimensions. Applications include aerodynamic design and analysis for aircraft, rockets and missiles, and automobiles; heat transfer studies; and combustion processes. Included are references to supercomputers, array processors, and parallel processors where needed for complete, integrated design. Also included are software packages and grid generation techniques required to apply CFD numerical solutions. (Contains 250 citations and includes a subject term index and title list.)

  18. Very large scale integration: computer aided layout and design. 1979-March, 1983 (Citations from the International Information Service for the Physics and Engineering Communities Data Base)

    SciTech Connect

    Not Available

    1983-03-01

    This bibliography contains citations concerning computer aided layout and design of Very Large Scale Integration (VLSI) circuits. Included are algorithms and languages for layout and design procedures. The computer aids provide for architectural design, for entering and verifying or editing the VLSI circuits and for topology developed for the architecture. Some citations reference color graphics to assist in differentiating among layers, and others discuss means of dealing with the architectural complexity, or techniques to minimize the complexity. (Contains 257 citations fully indexed and including a title list.)

  19. Remote sensing and geographically based information systems

    NASA Technical Reports Server (NTRS)

    Cicone, R. C.

    1977-01-01

    A structure is proposed for a geographically-oriented computer-based information system applicable to the analysis of remote sensing digital data. The structure, intended to answer a wide variety of user needs, would permit multiple views of the data, provide independent management of data security, quality and integrity, and rely on automatic data filing. Problems in geographically-oriented data systems, including those related to line encoding and cell encoding, are considered.

  20. Information security: where computer science, economics and psychology meet.

    PubMed

    Anderson, Ross; Moore, Tyler

    2009-07-13

    Until ca. 2000, information security was seen as a technological discipline, based on computer science but with mathematics helping in the design of ciphers and protocols. That perspective started to change as researchers and practitioners realized the importance of economics. As distributed systems are increasingly composed of machines that belong to principals with divergent interests, incentives are becoming as important to dependability as technical design. A thriving new field of information security economics provides valuable insights not just into 'security' topics such as privacy, bugs, spam and phishing, but into more general areas of system dependability and policy. This research programme has recently started to interact with psychology. One thread is in response to phishing, the most rapidly growing form of online crime, in which fraudsters trick people into giving their credentials to bogus websites; a second is through the increasing importance of security usability; and a third comes through the psychology-and-economics tradition. The promise of this multidisciplinary research programme is a novel framework for analysing information security problems, one that is both principled and effective.

  1. The Impact of Instructional Elements in Computer-Based Instruction

    ERIC Educational Resources Information Center

    Martin, Florence; Klein, James D.; Sullivan, Howard

    2007-01-01

    This study investigated the effects of several elements of instruction (objectives, information, practice, examples and review) when they were combined in a systematic manner. College students enrolled in a computer literacy course used one of six different versions of a computer-based lesson delivered on the web to learn about input, processing,…

  2. Computer Self-Efficacy among Health Information Students

    ERIC Educational Resources Information Center

    Hendrix, Dorothy Marie

    2011-01-01

    Roles and functions of health information professionals are evolving due to the mandated electronic health record adoption process for healthcare facilities. A knowledgeable workforce with computer information technology skill sets is required for the successful collection of quality patient-care data, improvement of productivity, and…

  4. Computer-assisted information graphics from the graphic design perspective

    SciTech Connect

    Marcus, A.

    1983-11-01

    Computer-assisted information graphics can benefit by adopting some of the working processes, principles, and areas of concern typical of information-oriented graphic designers. A review of some basic design considerations is followed by a discussion of the creation and design of a prototype nonverbal narrative which combines symbols, charts, maps, and diagrams.

  5. Continued research on computer-based testing.

    PubMed Central

    Clyman, S. G.; Julian, E. R.; Orr, N. A.; Dillon, G. F.; Cotton, K. E.

    1991-01-01

    The National Board of Medical Examiners has developed computer-based examination formats for use in evaluating physicians in training. This paper describes continued research on these formats including attitudes about computers and effects of factors not related to the trait being measured; differences between paper-administered and computer-administered multiple-choice questions; and the characteristics of simulation formats. The implications for computer-based testing and further research are discussed. PMID:1807703

  6. Splash, pop, sizzle: Information processing with phononic computing

    SciTech Connect

    Sklan, Sophia R.

    2015-05-15

    Phonons, the quanta of mechanical vibration, are important to the transport of heat and sound in solid materials. Recent advances in the fundamental control of phonons (phononics) have brought into prominence the potential role of phonons in information processing. In this review, the many directions of realizing phononic computing and information processing are examined. Given the relative similarity of vibrational transport at different length scales, the related fields of acoustic, phononic, and thermal information processing are all included, as are quantum and classical computer implementations. Connections are made between the fundamental questions in phonon transport and phononic control and the device level approach to diodes, transistors, memory, and logic.

  7. Information theory based approaches to cellular signaling.

    PubMed

    Waltermann, Christian; Klipp, Edda

    2011-10-01

    Cells interact with their environment and they have to react adequately to internal and external changes such as changes in nutrient composition, physical properties like temperature or osmolarity, and other stresses. More specifically, they must be able to evaluate whether the external change is significant or just in the range of noise. Based on multiple external parameters they have to compute an optimal response. Cellular signaling pathways are considered the major means of information perception and transmission in cells. Here, we review different attempts to quantify information processing on the level of individual cells. We refer to Shannon entropy, mutual information, and informal measures of signaling pathway cross-talk and specificity. Information theory in systems biology has been successfully applied to identification of optimal pathway structures, mutual information and entropy as system response in sensitivity analysis, and quantification of input and output information. While the study of information transmission within the framework of information theory in technical systems is an advanced field with high impact in engineering and telecommunication, its application to biological objects and processes is still restricted to specific fields such as neuroscience and structural and molecular biology. However, in systems biology, which deals with a holistic understanding of biochemical systems and cellular signaling, a number of examples of the application of information theory have emerged only recently. This article is part of a Special Issue entitled Systems Biology of Microorganisms. Copyright © 2011 Elsevier B.V. All rights reserved.
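
    As a small illustration of one measure reviewed above, the sketch below estimates the mutual information transmitted through a noisy, Hill-type dose-response step; the model, noise level, sample size, and bin counts are assumptions made only for this example.

      import numpy as np

      def mutual_information(x, y, bins=10):
          """Plug-in estimate of I(X;Y) in bits from paired samples."""
          joint, _, _ = np.histogram2d(x, y, bins=bins)
          p_xy = joint / joint.sum()
          p_x = p_xy.sum(axis=1, keepdims=True)
          p_y = p_xy.sum(axis=0, keepdims=True)
          nz = p_xy > 0
          return float(np.sum(p_xy[nz] * np.log2(p_xy[nz] / (p_x @ p_y)[nz])))

      rng = np.random.default_rng(1)
      n = 20000
      signal = rng.uniform(0.0, 10.0, n)                  # input, e.g. ligand dose
      hill = signal ** 2 / (signal ** 2 + 4.0 ** 2)       # Hill-type dose-response
      response = hill + 0.1 * rng.normal(size=n)          # biochemical noise

      print("I(signal; response):", round(mutual_information(signal, response), 2), "bits")
      print("shuffled control   :", round(mutual_information(signal, rng.permutation(response)), 2), "bits")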

  8. Computer-Based Linguistic Analysis.

    ERIC Educational Resources Information Center

    Wright, James R.

    Noam Chomsky's transformational-generative grammar model may effectively be translated into an equivalent computer model. Phrase-structure rules and transformations are tested as to their validity and ordering by the computer via the process of random lexical substitution. Errors appearing in the grammar are detected and rectified, and formal…

  9. Recommendations for responsible monitoring and regulation of clinical software systems. American Medical Informatics Association, Computer-based Patient Record Institute, Medical Library Association, Association of Academic Health Science Libraries, American Health Information Management Association, American Nurses Association.

    PubMed

    Miller, R A; Gardner, R M

    1997-01-01

    In mid-1996, the FDA called for discussions on regulation of clinical software programs as medical devices. In response, a consortium of organizations dedicated to improving health care through information technology has developed recommendations for the responsible regulation and monitoring of clinical software systems by users, vendors, and regulatory agencies. Organizations assisting in development of recommendations, or endorsing the consortium position include the American Medical Informatics Association, the Computer-based Patient Record Institute, the Medical Library Association, the Association of Academic Health Sciences Libraries, the American Health Information Management Association, the American Nurses Association, the Center for Healthcare Information Management, and the American College of Physicians. The consortium proposes four categories of clinical system risks and four classes of measured monitoring and regulatory actions that can be applied strategically based on the level of risk in a given setting. The consortium recommends local oversight of clinical software systems, and adoption by healthcare information system developers of a code of good business practices. Budgetary and other constraints limit the type and number of systems that the FDA can regulate effectively. FDA regulation should exempt most clinical software systems and focus on those systems posing highest clinical risk, with limited opportunities for competent human intervention.

  10. Embrittlement Database from the Radiation Safety Information Computational Center

    DOE Data Explorer

    The Embrittlement Data Base (EDB) is a comprehensive collection of data from surveillance capsules of U.S. commercial nuclear power reactors and from experiments in material test reactors. The collected data are contained in either the Power Reactor Embrittlement Data Base (PR-EDB) or the Test Reactor Embrittlement Data Base (TR-EDB). The EDB work includes verification of the quality of the EDB, provision for user-friendly software to access and process the data, exploration and/or confirmation of embrittlement prediction models, provision for rapid investigation of regulatory issues, and provision for the technical bases for voluntary consensus standards or regulatory guides. The EDB is designed for use with a personal computer. The data are collected into "raw data files." Traceability of all data is maintained by including complete references along with the page numbers. External data verification of the PR-EDB is the responsibility of the vendors, who were responsible for the insertion and testing of the materials in the surveillance capsules. Internal verification is accomplished by checking against references and checking for inconsistencies. Examples of information contained in the EDBs are: Charpy data, tensile data, reactor type, irradiation environments, fracture toughness data, instrumented Charpy data, pressure-temperature (P-T) data, chemistry data, and material history. The TR-EDB additionally has annealing Charpy data. The current version of the PR-EDB contains the test results from 269 Charpy capsules irradiated in 101 reactors. These results include 320 plate data points, 123 forging data points, 113 standard reference materials (SRMS) or correlation monitor (CM) points, 244 weld material data points, and 220 heat-affected-zone (HAZ) material data points. Similarly, the TR-EDB contains information for 290 SRM or CM points, 342 plate data points, 165 forging data points, 378 welds, and 55 HAZ materials. [copied from http://rsicc.ornl.gov/RelatedLinks.aspx?t=edb

  11. Computer-Based Learning in Chemistry Classes

    ERIC Educational Resources Information Center

    Pietzner, Verena

    2014-01-01

    Currently not many people would doubt that computers play an essential role in both public and private life in many countries. However, somewhat surprisingly, evidence of computer use is difficult to find in German state schools although other countries have managed to implement computer-based teaching and learning in their schools. This paper…

  13. Small Computer Applications for Base Supply.

    DTIC Science & Technology

    1984-03-01

    Front-matter and table-of-contents excerpts: acknowledgments (special recognition goes to the members of the International Logistics Section; third, to the members of the Supply community, for their...); contents (Air Force Recognition of Small Computers; Major Command Applications; Base Level Applications; Problem Statement); opening text: ...computer use in the Air Force? The Air Force has recently realized the potential offered by these small

  14. Approaches to Classroom-Based Computational Science.

    ERIC Educational Resources Information Center

    Guzdial, Mark

    Computational science includes the use of computer-based modeling and simulation to define and test theories about scientific phenomena. The challenge for educators is to develop techniques for implementing computational science in the classroom. This paper reviews some previous work on the use of simulation alone (without modeling), modeling…

  15. Using the Computer as a Specific Information Resource in Computer-Aided Language Learning Programs.

    ERIC Educational Resources Information Center

    Witton, Niclas

    1984-01-01

    Classifies several kinds of readily available foreign language computer programs. Most of the programs fall into either game/activity or testing categories. In all the programs, feedback to the student who has made an error is limited. Sees the computer as a specific information resource in self-directed learning programs. (SED)

  16. Computer programs: Information retrieval and data analysis, a compilation

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The items presented in this compilation are divided into two sections. Section one treats of computer usage devoted to the retrieval of information that affords the user rapid entry into voluminous collections of data on a selective basis. Section two is a more generalized collection of computer options for the user who needs to take such data and reduce it to an analytical study within a specific discipline. These programs, routines, and subroutines should prove useful to users who do not have access to more sophisticated and expensive computer software.

  17. A Computer Data Base for Clinicians, Managers and Researchers

    PubMed Central

    Gottfredson, Douglas K.

    1981-01-01

    Since 1972 the Salt Lake VA Medical Center has designed, developed and upgraded a computer system to improve the quality of health care for veterans. The computer system has greatly increased the ease and accuracy with which information is gathered, stored, retrieved and analysed. Though it has not been possible to anticipate every question which might be asked, we have attempted to recognize the special interests of various groups and individuals and to tailor the computer data base to meet their needs. The SL VAMC computer system facilitates meeting accountability requirements established by different agencies to assure quality of care. Computer techniques provide clinicians with information for assessment, planning, providing treatment, following progress and establishing discharge and after-care plans. Managers are provided information vital for decisions and to complete required reports. Researchers can readily study the effectiveness of assessment, diagnoses and treatment and recommend program improvements.

  18. Customizable Computer-Based Interaction Analysis for Coaching and Self-Regulation in Synchronous CSCL Systems

    ERIC Educational Resources Information Center

    Lonchamp, Jacques

    2010-01-01

    Computer-based interaction analysis (IA) is an automatic process that aims at understanding a computer-mediated activity. In a CSCL system, computer-based IA can provide information directly to learners for self-assessment and regulation and to tutors for coaching support. This article proposes a customizable computer-based IA approach for a…

  19. 77 FR 24538 - Advisory Committee for Computer and Information Science And Engineering; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-24

    ... FOUNDATION Advisory Committee for Computer and Information Science And Engineering; Notice of Meeting In... Foundation announces the following meeting: Name: Advisory Committee for Computer and Information Science and...: Carmen Whitson, Directorate for Computer and Information Science and Engineering, National Science...

  20. 76 FR 20051 - Advisory Committee for Computer and Information; Science and Engineering; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-11

    ... Advisory Committee for Computer and Information; Science and Engineering; Notice of Meeting In accordance... announces the following meeting: Name: Advisory Committee for Computer and Information Science and..., Directorate for Computer and Information, Science and Engineering, National Science Foundation, 4201 Wilson...

  1. Updating data bases with incomplete information

    SciTech Connect

    Winslett, M.S.

    1987-01-01

    Suppose one wishes to construct, use, and maintain a data base of facts about the real world, even though the state of that world is only partially known. In the AI domain, this problem arises when an agent has a base set of beliefs that reflect partial knowledge about the world, and then tries to incorporate new, possibly contradictory knowledge into this set of beliefs. In the data-base domain, one facet of this situation is the well-known null values problem. The author chooses to represent such a data base as a logical theory, and views the models of the theory as representing possible states of the world that are consistent with all known information. How can new information be incorporated into the database? The research produced a formal method of specifying the desired change intensionally, by stating a well-formed formula that the state of the world is now known to satisfy. The data base update algorithms provided automatically accomplish that change. The approach embeds the incomplete data base and the incoming information in the language of mathematical logic, and gives formal definitions of the semantics of the update operators, along with proofs of correctness for their associated algorithms. The author assesses the computational complexity of the algorithms, and proposes a means of lazy evaluation to avoid undesirable expense during execution of updates. He also examines means of enforcing integrity constraints as the data base is updated.

  2. Computer-Based Job Aiding: Problem Solving at Work.

    DTIC Science & Technology

    1984-01-01

    Keywords: technical literacy, problem solving, computer-based job aiding, computer-based instruction. ...discourse processes, although those notions are operationalized in a new way. Information Search in Technical Literacy as Problem Solving: The dimensions of... computer-assisted technical literacy, information-seeking strategies employed during an assembly task were analyzed in terms of overall group frequencies

  3. The Impact of Computers on the Retrieval and Utilization of Chemical Information.

    ERIC Educational Resources Information Center

    Lawrence, Barbara

    The use of computers in retrieving bibliographic chemical information is traced through the SDI, batch, and online modes, and related changes are noted in such areas as data base availability, cost, software, and amount of user control. The impact of these changes on both the quality and quantity of chemical information use is discussed, as well…

  5. "Glitch Logic" and Applications to Computing and Information Security

    NASA Technical Reports Server (NTRS)

    Stoica, Adrian; Katkoori, Srinivas

    2009-01-01

    This paper introduces a new method of information processing in digital systems, and discusses its potential benefits to computing and information security. The new method exploits glitches caused by delays in logic circuits for carrying and processing information. Glitch processing is hidden from conventional logic analyses and undetectable by traditional reverse engineering techniques. It enables the creation of new logic design methods that allow for an additional controllable "glitch logic" processing layer embedded into conventional synchronous digital circuits as a hidden/covert information flow channel. The combination of synchronous logic with specific glitch logic design acting as an additional computing channel reduces the number of equivalent logic designs resulting from synthesis, thus implicitly reducing the possibility of modification and/or tampering with the design. The hidden information channel produced by the glitch logic can be used: 1) for covert computing/communication, 2) to prevent reverse engineering, tampering, and alteration of design, and 3) to act as a channel for information infiltration/exfiltration and propagation of viruses/spyware/Trojan horses.

  6. "Glitch Logic" and Applications to Computing and Information Security

    NASA Technical Reports Server (NTRS)

    Stoica, Adrian; Katkoori, Srinivas

    2009-01-01

    This paper introduces a new method of information processing in digital systems, and discusses its potential benefits to computing and information security. The new method exploits glitches caused by delays in logic circuits for carrying and processing information. Glitch processing is hidden to conventional logic analyses and undetectable by traditional reverse engineering techniques. It enables the creation of new logic design methods that allow for an additional controllable "glitch logic" processing layer embedded into a conventional synchronous digital circuits as a hidden/covert information flow channel. The combination of synchronous logic with specific glitch logic design acting as an additional computing channel reduces the number of equivalent logic designs resulting from synthesis, thus implicitly reducing the possibility of modification and/or tampering with the design. The hidden information channel produced by the glitch logic can be used: 1) for covert computing/communication, 2) to prevent reverse engineering, tampering, and alteration of design, and 3) to act as a channel for information infiltration/exfiltration and propagation of viruses/spyware/Trojan horses.

  7. Information-seeking behavior and computer literacy among resident doctors in Maiduguri, Nigeria.

    PubMed

    Abbas, A D; Abubakar, A M; Omeiza, B; Minoza, K

    2013-01-01

    Resident doctors are key actors in patient management in all the federal training institutions in Nigeria. Knowing the information-seeking behavior of this group of doctors and their level of computer knowledge would facilitate informed decisions in providing them with the relevant sources of information as well as encouraging the practice of evidence-based medicine. The aim of this study was to examine information-seeking behavior among resident doctors and analyze its relationship to computer ownership and literacy. A pretested self-administered questionnaire was used to obtain information from the resident doctors in the University of Maiduguri Teaching Hospital (UMTH) and the Federal Neuro-Psychiatry Hospital (FNPH). The data fields requested included the biodata, major source of medical information, level of computer literacy, and computer ownership. Other questions covered their familiarity with basic computer operations as well as versatility in the use of the Internet and possession of an active e-mail address. Out of 109 questionnaires distributed, 100 were returned (91.7% response rate). Seventy-three of the 100 respondents use printed material as their major source of medical information. Ninety-three of the respondents own a laptop, a desktop, or both, while 7 have no computers. Ninety-four respondents are computer literate while 6 are computer illiterate. Seventy-five respondents have an e-mail address while 25 do not. Seventy-five search the Internet for information while 25 do not know how to use the Internet. Despite the high computer ownership and literacy rate among resident doctors, printed material remains their main source of medical information.

  8. Computing, information, and communications: Technologies for the 21. Century

    SciTech Connect

    1998-11-01

    To meet the challenges of a radically new and technologically demanding century, the Federal Computing, Information, and Communications (CIC) programs are investing in long-term research and development (R and D) to advance computing, information, and communications in the United States. CIC R and D programs help Federal departments and agencies to fulfill their evolving missions, assure the long-term national security, better understand and manage the physical environment, improve health care, help improve the teaching of children, provide tools for lifelong training and distance learning to the workforce, and sustain critical US economic competitiveness. One of the nine committees of the National Science and Technology Council (NSTC), the Committee on Computing, Information, and Communications (CCIC)--through its CIC R and D Subcommittee--coordinates R and D programs conducted by twelve Federal departments and agencies in cooperation with US academia and industry. These R and D programs are organized into five Program Component Areas: (1) HECC--High End Computing and Computation; (2) LSN--Large Scale Networking, including the Next Generation Internet Initiative; (3) HCS--High Confidence Systems; (4) HuCS--Human Centered Systems; and (5) ETHR--Education, Training, and Human Resources. A brief synopsis of FY 1997 accomplishments and FY 1998 goals by PCA is presented. This report, which supplements the President's Fiscal Year 1998 Budget, describes the interagency CIC programs.

  9. Computational spectrotemporal auditory model with applications to acoustical information processing

    NASA Astrophysics Data System (ADS)

    Chi, Tai-Shih

    A computational spectrotemporal auditory model based on neurophysiological findings in early auditory and cortical stages is described. The model provides a unified multiresolution representation of the spectral and temporal features of sound likely critical in the perception of timbre. Several types of complex stimuli are used to demonstrate the spectrotemporal information preserved by the model. As shown by these examples, this two-stage model reflects the apparent progressive loss of temporal dynamics along the auditory pathway from the rapid phase-locking (several kHz in the auditory nerve), to moderate rates of synchrony (several hundred Hz in the midbrain), to much lower rates of modulation in the cortex (around 30 Hz). To complete this model, several projection-based reconstruction algorithms are implemented to resynthesize the sound from the representations with reduced dynamics. One particular application of this model is to assess speech intelligibility. The spectro-temporal modulation transfer functions (MTFs) of this model are investigated and shown to be consistent with the salient trends in the human MTFs (derived from human detection thresholds), which exhibit a lowpass function with respect to both spectral and temporal dimensions, with 50% bandwidths of about 16 Hz and 2 cycles/octave. Therefore, the model is used to demonstrate the potential relevance of these MTFs to the assessment of speech intelligibility in noise and reverberant conditions. Another useful feature is the phase singularity that emerges in the scale space generated by this multiscale auditory model. The singularity is shown to have certain robust properties and to carry the crucial information about the spectral profile. This claim is justified by perceptually tolerable resynthesized sounds from the nonconvex singularity set. In addition, the singularity set is demonstrated to encode the pitch and formants at different scales. These properties make the singularity set very suitable for traditional

  10. A DDC Bibliography on Computers in Information Sciences. Volume I. Information Sciences Series.

    ERIC Educational Resources Information Center

    Defense Documentation Center, Alexandria, VA.

    The unclassified and unlimited bibliography compiles references dealing specifically with the role of computers in information sciences. The volume contains 249 annotated references grouped under two major headings: Time Shared, On-Line, and Real Time Systems, and Computer Components. The references are arranged in accession number (AD-number)…

  12. Bioinformatics process management: information flow via a computational journal

    PubMed Central

    Feagan, Lance; Rohrer, Justin; Garrett, Alexander; Amthauer, Heather; Komp, Ed; Johnson, David; Hock, Adam; Clark, Terry; Lushington, Gerald; Minden, Gary; Frost, Victor

    2007-01-01

    This paper presents the Bioinformatics Computational Journal (BCJ), a framework for conducting and managing computational experiments in bioinformatics and computational biology. These experiments often involve a series of computations, data searches, filters, and annotations that can benefit from a structured environment. Systems to manage computational experiments exist, ranging from libraries with standard data models to elaborate schemes to chain together input and output between applications. Yet, although such frameworks are available, their use is not widespread; ad hoc scripts are often required to bind applications together. The BCJ explores another solution to this problem through a computer-based environment suitable for on-site use, which builds on the traditional laboratory notebook paradigm. It provides an intuitive, extensible paradigm designed for expressive composition of applications. Extensive features facilitate sharing data, computational methods, and entire experiments. By focusing on the bioinformatics and computational biology domain, the scope of the computational framework was narrowed, permitting us to implement a capable set of features for this domain. This report discusses the features determined to be critical by our system and other projects, along with design issues. We illustrate the use of our implementation of the BCJ on two domain-specific examples. PMID:18053179

  13. Fully computed holographic stereogram based algorithm for computer-generated holograms with accurate depth cues.

    PubMed

    Zhang, Hao; Zhao, Yan; Cao, Liangcai; Jin, Guofan

    2015-02-23

    We propose an algorithm based on a fully computed holographic stereogram for calculating full-parallax computer-generated holograms (CGHs) with accurate depth cues. The proposed method integrates a point-source algorithm and a holographic-stereogram-based algorithm to reconstruct three-dimensional (3D) scenes. Precise accommodation cues and occlusion effects can be created, and computer graphics rendering techniques can be employed in the CGH generation to enhance the image fidelity. Optical experiments have been performed using a spatial light modulator (SLM) and a fabricated high-resolution hologram; the results show that the proposed algorithm can produce high-quality reconstructions of 3D scenes with arbitrary depth information.
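
    The point-source component that the hybrid algorithm builds on can be sketched as a simple accumulation of spherical wavelets on the hologram plane. This is a generic textbook formulation, not the authors' method; the pixel pitch, resolution, and wavelength defaults are illustrative assumptions.

```python
import numpy as np

def point_source_hologram(points, amplitudes, pitch=8e-6, res=(512, 512),
                          wavelength=532e-9):
    """Accumulate the complex field of 3-D point sources on the hologram plane.

    points: (N, 3) array of (x, y, z) coordinates in metres, z > 0.
    amplitudes: (N,) array of point amplitudes.
    """
    k = 2 * np.pi / wavelength
    ny, nx = res
    x = (np.arange(nx) - nx / 2) * pitch
    y = (np.arange(ny) - ny / 2) * pitch
    X, Y = np.meshgrid(x, y)
    field = np.zeros(res, dtype=complex)
    for (px, py, pz), a in zip(points, amplitudes):
        r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)
        field += a * np.exp(1j * k * r) / r   # spherical wavelet from each point
    return field
```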

  14. Computer-Based Vocational Guidance Systems.

    ERIC Educational Resources Information Center

    Gallagher, James J., Ed.

    Job and educational opportunities are increasing dramatically. School guidance counselors can no longer cope adequately with the available information, nor can they help all students to see the complex range of alternative life styles. Computers are now programed to give students information. They can also provide for students' decision-making…

  15. Curriculum Development of Computer and Information Literacy in the Netherlands.

    ERIC Educational Resources Information Center

    Hartsuijker, Ard P.

    1986-01-01

    Describes a national project on computer and information literacy in the Netherlands for all students in lower secondary education, i.e., the 12- to 16-year-old age group. Curriculum development is discussed with emphasis on equal participation by female students and teachers, and hardware and software specifications are noted. (LRW)

  16. The Use of Blackboard in Computer Information Systems Courses.

    ERIC Educational Resources Information Center

    Figueroa, Sandy; Huie, Carol

    This paper focuses on the rationale for incorporating Blackboard, a Web-authoring software package, as the knowledge construction tool in computer information system courses. The authors illustrate previous strategies they incorporated in their classes, and they present their uses of Blackboard. They point out their reactions as teachers, and the…

  17. Comprehension: Mapping Information, a Program for the Apple Computer.

    ERIC Educational Resources Information Center

    Rauch, Margaret; Wogen, David

    A computer program written for the Apple II was designed to teach college students how to map expository information. The program consists of five lessons, the first beginning with a definition and illustration of mapping. In this introductory lesson, students are asked to answer three basic questions when reading expository text: identify the…

  18. Large-scale temporal analysis of computer and information science

    NASA Astrophysics Data System (ADS)

    Soos, Sandor; Kampis, George; Gulyás, László

    2013-09-01

    The main aim of the project reported in this paper was twofold. One of the primary goals was to produce an extensive source of network data for bibliometric analyses of field dynamics in the case of Computer and Information Science. To this end, we rendered the raw material of the DBLP computer and infoscience bibliography into a comprehensive collection of dynamic network data, promptly available for further statistical analysis. The other goal was to demonstrate the value of our data source via its use in mapping Computer and Information Science (CIS). An analysis of the evolution of CIS was performed in terms of collaboration (co-authorship) network dynamics. Dynamic network analysis covered more than three quarters of a century (76 years, from 1936 to the present). Network evolution was described both at the macro and meso levels (in terms of community characteristics). Results show that the development of CIS followed what appears to be a universal pattern of growing into a "mature" discipline.

  19. Computer-based anesthesiology paging system.

    PubMed

    Abenstein, John P; Allan, Jonathan A; Ferguson, Jennifer A; Deick, Steven D; Rose, Steven H; Narr, Bradly J

    2003-07-01

    For more than a century, Mayo Clinic has used various communication strategies to optimize the efficiency of physicians. Anesthesiology has used colored wooden tabs, colored lights, and, most recently, a distributed video paging system (VPS) that was near the end of its useful life. A computer-based anesthesiology paging system (CAPS) was developed to replace the VPS. The CAPS uses a hands-off paradigm with ubiquitous displays to inform the practice where personnel are needed. The system consists of a dedicated Ethernet network connecting redundant central servers, terminal servers, programmable keypads, and light-emitting diode displays. Commercially available hardware and software tools minimized development and maintenance costs. The CAPS was installed in >200 anesthetizing and support locations. Downtime for the CAPS averaged 0.144 min/day, as compared with 24.2 min/day for the VPS. During installation, neither system was available and the department used beepers for communications. The median response time of an anesthesiologist to a page was 2.78 min with a beeper and 1.57 min with the CAPS; this difference was statistically significant (P = 0.021, t(67) = 2.36). We conclude that the CAPS is a reliable and efficient paging system that may contribute to the efficiency of the practice. Mayo Clinic installed a computer-based anesthesiology paging system (CAPS) to inform operating suite personnel when assistance is needed in procedure and recovery areas. The CAPS is more reliable than the system it replaced. Anesthesiologists arrive at a patient's bedside faster when they are paged with the CAPS than with a beeper.

  20. How social and non-social information influence classification decisions: A computational modelling approach.

    PubMed

    Puskaric, Marin; von Helversen, Bettina; Rieskamp, Jörg

    2017-08-01

    Social information such as observing others can improve performance in decision making. In particular, social information has been shown to be useful when finding the best solution on one's own is difficult, costly, or dangerous. However, past research suggests that when making decisions people do not always consider other people's behaviour when it is at odds with their own experiences. Furthermore, the cognitive processes guiding the integration of social information with individual experiences are still under debate. Here, we conducted two experiments to test whether information about other persons' behaviour influenced people's decisions in a classification task. Furthermore, we examined how social information is integrated with individual learning experiences by testing different computational models. Our results show that social information had a small but reliable influence on people's classifications. The best computational model suggests that in categorization people first make up their own mind based on the non-social information, which is then updated by the social information.
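
    A toy version of the two-stage account suggested by the winning model (an individual judgement formed from non-social information, then updated by social information) might look like the following; the logistic rule, the convex-combination update, and all parameter names are illustrative assumptions rather than the authors' fitted model.

```python
import numpy as np

def classify(non_social_cues, cue_weights, social_vote_share, social_weight=0.2):
    """Two-stage sketch: an individual judgement is formed first from the
    non-social cues, then nudged toward the observed behaviour of others.

    non_social_cues: array of cue values for the item being classified.
    cue_weights: learned weights for those cues (assumed given here).
    social_vote_share: fraction of observed others choosing category A.
    """
    # Stage 1: individual evidence -> probability of category A (logistic rule).
    evidence = float(np.dot(non_social_cues, cue_weights))
    p_individual = 1.0 / (1.0 + np.exp(-evidence))
    # Stage 2: update with social information (simple convex combination).
    p_final = (1.0 - social_weight) * p_individual + social_weight * social_vote_share
    return "A" if p_final > 0.5 else "B", p_final
```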

  1. Description of the computer-based patient record and computer-based patient record system. CPRI Work Group on CPR Description.

    PubMed

    1996-01-01

    Computer-based patient records and computer-based patient record systems support health care effectiveness and efficiency with appropriate safeguards for confidentiality. Achieving a health information infrastructure with computer-based patient records supported by fully integrated computer-based patient record systems is obviously a process of incremental steps. However, CPRI believes significant benefits in health care delivery are certain to be realized over the full course of this process.

  2. Evolving the Land Information System into a Cloud Computing Service

    SciTech Connect

    Houser, Paul R.

    2015-02-17

    The Land Information System (LIS) was developed to use advanced flexible land surface modeling and data assimilation frameworks to integrate extremely large satellite- and ground-based observations with advanced land surface models to produce continuous high-resolution fields of land surface states and fluxes. The resulting fields are extremely useful for drought and flood assessment, agricultural planning, disaster management, weather and climate forecasting, water resources assessment, and the like. We envisioned transforming the LIS modeling system into a scientific cloud computing-aware web and data service that would allow clients to easily set up and configure it for use in addressing large water management issues. The focus of this Phase 1 project was to determine the scientific, technical, and commercial merit and feasibility of the proposed LIS-cloud innovations that address current barriers to broad LIS applicability. We (a) quantified the barriers to broad LIS utility and commercialization (high performance computing, big data, user interface, and licensing issues); (b) designed the proposed LIS-cloud web service, model-data interface, database services, and user interfaces; (c) constructed a prototype LIS user interface including abstractions for simulation control, visualization, and data interaction; (d) used the prototype to conduct a market analysis and survey to determine potential market size and competition, (e) identified LIS software licensing and copyright limitations and developed solutions, and (f) developed a business plan for development and marketing of the LIS-cloud innovation. While some significant feasibility issues were found in the LIS licensing, overall a high degree of LIS-cloud technical feasibility was found.

  3. Effective information spreading based on local information in correlated networks

    NASA Astrophysics Data System (ADS)

    Gao, Lei; Wang, Wei; Pan, Liming; Tang, Ming; Zhang, Hai-Feng

    2016-12-01

    Using network-based information to facilitate information spreading is an essential task for spreading dynamics in complex networks. Focusing on degree-correlated networks, we propose a preferential contact strategy based on the local network structure and local informed density to promote the information spreading. During the spreading process, an informed node will preferentially select a contact target among its neighbors, based on their degrees or local informed densities. By extensively implementing numerical simulations in synthetic and empirical networks, we find that when only the local structure information is considered, the convergence time of information spreading will be remarkably reduced if low-degree neighbors are favored as contact targets. Meanwhile, the minimum convergence time depends non-monotonically on degree-degree correlation, and a moderate correlation coefficient results in the most efficient information spreading. Incorporating the local informed density information into the contact strategy, the convergence time of information spreading can be further reduced, and is minimized by a moderately preferential selection.
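
    A minimal sketch of the preferential contact step is given below. The exact weighting used in the paper is not stated in the abstract, so the power-law form and its exponents are assumptions; negative exponents favour low-degree and weakly informed neighbours, the regime reported as fastest.

```python
import random

def choose_contact(neighbors, degree, informed_density, alpha=-1.0, beta=-1.0):
    """Pick a contact target among an informed node's neighbors.

    Preference weight ~ degree**alpha * (eps + informed_density)**beta, so
    negative exponents favour low-degree / weakly informed neighbors.
    """
    eps = 1e-6  # avoid zero weights for fully uninformed neighborhoods
    weights = [(degree[v] ** alpha) * ((eps + informed_density[v]) ** beta)
               for v in neighbors]
    return random.choices(neighbors, weights=weights, k=1)[0]
```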

  4. Issues in Text Design and Layout for Computer Based Communications.

    ERIC Educational Resources Information Center

    Andresen, Lee W.

    1991-01-01

    Discussion of computer-based communications (CBC) focuses on issues involved with screen design and layout for electronic text, based on experiences with electronic messaging, conferencing, and publishing within the Australian Open Learning Information Network (AOLIN). Recommendations for research on design and layout for printed text are also…

  5. Use of computers and the Internet for health information by patients with epilepsy.

    PubMed

    Escoffery, Cam; Diiorio, Colleen; Yeager, Katherine A; McCarty, Frances; Robinson, Elise; Reisinger, Elizabeth; Henry, Thomas; Koganti, Archana

    2008-01-01

    The purpose of this study was to describe computer and Internet use among an online group and a clinic-based group of people with epilepsy. Greater than 95% of the online group and 60% of the clinic group have access to computers and the Internet. More than 99% of the online group and 57% of the clinic group used the Internet to find health information. A majority of people reported being likely to employ an Internet-based self-management program to control their epilepsy. About 43% reported searching for general information on epilepsy, 30% for medication, 23% for specific types of epilepsy, and 20% for treatment. This study found that people with epilepsy have access to computers and the Internet, desire epilepsy-specific information, and are receptive to online health information on how to manage their epilepsy.

  6. Computer network access to scientific information systems for minority universities

    NASA Astrophysics Data System (ADS)

    Thomas, Valerie L.; Wakim, Nagi T.

    1993-08-01

    The evolution of computer networking technology has led to the establishment of a massive networking infrastructure which interconnects various types of computing resources at many government, academic, and corporate institutions. A large segment of this infrastructure has been developed to facilitate information exchange and resource sharing within the scientific community. The National Aeronautics and Space Administration (NASA) supports both the development and the application of computer networks which provide its community with access to many valuable multi-disciplinary scientific information systems and on-line databases. Recognizing the need to extend the benefits of this advanced networking technology to the under-represented community, the National Space Science Data Center (NSSDC) in the Space Data and Computing Division at the Goddard Space Flight Center has developed the Minority University-Space Interdisciplinary Network (MU-SPIN) Program: a major networking and education initiative for Historically Black Colleges and Universities (HBCUs) and Minority Universities (MUs). In this paper, we will briefly explain the various components of the MU-SPIN Program while highlighting how, by providing access to scientific information systems and on-line data, it promotes a higher level of collaboration among faculty and students and NASA scientists.

  7. A bibliometric study on chemical information and computer sciences focusing on literature of JCICS.

    PubMed

    Onodera, N

    2001-01-01

    A bibliometric approach was used to survey the state-of-the-art of research in the field of chemical information and computer sciences (CICS). By examining the CA database for the articles abstracted under the subsection "Chemical information, documentation, and data processing", Journal of Chemical Information and Computer Sciences (JCICS) was identified to have been the top journal in this subsection for the last 30 years. Based on this result, CA subsections and controlled index terms given to JCICS articles were analyzed to see trends in subjects and topics in the CICS field during the last two decades. These analyses revealed that the subjects of research in CICS have diversified from traditional information science and computer applications to chemistry to "molecular information sciences". The SCISEARCH database was used to grasp interdependency between JCICS and other key journals and also the international nature of JCICS in its publications and citedness.

  8. A novel bit-quad-based Euler number computing algorithm.

    PubMed

    Yao, Bin; He, Lifeng; Kang, Shiying; Chao, Yuyan; Zhao, Xiao

    2015-01-01

    The Euler number of a binary image is an important topological property in computer vision and pattern recognition. This paper proposes a novel bit-quad-based Euler number computing algorithm. Based on graph theory and an analysis of bit-quad patterns, our algorithm only needs to count two bit-quad patterns. Moreover, by using the information obtained while processing the previous bit-quad, the average number of pixels to be checked for processing a bit-quad is only 1.75. Experimental results demonstrated that our method significantly outperforms conventional Euler number computing algorithms.
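
    For context, the classical bit-quad (Gray) formulation that the improved algorithm refines can be sketched as follows; it counts the standard one-pixel, three-pixel, and diagonal 2x2 patterns rather than the paper's reduced two-pattern scheme.

```python
import numpy as np

def euler_number(img, connectivity=8):
    """Classical bit-quad (Gray) Euler number for a binary image.

    Counts 2x2 patterns with one foreground pixel (q1), three foreground
    pixels (q3), and the two diagonal patterns (qd).
    """
    img = np.pad(np.asarray(img, dtype=np.uint8), 1)   # zero border
    a = img[:-1, :-1]; b = img[:-1, 1:]
    c = img[1:, :-1];  d = img[1:, 1:]
    s = a + b + c + d
    q1 = np.sum(s == 1)
    q3 = np.sum(s == 3)
    qd = np.sum(((a == 1) & (d == 1) & (b == 0) & (c == 0)) |
                ((b == 1) & (c == 1) & (a == 0) & (d == 0)))
    if connectivity == 8:
        return (q1 - q3 - 2 * qd) // 4
    return (q1 - q3 + 2 * qd) // 4   # 4-connectivity
```

    For example, a single foreground pixel yields q1 = 4 and an Euler number of 1, while a 3x3 ring with a hole yields q1 = q3 = 4 and an Euler number of 0.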

  9. Mobile healthcare information management utilizing Cloud Computing and Android OS.

    PubMed

    Doukas, Charalampos; Pliakas, Thomas; Maglogiannis, Ilias

    2010-01-01

    Cloud Computing provides functionality for managing information in a distributed, ubiquitous and pervasive manner supporting several platforms, systems and applications. This work presents the implementation of a mobile system that enables electronic healthcare data storage, update and retrieval using Cloud Computing. The mobile application is developed using Google's Android operating system and provides management of patient health records and medical images (supporting DICOM format and JPEG2000 coding). The developed system has been evaluated using Amazon's S3 cloud service. This article summarizes the implementation details and presents initial results of the system in practice.
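
    Cloud-side storage of a record of this kind reduces, at its simplest, to an object upload and download, as in the sketch below; the bucket layout, key naming, and use of boto3 are illustrative assumptions and not the paper's implementation.

```python
import boto3

def store_record(bucket, patient_id, local_path):
    """Minimal sketch of pushing one record (e.g. a DICOM file) to S3.

    The key layout is illustrative, not the system described above."""
    s3 = boto3.client("s3")
    key = f"records/{patient_id}/{local_path.rsplit('/', 1)[-1]}"
    s3.upload_file(local_path, bucket, key)   # store the object
    return key

def fetch_record(bucket, key, local_path):
    """Retrieve a previously stored record to a local file."""
    boto3.client("s3").download_file(bucket, key, local_path)
```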

  10. Privacy protections afforded by computer-based patient record systems.

    PubMed

    Amatayakul, M

    1996-05-01

    Computer-based patient record (CPR) systems can afford greater protection of private health information. Key factors that enhance security of CPR systems include the capability to identify the user, verify authorization, determine legitimacy of use, restrict retrieval to only specific "need-to-know" information, encrypt access mechanisms and content, and track all access. The public demands greater protections for computer systems than for paper-based systems. Coupled with appropriate internal management controls and federal preemptive privacy law, breaches of confidentiality from CPRs would occur almost exclusively through the one means that cannot be safeguarded: human communication.

  11. Computational Vision Based on Neurobiology

    DTIC Science & Technology

    1993-07-09

    on each side of the horopter. The observers’ eye movements were tracked with an SRI dual-Purkinje eye tracker while they watched a dynamic random... eye convergence, can be equated since under most circumstances the subject will try to track and converge on the same object. The two points are not... collaborators’ conditions were a special case (very slow tracking eye movements) and that under more general conditions, visual information alone would

  12. Computer-Assisted Search Of Large Textual Data Bases

    NASA Technical Reports Server (NTRS)

    Driscoll, James R.

    1995-01-01

    "QA" denotes high-speed computer system for searching diverse collections of documents including (but not limited to) technical reference manuals, legal documents, medical documents, news releases, and patents. Incorporates previously available and emerging information-retrieval technology to help user intelligently and rapidly locate information found in large textual data bases. Technology includes provision for inquiries in natural language; statistical ranking of retrieved information; artificial-intelligence implementation of semantics, in which "surface level" knowledge found in text used to improve ranking of retrieved information; and relevance feedback, in which user's judgements of relevance of some retrieved documents used automatically to modify search for further information.

  13. Intelligent Image Based Computer Aided Education (IICAE)

    NASA Astrophysics Data System (ADS)

    David, Amos A.; Thiery, Odile; Crehange, Marion

    1989-03-01

    Artificial Intelligence (AI) has found its way into Computer Aided Education (CAE), and several systems have been constructed to demonstrate its advantages. We believe that images (graphic or real) play an important role in learning. However, using images beyond mere illustration makes it necessary to apply techniques such as AI. We shall develop the application of AI in an image-based CAE and briefly present the system under construction to demonstrate our concept. We shall also elaborate a methodology for constructing such a system. Furthermore, we shall briefly present the pedagogical and psychological activities in a learning process. Under the pedagogical and psychological aspects of learning, we shall develop areas such as the importance of images in learning, both as pedagogical objects and as a means of obtaining psychological information about the learner. We shall develop the learner's model, its use, what to build into it, and how. Under the application of AI in an image-based CAE, we shall develop the importance of AI in exploiting the knowledge base in the learning environment and its application as a means of implementing pedagogical strategies.

  14. Interdisciplinary Study with Computer-Based Multimedia.

    ERIC Educational Resources Information Center

    Couch, John D.; And Others

    Interdisciplinary study with computer-based multimedia in the classroom is reviewed. The multimedia revolution involves multiple technologies and multiple modes of sensation, but the computer is at the heart of this revolution. Despite the many challenges, interest is strong for multimedia courseware. The predicted market is enormous, and nowhere…

  15. Computer-Based Training: An Institutional Approach.

    ERIC Educational Resources Information Center

    Barker, Philip; Manji, Karim

    1992-01-01

    Discussion of issues related to computer-assisted learning (CAL) and computer-based training (CBT) describes approaches to electronic learning; principles underlying courseware development to support these approaches; and a plan for creation of a CAL/CBT development center, including its functional role, campus services, staffing, and equipment…

  17. Response Evaluation in Computer Based Tutorials.

    ERIC Educational Resources Information Center

    Smyth, Timothy

    1987-01-01

    Describes a computer program that allows undergraduate students to undertake nomenclature tutorials in organic chemistry, where user response is evaluated based on a set of rules that determine appropriate feedback. Design of the software is described, including the use of computer graphics, and results of students' evaluations are briefly…

  18. A micro-computer based system to compute magnetic variation

    NASA Technical Reports Server (NTRS)

    Kaul, R.

    1984-01-01

    A mathematical model of magnetic variation in the continental United States (COT48) was implemented in the Ohio University LORAN C receiver. The model is based on a least squares fit of a polynomial function. The implementation on the microprocessor-based LORAN C receiver is made possible by a math chip, the Am9511, which performs 32-bit floating-point mathematical operations. A Peripheral Interface Adapter (M6520) is used to communicate between the 6502-based micro-computer and the 9511 math chip. The implementation provides magnetic variation data to the pilot as a function of latitude and longitude. The model and the real-time implementation in the receiver are described.
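
    The least-squares polynomial fit at the heart of such a model can be sketched as follows; the polynomial degree and function names are illustrative assumptions, not the COT48 model's actual form.

```python
import numpy as np

def fit_variation(lat, lon, variation, degree=3):
    """Least-squares fit of magnetic variation as a polynomial in lat/lon,
    the kind of compact model that fits in a receiver's limited memory.
    Returns the coefficient vector and a predictor function."""
    lat, lon, variation = map(np.asarray, (lat, lon, variation))
    # Design matrix with all terms lat**i * lon**j, i + j <= degree.
    terms = [(i, j) for i in range(degree + 1) for j in range(degree + 1 - i)]
    A = np.column_stack([lat ** i * lon ** j for i, j in terms])
    coeffs, *_ = np.linalg.lstsq(A, variation, rcond=None)
    predict = lambda la, lo: sum(c * la ** i * lo ** j
                                 for c, (i, j) in zip(coeffs, terms))
    return coeffs, predict
```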

  19. All-optical reservoir computer based on saturation of absorption.

    PubMed

    Dejonckheere, Antoine; Duport, François; Smerieri, Anteo; Fang, Li; Oudar, Jean-Louis; Haelterman, Marc; Massar, Serge

    2014-05-05

    Reservoir computing is a new bio-inspired computation paradigm. It exploits a dynamical system driven by a time-dependent input to carry out computation. For efficient information processing, only a few parameters of the reservoir need to be tuned, which makes it a promising framework for hardware implementation. Recently, electronic, opto-electronic and all-optical experimental reservoir computers have been reported. In those implementations, the nonlinear response of the reservoir is provided by active devices such as optoelectronic modulators or optical amplifiers. By contrast, we propose here the first reservoir computer based on a fully passive nonlinearity, namely the saturable absorption of a semiconductor mirror. Our experimental setup constitutes an important step towards the development of ultrafast low-consumption analog computers.
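
    A software analogue of a reservoir computer makes the idea concrete: a fixed random recurrent network is driven by the input and only a linear readout is trained. Here tanh stands in for the saturable-absorber nonlinearity, and the node count, spectral radius, and ridge parameter are illustrative assumptions.

```python
import numpy as np

def run_reservoir(inputs, n_nodes=100, spectral_radius=0.9, seed=0):
    """Drive a random recurrent reservoir with a 1-D input sequence and
    return the node states; only the linear readout is trained."""
    rng = np.random.default_rng(seed)
    w_in = rng.uniform(-1, 1, n_nodes)
    w = rng.normal(size=(n_nodes, n_nodes))
    w *= spectral_radius / max(abs(np.linalg.eigvals(w)))   # echo-state scaling
    x = np.zeros(n_nodes)
    states = []
    for u in inputs:
        x = np.tanh(w @ x + w_in * u)   # software stand-in for the optical nonlinearity
        states.append(x.copy())
    return np.array(states)

def train_readout(states, targets, ridge=1e-6):
    """Ridge-regression readout: the only trained parameters of the reservoir."""
    return np.linalg.solve(states.T @ states + ridge * np.eye(states.shape[1]),
                           states.T @ targets)
```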

  20. [Computational chemistry in structure-based drug design].

    PubMed

    Cao, Ran; Li, Wei; Sun, Han-Zi; Zhou, Yu; Huang, Niu

    2013-07-01

    Today, the understanding of the sequence and structure of biologically relevant targets is growing rapidly and researchers from many disciplines, physics and computational science in particular, are making significant contributions to modern biology and drug discovery. However, it remains challenging to rationally design small molecular ligands with desired biological characteristics based on the structural information of the drug targets, which demands more accurate calculation of ligand binding free-energy. With the rapid advances in computer power and extensive efforts in algorithm development, physics-based computational chemistry approaches have played more important roles in structure-based drug design. Here we reviewed the newly developed computational chemistry methods in structure-based drug design as well as the elegant applications, including binding-site druggability assessment, large scale virtual screening of chemical database, and lead compound optimization. Importantly, here we address the current bottlenecks and propose practical solutions.

  1. Information Hiding based Trusted Computing System Design

    DTIC Science & Technology

    2014-07-18

    This has also been identified in the Defense Science Board study on High Performance Microchip Supply, “Trust cannot be added to integrated circuits...” ... Science Board Task Force on High Performance Microchip Supply, February 2005. [27] B.S. Cohen, “On Integrated Circuits Supply Chain Issues in a Global...”

  2. Solving Information-Based Problems: Evaluating Sources and Information

    ERIC Educational Resources Information Center

    Brand-Gruwel, Saskia; Stadtler, Marc

    2011-01-01

    The focus of this special section is on the processes involved when solving information-based problems. Solving these problems requires from people that they are able to define the information problem, search and select usable and reliable sources and information and synthesise information into a coherent body of knowledge. An important aspect…

  4. A computer-supported information system for forensic services.

    PubMed

    Petrila, J P; Hedlund, J L

    1983-05-01

    Recently many state departments of mental health have decentralized their forensic services programs. This trend has increased administrative needs for accurate, easily accessible information on the forensic services' caseload. The Missouri Department of Mental Health and the Missouri Institute of Psychiatry have developed and implemented a computer-supported system that provides data on the number of cases referred by criminal courts, the questions asked by the courts, the clinical answers to those questions, and demographic information about the evaluated population. The system is a part of the department's other computer systems so that forensic clients may be tracked through various mental health facilities. Mental health administrators may use the system to monitor department policies, ensure appropriate allocation of resources, and improve the quality of forensic reports.

  5. Department of Energy: MICS (Mathematical Information, and Computational Sciences Division). High performance computing and communications program

    SciTech Connect

    1996-06-01

    This document is intended to serve two purposes. Its first purpose is that of a program status report of the considerable progress that the Department of Energy (DOE) has made since 1993, the time of the last such report (DOE/ER-0536, "The DOE Program in HPCC"), toward achieving the goals of the High Performance Computing and Communications (HPCC) Program. The second purpose is that of a summary report of the many research programs administered by the Mathematical, Information, and Computational Sciences (MICS) Division of the Office of Energy Research under the auspices of the HPCC Program and to provide, wherever relevant, easy access to pertinent information about MICS-Division activities via universal resource locators (URLs) on the World Wide Web (WWW). The information pointed to by the URL is updated frequently, and the interested reader is urged to access the WWW for the latest information.

  6. Big data mining analysis method based on cloud computing

    NASA Astrophysics Data System (ADS)

    Cai, Qing Qiu; Cui, Hong Gang; Tang, Hao

    2017-08-01

    In the era of information explosion, the extremely large volume and the discrete, unstructured or semi-structured nature of big data have gone far beyond what traditional data management methods can handle. With the arrival of the cloud computing era, cloud computing provides a new technical approach to massive data mining, which can effectively solve the problem that traditional data mining methods cannot adapt to massive data. This paper introduces the meaning and characteristics of cloud computing, analyzes the advantages of using cloud computing technology to realize data mining, designs an association rule mining algorithm based on the MapReduce parallel processing architecture, and carries out experimental verification. The parallel association rule mining algorithm based on a cloud computing platform can greatly improve the execution speed of data mining.
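
    The support-counting core of association rule mining maps naturally onto the map and reduce phases, as the toy sketch below shows; it runs the two phases on one machine rather than on a cloud cluster, and the itemset size and support threshold are illustrative.

```python
from collections import Counter
from itertools import combinations

def map_phase(transactions, k=2):
    """Map: emit (itemset, 1) for every k-item combination in each transaction."""
    for items in transactions:
        for itemset in combinations(sorted(set(items)), k):
            yield itemset, 1

def reduce_phase(pairs, min_support):
    """Reduce: sum the counts per itemset and keep the frequent ones."""
    counts = Counter()
    for itemset, n in pairs:
        counts[itemset] += n
    return {s: c for s, c in counts.items() if c >= min_support}

# In a real MapReduce job the map output would be shuffled across workers;
# here the two phases are simply chained on one machine.
frequent = reduce_phase(map_phase([["milk", "bread"], ["milk", "eggs", "bread"]], k=2),
                        min_support=2)
```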

  7. Information based universal feature extraction

    NASA Astrophysics Data System (ADS)

    Amiri, Mohammad; Brause, Rüdiger

    2015-02-01

    In many real-world, image-based pattern recognition tasks, the extraction and usage of task-relevant features are the most crucial part of the diagnosis. In the standard approach, they mostly remain task-specific, although humans who perform such a task always use the same image features, trained in early childhood. It seems that universal feature sets exist, but they are not yet systematically found. In our contribution, we tried to find those universal image feature sets that are valuable for most image-related tasks. In our approach, we trained a neural network on natural and non-natural images of objects and background, using a Shannon information-based algorithm and learning constraints. The goal was to extract those features that give the most valuable information for the classification of visual objects and hand-written digits. This will give a good start and performance increase for all other image learning tasks, implementing a transfer learning approach. As a result, we found that we could indeed extract features that are valid in all three kinds of tasks.
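
    A Shannon-information criterion of the kind mentioned above can be sketched as a simple mutual-information score between a candidate feature and the class labels; the histogram-based estimator and bin count below are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def mutual_information(feature, labels, bins=16):
    """Shannon mutual information I(feature; class) for one candidate feature,
    estimated from a joint histogram; higher values mean more class information."""
    feature = np.digitize(feature, np.histogram_bin_edges(feature, bins))
    classes = {c: k for k, c in enumerate(sorted(set(labels)))}
    joint = np.zeros((bins + 2, len(classes)))
    for f, c in zip(feature, labels):
        joint[f, classes[c]] += 1
    p = joint / joint.sum()                          # joint distribution
    pf = p.sum(axis=1, keepdims=True)                # feature marginal
    pc = p.sum(axis=0, keepdims=True)                # class marginal
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (pf @ pc)[nz])))
```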

  8. Resonant transition-based quantum computation

    NASA Astrophysics Data System (ADS)

    Chiang, Chen-Fu; Hsieh, Chang-Yu

    2017-05-01

    In this article we assess a novel quantum computation paradigm based on the resonant transition (RT) phenomenon commonly associated with atomic and molecular systems. We thoroughly analyze the intimate connections between the RT-based quantum computation and the well-established adiabatic quantum computation (AQC). Both quantum computing frameworks encode solutions to computational problems in the spectral properties of a Hamiltonian and rely on the quantum dynamics to obtain the desired output state. We discuss how one can adapt any adiabatic quantum algorithm to a corresponding RT version and the two approaches are limited by different aspects of Hamiltonians' spectra. The RT approach provides a compelling alternative to the AQC under various circumstances. To better illustrate the usefulness of the novel framework, we analyze the time complexity of an algorithm for 3-SAT problems and discuss straightforward methods to fine tune its efficiency.

  9. MTA Computer Based Evaluation System.

    ERIC Educational Resources Information Center

    Brenner, Lisa P.; And Others

    The MTA PLATO-based evaluation system, which has been implemented by a consortium of schools of medical technology, is designed to be general-purpose, modular, data-driven, and interactive, and to accommodate other national and local item banks. The system provides a comprehensive interactive item-banking system in conjunction with online student…

  11. Selected computer system controls at the Energy Information Administration

    SciTech Connect

    Not Available

    1991-09-01

    The purpose of our review of the Energy Information Administration's (EIA) computer system was to evaluate disk and tape information storage and the adequacy of internal controls in the operating system programs. We used a set of computer-assisted audit techniques called CAATS, developed by the US Department of Transportation, Office of Inspector General, in performing the review at the EIA Forrestal Computer Facility. Improved procedures are needed to assure more efficient use of disk space. By transferring data sets from disk to tape, deleting invalid data, releasing unused reserve space and blocking data efficiently, disk space with an estimated value of $1.1 million a year could be recovered for current use. Also, procedures governing the maximum times for storage of information on tapes should be enforced to help ensure that data is not lost. In addition, improved internal controls are needed over granting users system-wide privileges and over authorized program library names to prevent unauthorized access to the system and possible destruction or manipulation of data. Automated Data Processing (ADP) Services Staff officials indicated that software maintenance was not current, due to contractual difficulties with the operating contractor for the Forrestal Facility. Our review confirmed that improvements were needed to help prevent malfunctions of the operating system, which could cause performance degradations, system failures, or loss of either system or user data. Management generally concurred with the recommendations in the report.

  12. Task-Based Information Searching.

    ERIC Educational Resources Information Center

    Vakkari, Pertti

    2003-01-01

    Reviews studies on the relationship between task performance and information searching by end-users, focusing on information searching in electronic environments and information retrieval systems. Topics include task analysis; task characteristics; search goals; modeling information searching; modeling search goals; information seeking behavior;…

  13. Storing and managing information artifacts collected by information analysts using a computing device

    DOEpatents

    Pike, William A; Riensche, Roderick M; Best, Daniel M; Roberts, Ian E; Whyatt, Marie V; Hart, Michelle L; Carr, Norman J; Thomas, James J

    2012-09-18

    Systems and computer-implemented processes for storage and management of information artifacts collected by information analysts using a computing device. The processes and systems can capture a sequence of interactive operation elements that are performed by the information analyst, who is collecting an information artifact from at least one of the plurality of software applications. The information artifact can then be stored together with the interactive operation elements as a snippet on a memory device, which is operably connected to the processor. The snippet comprises a view from an analysis application, data contained in the view, and the sequence of interactive operation elements stored as a provenance representation comprising operation element class, timestamp, and data object attributes for each interactive operation element in the sequence.
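
    The structure described above (a snippet holding the source view, the captured data, and a provenance list of operation elements with class, timestamp, and data-object attributes) can be sketched as plain data classes; the names and the record method are illustrative, not the patented implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Any, List

@dataclass
class OperationElement:
    """One interactive step captured while the analyst collected the artifact."""
    element_class: str                 # e.g. "copy", "query", "highlight"
    timestamp: datetime
    data_object: Any

@dataclass
class Snippet:
    """An information artifact stored together with its provenance."""
    source_view: str                   # the analysis application / view name
    data: Any                          # content captured from that view
    provenance: List[OperationElement] = field(default_factory=list)

    def record(self, element_class: str, data_object: Any) -> None:
        """Append one interactive operation element to the provenance trail."""
        self.provenance.append(
            OperationElement(element_class, datetime.now(), data_object))
```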

  14. Acausal measurement-based quantum computing

    NASA Astrophysics Data System (ADS)

    Morimae, Tomoyuki

    2014-07-01

    In measurement-based quantum computing, there is a natural "causal cone" among qubits of the resource state, since the measurement angle on a qubit has to depend on previous measurement results in order to correct the effect of by-product operators. If we respect the no-signaling principle, by-product operators cannot be avoided. Here we study the possibility of acausal measurement-based quantum computing by using the process matrix framework [Oreshkov, Costa, and Brukner, Nat. Commun. 3, 1092 (2012), 10.1038/ncomms2076]. We construct a resource process matrix for acausal measurement-based quantum computing restricting local operations to projective measurements. The resource process matrix is an analog of the resource state of the standard causal measurement-based quantum computing. We find that if we restrict local operations to projective measurements the resource process matrix is (up to a normalization factor and trivial ancilla qubits) equivalent to the decorated graph state created from the graph state of the corresponding causal measurement-based quantum computing. We also show that it is possible to consider a causal game whose causal inequality is violated by acausal measurement-based quantum computing.

  15. Ontology for cell-based geographic information

    NASA Astrophysics Data System (ADS)

    Zheng, Bin; Huang, Lina; Lu, Xinhai

    2009-10-01

    Inter-operability is a key notion in geographic information science (GIS) for the sharing of geographic information (GI). It requires a seamless translation among different information sources. Ontology is employed in GI discovery to settle semantic conflicts because its natural-language appearance and logical hierarchy structure are considered able to provide better context for both human understanding and machine cognition in describing locations and relationships in the geographic world. However, most current studies on field ontology are deduced from philosophical themes and are not applicable to the raster representation in GIS, which is a kind of field-like phenomenon but does not physically coincide with the general concept of the philosophical field (which mostly comes from physics). That is why we specifically discuss cell-based GI ontology in this paper. The discussion starts with an investigation of the physical characteristics of cell-based raster GI. Then, a unified cell-based GI ontology framework for the recognition of raster objects is introduced, from which a conceptual interface connecting human epistemology and the computer world, the so-called "endurant-occurrant window", is developed for better raster GI discovery and sharing.

  16. Computer-based Guideline Implementation Systems

    PubMed Central

    Shiffman, Richard N.; Liaw, Yischon; Brandt, Cynthia A.; Corb, Geoffrey J.

    1999-01-01

    In this systematic review, the authors analyze the functionality provided by recent computer-based guideline implementation systems and characterize the effectiveness of the systems. Twenty-five studies published between 1992 and January 1998 were identified. Articles were included if the authors indicated an intent to implement guideline recommendations for clinicians and if the effectiveness of the system was evaluated. Provision of eight information management services and effects on guideline adherence, documentation, user satisfaction, and patient outcome were noted. All systems provided patient-specific recommendations. In 19, recommendations were available concurrently with care. Explanation services were described for nine systems. Nine systems allowed interactive documentation, and 17 produced paper-based output. Communication services were present most often in systems integrated with electronic medical records. Registration, calculation, and aggregation services were infrequently reported. There were 10 controlled trials (9 randomized) and 10 time-series correlational studies. Guideline adherence improved in 14 of 18 systems in which it was measured. Documentation improved in 4 of 4 studies. PMID:10094063

  17. Computer-Based Modeling Environments

    DTIC Science & Technology

    1988-12-01

    and Kernighan), CAMPS (Lucas and Mitra), GAMS (Bisschop and Meeraus), LINGO (Cunningham and Schrage), LPL (Hurlimann and... times; and Vo, which describes the integration approach used by a UNIX-based analytical modeling environment at AT&T Bell Laboratories called... platform such as UNIX, as ANALYTICOL does (Childs and Meacham). Or one might build a modeling environment around a suitable, and probably relational

  18. Cloud Computing in the Curricula of Schools of Computer Science and Information Systems

    ERIC Educational Resources Information Center

    Lawler, James P.

    2011-01-01

    The cloud continues to be a developing area of information systems. Evangelistic literature in the practitioner field indicates benefit for business firms but disruption for technology departments of the firms. Though the cloud currently is immature in methodology, this study defines a model program by which computer science and information…

  19. DPMA Model Curriculum for Undergraduate Computer Information Systems Education.

    ERIC Educational Resources Information Center

    Adams, David R., Ed.; Athey, Thomas H., Ed.

    Designed primarily for 4-year undergraduate programs for business applications programmer/analysts offered through schools of business or through applied computer science programs that require a concentration of business courses, these guidelines are based on national and regional conferences, questionnaire surveys, and consultation with computer…

  20. An Investigation of Computer Coaching for Informal Learning Activities.

    ERIC Educational Resources Information Center

    Burton, Richard R.; Brown, John Seely

    This paper provides an in-depth view of computer based tutoring/coaching systems and (1) the philosophy behind them, (2) the kinds of diagnostic modeling strategies required to infer a student's shortcomings from observing his behavior, and (3) the range of explicit tutorial strategies needed for directing the tutor to say the right thing at the…

  1. DPMA Model Curriculum for Undergraduate Computer Information Systems Education.

    ERIC Educational Resources Information Center

    Adams, David R., Ed.; Athey, Thomas H., Ed.

    Designed primarily for 4-year undergraduate programs for business applications programmer/analysts offered through schools of business or through applied computer science programs that require a concentration of business courses, these guidelines are based on national and regional conferences, questionnaire surveys, and consultation with computer…

  2. Extraction of information of targets based on frame buffer

    NASA Astrophysics Data System (ADS)

    Han, Litao; Kong, Qiaoli; Zhao, Xiangwei

    2008-10-01

    Among all modes of perception, vision is the main channel through which an intelligent virtual agent (IVA) obtains environmental information. Realistic and real-time computation of the behavior of intelligent objects in an interactive virtual environment is required. This paper proposes a new method of acquiring environmental information. First, visual images are generated by placing a second viewport at the location of the IVA's viewpoint; the target location, distance, azimuth, and other basic geometric and semantic information can then be acquired from the images. Experiments show that the method takes full advantage of computer graphics hardware, with a simple process and high efficiency.
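
    Once an image has been rendered from the second viewport, basic geometric information such as distance and azimuth can be recovered from the depth buffer and the pixel coordinates of a detected target, as in the sketch below; the pinhole-camera assumptions and the availability of a depth buffer and target mask are illustrative, not details from the paper.

```python
import numpy as np

def target_geometry(depth, target_mask, fov_deg=60.0):
    """Recover distance and azimuth of a detected target from one frame
    rendered at the agent's viewpoint (the second viewport).

    depth: (H, W) depth buffer in world units.
    target_mask: (H, W) boolean mask of pixels labelled as the target."""
    h, w = depth.shape
    ys, xs = np.nonzero(target_mask)
    if len(xs) == 0:
        return None                                   # target not visible
    distance = float(depth[ys, xs].mean())
    # Horizontal pixel offset from the image centre -> azimuth angle.
    cx = (w - 1) / 2.0
    focal_px = (w / 2.0) / np.tan(np.radians(fov_deg) / 2.0)
    azimuth = float(np.degrees(np.arctan2(xs.mean() - cx, focal_px)))
    return distance, azimuth
```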

  3. A Computer-Based Dietary Counseling System.

    ERIC Educational Resources Information Center

    Slack, Warner V.; And Others

    1976-01-01

    The preliminary trial of a program in which principles of patient-computer dialogue have been applied to dietary counseling is described. The program was designed to obtain historical information from overweight patients and to provide instruction and guidance regarding dietary behavior. Beginning with a teaching sequence, 25 non-overweight…

  4. Computer security: a necessary element of integrated information systems.

    PubMed Central

    Butzen, F; Furler, F

    1986-01-01

    The Matheson Report sees the medical library as playing a key role in a network of interlocking information bases that will extend from central repositories of medical information to each physician's personal records. It appears, however, that the role of security in this vision has not been fully delineated. This paper discusses problems in maintaining the security of confidential medical information, the state of the applicable law, and techniques for security (with special emphasis on the UNIX operating system). It is argued that the absence of security threatens any plan to build an information network, as there will be resistance to any system that may give intruders access to confidential data. PMID:3742113

  5. Base Information Transport Infrastructure Wired (BITI Wired)

    DTIC Science & Technology

    2016-03-01

    2016 Major Automated Information System Annual Report: Base Information Transport Infrastructure Wired (BITI Wired), Defense Acquisition Management... Major Automated Information System; MAIS OE - MAIS Original Estimate; MAR – MAIS Annual Report; MDA - Milestone Decision Authority; MDD - Materiel... Combat Information Transport System program was restructured into two pre-Major Automated Information System (pre-MAIS) components: Information

  6. Impact of Computer Based Online Entrepreneurship Distance Education in India

    ERIC Educational Resources Information Center

    Shree Ram, Bhagwan; Selvaraj, M.

    2012-01-01

    The success of Indian enterprises and professionals in the computer and information technology (CIT) domain during the past twenty years has been spectacular. Entrepreneurs, bureaucrats and technocrats are now advancing views about how India can ride the CIT bandwagon and leapfrog into a knowledge-based economy in the area of entrepreneurship distance…

  7. SpecialNet. A National Computer-Based Communications Network.

    ERIC Educational Resources Information Center

    Morin, Alfred J.

    1986-01-01

    "SpecialNet," a computer-based communications network for educators at all administrative levels, has been established and is managed by National Systems Management, Inc. Users can send and receive electronic mail, share information on electronic bulletin boards, participate in electronic conferences, and send reports and other documents to each…

  9. Hospital information systems: measuring end user computing satisfaction (EUCS).

    PubMed

    Aggelidis, Vassilios P; Chatzoglou, Prodromos D

    2012-06-01

    Over the past decade, hospitals in Greece have made significant investments in adopting and implementing new hospital information systems (HISs). Whether these investments will prove beneficial for these organizations depends on the support that will be provided to ensure the effective use of the information systems implemented and also on the satisfaction of their users, which is one of the most important determinants of the success of these systems. Measuring end-user computing satisfaction has a long history within the IS discipline. A number of attempts have been made to evaluate the overall post hoc impact of HIS, focusing on the end-users and more specifically on their satisfaction and the parameters that determine it. The purpose of this paper is to build further upon the existing body of relevant knowledge by testing past models and suggesting new conceptual perspectives on how end-user computing satisfaction (EUCS) is formed among hospital information system users. All models are empirically tested using data from 283 hospital information system (HIS) users. Correlation, exploratory, and confirmatory factor analyses were performed to test the reliability and validity of the measurement models. The structural equation modeling technique was also used to evaluate the causal models. The empirical results of the study provide support for the EUCS model (incorporating new factors) and enhance the generalizability of the EUCS instrument and its robustness as a valid measure of computing satisfaction and a surrogate for system success in a variety of cultural and linguistic settings. Although the psychometric properties of EUCS appear to be robust across studies and user groups, it should not be considered as the final chapter in the validation and refinement of these scales. Continuing efforts should be made to validate and extend the instrument. Copyright © 2012 Elsevier Inc. All rights reserved.
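
    Reliability checks of the kind reported above are commonly summarised with Cronbach's alpha per factor; the sketch below is a generic computation, not the authors' analysis pipeline.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Internal-consistency reliability of one EUCS factor.

    item_scores: 2-D array, rows = respondents, columns = items of the factor."""
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]                               # number of items
    item_var = item_scores.var(axis=0, ddof=1).sum()       # sum of item variances
    total_var = item_scores.sum(axis=1).var(ddof=1)        # variance of scale totals
    return (k / (k - 1)) * (1 - item_var / total_var)
```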

  10. Autonomous management of distributed information systems using evolutionary computation techniques

    NASA Astrophysics Data System (ADS)

    Oates, Martin J.

    1999-03-01

    As the size of typical industrial strength information systems continues to rise, particularly in the arena of Internet based management information systems and multimedia servers, the issue of managing data distribution over clusters or `farms' to overcome performance and scalability issues is becoming of paramount importance. Further, where access is global, this can cause points of geographically localized load contention to `follow the sun' during the day. Traditional site mirroring is not overly effective in addressing this contention and so a more dynamic approach is being investigated to tackle load balancing. The general objective is to manage a self-adapting, distributed database so as to reliably and consistently provide near optimal performance as perceived by client applications. Such a management system must be ultimately capable of operating over a range of time varying usage profiles and fault scenarios, incorporate considerations for communications network delays, multiple updates and maintenance operations. It must also be shown to be capable of being scaled in a practical fashion to ever larger sized networks and databases. Two key components of such an automated system are an optimiser capable of efficiently finding new configuration options, and a suitable model of the system capable of accurately reflecting the performance (or any other required quality of service metric) of the real world system. As conditions change in the real world system, these are fed into the model. The optimiser is then run to find new configurations which are tested in the model prior to implementation in the real world. The model therefore forms an evaluation function which the optimiser utilises to direct its search. Whilst it has already been shown that Genetic Algorithms can provide good solutions to this problem, there are a number of issues associated with this approach. In particular, for industrial strength applications, it must be shown that the GA employed
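
    The role of the GA as the optimiser can be illustrated with a toy version of the placement problem: assign data partitions to servers so that the peak server load is minimised. The encoding, fitness function, and operators below are illustrative assumptions, not the system described in the paper.

```python
import random

def balance_load(partition_load, n_servers, pop=30, gens=200, mut=0.1):
    """Toy GA: assign data partitions to servers so the busiest server is
    as lightly loaded as possible (the fitness the optimiser minimises)."""
    n = len(partition_load)

    def fitness(assign):                       # peak server load (lower is better)
        load = [0.0] * n_servers
        for part, srv in enumerate(assign):
            load[srv] += partition_load[part]
        return max(load)

    population = [[random.randrange(n_servers) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness)
        survivors = population[: pop // 2]     # keep the fitter half
        children = []
        while len(survivors) + len(children) < pop:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, n)       # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mut:          # occasional mutation
                child[random.randrange(n)] = random.randrange(n_servers)
            children.append(child)
        population = survivors + children
    return min(population, key=fitness)
```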

  11. Power of surface-based DNA computation

    SciTech Connect

    Cai, Weiping; Condon, A.E.; Corn, R.M.

    1997-12-01

    A new model of DNA computation that is based on surface chemistry is studied. Such computations involve the manipulation of DNA strands that are immobilized on a surface, rather than in solution as in the work of Adleman. Surface-based chemistry has been a critical technology in many recent advances in biochemistry and offers several advantages over solution-based chemistry, including simplified handling of samples and elimination of loss of strands, which reduce error in the computation. The main contribution of this paper is in showing that in principle, surface-based DNA chemistry can efficiently support general circuit computation on many inputs in parallel. To do this, an abstract model of computation that allows parallel manipulation of binary inputs is described. It is then shown that this model can be implemented by encoding inputs as DNA strands and repeatedly modifying the strands in parallel on a surface, using the chemical processes of hybridization, exonuclease degradation, polymerase extension, and ligation. Thirdly, it is shown that the model supports efficient circuit simulation in the following sense: exactly those inputs that satisfy a circuit can be isolated and the number of parallel operations needed to do this is proportional to the size of the circuit. Finally, results are presented on the power of the model when another resource of DNA computation is limited, namely strand length. 12 refs.
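
    The mark / destroy-unmarked cycle of the surface-based model can be mimicked in software to show how the surviving strands encode exactly the satisfying inputs; this is a conceptual simulation of the abstract model, not a description of the wet-lab chemistry.

```python
from itertools import product

def surviving_assignments(n_vars, clauses):
    """Software simulation of the mark / destroy-unmarked / unmark cycle:
    each binary string stands for one immobilised strand, and each clause
    pass removes the strands that do not satisfy it.

    clauses: iterable of clauses, each a list of (variable_index, wanted_value)."""
    strands = {bits for bits in product((0, 1), repeat=n_vars)}  # all inputs in parallel
    for clause in clauses:
        # "Mark" strands satisfying the clause, then "destroy" the rest.
        strands = {s for s in strands
                   if any(s[i] == val for i, val in clause)}
    return strands

# (x0 or not x1) and (x1 or x2) over three variables:
print(surviving_assignments(3, [[(0, 1), (1, 0)], [(1, 1), (2, 1)]]))
```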

  12. The Intersection of Community-Based Writing and Computer-Based Writing: A Cyberliteracy Case Study.

    ERIC Educational Resources Information Center

    Gabor, Catherine

    The learning goals that inform service learning as a whole can contribute significantly to the computers and writing field. This paper demonstrates how two lines of inquiry, community-based writing and computers and writing, can be furthered through new data and critical reflection on learning goals and communication tools. The paper presents a…

  13. Computer vision based room interior design

    NASA Astrophysics Data System (ADS)

    Ahmad, Nasir; Hussain, Saddam; Ahmad, Kashif; Conci, Nicola

    2015-12-01

    This paper introduces a new application of computer vision. To the best of the authors' knowledge, it is the first attempt to incorporate computer vision techniques into room interior design. Computer vision based interior design is achieved in two steps: object identification and color assignment. An image segmentation approach is used to identify the objects in the room, and different color schemes are used to assign colors to these objects. The proposed approach is applied to simple as well as complex images from online sources. It not only accelerates the process of interior design but also makes it very efficient by offering multiple alternatives.

  14. An Interactive Computer-Based Revision Aid

    ERIC Educational Resources Information Center

    Stevens, J. M.; Harris, F. T. C.

    1977-01-01

    A computer-based review system has been developed, based on the multiple-choice technique, at the University of London for medical students. The user can enter an answer or can have a list of questions to take away and enter later. Student response has been favorable. (LBH)

  15. A computer-based specification methodology

    NASA Technical Reports Server (NTRS)

    Munck, Robert G.

    1986-01-01

    Standard practices for creating and using system specifications are inadequate for large, advanced-technology systems. A need exists to break away from paper documents in favor of documents that are stored in computers and which are read and otherwise used with the help of computers. An SADT-based system, running on the proposed Space Station data management network, could be a powerful tool for doing much of the required technical work of the Station, including creating and operating the network itself.

  16. Establishing performance requirements of computer based systems subject to uncertainty

    SciTech Connect

    Robinson, D.

    1997-02-01

    An organized systems design approach is dictated by the increasing complexity of computer based systems. Computer based systems are unique in many respects but share many of the same problems that have plagued design engineers for decades. The design of complex systems is difficult at best, but as a design becomes intensively dependent on computer processing of external and internal information, the design process quickly borders on chaos. This situation is exacerbated by the requirement that these systems operate with a minimal quantity of information, generally corrupted by noise, regarding the current state of the system. Establishing performance requirements for such systems is particularly difficult. This paper briefly sketches a general systems design approach with emphasis on the design of computer based decision processing systems subject to parameter and environmental variation. The approach will be demonstrated with an application to an on-board diagnostic (OBD) system for automotive emissions systems, now mandated by the state of California and the Federal Clean Air Act. The emphasis is on an approach for establishing probabilistically based performance requirements for computer based systems.
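
    As one illustration of a probabilistically based performance requirement of the kind discussed above, the sketch below uses Monte Carlo simulation to estimate how a diagnostic detection threshold performs under assumed parameter and sensor-noise variation. The distributions and numbers are invented for the example and are not taken from the paper.

```python
# Illustrative Monte Carlo check of a detection threshold under assumed variation.
import random

def detection_probability(threshold, n_trials=20000):
    detected = 0
    for _ in range(n_trials):
        true_emission = random.gauss(1.5, 0.3)              # degraded-component level (assumed)
        measured = true_emission + random.gauss(0.0, 0.2)   # sensor noise (assumed)
        if measured > threshold:
            detected += 1
    return detected / n_trials

for threshold in (1.0, 1.1, 1.2, 1.3):
    print(f"threshold {threshold:.1f}: P(detect) ~ {detection_probability(threshold):.3f}")
```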

  17. Using a small/low cost computer in an information center

    NASA Technical Reports Server (NTRS)

    Wilde, D. U.

    1972-01-01

    Small/low cost computers are available with I/O capacities that make them suitable for SDI and retrospective searching on any of the many commercially available data bases. A small two-tape computer system is assumed, and an analysis of its run-time equations leads to a three-step search procedure. Run times and costs are shown as a function of file size, number of search terms, and input transmission rates. Actual examples verify that it is economically feasible for an information center to consider its own small, dedicated computer system.

  18. Report of the Panel on Computer and Information Technology

    NASA Technical Reports Server (NTRS)

    Lundstrom, Stephen F.; Larsen, Ronald L.

    1984-01-01

    Aircraft have become more and more dependent on computers (information processing) for improved performance and safety. It is clear that this activity will grow, since information processing technology has advanced by a factor of 10 every 5 years for the past 35 years and will continue to do so. Breakthroughs in device technology, from vacuum tubes through transistors to integrated circuits, contribute to this rapid pace. This progress is nearly matched by similar, though not as dramatic, advances in numerical software and algorithms. Progress has not been easy. Many technical and nontechnical challenges were surmounted. The outlook is for continued growth in capability but will require surmounting new challenges. The technology forecast presented in this report has been developed by extrapolating current trends and assessing the possibilities of several high-risk research topics. In the process, critical problem areas that require research and development emphasis have been identified. The outlook assumes a positive perspective; the projected capabilities are possible by the year 2000, and adequate resources will be made available to achieve them. Computer and information technology forecasts and the potential impacts of this technology on aeronautics are identified. Critical issues and technical challenges underlying the achievement of forecasted performance and benefits are addressed.

  20. Computer-based Approaches to Patient Education

    PubMed Central

    Lewis, Deborah

    1999-01-01

    All articles indexed in MEDLINE or CINAHL, related to the use of computer technology in patient education, and published in peer-reviewed journals between 1971 and 1998 were selected for review. Sixty-six articles, including 21 research-based reports, were identified. Forty-five percent of the studies were related to the management of chronic disease. Thirteen studies described an improvement in knowledge scores or clinical outcomes when computer-based patient education was compared with traditional instruction. Additional articles examined patients' computer experience, socioeconomic status, race, and gender and found no significant differences when compared with program outcomes. Sixteen of the 21 research-based studies had effect sizes greater than 0.5, indicating a significant change in the described outcome when the study subjects participated in computer-based patient education. The findings from this review support computer-based education as an effective strategy for transfer of knowledge and skill development for patients. The limited number of research studies (N = 21) points to the need for additional research. Recommendations for new studies include cost-benefit analysis and the impact of these new technologies on health outcomes over time. PMID:10428001

  1. In-silico design of computational nucleic acids for molecular information processing

    PubMed Central

    2013-01-01

    Within recent years nucleic acids have become a focus of interest for prototype implementations of molecular computing concepts. During the same period the importance of ribonucleic acids as components of the regulatory networks within living cells has increasingly been revealed. Molecular computers are attractive due to their ability to function within a biological system, an application area extraneous to the present information technology paradigm. The existence of natural information processing architectures (predominantly exemplified by proteins) demonstrates that computing based on physical substrates that are radically different from silicon is feasible. Two key principles underlie molecular level information processing in organisms: conformational dynamics of macromolecules and self-assembly of macromolecules. Nucleic acids support both principles, and moreover computational design of these molecules is practicable. This study demonstrates the simplicity with which one can construct a set of nucleic acid computing units using a new computational protocol. With the new protocol, diverse classes of nucleic acids imitating the complete set of Boolean logical operators were constructed. These nucleic acid classes display favourable thermodynamic properties and are significantly similar to successful candidates already implemented in the laboratory. This new protocol would enable the construction of a network of interconnecting nucleic acids (as a circuit) for molecular information processing. PMID:23647621

  2. Realizing the Potential of Information Resources: Information, Technology, and Services. Track 8: Academic Computing and Libraries.

    ERIC Educational Resources Information Center

    CAUSE, Boulder, CO.

    Eight papers are presented from the 1995 CAUSE conference track on academic computing and library issues faced by managers of information technology at colleges and universities. The papers include: (1) "Where's the Beef?: Implementation of Discipline-Specific Training on Internet Resources" (Priscilla Hancock and others); (2)…

  3. The Use of Information Technologies for Education in Science, Mathematics, and Computers. An Agenda for Research.

    ERIC Educational Resources Information Center

    Educational Technology Center, Cambridge, MA.

    Developed to guide the research of the Educational Technology Center, a consortium based at Harvard Graduate School of Education, this report addresses the use of new information technologies to enrich, extend, and transform current instructional practice in science, mathematics, and computer education. A discussion of the basic elements required…

  4. Information and Computing Technology and the Gap between School Pedagogy and Children's Culture

    ERIC Educational Resources Information Center

    Sorensen, Birgitte Holm

    2005-01-01

    This article focuses on the gap between the use of Information and Computing Technology (ICT) in schools and the use made of ICT in children's everyday life outside school. Particular emphasis is placed on the communities of practice that young people create through their use of digital technology. The article is based on data collected over five…

  5. An Exploratory Study of Malaysian Publication Productivity in Computer Science and Information Technology.

    ERIC Educational Resources Information Center

    Gu, Yinian

    2002-01-01

    Explores the Malaysian computer science and information technology publication productivity as indicated by data collected from three Web-based databases. Relates possible reasons for the amount and pattern of contributions to the size of researcher population, the availability of refereed scholarly journals, and the total expenditure allocated to…

  6. Informing One-to-One Computing in Primary Schools: Student Use of Netbooks

    ERIC Educational Resources Information Center

    Larkin, Kevin; Finger, Glenn

    2011-01-01

    Although one-to-one laptop programs are being introduced in many schools, minimal research has been conducted regarding their effectiveness in primary schools. Evidence-based research is needed to inform significant funding, deployment and student use of computers. This article analyses key findings from a study conducted in four Year 7 classrooms…

  7. Correcting well-log information for computer processing and analysis

    NASA Astrophysics Data System (ADS)

    Robinson, Joseph E.

    Detailed subsurface analysis depends on suites of good quality well logs that are similar in lithologic response and in presentation. Well-to-well variations in the logs should reflect changes in lithology, not in the recording instrument and instrument settings. Differences in the recording and playback of logged curves, even with the most modern instruments, can seriously affect geologic interpretations. Computer analysis of digitally recorded logs is more dependent on recording quality and can produce erroneous results if there are small errors in the measured input values and control parameters. Good exploration requires good data; however, the geologist cannot control log quality and must work with whatever data are available. Fortunately, most digital log curves can be corrected and standardized so that they qualify as good information. In mature exploration areas, where drilling has been spread over a number of years, the well logs will range from modern digital computer presentations to old hard-copy displays exhibiting a variety of depth and instrument response scales. They may seem an agglomeration of misfit logs that are impossible to work with in their original form. Photographic methods of producing standard presentations give only marginal improvements. The most practical method of creating uniform sets of logs is to digitize them, then correct and replay them in a form that is optimal for either geologic analysis or extended computer processing. Examples from New York State show how computer processing can be used to transform old logs so that they display uniform responses to lithology. Cross sections illustrate detailed correlations that were not practical with the original hard-copy logs, and examples from individual wells display computer-calculated porosities from corrected curves utilizing whole-rock compensation.

  8. Color graph based wavelet transform with perceptual information

    NASA Astrophysics Data System (ADS)

    Malek, Mohamed; Helbert, David; Carré, Philippe

    2015-09-01

    We propose a numerical strategy to define a multiscale analysis for color and multicomponent images based on the representation of data on a graph. Our approach consists of computing the graph of an image using psychovisual information and analyzing it with the spectral graph wavelet transform. We suggest introducing the color dimension into the computation of the graph weights and using the geodesic distance as a means of distance measurement. We have thus defined a wavelet transform based on a graph with perceptual information, using the CIELab color distance. This new representation is illustrated with denoising and inpainting applications. Overall, by introducing psychovisual information into the graph computation for the graph wavelet transform, we obtain very promising results. Thus, results in image restoration highlight the interest of the appropriate use of color information.
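
    A minimal sketch of the perceptual weighting idea described above: edges between neighbouring pixels are weighted by a Gaussian of the CIELab colour difference (CIE76 Delta E), so perceptually similar pixels are strongly connected. The 4-neighbour graph, sigma value, and toy image are assumptions for illustration; pixel values are taken to be already in CIELab coordinates, and the geodesic-distance and wavelet stages are not shown.

```python
# Build 4-neighbour graph weights from CIELab colour differences (illustrative).
import math

def delta_e(lab1, lab2):
    """Euclidean CIELab distance (CIE76 Delta E)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

def graph_weights(lab_image, sigma=10.0):
    """Return {((r, c), (r2, c2)): weight} for right/down neighbour pairs."""
    rows, cols = len(lab_image), len(lab_image[0])
    weights = {}
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((0, 1), (1, 0)):
                r2, c2 = r + dr, c + dc
                if r2 < rows and c2 < cols:
                    d = delta_e(lab_image[r][c], lab_image[r2][c2])
                    weights[((r, c), (r2, c2))] = math.exp(-(d ** 2) / (2 * sigma ** 2))
    return weights

# tiny 2x2 "image" of (L, a, b) triples
lab = [[(50, 0, 0), (52, 1, -1)],
       [(20, 30, 10), (51, 0, 0)]]
print(graph_weights(lab))   # similar colours get weights near 1, dissimilar near 0
```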

  9. Broadcasting Topology and Routing Information in Computer Networks

    DTIC Science & Technology

    1985-05-01

    Andrew S. Tanenbaum, Computer Networks. Prentice-Hall, Inc., 1981. [2] Frank Harary, Graph Theory. Addison-Wesley, 1969. [3] J. M. McQuillan and D...networks is that of keeping all nodes informed of the current operational status of each communication link in the network. The failure or repair of one or...where the network topology is in a nearly continuous state of change. 1.2 The Topology Problem: At any time while a network is in operation, one of

  10. Effective information spreading based on local information in correlated networks

    PubMed Central

    Gao, Lei; Wang, Wei; Pan, Liming; Tang, Ming; Zhang, Hai-Feng

    2016-01-01

    Using network-based information to facilitate information spreading is an essential task for spreading dynamics in complex networks. Focusing on degree-correlated networks, we propose a preferential contact strategy based on the local network structure and local informed density to promote information spreading. During the spreading process, an informed node preferentially selects a contact target among its neighbors based on their degrees or local informed densities. By extensively implementing numerical simulations in synthetic and empirical networks, we find that when only the local structural information is considered, the convergence time of information spreading is remarkably reduced if low-degree neighbors are favored as contact targets. Meanwhile, the minimum convergence time depends non-monotonically on the degree-degree correlation, and a moderate correlation coefficient results in the most efficient information spreading. Incorporating the local informed density into the contact strategy, the convergence time of information spreading can be further reduced, and is minimized by a moderately preferential selection. PMID:27910882
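
    The degree-biased contact rule described above can be sketched very simply: an informed node picks a neighbour with probability proportional to a power of its degree, so a positive bias exponent favours low-degree neighbours. The exponent, graph, and names below are illustrative, not the paper's exact parameterisation.

```python
# Pick a contact target with probability proportional to degree**(-alpha).
import random

def choose_contact(neighbors, degree, alpha=1.0):
    weights = [degree[j] ** (-alpha) for j in neighbors]
    total = sum(weights)
    r = random.uniform(0.0, total)
    acc = 0.0
    for j, w in zip(neighbors, weights):
        acc += w
        if r <= acc:
            return j
    return neighbors[-1]

# tiny example: node 0 has neighbours 1..3 with degrees 2, 5, 9
degree = {1: 2, 2: 5, 3: 9}
picks = [choose_contact([1, 2, 3], degree, alpha=1.0) for _ in range(10000)]
print({j: picks.count(j) / len(picks) for j in (1, 2, 3)})  # low-degree node 1 dominates
```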

  11. Information Sharing for Computing Trust Metrics on COTS Electronic Components

    DTIC Science & Technology

    2008-09-01

    [List of figures: Figure 1, photograph of Xbox internals; Figure 2, photograph used to identify Xbox 360 DVD drives.] For example, in November of 2001, Microsoft released the Xbox video game console based on common personal computer (PC) hardware. The Xbox is essentially a PC with an Intel Mobile Celeron processor, hard drive, nVidia GeForce video card, and random access memory

  12. Representing spatial information in a computational model for network management

    NASA Technical Reports Server (NTRS)

    Blaisdell, James H.; Brownfield, Thomas F.

    1994-01-01

    While currently available relational database management systems (RDBMS) allow inclusion of spatial information in a data model, they lack tools for presenting this information in an easily comprehensible form. Computer-aided design (CAD) software packages provide adequate functions to produce drawings, but still require manual placement of symbols and features. This project has demonstrated a bridge between the data model of an RDBMS and the graphic display of a CAD system. It is shown that the CAD system can be used to control the selection of data with spatial components from the database and then quickly plot that data on a map display. It is shown that the CAD system can be used to extract data from a drawing and then control the insertion of that data into the database. These demonstrations were successful in a test environment that incorporated many features of known working environments, suggesting that the techniques developed could be adapted for practical use.

  13. Department of Energy Mathematical, Information, and Computational Sciences Division: High Performance Computing and Communications Program

    SciTech Connect

    1996-11-01

    This document is intended to serve two purposes. Its first purpose is that of a program status report of the considerable progress that the Department of Energy (DOE) has made since 1993, the time of the last such report (DOE/ER-0536, The DOE Program in HPCC), toward achieving the goals of the High Performance Computing and Communications (HPCC) Program. The second purpose is that of a summary report of the many research programs administered by the Mathematical, Information, and Computational Sciences (MICS) Division of the Office of Energy Research under the auspices of the HPCC Program and to provide, wherever relevant, easy access to pertinent information about MICS-Division activities via universal resource locators (URLs) on the World Wide Web (WWW).

  14. Computer Competencies for the 21st Century Information Systems Educator.

    ERIC Educational Resources Information Center

    McCoy, Randall W.

    2001-01-01

    In a Delphi study, 23 experts identified the computer technology competencies needed by business education teachers. Competencies were grouped into five categories: computer hardware, software, programming, computer integration, and general computer knowledge. (Contains 28 references.) (JOW)

  15. Mutual information-based facial expression recognition

    NASA Astrophysics Data System (ADS)

    Hazar, Mliki; Hammami, Mohamed; Hanêne, Ben-Abdallah

    2013-12-01

    This paper introduces a novel low-computation discriminative region representation for the expression analysis task. The proposed approach relies on interesting studies in psychology which show that most of the descriptive regions responsible for facial expression are located around certain face parts. The contribution of this work lies in the proposition of a new approach that supports automatic facial expression recognition based on automatic region selection. The region selection step aims to select the descriptive regions responsible for facial expression and was performed using the Mutual Information (MI) technique. For facial feature extraction, we applied Local Binary Patterns (LBP) on the gradient image to encode salient micro-patterns of facial expressions. Experimental studies have shown that using discriminative regions provides better results than using the whole face region while reducing the feature vector dimension.
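
    The mutual-information-based selection step can be illustrated with a toy calculation: each candidate region gets a score I(feature; expression label) and the highest-scoring regions are kept. Feature values are assumed to be discretised (e.g. quantised LBP responses); the region names and data below are invented for the example.

```python
# Score candidate regions by mutual information with the expression label.
from collections import Counter
import math

def mutual_information(xs, ys):
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# toy data: one discretised feature value per sample for two candidate regions
labels   = ['smile', 'smile', 'neutral', 'neutral', 'smile', 'neutral']
region_a = [1, 1, 0, 0, 1, 0]    # "mouth" region: tracks the label closely
region_b = [0, 1, 0, 1, 1, 0]    # "forehead" region: weakly related

scores = {'mouth': mutual_information(region_a, labels),
          'forehead': mutual_information(region_b, labels)}
print(scores)   # the mouth region scores higher and would be selected
```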

  16. Automatic generation of computable implementation guides from clinical information models.

    PubMed

    Boscá, Diego; Maldonado, José Alberto; Moner, David; Robles, Montserrat

    2015-06-01

    Clinical information models are increasingly used to describe the contents of Electronic Health Records. Implementation guides are a common specification mechanism used to define such models. They contain, among other reference materials, all the constraints and rules that clinical information must obey. However, these implementation guides are typically oriented toward human readability and thus cannot be processed by computers. As a consequence, they must be reinterpreted and transformed manually into an executable language such as Schematron or Object Constraint Language (OCL). This task can be difficult and error prone due to the large gap between the two representations. The challenge is to specify implementation guides in such a way that humans can read and understand them easily and, at the same time, computers can process them. In this paper, we propose and describe a novel methodology that uses archetypes as the basis for the generation of implementation guides. We use archetypes to generate formal rules expressed in Natural Rule Language (NRL) and other reference materials usually included in implementation guides, such as sample XML instances. We also generate Schematron rules from NRL rules to be used for the validation of data instances. We have implemented these methods in LinkEHR, an archetype editing platform, and exemplify our approach by generating NRL rules and implementation guides from EN ISO 13606, openEHR, and HL7 CDA archetypes.

  17. Computer based safety training: an investigation of methods

    PubMed Central

    Wallen, E; Mulloy, K

    2005-01-01

    Background: Computer based methods are increasingly being used for training workers, although our understanding of how to structure this training has not kept pace with the changing abilities of computers. Information on a computer can be presented in many different ways and the style of presentation can greatly affect learning outcomes and the effectiveness of the learning intervention. Many questions about how adults learn from different types of presentations and which methods best support learning remain unanswered. Aims: To determine if computer based methods, which have been shown to be effective on younger students, can also be an effective method for older workers in occupational health and safety training. Methods: Three versions of a computer based respirator training module were developed and presented to manufacturing workers: one consisting of text only; one with text, pictures, and animation; and one with narration, pictures, and animation. After instruction, participants were given two tests: a multiple choice test measuring low level, rote learning; and a transfer test measuring higher level learning. Results: Participants receiving the concurrent narration with pictures and animation scored significantly higher on the transfer test than did workers receiving the other two types of instruction. There were no significant differences between groups on the multiple choice test. Conclusions: Narration with pictures and text may be a more effective method for training workers about respirator safety than other popular methods of computer based training. Further study is needed to determine the conditions for the effective use of this technology. PMID:15778259

  18. Towards Information-Based Economies.

    ERIC Educational Resources Information Center

    Cronin, Blaise

    An information society is one in which the expression "to earn one's daily bread by the sweat of one's brow" appears decidedly anachronistic. People have been seduced by the rhetoric of novelty and confused by the surface significance of terms which have become accepted parts of everyday speech. What do rubrics such as information society,…

  20. The Computation of Potential Harmonic Coefficients Using Global Crustal Information

    NASA Astrophysics Data System (ADS)

    Tsoulis, D.

    Topographic/isostatic potential harmonic coefficients can be computed from a global elevation model, when one accounts for the compensation of the upper crust according to a certain model of isostasy. The theory is based on a series expansion of the inverse distance function, which enables an efficient computation of the dimensionless potential coefficients on the sphere. The availability of global crustal models permits the application of the same theory, with the exception that here the theoretically defined boundary between upper crust and mantle is replaced with crustal thickness information derived mainly from processing repeated seismic observations. The present paper deals with the spherical harmonic analysis of such a model, namely the CRUST 2.0 global crustal model, and compares the derived spectrum with the respective coefficient sets delivered by the application of idealized isostatic models such as those of Airy/Heiskanen or Pratt/Hayford.
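
    For orientation, a standard formulation of the kind used in such analyses (general background, not quoted from the paper; symbols follow common usage) expands the potential in dimensionless coefficients and, to first order in the layer thickness, relates those coefficients to the surface harmonic coefficients of the layer:

    \[
      V(r,\theta,\lambda) \;=\; \frac{GM}{r}\sum_{n=0}^{N}\Big(\frac{R}{r}\Big)^{n}\sum_{m=-n}^{n}\bar C_{nm}\,\bar Y_{nm}(\theta,\lambda),
      \qquad
      \bar C_{nm} \;\approx\; \frac{3}{2n+1}\,\frac{\rho_{\mathrm{cr}}}{\bar\rho}\,\frac{1}{4\pi R}\int_{\sigma} H(\theta',\lambda')\,\bar Y_{nm}(\theta',\lambda')\,\mathrm{d}\sigma ,
    \]

    where \(H\) is the thickness of the attracting layer (topography, or crustal thickness from CRUST 2.0 for the isostatic part), \(\rho_{\mathrm{cr}}\) its density, \(\bar\rho\) the mean density of the Earth, and \(R\) the reference radius.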

  1. Commodity-Based Computing Clusters at PPPL.

    NASA Astrophysics Data System (ADS)

    Wah, Darren; Davis, Steven L.; Johansson, Marques; Klasky, Scott; Tang, William; Valeo, Ernest

    2002-11-01

    In order to cost-effectively facilitate mid-scale serial and parallel computations and code development, a number of commodity-based clusters have been built at PPPL. A recent addition is the PETREL cluster, consisting of 100 dual-processor machines, both Intel and AMD, interconnected by a 100Mbit switch. Sixteen machines have an additional Myrinet 2000 interconnect. Also underway is the implementation of a Prototype Topical Computing Facility which will explore the effectiveness and scaling of cluster computing for larger scale fusion codes, specifically including those being developed under the SCIDAC auspices. This facility will consist of two parts: a 64 dual-processor node cluster, with high speed interconnect, and a 16 dual-processor node cluster, utilizing gigabit networking, built for the purpose of exploring grid-enabled computing. The initial grid explorations will be in collaboration with the Princeton University Institute for Computational Science and Engineering (PICSciE), where a 16 processor cluster dedicated to investigation of grid computing is being built. The initial objectives are to (1) grid-enable the GTC code and an MHD code, making use of MPICH-G2 and (2) implement grid-enabled interactive visualization using DXMPI and the Chromium API.

  2. Computer Based Decision Support in Dentistry.

    ERIC Educational Resources Information Center

    Wagner, Ina-Veronika; Schneider, Werner

    1991-01-01

    The paper discusses computer-based decision support in the following areas: the dental patient record system; diagnosis and treatment of diseases of the oral mucosa; treatment strategy in complex clinical situations; diagnosis and treatment of functional disturbances of the masticatory system; and patient recall. (DB)

  3. Computer-Based Testing: Test Site Security.

    ERIC Educational Resources Information Center

    Rosen, Gerald A.

    Computer-based testing places great burdens on all involved parties to ensure test security. A task analysis of test site security might identify the areas of protecting the test, protecting the data, and protecting the environment as essential issues in test security. Protecting the test involves transmission of the examinations, identifying the…

  5. Network based high performance concurrent computing

    SciTech Connect

    Sunderam, V.S.

    1991-01-01

    The overall objectives of this project are to investigate research issues pertaining to programming tools and efficiency issues in network based concurrent computing systems. The basis for these efforts is the PVM project that evolved during my visits to Oak Ridge Laboratories under the DOE Faculty Research Participation program; I continue to collaborate with researchers at Oak Ridge on some portions of the project.

  6. Computer-Based Instruction in Dietetics Education.

    ERIC Educational Resources Information Center

    Schroeder, Lois; Kent, Phyllis

    1982-01-01

    Details the development and system design of a computer-based instruction (CBI) program designed to provide tutorial training in diet modification as part of renal therapy and provides the results of a study that compared the effectiveness of the CBI program with the traditional lecture/laboratory method. (EAO)

  8. Computer-Game-Based Tutoring of Mathematics

    ERIC Educational Resources Information Center

    Ke, Fengfeng

    2013-01-01

    This in-situ, descriptive case study examined the potential of implementing computer mathematics games as an anchor for tutoring of mathematics. Data were collected from middle school students at a rural pueblo school and an urban Hispanic-serving school, through in-field observation, content analysis of game-based tutoring-learning interactions,…

  10. Computational Viscoplasticity Based on Overstress (CVBO) Model

    NASA Astrophysics Data System (ADS)

    Yuan, Zheng; Ruggles-wrenn, Marina; Fish, Jacob

    2014-03-01

    This article presents an efficient computational viscoplasticity based on an overstress (CVBO) model, including three-dimensional formulation, implicit stress update procedures, consistent tangent, and systematic calibration of the model parameters to experimental data. The model has been validated for PMR 15 neat resin, including temperature and aging dependence.

  11. Computer-Based Training Starter Kit.

    ERIC Educational Resources Information Center

    Federal Interagency Group for Computer-Based Training, Washington, DC.

    Intended for use by training professionals with little or no background in the application of automated data processing (ADP) systems, processes, or procurement requirements, this reference manual provides guidelines for establishing a computer based training (CBT) program within a federal agency of the United States government. The manual covers:…

  12. Evaluation of a Computer-Based Narrative

    ERIC Educational Resources Information Center

    Sharf, Richard S.

    1978-01-01

    A computer-based narrative report integrating results from the Strong Vocational Interest Blank, the Opinion Attitude and Interest Survey, and the Cooperative English Test was compared with a standard profile format. No differences were found between the two methods for males or females. (Author)

  13. Computer Based Simulation of Laboratory Experiments.

    ERIC Educational Resources Information Center

    Edward, Norrie S.

    1997-01-01

    Examines computer based simulations of practical laboratory experiments in engineering. Discusses the aims and achievements of lab work (cognitive, process, psychomotor, and affective); types of simulations (model building and behavioral); and the strengths and weaknesses of simulations. Describes the development of a centrifugal pump simulation,…

  14. Educator Beliefs Regarding Computer-Based Instruction.

    ERIC Educational Resources Information Center

    Swann, D. LaDon; Branson, Floyd, Jr.; Talbert, B. Allen

    2003-01-01

    Extension educators (n=17) completed two of five technical sections from an aquaculture CD-ROM tutorial. Evidence from pre/post-training questionnaires, content assessments, and follow-up interviews reveals favorable attitudes toward computer-based inservice training. The ability to spend less time out of their county and to review materials after…

  17. Prototyping of Computer-Based Training Materials.

    ERIC Educational Resources Information Center

    Gray, D. E.; Black, T. R.

    1994-01-01

    Defines prototyping as an original version or model on which a completed software system for computer-based training is formed; examines the development process of a prototype; describes how prototyping can assist in facilitating communication between educational technology, software engineering, and project management; and discusses why…

  18. Computer Based Instructional Systems--1985-1995.

    ERIC Educational Resources Information Center

    Micheli, Gene S.; And Others

    This report discusses developments in computer based instruction (CBI) and presents initiatives for the improvement of Navy instructional management in the 1985 to 1995 time frame. The state of the art in instructional management and delivery is assessed, projections for the capabilities for instructional management and delivery systems during…

  20. Logistical Consideration in Computer-Based Screening of Astronaut Applicants

    NASA Technical Reports Server (NTRS)

    Galarza, Laura

    2000-01-01

    This presentation reviews the logistical, ergonomic, and psychometric issues and data related to the development and operational use of a computer-based system for the psychological screening of astronaut applicants. The Behavioral Health and Performance Group (BHPG) at the Johnson Space Center upgraded its astronaut psychological screening and selection procedures for the 1999 astronaut applicants and subsequent astronaut selection cycles. The questionnaires, tests, and inventories were upgraded from a paper-and-pencil system to a computer-based system. Members of the BHPG and a computer programmer designed and developed needed interfaces (screens, buttons, etc.) and programs for the astronaut psychological assessment system. This intranet-based system included the user-friendly computer-based administration of tests, test scoring, generation of reports, the integration of test administration and test output to a single system, and a complete database for past, present, and future selection data. Upon completion of the system development phase, four beta and usability tests were conducted with the newly developed system. The first three tests included 1 to 3 participants each. The final system test was conducted with 23 participants tested simultaneously. Usability and ergonomic data were collected from the system (beta) test participants and from 1999 astronaut applicants who volunteered the information in exchange for anonymity. Beta and usability test data were analyzed to examine operational, ergonomic, programming, test administration and scoring issues related to computer-based testing. Results showed a preference for computer-based testing over paper-and-pencil procedures. The data also reflected specific ergonomic, usability, psychometric, and logistical concerns that should be taken into account in future selection cycles. Conclusion. Psychological, psychometric, human and logistical factors must be examined and considered carefully when developing and

  2. 78 FR 79014 - Advisory Committee for Computer and Information Science and Engineering Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-27

    ... From the Federal Register Online via the Government Publishing Office NATIONAL SCIENCE FOUNDATION Advisory Committee for Computer and Information Science and Engineering Notice of Meeting In accordance... announces the following meeting: NAME: Advisory Committee for Computer and Information Science and...

  3. Waveguide-QED-based photonic quantum computation.

    PubMed

    Zheng, Huaixiu; Gauthier, Daniel J; Baranger, Harold U

    2013-08-30

    We propose a new scheme for quantum computation using flying qubits--propagating photons in a one-dimensional waveguide interacting with matter qubits. Photon-photon interactions are mediated by the coupling to a four-level system, based on which photon-photon π-phase gates (CONTROLLED-NOT) can be implemented for universal quantum computation. We show that high gate fidelity is possible, given recent dramatic experimental progress in superconducting circuits and photonic-crystal waveguides. The proposed system can be an important building block for future on-chip quantum networks.
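
    For reference, the role of a photon-photon π-phase gate in universal computation follows from a standard identity (general quantum-computing background, not specific to this paper): a conditional π phase shift is the controlled-Z gate, and a CNOT is obtained from it by conjugating the target qubit with Hadamards,

    \[
      \mathrm{CZ} \;=\; \mathrm{diag}\big(1,\,1,\,1,\,e^{i\pi}\big) \;=\; \mathrm{diag}(1,\,1,\,1,\,-1),
      \qquad
      \mathrm{CNOT} \;=\; (\mathbb{1}\otimes H)\,\mathrm{CZ}\,(\mathbb{1}\otimes H).
    \]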

  4. New Mathematics of Information: Homotopical and Higher Categorical Foundations of Information and Computation

    DTIC Science & Technology

    2014-09-13

    specialists in mathematics, logic, and computer science with diverse backgrounds ranging from homotopical algebra and category theory to theoretical and...AFRL-OSR-VA-TR-2014-0227, "New Mathematics of Information: Homotopical and Higher Categorical Foundations of Information and Computation," Steven Awodey, Carnegie Mellon University, final report (09/24/2014) for the award period 15 June 2011 - 30 April 2014.

  5. Computing the trajectory mutual information between a point process and an analog stochastic process.

    PubMed

    Pasha, Syed Ahmed; Solo, Victor

    2012-01-01

    In a number of application areas, such as neural coding, there is interest in computing, from real data, the information flows between stochastic processes, one of which is a point process. Of particular interest is the calculation of the trajectory (as opposed to marginal) mutual information between an observed point process and an underlying but unobserved analog stochastic process, i.e. a state, that influences it. Using particle filtering, we develop a model-based trajectory mutual information calculation, apparently for the first time.
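
    As background for the quantity being computed, for a doubly stochastic point process \(N_t\) with intensity \(\lambda_t\) driven by the hidden state, the trajectory mutual information takes the standard filtering form (a textbook relation of the kind such calculations target, not a formula quoted from the paper):

    \[
      I\big(X_{0:T};N_{0:T}\big) \;=\; \mathbb{E}\!\left[\int_0^T \log\frac{\lambda_t}{\hat\lambda_t}\,\mathrm{d}N_t \;-\; \int_0^T \big(\lambda_t-\hat\lambda_t\big)\,\mathrm{d}t\right],
      \qquad
      \hat\lambda_t \;=\; \mathbb{E}\big[\lambda_t \mid N_{0:t}\big],
    \]

    where the causal intensity estimate \(\hat\lambda_t\) is the object a particle filter approximates.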

  6. A community-based study of asthenopia in computer operators

    PubMed Central

    Choudhary, Sushilkumar; Doshi, Vikas G

    2008-01-01

    Context: There is growing body of evidence that use of computers can adversely affect the visual health. Considering the rising number of computer users in India, computer-related asthenopia might take an epidemic form. In view of that, this study was undertaken to find out the magnitude of asthenopia in computer operators and its relationship with various personal and workplace factors. Aims: To study the prevalence of asthenopia among computer operators and its association with various epidemiological factors. Settings and Design: Community-based cross-sectional study of 419 subjects who work on computer for varying period of time. Materials and Methods: Four hundred forty computer operators working in different institutes were selected randomly. Twenty-one did not participate in the study, making the nonresponse rate 4.8%. Rest of the subjects (n = 419) were asked to fill a pre-tested questionnaire, after obtaining their verbal consent. Other relevant information was obtained by personal interview and inspection of workstation. Statistical Analysis Used: Simple proportions and Chi-square test. Results: Among the 419 subjects studied, 194 (46.3%) suffered from asthenopia during or after work on computer. Marginally higher proportion of asthenopia was noted in females compared to males. Occurrence of asthenopia was significantly associated with age of starting use of computer, presence of refractive error, viewing distance, level of top of the computer screen with respect to eyes, use of antiglare screen and adjustment of contrast and brightness of monitor screen. Conclusions: Prevalence of asthenopia was noted to be quite high among computer operators, particularly in those who started its use at an early age. Individual as well as work-related factors were found to be predictive of asthenopia. PMID:18158404

  7. Advanced information processing system: Inter-computer communication services

    NASA Technical Reports Server (NTRS)

    Burkhardt, Laura; Masotto, Tom; Sims, J. Terry; Whittredge, Roy; Alger, Linda S.

    1991-01-01

    The purpose is to document the functional requirements and detailed specifications for the Inter-Computer Communications Services (ICCS) of the Advanced Information Processing System (AIPS). An introductory section is provided to outline the overall architecture and functional requirements of the AIPS and to present an overview of the ICCS. An overview of the AIPS architecture as well as a brief description of the AIPS software is given. The guarantees of the ICCS are provided, and the ICCS is described as a seven-layered International Standards Organization (ISO) Model. The ICCS functional requirements, functional design, and detailed specifications as well as each layer of the ICCS are also described. A summary of results and suggestions for future work are presented.

  8. Geometric algebra and information geometry for quantum computational software

    NASA Astrophysics Data System (ADS)

    Cafaro, Carlo

    2017-03-01

    The art of quantum algorithm design is highly nontrivial. Grover's search algorithm constitutes a masterpiece of quantum computational software. In this article, we use methods of geometric algebra (GA) and information geometry (IG) to enhance the algebraic efficiency and the geometrical significance of the digital and analog representations of Grover's algorithm, respectively. Specifically, GA is used to describe the Grover iterate and the discretized iterative procedure that exploits quantum interference to amplify the probability amplitude of the target state before measuring the query register. The transition from digital to analog descriptions occurs via Stone's theorem, which relates the (unitary) Grover iterate to a suitable (Hermitian) Hamiltonian that controls Schrödinger's quantum mechanical evolution of a quantum state towards the target state. Once the discrete-to-continuous transition is completed, IG is used to interpret Grover's iterative procedure as a geodesic path on the manifold of the parametric density operators of pure quantum states constructed from the continuous approximation of the parametric quantum output state in Grover's algorithm. Finally, we discuss the dissipationless nature of quantum computing, recover the quadratic speedup relation, and identify the superfluity of the Walsh-Hadamard operation from an IG perspective with emphasis on statistical mechanical considerations.
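
    As standard Grover background for the digital description referred to above (not a result of this article), the Grover iterate combines the oracle phase flip with inversion about the mean, and roughly \((\pi/4)\sqrt{N}\) applications rotate the uniform initial state onto the marked state, which is the quadratic speedup mentioned:

    \[
      G \;=\; \big(2\,|\psi\rangle\langle\psi| - \mathbb{1}\big)\,O_f,
      \qquad
      O_f|x\rangle = (-1)^{f(x)}|x\rangle,
      \qquad
      |\psi\rangle = \frac{1}{\sqrt{N}}\sum_{x=0}^{N-1}|x\rangle,
      \qquad
      k_{\mathrm{opt}} \approx \frac{\pi}{4}\sqrt{N}.
    \]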

  9. Photonic reservoir computing: a new approach to optical information processing

    NASA Astrophysics Data System (ADS)

    Vandoorne, Kristof; Fiers, Martin; Verstraeten, David; Schrauwen, Benjamin; Dambre, Joni; Bienstman, Peter

    2010-06-01

    Despite ever-increasing computational power, recognition and classification problems remain challenging to solve. Recently, advances have been made by the introduction of the new concept of reservoir computing. This is a methodology from the field of machine learning and neural networks that has been used successfully in several pattern classification problems, such as speech and image recognition. Thus far, most implementations have been in software, limiting their speed and power efficiency. Photonics could be an excellent platform for a hardware implementation of this concept because of its inherent parallelism and unique nonlinear behaviour. Moreover, a photonic implementation offers the promise of massively parallel information processing with low power and high speed. We propose using a network of coupled Semiconductor Optical Amplifiers (SOAs) and show in simulation that it could be used as a reservoir by comparing it to conventional software implementations on a benchmark speech recognition task. In spite of the differences from classical reservoir models, the performance of our photonic reservoir is comparable to that of conventional implementations and sometimes slightly better. As our implementation uses coherent light for information processing, we find that phase tuning is crucial to obtaining high performance. In parallel, we investigate the use of a network of photonic crystal cavities, using coupled mode theory (CMT) to model these resonators. A new framework is designed to model networks of resonators and SOAs. The same network topologies are used, but feedback is added to control the internal dynamics of the system. By adjusting the readout weights of the network in a controlled manner, we can generate arbitrary periodic patterns.
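
    The reservoir-computing methodology mentioned above is easiest to see in its generic software form: a fixed random recurrent network is driven by the input, and only a linear readout is trained. The sketch below is a plain NumPy echo-state-style example with a ridge-regression readout on a toy delayed-recall task; it is not the photonic SOA network of the paper, and the task and parameters are invented.

```python
# Generic echo-state reservoir with a trained linear (ridge) readout.
import numpy as np

rng = np.random.default_rng(0)
n_res, n_steps = 100, 500

# fixed random reservoir weights, scaled to a modest spectral radius
W = rng.normal(0, 1, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
W_in = rng.normal(0, 0.5, (n_res, 1))

u = rng.uniform(-1, 1, n_steps)      # input signal
y_target = np.roll(u, 3)             # toy task: recall the input three steps back

# collect reservoir states driven by the input
x = np.zeros(n_res)
states = np.zeros((n_steps, n_res))
for t in range(n_steps):
    x = np.tanh(W @ x + W_in[:, 0] * u[t])
    states[t] = x

# ridge-regression readout (the only trained part)
ridge = 1e-6
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res), states.T @ y_target)
print(f"training MSE: {np.mean((states @ W_out - y_target) ** 2):.4f}")
```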

  10. Distributed computer taxonomy based on O/S structure

    NASA Technical Reports Server (NTRS)

    Foudriat, Edwin C.

    1985-01-01

    The taxonomy considers the resource structure at the operating system level. A communication-based taxonomy is compared with the new taxonomy to illustrate how the latter better reflects the client's view of the distributed computer. The results illustrate the fundamental features of, and what is required to construct, fully distributed processing systems. The problem of using networked computers on the space station is addressed. A detailed discussion of the taxonomy is not given here; information is given in the form of charts and diagrams that were used to illustrate a talk.

  11. 77 FR 66873 - Advisory Committee for Computer and Information Science and Engineering; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-07

    ... Advisory Committee for Computer and Information Science and Engineering; Notice of Meeting In accordance... announces the following meeting: Name: Advisory Committee for Computer and Information Science and... impact of its policies, programs and activities on the Computer and Information Science and Engineering...

  12. 75 FR 19428 - Advisory Committee for Computer and Information Science and Engineering; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-14

    ... Advisory Committee for Computer and Information Science and Engineering; Notice of Meeting In accordance... announces the following meeting: Name: Advisory Committee for Computer and Information Science and... Cassandra Queen at the Directorate for Computer and Information Science and Engineering at 703/292-8900...

  13. 76 FR 61118 - Advisory Committee for Computer and Information Science and Engineering; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-03

    ... FOUNDATION Advisory Committee for Computer and Information Science and Engineering; Notice of Meeting In... Foundation announces the following meeting: Name: Advisory Committee for Computer and Information Science and.... Contact Person: Carmen Whitson, Directorate for Computer and Information Science and Engineering, National...

  14. Hanford general employee training: Computer-based training instructor's manual

    SciTech Connect

    Not Available

    1990-10-01

    The Computer-Based Training portion of the Hanford General Employee Training course is designed to be used in a classroom setting with a live instructor. Future references to "this course" refer only to the computer-based portion of the whole. This course covers the basic Safety, Security, and Quality issues that pertain to all employees of Westinghouse Hanford Company. The topics that are covered were taken from the recommendations and requirements for General Employee Training as set forth by the Institute of Nuclear Power Operations (INPO) in INPO 87-004, Guidelines for General Employee Training, applicable US Department of Energy orders, and Westinghouse Hanford Company procedures and policy. Besides presenting fundamental concepts, this course also contains information on resources that are available to assist students. It does this using Interactive Videodisk technology, which combines computer-generated text and graphics with audio and video provided by a videodisk player.

  15. A Comparative Evaluation of Computer Based and Non-Computer Based Instructional Strategies.

    ERIC Educational Resources Information Center

    Emerson, Ian

    1988-01-01

    Compares the computer assisted instruction (CAI) tutorial with its non-computerized pedagogical roots: the Socratic Dialog with Skinner's Programmed Instruction. Tests the effectiveness of a CAI tutorial on diffusion and osmosis against four other interactive and non-interactive instructional strategies. Notes computer based strategies were…

  17. A simple computational algorithm of model-based choice preference.

    PubMed

    Toyama, Asako; Katahira, Kentaro; Ohira, Hideki

    2017-06-01

    A broadly used computational framework posits that two learning systems operate in parallel during the learning of choice preferences, namely the model-free and model-based reinforcement-learning systems. In this study, we examined another possibility, in which model-free learning is the basic system and model-based information is its modulator. Accordingly, we proposed several modified versions of a temporal-difference learning model to explain the choice-learning process. Using the two-stage decision task developed by Daw, Gershman, Seymour, Dayan, and Dolan (2011), we compared their original computational model, which assumes a parallel learning process, and our proposed models, which assume a sequential learning process. Choice data from 23 participants showed a better fit with the proposed models. More specifically, the proposed eligibility adjustment model, which assumes that the environmental model can weight the degree of the eligibility trace, can explain choices better under both model-free and model-based controls and has a simpler computational algorithm than the original model. In addition, the forgetting learning model and its variation, which assume changes in the values of unchosen actions, substantially improved the fits to the data. Overall, we show that a hybrid computational model best fits the data. The parameters used in this model succeed in capturing individual tendencies with respect to both model use in learning and exploration behavior. This computational model provides novel insights into learning with interacting model-free and model-based components.
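
    A minimal sketch of the kind of hybrid valuation referred to above, in the spirit of two-stage-task models: model-based values derived from a transition model are mixed with model-free TD values using a weight w before choice. The variable names, mixing scheme and parameter values are illustrative assumptions, not the authors' eligibility adjustment or forgetting models.

      # Hypothetical hybrid update for the first stage of a two-stage task.
      # The weight w, learning rate and transition model are assumed values.
      import numpy as np

      alpha, w = 0.1, 0.5                    # learning rate, model-based weight
      q_mf = np.zeros(2)                     # model-free values of the two first-stage actions
      q_stage2 = np.zeros(2)                 # values of the two second-stage states
      trans = np.array([[0.7, 0.3],          # assumed P(second-stage state | first-stage action)
                        [0.3, 0.7]])

      def update(action, state2, reward):
          """One trial: TD update of the stage-2 value, then a hybrid stage-1 valuation."""
          q_stage2[state2] += alpha * (reward - q_stage2[state2])
          q_mf[action] += alpha * (q_stage2[state2] - q_mf[action])   # model-free update
          q_mb = trans @ q_stage2                                     # model-based values
          return w * q_mb + (1.0 - w) * q_mf                          # values guiding choice

      print(update(action=0, state2=1, reward=1.0))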

  18. Managing geometric information with a data base management system

    NASA Technical Reports Server (NTRS)

    Dube, R. P.

    1984-01-01

    The strategies for managing computer based geometry are described. The computer model of geometry is the basis for communication, manipulation, and analysis of shape information. The research on integrated programs for aerospace-vehicle design (IPAD) focuses on the use of data base management system (DBMS) technology to manage engineering/manufacturing data. The objective of IPAD is to develop a computer based engineering complex which automates the storage, management, protection, and retrieval of engineering data. In particular, this facility must manage geometry information as well as associated data. The approach taken on the IPAD project to achieve this objective is discussed. Geometry management in current systems and the approach taken in the early IPAD prototypes are examined.

  20. Cloud computing: a new business paradigm for biomedical information sharing.

    PubMed

    Rosenthal, Arnon; Mork, Peter; Li, Maya Hao; Stanford, Jean; Koester, David; Reynolds, Patti

    2010-04-01

    We examine how the biomedical informatics (BMI) community, especially consortia that share data and applications, can take advantage of a new resource called "cloud computing". Clouds generally offer resources on demand. In most clouds, charges are pay per use, based on large farms of inexpensive, dedicated servers, sometimes supporting parallel computing. Substantial economies of scale potentially yield costs much lower than dedicated laboratory systems or even institutional data centers. Overall, even with conservative assumptions, for applications that are not I/O intensive and do not demand a fully mature environment, the numbers suggested that clouds can sometimes provide major improvements, and should be seriously considered for BMI. Methodologically, it was very advantageous to formulate analyses in terms of component technologies; focusing on these specifics enabled us to bypass the cacophony of alternative definitions (e.g., exactly what does a cloud include) and to analyze alternatives that employ some of the component technologies (e.g., an institution's data center). Relative analyses were another great simplifier. Rather than listing the absolute strengths and weaknesses of cloud-based systems (e.g., for security or data preservation), we focus on the changes from a particular starting point, e.g., individual lab systems. We often find a rough parity (in principle), but one needs to examine individual acquisitions--is a loosely managed lab moving to a well managed cloud, or a tightly managed hospital data center moving to a poorly safeguarded cloud?
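
    As a rough illustration of the relative, component-wise cost reasoning described above, the sketch below compares an assumed pay-per-use hourly rate against the effective hourly cost of an owned server at different utilization levels. All prices, lifetimes and utilization figures are invented assumptions, not figures from the paper.

      # Back-of-the-envelope comparison: pay-per-use cloud vs a dedicated server.
      # Every number here is an assumption for illustration only.
      def dedicated_cost_per_useful_hour(purchase, lifetime_years, admin_per_year, utilization):
          hours = lifetime_years * 365 * 24
          return (purchase + admin_per_year * lifetime_years) / (hours * utilization)

      cloud_rate = 0.70                                  # assumed on-demand $/hour
      for util in (0.05, 0.25, 0.75):
          dedicated = dedicated_cost_per_useful_hour(purchase=8000, lifetime_years=4,
                                                     admin_per_year=2000, utilization=util)
          cheaper = "cloud" if cloud_rate < dedicated else "dedicated"
          print(f"utilization {util:.0%}: dedicated ~ ${dedicated:.2f}/useful hour -> {cheaper} cheaper")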

  1. Computer-based and web-based radiation safety training

    SciTech Connect

    Owen, C., LLNL

    1998-03-01

    The traditional approach to delivering radiation safety training has been to provide a stand-up lecture on the topic, with the possible aid of video, and to repeat the same material periodically. New approaches to meeting training requirements are needed to address the advent of flexible work hours and telecommuting, and to better accommodate individuals learning at their own pace. Computer-based and web-based radiation safety training can provide this alternative. Computer-based and web-based training is an interactive form of learning that the student controls, resulting in enhanced and focused learning at a time most often chosen by the student.

  2. Turning text into research networks: information retrieval and computational ontologies in the creation of scientific databases.

    PubMed

    Ceci, Flávio; Pietrobon, Ricardo; Gonçalves, Alexandre Leopoldo

    2012-01-01

    Web-based, free-text documents on science and technology have been growing rapidly on the web. However, most of these documents are not immediately processable by computers, which slows down the acquisition of useful information. Computational ontologies might represent a possible solution by enabling semantically machine-readable data sets. However, the process of ontology creation, instantiation and maintenance is still based on manual methodologies and is thus time- and cost-intensive. We focused on a large corpus containing information on researchers, research fields, and institutions. We based our strategy on traditional entity recognition, social computing and correlation. We devised a semi-automatic approach for the recognition, correlation and extraction of named entities and relations from textual documents, which are then used to create, instantiate, and maintain an ontology. We present a prototype demonstrating the applicability of the proposed strategy, along with a case study describing how direct and indirect relations can be extracted from academic and professional activities registered in a database of curriculum vitae in free-text format. We present evidence that this system can identify entities to assist in the process of knowledge extraction and representation to support ontology maintenance. We also demonstrate the extraction of relationships among ontology classes and their instances. We have demonstrated that our system can be used to convert research information in free-text format into a database with a semantic structure. Future studies should test this system using the growing amount of free-text information available at the institutional and national levels.

  3. Campus Computing, 1998. The Ninth National Survey of Desktop Computing and Information Technology in American Higher Education.

    ERIC Educational Resources Information Center

    Green, Kenneth C.

    This report presents findings of a June 1998 survey of computing officials at 1,623 two- and four-year U.S. colleges and universities concerning the use of computer technology. The survey found that computing and information technology (IT) are now core components of the campus environment and classroom experience. However, key aspects of IT…

  4. A Computer Network Protocol for Library and Information Science Applications. NCLIS/NBS Task Force on Computer Network Protocol.

    ERIC Educational Resources Information Center

    National Commission on Libraries and Information Science, Washington, DC.

    This document describes a proposed computer to computer protocol for electronic communication of digital information over a nationwide library bibliographic network. The protocol allows application tasks at one site on the network to converse with application tasks at any other site, regardless of differences in computer architecture or operating…

  5. Nanoinformatics and DNA-based computing: catalyzing nanomedicine.

    PubMed

    Maojo, Victor; Martin-Sanchez, Fernando; Kulikowski, Casimir; Rodriguez-Paton, Alfonso; Fritts, Martin

    2010-05-01

    Five decades of research and practical application of computers in biomedicine have given rise to the discipline of medical informatics, which has made many advances in genomic and translational medicine possible. Developments in nanotechnology are opening up the prospects for nanomedicine and regenerative medicine where informatics and DNA computing can become the catalysts enabling health care applications at sub-molecular or atomic scales. Although nanomedicine promises a new exciting frontier for clinical practice and biomedical research, issues involving cost-effectiveness studies, clinical trials and toxicity assays, drug delivery methods, and the implementation of new personalized therapies still remain challenging. Nanoinformatics can accelerate the introduction of nano-related research and applications into clinical practice, leading to an area that could be called "translational nanoinformatics." At the same time, DNA and RNA computing presents an entirely novel paradigm for computation. Nanoinformatics and DNA-based computing are together likely to completely change the way we model and process information in biomedicine and impact the emerging field of nanomedicine most strongly. In this article, we review work in nanoinformatics and DNA (and RNA)-based computing, including applications in nanopediatrics. We analyze their scientific foundations, current research and projects, envisioned applications and potential problems that might arise from them.

  6. Call Admission Control Scheme Based on Statistical Information

    NASA Astrophysics Data System (ADS)

    Fujiwara, Takayuki; Oki, Eiji; Shiomoto, Kohei

    A call admission control (CAC) scheme based on statistical information is proposed, called the statistical CAC scheme. A conventional scheme needs to manage session information for each link to update the residual bandwidth of a network in real time, and therefore has a scalability problem in terms of network size. The statistical CAC rejects session setup requests in accordance with a pre-computed ratio, called the rejection ratio. The rejection ratio is computed using statistical information about the bandwidth requested for each link so that the congestion probability is less than an upper bound specified by a network operator. The statistical CAC is more scalable in terms of network size than the conventional scheme because it does not need to keep state information for accommodated sessions. Numerical results show that the statistical CAC, even without exact session state information, only slightly degrades network utilization compared with the conventional scheme.
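
    A simplified sketch of how such a rejection ratio could be pre-computed from per-link demand statistics, assuming a Gaussian approximation of the aggregate requested bandwidth; the formulation, search step and numbers are illustrative assumptions, not the authors' exact procedure.

      # Choose the smallest rejection ratio r such that, after thinning the offered
      # sessions by (1 - r), P(total requested bandwidth > capacity) stays below
      # the operator's bound (Gaussian approximation assumed).
      from math import sqrt
      from statistics import NormalDist

      def rejection_ratio(offered_sessions, mean_bw, var_bw, capacity, bound=0.01):
          z = NormalDist().inv_cdf(1.0 - bound)      # quantile matching the congestion bound
          r = 0.0
          while r < 1.0:
              n = offered_sessions * (1.0 - r)       # expected admitted sessions
              mu, sigma = n * mean_bw, sqrt(n * var_bw)
              if mu + z * sigma <= capacity:         # congestion probability <= bound
                  return round(r, 2)
              r += 0.01
          return 1.0

      print(rejection_ratio(offered_sessions=200, mean_bw=1.0, var_bw=0.25, capacity=190))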

  7. Shuttle Program Information Management System (SPIMS) data base

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The Shuttle Program Information Management System (SPIMS) is a computerized data base operations system. The central computer is the CDC 170-730 located at Johnson Space Center (JSC), Houston, Texas. There are several applications which have been developed and supported by SPIMS. A brief description is given.

  8. A rule based computer aided design system

    NASA Technical Reports Server (NTRS)

    Premack, T.

    1986-01-01

    A Computer Aided Design (CAD) system is presented which supports the iterative process of design, the dimensional continuity between mating parts, and the hierarchical structure of the parts in their assembled configuration. Prolog, an interactive logic programming language, is used to represent and interpret the data base. The solid geometry representing the parts is defined in parameterized form using the swept volume method. The system is demonstrated with a design of a spring piston.

  9. Computer-based theory of strategies

    SciTech Connect

    Findler, N.V.

    1983-01-01

    Some of the objectives and working tools of a new area of study, tentatively called theory of strategies, are described. It is based on the methodology of artificial intelligence, decision theory, utility theory, operations research and digital gaming. The latter refers to computing activity that incorporates model building, simulation and learning programs in conflict situations. The author also discusses three long-term projects which aim at automatically analyzing and synthesizing strategies. 27 references.

  10. Computer-assisted engineering data base

    NASA Technical Reports Server (NTRS)

    Dube, R. P.; Johnson, H. R.

    1983-01-01

    General capabilities of data base management technology are described. Information requirements posed by the space station life cycle are discussed, and it is asserted that data base management technology supporting engineering/manufacturing in a heterogeneous hardware/data base management system environment should be applied to meeting these requirements. Today's commercial systems do not satisfy all of these requirements. The features of an R&D data base management system being developed to investigate data base management in the engineering/manufacturing environment are discussed. Features of this system represent only a partial solution to space station requirements. Areas where this system should be extended to meet full space station information management requirements are discussed.

  11. Computing border bases using mutant strategies

    NASA Astrophysics Data System (ADS)

    Ullah, E.; Abbas Khan, S.

    2014-01-01

    Border bases, a generalization of Gröbner bases, have been actively studied in recent years due to their applicability to industrial problems. In cryptography and coding theory a useful application of border bases is to solve zero-dimensional systems of polynomial equations over finite fields, which motivates us to develop optimizations of the algorithms that compute border bases. In 2006, Kehrein and Kreuzer formulated the Border Basis Algorithm (BBA), an algorithm which allows the computation of border bases that relate to a degree-compatible term ordering. In 2007, J. Ding et al. introduced mutant strategies based on finding special lower-degree polynomials in the ideal. The mutant strategies aim to distinguish special lower-degree polynomials (mutants) from the other polynomials and give them priority in the process of generating new polynomials in the ideal. In this paper we develop hybrid algorithms that use the ideas of J. Ding et al. involving the concept of mutants to optimize the Border Basis Algorithm for solving systems of polynomial equations over finite fields. In particular, we recall a version of the Border Basis Algorithm known as the Improved Border Basis Algorithm and propose two hybrid algorithms, called MBBA and IMBBA. The new mutant variants provide both space and time efficiency. The efficiency of these newly developed hybrid algorithms is discussed using standard cryptographic examples.

  12. Quantum One Go Computation and the Physical Computation Level of Biological Information Processing

    NASA Astrophysics Data System (ADS)

    Castagnoli, Giuseppe

    2010-02-01

    By extending the representation of quantum algorithms to problem-solution interdependence, the unitary evolution part of the algorithm entangles the register containing the problem with the register containing the solution. Entanglement becomes correlation, or mutual causality, between the two measurement outcomes: the string of bits encoding the problem and that encoding the solution. In former work, we showed that this is equivalent to the algorithm knowing in advance 50% of the bits of the solution it will find in the future, which explains the quantum speed up. Mutual causality between bits of information is also equivalent to seeing quantum measurement as a many body interaction between the parts of a perfect classical machine whose normalized coordinates represent the qubit populations. This “hidden machine” represents the problem to be solved. The many body interaction (measurement) satisfies all the constraints of a nonlinear Boolean network “together and at the same time”—in one go—thus producing the solution. Quantum one go computation can formalize the physical computation level of the theories that place consciousness in quantum measurement. In fact, in visual perception, we see, thus recognize, thus process, a significant amount of information “together and at the same time”. Identifying the fundamental mechanism of consciousness with that of the quantum speed up gives quantum consciousness, with respect to classical consciousness, a potentially enormous evolutionary advantage.

  13. SACRD: a data base for fast reactor safety computer codes, operational procedures

    SciTech Connect

    Forsberg, V.M.; Arwood, J.W.; Greene, N.M.; Raiford, G.B.

    1980-09-01

    SACRD (Safety Analysis Computerized Reactor Data) is a data base of nondesign-related information used in computer codes for fast reactor safety analyses. This document reports the procedures used in SACRD to help assure a reasonable level of integrity of the material contained in the data base. It also serves to document much of the computer software used with the data base.

  14. A Research Roadmap for Computation-Based Human Reliability Analysis

    SciTech Connect

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  15. Computer-Based Training for Non-Computer Subjects.

    ERIC Educational Resources Information Center

    Blank, George

    1984-01-01

    Discusses computer programs used to teach: engineers to analyze structural stress and strain; salesmen to sell; managers to write memos; musicians to compose; and pilots to use navigational instruments. A list of software on noncomputer subjects is provided; each entry includes title, supplier, format, current price, and computer needed. (JN)

  16. Phonetics Information Base and Lexicon

    ERIC Educational Resources Information Center

    Moran, Steven Paul

    2012-01-01

    In this dissertation, I investigate the linguistic and technological challenges involved in creating a cross-linguistic data set to undertake phonological typology. I then address the question of whether more sophisticated, knowledge-based approaches to data modeling, coupled with a broad cross-linguistic data set, can extend previous typological…

  18. Space-Based Information Services

    NASA Astrophysics Data System (ADS)

    Lee, C.

    With useful data now beginning to flow from earth observation and navigation satellites, it is an active time for the development of space services - all types of satellites are now being put to work, not just Comsats. However, derived products require a blend of innovative software design, low-cost operational support and a real insight into the information needs of the customer. Science Systems is meeting this challenge through a series of ongoing projects, three of which are summarised here (addressing navigation, communications and earth observation). By demonstrating a broad range of related disciplines - from monitoring and control to back-room billing, and from data management to intelligent systems - Science Systems hopes to play a key role in this developing market.

  19. Microcomputer Based School Information Management Systems (SIMS) in Alberta Junior and Senior High Schools. Final Report.

    ERIC Educational Resources Information Center

    Wright, P.; Valbonesi, P.

    This report comprises a detailed evaluation of three IBM microcomputer-based school information management systems: Student Information and Records System (SIRS) by Management Information Group, The School System (TSS) by Columbia Computing Services, and Computer Educational Management Accounting System (CEMAS) by Computerlib. These three systems…

  20. Computational methodologies for compound database searching that utilize experimental protein-ligand interaction information.

    PubMed

    Tan, Lu; Batista, Jose; Bajorath, Jürgen

    2010-09-01

    Ligand- and target structure-based methods are widely used in virtual screening, but there is currently no methodology available that fully integrates these different approaches. Herein, we provide an overview of various attempts that have been made to combine ligand- and structure-based computational screening methods. We then review different types of approaches that utilize protein-ligand interaction information for database screening and filtering. Interaction-based approaches make use of a variety of methodological concepts including pharmacophore modeling and direct or indirect encoding of protein-ligand interactions in fingerprint formats. These interaction-based methods have been successfully applied to tackle different tasks related to virtual screening including postprocessing of docking poses, prioritization of binding modes, selectivity analysis, or similarity searching. Furthermore, we discuss the recently developed interacting fragment approach that indirectly incorporates 3D interaction information into 2D similarity searching and bridges between ligand- and structure-based methods.
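
    A toy sketch of the fingerprint-style encoding of protein-ligand interactions mentioned above: each pose is reduced to a set of (residue, interaction type) pairs, hashed into a fixed-length binary fingerprint, and candidates are ranked by Tanimoto similarity to a reference pose. Residues, interaction types and the hashing scheme are made-up assumptions, not a published interaction-fingerprint format.

      # Hash (residue, interaction-type) pairs into a binary fingerprint and
      # rank database poses by Tanimoto similarity to a reference pose.
      def fingerprint(interactions, bits=64):
          fp = [0] * bits
          for pair in interactions:
              fp[hash(pair) % bits] = 1              # simple, collision-prone hashing
          return fp

      def tanimoto(a, b):
          both = sum(x & y for x, y in zip(a, b))
          either = sum(x | y for x, y in zip(a, b))
          return both / either if either else 0.0

      reference = fingerprint({("ASP93", "hbond"), ("PHE36", "pi-stack"), ("LYS58", "salt-bridge")})
      candidates = {
          "ligand_A": fingerprint({("ASP93", "hbond"), ("PHE36", "pi-stack")}),
          "ligand_B": fingerprint({("GLY12", "hbond")}),
      }
      for name, fp in sorted(candidates.items(), key=lambda kv: -tanimoto(reference, kv[1])):
          print(name, round(tanimoto(reference, fp), 2))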

  1. Modular multiple sensors information management for computer-integrated surgery.

    PubMed

    Vaccarella, Alberto; Enquobahrie, Andinet; Ferrigno, Giancarlo; Momi, Elena De

    2012-09-01

    In the past 20 years, technological advancements have modified the concept of modern operating rooms (ORs) with the introduction of computer-integrated surgery (CIS) systems, which promise to enhance the outcomes, safety and standardization of surgical procedures. With CIS, different types of sensor (mainly position-sensing devices, force sensors and intra-operative imaging devices) are widely used. Recently, the need for a combined use of different sensors raised issues related to synchronization and spatial consistency of data from different sources of information. In this study, we propose a centralized, multi-sensor management software architecture for a distributed CIS system, which addresses sensor information consistency in both space and time. The software was developed as a data server module in a client-server architecture, using two open-source software libraries: Image-Guided Surgery Toolkit (IGSTK) and OpenCV. The ROBOCAST project (FP7 ICT 215190), which aims at integrating robotic and navigation devices and technologies in order to improve the outcome of the surgical intervention, was used as the benchmark. An experimental protocol was designed in order to prove the feasibility of a centralized module for data acquisition and to test the application latency when dealing with optical and electromagnetic tracking systems and ultrasound (US) imaging devices. Our results show that a centralized approach is suitable for minimizing synchronization errors; latency in the client-server communication was estimated to be 2 ms (median value) for tracking systems and 40 ms (median value) for US images. The proposed centralized approach proved to be adequate for neurosurgery requirements. Latency introduced by the proposed architecture does not affect tracking system performance in terms of frame rate and limits US images frame rate at 25 fps, which is acceptable for providing visual feedback to the surgeon in the OR.
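
    A minimal sketch of one aspect of such a centralized data server: pairing each ultrasound frame with the tracking sample whose timestamp is closest, so that streams running at different rates stay temporally consistent. The rates, timestamps and function names are illustrative assumptions, not the ROBOCAST implementation.

      # Pair each (slower) ultrasound frame with the nearest (faster) tracking sample.
      import bisect

      tracking_ts = [i * 0.002 for i in range(5000)]   # tracking samples at ~500 Hz
      us_frame_ts = [i * 0.040 for i in range(250)]    # ultrasound frames at 25 fps

      def nearest_tracking_index(t):
          i = bisect.bisect_left(tracking_ts, t)
          candidates = [j for j in (i - 1, i) if 0 <= j < len(tracking_ts)]
          return min(candidates, key=lambda j: abs(tracking_ts[j] - t))

      pairs = [(t, nearest_tracking_index(t)) for t in us_frame_ts]
      worst_skew = max(abs(tracking_ts[j] - t) for t, j in pairs)
      print(f"max pairing skew: {worst_skew * 1000:.3f} ms")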

  2. UNIVIEW: A computer graphics platform bringing information databases to life

    NASA Astrophysics Data System (ADS)

    Warnstam, J.

    2008-06-01

    Uniview is a PC-based software platform for three-dimensional exploration of the Universe and the visualisation of information that is located at any position in this Universe, be it on the surface of the Earth or many light-years away from home. What began as a collaborative project with the American Museum of Natural History in New York in 2003 has now evolved into one of the leading visualisation platforms for the planetarium and science centre market with customers in both Europe and the USA.

  3. Detecting Soft Errors in Stencil based Computations

    SciTech Connect

    Sharma, V.; Gopalkrishnan, G.; Bronevetsky, G.

    2015-05-06

    Given the growing emphasis on system resilience, it is important to develop software-level error detectors that help trap hardware-level faults with reasonable accuracy while minimizing false alarms as well as the performance overhead introduced. We present a technique that approaches this idea by taking stencil computations as our target, and synthesizing detectors based on machine learning. In particular, we employ linear regression to generate computationally inexpensive models which form the basis for error detection. Our technique has been incorporated into a new open-source library called SORREL. In addition to reporting encouraging experimental results, we demonstrate techniques that help reduce the size of training data. We also discuss the efficacy of various detectors synthesized, as well as our future plans.
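
    A minimal sketch of the underlying idea (not the SORREL library itself): fit a linear regression that predicts each interior cell of a 1-D stencil-like field from its neighbours, then flag cells whose prediction residual is anomalously large as suspected soft errors. The data, the injected fault and the threshold are illustrative assumptions.

      # Learn u[i] ~ a*u[i-1] + b*u[i+1] + c on clean data, then flag large residuals.
      import numpy as np

      rng = np.random.default_rng(0)
      u = np.cumsum(rng.normal(size=1000)) * 0.01          # smooth-ish 1-D field

      X = np.column_stack([u[:-2], u[2:], np.ones(len(u) - 2)])
      coef, *_ = np.linalg.lstsq(X, u[1:-1], rcond=None)   # train the detector

      corrupted = u.copy()
      corrupted[500] += 5.0                                # injected bit-flip-like fault
      Xc = np.column_stack([corrupted[:-2], corrupted[2:], np.ones(len(u) - 2)])
      residual = np.abs(Xc @ coef - corrupted[1:-1])
      threshold = 10 * residual.std()                      # crude anomaly threshold
      print("suspected soft errors near cells:", np.where(residual > threshold)[0] + 1)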

  4. New Trends in Taylor Series Based Computations

    NASA Astrophysics Data System (ADS)

    Kunovský, Jiří; Kraus, Michal; Šátek, Václav

    2009-09-01

    Motto: For the derivatives of all decent functions analytic formulas can be found but with integration this is only true for very special decent functions. The aim of our paper is to describe a new modern numerical method based on the Taylor Series Method and to show how to evaluate the high accuracy and speed of the corresponding computations. It is also the aim of our paper to show how to calculate finite integrals that are a fundamental part of signal processing, especially Fourier analysis, and how to use them for symbolic operations. It is a fact that the accuracy and stability of the algorithms we have designed significantly exceed those of presently known systems. In particular, the paper concentrates, building on previous results and the latest development trends, on the simulation of dynamic systems and on extremely exact mathematical computations.
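
    For the simple test equation y' = y, the Taylor Series Method reduces to a short recurrence for the higher-order terms, as sketched below; the order and step size are assumed values and the code is not the authors' solver.

      # One step of the Taylor Series Method for y' = y: the k-th term obeys
      # DY[k] = h * DY[k-1] / k, so a high-order step is a short loop.
      from math import exp

      def taylor_step(y, h, order=20):
          term, total = y, y
          for k in range(1, order + 1):
              term = h * term / k                  # next Taylor term
              total += term
          return total

      y, h, t = 1.0, 0.1, 0.0
      while t < 1.0 - 1e-12:
          y = taylor_step(y, h)
          t += h
      print(y, exp(1.0))                           # high-order result vs the exact value e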

  5. Base information content in organic formulas

    PubMed

    Graham; Schacht

    2000-07-01

    Three questions are addressed concerning organic formulas at their most primitive level: (1) What is the information per atomic symbol? (2) What is the level of system redundancy? (3) How are high-information formulas distinguished from low-information ones? The results are simple yet interesting. Carbon chemistry embodies a code which is low in base information and high in redundancy, irrespective of database size. Moreover, code units associated with halocarbons, proteins, and polynucleotides are especially high in information. Low-information units are more often associated with simple alkanes, aromatics, and common functional groups. Overall, the work for this paper quantifies the base information content in organic formulas; this contributes to research on symbolic language, chemical information, and molecular diversity.
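
    The sketch below illustrates the kind of quantity the abstract refers to: the Shannon information per atomic symbol of a small set of formulas, and the corresponding redundancy relative to uniform symbol use. The tokenizer (element symbols with expanded counts) and the sample formulas are assumptions, not the authors' coding scheme.

      # Shannon information per atomic symbol and redundancy for a toy formula set.
      import re
      from collections import Counter
      from math import log2

      formulas = ["C2H6", "C6H6", "C2H5OH", "CCl4"]

      symbols = Counter()
      for f in formulas:
          for element, count in re.findall(r"([A-Z][a-z]?)(\d*)", f):
              symbols[element] += int(count) if count else 1

      total = sum(symbols.values())
      H = -sum((n / total) * log2(n / total) for n in symbols.values())
      H_max = log2(len(symbols))                   # upper bound: all symbols equally likely
      print(f"H = {H:.3f} bits/symbol, redundancy = {1 - H / H_max:.2%}")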

  6. Using a Semantic Information Network to Develop Computer Literacy.

    ERIC Educational Resources Information Center

    Denenberg, Stewart A.

    1980-01-01

    Discusses a semantic network of computer literacy topics and a computer implementation (ACCOLADE) for the PLATO IV system. ACCOLADE was used by learners as a search tool for accessing knowledge about a particular topic in the computer literacy knowledge space and as a mechanism to reveal the structure of that knowledge space. References are…

  7. 2.5D dictionary learning based computed tomography reconstruction

    NASA Astrophysics Data System (ADS)

    Luo, Jiajia; Eri, Haneda; Can, Ali; Ramani, Sathish; Fu, Lin; De Man, Bruno

    2016-05-01

    A computationally efficient 2.5D dictionary learning (DL) algorithm is proposed and implemented in the model-based iterative reconstruction (MBIR) framework for low-dose CT reconstruction. MBIR is based on the minimization of a cost function containing data-fitting and regularization terms to control the trade-off between data-fidelity and image noise. Due to the strong denoising performance of DL, it has previously been considered as a regularizer in MBIR, and both 2D and 3D DL implementations are possible. Compared to the 2D case, 3D DL keeps more spatial information and generates images with better quality although it requires more computation. We propose a novel 2.5D DL scheme, which leverages the computational advantage of 2D-DL, while attempting to maintain reconstruction quality similar to 3D-DL. We demonstrate the effectiveness of this new 2.5D DL scheme for MBIR in low-dose CT. By applying the 2D DL method in three different orthogonal planes and calculating the sparse coefficients accordingly, much of the 3D spatial information can be preserved without incurring the computational penalty of the 3D DL method. For performance evaluation, we use baggage phantoms with different numbers of projection views. In order to quantitatively compare the performance of different algorithms, we use PSNR, SSIM and region based standard deviation to measure the noise level, and use the edge response to calculate the resolution. Experimental results with full view datasets show that the different DL based algorithms have similar performance and 2.5D DL has the best resolution. Results with sparse view datasets show that 2.5D DL outperforms both 2D and 3D DL in terms of noise reduction. We also compare the computational costs, and 2.5D DL shows a strong advantage over 3D DL in both full-view and sparse-view cases.

  8. Evolving spectral transformations for multitemporal information extraction using evolutionary computation

    NASA Astrophysics Data System (ADS)

    Momm, Henrique; Easson, Greg

    2011-01-01

    Remote sensing plays an important role in assessing temporal changes in land features. The challenge often resides in the conversion of large quantities of raw data into actionable information in a timely and cost-effective fashion. To address this issue, research was undertaken to develop an innovative methodology integrating biologically-inspired algorithms with standard image classification algorithms to improve information extraction from multitemporal imagery. Genetic programming was used as the optimization engine to evolve feature-specific candidate solutions in the form of nonlinear mathematical expressions of the image spectral channels (spectral indices). The temporal generalization capability of the proposed system was evaluated by addressing the task of building rooftop identification from a set of images acquired at different dates in a cross-validation approach. The proposed system generates robust solutions (kappa values > 0.75 for stage 1 and > 0.4 for stage 2) despite the statistical differences between the scenes caused by land use and land cover changes coupled with variable environmental conditions, and the lack of radiometric calibration between images. Based on our results, the use of nonlinear spectral indices enhanced the spectral differences between features improving the clustering capability of standard classifiers and providing an alternative solution for multitemporal information extraction.

  9. A behavioral biometric system based on human-computer interaction

    NASA Astrophysics Data System (ADS)

    Gamboa, Hugo; Fred, Ana

    2004-08-01

    In this paper we describe a new behavioural biometric technique based on human computer interaction. We developed a system that captures the user interaction via a pointing device, and uses this behavioural information to verify the identity of an individual. Using statistical pattern recognition techniques, we developed a sequential classifier that processes user interaction, according to which the user identity is considered genuine if a predefined accuracy level is achieved, and the user is classified as an impostor otherwise. Two statistical models for the features were tested, namely Parzen density estimation and a unimodal distribution. The system was tested with different numbers of users in order to evaluate the scalability of the proposal. Experimental results show that the normal user interaction with the computer via a pointing device entails behavioural information with discriminating power, that can be explored for identity authentication.
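
    A hedged sketch of the sequential verification idea using a Parzen-window (Gaussian kernel) density model, one of the two statistical models mentioned above: log-likelihoods of successive pointer-stroke feature vectors are accumulated against a flat impostor baseline until an accept or reject threshold is crossed. Features, thresholds and data are invented for illustration, and SciPy is assumed to be available.

      # Parzen-window model of a genuine user's stroke features plus a
      # sequential accept/reject rule. All numbers are illustrative.
      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(3)
      genuine_train = rng.normal(loc=[0.4, 1.2], scale=0.1, size=(300, 2))
      model = gaussian_kde(genuine_train.T)            # Parzen density estimate

      def sequential_verify(strokes, accept=5.0, reject=-5.0, baseline_logpdf=-2.0):
          evidence = 0.0
          for s in strokes:
              evidence += model.logpdf(s)[0] - baseline_logpdf
              if evidence >= accept:
                  return "genuine"
              if evidence <= reject:
                  return "impostor"
          return "undecided"

      genuine_session = rng.normal(loc=[0.4, 1.2], scale=0.1, size=(10, 2))
      impostor_session = rng.normal(loc=[0.9, 0.5], scale=0.1, size=(10, 2))
      print(sequential_verify(genuine_session), sequential_verify(impostor_session))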

  10. Unified framework for information integration based on information geometry

    PubMed Central

    Oizumi, Masafumi; Amari, Shun-ichi

    2016-01-01

    Assessment of causal influences is a ubiquitous and important subject across diverse research fields. Drawn from consciousness studies, integrated information is a measure that defines integration as the degree of causal influences among elements. Whereas pairwise causal influences between elements can be quantified with existing methods, quantifying multiple influences among many elements poses two major mathematical difficulties. First, overestimation occurs due to interdependence among influences if each influence is separately quantified in a part-based manner and then simply summed over. Second, it is difficult to isolate causal influences while avoiding noncausal confounding influences. To resolve these difficulties, we propose a theoretical framework based on information geometry for the quantification of multiple causal influences with a holistic approach. We derive a measure of integrated information, which is geometrically interpreted as the divergence between the actual probability distribution of a system and an approximated probability distribution where causal influences among elements are statistically disconnected. This framework provides intuitive geometric interpretations harmonizing various information theoretic measures in a unified manner, including mutual information, transfer entropy, stochastic interaction, and integrated information, each of which is characterized by how causal influences are disconnected. In addition to the mathematical assessment of consciousness, our framework should help to analyze causal relationships in complex systems in a complete and hierarchical manner. PMID:27930289

  11. Unified framework for information integration based on information geometry.

    PubMed

    Oizumi, Masafumi; Tsuchiya, Naotsugu; Amari, Shun-Ichi

    2016-12-20

    Assessment of causal influences is a ubiquitous and important subject across diverse research fields. Drawn from consciousness studies, integrated information is a measure that defines integration as the degree of causal influences among elements. Whereas pairwise causal influences between elements can be quantified with existing methods, quantifying multiple influences among many elements poses two major mathematical difficulties. First, overestimation occurs due to interdependence among influences if each influence is separately quantified in a part-based manner and then simply summed over. Second, it is difficult to isolate causal influences while avoiding noncausal confounding influences. To resolve these difficulties, we propose a theoretical framework based on information geometry for the quantification of multiple causal influences with a holistic approach. We derive a measure of integrated information, which is geometrically interpreted as the divergence between the actual probability distribution of a system and an approximated probability distribution where causal influences among elements are statistically disconnected. This framework provides intuitive geometric interpretations harmonizing various information theoretic measures in a unified manner, including mutual information, transfer entropy, stochastic interaction, and integrated information, each of which is characterized by how causal influences are disconnected. In addition to the mathematical assessment of consciousness, our framework should help to analyze causal relationships in complex systems in a complete and hierarchical manner.
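
    As a minimal numerical illustration of the geometric picture described above, the snippet below computes mutual information as the KL divergence between a toy joint distribution of two binary variables and its fully "disconnected" approximation (the product of marginals); the joint distribution is an invented example, and the paper's integrated information measure involves a more elaborate disconnected model.

      # Mutual information as D_KL(joint || product of marginals) for a toy system.
      import numpy as np

      p_xy = np.array([[0.4, 0.1],
                       [0.1, 0.4]])              # assumed joint distribution of two binary variables
      p_x = p_xy.sum(axis=1, keepdims=True)
      p_y = p_xy.sum(axis=0, keepdims=True)
      q_xy = p_x * p_y                           # "disconnected" model: influences removed

      mi = np.sum(p_xy * np.log2(p_xy / q_xy))   # KL divergence in bits
      print(f"mutual information = {mi:.4f} bits")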

  12. Bio-Inspired Computing, Information Swarms, and the Problem of Data Fusion

    NASA Astrophysics Data System (ADS)

    Nordmann, Brian

    The problem of information overload becomes a huge challenge, particularly when attempting to understand how to introduce more and more disparate data streams into a data system. Little has been done on how to make those data streams understandable and usable by an analyst. A new paradigm is constructed here, unconstrained by the limits of the current desktop computer, to develop new ways of processing and analyzing data based on the behavior of cellular-scale organisms. The additional issue of analytic "groupthink" or "information swarms" is also addressed, with potential solutions to the problem of "paralysis by analysis."

  13. Psychopathy-related traits and the use of reward and social information: a computational approach.

    PubMed

    Brazil, Inti A; Hunt, Laurence T; Bulten, Berend H; Kessels, Roy P C; de Bruijn, Ellen R A; Mars, Rogier B

    2013-01-01

    Psychopathy is often linked to disturbed reinforcement-guided adaptation of behavior in both clinical and non-clinical populations. Recent work suggests that these disturbances might be due to a deficit in actively using information to guide changes in behavior. However, how much information is actually used to guide behavior is difficult to observe directly. Therefore, we used a computational model to estimate the use of information during learning. Thirty-six female subjects were recruited based on their total scores on the Psychopathic Personality Inventory (PPI), a self-report psychopathy list, and performed a task involving simultaneous learning of reward-based and social information. A Bayesian reinforcement-learning model was used to parameterize the use of each source of information during learning. Subsequently, we used the subscales of the PPI to assess psychopathy-related traits, and the traits that were strongly related to the model's parameters were isolated through a formal variable selection procedure. Finally, we assessed how these covaried with model parameters. We succeeded in isolating key personality traits believed to be relevant for psychopathy that can be related to model-based descriptions of subject behavior. Use of reward-history information was negatively related to levels of trait anxiety and fearlessness, whereas use of social advice decreased as the perceived ability to manipulate others and lack of anxiety increased. These results corroborate previous findings suggesting that sub-optimal use of different types of information might be implicated in psychopathy. They also further highlight the importance of considering the potential of computational modeling to understand the role of latent variables, such as the weight people give to various sources of information during goal-directed behavior, when conducting research on psychopathy-related traits and in the field of forensic psychiatry.

  14. Psychopathy-related traits and the use of reward and social information: a computational approach

    PubMed Central

    Brazil, Inti A.; Hunt, Laurence T.; Bulten, Berend H.; Kessels, Roy P. C.; de Bruijn, Ellen R. A.; Mars, Rogier B.

    2013-01-01

    Psychopathy is often linked to disturbed reinforcement-guided adaptation of behavior in both clinical and non-clinical populations. Recent work suggests that these disturbances might be due to a deficit in actively using information to guide changes in behavior. However, how much information is actually used to guide behavior is difficult to observe directly. Therefore, we used a computational model to estimate the use of information during learning. Thirty-six female subjects were recruited based on their total scores on the Psychopathic Personality Inventory (PPI), a self-report psychopathy list, and performed a task involving simultaneous learning of reward-based and social information. A Bayesian reinforcement-learning model was used to parameterize the use of each source of information during learning. Subsequently, we used the subscales of the PPI to assess psychopathy-related traits, and the traits that were strongly related to the model's parameters were isolated through a formal variable selection procedure. Finally, we assessed how these covaried with model parameters. We succeeded in isolating key personality traits believed to be relevant for psychopathy that can be related to model-based descriptions of subject behavior. Use of reward-history information was negatively related to levels of trait anxiety and fearlessness, whereas use of social advice decreased as the perceived ability to manipulate others and lack of anxiety increased. These results corroborate previous findings suggesting that sub-optimal use of different types of information might be implicated in psychopathy. They also further highlight the importance of considering the potential of computational modeling to understand the role of latent variables, such as the weight people give to various sources of information during goal-directed behavior, when conducting research on psychopathy-related traits and in the field of forensic psychiatry. PMID:24391615

  15. Information Gain Based Dimensionality Selection for Classifying Text Documents

    SciTech Connect

    Dumidu Wijayasekara; Milos Manic; Miles McQueen

    2013-06-01

    Selecting the optimal dimensions for various knowledge extraction applications is an essential component of data mining. Dimensionality selection techniques are utilized in classification applications to increase the classification accuracy and reduce the computational complexity. In text classification, where the dimensionality of the dataset is extremely high, dimensionality selection is even more important. This paper presents a novel genetic algorithm-based methodology for dimensionality selection in text mining applications that utilizes information gain. The presented methodology uses the information gain of each dimension to change the mutation probability of chromosomes dynamically. Since the information gain is calculated a priori, the computational complexity is not affected. The presented method was tested on a specific text classification problem and compared with conventional genetic algorithm-based dimensionality selection. The results show an improvement of 3% in the true positives and 1.6% in the true negatives over conventional dimensionality selection methods.
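
    A small sketch of the central ingredient described above: computing the information gain of each binary term-feature with respect to a binary class label and turning it into per-gene mutation probabilities that favour mutating low-information dimensions. The data, the scaling of the mutation probabilities and the 0.5 ceiling are assumptions, not the paper's exact settings.

      # Information gain per binary feature, then IG-weighted mutation probabilities.
      import numpy as np

      def entropy(p):
          p = p[p > 0]
          return -np.sum(p * np.log2(p))

      def information_gain(x, y):
          h_y = entropy(np.bincount(y, minlength=2) / len(y))
          h_y_given_x = 0.0
          for v in (0, 1):
              mask = x == v
              if mask.any():
                  h_y_given_x += mask.mean() * entropy(np.bincount(y[mask], minlength=2) / mask.sum())
          return h_y - h_y_given_x

      rng = np.random.default_rng(1)
      y = rng.integers(0, 2, size=200)
      noisy_copy = y ^ rng.choice([0, 1], size=200, p=[0.9, 0.1])      # informative term
      noise = rng.integers(0, 2, size=200)                             # uninformative term
      X = np.column_stack([noisy_copy, noise])

      ig = np.array([information_gain(X[:, j], y) for j in range(X.shape[1])])
      mutation_prob = 0.5 * (1.0 - ig / ig.max())                      # high-IG genes mutate less
      print(ig.round(3), mutation_prob.round(3))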

  16. Safeguards instrumentation: a computer-based catalog

    SciTech Connect

    Fishbone, L.G.; Keisch, B.

    1981-08-01

    The information contained in this catalog is needed to provide a data base for safeguards studies and to help establish criteria and procedures for international safeguards for nuclear materials and facilities. The catalog primarily presents information on new safeguards equipment. It also describes entire safeguards systems for certain facilities, but it does not describe the inspection procedures. Because IAEA safeguards do not include physical security, devices for physical protection (as opposed to containment and surveillance) are not included. An attempt has been made to list capital costs, annual maintenance costs, replacement costs, and useful lifetime for the equipment. For equipment which is commercially available, representative sources have been listed whenever available.

  17. Model-based neuroimaging for cognitive computing.

    PubMed

    Poznanski, Roman R

    2009-09-01

    The continuity of the mind is suggested to mean the continuous spatiotemporal dynamics arising from the electrochemical signature of the neocortex: (i) globally through volume transmission in the gray matter as fields of neural activity, and (ii) locally through extrasynaptic signaling between fine distal dendrites of cortical neurons. If the continuity of dynamical systems across spatiotemporal scales defines a stream of consciousness then intentional metarepresentations as templates of dynamic continuity allow qualia to be semantically mapped during neuroimaging of specific cognitive tasks. When interfaced with a computer, such model-based neuroimaging requiring new mathematics of the brain will begin to decipher higher cognitive operations not possible with existing brain-machine interfaces.

  18. Symbolic Computation Using Cellular Automata-Based Hyperdimensional Computing.

    PubMed

    Yilmaz, Ozgur

    2015-12-01

    This letter introduces a novel framework of reservoir computing that is capable of both connectionist machine intelligence and symbolic computation. A cellular automaton is used as the reservoir of dynamical systems. Input is randomly projected onto the initial conditions of automaton cells, and nonlinear computation is performed on the input via application of a rule in the automaton for a period of time. The evolution of the automaton creates a space-time volume of the automaton state space, and it is used as the reservoir. The proposed framework is shown to be capable of long-term memory, and it requires orders of magnitude less computation compared to echo state networks. As the focus of the letter, we suggest that binary reservoir feature vectors can be combined using Boolean operations as in hyperdimensional computing, paving a direct way for concept building and symbolic processing. To demonstrate the capability of the proposed system, we make analogies directly on image data by asking, What is the automobile of air?
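
    A toy sketch of the pipeline described above: a binary input is randomly projected onto the initial row of an elementary cellular automaton (rule 90 is assumed here), the automaton is evolved for a few steps, the flattened space-time volume serves as the reservoir feature vector, and two such binary vectors are bound with XOR as in hyperdimensional computing. Rule choice, sizes and seeds are illustrative assumptions.

      # Cellular-automaton reservoir plus XOR binding of the resulting binary vectors.
      import numpy as np

      CELLS, STEPS = 256, 8
      rng = np.random.default_rng(42)
      projection = rng.integers(0, CELLS, size=32)     # random input-to-cell mapping

      def ca_reservoir(bits):
          state = np.zeros(CELLS, dtype=np.uint8)
          state[projection] = bits                     # random projection of the input
          volume = [state]
          for _ in range(STEPS):
              state = np.roll(state, 1) ^ np.roll(state, -1)   # rule 90 update (periodic boundary)
              volume.append(state)
          return np.concatenate(volume)                # flattened space-time volume

      a = ca_reservoir(rng.integers(0, 2, size=32, dtype=np.uint8))
      b = ca_reservoir(rng.integers(0, 2, size=32, dtype=np.uint8))
      bound = a ^ b                                    # hyperdimensional binding via XOR
      print(bound.shape, int(bound.sum()))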

  19. An object oriented computer-based patient record reference model.

    PubMed Central

    Doré, L.; Lavril, M.; Jean, F. C.; Degoulet, P.

    1995-01-01

    In the context of health care information systems based on client/server architecture, we address the problem of a common Computer-based Patient Record (CPR). We define it as a collection of faithful observations about patient care, with respect to the free expression of physicians. This CPR model supports several views of the medical data, in order to provide applications with a comprehensive and standardized access to distributed patient data. Finally, we validated our CPR approach as a primary data model server for an application for hypertensive patient management. PMID:8563306

  20. Impacts of Personal Characteristics on Computer Attitude and Academic Users Information System Satisfaction.

    ERIC Educational Resources Information Center

    Lim, Kee-Sook

    2002-01-01

    Describes a study that evaluated the effects of computer experience, gender, and academic performance on computer attitude and user information system satisfaction in a university setting. Results of an analysis of variance showed that the personal characteristics made a difference in computer attitudes but not in academic computer system user…

  1. An analysis of computer-related patient safety incidents to inform the development of a classification

    PubMed Central

    Ong, Mei-Sing; Runciman, William; Coiera, Enrico

    2010-01-01

    Objective To analyze patient safety incidents associated with computer use to develop the basis for a classification of problems reported by health professionals. Design Incidents submitted to a voluntary incident reporting database across one Australian state were retrieved and a subset (25%) was analyzed to identify ‘natural categories’ for classification. Two coders independently classified the remaining incidents into one or more categories. Free text descriptions were analyzed to identify contributing factors. Where available medical specialty, time of day and consequences were examined. Measurements Descriptive statistics; inter-rater reliability. Results A search of 42 616 incidents from 2003 to 2005 yielded 123 computer related incidents. After removing duplicate and unrelated incidents, 99 incidents describing 117 problems remained. A classification with 32 types of computer use problems was developed. Problems were grouped into information input (31%), transfer (20%), output (20%) and general technical (24%). Overall, 55% of problems were machine related and 45% were attributed to human–computer interaction. Delays in initiating and completing clinical tasks were a major consequence of machine related problems (70%) whereas rework was a major consequence of human–computer interaction problems (78%). While 38% (n=26) of the incidents were reported to have a noticeable consequence but no harm, 34% (n=23) had no noticeable consequence. Conclusion Only 0.2% of all incidents reported were computer related. Further work is required to expand our classification using incident reports and other sources of information about healthcare IT problems. Evidence based user interface design must focus on the safe entry and retrieval of clinical information and support users in detecting and correcting errors and malfunctions. PMID:20962128

  2. A Unified Computational Architecture for Preprocessing Visual Information in Space and Time.

    NASA Astrophysics Data System (ADS)

    Skrzypek, Josef

    1986-06-01

    The success of autonomous mobile robots depends on the ability to understand continuously changing scenery. Present techniques for analysis of images are not always suitable because, in a sequential paradigm, computation of visual functions based on absolute values of stimuli is inefficient. Important aspects of visual information are encoded in discontinuities of intensity, hence a representation in terms of relative values seems advantageous. We present the computing architecture of a massively parallel vision module which optimizes the detection of relative intensity changes in space and time. Visual information must remain constant despite variation in ambient light level or velocity of target and robot. Constancy can be achieved by normalizing motion and lightness scales. In both cases, basic computation involves a comparison of the center pixels with the context of surrounding values. Therefore, a similar computing architecture, composed of three functionally-different and hierarchically-arranged layers of overlapping operators, can be used for two integrated parts of the module. The first part maintains high sensitivity to spatial changes by reducing noise and normalizing the lightness scale. The result is used by the second part to maintain high sensitivity to temporal discontinuities and to compute relative motion information. Simulation results show that response of the module is proportional to contrast of the stimulus and remains constant over the whole domain of intensity. It is also proportional to velocity of motion limited to any small portion of the visual field. Uniform motion throughout the visual field results in constant response, independent of velocity. Spatial and temporal intensity changes are enhanced because, computationally, the module resembles the behavior of a DOG function.
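
    A much-simplified sketch of a single center-surround stage in the spirit of the module described above: a difference-of-Gaussians response divided by the local (surround) luminance, so the output tracks relative rather than absolute intensity. The sigmas and the synthetic test signal are assumptions, SciPy is assumed to be available, and this is not the paper's three-layer architecture.

      # Center-surround response with divisive normalization; the same edge at two
      # ambient light levels produces (nearly) the same output.
      import numpy as np
      from scipy.ndimage import gaussian_filter

      def center_surround(image, sigma_c=1.0, sigma_s=3.0, eps=1e-6):
          center = gaussian_filter(image, sigma_c)
          surround = gaussian_filter(image, sigma_s)
          return (center - surround) / (surround + eps)

      x = np.linspace(0, 1, 128)
      edge = (x > 0.5).astype(float)
      dim = 10.0 * (1.0 + edge)[np.newaxis, :]         # low ambient light
      bright = 100.0 * (1.0 + edge)[np.newaxis, :]     # high ambient light
      print(np.allclose(center_surround(dim), center_surround(bright), atol=1e-3))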

  3. Computationally Informed Design of a Multi-Axial Actuated Microfluidic Chip Device.

    PubMed

    Gizzi, Alessio; Giannitelli, Sara Maria; Trombetta, Marcella; Cherubini, Christian; Filippi, Simonetta; De Ninno, Adele; Businaro, Luca; Gerardino, Annamaria; Rainer, Alberto

    2017-07-14

    This paper describes the computationally informed design and experimental validation of a microfluidic chip device with multi-axial stretching capabilities. The device, based on PDMS soft-lithography, consisted of a thin porous membrane, mounted between two fluidic compartments, and tensioned via a set of vacuum-driven actuators. A finite element analysis solver implementing a set of different nonlinear elastic and hyperelastic material models was used to drive the design and optimization of chip geometry and to investigate the resulting deformation patterns under multi-axial loading. Computational results were cross-validated by experimental testing of prototypal devices featuring the in silico optimized geometry. The proposed methodology represents a suite of computationally handy simulation tools that might find application in the design and in silico mechanical characterization of a wide range of stretchable microfluidic devices.

  4. An Overview of Computer-Based Natural Language Processing.

    ERIC Educational Resources Information Center

    Gevarter, William B.

    Computer-based Natural Language Processing (NLP) is the key to enabling humans and their computer-based creations to interact with machines using natural languages (English, Japanese, German, etc.) rather than formal computer languages. NLP is a major research area in the fields of artificial intelligence and computational linguistics. Commercial…

  5. Biomedical signals monitoring based in mobile computing.

    PubMed

    Serigioli, Nilton; Reina Munoz, Rodrigo; Rodriguez, Edgar Charry

    2010-01-01

    The main objective of this project is the development of a biomedical instrumentation prototype for acquisition, processing and transmission of biomedical signals. These biomedical signals are acquired and then processed with a microcontroller. After processing, all data are sent to a communication interface that can send this information to a personal computer or a cell phone. The prototype developed, which is a digital blood pressure meter, is intended to allow remote monitoring of patients living in areas with limited access to medical assistance or scarce clinical resources. We believe that this development could help to improve people's quality of life, as well as to allow an improvement in the government attendance indices.

  6. Information-time based futures pricing

    NASA Astrophysics Data System (ADS)

    Yen, Simon; Wang, Jai Jen

    2009-09-01

    This study follows Clark [P.K. Clark, A subordinated stochastic process model with finite variance for speculative prices, Econometrica 41 (1973) 135-155] and Chang, Chang and Lim [C.W. Chang, S.K. Chang, K.G. Lim, Information-time option pricing: Theory and empirical evidence, Journal of Financial Economics 48 (1998) 211-242] to subordinate an information-time based directing process into calendar-time based parent processes. A closed-form futures pricing formula is derived after taking into account the information-time setting and the stochasticity of the spot price, interest rate, and convenience yield. According to the empirical results on the TAIEX and TFETX data from 1998/7/21 to 2003/12/31, the information-time based model performs better than its calendar-time based counterpart and the cost of carry model, especially when the information arrival intensity estimates become larger.
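
    Two ingredients frame the comparison reported above. The cost-of-carry benchmark prices the futures contract directly from the spot under constant carrying costs,

      F_t = S_t \, e^{(r - y)(T - t)},

    while the information-time approach follows Clark's subordination idea and evaluates the spot as a calendar-time parent process read on a stochastic clock,

      S_t = X_{T(t)},

    where X is the parent process and T(t) is the information-time directing process counting information arrivals. The closed-form formula derived in the paper additionally allows the interest rate r and the convenience yield y to be stochastic, so the expression above is only the constant-parameter special case.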

  7. Computer Center (VAXcluster Libraries/DTNSRDC (Commands and General Information).

    DTIC Science & Technology

    1986-05-01

    [Only OCR fragments of the report documentation page are recoverable: the keyword field lists "Computer ... documentation", and the abstract begins "The Computer Center DEC VAXcluster Libraries".]

  8. NLM Evidence-based Information at Your Fingertips - NBNA

    SciTech Connect

    Womble, R.

    2010-08-06

    The workshop "National Library of Medicine: Evidence-based Information At Your Fingertips" is a computer training class designed to meet the needs of nurses who require access to information on specific medical topics and on the adverse health effects of exposure to hazardous substances. The Specialized Information Services Division of the National Library of Medicine (NLM) is sponsoring this workshop for the National Black Nurses Association to increase health professionals' awareness of the availability and value of the free NLM medical, environmental health, and toxicology databases.

  9. Performance of four computer-based diagnostic systems.

    PubMed

    Berner, E S; Webster, G D; Shugerman, A A; Jackson, J R; Algina, J; Baker, A L; Ball, E V; Cobbs, C G; Dennis, V W; Frenkel, E P

    1994-06-23

    Computer-based diagnostic systems are available commercially, but there has been limited evaluation of their performance. We assessed the diagnostic capabilities of four internal medicine diagnostic systems: Dxplain, Iliad, Meditel, and QMR. Ten expert clinicians created a set of 105 diagnostically challenging clinical case summaries involving actual patients. Clinical data were entered into each program with the vocabulary provided by the program's developer. Each of the systems produced a ranked list of possible diagnoses for each patient, as did the group of experts. We calculated scores on several performance measures for each computer program. No single computer program scored better than the others on all performance measures. Among all cases and all programs, the proportion of correct diagnoses ranged from 0.52 to 0.71, and the mean proportion of relevant diagnoses ranged from 0.19 to 0.37. On average, less than half the diagnoses on the experts' original list of reasonable diagnoses were suggested by any of the programs. However, each program suggested an average of approximately two additional diagnoses per case that the experts found relevant but had not originally considered. The results provide a profile of the strengths and limitations of these computer programs. The programs should be used by physicians who can identify and use the relevant information and ignore the irrelevant information that can be produced.
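
    The two headline measures reported, the proportion of correct diagnoses and the mean proportion of relevant diagnoses, can be reproduced from ranked output lists with a few lines of Python; the data structures below are hypothetical stand-ins for the study's case summaries, not its actual data.

      # For each case: did the program's ranked list contain the correct diagnosis, and
      # what fraction of its suggestions did the expert panel judge relevant?
      def score_program(cases):
          """cases: list of dicts with keys 'suggested' (list), 'correct' (str), 'relevant' (set)."""
          correct_hits, relevant_fracs = [], []
          for case in cases:
              suggested = case["suggested"]
              correct_hits.append(case["correct"] in suggested)
              judged_relevant = sum(1 for d in suggested if d in case["relevant"])
              relevant_fracs.append(judged_relevant / len(suggested) if suggested else 0.0)
          n = len(cases)
          return sum(correct_hits) / n, sum(relevant_fracs) / n

      # proportion_correct, mean_proportion_relevant = score_program(case_list)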

  10. Instrumentation for Scientific Computing in Neural Networks, Information Science, Artificial Intelligence, and Applied Mathematics.

    DTIC Science & Technology

    1987-10-01

    This was an instrumentation grant to purchase equipment in support of research in neural networks, information science, artificial intelligence, and applied mathematics. Computer lab equipment, motor control and robotics lab equipment, speech analysis equipment, and computational vision equipment were purchased.

  11. 76 FR 37111 - Access to Confidential Business Information by Computer Sciences Corporation and Its Identified...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-24

    ... From the Federal Register Online via the Government Publishing Office ENVIRONMENTAL PROTECTION AGENCY Access to Confidential Business Information by Computer Sciences Corporation and Its Identified... contractor, Computer Sciences Corporation of Chantilly, VA and Its Identified Subcontractors, to access...

  12. Developing Visualization Techniques for Semantics-based Information Networks

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Hall, David R.

    2003-01-01

    Information systems incorporating complex network-structured information spaces with a semantic underpinning - such as hypermedia networks, semantic networks, topic maps, and concept maps - are being deployed to solve some of NASA's critical information management problems. This paper describes some of the human interaction and navigation problems associated with complex semantic information spaces and describes a set of new visual interface approaches to address these problems. A key strategy is to leverage the semantic knowledge represented within these information spaces to construct abstractions and views that will be meaningful to the human user. Human-computer interaction methodologies will guide the development and evaluation of these approaches, which will benefit deployed NASA systems and also apply to information systems based on the emerging Semantic Web.
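
    One way to picture the kind of semantically typed information space described above is a small graph whose edges carry relation types, from which task-specific views can be filtered; the class and relation names below are illustrative assumptions, not the schema of the deployed NASA systems.

      # Minimal typed-edge graph: because relations are first-class, a "view" is simply
      # the subgraph induced by the relation types the user currently cares about.
      from collections import defaultdict

      class SemanticNetwork:
          def __init__(self):
              self.edges = defaultdict(list)          # node -> [(relation, target), ...]

          def relate(self, source, relation, target):
              self.edges[source].append((relation, target))

          def view(self, relations):
              """Return only the edges whose relation type is in the requested set."""
              return {node: [(r, t) for (r, t) in out if r in relations]
                      for node, out in self.edges.items()}

      net = SemanticNetwork()
      net.relate("Mission", "hasInstrument", "Spectrometer")
      net.relate("Mission", "documentedBy", "DesignReport")
      print(net.view({"hasInstrument"}))    # one abstraction over the full network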

  14. Analyzing Log Files to Predict Students' Problem Solving Performance in a Computer-Based Physics Tutor

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2015-01-01

    This study investigates whether information saved in the log files of a computer-based tutor can be used to predict the problem solving performance of students. The log files of a computer-based physics tutoring environment called Andes Physics Tutor were analyzed to build a logistic regression model that predicted success and failure of students'…
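
    A minimal sketch of the modeling step described, fitting a logistic regression to features extracted from tutor log files, might look like the following; the feature names, toy data, and use of scikit-learn are assumptions for illustration, since the abstract does not publish the exact pipeline.

      # Predict problem-solving success from log-derived features such as hints
      # requested, attempts made, and time spent (feature names are hypothetical).
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split

      X = np.array([[0, 1, 45], [3, 5, 300], [1, 2, 90],
                    [4, 6, 420], [0, 1, 60], [2, 4, 240]])   # hints, attempts, seconds
      y = np.array([1, 0, 1, 0, 1, 0])                       # 1 = solved, 0 = not solved

      X_train, X_test, y_train, y_test = train_test_split(
          X, y, test_size=0.33, random_state=0, stratify=y)
      model = LogisticRegression().fit(X_train, y_train)
      print("held-out accuracy:", model.score(X_test, y_test))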

  16. Ultrasound-based liver computer assisted surgery.

    PubMed

    Windyga, P; Hiransakolwong, N; Vu, K; Medina, R; Onik, G

    2004-01-01

    Ongoing research toward the development of a computer-assisted, ultrasound-based software/hardware tool to improve instrument positioning in moving organs during minimally invasive abdominal surgery is presented. The main objective of this research is to calculate, in real time and without user intervention, the pre-/intra-operative 3D/2D image misalignment due to patient respiration and the shift induced by the surgical instrument. Our methodology, applied to the particular case of the liver, and partial results related to the image registration approach, which is based on organ segmentation and shape description, are presented. Preliminary results are highly encouraging. Among other benefits, use of this tool will increase surgeon confidence and improve surgical outcomes.
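
    As a rough point of comparison rather than a rendering of the authors' method, the translational component of a 2D misalignment between a pre-operative slice and an intra-operative ultrasound frame can be estimated by phase correlation of their segmented organ masks; the paper's registration approach, based on organ segmentation and shape description, is more general than this sketch.

      # Estimate the (dy, dx) shift that best aligns two binary organ masks of equal size.
      import numpy as np

      def translation_offset(mask_pre, mask_intra):
          F1, F2 = np.fft.fft2(mask_pre), np.fft.fft2(mask_intra)
          cross_power = F1 * np.conj(F2)
          cross_power /= np.abs(cross_power) + 1e-12      # normalize to phase only
          corr = np.abs(np.fft.ifft2(cross_power))
          dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
          h, w = mask_pre.shape
          # Wrap offsets into signed ranges so large indices map to negative shifts.
          return (dy - h if dy > h // 2 else dy, dx - w if dx > w // 2 else dx)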

  17. Computational algebraic topology-based video restoration

    NASA Astrophysics Data System (ADS)

    Rochel, Alban; Ziou, Djemel; Auclair-Fortier, Marie-Flavie

    2005-03-01

    This paper presents a scheme for video denoising by diffusion of gray levels, based on the Computational Algebraic Topology (CAT) image model. The diffusion approach is similar to the one used to denoise static images. Rather than using the heat-transfer partial differential equation, discretizing it, and solving it by a purely mathematical process, the CAT approach considers the global expression of heat transfer and decomposes it into elementary physical laws. Some of these laws describe conservative relations, leading to error-free expressions, whereas others depend on metric quantities and require approximation. This scheme allows a physical interpretation of each step of the resolution process. We propose nonlinear and anisotropic diffusion algorithms based on extending an existing 2D algorithm to video, which the flexibility of the topological support makes possible. Finally, the scheme is validated with experimental results.
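
    For reference, the conventional (non-CAT) formulation alluded to above writes the heat-transfer law for the gray levels I(x, y, t) globally as a partial differential equation,

      \frac{\partial I}{\partial t} = \nabla \cdot \big( c(\lVert \nabla I \rVert) \, \nabla I \big),

    with a conductivity c that decreases across strong gradients so that edges are preserved. The CAT scheme instead decomposes the same heat-transfer relation into elementary conservative and constitutive laws on the topological support, which is what permits the physical interpretation of each resolution step.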

  18. A computer-oriented system for assembling and displaying land management information

    Treesearch

    Elliot L. Amidon

    1964-01-01

    Maps contain information basic to land management planning. By transforming conventional map symbols into numbers which are punched into cards, the land manager can have a computer assemble and display information required for a specific job. He can let a computer select information from several maps, combine it with such nonmap data as treatment cost or benefit per...

  19. American Bar Association—Computer Law Division Legal Protection for the Value of Information in Computer Systems

    PubMed Central

    Ochs, Laurance J.

    1983-01-01

    The U.S. legal structure for protecting intellectual property comprises patent, copyright, and trade secret concepts. Whether applied to self-contained desktop models, mainframes, or national networks, that structure is finding awkward application in protecting the value of information in computer systems. The Computer Law Division of the Science and Technology Section of the American Bar Association has developed a project to analyze how the economic value of information in computer systems can be provided for and protected. The goal of the project is a comprehensive and authoritative analysis of all aspects of U.S. legal protections for the value of information in computer systems. Such an analysis has not been attempted before and, if successful, could have a major impact on the understanding of the bar and bench, as well as legislators, of the problems our present legal system presents for our infant information economy.

  20. Thermodynamic cost of computation, algorithmic complexity and the information metric

    NASA Technical Reports Server (NTRS)

    Zurek, W. H.

    1989-01-01

    Algorithmic complexity is discussed as a computational counterpart to the second law of thermodynamics. It is shown that algorithmic complexity, which is a measure of randomness, sets limits on the thermodynamic cost of computations and casts a new light on the limitations of Maxwell's demon. Algorithmic complexity can also be used to define distance between binary strings.
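
    A compact way to state the kind of limit referred to above, paraphrased here rather than quoted from the paper, combines Landauer's principle with algorithmic complexity: resetting a memory record s at temperature T costs at least

      W \gtrsim K(s) \, k_B T \ln 2,

    where K(s) is the algorithmic (Kolmogorov) complexity of s and k_B is Boltzmann's constant. For incompressible records the bound reduces to the familiar k_B T \ln 2 per bit, which is why an algorithmically complex memory limits what Maxwell's demon can gain from its measurements.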