Science.gov

Sample records for head-controlled human-computer interface

  1. Human-computer interface

    DOEpatents

    Anderson, Thomas G.

    2004-12-21

    The present invention provides a method of human-computer interfacing. Force feedback allows intuitive navigation and control near a boundary between regions in a computer-represented space. For example, the method allows a user to interact with a virtual craft, then push through the windshield of the craft to interact with the virtual world surrounding the craft. As another example, the method allows a user to feel transitions between different control domains of a computer representation of a space. The method can provide for force feedback that increases as a user's locus of interaction moves near a boundary, then perceptibly changes (e.g., abruptly drops or changes direction) when the boundary is traversed.
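
    The patent abstract above describes feedback force that rises as the locus of interaction approaches a boundary and changes perceptibly once it is crossed. The sketch below is a minimal illustration of such a force profile; the function name, the linear ramp, and all constants are assumptions for illustration, not the patented method.

```python
def boundary_force(distance, side, k=2.0, ramp=0.05, f_inside=0.2):
    """Illustrative force profile near a boundary between control regions.

    distance : unsigned distance from the interaction point to the boundary
    side     : +1 before the boundary is traversed, -1 after it is traversed
    k        : peak resisting force (N) as the boundary is approached
    ramp     : distance over which the force ramps up
    f_inside : small residual force felt after the boundary has been crossed
    """
    if side > 0:
        # Force grows as the user's locus of interaction nears the boundary.
        return k * (1.0 - distance / ramp) if distance < ramp else 0.0
    # Perceptible change: the force drops abruptly once the boundary is crossed.
    return f_inside

# Example: approaching, then pushing through, the windshield of a virtual craft.
for d, s in [(0.10, +1), (0.04, +1), (0.01, +1), (0.01, -1)]:
    print(f"d={d:.2f} side={s:+d} -> force {boundary_force(d, s):.2f} N")
```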

  2. Human-computer interface design

    SciTech Connect

    Bowser, S.E.

    1995-04-01

    Modern military forces assume that computer-based information is reliable, timely, available, usable, and shared. The importance of computer-based information is based on the assumption that "shared situation awareness, coupled with the ability to conduct continuous operations, will allow information age armies to observe, decide, and act faster, more correctly and more precisely than their enemies" (Sullivan and Dubik 1994). Human-Computer Interface (HCI) design standardization is critical to the realization of the previously stated assumptions. Given that a key factor of a high-performance, high-reliability system is an easy-to-use, effective design of the interface between the hardware, software, and the user, it follows logically that the interface between the computer and the military user is critical to the success of the information-age military. The proliferation of computer technology has resulted in the development of an extensive variety of computer-based systems and the implementation of varying HCI styles on these systems. To accommodate the continued growth in computer-based systems, minimize HCI diversity, and improve system performance and reliability, the U.S. Department of Defense (DoD) is continuing to adopt interface standards for developing computer-based systems.

  3. Formal specification of human-computer interfaces

    NASA Technical Reports Server (NTRS)

    Auernheimer, Brent

    1990-01-01

    A high-level formal specification of a human computer interface is described. Previous work is reviewed and the ASLAN specification language is described. Top-level specifications written in ASLAN for a library and a multiwindow interface are discussed.

  4. Human computer interface guide, revision A

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Human Computer Interface Guide, SSP 30540, is a reference document for the information systems within the Space Station Freedom Program (SSFP). The Human Computer Interface Guide (HCIG) provides guidelines for the design of computer software that affects human performance, specifically, the human-computer interface. This document contains an introduction and subparagraphs on SSFP computer systems, users, and tasks; guidelines for interactions between users and the SSFP computer systems; human factors evaluation and testing of the user interface system; and example specifications. The contents of this document are intended to be consistent with the tasks and products to be prepared by NASA Work Package Centers and SSFP participants as defined in SSP 30000, Space Station Program Definition and Requirements Document. The Human Computer Interface Guide shall be implemented on all new SSFP contractual and internal activities and shall be included in any existing contracts through contract changes. This document is under the control of the Space Station Control Board, and any changes or revisions will be approved by the deputy director.

  5. Human/Computer Interfacing in Educational Environments.

    ERIC Educational Resources Information Center

    Sarti, Luigi

    1992-01-01

    This discussion of educational applications of user interfaces covers the benefits of adopting database techniques in organizing multimedia materials; the evolution of user interface technology, including teletype interfaces, analogic overlay graphics, window interfaces, and adaptive systems; application design problems, including the…

  6. Learning Machine, Vietnamese Based Human-Computer Interface.

    ERIC Educational Resources Information Center

    Northwest Regional Educational Lab., Portland, OR.

    The sixth session of IT@EDU98 consisted of seven papers on the topic of the learning machine--Vietnamese based human-computer interface, and was chaired by Phan Viet Hoang (Informatics College, Singapore). "Knowledge Based Approach for English Vietnamese Machine Translation" (Hoang Kiem, Dinh Dien) presents the knowledge base approach,…

  7. Portable human/computer interface mounted in eyewear

    NASA Astrophysics Data System (ADS)

    Spitzer, Mark B.; Aquilino, P. D.; Olson, Mark H.; McClelland, Robert W.; Rensing, Noa M.

    1998-08-01

    This paper presents results on the development of an eyeglass-based human/computer interface. The interface comprises a display mounted within the eyeglasses, and a lens for relaying information inconspicuously to the wearer's eye. The paper will discuss eyeglass interface systems that utilize miniature displays and magnifying optics to provide a field of view of up to 10 degrees, with a resolution of approximately 0.03 degrees per pixel. Details of the design and construction of such systems, including methods of addressing the need for prescriptive correction, will be presented. The paper concludes with comments on adding other new features to the interface system.

  8. Assessment of a human computer interface prototyping environment

    NASA Technical Reports Server (NTRS)

    Moore, Loretta A.

    1993-01-01

    A Human Computer Interface (HCI) prototyping environment with embedded evaluation capability has been successfully assessed; it will be valuable in developing and refining HCI standards and evaluating program/project interface development, especially Space Station Freedom on-board displays for payload operations. The HCI prototyping environment is designed to include four components: (1) an HCI format development tool, (2) a test and evaluation simulator development tool, (3) a dynamic, interactive interface between the HCI prototype and simulator, and (4) an embedded evaluation capability to evaluate the adequacy of an HCI based on a user's performance.

  9. Human-computer interface incorporating personal and application domains

    DOEpatents

    Anderson, Thomas G [Albuquerque, NM

    2011-03-29

    The present invention provides a human-computer interface. The interface includes provision of an application domain, for example corresponding to a three-dimensional application. The user is allowed to navigate and interact with the application domain. The interface also includes a personal domain, offering the user controls and interaction distinct from the application domain. The separation into two domains allows the most suitable interface methods in each: for example, three-dimensional navigation in the application domain, and two- or three-dimensional controls in the personal domain. Transitions between the application domain and the personal domain are under control of the user, and the transition method is substantially independent of the navigation in the application domain. For example, the user can fly through a three-dimensional application domain, and always move to the personal domain by moving a cursor near one extreme of the display.
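
    As an illustration of the domain-transition idea described above (moving the cursor near one extreme of the display to reach the personal domain), the following sketch shows one possible rule; the edge chosen, the margin, and the function name are assumptions for illustration, not the patent's design.

```python
def active_domain(cursor_x, cursor_y, width, height, margin=0.05):
    """Return the domain that should receive input for a given cursor position.

    A band along one extreme of the display (the top edge here, an assumption
    made for illustration) is reserved for the personal domain; everywhere
    else the three-dimensional application domain stays active.
    """
    if cursor_y < margin * height:
        return "personal"
    return "application"

print(active_domain(512, 10, 1024, 768))    # near the top edge -> 'personal'
print(active_domain(512, 400, 1024, 768))   # elsewhere         -> 'application'
```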

  10. Human-computer interface incorporating personal and application domains

    DOEpatents

    Anderson, Thomas G.

    2004-04-20

    The present invention provides a human-computer interface. The interface includes provision of an application domain, for example corresponding to a three-dimensional application. The user is allowed to navigate and interact with the application domain. The interface also includes a personal domain, offering the user controls and interaction distinct from the application domain. The separation into two domains allows the most suitable interface methods in each: for example, three-dimensional navigation in the application domain, and two- or three-dimensional controls in the personal domain. Transitions between the application domain and the personal domain are under control of the user, and the transition method is substantially independent of the navigation in the application domain. For example, the user can fly through a three-dimensional application domain, and always move to the personal domain by moving a cursor near one extreme of the display.

  11. Human-computer interface including haptically controlled interactions

    DOEpatents

    Anderson, Thomas G.

    2005-10-11

    The present invention provides a method of human-computer interfacing that provides haptic feedback to control interface interactions such as scrolling or zooming within an application. Haptic feedback in the present method allows the user more intuitive control of the interface interactions, and allows the user's visual focus to remain on the application. The method comprises providing a control domain within which the user can control interactions. For example, a haptic boundary can be provided corresponding to scrollable or scalable portions of the application domain. The user can position a cursor near such a boundary, feeling its presence haptically (reducing the requirement for visual attention for control of scrolling of the display). The user can then apply force relative to the boundary, causing the interface to scroll the domain. The rate of scrolling can be related to the magnitude of applied force, providing the user with additional intuitive, non-visual control of scrolling.
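
    A minimal sketch of the force-to-scroll-rate mapping described above: scrolling speed grows with the magnitude of force applied against the haptic boundary. The threshold, gain, and cap are illustrative values, not those of the patented interface.

```python
def scroll_rate(force, threshold=0.5, gain=120.0, max_rate=600.0):
    """Map force applied against a haptic boundary to a scroll rate.

    Below the threshold the display does not scroll; above it the scroll
    rate (pixels/s) grows with the magnitude of the applied force, up to
    a cap. All constants here are illustrative placeholders.
    """
    excess = max(0.0, abs(force) - threshold)
    rate = min(gain * excess, max_rate)
    return rate if force >= 0 else -rate

for f in (0.2, 0.8, 2.0, -1.5):
    print(f"force {f:+.1f} N -> scroll {scroll_rate(f):+.1f} px/s")
```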

  12. Human-computer interface controlled by the lip.

    PubMed

    Jose, Marcelo Archajo; de Deus Lopes, Roseli

    2015-01-01

    The lip control system is an innovative human-computer interface specially designed for people with tetraplegia. This paper presents an evaluation of the lower lip's potential to control an input device, according to Fitts' law (ISO/TS 9241-411:2012 standard). The results show that the lower lip throughput is comparable with the thumb throughput using the same input device under the same conditions. These results establish the baseline for future research studies about the lower lip's capacity to operate a computer input device.
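
    For readers unfamiliar with the Fitts' law throughput measure referenced above, the sketch below computes throughput from target distance, target width, and movement time. It uses the nominal index of difficulty; the ISO 9241-411 procedure additionally uses the effective target width derived from endpoint scatter, which is omitted here for brevity.

```python
import numpy as np

def throughput(distances, widths, movement_times):
    """Fitts' law throughput (bits/s) from multidirectional pointing data.

    Uses the nominal index of difficulty ID = log2(D/W + 1); the ISO
    standard works with an *effective* width derived from endpoint
    scatter, which is omitted in this sketch.
    """
    d = np.asarray(distances, float)
    w = np.asarray(widths, float)
    mt = np.asarray(movement_times, float)   # seconds
    ids = np.log2(d / w + 1.0)               # bits
    return float(np.mean(ids / mt))          # bits per second

# Hypothetical trials: target distance (px), width (px), movement time (s)
print(throughput([400, 400, 200], [40, 20, 40], [0.9, 1.2, 0.7]))
```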

  13. Human-computer interface glove using flexible piezoelectric sensors

    NASA Astrophysics Data System (ADS)

    Cha, Youngsu; Seo, Jeonggyu; Kim, Jun-Sik; Park, Jung-Min

    2017-05-01

    In this note, we propose a human-computer interface glove based on flexible piezoelectric sensors. We select polyvinylidene fluoride as the piezoelectric material for the sensors because of advantages such as a steady piezoelectric characteristic and good flexibility. The sensors are installed in a fabric glove by means of pockets and Velcro bands. We detect changes in the angles of the finger joints from the outputs of the sensors, and use them for controlling a virtual hand that is utilized in virtual object manipulation. To assess the sensing ability of the piezoelectric sensors, we compare the processed angles from the sensor outputs with the real angles from a camera recording. With good agreement between the processed and real angles, we successfully demonstrate the user interaction system with the virtual hand and interface glove based on the flexible piezoelectric sensors, for four hand motions: fist clenching, pinching, touching, and grasping.
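
    The sketch below illustrates one plausible way to recover joint-angle changes from a flexible piezoelectric sensor, as in the glove described above: since a PVDF film responds to the rate of bending, its output is integrated over time. The gain and the synthetic signal are assumptions; the note's actual signal processing is not given in the abstract.

```python
import numpy as np

def joint_angle_from_piezo(voltage, fs, gain=50.0):
    """Very rough sketch: estimate a finger-joint angle trajectory from the
    output of a flexible piezoelectric (PVDF) sensor.

    The film responds to the *rate* of bending, so the angle is approximated
    by numerically integrating the sensor voltage. In practice drift removal,
    filtering, and a per-user calibration of `gain` (degrees per volt-second)
    would be required; the value here is made up for illustration.
    """
    v = np.asarray(voltage, float)
    return np.cumsum(v) / fs * gain

# Synthetic example: the sensor outputs ~1 V while the joint flexes for 0.5 s,
# then returns to zero once the finger holds its new posture.
fs = 200.0
t = np.arange(0.0, 2.0, 1.0 / fs)
v = np.where((t > 0.5) & (t < 1.0), 1.0, 0.0)
angle = joint_angle_from_piezo(v, fs)
print(f"estimated final flexion: {angle[-1]:.1f} degrees")
```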

  14. A low cost human computer interface based on eye tracking.

    PubMed

    Hiley, Jonathan B; Redekopp, Andrew H; Fazel-Rezai, Reza

    2006-01-01

    This paper describes the implementation of a human computer interface based on eye tracking. Commercially available systems exist, but have limited use due mainly to their large cost. The system described in this paper was designed to be low cost and unobtrusive. The technique was video-oculography assisted by corneal reflections. An off-the-shelf CCD webcam was used to capture images. The images were analyzed in software to extract key features of the eye. The user's gaze point was then calculated based on the relative positions of these features. The system is capable of calculating eye-gaze in real time to provide a responsive interaction. A throughput of eight gaze points per second was achieved. The accuracy of the fixations based on the calculated eye-gazes was within 1 cm of the on-screen gaze location. By developing a low-cost system, this technology is made accessible to a wider range of applications.
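
    A minimal sketch of the gaze-estimation step described above: the vector from the corneal reflection to the pupil centre is mapped to a screen coordinate through a calibration fit. An affine map is used here for brevity; the paper's actual mapping and calibration routine are not specified in the abstract.

```python
import numpy as np

def calibrate(feature_vectors, screen_points):
    """Fit an affine map from pupil-minus-glint vectors to on-screen gaze
    coordinates using a short calibration sequence. A second-order
    polynomial map is common in practice; an affine map keeps this short."""
    f = np.asarray(feature_vectors, float)          # shape (n, 2)
    s = np.asarray(screen_points, float)            # shape (n, 2)
    A = np.hstack([f, np.ones((len(f), 1))])        # rows: [dx, dy, 1]
    coeffs, *_ = np.linalg.lstsq(A, s, rcond=None)  # shape (3, 2)
    return coeffs

def gaze_point(feature_vector, coeffs):
    """Map one pupil-minus-glint vector to a screen coordinate."""
    dx, dy = feature_vector
    return np.array([dx, dy, 1.0]) @ coeffs

# Hypothetical calibration on four fixation targets at the screen corners.
features = [(-10, -6), (10, -6), (10, 6), (-10, 6)]
targets = [(100, 100), (900, 100), (900, 700), (100, 700)]
C = calibrate(features, targets)
print(gaze_point((0, 0), C))   # roughly the screen centre
```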

  15. Designers' models of the human-computer interface

    NASA Technical Reports Server (NTRS)

    Gillan, Douglas J.; Breedin, Sarah D.

    1993-01-01

    Understanding design models of the human-computer interface (HCI) may produce two types of benefits. First, interface development often requires input from two different types of experts: human factors specialists and software developers. Given the differences in their backgrounds and roles, human factors specialists and software developers may have different cognitive models of the HCI. Yet, they have to communicate about the interface as part of the design process. If they have different models, their interactions are likely to involve a certain amount of miscommunication. Second, the design process in general is likely to be guided by designers' cognitive models of the HCI, as well as by their knowledge of the user, tasks, and system. Designers do not start with a blank slate; rather they begin with a general model of the object they are designing. The authors' approach to a design model of the HCI was to have three groups make judgments of categorical similarity about the components of an interface: human factors specialists with HCI design experience, software developers with HCI design experience, and a baseline group of computer users with no experience in HCI design. The components of the user interface included both display components such as windows, text, and graphics, and user interaction concepts, such as command language, editing, and help. The judgments of the three groups were analyzed using hierarchical cluster analysis and Pathfinder. These methods indicated, respectively, how the groups categorized the concepts, and network representations of the concepts for each group. The Pathfinder analysis provides greater information about local, pairwise relations among concepts, whereas the cluster analysis shows global, categorical relations to a greater extent.
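
    The hierarchical cluster analysis mentioned above can be reproduced in outline as follows: similarity judgments are converted to distances and clustered. The similarity values below are hypothetical stand-ins for the study's data, and the Pathfinder network analysis is not shown.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical mean similarity ratings (0-1) among a few interface concepts,
# standing in for the categorical-similarity judgments described above.
concepts = ["window", "text", "graphics", "command language", "editing", "help"]
similarity = np.array([
    [1.0, 0.7, 0.8, 0.2, 0.3, 0.3],
    [0.7, 1.0, 0.6, 0.3, 0.5, 0.4],
    [0.8, 0.6, 1.0, 0.2, 0.3, 0.3],
    [0.2, 0.3, 0.2, 1.0, 0.6, 0.5],
    [0.3, 0.5, 0.3, 0.6, 1.0, 0.5],
    [0.3, 0.4, 0.3, 0.5, 0.5, 1.0],
])

# Convert similarities to distances and feed the condensed upper triangle
# to average-linkage hierarchical clustering.
distance = 1.0 - similarity
iu = np.triu_indices(len(concepts), k=1)
Z = linkage(distance[iu], method="average")

for concept, label in zip(concepts, fcluster(Z, t=2, criterion="maxclust")):
    print(f"{concept:18s} cluster {label}")
```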

  16. The use of analytical models in human-computer interface design

    NASA Technical Reports Server (NTRS)

    Gugerty, Leo

    1991-01-01

    Some of the many analytical models in human-computer interface design that are currently being developed are described. The usefulness of analytical models for human-computer interface design is evaluated. Can the use of analytical models be recommended to interface designers? The answer, based on the empirical research summarized here, is: not at this time. There are too many unanswered questions concerning the validity of models and their ability to meet the practical needs of design organizations.

  17. Developing the human-computer interface for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Holden, Kritina L.

    1991-01-01

    For the past two years, the Human-Computer Interaction Laboratory (HCIL) at the Johnson Space Center has been involved in prototyping and prototype reviews in support of the definition phase of the Space Station Freedom program. On the Space Station, crew members will be interacting with multi-monitor workstations where interaction with several displays at one time will be common. The HCIL has conducted several experiments to begin to address design issues for this complex system. Experiments have dealt with the design of ON/OFF indicators, the movement of the cursor across multiple monitors, and the importance of various windowing capabilities for users performing multiple tasks simultaneously.

  18. User interface issues in supporting human-computer integrated scheduling

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.; Biefeld, Eric W.

    1991-01-01

    Explored here are the user interface problems encountered with the Operations Missions Planner (OMP) project at the Jet Propulsion Laboratory (JPL). OMP uses a unique iterative approach to planning that places additional requirements on the user interface, particularly to support system development and maintenance. These requirements are necessary to support the concepts of heuristically controlled search, in-progress assessment, and iterative refinement of the schedule. The techniques used to address the OMP interface needs are given.

  19. User interface issues in supporting human-computer integrated scheduling

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.; Biefeld, Eric W.

    1991-01-01

    The topics are presented in view graph form and include the following: characteristics of Operations Mission Planner (OMP) schedule domain; OMP architecture; definition of a schedule; user interface dimensions; functional distribution; types of users; interpreting user interaction; dynamic overlays; reactive scheduling; and transitioning the interface.

  20. User interface issues in supporting human-computer integrated scheduling

    NASA Astrophysics Data System (ADS)

    Cooper, Lynne P.; Biefeld, Eric W.

    1991-09-01

    The topics are presented in view graph form and include the following: characteristics of Operations Mission Planner (OMP) schedule domain; OMP architecture; definition of a schedule; user interface dimensions; functional distribution; types of users; interpreting user interaction; dynamic overlays; reactive scheduling; and transitioning the interface.

  1. Competence of People with Intellectual Disabilities on Using Human-Computer Interface

    ERIC Educational Resources Information Center

    Wong, Alex W. K.; Chan, Chetwyn C. H.; Li-Tsang, Cecilia W. P.; Lam, Chow S.

    2009-01-01

    We investigated the task processes which hinder people with intellectual disabilities (ID) when using the human-computer interface. This involved testing performance on specific computer tasks and conducting detailed analyses of the task demands imposed on the participants. The interface used by Internet Explorer (IE) was standardized into 16…

  2. A Framework and Implementation of User Interface and Human-Computer Interaction Instruction

    ERIC Educational Resources Information Center

    Peslak, Alan

    2005-01-01

    Researchers have suggested that up to 50 % of the effort in development of information systems is devoted to user interface development (Douglas, Tremaine, Leventhal, Wills, & Manaris, 2002; Myers & Rosson, 1992). Yet little study has been performed on the inclusion of important interface and human-computer interaction topics into a current…

  5. Gesture controlled human-computer interface for the disabled.

    PubMed

    Szczepaniak, Oskar M; Sawicki, Dariusz J

    2017-02-28

    The possibility of using a computer by a disabled person is one of the difficult problems of human-computer interaction (HCI), while professional activity (employment) is one of the most important factors affecting the quality of life, especially for disabled people. The aim of the project has been to propose a new HCI system that would allow for resuming employment for people who have lost the possibility of standard computer operation. The basic requirement was to replace all functions of a standard mouse without the need of performing precise hand movements and using fingers. Microsoft's Kinect motion controller was selected as the device to recognize hand movements. Several tests were made in order to create an optimal working environment with the new device. A new communication system consisting of the Kinect device and the appropriate software was built. The proposed system was tested by means of standard subjective evaluations and objective metrics according to the standard ISO 9241-411:2012. The overall rating of the new HCI system shows the acceptance of the solution. The objective tests show that although the new system is a bit slower, it may effectively replace the computer mouse. The new HCI system fulfilled its task for a specific disabled person. This resulted in the ability to return to work. Additionally, the project confirmed the possibility of effective but nonstandard use of the Kinect device. Med Pr 2017;68(1):1-21.

  6. Enhancing the human-computer interface of power system applications

    SciTech Connect

    Azevedo, G.P. de; Souza, C.S. de; Feijo, B.

    1995-12-31

    This paper examines a topic of increasing importance: the interpretation of the massive amount of data available to power system engineers. The solutions currently adopted in the presentation of data in graphical interfaces are discussed. It is demonstrated that the representations of electric diagrams can be considerably enhanced through the adequate exploitation of resources available in full-graphics screens and the use of basic concepts from human-factors research. Enhanced representations of electric diagrams are proposed and tested. The objective is to let the user see the behavior of the system, allowing for better interpretation of program data and results and improving the user's productivity.

  7. Enhancing the human-computer interface of power system applications

    SciTech Connect

    Azevedo, G.P. de; Souza, C.S. de; Feijo, B.

    1996-05-01

    This paper examines a topic of increasing importance: the interpretation of the massive amount of data available to power system engineers. The solutions currently adopted in the presentation of data in graphical interfaces are discussed. It is demonstrated that the representations of electric diagrams can be considerably enhanced through the adequate exploitation of resources available in full-graphics screens and the use of basic concepts from human-factors research. Enhanced representations of electric diagrams are proposed and tested. The objective is to let the user "see" the behavior of the system, allowing for better interpretation of program data and results and improving the user's productivity.

  8. Emerging human-computer interface (HCI) design guidelines for graphical user interface (GUI)

    SciTech Connect

    Bowser, S.E.; Adams, S.M.

    1993-10-01

    The requirement to establish baseline style references for Graphical User Interfaces (GUIs) is recognized. The ability to obtain consensus among user communities has ranged from limited to nonexistent. The authors are part of a team that has developed a generic baseline human-computer interface (HCI) style guide for the U.S. Department of Defense (DoD). The DoD HCI Style Guide has its origin in a style guide developed by the intelligence community and in human factors design guidelines developed for Army tactical command and control systems. The DoD HCI Style Guide is intended to be a baseline style reference for the design of HCIs within DoD. The needs of specific user communities have been addressed by including addenda that expand on the baseline and address focus areas of interest. The conclusion is that an overall or general style guide should be adopted for GUIs with allowance for specialized user group requirements and additions. The anticipated results would be higher productivity and reduced training and development time.

  9. New Human-Computer Interface Concepts for Mission Operations

    NASA Technical Reports Server (NTRS)

    Fox, Jeffrey A.; Hoxie, Mary Sue; Gillen, Dave; Parkinson, Christopher; Breed, Julie; Nickens, Stephanie; Baitinger, Mick

    2000-01-01

    The current climate of budget cuts has forced the space mission operations community to reconsider how it does business. Gone are the days of building one-of-a-kind control centers with teams of controllers working in shifts 24 hours per day, 7 days per week. Increasingly, automation is used to significantly reduce staffing needs. In some cases, missions are moving towards lights-out operations where the ground system is run semi-autonomously. On-call operators are brought in only to resolve anomalies. Some operations concepts also call for smaller operations teams to manage an entire family of spacecraft. In the not too distant future, a skeleton crew of full-time general knowledge operators will oversee the operations of large constellations of small spacecraft, while geographically distributed specialists will be assigned to emergency response teams based on their expertise. As the operations paradigms change, so too must the tools to support the mission operations team's tasks. Tools need to be built not only to automate routine tasks, but also to communicate varying types of information to the part-time, generalist, or on-call operators and specialists more effectively. Thus, the proper design of a system's user-system interface (USI) becomes even more important than before. Also, because the users will be accessing these systems from various locations (e.g., control center, home, on the road) via different devices with varying display capabilities (e.g., workstations, home PCs, PDAs, pagers) over connections with various bandwidths (e.g., dial-up 56k, wireless 9.6k), the same software must have different USIs to support the different types of users, their equipment, and their environments. In other words, the software must now adapt to the needs of the users! This paper will focus on the needs and the challenges of designing USIs for mission operations. After providing a general discussion of these challenges, the paper will focus on the current efforts of

  10. The Impact of Verbal Report Protocol Analysis on a Model of Human-Computer Interface Cognitive Processing

    DTIC Science & Technology

    1991-03-01

    interface decision, a complete picture of the human-computer interaction must be acquired. This can only be accomplished if all of the design factors...they are related to human-computer interaction. First of all, the nature of the interface design influences the users' mental models of a system...effective tool in assessing the cognitive process of human-computer interaction.

  11. Improving the human-computer interface: a human factors engineering approach.

    PubMed

    Salvemini, A V

    1998-01-01

    Human factors engineering involves the application of information about human behavior and characteristics in the design and testing of products, systems, and environments. A computing system's interface is developed on the basis of potential users' capabilities and limitations, the users' tasks, and the environment in which those tasks are performed. When human factors engineers work with users, subject-matter experts, and developers to design and test a system, they analyze and document users' tasks and requirements and develop prototype designs. Usability studies are conducted and user errors are analyzed to identify problems and develop recommendations for improving the human-computer interface.

  12. Sensory system for implementing a human-computer interface based on electrooculography.

    PubMed

    Barea, Rafael; Boquete, Luciano; Rodriguez-Ascariz, Jose Manuel; Ortega, Sergio; López, Elena

    2011-01-01

    This paper describes a sensory system for implementing a human-computer interface based on electrooculography. An acquisition system captures electrooculograms and transmits them via the ZigBee protocol. The data acquired are analysed in real time using a microcontroller-based platform running the Linux operating system. The continuous wavelet transform and neural network are used to process and analyse the signals to obtain highly reliable results in real time. To enhance system usability, the graphical interface is projected onto special eyewear, which is also used to position the signal-capturing electrodes.

  13. Improving interface quality: an investigation of human-computer interaction task learning.

    PubMed

    Mitta, D A; Packebush, S J

    1995-07-01

    User learning is of critical importance in evaluating interface usability (and in turn interface quality). The focus of this research is on interface learnability, where a stochastic model represents the learning process required for successful completion of human-computer interaction tasks. The parameter used to quantify learning is a learning rate. Of interest here is the validation of learning rate as a measure of interface quality. Learning rate was validated against two traditional measures of interface quality: task completion time and error frequency. SuperCard, a Macintosh project utility, provided an empirical learning environment in which 32 participants learned 16 fundamental SuperCard tasks. Results of correlation analyses suggested the usefulness of learning rate as an indicator of interface quality. Our learning rate analysis identified four tasks presenting learning difficulties. (Analysis of task completion times identified two of these four tasks, and error frequency analysis identified one.) Learning rate data captured all of the information available from the two traditional interface quality measures and identified two tasks disregarded by them. Incorporating learning rates in the interface evaluation process precludes the time-intensive video tape analysis typically required by more traditional interface quality measures.
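
    As a simplified stand-in for the stochastic learning model described above (the paper's model is not specified in the abstract), the sketch below estimates a learning rate by fitting an exponential practice curve to per-trial completion times; the data are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def learning_curve(trial, t_final, t_gain, rate):
    """Exponential practice curve: completion time decays toward t_final."""
    return t_final + t_gain * np.exp(-rate * (trial - 1))

# Hypothetical completion times (s) for one SuperCard-style task over 8 attempts.
trials = np.arange(1, 9)
times = np.array([42.0, 30.5, 24.0, 20.0, 18.5, 17.0, 16.8, 16.2])

popt, _ = curve_fit(learning_curve, trials, times, p0=(15.0, 30.0, 0.5))
t_final, t_gain, rate = popt
print(f"estimated learning rate: {rate:.2f} per trial (asymptote {t_final:.1f} s)")
```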

  14. The use of analytical models in human-computer interface design

    NASA Technical Reports Server (NTRS)

    Gugerty, Leo

    1993-01-01

    Recently, a large number of human-computer interface (HCI) researchers have investigated building analytical models of the user, which are often implemented as computer models. These models simulate the cognitive processes and task knowledge of the user in ways that allow a researcher or designer to estimate various aspects of an interface's usability, such as when user errors are likely to occur. This information can lead to design improvements. Analytical models can supplement design guidelines by providing designers rigorous ways of analyzing the information-processing requirements of specific tasks (i.e., task analysis). These models offer the potential of improving early designs and replacing some of the early phases of usability testing, thus reducing the cost of interface design. This paper describes some of the many analytical models that are currently being developed and evaluates the usefulness of analytical models for human-computer interface design. This paper will focus on computational, analytical models, such as the GOMS model, rather than less formal, verbal models, because the more exact predictions and task descriptions of computational models may be useful to designers. The paper also discusses some of the practical requirements for using analytical models in complex design organizations such as NASA.

  15. Effects of muscle fatigue on the usability of a myoelectric human-computer interface.

    PubMed

    Barszap, Alexander G; Skavhaug, Ida-Maria; Joshi, Sanjay S

    2016-10-01

    Electromyography-based human-computer interface development is an active field of research. However, knowledge on the effects of muscle fatigue for specific devices is limited. We have developed a novel myoelectric human-computer interface in which subjects continuously navigate a cursor to targets by manipulating a single surface electromyography (sEMG) signal. Two-dimensional control is achieved through simultaneous adjustments of power in two frequency bands through a series of dynamic low-level muscle contractions. Here, we investigate the potential effects of muscle fatigue during the use of our interface. In the first session, eight subjects completed 300 cursor-to-target trials without breaks; four using a wrist muscle and four using a head muscle. The wrist subjects returned for a second session in which a static fatiguing exercise took place at regular intervals in-between cursor-to-target trials. In the first session we observed no declines in performance as a function of use, even after the long period of use. In the second session, we observed clear changes in cursor trajectories, paired with a target-specific decrease in hit rates.
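
    A rough sketch of the two-band control scheme described above: power in two frequency bands of a single sEMG channel drives the two cursor axes. The band edges, gain, and window length are placeholders, since the study's parameters are not given in the abstract.

```python
import numpy as np

def band_power(signal, fs, band):
    """Mean power of `signal` within a frequency band (Hz), via the FFT."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return spectrum[mask].mean()

def cursor_velocity(semg_window, fs, low_band=(60, 100), high_band=(130, 170),
                    gain=1e-3):
    """Map power in two frequency bands of one sEMG channel to a 2-D cursor
    velocity. The specific bands and gain are illustrative placeholders."""
    vx = gain * band_power(semg_window, fs, low_band)
    vy = gain * band_power(semg_window, fs, high_band)
    return vx, vy

# Synthetic 250 ms window of "sEMG" at 1 kHz: noise plus an 80 Hz component.
fs = 1000.0
t = np.arange(0, 0.25, 1 / fs)
window = 0.5 * np.sin(2 * np.pi * 80 * t) + 0.1 * np.random.randn(t.size)
print(cursor_velocity(window, fs))
```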

  16. Computational Virtual Reality (VR) as a human-computer interface in the operation of telerobotic systems

    NASA Technical Reports Server (NTRS)

    Bejczy, Antal K.

    1995-01-01

    This presentation focuses on the application of computer graphics or 'virtual reality' (VR) techniques as a human-computer interface tool in the operation of telerobotic systems. VR techniques offer very valuable task realization aids for planning, previewing and predicting robotic actions, operator training, and for visual perception of non-visible events like contact forces in robotic tasks. The utility of computer graphics in telerobotic operation can be significantly enhanced by high-fidelity calibration of virtual reality images to actual TV camera images. This calibration will even permit the creation of artificial (synthetic) views of task scenes for which no TV camera views are available.

  17. Advances in Human-Computer Interaction: Graphics and Animation Components for Interface Design

    NASA Astrophysics Data System (ADS)

    Cipolla Ficarra, Francisco V.; Nicol, Emma; Cipolla-Ficarra, Miguel; Richardson, Lucy

    We present an analysis of communicability methodology in graphics and animation components for interface design, called CAN (Communicability, Acceptability and Novelty). This methodology has been under development between 2005 and 2010, obtaining excellent results in cultural heritage, education and microcomputing contexts, in studies where there is a bi-directional interrelation between ergonomics, usability, user-centered design, software quality and human-computer interaction. We also present the heuristic results about iconography and layout design in blogs and websites of the following countries: Spain, Italy, Portugal and France.

  18. US Army Weapon Systems Human-Computer Interface (WSHCI) style guide, Version 1

    SciTech Connect

    Avery, L.W.; O`Mara, P.A.; Shepard, A.P.

    1996-09-30

    A stated goal of the U.S. Army has been the standardization of the human computer interfaces (HCIs) of its systems. Some of the tools being used to accomplish this standardization are HCI design guidelines and style guides. Currently, the Army is employing a number of style guides. While these style guides provide good guidance for the command, control, communications, computers, and intelligence (C4I) domain, they do not necessarily represent the more unique requirements of the Army's real time and near-real time (RT/NRT) weapon systems. The Office of the Director of Information for Command, Control, Communications, and Computers (DISC4), in conjunction with the Weapon Systems Technical Architecture Working Group (WSTAWG), recognized this need as part of their activities to revise the Army Technical Architecture (ATA). To address this need, DISC4 tasked the Pacific Northwest National Laboratory (PNNL) to develop an Army weapon systems unique HCI style guide. This document, the U.S. Army Weapon Systems Human-Computer Interface (WSHCI) Style Guide, represents the first version of that style guide. The purpose of this document is to provide HCI design guidance for RT/NRT Army systems across the weapon systems domains of ground, aviation, missile, and soldier systems. Each domain should customize and extend this guidance by developing their domain-specific style guides, which will be used to guide the development of future systems within their domains.

  19. The design of an intelligent human-computer interface for the test, control and monitor system

    NASA Technical Reports Server (NTRS)

    Shoaff, William D.

    1988-01-01

    The graphical intelligence and assistance capabilities of a human-computer interface for the Test, Control, and Monitor System at Kennedy Space Center are explored. The report focuses on how a particular commercial off-the-shelf graphical software package, Data Views, can be used to produce tools that build widgets such as menus, text panels, graphs, icons, windows, and ultimately complete interfaces for monitoring data from an application; controlling an application by providing input data to it; and testing an application by both monitoring and controlling it. A complete set of tools for building interfaces is described in a manual for the TCMS toolkit. Simple tools create primitive widgets such as lines, rectangles and text strings. Intermediate level tools create pictographs from primitive widgets, and connect processes to either text strings or pictographs. Other tools create input objects; Data Views supports output objects directly, thus output objects are not considered. Finally, a set of utilities for executing, monitoring use, editing, and displaying the content of interfaces is included in the toolkit.

  20. Effect of age on human-computer-interface control via neck electromyography.

    PubMed

    Hands, Gabrielle L; Stepp, Cara E

    2016-01-01

    The purpose of this study was to determine the effect of age on visuomotor tracking using submental and anterior neck surface electromyography (sEMG) to assess feasibility of computer control via neck musculature, which allows people with little remaining motor function to interact with computers. Thirty-two healthy adults participated: sixteen younger adults aged 18-29 years and sixteen older adults aged 69-85 years. Participants modulated sEMG to achieve targets presented at different amplitudes using real-time visual feedback. Root-mean-squared (RMS) error was used to quantify tracking performance. RMS error was increased for older adults relative to younger adults. Older adults demonstrated more RMS error than younger adults as a function of increasing target amplitude. The differential effects of age found on static tracking performance in anterior neck musculature suggest more difficult translation of human-computer-interfaces controlled using anterior neck musculature for static tasks to older populations.
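
    The tracking metric used above is straightforward to compute; the sketch below shows the RMS error between a target trajectory and the sEMG-derived trace, using hypothetical data.

```python
import numpy as np

def rms_error(target, produced):
    """Root-mean-squared tracking error between the target trajectory and the
    sEMG-controlled trajectory, both sampled at the same instants."""
    target = np.asarray(target, float)
    produced = np.asarray(produced, float)
    return float(np.sqrt(np.mean((produced - target) ** 2)))

# Hypothetical normalized target and response traces
target = [0.2, 0.4, 0.6, 0.6, 0.4, 0.2]
response = [0.1, 0.45, 0.55, 0.7, 0.35, 0.25]
print(f"RMS error: {rms_error(target, response):.3f}")
```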

  1. U.S. Army weapon systems human-computer interface style guide. Version 2

    SciTech Connect

    Avery, L.W.; O`Mara, P.A.; Shepard, A.P.; Donohoo, D.T.

    1997-12-31

    A stated goal of the US Army has been the standardization of the human computer interfaces (HCIs) of its systems. Some of the tools being used to accomplish this standardization are HCI design guidelines and style guides. Currently, the Army is employing a number of HCI design guidance documents. While these style guides provide good guidance for the command, control, communications, computers, and intelligence (C4I) domain, they do not necessarily represent the more unique requirements of the Army's real time and near-real time (RT/NRT) weapon systems. The Office of the Director of Information for Command, Control, Communications, and Computers (DISC4), in conjunction with the Weapon Systems Technical Architecture Working Group (WSTAWG), recognized this need as part of their activities to revise the Army Technical Architecture (ATA), now termed the Joint Technical Architecture-Army (JTA-A). To address this need, DISC4 tasked the Pacific Northwest National Laboratory (PNNL) to develop an Army weapon systems unique HCI style guide, which resulted in the US Army Weapon Systems Human-Computer Interface (WSHCI) Style Guide Version 1. Based on feedback from the user community, DISC4 further tasked PNNL to revise Version 1 and publish Version 2. The intent was to update some of the research and incorporate some enhancements. This document provides that revision. The purpose of this document is to provide HCI design guidance for the RT/NRT Army system domain across the weapon systems subdomains of ground, aviation, missile, and soldier systems. Each subdomain should customize and extend this guidance by developing their domain-specific style guides, which will be used to guide the development of future systems within their subdomains.

  2. A Novel Wearable Forehead EOG Measurement System for Human Computer Interfaces

    PubMed Central

    Heo, Jeong; Yoon, Heenam; Park, Kwang Suk

    2017-01-01

    Amyotrophic lateral sclerosis (ALS) patients whose voluntary muscles are paralyzed commonly communicate with the outside world using eye movement. There have been many efforts to support this method of communication by tracking or detecting eye movement. An electrooculogram (EOG), an electro-physiological signal, is generated by eye movements and can be measured with electrodes placed around the eye. In this study, we proposed a new practical electrode position on the forehead to measure EOG signals, and we developed a wearable forehead EOG measurement system for use in Human Computer/Machine interfaces (HCIs/HMIs). Four electrodes, including the ground electrode, were placed on the forehead. The two channels were arranged vertically and horizontally, sharing a positive electrode. Additionally, a real-time eye movement classification algorithm was developed based on the characteristics of the forehead EOG. Three applications were employed to evaluate the proposed system: a virtual keyboard using a modified Bremen BCI speller and an automatic sequential row-column scanner, and a drivable power wheelchair. The mean typing speeds of the modified Bremen brain–computer interface (BCI) speller and automatic row-column scanner were 10.81 and 7.74 letters per minute, and the mean classification accuracies were 91.25% and 95.12%, respectively. In the power wheelchair demonstration, the user drove the wheelchair through an 8-shape course without collision with obstacles. PMID:28644398
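
    The sketch below gives a toy rule-based classifier for a two-channel (horizontal/vertical) forehead EOG epoch, in the spirit of the real-time classification described above; the thresholds and the synthetic epoch are invented for illustration and are not the values used by the cited system.

```python
import numpy as np

def classify_eog(horizontal, vertical, move_thresh=100.0, blink_thresh=250.0):
    """Toy rule-based classifier for a two-channel forehead EOG epoch
    (amplitudes in microvolts). Thresholds are illustrative only."""
    h = np.ptp(horizontal)      # peak-to-peak deflection, horizontal channel
    v = np.ptp(vertical)        # peak-to-peak deflection, vertical channel
    if v > blink_thresh:
        return "blink"
    if h > move_thresh:
        return "right" if horizontal[np.argmax(np.abs(horizontal))] > 0 else "left"
    if v > move_thresh:
        return "up" if vertical[np.argmax(np.abs(vertical))] > 0 else "down"
    return "rest"

# Example epoch: strong positive deflection on the horizontal channel.
h = np.concatenate([np.zeros(50), 150 * np.ones(30), np.zeros(50)])
v = np.zeros(130)
print(classify_eog(h, v))   # -> "right"
```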

  3. User participation in the development of the human/computer interface for control centers

    NASA Technical Reports Server (NTRS)

    Broome, Richard; Quick-Campbell, Marlene; Creegan, James; Dutilly, Robert

    1996-01-01

    Technological advances coupled with the requirements to reduce operations staffing costs led to the demand for efficient, technologically-sophisticated mission operations control centers. The control center under development for the earth observing system (EOS) is considered. The users are involved in the development of a control center in order to ensure that it is cost-efficient and flexible. A number of measures were implemented in the EOS program in order to encourage user involvement in the area of human-computer interface development. The following user participation exercises carried out in relation to the system analysis and design are described: the shadow participation of the programmers during a day of operations; the flight operations personnel interviews; and the analysis of the flight operations team tasks. The user participation in the interface prototype development, the prototype evaluation, and the system implementation are reported on. The involvement of the users early in the development process enables the requirements to be better understood and the cost to be reduced.

  5. Selection of suitable hand gestures for reliable myoelectric human computer interface.

    PubMed

    Castro, Maria Claudia F; Arjunan, Sridhar P; Kumar, Dinesh K

    2015-04-09

    Myoelectric controlled prosthetic hands require machine-based identification of hand gestures using surface electromyogram (sEMG) recorded from the forearm muscles. This study has observed that a sub-set of the hand gestures has to be selected for accurate automated hand gesture recognition, and reports a method to select these gestures to maximize the sensitivity and specificity. Experiments were conducted in which sEMG was recorded from the muscles of the forearm while subjects performed hand gestures and then was classified off-line. The performances of ten gestures were ranked using the proposed Positive-Negative Performance Measurement Index (PNM), generated by a series of confusion matrices. When using all ten gestures, the sensitivity and specificity were 80.0% and 97.8%. After ranking the gestures using the PNM, six gestures were selected and these gave sensitivity and specificity greater than 95% (96.5% and 99.3%): hand open, hand close, little finger flexion, ring finger flexion, middle finger flexion and thumb flexion. This work has shown that reliable myoelectric based human computer interface systems require careful selection of the gestures that have to be recognized, and without such selection the reliability is poor.
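
    The sensitivity/specificity ranking described above can be illustrated from a confusion matrix as follows. The exact PNM formula is not given in the abstract, so the sketch ranks gestures by sensitivity plus specificity as a stand-in, on a hypothetical three-gesture matrix.

```python
import numpy as np

def per_class_metrics(confusion):
    """Per-gesture sensitivity and specificity from a confusion matrix
    (rows: true gesture, columns: predicted gesture)."""
    c = np.asarray(confusion, float)
    tp = np.diag(c)
    fn = c.sum(axis=1) - tp
    fp = c.sum(axis=0) - tp
    tn = c.sum() - tp - fn - fp
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical 3-gesture confusion matrix.
conf = [[18, 1, 1],
        [2, 16, 2],
        [0, 3, 17]]
sens, spec = per_class_metrics(conf)
# Simple ranking score in the spirit of the paper's PNM index (the exact PNM
# formula is not published in the abstract, so sensitivity + specificity is
# used here as a stand-in).
for i, score in enumerate(sens + spec):
    print(f"gesture {i}: sens {sens[i]:.2f}, spec {spec[i]:.2f}, score {score:.2f}")
```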

  6. Using minimal human-computer interfaces for studying the interactive development of social awareness.

    PubMed

    Froese, Tom; Iizuka, Hiroyuki; Ikegami, Takashi

    2014-01-01

    According to the enactive approach to cognitive science, perception is essentially a skillful engagement with the world. Learning how to engage via a human-computer interface (HCI) can therefore be taken as an instance of developing a new mode of experiencing. Similarly, social perception is theorized to be primarily constituted by skillful engagement between people, which implies that it is possible to investigate the origins and development of social awareness using multi-user HCIs. We analyzed the trial-by-trial objective and subjective changes in sociality that took place during a perceptual crossing experiment in which embodied interaction between pairs of adults was mediated over a minimalist haptic HCI. Since that study required participants to implicitly relearn how to mutually engage so as to perceive each other's presence, we hypothesized that there would be indications that the initial developmental stages of social awareness were recapitulated. Preliminary results reveal that, despite the lack of explicit feedback about task performance, there was a trend for the clarity of social awareness to increase over time. We discuss the methodological challenges involved in evaluating whether this trend was characterized by distinct developmental stages of objective behavior and subjective experience.

  7. Using minimal human-computer interfaces for studying the interactive development of social awareness

    PubMed Central

    Froese, Tom; Iizuka, Hiroyuki; Ikegami, Takashi

    2014-01-01

    According to the enactive approach to cognitive science, perception is essentially a skillful engagement with the world. Learning how to engage via a human-computer interface (HCI) can therefore be taken as an instance of developing a new mode of experiencing. Similarly, social perception is theorized to be primarily constituted by skillful engagement between people, which implies that it is possible to investigate the origins and development of social awareness using multi-user HCIs. We analyzed the trial-by-trial objective and subjective changes in sociality that took place during a perceptual crossing experiment in which embodied interaction between pairs of adults was mediated over a minimalist haptic HCI. Since that study required participants to implicitly relearn how to mutually engage so as to perceive each other's presence, we hypothesized that there would be indications that the initial developmental stages of social awareness were recapitulated. Preliminary results reveal that, despite the lack of explicit feedback about task performance, there was a trend for the clarity of social awareness to increase over time. We discuss the methodological challenges involved in evaluating whether this trend was characterized by distinct developmental stages of objective behavior and subjective experience. PMID:25309490

  8. Human-Computer Interface Controlled by Horizontal Directional Eye Movements and Voluntary Blinks Using AC EOG Signals

    NASA Astrophysics Data System (ADS)

    Kajiwara, Yusuke; Murata, Hiroaki; Kimura, Haruhiko; Abe, Koji

    As a communication support tool for cases of amyotrophic lateral sclerosis (ALS), research on eye-gaze human-computer interfaces has been active. However, since voluntary and involuntary eye movements cannot be distinguished in these interfaces, their performance is still not sufficient for practical use. This paper presents a high-performance human-computer interface system which unites high-quality recognition of horizontal directional eye movements and voluntary blinks. The experimental results have shown that the number of incorrect inputs is decreased by 35.1% compared to an existing system which equips recognition of horizontal and vertical directional eye movements in addition to voluntary blinks, and character inputs are speeded up by 17.4% from the existing system.

  9. A practical efficient human computer interface based on saccadic eye movements for people with disabilities.

    PubMed

    Soltani, Sima; Mahnam, Amin

    2016-03-01

    Human computer interfaces (HCI) provide new channels of communication for people with severe motor disabilities to state their needs, and control their environment. Some HCI systems are based on eye movements detected from the electrooculogram. In this study, a wearable HCI, which implements a novel adaptive algorithm for detection of saccadic eye movements in eight directions, was developed, considering the limitations that people with disabilities have. The adaptive algorithm eliminated the need for calibration of the system for different users and in different environments. A two-stage typing environment and a simple game for training people with disabilities to work with the system were also developed. Performance of the system was evaluated in experiments with the typing environment performed by six participants without disabilities. The average accuracy of the system in detecting eye movements and blinking was 82.9% at first tries with an average typing rate of 4.5 cpm. However an experienced user could achieve 96% accuracy and 7.2 cpm typing rate. Moreover, the functionality of the system for people with movement disabilities was evaluated by performing experiments with the game environment. Six people with tetraplegia and significant levels of speech impairment played with the computer game several times. The average success rate in performing the necessary eye movements was 61.5%, which increased significantly with practice up to 83% for one participant. The developed system is 2.6 × 4.5 cm in size and weighs only 15 g, assuring high level of comfort for the users.
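
    The abstract above highlights an adaptive, calibration-free saccade detector. The sketch below shows a generic scheme of that kind (a threshold adapted to a running noise estimate of the EOG velocity); it is not the paper's algorithm, which is not disclosed in the abstract.

```python
import numpy as np

def detect_saccades(eog, fs, window_s=2.0, k=4.0):
    """Sketch of an adaptive saccade detector for a single EOG channel.

    The detection threshold adapts to a running estimate of baseline noise
    (median absolute deviation of the velocity over a trailing window), so
    no per-user or per-environment calibration step is needed. This is a
    generic scheme, not the cited system's algorithm.
    """
    velocity = np.diff(eog) * fs
    n = int(window_s * fs)
    events = []
    for i in range(n, len(velocity)):
        baseline = velocity[i - n:i]
        mad = np.median(np.abs(baseline - np.median(baseline))) + 1e-9
        if abs(velocity[i]) > k * 1.4826 * mad:
            events.append((i / fs, "right" if velocity[i] > 0 else "left"))
    return events

# Synthetic EOG: flat baseline with one rightward step at about t = 2.5 s.
fs = 250.0
sig = np.concatenate([np.zeros(int(2.5 * fs)), 200 * np.ones(int(1.0 * fs))])
sig = sig + 2.0 * np.random.randn(sig.size)
print(detect_saccades(sig, fs)[:1])
```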

  10. Sensing and controlling model for eye-gaze input human-computer interface

    NASA Astrophysics Data System (ADS)

    Tu, DaWei; Zhao, Qijie; Yin, Hairong

    2004-01-01

    A kind of eye-gaze input sensing model based on the pupil's relative offset to the reflection point on the cornea (Purkinje spot) from an assistant infrared light source in front of the user's head has been completely set up. A set of control strategies that can adapt to the user's head movement has also been put forward. It can effectively overcome the difficulties existing in similar systems, in which the user's head should be stationary or just allowed a little movement while the system works. Therefore it is a spontaneous and harmonious human-computer interaction system. Moreover, this system has obvious merits such as convenience of use, non-contact, non-wear, non-interference, non-restraint, and so on. An experimental eye-gaze input platform for human-computer interaction has been built, and the sensing and controlling models have been verified.

  11. Design of an efficient framework for fast prototyping of customized human-computer interfaces and virtual environments for rehabilitation.

    PubMed

    Avola, Danilo; Spezialetti, Matteo; Placidi, Giuseppe

    2013-06-01

    Rehabilitation is often required after stroke, surgery, or degenerative diseases. It has to be specific for each patient and can be easily calibrated if assisted by human-computer interfaces and virtual reality. Recognition and tracking of different human body landmarks represent the basic features for the design of the next generation of human-computer interfaces. The most advanced systems for capturing human gestures are focused on vision-based techniques which, on the one hand, may require compromises from real-time and spatial precision and, on the other hand, ensure natural interaction experience. The integration of vision-based interfaces with thematic virtual environments encourages the development of novel applications and services regarding rehabilitation activities. The algorithmic processes involved during gesture recognition activity, as well as the characteristics of the virtual environments, can be developed with different levels of accuracy. This paper describes the architectural aspects of a framework supporting real-time vision-based gesture recognition and virtual environments for fast prototyping of customized exercises for rehabilitation purposes. The goal is to provide the therapist with a tool for fast implementation and modification of specific rehabilitation exercises for specific patients, during functional recovery. Pilot examples of designed applications and preliminary system evaluation are reported and discussed. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  12. Supporting Negotiation Behavior with Haptics-Enabled Human-Computer Interfaces.

    PubMed

    Oguz, S O; Kucukyilmaz, A; Sezgin, Tevfik Metin; Basdogan, C

    2012-01-01

    An active research goal for human-computer interaction is to allow humans to communicate with computers in an intuitive and natural fashion, especially in real-life interaction scenarios. One approach that has been advocated to achieve this has been to build computer systems with human-like qualities and capabilities. In this paper, we present insight on how human-computer interaction can be enriched by employing the computers with behavioral patterns that naturally appear in human-human negotiation scenarios. For this purpose, we introduce a two-party negotiation game specifically built for studying the effectiveness of haptic and audio-visual cues in conveying negotiation related behaviors. The game is centered around a real-time continuous two-party negotiation scenario based on the existing game-theory and negotiation literature. During the game, humans are confronted with a computer opponent, which can display different behaviors, such as concession, competition, and negotiation. Through a user study, we show that the behaviors that are associated with human negotiation can be incorporated into human-computer interaction, and the addition of haptic cues provides a statistically significant increase in the human-recognition accuracy of machine-displayed behaviors. In addition to aspects of conveying these negotiation-related behaviors, we also focus on and report game-theoretical aspects of the overall interaction experience. In particular, we show that, as reported in the game-theory literature, certain negotiation strategies such as tit-for-tat may generate maximum combined utility for the negotiating parties, providing an excellent balance between the energy spent by the user and the combined utility of the negotiating parties.

  13. Development and Evaluation of a Head-Controlled Human-Computer Interface with Mouse-Like Functions for Physically Disabled Users

    PubMed Central

    Pereira, César Augusto Martins; Neto, Raul Bolliger; Reynaldo, Ana Carolina; de Miranda Luzo, Maria Cândida; Oliveira, Reginaldo Perilo

    2009-01-01

    OBJECTIVES The objectives of this study were to develop a pointing device controlled by head movement that had the same functions as a conventional mouse and to evaluate the performance of the proposed device when operated by quadriplegic users. METHODS Ten individuals with cervical spinal cord injury participated in functional evaluations of the developed pointing device. The device consisted of a video camera, computer software, and a target attached to the front part of a cap, which was placed on the user’s head. The software captured images of the target coming from the video camera and processed them with the aim of determining the displacement from the center of the target and correlating this with the movement of the computer cursor. Evaluation of the interaction between each user and the proposed device was carried out using 24 multidirectional tests with two degrees of difficulty. RESULTS According to the parameters of mean throughput and movement time, no statistically significant differences were observed between the repetitions of the tests for either of the studied levels of difficulty. CONCLUSIONS The developed pointing device adequately emulates the movement functions of the computer cursor. It is easy to use and can be learned quickly when operated by quadriplegic individuals. PMID:19841704
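
    The abstract above describes the pointing principle (displacement of the head-mounted target from the image center is translated into cursor motion) but not its implementation. The sketch below is a minimal, hypothetical illustration of that mapping; the gain, dead zone, smoothing factor, and synthetic target positions are assumptions, not values from the paper, and the camera/tracking stage that produces the target centroids is omitted.

      # Minimal sketch (not the authors' implementation): the offset of a tracked
      # head-mounted target from the image centre is mapped to relative cursor
      # motion. The gain, dead zone, smoothing factor and synthetic positions are
      # illustrative assumptions.

      def target_to_cursor_delta(target_xy, frame_size, gain=0.4, dead_zone_px=4):
          """Convert target displacement from the frame centre into a cursor step."""
          cx, cy = frame_size[0] / 2.0, frame_size[1] / 2.0
          dx, dy = target_xy[0] - cx, target_xy[1] - cy
          # Ignore tiny displacements so the cursor rests when the head is still.
          if abs(dx) < dead_zone_px and abs(dy) < dead_zone_px:
              return 0.0, 0.0
          return gain * dx, gain * dy

      def run_pointer(target_positions, frame_size=(640, 480), alpha=0.3):
          """Accumulate cursor positions over a sequence of tracked target centroids
          (the camera and tracking stage that produces them is omitted here)."""
          cursor = [frame_size[0] / 2.0, frame_size[1] / 2.0]
          smooth_dx = smooth_dy = 0.0
          for target_xy in target_positions:
              dx, dy = target_to_cursor_delta(target_xy, frame_size)
              # Exponential smoothing reduces jitter from small head tremors.
              smooth_dx = alpha * dx + (1 - alpha) * smooth_dx
              smooth_dy = alpha * dy + (1 - alpha) * smooth_dy
              cursor[0] += smooth_dx
              cursor[1] += smooth_dy
              yield tuple(cursor)

      if __name__ == "__main__":
          # Synthetic head motion: the target drifts right of centre for 10 frames.
          demo = [(330 + i, 240) for i in range(10)]
          for pos in run_pointer(demo):
              print("cursor at %.1f, %.1f" % pos)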

  14. Designing the user interface: strategies for effective human-computer interaction

    NASA Astrophysics Data System (ADS)

    Shneiderman, B.

    1998-03-01

    In revising this popular book, Ben Shneiderman again provides a complete, current and authoritative introduction to user-interface design. The user interface is the part of every computer system that determines how people control and operate that system. When the interface is well designed, it is comprehensible, predictable, and controllable; users feel competent, satisfied, and responsible for their actions. Shneiderman discusses the principles and practices needed to design such effective interaction. Based on 20 years' experience, Shneiderman offers readers practical techniques and guidelines for interface design. He also takes great care to discuss underlying issues and to support conclusions with empirical results. Interface designers, software engineers, and product managers will all find this book an invaluable resource for creating systems that facilitate rapid learning and performance, yield low error rates, and generate high user satisfaction. Coverage includes the human factors of interactive software (with a new discussion of diverse user communities), tested methods to develop and assess interfaces, interaction styles such as direct manipulation for graphical user interfaces, and design considerations such as effective messages, consistent screen design, and appropriate color.

  15. A Review and Reappraisal of Adaptive Human-Computer Interfaces in Complex Control Systems

    DTIC Science & Technology

    2006-08-01

    composed of 3 stages: 1) EEG, ECG, and changing rate of SPR are measured as original biological signals and physiological indices are extracted by... This portable Adaptive Brain Interface (ABI) is based on the on-line analysis of spontaneous electroencephalogram (EEG) signals measured with eight... computer interface (BCI) is based on the analysis of EEG signals associated with spontaneous mental activity. The analysis is concerned with local

  16. A dual-mode human computer interface combining speech and tongue motion for people with severe disabilities.

    PubMed

    Huo, Xueliang; Park, Hangue; Kim, Jeonghee; Ghovanloo, Maysam

    2013-11-01

    We are presenting a new wireless and wearable human computer interface called the dual-mode Tongue Drive System (dTDS), which is designed to allow people with severe disabilities to use computers more effectively with increased speed, flexibility, usability, and independence through their tongue motion and speech. The dTDS detects users' tongue motion using a magnetic tracer and an array of magnetic sensors embedded in a compact and ergonomic wireless headset. It also captures the users' voice wirelessly using a small microphone embedded in the same headset. Preliminary evaluation results based on 14 able-bodied subjects and three individuals with high-level spinal cord injuries at level C3-C5 indicated that the dTDS headset, combined with commercially available speech recognition (SR) software, can provide end users with significantly higher performance than either unimodal form based on tongue motion or speech alone, particularly in completing tasks that require both pointing and text entry.

  18. Glove-Enabled Computer Operations (GECO): Design and Testing of an Extravehicular Activity Glove Adapted for Human-Computer Interface

    NASA Technical Reports Server (NTRS)

    Adams, Richard J.; Olowin, Aaron; Krepkovich, Eileen; Hannaford, Blake; Lindsay, Jack I. C.; Homer, Peter; Patrie, James T.; Sands, O. Scott

    2013-01-01

    The Glove-Enabled Computer Operations (GECO) system enables an extravehicular activity (EVA) glove to be dual-purposed as a human-computer interface device. This paper describes the design and human participant testing of a right-handed GECO glove in a pressurized glove box. As part of an investigation into the usability of the GECO system for EVA data entry, twenty participants were asked to complete activities including (1) a Simon Says game in which they attempted to duplicate random sequences of targeted finger strikes and (2) a Text Entry activity in which they used the GECO glove to enter target phrases in two different virtual keyboard modes. In a within-subjects design, both activities were performed both with and without vibrotactile feedback. Participants' mean accuracies in correctly generating finger strikes with the pressurized glove were surprisingly high, both with and without the benefit of tactile feedback. Five of the subjects achieved mean accuracies exceeding 99% in both conditions. In Text Entry, tactile feedback provided a statistically significant performance benefit, quantified by characters entered per minute, as well as a reduction in error rate. Secondary analyses of responses to NASA Task Load Index (TLX) subjective workload assessments reveal a benefit for tactile feedback in GECO glove use for data entry. This first-ever investigation of employment of a pressurized EVA glove as a human-computer interface opens up a wide range of future applications, including text "chat" communications, manipulation of procedures/checklists, cataloguing/annotating images, scientific note taking, human-robot interaction, and control of suit and/or other EVA systems.

  19. Certification for civil flight decks and the human-computer interface

    NASA Technical Reports Server (NTRS)

    Mcclumpha, Andrew J.; Rudisill, Marianne

    1994-01-01

    This paper will address the issue of human factors aspects of civil flight deck certification, with emphasis on the pilot's interface with automation. In particular, three questions will be asked that relate to this certification process: (1) are the methods, data, and guidelines available from human factors adequate to address the problems of certifying as safe and error-tolerant the complex automated systems of modern civil transport aircraft; (2) do aircraft manufacturers effectively apply human factors information during the aircraft flight deck design process; and (3) do regulatory authorities effectively apply human factors information during the aircraft certification process?

  20. The Mind-Writing Pupil: A Human-Computer Interface Based on Decoding of Covert Attention through Pupillometry

    PubMed Central

    Mathôt, Sebastiaan; Melmi, Jean-Baptiste; van der Linden, Lotje; Van der Stigchel, Stefan

    2016-01-01

    We present a new human-computer interface that is based on decoding of attention through pupillometry. Our method builds on the recent finding that covert visual attention affects the pupillary light response: Your pupil constricts when you covertly (without looking at it) attend to a bright, compared to a dark, stimulus. In our method, participants covertly attend to one of several letters with oscillating brightness. Pupil size reflects the brightness of the selected letter, which allows us, with high accuracy and in real time, to determine which letter the participant intends to select. The performance of our method is comparable to the best covert-attention brain-computer interfaces to date, and has several advantages: no movement other than pupil-size change is required; no physical contact is required (i.e., no electrodes); it is easy to use; and it is reliable. Potential applications include: communication with totally locked-in patients, training of sustained attention, and ultra-secure password input. PMID:26848745
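
    The decoding principle stated above lends itself to a short numerical illustration: because the pupil constricts when a bright stimulus is covertly attended, the measured pupil trace should be anti-correlated (after a lag) with the brightness waveform of the attended letter. The sketch below is a hypothetical simulation of that idea only; the waveform frequencies, lag, and noise level are assumptions and not the paper's stimulus design.

      # Minimal sketch of the decoding idea (not the authors' code): each letter's
      # brightness oscillates with its own known waveform; covert attention to a
      # bright phase constricts the pupil, so the pupil trace should correlate
      # negatively with the attended letter's brightness. Lag, noise and waveforms
      # below are illustrative assumptions.
      import numpy as np

      def decode_attended_letter(pupil_trace, brightness_waveforms, lag_samples=30):
          """Return the index of the waveform most anti-correlated with the pupil."""
          scores = []
          for wave in brightness_waveforms:
              # Shift the stimulus to account for the sluggish pupillary response.
              shifted = wave[:-lag_samples] if lag_samples else wave
              pupil = pupil_trace[lag_samples:]
              r = np.corrcoef(shifted, pupil)[0, 1]
              scores.append(-r)          # stronger anti-correlation = higher score
          return int(np.argmax(scores)), scores

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          t = np.arange(0, 10, 0.01)                      # 10 s at 100 Hz
          freqs = [0.5, 0.7, 0.9, 1.1]                    # one oscillation per letter
          waves = [0.5 + 0.5 * np.sin(2 * np.pi * f * t) for f in freqs]
          attended, lag = 2, 30
          # Simulated pupil: constricts with the attended brightness, delayed, noisy.
          pupil = np.empty_like(t)
          pupil[:lag] = 0.0
          pupil[lag:] = -waves[attended][:-lag] + 0.2 * rng.standard_normal(t.size - lag)
          guess, _ = decode_attended_letter(pupil, waves, lag_samples=lag)
          print("decoded letter index:", guess, "(true:", attended, ")")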

  1. Functional near-infrared spectroscopy for adaptive human-computer interfaces

    NASA Astrophysics Data System (ADS)

    Yuksel, Beste F.; Peck, Evan M.; Afergan, Daniel; Hincks, Samuel W.; Shibata, Tomoki; Kainerstorfer, Jana; Tgavalekos, Kristen; Sassaroli, Angelo; Fantini, Sergio; Jacob, Robert J. K.

    2015-03-01

    We present a brain-computer interface (BCI) that detects, analyzes and responds to user cognitive state in real-time using machine learning classifications of functional near-infrared spectroscopy (fNIRS) data. Our work is aimed at increasing the narrow communication bandwidth between the human and computer by implicitly measuring users' cognitive state without any additional effort on the part of the user. Traditionally, BCIs have been designed to explicitly send signals as the primary input. However, such systems are usually designed for people with severe motor disabilities and are too slow and inaccurate for the general population. In this paper, we demonstrate with previous work [1] that a BCI that implicitly measures cognitive workload can improve user performance and awareness compared to a control condition by adapting to user cognitive state in real-time. We also discuss some of the other applications we have used in this field to measure and respond to cognitive states such as cognitive workload, multitasking, and user preference.
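
    As a rough illustration of the pipeline described above (per-trial fNIRS features classified into workload levels, with the prediction driving an interface adaptation), the sketch below uses synthetic features and a generic classifier. The feature layout, classifier choice, and adaptation rule are assumptions for illustration only, not the authors' system.

      # Minimal sketch of an implicit-workload pipeline in the spirit of the paper
      # (not the authors' system): per-trial fNIRS features are classified into
      # low/high workload, and the predicted state drives an interface adaptation.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(1)

      # Synthetic training data: 120 trials x 16 features (e.g. means/slopes of
      # oxy- and deoxy-haemoglobin over several fNIRS channels; layout is assumed).
      X_low = rng.normal(0.0, 1.0, size=(60, 16))
      X_high = rng.normal(0.6, 1.0, size=(60, 16))   # shifted mean = higher workload
      X = np.vstack([X_low, X_high])
      y = np.array([0] * 60 + [1] * 60)              # 0 = low, 1 = high workload

      clf = LogisticRegression(max_iter=1000)
      print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
      clf.fit(X, y)

      def adapt_interface(trial_features):
          """Toy adaptation rule: shed secondary tasks when workload looks high."""
          workload = clf.predict(trial_features.reshape(1, -1))[0]
          return "defer notifications" if workload == 1 else "show full task queue"

      print(adapt_interface(rng.normal(0.6, 1.0, size=16)))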

  2. Modeling strategic use of human computer interfaces with novel hidden Markov models

    PubMed Central

    Mariano, Laura J.; Poore, Joshua C.; Krum, David M.; Schwartz, Jana L.; Coskren, William D.; Jones, Eric M.

    2015-01-01

    Immersive software tools are virtual environments designed to give their users an augmented view of real-world data and ways of manipulating that data. As virtual environments, every action users make while interacting with these tools can be carefully logged, as can the state of the software and the information it presents to the user, giving these actions context. This data provides a high-resolution lens through which dynamic cognitive and behavioral processes can be viewed. In this report, we describe new methods for the analysis and interpretation of such data, utilizing a novel implementation of the Beta Process Hidden Markov Model (BP-HMM) for analysis of software activity logs. We further report the results of a preliminary study designed to establish the validity of our modeling approach. A group of 20 participants were asked to play a simple computer game, instrumented to log every interaction with the interface. Participants had no previous experience with the game's functionality or rules, so the activity logs collected during their naïve interactions capture patterns of exploratory behavior and skill acquisition as they attempted to learn the rules of the game. Pre- and post-task questionnaires probed for self-reported styles of problem solving, as well as task engagement, difficulty, and workload. We jointly modeled the activity log sequences collected from all participants using the BP-HMM approach, identifying a global library of activity patterns representative of the collective behavior of all the participants. Analyses show systematic relationships between both pre- and post-task questionnaires, self-reported approaches to analytic problem solving, and metrics extracted from the BP-HMM decomposition. Overall, we find that this novel approach to decomposing unstructured behavioral data within software environments provides a sensible means for understanding how users learn to integrate software functionality for strategic task pursuit. PMID

  3. Modeling strategic use of human computer interfaces with novel hidden Markov models.

    PubMed

    Mariano, Laura J; Poore, Joshua C; Krum, David M; Schwartz, Jana L; Coskren, William D; Jones, Eric M

    2015-01-01

    Immersive software tools are virtual environments designed to give their users an augmented view of real-world data and ways of manipulating that data. As virtual environments, every action users make while interacting with these tools can be carefully logged, as can the state of the software and the information it presents to the user, giving these actions context. This data provides a high-resolution lens through which dynamic cognitive and behavioral processes can be viewed. In this report, we describe new methods for the analysis and interpretation of such data, utilizing a novel implementation of the Beta Process Hidden Markov Model (BP-HMM) for analysis of software activity logs. We further report the results of a preliminary study designed to establish the validity of our modeling approach. A group of 20 participants were asked to play a simple computer game, instrumented to log every interaction with the interface. Participants had no previous experience with the game's functionality or rules, so the activity logs collected during their naïve interactions capture patterns of exploratory behavior and skill acquisition as they attempted to learn the rules of the game. Pre- and post-task questionnaires probed for self-reported styles of problem solving, as well as task engagement, difficulty, and workload. We jointly modeled the activity log sequences collected from all participants using the BP-HMM approach, identifying a global library of activity patterns representative of the collective behavior of all the participants. Analyses show systematic relationships between both pre- and post-task questionnaires, self-reported approaches to analytic problem solving, and metrics extracted from the BP-HMM decomposition. Overall, we find that this novel approach to decomposing unstructured behavioral data within software environments provides a sensible means for understanding how users learn to integrate software functionality for strategic task pursuit.
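
    The analysis described in the two records above relies on a Beta Process HMM, which is beyond a short listing. As a greatly simplified stand-in that conveys only the flavor of decomposing interaction logs into behavioral structure, the sketch below estimates a first-order transition matrix over discrete interface actions per participant and compares participants by how different their transition structure is; the action set and logs are synthetic, and nothing here reproduces the BP-HMM's shared latent behavior library.

      # Greatly simplified stand-in (not the BP-HMM): per-participant first-order
      # transition matrices over discrete interface actions, compared by a crude
      # distance. Action codes and the synthetic logs are illustrative assumptions.
      import numpy as np

      ACTIONS = ["select", "move", "inspect", "undo"]        # hypothetical action set

      def transition_matrix(log, n_actions=len(ACTIONS)):
          """Row-normalised counts of action i followed by action j."""
          counts = np.ones((n_actions, n_actions))           # +1 smoothing
          for a, b in zip(log[:-1], log[1:]):
              counts[a, b] += 1
          return counts / counts.sum(axis=1, keepdims=True)

      def dissimilarity(p, q):
          """Crude distance: mean absolute difference of the transition matrices."""
          return float(np.abs(p - q).mean())

      rng = np.random.default_rng(11)
      # Two synthetic participants: one mostly alternates select/move, the other
      # inspects a lot before acting (a caricature of different strategies).
      log_a = list(rng.choice([0, 1], size=300, p=[0.5, 0.5]))
      log_b = list(rng.choice([0, 1, 2], size=300, p=[0.2, 0.2, 0.6]))
      ta, tb = transition_matrix(log_a), transition_matrix(log_b)
      print("self-distance :", round(dissimilarity(ta, ta), 3))
      print("cross-distance:", round(dissimilarity(ta, tb), 3))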

  4. Design and development of data glove based on printed polymeric sensors and Zigbee networks for Human-Computer Interface.

    PubMed

    Tongrod, Nattapong; Lokavee, Shongpun; Watthanawisuth, Natthapol; Tuantranont, Adisorn; Kerdcharoen, Teerakiat

    2013-03-01

    Current trends in Human-Computer Interface (HCI) have brought on a wave of new consumer devices that can track the motion of our hands. These devices have enabled more natural interfaces with computer applications. Data gloves are commonly used as input devices, equipped with sensors that detect the movements of the hands and a communication unit that interfaces those movements with a computer. Unfortunately, the high cost of sensor technology inevitably places a burden on most general users. In this research, we have proposed a low-cost data glove concept based on printed polymeric sensors: pressure and bending sensors fabricated with a consumer ink-jet printer. These sensors were realized using a conductive polymer (poly(3,4-ethylenedioxythiophene):poly(styrenesulfonate) [PEDOT:PSS]) thin film printed on glossy photo paper. The performance of these sensors can be enhanced by the addition of dimethyl sulfoxide (DMSO) to the aqueous dispersion of PEDOT:PSS. The concept of surface resistance was successfully adopted for the design and fabrication of the sensors. To demonstrate the printed sensors, we constructed a data glove using them and developed software for real-time hand tracking. Wireless networks based on low-cost Zigbee technology were used to transfer data from the glove to a computer. To our knowledge, this is the first report of a low-cost data glove based on paper pressure sensors. This low-cost implementation of both the sensors and the communication network, as proposed in this paper, should pave the way toward widespread use of data gloves for real-time hand-tracking applications.
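
    On the receiving side of a glove like the one above, the host software typically parses raw sensor readings arriving over the wireless link and normalizes them into bend and pressure values. The sketch below is hypothetical: the frame format, ADC resolution, and calibration constants are assumptions, since the paper does not publish its data protocol.

      # Minimal sketch (not the authors' firmware or protocol): parse a line of raw
      # ADC readings sent by the glove's wireless node and convert the printed
      # PEDOT:PSS sensor readings into rough bend/pressure values. The frame format
      # "F0,F1,F2,F3,F4,P" and all calibration constants are hypothetical.

      FLAT_COUNTS = [180, 175, 190, 185, 170]    # assumed per-finger reading when flat
      FIST_COUNTS = [620, 640, 600, 615, 630]    # assumed reading when fully bent

      def parse_frame(line):
          """'412,398,455,430,401,77' -> five finger readings + one pressure value."""
          values = [int(v) for v in line.strip().split(",")]
          if len(values) != 6:
              raise ValueError("malformed glove frame: %r" % line)
          return values[:5], values[5]

      def bend_fractions(finger_counts):
          """Normalise each finger reading to 0.0 (flat) .. 1.0 (fully bent)."""
          out = []
          for raw, lo, hi in zip(finger_counts, FLAT_COUNTS, FIST_COUNTS):
              frac = (raw - lo) / float(hi - lo)
              out.append(min(1.0, max(0.0, frac)))
          return out

      if __name__ == "__main__":
          fingers, pressure = parse_frame("412,398,455,430,401,77\n")
          print("bend:", [round(b, 2) for b in bend_fractions(fingers)],
                "pressure counts:", pressure)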

  5. A Dual-Mode Human Computer Interface Combining Speech and Tongue Motion for People with Severe Disabilities

    PubMed Central

    Huo, Xueliang; Park, Hangue; Kim, Jeonghee; Ghovanloo, Maysam

    2015-01-01

    We are presenting a new wireless and wearable human computer interface called the dual-mode Tongue Drive System (dTDS), which is designed to allow people with severe disabilities to use computers more effectively with increased speed, flexibility, usability, and independence through their tongue motion and speech. The dTDS detects users’ tongue motion using a magnetic tracer and an array of magnetic sensors embedded in a compact and ergonomic wireless headset. It also captures the users’ voice wirelessly using a small microphone embedded in the same headset. Preliminary evaluation results based on 14 able-bodied subjects and three individuals with high-level spinal cord injuries at level C3–C5 indicated that the dTDS headset, combined with commercially available speech recognition (SR) software, can provide end users with significantly higher performance than either unimodal form based on tongue motion or speech alone, particularly in completing tasks that require both pointing and text entry. PMID:23475380

  6. Top ten list of user-hostile interface design: The ten most frequent mistakes made in human-computer interface design

    SciTech Connect

    Miller, D.P.

    1995-05-01

    This article describes ten of the most frequent ergonomic problems found in human-computer interfaces (HCIs) associated with complex industrial machines. In contrast with being thought of as "user friendly," many of these machines are seen by the author as exhibiting "user-hostile" attributes. The historical lack of consistent application of ergonomic principles in these HCIs has led to a breed of very sophisticated, complex manufacturing systems that few people can operate without extensive orientation, training, or experience. This design oversight has produced the need for customized training programs and help documentation, unnecessary machine downtime, and reduced productivity resulting from operator stress and confusion. The ten issues are treated in a problem-solution format with real-world graphic examples of good and poor design. Intended for a diverse audience, the article avoids technical jargon and is appropriate reading for those involved in software, product engineering, marketing, and management. © 1995 American Vacuum Society.

  7. A comparative evaluation plan for the Maintenance, Inventory, and Logistics Planning (MILP) System Human-Computer Interface (HCI)

    NASA Technical Reports Server (NTRS)

    Overmyer, Scott P.

    1993-01-01

    The primary goal of this project was to develop a tailored and effective approach to the design and evaluation of the human-computer interface (HCI) to the Maintenance, Inventory and Logistics Planning (MILP) System in support of the Mission Operations Directorate (MOD). An additional task that was undertaken was to assist in the review of Ground Displays for Space Station Freedom (SSF) by attending the Ground Displays Interface Group (GDIG) and commenting on the preliminary design for these displays. Based upon data gathered over the 10-week period, this project has hypothesized that the proper HCI concept for navigating through maintenance databases for large space vehicles is one based upon a spatial, direct manipulation approach. This dialogue style can then be coupled with a traditional text-based DBMS, after the user has determined the general nature and location of the information needed. This conclusion is in contrast with the currently planned HCI for MILP, which uses a traditional form-fill-in dialogue style for all data access and retrieval. In order to resolve this difference in HCI and dialogue styles, it is recommended that a comparative evaluation be performed that combines the use of both subjective and objective metrics to determine the optimal (performance-wise) and preferred approach for end users. The proposed plan has been outlined in the previous paragraphs and is available in its entirety in the Technical Report associated with this project. Further, it is suggested that several of the more useful features of the Maintenance Operations Management System (MOMS), especially those developed by the end-users, be incorporated into MILP to save development time and money.

  8. Learning an Intermittent Control Strategy for Postural Balancing Using an EMG-Based Human-Computer Interface

    PubMed Central

    Asai, Yoshiyuki; Tateyama, Shota; Nomura, Taishin

    2013-01-01

    It has been considered that the brain stabilizes unstable body dynamics by regulating co-activation levels of antagonist muscles. Here we critically reexamined this established theory of impedance control in a postural balancing task using a novel EMG-based human-computer interface, in which subjects were asked to balance a virtual inverted pendulum using visual feedback information on the pendulum's position. The pendulum was actuated by a pair of antagonist joint torques determined in real-time by activations of the corresponding pair of antagonist ankle muscles of subjects standing upright. This motor task poses a frustrated environment: a large feedback time delay in the sensorimotor loop, as a source of instability, might favor adopting the non-reactive, preprogrammed impedance control, but the ankle muscles are relatively hard to co-activate, which hinders subjects from adopting the impedance control. This study aimed at discovering how experimental subjects resolved this frustrated environment through motor learning. One third of the subjects adapted to the balancing task in a manner resembling impedance control. It was remarkable, however, that the majority of subjects did not adopt the impedance control. Instead, they acquired a smart and energetically efficient strategy, in which the two muscles were inactivated simultaneously at a sequence of optimal timings, leading to the intermittent appearance of periods during which the pendulum was not actively actuated. Characterizations of muscle inactivations and the pendulum's sway showed that the strategy adopted by those subjects was a type of intermittent control that utilizes a stable manifold of the saddle-type unstable upright equilibrium that appeared in the state space of the pendulum when the active actuation was turned off. PMID:23717398
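
    The task environment described above (a virtual inverted pendulum driven by two antagonist torques with a feedback delay in the loop) can be caricatured in a few lines of simulation. The sketch below is not the authors' software: the physical parameters, gains, delay, and the toy intermittent policy are all illustrative assumptions.

      # Minimal sketch of the task environment (not the authors' software): a
      # virtual inverted pendulum is driven by a pair of antagonist torques
      # proportional to two "muscle activations", with a visual feedback delay.
      import math

      def simulate(policy, dt=0.01, duration_s=10.0, delay_s=0.1,
                   m=60.0, l=1.0, g=9.81, torque_gain=400.0):
          theta, omega = 0.02, 0.0                    # initial tilt (rad), rad/s
          d = int(delay_s / dt)
          hist = [(theta, omega)] * d                 # delayed state shown on screen
          max_tilt = abs(theta)
          for _ in range(int(duration_s / dt)):
              seen_theta, seen_omega = hist[0]
              a_plantar, a_dorsi = policy(seen_theta, seen_omega)   # each in [0, 1]
              torque = torque_gain * (a_plantar - a_dorsi)
              # m*l^2 * theta'' = m*g*l*sin(theta) - torque
              alpha = (m * g * l * math.sin(theta) - torque) / (m * l ** 2)
              omega += alpha * dt
              theta += omega * dt
              hist = hist[1:] + [(theta, omega)]
              max_tilt = max(max_tilt, abs(theta))
          return theta, max_tilt

      def intermittent_pd(seen_theta, seen_omega, threshold=0.005, kp=5.0, kd=1.5):
          """Toy intermittent policy: both muscles stay silent near upright; a
          one-sided burst fires only when the (delayed) tilt exceeds a threshold."""
          if abs(seen_theta) < threshold:
              return 0.0, 0.0
          u = max(-1.0, min(1.0, kp * seen_theta + kd * seen_omega))
          return (u, 0.0) if u > 0 else (0.0, -u)

      final, worst = simulate(intermittent_pd)
      print("final tilt %.3f rad, worst tilt %.3f rad" % (final, worst))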

  9. A robust Kalman algorithm to facilitate human-computer interaction for people with cerebral palsy, using a new interface based on inertial sensors.

    PubMed

    Raya, Rafael; Rocon, Eduardo; Gallego, Juan A; Ceres, Ramón; Pons, Jose L

    2012-01-01

    This work aims to create an advanced human-computer interface called ENLAZA for people with cerebral palsy (CP). Although there are computer-access solutions for disabled people in general, there is little evidence of the motor-disabled community (e.g., people with CP) using these alternative interfaces. The proposed interface is based on inertial sensors in order to characterize involuntary motion in terms of time, frequency and range of motion. This characterization is used to design a filtering technique that reduces the effect of involuntary motion on person-computer interaction. This paper presents a robust Kalman filter (RKF) design to facilitate fine motor control based on the previous characterization. The filter increases mouse pointer directivity, and the target acquisition time is reduced by a factor of ten. The interface is validated with CP users who were unable to control the computer using other interfaces. The interface ENLAZA and the RKF enabled them to use the computer.

  10. A Robust Kalman Algorithm to Facilitate Human-Computer Interaction for People with Cerebral Palsy, Using a New Interface Based on Inertial Sensors

    PubMed Central

    Raya, Rafael; Rocon, Eduardo; Gallego, Juan A.; Ceres, Ramón; Pons, Jose L.

    2012-01-01

    This work aims to create an advanced human-computer interface called ENLAZA for people with cerebral palsy (CP). Although there are computer-access solutions for disabled people in general, there is little evidence of the motor-disabled community (e.g., people with CP) using these alternative interfaces. The proposed interface is based on inertial sensors in order to characterize involuntary motion in terms of time, frequency and range of motion. This characterization is used to design a filtering technique that reduces the effect of involuntary motion on person-computer interaction. This paper presents a robust Kalman filter (RKF) design to facilitate fine motor control based on the previous characterization. The filter increases mouse pointer directivity, and the target acquisition time is reduced by a factor of ten. The interface is validated with CP users who were unable to control the computer using other interfaces. The interface ENLAZA and the RKF enabled them to use the computer. PMID:22736992
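
    The two records above report the effect of the filtering but not its equations. As a rough illustration of the general idea (treating involuntary tremor as noise on the inertially measured head orientation and smoothing it before it drives the pointer), the sketch below uses a standard one-dimensional Kalman filter; it is not the ENLAZA robust Kalman filter, and all noise levels and the synthetic signal are assumptions.

      # Minimal sketch (not the ENLAZA robust Kalman filter): a standard 1-D
      # Kalman filter with a constant-position model, treating involuntary tremor
      # as measurement noise on the head orientation from the inertial sensor.
      import numpy as np

      def kalman_smooth(measurements, q=5e-3, r=0.25):
          """x_k = x_{k-1} + w (process var q), z_k = x_k + v (measurement var r)."""
          x, p = measurements[0], 1.0
          out = []
          for z in measurements:
              p = p + q                   # predict
              k = p / (p + r)             # Kalman gain
              x = x + k * (z - x)         # update with measurement z
              p = (1 - k) * p
              out.append(x)
          return np.array(out)

      if __name__ == "__main__":
          rng = np.random.default_rng(3)
          t = np.arange(0, 5, 0.02)                          # 5 s at 50 Hz
          intended = 10.0 * np.sin(2 * np.pi * 0.2 * t)      # slow voluntary turn (deg)
          tremor = 5.0 * np.sin(2 * np.pi * 6.0 * t)         # fast involuntary component
          measured = intended + tremor + rng.normal(0, 0.5, t.size)
          smoothed = kalman_smooth(measured)
          print("mean abs error: raw %.2f deg, filtered %.2f deg"
                % (np.mean(np.abs(measured - intended)),
                   np.mean(np.abs(smoothed - intended))))
          # The filtered angle would then drive the mouse pointer.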

  11. An Analysis of the Human/Computer Interface of the Program Manager Support System: Capps(Contract Appraisal System) and Gat(Government Activity Tasking)

    DTIC Science & Technology

    1988-12-01

    instance, users in a nuclear power plant might need a system that provides: quick response times, display screens that use color and sounds for warnings... AFIT/GIR/LSY/88D-6 AN ANALYSIS OF THE HUMAN/COMPUTER INTERFACE OF THE PROGRAM MANAGER SUPPORT SYSTEM... events is through employing attention-getting techniques. Rubinstein, Hersh, and Shneiderman suggest using the subsequent methods (Rubinstein and Hersh

  12. The Control System Design of Rotary Li/MnO2 Button Battery Product Line Based on Human-computer interface

    NASA Astrophysics Data System (ADS)

    Xiangyang, Qi; Shuzhong, Lin; Lixin, Sun

    In this paper, human-computer interaction is the main feature. Human-computer interface programs and programmable logic controller (PLC) programs were combined, and address matching with a shared-address communication mode was adopted. Buttons on the human-computer interface trigger reads of the configured address set, which in turn activate the corresponding virtual relays in the PLC address area. The control system of a Li/MnO2 button battery product line is introduced as an example. In the system, screens are organized according to the process route of the product line, including a parameter-setting area, a single-station debugging area, an equipment-running area, and an alarm-browsing area. To protect the normal operation of the product line, interlock programs prevent parameter modification and single-station debugging while the line is running, avoiding malfunctions. At the same time, fault-diagnosis techniques were used to design an improved alarm pattern. The control system has been tested in practice with good results.
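
    The interlock idea summarized above (parameter modification and single-station debugging are blocked while the line is running) can be expressed as simple boolean logic. The sketch below restates it in plain Python rather than ladder logic; the "virtual relay" flags and screen-area names are hypothetical, since the actual PLC program is not published in the record.

      # Minimal sketch of the interlock logic described above, in plain Python
      # rather than ladder logic. Flag and area names are hypothetical.

      class LineState:
          def __init__(self):
              self.running = False          # equipment-running area active
              self.alarm_active = False
              # Virtual relays set by HMI buttons via the shared address area:
              self.request_parameter_edit = False
              self.request_single_station_debug = False

      def allowed_actions(state):
          """Apply the interlocks: no parameter edits or single-station debugging
          while the line is running; alarm browsing is always available."""
          return {
              "edit_parameters": state.request_parameter_edit and not state.running,
              "single_station_debug": (state.request_single_station_debug
                                       and not state.running),
              "browse_alarms": True,
              "start_line": not state.alarm_active and not state.running,
          }

      if __name__ == "__main__":
          s = LineState()
          s.running = True
          s.request_parameter_edit = True
          print(allowed_actions(s))   # parameter edit is blocked while running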

  13. Learning to modulate the partial powers of a single sEMG power spectrum through a novel human-computer interface.

    PubMed

    Skavhaug, Ida-Maria; Lyons, Kenneth R; Nemchuk, Anna; Muroff, Shira D; Joshi, Sanjay S

    2016-06-01

    New human-computer interfaces that use bioelectrical signals as input are allowing study of the flexibility of the human neuromuscular system. We have developed a myoelectric human-computer interface which enables users to navigate a cursor to targets through manipulations of partial powers within a single surface electromyography (sEMG) signal. Users obtain two-dimensional control through simultaneous adjustments of powers in two frequency bands within the sEMG spectrum, creating power profiles corresponding to cursor positions. It is unlikely that these types of bioelectrical manipulations are required during routine muscle contractions. Here, we formally establish the neuromuscular ability to voluntarily modulate single-site sEMG power profiles in a group of naïve subjects under restricted and controlled conditions using a wrist muscle. All subjects used the same pre-selected frequency bands for control and underwent the same training, allowing a description of the average learning progress throughout eight sessions. We show that subjects steadily increased target hit rates from 48% to 71% and exhibited greater control of the cursor's trajectories following practice. Our results point towards an adaptable neuromuscular skill, which may allow humans to utilize single muscle sites as limited general-purpose signal generators. Ultimately, the goal is to translate this neuromuscular ability to practical interfaces for the disabled by using a spared muscle to control external machines. Copyright © 2016 Elsevier B.V. All rights reserved.
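
    The control principle above (simultaneously adjusting the powers in two frequency bands of a single sEMG channel to steer a cursor in two dimensions) can be illustrated with a short spectral-analysis sketch. The band edges, scaling constants, and synthetic signal below are assumptions, not the bands or calibration used in the study.

      # Minimal sketch of the control principle (not the authors' interface):
      # estimate the partial powers of a single sEMG channel in two frequency
      # bands and map them to cursor x/y.
      import numpy as np
      from scipy.signal import welch

      FS = 1000                      # sampling rate (Hz)
      BAND_X = (60.0, 120.0)         # assumed "low" control band -> horizontal axis
      BAND_Y = (160.0, 220.0)        # assumed "high" control band -> vertical axis

      def band_power(signal, band):
          f, pxx = welch(signal, fs=FS, nperseg=256)
          mask = (f >= band[0]) & (f <= band[1])
          return pxx[mask].sum() * (f[1] - f[0])

      def emg_to_cursor(window, x_scale=1e6, y_scale=1e6):
          """Map the two partial powers of one sEMG window to a cursor position."""
          px = band_power(window, BAND_X)
          py = band_power(window, BAND_Y)
          return min(1.0, px * x_scale), min(1.0, py * y_scale)   # clip to [0, 1]

      if __name__ == "__main__":
          rng = np.random.default_rng(7)
          t = np.arange(0, 0.5, 1.0 / FS)                # 0.5 s analysis window
          # Synthetic "contraction" with more energy near 90 Hz than near 190 Hz.
          window = (2e-3 * np.sin(2 * np.pi * 90 * t)
                    + 5e-4 * np.sin(2 * np.pi * 190 * t)
                    + 1e-4 * rng.standard_normal(t.size))
          print("cursor position (x, y):", emg_to_cursor(window))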

  14. Open-Box Muscle-Computer Interface: Introduction to Human-Computer Interactions in Bioengineering, Physiology, and Neuroscience Courses

    ERIC Educational Resources Information Center

    Landa-Jiménez, M. A.; González-Gaspar, P.; Pérez-Estudillo, C.; López-Meraz, M. L.; Morgado-Valle, C.; Beltran-Parrazal, L.

    2016-01-01

    A Muscle-Computer Interface (muCI) is a human-machine system that uses electromyographic (EMG) signals to communicate with a computer. Surface EMG (sEMG) signals are currently used to command robotic devices, such as robotic arms and hands, and mobile robots, such as wheelchairs. These signals reflect the motor intention of a user before the…

  16. Human Computer Interaction

    NASA Astrophysics Data System (ADS)

    Bhagwani, Akhilesh; Sengar, Chitransh; Talwaniper, Jyotsna; Sharma, Shaan

    2012-08-01

    The paper deals with the study of HCI (human-computer interaction) or BCI (brain-computer interface) technology that can be used for capturing brain signals and translating them into commands that allow humans to control, just by thinking, devices such as computers, robots, rehabilitation technology, and virtual reality environments. The HCI is based on a direct communication pathway between the brain and an external device. BCIs are often aimed at assisting, augmenting, or repairing human cognitive or sensory-motor functions. The paper also discusses many advantages of BCI technology, along with some of its applications and some major drawbacks.

  17. Embedded human computer interaction.

    PubMed

    Baber, Christopher; Baumann, Konrad

    2002-05-01

    In this paper, human interaction with embedded or ubiquitous technology is considered. The techniques focus on the use of what might be termed "everyday" objects and actions as a means of controlling (or otherwise interacting with) technology. While this paper is not intended to be an exhaustive review, it does present a view of the immediate future of human-computer interaction (HCI) in which users move beyond the desktop to where interacting with technology becomes merged with other activity. At one level this places HCI in the context of other forms of personal and domestic technologies. At another level, this raises questions as to how people will interact with technologies of the future. Until now, HCI had often relied on people learning obscure command sets or learning to recognise words and objects on their computer screen. The most significant advance in HCI (the invention of the WIMP interface) is already some 40 years old. Thus, the future of HCI might be one in which people are encouraged (or at least allowed) to employ the skills that they have developed during their lives in order to interact with technology, rather than being forced to learn and perfect new skills.

  18. Human Computers 1947

    NASA Technical Reports Server (NTRS)

    1947-01-01

    Langley's human computers at work in 1947. These women, the female presence at Langley, performed mathematical computations for the male staff. Photograph published in Winds of Change, 75th Anniversary NASA publication (page 48), by James Schultz.

  19. Language evolution and human-computer interaction

    NASA Technical Reports Server (NTRS)

    Grudin, Jonathan; Norman, Donald A.

    1991-01-01

    Many of the issues that confront designers of interactive computer systems also appear in natural language evolution. Natural languages and human-computer interfaces share as their primary mission the support of extended 'dialogues' between responsive entities. Because in each case one participant is a human being, some of the pressures operating on natural languages, causing them to evolve in order to better support such dialogue, also operate on human-computer 'languages' or interfaces. This does not necessarily push interfaces in the direction of natural language - since one entity in this dialogue is not a human, this is not to be expected. Nonetheless, by discerning where the pressures that guide natural language evolution also appear in human-computer interaction, we can contribute to the design of computer systems and obtain a new perspective on natural languages.

  1. On the Rhetorical Contract in Human-Computer Interaction.

    ERIC Educational Resources Information Center

    Wenger, Michael J.

    1991-01-01

    An exploration of the rhetorical contract--i.e., the expectations for appropriate interaction--as it develops in human-computer interaction revealed that direct manipulation interfaces were more likely to establish social expectations. Study results suggest that the social nature of human-computer interactions can be examined with reference to the…

  2. A User Task Analysis for Command and Control Systems and its Use in Human-Computer Interaction Research

    DTIC Science & Technology

    1993-09-03

    The Advanced Interfaces Section of the Human-Computer Interaction (HCI) Laboratory at the Naval Research Laboratory (NRL) is engaged in creating and... computer interfaces. Human-computer interaction, Task analysis, Command and control.

  3. Natural Human-Computer Interaction

    NASA Astrophysics Data System (ADS)

    D'Amico, Gianpaolo; Del Bimbo, Alberto; Dini, Fabrizio; Landucci, Lea; Torpei, Nicola

    Research work in relation to Natural Human-Computer Interaction concerns the theorization and development of systems that understand and recognize human communicative actions in order to engage people in a dialogue between them and their surroundings.

  4. Humanities Computing 25 Years Later.

    ERIC Educational Resources Information Center

    Raben, Joseph

    1991-01-01

    Provides an overview of the development of humanities computing during the past 25 years. Mentions the major applications of the computer to humanities disciplines including the generation of concordances, attempts at dating works of major authors, proving authorship, defining style, and compiling indexes. Discusses lexicographical uses and…

  5. Human Computer Interface Design Criteria. Volume 1. User Interface Requirements

    DTIC Science & Technology

    2010-03-19

    Table-of-contents and menu excerpts from the requirements document, covering topics such as support for printing, internationalized Web sites, and standard menu items (Revert, Page Setup, Print Preview, Print, Send To, Properties).

  6. An intelligent multi-media human-computer dialogue system

    NASA Technical Reports Server (NTRS)

    Neal, J. G.; Bettinger, K. E.; Byoun, J. S.; Dobes, Z.; Thielman, C. Y.

    1988-01-01

    Sophisticated computer systems are being developed to assist in the human decision-making process for very complex tasks performed under stressful conditions. The human-computer interface is a critical factor in these systems. The human-computer interface should be simple and natural to use, require a minimal learning period, assist the user in accomplishing his task(s) with a minimum of distraction, present output in a form that best conveys information to the user, and reduce cognitive load for the user. In pursuit of this ideal, the Intelligent Multi-Media Interfaces project is devoted to the development of interface technology that integrates speech, natural language text, graphics, and pointing gestures for human-computer dialogues. The objective of the project is to develop interface technology that uses the media/modalities intelligently in a flexible, context-sensitive, and highly integrated manner modelled after the manner in which humans converse in simultaneous coordinated multiple modalities. As part of the project, a knowledge-based interface system, called CUBRICON (CUBRC Intelligent CONversationalist) is being developed as a research prototype. The application domain being used to drive the research is that of military tactical air control.

  7. Human computer interaction issues in Clinical Trials Management Systems.

    PubMed

    Starren, Justin B; Payne, Philip R O; Kaufman, David R

    2006-01-01

    Clinical trials increasingly rely upon web-based Clinical Trials Management Systems (CTMS). As with clinical care systems, Human Computer Interaction (HCI) issues can greatly affect the usefulness of such systems. Evaluation of the user interface of one web-based CTMS revealed a number of potential human-computer interaction problems, in particular, increased workflow complexity associated with a web application delivery model and potential usability problems resulting from the use of ambiguous icons. Because these design features are shared by a large fraction of current CTMS, the implications extend beyond this individual system.

  8. Human Computer Interaction Issues in Clinical Trials Management Systems

    PubMed Central

    Starren, Justin B.; Payne, Philip R.O.; Kaufman, David R.

    2006-01-01

    Clinical trials increasingly rely upon web-based Clinical Trials Management Systems (CTMS). As with clinical care systems, Human Computer Interaction (HCI) issues can greatly affect the usefulness of such systems. Evaluation of the user interface of one web-based CTMS revealed a number of potential human-computer interaction problems, in particular, increased workflow complexity associated with a web application delivery model and potential usability problems resulting from the use of ambiguous icons. Because these design features are shared by a large fraction of current CTMS, the implications extend beyond this individual system. PMID:17238728

  9. Development of the Next Generation of Adaptive Interfaces

    DTIC Science & Technology

    2015-03-01

    user interfaces and human-computer interaction will be extended by this tailored interface innovation to investigate Soldier performance benefits of... individual differences, human-computer interaction, cognitive styles, user interface, UAV... information without the added complexity of current adaptive interfaces. The field of adaptive user interfaces (UIs) and human-computer interaction

  10. Work-Centered Support System Technology: A New Interface Client Technology for the Battlespace Infosphere

    DTIC Science & Technology

    2000-01-01

    Collaborative Support, Decision Aiding, Direct Manipulation, Graphical User Interface, Human-Computer Interaction, Information Overload, Interface Agents, Network-Centered...

  11. Metaphors for Interface Design.

    ERIC Educational Resources Information Center

    Hutchins, Edwin

    This discussion of the utilization by computer designers and users of metaphors as organizing structures for dealing with the complexity of behavior of human/computer interfaces begins by identifying three types of metaphor that describe various aspects of human-computer interface design, i.e., activity, mode of interaction, and task domain. The…

  12. Human computer interaction using hand gesture.

    PubMed

    Wan, Silas; Nguyen, Hung T

    2008-01-01

    Hand gesture is a very natural form of human interaction and can be used effectively in human computer interaction (HCI). This project involves the design and implementation of an HCI using a small hand-worn wireless module with a 3-axis accelerometer as the motion sensor. The small stand-alone unit contains an accelerometer and a wireless Zigbee transceiver with a microcontroller. To minimize intrusiveness to the user, the module is designed to be small (3 cm by 4 cm). A time-delay neural network algorithm is developed to analyze the time-series data from the 3-axis accelerometer. Power consumption is reduced by the non-continuous transmission of data and the use of low-power components, an efficient algorithm, and a sleep mode between samples for the wireless module. A home control interface is designed so that the user can control home appliances by moving through menus. The results demonstrate the feasibility of controlling home appliances using hand gestures and would present an opportunity for a section of the aging population and disabled people to lead a more independent life.
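
    A time-delay network of the kind mentioned above effectively consumes a fixed window of delayed accelerometer samples. The sketch below illustrates only that idea with a flattened window fed to a small generic classifier; the window length, gesture set, and synthetic data are assumptions and do not reproduce the authors' network.

      # Minimal sketch of the recognition idea (not the authors' time-delay
      # network): a fixed window of 3-axis accelerometer samples is flattened into
      # one feature vector and classified by a small neural network.
      import numpy as np
      from sklearn.neural_network import MLPClassifier

      WINDOW = 20                    # samples of (x, y, z) per gesture window

      def make_gesture(kind, rng):
          t = np.linspace(0, 1, WINDOW)
          noise = 0.1 * rng.standard_normal((WINDOW, 3))
          if kind == "shake":        # oscillation mostly on the x axis
              sig = np.stack([np.sin(8 * np.pi * t), 0 * t, 0 * t], axis=1)
          else:                      # "tilt": slow ramp mostly on the z axis
              sig = np.stack([0 * t, 0 * t, t], axis=1)
          return (sig + noise).ravel()          # flatten to a 60-dim delay vector

      rng = np.random.default_rng(5)
      labels = ["shake", "tilt"] * 100
      X = np.array([make_gesture(k, rng) for k in labels])
      y = np.array([0 if k == "shake" else 1 for k in labels])

      clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
      clf.fit(X[:160], y[:160])
      print("held-out accuracy:", clf.score(X[160:], y[160:]))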

  13. The Quantum Human Computer (QHC) Hypothesis

    ERIC Educational Resources Information Center

    Salmani-Nodoushan, Mohammad Ali

    2008-01-01

    This article attempts to suggest the existence of a human computer called Quantum Human Computer (QHC) on the basis of an analogy between human beings and computers. To date, there are two types of computers: Binary and Quantum. The former operates on the basis of binary logic where an object is said to exist in either of the two states of 1 and…

  14. Enhancing Learning through Human Computer Interaction

    ERIC Educational Resources Information Center

    McKay, Elspeth, Ed.

    2007-01-01

    Enhancing Learning Through Human Computer Interaction is an excellent reference source for human computer interaction (HCI) applications and designs. This "Premier Reference Source" provides a complete analysis of online business training programs and e-learning in the higher education sector. It describes a range of positive outcomes for linking…

  17. Applications of airborne ultrasound in human-computer interaction.

    PubMed

    Dahl, Tobias; Ealo, Joao L; Bang, Hans J; Holm, Sverre; Khuri-Yakub, Pierre

    2014-09-01

    Airborne ultrasound is a rapidly developing subfield within human-computer interaction (HCI). Touchless ultrasonic interfaces and pen tracking systems are part of recent trends in HCI and are gaining industry momentum. This paper aims to provide the background and overview necessary to understand the capabilities of ultrasound and its potential future in human-computer interaction. The latest developments on the ultrasound transducer side are presented, focusing on capacitive micro-machined ultrasonic transducers, or CMUTs. Their introduction is an important step toward providing real, low-cost multi-sensor array and beam-forming options. We also provide a unified mathematical framework for understanding and analyzing algorithms used for ultrasound detection and tracking for some of the most relevant applications. Copyright © 2014. Published by Elsevier B.V.
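
    For readers unfamiliar with the array processing mentioned above, the sketch below shows a generic delay-and-sum direction-of-arrival estimate for a small linear receiver array at ultrasonic frequencies. It is a standard textbook technique, not code from the paper, and the array geometry, carrier frequency, and simulated signal are illustrative assumptions.

      # Generic delay-and-sum sketch (not from the paper): steer a small linear
      # array of airborne-ultrasound receivers over candidate angles and pick the
      # direction with maximum summed energy.
      import numpy as np

      C = 343.0              # speed of sound in air (m/s)
      FS = 1_000_000         # sampling rate (Hz)
      F0 = 40_000            # ultrasonic carrier (Hz)
      SPACING = 0.004        # element spacing (m), under half a wavelength at 40 kHz
      N_ELEM = 8

      def simulate_array(angle_deg, n_samples=2000, rng=None):
          """Far-field tone arriving from angle_deg (0 = broadside)."""
          rng = rng or np.random.default_rng(0)
          t = np.arange(n_samples) / FS
          delays = SPACING * np.arange(N_ELEM) * np.sin(np.radians(angle_deg)) / C
          data = [np.sin(2 * np.pi * F0 * (t - d)) for d in delays]
          return np.array(data) + 0.2 * rng.standard_normal((N_ELEM, n_samples))

      def delay_and_sum_doa(data, candidates=np.arange(-60, 61, 1)):
          """Return the candidate angle whose steered sum has the most energy."""
          n = data.shape[1]
          best_angle, best_energy = None, -np.inf
          for ang in candidates:
              delays = SPACING * np.arange(N_ELEM) * np.sin(np.radians(ang)) / C
              steered = np.zeros(n)
              for ch, d in zip(data, delays):
                  # Integer-sample shift is a crude but adequate delay for a sketch.
                  steered += np.roll(ch, -int(round(d * FS)))
              energy = np.sum(steered ** 2)
              if energy > best_energy:
                  best_angle, best_energy = ang, energy
          return best_angle

      print("estimated angle:", delay_and_sum_doa(simulate_array(25.0)), "deg (true 25)")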

  18. Quantitative comparison of human computer interaction for direct prescription entry systems.

    PubMed

    Endoh, A; Minato, K; Komori, M; Inoue, Y; Nagata, S; Takahashi, T

    1995-01-01

    An objective and quantitative method is described for evaluating human-computer interaction (interface) in a direct prescription entry system. This method is based on a GOMS model and represented by a tree structure. Three different interfaces at university hospitals were compared by this evaluation method, and the differences among them were measured.
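
    GOMS-family evaluations of the kind described above typically predict task time by summing standard operator times over the operator sequence each interface requires. The sketch below uses the commonly cited keystroke-level-model unit times; the two operator sequences for "prescription-entry dialogues" are made up for illustration and are not taken from the paper's tree-structured model.

      # Sketch of a keystroke-level (GOMS-family) comparison, not the paper's
      # exact model: total task time is the sum of standard operator times over
      # the operator sequence for each interface.

      OPERATOR_TIME_S = {
          "K": 0.20,   # keystroke (skilled typist)
          "P": 1.10,   # point with a mouse
          "H": 0.40,   # home hands between keyboard and mouse
          "M": 1.35,   # mental preparation
      }

      def predicted_time(operators):
          return sum(OPERATOR_TIME_S[op] for op in operators)

      # Hypothetical sequences for entering one drug, dose and frequency:
      menu_driven = list("MHPPMHP") + list("K" * 4)          # point-and-click dialogue
      code_entry  = list("M") + list("K" * 12)               # typed order codes

      print("menu-driven: %.1f s" % predicted_time(menu_driven))
      print("code entry : %.1f s" % predicted_time(code_entry))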

  19. Applied human factors research at the NASA Johnson Space Center Human-Computer Interaction Laboratory

    NASA Technical Reports Server (NTRS)

    Rudisill, Marianne; Mckay, Timothy D.

    1990-01-01

    The applied human factors research program performed at the NASA Johnson Space Center's Human-Computer Interaction Laboratory is discussed. Research is conducted to advance knowledge in human interaction with computer systems during space crew tasks. In addition, the Laboratory is directly involved in the specification of the human-computer interface (HCI) for space systems in development (e.g., Space Station Freedom) and is providing guidelines and support for HCI design to current and future space missions.

  1. Occupational stress in human computer interaction.

    PubMed

    Smith, M J; Conway, F T; Karsh, B T

    1999-04-01

    There have been a variety of research approaches that have examined the stress issues related to human computer interaction including laboratory studies, cross-sectional surveys, longitudinal case studies and intervention studies. A critical review of these studies indicates that there are important physiological, biochemical, somatic and psychological indicators of stress that are related to work activities where human computer interaction occurs. Many of the stressors of human computer interaction at work are similar to those stressors that have historically been observed in other automated jobs. These include high workload, high work pressure, diminished job control, inadequate employee training to use new technology, monotonous tasks, poor supervisory relations, and fear for job security. New stressors have emerged that can be tied primarily to human computer interaction. These include technology breakdowns, technology slowdowns, and electronic performance monitoring. The effects of the stress of human computer interaction in the workplace are increased physiological arousal; somatic complaints, especially of the musculoskeletal system; mood disturbances, particularly anxiety, fear and anger; and diminished quality of working life, such as reduced job satisfaction. Interventions to reduce the stress of computer technology have included improved technology implementation approaches and increased employee participation in implementation. Recommendations for ways to reduce the stress of human computer interaction at work are presented. These include proper ergonomic conditions, increased organizational support, improved job content, proper workload to decrease work pressure, and enhanced opportunities for social support. A model approach to the design of human computer interaction at work that focuses on the system "balance" is proposed.

  2. Human-computer interaction in multitask situations

    NASA Technical Reports Server (NTRS)

    Rouse, W. B.

    1977-01-01

    Human-computer interaction in multitask decisionmaking situations is considered, and it is proposed that humans and computers have overlapping responsibilities. Queueing theory is employed to model this dynamic approach to the allocation of responsibility between human and computer. Results of simulation experiments are used to illustrate the effects of several system variables including number of tasks, mean time between arrivals of action-evoking events, human-computer speed mismatch, probability of computer error, probability of human error, and the level of feedback between human and computer. Current experimental efforts are discussed and the practical issues involved in designing human-computer systems for multitask situations are considered.
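
    The queueing-theory framing above can be illustrated with a small discrete-event simulation in which action-evoking events arrive at random and are handled by whichever of two heterogeneous servers, human or computer, becomes free first, with different service times and error probabilities. The dispatch rule and every parameter below are made-up assumptions, not the paper's model.

      # Illustrative discrete-event sketch (not the paper's model): events are
      # served by a human or a computer with different speeds and error rates.
      import random

      def simulate(n_events=10_000, mean_interarrival=1.0,
                   human_service=1.5, computer_service=0.5,
                   p_human_error=0.01, p_computer_error=0.05, seed=0):
          rng = random.Random(seed)
          t = 0.0
          free_at = {"human": 0.0, "computer": 0.0}   # time each server becomes free
          waits, errors = [], 0
          for _ in range(n_events):
              t += rng.expovariate(1.0 / mean_interarrival)    # next event arrives
              server = min(free_at, key=free_at.get)           # simple dispatch rule
              start = max(t, free_at[server])
              mean_service = human_service if server == "human" else computer_service
              free_at[server] = start + rng.expovariate(1.0 / mean_service)
              waits.append(start - t)
              p_err = p_human_error if server == "human" else p_computer_error
              errors += rng.random() < p_err
          return sum(waits) / len(waits), errors / n_events

      mean_wait, error_rate = simulate()
      print("mean wait %.2f time units, overall error rate %.3f" % (mean_wait, error_rate))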

  4. Human-Computer Interaction and Virtual Environments

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler)

    1995-01-01

    The proceedings of the Workshop on Human-Computer Interaction and Virtual Environments are presented along with a list of attendees. The objectives of the workshop were to assess the state-of-technology and level of maturity of several areas in human-computer interaction and to provide guidelines for focused future research leading to effective use of these facilities in the design/fabrication and operation of future high-performance engineering systems.

  5. Metaphors for the Nature of Human-Computer Interaction in an Empowering Environment: Interaction Style Influences the Manner of Human Accomplishment.

    ERIC Educational Resources Information Center

    Weller, Herman G.; Hartson, H. Rex

    1992-01-01

    Describes human-computer interface needs for empowering environments in computer usage in which the machine handles the routine mechanics of problem solving while the user concentrates on its higher order meanings. A closed-loop model of interaction is described, interface as illusion is discussed, and metaphors for human-computer interaction are…

  6. Choice of Human-Computer Interaction Mode in Stroke Rehabilitation.

    PubMed

    Mousavi Hondori, Hossein; Khademi, Maryam; Dodakian, Lucy; McKenzie, Alison; Lopes, Cristina V; Cramer, Steven C

    2016-03-01

    Advances in technology are providing new forms of human-computer interaction. The current study examined one form of human-computer interaction, augmented reality (AR), whereby subjects train in the real-world workspace with virtual objects projected by the computer. Motor performances were compared with those obtained while subjects used a traditional human-computer interaction, that is, a personal computer (PC) with a mouse. Patients used goal-directed arm movements to play AR and PC versions of the Fruit Ninja video game. The 2 versions required the same arm movements to control the game but had different cognitive demands. With AR, the game was projected onto the desktop, where subjects viewed the game plus their arm movements simultaneously, in the same visual coordinate space. In the PC version, subjects used the same arm movements but viewed the game by looking up at a computer monitor. Among 18 patients with chronic hemiparesis after stroke, the AR game was associated with 21% higher game scores (P = .0001), 19% faster reaching times (P = .0001), and 15% less movement variability (P = .0068), as compared to the PC game. Correlations between game score and arm motor status were stronger with the AR version. Motor performances during the AR game were superior to those during the PC game. This result is due in part to the greater cognitive demands imposed by the PC game, a feature problematic for some patients but clinically useful for others. Mode of human-computer interface influences rehabilitation therapy demands and can be individualized for patients. © The Author(s) 2015.

  7. Choice of Human-Computer Interaction Mode in Stroke Rehabilitation

    PubMed Central

    Hondori, Hossein Mousavi; Khademi, Maryam; Dodakian, Lucy; McKenzie, Alison; Lopes, Cristina V.; Cramer, Steven C.

    2015-01-01

    Background and objective: Advances in technology are providing new forms of human-computer interaction. The current study examined one form of human-computer interaction, augmented reality (AR), whereby subjects train in the real-world workspace with virtual objects projected by the computer. Motor performances were compared with those obtained while subjects used a traditional human-computer interaction, i.e., a personal computer (PC) with a mouse. Methods: Patients used goal-directed arm movements to play AR and PC versions of the Fruit Ninja video game. The two versions required the same arm movements to control the game but had different cognitive demands. With AR, the game was projected onto the desktop, where subjects viewed the game plus their arm movements simultaneously, in the same visual coordinate space. In the PC version, subjects used the same arm movements but viewed the game by looking up at a computer monitor. Results: Among 18 patients with chronic hemiparesis after stroke, the AR game was associated with 21% higher game scores (p=0.0001), 19% faster reaching times (p=0.0001), and 15% less movement variability (p=0.0068), as compared to the PC game. Correlations between game score and arm motor status were stronger with the AR version. Conclusions: Motor performances during the AR game were superior to those during the PC game. This result is due in part to the greater cognitive demands imposed by the PC game, a feature problematic for some patients but preferred for others. Mode of human-computer interface influences rehabilitation therapy demands and can be individualized for patients. PMID:26138411

  8. Towards Better Human Robot Interaction: Understand Human Computer Interaction in Social Gaming Using a Video-Enhanced Diary Method

    NASA Astrophysics Data System (ADS)

    See, Swee Lan; Tan, Mitchell; Looi, Qin En

    This paper presents findings from descriptive research on social gaming. A video-enhanced diary method was used to understand the user experience in social gaming. From this experiment, we found that natural human behavior and gamers' decision-making processes can be elicited and examined during human-computer interaction. This new information should be taken into account, as it can help us build better human-computer interfaces and human-robot interfaces in the future.

  9. Human Computer Interaction in the ALMA Control Room

    NASA Astrophysics Data System (ADS)

    Schilling, M.; Primet, R.; Pietriga, E.; Schwarz, J.

    2012-09-01

    The article describes the ALMA Operations Monitoring and Control (OMC) software and its next generation user interfaces, used by operators and astronomers to monitor and control the observing facility. These user interfaces bring state-of-the-art Human Computer Interaction (HCI) techniques to the ALMA Control Room: map visualisation, semantic zooming, navigation gestures, multiple coordinated views, and decrease of time-to-point. They enable users to stay in control of dozens of antennas, hundreds of devices, and thousands of baselines. The Atacama Large Millimeter/submillimeter Array (ALMA), an international radio-astronomy facility, is a partnership of North America, Europe and East Asia in cooperation with the Republic of Chile. It is located at the Altiplano de Chajnantor and is being operated from the Operations Support Facilities (OSF) near San Pedro de Atacama.

  10. An overview of human-computer interaction.

    PubMed

    Beaudouin-Lafon, M

    1993-01-01

    This article presents an overview of the field of human-computer interaction. This branch of computer science concerns the design, implementation and analysis of interactive computer systems. We show that this field is multidisciplinary in essence, involving social scientists as well as computer scientists, experts of application domains, graphics designers, etc. Once the fundamental aspects of human-computer interaction are presented, we take a practical approach in order to introduce the methods, tools and techniques that are available today for the design and implementation of interactive computer systems. Finally, we present the main directions of research in this domain.

  11. Factors in Human-Computer Interface Design (A Pilot Study).

    DTIC Science & Technology

    1994-12-01

    …such as misunderstanding the directions, confusion over what actions must be accomplished, inability of the student to correct typographical errors, etc. … Versus Poor Visual Clarity: in Figure 2, one of the features which produces poor visual clarity is the combination of italicized writing with a script … system shell, VP-Expert. The two primary books used to develop the expert system were An Introduction to Expert Systems by Robert J. Mockler and D.G…

  12. A Human-Computer Interface Vision for Naval Transformation

    DTIC Science & Technology

    2003-06-01

    …architectures to accommodate open systems have been developed, e.g., Sun J2EE® (Java 2 Platform, Enterprise Edition), Microsoft® .NET, and open-source, web… in modern web-based computing afford designs with a separate HCI presentation layer, which can use adapters to access task-relevant information from… application. The presentation and application tiers connect to corporate information management systems via Web Services and custom adapters. Presentation…

  13. User Language Considerations in Military Human-Computer Interface Design

    DTIC Science & Technology

    1988-06-30

    …Hospital … Personnel carrier / "Vehículo de transporte de personal" … Appendix B: Sample Task Stimuli … "…intersección, cruce hacia el este. Después de cruzar la segunda calle, en el lado sur del camino, encuentre el VEHÍCULO DE TRANSPORTE DE PERSONAL" ("…intersection, cross heading east. After crossing the second street, on the south side of the road, find the PERSONNEL CARRIER")…

  14. Unmanned Surface Vehicle Human-Computer Interface for Amphibious Operations

    DTIC Science & Technology

    2013-08-01

    …of the MOCU HCI and functions must account for limitations in human performance and offer enhancements to guide attention during multi-tasking… Mitigating Complexity; Decision Biases; Attention Allocation; Supervisory Monitoring of Operators; Trust and Reliability; Accountability… For the… which serves to validate the model results as well as defining system success. Workload management is a key design issue that must account for the…

  15. Exploring the Human Computer Interface and Photic Driving.

    DTIC Science & Technology

    1999-03-01

  16. Rule-Based Programming for Human-Computer Interface Specification.

    DTIC Science & Technology

    1982-01-01

    Additional matching convenience is provided by wild-card matches. Two symbols, * and ?, are used to allow matching on arbitrary objects in the… language. The * wild-card will match an arbitrary list structure, relation with arguments, or a constant; in addition, if * occurs more than once, each… matched value is not retained. The ? wild-card restricts the matching process more than *; it cannot match arbitrary lists and relations; rather, it is…
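
    For illustration only (Python, not the report's rule language), a minimal sketch of the matching semantics described above, simplified to flat token lists: * absorbs zero or more tokens, while ? absorbs exactly one token without retaining its value.

      def match(pattern, data):
          """Return True if `pattern` (a list that may contain '*' and '?')
          matches `data` (a flat list of tokens)."""
          if not pattern:
              return not data
          head, rest = pattern[0], pattern[1:]
          if head == "*":                      # '*' absorbs zero or more tokens
              return any(match(rest, data[i:]) for i in range(len(data) + 1))
          if head == "?":                      # '?' absorbs exactly one token
              return bool(data) and match(rest, data[1:])
          return bool(data) and data[0] == head and match(rest, data[1:])

      assert match(["open", "*", "window"], ["open", "the", "status", "window"])
      assert match(["press", "?"], ["press", "F1"])
      assert not match(["press", "?"], ["press"])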

  17. Developing a Digital Human-Computer Interaction Laboratory

    DTIC Science & Technology

    2005-07-01

    …Digital Human-Computer Interaction Laboratory, Lieutenant Colonel Terence S. Andre, USAF, IITA Research… design, and human-computer interaction. He also oversees the human-computer interaction laboratory and directs cooperative agreements with…

  18. Human-Computer Interaction. Second Edition.

    ERIC Educational Resources Information Center

    Dix, Alan J.; Finlay, Janet E.; Abowd, Gregory D.; Beale, Russell

    This book examines human-computer interaction (HCI), with a focus on designing computer technology to be more usable by people. The book provides a multi-disciplinary approach to the subject through a synthesis of computer science, cognitive science, psychology, and sociology, and stresses a principled approach to interactive systems design that…

  19. Applying Human Computation Methods to Information Science

    ERIC Educational Resources Information Center

    Harris, Christopher Glenn

    2013-01-01

    Human Computation methods such as crowdsourcing and games with a purpose (GWAP) have each recently drawn considerable attention for their ability to synergize the strengths of people and technology to accomplish tasks that are challenging for either to do well alone. Despite this increased attention, much of this transformation has been focused on…

  2. Human-Computer Interaction: A Review of the Research on Its Affective and Social Aspects.

    ERIC Educational Resources Information Center

    Deaudelin, Colette; Dussault, Marc; Brodeur, Monique

    2003-01-01

    Discusses a review of 34 qualitative and non-qualitative studies related to affective and social aspects of student-computer interactions. Highlights include the nature of the human-computer interaction (HCI); the interface, comparing graphic and text types; and the relation between variables linked to HCI, mainly trust, locus of control,…

  3. Human-Computer Interaction (HCI) in Educational Environments: Implications of Understanding Computers as Media.

    ERIC Educational Resources Information Center

    Berg, Gary A.

    2000-01-01

    Reviews literature in the field of human-computer interaction (HCI) as it applies to educational environments. Topics include the origin of HCI; human factors; usability; computer interface design; goals, operations, methods, and selection (GOMS) models; command language versus direct manipulation; hypertext; visual perception; interface…

  5. Validation and Application of COGNET Model of Human-Computer Interaction in Naval Air ASW

    DTIC Science & Technology

    1990-05-31

    …cyclically, but this produced side effects which could not be controlled in a straightforward manner. The immediate problem was solved by checking the triggers… design of more effective human-computer interfaces. This research developed the COGNET (COGnitive Network of Tasks) RTMT modeling framework, as an…

  6. Prosodic alignment in human-computer interaction

    NASA Astrophysics Data System (ADS)

    Suzuki, N.; Katagiri, Y.

    2007-06-01

    Androids that replicate humans in form also need to replicate them in behaviour to achieve a high level of believability or lifelikeness. We explore the minimal social cues that can induce in people the human tendency for social acceptance, or ethopoeia, toward artifacts, including androids. It has been observed that people exhibit a strong tendency to adjust to each other, through a number of speech and language features in human-human conversational interactions, to obtain communication efficiency and emotional engagement. We investigate in this paper the phenomena related to prosodic alignment in human-computer interactions, with particular focus on human-computer alignment of speech characteristics. We found that people exhibit unidirectional and spontaneous short-term alignment of loudness and response latency in their speech in response to computer-generated speech. We believe this phenomenon of prosodic alignment provides one of the key components for building social acceptance of androids.

  7. Human-Computer Interactions and Decision Behavior

    DTIC Science & Technology

    1984-01-01

    …Narang A. Cohill J. Pittman J. Elkerton M. Revesman R. Fainter C. Rieger L. Folley J. Schurick M. Hakkinen A. Siochi D. Johnson T. Spine C. Ku M. Sti… W., Yunten, T., Johnson, D. H. DMS: A comprehensive system for managing human-computer dialogue. In Proceedings of Human Factors in Computer… interactive system. Well-known software metrics are used in this analysis. 3. The Dialogue Author. a. Reports: Johnson, D. H., Hartson, H. R. The role…

  8. Human-Computer Interaction in Smart Environments

    PubMed Central

    Paravati, Gianluca; Gatteschi, Valentina

    2015-01-01

    Here, we provide an overview of the content of the Special Issue on “Human-computer interaction in smart environments”. The aim of this Special Issue is to highlight technologies and solutions encompassing the use of mass-market sensors in current and emerging applications for interacting with Smart Environments. Selected papers address this topic by analyzing different interaction modalities, including hand/body gestures, face recognition, gaze/eye tracking, biosignal analysis, speech and activity recognition, and related issues.

  9. Symposium on Human-Computer Information Retrieval.

    PubMed

    Tunkelang, Daniel; Capra, Robert; Golovchinsky, Gene; Kules, Bill; Smith, Catherine; White, Ryen

    2013-03-01

    Human-computer information retrieval (HCIR) is the study of information retrieval techniques that integrate human intelligence and algorithmic search to help people explore, understand, and use information. Since 2007, we have held an annual gathering of researchers and practitioners to advance the state of the art in this field. This meeting report summarizes the history of the HCIR symposium and emphasizes its relevance to the data science community.

  10. The GOURD model of human-computer interaction

    SciTech Connect

    Goldbogen, G.

    1996-12-31

    This paper presents a model, the GOURD model, that can be used to measure the goodness of "interactivity" of an interface design and indicates how to improve the design. The GOURD model describes what happens to the computer and to the human during a human-computer interaction. Since the interaction is generally repeated, repeated traversal of the model is similar to a loop programming structure. Because the model measures interaction over part or all of the application, it can also be used as a classifier of the part or the whole application. But primarily, the model is used as a design guide and a predictor of effectiveness.

  11. Human-computer interaction: psychology as a science of design.

    PubMed

    Carroll, J M

    1997-01-01

    Human-computer interaction (HCI) study is the region of intersection between psychology and the social sciences, on the one hand, and computer science and technology, on the other. HCI researchers analyze and design specific user interface technologies (e.g. pointing devices). They study and improve the processes of technology development (e.g. task analysis, design rationale). They develop and evaluate new applications of technology (e.g. word processors, digital libraries). Throughout the past two decades, HCI has progressively integrated its scientific concerns with the engineering goal of improving the usability of computer systems and applications, which has resulted in a body of technical knowledge and methodology. HCI continues to provide a challenging test domain for applying and developing psychological and social theory in the context of technology development and use.

  12. Head-controlled assistive telerobot with extended physiological proprioception capability

    NASA Astrophysics Data System (ADS)

    Salganicoff, Marcos; Rahman, Tariq; Mahoney, Ricardo; Pino, D.; Jayachandran, Vijay; Kumar, Vijay; Chen, Shoupu; Harwin, William S.

    1995-12-01

    People with disabilities such as quadriplegia can use mouth-sticks and head-sticks as extension devices to perform desired manipulations. These extensions provide extended proprioception, which allows users to directly feel forces and other perceptual cues, such as texture, present at the tip of the mouth-stick. Such devices are effective for two principal reasons: because of their close contact with the user's tactile and proprioceptive sensing abilities, and because they tend to be lightweight and very stiff and can thus convey tactile and kinesthetic information with high bandwidth. Unfortunately, traditional mouth-sticks and head-sticks are limited in workspace and in the mechanical power that can be transferred because of user mobility and strength limitations. We describe an alternative implementation of the head-stick device using the idea of a virtual head-stick: a head-controlled bilateral force-reflecting telerobot. In this system the end-effector of the slave robot moves as if it were at the tip of an imaginary extension of the user's head. The design goal is for the system to have the same intuitive operation and extended proprioception as a regular mouth-stick effector, but with augmentation of workspace volume and mechanical power. The input is through a specially modified six-DOF master robot (a PerForce™ hand controller) whose joints can be back-driven to apply forces at the user's head. The manipulation tasks in the environment are performed by a six-degree-of-freedom slave robot (the Zebra-ZERO™) with a built-in force sensor. We describe the prototype hardware/software implementation of the system, the control system design, safety/disability issues, and initial evaluation tasks.
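
    For illustration only, a one-degree-of-freedom sketch of the position-forward, force-reflecting idea behind such a telerobot: the slave is servoed toward a scaled copy of the master (head) position while the contact force measured at the slave is scaled back to the master. The gains, scaling factors, and 1-DOF simplification are invented assumptions, not the published controller.

      def bilateral_step(x_master, x_slave, v_slave, f_env,
                         scale=3.0, k_p=200.0, k_d=5.0, force_gain=0.5):
          """One step of a 1-DOF position-forward, force-reflecting loop.
          Returns (force command sent to the slave, force displayed at the master)."""
          x_desired = scale * x_master                            # master motion scaled into the slave workspace
          f_slave = k_p * (x_desired - x_slave) - k_d * v_slave   # PD servo toward the scaled target
          f_master = force_gain * f_env                           # environment force reflected back to the user
          return f_slave, f_master

      # toy numbers: master at 10 mm, slave lagging at 20 mm, 1 N contact force
      print(bilateral_step(x_master=0.010, x_slave=0.020, v_slave=0.0, f_env=1.0))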

  13. Fingertips detection for human computer interaction system

    NASA Astrophysics Data System (ADS)

    Alam, Md. Jahangir; Nasierding, Gulisong; Sajjanhar, Atul; Chowdhury, Morshed

    2014-01-01

    Fingertips of the human hand play an important role in hand-based interaction with computers. Identification of fingertip positions in hand images is vital for developing a human-computer interaction system. This paper proposes a novel method for detecting fingertips in a hand image by analyzing the geometrical structure of the fingers. The approach has three parts: first, the hand image is segmented to detect the hand; second, invariant features (curvature zero-crossing points) are extracted from the boundary of the hand; third, fingertips are detected. Experimental results show that the proposed approach is promising.
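
    For illustration only, a rough numpy sketch of the middle step: estimating discrete curvature along an already-segmented hand contour and flagging high-curvature boundary points as fingertip candidates. The smoothing window and threshold are placeholder assumptions, and the paper's exact zero-crossing criterion is not reproduced.

      import numpy as np

      def fingertip_candidates(contour, smooth=9, k_thresh=0.15):
          """contour: (N, 2) array of boundary points ordered along the hand outline.
          Returns indices of high-curvature boundary points (fingertip candidates)."""
          pts = contour.astype(float)
          kernel = np.ones(smooth) / smooth
          # light smoothing of the boundary (edge effects at the endpoints are ignored)
          x = np.convolve(pts[:, 0], kernel, mode="same")
          y = np.convolve(pts[:, 1], kernel, mode="same")
          dx, dy = np.gradient(x), np.gradient(y)
          ddx, ddy = np.gradient(dx), np.gradient(dy)
          curvature = (dx * ddy - dy * ddx) / np.maximum((dx**2 + dy**2) ** 1.5, 1e-9)
          return np.where(np.abs(curvature) > k_thresh)[0]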

  14. Formalisms for user interface specification and design

    NASA Technical Reports Server (NTRS)

    Auernheimer, Brent J.

    1989-01-01

    The application of formal methods to the specification and design of human-computer interfaces is described. A broad outline of human-computer interface problems, a description of the field of cognitive engineering and two relevant research results, the appropriateness of formal specification techniques, and potential NASA application areas are described.

  15. Improving the human-computer dialogue with increased temporal predictability.

    PubMed

    Weber, Florian; Haering, Carola; Thomaschke, Roland

    2013-10-01

    An experiment was conducted to investigate the impacts of length and variability of system response time (SRT) on user behavior and user experience (UX) in sequential computing tasks. Length is widely considered to be the most important aspect of SRTs in human-computer interaction. Research on temporal attention shows that humans adjust to temporal structures and that performance substantially improves with temporal predictability. Participants performed a sequential task with simulated office software. Duration and variability, that is, the number of different SRTs, were manipulated. Lower variability came at the expense of, on average, higher durations. User response times, task execution times, and failure rates were measured to assess user performance. UX was measured with a questionnaire. A reduction in variability improved user performance significantly. Whereas task load and failure rates remained constant, responses were significantly faster. Although a reduction in variability came along with, on average, increased SRTs, no difference in UX was found. Considering SRT variability when designing software can yield considerable performance benefits for the users. Although reduced variability comes at the expense of overall longer SRTs, the interface is not subjectively evaluated to be less satisfactory or demanding. Time design should aim not only at reducing average SRT length but also at finding the optimum balance of length and variability. Our findings can easily be applied in any user interface for sequential tasks. User performance can be improved without loss of satisfaction by selectively prolonging particular SRTs to reduce variability.
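
    The design recommendation (trade slightly longer average SRTs for fewer distinct SRTs) can be illustrated by quantizing predicted response times up to a small set of fixed levels; the level values below are arbitrary placeholders, not the study's conditions.

      def quantized_delay(predicted_srt, levels=(0.3, 0.6, 1.2)):
          """Return the delay at which the response should actually be shown:
          the smallest fixed level that is not shorter than the predicted SRT.
          Responses slower than every level cannot be shortened and pass through."""
          for level in sorted(levels):
              if predicted_srt <= level:
                  return level
          return predicted_srt

      # raw SRTs of 0.25 s, 0.55 s and 0.9 s collapse onto three fixed values
      print([quantized_delay(t) for t in (0.25, 0.55, 0.9, 1.5)])   # [0.3, 0.6, 1.2, 1.5]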

  16. Grasping force measurement of a 6-DOF haptic device for human-computer interaction

    NASA Astrophysics Data System (ADS)

    Wu, Z. C.; Wang, Zengfu; Ge, Yu

    2003-04-01

    A haptic interface device for human-computer interaction (HCI) is presented in this paper; it uses a 6-degree-of-freedom (DOF) force sensor for six-axis force and torque (F/T) measurement. With this device, the user can grasp a movable handle to interact with simulated 3D environments in real time for product design, simulation, and prediction. As a human-computer interface device, its operation mainly depends on the 6-DOF force and torque applied by the user to the handle. The structure of the 6-DOF sensor and its measurement principle are described in detail. The whole system consists of signal amplifiers, control motors, the integrated 6-DOF force sensor, and the power supplies needed for operation.
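
    In devices of this kind, the six force/torque components are commonly recovered from the raw gauge signals through a sensor-specific calibration (decoupling) matrix; the sketch below shows only that generic mapping, with an identity placeholder standing in for the real calibration matrix.

      import numpy as np

      def gauge_to_wrench(voltages, calibration):
          """Map raw gauge voltages to a 6-vector [Fx, Fy, Fz, Tx, Ty, Tz].
          `calibration` is the sensor-specific 6 x n_gauges decoupling matrix."""
          v = np.asarray(voltages, dtype=float)
          return calibration @ v

      # placeholder example with 6 gauges and an identity calibration matrix
      C = np.eye(6)
      wrench = gauge_to_wrench([0.1, -0.02, 0.4, 0.0, 0.03, -0.01], C)
      print("F/T:", np.round(wrench, 3))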

  17. Early NACA human computers at work

    NASA Technical Reports Server (NTRS)

    1949-01-01

    The women of the Computer Department at the NACA High-Speed Flight Research Station are shown busy with test flight calculations. The computers, under the direction of Roxanah Yancey, were responsible for accurate calculations on the research test flights made at the Station. There were no mechanical computers at the station in 1949, so data was reduced by human computers. Shown in this photograph, starting at the left, are Geraldine Mayer and Mary (Tut) Hedgepeth with Friden calculators on their desks; Emily Stephens conferring with engineer John Mayer; and Gertrude (Trudy) Valentine working on an oscillograph recording, reducing the data from a flight. Across the desk is Dorothy Clift Hughes, using a slide rule to complete data calculations. Roxanah Yancey completes the picture as she fills out engineering requests for further data.

  18. SIG -- The Role of Human-Computer Interaction in Next-Generation Control Rooms

    SciTech Connect

    Ronald L. Boring; Jacques Hugo; Christian Richard; Donald D. Dudenhoeffer

    2005-04-01

    The purpose of this CHI Special Interest Group (SIG) is to facilitate the convergence between human-computer interaction (HCI) and control room design. HCI researchers and practitioners actively need to infuse state-of-the-art interface technology into control rooms to meet usability, safety, and regulatory requirements. This SIG outlines potential HCI contributions to instrumentation and control (I&C) and automation in control rooms as well as to general control room design.

  19. Human-Computer Interaction in Tactical Operations: Designing for Effective Human-Computer Dialogue

    DTIC Science & Technology

    1990-09-01

    …761 (DOD, 1985) presents human engineering guidelines for management information systems. The process for analyzing, designing and testing UCI… user may encounter. For example, if the system is to be designed to be used with a commercially available word processor, data base manager or a… typing. Data bases: data base management systems, and in particular languages for querying data bases, are a specialized area of human-computer…

  20. Multimodal neuroelectric interface development

    NASA Technical Reports Server (NTRS)

    Trejo, Leonard J.; Wheeler, Kevin R.; Jorgensen, Charles C.; Rosipal, Roman; Clanton, Sam T.; Matthews, Bryan; Hibbs, Andrew D.; Matthews, Robert; Krupka, Michael

    2003-01-01

    We are developing electromyographic and electroencephalographic methods, which draw control signals for human-computer interfaces from the human nervous system. We have made progress in four areas: 1) real-time pattern recognition algorithms for decoding sequences of forearm muscle activity associated with control gestures; 2) signal-processing strategies for computer interfaces using electroencephalogram (EEG) signals; 3) a flexible computation framework for neuroelectric interface research; and 4) noncontact sensors, which measure electromyogram or EEG signals without resistive contact to the body.

  2. Discoveries and developments in human-computer interaction.

    PubMed

    Boehm-Davis, Deborah A

    2008-06-01

    This paper describes contributions made to the science and practice of human-computer interaction (HCI), primarily through Human Factors and the society's annual proceedings. Research in HCI began to appear in publications associated with the Society around 1980 and has continued through the present. A search of the literature appearing in either the journal or the proceedings was done to identify the specific contributions made by researchers in this area. More than 2,300 papers were identified, some comparing the actual or predicted performance of a new device, display format, or computer-based system with an existing or alternative system. Other work describes methods for evaluating systems performance. This work has had a tremendous impact, particularly the work of Fitts, Smith and Mosier, and Virzi. Work on HCI has contributed to (a) current national and international guidelines, (b) the development of user interface management systems, (c) the provision of guidance as to where best to invest resources when evaluating computing systems, and (d) the prediction of human performance using those systems.

  3. Wearable joystick for gloves-on human/computer interaction

    NASA Astrophysics Data System (ADS)

    Bae, Jaewook; Voyles, Richard M.

    2006-05-01

    In this paper, we present preliminary work on a novel wearable joystick for gloves-on human/computer interaction in hazardous environments. Interacting with traditional input devices can be clumsy and inconvenient for the operator in hazardous environments due to the bulkiness of multiple system components and troublesome wires. During a collapsed structure search, for example, protective clothing, uneven footing, and "snag" points in the environment can render traditional input devices impractical. Wearable computing has been studied by various researchers to increase the portability of devices and to improve the proprioceptive sense of the wearer's intentions. Specifically, glove-like input devices to recognize hand gestures have been developed for general-purpose applications. But, regardless of their performance, prior gloves have been fragile and cumbersome to use in rough environments. In this paper, we present a new wearable joystick to remove the wires from a simple, two-degree of freedom glove interface. Thus, we develop a wearable joystick that is low cost, durable and robust, and wire-free at the glove. In order to evaluate the wearable joystick, we take into consideration two metrics during operator tests of a commercial robot: task completion time and path tortuosity. We employ fractal analysis to measure path tortuosity. Preliminary user test results are presented that compare the performance of both a wearable joystick and a traditional joystick.
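
    The path-tortuosity metric can be approximated in several ways; as a hedged illustration of the fractal-analysis idea mentioned above, the sketch below estimates a box-counting dimension for a 2-D cursor path. The grid sizes and synthetic test paths are invented.

      import numpy as np

      def box_counting_dimension(path, box_sizes=(2, 4, 8, 16, 32)):
          """Estimate the fractal (box-counting) dimension of a 2-D path.
          `path` is an (N, 2) array of positions; larger values indicate
          a more tortuous trajectory."""
          pts = np.asarray(path, dtype=float)
          pts -= pts.min(axis=0)                      # shift into the positive quadrant
          span = max(pts.max(), 1e-9)
          counts = []
          for n in box_sizes:                         # n grid cells per axis
              cells = np.floor(pts / span * n).clip(0, n - 1).astype(int)
              counts.append(len({tuple(c) for c in cells}))
          slope, _ = np.polyfit(np.log(box_sizes), np.log(counts), 1)
          return slope

      # a straight path vs. a wiggly path: the wiggly one should score higher
      t = np.linspace(0, 1, 500)
      straight = np.c_[t, t]
      wiggly = np.c_[t, t + 0.05 * np.sin(40 * np.pi * t)]
      print(box_counting_dimension(straight), box_counting_dimension(wiggly))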

  4. User expertise in speech centered multimodal human computer interaction

    NASA Astrophysics Data System (ADS)

    Chandrasekaran, Rajesh; Dusan, Sorin; Flanagan, James L.

    2004-10-01

    Multimodal interfaces aim to permit natural communication by speech and gesture. Typically the speech modality bears the principal information in the interaction, with gesture complementing spoken commands. A continuing challenge is how to correlate and interpret the simultaneous inputs to estimate meaning and user intent. User expertise and familiarity figure prominently in the interpretation. The present research studies the effect of user expertise on multimodal human-computer interaction. Users are classified as experienced or inexperienced depending on the amount of their exposure to and interaction with multimodal systems. Each user is asked to perform simple tasks using a multimodal system. For each task the automatically recognized speech input is time-stamped and the lag or lead of the gesture input is computed with respect to this time stamp. The time interval around the time stamp in which all the users' gesture inputs occur is determined. For experienced users this interval averages 56.9% less than that for inexperienced users. The implication is that for experienced users the spoken input and the corresponding gesture input are more closely related in time than for inexperienced users. This behavior can be exploited in multimodal systems to increase efficiency and reduce the system's response time.
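
    The timing analysis described (lag or lead of each gesture relative to the recognized-speech time stamp, and the interval covering all gesture inputs) reduces to a few lines of arithmetic; the sample numbers below are invented for illustration.

      def gesture_offsets(speech_ts, gesture_ts):
          """Offsets (seconds) of each gesture relative to the speech time stamp:
          negative = gesture led the speech, positive = gesture lagged it."""
          return [g - speech_ts for g in gesture_ts]

      def covering_interval(all_offsets):
          """Smallest interval around the speech time stamp containing every offset."""
          return min(all_offsets), max(all_offsets)

      expert_offsets = gesture_offsets(2.40, [2.31, 2.45, 2.52])
      novice_offsets = gesture_offsets(2.40, [1.90, 2.70, 3.05])
      print(covering_interval(expert_offsets), covering_interval(novice_offsets))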

  5. New Perspectives on Human-Computer Interaction.

    ERIC Educational Resources Information Center

    Moran, Thomas P., Ed.; And Others

    1985-01-01

    Individual papers discussing various facets of human relationships with interactive computer systems present an analysis of direct manipulation interfaces; discuss notion of conceptual models shared by system and user and propose a design methodology for delivering models to users; and address the intelligibility of systems and importance of…

  6. Modeling Human-Computer Decision Making with Covariance Structure Analysis.

    ERIC Educational Resources Information Center

    Coovert, Michael D.; And Others

    Arguing that sufficient theory exists about the interplay between human information processing, computer systems, and the demands of various tasks to construct useful theories of human-computer interaction, this study presents a structural model of human-computer interaction and reports the results of various statistical analyses of this model.…

  7. Perceptual-Motor Control in Human-Computer Interaction.

    DTIC Science & Technology

    1996-03-01

    This report isolates and examines some of the emergent perceptual-motor issues raised by the new style in human-computer interaction. It concerns… be studied. I also cover research from both the motor-control and the human-computer interaction literature that applies to perceptual and motor aspects of menu selection.

  8. Questioning Mechanisms During Tutoring, Conversation, and Human-Computer Interaction

    DTIC Science & Technology

    1992-10-14

    …grant awarded to Arthur C. Graesser, entitled "Questioning Mechanisms during Tutoring, Conversation, and Human-Computer Interaction" (N00014-92-J-1826)… Memphis State University. Papers in refereed journals: Graesser, A. C…

  9. Evaluation of a computerized aid for creating human behavioral representations of human-computer interaction.

    PubMed

    Williams, Kent E; Voigt, Jeffrey R

    2004-01-01

    The research reported herein presents the results of an empirical evaluation that focused on the accuracy and reliability of cognitive models created using a computerized tool: the cognitive analysis tool for human-computer interaction (CAT-HCI). A sample of participants, expert in interacting with a newly developed tactical display for the U.S. Army's Bradley Fighting Vehicle, individually modeled their knowledge of 4 specific tasks employing the CAT-HCI tool. Measures of the accuracy and consistency of task models created by these task domain experts using the tool were compared with task models created by a double expert. The findings indicated a high degree of consistency and accuracy between the different "single experts" in the task domain in terms of the resultant models generated using the tool. Actual or potential applications of this research include assessing human-computer interaction complexity, determining the productivity of human-computer interfaces, and analyzing an interface design to determine whether methods can be automated.

  10. Improving Human Interfaces in Military Simulation Applications

    DTIC Science & Technology

    2006-09-01

    …Laboratory, 2255 H Street, WPAFB, OH 45433. Keywords: Human-Computer Interaction, User Interface, Simulation, OneSAF, GOMS… Reading, Massachusetts, 1998. [5] Card, S., Moran, T., & Newell, A.: The Psychology of Human-Computer Interaction, Lawrence Erlbaum Associates, Inc., New Jersey, 1983. [6] Helander, M., Landauer, T., & Prabhu, P.: Handbook of Human-Computer Interaction, North Holland, Amsterdam, 1997. [7] Brinck, T…

  11. Some aspects of optimal human-computer symbiosis in multisensor geospatial data fusion

    NASA Astrophysics Data System (ADS)

    Levin, E.; Sergeyev, A.

    Nowadays, the vast amount of available geospatial data provides additional opportunities to increase targeting accuracy through geospatial data fusion. One of the most obvious operations is determining targets' 3D shapes and geospatial positions from overlapping 2D imagery and sensor modeling. 3D models allow the extraction of information about targets that cannot be measured directly from single, non-fused imagery. The paper describes an ongoing research effort at Michigan Tech attempting to combine the advantages of human analysts and automated computer processing into an efficient human-computer symbiosis for geospatial data fusion. Specifically, the capabilities provided by integrating novel human-computer interaction methods such as eye tracking and EEG into geospatial targeting interfaces were explored. The paper describes the research performed and its results in more detail.

  12. The Cross-Cultural Study of Human-Computer Interaction: A Review of Research Methodology, Technology Transfer, and the Diffusion of Innovation.

    ERIC Educational Resources Information Center

    Day, Donald L.

    This paper examines the methodological literature of cross-cultural research to establish whether the means exist to identify culturally biased preconceptions implicit in human-computer interfaces, and to develop interfaces more attuned to the cultural differences of the users. It is the premise of this paper that cultural conditioning affects…

  13. Eye-voice-controlled interface

    NASA Technical Reports Server (NTRS)

    Glenn, Floyd A., III; Iavecchia, Helene P.; Ross, Lorna V.; Stokes, James M.; Weiland, William J.

    1986-01-01

    The Ocular Attention-Sensing Interface System (OASIS) is an innovative human-computer interface which utilizes eye movement and voice commands to communicate messages between the operator and the system. This report initially describes some technical issues relevant to the development of such an interface. The results of preliminary experiments which evaluate alternative eye processing algorithms and feedback techniques are presented. Candidate interface applications are also discussed.

  14. A method for evaluating head-controlled computer input devices using Fitts' law.

    PubMed

    Radwin, R G; Vanderheiden, G C; Lin, M L

    1990-08-01

    The discrete movement task employed in this study consisted of moving a cursor from the center of a computer display screen to circular targets located 24.4 and 110.9 mm away in eight radial directions. The target diameters were 2.7, 8.1, and 24.2 mm. Performance measures included movement time, cursor path distance, and root-mean-square cursor deviation. Ten subjects with no movement disabilities were studied using a conventional mouse and a lightweight ultrasonic head-controlled computer input pointing device. Average movement time was 306 ms (63%) greater for the head-controlled pointer than for the mouse. The effect of direction on movement time for the mouse was relatively small compared with the head-controlled pointer, for which movement time was lowest at 90 and 270 deg, corresponding to head extension and head flexion, respectively. Average path distance and root-mean-square displacement were lowest in the off-diagonal directions (0, 90, 180, and 270 deg). The methodology was also shown to be useful for evaluating performance with an alternative head-controlled input device for two subjects with cerebral palsy, and it measured subtle performance improvements after a disabled subject was provided with lateral torso support.
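
    Movement times from tasks of this kind are conventionally summarized with Fitts' law, MT = a + b*ID with ID = log2(2A/W); the sketch below fits that regression to illustrative data for two devices. The movement-time numbers are invented, not the study's results.

      import numpy as np

      def fitts_fit(amplitudes_mm, widths_mm, movement_times_s):
          """Fit MT = a + b * ID, with ID = log2(2A / W) (Fitts' original form).
          Returns intercept a (s), slope b (s/bit) and throughput 1/b (bits/s)."""
          ids = np.log2(2.0 * np.asarray(amplitudes_mm) / np.asarray(widths_mm))
          b, a = np.polyfit(ids, movement_times_s, 1)
          return a, b, 1.0 / b

      # illustrative data: two amplitudes x three target widths
      A = [24.4, 24.4, 24.4, 110.9, 110.9, 110.9]
      W = [2.7, 8.1, 24.2, 2.7, 8.1, 24.2]
      mt_mouse = [0.62, 0.50, 0.40, 0.78, 0.63, 0.52]      # invented numbers
      mt_head = [1.05, 0.85, 0.66, 1.30, 1.02, 0.83]       # invented numbers
      print("mouse a, b, TP:", fitts_fit(A, W, mt_mouse))
      print("head  a, b, TP:", fitts_fit(A, W, mt_head))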

  15. Impact of Cognitive Architectures on Human-Computer Interaction

    DTIC Science & Technology

    2014-09-01

    Army Research Laboratory, Impact of Cognitive Architectures on Human-Computer Interaction, by Sidney C Smith, ARL-TR-7092, September 2014. Approved for… Proving Ground, MD 21005-5067.

  16. Cognitive engineering models: A prerequisite to the design of human-computer interaction in complex dynamic systems

    NASA Technical Reports Server (NTRS)

    Mitchell, Christine M.

    1993-01-01

    This chapter examines a class of human-computer interaction applications, specifically the design of human-computer interaction for the operators of complex systems. Such systems include space systems (e.g., manned systems such as the Shuttle or space station, and unmanned systems such as NASA scientific satellites), aviation systems (e.g., the flight deck of 'glass cockpit' airplanes or air traffic control) and industrial systems (e.g., power plants, telephone networks, and sophisticated, e.g., 'lights out,' manufacturing facilities). The main body of human-computer interaction (HCI) research complements but does not directly address the primary issues involved in human-computer interaction design for operators of complex systems. Interfaces to complex systems are somewhat special. The 'user' in such systems - i.e., the human operator responsible for safe and effective system operation - is highly skilled, someone who in human-machine systems engineering is sometimes characterized as 'well trained, well motivated'. The 'job' or task context is paramount and, thus, human-computer interaction is subordinate to human job interaction. The design of human interaction with complex systems, i.e., the design of human job interaction, is sometimes called cognitive engineering.

  17. Advancements in remote physiological measurement and applications in human-computer interaction

    NASA Astrophysics Data System (ADS)

    McDuff, Daniel

    2017-04-01

    Physiological signals are important for tracking health and emotional states. Imaging photoplethysmography (iPPG) is a set of techniques for remotely recovering cardio-pulmonary signals from video of the human body. Advances in iPPG methods over the past decade, combined with the ubiquity of digital cameras, present the possibility of many new, low-cost applications of physiological monitoring. This talk highlights methods for recovering physiological signals, work characterizing the impact of video parameters and hardware on these measurements, and applications of this technology in human-computer interfaces.
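
    For illustration only, a bare-bones sketch of the basic iPPG recovery step: average the green channel over a face region in each frame, band-pass filter the resulting trace, and read the pulse rate from the dominant spectral peak. The ROI handling, frame rate, and filter band are assumptions; practical iPPG pipelines are considerably more robust.

      import numpy as np
      from scipy.signal import butter, filtfilt

      def pulse_rate_bpm(frames, fps=30.0, low_hz=0.7, high_hz=4.0):
          """frames: iterable of H x W x 3 uint8 face-ROI images (RGB order).
          Returns an estimated pulse rate in beats per minute."""
          # spatial average of the green channel per frame -> 1-D signal
          signal = np.array([frame[:, :, 1].mean() for frame in frames], dtype=float)
          signal -= signal.mean()
          # band-pass around plausible heart-rate frequencies
          b, a = butter(3, [low_hz / (fps / 2), high_hz / (fps / 2)], btype="band")
          filtered = filtfilt(b, a, signal)
          # frequency of the largest spectral peak, converted to beats per minute
          freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
          spectrum = np.abs(np.fft.rfft(filtered))
          return 60.0 * freqs[np.argmax(spectrum)]

      # synthetic check: 10 s of 8 x 8 "video" with a 1.2 Hz (72 bpm) green modulation
      t = np.arange(0, 10, 1 / 30.0)
      frames = [(np.full((8, 8, 3), 128) + 10 * np.sin(2 * np.pi * 1.2 * ti)).astype(np.uint8)
                for ti in t]
      print(pulse_rate_bpm(frames))   # should print a value close to 72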

  18. Making intelligent systems team players: Case studies and design issues. Volume 1: Human-computer interaction design

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Schreckenghost, Debra L.; Woods, David D.; Potter, Scott S.; Johannesen, Leila; Holloway, Matthew; Forbus, Kenneth D.

    1991-01-01

    Initial results are reported from a multi-year, interdisciplinary effort to provide guidance and assistance for designers of intelligent systems and their user interfaces. The objective is to achieve more effective human-computer interaction (HCI) for systems with real time fault management capabilities. Intelligent fault management systems within the NASA were evaluated for insight into the design of systems with complex HCI. Preliminary results include: (1) a description of real time fault management in aerospace domains; (2) recommendations and examples for improving intelligent systems design and user interface design; (3) identification of issues requiring further research; and (4) recommendations for a development methodology integrating HCI design into intelligent system design.

  19. New Theoretical Approaches for Human-Computer Interaction.

    ERIC Educational Resources Information Center

    Rogers, Yvonne

    2004-01-01

    Presents a critique of recent theoretical developments in the field of human-computer interaction (HCI) together with an overview of HCI practice. This chapter discusses why theoretically based approaches have had little impact on the practice of interaction design and suggests mechanisms to enable designers and researchers to better articulate…

  20. Is Human-Computer Interaction Social or Parasocial?

    ERIC Educational Resources Information Center

    Sundar, S. Shyam

    Conducted in the attribution-research paradigm of social psychology, a study examined whether human-computer interaction is fundamentally social (as in human-human interaction) or parasocial (as in human-television interaction). All 30 subjects (drawn from an undergraduate class on communication) were exposed to an identical interaction with…

  2. A Graphical User Interface Design for Shipboard Damage Control

    DTIC Science & Technology

    1991-08-26

    …L. Tate, Human Computer Interaction Branch, Information Technology Division, August 26, 1991. Approved for public release; distribution… control, computer graphics, computerized control systems, human-computer interaction, man-machine interface… "Technology for Damage Control," David Taylor Research Center Report DTRC/PAS-90-7, June 1990. 3. A. Monk, ed., Fundamentals of Human-Computer Interaction (Academic…

  3. [Relationship between the prone position and achieving head control at 3 months].

    PubMed

    Pérez-Machado, J L; Rodríguez-Fuentes, G

    2013-10-01

    Owing to the significant increase in mild motor delays and the strong intolerance of infants to being placed in the prone position observed in the Physiotherapy Unit of the Maternal and Children's University Hospital of the Canaries (HUMIC), a study was conducted to determine whether positioning infants in the prone position while awake affected the achievement and quality of head control at three months. A prospective comparative practice-based study was performed on a representative sample of 67 healthy infants born in the HUMIC, divided into an experimental group (n = 35) and a control group (n = 32). The Alberta Infant Motor Scale (AIMS) and a parent questionnaire were used as measurement tools. The intervention consisted of regular home visits to the experimental group (from the first to the third month). The two groups were evaluated in their homes at the end of 3 months. The mean raw AIMS scores at 3 months were 16.26 in the experimental group and 10.38 in the control group (P<.001). The mean percentile was 94 in the experimental group and less than 50 (42) in the control group. All of the infants in the experimental group achieved head control, compared with only 8 (25%) in the control group. These significant findings suggest a direct relationship between the time spent in the prone position while the baby is awake and the achievement of head control at three months. Copyright © 2012 Asociación Española de Pediatría. Published by Elsevier España. All rights reserved.

  4. Human computers: the first pioneers of the information age.

    PubMed

    Grier, D A

    2001-03-01

    Before computers were machines, they were people. They were men and women, young and old, well educated and common. They were the workers who convinced scientists that large-scale calculation had value. Long before Presper Eckert and John Mauchly built the ENIAC at the Moore School of Electrical Engineering in Philadelphia, or Maurice Wilkes designed the EDSAC at Cambridge University, human computers had created the discipline of computation. They developed numerical methodologies and proved them on practical problems. These human computers were not savants or calculating geniuses. Some knew little more than basic arithmetic. A few were near equals of the scientists they served and, in a different time or place, might have become practicing scientists had they not been barred from a scientific career by their class, education, gender or ethnicity.

  5. Some Applications of String Algorithms in Human-Computer Interaction

    NASA Astrophysics Data System (ADS)

    Räihä, Kari-Jouko

    Two applications of string algorithms in human-computer interaction are reviewed: one for comparing error rates of text entry techniques, another for abstracting collections of scan paths (paths of eye movements). For both applications, the classic string edit distance algorithm proves useful. For the latter application shortest common supersequences provide one option for further development. Applying them as such could be misleading, but a suitable approximation could provide a useful representation of a set of scan paths.
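
    The classic edit-distance algorithm referred to above is a small dynamic program; a plain Python version follows (character strings for text entry, though any token sequence, such as fixation identifiers in a scan path, works the same way).

      def edit_distance(a, b):
          """Levenshtein distance between two sequences (insert/delete/substitute)."""
          prev = list(range(len(b) + 1))
          for i, x in enumerate(a, start=1):
              curr = [i]
              for j, y in enumerate(b, start=1):
                  curr.append(min(prev[j] + 1,              # deletion
                                  curr[j - 1] + 1,          # insertion
                                  prev[j - 1] + (x != y)))  # substitution
              prev = curr
          return prev[-1]

      # text-entry example: presented string vs. transcribed string
      print(edit_distance("the quick brown", "teh quick brwn"))   # 3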

  6. Eye Tracking in Human-Computer Interaction and Usability Research

    NASA Astrophysics Data System (ADS)

    Strandvall, Tommy

    The objective of the tutorial is to give an overview on how eye tracking is currently used and how it can be used as a method in human computer interaction research and especially in usability research. An eye tracking system records how the eyes move while a subject is completing a task for example on a web site. By analyzing these eye movements we are able to gain an objective insight into the behavior of that person.

  7. Human-Computer Interaction in the School of Computer Science

    DTIC Science & Technology

    1992-10-01

    The School of Computer Science (SCS) faculty who are interested in Human-Computer Interaction (HCI) present their position on what role HCI can play… in Carnegie Mellon's School of Computer Science. The authors present a short description of the need for HCI research and recommend a task/human… organizations at CMU. The authors recommend that the Computer Science Department form a new area in HCI. Research around the periphery of the task…

  8. Kinematic Measurement of 12-week Head Control Correlates with 12-month Neurodevelopment in Preterm Infants

    PubMed Central

    Bentzley, Jessica P; Coker-Bolt, Patty; Moreau, Noelle; Hope, Kathryn; Ramakrishnan, Viswanathan; Brown, Truman; Mulvihill, Denise; Jenkins, Dorothea

    2015-01-01

    Background: Although new interventions treating neonatal brain injury show great promise, our current ability to predict clinical functional outcomes is poor. Quantitative biomarkers of long-term neurodevelopmental outcome are critically needed to gauge treatment efficacy. Kinematic measures derived from commonly used developmental tasks may serve as early objective markers of future motor outcomes. Aim: To develop reliable kinematic markers of head control at 12 weeks corrected gestational age (CGA) from two motor tasks: head lifting in prone and pull-to-sit. Study design and subjects: Prospective observational study of 22 preterm infants born between 24 and 34 weeks of gestation. Outcome measures: Bayley Scales of Infant Development III (Bayley) motor scores. Results: Intrarater and interrater reliability of prone head lift angles and pull-to-sit head angles were excellent. Prone head lift angles at 12 weeks CGA correlated with white matter NAA/Cho, concurrent Test of Infant Motor Performance (TIMP) scores, and 12-month Bayley motor scores. Head angles during pull-to-sit at 12 weeks CGA correlated with TIMP scores. Conclusions: Poor ability to lift the head in prone and an inability to align the head with the trunk during the pull-to-sit task were associated with poorer future motor outcome scores. Kinematic measurements of head control in early infancy may serve as reliable objective quantitative markers of future motor impairment and neurodevelopmental outcome. PMID:25621433

  9. Ocular attention-sensing interface system

    NASA Technical Reports Server (NTRS)

    Zaklad, Allen; Glenn, Floyd A., III; Iavecchia, Helene P.; Stokes, James M.

    1986-01-01

    The purpose of the research was to develop an innovative human-computer interface based on eye movement and voice control. By eliminating a manual interface (keyboard, joystick, etc.), OASIS provides a control mechanism that is natural, efficient, accurate, and low in workload.

  10. An Architectural Experience for Interface Design

    ERIC Educational Resources Information Center

    Gong, Susan P.

    2016-01-01

    The problem of human-computer interface design was brought to the foreground with the emergence of the personal computer, the increasing complexity of electronic systems, and the need to accommodate the human operator in these systems. With each new technological generation discovering the interface design problems of its own technologies, initial…

  12. Visual User Interfaces for Information Exploration.

    ERIC Educational Resources Information Center

    Shneiderman, Ben

    1991-01-01

    Discussion of human-computer interfaces focuses on the use of graphical and direct manipulation approaches to improve the user interface. Topics discussed include information seeking; computerized search capabilities, including full-text string searches, index searches, and hypertext; Boolean expressions; dynamic or direct manipulation queries;…

  13. Design of a new human-computer interactive device for projection display

    NASA Astrophysics Data System (ADS)

    Xu, Wei; Liu, Xiangdong; Meng, Xiao

    2005-02-01

    Projection displays are widely used as tools for multimedia in conference-room presentations, education centers, R&D centers, and elsewhere. To provide a more interactive environment, a new kind of human-computer interactive device is designed and presented. A two-dimensional CCD is the sensor of the unit. Through an optical filter, the CCD outputs a full video signal that includes a series of isolated positive pulses caused by the specific light-spot target generated by a special light-pen. Through a video sync separator and combinational and sequential logic processing of the full video signal, the target image's two-dimensional position on the light-sensitive layer of the CCD can be obtained. The light-pen also sends function-logic messages to the controller through wireless communication. A microcontroller combines the position information and the function message and then sends them to the computer through an RS-232 or USB interface. Software on the computer processes these messages: the light-spot's relative coordinates on the projection screen are obtained, and with the coordinates and the function message the software drives the computer to perform the corresponding functions. With the light-pen, a presenter can control the computer, take notes, and sketch ideas on the screen. The device is currently applied to LCD projection displays, and it can also be applied to any large-screen display. With further improvement of the system and the software, the functionality will become more powerful and provide a more interactive human-computer interface (HCI).
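
    The step that turns a detected spot position on the CCD into a position on the projected screen is, in the simplest case, a plane-to-plane (homography) mapping estimated from the four screen corners; the OpenCV-based sketch below is an editorial illustration with placeholder corner coordinates, not the paper's own logic circuitry.

      import numpy as np
      import cv2

      # four corners of the projected screen as seen by the CCD (placeholder pixel values)
      ccd_corners = np.float32([[102, 80], [538, 95], [525, 410], [96, 398]])
      # the same corners in screen coordinates (e.g., a 1024 x 768 desktop)
      screen_corners = np.float32([[0, 0], [1024, 0], [1024, 768], [0, 768]])

      H = cv2.getPerspectiveTransform(ccd_corners, screen_corners)

      def spot_to_screen(ccd_xy):
          """Map a detected light-spot position on the CCD to screen coordinates."""
          pt = np.float32([[ccd_xy]])                  # shape (1, 1, 2) as OpenCV expects
          return cv2.perspectiveTransform(pt, H)[0, 0]

      print(spot_to_screen((320, 240)))   # maps to roughly the centre of the screen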

  14. Human-computer Interaction System Based on GRBF & HMM

    NASA Astrophysics Data System (ADS)

    Juan, Wang

    Much research has been done on computer vision, but accuracy and speed have often been unsatisfactory. This paper introduces a human-computer interaction system based on GRBF and HMM. A GRBF artificial neural network is used to determine the position of the head, and an HMM to determine the position of the fingers. The line of sight and the direction of the fingers are combined to determine the user's input focus. The results show that the recognition accuracy and speed of the system are greatly increased in this way.

  15. Human-Computer Interaction, Tourism and Cultural Heritage

    NASA Astrophysics Data System (ADS)

    Cipolla Ficarra, Francisco V.

    We present a state of the art of human-computer interaction aimed at tourism and cultural heritage in several cities of the European Mediterranean. The work analyzes the main problems that derive from training treated as a business, problems that can derail the continued growth of HCI, new technologies, and the tourism industry. Through a semiotic and epistemological study, we detect current mistakes in the interrelations of the formal and factual sciences, as well as the human factors that influence the professionals devoted to developing interactive systems to safeguard and promote cultural heritage.

  16. Facilitation handlings induce increase in electromyographic activity of muscles involved in head control of cerebral palsy children.

    PubMed

    Simon, Anelise de Saldanha; do Pinho, Alexandre Severo; Grazziotin Dos Santos, Camila; Pagnussat, Aline de Souza

    2014-10-01

    This study aimed to investigate the electromyographic (EMG) activation of the main cervical muscles involved in head control during two postures widely used for the facilitation of head control in children with Cerebral Palsy (CP). A crossover trial involving 31 children with a clinical diagnosis of CP and spastic quadriplegia was conducted. Electromyography was used to measure muscular activity in randomized postures. Three positions were at rest: (a) lateral decubitus, (b) ventral decubitus on the floor and (c) ventral decubitus on the wedge. Handlings for facilitating head control were performed using the hip joint as the key point of control in two postures: (a) lateral decubitus and (b) ventral decubitus on the wedge. All children underwent standardized handlings, performed by the same researcher, who was experienced in neurodevelopmental treatment. The EMG signal was recorded from muscles involved in head control (paraspinal and sternocleidomastoid muscles) in the sagittal, frontal and transverse planes, at the fourth cervical vertebra (C4), tenth thoracic vertebra (T10) and sternocleidomastoid muscle (SCM) levels. The results showed a significant increase in muscle activation when handling was performed in lateral decubitus at the C4 (P<0.001), T10 (P<0.001) and SCM (P=0.02) levels. Significantly higher muscle activation was observed when handling was performed in lateral decubitus compared with ventral decubitus at the C4 level (P<0.001). Handling in ventral decubitus also induced an increase in EMG activation at the T10 (P=0.018) and SCM (P=0.004) levels but not at the C4 level (P=0.38). In conclusion, handlings performed in both positions may induce the facilitation of head control, as evaluated by the activity of cervical and upper trunk muscles. Handling performed in lateral decubitus may induce a slightly better facilitation of head control. These findings contribute to evidence-based physiotherapy practice for the rehabilitation of severely spastic quadriplegic CP children.

  17. CDROM User Interface Evaluation: The Appropriateness of GUIs.

    ERIC Educational Resources Information Center

    Bosch, Victoria Manglano; Hancock-Beaulieu, Micheline

    1995-01-01

    Assesses the appropriateness of GUIs (graphical user interfaces), more specifically Windows-based interfaces for CD-ROM. An evaluation model is described that was developed to carry out an expert evaluation of the interfaces of seven CD-ROM products. Results are discussed in light of HCI (human-computer interaction) usability criteria and design…

  18. Psychological Dimensions of User-Computer Interfaces. ERIC Digest.

    ERIC Educational Resources Information Center

    Marchionini, Gary

    This digest highlights several psychological dimensions of user-computer interfaces. First, the psychological theory behind interface design and the field of human-computer interaction (HCI) are discussed. Two psychological models, the information processing model of cognition and the mental model--both of which contribute to interface design--are…

  20. Safety Metrics for Human-Computer Controlled Systems

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy G; Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  1. How should Fitts' Law be applied to human-computer interaction?

    NASA Technical Reports Server (NTRS)

    Gillan, D. J.; Holden, K.; Adam, S.; Rudisill, M.; Magee, L.

    1992-01-01

    The paper challenges the notion that any Fitts' Law model can be applied generally to human-computer interaction, and proposes instead that applying Fitts' Law requires knowledge of the users' sequence of movements, direction of movement, and typical movement amplitudes as well as target sizes. Two experiments examined a text selection task with sequences of controlled movements (point-click and point-drag). For the point-click sequence, a Fitts' Law model that used the diagonal across the text object in the direction of pointing (rather than the horizontal extent of the text object) as the target size provided the best fit for the pointing time data, whereas for the point-drag sequence, a Fitts' Law model that used the vertical size of the text object as the target size gave the best fit. Dragging times were fitted well by Fitts' Law models that used either the vertical or horizontal size of the terminal character in the text object. Additional results of note were that pointing in the point-click sequence was consistently faster than in the point-drag sequence, and that pointing in either sequence was consistently faster than dragging. The discussion centres around the need to define task characteristics before applying Fitts' Law to an interface design or analysis, analyses of pointing and of dragging, and implications for interface design.
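
    The contrast the authors draw between target-size definitions can be made concrete with a small calculation. The sketch below uses the Shannon formulation of Fitts' Law with illustrative coefficients (the paper does not report these values) to show how the predicted movement time changes when the target size is taken as the diagonal of a text object versus its vertical extent.

```python
import math

def fitts_mt(amplitude, width, a=0.1, b=0.15):
    """Movement time from Fitts' Law, MT = a + b * log2(A/W + 1).

    The Shannon formulation and the coefficients a, b (seconds) are
    illustrative assumptions, not values reported in the paper.
    """
    return a + b * math.log2(amplitude / width + 1.0)

# A text object 120 px wide and 16 px tall, approached over 300 px.
w, h, amplitude = 120.0, 16.0, 300.0
diagonal = math.hypot(w, h)          # target size used for point-click pointing
print("point-click (diagonal target):", fitts_mt(amplitude, diagonal))
print("point-drag  (vertical target):", fitts_mt(amplitude, h))
```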

  2. Delays and user performance in human-computer-network interaction tasks.

    PubMed

    Caldwell, Barrett S; Wang, Enlie

    2009-12-01

    This article describes a series of studies conducted to examine factors affecting user perceptions, responses, and tolerance for network-based computer delays affecting distributed human-computer-network interaction (HCNI) tasks. HCNI tasks, even with increasing computing and network bandwidth capabilities, are still affected by human perceptions of delay and appropriate waiting times for information flow latencies. Conducted were 6 laboratory studies with university participants in China (Preliminary Experiments 1 through 3) and the United States (Experiments 4 through 6) to examine users' perceptions of elapsed time, effect of perceived network task performance partners on delay tolerance, and expectations of appropriate delays based on task, situation, and network conditions. Results across the six experiments indicate that users' delay tolerance and estimated delay were affected by multiple task and expectation factors, including task complexity and importance, situation urgency and time availability, file size, and network bandwidth capacity. Results also suggest a range of user strategies for incorporating delay tolerance in task planning and performance. HCNI user experience is influenced by combinations of task requirements, constraints, and understandings of system performance; tolerance is a nonlinear function of time constraint ratios or decay. Appropriate user interface tools providing delay feedback information can help modify user expectations and delay tolerance. These tools are especially valuable when delay conditions exceed a few seconds or when task constraints and system demands are high. Interface designs for HCNI tasks should consider assistant-style presentations of delay feedback, information freshness, and network characteristics. Assistants should also gather awareness of user time constraints.

  4. How should Fitts' Law be applied to human-computer interaction?

    PubMed

    Gillan, D J; Holden, K; Adam, S; Rudisill, M; Magee, L

    1992-12-01

    The paper challenges the notion that any Fitts' Law model can be applied generally to human-computer interaction, and proposes instead that applying Fitts' Law requires knowledge of the users' sequence of movements, direction of movement, and typical movement amplitudes as well as target sizes. Two experiments examined a text selection task with sequences of controlled movements (point-click and point-drag). For the point-click sequence, a Fitts' Law model that used the diagonal across the text object in the direction of pointing (rather than the horizontal extent of the text object) as the target size provided the best fit for the pointing time data, whereas for the point-drag sequence, a Fitts' Law model that used the vertical size of the text object as the target size gave the best fit. Dragging times were fitted well by Fitts' Law models that used either the vertical or horizontal size of the terminal character in the text object. Additional results of note were that pointing in the point-click sequence was consistently faster than in the point-drag sequence, and that pointing in either sequence was consistently faster than dragging. The discussion centres around the need to define task characteristics before applying Fitts' Law to an interface design or analysis, analyses of pointing and of dragging, and implications for interface design.

  5. MESO-Adaptation Based on Model Oriented Reengineering Process for Human-Computer Interface (MESOMORPH)

    DTIC Science & Technology

    2004-02-01

    best focus. We add visual color perception to the model to accommodate users who may be colorblind. Audio amplitude and frequency are determinants of… [fragment of a user-capability table: visual accommodation range (diopters); visual color perception (normal, red-green colorblind, other colorblind); audio amplitude]

  6. Rapid, Agile Modeling Support for Human-Computer Interface Conceptual Design

    DTIC Science & Technology

    2008-12-01

    information (see Chi, Pirolli, and Pitkow, 2000). 5.7 LATENT SEMANTIC ANALYSIS (LSA) In CoLiDeS, semantic similarity is determined by Latent Semantic Analysis...P. W. Foltz, and D. Laham. 1998. “An Introduction to Latent Semantic Analysis,” Discourse Processes, vol. 25, pp. 259–284. Mannes, S. M. and W...1998. “Learning and Representing Verbal Meaning: Latent Semantic Analysis Theory,” Current Directions in Psychological Science, vol. 7, pp. 161–164

  7. The Role and Tools of a Dialogue Author in Creating Human-Computer Interfaces.

    DTIC Science & Technology

    1982-05-01

    …are typically written by programmers who generally have little or no formal training or even intuitive feeling for what constitutes an effective human…those of the application programmer and the end user of the system. These two types, however, frequently had severe communication problems.

  8. Design of the human computer interface on the telerobotic small emplacement excavator

    SciTech Connect

    Thompson, D.H.; Killough, S.M.; Burks, B.L.; Draper, J.V.

    1995-12-31

    The small emplacement excavator (SEE) is a ruggedized military vehicle with a backhoe and front loader used by the U.S. Army for explosive ordnance disposal (EOD) and general utility excavation activities. This project resulted from a joint need of the U.S. Department of Energy (DOE) for a remote-controlled excavator for buried waste operations and of the U.S. Department of Defense for remote EOD operations. To evaluate the feasibility of removing personnel from the SEE vehicle during high-risk excavation tasks, a development and demonstration project was initiated. Development of a telerobotic SEE (TSEE) was performed by the Oak Ridge National Laboratory in a project funded jointly by the U.S. Army and the DOE. The TSEE features teleoperated driving, a telerobotic backhoe with four degrees of freedom, and a teleoperated front loader with two degrees of freedom on the bucket. Remote capabilities include driving (forward, reverse, brake, steering), power takeoff shifting to enable digging modes, deploying stabilizers, excavation, and computer system booting.

  9. Why are Human-Computer Interfaces Difficult to Design and Implement

    DTIC Science & Technology

    1993-07-01

    mean when multiple people are using the same software? Advanced input devices, such as pen-based gesture recognition, speech, or DataGloves, also raise…evolve, such as speech and gesture recognition, intelligent agents, and 3-D visualization, the amount of effort

  10. A practical EMG-based human-computer interface for users with motor disabilities.

    PubMed

    Barreto, A B; Scargle, S D; Adjouadi, M

    2000-01-01

    In line with the mission of the Assistive Technology Act of 1998 (ATA), this study proposes an integrated assistive real-time system which "affirms that technology is a valuable tool that can be used to improve the lives of people with disabilities." An assistive technology device is defined by the ATA as "any item, piece of equipment, or product system, whether acquired commercially, modified, or customized, that is used to increase, maintain, or improve the functional capabilities of individuals with disabilities." The purpose of this study is to design and develop an alternate input device that can be used even by individuals with severe motor disabilities. This real-time system design utilizes electromyographic (EMG) biosignals from cranial muscles and electroencephalographic (EEG) biosignals from the cerebrum's occipital lobe, which are transformed into controls for two-dimensional (2-D) cursor movement, the left-click (Enter) command, and an ON/OFF switch for the cursor-control functions. This HCI system classifies biosignals into "mouse" functions by applying amplitude thresholds and performing power spectral density (PSD) estimations on discrete windows of data. Spectral power summations are aggregated over several frequency bands between 8 and 500 Hz and then compared to produce the correct classification. The result is an affordable DSP-based system that, when combined with an on-screen keyboard, enables the user to fully operate a computer without using any extremities.
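
    A minimal sketch of the kind of processing described above: amplitude thresholding plus power-spectral-density estimation over a discrete window of one biosignal channel, with aggregated band powers compared to pick a command. The sampling rate, band edges, thresholds and decision rule are assumptions for illustration, not the system's published parameters.

```python
# Illustrative band-power comparison on one window of a cranial biosignal.
# Sampling rate, bands and thresholds are assumptions, not the paper's values.
import numpy as np
from scipy.signal import welch

FS = 1000  # Hz, assumed sampling rate

def band_power(window, fs, lo, hi):
    """Sum of power spectral density between lo and hi Hz (Welch estimate)."""
    freqs, psd = welch(window, fs=fs, nperseg=min(256, len(window)))
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].sum()

def classify_window(window, fs=FS, amp_threshold=50e-6):
    """Map one window of an EMG channel to a 'mouse' event (sketch)."""
    if np.abs(window).max() < amp_threshold:
        return "idle"
    low = band_power(window, fs, 8, 100)
    high = band_power(window, fs, 100, 500)
    # Compare aggregated band powers to pick a command (illustrative rule).
    return "left_click" if high > low else "cursor_move"

rng = np.random.default_rng(0)
demo = 1e-4 * rng.standard_normal(FS // 4)   # 250 ms of synthetic 'EMG'
print(classify_window(demo))
```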

  11. Foundations of an Age-Differentiated Adaptation of the Human-Computer Interface

    ERIC Educational Resources Information Center

    Schneider, N.; Schreiber, S.; Wilkes, J.; Grandt, M.; Schlick, C. M.

    2008-01-01

    An important issue of the demographic change in the German population is the maintenance and promotion of the employability of aging workforces. However, there are hardly any suitable concepts or usable tools available to realize this goal. Possible approaches should push the individual strengths of the aging workers to the foreground and…

  12. Designing for Performance: A Cognitive Systems Engineering Approach to Modifying an AWACS Human Computer Interface

    DTIC Science & Technology

    1993-03-01

    Radar dots are the same color for enemy and friendly (Coloring Radar Dots); cannot track who is who in a furball (often because of same-color radar dots)…management. Proceedings of the 11th Biennial DoD Psychology Conference, Colorado Springs, CO. Lipshitz, R. (1989). Decision making as argument-driven

  13. An investigation of the relationship of drooling with nutrition and head control in individuals with quadriparetic cerebral palsy

    PubMed Central

    Taş, Seda Ayaz; Çankaya, Tamer

    2015-01-01

    [Purpose] The aim of the present study was to investigate the relationship of drooling, nutrition, and head control in individuals with quadriparetic cerebral palsy. [Subjects and Methods] Fifty-six individuals between the ages of 2 and 15 diagnosed with spastic quadriparetic cerebral palsy and their families/caretakers were included in the study. Drooling severity and frequency were evaluated using the scale developed by Thomas-Stonell and Greenberg (Drooling Severity and Frequency Scale). Individuals with a drooling severity value of 1 were included in the non-drooling group (group 2) (n=27). Individuals with a drooling severity of 2, 3, 4, or 5 were included in the drooling group (group 1) (n=29). The evaluations were applied to both groups. [Results] There were significant differences between the two groups in terms of gestational age, nutrition behavior, eating abilities, head control, gagging, nutritional status (inadequate nutrition, normal nutrition, overweight-obese), and low weight. It was established that as head control increased, drooling severity diminished, and as drooling severity increased, BMI decreased. Independence in eating ability was found to be greater in the group with better drooling control. [Conclusion] In the present study, it was determined that drooling control affected nutritional functions and that drooling control was affected by head control. PMID:26696723

  14. Tongue-Supported Human-Computer Interaction systems: a review.

    PubMed

    Khan, Masood Mehmood; Sherazi, Hammad I; Quain, Rohan

    2014-01-01

    The tongue can substitute for human sensory systems and has been used as a medium of input to help impaired patients communicate with the world. Innovative techniques have been employed to realize tongue movement, sense its position and exploit tongue dexterity, in order to achieve Tongue Supported Human Computer Interaction (TSHCI). This paper examines various approaches to using tongue dexterity in TSHCI systems and introduces two infrared-signal-supported, minimally invasive TSHCI systems developed at Curtin University. Methods of sensing tongue movement and position are discussed in particular and, depending on the methods employed, TSHCI systems are categorized as either invasive or minimally invasive. A set of system usability criteria is proposed to help build more effective TSHCI systems in the future.

  15. Developing a Framework for Intuitive Human-Computer Interaction

    PubMed Central

    O’Brien, Marita A.; Rogers, Wendy A.; Fisk, Arthur D.

    2014-01-01

    Many technology marketing materials tout the intuitive nature of products, but current human-computer interaction (HCI) guidelines provide limited methods to help designers create this experience beyond making them easy to use. This paper proposes a definition for intuitive interaction with specific attributes to allow designers to create products that elicit the target experience. Review of relevant literatures provides empirical evidence for the suggested working definition of intuitive HCI: interactions between humans and high technology in lenient learning environments that allow the human to use a combination of prior experience and feedforward methods to achieve an individual’s functional and abstract goals. Core concepts supporting this definition were compiled into an organizational framework that includes: seeking user goals, performing well-learned behavior, determining what to do next, metacognition, knowledge in the head, and knowledge in the world. This paper describes these concepts and proposes design approaches that could facilitate intuitive behavior and suggests areas for further research. PMID:25552895

  16. Developing a Framework for Intuitive Human-Computer Interaction.

    PubMed

    O'Brien, Marita A; Rogers, Wendy A; Fisk, Arthur D

    2008-09-01

    Many technology marketing materials tout the intuitive nature of products, but current human-computer interaction (HCI) guidelines provide limited methods to help designers create this experience beyond making them easy to use. This paper proposes a definition for intuitive interaction with specific attributes to allow designers to create products that elicit the target experience. Review of relevant literatures provides empirical evidence for the suggested working definition of intuitive HCI: interactions between humans and high technology in lenient learning environments that allow the human to use a combination of prior experience and feedforward methods to achieve an individual's functional and abstract goals. Core concepts supporting this definition were compiled into an organizational framework that includes: seeking user goals, performing well-learned behavior, determining what to do next, metacognition, knowledge in the head, and knowledge in the world. This paper describes these concepts and proposes design approaches that could facilitate intuitive behavior and suggests areas for further research.

  17. A behavioral biometric system based on human-computer interaction

    NASA Astrophysics Data System (ADS)

    Gamboa, Hugo; Fred, Ana

    2004-08-01

    In this paper we describe a new behavioural biometric technique based on human computer interaction. We developed a system that captures the user interaction via a pointing device, and uses this behavioural information to verify the identity of an individual. Using statistical pattern recognition techniques, we developed a sequential classifier that processes user interaction, according to which the user identity is considered genuine if a predefined accuracy level is achieved, and the user is classified as an impostor otherwise. Two statistical models for the features were tested, namely Parzen density estimation and a unimodal distribution. The system was tested with different numbers of users in order to evaluate the scalability of the proposal. Experimental results show that the normal user interaction with the computer via a pointing device entails behavioural information with discriminating power, that can be explored for identity authentication.
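
    The sequential, Parzen-density-based verification idea can be sketched as follows. The features, enrollment data, and accept/reject thresholds are synthetic and illustrative; the paper's actual feature set and decision procedure are not reproduced here.

```python
# Minimal sketch of a sequential verifier using Parzen (kernel) density
# estimation over pointing-interaction features; feature choice, thresholds,
# and data are illustrative assumptions, not the paper's protocol.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)

# Enrollment: features (e.g. stroke duration, straightness) for the claimed user.
enrolled = rng.normal(loc=[0.6, 0.9], scale=[0.1, 0.05], size=(200, 2))
density = gaussian_kde(enrolled.T)            # Parzen-window density estimate

def verify(strokes, accept=5.0, reject=-5.0):
    """Accumulate log-density stroke by stroke until a decision is reached."""
    score = 0.0
    for stroke in strokes:
        score += np.log(density(stroke)[0] + 1e-12)   # sequential update
        if score >= accept:
            return "genuine"
        if score <= reject:
            return "impostor"
    return "undecided"

probe = rng.normal(loc=[0.6, 0.9], scale=[0.1, 0.05], size=(20, 2))
print(verify(probe))
```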

  18. Bracing of the trunk and neck has a differential effect on head control during gait

    PubMed Central

    Russell, D. M.; Kelleran, K.; Walker, M. L.

    2015-01-01

    During gait, the trunk and neck are believed to play an important role in dissipating the transmission of forces from the ground to the head. This attenuation process is important to ensure head control is maintained. The aim of the present study was to assess the impact of externally restricting the motion of the trunk and/or neck segments on acceleration patterns of the upper body and head and related trunk muscle activity. Twelve healthy adults performed three walking trials on a flat, straight 65-m walkway, under four different bracing conditions: 1) control (no brace); 2) neck-braced; 3) trunk-braced; and 4) neck-trunk braced. Three-dimensional accelerations from the head, neck (C7) and lower trunk (L3) were collected, as was muscle activity from the trunk. Results revealed that, when the neck or trunk was braced individually, an overall decrease in the ability of the trunk to attenuate gait-related oscillations was observed, which led to increases in the amplitude of vertical acceleration for all segments. However, when the trunk and neck were braced together, acceleration amplitude across all segments decreased, in line with increased attenuation from the neck to the head. Bracing was also reflected by increased activity in erector spinae, decreased abdominal muscle activity and lower trunk muscle coactivation. Overall, it would appear that the neuromuscular system of young, healthy individuals was able to maintain a consistent pattern of head acceleration, irrespective of the level of bracing, and that priority was placed on the control of vertical head accelerations during these gait tasks. PMID:26180113

  19. Bracing of the trunk and neck has a differential effect on head control during gait.

    PubMed

    Morrison, S; Russell, D M; Kelleran, K; Walker, M L

    2015-09-01

    During gait, the trunk and neck are believed to play an important role in dissipating the transmission of forces from the ground to the head. This attenuation process is important to ensure head control is maintained. The aim of the present study was to assess the impact of externally restricting the motion of the trunk and/or neck segments on acceleration patterns of the upper body and head and related trunk muscle activity. Twelve healthy adults performed three walking trials on a flat, straight 65-m walkway, under four different bracing conditions: 1) control (no brace); 2) neck-braced; 3) trunk-braced; and 4) neck-trunk braced. Three-dimensional accelerations from the head, neck (C7) and lower trunk (L3) were collected, as was muscle activity from the trunk. Results revealed that, when the neck or trunk was braced individually, an overall decrease in the ability of the trunk to attenuate gait-related oscillations was observed, which led to increases in the amplitude of vertical acceleration for all segments. However, when the trunk and neck were braced together, acceleration amplitude across all segments decreased, in line with increased attenuation from the neck to the head. Bracing was also reflected by increased activity in erector spinae, decreased abdominal muscle activity and lower trunk muscle coactivation. Overall, it would appear that the neuromuscular system of young, healthy individuals was able to maintain a consistent pattern of head acceleration, irrespective of the level of bracing, and that priority was placed on the control of vertical head accelerations during these gait tasks.

  20. User stress detection in human-computer interactions.

    PubMed

    Zhai, Jing; Barreto, Armando B; Chin, Craig; Li, Chao

    2005-01-01

    The emerging research area of Affective Computing seeks to advance the field of Human-Computer Interaction (HCI) by enabling computers to interact with users in ways appropriate to their affective states. Affect recognition, including the use of psychophysiological measures (e.g. heart rate), facial expressions, speech recognition, etc., to derive an assessment of user affective state based on factors from the current task context, is an important foundation required for the development of Affective Computing. Our research focuses on the use of three physiological signals: Blood Volume Pulse (BVP), Galvanic Skin Response (GSR) and Pupil Diameter (PD), to automatically monitor the level of stress in computer users. This paper reports on the hardware and software instrumentation development and the signal processing approach used to detect the stress level of a subject interacting with a computer, within the framework of a specific experimental task known as the 'Stroop Test'. For this experiment, a computer game was implemented and adapted to make the subject experience the Stroop Effect, evoked by the mismatch between the font color and the meaning of a certain word (the name of a color) displayed, while his/her BVP, GSR and PD signals were continuously recorded. Several data processing techniques were applied to extract effective attributes of the stress level of the subjects throughout the experiment. Current results indicate an interesting similarity between changes in these three signals and shifts in emotional state when stress stimuli are applied in the interaction environment.
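
    As a hedged illustration of per-segment feature extraction from the three signals named above, the sketch below derives a heart-rate estimate from BVP peaks, a skin-conductance response count from GSR, and a mean pupil diameter. The sampling rates and feature choices are assumptions, not the authors' pipeline.

```python
# Illustrative per-segment features for BVP, GSR and pupil diameter signals.
import numpy as np
from scipy.signal import find_peaks

def bvp_heart_rate(bvp, fs):
    """Mean heart rate (beats/min) from inter-beat intervals of BVP peaks."""
    peaks, _ = find_peaks(bvp, distance=int(0.4 * fs))   # peaks >= 0.4 s apart
    ibi = np.diff(peaks) / fs                            # inter-beat intervals (s)
    return 60.0 / ibi.mean() if len(ibi) else float("nan")

def gsr_response_count(gsr, fs, min_rise=0.02):
    """Count skin-conductance responses as deflections with enough prominence."""
    peaks, _ = find_peaks(gsr, prominence=min_rise, distance=int(1.0 * fs))
    return len(peaks)

def segment_features(bvp, gsr, pupil, fs_bvp=128, fs_gsr=32):
    """Collect one simple feature per signal for a segment of the recording."""
    return {
        "heart_rate_bpm": bvp_heart_rate(bvp, fs_bvp),
        "gsr_responses": gsr_response_count(gsr, fs_gsr),
        "mean_pupil_diameter": float(np.mean(pupil)),
    }

# Ten seconds of synthetic signals just to exercise the functions.
t = np.arange(0, 10, 1 / 128)
bvp = np.sin(2 * np.pi * 1.2 * t)          # ~72 beats/min pulse wave
gsr = np.zeros(10 * 32)
gsr[100] = 0.05                            # one synthetic skin-conductance response
pupil = np.full(600, 3.5)                  # constant pupil diameter (mm)
print(segment_features(bvp, gsr, pupil))
```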

  1. Institutionalizing human-computer interaction for global health.

    PubMed

    Gulliksen, Jan

    2017-06-01

    Digitalization is the societal change process in which new ICT-based solutions bring forward completely new ways of doing things, new businesses and new movements in society. Digitalization also provides completely new ways of addressing issues related to global health. This paper provides an overview of the field of human-computer interaction (HCI) and the ways in which the field has contributed to international development in different regions of the world. Additionally, it outlines the United Nations' new sustainability goals from December 2015 and what these could contribute to the development of global health and its relationship to digitalization. Finally, it argues why and how HCI could be adopted and adapted to fit contextual needs, the need for localization and the development of new digital innovations. The research methodology is mostly qualitative, following an action research paradigm in which the actual change process that digitalization evokes is as important as the scientific conclusions that can be drawn. In conclusion, the paper argues that digitalization is fundamentally changing society through the development and use of digital technologies and may have a profound effect on the digital development of every country in the world, but that it needs to be developed on the basis of local practices, it needs international support, and it should not be limited by technological constraints. In particular, digitalization to support global health requires a profound understanding of the users and their context, arguing for user-centred systems design methodologies as particularly suitable.

  2. Human-Computer Interaction with Medical Decisions Support Systems

    NASA Technical Reports Server (NTRS)

    Adolf, Jurine A.; Holden, Kritina L.

    1994-01-01

    Decision Support Systems (DSSs) have been available to medical diagnosticians for some time, yet their acceptance and use have not increased with advances in technology and availability of DSS tools. Medical DSSs will be necessary on future long duration space missions, because access to medical resources and personnel will be limited. Human-Computer Interaction (HCI) experts at NASA's Human Factors and Ergonomics Laboratory (HFEL) have been working toward understanding how humans use DSSs, with the goal of being able to identify and solve the problems associated with these systems. Work to date consists of identification of HCI research areas, development of a decision making model, and completion of two experiments dealing with 'anchoring'. Anchoring is a phenomenon in which the decision maker latches on to a starting point and does not make sufficient adjustments when new data are presented. HFEL personnel have replicated a well-known anchoring experiment and have investigated the effects of user level of knowledge. Future work includes further experimentation on level of knowledge, confidence in the source of information and sequential decision making.

  4. Eucalyptus: Integrating Natural Language Input with a Graphical User Interface

    DTIC Science & Technology

    1994-02-25

    these is discourse processing: the human ability to track and maintain continuity of topic, reference, and reasoning in extended sequences of natural…interface medium of choice. The success of the graphical user interface (GUI) in the intervening years now suggests that each of these interface media has…implicit understanding of the principles of effective communication, a human-computer interface having these capabilities falls into a category that in

  5. Adaptive interface for spoken dialog

    NASA Astrophysics Data System (ADS)

    Dusan, Sorin; Flanagan, James

    2002-05-01

    Speech has become increasingly important in human-computer interaction. Spoken dialog interfaces rely on automatic speech recognition, speech synthesis, language understanding, and dialog management. A main issue with dialog systems is that they are typically limited to pre-programmed vocabularies and sets of sentences. The research reported here focuses on developing an adaptive spoken dialog interface capable of acquiring new linguistic units and their corresponding semantics during the human-computer interaction. The adaptive interface identifies unknown words and phrases in the user's utterances and asks the user for the corresponding semantics. The user can provide the meaning or the semantic representation of the new linguistic units through multiple modalities, including speaking, typing, pointing, touching, or showing. The interface then stores the new linguistic units in a semantic grammar and creates new objects defining the corresponding semantic representation. This process takes place during natural interaction between user and computer and, thus, the interface does not have to be rewritten and compiled to incorporate the newly acquired language. Users can personalize the adaptive spoken interface for different domain applications, or according to their personal preferences. [Work supported by NSF.]

  6. HCI∧2 framework: a software framework for multimodal human-computer interaction systems.

    PubMed

    Shen, Jie; Pantic, Maja

    2013-12-01

    This paper presents a novel software framework for the development and research in the area of multimodal human-computer interface (MHCI) systems. The proposed software framework, which is called the HCI∧2 Framework, is built upon publish/subscribe (P/S) architecture. It implements a shared-memory-based data transport protocol for message delivery and a TCP-based system management protocol. The latter ensures that the integrity of system structure is maintained at runtime. With the inclusion of bridging modules, the HCI∧2 Framework is interoperable with other software frameworks including Psyclone and ActiveMQ. In addition to the core communication middleware, we also present the integrated development environment (IDE) of the HCI∧2 Framework. It provides a complete graphical environment to support every step in a typical MHCI system development process, including module development, debugging, packaging, and management, as well as the whole system management and testing. The quantitative evaluation indicates that our framework outperforms other similar tools in terms of average message latency and maximum data throughput under a typical single PC scenario. To demonstrate HCI∧2 Framework's capabilities in integrating heterogeneous modules, we present several example modules working with a variety of hardware and software. We also present an example of a full system developed using the proposed HCI∧2 Framework, which is called the CamGame system and represents a computer game based on hand-held marker(s) and low-cost camera(s).
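
    To illustrate the publish/subscribe pattern the framework is built on, here is a toy in-process broker. It is only a sketch of the P/S idea; it does not reproduce the HCI∧2 Framework's shared-memory transport, TCP management protocol, or API.

```python
# A toy in-process publish/subscribe broker illustrating the P/S pattern.
from collections import defaultdict

class Broker:
    def __init__(self):
        self._subscribers = defaultdict(list)   # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self._subscribers[topic]:
            callback(message)

broker = Broker()
broker.subscribe("gesture", lambda m: print("renderer got:", m))
broker.subscribe("gesture", lambda m: print("logger got:", m))
broker.publish("gesture", {"type": "swipe_left", "confidence": 0.92})
```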

  7. Appearance-based human gesture recognition using multimodal features for human computer interaction

    NASA Astrophysics Data System (ADS)

    Luo, Dan; Gao, Hua; Ekenel, Hazim Kemal; Ohya, Jun

    2011-03-01

    The use of gesture as a natural interface plays a critically important role in achieving intelligent Human Computer Interaction (HCI). Human gestures combine different components of visual action, such as motion of the hands, facial expression, and torso, to convey meaning. So far, in the field of gesture recognition, most previous work has focused on the manual component of gestures. In this paper, we present an appearance-based multimodal gesture recognition framework, which combines different groups of features, such as facial expression features and hand motion features, extracted from image frames captured by a single web camera. We consider 12 classes of human gestures with facial expressions conveying neutral, negative and positive meanings, drawn from American Sign Language (ASL). We combine the features at two levels by employing two fusion strategies. At the feature level, an early feature combination is performed by concatenating and weighting different feature groups, and LDA is used to choose the most discriminative elements by projecting the features onto a discriminative expression space. The second strategy is applied at the decision level: weighted decisions from single modalities are fused at a later stage. A condensation-based algorithm is adopted for classification. We collected a data set with three to seven recording sessions and conducted experiments with the combination techniques. Experimental results showed that facial analysis improves hand gesture recognition, and that decision-level fusion performs better than feature-level fusion.
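
    The decision-level strategy (weighted fusion of per-modality class scores) can be sketched in a few lines. The class labels, scores and modality weights below are invented for illustration and are not taken from the paper.

```python
# Sketch of decision-level fusion: per-modality class scores are combined with
# modality weights; the class with the highest fused score wins.
import numpy as np

def fuse_decisions(scores_by_modality, weights):
    """Weighted sum of per-modality class-score vectors; returns winning class."""
    classes = None
    fused = None
    for modality, scores in scores_by_modality.items():
        labels, values = zip(*sorted(scores.items()))
        values = np.asarray(values) * weights[modality]
        if fused is None:
            classes, fused = labels, values
        else:
            fused = fused + values
    return classes[int(np.argmax(fused))]

hand_scores = {"greet": 0.2, "negate": 0.7, "approve": 0.1}
face_scores = {"greet": 0.1, "negate": 0.3, "approve": 0.6}
print(fuse_decisions({"hand": hand_scores, "face": face_scores},
                     {"hand": 0.6, "face": 0.4}))
```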

  8. The experience of agency in human-computer interactions: a review

    PubMed Central

    Limerick, Hannah; Coyle, David; Moore, James W.

    2014-01-01

    The sense of agency is the experience of controlling both one’s body and the external environment. Although the sense of agency has been studied extensively, there is a paucity of studies in applied “real-life” situations. One applied domain that seems highly relevant is human-computer-interaction (HCI), as an increasing number of our everyday agentive interactions involve technology. Indeed, HCI has long recognized the feeling of control as a key factor in how people experience interactions with technology. The aim of this review is to summarize and examine the possible links between sense of agency and understanding control in HCI. We explore the overlap between HCI and sense of agency for computer input modalities and system feedback, computer assistance, and joint actions between humans and computers. An overarching consideration is how agency research can inform HCI and vice versa. Finally, we discuss the potential ethical implications of personal responsibility in an ever-increasing society of technology users and intelligent machine interfaces. PMID:25191256

  9. The experience of agency in human-computer interactions: a review.

    PubMed

    Limerick, Hannah; Coyle, David; Moore, James W

    2014-01-01

    The sense of agency is the experience of controlling both one's body and the external environment. Although the sense of agency has been studied extensively, there is a paucity of studies in applied "real-life" situations. One applied domain that seems highly relevant is human-computer-interaction (HCI), as an increasing number of our everyday agentive interactions involve technology. Indeed, HCI has long recognized the feeling of control as a key factor in how people experience interactions with technology. The aim of this review is to summarize and examine the possible links between sense of agency and understanding control in HCI. We explore the overlap between HCI and sense of agency for computer input modalities and system feedback, computer assistance, and joint actions between humans and computers. An overarching consideration is how agency research can inform HCI and vice versa. Finally, we discuss the potential ethical implications of personal responsibility in an ever-increasing society of technology users and intelligent machine interfaces.

  10. Multi-step EMG Classification Algorithm for Human-Computer Interaction

    NASA Astrophysics Data System (ADS)

    Ren, Peng; Barreto, Armando; Adjouadi, Malek

    A three-electrode human-computer interaction system, based on digital processing of the Electromyogram (EMG) signal, is presented. This system can effectively help disabled individuals paralyzed from the neck down to interact with computers or communicate with people through computers using point-and-click graphic interfaces. The three electrodes are placed on the right frontalis, the left temporalis and the right temporalis muscles in the head, respectively. The signal processing algorithm used translates the EMG signals during five kinds of facial movements (left jaw clenching, right jaw clenching, eyebrows up, eyebrows down, simultaneous left & right jaw clenching) into five corresponding types of cursor movements (left, right, up, down and left-click), to provide basic mouse control. The classification strategy is based on three principles: the EMG energy of one channel is typically larger than the others during one specific muscle contraction; the spectral characteristics of the EMG signals produced by the frontalis and temporalis muscles during different movements are different; the EMG signals from adjacent channels typically have correlated energy profiles. The algorithm is evaluated on 20 pre-recorded EMG signal sets, using Matlab simulations. The results show that this method provides improvements and is more robust than other previous approaches.
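
    A rough sketch of a multi-step rule in the spirit of the three principles above: channel energies are compared first, and a simple spectral cue separates movements that load the same muscle. The thresholds, frequency split and mapping below are assumptions, not the published algorithm's parameters.

```python
# Illustrative multi-step classification of three cranial EMG channels into
# cursor commands; constants are assumptions made for the sketch.
import numpy as np
from scipy.signal import welch

FS = 1000  # Hz, assumed sampling rate

def channel_energy(x):
    return float(np.sum(np.square(x)))

def dominant_band(x, fs=FS, split_hz=150.0):
    """Return whether most spectral power lies above or below split_hz."""
    freqs, psd = welch(x, fs=fs, nperseg=min(256, len(x)))
    high = psd[freqs >= split_hz].sum()
    low = psd[freqs < split_hz].sum()
    return "high" if high > low else "low"

def classify(frontalis, left_temporalis, right_temporalis, rest_energy=1e-9):
    e = {"front": channel_energy(frontalis),
         "left": channel_energy(left_temporalis),
         "right": channel_energy(right_temporalis)}
    if max(e.values()) < rest_energy:
        return "rest"
    # Step 1: correlated energy on both temporalis channels -> left-click.
    if e["left"] > rest_energy and e["right"] > rest_energy and e["front"] < rest_energy:
        return "left_click"          # simultaneous left & right jaw clench
    # Step 2: a single dominant channel picks the direction.
    dominant = max(e, key=e.get)
    if dominant == "left":
        return "cursor_left"         # left jaw clench
    if dominant == "right":
        return "cursor_right"        # right jaw clench
    # Step 3: frontalis dominant -> eyebrows; a spectral cue separates up vs down.
    return "cursor_up" if dominant_band(frontalis) == "low" else "cursor_down"
```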

  11. Viewer: a User Interface for Failure Region Analysis

    DTIC Science & Technology

    1990-12-01

    manipulation of the diagrams; have a readily revisable notation (Fitter and Green, 1981). Designing a human-computer interface through a standard…construction/tool kits available for the design of human-computer interfaces. The interface is separate from the application, but usually this is an… [fragment of a results table with columns Live, Equiv, Dead, %Enable, %EScore, %GScore omitted]

  12. A Language Use Perspective on the Design of Human-Computer Interaction

    DTIC Science & Technology

    2002-01-01

    language use approach to human-computer interaction are outlined, and a range of both noncomputational and computational implications for the design of…interactive systems is examined. In particular, human-computer interaction is recast as a genuine instance of language use between the user and the system

  13. Intelligent Support for Human Computer Interaction and Decision-Making in Distribution Planning and Scheduling Systems

    DTIC Science & Technology

    1993-02-28

    transportation planning in the Army. The work addressed frameworks and tools for human-computer interaction in systems involving large amounts of…diverse information and development of decision making models. Research on human-computer interaction involved: (1) dynamic display generation for

  14. Evidence Report: Risk of Inadequate Human-Computer Interaction

    NASA Technical Reports Server (NTRS)

    Holden, Kritina; Ezer, Neta; Vos, Gordon

    2013-01-01

    Human-computer interaction (HCI) encompasses all the methods by which humans and computer-based systems communicate, share information, and accomplish tasks. When HCI is poorly designed, crews have difficulty entering, navigating, accessing, and understanding information. HCI has rarely been studied in an operational spaceflight context, and detailed performance data that would support evaluation of HCI have not been collected; thus, we draw much of our evidence from post-spaceflight crew comments, and from other safety-critical domains like ground-based power plants, and aviation. Additionally, there is a concern that any potential or real issues to date may have been masked by the fact that crews have near constant access to ground controllers, who monitor for errors, correct mistakes, and provide additional information needed to complete tasks. We do not know what types of HCI issues might arise without this "safety net". Exploration missions will test this concern, as crews may be operating autonomously due to communication delays and blackouts. Crew survival will be heavily dependent on available electronic information for just-in-time training, procedure execution, and vehicle or system maintenance; hence, the criticality of the Risk of Inadequate HCI. Future work must focus on identifying the most important contributing risk factors, evaluating their contribution to the overall risk, and developing appropriate mitigations. The Risk of Inadequate HCI includes eight core contributing factors based on the Human Factors Analysis and Classification System (HFACS): (1) Requirements, policies, and design processes, (2) Information resources and support, (3) Allocation of attention, (4) Cognitive overload, (5) Environmentally induced perceptual changes, (6) Misperception and misinterpretation of displayed information, (7) Spatial disorientation, and (8) Displays and controls.

  15. Capacitive facial movement detection for human-computer interaction to click by frowning and lifting eyebrows: assistive technology.

    PubMed

    Rantanen, Ville; Niemenlehto, Pekka-Henrik; Verho, Jarmo; Lekkala, Jukka

    2010-01-01

    A capacitive facial movement detection method designed for human-computer interaction is presented. Some point-and-click interfaces use facial electromyography for clicking. The presented method provides a contactless alternative. Electrodes with no galvanic coupling to the face are used to form electric fields. Changes in the electric fields due to facial movements are detected by measuring capacitances between the electrodes. A prototype device for measuring a capacitance signal affected by frowning and lifting eyebrows was constructed. A commercial integrated circuit for capacitive touch sensors is used in the measurement. The applied movement detection algorithm uses an adaptive approach to provide operation capability in noisy and dynamic environments. Experimentation with 10 test subjects proved that, under controlled circumstances, the movements are detected with good efficiency, but characterizing the movements into frowns and eyebrow lifts is more problematic. Integration with a two-dimensional (2D) pointing solution and further experiments are still required.
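
    The adaptive detection idea (a threshold that tracks the baseline capacitance signal and its variability) can be sketched as follows; the constants and update rule are illustrative assumptions, not the prototype's algorithm.

```python
# A minimal adaptive detector: the threshold tracks a running estimate of the
# baseline capacitance signal and its variability. Constants are illustrative.
def adaptive_detect(samples, k=4.0, alpha=0.01):
    """Yield True for samples that deviate strongly from the adaptive baseline."""
    mean, var = samples[0], 1e-6
    for x in samples:
        deviation = abs(x - mean)
        event = deviation > k * (var ** 0.5)
        if not event:
            # Update the baseline only from non-event samples so that
            # frowns/eyebrow lifts do not drag the threshold with them.
            mean = (1 - alpha) * mean + alpha * x
            var = (1 - alpha) * var + alpha * deviation ** 2
        yield event

signal = [10.0] * 50 + [10.6] * 5 + [10.0] * 50   # a brief capacitance change
events = list(adaptive_detect(signal))
print("event samples:", sum(events))
```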

  16. MovExp: A Versatile Visualization Tool for Human-Computer Interaction Studies with 3D Performance and Biomechanical Data.

    PubMed

    Palmas, Gregorio; Bachynskyi, Myroslav; Oulasvirta, Antti; Seidel, Hans-Peter; Weinkauf, Tina

    2014-12-01

    In Human-Computer Interaction (HCI), experts seek to evaluate and compare the performance and ergonomics of user interfaces. Recently, a novel cost-efficient method for estimating physical ergonomics and performance has been introduced to HCI. It is based on optical motion capture and biomechanical simulation. It provides a rich source for analyzing human movements summarized in a multidimensional data set. Existing visualization tools do not sufficiently support the HCI experts in analyzing this data. We identified two shortcomings. First, appropriate visual encodings are missing particularly for the biomechanical aspects of the data. Second, the physical setup of the user interface cannot be incorporated explicitly into existing tools. We present MovExp, a versatile visualization tool that supports the evaluation of user interfaces. In particular, it can be easily adapted by the HCI experts to include the physical setup that is being evaluated, and visualize the data on top of it. Furthermore, it provides a variety of visual encodings to communicate muscular loads, movement directions, and other specifics of HCI studies that employ motion capture and biomechanical simulation. In this design study, we follow a problem-driven research approach. Based on a formalization of the visualization needs and the data structure, we formulate technical requirements for the visualization tool and present novel solutions to the analysis needs of the HCI experts. We show the utility of our tool with four case studies from the daily work of our HCI experts.

  17. A mobile Nursing Information System based on human-computer interaction design for improving quality of nursing.

    PubMed

    Su, Kuo-Wei; Liu, Cheng-Li

    2012-06-01

    A conventional Nursing Information System (NIS), which supports the role of the nurse in some areas, is typically deployed as an immobile system. However, a traditional information system cannot respond to patients' conditions in real time, causing delays in the availability of this information. With the advances of information technology, mobile devices are increasingly being used to extend the human mind's limited capacity to recall and process large numbers of relevant variables and to support information management, general administration, and clinical practice. Unfortunately, there have been few studies on combining a well-designed small-screen interface with a personal digital assistant (PDA) in clinical nursing. Some researchers have found that user interface design is an important factor in determining the usability and potential use of a mobile system. Therefore, this study proposed a systematic approach to the development of a mobile nursing information system (MNIS) based on Mobile Human-Computer Interaction (M-HCI) for use in clinical nursing. The system combines principles of small-screen interface design with user-specified requirements. In addition, the iconic functions were designed around metaphors that help users learn the system more quickly with less working-memory load. An experiment involving learnability testing, thinking aloud and a questionnaire investigation was conducted to evaluate the effect of the MNIS on a PDA. The results show that the proposed MNIS supports learning well and yields higher satisfaction with respect to its symbols, terminology and system information.

  18. Implementing a Quantitative Analysis Design Tool for Future Generation Interfaces

    DTIC Science & Technology

    2012-03-01

    future MAC-enabled systems. A human-computer interaction (HCI) Index, originally applied to multi-function displays, was applied to the prototype Vigilant…Spirit interface. A modified version of the HCI Index was successfully applied to perform a quantitative analysis of the baseline VSCS interface and…two modified interface designs. The modified HCI Index incorporates the Hick-Hyman decision time, Fitts' Law time, and the physical actions
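
    The two timing components named above can be illustrated with a short calculation. The coefficients and the way the terms are combined here are assumptions; the report's actual HCI Index weighting is not reproduced.

```python
# Illustrative Hick-Hyman decision time plus Fitts' Law movement time for one
# interface step; coefficients (seconds) are assumptions, not the report's.
import math

def hick_hyman_time(n_choices, a=0.2, b=0.15):
    """Decision time grows with the information content of the choice set."""
    return a + b * math.log2(n_choices + 1)

def fitts_time(amplitude, width, a=0.1, b=0.15):
    """Movement time to acquire a target of a given width at a given distance."""
    return a + b * math.log2(amplitude / width + 1)

# One interface step: pick one of 8 soft keys, then move 250 px to a 40 px button.
step_time = hick_hyman_time(8) + fitts_time(250, 40)
print(f"estimated step time: {step_time:.2f} s")
```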

  19. GT-MSOCC - A domain for research on human-computer interaction and decision aiding in supervisory control systems. [Georgia Tech - Multisatellite Operations Control Center

    NASA Technical Reports Server (NTRS)

    Mitchell, Christine M.

    1987-01-01

    The Georgia Tech-Multisatellite Operations Control Center (GT-MSOCC), a real-time interactive simulation of the operator interface to a NASA ground control system for unmanned earth-orbiting satellites, is described. The GT-MSOCC program for investigating a range of modeling, decision aiding, and workstation design issues related to the human-computer interaction is discussed. A GT-MSOCC operator function model is described in which operator actions, both cognitive and manual, are represented as the lowest level discrete control network nodes, and operator action nodes are linked to information needs or system reconfiguration commands.

  1. Program Predicts Time Courses of Human/Computer Interactions

    NASA Technical Reports Server (NTRS)

    Vera, Alonso; Howes, Andrew

    2005-01-01

    CPM X is a computer program that predicts sequences of, and amounts of time taken by, routine actions performed by a skilled person performing a task. Unlike programs that simulate the interaction of the person with the task environment, CPM X predicts the time course of events as consequences of encoded constraints on human behavior. The constraints determine which cognitive and environmental processes can occur simultaneously and which have sequential dependencies. The input to CPM X comprises (1) a description of a task and strategy in a hierarchical description language and (2) a description of architectural constraints in the form of rules governing interactions of fundamental cognitive, perceptual, and motor operations. The output of CPM X is a Program Evaluation Review Technique (PERT) chart that presents a schedule of predicted cognitive, motor, and perceptual operators interacting with a task environment. The CPM X program allows direct, a priori prediction of skilled user performance on complex human-machine systems, providing a way to assess critical interfaces before they are deployed in mission contexts.
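
    A toy critical-path computation over operators with durations and precedence constraints illustrates how a PERT-style schedule follows from such constraints. The operator names and durations below are hypothetical; this is not CPM X's constraint language or cognitive architecture.

```python
# Earliest-finish scheduling over a small precedence graph of operators.
def schedule(operators, deps):
    """Earliest finish time per operator given durations and precedence constraints."""
    finish = {}
    def earliest_finish(op):
        if op not in finish:
            start = max((earliest_finish(d) for d in deps.get(op, [])), default=0.0)
            finish[op] = start + operators[op]
        return finish[op]
    for op in operators:
        earliest_finish(op)
    return finish

operators = {"perceive_cue": 0.10, "decide": 0.15, "move_hand": 0.30, "verify": 0.10}
deps = {"decide": ["perceive_cue"], "move_hand": ["decide"], "verify": ["move_hand"]}
times = schedule(operators, deps)
print(times)                         # predicted finish time of each operator
print("total:", max(times.values())) # predicted total task time
```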

  2. Gestural interfaces for immersive environments

    NASA Astrophysics Data System (ADS)

    Margolis, Todd

    2014-02-01

    We are witnessing an explosion of new forms of Human-Computer Interaction devices for both laboratory research and home use. Given these new affordances in user interfaces (UI), how can gestures be used to improve interaction in large-scale immersive display environments? Through the investigation of full-body, head and hand tracking, this paper discusses various modalities of gesture recognition and compares their usability to other forms of interactivity. We explore a specific implementation of hand-gesture tracking within a large tiled display environment for use with common collaborative media interaction activities.

  3. A microswitch-cluster program to foster adaptive responses and head control in students with multiple disabilities: replication and validation assessment.

    PubMed

    Lancioni, Giulio E; Singh, Nirbhay N; O'Reilly, Mark F; Sigafoos, Jeff; Oliva, Doretta; Gatti, Michela; Manfredi, Francesco; Megna, Gianfranco; La Martire, Maria L; Tota, Alessia; Smaldone, Angela; Groeneweg, Jop

    2008-01-01

    A program relying on microswitch clusters (i.e., combinations of microswitches) and preferred stimuli was recently developed to foster adaptive responses and head control in persons with multiple disabilities. In the last version of this program, preferred stimuli (a) are scheduled for adaptive responses occurring in combination with head control (i.e., head upright) and (b) last through the scheduled time only if head control is maintained for that time. The first of the present two studies was aimed at replicating this program with three new participants with multiple disabilities adding to the three reported by Lancioni et al. [Lancioni, G. E., Singh, N. N., O'Reilly, M. F., Sigafoos, J., Didden, R., Oliva, D., et al. (2007). Fostering adaptive responses and head control in students with multiple disabilities through a microswitch-based program: Follow-up assessment and program revision. Research in Developmental Disabilities, 28, 187-196]. The second of the two studies served to carry out an expert validation of the program's effects on head control and general physical condition with the three participants of Study I as well as the three participants involved in the Lancioni et al. study mentioned above. The expert raters were 72 new physiotherapists and 72 experienced physiotherapists. The results of Study I supported previous data and indicated that the program was effective in helping the participants increase the frequency of adaptive responses in combination with head control and the length of such control. The results of Study II showed that the raters found the effects of the new program more positive than those of other intervention conditions and also considered such program a useful complement to formal motor rehabilitation programs.
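
    As a rough illustration of the contingency described above (preferred stimulation starts only when an adaptive response co-occurs with head control and continues only while head control is maintained), here is a small sketch with hypothetical sensor inputs; it is not the authors' software.

      # Minimal sketch of the contingency logic, with hypothetical sensor inputs.
      def stimulation_schedule(adaptive_response, head_upright, scheduled_time=5):
          """Per time step: start stimulation when an adaptive response occurs while
          the head is upright; keep it on only while head control is maintained,
          up to scheduled_time steps."""
          on, remaining, log = False, 0, []
          for resp, upright in zip(adaptive_response, head_upright):
              if resp and upright:
                  on, remaining = True, scheduled_time
              if on and (not upright or remaining == 0):
                  on = False            # head control lost or time elapsed
              log.append(on)
              if on:
                  remaining -= 1
          return log

      resp    = [1, 0, 0, 0, 0, 0, 0, 0]
      upright = [1, 1, 1, 0, 1, 1, 1, 1]
      print(stimulation_schedule(resp, upright))  # stimulation stops when head drops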

  4. Improved temporal resolution heart rate variability monitoring-pilot results of non-laboratory experiments targeting future assessment of human-computer interaction.

    PubMed

    Hercegfi, Károly

    2011-01-01

    This paper outlines the INTERFACE software ergonomic evaluation methodology and presents new validation results. The INTERFACE methodology is based on a simultaneous assessment of heart rate variability, skin conductance, and other data. The results of using this methodology on-site, in a non-laboratory environment, indicate that it is potentially capable of identifying quality attributes of elements of software with a temporal resolution of only a few seconds. This paper presents pilot results supporting this hypothesis, showing empirical evidence despite the decidedly non-laboratory environment: they indicate that the method is robust enough for practical usability tests. Naturally, these pilot results will have to be followed in the future by laboratory-based verification and refinement. This paper focuses only on some characteristics of this method, not on an actual analysis of human-computer interaction; however, its results can establish a future practical and objective event-related analysis of software use.
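
    To make the idea of second-scale heart rate variability monitoring concrete, here is a small sketch of a sliding-window HRV index (RMSSD) over inter-beat intervals; it is an assumption-laden stand-in, not the INTERFACE methodology itself, and the data are synthetic.

      # Illustrative only: sliding-window RMSSD over RR intervals, showing how an
      # HRV signal can be resolved on a time scale of a few seconds.
      import numpy as np

      def sliding_rmssd(rr_ms, window_s=5.0):
          """rr_ms: successive RR intervals in milliseconds."""
          t = np.cumsum(rr_ms) / 1000.0          # beat times in seconds
          diffs = np.diff(rr_ms)                 # successive RR differences
          out = []
          for i in range(1, len(rr_ms)):
              mask = (t[1:i + 1] > t[i] - window_s)   # beats inside the window
              d = diffs[:i][mask]
              out.append(np.sqrt(np.mean(d ** 2)) if d.size else np.nan)
          return t[1:], np.array(out)

      rng = np.random.default_rng(0)
      rr = 800 + rng.normal(0, 30, size=120)     # ~96 s of synthetic RR intervals
      times, rmssd = sliding_rmssd(rr)
      print(times[-1], rmssd[-1])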

  5. Psychosocial and Cultural Modeling in Human Computation Systems: A Gamification Approach

    SciTech Connect

    Sanfilippo, Antonio P.; Riensche, Roderick M.; Haack, Jereme N.; Butner, R. Scott

    2013-11-20

    “Gamification”, the application of gameplay to real-world problems, enables the development of human computation systems that support decision-making through the integration of social and machine intelligence. One of gamification’s major benefits includes the creation of a problem solving environment where the influence of cognitive and cultural biases on human judgment can be curtailed through collaborative and competitive reasoning. By reducing biases on human judgment, gamification allows human computation systems to exploit human creativity relatively unhindered by human error. Operationally, gamification uses simulation to harvest human behavioral data that provide valuable insights for the solution of real-world problems.

  6. Three-dimensional human computer interaction based on 3D widgets for medical data visualization

    NASA Astrophysics Data System (ADS)

    Xue, Jian; Tian, Jie; Zhao, Mingchang

    2005-04-01

    Three-dimensional human-computer interaction plays an important role in 3D visualization. It is important for clinicians to accurately use and easily handle the results of medical data visualization in order to assist diagnosis and surgery simulation. A 3D human-computer interaction software platform based on 3D widgets has been designed in traditional object-oriented fashion with some common design patterns and implemented in ANSI C++, including all function modules and some practical widgets. A group of application examples is presented as well. The ultimate objective is to provide a flexible, reliable and extensible 3D interaction platform for medical image processing and analysis.

  7. Ten Design Points for the Human Interface to Instructional Multimedia.

    ERIC Educational Resources Information Center

    McFarland, Ronald D.

    1995-01-01

    Ten ways to design an effective Human-Computer Interface are explained. Highlights include material delivery that relates to user knowledge; appropriate screen presentations; attention value versus learning and recall; the relationship of packaging and message; the effectiveness of visuals and text; the use of color to enhance communication; the…

  8. An Enhanced User Interface for the SABER Wargame

    DTIC Science & Technology

    1992-12-01

    A238825). 16. Molich, Rolf and Jakob Nielsen. "Improving a Human-Computer Dialogue," Communications of the ACM, 33:338-348 (March 1990). 17. Ness...AU), Wright-Patterson AFB OH, June 1990 (AD-A223087). 18. Nielsen, Jakob. "Traditional Dialogue Design Applied to Modern User Interfaces

  9. Ten Design Points for the Human Interface to Instructional Multimedia.

    ERIC Educational Resources Information Center

    McFarland, Ronald D.

    1995-01-01

    Ten ways to design an effective Human-Computer Interface are explained. Highlights include material delivery that relates to user knowledge; appropriate screen presentations; attention value versus learning and recall; the relationship of packaging and message; the effectiveness of visuals and text; the use of color to enhance communication; the…

  10. Designing Interactions for Learning: Physicality, Interactivity, and Interface Effects in Digital Environments

    ERIC Educational Resources Information Center

    Hoffman, Daniel L.

    2013-01-01

    The purpose of the study is to better understand the role of physicality, interactivity, and interface effects in learning with digital content. Drawing on work in cognitive science, human-computer interaction, and multimedia learning, the study argues that interfaces that promote physical interaction can provide "conceptual leverage"…

  11. Designing Interactions for Learning: Physicality, Interactivity, and Interface Effects in Digital Environments

    ERIC Educational Resources Information Center

    Hoffman, Daniel L.

    2013-01-01

    The purpose of the study is to better understand the role of physicality, interactivity, and interface effects in learning with digital content. Drawing on work in cognitive science, human-computer interaction, and multimedia learning, the study argues that interfaces that promote physical interaction can provide "conceptual leverage"…

  12. A Microswitch-Cluster Program to Foster Adaptive Responses and Head Control in Students with Multiple Disabilities: Replication and Validation Assessment

    ERIC Educational Resources Information Center

    Lancioni, Giulio E.; Singh, Nirbhay N.; O'Reilly, Mark F.; Sigafoos, Jeff; Oliva, Doretta; Gatti, Michela; Manfredi, Francesco; Megna, Gianfranco; La Martire, Maria L.; Tota, Alessia; Smaldone, Angela; Groeneweg, Jop

    2008-01-01

    A program relying on microswitch clusters (i.e., combinations of microswitches) and preferred stimuli was recently developed to foster adaptive responses and head control in persons with multiple disabilities. In the last version of this program, preferred stimuli (a) are scheduled for adaptive responses occurring in combination with head control…

  13. A Microswitch-Cluster Program to Foster Adaptive Responses and Head Control in Students with Multiple Disabilities: Replication and Validation Assessment

    ERIC Educational Resources Information Center

    Lancioni, Giulio E.; Singh, Nirbhay N.; O'Reilly, Mark F.; Sigafoos, Jeff; Oliva, Doretta; Gatti, Michela; Manfredi, Francesco; Megna, Gianfranco; La Martire, Maria L.; Tota, Alessia; Smaldone, Angela; Groeneweg, Jop

    2008-01-01

    A program relying on microswitch clusters (i.e., combinations of microswitches) and preferred stimuli was recently developed to foster adaptive responses and head control in persons with multiple disabilities. In the last version of this program, preferred stimuli (a) are scheduled for adaptive responses occurring in combination with head control…

  14. Integrating HCI into IDT: Charting the Human Computer Interaction Competencies Necessary for Instructional Media Production Coursework

    ERIC Educational Resources Information Center

    Brown, Abbie; Sugar, William

    2004-01-01

    A report on the efforts made to describe the range of human-computer interaction skills necessary to complete a program of study in Instructional Design Technology. Educators responsible for instructional media production courses have not yet articulated which among the wide range of possible interactions students must master for instructional…

  15. Personality Factors in Human-Computer Interaction: A Review of the Literature.

    ERIC Educational Resources Information Center

    Pocius, Kym E.

    1991-01-01

    Reviews studies investigating the relation between personality characteristics and human-computer interaction. The review is divided into three areas: (1) how personality traits are related to programming aptitude and achievement; (2) personality traits of people who use programming skills in their profession; and (3) the relation between personality…

  16. Enhancing Human-Computer Interaction Design Education: Teaching Affordance Design for Emerging Mobile Devices

    ERIC Educational Resources Information Center

    Faiola, Anthony; Matei, Sorin Adam

    2010-01-01

    The evolution of human-computer interaction design (HCID) over the last 20 years suggests that there is a growing need for educational scholars to consider new and more applicable theoretical models of interactive product design. The authors suggest that such paradigms would call for an approach that would equip HCID students with a better…

  17. A Project-Based Learning Setting to Human-Computer Interaction for Teenagers

    ERIC Educational Resources Information Center

    Geyer, Cornelia; Geisler, Stefan

    2012-01-01

    Knowledge of the fundamentals of human-computer interaction and usability engineering is becoming increasingly important in technical domains. However, this interdisciplinary field of work and the corresponding degree programs are not broadly known. Therefore, at the Hochschule Ruhr West, University of Applied Sciences, a program was developed to give…

  18. Implementations of the CC'01 Human-Computer Interaction Guidelines Using Bloom's Taxonomy

    ERIC Educational Resources Information Center

    Manaris, Bill; Wainer, Michael; Kirkpatrick, Arthur E.; Stalvey, RoxAnn H.; Shannon, Christine; Leventhal, Laura; Barnes, Julie; Wright, John; Schafer, J. Ben; Sanders, Dean

    2007-01-01

    In today's technology-laden society human-computer interaction (HCI) is an important knowledge area for computer scientists and software engineers. This paper surveys existing approaches to incorporate HCI into computer science (CS) and such related issues as the perceived gap between the interests of the HCI community and the needs of CS…

  19. Enhancing Human-Computer Interaction Design Education: Teaching Affordance Design for Emerging Mobile Devices

    ERIC Educational Resources Information Center

    Faiola, Anthony; Matei, Sorin Adam

    2010-01-01

    The evolution of human-computer interaction design (HCID) over the last 20 years suggests that there is a growing need for educational scholars to consider new and more applicable theoretical models of interactive product design. The authors suggest that such paradigms would call for an approach that would equip HCID students with a better…

  20. The Human-Computer Interaction of Cross-Cultural Gaming Strategy

    ERIC Educational Resources Information Center

    Chakraborty, Joyram; Norcio, Anthony F.; Van Der Veer, Jacob J.; Andre, Charles F.; Miller, Zachary; Regelsberger, Alexander

    2015-01-01

    This article explores the cultural dimensions of the human-computer interaction that underlies gaming strategies. The article is a desktop study of existing literature and is organized into five sections. The first examines the cultural aspects of knowledge processing. The social constructs of technology interaction are then discussed. Following this, the…

  1. Implementations of the CC'01 Human-Computer Interaction Guidelines Using Bloom's Taxonomy

    ERIC Educational Resources Information Center

    Manaris, Bill; Wainer, Michael; Kirkpatrick, Arthur E.; Stalvey, RoxAnn H.; Shannon, Christine; Leventhal, Laura; Barnes, Julie; Wright, John; Schafer, J. Ben; Sanders, Dean

    2007-01-01

    In today's technology-laden society human-computer interaction (HCI) is an important knowledge area for computer scientists and software engineers. This paper surveys existing approaches to incorporate HCI into computer science (CS) and such related issues as the perceived gap between the interests of the HCI community and the needs of CS…

  2. The Human-Computer Interaction of Cross-Cultural Gaming Strategy

    ERIC Educational Resources Information Center

    Chakraborty, Joyram; Norcio, Anthony F.; Van Der Veer, Jacob J.; Andre, Charles F.; Miller, Zachary; Regelsberger, Alexander

    2015-01-01

    This article explores the cultural dimensions of the human-computer interaction that underlies gaming strategies. The article is a desktop study of existing literature and is organized into five sections. The first examines the cultural aspects of knowledge processing. The social constructs of technology interaction are then discussed. Following this, the…

  3. Advanced Perceptual User Interfaces: Applications for Disabled and Elderly People

    NASA Astrophysics Data System (ADS)

    López, Francisco J. Perales

    Research on new human-computer interfaces has become a growing field in computer science, aiming to develop more natural, intuitive, unobtrusive and efficient interfaces. This objective has given rise to the concept of Perceptual User Interfaces (PUIs), which are becoming increasingly popular because they seek to make the user interface more natural and compelling by taking advantage of the ways in which people naturally interact with each other and with the world. PUIs can use speech and sound recognition and generation, computer vision, graphical animation and visualization, language understanding, touch-based sensing and feedback (haptics), learning, user modeling and dialog management.

  4. High throughput screening for mammography using a human-computer interface with rapid serial visual presentation (RSVP)

    NASA Astrophysics Data System (ADS)

    Hope, Chris; Sterr, Annette; Elangovan, Premkumar; Geades, Nicholas; Windridge, David; Young, Ken; Wells, Kevin

    2013-03-01

    The steady rise of the breast cancer screening population, coupled with data expansion produced by new digital screening technologies (tomosynthesis/CT), motivates the development of new, more efficient image screening processes. Rapid Serial Visual Presentation (RSVP) is a new fast-content recognition approach which uses electroencephalography to record brain activity elicited by fast bursts of image data. These brain responses are then subjected to machine classification methods to reveal the expert's 'reflex' response to classify images according to the presence or absence of particular targets. The benefit of this method is that images can be presented at high temporal rates (~10 per second), faster than that required for fully conscious detection, facilitating a high throughput of image (screening) material. In the present paper we present the first application of RSVP to medical image data, and demonstrate how cortically coupled computer vision can be successfully applied to breast cancer screening. Whilst prior RSVP work has utilised multichannel approaches, we also present the first RSVP results demonstrating a discriminatory response on a single electrode, with a ROC area under the curve of 0.62-0.86 using a simple Fisher discriminator for classification. This increases to 0.75-0.94 when multiple electrodes are used in combination.
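
    The following is a minimal sketch, under stated assumptions, of the single-channel classification step: Fisher linear discriminant analysis on epochs scored with ROC AUC. The EEG epochs here are synthetic; it is not the authors' pipeline, where epochs would be time-locked to each image onset in the rapid presentation stream.

      # Minimal single-electrode RSVP-style classification sketch on synthetic data.
      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(1)
      n_epochs, n_samples = 400, 128            # epochs x time samples (one electrode)
      y = rng.integers(0, 2, n_epochs)          # 1 = target image, 0 = non-target
      X = rng.normal(0, 1, (n_epochs, n_samples))
      X[y == 1, 40:60] += 0.4                   # a weak evoked deflection for targets

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      lda = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
      scores = lda.decision_function(X_te)
      print("ROC AUC:", roc_auc_score(y_te, scores))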

  5. Design of the Human-Computer Interface for a Computer Aided Design Tool for the Normalization of Relations.

    DTIC Science & Technology

    1985-12-01

    engineered interactive dialogue based on function keys or a command syntax. However, even experienced users can forget functions and commands (1:27...). LANGUAGE: The set of vocabulary, syntax, and grammatical rules used to interact with the computer system. ... AIR FORCE INSTITUTE OF TECHNOLOGY, Wright-Patterson Air Force Base, Ohio

  6. The Design of Hand Gestures for Human-Computer Interaction: Lessons from Sign Language Interpreters.

    PubMed

    Rempel, David; Camilleri, Matt J; Lee, David L

    2015-10-01

    The design and selection of 3D modeled hand gestures for human-computer interaction should follow principles of natural language combined with the need to optimize gesture contrast and recognition. The selection should also consider the discomfort and fatigue associated with distinct hand postures and motions, especially for common commands. Sign language interpreters have extensive and unique experience forming hand gestures and many suffer from hand pain while gesturing. Professional sign language interpreters (N=24) rated discomfort for hand gestures associated with 47 characters and words and 33 hand postures. Clear associations of discomfort with hand postures were identified. In a nominal logistic regression model, high discomfort was associated with gestures requiring a flexed wrist, discordant adjacent fingers, or extended fingers. These and other findings should be considered in the design of hand gestures to optimize the relationship between human cognitive and physical processes and computer gesture recognition systems for human-computer input.
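
    As an illustration of the kind of model reported above, here is a sketch of a binary logistic regression relating posture descriptors to high discomfort (the study used a nominal logistic regression); the feature names and data are hypothetical, not the interpreters' ratings.

      # Illustrative sketch only: logistic model of discomfort vs. posture features.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(2)
      n = 200
      # Binary posture descriptors per gesture: flexed wrist, discordant adjacent
      # fingers, extended fingers (1 = present).
      X = rng.integers(0, 2, size=(n, 3))
      # Simulated "high discomfort" label, more likely when those features are present.
      logits = -1.5 + 1.2 * X[:, 0] + 1.0 * X[:, 1] + 0.8 * X[:, 2]
      y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

      model = LogisticRegression().fit(X, y)
      for name, coef in zip(["flexed_wrist", "discordant_fingers", "extended_fingers"],
                            model.coef_[0]):
          print(f"{name}: odds ratio ~ {np.exp(coef):.2f}")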

  7. The Design of Hand Gestures for Human-Computer Interaction: Lessons from Sign Language Interpreters

    PubMed Central

    Rempel, David; Camilleri, Matt J.; Lee, David L.

    2015-01-01

    The design and selection of 3D modeled hand gestures for human-computer interaction should follow principles of natural language combined with the need to optimize gesture contrast and recognition. The selection should also consider the discomfort and fatigue associated with distinct hand postures and motions, especially for common commands. Sign language interpreters have extensive and unique experience forming hand gestures and many suffer from hand pain while gesturing. Professional sign language interpreters (N=24) rated discomfort for hand gestures associated with 47 characters and words and 33 hand postures. Clear associations of discomfort with hand postures were identified. In a nominal logistic regression model, high discomfort was associated with gestures requiring a flexed wrist, discordant adjacent fingers, or extended fingers. These and other findings should be considered in the design of hand gestures to optimize the relationship between human cognitive and physical processes and computer gesture recognition systems for human-computer input. PMID:26028955

  8. Human-computer interaction in freeform object design and simultaneous manufacturing

    NASA Astrophysics Data System (ADS)

    Zhang, Wei; Lin, Heng; Ma, Liang; Chen, Delin

    2004-03-01

    Freeform object design and simultaneous manufacturing is a novel virtual design and manufacturing method that aims to enable creative and individualized product geometry design and rapid manufacturing of the designed model. The geometry is defined through the process of "virtual sculpting" during which the designer can touch and visualize the designed object in a virtual environment. Natural human-computer interaction is a key issue for this method. This paper first briefly reviewed the principle of the method, including the system configuration, data flow, and fundamental algorithm. Then an input/output device was developed to achieve natural human-computer interaction. Structure of the device and algorithms of calculating the input coordinates and output force were presented. Finally a feedback model was proposed and discussed to apply force feedback during virtual sculpting design.

  9. Semisupervised learning of classifiers: theory, algorithms, and their application to human-computer interaction.

    PubMed

    Cohen, Ira; Cozman, Fabio G; Sebe, Nicu; Cirelo, Marcelo C; Huang, Thomas S

    2004-12-01

    Automatic classification is one of the basic tasks required in any pattern recognition and human computer interaction application. In this paper, we discuss training probabilistic classifiers with labeled and unlabeled data. We provide a new analysis that shows under what conditions unlabeled data can be used in learning to improve classification performance. We also show that, if the conditions are violated, using unlabeled data can be detrimental to classification performance. We discuss the implications of this analysis to a specific type of probabilistic classifiers, Bayesian networks, and propose a new structure learning algorithm that can utilize unlabeled data to improve classification. Finally, we show how the resulting algorithms are successfully employed in two applications related to human-computer interaction and pattern recognition: facial expression recognition and face detection.
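
    A toy sketch of the general idea follows: a Gaussian naive Bayes classifier fit by EM on a few labeled points plus many unlabeled points. It is not the paper's Bayesian-network structure learner; whether the unlabeled data helps depends on how well the assumed model matches the true data distribution, which is the paper's point.

      # Toy semisupervised EM for a 2-class Gaussian naive Bayes model.
      import numpy as np

      rng = np.random.default_rng(3)
      # Two Gaussian classes in 2-D; only a few points are labeled.
      X = np.vstack([rng.normal([-2, 0], 1.0, (200, 2)),
                     rng.normal([+2, 0], 1.0, (200, 2))])
      y = np.r_[np.zeros(200), np.ones(200)]
      labeled = rng.choice(len(X), size=10, replace=False)

      def log_gauss(X, mu, var):
          return -0.5 * np.sum(np.log(2 * np.pi * var) + (X - mu) ** 2 / var, axis=1)

      # Initialise responsibilities; clamp the labeled points to their true class.
      R = np.full((len(X), 2), 0.5)
      R[labeled] = np.eye(2)[y[labeled].astype(int)]
      for _ in range(50):                       # EM iterations
          # M-step: class priors, means, and diagonal variances from soft counts.
          pi = R.mean(axis=0)
          mu = (R.T @ X) / R.sum(axis=0)[:, None]
          var = np.stack([(R[:, k:k+1] * (X - mu[k]) ** 2).sum(0) / R[:, k].sum()
                          for k in range(2)])
          # E-step: recompute responsibilities, keeping labeled points clamped.
          ll = np.stack([np.log(pi[k]) + log_gauss(X, mu[k], var[k]) for k in range(2)], 1)
          R = np.exp(ll - ll.max(1, keepdims=True))
          R /= R.sum(1, keepdims=True)
          R[labeled] = np.eye(2)[y[labeled].astype(int)]

      print("accuracy with 10 labels + unlabeled data:", (R.argmax(1) == y).mean())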

  10. Collaborative Human-Computer Decision Making for Command and Control Resource Allocation

    DTIC Science & Technology

    2007-08-01

    modifying other assignments at higher priority levels. In the experiment, six subjects participated in a cognitive walkthrough of the mission planning ... students with extensive backgrounds in UAV operation and Human-Computer Interaction, two of them being USAF 2nd Lieutenants. A cognitive walkthrough ... evaluates how well a skilled user can perform novel or occasionally performed tasks. In this usability inspection method, ease of learning, ease of

  11. Intelligent Context-Aware and Adaptive Interface for Mobile LBS

    PubMed Central

    Liu, Yanhong

    2015-01-01

    Context-aware user interfaces play an important role in many human-computer interaction tasks of location-based services. Although spatial models for context-aware systems have been studied extensively, how to locate specific spatial information for users is still not well resolved, which is important in the mobile environment, where users of location-based services are impeded by device limitations. Better context-aware human-computer interaction models of mobile location-based services are needed not just to predict performance outcomes, such as whether people will be able to find the information needed to complete a human-computer interaction task, but to understand the human processes that interact in spatial query, which will in turn inform the detailed design of better user interfaces in mobile location-based services. In this study, a context-aware adaptive model for the mobile location-based services interface is proposed, which contains three major sections: purpose, adjustment, and adaptation. Based on this model, we describe the process of user operation and interface adaptation through the dynamic interaction between users and the interface. We then show how the model accommodates users' demands in a complicated environment and demonstrate its feasibility through experimental results. PMID:26457077

  12. Intelligent Context-Aware and Adaptive Interface for Mobile LBS.

    PubMed

    Feng, Jiangfan; Liu, Yanhong

    2015-01-01

    Context-aware user interfaces play an important role in many human-computer interaction tasks of location-based services. Although spatial models for context-aware systems have been studied extensively, how to locate specific spatial information for users is still not well resolved, which is important in the mobile environment, where users of location-based services are impeded by device limitations. Better context-aware human-computer interaction models of mobile location-based services are needed not just to predict performance outcomes, such as whether people will be able to find the information needed to complete a human-computer interaction task, but to understand the human processes that interact in spatial query, which will in turn inform the detailed design of better user interfaces in mobile location-based services. In this study, a context-aware adaptive model for the mobile location-based services interface is proposed, which contains three major sections: purpose, adjustment, and adaptation. Based on this model, we describe the process of user operation and interface adaptation through the dynamic interaction between users and the interface. We then show how the model accommodates users' demands in a complicated environment and demonstrate its feasibility through experimental results.

  13. Visual design for the user interface, Part 1: Design fundamentals.

    PubMed

    Lynch, P J

    1994-01-01

    Digital audiovisual media and computer-based documents will be the dominant forms of professional communication in both clinical medicine and the biomedical sciences. The design of highly interactive multimedia systems will shortly become a major activity for biocommunications professionals. The problems of human-computer interface design are intimately linked with graphic design for multimedia presentations and on-line document systems. This article outlines the history of graphic interface design and the theories that have influenced the development of today's major graphic user interfaces.

  14. Real-time non-invasive eyetracking and gaze-point determination for human-computer interaction and biomedicine

    NASA Technical Reports Server (NTRS)

    Talukder, Ashit; Morookian, John-Michael; Monacos, S.; Lam, R.; Lebaw, C.; Bond, A.

    2004-01-01

    Eyetracking is one of the latest technologies that has shown potential in several areas including human-computer interaction for people with and without disabilities, and for noninvasive monitoring, detection, and even diagnosis of physiological and neurological problems in individuals.

  15. Real-time non-invasive eyetracking and gaze-point determination for human-computer interaction and biomedicine

    NASA Technical Reports Server (NTRS)

    Talukder, Ashit; Morookian, John-Michael; Monacos, S.; Lam, R.; Lebaw, C.; Bond, A.

    2004-01-01

    Eyetracking is one of the latest technologies that has shown potential in several areas including human-computer interaction for people with and without disabilities, and for noninvasive monitoring, detection, and even diagnosis of physiological and neurological problems in individuals.

  16. Portable tongue-supported human computer interaction system design and implementation.

    PubMed

    Quain, Rohan; Khan, Masood Mehmood

    2014-01-01

    Tongue-supported human-computer interaction (TSHCI) systems can help critically ill patients interact with both computers and people. These systems can be particularly useful for patients suffering injuries above C7 on their spinal vertebrae. Despite recent successes in their application, several limitations restrict the performance of existing TSHCI systems and discourage their use in real-life situations. This paper proposes a low-cost, less intrusive, portable and easy-to-use design for implementing a TSHCI system. Two applications of the proposed system are reported. Design considerations and performance of the proposed system are also presented.

  17. User interface guidelines for the Integrated Booking System prototype (IBS-P)

    SciTech Connect

    Truett, T.; Yow, T.; Wheeler, V.; Stamm, S.; Valentine, D.

    1991-05-01

    The User Interface Guidelines for the Integrated Booking System -- Prototype (IBS-P) describe the design requirements for the human-computer interface. The user interface design conforms to standards reported in the open literature as well as to standards provided through Department of Defense guidelines. The IBS-P interface was evaluated by personnel at Headquarters Military Traffic Management Command (MTMC) and at each of the MTMC Area Commands. As a result of comments received during demonstrations of the prototype at these sites, modifications to the design were made, as appropriate. The user interface was well accepted by the end users.

  18. Evolutionary adaptive eye tracking for low-cost human computer interaction applications

    NASA Astrophysics Data System (ADS)

    Shen, Yan; Shin, Hak Chul; Sung, Won Jun; Khim, Sarang; Kim, Honglak; Rhee, Phill Kyu

    2013-01-01

    We present an evolutionary adaptive eye-tracking framework aiming for low-cost human computer interaction. The main focus is to guarantee eye-tracking performance without using high-cost devices and strongly controlled situations. The performance optimization of eye tracking is formulated into the dynamic control problem of deciding on an eye tracking algorithm structure and associated thresholds/parameters, where the dynamic control space is denoted by genotype and phenotype spaces. The evolutionary algorithm is responsible for exploring the genotype control space, and the reinforcement learning algorithm organizes the evolved genotype into a reactive phenotype. The evolutionary algorithm encodes an eye-tracking scheme as a genetic code based on image variation analysis. Then, the reinforcement learning algorithm defines internal states in a phenotype control space limited by the perceived genetic code and carries out interactive adaptations. The proposed method can achieve optimal performance by compromising the difficulty in the real-time performance of the evolutionary algorithm and the drawback of the huge search space of the reinforcement learning algorithm. Extensive experiments were carried out using webcam image sequences and yielded very encouraging results. The framework can be readily applied to other low-cost vision-based human computer interactions in solving their intrinsic brittleness in unstable operational environments.
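
    The sketch below illustrates only the evolutionary layer of such a framework: a genetic algorithm searching over tracker thresholds and parameters (the "genotype") to maximize a fitness score. The fitness function here is a synthetic stand-in; in the paper it would be tracking performance on webcam sequences, further refined by reinforcement learning at the phenotype level.

      # Toy genetic algorithm over hypothetical eye-tracker parameters.
      import numpy as np

      rng = np.random.default_rng(4)

      def fitness(params):
          # Hypothetical surrogate: best performance near some unknown optimum.
          target = np.array([0.35, 12.0, 0.8])      # e.g. threshold, window, gain
          return -np.sum(((params - target) / target) ** 2)

      pop = rng.uniform([0, 1, 0], [1, 30, 2], size=(30, 3))   # initial population
      for gen in range(60):
          scores = np.array([fitness(p) for p in pop])
          parents = pop[np.argsort(scores)[-10:]]               # keep the 10 fittest
          children = []
          for _ in range(len(pop) - len(parents)):
              a, b = parents[rng.integers(0, 10, 2)]
              child = np.where(rng.random(3) < 0.5, a, b)        # uniform crossover
              child += rng.normal(0, 0.05, 3) * [1, 30, 2]       # mutation
              children.append(child)
          pop = np.vstack([parents, children])

      best = pop[np.argmax([fitness(p) for p in pop])]
      print("evolved parameters:", np.round(best, 3))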

  19. Hand gesture recognition based on motion history images for a simple human-computer interaction system

    NASA Astrophysics Data System (ADS)

    Timotius, Ivanna K.; Setyawan, Iwan

    2013-03-01

    A human-computer interaction system can be developed using several kinds of tools. One option is to use images captured by a camera. This paper proposes a simple human-computer interaction system based on hand movement captured by a web camera. The system aims to classify the captured movement into one of three classes. The first two classes contain hand movements to the left and right, respectively. The third class contains non-hand movements or hand movements in other directions. The method used in this paper is based on Motion History Images (MHIs) and a nearest neighbor classifier. The resulting MHIs are processed in two manners, namely by summing the pixel values along the vertical axis and by reshaping them into vectors. We also use two distance criteria, namely the Euclidean distance and cross-correlation. This paper compares the performance of the combinations of MHI processing methods and distance criteria using 10 runs of 2-fold cross validation. Our experiments show that reshaping the MHI data into vectors combined with a Euclidean distance criterion gives the highest average accuracy, namely 55.67%.
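
    A minimal sketch of the core of such a system follows (not the authors' code): build a Motion History Image from a frame sequence, flatten it to a vector, and classify it by nearest neighbor with Euclidean distance. Frames and templates here are synthetic arrays; a real system would read frames from a web camera and use MHIs of labeled gestures as templates.

      # Motion History Image + nearest-neighbor classification sketch.
      import numpy as np

      def motion_history_image(frames, tau=10, thresh=30):
          """frames: list of grayscale images (2-D uint8 arrays) of equal size."""
          mhi = np.zeros(frames[0].shape, dtype=float)
          for prev, cur in zip(frames, frames[1:]):
              motion = np.abs(cur.astype(int) - prev.astype(int)) > thresh
              mhi = np.where(motion, tau, np.maximum(mhi - 1, 0))  # decay old motion
          return mhi

      def nearest_neighbor(query, templates, labels):
          d = [np.linalg.norm(query.ravel() - t.ravel()) for t in templates]
          return labels[int(np.argmin(d))]

      rng = np.random.default_rng(5)
      frames = [rng.integers(0, 256, (64, 64), dtype=np.uint8) for _ in range(12)]
      mhi = motion_history_image(frames)
      templates = [rng.random((64, 64)) for _ in range(3)]   # stand-ins for gesture MHIs
      print(nearest_neighbor(mhi, templates, ["left", "right", "other"]))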

  20. A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems

    NASA Technical Reports Server (NTRS)

    Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  1. A Cognitive Model of Human-Computer Interaction in Naval Air ASW Mission Management

    DTIC Science & Technology

    1989-12-15

    interface through which a task would be performed. Using the performance-time predictions, alternative interface designs can be evaluated. Kieras has...extended the use of GOMS models for user interface design (Bovair, Kieras, & Polson, 1988; Kieras, 1988), deriving both quantitative and qualitative...(Elkerton & Palmiter, 1989). Much of the work done to date with GOMS by Card, Newell, John, Kieras, and others has focused on the lowest level goals and
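
    To illustrate what a GOMS-style performance-time prediction looks like at the keystroke level, here is a minimal sketch; the operator times are approximate, commonly cited textbook values and the method sequence is hypothetical, not drawn from this report.

      # Keystroke-level sketch of a GOMS-style time prediction.
      OPERATOR_TIME = {          # seconds; approximate textbook values
          "K": 0.28,             # press a key or button
          "P": 1.10,             # point with a mouse to a target
          "H": 0.40,             # home hands between keyboard and mouse
          "M": 1.35,             # mental act of preparation
      }

      def predict_time(method):
          return sum(OPERATOR_TIME[op] for op in method)

      # Hypothetical method: think, move hand to mouse, point at a field, click,
      # home back to the keyboard, type four characters.
      method = ["M", "H", "P", "K", "H"] + ["K"] * 4
      print(f"predicted execution time: {predict_time(method):.2f} s")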

  2. An acoustic interface for triggering actions in virtual environments

    NASA Astrophysics Data System (ADS)

    Li, Yinlin; Groenegress, Christoph; Denzinger, Jochen; Strauss, Wolfgang; Fleischmann, Monika

    2004-03-01

    Currently one of the main research issues in Human Computer Interaction (HCI) is to develop more intuitive, multimodal and natural interfaces. Among them, the interface for triggering simple actions or selecting objects in virtual environments (VEs) is one area of particular interest. In this paper we describe an acoustic interface which uses finger-snap or hand-clap sounds as the input command to initiate events for VE applications. We developed a sophisticated algorithm based on the wavelet transform and neural network techniques, which separates environmental noise from the snap and clap sounds. The acoustic interface can be integrated with other interfaces, such as optical tracking systems, to provide a more natural, easy-to-use, efficient and body-centered multimodal interaction for virtual reality applications.
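
    As a simplified stand-in for the detector described above (not the wavelet/neural-network pipeline), the sketch below flags short, sharp transients such as snap- or clap-like bursts by thresholding short-time energy against the running background level; the thresholds, frame sizes, and test signal are illustrative.

      # Short-time-energy transient detector on a synthetic audio signal.
      import numpy as np

      def detect_transients(signal, fs, frame_ms=10, k=6.0):
          frame = int(fs * frame_ms / 1000)
          n = len(signal) // frame
          energy = (signal[:n * frame].reshape(n, frame) ** 2).mean(axis=1)
          background = np.median(energy)
          hits = np.where(energy > k * background)[0]
          return hits * frame / fs              # onset times in seconds

      fs = 16000
      noise = 0.01 * np.random.default_rng(6).normal(size=fs)   # 1 s of background
      snap = np.zeros(fs)
      snap[8000:8080] = 0.8 * np.hanning(80)    # a short burst at t = 0.5 s
      print(detect_transients(noise + snap, fs))  # ~[0.5]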

  3. The Electronic Mirror: Human-Computer Interaction and Change in Self-Appraisals.

    ERIC Educational Resources Information Center

    De Laere, Kevin H.; Lundgren, David C.; Howe, Steven R.

    1998-01-01

    Compares humanlike versus machinelike interactional styles of computer interfaces, testing hypotheses that evaluative feedback conveyed through a humanlike interface will have greater impact on individuals' self-appraisals. Reflected appraisals were more influenced by computer feedback than were self-appraisals. Humanlike and machinelike interface…

  4. Non-Speech Sound in Human-Computer Interaction: A Review and Design Guidelines.

    ERIC Educational Resources Information Center

    Hereford, James; Winn, William

    1994-01-01

    Reviews research on uses of computer sound and suggests how sound might be used effectively by instructional and interface designers. Topics include principles of interface design; the perception of sound; earcons, both symbolic and iconic; sound in data analysis; sound in virtual environments; and guidelines for using sound. (70 references) (LRW)

  5. Within the Interface: Visual Rhetoric, Pedagogy, and Writing Center Website Design

    ERIC Educational Resources Information Center

    Myatt, Alice J.

    2010-01-01

    My dissertation examines the theory and praxis of taking an expanded concept of the human-computer interface (HCI) and working with the resulting concept to foster a more conversational approach for online tutoring sessions and the design of the writing center websites that facilitate online tutoring. For the purposes of my research, I describe…

  6. Within the Interface: Visual Rhetoric, Pedagogy, and Writing Center Website Design

    ERIC Educational Resources Information Center

    Myatt, Alice J.

    2010-01-01

    My dissertation examines the theory and praxis of taking an expanded concept of the human-computer interface (HCI) and working with the resulting concept to foster a more conversational approach for online tutoring sessions and the design of the writing center websites that facilitate online tutoring. For the purposes of my research, I describe…

  7. A Hybrid Human-Computer Approach to the Extraction of Scientific Facts from the Literature

    PubMed Central

    Tchoua, Roselyne B.; Chard, Kyle; Audus, Debra; Qin, Jian; de Pablo, Juan; Foster, Ian

    2017-01-01

    A wealth of valuable data is locked within the millions of research articles published each year. Reading and extracting pertinent information from those articles has become an unmanageable task for scientists. This problem hinders scientific progress by making it hard to build on results buried in literature. Moreover, these data are loosely structured, encoded in manuscripts of various formats, embedded in different content types, and are, in general, not machine accessible. We present a hybrid human-computer solution for semi-automatically extracting scientific facts from literature. This solution combines an automated discovery, download, and extraction phase with a semi-expert crowd assembled from students to extract specific scientific facts. To evaluate our approach we apply it to a challenging molecular engineering scenario, extraction of a polymer property: the Flory-Huggins interaction parameter. We demonstrate useful contributions to a comprehensive database of polymer properties. PMID:28649288

  8. A Hybrid Human-Computer Approach to the Extraction of Scientific Facts from the Literature.

    PubMed

    Tchoua, Roselyne B; Chard, Kyle; Audus, Debra; Qin, Jian; de Pablo, Juan; Foster, Ian

    2016-01-01

    A wealth of valuable data is locked within the millions of research articles published each year. Reading and extracting pertinent information from those articles has become an unmanageable task for scientists. This problem hinders scientific progress by making it hard to build on results buried in literature. Moreover, these data are loosely structured, encoded in manuscripts of various formats, embedded in different content types, and are, in general, not machine accessible. We present a hybrid human-computer solution for semi-automatically extracting scientific facts from literature. This solution combines an automated discovery, download, and extraction phase with a semi-expert crowd assembled from students to extract specific scientific facts. To evaluate our approach we apply it to a challenging molecular engineering scenario, extraction of a polymer property: the Flory-Huggins interaction parameter. We demonstrate useful contributions to a comprehensive database of polymer properties.

  9. Soft Electronics Enabled Ergonomic Human-Computer Interaction for Swallowing Training.

    PubMed

    Lee, Yongkuk; Nicholls, Benjamin; Sup Lee, Dong; Chen, Yanfei; Chun, Youngjae; Siang Ang, Chee; Yeo, Woon-Hong

    2017-04-21

    We introduce a skin-friendly electronic system that enables human-computer interaction (HCI) for swallowing training in dysphagia rehabilitation. For an ergonomic HCI, we utilize a soft, highly compliant ("skin-like") electrode, which addresses critical issues of an existing rigid and planar electrode combined with a problematic conductive electrolyte and adhesive pad. The skin-like electrode offers a highly conformal, user-comfortable interaction with the skin for long-term wearable, high-fidelity recording of swallowing electromyograms on the chin. Mechanics modeling and experimental quantification captures the ultra-elastic mechanical characteristics of an open mesh microstructured sensor, conjugated with an elastomeric membrane. Systematic in vivo studies investigate the functionality of the soft electronics for HCI-enabled swallowing training, which includes the application of a biofeedback system to detect swallowing behavior. The collection of results demonstrates clinical feasibility of the ergonomic electronics in HCI-driven rehabilitation for patients with swallowing disorders.

  10. Redesign of a computerized clinical reminder for colorectal cancer screening: a human-computer interaction evaluation.

    PubMed

    Saleem, Jason J; Haggstrom, David A; Militello, Laura G; Flanagan, Mindy; Kiess, Chris L; Arbuckle, Nicole; Doebbeling, Bradley N

    2011-11-29

    Based on barriers to the use of computerized clinical decision support (CDS) learned in an earlier field study, we prototyped design enhancements to the Veterans Health Administration's (VHA's) colorectal cancer (CRC) screening clinical reminder to compare against the VHA's current CRC reminder. In a controlled simulation experiment, 12 primary care providers (PCPs) used prototypes of the current and redesigned CRC screening reminder in a within-subject comparison. Quantitative measurements were based on a usability survey, workload assessment instrument, and workflow integration survey. We also collected qualitative data on both designs. Design enhancements to the VHA's existing CRC screening clinical reminder positively impacted aspects of usability and workflow integration but not workload. The qualitative analysis revealed broad support across participants for the design enhancements with specific suggestions for improving the reminder further. This study demonstrates the value of a human-computer interaction evaluation in informing the redesign of information tools to foster uptake, integration into workflow, and use in clinical practice.

  11. Human-computer interaction: psychological aspects of the human use of computing.

    PubMed

    Olson, Gary M; Olson, Judith S

    2003-01-01

    Human-computer interaction (HCI) is a multidisciplinary field in which psychology and other social sciences unite with computer science and related technical fields with the goal of making computing systems that are both useful and usable. It is a blend of applied and basic research, both drawing from psychological research and contributing new ideas to it. New technologies continuously challenge HCI researchers with new options, as do the demands of new audiences and uses. A variety of usability methods have been developed that draw upon psychological principles. HCI research has expanded beyond its roots in the cognitive processes of individual users to include social and organizational processes involved in computer usage in real environments as well as the use of computers in collaboration. HCI researchers need to be mindful of the longer-term changes brought about by the use of computing in a variety of venues.

  12. Soft Electronics Enabled Ergonomic Human-Computer Interaction for Swallowing Training

    NASA Astrophysics Data System (ADS)

    Lee, Yongkuk; Nicholls, Benjamin; Sup Lee, Dong; Chen, Yanfei; Chun, Youngjae; Siang Ang, Chee; Yeo, Woon-Hong

    2017-04-01

    We introduce a skin-friendly electronic system that enables human-computer interaction (HCI) for swallowing training in dysphagia rehabilitation. For an ergonomic HCI, we utilize a soft, highly compliant (“skin-like”) electrode, which addresses critical issues of an existing rigid and planar electrode combined with a problematic conductive electrolyte and adhesive pad. The skin-like electrode offers a highly conformal, user-comfortable interaction with the skin for long-term wearable, high-fidelity recording of swallowing electromyograms on the chin. Mechanics modeling and experimental quantification captures the ultra-elastic mechanical characteristics of an open mesh microstructured sensor, conjugated with an elastomeric membrane. Systematic in vivo studies investigate the functionality of the soft electronics for HCI-enabled swallowing training, which includes the application of a biofeedback system to detect swallowing behavior. The collection of results demonstrates clinical feasibility of the ergonomic electronics in HCI-driven rehabilitation for patients with swallowing disorders.

  13. Standardized task strain and system response times in human-computer interaction.

    PubMed

    Thum, M; Boucsein, W; Kuhmann, W; Ray, W J

    1995-07-01

    Involuntary delays in human-computer interaction, for example system response times (SRTs), can increase stress. In the present study, 40 college-age subjects were randomly divided into an 'incentive' and a 'non-incentive' group. Subjects performed a computer task with SRTs of 0.5, 1.5, and 4.5 s. Physiological, subjective, and performance data were collected during the task. The computer task was designed to individually set the difficulty level (i.e., mental strain), thus standardizing the task for all subjects. By using this procedure, changes resulting from SRT duration can be separated from effects related to task difficulty. The results indicate that both short and long SRTs produced differential psychophysiological changes consistent with different types of stress responses. Short SRTs resulted in higher autonomic and somatic activity and increased positive self-reported emotional states but poorer performance. Long SRTs resulted in increased electrodermal activity, negative self-reported emotional states and better performance.

  14. Cognition friendly interaction: A concept of partnership in human computer interaction

    NASA Astrophysics Data System (ADS)

    Das, Balaram

    2001-09-01

    This paper identifies yet another field of research, the discipline of human-computer interaction, where the concept of self-similar fluctuations can play a vital role. A concept of interaction between computation and cognition is developed that is friendly toward the cognitive process. It is argued that friendly interactions must have a memory and be antipersistent. To cast this in a mathematical form, fluctuations in the interactions recorded over a period of time are studied, and it is shown that these fluctuations must necessarily be self-similar, with the value of the self-similarity parameter confined to the interval (0, 1/2), for the interaction to be friendly. A statistical measure of the complexity of the interaction process is also formulated as a function of the self-similarity parameter. Finally, the question is raised of how to build friendly software, and a possible evolutionary process through which friendly software may emerge is indicated.
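
    One way to test the "friendliness" criterion above in practice is to estimate the self-similarity parameter H of an interaction-fluctuation series; H below 0.5 indicates antipersistence. The sketch below uses the aggregated-variance estimator on a synthetic white-noise series (H near 0.5), purely as an illustration, not the paper's own analysis.

      # Aggregated-variance estimate of the self-similarity (Hurst) parameter H.
      import numpy as np

      def hurst_aggvar(x, block_sizes=(4, 8, 16, 32, 64)):
          """Var of block means scales as m**(2H - 2) for a self-similar increment series."""
          variances = []
          for m in block_sizes:
              n = len(x) // m
              means = x[:n * m].reshape(n, m).mean(axis=1)
              variances.append(means.var())
          slope, _ = np.polyfit(np.log(block_sizes), np.log(variances), 1)
          return 1 + slope / 2

      x = np.random.default_rng(7).normal(size=4096)   # synthetic fluctuation series
      H = hurst_aggvar(x)
      print(f"H = {H:.2f} ->", "antipersistent" if H < 0.5 else "not antipersistent")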

  15. An Overview of a Decade of Journal Publications about Culture and Human-Computer Interaction (HCI)

    NASA Astrophysics Data System (ADS)

    Clemmensen, Torkil; Roese, Kerstin

    In this paper, we analyze the concept of human-computer interaction in cultural and national contexts. Building and extending upon the framework for understanding research in usability and culture by Honold [3], we give an overview of publications on culture and HCI between 1998 and 2008, with a narrow focus on high-level journal publications only. The purpose is to review current practice in how cultural HCI issues are studied, and to analyse problems with the measures and interpretation of these studies. We find that Hofstede's cultural dimensions have been the dominant model of culture, that participants have been picked because they could speak English, and that most studies have been large-scale quantitative studies. In order to balance this situation, we recommend that more researchers and practitioners undertake qualitative, empirical studies.

  16. Road tracking in aerial images based on human-computer interaction and Bayesian filtering

    NASA Astrophysics Data System (ADS)

    Zhou, Jun; Bischof, Walter F.; Caelli, Terry

    A typical way to update map road layers is to compare recent aerial images with existing map data, detect new roads and add them as cartographic entities to the road layer. This method cannot be fully automated because computer vision algorithms are still not sufficiently robust and reliable. More importantly, maps require final checking by a human due to the legal implications of errors. In this paper we introduce a road tracking system based on human-computer interactions (HCI) and Bayesian filtering. Bayesian filters, specifically, extended Kalman filters and particle filters, are used in conjunction with human inputs to estimate road axis points and update the tracking algorithms. Experimental results show that this approach is efficient and reliable and that it produces substantial savings over the traditional manual map revision approach. The main contribution of the paper is to propose a general and practical system that optimizes the performance of road tracking when both human and computer resources are involved.
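
    As a small illustration of the filtering idea (not the authors' implementation, which uses extended Kalman and particle filters), the sketch below runs a linear Kalman filter that predicts the next road-axis point from the current position and direction, then corrects it with a noisy measurement; in the paper that measurement would come from image matching seeded by the human operator.

      # Linear Kalman filter tracking a gently curving road axis (synthetic data).
      import numpy as np

      dt = 1.0
      F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]])  # state: x, y, vx, vy
      H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]])                                 # we observe x, y only
      Q, R = np.eye(4) * 0.01, np.eye(2) * 1.0

      x = np.array([0.0, 0.0, 1.0, 0.2])    # human-provided seed: start point + direction
      P = np.eye(4)
      rng = np.random.default_rng(8)
      track = []
      for k in range(1, 30):
          # Predict the next axis point along the current direction.
          x, P = F @ x, F @ P @ F.T + Q
          # Simulated measurement of the true road axis, with noise.
          z = np.array([k, 0.2 * k + 0.01 * k**2]) + rng.normal(0, 1.0, 2)
          # Correct the prediction with the measurement.
          S = H @ P @ H.T + R
          K = P @ H.T @ np.linalg.inv(S)
          x = x + K @ (z - H @ x)
          P = (np.eye(4) - K @ H) @ P
          track.append(x[:2].copy())
      print(np.round(track[-1], 2))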

  17. Human-computer interaction for alert warning and attention allocation systems of the multimodal watchstation

    NASA Astrophysics Data System (ADS)

    Obermayer, Richard W.; Nugent, William A.

    2000-11-01

    The SPAWAR Systems Center San Diego is currently developing an advanced Multi-Modal Watchstation (MMWS); design concepts and software from this effort are intended for transition to future United States Navy surface combatants. The MMWS features multiple flat panel displays and several modes of user interaction, including voice input and output, natural language recognition, 3D audio, stylus and gestural inputs. In 1999, an extensive literature review was conducted on basic and applied research concerned with alerting and warning systems. After summarizing that literature, a human-computer interaction (HCI) designer's guide was prepared to support the design of an attention allocation subsystem (AAS) for the MMWS. The resultant HCI guidelines are being applied in the design of a fully interactive AAS prototype. An overview of key findings from the literature review, a proposed design methodology with illustrative examples, and an assessment of progress made in implementing the HCI designer's guide are presented.

  18. Human-computer interaction and expert systems for three-dimensional studies of biomedical images.

    PubMed

    Perkins, W J; Jordan, M M; Shepherd, A M

    1989-01-01

    Three-dimensional reconstruction of serial sections, observed by microscopy, is an important technique in medicine and biology. To view any part of a structure clearly, that part has to be identified and clearly highlighted in its relationship to other features in a structure. The identification process can be time-consuming and tedious if many sections are involved, especially for routine applications, since human identification is required. In this paper we describe how image processing, together with other information on the shape and position of features relative to each other on any one section and throughout the structure, could be incorporated into an expert system, and we also show how such a system could be designed. An important feature is the use of human-computer interaction to allow the system to evolve under the guidance of the biological or medical expert. An example of feature identification in a plant-parasitic nematode is used.

  19. Nonstationary color tracking for vision-based human-computer interaction.

    PubMed

    Wu, Ying; Huang, T S

    2002-01-01

    Skin color offers a strong cue for efficient localization and tracking of human body parts in video sequences for vision-based human-computer interaction. Color-based target localization could be achieved by analyzing segmented skin color regions. However, one of the challenges of color-based target tracking is that color distributions would change in different lighting conditions such that fixed color models would be inadequate to capture nonstationary color distributions over time. Meanwhile, using a fixed skin color model trained by the data of a specific person would probably not work well for other people. Although some work has been done on adaptive color models, this problem still needs further studies. We present our investigation of color-based image segmentation and nonstationary color-based target tracking, by studying two different representations for color distributions. We propose the structure adaptive self-organizing map (SASOM) neural network that serves as a new color model. Our experiments show that such a representation is powerful for efficient image segmentation. Then, we formulate the nonstationary color tracking problem as a model transduction problem, the solution of which offers a way to adapt and transduce color classifiers in nonstationary color distributions. To fulfill model transduction, we propose two algorithms, the SASOM transduction and the discriminant expectation-maximization (EM), based on the SASOM color model and the Gaussian mixture color model, respectively. Our extensive experiments on the task of real-time face/hand localization show that these two algorithms can successfully handle some difficulties in nonstationary color tracking. We also implemented a real-time face/hand localization system based on such algorithms for vision-based human-computer interaction.
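
    The sketch below illustrates only the Gaussian-mixture side of a skin-color model (the SASOM representation is the paper's own contribution and is not reproduced here): fit a mixture to skin pixel samples in a chromaticity space, then label new pixels whose log-likelihood under the mixture exceeds a threshold. All data here are synthetic.

      # Gaussian-mixture skin-color model on synthetic chromaticity samples.
      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(9)
      # Hypothetical normalized-RG chromaticity samples of "skin" pixels.
      skin = rng.normal([0.45, 0.32], [0.03, 0.02], size=(2000, 2))
      gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0).fit(skin)

      # Classify a mix of skin-like and background-like pixels.
      test = np.vstack([rng.normal([0.45, 0.32], [0.03, 0.02], size=(50, 2)),
                        rng.uniform(0, 1, size=(50, 2))])
      threshold = np.quantile(gmm.score_samples(skin), 0.05)   # keep 95% of training skin
      is_skin = gmm.score_samples(test) > threshold
      print("fraction labeled skin:", is_skin.mean())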

  20. Multimodal Neuroelectric Interface Development

    NASA Technical Reports Server (NTRS)

    Trejo, Leonard J.; Wheeler, Kevin R.; Jorgensen, Charles C.; Totah, Joseph (Technical Monitor)

    2001-01-01

    This project aims to improve performance of NASA missions by developing multimodal neuroelectric technologies for augmented human-system interaction. Neuroelectric technologies will add completely new modes of interaction that operate in parallel with keyboards, speech, or other manual controls, thereby increasing the bandwidth of human-system interaction. We recently demonstrated the feasibility of real-time electromyographic (EMG) pattern recognition for a direct neuroelectric human-computer interface. We recorded EMG signals from an elastic sleeve with dry electrodes while a human subject performed a range of discrete gestures. A machine-learning algorithm was trained to recognize the EMG patterns associated with the gestures and map them to control signals. Successful applications now include piloting two Class 4 aircraft simulations (F-15 and 757) and entering data with a "virtual" numeric keyboard. Current research focuses on on-line adaptation of EMG sensing and processing and recognition of continuous gestures. We are also extending this on-line pattern recognition methodology to electroencephalographic (EEG) signals. This will allow us to bypass muscle activity and draw control signals directly from the human brain. Our system can reliably detect the mu rhythm (a periodic EEG signal from motor cortex in the 10 Hz range) with a lightweight headset containing saline-soaked sponge electrodes. The data show that the EEG mu rhythm can be modulated by real and imagined motions. Current research focuses on using biofeedback to train human subjects to modulate EEG rhythms on demand, and on examining interactions of EEG-based control with EMG-based and manual control. Viewgraphs on these neuroelectric technologies are also included.
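
    A minimal sketch of the ~10 Hz motor-rhythm detection step follows (it is not the project's system): estimate band power in the 8-12 Hz range from a single EEG channel with Welch's method and compare it against a neighboring band. The signal is synthetic: noise plus a 10 Hz component of the kind that an imagined movement would suppress.

      # Band-power detection of a ~10 Hz rhythm in a synthetic single-channel EEG.
      import numpy as np
      from scipy.signal import welch

      fs = 256
      t = np.arange(10 * fs) / fs
      rng = np.random.default_rng(10)
      eeg = rng.normal(0, 1.0, t.size) + 1.5 * np.sin(2 * np.pi * 10 * t)  # "rest" EEG

      f, pxx = welch(eeg, fs=fs, nperseg=2 * fs)
      mu_power = pxx[(f >= 8) & (f <= 12)].mean()
      ref_power = pxx[(f >= 15) & (f <= 25)].mean()
      print(f"mu-band power ratio: {mu_power / ref_power:.1f}")   # >> 1 while "at rest"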

  1. Consolidated findings from 6 years research on the age-differentiated design of human-computer interaction.

    PubMed

    Vetter, Sebastian; Bützler, Jennifer; Jochems, Nicole; Schlick, Christopher M

    2012-01-01

    The fast aging of many western and eastern societies and their increasing reliance on information technology create a compelling need to reconsider older users' interactions with computers. This paper summarizes the results of 6 years of research on the age-differentiated design of human-computer interaction. The well-known model of human information processing served as the theoretical framework. The model components "sensory processing", "perception", "working memory", "decision and response selection" and "response execution" were analyzed in exemplary task settings on project management. In seven empirical studies with a total of 405 participants aged between 20 and 77 years, human-computer interaction was analyzed with regard to effectiveness, efficiency and user satisfaction. For most but not all studies, the results reveal that age-induced differences in human-computer interaction can best be compensated by an ergonomic "design for all". In some cases, however, an age-specific approach is favorable.

  2. Body language user interface (BLUI)

    NASA Astrophysics Data System (ADS)

    Brody, Arthur W.; Olmsted, Coert

    1998-07-01

    We analyze a 3D skeletal representation of the user in the spatial and temporal domains as a tool for recognizing the gestures of drawing, picking and grabbing. The mechanisms of visual perception that are called upon in the imaginative process of artistic creation use the same tactile and kinesthetic pathways and structures in the brain which are employed when we manipulate the 3D world. We see, in fact, with our sensual bodies as well as with our eyes. Our interface is built on an analysis of pointing and gesturing and how they relate to the perception of form in space. We report on our progress in implementing a body language user interface for artistic computer interaction, i.e., a human/computer interaction based on an analysis of how an artist uses her body in the act of creation. Using two synchronous TV cameras, we have videotaped an environment into which an artist moves, assumes a canonical (Da Vinci) pose and subsequently makes a series of simple gestures. The video images are processed to generate an animated 3D skeleton that corresponds to the skeleton of the artist. The locus of the path taken by the drawing hand is the source of a trace of particles. Our presentation shows the two simultaneous videos, the associated animated 3D skeleton, that skeleton as an instance of motion capture for a constrained model of a human skeleton, and the trace of the path taken by the drawing hand.

  3. Soft Interfaces

    NASA Astrophysics Data System (ADS)

    de Gennes, Pierre-Gilles; Edwards, Sam (Introduction)

    1997-04-01

    Paul Adrien Maurice Dirac, one of the greatest physicists of the twentieth century, died in 1984. Dirac's college, St. John's of Cambridge, generously endowed annual lectures to be held at Cambridge University in his memory. This volume contains a much expanded version of the 1994 Dirac Lecture by Nobel Laureate Pierre Gilles de Gennes. The book presents an impressionistic tour of the physics of soft interfaces. Full of insight and interesting asides, it not only provides an accessible introduction to this topic, but also lays down many markers and signposts that will be of interest to researchers in physics or chemistry. Features discussions of wetting and dewetting, the dynamics of different types of interface and adhesion and polymer/polymer welding.

  4. "Don't" Do This--Pitfalls in Using Anti-Patterns in Teaching Human-Computer Interaction Principles

    ERIC Educational Resources Information Center

    Kotze, Paula; Renaud, Karen; van Biljon, Judy

    2008-01-01

    This paper explores the use of design patterns and anti-patterns in teaching human-computer interaction principles. Patterns are increasingly popular and are seen as an efficient knowledge transfer mechanism in many fields, including software development in the field of software engineering, and more recently in the field of human-computer…

  5. Human-Computer Interaction and Sociological Insight: A Theoretical Examination and Experiment in Building Affinity in Small Groups

    ERIC Educational Resources Information Center

    Oren, Michael Anthony

    2011-01-01

    The juxtaposition of classic sociological theory and the, relatively, young discipline of human-computer interaction (HCI) serves as a powerful mechanism for both exploring the theoretical impacts of technology on human interactions as well as the application of technological systems to moderate interactions. It is the intent of this dissertation…

  7. "Don't" Do This--Pitfalls in Using Anti-Patterns in Teaching Human-Computer Interaction Principles

    ERIC Educational Resources Information Center

    Kotze, Paula; Renaud, Karen; van Biljon, Judy

    2008-01-01

    This paper explores the use of design patterns and anti-patterns in teaching human-computer interaction principles. Patterns are increasingly popular and are seen as an efficient knowledge transfer mechanism in many fields, including software development in the field of software engineering, and more recently in the field of human-computer…

  8. Escaping from Babel: Improving the Terminology of Mental Models in the Literature of Human-Computer Interaction.

    ERIC Educational Resources Information Center

    Turner, James M.; Belanger, Francois Papik

    1996-01-01

    Discusses the problem of establishing terminology for mental models, attempts to sort out the various meanings found in the literature, and offers definitions that could help in developing a more standardized terminology for discussing issues of concern to researchers in human-computer interaction. The role of mental models in learning and using…

  9. Cynicism, anger and cardiovascular reactivity during anger recall and human-computer interaction.

    PubMed

    Why, Yong Peng; Johnston, Derek W

    2008-06-01

    Cynicism moderated by interpersonal anger has been found to be related to cardiovascular reactivity. This paper reports two studies; Study 1 used an Anger Recall task, which aroused interpersonal anger, while participants in Study 2 engaged in a multitasking computer task, which aroused non-interpersonal anger via systematic manipulation of the functioning of the computer mouse. The Cynicism by State Anger interaction was significant for blood pressure arousal in Study 2 but not for Study 1: in Study 2, when State Anger was high, cynicism was positively related to blood pressure arousal but when State Anger was low, cynicism was negatively related to blood pressure arousal. For both studies, when State Anger was low, cynicism was positively related to cardiac output arousal and negatively related to vascular arousal. The results suggest that Cynicism-State Anger interaction can be generalised to non-social anger-arousing situations for hemodynamic processes but blood pressure reactivity is task-dependent. The implication for the role of job control and cardiovascular health during human-computer interactions is discussed.

  10. Brain potentials after clicking a mouse: a new psychophysiological approach to human-computer interaction.

    PubMed

    Nittono, Hiroshi; Hamada, Aya; Hori, Tadao

    As a first step in developing a new psychophysiological technique to assess mental workload in human-computer interaction (HCI), we recorded event-related brain potentials for visual stimuli triggered by voluntary mouse clicks. Twelve university students clicked a mouse button at their own pace. Each click triggered 1 of 3 alphabetic letters assigned to frequent standard, rare target, and rare nontarget stimuli. Counting target stimuli was required. Both rare stimuli elicited a P3 (P300) wave, the amplitude of which was larger when the stimuli were triggered by mouse clicks than when the same stimuli were presented automatically without mouse clicks. Postmotor potentials associated with clicking were small in amplitude (<2 microV) and did not temporally overlap with the P3. The findings suggest that the P3 can be recorded for a computer's response to the user's intentional action and may be used as a measure of perceptual-central processing resources allocated to the HCI task. Actual or potential applications of this research include the evaluation of the user's attentional state during HCI by recording brain potentials in the "mouse click" or action-perception paradigm.
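
    A minimal Python sketch of the averaging step behind such an analysis is given below: EEG epochs are cut around each click-triggered stimulus, baseline-corrected, and averaged so that the P3 appears as a larger positive deflection for the rare stimuli. The sampling rate and epoch window are illustrative assumptions, not the study's recording parameters.

    import numpy as np

    def average_erp(eeg, stimulus_onsets, fs=250, tmin=-0.1, tmax=0.6):
        """Average baseline-corrected EEG epochs around the given stimulus onsets (sample indices)."""
        pre, post = int(-tmin * fs), int(tmax * fs)
        epochs = []
        for onset in stimulus_onsets:
            if onset - pre < 0 or onset + post > len(eeg):
                continue                               # skip epochs running off the record
            epoch = np.asarray(eeg[onset - pre:onset + post], dtype=float)
            epoch -= epoch[:pre].mean()                # pre-stimulus baseline correction
            epochs.append(epoch)
        return np.mean(epochs, axis=0)

    # erp_rare = average_erp(eeg, rare_onsets); erp_standard = average_erp(eeg, standard_onsets)
    # The P3 shows up as a larger positivity around 300-500 ms in the rare-stimulus average.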

  11. Adaptation of hybrid human-computer interaction systems using EEG error-related potentials.

    PubMed

    Chavarriaga, Ricardo; Biasiucci, Andrea; Forster, Killian; Roggen, Daniel; Troster, Gerhard; Millan, Jose Del R

    2010-01-01

    Performance improvement in both humans and artificial systems strongly relies on the ability to recognize erroneous behavior or decisions. This paper, which builds upon previous studies on EEG error-related signals, presents a hybrid approach for human-computer interaction that uses human gestures to send commands to a computer and exploits brain activity to provide implicit feedback about the recognition of such commands. Using a simple computer game as a case study, we show that EEG activity evoked by erroneous gesture recognition can be classified in single trials above random levels. Automatic artifact rejection techniques are used, taking into account that subjects are allowed to move during the experiment. Moreover, we present a simple adaptation mechanism that uses the EEG signal to label newly acquired samples and can be used to re-calibrate the gesture recognition system in a supervised manner. Offline analyses show that, although the achieved EEG decoding accuracy is far from perfect, these signals convey sufficient information to significantly improve the overall system performance.
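
    The adaptation idea can be sketched as follows: an EEG-based error detector decides whether the brain response flagged the recognized command as erroneous, and only the accepted samples are added to the training set before the gesture classifier is re-fit. The classifier choice and feature handling below are placeholders, assuming scikit-learn, and are not the paper's models.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    class AdaptiveGestureRecognizer:
        """Gesture classifier that is re-calibrated with EEG-labeled samples."""

        def __init__(self):
            self.clf = LogisticRegression(max_iter=1000)
            self.X, self.y = [], []

        def bootstrap(self, X_init, y_init):
            self.X, self.y = list(X_init), list(y_init)
            self.clf.fit(np.array(self.X), np.array(self.y))

        def predict(self, gesture_features):
            return self.clf.predict(np.array([gesture_features]))[0]

        def update(self, gesture_features, predicted_label, eeg_flags_error):
            # Keep the sample only if the error-related potential did not flag
            # the recognized command as erroneous, then re-fit the classifier.
            if not eeg_flags_error:
                self.X.append(gesture_features)
                self.y.append(predicted_label)
                self.clf.fit(np.array(self.X), np.array(self.y))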

  12. Categorisation of visualisation methods to support the design of Human-Computer Interaction Systems.

    PubMed

    Li, Katie; Tiwari, Ashutosh; Alcock, Jeffrey; Bermell-Garcia, Pablo

    2016-07-01

    During the design of Human-Computer Interaction (HCI) systems, the creation of visual artefacts forms an important part of design. On one hand, producing a visual artefact has a number of advantages: it helps designers to externalise their thought and acts as a common language between different stakeholders. On the other hand, if an inappropriate visualisation method is employed, it could hinder the design process. To support the design of HCI systems, this paper reviews the categorisation of visualisation methods used in HCI. A keyword search is conducted to identify (a) current HCI design methods and (b) approaches for selecting these methods. The resulting design methods are filtered to create a list of just visualisation methods. These are then categorised using the approaches identified in (b). As a result, 23 HCI visualisation methods are identified and categorised in 5 selection approaches (The Recipient, Primary Purpose, Visual Archetype, Interaction Type, and The Design Process). Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  13. Use of Human Computer Models to Influence the Design of International Space Station Propulsion Module

    NASA Technical Reports Server (NTRS)

    Hamilton, George S.; Hall, Meridith L.

    1999-01-01

    The overall design for the International Space Station (ISS) Propulsion (Prop) Module consists of two bell shapes connected by a long tube having a shirt-sleeve environment. The tube is to be used by the flight crew to transfer equipment and supplies from the Shuttle to ISS. Due to a desire to use existing space-qualified hardware, the tube internal diameter was initially set at 38 inches, while the human engineering specification, NASA-STD-3000, required 50 inches. Human computer modeling using the MannequinPro application was used to help make the case to enlarge the passageway to meet the specification. 3D CAD models of the Prop Module were created with 38-inch, 45-inch and 50-inch passageways, and human figures in the neutral body posture as well as a fetal posture were inserted into the model and systematically exercised. Results showed that only the 50-inch tube would accommodate a mid-tube turnaround by a large crew member (a 95th-percentile American male by stature).

  14. An Human-Computer Interactive Augmented Reality System for Coronary Artery Diagnosis Planning and Training.

    PubMed

    Li, Qiming; Huang, Chen; Lv, Shengqing; Li, Zeyu; Chen, Yimin; Ma, Lizhuang

    2017-09-02

    To let physicians carry out coronary artery diagnosis and preoperative planning in a more intuitive and natural way, and to improve the training effect for interns, an augmented reality system for coronary artery diagnosis planning and training (ARS-CADPT) is designed and realized in this paper. First, a 3D reconstruction algorithm based on computed tomographic (CT) images is proposed to model the coronary artery vessels (CAV). Second, algorithms for static gesture recognition and for dynamic gesture spotting and recognition are presented to realize the real-time and friendly human-computer interaction (HCI) that characterizes ARS-CADPT. Third, a Sort-First parallel rendering and splicing display subsystem is developed, which greatly expands the capacity of student users. The experimental results show that, with the use of ARS-CADPT, the reconstruction accuracy of the CAV model is high, the HCI is natural and fluent, and the visual effect is good. In short, the system fully meets the application requirements.
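
    The abstract does not specify the CT reconstruction algorithm, so the sketch below only illustrates a generic way to obtain a vessel surface from a CT volume: marching cubes on an intensity threshold (contrast-filled vessels are bright). It assumes scikit-image, and the threshold and voxel spacing are illustrative.

    import numpy as np
    from skimage.measure import marching_cubes

    def vessel_surface(ct_volume, hu_threshold=300.0, voxel_spacing=(1.0, 1.0, 1.0)):
        """Extract a triangle mesh (vertices, faces) of voxels above the intensity threshold."""
        verts, faces, _normals, _values = marching_cubes(
            ct_volume.astype(np.float32), level=hu_threshold, spacing=voxel_spacing)
        return verts, faces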

  15. A User-Developed 3-D Hand Gesture Set for Human-Computer Interaction.

    PubMed

    Pereira, Anna; Wachs, Juan P; Park, Kunwoo; Rempel, David

    2015-06-01

    The purpose of this study was to develop a lexicon for 3-D hand gestures for common human-computer interaction (HCI) tasks by considering usability and effort ratings. Recent technologies create an opportunity for developing a free-form 3-D hand gesture lexicon for HCI. Subjects (N = 30) with prior experience using 2-D gestures on touch screens performed 3-D gestures of their choice for 34 common HCI tasks and rated their gestures on preference, match, ease, and effort. Videos of the 1,300 generated gestures were analyzed for gesture popularity, order, and response times. Gesture hand postures were rated by the authors on biomechanical risk and fatigue. A final task gesture set is proposed based primarily on subjective ratings and hand posture risk. The different dimensions used for evaluating task gestures were not highly correlated and, therefore, measured different properties of the task-gesture match. A method is proposed for generating a user-developed 3-D gesture lexicon for common HCIs that involves subjective ratings and a posture risk rating for minimizing arm and hand fatigue. © 2014, Human Factors and Ergonomics Society.

  17. A wearable, wireless gaze tracker with integrated selection command source for human-computer interaction.

    PubMed

    Rantanen, Ville; Vanhala, Toni; Tuisku, Outi; Niemenlehto, Pekka-Henrik; Verho, Jarmo; Surakka, Veikko; Juhola, Martti; Lekkala, Jukka

    2011-09-01

    A light-weight, wearable, wireless gaze tracker with an integrated selection command source for human-computer interaction is introduced. The prototype system combines head-mounted, video-based gaze tracking with capacitive facial movement detection, enabling multimodal interaction by gaze pointing and making selections with facial gestures. The system is targeted mainly at disabled people with limited hand mobility. The hardware was made wireless to remove the need to take off the device when moving away from the computer, and to allow future use in more mobile contexts. The algorithms responsible for determining the eye and head orientations that map gaze direction to on-screen coordinates are presented, together with the algorithm that detects movements from the measured capacitance signal. Point-and-click experiments were conducted to assess the performance of the multimodal system. The results show decent performance in laboratory and office conditions. The overall point-and-click accuracy in the multimodal experiments is comparable to the errors in previous research on head-mounted, single-modality gaze tracking that does not compensate for changes in head orientation.
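
    A minimal Python sketch of the multimodal point-and-click idea follows: gaze supplies the cursor position, and a "click" is declared when the capacitance signal departs from a slowly adapting baseline. The threshold, adaptation rate, and the I/O callbacks (read_gaze_xy, read_capacitance, move_cursor, click) are hypothetical placeholders, not the authors' implementation.

    class CapacitiveClickDetector:
        """Flag a selection when the capacitance departs from its running baseline."""

        def __init__(self, threshold=5.0, alpha=0.01):
            self.baseline = None
            self.threshold = threshold
            self.alpha = alpha                      # baseline adaptation rate

        def step(self, sample):
            if self.baseline is None:
                self.baseline = sample
                return False
            is_click = abs(sample - self.baseline) > self.threshold
            if not is_click:                        # adapt the baseline only at rest
                self.baseline += self.alpha * (sample - self.baseline)
            return is_click

    def interaction_loop(read_gaze_xy, read_capacitance, move_cursor, click):
        detector = CapacitiveClickDetector()
        while True:
            x, y = read_gaze_xy()                   # screen coordinates from the gaze tracker
            move_cursor(x, y)
            if detector.step(read_capacitance()):
                click()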

  18. Eye center localization and gaze gesture recognition for human-computer interaction.

    PubMed

    Zhang, Wenhao; Smith, Melvyn L; Smith, Lyndon N; Farooq, Abdul

    2016-03-01

    This paper introduces an unsupervised modular approach for accurate and real-time eye center localization in images and videos, allowing a coarse-to-fine, global-to-regional scheme. The trajectories of eye centers in consecutive frames, i.e., gaze gestures, are further analyzed, recognized, and employed to boost the human-computer interaction (HCI) experience. This modular approach makes use of isophote and gradient features to estimate the eye center locations. A selective oriented gradient filter has been specifically designed to remove strong gradients from eyebrows, eye corners, and shadows, which sabotage most eye center localization methods. A real-world implementation utilizing these algorithms has been designed in the form of an interactive advertising billboard to demonstrate the effectiveness of our method for HCI. The eye center localization algorithm has been compared with 10 other algorithms on the BioID database and six other algorithms on the GI4E database, and it outperforms all of them in terms of localization accuracy. Further tests on the Extended Yale Face Database B and self-collected data have proved this algorithm to be robust against moderate head poses and poor illumination conditions. The interactive advertising billboard has manifested outstanding usability and effectiveness in our tests and shows great potential for benefiting a wide range of real-world HCI applications.
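
    The sketch below illustrates the general family of gradient-based eye-center estimators the abstract refers to: the estimated center is the point whose displacement vectors to the surrounding pixels best align with the image gradients. It is a brute-force, unoptimized illustration and does not reproduce the paper's isophote features or selective oriented gradient filter.

    import numpy as np

    def eye_center(gray_eye_patch):
        """Brute-force gradient-alignment estimate of the eye center (x, y) in a grayscale patch."""
        g = np.asarray(gray_eye_patch, dtype=float)
        gy, gx = np.gradient(g)
        mag = np.hypot(gx, gy) + 1e-9
        gx, gy = gx / mag, gy / mag                     # unit gradient vectors
        h, w = g.shape
        ys, xs = np.mgrid[0:h, 0:w]
        best_score, best_center = -np.inf, (0, 0)
        for cy in range(h):                             # real implementations vectorize this search
            for cx in range(w):
                dx, dy = xs - cx, ys - cy
                norm = np.hypot(dx, dy) + 1e-9
                alignment = (dx / norm) * gx + (dy / norm) * gy
                score = np.mean(np.maximum(alignment, 0.0) ** 2)
                if score > best_score:
                    best_score, best_center = score, (cx, cy)
        return best_center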

  19. A Language/Action Model of Human-Computer Communication in a Psychiatric Hospital

    PubMed Central

    Morelli, R. A.; Goethe, J. W.; Bronzino, J. D.

    1990-01-01

    When a staff physician says to an intern he is supervising “I think you should try medication X,” this statement may differ in meaning from the same string of words spoken between colleagues. In the first case, the statement may have the force of an order (“Do this!”), while in the latter it is merely a suggestion. In either case, the utterance sets up important expectations which constrain the future actions of the parties involved. This paper lays out an analytic framework, based on speech act theory, for representing such “conversations for action” so that they may be used to inform the design of human-computer interaction. The language/action design perspective views the information system -- in this case an expert system that monitors drug treatment -- as one of many “agents” within a broad communicative network. Speech act theory is used to model a typical psychiatric hospital unit as a system of communicative action. In addition to identifying and characterizing the primary communicative agents and speech acts, the model presents a taxonomy of key conversational patterns and shows how they may be applied to the design of a clinical monitoring system. In the final section, the advantages and implications of this design approach are discussed.

  20. Redesign of a computerized clinical reminder for colorectal cancer screening: a human-computer interaction evaluation

    PubMed Central

    2011-01-01

    Background: Based on barriers to the use of computerized clinical decision support (CDS) learned in an earlier field study, we prototyped design enhancements to the Veterans Health Administration's (VHA's) colorectal cancer (CRC) screening clinical reminder to compare against the VHA's current CRC reminder. Methods: In a controlled simulation experiment, 12 primary care providers (PCPs) used prototypes of the current and redesigned CRC screening reminder in a within-subject comparison. Quantitative measurements were based on a usability survey, workload assessment instrument, and workflow integration survey. We also collected qualitative data on both designs. Results: Design enhancements to the VHA's existing CRC screening clinical reminder positively impacted aspects of usability and workflow integration but not workload. The qualitative analysis revealed broad support across participants for the design enhancements with specific suggestions for improving the reminder further. Conclusions: This study demonstrates the value of a human-computer interaction evaluation in informing the redesign of information tools to foster uptake, integration into workflow, and use in clinical practice. PMID:22126324

  1. Cognition friendly interaction: A concept of partnership in human computer interaction.

    PubMed

    Das, Balaram

    2001-09-01

    This paper identifies yet another field of research, the discipline of human computer interaction, where the concept of self-similar fluctuations can play a vital role. A concept of interaction between computation and cognition is developed that is friendly toward the cognitive process. It is argued that friendly interactions must have a memory and be antipersistent. To cast this in a mathematical form, fluctuations in the interactions recorded over a period of time are studied, and it is shown that these fluctuations must necessarily be self-similar, with the value of the self-similarity parameter confined to the interval (0, 1/2), for the interaction to be friendly. A statistical measure of complexity of the interaction process is also formulated as a function of the self-similarity parameter. Finally, the question is raised of how to build friendly software, and a possible evolutionary process through which friendly software may emerge is indicated. (c) 2001 American Institute of Physics.
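
    One way to test the friendliness criterion on a recorded series of interaction fluctuations is to estimate the self-similarity (Hurst) parameter H and check that it falls in (0, 1/2), i.e., that the series is antipersistent. The sketch below uses the aggregated-variance estimator as an illustrative choice; it is an assumption about how to operationalize the criterion, not the paper's procedure.

    import numpy as np

    def hurst_aggregated_variance(x, block_sizes=(2, 4, 8, 16, 32, 64)):
        """Estimate the self-similarity parameter H via the aggregated-variance method."""
        x = np.asarray(x, dtype=float)
        log_m, log_var = [], []
        for m in block_sizes:
            n_blocks = len(x) // m
            if n_blocks < 2:
                continue
            block_means = x[:n_blocks * m].reshape(n_blocks, m).mean(axis=1)
            log_m.append(np.log(m))
            log_var.append(np.log(block_means.var()))
        slope, _intercept = np.polyfit(log_m, log_var, 1)   # var(block mean) ~ m**(2H - 2)
        return 1.0 + slope / 2.0

    # fluctuations = np.diff(response_times)   # hypothetical record of interaction fluctuations
    # H = hurst_aggregated_variance(fluctuations)
    # friendly = 0.0 < H < 0.5                 # antipersistent, with memory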

  2. Design of a compact low-power human-computer interaction equipment for hand motion

    NASA Astrophysics Data System (ADS)

    Wu, Xianwei; Jin, Wenguang

    2017-01-01

    Human-Computer Interaction (HCI) raises demands of convenience, endurance, responsiveness and naturalness. This paper describes the design of a compact, wearable, low-power HCI device applied to gesture recognition. The system combines multimodal sensing signals, a vision signal and a motion signal, and the equipment carries a depth camera and a motion sensor. After tight integration, the device is compact and portable (40 mm × 30 mm). The system is built on a modular layered framework, which supports real-time collection (60 fps), processing and transmission via synchronous fusion with asynchronous concurrent collection and wireless Bluetooth 4.0 transmission. To minimize the equipment's energy consumption, the system uses low-power components, manages peripheral state dynamically, switches into idle mode intelligently, applies pulse-width modulation (PWM) to the NIR LEDs of the depth camera, and optimizes the algorithm with the motion sensor. To test the equipment's function and performance, a gesture recognition algorithm was applied to the system. As the results show, overall energy consumption can be as low as 0.5 W.

  3. Human-Centered Software Engineering: Software Engineering Architectures, Patterns, and Models for Human Computer Interaction

    NASA Astrophysics Data System (ADS)

    Seffah, Ahmed; Vanderdonckt, Jean; Desmarais, Michel C.

    The Computer-Human Interaction and Software Engineering (CHISE) series of edited volumes originated from a number of workshops and discussions over the latest research and developments in the field of Human Computer Interaction (HCI) and Software Engineering (SE) integration, convergence and cross-pollination. A first volume in this series (CHISE Volume I - Human-Centered Software Engineering: Integrating Usability in the Development Lifecycle) aims at bridging the gap between the field of SE and HCI, and addresses specifically the concerns of integrating usability and user-centered systems design methods and tools into the software development lifecycle and practices. This has been done by defining techniques, tools and practices that can fit into the entire software engineering lifecycle as well as by defining ways of addressing the knowledge and skills needed, and the attitudes and basic values that a user-centered development methodology requires. The first volume has been edited as Vol. 8 in the Springer HCI Series (Seffah, Gulliksen and Desmarais, 2005).

  4. Interface resistance

    NASA Astrophysics Data System (ADS)

    Sinkkonen, Juha

    1983-11-01

    Interface resistance is studied by using the Landauer formula, which relates the resistance to the quantum mechanical transmission coefficient. A simple rederivation of the Landauer formula is given. Using a step-like potential barrier as a model for the metal-semiconductor contact, an analytical expression for the effective Richardson constant is derived. As another application, the grain boundary resistance in polycrystalline semiconductors is studied. The short-range potential fluctuation associated with the grain boundary is described by a rectangular potential barrier. The results for the grain-boundary-limited mobility cover both the strong and weak scattering regimes.
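
    For reference, the standard single-channel form of the Landauer relation is reproduced below (a textbook statement, not the article's rederivation or its step-barrier results): conductance is fixed by the transmission coefficient T, so the interface resistance grows as T decreases.

    % Single-channel, spin-degenerate Landauer relation
    G = \frac{2e^{2}}{h}\,T,
    \qquad
    R = \frac{h}{2e^{2}}\,\frac{1}{T}
    \quad\text{or, with the contact resistance excluded,}\quad
    R = \frac{h}{2e^{2}}\,\frac{1-T}{T}.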

  5. What is the Value of Embedding Artificial Emotional Prosody in Human-Computer Interactions? Implications for Theory and Design in Psychological Science.

    PubMed

    Mitchell, Rachel L C; Xu, Yi

    2015-01-01

    In computerized technology, artificial speech is becoming increasingly important, and is already used in ATMs, online gaming and healthcare contexts. However, today's artificial speech typically sounds monotonous, a main reason for this being the lack of meaningful prosody. One particularly important function of prosody is to convey different emotions. This is because successful encoding and decoding of emotions is vital for effective social cognition, which is increasingly recognized in human-computer interaction contexts. Current attempts to artificially synthesize emotional prosody are much improved relative to early attempts, but there remains much work to be done due to methodological problems, lack of agreed acoustic correlates, and lack of theoretical grounding. If the addition of synthetic emotional prosody is not of sufficient quality, it may risk alienating users instead of enhancing their experience. So the value of embedding emotion cues in artificial speech may ultimately depend on the quality of the synthetic emotional prosody. However, early evidence on reactions to synthesized non-verbal cues in the facial modality bodes well. Attempts to implement the recognition of emotional prosody into artificial applications and interfaces have perhaps been met with greater success, but the ultimate test of synthetic emotional prosody will be to critically compare how people react to synthetic emotional prosody vs. natural emotional prosody, at the behavioral, socio-cognitive and neural levels.

  6. Human Computer Collaboration at the Edge: Enhancing Collective Situation Understanding with Controlled Natural Language

    DTIC Science & Technology

    2016-09-06

    of humans and machines, we propose a conversational interface using Controlled Natural Language (CNL), which is both human-readable and machine-processable, for shared information representation. We hypothesize that this approach facilitates rapid CSU when assembled dynamically with machine ... experiment wherein small groups of users attempted to build CSU via social sensing, interacting with the machine via Natural Language (NL) and CNL. To

  7. Undergraduate Use of CD-ROM Databases: Observations of Human-Computer Interaction and Relevance Judgments.

    ERIC Educational Resources Information Center

    Shaw, Debora

    1996-01-01

    Describes a study that observed undergraduates as they searched bibliographic databases on a CD-ROM local area network. Topics include related research, information needs, evolution of search topics, database selection, search strategies, relevance judgments, CD-ROM interfaces, and library instruction. (Author/LRW)

  8. Towards human-computer synergetic analysis of large-scale biological data.

    PubMed

    Singh, Rahul; Yang, Hui; Dalziel, Ben; Asarnow, Daniel; Murad, William; Foote, David; Gormley, Matthew; Stillman, Jonathan; Fisher, Susan

    2013-01-01

    Advances in technology have led to the generation of massive amounts of complex and multifarious biological data in areas ranging from genomics to structural biology. The volume and complexity of such data leads to significant challenges in terms of its analysis, especially when one seeks to generate hypotheses or explore the underlying biological processes. At the state-of-the-art, the application of automated algorithms followed by perusal and analysis of the results by an expert continues to be the predominant paradigm for analyzing biological data. This paradigm works well in many problem domains. However, it also is limiting, since domain experts are forced to apply their instincts and expertise such as contextual reasoning, hypothesis formulation, and exploratory analysis after the algorithm has produced its results. In many areas where the organization and interaction of the biological processes is poorly understood and exploratory analysis is crucial, what is needed is to integrate domain expertise during the data analysis process and use it to drive the analysis itself. In context of the aforementioned background, the results presented in this paper describe advancements along two methodological directions. First, given the context of biological data, we utilize and extend a design approach called experiential computing from multimedia information system design. This paradigm combines information visualization and human-computer interaction with algorithms for exploratory analysis of large-scale and complex data. In the proposed approach, emphasis is laid on: (1) allowing users to directly visualize, interact, experience, and explore the data through interoperable visualization-based and algorithmic components, (2) supporting unified query and presentation spaces to facilitate experimentation and exploration, (3) providing external contextual information by assimilating relevant supplementary data, and (4) encouraging user-directed information

  9. Towards human-computer synergetic analysis of large-scale biological data

    PubMed Central

    2013-01-01

    Background: Advances in technology have led to the generation of massive amounts of complex and multifarious biological data in areas ranging from genomics to structural biology. The volume and complexity of such data leads to significant challenges in terms of its analysis, especially when one seeks to generate hypotheses or explore the underlying biological processes. At the state-of-the-art, the application of automated algorithms followed by perusal and analysis of the results by an expert continues to be the predominant paradigm for analyzing biological data. This paradigm works well in many problem domains. However, it also is limiting, since domain experts are forced to apply their instincts and expertise such as contextual reasoning, hypothesis formulation, and exploratory analysis after the algorithm has produced its results. In many areas where the organization and interaction of the biological processes is poorly understood and exploratory analysis is crucial, what is needed is to integrate domain expertise during the data analysis process and use it to drive the analysis itself. Results: In context of the aforementioned background, the results presented in this paper describe advancements along two methodological directions. First, given the context of biological data, we utilize and extend a design approach called experiential computing from multimedia information system design. This paradigm combines information visualization and human-computer interaction with algorithms for exploratory analysis of large-scale and complex data. In the proposed approach, emphasis is laid on: (1) allowing users to directly visualize, interact, experience, and explore the data through interoperable visualization-based and algorithmic components, (2) supporting unified query and presentation spaces to facilitate experimentation and exploration, (3) providing external contextual information by assimilating relevant supplementary data, and (4) encouraging user

  10. Delays in Human-Computer Interaction and Their Effects on Brain Activity.

    PubMed

    Kohrs, Christin; Angenstein, Nicole; Brechmann, André

    2016-01-01

    The temporal contingency of feedback is an essential requirement of successful human-computer interactions. The timing of feedback not only affects the behavior of a user but is also accompanied by changes in psychophysiology and neural activity. In three fMRI experiments we systematically studied the impact of delayed feedback on brain activity while subjects performed an auditory categorization task. In the first fMRI experiment, we analyzed the effects of rare and thus unexpected delays of different durations on brain activity. In the second experiment, we investigated whether users can adapt to frequent delays; therefore, delays were presented as often as immediate feedback. In a third experiment, the influence of interaction outage was analyzed by measuring the effect of infrequent omissions of feedback on brain activity. The results show that unexpected delays in feedback presentation, compared to immediate feedback, more strongly activate, among other regions, the bilateral anterior insular cortex, the posterior medial frontal cortex, the left inferior parietal lobule and the right inferior frontal junction. The strength of this activation increases with the duration of the delay. Thus, delays interrupt the course of an interaction and trigger an orienting response that in turn activates brain regions of action control. If delays occur frequently, users can adapt, delays become expectable, and the brain activity in the observed network diminishes over the course of the interaction. However, introducing rare omissions of expected feedback reduces the system's trustworthiness, which leads to an increase in brain activity not only in response to such omissions but also following frequently occurring and thus expected delays.

  11. Detection of affective states from text and speech for real-time human--computer interaction.

    PubMed

    Calix, Ricardo A; Javadpour, Leili; Knapp, Gerald M

    2012-08-01

    The goal of this work is to develop and test an automated system methodology that can detect emotion from text and speech features. Affective human-computer interaction will be critical for the success of new systems that will be prevalent in the 21st century. Such systems will need to properly deduce human emotional state before they can determine how to best interact with people. Corpora and machine learning classification models are used to train and test a methodology for emotion detection. The methodology uses a stepwise approach to detect sentiment in sentences by first filtering out neutral sentences, then distinguishing among positive, negative, and five emotion classes. Results of the classification between emotion and neutral sentences achieved recall accuracies as high as 77% in the University of Illinois at Urbana-Champaign (UIUC) corpus and 61% in the Louisiana State University medical drama (LSU-MD) corpus for emotion samples. Once neutral sentences were filtered out, the methodology achieved accuracy scores for detecting negative sentences as high as 92.3%. Results of the feature analysis indicate that speech spectral features are better than speech prosodic features for emotion detection. Accumulated sentiment composition text features appear to be very important as well. This work contributes to the study of human communication by providing a better understanding of how language factors help to best convey human emotion and how to best automate this process. Results of this study can be used to develop better automated assistive systems that interpret human language and respond to emotions through 3-D computer graphics.
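
    A minimal sketch of the stepwise idea is shown below: a first classifier filters out neutral sentences, and a second classifier assigns the remaining sentences to emotion classes. TF-IDF features and linear SVMs are placeholder choices assuming scikit-learn; they are not the paper's text and speech feature sets.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC

    def build_stepwise_classifier(sentences, is_neutral, emotion_labels):
        """is_neutral: 1 for neutral sentences, 0 otherwise; emotion_labels: class
        names aligned with `sentences` (ignored for neutral entries)."""
        neutral_filter = make_pipeline(TfidfVectorizer(), LinearSVC())
        neutral_filter.fit(sentences, is_neutral)

        emotional = [s for s, n in zip(sentences, is_neutral) if n == 0]
        emotional_y = [e for e, n in zip(emotion_labels, is_neutral) if n == 0]
        emotion_clf = make_pipeline(TfidfVectorizer(), LinearSVC())
        emotion_clf.fit(emotional, emotional_y)

        def predict(sentence):
            if neutral_filter.predict([sentence])[0] == 1:
                return "neutral"
            return emotion_clf.predict([sentence])[0]

        return predict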

  12. Delays in Human-Computer Interaction and Their Effects on Brain Activity

    PubMed Central

    Kohrs, Christin; Angenstein, Nicole; Brechmann, André

    2016-01-01

    The temporal contingency of feedback is an essential requirement of successful human-computer interactions. The timing of feedback not only affects the behavior of a user but is also accompanied by changes in psychophysiology and neural activity. In three fMRI experiments we systematically studied the impact of delayed feedback on brain activity while subjects performed an auditory categorization task. In the first fMRI experiment, we analyzed the effects of rare and thus unexpected delays of different durations on brain activity. In the second experiment, we investigated whether users can adapt to frequent delays; therefore, delays were presented as often as immediate feedback. In a third experiment, the influence of interaction outage was analyzed by measuring the effect of infrequent omissions of feedback on brain activity. The results show that unexpected delays in feedback presentation, compared to immediate feedback, more strongly activate, among other regions, the bilateral anterior insular cortex, the posterior medial frontal cortex, the left inferior parietal lobule and the right inferior frontal junction. The strength of this activation increases with the duration of the delay. Thus, delays interrupt the course of an interaction and trigger an orienting response that in turn activates brain regions of action control. If delays occur frequently, users can adapt, delays become expectable, and the brain activity in the observed network diminishes over the course of the interaction. However, introducing rare omissions of expected feedback reduces the system's trustworthiness, which leads to an increase in brain activity not only in response to such omissions but also following frequently occurring and thus expected delays. PMID:26745874

  13. Human Computation in Visualization: Using Purpose Driven Games for Robust Evaluation of Visualization Algorithms.

    PubMed

    Ahmed, N; Zheng, Ziyi; Mueller, K

    2012-12-01

    Due to the inherent characteristics of the visualization process, most of the problems in this field have strong ties with human cognition and perception. This makes the human brain and sensory system the only truly appropriate evaluation platform for evaluating and fine-tuning a new visualization method or paradigm. However, getting humans to volunteer for these purposes has always been a significant obstacle, and thus this phase of the development process has traditionally formed a bottleneck, slowing down progress in visualization research. We propose to take advantage of the newly emerging field of Human Computation (HC) to overcome these challenges. HC promotes the idea that rather than considering humans as users of the computational system, they can be made part of a hybrid computational loop consisting of traditional computation resources and the human brain and sensory system. This approach is particularly successful in cases where part of the computational problem is considered intractable using known computer algorithms but is trivial to common sense human knowledge. In this paper, we focus on HC from the perspective of solving visualization problems and also outline a framework by which humans can be easily seduced to volunteer their HC resources. We introduce a purpose-driven game titled "Disguise" which serves as a prototypical example for how the evaluation of visualization algorithms can be mapped into a fun and addicting activity, allowing this task to be accomplished in an extensive yet cost effective way. Finally, we sketch out a framework that transcends from the pure evaluation of existing visualization methods to the design of a new one.

  14. Interface standardization

    NASA Technical Reports Server (NTRS)

    Spencer, R.; Wong, V.

    1983-01-01

    Central-station applications create a large and attractive market for photovoltaics in the near future. However, some significant barriers lie between the industry of today and realization of that market. Manufacturing capacity and price are two principal impediments. The utilities, which are the future system owners, are gaining experience with central-station PV power through the Sacramento Municipal Utility District, Hesperia and similar small central-station installations. SMUD has recognized that competition must be maintained to help reduce prices. So little standardization exists that the cost is driven upward by the need to redefine mechanical and electrical interfaces for each vendor. New structures are required for each vendor, and nonoptimum field geometries result from attempts to include more than one vendor in an array field. Standards at some hardware level are required.

  16. Relating Interface Evolution to Interface Mechanics Based on Interface Properties

    NASA Astrophysics Data System (ADS)

    Verma, Devendra; Biswas, Sudipta; Prakash, Chandra; Tomar, Vikas

    2017-01-01

    The current article focuses on recent work done in understanding the role of processing techniques on interface evolution and connecting interface evolution to interface thickness-dependent properties. Special emphasis is placed on interface evolution during the sintering process of tungsten (W). Sintering with additives such as nickel significantly changes grain boundary properties in W, leading to issues such as grain boundary embrittlement. When one has to mechanically describe properties of polycrystalline W with an account of the influence of grain boundary embrittlement, one must explicitly consider grain boundary properties. This issue is the focus of the present work on the mechanical properties of interfaces. Overall, a phase field modeling-based approach is shown to be an excellent computational tool for predicting the interface evolution. The influences of the interface thickness, chemistry, and orientation of phases around interfaces are analyzed using extended finite element simulations for polycrystalline W.
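
    As an illustration of the class of model referred to, a generic Allen-Cahn-type phase-field evolution law is written below; the specific formulation used for W sintering is not given in this abstract, so this is only a representative example.

    % Generic Allen-Cahn-type phase-field evolution of an order parameter phi,
    % with mobility M, gradient-energy coefficient epsilon, and double-well potential f.
    \frac{\partial \phi}{\partial t}
      = -M \, \frac{\delta F}{\delta \phi}
      = M \left( \varepsilon^{2} \nabla^{2} \phi - f'(\phi) \right),
    \qquad
    F[\phi] = \int_{\Omega} \left( \frac{\varepsilon^{2}}{2} \, |\nabla \phi|^{2} + f(\phi) \right) \mathrm{d}V .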

  17. Ontology for assessment studies of human-computer-interaction in surgery.

    PubMed

    Machno, Andrej; Jannin, Pierre; Dameron, Olivier; Korb, Werner; Scheuermann, Gerik; Meixensberger, Jürgen

    2015-02-01

    New technologies improve modern medicine, but may result in unwanted consequences. Some occur due to inadequate human-computer-interactions (HCI). To assess these consequences, an investigation model was developed to facilitate the planning, implementation and documentation of studies for HCI in surgery. The investigation model was formalized in Unified Modeling Language and implemented as an ontology. Four different top-level ontologies were compared: Object-Centered High-level Reference, Basic Formal Ontology, General Formal Ontology (GFO) and Descriptive Ontology for Linguistic and Cognitive Engineering, according to the three major requirements of the investigation model: the domain-specific view, the experimental scenario and the representation of fundamental relations. Furthermore, this article emphasizes the distinction of "information model" and "model of meaning" and shows the advantages of implementing the model in an ontology rather than in a database. The results of the comparison show that GFO fits the defined requirements adequately: the domain-specific view and the fundamental relations can be implemented directly, only the representation of the experimental scenario requires minor extensions. The other candidates require wide-ranging extensions, concerning at least one of the major implementation requirements. Therefore, the GFO was selected to realize an appropriate implementation of the developed investigation model. The ensuing development considered the concrete implementation of further model aspects and entities: sub-domains, space and time, processes, properties, relations and functions. The investigation model and its ontological implementation provide a modular guideline for study planning, implementation and documentation within the area of HCI research in surgery. This guideline helps to navigate through the whole study process in the form of a kind of standard or good clinical practice, based on the involved foundational frameworks

  18. Composite pattern structured light projection for human computer interaction in space

    NASA Astrophysics Data System (ADS)

    Guan, Chun; Hassebrook, Laurence G.; Lau, Daniel L.; Yalla, Veera Ganesh

    2005-05-01

    Interacting with computer technology while wearing a space suit is difficult at best. We present a sensor that can interpret body gestures in 3-Dimensions. Having the depth dimension allows simple thresholding to isolate the hands as well as use their positioning and orientation as input controls to digital devices such as computers and/or robotic devices. Structured light pattern projection is a well known method of accurately extracting 3-Dimensional information of a scene. Traditional structured light methods require several different patterns to recover the depth, without ambiguity and albedo sensitivity, and are corrupted by object motion during the projection/capture process. The authors have developed a methodology for combining multiple patterns into a single composite pattern by using 2-Dimensional spatial modulation techniques. A single composite pattern projection does not require synchronization with the camera so the data acquisition rate is only limited by the video rate. We have incorporated dynamic programming to greatly improve the resolution of the scan. Other applications include machine vision, remote controlled robotic interfacing in space, advanced cockpit controls and computer interfacing for the disabled. We will present performance analysis, experimental results and video examples.
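
    The sketch below illustrates the composite-pattern idea in its simplest form: several phase-shifted fringe patterns are each modulated onto a distinct spatial carrier frequency along the orthogonal axis and summed into one projected image, from which the individual patterns can later be separated by band-pass filtering. The resolution, fringe frequency, and carrier frequencies are illustrative assumptions, not the authors' calibrated values.

    import numpy as np

    def composite_pattern(width=640, height=480, n_patterns=4,
                          fringe_freq=8.0, carrier_freqs=(20.0, 32.0, 44.0, 56.0)):
        """Sum phase-shifted fringe patterns, each riding on its own spatial carrier."""
        x = np.linspace(0.0, 1.0, width)
        y = np.linspace(0.0, 1.0, height)
        X, Y = np.meshgrid(x, y)
        composite = np.zeros((height, width))
        for k in range(n_patterns):
            phase = 2.0 * np.pi * k / n_patterns
            fringe = 0.5 * (1.0 + np.cos(2.0 * np.pi * fringe_freq * Y + phase))
            carrier = np.cos(2.0 * np.pi * carrier_freqs[k] * X)   # distinct carrier per pattern
            composite += fringe * carrier
        composite -= composite.min()                               # normalize to [0, 1] for projection
        return composite / composite.max()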

  19. Eye-movements and Voice as Interface Modalities to Computer Systems

    NASA Astrophysics Data System (ADS)

    Farid, Mohsen M.; Murtagh, Fionn D.

    2003-03-01

    We investigate the visual and vocal modalities of interaction with computer systems. We focus our attention on the integration of visual and vocal interface as possible replacement and/or additional modalities to enhance human-computer interaction. We present a new framework for employing eye gaze as a modality of interface. While voice commands, as means of interaction with computers, have been around for a number of years, integration of both the vocal interface and the visual interface, in terms of detecting user's eye movements through an eye-tracking device, is novel and promises to open the horizons for new applications where a hand-mouse interface provides little or no apparent support to the task to be accomplished. We present an array of applications to illustrate the new framework and eye-voice integration.

  20. FGB: A Graphical and Haptic User Interface for Creating Graphical, Haptic User Interfaces

    SciTech Connect

    ANDERSON,THOMAS G.; BRECKENRIDGE,ARTHURINE; DAVIDSON,GEORGE S.

    1999-10-18

    The emerging field of haptics represents a fundamental change in human-computer interaction (HCI), and presents solutions to problems that are difficult or impossible to solve with a two-dimensional, mouse-based interface. To take advantage of the potential of haptics, however, innovative interaction techniques and programming environments are needed. This paper describes FGB (FLIGHT GHUI Builder), a programming tool that can be used to create an application-specific graphical and haptic user interface (GHUI). FGB is itself a graphical and haptic user interface with which a programmer can intuitively create and manipulate components of a GHUI in real time in a graphical environment through the use of a haptic device. The programmer can create a GHUI without writing any programming code. After a user interface is created, FGB writes the appropriate programming code to a file, using the FLIGHT API, to recreate what the programmer created in the FGB interface. FGB saves programming time and increases productivity, because a programmer can see the end result as it is created, and FGB does much of the programming itself. Interestingly, as FGB was created, it was used to help build itself. The further FGB was in its development, the more easily and quickly it could be used to create additional functionality and improve its own design. As a finished product, FGB can be used to recreate itself in much less time than it originally required, and with much less programming. This paper describes FGB's GHUI components, the techniques used in the interface, how the output code is created, where programming additions and modifications should be placed, and how it can be compared to and integrated with existing APIs such as MFC and Visual C++, OpenGL, and GHOST.

  1. Human-Computer Interfaces for Tactical Decision Making, Analysis, and Assessment Using Artificially Intelligent Platforms. Volume 1. Software Design and Database Descriptions for BATMAN and ROBIN

    DTIC Science & Technology

    1991-08-01

    new user’s name and social security number. value: a string of the form : " color offset". See button-color above. user.bg.color: the color of empty slots...in the "list of users" panel. value: a string of the form : " color offset". See buttoncolor above. user.fg.color: the color of users’ names in the...34list of users" panel. value: a string of the form : " color offset". See buttoncolor above. username border color: the border color around users’ names in

  2. A Real-Time Pinch-to-Zoom Motion Detection by Means of a Surface EMG-Based Human-Computer Interface

    PubMed Central

    Kim, Jongin; Cho, Dongrae; Lee, Kwang Jin; Lee, Boreom

    2015-01-01

    In this paper, we propose a system for inferring the pinch-to-zoom gesture using surface EMG (Electromyography) signals in real time. Pinch-to-zoom, which is a common gesture in smart devices such as an iPhone or an Android phone, is used to control the size of images or web pages according to the distance between the thumb and index finger. To infer the finger motion, we recorded EMG signals obtained from the first dorsal interosseous muscle, which is highly related to the pinch-to-zoom gesture, and used a support vector machine for classification between four finger motion distances. The powers which are estimated by Welch's method were used as feature vectors. In order to solve the multiclass classification problem, we applied a one-versus-one strategy, since a support vector machine is basically a binary classifier. As a result, our system yields 93.38% classification accuracy averaged over six subjects. The classification accuracy was estimated using 10-fold cross validation. Through our system, we expect to not only develop practical prosthetic devices but to also construct a novel user experience (UX) for smart devices. PMID:25551482
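
    A compact sketch of the described pipeline follows: Welch power spectra computed from windowed surface-EMG epochs serve as feature vectors, and a one-versus-one SVM over the four finger-distance classes is scored with 10-fold cross-validation. The sampling rate, window length, and kernel settings are assumptions; only the overall structure follows the abstract.

    import numpy as np
    from scipy.signal import welch
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    def welch_features(emg_epochs, fs=1000):
        """emg_epochs: (n_epochs, n_samples) array -> (n_epochs, n_bins) power features."""
        features = []
        for epoch in emg_epochs:
            _freqs, pxx = welch(epoch, fs=fs, nperseg=256)
            features.append(pxx)
        return np.array(features)

    def evaluate(emg_epochs, distance_labels):
        X = welch_features(emg_epochs)
        clf = SVC(kernel="rbf", decision_function_shape="ovo")   # one-versus-one strategy
        return cross_val_score(clf, X, distance_labels, cv=10).mean()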

  3. A real-time pinch-to-zoom motion detection by means of a surface EMG-based human-computer interface.

    PubMed

    Kim, Jongin; Cho, Dongrae; Lee, Kwang Jin; Lee, Boreom

    2014-12-29

    In this paper, we propose a system for inferring the pinch-to-zoom gesture using surface EMG (Electromyography) signals in real time. Pinch-to-zoom, which is a common gesture in smart devices such as an iPhone or an Android phone, is used to control the size of images or web pages according to the distance between the thumb and index finger. To infer the finger motion, we recorded EMG signals obtained from the first dorsal interosseous muscle, which is highly related to the pinch-to-zoom gesture, and used a support vector machine for classification between four finger motion distances. The powers which are estimated by Welch's method were used as feature vectors. In order to solve the multiclass classification problem, we applied a one-versus-one strategy, since a support vector machine is basically a binary classifier. As a result, our system yields 93.38% classification accuracy averaged over six subjects. The classification accuracy was estimated using 10-fold cross validation. Through our system, we expect to not only develop practical prosthetic devices but to also construct a novel user experience (UX) for smart devices.

  4. Mental effort with the use of different dialogue techniques in human-computer interaction.

    PubMed

    Pinkpank, T; Wandke, H

    1995-01-01

    Flexibility is an important design criterion for user interfaces of interactive computer systems. Flexibility should allow an adaptation of the system to inter- and intra-individual differences in users. Flexible design requires knowing how users make their decisions when they select one of the various options for interaction offered by the system. This problem has been studied in a psychophysiological experiment in which subjects were required to create graphics on a screen. Independent variables were: experience of the subjects (varied by training sessions), dialogue technique (menus vs. command language) and tasks. Tasks were constructed and analyzed by Cognitive Complexity Theory in order to have tasks suited for either menu or command techniques. Mental effort was registered by the P300 amplitude of evoked potentials in the preparation stage of interaction and by the 0.10 Hz component of heart rate variability during the execution stage. It was possible to identify decision strategies based on anticipations of cognitive effort by considering the technique preferences shown in the training sessions and the time subjects spent on task analysis.

  5. Virtual tomography: a new approach to efficient human-computer interaction for medical imaging

    NASA Astrophysics Data System (ADS)

    Teistler, Michael; Bott, Oliver J.; Dormeier, Jochen; Pretschner, Dietrich P.

    2003-05-01

    By utilizing virtual reality (VR) technologies, the computer system virtusMED implements the concept of virtual tomography for exploring medical volumetric image data. Photographic data from a virtual patient as well as CT or MRI data from real patients are visualized within a virtual scene. The view of this scene is determined either by a conventional computer mouse, a head-mounted display or a freely movable flat panel. A virtual examination probe is used to generate oblique tomographic images which are computed from the given volume data. In addition, virtual models can be integrated into the scene, such as anatomical models of bones and inner organs. virtusMED has proven to be a valuable tool for learning human anatomy and for understanding the principles of medical imaging such as sonography. Furthermore, its use to improve CT- and MRI-based diagnosis is very promising. Compared to VR systems of the past, the standard PC-based system virtusMED is a cost-efficient and easily maintained solution providing a highly intuitive, time-saving user interface for medical imaging.
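
    A minimal sketch of how an oblique tomographic image can be resampled from volume data, in the spirit of the virtual examination probe described above. This is a generic SciPy reconstruction under assumed conventions (volume indexed z, y, x; probe pose given by a center point and two orthonormal in-plane directions), not the virtusMED implementation.

      import numpy as np
      from scipy.ndimage import map_coordinates

      def oblique_slice(volume, center, u, v, size=256, spacing=1.0):
          """Sample a size x size oblique slice from a 3D volume at the probe pose."""
          center, u, v = (np.asarray(a, dtype=float) for a in (center, u, v))
          r = (np.arange(size) - size / 2.0) * spacing
          gu, gv = np.meshgrid(r, r, indexing="ij")
          pts = (center[:, None, None]
                 + u[:, None, None] * gu
                 + v[:, None, None] * gv)          # (3, size, size) voxel coordinates
          return map_coordinates(volume, pts.reshape(3, -1), order=1).reshape(size, size)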

  6. Hands-free human computer interaction via an electromyogram-based classification algorithm.

    PubMed

    Chin, Craig; Barreto, Armando; Li, Chao; Zhai, Jing

    2005-01-01

    A four-electrode system for hands-free computer cursor control, based on the digital processing of Electromyogram (EMG) signals is proposed. The electrodes are located over the right frontalis, the procerus, the left temporalis and the right temporalis muscles in the head. This system is meant to enable individuals paralyzed from the neck down (e.g., due to Spinal Cord Injury) to interact with computers using point-and-click graphic interfaces. The intention is to translate electromyograms derived from muscle contractions associated with specific facial movements into five cursor actions, namely: Left, Right, Up, Down and Left-click. This translation is accomplished by a digital signal processing classification algorithm that takes advantage of the divergent spectral nature of the EMG signals produced by the frontalis, temporalis, and procerus muscles, respectively. The effectiveness of the algorithm is evaluated by comparing its performance to that of a previously developed three-electrode EMG-based algorithm, using Matlab simulations. The results indicate that the algorithm classifies with great accuracy and provides a marked improvement over the previous three-electrode system.

  7. Quality of human-computer interaction--results of a national usability survey of hospital-IT in Germany.

    PubMed

    Bundschuh, Bettina B; Majeed, Raphael W; Bürkle, Thomas; Kuhn, Klaus; Sax, Ulrich; Seggewies, Christof; Vosseler, Cornelia; Röhrig, Rainer

    2011-11-09

    Due to the increasing functionality of medical information systems, it is hard to imagine day to day work in hospitals without IT support. Therefore, the design of dialogues between humans and information systems is one of the most important issues to be addressed in health care. This survey presents an analysis of the current quality level of human-computer interaction of healthcare-IT in German hospitals, focused on the users' point of view. To evaluate the usability of clinical-IT according to the design principles of EN ISO 9241-10 the IsoMetrics Inventory, an assessment tool, was used. The focus of this paper has been put on suitability for task, training effort and conformity with user expectations, differentiated by information systems. Effectiveness has been evaluated with the focus on interoperability and functionality of different IT systems. 4521 persons from 371 hospitals visited the start page of the study, while 1003 persons from 158 hospitals completed the questionnaire. The results show relevant variations between different information systems. Specialised information systems with defined functionality received better assessments than clinical information systems in general. This could be attributed to the improved customisation of these specialised systems for specific working environments. The results can be used as reference data for evaluation and benchmarking of human computer engineering in clinical health IT context for future studies.

  8. HCI and mobile health interventions: How human-computer interaction can contribute to successful mobile health interventions.

    PubMed

    Poole, Erika S

    2013-12-01

    Advances in mobile computing offer the potential to change when, where, and how health interventions are delivered. Rather than relying on occasional in-clinic interactions, mobile health (mHealth) interventions may overcome constraints due to limited clinician time, poor patient adherence, and inability to provide meaningful interventions at the most appropriate time. Technological capability, however, does not equate with user acceptance and adoption. How then can we ensure that mobile technologies for behavior change meet the needs of their target audience? In this paper, we argue that overcoming acceptance and adoption barriers requires interdisciplinary collaborations, bringing together not only technologists and health researchers but also human-computer interaction (HCI) experts. We discuss the value of human-computer interaction research to the nascent field of mHealth and demonstrate how research from HCI can offer complementary insights on the creation of mobile health interventions. We conclude with a discussion of barriers to interdisciplinary collaborations in mobile health and suggest ways to overcome them.

  9. Quality of human-computer interaction - results of a national usability survey of hospital-IT in Germany

    PubMed Central

    2011-01-01

    Background Due to the increasing functionality of medical information systems, it is hard to imagine day to day work in hospitals without IT support. Therefore, the design of dialogues between humans and information systems is one of the most important issues to be addressed in health care. This survey presents an analysis of the current quality level of human-computer interaction of healthcare-IT in German hospitals, focused on the users' point of view. Methods To evaluate the usability of clinical-IT according to the design principles of EN ISO 9241-10 the IsoMetrics Inventory, an assessment tool, was used. The focus of this paper has been put on suitability for task, training effort and conformity with user expectations, differentiated by information systems. Effectiveness has been evaluated with the focus on interoperability and functionality of different IT systems. Results 4521 persons from 371 hospitals visited the start page of the study, while 1003 persons from 158 hospitals completed the questionnaire. The results show relevant variations between different information systems. Conclusions Specialised information systems with defined functionality received better assessments than clinical information systems in general. This could be attributed to the improved customisation of these specialised systems for specific working environments. The results can be used as reference data for evaluation and benchmarking of human computer engineering in clinical health IT context for future studies. PMID:22070880

  10. [Effects of waiting times within simple problems: an analogy to waiting times in human-computer interaction].

    PubMed

    Kuhmann, W; Schaefer, F; Boucsein, W

    1990-01-01

    The impact of forced intra-task waiting periods was evaluated in a laboratory study designed in analogy to forced waiting periods in human-computer interaction. Each task consisted of two lines of grouped capital letters (so-called Sterzinger lines) that had to be processed individually, separated by the experimental waiting period. The final response to both lines was given after removal of the second line. The experiment followed a within-subjects design with the order of the two factors "duration" (2 s vs. 8 s) and "variability of the waiting periods" (constant vs. variable) controlled. The dependent variables were performance indices, heart rate, electrodermal activity, and subjective measures of mood and physical complaints. The results showed effects of the duration of waiting periods on the performance and physiological variables, and effects of the variability factor on the subjective variables. Shorter waiting periods were associated with higher heart rate and skin resistance responses, while variable waiting periods seemed to confuse the subjects more than constant ones. The results are discussed with respect to the effects of the two experimental factors. With respect to dialogue structures in human-computer interaction, the results lead to the conclusion that response interference could be avoided by giving the user the possibility of entering problem solutions immediately. Partial and preliminary inputs should be kept permanently visible by the software during the whole task cycle.

  11. Media independent interface. Interface control document

    NASA Technical Reports Server (NTRS)

    1987-01-01

    A Media Independent Interface (MII) is specified using current industry standards. The MII is described in hierarchical fashion. At the base are IEEE/International Standards Organization (ISO) documents (standards) which describe the functionality of the software modules or layers and their interconnection. These documents describe primitives which are to traverse the MII. The intent of the MII is to provide a universal interface to one or more Media Access Controls (MACs) for the Logical Link Controller and Station Manager. This interface includes both a standardized electrical and mechanical interface and a standardized functional specification which defines the services expected from the MAC.

  12. DASHER--an efficient writing system for brain-computer interfaces?

    PubMed

    Wills, Sebastian A; MacKay, David J C

    2006-06-01

    DASHER is a human-computer interface for entering text using continuous or discrete gestures. Through its use of an internal language model, DASHER efficiently converts bits received from the user into text, and has been shown to be a competitive alternative to existing text-entry methods in situations where an ordinary keyboard cannot be used. We propose that DASHER would be well-matched to the low bit-rate, noisy output obtained from brain-computer interfaces (BCIs), and discuss the issues surrounding the use of DASHER with BCI systems.
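
    A toy sketch of the principle DASHER exploits: the language model assigns each possible next symbol a sub-interval of [0, 1) proportional to its probability, and the user's continuous gesture selects a point inside the current interval, so probable text costs few bits. The unigram model and the gesture values below are hypothetical stand-ins for a real context-conditioned model and a continuous zooming control.

      from typing import Dict, Iterable

      def zoom_select(probs: Dict[str, float], pointer: float) -> str:
          """Return the symbol whose probability sub-interval of [0, 1) contains pointer."""
          low = 0.0
          for symbol, p in probs.items():
              if pointer < low + p:
                  return symbol
              low += p
          return symbol  # pointer at or beyond 1.0: fall back to the last symbol

      def enter_text(model: Dict[str, float], gestures: Iterable[float]) -> str:
          """Decode one symbol per normalized gesture sample."""
          return "".join(zoom_select(model, g) for g in gestures)

      model = {"e": 0.4, "t": 0.3, "a": 0.2, "_": 0.1}   # tiny hypothetical unigram model
      print(enter_text(model, [0.1, 0.55, 0.95]))        # -> "et_"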

  13. EOG-sEMG Human Interface for Communication.

    PubMed

    Tamura, Hiroki; Yan, Mingmin; Sakurai, Keiko; Tanno, Koichi

    2016-01-01

    The aim of this study is to present electrooculogram (EOG) and surface electromyogram (sEMG) signals that can be used as a human-computer interface. Establishing an efficient alternative channel for communication without overt speech and hand movements is important for increasing the quality of life of patients suffering from amyotrophic lateral sclerosis, muscular dystrophy, or other illnesses. In this paper, we propose an EOG-sEMG human-computer interface system for communication using both cross-channel and parallel-line channels on the face with the same electrodes. This system can record EOG and sEMG signals simultaneously as a "dual modality" for pattern recognition. Although as many as four patterns could be recognized, in view of the patients' condition we chose only two EOG classes (left and right motion) and two sEMG classes (left blink and right blink), which are easy to realize for the simulation and monitoring task. From the simulation results, our system achieved four-pattern classification with an accuracy of 95.1%.

  14. EOG-sEMG Human Interface for Communication

    PubMed Central

    Tamura, Hiroki; Yan, Mingmin; Sakurai, Keiko; Tanno, Koichi

    2016-01-01

    The aim of this study is to present electrooculogram (EOG) and surface electromyogram (sEMG) signals that can be used as a human-computer interface. Establishing an efficient alternative channel for communication without overt speech and hand movements is important for increasing the quality of life of patients suffering from amyotrophic lateral sclerosis, muscular dystrophy, or other illnesses. In this paper, we propose an EOG-sEMG human-computer interface system for communication using both cross-channel and parallel-line channels on the face with the same electrodes. This system can record EOG and sEMG signals simultaneously as a “dual modality” for pattern recognition. Although as many as four patterns could be recognized, in view of the patients' condition we chose only two EOG classes (left and right motion) and two sEMG classes (left blink and right blink), which are easy to realize for the simulation and monitoring task. From the simulation results, our system achieved four-pattern classification with an accuracy of 95.1%. PMID:27418924

  15. An intelligent interface for satellite operations: Your Orbit Determination Assistant (YODA)

    NASA Technical Reports Server (NTRS)

    Schur, Anne

    1988-01-01

    An intelligent interface is often characterized by the ability to adapt evaluation criteria as the environment and user goals change. Some factors that drive these adaptations are redefinition of task goals (and hence user requirements), time criticality, and system status. To implement adaptations affected by these factors, a new set of capabilities must be incorporated into the human-computer interface design. These capabilities include: (1) dynamic update and removal of control states based on user inputs, (2) generation and removal of logical dependencies as change occurs, (3) uniform and smooth interfacing to numerous processes, databases, and expert systems, and (4) unobtrusive on-line assistance to users. These concepts were applied and incorporated into a human-computer interface using artificial intelligence techniques to create a prototype expert system, Your Orbit Determination Assistant (YODA). YODA is a smart interface that supports, in real time, orbit analysts who must determine the location of a satellite during the station acquisition phase of a mission. Also described is the integration of the four knowledge sources required to support the orbit determination assistant: orbital mechanics, spacecraft specifications, characteristics of the mission support software, and orbit analyst experience. This initial effort is continuing with expansion of YODA's capabilities, including evaluation of the results of the orbit determination task.

  16. Media independent interface

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The work done on the Media Independent Interface (MII) Interface Control Document (ICD) program is described, and recommendations based on it are made. Explanations and rationale for the content of the ICD itself are presented.

  17. A depth camera for natural human-computer interaction based on near-infrared imaging and structured light

    NASA Astrophysics Data System (ADS)

    Liu, Yue; Wang, Liqiang; Yuan, Bo; Liu, Hao

    2015-08-01

    The design of a novel depth camera is presented, targeting close-range (20-60 cm) natural human-computer interaction, especially for mobile terminals. In order to achieve high precision throughout the working range, a two-step method is employed to map the near-infrared intensity image to absolute depth in real time. First, structured light produced by an 808 nm laser diode and a Dammann grating is used to coarsely quantize the output space of depth values into discrete bins. Then a learning-based classification forest algorithm predicts the depth distribution over these bins for each pixel in the image. Quantitative experimental results show that this depth camera achieves 1% precision over the 20-60 cm range, which indicates that the camera suits resource-limited and low-cost applications.
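
    A minimal sketch of the two-step idea described above: a classification forest predicts a probability distribution over coarse depth bins for each pixel, and an expected depth is read off from that distribution. scikit-learn's RandomForestClassifier stands in for the paper's learning-based classification forest; the bin edges and feature vectors are hypothetical.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      bin_edges = np.linspace(200.0, 600.0, 9)              # hypothetical bins over 20-60 cm (mm)
      bin_centers = 0.5 * (bin_edges[:-1] + bin_edges[1:])

      def train(features: np.ndarray, depths_mm: np.ndarray) -> RandomForestClassifier:
          """features: (n_pixels, n_features) NIR-intensity descriptors; depths_mm: ground truth."""
          labels = np.digitize(depths_mm, bin_edges[1:-1])  # quantize depth into discrete bins
          return RandomForestClassifier(n_estimators=50).fit(features, labels)

      def predict_depth(forest: RandomForestClassifier, features: np.ndarray) -> np.ndarray:
          """Expected depth per pixel from the predicted distribution over bins."""
          proba = forest.predict_proba(features)            # (n_pixels, n_bins_seen)
          return proba @ bin_centers[forest.classes_]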

  18. Graphical Interfaces for Simulation.

    ERIC Educational Resources Information Center

    Hollan, J. D.; And Others

    This document presents a discussion of the development of a set of software tools to assist in the construction of interfaces to simulations and real-time systems. Presuppositions to the approach to interface design that was used are surveyed, the tools are described, and the conclusions drawn from these experiences in graphical interface design…

  19. Human perceptual deficits as factors in computer interface test and evaluation

    SciTech Connect

    Bowser, S.E.

    1992-06-01

    Issues related to testing and evaluating human-computer interfaces are usually framed around the machine rather than the human portion of the interface. Perceptual characteristics of the expected user are rarely investigated, and interface designers ignore known population perceptual limitations. For these reasons, environmental impacts on the equipment are more likely to be defined than user perceptual characteristics. The investigation of user population characteristics is most often directed toward intellectual abilities and anthropometry. The problem is compounded by the fact that some perceptual deficits tend to occur at higher-than-overall population rates in some user groups. The test and evaluation community can address the issue from two primary aspects. First, assessment of user characteristics should be extended to include tests of perceptual capability. Second, interface designs should use multimode information coding.

  20. Independent Verification and Validation of Complex User Interfaces: A Human Factors Approach

    NASA Technical Reports Server (NTRS)

    Whitmore, Mihriban; Berman, Andrea; Chmielewski, Cynthia

    1996-01-01

    The Usability Testing and Analysis Facility (UTAF) at the NASA Johnson Space Center has identified and evaluated a potential automated software interface inspection tool capable of assessing the degree to which space-related critical and high-risk software system user interfaces meet objective human factors standards across each NASA program and project. Testing consisted of two distinct phases. Phase 1 compared analysis times and similarity of results for the automated tool and for human-computer interface (HCI) experts. In Phase 2, HCI experts critiqued the prototype tool's user interface. Based on this evaluation, it appears that a more fully developed version of the tool will be a promising complement to a human factors-oriented independent verification and validation (IV&V) process.

  1. Quantization of interface currents

    SciTech Connect

    Kotani, Motoko; Schulz-Baldes, Hermann; Villegas-Blas, Carlos

    2014-12-15

    At the interface of two two-dimensional quantum systems, there may exist interface currents similar to edge currents in quantum Hall systems. It is proved that these interface currents are macroscopically quantized by an integer that is given by the difference of the Chern numbers of the two systems. It is also argued that at the interface between two time-reversal invariant systems with half-integer spin, one of which is trivial and the other non-trivial, there are dissipationless spin-polarized interface currents.

  2. A novel device for head gesture measurement system in combination with eye-controlled human machine interface

    NASA Astrophysics Data System (ADS)

    Lin, Chern-Sheng; Ho, Chien-Wa; Chang, Kai-Chieh; Hung, San-Shan; Shei, Hung-Jung; Yeh, Mau-Shiun

    2006-06-01

    This study describes the design and combination of an eye-controlled and a head-controlled human-machine interface system. This system is a highly effective human-machine interface, detecting head movement by changing positions and numbers of light sources on the head. When the users utilize the head-mounted display to browse a computer screen, the system will catch the images of the user's eyes with CCD cameras, which can also measure the angle and position of the light sources. In the eye-tracking system, the program in the computer will locate each center point of the pupils in the images, and record the information on moving traces and pupil diameters. In the head gesture measurement system, the user wears a double-source eyeglass frame, so the system catches images of the user's head by using a CCD camera in front of the user. The computer program will locate the center point of the head, transferring it to the screen coordinates, and then the user can control the cursor by head motions. We combine the eye-controlled and head-controlled human-machine interface system for the virtual reality applications.
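
    A minimal sketch of the image-processing step implied above: threshold the camera frame, take the centroid of the bright pupil or light-source blob, and map it into screen coordinates with a pre-calibrated affine transform. NumPy only; the threshold value and the 2x3 calibration matrix are assumed to come from a separate calibration step and are hypothetical here.

      import numpy as np

      def blob_centroid(gray: np.ndarray, threshold: int = 200):
          """Centroid (x, y) of pixels brighter than threshold, or None if no blob is found."""
          ys, xs = np.nonzero(gray > threshold)
          if xs.size == 0:
              return None
          return float(xs.mean()), float(ys.mean())

      def to_screen(point, affine: np.ndarray):
          """Map an image-space point to screen coordinates with a calibrated 2x3 affine matrix."""
          x, y = point
          sx, sy = affine @ np.array([x, y, 1.0])
          return float(sx), float(sy)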

  3. Microconical interface fitting and interface grasping tool

    NASA Technical Reports Server (NTRS)

    Gernhardt, Michael L. (Inventor); Wightman, William D. (Inventor); Johnston, Alistair P. (Inventor)

    1994-01-01

    A small and light weight microconical interface fitting may be attached to the surface of a space vehicle or equipment to provide an attachment device for an astronaut or robot to capture the space vehicle or equipment. The microconical interface fitting of the present invention has an axisymmetrical conical body having a base portion with a torque reaction surface for preventing rotation of the interface grasping tool; a cavitated, sunken or hollowed out intermediate locking portion which has a cavity shaped for receiving the latches of the grasping tool and an upper guiding portion for guiding the grasping tool into axial alignment with the microconical interface fitting. The capture is accomplished with an interface grasping tool. The grasping tool comprises an outer sleeve with a handle attached, an inner sleeve which may be raised and lowered within the outer sleeve with a plurality of latches supported at the lower end and a cam to raise and lower the inner sleeve. When the inner sleeve is at its lowest position, the latches form the largest diameter opening for surrounding the microconical fitting and the latches form the smallest diameter or a locking, grasping position when raised to the highest position within the outer sleeve. The inner sleeve may be at an intermediate, capture position which permits the latches to be biased outwardly when contacting the microconical fitting under very low forces to grasp the fitting and permits capture (soft docking) without exact alignment of the fitting and the tool.

  4. User interface support

    NASA Technical Reports Server (NTRS)

    Lewis, Clayton; Wilde, Nick

    1989-01-01

    Space construction will require heavy investment in the development of a wide variety of user interfaces for the computer-based tools that will be involved at every stage of construction operations. Using today's technology, user interface development is very expensive for two reasons: (1) specialized and scarce programming skills are required to implement the necessary graphical representations and complex control regimes for high-quality interfaces; (2) iteration on prototypes is required to meet user and task requirements, since these are difficult to anticipate with current (and foreseeable) design knowledge. We are attacking this problem by building a user interface development tool based on extensions to the spreadsheet model of computation. The tool provides high-level support for graphical user interfaces and permits dynamic modification of interfaces, without requiring conventional programming concepts and skills.

  5. Viscoelasticity of stepped interfaces

    NASA Astrophysics Data System (ADS)

    Skirlo, S. A.; Demkowicz, M. J.

    2013-10-01

    Using molecular dynamics modeling, we show that interfaces in sputter deposited Cu-Nb superlattices exhibit time-dependent elasticity, i.e., viscoelasticity, under shear loading. In the high temperature and small strain rate limit, the interfacial shear modulus approaches a value proportional to the density of steps in the interface. It may therefore be possible to tailor the low-frequency shear moduli of interfaces by controlling their step densities.

  6. Persistent interface fluid syndrome.

    PubMed

    Hoffman, Richard S; Fine, I Howard; Packer, Mark

    2008-08-01

    We present an unusual case of persistent interface fluid that would not resolve despite normal intraocular pressure and corneal endothelial replacement with Descemet-stripping endothelial keratoplasty. Dissection, elevation, and repositioning of the laser in situ keratomileusis flap were required to resolve the interface fluid. Circumferential corneal graft-host margin scar formation acting as a mechanical strut may have been the cause of the intractable interface fluid.

  7. Turbomachine Interface Sealing

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C.; Chupp, Raymond E.; Lattime, Scott B.; Steinetz, Bruce M.

    2005-01-01

    Sealing interfaces and coatings, like lubricants, are sacrificial, giving up their integrity for the benefit of the component. Clearance control is a major issue in power systems turbomachine design and operational life. Sealing becomes the most cost-effective way to enhance system performance. Coatings, films, and combined use of both metals and ceramics play a major role in maintaining interface clearances in turbomachine sealing and component life. This paper focuses on conventional and innovative materials and design practices for sealing interfaces.

  8. Fracture interface waves

    NASA Astrophysics Data System (ADS)

    Gu, Boliang; Nihei, Kurt T.; Myer, Larry R.; Pyrak-Nolte, Laura J.

    1996-01-01

    Interface waves on a single fracture in an elastic solid are investigated theoretically and numerically using plane wave analysis and a boundary element method. The finite mechanical stiffness of a fracture is modeled as a displacement discontinuity. Analysis of inhomogeneous plane wave propagation along a fracture yields two dispersive equations for symmetric and antisymmetric interface waves. The basic form of these equations is similar to the classic Rayleigh equation for a surface wave on a half-space, except that the displacements and velocities of the symmetric and antisymmetric fracture interface waves are each controlled by a normalized fracture stiffness. For low values of the normalized fracture stiffness, the symmetric and antisymmetric interface waves degenerate to the classic Rayleigh wave on a traction-free surface. For large values of the normalized fracture stiffness, the antisymmetric and symmetric interface waves become a body S wave and a body P wave, respectively, which propagate parallel to the fracture. For intermediate values of the normalized fracture stiffness, both interface waves are dispersive. Numerical modeling performed using a boundary element method demonstrates that a line source generates a P-type interface wave, in addition to the two Rayleigh-type interface waves. The magnitude of the normalized fracture stiffness is observed to control the velocities of the interface waves and the partitioning of seismic energy among the various waves near the fracture.
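
    For reference, the classic Rayleigh equation mentioned above, to which both interface-wave equations degenerate in the low-stiffness limit, can be written (in LaTeX notation) for a surface wave of speed c on a half-space with P- and S-wave speeds c_L and c_T:

      \left( 2 - \frac{c^{2}}{c_{T}^{2}} \right)^{2}
        = 4 \, \sqrt{1 - \frac{c^{2}}{c_{L}^{2}}} \; \sqrt{1 - \frac{c^{2}}{c_{T}^{2}}}

    The fracture interface-wave equations referred to in the abstract have the same basic form but additionally involve the normalized fracture stiffness; their exact expressions are given in the cited work and are not reproduced here.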

  9. Support vector machines to detect physiological patterns for EEG and EMG-based human-computer interaction: a review

    NASA Astrophysics Data System (ADS)

    Quitadamo, L. R.; Cavrini, F.; Sbernini, L.; Riillo, F.; Bianchi, L.; Seri, S.; Saggio, G.

    2017-02-01

    Support vector machines (SVMs) are widely used classifiers for detecting physiological patterns in human-computer interaction (HCI). Their success is due to their versatility, robustness and large availability of free dedicated toolboxes. Frequently in the literature, insufficient details about the SVM implementation and/or parameters selection are reported, making it impossible to reproduce study analysis and results. In order to perform an optimized classification and report a proper description of the results, it is necessary to have a comprehensive critical overview of the applications of SVM. The aim of this paper is to provide a review of the usage of SVM in the determination of brain and muscle patterns for HCI, by focusing on electroencephalography (EEG) and electromyography (EMG) techniques. In particular, an overview of the basic principles of SVM theory is outlined, together with a description of several relevant literature implementations. Furthermore, details concerning reviewed papers are listed in tables and statistics of SVM use in the literature are presented. Suitability of SVM for HCI is discussed and critical comparisons with other classifiers are reported.

  10. Ergonomic guidelines for using notebook personal computers. Technical Committee on Human-Computer Interaction, International Ergonomics Association.

    PubMed

    Saito, S; Piccoli, B; Smith, M J; Sotoyama, M; Sweitzer, G; Villanueva, M B; Yoshitake, R

    2000-10-01

    In the 1980s, the visual display terminal (VDT) was introduced into workplaces in many countries. Soon thereafter, an upsurge in reported cases of related health problems, such as musculoskeletal disorders and eyestrain, was seen. Recently, the flat-panel display or notebook personal computer (PC) has become the most notable feature of modern workplaces with VDTs, and even of homes. A proactive approach must be taken to avert foreseeable ergonomic and occupational health problems from the use of this new technology. Because of its distinct physical and optical characteristics, the ergonomic requirements for notebook PCs in terms of machine layout, workstation design and lighting conditions, among others, should differ from those for CRT-based computers. The Japan Ergonomics Society (JES) technical committee developed a set of guidelines for notebook PC use following exploratory discussions of its ergonomic aspects. To keep in stride with this development, the Technical Committee on Human-Computer Interaction under the auspices of the International Ergonomics Association worked towards the international issuance of the guidelines. This paper unveils the result of this collaborative effort.

  11. Support vector machines to detect physiological patterns for EEG and EMG-based human-computer interaction: a review.

    PubMed

    Quitadamo, L R; Cavrini, F; Sbernini, L; Riillo, F; Bianchi, L; Seri, S; Saggio, G

    2017-02-01

    Support vector machines (SVMs) are widely used classifiers for detecting physiological patterns in human-computer interaction (HCI). Their success is due to their versatility, robustness and large availability of free dedicated toolboxes. Frequently in the literature, insufficient details about the SVM implementation and/or parameters selection are reported, making it impossible to reproduce study analysis and results. In order to perform an optimized classification and report a proper description of the results, it is necessary to have a comprehensive critical overview of the applications of SVM. The aim of this paper is to provide a review of the usage of SVM in the determination of brain and muscle patterns for HCI, by focusing on electroencephalography (EEG) and electromyography (EMG) techniques. In particular, an overview of the basic principles of SVM theory is outlined, together with a description of several relevant literature implementations. Furthermore, details concerning reviewed papers are listed in tables and statistics of SVM use in the literature are presented. Suitability of SVM for HCI is discussed and critical comparisons with other classifiers are reported.

  12. Conversion of IVA Human Computer Model to EVA Use and Evaluation and Comparison of the Result to Existing EVA Models

    NASA Technical Reports Server (NTRS)

    Hamilton, George S.; Williams, Jermaine C.

    1998-01-01

    This paper describes the methods, rationale, and comparative results of the conversion of an intravehicular (IVA) 3D human computer model (HCM) to extravehicular (EVA) use and compares the converted model to an existing model on another computer platform. The task of accurately modeling a spacesuited human figure in software is daunting: the suit restricts the human's joint range of motion (ROM) and does not have joints collocated with human joints. The modeling of the variety of materials needed to construct a space suit (e. g. metal bearings, rigid fiberglass torso, flexible cloth limbs and rubber coated gloves) attached to a human figure is currently out of reach of desktop computer hardware and software. Therefore a simplified approach was taken. The HCM's body parts were enlarged and the joint ROM was restricted to match the existing spacesuit model. This basic approach could be used to model other restrictive environments in industry such as chemical or fire protective clothing. In summary, the approach provides a moderate fidelity, usable tool which will run on current notebook computers.

  13. Spatial and temporal characteristics of rapid cursor-positioning movements with electromechanical mice in human-computer interaction.

    PubMed

    Walker, N; Meyer, D E; Smelcer, J B

    1993-09-01

    This research examines how people make movements with pointing devices during human-computer interaction. It specifically concerns the perceptual-motor processes that mediate the speed and accuracy of cursor positioning with electromechanical mice. In three experiments we investigated the spatial and temporal characteristics of positioning movements made with a mouse, analyzing subjects' speed and accuracy as a function of the types of targets that the movements had to reach. Experiment 1 required rapid and accurate horizontal movements to targets that were vertical ribbons located at various distances from the mouse's starting location. The targets for Experiments 2 and 3, respectively, were vertical lines having various heights and rectangular boxes having various heights and widths. Constraints on movement distance along the primary (that is, horizontal) line of motion had the greatest effects on total positioning times. However, constraints on movement distance along a secondary (vertical) line of motion also affected total positioning times significantly. These effects may be localized in different phases of movements (e.g., movement execution and verification). The duration of movement execution (i.e., physical motion) depends primarily on the target distance, whereas the duration of movement verification (i.e., check for endpoint accuracy) depends primarily on target height and width. A useful account of movement execution is provided by stochastic optimized-submovement models, which have significant implications for designing mice and menu-driven displays.
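
    The stochastic optimized-submovement model itself is not reproduced here, but the classical Fitts'-law relation, a simpler model that likewise predicts longer movement times for more distant and narrower targets, gives a feel for the distance/width trade-off discussed above. The intercept and slope below are illustrative values only and must be fitted to real pointing data.

      import math

      def fitts_movement_time(a: float, b: float, distance: float, width: float) -> float:
          """Predicted movement time (s) under the Shannon form of Fitts' law."""
          index_of_difficulty = math.log2(distance / width + 1.0)   # bits
          return a + b * index_of_difficulty

      # Example with illustrative constants: a = 0.2 s, b = 0.15 s/bit
      print(fitts_movement_time(0.2, 0.15, distance=240.0, width=20.0))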

  14. Neural network screening of electromyographic signals as the first phase to design novel human-computer interaction.

    PubMed

    Niemenlehto, Pekka-Henrik; Juhola, Martti; Surakka, Veikko

    2005-01-01

    The present aim was to describe first-phase attempts to recognise voluntarily produced changes in electromyographic signals measured from two facial muscles. Thirty subjects voluntarily activated two facial muscles, corrugator supercilii and zygomaticus major. We designed a neural network based recognition system that screened out muscle activations from the electromyographic signals. When several subjects were tested according to the same test protocol, the neural network system correctly recognised more than 95% of all muscle activations. This is a promising result, and we shall next modify the system for real-time operation and then design its use in various multimodal human-computer interaction techniques. The subsequent phase will be the interaction in the other direction: once a computer program has recognised the use of the facial muscles, it will follow the instructions given by the user. For instance, by using the facial muscles the subject could select or activate objects on the computer screen. This would be one of the opportunities we are developing to help, e.g., disabled persons who are unable to use their hands.

  15. After-effects of human-computer interaction indicated by P300 of the event-related brain potential.

    PubMed

    Trimmel, M; Huber, R

    1998-05-01

    After-effects of human-computer interaction (HCI) were investigated by using the P300 component of the event-related brain potential (ERP). Forty-nine subjects (naive non-users, beginners, experienced users, programmers) completed three paper/pencil tasks (text editing, solving intelligence test items, filling out a questionnaire on sensation seeking) and three HCI tasks (text editing, executing a tutor program or programming, playing Tetris). The sequence of 7-min tasks was randomized between subjects and balanced between groups. After each experimental condition ERPs were recorded during an acoustic discrimination task at F3, F4, Cz, P3 and P4. Data indicate that: (1) mental after-effects of HCI can be detected by P300 of the ERP; (2) HCI showed in general a reduced amplitude; (3) P300 amplitude varied also with type of task, mainly at F4 where it was smaller after cognitive tasks (intelligence test/programming) and larger after emotion-based tasks (sensation seeking/Tetris); (4) cognitive tasks showed shorter latencies; (5) latencies were widely location-independent (within the range of 356-358 ms at F3, F4, P3 and P4) after executing the tutor program or programming; and (6) all observed after-effects were independent of the user's experience in operating computers and may therefore reflect short-term after-effects only and no structural changes of information processing caused by HCI.
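
    A minimal sketch of how a P300 amplitude is commonly quantified from event-related epochs: average the stimulus-locked trials and take the positive peak in a 250-500 ms window. This assumes baseline-corrected epochs with time zero at stimulus onset and a known sampling rate; it is a generic simplification, not the exact procedure used in the study.

      import numpy as np

      def p300_amplitude(epochs: np.ndarray, fs: float, window=(0.25, 0.50)) -> float:
          """epochs: (n_trials, n_samples) baseline-corrected EEG segments from one electrode."""
          erp = epochs.mean(axis=0)                       # grand-average event-related potential
          i0, i1 = int(window[0] * fs), int(window[1] * fs)
          return float(erp[i0:i1].max())                  # positive peak in the P300 window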

  17. Secure Distributed Human Computation

    NASA Astrophysics Data System (ADS)

    Gentry, Craig; Ramzan, Zulfikar; Stubblebine, Stuart

    In Peha’s Financial Cryptography 2004 invited talk, he described the Cyphermint PayCash system (see www.cyphermint.com), which allows people without bank accounts or credit cards (a sizeable segment of the U.S. population) to automatically and instantly cash checks, pay bills, or make Internet transactions through publicly-accessible kiosks. Since PayCash offers automated financial transactions and since the system uses (unprotected) kiosks, security is critical. The kiosk must decide whether a person cashing a check is really the person to whom the check was made out, so it takes a digital picture of the person cashing the check and transmits this picture electronically to a central office, where a human worker compares the kiosk’s picture to one that was taken when the person registered with Cyphermint. If both pictures are of the same person, then the human worker authorizes the transaction.

  18. User interface development for semiautomated imagery exploitation

    NASA Astrophysics Data System (ADS)

    O'Connor, R. P.; Bohling, Edward H.

    1991-08-01

    Operational reconnaissance technical organizations are burdened by greatly increasing workloads due to expanding capabilities for collection and delivery of large-volume near-real- time multisensor/multispectral softcopy imagery. Related to the tasking of reconnaissance platforms to provide the imagery are more stringent timelines for exploiting the imagery in response to the rapidly changing threat environment being monitored. The development of a semi-automated softcopy multisensor image exploitation capability is a critical step toward integrating existing advanced image processing techniques in conjunction with appropriate intelligence and cartographic data for next-generation image exploitation systems. This paper discusses the results of a recent effort to develop computer-assisted aids for the image analyst (IA) in order to rapidly and accurately exploit multispectral/multisensor imagery in combination with intelligence support data and cartographic information for the purpose of target detection and identification. A key challenge of the effort was to design and implement an effective human-computer interface that would satisfy any generic IA task and readily accommodate the needs of a broad range of IAs.

  19. Thread Pool Interface (TPI)

    SciTech Connect

    Edwards, H. Carter

    2008-04-01

    Thread Pool Interface (TPI) provides a simple interface for running functions written in C or C++ in a thread-parallel mode. Application or library codes may need to perform operations thread-parallel on machines with multicore processors. The TPI library provides a simple mechanism for managing thread activation, deactivation, and thread-parallel execution of application-provided subprograms.
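
    TPI itself is a C/C++ library, but the idea it packages, handing an application-provided subprogram to a managed pool of threads, can be illustrated with Python's standard-library thread pool. This is an analogy only, not the TPI API.

      from concurrent.futures import ThreadPoolExecutor

      def work(chunk):
          """Application-provided subprogram executed thread-parallel on one chunk of data."""
          return sum(x * x for x in chunk)

      chunks = [range(i * 1000, (i + 1) * 1000) for i in range(8)]
      with ThreadPoolExecutor(max_workers=4) as pool:    # the pool manages thread activation/deactivation
          partial_sums = list(pool.map(work, chunks))
      print(sum(partial_sums))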

  1. Designing the Instructional Interface.

    ERIC Educational Resources Information Center

    Lohr, L. L.

    2000-01-01

    Designing the instructional interface is a challenging endeavor requiring knowledge and skills in instructional and visual design, psychology, human-factors, ergonomic research, computer science, and editorial design. This paper describes the instructional interface, the challenges of its development, and an instructional systems approach to its…

  3. Interface colloidal robotic manipulator

    DOEpatents

    Aronson, Igor; Snezhko, Oleksiy

    2015-08-04

    A magnetic colloidal system confined at the interface between two immiscible liquids and energized by an alternating magnetic field dynamically self-assembles into localized asters and arrays of asters. The colloidal system exhibits locomotion and shape change. By controlling a small external magnetic field applied parallel to the interface, structures can capture, transport, and position target particles.

  4. The essential human-machine interface for surgery: biological signals transmission.

    PubMed

    Heinrichs, W L; Lloyd, A

    1995-01-01

    The concept of a machine-augmented surgeon will become a widespread reality only after the barrier of harnessing the computer as a tool has been overcome. The prospects of surgical robots for computer-assisted surgery, for telemedicine, and for teleoperation (cybersurgery) will be greatly enhanced when computers are no longer considered a separate component that links a system together; they must lose their identity and become transparent. The ideal human-machine interface for surgery is one juxtaposed between surgeon and patient that derives digital biosignals directly from both bodies, transmits them transparently without perceptible delay, and distributes them bilaterally into afferent (sensory) and efferent (operator or effector) limbs. This ideal human-computer interface will be based upon biosignal processing and will match the technology to the physiology, in what has been called biocybernetics. Applications of biosignal interfaces are being developed in entertainment, medicine, commerce, defense, and in sales and distribution (Table 1).

  5. ChemPreview: an augmented reality-based molecular interface.

    PubMed

    Zheng, Min; Waller, Mark P

    2017-05-01

    Human-computer interfaces make computational science more comprehensible and impactful. Complex 3D structures such as proteins or DNA are magnified by digital representations and displayed on two-dimensional monitors. Augmented reality has recently opened another door to the virtual three-dimensional world. Herein, we present an augmented reality application called ChemPreview with the potential to manipulate bio-molecular structures at an atomistic level. ChemPreview is available at https://github.com/wallerlab/chem-preview/releases, and is built on top of the Meta 1 platform https://www.metavision.com/. ChemPreview can be used to interact with a protein in an intuitive way using natural hand gestures, making it appealing to computational chemists and structural biologists. The ability to manipulate atoms in the real world could eventually provide new and more efficient ways of extracting structural knowledge, or of designing new molecules in silico.

  6. Turning Shortcomings into Challenges: Brain-Computer Interfaces for Games

    NASA Astrophysics Data System (ADS)

    Nijholt, Anton; Reuderink, Boris; Oude Bos, Danny

    In recent years we have seen a rising interest in brain-computer interfacing for human-computer interaction and potential game applications. Until now, however, we have almost only seen attempts where BCI is used to measure the affective state of the user or in neurofeedback games. There have hardly been any attempts to design BCI games where BCI is considered to be one of the possible input modalities that can be used to control the game. One reason may be that research still follows the paradigms of the traditional, medically oriented, BCI approaches. In this paper we discuss current BCI research from the viewpoint of games and game design. It is hoped that this survey will make clear that we need to design different games than we used to, but that such games can nevertheless be interesting and exciting.

  7. Top ten list of user-hostile interface design

    SciTech Connect

    Miller, D.P.

    1994-10-01

    This report describes ten of the most frequent ergonomic problems found in human-computer interfaces (HCIs) associated with complex industrial machines. In contrast with being thought of as "user friendly," many of these machines are seen by the author as exhibiting "user-hostile" attributes. The historical lack of consistent application of ergonomic principles in HCIs has led to a breed of very sophisticated, complex manufacturing equipment that few people can operate without extensive orientation, training, or experience. This design oversight has produced the need for extensive training programs and help documentation, unnecessary machine downtime, and reduced productivity resulting from operator stress and confusion. Ergonomic considerations affect industrial machines in at least three important areas: (1) the physical package, including the CRT and keyboard, maintenance access areas, and dedicated hardware selection, layout, and labeling; (2) the software by which the user interacts with the computer that controls the equipment; and (3) the supporting documentation.

  8. Towards New Interfaces for Pedagogy

    ERIC Educational Resources Information Center

    Stein, Murphy Martin

    2014-01-01

    Developing technology to help people teach and learn is an important topic in Human Computer Interaction (HCI). In this thesis we present three studies on this topic. In the first study, we demonstrate new games for learning mathematics and discuss the evidence for key design decisions from user studies. In the second study, we develop a real-time…

  10. Operator interface for vehicles

    SciTech Connect

    Bissontz, Jay E

    2015-03-10

    A control interface for drivetrain braking provided by a regenerative brake and a non-regenerative brake is implemented using a combination of switches and graphic interface elements. The control interface comprises a control system for allocating drivetrain braking effort between the regenerative brake and the non-regenerative brake, a first operator actuated control for enabling operation of the drivetrain braking, and a second operator actuated control for selecting a target braking effort for drivetrain braking. A graphic display displays to an operator the selected target braking effort and can be used to further display actual braking effort achieved by drivetrain braking.
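
    A minimal sketch of the allocation idea described above: given the operator-selected target braking effort, take as much as possible from the regenerative brake and make up the remainder with the non-regenerative (friction) brake. The normalized units and the capacity limit are hypothetical; the patent's actual control logic is not specified here.

      def allocate_braking(target_effort: float, regen_capacity: float) -> tuple[float, float]:
          """Split a target braking effort (0..1 of maximum) between regenerative and friction brakes."""
          regen = min(target_effort, regen_capacity)     # prefer regeneration up to its current capacity
          friction = target_effort - regen               # friction brake supplies the remainder
          return regen, friction

      # Example: operator requests 0.7 of maximum effort; regeneration can currently absorb 0.5
      print(allocate_braking(0.7, 0.5))                  # -> (0.5, 0.2)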

  11. Combining Multivariate Statistics and the Think-Aloud Protocol to Assess Human-Computer Interaction Barriers in Symptom Checkers.

    PubMed

    Marco-Ruiz, Luis; Bønes, Erlend; de la Asunción, Estela; Gabarrón, Elia; Carlos Aviles-Solis, Juan; Lee, Eunji; Traver, Vicente; Sato, Keiichi; Bellika, Johan G

    2017-09-09

    Symptom checkers are software tools that allow users to submit a set of symptoms and receive advice related to them in the form of a diagnosis list, health information or triage. The heterogeneity of their potential users and the number of different components in their user interfaces can make testing with end-users unaffordable. We designed and executed a two-phase method to test the respiratory diseases module of the symptom checker Erdusyk. Phase I consisted of an online test with a large sample of users (n = 53). In Phase I, users evaluated the system remotely and completed a questionnaire based on the Technology Acceptance Model. Principal Component Analysis was used to correlate each section of the interface with the questionnaire responses, thus identifying which areas of the user interface contributed significantly to technology acceptance. In the second phase, the think-aloud procedure was carried out with a small sample (n = 15), focusing on the areas with significant contributions in order to analyze the reasons for those contributions. The method was used effectively to optimize the testing of symptom checker user interfaces: it kept the cost of testing at reasonable levels by restricting the use of the think-aloud procedure while still ensuring a high degree of coverage. The main barriers detected in Erdusyk were related to problems understanding time repetition patterns, the selection of levels on intensity scales, navigation, the quantification of some symptom attributes, and the characteristics of the symptoms.
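
    A minimal sketch of the Phase I analysis style described above: run a principal component analysis on the questionnaire response matrix and inspect the loadings to see which items (and hence which interface sections) drive the variance. scikit-learn is assumed, and the response matrix below is randomly generated placeholder data, not the study's data.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      # responses: (n_users, n_items) Likert answers from the TAM-based questionnaire (placeholder)
      responses = np.random.randint(1, 8, size=(53, 20)).astype(float)

      pca = PCA(n_components=3)
      scores = pca.fit_transform(StandardScaler().fit_transform(responses))
      print(pca.explained_variance_ratio_)   # variance captured by each component
      print(pca.components_[0])              # loadings: items contributing most to the first component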

  12. Scalable coherent interface

    SciTech Connect

    Alnaes, K.; Kristiansen, E.H.; Gustavson, D.B.; James, D.V.

    1990-01-01

    The Scalable Coherent Interface (IEEE P1596) is establishing an interface standard for very high performance multiprocessors, supporting a cache-coherent-memory model scalable to systems with up to 64K nodes. This Scalable Coherent Interface (SCI) will supply a peak bandwidth per node of 1 GigaByte/second. The SCI standard should facilitate assembly of processor, memory, I/O and bus bridge cards from multiple vendors into massively parallel systems with throughput far above what is possible today. The SCI standard encompasses two levels of interface, a physical level and a logical level. The physical level specifies electrical, mechanical and thermal characteristics of connectors and cards that meet the standard. The logical level describes the address space, data transfer protocols, cache coherence mechanisms, synchronization primitives and error recovery. In this paper we address logical level issues such as packet formats, packet transmission, transaction handshake, flow control, and cache coherence. 11 refs., 10 figs.

  13. Prototyping the graphical user interface for the operator of the Cherenkov Telescope Array

    NASA Astrophysics Data System (ADS)

    Sadeh, I.; Oya, I.; Schwarz, J.; Pietriga, E.

    2016-07-01

    The Cherenkov Telescope Array (CTA) is a planned gamma-ray observatory. CTA will incorporate about 100 imaging atmospheric Cherenkov telescopes (IACTs) at a Southern site, and about 20 in the North. Previous IACT experiments have used up to five telescopes, so the design of a graphical user interface (GUI) for the operator of CTA involves new challenges. We present a GUI prototype, the concept for which is being developed in collaboration with experts from the field of Human-Computer Interaction (HCI). The prototype is based on Web technology; it incorporates a Python web server, WebSockets and graphics generated with the d3.js JavaScript library.
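
    A minimal sketch of the server side of such a prototype, assuming the third-party websockets package: a Python process pushes monitoring values over a WebSocket once per second for a d3.js page to plot. The quantity shown is a hypothetical trigger rate; note that older websockets releases expect the handler to accept an additional path argument.

      import asyncio
      import json
      import random

      import websockets

      async def telemetry(websocket):
          """Push one hypothetical telescope trigger-rate sample per second to the browser GUI."""
          while True:
              sample = {"telescope": 1, "trigger_rate_hz": random.uniform(100.0, 300.0)}
              await websocket.send(json.dumps(sample))
              await asyncio.sleep(1.0)

      async def main():
          async with websockets.serve(telemetry, "localhost", 8765):
              await asyncio.Future()   # run until cancelled

      if __name__ == "__main__":
          asyncio.run(main())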

  14. Using a web-based prototype and human-computer interaction concepts to develop a vision for a next generation patient care management system.

    PubMed

    Staggers, N; Miller, S

    2001-01-01

    This paper describes the novel use of two tools to develop requirements for a new generation patient care system: a web-based prototype and a human-computer interaction framework. These tools allowed a development team to crystallize new requirements for a patient care system, illustrate to clinicians a radical change in care process models, and begin the change management process in a large enterprise.

  15. Using a web-based prototype and human-computer interaction concepts to develop a vision for a next generation patient care management system.

    PubMed Central

    Staggers, N.; Miller, S.

    2001-01-01

    This paper describes the novel use of two tools to develop requirements for a new generation patient care system: a web-based prototype and a human-computer interaction framework. These tools allowed a development team to crystallize new requirements for a patient care system, illustrate to clinicians a radical change in care process models, and begin the change management process in a large enterprise. PMID:11825266

  16. Strength of Polymer Interfaces

    DTIC Science & Technology

    1990-11-25

    Keywords: fracture, fatigue, lamination, diffusion. The studies concern the strength of polymer interfaces as determined by the polymer chains, the chemical compatibility, and the fractal nature of diffuse interfaces. Several experimental methods are used to probe the weld structure and compare with theoretical scaling laws.

  17. Polarizable Ions at Interfaces

    NASA Astrophysics Data System (ADS)

    Levin, Yan

    2009-04-01

    A nonperturbative theory is presented which allows us to calculate the solvation free energy of polarizable ions near water-vapor and water-oil interfaces. The theory predicts that larger halogen anions are adsorbed at the interface, while the alkali metal cations are repelled from it. The density profiles calculated theoretically are similar to those obtained using molecular dynamics simulations with polarizable force fields.

  18. Data Reorganization Interface

    DTIC Science & Technology

    2007-11-02

    Data Reorganization Interface (DRI). Kenneth Cain Jr., Mercury Computer Systems, Inc. Phone: (978) 967-1645. Email: kcain@mc.com.

  19. VIRTUAL FRAME BUFFER INTERFACE

    NASA Technical Reports Server (NTRS)

    Wolfe, T. L.

    1994-01-01

    Large image processing systems use multiple frame buffers with differing architectures and vendor supplied user interfaces. This variety of architectures and interfaces creates software development, maintenance, and portability problems for application programs. The Virtual Frame Buffer Interface program makes all frame buffers appear as a generic frame buffer with a specified set of characteristics, allowing programmers to write code which will run unmodified on all supported hardware. The Virtual Frame Buffer Interface converts generic commands to actual device commands. The virtual frame buffer consists of a definition of capabilities and FORTRAN subroutines that are called by application programs. The virtual frame buffer routines may be treated as subroutines, logical functions, or integer functions by the application program. Routines are included that allocate and manage hardware resources such as frame buffers, monitors, video switches, trackballs, tablets and joysticks; access image memory planes; and perform alphanumeric font or text generation. The subroutines for the various "real" frame buffers are in separate VAX/VMS shared libraries allowing modification, correction or enhancement of the virtual interface without affecting application programs. The Virtual Frame Buffer Interface program was developed in FORTRAN 77 for a DEC VAX 11/780 or a DEC VAX 11/750 under VMS 4.X. It supports ADAGE IK3000, DEANZA IP8500, Low Resolution RAMTEK 9460, and High Resolution RAMTEK 9460 Frame Buffers. It has a central memory requirement of approximately 150K. This program was developed in 1985.
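
    The pattern described above translates naturally into an abstract interface with device-specific backends. The sketch below uses Python rather than the original FORTRAN 77, and the class and method names are hypothetical; it only illustrates how application code stays device-independent while a backend converts generic calls into device commands.

      from abc import ABC, abstractmethod

      class FrameBuffer(ABC):
          """Generic frame buffer with a fixed set of capabilities."""
          @abstractmethod
          def allocate(self, width: int, height: int) -> None: ...
          @abstractmethod
          def write_pixel(self, x: int, y: int, value: int) -> None: ...
          @abstractmethod
          def draw_text(self, x: int, y: int, text: str) -> None: ...

      class ToyDeviceBackend(FrameBuffer):
          """Backend translating generic calls into imagined device commands."""
          def allocate(self, width, height):
              print(f"device: allocate {width}x{height} image planes")
          def write_pixel(self, x, y, value):
              print(f"device: set pixel ({x},{y}) = {value}")
          def draw_text(self, x, y, text):
              print(f"device: render '{text}' at ({x},{y})")

      def application(fb: FrameBuffer) -> None:
          # Application code never names a concrete device, so it runs
          # unmodified on any supported backend.
          fb.allocate(512, 512)
          fb.draw_text(10, 10, "hello")

      application(ToyDeviceBackend())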

  20. Serial interface controller

    SciTech Connect

    Kandasamy, A.

    1995-04-14

    The idea of building a Serial Interface Controller (SIC), proposed by Paul O'Connor, Instrumentation Division, BNL, is to determine the feasibility of incorporating serial-interface-controlled CMOS ICs for charge amplification, shaping, analog storage and multiplexing used in particle detectors for high energy physics experiments. The serial data pumped into the CMOS ICs will be used to control many circuit parameters, such as digitally controlled gain, shaping time, precision preamplifier calibration circuits, and other parameters such as the timing discriminator's mode of operation. The SIC board built will be tested on a serial-interface-controlled digital-to-analog converter that follows either Motorola's SPI/QSPI or National Semiconductor's Microwire interface technique. The DAC chosen for this was MAXIM's MAX537, a quad, 12-bit DAC. The function of this controller can be achieved by using off-the-shelf microcontrollers such as Motorola's MC68HC11, which offers dedicated SPI ports. The drawback encountered in using this controller is the overhead involved in putting together a user interface where the user can dynamically change settings and load the SIC device. This is very critical when testing small numbers of CMOS ICs having a SIC. The SIC board described here takes care of this dynamic user-interface issue.
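
    The serial loading scheme amounts to shifting a control word into the device one bit at a time. Below is a hedged Python sketch of an SPI-style transfer of a 16-bit word carrying a 12-bit DAC code; the pin names, the word layout and the helper function are placeholders and do not reproduce the MAX537's actual register format.

      def set_pin(name: str, level: int) -> None:
          print(f"{name} <- {level}")   # stand-in for real GPIO or port access

      def spi_write_word(word: int, bits: int = 16) -> None:
          set_pin("CS", 0)                      # select the device
          for i in reversed(range(bits)):
              set_pin("MOSI", (word >> i) & 1)  # present one data bit, MSB first
              set_pin("SCLK", 1)                # clock it into the device
              set_pin("SCLK", 0)
          set_pin("CS", 1)                      # deselect and latch the word

      # Hypothetical framing: a 2-bit channel address above a 12-bit DAC code.
      spi_write_word((0b10 << 12) | 0x7FF)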

  1. MER SPICE Interface

    NASA Technical Reports Server (NTRS)

    Sayfi, Elias

    2004-01-01

    MER SPICE Interface is a software module for use in conjunction with the Mars Exploration Rover (MER) mission and the SPICE software system of the Navigation and Ancillary Information Facility (NAIF) at NASA's Jet Propulsion Laboratory. (SPICE is used to acquire, record, and disseminate engineering, navigational, and other ancillary data describing circumstances under which data were acquired by spaceborne scientific instruments.) Given a Spacecraft Clock value, MER SPICE Interface extracts MER-specific data from SPICE kernels (essentially, raw data files) and calculates values for Planet Day Number, Local Solar Longitude, Local Solar Elevation, Local Solar Azimuth, and Local Solar Time (UTC). MER SPICE Interface was adapted from a subroutine, denoted m98SpiceIF, written by Payam Zamani and intended to calculate SPICE values for the Mars Polar Lander. The main difference between MER SPICE Interface and m98SpiceIF is that MER SPICE Interface does not explicitly call CHRONOS, a time-conversion program that is part of a library of utility subprograms within SPICE. Instead, MER SPICE Interface mimics some portions of the CHRONOS code, the advantage being that it executes much faster and can efficiently be called from a pipeline of events in a parallel processing environment.
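
    For readers who want to reproduce this kind of lookup today, the sketch below uses the spiceypy wrapper around the SPICE toolkit; the module described above was not written against spiceypy, and the meta-kernel filename, NAIF clock ID and example clock string here are placeholders rather than MER mission values.

      import spiceypy as spice

      spice.furnsh("mer_kernels.tm")   # load the SPICE kernels listed in a meta-kernel
      SC_CLOCK_ID = -999               # placeholder NAIF spacecraft-clock ID

      def sclk_to_utc(sclk_string: str) -> str:
          et = spice.scs2e(SC_CLOCK_ID, sclk_string)  # spacecraft clock -> ephemeris time
          return spice.et2utc(et, "ISOC", 3)          # ephemeris time -> ISO UTC string

      print(sclk_to_utc("1/0126461949.076"))          # illustrative clock string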

  2. Engineering Orthopedic Tissue Interfaces

    PubMed Central

    Yang, Peter J.

    2009-01-01

    While a wide variety of approaches to engineering orthopedic tissues have been proposed, less attention has been paid to the interfaces, the specialized areas that connect two tissues of different biochemical and mechanical properties. The interface tissue plays an important role in transitioning mechanical load between disparate tissues. Thus, the relatively new field of interfacial tissue engineering presents new challenges: not only to consider the regeneration of individual orthopedic tissues, but also to design the biochemical and cellular composition of the linking tissue. Approaches to interfacial tissue engineering may be distinguished based on whether the goal is to recreate the interface itself or to generate an entire integrated tissue unit (such as an osteochondral plug). As background for future efforts in engineering orthopedic interfaces, a brief review of the biology and mechanics of each interface (cartilage–bone, ligament–bone, meniscus–bone, and muscle–tendon) is presented, followed by an overview of the state-of-the-art in engineering each tissue, including advances and challenges specific to regenerating the interfaces. PMID:19231983

  3. VIRTUAL FRAME BUFFER INTERFACE

    NASA Technical Reports Server (NTRS)

    Wolfe, T. L.

    1994-01-01

    Large image processing systems use multiple frame buffers with differing architectures and vendor supplied user interfaces. This variety of architectures and interfaces creates software development, maintenance, and portability problems for application programs. The Virtual Frame Buffer Interface program makes all frame buffers appear as a generic frame buffer with a specified set of characteristics, allowing programmers to write code which will run unmodified on all supported hardware. The Virtual Frame Buffer Interface converts generic commands to actual device commands. The virtual frame buffer consists of a definition of capabilities and FORTRAN subroutines that are called by application programs. The virtual frame buffer routines may be treated as subroutines, logical functions, or integer functions by the application program. Routines are included that allocate and manage hardware resources such as frame buffers, monitors, video switches, trackballs, tablets and joysticks; access image memory planes; and perform alphanumeric font or text generation. The subroutines for the various "real" frame buffers are in separate VAX/VMS shared libraries allowing modification, correction or enhancement of the virtual interface without affecting application programs. The Virtual Frame Buffer Interface program was developed in FORTRAN 77 for a DEC VAX 11/780 or a DEC VAX 11/750 under VMS 4.X. It supports ADAGE IK3000, DEANZA IP8500, Low Resolution RAMTEK 9460, and High Resolution RAMTEK 9460 Frame Buffers. It has a central memory requirement of approximately 150K. This program was developed in 1985.

  4. The impact of human-computer interaction-based comprehensive training on the cognitive functions of cognitive impairment elderly individuals in a nursing home.

    PubMed

    Zhuang, Jun-Peng; Fang, Rong; Feng, Xia; Xu, Xu-Hua; Liu, Li-Hua; Bai, Qing-Ke; Tang, Hui-Dong; Zhao, Zhen-Guo; Chen, Sheng-Di

    2013-01-01

    Given the increasing prevalence of dementia, any intervention that can effectively slow the deterioration of cognitive function is of great importance. This study investigated the efficacy of a human-computer interaction-based comprehensive cognitive training program in cognitively impaired elderly individuals living in a nursing home. All subjects, who were aged ≥70 years and had cognitive impairment, were randomly allocated to an intervention group (n = 19) or a control group (n = 14). The intervention group received human-computer interaction-based comprehensive cognitive training for 24 weeks. Neuropsychological examinations were conducted before and after this period. The intervention group was subdivided into two groups according to the scores of global cortical atrophy (GCA) to evaluate the impact of training effectiveness on GCA. After 24 weeks, neither group showed a significant change compared with baseline cognitive examinations. However, there was a tendency for greater improvement in memory, language, and visuospatial abilities for the intervention group as compared with controls. Patients with mild cognitive impairment showed improvements in language and visuospatial capacity, while patients with dementia showed improvements in attention/orientation, memory, language, and fluency. However, none of these findings were statistically significant. The results for the intervention subgroups showed that visuospatial ability improvement was significantly greater among those with a global cortical atrophy score of ≤15 (p < 0.05). Human-computer interaction-based comprehensive training may improve cognitive functions among cognitively impaired elderly individuals. The training effect was most prominent among those with milder cerebral atrophy.

  5. Environmental materials and interfaces

    SciTech Connect

    Not Available

    1991-11-01

    A workshop that explored materials and interfaces research needs relevant to national environmental concerns was conducted at Pacific Northwest Laboratory. The purposes of the workshop were to refine the scientific research directions being planned for the Materials and Interface Program in the Molecular Science Research Center (MSRC) and further define the research and user equipment to be included as part of the proposed Environmental and Molecular Science Laboratory (EMSL). Three plenary information sessions served to outline the background, objectives, and status of the MSRC and EMSL initiatives; selected specific areas with environmentally related materials; and the status of capabilities and facilities planned for the EMSL. Attention was directed to four areas where materials and interface science can have a significant impact on prevention and remediation of environmental problems: in situ detection and characterization of hazardous wastes (sensors), minimization of hazardous waste (separation membranes, ion exchange materials, catalysts), waste containment (encapsulation and barrier materials), and fundamental understanding of contaminant transport mechanisms. During all other sessions, the participants were divided into three working groups for detailed discussion and the preparation of a written report. The working groups focused on the areas of interface structure and chemistry, materials and interface stability, and materials synthesis. These recommendations and suggestions for needed research will be useful for other researchers in proposing projects and for suggesting collaborative work with MSRC researchers. 1 fig.

  6. Interface Analysis of ID Systems.

    ERIC Educational Resources Information Center

    Gentry, Castelle G.; Trimby, Madeline J.

    This chapter considers methods of interface analysis, the stage in the instructional development process that involves the identification, interpretation, and prioritization of essential points of contact among systems and subsystem boundaries. The structure of interfaces, types of interfaces, interface characteristics, and a procedural model for…

  7. Guidelines for the integration of audio cues into computer user interfaces

    SciTech Connect

    Sumikawa, D.A.

    1985-06-01

    Throughout the history of computers, vision has been the main channel through which information is conveyed to the computer user. As the complexities of man-machine interactions increase, more and more information must be transferred from the computer to the user and then successfully interpreted by the user. A logical next step in the evolution of the computer-user interface is the incorporation of sound, thereby bringing the sense of 'hearing' into the computer experience. This allows our visual and auditory capabilities to work naturally together in unison, leading to more effective and efficient interpretation of all information received by the user from the computer. This thesis presents an initial set of guidelines to assist interface developers in designing an effective sight and sound user interface. This study is a synthesis of various aspects of sound, human communication, computer-user interfaces, and psychoacoustics. We introduce the notion of an earcon. Earcons are audio cues used in the computer-user interface to provide information and feedback to the user about some computer object, operation, or interaction. A possible construction technique for earcons, the use of earcons in the interface, how earcons are learned and remembered, and the effects of earcons on their users are investigated. This study takes the point of view that earcons are a language and human-computer communication issue and are therefore analyzed according to the three dimensions of linguistics: syntactics, semantics, and pragmatics.

  8. High temperature interface superconductivity

    DOE PAGES

    Gozar, A.; Bozovic, I.

    2016-01-20

    High-Tc superconductivity at interfaces has a history of more than a couple of decades. In this review we focus our attention on copper-oxide based heterostructures and multi-layers. We first discuss the technique, atomic layer-by-layer molecular beam epitaxy (ALL-MBE) engineering, that enabled High-Tc Interface Superconductivity (HT-IS), and the challenges associated with the realization of high quality interfaces. Then we turn our attention to the experiments which shed light on the structure and properties of interfacial layers, allowing comparison to those of single-phase films and bulk crystals. Both ‘passive’ hetero-structures as well as surface-induced effects by external gating are discussed. We conclude by comparing HT-IS in cuprates and in other classes of materials, especially Fe-based superconductors, and by examining the grand challenges currently lying ahead for the field.

  9. High temperature interface superconductivity

    SciTech Connect

    Gozar, A.; Bozovic, I.

    2016-01-20

    High-Tc superconductivity at interfaces has a history of more than a couple of decades. In this review we focus our attention on copper-oxide based heterostructures and multi-layers. We first discuss the technique, atomic layer-by-layer molecular beam epitaxy (ALL-MBE) engineering, that enabled High-Tc Interface Superconductivity (HT-IS), and the challenges associated with the realization of high quality interfaces. Then we turn our attention to the experiments which shed light on the structure and properties of interfacial layers, allowing comparison to those of single-phase films and bulk crystals. Both ‘passive’ hetero-structures as well as surface-induced effects by external gating are discussed. Here, we conclude by comparing HT-IS in cuprates and in other classes of materials, especially Fe-based superconductors, and by examining the grand challenges currently lying ahead for the field.

  10. High temperature interface superconductivity

    NASA Astrophysics Data System (ADS)

    Gozar, A.; Bozovic, I.

    2016-02-01

    High-Tc superconductivity at interfaces has a history of more than a couple of decades. In this review we focus our attention on copper-oxide based heterostructures and multi-layers. We first discuss the technique, atomic layer-by-layer molecular beam epitaxy (ALL-MBE) engineering, that enabled High-Tc Interface Superconductivity (HT-IS), and the challenges associated with the realization of high quality interfaces. Then we turn our attention to the experiments which shed light on the structure and properties of interfacial layers, allowing comparison to those of single-phase films and bulk crystals. Both 'passive' hetero-structures as well as surface-induced effects by external gating are discussed. We conclude by comparing HT-IS in cuprates and in other classes of materials, especially Fe-based superconductors, and by examining the grand challenges currently lying ahead for the field.

  11. An Abstract Data Interface

    NASA Astrophysics Data System (ADS)

    Allan, D. J.

    The Abstract Data Interface (ADI) is a system within which both abstract data models and their mappings on to file formats can be defined. The data model system is object-oriented and closely follows the Common Lisp Object System (CLOS) object model. Programming interfaces in both C and Fortran are supplied, and are designed to be simple enough for use by users with limited software skills. The prototype system supports access to those FITS formats most commonly used in the X-ray community, as well as the Starlink NDF data format. New interfaces can be rapidly added to the system---these may communicate directly with the file system, other ADI objects or elsewhere (e.g., a network connection).
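
    A minimal sketch of the idea of one abstract data model with pluggable file mappings follows, in Python rather than the ADI's C and Fortran interfaces; the format names and reader functions are hypothetical and do not mirror the ADI API.

      from typing import Callable, Dict

      _READERS: Dict[str, Callable[[str], dict]] = {}

      def register_format(name: str, reader: Callable[[str], dict]) -> None:
          """New interfaces are added by registering a reader for a format."""
          _READERS[name] = reader

      def open_dataset(path: str, fmt: str) -> dict:
          """Applications see one abstract dataset regardless of the file format."""
          try:
              return _READERS[fmt](path)
          except KeyError:
              raise ValueError(f"no reader registered for format {fmt!r}")

      # Two toy mappings standing in for, e.g., FITS and NDF access layers.
      register_format("fits", lambda p: {"source": p, "model": "table"})
      register_format("ndf", lambda p: {"source": p, "model": "n-dim array"})

      print(open_dataset("obs001.fits", "fits"))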

  12. Magnetic multilayer interface anisotropy

    SciTech Connect

    Pechan, M.J.

    1992-01-01

    Ni/Mo and Ni/V multilayer magnetic anisotropy has been investigated as a function of Ni layer thickness, frequency and temperature. Variable frequency ferromagnetic resonance (FMR) measurements show, for the first time, significant frequency dependence associated with the multilayer magnetic anisotropy. The thickness dependence allows one to extract the interface contribution from the total anisotropy. Temperature dependent FMR (9 GHz) and room temperature magnetization indicate that strain between Ni and the non-magnetic layers is contributing significantly to the source of the interface anisotropy and the state of the interfacial magnetization. In order to examine the interface properties of other transition metal multilayer systems, investigations on Fe/Cu are underway and CoCr/Ag is being proposed. ESR measurements have been reported on Gd substituted YBaCuO superconductors and a novel quasi-equilibrium method has been developed to determine quickly and precisely the transition temperature.

  13. Electronic Structure of Semiconductor Interfaces.

    DTIC Science & Technology

    1984-11-01

    ... no localized interface states in the thermal gap if all the Si atoms at the interface are saturated. In a second paper, we showed how localized ... interface states. Various authors have called attention to the fact that there is often a sharp peak in the density of Si/SiO2 interface states ... generating bulk amorphous Si clusters from random hard-sphere configurations. Finally, the local electronic density of states near the interface is ...

  14. Optical encryption interface

    NASA Technical Reports Server (NTRS)

    Jackson, Deborah J. (Inventor)

    1998-01-01

    An analog optical encryption system based on phase scrambling of two-dimensional optical images and holographic transformation for achieving large encryption keys and high encryption speed. An enciphering interface uses a spatial light modulator for converting a digital data stream into a two dimensional optical image. The optical image is further transformed into a hologram with a random phase distribution. The hologram is converted into digital form for transmission over a shared information channel. A respective deciphering interface at a receiver reverses the encrypting process by using a phase conjugate reconstruction of the phase scrambled hologram.

  15. Modal Interfaces in Hawaii

    NASA Technical Reports Server (NTRS)

    Wright, E. Alvey

    1974-01-01

    Hawaii, an archipelago where transportation distances are short but the interfaces are many, seeks elimination of modal changes by totally-submerged hydrofoil craft operating at the water surface directly between tourist resort destinations, by dual mode rapid transit vehicles operating directly between the deplaning bridges at Honolulu International Airport and hotel porte-cochere at Waikiki, by demand responsive vehicles for collection and distribution operating on fixed guideways for line haul, and by roll-on/roll-off inter-island ferries for all models of manually operated ground vehicles. The paper also describes facilitation of unavoidable interfaces by innovative sub-systems.

  16. Virtual interface environment workstations

    NASA Technical Reports Server (NTRS)

    Fisher, S. S.; Wenzel, E. M.; Coler, C.; Mcgreevy, M. W.

    1988-01-01

    A head-mounted, wide-angle, stereoscopic display system controlled by operator position, voice and gesture has been developed at NASA's Ames Research Center for use as a multipurpose interface environment. This Virtual Interface Environment Workstation (VIEW) system provides a multisensory, interactive display environment in which a user can virtually explore a 360-degree synthesized or remotely sensed environment and can viscerally interact with its components. Primary applications of the system are in telerobotics, management of large-scale integrated information systems, and human factors research. System configuration, research scenarios, and research directions are described.

  17. Profile Interface Generator

    SciTech Connect

    2013-11-09

    The Profile Interface Generator (PIG) is a tool for loosely coupling applications and performance tools. It enables applications to write code that looks like standard C and Fortran functions calls, without requiring that applications link to specific implementations of those function calls. Performance tools can register with PIG in order to listen to only the calls that give information they care about. This interface reduces the build and configuration burden on application developers and allows semantic instrumentation to live in production codes without interfering with production runs.
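
    The loose coupling described above can be pictured as a small publish/subscribe layer: the application calls plain instrumentation functions, and a tool only hears the events it registered for. The Python sketch below is illustrative; the function and event names are hypothetical and are not PIG's actual API.

      from collections import defaultdict
      from typing import Callable

      _LISTENERS = defaultdict(list)

      def register_listener(event: str, callback: Callable[..., None]) -> None:
          """Called by a performance tool at start-up for events it cares about."""
          _LISTENERS[event].append(callback)

      def emit(event: str, **data) -> None:
          """Called from application code; a no-op when no tool is listening."""
          for callback in _LISTENERS[event]:
              callback(**data)

      # A tool that only wants timestep boundaries:
      register_listener("timestep_end", lambda step: print(f"tool saw step {step}"))

      # The application keeps these calls even in production runs:
      for step in range(3):
          emit("timestep_end", step=step)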

  18. Virtual interface environment workstations

    NASA Technical Reports Server (NTRS)

    Fisher, S. S.; Wenzel, E. M.; Coler, C.; Mcgreevy, M. W.

    1988-01-01

    A head-mounted, wide-angle, stereoscopic display system controlled by operator position, voice and gesture has been developed at NASA's Ames Research Center for use as a multipurpose interface environment. This Virtual Interface Environment Workstation (VIEW) system provides a multisensory, interactive display environment in which a user can virtually explore a 360-degree synthesized or remotely sensed environment and can viscerally interact with its components. Primary applications of the system are in telerobotics, management of large-scale integrated information systems, and human factors research. System configuration, research scenarios, and research directions are described.

  19. Modal Interfaces in Hawaii

    NASA Technical Reports Server (NTRS)

    Wright, E. Alvey

    1974-01-01

    Hawaii, an archipelago where transportation distances are short but the interfaces are many, seeks elimination of modal changes by totally-submerged hydrofoil craft operating at the water surface directly between tourist resort destinations, by dual mode rapid transit vehicles operating directly between the deplaning bridges at Honolulu International Airport and hotel porte-cochere at Waikiki, by demand responsive vehicles for collection and distribution operating on fixed guideways for line haul, and by roll-on/roll-off inter-island ferries for all models of manually operated ground vehicles. The paper also describes facilitation of unavoidable interfaces by innovative sub-systems.

  20. Soldier-Computer Interface

    DTIC Science & Technology

    2015-01-27

    ... understandable units. (5) Immediate feedback: operators should always be presented with readily understandable information so that they know ... operation, system response time, and special commands. d. Feedback: operators should always be presented with readily understandable information on ... considerations (handedness, physical strength, wearing of eyeglasses, and facility of spoken English). Table 3: Soldier-Computer Interface Criteria.

  1. A Thermistor Interface.

    ERIC Educational Resources Information Center

    Kamin, Gary D.; Dowden, Edward

    1987-01-01

    Describes the use of a precalibrated stainless steel thermistor, interfaced with an Apple computer, in chemistry experiments. Discusses the advantages of "instant" temperature readings in experiments requiring that readings be taken at certain intervals. Outlines such an experiment which investigates freezing point depressions. (TW)

  2. Interfacing the Digital.

    ERIC Educational Resources Information Center

    Dietz, Steve

    In the last 5 years, there has been at times heated debate not only about how best to present digital and specifically networked art in an institutional context but also whether to do so at all. Not all of the discussion revolves around issues of physical interfaces to such works, but their onsite presentation is a critical concern for both…

  3. Videodisc-Computer Interfaces.

    ERIC Educational Resources Information Center

    Zollman, Dean

    1984-01-01

    Lists microcomputer-videodisc interfaces currently available from 26 sources, including home use systems connected through remote control jack and industrial/educational systems utilizing computer ports and new laser reflective and stylus technology. Information provided includes computer and videodisc type, language, authoring system, educational…

  4. Photochemistry at Interfaces

    SciTech Connect

    Eisenthal, Kenneth B

    2015-02-24

    We have advanced our capabilities to investigate ultrafast excited state dynamics at a liquid interface using a pump to excite molecules to higher electronic states and then probe the subsequent time evolution of the interfacial molecules with femtosecond time delayed vibrational SFG.

  5. Virtual interface environment

    NASA Technical Reports Server (NTRS)

    Fisher, Scott S.

    1988-01-01

    A head-mounted, wide-angle, stereoscopic display system controlled by operator position, voice and gesture is under development for use as a multipurpose interface environment. Initial applications of the system are in telerobotics, data-management and human factors research. System configuration and research directions are described.

  6. Interface It Yourself.

    ERIC Educational Resources Information Center

    Westling, Bruce D.; Bahe, Margaret E.

    1986-01-01

    Describes several ways to build data collection devices for microcomputers. The interface devices connect with either the computer's game port or an analog-to-digital converter. Discusses how teachers have designed the equipment and appropriate software to use with the computer in biology teaching. (TW)

  7. Semiconductor Oxide Interface States.

    DTIC Science & Technology

    1981-05-01

    ... essentially coincides. The density-of-states curves of the interface states based on the Terman method are shown in Fig. 7 for three conditions: (i) ... terrestrial applications. A visit was made to the NASA Lewis Research Center with Dr. Brandhorst on August 31, 1979. The PI has attended DOE meetings on ...

  8. A Thermistor Interface.

    ERIC Educational Resources Information Center

    Kamin, Gary D.; Dowden, Edward

    1987-01-01

    Describes the use of a precalibrated stainless steel thermistor, interfaced with an Apple computer, in chemistry experiments. Discusses the advantages of "instant" temperature readings in experiments requiring that readings be taken at certain intervals. Outlines such an experiment which investigates freezing point depressions. (TW)

  9. Interfacing with a DMM.

    ERIC Educational Resources Information Center

    Beatty, Jim

    1985-01-01

    Suggests purchasing a digital multimeter (DMM) with an IEEE-488 option to interface an instrument to a microcomputer, indicating that a DMM is well protected from overloads and is easy to connect. An example of its use in an experiment involving hydrolysis of tertiary butyl alcohol (with program listing) is given. (JN)

  10. Interfacing with a DMM.

    ERIC Educational Resources Information Center

    Beatty, Jim

    1985-01-01

    Suggests purchasing a digital multimeter (DMM) with an IEEE-488 option to interface an instrument to a microcomputer, indicating that a DMM is well protected from overloads and is easy to connect. An example of its use in an experiment involving hydrolysis of tertiary butyl alcohol (with program listing) is given. (JN)

  11. the EXFOR interface

    SciTech Connect

    Brown, D. A.

    2011-03-10

    The x4i package is an interface to the EXFOR nuclear data library. It simplifies retrieval of EXFOR entries and can automatically parse them, allowing one to extract cross-section (and other) data in a simple, plottable format. x4i also understands and can parse the entire reaction string, allowing one to build a strategy for processing the data.

  12. Interface It Yourself.

    ERIC Educational Resources Information Center

    Westling, Bruce D.; Bahe, Margaret E.

    1986-01-01

    Describes several ways to build data collection devices for microcomputers. The interface devices connect with either the computer's game port or an analog-to-digital converter. Discusses how teachers have designed the equipment and appropriate software to use with the computer in biology teaching. (TW)

  13. Foreword: Quasicrystals at Interfaces

    NASA Astrophysics Data System (ADS)

    Fournée, Vincent; Ledieu, Julian; Thiel, Patricia

    2008-08-01

    The term 'quasicrystals' stands for quasiperiodic crystals and by no means signifies that they are imperfect crystals. Quasicrystals represent a well-ordered state of matter just like periodic crystals, characterized by diffraction peaks as sharp as those for nearly perfect crystals such as silicon. But their long range order is aperiodic, and therefore they cannot be described by the periodic repetition of a small unit cell like normal crystals. Instead, quasiperiodic structures can be described as the three-dimensional restriction of a periodic structure embedded in a hyperspace of dimension N > 3. For example, a six-dimensional cubic lattice is used to generate the icosahedral quasilattice in three dimensions. This is a general property of quasiperiodic functions, an archetype being the function f(x) = cos(x) + cos(√2x), which is the sum of two periodic functions with incommensurate periods. This function can be regarded as the restriction along the line with irrational slope y = √2x of the function F(x, y) = cos(x) + cos(y), which is periodic in the (x, y) plane. Quasicrystalline materials were discovered 25 years ago by D. Shechtman et al in rapidly solidified Al-Mn alloys. Many quasicrystals have been identified since then in binary and ternary systems. Most of them present non-crystallographic rotational symmetry like five-fold or ten-fold axes. Interest in this new class of materials was further driven by their potentially useful physical properties, either in the form of functional coatings or as reinforcement particles in composites. These practical aspects in turn raised fundamental questions about the nature of interfaces between periodic and quasiperiodic materials. Interfaces are regions of high energy compared to the bulk, where atomic positions need to be adjusted on both sides of the interface to accommodate the two different lattices. How to describe interfaces and how nature minimizes the interface energy between a periodic and a quasiperiodic

  14. Development of a Common User Interface for the Launch Decision Support System

    NASA Technical Reports Server (NTRS)

    Scholtz, Jean C.

    1991-01-01

    The Launch Decision Support System (LDSS) is software to be used by the NASA Test Director (NTD) in the firing room during countdown. This software is designed to assist the NTD with time management, that is, when to resume from a hold condition. This software will assist the NTD in making and evaluating alternate plans and will keep him advised of the existing situation. As such, the interface to this software must be designed to provide the maximum amount of information in the clearest fashion and in a timely manner. This research involves applying user interface guidelines to a mature prototype of LDSS and developing displays that will enable the users to easily and efficiently obtain information from the LDSS displays. This research also extends previous work on organizing and prioritizing human-computer interaction knowledge.

  15. Reducing wrong patient selection errors: exploring the design space of user interface techniques.

    PubMed

    Sopan, Awalin; Plaisant, Catherine; Powsner, Seth; Shneiderman, Ben

    2014-01-01

    Wrong patient selection errors are a major issue for patient safety; from ordering medication to performing surgery, the stakes are high. Widespread adoption of Electronic Health Record (EHR) and Computerized Provider Order Entry (CPOE) systems makes patient selection using a computer screen a frequent task for clinicians. Careful design of the user interface can help mitigate the problem by helping providers recall their patients' identities, accurately select their names, and spot errors before orders are submitted. We propose a catalog of twenty seven distinct user interface techniques, organized according to a task analysis. An associated video demonstrates eighteen of those techniques. EHR designers who consider a wider range of human-computer interaction techniques could reduce selection errors, but verification of efficacy is still needed.

  16. Easy-to-use interface

    SciTech Connect

    Blattner, M M; Blattner, D O; Tong, Y

    1999-04-01

    Easy-to-use interfaces are a class of interfaces that fall between public access interfaces and graphical user interfaces in usability and cognitive difficulty. We describe characteristics of easy-to-use interfaces by the properties of four dimensions: selection, navigation, direct manipulation, and contextual metaphors. Another constraint we introduced was to include as little text as possible, and what text we have will be in at least four languages. Formative evaluations were conducted to identify and isolate these characteristics. Our application is a visual interface for a home automation system intended for a diverse set of users. The design will be expanded to accommodate the visually disabled in the near future.

  17. Interface Configuration Experiment: Preliminary Results

    NASA Technical Reports Server (NTRS)

    Concus, Paul; Finn, Robert; Weislogel, Mark

    1994-01-01

    The Interface Configuration Experiment (ICE) was carried out on USML-1 to investigate liquid-gas interfaces in certain rotationally-symmetric containers having prescribed, mathematically derived shapes. These containers have the property that they admit an entire continuum of distinct equilibrium rotationally-symmetric interfaces for a given liquid volume and contact angle. Furthermore, it can be shown that none of these interfaces can be stable. It was found, after the containers were filled in orbit, that an initial equilibrium interface from the symmetric continuum re-oriented, when perturbed, to a stable interface that was not rotationally symmetric, in accordance with the mathematical theory.

  18. Control-display mapping in brain-computer interfaces.

    PubMed

    Thurlings, Marieke E; van Erp, Jan B F; Brouwer, Anne-Marie; Blankertz, Benjamin; Werkhoven, Peter

    2012-01-01

    Event-related potential (ERP) based brain-computer interfaces (BCIs) employ differences in brain responses to attended and ignored stimuli. When using a tactile ERP-BCI for navigation, mapping is required between navigation directions on a visual display and unambiguously corresponding tactile stimuli (tactors) from a tactile control device: control-display mapping (CDM). We investigated the effect of congruent (both display and control horizontal or both vertical) and incongruent (vertical display, horizontal control) CDMs on task performance, the ERP and potential BCI performance. Ten participants attended to a target (determined via CDM), in a stream of sequentially vibrating tactors. We show that congruent CDM yields best task performance, enhanced the P300 and results in increased estimated BCI performance. This suggests a reduced availability of attentional resources when operating an ERP-BCI with incongruent CDM. Additionally, we found an enhanced N2 for incongruent CDM, which indicates a conflict between visual display and tactile control orientations. Incongruency in control-display mapping reduces task performance. In this study, brain responses, task and system performance are related to (in)congruent mapping of command options and the corresponding stimuli in a brain-computer interface (BCI). Directional congruency reduces task errors, increases available attentional resources, improves BCI performance and thus facilitates human-computer interaction.

  19. Passive wireless tags for tongue controlled assistive technology interfaces.

    PubMed

    Rakibet, Osman O; Horne, Robert J; Kelly, Stephen W; Batchelor, John C

    2016-03-01

    Tongue control with low profile, passive mouth tags is demonstrated as a human-device interface by communicating values of tongue-tag separation over a wireless link. Confusion matrices are provided to demonstrate user accuracy in targeting by tongue position. Accuracy is found to increase dramatically after short training sequences with errors falling close to 1% in magnitude with zero missed targets. The rate at which users are able to learn accurate targeting with high accuracy indicates that this is an intuitive device to operate. The significance of the work is that innovative very unobtrusive, wireless tags can be used to provide intuitive human-computer interfaces based on low cost and disposable mouth mounted technology. With the development of an appropriate reading system, control of assistive devices such as computer mice or wheelchairs could be possible for tetraplegics and others who retain fine motor control capability of their tongues. The tags contain no battery and are intended to fit directly on the hard palate, detecting tongue position in the mouth with no need for tongue piercings.

  20. PREFACE: Water at interfaces Water at interfaces

    NASA Astrophysics Data System (ADS)

    Gallo, P.; Rovere, M.

    2010-07-01

    This special issue is devoted to illustrating important aspects and significant results in the field of modeling and simulation of water at interfaces with solutes or with confining substrates, focusing on a range of temperatures from ambient to supercooled. Understanding the behavior of water, in contact with different substrates and/or in solutions, is of pivotal importance for a wide range of applications in physics, chemistry and biochemistry. Simulations of confined and/or interfacial water are also relevant for testing how different its behavior is with respect to bulk water. Simulations and modeling in this field are of particular importance when studying supercooled regions where water shows anomalous properties. These considerations motivated the organization of a workshop at CECAM in the summer of 2009 which aimed to bring together scientists working with computer simulations on the properties of water in various environments with different methodologies. In this special issue, we collected a variety of interesting contributions from some of the speakers of the workshop. We have roughly classified the contributions into four groups. The papers of the first group address the properties of interfacial and confined water upon supercooling in an effort to understand the relation with anomalous behavior of supercooled bulk water. The second group deals with the specific problem of solvation. The next group deals with water in different environments by considering problems of great importance in technological and biological applications. Finally, the last group deals with quantum mechanical calculations related to the role of water in chemical processes. The first group of papers is introduced by the general paper of Stanley et al. The authors discuss recent progress in understanding the anomalies of water in bulk, nanoconfined, and biological environments. They present evidence that liquid water may display 'polymorphism', a property that can be present in

  1. Groundwater Head Control of Catchment Nitrate Export

    NASA Astrophysics Data System (ADS)

    Musolff, A.; Schmidt, C.; Rode, M.; Fleckenstein, J. H.

    2014-12-01

    Elevated nutrient fluxes from agricultural catchments affect downstream water resources. A method to assess nutrient fluxes is the evaluation of the export regime. The export regime classifies the relation between concentration and discharge and integrates mobilization as well as retention processes. Solutes can be exported chemostatically (variance of concentration << variance of discharge) or chemodynamically (variance of concentration ≥ variance of discharge). The starting point of this study is the evaluation of nitrate export regimes in a series of neighboring sub-catchments of the Central German River Bode catchment. We found an accretion pattern for nitrate, with concentration increasing as discharge increases, and thus a chemodynamic export regime. Here we follow a nested approach and take a closer look at the controls of nitrate export in the small (1.4 km2) headwater catchment of the Sauerbach stream. The Sauerbach catchment is dominated by agricultural land use and is characterized by tile drains. We hypothesize that discharge as well as nitrate export is controlled by the groundwater head variability over time. To that end we carry out a joint analysis of discharge, groundwater heads and nitrate concentrations in groundwater, tile drains and surface water. At the gauging station the nitrate export is chemodynamic, exhibiting the typical accretion pattern also found at the larger scale. Our data analysis shows that the nitrate export regime is controlled by the depth to groundwater and the groundwater head variability in two ways: discharge increases with increasing groundwater heads due to the activation of tile drains, while depth to groundwater and passage through the unsaturated zone are the major controls of aquifer nitrate concentration. At wells with a larger depth to groundwater, nitrate concentrations are significantly lower than at more shallow wells, indicating retention processes in the unsaturated zone. Therefore the concentration in the stream increases with increasing heads, since the activated tile drains carry shallow groundwater with higher nitrate concentrations. We can thus show that the export regime of nitrate provides insight into the spatial relation of discharge-producing zones and nitrate source zones within a catchment.
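
    The chemostatic/chemodynamic distinction quoted above can be turned into a simple diagnostic. The Python sketch below compares the variability of concentration and discharge via their coefficients of variation (an implementation choice on my part, since the two series have different units); the threshold and the toy data are illustrative only.

      import statistics

      def cv(values):
          return statistics.pstdev(values) / statistics.mean(values)

      def export_regime(concentration, discharge, threshold=0.5):
          """CV ratio well below 1 suggests chemostatic export; near or above 1, chemodynamic."""
          ratio = cv(concentration) / cv(discharge)
          return ("chemostatic" if ratio < threshold else "chemodynamic"), round(ratio, 2)

      # Toy paired samples: nitrate concentration (mg/L) and discharge (L/s).
      c = [12.0, 14.5, 18.0, 22.0, 25.5]
      q = [1.0, 2.0, 4.0, 8.0, 16.0]
      print(export_regime(c, q))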

  2. Immunochemistry at interfaces.

    PubMed Central

    Nygren, H; Stenberg, M

    1989-01-01

    The immunochemistry of antibody binding to solid-phase immobilized antigen is reviewed. Experimental data are compared with different theoretical models of reaction mechanisms at solid-liquid interfaces. It was found that reactions at the solid-liquid interface can become limited by the diffusion rate due to depletion of reactants close to the surface, even though the intrinsic bimolecular reaction at the surface is reaction-rate limited. The forward reaction-rate constant decreases with increasing concentration of bound antibodies at the surface, and when not limited by diffusion the forward reaction rate can be more than 1000-fold slower than the corresponding reaction in a liquid solution. Possible explanations for this phenomenon are discussed. The dissociation of bound antibodies is a slow process at solid phases. The antigen-antibody complexes formed are practically irreversible. Some evidence is presented which indicates that the stability of these complexes can be due to attractive lateral interactions between bound antibodies. PMID:2649437

  3. Multifunctional microcontrollable interface module

    NASA Astrophysics Data System (ADS)

    Spitzer, Mark B.; Zavracky, Paul M.; Rensing, Noa M.; Crawford, J.; Hockman, Angela H.; Aquilino, P. D.; Girolamo, Henry J.

    2001-08-01

    This paper reports the development of a complete eyeglass- mounted computer interface system including display, camera and audio subsystems. The display system provides an SVGA image with a 20 degree horizontal field of view. The camera system has been optimized for face recognition and provides a 19 degree horizontal field of view. A microphone and built-in pre-amp optimized for voice recognition and a speaker on an articulated arm are included for audio. An important feature of the system is a high degree of adjustability and reconfigurability. The system has been developed for testing by the Military Police, in a complete system comprising the eyeglass-mounted interface, a wearable computer, and an RF link. Details of the design, construction, and performance of the eyeglass-based system are discussed.

  4. Interface scattering in polycrystalline thermoelectrics

    SciTech Connect

    Popescu, Adrian; Haney, Paul M.

    2014-03-28

    We study the effect of electron and phonon interface scattering on the thermoelectric properties of disordered, polycrystalline materials (with grain sizes larger than the electron and phonon mean free paths). Interface scattering of electrons is treated with a Landauer approach, while that of phonons is treated with the diffuse mismatch model. The interface scattering is embedded within a diffusive model of bulk transport, and we show that, for randomly arranged interfaces, the overall system is well described by effective medium theory. Using bulk parameters similar to those of PbTe and a square barrier potential for the interface electron scattering, we identify the interface scattering parameters for which the figure of merit ZT is increased. We find the electronic scattering is generally detrimental due to a reduction in electrical conductivity; however, for sufficiently weak electronic interface scattering, ZT is enhanced due to phonon interface scattering.
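
    For reference, the figure of merit mentioned above is conventionally written as below (the standard textbook form; the paper's own notation and conventions are not reproduced here), which makes clear why a loss of electrical conductivity at interfaces can outweigh the gain from additional phonon scattering.

      % S: Seebeck coefficient, sigma: electrical conductivity, T: absolute temperature,
      % kappa_e and kappa_l: electronic and lattice contributions to the thermal conductivity.
      ZT = \frac{S^{2}\,\sigma\,T}{\kappa_{e} + \kappa_{l}}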

  5. SNE Industrial Fieldbus Interface

    NASA Technical Reports Server (NTRS)

    Lucena, Angel; Raines, Matthew; Oostdyk, Rebecca; Mata, Carlos

    2011-01-01

    Programmable logic controllers (PLCs) have very limited diagnostic and no prognostic capabilities, while current smart sensor designs do not have the capability to communicate over Fieldbus networks. The aim is to interface smart sensors with PLCs so that health and status information, such as failure mode identification and measurement tolerance, can be communicated via an industrial Fieldbus such as ControlNet. The SNE Industrial Fieldbus Interface (SIFI) is an embedded device that acts as a communication module in a networked smart sensor. The purpose is to enable a smart sensor to communicate health and status information to other devices, such as PLCs, via an industrial Fieldbus networking protocol. The SNE (Smart Network Element) is attached to a commercial off-the-shelf Anybus-S interface module through the SIFI. Numerous Anybus-S modules are available, each one designed to interface with a specific Fieldbus. Development of the SIFI focused on communications using the ControlNet protocol, but any of the Anybus-S modules can be used. The SIFI communicates with the Anybus module via a data buffer and mailbox system on the Anybus module, and supplies power to the module. The Anybus module transmits and receives data on the Fieldbus using the proper protocol. The SIFI is intended to be connected to other existing SNE modules in order to monitor the health and status of a transducer. The SIFI can also monitor aspects of its own health using an onboard watchdog timer and voltage monitors. The SIFI also has the hardware to drive a touchscreen LCD (liquid crystal display) unit for manual configuration and status monitoring.

  6. Interface Board Connector

    DTIC Science & Technology

    2011-09-20

    ... circuit board components are generally soldered. Typically, a printed circuit board is mechanically supported by a dielectric base plate. The printed ... transmission line on balun board 1. [0032] Element seat plates 12 of connectors 10 are supported by dielectric 14 and prior art partition ... As noted previously with respect to FIG. 2, the configuration of plate 12 depends on the architecture of the interface board. For illustrative ...

  7. Systems interface biology

    PubMed Central

    Doyle, Francis J; Stelling, Jörg

    2006-01-01

    The field of systems biology has attracted the attention of biologists, engineers, mathematicians, physicists, chemists and others in an endeavour to create systems-level understanding of complex biological networks. In particular, systems engineering methods are finding unique opportunities in characterizing the rich behaviour exhibited by biological systems. In the same manner, these new classes of biological problems are motivating novel developments in theoretical systems approaches. Hence, the interface between systems and biology is of mutual benefit to both disciplines. PMID:16971329

  8. Virtual button interface

    DOEpatents

    Jones, J.S.

    1999-01-12

    An apparatus and method of issuing commands to a computer by a user interfacing with a virtual reality environment are disclosed. To issue a command, the user directs gaze at a virtual button within the virtual reality environment, causing a perceptible change in the virtual button, which then sends a command corresponding to the virtual button to the computer, optionally after a confirming action is performed by the user, such as depressing a thumb switch. 4 figs.
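
    The interaction described in the abstract (gaze dwell highlights the button, a separate confirming action issues the command) can be sketched as a small state update, shown below in Python. The dwell time, the names and the thumb-switch flag are illustrative and are not taken from the patent.

      import time

      class VirtualButton:
          def __init__(self, command, dwell_seconds=0.5):
              self.command = command
              self.dwell_seconds = dwell_seconds
              self._gaze_start = None
              self.highlighted = False

          def update(self, gazed_at: bool, thumb_switch: bool, now: float):
              if not gazed_at:
                  self._gaze_start, self.highlighted = None, False
                  return None
              if self._gaze_start is None:
                  self._gaze_start = now
              # Perceptible change once the gaze has dwelt long enough.
              self.highlighted = (now - self._gaze_start) >= self.dwell_seconds
              # The command is sent only after the confirming action.
              return self.command if (self.highlighted and thumb_switch) else None

      button = VirtualButton(command="OPEN_VALVE")
      t0 = time.monotonic()
      print(button.update(gazed_at=True, thumb_switch=False, now=t0))       # None
      print(button.update(gazed_at=True, thumb_switch=True, now=t0 + 0.6))  # OPEN_VALVE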

  9. Virtual button interface

    DOEpatents

    Jones, Jake S.

    1999-01-01

    An apparatus and method of issuing commands to a computer by a user interfacing with a virtual reality environment. To issue a command, the user directs gaze at a virtual button within the virtual reality environment, causing a perceptible change in the virtual button, which then sends a command corresponding to the virtual button to the computer, optionally after a confirming action is performed by the user, such as depressing a thumb switch.

  10. Ground Station Digital Interface.

    DTIC Science & Technology

    1980-09-01

    ... reduction equipment. Aircraft evaluation studies involve the measurement of many physical quantities. Some quantities, such as vibration displacement ... requirement for aircraft performance studies. In a system for airborne data logging adopted at these laboratories, the acquired data are stored on magnetic ...

  11. Magnetic multilayer interface anisotropy

    SciTech Connect

    Pechan, M.J.

    1991-01-01

    Ni/Mo and Ni/V multilayer magnetic anisotropy has been investigated as a function of Ni layer thickness, frequency and temperature. Variable frequency ferromagnetic resonance (FMR) measurements show, for the first time, significant frequency dependence associated with the multilayer magnetic anisotropy. The thickness dependence allows one to extract the interface contribution from the total anisotropy. Temperature dependent FMR (9 GHz) and room temperature magnetization indicate that strain between Ni and the non-magnetic layers is contributing significantly to the source of the interface anisotropy and the state of the interfacial magnetization. In order to examine the interface properties of other transition metal multilayer systems, investigations on Fe/Cu are underway and CoCr/Ag is being proposed. ESR measurements have been reported on Gd substituted YBaCuO superconductors and a novel quasi-equilibrium method has been developed to determine quickly and precisely the transition temperature. During the next project period the P.I. proposes to (1) extend the variable frequency FMR measurements to low temperature, where extremely large interface anisotropies are known to obtain in Ni/Mo and Ni/V and are proposed to exist in Ni/W; (2) obtain accurate dc anisotropies via a novel, variable temperature torque magnetometer currently under construction; (3) expand upon his initial findings in Fe/Cu multilayer investigations; (4) begin anisotropy investigations on Co/Ag and CoCr/Ag multilayers where the easy magnetization direction depends upon the Cr concentration; (5) make and characterize Bi based superconductors according to resistivity, thermal conductivity and thermoelectric power and construct YBaCuO based superconducting 'loop-gap' resonators for use in his magnetic resonance work. 2 figs.

  12. Standard interface file handbook

    SciTech Connect

    Shapiro, A.; Huria, H.C. )

    1992-10-01

    This handbook documents many of the standard interface file formats that have been adopted by the US Department of Energy to facilitate communications between, and portability of, various large reactor physics and radiation transport software packages. The emphasis is on those files needed for use of the VENTURE/PC diffusion-depletion code system. File structures, contents and some practical advice on use of the various files are provided.

  13. User Interface Software Tools

    DTIC Science & Technology

    1994-08-01

    ... Mark A. Flecchia and R. Daniel Bergeron, Specifying Complex Dialogs in ALGAE, Human Factors in Computing Systems (CHI+GI '87), Toronto, Ont. ... Spreadsheet Model, Tech. Rept. GIT-GVU-93-20, Georgia Tech Graphics, Visualization and Usability Center, May 1993. ... Daniel H. H. Ingalls, "The Smalltalk ... Interactive Graphical Applications," Comm. ACM 36, 4 (April 1993), 41-55. ... Anthony Karrer and Walt Scacchi, Requirements ...

  14. Semiconductor Properties Near Interfaces.

    DTIC Science & Technology

    1980-07-31

    ... electron multiplication with a scintillation counter. This detector, described in the appendix, provides very low background without sacrifice of ... Semiconductor Properties Near Interfaces, University of Southern California, Los Angeles (D. B. Wittry, S. Y. Yin, F. Guo) ... improvements in the Ion Microprobe Mass Analyzer; in the course of the investigations an improved ion detector was developed and a microcomputer ...

  15. PINE -- Electronic mail interface

    NASA Astrophysics Data System (ADS)

    Mellor, G. R.

    The PINE mail interface is a user-friendly mail utility for Unix systems. It has been adopted by Starlink as the recommended mail utility because of its ease of use compared with the mail utilities supplied as standard with the Unix operating system. PINE is intended to be intuitive and "to be learned by exploration rather than reading manuals". Here, however, are a few brief notes to get you started.

  16. Magnetic multilayer interface anisotropy

    SciTech Connect

    Pechan, M.J.

    1990-01-01

    Ni/Mo and Ni/V multilayer magnetic anisotropy has been investigated as a function of Ni layer thickness, frequency and temperature. Variable frequency ferromagnetic resonance (FMR) measurements show, for the first time, significant frequency dependence associated with the multilayer magnetic anisotropy. The thickness dependence allows one to extract the interface contribution from the total anisotropy. Temperature dependent FMR (9 GHz) and room temperature magnetization indicate that strain between Ni and the non-magnetic layers is contributing significantly to the source of the interface anisotropy and the state of the interfacial magnetization. In order to examine the interface properties of other transition metal multilayer systems, investigations on Fe/Cu are underway and CoCr/Ag is being proposed. ESR measurements have been reported on Gd substituted YBaCuO superconductors and a novel quasi-equilibrium method has been developed to determine quickly and precisely the transition temperature. During the next project period the P.I. proposes to (1) extend the variable frequency FMR measurements to low temperature, where extremely large interface anisotropies are known to obtain in Ni/Mo and Ni/V and are proposed to exist in Ni/W; (2) obtain accurate dc anisotropies via a novel, variable temperature torque magnetometer currently under construction; (3) expand upon his initial findings in Fe/Cu multilayer investigations; (4) begin anisotropy investigations on Co/Ag and CoCr/Ag multilayers where the easy magnetization direction depends upon the Cr concentration; (5) make and characterize Bi based superconductors according to resistivity, thermal conductivity and thermoelectric power and construct YBaCuO based superconducting 'loop-gap' resonators for use in his magnetic resonance work.

  17. User Interface Design Patterns

    DTIC Science & Technology

    2010-07-01

    the beginning of our research) led us to Glade (glade.gnome.org), a cross-platform GUI builder platform that saves its descriptive files in XML format... Major consideration was initially given to Java NetBeans and Java Eclipse, and later extended to Glade.) The saved XML files fully describe... Glade-designed user interfaces. Glade libraries are available for numerous programming languages on many computing platforms. This makes the choice of

  18. Optical Neural Interfaces

    PubMed Central

    Warden, Melissa R.; Cardin, Jessica A.; Deisseroth, Karl

    2014-01-01

    Genetically encoded optical actuators and indicators have changed the landscape of neuroscience, enabling targetable control and readout of specific components of intact neural circuits in behaving animals. Here, we review the development of optical neural interfaces, focusing on hardware designed for optical control of neural activity, integrated optical control and electrical readout, and optical readout of population and single-cell neural activity in freely moving mammals. PMID:25014785

  19. Systems interface biology.

    PubMed

    Doyle, Francis J; Stelling, Jörg

    2006-10-22

    The field of systems biology has attracted the attention of biologists, engineers, mathematicians, physicists, chemists and others in an endeavour to create systems-level understanding of complex biological networks. In particular, systems engineering methods are finding unique opportunities in characterizing the rich behaviour exhibited by biological systems. In the same manner, these new classes of biological problems are motivating novel developments in theoretical systems approaches. Hence, the interface between systems and biology is of mutual benefit to both disciplines.

  20. The THOSE remote interface

    NASA Astrophysics Data System (ADS)

    Klawon, Kevin; Gold, Josh; Bachman, Kristen

    2013-05-01

    The DIA, in conjunction with the Army Research Lab (ARL), wants to create an Unmanned Ground Sensor (UGS) controller that is (a) interoperable across all controller platforms, (b) capable of easily adding new sensors, radios, and processes, and (c) backward compatible with existing UGS systems. To achieve this, a Terra Harvest controller was created that used Java JRE 1.6 and an Open Services Gateway initiative (OSGi) platform, named Terra Harvest Open Software Environment (THOSE). OSGi is an extensible framework that provides a modularized environment for deploying functionality in "bundles". These bundles can publish, discover, and share services available from other external bundles or bundles provided by the controller core. With the addition of a web GUI used for interacting with THOSE, a natural step was then to create a common remote interface that allows third-party real-time interaction with the controller. This paper provides an overview of the THOSE system and its components as well as a description of the architectural structure of the remote interface, highlighting the interactions occurring between the controller and the remote interface and its role in providing a positive user experience for managing UGSS functions.
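
    Purely as an illustration of the publish/discover service pattern described above (written in Python for brevity rather than the Java/OSGi stack THOSE actually uses; the registry, bundle, and service names are invented):

      # Minimal service-registry sketch: "bundles" publish services by name and
      # other bundles discover them at run time. This mirrors the spirit of OSGi
      # service publication/discovery; it is not the THOSE or OSGi API itself.
      class ServiceRegistry:
          def __init__(self):
              self._services = {}

          def publish(self, name, service):
              # A bundle makes one of its objects available to the rest of the system.
              self._services[name] = service

          def discover(self, name):
              # Another bundle retrieves the service if it has been published.
              return self._services.get(name)

      class GpsSensorBundle:                      # hypothetical example bundle
          def start(self, registry):
              # On activation the bundle publishes the capability it provides.
              registry.publish("sensor.gps", lambda: {"lat": 0.0, "lon": 0.0})

      registry = ServiceRegistry()
      GpsSensorBundle().start(registry)
      read_position = registry.discover("sensor.gps")
      print(read_position())                      # a controller core or web GUI would call this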

  1. ADAM -- Interface Module Reference Manual

    NASA Astrophysics Data System (ADS)

    Chipperfield, A. J.; Kelly, B. D.; Wright, S. L.

    ADAM Interface Modules provide an interface between ADAM application programs and the rest of the system. This document describes in detail the facilities available with ADAM Interface Modules and the rules for using them. It is intended as a reference manual and should shed light on some of the finer points of the ADAM parameter system. Readers requiring an introduction to Interface Modules should read SG/4.

  2. Graphic Interfaces and Online Information.

    ERIC Educational Resources Information Center

    Percival, J. Mark

    1990-01-01

    Discusses the growing importance of the use of Graphic User Interfaces (GUIs) with microcomputers and online services. Highlights include the development of graphics interfacing with microcomputers; CD-ROM databases; an evaluation of HyperCard as a potential interface to electronic mail and online commercial databases; and future possibilities.…

  3. User interfaces for voice applications.

    PubMed Central

    Kamm, C

    1995-01-01

    This paper discusses some of the aspects of task requirements, user expectations, and technological capabilities that influence the design of a voice interface and then identifies several components of user interfaces that are particularly critical in successful voice applications. Examples from several applications are provided to demonstrate how these components are used to produce effective voice interfaces. PMID:7479721

  4. User interfaces for voice applications.

    PubMed

    Kamm, C

    1995-10-24

    This paper discusses some of the aspects of task requirements, user expectations, and technological capabilities that influence the design of a voice interface and then identifies several components of user interfaces that are particularly critical in successful voice applications. Examples from several applications are provided to demonstrate how these components are used to produce effective voice interfaces.

  5. User Interfaces for Voice Applications

    NASA Astrophysics Data System (ADS)

    Kamm, Candace

    1995-10-01

    This paper discusses some of the aspects of task requirements, user expectations, and technological capabilities that influence the design of a voice interface and then identifies several components of user interfaces that are particularly critical in successful voice applications. Examples from several applications are provided to demonstrate how these components are used to produce effective voice interfaces.

  7. Thesaurus-Enhanced Search Interfaces.

    ERIC Educational Resources Information Center

    Shiri, Ali Asghar; Revie, Crawford; Chowdhury, Gobinda

    2002-01-01

    Discussion of user interfaces to information retrieval systems focuses on interfaces that incorporate thesauri as part of their searching and browsing facilities. Discusses research literature related to information searching behavior, information retrieval interface evaluation, search term selection, and query expansion; and compares thesaurus…

  9. Why Mineral Interfaces Matter

    NASA Astrophysics Data System (ADS)

    Putnis, Andrew; Putnis, Christine V.

    2015-04-01

    While it is obvious that reactions between a mineral and an aqueous solution take place at the mineral-fluid interface it is only relatively recently that high spatial resolution studies have demonstrated how the local structure of the mineral surface and the chemical composition of the fluid at the interface control both the short-range and the long-range consequences of mineral-fluid interaction. Long-range consequences of fluid-mineral interaction control element cycles in the earth, the formation of ore-deposits, the chemical composition of the oceans through weathering of rocks and hence climate changes. Although weathering is clearly related to mineral dissolution, to what extent do experimentally measured dissolution rates of minerals help to understand weathering, especially weathering mechanisms? This question is related to the short-range, local reactions that take place when a mineral, that is not stable in the fluid, begins to dissolve. In this case the fluid composition at the interface will become supersaturated with respect to a different phase or phases. This may be a different composition of the same mineral e.g. a Ca-rich feldspar dissolving in a Na-rich solution results in a fluid at the interface which may be supersaturated with respect to an Na-rich feldspar. Alternatively, the interfacial fluid could be supersaturated with respect to a different mineral e.g. an Na-rich zeolite, depending on the temperature. Numerous experiments have shown that the precipitation of a more stable phase at the mineral-fluid interface results in a coupling between the dissolution and the precipitation, and the replacement of one mineral by another. This process separates the short-range mechanisms which depend only on the composition of the interfacial solution, and the long-range consequences that depend on the composition of the residual fluid released from the reacting parent mineral. Typically such residual fluids may carry metal ions tens to hundreds of

  10. Human-computer interaction in radiotherapy target volume delineation: a prospective, multi-institutional comparison of user input devices.

    PubMed

    2011-10-01

    The purpose of this study was the prospective comparison of objective and subjective effects of target volume region of interest (ROI) delineation using mouse-keyboard and pen-tablet user input devices (UIDs). The study was designed as a prospective test/retest sequence, with the Wilcoxon signed rank test for matched-pair comparison. Twenty-one physician-observers contoured target volume ROIs on four standardized cases (representative of brain, prostate, lung, and head and neck malignancies) twice: once using a QWERTY keyboard/scroll-wheel mouse UID and once with a pen-tablet UID (DTX2100, Wacom Technology Corporation, Vancouver, WA, USA). Active task time, ROI manipulation task data, and subjective survey data were collected. One hundred twenty-nine target volume ROI sets were collected, with 62 paired pen-tablet/mouse-keyboard sessions. Active contouring time was reduced using the pen-tablet UID, with mean ± SD active contouring time of 26 ± 23 min, compared with 32 ± 25 min with the mouse (p ≤ 0.01). Subjective estimation of time spent was also reduced, from 31 ± 26 min with the mouse to 27 ± 22 min with the pen (p = 0.02). Task analysis showed a reduction in ROI correction tasks (p = 0.045) and decreased panning and scrolling tasks (p < 0.01) with the pen-tablet; drawing, window/level changes, and zoom commands were unchanged (p = n.s.). Volumetric analysis demonstrated no detectable differences in ROI volume or in intra- or inter-observer volumetric coverage. Fifty-two of 62 (84%) users preferred the tablet for each contouring task; 5 of 62 (8%) denoted no preference, and 5 of 62 (8%) chose the mouse interface. The pen-tablet UID reduced active contouring time and reduced correction of ROIs, without substantially altering ROI volume/coverage.
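
    For readers unfamiliar with the matched-pair analysis used here, the paired comparison of active contouring times can be reproduced in outline with a Wilcoxon signed-rank test; the sketch below uses SciPy and invented timing values purely for illustration (they are not the study's data):

      # Paired comparison of contouring times (minutes) for the same observers
      # using two input devices; the values below are made up for illustration.
      from scipy.stats import wilcoxon

      pen_tablet_min = [22, 31, 18, 40, 27, 25, 30, 19]
      mouse_keyboard = [29, 35, 20, 47, 31, 33, 36, 22]

      # Wilcoxon signed-rank test on the matched pairs (two-sided by default).
      statistic, p_value = wilcoxon(pen_tablet_min, mouse_keyboard)
      print(f"W = {statistic:.1f}, p = {p_value:.3f}")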

  11. Productivity issues at organizational interfaces

    NASA Technical Reports Server (NTRS)

    Holland, A. W.

    1985-01-01

    The need for close interdependence between large numbers of diverse and specialized work groups makes the Space Program extremely vulnerable to loss of productivity at organizational interfaces. Trends within the program also suggest that the number and diversity of interfaces will grow in the near term. Continued maintenance of R&D excellence will require that interface performance issues be included in any future productivity improvement effort. The types and characteristics of organizational interfaces are briefly presented, followed by a review of factors which impact their productivity. Approaches to assessing and improving interface effectiveness are also discussed.

  12. Single-interface Casimir torque

    NASA Astrophysics Data System (ADS)

    Morgado, Tiago A.; Silveirinha, Mário G.

    2016-10-01

    A different type of Casimir-type interaction is theoretically predicted: a single-interface torque at a junction of an anisotropic material and a vacuum or another material system. The torque acts to reorient the polarizable microscopic units of the involved materials near the interface, and thus to change the internal structure of the materials. The single-interface torque depends on the zero-point energy of the interface localized and extended modes. Our theory demonstrates that the single-interface torque is essential to understand the Casimir physics of material systems with anisotropic elements and may influence the orientation of the director of nematic liquid crystals.

  13. Conceptual Framework for Aquatic Interfaces

    NASA Astrophysics Data System (ADS)

    Lewandowski, J.; Krause, S.

    2015-12-01

    Aquatic interfaces are generally characterized by steep gradients of physical, chemical and biological properties due to the contrast between the two adjacent environments. Innovative measurement techniques are required to study the spatially heterogeneous and temporally variable processes; the differing spatial and temporal scales, in particular, pose a large challenge. Due to the steep biogeochemical gradients and the intensive structural and compositional heterogeneity, enhanced biogeochemical processing rates are inherent to aquatic interfaces. Nevertheless, the effective turnover depends strongly on the residence time distribution along the flow paths and in sections with particular biogeochemical milieus and reaction kinetics. Thus, identification and characterization of the highly complex flow patterns in and across aquatic interfaces are crucial to understand biogeochemical processing along exchange flow paths and to quantify transport across aquatic interfaces. Hydrodynamic and biogeochemical processes are closely coupled at aquatic interfaces. However, interface processing rates are not only enhanced compared to the adjacent compartments that they connect; completely different reactions might also occur if certain thresholds are exceeded or the biogeochemical milieu differs significantly from the adjacent environments. Single events, temporal variability and spatial heterogeneity might increase overall processing rates of aquatic interfaces and thus should not be neglected when studying aquatic interfaces. Aquatic interfaces are key zones relevant for the ecological state of the entire ecosystem, and thus understanding interface functioning and controls is paramount for ecosystem management. The overall aim of this contribution is a general conceptual framework for aquatic interfaces that is applicable to a wide range of systems, scales and processes.

  14. Matched Interface and Boundary Method for Elasticity Interface Problems

    PubMed Central

    Wang, Bao; Xia, Kelin; Wei, Guo-Wei

    2015-01-01

    Elasticity theory is an important component of continuum mechanics and has had widespread applications in science and engineering. Material interfaces are ubiquitous in nature and man-made devices, and often give rise to discontinuous coefficients in the governing elasticity equations. In this work, the matched interface and boundary (MIB) method is developed to address elasticity interface problems. Linear elasticity theory for both isotropic homogeneous and inhomogeneous media is employed. In our approach, Lamé's parameters can have jumps across the interface and are allowed to be position dependent in modeling isotropic inhomogeneous material. Both strong discontinuity, i.e., discontinuous solution, and weak discontinuity, namely, discontinuous derivatives of the solution, are considered in the present study. In the proposed method, fictitious values are utilized so that standard central finite difference schemes can be employed regardless of the interface. Interface jump conditions are enforced on the interface, which, in turn, accurately determines the fictitious values. We design new MIB schemes to account for complex interface geometries. In particular, the cross derivatives in the elasticity equations are difficult to handle for complex interface geometries. We propose secondary fictitious values and construct geometry-based interpolation schemes to overcome this difficulty. Numerous analytical examples are used to validate the accuracy, convergence and robustness of the present MIB method for elasticity interface problems with both small and large curvatures, strong and weak discontinuities, and constant and variable coefficients. Numerical tests indicate second order accuracy in both L∞ and L2 norms. PMID:25914439
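
    To make the fictitious-value idea concrete, consider the simplest one-dimensional analogue (a schematic illustration only, not the authors' full two- and three-dimensional MIB scheme). For an interface at x = \alpha lying between grid points x_i and x_{i+1}, with prescribed jumps in the solution and its flux,

      \frac{d^2 u}{dx^2}\bigg|_{x_i} \approx \frac{u_{i-1} - 2u_i + \tilde{u}_{i+1}}{h^2},
      \qquad
      [u]_{x=\alpha} = a, \quad [\beta u_x]_{x=\alpha} = b,

    the solution is not smooth across \alpha, so the usual stencil value u_{i+1} is replaced by a fictitious value \tilde{u}_{i+1}: the value a smooth extension of the left-side solution would take at x_{i+1}, fixed by enforcing the two jump conditions. The standard central difference is then applied unchanged, which is the general mechanism by which MIB-type schemes retain the order of the underlying stencil across the interface.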

  15. Matched Interface and Boundary Method for Elasticity Interface Problems.

    PubMed

    Wang, Bao; Xia, Kelin; Wei, Guo-Wei

    2015-09-01

    Elasticity theory is an important component of continuum mechanics and has had widespread applications in science and engineering. Material interfaces are ubiquitous in nature and man-made devices, and often give rise to discontinuous coefficients in the governing elasticity equations. In this work, the matched interface and boundary (MIB) method is developed to address elasticity interface problems. Linear elasticity theory for both isotropic homogeneous and inhomogeneous media is employed. In our approach, Lamé's parameters can have jumps across the interface and are allowed to be position dependent in modeling isotropic inhomogeneous material. Both strong discontinuity, i.e., discontinuous solution, and weak discontinuity, namely, discontinuous derivatives of the solution, are considered in the present study. In the proposed method, fictitious values are utilized so that standard central finite difference schemes can be employed regardless of the interface. Interface jump conditions are enforced on the interface, which, in turn, accurately determines the fictitious values. We design new MIB schemes to account for complex interface geometries. In particular, the cross derivatives in the elasticity equations are difficult to handle for complex interface geometries. We propose secondary fictitious values and construct geometry-based interpolation schemes to overcome this difficulty. Numerous analytical examples are used to validate the accuracy, convergence and robustness of the present MIB method for elasticity interface problems with both small and large curvatures, strong and weak discontinuities, and constant and variable coefficients. Numerical tests indicate second order accuracy in both L∞ and L2 norms.

  16. NESSUS/NASTRAN Interface

    NASA Technical Reports Server (NTRS)

    Millwater, Harry; Riha, David

    1996-01-01

    The NESSUS probabilistic analysis computer program has been developed with a built-in finite element analysis program NESSUS/FEM. However, the NESSUS/FEM program is specialized for engine structures and may not contain sufficient features for other applications. In addition, users often become well acquainted with a particular finite element code and want to use that code for probabilistic structural analysis. For these reasons, this work was undertaken to develop an interface between NESSUS and NASTRAN such that NASTRAN can be used for the finite element analysis and NESSUS can be used for the probabilistic analysis. In addition, NESSUS was restructured such that other finite element codes could be more easily coupled with NESSUS. NESSUS has been enhanced such that NESSUS will modify the NASTRAN input deck for a given set of random variables, run NASTRAN and read the NASTRAN result. The coordination between the two codes is handled automatically. The work described here was implemented within NESSUS 6.2 which was delivered to NASA in September 1995. The code runs on Unix machines: Cray, HP, Sun, SGI and IBM. The new capabilities have been implemented such that a user familiar with NESSUS using NESSUS/FEM and NASTRAN can immediately use NESSUS with NASTRAN. In other words, the interface with NASTRAN has been implemented in an analogous manner to the interface with NESSUS/FEM. Only finite element specific input has been changed. This manual is written as an addendum to the existing NESSUS 6.2 manuals. We assume users have access to NESSUS manuals and are familiar with the operation of NESSUS including probabilistic finite element analysis. Update pages to the NESSUS PFEM manual are contained in Appendix E. The finite element features of the code and the probabilistic analysis capabilities are summarized.
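
    The coupling loop described above (substitute sampled random-variable values into the finite element input deck, run the external solver, read back the result) can be sketched generically as follows; the file names, placeholder syntax, solver command, and result parsing are all invented for illustration and are not the actual NESSUS/NASTRAN conventions:

      # Generic sketch of a probabilistic wrapper around an external FE solver.
      # Placeholder tokens such as {{youngs_modulus}} in the template deck, the
      # solver command, and the result-file format are hypothetical.
      import subprocess
      from pathlib import Path

      def run_fe_analysis(random_values, template="model_template.bdf"):
          deck = Path(template).read_text()
          for name, value in random_values.items():
              deck = deck.replace("{{" + name + "}}", f"{value:.6e}")
          Path("model_run.bdf").write_text(deck)

          # Launch the external finite element code on the modified input deck.
          subprocess.run(["fe_solver", "model_run.bdf"], check=True)

          # Read back a single scalar response (e.g. a maximum stress); a real
          # interface would parse the solver's own results file format.
          return float(Path("model_run.out").read_text().split()[-1])

      response = run_fe_analysis({"youngs_modulus": 2.1e11, "load": 1.5e4})
      print("response =", response)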

  17. Films of bacteria at interfaces.

    PubMed

    Vaccari, Liana; Molaei, Mehdi; Niepa, Tagbo H R; Lee, Daeyeon; Leheny, Robert L; Stebe, Kathleen J

    2017-09-01

    Bacteria are often discussed as active colloids, self-propelled organisms whose collective motion can be studied in the context of non-equilibrium statistical mechanics. In such studies, the behavior of bacteria confined to interfaces or in the proximity of an interface plays an important role. For instance, many studies have probed collective behavior of bacteria in quasi two-dimensional systems such as soap films. Since fluid interfaces can adsorb surfactants and other materials, the stress and velocity boundary conditions at interfaces can alter bacteria motion; hydrodynamic studies of interfaces with differing boundary conditions are reviewed. Also, bacteria in bulk can become trapped at or near fluid interfaces, where they colonize and form structures comprising secretions like exopolysaccharides, surfactants, living and dead bacteria, thereby creating Films of Bacteria at Interfaces (FBI). The formation of FBI is discussed at air-water, oil-water, and water-water interfaces, with an emphasis on film mechanics, and with some allusion to genetic functions guiding bacteria to restructure fluid interfaces. At air-water interfaces, bacteria form pellicles or interfacial biofilms. Studies are reviewed that reveal that pellicle material properties differ for different strains of bacteria, and that pellicle physicochemistry can act as a feedback mechanism to regulate film formation. At oil-water interfaces, a range of FBI form, depending on bacteria strain. Some bacteria-laden interfaces age from an initial active film, with dynamics dominated by motile bacteria, through viscoelastic states, to form an elastic film. Others remain active with no evidence of elastic film formation even at significant interface ages. Finally, bacteria can adhere to and colonize ultra-low surface tension interfaces such as aqueous-aqueous systems common in food industries. Relevant literature is reviewed, and areas of interest for potential application are discussed, ranging from health

  18. Dynamics of curved interfaces

    SciTech Connect

    Escudero, Carlos

    2009-08-15

    Stochastic growth phenomena on curved interfaces are studied by means of stochastic partial differential equations. These are derived as counterparts of linear planar equations on a curved geometry after a reparametrization invariance principle has been applied. We examine differences and similarities with the classical planar equations. Some characteristic features are the loss of correlation through time and a particular behavior of the average fluctuations. Dependence on the metric is also explored. The diffusive model that propagates correlations ballistically in the planar situation is particularly interesting, as this propagation becomes nonuniversal in the new regime.

  19. Virtual interface environment

    NASA Technical Reports Server (NTRS)

    Fisher, Scott S.

    1986-01-01

    A head-mounted, wide-angle, stereoscopic display system controlled by operator position, voice and gesture has been developed for use as a multipurpose interface environment. The system provides a multisensory, interactive display environment in which a user can virtually explore a 360-degree synthesized or remotely sensed environment and can viscerally interact with its components. Primary applications of the system are in telerobotics, management of large-scale integrated information systems, and human factors research. System configuration, application scenarios, and research directions are described.

  20. Bidirectional Neural Interfaces

    PubMed Central

    Masters, Matthew R.; Thakor, Nitish V.

    2016-01-01

    A bidirectional neural interface is a device that transfers information into and out of the nervous system. This class of devices has potential to improve treatment and therapy in several patient populations. Progress in very-large-scale integration (VLSI) has advanced the design of complex integrated circuits. System-on-chip (SoC) devices are capable of recording neural electrical activity and altering natural activity with electrical stimulation. Often, these devices include wireless powering and telemetry functions. This review presents the state of the art of bidirectional circuits as applied to neuroprosthetic, neurorepair, and neurotherapeutic systems. PMID:26753776

  1. NESSUS/NASTRAN Interface

    NASA Technical Reports Server (NTRS)

    Millwater, Harry; Riha, David

    1996-01-01

    The NESSUS and NASTRAN computer codes were successfully integrated. The enhanced NESSUS code will use NASTRAN for the structural analysis and NESSUS for the probabilistic analysis. Any quantities in the NASTRAN bulk data input can be random variables. Any NASTRAN result that is written to the output2 file can be returned to NESSUS as the finite element result. The interfacing between NESSUS and NASTRAN is handled automatically by NESSUS. NESSUS and NASTRAN can be run on different machines using the remote host option.

  2. Adhesion at metal interfaces

    NASA Technical Reports Server (NTRS)

    Banerjea, Amitava; Ferrante, John; Smith, John R.

    1991-01-01

    A basic adhesion process is defined, the theory of the properties influencing metallic adhesion is outlined, and theoretical approaches to the interface problem are presented, with emphasis on first-principle calculations as well as jellium-model calculations. The computation of the energies of adhesion as a function of the interfacial separation is performed; fully three-dimensional calculations are presented, and universality in the shapes of the binding energy curves is considered. An embedded-atom method and equivalent-crystal theory are covered in the framework of issues involved in practical adhesion.
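
    The "universality in the shapes of the binding energy curves" mentioned above is usually expressed through the scaled universal binding-energy relation associated with Rose, Smith, and Ferrante; in its commonly quoted form (reproduced here for orientation, not taken from this article),

      E(a) = \Delta E \, E^{*}(a^{*}),
      \qquad
      E^{*}(a^{*}) = -(1 + a^{*})\, e^{-a^{*}},
      \qquad
      a^{*} = \frac{a - a_0}{\ell},

    where a is the interfacial separation, a_0 its equilibrium value, \Delta E the equilibrium adhesive binding energy, and \ell a characteristic screening length; widely different metal pairs are found to collapse onto this single scaled curve.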

  3. Multiple network interface core apparatus and method

    SciTech Connect

    Underwood, Keith D; Hemmert, Karl Scott

    2011-04-26

    A network interface controller and network interface control method comprising providing a single integrated circuit as a network interface controller and employing a plurality of network interface cores on the single integrated circuit.

  4. Mercury Shopping Cart Interface

    NASA Technical Reports Server (NTRS)

    Pfister, Robin; McMahon, Joe

    2006-01-01

    Mercury Shopping Cart Interface (MSCI) is a reusable component of the Power User Interface 5.0 (PUI) program described in another article. MSCI is a means of encapsulating the logic and information needed to describe an orderable item consistent with Mercury Shopping Cart service protocol. Designed to be used with Web-browser software, MSCI generates Hypertext Markup Language (HTML) pages on which ordering information can be entered. MSCI comprises two types of Practical Extraction and Report Language (PERL) modules: template modules and shopping-cart logic modules. Template modules generate HTML pages for entering the required ordering details and enable submission of the order via a Hypertext Transfer Protocol (HTTP) post. Shopping cart modules encapsulate the logic and data needed to describe an individual orderable item to the Mercury Shopping Cart service. These modules evaluate information entered by the user to determine whether it is sufficient for the Shopping Cart service to process the order. Once an order has been passed from MSCI to a deployed Mercury Shopping Cart server, there is no further interaction with the user.

  5. Mysteries at Ice Interfaces

    NASA Astrophysics Data System (ADS)

    Fain, Samuel C., Jr.

    1996-03-01

    Michael Faraday noted that "two pieces of thawing ice, if put together, adhere and become one...the effect will take place in air, or in water, or in vacuo." Why? He proposed that "a particle of water, which could retain the liquid state whilst touching ice only on one side, could not retain the liquid state if it were touched by ice on both sides." [M. Faraday, Proc. Roy. Soc. London 10, 440 (1860)] The existence of special properties at interfaces of ice is generally agreed and has important environmental consequences [J. G. Dash, H. Fu, and J. S. Wettlaufer, Rep. Prog. Phys. 58, 115 (1995)]. Why do different experiments infer different properties for this layer? Impurities and electric fields at the interfaces may be responsible for some of the variations in experimental results [V. F. Petrenko, U.S. Army Cold Regions Research and Engineering Laboratory Report 94-22 (1994)]. Some background on the physical properties of ice will be discussed, including recent force microscopy measurements done at the University of Washington [C. R. Slaughterbeck, E. W. Kukes, B. Pittenger, D. J. Cook, P. C. Williams, V. L. Eden, S. C. Fain, Jr., J. Vac. Sci. Technol. (in press)]. Supported by NSF Grant DMR-91-19701.

  6. Engineering graded tissue interfaces.

    PubMed

    Phillips, Jennifer E; Burns, Kellie L; Le Doux, Joseph M; Guldberg, Robert E; García, Andrés J

    2008-08-26

    Interfacial zones between tissues provide specialized, transitional junctions central to normal tissue function. Regenerative medicine strategies focused on multiple cell types and/or bi/tri-layered scaffolds do not provide continuously graded interfaces, severely limiting the integration and biological performance of engineered tissue substitutes. Inspired by the bone-soft tissue interface, we describe a biomaterial-mediated gene transfer strategy for spatially regulated genetic modification and differentiation of primary dermal fibroblasts within tissue-engineered constructs. We demonstrate that zonal organization of osteoblastic and fibroblastic cellular phenotypes can be engineered by a simple, one-step seeding of fibroblasts onto scaffolds containing a spatial distribution of retrovirus encoding the osteogenic transcription factor Runx2/Cbfa1. Gradients of immobilized retrovirus, achieved via deposition of controlled poly(L-lysine) densities, resulted in spatial patterns of transcription factor expression, osteoblastic differentiation, and mineralized matrix deposition. Notably, this graded distribution of mineral deposition and mechanical properties was maintained when implanted in vivo in an ectopic site. Development of this facile and robust strategy is significant toward the regeneration of continuous interfacial zones that mimic the cellular and microstructural characteristics of native tissue.

  7. Surface inspection operator interface

    NASA Astrophysics Data System (ADS)

    Creek, Russell C.

    1992-03-01

    Surface inspection systems are widely used in many industries including steel, tin, aluminum, and paper. These systems generally use machine vision technology to detect defective surface regions and can generate very high data output rates which can be difficult for line operators to absorb and use. A graphical, windowing interface is described which provides the operators with an overview of the surface quality of the inspected web while still allowing them to select individual defective regions for display. A touch screen is used as the only operator input. This required alterations to some screen widgets due to subtle ergonomic differences of touch screen input over mouse input. The interface, although developed for inspecting coated steel, has been designed to be adaptable to other surface inspection applications. Facility is provided to allow the detection, classification, and display functions of the inspection system to be readily changed. Modifications can be implemented on two main levels; changes that reflect the configuration of the hardware system and control the detection and classification components of the surface inspection system are accessible only to authorized staff while those affecting the display and alarm settings of defect types may be changed by operators and this can generally be done dynamically.

  8. Laparoscopic simulation interface

    DOEpatents

    Rosenberg, Louis B.

    2006-04-04

    A method and apparatus for providing high bandwidth and low noise mechanical input and output for computer systems. A gimbal mechanism provides two revolute degrees of freedom to an object about two axes of rotation. A linear axis member is coupled to the gimbal mechanism at the intersection of the two axes of rotation. The linear axis member is capable of being translated along a third axis to provide a third degree of freedom. The user object is coupled to the linear axis member and is thus translatable along the third axis so that the object can be moved along all three degrees of freedom. Transducers associated with the provided degrees of freedom include sensors and actuators and provide an electromechanical interface between the object and a digital processing system. Capstan drive mechanisms transmit forces between the transducers and the object. The linear axis member can also be rotated about its lengthwise axis to provide a fourth degree of freedom, and, optionally, a floating gimbal mechanism is coupled to the linear axis member to provide fifth and sixth degrees of freedom to an object. Transducer sensors are associated with the fourth, fifth, and sixth degrees of freedom. The interface is well suited for simulations of medical procedures and simulations in which an object such as a stylus or a joystick is moved and manipulated by the user.

  9. Thermal interface conductance across metal alloy-dielectric interfaces

    NASA Astrophysics Data System (ADS)

    Freedman, Justin P.; Yu, Xiaoxiao; Davis, Robert F.; Gellman, Andrew J.; Malen, Jonathan A.

    2016-01-01

    We present measurements of thermal interface conductance as a function of metal alloy composition. Composition spread alloy films of Au_xCu_{1-x} and Au_xPd_{1-x} solid solutions were deposited on single crystal sapphire substrates via dual electron-beam evaporation. High throughput measurements of thermal interface conductance across the (metal alloy)-sapphire interfaces were made by positional scanning of frequency domain thermoreflectance measurements to sample a continuum of Au atomic fractions (x ≈ 0 → 1). At a temperature of 300 K, the thermal interface conductance at the Au_xCu_{1-x}-sapphire interfaces monotonically decreased from 197 ± 39 MW m^-2 K^-1 to 74 ± 11 MW m^-2 K^-1 for x = 0 → 0.95 ± 0.02, and at the Au_xPd_{1-x}-sapphire interfaces from 167 ± 35 MW m^-2 K^-1 to 60 ± 10 MW m^-2 K^-1 for x = 0.03 → 0.97 ± 0.02. To shed light on the phonon physics at the interface, a Diffuse Mismatch Model for thermal interface conductance with alloys is presented and agrees reasonably with the thermal interface conductance data.
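
    For orientation, the Diffuse Mismatch Model referred to above is commonly stated in its simplest Debye form (which may differ in detail from the alloy-specific model developed in the paper) as

      \alpha_{1\to 2}(\omega) = \frac{\sum_j c_{2,j}^{-2}}{\sum_j c_{1,j}^{-2} + \sum_j c_{2,j}^{-2}},
      \qquad
      G = \frac{1}{4} \sum_j c_{1,j} \int_0^{\omega_{\max}} \hbar\omega\, D_{1,j}(\omega)\, \alpha_{1\to 2}(\omega)\, \frac{\partial n(\omega,T)}{\partial T}\, d\omega,

    where c_{i,j} is the phonon velocity of mode j on side i, D_{1,j} the phonon density of states on side 1, and n the Bose-Einstein distribution; the assumption of fully diffuse interface scattering makes the transmission probability depend only on the phonon densities of states and velocities of the two sides.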

  10. What Do IT-People Know about the Nordic History of Computers and User Interfaces?

    NASA Astrophysics Data System (ADS)

    Jørgensen, Anker Helms

    This paper reports a preliminary, empirical exploration of what IT-people know about the history of computers and user interfaces. The principal motivation for the study is that the younger generations, such as students in IT, seem to know very little about these topics. The study employed a free association method administered by email. Eight students and four researchers participated, between 26-34 and 48-64 years of age, respectively. Responses totaled 222, and we analyzed and categorized them. First, the Nordic touch was extremely limited. Secondly, the knowledge of both students and researchers seems heavily based on personal experience, so that the researchers know much more about the earlier days of computing and interfaces. Thirdly, there is a tendency amongst the students to conceptualize the history of computers in terms of interface features and concepts. Hence, the interface seems to become the designation or even the icon for the computer. In other words, one of the key focal points in the area of human-computer interaction, namely making the computer as such invisible, seems to have been successfully met.

  11. XPI: The Xanadu Parameter Interface

    NASA Technical Reports Server (NTRS)

    White, N.; Barrett, P.; Oneel, B.; Jacobs, P.

    1992-01-01

    XPI is a table-driven parameter interface that greatly simplifies both command-driven programs, such as BROWSE and XIMAGE, and stand-alone single-task programs. It moves all of the syntax and semantic parsing of commands and parameters out of the user's code into common code and externally defined tables. This allows the programmer to concentrate on writing the code unique to the application rather than reinventing the user interface, and allows external graphical interfaces to be attached with no changes to the command-driven program. XPI also includes a compatibility library which allows programs written using the IRAF host interface (Mandel and Roll) to use XPI in place of the IRAF host interface.
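
    A table-driven parameter interface of the general kind described here can be sketched in a few lines; the command names, parameter table, and prompting behaviour below are invented for illustration and are not XPI's actual table format or API:

      # Minimal table-driven command/parameter parser: the table, not the
      # application code, defines each command's parameters, types, and defaults.
      PARAMETER_TABLE = {
          "plot": [("file", str, None), ("xmin", float, 0.0), ("xmax", float, 1.0)],
          "read": [("file", str, None), ("format", str, "fits")],
      }

      def parse_command(line):
          name, *args = line.split()
          spec = PARAMETER_TABLE[name]
          values = {}
          for (pname, ptype, default), raw in zip(spec, args + [None] * len(spec)):
              if raw is None:
                  if default is None:
                      raw = input(f"{pname}> ")   # prompt for a required parameter
                  else:
                      values[pname] = default
                      continue
              values[pname] = ptype(raw)
          return name, values

      print(parse_command("plot spectrum.dat 0 10"))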

  12. Reaction Dynamics at Liquid Interfaces

    NASA Astrophysics Data System (ADS)

    Benjamin, Ilan

    2015-04-01

    The liquid interface is a narrow, highly anisotropic region, characterized by rapidly varying density, polarity, and molecular structure. I review several aspects of interfacial solvation and show how these affect reactivity at liquid/liquid interfaces. I specifically consider ion transfer, electron transfer, and SN2 reactions, showing that solvent effects on these reactions can be understood by examining the unique structure and dynamics of the liquid interface region.

  13. Reaction dynamics at liquid interfaces.

    PubMed

    Benjamin, Ilan

    2015-04-01

    The liquid interface is a narrow, highly anisotropic region, characterized by rapidly varying density, polarity, and molecular structure. I review several aspects of interfacial solvation and show how these affect reactivity at liquid/liquid interfaces. I specifically consider ion transfer, electron transfer, and SN2 reactions, showing that solvent effects on these reactions can be understood by examining the unique structure and dynamics of the liquid interface region.

  14. Intelligent interface design and evaluation

    NASA Technical Reports Server (NTRS)

    Greitzer, Frank L.

    1988-01-01

    Intelligent interface concepts and systematic approaches to assessing their functionality are discussed. Four general features of intelligent interfaces are described: interaction efficiency, subtask automation, context sensitivity, and use of an appropriate design metaphor. Three evaluation methods are discussed: Functional Analysis, Part-Task Evaluation, and Operational Testing. Design and evaluation concepts are illustrated with examples from a prototype expert system interface for environmental control and life support systems for manned space platforms.

  15. Standardized Spacecraft Onboard Interfaces

    NASA Technical Reports Server (NTRS)

    Smith, Joseph F.; Plummer, Chris; Plancke, Patrick

    2003-01-01

    The Consultative Committee for Space Data Systems (CCSDS), an international organization of national space agencies, is branching out to provide new standards that enhance reuse of onboard spacecraft equipment and software. These Spacecraft Onboard Interface (SOIF) standards will be, in part, based on the well-known Internet protocols. This paper describes the SOIF work through three orthogonal views: the Services view, which describes data communications services; the Interoperability view, which shows how to exchange data and messages between different spacecraft elements; and the Protocol view, which describes the SOIF protocols and services. We also describe the present state of the services that will be provided to SOIF users, which are the basis of the utility of these standards.

  16. Nuclear data interface retrospective

    SciTech Connect

    Gray, Mark G

    2008-01-01

    The Nuclear Data Interface (NDI) code library and data formats are the standards for multigroup nuclear data at Los Alamos National Laboratory. NDI's analysis, design, implementation, testing, integration, and maintenance required a ten person-year and ongoing effort by the Nuclear Data Team. Their efforts provide a unique, contemporary experience in producing a standard component library. In reflection upon that experience at NDI's decennial, we have identified several factors critical to NDI's success: it addressed real problems with appropriate simplicity, it fully supported all users, it added extra value through the code to the raw nuclear data, and its team went the distance from analysis through maintenance. In this report we review these critical success factors and discuss their implications for future standardization projects.

  17. Brain-computer interfaces.

    PubMed

    Wolpaw, Jonathan R

    2013-01-01

    Brain-computer interfaces (BCIs) are systems that give their users communication and control capabilities that do not depend on muscles. The user's intentions are determined from activity recorded by electrodes on the scalp, on the cortical surface, or within the brain. BCIs can enable people who are paralyzed by amyotrophic lateral sclerosis (ALS), brainstem stroke, or other disorders to convey their needs and wishes to others, to operate word-processing programs or other software, or possibly to control a wheelchair or a neuroprosthesis. BCI technology might also augment rehabilitation protocols aimed at restoring useful motor function. With continued development and clinical implementation, BCIs could substantially improve the lives of those with severe disabilities.

  18. Porphyrins at interfaces

    NASA Astrophysics Data System (ADS)

    Auwärter, Willi; Écija, David; Klappenberger, Florian; Barth, Johannes V.

    2015-02-01

    Porphyrins and other tetrapyrrole macrocycles possess an impressive variety of functional properties that have been exploited in natural and artificial systems. Different metal centres incorporated within the tetradentate ligand are key for achieving and regulating vital processes, including reversible axial ligation of adducts, electron transfer, light-harvesting and catalytic transformations. Tailored substituents optimize their performance, dictating their arrangement in specific environments and mediating the assembly of molecular nanoarchitectures. Here we review the current understanding of these species at well-defined interfaces, disclosing exquisite insights into their structural and chemical properties, and also discussing methods by which to manipulate their intramolecular and organizational features. The distinct characteristics arising from the interfacial confinement offer intriguing prospects for molecular science and advanced materials. We assess the role of surface interactions with respect to electronic and physicochemical characteristics, and describe in situ metallation pathways, molecular magnetism, rotation and switching. The engineering of nanostructures, organized layers, interfacial hybrid and bio-inspired systems is also addressed.

  19. WWW to DICOM interface

    NASA Astrophysics Data System (ADS)

    Grevera, George J.; Feingold, Eric R.; Horii, Steven C.

    1996-05-01

    In this paper we discuss the implementation and use of a WWW interface to a DICOM PACS that allows users to select, move, and display images that are currently available in the PACS and to view their corresponding radiology reports. This system allows our users to query the archive from any workstation (such as Unix, DOS, and Mac) that supports a WWW browser. To use this system, the user first runs a WWW browser such as Mosaic, Netscape, or Lynx and specifies a URL on one of our Unix workstations. This URL refers to an HTML file that contains a query form. This query form contains a number of fields such as patient name and medical record number. The user may specify any or all fields as well as wildcards in fields such as the name field. Once the form is completed, the user presses a button to submit the request. The HTML form submits the query to a C program that executes on the Unix server. This program accepts as input the form field values that the user specified. This program then communicates with the archive via DICOM requests to determine those patients that match the search criteria. The user may then choose a patient which in turn causes the studies for this patient to be displayed. Finally, the user may select a study which causes those images to be retrieved from the archive and displayed via the Web browser. The result of this system is an easy to use interface to a DICOM PACS with the option to query and move images from the PACS. In summary, a system that integrates the ease of use of WWW browsers with a DICOM PACS is discussed. We are currently incorporating information from our RIS as well. This allows us to obtain extensive patient demographics, exam information, and textual radiological reports and associate this information with information from the PACS.
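
    The overall flow described above (browser form, gateway program, archive query, results page) can be illustrated with a small stand-alone HTTP handler; the query function here is only a placeholder, since the actual system issues DICOM query/retrieve requests to the PACS, which are not reproduced:

      # Toy web gateway in the spirit of the form-based query interface above;
      # query_archive() is a stand-in for the real DICOM query to the PACS.
      from http.server import BaseHTTPRequestHandler, HTTPServer
      from urllib.parse import urlparse, parse_qs

      def query_archive(patient_name):
          # Hypothetical placeholder; a real gateway would issue a DICOM query here.
          return [f"Study for {patient_name or '(any)'}"]

      class QueryHandler(BaseHTTPRequestHandler):
          def do_GET(self):
              params = parse_qs(urlparse(self.path).query)
              name = params.get("patient_name", [""])[0]
              rows = "".join(f"<li>{s}</li>" for s in query_archive(name))
              page = (f"<form>Patient name: <input name='patient_name'>"
                      f"<input type='submit'></form><ul>{rows}</ul>")
              self.send_response(200)
              self.send_header("Content-Type", "text/html")
              self.end_headers()
              self.wfile.write(page.encode())

      if __name__ == "__main__":
          HTTPServer(("localhost", 8080), QueryHandler).serve_forever()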

  20. APST interfaces in LINCS

    SciTech Connect

    Fletcher, J.G.

    1995-07-01

    APST is an acronym for the four highest of the seven layers of the LINCS hierarchy of communication protocols: (from high to low) Application, Presentation, Session, and Transport. Routines in each but the lowest of these APST layers can utilize the facilities of any lower APST layer (normally, but not necessarily, the immediately next lower layer) by invoking various primitives (macros that in most cases are subroutine calls) defining the upper interface of the lower layer. So there are three APST interfaces: Presentation layer, used by the Application layer; Session layer, normally used by the Presentation layer; and Transport layer, normally used by the Session layer. Logically, each end of a stream (unidirectional sequence of transmitted information) is handled by three modules, one module each for the Presentation, Session, and Transport layers, and each of these modules deals with only that one end of that one stream. The internal workings of the layers, particularly the Transport layer, do not necessarily exhibit this same modularization; for example, the two oppositely directed streams between the same two ends (constituting an association) may interact within a layer. However, such interaction is an implementational detail of no direct interest to those utilizing the layer. The present document does not describe implementation, nor does it discuss in any detail how the modules employ packet headings and data formats to communicate with their partner modules at the other end of a stream. There being one logical module per end of stream is a characteristic only of the Presentation, Session, and Transport layers. An Application layer module usually manages several streams, orchestrating them to achieve some desired purpose. The modules of the layers (Network, Link, and Physical) below the APST layers each handle many streams, multiplexing them through the nodes and channels of the network to transmit them from their origins to their destinations.
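
    The one-module-per-layer-per-stream-end structure can be pictured as a chain of objects, each exposing an upper-interface primitive and invoking the layer below; this is only a schematic illustration of the layering, with invented class and primitive names rather than the actual APST primitives:

      # Schematic layering: each layer exposes send() upward and calls the next
      # layer down; names here are illustrative only.
      class Transport:
          def send(self, data):
              print("Transport: transmitting", data)

      class Session:
          def __init__(self, lower): self.lower = lower
          def send(self, data):
              # A real Session layer would manage stream state here.
              self.lower.send(data)

      class Presentation:
          def __init__(self, lower): self.lower = lower
          def send(self, data):
              # A real Presentation layer would encode/format the data here.
              self.lower.send(data.encode())

      class Application:
          def __init__(self, lower): self.lower = lower
          def run(self):
              # The Application layer orchestrates one or more streams.
              self.lower.send("hello over one stream")

      Application(Presentation(Session(Transport()))).run()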

  1. Interfacing Ada and other languages

    NASA Technical Reports Server (NTRS)

    Baffes, Paul; West, Brian

    1986-01-01

    Interfacing two separately developed compilers is a complex task. The complexity arises because few design standards exist for compiler development. This, coupled with the many complicated design decisions inherent in compiler construction, usually guarantees noncompatibility. The interface subroutine linking the two different run-time environments would resolve as many of the dissimilarities as possible. The differences that could not be resolved would be responsible for the restrictions placed on the interface. Although restrictions would exist, the resulting interface may be well worthwhile.
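
    Although the article concerns Ada specifically, the flavour of the problem, matching calling conventions and data representations between two run-time environments, can be illustrated with a modern foreign-function call; the sketch below uses Python's ctypes to call the C library and is an analogy only, not the Ada interface discussed in the article:

      # Calling a C run-time routine from another language via a foreign-function
      # interface; argument and return types must be declared explicitly because
      # the two environments do not share type information.
      import ctypes, ctypes.util

      libc = ctypes.CDLL(ctypes.util.find_library("c"))
      libc.strlen.argtypes = [ctypes.c_char_p]
      libc.strlen.restype = ctypes.c_size_t

      print(libc.strlen(b"interfacing two languages"))   # -> 25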

  2. The Evolution of Neuroprosthetic Interfaces

    PubMed Central

    Adewole, Dayo O.; Serruya, Mijail D.; Harris, James P.; Burrell, Justin C.; Petrov, Dmitriy; Chen, H. Isaac; Wolf, John A.; Cullen, D. Kacy

    2017-01-01

    The ideal neuroprosthetic interface permits high-quality neural recording and stimulation of the nervous system while reliably providing clinical benefits over chronic periods. Although current technologies have made notable strides in this direction, significant improvements must be made to better achieve these design goals and satisfy clinical needs. This article provides an overview of the state of neuroprosthetic interfaces, starting with the design and placement of these interfaces before exploring the stimulation and recording platforms yielded from contemporary research. Finally, we outline emerging research trends in an effort to explore the potential next generation of neuroprosthetic interfaces. PMID:27652455

  3. mREST Interface Specification

    NASA Technical Reports Server (NTRS)

    McCartney, Patrick; MacLean, John

    2012-01-01

    mREST is an implementation of the REST architecture specific to the management and sharing of data in a system of logical elements. The purpose of this document is to clearly define the mREST interface protocol. The interface protocol covers all of the interaction between mREST clients and mREST servers. System-level requirements are not specifically addressed. In an mREST system, there are typically some backend interfaces between a Logical System Element (LSE) and the associated hardware/software system. For example, a network camera LSE would have a backend interface to the camera itself. These interfaces are specific to each type of LSE and are not covered in this document. There are also frontend interfaces that may exist in certain mREST manager applications. For example, an electronic procedure execution application may have a specialized interface for configuring the procedures. This interface would be application specific and outside of this document scope. mREST is intended to be a generic protocol which can be used in a wide variety of applications. A few scenarios are discussed to provide additional clarity but, in general, application-specific implementations of mREST are not specifically addressed. In short, this document is intended to provide all of the information necessary for an application developer to create mREST interface agents. This includes both mREST clients (mREST manager applications) and mREST servers (logical system elements, or LSEs).
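
    As a generic illustration of the client side of a REST-style management interface of this kind (the URL, resource paths, and payload are hypothetical and are not taken from the mREST specification):

      # Hypothetical REST-style interaction with a logical system element (LSE):
      # read a resource with GET, then update it with PUT.
      import json
      import urllib.request

      base = "http://localhost:8080/lse/camera1"        # invented endpoint

      with urllib.request.urlopen(base + "/status") as resp:
          print("current status:", json.load(resp))

      update = urllib.request.Request(
          base + "/config",
          data=json.dumps({"frame_rate": 15}).encode(),
          headers={"Content-Type": "application/json"},
          method="PUT",
      )
      with urllib.request.urlopen(update) as resp:
          print("update accepted:", resp.status)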

  4. Interface-assisted molecular spintronics

    SciTech Connect

    Raman, Karthik V.

    2014-09-15

    Molecular spintronics, a field that utilizes the spin state of organic molecules to develop magneto-electronic devices, has shown an enormous scientific activity for more than a decade. But, in the last couple of years, new insights in understanding the fundamental phenomena of molecular interaction on magnetic surfaces, forming a hybrid interface, are presenting a new pathway for developing the subfield of interface-assisted molecular spintronics. The recent exploration of such hybrid interfaces involving carbon based aromatic molecules shows a significant excitement and promise over the previously studied single molecular magnets. In the above new scenario, hybridization of the molecular orbitals with the spin-polarized bands of the surface creates new interface states with unique electronic and magnetic character. This study opens up a molecular-genome initiative in designing new handles to functionalize the spin dependent electronic properties of the hybrid interface to construct spin-functional tailor-made devices. Through this article, we review this subject by presenting a fundamental understanding of the interface spin-chemistry and spin-physics by taking support of advanced computational and spectroscopy tools to investigate molecular spin responses with demonstration of new interface phenomena. Spin-polarized scanning tunneling spectroscopy is favorably considered to be an important tool to investigate these hybrid interfaces with intra-molecular spatial resolution. Finally, by addressing some of the recent findings, we propose novel device schemes towards building interface tailored molecular spintronic devices for applications in sensor, memory, and quantum computing.

  5. Interfaces in perovskite solar cells.

    PubMed

    Shi, Jiangjian; Xu, Xin; Li, Dongmei; Meng, Qingbo

    2015-06-03

    The interfacial atomic and electronic structures, charge transfer processes, and interface engineering in perovskite solar cells are discussed in this review. An effective heterojunction is found to exist at the window/perovskite absorber interface, contributing to the relatively fast extraction of free electrons. Moreover, the high photovoltage in this cell can be attributed to slow interfacial charge recombination due to the outstanding material and interfacial electronic properties. However, some fundamental questions including the interfacial atomic and electronic structures and the interface stability need to be further clarified. Designing and engineering the interfaces are also important for the next-stage development of this cell.

  6. Interface-assisted molecular spintronics

    NASA Astrophysics Data System (ADS)

    Raman, Karthik V.

    2014-09-01

    Molecular spintronics, a field that utilizes the spin state of organic molecules to develop magneto-electronic devices, has shown an enormous scientific activity for more than a decade. But, in the last couple of years, new insights in understanding the fundamental phenomena of molecular interaction on magnetic surfaces, forming a hybrid interface, are presenting a new pathway for developing the subfield of interface-assisted molecular spintronics. The recent exploration of such hybrid interfaces involving carbon based aromatic molecules shows a significant excitement and promise over the previously studied single molecular magnets. In the above new scenario, hybridization of the molecular orbitals with the spin-polarized bands of the surface creates new interface states with unique electronic and magnetic character. This study opens up a molecular-genome initiative in designing new handles to functionalize the spin dependent electronic properties of the hybrid interface to construct spin-functional tailor-made devices. Through this article, we review this subject by presenting a fundamental understanding of the interface spin-chemistry and spin-physics by taking support of advanced computational and spectroscopy tools to investigate molecular spin responses with demonstration of new interface phenomena. Spin-polarized scanning tunneling spectroscopy is favorably considered to be an important tool to investigate these hybrid interfaces with intra-molecular spatial resolution. Finally, by addressing some of the recent findings, we propose novel device schemes towards building interface tailored molecular spintronic devices for applications in sensor, memory, and quantum computing.

  7. Multimodal human-machine interface based on a brain-computer interface and an electrooculography interface.

    PubMed

    Iáñez, Eduardo; Ùbeda, Andrés; Azorín, José M

    2011-01-01

    This paper describes a multimodal interface that combines a Brain-Computer Interface (BCI) with an electrooculography (EOG) interface. The non-invasive spontaneous BCI registers the electrical brain activity through surface electrodes. The EOG interface detects the eye movements through electrodes placed on the face around the eyes. Both kinds of signals are registered together and processed to obtain the mental task that the user is thinking and the eye movement performed by the user. Both commands (mental task and eye movement) are combined in order to move a dot in a graphical user interface (GUI). Several experimental tests have been made in which the users perform a trajectory to get closer to some targets. To perform the trajectory, the user moves the dot in a plane with the EOG interface, and the BCI changes the dot's height.
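
    The command-fusion step (the EOG output drives the in-plane motion, the BCI output drives the height) can be pictured with a few lines of code; the decoder functions below are stubs, since the actual signal processing is not described here in enough detail to reproduce:

      # Combining two independent command channels to move a dot in a GUI:
      # the EOG channel supplies a planar step, the BCI channel a height change.
      # Both "decoders" are stand-ins returning fixed commands for illustration.
      EOG_STEPS = {"left": (-1, 0), "right": (1, 0), "up": (0, 1), "down": (0, -1), "none": (0, 0)}
      BCI_STEPS = {"raise": 1, "lower": -1, "rest": 0}

      def decode_eog(eog_window):   # placeholder for the real EOG classifier
          return "right"

      def decode_bci(eeg_window):   # placeholder for the real mental-task classifier
          return "raise"

      x, y, z = 0, 0, 0
      for _ in range(3):                         # one iteration per decoding window
          dx, dy = EOG_STEPS[decode_eog(None)]
          dz = BCI_STEPS[decode_bci(None)]
          x, y, z = x + dx, y + dy, z + dz
          print(f"dot position: ({x}, {y}, {z})")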

  8. The interface engine: experimental consequences.

    PubMed

    Tauer, Klaus; Kozempel, Steffen; Rother, Gudrun

    2007-08-15

    A light microscopy study confirms spontaneous emulsification at the quiescent, thermally equilibrated interface between pure oil and pure water during the chemical equilibration period. The process is explained qualitatively within the framework of classical nucleation theory, assuming a mixed interfacial layer between the two liquids in contact that allows supersaturation.
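
    As context for the classical-nucleation-theory argument, the standard textbook expressions for homogeneous nucleation from a supersaturated phase are sketched below. They are not taken from the cited paper; the symbols (interfacial tension gamma, molecular volume v_m, supersaturation ratio S) are generic.

      % Standard CNT expressions (textbook form; not from the cited paper)
      \Delta G(r) = -\frac{4\pi r^{3}}{3}\,\frac{k_{B}T\ln S}{v_{m}} + 4\pi r^{2}\gamma
      % Critical nucleus size and barrier height:
      r^{*} = \frac{2\gamma v_{m}}{k_{B}T\ln S}, \qquad
      \Delta G^{*} = \frac{16\pi\gamma^{3}v_{m}^{2}}{3\,(k_{B}T\ln S)^{2}}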

  9. Adaptive fast interface tracking methods

    NASA Astrophysics Data System (ADS)

    Popovic, Jelena; Runborg, Olof

    2017-05-01

    In this paper, we present a fast time-adaptive numerical method for interface tracking. The method uses an explicit multiresolution description of the interface, which is represented by wavelet vectors that correspond to the details of the interface on different scale levels. The complexity of standard numerical methods for interface tracking, where the interface is described by N marker points, is O(N/Δt) when a time step Δt is used. The methods that we propose in this paper have a computational cost of O(TOL^(-1/p) log N + N log N), at least for uniformly smooth problems, where TOL is a given tolerance and p is the order of the time-stepping method used for time advection of the interface. The adaptive method is robust in the sense that it can handle problems with both smooth and piecewise smooth interfaces (e.g. interfaces with corners) while keeping a low computational cost. We show numerical examples that verify these properties.
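
    The multiresolution idea behind such methods, coarse marker positions plus per-level detail vectors with a tolerance deciding which details are kept, can be illustrated by a short generic sketch. The Haar-style decomposition below only conveys the representation; it is not the authors' algorithm, and the marker count, level count, and example interface are arbitrary.

      # Schematic multiresolution split of interface markers (illustrative only).
      import numpy as np

      def haar_decompose(points, levels):
          """Haar-style multiresolution split of interface marker points.

          points: (N, 2) array of markers along the interface, with N a multiple
          of 2**levels. Returns the coarsest markers and a list of per-level
          detail vectors (finest level first).
          """
          details = []
          current = np.asarray(points, dtype=float)
          for _ in range(levels):
              averages = 0.5 * (current[0::2] + current[1::2])  # coarser representation
              details.append(current[0::2] - averages)           # details dropped at this level
              current = averages
          return current, details

      # Example: 64 markers on a circle reduced to 8 coarse markers plus 3 detail levels.
      theta = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
      circle = np.column_stack([np.cos(theta), np.sin(theta)])
      coarse, details = haar_decompose(circle, levels=3)
      for level, d in enumerate(details, start=1):
          # Coarser levels carry larger details; a tolerance TOL would decide which to keep.
          print(level, d.shape, round(float(np.linalg.norm(d, axis=1).max()), 4))
      print(coarse.shape)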

  10. Overview of Graphical User Interfaces.

    ERIC Educational Resources Information Center

    Hulser, Richard P.

    1993-01-01

    Discussion of graphical user interfaces for online public access catalogs (OPACs) covers the history of OPACs; OPAC front-end design, including examples from Indiana University and the University of Illinois; and planning and implementation of a user interface. (10 references) (EA)

  11. Using eye movement to control a computer: a design for a lightweight electro-oculogram electrode array and computer interface.

    PubMed

    Iáñez, Eduardo; Azorin, Jose M; Perez-Vidal, Carlos

    2013-01-01

    This paper describes a human-computer interface based on electro-oculography (EOG) that allows interaction with a computer using eye movement. The EOG registers eye movement by measuring, through electrodes, the potential difference between the cornea and the retina. A new pair of EOG glasses has been designed to improve the user's comfort and to eliminate the manual procedure of placing the EOG electrodes around the user's eyes. The interface, which includes the EOG electrodes, uses a new processing algorithm that detects gaze direction and eye blinks from the EOG signals. The system reliably enabled subjects to control the movement of a dot on a video screen.
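
    A common generic way to detect saccades and blinks from EOG channels is simple amplitude thresholding on the horizontal and vertical difference signals. The sketch below illustrates that generic approach with made-up thresholds and synthetic signals; it is not the processing algorithm of the cited paper.

      # Toy EOG event detector (generic thresholding; thresholds are illustrative).
      import numpy as np

      def detect_eog_events(h_eog, v_eog, fs, saccade_thr=100.0, blink_thr=300.0):
          """Return (sample_index, label) events from horizontal/vertical EOG in microvolts."""
          events = []
          window = int(0.1 * fs)  # 100 ms analysis step
          for start in range(0, len(h_eog) - window, window):
              dh = h_eog[start + window - 1] - h_eog[start]
              dv = v_eog[start + window - 1] - v_eog[start]
              if dv > blink_thr:                 # large, fast vertical deflection
                  events.append((start, 'blink'))
              elif abs(dh) > saccade_thr:        # horizontal gaze shift
                  events.append((start, 'look right' if dh > 0 else 'look left'))
              elif abs(dv) > saccade_thr:        # vertical gaze shift
                  events.append((start, 'look up' if dv > 0 else 'look down'))
          return events

      # Synthetic example: a rightward gaze shift followed by a blink-like spike.
      fs = 250
      t = np.arange(0, 2.0, 1.0 / fs)
      h = np.where(t > 0.5, 150.0, 0.0)                 # step simulating a rightward saccade
      v = np.where((t > 1.2) & (t < 1.3), 400.0, 0.0)   # short spike simulating a blink
      print(detect_eog_events(h, v, fs))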

  12. Using Eye Movement to Control a Computer: A Design for a Lightweight Electro-Oculogram Electrode Array and Computer Interface

    PubMed Central

    Iáñez, Eduardo; Azorin, Jose M.; Perez-Vidal, Carlos

    2013-01-01

    This paper describes a human-computer interface based on electro-oculography (EOG) that allows interaction with a computer using eye movement. The EOG registers eye movement by measuring, through electrodes, the potential difference between the cornea and the retina. A new pair of EOG glasses has been designed to improve the user's comfort and to eliminate the manual procedure of placing the EOG electrodes around the user's eyes. The interface, which includes the EOG electrodes, uses a new processing algorithm that detects gaze direction and eye blinks from the EOG signals. The system reliably enabled subjects to control the movement of a dot on a video screen. PMID:23843986

  13. Colloids at Curved Fluid Interfaces

    NASA Astrophysics Data System (ADS)

    Stebe, Kathleen

    2016-11-01

    Fluid interfaces are remarkable sites for colloidal assembly. When a colloid attaches to a fluid interface, it distorts a region around it; this distortion has an associated capillary energy, the product of its area and the interfacial tension. The particle's capillary energy depends on the local interface curvature. By molding the interface, we can define curvature fields that drive microparticles along pre-determined paths. This example captures the emergent nature of the interactions. We discuss curvature fields as analogues of external electromagnetic fields, and define curvatures that drive particles to well-defined locations and to equilibrium sites far from boundaries. Particle-particle and particle-curvature interactions can also guide many particles into larger structures. This work demonstrates the potential importance of curvature capillary interactions in schemes for making reconfigurable materials, since interfaces and their associated capillary energy landscapes can be readily reconfigured. Analogies in other soft systems will be described. Support acknowledged from NSF DMR 1607878.
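
    The energetic picture in this abstract can be written compactly. The relations below are the generic capillary-energy expressions implied by the text (energy as excess interfacial area times tension, force as the gradient of that energy along the curvature field); they are illustrative and not results quoted from the talk.

      % Generic capillary-energy relations implied by the abstract (illustrative)
      E_{\mathrm{cap}} = \gamma\,\Delta A   % excess interfacial area \Delta A times tension \gamma
      % A particle whose energy depends on the local curvature field H(\mathbf{r})
      % is driven along gradients of that energy:
      \mathbf{F} = -\nabla_{\mathbf{r}}\,E_{\mathrm{cap}}\!\left(H(\mathbf{r})\right)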

  14. Aquatic Acoustic Metrics Interface

    SciTech Connect

    2012-12-18

    Fishes and marine mammals may suffer a range of potential effects from exposure to intense underwater sound generated by anthropogenic activities such as pile driving, shipping, sonar, and underwater blasting. Several underwater sound recording (USR) devices have been built to acquire samples of the underwater sound generated by these activities. Software is indispensable for processing and analyzing the audio files recorded by these USRs. The new Aquatic Acoustic Metrics Interface Utility Software (AAMI) is specifically designed for the analysis of underwater sound recordings, providing data in metrics that facilitate evaluation of the potential impacts of the sound on aquatic animals. In addition to basic functions, such as loading and editing audio files recorded by USRs and batch processing of sound files, the software uses recording-system calibration data to compute important parameters in physical units. The software also facilitates comparison of the sound-sample metrics with biological measures, such as audiograms of the sensitivity of aquatic animals to sound, integrating the various components into a single analytical framework.
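
    Converting a recorded waveform into physical sound-pressure units with a calibration value, as described above, can be sketched generically. The example below computes an RMS sound pressure level in dB re 1 µPa from raw samples and a hypothetical end-to-end sensitivity; it is illustrative only and not AAMI code.

      # Rough calibrated SPL calculation (illustrative; not AAMI code).
      import numpy as np

      def rms_spl_db_re_1upa(samples, sensitivity_db_re_1v_per_upa, adc_fullscale_volts=1.0):
          """RMS sound pressure level in dB re 1 uPa.

          samples: raw waveform normalized to [-1, 1] (e.g., read from a WAV file).
          sensitivity_db_re_1v_per_upa: hypothetical end-to-end sensitivity of the
              hydrophone plus recorder, in dB re 1 V/uPa (a calibration value).
          adc_fullscale_volts: voltage corresponding to full-scale digital amplitude.
          """
          volts = np.asarray(samples, dtype=float) * adc_fullscale_volts
          rms_volts = np.sqrt(np.mean(volts ** 2))
          # Convert voltage to pressure using the calibration sensitivity.
          sensitivity_v_per_upa = 10.0 ** (sensitivity_db_re_1v_per_upa / 20.0)
          rms_pressure_upa = rms_volts / sensitivity_v_per_upa
          return 20.0 * np.log10(rms_pressure_upa)

      # Example with a synthetic 1 kHz tone and a made-up sensitivity of -180 dB re 1 V/uPa.
      fs = 48000
      t = np.arange(0, 1.0, 1.0 / fs)
      tone = 0.1 * np.sin(2.0 * np.pi * 1000.0 * t)
      print(round(rms_spl_db_re_1upa(tone, sensitivity_db_re_1v_per_upa=-180.0), 1))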

  15. Power User Interface

    NASA Technical Reports Server (NTRS)

    Pfister, Robin; McMahon, Joe

    2006-01-01

    Power User Interface 5.0 (PUI) is a system of middleware written for expert users in the Earth-science community. PUI enables expedited ordering of data granules on the basis of specific granule-identifying information that the users already know or can assemble. PUI also enables expert users to perform quick searches for orderable-granule information for use in preparing orders. PUI 5.0 is available in two versions (note: PUI 6.0 has a command-line mode only): a Web-based application program and a UNIX command-line-mode client program. Both versions include modules that perform data-granule-ordering functions in conjunction with external systems. The Web-based version works with the Earth Observing System Clearing House (ECHO) metadata catalog and order-entry services and with an open-source order-service broker server component, called the Mercury Shopping Cart, that is provided separately by Oak Ridge National Laboratory through the Department of Energy. The command-line version works with the ECHO metadata and order-entry services. Both versions of PUI ultimately use ECHO to process an order to be sent to a data provider. Ordered data are provided through means outside the PUI software system.

  16. Next Generation Search Interfaces

    NASA Astrophysics Data System (ADS)

    Roby, W.; Wu, X.; Ly, L.; Goldina, T.

    2015-09-01

    Astronomers are constantly looking for easier ways to access multiple data sets. While much effort is spent on the VO, little thought is given to the types of user interfaces needed to search this sort of data effectively. For instance, an astronomer might need to search Spitzer, WISE, and 2MASS catalogs and images and then see the results presented together in one UI. Moving seamlessly between data sets is key to presenting integrated results. Results need to be viewed using first-class, web-based, integrated FITS viewers, XY plots, and advanced table display tools. These components should be able to handle very large datasets. Making a powerful web-based UI that can manage and present multiple searches to the user requires taking advantage of many HTML5 features. AJAX is used to start searches and present results. Push notifications (Server-Sent Events) monitor background jobs. Canvas is required for advanced result displays. Lesser-known CSS3 technologies make it all flow seamlessly together. At IPAC, we have been developing our Firefly toolkit for several years. We are now using it to solve this multiple-data-set, multiple-query, integrated-presentation problem and to create a powerful research experience. Firefly was created in IRSA, the NASA/IPAC Infrared Science Archive (http://irsa.ipac.caltech.edu). Firefly is the core of applications serving many project archives, including Spitzer, Planck, WISE, PTF, LSST and others. It is also used in IRSA's new Finder Chart and in catalog and image displays.
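
    The pattern described here, starting a search with an AJAX request and then monitoring the background job through Server-Sent Events, is sketched below in generic form. The endpoint URL, the event payload fields, and the use of the third-party `requests` library are assumptions for illustration; this is not Firefly code.

      # Generic Server-Sent Events consumer for monitoring a background search job.
      import json
      import requests  # third-party HTTP library, used here for its streaming support

      def watch_job(events_url):
          """Consume an SSE stream and yield parsed progress events.

          events_url is a hypothetical endpoint that emits lines such as
          'data: {"job": "search-42", "state": "RUNNING", "percent": 40}'.
          """
          with requests.get(events_url, stream=True, timeout=60) as resp:
              resp.raise_for_status()
              for raw in resp.iter_lines(decode_unicode=True):
                  if raw and raw.startswith("data:"):
                      yield json.loads(raw[len("data:"):].strip())

      # Usage against a hypothetical server: stop once the background search finishes.
      # for event in watch_job("https://example.org/searches/42/events"):
      #     print(event)
      #     if event.get("state") in ("COMPLETED", "FAILED"):
      #         break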

  17. Online Remote Sensing Interface

    NASA Technical Reports Server (NTRS)

    Lawhead, Joel

    2007-01-01

    BasinTools Module 1 processes remotely sensed raster data, including multi- and hyper-spectral data products, via a Web site, with no downloads and no plug-ins required. The interface provides standardized algorithms designed so that a user with little or no remote-sensing experience can use the site. This Web-based approach reduces the amount of software, hardware, and computing power necessary to perform the specified analyses. Access to imagery and derived products is enterprise-level and controlled. Because the user never takes possession of the imagery, the licensing of the data is greatly simplified. BasinTools takes the "just-in-time" inventory-control model from commercial manufacturing and applies it to remotely sensed data. Products are created and delivered on the fly with no human intervention, even for casual users. Well-defined procedures can be combined in different ways to extend verified and validated methods and derive new remote-sensing products, which improves efficiency in any well-defined geospatial domain. Remote-sensing products produced in BasinTools are self-documenting, allowing procedures to be independently verified or peer-reviewed. The software can be used enterprise-wide to conduct low-level remote sensing and to view, share, and manipulate image data without the need for desktop applications.

  18. User interface enhancement report

    NASA Technical Reports Server (NTRS)

    Badler, N. I.; Gangel, J.; Shields, G.; Fala, G.

    1985-01-01

    The existing user interfaces to TEMPUS, Plaid, and other systems in the OSDS are fundamentally based on only two modes of communication: alphanumeric command or data input, and graphical interaction. The latter is especially suited to the types of interaction necessary for creating workstation objects with BUILD and for performing body positioning in TEMPUS. Looking toward the future application of TEMPUS, however, the long-term goals of OSDS will include the analysis of extensive tasks in space involving one or more individuals working in concert over a period of time. In this context, the TEMPUS body-positioning capability, though extremely useful for creating and validating a small number of particular body positions, will become somewhat tedious to use. The macro facility helps somewhat, since frequently used positions may easily be applied by executing a stored macro. The difference between body positioning and task execution, though subtle, is important. In the case of task execution, the important information at the user's level is what actions are to be performed rather than how the actions are performed. Viewed slightly differently, the what is constant over a set of individuals, though the how may vary.

  19. The Human Computer Interaction Certificate Program at Rensselaer Polytechnic Institute: A Case Study in the Benefits and Costs of a Joint Industry/University Designed Program Featuring Integrated Delivery Methods.

    ERIC Educational Resources Information Center

    Jewett, Frank I.

    This case study presents information about a graduate-level certificate program in human computer interaction that was added to the Rensselaer Polytechnic Institute (New York) satellite video program in 1996, as a cooperative program between the institution and the IBM Corporation. The program was designed for individuals who work in computer…

  20. Simulation of a sensor array for multiparameter measurements at the prosthetic limb interface

    NASA Astrophysics Data System (ADS)

    Rowe, Gabriel I.; Mamishev, Alexander V.

    2004-07-01

    Sensitive skin is a highly desired component for biomechanical devices, wearable computing, human-computer interfaces, exoskeletons, and, most pertinent to this paper, lower-limb prosthetics. The measurement of shear stress is very important because shear effects are key factors in the development of surface abrasions and pressure sores in paraplegics and users of prosthetic/orthotic devices. A single element of a sensitive skin is simulated and characterized in this paper. Conventional tactile sensors are designed to measure normal stress only, which is inadequate for a comprehensive assessment of surface contact conditions. The sensitive skin discussed here is a flexible array capable of sensing shear and normal forces, as well as humidity and temperature, at each element.
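
    One way to picture the per-element measurement set described above (shear in two directions, normal force, humidity, temperature) is a simple data structure for a single "taxel" reading. The sketch below is purely illustrative and not tied to the authors' simulation; the field names, units, and example values are assumptions.

      # Illustrative data structure for one sensitive-skin element (hypothetical fields).
      from dataclasses import dataclass
      from typing import List

      @dataclass
      class TaxelReading:
          """Multiparameter reading from one sensitive-skin element."""
          shear_x_kpa: float      # shear stress along the skin surface, x direction
          shear_y_kpa: float      # shear stress along the skin surface, y direction
          normal_kpa: float       # normal (perpendicular) stress
          humidity_pct: float     # relative humidity at the skin interface
          temperature_c: float    # temperature at the skin interface

      def peak_shear(readings: List[TaxelReading]) -> float:
          """Largest shear magnitude over the array, a quantity relevant to abrasion risk."""
          return max((r.shear_x_kpa ** 2 + r.shear_y_kpa ** 2) ** 0.5 for r in readings)

      # Example with two made-up elements.
      array = [
          TaxelReading(3.0, 1.0, 40.0, 55.0, 33.5),
          TaxelReading(0.5, 0.2, 25.0, 60.0, 34.0),
      ]
      print(round(peak_shear(array), 2))  # 3.16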