An Architectural Experience for Interface Design
ERIC Educational Resources Information Center
Gong, Susan P.
2016-01-01
The problem of human-computer interface design was brought to the foreground with the emergence of the personal computer, the increasing complexity of electronic systems, and the need to accommodate the human operator in these systems. With each new technological generation discovering the interface design problems of its own technologies, initial…
An intelligent multi-media human-computer dialogue system
NASA Technical Reports Server (NTRS)
Neal, J. G.; Bettinger, K. E.; Byoun, J. S.; Dobes, Z.; Thielman, C. Y.
1988-01-01
Sophisticated computer systems are being developed to assist in the human decision-making process for very complex tasks performed under stressful conditions. The human-computer interface is a critical factor in these systems. The human-computer interface should be simple and natural to use, require a minimal learning period, assist the user in accomplishing his task(s) with a minimum of distraction, present output in a form that best conveys information to the user, and reduce cognitive load for the user. In pursuit of this ideal, the Intelligent Multi-Media Interfaces project is devoted to the development of interface technology that integrates speech, natural language text, graphics, and pointing gestures for human-computer dialogues. The objective of the project is to develop interface technology that uses the media/modalities intelligently in a flexible, context-sensitive, and highly integrated manner modelled after the manner in which humans converse in simultaneous coordinated multiple modalities. As part of the project, a knowledge-based interface system, called CUBRICON (CUBRC Intelligent CONversationalist) is being developed as a research prototype. The application domain being used to drive the research is that of military tactical air control.
Human factors research facilitates the safe application of technology
DOT National Transportation Integrated Search
1997-01-01
The science of human factors can help us better understand how people and technology-based systems interact. Human factors research not only identifies potential problems in system operator interfaces but also can define human limitations in the use ...
NASA Technical Reports Server (NTRS)
2007-01-01
This document provides the definition of technology human interface requirements for Collision Avoidance (CA). This was performed through a review of CA-related HSI requirements documents, standards, and recommended practices. Technology concepts in use by the Access 5 CA work package were considered... Beginning with the HSI high-level functional requirement for CA, and CA technology elements, HSI requirements for the interface to the pilot were identified. Results of the analysis describe (1) the information required by the pilot to have knowledge of CA system status, and (2) the control capability needed by the pilot to obtain CA information and effect an avoidance maneuver. Fundamentally, these requirements provide the candidate CA technology concepts with the necessary human-related elements to make them compatible with human capabilities and limitations. The results of the analysis describe how CA operations and functions should interface with the pilot to provide the necessary CA functionality to the UA-pilot system. Requirements and guidelines for CA are partitioned into four categories: (1) General, (2) Alerting, (3) Guidance, and (4) Cockpit Display of Traffic Information. Each requirement is stated and is supported with a rationale and associated reference(s).
Step 1: Human System Integration Pilot-Technology Interface Requirements for Weather Management
NASA Technical Reports Server (NTRS)
2005-01-01
This document involves the definition of technology interface requirements for Hazardous Weather Avoidance. Technology concepts in use by the Access 5 Weather Management Work Package were considered. Beginning with the Human System Integration (HSI) high-level functional requirement for Hazardous Weather Avoidance, and Hazardous Weather Avoidance technology elements, HSI requirements for the interface to the pilot were identified. Results of the analysis describe (1) the information required by the pilot to have knowledge of hazardous weather, and (2) the control capability needed by the pilot to obtain hazardous weather information. Fundamentally, these requirements provide the candidate Hazardous Weather Avoidance technology concepts with the necessary human-related elements to make them compatible with human capabilities and limitations. The results of the analysis describe how Hazardous Weather Avoidance operations and functions should interface with the pilot to provide the necessary Weather Management functionality to the UA-pilot system. Requirements and guidelines for Hazardous Weather Avoidance are partitioned into four categories: (1) Planning En Route, (2) Encountering Hazardous Weather En Route, (3) Planning to Destination, and (4) Diversion Planning to an Alternate Airport. Each requirement is stated and is supported with a rationale and associated reference(s).
NASA Technical Reports Server (NTRS)
2005-01-01
This document involves the definition of technology interface requirements for Contingency Management. This was performed through a review of Contingency Management-related HSI requirements documents, standards, and recommended practices. Technology concepts in use by the Contingency Management Work Package were considered. Beginning with HSI high-level functional requirements for Contingency Management, and Contingency Management technology elements, HSI requirements for the interface to the pilot were identified. Results of the analysis describe (1) the information required by the pilot to have knowledge of system failures and associated contingency procedures, and (2) the control capability needed by the pilot to obtain system status and procedure information. Fundamentally, these requirements provide the candidate Contingency Management technology concepts with the necessary human-related elements to make them compatible with human capabilities and limitations. The results of the analysis describe how Contingency Management operations and functions should interface with the pilot to provide the necessary Contingency Management functionality to the UA-pilot system. Requirements and guidelines for Contingency Management are partitioned into two categories: (1) Health and Status, and (2) Contingency Management. Each requirement is stated and is supported with a rationale and associated reference(s).
Concept of software interface for BCI systems
NASA Astrophysics Data System (ADS)
Svejda, Jaromir; Zak, Roman; Jasek, Roman
2016-06-01
Brain Computer Interface (BCI) technology is intended to control an external system by brain activity. One of the main parts of such a system is the software interface, which is responsible for clear communication between the brain and either the computer or additional devices connected to the computer. This paper is organized as follows. Firstly, current knowledge about the human brain is briefly summarized to point out its complexity. Secondly, a concept of a BCI system is described, which is then used to build an architecture of the proposed software interface. Finally, disadvantages of the sensing technology discovered during the sensing part of our research are mentioned.
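The abstract above outlines a software interface that sits between brain-signal acquisition and the devices being controlled. Purely as an illustrative sketch, not the authors' architecture, the Python fragment below shows one common way such an interface is layered: an acquisition stage, a processing stage that decodes a command, and a dispatch stage that forwards it to registered output devices. All class and method names are invented for this example.

```python
import numpy as np

class AcquisitionStage:
    """Simulated signal acquisition; a real BCI would read from amplifier hardware."""
    def __init__(self, n_channels=8, fs=250):
        self.n_channels, self.fs = n_channels, fs

    def read_window(self, seconds=1.0):
        # Random noise stands in for a raw EEG window here.
        return np.random.randn(self.n_channels, int(self.fs * seconds))

class ProcessingStage:
    """Turns a raw window into a discrete command label (toy rule for illustration)."""
    def decode(self, window):
        return int(np.argmax(window.var(axis=1)))  # channel with the highest variance

class DispatchStage:
    """Routes decoded commands to whichever output devices are registered."""
    def __init__(self):
        self.devices = []

    def register(self, device_callback):
        self.devices.append(device_callback)

    def send(self, command):
        for device in self.devices:
            device(command)

if __name__ == "__main__":
    acquisition, processing, dispatch = AcquisitionStage(), ProcessingStage(), DispatchStage()
    dispatch.register(lambda cmd: print(f"device received command {cmd}"))
    dispatch.send(processing.decode(acquisition.read_window()))
```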
Human/Computer Interfacing in Educational Environments.
ERIC Educational Resources Information Center
Sarti, Luigi
1992-01-01
This discussion of educational applications of user interfaces covers the benefits of adopting database techniques in organizing multimedia materials; the evolution of user interface technology, including teletype interfaces, analogic overlay graphics, window interfaces, and adaptive systems; application design problems, including the…
Healthtrak(tm): Technology Enhanced Human Interface to the Computerized Patient Record
2002-07-01
Madni, Azad M.; Lin, Weiwen; Madni, Carla C. — Intelligent Systems Technology, Incorporated (Contract DAMD17-02-C-0032). The opinions and/or findings contained in this report are those of the author(s) and should not be construed as an official Department of the Army position.
A Graphical Operator Interface for a Telerobotic Inspection System
NASA Technical Reports Server (NTRS)
Kim, W. S.; Tso, K. S.; Hayati, S.
1993-01-01
The operator interface has recently emerged as an important element for efficient and safe operator interactions with the telerobotic system. Recent advances in graphical user interface (GUI) and graphics/video merging technologies enable development of more efficient, flexible operator interfaces. This paper describes an advanced graphical operator interface newly developed for a remote surface inspection system at the Jet Propulsion Laboratory. The interface has been designed so that remote surface inspection can be performed by a single operator with an integrated robot control and image inspection capability. It supports three inspection strategies: teleoperated human visual inspection, human visual inspection with automated scanning, and machine-vision-based automated inspection.
Integration of an intelligent systems behavior simulator and a scalable soldier-machine interface
NASA Astrophysics Data System (ADS)
Johnson, Tony; Manteuffel, Chris; Brewster, Benjamin; Tierney, Terry
2007-04-01
As the Army's Future Combat Systems (FCS) introduce emerging technologies and new force structures to the battlefield, soldiers will increasingly face new challenges in workload management. The next generation warfighter will be responsible for effectively managing robotic assets in addition to performing other missions. Studies of future battlefield operational scenarios involving the use of automation, including the specification of existing and proposed technologies, will provide significant insight into potential problem areas regarding soldier workload. The US Army Tank Automotive Research, Development, and Engineering Center (TARDEC) is currently executing an Army technology objective program to analyze and evaluate the effect of automated technologies and their associated control devices with respect to soldier workload. The Human-Robotic Interface (HRI) Intelligent Systems Behavior Simulator (ISBS) is a human performance measurement simulation system that allows modelers to develop constructive simulations of military scenarios with various deployments of interface technologies in order to evaluate operator effectiveness. One such interface is TARDEC's Scalable Soldier-Machine Interface (SMI). The scalable SMI provides a configurable machine interface application that is capable of adapting to several hardware platforms by recognizing the physical space limitations of the display device. This paper describes the integration of the ISBS and Scalable SMI applications, which will ultimately benefit both systems. The ISBS will be able to use the Scalable SMI to visualize the behaviors of virtual soldiers performing HRI tasks, such as route planning, and the scalable SMI will benefit from stimuli provided by the ISBS simulation environment. The paper describes the background of each system and details of the system integration approach.
3D hybrid electrode structure as implantable interface for a vestibular neural prosthesis in humans.
Hoffmann, Klaus-P; Poppendieck, Wigand; Tätzner, Simon; DiGiovanna, Jack; Kos, Maria Izabel; Guinand, Nils; Guyot, Jean-P; Micera, Silvestro
2011-01-01
Implantable interfaces are essential components of vestibular neural prostheses. They interface the biological system with the electrical stimulation used to restore the transfer of vestibular information. Because of the anatomical situation, special 3D structures are required. In this paper, the design and the manufacturing process of a novel 3D hybrid microelectrode structure as an interface to the human vestibular system are described. Photolithography techniques, assembling technology, and rapid prototyping are used for manufacturing.
Human-system interfaces for space cognitive awareness
NASA Astrophysics Data System (ADS)
Ianni, J.
Space situational awareness is a human activity. We have advanced sensors and automation capabilities but these continue to be tools for humans to use. The reality is, however, that humans cannot take full advantage of the power of these tools due to time constraints, cognitive limitations, poor tool integration, poor human-system interfaces, and other reasons. Some excellent tools may never be used in operations and, even if they were, they may not be well suited to provide a cohesive and comprehensive picture. Recognizing this, the Air Force Research Laboratory (AFRL) is applying cognitive science principles to increase the knowledge derived from existing tools and creating new capabilities to help space analysts and decision makers. At the center of this research is Sensemaking Support Environment technology. The concept is to create cognitive-friendly computer environments that connect critical and creative thinking for holistic decision making. AFRL is also investigating new visualization technologies for multi-sensor exploitation and space weather, human-to-human collaboration technologies, and other technology that will be discussed in this paper.
NASA Technical Reports Server (NTRS)
2005-01-01
The document provides the Human System Integration (HSI) high-level functional C3 HSI requirements for the interface to the pilot. The description includes (1) the information required by the pilot to have knowledge of C3 system status, and (2) the control capability needed by the pilot to obtain C3 information. Fundamentally, these requirements provide the candidate C3 technology concepts with the necessary human-related elements to make them compatible with human capabilities and limitations. The results of the analysis describe how C3 operations and functions should interface with the pilot to provide the necessary C3 functionality to the UA-pilot system. Requirements and guidelines for C3 are partitioned into three categories: (1) Pilot-Air Traffic Control (ATC) Voice Communications, (2) Pilot-ATC Data Communications, and (3) command and control of the unmanned aircraft (UA). Each requirement is stated and is supported with a rationale and associated reference(s).
An operator interface design for a telerobotic inspection system
NASA Technical Reports Server (NTRS)
Kim, Won S.; Tso, Kam S.; Hayati, Samad
1993-01-01
The operator interface has recently emerged as an important element for efficient and safe interactions between human operators and telerobotic systems. Advances in graphical user interface and graphics technologies enable us to produce very efficient operator interface designs. This paper describes an efficient graphical operator interface design newly developed for remote surface inspection at NASA-JPL. The interface, designed so that remote surface inspection can be performed by a single operator with an integrated robot control and image inspection capability, supports three inspection strategies: teleoperated human visual inspection, human visual inspection with automated scanning, and machine-vision-based automated inspection.
NASA Technical Reports Server (NTRS)
Prinzel, Lawrence J., III; Kaber, David B.
2006-01-01
This report presents a review of literature on approaches to adaptive and adaptable task/function allocation and adaptive interface technologies for effective human management of complex systems that are likely to be issues for the Next Generation Air Transportation System, and a focus of research under the Aviation Safety Program, Integrated Intelligent Flight Deck Project. Contemporary literature retrieved from an online database search is summarized and integrated. The major topics include the effects of delegation-type, adaptable automation on human performance, workload, and situation awareness; the effectiveness of various automation invocation philosophies and strategies for function allocation in adaptive systems; the role of user modeling in adaptive interface design; and the performance implications of adaptive interface technology.
Hands in space: gesture interaction with augmented-reality interfaces.
Billinghurst, Mark; Piumsomboon, Tham; Bai, Huidong
2014-01-01
Researchers at the Human Interface Technology Laboratory New Zealand (HIT Lab NZ) are investigating free-hand gestures for natural interaction with augmented-reality interfaces. They've applied the results to systems for desktop computers and mobile devices.
NASA Technical Reports Server (NTRS)
Mitchell, Christine M.
1998-01-01
Historically, Command Management Systems (CMS) have been large, expensive, spacecraft-specific software systems that were costly to build, operate, and maintain. Current and emerging hardware, software, and user interface technologies may offer an opportunity to facilitate the initial formulation and design of a spacecraft-specific CMS as well as to develop a more generic CMS or a set of core components for CMS systems. Current MOC (mission operations center) hardware and software include Unix workstations, the C/C++ and Java programming languages, and X and Java window interfaces. This configuration provides the power and flexibility to support sophisticated systems and intelligent user interfaces that exploit state-of-the-art technologies in human-machine systems engineering, decision making, artificial intelligence, and software engineering. One of the goals of this research is to explore the extent to which technologies developed in the research laboratory can be productively applied in a complex system such as spacecraft command management. Initial examination of some of the issues in CMS design and operation suggests that application of technologies such as intelligent planning, case-based reasoning, design and analysis tools from a human-machine systems engineering point of view (e.g., operator and designer models), and human-computer interaction tools (e.g., graphics, visualization, and animation) may provide significant savings in the design, operation, and maintenance of a spacecraft-specific CMS as well as continuity for CMS design and development across spacecraft with varying needs. The savings in this case is in software reuse at all stages of the software engineering process.
Neurotechnology for monitoring and restoring sensory, motor, and autonomic functions
NASA Astrophysics Data System (ADS)
Wu, Pae C.; Knaack, Gretchen; Weber, Douglas J.
2016-05-01
The rapid and exponential advances in micro- and nanotechnologies over the last decade have enabled devices that communicate directly with the nervous system to measure and influence neural activity. Many of the earliest implementations focused on restoration of sensory and motor function, but as knowledge of physiology advances and technology continues to improve in accuracy, precision, and safety, new modes of engaging with the autonomic system herald an era of health restoration that may augment or replace many conventional pharmacotherapies. DARPA's Biological Technologies Office is continuing to advance neurotechnology by investing in neural interface technologies that are effective, reliable, and safe for long-term use in humans. DARPA's Hand Proprioception and Touch Interfaces (HAPTIX) program is creating a fully implantable system that interfaces with peripheral nerves in amputees to enable natural control and sensation for prosthetic limbs. Beyond standard electrode implementations, the Electrical Prescriptions (ElectRx) program is investing in innovative approaches to minimally or non-invasively interface with the peripheral nervous system using novel magnetic, optogenetic, and ultrasound-based technologies. These new mechanisms of interrogating and stimulating the peripheral nervous system are driving towards unparalleled spatiotemporal resolution, specificity and targeting, and noninvasiveness to enable chronic, human-use applications in closed-loop neuromodulation for the treatment of disease.
Advanced technologies for Mission Control Centers
NASA Technical Reports Server (NTRS)
Dalton, John T.; Hughes, Peter M.
1991-01-01
Advanced technologies for Mission Control Centers are presented in the form of viewgraphs. The following subject areas are covered: technology needs; current technology efforts at GSFC (human-machine interface development, object-oriented software development, expert systems, knowledge-based software engineering environments, and high performance VLSI telemetry systems); and test beds.
Digital Systems Validation Handbook. Volume 2. Chapter 19. Pilot - Vehicle Interface
1993-11-01
...checklists, and other status messages. Voice interactive systems are defined as "the interface between a cooperative human and a machine, which involves the..." Contents excerpt: 5.6 Pilot-Vehicle Interface; 5.6.1 Crew Interaction and the Cockpit; 5.6.2 Crew Resource Management and Safety; 5.6.3 Pilot and Crew Training. ...systems was a "stand-alone" component performing its intended function. Systems and their cockpit interfaces were added as technological advances were...
Voice Response Systems Technology.
ERIC Educational Resources Information Center
Gerald, Jeanette
1984-01-01
Examines two methods of generating synthetic speech in voice response systems, which allow computers to communicate in human terms (speech), using human interface devices (ears): phoneme and reconstructed voice systems. Considerations prior to implementation, current and potential applications, glossary, directory, and introduction to Input Output…
Szalma, James L
2014-12-01
Motivation is a driving force in human-technology interaction. This paper represents an effort to (a) describe a theoretical model of motivation in human-technology interaction, (b) provide design principles and guidelines based on this theory, and (c) describe a sequence of steps for the evaluation of motivational factors in human-technology interaction. Motivation theory has been relatively neglected in human factors/ergonomics (HF/E). In both research and practice, the (implicit) assumption has been that the operator is already motivated or that motivation is an organizational concern and beyond the purview of HF/E. However, technology can induce task-related boredom (e.g., automation) that can be stressful and also increase system vulnerability to performance failures. A theoretical model of motivation in human-technology interaction is proposed, based on extension of the self-determination theory of motivation to HF/E. This model provides the basis both for future research and for development of practical recommendations for design. General principles and guidelines for motivational design are described, as well as a sequence of steps for the design process. Human motivation is an important concern for HF/E research and practice. Procedures in the design of both simple and complex technologies can, and should, include the evaluation of motivational characteristics of the task, interface, or system. In addition, researchers should investigate these factors in specific human-technology domains. The theory, principles, and guidelines described here can be incorporated into existing techniques for task analysis and for interface and system design.
An intelligent control and virtual display system for evolutionary space station workstation design
NASA Technical Reports Server (NTRS)
Feng, Xin; Niederjohn, Russell J.; Mcgreevy, Michael W.
1992-01-01
Research and development of the Advanced Display and Computer Augmented Control System (ADCACS) for the space station Body-Ported Cupola Virtual Workstation (BP/VCWS) were pursued. The potential applications of body-ported virtual display and intelligent control technology for human-system interfacing in the space station environment were explored. The new system is designed to enable crew members to control and monitor a variety of space operations with greater flexibility and efficiency than existing fixed consoles. The technologies being studied include helmet-mounted virtual displays, voice and special command input devices, and microprocessor-based intelligent controllers. Several research topics, such as human factors, decision support expert systems, and wide field-of-view color displays, are being addressed. The study showed the significant advantages of this uniquely integrated display and control system, and its feasibility for human-system interfacing applications in the space station command and control environment.
The Application of Current User Interface Technology to Interactive Wargaming Systems.
1987-09-01
...components is essential to the Macintosh interface. Apple states that "Consistent visual communication is very powerful in delivering complex messages..." A visual interface uses visual objects as the basis of communication. "A visual communication object is some combination of text and graphics used for communication under a system of interpretation, or visual language." The benefit of visual communication is "When humans are faced...
A method to select human-system interfaces for nuclear power plants
Hugo, Jacques Victor; Gertman, David Ira
2015-10-19
The new generation of nuclear power plants (NPPs) will likely make use of state-of-the-art technologies in many areas of the plant. The analysis, design, and selection of advanced human-system interfaces (HSIs) constitute an important part of power plant engineering. Designers need to consider the new capabilities afforded by these technologies in the context of current regulations and new operational concepts, which is why they need a more rigorous method by which to plan the introduction of advanced HSIs in NPP work areas. Much of current human factors research stops at the user interface and fails to provide a definitive process for integration of end user devices with instrumentation and control (I&C) and operational concepts. The current lack of a clear definition of HSI technology, including the process for integration, makes characterization and implementation of new and advanced HSIs difficult. This paper describes how new design concepts in the nuclear industry can be analyzed and how HSI technologies associated with new industrial processes might be considered. Furthermore, it also describes a basis for an understanding of human as well as technology characteristics that could be incorporated into a prioritization scheme for technology selection and deployment plans.
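The abstract mentions a prioritization scheme for technology selection and deployment but does not publish one. As a hedged sketch only, a weighted-scoring approach of the kind commonly used for such prioritization could look like the following; the criteria, weights, candidate technologies, and scores are invented placeholders, not values from the paper.

```python
# Hypothetical weighted-scoring sketch for ranking candidate HSI technologies.
CRITERIA_WEIGHTS = {
    "operator_workload_reduction": 0.30,
    "regulatory_compatibility": 0.25,
    "integration_with_ic": 0.25,   # ease of integration with instrumentation & control
    "maturity": 0.20,
}

candidates = {
    "large_overview_display": {"operator_workload_reduction": 4, "regulatory_compatibility": 3,
                               "integration_with_ic": 4, "maturity": 5},
    "handheld_procedure_device": {"operator_workload_reduction": 3, "regulatory_compatibility": 4,
                                  "integration_with_ic": 3, "maturity": 4},
}

def priority_score(scores, weights=CRITERIA_WEIGHTS):
    """Weighted sum of 1-5 criterion scores; a higher total suggests earlier deployment."""
    return sum(weights[criterion] * scores[criterion] for criterion in weights)

ranked = sorted(candidates, key=lambda name: priority_score(candidates[name]), reverse=True)
print(ranked)
```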
NASA Technical Reports Server (NTRS)
Potter, William J.; Mitchell, Christine M.
1993-01-01
Historically, command management systems (CMS) have been large and expensive spacecraft-specific software systems that were costly to build, operate, and maintain. Current and emerging hardware, software, and user interface technologies may offer an opportunity to facilitate the initial formulation and design of a spacecraft-specific CMS as well as to develop a more generic CMS system. New technologies, in addition to a core CMS common to a range of spacecraft, may facilitate the training and enhance the efficiency of CMS operations. Current mission operations center (MOC) hardware and software include Unix workstations, the C/C++ programming languages, and an X window interface. This configuration provides the power and flexibility to support sophisticated and intelligent user interfaces that exploit state-of-the-art technologies in human-machine interaction, artificial intelligence, and software engineering. One of the goals of this research is to explore the extent to which technologies developed in the research laboratory can be productively applied in a complex system such as spacecraft command management. Initial examination of some of these issues in CMS design and operation suggests that application of technologies such as intelligent planning, case-based reasoning, human-machine systems design and analysis tools (e.g., operator and designer models), and human-computer interaction tools (e.g., graphics, visualization, and animation) may provide significant savings in the design, operation, and maintenance of the CMS for a specific spacecraft as well as continuity for CMS design and development across spacecraft. The first six months of this research saw a broad investigation by Georgia Tech researchers into the function, design, and operation of current and planned command management systems at Goddard Space Flight Center. As the first step, the researchers attempted to understand the current and anticipated horizons of command management systems at Goddard. Preliminary results are given on CMS commonalities and causes of low re-use, and methods are proposed to facilitate increased re-use.
Diverse applications of advanced man-telerobot interfaces
NASA Technical Reports Server (NTRS)
Mcaffee, Douglas A.
1991-01-01
Advancements in man-machine interfaces and control technologies used in space telerobotics and teleoperators have potential application wherever human operators need to manipulate multi-dimensional spatial relationships. Bilateral six degree-of-freedom position and force cues exchanged between the user and a complex system can broaden and improve the effectiveness of several diverse man-machine interfaces.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dudenhoeffer, Donald D.; Hallbert, Bruce P.
Instrumentation, Controls, and Human-Machine Interface (ICHMI) technologies are essential to ensuring delivery and effective operation of optimized advanced Generation IV (Gen IV) nuclear energy systems. In 1996, the Watts Bar I nuclear power plant in Tennessee was the last U.S. nuclear power plant to go on line. It was, in fact, built based on pre-1990 technology. Since this last U.S. nuclear power plant was designed, there have been major advances in the field of ICHMI systems. Computer technology employed in other industries has advanced dramatically, and computing systems are now replaced every few years as they become functionally obsolete. Functional obsolescence occurs when newer, more functional technology replaces or supersedes an existing technology, even though an existing technology may well be in working order. Although ICHMI architectures are comprised of much of the same technology, they have not been updated nearly as often in the nuclear power industry. For example, some newer Personal Digital Assistants (PDAs) or handheld computers may, in fact, have more functionality than the 1996 computer control system at the Watts Bar I plant. This illustrates the need to transition and upgrade current nuclear power plant ICHMI technologies.
Process for Selecting System Level Assessments for Human System Technologies
NASA Technical Reports Server (NTRS)
Watts, James; Park, John
2006-01-01
The integration of many life support systems necessary to construct a stable habitat is difficult. The correct identification of the appropriate technologies and corresponding interfaces is an exhaustive process. Once technologies are selected, secondary issues such as mechanical and electrical interfaces must be addressed. The required analytical and testing work must be approached in a piecewise fashion to achieve timely results. A repeatable process has been developed to identify and prioritize system-level assessments and testing needs. This Assessment Selection Process has been defined to assess cross-cutting integration issues on topics at the system or component levels. Assessments are used to identify risks, encourage future actions to mitigate risks, or spur further studies.
A pen-based system to support pre-operative data collection within an anaesthesia department.
Sanz, M. F.; Gómez, E. J.; Trueba, I.; Cano, P.; Arredondo, M. T.; del Pozo, F.
1993-01-01
This paper describes the design and implementation of a pen-based computer system for remote pre-operative data collection. The system is envisaged to be used by anaesthesia staff in the different hospital scenarios where pre-operative data are generated. Pen-based technology offers important advantages in terms of portability and human-computer interaction, such as direct-manipulation interfaces through direct pointing and "notebook user interface" metaphors. Since human factors analysis and user interface design are vital stages in achieving appropriate user acceptability, a methodology that integrates "usability" evaluation from the earliest development stages was used. Additionally, the selection of a pen-based computer system as a portable device to be used by health care personnel allows the appropriateness of this new technology for remote data collection within the hospital environment to be evaluated. The work presented is currently being realised under the Research Project "TANIT: Telematics in Anaesthesia and Intensive Care", within the "A.I.M.--Telematics in Health CARE" European Research Program. PMID:8130488
Conscious brain-to-brain communication in humans using non-invasive technologies.
Grau, Carles; Ginhoux, Romuald; Riera, Alejandro; Nguyen, Thanh Lam; Chauvat, Hubert; Berg, Michel; Amengual, Julià L; Pascual-Leone, Alvaro; Ruffini, Giulio
2014-01-01
Human sensory and motor systems provide the natural means for the exchange of information between individuals, and, hence, the basis for human civilization. The recent development of brain-computer interfaces (BCI) has provided an important element for the creation of brain-to-brain communication systems, and precise brain stimulation techniques are now available for the realization of non-invasive computer-brain interfaces (CBI). These technologies, BCI and CBI, can be combined to realize the vision of non-invasive, computer-mediated brain-to-brain (B2B) communication between subjects (hyperinteraction). Here we demonstrate the conscious transmission of information between human brains through the intact scalp and without intervention of motor or peripheral sensory systems. Pseudo-random binary streams encoding words were transmitted between the minds of emitter and receiver subjects separated by great distances, representing the realization of the first human brain-to-brain interface. In a series of experiments, we established internet-mediated B2B communication by combining a BCI based on voluntary motor imagery-controlled electroencephalographic (EEG) changes with a CBI inducing the conscious perception of phosphenes (light flashes) through neuronavigated, robotized transcranial magnetic stimulation (TMS), with special care taken to block sensory (tactile, visual or auditory) cues. Our results provide a critical proof-of-principle demonstration for the development of conscious B2B communication technologies. More fully developed, related implementations will open new research venues in cognitive, social and clinical neuroscience and the scientific study of consciousness. We envision that hyperinteraction technologies will eventually have a profound impact on the social structure of our civilization and raise important ethical issues.
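The experiment above transmitted words between subjects as pseudo-random binary streams, with each bit conveyed by motor-imagery EEG on the sender side and a TMS-induced phosphene on the receiver side. The sketch below only illustrates encoding and decoding a word as such a bit stream; the 5-bits-per-letter scheme and the function names are assumptions for illustration, not the coding details of the paper.

```python
def word_to_bits(word, bits_per_letter=5):
    """Encode a lowercase word as a flat bit list (a=0 ... z=25), 5 bits per letter."""
    bits = []
    for ch in word.lower():
        value = ord(ch) - ord("a")
        bits.extend((value >> shift) & 1 for shift in range(bits_per_letter - 1, -1, -1))
    return bits

def bits_to_word(bits, bits_per_letter=5):
    """Inverse of word_to_bits: regroup bits into letters."""
    letters = []
    for i in range(0, len(bits), bits_per_letter):
        value = 0
        for b in bits[i:i + bits_per_letter]:
            value = (value << 1) | b
        letters.append(chr(value + ord("a")))
    return "".join(letters)

stream = word_to_bits("hola")  # example word to encode
assert bits_to_word(stream) == "hola"
print(stream)
```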
Soft, Conformal Bioelectronics for a Wireless Human-Wheelchair Interface
Mishra, Saswat; Norton, James J. S.; Lee, Yongkuk; Lee, Dong Sup; Agee, Nicolas; Chen, Yanfei; Chun, Youngjae; Yeo, Woon-Hong
2017-01-01
There are more than 3 million people in the world whose mobility relies on wheelchairs. Recent advancements in engineering technology enable more intuitive, easy-to-use rehabilitation systems. A human-machine interface that uses non-invasive, electrophysiological signals can allow systematic interaction between humans and devices; for example, eye movement-based wheelchair control. However, the existing machine-interface platforms are obtrusive, uncomfortable, and often cause skin irritations as they require a metal electrode affixed to the skin with a gel and acrylic pad. Here, we introduce a bioelectronic system that makes dry, conformal contact with the skin. The mechanically comfortable sensor records high-fidelity electrooculograms, comparable to the conventional gel electrode. Quantitative signal analysis and infrared thermographs show the advantages of the soft biosensor for an ergonomic human-machine interface. A classification algorithm with an optimized set of features shows an accuracy of 94% with five eye movements. A Bluetooth-enabled system incorporating the soft bioelectronics demonstrates precise, hands-free control of a robotic wheelchair via electrooculograms. PMID:28152485
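The abstract reports a classification algorithm reaching roughly 94% accuracy over five eye movements using features extracted from the electrooculogram. As a hedged sketch only (the paper's actual feature set and classifier are not reproduced here, and the data below are synthetic stand-ins), the pipeline shape is typically: window the EOG, compute simple time-domain features, and feed a standard classifier.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def eog_features(window):
    """Simple time-domain features per channel (illustrative, not the paper's feature set)."""
    return np.concatenate([window.mean(axis=1), window.std(axis=1), np.ptp(window, axis=1)])

# Synthetic stand-in data: 300 two-channel EOG windows, each labeled with one of five eye movements.
rng = np.random.default_rng(0)
labels = rng.integers(0, 5, size=300)
windows = [rng.standard_normal((2, 250)) + 0.5 * lab for lab in labels]  # label-dependent offset
X = np.array([eog_features(w) for w in windows])

X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```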
Zander, Thorsten O; Kothe, Christian
2011-04-01
Cognitive monitoring is an approach utilizing realtime brain signal decoding (RBSD) for gaining information on the ongoing cognitive user state. In recent decades this approach has brought valuable insight into the cognition of an interacting human. Automated RBSD can be used to set up a brain-computer interface (BCI) providing a novel input modality for technical systems solely based on brain activity. In BCIs the user usually sends voluntary and directed commands to control the connected computer system or to communicate through it. In this paper we propose an extension of this approach by fusing BCI technology with cognitive monitoring, providing valuable information about the users' intentions, situational interpretations and emotional states to the technical system. We call this approach passive BCI. In the following we give an overview of studies which utilize passive BCI, as well as other novel types of applications resulting from BCI technology. We especially focus on applications for healthy users, and the specific requirements and demands of this user group. Since the presented approach of combining cognitive monitoring with BCI technology is very similar to the concept of BCIs itself we propose a unifying categorization of BCI-based applications, including the novel approach of passive BCI.
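A passive BCI, as described above, decodes the ongoing cognitive state rather than explicit commands, and hands that state to the technical system as context. Purely as an illustration (not the authors' code; the sampling rate, bands, and the theta/alpha workload heuristic are assumptions), the usual ingredients look like this:

```python
import numpy as np
from scipy.signal import welch

FS = 250  # sampling rate in Hz (assumed)

def band_power(eeg_channel, band, fs=FS):
    """Mean power of one EEG channel within a frequency band, via Welch's method."""
    freqs, psd = welch(eeg_channel, fs=fs, nperseg=fs)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].mean()

def estimate_workload(eeg_channel):
    """Toy passive-BCI state estimate: theta rising relative to alpha is a commonly
    used (illustrative) index of higher mental workload."""
    theta = band_power(eeg_channel, (4, 8))
    alpha = band_power(eeg_channel, (8, 13))
    return "high workload" if theta / alpha > 1.0 else "low workload"

if __name__ == "__main__":
    window = np.random.randn(2 * FS)          # 2 s of one simulated channel
    print(estimate_workload(window))          # output informs the system; it is not a command
```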
Harper, J G; Fuller, R; Sweeney, D; Waldmann, T
1998-04-01
This paper describes ergonomic issues raised during a project to provide a replacement real-time bus route control system to a large public transport company. Task and system analyses highlighted several deficiencies in the original system architecture, the human-machine interfaces and the general approach to system management. The eventual live prototype replaced the existing original system for a trial evaluation period of several weeks. During this period a number of studies was conducted with the system users in order to measure any improvements the new system, with its ergonomic features, produced over the old. Importantly, the results confirmed that (a) general responsiveness and service quality were improved, and (b) users were more comfortable with the new design. We conclude with a number of caveats which we believe will be useful to any group addressing technology impact in a large organisation.
The Voice as Computer Interface: A Look at Tomorrow's Technologies.
ERIC Educational Resources Information Center
Lange, Holley R.
1991-01-01
Discussion of voice as the communications device for computer-human interaction focuses on voice recognition systems for use within a library environment. Voice technologies are described, including voice response and voice recognition; examples of voice systems in use in libraries are examined; and further possibilities, including use with…
Systems Engineering and Integration for Advanced Life Support System and HST
NASA Technical Reports Server (NTRS)
Kamarani, Ali K.
2005-01-01
The systems engineering (SE) discipline has revolutionized the way engineers and managers think about solving issues related to the design of complex systems. With the continued development of state-of-the-art technologies, systems are becoming more complex, and therefore a systematic approach is essential to control and manage their integrated design and development. This complexity is driven by integration issues: subsystems must interact with one another in order to achieve integration objectives and also achieve the overall system's required performance. The systems engineering process addresses these issues at multiple levels. It is a technology and management process dedicated to controlling all aspects of the system life cycle to assure integration at all levels. The Advanced Integration Matrix (AIM) project serves as the systems engineering and integration function for the Human Support Technology (HST) program. AIM provides the means, in the form of integrated test facilities and personnel, for performance trade studies, analyses, integrated models, test results, and validated requirements for the integration of HST. The goal of AIM is to address systems-level integration issues for exploration missions. It will use an incremental systems integration approach to yield technologies, baselines for further development, and possible breakthrough concepts in the areas of technological and organizational interfaces, total information flow, system-wide controls, technical synergism, mission operations protocols and procedures, and human-machine interfaces.
Man-machine interface requirements - advanced technology
NASA Technical Reports Server (NTRS)
Remington, R. W.; Wiener, E. L.
1984-01-01
Research issues and areas are identified where increased understanding of the human operator and the interaction between the operator and the avionics could lead to improvements in the performance of current and proposed helicopters. Both current and advanced helicopter systems and avionics are considered. Areas critical to man-machine interface requirements include: (1) artificial intelligence; (2) visual displays; (3) voice technology; (4) cockpit integration; and (5) pilot work loads and performance.
Integration of advanced teleoperation technologies for control of space robots
NASA Technical Reports Server (NTRS)
Stagnaro, Michael J.
1993-01-01
Teleoperated robots require one or more humans to control actuators, mechanisms, and other robot equipment given feedback from onboard sensors. To accomplish this task, the human or humans require some form of control station. Desirable features of such a control station include operation by a single human, comfort, and natural human interfaces (visual, audio, motion, tactile, etc.). These interfaces should work to maximize performance of the human/robot system by streamlining the link between human brain and robot equipment. This paper describes development of a control station testbed with the characteristics described above. Initially, this testbed will be used to control two teleoperated robots. Features of the robots include anthropomorphic mechanisms, slaving to the testbed, and delivery of sensory feedback to the testbed. The testbed will make use of technologies such as helmet-mounted displays, voice recognition, and exoskeleton masters. It will allow for integration and testing of emerging telepresence technologies along with techniques for coping with control link time delays. Systems developed from this testbed could be applied to ground control of space based robots. During man-tended operations, the Space Station Freedom may benefit from ground control of IVA or EVA robots with science or maintenance tasks. Planetary exploration may also find advanced teleoperation systems to be very useful.
Applying Spatial Audio to Human Interfaces: 25 Years of NASA Experience
NASA Technical Reports Server (NTRS)
Begault, Durand R.; Wenzel, Elizabeth M.; Godfrey, Martine; Miller, Joel D.; Anderson, Mark R.
2010-01-01
From the perspective of human factors engineering, the inclusion of spatial audio within a human-machine interface is advantageous in several respects. Demonstrated benefits include the ability to monitor multiple streams of speech and non-speech warning tones by exploiting the "cocktail party" advantage, and support for aurally guided visual search. Other potential benefits include the spatial coordination and interaction of multimodal events, and the evaluation of new communication technologies and alerting systems using virtual simulation. Many of these technologies were developed at NASA Ames Research Center, beginning in 1985. This paper reviews examples and describes the advantages of spatial sound in NASA-related technologies, including space operations, aeronautics, and search and rescue. The work has involved hardware and software development as well as basic and applied research.
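The core idea behind spatial audio is rendering a sound so that it appears to come from a given direction. The sketch below illustrates only the simplest binaural cues, interaural time and level differences; real systems, including the NASA work cited, typically use measured head-related transfer functions (HRTFs). The head radius, level-difference scaling, and function names are assumptions for illustration.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
HEAD_RADIUS = 0.0875     # m, approximate

def spatialize(mono, fs, azimuth_deg):
    """Crude binaural rendering using interaural time and level differences only."""
    az = np.radians(azimuth_deg)
    itd = HEAD_RADIUS / SPEED_OF_SOUND * (abs(az) + np.sin(abs(az)))  # Woodworth approximation
    delay = int(round(itd * fs))
    gain_near = 1.0
    gain_far = 10 ** (-abs(np.sin(az)) * 6 / 20)                      # up to ~6 dB level difference
    delayed = np.concatenate([np.zeros(delay), mono])[: len(mono)]
    if azimuth_deg >= 0:   # source to the right: left ear is farther and hears a delayed signal
        left, right = gain_far * delayed, gain_near * mono
    else:
        left, right = gain_near * mono, gain_far * delayed
    return np.stack([left, right], axis=1)

tone = np.sin(2 * np.pi * 440 * np.arange(0, 0.5, 1 / 44100))
stereo = spatialize(tone, 44100, azimuth_deg=60)   # 0.5 s tone placed 60 degrees to the right
```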
New generation emerging technologies for neurorehabilitation and motor assistance.
Frisoli, Antonio; Solazzi, Massimiliano; Loconsole, Claudio; Barsotti, Michele
2016-12-01
This paper illustrates the application of emerging technologies and human-machine interfaces to the neurorehabilitation and motor assistance fields. The contribution focuses on wearable technologies, and in particular on robotic exoskeletons, as tools for increasing freedom to move and for performing Activities of Daily Living (ADLs). This would result in a deep improvement in quality of life, also in terms of improved function of internal organs and general health status. Furthermore, the integration of these robotic systems with advanced bio-signal-driven human-machine interfaces can increase the degree of participation of the patient in robotic training, allowing the system to recognize the user's intention and assist the patient in rehabilitation tasks, thus representing a fundamental aspect in eliciting motor learning.
User participation in the development of the human/computer interface for control centers
NASA Technical Reports Server (NTRS)
Broome, Richard; Quick-Campbell, Marlene; Creegan, James; Dutilly, Robert
1996-01-01
Technological advances coupled with the requirements to reduce operations staffing costs led to the demand for efficient, technologically-sophisticated mission operations control centers. The control center under development for the earth observing system (EOS) is considered. The users are involved in the development of a control center in order to ensure that it is cost-efficient and flexible. A number of measures were implemented in the EOS program in order to encourage user involvement in the area of human-computer interface development. The following user participation exercises carried out in relation to the system analysis and design are described: the shadow participation of the programmers during a day of operations; the flight operations personnel interviews; and the analysis of the flight operations team tasks. The user participation in the interface prototype development, the prototype evaluation, and the system implementation are reported on. The involvement of the users early in the development process enables the requirements to be better understood and the cost to be reduced.
Multiple man-machine interfaces
NASA Technical Reports Server (NTRS)
Stanton, L.; Cook, C. W.
1981-01-01
The multiple man-machine interfaces inherent in military pilot training, their social implications, and the issue of possible negative feedback were explored. Modern technology has produced machines which can see, hear, and touch with greater accuracy and precision than human beings. Consequently, the military pilot is more a systems manager, often doing battle against a target he never sees. It is concluded that unquantifiable human activity requires motivation that is not intrinsic in a machine.
The Human Interface Technology Laboratory.
ERIC Educational Resources Information Center
Washington Univ., Seattle. Washington Technology Center.
This booklet contains information about the Human Interface Technology Laboratory (HITL), which was established by the Washington Technology Center at the University of Washington to transform virtual world concepts and research into practical, economically viable technology products. The booklet is divided into seven sections: (1) a brief…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
These proceedings discuss human factor issues related to aerospace systems, aging, communications, computer systems, consumer products, education and forensic topics, environmental design, industrial ergonomics, international technology transfer, organizational design and management, personality and individual differences in human performance, safety, system development, test and evaluation, training, and visual performance. Particular attention is given to HUDs, attitude indicators, and sensor displays; human factors of space exploration; behavior and aging; the design and evaluation of phone-based interfaces; knowledge acquisition and expert systems; handwriting, speech, and other input techniques; interface design for text, numerics, and speech; and human factor issues in medicine. Also discussed are cumulative trauma disorders, industrial safety, evaluative techniques for automation impacts on the human operators, visual issues in training, and interpreting and organizing human factor concepts and information.
NASA Astrophysics Data System (ADS)
Fern, Lisa Carolynn
This dissertation examines the challenges inherent in designing and regulating to support human-automation interaction for new technologies that will be deployed into complex systems. A key question for new technologies with increasingly capable automation is how work will be accomplished by human and machine agents. This question has traditionally been framed as how functions should be allocated between humans and machines. Such framing misses the coordination and synchronization that is needed for the different human and machine roles in the system to accomplish their goals. Coordination and synchronization demands are driven by the underlying human-automation architecture of the new technology, which is typically not specified explicitly by designers. The human-machine interface (HMI), which is intended to facilitate human-machine interaction and cooperation, typically is defined explicitly and therefore serves as a proxy for human-automation cooperation requirements with respect to technical standards for technologies. Unfortunately, mismatches between the HMI and the coordination and synchronization demands of the underlying human-automation architecture can lead to system breakdowns. A methodology is needed that both designers and regulators can utilize to evaluate the predicted performance of a new technology given potential human-automation architectures. Three experiments were conducted to inform the minimum HMI requirements for a detect and avoid (DAA) system for unmanned aircraft systems (UAS). The results of the experiments provided empirical input to specific minimum operational performance standards that UAS manufacturers will have to meet in order to operate UAS in the National Airspace System (NAS). These studies represent a success story for how to objectively and systematically evaluate prototype technologies as part of the process for developing regulatory requirements. They also provide an opportunity to reflect on the lessons learned in order to improve the methodology for defining technology requirements for regulators in the future. The biggest shortcoming of the presented research program was the absence of the explicit definition, generation, and analysis of potential human-automation architectures. Failure to execute this step in the research process resulted in less efficient evaluation of the candidate prototype technologies, in addition to a lack of exploration of different approaches to human-automation cooperation. Defining potential human-automation architectures a priori also allows regulators to develop scenarios that will stress the performance boundaries of the technology during the evaluation phase. The importance of adding this step of generating and evaluating candidate human-automation architectures prior to formal empirical evaluation is discussed. This document concludes with a look at both the importance of, and the challenges facing, the inclusion of human-automation coordination issues in the safety assurance activities of new technologies.
Selecting Appropriate Functionality and Technologies for EPSS.
ERIC Educational Resources Information Center
McGraw, Karen L.
1995-01-01
Presents background information that describes the major components of an embedded performance support system, compares levels of functionality, and discusses some of the required technologies. Highlights include the human-computer interface; online help; advisors; training and tutoring; hypermedia; and artificial intelligence techniques. (LRW)
Kaplan, A Ya
2016-01-01
Brain-computer interface (BCI) technology, based on the registration and interpretation of the EEG, has recently become one of the most popular developments in neuroscience and psychophysiology. This is due not only to the intended future use of these technologies in many areas of practical human activity, but also to the fact that BCI is a completely new paradigm in psychophysiology, allowing researchers to test hypotheses about the ability of the human brain to develop skills for interacting with the outside world without the mediation of the motor system, i.e. only with the help of voluntary modulation of EEG generators. This paper examines the theoretical and experimental basis, the current state, and the prospects of development of training, communication, and assistive complexes based on BCI, which patients with severely impaired speech and motor systems can control without muscular effort, on the basis of mental commands detected in the EEG.
Human-Robot Control Strategies for the NASA/DARPA Robonaut
NASA Technical Reports Server (NTRS)
Diftler, M. A.; Culbert, Chris J.; Ambrose, Robert O.; Huber, E.; Bluethmann, W. J.
2003-01-01
The Robotic Systems Technology Branch at the NASA Johnson Space Center (JSC) is currently developing robot systems to reduce the Extra-Vehicular Activity (EVA) and planetary exploration burden on astronauts. One such system, Robonaut, is capable of interfacing with external Space Station systems that currently have only human interfaces. Robonaut is human scale, anthropomorphic, and designed to approach the dexterity of a space-suited astronaut. Robonaut can perform numerous human rated tasks, including actuating tether hooks, manipulating flexible materials, soldering wires, grasping handrails to move along space station mockups, and mating connectors. More recently, developments in autonomous control and perception for Robonaut have enabled dexterous, real-time man-machine interaction. Robonaut is now capable of acting as a practical autonomous assistant to the human, providing and accepting tools by reacting to body language. A versatile, vision-based algorithm for matching range silhouettes is used for monitoring human activity as well as estimating tool pose.
Knowledge-based load leveling and task allocation in human-machine systems
NASA Technical Reports Server (NTRS)
Chignell, M. H.; Hancock, P. A.
1986-01-01
Conventional human-machine systems use task allocation policies which are based on the premise of a flexible human operator. This individual is most often required to compensate for and augment the capabilities of the machine. The development of artificial intelligence and improved technologies have allowed for a wider range of task allocation strategies. In response to these issues a Knowledge Based Adaptive Mechanism (KBAM) is proposed for assigning tasks to human and machine in real time, using a load leveling policy. This mechanism employs an online workload assessment and compensation system which is responsive to variations in load through an intelligent interface. This interface consists of a loading strategy reasoner which has access to information about the current status of the human-machine system as well as a database of admissible human/machine loading strategies. Difficulties standing in the way of successful implementation of the load leveling strategy are examined.
Personal and Interpersonal Development of Humans in High Technology Environments.
ERIC Educational Resources Information Center
Morgan, Konrad; Morgan, Madeleine; Hall, John
This paper discusses psychological effects associated with the latest technology in computer interfaces. Emphasis is given to issues involved with multi-media systems and the development of the self on emotional, intellectual, and social levels. A review of technology attitudes and individual differences is presented in relation to the voluntary…
Mastinu, Enzo; Doguet, Pascal; Botquin, Yohan; Hakansson, Bo; Ortiz-Catalan, Max
2017-08-01
Despite the technological progress in robotics achieved in the last decades, prosthetic limbs still lack functionality, reliability, and comfort. Recently, an implanted neuromusculoskeletal interface built upon osseointegration was developed and tested in humans, namely the Osseointegrated Human-Machine Gateway. Here, we present an embedded system to exploit the advantages of this technology. Our artificial limb controller allows for bioelectric signals acquisition, processing, decoding of motor intent, prosthetic control, and sensory feedback. It includes a neurostimulator to provide direct neural feedback based on sensory information. The system was validated using real-time tasks characterization, power consumption evaluation, and myoelectric pattern recognition performance. Functionality was proven in a first pilot patient from whom results of daily usage were obtained. The system was designed to be reliably used in activities of daily living, as well as a research platform to monitor prosthesis usage and training, machine-learning-based control algorithms, and neural stimulation paradigms.
Solazzi, Massimiliano; Loconsole, Claudio; Barsotti, Michele
2016-01-01
This paper illustrates the application of emerging technologies and human-machine interfaces to the neurorehabilitation and motor assistance fields. The contribution focuses on wearable technologies, and in particular on robotic exoskeletons, as tools for increasing freedom of movement and for performing Activities of Daily Living (ADLs). This would result in a deep improvement in quality of life, also in terms of improved function of internal organs and general health status. Furthermore, the integration of these robotic systems with advanced bio-signal-driven human-machine interfaces can increase the patient's degree of participation in robotic training by recognizing the user's intention and assisting the patient in rehabilitation tasks, thus representing a fundamental aspect for eliciting motor learning. PMID:28484314
Implementing Artificial Intelligence Behaviors in a Virtual World
NASA Technical Reports Server (NTRS)
Krisler, Brian; Thome, Michael
2012-01-01
In this paper, we will present a look at the current state of the art in human-computer interface technologies, including intelligent interactive agents, natural speech interaction, and gesture-based interfaces. We describe our use of these technologies to implement a cost-effective, immersive experience on a public region in Second Life. We provision our artificial agent as a German Shepherd dog avatar, with an external rules engine controlling its behavior and movement. To interact with the avatar, we implemented a natural language and gesture system allowing human avatars to use speech and physical gestures rather than interacting via a keyboard and mouse. The result is a system that allows multiple humans to interact naturally with AI avatars by playing games such as fetch with a flying disk and even practicing obedience exercises using voice and gesture: a natural-seeming day in the park.
A Touch Sensing Technique Using the Effects of Extremely Low Frequency Fields on the Human Body
Elfekey, Hatem; Bastawrous, Hany Ayad; Okamoto, Shogo
2016-01-01
Touch sensing is a fundamental approach in human-to-machine interfaces and is currently in widespread use. Many current applications use active touch sensing technologies. Passive touch sensing technologies are, however, better suited to implementing low-power or energy-harvesting touch sensing interfaces. This paper presents a passive touch sensing technique based on the fact that the human body is affected by surrounding extremely low frequency (ELF) electromagnetic fields, such as those of AC power lines. These external ELF fields induce electric potentials on the human body (because human tissues exhibit some conductivity at these frequencies), resulting in what is called AC hum. We therefore propose a passive touch sensing system that detects this hum when a human touch occurs, thus distinguishing between touch and non-touch events. The effectiveness of the proposed technique is validated by designing and implementing a flexible touch sensing keyboard. PMID:27918416
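As an aside to this abstract, the following Python sketch illustrates the passive detection principle described above: classify a window of electrode samples as touch or no-touch by measuring the signal energy in a narrow band around the mains frequency. The sampling rate, band width, and threshold values are illustrative assumptions, not parameters from the paper.

    import numpy as np

    def hum_touch_detector(samples, fs, mains_hz=50.0, bandwidth=2.0, threshold=0.05):
        """Classify a window of electrode samples as touch / no-touch.

        A touching finger couples mains-frequency (ELF) interference into the
        electrode, so touch raises the energy in a narrow band around 50/60 Hz.
        All numeric values here are assumptions for illustration.
        """
        n = len(samples)
        spectrum = np.abs(np.fft.rfft(samples * np.hanning(n))) / n
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        band = (freqs > mains_hz - bandwidth) & (freqs < mains_hz + bandwidth)
        hum_level = spectrum[band].sum()
        return hum_level > threshold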
Development of a stereoscopic three-dimensional drawing application
NASA Astrophysics Data System (ADS)
Carver, Donald E.; McAllister, David F.
1991-08-01
With recent advances in 3-D technology, computer users have the opportunity to work within a natural 3-D environment; a flat panel LCD computer display of this type, the DTI-100M made by Dimension Technologies, Inc., recently went on the market. In a joint venture between DTI and NCSU, an object-oriented 3-D drawing application, 3-D Draw, was developed to address some issues of human interface design for interactive stereo drawing applications. The focus of this paper is to determine some of the procedures a user would naturally expect to follow while working within a true 3-D environment. The paper discusses (1) the interface between the Macintosh II and DTI-100M during implementation of 3-D Draw, including stereo cursor development and presentation of current 2-D systems, with an additional "depth" parameter, in the 3-D world, (2) problems in general for human interface into the 3-D environment, and (3) necessary functions and/or problems in developing future stereoscopic 3-D operating systems/tools.
NASA Astrophysics Data System (ADS)
Setscheny, Stephan
The interaction between human beings and technology forms a central aspect of human life. The most common form of this human-technology interface is the graphical user interface, controlled through the mouse and the keyboard. As a consequence of continuous miniaturization and the increasing performance of microcontrollers and of sensors for detecting human interactions, developers gain new possibilities for realising innovative interfaces. As part of this movement, the relevance of computers in the conventional sense, and of graphical user interfaces, is decreasing. The impact of this technical evolution can be seen especially in the area of ubiquitous computing and of interaction through tangible user interfaces. Beyond this, tangible and directly experienceable interaction offers users an interactive and intuitive method for controlling technical objects. The implementation of microcontrollers for control functions, together with sensors, enables the realisation of these experienceable interfaces. Besides the theory of tangible user interfaces, a consideration of sensors and the Arduino platform forms a main aspect of this work.
All printed touchless human-machine interface based on only five functional materials
NASA Astrophysics Data System (ADS)
Scheipl, G.; Zirkl, M.; Sawatdee, A.; Helbig, U.; Krause, M.; Kraker, E.; Andersson Ersman, P.; Nilsson, D.; Platt, D.; Bodö, P.; Bauer, S.; Domann, G.; Mogessie, A.; Hartmann, Paul; Stadlober, B.
2012-02-01
We demonstrate the printing of a complex smart integrated system using only five functional inks: the fluoropolymer P(VDF:TrFE) (poly(vinylidene fluoride-trifluoroethylene)) sensor ink, the conductive polymer PEDOT:PSS (poly(3,4-ethylenedioxythiophene):poly(styrene sulfonic acid)) ink, a conductive carbon paste, a polymeric electrolyte, and SU8 for separation. The result is a touchless human-machine interface, including piezo- and pyroelectric sensor pixels (sensitive to pressure changes and impinging infrared light), transistors for impedance matching and signal conditioning, and an electrochromic display. Applications may emerge not only in human-machine interfaces, but also in transient temperature or pressure sensing used in safety technology, in artificial skins, and in disposable sensor labels.
Human factors in space telepresence
NASA Technical Reports Server (NTRS)
Akin, D. L.; Howard, R. D.; Oliveria, J. S.
1983-01-01
The problems of interfacing a human with a teleoperation system, for work in space are discussed. Much of the information presented here is the result of experience gained by the M.I.T. Space Systems Laboratory during the past two years of work on the ARAMIS (Automation, Robotics, and Machine Intelligence Systems) project. Many factors impact the design of the man-machine interface for a teleoperator. The effects of each are described in turn. An annotated bibliography gives the key references that were used. No conclusions are presented as a best design, since much depends on the particular application desired, and the relevant technology is swiftly changing.
Waste Collector System Technology Comparisons for Constellation Applications
NASA Technical Reports Server (NTRS)
Broyan, James Lee, Jr.
2006-01-01
The Waste Collection Systems (WCS) for space vehicles have utilized a variety of hardware for collecting human metabolic wastes. It has typically required multiple missions to resolve crew usability and hardware performance issues that are difficult to duplicate on the ground. New space vehicles should leverage past WCS designs. Past WCS hardware designs are substantially different and unique for each vehicle. However, each WCS can be analyzed and compared as a set of technologies encompassing fecal collection, urine collection, air systems, and pretreatment systems. Technology components from the WCS of various vehicles can then be combined to reduce hardware mass and volume while maximizing use of previous technology and proven human-equipment interfaces. Analyses of past US and Russian WCS are compared and extrapolated to Constellation missions.
Closed-loop dialog model of face-to-face communication with a photo-real virtual human
NASA Astrophysics Data System (ADS)
Kiss, Bernadette; Benedek, Balázs; Szijárto, Gábor; Takács, Barnabás
2004-01-01
We describe an advanced Human Computer Interaction (HCI) model that employs photo-realistic virtual humans to provide digital media users with information, learning services, and entertainment in a highly personalized and adaptive manner. The system can be used as a computer interface or as a tool to deliver content to end users. We model the interaction process between the user and the system as part of a closed-loop dialog taking place between the participants. This dialog exploits the most important characteristics of a face-to-face communication process, including the use of non-verbal gestures and meta-communication signals to control the flow of information. Our solution is based on a Virtual Human Interface (VHI) technology that was specifically designed to create emotional engagement between the virtual agent and the user, thus increasing the efficiency of learning and/or absorbing any information broadcast through this device. The paper reviews the basic building blocks and technologies needed to create such a system and discusses its advantages over other existing methods.
Automatic Speech Recognition in Air Traffic Control: a Human Factors Perspective
NASA Technical Reports Server (NTRS)
Karlsson, Joakim
1990-01-01
The introduction of Automatic Speech Recognition (ASR) technology into the Air Traffic Control (ATC) system has the potential to improve overall safety and efficiency. However, because ASR technology is inherently a part of the man-machine interface between the user and the system, the human factors issues involved must be addressed. Here, some of the human factors problems are identified and related methods of investigation are presented. Research at M.I.T.'s Flight Transportation Laboratory is being conducted from a human factors perspective, focusing on intelligent parser design, presentation of feedback, error correction strategy design, and optimal choice of input modalities.
Determining Value in Higher Education: The Future of Instructional Technology in a Wal-Mart Economy.
ERIC Educational Resources Information Center
Tremblay, Wilfred
1992-01-01
Discusses value and the economy and examines the changing definition of educational value regarding higher education. Trends in instructional technology resulting from changes in expected educational value are described, including resource sharing, specialization, market expansion, privatization, easier human-machine interfaces, feedback systems,…
NASA Astrophysics Data System (ADS)
Roy, Jean; Breton, Richard; Paradis, Stephane
2001-08-01
Situation Awareness (SAW) is essential for commanders to conduct decision-making (DM) activities. Situation Analysis (SA) is defined as a process, the examination of a situation, its elements, and their relations, to provide and maintain a product, i.e., a state of SAW for the decision maker. Operational trends in warfare put the situation analysis process under pressure. This emphasizes the need for a real-time computer-based Situation Analysis Support System (SASS) to aid commanders in achieving the appropriate situation awareness, thereby supporting their response to actual or anticipated threats. Data fusion is clearly a key enabler for SA and a SASS. Since data fusion is used for SA in support of dynamic human decision-making, the exploration of SA concepts and the design of data fusion techniques must take into account human factors aspects in order to ensure a cognitive fit of the fusion system with the decision maker. Indeed, the tight integration of the human element with the SA technology is essential. Regarding these issues, this paper provides a description of CODSI (Command Decision Support Interface), an operational-like human-machine interface prototype for investigations in computer-based SA and command decision support. With CODSI, one objective was to apply recent developments in SA theory and information display technology to the problem of enhancing SAW quality. It thus provides a capability to adequately convey tactical information to command decision makers. It also supports the study of human-computer interactions for SA, and methodologies for SAW measurement.
Exploration Life Support Critical Questions for Future Human Space Missions
NASA Technical Reports Server (NTRS)
Ewert, Michael K.; Barta, Daniel J.; McQuillan, Jeff
2010-01-01
Exploration Life Support (ELS) is a current project under NASA's Exploration Systems Mission Directorate. The ELS Project plans, coordinates and implements the development of advanced life support technologies for human exploration missions in space. Recent work has focused on closed loop atmosphere and water systems for long duration missions, including habitats and pressurized rovers. But, what are the critical questions facing life support system developers for these and other future human missions? This paper explores those questions and how progress in the development of ELS technologies can help answer them. The ELS Project includes the following Elements: Atmosphere Revitalization Systems, Water Recovery Systems, Waste Management Systems, Habitation Engineering, Systems Integration, Modeling and Analysis, and Validation and Testing, which includes the Sub-Elements Flight Experiments and Integrated Testing. Systems engineering analysis by ELS seeks to optimize overall mission architectures by considering all the internal and external interfaces of the life support system and the potential for reduction or reuse of commodities. In particular, various sources and sinks of water and oxygen are considered along with the implications on loop closure and the resulting launch mass requirements. Systems analysis will be validated through the data gathered from integrated testing, which will demonstrate the interfaces of a closed loop life support system. By applying a systematic process for defining, sorting and answering critical life support questions, the ELS project is preparing for a variety of future human space missions.
NASA Astrophysics Data System (ADS)
McNamara, Laura A.; Berg, Leif; Butler, Karin; Klein, Laura
2017-05-01
Even as remote sensing technology has advanced in leaps and bounds over the past decade, the remote sensing community lacks interfaces and interaction models that facilitate effective human operation of our sensor platforms. Interfaces that make great sense to electrical engineers and flight test crews can be anxiety-inducing to operational users who lack professional experience in the design and testing of sophisticated remote sensing platforms. In this paper, we reflect on an 18-month collaboration in which our Sandia National Laboratories research team partnered with an industry software team to identify and fix critical issues in a widely used sensor interface. Drawing on basic principles from cognitive and perceptual psychology and interaction design, we provide simple, easily learned guidance for minimizing common barriers to system learnability, memorability, and user engagement.
Potential benefits and hazards of increased reliance on cockpit automation
NASA Technical Reports Server (NTRS)
Wiener, Earl L.
1990-01-01
A review is presented of the introduction of advanced technology into the modern aircraft cockpit, bringing a new era of cockpit automation, and the opportunity for safe, fuel-efficient, computer-directed flight. It is shown that this advanced technology has also brought a number of problems, not due to equipment failure, but due to problems at the human-automation interface. Consideration is given to the interface, the ATC system, and to company, regulatory, and economic environments, as well as to how they contribute to these new problems.
Automation in the graphic arts
NASA Astrophysics Data System (ADS)
Truszkowski, Walt
1995-04-01
The CHIMES (Computer-Human Interaction Models) tool was designed to help solve a simply stated but important problem, i.e., the problem of generating a user interface to a system that complies with established human factors standards and guidelines. Though designed for use in a fairly restricted user domain, i.e., spacecraft mission operations, the CHIMES system is essentially domain independent and applicable wherever graphical user interfaces or displays are to be encountered. The CHIMES philosophy and operating strategy are quite simple. Instead of requiring a human designer to actively maintain in his or her head the now encyclopedic knowledge that human factors and user interface specialists have evolved, CHIMES incorporates this information in its knowledge bases. When directed to evaluate a design, CHIMES determines and accesses the appropriate knowledge, performs an evaluation of the design against that information, determines whether the design is compliant with the selected guidelines, and suggests corrective actions if deviations from guidelines are discovered. This paper will provide an overview of the capabilities of the current CHIMES tool and discuss the potential integration of CHIMES-like technology in automated graphic arts systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kovesdi, C.; Joe, J.; Boring, R.
The primary objective of the United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program is to sustain operation of the existing commercial nuclear power plants (NPPs) through a multi-pathway approach in conducting research and development (R&D). The Advanced Instrumentation, Information, and Control (II&C) System Technologies pathway conducts targeted R&D to address aging and reliability concerns with legacy instrumentation and control (I&C) and other information systems in existing U.S. NPPs. Control room modernization is an important part of this pathway, and human factors experts at Idaho National Laboratory (INL) have been involved in conducting R&D to support migration of new digital main control room (MCR) technologies from legacy analog and legacy digital I&C. This paper describes a human factors engineering (HFE) process that supports human-system interface (HSI) design in MCR modernization activities, particularly with migration of old digital to new digital I&C. The process described in this work is an expansion from the LWRS Report INL/EXT-16-38576, and is a requirements-driven approach that aligns with NUREG-0711 requirements. The work described builds upon the existing literature by adding more detail around key tasks and decisions to make when transitioning from HSI design into Verification and Validation (V&V). The overall objective of this process is to inform HSI design and elicit specific, measurable, and achievable human factors criteria for new digital technologies. Upon following this process, utilities should have greater confidence in transitioning from HSI design into V&V.
Enabling Exploration Through Docking Standards
NASA Technical Reports Server (NTRS)
Hatfield, Caris A.
2012-01-01
Human exploration missions beyond low earth orbit will likely require international cooperation in order to leverage limited resources. International standards can help enable cooperative missions by providing well understood, predefined interfaces allowing compatibility between unique spacecraft and systems. The International Space Station (ISS) partnership has developed a publicly available International Docking System Standard (IDSS) that provides a solution to one of these key interfaces by defining a common docking interface. The docking interface provides a way for even dissimilar spacecraft to dock for exchange of crew and cargo, as well as enabling the assembly of large space systems. This paper provides an overview of the key attributes of the IDSS, an overview of the NASA Docking System (NDS), and the plans for updating the ISS with IDSS compatible interfaces. The NDS provides a state of the art, low impact docking system that will initially be made available to commercial crew and cargo providers. The ISS will be used to demonstrate the operational utility of the IDSS interface as a foundational technology for cooperative exploration.
[Integration of nursing in science and technology policies].
Rocha, Semíramis Melani Melo; Ogata, Márcia Niituma; Arantes, Cássia Irene Spinelli
2003-01-01
Brazilian nursing is included in the national science and technology system as part of the health knowledge area. Its scientific production is renowned but has yet to strengthen its position. Among the strategies to be used, we can emphasize: studying different ways to promote a closer relationship between university and services; creating or intensifying interfaces between clinical and academic nurses; promoting strategic research for the use of technological innovations and the continuing education of human resources; and implementing studies on nursing care that integrate the skills required by complex technological systems with intersubjectivity, acting in a therapeutic way.
Implantable brain computer interface: challenges to neurotechnology translation.
Konrad, Peter; Shanks, Todd
2010-06-01
This article reviews three concepts related to implantable brain computer interface (BCI) devices being designed for human use: neural signal extraction, primarily for motor commands; signal insertion to restore sensation; and the technological challenges that remain. A significant body of literature has accumulated over the past four decades regarding motor cortex signal extraction for upper extremity movement or computer interface. However, little is discussed regarding postural or ambulation command signaling. Auditory prosthesis research continues to represent the majority of the literature on BCI signal insertion. Significant hurdles remain in the technological translation of BCI implants. These include developing a stable neural interface, significantly increasing signal processing capabilities, and methods of data transfer throughout the human body. The past few years, however, have provided extraordinary human examples of BCI implant potential. Despite technological hurdles, proof-of-concept animal and human studies provide significant encouragement that BCI implants may well find their way into mainstream medical practice in the foreseeable future.
Role and interest of new technologies in data processing for space control centers
NASA Astrophysics Data System (ADS)
Denier, Jean-Paul; Caspar, Raoul; Borillo, Mario; Soubie, Jean-Luc
1990-10-01
The ways in which a multidisciplinary approach will improve space control centers are discussed. Electronic documentation, ergonomics of human-computer interfaces, natural language, intelligent tutoring systems, and artificial intelligence systems are considered and applied in the study of the Hermes flight control center. It is concluded that such technologies are best integrated into a classical operational environment rather than through a revolutionary approach that would involve a global modification of the system.
Monitoring and controlling ATLAS data management: The Rucio web user interface
NASA Astrophysics Data System (ADS)
Lassnig, M.; Beermann, T.; Vigne, R.; Barisits, M.; Garonne, V.; Serfon, C.
2015-12-01
The monitoring and controlling interfaces of the previous data management system, DQ2, followed the evolutionary requirements and needs of the ATLAS collaboration. The new data management system, Rucio, has put in place a redesigned web-based interface based upon the lessons learnt from DQ2 and the increased volume of managed information. This interface encompasses both a monitoring and a controlling component, and allows easy integration of user-generated views. The interface follows three design principles. First, the collection and storage of data from internal and external systems is asynchronous to reduce latency; this includes the use of technologies like ActiveMQ or Nagios. Second, analysis of the data into information is done massively in parallel due to its volume, using a combined approach with an Oracle database and Hadoop MapReduce. Third, sharing of the information does not distinguish between human and programmatic access, making it easy to access selective parts of the information both in constrained frontends like web browsers and from remote services. This contribution will detail the reasons for these principles and the design choices taken. Additionally, the implementation, the interactions with external systems, and an evaluation of the system in production, from both a technological and a user perspective, conclude this contribution.
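The first design principle (asynchronous collection so that producers are never blocked) can be illustrated with a minimal Python sketch. An in-process queue stands in for an ActiveMQ subscription and a plain list stands in for the Oracle/Hadoop backend; this is an assumed, simplified analogue, not the Rucio implementation.

    import json
    import queue
    import threading

    events = queue.Queue()          # stands in for a message-broker subscription
    storage = []                    # stands in for the database/Hadoop backend

    def collector():
        """Drain monitoring messages off the queue and persist them asynchronously,
        so the systems producing the messages never wait on storage latency."""
        while True:
            msg = events.get()
            if msg is None:         # sentinel used to stop the worker
                break
            storage.append(json.loads(msg))
            events.task_done()

    worker = threading.Thread(target=collector, daemon=True)
    worker.start()
    events.put(json.dumps({"source": "nagios", "metric": "transfer_rate", "value": 42}))
    events.put(None)
    worker.join()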
Reducing lumber thickness variation using real-time statistical process control
Thomas M. Young; Brian H. Bond; Jan Wiedenbeck
2002-01-01
A technology feasibility study for reducing lumber thickness variation was conducted from April 2001 until March 2002 at two sawmills located in the southern U.S. A real-time statistical process control (SPC) system was developed that featured Wonderware human machine interface technology (HMI) with distributed real-time control charts for all sawing centers and...
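The real-time control charts described above follow standard Shewhart SPC practice. A minimal sketch, assuming X-bar limits computed from subgroup means and ranges with the usual A2 constant for subgroups of five boards (the thickness values and constants are illustrative, not data from the study), might look like this:

    import statistics

    def xbar_limits(subgroup_means, subgroup_ranges, a2=0.577):
        """Compute the X-bar chart centre line and control limits from subgroup data.
        A2 = 0.577 is the standard constant for subgroups of size 5."""
        xbar = statistics.mean(subgroup_means)
        rbar = statistics.mean(subgroup_ranges)
        return xbar - a2 * rbar, xbar, xbar + a2 * rbar

    def out_of_control(new_mean, lcl, ucl):
        """Flag a sawing centre whose latest subgroup mean falls outside the limits."""
        return new_mean < lcl or new_mean > ucl

    # Hypothetical thickness subgroups (inches) from one sawing centre
    means = [1.063, 1.058, 1.061, 1.066, 1.060]
    ranges = [0.012, 0.015, 0.010, 0.014, 0.011]
    lcl, cl, ucl = xbar_limits(means, ranges)
    print(out_of_control(1.075, lcl, ucl))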
Multi-modal virtual environment research at Armstrong Laboratory
NASA Technical Reports Server (NTRS)
Eggleston, Robert G.
1995-01-01
One mission of the Paul M. Fitts Human Engineering Division of Armstrong Laboratory is to improve the user interface for complex systems through user-centered exploratory development and research activities. In support of this goal, many current projects attempt to advance and exploit user-interface concepts made possible by virtual reality (VR) technologies. Virtual environments may be used as a general purpose interface medium, an alternative display/control method, a data visualization and analysis tool, or a graphically based performance assessment tool. An overview is given of research projects within the division on prototype interface hardware/software development, integrated interface concept development, interface design and evaluation tool development, and user and mission performance evaluation tool development.
Human Motion Tracking and Glove-Based User Interfaces for Virtual Environments in ANVIL
NASA Technical Reports Server (NTRS)
Dumas, Joseph D., II
2002-01-01
The Army/NASA Virtual Innovations Laboratory (ANVIL) at Marshall Space Flight Center (MSFC) provides an environment where engineers and other personnel can investigate novel applications of computer simulation and Virtual Reality (VR) technologies. Among the many hardware and software resources in ANVIL are several high-performance Silicon Graphics computer systems and a number of commercial software packages, such as Division MockUp by Parametric Technology Corporation (PTC) and Jack by Unigraphics Solutions, Inc. These hardware and software platforms are used in conjunction with various VR peripheral I/O (input / output) devices, CAD (computer aided design) models, etc. to support the objectives of the MSFC Engineering Systems Department/Systems Engineering Support Group (ED42) by studying engineering designs, chiefly from the standpoint of human factors and ergonomics. One of the more time-consuming tasks facing ANVIL personnel involves the testing and evaluation of peripheral I/O devices and the integration of new devices with existing hardware and software platforms. Another important challenge is the development of innovative user interfaces to allow efficient, intuitive interaction between simulation users and the virtual environments they are investigating. As part of his Summer Faculty Fellowship, the author was tasked with verifying the operation of some recently acquired peripheral interface devices and developing new, easy-to-use interfaces that could be used with existing VR hardware and software to better support ANVIL projects.
Attacking the information access problem with expert systems
NASA Technical Reports Server (NTRS)
Ragusa, James M.; Orwig, Gary W.
1991-01-01
The results of applications research directed at finding an improved method of storing and accessing information are presented. Twelve microcomputer-based expert system shells and five laser-optical formats have been studied, and the general and specific methods of interfacing these technologies are being tested in prototype systems. Shell features and interfacing capabilities are discussed, and results from the study of five laser-optical formats are recounted, including video laser disks, compact disks, WORM disks, laser cards, and film. Interfacing, including laser disk device driver interfacing, is discussed, and it is pointed out that in order to control the laser device from within the expert system application, the expert system shell must be able to access the device driver software. Potential integrated applications are investigated and an initial list is provided, including consumer services, travel, law enforcement, human resources, marketing, and education and training.
NASA Astrophysics Data System (ADS)
Pieper, Steven D.; McKenna, Michael; Chen, David; McDowall, Ian E.
1994-04-01
We are interested in the application of computer animation to surgery. Our current project, a navigation and visualization tool for knee arthroscopy, relies on real-time computer graphics and the human interface technologies associated with virtual reality. We believe that this new combination of techniques will lead to improved surgical outcomes and decreased health care costs. To meet these expectations in the medical field, the system must be safe, usable, and cost-effective. In this paper, we outline some of the most important hardware and software specifications in the areas of video input and output, spatial tracking, stereoscopic displays, computer graphics models and libraries, mass storage and network interfaces, and operating systems. Since this is a fairly new combination of technologies and a new application, our justification for our specifications are drawn from the current generation of surgical technology and by analogy to other fields where virtual reality technology has been more extensively applied and studied.
Buried waste integrated demonstration human engineered control station. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1994-09-01
This document describes the Human Engineered Control Station (HECS) project activities including the conceptual designs. The purpose of the HECS is to enhance the effectiveness and efficiency of remote retrieval by providing an integrated remote control station. The HECS integrates human capabilities, limitations, and expectations into the design to reduce the potential for human error, provides an easy system to learn and operate, provides an increased productivity, and reduces the ultimate investment in training. The overall HECS consists of the technology interface stations, supporting engineering aids, platform (trailer), communications network (broadband system), and collision avoidance system.
Visual Debugging of Object-Oriented Systems With the Unified Modeling Language
2004-03-01
Multimodal Neuroelectric Interface Development
NASA Technical Reports Server (NTRS)
Trejo, Leonard J.; Wheeler, Kevin R.; Jorgensen, Charles C.; Totah, Joseph (Technical Monitor)
2001-01-01
This project aims to improve performance of NASA missions by developing multimodal neuroelectric technologies for augmented human-system interaction. Neuroelectric technologies will add completely new modes of interaction that operate in parallel with keyboards, speech, or other manual controls, thereby increasing the bandwidth of human-system interaction. We recently demonstrated the feasibility of real-time electromyographic (EMG) pattern recognition for a direct neuroelectric human-computer interface. We recorded EMG signals from an elastic sleeve with dry electrodes, while a human subject performed a range of discrete gestures. A machine-learning algorithm was trained to recognize the EMG patterns associated with the gestures and map them to control signals. Successful applications now include piloting two Class 4 aircraft simulations (F-15 and 757) and entering data with a "virtual" numeric keyboard. Current research focuses on on-line adaptation of EMG sensing and processing and recognition of continuous gestures. We are also extending this on-line pattern recognition methodology to electroencephalographic (EEG) signals. This will allow us to bypass muscle activity and draw control signals directly from the human brain. Our system can reliably detect the mu-rhythm (a periodic EEG signal from motor cortex in the 10 Hz range) with a lightweight headset containing saline-soaked sponge electrodes. The data show that the EEG mu-rhythm can be modulated by real and imagined motions. Current research focuses on using biofeedback to train human subjects to modulate EEG rhythms on demand, and on examining interactions of EEG-based control with EMG-based and manual control. Viewgraphs on these neuroelectric technologies are also included.
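A hedged illustration of the EMG pattern-recognition step described above: windowed features are extracted per channel and fed to a generic classifier. The feature set, the classifier choice, and the synthetic training data are assumptions for demonstration only, not the algorithm used in the project.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def emg_features(window):
        """Simple per-channel features: mean absolute value and zero-crossing count."""
        mav = np.mean(np.abs(window), axis=0)
        zc = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)
        return np.concatenate([mav, zc])

    rng = np.random.default_rng(0)
    # Hypothetical training data: 200 windows, 64 samples each, 4 EMG channels, 2 gestures
    windows = rng.normal(size=(200, 64, 4))
    labels = rng.integers(0, 2, size=200)

    X = np.array([emg_features(w) for w in windows])
    clf = LogisticRegression(max_iter=1000).fit(X, labels)
    gesture = clf.predict(emg_features(windows[0]).reshape(1, -1))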
Automatic Speech Acquisition and Recognition for Spacesuit Audio Systems
NASA Technical Reports Server (NTRS)
Ye, Sherry
2015-01-01
NASA has a widely recognized but unmet need for novel human-machine interface technologies that can facilitate communication during astronaut extravehicular activities (EVAs), when loud noises and strong reverberations inside spacesuits make communication challenging. WeVoice, Inc., has developed a multichannel signal-processing method for speech acquisition in noisy and reverberant environments that enables automatic speech recognition (ASR) technology inside spacesuits. The technology reduces noise by exploiting differences between the statistical nature of signals (i.e., speech) and noise that exists in the spatial and temporal domains. As a result, ASR accuracy can be improved to the level at which crewmembers will find the speech interface useful. System components and features include beam forming/multichannel noise reduction, single-channel noise reduction, speech feature extraction, feature transformation and normalization, feature compression, and ASR decoding. Arithmetic complexity models were developed and will help designers of real-time ASR systems select proper tasks when confronted with constraints in computational resources. In Phase I of the project, WeVoice validated the technology. The company further refined the technology in Phase II and developed a prototype for testing and use by suited astronauts.
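To illustrate the beamforming/multichannel noise-reduction component mentioned in this abstract, a minimal delay-and-sum sketch follows; the microphone delays, sampling rate, and synthetic signals are assumed for demonstration and do not represent the WeVoice implementation.

    import numpy as np

    def delay_and_sum(channels, delays_samples):
        """Align each microphone channel by its integer sample delay and average.
        Steering toward the talker reinforces speech while averaging down diffuse noise."""
        n = min(len(ch) - d for ch, d in zip(channels, delays_samples))
        aligned = [ch[d:d + n] for ch, d in zip(channels, delays_samples)]
        return np.mean(aligned, axis=0)

    fs = 16000
    t = np.arange(fs) / fs
    speech = np.sin(2 * np.pi * 300 * t)                   # stand-in for a speech signal
    mics = [np.roll(speech, d) + 0.3 * np.random.randn(fs) for d in (0, 3, 6, 9)]
    enhanced = delay_and_sum(mics, [0, 3, 6, 9])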
Ubiquitous Wireless Smart Sensing and Control
NASA Technical Reports Server (NTRS)
Wagner, Raymond
2013-01-01
Need new technologies to reliably and safely have humans interact within sensored environments (integrated user interfaces, physical and cognitive augmentation, training, and human-systems integration tools). Areas of focus include: radio frequency identification (RFID), motion tracking, wireless communication, wearable computing, adaptive training and decision support systems, and tele-operations. The challenge is developing effective, low cost/mass/volume/power integrated monitoring systems to assess and control system, environmental, and operator health; and accurately determining and controlling the physical, chemical, and biological environments of the areas and associated environmental control systems.
Ubiquitous Wireless Smart Sensing and Control. Pumps and Pipes JSC: Uniquely Houston
NASA Technical Reports Server (NTRS)
Wagner, Raymond
2013-01-01
Need new technologies to reliably and safely have humans interact within sensored environments (integrated user interfaces, physical and cognitive augmentation, training, and human-systems integration tools). Areas of focus include: radio frequency identification (RFID), motion tracking, wireless communication, wearable computing, adaptive training and decision support systems, and tele-operations. The challenge is developing effective, low cost/mass/volume/power integrated monitoring systems to assess and control system, environmental, and operator health; and accurately determining and controlling the physical, chemical, and biological environments of the areas and associated environmental control systems.
Information visualization: Beyond traditional engineering
NASA Technical Reports Server (NTRS)
Thomas, James J.
1995-01-01
This presentation addresses a different aspect of the human-computer interface, specifically the human-information interface. This interface will be dominated by an emerging technology called Information Visualization (IV). IV goes beyond the traditional views of computer graphics and CAD, and enables new approaches to engineering. IV specifically must visualize text, documents, sound, images, and video in such a way that the human can rapidly interact with and understand the content structure of information entities. IV is the interactive visual interface between humans and their information resources.
Monitoring osseointegration and developing intelligent systems (Conference Presentation)
NASA Astrophysics Data System (ADS)
Salvino, Liming W.
2017-05-01
Effective monitoring of structural and biological systems is an extremely important research area that enables technology development for future intelligent devices, platforms, and systems. This presentation provides an overview of research efforts funded by the Office of Naval Research (ONR) to establish structural health monitoring (SHM) methodologies in the human domain. Basic science efforts are needed to utilize SHM sensing, data analysis, modeling, and algorithms to obtain the relevant physiological and biological information for human-specific health and performance conditions. This overview of current research efforts is based on the Monitoring Osseointegrated Prosthesis (MOIP) program. MOIP develops implantable and intelligent prosthetics that are directly anchored to the bone of residual limbs. Through real-time monitoring, sensing, and responding to osseointegration of bones and implants as well as interface conditions and environment, our research program aims to obtain individualized actionable information for implant failure identification, load estimation, infection mitigation and treatment, as well as healing assessment. Looking ahead to achieve ultimate goals of SHM, we seek to expand our research areas to cover monitoring human, biological and engineered systems, as well as human-machine interfaces. Examples of such include 1) brainwave monitoring and neurological control, 2) detecting and evaluating brain injuries, 3) monitoring and maximizing human-technological object teaming, and 4) closed-loop setups in which actions can be triggered automatically based on sensors, actuators, and data signatures. Finally, some ongoing and future collaborations across different disciplines for the development of knowledge automation and intelligent systems will be discussed.
Flight Deck Display Technologies for 4DT and Surface Equivalent Visual Operations
NASA Technical Reports Server (NTRS)
Prinzel, Lawrence J., III; Jones, Denis R.; Shelton, Kevin J.; Arthur, Jarvis J., III; Bailey, Randall E.; Allamandola, Angela S.; Foyle, David C.; Hooey, Becky L.
2009-01-01
NASA research is focused on flight deck display technologies that may significantly enhance situation awareness, enable new operating concepts, and reduce the potential for incidents/accidents for terminal area and surface operations. The display technologies include surface map, head-up, and head-worn displays; 4DT guidance algorithms; synthetic and enhanced vision technologies; and terminal maneuvering area traffic conflict detection and alerting systems. This work is critical to ensure that the flight deck interface technologies and the role of the human participants can support the full realization of the Next Generation Air Transportation System (NextGen) and its novel operating concepts.
Autonomous power expert system
NASA Technical Reports Server (NTRS)
Ringer, Mark J.; Quinn, Todd M.
1990-01-01
The goal of the Autonomous Power System (APS) program is to develop and apply intelligent problem solving and control technologies to the Space Station Freedom Electrical Power Systems (SSF/EPS). The objectives of the program are to establish artificial intelligence/expert system technology paths, to create knowledge based tools with advanced human-operator interfaces, and to integrate and interface knowledge-based and conventional control schemes. This program is being developed at the NASA-Lewis. The APS Brassboard represents a subset of a 20 KHz Space Station Power Management And Distribution (PMAD) testbed. A distributed control scheme is used to manage multiple levels of computers and switchgear. The brassboard is comprised of a set of intelligent switchgear used to effectively switch power from the sources to the loads. The Autonomous Power Expert System (APEX) portion of the APS program integrates a knowledge based fault diagnostic system, a power resource scheduler, and an interface to the APS Brassboard. The system includes knowledge bases for system diagnostics, fault detection and isolation, and recommended actions. The scheduler autonomously assigns start times to the attached loads based on temporal and power constraints. The scheduler is able to work in a near real time environment for both scheduling and dynamic replanning.
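As an illustration of the scheduling behavior described above (assigning start times to loads under temporal and power constraints), a minimal greedy sketch follows; the load list, capacity, and time-slot structure are hypothetical and far simpler than the APEX scheduler.

    def schedule_loads(loads, capacity_kw, horizon):
        """Greedy start-time assignment: place each load at the earliest time slot
        where its power demand fits under the remaining capacity for its duration."""
        available = [capacity_kw] * horizon
        schedule = {}
        for name, power, duration, earliest in loads:
            for start in range(earliest, horizon - duration + 1):
                if all(available[t] >= power for t in range(start, start + duration)):
                    for t in range(start, start + duration):
                        available[t] -= power
                    schedule[name] = start
                    break
        return schedule

    # Hypothetical loads: (name, power kW, duration in slots, earliest start slot)
    loads = [("heater", 3, 4, 0), ("pump", 2, 2, 1), ("experiment", 4, 3, 0)]
    print(schedule_loads(loads, capacity_kw=6, horizon=12))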
The Human Factors and Ergonomics of P300-Based Brain-Computer Interfaces
Powers, J. Clark; Bieliaieva, Kateryna; Wu, Shuohao; Nam, Chang S.
2015-01-01
Individuals with severe neuromuscular impairments face many challenges in communication and manipulation of the environment. Brain-computer interfaces (BCIs) show promise in presenting real-world applications that can provide such individuals with the means to interact with the world using only brain waves. Although there has been a growing body of research in recent years, much relates only to technology, and not to technology in use—i.e., real-world assistive technology employed by users. This review examined the literature to highlight studies that implicate the human factors and ergonomics (HFE) of P300-based BCIs. We assessed 21 studies on three topics to speak directly to improving the HFE of these systems: (1) alternative signal evocation methods within the oddball paradigm; (2) environmental interventions to improve user performance and satisfaction within the constraints of current BCI systems; and (3) measures and methods of measuring user acceptance. We found that HFE is central to the performance of P300-based BCI systems, although researchers do not often make explicit this connection. Incorporation of measures of user acceptance and rigorous usability evaluations, increased engagement of disabled users as test participants, and greater realism in testing will help progress the advancement of P300-based BCI systems in assistive applications. PMID:26266424
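For readers unfamiliar with the oddball paradigm underlying P300-based BCIs, a minimal sketch of the classic detection idea is shown below: average stimulus-locked epochs and compare the mean amplitude in the P300 latency window for target versus non-target stimuli. The sampling rate, window, threshold, and synthetic data are assumptions, not parameters from the reviewed studies.

    import numpy as np

    def p300_score(epochs_target, epochs_nontarget, fs, window=(0.25, 0.45)):
        """Average EEG epochs time-locked to target vs. non-target stimuli and compare
        mean amplitude in the P300 latency window (~300 ms after stimulus onset)."""
        i0, i1 = int(window[0] * fs), int(window[1] * fs)
        target_erp = epochs_target.mean(axis=0)
        nontarget_erp = epochs_nontarget.mean(axis=0)
        return target_erp[i0:i1].mean() - nontarget_erp[i0:i1].mean()

    fs = 256
    rng = np.random.default_rng(1)
    n = int(0.8 * fs)
    t = np.arange(n) / fs
    # Hypothetical single-channel epochs: targets carry a small positive deflection near 300 ms
    p300 = 2.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))
    targets = rng.normal(size=(30, n)) + p300
    nontargets = rng.normal(size=(120, n))
    print(p300_score(targets, nontargets, fs) > 0.5)   # crude detection threshold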
Haptic interfaces: Hardware, software and human performance
NASA Technical Reports Server (NTRS)
Srinivasan, Mandayam A.
1995-01-01
Virtual environments are computer-generated synthetic environments with which a human user can interact to perform a wide variety of perceptual and motor tasks. At present, most of the virtual environment systems engage only the visual and auditory senses, and not the haptic sensorimotor system that conveys the sense of touch and feel of objects in the environment. Computer keyboards, mice, and trackballs constitute relatively simple haptic interfaces. Gloves and exoskeletons that track hand postures have more interaction capabilities and are available in the market. Although desktop and wearable force-reflecting devices have been built and implemented in research laboratories, the current capabilities of such devices are quite limited. To realize the full promise of virtual environments and teleoperation of remote systems, further developments of haptic interfaces are critical. In this paper, the status and research needs in human haptics, technology development and interactions between the two are described. In particular, the excellent performance characteristics of Phantom, a haptic interface recently developed at MIT, are highlighted. Realistic sensations of single point of contact interactions with objects of variable geometry (e.g., smooth, textured, polyhedral) and material properties (e.g., friction, impedance) in the context of a variety of tasks (e.g., needle biopsy, switch panels) achieved through this device are described and the associated issues in haptic rendering are discussed.
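Haptic rendering of single-point contact, as discussed above, is often introduced with a penalty-based force law; a minimal sketch follows, with the stiffness and damping values chosen arbitrarily for illustration rather than taken from the Phantom device.

    def contact_force(probe_pos, surface_height, stiffness=800.0, damping=2.0, velocity=0.0):
        """Penalty-based haptic rendering of single-point contact with a horizontal plane:
        force grows with penetration depth (spring term) minus a damping term."""
        penetration = surface_height - probe_pos
        if penetration <= 0.0:
            return 0.0                      # no contact, no force
        return stiffness * penetration - damping * velocity

    # 2 mm penetration while moving into the surface at 0.01 m/s
    print(contact_force(probe_pos=-0.002, surface_height=0.0, velocity=-0.01))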
Robust human machine interface based on head movements applied to assistive robotics.
Perez, Elisa; López, Natalia; Orosco, Eugenio; Soria, Carlos; Mut, Vicente; Freire-Bastos, Teodiano
2013-01-01
This paper presents an interface that uses two different sensing techniques and combines both results through a fusion process to obtain the minimum-variance estimator of the orientation of the user's head. The sensing techniques of the interface are based on an inertial sensor and artificial vision. The orientation of the user's head is used to steer the navigation of a robotic wheelchair. Also, a control algorithm for an assistive technology system is presented. The system was evaluated by four individuals with severe motor disabilities, and a quantitative index was developed in order to objectively evaluate the performance. The results obtained are promising, since most users could perform the proposed tasks with the robotic wheelchair.
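The minimum-variance fusion mentioned in this abstract can be illustrated with inverse-variance weighting of the two orientation estimates. The sketch below assumes a single yaw angle and hypothetical sensor variances; it is not the authors' full estimator.

    def fuse_orientation(angle_imu, var_imu, angle_vision, var_vision):
        """Minimum-variance (inverse-variance weighted) fusion of two head-yaw estimates.
        The more reliable sensor (smaller variance) receives the larger weight."""
        w_imu = 1.0 / var_imu
        w_vis = 1.0 / var_vision
        fused = (w_imu * angle_imu + w_vis * angle_vision) / (w_imu + w_vis)
        fused_var = 1.0 / (w_imu + w_vis)
        return fused, fused_var

    print(fuse_orientation(12.0, 4.0, 10.0, 1.0))   # vision trusted four times more than the IMU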
Exploration Life Support Critical Questions for Future Human Space Missions
NASA Technical Reports Server (NTRS)
Ewert, Michael K.; Barta, Daniel J.; McQuillan, Jeff
2009-01-01
Exploration Life Support (ELS) is a project under NASA's Exploration Technology Development Program. The ELS Project plans, coordinates and implements the development of advanced life support technologies for human exploration missions in space. Recent work has focused on closed loop atmosphere and water systems for a lunar outpost, including habitats and pressurized rovers. But what are the critical questions facing life support system developers for these and other future human missions? This paper explores those questions and discusses how progress in the development of ELS technologies can help answer them. The ELS Project includes Atmosphere Revitalization Systems (ARS), Water Recovery Systems (WRS), Waste Management Systems (WMS), Habitation Engineering, Systems Integration, Modeling and Analysis (SIMA), and Validation and Testing, which includes the sub-elements Flight Experiments and Integrated Testing. Systems engineering analysis by ELS seeks to optimize the overall mission architecture by considering all the internal and external interfaces of the life support system and the potential for reduction or reuse of commodities. In particular, various sources and sinks of water and oxygen are considered along with the implications on loop closure and the resulting launch mass requirements.
Embedded Web Technology: Applying World Wide Web Standards to Embedded Systems
NASA Technical Reports Server (NTRS)
Ponyik, Joseph G.; York, David W.
2002-01-01
Embedded systems have traditionally been developed in a highly customized manner. The user interface hardware and software, along with the interface to the embedded system, are typically unique to the system for which they are built, resulting in extra cost to the system in terms of development time and maintenance effort. World Wide Web standards have been developed over the past ten years with the goal of allowing servers and clients to interoperate seamlessly. The client and server systems can consist of differing hardware and software platforms, but the World Wide Web standards allow them to interface without knowing the details of the system at the other end of the interface. Embedded Web Technology is the merging of embedded systems with the World Wide Web. Embedded Web Technology decreases the cost of developing and maintaining the user interface by allowing the user to interface to the embedded system through a web browser running on a standard personal computer. Embedded Web Technology can also be used to simplify an embedded system's internal network.
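As an illustration of the Embedded Web Technology concept (an embedded system exposing its status through standard web protocols to any browser), a minimal sketch follows using Python's standard-library HTTP server; the telemetry fields and port number are hypothetical.

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    TELEMETRY = {"mode": "RUN", "temperature_c": 41.3, "uptime_s": 86210}  # hypothetical values

    class StatusHandler(BaseHTTPRequestHandler):
        """Serve the embedded system's status to any standard web browser or HTTP client."""
        def do_GET(self):
            body = json.dumps(TELEMETRY).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8080), StatusHandler).serve_forever()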
HFE safety reviews of advanced nuclear power plant control rooms
NASA Technical Reports Server (NTRS)
Ohara, John
1994-01-01
Advanced control rooms (ACR's) will utilize human-system interface (HSI) technologies that may have significant implications for plant safety in that they will affect the operator's overall role and means of interacting with the system. The Nuclear Regulatory Commission (NRC) reviews the human factors engineering (HFE) aspects of HSI's to ensure that they are designed to good HFE principles and support performance and reliability in order to protect public health and safety. However, the only available NRC guidance was developed more than ten years ago, and does not adequately address the human performance issues and technology changes associated with ACR's. Accordingly, a new approach to ACR safety reviews was developed based upon the concept of 'convergent validity'. This approach to ACR safety reviews is described.
Modular System to Enable Extravehicular Activity
NASA Technical Reports Server (NTRS)
Sargusingh, Miriam J.
2011-01-01
The ability to perform extravehicular activity (EVA), both human and robotic, has been identified as a key component to space missions to support such operations as assembly and maintenance of space systems (e.g. construction and maintenance of the International Space Station), and unscheduled activities to repair an element of the transportation and habitation systems that can only be accessed externally and via unpressurized areas. In order to make human transportation beyond low Earth orbit (BLEO) practical, efficiencies must be incorporated into the integrated transportation systems to reduce system mass and operational complexity. Affordability is also a key aspect to be considered in space system development; this could be achieved through commonality, modularity and component reuse. Another key aspect identified for the EVA system was the ability to produce flight worthy hardware quickly to support early missions and near Earth technology demonstrations. This paper details a conceptual architecture for a modular extravehicular activity system (MEVAS) that would meet these stated needs for EVA capability that is affordable, and that could be produced relatively quickly. Operational concepts were developed to elaborate on the defined needs and define the key capabilities, operational and design constraints, and general timelines. The operational concept led to a high level design concept for a module that interfaces with various space transportation elements and contains the hardware and systems required to support human and telerobotic EVA; the module would not be self-propelled and would rely on an interfacing element for consumable resources. The conceptual architecture was then compared to EVA Systems used in the Shuttle Orbiter and on the International Space Station to develop high level design concepts that incorporate opportunities for cost savings through hardware reuse, and quick production through the use of existing technologies and hardware designs. An upgrade option was included to make use of the developing suitport technologies.
From pilot's associate to satellite controller's associate
NASA Technical Reports Server (NTRS)
Neyland, David L.; Lizza, Carl; Merkel, Philip A.
1992-01-01
Associate technology is an emerging engineering discipline wherein intelligent automation can significantly augment the performance of man-machine systems. An associate system is one that monitors operator activity and adapts its operational behavior accordingly. Associate technology is most effectively applied when mapped into management of the human-machine interface and display-control loop in typical manned systems. This paper addresses the potential for application of associate technology to the arena of intelligent command and control of satellite systems, from diagnosis of onboard and on-ground satellite system fault conditions to execution of nominal satellite control functions. Rather than specifying a specific solution, this paper draws parallels between the Pilot's Associate concept and the domain of satellite control.
Advanced Technologies for Future Spacecraft Cockpits and Space-based Control Centers
NASA Technical Reports Server (NTRS)
Garcia-Galan, Carlos; Uckun, Serdar; Gregory, William; Williams, Kerry
2006-01-01
The National Aeronautics and Space Administration (NASA) is embarking on a new era of Space Exploration, aimed at sending crewed spacecraft beyond Low Earth Orbit (LEO), in medium and long duration missions to the Lunar surface, Mars and beyond. The challenges of such missions are significant and will require new technologies and paradigms in vehicle design and mission operations. Current roles and responsibilities of spacecraft systems, crew and the flight control team, for example, may not be sustainable when real-time support is not assured due to distance-induced communication lags, radio blackouts, equipment failures, or other unexpected factors. Therefore, technologies and applications that enable greater Systems and Mission Management capabilities on-board the space-based system will be necessary to reduce the dependency on real-time critical Earth-based support. The focus of this paper is on such technologies that will be required to bring advanced Systems and Mission Management capabilities to space-based environments where the crew will be required to manage both the systems performance and mission execution without dependence on the ground. We refer to this concept as autonomy. Environments that require high levels of autonomy include the cockpits of future spacecraft such as the Mars Exploration Vehicle, and space-based control centers such as a Lunar Base Command and Control Center. Furthermore, this paper will evaluate the requirements, available technology, and roadmap to enable full operational implementation of onboard System Health Management, Mission Planning/re-planning, Autonomous Task/Command Execution, and Human Computer Interface applications. The technology topics covered by the paper include enabling technology to perform Intelligent Caution and Warning, where the system provides directly actionable data for human understanding and response to failures, and task automation applications that automate nominal and off-nominal task execution based on human input or integrated health state-derived conditions. Shifting from Systems to Mission Management functions, we discuss the role of automated planning applications (tactical planning) on-board, which receive data from the other cockpit automation systems and evaluate the mission plan against the dynamic systems and mission states and events, to provide the crew with capabilities that enable them to understand, change, and manage the timeline of their mission. Lastly, we discuss the role of advanced human interface technologies that organize and provide the system and mission information to the crew in ways that maximize their situational awareness and ability to provide oversight and control of all the automated data and functions.
Virtual reality applications to automated rendezvous and capture
NASA Technical Reports Server (NTRS)
Hale, Joseph; Oneil, Daniel
1991-01-01
Virtual Reality (VR) is a rapidly developing Human/Computer Interface (HCI) technology. The evolution of high-speed graphics processors and development of specialized anthropomorphic user interface devices, that more fully involve the human senses, have enabled VR technology. Recently, the maturity of this technology has reached a level where it can be used as a tool in a variety of applications. This paper provides an overview of VR technology, VR activities at Marshall Space Flight Center (MSFC), and applications of VR to Automated Rendezvous and Capture (AR&C), and identifies areas of VR technology that require further development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hugo, Jacques Victor; Gertman, David Ira
The new generation of nuclear power plants (NPPs) will likely make use of state-of-the-art technologies in many areas of the plant. The analysis, design, and selection of advanced human–system interfaces (HSIs) constitute an important part of power plant engineering. Designers need to consider the new capabilities afforded by these technologies in the context of current regulations and new operational concepts, which is why they need a more rigorous method by which to plan the introduction of advanced HSIs in NPP work areas. Much of current human factors research stops at the user interface and fails to provide a definitive process for integration of end user devices with instrumentation and control (I&C) and operational concepts. The current lack of a clear definition of HSI technology, including the process for integration, makes characterization and implementation of new and advanced HSIs difficult. This paper describes how new design concepts in the nuclear industry can be analyzed and how HSI technologies associated with new industrial processes might be considered. Furthermore, it also describes a basis for an understanding of human as well as technology characteristics that could be incorporated into a prioritization scheme for technology selection and deployment plans.
An Implantable Wireless Neural Interface for Recording Cortical Circuit Dynamics in Moving Primates
Borton, David A.; Yin, Ming; Aceros, Juan; Nurmikko, Arto
2013-01-01
Objective Neural interface technology suitable for clinical translation has the potential to significantly impact the lives of amputees, spinal cord injury victims, and those living with severe neuromotor disease. Such systems must be chronically safe, durable, and effective. Approach We have designed and implemented a neural interface microsystem, housed in a compact, subcutaneous, and hermetically sealed titanium enclosure. The implanted device interfaces the brain with a 510k-approved, 100-element silicon-based MEA via a custom hermetic feedthrough design. Full spectrum neural signals were amplified (0.1Hz to 7.8kHz, ×200 gain) and multiplexed by a custom application specific integrated circuit, digitized, and then packaged for transmission. The neural data (24 Mbps) were transmitted by a wireless data link carried on a frequency-shift-key modulated signal at 3.2GHz and 3.8GHz to a receiver 1 meter away by design as a point-to-point communication link for human clinical use. The system was powered by an embedded medical grade rechargeable Li-ion battery for 7-hour continuous operation between recharge via an inductive transcutaneous wireless power link at 2MHz. Main results Device verification and early validation was performed in both swine and non-human primate freely-moving animal models and showed that the wireless implant was electrically stable, effective in capturing and delivering broadband neural data, and safe for over one year of testing. In addition, we have used the multichannel data from these mobile animal models to demonstrate the ability to decode neural population dynamics associated with motor activity. Significance We have developed an implanted wireless broadband neural recording device evaluated in non-human primate and swine. The use of this new implantable neural interface technology can provide insight into how to advance human neuroprostheses beyond the present early clinical trials. Further, such tools enable mobile patient use, have the potential for wider diagnosis of neurological conditions, and will advance brain research. PMID:23428937
An implantable wireless neural interface for recording cortical circuit dynamics in moving primates
NASA Astrophysics Data System (ADS)
Borton, David A.; Yin, Ming; Aceros, Juan; Nurmikko, Arto
2013-04-01
Objective. Neural interface technology suitable for clinical translation has the potential to significantly impact the lives of amputees, spinal cord injury victims and those living with severe neuromotor disease. Such systems must be chronically safe, durable and effective. Approach. We have designed and implemented a neural interface microsystem, housed in a compact, subcutaneous and hermetically sealed titanium enclosure. The implanted device interfaces the brain with a 510k-approved, 100-element silicon-based microelectrode array via a custom hermetic feedthrough design. Full spectrum neural signals were amplified (0.1 Hz to 7.8 kHz, 200× gain) and multiplexed by a custom application specific integrated circuit, digitized and then packaged for transmission. The neural data (24 Mbps) were transmitted by a wireless data link carried on a frequency-shift-key-modulated signal at 3.2 and 3.8 GHz to a receiver 1 m away by design as a point-to-point communication link for human clinical use. The system was powered by an embedded medical grade rechargeable Li-ion battery for 7 h continuous operation between recharge via an inductive transcutaneous wireless power link at 2 MHz. Main results. Device verification and early validation were performed in both swine and non-human primate freely-moving animal models and showed that the wireless implant was electrically stable, effective in capturing and delivering broadband neural data, and safe for over one year of testing. In addition, we have used the multichannel data from these mobile animal models to demonstrate the ability to decode neural population dynamics associated with motor activity. Significance. We have developed an implanted wireless broadband neural recording device evaluated in non-human primate and swine. The use of this new implantable neural interface technology can provide insight into how to advance human neuroprostheses beyond the present early clinical trials. Further, such tools enable mobile patient use, have the potential for wider diagnosis of neurological conditions and will advance brain research.
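As a quick plausibility check on the reported 24 Mbps aggregate rate, the snippet below multiplies the channel count by an assumed per-channel sampling rate and an assumed sample resolution; the sampling rate and bit depth are not stated in the abstract and are chosen only to show the arithmetic.

```python
# Back-of-envelope check of the 24 Mbps figure, under assumed (not stated)
# digitization parameters: ~20 kS/s per channel and 12-bit samples.
channels = 100           # 100-element microelectrode array
sample_rate = 20_000     # samples/s per channel (assumed; must exceed 2 x 7.8 kHz)
bits_per_sample = 12     # assumed ADC resolution
rate_bps = channels * sample_rate * bits_per_sample
print(rate_bps / 1e6, "Mbps")   # -> 24.0 Mbps, matching the reported aggregate rate
```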
Comparing video and avatar technology for a health education application for deaf people.
Chiriac, Ionuţ Adrian; Stoicu-Tivadar, Lăcrămioara; Podoleanu, Elena
2015-01-01
The article describes the steps and results of a parallel research effort investigating e-health system design and implementation for deaf people in both avatar and video technology. The application translates medical knowledge and concepts into deaf sign language for impaired users through an avatar. Two types of avatar technologies are taken into consideration: a Video Avatar, with an interface based on recorded human video, and an Animated Avatar, with an animated-figure interface. The comparative study investigates data collection, design, implementation and the impact study. The comparative analysis of video and animated technology for data collection shows that video format editing requires fewer skills and that results are obtained more easily, more quickly and at lower cost. The video technology supports an architecture that is easier to design and implement. The impact study with two deaf student communities is under development; for the time being, the video avatar is better perceived.
OTM Machine Acceptance: In the Arab Culture
NASA Astrophysics Data System (ADS)
Rashed, Abdullah; Santos, Henrique
Basically, neglecting the human factor is one of the main reasons for system failures or technology rejection, even when important technologies are involved. Biometrics mostly have the characteristics needed for effortless acceptance, such as ease of use and usefulness, which are essential pillars of acceptance models such as the technology acceptance model (TAM). However, this acceptance should still be investigated. Many studies have been carried out to research the issues of technology acceptance in different cultures, especially western culture; Arabic culture lacks these types of studies, with few publications in this field. This paper introduces a new biometric interface for ATM machines. The interface relies on a promising biometric: odour. To gauge acceptance of this biometric, we distributed a questionnaire via a web site, invited participation across the Arab region, and found that most respondents would be willing to use odour.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kovesdi, C.; Joe, J.
The United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program is developing a scientific basis through targeted research and development (R&D) to support the U.S. nuclear power plant (NPP) fleet in extending their existing licensing period and ensuring their long-term reliability, productivity, safety, and security. Over the last several years, human factors engineering (HFE) professionals at the Idaho National Laboratory (INL) have supported the LWRS Advanced Instrumentation, Information, and Control (II&C) System Technologies pathway across several U.S. commercial NPPs in analog-to-digital migrations (i.e., turbine control systems) and digital-to-digital migrations (i.e., Safety Parameter Display System). These efforts have included in-depth human factors evaluation of proposed human-system interface (HSI) design concepts against established U.S. Nuclear Regulatory Commission (NRC) design guidelines from NUREG-0700, Rev 2 to inform subsequent HSI design prior to transitioning into Verification and Validation. This paper discusses some of the overarching design issues observed from these past HFE evaluations. In addition, this work presents some observed challenges such as common tradeoffs utilities are likely to face when introducing new HSI technologies into NPP hybrid control rooms. The primary purpose of this work is to distill these observed design issues into general HSI design guidance that industry can use in early stages of HSI design.
Smart Camera Technology Increases Quality
NASA Technical Reports Server (NTRS)
2004-01-01
When it comes to real-time image processing, everyone is an expert. People begin processing images at birth and rapidly learn to control their responses through the real-time processing of the human visual system. The human eye captures an enormous amount of information in the form of light images. In order to keep the brain from becoming overloaded with all the data, portions of an image are processed at a higher resolution than others, such as a traffic light changing colors. In the same manner, image processing products strive to extract the information stored in light in the most efficient way possible. Digital cameras available today capture millions of pixels worth of information from incident light. However, at frame rates more than a few per second, existing digital interfaces are overwhelmed. All the user can do is store several frames to memory until that memory is full, and then subsequent information is lost. New technology pairs existing digital interface technology with an off-the-shelf complementary metal oxide semiconductor (CMOS) imager to provide more than 500 frames per second of specialty image processing. The result is a cost-effective detection system unlike any other.
Techno-Human Mesh: The Growing Power of Information Technologies.
ERIC Educational Resources Information Center
West, Cynthia K.
This book examines the intersection of information technologies, power, people, and bodies. It explores how information technologies are on a path of creating efficiency, productivity, profitability, surveillance, and control, and looks at the ways in which human-machine interface technologies, such as wearable computers, biometric technologies,…
Engineering and commercialization of human-device interfaces, from bone to brain.
Knothe Tate, Melissa L; Detamore, Michael; Capadona, Jeffrey R; Woolley, Andrew; Knothe, Ulf
2016-07-01
Cutting edge developments in engineering of tissues, implants and devices allow for guidance and control of specific physiological structure-function relationships. Yet the engineering of functionally appropriate human-device interfaces represents an intractable challenge in the field. This leading opinion review outlines a set of current approaches as well as hurdles to design of interfaces that modulate transfer of information, i.a. forces, electrical potentials, chemical gradients and haptotactic paths, between endogenous and engineered body parts or tissues. The compendium is designed to bridge across currently separated disciplines by highlighting specific commonalities between seemingly disparate systems, e.g. musculoskeletal and nervous systems. We focus on specific examples from our own laboratories, demonstrating that the seemingly disparate musculoskeletal and nervous systems share common paradigms which can be harnessed to inspire innovative interface design solutions. Functional barrier interfaces that control molecular and biophysical traffic between tissue compartments of joints are addressed in an example of the knee. Furthermore, we describe the engineering of gradients for interfaces between endogenous and engineered tissues as well as between electrodes that physically and electrochemically couple the nervous and musculoskeletal systems. Finally, to promote translation of newly developed technologies into products, protocols, and treatments that benefit the patients who need them most, regulatory and technical challenges and opportunities are addressed using the example of an implant-cum-delivery device that can be used to heal soft and hard tissues, from brain to bone. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.
Experiencing the Sights, Smells, Sounds, and Climate of Southern Italy in VR.
Manghisi, Vito M; Fiorentino, Michele; Gattullo, Michele; Boccaccio, Antonio; Bevilacqua, Vitoantonio; Cascella, Giuseppe L; Dassisti, Michele; Uva, Antonio E
2017-01-01
This article explores what it takes to make interactive computer graphics and VR attractive as a promotional vehicle, from the points of view of tourism agencies and the tourists themselves. The authors exploited current VR and human-machine interface (HMI) technologies to develop an interactive, innovative, and attractive user experience called the Multisensory Apulia Touristic Experience (MATE). The MATE system implements a natural gesture-based interface and multisensory stimuli, including visuals, audio, smells, and climate effects.
Stieglitz, T
2007-01-01
Today, applications of neural prostheses that successfully help patients increase their activities of daily living and participate in social life again are quite simple implants that elicit a definite tissue response and are clearly recognized as foreign bodies. The latest developments in genetic engineering, nanotechnologies and materials sciences have paved the way to new scenarios towards highly complex systems to interface the human nervous system. Combinations of neural cells with microimplants promise stable biohybrid interfaces. Nanotechnology opens the door to macromolecular landscapes on implants that mimic the biological topology and surface interaction of biological cells. Computer sciences dream of technical cognitive systems that act and react, through knowledge-based reasoning mechanisms, to a changing or adaptive environment. Different sciences are starting to interact and to discuss the synergies that arise when methods and paradigms from biology, computer science and engineering, neuroscience, and psychology are combined. They envision the era of "converging technologies" that will completely change the understanding of science and postulate a new vision of humans. In this chapter, these research lines are discussed with some examples, as well as the societal implications and ethical questions that arise from these new opportunities.
NASA Astrophysics Data System (ADS)
Whitestone, Jennifer J.; Geisen, Glen R.; McQuiston, Barbara K.
1997-03-01
Anthropometric surveys conducted by the military provide comprehensive human body measurement data that serve as human interface requirements for successful mission performance of weapon systems, including cockpits, protective equipment, and clothing. The application of human body dimensions to model humans and human-machine performance begins with engineering anthropometry. There are two critical elements to engineering anthropometry: data acquisition and data analysis. First, the human body is captured dimensionally with either traditional anthropometric tools, such as calipers and tape measures, or with advanced image acquisition systems, such as a laser scanner. Next, numerous statistical analysis tools, such as multivariate modeling and feature envelopes, are used to effectively transition these data for design and evaluation of equipment and work environments. Recently, Air Force technology transfer allowed researchers at the Computerized Anthropometric Research and Design (CARD) Laboratory at Wright-Patterson Air Force Base to work with the Dayton, Ohio area medical community in assessing the rate of wound healing and improving the fit of total contact burn masks. This paper describes the successful application of CARD Lab engineering anthropometry to two medically oriented human interface problems.
Passive wireless tags for tongue controlled assistive technology interfaces.
Rakibet, Osman O; Horne, Robert J; Kelly, Stephen W; Batchelor, John C
2016-03-01
Tongue control with low profile, passive mouth tags is demonstrated as a human-device interface by communicating values of tongue-tag separation over a wireless link. Confusion matrices are provided to demonstrate user accuracy in targeting by tongue position. Accuracy is found to increase dramatically after short training sequences, with errors falling close to 1% in magnitude with zero missed targets. The rate at which users are able to learn accurate targeting with high accuracy indicates that this is an intuitive device to operate. The significance of the work is that innovative, very unobtrusive wireless tags can be used to provide intuitive human-computer interfaces based on low cost and disposable mouth mounted technology. With the development of an appropriate reading system, control of assistive devices such as computer mice or wheelchairs could be possible for tetraplegics and others who retain fine motor control capability of their tongues. The tags contain no battery and are intended to fit directly on the hard palate, detecting tongue position in the mouth with no need for tongue piercings.
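The confusion-matrix analysis mentioned above can be reproduced with a few lines of code; the sketch below tabulates intended versus detected targets and reports overall accuracy, using invented trial data rather than the study's measurements.

```python
# Sketch of the accuracy analysis described above: build a confusion matrix of
# intended vs. detected tongue-position targets and report overall accuracy.
# Target labels and trial data here are invented for illustration.
import numpy as np

def confusion_matrix(intended, detected, n_targets):
    cm = np.zeros((n_targets, n_targets), dtype=int)
    for i, d in zip(intended, detected):
        cm[i, d] += 1          # row = intended target, column = detected target
    return cm

intended = [0, 0, 1, 1, 2, 2, 3, 3, 3, 2]   # targets the user aimed for
detected = [0, 0, 1, 1, 2, 2, 3, 3, 3, 2]   # targets the system registered
cm = confusion_matrix(intended, detected, n_targets=4)
accuracy = np.trace(cm) / cm.sum()           # correct trials / all trials
print(cm)
print(f"overall accuracy: {accuracy:.1%}")
```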
Space Station Workstation Technology Workshop Report
NASA Technical Reports Server (NTRS)
Moe, K. L.; Emerson, C. M.; Eike, D. R.; Malone, T. B.
1985-01-01
This report describes the results of a workshop conducted at Goddard Space Flight Center (GSFC) to identify current and anticipated trends in human-computer interface technology that may influence the design or operation of a space station workstation. The workshop was attended by approximately 40 persons from government and academia who were selected for their expertise in some aspect of human-machine interaction research. The focus of the workshop was a 1 1/2-day brainstorming/forecasting session in which the attendees were assigned to interdisciplinary working groups and instructed to develop predictions for each of the following technology areas: (1) user interface, (2) resource management, (3) control language, (4) data base systems, (5) automatic software development, (6) communications, (7) training, and (8) simulation. This report is significant in that it provides a unique perspective on workstation design for the space station. This perspective, which is characterized by a major emphasis on user requirements, should be most valuable to Phase B contractors involved in design development of the space station workstation. One of the more compelling results of the workshop is the recognition that no major technological breakthroughs are required to implement the current workstation concept. What is required is the creative application of existing knowledge and technology.
Emerging CAE technologies and their role in Future Ambient Intelligence Environments
NASA Astrophysics Data System (ADS)
Noor, Ahmed K.
2011-03-01
Dramatic improvements are on the horizon in Computer Aided Engineering (CAE) and various simulation technologies. The improvements are due, in part, to the developments in a number of leading-edge technologies and their synergistic combinations/convergence. The technologies include ubiquitous, cloud, and petascale computing; ultra high-bandwidth networks; pervasive wireless communication; knowledge based engineering; networked immersive virtual environments and virtual worlds; novel human-computer interfaces; and powerful game engines and facilities. This paper describes the frontiers and emerging simulation technologies, and their role in the future virtual product creation and learning/training environments. The environments will be ambient intelligence environments, incorporating a synergistic combination of novel agent-supported visual simulations (with cognitive learning and understanding abilities); immersive 3D virtual world facilities; development chain management systems and facilities (incorporating a synergistic combination of intelligent engineering and management tools); nontraditional methods; intelligent, multimodal and human-like interfaces; and mobile wireless devices. The virtual product creation environment will significantly enhance productivity and stimulate creativity and innovation in future global virtual collaborative enterprises. The facilities in the learning/training environment will provide timely, engaging, personalized/collaborative and tailored visual learning.
NASA Technical Reports Server (NTRS)
Arthur, Jarvis J., III; Shelton, Kevin J.; Prinzel, Lawrence J., III; Bailey, Randall E.
2016-01-01
During the flight trials known as Gulfstream-V Synthetic Vision Systems Integrated Technology Evaluation (GV-SITE), a Speech Recognition System (SRS) was used by the evaluation pilots. The SRS was intended to be an intuitive interface for display control (rather than knobs, buttons, etc.). This paper describes the performance of the current "state of the art" SRS. The commercially available technology was evaluated as an application for possible inclusion in commercial aircraft flight decks as a crew-to-vehicle interface. Specifically, the technology is to be used as an interface from aircrew to the onboard displays, controls, and flight management tasks. A flight test of the SRS, as well as a laboratory test, was conducted.
Analysis of a rotating advanced-technology space station for the year 2025
NASA Technical Reports Server (NTRS)
Queijo, M. J.; Butterfield, A. J.; Cuddihy, W. F.; King, C. B.; Stone, R. W.; Garn, P. A.
1988-01-01
An analysis is made of several aspects of an advanced-technology rotating space station configuration generated under a previous study. The analysis includes examination of several modifications of the configuration, interface with proposed launch systems, effects of low-gravity environment on human subjects, and the space station assembly sequence. Consideration was given also to some aspects of space station rotational dynamics, surface charging, and the possible application of tethers.
Artificial intelligence and expert systems in-flight software testing
NASA Technical Reports Server (NTRS)
Demasie, M. P.; Muratore, J. F.
1991-01-01
The authors discuss the introduction of advanced information systems technologies such as artificial intelligence, expert systems, and advanced human-computer interfaces directly into Space Shuttle software engineering. The reconfiguration automation project (RAP) was initiated to coordinate this move towards 1990s software technology. The idea behind RAP is to automate several phases of the flight software testing procedure and to introduce AI and ES into space shuttle flight software testing. In the first phase of RAP, conventional tools to automate regression testing have already been developed or acquired. There are currently three tools in use.
Human facial neural activities and gesture recognition for machine-interfacing applications.
Hamedi, M; Salleh, Sh-Hussain; Tan, T S; Ismail, K; Ali, J; Dee-Uam, C; Pavaganun, C; Yupapin, P P
2011-01-01
The authors present a new method of recognizing different human facial gestures through their neural activities and muscle movements, which can be used in machine-interfacing applications. Human-machine interface (HMI) technology utilizes human neural activities as input controllers for the machine. Recently, much work has been done on the specific application of facial electromyography (EMG)-based HMI, which has used limited and fixed numbers of facial gestures. In this work, a multipurpose interface is suggested that can support 2-11 control commands that can be applied to various HMI systems. The significance of this work is finding the most accurate facial gestures for any application with a maximum of eleven control commands. Eleven facial gesture EMGs are recorded from ten volunteers. Detected EMGs are passed through a band-pass filter and root mean square features are extracted. Various combinations of gestures with a different number of gestures in each group are made from the existing facial gestures. Finally, all combinations are trained and classified by a Fuzzy c-means classifier. In conclusion, combinations with the highest recognition accuracy in each group are chosen. An average accuracy above 90% for the chosen combinations demonstrated their suitability as command controllers.
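For readers wanting a concrete picture of the signal chain, the sketch below band-pass filters one EMG epoch and computes its root-mean-square feature; the 20-450 Hz band, 1 kHz sampling rate, and synthetic data are assumptions, and the fuzzy c-means classification stage is omitted.

```python
# Sketch of the feature-extraction stage described above: band-pass filter a
# facial-EMG epoch and compute its root-mean-square (RMS) value. The 20-450 Hz
# band and 1 kHz sampling rate are assumed, not taken from the paper.
import numpy as np
from scipy.signal import butter, filtfilt

def emg_rms_feature(epoch, fs=1000.0, band=(20.0, 450.0)):
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, epoch)          # zero-phase band-pass filtering
    return np.sqrt(np.mean(filtered ** 2))    # RMS amplitude of the epoch

# Example: one RMS feature for a simulated 2 s epoch of EMG at 1 kHz.
rng = np.random.default_rng(0)
epoch = rng.standard_normal(2000)
print(emg_rms_feature(epoch))
```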
NASA Technical Reports Server (NTRS)
Watson, Amanda
2013-01-01
To be useful in the future, wearable technology projects must be seamlessly integrated with the Flight Deck of the Future (F.F). The lab contains mockups of space vehicle cockpits, habitat living quarters, and workstations equipped with novel user interfaces. The Flight Deck of the Future is one element of the Integrated Power, Avionics, and Software (IPAS) facility, which, to a large extent, manages the F.F network and data systems. To date, integration with the Flight Deck of the Future has been limited by a lack of tools and understanding of its data handling systems. To remedy this problem, it will be necessary to learn how data is managed in the Flight Deck of the Future and to develop tools or interfaces that enable easy integration of WEAR Lab and EV3 products into the Flight Deck of the Future mockups. This capability is critical to future prototype integration, evaluation, and demonstration. It will allow WEAR Lab products, EV3 human interface prototypes, and technologies from other JSC organizations to be evaluated and tested in the Flight Deck of the Future. All WEAR Lab products must be integrated with the interface that will connect them to the Flight Deck of the Future. The WEAR Lab products will primarily be programmed in Arduino. Arduino will be used for the development of wearable controls and a tactile communication garment. Arduino will also be used in creating a wearable methane detection and warning system.
Su, Kuo-Wei; Liu, Cheng-Li
2012-06-01
A conventional Nursing Information System (NIS), which supports the role of the nurse in some areas, is typically deployed as an immobile system. However, the traditional information system can't respond to patients' conditions in real time, causing delays in the availability of this information. With the advances of information technology, mobile devices are increasingly being used to extend the human mind's limited capacity to recall and process large numbers of relevant variables and to support information management, general administration, and clinical practice. Unfortunately, there have been few studies about the combination of a well-designed small-screen interface with a personal digital assistant (PDA) in clinical nursing. Some researchers found that user interface design is an important factor in determining the usability and potential use of a mobile system. Therefore, this study proposed a systematic approach to the development of a mobile nursing information system (MNIS) based on Mobile Human-Computer Interaction (M-HCI) for use in clinical nursing. The system combines principles of small-screen interface design with user-specified requirements. In addition, the iconic functions were designed with a metaphor concept that helps users learn the system more quickly with less working memory. An experiment involving learnability testing, thinking aloud and a questionnaire investigation was conducted to evaluate the effect of the MNIS on a PDA. The results show that the proposed MNIS performs well on learnability and yields higher satisfaction with respect to symbols, terminology, and system information.
A system-level approach to automation research
NASA Technical Reports Server (NTRS)
Harrison, F. W.; Orlando, N. E.
1984-01-01
Automation is the application of self-regulating mechanical and electronic devices to processes that can be accomplished with the human organs of perception, decision, and actuation. The successful application of automation to a system process should reduce man/system interaction and the perceived complexity of the system, or should increase affordability, productivity, quality control, and safety. The expense, time constraints, and risk factors associated with extravehicular activities have led the Automation Technology Branch (ATB), as part of the NASA Automation Research and Technology Program, to investigate the use of robots and teleoperators as automation aids in the context of space operations. The ATB program addresses three major areas: (1) basic research in autonomous operations, (2) human factors research on man-machine interfaces with remote systems, and (3) the integration and analysis of automated systems. This paper reviews the current ATB research in the area of robotics and teleoperators.
Zhao, Li; Xing, Xiao; Guo, Xuhong; Liu, Zehua; He, Yang
2014-10-01
A brain-computer interface (BCI) system achieves communication and control between humans and computers or other electronic equipment using electroencephalogram (EEG) signals. This paper describes the working principle of a wireless smart home system based on BCI technology. The steady-state visual evoked potential (SSVEP) was elicited using a single-chip microcomputer driving LED lamps as the visual stimulation for the human eyes. Then, using a power spectral transformation built on the LabVIEW platform, the EEG signals recorded under different stimulation frequencies were processed in real time and translated into different instructions. These instructions were received by wireless transceiver equipment to control household appliances and achieve intelligent control of the specified devices. The experimental results showed that the correct rate for the 10 subjects reached 100% and the average control time per device was 4 seconds, so the design fully meets the original purpose of a smart home system.
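A minimal sketch of the frequency-to-command mapping described above is shown below: it estimates the EEG power spectrum and picks whichever LED stimulation frequency carries the most power. The stimulation frequencies, the command table, and the synthetic test signal are assumed examples, not values from the paper.

```python
# Sketch of SSVEP decoding: estimate the EEG power spectrum and map the
# dominant stimulation frequency to a device command.
import numpy as np
from scipy.signal import welch

COMMANDS = {7.0: "lamp", 9.0: "fan", 11.0: "tv", 13.0: "door"}  # Hz -> device (assumed)

def decode_ssvep(eeg, fs=250.0):
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
    powers = {f: psd[np.argmin(np.abs(freqs - f))] for f in COMMANDS}
    return COMMANDS[max(powers, key=powers.get)]

# Example: synthetic 4 s epoch dominated by a 9 Hz SSVEP component plus noise.
t = np.arange(0, 4, 1 / 250.0)
eeg = np.sin(2 * np.pi * 9.0 * t) + 0.5 * np.random.default_rng(1).standard_normal(t.size)
print(decode_ssvep(eeg))   # expected: "fan"
```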
NASA Technical Reports Server (NTRS)
Booher, Cletis R.; Goldsberry, Betty S.
1994-01-01
During the second half of the 1980s, a document was created by the National Aeronautics and Space Administration (NASA) to aid in the application of good human factors engineering and human interface practices to the design and development of hardware and systems for use in all United States manned space flight programs. This comprehensive document, known as NASA-STD-3000, the Man-Systems Integration Standards (MSIS), attempts to address, from a human factors engineering/human interface standpoint, all of the various types of equipment with which manned space flight crew members must deal. Basically, all of the human interface situations addressed in the MSIS are present in terrestrially based systems also. The premise of this paper is that, starting with this already created standard, comprehensive documents addressing human factors engineering and human interface concerns could be developed to aid in the design of almost any type of equipment or system which humans interface with in any terrestrial environment. Utilizing the systems and processes currently in place in the MSIS Development Facility at the Johnson Space Center in Houston, TX, any number of MSIS volumes addressing the human factors / human interface needs of any terrestrially based (or, for that matter, airborne) system could be created.
A Survey of Research in Supervisory Control and Data Acquisition (SCADA)
2014-09-01
[Excerpt fragments, lightly cleaned:] ... distance learning. The data acquired may be operationally oriented and used to better run the system, or it could be strategic in nature and used to ... Technically, the SCADA system is composed of the information technology (IT) that provides the human-machine interface (HMI) and stores and analyzes the data ... systems work by learning what normal or benign traffic is and reporting on any abnormal traffic. These systems have the potential to detect zero-day ...
Szostak, Katarzyna M.; Grand, Laszlo; Constandinou, Timothy G.
2017-01-01
Implantable neural interfaces for central nervous system research have been designed with wire, polymer, or micromachining technologies over the past 70 years. Research on biocompatible materials, ideal probe shapes, and insertion methods has resulted in building more and more capable neural interfaces. Although the trend is promising, the long-term reliability of such devices has not yet met the required criteria for chronic human application. The performance of neural interfaces in chronic settings often degrades due to foreign body response to the implant that is initiated by the surgical procedure and related to the probe structure and the material properties used in fabricating the neural interface. In this review, we identify the key requirements for neural interfaces for intracortical recording, describe the three different types of probes (microwire, micromachined, and polymer-based) along with their materials and fabrication methods, and discuss their characteristics and related challenges. PMID:29270103
Virtual reality in surgical training.
Lange, T; Indelicato, D J; Rosen, J M
2000-01-01
Virtual reality in surgery and, more specifically, in surgical training, faces a number of challenges in the future. These challenges are building realistic models of the human body, creating interface tools to view, hear, touch, feel, and manipulate these human body models, and integrating virtual reality systems into medical education and treatment. A final system would encompass simulators specifically for surgery, performance machines, telemedicine, and telesurgery. Each of these areas will need significant improvement for virtual reality to impact medicine successfully in the next century. This article gives an overview of, and the challenges faced by, current systems in the fast-changing field of virtual reality technology, and provides a set of specific milestones for a truly realistic virtual human body.
Ethics in published brain-computer interface research
NASA Astrophysics Data System (ADS)
Specker Sullivan, L.; Illes, J.
2018-02-01
Objective. Sophisticated signal processing has opened the doors to more research with human subjects than ever before. The increase in the use of human subjects in research comes with a need for increased human subjects protections. Approach. We quantified the presence or absence of ethics language in published reports of brain-computer interface (BCI) studies that involved human subjects and qualitatively characterized ethics statements. Main results. Reports of BCI studies with human subjects that are published in neural engineering and engineering journals are anchored in the rationale of technological improvement. Ethics language is markedly absent, omitted from 31% of studies published in neural engineering journals and 59% of studies in biomedical engineering journals. Significance. As the integration of technological tools with the capacities of the mind deepens, explicit attention to ethical issues will ensure that broad human benefit is embraced and not eclipsed by technological exclusiveness.
Modular System to Enable Extravehicular Activity
NASA Technical Reports Server (NTRS)
Sargusingh, Miriam J.
2012-01-01
The ability to perform extravehicular activity (EVA), both human and robotic, has been identified as a key component to space missions to support such operations as assembly and maintenance of space systems (e.g. construction and maintenance of the International Space Station), and unscheduled activities to repair an element of the transportation and habitation systems that can only be accessed externally and via unpressurized areas. In order to make human transportation beyond low Earth orbit (LEO) practical, efficiencies must be incorporated into the integrated transportation systems to reduce system mass and operational complexity. Affordability is also a key aspect to be considered in space system development; this could be achieved through commonality, modularity and component reuse. Another key aspect identified for the EVA system was the ability to produce flight worthy hardware quickly to support early missions and near Earth technology demonstrations. This paper details a conceptual architecture for a modular EVA system that would meet these stated needs for EVA capability that is affordable, and that could be produced relatively quickly. Operational concepts were developed to elaborate on the defined needs, and to define the key capabilities, operational and design constraints, and general timelines. The operational concept led to a high level design concept for a module that interfaces with various space transportation elements and contains the hardware and systems required to support human and telerobotic EVA; the module would not be self-propelled and would rely on an interfacing element for consumable resources. The conceptual architecture was then compared to EVA Systems used in the Space Shuttle Orbiter and on the International Space Station to develop high level design concepts that incorporate opportunities for cost savings through hardware reuse, and quick production through the use of existing technologies and hardware designs. An upgrade option was included to make use of the developing suit port technologies.
Advanced Technology for Portable Personal Visualization
1993-01-01
[Report fragments, lightly cleaned:] ... have no cable to drag. We submitted a short article describing the ceiling tracker and the requirements demanded of trackers in see-through systems ... Newspaper/magazine articles: "Virtual Reality: It's All in the Mind," Atlanta Constitution, 29 September 1992; "Virtual Reality: Exploring the Future" ... basic scientific investigation of the human haptic system, or to serve as haptic interfaces for virtual environments and teleoperation.
NASA Astrophysics Data System (ADS)
Lin, Chern-Sheng; Chen, Chia-Tse; Shei, Hung-Jung; Lay, Yun-Long; Chiu, Chuang-Chien
2012-09-01
This study develops a body motion interactive system with computer vision technology. The application combines interactive games, art performance, and exercise training. Multiple image processing and computer vision technologies are used in this study. The system calculates the color characteristics of an object and then performs color segmentation. To avoid erroneous action judgments, the system uses a weighted voting mechanism that assigns a condition score and weight to each candidate action judgment and selects the best judgment from the vote. Finally, the reliability of the system was evaluated in order to make improvements; the results showed that the method achieves good accuracy and stability in operating the human-machine interface of the sports training system.
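The sketch below illustrates the two stages just described: HSV color segmentation of a tracked marker, followed by a weighted vote over candidate action judgments. The HSV bounds, candidate actions, cue scores, and weights are illustrative assumptions, not values from the study.

```python
# Sketch of color segmentation plus weighted voting for action judgment.
import numpy as np
import cv2

def segment_marker(frame_bgr, lower=(35, 80, 80), upper=(85, 255, 255)):
    """Return a binary mask of pixels whose HSV values fall inside the bounds."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    return cv2.inRange(hsv, np.array(lower, dtype=np.uint8),
                       np.array(upper, dtype=np.uint8))

def vote_action(candidate_scores, weights):
    """candidate_scores: {action: per-cue scores}; weights: weight per cue."""
    totals = {action: sum(s * w for s, w in zip(scores, weights))
              for action, scores in candidate_scores.items()}
    return max(totals, key=totals.get)   # action with the highest weighted total

# Example: three cues (position, speed, pose) score two candidate actions.
scores = {"raise_arm": (0.9, 0.4, 0.7), "squat": (0.2, 0.8, 0.3)}
print(vote_action(scores, weights=(0.5, 0.2, 0.3)))   # -> "raise_arm"
```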
Using SysML to model complex systems for security.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cano, Lester Arturo
2010-08-01
As security systems integrate more information technology, the design of these systems has tended to become more complex. Some of the most difficult issues in designing Complex Security Systems (CSS) are capturing requirements, defining hardware interfaces, defining software interfaces, and integrating technologies such as radio systems, voice-over-IP systems, and situational awareness systems.
Sampling and Control Circuit Board for an Inertial Measurement Unit
NASA Technical Reports Server (NTRS)
Chelmins, David; Powis, Rick
2012-01-01
Spacesuit navigation is one component of NASA's efforts to return humans to the Moon. Studies performed at the NASA Glenn Research Center (GRC) considered various navigation technologies and filtering approaches to enable navigation on the lunar surface. As part of this effort, microelectromechanical systems (MEMS) inertial measurement units (IMUs) were studied to determine if they could supplement a radiometric infrastructure. MEMS IMUs were included in the Lunar Extra-Vehicular Activity Crewmember Location Determination System (LECLDS) testbed during NASA's annual Desert Research and Technology Studies (D-RATS) event in 2009 and 2010. The testbed included one IMU in 2009 and three IMUs in 2010, along with a custom circuit board interfacing between the navigation processor and each IMU. The board was revised for the 2010 test, and this paper documents the design details of this latest revision of the interface circuit board and firmware.
A Cognitive Systems Engineering Approach to Developing HMI Requirements for New Technologies
NASA Technical Reports Server (NTRS)
Fern, Lisa Carolynn
2016-01-01
This document examines the challenges inherent in designing and regulating to support human-automation interaction for new technologies that will be deployed into complex systems. A key question for new technologies is how work will be accomplished by the human and machine agents. This question has traditionally been framed as how functions should be allocated between humans and machines. Such framing misses the coordination and synchronization that is needed for the different human and machine roles in the system to accomplish their goals. Coordination and synchronization demands are driven by the underlying human-automation architecture of the new technology, which are typically not specified explicitly by the designers. The human-machine interface (HMI), which is intended to facilitate human-machine interaction and cooperation, however, typically is defined explicitly and therefore serves as a proxy for human-automation cooperation requirements with respect to technical standards for technologies. Unfortunately, mismatches between the HMI and the coordination and synchronization demands of the underlying human-automation architecture can lead to system breakdowns. A methodology is needed that both designers and regulators can utilize to evaluate the expected performance of a new technology given potential human-automation architectures. Three experiments were conducted to inform the minimum HMI requirements for a detect and avoid system for unmanned aircraft systems (UAS). The results of the experiments provided empirical input to specific minimum operational performance standards that UAS manufacturers will have to meet in order to operate UAS in the National Airspace System (NAS). These studies represent a success story for how to objectively and systematically evaluate prototype technologies as part of the process for developing regulatory requirements. They also provide an opportunity to reflect on the lessons learned from a recent research effort in order to improve the methodology for defining technology requirements for regulators in the future. The biggest shortcoming of the presented research program was the absence of the explicit definition, generation and analysis of potential human-automation architectures. Failure to execute this step in the research process resulted in less efficient evaluation of the candidate prototype technologies, in addition to the complete absence of different approaches to human-automation cooperation. For example, all of the prototype technologies that were evaluated in the research program assumed a human-automation architecture that relied on serial processing from the automation to the human. While this type of human-automation architecture is typical across many different technologies and in many different domains, it ignores different architectures where humans and automation work in parallel. Defining potential human-automation architectures a priori also allows regulators to develop scenarios that will stress the performance boundaries of the technology during the evaluation phase. The importance of adding this step of generating and evaluating candidate human-automation architectures prior to formal empirical evaluation is discussed.
Human computer interface guide, revision A
NASA Technical Reports Server (NTRS)
1993-01-01
The Human Computer Interface Guide, SSP 30540, is a reference document for the information systems within the Space Station Freedom Program (SSFP). The Human Computer Interface Guide (HCIG) provides guidelines for the design of computer software that affects human performance, specifically, the human-computer interface. This document contains an introduction and subparagraphs on SSFP computer systems, users, and tasks; guidelines for interactions between users and the SSFP computer systems; human factors evaluation and testing of the user interface system; and example specifications. The contents of this document are intended to be consistent with the tasks and products to be prepared by NASA Work Package Centers and SSFP participants as defined in SSP 30000, Space Station Program Definition and Requirements Document. The Human Computer Interface Guide shall be implemented on all new SSFP contractual and internal activities and shall be included in any existing contracts through contract changes. This document is under the control of the Space Station Control Board, and any changes or revisions will be approved by the deputy director.
NASA Johnson Space Center Usability Testing and Analysis facility (UTAF) Overview
NASA Technical Reports Server (NTRS)
Whitmore, Mihriban; Holden, Kritina L.
2005-01-01
The Usability Testing and Analysis Facility (UTAF) is part of the Space Human Factors Laboratory at the NASA Johnson Space Center in Houston, Texas. The facility performs research for NASA's Human-Systems Integration Program, under the Human-Systems Research and Technology Division. Specifically, the UTAF provides human factors support for space vehicles, including the International Space Station, the Space Shuttle, and the forthcoming Crew Exploration Vehicle. In addition, there are ongoing collaborative research efforts with external corporations and universities. The UTAF provides human factors analysis, evaluation, and usability testing of crew interfaces for space applications. This includes computer displays and controls, workstation systems, and work environments. The UTAF has a unique mix of capabilities, with a staff experienced in both cognitive human factors and ergonomics. The current areas of focus are: human factors applications in emergency medical care and informatics; control and display technologies for electronic procedures and instructions; voice recognition in noisy environments; crew restraint design for unique microgravity workstations; and refinement of human factors processes and requirements. This presentation will provide an overview of ongoing activities, and will address how the UTAF projects will evolve to meet new space initiatives.
Khaleghi, A; Chávez-Santiago, R; Balasingham, I
2012-01-01
Ultra wideband (UWB) technology has great potential for applications in wireless body area networks (WBANs). The inherent characteristics of UWB signals make them suitable for the wireless interface of medical sensors. In particular, implanted medical wireless sensors for monitoring physiological parameters, automatic drug provision, etc. can benefit greatly from this ultra low power (ULP) interface. As with any other wireless technology, accurate knowledge of the channel is necessary for the proper design of communication systems. Only a few models that describe the radio propagation inside the human body have been published. Moreover, there is no comprehensive UWB in-body propagation model that includes the frequency-dependent attenuation. Hence, this paper extends a statistical model for UWB propagation channels inside the human chest in the 1-6 GHz frequency range by including the frequency-dependent attenuation. This is done by modeling the spectrum shape of distorted pulses at different depths inside the human chest. The distortion of the pulse was obtained through numerical simulations using a voxel representation of the human body. We propose a mathematical expression for the spectrum shape of the distorted pulses that acts as a window function to reproduce the effects of frequency-dependent attenuation.
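For intuition only, the sketch below applies a depth-dependent spectral window to a test pulse to mimic frequency-dependent tissue attenuation. The exponential window form, its coefficient, and the Gaussian test pulse are assumptions made here for illustration; the paper's actual window expression and fitted parameters are not reproduced.

```python
# Illustrative sketch only: apply an assumed depth-dependent spectral window to
# a UWB test pulse to mimic frequency-dependent tissue attenuation.
import numpy as np

fs = 20e9                                  # 20 GS/s simulation sampling rate
t = np.arange(-2e-9, 2e-9, 1 / fs)
pulse = np.exp(-(t / 0.2e-9) ** 2)         # Gaussian test pulse (illustrative)

spectrum = np.fft.rfft(pulse)
freqs = np.fft.rfftfreq(pulse.size, 1 / fs)

def tissue_window(freqs_hz, depth_cm, alpha=0.08):
    # Assumed form: attenuation grows with both frequency (in GHz) and depth (in cm).
    return np.exp(-alpha * (freqs_hz / 1e9) * depth_cm)

received = np.fft.irfft(spectrum * tissue_window(freqs, depth_cm=4), n=pulse.size)
print(f"peak amplitude ratio at 4 cm depth: {received.max() / pulse.max():.2f}")
```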
Human performance interfaces in air traffic control.
Chang, Yu-Hern; Yeh, Chung-Hsing
2010-01-01
This paper examines how human performance factors in air traffic control (ATC) affect each other through their mutual interactions. The paper extends the conceptual SHEL model of ergonomics to describe the ATC system as human performance interfaces in which the air traffic controllers interact with other human performance factors including other controllers, software, hardware, environment, and organisation. New research hypotheses about the relationships between human performance interfaces of the system are developed and tested on data collected from air traffic controllers, using structural equation modelling. The research result suggests that organisational influences play a more significant role than individual differences or peer influences in how the controllers interact with the software, hardware, and environment of the ATC system. There are mutual influences between the controller-software, controller-hardware, controller-environment, and controller-organisation interfaces of the ATC system, with the exception of the controller-controller interface. Research findings of this study provide practical insights into managing human performance interfaces of the ATC system in the face of internal or external change, particularly in understanding its possible consequences in relation to the interactions between human performance factors.
PointCom: semi-autonomous UGV control with intuitive interface
NASA Astrophysics Data System (ADS)
Rohde, Mitchell M.; Perlin, Victor E.; Iagnemma, Karl D.; Lupa, Robert M.; Rohde, Steven M.; Overholt, James; Fiorani, Graham
2008-04-01
Unmanned ground vehicles (UGVs) will play an important role in the nation's next-generation ground force. Advances in sensing, control, and computing have enabled a new generation of technologies that bridge the gap between manual UGV teleoperation and full autonomy. In this paper, we present current research on a unique command and control system for UGVs named PointCom (Point-and-Go Command). PointCom is a semi-autonomous command system for one or multiple UGVs. The system, when complete, will be easy to operate and will enable significant reduction in operator workload by utilizing an intuitive image-based control framework for UGV navigation and allowing a single operator to command multiple UGVs. The project leverages new image processing algorithms for monocular visual servoing and odometry to yield a unique, high-performance fused navigation system. Human Computer Interface (HCI) techniques from the entertainment software industry are being used to develop video-game style interfaces that require little training and build upon the navigation capabilities. By combining an advanced navigation system with an intuitive interface, a semi-autonomous control and navigation system is being created that is robust, user friendly, and less burdensome than many current generation systems.
ECLSS evolution: Advanced instrumentation interface requirements. Volume 3: Appendix C
NASA Technical Reports Server (NTRS)
1991-01-01
An Advanced ECLSS (Environmental Control and Life Support System) Technology Interfaces Database was developed primarily to provide ECLSS analysts with a centralized and portable source of ECLSS technologies interface requirements data. The database contains 20 technologies which were previously identified in the MDSSC ECLSS Technologies database. The primary interfaces of interest in this database are fluid, electrical, data/control interfaces, and resupply requirements. Each record contains fields describing the function and operation of the technology. Fields include an interface diagram, a description of applicable design points and operating ranges, and an explanation of data, as required. A complete set of data was entered for six of the twenty components including Solid Amine Water Desorbed (SAWD), Thermoelectric Integrated Membrane Evaporation System (TIMES), Electrochemical Carbon Dioxide Concentrator (EDC), Solid Polymer Electrolysis (SPE), Static Feed Electrolysis (SFE), and BOSCH. Additional data was collected for Reverse Osmosis Water Reclamation-Potable (ROWRP), Reverse Osmosis Water Reclamation-Hygiene (ROWRH), Static Feed Solid Polymer Electrolyte (SFSPE), Trace Contaminant Control System (TCCS), and Multifiltration Water Reclamation - Hygiene (MFWRH). A summary of the database contents is presented in this report.
Addressing the human factors issues associated with control room modifications
DOE Office of Scientific and Technical Information (OSTI.GOV)
O`Hara, J.; Stubler, W.; Kramer, J.
1998-03-01
Advanced human-system interface (HSI) technology is being integrated into existing nuclear plants as part of plant modifications and upgrades. The result of this trend is that hybrid HSIs are created, i.e., HSIs containing a mixture of conventional (analog) and advanced (digital) technology. The purpose of the present research is to define the potential effects of hybrid HSIs on personnel performance and plant safety and to develop human factors guidance for safety reviews of them where necessary. In support of this objective, human factors issues associated with hybrid HSIs were identified. The issues were evaluated for their potential significance to plant safety, i.e., their human performance concerns have the potential to compromise plant safety. The issues were then prioritized and a subset was selected for design review guidance development.
Liquid lens: advances in adaptive optics
NASA Astrophysics Data System (ADS)
Casey, Shawn Patrick
2010-12-01
'Liquid lens' technologies promise significant advancements in machine vision and optical communications systems. Adaptations for machine vision, human vision correction, and optical communications are used to exemplify the versatile nature of this technology. Utilization of liquid lens elements allows the cost-effective implementation of optical velocity measurement. The project consists of a custom image processor, camera, and interface. The images are passed into customized pattern recognition and optical character recognition algorithms. A single camera would be used for both speed detection and object recognition.
Knowledge-based control of an adaptive interface
NASA Technical Reports Server (NTRS)
Lachman, Roy
1989-01-01
The analysis, development strategy, and preliminary design for an intelligent, adaptive interface is reported. The design philosophy couples knowledge-based system technology with standard human factors approaches to interface development for computer workstations. An expert system has been designed to drive the interface for application software. The intelligent interface will be linked to application packages, one at a time, that are planned for multiple-application workstations aboard Space Station Freedom. Current requirements call for most Space Station activities to be conducted at the workstation consoles. One set of activities will consist of standard data management services (DMS). DMS software includes text processing, spreadsheets, database management, etc. Text processing was selected for the first intelligent interface prototype because text-processing software can be developed initially as fully functional but limited with a small set of commands. The program's complexity then can be increased incrementally. The rule base of the intelligent interface includes the operator's behavior and three types of instructions to the underlying application software. A conventional expert-system inference engine searches the database for antecedents to rules and sends the consequents of fired rules as commands to the underlying software. Plans for putting the expert system on top of a second application, a database management system, will be carried out following behavioral research on the first application. The intelligent interface design is suitable for use with ground-based workstations now common in government, industrial, and educational organizations.
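The rule-firing loop described above (antecedents matched against a fact base, consequents issued as commands to the application software) can be sketched in a few lines. The rules, facts, and command names below are hypothetical examples, not the prototype's actual rule base or inference engine.

```python
# Minimal forward-chaining sketch: match rule antecedents against facts,
# then send the consequents of fired rules as commands to the application layer.
rules = [
    ({"task": "edit_text", "doc_open": False}, "OPEN_DOCUMENT"),   # hypothetical rule
    ({"task": "edit_text", "doc_open": True},  "SHOW_EDIT_MENU"),  # hypothetical rule
]

def infer(facts, rules):
    """Return the commands whose antecedents are all satisfied by the fact base."""
    fired = []
    for antecedent, command in rules:
        if all(facts.get(key) == value for key, value in antecedent.items()):
            fired.append(command)
    return fired

def send_to_application(command):
    # Stand-in for the interface to the underlying application software.
    print(f"-> application command: {command}")

facts = {"task": "edit_text", "doc_open": False}
for cmd in infer(facts, rules):
    send_to_application(cmd)
```

Growing the rule set incrementally, as the abstract describes for the text-processing prototype, amounts to appending new antecedent/command pairs without changing the inference loop.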
Simulation of the human-telerobot interface
NASA Technical Reports Server (NTRS)
Stuart, Mark A.; Smith, Randy L.
1988-01-01
A part of NASA's Space Station will be a Flight Telerobotic Servicer (FTS) used to help assemble, service, and maintain the Space Station. Since the human operator will be required to control the FTS, the design of the human-telerobot interface must be optimized from a human factors perspective. Simulation has long been used as an aid in the development of complex systems, and it is especially valuable here because this is a complex system for which few direct comparisons to existing systems can be made. Simulation should ensure that the hardware and software components of the human-telerobot interface are designed and selected so that the operator's capabilities and limitations are accommodated. Three broad areas of the human-telerobot interface where simulation can be of assistance are described. The use of simulation not only can result in a well-designed human-telerobot interface, but can also help ensure that components are selected to best meet the system's goals and can support operator training.
Science and technology integration for increased human potential and societal outcomes.
Roco, Mihail C
2004-05-01
Unifying science based on the material unity of nature at the nanoscale provides a new foundation for knowledge, innovation, and integration of technology. Revolutionary and synergistic advances at the interfaces between previously separated fields of science, engineering and areas of relevance are ready to create nano-bio-info-cogno (NBIC) transforming tools. Developments in systems approach, mathematics, and computation in conjunction with NBIC allow us to understand the natural world and scientific research as closely coupled, complex, hierarchical entities. At this unique moment of scientific and technical achievement, improvement of human performance at individual and group levels, as well as development of suitable revolutionary products, becomes possible and these are primary goals for converging new technologies. NBIC addresses long-term advances in key areas of human activity, including working, learning, aging, group interaction, organizations, and human evolution (Roco and Bainbridge, 2003). Fundamentally new tools, technologies, and products will be integrated into individual and social human architecture. This introductory chapter of the Annals outlines research and education trends, funding activities, and the potential of development of revolutionary products and services.
The human power amplifier technology at the University of California, Berkeley.
Kazerooni, H
1996-01-01
A human's ability to perform physical tasks is limited by physical strength, not by intelligence. We define "extenders" as a class of robot manipulators worn by humans to augment human mechanical strength, while the wearer's intellect remains the central control system for manipulating the extender. Our research objective is to determine the ground rules for the design and control of robotic systems worn by humans through the design, construction, and control of several prototype experimental direct-drive/non-direct-drive multi-degree-of-freedom hydraulic/electric extenders. The design of extenders is different from the design of conventional robots because the extender interfaces with the human on a physical level. Two sets of force sensors measure the forces imposed on the extender by the human and by the environment (i.e., the load). The extender's compliances in response to such contact forces were designed by selecting appropriate force compensators. This paper gives a summary of some of the selected research efforts related to Extender Technology, carried out during the 1980s. The references at the end of this article give a detailed description of the research efforts.
Redesigning the Human-Machine Interface for Computer-Mediated Visual Technologies.
ERIC Educational Resources Information Center
Acker, Stephen R.
1986-01-01
This study examined an application of a human-machine interface which relies on the use of optical bar codes incorporated in a computer-based module to teach radio production. The sequencing procedure used establishes the user rather than the computer as the locus of control for the mediated instruction. (Author/MBR)
Using hub technology to facilitate information system integration in a health-care enterprise.
Gendler, S M; Friedman, B A; Henricks, W H
1996-04-01
The deployment and maintenance of multiple point-to-point interfaces between a clinical information system, such as a laboratory information system, and other systems within a healthcare enterprise is expensive and time consuming. Moreover, the demand for such interfaces is increasing as hospitals consolidate and clinical laboratories participate in the development of regional laboratory networks and create host-to-host links with laboratory outreach clients. An interface engine, also called a hub, is an evolving technology that could replace multiple point-to-point interfaces from a laboratory information system with a single interface to the hub, preferably HL7 based. The hub then routes and translates laboratory information to other systems within the enterprise. Changes in application systems in an enterprise where a centralized interface engine has been implemented then amount to thorough analysis, an update of the enterprise's data dictionary, purchase of a single new vendor-supported interface, and table-based parameter changes on the hub. Two other features of an interface engine, support for structured query language and information store-and-forward, will facilitate the development of clinical data repositories and provide flexibility when interacting with other host systems. This article describes the advantages and disadvantages of an interface engine and lists some problems not solved by the technology. Finally, early developmental experience with an interface engine at the University of Michigan Medical Center and the benefits of the project on system integration efforts are described, not the least of which has been the enthusiastic adoption of the HL7 standard for all future interface projects.
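A toy sketch of the hub idea follows: a laboratory system posts one message to the engine, which consults routing and translation tables and forwards the result to each destination. The message fields, destination names, and mapping tables are invented for illustration and are far simpler than real HL7 segments or any particular vendor's interface engine.

```python
# Toy interface-engine sketch: one inbound feed, table-driven routing and translation.
ROUTES = {"lab_result": ["billing", "clinical_repository"]}          # hypothetical routing table
FIELD_MAPS = {
    "billing":             {"patient_id": "acct_no", "test_code": "charge_code"},
    "clinical_repository": {"patient_id": "mrn",     "test_code": "local_test_code"},
}

def translate(message, destination):
    """Rename fields for a destination; unmapped fields pass through unchanged."""
    mapping = FIELD_MAPS[destination]
    return {mapping.get(field, field): value for field, value in message.items()}

def route(message):
    """Forward one inbound message to every destination registered for its type."""
    for destination in ROUTES.get(message["type"], []):
        outbound = translate(message, destination)
        print(f"forward to {destination}: {outbound}")   # stand-in for a vendor interface

route({"type": "lab_result", "patient_id": "12345", "test_code": "K", "value": "4.1"})
```

The point of the table-driven design is the one highlighted in the abstract: adding or changing a downstream system becomes a data-dictionary and table update at the hub rather than a new point-to-point interface.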
NASA Technical Reports Server (NTRS)
Truszkowski, Walt; Obenschain, Arthur F. (Technical Monitor)
2002-01-01
Currently, spacecraft ground systems have a well-defined and somewhat standard architecture and operations concept. Based on domain analysis studies of various control centers conducted over the years, it is clear that ground systems have core capabilities and functionality that are common across all ground systems. This observation alone supports the realization of reuse. Additionally, spacecraft ground systems are increasing in their ability to do things autonomously. They are being engineered using advanced expert systems technology to provide automated support for operators. A clearer understanding of the possible roles of agent technology is advancing the prospects of greater autonomy for these systems. Many of their functional and management tasks are or could be supported by applied agent technology, the dynamics of the ground system's infrastructure could be monitored by agents, there are intelligent agent-based approaches to user-interfaces, etc. The premise of this paper is that the concepts associated with software reuse, applicable in consideration of classically-engineered ground systems, can be updated to address their application in highly agent-based realizations of future ground systems. As a somewhat simplified example consider the following situation, involving human agents in a ground system context. Let Group A of controllers be working on Mission X. They are responsible for the command, control and health and safety of the Mission X spacecraft. Let us suppose that Mission X successfully completes its mission and is turned off. Group A could be dispersed or perhaps move to another Mission Y. In this case there would be reuse of the human agents from Mission X to Mission Y. The Group A agents perform their well-understood functions in a somewhat different but related context. There will be a learning or familiarization process that the Group A agents go through to make the new context, determined by the new Mission Y, understood. This simplified scenario highlights some of the major issues that need to be addressed when considering the situation where Group A is composed of software-based agents (not their human counterparts) and they migrate from one mission support system to another. This paper will address: definition of an agent architecture appropriate to support reuse; identification of non-mission-specific agent capabilities required; appropriate knowledge representation schemes for mission-specific knowledge; agent interface with mission-specific knowledge (a type of learning); development of a fully-operational group of cooperative software agents for ground system support; and architecture and operation of a repository of reusable agents that could be the source of intelligent components for realizing an autonomous (or nearly autonomous) agent-based ground system, together with an agent-based approach to repository management and operation (an intelligent interface for human use of the repository in a ground-system development activity).
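The reuse scenario above hinges on keeping an agent's generic capabilities separate from its mission-specific knowledge so the agent can be retargeted to a new mission. The sketch below illustrates that separation with invented class names, telemetry channels, and limits; it is not an architecture defined in the paper.

```python
# Sketch of a reusable agent whose generic monitoring skill is kept separate
# from a swappable, mission-specific knowledge module.
class MissionKnowledge:
    def __init__(self, name, telemetry_limits):
        self.name = name
        self.telemetry_limits = telemetry_limits   # e.g. {"battery_v": (24.0, 34.0)}

class MonitoringAgent:
    def __init__(self, knowledge):
        self.knowledge = knowledge                 # mission-specific part (replaceable)

    def retarget(self, knowledge):
        """Familiarization step: reuse the agent by loading new mission knowledge."""
        self.knowledge = knowledge

    def check(self, telemetry):
        """Generic, mission-independent skill: flag out-of-limit channels."""
        alerts = []
        for channel, value in telemetry.items():
            lo, hi = self.knowledge.telemetry_limits.get(channel, (float("-inf"), float("inf")))
            if not lo <= value <= hi:
                alerts.append(f"{self.knowledge.name}: {channel}={value} out of range")
        return alerts

agent = MonitoringAgent(MissionKnowledge("Mission X", {"battery_v": (24.0, 34.0)}))
print(agent.check({"battery_v": 22.5}))
agent.retarget(MissionKnowledge("Mission Y", {"tank_kpa": (180.0, 220.0)}))
print(agent.check({"tank_kpa": 200.0}))
```

The retarget step is the software analogue of the familiarization process the abstract describes for the human Group A controllers moving from Mission X to Mission Y.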
NASA Astrophysics Data System (ADS)
Johnson, Bradley; May, Gayle L.; Korn, Paula
A recent symposium produced papers in the areas of solar system exploration, man-machine interfaces, cybernetics, virtual reality, telerobotics, life support systems and the scientific and technological spinoff from the NASA space program. A number of papers also addressed the social and economic impacts of the space program. For individual titles, see A95-87468 through A95-87479.
Passive sensor technology interface to assess elder activity in independent living.
Alexander, Gregory L; Wakefield, Bonnie J; Rantz, Marilyn; Skubic, Marjorie; Aud, Myra A; Erdelez, Sanda; Ghenaimi, Said Al
2011-01-01
The effectiveness of clinical information systems to improve nursing and patient outcomes depends on human factors, including system usability, organizational workflow, and user satisfaction. The aim of this study was to examine to what extent residents, family members, and clinicians find a sensor data interface used to monitor elder activity levels usable and useful in an independent living setting. Three independent expert reviewers conducted an initial heuristic evaluation. Subsequently, 20 end users (5 residents, 5 family members, 5 registered nurses, and 5 physicians) participated in the evaluation. During the evaluation, each participant was asked to complete three scenarios taken from three residents. Morae recorder software was used to capture data during the user interactions. The heuristic evaluation resulted in 26 recommendations for interface improvement; these were classified under the headings content, aesthetic appeal, navigation, and architecture, which were derived from heuristic results. Total time for elderly residents to complete scenarios was much greater than for other users. Family members spent more time than clinicians but less time than residents did to complete scenarios. Elder residents and family members had difficulty interpreting clinical data and graphs, experienced information overload, and did not understand terminology. All users found the sensor data interface useful for identifying changing resident activities. Older adult users have special needs that should be addressed when designing clinical interfaces for them, especially for information as important as health information. Evaluating human factors during user interactions with clinical information systems should be a requirement before implementation.
The future of the provision process for mobility assistive technology: a survey of providers.
Dicianno, Brad E; Joseph, James; Eckstein, Stacy; Zigler, Christina K; Quinby, Eleanor J; Schmeler, Mark R; Schein, Richard M; Pearlman, Jon; Cooper, Rory A
2018-03-20
The purpose of this study was to evaluate the opinions of providers of mobility assistive technologies to help inform a research agenda and set priorities. This survey study was anonymous and gathered opinions of individuals who participate in the process to provide wheelchairs and other assistive technologies to clients. Participants were asked to rank the importance of developing various technologies and rank items against each other in terms of order of importance. Participants were also asked to respond to several open-ended questions or statements. A total of 161 providers from 35 states within the USA consented to participation and completed the survey. This survey revealed themes of advanced wheelchair design, assistive robotics and intelligent systems, human machine interfaces and smart device applications. It also outlined priorities for researchers to provide continuing education to clients and providers. These themes will be used to develop research and development priorities. Implications for Rehabilitation • Research in advanced wheelchair design is needed to facilitate travel and environmental access with wheelchairs and to develop alternative power sources for wheelchairs.• New assistive robotics and intelligent systems are needed to help wheelchairs overcome obstacles or self-adjust, assist wheelchair navigation in the community, assist caregivers and transfers, and aid ambulation.• Innovations in human machine interfaces may help advance the control of mobility devices and robots with the brain, eye movements, facial gesture recognition or other systems.• Development of new smart devices is needed for better control of the environment, monitoring activity and promoting healthy behaviours.
OBPR Product Lines, Human Research Initiative, and Physics Roadmap for Exploration
NASA Technical Reports Server (NTRS)
Israelsson, Ulf
2004-01-01
The pace of change has increased at NASA. OBPR's focus is now on the human interface as it relates to the new Exploration vision. The fundamental physics community must demonstrate how we can contribute. Many opportunities exist for physicists to participate in addressing NASA's cross-disciplinary exploration challenges: a) Physicists can contribute to elucidating basic operating principles for complex biological systems; b) Physics technologies can contribute to developing miniature sensors and systems required for manned missions to Mars. NASA Codes other than OBPR may be viable sources of funding for physics research.
Advanced integrated enhanced vision systems
NASA Astrophysics Data System (ADS)
Kerr, J. R.; Luk, Chiu H.; Hammerstrom, Dan; Pavel, Misha
2003-09-01
In anticipation of its ultimate role in transport, business and rotary wing aircraft, we clarify the role of Enhanced Vision Systems (EVS): how the output data will be utilized, appropriate architecture for total avionics integration, pilot and control interfaces, and operational utilization. Ground-map (database) correlation is critical, and we suggest that "synthetic vision" is simply a subset of the monitor/guidance interface issue. The core of integrated EVS is its sensor processor. In order to approximate optimal, Bayesian multi-sensor fusion and ground correlation functionality in real time, we are developing a neural net approach utilizing human visual pathway and self-organizing, associative-engine processing. In addition to EVS/SVS imagery, outputs will include sensor-based navigation and attitude signals as well as hazard detection. A system architecture is described, encompassing an all-weather sensor suite; advanced processing technology; inertial, GPS and other avionics inputs; and pilot and machine interfaces. Issues of total-system accuracy and integrity are addressed, as well as flight operational aspects relating to both civil certification and military applications in IMC.
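For the simplest Gaussian case, the multi-sensor fusion mentioned above reduces to inverse-variance weighting of the individual sensor estimates. The sketch below shows that textbook form for two sensors; it does not reproduce the neural-net approximation described in the abstract, and the example quantities and numbers are arbitrary.

```python
def fuse_gaussian(mean_a, var_a, mean_b, var_b):
    """Minimum-variance fusion of two independent Gaussian estimates of one quantity."""
    w_a = var_b / (var_a + var_b)               # weight grows as the other sensor gets noisier
    fused_mean = w_a * mean_a + (1.0 - w_a) * mean_b
    fused_var = (var_a * var_b) / (var_a + var_b)   # always smaller than either input variance
    return fused_mean, fused_var

# Example: height above terrain (metres) estimated by two different sensors.
mean, var = fuse_gaussian(152.0, 4.0, 148.0, 9.0)
print(f"fused estimate: {mean:.1f} m (variance {var:.2f})")
```

The fused variance is always lower than that of either sensor alone, which is the basic argument for combining an all-weather sensor suite rather than relying on any single channel.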
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacques V Hugo
This book chapter describes the considerations for the selection of advanced human–system interfaces (HSIs) for the new generation of nuclear power plants. The chapter discusses the technologies that will be needed to support highly automated nuclear power plants, while minimising demands for numbers of operational staff, reducing human error and improving plant efficiency and safety. Special attention is paid to the selection and deployment of advanced technologies in nuclear power plants (NPPs). The chapter closes with an examination of how technologies are likely to develop over the next 10–15 years and how this will affect design choices for the nuclear industry.
Project Orion, Environmental Control and Life Support System Integrated Studies
NASA Technical Reports Server (NTRS)
Russell, James F.; Lewis, John F.
2008-01-01
Orion is the next vehicle for human space travel. Humans will be sustained in space by the Orion subsystem for environmental control and life support (ECLS). The ECLS concept at the subsystem level is outlined by function and technology. In the past two years, the interface definition with other subsystems has increased through different integrated studies. The paper presents the key requirements and discusses three recent studies (e.g., unpressurized cargo) along with the respective impacts on the ECLS design moving forward.
Annotated Bibliography of Enabling Technologies for the Small Aircraft Transportation System
NASA Technical Reports Server (NTRS)
ONeil, Patrick D.; Tarry, Scott E.
2002-01-01
The following collection of research summaries is submitted in fulfillment of a request from NASA LaRC to conduct research into existing enabling technologies that support the development of the Small Aircraft Transportation System aircraft and accompanying airspace management infrastructure. Due to time and fiscal constraints, the included studies focus primarily on visual systems and architecture, flight control design, instrumentation and display, flight deck design considerations, Human-Machine Interface issues, and supporting augmentation technologies and software. This collation of summaries is divided into sections in an attempt to group similar technologies and systems. However, the reader is advised that many of these studies involve multiple technologies and systems that span many categories. Because of this fact, studies are not easily categorized into single sections. In an attempt to help the reader more easily identify topics of interest, a SATS application description is provided for each summary. In addition, a list of acronyms is provided at the front of the report to aid the reader.
Human factor engineering based design and modernization of control rooms with new I and C systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larraz, J.; Rejas, L.; Ortega, F.
2012-07-01
Instrumentation and Control (I and C) systems of the latest nuclear power plants are based on the use of digital technology, distributed control systems and the integration of information in data networks (Distributed Control and Instrumentation Systems). This has a repercussion on Control Rooms (CRs), where the operations and monitoring interfaces correspond to these systems. These technologies are also used in modernizing I and C systems in currently operative nuclear power plants. The new interfaces provide additional capabilities for operation and supervision, as well as a high degree of flexibility, versatility and reliability. An example of this is the implementation of solutions such as compact stations, high level supervision screens, overview displays, computerized procedures, new operational support systems or intelligent alarms processing systems in the modernized Man-Machine Interface (MMI). These changes in the MMI are accompanied by newly added Software (SW) controls and new solutions in automation. Tecnatom has been leading various projects in this area for several years, both in Asian countries and in the United States, using in all cases international standards from which Tecnatom's own methodologies have been developed and optimized. The experience acquired in applying this methodology to the design of new control rooms is to a large extent applicable also to the modernization of current control rooms. An adequate design of the interface between the operator and the systems will facilitate safe operation, contribute to the prompt identification of problems and help in the distribution of tasks and communications between the different members of the operating shift. Based on Tecnatom's experience in the field, this article presents the methodological approach used as well as the most relevant aspects of this kind of project. (authors)
Human factors in air traffic control: problems at the interfaces.
Shouksmith, George
2003-10-01
The triangular ISIS model for describing the operation of human factors in complex sociotechnical organisations or systems is applied in this research to a large international air traffic control system. A large sample of senior Air Traffic Controllers were randomly assigned to small focus discussion groups, whose task was to identify problems occurring at the interfaces of the three major human factor components: individual, system impacts, and social. From these discussions, a number of significant interface problems, which could adversely affect the functioning of the Air Traffic Control System, emerged. The majority of these occurred at the Individual-System Impact and Individual-Social interfaces and involved a perceived need for further interface centered training.
EVA Systems Technology Gaps and Priorities 2017
NASA Technical Reports Server (NTRS)
Johnson, Brian J.; Buffington, Jesse A.
2017-01-01
Performance of Extra-Vehicular Activities (EVA) has been and will continue to be a critical capability for human space flight. Human exploration missions beyond LEO will require EVA capability for either contingency or nominal activities to support mission objectives and reduce mission risk. EVA systems encompass a wide array of products across pressure suits, life support systems, EVA tools and unique spacecraft interface hardware (i.e., EVA translation paths and EVA worksites). In a fiscally limited environment with evolving transportation and habitation options, it is paramount that the EVA community's strategic planning and architecture integration products be reviewed and vetted for traceability from mission needs far into the future, to the known technology and knowledge gaps, to the current investments across EVA systems. To ascertain EVA technology and knowledge gaps, many things need to be brought together, assessed and analyzed. This includes an understanding of the destination environments, various mission concepts of operations, the current state of the art of EVA systems, EVA operational lessons learned, and reference advanced capabilities. A combined assessment of these inputs should result in a well-defined list of gaps. This list can then be prioritized depending on the mission need dates and the time scale of the technology or knowledge gap closure plan. This paper will summarize the current state of EVA-related technology and knowledge gaps derived from NASA's Exploration EVA Reference Architecture and Operations Concept products. By linking these products and articulating NASA's approach to strategic development for EVA across all credible destinations where an EVA could be performed, the identification of these gaps is then used to illustrate the tactical and strategic planning for the EVA technology development portfolio. Finally, this paper illustrates the various "touch points" with other human exploration risk identification areas including human health and performance.
A Conformal, Bio-interfaced Class of Silicon Electronics for Mapping Cardiac Electrophysiology
Viventi, Jonathan; Kim, Dae-Hyeong; Moss, Joshua D.; Kim, Yun-Soung; Blanco, Justin A.; Annetta, Nicholas; Hicks, Andrew; Xiao, Jianliang; Huang, Younggang; Callans, David J.; Rogers, John A.; Litt, Brian
2011-01-01
The sophistication and resolution of current implantable medical devices are limited by the need to connect each sensor separately to data acquisition systems. The ability of these devices to sample and modulate tissues is further limited by the rigid, planar nature of the electronics and the electrode-tissue interface. Here, we report the development of a class of mechanically flexible silicon electronics for measuring signals in an intimate, conformal integrated mode on the dynamic, three-dimensional surfaces of soft tissues in the human body. We illustrate this technology in sensor systems composed of 2016 silicon nanomembrane transistors configured to record electrical activity directly from the curved, wet surface of a beating heart in vivo. The devices sample with simultaneous sub-millimeter and sub-millisecond resolution through 288 amplified and multiplexed channels. We use these systems to map the spread of spontaneous and paced ventricular depolarization in real time, at high resolution, on the epicardial surface in a porcine animal model. This clinical-scale demonstration represents one example of many possible uses of this technology in minimally invasive medical devices. [Conformal electronics and sensors intimately integrated with living tissues enable a new generation of implantable devices capable of addressing important problems in human health.] PMID:20375008
Potential of Cognitive Computing and Cognitive Systems
NASA Astrophysics Data System (ADS)
Noor, Ahmed K.
2015-01-01
Cognitive computing and cognitive technologies are game changers for future engineering systems, as well as for engineering practice and training. They are major drivers for knowledge automation work, and the creation of cognitive products with higher levels of intelligence than current smart products. This paper gives a brief review of cognitive computing and some of the cognitive engineering systems activities. The potential of cognitive technologies is outlined, along with a brief description of future cognitive environments, incorporating cognitive assistants - specialized proactive intelligent software agents designed to follow and interact with humans and other cognitive assistants across the environments. The cognitive assistants engage, individually or collectively, with humans through a combination of adaptive multimodal interfaces, and advanced visualization and navigation techniques. The realization of future cognitive environments requires the development of a cognitive innovation ecosystem for the engineering workforce. The continuously expanding major components of the ecosystem include integrated knowledge discovery and exploitation facilities (incorporating predictive and prescriptive big data analytics); novel cognitive modeling and visual simulation facilities; cognitive multimodal interfaces; and cognitive mobile and wearable devices. The ecosystem will provide timely, engaging, personalized/collaborative learning and effective decision making. It will stimulate creativity and innovation, and prepare the participants to work in future cognitive enterprises and develop new cognitive products of increasing complexity. http://www.aee.odu.edu/cognitivecomp
Brodersen, Søsser; Lindegaard, Hanne
2015-01-01
Currently, a wide variety of healthcare technologies are being implemented in private homes, hospitals, nursing homes, etc. with the triple aim of improving people's health, improving the quality of care, and reducing costs related to healthcare services. In this chapter, we discuss how different actors in a public-private partnership co-developed a heterogeneous system around the Smart Floor to ensure that both new healthcare practices and residents' routines were inscribed into the new healthcare technology. We argue that implementing the Smart Floor was not just a question of buying a technology and integrating it during construction-it required co-development with the healthcare staff. The floor is more than a technology placed under the floor surface in a resident's apartment; rather, it is a heterogeneous network of human and non-human actors communicating with each other. In this chapter, we illustrate how the heterogeneous technological system was co-developed and redesigned during knowledge sharing processes with companies, lead-users, and healthcare staff. We also discuss how care practices have changed as a result of the Smart Floor system. In particular, healthcare staff members no longer feel a need to disturb elderly residents with routine in-person checks. Domesticating the technologies for different groups of actors required not only coordinating communication among sensors, the interface, the portable nurse call (smartphones), and alarms, but also accepting the use of surveillance technology.
Systems and technologies for high-speed inter-office/datacenter interface
NASA Astrophysics Data System (ADS)
Sone, Y.; Nishizawa, H.; Yamamoto, S.; Fukutoku, M.; Yoshimatsu, T.
2017-01-01
Emerging requirements for inter-office/inter-datacenter short-reach links for data center interconnects (DCI) and metro transport networks have led to various inter-office and inter-datacenter optical interface technologies. These technologies are bringing significant changes to systems and network architectures. In this paper, we present a system and ZR optical interface technologies for DCI and metro transport networks, then introduce the latest challenges facing the system framework. There are two trends in reach extension; one is to use Ethernet and the other is to use digital coherent technologies. The first approach achieves reach extension while using as many existing Ethernet components as possible. It offers low cost, as it reuses the cost-effective components created for the large Ethernet market. The second approach adopts low-cost and low-power coherent DSPs that implement a minimal set of long-haul transmission functions. This paper introduces an architecture that integrates both trends. The architecture satisfies both datacom and telecom needs with a common control and management interface and automated configuration.
HSI Guidelines Outline for the Air Vehicle Control Station. Version 2
NASA Technical Reports Server (NTRS)
2006-01-01
This document provides guidance to the FAA and manufacturers on how to develop UAS Pilot Vehicle Interfaces to safely and effectively integrate UASs into the NAS. Preliminary guidelines are provided for Aviate, Communicate, Navigate and Avoid Hazard functions. The pilot shall have information and control capability so that pilot-UA interactions are not adverse or unfavorable and do not compromise safety. Unfavorable interactions include anomalous aircraft-pilot coupling (APC) interactions (closed loop), pilot-involved oscillations (categories I, II or III), and non-oscillatory APC events (e.g., divergence). - Human Systems Integration Pilot-Technology Interface Requirements for Command, Control, and Communications (C3)
The human role in space. Volume 3: Generalizations on human roles in space
NASA Technical Reports Server (NTRS)
1984-01-01
The human role in space was studied. The role and the degree of direct involvement of humans that will be required in future space missions were investigated. Valid criteria for allocating functional activities between humans and machines were established. The technology requirements, economics, and benefits of the human presence in space were examined. Factors which affect crew productivity include: internal architecture; crew support; crew activities; LVA systems; IVA/EVA interfaces; and remote systems management. The accomplished work is reported and the data and analyses from which the study results are derived are included. The results provide information and guidelines to enable NASA program managers and decision makers to establish, early in the design process, the most cost-effective design approach for future space programs, through the optimal application of unique human skills and capabilities in space.
Nakajima, Sawako; Ino, Shuichi; Ifukube, Tohru
2007-01-01
Mixed Reality (MR) technologies have recently been explored in many areas of Human-Machine Interface (HMI) such as medicine, manufacturing, entertainment and education. However, MR sickness, a kind of motion sickness, is caused by sensory conflicts between the real world and the virtual world. The purpose of this paper is to find a new evaluation method for motion and MR sickness. This paper investigates a relationship between the whole-body vibration related to MR technologies and the motion aftereffect (MAE) phenomenon in the human visual system. This MR environment is modeled after advanced driver assistance systems in near-future vehicles. The seated subjects in the MR simulator were shaken in the pitch direction ranging from 0.1 to 2.0 Hz. Results show that MAE is useful for evaluation of MR sickness incidence. In addition, a method to reduce the MR sickness by auditory stimulation is proposed.
NASA Technical Reports Server (NTRS)
Wilber, George F.
2017-01-01
This Software Description Document (SDD) captures the design for developing the Flight Interval Management (FIM) system Configurable Graphics Display (CGD) software. Specifically, this SDD describes aspects of the Boeing CGD software and the surrounding context and interfaces. It does not describe the Honeywell components of the CGD system. The SDD provides the system overview, architectural design, and detailed design with all the necessary information to implement the Boeing components of the CGD software and integrate them into the CGD subsystem within the larger FIM system. Overall system and CGD system-level requirements are derived from the CGD SRS (in turn derived from the Boeing System Requirements Design Document (SRDD)). Display and look-and-feel requirements are derived from Human Machine Interface (HMI) design documents and working group recommendations. This Boeing CGD SDD is required to support the upcoming Critical Design Review (CDR).
A meta-analysis of human-system interfaces in unmanned aerial vehicle (UAV) swarm management.
Hocraffer, Amy; Nam, Chang S
2017-01-01
A meta-analysis was conducted to systematically evaluate the current state of research on human-system interfaces for users controlling semi-autonomous swarms composed of groups of drones or unmanned aerial vehicles (UAVs). UAV swarms pose several human factors challenges, such as high cognitive demands, non-intuitive behavior, and serious consequences for errors. This article presents findings from a meta-analysis of 27 UAV swarm management papers focused on the human-system interface and human factors concerns, providing an overview of the advantages, challenges, and limitations of current UAV management interfaces, as well as information on how these interfaces are currently evaluated. In general, allowing user- and mission-specific customization of user interfaces and raising the swarm's level of autonomy to reduce operator cognitive workload are beneficial and improve situation awareness (SA). It is clear that more research is needed in this rapidly evolving field. Copyright © 2016 Elsevier Ltd. All rights reserved.
Biosensor Technologies for Augmented Brain-Computer Interfaces in the Next Decades
2012-05-13
Index terms: augmented brain-computer interface (ABCI); biosensor; cognitive-state monitoring; electroencephalogram (EEG); human brain imaging.
Cloud-based robot remote control system for smart factory
NASA Astrophysics Data System (ADS)
Wu, Zhiming; Li, Lianzhong; Xu, Yang; Zhai, Jingmei
2015-12-01
With the development of internet technologies and the wide application of robots, there is a clear trend toward the integration of networks and robots. A cloud-based robot remote control system for the smart factory is proposed, which enables remote users to control robots over the network and thereby realize intelligent production. To achieve this, a three-layer system architecture is designed, comprising a user layer, a service layer and a physical layer. Remote control applications running on the cloud server are developed on Microsoft Azure. Moreover, DIV+CSS technologies are used to design the human-machine interface to lower maintenance cost and improve development efficiency. Finally, an experiment is implemented to verify the feasibility of the approach.
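The three-layer architecture can be pictured as a small command relay: the user layer submits a command to the service layer, which queues it until the robot controller in the physical layer polls for it. The single-process sketch below uses invented command names and omits Azure, networking, and the web interface entirely.

```python
from queue import Queue

class ServiceLayer:
    """Cloud-side relay between remote users and factory robots (toy version)."""
    def __init__(self):
        self.queues = {}                          # robot_id -> pending command queue

    def submit(self, robot_id, command):          # called from the user layer
        self.queues.setdefault(robot_id, Queue()).put(command)

    def next_command(self, robot_id):             # polled from the physical layer
        q = self.queues.get(robot_id)
        return None if q is None or q.empty() else q.get()

class RobotController:
    """Physical-layer controller that polls the service layer and acts on commands."""
    def __init__(self, robot_id, service):
        self.robot_id, self.service = robot_id, service

    def poll_and_execute(self):
        command = self.service.next_command(self.robot_id)
        if command:
            print(f"{self.robot_id} executing {command}")   # stand-in for motion control

service = ServiceLayer()
service.submit("arm-01", {"op": "move_joint", "joint": 2, "deg": 15})   # user layer
RobotController("arm-01", service).poll_and_execute()                   # physical layer
```

Keeping the service layer as the only point of contact between users and robots is what lets the user-facing interface evolve independently of the robot controllers.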
Seamless 3D interaction for virtual tables, projection planes, and CAVEs
NASA Astrophysics Data System (ADS)
Encarnacao, L. M.; Bimber, Oliver; Schmalstieg, Dieter; Barton, Robert J., III
2000-08-01
The Virtual Table presents stereoscopic graphics to a user in a workbench-like setting. This device shares with other large-screen display technologies (such as data walls and surround-screen projection systems) the lack of human-centered unencumbered user interfaces and 3D interaction technologies. Such shortcomings present severe limitations to the application of virtual reality (VR) technology to time-critical applications as well as employment scenarios that involve heterogeneous groups of end-users without high levels of computer familiarity and expertise. Traditionally such employment scenarios are common in planning-related application areas such as mission rehearsal and command and control. For these applications, a high grade of flexibility with respect to the system requirements (display and I/O devices) as well as to the ability to seamlessly and intuitively switch between different interaction modalities and interaction are sought. Conventional VR techniques may be insufficient to meet this challenge. This paper presents novel approaches for human-centered interfaces to Virtual Environments focusing on the Virtual Table visual input device. It introduces new paradigms for 3D interaction in virtual environments (VE) for a variety of application areas based on pen-and-clipboard, mirror-in-hand, and magic-lens metaphors, and introduces new concepts for combining VR and augmented reality (AR) techniques. It finally describes approaches toward hybrid and distributed multi-user interaction environments and concludes by hypothesizing on possible use cases for defense applications.
CONTACT: An Air Force technical report on military satellite control technology
NASA Astrophysics Data System (ADS)
Weakley, Christopher K.
1993-07-01
This technical report focuses on Military Satellite Control Technologies and their application to the Air Force Satellite Control Network (AFSCN). This report is a compilation of articles that provide an overview of the AFSCN and the Advanced Technology Program, and discusses relevant technical issues and developments applicable to the AFSCN. Among the topics covered are articles on Future Technology Projections; Future AFSCN Topologies; Modeling of the AFSCN; Wide Area Communications Technology Evolution; Automating AFSCN Resource Scheduling; Health & Status Monitoring at Remote Tracking Stations; Software Metrics and Tools for Measuring AFSCN Software Performance; Human-Computer Interface Working Group; Trusted Systems Workshop; and the University Technical Interaction Program. In addition, Key Technology Area points of contact are listed in the report.
Improving Performance During Image-Guided Procedures
Duncan, James R.; Tabriz, David
2015-01-01
Objective Image-guided procedures have become a mainstay of modern health care. This article reviews how human operators process imaging data and use it to plan procedures and make intraprocedural decisions. Methods A series of models from human factors research, communication theory, and organizational learning were applied to the human-machine interface that occupies the center stage during image-guided procedures. Results Together, these models suggest several opportunities for improving performance as follows: 1. Performance will depend not only on the operator’s skill but also on the knowledge embedded in the imaging technology, available tools, and existing protocols. 2. Voluntary movements consist of planning and execution phases. Performance subscores should be developed that assess quality and efficiency during each phase. For procedures involving ionizing radiation (fluoroscopy and computed tomography), radiation metrics can be used to assess performance. 3. At a basic level, these procedures consist of advancing a tool to a specific location within a patient and using the tool. Paradigms from mapping and navigation should be applied to image-guided procedures. 4. Recording the content of the imaging system allows one to reconstruct the stimulus/response cycles that occur during image-guided procedures. Conclusions When compared with traditional “open” procedures, the technology used during image-guided procedures places an imaging system and long thin tools between the operator and the patient. Taking a step back and reexamining how information flows through an imaging system and how actions are conveyed through human-machine interfaces suggest that much can be learned from studying system failures. In the same way that flight data recorders revolutionized accident investigations in aviation, much could be learned from recording video data during image-guided procedures. PMID:24921628
Speech Acquisition and Automatic Speech Recognition for Integrated Spacesuit Audio Systems
NASA Technical Reports Server (NTRS)
Huang, Yiteng; Chen, Jingdong; Chen, Shaoyan
2010-01-01
A voice-command human-machine interface system has been developed for spacesuit extravehicular activity (EVA) missions. A multichannel acoustic signal processing method has been created for distant speech acquisition in noisy and reverberant environments. This technology reduces noise by exploiting differences in the statistical nature of signal (i.e., speech) and noise that exists in the spatial and temporal domains. As a result, the automatic speech recognition (ASR) accuracy can be improved to the level at which crewmembers would find the speech interface useful. The developed speech human/machine interface will enable both crewmember usability and operational efficiency. It offers a fast rate of data/text entry and a small overall size, and it can be lightweight. In addition, this design will free the hands and eyes of a suited crewmember. The system components and steps include beam forming/multi-channel noise reduction, single-channel noise reduction, speech feature extraction, feature transformation and normalization, feature compression, model adaptation, ASR HMM (Hidden Markov Model) training, and ASR decoding. A state-of-the-art phoneme recognizer can obtain an accuracy rate of 65 percent when the training and testing data are free of noise. When it is used in spacesuits, the rate drops to about 33 percent. With the developed microphone array speech-processing technologies, the performance is improved and the phoneme recognition accuracy rate rises to 44 percent. The recognizer can be further improved by combining the microphone array and HMM model adaptation techniques and using speech samples collected from inside spacesuits. In addition, arithmetic complexity models for the major HMM-based ASR components were developed. They can help real-time ASR system designers select proper tasks when faced with constraints on computational resources.
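The multichannel front end can be illustrated with the simplest beamformer, delay-and-sum: each microphone signal is shifted by its steering delay and the channels are averaged, reinforcing speech arriving from the look direction relative to spatially uncorrelated noise. The sample delays and noise levels below are arbitrary illustration values, not suit-microphone geometry, and the sketch stands in for only the first stage of the pipeline described above.

```python
import numpy as np

def delay_and_sum(channels, delays_samples):
    """Align each channel by its integer steering delay and average the result."""
    n = min(len(ch) - d for ch, d in zip(channels, delays_samples))
    aligned = [ch[d:d + n] for ch, d in zip(channels, delays_samples)]
    return np.mean(aligned, axis=0)

rng = np.random.default_rng(0)
clean = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 8000))        # stand-in "speech" signal
mics = [np.concatenate([np.zeros(d), clean]) + 0.5 * rng.standard_normal(8000 + d)
        for d in (0, 3, 6, 9)]                                  # per-microphone arrival delays
enhanced = delay_and_sum(mics, [0, 3, 6, 9])
print(f"correlation with clean signal: {np.corrcoef(enhanced, clean)[0, 1]:.2f}")
```

Averaging N aligned channels leaves the speech untouched while reducing uncorrelated noise power by roughly a factor of N, which is why the abstract's multichannel stage precedes single-channel noise reduction and feature extraction.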
WTEC panel report on European nuclear instrumentation and controls
NASA Technical Reports Server (NTRS)
White, James D.; Lanning, David D.; Beltracchi, Leo; Best, Fred R.; Easter, James R.; Oakes, Lester C.; Sudduth, A. L.
1991-01-01
Control and instrumentation systems might be called the 'brain' and 'senses' of a nuclear power plant. As such, they become the key elements in the integrated operation of these plants. Recent developments in digital equipment have allowed a dramatic change in the design of these instrument and control (I&C) systems. New designs are evolving with cathode ray tube (CRT)-based control rooms, more automation, and better logical information for the human operators. As these new advanced systems are developed, various decisions must be made about the degree of automation and the human-to-machine interface. Different stages of the development of control automation and of advanced digital systems can be found in various countries. The purpose of this technology assessment is to make a comparative evaluation of the control and instrumentation systems that are being used for commercial nuclear power plants in Europe and the United States. This study is limited to pressurized water reactors (PWRs). Part of the evaluation includes comparisons with a previous similar study assessing Japanese technology.
ERIC Educational Resources Information Center
Barbian, Jeff
2001-01-01
Looks at some of the electronic learning technology that has already been developed and will become common for training, including robots, lucid dreaming, tele-immersion, human interface technology, among others. (JOW)
[Application prospect of human-artificial intelligence system in future manned space flight].
Wei, Jin-he
2003-01-01
To make manned space flight more efficient and safer, the concept of a human-artificial intelligence (human-AI) system is proposed in the present paper. The tasks of future manned space flight and the technical requirements for human-AI system development were analyzed. The main points are as follows: 1) Astronaut and AI are complementary to each other functionally; 2) Both symbolic AI and connectionist AI should be included in the human-AI system, but expert systems and Soar-like systems are used mainly inside the cabin, while COG-like robots are mainly assigned to EVA, either in LEO flight or on the surface of the Moon or Mars; 3) The human-AI system is hierarchical in nature, with the astronaut at the top level; 4) The complex interfaces between astronaut and AI are the key points for running the system reliably and efficiently. Given the importance of the human-AI system in future manned space flight and the complexity of the related technology, it is suggested that R&D should be planned as early as possible.
Joint Service Aircrew Mask (JSAM) - Strategic Aircraft (SA): Noise Attenuation Performance
2015-08-25
Swayne, Billy (Ball Aerospace and Technologies Corp., Dayton, OH); Gallagher, Hilary (Warfighter Interface Division, Battlespace Acoustics Branch)
2004-06-01
such as that represented in the know-how of the master craftsman), and cognitive (know why, perceptions, values, beliefs, and mental models).4... cognitive engineering, educational technology, industrial/organizational psychology, sociology, cultural anthropology, and computational...such as human-human interaction, interface design and evaluation methodology, cognitive models and user models, health and ergonomic studies, empirical
Development and human factors analysis of neuronavigation vs. augmented reality.
Pandya, Abhilash; Siadat, Mohammad-Reza; Auner, Greg; Kalash, Mohammad; Ellis, R Darin
2004-01-01
This paper is focused on the human factors analysis comparing a standard neuronavigation system with an augmented reality system. We use a passive articulated arm (Microscribe, Immersion technology) to track a calibrated end-effector-mounted video camera. In real time, we superimpose the live video view with the synchronized graphical view of CT-derived segmented object(s) of interest within a phantom skull. Using the same robotic arm, we have developed a neuronavigation system able to show the end-effector of the arm on orthogonal CT scans. Both the AR and the neuronavigation systems have been shown to be within 3 mm of accuracy. A human factors study was conducted in which subjects were asked to draw craniotomies and answer questions to gauge their understanding of the phantom objects. The human factors study included 21 subjects and indicated that the subjects performed faster, with greater accuracy and fewer errors, using the augmented reality interface.
Exoskeleton for Soldier Enhancement Systems Feasibility Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jansen, J.F.
2000-09-28
The development of a successful exoskeleton for human performance augmentation (EHPA) will require a multi-disciplinary systems approach based upon sound biomechanics, power generation and actuation systems, controls technology, and operator interfaces. The ability to integrate key components into a system that enhances performance without impeding operator mobility is essential. The purpose of this study and report is to address the issue of feasibility of building a fieldable EHPA. Previous efforts, while demonstrating progress and enhancing knowledge, have not approached the level required for a fully functional, fieldable system. It is doubtless that the technologies required for a successful exoskeleton have advanced, and some of them significantly. The question addressed in this report is whether they have advanced to the point of making a system feasible in the next three to five years. In this study, the key technologies required to successfully build an exoskeleton have been examined. The primary focus has been on the key technologies of power sources, actuators, and controls. Power sources, including internal combustion engines, fuel cells, batteries, super capacitors, and hybrid sources, have been investigated and compared with respect to the exoskeleton application. Both conventional and non-conventional actuator technologies that could impact EHPA have been assessed. In addition to the current state of the art of actuators, the potential for near-term improvements using non-conventional actuators has also been addressed. Control strategies, their implications for the design approach, and the exoskeleton-to-soldier interface have also been investigated. In addition to these key subsystems and technologies, this report addresses technical concepts and issues relating to an integrated design. A recommended approach, based on the results of the study, is also presented.
FwWebViewPlus: integration of web technologies into WinCC OA based Human-Machine Interfaces at CERN
NASA Astrophysics Data System (ADS)
Golonka, Piotr; Fabian, Wojciech; Gonzalez-Berges, Manuel; Jasiun, Piotr; Varela-Rodriguez, Fernando
2014-06-01
The rapid growth in popularity of web applications gives rise to a plethora of reusable graphical components, such as Google Chart Tools and JQuery Sparklines, implemented in JavaScript and run inside a web browser. In this paper we describe a tool that allows for seamless integration of web-based widgets into WinCC Open Architecture, the SCADA system commonly used at CERN to build complex Human-Machine Interfaces. Reusing widely available widget libraries and pushing development efforts to a higher abstraction layer based on a scripting language allow for a significant reduction in code maintenance in multi-platform environments compared to the C++ visualization plugins currently used. Adequately designed interfaces allow for rapid integration of new web widgets into WinCC OA. At the same time, the mechanisms familiar to HMI developers are preserved, making the use of new widgets "native". Perspectives for further integration between the realms of WinCC OA and web development are also discussed.
Evidence Report, Risk of Inadequate Design of Human and Automation/Robotic Integration
NASA Technical Reports Server (NTRS)
Zumbado, Jennifer Rochlis; Billman, Dorrit; Feary, Mike; Green, Collin
2011-01-01
The success of future exploration missions depends, even more than today, on effective integration of humans and technology (automation and robotics). This will not emerge by chance, but by design. Both crew and ground personnel will need to do more demanding tasks in more difficult conditions, amplifying the costs of poor design and the benefits of good design. This report has looked at the importance of good design and the risks from poor design from several perspectives: 1) If the relevant functions needed for a mission are not identified, then designs of technology and its use by humans are unlikely to be effective: critical functions will be missing and irrelevant functions will mislead or drain attention. 2) If functions are not distributed effectively among the (multiple) participating humans and automation/robotic systems, later design choices can do little to repair this: additional unnecessary coordination work may be introduced, workload may be redistributed to create problems, limited human attentional resources may be wasted, and the capabilities of both humans and technology underused. 3) If the design does not promote accurate understanding of the capabilities of the technology, the operators will not use the technology effectively: the system may be switched off in conditions where it would be effective, or used for tasks or in contexts where its effectiveness may be very limited. 4) If an ineffective interaction design is implemented and put into use, a wide range of problems can ensue. Many involve lack of transparency into the system: operators may be unable or find it very difficult to determine a) the current state and changes of state of the automation or robot, b) the current state and changes in state of the system being controlled or acted on, and c) what actions by human or by system had what effects. 5) If the human interfaces for operation and control of robotic agents are not designed to accommodate the unique points of view and operating environments of both the human and the robotic agent, then effective human-robot coordination cannot be achieved.
Physiologically relevant organs on chips
Yum, Kyungsuk; Hong, Soon Gweon; Lee, Luke P.
2015-01-01
Recent advances in integrating microengineering and tissue engineering have generated promising microengineered physiological models for experimental medicine and pharmaceutical research. Here we review the recent development of microengineered physiological systems, or organs on chips, that reconstitute the physiologically critical features of specific human tissues and organs and their interactions. This technology uses microengineering approaches to construct organ-specific microenvironments, reconstituting tissue structures, tissue–tissue interactions and interfaces, and dynamic mechanical and biochemical stimuli found in specific organs, to direct cells to assemble into functional tissues. We first discuss microengineering approaches to reproduce the key elements of physiologically important, dynamic mechanical microenvironments, biochemical microenvironments, and microarchitectures of specific tissues and organs in microfluidic cell culture systems. This is followed by examples of microengineered individual organ models that incorporate the key elements of physiological microenvironments into single microfluidic cell culture systems to reproduce organ-level functions. Finally, microengineered multiple organ systems that simulate multiple organ interactions to better represent human physiology, including human responses to drugs, are covered in this review. This emerging organs-on-chips technology has the potential to become an alternative to 2D and 3D cell culture and animal models for experimental medicine, human disease modeling, drug development, and toxicology. PMID:24357624
Human Machine Interfaces for Teleoperators and Virtual Environments
NASA Technical Reports Server (NTRS)
Durlach, Nathaniel I. (Compiler); Sheridan, Thomas B. (Compiler); Ellis, Stephen R. (Compiler)
1991-01-01
In Mar. 1990, a meeting organized around the general theme of teleoperation research into virtual environment display technology was conducted. This is a collection of conference-related fragments that will give a glimpse of the potential of the following fields and how they interplay: sensorimotor performance; human-machine interfaces; teleoperation; virtual environments; performance measurement and evaluation methods; and design principles and predictive models.
A user interface framework for the Square Kilometre Array: concepts and responsibilities
NASA Astrophysics Data System (ADS)
Marassi, Alessandro; Brajnik, Giorgio; Nicol, Mark; Alberti, Valentina; Le Roux, Gerhard
2016-07-01
The Square Kilometre Array (SKA) project is responsible for developing the SKA Observatory, the world's largest radio telescope, with eventually over a square kilometre of collecting area and including a general headquarters as well as two radio telescopes: SKA1-Mid in South Africa and SKA1-Low in Australia. The SKA project consists of a number of subsystems (elements) among which the Telescope Manager (TM) is the one involved in controlling and monitoring the SKA telescopes. The TM element has three primary responsibilities: management of astronomical observations, management of telescope hardware and software subsystems, management of data to support system operations and all stakeholders (operators, maintainers, engineers and science users) in achieving operational, maintenance and engineering goals. Operators, maintainers, engineers and science users will interact with TM via appropriate user interfaces (UI). The TM UI framework envisaged is a complete set of general technical solutions (components, technologies and design information) for implementing a generic computing system (UI platform). Such a system will enable UI components to be instantiated to allow for human interaction via screens, keyboards, mouse and to implement the necessary logic for acquiring or deriving the information needed for interaction. It will provide libraries and specific Application Programming Interfaces (APIs) to implement operator and engineer interactive interfaces. This paper will provide a status update of the TM UI framework, UI platform and UI components design effort, including the technology choices, and discuss key challenges in the TM UI architecture, as well as our approaches to addressing them.
Human reliability assessment: tools for law enforcement
NASA Astrophysics Data System (ADS)
Ryan, Thomas G.; Overlin, Trudy K.
1997-01-01
This paper suggests ways in which human reliability analysis (HRA) can assist the United States Justice System, and more specifically law enforcement, in enhancing the reliability of the process from evidence gathering through adjudication. HRA is an analytic process for identifying, describing, quantifying, and interpreting the state of human performance, and for developing and recommending enhancements based on the results of individual HRAs. It also draws on lessons learned from compilations of several HRAs. Given the high legal standards the Justice System is bound to, human errors that might appear trivial in other venues can make the difference between a successful and an unsuccessful prosecution. HRA has made a major contribution to the efficiency, favorable cost-benefit ratio, and overall success of many enterprises where humans interface with sophisticated technologies, such as the military, ground transportation, chemical and oil production, nuclear power generation, commercial aviation, and space flight. Each of these enterprises presents similar challenges to the humans responsible for executing actions and action sequences, especially where problem solving and decision making are concerned. Nowhere are humans confronted to a greater degree with problem solving and decision making than are the diverse individuals and teams responsible for arrest and adjudication in criminal proceedings. This paper concludes that, because of the parallels between the aforementioned technologies and the adjudication process, especially crime-scene evidence gathering, there is reason to believe that the HRA technology developed and enhanced in other applications can be transferred to the Justice System with minimal cost and significant payoff.
Cognitive Awareness Prototype Development on User Interface Design
ERIC Educational Resources Information Center
Rosli, D'oria Islamiah
2015-01-01
Human error is a crucial problem in manufacturing industries. Due to misinterpretation of information in interface system designs, accidents or deaths may occur in the workplace. A lack of human cognition criteria in interface system design is also one of the contributors to failure to use such systems effectively. Therefore, this paper describes…
Formally verifying human–automation interaction as part of a system model: limitations and tradeoffs
Bass, Ellen J.
2011-01-01
Both the human factors engineering (HFE) and formal methods communities are concerned with improving the design of safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to perform formal verification of human–automation interaction with a programmable device. This effort utilizes a system architecture composed of independent models of the human mission, human task behavior, human-device interface, device automation, and operational environment. The goals of this architecture were to allow HFE practitioners to perform formal verifications of realistic systems that depend on human–automation interaction in a reasonable amount of time using representative models, intuitive modeling constructs, and decoupled models of system components that could be easily changed to support multiple analyses. This framework was instantiated using a patient controlled analgesia pump in a two phased process where models in each phase were verified using a common set of specifications. The first phase focused on the mission, human-device interface, and device automation; and included a simple, unconstrained human task behavior model. The second phase replaced the unconstrained task model with one representing normative pump programming behavior. Because models produced in the first phase were too large for the model checker to verify, a number of model revisions were undertaken that affected the goals of the effort. While the use of human task behavior models in the second phase helped mitigate model complexity, verification time increased. Additional modeling tools and technological developments are necessary for model checking to become a more usable technique for HFE. PMID:21572930
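At its core, the verification approach described above amounts to exhaustively exploring the joint states of coupled human-task and device models and checking that no specification-violating state is reachable. The following is a minimal, illustrative sketch of that idea, using toy hand-written automata rather than the authors' models or model checker; all state names, actions, and the checked property are invented for illustration.

```python
from collections import deque

# Minimal sketch (not the authors' toolchain): explicit-state reachability check
# over the synchronous product of a toy device model and a toy human-task model.
# State names, actions, and the property below are invented for illustration.

# Device automaton: state -> {action: next_state}
device = {
    "idle":     {"press_start": "infusing", "press_stop": "idle"},
    "infusing": {"press_stop": "idle", "press_start": "infusing"},
}

# Human-task automaton (normative behavior): state -> {action: next_state}
task = {
    "program": {"press_start": "monitor"},
    "monitor": {"press_stop": "done"},
    "done":    {},
}

def reachable(unsafe):
    """Breadth-first search over joint (device, task) states; return True if
    any joint state in `unsafe` can be reached from the initial state."""
    start = ("idle", "program")
    seen, frontier = {start}, deque([start])
    while frontier:
        d, t = frontier.popleft()
        if (d, t) in unsafe:
            return True
        # Only actions allowed by the task model are taken (the second-phase idea);
        # an unconstrained human model would instead try every device action.
        for action, t_next in task[t].items():
            d_next = device[d].get(action, d)
            nxt = (d_next, t_next)
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return False

# Specification: the task should never be "done" while the pump is still infusing.
print(reachable({("infusing", "done")}))  # -> False for this toy model
```

Real model checkers add temporal-logic specifications, symbolic state representations, and counterexample traces; the scalability limits discussed in the abstract arise because realistic mission, interface, and automation models multiply this joint state space enormously.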
NASA Astrophysics Data System (ADS)
Miękina, Andrzej; Wagner, Jakub; Mazurek, Paweł; Morawski, Roman Z.; Sudmann, Tobba T.; Børsheim, Ingebjørg T.; Øvsthus, Knut; Jacobsen, Frode F.; Ciamulski, Tomasz; Winiecki, Wiesław
2016-11-01
The importance of research on new technologies that could be employed in care services for elderly and disabled persons is highlighted. The advantages of radar sensors, when applied to non-invasive monitoring of such persons in their home environment, are indicated. The need for comprehensible visualisation of the intermediate results of measurement data processing is justified. The capability of an impulse-radar-based system to provide information of crucial importance to medical or healthcare personnel is investigated. An exemplary software interface, tailored for non-technical users, is proposed, and preliminary results of impulse-radar-based monitoring of human movements are demonstrated.
NASA Technical Reports Server (NTRS)
Truong, Long V.; Walters, Jerry L.; Roth, Mary Ellen; Quinn, Todd M.; Krawczonek, Walter M.
1990-01-01
The goal of the Autonomous Power System (APS) program is to develop and apply intelligent problem solving and control to the Space Station Freedom Electrical Power System (SSF/EPS) testbed being developed and demonstrated at NASA Lewis Research Center. The objectives of the program are to establish artificial intelligence technology paths, to craft knowledge-based tools with advanced human-operator interfaces for power systems, and to interface and integrate knowledge-based systems with conventional controllers. The Autonomous Power EXpert (APEX) portion of the APS program will integrate a knowledge-based fault diagnostic system and a power resource planner-scheduler. Then APEX will interface on-line with the SSF/EPS testbed and its Power Management Controller (PMC). The key tasks include establishing knowledge bases for system diagnostics, fault detection and isolation analysis, on-line information accessing through PMC, enhanced data management, and multiple-level, object-oriented operator displays. The first prototype of the diagnostic expert system for fault detection and isolation has been developed. The knowledge bases and the rule-based model that were developed for the Power Distribution Control Unit subsystem of the SSF/EPS testbed are described. A corresponding troubleshooting technique is also described.
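Although the APEX knowledge bases are not reproduced here, the basic fault detection and isolation pattern they embody, rules that fire on telemetry conditions and emit diagnoses, can be sketched in a few lines. The telemetry names, thresholds, and rules below are hypothetical illustrations, not the SSF/EPS testbed knowledge base.

```python
# Minimal sketch of a rule-based fault detection and isolation pass, in the
# spirit of a diagnostic expert system for a power distribution unit. The
# telemetry names, thresholds, and rules are hypothetical illustrations only.

telemetry = {"bus_voltage": 118.0, "load_current": 0.1, "switch_cmd": "closed"}

rules = [
    # (condition over a telemetry snapshot, diagnosis)
    (lambda t: t["bus_voltage"] < 100.0, "undervoltage on distribution bus"),
    (lambda t: t["switch_cmd"] == "closed" and t["load_current"] < 0.5,
     "commanded-closed switch carries no load: possible open contact or tripped breaker"),
    (lambda t: t["load_current"] > 30.0, "overcurrent: possible load fault"),
]

def diagnose(t):
    """Return the list of diagnoses whose conditions fire on the telemetry snapshot."""
    return [msg for cond, msg in rules if cond(t)]

print(diagnose(telemetry))
```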
Avatars and virtual agents – relationship interfaces for the elderly
2017-01-01
In the Digital Era, the authors witness a change in the relationship between the patient and the care-giver or Health Maintenance Organization providing the health services. Another development is the use of various technologies to increase the effectiveness and quality of health services across all primary and secondary users. These technologies range from telemedicine systems, decision-making tools, and online and self-service applications to virtual agents, all providing information and assistance. The common thread among all these digital implementations is that they all require human-machine interfaces. These interfaces must be interactive, user friendly, and inviting, to create user involvement and cooperation incentives. The challenge is to design interfaces that best fit the target users and enable smooth interaction, especially for elderly users. Avatars and Virtual Agents are one type of interface used for both home care monitoring and companionship. They are also inherently multimodal in nature and allow an intimate relation between the elderly user and the Avatar. This study discusses the need for and nature of these relationship models and the challenges of designing for the elderly. The study proposes key features for the design and evaluation of assistive applications using Avatars and Virtual Agents for elderly users. PMID:28706725
Guidelines and Capabilities for Designing Human Missions
NASA Technical Reports Server (NTRS)
Allen, Christopher S.; Burnett, Rebeka; Charles, John; Cucinotta, Frank; Fullerton, Richard; Goodman, Jerry R.; Griffith, Anthony D., Sr.; Kosmo, Joseph J.; Perchonok, Michele; Railsback, Jan;
2003-01-01
These guidelines and capabilities identify the points of intersection between human spaceflight crews and mission considerations such as architecture, vehicle design, technologies, operations, and science requirements. In these chapters, we will provide clear, top-level guidelines for human-related exploration studies and technology research that will address common questions and requirements. As a result, we hope that ongoing mission trade studies will consider common, standard, and practical criteria for human interfaces.
Wearable computer for mobile augmented-reality-based controlling of an intelligent robot
NASA Astrophysics Data System (ADS)
Turunen, Tuukka; Roening, Juha; Ahola, Sami; Pyssysalo, Tino
2000-10-01
An intelligent robot can be utilized to perform tasks that are either hazardous or unpleasant for humans. Such tasks include working in disaster areas or conditions that are, for example, too hot. An intelligent robot can work on its own to some extent, but in some cases the aid of humans will be needed. This requires means for controlling the robot from somewhere else, i.e. teleoperation. Mobile augmented reality can be utilized as a user interface to the environment, as it enhances the user's perception of the situation compared to other interfacing methods and allows the user to perform other tasks while controlling the intelligent robot. Augmented reality is a method that combines virtual objects into the user's perception of the real world. As computer technology evolves, it is possible to build very small devices that have sufficient capabilities for augmented reality applications. We have evaluated existing wearable computers and mobile augmented reality systems to build a prototype of a future mobile terminal, the CyPhone. A wearable computer with sufficient system resources for applications, wireless communication media with sufficient throughput, and enough interfaces for peripherals has been built at the University of Oulu. It is self-sustained in energy, with enough operating time for the applications to be useful, and uses accurate positioning systems.
NASA Technical Reports Server (NTRS)
1991-01-01
Summary reports on each of the eight tasks undertaken by this contract are given. Discussed here is an evaluation of a Closed Ecological Life Support System (CELSS), including modeling and analysis of Physical/Chemical Closed Loop Life Support (P/C CLLS); the Environmental Control and Life Support Systems (ECLSS) evolution - Intermodule Ventilation study; advanced technologies interface requirements relative to ECLSS; an ECLSS resupply analysis; the ECLSS module addition relocation systems engineering analysis; an ECLSS cost/benefit analysis to identify rack-level interface requirements of the alternate technologies evaluated in the ventilation study, with a comparison of these with the rack level interface requirements for the baseline technologies; advanced instrumentation - technology database enhancement; and a clean room survey and assessment of various ECLSS evaluation options for different growth scenarios.
A conformal, bio-interfaced class of silicon electronics for mapping cardiac electrophysiology.
Viventi, Jonathan; Kim, Dae-Hyeong; Moss, Joshua D; Kim, Yun-Soung; Blanco, Justin A; Annetta, Nicholas; Hicks, Andrew; Xiao, Jianliang; Huang, Younggang; Callans, David J; Rogers, John A; Litt, Brian
2010-03-24
In all current implantable medical devices such as pacemakers, deep brain stimulators, and epilepsy treatment devices, each electrode is independently connected to separate control systems. The ability of these devices to sample and stimulate tissues is hindered by this configuration and by the rigid, planar nature of the electronics and the electrode-tissue interfaces. Here, we report the development of a class of mechanically flexible silicon electronics for multiplexed measurement of signals in an intimate, conformal integrated mode on the dynamic, three-dimensional surfaces of soft tissues in the human body. We demonstrate this technology in sensor systems composed of 2016 silicon nanomembrane transistors configured to record electrical activity directly from the curved, wet surface of a beating porcine heart in vivo. The devices sample with simultaneous submillimeter and submillisecond resolution through 288 amplified and multiplexed channels. We use this system to map the spread of spontaneous and paced ventricular depolarization in real time, at high resolution, on the epicardial surface in a porcine animal model. This demonstration is one example of many possible uses of this technology in minimally invasive medical devices.
Knowledge-Based Extensible Natural Language Interface Technology Program
1989-11-30
natural language as its own meta-language to explain the meaning and attributes of the words and idioms of the language. Educational courses in language...understood and used by Lydia for human-computer dialogue. The KL enables a systems developer or "teacher-user" to build the system to a point where new...language can be "formal" as in a structured educational language program or it can be "informal" as in the case of a person consulting a dictionary for the
NASA Astrophysics Data System (ADS)
Demir, I.
2015-12-01
Recent developments in internet technologies make it possible to manage and visualize large data sets on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. This presentation showcases information communication interfaces, games, and virtual and immersive reality applications for supporting teaching and learning of concepts in the atmospheric and hydrological sciences. The information communication platforms utilize the latest web technologies and allow users to access and visualize large-scale data on the web. The simulation system is a web-based 3D interactive learning environment for teaching hydrological and atmospheric processes and concepts. It provides a visually striking platform with realistic terrain, weather information, and water simulation. The web-based simulation system provides an environment for students to learn about earth science processes and the effects of development and human activity on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users.
A Robot System for Remote Book Browsing
NASA Astrophysics Data System (ADS)
Tomizawa, Tetsuo; Ohya, Akihisa; Yuta, Shin'ichi
This paper describes a system that uses a mobile manipulator located in a library as a teleoperated tool for browsing books from a remote location via the Internet. In developing this system, we designed and built a robot specially equipped to browse selected books, organized around three basic goals: (1) picking up the book with a manipulator, (2) opening the book, and (3) turning pages with a purpose-built browsing device. This paper also describes the human interface built by integrating Internet technologies and summarizes some considerations about the system.
Evaluation of display technologies for Internet of Things (IoT)
NASA Astrophysics Data System (ADS)
Sabo, Julia; Fegert, Tobias; Cisowski, Matthäus Stephanus; Marsal, Anatolij; Eichberger, Domenik; Blankenbach, Karlheinz
2017-02-01
The Internet of Things (IoT) is a booming industry. We investigated several (semi-)professional IoT devices in combination with displays (with a focus on reflective technologies) and LEDs. First, these displays were compared for reflectance and ambient-light performance. Two measurement set-ups with diffuse conditions were used to simulate typical indoor lighting conditions for IoT displays. E-paper displays were evaluated best, as they combine a relatively high reflectance with a large contrast ratio. Reflective monochrome LCDs show a lower reflectance but are widely available. Second, we studied IoT microprocessor interfaces to displays. A µP can drive single LEDs and one or two Seg-8 LED digits directly via GPIOs. Other display technologies require display controllers with a parallel or serial interface to the microprocessor, as they need dedicated waveforms for driving the pixels. Most suitable are display modules with built-in display RAM, as only the pixel data that change have to be transferred. An HDMI output (e.g. Raspberry Pi) results in high display cost; therefore, AMLCDs are not suitable for low- to medium-cost IoT systems. We furthermore compared and evaluated status-indicator, icon, text, and graphics IoT display systems regarding human-machine interface (HMI) characteristics, effectiveness, and power consumption. We found that low-resolution bistable graphic e-paper displays are the most appropriate display technology for IoT systems, as they continue to show information after a power failure or power switch-off during maintenance and can display, e.g., QR codes for installation. LED indicators are the most cost-effective approach but have very limited HMI capabilities.
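As a rough illustration of the ambient-light comparison described above, the luminance of a purely reflective display under diffuse illumination can be approximated with a Lambertian model (luminance = illuminance × reflectance / π). The sketch below uses illustrative reflectance values, not the measurements from this study.

```python
import math

# Minimal sketch: estimate luminance and contrast of purely reflective displays
# under diffuse ambient illumination, assuming a Lambertian surface
# (luminance = illuminance * reflectance / pi). Reflectance values are
# illustrative placeholders, not data from the paper.

def luminance_cd_m2(illuminance_lux, reflectance):
    return illuminance_lux * reflectance / math.pi

indoor_lux = 500.0                      # typical office lighting level
displays = {
    "e-paper":        {"r_white": 0.40, "r_black": 0.03},
    "reflective LCD": {"r_white": 0.25, "r_black": 0.05},
}

for name, r in displays.items():
    lw = luminance_cd_m2(indoor_lux, r["r_white"])
    lb = luminance_cd_m2(indoor_lux, r["r_black"])
    print(f"{name}: white {lw:.1f} cd/m^2, contrast ratio {lw / lb:.1f}:1")
```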
NASA Astrophysics Data System (ADS)
Black, Christopher; Voigts, Jakob; Agrawal, Uday; Ladow, Max; Santoyo, Juan; Moore, Christopher; Jones, Stephanie
2017-06-01
Objective. Electroencephalography (EEG) offers a unique opportunity to study human neural activity non-invasively with millisecond resolution using minimal equipment in or outside of a lab setting. EEG can be combined with a number of techniques for closed-loop experiments, where external devices are driven by specific neural signals. However, reliable, commercially available EEG systems are expensive, often making them impractical for individual use and research development. Moreover, by design, a majority of these systems cannot be easily altered to the specification needed by the end user. We focused on mitigating these issues by implementing open-source tools to develop a new EEG platform to drive down research costs and promote collaboration and innovation. Approach. Here, we present methods to expand the open-source electrophysiology system, Open Ephys (www.openephys.org), to include human EEG recordings. We describe the equipment and protocol necessary to interface various EEG caps with the Open Ephys acquisition board, and detail methods for processing data. We present applications of Open Ephys + EEG as a research tool and discuss how this innovative EEG technology lays a framework for improved closed-loop paradigms and novel brain-computer interface experiments. Main results. The Open Ephys + EEG system can record reliable human EEG data, as well as human EMG data. A side-by-side comparison of eyes closed 8-14 Hz activity between the Open Ephys + EEG system and the Brainvision ActiCHamp EEG system showed similar average power and signal to noise. Significance. Open Ephys + EEG enables users to acquire high-quality human EEG data comparable to that of commercially available systems, while maintaining the price point and extensibility inherent to open-source systems.
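As one illustration of the kind of 8-14 Hz comparison reported above, band power can be estimated from an EEG trace with Welch's method. The sketch below runs on synthetic data and is not the authors' analysis pipeline.

```python
import numpy as np
from scipy.signal import welch

# Minimal sketch of an 8-14 Hz band-power estimate of the kind described above,
# run here on synthetic data rather than Open Ephys recordings.

fs = 1000.0                                # sampling rate in Hz
t = np.arange(0, 10.0, 1.0 / fs)
rng = np.random.default_rng(0)
# Synthetic "eyes closed" EEG: a 10 Hz alpha rhythm plus broadband noise.
eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * rng.standard_normal(t.size)

f, psd = welch(eeg, fs=fs, nperseg=2048)

def band_power(f, psd, lo, hi):
    """Integrate the power spectral density over [lo, hi] Hz."""
    mask = (f >= lo) & (f <= hi)
    return np.sum(psd[mask]) * (f[1] - f[0])

alpha = band_power(f, psd, 8, 14)
broadband = band_power(f, psd, 1, 40)
print(f"alpha power: {alpha:.3e} V^2, alpha/broadband ratio: {alpha / broadband:.2f}")
```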
Black, Christopher; Voigts, Jakob; Agrawal, Uday; Ladow, Max; Santoyo, Juan; Moore, Christopher; Jones, Stephanie
2017-06-01
Electroencephalography (EEG) offers a unique opportunity to study human neural activity non-invasively with millisecond resolution using minimal equipment in or outside of a lab setting. EEG can be combined with a number of techniques for closed-loop experiments, where external devices are driven by specific neural signals. However, reliable, commercially available EEG systems are expensive, often making them impractical for individual use and research development. Moreover, by design, a majority of these systems cannot be easily altered to the specification needed by the end user. We focused on mitigating these issues by implementing open-source tools to develop a new EEG platform to drive down research costs and promote collaboration and innovation. Here, we present methods to expand the open-source electrophysiology system, Open Ephys (www.openephys.org), to include human EEG recordings. We describe the equipment and protocol necessary to interface various EEG caps with the Open Ephys acquisition board, and detail methods for processing data. We present applications of Open Ephys + EEG as a research tool and discuss how this innovative EEG technology lays a framework for improved closed-loop paradigms and novel brain-computer interface experiments. The Open Ephys + EEG system can record reliable human EEG data, as well as human EMG data. A side-by-side comparison of eyes closed 8-14 Hz activity between the Open Ephys + EEG system and the Brainvision ActiCHamp EEG system showed similar average power and signal to noise. Open Ephys + EEG enables users to acquire high-quality human EEG data comparable to that of commercially available systems, while maintaining the price point and extensibility inherent to open-source systems.
The development of the Canadian Mobile Servicing System Kinematic Simulation Facility
NASA Technical Reports Server (NTRS)
Beyer, G.; Diebold, B.; Brimley, W.; Kleinberg, H.
1989-01-01
Canada will develop a Mobile Servicing System (MSS) as its contribution to the U.S./International Space Station Freedom. Components of the MSS will include a remote manipulator (SSRMS), a Special Purpose Dexterous Manipulator (SPDM), and a mobile base (MRS). In order to support requirements analysis and the evaluation of operational concepts related to the use of the MSS, a graphics based kinematic simulation/human-computer interface facility has been created. The facility consists of the following elements: (1) A two-dimensional graphics editor allowing the rapid development of virtual control stations; (2) Kinematic simulations of the space station remote manipulators (SSRMS and SPDM), and mobile base; and (3) A three-dimensional graphics model of the space station, MSS, orbiter, and payloads. These software elements combined with state of the art computer graphics hardware provide the capability to prototype MSS workstations, evaluate MSS operational capabilities, and investigate the human-computer interface in an interactive simulation environment. The graphics technology involved in the development and use of this facility is described.
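The kinematic core of such a graphical simulation facility is the chaining of joint transforms to place each link of the manipulator. A minimal planar forward-kinematics sketch is shown below; the link lengths and joint angles are illustrative and do not represent SSRMS or SPDM geometry.

```python
import numpy as np

# Minimal sketch of the forward-kinematics core of a graphical manipulator
# simulation: chain planar joint rotations to locate each link's tip.
# Link lengths and joint angles are illustrative, not SSRMS/SPDM geometry.

def planar_fk(link_lengths, joint_angles):
    """Return the (x, y) position of the base and of each successive link tip."""
    points = [(0.0, 0.0)]
    x, y, theta = 0.0, 0.0, 0.0
    for length, q in zip(link_lengths, joint_angles):
        theta += q
        x += length * np.cos(theta)
        y += length * np.sin(theta)
        points.append((x, y))
    return points

links = [7.0, 7.0, 2.0]                      # metres (illustrative)
angles = np.radians([30.0, -45.0, 10.0])     # joint angles
for i, p in enumerate(planar_fk(links, angles)):
    print(f"joint {i}: x={p[0]:.2f} m, y={p[1]:.2f} m")
```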
A motion sensing-based framework for robotic manipulation.
Deng, Hao; Xia, Zeyang; Weng, Shaokui; Gan, Yangzhou; Fang, Peng; Xiong, Jing
2016-01-01
To date, outside of controlled environments, robots normally perform manipulation tasks in cooperation with humans. This pattern requires robot operators to have extensive technical training for varied teach-pendant operating systems. Motion-sensing technology, which enables human-machine interaction through a novel and natural gesture-based interface, has inspired us to adopt this user-friendly and straightforward operation mode for robotic manipulation. Thus, in this paper, we present a motion sensing-based framework for robotic manipulation, which recognizes gesture commands captured from a motion-sensing input device and drives the actions of robots. For compatibility, a general hardware interface layer was also developed within the framework. Simulation and physical experiments have been conducted for preliminary validation. The results show that the proposed framework is an effective approach to general robotic manipulation with motion-sensing control.
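A framework of this kind ultimately maps recognized gestures onto robot actions through a hardware-abstraction layer. The sketch below illustrates that dispatch pattern with hypothetical gesture labels and a stub robot driver; it is not the authors' implementation.

```python
# Minimal sketch of a gesture-to-command dispatch layer over a generic hardware
# interface, in the spirit of the framework described above. Gesture labels and
# the robot driver are hypothetical stand-ins for a real recognizer and robot.

class RobotDriver:
    """Stand-in for a hardware interface layer; a real driver would translate
    these calls into controller- or vendor-specific commands."""
    def move(self, dx, dy, dz):
        print(f"move end-effector by ({dx}, {dy}, {dz}) m")
    def grip(self, close):
        print("close gripper" if close else "open gripper")

GESTURE_MAP = {
    "swipe_left":  lambda r: r.move(-0.05, 0.0, 0.0),
    "swipe_right": lambda r: r.move(+0.05, 0.0, 0.0),
    "fist":        lambda r: r.grip(True),
    "open_hand":   lambda r: r.grip(False),
}

def dispatch(gesture, robot):
    """Run the action bound to a recognized gesture, ignoring unknown labels."""
    action = GESTURE_MAP.get(gesture)
    if action is None:
        print(f"ignored unrecognized gesture: {gesture}")
    else:
        action(robot)

robot = RobotDriver()
for g in ["swipe_right", "fist", "wave"]:   # e.g. output of a motion-sensing recognizer
    dispatch(g, robot)
```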
Oppold, P; Rupp, M; Mouloua, M; Hancock, P A; Martin, J
2012-01-01
Unmanned systems (UAVs, UCAVs, and UGVs) still present major human factors and ergonomic challenges related to the effective design of their control interface systems, which are crucial to their efficient operation, maintenance, and safety. Unmanned-system interfaces designed with a human-centered approach are more intuitive and easier to learn, and they reduce human errors and other cognitive ergonomic problems with interface design. Automation has shifted workload from physical to cognitive, so control interfaces for unmanned systems need to reduce the mental workload on operators and facilitate the interaction between vehicle and operator. Two-handed video game controllers provide wide usability within the overall population, prior exposure for new operators, and a variety of interface complexity levels to match the complexity level of the task and reduce cognitive load. This paper categorizes and provides a taxonomy for 121 haptic interfaces from the entertainment industry that can be utilized as control interfaces for unmanned systems. Five categories of controllers were defined based on the complexity of the buttons, control pads, joysticks, and switches on the controller. This allows selection of the level of complexity needed for a specific task without creating an entirely new design or utilizing an overly complex one.
Towards Rehabilitation Robotics: Off-the-Shelf BCI Control of Anthropomorphic Robotic Arms.
Athanasiou, Alkinoos; Xygonakis, Ioannis; Pandria, Niki; Kartsidis, Panagiotis; Arfaras, George; Kavazidi, Kyriaki Rafailia; Foroglou, Nicolas; Astaras, Alexander; Bamidis, Panagiotis D
2017-01-01
Advances in neural interfaces have demonstrated remarkable results in the direction of replacing and restoring lost sensorimotor function in human patients. Noninvasive brain-computer interfaces (BCIs) are popular due to considerable advantages including simplicity, safety, and low cost, while recent advances aim at improving past technological and neurophysiological limitations. Taking into account the neurophysiological alterations of disabled individuals, investigating brain connectivity features for implementation of BCI control holds special importance. Off-the-shelf BCI systems are based on fast, reproducible detection of mental activity and can be implemented in neurorobotic applications. Moreover, social Human-Robot Interaction (HRI) is increasingly important in rehabilitation robotics development. In this paper, we present our progress and goals towards developing off-the-shelf BCI-controlled anthropomorphic robotic arms for assistive technologies and rehabilitation applications. We account for robotics development, BCI implementation, and qualitative assessment of HRI characteristics of the system. Furthermore, we present two illustrative experimental applications of the BCI-controlled arms, a study of motor imagery modalities on healthy individuals' BCI performance, and a pilot investigation on spinal cord injured patients' BCI control and brain connectivity. We discuss strengths and limitations of our design and propose further steps on development and neurophysiological study, including implementation of connectivity features as BCI modality.
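Off-the-shelf motor-imagery BCIs of the kind discussed typically classify trials from band-power features using simple linear classifiers. The sketch below illustrates that step on synthetic features with scikit-learn's LDA; it is not the authors' pipeline or data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Minimal sketch of a motor-imagery classification step of the kind an
# off-the-shelf BCI might use: mu/beta band-power features per channel,
# classified with LDA. The features here are synthetic, not study data.

rng = np.random.default_rng(1)
n_trials, n_features = 80, 8                 # e.g. mu/beta power on 4 channels

# Synthetic features: "left hand" imagery suppresses power on half the channels.
left = rng.normal(loc=1.0, scale=0.3, size=(n_trials, n_features))
left[:, :4] -= 0.4                           # event-related desynchronization
right = rng.normal(loc=1.0, scale=0.3, size=(n_trials, n_features))
right[:, 4:] -= 0.4

X = np.vstack([left, right])
y = np.array([0] * n_trials + [1] * n_trials)

clf = LinearDiscriminantAnalysis().fit(X[::2], y[::2])   # train on even trials
accuracy = clf.score(X[1::2], y[1::2])                    # test on odd trials
print(f"held-out accuracy: {accuracy:.2f}")
```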
Towards Rehabilitation Robotics: Off-the-Shelf BCI Control of Anthropomorphic Robotic Arms
Xygonakis, Ioannis; Pandria, Niki; Kartsidis, Panagiotis; Arfaras, George; Kavazidi, Kyriaki Rafailia; Foroglou, Nicolas
2017-01-01
Advances in neural interfaces have demonstrated remarkable results in the direction of replacing and restoring lost sensorimotor function in human patients. Noninvasive brain-computer interfaces (BCIs) are popular due to considerable advantages including simplicity, safety, and low cost, while recent advances aim at improving past technological and neurophysiological limitations. Taking into account the neurophysiological alterations of disabled individuals, investigating brain connectivity features for implementation of BCI control holds special importance. Off-the-shelf BCI systems are based on fast, reproducible detection of mental activity and can be implemented in neurorobotic applications. Moreover, social Human-Robot Interaction (HRI) is increasingly important in rehabilitation robotics development. In this paper, we present our progress and goals towards developing off-the-shelf BCI-controlled anthropomorphic robotic arms for assistive technologies and rehabilitation applications. We account for robotics development, BCI implementation, and qualitative assessment of HRI characteristics of the system. Furthermore, we present two illustrative experimental applications of the BCI-controlled arms, a study of motor imagery modalities on healthy individuals' BCI performance, and a pilot investigation on spinal cord injured patients' BCI control and brain connectivity. We discuss strengths and limitations of our design and propose further steps on development and neurophysiological study, including implementation of connectivity features as BCI modality. PMID:28948168
A framework to support human factors of automation in railway intelligent infrastructure.
Dadashi, Nastaran; Wilson, John R; Golightly, David; Sharples, Sarah
2014-01-01
Technological and organisational advances have increased the potential for remote access and proactive monitoring of the infrastructure in various domains and sectors - water and sewage, oil and gas and transport. Intelligent Infrastructure (II) is an architecture that potentially enables the generation of timely and relevant information about the state of any type of infrastructure asset, providing a basis for reliable decision-making. This paper reports an exploratory study to understand the concepts and human factors associated with II in the railway, largely drawing from structured interviews with key industry decision-makers and attachment to pilot projects. Outputs from the study include a data-processing framework defining the key human factors at different levels of the data structure within a railway II system and a system-level representation. The framework and other study findings will form a basis for human factors contributions to systems design elements such as information interfaces and role specifications.
Deep Space Network information system architecture study
NASA Technical Reports Server (NTRS)
Beswick, C. A.; Markley, R. W. (Editor); Atkinson, D. J.; Cooper, L. P.; Tausworthe, R. C.; Masline, R. C.; Jenkins, J. S.; Crowe, R. A.; Thomas, J. L.; Stoloff, M. J.
1992-01-01
The purpose of this article is to describe an architecture for the DSN information system in the years 2000-2010 and to provide guidelines for its evolution during the 1990's. The study scope is defined to be from the front-end areas at the antennas to the end users (spacecraft teams, principal investigators, archival storage systems, and non-NASA partners). The architectural vision provides guidance for major DSN implementation efforts during the next decade. A strong motivation for the study is an expected dramatic improvement in information-systems technologies--i.e., computer processing, automation technology (including knowledge-based systems), networking and data transport, software and hardware engineering, and human-interface technology. The proposed Ground Information System has the following major features: unified architecture from the front-end area to the end user; open-systems standards to achieve interoperability; DSN production of level 0 data; delivery of level 0 data from the Deep Space Communications Complex, if desired; dedicated telemetry processors for each receiver; security against unauthorized access and errors; and highly automated monitor and control.
Save medical personnel's time by improved user interfaces.
Kindler, H
1997-01-01
Common objectives in the industrialized countries are the improvement of quality of care, clinical effectiveness, and cost control. Cost control, in particular, has been addressed through the introduction of case-mix systems for reimbursement by social-security institutions. More data are required to enable quality improvement and increases in clinical effectiveness, and for juridical reasons. At first glance, this documentation effort conflicts with cost reduction. However, integrated services for resource management based on better documentation should help to reduce costs. The clerical effort of documentation should be decreased by providing a co-operative working environment for healthcare professionals that applies sophisticated human-computer interface technology. Additional services, e.g., automatic report generation, increase the efficiency of healthcare personnel. Modelling the medical work flow is an essential prerequisite for integrated resource management services and for co-operative user interfaces. A user interface aware of the work flow provides intelligent assistance by offering the appropriate tools at the right moment. Nowadays there is a trend towards client/server systems with relational or object-oriented databases as the repository. The work flows used for controlling purposes and to steer the user interfaces must be represented in the repository.
Remote Collaboration on Task Scheduling for Humans at Mars
NASA Technical Reports Server (NTRS)
Jaap, John; Meyer, Patrick; Davis, Elizabeth; Richardson, Lea
2006-01-01
As humans venture farther from Earth for longer durations, it will become essential for those on the journey to have significant control over the scheduling of their own activities as well as the activities of their companion systems and robots. However, the crew will not do all the scheduling; timelines will be the result of collaboration with ground personnel. Emerging technologies such as in-space message buses, delay-tolerant networks, and in-space internet will be the carriers on which the collaboration rides. Advances in scheduling technology, in the areas of task modeling, scheduling engines, and user interfaces will allow the crew to become virtual scheduling experts. New concepts of operations for producing the timeline will allow the crew and the ground support to collaborate while providing safeguards to ensure that the mission will be effectively accomplished without endangering the systems or personnel.
The Evolvable Advanced Multi-Mission Operations System (AMMOS): Making Systems Interoperable
NASA Technical Reports Server (NTRS)
Ko, Adans Y.; Maldague, Pierre F.; Bui, Tung; Lam, Doris T.; McKinney, John C.
2010-01-01
The Advanced Multi-Mission Operations System (AMMOS) provides a common Mission Operation System (MOS) infrastructure to NASA deep space missions. The evolution of AMMOS has been driven by two factors: increasingly challenging requirements from space missions, and the emergence of new IT technology. The work described in this paper focuses on three key tasks related to IT technology requirements: first, to eliminate duplicate functionality; second, to promote the use of loosely coupled application programming interfaces, text based file interfaces, web-based frameworks and integrated Graphical User Interfaces (GUI) to connect users, data, and core functionality; and third, to build, develop, and deploy AMMOS services that are reusable, agile, adaptive to project MOS configurations, and responsive to industrially endorsed information technology standards.
Display integration for ground combat vehicles
NASA Astrophysics Data System (ADS)
Busse, David J.
1998-09-01
The United States Army's requirement to employ high-resolution target acquisition sensors and information warfare to increase its dominance over enemy forces has led to the need to integrate advanced display devices into ground combat vehicle crew stations. The Army's force structure requires the integration of advanced displays on both existing and emerging ground combat vehicle systems. The fielding of second-generation target acquisition sensors, color digital terrain maps, and high-volume digital command and control information networks on these platforms defines display performance requirements. The greatest challenge facing the system integrator is the development and integration of advanced displays that meet operational, vehicle, and human-computer interface performance requirements for the ground combat vehicle fleet. This paper addresses those challenges: operational and vehicle performance, non-soldier-centric crew station configurations, display performance limitations related to human-computer interfaces and vehicle physical environments, display technology limitations, and Department of Defense (DOD) acquisition reform initiatives. How the ground combat vehicle Program Manager and system integrator are addressing these challenges is discussed through the integration of displays on fielded, current, and future close combat vehicle applications.
Man-systems integration and the man-machine interface
NASA Technical Reports Server (NTRS)
Hale, Joseph P.
1990-01-01
Viewgraphs on man-systems integration and the man-machine interface are presented. Man-systems integration applies the systems' approach to the integration of the user and the machine to form an effective, symbiotic Man-Machine System (MMS). A MMS is a combination of one or more human beings and one or more physical components that are integrated through the common purpose of achieving some objective. The human operator interacts with the system through the Man-Machine Interface (MMI).
The experience of agency in human-computer interactions: a review
Limerick, Hannah; Coyle, David; Moore, James W.
2014-01-01
The sense of agency is the experience of controlling both one’s body and the external environment. Although the sense of agency has been studied extensively, there is a paucity of studies in applied “real-life” situations. One applied domain that seems highly relevant is human-computer-interaction (HCI), as an increasing number of our everyday agentive interactions involve technology. Indeed, HCI has long recognized the feeling of control as a key factor in how people experience interactions with technology. The aim of this review is to summarize and examine the possible links between sense of agency and understanding control in HCI. We explore the overlap between HCI and sense of agency for computer input modalities and system feedback, computer assistance, and joint actions between humans and computers. An overarching consideration is how agency research can inform HCI and vice versa. Finally, we discuss the potential ethical implications of personal responsibility in an ever-increasing society of technology users and intelligent machine interfaces. PMID:25191256
NASA Astrophysics Data System (ADS)
Simeral, J. D.; Kim, S.-P.; Black, M. J.; Donoghue, J. P.; Hochberg, L. R.
2011-04-01
The ongoing pilot clinical trial of the BrainGate neural interface system aims in part to assess the feasibility of using neural activity obtained from a small-scale, chronically implanted, intracortical microelectrode array to provide control signals for a neural prosthesis system. Critical questions include how long implanted microelectrodes will record useful neural signals, how reliably those signals can be acquired and decoded, and how effectively they can be used to control various assistive technologies such as computers and robotic assistive devices, or to enable functional electrical stimulation of paralyzed muscles. Here we examined these questions by assessing neural cursor control and BrainGate system characteristics on five consecutive days 1000 days after implant of a 4 × 4 mm array of 100 microelectrodes in the motor cortex of a human with longstanding tetraplegia subsequent to a brainstem stroke. On each of five prospectively-selected days we performed time-amplitude sorting of neuronal spiking activity, trained a population-based Kalman velocity decoding filter combined with a linear discriminant click state classifier, and then assessed closed-loop point-and-click cursor control. The participant performed both an eight-target center-out task and a random target Fitts metric task which was adapted from a human-computer interaction ISO standard used to quantify performance of computer input devices. The neural interface system was further characterized by daily measurement of electrode impedances, unit waveforms and local field potentials. Across the five days, spiking signals were obtained from 41 of 96 electrodes and were successfully decoded to provide neural cursor point-and-click control with a mean task performance of 91.3% ± 0.1% (mean ± s.d.) correct target acquisition. Results across five consecutive days demonstrate that a neural interface system based on an intracortical microelectrode array can provide repeatable, accurate point-and-click control of a computer interface to an individual with tetraplegia 1000 days after implantation of this sensor.
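A stripped-down sketch of the kind of Kalman velocity decoder described above is given below: the latent state is the 2D cursor velocity and the observations are binned firing rates. The model matrices are random placeholders rather than parameters fitted to BrainGate recordings, and the click-state classifier is omitted.

```python
import numpy as np

# Stripped-down sketch of a Kalman-filter velocity decoder: latent state x_t is
# the 2D cursor velocity, observation z_t is a vector of binned firing rates.
# All model matrices are random placeholders, not fitted BrainGate parameters.

rng = np.random.default_rng(2)
n_units, dim = 40, 2

A = 0.95 * np.eye(dim)               # velocity dynamics (smoothness prior)
W = 0.02 * np.eye(dim)               # process noise covariance
H = rng.normal(size=(n_units, dim))  # linear tuning of each unit to velocity
Q = np.eye(n_units)                  # observation noise covariance

def kalman_step(x, P, z):
    """One predict/update cycle; returns the new velocity estimate and covariance."""
    x_pred = A @ x
    P_pred = A @ P @ A.T + W
    S = H @ P_pred @ H.T + Q
    K = P_pred @ H.T @ np.linalg.inv(S)          # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(dim) - K @ H) @ P_pred
    return x_new, P_new

x, P = np.zeros(dim), np.eye(dim)
for _ in range(5):                               # five bins of simulated firing rates
    z = H @ np.array([0.1, -0.05]) + rng.normal(scale=0.5, size=n_units)
    x, P = kalman_step(x, P, z)
print("decoded velocity estimate:", np.round(x, 3))
```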
Simeral, J D; Kim, S-P; Black, M J; Donoghue, J P; Hochberg, L R
2013-01-01
The ongoing pilot clinical trial of the BrainGate neural interface system aims in part to assess the feasibility of using neural activity obtained from a small-scale, chronically implanted, intracortical microelectrode array to provide control signals for a neural prosthesis system. Critical questions include how long implanted microelectrodes will record useful neural signals, how reliably those signals can be acquired and decoded, and how effectively they can be used to control various assistive technologies such as computers and robotic assistive devices, or to enable functional electrical stimulation of paralyzed muscles. Here we examined these questions by assessing neural cursor control and BrainGate system characteristics on five consecutive days 1000 days after implant of a 4 × 4 mm array of 100 microelectrodes in the motor cortex of a human with longstanding tetraplegia subsequent to a brainstem stroke. On each of five prospectively-selected days we performed time-amplitude sorting of neuronal spiking activity, trained a population-based Kalman velocity decoding filter combined with a linear discriminant click state classifier, and then assessed closed-loop point-and-click cursor control. The participant performed both an eight-target center-out task and a random target Fitts metric task which was adapted from a human-computer interaction ISO standard used to quantify performance of computer input devices. The neural interface system was further characterized by daily measurement of electrode impedances, unit waveforms and local field potentials. Across the five days, spiking signals were obtained from 41 of 96 electrodes and were successfully decoded to provide neural cursor point-and-click control with a mean task performance of 91.3% ± 0.1% (mean ± s.d.) correct target acquisition. Results across five consecutive days demonstrate that a neural interface system based on an intracortical microelectrode array can provide repeatable, accurate point-and-click control of a computer interface to an individual with tetraplegia 1000 days after implantation of this sensor. PMID:21436513
ERIC Educational Resources Information Center
Aquino, Cesar A.
2014-01-01
This study represents research validating the efficacy of Davis' Technology Acceptance Model (TAM) by pairing it with the Organizational Change Readiness Theory (OCRT) to develop another extension to the TAM, using the medical Laboratory Information Systems (LIS)--Electronic Health Records (EHR) interface as the medium. The TAM posits that it is…
A brain-spine interface alleviating gait deficits after spinal cord injury in primates.
Capogrosso, Marco; Milekovic, Tomislav; Borton, David; Wagner, Fabien; Moraud, Eduardo Martin; Mignardot, Jean-Baptiste; Buse, Nicolas; Gandar, Jerome; Barraud, Quentin; Xing, David; Rey, Elodie; Duis, Simone; Jianzhong, Yang; Ko, Wai Kin D; Li, Qin; Detemple, Peter; Denison, Tim; Micera, Silvestro; Bezard, Erwan; Bloch, Jocelyne; Courtine, Grégoire
2016-11-10
Spinal cord injury disrupts the communication between the brain and the spinal circuits that orchestrate movement. To bypass the lesion, brain-computer interfaces have directly linked cortical activity to electrical stimulation of muscles, and have thus restored grasping abilities after hand paralysis. Theoretically, this strategy could also restore control over leg muscle activity for walking. However, replicating the complex sequence of individual muscle activation patterns underlying natural and adaptive locomotor movements poses formidable conceptual and technological challenges. Recently, it was shown in rats that epidural electrical stimulation of the lumbar spinal cord can reproduce the natural activation of synergistic muscle groups producing locomotion. Here we interface leg motor cortex activity with epidural electrical stimulation protocols to establish a brain-spine interface that alleviated gait deficits after a spinal cord injury in non-human primates. Rhesus monkeys (Macaca mulatta) were implanted with an intracortical microelectrode array in the leg area of the motor cortex and with a spinal cord stimulation system composed of a spatially selective epidural implant and a pulse generator with real-time triggering capabilities. We designed and implemented wireless control systems that linked online neural decoding of extension and flexion motor states with stimulation protocols promoting these movements. These systems allowed the monkeys to behave freely without any restrictions or constraining tethered electronics. After validation of the brain-spine interface in intact (uninjured) monkeys, we performed a unilateral corticospinal tract lesion at the thoracic level. As early as six days post-injury and without prior training of the monkeys, the brain-spine interface restored weight-bearing locomotion of the paralysed leg on a treadmill and overground. The implantable components integrated in the brain-spine interface have all been approved for investigational applications in similar human research, suggesting a practical translational pathway for proof-of-concept studies in people with spinal cord injury.
A multimodal interface for real-time soldier-robot teaming
NASA Astrophysics Data System (ADS)
Barber, Daniel J.; Howard, Thomas M.; Walter, Matthew R.
2016-05-01
Recent research and advances in robotics have led to the development of novel platforms leveraging new sensing capabilities for semantic navigation. As these systems become increasingly robust, they support highly complex commands beyond direct teleoperation and waypoint finding, facilitating a transition from robots as tools to robots as teammates. Supporting future Soldier-Robot teaming requires communication capabilities on par with human-human teams for successful integration of robots. Therefore, as robots increase in functionality, it is equally important that the interface between the Soldier and the robot advances as well. Multimodal communication (MMC) enables human-robot teaming through redundancy and levels of communication more robust than single-mode interaction. Commercial-off-the-shelf (COTS) technologies released in recent years for smartphones and gaming provide tools for the creation of portable interfaces incorporating MMC through the use of speech, gestures, and visual displays. However, for multimodal interfaces to be successfully used in the military domain, they must be able to classify speech and gestures and process natural language in real time with high accuracy. For the present study, a prototype multimodal interface supporting real-time interactions with an autonomous robot was developed. This device integrated COTS Automated Speech Recognition (ASR), a custom gesture recognition glove, and natural language understanding on a tablet. This paper presents performance results (e.g. response times, accuracy) of the integrated device when commanding an autonomous robot to perform reconnaissance and surveillance activities in an unknown outdoor environment.
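One common way to exploit multimodal redundancy is to fuse time-stamped speech and gesture hypotheses and accept a command only when the modalities agree within a short window. The sketch below illustrates that idea with hypothetical labels, confidences, and window length; it is not the prototype's actual fusion logic.

```python
# Minimal sketch of multimodal command fusion: pair time-stamped speech and
# gesture hypotheses that occur close together in time. Labels, confidences,
# and the fusion window are illustrative placeholders.

FUSION_WINDOW_S = 1.5

speech_events = [(10.2, "move to the building", 0.81)]        # (time, text, confidence)
gesture_events = [(10.9, "point_direction_north", 0.92)]      # (time, gesture, confidence)

def fuse(speech, gestures, window=FUSION_WINDOW_S):
    """Pair each speech hypothesis with a gesture close in time; redundancy
    across modalities raises confidence in the fused command."""
    commands = []
    for ts, text, conf_s in speech:
        for tg, gesture, conf_g in gestures:
            if abs(ts - tg) <= window:
                commands.append({
                    "command": f"{text} ({gesture})",
                    "confidence": 1 - (1 - conf_s) * (1 - conf_g),
                })
    return commands

print(fuse(speech_events, gesture_events))
```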
Natural User Interface Sensors for Human Body Measurement
NASA Astrophysics Data System (ADS)
Boehm, J.
2012-08-01
The recent push for natural user interfaces (NUI) in the entertainment and gaming industry has ushered in a new era of low-cost three-dimensional sensors. While the basic idea of using a three-dimensional sensor for human gesture recognition dates back some years, it is only recently that such sensors have become available on the mass market. The current market leader is PrimeSense, who provide their technology for the Microsoft Xbox Kinect. Since these sensors are developed to detect and observe human users, they should be ideally suited to measuring the human body. We describe the technology of a line of NUI sensors and assess their performance in terms of repeatability and accuracy. We demonstrate the implementation of a prototype scanner integrating several NUI sensors to achieve full body coverage. We present the results of the obtained surface model of a human body.
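Measuring a body with an NUI depth sensor starts by back-projecting depth pixels into 3D points using the pinhole camera model. The sketch below shows that step with illustrative (uncalibrated) intrinsics and a synthetic depth image.

```python
import numpy as np

# Minimal sketch of the first step in body measurement with an NUI depth sensor:
# back-project a depth image into a 3D point cloud with the pinhole model.
# The intrinsics below are typical illustrative values, not calibration data.

fx, fy = 570.0, 570.0          # focal lengths in pixels
cx, cy = 320.0, 240.0          # principal point in pixels

def depth_to_points(depth_m):
    """depth_m: (H, W) array of depths in metres; returns an (N, 3) array of XYZ points."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]          # drop invalid (zero-depth) pixels

depth = np.full((480, 640), 2.0)       # synthetic flat surface 2 m from the sensor
cloud = depth_to_points(depth)
print(cloud.shape, cloud[0])
```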
Chemical Transformation System: Cloud Based ...
Integrated Environmental Modeling (IEM) systems that account for the fate/transport of organics frequently require physicochemical properties as well as transformation products. A myriad of chemical property databases exist, but these can be difficult to access and often do not contain the proprietary chemicals that environmental regulators must consider. We are building the Chemical Transformation System (CTS) to facilitate model parameterization and analysis. CTS integrates a number of physicochemical property calculators into the system, including EPI Suite, SPARC, TEST and ChemAxon. The calculators are heterogeneous in their scientific methodologies, technology implementations and deployment stacks. CTS also includes a chemical transformation processing engine that has been loaded with reaction libraries for human biotransformation, abiotic reduction and abiotic hydrolysis. CTS implements a common interface for the disparate calculators, accepting molecular identifiers (SMILES, IUPAC, CAS#, user-drawn molecule) before submission for processing. To make the system as accessible as possible and provide a consistent programmatic interface, we wrapped the calculators in a standardized RESTful Application Programming Interface (API), which makes the system capable of servicing a much broader spectrum of clients without interoperability constraints such as operating system or programming language. CTS is hosted in a shared cloud environment, the Quantitative Environmental
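A client for such a REST interface might look like the hedged sketch below; the base URL, route name, and JSON fields are purely illustrative assumptions and do not reflect the deployed CTS API.

```python
import requests

# Hypothetical client for a CTS-style REST API; endpoint and payload fields
# are illustrative assumptions, not the deployed interface.
BASE_URL = "https://example.org/cts/rest"     # placeholder host

def get_properties(smiles, calculators=("epi", "sparc", "test", "chemaxon")):
    """Request physicochemical properties for a molecule given as SMILES."""
    payload = {"chemical": smiles, "calc": list(calculators)}
    resp = requests.post(f"{BASE_URL}/pchemprop", json=payload, timeout=30)
    resp.raise_for_status()
    return resp.json()

# Example: query properties for benzene
# print(get_properties("c1ccccc1"))
```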
Personal mobility and manipulation using robotics, artificial intelligence and advanced control.
Cooper, Rory A; Ding, Dan; Grindle, Garrett G; Wang, Hongwu
2007-01-01
Recent advancements in technologies, including computation, robotics, machine learning, communication, and miniaturization, bring us closer to futuristic visions of compassionate intelligent devices. The missing element is a basic understanding of how to relate human functions (physiological, physical, and cognitive) to the design of intelligent devices and systems that aid and interact with people. Our stakeholder and clinician consultants identified a number of mobility barriers that have been resistant to traditional approaches. The most important physical obstacles are stairs, steps, curbs, doorways (doors), rough/uneven surfaces, weather hazards (snow, ice), crowded/cluttered spaces, and confined spaces. Focus group participants suggested a number of ways to make interaction simpler, including natural language interfaces such as the ability to say "I want a drink", a library of high-level commands (open a door, park the wheelchair, ...), and a touchscreen interface with images so the user could point and use other gestures.
Step 1: C3 Flight Demo Data Analysis Plan
NASA Technical Reports Server (NTRS)
2005-01-01
The Data Analysis Plan (DAP) describes the data analysis that the C3 Work Package (WP) will perform in support of the Access 5 Step 1 C3 flight demonstration objectives as well as the processes that will be used by the Flight IPT to gather and distribute the data collected to satisfy those objectives. In addition to C3 requirements, this document will encompass some Human Systems Interface (HSI) requirements in performing the C3 flight demonstrations. The C3 DAP will be used as the primary interface requirements document between the C3 Work Package and Flight Test organizations (Flight IPT and Non-Access 5 Flight Programs). In addition to providing data requirements for Access 5 flight test (piggyback technology demonstration flights, dedicated C3 technology demonstration flights, and Airspace Operations Demonstration flights), the C3 DAP will be used to request flight data from Non-Access 5 flight programs for C3-related data products.
Passive BCI in Operational Environments: Insights, Recent Advances, and Future Trends.
Arico, Pietro; Borghini, Gianluca; Di Flumeri, Gianluca; Sciaraffa, Nicolina; Colosimo, Alfredo; Babiloni, Fabio
2017-07-01
This minireview aims to highlight recent important aspects to consider and evaluate when passive brain-computer interface (pBCI) systems are developed and used in operational environments, and it outlines future directions for their applications. Electroencephalography (EEG) based pBCI has become an important tool for real-time analysis of brain activity since it can provide information about the operator's cognitive state covertly (without distracting the user from the main task) and objectively (unaffected by the subjective judgment of an observer or of the user). Different examples of pBCI applications in operational environments and new adaptive interface solutions are presented and described. In addition, a general overview regarding the correct use of machine learning techniques (e.g., which algorithm to use, common pitfalls to avoid, etc.) in the pBCI field is provided. Despite recent innovations in algorithms and neurotechnology, pBCI systems are not completely ready to enter the market yet, mainly due to limitations of EEG electrode technology and of algorithm reliability and capability in real settings. High-complexity, safety-critical systems (e.g., airplanes, ATM interfaces) should adapt their behavior and functionality according to the user's actual mental state. Thus, technologies (i.e., pBCIs) able to measure the user's mental state in real time would be very useful in such "high risk" environments to enhance human-machine interaction and so increase overall safety.
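As a rough illustration of the kind of machine-learning workflow discussed (band-power features with a cross-validated linear classifier, evaluated so that training data do not leak into the test folds), the sketch below uses synthetic EEG; channel counts, frequency bands, and the classifier choice are assumptions.

```python
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Illustrative pBCI-style mental-state classifier: band-power features plus
# shrinkage LDA, scored with cross-validation to avoid inflated accuracy.
def bandpower_features(epochs, fs=128):
    """epochs: (n_epochs, n_channels, n_samples) EEG; returns theta/alpha power."""
    freqs, psd = welch(epochs, fs=fs, axis=-1)
    theta = psd[..., (freqs >= 4) & (freqs < 8)].mean(axis=-1)
    alpha = psd[..., (freqs >= 8) & (freqs < 13)].mean(axis=-1)
    return np.concatenate([theta, alpha], axis=1)

rng = np.random.default_rng(0)
epochs = rng.standard_normal((100, 8, 256))   # 100 epochs, 8 channels, 2 s at 128 Hz
labels = rng.integers(0, 2, size=100)         # e.g. low vs high workload
X = bandpower_features(epochs)
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
print("cross-validated accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```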
NASA Technical Reports Server (NTRS)
Pitman, C. L.; Erb, D. M.; Izygon, M. E.; Fridge, E. M., III; Roush, G. B.; Braley, D. M.; Savely, R. T.
1992-01-01
The United States' major space projects of the next decades, such as Space Station and the Human Exploration Initiative, will require the development of many millions of lines of mission-critical software. NASA-Johnson (JSC) is identifying and developing some of the Computer Aided Software Engineering (CASE) technology that NASA will need to build these future software systems. The goal is to improve the quality and the productivity of large software development projects. New trends in CASE technology are outlined, and how the Software Technology Branch (STB) at JSC is endeavoring to provide some of these CASE solutions for NASA is described. Key software technology components include knowledge-based systems, software reusability, user interface technology, reengineering environments, management systems for the software development process, software cost models, repository technology, and open, integrated CASE environment frameworks. The paper presents the status and long-term expectations for CASE products. The STB's Reengineering Application Project (REAP), Advanced Software Development Workstation (ASDW) project, and software development cost model (COSTMODL) project are then discussed. Some of the general difficulties of technology transfer are introduced, and a process developed by STB for CASE technology insertion is described.
Proceeding of human exoskeleton technology and discussions on future research
NASA Astrophysics Data System (ADS)
Li, Zhiqiang; Xie, Hanxing; Li, Weilin; Yao, Zheng
2014-05-01
After more than half a century of intense effort, the development of exoskeletons has seen major advances, and several remarkable achievements have been made. Reviews of the development history of exoskeletons are presented, in both the active and passive categories. Major models are introduced, and typical technologies are commented on. Difficulties in control algorithms, drive systems, power sources, and man-machine interfaces are discussed. Current research routes and major development methods are mapped and critically analyzed, and in the process some key problems are revealed. First, the exoskeleton is totally different from a biped robot, and related studies based on robot technologies are considerably flawed. Second, biomechanical studies are used only to track the motion of the human body; the interaction between human and machine is seldom studied. Third, traditional development approaches focused on servo control have inherent deficiencies when it comes to making portable systems. Research attention should be shifted to the human side of the coupled system, and the human ability to learn and adapt should play a more significant role in the control algorithms. Having summarized the major difficulties, the paper discusses possible future work. It is argued that, since a distinct boundary cannot be drawn in such a strongly coupled human-exoskeleton system, the more complex the control system gets, the more difficult it is for the user to learn to use it. It is suggested that the exoskeleton should be treated as a simple wearable tool and that downgrading its level of automation may be a change toward a brighter research outlook. This effort at simplification is definitely not easy, as it necessitates theoretical support from fields such as biomechanics, ergonomics, and bionics.
ERIC Educational Resources Information Center
Johnson, Phyllis
2002-01-01
Explores the interface of technology and education for human development in southern Africa. Uses the case of Mozambique to describe the challenges presented by the global marketplace and local policy. Outlines the vision of the New Partnership for Africa's Development Centre (SARDC) to reduce the digital divide for Africa. (CAJ)
Madore, Amy; Rosenberg, Julie; Muyindike, Winnie R; Bangsberg, David R; Bwana, Mwebesa B; Martin, Jeffrey N; Kanyesigye, Michael; Weintraub, Rebecca
2015-12-01
Implementation lessons: • Technology alone does not necessarily lead to improvement in health service delivery, in contrast to the common assumption that advanced technology goes hand in hand with progress. • Implementation of electronic medical record (EMR) systems is a complex, resource-intensive process that, in addition to software, hardware, and human resource investments, requires careful planning, change management skills, adaptability, and continuous engagement of stakeholders. • Research requirements and goals must be balanced with service delivery needs when determining how much information is essential to collect and who should be interfacing with the EMR system. • EMR systems require ongoing monitoring and regular updates to ensure they are responsive to evolving clinical use cases and research questions. • High-quality data and analyses are essential for EMRs to deliver value to providers, researchers, and patients. Copyright © 2015 Elsevier Inc. All rights reserved.
2009-05-30
interface approaches based on behavior and on psychology, as well as methods for the design and implementation of multi-agent interfaces. We have ... network-centric. These technologies include interface design approaches based on behavior and on psychology, as well as
PC/AT-based architecture for shared telerobotic control
NASA Astrophysics Data System (ADS)
Schinstock, Dale E.; Faddis, Terry N.; Barr, Bill G.
1993-03-01
A telerobotic control system must include teleoperational, shared, and autonomous modes of control in order to provide a robot platform for incorporating the rapid advances that are occurring in telerobotics and associated technologies. These modes, along with the ability to modify the control algorithms, are especially beneficial for telerobotic control systems used for research purposes. The paper describes an application of the PC/AT platform to the control system of a telerobotic test cell and discusses the suitability of the PC/AT as a platform for a telerobotic control system. The discussion is based on the many factors affecting the choice of a computer platform for a real-time control system, including I/O capabilities, simplicity, popularity, computational performance, and communication with external systems. The paper also includes a description of the actuation, measurement, and sensor hardware of both the master manipulator and the slave robot, as well as a description of the PC-Bus interface cards, which were developed by the researchers in the KAT Laboratory specifically for interfacing to the master manipulator and slave robot. Finally, a few different versions of the low-level telerobotic control software are presented. This software incorporates shared control by supervisory systems and the human operator, and traded control between supervisory systems and the human operator.
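One common way to realize the shared mode mentioned above is to blend operator and supervisory commands with an adjustable weight; the sketch below is a generic illustration of that idea, not the KAT Laboratory implementation.

```python
import numpy as np

# Minimal sketch of shared control: operator and supervisory commands are
# blended by a weight shifted between teleoperation (w = 1), shared control
# (0 < w < 1), and autonomy (w = 0). Values are illustrative.
def blend_commands(operator_cmd, autonomous_cmd, w_operator):
    """Commands are joint-velocity vectors; w_operator in [0, 1]."""
    w = np.clip(w_operator, 0.0, 1.0)
    return w * np.asarray(operator_cmd) + (1.0 - w) * np.asarray(autonomous_cmd)

print(blend_commands([0.2, -0.1, 0.0], [0.0, 0.05, 0.1], 0.6))
```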
Visual and tactile interfaces for bi-directional human robot communication
NASA Astrophysics Data System (ADS)
Barber, Daniel; Lackey, Stephanie; Reinerman-Jones, Lauren; Hudson, Irwin
2013-05-01
Seamless integration of unmanned systems and Soldiers in the operational environment requires robust communication capabilities. Multi-Modal Communication (MMC) facilitates achieving this goal due to redundancy and levels of communication superior to single-mode interaction, using auditory, visual, and tactile modalities. Visual signaling using arm and hand gestures is a natural method of communication between people. Visual signals standardized within the U.S. Army Field Manual and in use by Soldiers provide a foundation for developing gestures for human-to-robot communication. Emerging technologies using Inertial Measurement Units (IMU) enable classification of arm and hand gestures for communication with a robot without the line-of-sight requirement of computer vision techniques. These devices improve the robustness of interpreting gestures in noisy environments and are capable of classifying signals relevant to operational tasks. Closing the communication loop between Soldiers and robots requires that robots be able to return equivalent messages. Existing visual signals from robots to humans typically require highly anthropomorphic features not present on military vehicles. Tactile displays tap into an unused modality for robot-to-human communication. Typically used for hands-free navigation and cueing, existing tactile display technologies are used here to deliver equivalents of the visual signals from the U.S. Army Field Manual. This paper describes ongoing research to collaboratively develop tactile communication methods with Soldiers and to measure classification accuracy of visual signal interfaces, and it provides an integration example including two robotic platforms.
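A minimal sketch of IMU-based gesture classification in this spirit is given below; the summary features, gesture names, and rejection threshold are assumptions rather than the glove's actual algorithm.

```python
import numpy as np

# Sketch of IMU gesture classification: a short window of accelerometer samples
# is reduced to a feature vector and matched against per-gesture templates.
def window_features(accel):
    """accel: (n_samples, 3) accelerometer window -> simple summary features."""
    return np.concatenate([accel.mean(axis=0), accel.std(axis=0),
                           [np.linalg.norm(accel, axis=1).max()]])

def classify(accel, templates, reject_dist=2.0):
    feats = window_features(accel)
    name, dist = min(((g, np.linalg.norm(feats - t)) for g, t in templates.items()),
                     key=lambda x: x[1])
    return name if dist < reject_dist else None   # reject ambiguous windows

# Templates would normally come from labelled training windows per Soldier signal.
templates = {"halt": np.array([0, 0, 9.8, 0.1, 0.1, 0.1, 10.0]),
             "rally": np.array([0, 2.0, 9.0, 1.0, 1.5, 0.8, 14.0])}
window = np.random.normal([0, 0, 9.8], 0.1, size=(50, 3))
print(classify(window, templates))
```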
Study About Ceiling Design for Main Control Room of NPP with HFE
NASA Astrophysics Data System (ADS)
Gu, Pengfei; Ni, Ying; Chen, Weihua; Chen, Bo; Zhang, Jianbo; Liang, Huihui
Since human factors engineering (HFE) began to be used in the control room design of nuclear power plants (NPPs), the human-machine interface (HMI) has gradually developed more harmoniously, especially with the use of digital technology. Compared with the analog technology used for the HMI in the past, human-machine interaction has been greatly enhanced. HFE and NPP main control room (MCR) design engineering form a multidisciplinary combination, mainly involving electrical and instrumentation control, reactor, machinery, systems engineering, and management disciplines. However, the MCR does not only provide the HMI of the equipment; more importantly, it provides a work environment for the operator, including elements such as the ceiling. The MCR ceiling design, which influences staff performance, should therefore be considered in terms of HFE as well as environmental and aesthetic factors, especially through the introduction of professional design experience and evaluation methods. Based on implementation experience from the Ling Ao Phase II and Hong Yanhe projects, this study analyzes lighting effects, space partitioning, and visual load for the MCR ceiling of an NPP. In combination with the requirements of applicable standards, the advantages and disadvantages of MCR ceiling designs are discussed, and a ceiling design solution is also discussed that considers requirements for light weight, noise reduction, fire prevention, and moisture protection.
1993-03-25
application of Object-Oriented Programming (OOP) and Human-Computer Interface (HCI) design principles. Knowledge gained from each topic is applied to the design of a form-based interface for database data
Physiologically relevant organs on chips.
Yum, Kyungsuk; Hong, Soon Gweon; Healy, Kevin E; Lee, Luke P
2014-01-01
Recent advances in integrating microengineering and tissue engineering have generated promising microengineered physiological models for experimental medicine and pharmaceutical research. Here we review the recent development of microengineered physiological systems, also known as "organs-on-chips", that reconstitute the physiologically critical features of specific human tissues and organs and their interactions. This technology uses microengineering approaches to construct organ-specific microenvironments, reconstituting tissue structures, tissue-tissue interactions and interfaces, and dynamic mechanical and biochemical stimuli found in specific organs, to direct cells to assemble into functional tissues. We first discuss microengineering approaches to reproduce the key elements of physiologically important, dynamic mechanical microenvironments, biochemical microenvironments, and microarchitectures of specific tissues and organs in microfluidic cell culture systems. This is followed by examples of microengineered individual organ models that incorporate the key elements of physiological microenvironments into single microfluidic cell culture systems to reproduce organ-level functions. Finally, microengineered multiple-organ systems that simulate multiple organ interactions to better represent human physiology, including human responses to drugs, are covered in this review. This emerging organs-on-chips technology has the potential to become an alternative to 2D and 3D cell culture and animal models for experimental medicine, human disease modeling, drug development, and toxicology. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. Fink, D. Hill, J. O'Hara
2004-11-30
Nuclear plant operators face a significant challenge designing and modifying control rooms. This report provides guidance on planning, designing, implementing and operating modernized control rooms and digital human-system interfaces.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Terezakis, Stephanie A., E-mail: stereza1@jhmi.edu; Harris, Kendra M.; Ford, Eric
Purpose: Systems to ensure patient safety are of critical importance. The electronic incident reporting systems (IRS) of 2 large academic radiation oncology departments were evaluated for events that may be suitable for submission to a national reporting system (NRS). Methods and Materials: All events recorded in the combined IRS were evaluated from 2007 through 2010. Incidents were graded for potential severity using the validated French Nuclear Safety Authority (ASN) 5-point scale. These incidents were categorized into 7 groups: (1) human error, (2) software error, (3) hardware error, (4) error in communication between 2 humans, (5) error at the human-software interface, (6) error at the software-hardware interface, and (7) error at the human-hardware interface. Results: Between the 2 systems, 4407 incidents were reported. Of these events, 1507 (34%) were considered to have the potential for clinical consequences. Of these 1507 events, 149 (10%) were rated as having a potential severity of ≥2. Of these 149 events, the committee determined that 79 (53%) would be submittable to a NRS, of which the majority were related to human error or to the human-software interface. Conclusions: A significant number of incidents were identified in this analysis. The majority of events in this study were related to human error and to the human-software interface, further supporting the need for a NRS to facilitate field-wide learning and system improvement.
NASA Technical Reports Server (NTRS)
Orr, Joel N.
1995-01-01
This reflection on the human-computer interface and its requirements as virtual technology advances proposes a new term: 'Pezonomics'. The term replaces ergonomics ('the law of work') with a definition pointing to 'the law of play.' The necessity of this term, the author reasons, comes from the need to 'capture the essence of play and calibrate our computer systems to its cadences.' Pezonomics will ensure that artificial environments, in particular virtual reality, are user friendly.
NASA Astrophysics Data System (ADS)
The present conference discusses topics in multiwavelength network technology and its applications, advanced digital radio systems in their propagation environment, mobile radio communications, switching programmability, advancements in computer communications, integrated-network management and security, HDTV and image processing in communications, basic exchange communications radio, advancements in digital switching, intelligent network evolution, speech coding for telecommunications, and multiple access communications. Also discussed are network designs for quality assurance, recent progress in coherent optical systems, digital radio applications, advanced communications technologies for mobile users, communication software for switching systems, AI and expert systems in network management, intelligent multiplexing nodes, video and image coding, network protocols and performance, system methods in quality and reliability, the design and simulation of lightwave systems, local radio networks, mobile satellite communications systems, fiber network restoration, packet video networks, human interfaces for future networks, and lightwave networking.
EMG and EPP-integrated human-machine interface between the paralyzed and rehabilitation exoskeleton.
Yin, Yue H; Fan, Yuan J; Xu, Li D
2012-07-01
Although the lower extremity exoskeleton shows great promise for rehabilitation of the lower limb, it has not yet been widely applied to the clinical rehabilitation of paralyzed patients. This is partly caused by insufficient information interaction between the paralyzed user and existing exoskeletons, which cannot meet the requirements of harmonious control. In this research, a bidirectional human-machine interface including a neurofuzzy controller and an extended physiological proprioception (EPP) feedback system is developed by imitating the biological closed-loop control system of the human body. The neurofuzzy controller is built to decode human motion in advance by fusing the fuzzy electromyographic signals reflecting human motion intention with the precise proprioception providing joint angular feedback information. It transmits control information from the human to the exoskeleton, while the EPP feedback system based on haptic stimuli transmits motion information of the exoskeleton back to the human. Joint angle and torque information are transmitted to the human body in the form of air pressure. The real-time bidirectional human-machine interface can help a patient with lower limb paralysis to control the exoskeleton with his/her healthy side and simultaneously perceive motion on the paralyzed side by EPP. The interface rebuilds a closed-loop motion control system for paralyzed patients and realizes harmonious control of the human-machine system.
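To make the bidirectional idea concrete, the toy sketch below maps an EMG activation level and a joint angle to a motion command through two fuzzy rules and encodes joint state as a feedback pressure; the membership breakpoints and pressure mapping are invented for illustration, not the paper's controller.

```python
# Toy sketch of the bidirectional interface: EMG from the healthy side plus
# joint angle produce a flexion command, while joint state is returned to the
# user as an EPP-style pressure cue. All constants are illustrative.
def fuzzy_intent(emg_level, knee_angle_deg):
    """emg_level in [0, 1]; returns a flexion command in [0, 1]."""
    low = max(0.0, 1 - emg_level / 0.5)                # membership: low activation
    high = min(1.0, emg_level / 0.5)                   # membership: high activation
    extended = max(0.0, 1 - knee_angle_deg / 60.0)     # near-straight leg
    strong = min(high, extended)                       # rule: high EMG & extended -> flex
    hold = low                                         # rule: low EMG -> hold
    return strong / (strong + hold + 1e-9)             # defuzzified command

def epp_pressure_kpa(knee_angle_deg, torque_nm):
    """Encode joint state as cuff pressure fed back to the user's healthy side."""
    return 5.0 + 0.2 * knee_angle_deg + 0.5 * abs(torque_nm)

print(fuzzy_intent(emg_level=0.8, knee_angle_deg=10),
      epp_pressure_kpa(knee_angle_deg=10, torque_nm=12))
```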
Humans, Intelligent Technology, and Their Interface: A Study of Brown’s Point
2017-12-01
known about the role of drivers. When combining humans and intelligent technology (machines), such as self-driving vehicles, how people think about...disrupt the entire transportation industry and potentially change how society moves people and goods. The findings of the investigation are likely...The power of suggestion is very important to understand and consider when framing and bringing meaning to new technology, which points to looking at
Guidance for human interface with artificial intelligence systems
NASA Technical Reports Server (NTRS)
Potter, Scott S.; Woods, David D.
1991-01-01
The beginning of a research effort to collect and integrate existing research findings about how to combine computer power and people is discussed, including problems and pitfalls as well as desirable features. The goal of the research is to develop guidance for the design of human interfaces with intelligent systems. Fault management tasks in NASA domains are the focus of the investigation. Research is being conducted to support the development of guidance for designers that will enable them to take human interface considerations into account during the creation of intelligent systems.
Matching brain-machine interface performance to space applications.
Citi, Luca; Tonet, Oliver; Marinelli, Martina
2009-01-01
A brain-machine interface (BMI) is a particular class of human-machine interface (HMI). BMIs have so far been studied mostly as a communication means for people who have little or no voluntary control of muscle activity. For able-bodied users, such as astronauts, a BMI would only be practical if conceived as an augmenting interface. A method is presented for identifying effective combinations of HMIs and applications of robotics and automation in space. Latency and throughput are selected as performance measures for a hybrid bionic system (HBS), that is, the combination of a user, a device, and an HMI. We classify and briefly describe HMIs and space applications and then compare the performance of classes of interfaces with the requirements of classes of applications, both in terms of latency and throughput. Regions of overlap correspond to effective combinations. Devices requiring simpler control, such as a rover, a robotic camera, or environmental controls, are suitable for control by means of BMI technology. Free flyers and other devices with six degrees of freedom can be controlled, but only at low interactivity levels. More demanding applications require conventional interfaces, although they could be controlled by BMIs once the same levels of performance as currently recorded in animal experiments are attained. Robotic arms and manipulators could be the next frontier for noninvasive BMIs. Integrating smart controllers in HBSs could improve interactivity and boost the use of BMI technology in space applications.
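The matching method can be pictured as checking whether an interface's latency/throughput envelope covers an application's requirements; the sketch below uses placeholder numbers, not the figures reported in the paper.

```python
# Toy version of the matching idea: an interface class is a candidate for an
# application class when its latency/throughput envelope overlaps the
# application's requirements. All numbers are placeholders.
interfaces = {   # (max latency s, achievable throughput bits/s)
    "noninvasive BMI": (1.0, 5),
    "joystick":        (0.1, 100),
}
applications = { # (required latency s, required throughput bits/s)
    "rover driving":      (2.0, 3),
    "robotic arm teleop": (0.2, 50),
}

def suitable(interface, application):
    lat_i, thr_i = interfaces[interface]
    lat_req, thr_req = applications[application]
    return lat_i <= lat_req and thr_i >= thr_req

for app in applications:
    print(app, "->", [i for i in interfaces if suitable(i, app)])
```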
Eye Tracking Based Control System for Natural Human-Computer Interaction.
Zhang, Xuebai; Liu, Xiaolong; Yuan, Shyan-Ming; Lin, Shu-Fan
2017-01-01
Eye movement can be regarded as a pivotal real-time input medium for human-computer communication, which is especially important for people with physical disabilities. In order to improve the reliability, mobility, and usability of eye tracking techniques in user-computer dialogue, a novel eye control system integrating both mouse and keyboard functions is proposed in this paper. The proposed system focuses on providing a simple and convenient interactive mode using only the user's eyes. The usage flow of the proposed system is designed to follow natural human habits. Additionally, a magnifier module is proposed to allow accurate operation. In the experiment, two interactive tasks of different difficulty (searching for an article and browsing a multimedia web page) were performed to compare the proposed eye control tool with an existing system. Technology Acceptance Model (TAM) measures are used to evaluate the perceived effectiveness of our system. It is demonstrated that the proposed system is very effective with regard to usability and interface design.
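A dwell-based selection loop with a magnifier fallback, loosely in the spirit of the system above, might look like the sketch below; the dwell time, fixation radius, and magnification threshold are assumed values, not the system's parameters.

```python
import math

# Toy dwell-time selection: a fixation held within a small radius for long
# enough triggers a click; small targets are magnified first. Values assumed.
DWELL_S = 0.8          # dwell needed to "click"
FIXATION_RADIUS = 30   # pixels
MAGNIFY_BELOW = 40     # magnify targets smaller than this many pixels

def dwell_select(gaze_samples, target_size_px):
    """gaze_samples: list of (t, x, y). Returns 'magnify', 'click', or None."""
    t0, x0, y0 = gaze_samples[0]
    for t, x, y in gaze_samples[1:]:
        if math.hypot(x - x0, y - y0) > FIXATION_RADIUS:
            return None                       # fixation broken
        if t - t0 >= DWELL_S:
            return "magnify" if target_size_px < MAGNIFY_BELOW else "click"
    return None

samples = [(0.02 * i, 400 + i % 3, 300) for i in range(60)]  # steady 1.2 s fixation
print(dwell_select(samples, target_size_px=24))
```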
The ecobiopolitics of space biomedicine.
Olson, Valerie A
2010-04-01
Using data from an ethnographic study of American astronautics, I argue that, in an inversion of the usual clinical model, astronaut medical subjecthood is fundamentally environmental rather than biological. In extreme environments like outer space, the concept of environment cannot be bracketed out from life processes; as a result, investments of power and knowledge shift from life itself to the sites of interface among living things, technologies, and environments. To illustrate what this means on the ground, I describe space biomedicine as a form of environmental medicine that seeks to optimize and manage technically enabled human ecologies where life and environment are dually problematized. I provide two examples of what I term its ecobiopolitical strategies: creating a new "space normal" physiological category and situating humans as at-risk elements within integrated biological/technological/environmental systems.
Acceptance Testing of the Vapor Phase Catalytic Ammonia Removal Engineering Development Unit
NASA Technical Reports Server (NTRS)
Flynn, Michael; Fisher, John; Kliss, Mark; Tleimat, Maher; Quinn, Gregory; Fort, James; Nalette, Tim; Baker, Gale
2005-01-01
This paper describes the results of acceptance testing of the Vapor Phase Catalytic Ammonia Removal (VPCAR) technology. The VPCAR technology is currently being developed by NASA as a Mars transit vehicle water recycling system. NASA has recently completed a grant to develop a next-generation VPCAR system. This grant was peer reviewed and funded through the Advanced Life Support (ALS) National Research Announcement (NRA). The grant funded a contract with Water Reuse Technology Inc. to construct an engineering development unit. This contract concluded with the shipment of the final deliverable to NASA on 8/31/03. The objective of the acceptance testing was to characterize the performance of this new system. This paper presents the results of mass, power, and volume measurements for the delivered system. In addition, product water purity analyses for a Mars transit mission and a planetary base wastewater ersatz are provided. Acoustic noise levels, interface specifications and system reliability results are also discussed. An assessment of the readiness of the technology for human testing and recommendations for future improvements are provided.
Overview of Human-Centric Space Situational Awareness Science and Technology
2012-09-01
AGI), the developers of Satellite Tool Kit (STK), has provided demonstrations of innovative SSA visualization concepts that take advantage of the...needs inherent with SSA. RH has conducted CTAs and developed work-centered human-computer interfaces, visualizations, and collaboration technologies...all end users. RH's Battlespace Visualization Branch researches methods to exploit the visual channel primarily to improve decision making and
NASA Astrophysics Data System (ADS)
Vassiliou, Marius S.; Sundareswaran, Venkataraman; Chen, S.; Behringer, Reinhold; Tam, Clement K.; Chan, M.; Bangayan, Phil T.; McGee, Joshua H.
2000-08-01
We describe new systems for improved integrated multimodal human-computer interaction and augmented reality for a diverse array of applications, including future advanced cockpits, tactical operations centers, and others. We have developed an integrated display system featuring: speech recognition of multiple concurrent users equipped with both standard air-coupled microphones and novel throat-coupled sensors (developed at Army Research Labs for increased noise immunity); lip reading for improving speech recognition accuracy in noisy environments; three-dimensional spatialized audio for improved display of warnings, alerts, and other information; wireless, coordinated handheld-PC control of a large display; real-time display of data and inferences from wireless integrated networked sensors with on-board signal processing and discrimination; gesture control with disambiguated point-and-speak capability; head- and eye-tracking coupled with speech recognition for 'look-and-speak' interaction; and integrated tetherless augmented reality on a wearable computer. The various interaction modalities (speech recognition, 3D audio, eye tracking, etc.) are implemented as 'modality servers' in an Internet-based client-server architecture. Each modality server encapsulates and exposes commercial and research software packages, presenting a socket network interface that is abstracted to a high-level interface, minimizing both vendor dependencies and required changes on the client side as the server's technology improves.
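The 'modality server' pattern, wrapping a recognizer behind a socket so that clients are insulated from the vendor package, can be sketched as below; the port, message format, and the stand-in recognizer are assumptions, not the system's actual protocol.

```python
import socket, threading, json

# Minimal sketch of a modality server: a recognizer sits behind a socket that
# answers JSON requests, keeping clients independent of the underlying package.
def fake_speech_recognizer(audio_ref):
    return {"transcript": "zoom to sector four", "confidence": 0.91}  # stand-in

def handle(conn):
    with conn:
        request = json.loads(conn.recv(4096).decode())
        result = fake_speech_recognizer(request.get("audio"))
        conn.sendall(json.dumps(result).encode())

def serve(port=5050):
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", port))
    srv.listen()
    while True:
        conn, _ = srv.accept()
        threading.Thread(target=handle, args=(conn,), daemon=True).start()

# serve()  # clients connect, send one JSON request, and read one JSON reply
```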
Human-Automation Interaction Design for Adaptive Cruise Control Systems of Ground Vehicles.
Eom, Hwisoo; Lee, Sang Hun
2015-06-12
A majority of recently developed advanced vehicles have been equipped with various automated driver assistance systems, such as adaptive cruise control (ACC) and lane keeping assistance systems. ACC systems have several operational modes, and drivers can be unaware of the mode in which they are operating. Because mode confusion is a significant human error factor that contributes to traffic accidents, it is necessary to develop user interfaces for ACC systems that can reduce mode confusion. To meet this requirement, this paper presents a new human-automation interaction design methodology in which the compatibility of the machine and interface models is determined using the proposed criteria, and, if the models are incompatible, one or both of the models are modified to make them compatible. To investigate the effectiveness of our methodology, we designed two new interfaces by separately modifying the machine model and the interface model and then performed driver-in-the-loop experiments. The results showed that modifying the machine model provides a more compact, acceptable, effective, and safe interface than modifying the interface model.
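The compatibility check can be illustrated with a toy model: a potential mode-confusion source is flagged when the same displayed mode and driver action can lead to different displayed modes depending on the hidden machine mode. The modes, actions, and transitions below are invented for illustration and are not the paper's models or criteria.

```python
# Toy machine/interface compatibility check in the spirit of the methodology.
machine = {  # (machine_mode, action) -> next machine_mode (invented)
    ("ACC_active", "set_speed"): "ACC_active",
    ("ACC_standby", "set_speed"): "ACC_active",
    ("ACC_override", "set_speed"): "ACC_standby",
}
display_of = {"ACC_active": "ON", "ACC_standby": "OFF", "ACC_override": "ON"}

def incompatibilities(machine, display_of):
    issues = []
    for action in {a for _, a in machine}:
        outcomes = {}
        for (mode, act), nxt in machine.items():
            if act == action:
                outcomes.setdefault(display_of[mode], set()).add(display_of[nxt])
        issues += [(d, action, outs) for d, outs in outcomes.items() if len(outs) > 1]
    return issues   # each entry is a potential mode-confusion source

print(incompatibilities(machine, display_of))
```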
The Next Wave: Humans, Computers, and Redefining Reality
NASA Technical Reports Server (NTRS)
Little, William
2018-01-01
The Augmented/Virtual Reality (AVR) Lab at KSC is dedicated to "exploration into the growing computer fields of Extended Reality and the Natural User Interface (it is) a proving ground for new technologies that can be integrated into future NASA projects and programs." The topics of Human Computer Interface, Human Computer Interaction, Augmented Reality, Virtual Reality, and Mixed Reality are defined, and examples of work being done in these fields in the AVR Lab are given. Current and future work in Computer Vision, Speech Recognition, and Artificial Intelligence is also outlined.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-18
... DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0021] Proposed Information Collection (VA Loan Electronic Reporting Interface (VALERI) System) Activity: Comment Request AGENCY: Veterans... techniques or the use of other forms of information technology. Title: VA Loan Electronic Reporting Interface...
Next Generation Munitions Handler: Human-Machine Interface and Preliminary Performance Evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Draper, J.V.; Jansen, J.F.; Pin, F.G.
1999-04-25
The Next Generation Munitions Handler/Advanced Technology Demonstrator (NGMH/ATD) is a technology demonstrator for the application of an advanced robotic device for re-arming U.S. Air Force (USAF) and U.S. Navy (USN) tactical fighters. It comprises two key hardware components: a heavy-lift dexterous manipulator (HDM) and a nonholonomic mobility platform. The NGMH/ATD is capable of lifting weapons up to 2000 kg (4400 lb) and placing them on any weapons rack on existing fighters (including the F-22 Raptor). This report describes the NGMH mission with particular reference to human-machine interfaces. It also describes preliminary testing to garner feedback about the heavy-lift manipulator arm from experienced fighter load crewmen. The purpose of the testing was to provide preliminary information about control system parameters and to gather feedback from users about manipulator arm functionality. To that end, the Air Force load crewmen interacted with the NGMH/ATD in an informal testing session and provided feedback about the performance of the system. Certain control system parameters were changed during the course of the testing, and feedback from the participants was used to make a rough estimate of "good" initial operating parameters. Later, formal testing will concentrate within this range to identify optimal operating parameters. User reactions to the HDM were generally positive. All of the USAF personnel were favorably impressed with the capabilities of the system. Fine-tuning operating parameters created a system even more favorably regarded by the load crews. Further adjustment to control system parameters will result in a system that is operationally efficient, easy to use, and well accepted by users.
Head-Disk Interface Technology: Challenges and Approaches
NASA Astrophysics Data System (ADS)
Liu, Bo
Magnetic hard disk drive (HDD) technology is believed to be one of the most successful examples of modern mechatronic systems. The mechanical beauty of the magnetic HDD includes simple but super-high-accuracy head positioning technology, high-speed and high-stability spindle motor technology, and head-disk interface technology, which keeps the millimeter-sized slider flying over the disk surface at nanometer-level slider-disk spacing. This paper addresses the challenges and possible approaches for further reducing the slider-disk spacing whilst retaining the stability and robustness of head-disk systems for future advanced magnetic disk drives.
Human machine interface display design document.
DOT National Transportation Integrated Search
2008-01-01
The purpose of this document is to describe the design for the human machine interface (HMI) display for the Next Generation 9-1-1 (NG9-1-1) System (or system of systems) based on the initial Tier 1 requirements identified for the NG9-1-1 S...
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
The National Renewable Energy Laboratory's (NREL) controllable grid interface (CGI) test system at the National Wind Technology Center (NWTC) is one of two user facilities at NREL capable of testing and analyzing the integration of megawatt-scale renewable energy systems. The CGI specializes in testing of multimegawatt-scale wind and photovoltaic (PV) technologies as well as energy storage devices, transformers, control and protection equipment at medium-voltage levels, allowing the determination of the grid impacts of the tested technology.
The role of voice input for human-machine communication.
Cohen, P R; Oviatt, S L
1995-01-01
Optimism is growing that the near future will witness rapid growth in human-computer interaction using voice. System prototypes have recently been built that demonstrate speaker-independent real-time speech recognition, and understanding of naturally spoken utterances with vocabularies of 1000 to 2000 words, and larger. Already, computer manufacturers are building speech recognition subsystems into their new product lines. However, before this technology can be broadly useful, a substantial knowledge base is needed about human spoken language and performance during computer-based spoken interaction. This paper reviews application areas in which spoken interaction can play a significant role, assesses potential benefits of spoken interaction with machines, and compares voice with other modalities of human-computer interaction. It also discusses information that will be needed to build a firm empirical foundation for the design of future spoken and multimodal interfaces. Finally, it argues for a more systematic and scientific approach to investigating spoken input and performance with future language technology. PMID:7479803
Ecological Interface Design for Computer Network Defense.
Bennett, Kevin B; Bryant, Adam; Sushereba, Christen
2018-05-01
A prototype ecological interface for computer network defense (CND) was developed. Concerns about CND run high. Although there is a vast literature on CND, there is some indication that this research is not being translated into operational contexts. Part of the reason may be that CND has historically been treated as a strictly technical problem, rather than as a socio-technical problem. The cognitive systems engineering (CSE)/ecological interface design (EID) framework was used in the analysis and design of the prototype interface. A brief overview of CSE/EID is provided. EID principles of design (i.e., direct perception, direct manipulation and visual momentum) are described and illustrated through concrete examples from the ecological interface. Key features of the ecological interface include (a) a wide variety of alternative visual displays, (b) controls that allow easy, dynamic reconfiguration of these displays, (c) visual highlighting of functionally related information across displays, (d) control mechanisms to selectively filter massive data sets, and (e) the capability for easy expansion. Cyber attacks from a well-known data set are illustrated through screen shots. CND support needs to be developed with a triadic focus (i.e., humans interacting with technology to accomplish work) if it is to be effective. Iterative design and formal evaluation are also required. The discipline of human factors has a long tradition of success on both counts; it is time that HF became fully involved in CND. The ecological interface has direct application in supporting cyber analysts.
Demonstrating artificial intelligence for space systems - Integration and project management issues
NASA Technical Reports Server (NTRS)
Hack, Edmund C.; Difilippo, Denise M.
1990-01-01
As part of its Systems Autonomy Demonstration Project (SADP), NASA has recently demonstrated the Thermal Expert System (TEXSYS). Advanced real-time expert system and human interface technology was successfully developed and integrated with conventional controllers of prototype space hardware to provide intelligent fault detection, isolation, and recovery capability. Many specialized skills were required, and responsibility for the various phases of the project therefore spanned multiple NASA centers, internal departments and contractor organizations. The test environment required communication among many types of hardware and software as well as between many people. The integration, testing, and configuration management tools and methodologies which were applied to the TEXSYS project to assure its safe and successful completion are detailed. The project demonstrated that artificial intelligence technology, including model-based reasoning, is capable of the monitoring and control of a large, complex system in real time.
NASA Astrophysics Data System (ADS)
Kajiwara, Yusuke; Murata, Hiroaki; Kimura, Haruhiko; Abe, Koji
As a communication support tool for cases of amyotrophic lateral sclerosis (ALS), research on eye gaze human-computer interfaces has been active. However, since voluntary and involuntary eye movements cannot be distinguished in these interfaces, their performance is still not sufficient for practical use. This paper presents a high-performance human-computer interface system which unites high-quality recognition of horizontal directional eye movements and voluntary blinks. The experimental results show that the number of incorrect inputs is decreased by 35.1% compared with an existing system that recognizes horizontal and vertical directional eye movements in addition to voluntary blinks, and that character input is sped up by 17.4% relative to the existing system.
Eye Tracking and Head Movement Detection: A State-of-Art Survey
2013-01-01
Eye-gaze detection and tracking have been an active research field in the past years as they add convenience to a variety of applications. Eye tracking is considered a significant nontraditional method of human-computer interaction. Head movement detection has also received researchers' attention and interest as it has been found to be a simple and effective interaction method. Both technologies are considered among the easiest alternative interface methods. They serve a wide range of severely disabled people who are left with minimal motor abilities. For both eye tracking and head movement detection, several different approaches have been proposed and used to implement different algorithms for these technologies. Despite the amount of research done on both technologies, researchers are still trying to find robust methods to use effectively in various applications. This paper presents a state-of-the-art survey of eye tracking and head movement detection methods proposed in the literature. Examples of different fields of application for both technologies, such as human-computer interaction, driving assistance systems, and assistive technologies, are also investigated. PMID:27170851
Videodisc-Computer Interfaces.
ERIC Educational Resources Information Center
Zollman, Dean
1984-01-01
Lists microcomputer-videodisc interfaces currently available from 26 sources, including home use systems connected through remote control jack and industrial/educational systems utilizing computer ports and new laser reflective and stylus technology. Information provided includes computer and videodisc type, language, authoring system, educational…
User Interface Technology Transfer to NASA's Virtual Wind Tunnel System
NASA Technical Reports Server (NTRS)
vanDam, Andries
1998-01-01
Funded by NASA grants for four years, the Brown Computer Graphics Group has developed novel 3D user interfaces for desktop and immersive scientific visualization applications. This past grant period supported the design and development of a software library, the 3D Widget Library, which supports the construction and run-time management of 3D widgets. The 3D Widget Library is a mechanism for transferring user interface technology from the Brown Graphics Group to the Virtual Wind Tunnel system at NASA Ames as well as the public domain.
Safer Systems: A NextGen Aviation Safety Strategic Goal
NASA Technical Reports Server (NTRS)
Darr, Stephen T.; Ricks, Wendell R.; Lemos, Katherine A.
2008-01-01
The Joint Planning and Development Office (JPDO), is charged by Congress with developing the concepts and plans for the Next Generation Air Transportation System (NextGen). The National Aviation Safety Strategic Plan (NASSP), developed by the Safety Working Group of the JPDO, focuses on establishing the goals, objectives, and strategies needed to realize the safety objectives of the NextGen Integrated Plan. The three goal areas of the NASSP are Safer Practices, Safer Systems, and Safer Worldwide. Safer Practices emphasizes an integrated, systematic approach to safety risk management through implementation of formalized Safety Management Systems (SMS) that incorporate safety data analysis processes, and the enhancement of methods for ensuring safety is an inherent characteristic of NextGen. Safer Systems emphasizes implementation of safety-enhancing technologies, which will improve safety for human-centered interfaces and enhance the safety of airborne and ground-based systems. Safer Worldwide encourages coordinating the adoption of the safer practices and safer systems technologies, policies and procedures worldwide, such that the maximum level of safety is achieved across air transportation system boundaries. This paper introduces the NASSP and its development, and focuses on the Safer Systems elements of the NASSP, which incorporates three objectives for NextGen systems: 1) provide risk reducing system interfaces, 2) provide safety enhancements for airborne systems, and 3) provide safety enhancements for ground-based systems. The goal of this paper is to expose avionics and air traffic management system developers to NASSP objectives and Safer Systems strategies.
I want what you've got: Cross-platform portability and human-robot interaction assessment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Julie L. Marble, Ph.D.; Douglas A. Few; David J. Bruemmer
2005-08-01
Human-robot interaction is a subtle, yet critical aspect of design that must be assessed during the development of both the human-robot interface and robot behaviors if the human-robot team is to effectively meet the complexities of the task environment. Testing not only ensures that the system can successfully achieve the tasks for which it was designed, but more importantly, usability testing allows the designers to understand how humans and robots can, will, and should work together to optimize workload distribution. A lack of human-centered robot interface design, the rigidity of sensor configuration, and the platform-specific nature of research robot development environments are a few factors preventing robotic solutions from reaching functional utility in real-world environments. Often the difficult engineering challenge of implementing adroit reactive behavior, reliable communication, and trustworthy autonomy that combines with system transparency and usable interfaces is overlooked in favor of other research aims. The result is that many robotic systems never reach a level of functional utility necessary even to evaluate the efficacy of the basic system, much less result in a system that can be used in a critical, real-world environment. Further, because control architectures and interfaces are often platform specific, it is difficult or even impossible to make usability comparisons between them. This paper discusses the challenges inherent to the conduct of human factors testing of variable autonomy control architectures and across platforms within a complex, real-world environment. It discusses the need to compare behaviors, architectures, and interfaces within a structured environment that contains challenging real-world tasks, and the implications for system acceptance and trust of autonomous robotic systems, and for how humans and robots interact in true interactive teams.
Enrichment of Human-Computer Interaction in Brain-Computer Interfaces via Virtual Environments
Víctor Rodrigo, Mercado-García
2017-01-01
Tridimensional representations stimulate cognitive processes that are the core and foundation of human-computer interaction (HCI). Those cognitive processes take place while a user navigates and explores a virtual environment (VE) and are mainly related to spatial memory storage, attention, and perception. VEs have many distinctive features (e.g., involvement, immersion, and presence) that can significantly improve HCI in highly demanding and interactive systems such as brain-computer interfaces (BCI). A BCI is a nonmuscular communication channel that attempts to reestablish the interaction between an individual and his/her environment. Although BCI research started in the sixties, this technology is not yet efficient or reliable for everyone at any time. Over the past few years, researchers have argued that the main BCI flaws could be associated with HCI issues. The evidence presented thus far shows that VEs can (1) set out working environmental conditions, (2) maximize the efficiency of BCI control panels, (3) implement navigation systems based not only on user intentions but also on user emotions, and (4) regulate user mental state to increase the differentiation between control and noncontrol modalities. PMID:29317861
Leite, Harlei Miguel de Arruda; de Carvalho, Sarah Negreiros; Costa, Thiago Bulhões da Silva; Attux, Romis; Hornung, Heiko Horst; Arantes, Dalton Soares
2018-01-01
This paper presents a systematic analysis of a game controlled by a Brain-Computer Interface (BCI) based on Steady-State Visually Evoked Potentials (SSVEP). The objective is to understand BCI systems from the Human-Computer Interface (HCI) point of view, by observing how the users interact with the game and evaluating how the interface elements influence the system performance. The interactions of 30 volunteers with our computer game, named "Get Coins," through a BCI based on SSVEP, have generated a database of brain signals and the corresponding responses to a questionnaire about various perceptual parameters, such as visual stimulation, acoustic feedback, background music, visual contrast, and visual fatigue. Each one of the volunteers played one match using the keyboard and four matches using the BCI, for comparison. In all matches using the BCI, the volunteers achieved the goals of the game. Eight of them achieved a perfect score in at least one of the four matches, showing the feasibility of the direct communication between the brain and the computer. Despite this successful experiment, adaptations and improvements should be implemented to make this innovative technology accessible to the end user.
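SSVEP command detection of the kind such a game relies on can be sketched as picking the stimulation frequency with the strongest spectral power in an EEG epoch; the target frequencies, sampling rate, and single-channel setup below are assumptions, not the study's decoder.

```python
import numpy as np

# Illustrative SSVEP detection: the selected command is the stimulation
# frequency with the largest spectral power in the EEG epoch.
TARGETS_HZ = {"left": 8.0, "right": 10.0, "up": 12.0, "down": 15.0}

def detect_command(eeg, fs=256):
    """eeg: 1-D epoch; returns the best-matching game command."""
    freqs = np.fft.rfftfreq(eeg.size, d=1.0 / fs)
    power = np.abs(np.fft.rfft(eeg)) ** 2
    scores = {cmd: power[np.argmin(np.abs(freqs - f))] for cmd, f in TARGETS_HZ.items()}
    return max(scores, key=scores.get)

# Synthetic epoch with a 10 Hz response buried in noise -> should return "right"
t = np.arange(0, 2, 1 / 256)
epoch = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
print(detect_command(epoch))
```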
Motor-commands decoding using peripheral nerve signals: a review
NASA Astrophysics Data System (ADS)
Hong, Keum-Shik; Aziz, Nida; Ghafoor, Usman
2018-06-01
During the last few decades, substantial scientific and technological efforts have been focused on the development of neuroprostheses. The major emphasis has been on techniques for connecting the human nervous system with a robotic prosthesis via natural-feeling interfaces. The peripheral nerves provide access to highly processed and segregated neural command signals from the brain that can in principle be used to determine user intent and control muscles. If these signals could be used, they might allow near-natural and intuitive control of prosthetic limbs with multiple degrees of freedom. This review summarizes the history of neuroprosthetic interfaces and their ability to record from and stimulate peripheral nerves. We also discuss the types of interfaces available and their applications, the kinds of peripheral nerve signals that are used, and the algorithms used to decode them. Finally, we explore the prospects for future development in this area.
NASA Technical Reports Server (NTRS)
Mcgreevy, Michael W.
1990-01-01
An advanced human-system interface is being developed for evolutionary Space Station Freedom as part of the NASA Office of Space Station (OSS) Advanced Development Program. The human-system interface is based on body-pointed display and control devices. The project will identify and document the design accommodations ('hooks and scars') required to support virtual workstations and telepresence interfaces, and prototype interface systems will be built, evaluated, and refined. The project is a joint enterprise of Marquette University, Astronautics Corporation of America (ACA), and NASA's ARC. The project team is working with NASA's JSC and McDonnell Douglas Astronautics Company (the Work Package contractor) to ensure that the project is consistent with space station user requirements and program constraints. Documentation describing design accommodations and tradeoffs will be provided to OSS, JSC, and McDonnell Douglas, and prototype interface devices will be delivered to ARC and JSC. ACA intends to commercialize derivatives of the interface for use with computer systems developed for scientific visualization and system simulation.
Deep Space Network information system architecture study
NASA Technical Reports Server (NTRS)
Beswick, C. A.; Markley, R. W. (Editor); Atkinson, D. J.; Cooper, L. P.; Tausworthe, R. C.; Masline, R. C.; Jenkins, J. S.; Crowe, R. A.; Thomas, J. L.; Stoloff, M. J.
1992-01-01
The purpose of this article is to describe an architecture for the Deep Space Network (DSN) information system in the years 2000-2010 and to provide guidelines for its evolution during the 1990s. The study scope is defined to be from the front-end areas at the antennas to the end users (spacecraft teams, principal investigators, archival storage systems, and non-NASA partners). The architectural vision provides guidance for major DSN implementation efforts during the next decade. A strong motivation for the study is an expected dramatic improvement in information-systems technologies, such as the following: computer processing, automation technology (including knowledge-based systems), networking and data transport, software and hardware engineering, and human-interface technology. The proposed Ground Information System has the following major features: unified architecture from the front-end area to the end user; open-systems standards to achieve interoperability; DSN production of level 0 data; delivery of level 0 data from the Deep Space Communications Complex, if desired; dedicated telemetry processors for each receiver; security against unauthorized access and errors; and highly automated monitor and control.
Nursing acceptance of a speech-input interface: a preliminary investigation.
Dillon, T W; McDowell, D; Norcio, A F; DeHaemer, M J
1994-01-01
Many new technologies are being developed to improve the efficiency and productivity of nursing staffs. User acceptance is a key to the success of these technologies. In this article, the authors present a discussion of nursing acceptance of computer systems, review the basic design issues for creating a speech-input interface, and report preliminary findings of a study of nursing acceptance of a prototype speech-input interface. Results of the study showed that the 19 nursing subjects expressed acceptance of the prototype speech-input interface.
NASA Technical Reports Server (NTRS)
1981-01-01
The impact of modern technology on the role, responsibility, authority, and performance of human operators in modern aircraft and ATC systems was examined in terms of principles defined by Paul Fitts. Research into human factors in aircraft operations and the use of human factors engineering for aircraft safety improvements were discussed, and features of the man-machine interface in computerized cockpit warning systems are examined. The design and operational features of computerized avionics displays and HUDs are described, along with results of investigations into pilot decision-making behavior, aircrew procedural compliance, and aircrew judgment training programs. Experiments in vision and visual perception are detailed, as are behavioral studies of crew workload, coordination, and complement. The effectiveness of pilot selection, screening, and training techniques is assessed, as are methods for evaluating pilot performance.
NASA Astrophysics Data System (ADS)
Lin, Y.; Zhang, W. J.
2005-02-01
This paper presents an approach to human-machine interface design for control room operators of nuclear power plants. The first step in designing an interface for a particular application is to determine the information content that needs to be displayed. The design methodology for this step is called the interface design framework (hereafter, the framework). Several frameworks have been proposed for applications at varying levels, including process plants. However, none is based on the design and manufacture of the plant system for which the interface is designed. This paper presents an interface design framework which originates from design theory and methodology for general technical systems. Specifically, the framework is based on a set of core concepts of a function-behavior-state model originally proposed by the artificial intelligence research community and widely applied in the design research community. Benefits of this new framework include the provision of a model-based fault diagnosis facility and the seamless integration of the design (manufacture, maintenance) of plants and the design of human-machine interfaces. The missing linkage between design and operation of a plant was one of the causes of the Three Mile Island nuclear reactor incident. A simulated plant system is presented to explain how to apply this framework in designing an interface. The resulting human-machine interface is discussed; specifically, several fault diagnosis examples are elaborated to demonstrate how this interface could support operators' fault diagnosis in an unanticipated situation.
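A hedged sketch of the kind of function-behavior-state description such a framework builds on is shown below; the plant component, state variables, and fault rule are invented for illustration and are not taken from the paper.

# Hedged sketch: a function-behavior-state (FBS) node of the kind a display framework
# could use to decide what an operator must see. The component and values are invented.
from dataclasses import dataclass, field

@dataclass
class FBSNode:
    function: str                                # what the component is for
    behavior: str                                # how it achieves the function
    state: dict = field(default_factory=dict)    # measurable variables realizing the behavior

feed_pump = FBSNode(
    function="maintain feedwater flow to the steam generator",
    behavior="convert motor torque into coolant pressure head",
    state={"flow_kg_s": 430.0, "discharge_pressure_MPa": 7.2, "motor_current_A": 315.0},
)

def diagnose(node, expected_flow=450.0):
    """Model-based check: a state that no longer realizes the behavior flags the function."""
    if node.state["flow_kg_s"] < 0.9 * expected_flow:
        return f"Function degraded: {node.function}"
    return "Function satisfied"

print(diagnose(feed_pump))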
A Survey of CAD/CAM Technology Applications in the U.S. Shipbuilding Industry
1984-01-01
Computer Aided Design (CAD) is the technology most commonly applied, principally in its operation for drafting. Computer Aided Engineering (CAE) analysis is used primarily to determine the validity of design characteristics and production methods; other applications include time standard generation, sea trial analysis, and group technology. Systems integration is the largest problem involving software packages: while no systems surveyed are truly integrated, many are interfaced, and CAD is the most interfaced category, with links to other applications.
Standardized Modular Power Interfaces for Future Space Explorations Missions
NASA Technical Reports Server (NTRS)
Oeftering, Richard
2015-01-01
Earlier studies show that future human exploration missions are composed of multi-vehicle assemblies with interconnected electric power systems. Some vehicles are intended to serve as flexible multi-purpose or multi-mission platforms. This drives the need for power architectures that can be reconfigured to support this level of flexibility. Power system developmental costs can be reduced, program-wide, by utilizing a common set of modular building blocks. Further, there are mission operational and logistics cost benefits of using a common set of modular spares. These benefits are the goals of the Advanced Exploration Systems (AES) Modular Power System (AMPS) project. A common set of modular blocks requires a substantial level of standardization in terms of the Electrical, Data System, and Mechanical interfaces. The AMPS project is developing a set of proposed interface standards that will provide useful guidance for modular hardware developers but not needlessly constrain technology options or limit future growth in capability. In 2015 the AMPS project focused on standardizing the interfaces between the elements of spacecraft power distribution and energy storage. The development of the modular power standard starts with establishing mission assumptions and ground rules to define the design application space. The standards are defined in terms of AMPS objectives including Commonality, Reliability-Availability, Flexibility-Configurability, and Supportability-Reusability. The proposed standards are aimed at assembly- and sub-assembly-level building blocks. AMPS plans to adopt existing standards for spacecraft command and data, software, network interfaces, and electrical power interfaces where applicable. Other standards, including structural encapsulation, heat transfer, and fluid transfer, are governed by launch and spacecraft environments and bound by practical limitations of weight and volume. Developing these mechanical interface standards is more difficult but is an essential part of defining the physical building blocks of modular power. This presentation describes the AMPS project's progress toward standardized modular power interfaces.
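The following hedged sketch shows one way the electrical, data-system, and mechanical interface agreement described above could be captured as a machine-checkable descriptor; the voltage class, bus, and connector names are placeholders, not values proposed by the AMPS project.

# Hedged sketch: a modular power interface descriptor covering the three categories a
# common building block must agree on. All specific values are illustrative placeholders.
from dataclasses import dataclass

@dataclass(frozen=True)
class PowerModuleInterface:
    bus_voltage_v: float        # electrical interface
    max_current_a: float
    data_bus: str               # data-system interface
    connector: str              # mechanical interface

def compatible(a: PowerModuleInterface, b: PowerModuleInterface) -> bool:
    """Two modules can be mated only if all three interface categories agree."""
    return (a.bus_voltage_v == b.bus_voltage_v
            and a.data_bus == b.data_bus
            and a.connector == b.connector)

battery = PowerModuleInterface(120.0, 50.0, "CAN", "blind-mate-A")
pdu     = PowerModuleInterface(120.0, 80.0, "CAN", "blind-mate-A")
print(compatible(battery, pdu))   # True: the modules share a common building-block interface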
Propulsion/flight control integration technology (PROFIT) design analysis status
NASA Technical Reports Server (NTRS)
Carlin, C. M.; Hastings, W. J.
1978-01-01
The propulsion flight control integration technology (PROFIT) program was designed to develop a flying testbed dedicated to controls research. The preliminary design, analysis, and feasibility studies conducted in support of the PROFIT program are reported. The PROFIT system was built around existing IPCS hardware. In order to achieve the desired system flexibility and capability, additional interfaces between the IPCS hardware and F-15 systems were required. The requirements for additions and modifications to the existing hardware were defined. Those interfaces involving the more significant changes were studied. The DCU memory expansion to 32K with flight qualified hardware was completed on a brassboard basis. The uplink interface breadboard and a brassboard of the central computer interface were also tested. Two preliminary designs and corresponding program plans are presented.
Micro-nano-biosystems: An overview of European research.
Lymberis, Andreas
2010-06-01
New developments in science, technologies and applications are blurring the boundaries between information and communications technology (ICT), micro-nano systems and life sciences, e.g. through miniaturisation and the ability to manipulate matter at the atomic scale and to interface live and man-made systems. Interdisciplinary research towards integrated systems and their applications, based on the emerging convergence of information and communication technologies, micro-nano and bio technologies, is expected to have a direct influence on healthcare, the ageing population and well-being. Micro-Nano-Bio Systems (MNBS) research and development activities under the European Union's R&D Programs (Information & Communication Technologies priority) address miniaturised, smart and integrated systems for in-vitro testing, e.g. lab-on-chips, and systems interacting with the human, e.g. autonomous implants, endoscopic capsules and robotics for minimally invasive surgery. The MNBS group involves hundreds of key public and private international organisations working on system development and validation in diverse applications such as cancer detection and therapy follow-up, minimally invasive surgery, capsule endoscopy, wearable biochemical monitoring and the repair of vital functions with active implant devices. The paper presents the MNBS rationale and activities, discusses key research and innovation challenges and proposes R&D directions to achieve the expected impact on healthcare and quality of life.
A Research Roadmap for Computation-Based Human Reliability Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boring, Ronald; Mandelli, Diego; Joe, Jeffrey
2015-08-01
The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.
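As a hedged illustration of the kind of calculation a computation-based HRA engine performs, the sketch below adjusts a nominal human error probability by performance shaping factors that a dynamic plant simulation could update over time; the multipliers follow the general SPAR-H style and are not HUNTER's actual values.

# Hedged sketch: nominal human error probability (HEP) adjusted by performance shaping
# factors (PSFs). Multipliers are illustrative, SPAR-H-style assumptions only.
NOMINAL_HEP = 0.01

def adjusted_hep(psf_multipliers):
    """Multiply the nominal HEP by each PSF and cap the result at 1.0."""
    hep = NOMINAL_HEP
    for m in psf_multipliers.values():
        hep *= m
    return min(hep, 1.0)

# PSFs as they might be sampled at one time step of a dynamic simulation (assumed values).
psfs = {"available_time": 10.0, "stress": 2.0, "complexity": 1.0, "experience": 0.5}
print(f"HEP at this time step: {adjusted_hep(psfs):.3f}")   # 0.01 * 10 * 2 * 1 * 0.5 = 0.100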
Mehmood, Raja Majid; Lee, Hyo Jong
2017-01-01
Human-computer interaction is a growing field aimed at helping people improve their daily lives. In particular, people with disabilities may need an interface that is more appropriate for, and compatible with, their needs. Our research focuses on similar problems, such as students with mental disorders or mood disruption. To improve their learning process, an intelligent emotion recognition system that can recognize the current emotional state of the brain is essential. Currently, instructors in special schools commonly use conventional methods for managing special students for educational purposes. In this paper, we propose a novel computer-aided method that lets instructors at special schools teach special students with the support of our system using wearable technologies. PMID:28208734
Human factors issues for interstellar spacecraft
NASA Technical Reports Server (NTRS)
Cohen, Marc M.; Brody, Adam R.
1991-01-01
Developments in research on space human factors are reviewed in the context of a self-sustaining interstellar spacecraft based on the notion of traveling space settlements. Assumptions about interstellar travel are set forth addressing costs, mission durations, and the need for multigenerational space colonies. The model of human motivation by Maslow (1970) is examined and directly related to the design of space habitat architecture. Human-factors technology issues encompass the human-machine interface, crew selection and training, and the development of spaceship infrastructure during transtellar flight. A scenario for feasible interstellar travel is based on a speed of 0.5c, a timeframe of about 100 yr, and an expandable multigenerational crew of about 100 members. Crew training is identified as a critical human-factors issue requiring the development of perceptual and cognitive aids such as expert systems and virtual reality.
A haptic interface for virtual simulation of endoscopic surgery.
Rosenberg, L B; Stredney, D
1996-01-01
Virtual reality can be described as a convincingly realistic and naturally interactive simulation in which the user is given a first-person illusion of being immersed within a computer-generated environment. While virtual reality systems offer great potential to reduce the cost and increase the quality of medical training, many technical challenges must be overcome before such simulation platforms offer effective alternatives to more traditional training means. A primary challenge in developing effective virtual reality systems is designing the human interface hardware which allows rich sensory information to be presented to users in natural ways. When simulating a given manual procedure, task-specific human interface requirements dictate task-specific human interface hardware. The following paper explores the design of human interface hardware that satisfies the task-specific requirements of virtual reality simulation of endoscopic surgical procedures. Design parameters were derived through direct cadaver studies and interviews with surgeons. The final hardware design is presented.
Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Mount, Frances; Carreon, Patricia; Torney, Susan E.
2001-01-01
The Engineering and Mission Operations Directorates at NASA Johnson Space Center are combining laboratories and expertise to establish the Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations. This is a testbed for human centered design, development and evaluation of intelligent autonomous and assistant systems that will be needed for human exploration and development of space. This project will improve human-centered analysis, design and evaluation methods for developing intelligent software. This software will support human-machine cognitive and collaborative activities in future interplanetary work environments where distributed computer and human agents cooperate. We are developing and evaluating prototype intelligent systems for distributed multi-agent mixed-initiative operations. The primary target domain is control of life support systems in a planetary base. Technical approaches will be evaluated for use during extended manned tests in the target domain, the Bioregenerative Advanced Life Support Systems Test Complex (BIO-Plex). A spinoff target domain is the International Space Station (ISS) Mission Control Center (MCC). Products of this project include human-centered intelligent software technology, innovative human interface designs, and human-centered software development processes, methods and products. The testbed uses adjustable autonomy software and life support systems simulation models from the Adjustable Autonomy Testbed, to represent operations on the remote planet. Ground operations prototypes and concepts will be evaluated in the Exploration Planning and Operations Center (ExPOC) and Jupiter Facility.
Freestanding Triboelectric Nanogenerator Enables Noncontact Motion-Tracking and Positioning.
Guo, Huijuan; Jia, Xueting; Liu, Lue; Cao, Xia; Wang, Ning; Wang, Zhong Lin
2018-04-24
Recent development of interactive motion-tracking and positioning technologies is attracting increasing interest in many areas, such as wearable electronics, intelligent electronics, and the internet of things. For example, so-called somatosensory technology can afford users a strong sense of immersion and realism through their continuous interaction with a game. Here, we report a noncontact self-powered positioning and motion-tracking system based on a freestanding triboelectric nanogenerator (TENG). The TENG was fabricated with a nanoengineered surface operating in the contact-separation mode, with a freely moving human body part (hands or feet) as the trigger. The poly(tetrafluoroethylene) (PTFE) array-based interactive interface can give an output of 222 V from casual human motions. Unlike previous works, this device also responds to small actions at heights of 0.01-0.11 m above the device, with a sensitivity of about 315 V·m⁻¹, so that noncontact mechanical sensing is possible. Such a distinctive noncontact sensing feature promotes a wide range of potential applications in smart interaction systems.
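A hedged sketch of how the reported roughly linear height response could be used follows: the peak output voltage is inverted through the approximately 315 V·m⁻¹ sensitivity to estimate hand height; the reference voltage at the minimum height is an assumed calibration value, not a figure from the paper.

# Hedged sketch: estimate hand height from peak TENG output using the reported linear
# sensitivity over the 0.01-0.11 m range. V_AT_MIN_HEIGHT is an assumed calibration value.
SENSITIVITY_V_PER_M = 315.0
MIN_HEIGHT_M, MAX_HEIGHT_M = 0.01, 0.11
V_AT_MIN_HEIGHT = 35.0          # assumed peak output when the hand is at 0.01 m

def estimate_height(peak_voltage):
    """Invert the linear sensitivity; valid only inside the calibrated height range."""
    h = MIN_HEIGHT_M + (V_AT_MIN_HEIGHT - peak_voltage) / SENSITIVITY_V_PER_M
    return min(max(h, MIN_HEIGHT_M), MAX_HEIGHT_M)

print(f"estimated height: {estimate_height(19.25):.3f} m")   # about 0.060 m for a 19.25 V peak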
Epilepsy in Ireland: towards the primary-tertiary care continuum.
Varley, Jarlath; Delanty, Norman; Normand, Charles; Coyne, Imelda; McQuaid, Louise; Collins, Claire; Boland, Michael; Grimson, Jane; Fitzsimons, Mary
2010-01-01
Epilepsy is a chronic neurological disease affecting people of every age, gender, race and socio-economic background. The diagnosis and optimal management relies on contribution from a number of healthcare disciplines in a variety of healthcare settings. To explore the interface between primary care and specialist epilepsy services in Ireland. Using appreciative inquiry, focus groups were held with healthcare professionals (n=33) from both primary and tertiary epilepsy specialist services in Ireland. There are significant challenges to delivering a consistent high standard of epilepsy care in Ireland. The barriers that were identified are: the stigma of epilepsy, unequal access to care services, insufficient human resources, unclear communication between primary-tertiary services and lack of knowledge. Improving the management of people with epilepsy requires reconfiguration of the primary-tertiary interface and establishing clearly defined roles and formalised clinical pathways. Such initiatives require resources in the form of further education and training and increased usage of information communication technology (ICT). Epilepsy services across the primary-tertiary interface can be significantly enhanced through the implementation of a shared model of care underpinned by an electronic patient record (EPR) system and information communication technology (ICT). Better chronic disease management has the potential to halt the progression of epilepsy with ensuing benefits for patients and the healthcare system. Copyright 2009 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.
Conceptual design for a lunar-base CELSS
NASA Technical Reports Server (NTRS)
Schwartzkopf, Steven H.; Cullingford, Hatice S.
1990-01-01
Future human exploration is key to the United States National Space Policy goal of maintaining a world leadership position in space. In the past, spacecraft life support systems have used open-loop technologies that were simple and sufficiently reliable to demonstrate the feasibility of spaceflight. A critical technology area needing development in support of both long duration missions and the establishment of lunar or planetary bases is regenerative life support. The information presented in this paper describes a conceptual design of a Lunar Base Controlled Ecological Life Support System (LCELSS) which supports a crew size ranging from 4 to 100. The system includes, or incorporates interfaces with, eight primary subsystems. An initial description of the Lunar-Base CELSS subsystems is provided within the framework of the conceptual design. The system design includes both plant (algae and higher plant) and animal species as potential food sources.
NASA Technical Reports Server (NTRS)
Jiang, Jian-Ping; Murphy, Elizabeth D.; Bailin, Sidney C.; Truszkowski, Walter F.
1993-01-01
Capturing human factors knowledge about the design of graphical user interfaces (GUI's) and applying this knowledge on-line are the primary objectives of the Computer-Human Interaction Models (CHIMES) project. The current CHIMES prototype is designed to check a GUI's compliance with industry-standard guidelines, general human factors guidelines, and human factors recommendations on color usage. Following the evaluation, CHIMES presents human factors feedback and advice to the GUI designer. The paper describes the approach to modeling human factors guidelines, the system architecture, a new method developed to convert quantitative RGB primaries into qualitative color representations, and the potential for integrating CHIMES with user interface management systems (UIMS). Both the conceptual approach and its implementation are discussed. This paper updates the presentation on CHIMES at the first International Symposium on Ground Data Systems for Spacecraft Control.
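As a hedged illustration of converting quantitative RGB primaries into a qualitative color representation, the sketch below maps RGB to a coarse color name via HSV; the hue boundaries and category names are assumptions for illustration, not the CHIMES method.

# Hedged sketch: RGB primaries to a qualitative colour category via HSV. The boundaries
# and names below are illustrative assumptions only.
import colorsys

def qualitative_color(r, g, b):
    """r, g, b in 0..255; returns a coarse colour category."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    if v < 0.15:
        return "black"
    if s < 0.15:
        return "white" if v > 0.85 else "gray"
    hue_deg = h * 360.0
    for name, upper in [("red", 20), ("yellow", 70), ("green", 160),
                        ("cyan", 200), ("blue", 260), ("magenta", 335)]:
        if hue_deg < upper:
            return name
    return "red"    # hues above 335 degrees wrap back to red

print(qualitative_color(255, 200, 0))   # "yellow"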
Interfacing with the nervous system: a review of current bioelectric technologies.
Sahyouni, Ronald; Mahmoodi, Amin; Chen, Jefferson W; Chang, David T; Moshtaghi, Omid; Djalilian, Hamid R; Lin, Harrison W
2017-10-23
The aim of this study is to discuss the state of the art with regard to established or promising bioelectric therapies meant to alter or control neurologic function. We present recent reports on bioelectric technologies that interface with the nervous system at three potential sites: (1) the end organ, (2) the peripheral nervous system, and (3) the central nervous system, while exploring practical and clinical considerations. A literature search was executed on the PubMed, IEEE, and Web of Science databases. A review of the current literature was conducted to examine functional and histomorphological effects of neuroprosthetic interfaces with a focus on end-organ, peripheral, and central nervous system interfaces. Innovations in bioelectric technologies are providing increasing selectivity in stimulating distinct nerve fiber populations in order to activate discrete muscles. Significant advances in electrode array design focus on increasing the selectivity, stability, and functionality of implantable neuroprosthetics. The application of neuroprosthetics to paretic nerves, or even direct stimulation of or recording from the central nervous system, holds great potential for advancing the field of nerve and tissue bioelectric engineering and contributing to clinical care. Although current physiotherapeutic and surgical treatments seek to restore function, structure, or comfort, they bear significant limitations in enabling cosmetic or functional recovery. Instead, the introduction of bioelectric technology may play a role in the restoration of function in patients with neurologic deficits.
The Role of Trust in Information Science and Technology.
ERIC Educational Resources Information Center
Marsh, Stephen; Dibben, Mark R.
2003-01-01
Discusses the notion of trust as it relates to information science and technology, specifically user interfaces, autonomous agents, and information systems. Highlights include theoretical meaning of trust; trust and levels of analysis, including organizational trust; electronic commerce, user interfaces, and static trust; dynamic trust; and trust…
ERIC Educational Resources Information Center
Fryda, Lawrence J.; Harrington, Robert; Szumal, Clint
Electronics Engineering Technology majors in the Industrial and Engineering Technology department at Central Michigan University have developed many real-world projects that represent the type of problem-solving projects encouraged by industry. Two projects that can be used by other educators as freestanding projects or as the core for further…
Human-centered systems : the next challenge in transportation
DOT National Transportation Integrated Search
1999-06-01
The "human-centered systems" approach focuses on human capabili : ties and limitations with respect to human/system interfaces, opera : tions, and system integration. The goal is to design transportation : systems that facilitate task completion, so ...
NASA Astrophysics Data System (ADS)
Lin, Chern-Sheng; Ho, Chien-Wa; Chang, Kai-Chieh; Hung, San-Shan; Shei, Hung-Jung; Yeh, Mau-Shiun
2006-06-01
This study describes the design and combination of an eye-controlled and a head-controlled human-machine interface system. This system is a highly effective human-machine interface, detecting head movement by changing positions and numbers of light sources on the head. When the users utilize the head-mounted display to browse a computer screen, the system will catch the images of the user's eyes with CCD cameras, which can also measure the angle and position of the light sources. In the eye-tracking system, the program in the computer will locate each center point of the pupils in the images, and record the information on moving traces and pupil diameters. In the head gesture measurement system, the user wears a double-source eyeglass frame, so the system catches images of the user's head by using a CCD camera in front of the user. The computer program will locate the center point of the head, transferring it to the screen coordinates, and then the user can control the cursor by head motions. We combine the eye-controlled and head-controlled human-machine interface system for the virtual reality applications.
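A hedged sketch of the two computations described above, locating a pupil center in an eye image and mapping a tracked point into screen coordinates, is given below; the threshold and calibration bounds are illustrative assumptions, not the study's parameters.

# Hedged sketch: centroid-based pupil localization and a linear map to screen pixels.
# Threshold and calibration bounds are illustrative assumptions.
import numpy as np

def pupil_center(gray_image, threshold=40):
    """Centroid of dark pixels; gray_image is a 2-D array of intensities 0-255."""
    ys, xs = np.nonzero(gray_image < threshold)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

def to_screen(point, cam_bounds, screen_size):
    """Linear map from camera coordinates (within calibrated bounds) to screen pixels."""
    xmin, ymin, xmax, ymax = cam_bounds
    x, y = point
    sx = (x - xmin) / (xmax - xmin) * screen_size[0]
    sy = (y - ymin) / (ymax - ymin) * screen_size[1]
    return int(sx), int(sy)

img = np.full((480, 640), 200, dtype=np.uint8)
img[230:250, 310:330] = 10                 # synthetic dark pupil region
print(to_screen(pupil_center(img), (0, 0, 640, 480), (1920, 1080)))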
Future developments in brain-machine interface research.
Lebedev, Mikhail A; Tate, Andrew J; Hanson, Timothy L; Li, Zheng; O'Doherty, Joseph E; Winans, Jesse A; Ifft, Peter J; Zhuang, Katie Z; Fitzsimmons, Nathan A; Schwarz, David A; Fuller, Andrew M; An, Je Hi; Nicolelis, Miguel A L
2011-01-01
Neuroprosthetic devices based on brain-machine interface technology hold promise for the restoration of body mobility in patients suffering from devastating motor deficits caused by brain injury, neurologic diseases and limb loss. During the last decade, considerable progress has been achieved in this multidisciplinary research, mainly in the brain-machine interface that enacts upper-limb functionality. However, a considerable number of problems need to be resolved before fully functional limb neuroprostheses can be built. To move towards developing neuroprosthetic devices for humans, brain-machine interface research has to address a number of issues related to improving the quality of neuronal recordings, achieving stable, long-term performance, and extending the brain-machine interface approach to a broad range of motor and sensory functions. Here, we review the future steps that are part of the strategic plan of the Duke University Center for Neuroengineering, and its partners, the Brazilian National Institute of Brain-Machine Interfaces and the École Polytechnique Fédérale de Lausanne (EPFL) Center for Neuroprosthetics, to bring this new technology to clinical fruition.
Bringing Control System User Interfaces to the Web
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Xihui; Kasemir, Kay
With the evolution of web based technologies, especially HTML5 [1], it becomes possible to create web-based control system user interfaces (UI) that are cross-browser and cross-device compatible. This article describes two technologies that facilitate this goal. The first one is the WebOPI [2], which can seamlessly display CSS BOY [3] Operator Interfaces (OPI) in web browsers without modification to the original OPI file. The WebOPI leverages the powerful graphical editing capabilities of BOY and provides the convenience of re-using existing OPI files. On the other hand, it uses generic JavaScript and a generic communication mechanism between the web browser and web server. It is not optimized for a control system, which results in unnecessary network traffic and resource usage. Our second technology is the WebSocket-based Process Data Access (WebPDA) [4]. It is a protocol that provides efficient control system data communication using WebSocket [5], so that users can create web-based control system UIs using standard web page technologies such as HTML, CSS and JavaScript. WebPDA is control system independent, potentially supporting any type of control system.
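To illustrate the general pattern WebPDA-style data access implies, the hedged sketch below subscribes to a process variable over a WebSocket and prints streamed updates; the endpoint URL, message fields, and JSON layout are invented for illustration and are not the actual WebPDA protocol.

# Hedged sketch: generic WebSocket subscription to a process variable, written with the
# third-party "websockets" package (pip install websockets). The endpoint, PV name, and
# message shapes are hypothetical, not the WebPDA wire format.
import asyncio
import json
import websockets

async def watch_pv(url, pv_name):
    async with websockets.connect(url) as ws:
        await ws.send(json.dumps({"type": "subscribe", "pv": pv_name}))   # assumed message shape
        async for message in ws:
            update = json.loads(message)
            print(pv_name, "=", update.get("value"), "at", update.get("time"))

if __name__ == "__main__":
    # Hypothetical endpoint and PV name for illustration only.
    asyncio.run(watch_pv("ws://localhost:8080/pda", "SR:Current"))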
Interfaces for Distributed Systems of Information Servers.
ERIC Educational Resources Information Center
Kahle, Brewster; And Others
1992-01-01
Describes two systems--Wide Area Information Servers (WAIS) and Rosebud--that provide protocol-based mechanisms for accessing remote full-text information servers. Design constraints, human interface design, and implementation are examined for five interfaces to these systems developed to run on the Macintosh or Unix terminals. Sample screen…
Zhu, Huaping; Sun, Yaoru; Zeng, Jinhua; Sun, Hongyu
2011-05-01
Previous studies have suggested that dysfunction of the human mirror neuron system (hMNS) plays an important role in autism spectrum disorder (ASD). In this work, we propose a novel training program from our interdisciplinary research to improve mirror neuron functions of autistic individuals by using a BCI system with virtual reality technology. It is a promising approach for individuals with autism to learn and develop social communication in a VR environment. A test method for this hypothesis is also provided. Copyright © 2011 Elsevier Ltd. All rights reserved.
Acquisition and production of skilled behavior in dynamic decision-making tasks
NASA Technical Reports Server (NTRS)
Kirlik, Alex
1992-01-01
Currently, two main approaches exist for improving the human-machine interface component of a system in order to improve overall system performance - display enhancement and intelligent decision making. Discussed here are the characteristic issues of these two decision-making strategies. Differences in expert and novice decision making are described in order to help determine whether a particular strategy may be better for a particular type of user. Research is outlined to compare and contrast the two technologies, as well as to examine the interaction effects introduced by the different skill levels and the different methods for training operators.
A COSTAR interface using WWW technology.
Rabbani, U.; Morgan, M.; Barnett, O.
1998-01-01
The concentration of industry on modern relational databases has left many nonrelational and proprietary databases without support for integration with new technologies. Emerging interface tools and data-access methodologies can be applied only with difficulty to medical record systems that have proprietary data representations. Users of such medical record systems usually must access their clinical content with keyboard-intensive and time-consuming interfaces. COSTAR is a legacy ambulatory medical record system developed over 25 years ago that is still popular and extensively used at the Massachusetts General Hospital. We define a model for using middle-layer services to extract and cache data from non-relational databases, and present an intuitive World-Wide Web interface to COSTAR. This model has been implemented and successfully piloted in the Internal Medicine Associates at Massachusetts General Hospital. PMID:9929310
The Virtual Tablet: Virtual Reality as a Control System
NASA Technical Reports Server (NTRS)
Chronister, Andrew
2016-01-01
In the field of human-computer interaction, Augmented Reality (AR) and Virtual Reality (VR) have been rapidly growing areas of interest and concerted development effort thanks to both private and public research. At NASA, a number of groups have explored the possibilities afforded by AR and VR technology, among which is the IT Advanced Concepts Lab (ITACL). Within ITACL, the AVR (Augmented/Virtual Reality) Lab focuses on VR technology specifically for its use in command and control. Previous work in the AVR lab includes the Natural User Interface (NUI) project and the Virtual Control Panel (VCP) project, which created virtual three-dimensional interfaces that users could interact with while wearing a VR headset thanks to body- and hand-tracking technology. The Virtual Tablet (VT) project attempts to improve on these previous efforts by incorporating a physical surrogate which is mirrored in the virtual environment, mitigating issues with difficulty of visually determining the interface location and lack of tactile feedback discovered in the development of previous efforts. The physical surrogate takes the form of a handheld sheet of acrylic glass with several infrared-range reflective markers and a sensor package attached. Using the sensor package to track orientation and a motion-capture system to track the marker positions, a model of the surrogate is placed in the virtual environment at a position which corresponds with the real-world location relative to the user's VR Head Mounted Display (HMD). A set of control mechanisms is then projected onto the surface of the surrogate such that to the user, immersed in VR, the control interface appears to be attached to the object they are holding. The VT project was taken from an early stage where the sensor package, motion-capture system, and physical surrogate had been constructed or tested individually but not yet combined or incorporated into the virtual environment. My contribution was to combine the pieces of hardware, write software to incorporate each piece of position or orientation data into a coherent description of the object's location in space, place the virtual analogue accordingly, and project the control interface onto it, resulting in a functioning object which has both a physical and a virtual presence. Additionally, the virtual environment was enhanced with two live video feeds from cameras mounted on the robotic device being used as an example target of the virtual interface. The working VT allows users to naturally interact with a control interface with little to no training and without the issues found in previous efforts.
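A hedged sketch of the data fusion described above, combining a marker-derived position with a sensor-package orientation to place the virtual tablet, follows; the marker coordinates and quaternion are dummy values, not data from the project.

# Hedged sketch: position from the motion-capture markers (centroid) plus orientation from
# the onboard sensor package (quaternion) gives a 4x4 pose for the virtual surrogate.
import numpy as np

def quat_to_matrix(q):
    """Unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q / np.linalg.norm(q)
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def tablet_pose(marker_positions, imu_quaternion):
    """Return a 4x4 homogeneous transform placing the virtual tablet."""
    T = np.eye(4)
    T[:3, :3] = quat_to_matrix(np.asarray(imu_quaternion, dtype=float))
    T[:3, 3] = np.mean(marker_positions, axis=0)    # position = marker centroid
    return T

markers = np.array([[0.10, 1.20, 0.50], [0.30, 1.20, 0.50], [0.20, 1.35, 0.50]])
print(tablet_pose(markers, (0.924, 0.0, 0.383, 0.0)))   # roughly a 45-degree rotation about y (dummy data)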
NASA Technical Reports Server (NTRS)
Bubenheim, David L.; Flynn, Michael T.; Lamparter, Richard; Bates, Maynard; Kliss, Mark (Technical Monitor)
1998-01-01
The Controlled Ecological Life Support System (CELSS) Antarctic Analog Project (CAAP) is a joint endeavor between the National Science Foundation, Office of Polar Programs (NSF-OPP), and the National Aeronautics and Space Administration (NASA). The fundamental objective is to develop, deploy, and operate a testbed of advanced life support technologies at the Amundsen-Scott South Pole Station that enable the objectives of both the NSF and NASA. The functions of food production, water purification, and waste treatment, recycle, and reduction provided by CAAP will improve the quality of life for the South Pole inhabitants, reduce logistics dependence, enhance safety, and minimize environmental impacts associated with human presence on the polar plateau. Because of the analogous technical, scientific, and mission features with Planetary missions, such as a mission to Mars, CAAP provides NASA with a method for validating technologies and overall approaches to supporting humans. Prototype systems for waste treatment, water recycle, resource recovery and crop production are being evaluated in a testbed at Ames Research Center. The combined performance of these biological and physical/chemical systems as an integrated function in support of the human habitat will be discussed. Overall system performance will be emphasized. The effectiveness and efficiency of component technologies will be discussed in the context of energy and mass flow within the system and contribution to achieving a mass and energy conservative system. Critical to the discussion are interfaces with habitat functions outside of the closed-loop life support: the ability of the system to satisfy the life support requirements of the habitat and the ability to define input requirements. The significance of analog functions in relation to future Mars habitats will be discussed.
A 2-D Interface Element for Coupled Analysis of Independently Modeled 3-D Finite Element Subdomains
NASA Technical Reports Server (NTRS)
Kandil, Osama A.
1998-01-01
Over the past few years, the development of the interface technology has provided an analysis framework for embedding detailed finite element models within finite element models which are less refined. This development has enabled the use of cascading substructure domains without the constraint of coincident nodes along substructure boundaries. The approach used for the interface element is based on an alternate variational principle often used in deriving hybrid finite elements. The resulting system of equations exhibits a high degree of sparsity but gives rise to a non-positive definite system which causes difficulties with many of the equation solvers in general-purpose finite element codes. Hence the global system of equations is generally solved using a decomposition procedure with pivoting. The research reported to date for the interface element includes the one-dimensional line interface element and the two-dimensional surface interface element. Several large-scale simulations, including geometrically nonlinear problems, have been reported using the one-dimensional interface element technology; however, only limited applications are available for the surface interface element. In the applications reported to date, the geometries of the interfaced domains exactly match each other even though the spatial discretization within each domain may be different. As such, the spatial modeling of each domain, the interface elements and the assembled system is still laborious. The present research is focused on developing a rapid modeling procedure based on a parametric interface representation of independently defined subdomains which are also independently discretized.
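The solver issue mentioned above can be made concrete with the hedged sketch below: a small symmetric but indefinite saddle-point system is rejected by Cholesky factorization yet solved by LU with partial pivoting; the matrix is illustrative, not an actual interface-element system.

# Hedged sketch: the hybrid interface formulation yields symmetric but indefinite systems,
# so a positive-definite solver (Cholesky) fails while a pivoting factorization succeeds.
import numpy as np
from scipy.linalg import cholesky, lu_factor, lu_solve, LinAlgError

# Toy saddle-point system: stiffness block K coupled to an interface constraint B.
K = np.array([[4.0, 1.0], [1.0, 3.0]])
B = np.array([[1.0, -1.0]])
A = np.block([[K, B.T], [B, np.zeros((1, 1))]])   # symmetric but indefinite
rhs = np.array([1.0, 2.0, 0.0])

try:
    cholesky(A, lower=True)                # fails: A is not positive definite
except LinAlgError as err:
    print("Cholesky rejected the interface system:", err)

x = lu_solve(lu_factor(A), rhs)            # LU with partial pivoting handles it
print(np.allclose(A @ x, rhs))             # True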
NASA Astrophysics Data System (ADS)
Kido, Michael H.; Mundt, Carsten W.; Montgomery, Kevin N.; Asquith, Adam; Goodale, David W.; Kaneshiro, Kenneth Y.
2008-10-01
Monitoring the complex environmental relationships and feedbacks of ecosystems on catchment (or mountain)-to-sea scales is essential for social systems to effectively deal with the escalating impacts of expanding human populations globally on watersheds. However, synthesis of emerging technologies into a robust observing platform for the monitoring of coupled human-natural environments on extended spatial scales has been slow to develop. For this purpose, the authors produced a new cyberinfrastructure for environmental monitoring which successfully merged the use of wireless sensor technologies, grid computing with three-dimensional (3D) geospatial data visualization/exploration, and a secured internet portal user interface, into a working prototype for monitoring mountain-to-sea environments in the high Hawaiian Islands. A use-case example is described in which native Hawaiian residents of Waipa Valley (Kauai) utilized the technology to monitor the effects of regional weather variation on surface water quality/quantity response, to better understand their local hydrologic cycle, monitor agricultural water use, and mitigate the effects of lowland flooding.
Embedded Control System for Smart Walking Assistance Device.
Bosnak, Matevz; Skrjanc, Igor
2017-03-01
This paper presents the design and implementation of a unique control system for a smart hoist, a therapeutic device that is used in rehabilitation of walking. The control system features a unique human-machine interface that allows the human to intuitively control the system just by moving or rotating their body. The paper contains an overview of the complete system, including the design and implementation of custom sensors, dc servo motor controllers, communication interfaces and the embedded-system based central control system. The prototype of the complete system was tested in a six-run experiment on 11 subjects, and the results show that the proposed control system interface is indeed intuitive and simple for the user to adopt.
Kim, Kwangtaek; Kim, Joongrock; Choi, Jaesung; Kim, Junghyun; Lee, Sangyoun
2015-01-01
Vision-based hand gesture interactions are natural and intuitive when interacting with computers, since we naturally exploit gestures to communicate with other people. However, it is agreed that users suffer from discomfort and fatigue when using gesture-controlled interfaces, due to the lack of physical feedback. To solve the problem, we propose a novel complete solution of a hand gesture control system employing immersive tactile feedback to the user's hand. For this goal, we first developed a fast and accurate hand-tracking algorithm with a Kinect sensor using the proposed MLBP (modified local binary pattern) that can efficiently analyze 3D shapes in depth images. The superiority of our tracking method was verified in terms of tracking accuracy and speed by comparing with existing methods, Natural Interaction Technology for End-user (NITE), 3D Hand Tracker and CamShift. As the second step, a new tactile feedback technology with a piezoelectric actuator has been developed and integrated into the developed hand tracking algorithm, including the DTW (dynamic time warping) gesture recognition algorithm for a complete solution of an immersive gesture control system. The quantitative and qualitative evaluations of the integrated system were conducted with human subjects, and the results demonstrate that our gesture control with tactile feedback is a promising technology compared to a vision-based gesture control system that has typically no feedback for the user's gesture inputs. Our study provides researchers and designers with informative guidelines to develop more natural gesture control systems or immersive user interfaces with haptic feedback. PMID:25580901
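As a hedged illustration of the dynamic time warping step mentioned in the abstract, the sketch below matches an observed hand trajectory against stored gesture templates; the templates and trajectory are toy data, not the study's gesture set.

# Hedged sketch: classic DTW over 2-D fingertip positions, with nearest-template recognition.
import numpy as np

def dtw_distance(a, b):
    """O(len(a)*len(b)) dynamic time warping distance between two point sequences."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

def recognize(trajectory, templates):
    """templates: dict name -> point sequence; returns the best-matching gesture name."""
    return min(templates, key=lambda name: dtw_distance(trajectory, templates[name]))

t = np.linspace(0, 1, 30)
templates = {
    "swipe_right": np.column_stack([t, np.zeros_like(t)]),
    "circle": np.column_stack([np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)]),
}
observed = np.column_stack([t * 0.9 + 0.05, 0.02 * np.sin(6 * t)])   # noisy rightward swipe
print(recognize(observed, templates))   # "swipe_right"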
Young, K L; Koppel, S; Charlton, J L
2017-09-01
Older adults are the fastest growing segment of the driving population. While there is a strong emphasis on older people maintaining their mobility, the safety of older drivers is a serious community concern. Frailty and a range of age-related sensory, cognitive, and physical impairments can place older drivers at an increased risk of crash-related injuries and death. A number of studies have indicated that in-vehicle technologies such as Advanced Driver Assistance Systems (ADAS) and In-Vehicle Information Systems (IVIS) may provide assistance to older drivers. However, these technologies will only benefit older drivers if their design is congruent with the complex needs and diverse abilities of this driving cohort. The design of ADAS and IVIS is largely informed by automotive Human Machine Interface (HMI) guidelines. However, it is unclear to what extent the declining sensory, cognitive and physical capabilities of older drivers are addressed in the current guidelines. This paper provides a review of key current design guidelines for IVIS and ADAS with respect to the extent to which they address age-related changes in functional capacities. The review revealed that most of the HMI guidelines do not address design issues related to older driver impairments. In fact, in many guidelines driver age and sensory, cognitive and physical impairments are not mentioned at all, and where reference is made, it is typically very broad. Prescriptive advice on how to actually design a system so that it addresses the needs and limitations of older drivers is not provided. In order for older drivers to reap the full benefits that in-vehicle technology can afford, it is critical that further work establish how older driver limitations and capabilities can be supported by the system design process, including their incorporation into HMI design guidelines. Copyright © 2016 Elsevier Ltd. All rights reserved.
DOT National Transportation Integrated Search
2002-10-01
The success of automation for intelligent transportation systems is ultimately contingent upon the interface between the users (humans) and the system (ITS). The issues of variable message signs (VMS) and traffic signal device (TSD) design were studi...
Perspectives on Human-Computer Interface: Introduction and Overview.
ERIC Educational Resources Information Center
Harman, Donna; Lunin, Lois F.
1992-01-01
Discusses human-computer interfaces in information seeking that focus on end users, and provides an overview of articles in this section that (1) provide librarians and information specialists with guidelines for selecting information-seeking systems; (2) provide producers of information systems with directions for production or research; and (3)…
NASA Astrophysics Data System (ADS)
Mack, Ian W.; Potts, Stephen; McMenemy, Karen R.; Ferguson, R. S.
2006-02-01
The laparoscopic technique for performing abdominal surgery requires a very high degree of skill in the medical practitioner. Much interest has been focused on using computer graphics to provide simulators for training surgeons. Unfortunately, these tend to be complex and have a very high cost, which limits availability and restricts the length of time over which individuals can practice their skills. With computer game technology able to provide the graphics required for a surgical simulator, the cost does not have to be high. However, graphics alone cannot serve as a training simulator. Human interface hardware, the equivalent of the force feedback joystick for a flight simulator game, is required to complete the system. This paper presents a design for a very low cost device to address this vital issue. The design encompasses: the mechanical construction, the electronic interfaces and the software protocols to mimic a laparoscopic surgical set-up. Thus the surgeon has the capability of practicing two-handed procedures with the possibility of force feedback. The force feedback and collision detection algorithms allow surgeons to practice realistic operating theatre procedures with a good degree of authenticity.
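A hedged sketch of the kind of collision response such a low-cost trainer could compute is given below: a spring-damper force proportional to the penetration depth of the instrument tip; the stiffness and damping values are illustrative assumptions, not properties of the described hardware.

# Hedged sketch: penalty-based contact force for a virtual instrument tip against a tissue
# plane. Stiffness and damping values are illustrative assumptions only.
import numpy as np

STIFFNESS_N_PER_M = 400.0
DAMPING_N_S_PER_M = 2.0

def feedback_force(tip_pos, tip_vel, surface_point, surface_normal):
    """Spring-damper contact force; returns a zero vector when there is no contact."""
    n = np.asarray(surface_normal, dtype=float)
    n /= np.linalg.norm(n)
    penetration = np.dot(np.asarray(surface_point, dtype=float) - np.asarray(tip_pos, dtype=float), n)
    if penetration <= 0.0:
        return np.zeros(3)
    normal_speed = np.dot(np.asarray(tip_vel, dtype=float), n)
    return (STIFFNESS_N_PER_M * penetration - DAMPING_N_S_PER_M * normal_speed) * n

# Tip 2 mm below a horizontal tissue plane, moving downward at 1 cm/s.
print(feedback_force([0, 0, -0.002], [0, 0, -0.01], [0, 0, 0.0], [0, 0, 1]))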
Emerging In Vitro Liver Technologies for Drug Metabolism and Inter-Organ Interactions
Bale, Shyam Sundhar; Moore, Laura
2016-01-01
In vitro liver models provide essential information for evaluating drug metabolism, metabolite formation, and hepatotoxicity. Interfacing liver models with other organ models could provide insights into the desirable as well as unintended systemic side effects of therapeutic agents and their metabolites. Such information is invaluable for drug screening processes particularly in the context of secondary organ toxicity. While interfacing of liver models with other organ models has been achieved, platforms that effectively provide human-relevant precise information are needed. In this concise review, we discuss the current state-of-the-art of liver-based multiorgan cell culture platforms primarily from a drug and metabolite perspective, and highlight the importance of media-to-cell ratio in interfacing liver models with other organ models. In addition, we briefly discuss issues related to development of optimal liver models that include recent advances in hepatic cell lines, stem cells, and challenges associated with primary hepatocyte-based liver models. Liver-based multiorgan models that achieve physiologically relevant coupling of different organ models can have a broad impact in evaluating drug efficacy and toxicity, as well as mechanistic investigation of human-relevant disease conditions. PMID:27049038
NASA Technical Reports Server (NTRS)
Boulanger, Richard; Overland, David
2004-01-01
Technologies that facilitate the design and control of complex, hybrid, and resource-constrained systems are examined. This paper focuses on design methodologies and system architectures, not on specific control methods that may be applied to life support subsystems. Honeywell and Boeing have estimated that 60-80% of the effort in developing complex control systems is software development, and only 20-40% is control system development. It has also been shown that large software projects have failure rates of as high as 50-65%. Concepts discussed include the Unified Modeling Language (UML) and design patterns, with the goal of creating a self-improving, self-documenting system design process. Successful architectures for control must not only facilitate hardware to software integration, but must also reconcile continuously changing software with much less frequently changing hardware. These architectures rely on software modules or components to facilitate change. Architecting such systems for change leverages the interfaces between these modules or components.
Constellation Program Human-System Integration Requirements. Revision E, Nov. 19, 2010
NASA Technical Reports Server (NTRS)
Dory, Jonathan
2010-01-01
The Human-Systems Integration Requirements (HSIR) in this document drive the design of space vehicles, their systems, and equipment with which humans interface in the Constellation Program (CxP). These requirements ensure that the design of Constellation (Cx) systems is centered on the needs, capabilities, and limitations of the human. The HSIR provides requirements to ensure proper integration of human-to-system interfaces. These requirements apply to all mission phases, including pre-launch, ascent, Earth orbit, trans-lunar flight, lunar orbit, lunar landing, lunar ascent, Earth return, Earth entry, Earth landing, post-landing, and recovery. The Constellation Program must meet NASA's Agency-level human rating requirements, which are intended to ensure crew survival without permanent disability. The HSIR provides a key mechanism for achieving human rating of Constellation systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winters, J.M.
Some background is given on the field of human factors. The nature of problems with current human/computer interfaces is discussed, some costs are identified, ideal attributes of graceful system interfaces are outlined, and some reasons are indicated why it's not easy to fix the problems. (LEW)
Exploration Life Support Technology Development for Lunar Missions
NASA Technical Reports Server (NTRS)
Ewert, Michael K.; Barta, Daniel J.; McQuillan, Jeffrey
2009-01-01
Exploration Life Support (ELS) is one of NASA's Exploration Technology Development Projects. ELS plans, coordinates and implements the development of new life support technologies for human exploration missions as outlined in NASA's Vision for Space Exploration. ELS technology development currently supports three major projects of the Constellation Program - the Orion Crew Exploration Vehicle (CEV), the Altair Lunar Lander and Lunar Surface Systems. ELS content includes Air Revitalization Systems (ARS), Water Recovery Systems (WRS), Waste Management Systems (WMS), Habitation Engineering, Systems Integration, Modeling and Analysis (SIMA), and Validation and Testing. The primary goal of the ELS project is to provide different technology options to Constellation which fill gaps or provide substantial improvements over the state-of-the-art in life support systems. Since the Constellation missions are so challenging, mass, power, and volume must be reduced from Space Shuttle and Space Station technologies. Systems engineering analysis also optimizes the overall architecture by considering all interfaces with the life support system and potential for reduction or reuse of resources. For long duration missions, technologies which aid in closure of air and water loops with increased reliability are essential as well as techniques to minimize or deal with waste. The ELS project utilizes in-house efforts at five NASA centers, aerospace industry contracts, Small Business Innovative Research contracts and other means to develop advanced life support technologies. Testing, analysis and reduced gravity flight experiments are also conducted at the NASA field centers. This paper gives a current status of technologies under development by ELS and relates them to the Constellation customers who will eventually use them.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anthony L. Crawford
Nonlinear Force Profile Used to Increase the Performance of a Haptic User Interface for Teleoperating a Robotic Hand
Natural movements and force feedback are important elements in using teleoperated equipment if complex and speedy manipulation tasks are to be accomplished in hazardous environments, such as hot cells, glove boxes, decommissioning, explosives disarmament, and space. The research associated with this paper hypothesizes that a user interface and complementary radiation-compatible robotic hand that integrates the human hand's anthropometric properties, speed capability, nonlinear strength profile, reduction of active degrees of freedom during the transition from manipulation to grasping, and just-noticeable-difference force sensation characteristics will enhance a user's teleoperation performance. The main contribution of this research is that a system concisely integrating all these factors has yet to be developed, and furthermore has yet to be applied to hazardous environments such as those referenced above. In fact, the most prominent slave manipulator teleoperation technology in use today is based on a design patented in 1945 (Patent 2632574) [1]. Robotic hand/user interface systems of similar function to the one being developed in this research limit their design input requirements, in the best case, to complementing the hand's anthropometric properties, speed capability, and a linearly scaled force application relationship (e.g., robotic force is a constant 4 times that of the user). In this paper a nonlinear relationship between the force at the user interface and the force applied by the robotic hand was devised, based on the property differences of manipulation and grasping activities as they pertain to the human hand. The results show that such a relationship, when applied to a manipulation task and a grasping task, produces increased performance compared to the traditional linear scaling techniques used by other systems. Key Words: Teleoperation, Robotic Hand, Robotic Force Scaling
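The abstract contrasts a fixed linear scaling (robot force = 4 x user force) with a nonlinear profile but does not give the profile itself, so the sketch below is only a hypothetical illustration of the idea: gentle amplification at the small forces typical of fine manipulation and stronger amplification toward grasping-level forces. The gains and transition point are invented for illustration.

    def linear_scaling(user_force_n: float, gain: float = 4.0) -> float:
        """Traditional approach: robot force is a constant multiple of user force."""
        return gain * user_force_n

    def nonlinear_scaling(user_force_n: float,
                          manipulation_gain: float = 1.5,
                          grasp_gain: float = 6.0,
                          transition_n: float = 10.0) -> float:
        """Hypothetical nonlinear profile: low gain below a transition force
        (fine manipulation), higher gain above it (power grasping), joined
        continuously so there is no step in the commanded force."""
        if user_force_n <= transition_n:
            return manipulation_gain * user_force_n
        # Above the transition, the additional force is amplified more strongly.
        return manipulation_gain * transition_n + grasp_gain * (user_force_n - transition_n)

    for f in (2.0, 10.0, 25.0):
        print(f, linear_scaling(f), nonlinear_scaling(f))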
Being human in a global age of technology.
Whelton, Beverly J B
2016-01-01
This philosophical enquiry considers the impact of a global world view and technology on the meaning of being human. The global vision increases our awareness of the common bond between all humans, while technology tends to separate us from an understanding of ourselves as human persons. We review some advances in connecting as community within our world, and many examples of technological changes. This review is not exhaustive. The focus is to understand enough changes to think through the possibility of healthcare professionals becoming cyborgs, human-machine units that are subsequently neither human nor machine. It is seen that human-technology interfaces are a different way of interacting but do not change what it is to be human in our rational capacities of providing meaningful speech and freely chosen actions. In the highly technical environment of the ICU, expert nurses work in harmony with both the technical equipment and the patient. We used Heidegger to consider the nature of equipment, and Descartes to explore unique human capacities. Aristotle, Wallace, Sokolowski, and Clarke provide a summary of humanity as substantial and relational. © 2015 John Wiley & Sons Ltd.
Extravehicular Activity and Planetary Protection
NASA Technical Reports Server (NTRS)
Buffington, J. A.; Mary, N. A.
2015-01-01
The first human mission to Mars will be the farthest distance that humans have traveled from Earth and the first human boots on Martian soil in the Exploration EVA Suit. The primary functions of the Exploration EVA Suit are to provide a habitable, anthropometric, pressurized environment for up to eight hours that allows crewmembers to perform autonomous and robotically assisted extravehicular exploration, science/research, construction, servicing, and repair operations on the exterior of the vehicle, in hazardous external conditions of the Mars local environment. The Exploration EVA Suit has the capability to structurally interface with exploration vehicles via next generation ingress/egress systems. Operational concepts and requirements are dependent on the mission profile, surface assets, and the Mars environment. This paper will discuss the effects and dependencies of the EVA system design with the local Mars environment and Planetary Protection. Of the three study areas listed for the workshop, EVA identifies most strongly with technology and operations for contamination control.
Deng, Li; Wang, Guohua; Yu, Suihuai
2016-01-01
In order to account for the psychological cognitive characteristics affecting operating comfort and to realize automatic layout design, cognitive ergonomics and GA-ACA (genetic algorithm and ant colony algorithm) were introduced into the layout design of human-machine interaction interfaces. First, from the perspective of cognitive psychology and according to the information processing process, a cognitive model of the human-machine interaction interface was established. Then, human cognitive characteristics were analyzed, and the layout principles of human-machine interaction interfaces were summarized as the constraints in layout design. Next, the expression forms of the fitness function, pheromone, and heuristic information for the layout optimization of the cabin were studied, and a layout design model of the human-machine interaction interface was established based on GA-ACA. Finally, a layout design system was developed based on this model. For validation, the human-machine interaction interface layout design of a drilling rig control room was taken as an example, and the optimization result showed the feasibility and effectiveness of the proposed method. PMID:26884745
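The abstract names a fitness function built from cognitive-ergonomics layout principles and a GA-ACA search without giving its form, so the fragment below only illustrates the general shape of such a fitness evaluation. The principle weights, spacing threshold, and element names are hypothetical, not the paper's formulation.

    import math

    # A layout maps each control/display element to an (x, y) panel position in [0, 1].
    def fitness(layout: dict, importance: dict, use_frequency: dict) -> float:
        """Hypothetical fitness: reward placing important and frequently used
        elements near the panel centre, penalise elements that overlap."""
        score = 0.0
        centre = (0.5, 0.5)
        for name, (x, y) in layout.items():
            dist = math.hypot(x - centre[0], y - centre[1])
            score += (importance.get(name, 1.0) + use_frequency.get(name, 1.0)) * (1.0 - dist)
        # Overlap penalty: elements closer than a minimum spacing.
        items = list(layout.items())
        for i in range(len(items)):
            for j in range(i + 1, len(items)):
                (_, p), (_, q) = items[i], items[j]
                if math.hypot(p[0] - q[0], p[1] - q[1]) < 0.1:
                    score -= 5.0
        return score

    example = {"emergency_stop": (0.5, 0.4), "rpm_gauge": (0.6, 0.5), "mud_pump": (0.2, 0.8)}
    print(fitness(example, {"emergency_stop": 3.0}, {"rpm_gauge": 2.0}))

In a GA-ACA hybrid, a function like this would score each candidate layout during selection, while pheromone and heuristic information bias how new candidates are constructed.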
Human factors issues in telerobotic systems for Space Station Freedom servicing
NASA Technical Reports Server (NTRS)
Malone, Thomas B.; Permenter, Kathryn E.
1990-01-01
Requirements for Space Station Freedom servicing are described and the state-of-the-art for telerobotic system on-orbit servicing of spacecraft is defined. The projected requirements for the Space Station Flight Telerobotic Servicer (FTS) are identified. Finally, the human factors issues in telerobotic servicing are discussed. The human factors issues are basically three: the definition of the role of the human versus automation in system control; the identification of operator-device interface design requirements; and the requirements for development of an operator-machine interface simulation capability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kovesdi, C.; Spielman, Z.; LeBlanc, K.
An important element of human factors engineering (HFE) pertains to measurement and evaluation (M&E). The role of HFE-M&E should be integrated throughout the entire control room modernization (CRM) process and be used for human-system performance evaluation and for diagnostic purposes in resolving potential human engineering deficiencies (HEDs) and other human-machine interface (HMI) design issues. NUREG-0711 describes how HFE in CRM should employ a hierarchical set of measures, particularly during integrated system validation (ISV), including plant performance, personnel task performance, situation awareness, cognitive workload, and anthropometric/physiological factors. Historically, subjective measures have been used primarily because they are easier to collect and do not require specialized equipment. However, there are pitfalls in relying solely on subjective measures in M&E that can negatively impact reliability, sensitivity, and objectivity. As part of comprehensively capturing a diverse set of measures that strengthen findings and inferences about the benefits of emerging technologies like advanced displays, this paper discusses the value of using eye tracking as an objective method in M&E. A brief description of eye tracking technology and relevant eye tracking measures is provided. Additionally, technical considerations and the unique challenges of using eye tracking in full-scale simulations are addressed. Finally, this paper shares preliminary findings regarding the use of a wearable eye tracking system in a full-scale simulator study. These findings should help guide future full-scale simulator studies using eye tracking as a methodology to evaluate human-system performance.
Interface Management for a NASA Flight Project Using Model-Based Systems Engineering (MBSE)
NASA Technical Reports Server (NTRS)
Vipavetz, Kevin; Shull, Thomas A.; Infeld, Samatha; Price, Jim
2016-01-01
The goal of interface management is to identify, define, control, and verify interfaces; ensure compatibility; and provide for efficient system development on time and within budget while meeting stakeholder requirements. This paper will present a successful seven-step approach to interface management used in several NASA flight projects. The seven-step approach using Model-Based Systems Engineering will be illustrated by interface examples from the Materials International Space Station Experiment-X (MISSE-X) project. MISSE-X was being developed as an International Space Station (ISS) external platform for space environmental studies, designed to advance the technology readiness of materials and devices critical for future space exploration. Emphasis will be given to best practices covering key areas such as interface definition, writing good interface requirements, utilizing interface working groups, developing and controlling interface documents, handling interface agreements, the use of shadow documents, the importance of interface requirement ownership, interface verification, and product transition.
ERIC Educational Resources Information Center
Presperin, Jessica J., Ed.
This proceedings document contains approximately 250 papers and posters presented at a conference on the advancement of rehabilitation and assistive technology. Individual sessions focused on the following topics: quantitative functional evaluation, upper limb and therapeutic stimulation, human-computer interface developments, information…
Simulation of the human-telerobot interface on the Space Station
NASA Technical Reports Server (NTRS)
Stuart, Mark A.; Smith, Randy L.
1993-01-01
Many issues remain unresolved concerning the components of the human-telerobot interface presented in this work. It is critical that these components be optimally designed and arranged to ensure not only that the overall system's goals are met, but also that the intended end-user has been optimally accommodated. With sufficient testing and evaluation throughout the development cycle, the selection of the components to use in the final telerobotic system can promote efficient, error-free performance. It is recommended that whole-system simulation with full-scale mockups be used to help design the human-telerobot interface. It is contended that the use of simulation can facilitate this design and evaluation process.
A Framework and Implementation of User Interface and Human-Computer Interaction Instruction
ERIC Educational Resources Information Center
Peslak, Alan
2005-01-01
Researchers have suggested that up to 50% of the effort in development of information systems is devoted to user interface development (Douglas, Tremaine, Leventhal, Wills, & Manaris, 2002; Myers & Rosson, 1992). Yet little study has been performed on the inclusion of important interface and human-computer interaction topics into a current…
NASA Astrophysics Data System (ADS)
Schieber, Marc H.
2016-07-01
Control of the human hand has been both difficult to understand scientifically and difficult to emulate technologically. The article by Santello and colleagues in the current issue of Physics of Life Reviews[1] highlights the accelerating pace of interaction between the neuroscience of controlling body movement and the engineering of robotic hands that can be used either autonomously or as part of a motor neuroprosthesis, an artificial body part that moves under control from a human subject's own nervous system. Motor neuroprostheses typically involve a brain-computer interface (BCI) that takes signals from the subject's nervous system or muscles, interprets those signals through a decoding algorithm, and then applies the resulting output to control the artificial device.
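As a concrete illustration of the decoding step described above, here is a minimal sketch of one common approach: a linear (ridge-regression) map from binned neural firing rates to intended hand velocity. It is illustrative only, uses synthetic data, and is not the method of the reviewed article.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic training data: firing rates of 50 neurons over 1000 time bins,
    # paired with the 2-D hand velocity recorded in each bin.
    rates = rng.poisson(5.0, size=(1000, 50)).astype(float)
    true_weights = rng.normal(size=(50, 2))
    velocity = rates @ true_weights + rng.normal(scale=0.5, size=(1000, 2))

    # Fit a ridge-regression decoder: velocity ~= rates @ W.
    lam = 1.0
    W = np.linalg.solve(rates.T @ rates + lam * np.eye(50), rates.T @ velocity)

    # Decode a new bin of activity into a velocity command for the prosthesis.
    new_rates = rng.poisson(5.0, size=(1, 50)).astype(float)
    print("decoded velocity:", new_rates @ W)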
Language evolution and human-computer interaction
NASA Technical Reports Server (NTRS)
Grudin, Jonathan; Norman, Donald A.
1991-01-01
Many of the issues that confront designers of interactive computer systems also appear in natural language evolution. Natural languages and human-computer interfaces share as their primary mission the support of extended 'dialogues' between responsive entities. Because in each case one participant is a human being, some of the pressures operating on natural languages, causing them to evolve in order to better support such dialogue, also operate on human-computer 'languages' or interfaces. This does not necessarily push interfaces in the direction of natural language - since one entity in this dialogue is not a human, this is not to be expected. Nonetheless, by discerning where the pressures that guide natural language evolution also appear in human-computer interaction, we can contribute to the design of computer systems and obtain a new perspective on natural languages.
Multi-interface level in oil tanks and applications of optical fiber sensors
NASA Astrophysics Data System (ADS)
Leal-Junior, Arnaldo G.; Marques, Carlos; Frizera, Anselmo; Pontes, Maria José
2018-01-01
Oil production also involves the production of water, gas, and suspended solids, which are separated from the oil in three-phase separators. However, the control strategies for an oil separator are limited due to the unavailability of suitable multi-interface level sensors. This paper presents a description of the multi-phase level problem in the oil industry and a review of the current technologies for multi-interface level assessment. Since optical fiber sensors offer chemical stability, intrinsic safety, electromagnetic immunity, light weight, and multiplexing capabilities, they can be an alternative for multi-interface level measurement that overcomes some of the limitations of the current technologies. For this reason, a Fiber Bragg Grating (FBG)-based optical fiber sensor system for multi-interface level assessment is proposed, simulated, and experimentally assessed. The results show that the proposed sensor system is capable of measuring interface level with a relative error of only 2.38%. Furthermore, the proposed sensor system is also capable of measuring the oil density with an error of 0.8 kg/m3.
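The abstract does not describe the measurement chain in detail, so the sketch below only shows one plausible way an FBG array could be read out: each grating's wavelength shift is converted to the local fluid density through a linear calibration, and the oil/water interface is taken as the height where the density profile crosses the midpoint between the two fluids. The sensor heights, shifts, and calibration constants are hypothetical.

    # Heights of the FBG sensors along the separator wall (m) and their measured
    # Bragg wavelength shifts (pm); hypothetical values.
    heights_m = [0.1, 0.3, 0.5, 0.7, 0.9]
    shifts_pm = [120.0, 118.0, 116.0, 64.0, 60.0]

    # Hypothetical linear calibration: density (kg/m^3) = a * shift + b.
    a, b = 1.5, 820.0
    densities = [a * s + b for s in shifts_pm]

    rho_water, rho_oil = 1000.0, 900.0
    threshold = 0.5 * (rho_water + rho_oil)

    # Scan upward and interpolate the height where density drops below the threshold.
    interface = None
    for (h0, d0), (h1, d1) in zip(zip(heights_m, densities),
                                  zip(heights_m[1:], densities[1:])):
        if d0 >= threshold > d1:
            interface = h0 + (d0 - threshold) / (d0 - d1) * (h1 - h0)
            break
    print("estimated oil/water interface height (m):", interface)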
Real-World Neuroimaging Technologies
2013-05-10
The system enables long-term wear of up to 10 consecutive hours of operation. Its wireless technologies, light weight (200 g), and dry sensors support monitoring of brain activity in real-world scenarios. Index terms: behavioral science, biomarkers, body sensor networks, brain-computer interfaces, data acquisition, electroencephalography monitoring.
NASA Technical Reports Server (NTRS)
2003-01-01
Each year, health care costs for managing chronically ill patients increase as the life expectancy of Americans continues to grow. To handle this situation, many hospitals, doctors' practices, and home care providers are turning to disease management, a system of coordinated health care interventions and communications, to improve outpatient care. By participating in daily monitoring programs, patients with congestive heart failure, chronic obstructive pulmonary disease, diabetes, and other chronic conditions requiring significant self-care are facing fewer emergency situations and hospitalizations. Cybernet Medical, a division of Ann Arbor, Michigan-based Cybernet Systems Corporation, is using the latest communications technology to augment the ways health care professionals monitor and assess patients with chronic diseases, while at the same time simplifying the patient's interaction with technology. Cybernet's newest commercial product for this purpose evolved from research funded by NASA, the National Institute of Mental Health, and the Advanced Research Projects Agency. The research focused on the physiological assessment of astronauts and soldiers, human performance evaluation, and human-computer interaction. Cybernet Medical's MedStar Disease Management Data Collection System is an affordable, widely deployable solution for improving in-home patient chronic disease management. The system's battery-powered and portable interface device collects physiological data from off-the-shelf instruments.
Phan, Duc Tt; Bender, R Hugh F; Andrejecsk, Jillian W; Sobrino, Agua; Hachey, Stephanie J; George, Steven C; Hughes, Christopher Cw
2017-11-01
The blood-brain barrier is a dynamic and highly organized structure that strictly regulates the molecules allowed to cross the brain vasculature into the central nervous system. Blood-brain barrier pathology has been associated with a number of central nervous system diseases, including vascular malformations, stroke/vascular dementia, Alzheimer's disease, multiple sclerosis, and various neurological tumors including glioblastoma multiforme. There is a compelling need for representative models of this critical interface. Current research relies heavily on animal models (mostly mice) or on two-dimensional (2D) in vitro models, neither of which fully capture the complexities of the human blood-brain barrier. Physiological differences between humans and mice make translation to the clinic problematic, while monolayer cultures cannot capture the inherently three-dimensional (3D) nature of the blood-brain barrier, which includes close association of the abluminal side of the endothelium with astrocyte foot-processes and pericytes. Here we discuss the central nervous system diseases associated with blood-brain barrier pathology and recent advances in the development of novel 3D blood-brain barrier-on-a-chip systems that better mimic the physiological complexity and structure of the human blood-brain barrier, and we provide an outlook on how these blood-brain barrier-on-a-chip systems can be used for central nervous system disease modeling. Impact statement: The field of microphysiological systems is rapidly evolving as new technologies are introduced and our understanding of organ physiology develops. In this review, we focus on blood-brain barrier (BBB) models, with a particular emphasis on how they relate to neurological disorders such as Alzheimer's disease, multiple sclerosis, stroke, cancer, and vascular malformations. We emphasize the importance of capturing the three-dimensional nature of the brain and the unique architecture of the BBB - something that until recently had not been well modeled by in vitro systems. Our hope is that this review will provide a launch pad for new ideas and methodologies that can provide us with truly physiological BBB models capable of yielding new insights into the function of this critical interface.
The UMLS Knowledge Source Server: an experience in Web 2.0 technologies.
Thorn, Karen E; Bangalore, Anantha K; Browne, Allen C
2007-10-11
The UMLS Knowledge Source Server (UMLSKS), developed at the National Library of Medicine (NLM), makes the knowledge sources of the Unified Medical Language System (UMLS) available to the research community over the Internet. In 2003, the UMLSKS was redesigned utilizing state-of-the-art technologies available at that time. That design offered a significant improvement over the prior version but presented a set of technology-dependent issues that limited its functionality and usability. Four areas of desired improvement were identified: software interfaces, web interface content, system maintenance/deployment, and user authentication. By employing next generation web technologies, newer authentication paradigms and further refinements in modular design methods, these areas could be addressed and corrected to meet the ever increasing needs of UMLSKS developers. In this paper we detail the issues present with the existing system and describe the new system's design using new technologies considered entrants in the Web 2.0 development era.
Drajsajtl, Tomáš; Struk, Petr; Bednárová, Alice
2013-01-01
AsTeRICS - "The Assistive Technology Rapid Integration & Construction Set" is a construction set for assistive technologies which can be adapted to the motor abilities of end-users. AsTeRICS allows access to different devices such as PCs, cell phones and smart home devices, with all of them integrated in a platform adapted as much as possible to each user. People with motor disabilities in the upper limbs, with no cognitive impairment, no perceptual limitations (neither visual nor auditory) and with basic skills in using technologies such as PCs, cell phones, electronic agendas, etc. have available a flexible and adaptable technology which enables them to access the Human-Machine-Interfaces (HMI) on the standard desktop and beyond. AsTeRICS provides graphical model design tools, a middleware and hardware support for the creation of tailored AT-solutions involving bioelectric signal acquisition, Brain-/Neural Computer Interfaces, Computer-Vision techniques and standardized actuator and device controls and allows combining several off-the-shelf AT-devices in every desired combination. Novel, end-user ready solutions can be created and adapted via a graphical editor without additional programming efforts. The AsTeRICS open-source framework provides resources for utilization and extension of the system to developers and researches. AsTeRICS was developed by the AsTeRICS project and was partially funded by EC.
Techniques and applications for binaural sound manipulation in human-machine interfaces
NASA Technical Reports Server (NTRS)
Begault, Durand R.; Wenzel, Elizabeth M.
1990-01-01
The implementation of binaural sound to speech and auditory sound cues (auditory icons) is addressed from both an applications and technical standpoint. Techniques overviewed include processing by means of filtering with head-related transfer functions. Application to advanced cockpit human interface systems is discussed, although the techniques are extendable to any human-machine interface. Research issues pertaining to three-dimensional sound displays under investigation at the Aerospace Human Factors Division at NASA Ames Research Center are described.
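A minimal sketch of the HRTF filtering mentioned above: a mono cue is convolved with left- and right-ear head-related impulse responses (HRIRs) to produce a binaural signal. The HRIR arrays here are placeholder noise; in practice they would come from a measured HRTF data set for the desired azimuth and elevation.

    import numpy as np
    from scipy.signal import fftconvolve

    fs = 44100
    t = np.arange(0, 0.25, 1.0 / fs)
    mono_cue = 0.5 * np.sin(2 * np.pi * 880 * t)          # simple auditory icon

    # Placeholder HRIRs; real ones would be measured for the target direction.
    rng = np.random.default_rng(1)
    hrir_left = rng.normal(scale=0.1, size=256)
    hrir_right = rng.normal(scale=0.05, size=256)

    # Convolve the mono cue with each ear's impulse response.
    left = fftconvolve(mono_cue, hrir_left)
    right = fftconvolve(mono_cue, hrir_right)
    binaural = np.stack([left, right], axis=1)            # shape: (samples, 2)
    print(binaural.shape)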
Techniques and applications for binaural sound manipulation in human-machine interfaces
NASA Technical Reports Server (NTRS)
Begault, Durand R.; Wenzel, Elizabeth M.
1992-01-01
The implementation of binaural sound to speech and auditory sound cues (auditory icons) is addressed from both an applications and technical standpoint. Techniques overviewed include processing by means of filtering with head-related transfer functions. Application to advanced cockpit human interface systems is discussed, although the techniques are extendable to any human-machine interface. Research issues pertaining to three-dimensional sound displays under investigation at the Aerospace Human Factors Division at NASA Ames Research Center are described.
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Schrenkenghost, Debra K.
2001-01-01
The Adjustable Autonomy Testbed (AAT) is a simulation-based testbed located in the Intelligent Systems Laboratory in the Automation, Robotics and Simulation Division at NASA Johnson Space Center. The purpose of the testbed is to support evaluation and validation of prototypes of adjustable autonomous agent software for control and fault management for complex systems. The AAT project has developed prototype adjustable autonomous agent software and human interfaces for cooperative fault management. This software builds on current autonomous agent technology by altering the architecture, components and interfaces for effective teamwork between autonomous systems and human experts. Autonomous agents include a planner, flexible executive, low level control and deductive model-based fault isolation. Adjustable autonomy is intended to increase the flexibility and effectiveness of fault management with an autonomous system. The test domain for this work is control of advanced life support systems for habitats for planetary exploration. The CONFIG hybrid discrete event simulation environment provides flexible and dynamically reconfigurable models of the behavior of components and fluids in the life support systems. Both discrete event and continuous (discrete time) simulation are supported, and flows and pressures are computed globally. This provides fast dynamic simulations of interacting hardware systems in closed loops that can be reconfigured during operations scenarios, producing complex cascading effects of operations and failures. Current object-oriented model libraries support modeling of fluid systems, and models have been developed of physico-chemical and biological subsystems for processing advanced life support gases. In FY01, water recovery system models will be developed.
NASA Astrophysics Data System (ADS)
Shirakawa, Tomohiro; Sato, Hiroshi; Imao, Tomoya
2017-07-01
Recently, a variety of user interfaces have been developed based on human-robot and human-agent interaction, and anthropomorphic agents are used as one type of interface. However, anthropomorphic agents have so far been applied mainly in the medical and cognitive sciences, and there are few studies of their application to other fields. Therefore, we used an MMD anthropomorphic agent in a virtual lecture to analyze the effect of gestures on students and to search for ways to apply anthropomorphic agents to the field of educational technology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eric A. Wernert; William R. Sherman; Patrick O'Leary
Immersive visualization makes use of the medium of virtual reality (VR) - it is a subset of virtual reality focused on the application of VR technologies to scientific and information visualization. As the name implies, there is a particular focus on the physically immersive aspect of VR that more fully engages the perceptual and kinesthetic capabilities of the scientist with the goal of producing greater insight. The immersive visualization community is uniquely positioned to address the analysis needs of the wide spectrum of domain scientists who are becoming increasingly overwhelmed by data. The outputs of computational science simulations and high-resolution sensors are creating a data deluge. Data is coming in faster than it can be analyzed, and there are countless opportunities for discovery that are missed as the data speeds by. By more fully utilizing the scientist's visual and other sensory systems, and by offering a more natural user interface with which to interact with computer-generated representations, immersive visualization offers great promise in taming this data torrent. However, increasing the adoption of immersive visualization in scientific research communities can only happen by simultaneously lowering the engagement threshold while raising the measurable benefits of adoption. Scientists' time spent immersed with their data will thus be rewarded with higher productivity, deeper insight, and improved creativity. Immersive visualization ties together technologies and methodologies from a variety of related but frequently disjoint areas, including hardware, software and human-computer interaction (HCI) disciplines. In many ways, hardware is a solved problem. There are well-established technologies including large walk-in systems such as the CAVE™ and head-based systems such as the Wide-5™. The advent of new consumer-level technologies now enables an entirely new generation of immersive displays, with smaller footprints and costs, widening the potential consumer base. While one would be hard-pressed to call software a solved problem, we now understand considerably more about best practices for designing and developing sustainable, scalable software systems, and we have useful software examples that illuminate the way to even better implementations. As with any research endeavour, HCI will always be exploring new topics in interface design, but we now have a sizable knowledge base of the strengths and weaknesses of the human perceptual systems and we know how to design effective interfaces for immersive systems. So, in a research landscape with a clear need for better visualization and analysis tools, a methodology in immersive visualization that has been shown to effectively address some of those needs, and vastly improved supporting technologies and knowledge of hardware, software, and HCI, why hasn't immersive visualization 'caught on' more with scientists? What can we do as a community of immersive visualization researchers and practitioners to facilitate greater adoption by scientific communities so as to make the transition from 'the promise of virtual reality' to 'the reality of virtual reality'?
Man-machine interface issues in space telerobotics: A JPL research and development program
NASA Technical Reports Server (NTRS)
Bejczy, A. K.
1987-01-01
Technology issues related to the use of robots as man-extension or telerobot systems in space are discussed and exemplified. General considerations are presented on control and information problems in space teleoperation and on the characteristics of Earth orbital teleoperation. The JPL R and D work in the area of man-machine interface devices and techniques for sensing and computer-based control is briefly summarized. The thrust of this R and D effort is to render space teleoperation efficient and safe through the use of devices and techniques which will permit integrated and task-level (intelligent) two-way control communication between human operator and telerobot machine in Earth orbit. Specific control and information display devices and techniques are discussed and exemplified with development results obtained at JPL in recent years.
Intelligent systems technology infrastructure for integrated systems
NASA Technical Reports Server (NTRS)
Lum, Henry, Jr.
1991-01-01
Significant advances have occurred during the last decade in intelligent systems technologies (a.k.a. knowledge-based systems, KBS) including research, feasibility demonstrations, and technology implementations in operational environments. Evaluation and simulation data obtained to date in real-time operational environments suggest that cost-effective utilization of intelligent systems technologies can be realized for Automated Rendezvous and Capture applications. The successful implementation of these technologies involve a complex system infrastructure integrating the requirements of transportation, vehicle checkout and health management, and communication systems without compromise to systems reliability and performance. The resources that must be invoked to accomplish these tasks include remote ground operations and control, built-in system fault management and control, and intelligent robotics. To ensure long-term evolution and integration of new validated technologies over the lifetime of the vehicle, system interfaces must also be addressed and integrated into the overall system interface requirements. An approach for defining and evaluating the system infrastructures including the testbed currently being used to support the on-going evaluations for the evolutionary Space Station Freedom Data Management System is presented and discussed. Intelligent system technologies discussed include artificial intelligence (real-time replanning and scheduling), high performance computational elements (parallel processors, photonic processors, and neural networks), real-time fault management and control, and system software development tools for rapid prototyping capabilities.
Interaction design challenges and solutions for ALMA operations monitoring and control
NASA Astrophysics Data System (ADS)
Pietriga, Emmanuel; Cubaud, Pierre; Schwarz, Joseph; Primet, Romain; Schilling, Marcus; Barkats, Denis; Barrios, Emilio; Vila Vilaro, Baltasar
2012-09-01
The ALMA radio-telescope, currently under construction in northern Chile, is a very advanced instrument that presents numerous challenges. From a software perspective, one critical issue is the design of graphical user interfaces for operations monitoring and control that scale to the complexity of the system and to the massive amounts of data users are faced with. Early experience operating the telescope with only a few antennas has shown that conventional user interface technologies are not adequate in this context. They consume too much screen real-estate, require many unnecessary interactions to access relevant information, and fail to provide operators and astronomers with a clear mental map of the instrument. They increase extraneous cognitive load, impeding tasks that call for quick diagnosis and action. To address this challenge, the ALMA software division adopted a user-centered design approach. For the last two years, astronomers, operators, software engineers and human-computer interaction researchers have been involved in participatory design workshops, with the aim of designing better user interfaces based on state-of-the-art visualization techniques. This paper describes the process that led to the development of those interface components and to a proposal for the science and operations console setup: brainstorming sessions, rapid prototyping, joint implementation work involving software engineers and human-computer interaction researchers, feedback collection from a broader range of users, further iterations and testing.
The Berlin Brain–Computer Interface: Non-Medical Uses of BCI Technology
Blankertz, Benjamin; Tangermann, Michael; Vidaurre, Carmen; Fazli, Siamac; Sannelli, Claudia; Haufe, Stefan; Maeder, Cecilia; Ramsey, Lenny; Sturm, Irene; Curio, Gabriel; Müller, Klaus-Robert
2010-01-01
Brain–computer interfacing (BCI) is a steadily growing area of research. While initially BCI research was focused on applications for paralyzed patients, increasingly more alternative applications in healthy human subjects are proposed and investigated. In particular, monitoring of mental states and decoding of covert user states have seen a strong rise of interest. Here, we present some examples of such novel applications which provide evidence for the promising potential of BCI technology for non-medical uses. Furthermore, we discuss distinct methodological improvements required to bring non-medical applications of BCI technology to a diversity of layperson target groups, e.g., ease of use, minimal training, general usability, short control latencies. PMID:21165175
Speech and gesture interfaces for squad-level human-robot teaming
NASA Astrophysics Data System (ADS)
Harris, Jonathan; Barber, Daniel
2014-06-01
As the military increasingly adopts semi-autonomous unmanned systems for military operations, utilizing redundant and intuitive interfaces for communication between Soldiers and robots is vital to mission success. Currently, Soldiers use a common lexicon to verbally and visually communicate maneuvers between teammates. In order for robots to be seamlessly integrated within mixed-initiative teams, they must be able to understand this lexicon. Recent innovations in gaming platforms have led to advancements in speech and gesture recognition technologies, but the reliability of these technologies for enabling communication in human-robot teaming is unclear. The purpose of the present study is to investigate the performance of Commercial-Off-The-Shelf (COTS) speech and gesture recognition tools in classifying a Squad Level Vocabulary (SLV) for a spatial navigation reconnaissance and surveillance task. The SLV for this study was based on findings from a survey conducted with Soldiers at Fort Benning, GA. The items of the survey focused on the communication between the Soldier and the robot, specifically with regard to verbally instructing the robot to execute reconnaissance and surveillance tasks. Resulting commands, identified from the survey, were then converted to equivalent arm and hand gestures, leveraging existing visual signals (e.g. U.S. Army Field Manual for Visual Signaling). A study was then run to test the ability of commercially available automated speech recognition technologies and a gesture recognition glove to classify these commands in a simulated intelligence, surveillance, and reconnaissance task. This paper presents classification accuracy of these devices for both speech and gesture modalities independently.
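The study itself used COTS recognizers; the fragment below only sketches how a recognized transcript might be mapped onto a small set of canonical squad-level commands with keyword matching, and how classification accuracy could then be tallied. The vocabulary shown is illustrative, not the SLV derived from the Fort Benning survey.

    # Illustrative keyword phrase -> canonical squad-level command.
    COMMANDS = {
        "move out": "ADVANCE",
        "hold position": "HALT",
        "scan left": "SURVEIL_LEFT",
        "scan right": "SURVEIL_RIGHT",
        "rally on me": "RALLY",
    }

    def classify(transcript: str) -> str:
        """Return the canonical command whose keyword phrase appears in the
        recognizer output, or UNKNOWN if none match."""
        text = transcript.lower()
        for phrase, command in COMMANDS.items():
            if phrase in text:
                return command
        return "UNKNOWN"

    # Tally classification accuracy against labelled recognizer outputs.
    trials = [("robot move out now", "ADVANCE"), ("please hold position", "HALT"),
              ("scan right side", "SURVEIL_RIGHT"), ("uh rally on me", "RALLY")]
    correct = sum(classify(heard) == truth for heard, truth in trials)
    print(f"accuracy: {correct / len(trials):.0%}")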
Rationale and Roadmap for Moon Exploration
NASA Astrophysics Data System (ADS)
Foing, B. H.; ILEWG Team
We discuss the different rationales for Moon exploration. This starts with areas of scientific investigation: clues on the formation and evolution of rocky planets, accretion and bombardment in the inner solar system, comparative planetology processes (tectonic, volcanic, impact cratering, volatile delivery), astrobiology records, survival of organics, and past, present, and future life. The rationale also includes the advancement of instrumentation: remote sensing miniaturised instruments; surface geophysical and geochemistry packages; instrument deployment and robotic arms, nano-rovers, sampling, drilling; sample finders and collectors. There are technologies in robotic and human exploration that are a driver for the creativity and economic competitiveness of our industries: mecha-electronics and sensors; telecontrol, telepresence, virtual reality; regional mobility rovers; autonomy and navigation; artificially intelligent robots, complex systems, man-machine interfaces and performance. Moon-Mars exploration can also inspire solutions for sustainable development on Earth: in-situ utilisation of resources; establishment of permanent robotic infrastructures; environmental protection aspects; life sciences laboratories; support to human exploration. We also report on the IAA Cosmic Study on Next Steps In Exploring Deep Space, ongoing IAA Cosmic Studies, and ongoing ILEWG/IMEWG activities, and we finally discuss possible roadmaps for robotic and human exploration, starting with the Moon-Mars missions for the coming decade and building effectively on joint technology developments.
Adding a Visualization Feature to Web Search Engines: It’s Time
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wong, Pak C.
Since the first world wide web (WWW) search engine quietly entered our lives in 1994, the "information need" behind web searching has rapidly grown into a multi-billion dollar business that dominates the internet landscape, drives e-commerce traffic, propels the global economy, and affects the lives of the whole human race. Today's search engines are faster, smarter, and more powerful than those released just a few years ago. With the vast investment pouring into research and development by leading web technology providers and the intense emotion behind corporate slogans such as "win the web" or "take back the web," I can't help but ask why we are still using the very same "text-only" interface that was used 13 years ago to browse our search engine results pages (SERPs). Why has the SERP interface technology lagged so far behind in the web evolution when the corresponding search technology has advanced so rapidly? In this article I explore some current SERP interface issues, suggest a simple but practical visual-based interface design approach, and argue why a visual approach can be a strong candidate for tomorrow's SERP interface.
Development and evaluation of nursing user interface screens using multiple methods.
Hyun, Sookyung; Johnson, Stephen B; Stetson, Peter D; Bakken, Suzanne
2009-12-01
Building upon the foundation of the Structured Narrative Electronic Health Record (EHR) model, we applied theory-based (combined Technology Acceptance Model and Task-Technology Fit Model) and user-centered methods to explore nurses' perceptions of functional requirements for an electronic nursing documentation system, design user interface screens reflective of the nurses' perspectives, and assess nurses' perceptions of the usability of the prototype user interface screens. The methods resulted in user interface screens that were perceived to be easy to use, potentially useful, and well-matched to nursing documentation tasks associated with Nursing Admission Assessment, Blood Administration, and Nursing Discharge Summary. The methods applied in this research may serve as a guide for others wishing to implement user-centered processes to develop or extend EHR systems. In addition, some of the insights obtained in this study may be informative to the development of safe and efficient user interface screens for nursing document templates in EHRs.
Independent Verification and Validation of Complex User Interfaces: A Human Factors Approach
NASA Technical Reports Server (NTRS)
Whitmore, Mihriban; Berman, Andrea; Chmielewski, Cynthia
1996-01-01
The Usability Testing and Analysis Facility (UTAF) at the NASA Johnson Space Center has identified and evaluated a potential automated software interface inspection tool capable of assessing the degree to which space-related critical and high-risk software system user interfaces meet objective human factors standards across each NASA program and project. Testing consisted of two distinct phases. Phase 1 compared analysis times and similarity of results for the automated tool and for human-computer interface (HCI) experts. In Phase 2, HCI experts critiqued the prototype tool's user interface. Based on this evaluation, it appears that a more fully developed version of the tool will be a promising complement to a human factors-oriented independent verification and validation (IV&V) process.
Applying Cognitive Psychology to User Interfaces
NASA Astrophysics Data System (ADS)
Durrani, Sabeen; Durrani, Qaiser S.
This paper explores some key aspects of cognitive psychology that may be mapped onto user interfaces. The major focus in existing user interface guidelines is on consistency, simplicity, feedback, system messages, display issues, navigation, colors, graphics, visibility and error prevention [8-10]. These guidelines are effective in designing user interfaces. However, these guidelines do not handle the issues that may arise due to the innate structure of the human brain and human limitations. For example, where to place graphics on the screen so that the user can easily process them, and what kind of background should be given on the screen according to the limitations of the human motor system. In this paper we have collected some available guidelines from the area of cognitive psychology [1, 5, 7]. In addition, we have extracted a few guidelines from theories and studies of cognitive psychology [3, 11] which may be mapped to user interfaces.
Flying Unmanned Aircraft: A Pilot's Perspective
NASA Technical Reports Server (NTRS)
Pestana, Mark E.
2011-01-01
The National Aeronautics and Space Administration (NASA) is pioneering various Unmanned Aircraft System (UAS) technologies and procedures which may enable routine access to the National Airspace System (NAS), with an aim for the Next Gen NAS. These tools will aid in the development of technologies and integrated capabilities that will enable high value missions for science, security, and defense, and open the door to low-cost, extreme-duration, stratospheric flight. A century of aviation evolution has resulted in accepted standards and best practices in the design of human-machine interfaces, the displays and controls of which serve to optimize safe and efficient flight operations and situational awareness. The current proliferation of non-standard, aircraft-specific flight crew interfaces in UAS, coupled with the inherent limitations of operating UAS without in-situ sensory input and feedback (aural, visual, and vestibular cues), has increased the risk of mishaps associated with the design of the "cockpit." Examples of current non- or sub-standard design features range from "annoying" and "inefficient," to those that are difficult to manipulate or interpret in a timely manner, to those that are "burdensome" and "unsafe." A concerted effort is required to establish best practices and standards for the human-machine interfaces, for the pilot as well as the air traffic controller. In addition, roles, responsibilities, knowledge, and skill sets may require redefining the terms "pilot" and "air traffic controller" with respect to operating UAS, especially in the Next-Gen NAS. The knowledge, skill sets, training, and qualification standards for UAS operations must be established and must reflect the aircraft-specific human-machine interfaces and control methods. NASA's recent experience flying its MQ-9 Ikhana in the NAS for extended durations has enabled both NASA and the FAA to realize the full potential of UAS, as well as understand the implications of current limitations. Ikhana is a Predator-B/Reaper UAS, built by General Atomics, Aeronautical Systems, Inc., and modified for research. Since 2007, the aircraft has been flown seasonally with a wing-mounted pod containing an infrared scanner, utilized to provide real-time wildfire geo-location data to various fire-fighting agencies in the western U.S. The multi-agency effort included an extensive process to obtain flight clearance from the FAA to operate under special provisions, given that UAS in general do not fully comply with current airspace regulations (e.g. sense-and-avoid requirements).
The intelligent user interface for NASA's advanced information management systems
NASA Technical Reports Server (NTRS)
Campbell, William J.; Short, Nicholas, Jr.; Rolofs, Larry H.; Wattawa, Scott L.
1987-01-01
NASA has initiated the Intelligent Data Management Project to design and develop advanced information management systems. The project's primary goal is to formulate, design and develop advanced information systems that are capable of supporting the agency's future space research and operational information management needs. The first effort of the project was the development of a prototype Intelligent User Interface to an operational scientific database, using expert systems and natural language processing technologies. An overview of Intelligent User Interface formulation and development is given.
In-Space Crew-Collaborative Task Scheduling
NASA Technical Reports Server (NTRS)
Jaap, John; Meyer, Patrick; Davis, Elizabeth; Richardson, Lea
2006-01-01
As humans venture farther from Earth for longer durations, it will become essential for those on the journey to have significant control over the scheduling of their own activities as well as the activities of their companion systems and robots. However, the crew will not do all the scheduling; timelines will be the result of collaboration with ground personnel. Emerging technologies such as in-space message buses, delay-tolerant networks, and in-space internet will be the carriers on which the collaboration rides. Advances in scheduling technology, in the areas of task modeling, scheduling engines, and user interfaces will allow the crew to become virtual scheduling experts. New concepts of operations for producing the timeline will allow the crew and the ground support to collaborate while providing safeguards to ensure that the mission will be effectively accomplished without endangering the systems or personnel.
Wireless control of powered wheelchairs with tongue motion using tongue drive assistive technology.
Huo, Xueliang; Wang, Jia; Ghovanloo, Maysam
2008-01-01
Tongue Drive system (TDS) is a tongue-operated unobtrusive wireless assistive technology, which can potentially provide people with severe disabilities with effective computer access and environment control. It translates users' intentions into control commands by detecting and classifying their voluntary tongue motion utilizing a small permanent magnet, secured on the tongue, and an array of magnetic sensors mounted on a headset outside the mouth or an orthodontic brace inside. We have developed customized interface circuitry and implemented four control strategies to drive a powered wheelchair (PWC) using an external TDS prototype. The system has been evaluated by five able-bodied human subjects. The results showed that all subjects could easily operate the PWC using their tongue movements, and different control strategies worked better depending on the users' familiarity with the TDS.
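The abstract does not state which classifier the TDS prototype uses, so the sketch below shows only one plausible approach: nearest-centroid classification of the magnetic-sensor feature vector into a small set of wheelchair commands. The sensor count, command labels, and calibration centroids are hypothetical.

    import numpy as np

    # Hypothetical per-command centroids of a 12-value magnetometer feature vector
    # (e.g., 4 three-axis sensors), learned during a short calibration session.
    rng = np.random.default_rng(2)
    commands = ["neutral", "forward", "backward", "turn_left", "turn_right"]
    centroids = {c: rng.normal(size=12) for c in commands}

    def classify_tongue_position(features: np.ndarray) -> str:
        """Return the wheelchair command whose calibration centroid is closest
        to the current magnetic-sensor reading."""
        return min(centroids, key=lambda c: np.linalg.norm(features - centroids[c]))

    reading = centroids["turn_left"] + rng.normal(scale=0.1, size=12)  # simulated sample
    print(classify_tongue_position(reading))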
Applying systems engineering methodologies to the micro- and nanoscale realm
NASA Astrophysics Data System (ADS)
Garrison Darrin, M. Ann
2012-06-01
Micro-scale and nano-scale technology developments have the potential to revolutionize smart and small systems. The application of systems engineering methodologies that integrate standalone, small-scale technologies and interface them with macro technologies to build useful systems is critical to realizing the potential of these technologies. This paper covers the expanding knowledge base on systems engineering principles for micro and nano technology integration, starting with a discussion of the drivers for applying a systems approach. Technology development on the micro and nano scale has transitioned from laboratory curiosity to the realization of products in the health, automotive, aerospace, communication, and numerous other arenas. This paper focuses on the maturity (or lack thereof) of the field of nanosystems, which is emerging in a third generation, having transitioned from completing active structures to creating systems. The emphasis on applying a systems approach focuses on successful technology development given the lack of maturity of current nano-scale systems. Therefore the discussion includes details relating to enabling roles such as product systems engineering and technology development. Classical roles such as acquisition systems engineering are not covered. The results are also targeted towards small-scale technology developers who need to take into account systems engineering processes such as requirements definition, verification and validation, interface management, and risk management in the concept phase of technology development to maximize the likelihood of successful, cost-effective micro and nano technology that increases the capability of emerging deployed systems and supports long-term growth and profits.
Yu, Xunyi; Ganz, Aura
2011-01-01
In this paper we introduce a Mixed Reality Triage and Evacuation game, MiRTE, which is used in the development, testing and training of Mass Casualty Incident (MCI) information systems for first responders. Using the Source game engine from Valve software, MiRTE creates immersive virtual environments to simulate various incident scenarios, and enables interactions between multiple players/first responders. What distinguishes it from a pure computer simulation game is that it can interface with external mass casualty incident management systems, such as DIORAMA. The game will enable system developers to specify technical requirements of the underlying technology and to test different design alternatives. After the information system hardware and software are completed, the game can simulate various algorithms such as localization technologies, and interface with an actual user interface on PCs and Smartphones. We implemented and tested the game with the DIORAMA system.
Buzzelli, Michelle M; Morgan, Paula; Muschek, Alexander G; Macgregor-Skinner, Gavin
2014-01-01
Lack of success in disaster recovery occurs for many reasons, with one predominant catalyst for catastrophic failure being flawed and inefficient communication systems. Occurrences of devastating environmental hazards and human-caused disasters will continue to increase throughout the United States and around the globe as a result of continuing intensive urbanization that forces the human population into more concentrated and interconnected societies. With the rapid evolution of technology and the advent of information and communication technology (ICT) interfaces such as Facebook, Twitter, Flickr, Myspace, and Smartphone technology, communication is no longer a unidirectional source of information traveling from the newsroom to the public. In the event of a disaster, time-critical information can be exchanged to and from any person or organization simultaneously, with the capability to receive feedback. A literature review of current information regarding the use of ICT as information infrastructures in disaster management during human-caused and natural disasters will be conducted. This article asserts that the integrated use of ICTs as multidirectional information-sharing tools throughout the disaster cycle will increase a community's resiliency and supplement the capabilities of first responders and emergency management officials by providing real-time updates and information needed to assist and recover from a disaster.
Schermer, Maartje H N
2013-01-01
New biomedical technologies make it possible to replace parts of the human body or to substitute its functions. Examples include artificial joints, eye lenses and arterial stents. Newer technologies use electronics and software, for example in brain-computer interfaces such as retinal implants and the exoskeleton MindWalker. Gradually we are creating cyborgs: hybrids of man and machine. This raises the question: are cyborgs still humans? It is argued that they are. First, because employing technology is a typically human characteristic. Second, because in western thought the human mind, and not the body, is considered to be the seat of personhood. However, it has been argued by phenomenological philosophers that the body is more than just an object but is also a subject, important for human identity. From this perspective, we can appreciate that a bionic body does not make one less human, but it does influence the experience of being human.
NASA Technical Reports Server (NTRS)
Ross, Amy
2011-01-01
A NASA spacesuit under the EVA Technology Domain consists of a suit system; a PLSS; and a Power, Avionics, and Software (PAS) system. Ross described the basic functions, components, and interfaces of the PLSS, which consists of oxygen, ventilation, and thermal control subsystems; electronics; and interfaces. Design challenges were reviewed from a packaging perspective. Ross also discussed the development of the PLSS over the last two decades.
Yang, Ya; Zhang, Hulin; Lin, Zong-Hong; Zhou, Yu Sheng; Jing, Qingshen; Su, Yuanjie; Yang, Jin; Chen, Jun; Hu, Chenguo; Wang, Zhong Lin
2013-10-22
We report human skin based triboelectric nanogenerators (TENG) that can either harvest biomechanical energy or be utilized as a self-powered tactile sensor system for touch pad technology. We constructed a TENG utilizing the contact/separation between an area of human skin and a polydimethylsiloxane (PDMS) film with a surface of micropyramid structures, which was attached to an ITO electrode that was grounded across a loading resistor. The fabricated TENG delivers an open-circuit voltage up to -1000 V, a short-circuit current density of 8 mA/m(2), and a power density of 500 mW/m(2) on a load of 100 MΩ, which can be used to directly drive tens of green light-emitting diodes. The working mechanism of the TENG is based on the charge transfer between the ITO electrode and ground via modulating the separation distance between the tribo-charged skin patch and PDMS film. Furthermore, the TENG has been used in designing an independently addressed matrix for tracking the location and pressure of human touch. The fabricated matrix has demonstrated its self-powered and high-resolution tactile sensing capabilities by recording the output voltage signals as a mapping figure, where the detection sensitivity of the pressure is about 0.29 ± 0.02 V/kPa and each pixel can have a size of 3 mm × 3 mm. The TENGs may have potential applications in human-machine interfacing, micro/nano-electromechanical systems, and touch pad technology.
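Using the sensitivity quoted above (about 0.29 V/kPa per pixel), a touch map can be recovered from the matrix's output voltages by a simple division. The 4x4 array size and the voltage readings below are made up for illustration; only the sensitivity figure comes from the abstract.

    SENSITIVITY_V_PER_KPA = 0.29  # reported detection sensitivity

    # Hypothetical output voltages (V) from a 4x4 region of the addressed matrix.
    voltages = [
        [0.00, 0.02, 0.01, 0.00],
        [0.01, 0.87, 1.45, 0.03],
        [0.00, 0.58, 0.90, 0.02],
        [0.00, 0.01, 0.00, 0.00],
    ]

    # Convert each pixel's voltage to an estimated contact pressure in kPa.
    pressure_kpa = [[v / SENSITIVITY_V_PER_KPA for v in row] for row in voltages]

    for row in pressure_kpa:
        print(" ".join(f"{p:5.1f}" for p in row))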
Emotion-prints: interaction-driven emotion visualization on multi-touch interfaces
NASA Astrophysics Data System (ADS)
Cernea, Daniel; Weber, Christopher; Ebert, Achim; Kerren, Andreas
2015-01-01
Emotions are one of the unique aspects of human nature, and sadly at the same time one of the elements that our technological world is failing to capture and consider due to their subtlety and inherent complexity. But with the current dawn of new technologies that enable the interpretation of emotional states based on techniques involving facial expressions, speech and intonation, electrodermal response (EDS) and brain-computer interfaces (BCIs), we are finally able to access real-time user emotions in various system interfaces. In this paper we introduce emotion-prints, an approach for visualizing user emotional valence and arousal in the context of multi-touch systems. Our goal is to offer a standardized technique for representing user affective states in the moment when and at the location where the interaction occurs in order to increase affective self-awareness, support awareness in collaborative and competitive scenarios, and offer a framework for aiding the evaluation of touch applications through emotion visualization. We show that emotion-prints are not only independent of the shape of the graphical objects on the touch display, but also that they can be applied regardless of the acquisition technique used for detecting and interpreting user emotions. Moreover, our representation can encode any affective information that can be decomposed or reduced to Russell's two-dimensional space of valence and arousal. Our approach is reinforced by a BCI-based user study and a follow-up discussion of advantages and limitations.
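The abstract does not spell out the rendering itself, so the following is only a minimal sketch of the general idea with hypothetical mappings: a touch point is decorated with a halo whose hue follows valence and whose radius follows arousal in Russell's two-dimensional space.

```python
# Minimal sketch (hypothetical mapping, not the authors' implementation):
# decorate a touch point with a halo whose colour tracks valence and whose
# radius tracks arousal (Russell's circumplex coordinates in [-1, 1]).
import colorsys

def emotion_print(x, y, valence, arousal, base_radius=20.0):
    hue = (valence + 1.0) / 2.0 * 0.33            # 0.0 = red (negative), 0.33 = green (positive)
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    radius = base_radius * (1.0 + (arousal + 1.0) / 2.0)   # calm -> small, aroused -> large
    return {"x": x, "y": y, "rgb": (round(r, 2), round(g, 2), round(b, 2)), "radius": radius}

print(emotion_print(120, 340, valence=-0.4, arousal=0.8))
```

Because the mapping depends only on the valence/arousal pair, the same halo can be drawn over any graphical object and fed by any acquisition technique, which is the independence property the paper emphasizes.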
Remote surface inspection system
NASA Astrophysics Data System (ADS)
Hayati, S.; Balaram, J.; Seraji, H.; Kim, W. S.; Tso, K.; Prasad, V.
1993-02-01
This paper reports on an on-going research and development effort in remote surface inspection of space platforms such as the Space Station Freedom (SSF). It describes the space environment and identifies the types of damage for which to search. This paper provides an overview of the Remote Surface Inspection System that was developed to conduct proof-of-concept demonstrations and to perform experiments in a laboratory environment. Specifically, the paper describes three technology areas: (1) manipulator control for sensor placement; (2) automated non-contact inspection to detect and classify flaws; and (3) an operator interface to command the system interactively and receive raw or processed sensor data. Initial findings for the automated and human visual inspection tests are reported.
Wearable computer technology for dismounted applications
NASA Astrophysics Data System (ADS)
Daniels, Reginald
2010-04-01
Small computing devices which rival the compact size of traditional personal digital assistants (PDA) have recently established a market niche. These computing devices are small enough to be considered unobtrusive for humans to wear. The computing devices are also powerful enough to run full multi-tasking general purpose operating systems. This paper will explore the wearable computer information system for dismounted applications recently fielded for ground-based US Air Force use. The environments that the information systems are used in will be reviewed, as well as a description of the net-centric, ground-based warrior. The paper will conclude with a discussion regarding the importance of intuitive, usable, and unobtrusive operator interfaces for dismounted operators.
Generating a Reduced Gravity Environment on Earth
NASA Technical Reports Server (NTRS)
Dungan, Larry K.; Cunningham, Tom; Poncia, Dina
2010-01-01
Since the 1950s several reduced gravity simulators have been designed and utilized in preparing humans for spaceflight and in reduced gravity system development. The Active Response Gravity Offload System (ARGOS) is the newest and most realistic gravity offload simulator. ARGOS provides three degrees of motion within the test area and is scalable for full building deployment. The inertia of the overhead system is eliminated by an active motor and control system. This presentation will discuss what ARGOS is, how it functions, and the unique challenges of interfacing to the human. Test data and video for human and robotic systems will be presented. A major variable in the human machine interaction is the interface of ARGOS to the human. These challenges along with design solutions will be discussed.
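The presentation abstract does not give the control law, but the quantity any gravity-offload simulator such as ARGOS must regulate is simple to state (our own illustration, not from the source): the overhead system continuously carries the difference between Earth weight and the weight felt in the simulated gravity field.

```latex
F_{\mathrm{offload}} = m\,(g_{\mathrm{Earth}} - g_{\mathrm{target}}),
\qquad
\text{e.g. lunar simulation for } m = 80\ \mathrm{kg}:\;
F_{\mathrm{offload}} \approx 80 \times (9.81 - 1.62) \approx 655\ \mathrm{N}.
```

Maintaining this force while the subject moves in three degrees of freedom is exactly where the active motor and control system, and the human interface challenges mentioned above, come in.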
Pilot vehicle interface on the advanced fighter technology integration F-16
NASA Technical Reports Server (NTRS)
Dana, W. H.; Smith, W. B.; Howard, J. D.
1986-01-01
This paper focuses on the work load aspects of the pilot vehicle interface in regard to the new technologies tested during AMAS Phase II. Subjects discussed in this paper include: a wide field-of-view head-up display; automated maneuvering attack system/sensor tracker system; master modes that configure flight controls and mission avionics; a modified helmet mounted sight; improved multifunction display capability; a voice interactive command system; ride qualities during automated weapon delivery; a color moving map; an advanced digital map display; and a g-induced loss-of-consciousness and spatial disorientation autorecovery system.
Network speech systems technology program
NASA Astrophysics Data System (ADS)
Weinstein, C. J.
1981-09-01
This report documents work performed during FY 1981 on the DCA-sponsored Network Speech Systems Technology Program. The two areas of work reported are: (1) communication system studies in support of the evolving Defense Switched Network (DSN) and (2) design and implementation of satellite/terrestrial interfaces for the Experimental Integrated Switched Network (EISN). The system studies focus on the development and evaluation of economical and endurable network routing procedures. Satellite/terrestrial interface development includes circuit-switched and packet-switched connections to the experimental wideband satellite network. Efforts in planning and coordination of EISN experiments are reported in detail in a separate EISN Experiment Plan.
Three-dimensional user interfaces for scientific visualization
NASA Technical Reports Server (NTRS)
VanDam, Andries (Principal Investigator)
1996-01-01
The focus of this grant was to experiment with novel user interfaces for scientific visualization applications using both desktop and virtual reality (VR) systems, and thus to advance the state of the art of user interface technology for this domain. This technology has been transferred to NASA via periodic status reports and papers relating to this grant that have been published in conference proceedings. This final report summarizes the research completed over the past three years, and subsumes all prior reports.
OASIS: A GEOGRAPHICAL DECISION SUPPORT SYSTEM FOR GROUND-WATER CONTAMINANT MODELING
Three new software technologies were applied to develop an efficient and easy to use decision support system for ground-water contaminant modeling. Graphical interfaces create a more intuitive and effective form of communication with the computer compared to text-based interfaces...
Speech-recognition interfaces for music information retrieval
NASA Astrophysics Data System (ADS)
Goto, Masataka
2005-09-01
This paper describes two hands-free music information retrieval (MIR) systems that enable a user to retrieve and play back a musical piece by saying its title or the artist's name. Although various interfaces for MIR have been proposed, speech-recognition interfaces suitable for retrieving musical pieces have not been studied. Our MIR-based jukebox systems employ two different speech-recognition interfaces for MIR, speech completion and speech spotter, which exploit intentionally controlled nonverbal speech information in original ways. The first is a music retrieval system with the speech-completion interface that is suitable for music stores and car-driving situations. When a user only remembers part of the name of a musical piece or an artist and utters only a remembered fragment, the system helps the user recall and enter the name by completing the fragment. The second is a background-music playback system with the speech-spotter interface that can enrich human-human conversation. When a user is talking to another person, the system allows the user to enter voice commands for music playback control by spotting a special voice-command utterance in face-to-face or telephone conversations. Experimental results from use of these systems have demonstrated the effectiveness of the speech-completion and speech-spotter interfaces. (Video clips: http://staff.aist.go.jp/m.goto/MIR/speech-if.html)
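At its core, the speech-completion behaviour described above amounts to matching a recognized fragment against the catalogue of titles and artist names. The sketch below illustrates only that matching step, with a made-up catalogue; the actual system, of course, operates on recognized speech rather than typed text.

```python
# Illustrative fragment-completion step only (hypothetical catalogue, not Goto's system).
CATALOGUE = [
    "Bohemian Rhapsody", "Smells Like Teen Spirit",
    "What a Wonderful World", "Wonderwall",
]

def complete_fragment(fragment, titles=CATALOGUE):
    """Return candidate titles containing the uttered fragment (case-insensitive)."""
    frag = fragment.lower()
    return [t for t in titles if frag in t.lower()]

print(complete_fragment("wonder"))   # ['What a Wonderful World', 'Wonderwall']
```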
The Last Millimeter: Interfacing the New Public Radio Satellite System. Info. Packets No. 14.
ERIC Educational Resources Information Center
Pizzi, Skip
Public radio is about to achieve a new technological level as the new Public Radio Satellite System (PRSS) is deployed. The network will dramatically improve the capacity and quality of its interconnection system, but proper interfacing at member stations will be required to realize the full benefits of the new system. The new system uses digital…
ERIC Educational Resources Information Center
Johnson, Christopher W.
1996-01-01
The development of safety-critical systems (aircraft cockpits and reactor control rooms) is qualitatively different from that of other interactive systems. These differences impose burdens on design teams that must ensure the development of human-machine interfaces. Analyzes strengths and weaknesses of formal methods for the design of user…
Future developments in brain-machine interface research
Lebedev, Mikhail A; Tate, Andrew J; Hanson, Timothy L; Li, Zheng; O'Doherty, Joseph E; Winans, Jesse A; Ifft, Peter J; Zhuang, Katie Z; Fitzsimmons, Nathan A; Schwarz, David A; Fuller, Andrew M; An, Je Hi; Nicolelis, Miguel A L
2011-01-01
Neuroprosthetic devices based on brain-machine interface technology hold promise for the restoration of body mobility in patients suffering from devastating motor deficits caused by brain injury, neurologic diseases and limb loss. During the last decade, considerable progress has been achieved in this multidisciplinary research, mainly in the brain-machine interface that enacts upper-limb functionality. However, a considerable number of problems need to be resolved before fully functional limb neuroprostheses can be built. To move towards developing neuroprosthetic devices for humans, brain-machine interface research has to address a number of issues related to improving the quality of neuronal recordings, achieving stable, long-term performance, and extending the brain-machine interface approach to a broad range of motor and sensory functions. Here, we review the future steps that are part of the strategic plan of the Duke University Center for Neuroengineering, and its partners, the Brazilian National Institute of Brain-Machine Interfaces and the École Polytechnique Fédérale de Lausanne (EPFL) Center for Neuroprosthetics, to bring this new technology to clinical fruition. PMID:21779720
Quantifying Pilot Visual Attention in Low Visibility Terminal Operations
NASA Technical Reports Server (NTRS)
Ellis, Kyle K.; Arthur, J. J.; Latorella, Kara A.; Kramer, Lynda J.; Shelton, Kevin J.; Norman, Robert M.; Prinzel, Lawrence J.
2012-01-01
Quantifying pilot visual behavior allows researchers to determine not only where a pilot is looking and when, but holds implications for specific behavioral tracking when these data are coupled with flight technical performance. Remote eye tracking systems have been integrated into simulators at NASA Langley with effectively no impact on the pilot environment. This paper discusses the installation and use of a remote eye tracking system. The data collection techniques from a complex human-in-the-loop (HITL) research experiment are discussed; especially, the data reduction algorithms and logic to transform raw eye tracking data into quantified visual behavior metrics, and analysis methods to interpret visual behavior. The findings suggest superior performance for Head-Up Display (HUD) and improved attentional behavior for Head-Down Display (HDD) implementations of Synthetic Vision System (SVS) technologies for low visibility terminal area operations. Keywords: eye tracking, flight deck, NextGen, human machine interface, aviation
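The paper's own reduction algorithms are not reproduced in this abstract; the sketch below shows, under an assumed sample format, one representative step of such a reduction: converting raw gaze samples into dwell time per area of interest (AOI).

```python
# Hedged sketch of one typical reduction step (assumed data format, not NASA's code):
# accumulate dwell time per area of interest (AOI) from raw gaze samples.
def dwell_times(samples, aois):
    """samples: [(t_seconds, x_px, y_px), ...]; aois: {name: (xmin, ymin, xmax, ymax)}."""
    totals = {name: 0.0 for name in aois}
    for (t0, x, y), (t1, _, _) in zip(samples, samples[1:]):
        for name, (xmin, ymin, xmax, ymax) in aois.items():
            if xmin <= x <= xmax and ymin <= y <= ymax:
                totals[name] += t1 - t0
                break
    return totals

aois = {"HUD": (0, 0, 640, 200), "HDD": (0, 400, 640, 800)}
gaze = [(0.00, 320, 100), (0.02, 325, 110), (0.04, 300, 600), (0.06, 310, 610)]
print(dwell_times(gaze, aois))   # roughly {'HUD': 0.04, 'HDD': 0.02}
```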
NASA Technical Reports Server (NTRS)
Gietl, Eric B.; Gholdston, Edward W.; Manners, Bruce A.; Delventhal, Rex A.
2000-01-01
The electrical power system developed for the International Space Station represents the largest space-based power system ever designed and, consequently, has driven some key technology aspects and operational challenges. The full U.S.-built system consists of a 160-Volt dc primary network, and a more tightly regulated 120-Volt dc secondary network. Additionally, the U.S. system interfaces with the 28-Volt system in the Russian segment. The international nature of the Station has resulted in modular converters, switchgear, outlet panels, and other components being built by different countries, with the associated interface challenges. This paper provides details of the architecture and unique hardware developed for the Space Station, and examines the opportunities it provides for further long-term space power technology development, such as concentrating solar arrays and flywheel energy storage systems.
Designing the user interface: strategies for effective human-computer interaction
NASA Astrophysics Data System (ADS)
Shneiderman, B.
1998-03-01
In revising this popular book, Ben Shneiderman again provides a complete, current and authoritative introduction to user-interface design. The user interface is the part of every computer system that determines how people control and operate that system. When the interface is well designed, it is comprehensible, predictable, and controllable; users feel competent, satisfied, and responsible for their actions. Shneiderman discusses the principles and practices needed to design such effective interaction. Based on 20 years experience, Shneiderman offers readers practical techniques and guidelines for interface design. He also takes great care to discuss underlying issues and to support conclusions with empirical results. Interface designers, software engineers, and product managers will all find this book an invaluable resource for creating systems that facilitate rapid learning and performance, yield low error rates, and generate high user satisfaction. Coverage includes the human factors of interactive software (with a new discussion of diverse user communities), tested methods to develop and assess interfaces, interaction styles such as direct manipulation for graphical user interfaces, and design considerations such as effective messages, consistent screen design, and appropriate color.
TheBrain Technologies Corporation: Collapsing the Time to Knowledge.
ERIC Educational Resources Information Center
Misek, Marla
2003-01-01
TheBrain was created to take advantage of the most powerful information processor in existence - the human mind. Explains products of TheBrain Technologies Corporation, which has developed computer interfaces to help individual users and corporations organize information in ways that make sense to them in the proper context. Describes a…
Dominici, Nadia; Keller, Urs; Vallery, Heike; Friedli, Lucia; van den Brand, Rubia; Starkey, Michelle L; Musienko, Pavel; Riener, Robert; Courtine, Grégoire
2012-07-01
Central nervous system (CNS) disorders distinctly impair locomotor pattern generation and balance, but technical limitations prevent independent assessment and rehabilitation of these subfunctions. Here we introduce a versatile robotic interface to evaluate, enable and train pattern generation and balance independently during natural walking behaviors in rats. In evaluation mode, the robotic interface affords detailed assessments of pattern generation and dynamic equilibrium after spinal cord injury (SCI) and stroke. In enabling mode, the robot acts as a propulsive or postural neuroprosthesis that instantly promotes unexpected locomotor capacities including overground walking after complete SCI, stair climbing following partial SCI and precise paw placement shortly after stroke. In training mode, robot-enabled rehabilitation, epidural electrical stimulation and monoamine agonists reestablish weight-supported locomotion, coordinated steering and balance in rats with a paralyzing SCI. This new robotic technology and associated concepts have broad implications for both assessing and restoring motor functions after CNS disorders, both in animals and in humans.
Time to address the problems at the neural interface
NASA Astrophysics Data System (ADS)
Durand, Dominique M.; Ghovanloo, Maysam; Krames, Elliot
2014-04-01
Neural engineers have made significant, if not remarkable, progress in interfacing with the nervous system in the last ten years. In particular, neuromodulation of the brain has generated significant therapeutic benefits [1-5]. EEG electrodes can be used to communicate with patients with locked-in syndrome [6]. In the central nervous system (CNS), electrode arrays placed directly over or within the cortex can record neural signals related to the intent of the subject or patient [7, 8]. A similar technology has allowed paralyzed patients to control an otherwise normal skeletal system with brain signals [9, 10]. This technology has significant potential to restore function in these and other patients with neural disorders such as stroke [11]. Although there are several multichannel arrays described in the literature, the workhorse for these cortical interfaces has been the Utah array [12]. This 100-channel electrode array has been used in most studies on animals and humans since the 1990s and is commercially available. This array and other similar microelectrode arrays can record neural signals with high quality (high signal-to-noise ratio), but these signals fade and disappear after a few months and therefore the current technology is not reliable for extended periods of time. Therefore, despite these major advances in communicating with the brain, clinical translation cannot be implemented. The reasons for this failure are not known but clearly involve the interface between the electrode and the neural tissue. The Defense Advanced Research Projects Agency (DARPA) as well as other federal funding agencies such as the National Science Foundation (NSF) and the National Institutes of Health have provided significant financial support to investigate this problem without much success. A recent funding program from DARPA was designed to establish the failure modes in order to generate a reliable neural interface technology and again was unsuccessful at producing a robust interface with the CNS. In 2013, two symposia were held independently to discuss this problem: one was held at the International Neuromodulation Society's 11th World Congress in Berlin and supported by the International Neuromodulation Society¹ and the other at the 6th International Neural Engineering conference in San Diego² and was supported by the NSF. Clearly, the neuromodulation and the neural engineering communities are keen to solve this problem. Experts from the field were assembled to discuss the problems and potential solutions. Although many important points were raised, few emerged as key issues. (1) The ability to access internal neural signals remotely and reliably. Although some of the technological problems have already been solved, this ability to access neural signals is still a significant problem since reliable and robust transcutaneous telemetry systems with large numbers of signals, each with wide bandwidth, are not readily available to researchers. (2) A translation strategy taking basic research to the clinic. The lack of understanding of the biological response to implanted constructs and the inability to monitor the sites and match the mechanical properties of the probe to the neural tissue properties continue to be an unsolved problem. In addition, the low levels of collaboration among neuroscientists, clinicians, patients and other stakeholders throughout different phases of research and development were considered to be significant impediments to progress.
(3) Fundamental tools development procedures for neural interfacing. There are many laboratories testing various devices with different sets of criteria, but there is no consensus on the failure modes. The reliability, robustness of metrics and testing standards for such devices have not been established, either in academia or in industry. To start addressing this problem, the FDA has established a laboratory to test the reliability of some neural devices. Although the discussion was mostly centered on interfacing with the CNS, it has recently become clear that the peripheral nervous system (PNS) could be an important target for interfacing, perhaps even more accessible for interfacing than the CNS. A recent initiative called Bioelectronic Medicines³ is a step in that direction. A recent summit held in New York was organized to investigate novel and disruptive neural technologies to interface specifically with the PNS in order to restore health and biological function to organs. With significant interest in neurotechnology for neural interfacing (see footnotes 1, 2 and 3) and uncovering new ways to treat, prevent and cure brain disorders (President Obama's brain initiative⁴), it seems clear that the problems at the interface will not remain unsolved for long. Finding solutions to the problem at the neural interface for interacting with the nervous system (PNS and CNS) is crucial for understanding and restoring brain function. This would in turn have a significant impact on health care and quality of life for patients with neural disorders.
References
[1] Follett K A et al 2010 Pallidal versus subthalamic deep-brain stimulation for Parkinson's disease New Engl. J. Med. 362 2077-91
[2] Holtzheimer P E et al 2012 Subcallosal cingulate deep brain stimulation for treatment-resistant unipolar and bipolar depression Arch. Gen. Psychiatry 69 150
[3] Carron R, Chabardes S and Hammond C 2012 Mechanisms of action of high-frequency deep brain stimulation. A review of the literature and current concepts NeuroChirurgie 58 209-17
[4] Vidailhet M et al 2005 Bilateral deep-brain stimulation of the globus pallidus in primary generalized dystonia New Engl. J. Med. 352 459-67
[5] Theodore W H and Fisher R S 2004 Brain stimulation for epilepsy Lancet Neurol. 3 111-8
[6] Kübler A, Kotchoubey B, Kaiser J, Wolpaw J R and Birbaumer N 2001 Brain-computer communication: unlocking the locked Psychol. Bull. 127 358-75
[7] Schalk G, Miller K J, Anderson N R, Wilson J A, Smyth M D, Ojemann J G, Moran D W, Wolpaw J R and Leuthardt E C 2008 Two-dimensional movement control using electrocorticographic signals in humans J. Neural Eng. 5 75
[8] Serruya M D, Hatsopoulos N G, Paninski L, Fellows M R and Donoghue J P 2002 Brain-machine interface: instant neural control of a movement signal Nature 416 141-2
[9] Hochberg L R, Serruya M D, Friehs G M, Mukand J A, Saleh M, Caplan A H, Branner A, Chen D, Penn R D and Donoghue J P 2006 Neuronal ensemble control of prosthetic devices by a human with tetraplegia Nature 442 164-71
[10] Collinger J L et al 2013 High-performance neuroprosthetic control by an individual with tetraplegia Lancet 381 557-64
[11] Leuthardt E C, Schalk G, Wolpaw J R, Ojemann J G and Moran D W 2004 A brain-computer interface using electrocorticographic signals in humans J. Neural Eng. 1 63
[12] Maynard E M, Nordhausen C T and Normann R A 1997 The Utah intracortical electrode array: a recording structure for potential brain-computer interfaces Electroencephalogr. Clin. Neurophysiol. 102 228-39
Footnotes
¹ www.neuromodulation.com/8-june-2013
² http://neuro.embs.org/wp-content/uploads/sites/2/2013/05/SymposiumAdvert1.pdf
³ www.gsk.com/explore-gsk/how-we-do-r-and-d/bioelectronics.html
⁴ www.whitehouse.gov/share/brain-initiative
Development of the Computer Interface Literacy Measure.
ERIC Educational Resources Information Center
Turner, G. Marc; Sweany, Noelle Wall; Husman, Jenefer
2000-01-01
Discussion of computer literacy and the rapidly changing face of technology focuses on a study that redefined computer literacy to include competencies for using graphical user interfaces for operating systems, hypermedia applications, and the Internet. Describes the development and testing of the Computer Interface Literacy Measure with…
Transfer of control system interface solutions from other domains to the thermal power industry.
Bligård, L-O; Andersson, J; Osvalder, A-L
2012-01-01
In a thermal power plant the operators' roles are to control and monitor the process to achieve efficient and safe production. To achieve this, the human-machine interfaces play a central part. The interfaces need to be updated and upgraded together with the technical functionality to maintain optimal operation. One way of achieving relevant updates is to study other domains and see how they have solved similar issues in their design solutions. The purpose of this paper is to present how interface design solution ideas can be transferred from domains with operator control to thermal power plants. In the study, 15 domains were compared using a model for categorisation of human-machine systems. The results of the domain comparison showed that nuclear power, refinery and ship engine control were most similar to thermal power control. From the findings, a basic interface structure and three specific display solutions were proposed for thermal power control: process parameter overview, plant overview, and feed water view. The systematic comparison of the properties of a human-machine system allowed interface designers to find suitable objects, structures and navigation logics in a range of domains that could be transferred to the thermal power domain.
The Advanced Linked Extended Reconnaissance & Targeting Technology Demonstration project
NASA Astrophysics Data System (ADS)
Edwards, Mark
2008-04-01
The Advanced Linked Extended Reconnaissance & Targeting (ALERT) Technology Demonstration (TD) project is addressing many operational needs of the future Canadian Army's Surveillance and Reconnaissance forces. Using the surveillance system of the Coyote reconnaissance vehicle as an experimental platform, the ALERT TD project aims to significantly enhance situational awareness by fusing multi-sensor and tactical data, developing automated processes, and integrating beyond line-of-sight sensing. The project is exploiting important advances made in computer processing capability, displays technology, digital communications, and sensor technology since the design of the original surveillance system. As the major research area within the project, concepts are discussed for displaying and fusing multi-sensor and tactical data within an Enhanced Operator Control Station (EOCS). The sensor data can originate from the Coyote's own visible-band and IR cameras, laser rangefinder, and ground-surveillance radar, as well as from beyond line-of-sight systems such as mini-UAVs and unattended ground sensors. Video-rate image processing has been developed to assist the operator in detecting poorly visible targets. As a second major area of research, automatic target cueing capabilities have been added to the system. These include scene change detection, automatic target detection and aided target recognition algorithms processing both IR and visible-band images to draw the operator's attention to possible targets. The merits of incorporating scene change detection algorithms are also discussed. In the area of multi-sensor data fusion, up to Joint Defence Labs level 2 has been demonstrated. The human factors engineering aspects of the user interface in this complex environment are presented, drawing upon multiple user group sessions with military surveillance system operators. The ALERT system has been used in a number of C4ISR field trials, most recently at Exercise Empire Challenge in China Lake, CA, and at Trial Quest in Norway; those exercises provided further opportunities to investigate operator interactions. The paper concludes with lessons learned from the project and recommendations for future work in operator interface design.
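Scene change detection in surveillance systems of this kind is commonly built on frame differencing; the sketch below shows that generic technique under assumed thresholds and is not the ALERT project's algorithm.

```python
# Generic frame-differencing scene change detector (illustrative thresholds,
# not the ALERT implementation).
import numpy as np

def scene_changed(prev_frame, frame, pixel_thresh=25, fraction_thresh=0.02):
    """Flag a change when enough pixels differ between consecutive grey-scale frames."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    return float(np.mean(diff > pixel_thresh)) > fraction_thresh

prev = np.zeros((240, 320), dtype=np.uint8)
curr = prev.copy()
curr[100:140, 150:190] = 200          # simulate an object entering the scene
print(scene_changed(prev, curr))      # True
```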
Facial recognition in education system
NASA Astrophysics Data System (ADS)
Krithika, L. B.; Venkatesh, K.; Rathore, S.; Kumar, M. Harish
2017-11-01
Human beings rely heavily on emotions to convey messages and to interpret them. Emotion detection and face recognition can therefore provide an interface between individuals and technologies. Face recognition is among the most successful applications of recognition analysis. Many techniques have been used to recognize facial expressions and to detect emotion under varying poses. In this paper, we propose an efficient method that recognizes facial expressions by tracking face points and the distances between them. The method automatically identifies an observer's facial movements and expressions in an image, capturing different aspects of emotion and facial expression.
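The abstract describes tracking face points and the distances between them; the sketch below illustrates that idea with hypothetical landmarks and a deliberately crude rule, not the paper's classifier.

```python
# Toy illustration (hypothetical landmarks and thresholds, not the paper's method):
# derive distance features from tracked face points and apply a crude rule.
import math

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def features(lm):
    return (dist(lm["mouth_left"], lm["mouth_right"]),   # mouth width
            dist(lm["lip_top"], lm["lip_bottom"]))       # mouth opening

def crude_label(feat, neutral):
    width, opening = feat
    if width > 1.15 * neutral[0]:
        return "smile"
    if opening > 1.5 * neutral[1]:
        return "surprise"
    return "neutral"

neutral = (60.0, 10.0)                                   # baseline distances for this face
frame = {"mouth_left": (100, 200), "mouth_right": (172, 200),
         "lip_top": (136, 195), "lip_bottom": (136, 207)}
print(crude_label(features(frame), neutral))             # smile
```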
Sustainable Software Decisions for Long-term Projects (Invited)
NASA Astrophysics Data System (ADS)
Shepherd, A.; Groman, R. C.; Chandler, C. L.; Gaylord, D.; Sun, M.
2013-12-01
Adopting new, emerging technologies can be difficult for established projects that are positioned to exist for years to come. In some cases the challenge lies in the pre-existing software architecture. In others, the challenge lies in the fluctuation of resources like people, time and funding. The Biological and Chemical Oceanography Data Management Office (BCO-DMO) was created in late 2006 by combining the data management offices for the U.S. GLOBEC and U.S. JGOFS programs to publish data for researchers funded by the National Science Foundation (NSF). Since its inception, BCO-DMO has been supporting access and discovery of these data through web-accessible software systems, and the office has worked through many of the challenges of incorporating new technologies into its software systems. From migrating human readable, flat file metadata storage into a relational database, and now, into a content management system (Drupal) to incorporating controlled vocabularies, new technologies can radically affect the existing software architecture. However, through the use of science-driven use cases, effective resource management, and loosely coupled software components, BCO-DMO has been able to adapt its existing software architecture to adopt new technologies. One of the latest efforts at BCO-DMO revolves around applying metadata semantics for publishing linked data in support of data discovery. This effort primarily affects the metadata web interface software at http://bco-dmo.org and the geospatial interface software at http://mapservice.bco-dmo.org/. With guidance from science-driven use cases and consideration of our resources, implementation decisions are made using a strategy to loosely couple the existing software systems to the new technologies. The results of this process led to the use of REST web services and a combination of contributed and custom Drupal modules for publishing BCO-DMO's content using the Resource Description Framework (RDF) via an instance of the Virtuoso Open-Source triplestore.
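The abstract mentions publishing linked data as RDF into a Virtuoso triplestore without giving code; the sketch below shows the general shape of such a publishing step using rdflib with DCAT/Dublin Core terms and a hypothetical dataset URI, not BCO-DMO's actual vocabulary or pipeline.

```python
# Hedged sketch of the linked-data idea (not BCO-DMO's actual code or vocabulary):
# describe a dataset as RDF triples and serialize it for loading into a triplestore.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCTERMS, RDF

DCAT = Namespace("http://www.w3.org/ns/dcat#")
dataset = URIRef("http://example.org/dataset/chlorophyll-2013")   # hypothetical URI

g = Graph()
g.bind("dcat", DCAT)
g.bind("dcterms", DCTERMS)
g.add((dataset, RDF.type, DCAT.Dataset))
g.add((dataset, DCTERMS.title, Literal("Chlorophyll-a concentrations, 2013 cruise")))
g.add((dataset, DCTERMS.creator, Literal("Example Investigator")))

print(g.serialize(format="turtle"))   # Turtle text ready for a triplestore such as Virtuoso
```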
77 FR 8217 - Evaluating the Usability of Electronic Health Record (EHR) Systems
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-14
... interface design guidelines for EHRs. Manufacturers interested in participating in this research will be... the usability of health information technology (HIT) systems. NIST research is designed to: (1... develop performance-oriented user interface design guidelines for EHRs, and a framework for assessing the...
The interface of genomic technologies and nursing.
Loescher, Lois J; Merkle, Carrie J
2005-01-01
The purposes of this article are (a) to summarize views of the interface of technology, genomic technology, and nursing; (b) to provide an overview of current and emerging genomic technologies; (c) to present clinical exemplars of uses of genomic technology in two disease conditions; and (d) to list genomic-focused nursing research on genomic technologies. A discussion of genomic technology is provided in the context of nurses' views of technology, the importance of genomic technology for nurses, linking the central dogma of molecular biology to state-of-the-art tests and assays, and nurses' current use of technologies. Human genome discoveries will continue to be an integral part of disease prevention, diagnosis, treatment, and management. These discoveries also have the potential for being integrated into nursing science. Genomic technologies are becoming a driving force in patient management, so that nurses will be unable to provide quality care without knowledge of the types of genomic technologies, the rationale for their use, and the possible sequelae that can result from genetic diagnosis or treatment. Many nurses already are using genomic technologies to conduct genomic-focused nursing research. The biobehavioral nature of much of this research further indicates the important contributions of nurses in genomics.
Kloosterman, Ate; Mapes, Anna; Geradts, Zeno; van Eijk, Erwin; Koper, Carola; van den Berg, Jorrit; Verheij, Saskia; van der Steen, Marcel; van Asten, Arian
2015-01-01
In this paper, the importance of modern technology in forensic investigations is discussed. Recent technological developments are creating new possibilities to perform robust scientific measurements and studies outside the controlled laboratory environment. The benefits of real-time, on-site forensic investigations are manifold and such technology has the potential to strongly increase the speed and efficacy of the criminal justice system. However, such benefits are only realized when quality can be guaranteed at all times and findings can be used as forensic evidence in court. At the Netherlands Forensic Institute, innovation efforts are currently undertaken to develop integrated forensic platform solutions that allow for the forensic investigation of human biological traces, the chemical identification of illicit drugs and the study of large amounts of digital evidence. These platforms enable field investigations, yield robust and validated evidence and allow for forensic intelligence and targeted use of expert capacity at the forensic institutes. This technological revolution in forensic science could ultimately lead to a paradigm shift in which a new role of the forensic expert emerges as developer and custodian of integrated forensic platforms. PMID:26101289
A design for a ground-based data management system
NASA Technical Reports Server (NTRS)
Lambird, Barbara A.; Lavine, David
1988-01-01
An initial design for a ground-based data management system which includes intelligent data abstraction and cataloging is described. The large quantity of data on some current and future NASA missions leads to significant problems in providing scientists with quick access to relevant data. Human screening of data for potential relevance to a particular study is time-consuming and costly. Intelligent databases can provide automatic screening when given relevant scientific parameters and constraints. The data management system would provide, at a minimum, information on the availability and range of data, the types available, the specific time periods covered together with data quality information, and related sources of data. The system would inform the user about the primary types of screening, analysis, and methods of presentation available to the user. The system would then aid the user with performing the desired tasks, in such a way that the user need only specify the scientific parameters and objectives, and not worry about specific details for running a particular program. The design contains modules for data abstraction, catalog plan abstraction, a user-friendly interface, and expert systems for data handling, data evaluation, and application analysis. The emphasis is on developing general facilities for data representation, description, analysis, and presentation that will be easily used by scientists directly, thus bypassing the knowledge acquisition bottleneck. Expert system technology is used for many different aspects of the data management system, including the direct user interface, the interface to the data analysis routines, and the analysis of instrument status.
The human role in space (THURIS) applications study. Final briefing
NASA Technical Reports Server (NTRS)
Maybee, George W.
1987-01-01
The THURIS (The Human Role in Space) application is an iterative process involving successive assessments of man/machine mixes in terms of performance, cost and technology to arrive at an optimum man/machine mode for the mission application. The process begins with user inputs which define the mission in terms of an event sequence and performance time requirements. The desired initial operational capability date is also an input requirement. THURIS terms and definitions (e.g., generic activities) are applied to the input data converting it into a form which can be analyzed using the THURIS cost model outputs. The cost model produces tabular and graphical outputs for determining the relative cost-effectiveness of a given man/machine mode and generic activity. A technology database is provided to enable assessment of support equipment availability for selected man/machine modes. If technology gaps exist for an application, the database contains information supportive of further investigation into the relevant technologies. The present study concentrated on testing and enhancing the THURIS cost model and subordinate data files and developing a technology database which interfaces directly with the user via technology readiness displays. This effort has resulted in a more powerful, easy-to-use applications system for optimization of man/machine roles. Volume 1 is an executive summary.
Timpka, T
2001-08-01
In an analysis departing from the global health situation, the foundation for a change of paradigm in health informatics based on socially embedded information infrastructures and technologies is identified and discussed. It is shown how an increasing computing and data transmitting capacity can be employed for proactive health computing. As a foundation for ubiquitous health promotion and prevention of disease and injury, proactive health systems use data from multiple sources to supply individuals and communities evidence-based information on means to improve their state of health and avoid health risks. The systems are characterised by: (1) being profusely connected to the world around them, using perceptual interfaces, sensors and actuators; (2) responding to external stimuli at faster than human speeds; (3) networked feedback loops; and (4) humans remaining in control, while being left outside the primary computing loop. The extended scientific mission of this new partnership between computer science, electrical engineering and social medicine is suggested to be the investigation of how the dissemination of information and communication technology on democratic grounds can be made even more important for global health than sanitation and urban planning became a century ago.
Human factors with nonhumans - Factors that affect computer-task performance
NASA Technical Reports Server (NTRS)
Washburn, David A.
1992-01-01
There are two general strategies that may be employed for 'doing human factors research with nonhuman animals'. First, one may use the methods of traditional human factors investigations to examine the nonhuman animal-to-machine interface. Alternatively, one might use performance by nonhuman animals as a surrogate for or model of performance by a human operator. Each of these approaches is illustrated with data in the present review. Chronic ambient noise was found to have a significant but inconsequential effect on computer-task performance by rhesus monkeys (Macaca mulatta). Additional data supported the generality of findings such as these to humans, showing that rhesus monkeys are appropriate models of human psychomotor performance. It is argued that ultimately the interface between comparative psychology and technology will depend on the coordinated use of both strategies of investigation.
When Medicine Meets Engineering—Paradigm Shifts in Diagnostics and Therapeutics
Wang, Hann; Silva, Aleidy; Ho, Chih-Ming
2013-01-01
During the last two decades, the manufacturing techniques of microfluidics-based devices have been phenomenally advanced, offering unlimited potential for bio-medical technologies. However, the direct applications of these technologies toward diagnostics and therapeutics are still far from maturity. The present challenges lie at the interfaces between the engineering systems and the biocomplex systems. A precisely designed engineering system with a narrow dynamic range is hard to seamlessly integrate with the adaptive biological system in order to achieve the design goals. These differences remain as the roadblock between two fundamentally incompatible systems. This paper will not extensively review the existing microfluidic sensors and actuators; rather, we will discuss the sources of the gaps for integration. We will also introduce system interface technologies for bridging the differences to lead toward paradigm shifts in diagnostics and therapeutics. PMID:26835672
Multimodal neuroelectric interface development
NASA Technical Reports Server (NTRS)
Trejo, Leonard J.; Wheeler, Kevin R.; Jorgensen, Charles C.; Rosipal, Roman; Clanton, Sam T.; Matthews, Bryan; Hibbs, Andrew D.; Matthews, Robert; Krupka, Michael
2003-01-01
We are developing electromyographic and electroencephalographic methods, which draw control signals for human-computer interfaces from the human nervous system. We have made progress in four areas: 1) real-time pattern recognition algorithms for decoding sequences of forearm muscle activity associated with control gestures; 2) signal-processing strategies for computer interfaces using electroencephalogram (EEG) signals; 3) a flexible computation framework for neuroelectric interface research; and 4) noncontact sensors, which measure electromyogram or EEG signals without resistive contact to the body.
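As an illustration of item 1 (decoding forearm muscle activity), the sketch below shows a generic EMG processing chain: band-pass filtering, a short-window RMS envelope, and a simple threshold detector. The parameters and pipeline are assumptions for illustration, not the project's actual algorithms.

```python
# Generic EMG gesture-detection sketch (assumed parameters, not the NASA pipeline).
import numpy as np
from scipy.signal import butter, filtfilt

def emg_gesture_detected(emg, fs=1000.0, band=(20.0, 450.0), win=0.1, thresh=0.2):
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, emg)                       # band-pass the raw signal
    n = int(win * fs)
    rms = np.sqrt(np.convolve(filtered**2, np.ones(n) / n, mode="same"))  # RMS envelope
    return bool(np.any(rms > thresh))                    # gesture when the envelope exceeds threshold

fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)
emg = 0.02 * np.random.randn(t.size)
emg[400:600] += 0.5 * np.sin(2 * np.pi * 80 * t[400:600])   # simulated muscle burst
print(emg_gesture_detected(emg, fs))                          # True
```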
Assessment of brain-machine interfaces from the perspective of people with paralysis.
Blabe, Christine H; Gilja, Vikash; Chestek, Cindy A; Shenoy, Krishna V; Anderson, Kim D; Henderson, Jaimie M
2015-08-01
One of the main goals of brain-machine interface (BMI) research is to restore function to people with paralysis. Currently, multiple BMI design features are being investigated, based on various input modalities (externally applied and surgically implantable sensors) and output modalities (e.g. control of computer systems, prosthetic arms, and functional electrical stimulation systems). While these technologies may eventually provide some level of benefit, they each carry associated burdens for end-users. We sought to assess the attitudes of people with paralysis toward using various technologies to achieve particular benefits, given the burdens currently associated with the use of each system. We designed and distributed a technology survey to determine the level of benefit necessary for people with tetraplegia due to spinal cord injury to consider using different technologies, given the burdens currently associated with them. The survey queried user preferences for 8 BMI technologies including electroencephalography, electrocorticography, and intracortical microelectrode arrays, as well as a commercially available eye tracking system for comparison. Participants used a 5-point scale to rate their likelihood to adopt these technologies for 13 potential control capabilities. Survey respondents were most likely to adopt BMI technology to restore some of their natural upper extremity function, including restoration of hand grasp and/or some degree of natural arm movement. High speed typing and control of a fast robot arm were also of interest to this population. Surgically implanted wireless technologies were twice as 'likely' to be adopted as their wired equivalents. Assessing end-user preferences is an essential prerequisite to the design and implementation of any assistive technology. The results of this survey suggest that people with tetraplegia would adopt an unobtrusive, autonomous BMI system for both restoration of upper extremity function and control of external devices such as communication interfaces.
NASA Technical Reports Server (NTRS)
1983-01-01
Space station architectural options, habitability considerations and subsystem analyses, technology, and programmatics are reviewed. The methodology employed for conceiving and defining space station concepts is presented. As a result of this approach, architectures were conceived and along with their supporting rationale are described within this portion of the report. Habitability consideration and subsystem analyses describe the human factors associated with space station operations and includes subsections covering (1) data management, (2) communications and tracking, (3) environmental control and life support, (4) manipulator systems, (5) resupply, (6) pointing, (7) thermal management and (8) interface standardization. A consolidated matrix of subsystems technology issues as related to meeting the mission needs for a 1990's era space station is presented. Within the programmatics portion, a brief description of costing and program strategies is outlined.
Data storage technology: Hardware and software, Appendix B
NASA Technical Reports Server (NTRS)
Sable, J. D.
1972-01-01
This project involves the development of more economical ways of integrating and interfacing new storage devices and data processing programs into a computer system. It involves developing interface standards and a software/hardware architecture which will make it possible to develop machine-independent devices and programs. These will interface with the machine-dependent operating systems of particular computers. The development project will not develop the software which would ordinarily be the responsibility of the manufacturer to supply, but rather the standards to which that software is expected to conform in providing an interface with the user or storage system.
FRIEND: a brain-monitoring agent for adaptive and assistive systems.
Morris, Alexis; Ulieru, Mihaela
2012-01-01
This paper presents an architectural design for adaptive-systems agents (FRIEND) that use brain state information to make more effective decisions on behalf of a user, measuring brain context against situational demands. These systems could be useful for alerting users to cognitive workload levels or fatigue, and could attempt to compensate for higher cognitive activity by filtering noise information. In some cases such systems could also share control of devices, such as pulling over in an automated vehicle. These systems aim to assist people in everyday settings to perform tasks better and be more aware of their internal states. Achieving a functioning system of this sort is a challenge, involving a unification of the brain-computer interface, human-computer interaction, soft computing, and deliberative multi-agent systems disciplines. Until recently, these could not be combined into a usable platform, due largely to technological limitations (e.g., size, cost, and processing speed), insufficient research on extracting behavioral states from EEG signals, and the lack of low-cost wireless sensing headsets. We aim to surpass these limitations and develop control architectures for making sense of brain state in applications by realizing an agent architecture for adaptive (human-aware) technology. In this paper we present an early, high-level design towards implementing a multi-purpose brain-monitoring agent system to improve user quality of life through the assistive applications of psycho-physiological monitoring, noise filtering, and shared system control.
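The adaptation step described above (compensating for high cognitive activity by filtering noise information) can be pictured with a very small sketch; the threshold, priorities and workload estimate below are hypothetical and are not taken from the FRIEND design.

```python
# Hypothetical workload-adaptive notification filter (illustrative only, not FRIEND).
def filter_notifications(notifications, workload, high_load=0.7):
    """notifications: list of (priority, message) with priority in [0, 1];
    workload: estimated cognitive load in [0, 1], e.g. derived from EEG features."""
    if workload < high_load:
        return [msg for _, msg in notifications]
    return [msg for prio, msg in notifications if prio >= 0.8]   # keep only critical items

inbox = [(0.3, "software update available"), (0.9, "lane departure warning")]
print(filter_notifications(inbox, workload=0.85))   # ['lane departure warning']
```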
2014-04-30
performance is to create a computational system to mimic human game-play patterns. The objective of this study is to see to what extent we can...estimates as a function of task load. We conducted a pair of studies towards this end. In a first study, described in detail in Appendix D...could inform a system as to the relative workload of a user. In a second study, described in detail in Appendix E, participants were exposed to a 40
Crew/Automation Interaction in Space Transportation Systems: Lessons Learned from the Glass Cockpit
NASA Technical Reports Server (NTRS)
Rudisill, Marianne
2000-01-01
The progressive integration of automation technologies in commercial transport aircraft flight decks - the 'glass cockpit' - has had a major, and generally positive, impact on flight crew operations. Flight deck automation has provided significant benefits, such as economic efficiency, increased precision and safety, and enhanced functionality within the crew interface. These enhancements, however, may have been accrued at a price, such as complexity added to crew/automation interaction that has been implicated in a number of aircraft incidents and accidents. This report briefly describes 'glass cockpit' evolution. Some relevant aircraft accidents and incidents are described, followed by a more detailed description of human/automation issues and problems (e.g., crew error, monitoring, modes, command authority, crew coordination, workload, and training). This paper concludes with example principles and guidelines for considering 'glass cockpit' human/automation integration within space transportation systems.
Computer interfaces for the visually impaired
NASA Technical Reports Server (NTRS)
Higgins, Gerry
1991-01-01
Information access via computer terminals extends to blind and low-vision persons employed in many technical and nontechnical disciplines. Two aspects of providing computer technology for persons with a vision-related handicap are detailed. The first is research into the most effective means of integrating existing adaptive technologies into information systems, conducted so that off-the-shelf products could be combined with adaptive equipment into cohesive, integrated information processing systems. Details are included that describe the type of functionality required in software to facilitate its incorporation into a speech and/or braille system. The second aspect is research into providing audible and tactile interfaces to graphics-based interfaces. Parameters are included for the design and development of the Mercator Project. The project will develop a prototype system for audible access to graphics-based interfaces. The system is being built within the public domain architecture of X Windows to show that it is possible to provide access to text-based applications within a graphical environment. This information will be valuable to suppliers of ADP equipment, since new legislation requires manufacturers to provide electronic access to the visually impaired.
NASA Astrophysics Data System (ADS)
Trujillo, Eddie J.; Ellersick, Steven D.
2006-05-01
The Boeing Electronic Flight Bag (EFB) is a key element in the evolutionary process of an "e-enabled" flight deck. The EFB is designed to improve the overall safety, efficiency, and operation of the flight deck and corresponding airline operations by providing the flight crew with better information and enhanced functionality in a user-friendly digital format. The EFB is intended to increase the pilots' situational awareness of the airplane and systems, as well as improve the efficiency of information management. The system will replace documents and forms that are currently stored or carried onto the flight deck and put them, in digital format, at the crew's fingertips. This paper describes what the Boeing EFB is and the significant human factors and interface design issues, trade-offs, and decisions made during development of the display system. In addition, EFB formats, graphics, input control methods, challenges using COTS (commercial-off-the-shelf)-leveraged glass and formatting technology are discussed. The optical design requirements, display technology utilized, brightness control system, reflection challenge, and the resulting optical performance are presented.
Fernández-Caballero, Antonio; Navarro, Elena; Fernández-Sotos, Patricia; González, Pascual; Ricarte, Jorge J.; Latorre, José M.; Rodriguez-Jimenez, Roberto
2017-01-01
This perspective paper faces the future of alternative treatments that take advantage of a social and cognitive approach with regards to pharmacological therapy of auditory verbal hallucinations (AVH) in patients with schizophrenia. AVH are the perception of voices in the absence of auditory stimulation and represents a severe mental health symptom. Virtual/augmented reality (VR/AR) and brain computer interfaces (BCI) are technologies that are growing more and more in different medical and psychological applications. Our position is that their combined use in computer-based therapies offers still unforeseen possibilities for the treatment of physical and mental disabilities. This is why, the paper expects that researchers and clinicians undergo a pathway toward human-avatar symbiosis for AVH by taking full advantage of new technologies. This outlook supposes to address challenging issues in the understanding of non-pharmacological treatment of schizophrenia-related disorders and the exploitation of VR/AR and BCI to achieve a real human-avatar symbiosis. PMID:29209193
Three-dimensional virtual acoustic displays
NASA Technical Reports Server (NTRS)
Wenzel, Elizabeth M.
1991-01-01
The development of an alternative medium for displaying information in complex human-machine interfaces is described. The 3-D virtual acoustic display is a means for accurately transferring information to a human operator using the auditory modality; it combines directional and semantic characteristics to form naturalistic representations of dynamic objects and events in remotely sensed or simulated environments. Although the technology can stand alone, it is envisioned as a component of a larger multisensory environment and will no doubt find its greatest utility in that context. The general philosophy in the design of the display has been that the development of advanced computer interfaces should be driven first by an understanding of human perceptual requirements, and later by technological capabilities or constraints. In expanding on this view, current and potential uses are addressed of virtual acoustic displays, such displays are characterized, and recent approaches to their implementation and application are reviewed, the research project at NASA-Ames is described in detail, and finally some critical research issues for the future are outlined.
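One of the directional cues such a display synthesizes is the interaural time difference; the sketch below uses Woodworth's textbook spherical-head approximation as an illustration, not the NASA-Ames implementation.

```python
# Interaural time difference (ITD) via Woodworth's spherical-head approximation
# (textbook model used here for illustration only).
import math

def interaural_time_difference(azimuth_deg, head_radius=0.0875, c=343.0):
    """Approximate ITD in seconds for a far-field source at the given azimuth."""
    theta = math.radians(azimuth_deg)
    return (head_radius / c) * (theta + math.sin(theta))

print(f"{interaural_time_difference(90.0) * 1e6:.0f} microseconds")   # ~656 for a source at 90 deg
```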
Common Database Interface for Heterogeneous Software Engineering Tools.
1987-12-01
Report documentation fragment: an Air Force Institute of Technology (Air University) thesis submitted in partial fulfillment of the requirements for the degree of Master of Science in Information Systems. Subject terms: Database Management Systems; Programming (Computers); Computer Files; Information Transfer; Interfaces. Contents fragments: Literature; System 690 Configuration; Database Functions; Software Engineering Environments; Data Manager.
Secure Web-based Ground System User Interfaces over the Open Internet
NASA Technical Reports Server (NTRS)
Langston, James H.; Murray, Henry L.; Hunt, Gary R.
1998-01-01
A prototype has been developed which makes use of commercially available products in conjunction with the Java programming language to provide a secure user interface for command and control over the open Internet. This paper reports successful demonstration of: (1) Security over the Internet, including encryption and certification; (2) Integration of Java applets with a COTS command and control product; (3) Remote spacecraft commanding using the Internet. The Java-based Spacecraft Web Interface to Telemetry and Command Handling (Jswitch) ground system prototype provides these capabilities. This activity demonstrates the use and integration of current technologies to enable a spacecraft engineer or flight operator to monitor and control a spacecraft from a user interface communicating over the open Internet using standard World Wide Web (WWW) protocols and commercial off-the-shelf (COTS) products. The core command and control functions are provided by the COTS Epoch 2000 product. The standard WWW tools and browsers are used in conjunction with the Java programming technology. Security is provided with the current encryption and certification technology. This system prototype is a step in the direction of giving scientists and flight operators Web-based access to instrument, payload, and spacecraft data.
Monitoring activities of daily living based on wearable wireless body sensor network.
Kańtoch, E; Augustyniak, P; Markiewicz, M; Prusak, D
2014-01-01
With recent advances in microprocessor chip technology, wireless communication, and biomedical engineering, it is possible to develop miniaturized ubiquitous health monitoring devices that are capable of recording physiological and movement signals during daily life activities. The aim of the research is to implement and test a prototype health monitoring system. The system consists of a body central unit with a Bluetooth module and wearable sensors: a custom-designed ECG sensor, a temperature sensor, a skin humidity sensor, and accelerometers placed on the human body or integrated with clothes, plus a network gateway to forward data to a remote medical server. The system includes a custom-designed transmission protocol and a remote web-based graphical user interface for real-time data analysis. Experimental results for a group of humans who performed various activities (e.g., working, running) showed a maximum 5% absolute error compared to certified medical devices. The results are promising and indicate that the developed wireless wearable monitoring system meets the challenges of multi-sensor human health monitoring during daily activities and opens new opportunities for developing novel healthcare services.
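As a rough illustration of the kind of custom transmission protocol such a body sensor network needs, the sketch below packs one multi-sensor sample into a compact binary frame for transfer over Bluetooth. The field layout, scaling factors, and sensor names are assumptions made for this example, not the protocol described in the paper.

```python
import struct
import time

# Hypothetical frame layout for one multi-sensor sample: sensor id, millisecond
# timestamp, ECG (uV), temperature and humidity (scaled by 100), two accel axes.
FRAME_FMT = "<BIhhhhh"

def pack_sample(sensor_id, ecg_uV, temp_c, humidity_pct, ax, ay):
    """Pack one sample into a compact binary frame (assumed layout)."""
    ts_ms = int(time.time() * 1000) & 0xFFFFFFFF
    return struct.pack(FRAME_FMT, sensor_id, ts_ms,
                       int(ecg_uV), int(temp_c * 100),
                       int(humidity_pct * 100), ax, ay)

def unpack_sample(frame):
    """Decode a frame back into engineering units on the gateway side."""
    sensor_id, ts_ms, ecg, temp, hum, ax, ay = struct.unpack(FRAME_FMT, frame)
    return {"sensor": sensor_id, "t_ms": ts_ms, "ecg_uV": ecg,
            "temp_C": temp / 100, "humidity_pct": hum / 100, "accel": (ax, ay)}

frame = pack_sample(sensor_id=1, ecg_uV=850, temp_c=36.7, humidity_pct=54.2, ax=12, ay=-3)
print(len(frame), "bytes:", unpack_sample(frame))
```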
NASA Technical Reports Server (NTRS)
Howard, S. D.
1987-01-01
Effective user interface design in software systems is a complex task that takes place without adequate modeling tools. By combining state transition diagrams and the storyboard technique of filmmakers, State Transition Storyboards were developed to provide a detailed modeling technique for the Goldstone Solar System Radar Data Acquisition System human-machine interface. Illustrations are included with a description of the modeling technique.
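To make the State Transition Storyboard idea concrete, the following minimal sketch pairs each interface state with a storyboard frame and maps operator actions to next states. The screen names, actions, and frame files are invented for illustration and do not come from the Goldstone system.

```python
# Hypothetical fragment of a State Transition Storyboard: each state names a
# storyboard frame (a sketch of the screen) and maps operator actions to the
# next state.
STORYBOARD = {
    "IDLE":      {"frame": "frame_01_main_menu.png",
                  "on": {"select_track": "CONFIGURE"}},
    "CONFIGURE": {"frame": "frame_02_track_setup.png",
                  "on": {"start": "ACQUIRING", "cancel": "IDLE"}},
    "ACQUIRING": {"frame": "frame_03_live_data.png",
                  "on": {"stop": "IDLE", "fault": "ALARM"}},
    "ALARM":     {"frame": "frame_04_alarm.png",
                  "on": {"acknowledge": "IDLE"}},
}

def step(state, action):
    """Return the next interface state for an operator action, or stay put."""
    return STORYBOARD[state]["on"].get(action, state)

state = "IDLE"
for action in ["select_track", "start", "fault", "acknowledge"]:
    state = step(state, action)
    print(action, "->", state, STORYBOARD[state]["frame"])
```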
NASA Technical Reports Server (NTRS)
Dischinger, H. Charles, Jr.; Mullins, Jeffrey B.
2005-01-01
The United States is entering a new period of human exploration of the inner Solar System, and robotic human helpers will be partners in that effort. In order to support integration of these new worker robots into existing and new human systems, a new design standard should be developed, to be called the Robot-Systems Integration Standard (RSIS). It will address the requirements for and constraints upon robotic collaborators with humans. These workers are subject to the same functional constraints as humans of work, reach, and visibility/situational awareness envelopes, and they will deal with the same maintenance and communication interfaces. Thus, the RSIS will be created by discipline experts with the same sort of perspective on these and other interface concerns as human engineers.
Skegro, Darko; Stutz, Cian; Ollier, Romain; Svensson, Emelie; Wassmann, Paul; Bourquin, Florence; Monney, Thierry; Gn, Sunitha; Blein, Stanislas
2017-06-09
Bispecific antibodies (bsAbs) are of significant importance to the development of novel antibody-based therapies, and heavy chain (Hc) heterodimers represent a major class of bispecific drug candidates. Current technologies for the generation of Hc heterodimers are suboptimal and often suffer from contamination by homodimers posing purification challenges. Here, we introduce a new technology based on biomimicry wherein the protein-protein interfaces of two different immunoglobulin (Ig) constant domain pairs are exchanged in part or fully to design new heterodimeric domains. The method can be applied across Igs to design Fc heterodimers and bsAbs. We investigated interfaces from human IgA CH3, IgD CH3, IgG1 CH3, IgM CH4, T-cell receptor (TCR) α/β, and TCR γ/δ constant domain pairs, and we found that they successfully drive human IgG1 CH3 or IgM CH4 heterodimerization to levels similar to or above those of reference methods. A comprehensive interface exchange between the TCR α/β constant domain pair and the IgG1 CH3 homodimer was evidenced by X-ray crystallography and used to engineer examples of bsAbs for cancer therapy. Parental antibody pairs were rapidly reformatted into scalable bsAbs that were free of homodimer traces by combining interface exchange, asymmetric Protein A binding, and the scFv × Fab format. In summary, we successfully built several new CH3- or CH4-based heterodimers that may prove useful for designing new bsAb-based therapeutics, and we anticipate that our approach could be broadly implemented across the Ig constant domain family. To our knowledge, CH4-based heterodimers have not been previously reported. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.
Opoku-Boateng, Gloria A.
2015-01-01
User frustration research has been one way of looking into clinicians' experience with health information technology use and interaction. In order to understand how clinician frustration with Health Information Technology (HIT) use occurs, there is a need to explore the Human-Computer Interaction (HCI) literature that addresses both frustration and HIT use. In the past three decades, HCI frustration research has increased and expanded. Researchers have done a great deal of work to understand emotions, end-user frustration, and affect. This paper uses a historical literature review approach to trace the origins of emotion and frustration research and to explore the research question: Does HCI research on frustration provide insights on clinicians' frustration with HIT interfaces? The literature review shows that HCI research on emotion and frustration provides additional insights that can indeed help explain user frustration in HIT. Different approaches and HCI perspectives also help frame HIT user frustration research as well as inform HIT system design. The paper concludes with suggested directions that future design and research may take. PMID:26958238
Fusion interfaces for tactical environments: An application of virtual reality technology
NASA Technical Reports Server (NTRS)
Haas, Michael W.
1994-01-01
The term Fusion Interface is defined as a class of interface which integrally incorporates both virtual and nonvirtual concepts and devices across the visual, auditory, and haptic sensory modalities. A fusion interface is a multisensory virtually-augmented synthetic environment. A new facility has been developed within the Human Engineering Division of the Armstrong Laboratory dedicated to exploratory development of fusion interface concepts. This new facility, the Fusion Interfaces for Tactical Environments (FITE) Facility, is a specialized flight simulator enabling efficient concept development through rapid prototyping and direct experience of new fusion concepts. The FITE Facility also supports evaluation of fusion concepts by operational fighter pilots in an air combat environment. The facility is utilized by a multidisciplinary design team composed of human factors engineers, electronics engineers, computer scientists, experimental psychologists, and operational pilots. The FITE computational architecture is composed of twenty-five 80486-based microcomputers operating in real time. The microcomputers generate out-the-window visuals, in-cockpit and head-mounted visuals, localized auditory presentations, and haptic displays on the stick and rudder pedals, as well as executing weapons models, aerodynamic models, and threat models.
NASA Technical Reports Server (NTRS)
Szczur, Martha R.
1990-01-01
The Transportable Applications Environment Plus (TAE Plus), developed at NASA's Goddard Space Flight Center, is a portable What You See Is What You Get (WYSIWYG) user interface development and management system. Its primary objective is to provide an integrated software environment that allows interactive prototyping and development of user interfaces, as well as management of the user interface within the operational domain. Although TAE Plus is applicable to many types of applications, its focus is supporting user interfaces for space applications. This paper discusses what TAE Plus provides and how the implementation has utilized state-of-the-art technologies within graphic workstations, windowing systems, and object-oriented programming languages.
Towards cooperative guidance and control of highly automated vehicles: H-Mode and Conduct-by-Wire.
Flemisch, Frank Ole; Bengler, Klaus; Bubb, Heiner; Winner, Hermann; Bruder, Ralph
2014-01-01
This article provides a general ergonomic framework of cooperative guidance and control for vehicles, with an emphasis on the cooperation between a human and a highly automated vehicle. In the twenty-first century, mobility and automation technologies are increasingly fused. In the sky, highly automated aircraft are flying with a high safety record. On the ground, a variety of driver assistance systems are being developed, and highly automated vehicles with increasingly autonomous capabilities are becoming possible. Human-centred automation has paved the way for a better cooperation between automation and humans. How can these highly automated systems be structured so that they can be easily understood, and how will they cooperate with the human? The presented research was conducted using the methods of iterative build-up and refinement of the framework by triangulation, i.e. by instantiating and testing the framework with at least two derived concepts and prototypes. This article sketches a general, conceptual ergonomic framework of cooperative guidance and control of highly automated vehicles, two concepts derived from the framework, prototypes, and pilot data. Cooperation is exemplified in a list of aspects and related to levels of the driving task. With the concept 'Conduct-by-Wire', cooperation happens mainly on the guidance level, where the driver can delegate manoeuvres to the automation with a specialised manoeuvre interface. With H-Mode, a haptic-multimodal interaction with highly automated vehicles based on the H(orse)-Metaphor, cooperation is mainly done on guidance and control with a haptically active interface. Cooperativeness should be a key aspect for future human-automation systems. Especially for highly automated vehicles, cooperative guidance and control is a research direction with already promising concepts and prototypes that should be further explored. The application of the presented approach is any human-machine system that moves and includes high levels of assistance/automation.
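A minimal sketch of the haptic shared-control idea behind H-Mode is given below: the torque felt at the steering wheel blends the driver's input with the automation's suggestion according to a "rein tightness" parameter. The gains, signal names, and one-dimensional simplification are assumptions for illustration, not the authors' implementation.

```python
# Haptic shared control in the spirit of H-Mode: net steering torque is a blend
# of driver torque and an automation torque derived from the automation's
# desired wheel angle. Gains and ranges are invented for this sketch.
def blended_steering(driver_torque, automation_angle, wheel_angle,
                     tightness=0.5, k_auto=2.0):
    """Return net steering torque; tightness in [0, 1] (0 = manual, 1 = tight rein)."""
    automation_torque = k_auto * (automation_angle - wheel_angle)
    return (1.0 - tightness) * driver_torque + tightness * automation_torque

# Driver pulls slightly left while the automation suggests a stronger left turn.
print(blended_steering(driver_torque=-1.0, automation_angle=-0.2,
                       wheel_angle=0.0, tightness=0.7))
```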
Risk Interfaces to Support Integrated Systems Analysis and Development
NASA Technical Reports Server (NTRS)
Mindock, Jennifer; Lumpkins, Sarah; Shelhamer, Mark; Anton, Wilma; Havenhill, Maria
2016-01-01
Objectives for the systems analysis capability: develop an integrated understanding of how a complex human physiological-socio-technical mission system behaves in spaceflight. Why? To support development of integrated solutions that prevent unwanted outcomes (implementable approaches to minimize mission resources such as mass, power, and crew time), and to support development of tools for autonomy, a need for exploration (assess and maintain resilience of individuals, teams, and the integrated system). Output of this exercise: a representation of interfaces based on Human System Risk Board (HSRB) Risk Summary information and a simple status based on the Human Research Roadmap; consolidated HSRB information applied to support communication; a point of departure for HRP Element planning; and the ability to track and communicate the status of collaborations.
Performance Evaluation Methods for Assistive Robotic Technology
NASA Astrophysics Data System (ADS)
Tsui, Katherine M.; Feil-Seifer, David J.; Matarić, Maja J.; Yanco, Holly A.
Robots have been developed for several assistive technology domains, including intervention for Autism Spectrum Disorders, eldercare, and post-stroke rehabilitation. Assistive robots have also been used to promote independent living through the use of devices such as intelligent wheelchairs, assistive robotic arms, and external limb prostheses. Work in the broad field of assistive robotic technology can be divided into two major research phases: technology development, in which new devices, software, and interfaces are created; and clinical, in which assistive technology is applied to a given end-user population. Moving from technology development towards clinical applications is a significant challenge. Developing performance metrics for assistive robots poses a related set of challenges. In this paper, we survey several areas of assistive robotic technology in order to derive and demonstrate domain-specific means for evaluating the performance of such systems. We also present two case studies of applied performance measures and a discussion regarding the ubiquity of functional performance measures across the sampled domains. Finally, we present guidelines for incorporating human performance metrics into end-user evaluations of assistive robotic technologies.
NAS infrastructure management system build 1.5 computer-human interface
DOT National Transportation Integrated Search
2001-01-01
Human factors engineers from the National Airspace System (NAS) Human Factors Branch (ACT-530) of the Federal Aviation Administration William J. Hughes Technical Center conducted an evaluation of the NAS Infrastructure Management System (NIMS) Build ...
The NASA automation and robotics technology program
NASA Technical Reports Server (NTRS)
Holcomb, Lee B.; Montemerlo, Melvin D.
1986-01-01
The development and objectives of the NASA automation and robotics technology program are reviewed. The objectives of the program are to utilize AI and robotics to increase the probability of mission success; decrease the cost of ground control; and increase the capability and flexibility of space operations. There is a need for real-time computational capability; an effective man-machine interface; and techniques to validate automated systems. Current programs in the areas of sensing and perception, task planning and reasoning, control execution, operator interface, and system architecture and integration are described. Programs aimed at demonstrating the capabilities of telerobotics and system autonomy are discussed.
Human-Robot Interface: Issues in Operator Performance, Interface Design, and Technologies
2006-07-01
Report excerpt (fragmented): discusses the use of lightweight portable robotic sensor platforms; notes that robotics has reached a point where some generalities of HRI transcend specific systems; describes typical control stations pairing display panels with control devices such as joysticks, wheels, and pedals (Kamsickas, 2003); and distinguishes tasks that do not involve mobility, usually camera control or data fusion from sensors, from active search tasks that involve mobility.
The scientific data acquisition system of the GAMMA-400 space project
NASA Astrophysics Data System (ADS)
Bobkov, S. G.; Serdin, O. V.; Gorbunov, M. S.; Arkhangelskiy, A. I.; Topchiev, N. P.
2016-02-01
The description of the scientific data acquisition system (SDAS) designed by SRISA for the GAMMA-400 space project is presented. We consider the problem of unifying electronics at different levels: a set of reliable, fault-tolerant integrated circuits fabricated in 0.25 μm Silicon-on-Insulator CMOS technology, and the high-speed interfaces and reliable modules used in the space instruments. The characteristics of the reliable, fault-tolerant very large scale integration (VLSI) technology designed by SRISA for the development of computation systems for space applications are considered. The scalable network structure of the SDAS, based on the Serial RapidIO interface and running the BAGET real-time operating system, is also described.
Exploration Space Suit Architecture and Destination Environmental-Based Technology Development
NASA Technical Reports Server (NTRS)
Hill, Terry R.; Korona, F. Adam; McFarland, Shane
2012-01-01
This paper continues forward where EVA Space Suit Architecture: Low Earth Orbit Vs. Moon Vs. Mars [1] left off in the development of a space suit architecture that is modular in design and could be reconfigured prior to launch or during any given mission depending on the tasks or destination. This paper will address the space suit system architecture and technologies required based upon human exploration extravehicular activity (EVA) destinations, and describe how they should evolve to meet the future exploration EVA needs of the US human space flight program.1, 2, 3 In looking forward to future US space exploration to a space suit architecture with maximum reuse of technology and functionality across a range of mission profiles and destinations, a series of exercises and analyses have provided a strong indication that the Constellation Program (CxP) space suit architecture is postured to provide a viable solution for future exploration missions4. The destination environmental analysis presented in this paper demonstrates that the modular architecture approach could provide the lowest mass and mission cost for the protection of the crew given any human mission outside of low-Earth orbit (LEO). Additionally, some of the high-level trades presented here provide a review of the environmental and non-environmental design drivers that will become increasingly important the farther away from Earth humans venture. This paper demonstrates a logical clustering of destination design environments that allows a focused approach to technology prioritization, development, and design that will maximize the return on investment, independent of any particular program, and provide architecture and design solutions for space suit systems in time or ahead of need dates for any particular crewed flight program in the future. The approach to space suit design and interface definition discussion will show how the architecture is very adaptable to programmatic and funding changes with minimal redesign effort such that the modular architecture can be quickly and efficiently honed into a specific mission point solution if required. Additionally, the modular system will allow for specific technology incorporation and upgrade as required with minimal redesign of the system.
Vermeulen, Joan; Neyens, Jacques C. L; Spreeuwenberg, Marieke D; Van Rossum, Erik; Hewson, David; De Witte, Luc P
2012-01-01
Background The number of frail elderly people is increasing. Unfortunately, the number of caregivers is not increasing at the same pace, which affects older people, caregivers and healthcare systems. Because of these developments, self-management is becoming more important in healthcare. To support community-dwelling elderly people in their self-management, a system was developed that monitors their physical functioning. This system provides feedback to elderly people and their caregivers regarding physical indicators of frailty. The feedback is provided to elderly people via the screen of a mobile phone. It is important that elderly people understand the content of the feedback and are able to use the mobile phone properly. If not, it is unlikely that the system can support self-management. Many interactive health technologies that have been developed do not fulfil their promises. An important reason for this is that human and other non-technology issues are not sufficiently taken into consideration during the development process. Objective To collaborate with elderly people during the development and evaluation of a feedback system for community-dwelling elderly people regarding physical functioning. Methods An iterative user-centered design that consists of five phases was used to develop and evaluate the feedback system. These five phases were: 1) Selection of users, 2) Analysis of users and their context, 3) Identification of user needs, 4) Development of a prototype, and 5) Evaluation of the prototype. Three representatives of a target group panel for elderly people were selected in phase 1. They shared their needs and preferences during three expert group meetings that took place during the development process. This resulted in the development of a prototype which was first evaluated in a heuristic evaluation. Once adjustments were made, 11 elderly people evaluated the adjusted prototype using a think-aloud procedure. They rated the usability and acceptability of the developed interface on a scale from 1 to 7 using an adapted version of the Post-Study System Usability Questionnaire (PSSUQ). Results A feedback system was developed that provides feedback regarding physical indicators of frailty via a touch screen mobile phone. The interface uses colours, smileys, and spoken/written messages to provide feedback that is easy to understand. The heuristic evaluation revealed that there were some problems with consistency and the use of user language. The think-aloud evaluation showed that the 11 elderly people were able to navigate through the interface without much difficulty despite some small problems related to the layout of the interface. The mean score on an adapted version of the PSSUQ was 5.90 (SD 1.09), which indicates high user satisfaction and good usability. Conclusions The involvement of end-users significantly influenced the layout of the interface that was developed. This resulted in an interface that was accepted by the target group. Evaluation of the prototype revealed that the usability of the interface was good. The feedback system will only succeed in supporting self-management when elderly people are able to use the interface and understand the feedback. The input of elderly people during the development process contributed to this.
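The sort of rule that could turn a physical-functioning indicator into the colour/smiley/message feedback described above might look like the sketch below. The indicator, thresholds, and messages are invented for illustration and are not the system's actual rules.

```python
# Illustrative mapping from one physical-functioning indicator to simple,
# easy-to-understand feedback elements (colour, smiley, short message).
def frailty_feedback(grip_strength_kg, reference_kg):
    ratio = grip_strength_kg / reference_kg
    if ratio >= 0.9:
        return {"colour": "green", "smiley": ":)", "message": "You are doing well."}
    if ratio >= 0.7:
        return {"colour": "orange", "smiley": ":|", "message": "Try to keep active today."}
    return {"colour": "red", "smiley": ":(", "message": "Please contact your caregiver."}

print(frailty_feedback(grip_strength_kg=18, reference_kg=30))
```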
Designing an operator interface? Consider user's 'psychology'
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toffer, D.E.
The modern operator interface is a channel of communication between operators and the plant that, ideally, provides them with information necessary to keep the plant running at maximum efficiency. Advances in automation technology have increased information flow from the field to the screen. New and improved Supervisory Control and Data Acquisition (SCADA) packages provide designers with powerful and open design considerations. All too often, however, systems go to the field designed for the software rather than the operator. Plant operators' jobs have changed fundamentally, from controlling their plants from out in the field to doing so from within control rooms. Control room-based operation does not denote idleness. Trained operators should be engaged in examination of plant status and cognitive evaluation of plant efficiencies. Designers who are extremely computer literate often do not consider the demographics of field operators. Many field operators have little knowledge of modern computer systems. As a result, they do not take full advantage of the interface's capabilities. Designers often fail to understand the true nature of how operators run their plants. To aid field operators, designers must provide familiar controls and intuitive choices. To achieve success in interface design, it is necessary to understand the ways in which humans think conceptually, and to understand how they process this information physically. The physical and the conceptual are closely related when working with any type of interface. Designers should ask themselves: "What type of information is useful to the field operator?" Let's explore an integration model that contains the following key elements: (1) Easily navigated menus; (2) Reduced chances for misunderstanding; (3) Accurate representations of the plant or operation; (4) Consistent and predictable operation; (5) A pleasant and engaging interface that conforms to the operator's expectations. 4 figs.
User requirements for a patient scheduling system
NASA Technical Reports Server (NTRS)
Zimmerman, W.
1979-01-01
A rehabilitation institute's needs and wants from a scheduling system were established by (1) studying the existing scheduling system and the variables that affect patient scheduling, (2) conducting a human-factors study to establish the human interfaces that affect patients' meeting prescribed therapy schedules, and (3) developing and administering a questionnaire to the staff which pertains to the various interface problems in order to identify staff requirements to minimize scheduling problems and other factors that may limit the effectiveness of any new scheduling system.
Portable Computer Technology (PCT) Research and Development Program Phase 2
NASA Technical Reports Server (NTRS)
Castillo, Michael; McGuire, Kenyon; Sorgi, Alan
1995-01-01
This project report focused on: (1) Design and development of two Advanced Portable Workstation 2 (APW 2) units. These units incorporate advanced technology features such as a low power Pentium processor, a high resolution color display, National Television Standards Committee (NTSC) video handling capabilities, a Personal Computer Memory Card International Association (PCMCIA) interface, and Small Computer System Interface (SCSI) and ethernet interfaces. (2) Use of these units to integrate and demonstrate advanced wireless network and portable video capabilities. (3) Qualification of the APW 2 systems for use in specific experiments aboard the Mir Space Station. A major objective of the PCT Phase 2 program was to help guide future choices in computing platforms and techniques for meeting National Aeronautics and Space Administration (NASA) mission objectives. The focus was on the development of optimal configurations of computing hardware, software applications, and network technologies for use on NASA missions.
NASA Technical Reports Server (NTRS)
Mccann, Robert S.; Spirkovska, Lilly; Smith, Irene
2013-01-01
Integrated System Health Management (ISHM) technologies have advanced to the point where they can provide significant automated assistance with real-time fault detection, diagnosis, guided troubleshooting, and failure consequence assessment. To exploit these capabilities in actual operational environments, however, ISHM information must be integrated into operational concepts and associated information displays in ways that enable human operators to process and understand the ISHM system information rapidly and effectively. In this paper, we explore these design issues in the context of an advanced caution and warning system (ACAWS) for next-generation crewed spacecraft missions. User interface concepts for depicting failure diagnoses, failure effects, redundancy loss, "what-if" failure analysis scenarios, and resolution of ambiguity groups are discussed and illustrated.
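One way to picture the ambiguity groups mentioned above is as the set of candidate faults whose symptom signatures remain consistent with everything observed so far; new evidence narrows the group. The toy sketch below illustrates this, with fault names, signatures, and the narrowing rule all assumed for the example rather than taken from ACAWS.

```python
# Toy ambiguity-group handling for a caution-and-warning display: each candidate
# fault predicts a set of symptoms; the ambiguity group is every fault whose
# signature covers the observed symptoms and predicts none of the absent ones.
FAULT_SIGNATURES = {
    "pump_A_failed":   {"low_flow_loop1", "pump_A_current_zero"},
    "valve_3_stuck":   {"low_flow_loop1", "valve_3_position_mismatch"},
    "sensor_F1_drift": {"low_flow_loop1"},
}

def ambiguity_group(observed, absent):
    """Faults still consistent with the observed and confirmed-absent symptoms."""
    group = []
    for fault, signature in FAULT_SIGNATURES.items():
        if observed <= signature and not (signature & absent):
            group.append(fault)
    return group

print(ambiguity_group({"low_flow_loop1"}, set()))                    # three candidates
print(ambiguity_group({"low_flow_loop1"}, {"pump_A_current_zero"}))  # group narrows
```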
Design of Plant Eco-physiology Monitoring System Based on Embedded Technology
NASA Astrophysics Data System (ADS)
Li, Yunbing; Wang, Cheng; Qiao, Xiaojun; Liu, Yanfei; Zhang, Xinlu
A real-time system has been developed to comprehensively collect plant growth information. Plant eco-physiological signals can be collected and analyzed effectively. The system adopts embedded technology: a wireless sensor network collects the eco-physiological information, while a touch screen and ARM microprocessor let the system work independently without a PC. The system is versatile, and all parameters can be set via the touch screen. Intelligent compensation of the sensors is realized in the system. Information can be displayed either graphically or in table form. The ARM microprocessor provides the interface to connect to the internet, so the system supports remote monitoring and control. The system has the advantages of a friendly interface and flexible construction and extension. It is a good tool for plant management.
NASA Technical Reports Server (NTRS)
Smyth, R. K. (Editor)
1979-01-01
The state of the art survey (SOAS) covers six technology areas including flightpath management, aircraft control system, crew station technology, interface & integration technology, military technology, and fundamental technology. The SOAS included contributions from over 70 individuals in industry, government, and the universities.
Towards Effective Non-Invasive Brain-Computer Interfaces Dedicated to Gait Rehabilitation Systems
Castermans, Thierry; Duvinage, Matthieu; Cheron, Guy; Dutoit, Thierry
2014-01-01
In the last few years, significant progress has been made in the field of walk rehabilitation. Motor cortex signals in bipedal monkeys have been interpreted to predict walk kinematics. Epidural electrical stimulation in rats and in one young paraplegic has been realized to partially restore motor control after spinal cord injury. However, these experimental trials are far from being applicable to all patients suffering from motor impairments. Therefore, it is thought that more simple rehabilitation systems are desirable in the meanwhile. The goal of this review is to describe and summarize the progress made in the development of non-invasive brain-computer interfaces dedicated to motor rehabilitation systems. In the first part, the main principles of human locomotion control are presented. The paper then focuses on the mechanisms of supra-spinal centers active during gait, including results from electroencephalography, functional brain imaging technologies [near-infrared spectroscopy (NIRS), functional magnetic resonance imaging (fMRI), positron-emission tomography (PET), single-photon emission-computed tomography (SPECT)] and invasive studies. The first brain-computer interface (BCI) applications to gait rehabilitation are then presented, with a discussion about the different strategies developed in the field. The challenges to raise for future systems are identified and discussed. Finally, we present some proposals to address these challenges, in order to contribute to the improvement of BCI for gait rehabilitation. PMID:24961699
1990-11-01
Report excerpt (fragmented): the objective was to design and implement an adaptive intelligent interface for a command-and-control-style domain. The primary functionality of the resulting... ...technical tasks, as follows: 1. Analysis of Current Interface Technologies; 2. Delineation of User Roles; 3. Development of User Models; 4. Design of Interface... ...Management Association (FEMA). In the initial version of the prototype, two distinct user models were designed. One type of user modeled by the system is...
In-Space Crew-Collaborative Task Scheduling
NASA Technical Reports Server (NTRS)
Jaap, John; Meyer, Patrick; Davis, Elizabeth; Richardson, Lea
2006-01-01
As humans venture farther from earth for longer durations, it will become essential for those on the journey to have significant control over the scheduling of their own activities as well as the activities of their companion systems and robots. However, there are many reasons why the crew will not do all the scheduling; timelines will be the result of collaboration with ground personnel. Emerging technologies such as in-space message buses, delay-tolerant networks, and in-space internet will be the carriers on which the collaboration rides. Advances in scheduling technology, in the areas of task modeling, scheduling engines, and user interfaces will allow the crew to become virtual scheduling experts. New concepts of operations for producing the timeline will allow the crew and the ground support to collaborate while providing safeguards to ensure that the mission will be effectively accomplished without endangering the systems or personnel.
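To suggest the kind of constraint checking a crew-facing scheduling engine performs, the sketch below builds a toy timeline that respects task durations, precedence, and one-task-per-crew-member availability. The tasks, rules, and greedy ordering are assumptions for illustration only; the ordering works here because predecessors happen to have fewer dependencies than their successors.

```python
# Minimal greedy timeline builder: each task has an assigned crew member, a
# duration (minutes), and a list of predecessor tasks that must finish first.
TASKS = {
    "setup_experiment": {"crew": "FE1", "duration": 30, "after": []},
    "run_experiment":   {"crew": "FE1", "duration": 60, "after": ["setup_experiment"]},
    "robot_checkout":   {"crew": "FE2", "duration": 45, "after": []},
}

def build_timeline(tasks):
    start, finish, crew_free = {}, {}, {}
    # schedule tasks with fewer predecessors first (sufficient for this toy set)
    for name in sorted(tasks, key=lambda n: len(tasks[n]["after"])):
        t = tasks[name]
        earliest = max([finish.get(p, 0) for p in t["after"]] +
                       [crew_free.get(t["crew"], 0)])
        start[name] = earliest
        finish[name] = earliest + t["duration"]
        crew_free[t["crew"]] = finish[name]
    return start

for task, t0 in build_timeline(TASKS).items():
    print(f"{task}: starts at +{t0} min")
```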
DARPA challenge: developing new technologies for brain and spinal injuries
NASA Astrophysics Data System (ADS)
Macedonia, Christian; Zamisch, Monica; Judy, Jack; Ling, Geoffrey
2012-06-01
The repair of traumatic injuries to the central nervous system remains among the most challenging and exciting frontiers in medicine. In both traumatic brain injury and spinal cord injuries, the ultimate goals are to minimize damage and foster recovery. Numerous DARPA initiatives are in progress to meet these goals. The PREventing Violent Explosive Neurologic Trauma program focuses on the characterization of non-penetrating brain injuries resulting from explosive blast, devising predictive models and test platforms, and creating strategies for mitigation and treatment. To this end, animal models of blast induced brain injury are being established, including swine and non-human primates. Assessment of brain injury in blast injured humans will provide invaluable information on brain injury associated motor and cognitive dysfunctions. The Blast Gauge effort provided a device to measure warfighter's blast exposures which will contribute to diagnosing the level of brain injury. The program Cavitation as a Damage Mechanism for Traumatic Brain Injury from Explosive Blast developed mathematical models that predict stresses, strains, and cavitation induced from blast exposures, and is devising mitigation technologies to eliminate injuries resulting from cavitation. The Revolutionizing Prosthetics program is developing an avant-garde prosthetic arm that responds to direct neural control and provides sensory feedback through electrical stimulation. The Reliable Neural-Interface Technology effort will devise technologies to optimally extract information from the nervous system to control next generation prosthetic devices with high fidelity. The emerging knowledge and technologies arising from these DARPA programs will significantly improve the treatment of brain and spinal cord injured patients.
Deep Space Network (DSN), Network Operations Control Center (NOCC) computer-human interfaces
NASA Technical Reports Server (NTRS)
Ellman, Alvin; Carlton, Magdi
1993-01-01
The Network Operations Control Center (NOCC) of the DSN is responsible for scheduling the resources of the DSN and monitoring all multi-mission spacecraft tracking activities in real time. Operations performs this job with computer systems at JPL connected to over 100 computers at Goldstone, Australia, and Spain. The old computer system became obsolete, and the first version of the new system was installed in 1991. Significant improvements to the computer-human interfaces became the dominant theme of the replacement project. Major issues required innovative problem solving. Among these issues were: How to present several thousand data elements on displays without overloading the operator? What is the best graphical representation of DSN end-to-end data flow? How to operate the system without memorizing the mnemonics of hundreds of operator directives? Which computing environment will meet the competing performance requirements? This paper presents the technical challenges, engineering solutions, and results of the NOCC computer-human interface design.
3min. poster presentations of B01
NASA Astrophysics Data System (ADS)
Foing, Bernard H.
We give a report on recommendations from ILEWG International conferences held at Cape Canaveral in 2008 (ICEUM10) and in Beijing in May 2010 with IAF (GLUC-ICEUM11). We discuss the different rationales for Moon exploration. Priorities for scientific investigations include: clues on the formation and evolution of rocky planets, accretion and bombardment in the inner solar system, comparative planetology processes (tectonic, volcanic, impact cratering, volatile delivery), historical records, astrobiology, survival of organics; past, present and future life. The ILEWG technology task group set priorities for the advancement of instrumentation: Remote sensing miniaturised instruments; Surface geophysical and geochemistry package; Instrument deployment and robotic arm, nano-rover, sampling, drilling; Sample finder and collector; Regional mobility rover; Autonomy and Navigation; Artificially intelligent robots, Complex systems. The ILEWG ExogeoLab pilot project was developed as support for instruments, landers, rovers, and preparation for a cooperative robotic village. The ILEWG lunar base task group looked at minimal design concepts and technologies in robotic and human exploration with tele-control, telepresence, virtual reality, man-machine interface and performances. The ILEWG ExoHab pilot project has been started with support from agencies and partners. We discuss ILEWG terrestrial Moon-Mars campaigns for validation of technologies, research and human operations. We indicate how Moon-Mars exploration can inspire solutions to global Earth sustained development: in-situ utilisation of resources; establishment of permanent robotic infrastructures; environmental protection aspects; life sciences laboratories; support to human exploration. Co-Authors: ILEWG Task Groups on: Science, Technology, Robotic village, Lunar Bases, Commercial and Societal aspects, Roadmap synergies with other programmes, Public engagement and Outreach, Young Lunar Explorers.
Embedded CMOS basecalling for nanopore DNA sequencing.
Chengjie Wang; Junli Zheng; Magierowski, Sebastian; Ghafar-Zadeh, Ebrahim
2016-08-01
DNA sequencing based on nanopore sensors is now entering the marketplace. The ability to interface this technology to established CMOS microelectronics promises significant improvements in functionality and miniaturization. Among the key functions to benefit from this interface will be basecalling, the conversion of raw electronic molecular signatures to nucleotide sequence predictions. This paper presents the design and performance potential of custom CMOS basecallers embedded alongside nanopore sensors. A basecalling architecture implemented in 32-nm technology is discussed, with the ability to process the equivalent of 20 human genomes per day in real time at a power density of 5 W/cm2, assuming a 3-mer nanopore sensor.
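As a hint of the computation an embedded basecaller performs, the toy sketch below runs a Viterbi search over 3-mer states with Gaussian current levels. The level model, noise, and uniform transition probabilities are invented for the example and do not reflect real pore chemistry or the ASIC design discussed in the paper.

```python
import itertools
import math

BASES = "ACGT"
KMERS = ["".join(k) for k in itertools.product(BASES, repeat=3)]
# Made-up mean current level (pA) per 3-mer, just to give each state a signature.
LEVEL = {k: 60 + 10 * BASES.index(k[1]) + 2 * BASES.index(k[0]) for k in KMERS}

def log_emit(current_pA, kmer, sigma=3.0):
    """Gaussian log-likelihood of one current sample given a 3-mer state."""
    d = current_pA - LEVEL[kmer]
    return -0.5 * (d / sigma) ** 2 - math.log(sigma)

def viterbi(signal):
    """Most likely 3-mer path for a sequence of current samples."""
    score = {k: log_emit(signal[0], k) for k in KMERS}
    back = []
    for x in signal[1:]:
        new_score, new_back = {}, {}
        for k in KMERS:
            # the previous 3-mer must overlap this one by two bases
            prevs = [b + k[:2] for b in BASES]
            best = max(prevs, key=lambda p: score[p])
            new_score[k] = score[best] + log_emit(x, k)  # transitions assumed uniform
            new_back[k] = best
        back.append(new_back)
        score = new_score
    path = [max(score, key=score.get)]
    for bp in reversed(back):
        path.append(bp[path[-1]])
    return list(reversed(path))

print(viterbi([62.0, 72.0, 84.0]))
```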
Advances in neuroprosthetic learning and control.
Carmena, Jose M
2013-01-01
Significant progress has occurred in the field of brain-machine interfaces (BMI) since the first demonstrations with rodents, monkeys, and humans controlling different prosthetic devices directly with neural activity. This technology holds great potential to aid large numbers of people with neurological disorders. However, despite this initial enthusiasm and the plethora of available robotic technologies, existing neural interfaces cannot as yet master the control of prosthetic, paralyzed, or otherwise disabled limbs. Here I briefly discuss recent advances from our laboratory into the neural basis of BMIs that should lead to better prosthetic control and clinically viable solutions, as well as new insights into the neurobiology of action.
Phase I Report: DARPA Exoskeleton Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jansen, J.F.
2004-01-21
The Defense Advanced Research Projects Agency (DARPA) inaugurated a program addressing research and development for an Exoskeleton for Human Performance Augmentation in FY 2001. A team consisting of Oak Ridge National Laboratory, the prime contractor, AeroVironment, Inc., the Army Research Laboratory, the University of Minnesota, and the Virginia Polytechnic Institute has recently completed an 18-month Phase I effort in support of this DARPA program. The Phase I effort focused on the development and proof-of-concept demonstrations for key enabling technologies, laying the foundation for subsequently building and demonstrating a prototype exoskeleton. The overall approach was driven by the need to optimize energy efficiency while providing a system that augmented the operator in as transparent a manner as possible (non-impeding). These needs led to the evolution of two key distinguishing features of this team's approach. The first is the "no knee contact" concept. This concept is dependent on a unique Cartesian-based control scheme that uses force sensing at the foot and backpack attachments to allow the exoskeleton to closely follow the operator while avoiding the difficulty of connecting and sensing position at the knee. The second is an emphasis on energy efficiency, manifested by an energetics, power, actuation, and controls approach designed to enhance energy efficiency, as well as a reconfigurable kinematic structure that provides a non-anthropomorphic configuration to support an energy-saving long-range march/transport mode. The enabling technologies addressed in the first phase were controls and sensing, the soft tissue interface between the machine and the operator, the power system, and actuation. The controller approach was implemented and demonstrated on a test stand with an actual operator. Control stability, low operator fatigue, force amplification, and the human interface were all successfully demonstrated, validating the controls approach. A unique, lightweight, low-profile, multi-axis foot sensor (an integral element of the controls approach) was designed, fabricated, and its performance verified. A preliminary conceptual design of the human coupling and soft tissue interface, based on biomechanics research, has been developed along with a test plan to support an iterative design process. The power system concept, a fuel cell hybrid power supply using chemically generated hydrogen, was successfully demonstrated and shown to be able to efficiently meet both steady-state and transient peak loads. Two actuator approaches have been investigated: a piezoelectric actuator with theoretically high power densities, and an approach based on a high-performance, high-speed electric motor driving a miniature hydraulic pump. The first shows great potential but will require further research before reaching that promise. The other approach has been modeled and simulated and shown to provide the possibility for significant energy savings (>30%) and improved power densities in comparison to conventional hydraulics. Biomechanics analysis and testing were also performed in support of these enabling technologies, to provide a basis for design criteria. An analysis was performed to determine baseline data for initial mechanical design and power supply sizing. Testing conducted to evaluate boot sole thickness found that thickness increases up to two inches could be accommodated without significant impact on human factors issues.
This 18-month-long Phase I effort has evaluated key enabling technologies and demonstrated advances in these technologies that have significantly increased the likelihood of building a functional prototype exoskeleton.
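The force-following control idea behind the "no knee contact" concept lends itself to an admittance-style sketch: measured interaction forces at the attachments are mapped to a limited velocity command so the machine moves with the operator. The one-dimensional loop below is an illustrative assumption, with made-up gains and limits, not the ORNL controller.

```python
# One control tick of a simple admittance ("get out of the way") loop: sensed
# interaction force -> limited Cartesian velocity command -> position update.
def follow_step(force_N, position_m, dt=0.002, admittance=0.02, v_max=1.0):
    """Map sensed interaction force to a speed-limited follow motion."""
    v_cmd = admittance * force_N              # smaller admittance feels stiffer
    v_cmd = max(-v_max, min(v_max, v_cmd))    # safety limit on commanded speed
    return position_m + v_cmd * dt, v_cmd

pos = 0.0
for f in [5.0, 12.0, -3.0]:                   # operator pushes, then pulls back
    pos, v = follow_step(f, pos)
    print(f"force {f:+.1f} N -> velocity {v:+.3f} m/s, position {pos:.5f} m")
```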
Exploring Life Support Architectures for Evolution of Deep Space Human Exploration
NASA Technical Reports Server (NTRS)
Anderson, Molly S.; Stambaugh, Imelda C.
2015-01-01
Life support system architectures for long duration space missions are often explored analytically in the human spaceflight community to find optimum solutions for mass, performance, and reliability. But in reality, many other constraints can guide the design when the life support system is examined within the context of an overall vehicle, as well as specific programmatic goals and needs. Between the end of the Constellation program and the development of the "Evolvable Mars Campaign", NASA explored a broad range of mission possibilities. Most of these missions will never be implemented but the lessons learned during these concept development phases may color and guide future analytical studies and eventual life support system architectures. This paper discusses several iterations of design studies from the life support system perspective to examine which requirements and assumptions, programmatic needs, or interfaces drive design. When doing early concept studies, many assumptions have to be made about technology and operations. Data can be pulled from a variety of sources depending on the study needs, including parametric models, historical data, new technologies, and even predictive analysis. In the end, assumptions must be made in the face of uncertainty. Some of these may introduce more risk as to whether the solution for the conceptual design study will still work when designs mature and data becomes available.
Implementation of an Adaptive Controller System from Concept to Flight Test
NASA Technical Reports Server (NTRS)
Larson, Richard R.; Burken, John J.; Butler, Bradley S.; Yokum, Steve
2009-01-01
The National Aeronautics and Space Administration Dryden Flight Research Center (Edwards, California) is conducting ongoing flight research using adaptive controller algorithms. A highly modified McDonnell-Douglas NF-15B airplane called the F-15 Intelligent Flight Control System (IFCS) is used to test and develop these algorithms. Modifications to this airplane include adding canards and changing the flight control systems to interface a single-string research controller processor for neural network algorithms. Research goals include demonstration of revolutionary control approaches that can efficiently optimize aircraft performance in both normal and failure conditions and advancement of neural-network-based flight control technology for new aerospace system designs. This report presents an overview of the processes utilized to develop adaptive controller algorithms during a flight-test program, including a description of initial adaptive controller concepts and a discussion of modeling formulation and performance testing. Design finalization led to integration with the system interfaces, verification of the software, validation of the hardware to the requirements, design of failure detection, development of safety limiters to minimize the effect of erroneous neural network commands, and creation of flight test control room displays to maximize human situational awareness; these are also discussed.
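A safety limiter of the kind mentioned for the neural-network commands can be pictured as magnitude and rate clipping applied to the adaptive increment before it reaches the control surface, so an erroneous adaptation cannot drive the surface hard over. The sketch below is illustrative, with invented limits and signal names rather than the IFCS values.

```python
# Clamp the adaptive (neural-network) increment in magnitude and in rate of
# change per control step before adding it to the baseline surface command.
def limited_command(baseline_deg, nn_increment_deg, prev_increment_deg,
                    max_increment_deg=3.0, max_rate_deg_per_step=0.5):
    inc = max(-max_increment_deg, min(max_increment_deg, nn_increment_deg))
    step = inc - prev_increment_deg
    step = max(-max_rate_deg_per_step, min(max_rate_deg_per_step, step))
    inc = prev_increment_deg + step
    return baseline_deg + inc, inc

cmd, inc = limited_command(baseline_deg=2.0, nn_increment_deg=10.0,
                           prev_increment_deg=0.0)
print(cmd, inc)   # the 10-degree request is clipped to a 0.5-degree first step
```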
A framework for analyzing the cognitive complexity of computer-assisted clinical ordering.
Horsky, Jan; Kaufman, David R; Oppenheim, Michael I; Patel, Vimla L
2003-01-01
Computer-assisted provider order entry is a technology that is designed to expedite medical ordering and to reduce the frequency of preventable errors. This paper presents a multifaceted cognitive methodology for the characterization of cognitive demands of a medical information system. Our investigation was informed by the distributed resources (DR) model, a novel approach designed to describe the dimensions of user interfaces that introduce unnecessary cognitive complexity. This method evaluates the relative distribution of external (system) and internal (user) representations embodied in system interaction. We conducted an expert walkthrough evaluation of a commercial order entry system, followed by a simulated clinical ordering task performed by seven clinicians. The DR model was employed to explain variation in user performance and to characterize the relationship of resource distribution and ordering errors. The analysis revealed that the configuration of resources in this ordering application placed unnecessarily heavy cognitive demands on the user, especially on those who lacked a robust conceptual model of the system. The resources model also provided some insight into clinicians' interactive strategies and patterns of associated errors. Implications for user training and interface design based on the principles of human-computer interaction in the medical domain are discussed.
Watershed System Model: The Essentials to Model Complex Human-Nature System at the River Basin Scale
NASA Astrophysics Data System (ADS)
Li, Xin; Cheng, Guodong; Lin, Hui; Cai, Ximing; Fang, Miao; Ge, Yingchun; Hu, Xiaoli; Chen, Min; Li, Weiyue
2018-03-01
Watershed system models are urgently needed to understand complex watershed systems and to support integrated river basin management. Early watershed modeling efforts focused on the representation of hydrologic processes, while the next-generation watershed models should represent the coevolution of the water-land-air-plant-human nexus in a watershed and provide capability of decision-making support. We propose a new modeling framework and discuss the know-how approach to incorporate emerging knowledge into integrated models through data exchange interfaces. We argue that the modeling environment is a useful tool to enable effective model integration, as well as create domain-specific models of river basin systems. The grand challenges in developing next-generation watershed system models include but are not limited to providing an overarching framework for linking natural and social sciences, building a scientifically based decision support system, quantifying and controlling uncertainties, and taking advantage of new technologies and new findings in the various disciplines of watershed science. The eventual goal is to build transdisciplinary, scientifically sound, and scale-explicit watershed system models that are to be codesigned by multidisciplinary communities.
Remote surface inspection system. [of large space platforms
NASA Technical Reports Server (NTRS)
Hayati, Samad; Balaram, J.; Seraji, Homayoun; Kim, Won S.; Tso, Kam S.
1993-01-01
This paper reports on an on-going research and development effort in remote surface inspection of space platforms such as the Space Station Freedom (SSF). It describes the space environment and identifies the types of damage for which to search. This paper provides an overview of the Remote Surface Inspection System that was developed to conduct proof-of-concept demonstrations and to perform experiments in a laboratory environment. Specifically, the paper describes three technology areas: (1) manipulator control for sensor placement; (2) automated non-contact inspection to detect and classify flaws; and (3) an operator interface to command the system interactively and receive raw or processed sensor data. Initial findings for the automated and human visual inspection tests are reported.
Reflections on human error - Matters of life and death
NASA Technical Reports Server (NTRS)
Wiener, Earl L.
1989-01-01
The last two decades have witnessed a rapid growth in the introduction of automatic devices into aircraft cockpits, and elsewhere in human-machine systems. This was motivated in part by the assumption that when human functioning is replaced by machine functioning, human error is eliminated. Experience to date shows that this is far from true, and that automation does not replace humans, but changes their role in the system, as well as the types and severity of the errors they make. This altered role may lead to fewer, but more critical errors. Intervention strategies to prevent these errors, or ameliorate their consequences, include basic human factors engineering of the interface, enhanced warning and alerting systems, and more intelligent interfaces that understand the strategic intent of the crew and can detect and trap inconsistent or erroneous input before it affects the system.
Human factors opportunities to improve Ohio's transportation system : executive summary report.
DOT National Transportation Integrated Search
2005-06-01
Human factors engineering or ergonomics is the area of engineering concerned with the human-machine interface. As Ohio's road systems are driven on by people, human factors engineering is certainly relevant. However, human factors have oft...
Nurses using futuristic technology in today's healthcare setting.
Wolf, Debra M; Kapadia, Amar; Kintzel, Jessie; Anton, Bonnie B
2009-01-01
Human-computer interaction (HCI) here takes the form of nurses using voice-assisted technology within a clinical setting to document patient care in real time, retrieve patient information from care plans, and complete routine tasks. This is a reality currently utilized by clinicians in acute and long-term care settings. Voice-assisted documentation provides hands-free, eyes-free, accurate documentation while enabling effective communication and task management. The speech technology increases the accuracy of documentation while interfacing directly with the electronic health record (EHR). Using technology consisting of a lightweight headset and a fist-sized wireless computer, verbal responses to easy-to-follow cues are converted into a database system, allowing staff to obtain individualized care status reports on demand. To further assist staff in their daily process, this innovative technology allows staff to send and receive pages as needed. This paper will discuss how leading-edge and award-winning technology is being integrated within the United States. Collaborative efforts between clinicians and analysts will be discussed, reflecting the interactive design and build of the functionality. Features such as the system's voice responses and directed cues will be shared, along with how easily data can be documented, viewed, and retrieved. Outcome data will be presented on how the technology impacted the organization's quality outcomes, financial reimbursement, and employees' level of satisfaction.
Generic worklist handler for workflow-enabled products
NASA Astrophysics Data System (ADS)
Schmidt, Joachim; Meetz, Kirsten; Wendler, Thomas
1999-07-01
Workflow management (WfM) is an emerging field of medical information technology. It appears as a promising key technology to model, optimize and automate processes, for the sake of improved efficiency, reduced costs and improved patient care. The application of WfM concepts requires the standardization of architectures and interfaces. A component of central interest proposed in this report is a generic work list handler: a standardized interface between a workflow enactment service and an application system. Application systems with embedded work list handlers will be called 'Workflow Enabled Application Systems'. In this paper we discuss the functional requirements of work list handlers, as well as their integration into workflow architectures and interfaces. To lay the foundation for this specification, basic workflow terminology, the fundamentals of workflow management and, later in the paper, the available standards as defined by the Workflow Management Coalition are briefly reviewed.
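A generic work list handler can be pictured as a thin layer the application embeds so the workflow enactment service can offer it work items, and through which the application reports results back. The sketch below shows one possible shape for that interface; the method names follow the spirit of the WfMC client-application interface but are simplified assumptions, not the standardized API itself.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class WorkItem:
    item_id: str
    activity: str
    attributes: Dict[str, str] = field(default_factory=dict)
    state: str = "offered"                      # offered -> claimed -> completed

class WorklistHandler:
    """Hypothetical embedded work list handler for a workflow-enabled application."""
    def __init__(self) -> None:
        self._items: Dict[str, WorkItem] = {}

    def offer(self, item: WorkItem) -> None:    # called by the enactment service
        self._items[item.item_id] = item

    def worklist(self, activity: Optional[str] = None) -> List[WorkItem]:
        return [w for w in self._items.values()
                if activity is None or w.activity == activity]

    def claim(self, item_id: str) -> WorkItem:  # called by the application/user
        self._items[item_id].state = "claimed"
        return self._items[item_id]

    def complete(self, item_id: str, results: Dict[str, str]) -> None:
        item = self._items[item_id]
        item.attributes.update(results)
        item.state = "completed"                # result reported back to the engine

handler = WorklistHandler()
handler.offer(WorkItem("wi-1", "report-reading", {"study": "CT-123"}))
handler.claim("wi-1")
handler.complete("wi-1", {"status": "report finalized"})
print([w.state for w in handler.worklist()])
```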
Assessing performance of an Electronic Health Record (EHR) using Cognitive Task Analysis.
Saitwal, Himali; Feng, Xuan; Walji, Muhammad; Patel, Vimla; Zhang, Jiajie
2010-07-01
Many Electronic Health Record (EHR) systems fail to provide user-friendly interfaces due to the lack of systematic consideration of human-centered computing issues. Such interfaces can be improved to provide easy-to-use, easy-to-learn, and error-resistant EHR systems to users. The objective was to evaluate the usability of an EHR system and suggest areas of improvement in the user interface. The user interface of the AHLTA (Armed Forces Health Longitudinal Technology Application) was analyzed using the Cognitive Task Analysis (CTA) method called GOMS (Goals, Operators, Methods, and Selection rules) and an associated technique called KLM (Keystroke Level Model). The GOMS method was used to evaluate the AHLTA user interface by classifying each step of a given task as a mental (internal) or physical (external) operator. This analysis was performed by two analysts independently, and the inter-rater reliability was computed to verify the reliability of the GOMS method. Further evaluation was performed using KLM to estimate the execution time required to perform a given task through application of its standard set of operators. The results are based on the analysis of 14 prototypical tasks performed by AHLTA users. The results show that on average a user needs to go through 106 steps to complete a task. To perform all 14 tasks, a user would spend about 22 min (independent of system response time) on data entry, of which 11 min are spent on the more effortful mental operators. The inter-rater reliability for all 14 tasks was 0.8 (kappa), indicating good reliability of the method. This paper empirically reveals the following findings related to the performance of AHLTA: (1) a large average number of total steps to complete common tasks, (2) high average execution time, and (3) a large percentage of mental operators. The user interface can be improved by reducing (a) the total number of steps and (b) the percentage of mental effort required for the tasks. 2010 Elsevier Ireland Ltd. All rights reserved.
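For readers unfamiliar with KLM, the short Python sketch below shows how an execution-time estimate is assembled from the standard Card-Moran-Newell operator times; the operator sequence and the task it stands for are invented for illustration and are not drawn from the AHLTA analysis.

```python
# Standard KLM operator times in seconds (commonly cited Card, Moran & Newell
# averages); these are textbook values, not measurements from this study.
KLM_TIMES = {
    "K": 0.20,   # keystroke / button press (average skilled typist)
    "P": 1.10,   # point with mouse to a target
    "H": 0.40,   # home hands between keyboard and mouse
    "M": 1.35,   # mental preparation operator
}

def klm_estimate(operators: str) -> float:
    """Sum operator times for a sequence such as 'MPKHMKKKK'."""
    return sum(KLM_TIMES[op] for op in operators)

# Hypothetical step: think, point to a field, click, home to the keyboard,
# think again, then type a four-character code.
sequence = "MPKH" + "M" + "KKKK"
print(f"{klm_estimate(sequence):.2f} s for sequence {sequence}")
# -> 5.20 s, of which 2.70 s (the two M operators) is mental effort
```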
ISS Interface Mechanisms and their Heritage
NASA Technical Reports Server (NTRS)
Cook, John G.; Aksamentov, Valery; Hoffman, Thomas; Bruner, Wes
2011-01-01
The International Space Station, by nurturing technological development of a variety of pressurized and unpressurized interface mechanisms, fosters "competition at the technology level". Such redundancy and diversity allow for the development and testing of mechanisms that might be used for future exploration efforts. The International Space Station, as a test-bed for exploration, has 4 types of pressurized interfaces between elements and 6 unpressurized attachment mechanisms. Lessons learned from the design, test, and operations of these mechanisms will help inform the design of a new international standard pressurized docking mechanism for the NASA Docking System. This paper will examine the attachment mechanisms on the ISS and their attributes. It will also look ahead at the new NASA docking system and trace its lineage to heritage mechanisms.
Development of 6-DOF painting robot control system
NASA Astrophysics Data System (ADS)
Huang, Junbiao; Liu, Jianqun; Gao, Weiqiang
2017-01-01
With the development of society, spraying technology in China's manufacturing industry has shifted from manual operation to automatic spraying with 6-DOF (degree-of-freedom) robots. Spray painting robots not only take over work that is harmful to humans but also improve production efficiency and reduce labor costs. The control system is the most critical part of a 6-DOF robot; however, relevant technology research in China is still lacking. It is therefore necessary to study a control system for 6-DOF spray painting robots that is easy to operate, efficient, and stable. Using the Googol controller platform, this paper develops programs based on the Windows CE embedded system to control the robot through the painting work. Software development is the core of the robot control system and includes the direct teaching module, playback module, motion control module, settings module, man-machine interface, alarm module, log module, etc. All development work on the software system has been completed, and the software has been verified to run stably and efficiently.
A Brain–Spinal Interface Alleviating Gait Deficits after Spinal Cord Injury in Primates
Capogrosso, Marco; Milekovic, Tomislav; Borton, David; Wagner, Fabien; Moraud, Eduardo Martin; Mignardot, Jean-Baptiste; Buse, Nicolas; Gandar, Jerome; Barraud, Quentin; Xing, David; Rey, Elodie; Duis, Simone; Jianzhong, Yang; Ko, Wai Kin D.; Li, Qin; Detemple, Peter; Denison, Tim; Micera, Silvestro; Bezard, Erwan; Bloch, Jocelyne; Courtine, Grégoire
2016-01-01
Spinal cord injury disrupts the communication between the brain and the spinal circuits that orchestrate movement. To bypass the lesion, brain–computer interfaces [1–3] have directly linked cortical activity to electrical stimulation of muscles, which have restored grasping abilities after hand paralysis [1,4]. Theoretically, this strategy could also restore control over leg muscle activity for walking [5]. However, replicating the complex sequence of individual muscle activation patterns underlying natural and adaptive locomotor movements poses formidable conceptual and technological challenges [6,7]. Recently, we showed in rats that epidural electrical stimulation of the lumbar spinal cord can reproduce the natural activation of synergistic muscle groups producing locomotion [8–10]. Here, we interfaced leg motor cortex activity with epidural electrical stimulation protocols to establish a brain–spinal interface that alleviated gait deficits after a spinal cord injury in nonhuman primates. Rhesus monkeys were implanted with an intracortical microelectrode array into the leg area of motor cortex; and a spinal cord stimulation system composed of a spatially selective epidural implant and a pulse generator with real-time triggering capabilities. We designed and implemented wireless control systems that linked online neural decoding of extension and flexion motor states with stimulation protocols promoting these movements. These systems allowed the monkeys to behave freely without any restrictions or constraining tethered electronics. After validation of the brain–spinal interface in intact monkeys, we performed a unilateral corticospinal tract lesion at the thoracic level. As early as six days post-injury and without prior training of the monkeys, the brain–spinal interface restored weight-bearing locomotion of the paralyzed leg on a treadmill and overground. The implantable components integrated in the brain–spinal interface have all been approved for investigational applications in similar human research, suggesting a practical translational pathway for proof-of-concept studies in people with spinal cord injury. PMID:27830790
Software systems for modeling articulated figures
NASA Technical Reports Server (NTRS)
Phillips, Cary B.
1989-01-01
Research in computer animation and simulation of human task performance requires sophisticated geometric modeling and user interface tools. The software for a research environment should present the programmer with a powerful but flexible substrate of facilities for displaying and manipulating geometric objects, yet ensure that future tools have a consistent and friendly user interface. Jack is a system which provides a flexible and extensible programmer and user interface for displaying and manipulating complex geometric figures, particularly human figures in a 3D working environment. It is a basic software framework for high-performance Silicon Graphics IRIS workstations for modeling and manipulating geometric objects in a general but powerful way. It provides a consistent and user-friendly interface across various applications in computer animation and simulation of human task performance. Currently, Jack provides input and control for applications including lighting specification and image rendering, anthropometric modeling, figure positioning, inverse kinematics, dynamic simulation, and keyframe animation.
Ocular attention-sensing interface system
NASA Technical Reports Server (NTRS)
Zaklad, Allen; Glenn, Floyd A., III; Iavecchia, Helene P.; Stokes, James M.
1986-01-01
The purpose of the research was to develop an innovative human-computer interface based on eye movement and voice control. By eliminating a manual interface (keyboard, joystick, etc.), OASIS provides a control mechanism that is natural, efficient, accurate, and low in workload.
Virtual microscopy: merging of computer mediated communication and intuitive interfacing
NASA Astrophysics Data System (ADS)
de Ridder, Huib; de Ridder-Sluiter, Johanna G.; Kluin, Philip M.; Christiaans, Henri H. C. M.
2009-02-01
Ubiquitous computing (or Ambient Intelligence) is an upcoming technology that is usually associated with futuristic smart environments in which information is available anytime anywhere and with which humans can interact in a natural, multimodal way. However spectacular the corresponding scenarios may be, it is equally challenging to consider how this technology may enhance existing situations. This is illustrated by a case study from the Dutch medical field: central quality reviewing for pathology in child oncology. The main goal of the review is to assess the quality of the diagnosis based on patient material. The sharing of knowledge in social face-to-face interaction during such a meeting is an important advantage. At the same time there is the disadvantage that the experts from the seven Dutch academic medical centers have to travel to the review meeting and that the required logistics to collect and bring patient material and data to the meeting is cumbersome and time-consuming. This paper focuses on how this time-consuming, inefficient way of reviewing can be replaced by a virtual collaboration system by merging technology supporting Computer Mediated Collaboration and intuitive interfacing. This requires insight into the preferred way of communication and collaboration as well as knowledge about the preferred interaction style with a virtual shared workspace.
Monitoring of Vital Signs with Flexible and Wearable Medical Devices.
Khan, Yasser; Ostfeld, Aminy E; Lochner, Claire M; Pierre, Adrien; Arias, Ana C
2016-06-01
Advances in wireless technologies, low-power electronics, the internet of things, and in the domain of connected health are driving innovations in wearable medical devices at a tremendous pace. Wearable sensor systems composed of flexible and stretchable materials have the potential to better interface to the human skin, whereas silicon-based electronics are extremely efficient in sensor data processing and transmission. Therefore, flexible and stretchable sensors combined with low-power silicon-based electronics are a viable and efficient approach for medical monitoring. Flexible medical devices designed for monitoring human vital signs, such as body temperature, heart rate, respiration rate, blood pressure, pulse oxygenation, and blood glucose have applications in both fitness monitoring and medical diagnostics. As a review of the latest development in flexible and wearable human vitals sensors, the essential components required for vitals sensors are outlined and discussed here, including the reported sensor systems, sensing mechanisms, sensor fabrication, power, and data processing requirements. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
MARTI: man-machine animation real-time interface
NASA Astrophysics Data System (ADS)
Jones, Christian M.; Dlay, Satnam S.
1997-05-01
The research introduces MARTI (man-machine animation real-time interface) for the realization of natural human-machine interfacing. The system uses simple vocal sound-tracks of human speakers to provide lip synchronization of computer graphical facial models. We present novel research in a number of engineering disciplines, which include speech recognition, facial modeling, and computer animation. This interdisciplinary research utilizes the latest hybrid connectionist/hidden Markov model speech recognition system to provide very accurate phone recognition and timing for speaker-independent continuous speech, and expands on knowledge from the animation industry in the development of accurate facial models and automated animation. The research has many real-world applications, which include the provision of a highly accurate and 'natural' man-machine interface to assist user interactions with computer systems and communication with one another using human idiosyncrasies; a complete special effects and animation toolbox providing automatic lip synchronization without the normal constraints of head-sets, joysticks, and skilled animators; compression of video data to well below standard telecommunication channel bandwidth for video communications and multi-media systems; assisting speech training and aids for the handicapped; and facilitating player interaction for 'video gaming' and 'virtual worlds.' MARTI has introduced a new level of realism to man-machine interfacing and special effects animation that has not been seen previously.
Using APEX to Model Anticipated Human Error: Analysis of a GPS Navigational Aid
NASA Technical Reports Server (NTRS)
VanSelst, Mark; Freed, Michael; Shefto, Michael (Technical Monitor)
1997-01-01
The interface development process can be dramatically improved by predicting design-facilitated human error at an early stage in the design process. The approach we advocate is to SIMULATE the behavior of a human agent carrying out tasks with a well-specified user interface, ANALYZE the simulation for instances of human error, and then REFINE the interface or protocol to minimize predicted error. This approach, incorporated into the APEX modeling architecture, differs from past approaches to human simulation in its emphasis on error rather than, e.g., learning rate or speed of response. The APEX model consists of two major components: (1) a powerful action selection component capable of simulating behavior in complex, multiple-task environments; and (2) a resource architecture which constrains cognitive, perceptual, and motor capabilities to within empirically demonstrated limits. The model mimics human errors arising from interactions between limited human resources and elements of the computer interface whose design fails to anticipate those limits. We analyze the design of a hand-held Global Positioning System (GPS) device used for tactical and navigational decisions in small yacht racing. The analysis demonstrates how human system modeling can be an effective design aid, helping to accelerate the process of refining a product (or procedure).
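The simulate-analyze-refine cycle described above can be pictured with the toy Python loop below; the function names and the simplistic working-memory "error model" are assumptions made for illustration and do not reflect the APEX architecture's actual mechanisms.

```python
# Toy sketch of a simulate -> analyze -> refine loop; all names are hypothetical.
def simulate(interface, task):
    """Pretend to run a modeled human agent; return predicted errors.
    Toy rule: any step demanding more working-memory items than the
    interface supports is predicted to produce an omission error."""
    return [step for step in task if step["memory_items"] > interface["memory_limit"]]

def refine(interface, errors):
    """Toy refinement: offload some memory demand onto the display."""
    return {**interface, "memory_limit": interface["memory_limit"] + 1}

def design_iteration(interface, task, max_rounds=5):
    for _ in range(max_rounds):
        errors = simulate(interface, task)
        if not errors:
            break                      # predicted error-free: stop refining
        interface = refine(interface, errors)
    return interface

task = [{"name": "enter waypoint", "memory_items": 3},
        {"name": "confirm route", "memory_items": 1}]
print(design_iteration({"memory_limit": 2}, task))   # -> {'memory_limit': 3}
```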
VERDEX: A virtual environment demonstrator for remote driving applications
NASA Technical Reports Server (NTRS)
Stone, Robert J.
1991-01-01
One of the key areas of the National Advanced Robotics Centre's enabling technologies research program is that of the human system interface, phase 1 of which started in July 1989 and is currently addressing the potential of virtual environments to permit intuitive and natural interactions between a human operator and a remote robotic vehicle. The aim of the first 12 months of this program (to September, 1990) is to develop a virtual human-interface demonstrator for use later as a test bed for human factors experimentation. This presentation will describe the current state of development of the test bed, and will outline some human factors issues and problems for more general discussion. In brief, the virtual telepresence system for remote driving has been designed to take the following form. The human operator will be provided with a helmet-mounted stereo display assembly, facilities for speech recognition and synthesis (using the Marconi Macrospeak system), and a VPL DataGlove Model 2 unit. The vehicle to be used for the purposes of remote driving is a Cybermotion Navmaster K2A system, which will be equipped with a stereo camera and microphone pair, mounted on a motorized high-speed pan-and-tilt head incorporating a closed-loop laser ranging sensor for camera convergence control (currently under contractual development). It will be possible to relay information to and from the vehicle and sensory system via an umbilical or RF link. The aim is to develop an interactive audio-visual display system capable of presenting combined stereo TV pictures and virtual graphics windows, the latter featuring control representations appropriate for vehicle driving and interaction using a graphical 'hand,' slaved to the flex and tracking sensors of the DataGlove and an additional helmet-mounted Polhemus IsoTrack sensor. Developments planned for the virtual environment test bed include transfer of operator control between remote driving and remote manipulation, dexterous end effector integration, virtual force and tactile sensing (also the focus of a current ARRL contract, initially employing a 14-pneumatic bladder glove attachment), and sensor-driven world modeling for total virtual environment generation and operator-assistance in remote scene interrogation.
Combining fuzzy mathematics with fuzzy logic to solve business management problems
NASA Astrophysics Data System (ADS)
Vrba, Joseph A.
1993-12-01
Fuzzy logic technology has been applied to control problems with great success. Because of this, many observers feel that fuzzy logic is applicable only in the control arena. However, business management problems almost never deal with crisp values. Fuzzy systems technology--a combination of fuzzy logic, fuzzy mathematics and a graphical user interface--is a natural fit for developing software to assist in typical business activities such as planning, modeling and estimating. This presentation discusses how fuzzy logic systems can be extended through the application of fuzzy mathematics and the use of a graphical user interface to make the information contained in fuzzy numbers accessible to business managers. As demonstrated through examples from actual deployed systems, this fuzzy systems technology has been employed successfully to provide solutions to the complex real-world problems found in the business environment.
Schraagen, Jan Maarten; Verhoeven, Fenne
2013-02-01
The aims of this study were to investigate how a variety of research methods is commonly employed to study technology and practitioner cognition. User-interface issues with infusion pumps were selected as a case because of its relevance to patient safety. Starting from a Cognitive Systems Engineering perspective, we developed an Impact Flow Diagram showing the relationship of computer technology, cognition, practitioner behavior, and system failure in the area of medical infusion devices. We subsequently conducted a systematic literature review on user-interface issues with infusion pumps, categorized the studies in terms of methods employed, and noted the usability problems found with particular methods. Next, we assigned usability problems and related methods to the levels in the Impact Flow Diagram. Most study methods used to find user interface issues with infusion pumps focused on observable behavior rather than on how artifacts shape cognition and collaboration. A concerted and theory-driven application of these methods when testing infusion pumps is lacking in the literature. Detailed analysis of one case study provided an illustration of how to apply the Impact Flow Diagram, as well as how the scope of analysis may be broadened to include organizational and regulatory factors. Research methods to uncover use problems with technology may be used in many ways, with many different foci. We advocate the adoption of an Impact Flow Diagram perspective rather than merely focusing on usability issues in isolation. Truly advancing patient safety requires the systematic adoption of a systems perspective viewing people and technology as an ensemble, also in the design of medical device technology. Copyright © 2012 Elsevier Inc. All rights reserved.
A model for the control mode man-computer interface dialogue
NASA Technical Reports Server (NTRS)
Chafin, R. L.
1981-01-01
A four-stage model is presented for the control mode man-computer interface dialogue. It consists of context development, semantic development, syntactic development, and command execution. Each stage is discussed in terms of the operator skill levels (naive, novice, competent, and expert) and pertinent human factors issues. These issues are human problem solving, human memory, and schemata. The execution stage is discussed in terms of the operator's typing skills. This model provides an understanding of the human process in command mode activity for computer systems and a foundation for relating system characteristics to operator characteristics.
NASA Astrophysics Data System (ADS)
Lee, Sam; Lucas, Nathan P.; Ellis, R. Darin; Pandya, Abhilash
2012-06-01
This paper presents a seamlessly controlled human multi-robot system comprised of semiautonomous ground and aerial robots for source localization tasks. The system combines augmented reality interface capabilities with a human supervisor's ability to control multiple robots. The role of this human multi-robot interface is to allow an operator to control groups of heterogeneous robots in real time in a collaborative manner. It uses advanced path planning algorithms to ensure obstacles are avoided and to free the operator for higher-level tasks. Each robot knows the environment and obstacles and can automatically generate a collision-free path to any user-selected target. Sensor information from each individual robot is displayed directly on the robot in the video view. In addition, a sensor-data-fused AR view is displayed, which helps the user pinpoint source information and supports the goals of the mission. The paper presents a preliminary human factors evaluation of this system in which several interface conditions are tested for source detection tasks. Results show that the novel augmented reality multi-robot controls (Point-and-Go and Path Planning) reduced mission completion times compared to traditional joystick control for target detection missions. Usability tests and operator workload analysis are also investigated.
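The abstract does not state which path planning algorithm the system uses; as a generic illustration of collision-free path generation to a user-selected target, here is a minimal grid-based A* sketch in Python.

```python
import heapq

# Generic grid A* sketch; purely illustrative, not the system's actual planner.
def astar(grid, start, goal):
    """grid: list of strings, '#' = obstacle. start/goal: (row, col) tuples."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])   # Manhattan heuristic
    open_set = [(h(start), 0, start, [start])]
    seen = set()
    while open_set:
        _, cost, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] != '#':
                heapq.heappush(open_set,
                               (cost + 1 + h((nr, nc)), cost + 1, (nr, nc), path + [(nr, nc)]))
    return None   # no collision-free path exists

grid = ["....",
        ".##.",
        "...."]
print(astar(grid, (0, 0), (2, 3)))   # e.g. [(0,0), (1,0), (2,0), (2,1), (2,2), (2,3)]
```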
Guidelines for developing distributed virtual environment applications
NASA Astrophysics Data System (ADS)
Stytz, Martin R.; Banks, Sheila B.
1998-08-01
We have conducted a variety of projects that served to investigate the limits of virtual environments and distributed virtual environment (DVE) technology for the military and medical professions. The projects include an application that allows the user to interactively explore a high-fidelity, dynamic scale model of the Solar System and a high-fidelity, photorealistic, rapidly reconfigurable aircraft simulator. Additional projects are a project for observing, analyzing, and understanding the activity in a military distributed virtual environment, a project to develop a distributed threat simulator for training Air Force pilots, a virtual spaceplane to determine user interface requirements for a planned military spaceplane system, and an automated wingman for use in supplementing or replacing human-controlled systems in a DVE. The last two projects are a virtual environment user interface framework and a project for training hospital emergency department personnel. In the process of designing and assembling the DVE applications in support of these projects, we have developed rules of thumb and insights into assembling DVE applications and the environment itself. In this paper, we open with a brief review of the applications that were the source for our insights and then present the lessons learned as a result of these projects. The lessons we have learned fall primarily into five areas. These areas are requirements development, software architecture, human-computer interaction, graphical database modeling, and construction of computer-generated forces.
Non-invasive neural stimulation
NASA Astrophysics Data System (ADS)
Tyler, William J.; Sanguinetti, Joseph L.; Fini, Maria; Hool, Nicholas
2017-05-01
Neurotechnologies for non-invasively interfacing with neural circuits have been evolving from those capable of sensing neural activity to those capable of restoring and enhancing human brain function. Generally referred to as non-invasive neural stimulation (NINS) methods, these neuromodulation approaches rely on electrical, magnetic, photonic, and acoustic or ultrasonic energy to influence nervous system activity, brain function, and behavior. Evidence that has been mounting for decades shows that advanced neural engineering of NINS technologies will indeed transform the way humans treat diseases, interact with information, communicate, and learn. The physics underlying the ability of various NINS methods to modulate nervous system activity can be quite different from one another depending on the energy modality used, as we briefly discuss. For members of commercial and defense industry sectors that have not traditionally engaged in neuroscience research and development, the science, engineering and technology required to advance NINS methods beyond the state-of-the-art presents tremendous opportunities. Within the past few years alone there have been large increases in global investments made by federal agencies, foundations, private investors and multinational corporations to develop advanced applications of NINS technologies. Driven by these efforts, NINS methods and devices have recently been introduced to mass markets via the consumer electronics industry. Further, NINS continues to be explored in a growing number of defense applications focused on enhancing human dimensions. The present paper provides a brief introduction to the field of non-invasive neural stimulation by highlighting some of the more common methods in use or under current development today.
A human factors approach to range scheduling for satellite control
NASA Technical Reports Server (NTRS)
Wright, Cameron H. G.; Aitken, Donald J.
1991-01-01
Range scheduling for satellite control presents a classical problem: supervisory control of a large-scale dynamic system, with unwieldy amounts of interrelated data used as inputs to the decision process. Increased automation of the task, with the appropriate human-computer interface, is highly desirable. The development and user evaluation of a semi-automated network range scheduling system is described. The system incorporates a synergistic human-computer interface consisting of a large screen color display, voice input/output, a 'sonic pen' pointing device, a touchscreen color CRT, and a standard keyboard. From a human factors standpoint, this development represents the first major improvement in almost 30 years to the satellite control network scheduling task.
Human Factors and Simulation in Emergency Medicine.
Hayden, Emily M; Wong, Ambrose H; Ackerman, Jeremy; Sande, Margaret K; Lei, Charles; Kobayashi, Leo; Cassara, Michael; Cooper, Dylan D; Perry, Kimberly; Lewandowski, William E; Scerbo, Mark W
2018-02-01
This consensus group from the 2017 Academic Emergency Medicine Consensus Conference "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes" held in Orlando, Florida, on May 16, 2017, focused on the use of human factors (HF) and simulation in the field of emergency medicine (EM). The HF discipline is often underutilized within EM but has significant potential in improving the interface between technologies and individuals in the field. The discussion explored the domain of HF, its benefits in medicine, how simulation can be a catalyst for HF work in EM, and how EM can collaborate with HF professionals to effect change. Implementing HF in EM through health care simulation will require a demonstration of clinical and safety outcomes, advocacy to stakeholders and administrators, and establishment of structured collaborations between HF professionals and EM, such as in this breakout group. © 2017 by the Society for Academic Emergency Medicine.
Charge pumping with finger capacitance for body sensor energy harvesting.
Zhou, Alyssa Y; Maharbiz, Michel M
2017-07-01
Sensors are becoming ubiquitous and increasingly integrated with and on the human body; powering such "body network" devices remains an outstanding problem. In this paper, we demonstrate a touch-interrogation-powered energy harvesting system. This system transforms the kinetic energy of a human finger to electric energy, with each tap producing approximately 1 nJ of energy at a storage capacitor. As is well known for touch display devices, the proximity of a finger can alter the effective value of small capacitances; we demonstrate that these capacitance changes can drive a current which is rectified to charge a capacitor. As a demonstration, an untethered circuit charged this way can deliver enough instantaneous power to light a red LED every ~10 seconds. This technology illustrates the ability to communicate with and operate low-power sensors with motions already used for interfacing to devices.
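To make the energy budget concrete, the arithmetic sketch below relates the ~1 nJ per charge event quoted in the abstract to the energy stored in a capacitor; the storage capacitance and target voltage are invented for illustration and are not values reported by the authors.

```python
# Illustrative energy-budget arithmetic only. The ~1 nJ per touch-induced
# charge event comes from the abstract; every other number here (storage
# capacitor, target voltage) is an assumption made up for the example.
E_PER_EVENT = 1e-9          # J per charge event (from abstract)

C_STORE = 1e-6              # F, assumed storage capacitor
V_STORE = 2.0               # V, assumed target voltage on the capacitor

energy_stored = 0.5 * C_STORE * V_STORE**2           # E = 1/2 * C * V^2
events_needed = energy_stored / E_PER_EVENT

print(f"Stored energy at {V_STORE} V: {energy_stored * 1e6:.1f} uJ")
print(f"Charge events needed to reach it: {events_needed:,.0f}")
# With the assumed values: 2.0 uJ stored, i.e. about 2,000 charge events.
```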
Designing the Instructional Interface.
ERIC Educational Resources Information Center
Lohr, L. L.
2000-01-01
Designing the instructional interface is a challenging endeavor requiring knowledge and skills in instructional and visual design, psychology, human-factors, ergonomic research, computer science, and editorial design. This paper describes the instructional interface, the challenges of its development, and an instructional systems approach to its…
Metis Hub: The Development of an Intuitive Project Planning System
DOE Office of Scientific and Technical Information (OSTI.GOV)
McConnell, Rachael M.; Lawrence Livermore National Lab.
2015-08-26
The goal is to develop an intuitive, dynamic, and consistent interface for the Metis Planning System by combining user requirements and human engineering concepts. The system is largely based upon existing systems, so some tools already have working models that we can follow. However, the web-based interface is completely new.
Brain-computer interface after nervous system injury.
Burns, Alexis; Adeli, Hojjat; Buford, John A
2014-12-01
Brain-computer interface (BCI) has proven to be a useful tool for providing alternative communication and mobility to patients suffering from nervous system injury. BCI has been and will continue to be implemented into rehabilitation practices for more interactive and speedy neurological recovery. The most exciting BCI technology is evolving to provide therapeutic benefits by inducing cortical reorganization via neuronal plasticity. This article presents a state-of-the-art review of BCI technology used after nervous system injuries, specifically: amyotrophic lateral sclerosis, Parkinson's disease, spinal cord injury, stroke, and disorders of consciousness. Also presented is transcending, innovative research involving new treatment of neurological disorders. © The Author(s) 2014.
Methodology for automating software systems
NASA Technical Reports Server (NTRS)
Moseley, Warren
1990-01-01
Applying ITS technology to shuttle diagnostics would not require the rigor of the Petri net representation; however, given the animated, simulated portion of the interface and the demands placed on the system to support training, it is important to have a homogeneous and consistent underlying knowledge representation. By keeping the diagnostic rule base, the hardware description, the software description, user profiles, desired behavioral knowledge, and the user interface in the same notation, it is possible to reason about all of the properties of Petri nets on any selected portion of the simulation. This reasoning provides a foundation for the use of intelligent tutoring systems technology.
Interface Anywhere: Development of a Voice and Gesture System for Spaceflight Operations
NASA Technical Reports Server (NTRS)
Thompson, Shelby; Haddock, Maxwell; Overland, David
2013-01-01
The Interface Anywhere Project was funded through the Innovation Charge Account (ICA) at NASA JSC in the fall of 2012. The project was a collaboration between human factors and engineering to explore the possibility of designing an interface to control basic habitat operations through gesture and voice control: (a) current interfaces require the users to be physically near an input device in order to interact with the system; and (b) by using voice and gesture commands, the user is able to interact with the system anywhere within the work environment.
Biomimetic Particles as Therapeutics
Green, Jordan J.
2015-01-01
In recent years, there have been major advances in the development of novel nanoparticle and microparticle-based therapeutics. An emerging paradigm is the incorporation of biomimetic features into these synthetic therapeutic constructs to enable them to better interface with biological systems. Through the control of size, shape, and material consistency, particle cores have been generated that better mimic natural cells and viruses. In addition, there have been significant advances in biomimetic surface functionalization of particles through the integration of bio-inspired artificial cell membranes and naturally derived cell membranes. Biomimetic technologies enable therapeutic particles to have increased potency to benefit human health. PMID:26277289
Human Factors in the Design of a Computer-Assisted Instruction System. Technical Progress Report.
ERIC Educational Resources Information Center
Mudge, J. C.
A research project built an author-controlled computer-assisted instruction (CAI) system to study ease-of-use factors in student-system, author-system, and programer-system interfaces. Interfaces were designed and observed in use and systematically revised. Development of course material by authors, use by students, and administrative tasks were…
Crew Office Evaluation of a Precision Lunar Landing System
NASA Technical Reports Server (NTRS)
Major, Laura M.; Duda, Kevin R.; Hirsh, Robert L.
2011-01-01
A representative Human System Interface for a precision lunar landing system, ALHAT, has been developed as a platform for prototype visualization and interaction concepts. This facilitates analysis of crew interaction with advanced sensors and AGNC systems. Human-in-the-loop evaluations with representatives from the Crew Office (i.e. astronauts) and Mission Operations Directorate (MOD) were performed to refine the crew role and information requirements during the final phases of landing. The results include a number of lessons learned from Shuttle that are applicable to the design of a human supervisory landing system and cockpit. Overall, the results provide a first order analysis of the tasks the crew will perform during lunar landing, an architecture for the Human System Interface based on these tasks, as well as details on the information needs to land safely.
Takano, Kouji; Hata, Naoki; Kansaku, Kenji
2011-01-01
The brain–machine interface (BMI) or brain–computer interface is a new interface technology that uses neurophysiological signals from the brain to control external machines or computers. This technology is expected to support daily activities, especially for persons with disabilities. To expand the range of activities enabled by this type of interface, here, we added augmented reality (AR) to a P300-based BMI. In this new system, we used a see-through head-mount display (HMD) to create control panels with flicker visual stimuli to support the user in areas close to controllable devices. When the attached camera detects an AR marker, the position and orientation of the marker are calculated, and the control panel for the pre-assigned appliance is created by the AR system and superimposed on the HMD. The participants were required to control system-compatible devices, and they successfully operated them without significant training. Online performance with the HMD was not different from that using an LCD monitor. Posterior and lateral (right or left) channel selections contributed to operation of the AR–BMI with both the HMD and LCD monitor. Our results indicate that AR–BMI systems operated with a see-through HMD may be useful in building advanced intelligent environments. PMID:21541307
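As a loose illustration of the marker detection and pose estimation step described above, here is a minimal Python sketch using OpenCV's ArUco module; the camera intrinsics and marker size are placeholder assumptions, and this is not the authors' implementation.

```python
import cv2
import numpy as np

# Minimal marker-detection sketch (not the authors' code). Uses the ArUco API
# from opencv-contrib-python 4.6; OpenCV >= 4.7 replaces detectMarkers with
# cv2.aruco.ArucoDetector. Camera intrinsics and marker size are placeholders.
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)
MARKER_SIZE_M = 0.05                      # assumed 5 cm marker

aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def marker_poses(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict)
    if ids is None:
        return {}
    rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, MARKER_SIZE_M, camera_matrix, dist_coeffs)
    # Map marker id -> translation vector; a pose like this is what would anchor
    # the superimposed control panel for the appliance assigned to that marker.
    return {int(i): t.ravel() for i, t in zip(ids.ravel(), tvecs)}
```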
Technology assessment of future intercity passenger transporation systems. Volume 1: Summary report
NASA Technical Reports Server (NTRS)
1976-01-01
Technical, economic, environmental, and sociopolitical issues associated with future intercity transportation system options were assessed. Technology assessment was used as a tool to assist in the identification of basic research and technology development tasks that should be undertaken. The emphasis was on domestic passenger transportation, but interfaces with freight and international transportation were considered.
Probing low noise at the MOS interface with a spin-orbit qubit.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jock, Ryan Michael; Jacobson, Noah Tobias; Harvey-Collard, Patrick
The silicon metal-oxide-semiconductor (MOS) material system is technologically important for the implementation of electron spin-based quantum information technologies. Researchers predict the need for an integrated platform in order to implement useful computation, and decades of advancements in silicon microelectronics fabrication lend themselves to this challenge. However, fundamental concerns have been raised about the MOS interface (e.g., trap noise, variations in electron g-factor, and practical implementation of multi-QDs). Furthermore, two-axis control of silicon qubits has, to date, required the integration of non-ideal components (e.g., microwave strip-lines, micro-magnets, triple quantum dots, or introduction of donor atoms). In this paper, we introduce a spin-orbit (SO) driven singlet-triplet (ST) qubit in silicon, demonstrating all-electrical two-axis control that requires no additional integrated elements and exhibits charge noise properties equivalent to other more model, but less commercially mature, semiconductor systems. We demonstrate the ability to tune an intrinsic spin-orbit interface effect, which is consistent with Rashba and Dresselhaus contributions that are remarkably strong for a low spin-orbit material such as silicon. The qubit maintains the advantages of using isotopically enriched silicon for producing a quiet magnetic environment, measuring spin dephasing times of 1.6 μs using 99.95% 28Si epitaxy for the qubit, comparable to results from other isotopically enhanced silicon ST qubit systems. This work, therefore, demonstrates that the interface inherently provides properties for two-axis control, and the technologically important MOS interface does not add additional detrimental qubit noise.
Customizing graphical user interface technology for spacecraft control centers
NASA Technical Reports Server (NTRS)
Beach, Edward; Giancola, Peter; Gibson, Steven; Mahmot, Ronald
1993-01-01
The Transportable Payload Operations Control Center (TPOCC) project is applying the latest in graphical user interface technology to the spacecraft control center environment. This project of the Mission Operations Division's (MOD) Control Center Systems Branch (CCSB) at NASA Goddard Space Flight Center (GSFC) has developed an architecture for control centers which makes use of a distributed processing approach and the latest in Unix workstation technology. The TPOCC project is committed to following industry standards and using commercial off-the-shelf (COTS) hardware and software components wherever possible to reduce development costs and to improve operational support. TPOCC's most successful use of commercial software products and standards has been in the development of its graphical user interface. This paper describes TPOCC's successful use and customization of four separate layers of commercial software products to create a flexible and powerful user interface that is uniquely suited to spacecraft monitoring and control.
The flight telerobotic servicer: From functional architecture to computer architecture
NASA Technical Reports Server (NTRS)
Lumia, Ronald; Fiala, John
1989-01-01
After a brief tutorial on the NASA/National Bureau of Standards Standard Reference Model for Telerobot Control System Architecture (NASREM) functional architecture, the approach to its implementation is shown. First, interfaces must be defined which are capable of supporting the known algorithms. This is illustrated by considering the interfaces required for the SERVO level of the NASREM functional architecture. After interface definition, the specific computer architecture for the implementation must be determined. This choice is obviously technology dependent. An example illustrating one possible mapping of the NASREM functional architecture to a particular set of computers which implements it is shown. The result of choosing the NASREM functional architecture is that it provides a technology independent paradigm which can be mapped into a technology dependent implementation capable of evolving with technology in the laboratory and in space.
The silicon chip: A versatile micro-scale platform for micro- and nano-scale systems
NASA Astrophysics Data System (ADS)
Choi, Edward
Cutting-edge advances in micro- and nano-scale technology require instrumentation to interface with the external world. While technology feature sizes are continually being reduced, the size of experimentalists and their instrumentation do not mirror this trend. Hence there is a need for effective application-specific instrumentation to bridge the gap from the micro and nano-scale phenomena being studied to the comparative macro-scale of the human interfaces. This dissertation puts forward the idea that the silicon CMOS integrated circuit, or microchip in short, serves as an excellent platform to perform this functionality. The electronic interfaces designed for the semiconductor industry are particularly attractive as development platforms, and the reduction in feature sizes that has been a hallmark of the industry suggests that chip-scale instrumentation may be more closely coupled to the phenomena of interest, allowing finer control or improved measurement capabilities. Compatibility with commercial processes will further enable economies of scale through mass production, another welcome feature of this approach. Thus chip-scale instrumentation may replace the bulky, expensive, cumbersome-to-operate macro-scale prototypes currently in use for many of these applications. The dissertation examines four specific applications in which the chip may serve as the ideal instrumentation platform. These are nanorod manipulation, polypyrrole bilayer hinge microactuator control, organic transistor hybrid circuits, and contact fluorescence imaging. The thesis is structured around chapters devoted to each of these projects, in addition to a chapter on preliminary work on an RFID system that serves as a wireless interface model. Each of these chapters contains tools and techniques developed for chip-scale instrumentation, from custom scripts for automated layout and data collection to microfabrication processes. Implementation of these tools to develop systems for the applications above is evaluated. The viability of this approach is not limited to the examples listed in this work, and innovative new methodologies beyond those included here may be developed in the future for other systems which would benefit from the versatility of chip-scale platforms.
User interface design principles for the SSM/PMAD automated power system
NASA Technical Reports Server (NTRS)
Jakstas, Laura M.; Myers, Chris J.
1991-01-01
Martin Marietta has developed a user interface for the space station module power management and distribution (SSM/PMAD) automated power system testbed which provides human access to the functionality of the power system, as well as exemplifying current techniques in user interface design. The testbed user interface was designed to enable an engineer to operate the system easily without having significant knowledge of computer systems, as well as provide an environment in which the engineer can monitor and interact with the SSM/PMAD system hardware. The design of the interface supports a global view of the most important data from the various hardware and software components, as well as enabling the user to obtain additional or more detailed data when needed. The components and representations of the SSM/PMAD testbed user interface are examined. An engineer's interactions with the system are also described.
Three-Dimensional User Interfaces for Immersive Virtual Reality
NASA Technical Reports Server (NTRS)
vanDam, Andries
1997-01-01
The focus of this grant was to experiment with novel user interfaces for immersive Virtual Reality (VR) systems, and thus to advance the state of the art of user interface technology for this domain. Our primary test application was a scientific visualization application for viewing Computational Fluid Dynamics (CFD) datasets. This technology has been transferred to NASA via periodic status reports and papers relating to this grant that have been published in conference proceedings. This final report summarizes the research completed over the past year, and extends last year's final report of the first three years of the grant.
A video, text, and speech-driven realistic 3-d virtual head for human-machine interface.
Yu, Jun; Wang, Zeng-Fu
2015-05-01
A multiple-inputs-driven realistic facial animation system based on a 3-D virtual head for human-machine interface is proposed. The system can be driven independently by video, text, and speech, and thus can interact with humans through diverse interfaces. The combination of a parameterized model and a muscular model is used to obtain a tradeoff between computational efficiency and high realism of 3-D facial animation. The online appearance model is used to track 3-D facial motion from video in the framework of particle filtering, and multiple measurements, i.e., the pixel color values of the input image and the Gabor wavelet coefficients of the illumination ratio image, are fused to reduce the influence of lighting and person dependence on the construction of the online appearance model. The tri-phone model is used to reduce the computational consumption of visual co-articulation in speech-synchronized viseme synthesis without sacrificing performance. The objective and subjective experiments show that the system is suitable for human-machine interaction.
Gloved Human-Machine Interface
NASA Technical Reports Server (NTRS)
Adams, Richard (Inventor); Hannaford, Blake (Inventor); Olowin, Aaron (Inventor)
2015-01-01
Certain exemplary embodiments can provide a system, machine, device, manufacture, circuit, composition of matter, and/or user interface adapted for and/or resulting from, and/or a method and/or machine-readable medium comprising machine-implementable instructions for, activities that can comprise and/or relate to: tracking movement of a gloved hand of a human; interpreting a gloved finger movement of the human; and/or in response to interpreting the gloved finger movement, providing feedback to the human.
Projection Mapping User Interface for Disabled People.
Gelšvartas, Julius; Simutis, Rimvydas; Maskeliūnas, Rytis
2018-01-01
Difficulty in communicating is one of the key challenges for people suffering from severe motor and speech disabilities. Often such a person can communicate and interact with the environment only using assistive technologies. This paper presents a multifunctional user interface designed to improve communication efficiency and person independence. The main component of this interface is a projection mapping technique used to highlight objects in the environment. Projection mapping makes it possible to create a natural augmented reality information presentation method. The user interface combines a depth sensor and a projector to create a camera-projector system. We provide a detailed description of the camera-projector system calibration procedure. The described system performs tabletop object detection and automatic projection mapping. Multiple user input modalities have been integrated into the multifunctional user interface. Such a system can be adapted to the needs of people with various disabilities.
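The paper describes a camera-projector calibration procedure without giving its details here; the sketch below illustrates one common approach, treating the projector as an inverse camera and calibrating with OpenCV, purely as an assumed example rather than the authors' method.

```python
import cv2

# Illustrative camera-projector calibration sketch (not the paper's procedure).
# Assumes we already have, for several checkerboard poses:
#   obj_pts  - 3D board corner coordinates (one float32 array per pose)
#   cam_pts  - those corners as seen by the camera (pixels)
#   proj_pts - the projector-pixel coordinates that illuminate those corners,
#              e.g. recovered by decoding projected structured-light patterns.
def calibrate_camera_projector(obj_pts, cam_pts, proj_pts, cam_size, proj_size):
    # 1. Intrinsics of the camera and of the projector (modeled as an inverse camera).
    _, K_cam, d_cam, _, _ = cv2.calibrateCamera(obj_pts, cam_pts, cam_size, None, None)
    _, K_proj, d_proj, _, _ = cv2.calibrateCamera(obj_pts, proj_pts, proj_size, None, None)
    # 2. Rigid transform (R, T) between camera and projector, as in stereo calibration.
    _, _, _, _, _, R, T, _, _ = cv2.stereoCalibrate(
        obj_pts, cam_pts, proj_pts, K_cam, d_cam, K_proj, d_proj, cam_size,
        flags=cv2.CALIB_FIX_INTRINSIC)
    return K_cam, d_cam, K_proj, d_proj, R, T
```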
Human-Vehicle Interface for Semi-Autonomous Operation of Uninhabited Aero Vehicles
NASA Technical Reports Server (NTRS)
Jones, Henry L.; Frew, Eric W.; Woodley, Bruce R.; Rock, Stephen M.
2001-01-01
The robustness of autonomous robotic systems to unanticipated circumstances is typically insufficient for use in the field. The many skills of the human user often fill this gap in robotic capability. To incorporate the human into the system, a useful interaction between man and machine must exist. This interaction should enable useful communication to be exchanged in a natural way between human and robot on a variety of levels. This report describes the current human-robot interaction for the Stanford HUMMINGBIRD autonomous helicopter. In particular, the report discusses the elements of the system that enable multiple levels of communication. An intelligent system agent manages the different inputs given to the helicopter. An advanced user interface gives the user and helicopter a method for exchanging useful information. Using this human-robot interaction, the HUMMINGBIRD has carried out various autonomous search, tracking, and retrieval missions.
Marco-Ruiz, Luis; Bønes, Erlend; de la Asunción, Estela; Gabarron, Elia; Aviles-Solis, Juan Carlos; Lee, Eunji; Traver, Vicente; Sato, Keiichi; Bellika, Johan G
2017-10-01
Symptom checkers are software tools that allow users to submit a set of symptoms and receive advice related to them in the form of a diagnosis list, health information, or triage. The heterogeneity of their potential users and the number of different components in their user interfaces can make testing with end-users unaffordable. We designed and executed a two-phase method to test the respiratory diseases module of the symptom checker Erdusyk. Phase I consisted of an online test with a large sample of users (n=53). In Phase I, users evaluated the system remotely and completed a questionnaire based on the Technology Acceptance Model. Principal Component Analysis was used to correlate each section of the interface with the questionnaire responses, thus identifying which areas of the user interface contributed significantly to technology acceptance. In the second phase, the think-aloud procedure was executed with a small sample (n=15), focusing on the areas with significant contributions in order to analyze the reasons for those contributions. Our method was used effectively to optimize the testing of symptom checker user interfaces. The method kept the cost of testing at reasonable levels by restricting the use of the think-aloud procedure while still assuring a high degree of coverage. The main barriers detected in Erdusyk were related to problems understanding time repetition patterns, the selection of levels on scales to record intensities, navigation, the quantification of some symptom attributes, and the characteristics of the symptoms. Copyright © 2017 Elsevier Inc. All rights reserved.
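As a generic illustration of the Phase I analysis style (Principal Component Analysis over questionnaire responses), the Python sketch below uses scikit-learn on invented column names and random data; it is not the study's dataset or exact procedure.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Generic PCA sketch with made-up data; not the Erdusyk study's analysis.
rng = np.random.default_rng(0)
# Hypothetical per-user ratings: TAM-style items plus per-section usability scores.
cols = ["perceived_usefulness", "perceived_ease_of_use",
        "symptom_entry_section", "timeline_section", "triage_advice_section"]
responses = pd.DataFrame(rng.integers(1, 8, size=(53, len(cols))), columns=cols)

X = StandardScaler().fit_transform(responses)       # standardize 1-7 Likert items
pca = PCA(n_components=2).fit(X)

# Loadings show which interface sections co-vary with the acceptance items,
# i.e. which areas contribute most to the leading components.
loadings = pd.DataFrame(pca.components_.T, index=cols, columns=["PC1", "PC2"])
print(loadings.round(2))
print("explained variance ratio:", pca.explained_variance_ratio_.round(2))
```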
NASA Fuel Tank Wireless Power and Signal Study
NASA Technical Reports Server (NTRS)
Merrill, Garrick
2015-01-01
Hydro Technologies has developed a custom electronics and mechanical framework for interfacing with off-the-shelf sensors to achieve through barrier sensing solutions. The core project technology relies on Hydro Technologies Wireless Power and Signal Interface (Wi psi) System for transmitting data and power wirelessly using magnetic fields. To accomplish this, Wi psi uses a multi-frequency local magnetic field to produce magnetic fields capable of carrying data and power through almost any material such as metals, seawater, concrete, and air. It will also work through layers of multiple materials.
Distributed and collaborative synthetic environments
NASA Technical Reports Server (NTRS)
Bajaj, Chandrajit L.; Bernardini, Fausto
1995-01-01
Fast graphics workstations and increased computing power, together with improved interface technologies, have created new and diverse possibilities for developing and interacting with synthetic environments. A synthetic environment system is generally characterized by input/output devices that constitute the interface between the human senses and the synthetic environment generated by the computer; and a computation system running a real-time simulation of the environment. A basic need of a synthetic environment system is that of giving the user a plausible reproduction of the visual aspect of the objects with which he is interacting. The goal of our Shastra research project is to provide a substrate of geometric data structures and algorithms which allow the distributed construction and modification of the environment, efficient querying of objects attributes, collaborative interaction with the environment, fast computation of collision detection and visibility information for efficient dynamic simulation and real-time scene display. In particular, we address the following issues: (1) A geometric framework for modeling and visualizing synthetic environments and interacting with them. We highlight the functions required for the geometric engine of a synthetic environment system. (2) A distribution and collaboration substrate that supports construction, modification, and interaction with synthetic environments on networked desktop machines.
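As a small illustration of the kind of fast collision query such a system relies on, here is a minimal axis-aligned bounding-box overlap test in Python; the example objects are invented and this is not Shastra's actual geometric engine.

```python
from dataclasses import dataclass

# Minimal AABB overlap test, shown only to illustrate the style of fast
# collision query a synthetic environment needs for dynamic simulation.
@dataclass
class AABB:
    min_pt: tuple   # (x, y, z)
    max_pt: tuple

def overlaps(a: AABB, b: AABB) -> bool:
    """Boxes collide iff their extents overlap on every axis."""
    return all(a.min_pt[i] <= b.max_pt[i] and b.min_pt[i] <= a.max_pt[i]
               for i in range(3))

table = AABB((0, 0, 0), (2, 1, 1))
chair = AABB((1.5, 0, 0), (2.5, 1, 1))
print(overlaps(table, chair))   # True: the boxes share the region x in [1.5, 2]
```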