Science.gov

Sample records for outdoor autonomous robots

  1. An Adaptive Localization System for Outdoor/Indoor Navigation for Autonomous Robots

    DTIC Science & Technology

    2006-04-01

    E. B. Pacis, B. Sights, G. Ahuja, G. Kogut, H. R. Everett. ...demonstrated a series of collaborative behaviors of multiple autonomous robots in a force-protection scenario. Stand-alone sensors detected intruder...

  2. Autonomous robot using infrared thermal camera to discriminate objects in outdoor scene

    NASA Technical Reports Server (NTRS)

    Caillas, C.

    1990-01-01

    A complete autonomous legged robot is being designed at Carnegie Mellon University to perform planetary exploration without human supervision. This robot must traverse unknown and geographically diverse areas in order to collect samples of materials. This paper describes how thermal imaging can be used to identify materials in order to find good footfall positions and material collection sites. First, a model developed for determining the temperature of materials in an outdoor scene is presented. By applying this model, it is shown that it is possible to determine a physical characteristic of the material: its thermal inertia. Second, experimental results are described that consist of recording thermal images of an outdoor scene composed of sand and rock. Third, the results and limitations of applying the model to the experimental images are analyzed. Finally, the paper analyzes how basic segmentation algorithms can be combined with the thermal inertia segmentation in order to improve the discrimination of different kinds of materials.
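
    As a rough illustration of the thermal-inertia cue described above, the sketch below (Python, illustrative only) segments a synthetic scene by assuming that high-thermal-inertia materials such as rock show a smaller day/night temperature swing than sand; the function names, threshold, and synthetic data are assumptions, not values from the paper.

    ```python
    # Illustrative sketch only: per-pixel thermal-inertia proxy from two thermal
    # images. The paper's physical model is more detailed; here we only use the
    # idea that high-thermal-inertia materials (rock) show a smaller day/night
    # temperature swing than low-thermal-inertia materials (sand).
    import numpy as np

    def thermal_inertia_proxy(t_day, t_night):
        """Return a crude inertia proxy: inverse of the diurnal temperature swing."""
        swing = np.abs(t_day - t_night)          # large swing -> low inertia
        return 1.0 / np.maximum(swing, 1e-3)     # avoid division by zero

    def segment_materials(t_day, t_night, threshold=0.1):
        """Binary segmentation: True where the inertia proxy suggests rock."""
        return thermal_inertia_proxy(t_day, t_night) > threshold

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Synthetic 8x8 scene: left half "sand" (20 K swing), right half "rock" (5 K swing).
        t_day = np.full((8, 8), 310.0) + rng.normal(0, 0.5, (8, 8))
        t_night = t_day - np.hstack([np.full((8, 4), 20.0), np.full((8, 4), 5.0)])
        print(segment_materials(t_day, t_night).astype(int))
    ```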

  3. An adaptive localization system for outdoor/indoor navigation for autonomous robots

    NASA Astrophysics Data System (ADS)

    Pacis, E. B.; Sights, B.; Ahuja, G.; Kogut, G.; Everett, H. R.

    2006-05-01

    Many envisioned applications of mobile robotic systems require the robot to navigate in complex urban environments. This need is particularly critical if the robot is to perform as part of a synergistic team with human forces in military operations. Historically, the development of autonomous navigation for mobile robots has targeted either outdoor or indoor scenarios, but not both, which is not how humans operate. This paper describes efforts to fuse component technologies into a complete navigation system, allowing a robot to seamlessly transition between outdoor and indoor environments. Under the Joint Robotics Program's Technology Transfer project, empirical evaluations of various localization approaches were conducted to assess their maturity levels and performance metrics in different exterior/interior settings. The methodologies compared include Markov localization, global positioning system, Kalman filtering, and fuzzy logic. Characterization of these technologies highlighted their best features, which were then fused into an adaptive solution. The final integrated system is described, including its design, experimental results, and a formal demonstration to attendees of the Unmanned Systems Capabilities Conference II in San Diego in December 2005.
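
    One ingredient named above, Kalman filtering, can be illustrated with a minimal one-dimensional sketch that fuses odometry with GPS fixes when they are available (outdoors) and coasts on odometry alone when they are not (indoors). The gains and noise values below are assumptions for illustration, not the adaptive system described in the paper.

    ```python
    # Minimal 1-D Kalman filter sketch: predict with odometry, update with GPS
    # when a fix exists. Illustrative assumption of how such a fusion could look.
    def kalman_step(x, p, u, z=None, q=0.05, r=4.0):
        """One predict/update cycle.
        x, p : state estimate and its variance
        u    : odometry displacement since the last step
        z    : GPS position fix, or None when no fix is available
        q, r : process and measurement noise variances (assumed values)
        """
        # Predict with odometry.
        x, p = x + u, p + q
        # Update only if a GPS fix arrived.
        if z is not None:
            k = p / (p + r)          # Kalman gain
            x = x + k * (z - x)
            p = (1.0 - k) * p
        return x, p

    if __name__ == "__main__":
        x, p = 0.0, 1.0
        fixes = [0.9, 2.1, None, None, 5.2]   # GPS drops out for two steps ("indoors")
        for step, z in enumerate(fixes):
            x, p = kalman_step(x, p, u=1.0, z=z)
            print(f"step {step}: x={x:.2f}, var={p:.2f}")
    ```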

  4. Robotic Lander Completes Multiple Outdoor Flights

    NASA Video Gallery

    NASA’s Robotic Lander Development Project in Huntsville, Ala., has successfully completed seven autonomous outdoor flight tests of a lander prototype, dubbed Mighty Eagle. On Oct. 14, Mighty Eagl...

  5. Miniaturized autonomous robot

    NASA Astrophysics Data System (ADS)

    Ishihara, Hidenori; Fukuda, Toshio

    1998-01-01

    Many projects developing miniaturized autonomous robots have been carried out worldwide. This paper deals with our challenges in developing a miniaturized autonomous robot, defined as a miniaturized closed-loop system with a microprocessor, microactuators, and microsensors. We have developed the micro autonomous robotic system (MARS), consisting of a microprocessor, microsensors, microactuators, communication units, and batteries. The MARS controls itself using a downloaded program supplied through the IR communication system. In this paper, we demonstrate several performance characteristics of the MARS and discuss the properties of the miniaturized autonomous robot.

  6. Micro autonomous robotic system

    NASA Astrophysics Data System (ADS)

    Ishihara, Hidenori; Fukuda, Toshio

    1995-12-01

    This paper presents a structural proposal for the micro autonomous robotic system and shows the design of a prototype. We aim at developing a micro robot that acts autonomously based on its own sensing, in order to propose a way to constitute such a micro autonomous robotic system. However, as the size is miniaturized, the number of sensors becomes restricted and the information available from them becomes scarce. This lack of information makes it difficult to realize high-quality intelligence, so the micro robotic system requires simple control algorithms. In this paper, we propose simple logical algorithms to control the actuators and show the performance of micro robots controlled by them. We design the Micro Line Trace Robot, whose dimensions are about a 1 cm cube and which moves along a black line on white-colored ground, and a programmable micro autonomous robot, whose dimensions are about a 2 cm cube and which performs according to an arbitrary downloaded program.
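
    A hedged sketch of the kind of simple logical rule such a line-trace robot could use is given below: two downward-facing reflectance sensors straddling the line drive a bang-bang steering decision. The sensor names and commands are illustrative assumptions, not the authors' actual circuit.

    ```python
    # Illustrative bang-bang line-following rule for two binary line sensors.
    def line_follow_command(left_dark: bool, right_dark: bool) -> str:
        """Map two binary line sensors to a drive command."""
        if left_dark and right_dark:
            return "forward"        # line under both sensors
        if left_dark:
            return "turn_left"      # line drifting left
        if right_dark:
            return "turn_right"     # line drifting right
        return "search"             # line lost

    if __name__ == "__main__":
        for reading in [(True, True), (True, False), (False, True), (False, False)]:
            print(reading, "->", line_follow_command(*reading))
    ```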

  7. Experiments in autonomous robotics

    SciTech Connect

    Hamel, W.R.

    1987-01-01

    The Center for Engineering Systems Advanced Research (CESAR) is performing basic research in autonomous robotics for energy-related applications in hazardous environments. The CESAR research agenda includes a strong experimental component to assure practical evaluation of new concepts and theories. An evolutionary sequence of mobile research robots has been planned to support research in robot navigation, world sensing, and object manipulation. A number of experiments have been performed in studying robot navigation and path planning with planar sonar sensing. Future experiments will address more complex tasks involving three-dimensional sensing, dexterous manipulation, and human-scale operations.

  8. Autonomous mobile robot teams

    NASA Technical Reports Server (NTRS)

    Agah, Arvin; Bekey, George A.

    1994-01-01

    This paper describes autonomous mobile robot teams performing tasks in unstructured environments. The behavior and intelligence of the group are distributed, and the system does not include a central command base or leader. The novel concept of the Tropism-Based Cognitive Architecture is introduced, which the robots use to transform their sensory information into appropriate actions. The results of a number of simulation experiments are presented. These experiments include worlds where the robot teams must locate, decompose, and gather objects, and defend themselves against hostile predators, while navigating around stationary and mobile obstacles.
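
    The sketch below gives a rough, illustrative reading of a tropism-style sensor-to-action mapping: the robot keeps a table of attractions and aversions and picks the action tied to the strongest tropism among the entities it currently senses. The table entries are invented for illustration; the paper's architecture is richer and also adapts these values.

    ```python
    # Illustrative tropism table: perceived entity -> (attraction/aversion, action).
    TROPISMS = {
        "object":   (+5, "gather"),
        "teammate": (+2, "approach"),
        "predator": (-8, "flee"),
        "obstacle": (-3, "avoid"),
    }

    def select_action(percepts):
        """Choose the action whose tropism has the largest absolute strength."""
        sensed = [(abs(v), action) for name, (v, action) in TROPISMS.items() if name in percepts]
        return max(sensed)[1] if sensed else "wander"

    if __name__ == "__main__":
        print(select_action({"object", "predator"}))   # -> flee (|-8| > |+5|)
        print(select_action({"object", "teammate"}))   # -> gather
        print(select_action(set()))                    # -> wander
    ```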

  9. Autonomous Robot Skill Acquisition

    DTIC Science & Technology

    2011-05-01

    Autonomous Robot Skill Acquisition. Ph.D. dissertation, George Dimitri Konidaris (B.Sc., University of the Witwatersrand; B.Sc. Hons., University of the Witwatersrand; M.Sc., University of Edinburgh; Ph.D., University of Massachusetts Amherst), May 2011, directed by Professor Andrew G. Barto. Among the most...

  10. Demonstration of autonomous air monitoring through robotics

    SciTech Connect

    Rancatore, R.

    1989-11-01

    The project included modifying an existing teleoperated robot to provide autonomous navigation, large-object avoidance, and air monitoring, and demonstrating that prototype robot system in indoor and outdoor environments. The robot was also modified to carry an HNU PI-101 Photoionization Detector air monitoring device. A sonar range finder, which was already an integral part of the Surveyor, was repositioned to the front of the robot chassis to detect large obstacles in the path of the robot. In addition, the software of the onboard computer was extensively modified to provide navigation control, dynamic steering to smoothly follow the wire course without hesitation, obstacle avoidance, autonomous shutdown, and remote reporting of toxic substance detection.

  11. Cooperative Autonomous Robots for Reconnaissance

    DTIC Science & Technology

    2009-03-06

    Collaborating mobile robots equipped with WiFi transceivers are configured as a mobile ad-hoc network. Algorithms are developed to take advantage of the distributed processing...

  12. Evaluating Autonomous Ground-Robots

    DTIC Science & Technology

    2012-06-14

    Anthony Finn, Adam Jacoff, Mike Del Rose, Bob Kania, Udam Silva, and Jon Bornstein. The robotics community benefits from common test methods and metrics of performance to focus their research. As a result, many performance... Example metrics include the time taken for computation of hazard detection (did the robots 'stop to think') and the number and nature of obstacles detected, avoided, etc.

  13. GRACE and GEORGE: Autonomous Robots for the AAAI Robot Challenge

    DTIC Science & Technology

    2004-01-01

    Reid Simmons, Allison Bruce, Dani Goldberg, Adam Goode, Michael Montemerlo, Nicholas... (2004). Cited references include R. Simmons, "A Social Robot that Stands in Line," Autonomous Robots, 12(3), pp. 313-324, May 2002, and [Ortony, 1988] A. Ortony, G. L. Clore, and A. Collins...

  14. Open multiagent architecture extended to distributed autonomous robotic systems

    NASA Astrophysics Data System (ADS)

    Sellem, Philippe; Amram, Eric; Luzeaux, Dominique

    2000-07-01

    Our research deals with the design and experimentation of a control architecture for an autonomous outdoor mobile robot that uses mainly vision for perception. In this case of a single robot, we have designed a hybrid architecture with an attention mechanism that allows dynamic selection of perception processes. Building on this work, we have developed an open multi-agent architecture for standard multi-task operating systems, using the C++ programming language and Posix threads. Our implementation features efficient and fully generic messages between agents, automatic acknowledgement receipts, and built-in synchronization capabilities. Knowledge is distributed among robots according to a collaborative scheme: every robot builds its own representation of the world and shares it with others. Pieces of information are exchanged when decisions have to be made. Experiments are to be conducted with two outdoor ActiveMedia Pioneer AT mobile robots. Distributed perception, using mainly vision but also ultrasound, will serve as proof of concept.

  15. Autonomous mobile robot

    SciTech Connect

    Mattaboni, P.J.

    1987-01-20

    This patent describes a mobile robot of the type having (a) a vision system, (b) memory means for storing data derived from the robot vision system, and (c) a computer for processing data derived from the robot's vision system, the improvement wherein the robot's vision system comprises (i) a first array of ranging transducers for obtaining data on the position and distance of far objects in a volume of space, the transducers of the first array being symmetrically disposed on the mobile robot with respect to an axis of symmetry within the mobile robot. Each transducer of the first array is fixed in position with respect to that axis of symmetry and sees a portion of the volume of space seen by its entire array; (ii) a second array of ranging transducers for obtaining data of the position and distance of near objects in the same or an overlapping volume of space, the transducers of the second array being symmetrically disposed on the mobile robot with respect to the axis of symmetry. Each transducer of the second array is fixed in position with respect to the axis of symmetry and sees a portion of the volume of space seen by its entire array, the angle of view of the transducers of the second array being different from the angle of view of the transducers of the first array with respect to the same object in space; and (iii) means for polling the ranging transducers in sequences determined by the computer.

  16. Autonomous mobile robots: Vehicles with cognitive control

    SciTech Connect

    Meystel, A.

    1987-01-01

    This book explores a new, rapidly developing area of robotics. It describes the state of the art in intelligent control and applied machine intelligence, and the research and initial stages of manufacturing of autonomous mobile robots. A complete account of the theoretical and experimental results obtained during the last two decades, together with some generalizations on Autonomous Mobile Systems, is included in this book. Contents: Introduction; Requirements and Specifications; State-of-the-art in Autonomous Mobile Robots Area; Structure of Intelligent Mobile Autonomous System; Planner; Navigator; Pilot; Cartographer; Actuation Control; Computer Simulation of Autonomous Operation; Testing the Autonomous Mobile Robot; Conclusions; Bibliography.

  17. Autonomous caregiver following robotic wheelchair

    NASA Astrophysics Data System (ADS)

    Ratnam, E. Venkata; Sivaramalingam, Sethurajan; Vignesh, A. Sri; Vasanth, Elanthendral; Joans, S. Mary

    2011-12-01

    In the last decade, a variety of robotic/intelligent wheelchairs have been proposed to meet the needs of an aging society. Their main research topics are autonomous functions, such as moving toward a goal while avoiding obstacles, and user-friendly interfaces. Although it is desirable for wheelchair users to go out alone, caregivers often accompany them. Therefore we have to consider not only autonomous functions and user interfaces but also how to reduce caregivers' load and support their activities in a communication aspect. From this point of view, we have proposed a robotic wheelchair that moves alongside a caregiver, based on MATLAB processing. In this project we discuss a robotic wheelchair that follows a caregiver using a microcontroller, an ultrasonic sensor, a keypad, and motor drivers to operate the robot. Images are captured using a camera interfaced with the DM6437 (DaVinci code processor). The captured images are processed using image-processing techniques, converted into voltage levels through a MAX232 level converter, and passed to the microcontroller unit serially, while the ultrasonic sensor detects obstacles in front of the robot. The robot has a mode-selection switch for automatic and manual control: in automatic mode the ultrasonic sensor is used to find obstacles, and in manual mode the keypad is used to operate the wheelchair. In the microcontroller unit, C-language code is predefined, and the robot connected to it is controlled according to this code. The robot's motors are activated through the motor drivers, which are essentially switches that turn the motors on and off according to the commands given by the microcontroller unit.

  18. Miniature Autonomous Robotic Vehicle (MARV)

    SciTech Connect

    Feddema, J.T.; Kwok, K.S.; Driessen, B.J.; Spletzer, B.L.; Weber, T.M.

    1996-12-31

    Sandia National Laboratories (SNL) has recently developed a 16 cm³ (1 in³) autonomous robotic vehicle which is capable of tracking a single conducting wire carrying a 96 kHz signal. This vehicle was developed to assess the limiting factors in using commercial technology to build miniature autonomous vehicles. Particular attention was paid to the design of the control system to search out the wire, track it, and recover if the wire was lost. This paper describes the test vehicle and the control analysis. Presented in the paper are the vehicle model, control laws, a stability analysis, simulation studies and experimental results.
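
    A hedged sketch of a wire-tracking steering rule in the spirit described above follows: two pickup coils sense the 96 kHz signal amplitude on either side of the vehicle, steering is proportional to their difference, and the vehicle falls back to a search behavior when the signal is lost. The gains and thresholds are illustrative assumptions, not the SNL design.

    ```python
    # Illustrative wire-tracking rule: steer toward the stronger pickup coil,
    # search when the combined signal amplitude drops below a threshold.
    def wire_track_step(left_amp, right_amp, k_p=1.5, lost_threshold=0.05):
        """Return (mode, steering) from two coil amplitudes."""
        total = left_amp + right_amp
        if total < lost_threshold:
            return "search", 1.0                 # spin in place to reacquire the wire
        error = (right_amp - left_amp) / total   # normalized lateral error
        return "track", k_p * error              # steer toward the stronger side

    if __name__ == "__main__":
        print(wire_track_step(0.40, 0.42))   # nearly centered over the wire
        print(wire_track_step(0.10, 0.35))   # wire off to the right
        print(wire_track_step(0.01, 0.02))   # signal lost -> search
    ```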

  19. A power autonomous monopedal robot

    NASA Astrophysics Data System (ADS)

    Krupp, Benjamin T.; Pratt, Jerry E.

    2006-05-01

    We present the design and initial results of a power-autonomous planar monopedal robot. The robot is a gasoline powered, two degree of freedom robot that runs in a circle, constrained by a boom. The robot uses hydraulic Series Elastic Actuators, force-controllable actuators which provide high force fidelity, moderate bandwidth, and low impedance. The actuators are mounted in the body of the robot, with cable drives transmitting power to the hip and knee joints of the leg. A two-stroke, gasoline engine drives a constant displacement pump which pressurizes an accumulator. Absolute position and spring deflection of each of the Series Elastic Actuators are measured using linear encoders. The spring deflection is translated into force output and compared to desired force in a closed loop force-control algorithm implemented in software. The output signal of each force controller drives high performance servo valves which control flow to each of the pistons of the actuators. In designing the robot, we used a simulation-based iterative design approach. Preliminary estimates of the robot's physical parameters were based on past experience and used to create a physically realistic simulation model of the robot. Next, a control algorithm was implemented in simulation to produce planar hopping. Using the joint power requirements and range of motions from simulation, we worked backward specifying pulley diameter, piston diameter and stroke, hydraulic pressure and flow, servo valve flow and bandwidth, gear pump flow, and engine power requirements. Components that meet or exceed these specifications were chosen and integrated into the robot design. Using CAD software, we calculated the physical parameters of the robot design, replaced the original estimates with the CAD estimates, and produced new joint power requirements. We iterated on this process, resulting in a design which was prototyped and tested. The Monopod currently runs at approximately 1.2 m/s with the weight of all
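
    The force loop described above can be illustrated with a minimal sketch: spring deflection is converted to a force estimate via an assumed spring constant, compared with the desired force, and a PI law produces a valve command. The spring constant, gains, and limits are illustrative assumptions, not the robot's actual parameters.

    ```python
    # Illustrative Series Elastic Actuator force controller (PI on force error).
    class SeriesElasticForceController:
        def __init__(self, spring_k=50000.0, kp=0.002, ki=0.01, dt=0.001):
            self.spring_k, self.kp, self.ki, self.dt = spring_k, kp, ki, dt
            self.integral = 0.0

        def update(self, desired_force, spring_deflection):
            """Return a valve command from desired force [N] and measured deflection [m]."""
            measured_force = self.spring_k * spring_deflection   # Hooke's law estimate
            error = desired_force - measured_force
            self.integral += error * self.dt
            command = self.kp * error + self.ki * self.integral
            return max(-1.0, min(1.0, command))                  # clamp to valve range

    if __name__ == "__main__":
        ctrl = SeriesElasticForceController()
        deflection = 0.0
        for _ in range(5):
            cmd = ctrl.update(desired_force=400.0, spring_deflection=deflection)
            deflection += 0.0002 * cmd        # toy actuator response for illustration
            print(f"cmd={cmd:+.3f}, force={ctrl.spring_k * deflection:.1f} N")
    ```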

  20. Structured control for autonomous robots

    SciTech Connect

    Simmons, R.G. (School of Computer Science)

    1994-02-01

    To operate in rich, dynamic environments, autonomous robots must be able to effectively utilize and coordinate their limited physical and computational resources. As complexity increases, it becomes necessary to impose explicit constraints on the control of planning, perception, and action to ensure that unwanted interactions between behaviors do not occur. This paper advocates developing complex robot systems by layering reactive behaviors onto deliberative components. In this structured control approach, the deliberative components handle normal situations and the reactive behaviors, which are explicitly constrained as to when and how they are activated, handle exceptional situations. The Task Control Architecture (TCA) has been developed to support this approach. TCA provides an integrated set of control constructs useful for implementing deliberative and reactive behaviors. The control constructs facilitate modular and evolutionary system development: they are used to integrate and coordinate planning, perception, and execution, and to incrementally improve the efficiency and robustness of the robot systems. To date, TCA has been used in implementing a half-dozen mobile robot systems, including an autonomous six-legged rover and an indoor mobile manipulator.

  1. [Mobile autonomous robots-Possibilities and limits].

    PubMed

    Maehle, E; Brockmann, W; Walthelm, A

    2002-02-01

    Besides industrial robots, which today are firmly established in production processes, service robots are becoming more and more important. They are intended to provide services for humans in different areas of their professional and everyday environment, including medicine. Most of these service robots are mobile, which requires intelligent autonomous behaviour. After characterising the different kinds of robots, this paper critically discusses the relevant paradigms of intelligent autonomous behaviour for mobile robots and illustrates them with three concrete examples of robots realized in Lübeck. In addition, a short survey of current surgical robots and an outlook on future developments are given.

  2. Lethality and Autonomous Robots: An Ethical Stance

    DTIC Science & Technology

    2007-01-01

    Ronald C. Arkin and Lilia Moshkina, College of Computing, Georgia Institute of Technology, Atlanta. ...Autonomous robots that maintain an ethical infrastructure to govern their behavior will be referred to as humane-oids...

  3. Autonomous Robotic Inspection in Tunnels

    NASA Astrophysics Data System (ADS)

    Protopapadakis, E.; Stentoumis, C.; Doulamis, N.; Doulamis, A.; Loupos, K.; Makantasis, K.; Kopsiaftis, G.; Amditis, A.

    2016-06-01

    In this paper, an automatic robotic inspector for tunnel assessment is presented. The proposed platform is able to autonomously navigate within civil infrastructure, grab stereo images, and process/analyse them in order to identify defect types. First, cracks are detected via deep learning approaches. Then, a detailed 3D model of the cracked area is created using photogrammetric methods. Finally, laser profiling of the tunnel's lining is performed for a narrow region close to the detected crack, allowing potential deformations to be deduced. The robotic platform consists of an autonomous mobile vehicle and a crane arm, guided by the computer-vision-based crack detector, carrying ultrasound sensors, the stereo cameras, and the laser scanner. Visual inspection is based on convolutional neural networks, which support the creation of high-level discriminative features for complex non-linear pattern classification. Real-time 3D information is then accurately calculated, and the crack position and orientation are passed to the robotic platform. The entire system has been evaluated in railway and road tunnels, i.e., the Egnatia Highway and London Underground infrastructure.

  4. Three-dimensional vision sensors for autonomous robots

    NASA Astrophysics Data System (ADS)

    Uchiyama, Takashi; Okabayashi, Keizyu; Wakitani, Jun

    1993-09-01

    A three-dimensional measurement system, which is important for developing autonomous robots, is described. Industrial robots used in today's plants are of the preprogrammed teaching-playback type. It is necessary to develop autonomous robots that can work based on sensor information for intelligent manufacturing systems. Moreover, practical use of robots that work in unstructured environments such as outdoors and in space is expected. To realize this, a function to measure objects and the environment three-dimensionally is a key technology. Additional important requirements for robotic sensors are real-time processing and compactness. We have developed smart 3-D vision sensors for the purpose of realizing autonomous robots. These are two kinds of sensors with different functions corresponding to the application. One is a slitted light range finder (SLRF) to measure stationary objects. The other is a real-time tracking vision (RTTV) system which can measure moving objects at high speed. The SLRF uses multiple slitted lights which are generated by a semiconductor laser through an interference filter and a cylindrical lens. Furthermore, we developed a liquid crystal shutter with multiple electrodes. We devised a technique to make coded slitted light by putting this shutter in front of the light source. As a result, using the principle of triangulation, objects can be measured in three dimensions. In addition, high-speed image input was enabled by projecting multiple slitted lights at the same time. We have confirmed the effectiveness of the SLRF applied to a hand-eye system using a robot.

  5. Spatial abstraction for autonomous robot navigation.

    PubMed

    Epstein, Susan L; Aroor, Anoop; Evanusa, Matthew; Sklar, Elizabeth I; Parsons, Simon

    2015-09-01

    Optimal navigation for a simulated robot relies on a detailed map and explicit path planning, an approach problematic for real-world robots that are subject to noise and error. This paper reports on autonomous robots that rely on local spatial perception, learning, and commonsense rationales instead. Despite realistic actuator error, learned spatial abstractions form a model that supports effective travel.

  6. Hierarchical loop detection for mobile outdoor robots

    NASA Astrophysics Data System (ADS)

    Lang, Dagmar; Winkens, Christian; Häselich, Marcel; Paulus, Dietrich

    2012-01-01

    Loop closing is a fundamental part of 3D simultaneous localization and mapping (SLAM) that can greatly enhance the quality of long-term mapping. It is essential for the creation of globally consistent maps. Conceptually, loop closing is divided into detection and optimization. Recent approaches depend on a single sensor to recognize previously visited places in the loop detection stage. In this study, we combine data of multiple sensors such as GPS, vision, and laser range data to enhance detection results in repetitively changing environments that are not sufficiently explained by a single sensor. We present a fast and robust hierarchical loop detection algorithm for outdoor robots to achieve a reliable environment representation even if one or more sensors fail.

  7. A Biologically-Inspired Autonomous Robot

    DTIC Science & Technology

    1993-12-13

    Performance report for Grant N00014-90-J-1545, "A Biologically-Inspired Autonomous Robot." Reported results include a rough estimate of the torque generated by the electrical activation of the muscle during the movement; the previous simulation of the robot has been... reaction forces for the robot that share features with Full's force measurements of cockroach walking; and the 18 motor driver circuits for the robot have...

  8. Autonomous robot vision software design using Matlab toolboxes

    NASA Astrophysics Data System (ADS)

    Tedder, Maurice; Chung, Chan-Jin

    2004-10-01

    The purpose of this paper is to introduce a cost-effective way to design robot vision and control software using Matlab for an autonomous robot designed to compete in the 2004 Intelligent Ground Vehicle Competition (IGVC). The goal of the autonomous challenge event is for the robot to autonomously navigate an outdoor obstacle course bounded by solid and dashed lines on the ground. Visual input data is provided by a DV camcorder at 160 x 120 pixel resolution. The design of this system involved writing an image-processing algorithm using hue, saturation, and brightness (HSB) color filtering and Matlab image processing functions to extract the centroid, area, and orientation of the connected regions from the scene. These feature vectors are then mapped to linguistic variables that describe the objects in the world environment model. The linguistic variables act as inputs to a fuzzy logic controller designed using the Matlab fuzzy logic toolbox, which provides the knowledge and intelligence component necessary to achieve the desired goal. Java provides the central interface to the robot motion control and image acquisition components. Field test results indicate that the Matlab based solution allows for rapid software design, development and modification of our robot system.
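
    A rough Python/NumPy transcription of the pipeline described above (the paper itself used Matlab toolboxes) is sketched below: threshold an HSB image to find line-colored pixels, label connected regions, and extract centroid and area as feature vectors. The hue bounds and test image are assumptions for illustration; orientation extraction and the fuzzy controller are omitted.

    ```python
    # Illustrative HSB-threshold + connected-region feature extraction.
    import numpy as np
    from scipy import ndimage

    def extract_regions(hsb, hue_lo=0.10, hue_hi=0.20, min_brightness=0.5):
        """Return (area, centroid_row, centroid_col) for each connected bright
        region whose hue falls in [hue_lo, hue_hi]."""
        h, s, b = hsb[..., 0], hsb[..., 1], hsb[..., 2]
        mask = (h >= hue_lo) & (h <= hue_hi) & (b >= min_brightness)
        labels, n = ndimage.label(mask)
        features = []
        for i in range(1, n + 1):
            area = int((labels == i).sum())
            cy, cx = ndimage.center_of_mass(labels == i)
            features.append((area, cy, cx))
        return features

    if __name__ == "__main__":
        img = np.zeros((8, 8, 3))
        img[2:4, 1:7] = (0.15, 0.8, 0.9)     # a bright "painted line" blob
        print(extract_regions(img))
    ```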

  9. Autonomous Student Experiences in Outdoor and Adventure Education

    ERIC Educational Resources Information Center

    Daniel, Brad; Bobilya, Andrew J.; Kalisch, Kenneth R.; McAvoy, Leo H.

    2014-01-01

    This article explores the current state of knowledge regarding the use of autonomous student experiences (ASE) in outdoor and adventure education (OAE) programs. ASE are defined as components (e.g., solo, final expedition) in which participants have a greater measure of choice and control over the planning, execution, and outcomes of their…

  10. Tele-robotic/autonomous control using controlshell

    SciTech Connect

    Wilhelmsen, K.C.; Hurd, R.L.; Couture, S.

    1996-12-10

    A tele-robotic and autonomous controller architecture for waste handling and sorting has been developed which uses tele-robotics, autonomous grasping and image processing. As a starting point, prior work from LLNL and ORNL was restructured and ported to a special real-time development environment. Significant improvements in collision avoidance, force compliance, and shared control aspects were then developed. Several orders of magnitude improvement were made in some areas to meet the speed and robustness requirements of the application.

  11. Control algorithms for autonomous robot navigation

    SciTech Connect

    Jorgensen, C.C.

    1985-09-20

    This paper examines control algorithm requirements for autonomous robot navigation outside laboratory environments. Three aspects of navigation are considered: navigation control in explored terrain, environment interactions with robot sensors, and navigation control in unanticipated situations. Major navigation methods are presented and relevance of traditional human learning theory is discussed. A new navigation technique linking graph theory and incidental learning is introduced.

  12. Automatic learning by an autonomous mobile robot

    SciTech Connect

    de Saussure, G.; Spelt, P.F.; Killough, S.M.; Pin, F.G.; Weisbin, C.R.

    1989-01-01

    This paper describes recent research in automatic learning by the autonomous mobile robot HERMIES-IIB at the Center for Engineering Systems Advanced Research (CESAR). By acting on the environment and observing the consequences during a set of training examples, the robot learns a sequence of successful manipulations on a simulated control panel. The robot learns to classify panel configurations in order to deal with new configurations that are not part of the original training set. 5 refs., 2 figs.

  13. Mapping planetary caves with an autonomous, heterogeneous robot team

    NASA Astrophysics Data System (ADS)

    Husain, Ammar; Jones, Heather; Kannan, Balajee; Wong, Uland; Pimentel, Tiago; Tang, Sarah; Daftry, Shreyansh; Huber, Steven; Whittaker, William L.

    Caves on other planetary bodies offer sheltered habitat for future human explorers and numerous clues to a planet's past for scientists. While recent orbital imagery provides exciting new details about cave entrances on the Moon and Mars, the interiors of these caves are still unknown and not observable from orbit. Multi-robot teams offer unique solutions for exploring and modeling subsurface voids during precursor missions. Robot teams that are diverse in terms of size, mobility, sensing, and capability can provide great advantages, but this diversity, coupled with inherently distinct low-level behavior architectures, makes coordination a challenge. This paper presents a framework that consists of an autonomous frontier and capability-based task generator, a distributed market-based strategy for coordinating and allocating tasks to the different team members, and a communication paradigm for seamless interaction between the different robots in the system. Robots have different sensors (in the representative robot team used for testing: 2D mapping sensors, 3D modeling sensors, or no exteroceptive sensors) and varying levels of mobility. Tasks are generated to explore, model, and take science samples. Based on an individual robot's capability and associated cost for executing a generated task, a robot is autonomously selected for task execution. The robots create coarse online maps and store collected data for high resolution offline modeling. The coordination approach has been field tested at a mock cave site with highly-unstructured natural terrain, as well as an outdoor patio area. Initial results are promising for applicability of the proposed multi-robot framework to exploration and modeling of planetary caves.
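
    The market-based allocation idea named above can be illustrated with a minimal single-task auction: only robots whose capabilities cover the task may bid, and the lowest-cost bidder wins. The robot names, capabilities, and distance-based cost below are invented for illustration; the paper's framework is distributed and considerably richer.

    ```python
    # Illustrative capability-filtered auction for a single task.
    ROBOTS = {
        "mapper2d":  {"caps": {"map2d"},            "pos": (0.0, 0.0)},
        "modeler3d": {"caps": {"map2d", "model3d"}, "pos": (10.0, 0.0)},
        "sampler":   {"caps": {"sample"},           "pos": (5.0, 5.0)},
    }

    def auction(task_cap, task_pos):
        """Return (winner, cost) for a single-task auction, or (None, inf) if no bids."""
        best, best_cost = None, float("inf")
        for name, robot in ROBOTS.items():
            if task_cap not in robot["caps"]:
                continue                                   # not capable: no bid
            dx, dy = task_pos[0] - robot["pos"][0], task_pos[1] - robot["pos"][1]
            cost = (dx * dx + dy * dy) ** 0.5              # bid = travel distance
            if cost < best_cost:
                best, best_cost = name, cost
        return best, best_cost

    if __name__ == "__main__":
        print(auction("model3d", (2.0, 1.0)))   # only modeler3d can bid
        print(auction("map2d",  (2.0, 1.0)))    # mapper2d wins on distance
    ```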

  14. Tele/Autonomous Robot For Nuclear Facilities

    NASA Technical Reports Server (NTRS)

    Backes, Paul G.; Tso, Kam S.

    1994-01-01

    Fail-safe tele/autonomous robotic system makes it unnecessary for human technicians to enter nuclear-fuel-reprocessing facilities and other high-radiation or otherwise hazardous industrial environments. Used to carry out such experiments as exchanging equipment modules, turning bolts, cleaning surfaces, and grappling and turning objects, by use of a mixture of autonomous actions and teleoperation with either a single arm or two cooperating arms. System capable of fully autonomous operation, teleoperation, or shared control.

  15. INL Autonomous Navigation System

    SciTech Connect

    2005-03-30

    The INL Autonomous Navigation System provides instructions for autonomously navigating a robot. The system permits high-speed autonomous navigation, including obstacle avoidance, waypoint navigation, and path planning, in both indoor and outdoor environments.

  16. Autonomous Navigation for Mobile Robots with Human-Robot Interaction

    NASA Astrophysics Data System (ADS)

    Ballantyne, James; Johns, Edward; Valibeik, Salman; Wong, Charence; Yang, Guang-Zhong

    Dynamic and complex indoor environments present a challenge for mobile robot navigation. The robot must be able to simultaneously map the environment, which often has repetitive features, whilst keeping track of its pose and location. This chapter introduces some of the key considerations for human-guided navigation. Rather than letting the robot explore the environment fully autonomously, we consider the use of human guidance for progressively building up the environment map and establishing scene association, learning, as well as navigation and planning. After the guide has taken the robot through the environment and indicated the points of interest via hand gestures, the robot is then able to use the geometric map and scene descriptors captured during the tour to create a high-level plan for subsequent autonomous navigation within the environment. Issues related to gesture recognition, multi-cue integration, tracking, target pursuing, scene association and navigation planning are discussed.

  17. Supervised autonomous robotic soft tissue surgery.

    PubMed

    Shademan, Azad; Decker, Ryan S; Opfermann, Justin D; Leonard, Simon; Krieger, Axel; Kim, Peter C W

    2016-05-04

    The current paradigm of robot-assisted surgeries (RASs) depends entirely on an individual surgeon's manual capability. Autonomous robotic surgery-removing the surgeon's hands-promises enhanced efficacy, safety, and improved access to optimized surgical techniques. Surgeries involving soft tissue have not been performed autonomously because of technological limitations, including lack of vision systems that can distinguish and track the target tissues in dynamic surgical environments and lack of intelligent algorithms that can execute complex surgical tasks. We demonstrate in vivo supervised autonomous soft tissue surgery in an open surgical setting, enabled by a plenoptic three-dimensional and near-infrared fluorescent (NIRF) imaging system and an autonomous suturing algorithm. Inspired by the best human surgical practices, a computer program generates a plan to complete complex surgical tasks on deformable soft tissue, such as suturing and intestinal anastomosis. We compared metrics of anastomosis-including the consistency of suturing informed by the average suture spacing, the pressure at which the anastomosis leaked, the number of mistakes that required removing the needle from the tissue, completion time, and lumen reduction in intestinal anastomoses-between our supervised autonomous system, manual laparoscopic surgery, and clinically used RAS approaches. Despite dynamic scene changes and tissue movement during surgery, we demonstrate that the outcome of supervised autonomous procedures is superior to surgery performed by expert surgeons and RAS techniques in ex vivo porcine tissues and in living pigs. These results demonstrate the potential for autonomous robots to improve the efficacy, consistency, functional outcome, and accessibility of surgical techniques.

  18. JOMAR: Joint Operations with Mobile Autonomous Robots

    DTIC Science & Technology

    2015-12-21

    Reported contributions include a characterization of Global Positioning System (GPS) noise models in the MaxMixture framework, allowing significant improvements in GPS-aided navigation, and a data-association algorithm with applications to target tracking and computer vision, named the... Cited references include "...autonomous tractor operations," Autonomous Robots, vol. 13, no. 1, pp. 87–104, 2002, and [11] J. Kim and S. Sukkarieh, "SLAM aided GPS/INS navigation in GPS..."

  19. Autonomous Systems, Robotics, and Computing Systems Capability Roadmap: NRC Dialogue

    NASA Technical Reports Server (NTRS)

    Zornetzer, Steve; Gage, Douglas

    2005-01-01

    Contents include the following: Introduction. Process, Mission Drivers, Deliverables, and Interfaces. Autonomy. Crew-Centered and Remote Operations. Integrated Systems Health Management. Autonomous Vehicle Control. Autonomous Process Control. Robotics. Robotics for Solar System Exploration. Robotics for Lunar and Planetary Habitation. Robotics for In-Space Operations. Computing Systems. Conclusion.

  20. Synthesis of autonomous robots through evolution.

    PubMed

    Nolfi, Stefano; Floreano, Dario

    2002-01-01

    Evolutionary robotics is the attempt to develop robots through a self-organized process based on artificial evolution. This approach stresses the importance of the study of systems that have a body and that are situated in a physical environment, and which autonomously develop their own skills in close interaction with the environment. In this review we briefly illustrate the method and the main concepts of evolutionary robotics, and examine the most significant contributions in this area. We also discuss some of the contributions that this research area is making to the foundational debate in cognitive science.

  1. From Autonomous Robots to Artificial Ecosystems

    NASA Astrophysics Data System (ADS)

    Mastrogiovanni, Fulvio; Sgorbissa, Antonio; Zaccaria, Renato

    During the past few years, starting from the two mainstream fields of Ambient Intelligence [2] and Robotics [17], several authors recognized the benefits of the so-called Ubiquitous Robotics paradigm. According to this perspective, mobile robots are no longer autonomous, physically situated and embodied entities adapting themselves to a world tailored for humans: on the contrary, they are able to interact with devices distributed throughout the environment and exchange heterogeneous information by means of communication technologies. Information exchange, coupled with simple actuation capabilities, is meant to replace physical interaction between robots and their environment. Two benefits are evident: (i) smart environments overcome inherent limitations of mobile platforms, whereas (ii) mobile robots offer a mobility dimension unknown to smart environments.

  2. The Baker Observatory Robotic Autonomous Telescope

    NASA Astrophysics Data System (ADS)

    Hicks, L. L.; Reed, M. D.; Thompson, M. A.; Gilker, J. T.

    We describe the Baker Observatory Robotic Autonomous Telescope project. The hardware includes a 16-inch Meade LX-200 telescope, a 7-foot AstroHaven dome, an Apogee U47 CCD camera and filter wheel, a Boltwood Cloud Sensor II, and various other minor hardware. We are implementing RTS2 for the Telescope Control System and incorporating custom drivers for ancillary systems.

  3. Diagnosing faults in autonomous robot plan execution

    NASA Technical Reports Server (NTRS)

    Lam, Raymond K.; Doshi, Rajkumar S.; Atkinson, David J.; Lawson, Denise M.

    1989-01-01

    A major requirement for an autonomous robot is the capability to diagnose faults during plan execution in an uncertain environment. Much diagnostic research concentrates only on hardware failures within an autonomous robot. Taking a different approach, the implementation of a Telerobot Diagnostic System is described that addresses, in addition to hardware failures, failures caused by unexpected event changes in the environment and failures due to plan errors. One feature of the system is the utilization of task-plan knowledge and context information to deduce fault symptoms. This forward deduction provides valuable information on past activities and the current expectations of a robotic event, both of which can guide the plan-execution inference process. The inference process adopts a model-based technique to recreate the plan-execution process and to confirm fault-source hypotheses. This technique allows the system to diagnose multiple faults due to either unexpected plan failures or hardware errors. This research initiates a major effort to investigate relationships between hardware faults and plan errors, relationships which were not addressed in the past. The results of this research will provide a clear understanding of how to generate a better task planner for an autonomous robot and how to recover the robot from faults in a critical environment.

  4. Diagnosing faults in autonomous robot plan execution

    NASA Technical Reports Server (NTRS)

    Lam, Raymond K.; Doshi, Rajkumar S.; Atkinson, David J.; Lawson, Denise M.

    1988-01-01

    A major requirement for an autonomous robot is the capability to diagnose faults during plan execution in an uncertain environment. Much diagnostic research concentrates only on hardware failures within an autonomous robot. Taking a different approach, the implementation of a Telerobot Diagnostic System is described that addresses, in addition to hardware failures, failures caused by unexpected event changes in the environment and failures due to plan errors. One feature of the system is the utilization of task-plan knowledge and context information to deduce fault symptoms. This forward deduction provides valuable information on past activities and the current expectations of a robotic event, both of which can guide the plan-execution inference process. The inference process adopts a model-based technique to recreate the plan-execution process and to confirm fault-source hypotheses. This technique allows the system to diagnose multiple faults due to either unexpected plan failures or hardware errors. This research initiates a major effort to investigate relationships between hardware faults and plan errors, relationships which were not addressed in the past. The results of this research will provide a clear understanding of how to generate a better task planner for an autonomous robot and how to recover the robot from faults in a critical environment.

  5. Neuromodulation and plasticity in an autonomous robot.

    PubMed

    Sporns, Olaf; Alexander, William H

    2002-01-01

    In this paper we implement a computational model of a neuromodulatory system in an autonomous robot. The output of the neuromodulatory system acts as a value signal, modulating widely distributed synaptic changes. The model is based on anatomical and physiological properties of midbrain diffuse ascending systems, in particular parts of the dopamine and noradrenaline systems. During reward conditioning, the model learns to generate tonic and phasic signals that represent predictions and prediction errors, including precisely timed negative signals if expected rewards are omitted or delayed. We test the robot's learning and behavior in different environmental contexts and observe changes in the development of the neuromodulatory system that depend upon environmental factors. Simulation of a computational model incorporating both reward-related and aversive stimuli leads to the emergence of conditioned reward and aversive behaviors. These studies represent a step towards investigating computational aspects of neuromodulatory systems in autonomous robots.

  6. Autonomous mobile robot for radiologic surveys

    DOEpatents

    Dudar, A.M.; Wagner, D.G.; Teese, G.D.

    1994-06-28

    An apparatus is described for conducting radiologic surveys. The apparatus comprises in the main a robot capable of following a preprogrammed path through an area, a radiation monitor adapted to receive input from a radiation detector assembly, ultrasonic transducers for navigation and collision avoidance, and an on-board computer system including an integrator for interfacing the radiation monitor and the robot. Front and rear bumpers are attached to the robot by bumper mounts. The robot may be equipped with memory boards for the collection and storage of radiation survey information. The on-board computer system is connected to a remote host computer via a UHF radio link. The apparatus is powered by a rechargeable 24-volt DC battery, and is stored at a docking station when not in use and/or for recharging. A remote host computer contains a stored database defining paths between points in the area where the robot is to operate, including but not limited to the locations of walls, doors, stationary furniture and equipment, and sonic markers if used. When a program consisting of a series of paths is downloaded to the on-board computer system, the robot conducts a floor survey autonomously at any preselected rate. When the radiation monitor detects contamination, the robot resurveys the area at reduced speed and resumes its preprogrammed path if the contamination is not confirmed. If the contamination is confirmed, the robot stops and sounds an alarm. 5 figures.
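
    The survey behavior described in this abstract can be read as a small state machine, sketched below: survey at normal speed, drop to a reduced speed to re-check when the monitor reports contamination, resume if it is not confirmed, and stop with an alarm if it is. The speeds and single-recheck confirmation rule are illustrative assumptions.

    ```python
    # Illustrative survey/resurvey/alarm state machine.
    def survey_step(state, contamination_detected):
        """Return (next_state, speed, alarm) for one control cycle."""
        if state == "SURVEY":
            return ("RESURVEY", 0.2, False) if contamination_detected else ("SURVEY", 1.0, False)
        if state == "RESURVEY":
            return ("ALARM", 0.0, True) if contamination_detected else ("SURVEY", 1.0, False)
        return ("ALARM", 0.0, True)          # latched: stopped and alarming

    if __name__ == "__main__":
        state = "SURVEY"
        for reading in [False, True, False, True, True]:
            state, speed, alarm = survey_step(state, reading)
            print(f"detected={reading} -> state={state}, speed={speed}, alarm={alarm}")
    ```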

  7. A mobile autonomous robot for radiological surveys

    SciTech Connect

    Dudar, A.M.; Wagner, D.G.; Teese, G.D.

    1992-01-01

    The Robotics Development Group at the Savannah River Site is developing an autonomous robot (SIMON) to perform radiological surveys of potentially contaminated floors. The robot scans floors at a speed of one inch/second and stops, sounds an alarm, and flashes lights when contamination in a certain area is detected. The contamination of interest here is primarily alpha and beta-gamma. The robot, a Cybermotion K2A base, is radio controlled, uses dead reckoning to determine vehicle position, and docks with a charging station to replenish its batteries and calibrate its position. It uses an ultrasonic ranging system for collision avoidance. In addition, two safety bumpers located at the front and the back of the robot will stop the robot's motion when they are depressed. Paths for the robot are preprogrammed, and the robot's motion can be monitored on a remote screen which shows a graphical map of the environment. The radiation instrument being used is an Eberline RM22A monitor. This monitor is microcomputer based with a serial I/O interface for remote operation. Up to 30 detectors may be configured with the RM22A.

  8. A mobile autonomous robot for radiological surveys

    SciTech Connect

    Dudar, A.M.; Wagner, D.G.; Teese, G.D.

    1992-10-01

    The Robotics Development Group at the Savannah River Site is developing an autonomous robot (SIMON) to perform radiological surveys of potentially contaminated floors. The robot scans floors at a speed of one inch/second and stops, sounds an alarm, and flashes lights when contamination in a certain area is detected. The contamination of interest here is primarily alpha and beta-gamma. The robot, a Cybermotion K2A base, is radio controlled, uses dead reckoning to determine vehicle position, and docks with a charging station to replenish its batteries and calibrate its position. It uses an ultrasonic ranging system for collision avoidance. In addition, two safety bumpers located at the front and the back of the robot will stop the robot's motion when they are depressed. Paths for the robot are preprogrammed, and the robot's motion can be monitored on a remote screen which shows a graphical map of the environment. The radiation instrument being used is an Eberline RM22A monitor. This monitor is microcomputer based with a serial I/O interface for remote operation. Up to 30 detectors may be configured with the RM22A.

  9. Autonomous mobile robot for radiologic surveys

    DOEpatents

    Dudar, Aed M.; Wagner, David G.; Teese, Gregory D.

    1994-01-01

    An apparatus for conducting radiologic surveys. The apparatus comprises in the main a robot capable of following a preprogrammed path through an area, a radiation monitor adapted to receive input from a radiation detector assembly, ultrasonic transducers for navigation and collision avoidance, and an on-board computer system including an integrator for interfacing the radiation monitor and the robot. Front and rear bumpers are attached to the robot by bumper mounts. The robot may be equipped with memory boards for the collection and storage of radiation survey information. The on-board computer system is connected to a remote host computer via a UHF radio link. The apparatus is powered by a rechargeable 24-volt DC battery, and is stored at a docking station when not in use and/or for recharging. A remote host computer contains a stored database defining paths between points in the area where the robot is to operate, including but not limited to the locations of walls, doors, stationary furniture and equipment, and sonic markers if used. When a program consisting of a series of paths is downloaded to the on-board computer system, the robot conducts a floor survey autonomously at any preselected rate. When the radiation monitor detects contamination, the robot resurveys the area at reduced speed and resumes its preprogrammed path if the contamination is not confirmed. If the contamination is confirmed, the robot stops and sounds an alarm.

  10. Flocking algorithm for autonomous flying robots.

    PubMed

    Virágh, Csaba; Vásárhelyi, Gábor; Tarcai, Norbert; Szörényi, Tamás; Somorjai, Gergő; Nepusz, Tamás; Vicsek, Tamás

    2014-06-01

    Animal swarms displaying a variety of typical flocking patterns would not exist without the underlying safe, optimal and stable dynamics of the individuals. The emergence of these universal patterns can be efficiently reconstructed with agent-based models. If we want to reproduce these patterns with artificial systems, such as autonomous aerial robots, agent-based models can also be used in their control algorithms. However, finding the proper algorithms and thus understanding the essential characteristics of the emergent collective behaviour requires thorough and realistic modeling of the robot and also the environment. In this paper, we first present an abstract mathematical model of an autonomous flying robot. The model takes into account several realistic features, such as time delay and locality of communication, inaccuracy of the on-board sensors and inertial effects. We present two decentralized control algorithms. One is based on a simple self-propelled flocking model of animal collective motion, the other is a collective target tracking algorithm. Both algorithms contain a viscous friction-like term, which aligns the velocities of neighbouring agents parallel to each other. We show that this term can be essential for reducing the inherent instabilities of such a noisy and delayed realistic system. We discuss simulation results on the stability of the control algorithms, and perform real experiments to show the applicability of the algorithms on a group of autonomous quadcopters. In our case, bio-inspiration works in two ways. On the one hand, the whole idea of trying to build and control a swarm of robots comes from the observation that birds tend to flock to optimize their behaviour as a group. On the other hand, by using a realistic simulation framework and studying the group behaviour of autonomous robots we can learn about the major factors influencing the flight of bird flocks.
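
    The two ingredients named above, short-range repulsion and a viscous friction-like velocity-alignment term, can be sketched for point agents in the plane as below; the gains, ranges, and Euler time step are illustrative assumptions, not the parameters used on the quadcopters.

    ```python
    # Illustrative flocking update: alignment within communication range,
    # repulsion at short range, simple Euler integration.
    import numpy as np

    def flocking_step(pos, vel, dt=0.1, r_comm=5.0, r_rep=1.0, c_align=0.5, c_rep=2.0):
        """One Euler update of positions/velocities for N agents (arrays of shape (N, 2))."""
        n = len(pos)
        acc = np.zeros_like(vel)
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                d = pos[j] - pos[i]
                dist = np.linalg.norm(d)
                if dist < r_comm:
                    acc[i] += c_align * (vel[j] - vel[i])          # align with neighbour
                if 1e-9 < dist < r_rep:
                    acc[i] -= c_rep * (r_rep - dist) * d / dist    # push away if too close
        vel = vel + dt * acc
        pos = pos + dt * vel
        return pos, vel

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        pos = rng.uniform(0, 3, (5, 2))
        vel = rng.uniform(-1, 1, (5, 2))
        for _ in range(50):
            pos, vel = flocking_step(pos, vel)
        print("velocity spread after 50 steps:", np.round(vel.std(axis=0), 3))
    ```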

  11. SIR-1: An autonomous mobile sentry robot

    SciTech Connect

    Harrington, J.J.; Klarer, P.R.

    1987-01-01

    This paper describes a prototype mobile robot system configured to function as part of an overall security system at a high security facility. The features of this robot system include specialized software and sensors for navigation without the need for external locator beacons or sign posts, sensors for remote imaging and intruder detection, and data link facilities to communicate information either directly to an electronic security system or to a manned central control center. Other features of the robot system include low weight, compact size, and low power consumption. The robot system can operate either by remote manual control, or it can operate autonomously where the need for direct human control is limited to the global command level. The robot can act as a mobile remote sensing platform for visual alarm assessment or roving patrol, or as an exploratory device in situations potentially hazardous to humans. This robot system may also be used to walk-test intrusion detection sensors as part of a routine test and maintenance program for an interior intrusion detection system (IDS), and to provide a programmable, temporary sensor capability to backup an IDS sensor that has failed. This capability may also be used to provide improved sensor coverage of an area that will be secured on a temporary or short term basis, thereby eliminating the need for a permanent sensor installation. The hardware, software, and operation of this robot system are briefly described.

  12. Embodied cognition for autonomous interactive robots.

    PubMed

    Hoffman, Guy

    2012-10-01

    In the past, notions of embodiment have been applied to robotics mainly in the realm of very simple robots, and supporting low-level mechanisms such as dynamics and navigation. In contrast, most human-like, interactive, and socially adept robotic systems turn away from embodiment and use amodal, symbolic, and modular approaches to cognition and interaction. At the same time, recent research in Embodied Cognition (EC) is spanning an increasing number of complex cognitive processes, including language, nonverbal communication, learning, and social behavior. This article suggests adopting a modern EC approach for autonomous robots interacting with humans. In particular, we present three core principles from EC that may be applicable to such robots: (a) modal perceptual representation, (b) action/perception and action/cognition integration, and (c) a simulation-based model of top-down perceptual biasing. We describe a computational framework based on these principles, and its implementation on two physical robots. This could provide a new paradigm for embodied human-robot interaction based on recent psychological and neurological findings.

  13. SIR-1: An autonomous mobile sentry robot

    SciTech Connect

    Harrington, J.J.; Klarer, P.R.

    1987-05-01

    This paper describes a prototype mobile robot system configured to function as part of an overall security system at a high security facility. The features of this robot system include specialized software and sensors for navigation without the need for external locator beacons or sign posts, sensors for remote imaging and intruder detection, and data link facilities to communicate information either directly to an electronic security system or to a manned central control center. Other features of the robot system include low weight, compact size, and low power consumption. The robot system can operate either by remote manual control, or it can operate autonomously where the need for direct human control is limited to the global command level. The robot can act as a mobile remote sensing platform for visual alarm assessment or roving patrol, or as an exploratory device in situations potentially hazardous to humans. This robot system may also be used to walk-test intrusion detection sensors as part of a routine test and maintenance program for an interior intrusion detection system (IDS), and to provide a programmable, temporary sensor capability to backup an IDS sensor that has failed. This capability may also be used to provide improved sensor coverage of an area that will be secured on a temporary or short term basis, thereby eliminating the need for a permanent sensor installation. The hardware, software, and operation of this robot system will be briefly described herein.

  14. Design of an autonomous exterior security robot

    NASA Technical Reports Server (NTRS)

    Myers, Scott D.

    1994-01-01

    This paper discusses the requirements and preliminary design of a robotic vehicle designed to perform autonomous exterior perimeter security patrols around warehouse areas, ammunition supply depots, and industrial parks for the U.S. Department of Defense. The preliminary design allows for the operation of up to eight vehicles in a six kilometer by six kilometer zone with autonomous navigation and obstacle avoidance. In addition to detecting crawling intruders at 100 meters, the system must perform real-time inventory checking and database comparisons using a microwave tag system.

  15. Autonomous mobile robot research using the HERMIES-III robot

    SciTech Connect

    Pin, F.G.; Beckerman, M.; Spelt, P.F.; Robinson, J.T.; Weisbin, C.R.

    1989-01-01

    This paper reports on the status and future directions in the research, development and experimental validation of intelligent control techniques for autonomous mobile robots using the HERMIES-III robot at the Center for Engineering Systems Advanced Research (CESAR) at Oak Ridge National Laboratory (ORNL). HERMIES-III is the fourth robot in a series of increasingly more sophisticated and capable experimental test beds developed at CESAR. HERMIES-III is comprised of a battery-powered, omni-directional wheeled platform with a seven-degree-of-freedom manipulator arm, video cameras, sonar range sensors, a laser imaging scanner and a dual computer system containing up to 128 NCUBE nodes in a hypercube configuration. All electronics, sensors, computers, and communication equipment required for autonomous operation of HERMIES-III are located on board, along with sufficient battery power for three to four hours of operation. The paper first provides a more detailed description of the HERMIES-III characteristics, focusing on the new areas of research and demonstration now possible at CESAR with this new test bed. The initial experimental program is then described with emphasis placed on autonomous performance of human-scale tasks (e.g., valve manipulation, use of tools), integration of a dexterous manipulator and platform motion in geometrically complex environments, and effective use of multiple cooperating robots (HERMIES-IIB and HERMIES-III). The paper concludes with a discussion of the integration problems and safety considerations that necessarily arise when setting up an experimental program involving human-scale tasks performed by multiple autonomous mobile robots. 10 refs., 3 figs.

  16. An architecture for an autonomous learning robot

    NASA Technical Reports Server (NTRS)

    Tillotson, Brian

    1988-01-01

    An autonomous learning device must solve the example bounding problem, i.e., it must divide the continuous universe into discrete examples from which to learn. We describe an architecture which incorporates an example bounder for learning. The architecture is implemented in the GPAL program. An example run with a real mobile robot shows that the program learns and uses new causal, qualitative, and quantitative relationships.

  17. Autonomous environment modeling by a mobile robot

    NASA Astrophysics Data System (ADS)

    Moutarlier, Philippe

    1991-02-01

    Internal geometric representation of the environment is considered. The autonomy of a mobile robot partly relies on its ability to build a reliable representation of its environment. On the other hand, an autonomous environment-building process requires that the model be adapted to planning motions and perception actions. Therefore, the modeling process must be a reversible interface between perception and motion devices and the model itself. Several kinds of models are necessary in order to achieve an autonomous process. Sensors give stochastic information on surfaces, navigation needs a free-space representation, and perception planning requires aspect graphs. The functions of stochastic surface modeling, free-space representation, and topological graph computation are presented through the integrated geometric model builder called 'Yaka.' Since all environment data uncertainties are correlated together through the robot location inaccuracy, classical filtering methods are inadequate. A method of computing a linear variance estimator adapted to the problem is proposed. This general formalism is validated by a large number of experiments in which the robot incrementally builds a surface representation of its environment. Free space cannot be deduced directly, at each step, from the surface data provided by the sensors. Inaccuracies in object surfaces and uncertainties in the visibility of objects by the sensor, as well as the possible motion of objects, must all be taken into account to build the free space incrementally. Then, motion and perception planning for autonomous environment modeling are achieved using this free-space model together with topological location and aspect graphs.
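
    The abstract's key technical point is that robot-location error correlates all landmark estimates, so the uncertainty must be carried in one joint covariance rather than filtered per landmark. Below is a minimal illustrative sketch of that idea: an EKF-style joint update over a robot position and two landmarks. The two-landmark layout, noise values, and function names are assumptions for illustration, not the paper's estimator.

    ```python
    import numpy as np

    # Joint state: [robot_x, robot_y, l1_x, l1_y, l2_x, l2_y]; the covariance P
    # carries the cross-correlations introduced by robot-location error.
    x = np.zeros(6)
    P = np.eye(6) * 0.01

    def predict(x, P, u, Q):
        """Dead-reckoning step: only the robot part of the state moves, but the
        added motion uncertainty Q propagates into the landmark correlations."""
        F = np.eye(6)
        x = x.copy()
        x[0:2] += u                           # odometry translation increment
        G = np.zeros((6, 2)); G[0:2, 0:2] = np.eye(2)
        return x, F @ P @ F.T + G @ Q @ G.T

    def update_landmark(x, P, i, z, R):
        """Fuse a relative observation z = landmark_i - robot (plus noise R).
        The Kalman gain spreads the correction over robot AND landmarks."""
        H = np.zeros((2, 6))
        H[:, 0:2] = -np.eye(2)
        H[:, 2 + 2 * i: 4 + 2 * i] = np.eye(2)
        y = z - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        return x + K @ y, (np.eye(6) - K @ H) @ P

    x, P = predict(x, P, u=np.array([1.0, 0.0]), Q=np.eye(2) * 0.05)
    x, P = update_landmark(x, P, 0, z=np.array([2.0, 1.0]), R=np.eye(2) * 0.02)
    ```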

  18. Evolutionary neurocontrollers for autonomous mobile robots.

    PubMed

    Floreano, D; Mondada, F

    1998-10-01

    In this article we describe a methodology for evolving neurocontrollers of autonomous mobile robots without human intervention. The presentation, which spans from technological and methodological issues to several experimental results on evolution of physical mobile robots, covers both previous and recent work in the attempt to provide a unified picture within which the reader can compare the effects of systematic variations on the experimental settings. After describing some key principles for building mobile robots and tools suitable for experiments in adaptive robotics, we give an overview of different approaches to evolutionary robotics and present our methodology. We start reviewing two basic experiments showing that different environments can shape very different behaviours and neural mechanisms under very similar selection criteria. We then address the issue of incremental evolution in two different experiments from the perspective of changing environments and robot morphologies. Finally, we investigate the possibility of evolving plastic neurocontrollers and analyse an evolved neurocontroller that relies on fast and continuously changing synapses characterized by dynamic stability. We conclude by reviewing the implications of this methodology for engineering, biology, cognitive science and artificial life, and point at future directions of research.
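
    As a hedged sketch of the kind of loop the abstract describes, the following evolves weight vectors of a tiny feedforward neurocontroller by truncation selection and Gaussian mutation. The network size, selection scheme, and stand-in fitness (which on a real robot would be measured during a trial) are assumptions, not the authors' exact setup.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def controller(weights, sensors):
        """Tiny one-layer neurocontroller: 8 proximity sensors -> 2 wheel speeds."""
        W = weights.reshape(2, 8)
        return np.tanh(W @ sensors)

    def fitness(weights):
        """Stand-in fitness: reward forward motion, penalize turning and
        proximity to obstacles, evaluated on a random sensor trace."""
        total = 0.0
        for _ in range(100):
            s = rng.random(8)
            left, right = controller(weights, s)
            total += (left + right) / 2 - abs(left - right) - s.max()
        return total

    pop = [rng.normal(size=16) for _ in range(20)]
    for generation in range(30):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[:5]                                   # truncation selection
        pop = parents + [p + rng.normal(scale=0.1, size=16)    # Gaussian mutation
                         for p in parents for _ in range(3)]
    print("best fitness:", fitness(pop[0]))
    ```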

  19. Mobile autonomous robot for radiological surveys

    SciTech Connect

    Dudar, A.M.; Wagner, D.G.; Teese, G.D. )

    1992-01-01

    The robotics development group at the Savannah River Laboratory (SRL) is developing a mobile autonomous robot that performs radiological surveys of potentially contaminated floors. The robot is called SIMON, which stands for Semi-Intelligent Mobile Observing Navigator. Certain areas of SRL are classified as radiologically controlled areas (RCAs). In an RCA, radioactive materials are frequently handled by workers, and thus, the potential for contamination is ever present. Current methods used for floor radiological surveying includes labor-intensive manual scanning or random smearing of certain floor locations. An autonomous robot such as SIMON performs the surveying task in a much more efficient manner and will track down contamination before it is contacted by humans. SIMON scans floors at a speed of 1 in./s and stops and alarms upon encountering contamination. Its environment is well defined, consisting of smooth building floors with wide corridors. The kind of contaminations that SIMON is capable of detecting are alpha and beta-gamma. The contamination levels of interest are low to moderate.

  20. A Proposal of Autonomous Robotic Systems Educative Environment

    NASA Astrophysics Data System (ADS)

    Ierache, Jorge; Garcia-Martinez, Ramón; de Giusti, Armando

    This work presents our experiences in the implementation of a laboratory of autonomous robotic systems applied to the training of beginner and advanced students doing a degree course in Computer Engineering., taking into account the specific technologies, robots, autonomous toys, and programming languages. They provide a strategic opportunity for human resources formation by involving different aspects which range from the specification elaboration, modeling, software development and implementation and testing of an autonomous robotic system.

  1. An autonomous vision-based mobile robot

    NASA Astrophysics Data System (ADS)

    Baumgartner, Eric Thomas

    This dissertation describes estimation and control methods for use in the development of an autonomous mobile robot for structured environments. The navigation of the mobile robot is based on precise estimates of the position and orientation of the robot within its environment. The extended Kalman filter algorithm is used to combine information from the robot's drive wheels with periodic observations of small, wall-mounted visual cues to produce the precise position and orientation estimates. The visual cues are reliably detected by at least one video camera mounted on the mobile robot. Typical position estimates are accurate to within one inch. A path-tracking algorithm is also developed to follow desired reference paths which are taught by a human operator. Because of the time-independence of the tracking algorithm, the speed at which the vehicle travels along the reference path is specified independently of the tracking algorithm. The estimation and control methods have been applied successfully to two experimental vehicle systems. Finally, an analysis of the linearized closed-loop control system is performed to study the behavior and the stability of the system as a function of various control parameters.
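
    A compact sketch of the fusion scheme named in the abstract follows: an extended Kalman filter that predicts the pose from wheel odometry and corrects it with sightings of wall-mounted cues at known positions. The measurement model (cue offset already expressed in world coordinates) and all noise values are simplifying assumptions, not the dissertation's formulation.

    ```python
    import numpy as np

    def ekf_predict(x, P, d, dtheta, Q):
        """Propagate the pose [x, y, theta] with wheel-odometry increments:
        d = distance travelled, dtheta = heading change."""
        theta = x[2]
        x_new = x + np.array([d * np.cos(theta), d * np.sin(theta), dtheta])
        F = np.array([[1.0, 0.0, -d * np.sin(theta)],
                      [0.0, 1.0,  d * np.cos(theta)],
                      [0.0, 0.0,  1.0]])
        return x_new, F @ P @ F.T + Q

    def ekf_update_cue(x, P, z, cue_xy, R):
        """Correct the pose using a sighting of a cue with known position
        cue_xy; z is the measured robot-to-cue offset, here already expressed
        in world coordinates for simplicity."""
        H = np.array([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0]])
        y = (cue_xy - z) - H @ x        # robot position implied by the sighting
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        return x + K @ y, (np.eye(3) - K @ H) @ P

    x, P = np.zeros(3), np.eye(3) * 0.01
    x, P = ekf_predict(x, P, d=0.5, dtheta=0.1, Q=np.diag([1e-3, 1e-3, 1e-4]))
    x, P = ekf_update_cue(x, P, z=np.array([1.4, 0.9]),
                          cue_xy=np.array([2.0, 1.0]), R=np.eye(2) * 1e-3)
    ```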

  2. Artificial consciousness, artificial emotions, and autonomous robots.

    PubMed

    Cardon, Alain

    2006-12-01

    Nowadays for robots, the notion of behavior is reduced to a simple factual concept at the level of the movements. On another hand, consciousness is a very cultural concept, founding the main property of human beings, according to themselves. We propose to develop a computable transposition of the consciousness concepts into artificial brains, able to express emotions and consciousness facts. The production of such artificial brains allows the intentional and really adaptive behavior for the autonomous robots. Such a system managing the robot's behavior will be made of two parts: the first one computes and generates, in a constructivist manner, a representation for the robot moving in its environment, and using symbols and concepts. The other part achieves the representation of the previous one using morphologies in a dynamic geometrical way. The robot's body will be seen for itself as the morphologic apprehension of its material substrata. The model goes strictly by the notion of massive multi-agent's organizations with a morphologic control.

  3. PRIMUS: autonomous driving robot for military applications

    NASA Astrophysics Data System (ADS)

    Schwartz, Ingo

    2000-07-01

    This article describes the government experimental program PRIMUS (PRogram of Intelligent Mobile Unmanned Systems) and the results achieved in phase C, demonstrated in summer 1999 on a military proving ground. The program aims to demonstrate autonomous driving of an unmanned robot in open terrain, reaching the highest degree of autonomy possible with today's technology in order to obtain a platform for different missions. The goal is to relieve the soldier of highly dangerous tasks, to increase performance, and to reduce personnel and costs with unmanned systems. In phase C of the program, two small tracked vehicles (Digitized Wiesel 2, air-transportable by CH-53) are used: one as the robot vehicle, the other as a command and control system. The Wiesel 2 is configured as a drive-by-wire system and is therefore well suited for the adaptation of control computers. The autonomous detection and avoidance of obstacles in unknown, non-cooperative environments is the main task. For navigation and orientation a sensor package is integrated. To detect obstacles, the scene in the driving corridor of the robot is scanned four times per second by a 3D range-image camera (LADAR). The measured 3D range image is converted into a 2D obstacle map and used as input for the calculation of an obstacle-free path. The combination of local navigation (obstacle avoidance) and global navigation leads to collision-free driving in open terrain to a predefined goal point at velocities of up to 25 km/h. A contour tracker with a TV camera as sensor is also implemented, which allows the robot to follow contours (e.g., the edge of a meadow) or to drive on paved or unpaved roads at velocities of up to 50 km/h. In addition to these autonomous driving modes, the operator in the command and control station can drive the robot by remote control. All functions were successfully demonstrated in summer 1999 on a military proving ground. During a mission example the robot vehicle covered a distance of several
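
    The conversion of a 3D range image into a 2D obstacle map, as described above, can be illustrated with the following sketch: LADAR points are binned into ground-plane cells, and a cell whose height spread exceeds a threshold is marked as an obstacle. The cell size, grid extent, and height threshold are illustrative values, not PRIMUS parameters.

    ```python
    import numpy as np

    def obstacle_map(points, cell=0.5, size=40, h_max=0.4):
        """Collapse a 3D range image (N x 3 points in vehicle coordinates,
        z up) into a 2D obstacle grid: a cell is an obstacle if the spread of
        heights falling into it exceeds h_max."""
        grid_lo = np.full((size, size), np.inf)
        grid_hi = np.full((size, size), -np.inf)
        for x, y, z in points:
            i, j = int(x / cell), int(y / cell + size / 2)
            if 0 <= i < size and 0 <= j < size:
                grid_lo[i, j] = min(grid_lo[i, j], z)
                grid_hi[i, j] = max(grid_hi[i, j], z)
        return (grid_hi - grid_lo) > h_max      # True = obstacle cell
    ```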

  4. Robotic technologies for outdoor industrial vehicles

    NASA Astrophysics Data System (ADS)

    Stentz, Anthony

    2001-09-01

    The commercial industries of agriculture, mining, construction, and material handling employ a wide variety of mobile machines, including tractors, combines, Load-Haul-Dump vehicles, trucks, paving machines, fork trucks, and many more. Automation of these vehicles promises to improve productivity, reduce operational costs, and increase safety. Since the vehicles typically operate in difficult environments, under all weather conditions, and in the presence of people and other obstacles, reliable automation faces severe technical challenges. Furthermore, the viable technology solutions are constrained by cost considerations. Fortunately, due to the limited application domain, repetitive nature, and the utility of partial automation for most tasks, robotics technologies can have a profound impact on industrial vehicles. In this paper, we describe a technical approach developed at Carnegie Mellon University for automating mobile machines in several applications, including mass excavation, mining, and agriculture. The approach is introduced via case studies, and the results are presented.

  5. Autonomous Navigation by a Mobile Robot

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terrance; Aghazarian, Hrand

    2005-01-01

    ROAMAN is a computer program for autonomous navigation of a mobile robot on a long (as much as hundreds of meters) traversal of terrain. Developed for use aboard a robotic vehicle (rover) exploring the surface of a remote planet, ROAMAN could also be adapted to similar use on terrestrial mobile robots. ROAMAN implements a combination of algorithms for (1) long-range path planning based on images acquired by mast-mounted, wide-baseline stereoscopic cameras, and (2) local path planning based on images acquired by body-mounted, narrow-baseline stereoscopic cameras. The long-range path-planning algorithm autonomously generates a series of waypoints that are passed to the local path-planning algorithm, which plans obstacle-avoiding legs between the waypoints. Both the long- and short-range algorithms use an occupancy-grid representation in computations to detect obstacles and plan paths. Maps that are maintained by the long- and short-range portions of the software are not shared because substantial localization errors can accumulate during any long traverse. ROAMAN is not guaranteed to generate an optimal shortest path, but does maintain the safety of the rover.
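
    The abstract names an occupancy-grid representation with obstacle-avoiding legs planned between waypoints. The sketch below plans one such leg with breadth-first search over a boolean grid; it is a generic stand-in for ROAMAN's actual planner, and the grid encoding is an assumption.

    ```python
    from collections import deque

    def plan_leg(grid, start, goal):
        """Breadth-first path on an occupancy grid (True = blocked) between two
        waypoints, returning a list of (row, col) cells or None."""
        rows, cols = len(grid), len(grid[0])
        prev = {start: None}
        q = deque([start])
        while q:
            cur = q.popleft()
            if cur == goal:
                path = []
                while cur is not None:
                    path.append(cur)
                    cur = prev[cur]
                return path[::-1]
            r, c = cur
            for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if (0 <= nr < rows and 0 <= nc < cols
                        and not grid[nr][nc] and (nr, nc) not in prev):
                    prev[(nr, nc)] = cur
                    q.append((nr, nc))
        return None    # no obstacle-free leg between these waypoints

    # Chaining legs between successive long-range waypoints (illustrative):
    # full_path = sum((plan_leg(grid, a, b) for a, b in zip(wps, wps[1:])), [])
    ```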

  6. Development of autonomous eating mechanism for biomimetic robots

    NASA Astrophysics Data System (ADS)

    Jeong, Kil-Woong; Cho, Ik-Jin; Lee, Yun-Jung

    2005-12-01

    Most recently developed robots are human-friendly robots that imitate animals or humans, such as entertainment robots, biomimetic robots and humanoid robots. Interest in these robots is increasing because of the social focus on health, welfare, and the aging of society. Autonomous eating is one of the most distinctive and inherent behaviors of pets and animals. Most entertainment robots and pet robots make use of an internal battery and are unable to operate while the battery is charging. Therefore, if a robot has an autonomous function for eating batteries as its feed, the robot is not only able to operate while recharging energy but also becomes more human friendly, like a pet. Here, a new autonomous eating mechanism is introduced for a biomimetic robot called ELIRO-II (Eating LIzard RObot version 2). The ELIRO-II is able to find its food (a small battery), eat and evacuate by itself. This work describes the sub-parts of the developed mechanism, such as the head part, mouth part, and stomach part. In addition, the control system of the autonomous eating mechanism is described.

  7. Task-level control for autonomous robots

    NASA Technical Reports Server (NTRS)

    Simmons, Reid

    1994-01-01

    Task-level control refers to the integration and coordination of planning, perception, and real-time control to achieve given high-level goals. Autonomous mobile robots need task-level control to effectively achieve complex tasks in uncertain, dynamic environments. This paper describes the Task Control Architecture (TCA), an implemented system that provides commonly needed constructs for task-level control. Facilities provided by TCA include distributed communication, task decomposition and sequencing, resource management, monitoring and exception handling. TCA supports a design methodology in which robot systems are developed incrementally, starting first with deliberative plans that work in nominal situations, and then layering them with reactive behaviors that monitor plan execution and handle exceptions. To further support this approach, design and analysis tools are under development to provide ways of graphically viewing the system and validating its behavior.

  8. Towards Robot Scientists for autonomous scientific discovery

    PubMed Central

    2010-01-01

    We review the main components of autonomous scientific discovery, and how they lead to the concept of a Robot Scientist. This is a system which uses techniques from artificial intelligence to automate all aspects of the scientific discovery process: it generates hypotheses from a computer model of the domain, designs experiments to test these hypotheses, runs the physical experiments using robotic systems, analyses and interprets the resulting data, and repeats the cycle. We describe our two prototype Robot Scientists: Adam and Eve. Adam has recently proven the potential of such systems by identifying twelve genes responsible for catalysing specific reactions in the metabolic pathways of the yeast Saccharomyces cerevisiae. This work has been formally recorded in great detail using logic. We argue that the reporting of science needs to become fully formalised and that Robot Scientists can help achieve this. This will make scientific information more reproducible and reusable, and promote the integration of computers in scientific reasoning. We believe the greater automation of both the physical and intellectual aspects of scientific investigations to be essential to the future of science. Greater automation improves the accuracy and reliability of experiments, increases the pace of discovery and, in common with conventional laboratory automation, removes tedious and repetitive tasks from the human scientist. PMID:20119518

  9. Towards Robot Scientists for autonomous scientific discovery.

    PubMed

    Sparkes, Andrew; Aubrey, Wayne; Byrne, Emma; Clare, Amanda; Khan, Muhammed N; Liakata, Maria; Markham, Magdalena; Rowland, Jem; Soldatova, Larisa N; Whelan, Kenneth E; Young, Michael; King, Ross D

    2010-01-04

    We review the main components of autonomous scientific discovery, and how they lead to the concept of a Robot Scientist. This is a system which uses techniques from artificial intelligence to automate all aspects of the scientific discovery process: it generates hypotheses from a computer model of the domain, designs experiments to test these hypotheses, runs the physical experiments using robotic systems, analyses and interprets the resulting data, and repeats the cycle. We describe our two prototype Robot Scientists: Adam and Eve. Adam has recently proven the potential of such systems by identifying twelve genes responsible for catalysing specific reactions in the metabolic pathways of the yeast Saccharomyces cerevisiae. This work has been formally recorded in great detail using logic. We argue that the reporting of science needs to become fully formalised and that Robot Scientists can help achieve this. This will make scientific information more reproducible and reusable, and promote the integration of computers in scientific reasoning. We believe the greater automation of both the physical and intellectual aspects of scientific investigations to be essential to the future of science. Greater automation improves the accuracy and reliability of experiments, increases the pace of discovery and, in common with conventional laboratory automation, removes tedious and repetitive tasks from the human scientist.

  10. Autonomous Dome for a Robotic Telescope

    NASA Astrophysics Data System (ADS)

    Kumar, A.; Sengupta, A.; Ganesh, S.

    2016-12-01

    The Physical Research Laboratory operates a 50 cm robotic observatory at Mount Abu (Rajasthan, India). This Automated Telescope for Variability Studies (ATVS) makes use of the Remote Telescope System 2 (RTS2) for autonomous operations. The observatory uses a 3.5 m dome from Sirius Observatories. We have developed electronics using Arduino circuit boards with home-grown logic and software to control the dome operations. We are in the process of completing the drivers to link our Arduino-based dome controller with RTS2. This document is a short description of the various phases of the development and their integration to achieve the required objective.

  11. Autonomous biomorphic robots as platforms for sensors

    SciTech Connect

    Tilden, M.; Hasslacher, B.; Mainieri, R.; Moses, J.

    1996-10-01

    The idea of building autonomous robots that can carry out complex and nonrepetitive tasks is an old one, so far unrealized in any meaningful hardware. Tilden has shown recently that there are simple, processor-free solutions to building autonomous mobile machines that continuously adapt to unknown and hostile environments, are designed primarily to survive, and are extremely resistant to damage. These devices use smart mechanics and simple (low component count) electronic neuron control structures having the functionality of biological organisms from simple invertebrates to sophisticated members of the insect and crab family. These devices are paradigms for the development of autonomous machines that can carry out directed goals. The machine then becomes a robust survivalist platform that can carry sensors or instruments. These autonomous roving machines, now in an early stage of development (several proof-of-concept prototype walkers have been built), can be developed so that they are inexpensive, robust, and versatile carriers for a variety of instrument packages. Applications are immediate and many, in areas as diverse as prosthetics, medicine, space, construction, nanoscience, defense, remote sensing, environmental cleanup, and biotechnology.

  12. Modular control systems for teleoperated and autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Kadonoff, Mark B.; Parish, David W.

    1995-01-01

    This paper will discuss components of a modular hardware and software architecture for mobile robots that supports both teleoperation and autonomous control. The Modular Autonomous Robot System architecture enables rapid development of control systems for unmanned vehicles for a wide variety of commercial and military applications.

  13. Radio Frequency Mapping using an Autonomous Robot: Application to the 2.4 GHz Band

    NASA Astrophysics Data System (ADS)

    Lebreton, J. M.; Murad, N. M.; Lorion, R.

    2016-03-01

    Radio signal strength measurement systems are essential for building a radio frequency (RF) map of indoor and outdoor environments for different application scenarios. This paper presents an autonomous robot that constructs a radio signal map by collecting information about all access point devices, together with data about the robot itself, and forwarding it to the base station. A real scenario is considered by measuring the RF field of our department network. The consistency of the RF signal map is shown by fitting the measurements to a radio signal strength model over a two-dimensional area, and a path-loss exponent of 2.3 is estimated for the open corridor environment.
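
    The reported path-loss exponent can be obtained by fitting the standard log-distance model RSSI(d) = A - 10*n*log10(d/d0) to the collected samples. The sketch below does this with ordinary least squares; the model form, reference distance, and synthetic data are assumptions for illustration, not the paper's exact procedure.

    ```python
    import numpy as np

    def fit_path_loss(distances, rssi, d0=1.0):
        """Fit RSSI(d) = A - 10*n*log10(d/d0) and return (A, n)."""
        x = -10.0 * np.log10(np.asarray(distances) / d0)
        X = np.column_stack([np.ones_like(x), x])   # columns: [1, -10*log10(d/d0)]
        (A, n), *_ = np.linalg.lstsq(X, np.asarray(rssi), rcond=None)
        return A, n

    # Synthetic measurements generated around n = 2.3 as a sanity check:
    d = np.array([1, 2, 4, 8, 16], dtype=float)
    rssi = -40 - 10 * 2.3 * np.log10(d) + np.random.normal(0, 1, d.size)
    print(fit_path_loss(d, rssi))
    ```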

  14. Development of Outdoor Service Robot to Collect Trash on Streets

    NASA Astrophysics Data System (ADS)

    Obata, Masayuki; Nishida, Takeshi; Miyagawa, Hidekazu; Kondo, Takashi; Ohkawa, Fujio

    The outdoor service robot which we call OSR-01 is developed intending for cleaning up urban areas by means of collecting discarded trash such as PET bottles, cans, plastic bags and so on. We, in this paper, describe the architecture of OSR-01 consisting of hardwares such as sensors, a manipulator, driving wheels, etc. for searching for and picking up trash, and softwares such as fast pattern matching for identifying various trash and distance measurement for picking up via the manipulator. After describing the vision system in detail, which is one of the most critical parts of the trash collection task, we show the result of an open experiment in which OSR-01 collects PET bottles on a real shopping street in the special zone for robot research and development in Kitakyushu-city.

  15. Autonomous robotic navigation and pipe inspection: A simulation approach

    SciTech Connect

    Ioannou, D.; Wang, S.; Tulenko, J.S.

    1994-12-31

    An important task for an autonomously functioning robot in the nuclear industry is pipe inspection in a nuclear power plant. A typical scenario for such a robot: The robot enters a highly radioactive area to perform several inspection and cleanup tasks. Because the robot is functioning in a radioactive environment, it must perform these tasks in a limited time. As much information as possible should be extracted in the shortest time (i.e., with the fewest number of snapshots). At the University of Florida's Mobile Robotics for Hazardous Environments Laboratory, a project is under way to build an autonomous robot that will function in a go-stop-process-go manner in a nuclear environment. The system follows the spirit of Thayer et al., but the difference is that it functions autonomously. This paper discusses a simulation of this system.

  16. Object guided autonomous exploration for mobile robots in indoor environments

    NASA Astrophysics Data System (ADS)

    Nieto-Granda, Carlos; Choudhary, Siddarth; Rogers, John G.; Twigg, Jeff; Murali, Varun; Christensen, Henrik I.

    2014-06-01

    Autonomous mobile robotic teams are increasingly used in exploration of indoor environments. Accurate modeling of the world around the robot and describing the interaction of the robot with the world greatly increases the ability of the robot to act autonomously. This paper demonstrates the ability of autonomous robotic teams to find objects of interest. A novel feature of our approach is the object discovery and the use of it to augment the mapping and navigation process. The generated map can then be decomposed into semantic regions while also considering the distance and line of sight to anchor points. The advantage of this approach is that the robot can return a dense map of the region around an object of interest. The robustness of this approach is demonstrated in indoor environments with multiple platforms with the objective of discovering objects of interest.

  17. Quantifying Emergent Behavior of Autonomous Robots

    NASA Astrophysics Data System (ADS)

    Martius, Georg; Olbrich, Eckehard

    2015-10-01

    Quantifying behaviors of robots which were generated autonomously from task-independent objective functions is an important prerequisite for objective comparisons of algorithms and movements of animals. The temporal sequence of such a behavior can be considered as a time series, and hence complexity measures developed for time series are natural candidates for its quantification. The predictive information and the excess entropy are such complexity measures. They measure the amount of information the past contains about the future and thus quantify the nonrandom structure in the temporal sequence. However, when using these measures for systems with continuous states, one has to deal with the fact that their values will depend on the resolution with which the system's states are observed. For deterministic systems both measures will diverge with increasing resolution. We therefore propose a new decomposition of the excess entropy into resolution-dependent and resolution-independent parts and discuss how they depend on the dimensionality of the dynamics, the correlations and the noise level. For the practical estimation we propose to use estimates based on the correlation integral instead of the direct estimation of the mutual information with the nearest-neighbor algorithm of Kraskov et al. (2004), because the latter allows less control of the scale dependencies. Using our algorithm we are able to show how autonomous learning generates behavior of increasing complexity with increasing learning duration.
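
    For reference, the quantity discussed is the predictive information, the mutual information between the past and the future of the behavioral time series (for stationary processes this limit equals the excess entropy). The notation below is a standard formulation added for clarity, not copied from the paper.

    ```latex
    % Predictive information of a stationary process X_t:
    \begin{align}
      I_{\mathrm{pred}}
        &= \lim_{T \to \infty} I\left(X_{-T:0};\, X_{0:T}\right) \\
        &= \lim_{T \to \infty} \left[ H\left(X_{0:T}\right)
             - H\left(X_{0:T} \mid X_{-T:0}\right) \right].
    \end{align}
    % For continuous states observed at resolution \epsilon, each block entropy
    % contains a resolution-dependent term, which is why the paper decomposes the
    % excess entropy into resolution-dependent and resolution-independent parts.
    ```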

  18. Autonomous surgical robotics using 3-D ultrasound guidance: feasibility study.

    PubMed

    Whitman, John; Fronheiser, Matthew P; Ivancevich, Nikolas M; Smith, Stephen W

    2007-10-01

    The goal of this study was to test the feasibility of using a real-time 3D (RT3D) ultrasound scanner with a transthoracic matrix array transducer probe to guide an autonomous surgical robot. Employing a fiducial alignment mark on the transducer to orient the robot's frame of reference and using simple thresholding algorithms to segment the 3D images, we tested the accuracy of using the scanner to automatically direct a robot arm that touched two needle tips together within a water tank. RMS measurement error was 3.8% or 1.58 mm for an average path length of 41 mm. Using these same techniques, the autonomous robot also performed simulated needle biopsies of a cyst-like lesion in a tissue phantom. This feasibility study shows the potential for 3D ultrasound guidance of an autonomous surgical robot for simple interventional tasks, including lesion biopsy and foreign body removal.
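
    A minimal sketch of the kind of simple thresholding segmentation the abstract mentions follows: take the centroid of voxels above an intensity threshold in the 3D volume and convert voxel indices to millimetres. The threshold and voxel spacing are scanner-specific assumptions, not values from the study.

    ```python
    import numpy as np

    def locate_tip(volume, threshold, spacing_mm):
        """Return the (x, y, z) centroid, in mm, of all voxels brighter than
        `threshold` in a 3D ultrasound volume, or None if nothing exceeds it."""
        idx = np.argwhere(volume > threshold)
        if idx.size == 0:
            return None
        return idx.mean(axis=0) * np.asarray(spacing_mm)

    # Hypothetical use:
    # tip_xyz_mm = locate_tip(volume, threshold=180, spacing_mm=(0.5, 0.5, 0.5))
    ```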

  19. Biomimetic smart sensors for autonomous robotic behavior I: acoustic processing

    NASA Astrophysics Data System (ADS)

    Deligeorges, Socrates; Xue, Shuwan; Soloway, Aaron; Lichtenstein, Lee; Gore, Tyler; Hubbard, Allyn

    2009-05-01

    Robotics are rapidly becoming an integral tool on the battlefield and in homeland security, replacing humans in hazardous conditions. To enhance the effectiveness of robotic assets and their interaction with human operators, smart sensors are required to give more autonomous function to robotic platforms. Biologically inspired sensors are an essential part of this development of autonomous behavior and can increase both capability and performance of robotic systems. Smart, biologically inspired acoustic sensors have the potential to extend autonomous capabilities of robotic platforms to include sniper detection, vehicle tracking, personnel detection, and general acoustic monitoring. The key to enabling these capabilities is biomimetic acoustic processing using a time domain processing method based on the neural structures of the mammalian auditory system. These biologically inspired algorithms replicate the extremely adaptive processing of the auditory system yielding high sensitivity over broad dynamic range. The algorithms provide tremendous robustness in noisy and echoic spaces; properties necessary for autonomous function in real world acoustic environments. These biomimetic acoustic algorithms also provide highly accurate localization of both persistent and transient sounds over a wide frequency range, using baselines on the order of only inches. A specialized smart sensor has been developed to interface with an iRobot Packbot® platform specifically to enhance its autonomous behaviors in response to personnel and gunfire. The low power, highly parallel biomimetic processor, in conjunction with a biomimetic vestibular system (discussed in the companion paper), has shown the system's autonomous response to gunfire in complicated acoustic environments to be highly effective.

  20. An Autonomous Mobile Robot for Tsukuba Challenge: JW-Future

    NASA Astrophysics Data System (ADS)

    Fujimoto, Katsuharu; Kaji, Hirotaka; Negoro, Masanori; Yoshida, Makoto; Mizutani, Hiroyuki; Saitou, Tomoya; Nakamura, Katsu

    “Tsukuba Challenge” is the only event of its kind to require mobile robots to work autonomously and safely on public walkways. In this paper, we introduce the outline of our robot “JW-Future”, developed for this trial on the basis of an electric wheelchair. Additionally, the significance of participation in such a technical trial is discussed from the viewpoint of industry.

  1. Autonomous Realtime Threat-Hunting Robot (ARTHR)

    ScienceCinema

    INL

    2016-07-12

    Idaho National Laboratory researchers developed an intelligent plug-and-play robot payload that transforms commercial robots into effective first responders for deadly chemical, radiological and explosive threats.

  2. Autonomous Realtime Threat-Hunting Robot (ARTHR)

    SciTech Connect

    INL

    2008-05-29

    Idaho National Laboratory researchers developed an intelligent plug-and-play robot payload that transforms commercial robots into effective first responders for deadly chemical, radiological and explosive threats.

  3. Miniaturized Autonomous Extravehicular Robotic Camera (Mini AERCam)

    NASA Technical Reports Server (NTRS)

    Fredrickson, Steven E.

    2001-01-01

    The NASA Johnson Space Center (JSC) Engineering Directorate is developing the Autonomous Extravehicular Robotic Camera (AERCam), a low-volume, low-mass free-flying camera system. AERCam project team personnel recently initiated development of a miniaturized version of AERCam known as Mini AERCam. The Mini AERCam target design is a spherical "nanosatellite" free-flyer 7.5 inches in diameter and weighing 10 pounds. Mini AERCam is building on the success of the AERCam Sprint STS-87 flight experiment by adding new on-board sensing and processing capabilities while simultaneously reducing volume by 80%. Achieving enhanced capability in a smaller package depends on applying miniaturization technology across virtually all subsystems. Technology innovations being incorporated include micro electromechanical system (MEMS) gyros, "camera-on-a-chip" CMOS imagers, rechargeable xenon gas propulsion system, rechargeable lithium ion battery, custom avionics based on the PowerPC 740 microprocessor, GPS relative navigation, digital radio frequency communications and tracking, micropatch antennas, digital instrumentation, and dense mechanical packaging. The Mini AERCam free-flyer will initially be integrated into an approximate flight-like configuration for demonstration on an airbearing table. A pilot-in-the-loop and hardware-in-the-loop simulation to simulate on-orbit navigation and dynamics will complement the airbearing table demonstration. The Mini AERCam lab demonstration is intended to form the basis for future development of an AERCam flight system that provides beneficial on-orbit views unobtainable from fixed cameras, cameras on robotic manipulators, or cameras carried by EVA crewmembers.

  4. Navigation strategies for multiple autonomous mobile robots moving in formation

    NASA Technical Reports Server (NTRS)

    Wang, P. K. C.

    1991-01-01

    The problem of deriving navigation strategies for a fleet of autonomous mobile robots moving in formation is considered. Here, each robot is represented by a particle with a spherical effective spatial domain and a specified cone of visibility. The global motion of each robot in the world space is described by the equations of motion of the robot's center of mass. First, methods for formation generation are discussed. Then, simple navigation strategies for robots moving in formation are derived. A sufficient condition for the stability of a desired formation pattern for a fleet of robots each equipped with the navigation strategy based on nearest neighbor tracking is developed. The dynamic behavior of robot fleets consisting of three or more robots moving in formation in a plane is studied by means of computer simulation.
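
    As a toy illustration of nearest-neighbor tracking for formation keeping, the sketch below uses point robots and a proportional law: each follower locates its nearest neighbor and moves to hold a desired offset from it. Gains, offsets, and the point-robot model are illustrative assumptions, not the paper's derivation or stability condition.

    ```python
    import numpy as np

    def formation_step(positions, offsets, leader_vel, k=1.0, dt=0.1):
        """One integration step: robot 0 is the leader; every other robot finds
        its nearest neighbour and tracks a desired offset from it."""
        positions = positions.copy()
        positions[0] += leader_vel * dt
        for i in range(1, len(positions)):
            d = np.linalg.norm(positions - positions[i], axis=1)
            d[i] = np.inf
            j = int(np.argmin(d))                      # nearest neighbour
            error = (positions[j] + offsets[i]) - positions[i]
            positions[i] += k * error * dt             # proportional tracking
        return positions

    # Illustrative V-formation with three robots:
    pos = np.array([[0.0, 0.0], [-1.0, 1.2], [-1.0, -1.2]])
    off = np.array([[0.0, 0.0], [-1.0, 1.0], [-1.0, -1.0]])
    for _ in range(100):
        pos = formation_step(pos, off, leader_vel=np.array([0.2, 0.0]))
    print(pos)
    ```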

  5. Tele-assistance for semi-autonomous robots

    NASA Technical Reports Server (NTRS)

    Rogers, Erika; Murphy, Robin R.

    1994-01-01

    This paper describes a new approach in semi-autonomous mobile robots. In this approach the robot has sufficient computerized intelligence to function autonomously under a certain set of conditions, while the local system is a cooperative decision making unit that combines human and machine intelligence. Communication is then allowed to take place in a common mode and in a common language. A number of exception-handling scenarios that were constructed as a result of experiments with actual sensor data collected from two mobile robots were presented.

  6. Mamdani Fuzzy System for Indoor Autonomous Mobile Robot

    NASA Astrophysics Data System (ADS)

    Khan, M. K. A. Ahamed; Rashid, Razif; Elamvazuthi, I.

    2011-06-01

    Several control algorithms for autonomous mobile robot navigation have been proposed in the literature. Recently, the employment of non-analytical methods of computing such as fuzzy logic, evolutionary computation, and neural networks has demonstrated the utility and potential of these paradigms for intelligent control of mobile robot navigation. In this paper, a Mamdani fuzzy system for an autonomous mobile robot is developed. The paper begins with a discussion of the conventional controller, followed by a detailed description of the fuzzy logic controller.
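
    A self-contained sketch of Mamdani inference (fuzzify, rule firing, max aggregation, centroid defuzzification) is given below for a single input and output: obstacle distance mapped to forward speed. The membership shapes and the one-input rule base are assumptions for illustration, not the controller developed in the paper.

    ```python
    import numpy as np

    def mamdani_speed(distance_m):
        """Minimal Mamdani controller: obstacle distance (m) -> forward speed (m/s).
        Rules: IF near THEN slow, IF medium THEN cruise, IF far THEN fast."""
        # Fuzzify the input (rule strengths; min is trivial with one antecedent).
        near   = np.interp(distance_m, [0.0, 1.0], [1.0, 0.0])
        medium = np.interp(distance_m, [0.5, 1.5, 2.5], [0.0, 1.0, 0.0])
        far    = np.interp(distance_m, [2.0, 4.0], [0.0, 1.0])

        # Clip each consequent membership by its rule strength, aggregate with max.
        speed  = np.linspace(0.0, 1.0, 101)
        slow   = np.minimum(np.interp(speed, [0.0, 0.3], [1.0, 0.0]), near)
        cruise = np.minimum(np.interp(speed, [0.2, 0.5, 0.8], [0.0, 1.0, 0.0]), medium)
        fast   = np.minimum(np.interp(speed, [0.7, 1.0], [0.0, 1.0]), far)
        agg    = np.maximum.reduce([slow, cruise, fast])

        # Centroid defuzzification.
        return float((speed * agg).sum() / (agg.sum() + 1e-9))

    print(mamdani_speed(0.4), mamdani_speed(3.0))   # slow near obstacles, fast when clear
    ```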

  7. REACT - A Third Generation Language For Autonomous Robot Systems

    NASA Astrophysics Data System (ADS)

    Longley, Maxwell J.; Owens, John; Allen, Charles R.; Ratcliff, Karl

    1990-03-01

    REACT is a language under development at Newcastle for the programming of autonomous robot systems, which uses AI constructs and sensor information to respond to failures in assumptions about the real world by replanning a task. This paper describes the important features of a REACT-programmed robotic system, and the results of some initial studies made on defining an executive language using a concept called visibility sets. Several examples from the language are then applied to specific cases, e.g., a white-line follower and a railway network controller. The applicability of visibility sets to autonomous robots is evaluated.

  8. Autonomous robot calibration for hand-eye coordination

    SciTech Connect

    Bennett, D.J.; Geiger, D. ); Hollerbach, J.M. )

    1991-10-01

    Autonomous robot calibration is defined as the process of determining a robot's model by using only its internal sensors. It is shown that autonomous calibration of a manipulator and stereo camera system is possible. The proposed autonomous calibration algorithm may obtain the manipulator kinematic parameters, external kinematic camera parameters, and internal camera parameters. To do this, only joint angle readings and camera image plane data are used. A condition for the identifiability of the manipulator/camera parameters is derived. The method is a generalization of a recently developed scheme for self-calibrating a manipulator by forming it into a mobile closed-loop kinematic chain.

  9. Experimentation and concept formation by an autonomous mobile robot

    SciTech Connect

    Spelt, P.F.; deSaussure, G.; Oliver, G.; Silliman, M.

    1990-01-01

    The Center for Engineering Systems Advanced Research (CESAR) conducts basic research in the area of intelligent machines. In this paper, we describe our approach to a class of machine learning which involves autonomous concept formation using feedback from trial-and-error experimentation with the environment. Our formulation was experimentally validated on an autonomous mobile robot, which learned the task of control panel monitoring and manipulation for effective process control. Conclusions are drawn concerning the applicability of the system to a more general class of learning problems, and implications for the use of autonomous mobile robots in hostile and unknown environments are discussed. 11 refs., 7 figs.

  10. Sensory architectures for biologically inspired autonomous robotics.

    PubMed

    Higgins, C M

    2001-04-01

    Engineers have a lot to gain from studying biology. The study of biological neural systems alone provides numerous examples of computational systems that are far more complex than any man-made system and perform real-time sensory and motor tasks in a manner that humbles the most advanced artificial systems. Despite the evolutionary genesis of these systems and the vast apparent differences between species, there are common design strategies employed by biological systems that span taxa, and engineers would do well to emulate these strategies. However, biologically-inspired computational architectures, which are continuous-time and parallel in nature, do not map well onto conventional processors, which are discrete-time and serial in operation. Rather, an implementation technology that is capable of directly realizing the layered parallel structure and nonlinear elements employed by neurobiology is required for power- and space-efficient implementation. Custom neuromorphic hardware meets these criteria and yields low-power dedicated sensory systems that are small, light, and ideal for autonomous robot applications. As examples of how this technology is applied, this article describes both a low-level neuromorphic hardware emulation of an elementary visual motion detector, and a large-scale, system-level spatial motion integration system.

  11. Towards Principled Experimental Study of Autonomous Mobile Robots

    NASA Technical Reports Server (NTRS)

    Gat, Erann

    1995-01-01

    We review the current state of research in autonomous mobile robots and conclude that there is an inadequate basis for predicting the reliability and behavior of robots operating in unengineered environments. We present a new approach to the study of autonomous mobile robot performance based on formal statistical analysis of independently reproducible experiments conducted on real robots. Simulators serve as models rather than experimental surrogates. We demonstrate three new results: 1) Two commonly used performance metrics (time and distance) are not as well correlated as is often tacitly assumed. 2) The probability distributions of these performance metrics are exponential rather than normal, and 3) a modular, object-oriented simulation accurately predicts the behavior of the real robot in a statistically significant manner.

  12. ARK: Autonomous mobile robot in an industrial environment

    NASA Technical Reports Server (NTRS)

    Nickerson, S. B.; Jasiobedzki, P.; Jenkin, M.; Jepson, A.; Milios, E.; Down, B.; Service, J. R. R.; Terzopoulos, D.; Tsotsos, J.; Wilkes, D.

    1994-01-01

    This paper describes research on the ARK (Autonomous Mobile Robot in a Known Environment) project. The technical objective of the project is to build a robot that can navigate in a complex industrial environment using maps with permanent structures. The environment is not altered in any way by adding easily identifiable beacons, and the robot relies on naturally occurring objects to use as visual landmarks for navigation. The robot is equipped with various sensors that can detect unmapped obstacles, landmarks and objects. In this paper we describe the robot's industrial environment, its architecture, a novel combined range and vision sensor, and our recent results in controlling the robot in the real-time detection of objects using their color and in the processing of the robot's range and vision sensor data for navigation.

  13. Autonomous Evolution of Dynamic Gaits with Two Quadruped Robots

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.; Takamura, Seichi; Yamamoto, Takashi; Fujita, Masahiro

    2004-01-01

    A challenging task that must be accomplished for every legged robot is creating the walking and running behaviors needed for it to move. In this paper we describe our system for autonomously evolving dynamic gaits on two of Sony's quadruped robots. Our evolutionary algorithm runs on board the robot and uses the robot's sensors to compute the quality of a gait without assistance from the experimenter. First we show the evolution of a pace and trot gait on the OPEN-R prototype robot. With the fastest gait, the robot moves at over 10 m/min, which is more than forty body-lengths/min. While these first gaits are somewhat sensitive to the robot and environment in which they are evolved, we then show the evolution of robust dynamic gaits, one of which is used on the ERS-110, the first consumer version of AIBO.

  14. A Vision-Based Trajectory Controller for Autonomous Cleaning Robots

    NASA Astrophysics Data System (ADS)

    Gerstmayr, Lorenz; Röben, Frank; Krzykawski, Martin; Kreft, Sven; Venjakob, Daniel; Möller, Ralf

    Autonomous cleaning robots should completely cover the accessible area with minimal repeated coverage. We present a mostly vision-based navigation strategy for systematic exploration of an area with meandering lanes. The results of the robot experiments show that our approach can guide the robot along parallel lanes while achieving good coverage with only a small proportion of repeated coverage. The proposed method can be used as a building block for more elaborate navigation strategies which allow the robot to systematically clean rooms with a complex workspace shape.

  15. Taking on the tall poles of autonomous robot navigation

    NASA Astrophysics Data System (ADS)

    Rosenblum, Mark; Rajagopalan, Venkat; Steinbis, John; Haddon, John; Cannon, Paul

    2011-05-01

    The Holy Grail of autonomous ground robotics has been to make ground vehicles that behave like humans. Over the years, as a community, we have realized the difficulty of this task, and we have backpedaled from the initial Holy Grail and have constrained and narrowed the domains of operation in order to get robotic systems fielded. This has led to phrases such as "operation in structured environments" and "open-and-rolling terrain" in the context of autonomous robot navigation. Unfortunately, constraining the problem in this way has only put off the inevitable, i.e., solving the myriad of difficult robotics problems that we identified as long ago as the 1980s on the Autonomous Land Vehicle Project and that in most cases we are still facing today. These "Tall Poles" have included, but are not limited to, navigation through complex terrain geometry, navigation through thick vegetation, the detection of geometry-less obstacles such as negative obstacles and thin obstacles, the ability to deal with diverse and dynamic environmental conditions, the ability to function in dynamic and cluttered environments alongside other humans, and any combination of the above. This paper is an overview of the progress we have made at Autonomous Systems over the last three years in trying to knock down some of the tall poles remaining in the field of autonomous ground robotics.

  16. Autonomous Realtime Threat-Hunting Robot (ARTHR)

    ScienceCinema

    Idaho National Laboratory - David Bruemmer, Curtis Nielsen

    2016-07-12

    Idaho National Laboratory researchers developed an intelligent plug-and-play robot payload that transforms commercial robots into effective first responders for deadly chemical, radiological and explosive threats. To learn more, visit

  17. Autonomous Realtime Threat-Hunting Robot (ARTHR)

    SciTech Connect

    Idaho National Laboratory - David Bruemmer, Curtis Nielsen

    2008-05-29

    Idaho National Laboratory researchers developed an intelligent plug-and-play robot payload that transforms commercial robots into effective first responders for deadly chemical, radiological and explosive threats. To learn more, visit

  18. GPS and odometer data fusion for outdoor robots continuous positioning

    NASA Astrophysics Data System (ADS)

    Pozo-Ruz, Ana; Garcia-Perez, Lia; Garcia-Alegre, Maria C.; Guinea, Domingo; Ribeiro, Angela; Sandoval, Francisco

    2002-02-01

    This work describes an approach to obtaining the best estimate of the position of the outdoor robot ROJO, a low-cost lawnmower intended to perform unmanned precision agriculture tasks such as the spraying of pesticides in horticulture. For continuous localization of ROJO, two redundant sensors have been installed onboard: a DGPS receiver of submetric precision and an odometric system. The DGPS allows absolute positioning of the vehicle in the field, but GPS failures in signal reception due to obstacles and electrical and meteorological disturbances led us to the integration of the odometric system. Thus, a robust odometer based upon magnetic strip sensors has been designed and integrated into the vehicle. These sensors continuously deliver the position of the vehicle relative to its initial position, covering the DGPS blind periods. They give an approximate location of the vehicle in the field that can in turn be conveniently updated and corrected by the DGPS. To provide the best estimate, a fusion algorithm has been proposed and validated, wherein the best estimate is calculated as the maximum of the joint probability function obtained from the position estimates of both onboard sensors. Some results are presented to show the performance of the proposed sensor fusion technique.
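
    For two independent Gaussian position estimates, the maximum of their joint probability density coincides with the precision-weighted combination of the two estimates. The sketch below shows that generic rule; it is a stand-in consistent with the abstract's description, not necessarily the exact algorithm used on ROJO.

    ```python
    import numpy as np

    def fuse(x_gps, P_gps, x_odo, P_odo):
        """Fuse two independent Gaussian position estimates of the same vehicle:
        the argmax of their joint density is the precision-weighted mean."""
        W_gps, W_odo = np.linalg.inv(P_gps), np.linalg.inv(P_odo)   # precisions
        P_fused = np.linalg.inv(W_gps + W_odo)
        x_fused = P_fused @ (W_gps @ x_gps + W_odo @ x_odo)
        return x_fused, P_fused

    # During a DGPS blind period, simply keep the odometric estimate:
    # x, P = (x_odo, P_odo) if gps_lost else fuse(x_gps, P_gps, x_odo, P_odo)
    ```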

  19. Interactive animated display of man-controlled and autonomous robots

    SciTech Connect

    Crane, C.D. III; Duffy, J.

    1986-01-01

    An interactive computer graphics program has been developed which allows an operator to more readily control robot motions in two distinct modes; viz., man-controlled and autonomous. In man-controlled mode, the robot is guided by a joystick or similar device. As the robot moves, actual joint angle information is measured and supplied to a graphics system which accurately duplicates the robot motion. Obstacles are placed in the actual and animated workspace and the operator is warned of imminent collisions by sight and sound via the graphics system. Operation of the system in man-controlled mode is shown. In autonomous mode, a collision-free path between specified points is obtained by previewing robot motions on the graphics system. Once a satisfactory path is selected, the path characteristics are transmitted to the actual robot and the motion is executed. The telepresence system developed at the University of Florida has been successful in demonstrating that the concept of controlling a robot manipulator with the aid of an interactive computer graphics system is feasible and practical. The clarity of images coupled with real-time interaction and real-time determination of imminent collision with obstacles has resulted in improved operator performance. Furthermore, the ability for an operator to preview and supervise autonomous operations is a significant attribute when operating in a hazardous environment.

  20. Vision Based Autonomous Robotic Control for Advanced Inspection and Repair

    NASA Technical Reports Server (NTRS)

    Wehner, Walter S.

    2014-01-01

    The advanced inspection system is an autonomous control and analysis system that improves the inspection and remediation operations for ground and surface systems. It uses optical imaging technology with intelligent computer vision algorithms to analyze physical features of the real-world environment to make decisions and learn from experience. The advanced inspection system plans to control a robotic manipulator arm, an unmanned ground vehicle and cameras remotely, automatically and autonomously. There are many computer vision, image processing and machine learning techniques available as open source for using vision as a sensory feedback in decision-making and autonomous robotic movement. My responsibilities for the advanced inspection system are to create a software architecture that integrates and provides a framework for all the different subsystem components; identify open-source algorithms and techniques; and integrate robot hardware.

  1. Online Tracking Control of Autonomous Mobile Robot Utilizing Optimal Formulation

    NASA Astrophysics Data System (ADS)

    Hirakoso, Nobuto; Takizawa, Takahiro; Ishihara, Masaaki; Aoki, Kouzou

    In this study, the objective is to build a wheeled mobile robot which can move independently while avoiding obstacles. To move autonomously, the robot is able to detect obstacle shapes and perform self-localization. The robot can also move by tracking trajectories designed by the robot itself, based on information about the obstacle shapes and the robot's position and attitude angle. The optimal trajectories which lead the robot to its destination are designed by using a unique optimization method. As convergent calculation is performed by setting the variables within a certain range in this proposed optimization method, optimal solutions can be obtained approximately, even in cases where the numbers of input and output variables differ and where the nonlinearity is strong under constraint conditions. In this paper, the effectiveness of the proposed optimal trajectory design method is demonstrated and shown to be appropriate.

  2. Autonomous assistance navigation for robotic wheelchairs in confined spaces.

    PubMed

    Cheein, Fernando Auat; Carelli, Ricardo; De la Cruz, Celso; Muller, Sandra; Bastos Filho, Teodiano F

    2010-01-01

    In this work, a visual interface for assisting the navigation of a robotic wheelchair is presented. The visual interface is developed for navigation in confined spaces such as narrow corridors or corridor ends. The interface supports two navigation modes: non-autonomous and autonomous. Non-autonomous driving of the robotic wheelchair is performed by means of a hand joystick, which directs the motion of the vehicle within the environment. Autonomous driving is performed when the user of the wheelchair has to turn (by 90, -90 or 180 degrees) within the environment. The turning strategy is performed by a maneuverability algorithm compatible with the kinematics of the wheelchair and by a SLAM (Simultaneous Localization and Mapping) algorithm. The SLAM algorithm provides the interface with information about the layout of the environment and the pose (position and orientation) of the wheelchair within it. Experimental and statistical results for the interface are also shown in this work.

  3. Remote radioactive waste drum inspection with an autonomous mobile robot

    SciTech Connect

    Heckendorn, F.M.; Ward, C.R.; Wagner, D.G.

    1992-01-01

    An autonomous mobile robot is being developed to perform remote surveillance and inspection tasks on large numbers of stored radioactive waste drums. The robot will be self-guided through narrow storage aisles and will record the visual image of each viewable drum for subsequent off-line analysis and archiving. The system will remove personnel from potential exposure to radiation, perform the required inspections, and improve the ability to assess long-term trends in drum conditions.

  4. Remote radioactive waste drum inspection with an autonomous mobile robot

    SciTech Connect

    Heckendorn, F.M.; Ward, C.R.; Wagner, D.G.

    1992-11-01

    An autonomous mobile robot is being developed to perform remote surveillance and inspection tasks on large numbers of stored radioactive waste drums. The robot will be self-guided through narrow storage aisles and will record the visual image of each viewable drum for subsequent off-line analysis and archiving. The system will remove personnel from potential exposure to radiation, perform the required inspections, and improve the ability to assess long-term trends in drum conditions.

  5. Automatic detection and classification of obstacles with applications in autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Ponomaryov, Volodymyr I.; Rosas-Miranda, Dario I.

    2016-04-01

    A hardware implementation of automatic detection and classification of objects that can represent an obstacle for an autonomous mobile robot, using stereo vision algorithms, is presented. We propose and evaluate a new method to detect and classify objects for a mobile robot in outdoor conditions. The method is divided into two parts: the first is the object detection step, based on the distance from the objects to the camera and a BLOB analysis; the second is the classification step, based on visual primitives and an SVM classifier. The proposed method runs on a GPU in order to reduce processing time. This is done with hardware based on multi-core processors and a GPU platform, using an NVIDIA® GeForce® GT640 graphics card and Matlab on a PC running Windows 10.
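
    The classification stage described in this record (visual primitives fed to an SVM) can be sketched in a few lines. The following is a minimal illustration only, assuming pre-computed per-blob feature vectors and using scikit-learn rather than the GPU/Matlab pipeline of the paper; the feature layout and data are invented.

    ```python
    # Hypothetical sketch of the classification step: an SVM over feature
    # vectors ("visual primitives") extracted from detected blobs.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Stand-in features, one row per detected blob (e.g., size, aspect ratio,
    # mean disparity, texture statistics); labels: 0 = traversable, 1 = obstacle.
    X = rng.normal(size=(200, 6))
    y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

    clf = SVC(kernel="rbf", C=1.0, gamma="scale")
    clf.fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))
    ```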

  6. Brain, mind, body and society: autonomous system in robotics.

    PubMed

    Shimoda, Motomu

    2013-12-01

    In this paper I examine the issues related to the robot with a mind. Creating a robot with a mind aims to recreate neural function by engineering. The robot with a mind is expected not only to process external information through its built-in program and behave accordingly, but also to gain consciousness activity that responds to multiple conditions, along with flexible, interactive communication skills for coping with unknown situations. That prospect is based on the development of artificial intelligence in which self-organizing and self-emergent functions have become available in recent years. To date, the controllable aspects of robotics have been restricted to data preparation and the programming of cognitive abilities, while consciousness activities and communication skills have been regarded as uncontrollable aspects due to their contingency and uncertainty. However, some researchers in robotics claim that every activity of the mind can be recreated by engineering and is therefore controllable. Based on the development of the cognitive abilities of children and the findings of neuroscience, researchers have attempted to produce the latest artificial intelligence with autonomous learning systems. I conclude that controllability is inconsistent with autonomy in the genuine sense, and that autonomous robots recreated by engineering cannot be autonomous partners of humans.

  7. FPGA implementation of vision algorithms for small autonomous robots

    NASA Astrophysics Data System (ADS)

    Anderson, J. D.; Lee, D. J.; Archibald, J. K.

    2005-10-01

    The use of on-board vision with small autonomous robots has been made possible by advances in the field of Field Programmable Gate Array (FPGA) technology. By connecting a CMOS camera to an FPGA board, on-board vision has been used to reduce the computation time inherent in vision algorithms. The FPGA board allows the user to create custom hardware in a faster, safer, and more easily verifiable manner that decreases the computation time and allows the vision processing to be done in real time. Real-time vision tasks for small autonomous robots include object tracking, obstacle detection and avoidance, and path planning. Competitions were created to demonstrate that our algorithms work with our small autonomous vehicles in dealing with these problems. These competitions include Mouse-Trapped-in-a-Box, where the robot has to detect the edges of a box that it is trapped in and move towards them without touching them; Obstacle Avoidance, where an obstacle is placed at an arbitrary point in front of the robot and the robot has to navigate itself around the obstacle; Canyon Following, where the robot has to move to the center of a canyon and follow the canyon walls, trying to stay in the center; the Grand Challenge, where the robot had to navigate a hallway and return to its original position in a given amount of time; and Stereo Vision, where a separate robot had to catch tennis balls launched from an air-powered cannon. Teams competed in each of these competitions, which were designed for a graduate-level robotic vision class, and each team had to develop its own algorithm and hardware components. This paper discusses one team's approach to each of these problems.

  8. ODYSSEUS autonomous walking robot: The leg/arm design

    NASA Technical Reports Server (NTRS)

    Bourbakis, N. G.; Maas, M.; Tascillo, A.; Vandewinckel, C.

    1994-01-01

    ODYSSEUS is an autonomous walking robot which makes use of three wheels and three legs for its movement in the free navigation space. More specifically, it uses its autonomous wheels to move around in environments where the surface is smooth and even. However, when there are low obstacles, stairs, or small unevenness in the navigation environment, the robot makes use of both wheels and legs to travel efficiently. In this paper we present the detailed hardware design and the simulated behavior of the extended leg/arm part of the robot, since it plays a very significant role in the robot's actions (movements, selection of objects, etc.). In particular, the leg/arm consists of three major parts. The first part is a pipe attached to the robot base with a flexible 3-D joint; this pipe has a rotated bar as an extended part, which terminates in a 3-D flexible joint. The second part of the leg/arm is also a pipe similar to the first, whose extended bar ends at a 2-D joint. The last part of the leg/arm is a clip-hand, used for picking up several small, lightweight objects and, when in 'closed' mode, serving as a supporting part of the robot leg. The entire leg/arm is controlled and synchronized by a microcontroller (68HC11) attached to the robot base.

  9. An Aerial–Ground Robotic System for Navigation and Obstacle Mapping in Large Outdoor Areas

    PubMed Central

    Garzón, Mario; Valente, João; Zapata, David; Barrientos, Antonio

    2013-01-01

    There are many outdoor robotic applications where a robot must reach a goal position or explore an area without previous knowledge of its surroundings. Other applications (such as path planning) require known maps or prior information about the environment. This work presents a system composed of a terrestrial and an aerial robot that cooperate and share sensor information in order to address those requirements. The ground robot is able to navigate in an unknown large environment aided by visual feedback from a camera on board the aerial robot. At the same time, the obstacles are mapped in real time by combining the information from the camera with the positioning system of the ground robot. A set of experiments was carried out with the purpose of verifying the system's applicability. The experiments were performed in a simulation environment and outdoors with a medium-sized ground robot and a mini quad-rotor. The proposed robotic system shows outstanding results in simultaneous navigation and mapping applications in large outdoor environments. PMID:23337332

  10. Development of a Commercially Viable, Modular Autonomous Robotic Systems for Converting any Vehicle to Autonomous Control

    NASA Technical Reports Server (NTRS)

    Parish, David W.; Grabbe, Robert D.; Marzwell, Neville I.

    1994-01-01

    A Modular Autonomous Robotic System (MARS), consisting of a modular autonomous vehicle control system that can be retrofitted onto any vehicle to convert it to autonomous control and that supports a modular payload for multiple applications, is being developed. The MARS design is scalable, reconfigurable, and cost effective due to the use of modern open-system architecture design methodologies, including serial control bus technology to simplify system wiring and enhance scalability. The design is augmented with modular, object-oriented (C++) software implementing a hierarchy of five levels of control: teleoperated, continuous guidepath following, periodic guidepath following, absolute-position autonomous navigation, and relative-position autonomous navigation. The present effort is focused on producing a system that is commercially viable for routine autonomous patrolling of known, semistructured environments, such as environmental monitoring of chemical and petroleum refineries, exterior physical security and surveillance, perimeter patrolling, and intrafacility transport applications.

  11. Modeling and Control Strategies for Autonomous Robotic Systems

    DTIC Science & Technology

    1991-12-23

    Final report; principal author Roger W. Brockett. The available record consists only of report documentation-page fields (report type, reporting period, and a sponsoring-office address in Research Triangle Park, NC 27709-2211); no abstract is provided.

  12. Navigation and learning experiments by an autonomous robot

    SciTech Connect

    de Saussure, G.; Weisbin, C.R.; Spelt, P.F.

    1988-01-01

    Developing an autonomous mobile robot capable of navigation, surveillance and manipulation in complex and dynamic environments is a key research activity at CESAR, Oak Ridge National Laboratory's Center for Engineering Systems Advanced Research. The latest series of completed experiments was performed using the autonomous mobile robot HERMIES-IIB (Hostile Environment Robotic Machine Intelligence Experiment Series II-B). The next section describes HERMIES-IIB and some of its major components required for autonomous operation in unstructured, dynamic environments. Section 3 outlines some ongoing research in autonomous navigation. Section 4 discusses our newest research in machine learning concepts. Section 5 describes a successful experiment in which the robot is placed in an arbitrary initial location without any prior specification of the content of its environment, successively discovers and navigates around stationary or moving obstacles, picks up and moves small obstacles, searches for a control panel and performs a learned sequence of manipulations on the panel devices. The last section outlines some future directions of the program.

  13. GRACE: An Autonomous Robot for the AAAI Robot Challenge

    DTIC Science & Technology

    2003-01-01

    ...robot interaction (aside from the speech recognition) worked relatively well, there were areas for improvement. For instance, gesture recognition, which... ...and tracking, people following, gesture recognition, nametag reading, and face recognition. We plan to incorporate capabilities for the robot to...

  14. Defining proprioceptive behaviors for autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Overholt, James L.; Hudas, Greg R.; Gerhart, Grant R.

    2002-07-01

    Proprioception is a sense of body position and movement that supports the control of many automatic motor functions such as posture and locomotion. This concept, normally relegated to the fields of neural physiology and kinesiology, is being utilized in the field of unmanned mobile robotics. This paper looks at developing proprioceptive behaviors for use in controlling an unmanned ground vehicle. First, we will discuss the field of behavioral control of mobile robots. Next, a discussion of proprioception and the development of proprioceptive sensors will be presented. We will then focus on the development of a unique neural-fuzzy architecture that will be used to incorporate the control behaviors coming directly from the proprioceptive sensors. Finally we will present a simulation experiment where a simple multi-sensor robot, utilizing both external and proprioceptive sensors, is presented with the task of navigating an unknown terrain to a known target position. Results of the mobile robot utilizing this unique fusion methodology will be discussed.

  15. Applications of concurrent neuromorphic algorithms for autonomous robots

    NASA Technical Reports Server (NTRS)

    Barhen, J.; Dress, W. B.; Jorgensen, C. C.

    1988-01-01

    This article provides an overview of studies at the Oak Ridge National Laboratory (ORNL) of neural networks running on parallel machines applied to the problems of autonomous robotics. The first section provides the motivation for our work in autonomous robotics and introduces the computational hardware in use. Section 2 presents two theorems concerning the storage capacity and stability of neural networks. Section 3 presents a novel load-balancing algorithm implemented with a neural network. Section 4 introduces the robotics test bed now in place. Section 5 concerns navigation issues in the test-bed system. Finally, Section 6 presents a frequency-coded network model and shows how Darwinian techniques are applied to issues of parameter optimization and on-line design.

  16. Research in autonomous robotics at ORNL using HERMIES-III

    SciTech Connect

    Weisbin, C.R.; Burks, B.L.; Einstein, J.R.; Feezell, R.R.; Manges, W.W.; Thompson, D.H.

    1989-01-01

    HERMIES-III is an autonomous robot comprised of a seven-degree-of-freedom (DOF) manipulator designed for human-scale tasks, a laser range finder, a sonar array, an omnidirectional wheel-driven chassis, multiple cameras, and a dual computer system containing a 16-node hypercube expandable to 128 nodes. The current experimental program involves performance of human-scale tasks (e.g., valve manipulation, use of tools), integration of a dexterous manipulator and platform motion in geometrically complex environments, and effective use of multiple cooperating robots (HERMIES-IIB and HERMIES-III). The environment in which the robots operate has been designed to include multiple valves, pipes, meters, obstacles on the floor, valves occluded from view, and multiple paths of differing navigation complexity. The ongoing research program supports the development of autonomous capability for HERMIES-IIB and III to perform complex navigation and manipulation under time constraints, while dealing with imprecise sensory information. 10 refs., 4 figs.

  17. Autonomous learning in humanoid robotics through mental imagery.

    PubMed

    Di Nuovo, Alessandro G; Marocco, Davide; Di Nuovo, Santo; Cangelosi, Angelo

    2013-05-01

    In this paper we focus on modeling autonomous learning to improve the performance of a humanoid robot through a modular artificial neural network architecture. A model of a neural controller is presented which allows the humanoid robot iCub to autonomously improve its sensorimotor skills. This is achieved by endowing the neural controller with a secondary neural system that, by exploiting the sensorimotor skills already acquired by the robot, is able to generate additional imaginary examples that can be used by the controller itself to improve performance through simulated mental training. Results and analysis presented in the paper provide evidence of the viability of the proposed approach and help to clarify the rationale behind the chosen model and its implementation.

  18. The Baker Observatory Robotic Autonomous Telescope

    NASA Astrophysics Data System (ADS)

    Reed, Mike D.; Thompson, Matthew A.; Hicks, L. L.; Baran, A. S.

    2011-03-01

    The objective of our project is to have an autonomous observatory to obtain long duration time-series observations of pulsating stars. Budget constraints dictate an inexpensive facility. In this paper, we discuss our solution.

  19. Concurrent planning and execution for autonomous robots

    NASA Technical Reports Server (NTRS)

    Simmons, Reid G.

    1992-01-01

    The Task Control Architecture (TCA) provides communication and coordination facilities to construct distributed, concurrent robotic systems. The use of TCA in a system that walks a legged robot through rugged terrain is described. The walking system, as originally implemented, had a sequential sense-plan-act control cycle. Utilizing TCA features for task sequencing and monitoring, the system was modified to concurrently plan and execute steps. Walking speed improved by over 30 percent, with only a relatively modest conversion effort.

  20. System safety analysis of an autonomous mobile robot

    SciTech Connect

    Bartos, R.J.

    1994-08-01

    Analysis of the safety of operating and maintaining the Stored Waste Autonomous Mobile Inspector (SWAMI) II in a hazardous environment at the Fernald Environmental Management Project (FEMP) was completed. The SWAMI II is a version of a commercial robot, the HelpMate™ robot produced by the Transitions Research Corporation, which is being updated to incorporate the systems required for inspecting mixed toxic chemical and radioactive waste drums at the FEMP. It also has modified obstacle detection and collision avoidance subsystems. The robot will autonomously travel down the aisles in storage warehouses to record images of containers and collect other data, which are transmitted to an inspector at a remote computer terminal. A previous study showed the SWAMI II to be economically feasible. The SWAMI II will locate radioactive contamination more accurately than human inspectors. This thesis includes a System Safety Hazard Analysis and a quantitative Fault Tree Analysis (FTA). The objectives of the analyses are to prevent potentially serious events and to derive a comprehensive set of safety requirements against which the safety of the SWAMI II and other autonomous mobile robots can be evaluated. The Computer-Aided Fault Tree Analysis (CAFTA©) software is utilized for the FTA. The FTA shows that more than 99% of the safety risk occurs during maintenance, and that when the derived safety requirements are implemented the rate of serious events is reduced to below one event per million operating hours. Training and procedures in SWAMI II operation and maintenance provide an added safety margin. This study will promote the safe use of the SWAMI II and other autonomous mobile robots in the emerging technology of mobile robotic inspection.
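
    The quantitative FTA mentioned above combines basic-event probabilities through AND/OR gates to obtain a top-event rate. The toy calculation below illustrates only that arithmetic; the event names and probabilities are invented and are not taken from the SWAMI II analysis.

    ```python
    # Toy fault-tree arithmetic: OR gates combine independent event
    # probabilities, AND gates multiply them. Numbers are illustrative only.
    def p_or(*probs):
        # P(A or B or ...) for independent events = 1 - prod(1 - p_i)
        out = 1.0
        for p in probs:
            out *= (1.0 - p)
        return 1.0 - out

    def p_and(*probs):
        out = 1.0
        for p in probs:
            out *= p
        return out

    # Hypothetical basic events (per operating hour).
    p_sensor_fail   = 2e-6
    p_brake_fail    = 1e-6
    p_operator_miss = 1e-3   # operator fails to notice a fault indication

    # Serious event: an undetected hardware fault (sensor OR brake)
    # AND the operator missing it.
    p_top = p_and(p_or(p_sensor_fail, p_brake_fail), p_operator_miss)
    print(f"top event: {p_top:.2e} per operating hour")
    ```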

  1. Multiagent collaboration for experimental calibration of an autonomous mobile robot

    NASA Astrophysics Data System (ADS)

    Vachon, Bertrand; Berge-Cherfaoui, Veronique

    1991-03-01

    This paper presents an action in mission SOCRATES whose aim is the development of a self-calibration method for an autonomous mobile robot. The robot has to determine the precise location of the coordinate system shared by its sensors. Knowledge of this system is a sine qua non condition for efficient multisensor fusion and autonomous navigation in an unknown environment. But, as perceptions and motions are not accurate, this knowledge can only be achieved by multisensor fusion. The application described highlights this kind of problem. Multisensor fusion is used here especially in its symbolic aspect. Useful knowledge includes both numerous data coming from various sensors and suitable ways to process these data. A blackboard architecture has been chosen to manage useful information. Knowledge sources are called agents, and they implement physical sensors (perceptors or actuators) as well as logical sensors (high-level data processors). The problem to solve is self-calibration, which includes the determination of the coordinate system R of the robot and the transformations necessary to convert data from sensor references to R. The origin of R has been chosen to be O, the rotation center of the robot. As its true location may vary due to robot or ground characteristics, an experimental determination of O is attempted. A strategy for measuring distances in approximate positions is proposed. This strategy must take into account the fact that motions of the robot as well as perceptions may be inaccurate. Results obtained during experiments and future extensions of the system are discussed.

  2. Biomimetic smart sensors for autonomous robotic behavior II: vestibular processing

    NASA Astrophysics Data System (ADS)

    Xue, Shuwan; Deligeorges, Socrates; Soloway, Aaron; Lichtenstein, Lee; Gore, Tyler; Hubbard, Allyn

    2009-05-01

    Limited autonomous behaviors are fast becoming a critical capability in the field of robotics as robotic applications are used in more complicated and interactive environments. As additional sensory capabilities are added to robotic platforms, sensor fusion to enhance and facilitate autonomous behavior becomes increasingly important. Using biology as a model, the equivalent of a vestibular system needs to be created in order to orient the system within its environment and allow multi-modal sensor fusion. In mammals, the vestibular system plays a central role in physiological homeostasis and sensory information integration (Fuller et al, Neuroscience 129 (2004) 461-471). At the level of the Superior Colliculus in the brain, there is multimodal sensory integration across visual, auditory, somatosensory, and vestibular inputs (Wallace et al, J Neurophysiol 80 (1998) 1006-1010), with the vestibular component contributing a strong reference frame gating input. Using a simple model for the deep layers of the Superior Colliculus, an off-the-shelf 3-axis solid state gyroscope and accelerometer was used as the equivalent representation of the vestibular system. The acceleration and rotational measurements are used to determine the relationship between a local reference frame of a robotic platform (an iRobot Packbot®) and the inertial reference frame (the outside world), with the simulated vestibular input tightly coupled with the acoustic and optical inputs. Field testing of the robotic platform using acoustics to cue optical sensors coupled through a biomimetic vestibular model for "slew to cue" gunfire detection have shown great promise.

  3. Rice-obot 1: An intelligent autonomous mobile robot

    NASA Technical Reports Server (NTRS)

    Defigueiredo, R.; Ciscon, L.; Berberian, D.

    1989-01-01

    The Rice-obot I is the first in a series of Intelligent Autonomous Mobile Robots (IAMRs) being developed at Rice University's Cooperative Intelligent Mobile Robots (CIMR) lab. The Rice-obot I is mainly designed to be a testbed for various robotic and AI techniques, and a platform for developing intelligent control systems for exploratory robots. Researchers present the need for a generalized environment capable of combining all of the control, sensory and knowledge systems of an IAMR. They introduce Lisp-Nodes as such a system, and develop the basic concepts of nodes, messages and classes. Furthermore, they show how the control system of the Rice-obot I is implemented as sub-systems in Lisp-Nodes.

  4. Application of a Chaotic Oscillator in an Autonomous Mobile Robot

    NASA Astrophysics Data System (ADS)

    Tlelo-Cuautle, Esteban; Ramos-López, Hugo C.; Sánchez-Sánchez, Mauro; Pano-Azucena, Ana D.; Sánchez-Gaspariano, Luis A.; Núñez-Pérez, José C.; Camas-Anzueto, Jorge L.

    2014-05-01

    Terrain exploration robots can be very useful in critical navigation circumstances. However, the challenge is how to guarantee control that covers a full terrain area. To that end, the application of a chaotic oscillator to control the wheels of an autonomous mobile robot is introduced herein. Basically, we describe the realization of a random number generator (RNG) based on a double-scroll chaotic oscillator, which is used to guide the robot to cover a full terrain area. The resolution of the terrain exploration area is determined by both the number of bits provided by the RNG and the characteristics of the step motors. Finally, the experimental results highlight the covered area by painting the trajectories that the robot explores.
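
    The core idea in this record, bits drawn from a double-scroll chaotic oscillator to steer the robot, can be sketched as follows. This is an assumption-laden illustration: it integrates Chua's double-scroll system with forward Euler and thresholds one state variable into bits, which is not necessarily the authors' circuit or bit-extraction rule, and it is not a statistically validated RNG.

    ```python
    # Sketch: bits from a Chua-type double-scroll oscillator pick the robot's
    # next heading. Parameters, time step, and thresholding are illustrative.
    def chua_bits(n_bits, dt=0.002, steps_per_bit=500):
        alpha, beta = 15.6, 28.0
        m0, m1 = -1.143, -0.714
        x, y, z = 0.7, 0.0, 0.0
        bits = []
        for _ in range(n_bits):
            for _ in range(steps_per_bit):
                fx = m1 * x + 0.5 * (m0 - m1) * (abs(x + 1) - abs(x - 1))
                dx = alpha * (y - x - fx)
                dy = x - y + z
                dz = -beta * y
                x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
            bits.append(1 if x > 0 else 0)   # which scroll the trajectory is on
        return bits

    bits = chua_bits(16)
    # Two bits at a time select one of four headings for the next move.
    headings = ["N", "E", "S", "W"]
    moves = [headings[2 * b0 + b1] for b0, b1 in zip(bits[::2], bits[1::2])]
    print(bits, moves)
    ```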

  5. Autonomous robots for hazardous and unstructured environments

    SciTech Connect

    Hamel, W.R.; Babcock, S.M.; Hall, M.G.; Jorgenson, C.C.; Killough, S.M.; Weisbin, C.R.

    1986-01-01

    This paper reports continuing research in the areas of navigation and manipulation in unstructured environments, which is being carried out at the Center for Engineering Systems Advanced Research (CESAR) at the Oak Ridge National Laboratory (ORNL). The HERMIES-II mobile robot, a low-cost prototype of a series that will include many of the major features required for remote operations in hazardous environments, is discussed. Progress toward development of a high-performance research manipulator is presented, and application of an advanced parallel computer to mobile robot problems, which is under way, is discussed.

  6. Mobile autonomous robotic apparatus for radiologic characterization

    DOEpatents

    Dudar, A.M.; Ward, C.R.; Jones, J.D.; Mallet, W.R.; Harpring, L.J.; Collins, M.X.; Anderson, E.K.

    1999-08-10

    A mobile robotic system is described that conducts radiological surveys to map alpha, beta, and gamma radiation on surfaces in relatively level open areas or areas containing obstacles such as stored containers or hallways, equipment, walls and support columns. The invention incorporates improved radiation monitoring methods using multiple scintillation detectors, the use of laser scanners for maneuvering in open areas, ultrasound pulse generators and receptors for collision avoidance in limited-space areas or hallways, methods to trigger visible alarms when radiation is detected, and methods to transmit location data for real-time reporting and mapping of radiation locations on computer monitors at a host station. A multitude of high-performance scintillation detectors detect radiation while the on-board system controls the direction and speed of the robot along pre-programmed paths. The operators may revise the preselected movements of the robotic system over ethernet communications to re-monitor areas of radiation or to avoid walls, columns, equipment, or containers. The robotic system is capable of floor survey speeds from 1/2 inch per second up to about 30 inches per second, while the on-board processor collects, stores, and transmits information for real-time mapping of radiation intensity and the locations of the radiation for real-time display on computer monitors at a central command console. 4 figs.

  7. Mobile autonomous robotic apparatus for radiologic characterization

    DOEpatents

    Dudar, Aed M.; Ward, Clyde R.; Jones, Joel D.; Mallet, William R.; Harpring, Larry J.; Collins, Montenius X.; Anderson, Erin K.

    1999-01-01

    A mobile robotic system that conducts radiological surveys to map alpha, beta, and gamma radiation on surfaces in relatively level open areas or areas containing obstacles such as stored containers or hallways, equipment, walls and support columns. The invention incorporates improved radiation monitoring methods using multiple scintillation detectors, the use of laser scanners for maneuvering in open areas, ultrasound pulse generators and receptors for collision avoidance in limited-space areas or hallways, methods to trigger visible alarms when radiation is detected, and methods to transmit location data for real-time reporting and mapping of radiation locations on computer monitors at a host station. A multitude of high-performance scintillation detectors detect radiation while the on-board system controls the direction and speed of the robot along pre-programmed paths. The operators may revise the preselected movements of the robotic system over ethernet communications to re-monitor areas of radiation or to avoid walls, columns, equipment, or containers. The robotic system is capable of floor survey speeds from 1/2 inch per second up to about 30 inches per second, while the on-board processor collects, stores, and transmits information for real-time mapping of radiation intensity and the locations of the radiation for real-time display on computer monitors at a central command console.

  8. Towards Autonomous Operations of the Robonaut 2 Humanoid Robotic Testbed

    NASA Technical Reports Server (NTRS)

    Badger, Julia; Nguyen, Vienny; Mehling, Joshua; Hambuchen, Kimberly; Diftler, Myron; Luna, Ryan; Baker, William; Joyce, Charles

    2016-01-01

    The Robonaut project has been conducting research in robotics technology on board the International Space Station (ISS) since 2012. Recently, the original upper body humanoid robot was upgraded by the addition of two climbing manipulators ("legs"), more capable processors, and new sensors, as shown in Figure 1. While Robonaut 2 (R2) has been working through checkout exercises on orbit following the upgrade, technology development on the ground has continued to advance. Through the Active Reduced Gravity Offload System (ARGOS), the Robonaut team has been able to develop technologies that will enable full operation of the robotic testbed on orbit using similar robots located at the Johnson Space Center. Once these technologies have been vetted in this way, they will be implemented and tested on the R2 unit on board the ISS. The goal of this work is to create a fully-featured robotics research platform on board the ISS to increase the technology readiness level of technologies that will aid in future exploration missions. Technology development has thus far followed two main paths, autonomous climbing and efficient tool manipulation. Central to both technologies has been the incorporation of a human robotic interaction paradigm that involves the visualization of sensory and pre-planned command data with models of the robot and its environment. Figure 2 shows screenshots of these interactive tools, built in rviz, that are used to develop and implement these technologies on R2. Robonaut 2 is designed to move along the handrails and seat track around the US lab inside the ISS. This is difficult for many reasons, namely the environment is cluttered and constrained, the robot has many degrees of freedom (DOF) it can utilize for climbing, and remote commanding for precision tasks such as grasping handrails is time-consuming and difficult. Because of this, it is important to develop the technologies needed to allow the robot to reach operator-specified positions as

  9. Automatic Welding System Using Speed Controllable Autonomous Mobile Robot

    NASA Astrophysics Data System (ADS)

    Kim, Taewon; Suto, Takeshi; Kobayashi, Junya; Kim, Jongcheol; Suga, Yasuo

    A prototype of an autonomous mobile robot with two vision sensors for automatic welding of steel plates was constructed. The robot can move straight, steer, and turn around its center by controlling the driving speed of the two wheels individually. Two CCD cameras are fixed at the tip of the movable arm: a local camera observes the welding line near the welding torch, and a wide camera observes a relatively wide area in front of the welding part. The robot controls the traveling speed in accordance with the shape of the welding line. For a straight welding line, the speed of the robot is increased and the welding efficiency is improved; if the robot finds a corner of the welding line, the speed is decreased in order to realize precise seam tracking and stable welding. The robot can therefore achieve precise, high-speed seam tracking by controlling the travel speed. The effectiveness of the control system is confirmed by welding experiments.
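
    A toy version of the speed rule described above (fast on straight seams, slow at corners) might map the locally estimated seam curvature to a travel speed. The gains and limits below are made-up values, not the authors' controller.

    ```python
    def travel_speed(curvature, v_min=2.0, v_max=10.0, k_ref=0.05):
        """Slow down as seam curvature grows; curvature in 1/mm, speed in mm/s.
        Illustrative rule only."""
        scale = 1.0 / (1.0 + abs(curvature) / k_ref)
        return v_min + (v_max - v_min) * scale

    for k in [0.0, 0.01, 0.05, 0.2]:          # straight line ... sharp corner
        print(f"curvature {k:5.2f} 1/mm -> speed {travel_speed(k):5.2f} mm/s")
    ```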

  10. Autonomous stair-climbing with miniature jumping robots.

    PubMed

    Stoeter, Sascha A; Papanikolopoulos, Nikolaos

    2005-04-01

    The problem of vision-guided control of miniature mobile robots is investigated. Untethered mobile robots with small physical dimensions of around 10 cm or less do not permit powerful onboard computers because of size and power constraints. These challenges have, in the past, reduced the functionality of such devices to that of a complex remote control vehicle with fancy sensors. With the help of a computationally more powerful entity such as a larger companion robot, the control loop can be closed. Using the miniature robot's video transmission or that of an observer to localize it in the world, control commands can be computed and relayed to the inept robot. The result is a system that exhibits autonomous capabilities. The framework presented here solves the problem of climbing stairs with the miniature Scout robot. The robot's unique locomotion mode, the jump, is employed to hop one step at a time. Methods for externally tracking the Scout are developed. A large number of real-world experiments are conducted and the results discussed.

  11. Autonomous Robot System for Sensor Characterization

    SciTech Connect

    David Bruemmer; Douglas Few; Frank Carney; Miles Walton; Heather Hunting; Ron Lujan

    2004-03-01

    This paper discusses an innovative application of new Markov localization techniques that combat the problem of odometry drift, allowing a novel control architecture developed at the Idaho National Engineering and Environmental Laboratory (INEEL) to be utilized within a sensor characterization facility developed at the Remote Sensing Laboratory (RSL) in Nevada. The new robotic capability provided by the INEEL will allow RSL to test and evaluate a wide variety of sensors including radiation detection systems, machine vision systems, and sensors that can detect and track heat sources (e.g. human bodies, machines, chemical plumes). By accurately moving a target at varying speeds along designated paths, the robotic solution allows the detection abilities of a wide variety of sensors to be recorded and analyzed.
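
    Markov localization of the kind referenced here maintains a probability distribution over discrete poses and alternates a motion (prediction) update with a sensor (correction) update. Below is a one-dimensional histogram-filter sketch with an invented map and invented noise parameters, intended only to illustrate the mechanism, not the INEEL implementation.

    ```python
    import numpy as np

    # 1-D corridor discretized into cells; the belief is a histogram over cells.
    n = 20
    belief = np.full(n, 1.0 / n)           # start fully uncertain

    doors = {3, 7, 14}                      # hypothetical map: cells with a door

    def predict(belief, move=1, p_exact=0.8, p_noise=0.1):
        """Motion update: shift the belief by `move` cells with some slippage."""
        out = np.zeros_like(belief)
        for i, p in enumerate(belief):
            out[(i + move) % n]     += p * p_exact
            out[(i + move - 1) % n] += p * p_noise
            out[(i + move + 1) % n] += p * p_noise
        return out

    def correct(belief, saw_door, p_hit=0.9, p_miss=0.1):
        """Sensor update: weight cells by how well they explain the measurement."""
        like = np.array([p_hit if ((i in doors) == saw_door) else p_miss
                         for i in range(n)])
        post = belief * like
        return post / post.sum()

    for saw_door in [True, False, False, False, True]:   # simulated observations
        belief = correct(predict(belief), saw_door)

    print("most likely cell:", int(belief.argmax()))
    ```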

  12. A task control architecture for autonomous robots

    NASA Technical Reports Server (NTRS)

    Simmons, Reid; Mitchell, Tom

    1990-01-01

    An architecture is presented for controlling robots that have multiple tasks, operate in dynamic domains, and require a fair degree of autonomy. The architecture is built on several layers of functionality, including a distributed communication layer, a behavior layer for querying sensors, expanding goals, and executing commands, and a task level for managing the temporal aspects of planning and achieving goals, coordinating tasks, allocating resources, monitoring, and recovering from errors. Application to a legged planetary rover and an indoor mobile manipulator is described.

  13. Autonomous Robotic Following Using Vision Based Techniques

    DTIC Science & Technology

    2005-02-03

    ...different methods for the soldier’s control of the vehicle are being investigated. One such method is the Leader-Follower approach. In the Field So... what is the current state of the art for leader-follower applications? One of the leaders in this field is the RF ATD (Robotic Follower Advanced... ...these systems have in common? Both of these platforms are representative of the state-of-the-art of current leader-follower technology being tested by...

  14. Autonomous Military Robotics: Risk, Ethics, and Design

    DTIC Science & Technology

    2008-12-20

    ...avenge the deaths of their brothers in arms—unlawful actions that carry a significant political cost. Indeed, robots may act as objective... unblinking observers on the battlefield, reporting any unethical behavior back to command; their mere presence as such would discourage all-too-human... ...act in compliance with the LOW and ROE (though this may not be as straightforward and simple as it first appears) or act ethically in the specific...

  15. Concurrent algorithms for autonomous robot navigation in an unexplored terrain

    SciTech Connect

    Rao, S.V.N.; Iyengar, S.S.; Jorgensen, C.C.; Weisbin, C.R.

    1986-01-01

    Navigation planning is one of the vital aspects of any autonomous mobile robot. In this paper, we present concurrent algorithms for an autonomous robot navigation system that does not require a pre-learned obstacle terrain model. The terrain model is gradually built by integrating the information from multiple journeys. The available information is used to the maximum extent in navigation planning, and global optimality is gradually achieved. It is shown that these concurrent algorithms are free from deadlocks and starvation. The performance of the concurrent algorithms is analyzed in terms of the planning time, travel time, scanning time, and update time. A modified adjacency list is proposed as the data structure for the spatial graph that represents an obstacle terrain. The time complexities of various algorithms that access, maintain, and update the spatial graph are estimated, and the effectiveness of the implementation is illustrated.

  16. Robotic reactions: delay-induced patterns in autonomous vehicle systems.

    PubMed

    Orosz, Gábor; Moehlis, Jeff; Bullo, Francesco

    2010-02-01

    Fundamental design principles are presented for vehicle systems governed by autonomous cruise control devices. By analyzing the corresponding delay differential equations, it is shown that for any car-following model short-wavelength oscillations can appear due to robotic reaction times, and that there are tradeoffs between the time delay and the control gains. The analytical findings are demonstrated on an optimal velocity model using numerical continuation and numerical simulation.
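
    The delay differential equations analyzed above can also be explored numerically. The sketch below simulates a Bando-type optimal velocity model on a ring road with a fixed reaction delay; the parameter values, velocity function, and integration scheme are illustrative assumptions, not those of the paper.

    ```python
    import numpy as np

    # Optimal velocity (Bando-type) car-following on a ring road with a fixed
    # reaction delay tau. All parameters are illustrative.
    n_cars, L = 10, 200.0            # number of cars, ring length [m]
    a, tau = 1.0, 0.5                # sensitivity [1/s], reaction delay [s]
    v_max, h_star = 15.0, 20.0       # optimal-velocity parameters

    def V(h):                        # optimal velocity as a function of headway
        return 0.5 * v_max * (1.0 + np.tanh(0.1 * (h - h_star)))

    dt, steps = 0.05, 4000
    delay_steps = int(round(tau / dt))

    rng = np.random.default_rng(0)
    x = (np.linspace(0.0, L, n_cars, endpoint=False)
         + rng.normal(0, 0.5, n_cars)) % L
    v = np.full(n_cars, V(L / n_cars))
    hist = [(x.copy(), v.copy())] * (delay_steps + 1)   # state history for the delay

    for _ in range(steps):
        x_d, v_d = hist[-(delay_steps + 1)]             # state tau seconds ago
        h_d = (np.roll(x_d, -1) - x_d) % L              # delayed headways
        dv = a * (V(h_d) - v_d)
        v = np.maximum(v + dt * dv, 0.0)
        x = (x + dt * v) % L
        hist.append((x.copy(), v.copy()))
        hist = hist[-(delay_steps + 1):]

    print("final speed spread:", v.max() - v.min())
    ```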

  17. A fuzzy logic controller for an autonomous mobile robot

    NASA Technical Reports Server (NTRS)

    Yen, John; Pfluger, Nathan

    1993-01-01

    The ability of a mobile robot system to plan and move intelligently in a dynamic environment is needed if robots are to be useful in areas other than controlled environments. One example use for such a system is controlling an autonomous mobile robot in a space station, or in another isolated area where it is hard or impossible for humans to live for long periods of time (e.g., Mars). The system would allow the robot to be programmed to carry out the duties normally accomplished by a human being, including operating instruments, transporting objects, and maintaining the environment. The main focus of our early work has been on developing a fuzzy controller that takes a path and adapts it to a given environment. The robot only uses information gathered from the sensors, but retains the ability to avoid dynamically placed obstacles near and along the path. Our fuzzy logic controller is based on the following algorithm: (1) determine the desired direction of travel; (2) determine the allowed direction of travel; and (3) combine the desired and allowed directions in order to determine a direction that is both desired and allowed. The desired direction of travel is determined by projecting ahead to a point along the path that is closer to the goal. This gives a local direction of travel for the robot and helps to avoid obstacles.
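
    The three-step algorithm quoted in this record (desired direction, allowed direction, then their combination) can be illustrated with a small degree-of-membership computation over candidate headings. The membership functions below are invented for illustration and are not the paper's rule base.

    ```python
    import math

    # Candidate headings in degrees (0 = straight ahead).
    headings = list(range(-90, 91, 15))

    def desirability(theta, goal_bearing=30.0, width=40.0):
        """Fuzzy 'desired' set: peaks at the bearing toward the next path point."""
        return math.exp(-((theta - goal_bearing) / width) ** 2)

    def allowedness(theta, obstacle_bearing=20.0, obstacle_width=25.0):
        """Fuzzy 'allowed' set: low near a sensed obstacle, high elsewhere."""
        return 1.0 - math.exp(-((theta - obstacle_bearing) / obstacle_width) ** 2)

    # Combine with a fuzzy AND (minimum) and steer toward the best candidate.
    scores = {th: min(desirability(th), allowedness(th)) for th in headings}
    best = max(scores, key=scores.get)
    print("chosen heading:", best, "deg, score:", round(scores[best], 3))
    ```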

  18. Autonomous robotic platforms for locating radio sources buried under rubble

    NASA Astrophysics Data System (ADS)

    Tasu, A. S.; Anchidin, L.; Tamas, R.; Paun, M.; Danisor, A.; Petrescu, T.

    2016-12-01

    This paper deals with the use of autonomous robotic platforms able to locate radio signal sources, such as mobile phones, buried under buildings that have collapsed as a result of earthquakes, natural disasters, terrorism, or war. The technique relies on averaging position data obtained from a propagation model implemented on the platform with the data acquired by the robotic platforms at the disaster site, which allows the approximate position of radio sources buried under the rubble to be calculated. Based on the measurements, a radio map of the disaster site is produced; this map is very useful for locating victims and for guiding rubble-lifting machinery, on the assumption that there is a victim next to any mobile device detected by the robotic platform. Because the approximate position is known, the lifting machinery does not risk further hurting the victims; moreover, knowing the positions of the victims decreases the reaction time and clearly increases the chances of survival for victims buried under the rubble.
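
    The localization step described above fits measurements taken at several robot positions to a propagation model. A least-squares sketch using a log-distance path-loss model is shown below; the path-loss exponent, reference power, and measurement geometry are assumptions for illustration only.

    ```python
    import numpy as np

    # Log-distance path-loss model: rssi = P0 - 10 * n * log10(d)   (d in metres)
    P0, n_exp = -40.0, 2.5          # assumed reference power and path-loss exponent

    def rssi(src, pos):
        d = max(np.linalg.norm(src - pos), 0.1)
        return P0 - 10.0 * n_exp * np.log10(d)

    true_src = np.array([6.0, 4.0])                       # buried phone (unknown)
    robot_pts = np.array([[0, 0], [10, 0], [0, 10], [10, 10], [5, 0]], float)
    meas = np.array([rssi(true_src, p) for p in robot_pts])
    meas += np.random.default_rng(1).normal(0, 1.0, meas.size)   # measurement noise

    # Grid search: pick the candidate source position whose predicted RSSI best
    # matches all measurements (least-squares residual).
    xs, ys = np.meshgrid(np.linspace(0, 10, 101), np.linspace(0, 10, 101))
    best, best_err = None, np.inf
    for x, y in zip(xs.ravel(), ys.ravel()):
        pred = np.array([rssi(np.array([x, y]), p) for p in robot_pts])
        err = np.sum((pred - meas) ** 2)
        if err < best_err:
            best, best_err = (x, y), err

    print("estimated source position:", np.round(best, 2))
    ```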

  19. Autonomous Robotic Refueling System (ARRS) for rapid aircraft turnaround

    NASA Astrophysics Data System (ADS)

    Williams, O. R.; Jackson, E.; Rueb, K.; Thompson, B.; Powell, K.

    An autonomous robotic refuelling system is being developed to achieve rapid aircraft turnaround, notably during combat operations. The proposed system includes a gantry positioner with sufficient reach to position a robotic arm that performs the refuelling tasks; a six degree of freedom manipulator equipped with a remote center of compliance, torque sensor, and a gripper that can handle standard tools; a computer vision system to locate and guide the refuelling nozzle, inspect the nozzle, and avoid collisions; and an operator interface with video and graphics display. The control system software will include components designed for trajectory planning and generation, collision detection, sensor interfacing, sensory processing, and human interfacing. The robotic system will be designed so that upgrading to perform additional tasks will be relatively straightforward.

  20. Distributed multisensor blackboard system for an autonomous robot

    NASA Astrophysics Data System (ADS)

    Kappey, Dietmar; Pokrandt, Peter; Schloen, Jan

    1994-10-01

    Sensor data enable a robotic system to react to events occurring in its environment. Much work has been done on the development of various sensors and algorithms to extract information from an environment; on the other hand, relatively little work has been done in the field of multisensor communication. This paper presents a shared-memory-based communication protocol that has been developed for the autonomous robot system KAMRO, which consists of two PUMA 260 manipulators and an omnidirectionally driven mobile platform. The proposed approach is based on logical sensors, which can be used to dynamically build hierarchical sensor units. The protocol uses a distributed blackboard structure for the transmission of sensor data and commands. To support asynchronous coupling of robots and sensors, it not only transfers single sensor values but also offers functions to estimate future values.

  1. A functional system architecture for fully autonomous robot

    NASA Astrophysics Data System (ADS)

    Kalaycioglu, S.

    The Mobile Servicing System (MSS) Autonomous Robotics Program intends to define and plan the development of technologies required to provide a supervised autonomous operation capability for the Special Purpose Dexterous Manipulator (SPDM) on the MSS. The operational functions for the SPDM to perform the required tasks, both in fully autonomous or supervised modes, are identified. Functional decomposition is performed using a graphics oriented methodology called Structural Analysis Design Technique. This process defines the functional architecture of the system, the types of data required to support its functionality, and the control processes that need to be emplaced. On the basis of the functional decomposition, a technology breakdown structure is also developed. A preliminary estimate of the status and maturity of each relevant technology is made, based on this technology breakdown. The developed functional hierarchy is found to be very effective for a robotic system with any level of autonomy. Moreover, this hierarchy can easily be applied to an existing very low level autonomous system and can provide a smooth transition towards a higher degree of autonomy. The effectiveness of the developed functional hierarchy will also play a very significant role both in the system design as well as in the development of the control hierarchy.

  2. Active objects programming for military autonomous mobile robots software prototyping

    NASA Astrophysics Data System (ADS)

    Cozien, Roger F.

    2001-09-01

    While designing mobile robots, we think that the prototyping phase is really critical. Good and clever choices have to be made. Indeed, we may not easily upgrade such robots, and most of all, when the robot is on its own, any change in either the software or the physical body is going to be very difficult, if not impossible. Thus, a great effort has to be made when prototyping the robot. Furthermore, I think that the kind of programming is very important. If your programming model is not expressive enough, you may experience a great deal of difficulty adding all the features you want in order to give your robot reactiveness and decision-making autonomy. Moreover, designing and prototyping the on-board software of a reactive robot brings other difficulties. Reactivity is not merely a matter of rapidity: a reactive system is a system able to respond to a huge panel of situations for which it does not have a schedule. In other words, the robot does not know when a particular situation may occur and, above all, what it will be doing at that time and what its internal state will be. This kind of robot must be able to take a decision and to act even without all the contextual information. To do so, we use a computer language named oRis featuring object and active-object oriented programming, but also parallel and dynamic code (the code can be changed during its own execution). This last point has been made possible because oRis is fully interpreted; however, oRis may call fully compiled code, as well as Prolog and Java code. An oRis program may be distributed over several computers using TCP/IP network connections. The main issue in this paper is to show how active-object oriented programming, as a modern extension of object-oriented programming, may help us in designing autonomous mobile robots. Based on fully parallel software programming, active-object code allows us to give many features to a robot, and to easily solve

  3. Active object programming for military autonomous mobile robot software prototyping

    NASA Astrophysics Data System (ADS)

    Cozien, Roger F.

    2001-10-01

    While designing mobile robots, we think that the prototyping phase is really critical. Good and clever choices have to be made. Indeed, we may not easily upgrade such robots, and most of all, when the robot is on its own, any change in either the software or the physical body is going to be very difficult, if not impossible. Thus, a great effort has to be made when prototyping the robot. Furthermore, I think that the kind of programming is very important. If your programming model is not expressive enough, you may experience a great deal of difficulty adding all the features you want in order to give your robot reactiveness and decision-making autonomy. Moreover, designing and prototyping the on-board software of a reactive robot brings other difficulties. Reactivity is not merely a matter of rapidity: a reactive system is a system able to respond to a huge panel of situations for which it does not have a schedule. In other words, the robot does not know when a particular situation may occur and, above all, what it will be doing at that time and what its internal state will be. This kind of robot must be able to take a decision and to act even without all the contextual information. To do so, we use a computer language named oRis featuring object and active-object oriented programming, but also parallel and dynamic code (the code can be changed during its own execution). This last point has been made possible because oRis is fully interpreted; however, oRis may call fully compiled code, as well as Prolog and Java code. An oRis program may be distributed over several computers using TCP/IP network connections. The main issue in this paper is to show how active-object oriented programming, as a modern extension of object-oriented programming, may help us in designing autonomous mobile robots. Based on fully parallel software programming, active-object code allows us to give many features to a robot, and to easily solve

  4. Using Dynamic Capability Evaluation to Organize a Team of Cooperative, Autonomous Robots

    DTIC Science & Technology

    2003-01-01

    Eric Matson; Scott DeLoach. Multi-agent and Cooperative... Apart from report documentation-page fields, the only recoverable abstract fragment reads: "...company structure. Agent: equivalent to autonomous robots in this instance. Agents coordinate through the organization via conversations and act..."

  5. Gyro and Accelerometer Based Navigation System for a Mobile Autonomous Robot.

    DTIC Science & Technology

    1985-12-02

    Thesis by Roland J. Bloom and William J. Ramey, Jr., Captains, USAF; presented to the Faculty of the School of Engineering of the Air Force Institute of Technology, Air University. The remainder of this record is OCR residue from the report cover pages; no abstract is available.

  6. Context-Based Intent Understanding for Autonomous Systems in Naval and Collaborative Robot Applications

    DTIC Science & Technology

    2013-10-29

    Final report covering 8/1/2009 to 7/31/2013. Recoverable abstract fragments: "...are very good at recognizing intentions, endowing an autonomous system (robot or simulated agent) with similar skills is a more complex problem, which..." and "...understanding, with specific focus on autonomous systems for naval and collaborative robotics applications. The main research problems we will address in..." The remainder of the record is report documentation-page boilerplate.

  7. Evolving self-assembly in autonomous homogeneous robots: experiments with two physical robots.

    PubMed

    Ampatzis, Christos; Tuci, Elio; Trianni, Vito; Christensen, Anders Lyhne; Dorigo, Marco

    2009-01-01

    This research work illustrates an approach to the design of controllers for self-assembling robots in which the self-assembly is initiated and regulated by perceptual cues that are brought forth by the physical robots through their dynamical interactions. More specifically, we present a homogeneous control system that can achieve assembly between two modules (two fully autonomous robots) of a mobile self-reconfigurable system without a priori introduced behavioral or morphological heterogeneities. The controllers are dynamic neural networks evolved in simulation that directly control all the actuators of the two robots. The neurocontrollers cause the dynamic specialization of the robots by allocating roles between them based solely on their interaction. We show that the best evolved controller proves to be successful when tested on a real hardware platform, the swarm-bot. The performance achieved is similar to the one achieved by existing modular or behavior-based approaches, also due to the effect of an emergent recovery mechanism that was neither explicitly rewarded by the fitness function, nor observed during the evolutionary simulation. Our results suggest that direct access to the orientations or intentions of the other agents is not a necessary condition for robot coordination: Our robots coordinate without direct or explicit communication, contrary to what is assumed by most research works in collective robotics. This work also contributes to strengthening the evidence that evolutionary robotics is a design methodology that can tackle real-world tasks demanding fine sensory-motor coordination.

  8. Sensing and modelling concepts for autonomous and remote mobile robot control

    NASA Astrophysics Data System (ADS)

    Yeung, S. K.; McMath, W. S.; Necsulescu, D.; Petriu, E. M.

    1993-01-01

    Sensing functions and modeling concepts for autonomous and remote mobile robot control in an unstructured environment are discussed. Sensing methods for robot position recovery, object recognition, and tactile teleoperator feedback are presented.

  9. Acquisition of Autonomous Behaviors by Robotic Assistants

    NASA Technical Reports Server (NTRS)

    Peters, R. A., II; Sarkar, N.; Bodenheimer, R. E.; Brown, E.; Campbell, C.; Hambuchen, K.; Johnson, C.; Koku, A. B.; Nilas, P.; Peng, J.

    2005-01-01

    Our research achievements under the NASA-JSC grant contributed significantly in the following areas. Multi-agent based robot control architecture called the Intelligent Machine Architecture (IMA): the Vanderbilt team received a Space Act Award for this research from NASA JSC in October 2004. Cognitive control and the Self Agent: cognitive control in humans is the ability to consciously manipulate thoughts and behaviors using attention to deal with conflicting goals and demands; we have been updating the IMA Self Agent towards this goal, and if the opportunity arises, we would like to work with NASA to empower Robonaut to do cognitive control. Applications: 1. SES for Robonaut; 2. Robonaut Fault Diagnostic System; 3. ISAC Behavior Generation and Learning; 4. Segway Research.

  10. Recognition of traversable areas for mobile robotic navigation in outdoor environments.

    SciTech Connect

    Hutchinson, Scott Alan; Davidson, James C.

    2003-06-01

    In this paper we consider the problem of automatically determining whether regions in an outdoor environment can be traversed by a mobile robot. We propose a two-level classifier that uses data from a single color image to make this determination. At the low level, we have implemented three classifiers based on color histograms, directional filters and local binary patterns. The outputs of these low level classifiers are combined using a voting scheme that weights the results of each classifier using an estimate of its error probability. We present results from a large number of trials using a database of representative images acquired in real outdoor environments.
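
    The combination rule described in this record weights each low-level classifier's vote by an estimate of its error probability. A small sketch of such weighted voting follows; the log-odds weighting and the numbers are illustrative choices, not necessarily the authors' exact scheme.

    ```python
    import math

    # Per-classifier estimated error probabilities (e.g., from validation data).
    error_prob = {"color_histogram": 0.20, "directional_filter": 0.30, "lbp": 0.25}

    # Votes for one image region: +1 = traversable, -1 = not traversable.
    votes = {"color_histogram": +1, "directional_filter": -1, "lbp": +1}

    # Weight each vote by log((1 - e) / e), so more reliable classifiers count more.
    score = sum(math.log((1.0 - e) / e) * votes[name]
                for name, e in error_prob.items())
    print("traversable" if score > 0 else "not traversable", round(score, 3))
    ```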

  11. A software architecture for autonomous orbital robotics

    NASA Astrophysics Data System (ADS)

    Henshaw, Carl G.; Akins, Keith; Creamer, N. Glenn; Faria, Matthew; Flagg, Cris; Hayden, Matthew; Healy, Liam; Hrolenok, Brian; Johnson, Jeffrey; Lyons, Kimberly; Pipitone, Frank; Tasker, Fred

    2006-05-01

    SUMO, the Spacecraft for the Universal Modification of Orbits, is a DARPA-sponsored spacecraft designed to provide orbital repositioning services to geosynchronous satellites. Such services may be needed to facilitate changing the geostationary slot of a satellite, to allow a satellite to be used until the propellant is expended instead of reserving propellant for a retirement burn, or to rescue a satellite stranded in geosynchronous transfer orbit due to a launch failure. Notably, SUMO is being designed to be compatible with the current geosynchronous satellite catalog, which implies that it does not require the customer spacecraft to have special docking fixtures, optical guides, or cooperative communications or pose sensors. In addition, the final approach and grapple will be performed autonomously. SUMO is being designed and built by the Naval Center for Space Technology, a division of the U.S. Naval Research Laboratory in Washington, DC. The nature of the SUMO concept mission leads to significant challenges in onboard spacecraft autonomy. Also, because research and development in machine vision, trajectory planning, and automation algorithms for SUMO is being pursued in parallel with flight software development, there are considerable challenges in prototyping and testing algorithms in situ and in transitioning these algorithms from laboratory form into software suitable for flight. This paper discusses these challenges, outlining the current SUMO design from the standpoint of flight algorithms and software. In particular, the design of the SUMO phase 1 laboratory demonstration software is described in detail. The proposed flight-like software architecture is also described.

  12. Knowledge/geometry-based Mobile Autonomous Robot Simulator (KMARS)

    NASA Technical Reports Server (NTRS)

    Cheng, Linfu; Mckendrick, John D.; Liu, Jeffrey

    1990-01-01

    Ongoing applied research is focused on developing guidance system for robot vehicles. Problems facing the basic research needed to support this development (e.g., scene understanding, real-time vision processing, etc.) are major impediments to progress. Due to the complexity and the unpredictable nature of a vehicle's area of operation, more advanced vehicle control systems must be able to learn about obstacles within the range of its sensor(s). A better understanding of the basic exploration process is needed to provide critical support to developers of both sensor systems and intelligent control systems which can be used in a wide spectrum of autonomous vehicles. Elcee Computek, Inc. has been working under contract to the Flight Dynamics Laboratory, Wright Research and Development Center, Wright-Patterson AFB, Ohio to develop a Knowledge/Geometry-based Mobile Autonomous Robot Simulator (KMARS). KMARS has two parts: a geometry base and a knowledge base. The knowledge base part of the system employs the expert-system shell CLIPS ('C' Language Integrated Production System) and necessary rules that control both the vehicle's use of an obstacle detecting sensor and the overall exploration process. The initial phase project has focused on the simulation of a point robot vehicle operating in a 2D environment.

  13. Low-cost semi-autonomous manipulation technique for explosive ordnance disposal robots

    NASA Astrophysics Data System (ADS)

    Czop, Andrew; Del Signore, Michael J.; Hacker, Kurt

    2008-04-01

    Robotic manipulators used on current EOD robotic platforms exhibit very few autonomous capabilities. This lack of autonomy forces the operator to completely control manipulator movements. With the increasing complexity of robotic manipulators, this can prove to be a very complex and tedious task. The development of autonomous capabilities for platform navigation is currently being extensively researched and applied to EOD robots. While autonomous manipulation has also been researched, this technology has yet to appear in fielded EOD robotic systems. As a result, there is a need for the exploration and development of manipulator automation within the scope of EOD robotics. In addition, due to the expendable nature of EOD robotic assets, this technology must add little to the overall cost of the robotic system. To directly address the need for a low-cost semi-autonomous manipulation capability for EOD robots, the Naval Explosive Ordnance Disposal Technology Division (NAVEODTECHDIV) proposes the Autonomous Robotic Manipulator (ARM). The ARM incorporates several semi-autonomous manipulation behaviors including point-and-click movement, user-defined distance movement, user-defined angle positioning, memory locations to save and recall manipulator positions, and macros to memorize and repeat multi-position repetitive manipulator movements. These semi-autonomous behaviors will decrease an EOD operator's time on target by reducing the manipulation workload in a user-friendly fashion. This conference paper will detail the background of the project, design of the prototype, algorithm development, implementation, results, and future work.
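
    A minimal sketch of the "memory locations" and "macros" behaviors mentioned above: named joint configurations can be saved and recalled, and a macro replays a recorded sequence of them. The move_to() callback and joint values are illustrative assumptions, not the ARM interface.

        # Hedged sketch of manipulator position memory and macros.
        class ManipulatorMemory:
            def __init__(self, move_to):
                self.move_to = move_to            # callback that drives the arm to joint angles
                self.positions = {}               # name -> tuple of joint angles
                self.macros = {}                  # name -> list of position names

            def save(self, name, joint_angles):
                self.positions[name] = tuple(joint_angles)

            def recall(self, name):
                self.move_to(self.positions[name])

            def record_macro(self, name, position_names):
                self.macros[name] = list(position_names)

            def run_macro(self, name):
                for pos in self.macros[name]:
                    self.recall(pos)

        mem = ManipulatorMemory(move_to=lambda q: print("moving to", q))
        mem.save("stow", (0.0, -1.2, 0.8))
        mem.save("inspect", (0.4, -0.6, 1.1))
        mem.record_macro("stow_then_inspect", ["stow", "inspect"])
        mem.run_macro("stow_then_inspect")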

  14. Laser-based pedestrian tracking in outdoor environments by multiple mobile robots.

    PubMed

    Ozaki, Masataka; Kakimuma, Kei; Hashimoto, Masafumi; Takahashi, Kazuhiko

    2012-10-29

    This paper presents an outdoor laser-based pedestrian tracking system using a group of mobile robots located near each other. Each robot detects pedestrians from its own laser scan image using an occupancy-grid-based method, and the robot tracks the detected pedestrians via Kalman filtering and global-nearest-neighbor (GNN)-based data association. The tracking data is broadcast to multiple robots through intercommunication and is combined using the covariance intersection (CI) method. For pedestrian tracking, each robot identifies its own posture using real-time-kinematic GPS (RTK-GPS) and laser scan matching. Using our cooperative tracking method, all the robots share the tracking data with each other; hence, individual robots can always recognize pedestrians that are invisible to any other robot. The simulation and experimental results show that cooperative tracking provides better tracking performance than conventional individual tracking. Our tracking system functions in a decentralized manner without any central server, and therefore provides a degree of scalability and robustness that cannot be achieved by conventional centralized architectures.
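
    A minimal sketch of covariance intersection (CI) for fusing two track estimates of the same pedestrian reported by two robots, as described above. CI is typically used when the cross-correlation between the estimates is unknown. The variable names, example values, and the trace-minimizing grid search over the weight are illustrative assumptions, not the paper's implementation.

        # Hedged sketch of covariance intersection fusion of two Gaussian track estimates.
        import numpy as np

        def covariance_intersection(xa, Pa, xb, Pb, n_grid=51):
            best = None
            for w in np.linspace(0.0, 1.0, n_grid):
                info = w * np.linalg.inv(Pa) + (1.0 - w) * np.linalg.inv(Pb)
                P = np.linalg.inv(info)
                if best is None or np.trace(P) < best[0]:
                    x = P @ (w * np.linalg.inv(Pa) @ xa + (1.0 - w) * np.linalg.inv(Pb) @ xb)
                    best = (np.trace(P), x, P)
            return best[1], best[2]

        xa, Pa = np.array([2.0, 1.0]), np.diag([0.5, 2.0])   # robot A's estimate (made-up)
        xb, Pb = np.array([2.3, 0.8]), np.diag([2.0, 0.4])   # robot B's estimate (made-up)
        x_fused, P_fused = covariance_intersection(xa, Pa, xb, Pb)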

  15. Laser-Based Pedestrian Tracking in Outdoor Environments by Multiple Mobile Robots

    PubMed Central

    Ozaki, Masataka; Kakimuma, Kei; Hashimoto, Masafumi; Takahashi, Kazuhiko

    2012-01-01

    This paper presents an outdoor laser-based pedestrian tracking system using a group of mobile robots located near each other. Each robot detects pedestrians from its own laser scan image using an occupancy-grid-based method, and the robot tracks the detected pedestrians via Kalman filtering and global-nearest-neighbor (GNN)-based data association. The tracking data is broadcast to multiple robots through intercommunication and is combined using the covariance intersection (CI) method. For pedestrian tracking, each robot identifies its own posture using real-time-kinematic GPS (RTK-GPS) and laser scan matching. Using our cooperative tracking method, all the robots share the tracking data with each other; hence, individual robots can always recognize pedestrians that are invisible to any other robot. The simulation and experimental results show that cooperative tracking provides better tracking performance than conventional individual tracking. Our tracking system functions in a decentralized manner without any central server, and therefore provides a degree of scalability and robustness that cannot be achieved by conventional centralized architectures. PMID:23202171

  16. Multi Sensor Fusion Framework for Indoor-Outdoor Localization of Limited Resource Mobile Robots

    PubMed Central

    Marín, Leonardo; Vallés, Marina; Soriano, Ángel; Valera, Ángel; Albertos, Pedro

    2013-01-01

    This paper presents a sensor fusion framework that improves the localization of mobile robots with limited computational resources. It employs an event based Kalman Filter to combine the measurements of a global sensor and an inertial measurement unit (IMU) on an event based schedule, using fewer resources (execution time and bandwidth) but with similar performance when compared to the traditional methods. The event is defined to reflect the necessity of the global information, when the estimation error covariance exceeds a predefined limit. The proposed experimental platforms are based on the LEGO Mindstorms NXT, and consist of a differential wheel mobile robot navigating indoors with a zenithal camera as global sensor, and an Ackermann steering mobile robot navigating outdoors with an SBG Systems GPS accessed through an IGEP board that also serves as datalogger. The IMU in both robots is built using the NXT motor encoders along with one gyroscope, one compass and two accelerometers from HiTechnic, placed according to a particle based dynamic model of the robots. The tests performed reflect the correct performance and low execution time of the proposed framework. The robustness and stability are observed during a long walk test in both indoor and outdoor environments. PMID:24152933
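
    A minimal sketch of the event-triggered idea described above: the prediction step runs every cycle with the local (IMU/odometry) data, and the global sensor is incorporated only when the estimation error covariance grows past a limit. The state model, matrices, and threshold below are illustrative assumptions, not the paper's filter.

        # Hedged sketch of an event-based Kalman update triggered by covariance growth.
        import numpy as np

        F = np.eye(2); Q = 0.01 * np.eye(2)          # simple 2D position model (assumed)
        H = np.eye(2); R = 0.05 * np.eye(2)          # global sensor model (assumed)
        COV_LIMIT = 0.2                               # event threshold on trace(P) (assumed)

        def step(x, P, u, global_fix=None):
            x, P = F @ x + u, F @ P @ F.T + Q        # predict with local motion estimate u
            if np.trace(P) > COV_LIMIT and global_fix is not None:   # event: use the global sensor
                S = H @ P @ H.T + R
                K = P @ H.T @ np.linalg.inv(S)
                x = x + K @ (global_fix - H @ x)
                P = (np.eye(2) - K @ H) @ P
            return x, P

        x, P = np.zeros(2), 0.1 * np.eye(2)
        for k in range(20):
            x, P = step(x, P, u=np.array([0.1, 0.0]),
                        global_fix=np.array([0.1 * (k + 1), 0.0]))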

  17. Multi sensor fusion framework for indoor-outdoor localization of limited resource mobile robots.

    PubMed

    Marín, Leonardo; Vallés, Marina; Soriano, Ángel; Valera, Ángel; Albertos, Pedro

    2013-10-21

    This paper presents a sensor fusion framework that improves the localization of mobile robots with limited computational resources. It employs an event based Kalman Filter to combine the measurements of a global sensor and an inertial measurement unit (IMU) on an event based schedule, using fewer resources (execution time and bandwidth) but with similar performance when compared to the traditional methods. The event is defined to reflect the necessity of the global information, when the estimation error covariance exceeds a predefined limit. The proposed experimental platforms are based on the LEGO Mindstorms NXT, and consist of a differential wheel mobile robot navigating indoors with a zenithal camera as global sensor, and an Ackermann steering mobile robot navigating outdoors with an SBG Systems GPS accessed through an IGEP board that also serves as datalogger. The IMU in both robots is built using the NXT motor encoders along with one gyroscope, one compass and two accelerometers from HiTechnic, placed according to a particle based dynamic model of the robots. The tests performed reflect the correct performance and low execution time of the proposed framework. The robustness and stability are observed during a long walk test in both indoor and outdoor environments.

  18. Emotion understanding from the perspective of autonomous robots research.

    PubMed

    Cañamero, Lola

    2005-05-01

    In this paper, I discuss some of the contributions that modeling emotions in autonomous robots can make towards understanding human emotions ('as sited in the brain' and as used in our interactions with the environment) and emotions in general. Such contributions are linked, on the one hand, to the potential use of such robotic models as tools and 'virtual laboratories' to test and systematically explore theories and models of human emotions, and on the other hand to a modeling approach that fosters conceptual clarification and operationalization of the relevant aspects of theoretical notions and models. As illustrated by an overview of recent advances in the field, this area is still in its infancy. However, the work carried out already shows that we share many conceptual problems and interests with other disciplines in the affective sciences and that sound progress necessitates multidisciplinary efforts.

  19. Dynamic map building for an autonomous mobile robot

    SciTech Connect

    Leonard, J.J.; Durrant-Whyte, H.F.; Cox, I.J.

    1992-08-01

    This article presents an algorithm for autonomous map building and maintenance for a mobile robot. The authors believe that mobile robot navigation can be treated as a problem of tracking geometric features that occur naturally in the environment. They represent each feature in the map by a location estimate (the feature state vector) and two distinct measures of uncertainty: a covariance matrix to represent uncertainty in feature location, and a credibility measure to represent their belief in the validity of the feature. During each position update cycle, predicted measurements are generated for each geometric feature in the map and compared with actual sensor observations. Successful matches cause a feature's credibility to be increased. Unpredicted observations are used to initialize new geometric features, while unobserved predictions result in a geometric feature's credibility being decreased. They also describe experimental results obtained with the algorithm that demonstrate successful map building using real sonar data.
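
    A minimal sketch of the credibility bookkeeping described above: each map feature keeps a location estimate, a covariance, and a credibility score that is raised on matched predictions and lowered on unobserved ones; unpredicted observations seed new features. The update constants and the distance-gate match test are illustrative assumptions, not the paper's algorithm.

        # Hedged sketch of feature map maintenance with a credibility measure.
        import numpy as np

        class Feature:
            def __init__(self, location, cov):
                self.location = np.asarray(location, dtype=float)   # feature state vector
                self.cov = np.asarray(cov, dtype=float)             # location uncertainty
                self.credibility = 0.5                               # belief that the feature is valid

        def update_map(features, observations, gate=1.0):
            unmatched = [np.asarray(z, dtype=float) for z in observations]
            for f in features:
                match = next((z for z in unmatched
                              if np.linalg.norm(z - f.location) < gate), None)
                if match is not None:                 # successful match: raise credibility
                    f.credibility = min(1.0, f.credibility + 0.1)
                    unmatched.remove(match)
                else:                                 # unobserved prediction: lower credibility
                    f.credibility = max(0.0, f.credibility - 0.1)
            for z in unmatched:                       # unpredicted observation: new feature
                features.append(Feature(z, 0.5 * np.eye(len(z))))
            return features

        feats = [Feature([1.0, 0.0], 0.2 * np.eye(2))]
        feats = update_map(feats, [[1.1, 0.1], [4.0, 2.0]])
        print([(f.location.tolist(), f.credibility) for f in feats])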

  20. On autonomous terrain model acquisition by a mobile robot

    NASA Technical Reports Server (NTRS)

    Rao, N. S. V.; Iyengar, S. S.; Weisbin, C. R.

    1987-01-01

    The following problem is considered: A point robot is placed in a terrain populated by an unknown number of polyhedral obstacles of varied sizes and locations in two/three dimensions. The robot is equipped with a sensor capable of detecting all the obstacle vertices and edges that are visible from the present location of the robot. The robot is required to autonomously navigate and build the complete terrain model using the sensor information. It is established that the necessary number of scanning operations needed for complete terrain model acquisition by any algorithm based on the 'scan from vertices' strategy is given by Σ_{i=1}^{n} N(O_i) - n and Σ_{i=1}^{n} N(O_i) - 2n in two- and three-dimensional terrains, respectively, where O = {O_1, O_2, ..., O_n} is the set of obstacles in the terrain and N(O_i) is the number of vertices of obstacle O_i.

  1. On autonomous terrain model acquisition by a mobile robot

    SciTech Connect

    Rao, N.S.V.; Iyengar, S.S.; Weisbin, C.R.

    1987-01-20

    The following problem is considered: A point robot is placed in a terrain populated by an unknown number of polyhedral obstacles of varied sizes and locations in two/three dimensions. The robot is equipped with a sensor capable of detecting all the obstacle vertices and edges that are visible from the present location of the robot. The robot is required to autonomously navigate and build the complete terrain model using the sensor information. It is established that the necessary number of scanning operations needed for complete terrain model acquisition by any algorithm based on the 'scan from vertices' strategy is given by Σ_{i=1}^{n} N(O_i) - n and Σ_{i=1}^{n} N(O_i) - 2n in two- and three-dimensional terrains, respectively, where O = {O_1, O_2, ..., O_n} is the set of obstacles in the terrain and N(O_i) is the number of vertices of obstacle O_i.

  2. Classifying and recovering from sensing failures in autonomous mobile robots

    SciTech Connect

    Murphy, R.R.; Hershberger, D.

    1996-12-31

    This paper presents a characterization of sensing failures in autonomous mobile robots, a methodology for classification and recovery, and a demonstration of this approach on a mobile robot performing landmark navigation. A sensing failure is any event leading to defective perception, including sensor malfunctions, software errors, environmental changes, and errant expectations. The approach demonstrated in this paper exploits the ability of the robot to interact with its environment to acquire additional information for classification (i.e., active perception). A Generate and Test strategy is used to generate hypotheses to explain the symptom resulting from the sensing failure. The recovery scheme replaces the affected sensing processes with an alternative logical sensor. The approach is implemented as the Sensor Fusion Effects Exception Handling (SFX-EH) architecture. The advantages of SFX-EH are that it requires only a partial causal model of sensing failure, the control scheme strives for a fast response, tests are constructed so as to prevent confounding from collaborating sensors which have also failed, and the logical sensor organization allows SFX-EH to be interfaced with the behavioral level of existing robot architectures.

  3. Performance of visual and ultrasound sensing by an autonomous robot

    SciTech Connect

    Beckerman, M.; Barnett, D.L.

    1991-01-01

    This paper presents results of an experimental study of the reliability of an autonomous mobile robot operating in an unstructured environment. Examined in the study are the principal components of the visual and ultrasound sensor systems used to guide navigation and manipulation tasks of the robot. Performance criteria are established with respect to the requirements of the integrated robotic system. Repeated measurements are done of the geometric and spatial quantities used for docking the robot at a mock-up control panel, and for locating control panel devices to be manipulated. The systematic and random components of the errors in the measured quantities are exhibited, their origins are identified, and means for their reduction are developed. We focus on refinements of visual area data using ultrasound range data, and on extraction of yaw by visual and by ultrasound methods. Monte Carlo methods are used to study the sensor fusion, and angle-dependence considerations are used to characterize the precision of the yaw measurements. Issues relating to sensor models and sensor fusion, viewed as essential strategic components of intelligent systems, are then discussed. 32 refs., 13 figs., 5 tabs.

  4. Thermal Imaging for Robotic Applications in Outdoor Scenes

    DTIC Science & Technology

    1990-04-01

    Report excerpt: the combined emitted and reflected energy is called radiosity. Since there is almost no reflected energy in the infrared wavelength bands used by thermal cameras, the radiosity is essentially the emitted energy. A blackbody absorbs all incident energy, so there is no reflected and no transmitted energy: a = 1 and p = r = 0, where a is the absorptivity.

  5. Using Robotic Operating System (ROS) to control autonomous observatories

    NASA Astrophysics Data System (ADS)

    Vilardell, Francesc; Artigues, Gabriel; Sanz, Josep; García-Piquer, Álvaro; Colomé, Josep; Ribas, Ignasi

    2016-07-01

    Astronomical observatories are complex systems requiring the integration of numerous devices into a common platform. We present here the first steps to integrate the popular Robotic Operating System (ROS) into the control of a fully autonomous observatory. The observatory is also equipped with a decision-making procedure that can automatically react to a changing environment (such as weather events). The results obtained so far have shown that the automation of a small observatory can be greatly simplified, and made robust, by using ROS together with our decision-making algorithms.

  6. Autonomous, teleoperated, and shared control of robot systems

    SciTech Connect

    Anderson, R.J.

    1994-12-31

    This paper illustrates how different modes of operation such as bilateral teleoperation, autonomous control, and shared control can be described and implemented using combinations of modules in the SMART robot control architecture. Telerobotics modes are characterized by different 'grids' of SMART icons, where each icon represents a portion of run-time code that implements a passive control law. By placing strict requirements on the module's input-output behavior and using scattering theory to develop a passive sampling technique, a flexible, expandable telerobot architecture is achieved. An automatic code generation tool for generating SMART systems is also described.

  7. An Economical Framework for Verification of Swarm-Based Algorithms Using Small, Autonomous Robots

    DTIC Science & Technology

    2006-09-01

    Authors: James Bobinchak, Eric Ford, Rodney Heil, and Duane Schwartzwald (NAWCWD TP 8630).

  8. Design of a Micro-Autonomous Robot for Use in Astronomical Instruments

    NASA Astrophysics Data System (ADS)

    Cochrane, W. A.; Luo, X.; Lim, T.; Taylor, W. D.; Schnetler, H.

    2012-07-01

    A Micro-Autonomous Positioning System (MAPS) has been developed using micro-autonomous robots for the deployment of small mirrors within multi-object astronomical instruments for use on the next generation of ground-based telescopes. The micro-autonomous robot is a two-wheel differential drive robot with a footprint of approximately 20 × 20 mm. The robot uses two brushless DC Smoovy motors with 125:1 planetary gearheads for positioning the mirror. This article describes the various elements of the overall system and, in more detail, the various robot designs. Also described in this article is the build and test of the most promising design, proving that micro-autonomous robot technology can be used in precision-controlled applications.

  9. Using Insect Electroantennogram Sensors on Autonomous Robots for Olfactory Searches

    PubMed Central

    Martinez, Dominique; Arhidi, Lotfi; Demondion, Elodie; Masson, Jean-Baptiste; Lucas, Philippe

    2014-01-01

    Robots designed to track chemical leaks in hazardous industrial facilities or explosive traces in landmine fields face the same problem as insects foraging for food or searching for mates: the olfactory search is constrained by the physics of turbulent transport. The concentration landscape of wind borne odors is discontinuous and consists of sporadically located patches. A pre-requisite to olfactory search is that intermittent odor patches are detected. Because of its high speed and sensitivity, the olfactory organ of insects provides a unique opportunity for detection. Insect antennae have been used in the past to detect not only sex pheromones but also chemicals that are relevant to humans, e.g., volatile compounds emanating from cancer cells or toxic and illicit substances. We describe here a protocol for using insect antennae on autonomous robots and present a proof of concept for tracking odor plumes to their source. The global response of olfactory neurons is recorded in situ in the form of electroantennograms (EAGs). Our experimental design, based on a whole insect preparation, allows stable recordings within a working day. In comparison, EAGs on excised antennae have a lifetime of 2 hr. A custom hardware/software interface was developed between the EAG electrodes and a robot. The measurement system resolves individual odor patches up to 10 Hz, which exceeds the time scale of artificial chemical sensors. The efficiency of EAG sensors for olfactory searches is further demonstrated in driving the robot toward a source of pheromone. By using identical olfactory stimuli and sensors as in real animals, our robotic platform provides a direct means for testing biological hypotheses about olfactory coding and search strategies. It may also prove beneficial for detecting other odorants of interests by combining EAGs from different insect species in a bioelectronic nose configuration or using nanostructured gas sensors that mimic insect antennae.

  10. Using insect electroantennogram sensors on autonomous robots for olfactory searches.

    PubMed

    Martinez, Dominique; Arhidi, Lotfi; Demondion, Elodie; Masson, Jean-Baptiste; Lucas, Philippe

    2014-08-04

    Robots designed to track chemical leaks in hazardous industrial facilities or explosive traces in landmine fields face the same problem as insects foraging for food or searching for mates: the olfactory search is constrained by the physics of turbulent transport. The concentration landscape of wind borne odors is discontinuous and consists of sporadically located patches. A pre-requisite to olfactory search is that intermittent odor patches are detected. Because of its high speed and sensitivity, the olfactory organ of insects provides a unique opportunity for detection. Insect antennae have been used in the past to detect not only sex pheromones but also chemicals that are relevant to humans, e.g., volatile compounds emanating from cancer cells or toxic and illicit substances. We describe here a protocol for using insect antennae on autonomous robots and present a proof of concept for tracking odor plumes to their source. The global response of olfactory neurons is recorded in situ in the form of electroantennograms (EAGs). Our experimental design, based on a whole insect preparation, allows stable recordings within a working day. In comparison, EAGs on excised antennae have a lifetime of 2 hr. A custom hardware/software interface was developed between the EAG electrodes and a robot. The measurement system resolves individual odor patches up to 10 Hz, which exceeds the time scale of artificial chemical sensors. The efficiency of EAG sensors for olfactory searches is further demonstrated in driving the robot toward a source of pheromone. By using identical olfactory stimuli and sensors as in real animals, our robotic platform provides a direct means for testing biological hypotheses about olfactory coding and search strategies. It may also prove beneficial for detecting other odorants of interests by combining EAGs from different insect species in a bioelectronic nose configuration or using nanostructured gas sensors that mimic insect antennae.

  11. Multi-polarimetric textural distinctiveness for outdoor robotic saliency detection

    NASA Astrophysics Data System (ADS)

    Haider, S. A.; Scharfenberger, C.; Kazemzadeh, F.; Wong, A.; Clausi, D. A.

    2015-01-01

    Mobile robots that rely on vision for navigation and object detection use saliency approaches to identify a set of potential candidates to recognize. The state of the art in saliency detection for mobile robotics often relies upon visible-light imaging, using conventional camera setups, to distinguish an object against its surroundings based on factors such as feature compactness, heterogeneity and/or homogeneity. We demonstrate a novel multi-polarimetric saliency detection approach that uses multiple measured polarization states of a scene. We leverage the light-material interaction known as Fresnel reflections to extract rotationally invariant multi-polarimetric textural representations to then train a high dimensional sparse texture model. The multi-polarimetric textural distinctiveness is characterized using a conditional probability framework based on the sparse texture model which is then used to determine the saliency at each pixel of the scene. It was observed that through the inclusion of additional polarized states into the saliency analysis, we were able to compute noticeably improved saliency maps in scenes where objects are difficult to distinguish from their background due to color intensity similarities between the object and its surroundings.

  12. Semi-autonomous Simulated Brain Tumor Ablation with RavenII Surgical Robot using Behavior Tree.

    PubMed

    Hu, Danying; Gong, Yuanzheng; Hannaford, Blake; Seibel, Eric J

    2015-05-01

    Medical robots have been widely used to assist surgeons in carrying out dexterous surgical tasks in various ways. Most of these tasks require the surgeon's direct or indirect operation. A certain level of autonomy in robotic surgery could not only free the surgeon from some tedious repetitive tasks, but also exploit the advantages of the robot: high dexterity and accuracy. This paper presents a semi-autonomous neurosurgical procedure of brain tumor ablation using the RAVEN Surgical Robot and stereo visual feedback. By integrating with the behavior tree framework, the whole surgical task is modeled flexibly and intelligently as nodes and leaves of a behavior tree. This paper provides three main contributions: (1) describing brain tumor ablation as an ideal candidate for autonomous robotic surgery, (2) modeling and implementing the semi-autonomous surgical task using the behavior tree framework, and (3) designing an experimental simulated ablation task for feasibility study and robot performance analysis.
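
    A minimal sketch of the behavior-tree idea referenced above: a task is composed from leaves (actions or conditions) and composite nodes whose tick() results propagate up the tree. The node names and the three-step task are illustrative assumptions, not the paper's task decomposition.

        # Hedged sketch of a tiny behavior tree with Sequence and Selector composites.
        SUCCESS, FAILURE, RUNNING = "SUCCESS", "FAILURE", "RUNNING"

        class Leaf:
            def __init__(self, name, action):
                self.name, self.action = name, action
            def tick(self):
                return self.action()

        class Sequence:                       # succeeds only if every child succeeds, in order
            def __init__(self, children):
                self.children = children
            def tick(self):
                for child in self.children:
                    status = child.tick()
                    if status != SUCCESS:
                        return status
                return SUCCESS

        class Selector:                       # succeeds as soon as one child succeeds
            def __init__(self, children):
                self.children = children
            def tick(self):
                for child in self.children:
                    status = child.tick()
                    if status != FAILURE:
                        return status
                return FAILURE

        # Illustrative task: locate target, then move tool, then ablate.
        tree = Sequence([Leaf("locate_target", lambda: SUCCESS),
                         Leaf("move_tool", lambda: SUCCESS),
                         Leaf("ablate", lambda: SUCCESS)])
        print(tree.tick())   # -> SUCCESS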

  13. Semi-autonomous Simulated Brain Tumor Ablation with RavenII Surgical Robot using Behavior Tree

    PubMed Central

    Hu, Danying; Gong, Yuanzheng; Hannaford, Blake; Seibel, Eric J.

    2015-01-01

    Medical robots have been widely used to assist surgeons in carrying out dexterous surgical tasks in various ways. Most of these tasks require the surgeon's direct or indirect operation. A certain level of autonomy in robotic surgery could not only free the surgeon from some tedious repetitive tasks, but also exploit the advantages of the robot: high dexterity and accuracy. This paper presents a semi-autonomous neurosurgical procedure of brain tumor ablation using the RAVEN Surgical Robot and stereo visual feedback. By integrating with the behavior tree framework, the whole surgical task is modeled flexibly and intelligently as nodes and leaves of a behavior tree. This paper provides three main contributions: (1) describing brain tumor ablation as an ideal candidate for autonomous robotic surgery, (2) modeling and implementing the semi-autonomous surgical task using the behavior tree framework, and (3) designing an experimental simulated ablation task for feasibility study and robot performance analysis. PMID:26405563

  14. The Busot Observatory: towards a robotic autonomous telescope

    NASA Astrophysics Data System (ADS)

    García-Lozano, R.; Rodes, J. J.; Torrejón, J. M.; Bernabéu, G.; Berná, J. Á.

    2016-12-01

    We describe the Busot Observatory, our project for a fully robotic autonomous telescope. This astronomical observatory, which obtained the Minor Planet Center code MPC-J02 in 2009, includes a 14 inch MEADE LX200GPS telescope, a 2 m dome, an ST8-XME CCD camera from SBIG with an AO-8 adaptive optics system, and a filter wheel equipped with a UBVRI system. We are also implementing an SGS ST-8 spectrograph for the telescope. Currently, we are involved in long-term studies of variable sources such as X-ray binary systems and variable stars. In this work we also present the discovery of W UMa systems and their orbital periods derived from the photometric light curves obtained at the Busot Observatory.

  15. Systematical development of an autonomous HPF driven and controlled inspection robot

    SciTech Connect

    Niewels, J.; Jorden, W.

    1994-12-31

    Autonomous service robots are currently among the most technically demanding robot systems. The paper describes the development of such a system for underwater internal pipe inspection. Starting from a brief look at current trends in robot design, the authors approach the problem by taking a close look at the internal structure of autonomous robots. They then concentrate on a systematic, modularized approach to designing the hardware unit of an inspection system. Test facilities employed in the process of system optimization, such as a six-axis force/torque sensor and a smart skin, are described as well. Finally, the paper presents the first results gained from the design study.

  16. Mechanical deployment system on aries an autonomous mobile robot

    SciTech Connect

    Rocheleau, D.N.

    1995-12-01

    ARIES (Autonomous Robotic Inspection Experimental System) is under development for the Department of Energy (DOE) to survey and inspect drums containing low-level radioactive waste stored in warehouses at DOE facilities. This paper focuses on the mechanical deployment system, referred to as the camera positioning system (CPS), used in the project. The CPS is used for positioning four identical but separate camera packages consisting of vision cameras and other required sensors such as bar-code readers and light stripe projectors. The CPS is attached to the top of a mobile robot and consists of two mechanisms. The first is a lift mechanism composed of five interlocking rail elements which starts from a retracted position and extends upward to simultaneously position three separate camera packages to inspect the top three drums of a column of four drums. The second is a parallelogram special-case Grashof four-bar mechanism which is used for positioning a camera package on drums on the floor. Both mechanisms are the subject of this paper, where the lift mechanism is discussed in detail.

  17. Interaction dynamics of multiple autonomous mobile robots in bounded spatial domains

    NASA Technical Reports Server (NTRS)

    Wang, P. K. C.

    1989-01-01

    A general navigation strategy for multiple autonomous robots in a bounded domain is developed analytically. Each robot is modeled as a spherical particle (i.e., an effective spatial domain about the center of mass); its interactions with other robots or with obstacles and domain boundaries are described in terms of the classical many-body problem; and a collision-avoidance strategy is derived and combined with homing, robot-robot, and robot-obstacle collision-avoidance strategies. Results from homing simulations involving (1) a single robot in a circular domain, (2) two robots in a circular domain, and (3) one robot in a domain with an obstacle are presented in graphs and briefly characterized.
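
    A minimal sketch in the spirit of the strategy described above: each robot is treated as a spherical particle steered by an attractive homing term plus pairwise repulsive terms from other robots or obstacles. The gains, radii, and force law are illustrative assumptions, not the paper's analytical model.

        # Hedged sketch of a homing-plus-repulsion steering rule for one robot.
        import numpy as np

        def steering_velocity(pos, goal, others, radius=0.5, k_home=1.0, k_rep=2.0):
            v = k_home * (goal - pos)                         # homing attraction
            for other in others:                              # robot-robot / robot-obstacle repulsion
                d = pos - other
                dist = np.linalg.norm(d)
                if 1e-9 < dist < 4 * radius:
                    v += k_rep * (d / dist) * (4 * radius - dist)
            return v

        pos, goal = np.array([0.0, 0.0]), np.array([5.0, 0.0])
        others = [np.array([1.0, 0.2])]
        print(steering_velocity(pos, goal, others))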

  18. Human-robot interaction for field operation of an autonomous helicopter

    NASA Astrophysics Data System (ADS)

    Jones, Henry L.; Frew, Eric W.; Woodley, Bruce R.; Rock, Stephen M.

    1999-01-01

    The robustness of autonomous robotic systems to unanticipated circumstances is typically insufficient for use in the field. The many skills of a human user often fill this gap in robotic capability. To incorporate the human into the system, a useful interaction between man and machine must exist. This interaction should enable useful communication to be exchanged in a natural way between human and robot on a variety of levels. This paper describes the current human-robot interaction of the Stanford HUMMINGBIRD autonomous helicopter. In particular, the paper discusses the elements of the system that enable multiple levels of communication. An intelligent system agent manages the different inputs given to the helicopter. An advanced user interface gives the user and helicopter a method for exchanging useful information. Using this human-robot interaction, the HUMMINGBIRD has carried out various autonomous search, tracking, and retrieval missions.

  19. Fusing Laser Reflectance and Image Data for Terrain Classification for Small Autonomous Robots

    DTIC Science & Technology

    2014-12-01

    Report excerpt: terrain classification algorithms for small autonomous mobile robots, which have a low perspective, limited power, and limited payload capacity.

  20. A Movement-Assisted Deployment of Collaborating Autonomous Sensors for Indoor and Outdoor Environment Monitoring.

    PubMed

    Niewiadomska-Szynkiewicz, Ewa; Sikora, Andrzej; Marks, Michał

    2016-09-14

    Using mobile robots or unmanned vehicles to assist optimal wireless sensor deployment in a working space can significantly enhance the capability to investigate unknown environments. This paper addresses the issues of the application of numerical optimization and computer simulation techniques to on-line calculation of a wireless sensor network topology for monitoring and tracking purposes. We focus on the design of a self-organizing and collaborative mobile network that enables a continuous data transmission to the data sink (base station) and automatically adapts its behavior to changes in the environment to achieve a common goal. The pre-defined and self-configuring approaches to the mobile-based deployment of sensors are compared and discussed. A family of novel algorithms for the optimal placement of mobile wireless devices for permanent monitoring of indoor and outdoor dynamic environments is described. They employ a network connectivity-maintaining mobility model utilizing the concept of the virtual potential function for calculating the motion trajectories of platforms carrying sensors. Their quality and utility have been justified through simulation experiments and are discussed in the final part of the paper.
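
    A minimal sketch of a virtual-potential-function mobility model of the kind described above: each sensor platform is pushed away from close neighbours (to spread coverage) but pulled back toward any neighbour about to leave radio range (to maintain connectivity). The distances, gains, and ranges are illustrative assumptions, not the paper's algorithms.

        # Hedged sketch of one motion step driven by a virtual potential function.
        import numpy as np

        def motion_step(pos, neighbours, d_opt=8.0, d_radio=10.0, gain=0.1):
            force = np.zeros(2)
            for nb in neighbours:
                d = nb - pos
                dist = np.linalg.norm(d)
                if dist < 1e-9:
                    continue
                if dist < d_opt:                    # too close: repel to spread coverage
                    force -= gain * (d_opt - dist) * d / dist
                elif dist > 0.9 * d_radio:          # near the radio limit: attract to keep the link
                    force += gain * (dist - 0.9 * d_radio) * d / dist
            return pos + force

        print(motion_step(np.array([0.0, 0.0]),
                          [np.array([3.0, 0.0]), np.array([0.0, 9.5])]))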

  1. A Movement-Assisted Deployment of Collaborating Autonomous Sensors for Indoor and Outdoor Environment Monitoring

    PubMed Central

    Niewiadomska-Szynkiewicz, Ewa; Sikora, Andrzej; Marks, Michał

    2016-01-01

    Using mobile robots or unmanned vehicles to assist optimal wireless sensor deployment in a working space can significantly enhance the capability to investigate unknown environments. This paper addresses the issues of the application of numerical optimization and computer simulation techniques to on-line calculation of a wireless sensor network topology for monitoring and tracking purposes. We focus on the design of a self-organizing and collaborative mobile network that enables a continuous data transmission to the data sink (base station) and automatically adapts its behavior to changes in the environment to achieve a common goal. The pre-defined and self-configuring approaches to the mobile-based deployment of sensors are compared and discussed. A family of novel algorithms for the optimal placement of mobile wireless devices for permanent monitoring of indoor and outdoor dynamic environments is described. They employ a network connectivity-maintaining mobility model utilizing the concept of the virtual potential function for calculating the motion trajectories of platforms carrying sensors. Their quality and utility have been justified through simulation experiments and are discussed in the final part of the paper. PMID:27649186

  2. Regolith Advanced Surface Systems Operations Robot (RASSOR) Phase 2 and Smart Autonomous Sand-Swimming Excavator

    NASA Technical Reports Server (NTRS)

    Sandy, Michael

    2015-01-01

    The Regolith Advanced Surface Systems Operations Robot (RASSOR) Phase 2 is an excavation robot for mining regolith on a planet like Mars. The robot is programmed using the Robot Operating System (ROS) and also uses a physics simulation program called Gazebo. This internship focused on various functions of the program in order to make the robot more polished and efficient. During the internship, another project, the Smart Autonomous Sand-Swimming Excavator, was also worked on. This is a robot designed to dig through sand and extract sample material. The intern worked on programming the Sand-Swimming robot and on designing the electrical system to power and control it.

  3. Behavior-based multi-robot collaboration for autonomous construction tasks

    NASA Technical Reports Server (NTRS)

    Stroupe, Ashley; Huntsberger, Terry; Okon, Avi; Aghazarian, Hrand; Robinson, Matthew

    2005-01-01

    The Robot Construction Crew (RCC) is a heterogeneous multi-robot system for autonomous construction of a structure through assembly of long components. The two-robot team demonstrates component placement into an existing structure in a realistic environment. The task requires component acquisition, cooperative transport, and cooperative precision manipulation. A behavior-based architecture provides adaptability. The RCC approach minimizes computation, power, communication, and sensing for applicability to space-related construction efforts, but the techniques are applicable to terrestrial construction tasks.

  4. Control of autonomous mobile robots using custom-designed qualitative reasoning VLSI chips and boards

    SciTech Connect

    Pin, F.G.; Pattay, R.S.

    1991-01-01

    Two types of computer boards including custom-designed VLSI chips have been developed to provide a qualitative reasoning capability for the real-time control of autonomous mobile robots. The design and operation of these boards are described and an example of the application of qualitative reasoning for the autonomous navigation of a mobile robot in a priori unknown environments is presented. Results concerning consistency and modularity in the development of qualitative reasoning schemes as well as the general applicability of these techniques to robotic control domains are also discussed. 17 refs., 4 figs.

  5. Application of autonomous robotized systems for the collection of nearshore topographic changing and hydrodynamic measurements

    NASA Astrophysics Data System (ADS)

    Belyakov, Vladimir; Makarov, Vladimir; Zezyulin, Denis; Kurkin, Andrey; Pelinovsky, Efim

    2015-04-01

    Hazardous phenomena in the coastal zone cause topographic changes that are difficult to inspect by traditional methods, which is why autonomous robots are used to collect nearshore topographic and hydrodynamic measurements. The robot RTS-Hanna is well known (Wubbold, F., Hentschel, M., Vousdoukas, M., and Wagner, B. Application of an autonomous robot for the collection of nearshore topographic and hydrodynamic measurements. Coastal Engineering Proceedings, 2012, vol. 33, Paper 53). We describe here several designs of mobile systems developed in the Laboratory "Transported Machines and Transported Complexes", Nizhny Novgorod State Technical University. They can be used in field surveys and in monitoring nearshore wave regimes.

  6. First experiences with semi-autonomous robotic harvesting of protein crystals.

    PubMed

    Viola, Robert; Walsh, Jace; Melka, Alex; Womack, Wesley; Murphy, Sean; Riboldi-Tunnicliffe, Alan; Rupp, Bernhard

    2011-07-01

    The demonstration unit of the Universal Micromanipulation Robot (UMR) capable of semi-autonomous protein crystal harvesting has been tested and evaluated by independent users. We report the status and capabilities of the present unit scheduled for deployment in a high-throughput protein crystallization center. We discuss operational aspects as well as novel features such as micro-crystal handling and drip-cryoprotection, and we extrapolate towards the design of a fully autonomous, integrated system capable of reliable crystal harvesting. The positive to enthusiastic feedback from the participants in an evaluation workshop indicates that genuine demand exists and the effort and resources to develop autonomous protein crystal harvesting robotics are justified.

  7. Concept formation and generalization based on experimentation by an autonomous mobile robot

    SciTech Connect

    Spelt, P.F.; deSaussure, G.; Lyness, E.; Oliver, G.; Silliman, M.

    1989-01-01

    The Center for Engineering Systems Advanced Research (CESAR) conducts basic research in the area of intelligent machines. In this paper, we describe our approach to a class of machine learning problems which involves autonomous concept formation using feedback from trial-and-error learning. Our formulation was experimentally validated on an autonomous mobile robot, which learned the task of control panel monitoring and manipulation for effective process control. Conclusions are drawn concerning the applicability of the system to a more general class of learning problems, and implications for the use of autonomous mobile robots in hostile and unknown environments are discussed. 9 refs., 5 figs.

  8. Biomimetic autonomous robot inspired by the Cyanea capillata (Cyro).

    PubMed

    Villanueva, Alex A; Marut, Kenneth J; Michael, Tyler; Priya, Shashank

    2013-12-01

    A biomimetic robot inspired by Cyanea capillata, termed as 'Cyro', was developed to meet the functional demands of underwater surveillance in defense and civilian applications. The vehicle was designed to mimic the morphology and swimming mechanism of the natural counterpart. The body of the vehicle consists of a rigid support structure with linear DC motors which actuate eight mechanical arms. The mechanical arms in conjunction with artificial mesoglea create the hydrodynamic force required for propulsion. The full vehicle measures 170 cm in diameter and has a total mass of 76 kg. An analytical model of the mechanical arm kinematics was developed. The analytical and experimental bell kinematics were analyzed and compared to the C. capillata. Cyro was found to reach the water surface untethered and autonomously from a depth of 182 cm in five actuation cycles. It achieved an average velocity of 8.47 cm/s while consuming an average power of 70 W. A two-axis thrust stand was developed to calculate the thrust directly from a single bell segment yielding an average thrust of 27.9 N for the whole vehicle. Steady state velocity during Cyro's swimming test was not reached but the measured performance during its last swim cycle resulted in a cost of transport of 10.9 J/(kg·m) and total efficiency of 0.03.

  9. An intelligent hybrid behavior coordination system for an autonomous mobile robot

    NASA Astrophysics Data System (ADS)

    Luo, Chaomin; Krishnan, Mohan; Paulik, Mark; Fallouh, Samer

    2013-12-01

    In this paper, the development of a low-cost PID controller with an intelligent behavior coordination system for an autonomous mobile robot is described. The robot platform, based on an HCS12 microcontroller and embedded systems, is equipped with IR sensors, ultrasonic sensors, a regulator, and RC filters. A novel hybrid PID controller and behavior coordination system is developed for wall-following navigation and obstacle avoidance. The adaptive control used in this robot is a hybrid PID algorithm associated with template and behavior coordination models. Software development comprises motor control, the behavior coordination intelligent system, and sensor fusion. In addition, a module-based programming technique is adopted to improve the efficiency of integrating the hybrid PID, template, and behavior coordination algorithms. The motor control, obstacle avoidance, and wall-following navigation algorithms are developed to propel and steer the autonomous mobile robot. The hardware configuration and module-based technique are described in this paper. Experiments validate how the PID controller and behavior coordination system direct the robot, and the results demonstrate that the robot is successfully guided through wall-following navigation with obstacle avoidance.
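
    A minimal sketch of a PID wall-following loop of the kind described above: the error between the desired and measured side-wall distance drives a steering correction, and a simple guard switches to an avoidance turn when the front range sensor reports an obstacle. The gains, set-points, and sign convention are illustrative assumptions, not the paper's controller.

        # Hedged sketch of PID wall following with a simple obstacle-avoidance override.
        class WallFollowerPID:
            def __init__(self, kp=1.2, ki=0.0, kd=0.4, target_dist=0.3, dt=0.05):
                self.kp, self.ki, self.kd = kp, ki, kd
                self.target, self.dt = target_dist, dt
                self.integral, self.prev_error = 0.0, 0.0

            def steering(self, side_dist, front_dist):
                if front_dist < 0.25:                    # behavior coordination: avoid obstacle ahead
                    return 1.0                           # hard turn away from the wall
                error = self.target - side_dist          # wall-following behavior
                self.integral += error * self.dt
                derivative = (error - self.prev_error) / self.dt
                self.prev_error = error
                return self.kp * error + self.ki * self.integral + self.kd * derivative

        pid = WallFollowerPID()
        # Negative output steers toward the wall under this (assumed) sign convention.
        print(pid.steering(side_dist=0.35, front_dist=1.0))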

  10. Advances in Robotic, Human, and Autonomous Systems for Missions of Space Exploration

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R.; Briggs, Geoffrey A.; Glass, Brian J.; Pedersen, Liam; Kortenkamp, David M.; Wettergreen, David S.; Nourbakhsh, I.; Clancy, Daniel J.; Zornetzer, Steven (Technical Monitor)

    2002-01-01

    Space exploration missions are evolving toward more complex architectures involving more capable robotic systems, new levels of human and robotic interaction, and increasingly autonomous systems. How this evolving mix of advanced capabilities will be utilized in the design of new missions is a subject of much current interest. Cost and risk constraints also play a key role, resulting in a complex interplay of a broad range of factors in the development and planning of new missions. This paper will discuss how human, robotic, and autonomous systems could be used in advanced space exploration missions. In particular, a recently completed survey of the state of the art and the potential future of robotic systems, as well as new experiments utilizing human and robotic approaches, will be described. Finally, there will be a discussion of how best to utilize these various approaches for meeting space exploration goals.

  11. Multi-robot terrain coverage and task allocation for autonomous detection of landmines

    NASA Astrophysics Data System (ADS)

    Dasgupta, Prithviraj; Muñoz-Meléndez, Angélica; Guruprasad, K. R.

    2012-06-01

    Multi-robot systems comprising of heterogeneous autonomous vehicles on land, air, water are being increasingly used to assist or replace humans in different hazardous missions. Two crucial aspects in such multi-robot systems are to: a) explore an initially unknown region of interest to discover tasks, and, b) allocate and share the discovered tasks between the robots in a coordinated manner using a multi-robot task allocation (MRTA) algorithm. In this paper, we describe results from our research on multi-robot terrain coverage and MRTA algorithms within an autonomous landmine detection scenario, done as part of the COMRADES project. Each robot is equipped with a different type of landmine detection sensor and different sensors, even of the same type, can have different degrees of accuracy. The landmine detection-related operations performed by each robot are abstracted as tasks and multiple robots are required to complete a single task. First, we describe a distributed and robust terrain coverage algorithm that employs Voronoi partitions to divide the area of interest among the robots and then uses a single-robot coverage algorithm to explore each partition for potential landmines. Then, we describe MRTA algorithms that use the location information of discovered potential landmines and employ either a greedy strategy, or, an opportunistic strategy to allocate tasks among the robots while attempting to minimize the time (energy) expended by the robots to perform the tasks. We report experimental results of our algorithms using accurately-simulated Corobot robots within the Webots simulator performing a multi-robot, landmine detection operation.
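
    A minimal sketch of a greedy allocation step in the spirit of the strategy mentioned above: each discovered potential landmine (a task) is assigned to the robot that can reach it at the lowest incremental travel cost, and that robot's planned position advances to the task. The coordinates and cost metric are illustrative assumptions; the paper's MRTA strategies are more elaborate.

        # Hedged sketch of greedy multi-robot task allocation by incremental travel cost.
        import math

        def greedy_allocate(robot_positions, task_positions):
            assignment = {r: [] for r in robot_positions}
            pos = dict(robot_positions)                       # current planned position per robot
            for task in task_positions:
                best = min(pos, key=lambda r: math.dist(pos[r], task))
                assignment[best].append(task)
                pos[best] = task                              # robot continues from this task
            return assignment

        robots = {"r1": (0.0, 0.0), "r2": (10.0, 0.0)}
        tasks = [(1.0, 1.0), (9.0, 2.0), (5.0, 5.0)]
        print(greedy_allocate(robots, tasks))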

  12. Remote wave measurements using autonomous mobile robotic systems

    NASA Astrophysics Data System (ADS)

    Kurkin, Andrey; Zeziulin, Denis; Makarov, Vladimir; Belyakov, Vladimir; Tyugin, Dmitry; Pelinovsky, Efim

    2016-04-01

    The project covers the development of a technology for monitoring and forecasting the state of the coastal zone environment using radar equipment transported by autonomous mobile robotic systems (AMRS). Target areas of application are the eastern and northern coasts of Russia, where continuous collection of information on topographic changes of the coastal zone and hydrodynamic measurements in environments inaccessible to humans are needed. The intensity of wave reflections received by radar surveillance is directly related to wave height. Mathematical models and algorithms have been developed for processing experimental data (signal selection, spectral analysis, wavelet analysis), for recalculating the landwash from data on wave heights far from the shore, and for determining the threshold values of wave heights far from the shore. A software complex has been developed for operating the experimental AMRS prototype, comprising the following modules: data loading module, reporting module, georeferencing module, data analysis module, monitoring module, hardware control module, and graphical user interface. Further work will involve testing the manufactured experimental prototype along selected coastline routes of Sakhalin Island. The field tests will reveal shortcomings of the development and identify ways to optimize the structure and operating algorithms of the AMRS as well as the operation of the measuring equipment. The presented results have been obtained at Nizhny Novgorod State Technical University n.a. R. Alekseev in the framework of the Federal Target Program «Research and development on priority directions of scientific-technological complex of Russia for 2014 - 2020 years» (agreement № 14.574.21.0089 (unique identifier of agreement - RFMEFI57414X0089)).

  13. An integrated design and fabrication strategy for entirely soft, autonomous robots

    NASA Astrophysics Data System (ADS)

    Wehner, Michael; Truby, Ryan L.; Fitzgerald, Daniel J.; Mosadegh, Bobak; Whitesides, George M.; Lewis, Jennifer A.; Wood, Robert J.

    2016-08-01

    Soft robots possess many attributes that are difficult, if not impossible, to achieve with conventional robots composed of rigid materials. Yet, despite recent advances, soft robots must still be tethered to hard robotic control systems and power sources. New strategies for creating completely soft robots, including soft analogues of these crucial components, are needed to realize their full potential. Here we report the untethered operation of a robot composed solely of soft materials. The robot is controlled with microfluidic logic that autonomously regulates fluid flow and, hence, catalytic decomposition of an on-board monopropellant fuel supply. Gas generated from the fuel decomposition inflates fluidic networks downstream of the reaction sites, resulting in actuation. The body and microfluidic logic of the robot are fabricated using moulding and soft lithography, respectively, and the pneumatic actuator networks, on-board fuel reservoirs and catalytic reaction chambers needed for movement are patterned within the body via a multi-material, embedded 3D printing technique. The fluidic and elastomeric architectures required for function span several orders of magnitude from the microscale to the macroscale. Our integrated design and rapid fabrication approach enables the programmable assembly of multiple materials within this architecture, laying the foundation for completely soft, autonomous robots.

  14. An integrated design and fabrication strategy for entirely soft, autonomous robots.

    PubMed

    Wehner, Michael; Truby, Ryan L; Fitzgerald, Daniel J; Mosadegh, Bobak; Whitesides, George M; Lewis, Jennifer A; Wood, Robert J

    2016-08-25

    Soft robots possess many attributes that are difficult, if not impossible, to achieve with conventional robots composed of rigid materials. Yet, despite recent advances, soft robots must still be tethered to hard robotic control systems and power sources. New strategies for creating completely soft robots, including soft analogues of these crucial components, are needed to realize their full potential. Here we report the untethered operation of a robot composed solely of soft materials. The robot is controlled with microfluidic logic that autonomously regulates fluid flow and, hence, catalytic decomposition of an on-board monopropellant fuel supply. Gas generated from the fuel decomposition inflates fluidic networks downstream of the reaction sites, resulting in actuation. The body and microfluidic logic of the robot are fabricated using moulding and soft lithography, respectively, and the pneumatic actuator networks, on-board fuel reservoirs and catalytic reaction chambers needed for movement are patterned within the body via a multi-material, embedded 3D printing technique. The fluidic and elastomeric architectures required for function span several orders of magnitude from the microscale to the macroscale. Our integrated design and rapid fabrication approach enables the programmable assembly of multiple materials within this architecture, laying the foundation for completely soft, autonomous robots.

  15. A testbed for a unified teleoperated-autonomous dual-arm robotic system

    NASA Technical Reports Server (NTRS)

    Hayati, S.; Lee, T.; Tso, K.; Backes, P.; Lloyd, J.

    1990-01-01

    This paper describes a complete robot control facility built at the Jet Propulsion Laboratory as part of a NASA telerobotics program to develop a state-of-the-art robot control environment for laboratory-based space-like experiments. This system, which is now fully operational, has the following features: separation of the computing facilities into local and remote sites, autonomous motion generation in joint or Cartesian coordinates, dual-arm force-reflecting teleoperation with voice interaction between the operator and the robots, shared control between the autonomously generated motions and operator-controlled teleoperation, and dual-arm coordinated trajectory generation. The system has been used to carry out realistic experiments such as the exchange of an Orbital Replacement Unit (ORU), bolt turning, and door opening, using a mixture of autonomous actions and teleoperation, with either a single arm or two cooperating arms.

  16. How to make an autonomous robot as a partner with humans: design approach versus emergent approach.

    PubMed

    Fujita, M

    2007-01-15

    In this paper, we discuss what factors are important to realize an autonomous robot as a partner with humans. We believe that it is important to interact with people without boring them, using verbal and non-verbal communication channels. We have already developed autonomous robots such as AIBO and QRIO, whose behaviours are manually programmed and designed. We realized, however, that this design approach has limitations; therefore we propose a new approach, intelligence dynamics, where interacting in a real-world environment using embodiment is considered very important. There are pioneering works related to this approach from brain science, cognitive science, robotics and artificial intelligence. We assert that it is important to study the emergence of entire sets of autonomous behaviours and present our approach towards this goal.

  17. The experimental humanoid robot H7: a research platform for autonomous behaviour.

    PubMed

    Nishiwaki, Koichi; Kuffner, James; Kagami, Satoshi; Inaba, Masayuki; Inoue, Hirochika

    2007-01-15

    This paper gives an overview of the humanoid robot 'H7', which was developed over several years as an experimental platform for walking, autonomous behaviour and human interaction research at the University of Tokyo. H7 was designed to be a human-sized robot capable of operating autonomously in indoor environments designed for humans. The hardware is relatively simple to operate and conduct research on, particularly with respect to the hierarchical design of its control architecture. We describe the overall design goals and methodology, along with a summary of its online walking capabilities, autonomous vision-based behaviours and automatic motion planning. We show experimental results obtained by implementations running within a simulation environment as well as on the actual robot hardware.

  18. Using custom-designed VLSI fuzzy inferencing chips for the autonomous navigation of a mobile robot

    SciTech Connect

    Pin, F.G.; Pattay, R.S.; Watanabe, Hiroyuki; Symon, J. (Dept. of Computer Science)

    1991-01-01

    Two types of computer boards including custom-designed VLSI fuzzy inferencing chips have been developed to add a qualitative reasoning capability to the real-time control of autonomous mobile robots. The design and operation of these boards are first described and an example of their use for the autonomous navigation of a mobile robot is presented. The development of qualitative reasoning schemes emulating human-like navigation in a priori unknown environments is discussed. An approach using superposition of elemental sensor-based behaviors is shown to allow easy development and testing of the inferencing rule base, while providing for progressive addition of behaviors to resolve situations of increasing complexity. The efficiency of such schemes, which can consist of as few as a dozen qualitative rules, is illustrated in experiments involving an autonomous mobile robot navigating on the basis of very sparse and inaccurate sensor data. 17 refs., 6 figs.
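
    A minimal sketch of behavior superposition with qualitative rules of the kind described above: each elemental behavior turns sparse range readings into a weighted steering recommendation, and the commands are blended by their weights. The membership shapes, weights, and commands are illustrative assumptions; the actual rule base runs on custom fuzzy-inferencing VLSI hardware rather than in software like this.

        # Hedged sketch of fuzzy-style superposition of elemental steering behaviors.
        def near(dist, full=0.3, zero=1.5):
            """Degree to which a range reading is 'near' (1 at <= full, 0 at >= zero)."""
            return max(0.0, min(1.0, (zero - dist) / (zero - full)))

        def blend(left_range, right_range, goal_heading):
            behaviors = [
                (near(left_range), +0.8),            # obstacle on the left: steer right
                (near(right_range), -0.8),           # obstacle on the right: steer left
                (0.4, goal_heading),                  # weak constant pull toward the goal
            ]
            total = sum(w for w, _ in behaviors)
            return sum(w * cmd for w, cmd in behaviors) / total if total else 0.0

        print(blend(left_range=0.5, right_range=2.0, goal_heading=-0.2))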

  19. Autonomous navigation of a mobile robot using custom-designed qualitative reasoning VLSI chips and boards

    SciTech Connect

    Pin, F.G.; Pattay, R.S.; Watanabe, H.; Symon, J. (Dept. of Computer Science)

    1991-01-01

    Two types of computer boards including custom-designed VLSI chips have been developed to add a qualitative reasoning capability to the real-time control of autonomous mobile robots. The design and operation of these boards are first described and an example of their use for the autonomous navigation of a mobile robot is presented. The development of qualitative reasoning schemes emulating human-like navigation in a priori unknown environments is discussed. The efficiency of such schemes, which can consist of as few as a dozen qualitative rules, is illustrated in experiments involving an autonomous mobile robot navigating on the basis of very sparse and inaccurate sensor data. 17 refs., 6 figs.

  20. Autonomous undulatory serpentine locomotion utilizing body dynamics of a fluidic soft robot.

    PubMed

    Onal, Cagdas D; Rus, Daniela

    2013-06-01

    Soft robotics offers the unique promise of creating inherently safe and adaptive systems. These systems bring man-made machines closer to the natural capabilities of biological systems. An important requirement to enable self-contained soft mobile robots is an on-board power source. In this paper, we present an approach to create a bio-inspired soft robotic snake that can undulate in a similar way to its biological counterpart using pressure for actuation power, without human intervention. With this approach, we develop an autonomous soft snake robot with on-board actuation, power, computation and control capabilities. The robot consists of four bidirectional fluidic elastomer actuators in series to create a traveling curvature wave from head to tail along its body. Passive wheels between segments generate the necessary frictional anisotropy for forward locomotion. It takes 14 h to build the soft robotic snake, which can attain an average locomotion speed of 19 mm s(-1).
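
    The head-to-tail traveling curvature wave mentioned above can be illustrated with a serpenoid-style sketch; the segment count matches the abstract, but the amplitude, frequency and phase lag are assumed values, not the paper's parameters.

      # Sketch of a head-to-tail traveling curvature wave for a 4-segment soft snake
      # (serpenoid-style; amplitude, frequency and phase lag are assumed values).
      import math

      N_SEGMENTS = 4                            # bidirectional fluidic actuators in series
      AMPLITUDE  = 0.8                          # peak curvature command (normalized)
      FREQUENCY  = 0.5                          # wave frequency in Hz
      PHASE_LAG  = 2 * math.pi / N_SEGMENTS     # phase shift between neighbouring segments

      def segment_commands(t):
          """Curvature setpoint for each segment at time t (seconds)."""
          return [AMPLITUDE * math.sin(2 * math.pi * FREQUENCY * t - i * PHASE_LAG)
                  for i in range(N_SEGMENTS)]

      for t in (0.0, 0.5, 1.0):
          print(t, [round(c, 2) for c in segment_commands(t)])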

  1. An architectural approach to create self organizing control systems for practical autonomous robots

    NASA Technical Reports Server (NTRS)

    Greiner, Helen

    1991-01-01

    For practical industrial applications, the development of trainable robots is an important and immediate objective. Therefore, the development of flexible intelligence directly applicable to training is emphasized. It is generally agreed upon by the AI community that the fusion of expert systems, neural networks, and conventionally programmed modules (e.g., a trajectory generator) is promising in the quest for autonomous robotic intelligence. Autonomous robot development is hindered by integration and architectural problems. Some obstacles towards the construction of more general robot control systems are as follows: (1) Growth problem; (2) Software generation; (3) Interaction with environment; (4) Reliability; and (5) Resource limitation. Neural networks can be successfully applied to some of these problems. However, current implementations of neural networks are hampered by the resource limitation problem and must be trained extensively to produce computationally accurate output. A generalization of conventional neural nets is proposed, and an architecture is offered in an attempt to address the above problems.

  2. 3D Ultrasound Guidance of Autonomous Robotic Breast Biopsy: Feasibility Study

    PubMed Central

    Liang, Kaicheng; Rogers, Albert J.; Light, Edward D.; von Allmen, Daniel; Smith, Stephen W.

    2009-01-01

    Feasibility studies of autonomous robot biopsies in tissue have been conducted using real time 3D ultrasound combined with simple thresholding algorithms. The robot first autonomously processed 3D image volumes received from the ultrasound scanner to locate a metal rod target embedded in turkey breast tissue simulating a calcification, and in a separate experiment, the center of a water-filled void in the breast tissue simulating a cyst. In both experiments the robot then directed a needle to the desired target, with no user input required. Separate needle-touch experiments performed by the image-guided robot in a water tank yielded an rms error of 1.15 mm. PMID:19900753
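
    As a rough illustration of the "simple thresholding" step described above, the sketch below locates the centroid of bright voxels in a synthetic 3D volume; the threshold value and the volume itself are made up for the example and are not taken from the study.

      # Sketch of locating a bright target (e.g., a metal rod) in a 3D ultrasound
      # volume by simple thresholding; the data and threshold are synthetic.
      import numpy as np

      volume = np.random.rand(64, 64, 64) * 0.3        # background speckle
      volume[30:34, 40:44, 10:50] = 0.9                # bright rod-like target

      mask = volume > 0.7                              # simple intensity threshold
      target_voxels = np.argwhere(mask)
      centroid = target_voxels.mean(axis=0)            # voxel coordinates of the target centre
      print("target centroid (z, y, x):", centroid)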

  3. Three-dimensional ultrasound guidance of autonomous robotic breast biopsy: feasibility study.

    PubMed

    Liang, Kaicheng; Rogers, Albert J; Light, Edward D; von Allmen, Daniel; Smith, Stephen W

    2010-01-01

    Feasibility studies of autonomous robot biopsies in tissue have been conducted using real-time three-dimensional (3-D) ultrasound combined with simple thresholding algorithms. The robot first autonomously processed 3-D image volumes received from the ultrasound scanner to locate a metal rod target embedded in turkey breast tissue simulating a calcification, and in a separate experiment, the center of a water-filled void in the breast tissue simulating a cyst. In both experiments the robot then directed a needle to the desired target, with no user input required. Separate needle-touch experiments performed by the image-guided robot in a water tank yielded an rms error of 1.15 mm. (E-mail: kaicheng.liang@duke.edu).

  4. Autonomous discovery and learning by a mobile robot in unstructured environments

    SciTech Connect

    Pin, F.G.; de Saussure, G.; Spelt, P.F.; Barnett, D.L.; Killough, S.M.; Weisbin, C.R.

    1988-01-01

    This paper presents recent research activities at the Center for Engineering Systems Advanced Research (CESAR) in the area of autonomous discovery and learning of emergency and maintenance tasks in unstructured environments by a mobile robot. The methodologies for learning basic operating principles of control devices, and for using the acquired knowledge to solve new problems with conditions not encountered before are presented. The algorithms necessary for the robot to discover problem-solving sequences of actions, through experimentation with the environment, in the two cases of immediate feedback and delayed feedback are described. The inferencing schemes allowing the robot to classify the information acquired from a reduced set of examples and to generalize its knowledge to a much wider problem-solving domain are also provided. A demonstration of the successful implementation of the algorithms on our HERMIES-IIB autonomous robot is then presented. 8 refs., 2 figs.

  5. Research and development of Ro-boat: an autonomous river cleaning robot

    NASA Astrophysics Data System (ADS)

    Sinha, Aakash; Bhardwaj, Prashant; Vaibhav, Bipul; Mohommad, Noor

    2013-12-01

    Ro-Boat is an intelligent autonomous river-cleaning robot that combines mechanical design and computer vision algorithms to achieve autonomous river cleaning and provide a sustainable environment. Ro-Boat is designed in a modular fashion, with design details covering mechanical structural design, hydrodynamic design and vibrational analysis. It incorporates a stable mechanical system with air and water propulsion, robotic arms and a solar energy source, and it is made autonomous using computer vision. Both the HSV color space and SURF features are proposed as measurements in a Kalman filter, resulting in extremely robust pollutant tracking. The system has been tested with successful results in the Yamuna River in New Delhi. We foresee that a system of Ro-Boats working autonomously 24x7 could clean a major river in a city in about six months, which is unmatched by alternative methods of river cleaning.
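
    A hedged sketch of how a colour measurement can feed a constant-velocity Kalman filter for pollutant tracking is given below. The HSV bounds, noise levels, synthetic image, and the use of a mask centroid as the measurement are all assumptions for illustration, not the Ro-Boat implementation.

      # Sketch: HSV-range centroid as a Kalman-filter measurement for pollutant tracking
      # (colour bounds, noise levels and the synthetic image are assumptions).
      import numpy as np

      def hsv_centroid(hsv_img, lo, hi):
          """Centroid (x, y) of pixels whose HSV values fall inside [lo, hi]."""
          mask = np.all((hsv_img >= lo) & (hsv_img <= hi), axis=-1)
          ys, xs = np.nonzero(mask)
          return None if xs.size == 0 else np.array([xs.mean(), ys.mean()])

      # Constant-velocity Kalman filter over the state [x, y, vx, vy].
      dt = 1.0
      F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]], float)
      H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)
      Q = np.eye(4) * 0.01          # process noise
      R = np.eye(2) * 4.0           # measurement noise
      x, P = np.zeros(4), np.eye(4) * 100.0

      def kalman_step(x, P, z):
          x, P = F @ x, F @ P @ F.T + Q                      # predict
          if z is not None:                                  # update only when a detection exists
              S = H @ P @ H.T + R
              K = P @ H.T @ np.linalg.inv(S)
              x = x + K @ (z - H @ x)
              P = (np.eye(4) - K @ H) @ P
          return x, P

      hsv = np.zeros((120, 160, 3))
      hsv[50:60, 70:80] = (30, 200, 200)                     # synthetic pollutant-coloured blob
      z = hsv_centroid(hsv, lo=(20, 100, 100), hi=(40, 255, 255))
      x, P = kalman_step(x, P, z)
      print("estimated position:", x[:2])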

  6. Passive optically encoded transponder (POET) - An acquisition and alignment target for autonomous robotics

    NASA Astrophysics Data System (ADS)

    White, G. K.

    1987-01-01

    This paper shows that it is possible to produce a three-dimensional target from a two-dimensional transponder that can enhance the capabilities of an optical measurement or alignment system, and that the autonomous operation of such a system is possible. The attitude and position resolution that is possible using such a configuration would allow noncontact coordinate system transfer and tracking capability in a robotic system, enabling a robot to access the physical database of an acquired, known target item and inspect, attach to, or manipulate any external part of the item in a teleoperated or autonomous mode without sophisticated visual capabilities.

  7. LABRADOR: a learning autonomous behavior-based robot for adaptive detection and object retrieval

    NASA Astrophysics Data System (ADS)

    Yamauchi, Brian; Moseley, Mark; Brookshire, Jonathan

    2013-01-01

    As part of the TARDEC-funded CANINE (Cooperative Autonomous Navigation in a Networked Environment) Program, iRobot developed LABRADOR (Learning Autonomous Behavior-based Robot for Adaptive Detection and Object Retrieval). LABRADOR was based on the rugged, man-portable, iRobot PackBot unmanned ground vehicle (UGV) equipped with an explosive ordnance disposal (EOD) manipulator arm and a custom gripper. For LABRADOR, we developed a vision-based object learning and recognition system that combined a TLD (track-learn-detect) filter based on object shape features with a color-histogram-based object detector. Our vision system was able to learn in real-time to recognize objects presented to the robot. We also implemented a waypoint navigation system based on fused GPS, IMU (inertial measurement unit), and odometry data. We used this navigation capability to implement autonomous behaviors capable of searching a specified area using a variety of robust coverage strategies - including outward spiral, random bounce, random waypoint, and perimeter following behaviors. While the full system was not integrated in time to compete in the CANINE competition event, we developed useful perception, navigation, and behavior capabilities that may be applied to future autonomous robot systems.
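
    One of the coverage strategies named above, the outward spiral, can be sketched as a simple waypoint generator; the step size, number of legs and waypoint format are assumptions for illustration, not LABRADOR's navigation code.

      # Sketch: waypoints for an outward (square) spiral search around a start point
      # (step size and number of legs are assumed values).
      def outward_spiral(x0, y0, step=2.0, legs=8):
          """Return a list of (x, y) waypoints spiralling outward from (x0, y0)."""
          waypoints, x, y = [], x0, y0
          dx, dy, leg_len = step, 0.0, 1
          for leg in range(legs):
              x, y = x + dx * leg_len, y + dy * leg_len
              waypoints.append((x, y))
              dx, dy = -dy, dx                 # turn 90 degrees
              if leg % 2 == 1:                 # lengthen every second leg
                  leg_len += 1
          return waypoints

      print(outward_spiral(0.0, 0.0))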

  8. Novel Integrated System Architecture for an Autonomous Jumping Micro-Robot

    DTIC Science & Technology

    2010-01-01

    Devices at the micro and nano scale require patterning small feature sizes, which presents a challenge to the conventional photolithography techniques used in MEMS. As robotic systems evolve from the macro to the micro scale, innovation of such systems is driven by the notion that a robot must be able to sense, think, and act.

  9. Autonomous avoidance based on motion delay of master-slave surgical robot.

    PubMed

    Inoue, Shintaro; Toyoda, Kazutaka; Kobayashi, Yo; Fujie, Masakatsu G

    2009-01-01

    Safe use of master-slave robots for endoscopic surgery requires autonomous motions to avert contact with vital organs, blood vessels, and nerves. Here we describe an avoidance control algorithm with delay compensation that takes the dynamic characteristics of the robot into account. To determine the operating parameters, we measured frequency characteristics of each joint of the slave-manipulator. The results suggest this delay compensation program improves avoidance performance.

  10. Autonomous Navigation, Dynamic Path and Work Flow Planning in Multi-Agent Robotic Swarms Project

    NASA Technical Reports Server (NTRS)

    Falker, John; Zeitlin, Nancy; Leucht, Kurt; Stolleis, Karl

    2015-01-01

    Kennedy Space Center has teamed up with the Biological Computation Lab at the University of New Mexico to create a swarm of small, low-cost, autonomous robots, called Swarmies, to be used as a ground-based research platform for in-situ resource utilization missions. The behavior of the robot swarm mimics the central-place foraging strategy of ants to find and collect resources in an unknown environment and return those resources to a central site.

  11. Autonomous Mobile Platform for Research in Cooperative Robotics

    NASA Technical Reports Server (NTRS)

    Daemi, Ali; Pena, Edward; Ferguson, Paul

    1998-01-01

    This paper describes the design and development of a platform for research in cooperative mobile robotics. The structure and mechanics of the vehicles are based on R/C cars. The vehicle is rendered mobile by a DC motor and servo motor. The perception of the robot's environment is achieved using IR sensors and a central vision system. A laptop computer processes images from a CCD camera located above the testing area to determine the position of objects in sight. This information is sent to each robot via RF modem. Each robot is operated by a Motorola 68HC11E micro-controller, and all actions of the robots are realized through the connections of IR sensors, modem, and motors. The intelligent behavior of each robot is based on a hierarchical fuzzy-rule based approach.

  12. Introduction to autonomous mobile robotics using Lego Mindstorms NXT

    NASA Astrophysics Data System (ADS)

    Akın, H. Levent; Meriçli, Çetin; Meriçli, Tekin

    2013-12-01

    Teaching the fundamentals of robotics to computer science undergraduates requires designing a well-balanced curriculum that is complemented with hands-on applications on a platform that allows rapid construction of complex robots, and implementation of sophisticated algorithms. This paper describes such an elective introductory course where the Lego Mindstorms NXT kits are used as the robot platform. The aims, scope and contents of the course are presented, and the design of the laboratory sessions as well as the term projects, which address several core problems of robotics and artificial intelligence simultaneously, are explained in detail.

  13. Introduction to Autonomous Mobile Robotics Using "Lego Mindstorms" NXT

    ERIC Educational Resources Information Center

    Akin, H. Levent; Meriçli, Çetin; Meriçli, Tekin

    2013-01-01

    Teaching the fundamentals of robotics to computer science undergraduates requires designing a well-balanced curriculum that is complemented with hands-on applications on a platform that allows rapid construction of complex robots, and implementation of sophisticated algorithms. This paper describes such an elective introductory course where the…

  14. Robust performance of multiple tasks by an autonomous robot

    SciTech Connect

    Beckerman, M.; Barnett, D.L.; Einstein, R.; Jones, J.P.; Spelt, P.D.; Weisbin, C.R.

    1989-01-01

    There have been many successful mobile robot experiments, but very few papers have appeared that examine the range of applicability, or robustness, of a robot system. The purpose of this paper is to determine and quantify robustness of the Hermies-IIB experimental capabilities. 6 refs., 1 tab.

  15. Artificial immune-network based autonomous mobile robots navigation and coordination

    NASA Astrophysics Data System (ADS)

    Duan, Q. J.; Wang, R. X.

    2005-12-01

    Based on the analogies between a multi-autonomous-robot system (MARS) and the immune system, a synthesized immune network is proposed and used to solve the navigation and coordination problem in MARS. Each individual robot is regarded as a small-scale immune network (SN), a task is regarded as an antigen, and behavior tactics are regarded as the corresponding antibodies; a behavior tactic attached to a robot sensor is taken as a B cell. The navigation and coordination problem is thus transformed into the interaction mechanism among antibodies, antigens and small-scale immune networks. The pursuit problem was used to validate the hypothesis, and simulation results suggest that the proposal is promising.

  16. Manifold traversing as a model for learning control of autonomous robots

    NASA Technical Reports Server (NTRS)

    Szakaly, Zoltan F.; Schenker, Paul S.

    1992-01-01

    This paper describes a recipe for the construction of control systems that support complex machines such as multi-limbed/multi-fingered robots. The robot has to execute a task under varying environmental conditions and it has to react reasonably when previously unknown conditions are encountered. Its behavior should be learned and/or trained as opposed to being programmed. The paper describes one possible method for organizing the data that the robot has learned by various means. This framework can accept useful operator input even if it does not fully specify what to do, and can combine knowledge from autonomous, operator assisted and programmed experiences.

  17. Towards robotic heart surgery: introduction of autonomous procedures into an experimental surgical telemanipulator system.

    PubMed

    Bauernschmitt, R; Schirmbeck, E U; Knoll, A; Mayer, H; Nagy, I; Wessel, N; Wildhirt, S M; Lange, R

    2005-09-01

    The introduction of telemanipulator systems into cardiac surgery enabled the heart surgeon to perform minimally invasive procedures with high precision and stereoscopic view. For further improvement and especially for inclusion of autonomous action sequences, implementation of force-feedback is necessary. The aim of our study was to provide a robotic scenario giving the surgeon an impression very similar to open procedures (high immersion) and to enable autonomous surgical knot tying with delicate suture material. In this experimental set-up the feasibility of autonomous surgical knot tying is demonstrated for the first time using stereoscopic view and force feedback.

  18. A Prototype Novel Sensor for Autonomous, Space Based Robots - Phase 2

    NASA Technical Reports Server (NTRS)

    Squillante, M. R.; Derochemont, L. P.; Cirignano, L.; Lieberman, P.; Soller, M. S.

    1990-01-01

    The goal of this program was to develop new sensing capabilities for autonomous robots operating in space. Information gained by the robot using these new capabilities would be combined with other information gained through more traditional capabilities, such as video, to help the robot characterize its environment as well as to identify known or unknown objects that it encounters. Several sensing capabilities using nuclear radiation detectors and backscatter technology were investigated. The result of this research has been the construction and delivery to NASA of a prototype system with three capabilities for use by autonomous robots. The primary capability was the use of beta particle backscatter measurements to determine the average atomic number (Z) of an object. This gives the robot a powerful tool to differentiate objects which may look the same, such as objects made out of different plastics or other lightweight materials. In addition, the same nuclear sensor used in the backscatter measurement can be used as a nuclear spectrometer to identify sources of nuclear radiation that may be encountered by the robot, such as nuclear powered satellites. A complete nuclear analysis system is included in the software and hardware of the prototype system built in Phase 2 of this effort. Finally, a method to estimate the radiation dose in the environment of the robot has been included as a third capability. Again, the same nuclear sensor is used in a different operating mode and with different analysis software. Each of these capabilities is described.

  19. Motor-response learning at a process control panel by an autonomous robot

    SciTech Connect

    Spelt, P.F.; de Saussure, G.; Lyness, E.; Pin, F.G.; Weisbin, C.R.

    1988-01-01

    The Center for Engineering Systems Advanced Research (CESAR) was founded at Oak Ridge National Laboratory (ORNL) by the Department of Energy's Office of Energy Research/Division of Engineering and Geoscience (DOE-OER/DEG) to conduct basic research in the area of intelligent machines. Accordingly, researchers at the CESAR Laboratory are engaged in a variety of research activities in the field of machine learning. In this paper, we describe our approach to a class of machine learning which involves motor response acquisition using feedback from trial-and-error learning. Our formulation is being experimentally validated using an autonomous robot learning the tasks of control panel monitoring and manipulation to effect process control. The CLIPS Expert System and the associated knowledge base used by the robot in the learning process, which reside in a hypercube computer aboard the robot, are described in detail. Benchmark testing of the learning process on a robot/control panel simulation system consisting of two intercommunicating computers is presented, along with results of sample problems used to train and test the expert system. These data illustrate machine learning and the resulting performance improvement in the robot for problems similar to, but not identical with, those on which the robot was trained. Conclusions are drawn concerning the learning problems, and implications for future work on machine learning for autonomous robots are discussed. 16 refs., 4 figs., 1 tab.
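
    The trial-and-error acquisition of panel responses can be caricatured with a small score-table sketch; the alarms, actions, feedback signal and update rule below are illustrative assumptions, not the CLIPS knowledge base described in the paper.

      # Sketch: trial-and-error learning of which panel action answers a given alarm
      # (alarms, actions and the simulated feedback are illustrative assumptions).
      import random

      alarms  = ["overpressure", "low_flow"]
      actions = ["open_valve", "close_valve", "press_reset"]
      score = {(a, b): 0.0 for a in alarms for b in actions}   # learned usefulness of each response

      def simulated_feedback(alarm, action):
          """Stand-in for the environment: positive reward if the action clears the alarm."""
          correct = {"overpressure": "open_valve", "low_flow": "press_reset"}
          return 1.0 if correct[alarm] == action else -0.2

      for trial in range(200):
          alarm = random.choice(alarms)
          # explore occasionally, otherwise exploit the best-scoring action so far
          if random.random() < 0.2:
              action = random.choice(actions)
          else:
              action = max(actions, key=lambda a: score[(alarm, a)])
          score[(alarm, action)] += 0.1 * (simulated_feedback(alarm, action) - score[(alarm, action)])

      print({k: round(v, 2) for k, v in score.items()})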

  20. Robot soccer anywhere: achieving persistent autonomous navigation, mapping, and object vision tracking in dynamic environments

    NASA Astrophysics Data System (ADS)

    Dragone, Mauro; O'Donoghue, Ruadhan; Leonard, John J.; O'Hare, Gregory; Duffy, Brian; Patrikalakis, Andrew; Leederkerken, Jacques

    2005-06-01

    The paper describes an ongoing effort to enable autonomous mobile robots to play soccer in unstructured, everyday environments. Unlike conventional robot soccer competitions that are usually held on purpose-built robot soccer "fields", in our work we seek to develop the capability for robots to demonstrate aspects of soccer-playing in more diverse environments, such as schools, hospitals, or shopping malls, with static obstacles (furniture) and dynamic natural obstacles (people). This problem of "Soccer Anywhere" presents numerous research challenges including: (1) Simultaneous Localization and Mapping (SLAM) in dynamic, unstructured environments, (2) software control architectures for decentralized, distributed control of mobile agents, (3) integration of vision-based object tracking with dynamic control, and (4) social interaction with human participants. In addition to the intrinsic research merit of these topics, we believe that this capability would prove useful for outreach activities, in demonstrating robotics technology to primary and secondary school students, to motivate them to pursue careers in science and engineering.

  1. Development of the Research Platform of Small Autonomous Blimp Robot

    NASA Astrophysics Data System (ADS)

    Takaya, Toshihiko; Kawamura, Hidenori; Yamamoto, Masahito; Ohuchi, Azuma

    A blimp robot is attractive as a small flight robot: it floats in the air by buoyancy, is safe against crashes, and can fly for a long time on low energy compared with other flight robots. However, control of a blimp robot is difficult because of its nonlinear dynamics, its inertia and the influence of air flow. Applied research that makes the most of these features of blimp robots has therefore become active in recent years. In this paper, we describe the development of a general-purpose research blimp robot, built by dividing the blimp robot body into exchangeable units, to support research on blimp robots and application development. A general-purpose blimp robot research platform improves the research efficiency of many researchers, makes it easier to start research on blimp robots, and thus contributes to the development of the field. We performed the following experiments as proof. 1. Basic position-keeping performance and various orbital maneuvers were confirmed, and the ease of exchanging a software unit was checked by swapping the control layer from PID control to learning control and comparing the resulting operation. 2. To check the ease of exchanging a hardware unit, the camera sensor was exchanged for a microphone and control of operation was confirmed. 3. For ease of unit addition, a microphone performing sound detection was added alongside the camera performing image detection, and control of operation was verified. 4. To check the ease of adding a function, a topological map generation experiment was conducted by adding an ultrasonic sensor. The developed research blimp robot thus provides this ease of unit exchange.

  2. Autonomous Soft Robotic Fish Capable of Escape Maneuvers Using Fluidic Elastomer Actuators.

    PubMed

    Marchese, Andrew D; Onal, Cagdas D; Rus, Daniela

    2014-03-01

    In this work we describe an autonomous soft-bodied robot that is both self-contained and capable of rapid, continuum-body motion. We detail the design, modeling, fabrication, and control of the soft fish, focusing on enabling the robot to perform rapid escape responses. The robot employs a compliant body with embedded actuators emulating the slender anatomical form of a fish. In addition, the robot has a novel fluidic actuation system that drives body motion and has all the subsystems of a traditional robot onboard: power, actuation, processing, and control. At the core of the fish's soft body is an array of fluidic elastomer actuators. We design the fish to emulate escape responses in addition to forward swimming because such maneuvers require rapid body accelerations and continuum-body motion. These maneuvers showcase the performance capabilities of this self-contained robot. The kinematics and controllability of the robot during simulated escape response maneuvers are analyzed and compared with studies on biological fish. We show that during escape responses, the soft-bodied robot has similar input-output relationships to those observed in biological fish. The major implication of this work is that we show soft robots can be both self-contained and capable of rapid body motion.

  3. Autonomous Soft Robotic Fish Capable of Escape Maneuvers Using Fluidic Elastomer Actuators

    PubMed Central

    Onal, Cagdas D.; Rus, Daniela

    2014-01-01

    Abstract In this work we describe an autonomous soft-bodied robot that is both self-contained and capable of rapid, continuum-body motion. We detail the design, modeling, fabrication, and control of the soft fish, focusing on enabling the robot to perform rapid escape responses. The robot employs a compliant body with embedded actuators emulating the slender anatomical form of a fish. In addition, the robot has a novel fluidic actuation system that drives body motion and has all the subsystems of a traditional robot onboard: power, actuation, processing, and control. At the core of the fish's soft body is an array of fluidic elastomer actuators. We design the fish to emulate escape responses in addition to forward swimming because such maneuvers require rapid body accelerations and continuum-body motion. These maneuvers showcase the performance capabilities of this self-contained robot. The kinematics and controllability of the robot during simulated escape response maneuvers are analyzed and compared with studies on biological fish. We show that during escape responses, the soft-bodied robot has similar input–output relationships to those observed in biological fish. The major implication of this work is that we show soft robots can be both self-contained and capable of rapid body motion. PMID:27625912

  4. A unified teleoperated-autonomous dual-arm robotic system

    NASA Technical Reports Server (NTRS)

    Hayati, Samad; Lee, Thomas S.; Tso, Kam Sing; Backes, Paul G.; Lloyd, John

    1991-01-01

    A description is given of a complete robot control facility built as part of a NASA telerobotics program to develop a state-of-the-art robot control environment for performing experiments in the repair and assembly of space-like hardware to gain practical knowledge of such work and to improve the associated technology. The basic architecture of the manipulator control subsystem is presented. The multiarm Robot Control C Library (RCCL), a key software component of the system, is described, along with its implementation on a Sun-4 computer. The system's simulation capability is also described, and the teleoperation and shared control features are explained.

  5. A simple, inexpensive, and effective implementation of a vision-guided autonomous robot

    NASA Astrophysics Data System (ADS)

    Tippetts, Beau; Lillywhite, Kirt; Fowers, Spencer; Dennis, Aaron; Lee, Dah-Jye; Archibald, James

    2006-10-01

    This paper discusses a simple, inexpensive, and effective implementation of a vision-guided autonomous robot. This implementation is a second-year entry for Brigham Young University students to the Intelligent Ground Vehicle Competition. The objective of the robot was to navigate a course constructed of white boundary lines and orange obstacles for the autonomous competition. A used electric wheelchair was used as the robot base. The wheelchair was purchased from a local thrift store for $28. The base was modified to include Kegresse tracks using a friction drum system. This modification allowed the robot to perform better on a variety of terrains, resolving issues with last year's design. To control the wheelchair while retaining its robust motor controls, the joystick was simply removed and replaced with a printed circuit board that emulated joystick operation and was capable of receiving commands through a serial port connection. Three different algorithms were implemented and compared: a purely reactive approach, a potential fields approach, and a machine learning approach. Each of the algorithms used color segmentation methods to interpret data from a digital camera in order to identify the features of the course. This paper will be useful to those interested in implementing an inexpensive vision-based autonomous robot.
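
    A minimal sketch of a purely reactive approach of the kind mentioned above is shown below, assuming the camera image has already been colour-segmented into boundary-line and obstacle masks; the steering rule, masks and gains are invented for illustration and are not the team's algorithms.

      # Sketch of a purely reactive steering rule driven by colour-segmented masks
      # (masks are synthetic; the steering rule is an assumed example).
      import numpy as np

      def reactive_steering(line_mask, obstacle_mask):
          """Return a turn command in [-1, 1]; positive steers right, away from a fuller left half."""
          h, w = line_mask.shape
          left  = line_mask[:, : w // 2].sum() + obstacle_mask[:, : w // 2].sum()
          right = line_mask[:, w // 2 :].sum() + obstacle_mask[:, w // 2 :].sum()
          total = left + right
          if total == 0:
              return 0.0                      # nothing ahead: drive straight
          return (left - right) / total       # steer away from the side with more features

      line = np.zeros((120, 160), int); line[:, :20] = 1            # white boundary line on the far left
      cones = np.zeros((120, 160), int); cones[60:, 100:120] = 1    # orange obstacle on the right
      print(reactive_steering(line, cones))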

  6. Adaptive artificial neural network for autonomous robot control

    NASA Technical Reports Server (NTRS)

    Arras, Michael K.; Protzel, Peter W.; Palumbo, Daniel L.

    1992-01-01

    The topics are presented in viewgraph form and include: neural network controller for robot arm positioning with visual feedback; initial training of the arm; automatic recovery from cumulative fault scenarios; and error reduction by iterative fine movements.

  7. Detection of Water Hazards for Autonomous Robotic Vehicles

    NASA Technical Reports Server (NTRS)

    Matthes, Larry; Belluta, Paolo; McHenry, Michael

    2006-01-01

    Four methods of detection of bodies of water are under development as means to enable autonomous robotic ground vehicles to avoid water hazards when traversing off-road terrain. The methods involve processing of digitized outputs of optoelectronic sensors aboard the vehicles. It is planned to implement these methods in hardware and software that would operate in conjunction with the hardware and software for navigation and for avoidance of solid terrain obstacles and hazards. The first method, intended for use during the day, is based on the observation that, under most off-road conditions, reflections of sky from water are easily discriminated from the adjacent terrain by their color and brightness, regardless of the weather and of the state of surface waves on the water. Accordingly, this method involves collection of color imagery by a video camera and processing of the image data by an algorithm that classifies each pixel as soil, water, or vegetation according to its color and brightness values (see figure). Among the issues that arise is the fact that in the presence of reflections of objects on the opposite shore, it is difficult to distinguish water by color and brightness alone. Another issue is that once a body of water has been identified by means of color and brightness, its boundary must be mapped for use in navigation. Techniques for addressing these issues are under investigation. The second method, which is not limited by time of day, is based on the observation that ladar returns from bodies of water are usually too weak to be detected. In this method, ladar scans of the terrain are analyzed for returns and the absence thereof. In appropriate regions, the presence of water can be inferred from the absence of returns. Under some conditions in which reflections from the bottom are detectable, ladar returns could, in principle, be used to determine depth. The third method involves the recognition of bodies of water as dark areas in short
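
    The per-pixel classifier described in the first method can be illustrated roughly as below; the colour and brightness thresholds are invented for the example and would in practice depend on the sensor and lighting, so this is not the system's actual algorithm.

      # Sketch: classify each RGB pixel as soil, water or vegetation by colour and brightness
      # (thresholds are invented for illustration).
      import numpy as np

      def classify_pixels(rgb):
          """Return an array of labels: 0 = soil, 1 = water, 2 = vegetation."""
          r, g, b = rgb[..., 0].astype(float), rgb[..., 1].astype(float), rgb[..., 2].astype(float)
          brightness = (r + g + b) / 3.0
          labels = np.zeros(rgb.shape[:2], dtype=int)              # default: soil
          labels[(b > r) & (b > g) & (brightness > 100)] = 1       # bright, blue-dominant: sky reflection / water
          labels[(g > r) & (g > b)] = 2                            # green-dominant: vegetation
          return labels

      img = np.zeros((2, 3, 3), dtype=np.uint8)
      img[0, 0] = (90, 120, 200)    # water-like pixel
      img[0, 1] = (60, 140, 60)     # vegetation-like pixel
      img[0, 2] = (120, 100, 80)    # soil-like pixel
      print(classify_pixels(img))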

  8. Challenging of path planning algorithms for autonomous robot in known environment

    NASA Astrophysics Data System (ADS)

    Farah, R. N.; Irwan, N.; Zuraida, Raja Lailatul; Shaharum, Umairah; Hanafi@Omar, Hafiz Mohd

    2014-06-01

    Most of the mobile robot path planning is estimated to reach its predetermined aim through the shortest path and avoiding the obstacles. This paper is a survey on path planning algorithms of various current research and existing system of Unmanned Ground Vehicles (UGV) where their challenging issues to be intelligent autonomous robot. The focuses are some short reviews on individual papers for UGV in the known environment. Methods and algorithms in path planning for the autonomous robot had been discussed. From the reviews, we obtained that the algorithms proposed are appropriate for some cases such as single or multiple obstacles, static or movement obstacle and optimal shortest path. This paper also describes some pros and cons for every reviewed paper toward algorithms improvement for further work.

  9. Remote Sensing of Radiation Dose Rate by Customizing an Autonomous Robot

    NASA Astrophysics Data System (ADS)

    Kobayashi, T.; Nakahara, M.; Morisato, K.; Takashina, T.; Kanematsu, H.

    2012-03-01

    The distribution of radiation dose was measured by customizing an autonomous cleaning robot ("Roomba") with a scintillation counter. The robot was used as a vehicle carrying the scintillation survey meter and was additionally equipped with an H8 microcomputer to remote-control the vehicle and to send the measured data. The data obtained were paired with position data, and a distribution map of the radiation dose rate was then produced. Manual, programmed and autonomous driving tests were conducted and all performed as expected; for each operational mode, measurements were tried both while moving continuously and while moving in discrete steps, inside and outside a room. Consequently, it has been confirmed that remote sensing of radiation dose rate is possible by customizing a commercially available robot.
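
    The pairing of position data with dose-rate readings can be sketched as a simple grid map; the cell size, sample values and averaging scheme below are assumptions for illustration, not the system described in the paper.

      # Sketch: build a dose-rate map from (x, y, dose_rate) samples sent by the robot
      # (cell size and the sample data are assumed values).
      from collections import defaultdict

      CELL = 0.5   # metres per grid cell

      samples = [(0.2, 0.1, 0.08), (0.6, 0.1, 0.12), (0.7, 0.2, 0.10), (1.4, 0.9, 0.35)]  # x, y, uSv/h

      sums, counts = defaultdict(float), defaultdict(int)
      for x, y, rate in samples:
          cell = (int(x // CELL), int(y // CELL))
          sums[cell] += rate
          counts[cell] += 1

      dose_map = {cell: sums[cell] / counts[cell] for cell in sums}   # mean dose rate per cell
      for cell, rate in sorted(dose_map.items()):
          print(cell, round(rate, 3))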

  10. Experiences in Deploying Test Arenas for Autonomous Mobile Robots

    DTIC Science & Technology

    2001-09-01

    Wallpaper and other types of materials pose challenges to stereo vision algorithms, and compliant objects may visually look like rigid obstacles. (Figure 3 shows features from the Yellow arena, including a curved wall and soft materials with a victim under a bed.) The Yellow arena is the easiest in terms of traversability, suited to researchers who may not have very agile robot platforms yet want to test whether their robots can find alternate routes to exit the arenas.

  11. Autonomous robot for detecting subsurface voids and tunnels using microgravity

    NASA Astrophysics Data System (ADS)

    Wilson, Stacy S.; Crawford, Nicholas C.; Croft, Leigh Ann; Howard, Michael; Miller, Stephen; Rippy, Thomas

    2006-05-01

    Tunnels have been used to evade security of defensive positions both during times of war and peace for hundreds of years. Tunnels are presently being built under the Mexican Border by drug smugglers and possibly terrorists. Several have been discovered at the border crossing at Nogales near Tucson, Arizona, along with others at other border towns. During this war on terror, tunnels under the Mexican Border pose a significant threat for the security of the United States. It is also possible that terrorists will attempt to tunnel under strategic buildings and possibly discharge explosives. The Center for Cave and Karst Study (CCKS) at Western Kentucky University has a long and successful history of determining the location of caves and subsurface voids using microgravity technology. Currently, the CCKS is developing a remotely controlled robot which will be used to locate voids underground. The robot will be a remotely controlled vehicle that will use microgravity and GPS to accurately detect and measure voids below the surface. It is hoped that this robot will also be used in military applications to locate other types of voids underground such as tunnels and bunkers. It is anticipated that the robot will be able to function up to a mile from the operator. This paper will describe the construction of the robot and the use of microgravity technology to locate subsurface voids with the robot.

  12. A traffic priority language for collision-free navigation of autonomous mobile robots in dynamic environments.

    PubMed

    Bourbakis, N G

    1997-01-01

    This paper presents a generic traffic priority language, called KYKLOFORTA, used by autonomous robots for collision-free navigation in a dynamic unknown or known navigation space. In a previous work by X. Grossmman (1988), a set of traffic control rules was developed for the navigation of robots on the lines of a two-dimensional (2-D) grid, and a control center coordinated and synchronized their movements. In this work, the robots are considered autonomous: they are moving anywhere and in any direction inside the free space, and there is no need for a central control to coordinate and synchronize them. The requirements for each robot are i) visual perception, ii) range sensors, and iii) the ability of each robot to detect other moving objects in the same free navigation space and to determine the other objects' perceived size, velocity and direction. Based on these assumptions, a traffic priority language is needed for each robot, enabling it to make decisions during navigation and avoid possible collisions with other moving objects. The traffic priority language proposed here is based on a primitive traffic-priority alphabet and rules which compose patterns of corridors for the application of the traffic priority rules.

  13. Terrain coverage of an unknown room by an autonomous mobile robot

    SciTech Connect

    VanderHeide, J.R.

    1995-12-05

    Terrain coverage problems are nearly as old as mankind: they were necessary early in our history for basic activities such as finding food and other necessities. As our societies and their associated machineries have grown more complex, we have not outgrown the need for this primitive skill. It is still used on a small scale for cleaning tasks and on a large scale for "search and report" missions of various kinds. The motivation for automating this process may not lie in the novelty of anything we might gain as an end product, but in freedom from something which we as humans find tedious, time-consuming and sometimes dangerous. Here we consider autonomous coverage of a terrain, typically indoor rooms, by a mobile robot that has no a priori model of the terrain. In evaluating its surroundings, the robot employs only inexpensive and commercially available ultrasonic and infrared sensors. The proposed solution is a basic step - a proof of principle - that can contribute to robots capable of autonomously performing tasks such as vacuum cleaning, mopping, radiation scanning, etc. The area of automatic terrain coverage and the closely related problem of terrain model acquisition have been studied both analytically and experimentally. Compared to the existing works, the following are three major distinguishing aspects of our study: (1) the theory is actually applied to an existing robot, (2) the robot has no a priori knowledge of the terrain, and (3) the robot can be realized relatively inexpensively.

  14. Bio-inspired motion planning algorithms for autonomous robots facilitating greater plasticity for security applications

    NASA Astrophysics Data System (ADS)

    Guo, Yi; Hohil, Myron; Desai, Sachi V.

    2007-10-01

    Proposed are techniques toward using collaborative robots for infrastructure security applications by utilizing them as mobile sensor suites. A vast number of critical facilities/technologies must be protected against unauthorized intruders, and employing a team of mobile robots working cooperatively can free up valuable human resources. Addressed are the technical challenges for multi-robot teams in security applications and the implementation of a multi-robot motion planning algorithm based on the patrolling and threat response scenario. A neural-network-based methodology is exploited to plan a patrolling path with complete coverage. Also described is a proof-of-principle experimental setup with a group of Pioneer 3-AT and Centibot robots. A block diagram of the system integration of sensing and planning illustrates the robot-to-robot interaction required to operate as a collaborative unit. The singular goal of the proposed approach is to overcome the limits of previous robotic approaches to security applications and to enable systems to be deployed for autonomous operation in an unaltered environment, providing access to an all-encompassing sensor suite.

  15. Evaluation of a Home Biomonitoring Autonomous Mobile Robot.

    PubMed

    Dorronzoro Zubiete, Enrique; Nakahata, Keigo; Imamoglu, Nevrez; Sekine, Masashi; Sun, Guanghao; Gomez, Isabel; Yu, Wenwei

    2016-01-01

    Increasing population age demands more services in the healthcare domain. It has been shown that mobile robots could be a potential solution to home biomonitoring for the elderly. Through our previous studies, a mobile robot system that is able to track a subject and identify his daily living activities has been developed. However, the system has not been tested in any home living scenarios. In this study we did a series of experiments to investigate the accuracy of activity recognition of the mobile robot in a home living scenario. The daily activities tested in the evaluation experiment include watching TV and sleeping. A dataset recorded by a distributed distance-measuring sensor network was used as a reference for the activity recognition results. It was shown that the accuracy is not consistent across all the activities; that is, the mobile robot could achieve a high success rate in some activities but a poor success rate in others. It was found that the observation position of the mobile robot and the subject's surroundings have a high impact on the accuracy of the activity recognition, due to the variability of the home living daily activities and their transitional process. The possibility of improving recognition accuracy was also shown.

  16. Evaluation of a Home Biomonitoring Autonomous Mobile Robot

    PubMed Central

    Dorronzoro Zubiete, Enrique; Nakahata, Keigo; Imamoglu, Nevrez; Sekine, Masashi; Sun, Guanghao; Gomez, Isabel; Yu, Wenwei

    2016-01-01

    Increasing population age demands more services in the healthcare domain. It has been shown that mobile robots could be a potential solution to home biomonitoring for the elderly. Through our previous studies, a mobile robot system that is able to track a subject and identify his daily living activities has been developed. However, the system has not been tested in any home living scenarios. In this study we did a series of experiments to investigate the accuracy of activity recognition of the mobile robot in a home living scenario. The daily activities tested in the evaluation experiment include watching TV and sleeping. A dataset recorded by a distributed distance-measuring sensor network was used as a reference for the activity recognition results. It was shown that the accuracy is not consistent across all the activities; that is, the mobile robot could achieve a high success rate in some activities but a poor success rate in others. It was found that the observation position of the mobile robot and the subject's surroundings have a high impact on the accuracy of the activity recognition, due to the variability of the home living daily activities and their transitional process. The possibility of improving recognition accuracy was also shown. PMID:27212940

  17. A Behavior-Based Strategy for Single and Multi-Robot Autonomous Exploration

    PubMed Central

    Cepeda, Jesus S.; Chaimowicz, Luiz; Soto, Rogelio; Gordillo, José L.; Alanís-Reyes, Edén A.; Carrillo-Arce, Luis C.

    2012-01-01

    In this paper, we consider the problem of autonomous exploration of unknown environments with single and multiple robots. This is a challenging task, with several potential applications. We propose a simple yet effective approach that combines a behavior-based navigation with an efficient data structure to store previously visited regions. This allows robots to safely navigate, disperse and efficiently explore the environment. A series of experiments performed using a realistic robotic simulator and a real testbed scenario demonstrate that our technique effectively distributes the robots over the environment and allows them to quickly accomplish their mission in large open spaces, narrow cluttered environments, dead-end corridors, as well as rooms with minimum exits.
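
    The combination of a behavior-based navigator with a structure that remembers visited regions can be sketched roughly as below; the grid resolution and the neighbour-picking rule are assumptions for illustration, not the paper's algorithm.

      # Sketch: remember visited cells and steer each robot toward an unvisited neighbour
      # (grid resolution and tie-breaking rule are assumed; not the paper's algorithm).
      visited = set()

      def choose_next_cell(robot_cell):
          """Prefer an unvisited neighbouring cell; fall back to any neighbour."""
          x, y = robot_cell
          neighbours = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
          unvisited = [c for c in neighbours if c not in visited]
          return unvisited[0] if unvisited else neighbours[0]

      cell = (0, 0)
      for _ in range(6):
          visited.add(cell)
          cell = choose_next_cell(cell)
      print("visited:", sorted(visited), "next:", cell)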

  18. An autonomous mobile robot to perform waste drum inspections

    SciTech Connect

    Peterson, K.D.; Ward, C.R.

    1994-03-01

    A mobile robot is being developed by the Savannah River Technology Center (SRTC) Robotics Group of Westinghouse Savannah River Company (WSRC) to perform mandated inspections of waste drums stored in warehouse facilities. The system will reduce personnel exposure and create accurate, high-quality documentation to ensure regulatory compliance. Development work is being coordinated among several DOE, academic and commercial entities in accordance with DOE's technology transfer initiative. The prototype system was demonstrated in November of 1993. A system is now being developed for field trials at the Fernald site.

  19. Development of a semi-autonomous service robot with telerobotic capabilities

    NASA Technical Reports Server (NTRS)

    Jones, J. E.; White, D. R.

    1987-01-01

    The importance to the United States of semi-autonomous systems for application to a large number of manufacturing and service processes is very clear. Two principal reasons emerge as the primary driving forces for development of such systems: enhanced national productivity and operation in environments which are hazardous to humans. Completely autonomous systems may not currently be economically feasible. However, autonomous systems that operate in a limited operation domain or that are supervised by humans are within the technology capability of this decade and will likely provide a reasonable return on investment. The two research and development efforts of autonomy and telerobotics are distinctly different, yet interconnected. The first addresses the communication of an intelligent electronic system with a robot, while the second requires human communication and ergonomic consideration. Discussed here is work in robotic control, human/robot team implementation, expert system robot operation, and sensor development by the American Welding Institute, MTS Systems Corporation, and the Colorado School of Mines--Center for Welding Research.

  20. Agent-based Multimodal Interface for Dynamically Autonomous Mobile Robots

    DTIC Science & Technology

    2003-01-01

    The gesture recognition process utilizes a structured-light rangefinder, which emits a horizontal plane of laser light, together with a camera. Statements are spoken by the robot through its on-board voice synthesizer and are also sent as a text string back to the desktop GUI.

  1. Semi-autonomous surgical tasks using a miniature in vivo surgical robot.

    PubMed

    Dumpert, Jason; Lehman, Amy C; Wood, Nathan A; Oleynikov, Dmitry; Farritor, Shane M

    2009-01-01

    Natural Orifice Translumenal Endoscopic Surgery (NOTES) is potentially the next step in minimally invasive surgery. This type of procedure could reduce patient trauma through eliminating external incisions, but poses many surgical challenges that are not sufficiently overcome with current flexible endoscopy tools. A robotic platform that attempts to emulate a laparoscopic interface for performing NOTES procedures is being developed to address these challenges. These robots are capable of entering the peritoneal cavity through the upper gastrointestinal tract, and once inserted are not constrained by incisions, allowing for visualization and manipulations throughout the cavity. In addition to using these miniature in vivo robots for NOTES procedures, these devices can also be used to perform semi-autonomous surgical tasks. Such tasks could be useful in situations where the patient is in a location far from a trained surgeon. A surgeon at a remote location could control the robot even if the communication link between surgeon and patient has low bandwidth or very high latency. This paper details work towards using the miniature robot to perform simple surgical tasks autonomously.

  2. R-MASTIF: robotic mobile autonomous system for threat interrogation and object fetch

    NASA Astrophysics Data System (ADS)

    Das, Aveek; Thakur, Dinesh; Keller, James; Kuthirummal, Sujit; Kira, Zsolt; Pivtoraiko, Mihail

    2013-01-01

    Autonomous robotic "fetch" operation, where a robot is shown a novel object and then asked to locate it in the field, re- trieve it and bring it back to the human operator, is a challenging problem that is of interest to the military. The CANINE competition presented a forum for several research teams to tackle this challenge using state of the art in robotics technol- ogy. The SRI-UPenn team fielded a modified Segway RMP 200 robot with multiple cameras and lidars. We implemented a unique computer vision based approach for textureless colored object training and detection to robustly locate previ- ously unseen objects out to 15 meters on moderately flat terrain. We integrated SRI's state of the art Visual Odometry for GPS-denied localization on our robot platform. We also designed a unique scooping mechanism which allowed retrieval of up to basketball sized objects with a reciprocating four-bar linkage mechanism. Further, all software, including a novel target localization and exploration algorithm was developed using ROS (Robot Operating System) which is open source and well adopted by the robotics community. We present a description of the system, our key technical contributions and experimental results.

  3. An integrated movement capture and control platform applied towards autonomous movements of surgical robots.

    PubMed

    Daluja, Sachin; Golenberg, Lavie; Cao, Alex; Pandya, Abhilash K; Auner, Gregory W; Klein, Michael D

    2009-01-01

    Robotic surgery has gradually gained acceptance due to its numerous advantages such as tremor filtration, increased dexterity and motion scaling. There remains, however, a significant scope for improvement, especially in the areas of surgeon-robot interface and autonomous procedures. Previous studies have attempted to identify factors affecting a surgeon's performance in a master-slave robotic system by tracking hand movements. These studies relied on conventional optical or magnetic tracking systems, making their use impracticable in the operating room. This study concentrated on building an intrinsic movement capture platform using microcontroller based hardware wired to a surgical robot. Software was developed to enable tracking and analysis of hand movements while surgical tasks were performed. Movement capture was applied towards automated movements of the robotic instruments. By emulating control signals, recorded surgical movements were replayed by the robot's end-effectors. Though this work uses a surgical robot as the platform, the ideas and concepts put forward are applicable to telerobotic systems in general.
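
    The record-and-replay idea described above can be sketched as time-stamped capture of commanded poses followed by re-emission with the original timing; the pose format and the send_command stub are hypothetical, not the study's interface.

      # Sketch: record time-stamped instrument commands, then replay them later
      # (the pose format and the send_command stub are hypothetical).
      import time

      log = []

      def record(pose):
          """Store a commanded end-effector pose with a timestamp."""
          log.append((time.monotonic(), pose))

      def replay(send_command):
          """Re-emit the recorded commands with their original relative timing."""
          if not log:
              return
          t0 = log[0][0]
          start = time.monotonic()
          for stamp, pose in log:
              while time.monotonic() - start < stamp - t0:
                  time.sleep(0.001)
              send_command(pose)

      record((0.0, 0.0, 0.1)); time.sleep(0.05); record((0.0, 0.01, 0.1))
      replay(lambda pose: print("replaying", pose))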

  4. The concept and architecture of data communication in autonomous cleaning robots

    NASA Astrophysics Data System (ADS)

    Paczesny, Daniel; Nowak, Bartosz; Tarapata, Grzegorz; Marzecki, Michał

    2016-09-01

    The paper presents the concept of a hardware and software architecture that can be easily implemented in autonomous cleaning robots. The requirement for such a system is reliability while still allowing free and simple expansion and modification. The paper describes considerations for the control and communication system, the data frame configuration and the software architecture. To show the results of the presented control and development system, a specialised measurement stand was also proposed and described. All performed tests passed successfully and, as a consequence, the system architecture was implemented on dedicated cleaning robots.

  5. Autonomous navigation vehicle system based on robot vision and multi-sensor fusion

    NASA Astrophysics Data System (ADS)

    Wu, Lihong; Chen, Yingsong; Cui, Zhouping

    2011-12-01

    The architecture of an autonomous navigation vehicle based on robot vision and multi-sensor fusion technology is described in this paper. To achieve greater intelligence and robustness, accurate real-time collection and processing of information are realized using this technology. The method for achieving robot vision and multi-sensor fusion is discussed in detail. Simulation results in several operating modes show that the intelligent vehicle performs better in obstacle identification and avoidance and in path planning, which can provide higher reliability during vehicle operation.

  6. Welding torch trajectory generation for hull joining using autonomous welding mobile robot

    NASA Astrophysics Data System (ADS)

    Hascoet, J. Y.; Hamilton, K.; Carabin, G.; Rauch, M.; Alonso, M.; Ares, E.

    2012-04-01

    Shipbuilding processes involve highly dangerous manual welding operations. Welding of ship hulls presents a hazardous environment for workers. This paper describes a new robotic system, developed by the SHIPWELD consortium, that moves autonomously on the hull and automatically executes the required welding processes. Specific focus is placed on the trajectory control of such a system and forms the basis for the discussion in this paper. It includes a description of the robotic hardware design as well as some methodology used to establish the torch trajectory control.

  7. Analysis of mutual assured destruction-like scenario with swarms of non-recallable autonomous robots

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2015-05-01

    This paper considers the implications of the creation of an autonomous robotic fighting force without recall capability, which could serve as a deterrent to a 'total war' magnitude attack. It discusses the technical considerations for this type of robotic system and the limited enhancements to current technologies (particularly UAVs) required to create such a system. Particular consideration is paid to how the introduction of this type of technology by one actor could create a need for reciprocal development. Also considered is the prospective utilization of this type of technology by non-state actors and the impact of this on state actors.

  8. Behavior-Based Multi-Robot Collaboration for Autonomous Construction Tasks

    NASA Technical Reports Server (NTRS)

    Stroupe, Ashley; Huntsberger, Terry; Okon, Avi; Aghazarian, Hrand; Robinson, Matthew

    2005-01-01

    We present a heterogeneous multi-robot system for autonomous construction of a structure through assembly of long components. Placement of a component within an existing structure in a realistic environment is demonstrated on a two-robot team. The task requires component acquisition, cooperative transport, and cooperative precision manipulation. For adaptability, the system is designed as a behavior-based architecture. For applicability to space-related construction efforts, computation, power, communication, and sensing are minimized, though the techniques developed are also applicable to terrestrial construction tasks.

  9. Positional estimation techniques for an autonomous mobile robot

    NASA Technical Reports Server (NTRS)

    Nandhakumar, N.; Aggarwal, J. K.

    1990-01-01

    Techniques for positional estimation of a mobile robot navigating in an indoor environment are described. A comprehensive review of the various positional estimation techniques studied in the literature is first presented. The techniques are divided into four different types and each of them is discussed briefly. Two different kinds of environments are considered for positional estimation: mountainous natural terrain and an urban, man-made environment with polyhedral buildings. In both cases, the robot is assumed to be equipped with a single visual camera that can be panned and tilted, and a 3-D description (world model) of the environment is also given. Such a description could be obtained from a stereo pair of aerial images or from the architectural plans of the buildings. Techniques for positional estimation using the camera input and the world model are presented.

  10. Fuzzy Logic Based Control for Autonomous Mobile Robot Navigation.

    PubMed

    Omrane, Hajer; Masmoudi, Mohamed Slim; Masmoudi, Mohamed

    This paper describes the design and implementation of a trajectory tracking controller using fuzzy logic for a mobile robot navigating in indoor environments. Most previous works used two independent controllers for navigation and obstacle avoidance. The main contribution of this paper is that we use only one fuzzy controller for both navigation and obstacle avoidance. The mobile robot is equipped with a DC motor, nine infrared range (IR) sensors to measure the distance to obstacles, and two optical encoders to provide the actual position and speeds. To evaluate the performance of the intelligent navigation algorithms, different trajectories are used and simulated using MATLAB software and the SIMIAM navigation platform. Simulation results show the performance of the intelligent navigation algorithms in terms of simulation times and travelled path.
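
    To make the single-controller idea above concrete, the sketch below combines goal seeking and obstacle avoidance in one small fuzzy rule base. The membership functions, distances, and rule weights are illustrative assumptions, not the controller reported in the paper.

```python
# Minimal single fuzzy controller sketch: one rule base handles both goal
# seeking and obstacle avoidance (membership shapes and gains are assumed).
import math

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def ramp_down(x, a, b):
    """Membership 1 below a, falling linearly to 0 at b."""
    if x <= a:
        return 1.0
    if x >= b:
        return 0.0
    return (b - x) / (b - a)

def fuzzy_steering(heading_error, obstacle_dist):
    """Steering command in [-1, 1] from heading error (rad, positive means
    the goal is to the left) and the distance (m) from the closest IR sensor."""
    err_left  = 1.0 - ramp_down(heading_error, 0.05, 0.6)
    err_zero  = tri(heading_error, -0.6, 0.0, 0.6)
    err_right = ramp_down(heading_error, -0.6, -0.05)
    near      = ramp_down(obstacle_dist, 0.2, 0.5)        # obstacle close
    far       = 1.0 - near                                 # path clear

    rules = [                          # (firing strength, steering output)
        (min(err_left,  far), +0.6),   # goal left, clear  -> turn left
        (min(err_zero,  far),  0.0),   # aligned,   clear  -> go straight
        (min(err_right, far), -0.6),   # goal right, clear -> turn right
        (near,                +1.0),   # obstacle ahead    -> hard evasive turn
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den > 0 else 0.0   # weighted-average defuzzification

print(fuzzy_steering(heading_error=0.3, obstacle_dist=0.9))
```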

  11. Fuzzy Logic Based Control for Autonomous Mobile Robot Navigation

    PubMed Central

    Masmoudi, Mohamed Slim; Masmoudi, Mohamed

    2016-01-01

    This paper describes the design and implementation of a trajectory tracking controller using fuzzy logic for a mobile robot navigating in indoor environments. Most previous works used two independent controllers for navigation and obstacle avoidance. The main contribution of this paper is that we use only one fuzzy controller for both navigation and obstacle avoidance. The mobile robot is equipped with a DC motor, nine infrared range (IR) sensors to measure the distance to obstacles, and two optical encoders to provide the actual position and speeds. To evaluate the performance of the intelligent navigation algorithms, different trajectories are used and simulated using MATLAB software and the SIMIAM navigation platform. Simulation results show the performance of the intelligent navigation algorithms in terms of simulation times and travelled path. PMID:27688748

  12. Multisensor robotic system for autonomous space maintenance and repair

    NASA Technical Reports Server (NTRS)

    Abidi, M. A.; Green, W. L.; Chandra, T.; Spears, J.

    1988-01-01

    The feasibility of realistic autonomous space manipulation tasks using multisensory information is demonstrated. The system is capable of acquiring, integrating, and interpreting multisensory data to locate, mate, and demate a Fluid Interchange System (FIS) and a Module Interchange System (MIS). In both cases, autonomous location of a guiding light target, mating, and demating of the system are performed. The implemented vision-driven techniques are used to determine the arbitrary two-dimensional position and orientation of the mating elements as well as the arbitrary three-dimensional position and orientation of the light targets. A force/torque sensor continuously monitors the six components of force and torque exerted on the end-effector. Both FIS and MIS experiments were successfully accomplished on mock-ups built for this purpose. The method is immune to variations in the ambient light, in particular because of the 90-minute day-night shift in space.

  13. Performance evaluation of 3D vision-based semi-autonomous control method for assistive robotic manipulator.

    PubMed

    Ka, Hyun W; Chung, Cheng-Shiu; Ding, Dan; James, Khara; Cooper, Rory

    2017-03-22

    We developed a 3D vision-based semi-autonomous control interface for assistive robotic manipulators. It was implemented based on one of the most popular commercially available assistive robotic manipulator combined with a low-cost depth-sensing camera mounted on the robot base. To perform a manipulation task with the 3D vision-based semi-autonomous control interface, a user starts operating with a manual control method available to him/her. When detecting objects within a set range, the control interface automatically stops the robot, and provides the user with possible manipulation options through audible text output, based on the detected object characteristics. Then, the system waits until the user states a voice command. Once the user command is given, the control interface drives the robot autonomously until the given command is completed. In the empirical evaluations conducted with human subjects from two different groups, it was shown that the semi-autonomous control can be used as an alternative control method to enable individuals with impaired motor control to more efficiently operate the robot arms by facilitating their fine motion control. The advantage of semi-autonomous control was not so obvious for the simple tasks. But, for the relatively complex real-life tasks, the 3D vision-based semi-autonomous control showed significantly faster performance. Implications for Rehabilitation A 3D vision-based semi-autonomous control interface will improve clinical practice by providing an alternative control method that is less demanding physically as well cognitively. A 3D vision-based semi-autonomous control provides the user with task specific intelligent semiautonomous manipulation assistances. A 3D vision-based semi-autonomous control gives the user the feeling that he or she is still in control at any moment. A 3D vision-based semi-autonomous control is compatible with different types of new and existing manual control methods for ARMs.
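
    A minimal sketch of the manual-to-autonomous hand-off described above is given below as a small state machine. The states, trigger distance, and command handling are illustrative assumptions about the workflow, not the interface's actual implementation.

```python
# Illustrative state machine for a manual -> announce options -> autonomous
# hand-off; detection range, messages, and state names are assumptions.
from enum import Enum, auto

class Mode(Enum):
    MANUAL = auto()
    AWAIT_COMMAND = auto()
    AUTONOMOUS = auto()

class SemiAutoInterface:
    DETECT_RANGE_M = 0.5   # assumed trigger distance for stopping the robot

    def __init__(self):
        self.mode = Mode.MANUAL
        self.options = []

    def on_depth_frame(self, detected_objects):
        """detected_objects: list of (name, distance_m) from the depth camera."""
        if self.mode is Mode.MANUAL:
            near = [name for name, dist in detected_objects
                    if dist < self.DETECT_RANGE_M]
            if near:
                self.options = near
                self.mode = Mode.AWAIT_COMMAND
                return "Stop. Options: " + ", ".join(self.options)
        return None

    def on_voice_command(self, command):
        if self.mode is Mode.AWAIT_COMMAND and command in self.options:
            self.mode = Mode.AUTONOMOUS
            return f"Executing '{command}' autonomously"
        return "Command ignored"

    def on_task_complete(self):
        self.mode = Mode.MANUAL   # hand control back to the user

iface = SemiAutoInterface()
print(iface.on_depth_frame([("cup", 0.3), ("bottle", 0.9)]))
print(iface.on_voice_command("cup"))
```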

  14. Needle Path Planning for Autonomous Robotic Surgical Suturing

    PubMed Central

    Jackson, Russell C.; Çavuşoğlu, M. Cenk

    2013-01-01

    This paper develops a path plan for suture needles used with solid tissue volumes in endoscopic surgery. The path trajectory is based on the best practices that are used by surgeons. The path attempts to minimize the interaction forces between the tissue and the needle. Using surgical guides as a basis, two different techniques for driving a suture needle are developed. The two techniques are compared in hardware experiments by robotically driving the suture needle using both of the motion plans. PMID:24683500

  15. Needle Path Planning for Autonomous Robotic Surgical Suturing.

    PubMed

    Jackson, Russell C; Cavuşoğlu, M Cenk

    2013-12-31

    This paper develops a path plan for suture needles used with solid tissue volumes in endoscopic surgery. The path trajectory is based on the best practices that are used by surgeons. The path attempts to minimize the interaction forces between the tissue and the needle. Using surgical guides as a basis, two different techniques for driving a suture needle are developed. The two techniques are compared in hardware experiments by robotically driving the suture needle using both of the motion plans.

  16. Application-based control of an autonomous mobile robot

    SciTech Connect

    Fisher, J.J.

    1988-01-01

    Industry response to new technology is governed, almost without exception, by the systems available to meet real-world needs, not by tools which merely prove the feasibility of the technology. To this end, SRL is developing robust control strategies and tools for potential autonomous vehicle applications on site. This document describes the work packages developed to perform remote tasks and an integrated control environment which allows rapid vehicle application development and diagnostic capabilities. 5 refs., 7 figs.

  17. Autonomous and Remote-Controlled Airborne and Ground-Based Robotic Platforms for Adaptive Geophysical Surveying

    NASA Astrophysics Data System (ADS)

    Spritzer, J. M.; Phelps, G. A.

    2011-12-01

    Low-cost autonomous and remote-controlled robotic platforms have opened the door to precision-guided geophysical surveying. Over the past two years, the U.S. Geological Survey, Senseta, NASA Ames Research Center, and Carnegie Mellon University Silicon Valley have developed and deployed small autonomous and remotely controlled vehicles for geophysical investigations. The purpose of this line of investigation is to 1) increase the analytical capability, resolution, and repeatability, and 2) decrease the time, and potentially the cost and man-power, necessary to conduct near-surface geophysical surveys. Current technology has advanced to the point where vehicles can perform geophysical surveys autonomously, freeing the geoscientist to process and analyze the incoming data in near-real time. This has enabled geoscientists to monitor survey parameters; process, analyze and interpret the incoming data; and test geophysical models in the same field session. This new approach, termed adaptive surveying, provides the geoscientist with choices of how the remainder of the survey should be conducted. Autonomous vehicles follow pre-programmed survey paths, which can be used to easily repeat surveys on the same path over large areas without the operator fatigue and error that plague man-powered surveys. While initial deployments of autonomous systems required a larger field crew than a man-powered survey, over time operational experience will reduce costs and man-power requirements. Using a low-cost, commercially available chassis as the base for autonomous surveying robotic systems promises to provide higher precision and efficiency than human-powered techniques. An experimental survey successfully demonstrated the adaptive techniques described. A magnetic sensor was mounted on a small rover, which autonomously drove a prescribed course designed to provide an overview of the study area. Magnetic data was relayed to the base station periodically, processed and gridded. A

  18. Road network modeling in open source GIS to manage the navigation of autonomous robots

    NASA Astrophysics Data System (ADS)

    Mangiameli, Michele; Muscato, Giovanni; Mussumeci, Giuseppe

    2013-10-01

    The autonomous navigation of a robot can be accomplished through the assignment of a sequence of waypoints previously identified in the territory to be explored. In general, the starting point is a vector graph of the network of possible paths. The vector graph can be directly available, as in the case of actual road networks, or it can be modeled, e.g., on the basis of cartographic data or, even better, of a digital terrain model (DTM). In this paper we present software procedures developed in the GRASS GIS, PostGIS and QGIS environments to identify, model, and visualize a road graph and to extract and normalize the sequences of waypoints which can be transferred to a robot for its autonomous navigation.
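
    As a toy illustration of turning a road graph into an ordered waypoint sequence, the sketch below runs a shortest-path search over an invented graph and maps the resulting nodes to map coordinates; the graph, costs, and coordinates are assumptions and are not derived from the GIS procedures described above.

```python
# Sketch: extract an ordered waypoint list from a road graph via Dijkstra.
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra over {node: [(neighbor, cost), ...]}; returns a node list."""
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(queue, (cost + w, nxt, path + [nxt]))
    return []

# Toy road graph (edge costs in metres) and node coordinates (metres).
graph = {"A": [("B", 40.0)], "B": [("C", 25.0), ("D", 60.0)], "C": [("D", 20.0)]}
coords = {"A": (0.0, 0.0), "B": (40.0, 0.0), "C": (40.0, 25.0), "D": (55.0, 38.0)}

waypoints = [coords[n] for n in shortest_path(graph, "A", "D")]
print(waypoints)   # ordered (x, y) waypoints for the robot to follow
```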

  19. Intelligent operating systems for autonomous robots: Real-time capabilities on a hypercube super-computer

    SciTech Connect

    Einstein, J.R.; Barhen, J.; Jefferson, D.

    1986-01-01

    Autonomous robots which must perform time-critical tasks in hostile environments require computers which can perform many asynchronous tasks at extremely high speeds. Certain hypercube multiprocessors have many of the required attributes, but their operating systems must be provided with special functions to improve the capability of the system to respond rapidly to unpredictable events. A "virtual-time" shell, under design for addition to the Vertex operating system of the NCUBE hypercube computer, and having such capabilities, is described.

  20. Minimalistic optic flow sensors applied to indoor and outdoor visual guidance and odometry on a car-like robot.

    PubMed

    Mafrica, Stefano; Servel, Alain; Ruffier, Franck

    2016-11-10

    Here we present a novel bio-inspired optic flow (OF) sensor and its application to visual guidance and odometry on a low-cost car-like robot called BioCarBot. The minimalistic OF sensor was robust to high-dynamic-range lighting conditions and to various visual patterns encountered thanks to its M²APIX auto-adaptive pixels and the new cross-correlation OF algorithm implemented. The low-cost car-like robot estimated its velocity and steering angle, and therefore its position and orientation, via an extended Kalman filter (EKF) using only two downward-facing OF sensors and the Ackerman steering model. Indoor and outdoor experiments were carried out in which the robot was driven in the closed-loop mode based on the velocity and steering angle estimates. The experimental results obtained show that our novel OF sensor can deliver high-frequency measurements ([Formula: see text]) in a wide OF range (1.5-[Formula: see text]) and in a 7-decade high-dynamic light level range. The OF resolution was constant and could be adjusted as required (up to [Formula: see text]), and the OF precision obtained was relatively high (standard deviation of [Formula: see text] with an average OF of [Formula: see text], under the most demanding lighting conditions). An EKF-based algorithm gave the robot's position and orientation with a relatively high accuracy (maximum errors outdoors at a very low light level: [Formula: see text] and [Formula: see text] over about [Formula: see text] and [Formula: see text]) despite the low-resolution control systems of the steering servo and the DC motor, as well as a simplified model identification and calibration. Finally, the minimalistic OF-based odometry results were compared to those obtained using measurements based on an inertial measurement unit (IMU) and a motor's speed sensor.
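
    The pose integration behind this kind of OF-based odometry can be sketched with the Ackerman (bicycle) kinematic model: once velocity and steering-angle estimates are available from the filter, the pose is propagated as below. The wheelbase, rates, and inputs here are illustrative assumptions, not the BioCarBot's parameters.

```python
# Ackerman / bicycle-model dead-reckoning sketch (parameters are assumed).
import math

WHEELBASE = 0.25   # metres, assumed

def propagate(pose, v, delta, dt):
    """pose = (x, y, theta); returns the pose after dt seconds of motion
    at speed v (m/s) with steering angle delta (rad)."""
    x, y, theta = pose
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += (v / WHEELBASE) * math.tan(delta) * dt
    return (x, y, theta)

pose = (0.0, 0.0, 0.0)
for _ in range(100):                      # 1 s of motion integrated at 100 Hz
    pose = propagate(pose, v=0.5, delta=math.radians(10), dt=0.01)
print(pose)
```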

  1. Self-organization of spiking neural network that generates autonomous behavior in a real mobile robot.

    PubMed

    Alnajjar, Fady; Murase, Kazuyuki

    2006-08-01

    In this paper, we propose a self-organization algorithm for a spiking neural network (SNN) applicable to autonomous robots for the generation of adaptive and goal-directed behavior. First, we formulated an SNN model whose inputs and outputs were analog and whose hidden units were interconnected with each other. Next, we implemented it in a miniature mobile robot, the Khepera. In order to see whether or not a solution for the given task(s) exists with the SNN, the robot was evolved with a genetic algorithm in the environment. The robot acquired the obstacle avoidance and navigation task successfully, demonstrating the existence of a solution. After that, a self-organization algorithm based on use-dependent synaptic potentiation and depotentiation at the synapses from the input layer to the hidden layer and from the hidden layer to the output layer was formulated and implemented in the robot. In the environment, the robot incrementally organized the network and the given tasks were successfully performed. The time needed to acquire the desired adaptive and goal-directed behavior using the proposed self-organization method was much less than that with genetic evolution, approximately one fifth.

  2. Automatic generation of modules of object categorization for autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Gorbenko, Anna

    2013-10-01

    Many robotic tasks require advanced systems of visual sensing. Robotic systems of visual sensing must be able to solve a number of different complex problems of visual data analysis. Object categorization is one of such problems. In this paper, we propose an approach to automatic generation of computationally effective modules of object categorization for autonomous mobile robots. This approach is based on the consideration of the stack cover problem. In particular, it is assumed that the robot is able to perform an initial inspection of the environment. After such inspection, the robot needs to solve the stack cover problem by using a supercomputer. A solution of the stack cover problem allows the robot to obtain a template for computationally effective scheduling of object categorization. Also, we consider an efficient approach to solve the stack cover problem. In particular, we consider an explicit reduction from the decision version of the stack cover problem to the satisfiability problem. For different satisfiability algorithms, the results of computational experiments are presented.

  3. Toward a mobile autonomous robotic system for Mars exploration

    NASA Astrophysics Data System (ADS)

    Arena, P.; Di Giamberardino, P.; Fortuna, L.; La Gala, F.; Monaco, S.; Muscato, G.; Rizzo, A.; Ronchini, R.

    2004-01-01

    The paper deals with the results obtained up to now in the design and realization of mobile platforms, wheeled and legged, for autonomous deployment in unknown and hostile environments: work developed in the framework of a project supported by the Italian Space Agency. The paper focuses on the description of the hierarchical architecture adopted for the planning, supervision and control of their mobility. Experimental results validate the proposed solutions, evidencing the capabilities of the platforms to explore environments in the presence of irregular ground shapes and obstacles of different dimensions.

  4. Adaptive Control for Autonomous Navigation of Mobile Robots Considering Time Delay and Uncertainty

    NASA Astrophysics Data System (ADS)

    Armah, Stephen Kofi

    Autonomous control of mobile robots has attracted considerable attention of researchers in the areas of robotics and autonomous systems during the past decades. One of the goals in the field of mobile robotics is the development of platforms that robustly operate in given, partially unknown, or unpredictable environments and offer desired services to humans. Autonomous mobile robots need to be equipped with effective, robust and/or adaptive navigation control systems. In spite of the enormous reported work on autonomous navigation control systems for mobile robots, achieving the goal above is still an open problem. Robustness and reliability of the controlled system can always be improved. The fundamental issues affecting the stability of the control systems include the undesired nonlinear effects introduced by actuator saturation, time delay in the controlled system, and uncertainty in the model. This research work develops robustly stabilizing control systems by investigating and addressing such nonlinear effects through analysis, simulations, and experiments. The control systems are designed to meet specified transient and steady-state specifications. The systems used for this research are ground (Dr Robot X80SV) and aerial (Parrot AR.Drone 2.0) mobile robots. Firstly, an effective autonomous navigation control system is developed for the X80SV using logic control by combining 'go-to-goal', 'avoid-obstacle', and 'follow-wall' controllers. A MATLAB robot simulator is developed to implement this control algorithm and experiments are conducted in a typical office environment. The next stage of the research develops autonomous position (x, y, and z) and attitude (roll, pitch, and yaw) controllers for a quadrotor, and PD-feedback control is used to achieve stabilization. The quadrotor's nonlinear dynamics and kinematics are implemented using a MATLAB S-function to generate the state output. Secondly, the white-box and black-box approaches are used to obtain a linearized

  5. Effectiveness of Social Behaviors for Autonomous Wheelchair Robot to Support Elderly People in Japan

    PubMed Central

    Shiomi, Masahiro; Iio, Takamasa; Kamei, Koji; Sharma, Chandraprakash; Hagita, Norihiro

    2015-01-01

    We developed a wheelchair robot to support the movement of elderly people and specifically implemented two functions to enhance their intention to use it: speaking behavior to convey place/location-related information and speed adjustment based on individual preferences. Our study examines how the evaluations of our wheelchair robot differ from those of human caregivers and of a conventional autonomous wheelchair without the two proposed functions in a moving-support context. Twenty-eight senior citizens participated in the experiment to evaluate the three different conditions. Our measurements consisted of questionnaire items and the coding of free-style interview results. Our experimental results revealed that elderly people rated our wheelchair robot higher than both the wheelchair without the two functions and the human caregivers on some items. PMID:25993038

  6. Incremental inverse kinematics based vision servo for autonomous robotic capture of non-cooperative space debris

    NASA Astrophysics Data System (ADS)

    Dong, Gangqi; Zhu, Z. H.

    2016-04-01

    This paper proposes a new incremental inverse kinematics based vision servo approach for robotic manipulators to capture a non-cooperative target autonomously. The target's pose and motion are estimated by a vision system using integrated photogrammetry and an EKF algorithm. Based on the estimated pose and motion of the target, the instantaneous desired position of the end-effector is predicted by inverse kinematics and the robotic manipulator is moved incrementally from its current configuration subject to the joint speed limits. This approach effectively eliminates the multiple solutions of the inverse kinematics and increases the robustness of the control algorithm. The proposed approach is validated by a hardware-in-the-loop simulation, where the pose and motion of the non-cooperative target are estimated by a real vision system. The simulation results demonstrate the effectiveness and robustness of the proposed estimation approach for the target and the incremental control strategy for the robotic manipulator.
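
    The incremental step described above can be illustrated on a planar two-link arm: at each cycle a damped least-squares increment moves the joints toward the predicted end-effector target, clipped to a per-step joint limit. The arm geometry, damping, and limits below are assumptions for illustration, not the manipulator used in the paper.

```python
# Incremental inverse-kinematics step with joint-speed limits (illustrative).
import numpy as np

L1, L2 = 0.5, 0.4                  # link lengths (m), assumed
MAX_DQ = np.radians(2.0)           # per-step joint increment limit, assumed

def fk(q):
    """Planar 2-link forward kinematics: joint angles -> end-effector (x, y)."""
    x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    return np.array([x, y])

def jacobian(q):
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

def ik_step(q, target, damping=1e-2):
    """One damped-least-squares increment toward target, clipped to MAX_DQ."""
    err = target - fk(q)
    J = jacobian(q)
    dq = np.linalg.solve(J.T @ J + damping * np.eye(2), J.T @ err)
    dq = np.clip(dq, -MAX_DQ, MAX_DQ)      # enforce joint speed limits
    return q + dq

q = np.array([0.3, 0.5])
target = np.array([0.6, 0.4])              # predicted end-effector position
for _ in range(200):                       # incremental servo cycles
    q = ik_step(q, target)
print(fk(q), target)
```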

  7. Autonomous function of wheelchair-mounted robotic manipulators to perform daily activities.

    PubMed

    Chung, Cheng-Shiu; Wang, Hongwu; Cooper, Rory A

    2013-06-01

    Autonomous functions for wheelchair-mounted robotic manipulators (WMRMs) allow a user to focus more on the outcome of the task - for example, eating or drinking - instead of moving robot joints through user interfaces. In this paper, we introduce a novel personal assistive robotic system based on a position-based visual servoing (PBVS) approach. The system was evaluated with a complete drinking task, which included recognizing the location of the drink, picking up the drink from a start location, conveying the drink to the proximity of the user's mouth without spilling, and placing the drink back on the table. For a drink located in front of the wheelchair, the success rate was nearly 100%. Overall, the total time to complete the drinking task is within 40 seconds.

  8. Effectiveness of social behaviors for autonomous wheelchair robot to support elderly people in Japan.

    PubMed

    Shiomi, Masahiro; Iio, Takamasa; Kamei, Koji; Sharma, Chandraprakash; Hagita, Norihiro

    2015-01-01

    We developed a wheelchair robot to support the movement of elderly people and specifically implemented two functions to enhance their intention to use it: speaking behavior to convey place/location-related information and speed adjustment based on individual preferences. Our study examines how the evaluations of our wheelchair robot differ from those of human caregivers and of a conventional autonomous wheelchair without the two proposed functions in a moving-support context. Twenty-eight senior citizens participated in the experiment to evaluate the three different conditions. Our measurements consisted of questionnaire items and the coding of free-style interview results. Our experimental results revealed that elderly people rated our wheelchair robot higher than both the wheelchair without the two functions and the human caregivers on some items.

  9. Command and Control Architectures for Autonomous Micro-Robotic Forces - FY-2000 Project Report

    SciTech Connect

    Dudenhoeffer, Donald Dean

    2001-04-01

    Advances in Artificial Intelligence (AI) and micro-technologies will soon give rise to the production of large-scale forces of autonomous micro-robots with systems of innate behaviors and with capabilities of self-organization and real-world tasking. Such organizations have been compared to schools of fish, flocks of birds, herds of animals, swarms of insects, and military squadrons. While these systems are envisioned as maintaining a high degree of autonomy, it is important to understand the relationship of man with such machines. In moving from research studies to the practical deployment of large-scale numbers of robots, one of the critical pieces that must be explored is the command and control architecture for humans to re-task the force and to inject global knowledge, experience, and intuition into it. Tele-operation should not be the goal, but rather a level of adjustable autonomy and high-level control. If a herd of sheep is comparable to the collective of robots, then the human element is comparable to the shepherd pulling in strays and guiding the herd in the direction of greener pastures. This report addresses the issues and development of command and control for large-scale numbers of autonomous robots deployed as a collective force.

  10. Aladdin: a semi-autonomous door opening system for EOD-class robots

    NASA Astrophysics Data System (ADS)

    Craft, Jack; Wilson, Jack; Huang, Wesley H.; Claffee, Mark R.; Phillips, Emilie A.

    2011-05-01

    This paper describes our results to date on the Aladdin project, an ongoing effort to enable small UGVs to open doors semi-autonomously. Our system consists of a modular general-purpose gripper and software that provides semi-autonomous capabilities. The gripper features compliant elements which simplify operations such as turning a doorknob and opening a door; this gripper can be retrofitted onto existing general-purpose robotic manipulators without extensive hardware modifications. The software provides semi-autonomous door opening capability through an OCU; these capabilities are focused on targeting and reaching for a doorknob, a subtask that our initial testing showed would provide the greatest improvement in door opening operations. This paper describes our system and the results of our evaluations on the door opening task. We continue to develop both the hardware and software with the ultimate goal of fully autonomous door opening.

  11. CREST Autonomous Robotic Scientist: Developing a Closed-Loop Science Exploration Capability for European Mars Missions

    NASA Astrophysics Data System (ADS)

    Woods, M.; Shaw, A.; Ward, R.; Barnes, D.; Pullan, D.; Long, D.

    2008-08-01

    In common with most Mars missions, the current communications baseline for Europe's ExoMars Rover mission exhibits constrained data links with Earth, making remote operations difficult. The time taken to transmit and react to planning data places a natural limit on the amount of science exploration that can be achieved in any given period. In order to increase the potential science return, autonomous science assessment and response is an attractive option and worthy of investigation. In this work, we have integrated technologies and techniques developed in previous studies and used the resulting test bed to demonstrate an autonomous, opportunistic science concept on a representative robotic platform. In addition to progressing the system design approach and individual autonomy components, we have introduced a methodology for autonomous science assessment based on terrestrial field science practice.

  12. Monocular SLAM for Autonomous Robots with Enhanced Features Initialization

    PubMed Central

    Guerra, Edmundo; Munguia, Rodrigo; Grau, Antoni

    2014-01-01

    This work presents a variant approach to the monocular SLAM problem focused on exploiting the advantages of a human-robot interaction (HRI) framework. Based upon the delayed inverse-depth feature initialization SLAM (DI-D SLAM), a known monocular technique, several crucial modifications are introduced, taking advantage of data from a secondary monocular sensor, assuming that this second camera is worn by a human. The human explores an unknown environment with the robot, and when their fields of view coincide, the cameras are considered a pseudo-calibrated stereo rig to produce estimations of depth through parallax. These depth estimations are used to solve a related problem with DI-D monocular SLAM, namely, the requirement of a metric scale initialization through known artificial landmarks. The same process is used to improve the performance of the technique when introducing new landmarks into the map. The convenience of the approach taken to the stereo estimation, based on SURF feature matching, is discussed. Experimental validation is provided through results from real data, with results showing the improvements in terms of more features correctly initialized, with reduced uncertainty, thus reducing scale and orientation drift. Additional discussion in terms of how a real-time implementation could take advantage of this approach is provided. PMID:24699284

  13. Monocular SLAM for autonomous robots with enhanced features initialization.

    PubMed

    Guerra, Edmundo; Munguia, Rodrigo; Grau, Antoni

    2014-04-02

    This work presents a variant approach to the monocular SLAM problem focused on exploiting the advantages of a human-robot interaction (HRI) framework. Based upon the delayed inverse-depth feature initialization SLAM (DI-D SLAM), a known monocular technique, several crucial modifications are introduced, taking advantage of data from a secondary monocular sensor, assuming that this second camera is worn by a human. The human explores an unknown environment with the robot, and when their fields of view coincide, the cameras are considered a pseudo-calibrated stereo rig to produce estimations of depth through parallax. These depth estimations are used to solve a related problem with DI-D monocular SLAM, namely, the requirement of a metric scale initialization through known artificial landmarks. The same process is used to improve the performance of the technique when introducing new landmarks into the map. The convenience of the approach taken to the stereo estimation, based on SURF feature matching, is discussed. Experimental validation is provided through results from real data, with results showing the improvements in terms of more features correctly initialized, with reduced uncertainty, thus reducing scale and orientation drift. Additional discussion in terms of how a real-time implementation could take advantage of this approach is provided.

  14. Emergence of Leadership in a Group of Autonomous Robots.

    PubMed

    Pugliese, Francesco; Acerbi, Alberto; Marocco, Davide

    2015-01-01

    In this paper we examine the factors contributing to the emergence of leadership in a group, and we explore the relationship between the role of the leader and the behavioural capabilities of other individuals. We use a simulation technique where a group of foraging robots must coordinate to choose between two identical food zones in order to forage collectively. Behavioural and quantitative analysis indicate that a form of leadership emerges, and that groups with a leader are more effective than groups without. Moreover, we show that the most skilled individuals in a group tend to be the ones that assume a leadership role, supporting biological findings. Further analysis reveals the emergence of different "styles" of leadership (active and passive).

  15. Emergence of Leadership in a Group of Autonomous Robots

    PubMed Central

    Pugliese, Francesco; Acerbi, Alberto; Marocco, Davide

    2015-01-01

    In this paper we examine the factors contributing to the emergence of leadership in a group, and we explore the relationship between the role of the leader and the behavioural capabilities of other individuals. We use a simulation technique where a group of foraging robots must coordinate to choose between two identical food zones in order to forage collectively. Behavioural and quantitative analysis indicate that a form of leadership emerges, and that groups with a leader are more effective than groups without. Moreover, we show that the most skilled individuals in a group tend to be the ones that assume a leadership role, supporting biological findings. Further analysis reveals the emergence of different “styles” of leadership (active and passive). PMID:26340449

  16. Autonomous Mobile Robot System for Monitoring and Control of Penetration during Fixed Pipes Welding

    NASA Astrophysics Data System (ADS)

    Muramatsu, Masahiro; Suga, Yasuo; Mori, Kazuhiro

    In order to obtain sound welded joints in the welding of horizontal fixed pipes, it is important to control the back bead width in the first pass. However, it is difficult to obtain the optimum back bead width, because the proper welding conditions change with the welding position. In this paper, in order to fully automate the welding of fixed pipes, a new method is developed to control the back bead width by monitoring the shape and dimensions of the molten pool from the reverse side with an autonomous mobile robot system. The robot has a spherical shape so that it can move along complex routes including curved pipes, elbow joints and so on. It also has a camera to observe the inner surface of the pipe and recognize the route along which it moves. The robot moves to the welding point in the pipe, and monitors the reverse-side shape of the molten pool during welding. The host computer processes the images of the molten pool acquired by the robot vision system, and calculates the optimum welding conditions to realize adaptive control of the welding. As a result of the welding control experiments, the effectiveness of this system for the penetration control of fixed pipes is demonstrated.

  17. Real-time map building and navigation for autonomous robots in unknown environments.

    PubMed

    Oriolo, G; Ulivi, G; Vendittelli, M

    1998-01-01

    An algorithmic solution method is presented for the problem of autonomous robot motion in completely unknown environments. Our approach is based on the alternate execution of two fundamental processes: map building and navigation. In the former, range measures are collected through the robot exteroceptive sensors and processed in order to build a local representation of the surrounding area. This representation is then integrated in the global map so far reconstructed by filtering out insufficient or conflicting information. In the navigation phase, an A*-based planner generates a local path from the current robot position to the goal. Such a path is safe inside the explored area and provides a direction for further exploration. The robot follows the path up to the boundary of the explored area, terminating its motion if unexpected obstacles are encountered. The most peculiar aspects of our method are the use of fuzzy logic for the efficient building and modification of the environment map, and the iterative application of A*, a complete planning algorithm which takes full advantage of local information. Experimental results for a NOMAD 200 mobile robot show the real-time performance of the proposed method, both in static and moderately dynamic environments.
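
    The alternation of map building and A*-based planning can be sketched as below on a simple occupancy grid; the grid, 4-connectivity, and unit step costs are simplifying assumptions and stand in for the fuzzy map representation used in the paper.

```python
# Compact A* planner over an occupancy grid, used inside a sense -> update
# map -> replan -> follow loop (grid and costs are illustrative assumptions).
import heapq

def astar(grid, start, goal):
    """4-connected A* over a 2-D list of 0/1 cells (1 = obstacle)."""
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])   # Manhattan heuristic
    openq, came, g = [(h(start), start)], {}, {start: 0}
    while openq:
        _, cur = heapq.heappop(openq)
        if cur == goal:
            path = [cur]
            while cur in came:                 # reconstruct path back to start
                cur = came[cur]
                path.append(cur)
            return path[::-1]
        for d in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + d[0], cur[1] + d[1])
            if (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])
                    and grid[nxt[0]][nxt[1]] == 0
                    and g[cur] + 1 < g.get(nxt, float("inf"))):
                g[nxt] = g[cur] + 1
                came[nxt] = cur
                heapq.heappush(openq, (g[nxt] + h(nxt), nxt))
    return []

# Alternation idea: update the grid from new range readings, replan with A*,
# and follow the path until an unexpected obstacle (or the boundary of the
# explored area) forces another map-building step.
grid = [[0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 0, 0, 0]]
print(astar(grid, (0, 0), (2, 3)))
```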

  18. 3-D world modeling for an autonomous robot

    SciTech Connect

    Goldstein, M.; Pin, F.G.; Weisbin, C.R.

    1987-08-01

    This paper presents a methodology for a concise representation of the 3-D world model for a mobile robot, using range data. The process starts with the segmentation of the scene into ''objects'' that are given a unique label, based on principles of range continuity. Then the external surface of each object is partitioned into homogeneous surface patches. Contours of surface patches in 3-D space are identified by estimating the normal and curvature associated with each pixel. The resulting surface patches are then classified as planar, convex or concave. Since the world model uses a volumetric representation for the 3-D environment, planar surfaces are represented by thin volumetric polyhedra. Spherical and cylindrical surfaces are extracted and represented by appropriate volumetric primitives. All other surfaces are represented using the boolean union of spherical volumes (as described in a separate paper by the same authors). The result is a general, concise representation of the external 3-D world, which allows for efficient and robust 3-D object recognition. 20 refs., 14 figs.

  19. Remotely manipulated and autonomous robotic welding fabrication in space

    NASA Technical Reports Server (NTRS)

    Agapakis, J. E.; Masubuchi, K.

    1985-01-01

    The results of a NASA sponsored study, performed in order to establish the feasibility of remotely manipulated or unmanned welding fabrication systems for space construction, are presented. Possible space welding fabrication tasks and operational modes are classified and the capabilities and limitations of human operators and machines are outlined. Human performance in remote welding tasks was experimentally tested under the sensing and actuation constraints imposed by remote manipulation in outer space environments. Proposals for the development of space welding technology are made and necessary future R&D efforts are identified. The development of improved visual sensing strategies and computer encoding of the human welding engineering expertise are identified as essential, both for human operator assistance and for autonomous operation in all phases of welding fabrication. Novel uses of machine vision for the determination of the weld joint and bead geometry are proposed, and a prototype of a rule-based expert system is described for the interpretation of the visually detected weld features and defects.

  20. Remotely Manipulated And Autonomous Robotic Welding Fabrication In Space

    NASA Astrophysics Data System (ADS)

    Agapakis, John E.; Masubuchi, Koichi

    1985-12-01

    The results of a National Aeronautics and Space Administration (NASA) sponsored study, performed in order to establish the feasibility of remotely manipulated or unmanned welding fabrication systems for space construction, are first presented in this paper. Possible space welding fabrication tasks and operational modes are classified and the capabilities and limitations of human operators and machines are outlined. The human performance in remote welding tasks is experimentally tested under the sensing and actuation constraints imposed by remote manipulation in outer space environments. Proposals for the development of space welding technology are made and necessary future research and development (R&D) efforts are identified. The development of improved visual sensing strategies and computer encoding of the human welding engineering expertise are identified as essential, both for human operator assistance and for autonomous operation in all phases of welding fabrication. Results of a related follow-up study are then briefly presented. Novel uses of machine vision for the determination of the weld joint and bead geometry are proposed and implemented, and a first prototype of a rule-based expert system is developed for the interpretation of the visually detected weld features and defects.

  1. Autonomous robot navigation based on the evolutionary multi-objective optimization of potential fields

    NASA Astrophysics Data System (ADS)

    Herrera Ortiz, Juan Arturo; Rodríguez-Vázquez, Katya; Padilla Castañeda, Miguel A.; Arámbula Cosío, Fernando

    2013-01-01

    This article presents the application of a new multi-objective evolutionary algorithm called RankMOEA to determine the optimal parameters of an artificial potential field for autonomous navigation of a mobile robot. Autonomous robot navigation is posed as a multi-objective optimization problem with three objectives: minimization of the distance to the goal, maximization of the distance between the robot and the nearest obstacle, and maximization of the distance travelled on each field configuration. Two decision makers were implemented using objective reduction and discrimination in performance trade-off. The performance of RankMOEA is compared with NSGA-II and SPEA2, including both decision makers. Simulation experiments using three different obstacle configurations and 10 different routes were performed using the proposed methodology. RankMOEA clearly outperformed NSGA-II and SPEA2. The robustness of this approach was evaluated with the simulation of different sensor masks and sensor noise. The scheme reported was also combined with the wavefront-propagation algorithm for global path planning.
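
    For reference, the sketch below shows the standard attractive/repulsive potential-field step whose gains and influence radius are the kind of parameters tuned by the evolutionary search above; the numeric values here are arbitrary illustrative candidates, not Pareto-optimal settings from the article.

```python
# Artificial-potential-field step with candidate (assumed) parameters.
import math

K_ATT, K_REP, RHO0 = 1.0, 0.3, 1.5    # attractive gain, repulsive gain, influence radius

def force(robot, goal, obstacles):
    """Negative gradient of the attractive + repulsive potential at `robot`."""
    fx = K_ATT * (goal[0] - robot[0])
    fy = K_ATT * (goal[1] - robot[1])
    for ox, oy in obstacles:
        dx, dy = robot[0] - ox, robot[1] - oy
        rho = math.hypot(dx, dy)
        if 1e-6 < rho < RHO0:          # obstacle inside influence radius
            mag = K_REP * (1.0 / rho - 1.0 / RHO0) / rho**2
            fx += mag * dx / rho       # push away from the obstacle
            fy += mag * dy / rho
    return fx, fy

pos, goal, obstacles = (0.0, 0.0), (5.0, 5.0), [(2.5, 2.2)]
for _ in range(500):                   # gradient-descent style motion steps
    fx, fy = force(pos, goal, obstacles)
    pos = (pos[0] + 0.01 * fx, pos[1] + 0.01 * fy)
print(pos)   # ends near the goal while skirting the obstacle
```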

  2. Towards MRI-Based Autonomous Robotic US Acquisitions: A First Feasibility Study.

    PubMed

    Hennersperger, Christoph; Fuerst, Bernhard; Virga, Salvatore; Zettinig, Oliver; Frisch, Benjamin; Neff, Thomas; Navab, Nassir

    2016-10-24

    Robotic ultrasound has the potential to assist and guide physicians during interventions. In this work, we present a set of methods and a workflow to enable autonomous MRI-guided ultrasound acquisitions. Our approach uses a structured-light 3D scanner for patient-to-robot and image-to-patient calibration, which in turn is used to plan 3D ultrasound trajectories. These MRI-based trajectories are followed autonomously by the robot and are further refined online using automatic MRI/US registration. Despite the low spatial resolution of structured-light scanners, the initial planned acquisition path can be followed with an accuracy of 2.46±0.96 mm. This leads to a good initialization of the MRI/US registration: the 3D-scan-based alignment for planning and acquisition shows an accuracy (distance between planned ultrasound and MRI) of 4.47 mm, and 0.97 mm after an online update of the calibration based on a closed-loop registration.

  3. Reliability of EUCLIDIAN: An autonomous robotic system for image-guided prostate brachytherapy

    PubMed Central

    Podder, Tarun K.; Buzurovic, Ivan; Huang, Ke; Showalter, Timothy; Dicker, Adam P.; Yu, Yan

    2011-01-01

    Purpose: Recently, several robotic systems have been developed to perform accurate and consistent image-guided brachytherapy. Before introducing a new device into clinical operations, it is important to assess the reliability and mean time before failure (MTBF) of the system. In this article, the authors present the preclinical evaluation and analysis of the reliability and MTBF of an autonomous robotic system developed for prostate seed implantation. Methods: The authors have considered three steps that are important in reliability growth analysis: identification and isolation of failures, classification of failures, and trend analysis. For any one-of-a-kind product, reliability enhancement is accomplished through test-fix-test. The authors have used failure mode and effect analysis for the collection and analysis of reliability data by identifying and categorizing the failure modes. Failures were classified according to severity. Failures that occurred during the operation of this robotic system were modeled as a nonhomogeneous Poisson process. The failure occurrence trend was analyzed using the Laplace test. For analyzing and predicting reliability growth, commonly used and widely accepted models, Duane’s model and the Army Material Systems Analysis Activity (i.e., Crow’s) model, were applied. The MTBF was used as an important measure for assessing the system’s reliability. Results: During preclinical testing, 3196 seeds (in 53 test cases) were deposited autonomously by the robot and 14 critical failures were encountered. The majority of the failures occurred during the first few cases. The distribution of failures followed Duane’s postulation as well as Crow’s postulation of reliability growth. The Laplace test index was −3.82 (<0), indicating a significant trend in the failure data, and the failure intervals lengthened gradually. The continuous increase in the failure occurrence interval suggested a trend toward improved
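
    The Laplace trend test mentioned above has a simple closed form; the sketch below computes the index for a time-truncated test, using invented failure times for illustration (the value will not reproduce the −3.82 reported for EUCLIDIAN).

```python
# Laplace trend test for reliability growth (failure data are invented).
import math

def laplace_index(failure_times, observation_end):
    """Laplace statistic U for a time-truncated test; U < 0 suggests that
    inter-failure times are lengthening, i.e. reliability growth."""
    n = len(failure_times)
    return (sum(failure_times) - n * observation_end / 2.0) / (
        observation_end * math.sqrt(n / 12.0))

# Hypothetical failure times (in accumulated test cases), clustered early on.
failures = [1, 2, 3, 5, 6, 8, 11, 15, 21, 28, 36, 41, 47, 52]
print(laplace_index(failures, observation_end=53))   # negative -> growth trend
```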

  4. Reliability of EUCLIDIAN: An autonomous robotic system for image-guided prostate brachytherapy

    SciTech Connect

    Podder, Tarun K.; Buzurovic, Ivan; Huang Ke; Showalter, Timothy; Dicker, Adam P.; Yu, Yan

    2011-01-15

    Purpose: Recently, several robotic systems have been developed to perform accurate and consistent image-guided brachytherapy. Before introducing a new device into clinical operations, it is important to assess the reliability and mean time before failure (MTBF) of the system. In this article, the authors present the preclinical evaluation and analysis of the reliability and MTBF of an autonomous robotic system developed for prostate seed implantation. Methods: The authors have considered three steps that are important in reliability growth analysis: identification and isolation of failures, classification of failures, and trend analysis. For any one-of-a-kind product, reliability enhancement is accomplished through test-fix-test. The authors have used failure mode and effect analysis for the collection and analysis of reliability data by identifying and categorizing the failure modes. Failures were classified according to severity. Failures that occurred during the operation of this robotic system were modeled as a nonhomogeneous Poisson process. The failure occurrence trend was analyzed using the Laplace test. For analyzing and predicting reliability growth, commonly used and widely accepted models, Duane's model and the Army Material Systems Analysis Activity (i.e., Crow's) model, were applied. The MTBF was used as an important measure for assessing the system's reliability. Results: During preclinical testing, 3196 seeds (in 53 test cases) were deposited autonomously by the robot and 14 critical failures were encountered. The majority of the failures occurred during the first few cases. The distribution of failures followed Duane's postulation as well as Crow's postulation of reliability growth. The Laplace test index was -3.82 (<0), indicating a significant trend in the failure data, and the failure intervals lengthened gradually. The continuous increase in the failure occurrence interval suggested a trend toward improved reliability. The MTBF

  5. A learning-based semi-autonomous controller for robotic exploration of unknown disaster scenes while searching for victims.

    PubMed

    Doroodgar, Barzin; Liu, Yugang; Nejat, Goldie

    2014-12-01

    Semi-autonomous control schemes can address the limitations of both teleoperation and fully autonomous robotic control of rescue robots in disaster environments by allowing a human operator to cooperate with a rescue robot and share such tasks as navigation, exploration, and victim identification. In this paper, we present a unique hierarchical reinforcement learning (HRL)-based semi-autonomous control architecture for rescue robots operating in cluttered and unknown urban search and rescue (USAR) environments. The aim of the controller is to enable a rescue robot to continuously learn from its own experiences in an environment in order to improve its overall performance in the exploration of unknown disaster scenes. A direction-based exploration technique is integrated in the controller to expand the search area of the robot via the classification of regions and the rubble piles within these regions. Both simulations and physical experiments in USAR-like environments verify the robustness of the proposed HRL-based semi-autonomous controller to unknown cluttered scenes with different sizes and varying types of configurations.

  6. Developing a Telescope Simulator Towards a Global Autonomous Robotic Telescope Network

    NASA Astrophysics Data System (ADS)

    Giakoumidis, N.; Ioannou, Z.; Dong, H.; Mavridis, N.

    2013-05-01

    A robotic telescope network is a system that integrates a number of telescopes to observe a variety of astronomical targets without being operated by a human. The system autonomously selects and observes targets in accordance with an optimized target list, dynamically allocating telescope resources depending on the observation requests, the specifications of the telescopes, target visibility, meteorological conditions, daylight, location restrictions and availability, and many other factors. In this paper, we introduce a telescope simulator which can control a telescope to a desired position in order to observe a specific object. The system includes a Client Module, a Server Module, and a Dynamic Scheduler Module. We use and integrate a number of open-source software packages to simulate the movement of a robotic telescope, the telescope characteristics, the observational data and weather conditions in order to test and optimize our system.

  7. Development and training of a learning expert system in an autonomous mobile robot via simulation

    SciTech Connect

    Spelt, P.F.; Lyness, E.; DeSaussure, G. . Center for Engineering Systems Advanced Research)

    1989-11-01

    The Center for Engineering Systems Advanced Research (CESAR) conducts basic research in the area of intelligent machines. Recently at CESAR, a learning expert system was created to operate on board an autonomous robot working at a process control panel. The authors discuss the two-computer simulation system used to create, evaluate and train this learning system. The simulation system has a graphics display of the current status of the process being simulated, and the same program which does the simulating also drives the actual control panel. Simulation results were validated on the actual robot. The speed and safety advantages of using a computerized simulator to train a learning computer, and future uses of the simulation system, are discussed.

  8. Non-equilibrium assembly of microtubules: from molecules to autonomous chemical robots.

    PubMed

    Hess, H; Ross, Jennifer L

    2017-03-22

    Biological systems have evolved to harness non-equilibrium processes from the molecular to the macro scale. It is currently a grand challenge of chemistry, materials science, and engineering to understand and mimic biological systems that have the ability to autonomously sense stimuli, process these inputs, and respond by performing mechanical work. New chemical systems are responding to the challenge and form the basis for future responsive, adaptive, and active materials. In this article, we describe a particular biochemical-biomechanical network based on the microtubule cytoskeletal filament - itself a non-equilibrium chemical system. We trace the non-equilibrium aspects of the system from molecules to networks and describe how the cell uses this system to perform active work in essential processes. Finally, we discuss how microtubule-based engineered systems can serve as testbeds for autonomous chemical robots composed of biological and synthetic components.

  9. Simulation of Autonomous Robotic Multiple-Core Biopsy by 3D Ultrasound Guidance

    PubMed Central

    Liang, Kaicheng; Rogers, Albert J.; Light, Edward D.; von Allmen, Daniel; Smith, Stephen W.

    2010-01-01

    An autonomous multiple-core biopsy system guided by real-time 3D ultrasound and operated by a robotic arm with 6+1 degrees of freedom has been developed. Using a specimen of turkey breast as a tissue phantom, our system was able to first autonomously locate the phantom in the image volume and then perform needle sticks in each of eight sectors in the phantom in a single session, with no human intervention required. Based on the fraction of eight sectors successfully sampled in an experiment of five trials, a success rate of 93% was recorded. This system could have relevance in clinical procedures that involve multiple needle-core sampling such as prostate or breast biopsy. PMID:20687279

  10. Autonomous Navigation System for Mobile Robot Using Randomly Distributed Passive RFID Tags

    NASA Astrophysics Data System (ADS)

    Park, Sunhong; Hashimoto, Shuji

    This paper presents an autonomous navigation system for a mobile robot using randomly distributed passive RFID tags. In the case of randomly distributed RFID tags, it is difficult to determine the precise location of the robot, especially in areas of sparse RFID tag distribution. This, combined with the wide turning radius of the robot, can cause the robot to enter a zigzag exploration path and miss the goal. In RFID-based navigation, the key is to reduce both the number of RFID tags and the localization error for practical use in a large space. To cope with these issues, we utilized the Read time, which measures the reading time of each RFID tag. With this, we could accurately estimate the robot's location and orientation without using any external sensors or increasing the number of RFID tags. Average estimation errors of 7.8 cm in localization and 11 degrees in orientation were achieved with 102 RFID tags in an area of 4.2 m by 6.2 m. Our proposed method is verified by comparing the path trajectories produced during navigation with those of conventional approaches.
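
    A much-simplified flavour of tag-based position estimation is sketched below: detected tag positions are averaged with weights derived from their read times (a longer read is taken to mean the tag sat nearer the antenna centre). This weighting is an illustrative assumption and is not the Read-time estimator of the paper.

```python
# Read-time-weighted average over detected RFID tag positions (illustrative).
def estimate_position(detections):
    """detections: list of ((x, y) tag position in metres, read_time_s)."""
    total = sum(t for _, t in detections)
    x = sum(p[0] * t for p, t in detections) / total
    y = sum(p[1] * t for p, t in detections) / total
    return x, y

# Three tags currently in the antenna field, with their read times.
detections = [((1.0, 2.0), 0.8), ((1.5, 2.0), 0.3), ((1.0, 2.5), 0.4)]
print(estimate_position(detections))
```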

  11. Neural network representation of sensor graphs in autonomous robot path planning

    SciTech Connect

    Jorgensen, C.C.

    1987-01-01

    This paper discusses a continuous-valued associative neural network used for anticipatory robot navigation planning in partially learned environments. A navigation methodology is implemented in four steps. First, a room is represented as a lattice of connected voxels formed by dividing the navigation space into equal-sized volumetric cells. Each voxel is associated with a simulated neuron. The magnitude of a neuron's activation corresponds to a probability of voxel occupancy calculated from a series of sonar readings taken by an autonomous robot. Neurons are trained with a series of room patterns derived from varying robot sensor perspectives. At another time, the robot is exposed to a single perspective of one of the rooms and utilizes the sensor return as a cue to prompt associative recall of a best guess of the complete interior of the room. A two-step path planning operation is then invoked which uses line-of-sight readings and anticipated global information to form a trial path plan. The planning process merges a nearest-neighbor grid cell technique and a simulated annealing gradient descent method to optimize traversal movements. In the final step, the path is followed until a mismatch between the estimated room and the actual sensor returns indicates incorrect anticipation. Implementation of the method on a hypercube computer is discussed along with memory/computation tradeoff requirements.

  12. 3-D world modeling based on combinatorial geometry for autonomous robot navigation

    SciTech Connect

    Goldstein, M.; Pin, F.G.; de Saussure, G.; Weisbin, C.R.

    1987-01-01

    In applications of robotics to surveillance and mapping at nuclear facilities, the scene to be described is fundamentally three-dimensional. Usually, only partial information concerning the 3-D environment is known a priori. Using an autonomous robot, this information may be updated using range data to provide an accurate model of the environment. Range data quantify the distances from the sensor focal plane to the object surface. In other words, the 3-D coordinates of discrete points on the object surface are known. The approach proposed herein for 3-D world modeling is based on the Combinatorial Geometry (C.G.) Method, which is widely used in Monte Carlo particle transport calculations. First, each measured point on the object surface is surrounded by a small solid sphere with a radius determined by the range to that point. Then, the 3-D shapes of the visible surfaces are obtained by taking the (Boolean) union of all the spheres. The result is a concise and unambiguous representation of the object's boundary surfaces. The distances from discrete points on the robot's boundary surface to various objects are calculated effectively using the C.G. type of representation. This feature is particularly useful for navigation purposes. The efficiency of the proposed approach is illustrated by a simulation of a spherical robot navigating in a 3-D room with several static obstacles.
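
    A distance query against such a union-of-spheres model reduces to a minimum over per-sphere signed distances, as the toy sketch below shows; the sphere centres and radii are invented for illustration.

```python
# Distance from a query point to a union-of-spheres surface model.
import math

def distance_to_model(query, spheres):
    """spheres: list of ((x, y, z) centre, radius); negative means inside."""
    return min(math.dist(query, centre) - radius for centre, radius in spheres)

# Each range measurement becomes a small sphere around the measured point.
spheres = [((1.0, 0.0, 0.5), 0.05), ((1.1, 0.1, 0.5), 0.05), ((1.2, 0.0, 0.6), 0.05)]
print(distance_to_model((0.0, 0.0, 0.5), spheres))   # clearance to the modelled surface
```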

  13. Localization of non-linearly modeled autonomous mobile robots using out-of-sequence measurements.

    PubMed

    Besada-Portas, Eva; Lopez-Orozco, Jose A; Lanillos, Pablo; de la Cruz, Jesus M

    2012-01-01

    This paper presents a state of the art of the estimation algorithms dealing with Out-of-Sequence (OOS) measurements for non-linearly modeled systems. The state of the art includes a critical analysis of the algorithm properties that takes into account the applicability of these techniques to autonomous mobile robot navigation based on the fusion of measurements, delayed and OOS, provided by multiple sensors. Besides, it shows a representative example of the use of one of the most computationally efficient approaches in the localization module of the control software of a real robot (which has non-linear dynamics, and linear and non-linear sensors) and compares its performance against other approaches. The simulated results obtained with the selected OOS algorithm show the computational requirements that each sensor of the robot imposes on it. The real experiments show how the inclusion of the selected OOS algorithm in the control software lets the robot successfully navigate in spite of receiving many OOS measurements. Finally, the comparison highlights that not only is the selected OOS algorithm among the best performing ones of the comparison, but it also has the lowest computational and memory cost.
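
    One of the simpler strategies covered by such surveys is to buffer recent states and measurements and, when a delayed measurement arrives, rewind and reprocess; the sketch below does this for a scalar random-walk Kalman filter. The model, noise values, and timestamps are illustrative assumptions, and this is not the specific algorithm selected in the paper.

```python
# Buffer-and-reprocess handling of an out-of-sequence (OOS) measurement
# with a 1-D constant-position (random-walk) Kalman filter.
class KF1D:
    def __init__(self, x0=0.0, p0=1.0, q=0.01, r=0.5):
        self.x, self.p, self.q, self.r = x0, p0, q, r
        self.history = []                    # [(t, x, p, z), ...] post-update states

    def _step(self, z):
        self.p += self.q                     # predict (random-walk model)
        k = self.p / (self.p + self.r)       # Kalman gain
        self.x += k * (z - self.x)           # measurement update
        self.p *= (1.0 - k)

    def _apply(self, t, z):
        self._step(z)
        self.history.append((t, self.x, self.p, z))

    def update(self, t, z):
        """Process measurement z with timestamp t; t may arrive out of sequence."""
        older = [i for i, h in enumerate(self.history) if h[0] <= t]
        if self.history and t < self.history[-1][0] and older:
            # Rewind to the stored state just before t, then replay newer data.
            idx = older[-1]
            _, self.x, self.p, _ = self.history[idx]
            replay = [h for h in self.history if h[0] > t]
            self.history = self.history[:idx + 1]
            self._apply(t, z)
            for ht, _, _, hz in replay:
                self._apply(ht, hz)
        else:
            # In-sequence (measurements older than the whole buffer are simply
            # appended in this sketch).
            self._apply(t, z)

kf = KF1D()
for t, z in [(1, 1.0), (2, 1.2), (4, 1.5), (3, 1.3)]:   # t = 3 arrives late
    kf.update(t, z)
print(round(kf.x, 3))
```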

  14. Localization of Non-Linearly Modeled Autonomous Mobile Robots Using Out-of-Sequence Measurements

    PubMed Central

    Besada-Portas, Eva; Lopez-Orozco, Jose A.; Lanillos, Pablo; de la Cruz, Jesus M.

    2012-01-01

    This paper presents a state of the art of the estimation algorithms dealing with Out-of-Sequence (OOS) measurements for non-linearly modeled systems. The state of the art includes a critical analysis of the algorithm properties that takes into account the applicability of these techniques to autonomous mobile robot navigation based on the fusion of the measurements provided, delayed and OOS, by multiple sensors. Besides, it shows a representative example of the use of one of the most computationally efficient approaches in the localization module of the control software of a real robot (which has non-linear dynamics, and linear and non-linear sensors) and compares its performance against other approaches. The simulated results obtained with the selected OOS algorithm show the computational requirements that each sensor of the robot imposes on it. The real experiments show how the inclusion of the selected OOS algorithm in the control software lets the robot successfully navigate in spite of receiving many OOS measurements. Finally, the comparison highlights that not only is the selected OOS algorithm among the best performing ones of the comparison, but it also has the lowest computational and memory cost. PMID:22736962

  15. Autonomous robotic capture of non-cooperative target by adaptive extended Kalman filter based visual servo

    NASA Astrophysics Data System (ADS)

    Dong, Gangqi; Zhu, Zheng H.

    2016-05-01

    This paper presents a real-time, vision-based algorithm for the pose and motion estimation of non-cooperative targets and its application in a visual servo robotic manipulator to perform autonomous capture. A hybrid approach of adaptive extended Kalman filter and photogrammetry is developed for the real-time pose and motion estimation of non-cooperative targets. Based on the pose and motion estimates, the desired pose and trajectory of the end-effector are defined and the corresponding desired joint angles of the robotic manipulator are derived by inverse kinematics. A closed-loop visual servo control scheme is then developed for the robotic manipulator to track, approach and capture the target. Validating experiments are designed and performed on a custom-built six-degrees-of-freedom robotic manipulator with an eye-in-hand configuration. The experimental results demonstrate the feasibility, effectiveness and robustness of the proposed adaptive extended Kalman filter-enabled pose and motion estimation and visual servo strategy.
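
    A generic, innovation-based adaptive EKF cycle of the kind referenced here can be sketched as follows. This is not the authors' filter or their photogrammetric measurement model; the noise-adaptation rule and all symbols are illustrative assumptions (f, h: process and measurement models; F, H: their Jacobians evaluated at the current estimate).

      import numpy as np

      def aekf_step(x, P, z, f, F, h, H, Q, R, alpha=0.3):
          # Predict
          x_pred = f(x)
          P_pred = F @ P @ F.T + Q
          # Innovation against the predicted (photogrammetric) measurement
          y = z - h(x_pred)
          S = H @ P_pred @ H.T + R
          # Simple innovation-based adaptation of R (illustrative only; a real
          # implementation must keep R symmetric positive definite).
          R_new = (1 - alpha) * R + alpha * (np.outer(y, y) - H @ P_pred @ H.T)
          # Update
          K = P_pred @ H.T @ np.linalg.inv(S)
          x_new = x_pred + K @ y
          P_new = (np.eye(len(x)) - K @ H) @ P_pred
          return x_new, P_new, R_new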

  16. Navigation of Autonomous Mobile Robot under Decision-making Strategy tuned by Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Wang, Fei; Kamano, Takuya; Yasuno, Takashi; Suzuki, Takayuki; Harada, Hironobu

    This paper describes a novel application of a genetic algorithm for navigation of an autonomous mobile robot (AMR) in unknown environments. In the navigation system, the AMR is controlled by the decision-making block, which consists of a neural network. To achieve both successful navigation to the goal and suitable obstacle avoidance, the connection weights of the neural network and the speed gains for predefined actions are encoded as genotypes and are tuned simultaneously by the genetic algorithm so that the static and dynamic danger degrees, the energy consumption, and the distance and direction errors decrease during navigation. Experimental results demonstrate the validity of the proposed navigation system.
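
    A minimal genetic-algorithm loop in the spirit described above might look like the sketch below, where each genotype concatenates the network connection weights and the action speed gains, and the caller-supplied fitness function is assumed to penalize danger degrees, energy consumption and goal errors measured during a navigation trial. The operator choices (one-point crossover, Gaussian mutation, truncation selection) are assumptions, not the authors' exact settings.

      import random

      def evolve(pop_size, n_weights, n_gains, fitness, generations=100,
                 mut_rate=0.05, mut_sigma=0.1):
          genome_len = n_weights + n_gains       # NN weights followed by speed gains
          pop = [[random.uniform(-1, 1) for _ in range(genome_len)]
                 for _ in range(pop_size)]
          for _ in range(generations):
              scored = sorted(pop, key=fitness, reverse=True)
              elite = scored[: pop_size // 2]    # truncation selection
              children = []
              while len(children) < pop_size - len(elite):
                  a, b = random.sample(elite, 2)
                  cut = random.randrange(1, genome_len)          # one-point crossover
                  child = a[:cut] + b[cut:]
                  child = [g + random.gauss(0, mut_sigma)        # Gaussian mutation
                           if random.random() < mut_rate else g
                           for g in child]
                  children.append(child)
              pop = elite + children
          return max(pop, key=fitness)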

  17. Safety Verification of a Fault Tolerant Reconfigurable Autonomous Goal-Based Robotic Control System

    NASA Technical Reports Server (NTRS)

    Braman, Julia M. B.; Murray, Richard M; Wagner, David A.

    2007-01-01

    Fault tolerance and safety verification of control systems are essential for the success of autonomous robotic systems. A control architecture called Mission Data System (MDS), developed at the Jet Propulsion Laboratory, takes a goal-based control approach. In this paper, a method for converting goal network control programs into linear hybrid systems is developed. The linear hybrid system can then be verified for safety in the presence of failures using existing symbolic model checkers. An example task is simulated in MDS and successfully verified using HyTech, a symbolic model checking software for linear hybrid systems.

  18. Autonomous global sky monitoring with real-time robotic follow-up

    SciTech Connect

    Vestrand, W Thomas; Davis, H; Wren, J; Wozniak, P; Norman, B; White, R; Bloch, J; Fenimore, E; Hodge, Barry; Jah, Moriba; Rast, Richard

    2008-01-01

    We discuss the development of prototypes for a global grid of advanced 'thinking' sky sentinels and robotic follow-up telescopes that observe the full night sky, providing real-time monitoring by autonomously recognizing anomalous behavior, selecting targets for detailed investigation, and performing real-time anomaly detection to enable rapid recognition of and a swift response to transients as they emerge. This T3 global EO grid avoids the limitations imposed by geography and weather to provide persistent monitoring of the night sky.

  19. Concept for practical exercises for studying autonomous flying robots in a university environment: part II

    NASA Astrophysics Data System (ADS)

    Gageik, Nils; Dilger, Erik; Montenegro, Sergio; Schön, Stefan; Wildenhein, Rico; Creutzburg, Reiner; Fischer, Arno

    2015-03-01

    The present paper demonstrates the application of quadcopters as educational material for students in aerospace computer science, as it is already in use today. The work with quadrotors teaches students theoretical and practical knowledge in the fields of robotics, control theory, aerospace and electrical engineering as well as embedded programming and computer science. For this, the material, concept, realization and future outlook of such a course are discussed in this paper. Besides that, the paper gives a brief overview of student research projects following the course, which are related to the research and development of fully autonomous quadrotors.

  20. Implementation of Obstacle-Avoidance Control for an Autonomous Omni-Directional Mobile Robot Based on Extension Theory

    PubMed Central

    Pai, Neng-Sheng; Hsieh, Hung-Hui; Lai, Yi-Chung

    2012-01-01

    The paper demonstrates a following robot with omni-directional wheels, which is able to take action to avoid obstacles. The robot design is based on both fuzzy and extension theory. Fuzzy theory was applied to tune the PWM signal of the motor revolution and correct path deviation issues encountered when the robot is moving. Extension theory was used to build a robot obstacle-avoidance model. Various mobile models were developed to handle different types of obstacles. The ultrasonic distance sensors mounted on the robot were used to estimate the distance to obstacles. If an obstacle is encountered, the correlation function is evaluated and the robot avoids the obstacle autonomously using the most appropriate mode. The effectiveness of the proposed approach was verified through several tracking experiments, which demonstrate the feasibility of a fuzzy path tracker as well as the extensible collision-avoidance system.
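
    Extension theory rates how well a measured value fits a category using a correlation function built from extension distances to a classical interval and a larger neighborhood interval. The sketch below uses one common form of that function and an assumed per-sensor interval table to pick an avoidance mode from ultrasonic readings; the intervals, mode names and scoring rule are illustrative, not the paper's matter-element models.

      def ext_distance(x, interval):
          # Extension distance between a value and an interval <a, b>.
          a, b = interval
          return abs(x - (a + b) / 2.0) - (b - a) / 2.0

      def correlation(x, classical, neighborhood):
          # One common form of the extension correlation function K(x):
          # positive inside the classical interval, negative far outside it.
          rho0 = ext_distance(x, classical)
          rho1 = ext_distance(x, neighborhood)
          if rho0 <= 0 or rho1 == rho0:
              return -rho0
          return rho0 / (rho1 - rho0)

      def choose_mode(distances, modes):
          # 'modes' maps a mode name to a list of per-sensor
          # (classical, neighborhood) interval pairs (assumed layout).
          def score(mode):
              return sum(correlation(d, *modes[mode][i])
                         for i, d in enumerate(distances))
          return max(modes, key=score)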

  1. Implementation of obstacle-avoidance control for an autonomous omni-directional mobile robot based on extension theory.

    PubMed

    Pai, Neng-Sheng; Hsieh, Hung-Hui; Lai, Yi-Chung

    2012-10-16

    The paper demonstrates a following robot with omni-directional wheels, which is able to take action to avoid obstacles. The robot design is based on both fuzzy and extension theory. Fuzzy theory was applied to tune the PWM signal of the motor revolution and correct path deviation issues encountered when the robot is moving. Extension theory was used to build a robot obstacle-avoidance model. Various mobile models were developed to handle different types of obstacles. The ultrasonic distance sensors mounted on the robot were used to estimate the distance to obstacles. If an obstacle is encountered, the correlation function is evaluated and the robot avoids the obstacle autonomously using the most appropriate mode. The effectiveness of the proposed approach was verified through several tracking experiments, which demonstrate the feasibility of a fuzzy path tracker as well as the extensible collision-avoidance system.

  2. Recognition of 3D objects for autonomous mobile robot's navigation in automated shipbuilding

    NASA Astrophysics Data System (ADS)

    Lee, Hyunki; Cho, Hyungsuck

    2007-10-01

    Nowadays many parts of the shipbuilding process are automated, but the painting process is not, because of the difficulty of automated on-line painting quality measurement, the harsh painting environment, and the difficulty of robot navigation. However, painting automation is necessary because it can provide consistent paint film thickness. Furthermore, autonomous mobile robots are strongly required for flexible painting work. However, the main problem for autonomous mobile robot navigation is that there are many obstacles which are not represented in the CAD data. To overcome this problem, obstacle detection and recognition are necessary so that obstacles can be avoided and painting work carried out effectively. Until now many object recognition algorithms have been studied; in particular, 2D object recognition methods using intensity images have been widely studied. However, in our case environmental illumination does not exist, so these methods cannot be used. To overcome this, 3D range data must be used, but the problems of using 3D range data are the high computational cost and long recognition time caused by the huge database. In this paper, we propose a 3D object recognition algorithm based on PCA (Principal Component Analysis) and NN (Neural Network). The novelty of the algorithm is that the measured 3D range data are transformed into intensity information, and the PCA and NN algorithms are then applied to the transformed intensity information to reduce the processing time and make the data easy to handle, addressing the disadvantages of previous research on 3D object recognition. A set of experimental results is shown to verify the effectiveness of the proposed algorithm.
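
    The recognition pipeline described (range data recoded as intensities, PCA for dimensionality reduction, then a learned classifier) can be illustrated roughly as below. The intensity scaling, the nearest-prototype decision standing in for the paper's neural network, and all parameter names are assumptions.

      import numpy as np

      def range_to_intensity(range_img, r_min, r_max):
          # Map measured ranges into a fixed 0-255 intensity scale (assumed scaling).
          r = np.clip(range_img, r_min, r_max)
          return 255.0 * (r - r_min) / (r_max - r_min)

      def pca_fit(X, k):
          # Fit a k-component PCA basis to training rows (flattened intensity images).
          mean = X.mean(axis=0)
          _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
          return mean, Vt[:k]

      def pca_project(x, mean, comps):
          return comps @ (x - mean)

      def classify(x, mean, comps, prototypes):
          # Nearest-prototype decision in PCA space; the paper trains a neural
          # network at this stage, which this simple rule only stands in for.
          f = pca_project(x, mean, comps)
          return min(prototypes,
                     key=lambda label: np.linalg.norm(f - prototypes[label]))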

  3. Demonstration of a Spoken Dialogue Interface for Planning Activities of a Semi-autonomous Robot

    NASA Technical Reports Server (NTRS)

    Dowding, John; Frank, Jeremy; Hockey, Beth Ann; Jonsson, Ari; Aist, Gregory

    2002-01-01

    Planning and scheduling in the face of uncertainty and change pushes the capabilities of both planning and dialogue technologies by requiring complex negotiation to arrive at a workable plan. Planning for use of semi-autonomous robots involves negotiation among multiple participants with competing scientific and engineering goals to co-construct a complex plan. In NASA applications this plan construction is done under severe time pressure, so having a dialogue interface to the plan construction tools can aid rapid completion of the process. But this will put significant demands on spoken dialogue technology, particularly in the areas of dialogue management and generation. The dialogue interface will need to be able to handle the complex dialogue strategies that occur in negotiation dialogues, including hypotheticals and revisions, and the generation component will require an ability to summarize complex plans. This demonstration describes work in progress toward building a spoken dialogue interface to the EUROPA planner for the purposes of planning and scheduling the activities of a semi-autonomous robot. A prototype interface has been built for planning the schedule of the Personal Satellite Assistant (PSA), a mobile robot designed for micro-gravity environments that is intended for use on the Space Shuttle and International Space Station. The spoken dialogue interface gives the user the capability to ask for a description of the plan, ask specific questions about the plan, and update or modify the plan. We anticipate that a spoken dialogue interface to the planner will provide a natural augmentation or alternative to the visualization interface in situations in which the user needs very targeted information about the plan, in situations where natural language can express complex ideas more concisely than GUI actions, or in situations in which a graphical user interface is not appropriate.

  4. Control of distributed autonomous robotic systems using principles of pattern formation in nature and pedestrian behavior.

    PubMed

    Molnar, P; Starke, J

    2001-01-01

    Self-organized and error-resistant control of distributed autonomous robotic units in a manufacturing environment with obstacles, where the robotic units have to be assigned to manufacturing targets in a cost-effective way, is achieved by using two fundamental principles of nature. First, the mode-selection behavior that appears in pattern formation in physical, chemical and biological systems is used. Coupled selection equations based on these pattern-formation principles can be used as a dynamical-system approach to assignment problems. These differential equations guarantee feasibility of the obtained solutions, which is of great importance in industrial applications. Second, a model of behavioral forces is used, which has been successfully applied to describe self-organized crowd behavior of pedestrians. This novel approach includes collision avoidance as well as error resistance. In particular, in systems where failures are of concern, the suggested approach outperforms conventional methods in compensating for sudden external changes such as breakdowns of some robotic units. The capability of this system is demonstrated in computer simulations.
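
    Coupled selection equations of the general kind used for such assignment problems drive a matrix of assignment variables toward a pattern with one winning entry per row and column. The Euler-step sketch below uses a generic form of these dynamics; the exact coefficients, cost coupling and behavioral-force terms in the paper may differ.

      import numpy as np

      def selection_step(xi, kappa=1.0, beta=2.0, dt=0.01):
          # xi[i, j] grows toward 1 when robot i should serve target j and is
          # suppressed by competing entries in the same row and column.
          row = (xi ** 2).sum(axis=1, keepdims=True) - xi ** 2   # row competitors
          col = (xi ** 2).sum(axis=0, keepdims=True) - xi ** 2   # column competitors
          dxi = kappa * xi * (1.0 - xi ** 2 - beta * (row + col))
          return xi + dt * dxi

      # Iterating from small random positive values typically converges to an
      # approximately permutation-like assignment matrix:
      #   xi = 0.1 * np.random.rand(n_robots, n_targets)
      #   for _ in range(5000):
      #       xi = selection_step(xi)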

  5. GNC architecture for autonomous robotic capture of a non-cooperative target: Preliminary concept design

    NASA Astrophysics Data System (ADS)

    Jankovic, Marko; Paul, Jan; Kirchner, Frank

    2016-04-01

    Recent studies of the space debris population in low Earth orbit (LEO) have concluded that certain regions have already reached a critical density of objects. This will eventually lead to a cascading process called the Kessler syndrome. The time may have come to seriously consider active debris removal (ADR) missions as the only viable way of preserving the space environment for future generations. Among all objects in the current environment, the SL-8 (Kosmos 3M second stages) rocket bodies (R/Bs) are some of the most suitable targets for future robotic ADR missions. However, to date, an autonomous relative navigation to and capture of a non-cooperative target has never been performed. Therefore, there is a need for more advanced, autonomous and modular systems that can cope with uncontrolled, tumbling objects. The guidance, navigation and control (GNC) system is one of the most critical ones. The main objective of this paper is to present a preliminary concept of a modular GNC architecture that should enable a safe and fuel-efficient capture of a known but uncooperative target, such as a Kosmos 3M R/B. In particular, the concept was developed with the most critical part of an ADR mission in mind, i.e. close-range proximity operations, and with state-of-the-art algorithms in the field of autonomous rendezvous and docking. Finally, a brief description is given of the hardware-in-the-loop (HIL) testing facility foreseen for the practical evaluation of the developed architecture.

  6. Towards Autonomous Inspection of Space Systems Using Mobile Robotic Sensor Platforms

    NASA Technical Reports Server (NTRS)

    Wong, Edmond; Saad, Ashraf; Litt, Jonathan S.

    2007-01-01

    The space transportation systems required to support NASA's Exploration Initiative will demand a high degree of reliability to ensure mission success. This reliability can be realized through autonomous fault/damage detection and repair capabilities. It is crucial that such capabilities are incorporated into these systems since it will be impractical to rely upon Extra-Vehicular Activity (EVA), visual inspection or tele-operation due to the costly, labor-intensive and time-consuming nature of these methods. One approach to achieving this capability is through the use of an autonomous inspection system comprised of miniature mobile sensor platforms that will cooperatively perform high confidence inspection of space vehicles and habitats. This paper will discuss the efforts to develop a small scale demonstration test-bed to investigate the feasibility of using autonomous mobile sensor platforms to perform inspection operations. Progress will be discussed in technology areas including: the hardware implementation and demonstration of robotic sensor platforms, the implementation of a hardware test-bed facility, and the investigation of collaborative control algorithms.

  7. Using Simulation to Evaluate Scientific Impact of Autonomous Robotic Capabilities for Mars

    NASA Astrophysics Data System (ADS)

    Haldemann, A. F.; McHenry, M. C.; Castano, R. A.; Cameraon, J. M.; Estlin, T. A.; Farr, T. G.; Jain, A.; Lee, M.; Leff, C. E.; Lim, C.; Nesnas, I. A.; Petras, R. D.; Pomerantz, M.; Powell, M.; Shu, I.; Wood, J.; Volpe, R.; Gaines, D. M.

    2006-12-01

    The Science Operations On Planetary Surfaces (SOOPS) task was created with the goal of evaluating, developing and validating methods for increasing the productivity of science operations on planetary surfaces. The highly integrated spacecraft-instrument payload systems of planetary surface missions create operational constraints (e.g. power, data volume, number of ground control interactions) that can reduce the effective science capabilities. Technological solutions have been proposed to mitigate the impact of those constraints on science return. For example, enhanced mobility autonomy, robotic arm autonomous deployment, and on-board image analysis have been implemented on the Mars Exploration Rovers. Next generation improvements involve on-board science driven decision-making and data collection. SOOPS takes a systems level approach to science operations and thus to evaluating and demonstrating the potential benefits of technologies that are already in development at the 'component level'. A simulation environment, "Field Test in a Box" or SOOPS-FTB, has been developed with realistic terrains and scientifically pertinent information content. The terrain can be explored with a simulated spacecraft and instruments that are operated using an activity planning software interface which closely resembles that used for actual surface spacecraft missions. The simulation environment provides flexibility and control over experiments that help answer "what if" questions about the performance of proposed autonomous technologies. The experiments also help evaluate operator interaction with the autonomous system, and improve the designs of the control tools. We will report the recent results of SOOPS-FTB experiments with an on-board feature mapping capability, which is effectively an autonomous compression scheme. This example illustrates a demonstration of a new software scheme to operate within a known hardware configuration. It is also conceivable that SOOPS-FTB could be

  8. Autonomous charging to enable long-endurance missions for small aerial robots

    NASA Astrophysics Data System (ADS)

    Mulgaonkar, Yash; Kumar, Vijay

    2014-06-01

    The past decade has seen an increased interest towards research involving Autonomous Micro Aerial Vehicles (MAVs). The predominant reason for this is their agility and ability to perform tasks too difficult or dangerous for their human counterparts and to navigate into places where ground robots cannot reach. Among MAVs, rotary wing aircraft such as quadrotors have the ability to operate in confined spaces, hover at a given point in space and perch or land on a flat surface. This makes the quadrotor a very attractive aerial platform giving rise to a myriad of research opportunities. The potential of these aerial platforms is severely limited by the constraints on the flight time due to limited battery capacity. This in turn arises from limits on the payload of these rotorcraft. By automating the battery recharging process, creating autonomous MAVs that can recharge their on-board batteries without any human intervention and by employing a team of such agents, the overall mission time can be greatly increased. This paper describes the development, testing, and implementation of a system of autonomous charging stations for a team of Micro Aerial Vehicles. This system was used to perform fully autonomous long-term multi-agent aerial surveillance experiments with persistent station keeping. The scalability of the algorithm used in the experiments described in this paper was also tested by simulating a persistence surveillance scenario for 10 MAVs and charging stations. Finally, this system was successfully implemented to perform a 9½ hour multi-agent persistent flight test. Preliminary implementation of this charging system in experiments involving construction of cubic structures with quadrotors showed a three-fold increase in effective mission time.

  9. A field robot for autonomous laser-based N2O flux measurements

    NASA Astrophysics Data System (ADS)

    Molstad, Lars; Reent Köster, Jan; Bakken, Lars; Dörsch, Peter; Lien, Torgrim; Overskeid, Øyvind; Utstumo, Trygve; Løvås, Daniel; Brevik, Anders

    2014-05-01

    N2O measurements in multi-plot field trials are usually carried out by chamber-based manual gas sampling and subsequent laboratory-based gas chromatographic N2O determination. Spatial and temporal resolution of these measurements are commonly limited by available manpower. However, high spatial and temporal variability of N2O fluxes within individual field plots can add large uncertainties to time- and area-integrated flux estimates. Detailed mapping of this variability would improve these estimates, as well as help our understanding of the factors causing N2O emissions. An autonomous field robot was developed to increase the sampling frequency and to operate outside normal working hours. The base of this system was designed as an open platform able to carry versatile instrumentation. It consists of an electrically motorized platform powered by a lithium-ion battery pack, which is capable of autonomous navigation by means of a combined high precision real-time kinematic (RTK) GPS and an inertial measurement unit (IMU) system. On this platform an elevator is mounted, carrying a lateral boom with a static chamber on each side of the robot. Each chamber is equipped with a frame of plastic foam to seal the chamber when lowered onto the ground by the elevator. N2O flux from the soil covered by the two chambers is sequentially determined by circulating air between each chamber and a laser spectrometer (DLT-100, Los Gatos Research, Mountain View, CA, USA), which monitors the increase in N2O concentration. The target enclosure time is 1 - 2 minutes, but may be longer when emissions are low. CO2 concentrations are determined by a CO2/H2O gas analyzer (LI-840A, LI-COR Inc., Lincoln, NE, USA). Air temperature and air pressure inside both chambers are continuously monitored and logged. Wind speed and direction are monitored by a 3D sonic anemometer on top of the elevator boom. This autonomous field robot can operate during day and night time, and its working hours are only
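
    The flux computation behind such static-chamber measurements is standard: fit the slope of the concentration rise over the enclosure period, convert that rate to molar units with the ideal gas law, and scale by the chamber's volume-to-footprint ratio. The sketch below shows this textbook calculation only, not code from the robot described above; parameter names and defaults are assumptions.

      import numpy as np

      def chamber_flux(times_s, conc_ppb, chamber_vol_m3, chamber_area_m2,
                       temp_k=293.15, pressure_pa=101325.0):
          # Linear fit of the concentration rise during the enclosure period.
          slope_ppb_s = np.polyfit(times_s, conc_ppb, 1)[0]        # ppb s^-1
          # One ppb of mole fraction expressed as mol m^-3 via the ideal gas law.
          mol_per_m3_per_ppb = 1e-9 * pressure_pa / (8.314 * temp_k)
          dcdt = slope_ppb_s * mol_per_m3_per_ppb                  # mol m^-3 s^-1
          # Scale by chamber volume over footprint area to get a surface flux.
          return dcdt * chamber_vol_m3 / chamber_area_m2           # mol m^-2 s^-1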

  10. A real-time expert system for control of an autonomous mobile robot and for diagnosing unexpected occurrences

    SciTech Connect

    Kammer, D.W.; de Saussure, G.; Weisbin, C.R.

    1986-01-01

    The use of an expert system for the control of an autonomous robot presents several attractive features: the explicitness and homogeneity of the knowledge representation facilitates explanation, verification and modification of the rules which determine the robot's behavior, and the domain of competence can be incrementally extended. However, real-time operation poses a number of challenges due to the dynamic nature of the data and the time constraints of dealing with a large database. An implementation is discussed where a large commercial real-time expert system originally designed for industrial process diagnostics was adapted to the control of an autonomous mobile robot for planning, monitoring, and diagnosis of unexpected occurrences during a navigation task. Control has been successfully implemented for goal directed navigation in the presence of moving obstacles.

  11. Real-time expert system for the control of autonomous robot navigation in the presence of moving obstacles

    SciTech Connect

    deSaussure, G.; Kammer, D.W.; Weisbin, C.R.

    1987-01-01

    The use of an expert system for the control of an autonomous robot presents several attractive features: the explicitness and homogeneity of the knowledge representation facilitates explanation, verification and modification of the rules which determine the robot's behavior, and the domain of competence can be incrementally extended. However, real-time operation poses a number of challenges due to the dynamic nature of the data and the time constraints of dealing with a large database. An implementation is discussed where a large commercial real-time expert system originally designed for industrial process diagnostics was adapted to the control of an autonomous mobile robot for planning, monitoring, and diagnosis of unexpected occurrences during a navigation task. Control has been successfully implemented for goal directed navigation in the presence of moving obstacles.

  12. Where neuroscience and dynamic system theory meet autonomous robotics: a contracting basal ganglia model for action selection.

    PubMed

    Girard, B; Tabareau, N; Pham, Q C; Berthoz, A; Slotine, J-J

    2008-05-01

    Action selection, the problem of choosing what to do next, is central to any autonomous agent architecture. We use here a multi-disciplinary approach at the convergence of neuroscience, dynamical system theory and autonomous robotics, in order to propose an efficient action selection mechanism based on a new model of the basal ganglia. We first describe new developments of contraction theory regarding locally projected dynamical systems. We exploit these results to design a stable computational model of the cortico-baso-thalamo-cortical loops. Based on recent anatomical data, we include usually neglected neural projections, which participate in performing accurate selection. Finally, the efficiency of this model as an autonomous robot action selection mechanism is assessed in a standard survival task. The model exhibits valuable dithering avoidance and energy-saving properties, when compared with a simple if-then-else decision rule.

  13. Distributed, Collaborative Human-Robotic Networks for Outdoor Experiments in Search, Identify and Track

    DTIC Science & Technology

    2011-01-11

    Excerpts from the report: each robot is designed to mount two Mini-ITX form-factor custom computers, each equipped with a Core 2 Duo Mobile ... A curve is built from the output of the A* algorithm; the planned paths are then fed into a modified vector polar histogram (VPH) controller which ... provides motor actuation commands to the Segway platform. The VPH controller continuously aims for a look-ahead point on the path a set distance away.
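
    The look-ahead targeting mentioned in the excerpt can be sketched as picking the first path point at least a set distance ahead of the robot, which the controller then steers toward. The VPH obstacle-weighting step itself is not reproduced, and function and parameter names are illustrative.

      import math

      def look_ahead_point(path, position, look_ahead):
          # Index of the path point nearest the robot.
          i0 = min(range(len(path)), key=lambda i: math.dist(path[i], position))
          # First point beyond the look-ahead distance, searching forward.
          for p in path[i0:]:
              if math.dist(p, position) >= look_ahead:
                  return p
          return path[-1]            # near the goal: aim at the final point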

  14. Self-localization for an autonomous mobile robot based on an omni-directional vision system

    NASA Astrophysics Data System (ADS)

    Chiang, Shu-Yin; Lin, Kuang-Yu; Chia, Tsorng-Lin

    2013-12-01

    In this study, we designed an autonomous mobile robot based on the rules of the Federation of International Robot-soccer Association (FIRA) RoboSot category, integrating the techniques of computer vision, real-time image processing, dynamic target tracking, wireless communication, self-localization, motion control, path planning, and control strategy to achieve the contest goal. The self-localization scheme of the mobile robot is based on algorithms operating on the images from its omni-directional vision system. In previous works, we used the image colors of the field goals as reference points, combining either dual-circle or trilateration positioning of the reference points to achieve self-localization of the autonomous mobile robot. However, because the image of the game field is easily affected by ambient light, positioning systems exclusively based on color-model algorithms cause errors. To reduce environmental effects and achieve the self-localization of the robot, the proposed algorithm assesses the corners of field lines by using an omni-directional vision system. Particularly in the mid-size league of the RoboCup soccer competition, self-localization algorithms based on extracting white lines from the soccer field have become increasingly popular. Moreover, white lines are less influenced by light than the color model of the goals. Therefore, we propose an algorithm that transforms the omni-directional image into an unwrapped transformed image, enhancing the extraction features. The process is described as follows: First, radial scan-lines were used to process omni-directional images, reducing the computational load and improving system efficiency. The lines were radially arranged around the center of the omni-directional camera image, resulting in a shorter computational time compared with the traditional Cartesian coordinate system. However, the omni-directional image is a distorted image, which makes it difficult to recognize the
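
    The unwrapping step, sampling the omni-directional image along radial scan-lines around the mirror center to produce a panoramic strip, might be sketched as below (nearest-neighbor sampling; the image geometry, radii and output size are placeholders, and the subsequent white-line and corner extraction is not shown).

      import numpy as np

      def unwrap_omni(img, center, r_min, r_max, out_w=720, out_h=120):
          # Sample along radial scan-lines around the mirror centre and stack the
          # samples into a rectangular (out_h x out_w) panoramic strip.
          cx, cy = center
          thetas = np.linspace(0.0, 2.0 * np.pi, out_w, endpoint=False)
          radii = np.linspace(r_min, r_max, out_h)
          xs = (cx + radii[:, None] * np.cos(thetas[None, :])).astype(int)
          ys = (cy + radii[:, None] * np.sin(thetas[None, :])).astype(int)
          xs = np.clip(xs, 0, img.shape[1] - 1)
          ys = np.clip(ys, 0, img.shape[0] - 1)
          return img[ys, xs]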

  15. An algorithm for image clusters detection and identification based on color for an autonomous mobile robot

    SciTech Connect

    Uy, D.L.

    1996-02-01

    An algorithm for detection and identification of image clusters or "blobs" based on color information for an autonomous mobile robot is developed. The input image data are first processed using a crisp color fuzzifier, a binary smoothing filter, and a median filter. The processed image data are then input to the image cluster detection and identification program. The program employs the concept of an "elastic rectangle" that stretches in such a way that the whole blob is finally enclosed in a rectangle. A C program was developed to test the algorithm. The algorithm was tested only on 8x8 image data containing different numbers of blobs. The algorithm works very well in detecting and identifying image clusters.
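
    A simple reading of the elastic-rectangle idea is a flood fill that stretches a bounding rectangle until it encloses each connected color blob in the filtered binary mask. The sketch below implements that reading in Python rather than the original C program, and the 4-connectivity choice is an assumption.

      from collections import deque

      def find_blobs(mask):
          # 'mask' is a 2D list of 0/1 values after color fuzzification and filtering.
          h, w = len(mask), len(mask[0])
          seen = [[False] * w for _ in range(h)]
          blobs = []
          for y in range(h):
              for x in range(w):
                  if not mask[y][x] or seen[y][x]:
                      continue
                  x0 = x1 = x
                  y0 = y1 = y
                  q = deque([(y, x)])
                  seen[y][x] = True
                  while q:                                   # 4-connected flood fill
                      cy, cx = q.popleft()
                      x0, x1 = min(x0, cx), max(x1, cx)      # stretch the rectangle
                      y0, y1 = min(y0, cy), max(y1, cy)
                      for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                          if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                              seen[ny][nx] = True
                              q.append((ny, nx))
                  blobs.append((x0, y0, x1, y1))             # enclosing rectangle
          return blobs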

  16. The Robotically Controlled Telescope (RCT): First Five Years of Fully Autonomous Operation

    NASA Astrophysics Data System (ADS)

    Gelderman, Richard; Carini, Michael T.; Davis, Donald R.; Engle, Scott G.; Guinan, Edward F.; McGruder, Charles H., III; Strolger, Louis-Gregory; Tedesco, Edward F.; Walter, Donald K.

    2011-03-01

    We review the status of the 1.3-meter Robotically Controlled Telescope (RCT), located at Kitt Peak National Observatory in Arizona. Through the efforts of a consortium of institutions, the RCT has been refurbished and automated to obtain optical images in support of a wide variety of astrophysical research investigations. The refurbished RCT came back on line in 2003, with observing undertaken via pre-scheduled scripts. Since 2007 the observatory has operated in a fully autonomous mode to acquire observations for the numerous and diverse research programs being pursued by the consortium membership and for guest observers. Many challenges and obstacles have been overcome throughout the refurbishment and automation, allowing this venerable telescope to continue its productive history.

  17. Autonomous learning based on cost assumptions: theoretical studies and experiments in robot control.

    PubMed

    Ribeiro, C H; Hemerly, E M

    1999-06-01

    Autonomous learning techniques are based on experience acquisition. In most realistic applications, experience is time-consuming: it implies sensor reading, actuator control and algorithmic update, constrained by the learning system dynamics. The information crudeness upon which classical learning algorithms operate makes such problems too difficult and unrealistic. Nonetheless, additional information for facilitating the learning process ideally should be embedded in such a way that the structural, well-studied characteristics of these fundamental algorithms are maintained. We investigate in this article a more general formulation of the Q-learning method that allows for a spreading of information derived from single updates towards a neighbourhood of the instantly visited state and converges to optimality. We show how this new formulation can be used as a mechanism to safely embed prior knowledge about the structure of the state space, and demonstrate it in a modified implementation of a reinforcement learning algorithm in a real robot navigation task.
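
    The generalized update studied here spreads each temporal-difference correction, attenuated, to states regarded as neighbours of the visited one, which is where prior knowledge of the state-space structure enters. The sketch below is a simplified reading of that idea on a tabular Q-function; the neighbourhood function, weights and attenuation factor are assumptions.

      def spread_q_update(Q, s, a, r, s_next, neighbours,
                          alpha=0.1, gamma=0.95, sigma=0.5):
          # Standard Q-learning temporal-difference correction for the visited state.
          td = r + gamma * max(Q[s_next].values()) - Q[s][a]
          Q[s][a] += alpha * td
          # Spread the same correction, attenuated by sigma and a per-neighbour
          # weight, to states declared neighbours of s (e.g. adjacent grid cells).
          for s_nb, weight in neighbours(s):
              Q[s_nb][a] += alpha * sigma * weight * td
          return Q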

  18. Autonomous learning based on cost assumptions: theoretical studies and experiments in robot control.

    PubMed

    Ribeiro, C H; Hemerly, E M

    2000-02-01

    Autonomous learning techniques are based on experience acquisition. In most realistic applications, experience is time-consuming: it implies sensor reading, actuator control and algorithmic update, constrained by the learning system dynamics. The information crudeness upon which classical learning algorithms operate makes such problems too difficult and unrealistic. Nonetheless, additional information for facilitating the learning process ideally should be embedded in such a way that the structural, well-studied characteristics of these fundamental algorithms are maintained. We investigate in this article a more general formulation of the Q-learning method that allows for a spreading of information derived from single updates towards a neighbourhood of the instantly visited state and converges to optimality. We show how this new formulation can be used as a mechanism to safely embed prior knowledge about the structure of the state space, and demonstrate it in a modified implementation of a reinforcement learning algorithm in a real robot navigation task.

  19. Autonomous trajectory generation for mobile robots with non-holonomic and steering angle constraints

    SciTech Connect

    Pin, F.G.; Vasseur, H.A.

    1990-01-01

    This paper presents an approach to the trajectory planning of mobile platforms characterized by non-holonomic constraints and constraints on the steering angle and steering angle rate. The approach is based on geometric reasoning and provides deterministic trajectories for all pairs of initial and final configurations (position x, y, and orientation θ) of the robot. Furthermore, the method generates trajectories taking into account the forward and reverse mode of motion of the vehicle, or combinations of these when complex maneuvering is involved or when the environment is obstructed with obstacles. The trajectory planning algorithm is described, and examples of trajectories generated for a variety of environmental conditions are presented. The generation of the trajectories only takes a few milliseconds of run time on a MicroVAX, making the approach quite attractive for use as a real-time motion planner for teleoperated or sensor-based autonomous vehicles in complex environments. 10 refs., 11 figs.

  20. Portable robot for autonomous venipuncture using 3D near infrared image guidance.

    PubMed

    Chen, Alvin; Nikitczuk, Kevin; Nikitczuk, Jason; Maguire, Tim; Yarmush, Martin

    2013-09-01

    Venipuncture is pivotal to a wide range of clinical interventions and is consequently the leading cause of medical injury in the U.S. Complications associated with venipuncture are exacerbated in difficult settings, where the rate of success depends heavily on the patient's physiology and the practitioner's experience. In this paper, we describe a device that improves the accuracy and safety of the procedure by autonomously establishing a peripheral line for blood draws and IVs. The device combines a near-infrared imaging system, computer vision software, and a robotically driven needle within a portable shell. The device operates by imaging and mapping in real-time the 3D spatial coordinates of subcutaneous veins in order to direct the needle into a designated vein. We demonstrate proof of concept by assessing imaging performance in humans and cannulation accuracy on an advanced phlebotomy training model.

  1. A Survey on Terrain Assessment Techniques for Autonomous Operation of Planetary Robots

    NASA Astrophysics Data System (ADS)

    Sancho-Pradel, D. L.; Gao, Y.

    A key challenge in autonomous planetary surface exploration is the extraction of meaningful information from sensor data, which would allow a good interpretation of the nearby terrain, and a reasonable assessment of more distant areas. In the last decade, the desire to increase the autonomy of unmanned ground vehicles (UGVs), particularly in terms of off-road navigation, has significantly increased the interest in the field of automated terrain classification. Although the field is relatively new, its advances and goals are scattered across different robotic platforms and applications. The objective of this paper is to present a survey of the field from a planetary exploration perspective, bringing together the underlying techniques, existing approaches and relevant applications under a common framework. The aim is to provide a comprehensive overview to the newcomer in the field, and a structured reference for the practitioners.

  2. Portable robot for autonomous venipuncture using 3D near infrared image guidance

    PubMed Central

    Chen, Alvin; Nikitczuk, Kevin; Nikitczuk, Jason; Maguire, Tim; Yarmush, Martin

    2015-01-01

    Venipuncture is pivotal to a wide range of clinical interventions and is consequently the leading cause of medical injury in the U.S. Complications associated with venipuncture are exacerbated in difficult settings, where the rate of success depends heavily on the patient's physiology and the practitioner's experience. In this paper, we describe a device that improves the accuracy and safety of the procedure by autonomously establishing a peripheral line for blood draws and IVs. The device combines a near-infrared imaging system, computer vision software, and a robotically driven needle within a portable shell. The device operates by imaging and mapping in real-time the 3D spatial coordinates of subcutaneous veins in order to direct the needle into a designated vein. We demonstrate proof of concept by assessing imaging performance in humans and cannulation accuracy on an advanced phlebotomy training model. PMID:26120592

  3. Multiresolutional schemata for unsupervised learning of autonomous robots for 3D space operation

    NASA Technical Reports Server (NTRS)

    Lacaze, Alberto; Meystel, Michael; Meystel, Alex

    1994-01-01

    This paper describes a novel approach to the development of a learning control system for an autonomous space robot (ASR) which presents the ASR as a 'baby' -- that is, a system with no a priori knowledge of the world in which it operates, but with behavior acquisition techniques that allow it to build this knowledge from the experiences of actions within a particular environment (we will call it an Astro-baby). The learning techniques are rooted in the recursive algorithm for inductive generation of nested schemata molded from processes of early cognitive development in humans. The algorithm extracts data from the environment and, by means of correlation and abduction, creates schemata that are used for control. This system is robust enough to deal with a constantly changing environment because such changes provoke the creation of new schemata by generalizing from experiences, while still maintaining minimal computational complexity, thanks to the system's multiresolutional nature.

  4. A Monocular Vision Sensor-Based Obstacle Detection Algorithm for Autonomous Robots

    PubMed Central

    Lee, Tae-Jae; Yi, Dong-Hoon; Cho, Dong-Il “Dan”

    2016-01-01

    This paper presents a monocular vision sensor-based obstacle detection algorithm for autonomous robots. Each individual image pixel at the bottom region of interest is labeled as belonging either to an obstacle or the floor. While conventional methods depend on point tracking for geometric cues for obstacle detection, the proposed algorithm uses the inverse perspective mapping (IPM) method. This method is much more advantageous when the camera is not high off the floor, which makes point tracking near the floor difficult. Markov random field-based obstacle segmentation is then performed using the IPM results and a floor appearance model. Next, the shortest distance between the robot and the obstacle is calculated. The algorithm is tested by applying it to 70 datasets, 20 of which include nonobstacle images where considerable changes in floor appearance occur. The obstacle segmentation accuracies and the distance estimation error are quantitatively analyzed. For obstacle datasets, the segmentation precision and the average distance estimation error of the proposed method are 81.4% and 1.6 cm, respectively, whereas those for a conventional method are 57.5% and 9.9 cm, respectively. For nonobstacle datasets, the proposed method gives 0.0% false positive rates, while the conventional method gives 17.6%. PMID:26938540
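
    The inverse perspective mapping at the heart of this detector can be approximated by a plane homography that maps image points with known floor coordinates into a metric bird's-eye view; floor pixels land at their true positions while obstacle pixels, lying above the ground plane, are smeared away from the camera, which is the cue the segmentation exploits. The sketch below uses OpenCV's homography utilities with placeholder calibration points; it is not the authors' calibrated IPM or their Markov random field segmentation.

      import cv2
      import numpy as np

      def ipm_warp(img, image_pts, floor_pts, out_size=(400, 400), px_per_m=100):
          # Homography from four image points to their known floor coordinates
          # (metres), expressed in pixels of the bird's-eye view.
          dst = np.float32([[x * px_per_m, y * px_per_m] for x, y in floor_pts])
          H = cv2.getPerspectiveTransform(np.float32(image_pts), dst)
          # Floor pixels map to their true positions; obstacle pixels above the
          # ground plane are stretched away from the camera.
          return cv2.warpPerspective(img, H, out_size)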

  5. Robotic-Controlled, Autonomous Friction Stir Welding Processes for In-Situ Fabrication, Maintenance, and Repair

    NASA Astrophysics Data System (ADS)

    Zhou, W.

    NASA's new vision of human and robotic missions to the Moon, Mars and beyond will demand large and permanent infrastructures on the Moon and other planets, including power plants, communication towers, human and biomass habitats, launch and landing facilities, fabrication and repair workshops, and research facilities, so that material utilization and product development can be carried out and subsisted in-situ. The conventional approach of transporting pre-constructed, fabricated structures from Earth to the Moon and planets will no longer be feasible due to limited lifting capacity and the extremely high transportation costs associated with long-duration space travel. To minimize transport of pre-made large structures between Earth and the Moon and planets, to minimize crew time for the fabrication and assembly of infrastructures on the Moon and planets, and to assure crew safety and maintain quality during the operation, there is a strong need for robotic capabilities for in-situ fabrication, maintenance and repair. Clearly, development of innovative autonomous in-situ fabrication, maintenance and repair technologies is crucial to the success of both NASA's unmanned preparation missions and manned exploration missions. In-space material joining is not new to NASA. Many lessons were learned from NASA's International Space Welding Experiment, which employed the Electron Beam Welding process for space welding experiments. Significant safety concerns related to high-energy beams, arcing, spatter, electromagnetic fields and molten particles were

  6. A Monocular Vision Sensor-Based Obstacle Detection Algorithm for Autonomous Robots.

    PubMed

    Lee, Tae-Jae; Yi, Dong-Hoon; Cho, Dong-Il Dan

    2016-03-01

    This paper presents a monocular vision sensor-based obstacle detection algorithm for autonomous robots. Each individual image pixel at the bottom region of interest is labeled as belonging either to an obstacle or the floor. While conventional methods depend on point tracking for geometric cues for obstacle detection, the proposed algorithm uses the inverse perspective mapping (IPM) method. This method is much more advantageous when the camera is not high off the floor, which makes point tracking near the floor difficult. Markov random field-based obstacle segmentation is then performed using the IPM results and a floor appearance model. Next, the shortest distance between the robot and the obstacle is calculated. The algorithm is tested by applying it to 70 datasets, 20 of which include nonobstacle images where considerable changes in floor appearance occur. The obstacle segmentation accuracies and the distance estimation error are quantitatively analyzed. For obstacle datasets, the segmentation precision and the average distance estimation error of the proposed method are 81.4% and 1.6 cm, respectively, whereas those for a conventional method are 57.5% and 9.9 cm, respectively. For nonobstacle datasets, the proposed method gives 0.0% false positive rates, while the conventional method gives 17.6%.

  7. Teaching and implementing autonomous robotic lab walkthroughs in a biotech laboratory through model-based visual tracking

    NASA Astrophysics Data System (ADS)

    Wojtczyk, Martin; Panin, Giorgio; Röder, Thorsten; Lenz, Claus; Nair, Suraj; Heidemann, Rüdiger; Goudar, Chetan; Knoll, Alois

    2010-01-01

    After robots have been utilized for more than 30 years in classic industrial automation applications, service robots now form a constantly increasing market, although the big breakthrough is still awaited. Our approach to service robots was driven by the idea of supporting lab personnel in a biotechnology laboratory. After initial development in Germany, a mobile robot platform extended with an industrial manipulator and the necessary sensors for indoor localization and object manipulation has been shipped to Bayer HealthCare in Berkeley, CA, USA, a global player in the sector of biopharmaceutical products located in the San Francisco bay area. The determined goal of the mobile manipulator is to support the off-shift staff in carrying out completely autonomous or guided, remote-controlled lab walkthroughs, which we implement utilizing a recent development of our computer vision group: OpenTL, an integrated framework for model-based visual tracking.

  8. Remote Sensing of Radiation Dose Rate by a Robot for Outdoor Usage

    NASA Astrophysics Data System (ADS)

    Kobayashi, T.; Doi, K.; Kanematsu, H.; Utsumi, Y.; Hashimoto, R.; Takashina, T.

    2013-04-01

    In the present paper, the design and prototyping of a telemetry system, in which a GPS, a camera, and a scintillation counter were mounted on a crawler-type traveling vehicle, were conducted targeting outdoor usage such as school playgrounds. The results were as follows: (1) It was confirmed that the crawler-type traveling vehicle can be operated smoothly on school grounds of brick and asphalt (running speed: 17 [m/min]). (2) It was confirmed that the location information captured by GPS can be displayed on a Google map, and that the recorded video information can also be played back. (3) A radiation dose rate of 0.09 [μSv/h] was obtained in the ground, which is less than 1/40 of the allowable radiation dose rate for children in Fukushima Prefecture (3.8 [μSv/h]). (4) As further work, modification for programmed traveling, measurement of the distribution of the radiation dose rate at a school in Fukushima Prefecture, and class delivery on radiation measurement will be carried out.

  9. Hedonic quality or reward? A study of basic pleasure in homeostasis and decision making of a motivated autonomous robot

    PubMed Central

    Lewis, Matthew; Cañamero, Lola

    2016-01-01

    We present a robot architecture and experiments to investigate some of the roles that pleasure plays in the decision making (action selection) process of an autonomous robot that must survive in its environment. We have conducted three sets of experiments to assess the effect of different types of pleasure—related versus unrelated to the satisfaction of physiological needs—under different environmental circumstances. Our results indicate that pleasure, including pleasure unrelated to need satisfaction, has value for homeostatic management in terms of improved viability and increased flexibility in adaptive behavior. PMID:28018120

  10. Visual identification and similarity measures used for on-line motion planning of autonomous robots in unknown environments

    NASA Astrophysics Data System (ADS)

    Martínez, Fredy; Martínez, Fernando; Jacinto, Edwar

    2017-02-01

    In this paper we propose an on-line motion planning strategy for autonomous robots in dynamic and locally observable environments. In this approach, we first visually identify geometric shapes in the environment by filtering images. Then, an ART-2 network is used to establish the similarity between patterns. The proposed algorithm allows a robot to establish its relative location in the environment and to define its navigation path based on images of the environment and their similarity to reference images. This is an efficient and minimalist method that uses the similarity of landmark view patterns to navigate to the desired destination. Laboratory tests on real prototypes demonstrate the performance of the algorithm.

  11. On the design of neuro-controllers for individual and social learning behaviour in autonomous robots: an evolutionary approach

    NASA Astrophysics Data System (ADS)

    Pini, Giovanni; Tuci, Elio

    2008-06-01

    In biology/psychology, the capability of natural organisms to learn from observation of or interaction with conspecifics is referred to as social learning. Roboticists have recently developed an interest in social learning, since it might represent an effective strategy to enhance the adaptivity of a team of autonomous robots. In this study, we show that a methodological approach based on artificial neural networks shaped by evolutionary computation techniques can be successfully employed to synthesise the individual and social learning mechanisms for robots required to learn a desired action (i.e. phototaxis or antiphototaxis).

  12. Hedonic quality or reward? A study of basic pleasure in homeostasis and decision making of a motivated autonomous robot.

    PubMed

    Lewis, Matthew; Cañamero, Lola

    2016-10-01

    We present a robot architecture and experiments to investigate some of the roles that pleasure plays in the decision making (action selection) process of an autonomous robot that must survive in its environment. We have conducted three sets of experiments to assess the effect of different types of pleasure (related versus unrelated to the satisfaction of physiological needs) under different environmental circumstances. Our results indicate that pleasure, including pleasure unrelated to need satisfaction, has value for homeostatic management in terms of improved viability and increased flexibility in adaptive behavior.

  13. Autonomous global sky surveillance with real-time robotic follow-up: Night Sky Awareness through Thinking Telescopes Technology

    NASA Astrophysics Data System (ADS)

    Vestrand, T.; Davis, H.; Wren, J.; Wozniak, P.; Norman, B.; White, R.; Bloch, J.; Fenimore, E.; Hogge, B.; Jah, M.; Rast, R.

    We discuss the development of prototypes for a global grid of advanced "thinking" sky sentinels and robotic follow-up telescopes that observe the full night sky to provide real-time monitoring of the night sky by autonomously recognizing anomalous behavior, selecting targets for detailed investigation, and making real-time follow-up observations. The layered, fault-tolerant network uses relatively inexpensive robotic EO sensors to provide persistent autonomous monitoring and real-time anomaly detection to enable rapid recognition and a swift response to transients as they emerge. This T3 global EO grid avoids the limitations imposed by geography and weather to provide persistent monitoring of the night sky.

  14. Intelligent behavior generator for autonomous mobile robots using planning-based AI decision making and supervisory control logic

    NASA Astrophysics Data System (ADS)

    Shah, Hitesh K.; Bahl, Vikas; Martin, Jason; Flann, Nicholas S.; Moore, Kevin L.

    2002-07-01

    In earlier research, the Center for Self-Organizing and Intelligent Systems (CSOIS) at Utah State University (USU) has been funded by the US Army Tank-Automotive and Armaments Command's (TACOM) Intelligent Mobility Program to develop and demonstrate enhanced mobility concepts for unmanned ground vehicles (UGVs). One of the several outgrowths of this work has been the development of a grammar-based approach to intelligent behavior generation for commanding autonomous robotic vehicles. In this paper we describe the use of this grammar for enabling autonomous behaviors. A supervisory task controller (STC) sequences high-level action commands (taken from the grammar) to be executed by the robot. It takes as input a set of goals and a partial (static) map of the environment and produces, from the grammar, a flexible script (or sequence) of the high-level commands that are to be executed by the robot. The sequence is derived by a planning function that uses a graph-based heuristic search (the A* algorithm). Each action command has specific exit conditions that are evaluated by the STC following each task completion or interruption (in the case of disturbances or new operator requests). Depending on the system's state at task completion or interruption (including updated environmental and robot sensor information), the STC invokes a reactive response. This can include sequencing the pending tasks or initiating a re-planning event, if necessary. Though applicable to a wide variety of autonomous robots, an application of this approach is demonstrated via simulations of ODIS, an omni-directional inspection system developed for security applications.

  15. Real-time expert system for control of an autonomous mobile robot including diagnosis of unexpected occurrences

    SciTech Connect

    Weisbin, C.R.; de Saussure, G.; Kammer, D.W.

    1986-01-01

    An autonomous mobile robot deals with the empirical world, which is never fully predictable; hence it must continually monitor its performance by comparing the actual responses of sensors to their expected responses. Where a discrepancy occurs, the source of the discrepancy must be diagnosed, and on-line corrective actions or replanning may be required. The use of a production system for the control of an autonomous robot presents several attractive features: the explicitness and homogeneity of the knowledge representation facilitates explaining, verifying and modifying the rules which determine the robot's behavior; it also permits the incremental extension of the domain of competence. However, real-time operation poses a number of challenges due to the dynamic nature of the data and because the system must frequently deal with a large knowledge base in a limited time. An implementation of a control system is discussed where a large commercial real-time expert system originally designed for industrial process diagnostics was adapted to the control of an autonomous mobile robot for planning, executing and monitoring a set of navigational tasks. One of the essential components of the problem domain is the occurrence of an "unexpected" happening, e.g., when new obstacles are moved into the domain during the robot's traverse, or when an obstacle undetectable by the long-range sonar sensors is suddenly observed by a proximity sensor. In a recent demonstration of the system, the detection of a problem generated an interrupt alarm, a diagnostic procedure, and a new plan, which was successfully executed in real time.

  16. Biomimetic evolutionary analysis: testing the adaptive value of vertebrate tail stiffness in autonomous swimming robots.

    PubMed

    Long, J H; Koob, T J; Irving, K; Combie, K; Engel, V; Livingston, N; Lammert, A; Schumacher, J

    2006-12-01

    For early vertebrates, a long-standing hypothesis is that vertebrae evolved as a locomotor adaptation, stiffening the body axis and enhancing swimming performance. While supported by biomechanical data, this hypothesis has not been tested using an evolutionary approach. We did so by extending biomimetic evolutionary analysis (BEA), which builds physical simulations of extinct systems, to include use of autonomous robots as proxies of early vertebrates competing in a forage navigation task. Modeled after free-swimming larvae of sea squirts (Chordata, Urochordata), three robotic tadpoles ('Tadros'), each with a propulsive tail bearing a biomimetic notochord of variable spring stiffness, k (N m^-1), searched for, oriented to, and orbited in two dimensions around a light source. Within each of ten generations, we selected for increased swimming speed, U (m s^-1) and decreased time to the light source, t (s), average distance from the source, R (m) and wobble maneuvering, W (rad s^-2). In software simulation, we coded two quantitative trait loci (QTL) that determine k: bending modulus, E (N m^-2) and length, L (m). Both QTL were mutated during replication, independently assorted during meiosis and, as haploid gametes, entered into the gene pool in proportion to parental fitness. After random mating created three new diploid genotypes, we fabricated three new offspring tails. In the presence of both selection and chance events (mutation, genetic drift), the phenotypic means of this small population evolved. The classic hypothesis was supported in that k was positively correlated (r^2 = 0.40) with navigational prowess, NP, the dimensionless ratio of U to the product of R, t and W. However, the plausible adaptive scenario, even in this simplified system, is more complex, since the remaining variance in NP was correlated with the residuals of R and U taken with respect to k, suggesting that changes in k alone are insufficient to explain the evolution of NP.
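
    In symbols, the fitness measure defined above is simply NP = U / (R · t · W); a trivial helper makes the ratio explicit (variable names follow the abstract).

      def navigational_prowess(U, R, t, W):
          # Swimming speed divided by the product of mean distance to the light,
          # time to reach it, and wobble, as defined in the abstract.
          return U / (R * t * W)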

  17. A swarm of autonomous miniature underwater robot drifters for exploring submesoscale ocean dynamics

    NASA Astrophysics Data System (ADS)

    Jaffe, Jules S.; Franks, Peter J. S.; Roberts, Paul L. D.; Mirza, Diba; Schurgers, Curt; Kastner, Ryan; Boch, Adrien

    2017-01-01

    Measuring the ever-changing 3-dimensional (3D) motions of the ocean requires simultaneous sampling at multiple locations. In particular, sampling the complex, nonlinear dynamics associated with submesoscales (<1-10 km) requires new technologies and approaches. Here we introduce the Mini-Autonomous Underwater Explorer (M-AUE), deployed as a swarm of 16 independent vehicles whose 3D trajectories are measured near-continuously, underwater. As the vehicles drift with the ambient flow or execute preprogrammed vertical behaviours, the simultaneous measurements at multiple, known locations resolve the details of the flow within the swarm. We describe the design, construction, control and underwater navigation of the M-AUE. A field programme in the coastal ocean using a swarm of these robots programmed with a depth-holding behaviour provides a unique test of a physical-biological interaction leading to plankton patch formation in internal waves. The performance of the M-AUE vehicles illustrates their novel capability for measuring submesoscale dynamics.

  18. A swarm of autonomous miniature underwater robot drifters for exploring submesoscale ocean dynamics.

    PubMed

    Jaffe, Jules S; Franks, Peter J S; Roberts, Paul L D; Mirza, Diba; Schurgers, Curt; Kastner, Ryan; Boch, Adrien

    2017-01-24

    Measuring the ever-changing 3-dimensional (3D) motions of the ocean requires simultaneous sampling at multiple locations. In particular, sampling the complex, nonlinear dynamics associated with submesoscales (<1-10 km) requires new technologies and approaches. Here we introduce the Mini-Autonomous Underwater Explorer (M-AUE), deployed as a swarm of 16 independent vehicles whose 3D trajectories are measured near-continuously, underwater. As the vehicles drift with the ambient flow or execute preprogrammed vertical behaviours, the simultaneous measurements at multiple, known locations resolve the details of the flow within the swarm. We describe the design, construction, control and underwater navigation of the M-AUE. A field programme in the coastal ocean using a swarm of these robots programmed with a depth-holding behaviour provides a unique test of a physical-biological interaction leading to plankton patch formation in internal waves. The performance of the M-AUE vehicles illustrates their novel capability for measuring submesoscale dynamics.

  19. Adjustably Autonomous Multi-agent Plan Execution with an Internal Spacecraft Free-Flying Robot Prototype

    NASA Technical Reports Server (NTRS)

    Dorais, Gregory A.; Nicewarner, Keith

    2006-01-01

    We present a multi-agent model-based autonomy architecture with monitoring, planning, diagnosis, and execution elements. We discuss an internal spacecraft free-flying robot prototype controlled by an implementation of this architecture and a ground test facility used for development. In addition, we discuss a simplified environment control life support system for the spacecraft domain, also controlled by an implementation of this architecture. We discuss adjustable autonomy and how it applies to this architecture. We describe an interface that provides the user with situation awareness of both autonomous systems and enables the user to dynamically edit the plans prior to and during execution, as well as control these agents at various levels of autonomy. This interface also permits the agents to query the user or request the user to perform tasks to help achieve the commanded goals. We conclude by describing a scenario where these two agents and a human interact to cooperatively detect, diagnose and recover from a simulated spacecraft fault.

  20. Autonomous Scheduling of the 1.3-meter Robotically Controlled Telescope (RCT)

    NASA Astrophysics Data System (ADS)

    Strolger, Louis-Gregory; Gelderman, Richard; Carini, Michael T.; Davis, Donald R.; Engle, Scott G.; Guinan, Edward F.; McGruder, Charles H., III; Tedesco, Edward F.; Walter, Donald K.

    2011-03-01

    The 1.3-meter telescope at Kitt Peak operates as a fully robotic instrument for optical imaging. An autonomous scheduling algorithm is an essential component of this observatory, and has been designed to manage numerous requests in various imaging modes in a manner similar to how requests are managed at queue-scheduled observatories, but with greater efficiency. Built from the INSGEN list generator and process spawner originally developed for the Berkeley Automatic Imaging Telescope, the RCT scheduler manages and integrates multi-user observations in real time, according to target and exposure information and program-specific constraints (e.g., user-assigned priority, moon avoidance, airmass, or temporal constraints), while accounting for instrument limitations, meteorological conditions, and other technical constraints. The robust system supports time-critical requests, such as coordinated observations, while also providing short-term (hours) and long-term (days) monitoring capabilities, and one-off observations. We discuss the RCT scheduler, its current decision tree, and future prospects, including integration with active partner-share monitoring (which factors into future observation requests) to ensure fairness and parity of requests.
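
    A scheduler of this kind can be thought of as a two-stage decision: hard feasibility checks (temporal window, airmass limit, moon avoidance) followed by a soft ranking of the surviving requests. The sketch below illustrates only that generic pattern, with hypothetical field names, limits, and weights; it is not the INSGEN-based RCT scheduler.

```python
# Sketch of a two-stage request selector: hard feasibility checks followed by
# a soft priority ranking. Field names, limits, and weights are hypothetical.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Request:
    target: str
    priority: float        # user-assigned priority (higher = more important)
    airmass: float         # current airmass of the target
    moon_sep_deg: float    # angular separation from the Moon (degrees)
    in_time_window: bool   # temporal constraint currently satisfied

def feasible(r: Request, max_airmass: float = 2.0, min_moon_sep: float = 30.0) -> bool:
    # Hard constraints: time window, airmass limit, moon avoidance.
    return r.in_time_window and r.airmass <= max_airmass and r.moon_sep_deg >= min_moon_sep

def score(r: Request) -> float:
    # Soft ranking: favour high priority, mildly penalize high airmass.
    return r.priority - 0.5 * (r.airmass - 1.0)

def next_observation(queue: List[Request]) -> Optional[Request]:
    candidates = [r for r in queue if feasible(r)]
    return max(candidates, key=score) if candidates else None

queue = [
    Request("SN candidate", priority=5.0, airmass=1.3, moon_sep_deg=80.0, in_time_window=True),
    Request("blazar monitor", priority=3.0, airmass=1.1, moon_sep_deg=25.0, in_time_window=True),
]
print(next_observation(queue))  # the blazar fails moon avoidance, so the SN candidate wins
```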

  1. A swarm of autonomous miniature underwater robot drifters for exploring submesoscale ocean dynamics

    PubMed Central

    Jaffe, Jules S.; Franks, Peter J. S.; Roberts, Paul L. D.; Mirza, Diba; Schurgers, Curt; Kastner, Ryan; Boch, Adrien

    2017-01-01

    Measuring the ever-changing 3-dimensional (3D) motions of the ocean requires simultaneous sampling at multiple locations. In particular, sampling the complex, nonlinear dynamics associated with submesoscales (<1–10 km) requires new technologies and approaches. Here we introduce the Mini-Autonomous Underwater Explorer (M-AUE), deployed as a swarm of 16 independent vehicles whose 3D trajectories are measured near-continuously, underwater. As the vehicles drift with the ambient flow or execute preprogrammed vertical behaviours, the simultaneous measurements at multiple, known locations resolve the details of the flow within the swarm. We describe the design, construction, control and underwater navigation of the M-AUE. A field programme in the coastal ocean using a swarm of these robots programmed with a depth-holding behaviour provides a unique test of a physical–biological interaction leading to plankton patch formation in internal waves. The performance of the M-AUE vehicles illustrates their novel capability for measuring submesoscale dynamics. PMID:28117837

  2. Towards autonomous locomotion: CPG-based control of smooth 3D slithering gait transition of a snake-like robot.

    PubMed

    Bing, Zhenshan; Cheng, Long; Chen, Guang; Röhrbein, Florian; Huang, Kai; Knoll, Alois

    2017-04-04

    Snake-like robots with 3D locomotion ability have significant advantages over traditional legged or wheeled mobile robots for adaptive travel in diverse, complex terrain. Despite numerous developed gaits, these snake-like robots suffer from unsmooth gait transitions when changing locomotion speed, direction, and body shape, which can cause undesired movement and abnormal torque. Hence, there exists a knowledge gap for snake-like robots to achieve autonomous locomotion. To address this problem, this paper presents smooth slithering gait transition control based on a lightweight central pattern generator (CPG) model for snake-like robots. First, based on the convergence behavior of the gradient system, a lightweight CPG model with fast computing time was designed and compared with other widely adopted CPG models. Then, by reshaping the body into a more stable geometry, the slithering gait was modified and studied based on the proposed CPG model, including gait transitions of locomotion speed, moving direction, and body shape. In contrast to sinusoid-based methods, extensive simulations and prototype experiments demonstrated that smooth slithering gait transition can be effectively achieved using the proposed CPG-based control method without generating undesired locomotion or abnormal torque.
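
    CPG controllers for slithering typically drive each joint from a chain of coupled oscillators whose fixed phase lag produces a travelling wave along the body. The sketch below is a generic chain of coupled phase oscillators illustrating that idea, with made-up parameter values; it is not the lightweight gradient-system CPG model proposed in the paper.

```python
import numpy as np

def cpg_joint_angles(n_joints=8, steps=2000, dt=0.01,
                     freq_hz=1.0, amp_rad=0.5, phase_lag=0.6, coupling=4.0):
    """Chain of coupled phase oscillators producing a travelling-wave
    (slithering-like) pattern of joint angles. All parameters are illustrative."""
    phase = np.zeros(n_joints)
    angles = np.zeros((steps, n_joints))
    omega = 2.0 * np.pi * freq_hz
    for t in range(steps):
        dphase = np.full(n_joints, omega)
        for i in range(n_joints):
            # Each oscillator is pulled toward a fixed phase lag relative to
            # its neighbours, which yields a wave travelling down the body.
            if i > 0:
                dphase[i] += coupling * np.sin(phase[i - 1] - phase[i] - phase_lag)
            if i < n_joints - 1:
                dphase[i] += coupling * np.sin(phase[i + 1] - phase[i] + phase_lag)
        phase += dt * dphase
        angles[t] = amp_rad * np.sin(phase)
    return angles

angles = cpg_joint_angles()
print(angles.shape)  # (2000, 8) joint-angle commands over time
```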

  3. The VIPER project (Visualization Integration Platform for Exploration Research): a biologically inspired autonomous reconfigurable robotic platform for diverse unstructured environments

    NASA Astrophysics Data System (ADS)

    Schubert, Oliver J.; Tolle, Charles R.

    2004-09-01

    Over the last decade the world has seen numerous autonomous vehicle programs. Wheels and track designs are the basis for many of these vehicles. This is primarily due to four main reasons: a vast preexisting knowledge base for these designs, energy efficiency of power sources, scalability of actuators, and the lack of control systems technologies for handling alternate highly complex distributed systems. Though large efforts seek to improve the mobility of these vehicles, many limitations still exist for these systems within unstructured environments, e.g. limited mobility within industrial and nuclear accident sites where existing plant configurations have been extensively changed. These unstructured operational environments include missions for exploration, reconnaissance, and emergency recovery of objects within reconfigured or collapsed structures, e.g. bombed buildings. More importantly, these environments present a clear and present danger for direct human interactions during the initial phases of recovery operations. Clearly, the current classes of autonomous vehicles are incapable of performing in these environments. Thus the next generation of designs must include highly reconfigurable and flexible autonomous robotic platforms. This new breed of autonomous vehicles will be both highly flexible and environmentally adaptable. Presented in this paper is one of the most successful designs from nature, the snake-eel-worm (SEW). This design implements shape memory alloy (SMA) actuators which allow for scaling of the robotic SEW designs from sub-micron scale to heavy industrial implementations without major conceptual redesigns as required in traditional hydraulic, pneumatic, or motor driven systems. Autonomous vehicles based on the SEW design possess the ability to easily move between air-based environments and fluid-based environments with limited or no reconfiguration. Under a SEW-designed vehicle, one not only achieves vastly improved maneuverability within a

  4. Passive Optically Encoded Transponder (POET) An Acquisition And Alignment Target For Autonomous Robotics

    NASA Astrophysics Data System (ADS)

    White, G. K.

    1987-02-01

    Relative position information concerning an object that is to be acquired, attached to, or manipulated in some way by a robotic system is usually supplied by a known database or through vision information of some kind. Vision systems normally require some degree of intelligence to produce complete position information and therefore are relatively sophisticated, slow, or both. Simple "targets" require some amount of pattern recognition in autonomous operations and do not usually lend themselves to precision applications. This paper describes work on a discrete optical element prototype target which, when interrogated by a video camera system, will provide noncontact relative position information about all 6 degrees-of-freedom (DOF). This information is available within the active field of view (FOV) of the transponder and could be processed by microprocessor-based software algorithms with simple pattern recognition capabilities. The interrogation system (camera) is composed of a standard charge injection device (CID) array video camera, a controllable macrozoom lens, a liquid crystal shutter (LCS), and a point-source multispectral illuminator. This allows the transponder to be used where a standard video camera vision system is needed, or already implemented, and results in a relatively fast system (approximately 10 Hz). A passive optically encoded transponder (POET) implemented in a "stick-on" holographic optical element (HOE) is proposed as a next generation target, to supply relative position information in all 6 DOF for acquisition and precision alignment. In applications requiring maximum bandwidth and resolution, the fact that no "pattern recognition" is required in the proposed system results in the ability to interrogate the transponder in real time with a dedicated nonvision interrogation system, resulting in a multi-order-of-magnitude increase in speed. The transponder (target) is configured to provide optimum information for the intended use. Being

  5. Control Algorithms and Simulated Environment Developed and Tested for Multiagent Robotics for Autonomous Inspection of Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Wong, Edmond

    2005-01-01

    The NASA Glenn Research Center and academic partners are developing advanced multiagent robotic control algorithms that will enable the autonomous inspection and repair of future propulsion systems. In this application, on-wing engine inspections will be performed autonomously by large groups of cooperative miniature robots that will traverse the surfaces of engine components to search for damage. The eventual goal is to replace manual engine inspections that require expensive and time-consuming full engine teardowns and allow the early detection of problems that would otherwise result in catastrophic component failures. As a preliminary step toward the long-term realization of a practical working system, researchers are developing the technology to implement a proof-of-concept testbed demonstration. In a multiagent system, the individual agents are generally programmed with relatively simple controllers that define a limited set of behaviors. However, these behaviors are designed in such a way that, through the localized interaction among individual agents and between the agents and the environment, they result in self-organized, emergent group behavior that can solve a given complex problem, such as cooperative inspection. One advantage to the multiagent approach is that it allows for robustness and fault tolerance through redundancy in task handling. In addition, the relatively simple agent controllers demand minimal computational capability, which in turn allows for greater miniaturization of the robotic agents.

  6. Perception for Outdoor Navigation

    DTIC Science & Technology

    1991-12-01

    driving in traffic. The fifth and final chapter, 'Combining artificial neural networks and symbolic processing for autonomous robot guidance', shows how we combine neural nets with map data in a complete system.

  7. Assessing the Impact of an Autonomous Robotics Competition for STEM Education

    ERIC Educational Resources Information Center

    Chung, C. J. ChanJin; Cartwright, Christopher; Cole, Matthew

    2014-01-01

    Robotics competitions for K-12 students are popular, but are students really learning and improving their STEM scores through robotics competitions? If not, why not? If they are, how much more effective is learning through competitions than traditional classes? Is there room for improvement? What is the best robotics competition model to maximize…

  8. Master's in Autonomous Systems: An Overview of the Robotics Curriculum and Outcomes at ISEP, Portugal

    ERIC Educational Resources Information Center

    Silva, E.; Almeida, J.; Martins, A.; Baptista, J. P.; Campos Neves, B.

    2013-01-01

    Robotics research in Portugal is increasing every year, but few students embrace it as one of their first choices for study. Until recently, job offers for engineers were plentiful, and those looking for a degree in science and technology would avoid areas considered to be demanding, like robotics. At the undergraduate level, robotics programs are…

  9. Performance of multiple tasks by an autonomous robot using visual and ultrasound sensing

    SciTech Connect

    Beckerman, M.; Barnett, D.L.; Dickens, M. ); Weisbin, C.R. )

    1990-01-01

    While there have been many successful mobile robot experiments, only a few papers have addressed issues pertaining to the range of applicability, or robustness, of robotic systems. The purpose of this paper is to report results of a series of benchmark experiments done to determine and quantify the robustness of an integrated hardware and software system of a mobile robot. 5 refs., 6 figs.

  10. Autonomous Power: From War to Peace in the I-Robot Millennium

    DTIC Science & Technology

    2015-02-25

    operationalization of autonomous power at the highest intergovernmental level. Subject terms: Autonomy, National Power, Artificial Intelligence. ...future, humans will create a new autonomous species based on artificial intelligence that will eventually surpass humans in cognitive superiority.

  11. Application of chaotic dynamics in a recurrent neural network to control: hardware implementation into a novel autonomous roving robot.

    PubMed

    Li, Yongtao; Kurata, Shuhei; Morita, Shogo; Shimizu, So; Munetaka, Daigo; Nara, Shigetoshi

    2008-09-01

    Originating from the viewpoint that complex/chaotic dynamics would play an important role in biological systems, including brains, chaotic dynamics introduced into a recurrent neural network was applied to control. The results of computer experiments were successfully implemented in a novel autonomous roving robot, which can capture only rough, uncertain target information from a few sensors. The robot was employed to solve practical two-dimensional mazes using adaptive neural dynamics generated by the recurrent neural network, in which four simple prototype motions are embedded. Adaptive switching of a system parameter in the neural network results in stationary motion or chaotic motion, depending on the dynamical situation. The results of the hardware implementation and practical experiments with it show that, in given two-dimensional mazes, the robot can successfully avoid obstacles and reach the target. Therefore, we believe that chaotic dynamics has novel potential for control and could be applied to practical engineering problems.

  12. Current challenges in autonomous vehicle development

    NASA Astrophysics Data System (ADS)

    Connelly, J.; Hong, W. S.; Mahoney, R. B., Jr.; Sparrow, D. A.

    2006-05-01

    The field of autonomous vehicles is a rapidly growing one, with significant interest from both government and industry sectors. Autonomous vehicles represent the intersection of artificial intelligence (AI) and robotics, combining decision-making with real-time control. Autonomous vehicles are desired for use in search and rescue, urban reconnaissance, mine detonation, supply convoys, and more. The general adage is to use robots for anything dull, dirty, dangerous or dumb. While a great deal of research has been done on autonomous systems, there are only a handful of fielded examples incorporating machine autonomy beyond the level of teleoperation, especially in outdoor/complex environments. Our assessment of the current state of the art in autonomous vehicle development made clear a few areas where unsolved problems remain. This paper outlines those areas and provides suggestions for the focus of science and technology research. The first step in evaluating the current state of autonomous vehicle development was to develop a definition of autonomy. A number of autonomy level classification systems were reviewed. The resulting working definitions and classification schemes used by the authors are summarized in the opening sections of the paper. The remainder of the report discusses current approaches and challenges in decision-making and real-time control for autonomous vehicles. Suggested research focus areas for near-, mid-, and long-term development are also presented.

  13. Operator-centered control of a semi-autonomous industrial robot

    SciTech Connect

    Spelt, P.F.; Jones, S.L.

    1994-12-31

    This paper presents work done by Oak Ridge National Laboratory and Remotec, Inc., to develop a new operator-centered control system for Remotec's Andros telerobot. Andros robots are presently used by numerous electric utilities, the armed forces, and law enforcement agencies to perform tasks which are hazardous for human operators. This project has automated task components and enhanced the video graphics display of the robot's position in the environment to significantly reduce operator workload. The procedure of automating a telerobot requires the addition of computer power to the robot, along with a variety of sensors and encoders to provide information about the robot's performance in, and relationship to, its environment. The resulting vehicle serves as a platform for research on strategies to integrate automated tasks with those performed by a human operator. The addition of these capabilities will greatly enhance the safety and efficiency of performance in hazardous environments.

  14. Design, Development and Testing of the Miniature Autonomous Extravehicular Robotic Camera (Mini AERCam) Guidance, Navigation and Control System

    NASA Technical Reports Server (NTRS)

    Wagenknecht, J.; Fredrickson, S.; Manning, T.; Jones, B.

    2003-01-01

    Engineers at NASA Johnson Space Center have designed, developed, and tested a nanosatellite-class free-flyer intended for future external inspection and remote viewing of human spaceflight activities. The technology demonstration system, known as the Miniature Autonomous Extravehicular Robotic Camera (Mini AERCam), has been integrated into the approximate form and function of a flight system. The primary focus has been to develop a system capable of providing external views of the International Space Station. The Mini AERCam system is spherical-shaped and less than eight inches in diameter. It has a full suite of guidance, navigation, and control hardware and software, and is equipped with two digital video cameras and a high resolution still image camera. The vehicle is designed for either remotely piloted operations or supervised autonomous operations. Tests have been performed in both a six degree-of-freedom closed-loop orbital simulation and on an air-bearing table. The Mini AERCam system can also be used as a test platform for evaluating algorithms and relative navigation for autonomous proximity operations and docking around the Space Shuttle Orbiter or the ISS.

  15. Robotics.

    ERIC Educational Resources Information Center

    Waddell, Steve; Doty, Keith L.

    1999-01-01

    "Why Teach Robotics?" (Waddell) suggests that the United States lags behind Europe and Japan in use of robotics in industry and teaching. "Creating a Course in Mobile Robotics" (Doty) outlines course elements of the Intelligent Machines Design Lab. (SK)

  16. Application of autonomous robotics to surveillance of waste storage containers for radioactive surface contamination

    SciTech Connect

    Sweeney, F.J.; Beckerman, M.; Butler, P.L.; Jones, J.P.; Reister, D.B.

    1991-01-01

    This paper describes a proof-of-principle demonstration performed with the HERMIES-III mobile robot to automate the inspection of waste storage drums for radioactive surface contamination and thereby reduce the human burden of operating a robot and worker exposure to potentially hazardous environments. Software and hardware for the demonstration were developed by a team consisting of Oak Ridge National Laboratory and the Universities of Florida, Michigan, Tennessee, and Texas. Robot navigation, machine vision, manipulator control, parallel processing, and human-machine interface techniques developed by the team were demonstrated utilizing advanced computer architectures. The demonstration consists of over 100,000 lines of computer code executing on nine computers.

  17. Application of autonomous robotics to surveillance of waste storage containers for radioactive surface contamination

    SciTech Connect

    Sweeney, F.J.; Beckerman, M.; Butler, P.L.; Jones, J.P.; Reister, D.B.

    1991-12-31

    This paper describes a proof-of-principle demonstration performed with the HERMIES-III mobile robot to automate the inspection of waste storage drums for radioactive surface contamination and thereby reduce the human burden of operating a robot and worker exposure to potentially hazardous environments. Software and hardware for the demonstration were developed by a team consisting of Oak Ridge National Laboratory and the Universities of Florida, Michigan, Tennessee, and Texas. Robot navigation, machine vision, manipulator control, parallel processing, and human-machine interface techniques developed by the team were demonstrated utilizing advanced computer architectures. The demonstration consists of over 100,000 lines of computer code executing on nine computers.

  18. Multi-sensor integration for autonomous robots in nuclear power plants

    SciTech Connect

    Mann, R.C.; Jones, J.P.; Beckerman, M.; Glover, C.W.; Farkas, L.; Bilbro, G.L.; Snyder, W.

    1989-01-01

    As part of a concerted R&D program in advanced robotics for hazardous environments, scientists and engineers at the Oak Ridge National Laboratory (ORNL) are performing research in the areas of systems integration, range-sensor-based 3-D world modeling, and multi-sensor integration. This program features a unique teaming arrangement that involves the universities of Florida, Michigan, Tennessee, and Texas; Odetics Corporation; and ORNL. This paper summarizes work directed at integrating information extracted from data collected with range sensors and CCD cameras on-board a mobile robot, in order to produce reliable descriptions of the robot's environment. Specifically, the paper describes the integration of two-dimensional vision and sonar range information, and an approach to integrate registered luminance and laser range images. All operations are carried out on-board the mobile robot using a 16-processor hypercube computer. 14 refs., 4 figs.

  19. Multi-Tier Multi-Agent Autonomous Robotic Planetary Surface/Subsurface Reconnaissance for Life

    NASA Astrophysics Data System (ADS)

    Fink, W.; Dohm, J. M.; Tarbell, M. A.; Hare, T. M.; Baker, V. R.; Schulze-Makuch, D.; Furfaro, R.; Fairén, A. G.; Ferré, T. P. A.; Miyamoto, H.; Komatsu, G.; Mahaney, W. C.

    2006-03-01

    Tier-scalable autonomous reconnaissance enables intelligent, unconstrained, and distributed science-driven exploration of prime locations on Venus, Mars, Io, Europa, Titan, and elsewhere, allowing for increased science return and the search for life.

  20. Magician Simulator. A Realistic Simulator for Heterogenous Teams of Autonomous Robots

    DTIC Science & Technology

    2011-01-18

    Communication is an issue in that data from the robots is expected to be provided at least once each second across a built-up area that is up to 500m...0.5 meter, with updates expected every second. Other major issues included in the competition are as follows: • Power limitations—Robots need to...the simulator. A. The Environment The following issues are considered in the simulator: • The possibility of defining up to three phases, each

  1. Magician Simulator: A Realistic Simulator for Heterogenous Teams of Autonomous Robots. MAGIC 2010 Challenge

    DTIC Science & Technology

    2011-02-07

    communications, solar panels, and low power computer control. All components and peripherals were packaged as interchangeable modules, four per scout...Figure 1 for an example based on our simulation of the proposed Grand Challenge environment). Communication is an issue in that data from the robots...major issues included in the competition are as follows: • Power limitations—Robots need to return to base or enter a designated service zone (DSZ

  2. DC Motor Drive for Small Autonomous Robots with Educational and Research Purpose

    NASA Astrophysics Data System (ADS)

    Krklješ, Damir; Babković, Kalman; Nagy, László; Borovac, Branislav; Nikolić, Milan

    Many student robot competitions have been established during the last decade. One of them, and the most popular in Europe, is the EUROBOT competition. The basic aim of this competition is to promote robotics among young people, mostly students and high school pupils. An additional outcome of the competition is the development of faculty curriculums based on it. Such a curriculum has been developed at the Faculty of Technical Sciences in Novi Sad. The curriculum lasts two semesters. During the first semester the theoretical basis is presented to the students. During the second semester the students, divided into teams of three to five, develop the robots which will take part in the upcoming EUROBOT competition. Since the time for robot development is short, a basic electronic kit is provided for the students. The basic parts of the kit are two DC motor drives dedicated to the robot locomotion. The drives will also be used in research concerning a multi-segment robot foot. This paper presents the DC motor drive and its features. Experimental results concerning speed and position regulation, as well as current limiting, are also presented.

  3. Learning for Autonomous Navigation

    NASA Technical Reports Server (NTRS)

    Angelova, Anelia; Howard, Andrew; Matthies, Larry; Tang, Benyang; Turmon, Michael; Mjolsness, Eric

    2005-01-01

    Robotic ground vehicles for outdoor applications have achieved some remarkable successes, notably in autonomous highway following (Dickmanns, 1987), planetary exploration, and off-road navigation on Earth. Nevertheless, major challenges remain to enable reliable, high-speed, autonomous navigation in a wide variety of complex, off-road terrain. 3-D perception of terrain geometry with imaging range sensors is the mainstay of off-road driving systems. However, the stopping distance at high speed exceeds the effective lookahead distance of existing range sensors. Prospects for extending the range of 3-D sensors are strongly limited by sensor physics, eye safety of lasers, and related issues. Range sensor limitations also allow vehicles to enter large cul-de-sacs even at low speed, leading to long detours. Moreover, sensing only terrain geometry fails to reveal mechanical properties of terrain that are critical to assessing its traversability, such as potential for slippage, sinkage, and the degree of compliance of potential obstacles. Rovers in the Mars Exploration Rover (MER) mission have become stuck in sand dunes and experienced significant downhill slippage in the vicinity of large rock hazards. Earth-based off-road robots today have very limited ability to discriminate traversable vegetation from non-traversable vegetation or rough ground. It is impossible today to preprogram a system with knowledge of these properties for all types of terrain and weather conditions that might be encountered.

  4. Autonomous intelligent assembly systems LDRD 105746 final report.

    SciTech Connect

    Anderson, Robert J.

    2013-04-01

    This report documents a three-year effort to develop technology that enables mobile robots to perform autonomous assembly tasks in unstructured outdoor environments. This is a multi-tier problem that requires an integration of a large number of different software technologies, including command and control, estimation and localization, distributed communications, object recognition, pose estimation, real-time scanning, and scene interpretation. Although the project was ultimately unsuccessful in achieving the target brick-stacking task autonomously, numerous important component technologies were nevertheless developed. These technologies include: a patent-pending polygon snake algorithm for robust feature tracking, a color grid algorithm for unique identification and calibration, a command and control framework for abstracting robot commands, a scanning capability that utilizes a compact, robot-portable scanner, and more. This report describes the project and these developed technologies.

  5. Information-driven self-organization: the dynamical system approach to autonomous robot behavior.

    PubMed

    Ay, Nihat; Bernigau, Holger; Der, Ralf; Prokopenko, Mikhail

    2012-09-01

    In recent years, information theory has come into the focus of researchers interested in the sensorimotor dynamics of both robots and living beings. One root for these approaches is the idea that living beings are information processing systems and that the optimization of these processes should be an evolutionary advantage. Apart from these more fundamental questions, there has recently been much interest in the question of how a robot can be equipped with an internal drive for innovation or curiosity that may serve as a drive for an open-ended, self-determined development of the robot. The success of these approaches depends essentially on the choice of a convenient measure of the information. This article studies in some detail the use of the predictive information (PI), also called excess entropy or effective measure complexity, of the sensorimotor process. The PI of a process quantifies the total information of past experience that can be used for predicting future events. However, the application of information-theoretic measures in robotics is mostly restricted to the case of a finite, discrete state-action space. This article aims at applying the PI in the dynamical systems approach to robot control. We study linear systems as a first step and derive exact results for the PI together with explicit learning rules for the parameters of the controller. Interestingly, these learning rules are of Hebbian nature and local in the sense that the synaptic update is given by the product of activities available directly at the pertinent synaptic ports. The general findings are exemplified by a number of case studies. In particular, in a two-dimensional system designed to mimic embodied systems with latent oscillatory locomotion patterns, it is shown that maximizing the PI means recognizing and amplifying the latent modes of the robotic system. This and many other examples show that the learning rules derived from the maximum PI principle are a versatile tool for the self
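
    For a stationary Gaussian first-order (AR(1)) process, about the simplest linear stand-in for a sensorimotor channel, the predictive information has a closed form, I(past; future) = -1/2 ln(1 - a^2) nats, because the process is Markov and jointly Gaussian. The sketch below checks that closed form against an estimate from simulated data; it is a toy illustration under that assumption, not the controller learning rules derived in the article.

```python
import numpy as np

def predictive_information_ar1(a):
    """PI of a stationary Gaussian AR(1) process x_{t+1} = a*x_t + noise.
    For a first-order Markov process, PI reduces to I(x_t; x_{t+1}), which for
    jointly Gaussian variables equals -0.5 * ln(1 - a^2) nats."""
    return -0.5 * np.log(1.0 - a**2)

def empirical_pi(x):
    """Estimate PI from samples via the lag-1 correlation coefficient."""
    r = np.corrcoef(x[:-1], x[1:])[0, 1]
    return -0.5 * np.log(1.0 - r**2)

rng = np.random.default_rng(0)
a = 0.8
x = np.zeros(100_000)
for t in range(1, x.size):
    x[t] = a * x[t - 1] + rng.standard_normal()

print(predictive_information_ar1(a), empirical_pi(x))  # both ~0.51 nats
```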

  6. Intelligent Mobile Autonomous System

    DTIC Science & Technology

    1987-01-01

    Figure captions in the source concern positive and negative jerk application, in which force is increased from an initial value to the force of resistance. ...fundamentals of the new emerging area of autonomous robotics. The goal of this research is to develop a theory of design and functioning of Intelligent...scientific research. This report contributes to a new rapidly developing area of autonomous robotics. Actual experience of dealing with autonomous robots (or

  7. Generating Self-Reliant Teams of Autonomous Cooperating Robots: Desired design Characteristics

    SciTech Connect

    Parker, L.E.

    1999-05-01

    The difficulties in designing a cooperative team are significant. Several of the key questions that must be resolved when designing a cooperative control architecture include: How do we formulate, describe, decompose, and allocate problems among a group of intelligent agents? How do we enable agents to communicate and interact? How do we ensure that agents act coherently in their actions? How do we allow agents to recognize and reconcile conflicts? However, in addition to these key issues, the software architecture must be designed to enable multi-robot teams to be robust, reliable, and flexible. Without these capabilities, the resulting robot team will not be able to successfully deal with the dynamic and uncertain nature of the real world. In this extended abstract, we first describe these desired capabilities. We then briefly describe the ALLIANCE software architecture that we have previously developed for multi-robot cooperation. We then briefly analyze the ALLIANCE architecture in terms of the desired design qualities identified.

  8. Creative Engineering Based Education with Autonomous Robots Considering Job Search Support

    NASA Astrophysics Data System (ADS)

    Takezawa, Satoshi; Nagamatsu, Masao; Takashima, Akihiko; Nakamura, Kaeko; Ohtake, Hideo; Yoshida, Kanou

    The Robotics Course in our Mechanical Systems Engineering Department offers “Robotics Exercise Lessons” as one of its Problem-Solution Based Specialized Subjects. This is intended to motivate students' learning, to help them acquire fundamental knowledge and skills in mechanical engineering, and to improve their understanding of Robotics Basic Theory. Our current curriculum was established to accomplish this objective based on two pieces of research in 2005: an evaluation questionnaire on the education of our Mechanical Systems Engineering Department for graduates, and a survey on the kind of human resources which companies are seeking and their expectations for our department. This paper reports academic results and reflections on job search support in recent years, as inherited and developed from the previous curriculum.

  9. Ground Simulation of an Autonomous Satellite Rendezvous and Tracking System Using Dual Robotic Systems

    NASA Technical Reports Server (NTRS)

    Trube, Matthew J.; Hyslop, Andrew M.; Carignan, Craig R.; Easley, Joseph W.

    2012-01-01

    A hardware-in-the-loop ground system was developed for simulating a robotic servicer spacecraft tracking a target satellite at short range. A relative navigation sensor package, "Argon", is mounted on the end-effector of a Fanuc 430 manipulator, which functions as the base platform of the robotic spacecraft servicer. Machine vision algorithms estimate the pose of the target spacecraft, mounted on a Rotopod R-2000 platform, and relay the solution to a simulation of the servicer spacecraft running in "Freespace", which performs guidance, navigation and control functions, integrates dynamics, and issues motion commands to a Fanuc platform controller so that it tracks the simulated servicer spacecraft. Results will be reviewed for several satellite motion scenarios at different ranges. Key words: robotics, satellite, servicing, guidance, navigation, tracking, control, docking.

  10. Behavior generation strategy of artificial behavioral system by self-learning paradigm for autonomous robot tasks

    NASA Astrophysics Data System (ADS)

    Dağlarli, Evren; Temeltaş, Hakan

    2008-04-01

    In this study, behavior generation and self-learning paradigms are investigated for real-time applications of multi-goal mobile robot tasks. The method is capable of generating new behaviors and combining them in order to achieve multi-goal tasks. The proposed method is composed of three layers: a Behavior Generating Module, a Coordination Level, and an Emotion-Motivation Level. The last two levels use hidden Markov models to manage the dynamical structure of behaviors. The kinematic and dynamic models of the mobile robot with non-holonomic constraints are considered in the behavior-based control architecture. The proposed method is tested on a four-wheel-driven and four-wheel-steered mobile robot with constraints in a simulation environment, and results are obtained successfully.

  11. First observations of teleseismic P-waves with autonomous underwater robots: towards future global network of mobile seismometers

    NASA Astrophysics Data System (ADS)

    Sukhovich, Alexei; Nolet, Guust; Hello, Yann; Simons, Frederik; Bonnieux, Sébastien

    2013-04-01

    We report here the first successful observations of underwater acoustic signals generated by teleseismic P-waves recorded by the autonomous robots MERMAID (short for Mobile Earthquake Recording in Marine Areas by Independent Divers). During 2011-2012 we conducted three test campaigns, for a total duration of about 8 weeks, in the Ligurian Sea, which allowed us to record nine teleseismic events (distances of more than 60 degrees) of magnitude higher than 6 and one closer event (distance of 23 degrees) of magnitude 5.5. Our results indicate that no simple relation exists between the magnitude of the source event and the signal-to-noise ratio (SNR) of the corresponding acoustic signals. Other factors, such as fault orientation and meteorological conditions, play an important role in the detectability of the seismic events. We also show examples of the events recorded during these test runs and how their frequency characteristics allow them to be recognized automatically by an algorithm based on the wavelet transform. We shall also report on more recent results obtained during the first fully autonomous run (currently ongoing) of the final MERMAID design in the Mediterranean Sea.
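
    The abstract notes that recorded events are recognized automatically by an algorithm based on the wavelet transform. The sketch below shows one generic way such a trigger could look: decompose each buffered window with a discrete wavelet transform (PyWavelets) and flag windows whose detail-band energy stands far above the median. The wavelet, band, window length, threshold, and synthetic test signal are all assumptions for illustration; this is not the MERMAID discrimination algorithm.

```python
import numpy as np
import pywt  # PyWavelets

def detect_events(signal, wavelet="db4", level=4, win=1024, k=8.0):
    """Flag windows whose wavelet detail-band energy exceeds k times the
    median energy over all windows (a crude stand-in for a P-wave trigger)."""
    energies = []
    for start in range(0, len(signal) - win + 1, win):
        coeffs = pywt.wavedec(signal[start:start + win], wavelet, level=level)
        # coeffs = [cA_level, cD_level, ..., cD_1]; use a mid-frequency detail band.
        energies.append(float(np.sum(coeffs[2] ** 2)))
    energies = np.array(energies)
    threshold = k * np.median(energies)
    return np.nonzero(energies > threshold)[0], energies

# Synthetic test: broadband noise with one transient "arrival".
rng = np.random.default_rng(1)
x = rng.standard_normal(16 * 1024)
x[9000:9200] += 5.0 * np.sin(np.linspace(0, 40 * np.pi, 200))
hits, _ = detect_events(x)
print("triggered windows:", hits)
```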

  12. Real-time Needle Steering in Response to Rolling Vein Deformation by a 9-DOF Image-Guided Autonomous Venipuncture Robot.

    PubMed

    Chen, Alvin I; Balter, Max L; Maguire, Timothy J; Yarmush, Martin L

    2015-01-01

    Venipuncture is the most common invasive medical procedure performed in the United States and the number one cause of hospital injury. Failure rates are particularly high in pediatric and elderly patients, whose veins tend to deform, move, or roll as the needle is introduced. To improve venipuncture accuracy in challenging patient populations, we have developed a portable device that autonomously servos a needle into a suitable vein under image guidance. The device operates in real time, combining near-infrared and ultrasound imaging, computer vision software, and a 9 degrees-of-freedom robot that servos the needle. In this paper, we present the kinematic and mechanical design of the latest generation robot. We then investigate in silico and in vitro the mechanics of vessel rolling and deformation in response to needle insertions performed by the robot. Finally, we demonstrate how the robot can make real-time adjustments under ultrasound image guidance to compensate for subtle vessel motions during venipuncture.

  13. Real-time Needle Steering in Response to Rolling Vein Deformation by a 9-DOF Image-Guided Autonomous Venipuncture Robot

    PubMed Central

    Chen, Alvin I.; Balter, Max L.; Maguire, Timothy J.; Yarmush, Martin L.

    2015-01-01

    Venipuncture is the most common invasive medical procedure performed in the United States and the number one cause of hospital injury. Failure rates are particularly high in pediatric and elderly patients, whose veins tend to deform, move, or roll as the needle is introduced. To improve venipuncture accuracy in challenging patient populations, we have developed a portable device that autonomously servos a needle into a suitable vein under image guidance. The device operates in real time, combining near-infrared and ultrasound imaging, computer vision software, and a 9 degrees-of-freedom robot that servos the needle. In this paper, we present the kinematic and mechanical design of the latest generation robot. We then investigate in silico and in vitro the mechanics of vessel rolling and deformation in response to needle insertions performed by the robot. Finally, we demonstrate how the robot can make real-time adjustments under ultrasound image guidance to compensate for subtle vessel motions during venipuncture. PMID:26779381

  14. A modular structure for the control of an autonomous mobile robot

    NASA Astrophysics Data System (ADS)

    Perebaskine, Victor-Olivier

    1992-02-01

    A hierarchically organized robot control structure is described. Four steps were achieved: characterization of the basic element types composing the functional layer, and identification of interaction modes between the different components of the control structure; specification and implementation of specific communication mechanisms to support these interaction modes in the control structure; study of the module structure, and its control; implementation of the system and validation by several experiments. One of the major aspects of this system is the possibility of programming the robot's reactivity according to the mission requirements. Another important aspect is the ability to modify the relationships between modules during mission execution.

  15. Experimental Demonstration of Technologies for Autonomous On-Orbit Robotic Assembly

    NASA Technical Reports Server (NTRS)

    LeMaster, Edward A.; Schaechter, David B.; Carrington, Connie K.

    2006-01-01

    The Modular Reconfigurable High Energy (MRHE) program aimed to develop technologies for the automated assembly and deployment of large-scale space structures and aggregate spacecraft. Part of the project involved creation of a terrestrial robotic testbed for validation and demonstration of these technologies and for the support of future development activities. This testbed was completed in 2005, and was thereafter used to demonstrate automated rendezvous, docking, and self-assembly tasks between a group of three modular robotic spacecraft emulators. This paper discusses the rationale for the MRHE project, describes the testbed capabilities, and presents the MRHE assembly demonstration sequence.

  16. Sustainable Cooperative Robotic Technologies for Human and Robotic Outpost Infrastructure Construction and Maintenance

    NASA Technical Reports Server (NTRS)

    Stroupe, Ashley W.; Okon, Avi; Robinson, Matthew; Huntsberger, Terry; Aghazarian, Hrand; Baumgartner, Eric

    2004-01-01

    Robotic Construction Crew (RCC) is a heterogeneous multi-robot system for autonomous acquisition, transport, and precision mating of components in construction tasks. RCC minimizes use of resources that are constrained in a space environment, such as computation, power, communication, and sensing. A behavior-based architecture provides adaptability and robustness despite low computational requirements. RCC successfully performs several construction-related tasks in an emulated outdoor environment despite high levels of uncertainty in motion and sensing. Quantitative results are provided for formation keeping during component transport, precision instrument placement, and construction tasks.

  17. Mobbing Behavior and Deceit and its Role in Bioinspired Autonomous Robotic Agents

    DTIC Science & Technology

    2012-01-01

    behavior mainly displayed in cooperative birds but can also be found in animals such as meerkats [13] and squirrels [11] shown in figure 1. This...Robotics and Automation, Vol. 14, No. 6, December 1998, pp. 926-939. [13] Graw, B. and Manser, M., “The function of mobbing cooperative meerkats

  18. Navy Requirements for Controlling Multiple Off-Board Robots Using the Autonomous Unmanned Vehicle Workbench

    DTIC Science & Technology

    2007-06-01

    ...without legal restrictions. 2. Internationalization (I18N): The United States is not the only country using AUVs. When robots from the US

  19. Demonstration of Waypoint Navigation for A Semi-Autonomous Prototype Surf-Zone Robot

    DTIC Science & Technology

    2006-06-01

    Topics covered include a biologically based gait, a limited slip differential device, and the BL2000. ...limited slip differential which engages when a Wheg cannot move either because it is stuck or has encountered an obstacle. When this occurs, the other set

  20. 3-D Ultrasound Guidance of Autonomous Robot for Location of Ferrous Shrapnel

    PubMed Central

    Rogers, Albert J.; Light, Edward D.

    2010-01-01

    Vibrations can be induced in ferromagnetic shrapnel by a variable electromagnet. Real time 3-D color Doppler ultrasound located the induced motion in a needle fragment and determined its 3-D position in the scanner coordinates. This information was used to guide a robot which moved a probe to touch the shrapnel fragment. PMID:19574140

  1. Approaching Complexity through Planful Play: Kindergarten Children's Strategies in Constructing an Autonomous Robot's Behavior

    ERIC Educational Resources Information Center

    Levy, S. T.; Mioduser, D.

    2010-01-01

    This study investigates how young children master, construct and understand intelligent rule-based robot behaviors, focusing on their strategies in gradually meeting the tasks' complexity. The wider aim is to provide a comprehensive map of the kinds of transitions and learning that take place in constructing simple emergent behaviors, particularly…

  2. Design of a low-cost high-performance autonomous robot for nuclear environments

    SciTech Connect

    Burhanpurkar, V.P.

    1994-12-31

    This paper presents two key aspects of a novel low-cost modular mobile robot architecture for nuclear environments. Key features of the system are (a) a novel ultrasonic sensor for scene analysis in unstructured environments and (b) an efficient ground-search algorithm for ground-level contamination mapping without a priori maps or preprogramming.

  3. 3-D ultrasound guidance of autonomous robot for location of ferrous shrapnel.

    PubMed

    Rogers, Albert J; Light, Edward D; Smith, Stephen W

    2009-07-01

    Vibrations can be induced in ferromagnetic shrapnel by a variable electromagnet. Real time 3-D color Doppler ultrasound located the induced motion in a needle fragment and determined its 3-D position in the scanner coordinates. This information was used to guide a robot which moved a probe to touch the shrapnel fragment.

  4. The research of autonomous obstacle avoidance of mobile robot based on multi-sensor integration

    NASA Astrophysics Data System (ADS)

    Zhao, Ming; Han, Baoling

    2016-11-01

    The object of this study is a bionic quadruped mobile robot. The study proposes a system design for mobile robot obstacle avoidance that integrates a binocular stereo vision sensor and a self-developed 3D lidar with modified ant colony optimization path planning to reconstruct the environment map. Because the working conditions of a mobile robot are complex, 3D reconstruction with a single binocular sensor is unreliable when feature points are few and lighting is poor. Therefore, this system integrates the Bumblebee2 stereo vision sensor and the lidar sensor to detect 3D point clouds of environmental obstacles, and sensor information fusion is used to rebuild the environment map. First, obstacles are detected separately from the lidar data and from the visual data; the two detections are then fused to obtain a more complete and accurate distribution of obstacles in the scene. The thesis then introduces the ant colony algorithm, analyzes the advantages and disadvantages of ant colony optimization and their causes, and improves the algorithm to increase its convergence rate and precision in robot path planning. These improvements and the sensor integration overcome shortcomings of ant colony optimization such as easily falling into local optima, slow search speed, and poor search results. The experiments process images and drive the motors in a Matlab and Visual Studio environment, establish a 2.5D visual grid map, and finally plan a global path for the mobile robot according to the ant colony algorithm. The feasibility and effectiveness of the system are confirmed on ROS and a Linux simulation platform.
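
    One simple way to combine the two per-sensor obstacle detections described above is to hold each as a per-cell occupancy probability grid and fuse them with a noisy-OR rule, so a cell is treated as free only if both sensors consider it free. The sketch below illustrates that generic idea with made-up grids and a made-up threshold; it is not the fusion scheme used in the thesis. The fused map would then feed the (modified) ant colony path planner.

```python
import numpy as np

def fuse_occupancy(p_lidar, p_stereo):
    """Fuse two per-cell obstacle probability grids with a noisy-OR rule:
    a cell is free only if both sensors independently consider it free."""
    return 1.0 - (1.0 - p_lidar) * (1.0 - p_stereo)

# Illustrative 5x5 grids (per-cell probabilities of occupancy).
rng = np.random.default_rng(2)
p_lidar = rng.random((5, 5))
p_stereo = rng.random((5, 5))

fused = fuse_occupancy(p_lidar, p_stereo)
obstacles = fused > 0.7          # threshold into a binary obstacle map
print(obstacles.astype(int))     # 1 = treat cell as blocked for path planning
```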

  5. A demonstration of autonomous navigation and machine vision using the HERMIES-IIB robot

    SciTech Connect

    Burks, B.L.; Barnett, D.L.; Jones, J.P.; Killough, S.M.

    1987-01-01

    In this paper, advances to our mobile robot series (currently HERMIES-IIB) to include 8 NCUBE processors on-board, (computationally equivalent to 8 Vax 11/780's) operating in parallel, and augmentation of the sensor suite with cameras to facilitate on-board vision analysis and goal finding are described. The essential capabilities of the expert system described in earlier papers have been ported to the on-board HERMIES-IIB computers thereby eliminating off-board computation. A successful experiment is described in which a robot is placed in an initial arbitrary location without prior specification of the room contents, successfully discovers and navigates around stationary and moving obstacles, picks up and moves small obstacles, searches for a control panel, and reads the meters found on the panel. 19 refs., 5 figs.

  6. Design and implementation of a mechanically heterogeneous robot group

    NASA Astrophysics Data System (ADS)

    Sukhatme, Gaurav S.; Montgomery, James F.; Mataric, Maja J.

    1999-08-01

    This paper describes the design and construction of a cooperative, heterogeneous robot group comprised of one semi-autonomous aerial robot and two autonomous ground robots. The robots are designed to perform automated surveillance and reconnaissance of an urban outdoor area using onboard sensing. The ground vehicles have GPS, sonar for obstacle detection and avoidance, and a simple color-based vision system. Navigation is performed using an optimal mixture of odometry and GPS. The helicopter is equipped with a GPS/INS system, a camera, and a framegrabber. Each robot has an embedded 486 PC/104 processor running the QNX real-time operating system. Individual robot controllers are behavior-based and decentralized. We describe a control strategy and architecture that coordinates the robots with minimal top-down planning. The overall system is controlled at high level by a single human operator using a specially designed control unit. The operator is able to task the group with a mission using a minimal amount of training. The group can re-task itself based on sensor inputs and can also be re-tasked by the operator. We describe a particular reconnaissance mission that the robots have been tested with, and lessons learned during the design and implementation. Our initial results with these experiments are encouraging given the challenging mechanics of the aerial robot. We conclude the paper with a discussion of ongoing and future work.
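
    The abstract states that ground-vehicle navigation uses an optimal mixture of odometry and GPS. The sketch below shows the simplest such mixture: a variance-weighted blend of the dead-reckoned odometry position and the GPS fix, which is the minimum-variance combination of two independent estimates. The numbers are illustrative, and the authors' actual estimator (possibly a Kalman filter) is not specified here.

```python
import numpy as np

def blend_odometry_gps(odo_xy, gps_xy, odo_var, gps_var):
    """Variance-weighted blend of an odometry dead-reckoned position and a GPS
    fix: the minimum-variance combination of two independent estimates."""
    w = gps_var / (odo_var + gps_var)   # weight on the odometry estimate
    return w * np.asarray(odo_xy) + (1.0 - w) * np.asarray(gps_xy)

# Illustrative numbers only: odometry has drifted, GPS is noisy but unbiased.
print(blend_odometry_gps(odo_xy=[10.2, 4.9], gps_xy=[11.0, 5.5],
                         odo_var=4.0, gps_var=1.0))   # -> [10.84, 5.38]
```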

  7. Stations Outdoors

    ERIC Educational Resources Information Center

    Madison, John P.; And Others

    1976-01-01

    Described is a program of outdoor education utilizing activity-oriented learning stations. Described are 13 activities including: a pond study, orienteering, nature crafts, outdoor mathematics, linear distance measurement, and area measurement. (SL)

  8. Robust Agent Control of an Autonomous Robot with Many Sensors and Actuators

    DTIC Science & Technology

    1993-05-01

    The neural controller was developed by Beer and was inspired by Pearson's flexor burst-generator model of cockroach locomotion. The goal was to...Workshop on Intelligent Robots and Systems', Ibaraki, Japan, pp. 383-388. Beer, R. & Chiel, H. (1993), Simulations of Cockroach Locomotion and Es...Previous work exploring fully distributed, insect-like locomotion controllers has only been addressed for flat terrain (Beer, Chiel, Quinn & Espenschied

  9. Automated Cartography by an Autonomous Mobile Robot Using Ultrasonic Range Finders

    DTIC Science & Technology

    1993-09-01

    Macintosh Powerbook 145 notebook computer with an Articulate Systems Voice Navigator voice interface is provided for user communications with the...for corrosion and loose fittings. A voice interface is currently under development to provide a more intuitive robot/human. This system will enable...Please read the owner’s manual prior to operating this computer. A Voice Navigator voice interface system is also available for voice recognition

  10. The Backseat Control Architecture for Autonomous Robotic Vehicles: A Case Study with the Iver2 AUV

    DTIC Science & Technology

    2010-06-01

    the vehicle to perform real-time adaptive environmental sampling. In addition to the oceanographic sensors, each vehicle is equipped with a 16...acoustic data to a set of 8-channel analog to digital (A/D) converter boards described in II-A.5, allowing the vehicle to perform real-time underwater target...for oceanographic sampling with the autonomous kayaks are described in [9]. III. THE BACKSEAT CONTROL ARCHITECTURE A. Overview The iOceanServerComms

  11. The Rise of Robots: The Military’s Use of Autonomous Lethal Force

    DTIC Science & Technology

    2015-02-17

    autonomous lethal engagement ability. As military professionals, we have a duty to ensure the legal framework, proper policy, moral and ethical ...recommendations for potential paths forward. To facilitate the discussion, the paper is divided into three major areas: the legal implications, ethical ...the legal framework, proper policy, moral and ethical considerations, as well as proper tactics and doctrine are in place to ensure compliance with

  12. A bioinspired autonomous swimming robot as a tool for studying goal-directed locomotion.

    PubMed

    Manfredi, L; Assaf, T; Mintchev, S; Marrazza, S; Capantini, L; Orofino, S; Ascari, L; Grillner, S; Wallén, P; Ekeberg, O; Stefanini, C; Dario, P

    2013-10-01

    The bioinspired approach has been key in combining the disciplines of robotics with neuroscience in an effective and promising fashion. Indeed, certain aspects in the field of neuroscience, such as goal-directed locomotion and behaviour selection, can be validated through robotic artefacts. In particular, swimming is a functionally important behaviour where neuromuscular structures, neural control architecture and operation can be replicated artificially following models from biology and neuroscience. In this article, we present a biomimetic system inspired by the lamprey, an early vertebrate that locomotes using anguilliform swimming. The artefact possesses extra- and proprioceptive sensory receptors, muscle-like actuation, distributed embedded control and a vision system. Experiments on optimised swimming and on goal-directed locomotion are reported, as well as the assessment of the performance of the system, which shows high energy efficiency and adaptive behaviour. While the focus is on providing a robotic platform for testing biological models, the reported system can also be of major relevance for the development of engineering system applications.
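
    Anguilliform swimming of the kind replicated here is commonly abstracted as a chain of coupled oscillators producing a traveling wave of body curvature. The sketch below shows that generic abstraction with assumed parameters; it is not the neural controller of the reported robot.

        import math

        # Generic chain of phase-shifted oscillators generating a traveling wave of
        # joint angles, a common abstraction of anguilliform (lamprey-like) swimming.
        # Segment count, frequency, amplitude and phase lag are illustrative assumptions.

        N_SEGMENTS = 10
        FREQ_HZ = 1.0                          # undulation frequency
        PHASE_LAG = 2 * math.pi / N_SEGMENTS   # one full wave along the body
        AMPLITUDE = 0.3                        # joint amplitude in radians
        DT = 0.02                              # control period in seconds

        def joint_angles(t):
            """Joint angle commands for each body segment at time t."""
            return [AMPLITUDE * math.sin(2 * math.pi * FREQ_HZ * t - i * PHASE_LAG)
                    for i in range(N_SEGMENTS)]

        if __name__ == "__main__":
            for step in range(5):
                print([round(a, 3) for a in joint_angles(step * DT)])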

  13. Novel microbial diversity retrieved by autonomous robotic exploration of the world's deepest vertical phreatic sinkhole.

    PubMed

    Sahl, Jason W; Fairfield, Nathaniel; Harris, J Kirk; Wettergreen, David; Stone, William C; Spear, John R

    2010-03-01

    The deep phreatic thermal explorer (DEPTHX) is an autonomous underwater vehicle designed to navigate an unexplored environment, generate high-resolution three-dimensional (3-D) maps, collect biological samples based on an autonomous sampling decision, and return to its origin. In the spring of 2007, DEPTHX was deployed in Zacatón, a deep (approximately 318 m), limestone, phreatic sinkhole (cenote) in northeastern Mexico. As DEPTHX descended, it generated a 3-D map based on the processing of range data from 54 onboard sonars. The vehicle collected water column samples and wall biomat samples throughout the depth profile of the cenote. Post-expedition sample analysis via comparative analysis of 16S rRNA gene sequences revealed a wealth of microbial diversity. Traditional Sanger gene sequencing combined with a barcoded-amplicon pyrosequencing approach revealed novel, phylum-level lineages from the domains Bacteria and Archaea; in addition, several novel subphylum lineages were also identified. Overall, DEPTHX successfully navigated and mapped Zacatón, and collected biological samples based on an autonomous decision, which revealed novel microbial diversity in a previously unexplored environment.

  14. Novel Microbial Diversity Retrieved by Autonomous Robotic Exploration of the World's Deepest Vertical Phreatic Sinkhole

    NASA Astrophysics Data System (ADS)

    Sahl, Jason W.; Fairfield, Nathaniel; Harris, J. Kirk; Wettergreen, David; Stone, William C.; Spear, John R.

    2010-03-01

    The deep phreatic thermal explorer (DEPTHX) is an autonomous underwater vehicle designed to navigate an unexplored environment, generate high-resolution three-dimensional (3-D) maps, collect biological samples based on an autonomous sampling decision, and return to its origin. In the spring of 2007, DEPTHX was deployed in Zacatón, a deep (˜318 m), limestone, phreatic sinkhole (cenote) in northeastern Mexico. As DEPTHX descended, it generated a 3-D map based on the processing of range data from 54 onboard sonars. The vehicle collected water column samples and wall biomat samples throughout the depth profile of the cenote. Post-expedition sample analysis via comparative analysis of 16S rRNA gene sequences revealed a wealth of microbial diversity. Traditional Sanger gene sequencing combined with a barcoded-amplicon pyrosequencing approach revealed novel, phylum-level lineages from the domains Bacteria and Archaea; in addition, several novel subphylum lineages were also identified. Overall, DEPTHX successfully navigated and mapped Zacatón, and collected biological samples based on an autonomous decision, which revealed novel microbial diversity in a previously unexplored environment.

  15. An Extremely Low Power Quantum Optical Communication Link for Autonomous Robotic Explorers

    NASA Technical Reports Server (NTRS)

    Lekki, John; Nguyen, Quang-Viet; Bizon, Tom; Nguyen, Binh; Kojima, Jun

    2007-01-01

    One concept for planetary exploration involves using many small robotic landers that can cover more ground than a single conventional lander. In addressing this vision, NASA has been challenged in the National Nanotechnology Initiative to research the development of miniature robots built from nano-sized components. These robots face very significant challenges, such as mobility and communication, given their small size and limited power generation capability. The research presented here has been focused on developing a communications system that has the potential for providing ultra-low power communications for robots such as these. In this paper an optical communications technique that is based on transmitting recognizable sets of photons is presented. Previously, pairs of photons that share an entangled quantum state have been shown to be recognizable in ambient light. The main drawback to utilizing entangled photons is that they can only be generated through a very energy-inefficient nonlinear process. In this paper a new technique that generates sets of photons from pulsed sources is described and an experimental system demonstrating this technique is presented. This technique of generating photon sets from pulsed sources has the distinct advantage that it is much more flexible and energy efficient, and it is well suited to take advantage of the very high energy efficiencies that are possible when using nanoscale sources. For these reasons the communication system presented in this paper is well suited for use in very small, low power landers and rovers. In this paper a very low power optical communications system for miniature robots as small as 1 cu cm is addressed. The communication system is a variant of photon counting communications. Instead of counting individual photons, the system counts only the arrival of time-coincident sets of photons. Using sets of photons significantly decreases the bit error rate because they are highly identifiable in the
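
    The receiver described above registers a symbol only when several photon arrivals fall within one coincidence window. The toy sketch below illustrates that counting rule; the window width and set size are assumptions chosen for the example, not values from the experiment.

        # Toy coincidence detector: a "symbol" is registered only when at least
        # SET_SIZE photon arrivals fall within one coincidence window. Window width
        # and set size are illustrative assumptions, not values from the paper.

        COINCIDENCE_WINDOW_NS = 5.0
        SET_SIZE = 2

        def count_coincident_sets(arrival_times_ns):
            """Count disjoint groups of >= SET_SIZE arrivals within the window."""
            times = sorted(arrival_times_ns)
            sets_found = 0
            i = 0
            while i < len(times):
                j = i
                while j + 1 < len(times) and times[j + 1] - times[i] <= COINCIDENCE_WINDOW_NS:
                    j += 1
                if (j - i + 1) >= SET_SIZE:
                    sets_found += 1
                i = j + 1
            return sets_found

        if __name__ == "__main__":
            # Two genuine pairs plus scattered background photons.
            arrivals = [100.0, 101.2, 250.0, 400.0, 401.5, 900.0]
            print(count_coincident_sets(arrivals))   # -> 2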

  16. Robotics

    SciTech Connect

    Scheide, A.W.

    1983-11-01

    This article reviews some of the technical areas and history associated with robotics, provides information relative to the formation of a Robotics Industry Committee within the Industry Applications Society (IAS), and describes how all activities relating to robotics will be coordinated within the IEEE. Industrial robots are being used for material handling, processes such as coating and arc welding, and some mechanical and electronics assembly. An industrial robot is defined as a programmable, multifunctional manipulator designed to move material, parts, tools, or specialized devices through variable programmed motions for a variety of tasks. The initial focus of the Robotics Industry Committee will be on the application of robotics systems to the various industries that are represented within the IAS.

  17. Dissociated emergent-response system and fine-processing system in human neural network and a heuristic neural architecture for autonomous humanoid robots.

    PubMed

    Yan, Xiaodan

    2010-01-01

    The current study investigated the functional connectivity of the primary sensory system with resting state fMRI and applied such knowledge into the design of the neural architecture of autonomous humanoid robots. Correlation and Granger causality analyses were utilized to reveal the functional connectivity patterns. Dissociation was within the primary sensory system, in that the olfactory cortex and the somatosensory cortex were strongly connected to the amygdala whereas the visual cortex and the auditory cortex were strongly connected with the frontal cortex. The posterior cingulate cortex (PCC) and the anterior cingulate cortex (ACC) were found to maintain constant communication with the primary sensory system, the frontal cortex, and the amygdala. Such neural architecture inspired the design of dissociated emergent-response system and fine-processing system in autonomous humanoid robots, with separate processing units and another consolidation center to coordinate the two systems. Such design can help autonomous robots to detect and respond quickly to danger, so as to maintain their sustainability and independence.

  18. Dissociated Emergent-Response System and Fine-Processing System in Human Neural Network and a Heuristic Neural Architecture for Autonomous Humanoid Robots

    PubMed Central

    Yan, Xiaodan

    2010-01-01

    The current study investigated the functional connectivity of the primary sensory system with resting state fMRI and applied such knowledge into the design of the neural architecture of autonomous humanoid robots. Correlation and Granger causality analyses were utilized to reveal the functional connectivity patterns. Dissociation was within the primary sensory system, in that the olfactory cortex and the somatosensory cortex were strongly connected to the amygdala whereas the visual cortex and the auditory cortex were strongly connected with the frontal cortex. The posterior cingulate cortex (PCC) and the anterior cingulate cortex (ACC) were found to maintain constant communication with the primary sensory system, the frontal cortex, and the amygdala. Such neural architecture inspired the design of dissociated emergent-response system and fine-processing system in autonomous humanoid robots, with separate processing units and another consolidation center to coordinate the two systems. Such design can help autonomous robots to detect and respond quickly to danger, so as to maintain their sustainability and independence. PMID:21331371

  19. Autonomous Marine Robotic Technology Reveals an Expansive Benthic Bacterial Community Relevant to Regional Nitrogen Biogeochemistry.

    PubMed

    Valentine, David L; Fisher, G Burch; Pizarro, Oscar; Kaiser, Carl L; Yoerger, Dana; Breier, John A; Tarn, Jonathan

    2016-10-06

    Benthic accumulations of filamentous, mat-forming bacteria occur throughout the oceans where bisulfide mingles with oxygen or nitrate, providing key but poorly quantified linkages between elemental cycles of carbon, nitrogen and sulfur. Here we used the autonomous underwater vehicle Sentry to conduct a contiguous, 12.5 km photoimaging survey of sea-floor colonies of filamentous bacteria between 80 and 579 m water depth, spanning the continental shelf to the deep suboxic waters of the Santa Barbara Basin (SBB). The survey provided >31 000 images and revealed contiguous, white-colored bacterial colonization coating > ∼80% of the ocean floor and spanning over 1.6 km, between 487 and 523 m water depth. Based on their localization within the stratified waters of the SBB we hypothesize a dynamic and annular biogeochemical zonation by which the bacteria capitalize on periodic flushing events to accumulate and utilize nitrate. Oceanographic time series data bracket the imaging survey and indicate rapid and contemporaneous nitrate loss, while autonomous capture of microbial communities from the benthic boundary layer concurrent with imaging provides possible identities for the responsible bacteria. Based on these observations we explore the ecological context of such mats and their possible importance in the nitrogen cycle of the SBB.

  20. Compact 3D lidar based on optically coupled horizontal and vertical scanning mechanism for the autonomous navigation of robots

    NASA Astrophysics Data System (ADS)

    Lee, Min-Gu; Baeg, Seung-Ho; Lee, Ki-Min; Lee, Hae-Seok; Baeg, Moon-Hong; Park, Jong-Ok; Kim, Hong-Ki

    2011-06-01

    The purpose of this research is to develop a new 3D LIDAR sensor, named KIDAR-B25, for measuring 3D image information with high range accuracy, high speed and compact size. To measure the distance to the target object, we developed a range measurement unit implemented with the direct Time-Of-Flight (TOF) method using a TDC chip, a pulsed laser transmitter as the illumination source (pulse width: 10 ns, wavelength: 905 nm, repetition rate: 30 kHz, peak power: 20 W), and an Si APD receiver, which has high sensitivity and wide bandwidth. We also devised a horizontal and vertical scanning mechanism, climbing in a spiral and coupled with the laser optical path. In addition, control electronics such as the motor controller, the signal processing unit, the power distributor and so on were developed and integrated in a compact assembly. The key point of the 3D LIDAR design proposed in this paper is the use of a compact scanning mechanism coupled with the optical module both horizontally and vertically. The KIDAR-B25 uses the same beam propagation axis for emitting the laser pulse and receiving the reflected one, with no optical interference between the two paths. The scanning performance of the KIDAR-B25 has been proven with stable operation up to 20 Hz (vertical) and 40 Hz (horizontal), taking about 1.7 s to reach maximum speed. The vertical field of view (FOV) extends to +/-10 degrees with a 0.25 degree angular resolution, and the whole horizontal plane (360 degrees) is covered with a 0.125 degree angular resolution. Since the KIDAR-B25 sensor has been planned and developed for use on mobile robots for navigation, we conducted an outdoor test to evaluate its performance. The experimental results show that the captured 3D imaging data can be usefully applied to robot navigation for detecting and avoiding moving objects in real time.
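
    Direct time-of-flight ranging of this kind recovers range as half the round-trip distance travelled at the speed of light, and each return is placed in 3-D from the horizontal and vertical scan angles. The sketch below shows that arithmetic with illustrative variable names; it is not the sensor's actual firmware.

        import math

        C = 299_792_458.0   # speed of light, m/s

        def tof_range_m(round_trip_time_s):
            """Direct time-of-flight: range is half the round-trip distance."""
            return 0.5 * C * round_trip_time_s

        def to_cartesian(range_m, azimuth_deg, elevation_deg):
            """Convert one scanned return (range + scan angles) to an x, y, z point."""
            az = math.radians(azimuth_deg)
            el = math.radians(elevation_deg)
            x = range_m * math.cos(el) * math.cos(az)
            y = range_m * math.cos(el) * math.sin(az)
            z = range_m * math.sin(el)
            return x, y, z

        if __name__ == "__main__":
            r = tof_range_m(100e-9)   # 100 ns round trip -> about 15 m
            print(round(r, 2), to_cartesian(r, azimuth_deg=45.0, elevation_deg=5.0))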

  1. Nasa's Ant-Inspired Swarmie Robots

    NASA Technical Reports Server (NTRS)

    Leucht, Kurt W.

    2016-01-01

    As humans push further beyond the grasp of earth, robotic missions in advance of human missions will play an increasingly important role. These robotic systems will find and retrieve valuable resources as part of an in-situ resource utilization (ISRU) strategy. They will need to be highly autonomous while maintaining high task performance levels. NASA Kennedy Space Center has teamed up with the Biological Computation Lab at the University of New Mexico to create a swarm of small, low-cost, autonomous robots to be used as a ground-based research platform for ISRU missions. The behavior of the robot swarm mimics the central-place foraging strategy of ants to find and collect resources in a previously unmapped environment and return those resources to a central site. This talk will guide the audience through the Swarmie robot project from its conception by students in a New Mexico research lab to its robot trials in an outdoor parking lot at NASA. The software technologies and techniques used on the project will be discussed, as well as various challenges and solutions that were encountered by the development team along the way.
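
    Central-place foraging of the kind mimicked by the swarm (search for a resource, collect it, return it to a central site) is naturally written as a small state machine. The sketch below is a generic illustration of that behaviour cycle; its states, transitions and sensor hooks are assumptions and do not reproduce the Swarmie code base.

        import random

        # Generic central-place foraging state machine (search -> pickup -> return -> drop).
        # States, transitions and sensor hooks are illustrative assumptions.

        def forage_step(state, resource_seen, at_home):
            """Return (next_state, action) for one control cycle."""
            if state == "SEARCH":
                return ("PICKUP", "stop") if resource_seen else ("SEARCH", "random_walk")
            if state == "PICKUP":
                return ("RETURN", "grip_resource")
            if state == "RETURN":
                return ("DROP", "open_gripper") if at_home else ("RETURN", "drive_to_home")
            if state == "DROP":
                return ("SEARCH", "random_walk")
            raise ValueError(f"unknown state: {state}")

        if __name__ == "__main__":
            state = "SEARCH"
            for _ in range(10):
                state, action = forage_step(state,
                                            resource_seen=random.random() < 0.3,
                                            at_home=random.random() < 0.5)
                print(state, action)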

  2. The Summer Robotic Autonomy Course

    NASA Technical Reports Server (NTRS)

    Nourbakhsh, Illah R.

    2002-01-01

    We offered a first Robotic Autonomy course this summer, located at NASA/Ames' new NASA Research Park, for approximately 30 high school students. In this 7-week course, students worked in ten teams to build and then program advanced autonomous robots capable of visual processing and high-speed wireless communication. The course made use of challenge-based curricula, culminating each week with a Wednesday Challenge Day and a Friday Exhibition and Contest Day. Robotic Autonomy provided a comprehensive grounding in elementary robotics, including basic electronics, electronics evaluation, microprocessor programming, real-time control, and robot mechanics and kinematics. Our course then continued the educational process by introducing higher-level perception, action and autonomy topics, including teleoperation, visual servoing, intelligent scheduling and planning and cooperative problem-solving. We were able to deliver such a comprehensive, high-level education in robotic autonomy for two reasons. First, the content resulted from close collaboration between the CMU Robotics Institute and researchers in the Information Sciences and Technology Directorate and various education program/project managers at NASA/Ames. This collaboration produced not only educational content, but will also be focal to the conduct of formative and summative evaluations of the course for further refinement. Second, CMU rapid prototyping skills as well as the PI's low-overhead perception and locomotion research projects enabled design and delivery of affordable robot kits with unprecedented sensory-locomotory capability. Each Trikebot robot was capable of both indoor locomotion and high-speed outdoor motion and was equipped with a high-speed vision system coupled to a low-cost pan/tilt head. As planned, following the completion of Robotic Autonomy, each student took home an autonomous, competent robot. This robot is the student's to keep, as she explores robotics with an extremely capable tool in the

  3. Vertical stream curricula integration of problem-based learning using an autonomous vacuum robot in a mechatronics course

    NASA Astrophysics Data System (ADS)

    Chin, Cheng; Yue, Keng

    2011-10-01

    Difficulties in teaching a multi-disciplinary subject such as the mechatronics system design module in Departments of Mechatronics Engineering at Temasek Polytechnic arise from the gap in experience and skill among staff and students who have different backgrounds in mechanical, computer and electrical engineering within the Mechatronics Department. The departments piloted a new vertical stream curricula model (VSCAM) to enhance student learning in mechatronics system design through integration of educational activities from the first to the second year of the course. In this case study, a problem-based learning (PBL) method on an autonomous vacuum robot in the mechatronics systems design module was proposed to allow the students to have hands-on experience in the mechatronics system design. The proposed works included in PBL consist of seminar sessions, weekly works and project presentation to provide holistic assessment on teamwork and individual contributions. At the end of VSCAM, an integrative evaluation was conducted using confidence logs, attitude surveys and questionnaires. It was found that the activities were quite appreciated by the participating staff and students. Hence, PBL has served as an effective pedagogical framework for teaching multidisciplinary subjects in mechatronics engineering education if adequate guidance and support are given to staff and students.

  4. Semi-autonomous robots for reactor containments. Annual summary report, [1993--1994

    SciTech Connect

    Not Available

    1994-05-06

    During 1993, the activity at the University was split into two primary groups. One group provided direct support for the development and testing of the RVIR vehicle. This effort culminated in a demonstration of the vehicle at ORNL during December. The second group of researchers focused attention on pushing the technology forward in the areas of radiation imaging, navigation, and sensing modalities. A major effort in technology transfer took place during this year. All of these efforts are reflected in the periodic progress reports, which are attached. During 1994, our attention will change from the Nuclear Energy program to the Environmental Restoration and Waste Management office. The immediate needs of the Robotics Technology Development Program within the Office of Technology Development of EM drove this change in target applications. The University will be working closely with the national laboratories to further develop and transfer existing technologies to mobile platforms which are currently being designed and employed in seriously hazardous environments.

  5. Design of a dynamic test platform for autonomous robot vision systems

    NASA Technical Reports Server (NTRS)

    Rich, G. C.

    1980-01-01

    The concept and design of a dynamic test platform for development and evaluation of a robot vision system is discussed. The platform is to serve as a diagnostic and developmental tool for future work with the RPI Mars Rover's multi laser/multi detector vision system. The platform allows testing of the vision system while its attitude is varied, statically or periodically. The vision system is mounted on the test platform. It can then be subjected to a wide variety of simulated motions and can thus be examined in a controlled, quantitative fashion. Defining and modeling Rover motions and designing the platform to emulate these motions are also discussed. Individual aspects of the design process, such as the structure, driving linkages, and motors and transmissions, are treated separately.

  6. A novel autonomous, bioinspired swimming robot developed by neuroscientists and bioengineers.

    PubMed

    Stefanini, C; Orofino, S; Manfredi, L; Mintchev, S; Marrazza, S; Assaf, T; Capantini, L; Sinibaldi, E; Grillner, S; Wallén, P; Dario, P

    2012-06-01

    This paper describes the development of a new biorobotic platform inspired by the lamprey. Design, fabrication and implemented control are all based on biomechanical and neuroscientific findings on this eel-like fish. The lamprey model has been extensively studied and characterized in recent years because it possesses all basic functions and control mechanisms of higher vertebrates, while at the same time having fewer neurons and simplified neural structures. The untethered robot has a flexible body driven by compliant actuators with proprioceptive feedback. It also has binocular vision for vision-based navigation. The platform has been successfully and extensively tested experimentally in aquatic environments, has high energy efficiency and is ready to be used as an investigation tool for high-level motor tasks.

  7. Robotics

    NASA Technical Reports Server (NTRS)

    1985-01-01

    An overview of research being done into the use of robotic devices in space by MSFC is discussed. The video includes footage and explanations of robots being used to blast layers of thermal coating from the Space Shuttle's external tanks, the Shuttle's Remote Manipulator Arm, and animations of an Orbiting Maneuvering Vehicle to retrieve and repair satellites.

  8. Effects of robot-driven gait orthosis treadmill training on the autonomic response in rehabilitation-responsive stroke and cervical spondylotic myelopathy patients.

    PubMed

    Magagnin, Valentina; Bo, Ivano; Turiel, Maurizio; Fornari, Maurizio; Caiani, Enrico G; Porta, Alberto

    2010-06-01

    Body weight supported treadmill training (BWSTT) assisted by a robot-driven gait orthosis is utilized in rehabilitation of individuals with lost motor skills. A typical rehabilitation session included sitting, standing, suspension, robotic-assisted walking at 1.5 and 2.5 km/h, both with 50% body weight support, and recovery. While the effects of robotic-assisted BWSTT on motor performance have been studied in depth, its influence on cardiovascular control is still unknown. The aim of the study was to evaluate in stroke (ST) and cervical spondylotic myelopathy (CSM) patients: (1) the autonomic response during a traditional robotic-assisted BWSTT session of motor rehabilitation; (2) the effects of 30 daily sessions of BWSTT on cardiovascular regulation. The autonomic response was assessed through symbolic analysis of short-term heart rate variability in 11 pathologic subjects (5 ST and 6 CSM patients) whose motor skills were improved as a result of the rehabilitation therapy. Results showed variable individual responses to the rehabilitation session in ST patients at the beginning of the therapy. At the end of the rehabilitation process, the responses of ST patients were less variable and more similar to those previously observed in healthy subjects. CSM patients exhibited an exaggerated vagal response to the fastest walking phase during the first rehabilitative session. This abnormal response was limited after the last rehabilitative session. We conclude that robotic-assisted BWSTT is helpful in restoring cardiovascular control in rehabilitation-responsive ST patients and limiting vagal responses in rehabilitation-responsive CSM patients.
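
    Symbolic analysis of short-term heart rate variability, as used above, typically quantizes successive beat-to-beat intervals into a small alphabet and classifies three-beat words by the number of symbol changes. The sketch below follows that general recipe with assumed quantization settings (the published method additionally splits the two-variation family into like and unlike variations); it does not reproduce the exact parameters of this study.

        # Generic symbolic analysis of an RR-interval series: quantize samples into
        # 6 levels, form 3-beat words, and classify each word by how many times the
        # symbol changes (0V, 1V, 2V families). Level count and window length are
        # common choices in the literature, assumed here rather than taken from the paper.

        def symbolize(rr_ms, levels=6):
            lo, hi = min(rr_ms), max(rr_ms)
            span = (hi - lo) or 1.0
            return [min(int((x - lo) / span * levels), levels - 1) for x in rr_ms]

        def word_families(symbols):
            counts = {"0V": 0, "1V": 0, "2V": 0}
            for i in range(len(symbols) - 2):
                a, b, c = symbols[i:i + 3]
                variations = int(a != b) + int(b != c)
                counts[f"{variations}V"] += 1
            return counts

        if __name__ == "__main__":
            rr = [812, 805, 798, 850, 860, 855, 790, 810, 830]   # ms, synthetic series
            print(word_families(symbolize(rr)))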

  9. A multimodal interface for real-time soldier-robot teaming

    NASA Astrophysics Data System (ADS)

    Barber, Daniel J.; Howard, Thomas M.; Walter, Matthew R.

    2016-05-01

    Recent research and advances in robotics have led to the development of novel platforms leveraging new sensing capabilities for semantic navigation. As these systems become increasingly robust, they support highly complex commands beyond direct teleoperation and waypoint finding, facilitating a transition away from robots as tools toward robots as teammates. Supporting future Soldier-Robot teaming requires communication capabilities on par with those of human-human teams for successful integration of robots. Therefore, as robots increase in functionality, it is equally important that the interface between the Soldier and robot advances as well. Multimodal communication (MMC) enables human-robot teaming through redundancy and levels of communication more robust than single-mode interaction. Commercial-off-the-shelf (COTS) technologies released in recent years for smart-phones and gaming provide tools for the creation of portable interfaces incorporating MMC through the use of speech, gestures, and visual displays. However, for multimodal interfaces to be successfully used in the military domain, they must be able to classify speech and gestures and process natural language in real time with high accuracy. For the present study, a prototype multimodal interface supporting real-time interactions with an autonomous robot was developed. This device integrated COTS Automated Speech Recognition (ASR), a custom gesture recognition glove, and natural language understanding on a tablet. This paper presents performance results (e.g. response times, accuracy) of the integrated device when commanding an autonomous robot to perform reconnaissance and surveillance activities in an unknown outdoor environment.

  10. Automatic learning rate adjustment for self-supervising autonomous robot control

    NASA Technical Reports Server (NTRS)

    Arras, Michael K.; Protzel, Peter W.; Palumbo, Daniel L.

    1992-01-01

    Described is an application in which an Artificial Neural Network (ANN) controls the positioning of a robot arm with five degrees of freedom by using visual feedback provided by two cameras. This application and the specific ANN model, local linear maps, are based on the work of Ritter, Martinetz, and Schulten. We extended their approach by generating a filtered, average positioning error from the continuous camera feedback and by coupling the learning rate to this error. When the network learns to position the arm, the positioning error decreases and so does the learning rate, until the system stabilizes at a minimum error and learning rate. This abolishes the need for a predetermined cooling schedule. The automatic cooling procedure results in a closed-loop control with no distinction between a learning phase and a production phase. If the positioning error suddenly starts to increase due to an internal failure such as a broken joint, or an environmental change such as a camera moving, the learning rate increases accordingly. Thus, learning is automatically activated and the network adapts to the new condition, after which the error decreases again and learning is 'shut off'. The automatic cooling is therefore a prerequisite for the autonomy and the fault tolerance of the system.
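
    The core mechanism described above, a learning rate tied to a filtered positioning error so that learning re-activates whenever the error grows, can be sketched in a few lines. The smoothing constant and the linear coupling below are illustrative assumptions, not the values used in the reported system.

        # Sketch of coupling a learning rate to a low-pass-filtered positioning error.
        # The smoothing constant and the linear coupling are illustrative assumptions.

        class ErrorCoupledRate:
            def __init__(self, gain=0.5, smoothing=0.9, max_rate=1.0):
                self.gain = gain            # learning rate per unit of filtered error
                self.smoothing = smoothing  # exponential filter constant
                self.max_rate = max_rate
                self.filtered_error = 0.0

            def update(self, positioning_error):
                """Feed one new positioning error; return the current learning rate."""
                self.filtered_error = (self.smoothing * self.filtered_error
                                       + (1.0 - self.smoothing) * abs(positioning_error))
                return min(self.max_rate, self.gain * self.filtered_error)

        if __name__ == "__main__":
            rate = ErrorCoupledRate()
            for err in [2.0, 1.5, 1.0, 0.5, 0.2, 0.1, 3.0]:   # error rises again at the end
                print(round(rate.update(err), 3))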

  11. Image processing for navigation on a mobile embedded platform: design of an autonomous mobile robot

    NASA Astrophysics Data System (ADS)

    Loose, Harald; Lemke, Christiane; Papazov, Chavdar

    2006-02-01

    This paper deals with intelligent mobile platforms connected to a camera controlled by a small hardware platform called RCUBE. This platform is able to provide the features of a typical actuator-sensor board, with various inputs and outputs as well as computing power and image recognition capabilities. Several intelligent autonomous RCUBE devices can be equipped and programmed to participate in the BOSPORUS network. These components form an intelligent network for gathering sensor and image data, sensor data fusion, navigation and control of mobile platforms. The RCUBE platform provides a standalone solution for image processing, which will be explained and presented. It plays a major role for several components in a reference implementation of the BOSPORUS system. On the one hand, intelligent cameras will be positioned in the environment, analyzing events from a fixed point of view and sharing their perceptions with other components in the system. On the other hand, image processing results will contribute to a reliable navigation of the mobile system, which is crucially important. Fixed landmarks and other objects appropriate for determining the position of a mobile system can be recognized. For navigation, other methods are added, i.e. GPS calculations and odometers.

  12. LandingNav: a precision autonomous landing sensor for robotic platforms on planetary bodies

    NASA Astrophysics Data System (ADS)

    Katake, Anup; Bruccoleri, Chrisitian; Singla, Puneet; Junkins, John L.

    2010-01-01

    Increased interest in the exploration of extra terrestrial planetary bodies calls for an increase in the number of spacecraft landing on remote planetary surfaces. Currently, imaging and radar based surveys are used to determine regions of interest and a safe landing zone. The purpose of this paper is to introduce LandingNav, a sensor system solution for autonomous landing on planetary bodies that enables landing on unknown terrain. LandingNav is based on a novel multiple field of view imaging system that leverages the integration of different state of the art technologies for feature detection, tracking, and 3D dense stereo map creation. In this paper we present the test flight results of the LandingNav system prototype. Sources of errors due to hardware limitations and processing algorithms were identified and will be discussed. This paper also shows that addressing the issues identified during the post-flight test data analysis will reduce the error down to 1-2%, thus providing for a high precision 3D range map sensor system.

  13. Outdoor allergens.

    PubMed Central

    Burge, H A; Rogers, C A

    2000-01-01

    Outdoor allergens are an important part of the exposures that lead to allergic disease. Understanding the role of outdoor allergens requires a knowledge of the nature of outdoor allergen-bearing particles, the distributions of their source, and the nature of the aerosols (particle types, sizes, dynamics of concentrations). Primary sources for outdoor allergens include vascular plants (pollen, fern spores, soy dust), and fungi (spores, hyphae). Nonvascular plants, algae, and arthropods contribute small numbers of allergen-bearing particles. Particles are released from sources into the air by wind, rain, mechanical disturbance, or active discharge mechanisms. Once airborne, they follow the physical laws that apply to all airborne particles. Although some outdoor allergens penetrate indoor spaces, exposure occurs mostly outdoors. Even short-term peak outdoor exposures can be important in eliciting acute symptoms. Monitoring of airborne biological particles is usually by particle impaction and microscopic examination. Centrally located monitoring stations give regional-scale measurements for aeroallergen levels. Evidence for the role of outdoor allergens in allergic rhinitis is strong and is rapidly increasing for a role in asthma. Pollen and fungal spore exposures have both been implicated in acute exacerbations of asthma, and sensitivity to some fungal spores predicts the existence of asthma. Synergism and/or antagonism probably occurs with other outdoor air particles and gases. Control involves avoidance of exposure (staying indoors, preventing entry of outdoor aerosols) as well as immunotherapy, which is effective for pollen but of limited effect for spores. Outdoor allergens have been the subject of only limited studies with respect to the epidemiology of asthma. Much remains to be studied with respect to prevalence patterns, exposure and disease relationships, and control. PMID:10931783

  14. Outdoor Mathematics

    ERIC Educational Resources Information Center

    Kennard, Jackie

    2007-01-01

    One of the most interesting developments in teaching has been the growing importance of the outdoor environment. Whether it be playground, garden or field, the outdoors offers a range of challenging experiences, especially in the delivery of early mathematics. Oral feedback to parents, together with photographic displays, can show them that…

  15. Outdoor Classrooms

    ERIC Educational Resources Information Center

    Mayes, Valynda

    2010-01-01

    An outdoor classroom is the ideal vehicle for community involvement: Parents, native plant societies, 4-H, garden clubs, and master naturalists are all resources waiting to be tapped, as are local businesses offering support. If you enlist your community in the development and maintenance of your outdoor classroom, the entire community will…

  16. Performance of a scanning laser line striper in outdoor lighting

    NASA Astrophysics Data System (ADS)

    Mertz, Christoph

    2013-05-01

    For search and rescue robots and reconnaissance robots it is important to detect objects in their vicinity. We have developed a scanning laser line striper that can produce dense 3D images using active illumination. The scanner consists of a camera and a MEMS-micro mirror based projector. It can also detect the presence of optically difficult material like glass and metal. The sensor can be used for autonomous operation or it can help a human operator to better remotely control the robot. In this paper we will evaluate the performance of the scanner under outdoor illumination, i.e. from operating in the shade to operating in full sunlight. We report the range, resolution and accuracy of the sensor and its ability to reconstruct objects like grass, wooden blocks, wires, metal objects, electronic devices like cell phones, blank RPG, and other inert explosive devices. Furthermore we evaluate its ability to detect the presence of glass and polished metal objects. Lastly we report on a user study that shows a significant improvement in a grasping task. The user is tasked with grasping a wire with the remotely controlled hand of a robot. We compare the time it takes to complete the task using the 3D scanner with using a traditional video camera.
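
    A scanner built from a camera and a projected laser line, as described above, generally recovers range by triangulating between the projector ray and the camera ray. The planar law-of-sines sketch below illustrates that principle with an assumed baseline and example angles; it is not the actual geometry of this sensor.

        import math

        # Planar triangulation for a camera + laser-line projector separated by a
        # known baseline. Angles are measured from the baseline toward the target.
        # Baseline and example angles are illustrative, not the sensor's real values.

        def triangulate_range(baseline_m, camera_angle_deg, projector_angle_deg):
            """Distance from the camera to the illuminated point (law of sines)."""
            a = math.radians(camera_angle_deg)
            p = math.radians(projector_angle_deg)
            return baseline_m * math.sin(p) / math.sin(a + p)

        if __name__ == "__main__":
            # 10 cm baseline, both rays ~80 degrees from the baseline -> point ~0.29 m away.
            print(round(triangulate_range(0.10, 80.0, 80.0), 3))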

  17. Development of dog-like retrieving capability in a ground robot

    NASA Astrophysics Data System (ADS)

    MacKenzie, Douglas C.; Ashok, Rahul; Rehg, James M.; Witus, Gary

    2013-01-01

    This paper presents the Mobile Intelligence Team's approach to addressing the CANINE outdoor ground robot competition. The competition required developing a robot that provided retrieving capabilities similar to a dog, while operating fully autonomously in unstructured environments. The vision team consisted of Mobile Intelligence, the Georgia Institute of Technology, and Wayne State University. Important computer vision aspects of the project were quickly learning the distinguishing characteristics of novel objects, searching images for the object as the robot drove a search pattern, identifying people near the robot for safe operation, correctly identifying the object among distractors, and localizing the object for retrieval. The classifier used to identify the objects will be discussed, including an analysis of its performance, and an overview of the entire system architecture is presented. A discussion of the robot's performance in the competition will demonstrate the system's successes in real-world testing.

  18. Does It "Want" or "Was It Programmed to..."? Kindergarten Children's Explanations of an Autonomous Robot's Adaptive Functioning

    ERIC Educational Resources Information Center

    Levy, Sharona T.; Mioduser, David

    2008-01-01

    This study investigates young children's perspectives in explaining a self-regulating mobile robot, as they learn to program its behaviors from rules. We explore their descriptions of a robot in action to determine the nature of their explanatory frameworks: psychological or technological. We have also studied the role of an adult's intervention…

  19. Effects of automation and task load on task switching during human supervision of multiple semi-autonomous robots in a dynamic environment.

    PubMed

    Squire, P N; Parasuraman, R

    2010-08-01

    The present study assessed the impact of task load and level of automation (LOA) on task switching in participants supervising a team of four or eight semi-autonomous robots in a simulated 'capture the flag' game. Participants were faster to perform the same task than when they chose to switch between different task actions. They also took longer to switch between different tasks when supervising the robots at a high compared to a low LOA. Task load, as manipulated by the number of robots to be supervised, did not influence switch costs. The results suggest that the design of future unmanned vehicle (UV) systems should take into account not simply how many UVs an operator can supervise, but also the impact of LOA and task operations on task switching during supervision of multiple UVs. The findings of this study are relevant for the ergonomics practice of UV systems. This research extends the cognitive theory of task switching to inform the design of UV systems and results show that switching between UVs is an important factor to consider.

  20. CASSY Robot

    NASA Astrophysics Data System (ADS)

    Pittman, Anna; Wright, Ann; Rice, Aaron; Shyaka, Claude

    2014-03-01

    The CASSY Robot project involved two square robots coded in RobotC. The goal was to code a robot to do a certain set of tasks autonomously. To begin with, our task was to code the robot so that it would roam a certain area, marked off by black tape. When the robot hit the black tape, it knew to back up and turn around. It was able to do this thanks to the light sensor that was attached to the bottom of the robot. Also, whenever the robot hit an obstacle, it knew to stop, back up, and turn around. This was primarily to prevent the robot from hurting itself if it hit an obstacle. This was accomplished by using touch sensors set up as bumpers. Once that was accomplished, we attached sonar sensors and created code so that one robot was able to find and track the other robot in a sort of intruder/police scenario. The overall goal of this project was to code the robot so that we can test it against a robot coded exactly the same, but using Layered Mode Selection Logic. Professor.

  1. Proceedings of the 1989 CESAR/CEA (Center for Engineering Systems Advanced Research/Commissariat a l'Energie Atomique) workshop on autonomous mobile robots (May 30--June 1, 1989)

    SciTech Connect

    Harber, K.S.; Pin, F.G. . Center for Engineering Systems Advanced Research)

    1990-03-01

    The US DOE Center for Engineering Systems Advanced Research (CESAR) at the Oak Ridge National Laboratory (ORNL) and the Commissariat a l'Energie Atomique's (CEA) Office de Robotique et Productique within the Directorat a la Valorization are working toward a long-term cooperative agreement and relationship in the area of Intelligent Systems Research (ISR). This report presents the proceedings of the first CESAR/CEA Workshop on Autonomous Mobile Robots, which took place at ORNL on May 30, 31 and June 1, 1989. The purpose of the workshop was to present and discuss methodologies and algorithms under development at the two facilities in the area of perception and navigation for autonomous mobile robots in unstructured environments. Experimental demonstration of the algorithms and comparison of some of their features were proposed to take place within the framework of a previously mutually agreed-upon demonstration scenario, or 'base-case.' The base-case scenario, described in detail in Appendix A, involved autonomous navigation by the robot in an a priori unknown environment with dynamic obstacles, in order to reach a predetermined goal. From the intermediate goal location, the robot had to search for and locate a control panel, move toward it, and dock in front of the panel face. The CESAR demonstration was successfully accomplished using the HERMIES-IIB robot, while subsets of the CEA demonstration performed using the ARES robot simulation and animation system were presented. The first session of the workshop focused on these experimental demonstrations and on the needs and considerations for establishing 'benchmarks' for testing autonomous robot control algorithms.

  2. Autonomous Mobile Robots.

    DTIC Science & Technology

    1986-01-30

    accurate, very maneuverable. problem of building a generally intelligent machine. Among self-powered vehicle carrying a small manipulator. Pluto living...drive system to carry on the longer various tasks. range work stalled in Pluto. We are working on a special-purpose manipulator for grasping doorknobs and...avoiding obstacles en route. This would encour- tions for use on board the vehicles. age efforts in stereo, sonar, path planning, and vision-based Pluto

  3. Autonomous Robotic Scientist

    NASA Astrophysics Data System (ADS)

    Woods, M.; Ward, R.; Honary, E.; Barnes, D.; Pullan, D.; Long, D.; Draper, C.

    2007-08-01

    The success of the recent NASA interplanetary space mission MER (NASA, 2007) has highlighted the importance of using a roving science platform for exploration. Near and long-term requirements for future interplanetary missions place increasing demands on rover performance to extract maximum benefit from the large effort and funding committed to such missions. Missions are more diverse in their science objectives and require improved robustness and reliability over longer distances during surface operations. To keep pace with these complex and evolving requirements it is essential that the level of autonomy used on future missions be increased in order to improve the responsiveness of historical operations models, which are biased towards an open-loop response for high-level analysis and decision making. The driver of the CREST rover development initiative is the need to achieve: more accurate delivery of science instrumentation, new science opportunities, large increases in the science data returned to Earth, more robust and reliable operations, and more efficient use of operational resources. This paper presents our work and results obtained to date.

  4. On approximate reasoning and minimal models for the development of robust outdoor vehicle navigation schemes

    SciTech Connect

    Pin, F.G.

    1993-11-01

    Outdoor sensor-based operation of autonomous robots has proven to be an extremely challenging problem, mainly because of the difficulties encountered when attempting to represent the many uncertainties which are always present in the real world. These uncertainties are primarily due to sensor imprecision and unpredictability of the environment, i.e., lack of full knowledge of the environment's characteristics and dynamics. Two basic principles, or philosophies, and their associated methodologies are proposed in an attempt to remedy some of these difficulties. The first principle is based on the concept of a 'minimal model' for accomplishing given tasks and proposes to utilize only the minimum level of information and precision necessary to accomplish the elemental functions of complex tasks. This approach diverges completely from the direction taken by most artificial vision studies, which conventionally call for crisp and detailed analysis of every available component in the perception data. The paper will first review the basic concepts of this approach and will discuss its pragmatic feasibility when embodied in a behaviorist framework. The second principle which is proposed deals with implicit representation of uncertainties using Fuzzy Set Theory-based approximations and approximate reasoning, rather than explicit (crisp) representation through calculation and conventional propagation techniques. A framework which merges these principles and approaches is presented, and its application to the problem of sensor-based outdoor navigation of a mobile robot is discussed. Results of navigation experiments with a real car in actual outdoor environments are also discussed to illustrate the feasibility of the overall concept.
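
    Approximate reasoning over imprecise range readings of the kind advocated above can be illustrated with a pair of fuzzy rules and a weighted-average defuzzification. The membership shapes and output angles in the sketch below are invented for the example; they are not the rule base used on the reported vehicle.

        # Tiny fuzzy-rule sketch: "IF obstacle is NEAR THEN turn hard; IF obstacle is
        # FAR THEN go straight", defuzzified by a weighted average. Membership shapes
        # and output angles are illustrative assumptions.

        def mu_near(d, full=0.5, zero=2.0):
            """Membership of 'near': 1 below `full` metres, 0 above `zero` metres."""
            if d <= full:
                return 1.0
            if d >= zero:
                return 0.0
            return (zero - d) / (zero - full)

        def mu_far(d, full=0.5, zero=2.0):
            return 1.0 - mu_near(d, full, zero)

        def steering_deg(obstacle_range_m):
            w_near, w_far = mu_near(obstacle_range_m), mu_far(obstacle_range_m)
            turn_hard, go_straight = 40.0, 0.0   # rule consequents (degrees)
            return (w_near * turn_hard + w_far * go_straight) / (w_near + w_far)

        if __name__ == "__main__":
            for d in [0.3, 1.0, 1.8, 3.0]:
                print(d, round(steering_deg(d), 1))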

  5. The digital code driven autonomous synthesis of ibuprofen automated in a 3D-printer-based robot.

    PubMed

    Kitson, Philip J; Glatzel, Stefan; Cronin, Leroy

    2016-01-01

    An automated synthesis robot was constructed by modifying an open source 3D printing platform. The resulting automated system was used to 3D print reaction vessels (reactionware) of differing internal volumes using polypropylene feedstock via a fused deposition modeling 3D printing approach and subsequently make use of these fabricated vessels to synthesize the nonsteroidal anti-inflammatory drug ibuprofen via a consecutive one-pot three-step approach. The synthesis of ibuprofen could be achieved on different scales simply by adjusting the parameters in the robot control software. The software for controlling the synthesis robot was written in the python programming language and hard-coded for the synthesis of ibuprofen by the method described, opening possibilities for the sharing of validated synthetic 'programs' which can run on similar low cost, user-constructed robotic platforms towards an 'open-source' regime in the area of chemical synthesis.

  6. The digital code driven autonomous synthesis of ibuprofen automated in a 3D-printer-based robot

    PubMed Central

    Kitson, Philip J; Glatzel, Stefan

    2016-01-01

    An automated synthesis robot was constructed by modifying an open source 3D printing platform. The resulting automated system was used to 3D print reaction vessels (reactionware) of differing internal volumes using polypropylene feedstock via a fused deposition modeling 3D printing approach and subsequently make use of these fabricated vessels to synthesize the nonsteroidal anti-inflammatory drug ibuprofen via a consecutive one-pot three-step approach. The synthesis of ibuprofen could be achieved on different scales simply by adjusting the parameters in the robot control software. The software for controlling the synthesis robot was written in the python programming language and hard-coded for the synthesis of ibuprofen by the method described, opening possibilities for the sharing of validated synthetic ‘programs’ which can run on similar low cost, user-constructed robotic platforms towards an ‘open-source’ regime in the area of chemical synthesis. PMID:28144350

  7. Construction of Human Habitation Facility on Mars Using Low-Power Low-Mass Autonomous Robotic System

    NASA Astrophysics Data System (ADS)

    Bar-Cohen, Y.; Bao, X.; Badescu, M.; Beegle, L.; Sherrit, S.; Zacny, K.

    2012-06-01

    Critical to human operations on Mars upon landing is the availability of an established infrastructure. A percussive fabrication system that produces blocks and works autonomously with a rover and its robotic arm would address this need.

  8. Dynamic multisensor fusion for mobile robot navigation in an indoor environment

    NASA Astrophysics Data System (ADS)

    Jin, Taeseok; Lee, Jang-Myung; Luk, Bing L.; Tso, Shiu K.

    2001-10-01

    This study is a preliminary step toward developing a multi-purpose, autonomous, robust carrier mobile robot to transport trolleys or heavy goods and serve as a robotic nursing assistant in hospital wards. The aim of this paper is to present the use of multi-sensor data fusion (sonar, CCD camera and IR sensors) for a map-building mobile robot to navigate, and to present an experimental mobile robot designed to operate autonomously within both indoor and outdoor environments. Smart sensory systems are crucial for successful autonomous systems. We explain the robot system architecture designed and implemented in this study and give only a short review of existing techniques, since several recent thorough books and review papers exist on this topic. Instead we focus on the main results with relevance to the intelligent service robot project at the Centre of Intelligent Design, Automation & Manufacturing (CIDAM). We conclude by discussing some possible future extensions of the project. The general principle of the navigation and guidance architecture is dealt with first, then the detailed functions for updating the recognized environment, obstacle detection and motion assessment, with the first results from the simulation runs.

  9. Outdoor Integration

    ERIC Educational Resources Information Center

    Tatarchuk, Shawna; Eick, Charles

    2011-01-01

    An outdoor classroom is an exciting way to connect the learning of science to nature and the environment. Many school grounds include gardens, grassy areas, courtyards, and wooded areas. Some even have nearby streams or creeks. These are built-in laboratories for inquiry! In the authors' third-grade classroom, they align and integrate…

  10. Outdoor Activities.

    ERIC Educational Resources Information Center

    Minneapolis Independent School District 275, Minn.

    Twenty-four activities suitable for outdoor use by elementary school children are outlined. Activities designed to make children aware of their environment include soil painting, burr collecting, insect and pond water collecting, studies of insect galls and field mice, succession studies, and a model of natural selection using dyed toothpicks. A…

  11. Semi-Autonomous Collaborative Control of Multi-Robotic Systems for Multi-Task Multi-Target Pairing

    DTIC Science & Technology

    2011-11-01

    this paper proposes a control method for a single-master multi-slave (SMMS) teleoperator to cooperatively control a team of mobile robots for a multi...SMMS) teleoperator to cooperatively control a team of mobile robots for a multi-target mission. The major components of the proposed control method...required human resources and amplifying the human effort, the single-master multi-slave (SMMS) teleoperation has been considered in this paper. Fong et

  12. Simulation of the outdoor energy efficiency of an autonomous solar kit based on meteorological data for a site in Central Europa

    NASA Astrophysics Data System (ADS)

    Bouzaki, Mohammed Moustafa; Chadel, Meriem; Benyoucef, Boumediene; Petit, Pierre; Aillerie, Michel

    2016-07-01

    This contribution analyzes the energy provided by a solar kit dedicated to autonomous usage and installed in Central Europe (longitude 6.10°, latitude 49.21°, altitude 160 m), using the simulation software PVSYST. We focused the analysis on the effect of temperature and solar irradiation on the I-V characteristic of a commercial PV panel. We also consider in this study the influence of charging and discharging the battery on the generator efficiency. Meteorological data are integrated into the simulation software. As expected, the solar kit provides an energy output that varies over the year, with a minimum in December. In the proposed approach, we consider this minimum as the lowest acceptable energy level to satisfy the use. Thus, for the other months, a loss of available renewable energy exists if no storage system is associated.
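
    The dependence of panel output on irradiance and cell temperature analyzed above is often approximated with a first-order model in which power scales with irradiance and de-rates linearly with temperature above 25 °C. The coefficients in the sketch below are typical crystalline-silicon values assumed for illustration, not the parameters of the simulated kit or of PVSYST.

        # First-order PV output model: power proportional to irradiance, de-rated
        # linearly with cell temperature above STC (25 C). Coefficients are typical
        # values assumed for illustration.

        P_STC_W = 100.0          # rated panel power at 1000 W/m2 and 25 C
        TEMP_COEFF = -0.004      # fractional power change per degree C above 25 C

        def panel_power_w(irradiance_w_m2, cell_temp_c):
            derate = 1.0 + TEMP_COEFF * (cell_temp_c - 25.0)
            return max(0.0, P_STC_W * (irradiance_w_m2 / 1000.0) * derate)

        if __name__ == "__main__":
            # Illustrative winter (low sun, cool) vs summer (high sun, hot) conditions.
            print(round(panel_power_w(250.0, 5.0), 1))    # winter
            print(round(panel_power_w(900.0, 45.0), 1))   # summer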

  13. The Design and Implementation of a Semi-Autonomous Surf-Zone Robot Using Advanced Sensors and a Common Robot Operating System

    DTIC Science & Technology

    2011-06-01

    cockroach and its ability to overcome adverse obstacles. This design concept has been incorporated into several versions of surf-zone robots. An...Saunderson, J. Sattar, L.-a. Torres-Mendez, M. Jenkin, A. German, A. Hogue, A. Ripsman, J. Zacher, E. Milios, H. Liu, P. Zhang, M. Buehler, and C

  14. A Qualitative Approach to Mobile Robot Navigation Using RFID

    NASA Astrophysics Data System (ADS)

    Hossain, M.; Rashid, M. M.; Bhuiyan, M. M. I.; Ahmed, S.; Akhtaruzzaman, M.

    2013-12-01

    Radio Frequency Identification (RFID) systems allow automatic identification of items carrying RFID tags using radio waves. Since each RFID tag has a unique identification number, it is also possible to detect the specific region in which a tag lies. Recently RFID has been widely used in mobile robot navigation, localization, and mapping in both indoor and outdoor environments. This paper presents a navigation strategy for an autonomous mobile robot using a passive RFID system. Conventional approaches, such as landmark-based or dead-reckoning methods with an excessive number of sensors, involve considerable complexity in establishing the navigation and localization process. The proposed method offers a less complex navigation strategy and estimates not only the position but also the orientation of the autonomous robot. In this research, a polar coordinate system is adopted on the navigation surface, where RFID tags are placed in a grid with constant displacements. This paper also presents performance comparisons among various grid architectures through simulation to establish a better solution for the navigation system. In addition, some stationary obstacles are introduced into the navigation environment to verify the viability of the navigation process of the autonomous mobile robot.
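
    With passive tags laid out on a known grid, a robot crossing two tags in succession can estimate both its position (the last tag's known coordinates) and its heading (the bearing from the previous tag to the current one). The sketch below illustrates that idea; the tag map and polar layout are assumed example values, not the configuration evaluated in the paper.

        import math

        # Sketch of tag-based localization: each detected tag ID maps to known polar
        # grid coordinates; position is the last tag's location and heading is inferred
        # from the previous tag to the current one. The tag layout is an assumed example.

        TAG_MAP = {                      # tag id -> (radius_m, angle_deg) on the floor grid
            "A1": (1.0, 0.0),
            "A2": (1.0, 30.0),
            "B1": (2.0, 0.0),
        }

        def polar_to_xy(radius_m, angle_deg):
            a = math.radians(angle_deg)
            return radius_m * math.cos(a), radius_m * math.sin(a)

        def estimate_pose(prev_tag, curr_tag):
            """Return ((x, y), heading_deg) from two successive tag detections."""
            x0, y0 = polar_to_xy(*TAG_MAP[prev_tag])
            x1, y1 = polar_to_xy(*TAG_MAP[curr_tag])
            heading = math.degrees(math.atan2(y1 - y0, x1 - x0))
            return (x1, y1), heading

        if __name__ == "__main__":
            print(estimate_pose("A1", "B1"))   # moved radially outward -> heading ~0 deg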

  15. Autonomous mobile communication relays

    NASA Astrophysics Data System (ADS)

    Nguyen, Hoa G.; Everett, Hobart R.; Manouk, Narek; Verma, Ambrish

    2002-07-01

    Maintaining a solid radio communication link between a mobile robot entering a building and an external base station is a well-recognized problem. Modern digital radios, while affording high bandwidth and Internet-protocol-based automatic routing capabilities, tend to operate on line-of-sight links. The communication link degrades quickly as a robot penetrates deeper into the interior of a building. This project investigates the use of mobile autonomous communication relay nodes to extend the effective range of a mobile robot exploring a complex interior environment. Each relay node is a small mobile slave robot equipped with sonar, ladar, and 802.11b radio repeater. For demonstration purposes, four Pioneer 2-DX robots are used as autonomous mobile relays, with SSC-San Diego's ROBART III acting as the lead robot. The relay robots follow the lead robot into a building and are automatically deployed at various locations to maintain a networked communication link back to the remote operator. With their on-board external sensors, they also act as rearguards to secure areas already explored by the lead robot. As the lead robot advances and RF shortcuts are detected, relay nodes that become unnecessary will be reclaimed and reused, all transparent to the operator. This project takes advantage of recent research results from several DARPA-funded tasks at various institutions in the areas of robotic simulation, ad hoc wireless networking, route planning, and navigation. This paper describes the progress of the first six months of the project.
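
    The deployment behaviour described above reduces to a simple rule: drop a relay when the weakest hop back toward the operator degrades past a threshold, and reclaim a relay when an RF shortcut makes it redundant. The thresholds and signal model in the sketch below are illustrative assumptions, not values from the reported system.

        # Sketch of the drop/reclaim logic for mobile communication relays: deploy a
        # relay when the weakest hop in the chain falls below a threshold, and reclaim
        # a relay when skipping it still leaves an adequate link. Thresholds are
        # illustrative assumptions.

        DEPLOY_THRESHOLD_DBM = -80.0
        RECLAIM_MARGIN_DBM = -70.0

        def should_deploy(weakest_hop_dbm):
            return weakest_hop_dbm < DEPLOY_THRESHOLD_DBM

        def can_reclaim(link_quality_without_relay_dbm):
            return link_quality_without_relay_dbm > RECLAIM_MARGIN_DBM

        if __name__ == "__main__":
            print(should_deploy(-85.0))   # True: link is degrading, drop a relay here
            print(can_reclaim(-65.0))     # True: an RF shortcut makes this relay redundant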

  16. Habituation: a non-associative learning rule design for spiking neurons and an autonomous mobile robots implementation.

    PubMed

    Cyr, André; Boukadoum, Mounir

    2013-03-01

    This paper presents a novel bio-inspired habituation function for robots controlled by an artificial spiking neural network. This non-associative learning rule is modelled at the synaptic level and validated through robotic behaviours in reaction to different stimulus patterns in a dynamic virtual 3D world. Habituation is minimally represented as an attenuated response after exposure to, and perception of, persistent external stimuli. Based on current neuroscience research, the originality of this rule lies in its modulated response to the variable frequencies of the captured stimuli. Filtering out repetitive data, as the natural habituation mechanism does, has been shown to be a key factor in the attention phenomenon, and inserting such a rule operating at multiple temporal scales of the stimuli increases a robot's adaptive behaviours by letting it ignore broader, contextually irrelevant information.
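
    The attenuation-with-recovery behaviour that defines habituation can be sketched with a toy synapse model; the constants and the exact update rule below are assumptions, not the paper's spiking-network implementation.

```python
import math

# Toy habituation rule in the spirit of the record above: synaptic efficacy
# drops with each presynaptic spike and recovers exponentially between spikes,
# so faster (more persistent) stimulus trains drive the response down further.
# All constants and the update rule are illustrative assumptions.
class HabituatingSynapse:
    def __init__(self, w_init=1.0, decay=0.15, recovery_tau=2.0):
        self.w = w_init                   # current efficacy
        self.w_init = w_init
        self.decay = decay                # fractional drop per presynaptic spike
        self.recovery_tau = recovery_tau  # recovery time constant (seconds)
        self.last_spike_t = None

    def on_spike(self, t):
        """Process a presynaptic spike at time t and return the response."""
        if self.last_spike_t is not None:
            dt = t - self.last_spike_t
            # recover toward the initial efficacy during the quiet interval
            self.w = self.w_init - (self.w_init - self.w) * math.exp(-dt / self.recovery_tau)
        response = self.w
        self.w *= (1.0 - self.decay)      # habituate: attenuate future responses
        self.last_spike_t = t
        return response

# A 10 Hz stimulus train habituates faster than a 1 Hz train.
fast, slow = HabituatingSynapse(), HabituatingSynapse()
print([round(fast.on_spike(i * 0.1), 3) for i in range(10)])   # 10 Hz
print([round(slow.on_spike(i * 1.0), 3) for i in range(10)])   # 1 Hz
```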

  17. Demonstration of a semi-autonomous hybrid brain-machine interface using human intracranial EEG, eye tracking, and computer vision to control a robotic upper limb prosthetic.

    PubMed

    McMullen, David P; Hotson, Guy; Katyal, Kapil D; Wester, Brock A; Fifer, Matthew S; McGee, Timothy G; Harris, Andrew; Johannes, Matthew S; Vogelstein, R Jacob; Ravitz, Alan D; Anderson, William S; Thakor, Nitish V; Crone, Nathan E

    2014-07-01

    To increase the ability of brain-machine interfaces (BMIs) to control advanced prostheses such as the modular prosthetic limb (MPL), we are developing a novel system: the Hybrid Augmented Reality Multimodal Operation Neural Integration Environment (HARMONIE). This system utilizes hybrid input, supervisory control, and intelligent robotics to allow users to identify an object (via eye tracking and computer vision) and initiate (via brain-control) a semi-autonomous reach-grasp-and-drop of the object by the MPL. Sequential iterations of HARMONIE were tested in two pilot subjects implanted with electrocorticographic (ECoG) and depth electrodes within motor areas. The subjects performed the complex task in 71.4% (20/28) and 67.7% (21/31) of trials after minimal training. Balanced accuracy for detecting movements was 91.1% and 92.9%, significantly greater than chance accuracies (p < 0.05). After BMI-based initiation, the MPL completed the entire task 100% (one object) and 70% (three objects) of the time. The MPL took approximately 12.2 s for task completion after system improvements implemented for the second subject. Our hybrid-BMI design prevented all but one baseline false positive from initiating the system. The novel approach demonstrated in this proof-of-principle study, using hybrid input, supervisory control, and intelligent robotics, addresses limitations of current BMIs.

  18. Demonstration of a Semi-Autonomous Hybrid Brain-Machine Interface using Human Intracranial EEG, Eye Tracking, and Computer Vision to Control a Robotic Upper Limb Prosthetic

    PubMed Central

    McMullen, David P.; Hotson, Guy; Katyal, Kapil D.; Wester, Brock A.; Fifer, Matthew S.; McGee, Timothy G.; Harris, Andrew; Johannes, Matthew S.; Vogelstein, R. Jacob; Ravitz, Alan D.; Anderson, William S.; Thakor, Nitish V.; Crone, Nathan E.

    2014-01-01

    To increase the ability of brain-machine interfaces (BMIs) to control advanced prostheses such as the modular prosthetic limb (MPL), we are developing a novel system: the Hybrid Augmented Reality Multimodal Operation Neural Integration Environment (HARMONIE). This system utilizes hybrid input, supervisory control, and intelligent robotics to allow users to identify an object (via eye tracking and computer vision) and initiate (via brain-control) a semi-autonomous reach-grasp-and-drop of the object by the MPL. Sequential iterations of HARMONIE were tested in two pilot subjects implanted with electrocorticographic (ECoG) and depth electrodes within motor areas. The subjects performed the complex task in 71.4% (20/28) and 67.7% (21/31) of trials after minimal training. Balanced accuracy for detecting movements was 91.1% and 92.9%, significantly greater than chance accuracies (p < 0.05). After BMI-based initiation, the MPL completed the entire task 100% (one object) and 70% (three objects) of the time. The MPL took approximately 12.2 seconds for task completion after system improvements implemented for the second subject. Our hybrid-BMI design prevented all but one baseline false positive from initiating the system. The novel approach demonstrated in this proof-of-principle study, using hybrid input, supervisory control, and intelligent robotics, addresses limitations of current BMIs. PMID:24760914

  19. Robotics

    NASA Technical Reports Server (NTRS)

    Ambrose, Robert O.

    2007-01-01

    Lunar robotic functions include: 1. transport of crew and payloads on the surface of the moon; 2. offloading payloads from a lunar lander; 3. handling the deployment of surface systems; and 4. human commanding of these functions from inside a lunar vehicle, a habitat, or extravehicular activity (space walk), with Earth-based supervision. The systems that will perform these functions may not look like robots from science fiction. In fact, robotic functions may be performed by automated trucks, cranes, and winches. Use of this equipment prior to the crew's arrival, or in the potentially long periods without crews on the surface, will require that these systems be computer-controlled machines. The public release of NASA's Exploration plans at the 2nd Space Exploration Conference (Houston, December 2006) included a lunar outpost with as many as four unique mobility chassis designs. The sequence of lander offloading tasks involved as many as ten payloads, each with a unique set of geometry, mass, and interface requirements. This plan was refined during a second phase study concluded in August 2007. Among the many improvements to the exploration plan were a reduction in the number of unique mobility chassis designs and a reduction in unique payload specifications. As the lunar surface system payloads have matured, so have the mobility and offloading functional requirements. While the architecture work continues, the community can expect to see functional requirements in the areas of surface mobility, surface handling, and human-systems interaction as follows: Surface Mobility: 1. Transport crew on the lunar surface, accelerating construction tasks, expanding the crew's sphere of influence for scientific exploration, and providing a rapid return to an ascent module in an emergency. The crew transport can be with an un-pressurized rover, a small pressurized rover, or a larger mobile habitat. 2. Transport Extra-Vehicular Activity (EVA) equipment and construction payloads. 3. Transport habitats and

  20. Robotic Telesurgery Research

    DTIC Science & Technology

    2010-10-01

    concepts have been pursued to provide imaging capabilities for use in robotic surgery. The first concept involves developing a camera system that...Oleynikov, D. Project Title: CAESAR: Computer Automated Enhanced Support and Analysis for Robotic Surgery Source of Support: Intelligent Automation, Inc...successful autonomous robotic surgery. REFERENCES Dolghi, O., Strabala, K., Wortman, T., Goede, M., Farritor, S., & Oleynikov. (2010). Miniature

  1. Robotic and artificial intelligence for keyhole neurosurgery: the ROBOCAST project, a multi-modal autonomous path planner.

    PubMed

    De Momi, E; Ferrigno, G

    2010-01-01

    The robot and sensors integration for computer-assisted surgery and therapy (ROBOCAST) project (FP7-ICT-2007-215190) is co-funded by the European Union within the Seventh Framework Programme in the field of information and communication technologies. The ROBOCAST project focuses on robot- and artificial-intelligence-assisted keyhole neurosurgery (tumour biopsy and local drug delivery along straight or turning paths). The goal of this project is to assist surgeons with a robotic system controlled by an intelligent high-level controller (HLC) able to gather and integrate information from the surgeon, from diagnostic images, and from an array of on-field sensors. The HLC integrates pre-operative and intra-operative diagnostics data and measurements, intelligence augmentation, multiple-robot dexterity, and multiple sensory inputs in a closed-loop cooperating scheme including a smart interface for improved haptic immersion and integration. This paper, after the overall architecture description, focuses on the intelligent trajectory planner based on risk estimation and human criticism. The current status of development is reported, and first tests on the planner are shown by using a real image stack and risk descriptor phantom. The advantages of using a fuzzy risk description are given by the possibility of upgrading the knowledge on-field without the intervention of a knowledge engineer.

  2. Robotics

    NASA Technical Reports Server (NTRS)

    Rothschild, Lynn J.

    2012-01-01

    Earth's upper atmosphere is an extreme environment: dry, cold, and irradiated. It is unknown whether our aerobiosphere is limited to the transport of life, or there exist organisms that grow and reproduce while airborne (aerophiles); the microenvironments of suspended particles may harbor life at otherwise uninhabited altitudes[2]. The existence of aerophiles would significantly expand the range of planets considered candidates for life by, for example, including the cooler clouds of a hot Venus-like planet. The X project is an effort to engineer a robotic exploration and biosampling payload for a comprehensive survey of Earth's aerobiology. While many one-shot samples have been retrieved from above 15 km, their results are primarily qualitative; variations in method confound comparisons, leaving such major gaps in our knowledge of aerobiology as quantification of populations at different strata and relative species counts[1]. These challenges and X's preliminary solutions are explicated below. X's primary balloon payload is undergoing a series of calibrations before beginning flights in Spring 2012. A suborbital launch is currently planned for Summer 2012. A series of ground samples taken in Winter 2011 is being used to establish baseline counts and identify likely background contaminants.

  3. Non-destructive inspection in industrial equipment using robotic mobile manipulation

    NASA Astrophysics Data System (ADS)

    Maurtua, Iñaki; Susperregi, Loreto; Ansuategui, Ander; Fernández, Ane; Ibarguren, Aitor; Molina, Jorge; Tubio, Carlos; Villasante, Cristobal; Felsch, Torsten; Pérez, Carmen; Rodriguez, Jorge R.; Ghrissi, Meftah

    2016-05-01

    The MAINBOT project has developed service-robot-based applications to autonomously execute inspection tasks in extensive industrial plants, on equipment arranged either horizontally (using ground robots) or vertically (using climbing robots). The industrial objective has been to provide a means of measuring several physical parameters at multiple points with autonomous robots that are able to navigate and climb structures while handling non-destructive testing sensors. MAINBOT has validated the solutions in two solar thermal plants (cylindrical-parabolic collectors and central tower), which are very demanding from a mobile manipulation point of view, mainly due to their extension (e.g. a 50 MW thermal solar plant with 400 hectares, 400,000 mirrors, 180 km of absorber tubes, and a 140 m tower), the variability of conditions (outdoor, day-night), safety requirements, etc. Once the technology was validated in simulation, the system was deployed in real setups and different validation tests were carried out. In this paper, two of the achievements related to the ground mobile inspection system are presented: (1) autonomous navigation, localization, and planning algorithms to manage navigation over huge extensions, and (2) non-destructive inspection operations: thermography-based detection algorithms that provide automatic inspection abilities to the robots.
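
    As a rough illustration of what a thermography-based defect check could look like, the sketch below flags pixels whose temperature deviates strongly from the scene median; the synthetic frame and the threshold are assumptions, not the MAINBOT detection algorithm.

```python
import numpy as np

# Hypothetical thermography-based anomaly check: pixels in a thermal image
# whose temperature deviates from the scene median by more than a threshold
# are flagged as candidate defects (e.g. overheated absorber-tube sections).
# The synthetic frame and threshold are illustrative assumptions.
def thermal_anomalies(thermal_image, threshold_kelvin=15.0):
    median = np.median(thermal_image)
    mask = np.abs(thermal_image - median) > threshold_kelvin
    return np.argwhere(mask)          # (row, col) coordinates of flagged pixels

# Synthetic 100x100 thermal frame around 350 K with one 5x5 hot spot at 400 K.
frame = np.full((100, 100), 350.0) + np.random.normal(0, 2.0, (100, 100))
frame[40:45, 60:65] = 400.0
hits = thermal_anomalies(frame)
print(f"{len(hits)} anomalous pixels, e.g. {hits[0]}")
```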

  4. Autonomous Global Sky Surveillance with Real-Time Robotic Follow-up: Night Sky Awareness through Thinking Telescopes Technology

    DTIC Science & Technology

    2008-09-01

    therefore act as a binocular monitoring system employing closed loop feedback that autonomously identifies, generates alerts, and makes detailed...from space debris as well as satellites. To suppress false triggers, the “binocular” RAPTOR system uses two wide-field arrays (RAPTOR-A and RAPTOR-B...human vision, RAPTOR uses “binocular” imaging to distinguish distant objects from nearby objects and to suppress imaging faults found in a single “eye

  5. Outdoor Education -- Edinburgh

    ERIC Educational Resources Information Center

    Parker, Terry

    1974-01-01

    In Scotland, outdoor education is seen as a combination of outdoor pursuits and environmental studies. The article describes various centres in the Edinburgh area, outdoor education expeditions, and programs, such as mountaineering, rock climbing, orienteering, and canoeing. (KM)

  6. Cooperative robotics: bringing autonomy to explosive ordnance disposal robots

    NASA Astrophysics Data System (ADS)

    Del Signore, Michael J.; Czop, Andrew; Hacker, Kurt

    2008-04-01

    An ongoing effort within the US Naval EOD Technology Division (NAVEODTECHDIV) is exploring the integration of autonomous robotic technologies onto current and future Explosive Ordnance Disposal (EOD) robot platforms. The Cooperative Robotics program, through the support of the Joint Ground Robotics Enterprise (JGRE), has identified several autonomous robotic technologies useful to the EOD operator, and with the collaboration of academia and industry is in the process of bringing these technologies to EOD robot operators in the field. Initiated in January 2007, the Cooperative Robotics program includes the demonstration of various autonomous technologies to the EOD user community, and the optimization of these technologies for use on small EOD Unmanned Ground Vehicles (UGVs) in relevant environments. Through close interaction with actual EOD operators, these autonomous behaviors will be designed to work within the bounds of current EOD Tactics, Techniques, and Procedures (TTP). This paper will detail the ongoing and future efforts encompassing the Cooperative Robotics program, including: technology demonstrations of autonomous robotic capabilities, development of autonomous capability requirements based on user focus groups, optimization of autonomous UGV behaviors to enable use in relevant environments based on current EOD TTP, and finally the transition of these technologies to current and future EOD robotic systems.

  7. Some Outdoor Educators' Experiences of Outdoor Education

    ERIC Educational Resources Information Center

    Gunn, Terry

    2006-01-01

    The phenomenological study presented in this paper attempts to determine, from outdoor educators, what it meant for them to be teaching outdoor education in Victorian secondary schools during 2004. In 1999, Lugg and Martin surveyed Victorian secondary schools to determine the types of outdoor education programs being run, the objectives of those…

  8. Autonomous Scheduling and Operation of the 1.3-meter Robotically Controlled Telescope (RCT) at Kitt Peak

    NASA Astrophysics Data System (ADS)

    Gelderman, Richard; Strolger, L.; Carini, M.; Marchenko, S.; Reddy Yaramala, S.; Rumph, M.; van Fleet, R.; Wood, J. D.

    2007-12-01

    The 1.3-meter (50-inch) telescope at Kitt Peak has been restored to operation as a fully robotic instrument for optical imaging. Once known as the Remotely Controlled Telescope, it is again being called the RCT, now standing for Robotically Controlled Telescope. The automation of the observatory has included development of a computer control system designed to accommodate and appropriately manage the myriad of optical observing modes typically managed at other multi-user, general-purpose observatories, but with much greater efficiency. The observation scheduling routine for the RCT is based on the insgen list generator and process spawner originally developed for the Berkeley Automatic Imaging Telescope (Richmond, Treffers, & Filippenko 1992). The software schedules observation requests according to target information and program-specific technical constraints (such as user-assigned priority, moon avoidance, airmass, and seeing), taking into account telescope limitations, sky conditions, and technical and organizational constraints. The system supports research programs involving time-critical requests, coordinated observations and short-term (hours) and long-term (days) monitoring. We also discuss the execution and storage of the observations, the methods for the periodic accounting of partner shares (which factor into weighting of future observation requests), and our plans for providing public access to the data.

  9. Fuzzy Sets in Dynamic Adaptation of Parameters of a Bee Colony Optimization for Controlling the Trajectory of an Autonomous Mobile Robot

    PubMed Central

    Amador-Angulo, Leticia; Mendoza, Olivia; Castro, Juan R.; Rodríguez-Díaz, Antonio; Melin, Patricia; Castillo, Oscar

    2016-01-01

    A hybrid approach composed of different types of fuzzy systems, such as the Type-1 Fuzzy Logic System (T1FLS), the Interval Type-2 Fuzzy Logic System (IT2FLS), and the Generalized Type-2 Fuzzy Logic System (GT2FLS), for the dynamic adaptation of the alpha and beta parameters of a Bee Colony Optimization (BCO) algorithm is presented. The objective of the work is to use the BCO technique to find the optimal distribution of the membership functions in the design of fuzzy controllers. We use BCO specifically for tuning the membership functions of the fuzzy controller for trajectory stability in an autonomous mobile robot. We add two types of perturbations to the model for the Generalized Type-2 Fuzzy Logic System to better analyze its behavior under uncertainty, and this shows better results when compared to the original BCO. We implemented various performance indices (ITAE, IAE, ISE, ITSE, RMSE, and MSE) to measure the performance of the controller. The experimental results show better performance using GT2FLS than with IT2FLS and T1FLS in the dynamic adaptation of the parameters of the BCO algorithm. PMID:27618062

  10. Fuzzy Sets in Dynamic Adaptation of Parameters of a Bee Colony Optimization for Controlling the Trajectory of an Autonomous Mobile Robot.

    PubMed

    Amador-Angulo, Leticia; Mendoza, Olivia; Castro, Juan R; Rodríguez-Díaz, Antonio; Melin, Patricia; Castillo, Oscar

    2016-09-09

    A hybrid approach composed of different types of fuzzy systems, such as the Type-1 Fuzzy Logic System (T1FLS), the Interval Type-2 Fuzzy Logic System (IT2FLS), and the Generalized Type-2 Fuzzy Logic System (GT2FLS), for the dynamic adaptation of the alpha and beta parameters of a Bee Colony Optimization (BCO) algorithm is presented. The objective of the work is to use the BCO technique to find the optimal distribution of the membership functions in the design of fuzzy controllers. We use BCO specifically for tuning the membership functions of the fuzzy controller for trajectory stability in an autonomous mobile robot. We add two types of perturbations to the model for the Generalized Type-2 Fuzzy Logic System to better analyze its behavior under uncertainty, and this shows better results when compared to the original BCO. We implemented various performance indices (ITAE, IAE, ISE, ITSE, RMSE, and MSE) to measure the performance of the controller. The experimental results show better performance using GT2FLS than with IT2FLS and T1FLS in the dynamic adaptation of the parameters of the BCO algorithm.
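
    The controller performance indices named in the two preceding records are standard integral error measures; a minimal sketch of how they are computed from a sampled tracking-error signal follows (the error trace and sampling period are illustrative, not data from the paper).

```python
import numpy as np

# Standard controller performance indices (ITAE, IAE, ISE, ITSE, RMSE, MSE)
# computed from a sampled trajectory-tracking error.  The example error trace
# and sampling period are assumptions for illustration only.
def performance_indices(error, dt):
    e = np.asarray(error, dtype=float)
    t = np.arange(len(e)) * dt
    return {
        "IAE":  np.sum(np.abs(e)) * dt,          # integral of absolute error
        "ISE":  np.sum(e ** 2) * dt,             # integral of squared error
        "ITAE": np.sum(t * np.abs(e)) * dt,      # time-weighted absolute error
        "ITSE": np.sum(t * e ** 2) * dt,         # time-weighted squared error
        "MSE":  np.mean(e ** 2),
        "RMSE": np.sqrt(np.mean(e ** 2)),
    }

# Example: an error signal that decays as the controller converges.
dt = 0.01
time = np.arange(0, 5, dt)
err = np.exp(-time) * np.sin(2 * np.pi * time)
for name, value in performance_indices(err, dt).items():
    print(f"{name}: {value:.4f}")
```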

  11. The Outdoor Programming Handbook.

    ERIC Educational Resources Information Center

    Watters, Ron

    This manual provides guidelines for developing outdoor recreation programs. The manual was prepared for adult outdoor recreation programs, but could be useful for other age groups as well. The following topics are discussed: (1) the historical perspectives of outdoor recreation programming; (2) outdoor programming models, including the club model,…

  12. Outdoor Education Manual.

    ERIC Educational Resources Information Center

    Ontario Teachers' Federation, Toronto.

    A guide and introduction to outdoor education for the classroom teacher, the manual lists 5 aims and objectives of outdoor education and discusses the means for reaching these objectives through field trips, camping, outdoor learning centers, and outdoor teaching in school environs. Selected activities are described by subject area: arts and…

  13. Outdoor Education Manual.

    ERIC Educational Resources Information Center

    Nashville - Davidson County Metropolitan Public Schools, TN.

    Creative ways to use the outdoors as a part of the regular school curriculum are outlined in this teacher's manual for the elementary grades. Presented for consideration are the general objectives of outdoor education, suggestions for evaluating outdoor education experiences, and techniques for teaching outdoor education. The purpose and functions…

  14. Autonomous Integrated Navigation for Indoor Robots Utilizing On-Line Iterated Extended Rauch-Tung-Striebel Smoothing

    PubMed Central

    Xu, Yuan; Chen, Xiyuan; Li, Qinghua

    2013-01-01

    In order to reduce the estimation errors of inertial navigation system (INS)/wireless sensor network (WSN)-integrated navigation for mobile robots indoors, this work proposes an on-line iterated extended Rauch-Tung-Striebel smoother (IERTSS) utilizing inertial measurement units (IMUs) and an ultrasonic positioning system. In this method, an iterated extended Kalman filter (IEKF) is used in the forward data processing of the extended Rauch-Tung-Striebel smoother (ERTSS) to improve the accuracy of the filtering output passed to the smoother. Furthermore, in order to achieve on-line smoothing, the IERTSS is embedded into the average filter. For verification, a real indoor test has been performed to assess the performance of the proposed method. The results show that the proposed method is effective in reducing the errors compared with conventional schemes.
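
    The forward-filter/backward-smoother structure underlying the proposed IERTSS can be sketched for a simplified linear case; the record's method uses an iterated extended Kalman filter with IMU and ultrasonic measurements, so the constant-velocity model and noise values below are assumptions for illustration only.

```python
import numpy as np

# Minimal linear Rauch-Tung-Striebel smoother for a 1-D constant-velocity
# model (position measured, velocity hidden).  This only illustrates the
# generic forward-filter / backward-smoother structure; the model, noise
# covariances, and data are assumptions, not the paper's IEKF-based system.
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition
H = np.array([[1.0, 0.0]])                 # position-only measurement
Q = np.diag([1e-4, 1e-3])                  # process noise (assumed)
R = np.array([[0.05]])                     # measurement noise (assumed)

def rts_smooth(zs, x0, P0):
    xs_f, Ps_f, xs_p, Ps_p = [], [], [], []
    x, P = x0, P0
    for z in zs:                            # forward Kalman pass
        x_pred, P_pred = F @ x, F @ P @ F.T + Q
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        x = x_pred + K @ (np.atleast_1d(z) - H @ x_pred)
        P = (np.eye(2) - K @ H) @ P_pred
        xs_f.append(x); Ps_f.append(P); xs_p.append(x_pred); Ps_p.append(P_pred)
    xs_s = list(xs_f)                       # backward RTS pass
    for k in range(len(zs) - 2, -1, -1):
        C = Ps_f[k] @ F.T @ np.linalg.inv(Ps_p[k + 1])
        xs_s[k] = xs_f[k] + C @ (xs_s[k + 1] - xs_p[k + 1])
    return np.array(xs_s)

zs = 0.5 * np.arange(50) * dt + np.random.normal(0, 0.2, 50)  # noisy positions
smoothed = rts_smooth(zs, np.zeros(2), np.eye(2))
print(smoothed[:5, 0])   # smoothed position estimates
```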

  15. Planning Flight Paths of Autonomous Aerobots

    NASA Technical Reports Server (NTRS)

    Kulczycki, Eric; Elfes, Alberto; Sharma, Shivanjli

    2009-01-01

    Algorithms for planning flight paths of autonomous aerobots (robotic blimps) to be deployed in scientific exploration of remote planets are undergoing development. These algorithms are also adaptable to terrestrial applications involving robotic submarines as well as aerobots and other autonomous aircraft used to acquire scientific data or to perform surveying or monitoring functions.

  16. Robotic intelligence kernel

    DOEpatents

    Bruemmer, David J.

    2009-11-17

    A robot platform includes perceptors, locomotors, and a system controller. The system controller executes a robot intelligence kernel (RIK) that includes a multi-level architecture and a dynamic autonomy structure. The multi-level architecture includes a robot behavior level for defining robot behaviors that incorporate robot attributes, and a cognitive level for defining conduct modules that blend an adaptive interaction between predefined decision functions and the robot behaviors. The dynamic autonomy structure is configured for modifying a transaction capacity between operator intervention and robot initiative, and may include multiple levels, with at least a teleoperation mode configured to maximize operator intervention and minimize robot initiative, and an autonomous mode configured to minimize operator intervention and maximize robot initiative. Within the RIK, at least the cognitive level includes the dynamic autonomy structure.
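
    A hypothetical sketch of such a dynamic-autonomy ladder is given below: each mode trades operator intervention against robot initiative, and the commanded motion blends the two inputs. The mode names, weights, and blending rule are assumptions inspired by the abstract, not the patented implementation.

```python
from enum import Enum

# Hypothetical dynamic-autonomy ladder: each mode's value is a robot-initiative
# weight, and the commanded velocity is a blend of operator and robot inputs.
# Mode names, weights, and the blending rule are illustrative assumptions.
class AutonomyMode(Enum):
    TELEOPERATION = 0.0   # operator intervention maximized
    SAFE_TELEOP   = 0.25
    SHARED        = 0.5
    COLLABORATIVE = 0.75
    AUTONOMOUS    = 1.0   # robot initiative maximized

def blended_command(mode, operator_cmd, robot_cmd):
    """Blend operator and robot velocity commands by the mode's weight."""
    w = mode.value
    return tuple((1.0 - w) * o + w * r for o, r in zip(operator_cmd, robot_cmd))

# Example: operator steers straight, obstacle-avoidance behaviour steers left.
print(blended_command(AutonomyMode.SHARED, (0.5, 0.0), (0.3, 0.4)))
```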

  17. Recent Advances in Robotics and Career Opportunities for Physicists

    NASA Astrophysics Data System (ADS)

    Bouchier, Paul

    2011-03-01

    Some of the most significant advances in robotic systems over the last year are shown in this talk, which covers both autonomous and partly autonomous robots. A few robotic employers, both in Texas and elsewhere are profiled, with an emphasis on opportunities of interest to physicists. The presenter is president of the Dallas Personal Robotics Group.

  18. Evolution of a Performance Metric for Urban Search and Rescue Robots (2003)

    DTIC Science & Technology

    2003-09-01

    fully autonomous robots, a robot almost directly from the mid-size soccer league, and even a blimp. The two fully autonomous teams demonstrated robots... autonomous robots (Figure 7). The first place team was the Idaho National Engineering and Environmental Laboratory (INEEL) team from the USA. This...walls, obstacles and victim locations. Other interesting systems included two low-cost but fully autonomous robots. Both teams focused on the low

  19. Outdoors classes

    NASA Astrophysics Data System (ADS)

    Szymanska-Markowska, Barbara

    2016-04-01

    Why should students be trapped within the four walls of the classroom when there are so many ways to run lessons differently? I am not a fan of having lessons only at school, and for many students it is boring to stay there, too. So I decided to organize workshops and trips to universities or outdoors. I created KMO (Discoverer's Club for Teenagers) at my school, where students gave me ideas and we started to make them real. I teach at a school where students don't like science, and I try hard to change their point of view. That is why I started to take part in different competitions with my students. Last year we measured noise everywhere using applications on a tablet, to convince them that noise is very harmful to our bodies. We found that the most harmful noise occurred during school breaks, near motorways, and in households. We also showed that the acoustic screens near the motorways did not protect us from noise: 30 meters behind the screens the noise was the same as at the motorway. We won the main prize for these measurements. We also received awards for calculating the costs of a car powered by a solar panel. We measured everything by computer. This year we decided to write an essay about trees and weather. We went to the forest and found cut trees, because we wanted to read the age of a tree from its stump. I had not known earlier that we could read the weather from a tree's grain. We examined a lot of trees, and we can tell that trees are good carriers of information about weather and natural disasters. I have started studying safety education and have many ideas for getting my students interested in this subject, which is similar to P.E., physics and chemistry. I hope I will use what I learned from the European Space Education Resource Office and the GIFT workshop. I plan to use satellites and space to teach my students how they can check information about terrorism, floods or other

  20. Robotic transportation.

    PubMed

    Lob, W S

    1990-09-01

    Mobile robots perform fetch-and-carry tasks autonomously. An intelligent, sensor-equipped mobile robot does not require dedicated pathways or extensive facility modification. In the hospital, mobile robots can be used to carry specimens, pharmaceuticals, meals, etc. between supply centers, patient areas, and laboratories. The HelpMate (Transitions Research Corp.) mobile robot was developed specifically for hospital environments. To reach a desired destination, HelpMate navigates with an on-board computer that continuously polls a suite of sensors, matches the sensor data against a pre-programmed map of the environment, and issues drive commands and path corrections. A sender operates the robot with a user-friendly menu that prompts for payload insertion and desired destination(s). Upon arrival at its selected destination, the robot prompts the recipient for a security code or physical key and awaits acknowledgement of payload removal. In the future, the integration of HelpMate with robot manipulators, test equipment, and central institutional information systems will open new applications in more localized areas and should help overcome difficulties in filling transport staff positions.

  1. Networking a mobile robot

    NASA Astrophysics Data System (ADS)

    McKee, Gerard T.

    1994-10-01

    Conventional mobile robotic systems are `stand alone'. Program development involves loading programs into the mobile, via an umbilical. Autonomous operation, in this context, means `isolation': the user cannot interact with the program as the robot is moving around. Recent research in `swarm robotics' has exploited wireless networks as a means of providing inter- robot communication, but the population is still isolated from the human user. In this paper we report on research we are conducting into the provision of mobile robots as resources on a local area computer network, and thus breaking the isolation barrier. We are making use of new multimedia workstation and wireless networking technology to link the robots to the network in order to provide a new type of resource for the user. We model the robot as a set of resources and propose a client-server architecture as the basis for providing user access to the robots. We describe the types of resources each robot can provide and we outline the potential for cooperative robotics, human-robot cooperation, and teleoperation and autonomous robot behavior within this context.

  2. Learning for autonomous navigation

    NASA Technical Reports Server (NTRS)

    Angelova, Anelia; Howard, Andrew; Matthies, Larry; Tang, Benyang; Turmon, Michael; Mjolsness, Eric

    2005-01-01

    Autonomous off-road navigation of robotic ground vehicles has important applications on Earth and in space exploration. Progress in this domain has been retarded by the limited lookahead range of 3-D sensors and by the difficulty of preprogramming systems to understand the traversability of the wide variety of terrain they can encounter.

  3. Heterogeneous Multi-Robot Cooperation

    DTIC Science & Technology

    1994-02-01

    the objects the robots manipulate are hazardous waste. I have not actually applied the robots to real toxic waste spills, since they are simply small...1993] Bruce Randall Donald, James Jennings, and Daniela Rus. Towards a theory of information invariants for cooperating autonomous mobile robots

  4. Mobile Autonomous Humanoid Assistant

    NASA Technical Reports Server (NTRS)

    Diftler, M. A.; Ambrose, R. O.; Tyree, K. S.; Goza, S. M.; Huber, E. L.

    2004-01-01

    A mobile autonomous humanoid robot is assisting human co-workers at the Johnson Space Center with tool handling tasks. This robot combines the upper body of the National Aeronautics and Space Administration (NASA)/Defense Advanced Research Projects Agency (DARPA) Robonaut system with a Segway(TradeMark) Robotic Mobility Platform yielding a dexterous, maneuverable humanoid perfect for aiding human co-workers in a range of environments. This system uses stereo vision to locate human team mates and tools and a navigation system that uses laser range and vision data to follow humans while avoiding obstacles. Tactile sensors provide information to grasping algorithms for efficient tool exchanges. The autonomous architecture utilizes these pre-programmed skills to form human assistant behaviors. The initial behavior demonstrates a robust capability to assist a human by acquiring a tool from a remotely located individual and then following the human in a cluttered environment with the tool for future use.

  5. Partnership: Recycling $/$ Outdoor Education.

    ERIC Educational Resources Information Center

    Weir, Phil

    1996-01-01

    The Ottawa Board of Education (Ontario, Canada) has committed revenues generated by a districtwide recycling program to help fund the MacSkimming Outdoor Education Centre. A partnership between recycling and outdoor education is valuable in developing an environmental ethic among students and in finding new ways to fund outdoor education. (LP)

  6. Outdoor Environments. Beginnings Workshop.

    ERIC Educational Resources Information Center

    Child Care Information Exchange, 2003

    2003-01-01

    Presents seven articles on outdoor play environments: "Are We Losing Ground?" (Greenman); "Designing and Creating Natural Play Environments for Young Children" (Keeler); "Adventure Playgrounds and Outdoor Safety Issues" (McGinnis); "Trust, the Earth and Children: Birth to Three" (Young); "Outdoor Magic…

  7. Education and Outdoor Recreation.

    ERIC Educational Resources Information Center

    Bureau of Outdoor Recreation (Dept. of Interior), Washington, DC.

    Responsibility for meeting the needs and demands of the public for outdoor recreation has led the Bureau of Outdoor Recreation to cooperate with educational institutions and others in order to assist in establishing education programs and activities and to encourage public use and benefits from outdoor recreation. To this end the Bureau conducts…

  8. Brightness Invariant Port Recognition for Robotic Aircraft Refueling

    DTIC Science & Technology

    1990-12-13

    autonomous robotic systems for many possible military applications. One such application that the robotics research group at the Air Force Institute of...1.1 Motivation Currently, the development of autonomous robotic systems is a major interest to the Air Force. With the reduction of resources and...to help alleviate or eliminate manpower intensive activities, it is becoming more critical to use autonomous robotic systems to perform some logistic

  9. Aerial Explorers and Robotic Ecosystems

    NASA Technical Reports Server (NTRS)

    Young, Larry A.; Pisanich, Greg

    2004-01-01

    A unique bio-inspired approach to autonomous aerial vehicle (a.k.a. aerial explorer) technology is discussed. The work is focused on defining and studying aerial explorer mission concepts, both as an individual robotic system and as a member of a small robotic "ecosystem." Members of this robotic ecosystem include the aerial explorer, air-deployed sensors and robotic symbiotes, and other assets such as rovers, landers, and orbiters.

  10. A cognitive robotic system based on the Soar cognitive architecture for mobile robot navigation, search, and mapping missions

    NASA Astrophysics Data System (ADS)

    Hanford, Scott D.

    Most unmanned vehicles used for civilian and military applications are remotely operated or are designed for specific applications. As these vehicles are used to perform more difficult missions or a larger number of missions in remote environments, there will be a great need for these vehicles to behave intelligently and autonomously. Cognitive architectures, computer programs that define mechanisms that are important for modeling and generating domain-independent intelligent behavior, have the potential for generating intelligent and autonomous behavior in unmanned vehicles. The research described in this presentation explored the use of the Soar cognitive architecture for cognitive robotics. The Cognitive Robotic System (CRS) has been developed to integrate software systems for motor control and sensor processing with Soar for unmanned vehicle control. The CRS has been tested using two mobile robot missions: outdoor navigation and search in an indoor environment. The use of the CRS for the outdoor navigation mission demonstrated that a Soar agent could autonomously navigate to a specified location while avoiding obstacles, including cul-de-sacs, with only a minimal amount of knowledge about the environment. While most systems use information from maps or long-range perceptual capabilities to avoid cul-de-sacs, a Soar agent in the CRS was able to recognize when a simple approach to avoiding obstacles was unsuccessful and switch to a different strategy for avoiding complex obstacles. During the indoor search mission, the CRS autonomously and intelligently searches a building for an object of interest and common intersection types. While searching the building, the Soar agent builds a topological map of the environment using information about the intersections the CRS detects. The agent uses this topological model (along with Soar's reasoning, planning, and learning mechanisms) to make intelligent decisions about how to effectively search the building. Once the
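
    The topological-map idea can be sketched as a graph whose nodes are detected intersections and whose edges are traversed corridors, with the search policy expanding the nearest node that still has unexplored branches. The data structure and frontier rule below are assumptions for illustration; the CRS delegates this decision to Soar's reasoning, planning, and learning mechanisms.

```python
import collections

# Illustrative topological map for indoor search: detected intersections become
# graph nodes, traversed corridors become edges, and the next goal is the
# closest node that still has an unexplored branch.  The graph representation
# and breadth-first frontier choice are assumptions, not the Soar-based CRS.
class TopologicalMap:
    def __init__(self):
        self.edges = collections.defaultdict(set)       # node -> connected nodes
        self.unexplored = collections.defaultdict(set)  # node -> open branches

    def add_intersection(self, node, branches):
        self.unexplored[node].update(branches)

    def add_corridor(self, a, b):
        self.edges[a].add(b)
        self.edges[b].add(a)
        self.unexplored[a].discard(b)
        self.unexplored[b].discard(a)

    def next_goal(self, current):
        """Breadth-first search for the closest node with an open branch."""
        queue, seen = collections.deque([current]), {current}
        while queue:
            node = queue.popleft()
            if self.unexplored[node]:
                return node, next(iter(self.unexplored[node]))
            for nbr in self.edges[node]:
                if nbr not in seen:
                    seen.add(nbr)
                    queue.append(nbr)
        return None   # building fully explored

m = TopologicalMap()
m.add_intersection("A", {"B", "C"})
m.add_corridor("A", "B")
m.add_intersection("B", {"D"})
print(m.next_goal("A"))   # ('A', 'C'): the nearest node with an open branch
```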

  11. Autonomous and Autonomic Swarms

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Truszkowski, Walter F.; Rouff, Christopher A.; Sterritt, Roy

    2005-01-01

    A watershed in systems engineering is represented by the advent of swarm-based systems that accomplish missions through cooperative action by a (large) group of autonomous individuals each having simple capabilities and no global knowledge of the group's objective. Such systems, with individuals capable of surviving in hostile environments, pose unprecedented challenges to system developers. Design and testing and verification at much higher levels will be required, together with the corresponding tools, to bring such systems to fruition. Concepts for possible future NASA space exploration missions include autonomous, autonomic swarms. Engineering swarm-based missions begins with understanding autonomy and autonomicity and how to design, test, and verify systems that have those properties and, simultaneously, the capability to accomplish prescribed mission goals. Formal methods-based technologies, both projected and in development, are described in terms of their potential utility to swarm-based system developers.

  12. Mobile robot knowledge base

    NASA Astrophysics Data System (ADS)

    Heath Pastore, Tracy; Barnes, Mitchell; Hallman, Rory

    2005-05-01

    Robot technology is developing at a rapid rate for both commercial and Department of Defense (DOD) applications. As a result, the task of managing both technology and experience information is growing. In the not-too-distant past, tracking development efforts of robot platforms, subsystems and components was not too difficult, expensive, or time consuming. To do the same today is a significant undertaking. The Mobile Robot Knowledge Base (MRKB) provides the robotics community with a web-accessible, centralized resource for sharing information, experience, and technology to more efficiently and effectively meet the needs of the robot system user. The resource includes searchable information on robot components, subsystems, mission payloads, platforms, and DOD robotics programs. In addition, the MRKB website provides a forum for technology and information transfer within the DOD robotics community and an interface for the Robotic Systems Pool (RSP). The RSP manages a collection of small teleoperated and semi-autonomous robotic platforms, available for loan to DOD and other qualified entities. The objective is to put robots in the hands of users and use the test data and fielding experience to improve robot systems.

  13. An Advanced Telereflexive Tactical Response Robot

    DTIC Science & Technology

    2001-01-01

    Autonomous Robots 11, 39-47, 2001. © 2001 Kluwer Academic Publishers. Manufactured in The Netherlands. An Advanced Telereflexive Tactical Response Robot. H.R. Everett, G.A. Gilbreath and D.A. Ciccimaro, SPAWAR Systems Center, San Diego, Code D371, 53406 Woodward Road, San Diego, CA 92152-7383...centered mapping" strategy. Keywords: robotic sensors, tactical response robot, robotics, teleoperated, telereflexive, non-lethal response, world modeling

  14. Market-Driven Multi-Robot Exploration

    DTIC Science & Technology

    2002-01-01

    For many real-world applications, autonomous robots must execute complex tasks in unknown or partially known unstructured environments. This work...to communication interruptions and failures. Results showing the capabilities of our system on a team of exploring autonomous robots are also given.

  15. NASA's Robotic Lander Takes Flight

    NASA Video Gallery

    On Wednesday, June 8, the lander prototype managed by the Robotic Lunar Lander Development Project at NASA's Marshall Space Flight Center in Huntsville, Ala., hovered autonomously for 15 seconds at...

  16. Optimizing Safe Motion for Autonomous Vehicles

    DTIC Science & Technology

    1994-09-01

    k of the vehicle motion (dk/ds) as the only control variable for the vehicle where s is the length along the vehicle trajectory. Previous motion...function for vehicle motion control is demonstrated by algorithmic simulation and by use on the autonomous mobile robot Yamabico-11 at the Naval...only control variable for the vehicle, where s is the length along the vehicle trajectory. Previous motion planning of the autonomous mobile robot

  17. Outdoor Education Manual.

    ERIC Educational Resources Information Center

    Gooyers, Cobina; And Others

    Designed for teachers to provide students with an awareness of the world of nature which surrounds them, the manual presents the philosophy of outdoor education, goals and objectives of the school program, planning for outdoor education, the Wildwood Programs, sequential program planning for students, program booking and resource list. Content…

  18. Outdoor Recreation Management

    ERIC Educational Resources Information Center

    Jubenville, Alan

    The complex problems facing the manager of an outdoor recreation area are outlined and discussed. Eighteen chapters cover the following primary concerns of the manager of such a facility: (1) an overview of the management process; (2) the basic outdoor recreation management model; (3) the problem-solving process; (4) involvement of the public in…

  19. Fundamentals of Outdoor Enjoyment.

    ERIC Educational Resources Information Center

    Mitchell, Jim; Fear, Gene

    The purpose of this preventive search and rescue teachers guide is to help high school aged youth understand the complexities and priorities necessary to manage a human body in outdoor environments and the value of planning ahead to have on hand the skills and equipment needed for outdoor survival, comfort, and enjoyment. Separate sections present…

  20. Outdoor Education Resource Guide.

    ERIC Educational Resources Information Center

    Prince George's County Board of Education, Upper Marlboro, MD.

    Developed primarily as a source of information for teachers planning outdoor education experiences, the material in this resource book can be used by any teacher in environmental studies. Subjects and activities most often taught as part of the outdoor education program are outlined both as resource (basic information) and teaching units. The…

  1. Healthy Air Outdoors

    MedlinePlus

    ... families and can even shorten their lives. Outdoor Air Pollution and Health: Outdoor air pollution continues to threaten the lives and health of ... sources such as fires and dust contribute to air pollution. ...

  2. Maple Leaf Outdoor Centre.

    ERIC Educational Resources Information Center

    Maguire, Molly; Gunton, Ric

    2000-01-01

    Maple Leaf Outdoor Centre (Ontario) has added year-round outdoor education facilities and programs to help support its summer camp for disadvantaged children. Schools, youth centers, religious groups, and athletic teams conduct their own programs, collaborate with staff, or use staff-developed programs emphasizing adventure education and personal…

  3. Photography in Outdoor Education.

    ERIC Educational Resources Information Center

    O'Connell, Cornelius; And Others

    The use of photography to add a new dimension to outdoor education activities is described in this paper. It is noted that photography can be an aid to outdoor education in a number of ways: students learn to communicate ideas visually, students learn to think through problems and find ways of solving them, students gain increased appreciation of…

  4. Hunting and Outdoor Education.

    ERIC Educational Resources Information Center

    Matthews, Bruce E.

    1991-01-01

    This article addresses the controversy over including hunting as a part of outdoor education. Historically, figures such as Julian Smith, of the Outdoor Education Project of the 1950's, advocated hunting as a critical element of educating children and youth about care and protection of natural resources. Henry David Thoreau saw hunting experiences…

  5. Outdoor Classroom Coordinator

    ERIC Educational Resources Information Center

    Keeler, Rusty

    2010-01-01

    Everybody loves the idea of children playing outdoors. Outside, children get to experience the seasons, challenge their minds and bodies, connect with the natural world, and form a special relationship with the planet. But in order for children to get the most of their outdoor time it is important that the environment be prepared by caring adults…

  6. Effective Thinking Outdoors.

    ERIC Educational Resources Information Center

    Hyde, Rod

    1997-01-01

    Effective Thinking Outdoors (ETO) is an organization that teaches thinking skills and strategies via significant outdoor experiences. Identifies the three elements of thinking as creativity, play, and persistence; presents a graphic depiction of the problem-solving process and aims; and describes an ETO exercise, determining old routes of travel…

  7. Outdoorsman: Outdoor Cooking.

    ERIC Educational Resources Information Center

    Alberta Dept. of Agriculture, Edmonton.

    This Outdoor Cookery manual provides information and instruction on the basic outdoor skills of building suitable cooking fires, handling fires safely, and storing food. The necessity of having the right kind of fire is stressed (high flames for boiling, low for stewing, and coals for frying and broiling). Tips on gauging temperature, what types…

  8. Outdoor Education: Resource Catalogue.

    ERIC Educational Resources Information Center

    Manitoba Dept. of Education, Winnipeg.

    Compiled to serve as a reference to help teachers locate outdoor education materials available from Canadian government and private agencies, this catalogue lists services and publications which can be utilized by educators in planning and implementing outdoor education programs. Among the services listed is a sampling of organizations,…

  9. Outdoor Recreation Space Standards.

    ERIC Educational Resources Information Center

    Bureau of Outdoor Recreation (Dept. of Interior), Washington, DC.

    With increased concern in recent years for the quality of our cultural and physical environment, there has been a corresponding increase in the need for information on standards used for planning playgrounds and parks, sports facilities, and outdoor areas for camping and hiking. Standards for various types of outdoor recreational facilities as…

  10. The Outdoor Classroom.

    ERIC Educational Resources Information Center

    Thomas, Dorothy E.

    An Outdoor Classroom to prepare pre-service and in-service teachers to utilize vital natural resources as an outdoor laboratory was established in 1974 by Elizabeth City State University. Because of its proximity to the Great Dismal Swamp and the Atlantic, the university's geographical location made it especially suitable for such a course of…

  11. Teleautonomous guidance for mobile robots

    NASA Technical Reports Server (NTRS)

    Borenstein, J.; Koren, Y.

    1990-01-01

    Teleautonomous guidance (TG), a technique for the remote guidance of fast mobile robots, has been developed and implemented. With TG, the mobile robot follows the general direction prescribed by an operator. However, if the robot encounters an obstacle, it autonomously avoids collision with that obstacle while trying to match the prescribed direction as closely as possible. This type of shared control is completely transparent and transfers control between teleoperation and autonomous obstacle avoidance gradually. TG allows the operator to steer vehicles and robots at high speeds and in cluttered environments, even without visual contact. TG is based on the virtual force field (VFF) method, which was developed earlier for autonomous obstacle avoidance. The VFF method is especially suited to the accommodation of inaccurate sensor data (such as that produced by ultrasonic sensors) and sensor fusion, and allows the mobile robot to travel quickly without stopping for obstacles.
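
    The force-blending idea behind the VFF method can be sketched as follows: range readings generate repulsive forces that grow as obstacles get closer, the operator's prescribed heading contributes an attractive force, and the steering command follows the resultant. The gains, the inverse-square law, and the simple polar sensor model are illustrative assumptions, not the published VFF formulation.

```python
import math

# Minimal sketch of virtual-force-field-style blending: each range reading
# pushes the robot away from the obstacle (more strongly when closer), the
# operator's prescribed heading pulls it forward, and the resultant sets the
# steering direction.  Gains and the sensor model are illustrative assumptions.
def vff_steering(ranges, operator_heading, repulse_gain=0.5, attract_gain=1.0):
    """ranges: list of (bearing_rad, distance_m) readings from range sensors."""
    fx = attract_gain * math.cos(operator_heading)
    fy = attract_gain * math.sin(operator_heading)
    for bearing, dist in ranges:
        if dist <= 0:
            continue
        mag = repulse_gain / (dist * dist)       # stronger push when closer
        fx -= mag * math.cos(bearing)            # push away from the obstacle
        fy -= mag * math.sin(bearing)
    return math.atan2(fy, fx)                    # resultant steering direction

# Operator commands straight ahead; an obstacle 0.8 m away at +20 degrees
# deflects the commanded heading away from it (toward negative angles).
readings = [(math.radians(20), 0.8)]
print(math.degrees(vff_steering(readings, operator_heading=0.0)))
```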

  12. Detection, Location and Grasping Objects Using a Stereo Sensor on UAV in Outdoor Environments.

    PubMed

    Ramon Soria, Pablo; Arrue, Begoña C; Ollero, Anibal

    2017-01-07

    The article presents a vision system for the autonomous grasping of objects with Unmanned Aerial Vehicles (UAVs) in real time. Giving UAVs the capability to manipulate objects vastly extends their applications, as they are capable of accessing places that are difficult to reach or even unreachable for human beings. This work is focused on the grasping of known objects based on feature models. The system runs in an on-board computer on a UAV equipped with a stereo camera and a robotic arm. The algorithm learns a feature-based model in an offline stage, then it is used online for detection of the targeted object and estimation of its position. This feature-based model was proved to be robust to both occlusions and the presence of outliers. The use of stereo cameras improves the learning stage, providing 3D information and helping to filter features in the online stage. An experimental system was derived using a rotary-wing UAV and a small manipulator for final proof of concept. The robotic arm is designed with three degrees of freedom and is lightweight due to payload limitations of the UAV. The system has been validated with different objects, both indoors and outdoors.

  13. Detection, Location and Grasping Objects Using a Stereo Sensor on UAV in Outdoor Environments

    PubMed Central

    Ramon Soria, Pablo; Arrue, Begoña C.; Ollero, Anibal

    2017-01-01

    The article presents a vision system for the autonomous grasping of objects with Unmanned Aerial Vehicles (UAVs) in real time. Giving UAVs the capability to manipulate objects vastly extends their applications, as they are capable of accessing places that are difficult to reach or even unreachable for human beings. This work is focused on the grasping of known objects based on feature models. The system runs in an on-board computer on a UAV equipped with a stereo camera and a robotic arm. The algorithm learns a feature-based model in an offline stage, then it is used online for detection of the targeted object and estimation of its position. This feature-based model was proved to be robust to both occlusions and the presence of outliers. The use of stereo cameras improves the learning stage, providing 3D information and helping to filter features in the online stage. An experimental system was derived using a rotary-wing UAV and a small manipulator for final proof of concept. The robotic arm is designed with three degrees of freedom and is lightweight due to payload limitations of the UAV. The system has been validated with different objects, both indoors and outdoors. PMID:28067851
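
    The stereo position-estimation step that follows feature matching in the two records above can be sketched with the classic rectified-pinhole relations; the intrinsics, baseline, and pixel coordinates below are illustrative assumptions, not the UAV camera's calibration.

```python
import numpy as np

# Sketch of stereo triangulation for matched features in a rectified image
# pair, using the standard pinhole relations
#   Z = f * B / disparity,  X = (u - cx) * Z / f,  Y = (v - cy) * Z / f.
# The focal length, baseline, principal point, and pixel coordinates are
# illustrative assumptions, not the calibration used in the paper.
def triangulate(left_px, right_px, f=700.0, baseline=0.12, cx=320.0, cy=240.0):
    """Return the 3-D point (camera frame, metres) for one matched feature."""
    (ul, vl), (ur, _) = left_px, right_px
    disparity = ul - ur
    if disparity <= 0:
        raise ValueError("non-positive disparity: bad match or point at infinity")
    Z = f * baseline / disparity
    X = (ul - cx) * Z / f
    Y = (vl - cy) * Z / f
    return np.array([X, Y, Z])

def object_position(matches):
    """Average the 3-D positions of all matched features on the object."""
    return np.mean([triangulate(l, r) for l, r in matches], axis=0)

# Three matched features on the target object, roughly 1.7 m in front of the camera.
matches = [((350, 250), (300, 250)),
           ((360, 255), (311, 255)),
           ((345, 245), (296, 245))]
print(object_position(matches))
```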

  14. Semi autonomous mine detection system

    NASA Astrophysics Data System (ADS)

    Few, Doug; Versteeg, Roelof; Herman, Herman

    2010-04-01

    CMMAD is a risk reduction effort for the AMDS program. As part of CMMAD, multiple instances of semi autonomous robotic mine detection systems were created. Each instance consists of a robotic vehicle equipped with sensors required for navigation and marking, countermine sensors and a number of integrated software packages which provide for real time processing of the countermine sensor data as well as integrated control of the robotic vehicle, the sensor actuator and the sensor. These systems were used to investigate critical interest functions (CIF) related to countermine robotic systems. To address the autonomy CIF, the INL developed RIK was extended to allow for interaction with a mine sensor processing code (MSPC). In limited field testing this system performed well in detecting, marking and avoiding both AT and AP mines. Based on the results of the CMMAD investigation we conclude that autonomous robotic mine detection is feasible. In addition, CMMAD contributed critical technical advances with regard to sensing, data processing and sensor manipulation, which will advance the performance of future fieldable systems. As a result, no substantial technical barriers exist which preclude - from an autonomous robotic perspective - the rapid development and deployment of fieldable systems.

  15. Semi autonomous mine detection system

    SciTech Connect

    Douglas Few; Roelof Versteeg; Herman Herman

    2010-04-01

    CMMAD is a risk reduction effort for the AMDS program. As part of CMMAD, multiple instances of semi autonomous robotic mine detection systems were created. Each instance consists of a robotic vehicle equipped with sensors required for navigation and marking, countermine sensors and a number of integrated software packages which provide for real time processing of the countermine sensor data as well as integrated control of the robotic vehicle, the sensor actuator and the sensor. These systems were used to investigate critical interest functions (CIF) related to countermine robotic systems. To address the autonomy CIF, the INL developed RIK was extended to allow for interaction with a mine sensor processing code (MSPC). In limited field testing this system performed well in detecting, marking and avoiding both AT and AP mines. Based on the results of the CMMAD investigation we conclude that autonomous robotic mine detection is feasible. In addition, CMMAD contributed critical technical advances with regard to sensing, data processing and sensor manipulation, which will advance the performance of future fieldable systems. As a result, no substantial technical barriers exist which preclude – from an autonomous robotic perspective – the rapid development and deployment of fieldable systems.

  16. Robotic Surveying

    SciTech Connect

    Suzy Cantor-McKinney; Michael Kruzic

    2007-03-01

    ZAPATA ENGINEERING challenged our engineers and scientists, which included robotics expertise from Carnegie Mellon University, to design a solution to meet our client's requirements for rapid digital geophysical and radiological data collection of a munitions test range with no down-range personnel. A prime concern of the project was to minimize exposure of personnel to unexploded ordnance and radiation. The field season was limited by extreme heat, cold and snow. Geographical Information System (GIS) tools were used throughout this project to accurately define the limits of mapped areas, build a common mapping platform from various client products, track production progress, allocate resources and relate subsurface geophysical information to geographical features for use in rapidly reacquiring targets for investigation. We were hopeful that our platform could meet the proposed 35 acres per day, towing both a geophysical package and a radiological monitoring trailer. We held our breath and crossed our fingers as the autonomous Speedrower began to crawl across the playa lakebed. We met our proposed production rate, and we averaged just less than 50 acres per 12-hour day using the autonomous platform with a path tracking error of less than +/- 4 inches. Our project team mapped over 1,800 acres in an 8-week (4 days per week) timeframe. The expertise of our partner, Carnegie Mellon University, was recently demonstrated when their two autonomous vehicle entries finished second and third at the 2005 Defense Advanced Research Projects Agency (DARPA) Grand Challenge. 'The Grand Challenge program was established to help foster the development of autonomous vehicle technology that will some day help save the lives of Americans who are protecting our country on the battlefield', said DARPA Grand Challenge Program Manager, Ron Kurjanowicz. Our autonomous remote-controlled vehicle (ARCV) was a modified New Holland 2550 Speedrower retrofitted to allow the machine

  17. Architecture of autonomous systems

    NASA Technical Reports Server (NTRS)

    Dikshit, Piyush; Guimaraes, Katia; Ramamurthy, Maya; Agrawala, Ashok; Larsen, Ronald L.

    1986-01-01

    Automation of Space Station functions and activities, particularly those involving robotic capabilities with interactive or supervisory human control, is a complex, multi-disciplinary systems design problem. A wide variety of applications using autonomous control can be found in the literature, but none of them seem to address the problem in general. All of them are designed with a specific application in mind. In this report, an abstract model is described which unifies the key concepts underlying the design of automated systems such as those studied by the aerospace contractors. The model has been kept as general as possible. The attempt is to capture all the key components of autonomous systems. With a little effort, it should be possible to map the functions of any specific autonomous system application to the model presented here.

  18. Architecture of autonomous systems

    NASA Technical Reports Server (NTRS)

    Dikshit, Piyush; Guimaraes, Katia; Ramamurthy, Maya; Agrawala, Ashok; Larsen, Ronald L.

    1989-01-01

    Automation of Space Station functions and activities, particularly those involving robotic capabilities with interactive or supervisory human control, is a complex, multi-disciplinary systems design problem. A wide variety of applications using autonomous control can be found in the literature, but none of them seem to address the problem in general. All of them are designed with a specific application in mind. In this report, an abstract model is described which unifies the key concepts underlying the design of automated systems such as those studied by the aerospace contractors. The model has been kept as general as possible. The attempt is to capture all the key components of autonomous systems. With a little effort, it should be possible to map the functions of any specific autonomous system application to the model presented here.

  19. Imitative Robotic Control: The Puppet Master

    DTIC Science & Technology

    2014-07-09

    such as individual joint control. EOD and bomb squads have shown interest in using a puppet control device to control existing robots, such as the...AUTONOMOUS GROUND SYSTEMS (AGS) TECHNICAL SESSION AUGUST 12-14, 2014 - NOVI, MICHIGAN IMITATIVE ROBOTIC CONTROL: THE PUPPET MASTER David Rusbarsky...When controlling a robot outside of autonomous mode, a good control device needs to give the user full control of the system while enabling the

  20. Military Space Robotics

    DTIC Science & Technology

    1987-04-30

    space will need to be designed for service by autonomous robots. Objects in space will have to be hardened for protection against directed-energy weapons, kinetic-energy weapons, and natural radiation. Extraterrestrial materials (ETM) will be preferable as a lower-cost alternative to earth-launched... [figure fragment: resources in orbit, robots in orbit, ability to utilize ETM to construct large orbital structures]