Science.gov

Sample records for outdoor autonomous robots

  1. An Adaptive Localization System for Outdoor/Indoor Navigation for Autonomous Robots

    DTIC Science & Technology

    2006-04-01

    An Adaptive Localization System for Outdoor/Indoor Navigation for Autonomous Robots. E.B. Pacis, B. Sights, G. Ahuja, G. Kogut, H.R. Everett. ...demonstrated a series of collaborative behaviors of multiple autonomous robots in a force-protection scenario. Stand-alone sensors detected intruder...

  2. Autonomous robot using infrared thermal camera to discriminate objects in outdoor scene

    NASA Technical Reports Server (NTRS)

    Caillas, C.

    1990-01-01

    A complete autonomous legged robot is being designed at Carnegie Mellon University to perform planetary exploration without human supervision. This robot must traverse unknown and geographically diverse areas in order to collect samples of materials. This paper describes how thermal imaging can be used to identify materials in order to find good footfall positions and collection sites. First, a model developed for determining the temperature of materials in an outdoor scene is presented. By applying this model, it is shown that it is possible to determine a physical characteristic of the material: its thermal inertia. Second, experimental results are described that consist of recording thermal images of an outdoor scene composed of sand and rock. Third, the results and limitations of applying the model to the experimental images are analyzed. Finally, the paper analyzes how basic segmentation algorithms can be combined with the thermal-inertia segmentation in order to improve the discrimination of different kinds of materials.
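
    The thermal-inertia idea the abstract builds on reduces to a simple formula. The sketch below computes P = sqrt(k·ρ·c) for handbook-style property values; the numbers are illustrative assumptions, not data from the paper.

```python
import math

def thermal_inertia(conductivity, density, specific_heat):
    """Thermal inertia P = sqrt(k * rho * c), in J m^-2 K^-1 s^-1/2."""
    return math.sqrt(conductivity * density * specific_heat)

# Representative handbook-style values (illustrative, not from the paper):
# dry sand:  k = 0.3 W/mK, rho = 1600 kg/m^3, c = 800 J/kgK
# basalt:    k = 2.0 W/mK, rho = 2900 kg/m^3, c = 840 J/kgK
sand = thermal_inertia(0.3, 1600, 800)
rock = thermal_inertia(2.0, 2900, 840)
# Rock's higher thermal inertia means smaller diurnal temperature swings,
# which is what lets a thermal camera separate the two materials.
```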

  3. An adaptive localization system for outdoor/indoor navigation for autonomous robots

    NASA Astrophysics Data System (ADS)

    Pacis, E. B.; Sights, B.; Ahuja, G.; Kogut, G.; Everett, H. R.

    2006-05-01

    Many envisioned applications of mobile robotic systems require the robot to navigate in complex urban environments. This need is particularly critical if the robot is to perform as part of a synergistic team with human forces in military operations. Historically, the development of autonomous navigation for mobile robots has targeted either outdoor or indoor scenarios, but not both, which is not how humans operate. This paper describes efforts to fuse component technologies into a complete navigation system, allowing a robot to seamlessly transition between outdoor and indoor environments. Under the Joint Robotics Program's Technology Transfer project, empirical evaluations of various localization approaches were conducted to assess their maturity levels and performance metrics in different exterior/interior settings. The methodologies compared include Markov localization, global positioning system, Kalman filtering, and fuzzy-logic. Characterization of these technologies highlighted their best features, which were then fused into an adaptive solution. A description of the final integrated system is discussed, including a presentation of the design, experimental results, and a formal demonstration to attendees of the Unmanned Systems Capabilities Conference II in San Diego in December 2005.
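
    The outdoor/indoor hand-off described above can be illustrated with a toy version of one of the fused methods. The sketch below is a minimal 1-D Kalman filter, with illustrative noise variances, that dead-reckons on odometry and folds in an absolute fix (e.g. GPS) only when one is available; it is not the paper's actual integrated system.

```python
def kalman_1d(x, P, u, z, Q=0.1, R=1.0):
    """One predict/update cycle of a 1-D Kalman filter.

    x, P : prior state estimate and its variance
    u    : odometry displacement since the last step
    z    : absolute position fix (e.g. GPS), or None when indoors
    Q, R : process and measurement noise variances (tuning assumptions)
    """
    # Predict: dead-reckon forward; variance grows by the process noise.
    x, P = x + u, P + Q
    # Update: fuse the absolute fix when one is available.
    if z is not None:
        K = P / (P + R)          # Kalman gain
        x = x + K * (z - x)
        P = (1.0 - K) * P
    return x, P

x, P = 0.0, 1.0
x, P = kalman_1d(x, P, u=1.0, z=1.2)   # outdoors: GPS fix available
x, P = kalman_1d(x, P, u=1.0, z=None)  # indoors: dead reckoning only
```

Indoors the variance only grows, which is exactly why an adaptive system would switch to another localization source there.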

  4. Robotic Lander Completes Multiple Outdoor Flights

    NASA Image and Video Library

    NASA’s Robotic Lander Development Project in Huntsville, Ala., has successfully completed seven autonomous outdoor flight tests of a lander prototype, dubbed Mighty Eagle. On Oct. 14, Mighty Eagl...

  5. Miniaturized autonomous robot

    NASA Astrophysics Data System (ADS)

    Ishihara, Hidenori; Fukuda, Toshio

    1998-01-01

    Many projects developing miniaturized autonomous robots have been carried out around the world. This paper describes our work toward such a robot, where a miniaturized autonomous robot is defined as a miniaturized closed-loop system with a microprocessor, microactuators, and microsensors. We have developed the micro autonomous robotic system (MARS), consisting of a microprocessor, microsensors, microactuators, communication units, and batteries. The MARS controls itself using a program downloaded through its IR communication system. In this paper, we demonstrate several capabilities of the MARS and discuss the properties of miniaturized autonomous robots.

  6. Micro autonomous robotic system

    NASA Astrophysics Data System (ADS)

    Ishihara, Hidenori; Fukuda, Toshio

    1995-12-01

    This paper proposes a structure for the micro autonomous robotic system and presents the design of a prototype. We aim to develop a micro robot that acts autonomously based on its own sensing, as a step toward constituting the micro autonomous robotic system. As the robot is miniaturized, however, the number of sensors it can carry is restricted and the information available from them becomes sparse, which makes it difficult to realize high-quality intelligence. The micro robotic system therefore needs simple control algorithms. In this paper, we propose simple logical algorithms to control the actuators and show the performance of micro robots controlled by them: a Micro Line Trace Robot, about 1 cm cube in size, which moves along a black line on a white ground; and a programmable micro autonomous robot, about 2 cm cube, which performs according to an arbitrary downloaded program.

  7. Experiments in autonomous robotics

    SciTech Connect

    Hamel, W.R.

    1987-01-01

    The Center for Engineering Systems Advanced Research (CESAR) is performing basic research in autonomous robotics for energy-related applications in hazardous environments. The CESAR research agenda includes a strong experimental component to assure practical evaluation of new concepts and theories. An evolutionary sequence of mobile research robots has been planned to support research in robot navigation, world sensing, and object manipulation. A number of experiments have been performed in studying robot navigation and path planning with planar sonar sensing. Future experiments will address more complex tasks involving three-dimensional sensing, dexterous manipulation, and human-scale operations.

  8. Autonomous mobile robot teams

    NASA Technical Reports Server (NTRS)

    Agah, Arvin; Bekey, George A.

    1994-01-01

    This paper describes autonomous mobile robot teams performing tasks in unstructured environments. The behavior and the intelligence of the group is distributed, and the system does not include a central command base or leader. The novel concept of the Tropism-Based Cognitive Architecture is introduced, which is used by the robots in order to produce behavior transforming their sensory information to proper action. The results of a number of simulation experiments are presented. These experiments include worlds where the robot teams must locate, decompose, and gather objects, and defend themselves against hostile predators, while navigating around stationary and mobile obstacles.

  9. Demonstration of autonomous air monitoring through robotics

    SciTech Connect

    Rancatore, R.

    1989-11-01

    The project modified an existing teleoperated robot to add autonomous navigation, large-object avoidance, and air monitoring, and demonstrated the prototype robot system in indoor and outdoor environments. The robot was also fitted with an HNU PI-101 photoionization detector for air monitoring. A sonar range finder, already an integral part of the Surveyor, was repositioned to the front of the robot chassis to detect large obstacles in the robot's path. The software of the onboard computer was also extensively modified to provide navigation control, dynamic steering to smoothly follow the wire course without hesitation, obstacle avoidance, autonomous shutdown, and remote reporting of toxic-substance detection.

  10. Autonomous detection of indoor and outdoor signs

    NASA Astrophysics Data System (ADS)

    Holden, Steven; Snorrason, Magnus; Goodsell, Thomas; Stevens, Mark R.

    2005-05-01

    Most goal-oriented mobile robot tasks involve navigation to one or more known locations. This is generally done using GPS coordinates and landmarks outdoors, or wall-following and fiducial marks indoors. Such approaches ignore the rich source of navigation information that is already in place for human navigation in all man-made environments: signs. A mobile robot capable of detecting and reading arbitrary signs could be tasked using directions that are intuitive to humans, and it could report its location relative to intuitive landmarks (a street corner, a person's office, etc.). Such ability would not require active marking of the environment and would be functional in the absence of GPS. In this paper we present an updated version of a system we call Sign Understanding in Support of Autonomous Navigation (SUSAN). This system relies on cues common to most signs: the presence of text, vivid color, and compact shape. By not relying on templates, SUSAN can detect a wide variety of signs: traffic signs, street signs, store-name signs, building directories, room signs, etc. In this paper we focus on the text-detection capability. We present results summarizing probability of detection and false-alarm rate across many scenes containing signs of very different designs and in a variety of lighting conditions.
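
    The "vivid color" cue can be sketched in a few lines of NumPy. This is a minimal illustration of saturation-based candidate masking, with made-up thresholds; it is not SUSAN's actual detector.

```python
import numpy as np

def vivid_color_mask(rgb, sat_thresh=0.5, val_thresh=0.3):
    """Flag pixels whose color is vivid enough to be a sign candidate.

    Uses saturation S = (max - min) / max and brightness V = max over the
    RGB channels; both thresholds are illustrative assumptions.
    """
    rgb = rgb.astype(float) / 255.0
    v = rgb.max(axis=-1)
    spread = v - rgb.min(axis=-1)
    s = np.where(v > 0, spread / np.maximum(v, 1e-9), 0.0)
    return (s > sat_thresh) & (v > val_thresh)

img = np.zeros((2, 2, 3), dtype=np.uint8)
img[0, 0] = (200, 30, 30)    # vivid red: sign-like
img[1, 1] = (120, 120, 120)  # gray: pavement-like, zero saturation
mask = vivid_color_mask(img)
```

A real pipeline would follow the mask with connected-component analysis and the text and shape cues the paper describes.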

  11. Cooperative Autonomous Robots for Reconnaissance

    DTIC Science & Technology

    2009-03-06

    Cooperative Autonomous Robots for Reconnaissance. Collaborating mobile robots equipped with WiFi transceivers are configured as a mobile ad-hoc network. Algorithms are developed to take advantage of the distributed processing...

  12. GRACE and GEORGE: Autonomous Robots for the AAAI Robot Challenge

    DTIC Science & Technology

    2004-01-01

    GRACE and GEORGE: Autonomous Robots for the AAAI Robot Challenge. Reid Simmons, Allison Bruce, Dani Goldberg, Adam Goode, Michael Montemerlo, Nicholas...

  13. Autonomous mobile robot

    SciTech Connect

    Mattaboni, P.J.

    1987-01-20

    This patent describes a mobile robot of the type having (a) a vision system, (b) memory means for storing data derived from the robot vision system, and (c) a computer for processing data derived from the robot's vision system, the improvement wherein the robot's vision system comprises (i) a first array of ranging transducers for obtaining data on the position and distance of far objects in a volume of space, the transducers of the first array being symmetrically disposed on the mobile robot with respect to an axis of symmetry within the mobile robot. Each transducer of the first array is fixed in position with respect to that axis of symmetry and sees a portion of the volume of space seen by its entire array; (ii) a second array of ranging transducers for obtaining data on the position and distance of near objects in the same or an overlapping volume of space, the transducers of the second array being symmetrically disposed on the mobile robot with respect to the axis of symmetry. Each transducer of the second array is fixed in position with respect to the axis of symmetry and sees a portion of the volume of space seen by its entire array, the angle of view of the transducers of the second array being different from the angle of view of the transducers of the first array with respect to the same object in space; and (iii) means for polling the ranging transducers in sequences determined by the computer.

  14. Open multiagent architecture extended to distributed autonomous robotic systems

    NASA Astrophysics Data System (ADS)

    Sellem, Philippe; Amram, Eric; Luzeaux, Dominique

    2000-07-01

    Our research deals with the design and experimental evaluation of a control architecture for an autonomous outdoor mobile robot that uses mainly vision for perception. For this single-robot case, we designed a hybrid architecture with an attention mechanism that allows dynamic selection of perception processes. Building on this work, we have developed an open multi-agent architecture for standard multi-task operating systems, using the C++ programming language and POSIX threads. Our implementation features efficient and fully generic messages between agents, automatic acknowledgement receipts, and built-in synchronization capabilities. Knowledge is distributed among robots according to a collaborative scheme: every robot builds its own representation of the world and shares it with the others. Pieces of information are exchanged when decisions have to be made. Experiments are to be conducted with two outdoor ActiveMedia Pioneer AT mobile robots. Distributed perception, using mainly vision but also ultrasound, will serve as proof of concept.

  15. Autonomous mobile robots: Vehicles with cognitive control

    SciTech Connect

    Meystel, A.

    1987-01-01

    This book explores a new, rapidly developing area of robotics. It describes state-of-the-art intelligent control, applied machine intelligence, and the research and initial stages of manufacturing of autonomous mobile robots. A complete account of the theoretical and experimental results obtained during the last two decades, together with some generalizations on autonomous mobile systems, is included. Contents: Introduction; Requirements and Specifications; State-of-the-Art in the Autonomous Mobile Robots Area; Structure of an Intelligent Mobile Autonomous System; Planner; Navigator; Pilot; Cartographer; Actuation Control; Computer Simulation of Autonomous Operation; Testing the Autonomous Mobile Robot; Conclusions; Bibliography.

  16. Autonomous caregiver following robotic wheelchair

    NASA Astrophysics Data System (ADS)

    Ratnam, E. Venkata; Sivaramalingam, Sethurajan; Vignesh, A. Sri; Vasanth, Elanthendral; Joans, S. Mary

    2011-12-01

    In the last decade, a variety of robotic/intelligent wheelchairs have been proposed to meet the needs of an aging society. Their main research topics are autonomous functions, such as moving toward a goal while avoiding obstacles, and user-friendly interfaces. Although it is desirable for wheelchair users to go out alone, caregivers often accompany them, so we have to consider not only autonomous functions and user interfaces but also how to reduce caregivers' load and support their activities in terms of communication. From this point of view, we have proposed a robotic wheelchair that moves side by side with a caregiver, based on MATLAB processing. In this project the wheelchair follows a caregiver using a microcontroller, an ultrasonic sensor, a keypad, and motor drivers. Images are captured by a camera interfaced with the DM6437 (DaVinci code processor), processed using image-processing techniques, converted into voltage levels through a MAX232 level converter, and passed serially to the microcontroller unit, while the ultrasonic sensor detects obstacles in front of the robot. A mode-selection switch chooses between automatic control, which uses the ultrasonic sensor to find obstacles, and manual control, which uses the keypad to operate the wheelchair. The microcontroller runs predefined C code that controls the robot, and the motor drivers act as switches that turn the motors on and off according to the microcontroller's commands.

  17. Miniature Autonomous Robotic Vehicle (MARV)

    SciTech Connect

    Feddema, J.T.; Kwok, K.S.; Driessen, B.J.; Spletzer, B.L.; Weber, T.M.

    1996-12-31

    Sandia National Laboratories (SNL) has recently developed a 16 cm³ (1 in³) autonomous robotic vehicle capable of tracking a single conducting wire carrying a 96 kHz signal. This vehicle was developed to assess the limiting factors in using commercial technology to build miniature autonomous vehicles. Particular attention was paid to the design of the control system to search out the wire, track it, and recover if the wire was lost. This paper describes the test vehicle and the control analysis. Presented in the paper are the vehicle model, control laws, a stability analysis, simulation studies, and experimental results.

  18. Autonomous Robot Skill Acquisition

    DTIC Science & Technology

    2011-05-01

    ...the ability to collect and exploit previous experience to become able to solve harder and harder problems over time with less and less cognitive effort... the amount of experience required to learn a reasonable policy from scratch in most interesting domains is unrealistic for robots operating in the real world...

  19. Vision-Based Real-Time Traversable Region Detection for Mobile Robot in the Outdoors.

    PubMed

    Deng, Fucheng; Zhu, Xiaorui; He, Chao

    2017-09-13

    Environment perception is essential for autonomous mobile robots in human-robot coexisting outdoor environments. One of the important tasks for such intelligent robots is to autonomously detect the traversable region in an unstructured 3D real world. The main drawback of most existing methods is their high computational complexity. Hence, this paper proposes a binocular-vision-based, real-time solution for detecting the traversable region outdoors. In the proposed method, an appearance model based on a multivariate Gaussian is quickly constructed from a sample region in the left image, adaptively determined by the vanishing point and dominant borders. Then, a fast, self-supervised segmentation scheme is proposed to classify the traversable and non-traversable regions. The proposed method is evaluated on public datasets as well as on a real mobile robot. Implementation on the mobile robot has demonstrated its ability in real-time navigation applications.
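
    The appearance-model step described above, fitting a multivariate Gaussian to a sample road patch and then classifying pixels by their distance to it, can be sketched as follows. The threshold and synthetic data are assumptions for illustration, not the paper's parameters.

```python
import numpy as np

def fit_appearance_model(samples):
    """Fit a multivariate Gaussian to color samples from the assumed-road patch."""
    mu = samples.mean(axis=0)
    cov = np.cov(samples, rowvar=False) + 1e-6 * np.eye(samples.shape[1])
    return mu, np.linalg.inv(cov)

def traversable(pixels, mu, cov_inv, thresh=9.0):
    """Classify pixels by squared Mahalanobis distance to the road model.

    thresh=9.0 (a 3-sigma cut) is an illustrative choice, not the paper's value.
    """
    d = pixels - mu
    m2 = np.einsum('ij,jk,ik->i', d, cov_inv, d)
    return m2 < thresh

rng = np.random.default_rng(0)
road = rng.normal([120, 110, 100], 5.0, size=(200, 3))  # synthetic road patch
mu, cov_inv = fit_appearance_model(road)
test_pixels = np.array([[121.0, 109.0, 101.0],   # road-like color
                        [30.0, 160.0, 40.0]])    # grass-like color
labels = traversable(test_pixels, mu, cov_inv)
```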

  20. A power autonomous monopedal robot

    NASA Astrophysics Data System (ADS)

    Krupp, Benjamin T.; Pratt, Jerry E.

    2006-05-01

    We present the design and initial results of a power-autonomous planar monopedal robot. The robot is a gasoline powered, two degree of freedom robot that runs in a circle, constrained by a boom. The robot uses hydraulic Series Elastic Actuators, force-controllable actuators which provide high force fidelity, moderate bandwidth, and low impedance. The actuators are mounted in the body of the robot, with cable drives transmitting power to the hip and knee joints of the leg. A two-stroke, gasoline engine drives a constant displacement pump which pressurizes an accumulator. Absolute position and spring deflection of each of the Series Elastic Actuators are measured using linear encoders. The spring deflection is translated into force output and compared to desired force in a closed loop force-control algorithm implemented in software. The output signal of each force controller drives high performance servo valves which control flow to each of the pistons of the actuators. In designing the robot, we used a simulation-based iterative design approach. Preliminary estimates of the robot's physical parameters were based on past experience and used to create a physically realistic simulation model of the robot. Next, a control algorithm was implemented in simulation to produce planar hopping. Using the joint power requirements and range of motions from simulation, we worked backward specifying pulley diameter, piston diameter and stroke, hydraulic pressure and flow, servo valve flow and bandwidth, gear pump flow, and engine power requirements. Components that meet or exceed these specifications were chosen and integrated into the robot design. Using CAD software, we calculated the physical parameters of the robot design, replaced the original estimates with the CAD estimates, and produced new joint power requirements. We iterated on this process, resulting in a design which was prototyped and tested. The Monopod currently runs at approximately 1.2 m/s with the weight of all...
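
    The force-control loop described above (spring deflection to force estimate, then a closed-loop command to the servo valves) can be sketched as a simple PI controller. The spring constant and gains below are illustrative assumptions, not the Monopod's values.

```python
def sea_force_controller(k_spring, kp, ki, dt):
    """Closed-loop force controller for a Series Elastic Actuator (sketch).

    Force is inferred from spring deflection (F = k * x), compared to the
    desired force, and a PI law produces the servo-valve drive signal.
    """
    integral = 0.0
    def step(deflection, f_desired):
        nonlocal integral
        f_measured = k_spring * deflection   # spring deflection -> force
        error = f_desired - f_measured
        integral += error * dt
        return kp * error + ki * integral    # valve drive signal
    return step

# Hypothetical numbers: 50 kN/m spring, 2 mm deflection = 100 N measured.
ctrl = sea_force_controller(k_spring=50e3, kp=0.01, ki=0.1, dt=0.001)
cmd = ctrl(deflection=0.002, f_desired=150.0)
```

Measuring force through a spring is the design choice that gives the actuator its low output impedance: the controller only has to servo a deflection, not fight the load directly.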

  1. Structured control for autonomous robots

    SciTech Connect

    Simmons, R.G. (School of Computer Science)

    1994-02-01

    To operate in rich, dynamic environments, autonomous robots must be able to effectively utilize and coordinate their limited physical and computational resources. As complexity increases, it becomes necessary to impose explicit constraints on the control of planning, perception, and action to ensure that unwanted interactions between behaviors do not occur. This paper advocates developing complex robot systems by layering reactive behaviors onto deliberative components. In this structured control approach, the deliberative components handle normal situations and the reactive behaviors, which are explicitly constrained as to when and how they are activated, handle exceptional situations. The Task Control Architecture (TCA) has been developed to support this approach. TCA provides an integrated set of control constructs useful for implementing deliberative and reactive behaviors. The control constructs facilitate modular and evolutionary system development: they are used to integrate and coordinate planning, perception, and execution, and to incrementally improve the efficiency and robustness of the robot systems. To date, TCA has been used in implementing a half-dozen mobile robot systems, including an autonomous six-legged rover and an indoor mobile manipulator.

  2. [Mobile autonomous robots-Possibilities and limits].

    PubMed

    Maehle, E; Brockmann, W; Walthelm, A

    2002-02-01

    Besides industrial robots, which today are firmly established in production processes, service robots are becoming more and more important. They are intended to provide services for humans in different areas of their professional and everyday environments, including medicine. Most of these service robots are mobile, which requires intelligent autonomous behaviour. After characterising the different kinds of robots, this paper critically discusses the relevant paradigms of intelligent autonomous behaviour for mobile robots, illustrated by three concrete examples of robots realized in Lübeck. In addition, a short survey of current surgical robots and an outlook on future developments are given.

  3. Lethality and Autonomous Robots: An Ethical Stance

    DTIC Science & Technology

    2007-01-01

    Lethality and Autonomous Robots: An Ethical Stance. Ronald C. Arkin and Lilia Moshkina, College of Computing, Georgia Institute of Technology, Atlanta... autonomous robots that maintain an ethical infrastructure to govern their behavior will be referred to as humane-oids...

  4. Autonomous Robotic Inspection in Tunnels

    NASA Astrophysics Data System (ADS)

    Protopapadakis, E.; Stentoumis, C.; Doulamis, N.; Doulamis, A.; Loupos, K.; Makantasis, K.; Kopsiaftis, G.; Amditis, A.

    2016-06-01

    In this paper, an automatic robotic inspector for tunnel assessment is presented. The proposed platform is able to autonomously navigate within civil infrastructure, grab stereo images, and process/analyse them in order to identify defect types. First, cracks are detected via deep-learning approaches. Then, a detailed 3D model of the cracked area is created utilizing photogrammetric methods. Finally, laser profiling of the tunnel's lining is performed for a narrow region close to the detected crack, allowing potential deformations to be deduced. The robotic platform consists of an autonomous mobile vehicle and a crane arm, guided by the computer-vision-based crack detector, carrying the ultrasound sensors, the stereo cameras, and the laser scanner. Visual inspection is based on convolutional neural networks, which support the creation of high-level discriminative features for complex non-linear pattern classification. Real-time 3D information is then accurately calculated, and the crack position and orientation are passed to the robotic platform. The entire system has been evaluated in railway and road tunnels, namely on the Egnatia Highway and in London Underground infrastructure.

  5. Three-dimensional vision sensors for autonomous robots

    NASA Astrophysics Data System (ADS)

    Uchiyama, Takashi; Okabayashi, Keizyu; Wakitani, Jun

    1993-09-01

    A three-dimensional measurement system, which is important for developing autonomous robots, is described. Industrial robots used in today's plants are of the preprogrammed teaching-playback type. It is necessary to develop autonomous robots which can work based on sensor information for intelligent manufacturing systems. Moreover, practical use of robots which work in unstructured environments, such as outdoors and in space, is expected. To realize this, a function to measure objects and the environment three-dimensionally is a key technology. Additional important requirements for robotic sensors are real-time processing and compactness. We have developed smart 3-D vision sensors for the purpose of realizing autonomous robots. These are two kinds of sensors with different functions corresponding to the application. One is a slitted-light range finder (SLRF) to measure stationary objects. The other is a real-time tracking vision (RTTV) system which can measure moving objects at high speed. The SLRF uses multiple slitted lights generated by a semiconductor laser through an interference filter and a cylindrical lens. Furthermore, we developed a liquid-crystal shutter with multiple electrodes; by placing this shutter in front of the light source, we devised a technique for producing coded slitted light. As a result, using the principle of triangulation, objects can be measured in three dimensions. In addition, high-speed image input was enabled by projecting the multiple slitted lights at the same time. We have confirmed the effectiveness of the SLRF applied to a hand-eye system using a robot.
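
    The triangulation principle the SLRF relies on can be written down directly: the camera ray and the slit-light plane intersect at a depth fixed by the baseline and the two angles. The function below is a generic two-ray triangulation sketch, not the sensor's actual calibration model.

```python
import math

def triangulate_depth(baseline, cam_angle, slit_angle):
    """Depth of a lit point by triangulation.

    With the camera at the origin and the projector a `baseline` away along
    the x-axis, a point P = (x, z) satisfies x = z/tan(cam_angle) and
    baseline - x = z/tan(slit_angle), which solves to the formula below.

    baseline   : camera-to-projector separation (m)
    cam_angle  : camera ray angle, measured from the baseline (rad)
    slit_angle : slit-light projection angle from the baseline (rad)
    """
    ta, tb = math.tan(cam_angle), math.tan(slit_angle)
    return baseline * ta * tb / (ta + tb)

# Symmetric 45-degree rays over a 0.2 m baseline meet 0.1 m away.
z = triangulate_depth(0.2, math.radians(45), math.radians(45))
```

Coding the slits (as with the liquid-crystal shutter described above) just disambiguates which projected plane each lit pixel belongs to, so the same formula applies per slit.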

  6. Spatial abstraction for autonomous robot navigation.

    PubMed

    Epstein, Susan L; Aroor, Anoop; Evanusa, Matthew; Sklar, Elizabeth I; Parsons, Simon

    2015-09-01

    Optimal navigation for a simulated robot relies on a detailed map and explicit path planning, an approach problematic for real-world robots that are subject to noise and error. This paper reports on autonomous robots that rely on local spatial perception, learning, and commonsense rationales instead. Despite realistic actuator error, learned spatial abstractions form a model that supports effective travel.

  7. A Biologically-Inspired Autonomous Robot

    DTIC Science & Technology

    1993-12-13

    Performance report for Grant N00014-90-J-1545, A Biologically-Inspired Autonomous Robot (period of performance: 3...). ...a rough estimate of the torque generated by the electrical activation of the muscle during the movement. The previous simulation of the robot has been... reaction forces for the robot that share features with Full's force measurements of cockroach walking. The 18 motor-driver circuits for the robot have...

  8. Control of autonomous robot using neural networks

    NASA Astrophysics Data System (ADS)

    Barton, Adam; Volna, Eva

    2017-07-01

    The aim of the article is to design a method of controlling an autonomous robot using artificial neural networks. The introductory part describes control issues from the perspective of autonomous robot navigation and surveys current mobile robots controlled by neural networks. The core of the article is the design of the controlling neural network, along with generation and filtering of the training set using ART1 (Adaptive Resonance Theory). The outcome of the practical part is an assembled Lego Mindstorms EV3 robot that solves the problem of avoiding obstacles in space. To verify models of autonomous robot behavior, a set of experiments was created, along with evaluation criteria. The speed of each motor was adjusted by the controlling neural network with respect to the situation the robot encountered.

  9. Autonomous robot vision software design using Matlab toolboxes

    NASA Astrophysics Data System (ADS)

    Tedder, Maurice; Chung, Chan-Jin

    2004-10-01

    The purpose of this paper is to introduce a cost-effective way to design robot vision and control software using Matlab for an autonomous robot designed to compete in the 2004 Intelligent Ground Vehicle Competition (IGVC). The goal of the autonomous challenge event is for the robot to autonomously navigate an outdoor obstacle course bounded by solid and dashed lines on the ground. Visual input data is provided by a DV camcorder at 160 x 120 pixel resolution. The design of this system involved writing an image-processing algorithm using hue, saturation, and brightness (HSB) color filtering and Matlab image-processing functions to extract the centroid, area, and orientation of the connected regions from the scene. These feature vectors are then mapped to linguistic variables that describe the objects in the world environment model. The linguistic variables act as inputs to a fuzzy logic controller designed using the Matlab fuzzy logic toolbox, which provides the knowledge and intelligence component necessary to achieve the desired goal. Java provides the central interface to the robot motion control and image acquisition components. Field test results indicate that the Matlab-based solution allows for rapid software design, development, and modification of our robot system.

  10. Hierarchical loop detection for mobile outdoor robots

    NASA Astrophysics Data System (ADS)

    Lang, Dagmar; Winkens, Christian; Häselich, Marcel; Paulus, Dietrich

    2012-01-01

    Loop closing is a fundamental part of 3D simultaneous localization and mapping (SLAM) that can greatly enhance the quality of long-term mapping. It is essential for the creation of globally consistent maps. Conceptually, loop closing is divided into detection and optimization. Recent approaches depend on a single sensor to recognize previously visited places in the loop detection stage. In this study, we combine data of multiple sensors such as GPS, vision, and laser range data to enhance detection results in repetitively changing environments that are not sufficiently explained by a single sensor. We present a fast and robust hierarchical loop detection algorithm for outdoor robots to achieve a reliable environment representation even if one or more sensors fail.

  11. Tele-robotic/autonomous control using controlshell

    SciTech Connect

    Wilhelmsen, K.C.; Hurd, R.L.; Couture, S.

    1996-12-10

    A tele-robotic and autonomous controller architecture for waste handling and sorting has been developed which uses tele-robotics, autonomous grasping and image processing. As a starting point, prior work from LLNL and ORNL was restructured and ported to a special real-time development environment. Significant improvements in collision avoidance, force compliance, and shared control aspects were then developed. Several orders of magnitude improvement were made in some areas to meet the speed and robustness requirements of the application.

  12. Control algorithms for autonomous robot navigation

    SciTech Connect

    Jorgensen, C.C.

    1985-09-20

    This paper examines control algorithm requirements for autonomous robot navigation outside laboratory environments. Three aspects of navigation are considered: navigation control in explored terrain, environment interactions with robot sensors, and navigation control in unanticipated situations. Major navigation methods are presented and relevance of traditional human learning theory is discussed. A new navigation technique linking graph theory and incidental learning is introduced.

  13. Automatic learning by an autonomous mobile robot

    SciTech Connect

    de Saussure, G.; Spelt, P.F.; Killough, S.M.; Pin, F.G.; Weisbin, C.R.

    1989-01-01

    This paper describes recent research in automatic learning by the autonomous mobile robot HERMIES-IIB at the Center for Engineering Systems Advanced Research (CESAR). By acting on the environment and observing the consequences during a set of training examples, the robot learns a sequence of successful manipulations on a simulated control panel. The robot learns to classify panel configurations in order to deal with new configurations that are not part of the original training set. 5 refs., 2 figs.

  14. Tele/Autonomous Robot For Nuclear Facilities

    NASA Technical Reports Server (NTRS)

    Backes, Paul G.; Tso, Kam S.

    1994-01-01

    Fail-safe tele/autonomous robotic system makes it unnecessary for human technicians to enter nuclear-fuel-reprocessing facilities and other high-radiation or otherwise hazardous industrial environments. Used to carry out such tasks as exchanging equipment modules, turning bolts, cleaning surfaces, and grappling and turning objects by use of a mixture of autonomous actions and teleoperation with either a single arm or two cooperating arms. System capable of fully autonomous operation, teleoperation, or shared control.

  16. Autonomous Student Experiences in Outdoor and Adventure Education

    ERIC Educational Resources Information Center

    Daniel, Brad; Bobilya, Andrew J.; Kalisch, Kenneth R.; McAvoy, Leo H.

    2014-01-01

    This article explores the current state of knowledge regarding the use of autonomous student experiences (ASE) in outdoor and adventure education (OAE) programs. ASE are defined as components (e.g., solo, final expedition) in which participants have a greater measure of choice and control over the planning, execution, and outcomes of their…

  18. Mapping planetary caves with an autonomous, heterogeneous robot team

    NASA Astrophysics Data System (ADS)

    Husain, Ammar; Jones, Heather; Kannan, Balajee; Wong, Uland; Pimentel, Tiago; Tang, Sarah; Daftry, Shreyansh; Huber, Steven; Whittaker, William L.

    Caves on other planetary bodies offer sheltered habitat for future human explorers and numerous clues to a planet's past for scientists. While recent orbital imagery provides exciting new details about cave entrances on the Moon and Mars, the interiors of these caves are still unknown and not observable from orbit. Multi-robot teams offer unique solutions for exploring and modeling subsurface voids during precursor missions. Robot teams that are diverse in terms of size, mobility, sensing, and capability can provide great advantages, but this diversity, coupled with inherently distinct low-level behavior architectures, makes coordination a challenge. This paper presents a framework that consists of an autonomous frontier and capability-based task generator, a distributed market-based strategy for coordinating and allocating tasks to the different team members, and a communication paradigm for seamless interaction between the different robots in the system. Robots have different sensors (in the representative robot team used for testing: 2D mapping sensors, 3D modeling sensors, or no exteroceptive sensors) and varying levels of mobility. Tasks are generated to explore, model, and take science samples. Based on an individual robot's capability and associated cost for executing a generated task, a robot is autonomously selected for task execution. The robots create coarse online maps and store collected data for high resolution offline modeling. The coordination approach has been field tested at a mock cave site with highly-unstructured natural terrain, as well as an outdoor patio area. Initial results are promising for applicability of the proposed multi-robot framework to exploration and modeling of planetary caves.
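
The capability- and cost-based selection described above can be sketched as a greedy market-style auction. This is an illustrative sketch, not the paper's allocation strategy: the `name`, `capabilities`, `cost`, and `needs` fields are made-up stand-ins for the real task and robot descriptions.

```python
def allocate_tasks(tasks, robots):
    """Greedy market-style allocation: each task is auctioned and goes to
    the capable robot bidding the lowest cost (hypothetical data model)."""
    assignment = {}
    for task in tasks:
        bids = [(robot["cost"](task), robot["name"])
                for robot in robots
                if task["needs"] in robot["capabilities"]]   # capability filter
        if bids:
            assignment[task["name"]] = min(bids)[1]          # lowest bid wins
    return assignment
```

A distributed version would exchange the bids over the team's communication layer instead of collecting them in one loop, but the winner-selection rule is the same.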

  19. Reference test courses for autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Jacoff, Adam; Messina, Elena; Evans, John

    2001-09-01

    One approach to measuring the performance of intelligent systems is to develop standardized or reproducible tests. These tests may be in a simulated environment or in a physical test course. The National Institute of Standards and Technology has developed a test course for evaluating the performance of mobile autonomous robots operating in an urban search and rescue mission. The test course is designed to simulate a collapsed building structure at various levels of fidelity. The course will be used in robotic competitions, such as the American Association for Artificial Intelligence (AAAI) Mobile Robot Competition and the RoboCup Rescue. Designed to be repeatable and highly reconfigurable, the test course challenges a robot's cognitive capabilities such as perception, knowledge representation, planning, autonomy and collaboration. The goal of the test course is to help define useful performance metrics for autonomous mobile robots which, if widely accepted, could accelerate development of advanced robotic capabilities by promoting the re-use of algorithms and system components. The course may also serve as a prototype for further development of performance testing environments which enable robot developers and purchasers to objectively evaluate robots for a particular application. In this paper we discuss performance metrics for autonomous mobile robots, the use of representative urban search and rescue scenarios as a challenge domain, and the design criteria for the test course.

  20. INL Autonomous Navigation System

    SciTech Connect

    2005-03-30

    The INL Autonomous Navigation System provides instructions for autonomously navigating a robot. The system permits high-speed autonomous navigation including obstacle avoidance, waypoint navigation, and path planning in both indoor and outdoor environments.

  1. Autonomous Navigation for Mobile Robots with Human-Robot Interaction

    NASA Astrophysics Data System (ADS)

    Ballantyne, James; Johns, Edward; Valibeik, Salman; Wong, Charence; Yang, Guang-Zhong

    Dynamic and complex indoor environments present a challenge for mobile robot navigation. The robot must be able to simultaneously map the environment, which often has repetitive features, whilst keeping track of its pose and location. This chapter introduces some of the key considerations for human-guided navigation. Rather than letting the robot explore the environment fully autonomously, we consider the use of human guidance for progressively building up the environment map and establishing scene association, learning, as well as navigation and planning. After the guide has taken the robot through the environment and indicated the points of interest via hand gestures, the robot is then able to use the geometric map and scene descriptors captured during the tour to create a high-level plan for subsequent autonomous navigation within the environment. Issues related to gesture recognition, multi-cue integration, tracking, target pursuing, scene association and navigation planning are discussed.

  2. Supervised autonomous robotic soft tissue surgery.

    PubMed

    Shademan, Azad; Decker, Ryan S; Opfermann, Justin D; Leonard, Simon; Krieger, Axel; Kim, Peter C W

    2016-05-04

    The current paradigm of robot-assisted surgeries (RASs) depends entirely on an individual surgeon's manual capability. Autonomous robotic surgery-removing the surgeon's hands-promises enhanced efficacy, safety, and improved access to optimized surgical techniques. Surgeries involving soft tissue have not been performed autonomously because of technological limitations, including lack of vision systems that can distinguish and track the target tissues in dynamic surgical environments and lack of intelligent algorithms that can execute complex surgical tasks. We demonstrate in vivo supervised autonomous soft tissue surgery in an open surgical setting, enabled by a plenoptic three-dimensional and near-infrared fluorescent (NIRF) imaging system and an autonomous suturing algorithm. Inspired by the best human surgical practices, a computer program generates a plan to complete complex surgical tasks on deformable soft tissue, such as suturing and intestinal anastomosis. We compared metrics of anastomosis-including the consistency of suturing informed by the average suture spacing, the pressure at which the anastomosis leaked, the number of mistakes that required removing the needle from the tissue, completion time, and lumen reduction in intestinal anastomoses-between our supervised autonomous system, manual laparoscopic surgery, and clinically used RAS approaches. Despite dynamic scene changes and tissue movement during surgery, we demonstrate that the outcome of supervised autonomous procedures is superior to surgery performed by expert surgeons and RAS techniques in ex vivo porcine tissues and in living pigs. These results demonstrate the potential for autonomous robots to improve the efficacy, consistency, functional outcome, and accessibility of surgical techniques.

  3. Task sequencing for autonomous robotic vacuum cleaners

    NASA Astrophysics Data System (ADS)

    Gorbenko, Anna; Popov, Vladimir

    2017-07-01

    Various planning problems for robotic systems are of considerable interest. One of such problems is the problem of task sequencing. In this paper, we consider the problem of task sequencing for autonomous vacuum floor cleaning robots. We consider a graph model for the problem. We propose an efficient approach to solve the problem. In particular, we use an explicit reduction from the decision version of the problem to the satisfiability problem. We present the results of computational experiments for different satisfiability algorithms.
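
The flavor of such a reduction can be shown on a toy instance: deciding whether tasks can be ordered to satisfy precedence constraints, encoded as CNF over variables "task t occupies position p". This is an illustrative encoding under assumed constraints, not the paper's reduction, and the exhaustive solver stands in for the real SAT algorithms the authors benchmark.

```python
from itertools import product

def sequencing_to_cnf(n_tasks, precedences):
    """Encode 'order n tasks so every (a, b) pair has a before b' as CNF.
    Literal v(t, p) is true iff task t occupies position p (1-based)."""
    def v(t, p):
        return t * n_tasks + p + 1
    clauses = []
    for t in range(n_tasks):                       # each task gets some position
        clauses.append([v(t, p) for p in range(n_tasks)])
    for p in range(n_tasks):                       # no position holds two tasks
        for t1 in range(n_tasks):
            for t2 in range(t1 + 1, n_tasks):
                clauses.append([-v(t1, p), -v(t2, p)])
    for t in range(n_tasks):                       # no task holds two positions
        for p1 in range(n_tasks):
            for p2 in range(p1 + 1, n_tasks):
                clauses.append([-v(t, p1), -v(t, p2)])
    for a, b in precedences:                       # forbid b at or before a
        for pa in range(n_tasks):
            for pb in range(pa + 1):
                clauses.append([-v(a, pa), -v(b, pb)])
    return clauses

def brute_force_sat(clauses, n_vars):
    """Tiny exhaustive satisfiability check, fine only for toy instances."""
    for bits in product([False, True], repeat=n_vars):
        if all(any(bits[abs(l) - 1] == (l > 0) for l in c) for c in clauses):
            return True
    return False
```

A real pipeline would emit these clauses in DIMACS form and hand them to an off-the-shelf SAT solver rather than enumerate assignments.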

  4. Autonomous Systems, Robotics, and Computing Systems Capability Roadmap: NRC Dialogue

    NASA Technical Reports Server (NTRS)

    Zornetzer, Steve; Gage, Douglas

    2005-01-01

    Contents include the following: Introduction. Process, Mission Drivers, Deliverables, and Interfaces. Autonomy. Crew-Centered and Remote Operations. Integrated Systems Health Management. Autonomous Vehicle Control. Autonomous Process Control. Robotics. Robotics for Solar System Exploration. Robotics for Lunar and Planetary Habitation. Robotics for In-Space Operations. Computing Systems. Conclusion.

  5. Synthesis of autonomous robots through evolution.

    PubMed

    Nolfi, Stefano; Floreano, Dario

    2002-01-01

    Evolutionary robotics is the attempt to develop robots through a self-organized process based on artificial evolution. This approach stresses the importance of the study of systems that have a body and that are situated in a physical environment, and which autonomously develop their own skills in close interaction with the environment. In this review we briefly illustrate the method and the main concepts of evolutionary robotics, and examine the most significant contributions in this area. We also discuss some of the contributions that this research area is making to the foundational debate in cognitive science.

  6. From Autonomous Robots to Artificial Ecosystems

    NASA Astrophysics Data System (ADS)

    Mastrogiovanni, Fulvio; Sgorbissa, Antonio; Zaccaria, Renato

    During the past few years, starting from the two mainstream fields of Ambient Intelligence [2] and Robotics [17], several authors recognized the benefits of the so-called Ubiquitous Robotics paradigm. According to this perspective, mobile robots are no longer autonomous, physically situated and embodied entities adapting themselves to a world tailored for humans: on the contrary, they are able to interact with devices distributed throughout the environment and get across heterogeneous information by means of communication technologies. Information exchange, coupled with simple actuation capabilities, is meant to replace physical interaction between robots and their environment. Two benefits are evident: (i) smart environments overcome inherent limitations of mobile platforms, whereas (ii) mobile robots offer a mobility dimension unknown to smart environments.

  7. Toward highly capable neuromorphic autonomous robots: beobots

    NASA Astrophysics Data System (ADS)

    Itti, Laurent

    2002-12-01

    We describe a new mobile robotics platform specifically designed for the implementation and testing of neuromorphic vision algorithms in unconstrained outdoor environments. The new platform includes significant computational power (four 1.1GHz CPUs with gigabit interconnect), a high-speed four-wheel-drive chassis, standard Linux operating system, and a comprehensive toolkit of C++ vision classes. The robot is designed with two major goals in mind: real-time operation of sophisticated neuromorphic vision algorithms, and off-the-shelf components to ensure rapid technological evolvability. A preliminary embedded neuromorphic vision architecture that includes attentional, gist/layout, object recognition, and high-level decision subsystems is finally described.

  8. The Baker Observatory Robotic Autonomous Telescope

    NASA Astrophysics Data System (ADS)

    Hicks, L. L.; Reed, M. D.; Thompson, M. A.; Gilker, J. T.

    We describe the Baker Observatory Robotic Autonomous Telescope project. The hardware includes a 16-inch Meade LX-200 telescope, a 7-foot AstroHaven dome, an Apogee U47 CCD camera and filter wheel, a Boltwood Cloud Sensor II, and various other minor hardware. We are implementing RTS2 for the Telescope Control System and incorporating custom drivers for ancillary systems.

  9. Diagnosing faults in autonomous robot plan execution

    NASA Technical Reports Server (NTRS)

    Lam, Raymond K.; Doshi, Rajkumar S.; Atkinson, David J.; Lawson, Denise M.

    1989-01-01

    A major requirement for an autonomous robot is the capability to diagnose faults during plan execution in an uncertain environment. Much diagnostic research concentrates only on hardware failures within an autonomous robot. Taking a different approach, the implementation of a Telerobot Diagnostic System that addresses, in addition to the hardware failures, failures caused by unexpected event changes in the environment or failures due to plan errors, is described. One feature of the system is the utilization of task-plan knowledge and context information to deduce fault symptoms. This forward deduction provides valuable information on past activities and the current expectations of a robotic event, both of which can guide the plan-execution inference process. The inference process adopts a model-based technique to recreate the plan-execution process and to confirm fault-source hypotheses. This technique allows the system to diagnose multiple faults due to either unexpected plan failures or hardware errors. This research initiates a major effort to investigate relationships between hardware faults and plan errors, relationships which were not addressed in the past. The results of this research will provide a clear understanding of how to generate a better task planner for an autonomous robot and how to recover the robot from faults in a critical environment.

  11. Neuromodulation and plasticity in an autonomous robot.

    PubMed

    Sporns, Olaf; Alexander, William H

    2002-01-01

    In this paper we implement a computational model of a neuromodulatory system in an autonomous robot. The output of the neuromodulatory system acts as a value signal, modulating widely distributed synaptic changes. The model is based on anatomical and physiological properties of midbrain diffuse ascending systems, in particular parts of the dopamine and noradrenaline systems. During reward conditioning, the model learns to generate tonic and phasic signals that represent predictions and prediction errors, including precisely timed negative signals if expected rewards are omitted or delayed. We test the robot's learning and behavior in different environmental contexts and observe changes in the development of the neuromodulatory system that depend upon environmental factors. Simulation of a computational model incorporating both reward-related and aversive stimuli leads to the emergence of conditioned reward and aversive behaviors. These studies represent a step towards investigating computational aspects of neuromodulatory systems in autonomous robots.
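
The tonic/phasic value signals described here, including the precisely timed negative signal on reward omission, behave much like a temporal-difference prediction error. The following is a textbook TD(0) sketch of that behavior, not the authors' neuromodulatory model; the tapped-delay-line state representation and all parameters are assumptions.

```python
import numpy as np

def run_trial(V, reward_at, alpha=0.2, gamma=1.0, T=12):
    """One conditioning trial over a tapped delay line (one state per time
    step). Updates the value table V in place and returns the per-step
    prediction errors (the dopamine-like phasic signal)."""
    deltas = np.zeros(T)
    for t in range(T - 1):
        r = 1.0 if t == reward_at else 0.0
        deltas[t] = r + gamma * V[t + 1] - V[t]   # TD error
        V[t] += alpha * deltas[t]
    return deltas

V = np.zeros(12)
for _ in range(300):                     # train: reward reliably at t = 10
    run_trial(V, reward_at=10)
omission = run_trial(V, reward_at=None)  # probe: expected reward omitted
```

After training, the reward is fully predicted, so delivering it produces almost no error, while omitting it produces a sharp negative error at the expected reward time, mirroring the omission signal described in the abstract.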

  12. Autonomous mobile robot for radiologic surveys

    DOEpatents

    Dudar, A.M.; Wagner, D.G.; Teese, G.D.

    1994-06-28

    An apparatus is described for conducting radiologic surveys. The apparatus comprises in the main a robot capable of following a preprogrammed path through an area, a radiation monitor adapted to receive input from a radiation detector assembly, ultrasonic transducers for navigation and collision avoidance, and an on-board computer system including an integrator for interfacing the radiation monitor and the robot. Front and rear bumpers are attached to the robot by bumper mounts. The robot may be equipped with memory boards for the collection and storage of radiation survey information. The on-board computer system is connected to a remote host computer via a UHF radio link. The apparatus is powered by a rechargeable 24-volt DC battery, and is stored at a docking station when not in use and/or for recharging. A remote host computer contains a stored database defining paths between points in the area where the robot is to operate, including but not limited to the locations of walls, doors, stationary furniture and equipment, and sonic markers if used. When a program consisting of a series of paths is downloaded to the on-board computer system, the robot conducts a floor survey autonomously at any preselected rate. When the radiation monitor detects contamination, the robot resurveys the area at reduced speed and resumes its preprogrammed path if the contamination is not confirmed. If the contamination is confirmed, the robot stops and sounds an alarm. 5 figures.
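
The survey/resurvey/alarm behavior in the patent text amounts to a small state machine, which can be sketched as below. This is an illustrative reading of the described logic, not the patent's control code; the mode names and latching behavior are assumptions.

```python
from enum import Enum, auto

class Mode(Enum):
    SURVEY = auto()      # normal-speed traversal of the preprogrammed path
    RESURVEY = auto()    # reduced-speed re-check of a flagged area
    ALARM = auto()       # stopped, alarm sounding

def step(mode, contamination_detected):
    """One transition of the survey logic: a first detection triggers a
    slow resurvey; a confirmed detection latches the alarm; an unconfirmed
    one resumes the preprogrammed path."""
    if mode is Mode.SURVEY:
        return Mode.RESURVEY if contamination_detected else Mode.SURVEY
    if mode is Mode.RESURVEY:
        return Mode.ALARM if contamination_detected else Mode.SURVEY
    return Mode.ALARM    # alarm latches until an operator intervenes
```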

  14. A mobile autonomous robot for radiological surveys

    SciTech Connect

    Dudar, A.M.; Wagner, D.G.; Teese, G.D.

    1992-01-01

    The Robotics Development Group at the Savannah River Site is developing an autonomous robot (SIMON) to perform radiological surveys of potentially contaminated floors. The robot scans floors at a speed of one-inch/second and stops, sounds an alarm, and flashes lights when contamination in a certain area is detected. The contamination of interest here is primarily alpha and beta-gamma. The robot, a Cybermotion K2A base, is radio controlled, uses dead reckoning to determine vehicle position, and docks with a charging station to replenish its batteries and calibrate its position. It uses an ultrasonic ranging system for collision avoidance. In addition, two safety bumpers located in the front and the back of the robot will stop the robot's motion when they are depressed. Paths for the robot are preprogrammed and the robot's motion can be monitored on a remote screen which shows a graphical map of the environment. The radiation instrument being used is an Eberline RM22A monitor. This monitor is microcomputer based with a serial I/O interface for remote operation. Up to 30 detectors may be configured with the RM22A.

  16. Flocking algorithm for autonomous flying robots.

    PubMed

    Virágh, Csaba; Vásárhelyi, Gábor; Tarcai, Norbert; Szörényi, Tamás; Somorjai, Gergő; Nepusz, Tamás; Vicsek, Tamás

    2014-06-01

    Animal swarms displaying a variety of typical flocking patterns would not exist without the underlying safe, optimal and stable dynamics of the individuals. The emergence of these universal patterns can be efficiently reconstructed with agent-based models. If we want to reproduce these patterns with artificial systems, such as autonomous aerial robots, agent-based models can also be used in their control algorithms. However, finding the proper algorithms and thus understanding the essential characteristics of the emergent collective behaviour requires thorough and realistic modeling of the robot and also the environment. In this paper, we first present an abstract mathematical model of an autonomous flying robot. The model takes into account several realistic features, such as time delay and locality of communication, inaccuracy of the on-board sensors and inertial effects. We present two decentralized control algorithms. One is based on a simple self-propelled flocking model of animal collective motion, the other is a collective target tracking algorithm. Both algorithms contain a viscous friction-like term, which aligns the velocities of neighbouring agents parallel to each other. We show that this term can be essential for reducing the inherent instabilities of such a noisy and delayed realistic system. We discuss simulation results on the stability of the control algorithms, and perform real experiments to show the applicability of the algorithms on a group of autonomous quadcopters. In our case, bio-inspiration works in two ways. On the one hand, the whole idea of trying to build and control a swarm of robots comes from the observation that birds tend to flock to optimize their behaviour as a group. On the other hand, by using a realistic simulation framework and studying the group behaviour of autonomous robots we can learn about the major factors influencing the flight of bird flocks.
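
The viscous friction-like alignment term mentioned above can be illustrated in a few lines. This is a deliberately simplified sketch: it uses all-to-all, instantaneous coupling, whereas the paper's model includes communication locality, delay, sensor noise, and inertia.

```python
import numpy as np

def simulate_alignment(v0, gamma=1.0, dt=0.05, steps=200):
    """Evolve agent velocities under dv_i/dt = gamma * mean_j (v_j - v_i),
    the friction-like term that drags neighbouring velocities parallel.
    All-to-all coupling is an assumption made here for brevity."""
    v = np.array(v0, dtype=float)
    for _ in range(steps):
        v += dt * gamma * (v.mean(axis=0) - v)   # relax toward the flock mean
    return v
```

Because each velocity relaxes toward the flock mean while the mean itself is conserved, the spread of velocities shrinks geometrically, which is exactly the damping of instabilities the abstract attributes to this term.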

  17. SIR-1: An autonomous mobile sentry robot

    SciTech Connect

    Harrington, J.J.; Klarer, P.R.

    1987-01-01

    This paper describes a prototype mobile robot system configured to function as part of an overall security system at a high security facility. The features of this robot system include specialized software and sensors for navigation without the need for external locator beacons or sign posts, sensors for remote imaging and intruder detection, and data link facilities to communicate information either directly to an electronic security system or to a manned central control center. Other features of the robot system include low weight, compact size, and low power consumption. The robot system can operate either by remote manual control, or it can operate autonomously where the need for direct human control is limited to the global command level. The robot can act as a mobile remote sensing platform for visual alarm assessment or roving patrol, or as an exploratory device in situations potentially hazardous to humans. This robot system may also be used to walk-test intrusion detection sensors as part of a routine test and maintenance program for an interior intrusion detection system (IDS), and to provide a programmable, temporary sensor capability to backup an IDS sensor that has failed. This capability may also be used to provide improved sensor coverage of an area that will be secured on a temporary or short term basis, thereby eliminating the need for a permanent sensor installation. The hardware, software, and operation of this robot system are briefly described.

  18. Embodied cognition for autonomous interactive robots.

    PubMed

    Hoffman, Guy

    2012-10-01

    In the past, notions of embodiment have been applied to robotics mainly in the realm of very simple robots, and supporting low-level mechanisms such as dynamics and navigation. In contrast, most human-like, interactive, and socially adept robotic systems turn away from embodiment and use amodal, symbolic, and modular approaches to cognition and interaction. At the same time, recent research in Embodied Cognition (EC) is spanning an increasing number of complex cognitive processes, including language, nonverbal communication, learning, and social behavior. This article suggests adopting a modern EC approach for autonomous robots interacting with humans. In particular, we present three core principles from EC that may be applicable to such robots: (a) modal perceptual representation, (b) action/perception and action/cognition integration, and (c) a simulation-based model of top-down perceptual biasing. We describe a computational framework based on these principles, and its implementation on two physical robots. This could provide a new paradigm for embodied human-robot interaction based on recent psychological and neurological findings.

  20. Design of an autonomous exterior security robot

    NASA Technical Reports Server (NTRS)

    Myers, Scott D.

    1994-01-01

    This paper discusses the requirements and preliminary design of a robotic vehicle intended to perform autonomous exterior perimeter security patrols around warehouse areas, ammunition supply depots, and industrial parks for the U.S. Department of Defense. The preliminary design allows for the operation of up to eight vehicles in a six-kilometer by six-kilometer zone with autonomous navigation and obstacle avoidance. In addition to detecting crawling intruders at 100 meters, the system must perform real-time inventory checking and database comparisons using a microwave tag system.

  1. Autonomous mobile robot research using the HERMIES-III robot

    SciTech Connect

    Pin, F.G.; Beckerman, M.; Spelt, P.F.; Robinson, J.T.; Weisbin, C.R.

    1989-01-01

    This paper reports on the status and future directions of the research, development, and experimental validation of intelligent control techniques for autonomous mobile robots using the HERMIES-III robot at the Center for Engineering Systems Advanced Research (CESAR) at Oak Ridge National Laboratory (ORNL). HERMIES-III is the fourth robot in a series of increasingly sophisticated and capable experimental test beds developed at CESAR. HERMIES-III comprises a battery-powered, omni-directional wheeled platform with a seven-degree-of-freedom manipulator arm, video cameras, sonar range sensors, a laser imaging scanner, and a dual computer system containing up to 128 NCUBE nodes in a hypercube configuration. All electronics, sensors, computers, and communication equipment required for autonomous operation of HERMIES-III are located on board, along with sufficient battery power for three to four hours of operation. The paper first provides a more detailed description of the HERMIES-III characteristics, focusing on the new areas of research and demonstration now possible at CESAR with this new test bed. The initial experimental program is then described, with emphasis placed on autonomous performance of human-scale tasks (e.g., valve manipulation, use of tools), integration of a dexterous manipulator and platform motion in geometrically complex environments, and effective use of multiple cooperating robots (HERMIES-IIB and HERMIES-III). The paper concludes with a discussion of the integration problems and safety considerations that necessarily arise from setting up an experimental program involving human-scale tasks performed by multiple autonomous mobile robots. 10 refs., 3 figs.

  2. An architecture for an autonomous learning robot

    NASA Technical Reports Server (NTRS)

    Tillotson, Brian

    1988-01-01

    An autonomous learning device must solve the example bounding problem, i.e., it must divide the continuous universe into discrete examples from which to learn. We describe an architecture which incorporates an example bounder for learning. The architecture is implemented in the GPAL program. An example run with a real mobile robot shows that the program learns and uses new causal, qualitative, and quantitative relationships.

  3. Autonomous environment modeling by a mobile robot

    NASA Astrophysics Data System (ADS)

    Moutarlier, Philippe

    1991-02-01

    Internal geometric representation of the environment is considered. The autonomy of a mobile robot partly relies on its ability to build a reliable representation of its environment. On the other hand, an autonomous environment-building process requires that the model be adapted to planning motions and perception actions. Therefore, the modeling process must be a reversible interface between perception and motion devices and the model itself. Several kinds of models are necessary in order to achieve an autonomous process. Sensors give stochastic information on surfaces, navigation needs a free-space representation, and perception planning requires aspect graphs. The functions of stochastic surface modeling, free-space representation, and topological graph computing are presented through the integrated geometric model builder called 'Yaka.' Since all environment data uncertainties are correlated through the inaccuracy of the robot's location, classical filtering methods are inadequate. A method of computing a linear variance estimator adapted to the problem is proposed. This general formalism is validated by a large number of experiments in which the robot incrementally builds a surface representation of its environment. Free space cannot be deduced directly, at each step, from the surface data provided by the sensors. Inaccuracies in object surfaces and uncertainties in the visibility of objects to the sensor, as well as the possible motion of objects, must all be taken into account to build the free space incrementally. Motion and perception planning for autonomous environment modeling are then achieved using this free-space model together with topological location and aspect graphs.

  4. Evolutionary neurocontrollers for autonomous mobile robots.

    PubMed

    Floreano, D; Mondada, F

    1998-10-01

    In this article we describe a methodology for evolving neurocontrollers of autonomous mobile robots without human intervention. The presentation, which spans from technological and methodological issues to several experimental results on evolution of physical mobile robots, covers both previous and recent work in the attempt to provide a unified picture within which the reader can compare the effects of systematic variations on the experimental settings. After describing some key principles for building mobile robots and tools suitable for experiments in adaptive robotics, we give an overview of different approaches to evolutionary robotics and present our methodology. We start reviewing two basic experiments showing that different environments can shape very different behaviours and neural mechanisms under very similar selection criteria. We then address the issue of incremental evolution in two different experiments from the perspective of changing environments and robot morphologies. Finally, we investigate the possibility of evolving plastic neurocontrollers and analyse an evolved neurocontroller that relies on fast and continuously changing synapses characterized by dynamic stability. We conclude by reviewing the implications of this methodology for engineering, biology, cognitive science and artificial life, and point at future directions of research.
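
    As a rough illustration of the evolutionary loop described above, the sketch below evolves a population of controller weight vectors by truncation selection and Gaussian mutation. The fitness function used in the test is a stand-in benchmark (distance to a target vector); in the experiments surveyed, fitness would instead be scored by running the neurocontroller on a physical robot.

```python
import random

def evolve(fitness, n_weights=4, pop_size=20, generations=60, seed=0):
    """Evolve a weight vector maximizing `fitness` (higher is better)."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1.0, 1.0) for _ in range(n_weights)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:pop_size // 4]              # truncation selection
        pop = [w[:] for w in elite]              # elitism: keep the best
        while len(pop) < pop_size:               # refill with mutated elites
            parent = rng.choice(elite)
            pop.append([w + rng.gauss(0.0, 0.1) for w in parent])
    return max(pop, key=fitness)
```

    Real evolutionary-robotics runs differ mainly in the fitness evaluation, which is a physical (or simulated) robot trial rather than a closed-form function.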

  5. Mobile autonomous robot for radiological surveys

    SciTech Connect

    Dudar, A.M.; Wagner, D.G.; Teese, G.D. )

    1992-01-01

    The robotics development group at the Savannah River Laboratory (SRL) is developing a mobile autonomous robot that performs radiological surveys of potentially contaminated floors. The robot is called SIMON, which stands for Semi-Intelligent Mobile Observing Navigator. Certain areas of SRL are classified as radiologically controlled areas (RCAs). In an RCA, radioactive materials are frequently handled by workers, and thus the potential for contamination is ever present. Current methods used for floor radiological surveying include labor-intensive manual scanning or random smearing of certain floor locations. An autonomous robot such as SIMON performs the surveying task much more efficiently and will track down contamination before humans come into contact with it. SIMON scans floors at a speed of 1 in./s and stops and alarms upon encountering contamination. Its environment is well defined, consisting of smooth building floors with wide corridors. The kinds of contamination SIMON is capable of detecting are alpha and beta-gamma. The contamination levels of interest are low to moderate.

  6. A Proposal of Autonomous Robotic Systems Educative Environment

    NASA Astrophysics Data System (ADS)

    Ierache, Jorge; Garcia-Martinez, Ramón; de Giusti, Armando

    This work presents our experiences in the implementation of a laboratory of autonomous robotic systems applied to the training of beginner and advanced students in a Computer Engineering degree course, taking into account the specific technologies: robots, autonomous toys, and programming languages. These provide a strategic opportunity for developing human resources, involving aspects that range from specification elaboration and modeling to the software development, implementation, and testing of an autonomous robotic system.

  7. Artificial consciousness, artificial emotions, and autonomous robots.

    PubMed

    Cardon, Alain

    2006-12-01

    Nowadays, for robots, the notion of behavior is reduced to a simple factual concept at the level of movements. Consciousness, on the other hand, is a deeply cultural concept, founding the main property of human beings, according to themselves. We propose to develop a computable transposition of consciousness concepts into artificial brains able to express emotions and facts of consciousness. The production of such artificial brains enables intentional and genuinely adaptive behavior for autonomous robots. The system managing the robot's behavior is made of two parts: the first computes and generates, in a constructivist manner, a representation of the robot moving in its environment, using symbols and concepts. The second achieves the representation of the first, using morphologies in a dynamic geometrical way. The robot's body is seen, for itself, as the morphologic apprehension of its material substrate. The model relies strictly on the notion of massive multi-agent organizations with morphologic control.

  8. An autonomous vision-based mobile robot

    NASA Astrophysics Data System (ADS)

    Baumgartner, Eric Thomas

    This dissertation describes estimation and control methods for use in the development of an autonomous mobile robot for structured environments. The navigation of the mobile robot is based on precise estimates of the position and orientation of the robot within its environment. The extended Kalman filter algorithm is used to combine information from the robot's drive wheels with periodic observations of small, wall-mounted visual cues to produce the precise position and orientation estimates. The visual cues are reliably detected by at least one video camera mounted on the mobile robot. Typical position estimates are accurate to within one inch. A path-tracking algorithm is also developed to follow desired reference paths taught by a human operator. Because the tracking algorithm is time-independent, the speed at which the vehicle travels along the reference path is specified independently of it. The estimation and control methods have been applied successfully to two experimental vehicle systems. Finally, an analysis of the linearized closed-loop control system is performed to study the behavior and stability of the system as a function of various control parameters.
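
    The odometry/landmark fusion described above can be sketched with a minimal one-dimensional Kalman filter: dead-reckoning from the drive wheels grows the position uncertainty, and a periodic observation of a wall-mounted cue shrinks it. All numeric values are illustrative, not taken from the dissertation.

```python
def ekf_step(x, P, u, z=None, Q=0.1, R=0.05):
    """One predict/update cycle for a 1D robot position estimate.

    x: position estimate      P: estimate variance
    u: odometry displacement since the last step
    z: optional absolute position fix from a visual cue
    Q: process noise variance R: measurement noise variance
    """
    # Predict: dead-reckon with odometry; uncertainty grows.
    x = x + u
    P = P + Q
    # Update: a cue observation pulls the estimate toward the
    # measurement and shrinks the uncertainty.
    if z is not None:
        K = P / (P + R)        # Kalman gain
        x = x + K * (z - x)
        P = (1.0 - K) * P
    return x, P
```

    In the full 2D pose case, x becomes a state vector (position and orientation) and the scalar gain becomes the usual matrix form, but the predict/update structure is the same.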

  9. Autonomous Navigation by a Mobile Robot

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terrance; Aghazarian, Hrand

    2005-01-01

    ROAMAN is a computer program for autonomous navigation of a mobile robot on a long (as much as hundreds of meters) traversal of terrain. Developed for use aboard a robotic vehicle (rover) exploring the surface of a remote planet, ROAMAN could also be adapted to similar use on terrestrial mobile robots. ROAMAN implements a combination of algorithms for (1) long-range path planning based on images acquired by mast-mounted, wide-baseline stereoscopic cameras, and (2) local path planning based on images acquired by body-mounted, narrow-baseline stereoscopic cameras. The long-range path-planning algorithm autonomously generates a series of waypoints that are passed to the local path-planning algorithm, which plans obstacle-avoiding legs between the waypoints. Both the long- and short-range algorithms use an occupancy-grid representation in computations to detect obstacles and plan paths. Maps that are maintained by the long- and short-range portions of the software are not shared because substantial localization errors can accumulate during any long traverse. ROAMAN is not guaranteed to generate an optimal shortest path, but does maintain the safety of the rover.
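
    The occupancy-grid representation that ROAMAN uses for obstacle detection and path planning can be illustrated with a toy planner: breadth-first search over a 2D grid of free and occupied cells, returning a shortest obstacle-avoiding path between two waypoints. This is a generic sketch, not ROAMAN's algorithm.

```python
from collections import deque

def plan(grid, start, goal):
    """Return a list of (row, col) cells from start to goal, or None.

    grid: list of lists, 0 = free cell, 1 = obstacle.
    """
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}            # visited set + back-pointers
    frontier = deque([start])
    while frontier:
        cell = frontier.popleft()
        if cell == goal:            # reconstruct the path backwards
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
               and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None                     # goal unreachable
```

    A waypoint-following scheme like ROAMAN's would call such a local planner repeatedly, once per leg between consecutive waypoints.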

  10. Development of autonomous eating mechanism for biomimetic robots

    NASA Astrophysics Data System (ADS)

    Jeong, Kil-Woong; Cho, Ik-Jin; Lee, Yun-Jung

    2005-12-01

    Most recently developed robots are human-friendly robots that imitate animals or humans, such as entertainment robots, biomimetic robots, and humanoid robots. Interest in these robots is increasing because social trends are focused on health, welfare, and aging. Autonomous eating is the most unique and inherent behavior of pets and animals. Most entertainment and pet robots use an internal battery and cannot operate while the battery is charging. Therefore, if a robot can autonomously eat batteries as its feed, it not only can operate while recharging but also becomes more human-friendly, like a pet. Here, a new autonomous eating mechanism is introduced for a biomimetic robot called ELIRO-II (Eating LIzard RObot version 2). The ELIRO-II is able to find food (a small battery), eat, and evacuate by itself. This work describes the sub-parts of the developed mechanism, such as the head, mouth, and stomach parts. In addition, the control system of the autonomous eating mechanism is described.

  11. Task-level control for autonomous robots

    NASA Technical Reports Server (NTRS)

    Simmons, Reid

    1994-01-01

    Task-level control refers to the integration and coordination of planning, perception, and real-time control to achieve given high-level goals. Autonomous mobile robots need task-level control to effectively achieve complex tasks in uncertain, dynamic environments. This paper describes the Task Control Architecture (TCA), an implemented system that provides commonly needed constructs for task-level control. Facilities provided by TCA include distributed communication, task decomposition and sequencing, resource management, monitoring and exception handling. TCA supports a design methodology in which robot systems are developed incrementally, starting first with deliberative plans that work in nominal situations, and then layering them with reactive behaviors that monitor plan execution and handle exceptions. To further support this approach, design and analysis tools are under development to provide ways of graphically viewing the system and validating its behavior.

  12. Towards Robot Scientists for autonomous scientific discovery

    PubMed Central

    2010-01-01

    We review the main components of autonomous scientific discovery, and how they lead to the concept of a Robot Scientist. This is a system which uses techniques from artificial intelligence to automate all aspects of the scientific discovery process: it generates hypotheses from a computer model of the domain, designs experiments to test these hypotheses, runs the physical experiments using robotic systems, analyses and interprets the resulting data, and repeats the cycle. We describe our two prototype Robot Scientists: Adam and Eve. Adam has recently proven the potential of such systems by identifying twelve genes responsible for catalysing specific reactions in the metabolic pathways of the yeast Saccharomyces cerevisiae. This work has been formally recorded in great detail using logic. We argue that the reporting of science needs to become fully formalised and that Robot Scientists can help achieve this. This will make scientific information more reproducible and reusable, and promote the integration of computers in scientific reasoning. We believe the greater automation of both the physical and intellectual aspects of scientific investigations to be essential to the future of science. Greater automation improves the accuracy and reliability of experiments, increases the pace of discovery and, in common with conventional laboratory automation, removes tedious and repetitive tasks from the human scientist. PMID:20119518

  13. Towards Robot Scientists for autonomous scientific discovery.

    PubMed

    Sparkes, Andrew; Aubrey, Wayne; Byrne, Emma; Clare, Amanda; Khan, Muhammed N; Liakata, Maria; Markham, Magdalena; Rowland, Jem; Soldatova, Larisa N; Whelan, Kenneth E; Young, Michael; King, Ross D

    2010-01-04

    We review the main components of autonomous scientific discovery, and how they lead to the concept of a Robot Scientist. This is a system which uses techniques from artificial intelligence to automate all aspects of the scientific discovery process: it generates hypotheses from a computer model of the domain, designs experiments to test these hypotheses, runs the physical experiments using robotic systems, analyses and interprets the resulting data, and repeats the cycle. We describe our two prototype Robot Scientists: Adam and Eve. Adam has recently proven the potential of such systems by identifying twelve genes responsible for catalysing specific reactions in the metabolic pathways of the yeast Saccharomyces cerevisiae. This work has been formally recorded in great detail using logic. We argue that the reporting of science needs to become fully formalised and that Robot Scientists can help achieve this. This will make scientific information more reproducible and reusable, and promote the integration of computers in scientific reasoning. We believe the greater automation of both the physical and intellectual aspects of scientific investigations to be essential to the future of science. Greater automation improves the accuracy and reliability of experiments, increases the pace of discovery and, in common with conventional laboratory automation, removes tedious and repetitive tasks from the human scientist.

  14. Autonomous Dome for a Robotic Telescope

    NASA Astrophysics Data System (ADS)

    Kumar, A.; Sengupta, A.; Ganesh, S.

    2016-12-01

    The Physical Research Laboratory operates a 50 cm robotic observatory at Mount Abu (Rajasthan, India). This Automated Telescope for Variability Studies (ATVS) makes use of the Remote Telescope System 2 (RTS2) for autonomous operations. The observatory uses a 3.5 m dome from Sirius Observatories. We have developed electronics using Arduino circuit boards with home-grown logic and software to control the dome operations. We are in the process of completing the drivers to link our Arduino-based dome controller with RTS2. This document is a short description of the various phases of the development and their integration to achieve the required objective.

  15. Autonomous biomorphic robots as platforms for sensors

    SciTech Connect

    Tilden, M.; Hasslacher, B.; Mainieri, R.; Moses, J.

    1996-10-01

    The idea of building autonomous robots that can carry out complex and nonrepetitive tasks is an old one, so far unrealized in any meaningful hardware. Tilden has shown recently that there are simple, processor-free solutions to building autonomous mobile machines that continuously adapt to unknown and hostile environments, are designed primarily to survive, and are extremely resistant to damage. These devices use smart mechanics and simple (low component count) electronic neuron control structures having the functionality of biological organisms from simple invertebrates to sophisticated members of the insect and crab family. These devices are paradigms for the development of autonomous machines that can carry out directed goals. The machine then becomes a robust survivalist platform that can carry sensors or instruments. These autonomous roving machines, now in an early stage of development (several proof-of-concept prototype walkers have been built), can be developed so that they are inexpensive, robust, and versatile carriers for a variety of instrument packages. Applications are immediate and many, in areas as diverse as prosthetics, medicine, space, construction, nanoscience, defense, remote sensing, environmental cleanup, and biotechnology.

  16. Modular control systems for teleoperated and autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Kadonoff, Mark B.; Parish, David W.

    1995-01-01

    This paper will discuss components of a modular hardware and software architecture for mobile robots that supports both teleoperation and autonomous control. The Modular Autonomous Robot System architecture enables rapid development of control systems for unmanned vehicles for a wide variety of commercial and military applications.

  17. Outdoor field experience with autonomous RPC based stations

    NASA Astrophysics Data System (ADS)

    Lopes, L.; Assis, P.; Blanco, A.; Carolino, N.; Cerda, M. A.; Conceição, R.; Cunha, O.; Ferreira, M.; Fonte, P.; Luz, R.; Mendes, L.; Pereira, A.; Pimenta, M.; Sarmento, R.; Tomé, B.

    2016-09-01

    In the last two decades, Resistive Plate Chambers were employed in the cosmic-ray experiments COVER-PLASTEX and ARGO/YBJ. In both experiments the detectors were housed indoors, likely owing to gas distribution requirements and the need to control environmental variables that directly affect RPC operational stability. But in experiments where Extensive Air Shower (EAS) sampling is necessary, large-area arrays composed of dispersed stations are deployed, rendering this kind of approach impossible. In this situation, it is mandatory to have detectors that can be deployed in small standalone stations, with very rare opportunities for maintenance and with good resilience to environmental conditions. Aiming to meet these requirements, some years ago we started the development of RPCs for autonomous stations. The results from indoor tests and measurements were very promising, concerning both performance and stability under a very low gas flow rate, which is the main requirement for autonomous stations. In this work we update the indoor results and show the first results concerning stable outdoor operation. In particular, a dynamic adjustment of the high voltage is applied to keep the gas gain constant.

  18. Radio Frequency Mapping using an Autonomous Robot: Application to the 2.4 GHz Band

    NASA Astrophysics Data System (ADS)

    Lebreton, J. M.; Murad, N. M.; Lorion, R.

    2016-03-01

    Radio signal strength measurement systems are essential for building a radio frequency (RF) map in indoor and outdoor environments for different application scenarios. This paper presents an autonomous robot that constructs a radio signal map by collecting different useful information, related to all access-point devices and to the robot itself, and forwarding it to the base station. A real scenario is considered by measuring the RF field of our department network. The consistency of the RF signal map is shown by fitting the measurements to a radio signal strength model over a two-dimensional area, and a path-loss exponent of 2.3 is estimated for the open-corridor environment.
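
    The path-loss exponent reported above comes from fitting RSSI measurements to the standard log-distance model, rssi(d) = rssi(d0) - 10 n log10(d/d0). A minimal least-squares fit of the exponent n can be sketched as follows (synthetic data in the test, not the paper's measurements):

```python
import math

def fit_path_loss_exponent(samples, d0=1.0):
    """Least-squares fit of n in rssi(d) = rssi(d0) - 10*n*log10(d/d0).

    samples: list of (distance_m, rssi_dbm) pairs; d0: reference distance.
    """
    # With x = -10*log10(d/d0) the model is linear, rssi = rssi(d0) + n*x,
    # so the exponent n is the ordinary least-squares slope.
    xs = [-10.0 * math.log10(d / d0) for d, _ in samples]
    ys = [rssi for _, rssi in samples]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
```

    Free-space propagation corresponds to n = 2; values above 2, such as the 2.3 reported for the corridor, indicate additional attenuation from the environment.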

  19. Robotic technologies for outdoor industrial vehicles

    NASA Astrophysics Data System (ADS)

    Stentz, Anthony

    2001-09-01

    The commercial industries of agriculture, mining, construction, and material handling employ a wide variety of mobile machines, including tractors, combines, Load-Haul-Dump vehicles, trucks, paving machines, fork trucks, and many more. Automation of these vehicles promises to improve productivity, reduce operational costs, and increase safety. Since the vehicles typically operate in difficult environments, under all weather conditions, and in the presence of people and other obstacles, reliable automation faces severe technical challenges. Furthermore, the viable technology solutions are constrained by cost considerations. Fortunately, due to the limited application domain, repetitive nature, and the utility of partial automation for most tasks, robotics technologies can have a profound impact on industrial vehicles. In this paper, we describe a technical approach developed at Carnegie Mellon University for automating mobile machines in several applications, including mass excavation, mining, and agriculture. The approach is introduced via case studies, and the results are presented.

  20. Evolution of autonomous and semi-autonomous robotic surgical systems: a review of the literature.

    PubMed

    Moustris, G P; Hiridis, S C; Deliparaschos, K M; Konstantinidis, K M

    2011-12-01

    Autonomous control of surgical robotic platforms may offer enhancements such as higher precision, intelligent manoeuvres, tissue-damage avoidance, etc. Autonomous robotic systems in surgery are largely at the experimental level. However, they have also reached clinical application. A literature review pertaining to commercial medical systems which incorporate autonomous and semi-autonomous features, as well as experimental work involving automation of various surgical procedures, is presented. Results are drawn from major databases, excluding papers not experimentally implemented on real robots. Our search yielded several experimental and clinical applications, describing progress in autonomous surgical manoeuvres, ultrasound guidance, optical coherence tomography guidance, cochlear implantation, motion compensation, orthopaedic, neurological and radiosurgery robots. Autonomous and semi-autonomous systems are beginning to emerge in various interventions, automating important steps of the operation. These systems are expected to become a standard modality and to revolutionize the face of surgery. Copyright © 2011 John Wiley & Sons, Ltd.

  1. Autonomous robot behavior based on neural networks

    NASA Astrophysics Data System (ADS)

    Grolinger, Katarina; Jerbic, Bojan; Vranjes, Bozo

    1997-04-01

    The purpose of an autonomous robot is to solve various tasks while adapting its behavior to a variable environment; it is expected to navigate much like a human would, including handling uncertain and unexpected obstacles. To achieve this, the robot has to be able to find solutions to unknown situations, to learn from experience (that is, to acquire action procedures together with corresponding knowledge of the workspace structure), and to recognize its working environment. The planning of intelligent robot behavior presented in this paper implements reinforcement learning based on strategic and random attempts at finding a solution, and a neural network approach for memorizing and recognizing the workspace structure (the structural assignment problem). Some well-known neural networks based on unsupervised learning are considered with regard to the structural assignment problem. An adaptive fuzzy shadowed neural network is developed; it has an additional shadowed hidden layer, a specific learning rule, and an initialization phase. The developed neural network combines the advantages of networks based on Adaptive Resonance Theory and, by using the shadowed hidden layer, provides the ability to recognize slightly translated or rotated obstacles in any direction.

  2. Autonomous robot software development using simple software components

    NASA Astrophysics Data System (ADS)

    Burke, Thomas M.; Chung, Chan-Jin

    2004-10-01

    Developing software to control a sophisticated lane-following, obstacle-avoiding, autonomous robot can be demanding and beyond the capabilities of novice programmers - but it doesn't have to be. A creative software design utilizing only basic image processing and a little algebra has been employed to control the LTU-AISSIG autonomous robot - a contestant in the 2004 Intelligent Ground Vehicle Competition (IGVC). This paper presents a software design equivalent to that used during the IGVC, but with much of the complexity removed. The result is an autonomous robot software design that is robust, reliable, and can be implemented by programmers with a limited understanding of image processing. This design provides a solid basis for further work in autonomous robot software, as well as an interesting and achievable robotics project for students.

  3. Object guided autonomous exploration for mobile robots in indoor environments

    NASA Astrophysics Data System (ADS)

    Nieto-Granda, Carlos; Choudhary, Siddarth; Rogers, John G.; Twigg, Jeff; Murali, Varun; Christensen, Henrik I.

    2014-06-01

    Autonomous mobile robotic teams are increasingly used in exploration of indoor environments. Accurate modeling of the world around the robot and describing the interaction of the robot with the world greatly increases the ability of the robot to act autonomously. This paper demonstrates the ability of autonomous robotic teams to find objects of interest. A novel feature of our approach is the object discovery and the use of it to augment the mapping and navigation process. The generated map can then be decomposed into semantic regions while also considering the distance and line of sight to anchor points. The advantage of this approach is that the robot can return a dense map of the region around an object of interest. The robustness of this approach is demonstrated in indoor environments with multiple platforms with the objective of discovering objects of interest.

  4. Autonomous robotic navigation and pipe inspection: A simulation approach

    SciTech Connect

    Ioannou, D.; Wang, S.; Tulenko, J.S.

    1994-12-31

    An important task for an autonomously functioning robot in the nuclear industry is pipe inspection in a nuclear power plant. A typical scenario for such a robot: The robot enters a highly radioactive area to perform several inspection and cleanup tasks. Because the robot is functioning in a radioactive environment, it must perform these tasks in a limited time. As much information as possible should be extracted in the shortest time (i.e., with the fewest snapshots). At the University of Florida's Mobile Robotics for Hazardous Environments Laboratory, a project is under way to build an autonomous robot that will function in a go-stop-process-go manner in a nuclear environment. The system follows the spirit of Thayer et al., but the difference is that it functions autonomously. This paper discusses a simulation of this system.

  5. Development of Outdoor Service Robot to Collect Trash on Streets

    NASA Astrophysics Data System (ADS)

    Obata, Masayuki; Nishida, Takeshi; Miyagawa, Hidekazu; Kondo, Takashi; Ohkawa, Fujio

    The outdoor service robot, which we call OSR-01, is developed for cleaning up urban areas by collecting discarded trash such as PET bottles, cans, and plastic bags. In this paper, we describe the architecture of OSR-01, consisting of hardware such as sensors, a manipulator, and driving wheels for searching for and picking up trash, and software such as fast pattern matching for identifying various kinds of trash and distance measurement for picking it up with the manipulator. After describing the vision system in detail, which is one of the most critical parts of the trash-collection task, we show the results of an open experiment in which OSR-01 collected PET bottles on a real shopping street in the special zone for robot research and development in Kitakyushu City.

  6. Quantifying Emergent Behavior of Autonomous Robots

    NASA Astrophysics Data System (ADS)

    Martius, Georg; Olbrich, Eckehard

    2015-10-01

    Quantifying behaviors of robots which were generated autonomously from task-independent objective functions is an important prerequisite for objective comparisons of algorithms and movements of animals. The temporal sequence of such a behavior can be considered a time series, and hence complexity measures developed for time series are natural candidates for its quantification. The predictive information and the excess entropy are such complexity measures. They measure the amount of information the past contains about the future and thus quantify the nonrandom structure in the temporal sequence. However, when using these measures for systems with continuous states, one has to deal with the fact that their values will depend on the resolution with which the system's states are observed. For deterministic systems both measures will diverge with increasing resolution. We therefore propose a new decomposition of the excess entropy into resolution-dependent and resolution-independent parts, and discuss how they depend on the dimensionality of the dynamics, correlations, and the noise level. For practical estimation we propose to use estimates based on the correlation integral instead of the direct estimation of the mutual information with the nearest-neighbor algorithm of Kraskov et al. (2004), because the latter allows less control of the scale dependencies. Using our algorithm, we are able to show how autonomous learning generates behavior of increasing complexity with increasing learning duration.
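
    A crude, resolution-dependent proxy for the predictive information discussed above can be computed for a symbolic time series as the plug-in mutual information between consecutive symbols. This is an illustration of the underlying quantity, not the correlation-integral estimator the article proposes.

```python
import math
from collections import Counter

def predictive_information(seq):
    """Plug-in estimate of I(x_t; x_{t+1}) in bits for a symbol sequence."""
    pairs = list(zip(seq[:-1], seq[1:]))
    n = len(pairs)
    p_xy = Counter(pairs)                  # joint counts of (past, future)
    p_x = Counter(x for x, _ in pairs)     # marginal counts of the past
    p_y = Counter(y for _, y in pairs)     # marginal counts of the future
    return sum((c / n) * math.log2(c * n / (p_x[x] * p_y[y]))
               for (x, y), c in p_xy.items())
```

    A deterministic alternation of two symbols yields about 1 bit of predictive information, while a constant sequence yields 0, matching the intuition that the measure captures nonrandom temporal structure.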

  7. Autonomous surgical robotics using 3-D ultrasound guidance: feasibility study.

    PubMed

    Whitman, John; Fronheiser, Matthew P; Ivancevich, Nikolas M; Smith, Stephen W

    2007-10-01

    The goal of this study was to test the feasibility of using a real-time 3D (RT3D) ultrasound scanner with a transthoracic matrix array transducer probe to guide an autonomous surgical robot. Employing a fiducial alignment mark on the transducer to orient the robot's frame of reference and using simple thresholding algorithms to segment the 3D images, we tested the accuracy of using the scanner to automatically direct a robot arm that touched two needle tips together within a water tank. RMS measurement error was 3.8% or 1.58 mm for an average path length of 41 mm. Using these same techniques, the autonomous robot also performed simulated needle biopsies of a cyst-like lesion in a tissue phantom. This feasibility study shows the potential for 3D ultrasound guidance of an autonomous surgical robot for simple interventional tasks, including lesion biopsy and foreign body removal.
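
The simple thresholding segmentation the study relies on can be sketched in a few lines. The synthetic volume and threshold below are illustrative assumptions, not the scanner's actual data format: bright voxels are segmented by a fixed threshold and their centroid gives a target point in voxel coordinates for the robot to aim at.

```python
import numpy as np

def segment_target(volume, threshold):
    """Segment bright voxels by simple thresholding and return the
    centroid of the segmented region in voxel coordinates."""
    mask = volume > threshold
    if not mask.any():
        return None
    return np.argwhere(mask).mean(axis=0)

# synthetic 3-D "scan": dark background with one small bright blob
vol = np.zeros((32, 32, 32))
vol[10:13, 20:23, 5:8] = 1.0
print(segment_target(vol, 0.5))
```

A calibrated transform (from the fiducial alignment mark on the transducer) would then map these voxel coordinates into the robot's frame of reference.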

  8. Biomimetic smart sensors for autonomous robotic behavior I: acoustic processing

    NASA Astrophysics Data System (ADS)

    Deligeorges, Socrates; Xue, Shuwan; Soloway, Aaron; Lichtenstein, Lee; Gore, Tyler; Hubbard, Allyn

    2009-05-01

    Robotics are rapidly becoming an integral tool on the battlefield and in homeland security, replacing humans in hazardous conditions. To enhance the effectiveness of robotic assets and their interaction with human operators, smart sensors are required to give more autonomous function to robotic platforms. Biologically inspired sensors are an essential part of this development of autonomous behavior and can increase both capability and performance of robotic systems. Smart, biologically inspired acoustic sensors have the potential to extend autonomous capabilities of robotic platforms to include sniper detection, vehicle tracking, personnel detection, and general acoustic monitoring. The key to enabling these capabilities is biomimetic acoustic processing using a time domain processing method based on the neural structures of the mammalian auditory system. These biologically inspired algorithms replicate the extremely adaptive processing of the auditory system yielding high sensitivity over broad dynamic range. The algorithms provide tremendous robustness in noisy and echoic spaces; properties necessary for autonomous function in real world acoustic environments. These biomimetic acoustic algorithms also provide highly accurate localization of both persistent and transient sounds over a wide frequency range, using baselines on the order of only inches. A specialized smart sensor has been developed to interface with an iRobot Packbot® platform specifically to enhance its autonomous behaviors in response to personnel and gunfire. The low power, highly parallel biomimetic processor, in conjunction with a biomimetic vestibular system (discussed in the companion paper), has shown the system's autonomous response to gunfire in complicated acoustic environments to be highly effective.

  9. Modeling and Implementation of PID Control for Autonomous Robots

    DTIC Science & Technology

    2007-06-01

    Ground Vehicle. Monterey, California: Naval Postgraduate School, 2007. Jewett, John, and Raymond Serway. Physics for Scientists and Engineers ... with similar physical characteristics. These characteristics would include any two- or four-wheeled robot with a motor controller for each side ... FOR AUTONOMOUS ROBOTS. Todd A. Williamson, Ensign, United States Navy; B.S. Chemical Engineering, University of Idaho, 2006. Submitted in

  10. An Autonomous Mobile Robot for Tsukuba Challenge: JW-Future

    NASA Astrophysics Data System (ADS)

    Fujimoto, Katsuharu; Kaji, Hirotaka; Negoro, Masanori; Yoshida, Makoto; Mizutani, Hiroyuki; Saitou, Tomoya; Nakamura, Katsu

    “Tsukuba Challenge” is the only technical trial of its kind that requires mobile robots to operate autonomously and safely on public walkways. In this paper, we outline our robot “JW-Future”, developed for this trial on the basis of an electric wheelchair. We also discuss the significance of participating in such a technical trial from the viewpoint of industry.

  11. Autonomous Realtime Threat-Hunting Robot (ARTHR)

    SciTech Connect

    INL

    2008-05-29

    Idaho National Laboratory researchers developed an intelligent plug-and-play robot payload that transforms commercial robots into effective first responders for deadly chemical, radiological and explosive threats.

  12. Autonomous Realtime Threat-Hunting Robot (ARTHR)

    ScienceCinema

    INL

    2016-07-12

    Idaho National Laboratory researchers developed an intelligent plug-and-play robot payload that transforms commercial robots into effective first responders for deadly chemical, radiological and explosive threats.

  13. Miniaturized Autonomous Extravehicular Robotic Camera (Mini AERCam)

    NASA Technical Reports Server (NTRS)

    Fredrickson, Steven E.

    2001-01-01

    The NASA Johnson Space Center (JSC) Engineering Directorate is developing the Autonomous Extravehicular Robotic Camera (AERCam), a low-volume, low-mass free-flying camera system. AERCam project team personnel recently initiated development of a miniaturized version of AERCam known as Mini AERCam. The Mini AERCam target design is a spherical "nanosatellite" free-flyer 7.5 inches in diameter and weighing 10 pounds. Mini AERCam is building on the success of the AERCam Sprint STS-87 flight experiment by adding new on-board sensing and processing capabilities while simultaneously reducing volume by 80%. Achieving enhanced capability in a smaller package depends on applying miniaturization technology across virtually all subsystems. Technology innovations being incorporated include micro electromechanical system (MEMS) gyros, "camera-on-a-chip" CMOS imagers, a rechargeable xenon gas propulsion system, a rechargeable lithium ion battery, custom avionics based on the PowerPC 740 microprocessor, GPS relative navigation, digital radio frequency communications and tracking, micropatch antennas, digital instrumentation, and dense mechanical packaging. The Mini AERCam free-flyer will initially be integrated into an approximately flight-like configuration for demonstration on an air-bearing table. A pilot-in-the-loop and hardware-in-the-loop simulation of on-orbit navigation and dynamics will complement the air-bearing table demonstration. The Mini AERCam lab demonstration is intended to form the basis for future development of an AERCam flight system that provides beneficial on-orbit views unobtainable from fixed cameras, cameras on robotic manipulators, or cameras carried by EVA crewmembers.

  14. Navigation strategies for multiple autonomous mobile robots moving in formation

    NASA Technical Reports Server (NTRS)

    Wang, P. K. C.

    1991-01-01

    The problem of deriving navigation strategies for a fleet of autonomous mobile robots moving in formation is considered. Here, each robot is represented by a particle with a spherical effective spatial domain and a specified cone of visibility. The global motion of each robot in the world space is described by the equations of motion of the robot's center of mass. First, methods for formation generation are discussed. Then, simple navigation strategies for robots moving in formation are derived. A sufficient condition for the stability of a desired formation pattern for a fleet of robots each equipped with the navigation strategy based on nearest neighbor tracking is developed. The dynamic behavior of robot fleets consisting of three or more robots moving in formation in a plane is studied by means of computer simulation.
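
A minimal sketch of the nearest-neighbor tracking idea, assuming a chain formation in which each robot steers toward a fixed offset from the robot ahead of it (the gains, offsets, and point-mass kinematics are illustrative, not the paper's model):

```python
import numpy as np

def formation_step(positions, leader_vel, offset, gain=0.5, dt=0.1):
    """One step of nearest-neighbor tracking: robot i moves toward a
    desired offset from the robot directly ahead of it in the chain."""
    new = positions.copy()
    new[0] += leader_vel * dt  # the leader follows its own course
    for i in range(1, len(positions)):
        target = positions[i - 1] + offset  # track the preceding neighbor
        new[i] += gain * (target - positions[i]) * dt
    return new

positions = np.array([[0.0, 0.0], [-3.0, 2.0], [-5.0, -4.0]])
leader_vel = np.array([1.0, 0.0])
offset = np.array([-1.0, 0.0])  # desired spacing: 1 unit behind
for _ in range(500):
    positions = formation_step(positions, leader_vel, offset)
```

With a moving leader and a pure proportional law the followers align laterally but keep a steady-state lag along the direction of travel; compensating that lag is one reason the stability condition in the paper matters.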

  15. Navigation strategies for multiple autonomous mobile robots moving in formation

    NASA Technical Reports Server (NTRS)

    Wang, P. K. C.

    1991-01-01

    The problem of deriving navigation strategies for a fleet of autonomous mobile robots moving in formation is considered. Here, each robot is represented by a particle with a spherical effective spatial domain and a specified cone of visibility. The global motion of each robot in the world space is described by the equations of motion of the robot's center of mass. First, methods for formation generation are discussed. Then, simple navigation strategies for robots moving in formation are derived. A sufficient condition for the stability of a desired formation pattern for a fleet of robots each equipped with the navigation strategy based on nearest neighbor tracking is developed. The dynamic behavior of robot fleets consisting of three or more robots moving in formation in a plane is studied by means of computer simulation.

  16. REACT - A Third Generation Language For Autonomous Robot Systems

    NASA Astrophysics Data System (ADS)

    Longley, Maxwell J.; Owens, John; Allen, Charles R.; Ratcliff, Karl

    1990-03-01

    REACT is a language under development at Newcastle for the programming of autonomous robot systems, which uses AI constructs and sensor information to respond to failures in assumptions about the real world by replanning a task. This paper describes the important features of a REACT-programmed robotic system and the results of some initial studies on defining an executive language using a concept called visibility sets. Several constructs from the language are then applied to specific examples, e.g., a white-line follower and a railway network controller. The applicability of visibility sets to autonomous robots is evaluated.

  17. Tele-assistance for semi-autonomous robots

    NASA Technical Reports Server (NTRS)

    Rogers, Erika; Murphy, Robin R.

    1994-01-01

    This paper describes a new approach to semi-autonomous mobile robots, in which the robot has sufficient computerized intelligence to function autonomously under a certain set of conditions, while the local system is a cooperative decision-making unit that combines human and machine intelligence. Communication then takes place in a common mode and a common language. A number of exception-handling scenarios, constructed as a result of experiments with actual sensor data collected from two mobile robots, are presented.

  18. Mamdani Fuzzy System for Indoor Autonomous Mobile Robot

    NASA Astrophysics Data System (ADS)

    Khan, M. K. A. Ahamed; Rashid, Razif; Elamvazuthi, I.

    2011-06-01

    Several control algorithms for autonomous mobile robot navigation have been proposed in the literature. Recently, the employment of non-analytical methods of computing such as fuzzy logic, evolutionary computation, and neural networks has demonstrated the utility and potential of these paradigms for intelligent control of mobile robot navigation. In this paper, a Mamdani fuzzy system for an autonomous mobile robot is developed. The paper begins with a discussion of the conventional controller, followed by a detailed description of the fuzzy logic controller.
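
A toy Mamdani controller with one input (obstacle distance) and one output (speed) illustrates the min-implication, max-aggregation, centroid-defuzzification pipeline; the membership functions and rule base below are invented for illustration and are not the paper's controller.

```python
import numpy as np

def near(d):   # full membership at 0 m, none beyond 2 m
    return float(np.clip(1.0 - d / 2.0, 0.0, 1.0))

def far(d):    # none at 0 m, full membership beyond 2 m
    return float(np.clip(d / 2.0, 0.0, 1.0))

def mamdani_speed(d):
    """Mamdani inference for a hypothetical one-input rule base:
       IF distance is NEAR THEN speed is SLOW
       IF distance is FAR  THEN speed is FAST"""
    speed = np.linspace(0.0, 1.0, 101)           # output universe (m/s)
    slow = np.clip(1.0 - speed / 0.6, 0.0, 1.0)  # SLOW output set
    fast = np.clip((speed - 0.4) / 0.6, 0.0, 1.0)  # FAST output set
    # min implication per rule, max aggregation across rules (Mamdani)
    agg = np.maximum(np.minimum(near(d), slow), np.minimum(far(d), fast))
    return float((speed * agg).sum() / agg.sum())  # centroid defuzzification

print(mamdani_speed(0.2), mamdani_speed(3.0))  # close obstacle vs. open space
```

The centroid output varies smoothly with the input, which is the practical appeal of Mamdani systems over hard-threshold conventional controllers.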

  19. Experimentation and concept formation by an autonomous mobile robot

    SciTech Connect

    Spelt, P.F.; deSaussure, G.; Oliver, G.; Silliman, M.

    1990-01-01

    The Center for Engineering Systems Advanced Research (CESAR) conducts basic research in the area of intelligent machines. In this paper, we describe our approach to a class of machine learning which involves autonomous concept formation using feedback from trial-and-error experimentation with the environment. Our formulation was experimentally validated on an autonomous mobile robot, which learned the task of control panel monitoring and manipulation for effective process control. Conclusions are drawn concerning the applicability of the system to a more general class of learning problems, and implications for the use of autonomous mobile robots in hostile and unknown environments are discussed. 11 refs., 7 figs.

  20. Autonomous robot calibration for hand-eye coordination

    SciTech Connect

    Bennett, D.J.; Geiger, D. ); Hollerbach, J.M. )

    1991-10-01

    Autonomous robot calibration is defined as the process of determining a robot's model by using only its internal sensors. It is shown that autonomous calibration of a manipulator and stereo camera system is possible. The proposed autonomous calibration algorithm may obtain the manipulator kinematic parameters, external kinematic camera parameters, and internal camera parameters. To do this, only joint angle readings and camera image plane data are used. A condition for the identifiability of the manipulator/camera parameters is derived. The method is a generalization of a recently developed scheme for self-calibrating a manipulator by forming it into a mobile closed-loop kinematic chain.

  1. Sensory architectures for biologically inspired autonomous robotics.

    PubMed

    Higgins, C M

    2001-04-01

    Engineers have a lot to gain from studying biology. The study of biological neural systems alone provides numerous examples of computational systems that are far more complex than any man-made system and perform real-time sensory and motor tasks in a manner that humbles the most advanced artificial systems. Despite the evolutionary genesis of these systems and the vast apparent differences between species, there are common design strategies employed by biological systems that span taxa, and engineers would do well to emulate these strategies. However, biologically-inspired computational architectures, which are continuous-time and parallel in nature, do not map well onto conventional processors, which are discrete-time and serial in operation. Rather, an implementation technology that is capable of directly realizing the layered parallel structure and nonlinear elements employed by neurobiology is required for power- and space-efficient implementation. Custom neuromorphic hardware meets these criteria and yields low-power dedicated sensory systems that are small, light, and ideal for autonomous robot applications. As examples of how this technology is applied, this article describes both a low-level neuromorphic hardware emulation of an elementary visual motion detector, and a large-scale, system-level spatial motion integration system.
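
The "elementary visual motion detector" mentioned here is typically a Hassenstein-Reichardt delay-and-correlate circuit. The sketch below is a software stand-in for the neuromorphic hardware, with an invented 1-D stimulus: each photoreceptor's delayed signal is correlated with its neighbor's current signal, and the sign of the opponent output indicates the direction of motion.

```python
import numpy as np

def reichardt(frames):
    """1-D Hassenstein-Reichardt detector over a (time, position) array:
    correlate delayed signals with neighboring current signals and
    return the summed opponent output (positive => rightward motion)."""
    prev = frames[:-1]   # one-frame-delayed signals
    curr = frames[1:]
    rightward = prev[:, :-1] * curr[:, 1:]   # spot moved one pixel right
    leftward = prev[:, 1:] * curr[:, :-1]    # spot moved one pixel left
    return float((rightward - leftward).sum())

n = 16
t = np.arange(20)
frames = np.zeros((20, n))
frames[t, t % n] = 1.0          # bright spot drifting rightward
print(reichardt(frames))         # positive for rightward motion
```

Reversing the stimulus (`frames[:, ::-1]`) flips the sign, which is the opponent behavior the biological circuit exhibits.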

  2. Towards Principled Experimental Study of Autonomous Mobile Robots

    NASA Technical Reports Server (NTRS)

    Gat, Erann

    1995-01-01

    We review the current state of research in autonomous mobile robots and conclude that there is an inadequate basis for predicting the reliability and behavior of robots operating in unengineered environments. We present a new approach to the study of autonomous mobile robot performance based on formal statistical analysis of independently reproducible experiments conducted on real robots. Simulators serve as models rather than experimental surrogates. We demonstrate three new results: 1) Two commonly used performance metrics (time and distance) are not as well correlated as is often tacitly assumed. 2) The probability distributions of these performance metrics are exponential rather than normal, and 3) a modular, object-oriented simulation accurately predicts the behavior of the real robot in a statistically significant manner.
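
One practical consequence of result 2 is easy to check on one's own data: for an exponentially distributed performance metric the standard deviation equals the mean, so the sample coefficient of variation should be near 1, whereas a narrow normal distribution gives a value well below 1. A sketch with synthetic run times (the data here are simulated, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(1)
# hypothetical traversal times (seconds) from repeated runs of a robot
times = rng.exponential(scale=40.0, size=200)

# Exponential: std == mean, so the coefficient of variation is ~1.
cv = times.std(ddof=1) / times.mean()
print(round(cv, 2))
```

A CV near 1 also warns that reporting a mean without the full distribution, as is common, hides most of the variability in robot performance.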

  3. Towards Principled Experimental Study of Autonomous Mobile Robots

    NASA Technical Reports Server (NTRS)

    Gat, Erann

    1995-01-01

    We review the current state of research in autonomous mobile robots and conclude that there is an inadequate basis for predicting the reliability and behavior of robots operating in unengineered environments. We present a new approach to the study of autonomous mobile robot performance based on formal statistical analysis of independently reproducible experiments conducted on real robots. Simulators serve as models rather than experimental surrogates. We demonstrate three new results: 1) Two commonly used performance metrics (time and distance) are not as well correlated as is often tacitly assumed. 2) The probability distributions of these performance metrics are exponential rather than normal, and 3) a modular, object-oriented simulation accurately predicts the behavior of the real robot in a statistically significant manner.

  4. Autonomous Evolution of Dynamic Gaits with Two Quadruped Robots

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.; Takamura, Seichi; Yamamoto, Takashi; Fujita, Masahiro

    2004-01-01

    A challenging task that must be accomplished for every legged robot is creating the walking and running behaviors needed for it to move. In this paper we describe our system for autonomously evolving dynamic gaits on two of Sony's quadruped robots. Our evolutionary algorithm runs on board the robot and uses the robot's sensors to compute the quality of a gait without assistance from the experimenter. First we show the evolution of pace and trot gaits on the OPEN-R prototype robot. With the fastest gait, the robot moves at over 10 m/min, which is more than forty body-lengths/min. While these first gaits are somewhat sensitive to the robot and environment in which they are evolved, we then show the evolution of robust dynamic gaits, one of which is used on the ERS-110, the first consumer version of AIBO.
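
The on-board loop can be sketched as a (1+1) evolution strategy: mutate the gait parameters, measure the resulting speed with the robot's own sensors, and keep the child only if it walks at least as fast. The fitness function below is a synthetic stand-in for the on-robot measurement, and the parameter vector and mutation scale are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def measured_speed(params):
    """Stand-in for on-robot fitness: in the real system the robot walks
    with these gait parameters and its sensors report the resulting speed.
    Here, a synthetic function with a known optimum at `best`."""
    best = np.array([0.8, 0.3, 0.5])
    return -float(np.sum((params - best) ** 2))

# (1+1) evolution strategy: mutate, keep the child only if it is faster
params = rng.uniform(0.0, 1.0, size=3)
fitness = measured_speed(params)
for _ in range(300):
    child = np.clip(params + rng.normal(0.0, 0.05, size=3), 0.0, 1.0)
    f = measured_speed(child)
    if f >= fitness:
        params, fitness = child, f
```

Because every evaluation requires the robot to actually walk, the number of fitness evaluations, not CPU time, dominates the cost of such on-board evolution.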

  5. ARK: Autonomous mobile robot in an industrial environment

    NASA Technical Reports Server (NTRS)

    Nickerson, S. B.; Jasiobedzki, P.; Jenkin, M.; Jepson, A.; Milios, E.; Down, B.; Service, J. R. R.; Terzopoulos, D.; Tsotsos, J.; Wilkes, D.

    1994-01-01

    This paper describes research on the ARK (Autonomous Mobile Robot in a Known Environment) project. The technical objective of the project is to build a robot that can navigate in a complex industrial environment using maps with permanent structures. The environment is not altered in any way by adding easily identifiable beacons; the robot relies on naturally occurring objects to use as visual landmarks for navigation. The robot is equipped with various sensors that can detect unmapped obstacles, landmarks and objects. In this paper we describe the robot's industrial environment, its architecture, a novel combined range and vision sensor, and our recent results in controlling the robot, in the real-time detection of objects using their color, and in the processing of the robot's range and vision sensor data for navigation.

  6. A Vision-Based Trajectory Controller for Autonomous Cleaning Robots

    NASA Astrophysics Data System (ADS)

    Gerstmayr, Lorenz; Röben, Frank; Krzykawski, Martin; Kreft, Sven; Venjakob, Daniel; Möller, Ralf

    Autonomous cleaning robots should completely cover the accessible area with minimal repeated coverage. We present a mostly vision-based navigation strategy for systematic exploration of an area along meandering lanes. The results of the robot experiments show that our approach can guide the robot along parallel lanes while achieving good coverage with only a small proportion of repeated coverage. The proposed method can be used as a building block for more elaborate navigation strategies which allow the robot to systematically clean rooms with a complex workspace shape.

  7. Autonomous Realtime Threat-Hunting Robot (ARTHR)

    ScienceCinema

    Idaho National Laboratory - David Bruemmer, Curtis Nielsen

    2016-07-12

    Idaho National Laboratory researchers developed an intelligent plug-and-play robot payload that transforms commercial robots into effective first responders for deadly chemical, radiological and explosive threats.

  8. Autonomous Realtime Threat-Hunting Robot (ARTHR)

    SciTech Connect

    Idaho National Laboratory - David Bruemmer, Curtis Nielsen

    2008-05-29

    Idaho National Laboratory researchers developed an intelligent plug-and-play robot payload that transforms commercial robots into effective first responders for deadly chemical, radiological and explosive threats.

  9. Taking on the tall poles of autonomous robot navigation

    NASA Astrophysics Data System (ADS)

    Rosenblum, Mark; Rajagopalan, Venkat; Steinbis, John; Haddon, John; Cannon, Paul

    2011-05-01

    The Holy Grail of autonomous ground robotics has been to make ground vehicles that behave like humans. Over the years, as a community, we have realized the difficulty of this task; we have backpedaled from the initial Holy Grail and have constrained and narrowed the domains of operation in order to get robotic systems fielded. This has led to phrases such as "operation in structured environments" and "open-and-rolling terrain" in the context of autonomous robot navigation. Unfortunately, constraining the problem in this way has only put off the inevitable, i.e., solving the myriad difficult robotics problems that we identified as long ago as the 1980s on the Autonomous Land Vehicle project and, in most cases, are still facing today. These "Tall Poles" have included, but are not limited to: navigation through complex terrain geometry, navigation through thick vegetation, the detection of geometry-less obstacles such as negative obstacles and thin obstacles, the ability to deal with diverse and dynamic environmental conditions, the ability to function in dynamic and cluttered environments alongside humans, and any combination of the above. This paper is an overview of the progress we have made at Autonomous Systems over the last three years in trying to knock down some of the tall poles remaining in the field of autonomous ground robotics.

  10. Interactive animated display of man-controlled and autonomous robots

    SciTech Connect

    Crane, C.D. III; Duffy, J.

    1986-01-01

    An interactive computer graphics program has been developed which allows an operator to more readily control robot motions in two distinct modes; viz., man-controlled and autonomous. In man-controlled mode, the robot is guided by a joystick or similar device. As the robot moves, actual joint angle information is measured and supplied to a graphics system which accurately duplicates the robot motion. Obstacles are placed in the actual and animated workspace and the operator is warned of imminent collisions by sight and sound via the graphics system. Operation of the system in man-controlled mode is shown. In autonomous mode, a collision-free path between specified points is obtained by previewing robot motions on the graphics system. Once a satisfactory path is selected, the path characteristics are transmitted to the actual robot and the motion is executed. The telepresence system developed at the University of Florida has been successful in demonstrating that the concept of controlling a robot manipulator with the aid of an interactive computer graphics system is feasible and practical. The clarity of images coupled with real-time interaction and real-time determination of imminent collision with obstacles has resulted in improved operator performance. Furthermore, the ability for an operator to preview and supervise autonomous operations is a significant attribute when operating in a hazardous environment.

  11. Vision Based Autonomous Robotic Control for Advanced Inspection and Repair

    NASA Technical Reports Server (NTRS)

    Wehner, Walter S.

    2014-01-01

    The advanced inspection system is an autonomous control and analysis system that improves inspection and remediation operations for ground and surface systems. It uses optical imaging technology with intelligent computer vision algorithms to analyze physical features of the real-world environment, make decisions, and learn from experience. The advanced inspection system will control a robotic manipulator arm, an unmanned ground vehicle, and cameras remotely, automatically, and autonomously. Many computer vision, image processing, and machine learning techniques are available as open source for using vision as sensory feedback in decision-making and autonomous robotic movement. My responsibilities for the advanced inspection system are to create a software architecture that integrates and provides a framework for all the different subsystem components, identify open-source algorithms and techniques, and integrate the robot hardware.

  12. Online Tracking Control of Autonomous Mobile Robot Utilizing Optimal Formulation

    NASA Astrophysics Data System (ADS)

    Hirakoso, Nobuto; Takizawa, Takahiro; Ishihara, Masaaki; Aoki, Kouzou

    In this study, the objective is to build a wheeled mobile robot that can move independently while avoiding obstacles. To move autonomously, the robot detects obstacles' shapes and performs self-localization. The robot then moves by tracking trajectories it designs itself, based on the information about the obstacles' shapes and the robot's position and attitude angle. The optimal trajectories that lead the robot to its destination are designed using a unique optimization method. Because the convergent calculation is performed by restricting the variables to a certain range, this proposed optimization method can obtain approximately optimal solutions even when the numbers of input and output variables differ and when the nonlinearity is strong with constraint conditions. In this paper, the effectiveness of the optimal trajectory design method is demonstrated and the method is shown to be appropriate.

  13. Autonomous assistance navigation for robotic wheelchairs in confined spaces.

    PubMed

    Cheein, Fernando Auat; Carelli, Ricardo; De la Cruz, Celso; Muller, Sandra; Bastos Filho, Teodiano F

    2010-01-01

    In this work, a visual interface for assisting the navigation of a robotic wheelchair is presented. The visual interface is developed for navigation in confined spaces such as narrow corridors or corridor ends. The interface offers two navigation modes: non-autonomous and autonomous. Non-autonomous driving of the robotic wheelchair is done by means of a hand joystick, which directs the motion of the vehicle within the environment. Autonomous driving is performed when the user of the wheelchair has to turn (90, -90 or 180 degrees) within the environment. The turning strategy is implemented by a maneuverability algorithm compatible with the kinematics of the wheelchair and by a SLAM (Simultaneous Localization and Mapping) algorithm. The SLAM algorithm provides the interface with information concerning the layout of the environment and the pose (position and orientation) of the wheelchair within it. Experimental and statistical results for the interface are also shown in this work.

  14. Remote radioactive waste drum inspection with an autonomous mobile robot

    SciTech Connect

    Heckendorn, F.M.; Ward, C.R.; Wagner, D.G.

    1992-01-01

    An autonomous mobile robot is being developed to perform remote surveillance and inspection tasks on large numbers of stored radioactive waste drums. The robot will be self-guided through narrow storage aisles, recording the visual image of each viewable drum for subsequent off-line analysis and archiving. The system will remove personnel from potential radiation exposure, perform the required inspections, and improve the ability to assess long-term trends in drum conditions.

  15. Remote radioactive waste drum inspection with an autonomous mobile robot

    SciTech Connect

    Heckendorn, F.M.; Ward, C.R.; Wagner, D.G.

    1992-11-01

    An autonomous mobile robot is being developed to perform remote surveillance and inspection tasks on large numbers of stored radioactive waste drums. The robot will be self-guided through narrow storage aisles, recording the visual image of each viewable drum for subsequent off-line analysis and archiving. The system will remove personnel from potential radiation exposure, perform the required inspections, and improve the ability to assess long-term trends in drum conditions.

  16. Brain, mind, body and society: autonomous system in robotics.

    PubMed

    Shimoda, Motomu

    2013-12-01

    In this paper I examine the issues related to robots with minds. Creating a robot with a mind aims to recreate neural function through engineering. The robot with a mind is expected not only to process external information by means of its built-in program and behave accordingly, but also to attain conscious activity responsive to multiple conditions, with flexible and interactive communication skills for coping with unknown situations. That prospect is based on the development of artificial intelligence, in which self-organizing and self-emergent functions have become available in recent years. To date, controllable aspects of robotics have been restricted to data preparation and the programming of cognitive abilities, while consciousness activities and communication skills have been regarded as uncontrollable due to their contingency and uncertainty. However, some researchers in robotics claim that every activity of the mind can be recreated by engineering and is therefore controllable. Based on the development of the cognitive abilities of children and the findings of neuroscience, researchers have attempted to produce the latest artificial intelligence with autonomous learning systems. I conclude that controllability is inconsistent with autonomy in the genuine sense, and that autonomous robots recreated by engineering cannot be autonomous partners of humans.

  17. FPGA implementation of vision algorithms for small autonomous robots

    NASA Astrophysics Data System (ADS)

    Anderson, J. D.; Lee, D. J.; Archibald, J. K.

    2005-10-01

    The use of on-board vision with small autonomous robots has been made possible by the advances in the field of Field Programmable Gate Array (FPGA) technology. By connecting a CMOS camera to an FPGA board, on-board vision has been used to reduce the computation time inherent in vision algorithms. The FPGA board allows the user to create custom hardware in a faster, safer, and more easily verifiable manner that decreases the computation time and allows the vision to be done in real-time. Real-time vision tasks for small autonomous robots include object tracking, obstacle detection and avoidance, and path planning. Competitions were created to demonstrate that our algorithms work with our small autonomous vehicles in dealing with these problems. These competitions include Mouse-Trapped-in-a-Box, where the robot has to detect the edges of a box that it is trapped in and move towards them without touching them; Obstacle Avoidance, where an obstacle is placed at any arbitrary point in front of the robot and the robot has to navigate itself around the obstacle; Canyon Following, where the robot has to move to the center of a canyon and follow the canyon walls trying to stay in the center; the Grand Challenge, where the robot had to navigate a hallway and return to its original position in a given amount of time; and Stereo Vision, where a separate robot had to catch tennis balls launched from an air powered cannon. Teams competed on each of these competitions that were designed for a graduate-level robotic vision class, and each team had to develop their own algorithm and hardware components. This paper discusses one team's approach to each of these problems.

  18. GPS and odometer data fusion for outdoor robots continuous positioning

    NASA Astrophysics Data System (ADS)

    Pozo-Ruz, Ana; Garcia-Perez, Lia; Garcia-Alegre, Maria C.; Guinea, Domingo; Ribeiro, Angela; Sandoval, Francisco

    2002-02-01

    The present work describes an approach to obtaining the best estimate of the position of the outdoor robot ROJO, a low-cost lawnmower intended to perform unmanned precision-agriculture tasks such as the spraying of pesticides in horticulture. For continuous localization of ROJO, two redundant sensors have been installed on board: a DGPS receiver with submetric precision and an odometric system. The DGPS allows absolute positioning of the vehicle in the field, but GPS signal-reception failures due to obstacles and electrical and meteorological disturbances led us to integrate the odometric system. Thus, a robust odometer based upon magnetic strip sensors has been designed and integrated into the vehicle. These sensors continuously deliver the position of the vehicle relative to its initial position, covering the DGPS blind periods. They give an approximate location of the vehicle in the field that can in turn be updated and corrected by the DGPS. To provide the best estimate, a fusion algorithm has been proposed and tested, wherein the best estimate is calculated as the maximum of the joint probability function obtained from the position estimates of the onboard sensors. Some results are presented to show the performance of the proposed sensor-fusion technique.
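
When both position estimates are modeled as Gaussians, maximizing their joint probability reduces to inverse-variance weighting. A sketch under that simplifying assumption (isotropic variances and illustrative numbers, not the paper's actual sensor models):

```python
import numpy as np

def fuse(gps_pos, gps_var, odo_pos, odo_var):
    """Fuse two independent position estimates by maximizing their joint
    (product) Gaussian likelihood: the inverse-variance weighted mean.
    Variances are assumed isotropic (one scalar per sensor)."""
    w_gps = 1.0 / gps_var
    w_odo = 1.0 / odo_var
    pos = (w_gps * gps_pos + w_odo * odo_pos) / (w_gps + w_odo)
    var = 1.0 / (w_gps + w_odo)  # fused variance is always smaller
    return pos, var

# DGPS is the more precise sensor here, so the fused estimate leans toward it
pos, var = fuse(np.array([10.0, 5.0]), 0.25, np.array([10.4, 5.2]), 1.0)
print(pos, var)
```

During DGPS blind periods one would simply propagate the odometric estimate alone (its variance growing with distance traveled) until the next valid fix arrives.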

  19. ODYSSEUS autonomous walking robot: The leg/arm design

    NASA Technical Reports Server (NTRS)

    Bourbakis, N. G.; Maas, M.; Tascillo, A.; Vandewinckel, C.

    1994-01-01

    ODYSSEUS is an autonomous walking robot which makes use of three wheels and three legs for its movement in the free navigation space. More specifically, it makes use of its wheels to move around in environments where the surface is smooth and even. However, when there are low obstacles, stairs, or small unevenness in the navigation environment, the robot makes use of both wheels and legs to travel efficiently. In this paper we present the detailed hardware design and the simulated behavior of the extended leg/arm part of the robot, since it plays a very significant role in the robot's actions (movements, selection of objects, etc.). In particular, the leg/arm consists of three major parts: The first part is a pipe attached to the robot base with a flexible 3-D joint. This pipe has a rotating bar as an extended part, which terminates in a 3-D flexible joint. The second part of the leg/arm is also a pipe similar to the first. The extended bar of the second part ends at a 2-D joint. The last part of the leg/arm is a clip-hand. It is used for picking up small, lightweight objects, and when it is in a 'closed' mode, it is used as a supporting part of the robot leg. The entire leg/arm is controlled and synchronized by a microcontroller (68HC11) attached to the robot base.

  20. Development of a Commercially Viable, Modular Autonomous Robotic Systems for Converting any Vehicle to Autonomous Control

    NASA Technical Reports Server (NTRS)

    Parish, David W.; Grabbe, Robert D.; Marzwell, Neville I.

    1994-01-01

    A Modular Autonomous Robotic System (MARS), consisting of a modular autonomous vehicle control system that can be retrofitted onto any vehicle to convert it to autonomous control and support a modular payload for multiple applications, is being developed. The MARS design is scalable, reconfigurable, and cost effective due to the use of modern open-system-architecture design methodologies, including serial control bus technology to simplify system wiring and enhance scalability. The design is augmented with modular, object-oriented (C++) software implementing a hierarchy of five levels of control: teleoperated, continuous guidepath following, periodic guidepath following, absolute-position autonomous navigation, and relative-position autonomous navigation. The present effort is focused on producing a system that is commercially viable for routine autonomous patrolling of known, semistructured environments, such as environmental monitoring of chemical and petroleum refineries, exterior physical security and surveillance, perimeter patrolling, and intrafacility transport applications.

  2. An approach to autonomous operations for remote mobile robotic exploration

    NASA Technical Reports Server (NTRS)

    Chouinard, C.; Fisher, F.; Gaines, D.; Estlin, T.; Schaffer, S.

    2003-01-01

    This paper presents arguments for a balanced approach to modelling and reasoning in an autonomous robotic system. The framework discussed utilizes both declarative and procedural modelling to define the domain, rules, and constraints of the system and also balances the use of deliberative and reactive reasoning during execution.

  3. Navigation and learning experiments by an autonomous robot

    SciTech Connect

    de Saussure, G.; Weisbin, C.R.; Spelt, P.F.

    1988-01-01

    Developing an autonomous mobile robot capable of navigation, surveillance and manipulation in complex and dynamic environments is a key research activity at CESAR, Oak Ridge National Laboratory's Center for Engineering Systems Advanced Research. The latest series of completed experiments was performed using the autonomous mobile robot HERMIES-IIB (Hostile Environment Robotic Machine Intelligence Experiment Series II-B). The next section describes HERMIES-IIB and some of its major components required for autonomous operation in unstructured, dynamic environments. Section 3 outlines some ongoing research in autonomous navigation. Section 4 discusses our newest research in machine learning concepts. Section 5 describes a successful experiment in which the robot is placed in an arbitrary initial location without any prior specification of the content of its environment, successively discovers and navigates around stationary or moving obstacles, picks up and moves small obstacles, searches for a control panel and performs a learned sequence of manipulations on the panel devices. The last section outlines some future directions of the program.

  4. Modeling and Control Strategies for Autonomous Robotic Systems

    DTIC Science & Technology

    1991-12-23

    Final report. Personal author: Roger W. Brockett. Research Triangle Park, NC 27709-2211.

  5. GRACE: An Autonomous Robot for the AAAI Robot Challenge

    DTIC Science & Technology

    2003-01-01

    robot interaction (aside from the speech recognition) worked relatively well, there were areas for improvement. For instance, gesture recognition, which...and tracking, people following, gesture recognition, nametag reading, and face recognition. We plan to incorporate capabilities for the robot to

  6. Defining proprioceptive behaviors for autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Overholt, James L.; Hudas, Greg R.; Gerhart, Grant R.

    2002-07-01

    Proprioception is a sense of body position and movement that supports the control of many automatic motor functions such as posture and locomotion. This concept, normally relegated to the fields of neural physiology and kinesiology, is being utilized in the field of unmanned mobile robotics. This paper looks at developing proprioceptive behaviors for use in controlling an unmanned ground vehicle. First, we will discuss the field of behavioral control of mobile robots. Next, a discussion of proprioception and the development of proprioceptive sensors will be presented. We will then focus on the development of a unique neural-fuzzy architecture that will be used to incorporate the control behaviors coming directly from the proprioceptive sensors. Finally we will present a simulation experiment where a simple multi-sensor robot, utilizing both external and proprioceptive sensors, is presented with the task of navigating an unknown terrain to a known target position. Results of the mobile robot utilizing this unique fusion methodology will be discussed.
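
    A neural-fuzzy blending of proprioceptive input into vehicle control can be sketched, for instance, as fuzzy rules on a pitch (body-attitude) sensor that scale the commanded speed; the membership shapes and rule weights below are hypothetical, not the architecture from the paper:

    ```python
    def tri(x, a, b, c):
        # Triangular fuzzy membership function on [a, c] peaking at b.
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def safe_speed(pitch_deg, v_max=1.0):
        # Fuzzy rules: LEVEL terrain -> fast, STEEP terrain -> slow.
        # Defuzzify by a weighted average of the rule outputs.
        level = tri(pitch_deg, -15, 0, 15)
        steep = max(tri(pitch_deg, 10, 30, 50), tri(-pitch_deg, 10, 30, 50))
        if level + steep == 0:
            return 0.0                     # outside all rule supports: stop
        return (level * v_max + steep * 0.2 * v_max) / (level + steep)
    ```

    On level ground the full speed is commanded; on a 30-degree slope only the slow rule fires, so the proprioceptive sense directly gates locomotion, as in the behaviors described above.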

  7. The Baker Observatory Robotic Autonomous Telescope

    NASA Astrophysics Data System (ADS)

    Reed, Mike D.; Thompson, Matthew A.; Hicks, L. L.; Baran, A. S.

    2011-03-01

    The objective of our project is to have an autonomous observatory to obtain long duration time-series observations of pulsating stars. Budget constraints dictate an inexpensive facility. In this paper, we discuss our solution.

  8. JOMAR: Joint Operations with Mobile Autonomous Robots

    DTIC Science & Technology

    2015-12-21

    the locations of attenuating materials in the robots’ environment. We also extend prior tomographic and correlation-based approaches to the multi-robot... correlative signal prediction techniques to the case of multiple mobile robots without a base station. • A new signal-strength prediction method...be used to predict packet success rates. There are two main approaches to signal prediction in the literature – the first is correlative in nature

  9. Concurrent planning and execution for autonomous robots

    NASA Technical Reports Server (NTRS)

    Simmons, Reid G.

    1992-01-01

    The Task Control Architecture (TCA) provides communication and coordination facilities to construct distributed, concurrent robotic systems. The use of TCA in a system that walks a legged robot through rugged terrain is described. The walking system, as originally implemented, had a sequential sense-plan-act control cycle. Utilizing TCA features for task sequencing and monitoring, the system was modified to concurrently plan and execute steps. Walking speed improved by over 30 percent, with only a relatively modest conversion effort.
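
    The concurrency change described, planning the next step while executing the current one, amounts to a two-stage pipeline between a planner and an executor. A minimal sketch using threads and a bounded queue, with hypothetical timings and step counts (TCA itself is a distributed message-passing system, which this does not reproduce):

    ```python
    import queue
    import threading
    import time

    def planner(steps, out_q):
        # Plan footsteps and hand each one to the executor as soon
        # as it is ready, instead of waiting for execution to finish.
        for s in range(steps):
            time.sleep(0.01)          # stand-in for planning a step
            out_q.put(s)
        out_q.put(None)               # sentinel: no more steps

    def execute(in_q, executed):
        while True:
            s = in_q.get()
            if s is None:
                break
            time.sleep(0.01)          # stand-in for executing the step
            executed.append(s)

    q = queue.Queue(maxsize=2)        # bounded buffer bounds look-ahead
    done = []
    t = threading.Thread(target=planner, args=(5, q))
    t.start()
    execute(q, done)
    t.join()
    ```

    Because planning of step n+1 overlaps execution of step n, total walking time approaches max(plan, execute) per step rather than their sum, which is consistent with the reported speedup.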

  10. Research in autonomous robotics at ORNL using HERMIES-III

    SciTech Connect

    Weisbin, C.R.; Burks, B.L.; Einstein, J.R.; Feezell, R.R.; Manges, W.W.; Thompson, D.H.

    1989-01-01

    HERMIES-III is an autonomous robot comprised of a seven-degree-of-freedom (DOF) manipulator designed for human scale tasks, a laser range finder, a sonar array, an omnidirectional wheel-driven chassis, multiple cameras, and a dual computer system containing a 16-node hypercube expandable to 128 nodes. The current experimental program involves performance of human-scale tasks (e.g., valve manipulation, use of tools), integration of a dexterous manipulator and platform motion in geometrically complex environments, and effective use of multiple cooperating robots (HERMIES-IIB and HERMIES-III). The environment in which the robots operate has been designed to include multiple valves, pipes, meters, obstacles on the floor, valves occluded from view, and multiple paths of differing navigation complexity. The ongoing research program supports the development of autonomous capability for HERMIES-IIB and III to perform complex navigation and manipulation under time constraints, while dealing with imprecise sensory information. 10 refs., 4 figs.

  11. Autonomous learning in humanoid robotics through mental imagery.

    PubMed

    Di Nuovo, Alessandro G; Marocco, Davide; Di Nuovo, Santo; Cangelosi, Angelo

    2013-05-01

    In this paper we focus on modeling autonomous learning to improve the performance of a humanoid robot through a modular artificial neural network architecture. A model of a neural controller is presented, which allows the humanoid robot iCub to autonomously improve its sensorimotor skills. This is achieved by endowing the neural controller with a secondary neural system that, by exploiting the sensorimotor skills already acquired by the robot, is able to generate additional imaginary examples that can be used by the controller itself to improve performance through simulated mental training. Results and analysis presented in the paper provide evidence of the viability of the proposed approach and help to clarify the rationale behind the chosen model and its implementation.
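
    The mechanism described, a secondary system replaying an already-learned mapping to produce extra "imaginary" training examples, can be sketched in miniature with a linear sensorimotor mapping; the data and model below are illustrative stand-ins, not the iCub controller:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Real sensorimotor experience: noisy samples of a simple mapping.
    X_real = rng.uniform(-1, 1, 40)
    y_real = 2.0 * X_real + rng.normal(0, 0.05, 40)

    # Primary controller: least-squares fit on real experience.
    w = np.polyfit(X_real, y_real, 1)

    # "Mental imagery": the secondary system replays the learned
    # mapping to generate additional imaginary examples for training.
    X_imag = rng.uniform(-1, 1, 200)
    y_imag = np.polyval(w, X_imag)

    # Retrain on real plus imaginary experience.
    w2 = np.polyfit(np.concatenate([X_real, X_imag]),
                    np.concatenate([y_real, y_imag]), 1)
    ```

    In this toy setting the imaginary examples simply densify coverage of the input space; in the paper the same idea is applied to a far richer sensorimotor model.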

  12. Applications of concurrent neuromorphic algorithms for autonomous robots

    NASA Technical Reports Server (NTRS)

    Barhen, J.; Dress, W. B.; Jorgensen, C. C.

    1988-01-01

    This article provides an overview of studies at the Oak Ridge National Laboratory (ORNL) of neural networks running on parallel machines applied to the problems of autonomous robotics. The first section provides the motivation for our work in autonomous robotics and introduces the computational hardware in use. Section 2 presents two theorems concerning the storage capacity and stability of neural networks. Section 3 presents a novel load-balancing algorithm implemented with a neural network. Section 4 introduces the robotics test bed now in place. Section 5 concerns navigation issues in the test-bed system. Finally, Section 6 presents a frequency-coded network model and shows how Darwinian techniques are applied to issues of parameter optimization and on-line design.

  13. Experiments in teleoperator and autonomous control of space robotic vehicles

    NASA Technical Reports Server (NTRS)

    Alexander, Harold L.

    1990-01-01

    A research program and strategy are described which include fundamental teleoperation issues and autonomous-control issues of sensing and navigation for satellite robots. The program consists of developing interfaces for visual operation and studying the consequences of interface designs as well as developing navigation and control technologies based on visual interaction. A space-robot-vehicle simulator is under development for use in virtual-environment teleoperation experiments and neutral-buoyancy investigations. These technologies can be utilized in a study of visual interfaces to address tradeoffs between head-tracking and manual remote cameras, panel-mounted and helmet-mounted displays, and stereoscopic and monoscopic display systems. The present program can provide significant data for the development of control experiments for autonomously controlled satellite robots.

  14. System safety analysis of an autonomous mobile robot

    SciTech Connect

    Bartos, R.J.

    1994-08-01

    Analysis of the safety of operating and maintaining the Stored Waste Autonomous Mobile Inspector (SWAMI) II in a hazardous environment at the Fernald Environmental Management Project (FEMP) was completed. The SWAMI II is a version of a commercial robot, the HelpMate™ robot produced by the Transitions Research Corporation, which is being updated to incorporate the systems required for inspecting mixed toxic chemical and radioactive waste drums at the FEMP. It also has modified obstacle detection and collision avoidance subsystems. The robot will autonomously travel down the aisles in storage warehouses to record images of containers and collect other data which are transmitted to an inspector at a remote computer terminal. A previous study showed the SWAMI II has economic feasibility. The SWAMI II will more accurately locate radioactive contamination than human inspectors. This thesis includes a System Safety Hazard Analysis and a quantitative Fault Tree Analysis (FTA). The objectives of the analyses are to prevent potentially serious events and to derive a comprehensive set of safety requirements from which the safety of the SWAMI II and other autonomous mobile robots can be evaluated. The Computer-Aided Fault Tree Analysis (CAFTA©) software is utilized for the FTA. The FTA shows that more than 99% of the safety risk occurs during maintenance, and that when the derived safety requirements are implemented the rate of serious events is reduced to below one event per million operating hours. Training and procedures in SWAMI II operation and maintenance provide an added safety margin. This study will promote the safe use of the SWAMI II and other autonomous mobile robots in the emerging technology of mobile robotic inspection.
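
    The quantitative FTA mentioned combines basic-event probabilities through AND/OR gates up to a top event. A minimal sketch with hypothetical, illustrative failure probabilities (not the actual SWAMI II fault tree):

    ```python
    def or_gate(probs):
        # P(at least one event) for independent basic events.
        p = 1.0
        for q in probs:
            p *= (1.0 - q)
        return 1.0 - p

    def and_gate(probs):
        # P(all events) for independent basic events.
        p = 1.0
        for q in probs:
            p *= q
        return p

    # Hypothetical tree: the top event requires a hazard (OR of two
    # failure modes) AND a failed safeguard.
    hazard = or_gate([1e-4, 5e-5])
    top = and_gate([hazard, 1e-2])
    ```

    Tools such as CAFTA evaluate much larger trees of this form and also compute importance measures showing which basic events (here, the maintenance-phase ones) dominate the top-event probability.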

  15. Multiagent collaboration for experimental calibration of an autonomous mobile robot

    NASA Astrophysics Data System (ADS)

    Vachon, Bertrand; Berge-Cherfaoui, Veronique

    1991-03-01

    This paper presents an action in mission SOCRATES whose aim is the development of a self-calibration method for an autonomous mobile robot. The robot has to determine the precise location of the coordinate system shared by its sensors. Knowledge of this system is a sine qua non condition for efficient multisensor fusion and autonomous navigation in an unknown environment. But, as perceptions and motions are not accurate, this knowledge can only be achieved by multisensor fusion. The application described highlights this kind of problem. Multisensor fusion is used here especially in its symbolic aspect. Useful knowledge includes both numerous data coming from various sensors and suitable ways to process these data. A blackboard architecture has been chosen to manage useful information. Knowledge sources are called agents, and they implement physical sensors (perceptors or actuators) as well as logical sensors (high-level data processors). The problem to solve is self-calibration, which includes the determination of the coordinate system R of the robot and the transformations necessary to convert data from sensor reference frames to R. The origin of R has been chosen to be O, the rotation center of the robot. As its genuine location may vary due to robot or ground characteristics, an experimental determination of O is attempted. A strategy for measuring distances in approximate positions is proposed. This strategy must take into account the fact that motions of the robot as well as perceptions may be inaccurate. Results obtained during experiments and future extensions of the system are discussed.
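
    One common way to determine a rotation center O experimentally is to command a pure rotation and fit a circle to the positions traced by an off-center sensor point; a least-squares (Kasa) circle fit is sketched below as an illustrative approach, not necessarily the strategy used in the paper:

    ```python
    import numpy as np

    def fit_circle_center(pts):
        """Kasa least-squares circle fit: rewrite
        (x-a)^2 + (y-b)^2 = r^2 as the linear system
        x^2 + y^2 = 2a*x + 2b*y + c and solve for (a, b, c)."""
        x, y = pts[:, 0], pts[:, 1]
        A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
        sol, *_ = np.linalg.lstsq(A, x**2 + y**2, rcond=None)
        return sol[0], sol[1]   # estimated center: rotation center O
    ```

    Because each measured point is noisy in practice, averaging over many rotation angles (as a least-squares fit does) is exactly the kind of redundancy the multisensor-fusion strategy above relies on.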

  16. Biomimetic smart sensors for autonomous robotic behavior II: vestibular processing

    NASA Astrophysics Data System (ADS)

    Xue, Shuwan; Deligeorges, Socrates; Soloway, Aaron; Lichtenstein, Lee; Gore, Tyler; Hubbard, Allyn

    2009-05-01

    Limited autonomous behaviors are fast becoming a critical capability in the field of robotics as robotic applications are used in more complicated and interactive environments. As additional sensory capabilities are added to robotic platforms, sensor fusion to enhance and facilitate autonomous behavior becomes increasingly important. Using biology as a model, the equivalent of a vestibular system needs to be created in order to orient the system within its environment and allow multi-modal sensor fusion. In mammals, the vestibular system plays a central role in physiological homeostasis and sensory information integration (Fuller et al, Neuroscience 129 (2004) 461-471). At the level of the Superior Colliculus in the brain, there is multimodal sensory integration across visual, auditory, somatosensory, and vestibular inputs (Wallace et al, J Neurophysiol 80 (1998) 1006-1010), with the vestibular component contributing a strong reference frame gating input. Using a simple model for the deep layers of the Superior Colliculus, an off-the-shelf 3-axis solid state gyroscope and accelerometer was used as the equivalent representation of the vestibular system. The acceleration and rotational measurements are used to determine the relationship between a local reference frame of a robotic platform (an iRobot Packbot®) and the inertial reference frame (the outside world), with the simulated vestibular input tightly coupled with the acoustic and optical inputs. Field testing of the robotic platform using acoustics to cue optical sensors coupled through a biomimetic vestibular model for "slew to cue" gunfire detection have shown great promise.

  17. An Aerial–Ground Robotic System for Navigation and Obstacle Mapping in Large Outdoor Areas

    PubMed Central

    Garzón, Mario; Valente, João; Zapata, David; Barrientos, Antonio

    2013-01-01

    There are many outdoor robotic applications where a robot must reach a goal position or explore an area without previous knowledge of the environment around it. Additionally, other applications (like path planning) require the use of known maps or previous information about the environment. This work presents a system composed of a terrestrial and an aerial robot that cooperate and share sensor information in order to address those requirements. The ground robot is able to navigate in an unknown large environment aided by visual feedback from a camera on board the aerial robot. At the same time, the obstacles are mapped in real time by putting together the information from the camera and the positioning system of the ground robot. A set of experiments was carried out with the purpose of verifying the system's applicability. The experiments were performed in a simulation environment and outdoors with a medium-sized ground robot and a mini quad-rotor. The proposed robotic system shows outstanding results in simultaneous navigation and mapping applications in large outdoor environments. PMID:23337332

  18. Autonomous robots for hazardous and unstructured environments

    SciTech Connect

    Hamel, W.R.; Babcock, S.M.; Hall, M.G.; Jorgenson, C.C.; Killough, S.M.; Weisbin, C.R.

    1986-01-01

    This paper reports continuing research in the areas of navigation and manipulation in unstructured environments, which is being carried out at the Center for Engineering Systems Advanced Research (CESAR) at the Oak Ridge National Laboratory (ORNL). The HERMIES-II mobile robot, a low-cost prototype of a series that will include many of the major features required for remote operations in hazardous environments, is discussed. Progress toward development of a high-performance research manipulator is presented, and application of an advanced parallel computer to mobile robot problems, which is under way, is discussed.

  19. Methods and Apparatus for Autonomous Robotic Control

    NASA Technical Reports Server (NTRS)

    Versace, Massimiliano (Inventor); Gorshechnikov, Anatoly (Inventor); Livitz, Gennady (Inventor); Palma, Jesse (Inventor)

    2017-01-01

    Sensory processing of visual, auditory, and other sensor information (e.g., visual imagery, LIDAR, RADAR) is conventionally based on "stovepiped," or isolated processing, with little interactions between modules. Biological systems, on the other hand, fuse multi-sensory information to identify nearby objects of interest more quickly, more efficiently, and with higher signal-to-noise ratios. Similarly, examples of the OpenSense technology disclosed herein use neurally inspired processing to identify and locate objects in a robot's environment. This enables the robot to navigate its environment more quickly and with lower computational and power requirements.

  20. Application of a Chaotic Oscillator in an Autonomous Mobile Robot

    NASA Astrophysics Data System (ADS)

    Tlelo-Cuautle, Esteban; Ramos-López, Hugo C.; Sánchez-Sánchez, Mauro; Pano-Azucena, Ana D.; Sánchez-Gaspariano, Luis A.; Núñez-Pérez, José C.; Camas-Anzueto, Jorge L.

    2014-05-01

    Terrain-exploration robots can be of great use in critical navigation circumstances. However, the challenge is how to guarantee a control scheme that covers the full terrain area. Accordingly, the application of a chaotic oscillator to control the wheels of an autonomous mobile robot is introduced herein. Basically, we describe the realization of a random number generator (RNG) based on a double-scroll chaotic oscillator, which is used to guide the robot to cover the full terrain area. The resolution of the terrain-exploration area is determined by both the number of bits provided by the RNG and the characteristics of the step motors. Finally, the experimental results highlight the covered area by painting the trajectories that the robot explores.
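
    The idea of letting a chaotic bit source steer the wheels so that the trajectory eventually covers the area can be sketched with a logistic map standing in for the double-scroll oscillator; the grid size, seed, and move encoding below are hypothetical:

    ```python
    def logistic_bits(n, x=0.123456, r=3.99):
        # Chaotic logistic map as a stand-in bit source for the
        # double-scroll oscillator used in the paper.
        bits = []
        for _ in range(n):
            x = r * x * (1.0 - x)
            bits.append(1 if x > 0.5 else 0)
        return bits

    def explore(steps, size=8):
        # Two chaotic bits per step select one of four wheel commands.
        bits = logistic_bits(2 * steps)
        moves = {0b00: (1, 0), 0b01: (-1, 0), 0b10: (0, 1), 0b11: (0, -1)}
        x = y = size // 2
        visited = {(x, y)}
        for i in range(steps):
            dx, dy = moves[bits[2 * i] << 1 | bits[2 * i + 1]]
            x = max(0, min(size - 1, x + dx))   # clamp at terrain border
            y = max(0, min(size - 1, y + dy))
            visited.add((x, y))
        return visited

    cells = explore(2000)
    ```

    The visited set plays the role of the painted trajectory in the paper's experiments; finer grids (more RNG bits per move, finer step-motor increments) raise the coverage resolution.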

  1. Rice-obot 1: An intelligent autonomous mobile robot

    NASA Technical Reports Server (NTRS)

    Defigueiredo, R.; Ciscon, L.; Berberian, D.

    1989-01-01

    The Rice-obot I is the first in a series of Intelligent Autonomous Mobile Robots (IAMRs) being developed at Rice University's Cooperative Intelligent Mobile Robots (CIMR) lab. The Rice-obot I is mainly designed to be a testbed for various robotic and AI techniques, and a platform for developing intelligent control systems for exploratory robots. Researchers present the need for a generalized environment capable of combining all of the control, sensory and knowledge systems of an IAMR. They introduce Lisp-Nodes as such a system, and develop the basic concepts of nodes, messages and classes. Furthermore, they show how the control system of the Rice-obot I is implemented as sub-systems in Lisp-Nodes.

  2. Mobile autonomous robotic apparatus for radiologic characterization

    DOEpatents

    Dudar, A.M.; Ward, C.R.; Jones, J.D.; Mallet, W.R.; Harpring, L.J.; Collins, M.X.; Anderson, E.K.

    1999-08-10

    A mobile robotic system is described that conducts radiological surveys to map alpha, beta, and gamma radiation on surfaces in relatively level open areas or areas containing obstacles such as stored containers or hallways, equipment, walls and support columns. The invention incorporates improved radiation monitoring methods using multiple scintillation detectors, the use of laser scanners for maneuvering in open areas, ultrasound pulse generators and receptors for collision avoidance in limited space areas or hallways, methods to trigger visible alarms when radiation is detected, and methods to transmit location data for real-time reporting and mapping of radiation locations on computer monitors at a host station. A multitude of high performance scintillation detectors detect radiation while the on-board system controls the direction and speed of the robot along pre-programmed paths. The operators may revise the preselected movements of the robotic system by ethernet communications to remonitor areas of radiation or to avoid walls, columns, equipment, or containers. The robotic system is capable of floor survey speeds of from 1/2-inch per second up to about 30 inches per second, while the on-board processor collects, stores, and transmits information for real-time mapping of radiation intensity and the locations of the radiation for real-time display on computer monitors at a central command console. 4 figs.

  3. Mobile autonomous robotic apparatus for radiologic characterization

    DOEpatents

    Dudar, Aed M.; Ward, Clyde R.; Jones, Joel D.; Mallet, William R.; Harpring, Larry J.; Collins, Montenius X.; Anderson, Erin K.

    1999-01-01

    A mobile robotic system that conducts radiological surveys to map alpha, beta, and gamma radiation on surfaces in relatively level open areas or areas containing obstacles such as stored containers or hallways, equipment, walls and support columns. The invention incorporates improved radiation monitoring methods using multiple scintillation detectors, the use of laser scanners for maneuvering in open areas, ultrasound pulse generators and receptors for collision avoidance in limited space areas or hallways, methods to trigger visible alarms when radiation is detected, and methods to transmit location data for real-time reporting and mapping of radiation locations on computer monitors at a host station. A multitude of high performance scintillation detectors detect radiation while the on-board system controls the direction and speed of the robot along pre-programmed paths. The operators may revise the preselected movements of the robotic system by ethernet communications to remonitor areas of radiation or to avoid walls, columns, equipment, or containers. The robotic system is capable of floor survey speeds of from 1/2-inch per second up to about 30 inches per second, while the on-board processor collects, stores, and transmits information for real-time mapping of radiation intensity and the locations of the radiation for real-time display on computer monitors at a central command console.

  4. Towards Autonomous Operations of the Robonaut 2 Humanoid Robotic Testbed

    NASA Technical Reports Server (NTRS)

    Badger, Julia; Nguyen, Vienny; Mehling, Joshua; Hambuchen, Kimberly; Diftler, Myron; Luna, Ryan; Baker, William; Joyce, Charles

    2016-01-01

    The Robonaut project has been conducting research in robotics technology on board the International Space Station (ISS) since 2012. Recently, the original upper body humanoid robot was upgraded by the addition of two climbing manipulators ("legs"), more capable processors, and new sensors, as shown in Figure 1. While Robonaut 2 (R2) has been working through checkout exercises on orbit following the upgrade, technology development on the ground has continued to advance. Through the Active Reduced Gravity Offload System (ARGOS), the Robonaut team has been able to develop technologies that will enable full operation of the robotic testbed on orbit using similar robots located at the Johnson Space Center. Once these technologies have been vetted in this way, they will be implemented and tested on the R2 unit on board the ISS. The goal of this work is to create a fully-featured robotics research platform on board the ISS to increase the technology readiness level of technologies that will aid in future exploration missions. Technology development has thus far followed two main paths, autonomous climbing and efficient tool manipulation. Central to both technologies has been the incorporation of a human robotic interaction paradigm that involves the visualization of sensory and pre-planned command data with models of the robot and its environment. Figure 2 shows screenshots of these interactive tools, built in rviz, that are used to develop and implement these technologies on R2. Robonaut 2 is designed to move along the handrails and seat track around the US lab inside the ISS. This is difficult for many reasons, namely the environment is cluttered and constrained, the robot has many degrees of freedom (DOF) it can utilize for climbing, and remote commanding for precision tasks such as grasping handrails is time-consuming and difficult. Because of this, it is important to develop the technologies needed to allow the robot to reach operator-specified positions as

  5. Autonomous stair-climbing with miniature jumping robots.

    PubMed

    Stoeter, Sascha A; Papanikolopoulos, Nikolaos

    2005-04-01

    The problem of vision-guided control of miniature mobile robots is investigated. Untethered mobile robots with small physical dimensions of around 10 cm or less do not permit powerful onboard computers because of size and power constraints. These challenges have, in the past, reduced the functionality of such devices to that of a complex remote control vehicle with fancy sensors. With the help of a computationally more powerful entity such as a larger companion robot, the control loop can be closed. Using the miniature robot's video transmission or that of an observer to localize it in the world, control commands can be computed and relayed to the inept robot. The result is a system that exhibits autonomous capabilities. The framework presented here solves the problem of climbing stairs with the miniature Scout robot. The robot's unique locomotion mode, the jump, is employed to hop one step at a time. Methods for externally tracking the Scout are developed. A large number of real-world experiments are conducted and the results discussed.

  6. Automatic Welding System Using Speed Controllable Autonomous Mobile Robot

    NASA Astrophysics Data System (ADS)

    Kim, Taewon; Suto, Takeshi; Kobayashi, Junya; Kim, Jongcheol; Suga, Yasuo

    A prototype of an autonomous mobile robot with two vision sensors for automatic welding of steel plates was constructed. The robot can move straight, steer, and turn around the robot center by controlling the driving speeds of the two wheels respectively. At the tip of the movable arm, two CCD cameras are fixed. A local camera observes the welding line near the welding torch and another, wide camera observes a relatively wide area in front of the welding part. The robot controls the traveling speed in accordance with the shape of the welding line. In the case of a straight welding line, the speed of the robot is accelerated and the welding efficiency is improved. However, if the robot finds a corner of the welding line, the speed is decelerated in order to realize precise seam tracking and stable welding. Therefore, the robot can realize precise and high-speed seam tracking by controlling the travel speed. The effectiveness of the control system is confirmed by welding experiments.
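
    The speed law described (accelerate on straight seams, decelerate at corners) can be sketched as a threshold on the local direction change of the welding line; the angle threshold and speed values below are illustrative, not the paper's parameters:

    ```python
    import math

    def seam_angle(p0, p1, p2):
        # Direction change of the welding line at p1, in degrees,
        # from three consecutive points detected by the local camera.
        a1 = math.atan2(p1[1] - p0[1], p1[0] - p0[0])
        a2 = math.atan2(p2[1] - p1[1], p2[0] - p1[0])
        d = abs(a2 - a1)
        return math.degrees(min(d, 2 * math.pi - d))

    def travel_speed(angle_deg, v_straight=10.0, v_corner=2.0, limit=15.0):
        # Accelerate on straight seams, slow down near corners.
        return v_straight if angle_deg < limit else v_corner

    straight = travel_speed(seam_angle((0, 0), (1, 0), (2, 0)))
    corner = travel_speed(seam_angle((0, 0), (1, 0), (1, 1)))
    ```

    In the actual system the wide camera would supply the look-ahead points, so the deceleration begins before the torch reaches the corner.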

  7. Autonomous Robot System for Sensor Characterization

    SciTech Connect

    David Bruemmer; Douglas Few; Frank Carney; Miles Walton; Heather Hunting; Ron Lujan

    2004-03-01

    This paper discusses an innovative application of new Markov localization techniques that combat the problem of odometry drift, allowing a novel control architecture developed at the Idaho National Engineering and Environmental Laboratory (INEEL) to be utilized within a sensor characterization facility developed at the Remote Sensing Laboratory (RSL) in Nevada. The new robotic capability provided by the INEEL will allow RSL to test and evaluate a wide variety of sensors including radiation detection systems, machine vision systems, and sensors that can detect and track heat sources (e.g. human bodies, machines, chemical plumes). By accurately moving a target at varying speeds along designated paths, the robotic solution allows the detection abilities of a wide variety of sensors to be recorded and analyzed.

  8. Autonomous Robotic Following Using Vision Based Techniques

    DTIC Science & Technology

    2005-02-03

    different methods for the soldier’s control of the vehicle are being investigated. One such method is the Leader-Follower approach. In the Field So...what is the current state of the art for leader-follower applications? One of the leaders in this field is the RF ATD (Robotic Follower Advanced...these systems have in common? Both of these platforms are representative of the state-of-the-art of current leader-follower technology being tested by

  9. A task control architecture for autonomous robots

    NASA Technical Reports Server (NTRS)

    Simmons, Reid; Mitchell, Tom

    1990-01-01

    An architecture is presented for controlling robots that have multiple tasks, operate in dynamic domains, and require a fair degree of autonomy. The architecture is built on several layers of functionality, including a distributed communication layer, a behavior layer for querying sensors, expanding goals, and executing commands, and a task level for managing the temporal aspects of planning and achieving goals, coordinating tasks, allocating resources, monitoring, and recovering from errors. Application to a legged planetary rover and an indoor mobile manipulator is described.

  10. Autonomous Military Robotics: Risk, Ethics, and Design

    DTIC Science & Technology

    2008-12-20

    avenge the deaths of their brothers in arms—unlawful actions that carry a significant political cost. Indeed, robots may act as objective...unblinking observers on the battlefield, reporting any unethical behavior back to command; their mere presence as such would discourage all-too-human... act in compliance with the LOW and ROE (though this may not be as straightforward and simply as it first appears) or act ethically in the specific

  11. Concurrent algorithms for autonomous robot navigation in an unexplored terrain

    SciTech Connect

    Rao, S.V.N.; Iyengar, S.S.; Jorgensen, C.C.; Weisbin, C.R.

    1986-01-01

    Navigation planning is one of the vital aspects of any autonomous mobile robot. In this paper, we present concurrent algorithms for an autonomous robot navigation system that does not require a pre-learned obstacle terrain model. The terrain model is gradually built by integrating the information from multiple journeys. The available information is used to the maximum extent in navigation planning, and global optimality is gradually achieved. It is shown that these concurrent algorithms are free from deadlocks and starvation. The performance of the concurrent algorithms is analyzed in terms of the planning time, travel time, scanning time, and update time. A modified adjacency list is proposed as the data structure for the spatial graph that represents an obstacle terrain. The time complexities of various algorithms that access, maintain, and update the spatial graph are estimated, and the effectiveness of the implementation is illustrated.
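
    The spatial-graph bookkeeping described above might look like the following adjacency-list sketch. The abstract does not specify the paper's "modified adjacency list", so this is a plain illustrative version showing how information from successive journeys is merged:

```python
class SpatialGraph:
    """Adjacency-list model of an obstacle terrain, built up over journeys."""

    def __init__(self):
        self.vertices = {}  # vertex id -> (x, y) position
        self.adj = {}       # vertex id -> set of adjacent vertex ids

    def add_vertex(self, vid, pos):
        self.vertices[vid] = pos
        self.adj.setdefault(vid, set())

    def add_edge(self, u, v):
        # undirected obstacle edge
        self.adj[u].add(v)
        self.adj[v].add(u)

    def integrate_scan(self, scan_vertices, scan_edges):
        """Merge the vertices and edges seen on one journey into the model."""
        for vid, pos in scan_vertices:
            if vid not in self.vertices:
                self.add_vertex(vid, pos)
        for u, v in scan_edges:
            self.add_edge(u, v)
```

    Two partial scans that share a vertex are stitched into one growing terrain model, which is the incremental behavior the abstract describes.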

  12. Robotic reactions: delay-induced patterns in autonomous vehicle systems.

    PubMed

    Orosz, Gábor; Moehlis, Jeff; Bullo, Francesco

    2010-02-01

    Fundamental design principles are presented for vehicle systems governed by autonomous cruise control devices. By analyzing the corresponding delay differential equations, it is shown that for any car-following model short-wavelength oscillations can appear due to robotic reaction times, and that there are tradeoffs between the time delay and the control gains. The analytical findings are demonstrated on an optimal velocity model using numerical continuation and numerical simulation.
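
    The delay-induced instability discussed above can be reproduced with a scalar caricature of delayed cruise control, v'(t) = k(v_des - v(t - tau)), which is stable only while k*tau < pi/2. The model and parameters are an illustration, not the paper's optimal velocity model:

```python
def simulate_delayed_control(k, tau, v0=0.0, v_des=1.0, dt=0.01, t_end=30.0):
    """Forward-Euler integration of v'(t) = k * (v_des - v(t - tau))."""
    n_delay = int(round(tau / dt))
    history = [v0] * (n_delay + 1)  # history[0] holds v(t - tau)
    v = v0
    trajectory = []
    for _ in range(int(t_end / dt)):
        v += dt * k * (v_des - history[0])
        history.append(v)
        history.pop(0)
        trajectory.append(v)
    return trajectory
```

    With k = 1 and tau = 0.5 (k*tau below pi/2) the speed settles at v_des, while tau = 2.0 puts k*tau past the stability boundary and the response oscillates with growing amplitude, mirroring the delay/gain tradeoff in the abstract.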

  13. Robotic reactions: Delay-induced patterns in autonomous vehicle systems

    NASA Astrophysics Data System (ADS)

    Orosz, Gábor; Moehlis, Jeff; Bullo, Francesco

    2010-02-01

    Fundamental design principles are presented for vehicle systems governed by autonomous cruise control devices. By analyzing the corresponding delay differential equations, it is shown that for any car-following model short-wavelength oscillations can appear due to robotic reaction times, and that there are tradeoffs between the time delay and the control gains. The analytical findings are demonstrated on an optimal velocity model using numerical continuation and numerical simulation.

  14. A fuzzy logic controller for an autonomous mobile robot

    NASA Technical Reports Server (NTRS)

    Yen, John; Pfluger, Nathan

    1993-01-01

    The ability of a mobile robot system to plan and move intelligently in a dynamic system is needed if robots are to be useful in areas other than controlled environments. An example of a use for this system is to control an autonomous mobile robot in a space station, or other isolated area where it is hard or impossible for human life to exist for long periods of time (e.g., Mars). The system would allow the robot to be programmed to carry out the duties normally accomplished by a human being. Some of the duties that could be accomplished include operating instruments, transporting objects, and maintenance of the environment. The main focus of our early work has been on developing a fuzzy controller that takes a path and adapts it to a given environment. The robot only uses information gathered from the sensors, but retains the ability to avoid dynamically placed obstacles near and along the path. Our fuzzy logic controller is based on the following algorithm: (1) determine the desired direction of travel; (2) determine the allowed direction of travel; and (3) combine the desired and allowed directions in order to determine a direction that is both desired and allowed. The desired direction of travel is determined by projecting ahead to a point along the path that is closer to the goal. This gives a local direction of travel for the robot and helps to avoid obstacles.
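
    Step (3) of the algorithm above, combining the desired and allowed directions, is commonly realized as a fuzzy AND (minimum) over candidate headings followed by picking the heading with the best combined degree. The membership values below are invented for illustration:

```python
def combine_directions(desired, allowed):
    """Fuzzy-AND (min) the two membership functions, then take the argmax."""
    headings = set(desired) | set(allowed)
    combined = {h: min(desired.get(h, 0.0), allowed.get(h, 0.0)) for h in headings}
    return max(combined, key=combined.get)

# Headings in degrees: 0 deg points at the goal but is blocked by an obstacle,
# so the 45 deg heading wins despite being less desired.
desired = {0: 1.0, 45: 0.7, 90: 0.3}
allowed = {0: 0.1, 45: 0.9, 90: 1.0}
```

    Here `combine_directions(desired, allowed)` returns 45: the heading that best balances progress toward the goal against obstacle clearance.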

  16. Visual landmark recognition for autonomous robot navigation

    NASA Astrophysics Data System (ADS)

    Cicerone, M.; Stella, Ettore; Caponetti, Laura; Distante, Arcangelo

    1997-09-01

    Self-location is the capability of a mobile robot to determine its position in the environment by referring to absolute landmarks. The ability to use natural visual landmarks for self-location increases the autonomy and flexibility of mobile vehicles. In this paper, the use of junctions detected in real images as landmarks is proposed. Using visual cues means that problems of perspective and scale variation must be resolved. We propose to formulate junction recognition as a graph matching problem and to resolve it using standard methods. Experimental results on real scenes are shown.
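
    For the handful of junctions a landmark contains, the graph matching step can even be done by exhaustive permutation. The toy exact-isomorphism check below illustrates the formulation under that assumption; real landmark matching would need an error-tolerant variant that copes with missed or spurious junctions:

```python
from itertools import permutations

def is_isomorphic(n, edges_a, edges_b):
    """Exact graph isomorphism over n nodes by trying every relabelling.

    edges_a, edges_b: iterables of (u, v) node-index pairs. Feasible only
    for the few nodes of a junction landmark (O(n!) permutations).
    """
    target = {frozenset(e) for e in edges_b}
    for perm in permutations(range(n)):
        mapped = {frozenset((perm[u], perm[v])) for u, v in edges_a}
        if mapped == target:
            return True
    return False
```

    A triangle matches any relabelled triangle but not a two-edge chain, so differently oriented views of the same junction pattern are recognized as one landmark.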

  17. Autonomous robotic platforms for locating radio sources buried under rubble

    NASA Astrophysics Data System (ADS)

    Tasu, A. S.; Anchidin, L.; Tamas, R.; Paun, M.; Danisor, A.; Petrescu, T.

    2016-12-01

    This paper deals with the use of autonomous robotic platforms able to locate radio signal sources, such as mobile phones, buried under buildings collapsed as a result of earthquakes, natural disasters, terrorism, or war. The technique relies on averaging position data resulting from a propagation model implemented on the platform with the data acquired by robotic platforms at the disaster site, which allows the approximate position of radio sources buried under the rubble to be calculated. Based on the measurements, a radio map of the disaster site is built; this is very useful for locating victims and for guiding rubble-lifting machinery. By assuming that there is a victim next to each mobile device detected by the robotic platform and knowing its approximate position, the lifting machinery does not risk further hurting the victims. Moreover, knowing the positions of the victims decreases the reaction time and markedly increases the chances of survival for victims buried under the rubble.
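
    The averaging of model-derived and measured position data can be sketched as an inverse-variance weighted mean of individual position fixes. The weighting scheme is an assumption for illustration; the abstract only states that the two data sources are averaged:

```python
def fuse_position_estimates(fixes):
    """Fuse (x, y, variance) position fixes for one buried transmitter.

    Each fix may come from the propagation model or from an on-site
    measurement; lower-variance fixes get proportionally more weight.
    """
    wx = wy = wsum = 0.0
    for x, y, var in fixes:
        w = 1.0 / var
        wx += w * x
        wy += w * y
        wsum += w
    return wx / wsum, wy / wsum
```

    Two equal-quality fixes simply average, while a noisier fix pulls the fused estimate less, which is the behavior one wants when combining a coarse propagation model with on-site measurements.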

  18. Autonomous Robotic Refueling System (ARRS) for rapid aircraft turnaround

    NASA Astrophysics Data System (ADS)

    Williams, O. R.; Jackson, E.; Rueb, K.; Thompson, B.; Powell, K.

    An autonomous robotic refuelling system is being developed to achieve rapid aircraft turnaround, notably during combat operations. The proposed system includes a gantry positioner with sufficient reach to position a robotic arm that performs the refuelling tasks; a six degree of freedom manipulator equipped with a remote center of compliance, torque sensor, and a gripper that can handle standard tools; a computer vision system to locate and guide the refuelling nozzle, inspect the nozzle, and avoid collisions; and an operator interface with video and graphics display. The control system software will include components designed for trajectory planning and generation, collision detection, sensor interfacing, sensory processing, and human interfacing. The robotic system will be designed so that upgrading to perform additional tasks will be relatively straightforward.

  19. Distributed multisensor blackboard system for an autonomous robot

    NASA Astrophysics Data System (ADS)

    Kappey, Dietmar; Pokrandt, Peter; Schloen, Jan

    1994-10-01

    Sensor data enable a robotic system to react to events occurring in its environment. Much work has been done on the development of various sensors and algorithms to extract information from an environment; on the other hand, little work has been done in the field of multisensor communication. This paper presents a shared-memory-based communication protocol developed for the autonomous robot system KAMRO, which consists of two PUMA 260 manipulators and an omnidirectionally driven mobile platform. The proposed approach is based on logical sensors, which can be used to dynamically build hierarchical sensor units. The protocol uses a distributed blackboard structure for the transmission of sensor data and commands. To support asynchronous coupling of robots and sensors, it not only transfers single sensor values but also offers functions to estimate future values.

  20. A functional system architecture for fully autonomous robot

    NASA Astrophysics Data System (ADS)

    Kalaycioglu, S.

    The Mobile Servicing System (MSS) Autonomous Robotics Program intends to define and plan the development of the technologies required to provide a supervised autonomous operation capability for the Special Purpose Dexterous Manipulator (SPDM) on the MSS. The operational functions for the SPDM to perform the required tasks, in both fully autonomous and supervised modes, are identified. Functional decomposition is performed using a graphics-oriented methodology called the Structured Analysis and Design Technique. This process defines the functional architecture of the system, the types of data required to support its functionality, and the control processes that need to be put in place. On the basis of the functional decomposition, a technology breakdown structure is also developed, from which a preliminary estimate of the status and maturity of each relevant technology is made. The developed functional hierarchy is found to be very effective for a robotic system with any level of autonomy; moreover, it can easily be applied to an existing low-level autonomous system and can provide a smooth transition towards a higher degree of autonomy. This hierarchy will also play a significant role both in the system design and in the development of the control hierarchy.

  1. Active objects programming for military autonomous mobile robots software prototyping

    NASA Astrophysics Data System (ADS)

    Cozien, Roger F.

    2001-09-01

    While designing mobile robots, we think that the prototyping phase is really critical, and good, clever choices have to be made. Indeed, we may not easily upgrade such robots; most of all, once the robot is on its own, any change to either the software or the physical body is going to be very difficult, if not impossible. Thus, a great effort has to be made when prototyping the robot. Furthermore, the kind of programming matters: if the programming model is not expressive enough, it may be very difficult to add all the features needed to give the robot reactiveness and decision-making autonomy. Moreover, designing and prototyping the on-board software of a reactive robot brings other difficulties. Reactivity is not a matter of speed: a reactive system is a system able to respond to a huge panel of situations for which it has no schedule. In other words, the robot does not know when a particular situation may occur, what it will be doing at that time, or what its internal state will be. Such a robot must be able to take decisions and act even without all the contextual information. To do so, we use a computer language named oRis featuring object and active-object oriented programming, as well as parallel and dynamic code (the code can be changed during its own execution). This last point is possible because oRis is fully interpreted; however, oRis may call fully compiled code, as well as Prolog and Java code. An oRis program may be distributed over several computers using TCP/IP network connections. The main issue in this paper is to show how active-object oriented programming, as a modern extension of object-oriented programming, may help us in designing autonomous mobile robots. Based on fully parallel software programming, active-object code allows us to give many features to a robot, and to easily solve

  2. Active object programming for military autonomous mobile robot software prototyping

    NASA Astrophysics Data System (ADS)

    Cozien, Roger F.

    2001-10-01

    While designing mobile robots, we think that the prototyping phase is really critical, and good, clever choices have to be made. Indeed, we may not easily upgrade such robots; most of all, once the robot is on its own, any change to either the software or the physical body is going to be very difficult, if not impossible. Thus, a great effort has to be made when prototyping the robot. Furthermore, the kind of programming matters: if the programming model is not expressive enough, it may be very difficult to add all the features needed to give the robot reactiveness and decision-making autonomy. Moreover, designing and prototyping the on-board software of a reactive robot brings other difficulties. Reactivity is not a matter of speed: a reactive system is a system able to respond to a huge panel of situations for which it has no schedule. In other words, the robot does not know when a particular situation may occur, what it will be doing at that time, or what its internal state will be. Such a robot must be able to take decisions and act even without all the contextual information. To do so, we use a computer language named oRis featuring object and active-object oriented programming, as well as parallel and dynamic code (the code can be changed during its own execution). This last point is possible because oRis is fully interpreted; however, oRis may call fully compiled code, as well as Prolog and Java code. An oRis program may be distributed over several computers using TCP/IP network connections. The main issue in this paper is to show how active-object oriented programming, as a modern extension of object-oriented programming, may help us in designing autonomous mobile robots. Based on fully parallel software programming, active-object code allows us to give many features to a robot, and to easily solve

  3. Using Dynamic Capability Evaluation to Organize a Team of Cooperative, Autonomous Robots

    DTIC Science & Technology

    2003-01-01

    Using Dynamic Capability Evaluation to Organize a Team of Cooperative, Autonomous Robots Eric Matson Scott DeLoach Multi-agent and Cooperative...00-00-2003 4. TITLE AND SUBTITLE Using Dynamic Capability Evaluation to Organize a Team of Cooperative, Autonomous Robots 5a. CONTRACT NUMBER 5b...company structure. Agent: Equivalent to autonomous robots in this instance. Agents coordinate through the organization via conversations and act

  4. Evolving self-assembly in autonomous homogeneous robots: experiments with two physical robots.

    PubMed

    Ampatzis, Christos; Tuci, Elio; Trianni, Vito; Christensen, Anders Lyhne; Dorigo, Marco

    2009-01-01

    This research work illustrates an approach to the design of controllers for self-assembling robots in which the self-assembly is initiated and regulated by perceptual cues that are brought forth by the physical robots through their dynamical interactions. More specifically, we present a homogeneous control system that can achieve assembly between two modules (two fully autonomous robots) of a mobile self-reconfigurable system without a priori introduced behavioral or morphological heterogeneities. The controllers are dynamic neural networks evolved in simulation that directly control all the actuators of the two robots. The neurocontrollers cause the dynamic specialization of the robots by allocating roles between them based solely on their interaction. We show that the best evolved controller proves to be successful when tested on a real hardware platform, the swarm-bot. The performance achieved is similar to the one achieved by existing modular or behavior-based approaches, also due to the effect of an emergent recovery mechanism that was neither explicitly rewarded by the fitness function, nor observed during the evolutionary simulation. Our results suggest that direct access to the orientations or intentions of the other agents is not a necessary condition for robot coordination: Our robots coordinate without direct or explicit communication, contrary to what is assumed by most research works in collective robotics. This work also contributes to strengthening the evidence that evolutionary robotics is a design methodology that can tackle real-world tasks demanding fine sensory-motor coordination.

  5. Sensing and modelling concepts for autonomous and remote mobile robot control

    NASA Astrophysics Data System (ADS)

    Yeung, S. K.; McMath, W. S.; Necsulescu, D.; Petriu, E. M.

    1993-01-01

    Sensing functions and modeling concepts for autonomous and remote mobile robot control in an unstructured environment are discussed. Sensing methods for robot position recovery, object recognition, and tactile teleoperator feedback are presented.

  6. Acquisition of Autonomous Behaviors by Robotic Assistants

    NASA Technical Reports Server (NTRS)

    Peters, R. A., II; Sarkar, N.; Bodenheimer, R. E.; Brown, E.; Campbell, C.; Hambuchen, K.; Johnson, C.; Koku, A. B.; Nilas, P.; Peng, J.

    2005-01-01

    Our research achievements under the NASA-JSC grant contributed significantly in the following areas. Multi-agent-based robot control architecture, the Intelligent Machine Architecture (IMA): the Vanderbilt team received a Space Act Award for this research from NASA JSC in October 2004. Cognitive control and the Self Agent: cognitive control in humans is the ability to consciously manipulate thoughts and behaviors, using attention to deal with conflicting goals and demands; we have been updating the IMA Self Agent towards this goal, and if the opportunity arises we would like to work with NASA to empower Robonaut to perform cognitive control. Applications: (1) SES for Robonaut; (2) the Robonaut Fault Diagnostic System; (3) ISAC behavior generation and learning; (4) Segway research.

  7. A software architecture for autonomous orbital robotics

    NASA Astrophysics Data System (ADS)

    Henshaw, Carl G.; Akins, Keith; Creamer, N. Glenn; Faria, Matthew; Flagg, Cris; Hayden, Matthew; Healy, Liam; Hrolenok, Brian; Johnson, Jeffrey; Lyons, Kimberly; Pipitone, Frank; Tasker, Fred

    2006-05-01

    SUMO, the Spacecraft for the Universal Modification of Orbits, is a DARPA-sponsored spacecraft designed to provide orbital repositioning services to geosynchronous satellites. Such services may be needed to facilitate changing the geostationary slot of a satellite, to allow a satellite to be used until the propellant is expended instead of reserving propellant for a retirement burn, or to rescue a satellite stranded in geosynchronous transfer orbit due to a launch failure. Notably, SUMO is being designed to be compatible with the current geosynchronous satellite catalog, which implies that it does not require the customer spacecraft to have special docking fixtures, optical guides, or cooperative communications or pose sensors. In addition, the final approach and grapple will be performed autonomously. SUMO is being designed and built by the Naval Center for Space Technology, a division of the U.S. Naval Research Laboratory in Washington, DC. The nature of the SUMO concept mission leads to significant challenges in onboard spacecraft autonomy. Also, because research and development in machine vision, trajectory planning, and automation algorithms for SUMO is being pursued in parallel with flight software development, there are considerable challenges in prototyping and testing algorithms in situ and in transitioning these algorithms from laboratory form into software suitable for flight. This paper discusses these challenges, outlining the current SUMO design from the standpoint of flight algorithms and software. In particular, the design of the SUMO phase 1 laboratory demonstration software is described in detail. The proposed flight-like software architecture is also described.

  8. Context recognition and situation assessment in autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Yavnai, Arie

    1993-05-01

    The capability to recognize the operating context and assess the situation in real time is needed if a highly functional autonomous mobile robot is to react properly and effectively to continuously changing situations and events, either external or internal, while performing its assigned tasks. A new approach and architecture for a context recognition and situation assessment module (CORSA) is presented in this paper. CORSA is a multi-level information processing module consisting of adaptive decision and classification algorithms. It performs a dynamic mapping from the data space to the context space and dynamically decides on the context class. A learning mechanism is employed to update the decision variables so as to minimize the probability of misclassification. CORSA is embedded within the Mission Manager module of the intelligent autonomous hyper-controller (IAHC) of the mobile robot. The information regarding operating context, events, and situation is then communicated to other modules of the IAHC, where it is used to: (a) select the appropriate action strategy; (b) support the processes of arbitration and conflict resolution between reflexive behaviors and reasoning-driven behaviors; (c) predict future events and situations; and (d) determine criteria and priorities for planning, replanning, and decision making.

  9. Knowledge/geometry-based Mobile Autonomous Robot Simulator (KMARS)

    NASA Technical Reports Server (NTRS)

    Cheng, Linfu; Mckendrick, John D.; Liu, Jeffrey

    1990-01-01

    Ongoing applied research is focused on developing guidance systems for robot vehicles. Problems facing the basic research needed to support this development (e.g., scene understanding, real-time vision processing) are major impediments to progress. Due to the complexity and unpredictable nature of a vehicle's area of operation, more advanced vehicle control systems must be able to learn about obstacles within the range of their sensors. A better understanding of the basic exploration process is needed to provide critical support to developers of both sensor systems and intelligent control systems that can be used in a wide spectrum of autonomous vehicles. Elcee Computek, Inc. has been working under contract to the Flight Dynamics Laboratory, Wright Research and Development Center, Wright-Patterson AFB, Ohio, to develop a Knowledge/Geometry-based Mobile Autonomous Robot Simulator (KMARS). KMARS has two parts: a geometry base and a knowledge base. The knowledge-base part of the system employs the expert-system shell CLIPS ('C' Language Integrated Production System) and the rules necessary to control both the vehicle's use of an obstacle-detecting sensor and the overall exploration process. The initial project phase has focused on the simulation of a point robot vehicle operating in a 2D environment.

  10. Low-cost semi-autonomous manipulation technique for explosive ordnance disposal robots

    NASA Astrophysics Data System (ADS)

    Czop, Andrew; Del Signore, Michael J.; Hacker, Kurt

    2008-04-01

    Robotic manipulators used on current EOD robotic platforms exhibit very few autonomous capabilities. This lack of autonomy forces the operator to completely control manipulator movements; with the increasing complexity of robotic manipulators, this can prove to be a very complex and tedious task. Autonomous capabilities for platform navigation are currently being extensively researched and applied to EOD robots. While autonomous manipulation has also been researched, this technology has yet to appear in fielded EOD robotic systems; as a result, there is a need for the exploration and development of manipulator automation within the scope of EOD robotics. In addition, due to the expendable nature of EOD robotic assets, this technology must add little to the overall cost of the robotic system. To directly address the need for a low-cost, semi-autonomous manipulation capability for EOD robots, the Naval Explosive Ordnance Disposal Technology Division (NAVEODTECHDIV) proposes the Autonomous Robotic Manipulator (ARM). The ARM incorporates several semi-autonomous manipulation behaviors, including point-and-click movement, user-defined distance movement, user-defined angle positioning, memory locations to save and recall manipulator positions, and macros to memorize and repeat multi-position repetitive manipulator movements. These semi-autonomous behaviors will decrease an EOD operator's time on target by reducing the manipulation workload in a user-friendly fashion. This conference paper details the background of the project, the design of the prototype, algorithm development, implementation, results, and future work.
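
    The memory-location and macro behaviours listed above could be organised along these lines. The class, method names, and joint-vector representation are illustrative, not the NAVEODTECHDIV design:

```python
class ManipulatorMacros:
    """Save named joint configurations and replay recorded waypoint sequences."""

    def __init__(self):
        self.positions = {}   # name -> saved joint vector
        self.macros = {}      # name -> recorded list of joint vectors
        self._recording = None

    def save_position(self, name, joints):
        self.positions[name] = list(joints)

    def start_macro(self, name):
        self._recording = name
        self.macros[name] = []

    def record_waypoint(self, joints):
        self.macros[self._recording].append(list(joints))

    def stop_macro(self):
        self._recording = None

    def play_macro(self, name, move_fn):
        # move_fn stands in for the platform's move-to-joint-configuration command
        for joints in self.macros[name]:
            move_fn(joints)
```

    Recording a repetitive sweep once and replaying it with `play_macro` is exactly the workload reduction the abstract targets.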

  11. Experiments in teleoperator and autonomous control of space robotic vehicles

    NASA Technical Reports Server (NTRS)

    Alexander, Harold L.

    1991-01-01

    A program of research embracing teleoperator and automatic navigational control of freely flying satellite robots is presented. Current research goals include: (1) developing visual operator interfaces for improved vehicle teleoperation; (2) determining the effects of different visual interface system designs on operator performance; and (3) achieving autonomous vision-based vehicle navigation and control. This research program combines virtual-environment teleoperation studies and neutral-buoyancy experiments using a space-robot simulator vehicle currently under development. Visual-interface design options under investigation include monoscopic versus stereoscopic displays and cameras, helmet-mounted versus panel-mounted display monitors, head-tracking versus fixed or manually steerable remote cameras, and the provision of vehicle-fixed visual cues, or markers, in the remote scene for improved sensing of vehicle position, orientation, and motion.

  12. Dynamic map building for an autonomous mobile robot

    SciTech Connect

    Leonard, J.J.; Durrant-Whyte, H.F.; Cox, I.J.

    1992-08-01

    This article presents an algorithm for autonomous map building and maintenance for a mobile robot. The authors believe that mobile robot navigation can be treated as a problem of tracking geometric features that occur naturally in the environment. They represent each feature in the map by a location estimate (the feature state vector) and two distinct measures of uncertainty: a covariance matrix to represent uncertainty in feature location, and a credibility measure to represent their belief in the validity of the feature. During each position update cycle, predicted measurements are generated for each geometric feature in the map and compared with actual sensor observations. Successful matches cause a feature's credibility to be increased. Unpredicted observations are used to initialize new geometric features, while unobserved predictions result in a geometric feature's credibility being decreased. They also describe experimental results obtained with the algorithm that demonstrate successful map building using real sonar data.
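
    One position-update cycle of the scheme described above can be sketched in one dimension, with a scalar feature location standing in for the feature state vector and a single credibility number per feature. The thresholds and step sizes are invented for illustration:

```python
def update_map(features, observations, match_tol=0.5, step=0.2):
    """Matched predictions raise credibility, unmatched predictions lower it,
    and unexplained observations initialize new features.

    features: list of {"pos": float, "cred": float}; observations: floats.
    """
    unmatched = list(observations)
    for f in features:
        # the predicted measurement is just the stored location in this sketch
        z = next((o for o in unmatched if abs(o - f["pos"]) < match_tol), None)
        if z is not None:
            f["cred"] = min(1.0, f["cred"] + step)  # successful match
            unmatched.remove(z)
        else:
            f["cred"] = max(0.0, f["cred"] - step)  # unobserved prediction
    for z in unmatched:
        features.append({"pos": z, "cred": step})   # unpredicted observation
    return features
```

    Repeated confirmation drives a feature's credibility toward 1, while a spurious sonar return that is never re-observed decays back toward 0 and can be pruned.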

  13. Emotion understanding from the perspective of autonomous robots research.

    PubMed

    Cañamero, Lola

    2005-05-01

    In this paper, I discuss some of the contributions that modeling emotions in autonomous robots can make towards understanding human emotions ('as sited in the brain' and as used in our interactions with the environment) and emotions in general. Such contributions are linked, on the one hand, to the potential use of such robotic models as tools and 'virtual laboratories' to test and systematically explore theories and models of human emotions, and on the other hand to a modeling approach that fosters conceptual clarification and operationalization of the relevant aspects of theoretical notions and models. As illustrated by an overview of recent advances in the field, this area is still in its infancy. However, the work carried out already shows that we share many conceptual problems and interests with other disciplines in the affective sciences, and that sound progress necessitates multidisciplinary efforts.

  14. SAURON: The Wallace Observatory Small AUtonomous Robotic Optical Nightwatcher

    NASA Astrophysics Data System (ADS)

    Kosiarek, M.; Mansfield, M.; Brothers, T.; Bates, H.; Aviles, R.; Brode-Roger, O.; Person, M.; Russel, M.

    2017-07-01

    The Small AUtonomous Robotic Optical Nightwatcher (SAURON) is an autonomous telescope consisting of an 11-inch Celestron Nexstar telescope on a Software Bisque Paramount ME II in a Technical Innovations ProDome located at the MIT George R. Wallace, Jr. Astrophysical Observatory. This paper describes the construction of the telescope system and its first-light data on T-And0-15785, an eclipsing binary star. The out-of-eclipse R magnitude of T-And0-15785 was found to be 13.3258 ± 0.0015, and the magnitude changes for the primary and secondary eclipses were found to be 0.7145 ± 0.0515 and 0.6085 ± 0.0165 R magnitudes, respectively.

  15. On autonomous terrain model acquisition by a mobile robot

    SciTech Connect

    Rao, N.S.V.; Iyengar, S.S.; Weisbin, C.R.

    1987-01-20

    The following problem is considered: A point robot is placed in a terrain populated by an unknown number of polyhedral obstacles of varied sizes and locations in two/three dimensions. The robot is equipped with a sensor capable of detecting all the obstacle vertices and edges that are visible from the present location of the robot. The robot is required to autonomously navigate and build the complete terrain model using the sensor information. It is established that the necessary number of scanning operations needed for complete terrain model acquisition by any algorithm based on the 'scan from vertices' strategy is Σ_{i=1}^{n} N(O_i) - n and Σ_{i=1}^{n} N(O_i) - 2n in two- and three-dimensional terrains respectively, where O = {O_1, O_2, ..., O_n} is the set of obstacles in the terrain, and N(O_i) is the number of vertices of the obstacle O_i.

  16. Classifying and recovering from sensing failures in autonomous mobile robots

    SciTech Connect

    Murphy, R.R.; Hershberger, D.

    1996-12-31

    This paper presents a characterization of sensing failures in autonomous mobile robots, a methodology for classification and recovery, and a demonstration of this approach on a mobile robot performing landmark navigation. A sensing failure is any event leading to defective perception, including sensor malfunctions, software errors, environmental changes, and errant expectations. The approach demonstrated in this paper exploits the ability of the robot to interact with its environment to acquire additional information for classification (i.e., active perception). A Generate and Test strategy is used to generate hypotheses to explain the symptom resulting from the sensing failure. The recovery scheme replaces the affected sensing processes with an alternative logical sensor. The approach is implemented as the Sensor Fusion Effects Exception Handling (SFX-EH) architecture. The advantages of SFX-EH are that it requires only a partial causal model of sensing failure, the control scheme strives for a fast response, tests are constructed so as to prevent confounding from collaborating sensors which have also failed, and the logical sensor organization allows SFX-EH to be interfaced with the behavioral level of existing robot architectures.
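The Generate and Test loop described in this abstract can be illustrated with a toy example. All hypothesis names, tests, and sensor names below are invented for the sketch; the SFX-EH architecture itself is considerably richer (partial causal models, confounding checks across sensors).

```python
# Illustrative generate-and-test recovery: candidate hypotheses that
# could explain a symptom are tried in turn, and a confirmed hypothesis
# selects an alternative logical sensor as the recovery action.

def classify_and_recover(symptom, hypotheses, active_sensor, alternatives):
    """hypotheses: list of (name, test_fn) where test_fn(symptom) -> bool.
    alternatives: dict hypothesis name -> replacement logical sensor."""
    for name, test in hypotheses:          # Generate and Test
        if test(symptom):
            # recovery: swap in an alternative logical sensor if one exists
            return name, alternatives.get(name, active_sensor)
    return 'unknown', active_sensor        # no hypothesis confirmed
```

Active perception enters through the test functions, which in the real system may command the robot to move or re-sense in order to gather the evidence needed to confirm or reject a hypothesis.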

  17. Performance of visual and ultrasound sensing by an autonomous robot

    SciTech Connect

    Beckerman, M.; Barnett, D.L.

    1991-01-01

    This paper presents results of an experimental study of the reliability of an autonomous mobile robot operating in an unstructured environment. Examined in the study are the principal components of the visual and ultrasound sensor systems used to guide navigation and manipulation tasks of the robot. Performance criteria are established with respect to the requirements of the integrated robotic system. Repeated measurements are done of the geometric and spatial quantities used for docking the robot at a mock-up control panel, and for locating control panel devices to be manipulated. The systematic and random components of the errors in the measured quantities are exhibited, their origins are identified, and means for their reduction are developed. We focus on refinements of visual area data using ultrasound range data, and on extraction of yaw by visual and by ultrasound methods. Monte Carlo methods are used to study the sensor fusion, and angle-dependence considerations are used to characterize the precision of the yaw measurements. Issues relating to sensor models and sensor fusion, viewed as essential strategic components of intelligent systems, are then discussed. 32 refs., 13 figs., 5 tabs.

  18. On autonomous terrain model acquisition by a mobile robot

    NASA Technical Reports Server (NTRS)

    Rao, N. S. V.; Iyengar, S. S.; Weisbin, C. R.

    1987-01-01

    The following problem is considered: A point robot is placed in a terrain populated by an unknown number of polyhedral obstacles of varied sizes and locations in two/three dimensions. The robot is equipped with a sensor capable of detecting all the obstacle vertices and edges that are visible from the present location of the robot. The robot is required to autonomously navigate and build the complete terrain model using the sensor information. It is established that the necessary number of scanning operations needed for complete terrain model acquisition by any algorithm that is based on the scan from vertices strategy is given by Σ_{i=1}^{n} N(O_i) - n and Σ_{i=1}^{n} N(O_i) - 2n in two- and three-dimensional terrains respectively, where O = {O_1, O_2, ..., O_n} is the set of obstacles in the terrain, and N(O_i) is the number of vertices of the obstacle O_i.
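The counting bound above is simple to evaluate once the terrain is known. As an illustrative sketch (not from the paper), given only the vertex counts N(O_i) of each obstacle:

```python
# Necessary number of 'scan from vertices' operations for complete
# terrain model acquisition: sum_i N(O_i) - n in 2-D terrains and
# sum_i N(O_i) - 2n in 3-D terrains, for n obstacles.

def required_scans(vertex_counts, dims=2):
    """vertex_counts: list of N(O_i), one entry per obstacle."""
    n = len(vertex_counts)
    total = sum(vertex_counts)
    return total - n if dims == 2 else total - 2 * n
```

For example, a 2-D terrain with a triangle, a square, and a pentagon (n = 3, ΣN(O_i) = 12) requires at least 9 scanning operations.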

  19. Autonomous navigation and speech in the mobile robot of MAIA

    NASA Astrophysics Data System (ADS)

    Caprile, Bruno; Lazzari, Gianni; Stringa, Luigi

    1993-05-01

    MAIA (acronym for Modello Avanzato di Intelligenza Artificiale) is a project aimed at the integration of several AI resources presently being developed at IRST. The overall approach to the design of intelligent artificial systems that MAIA proposes is experimental no less than theoretical, and an experimental setup (which we call the experimental platform) has consequently been defined in which a variety of mutually interacting functionalities can be tested in a common framework. The experimental platform of MAIA consists of 'brains' and 'tentacles', and it is to one such tentacle, the Mobile Robot, that the present paper is devoted. At present, the mobile robot is equipped with two main modules: the Navigation Module, which gives the robot the capability of moving autonomously in an indoor environment, and the Speech Module, which allows the robot to communicate with humans by voice. Here the overall architecture of the system is described in detail and the potential arising from the combined use of speech and navigation is considered.

  20. Using Robotic Operating System (ROS) to control autonomous observatories

    NASA Astrophysics Data System (ADS)

    Vilardell, Francesc; Artigues, Gabriel; Sanz, Josep; García-Piquer, Álvaro; Colomé, Josep; Ribas, Ignasi

    2016-07-01

    Astronomical observatories are complex systems requiring the integration of numerous devices into a common platform. We present here the first steps to integrate the popular Robotic Operating System (ROS) into the control of a fully autonomous observatory. The observatory is also equipped with a decision-making procedure that can automatically react to a changing environment (such as weather events). The results obtained so far show that using ROS greatly simplifies the automation of a small observatory and, combined with our decision-making algorithms, makes it robust.

  1. Autonomous, teleoperated, and shared control of robot systems

    SciTech Connect

    Anderson, R.J.

    1994-12-31

    This paper illustrates how different modes of operation such as bilateral teleoperation, autonomous control, and shared control can be described and implemented using combinations of modules in the SMART robot control architecture. Telerobotics modes are characterized by different "grids" of SMART icons, where each icon represents a portion of run-time code that implements a passive control law. By placing strict requirements on the module's input-output behavior and using scattering theory to develop a passive sampling technique, a flexible, expandable telerobot architecture is achieved. An automatic code generation tool for generating SMART systems is also described.

  2. An Economical Framework for Verification of Swarm-Based Algorithms Using Small, Autonomous Robots

    DTIC Science & Technology

    2006-09-01

    NAWCWD TP 8630: An Economical Framework for Verification of Swarm-Based Algorithms Using Small, Autonomous Robots, by James Bobinchak, Eric Ford, Rodney Heil, and Duane Schwartzwald.

  3. Recognition of traversable areas for mobile robotic navigation in outdoor environments.

    SciTech Connect

    Hutchinson, Scott Alan; Davidson, James C.

    2003-06-01

    In this paper we consider the problem of automatically determining whether regions in an outdoor environment can be traversed by a mobile robot. We propose a two-level classifier that uses data from a single color image to make this determination. At the low level, we have implemented three classifiers based on color histograms, directional filters and local binary patterns. The outputs of these low level classifiers are combined using a voting scheme that weights the results of each classifier using an estimate of its error probability. We present results from a large number of trials using a database of representative images acquired in real outdoor environments.

  4. Design of a Micro-Autonomous Robot for Use in Astronomical Instruments

    NASA Astrophysics Data System (ADS)

    Cochrane, W. A.; Luo, X.; Lim, T.; Taylor, W. D.; Schnetler, H.

    2012-07-01

    A Micro-Autonomous Positioning System (MAPS) has been developed using micro-autonomous robots for the deployment of small mirrors within multi-object astronomical instruments for use on the next generation of ground-based telescopes. The micro-autonomous robot is a two-wheel differential drive robot with a footprint of approximately 20 × 20 mm. The robot uses two brushless DC Smoovy motors with 125:1 planetary gearheads for positioning the mirror. This article describes the various elements of the overall system and, in more detail, the various robot designs. Also described is the build and test of the most promising design, proving that micro-autonomous robot technology can be used in precision-controlled applications.

  5. Laser-Based Pedestrian Tracking in Outdoor Environments by Multiple Mobile Robots

    PubMed Central

    Ozaki, Masataka; Kakimuma, Kei; Hashimoto, Masafumi; Takahashi, Kazuhiko

    2012-01-01

    This paper presents an outdoor laser-based pedestrian tracking system using a group of mobile robots located near each other. Each robot detects pedestrians from its own laser scan image using an occupancy-grid-based method, and tracks the detected pedestrians via Kalman filtering and global-nearest-neighbor (GNN) data association. The tracking data are broadcast to the other robots through intercommunication and combined using the covariance intersection (CI) method. For pedestrian tracking, each robot identifies its own posture using real-time-kinematic GPS (RTK-GPS) and laser scan matching. With our cooperative tracking method, all the robots share their tracking data; hence, an individual robot can recognize pedestrians that are invisible to itself but visible to another robot. The simulation and experimental results show that cooperative tracking provides better tracking performance than conventional individual tracking. Our tracking system functions in a decentralized manner without any central server, and therefore provides a degree of scalability and robustness that cannot be achieved by conventional centralized architectures. PMID:23202171
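The covariance intersection step used in this system can be illustrated in the scalar case. This is a minimal sketch under stated assumptions: 1-D position estimates with variances rather than full covariance matrices, and a user-chosen weight omega (in practice omega is usually optimized, e.g. to minimize the fused covariance determinant).

```python
# Scalar covariance intersection: fuses two estimates whose
# cross-correlation is unknown, by a convex combination of their
# information (inverse-variance) forms.

def covariance_intersection(x1, p1, x2, p2, omega=0.5):
    """(x1, p1), (x2, p2): estimates and variances; omega in (0, 1)."""
    p = 1.0 / (omega / p1 + (1.0 - omega) / p2)         # fused variance
    x = p * (omega * x1 / p1 + (1.0 - omega) * x2 / p2)  # fused estimate
    return x, p
```

Note the deliberately conservative behavior: fusing two equally uncertain estimates with omega = 0.5 averages the means but does not shrink the variance, which is what makes CI safe when the two robots' track errors may be correlated.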

  6. Laser-based pedestrian tracking in outdoor environments by multiple mobile robots.

    PubMed

    Ozaki, Masataka; Kakimuma, Kei; Hashimoto, Masafumi; Takahashi, Kazuhiko

    2012-10-29

    This paper presents an outdoor laser-based pedestrian tracking system using a group of mobile robots located near each other. Each robot detects pedestrians from its own laser scan image using an occupancy-grid-based method, and tracks the detected pedestrians via Kalman filtering and global-nearest-neighbor (GNN) data association. The tracking data are broadcast to the other robots through intercommunication and combined using the covariance intersection (CI) method. For pedestrian tracking, each robot identifies its own posture using real-time-kinematic GPS (RTK-GPS) and laser scan matching. With our cooperative tracking method, all the robots share their tracking data; hence, an individual robot can recognize pedestrians that are invisible to itself but visible to another robot. The simulation and experimental results show that cooperative tracking provides better tracking performance than conventional individual tracking. Our tracking system functions in a decentralized manner without any central server, and therefore provides a degree of scalability and robustness that cannot be achieved by conventional centralized architectures.

  7. Using Insect Electroantennogram Sensors on Autonomous Robots for Olfactory Searches

    PubMed Central

    Martinez, Dominique; Arhidi, Lotfi; Demondion, Elodie; Masson, Jean-Baptiste; Lucas, Philippe

    2014-01-01

    Robots designed to track chemical leaks in hazardous industrial facilities or explosive traces in landmine fields face the same problem as insects foraging for food or searching for mates: the olfactory search is constrained by the physics of turbulent transport. The concentration landscape of wind borne odors is discontinuous and consists of sporadically located patches. A pre-requisite to olfactory search is that intermittent odor patches are detected. Because of its high speed and sensitivity, the olfactory organ of insects provides a unique opportunity for detection. Insect antennae have been used in the past to detect not only sex pheromones but also chemicals that are relevant to humans, e.g., volatile compounds emanating from cancer cells or toxic and illicit substances. We describe here a protocol for using insect antennae on autonomous robots and present a proof of concept for tracking odor plumes to their source. The global response of olfactory neurons is recorded in situ in the form of electroantennograms (EAGs). Our experimental design, based on a whole insect preparation, allows stable recordings within a working day. In comparison, EAGs on excised antennae have a lifetime of 2 hr. A custom hardware/software interface was developed between the EAG electrodes and a robot. The measurement system resolves individual odor patches up to 10 Hz, which exceeds the time scale of artificial chemical sensors. The efficiency of EAG sensors for olfactory searches is further demonstrated in driving the robot toward a source of pheromone. By using identical olfactory stimuli and sensors as in real animals, our robotic platform provides a direct means for testing biological hypotheses about olfactory coding and search strategies. It may also prove beneficial for detecting other odorants of interest by combining EAGs from different insect species in a bioelectronic nose configuration or using nanostructured gas sensors that mimic insect antennae.

  8. Using insect electroantennogram sensors on autonomous robots for olfactory searches.

    PubMed

    Martinez, Dominique; Arhidi, Lotfi; Demondion, Elodie; Masson, Jean-Baptiste; Lucas, Philippe

    2014-08-04

    Robots designed to track chemical leaks in hazardous industrial facilities or explosive traces in landmine fields face the same problem as insects foraging for food or searching for mates: the olfactory search is constrained by the physics of turbulent transport. The concentration landscape of wind borne odors is discontinuous and consists of sporadically located patches. A pre-requisite to olfactory search is that intermittent odor patches are detected. Because of its high speed and sensitivity, the olfactory organ of insects provides a unique opportunity for detection. Insect antennae have been used in the past to detect not only sex pheromones but also chemicals that are relevant to humans, e.g., volatile compounds emanating from cancer cells or toxic and illicit substances. We describe here a protocol for using insect antennae on autonomous robots and present a proof of concept for tracking odor plumes to their source. The global response of olfactory neurons is recorded in situ in the form of electroantennograms (EAGs). Our experimental design, based on a whole insect preparation, allows stable recordings within a working day. In comparison, EAGs on excised antennae have a lifetime of 2 hr. A custom hardware/software interface was developed between the EAG electrodes and a robot. The measurement system resolves individual odor patches up to 10 Hz, which exceeds the time scale of artificial chemical sensors. The efficiency of EAG sensors for olfactory searches is further demonstrated in driving the robot toward a source of pheromone. By using identical olfactory stimuli and sensors as in real animals, our robotic platform provides a direct means for testing biological hypotheses about olfactory coding and search strategies. It may also prove beneficial for detecting other odorants of interests by combining EAGs from different insect species in a bioelectronic nose configuration or using nanostructured gas sensors that mimic insect antennae.

  9. Multi sensor fusion framework for indoor-outdoor localization of limited resource mobile robots.

    PubMed

    Marín, Leonardo; Vallés, Marina; Soriano, Ángel; Valera, Ángel; Albertos, Pedro

    2013-10-21

    This paper presents a sensor fusion framework that improves the localization of mobile robots with limited computational resources. It employs an event based Kalman Filter to combine the measurements of a global sensor and an inertial measurement unit (IMU) on an event based schedule, using fewer resources (execution time and bandwidth) but with similar performance when compared to the traditional methods. The event is defined to reflect the necessity of the global information: it fires when the estimation error covariance exceeds a predefined limit. The proposed experimental platforms are based on the LEGO Mindstorm NXT, and consist of a differential wheel mobile robot navigating indoors with a zenithal camera as global sensor, and an Ackermann steering mobile robot navigating outdoors with a SBG Systems GPS accessed through an IGEP board that also serves as datalogger. The IMU in both robots is built using the NXT motor encoders along with one gyroscope, one compass and two accelerometers from Hitecnic, placed according to a particle based dynamic model of the robots. The tests performed reflect the correct performance and low execution time of the proposed framework. Robustness and stability are observed during a long walk test in both indoor and outdoor environments.
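A minimal 1-D illustration of the event-based idea in this framework: the robot dead-reckons on IMU increments, and the global sensor is consulted only when the estimation error variance crosses the limit. Noise values, the limit, and the single-state model are invented for the sketch; the paper's filter fuses a full robot model.

```python
# Event-based 1-D Kalman filter: predict with IMU every step, correct
# with the global sensor only when the variance p exceeds p_limit.

def run_filter(imu_steps, global_fix, q=0.04, r=0.01, p_limit=0.1):
    """imu_steps: odometry increments; global_fix: global sensor reading.
    q, r: process and measurement noise variances."""
    x, p, global_updates = 0.0, 0.0, 0
    for dx in imu_steps:
        x, p = x + dx, p + q            # predict with IMU; uncertainty grows
        if p > p_limit:                 # event: covariance exceeds the limit
            k = p / (p + r)             # standard Kalman gain
            x, p = x + k * (global_fix - x), (1.0 - k) * p
            global_updates += 1
    return x, p, global_updates
```

The resource saving is visible in the update counter: the global sensor is queried only on events, not on every cycle, which is exactly the bandwidth argument made above.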

  10. Multi Sensor Fusion Framework for Indoor-Outdoor Localization of Limited Resource Mobile Robots

    PubMed Central

    Marín, Leonardo; Vallés, Marina; Soriano, Ángel; Valera, Ángel; Albertos, Pedro

    2013-01-01

    This paper presents a sensor fusion framework that improves the localization of mobile robots with limited computational resources. It employs an event based Kalman Filter to combine the measurements of a global sensor and an inertial measurement unit (IMU) on an event based schedule, using fewer resources (execution time and bandwidth) but with similar performance when compared to the traditional methods. The event is defined to reflect the necessity of the global information: it fires when the estimation error covariance exceeds a predefined limit. The proposed experimental platforms are based on the LEGO Mindstorm NXT, and consist of a differential wheel mobile robot navigating indoors with a zenithal camera as global sensor, and an Ackermann steering mobile robot navigating outdoors with a SBG Systems GPS accessed through an IGEP board that also serves as datalogger. The IMU in both robots is built using the NXT motor encoders along with one gyroscope, one compass and two accelerometers from Hitecnic, placed according to a particle based dynamic model of the robots. The tests performed reflect the correct performance and low execution time of the proposed framework. Robustness and stability are observed during a long walk test in both indoor and outdoor environments. PMID:24152933

  11. Thermal Imaging for Robotic Applications in Outdoor Scenes

    DTIC Science & Technology

    1990-04-01

    CMU-RI-TR-90-08, The Robotics Institute, Carnegie Mellon University; distribution unlimited. Excerpt: "... HgCdTe with a silicon CCD circuit (IRCCD PV HgCdTe) that works in the 2-12 µm band. This technique currently allows to build detectors 64 x 64 in ..."

  12. Multi-polarimetric textural distinctiveness for outdoor robotic saliency detection

    NASA Astrophysics Data System (ADS)

    Haider, S. A.; Scharfenberger, C.; Kazemzadeh, F.; Wong, A.; Clausi, D. A.

    2015-01-01

    Mobile robots that rely on vision for navigation and object detection use saliency approaches to identify a set of potential candidates to recognize. The state of the art in saliency detection for mobile robotics often relies upon visible light imaging with conventional camera setups to distinguish an object against its surroundings based on factors such as feature compactness, heterogeneity and/or homogeneity. We demonstrate a novel multi-polarimetric saliency detection approach which uses multiple measured polarization states of a scene. We leverage the light-material interaction known as Fresnel reflection to extract rotationally invariant multi-polarimetric textural representations, which are then used to train a high-dimensional sparse texture model. The multi-polarimetric textural distinctiveness is characterized using a conditional probability framework based on the sparse texture model, which is then used to determine the saliency at each pixel of the scene. We observed that, by including additional polarization states in the saliency analysis, we were able to compute noticeably improved saliency maps in scenes where objects are difficult to distinguish from their background due to color intensity similarities between the object and its surroundings.

  13. Semi-autonomous Simulated Brain Tumor Ablation with RavenII Surgical Robot using Behavior Tree.

    PubMed

    Hu, Danying; Gong, Yuanzheng; Hannaford, Blake; Seibel, Eric J

    2015-05-01

    Medical robots have been widely used to assist surgeons in carrying out dexterous surgical tasks in various ways. Most of the tasks require the surgeon's operation, directly or indirectly. A certain level of autonomy in robotic surgery could not only free the surgeon from some tedious repetitive tasks, but also exploit the advantages of the robot: high dexterity and accuracy. This paper presents a semi-autonomous neurosurgical procedure for brain tumor ablation using the RAVEN Surgical Robot and stereo visual feedback. By integration with the behavior tree framework, the whole surgical task is modeled flexibly and intelligently as the nodes and leaves of a behavior tree. This paper makes three main contributions: (1) describing brain tumor ablation as an ideal candidate for autonomous robotic surgery, (2) modeling and implementing the semi-autonomous surgical task using the behavior tree framework, and (3) designing an experimental simulated ablation task for feasibility study and robot performance analysis.
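A behavior tree of the kind used to model the surgical task can be sketched with two node types. This is a toy illustration: the node classes and the step names ('locate_tumor', etc.) are invented for the example and are not from the RAVEN implementation.

```python
# Minimal behavior tree: Leaf nodes wrap actions, Sequence nodes run
# children in order and stop at the first failure (standard BT
# sequence semantics).

class Leaf:
    def __init__(self, name, action):
        self.name, self.action = name, action

    def tick(self, log):
        ok = self.action()               # run the wrapped sub-task
        log.append((self.name, ok))
        return ok

class Sequence:
    def __init__(self, *children):
        self.children = children

    def tick(self, log):
        # all() short-circuits, so a failing child halts the sequence
        return all(child.tick(log) for child in self.children)
```

Modeling the procedure this way keeps each sub-step independently testable and makes the abort logic explicit: if, say, tool positioning fails, the ablation leaf is never ticked.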

  14. Semi-autonomous Simulated Brain Tumor Ablation with RavenII Surgical Robot using Behavior Tree

    PubMed Central

    Hu, Danying; Gong, Yuanzheng; Hannaford, Blake; Seibel, Eric J.

    2015-01-01

    Medical robots have been widely used to assist surgeons in carrying out dexterous surgical tasks in various ways. Most of the tasks require the surgeon's operation, directly or indirectly. A certain level of autonomy in robotic surgery could not only free the surgeon from some tedious repetitive tasks, but also exploit the advantages of the robot: high dexterity and accuracy. This paper presents a semi-autonomous neurosurgical procedure for brain tumor ablation using the RAVEN Surgical Robot and stereo visual feedback. By integration with the behavior tree framework, the whole surgical task is modeled flexibly and intelligently as the nodes and leaves of a behavior tree. This paper makes three main contributions: (1) describing brain tumor ablation as an ideal candidate for autonomous robotic surgery, (2) modeling and implementing the semi-autonomous surgical task using the behavior tree framework, and (3) designing an experimental simulated ablation task for feasibility study and robot performance analysis. PMID:26405563

  15. Smart Fluid Systems: The Advent of Autonomous Liquid Robotics

    PubMed Central

    Chiolerio, A; Quadrelli, Marco B

    2017-01-01

    Organic, inorganic or hybrid devices in the liquid state, kept in a fixed volume by surface tension or by a confining membrane that protects them from a harsh environment, could be used as biologically inspired autonomous robotic systems with unique capabilities. They could change shape according to a specific exogenous command or by means of a fully integrated adaptive system, and provide an innovative solution for many future applications, such as space exploration in extreme or otherwise challenging environments, post‐disaster search and rescue in ground applications, compliant wearable devices, and even in the medical field for in vivo applications. This perspective provides an initial assessment of existing capabilities that could be leveraged to pursue the topic of “Smart Fluid Systems” or “Liquid Engineered Systems”. PMID:28725530

  16. The Busot Observatory: towards a robotic autonomous telescope

    NASA Astrophysics Data System (ADS)

    García-Lozano, R.; Rodes, J. J.; Torrejón, J. M.; Bernabéu, G.; Berná, J. Á.

    2016-12-01

    We describe the Busot observatory, our project for a fully robotic autonomous telescope. This astronomical observatory, which obtained the Minor Planet Centre code MPC-J02 in 2009, includes a 14-inch MEADE LX200GPS telescope, a 2 m dome, an SBIG ST8-XME CCD camera with an AO-8 adaptive optics system, and a filter wheel equipped with the UBVRI system. We are also implementing an SGS ST-8 spectrograph for the telescope. Currently, we are involved in long-term studies of variable sources such as X-ray binary systems and variable stars. In this work we also present the discovery of W UMa systems and their orbital periods derived from the photometric light curves obtained at the Busot Observatory.

  17. Smart Fluid Systems: The Advent of Autonomous Liquid Robotics.

    PubMed

    Chiolerio, A; Quadrelli, Marco B

    2017-07-01

    Organic, inorganic or hybrid devices in the liquid state, kept in a fixed volume by surface tension or by a confining membrane that protects them from a harsh environment, could be used as biologically inspired autonomous robotic systems with unique capabilities. They could change shape according to a specific exogenous command or by means of a fully integrated adaptive system, and provide an innovative solution for many future applications, such as space exploration in extreme or otherwise challenging environments, post-disaster search and rescue in ground applications, compliant wearable devices, and even in the medical field for in vivo applications. This perspective provides an initial assessment of existing capabilities that could be leveraged to pursue the topic of "Smart Fluid Systems" or "Liquid Engineered Systems".

  18. Systematical development of an autonomous HPF driven and controlled inspection robot

    SciTech Connect

    Niewels, J.; Jorden, W.

    1994-12-31

    Autonomous service robots currently represent one of the technically most demanding robot systems. The paper describes the development of such a system for underwater internal pipe inspection. Starting off from a brief look at current trends in robot design, the authors approach the problem by taking a close look at the internal structure of autonomous robots. They then concentrate on a systematically modularized approach to designing the hardware unit of an inspection system. Test facilities employed in the process of system optimization, such as a six-axis force/torque sensor and a smart skin, are described as well. Finally, the paper presents first results gained from the design study.

  19. Mechanical deployment system on aries an autonomous mobile robot

    SciTech Connect

    Rocheleau, D.N.

    1995-12-01

    ARIES (Autonomous Robotic Inspection Experimental System) is under development for the Department of Energy (DOE) to survey and inspect drums containing low-level radioactive waste stored in warehouses at DOE facilities. This paper focuses on the mechanical deployment system, referred to as the camera positioning system (CPS), used in the project. The CPS positions four identical but separate camera packages consisting of vision cameras and other required sensors such as bar-code readers and light stripe projectors. The CPS is attached to the top of a mobile robot and consists of two mechanisms. The first is a lift mechanism composed of 5 interlocking rail-elements which starts from a retracted position and extends upward to simultaneously position 3 separate camera packages to inspect the top three drums of a column of four drums. The second is a parallelogram four-bar mechanism (a special Grashof case) used for positioning a camera package on drums on the floor. Both mechanisms are the subject of this paper, with the lift mechanism discussed in detail.

  20. Versatile 360-deg panoramic optical system for autonomous robots

    NASA Astrophysics Data System (ADS)

    Barton, George G.; Feldman, Sidney; Beckstead, Jeffrey A.; Nordhauser, Sidney R.

    1999-01-01

    Autonomous mobile robots require wide-angle vision for navigation and threat detection and analysis, best served by full panoramic vision. The panoramic optical element is a unique, inexpensive first-surface reflective aspheric convex cone. This cone can be sized and configured for any vertical FOV desired. The cone acts as a negative optical element generating a panoramic virtual image. When this virtual image is viewed through a standard camera lens, it produces at the lens's focal plane a panoramic toroidal image with a translational linearity of > 99 percent. One of three image transducers can be used to convert the toroidal panoramic image to a video signal: raster-scanned CCDs, radially scanned Vidicons, and linear CCD arrays on a mechanically rotated stage each have their own particular advantages. Field object distances can be determined in two ways. If the robot is moving, the range can be calculated from the size change of a field object versus the distance traversed in a specific time interval. By vertically displacing the panoramic camera by several inches, a quasi-binocular system is created and the range determined by simple math. Ranging thus produces the third dimension.
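The first ranging method above reduces to similar triangles: apparent image size scales as 1/range, so if an object's image grows from s1 to s2 while the robot advances a known distance d toward it, the initial range is Z1 = d / (1 - s1/s2). The sketch below is an illustration of that geometry, not code from the paper.

```python
# Range from size change under known forward motion:
# s ∝ 1/Z  =>  s1/s2 = (Z1 - d)/Z1  =>  Z1 = d / (1 - s1/s2).

def range_from_size_change(s1, s2, d):
    """s1, s2: apparent sizes before/after moving distance d toward
    the object (s2 > s1). Returns the initial range Z1."""
    return d / (1.0 - s1 / s2)
```

For instance, an object whose image grows by 25% while the robot advances 2 m was initially 10 m away; the vertical-displacement method is the same computation with the baseline supplied by the camera offset instead of the robot's motion.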

  1. Computer vision system for an autonomous mobile robot

    NASA Astrophysics Data System (ADS)

    Liao, Xiaoqun; Cao, Jin; Cao, Ming; Samu, Tayib; Hall, Ernest L.

    2000-10-01

The purpose of this paper is to compare three methods for 3-D measurement of line position used for vision guidance to navigate an autonomous mobile robot. A model mapping 3-D ground points into image points is first developed using homogeneous coordinates. Then, using the ground plane constraint, the inverse transformation that maps image points into 3-D ground points is determined. The system identification problem is then solved using a calibration device: calibration data are used to determine the model parameters by minimizing the mean square error between model and calibration points. A novel simplification is then presented which provides surprisingly accurate results. This method is called the magic matrix approach and uses only the calibration data. A more standard variation of this approach is also considered. The significance of this work is that it shows that three methods based on 3-D measurements may be used for mobile robot navigation and that a simple method can achieve accuracy to a fraction of an inch, which is sufficient in some applications.
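Under the ground-plane constraint, the forward map from ground to image is a 3x3 homography in homogeneous coordinates, and the inverse map is its matrix inverse applied projectively. A minimal round-trip sketch; the matrix entries are hypothetical, not the paper's calibrated parameters:

```python
def apply_homography(H, x, y):
    """Map point (x, y) through 3x3 matrix H in homogeneous coordinates."""
    u = H[0][0]*x + H[0][1]*y + H[0][2]
    v = H[1][0]*x + H[1][1]*y + H[1][2]
    w = H[2][0]*x + H[2][1]*y + H[2][2]
    return u / w, v / w

def invert_3x3(m):
    """Inverse via the adjugate; valid when det(m) != 0."""
    (a, b, c), (d, e, f), (g, h, i) = m
    A, B, C = e*i - f*h, f*g - d*i, d*h - e*g   # cofactors, row 0
    D, E, F = c*h - b*i, a*i - c*g, b*g - a*h   # cofactors, row 1
    G, H_, I = b*f - c*e, c*d - a*f, a*e - b*d  # cofactors, row 2
    det = a*A + b*B + c*C
    return [[A/det, D/det, G/det],
            [B/det, E/det, H_/det],
            [C/det, F/det, I/det]]

# Hypothetical ground-to-image homography.
H = [[1.2, 0.1, 30.0],
     [0.0, 1.5, 40.0],
     [0.001, 0.002, 1.0]]
u, v = apply_homography(H, 2.0, 3.0)          # ground -> image
x, y = apply_homography(invert_3x3(H), u, v)  # image -> ground (round trip)
```

The round trip recovers the original ground point exactly (up to floating-point error), which is the property the ground-plane constraint buys: one image point determines one ground point.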

  2. Assessment of a visually guided autonomous exploration robot

    NASA Astrophysics Data System (ADS)

    Harris, C.; Evans, R.; Tidey, E.

    2008-10-01

    A system has been developed to enable a robot vehicle to autonomously explore and map an indoor environment using only visual sensors. The vehicle is equipped with a single camera, whose output is wirelessly transmitted to an off-board standard PC for processing. Visual features within the camera imagery are extracted and tracked, and their 3D positions are calculated using a Structure from Motion algorithm. As the vehicle travels, obstacles in its surroundings are identified and a map of the explored region is generated. This paper discusses suitable criteria for assessing the performance of the system by computer-based simulation and practical experiments with a real vehicle. Performance measures identified include the positional accuracy of the 3D map and the vehicle's location, the efficiency and completeness of the exploration and the system reliability. Selected results are presented and the effect of key system parameters and algorithms on performance is assessed. This work was funded by the Systems Engineering for Autonomous Systems (SEAS) Defence Technology Centre established by the UK Ministry of Defence.

  3. Interaction dynamics of multiple autonomous mobile robots in bounded spatial domains

    NASA Technical Reports Server (NTRS)

    Wang, P. K. C.

    1989-01-01

    A general navigation strategy for multiple autonomous robots in a bounded domain is developed analytically. Each robot is modeled as a spherical particle (i.e., an effective spatial domain about the center of mass); its interactions with other robots or with obstacles and domain boundaries are described in terms of the classical many-body problem; and a collision-avoidance strategy is derived and combined with homing, robot-robot, and robot-obstacle collision-avoidance strategies. Results from homing simulations involving (1) a single robot in a circular domain, (2) two robots in a circular domain, and (3) one robot in a domain with an obstacle are presented in graphs and briefly characterized.
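The many-body formulation can be caricatured as summed pairwise forces: a homing attraction toward a goal plus a short-range repulsion from neighbors and obstacles inside an effective radius. A minimal 2-D sketch under that loose reading; the force laws, gains, and radii here are illustrative stand-ins, not the paper's derivation:

```python
import math

def repulsion(p, q, safe_dist, gain=1.0):
    """Force on a robot at p pushing it away from a neighbor or obstacle
    at q; zero outside the effective interaction radius `safe_dist`."""
    dx, dy = p[0] - q[0], p[1] - q[1]
    d = math.hypot(dx, dy)
    if d == 0.0 or d >= safe_dist:
        return 0.0, 0.0
    mag = gain * (1.0/d - 1.0/safe_dist) / (d*d)
    return mag*dx, mag*dy

def homing(p, goal, gain=0.5):
    """Attractive force pulling a robot at p toward its goal position."""
    return gain*(goal[0] - p[0]), gain*(goal[1] - p[1])

# Robot between a neighbor (to its right) and its goal (to its left):
# both terms push it in the -x direction.
fx = (repulsion((0.0, 0.0), (1.0, 0.0), 2.0)[0]
      + homing((0.0, 0.0), (-3.0, 0.0))[0])
```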

  5. [Locomotion and control study on autonomous interventional diagnostic micro-robots].

    PubMed

    Gu, Da-qiang; Zhou, Yong

    2008-09-01

This paper introduces in detail the locomotion control and research status of autonomous interventional diagnostic micro-robots, outlines the technical problems and difficulties that currently exist, and discusses development trends in locomotion control.

  6. Human-robot interaction for field operation of an autonomous helicopter

    NASA Astrophysics Data System (ADS)

    Jones, Henry L.; Frew, Eric W.; Woodley, Bruce R.; Rock, Stephen M.

    1999-01-01

The robustness of autonomous robotic systems to unanticipated circumstances is typically insufficient for use in the field. The many skills of a human user often fill this gap in robotic capability. To incorporate the human into the system, a useful interaction between man and machine must exist. This interaction should enable useful communication to be exchanged in a natural way between human and robot on a variety of levels. This paper describes the current human-robot interaction of the Stanford HUMMINGBIRD autonomous helicopter. In particular, the paper discusses the elements of the system that enable multiple levels of communication. An intelligent system agent manages the different inputs given to the helicopter. An advanced user interface gives the user and helicopter a method for exchanging useful information. Using this human-robot interaction, the HUMMINGBIRD has carried out various autonomous search, tracking, and retrieval missions.

  7. Regolith Advanced Surface Systems Operations Robot (RASSOR) Phase 2 and Smart Autonomous Sand-Swimming Excavator

    NASA Technical Reports Server (NTRS)

    Sandy, Michael

    2015-01-01

The Regolith Advanced Surface Systems Operations Robot (RASSOR) Phase 2 is an excavation robot for mining regolith on a planet like Mars. The robot is programmed using the Robot Operating System (ROS), and it also uses a physics simulation program called Gazebo. This internship focused on improving various functions of the program in order to make the robot more professional and efficient. During the internship another project, the Smart Autonomous Sand-Swimming Excavator, was also worked on: a robot designed to dig through sand and extract sample material. The intern worked on programming the sand-swimming robot and on designing the electrical system to power and control it.

  8. Behavior-based multi-robot collaboration for autonomous construction tasks

    NASA Technical Reports Server (NTRS)

    Stroupe, Ashley; Huntsberger, Terry; Okon, Avi; Aghazarian, Hrand; Robinson, Matthew

    2005-01-01

The Robot Construction Crew (RCC) is a heterogeneous multi-robot system for autonomous construction of a structure through assembly of long components. The two-robot team demonstrates component placement into an existing structure in a realistic environment. The task requires component acquisition, cooperative transport, and cooperative precision manipulation. A behavior-based architecture provides adaptability. The RCC approach minimizes computation, power, communication, and sensing for applicability to space-related construction efforts, but the techniques are applicable to terrestrial construction tasks.

  10. Control of autonomous mobile robots using custom-designed qualitative reasoning VLSI chips and boards

    SciTech Connect

    Pin, F.G.; Pattay, R.S.

    1991-01-01

Two types of computer boards including custom-designed VLSI chips have been developed to provide a qualitative reasoning capability for the real-time control of autonomous mobile robots. The design and operation of these boards are described, and an example of the application of qualitative reasoning to the autonomous navigation of a mobile robot in a priori unknown environments is presented. Results concerning consistency and modularity in the development of qualitative reasoning schemes, as well as the general applicability of these techniques to robotic control domains, are also discussed. 17 refs., 4 figs.

  11. Application of autonomous robotized systems for the collection of nearshore topographic changing and hydrodynamic measurements

    NASA Astrophysics Data System (ADS)

    Belyakov, Vladimir; Makarov, Vladimir; Zezyulin, Denis; Kurkin, Andrey; Pelinovsky, Efim

    2015-04-01

Hazardous phenomena in the coastal zone lead to topographic changes that are difficult to inspect by traditional methods; this is why autonomous robots are used for the collection of nearshore topographic and hydrodynamic measurements. The robot RTS-Hanna is well known (Wubbold, F., Hentschel, M., Vousdoukas, M., and Wagner, B. Application of an autonomous robot for the collection of nearshore topographic and hydrodynamic measurements. Coastal Engineering Proceedings, 2012, vol. 33, Paper 53). We describe here several mobile-system designs developed in the Laboratory "Transported Machines and Transported Complexes", Nizhny Novgorod State Technical University. They can be used in field surveys and in the monitoring of nearshore wave regimes.

  12. First experiences with semi-autonomous robotic harvesting of protein crystals.

    PubMed

    Viola, Robert; Walsh, Jace; Melka, Alex; Womack, Wesley; Murphy, Sean; Riboldi-Tunnicliffe, Alan; Rupp, Bernhard

    2011-07-01

    The demonstration unit of the Universal Micromanipulation Robot (UMR) capable of semi-autonomous protein crystal harvesting has been tested and evaluated by independent users. We report the status and capabilities of the present unit scheduled for deployment in a high-throughput protein crystallization center. We discuss operational aspects as well as novel features such as micro-crystal handling and drip-cryoprotection, and we extrapolate towards the design of a fully autonomous, integrated system capable of reliable crystal harvesting. The positive to enthusiastic feedback from the participants in an evaluation workshop indicates that genuine demand exists and the effort and resources to develop autonomous protein crystal harvesting robotics are justified.

  13. Concept formation and generalization based on experimentation by an autonomous mobile robot

    SciTech Connect

    Spelt, P.F.; deSaussure, G.; Lyness, E.; Oliver, G.; Silliman, M.

    1989-01-01

    The Center for Engineering Systems Advanced Research (CESAR) conducts basic research in the area of intelligent machines. In this paper, we describe our approach to a class of machine learning problems which involves autonomous concept formation using feedback from trial-and-error learning. Our formulation was experimentally validated on an autonomous mobile robot, which learned the task of control panel monitoring and manipulation for effective process control. Conclusions are drawn concerning the applicability of the system to a more general class of learning problems, and implications for the use of autonomous mobile robots in hostile and unknown environments are discussed. 9 refs., 5 figs.

  14. Biomimetic autonomous robot inspired by the Cyanea capillata (Cyro).

    PubMed

    Villanueva, Alex A; Marut, Kenneth J; Michael, Tyler; Priya, Shashank

    2013-12-01

    A biomimetic robot inspired by Cyanea capillata, termed as 'Cyro', was developed to meet the functional demands of underwater surveillance in defense and civilian applications. The vehicle was designed to mimic the morphology and swimming mechanism of the natural counterpart. The body of the vehicle consists of a rigid support structure with linear DC motors which actuate eight mechanical arms. The mechanical arms in conjunction with artificial mesoglea create the hydrodynamic force required for propulsion. The full vehicle measures 170 cm in diameter and has a total mass of 76 kg. An analytical model of the mechanical arm kinematics was developed. The analytical and experimental bell kinematics were analyzed and compared to the C. capillata. Cyro was found to reach the water surface untethered and autonomously from a depth of 182 cm in five actuation cycles. It achieved an average velocity of 8.47 cm s(-1) while consuming an average power of 70 W. A two-axis thrust stand was developed to calculate the thrust directly from a single bell segment yielding an average thrust of 27.9 N for the whole vehicle. Steady state velocity during Cyro's swimming test was not reached but the measured performance during its last swim cycle resulted in a cost of transport of 10.9 J (kg ⋅ m)(-1) and total efficiency of 0.03.
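The reported cost of transport is consistent with COT = P/(m·v) using the abstract's own figures (70 W average power, 76 kg mass, 8.47 cm/s average velocity), as a quick check shows:

```python
def cost_of_transport(power_w, mass_kg, speed_m_per_s):
    """Cost of transport in J/(kg*m): energy spent per unit mass per
    unit distance, i.e. average power over (mass times speed)."""
    return power_w / (mass_kg * speed_m_per_s)

# Cyro's reported figures: 70 W, 76 kg, 8.47 cm/s = 0.0847 m/s.
cot = cost_of_transport(70.0, 76.0, 0.0847)
print(round(cot, 1))  # → 10.9, matching the value quoted in the abstract
```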

  15. Meeting the Complex 21 challenge: Autonomous mobile robotics

    SciTech Connect

    Holland, J.M.

    1993-12-31

Complex 21 focuses attention on developing the technology to store, inventory, account for, protect, and maintain nuclear material into the 21st Century. The optimum nuclear storage facility would be one operated by a minimum number of on-site personnel. "As many people as necessary and as few as possible," would be a good rule of thumb for staffing a nuclear storage site. Human presence adds certain safety and security considerations to the technology equation. It is no small chore to fashion a technological solution that meets the combined challenges of nuclear material handling, physical protection, inventory management, fire watch, security, and personnel safety. What is needed is a multi-purpose technology with industrial, military, scientific, and security applications; a technology that can pull and carry; one that can autonomously patrol large facilities, monitor environmental conditions, detect smoke, gas, flame, heat, humidity, intrusion, chemical, and radiation hazards; one that can survey inventory and keep accurate, detailed data logs; one that can react to and interact with people, material, equipment, conditions, and events in its work environment. Cybermotion Inc., of Roanoke, VA, has been designing, manufacturing, selling and supporting mobile robotic systems since 1984. The company's systems are at work in research, industrial, military, nuclear, security, and hazardous environment applications around the world. This paper describes some of these applications and especially the type of instruments they carry to perform monitoring and security patrols.

  16. STARLITE: a steering autonomous robot's lane investigation and tracking element

    NASA Astrophysics Data System (ADS)

    DeFauw, Randall; Lakshmanan, Sridhar; Narasimhamurthi, Natarajan; Beauvais, Michael; Kluge, Karl C.

    1997-01-01

The problem of determining the offset to lane markings is an important one in designing vision-based automotive safety systems that operate on structured road environments. The lane offset information is critical for lateral control of the automobile. In this paper, we investigate the use of this information for an autonomous robot's lane-keeping task. We employ a deformable template-based algorithm for determining the location of lane markings in visual images taken from a side-looking camera. The matching criterion involves a modification of the standard signal-to-noise ratio (SNR)-based matched-filtering criterion. A KL-type color transformation is used for transforming the RGB channels of the given image onto a composite color channel, in order to eliminate some of the noise. The standard perspective transformation is used for transforming the offset information from image coordinates onto ground coordinates. The resulting algorithm, named STARLITE, is robust to shadows, specular reflections, road cracks, etc. Experimental results are provided to illustrate the performance of STARLITE and to compare it to the AURORA algorithm and the SNR-based matched filter.
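The matched-filter search at the heart of such template matching can be sketched as sliding a lane-marking intensity template along a scanline and keeping the offset with the highest correlation score. This is a generic zero-mean matched filter, not STARLITE's modified SNR criterion; the template and scanline values are hypothetical:

```python
def matched_filter_offset(scanline, template):
    """Return the offset where `template` correlates most strongly
    with `scanline` (zero-mean correlation score)."""
    t_mean = sum(template) / len(template)
    t = [x - t_mean for x in template]          # zero-mean template
    best_off, best_score = 0, float("-inf")
    for off in range(len(scanline) - len(template) + 1):
        score = sum(scanline[off + i] * t[i] for i in range(len(t)))
        if score > best_score:
            best_off, best_score = off, score
    return best_off

# A bright lane-marking pulse buried at offset 12 in a dim scanline.
template = [0.2, 1.0, 1.0, 0.2]
scanline = [0.1] * 30
for i, v in enumerate(template):
    scanline[12 + i] += v
print(matched_filter_offset(scanline, template))  # → 12
```

Subtracting the template mean makes a constant background score exactly zero, so uniform road intensity cannot outvote the marking itself.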

  17. Autonomous Coordination and Online Motion Modeling for Mobile Robots

    DTIC Science & Technology

    2007-09-01

number of robots. The second goal of this work is to demonstrate a method by which a robot can automatically determine how it is moving. Experiments demonstrate the... to introduce a method for automatically learning a robot's motion model. The algorithm will be employed on real mobile robots so as to ensure the

  18. An intelligent hybrid behavior coordination system for an autonomous mobile robot

    NASA Astrophysics Data System (ADS)

    Luo, Chaomin; Krishnan, Mohan; Paulik, Mark; Fallouh, Samer

    2013-12-01

In this paper, the development of a low-cost PID controller with an intelligent behavior coordination system for an autonomous mobile robot is described. The robot platform, based on an HCS12 microcontroller and embedded systems, is equipped with IR sensors, ultrasonic sensors, a regulator, and RC filters. A novel hybrid PID controller and behavior coordination system is developed for wall-following navigation and obstacle avoidance. The adaptive control used in this robot is a hybrid PID algorithm associated with template and behavior coordination models. The software comprises motor control, the intelligent behavior coordination system, and sensor fusion; a module-based programming technique is adopted to improve the efficiency of integrating the hybrid PID, template, and behavior coordination algorithms. The motor control, obstacle avoidance, and wall-following algorithms propel and steer the autonomous mobile robot. The hardware configuration and the module-based technique are described, and experimental results demonstrate that the robot is successfully guided by the hybrid PID controller and behavior coordination system for wall-following navigation with obstacle avoidance.
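The PID core of such a wall-following controller acts on the error between a target standoff distance and the side-sensor reading. A minimal discrete-time illustration; the gains, sample time, and sign convention are hypothetical, not the paper's tuning:

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, error):
        self.integral += error * self.dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp*error + self.ki*self.integral + self.kd*deriv

def wall_following_steer(pid, target_dist, side_dist):
    """Positive output steers away from the wall when the robot is too
    close (side_dist < target_dist); negative steers back toward it."""
    return pid.update(target_dist - side_dist)

pid = PID(kp=1.0, ki=0.1, kd=0.05, dt=0.02)
u = wall_following_steer(pid, 0.30, 0.25)  # 5 cm too close -> steer away
```

On the paper's HCS12-class hardware this loop would run at a fixed rate, with the IR/ultrasonic reading fused into `side_dist` and the output mapped to differential motor commands.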

  19. A Movement-Assisted Deployment of Collaborating Autonomous Sensors for Indoor and Outdoor Environment Monitoring

    PubMed Central

    Niewiadomska-Szynkiewicz, Ewa; Sikora, Andrzej; Marks, Michał

    2016-01-01

    Using mobile robots or unmanned vehicles to assist optimal wireless sensors deployment in a working space can significantly enhance the capability to investigate unknown environments. This paper addresses the issues of the application of numerical optimization and computer simulation techniques to on-line calculation of a wireless sensor network topology for monitoring and tracking purposes. We focus on the design of a self-organizing and collaborative mobile network that enables a continuous data transmission to the data sink (base station) and automatically adapts its behavior to changes in the environment to achieve a common goal. The pre-defined and self-configuring approaches to the mobile-based deployment of sensors are compared and discussed. A family of novel algorithms for the optimal placement of mobile wireless devices for permanent monitoring of indoor and outdoor dynamic environments is described. They employ a network connectivity-maintaining mobility model utilizing the concept of the virtual potential function for calculating the motion trajectories of platforms carrying sensors. Their quality and utility have been justified through simulation experiments and are discussed in the final part of the paper. PMID:27649186
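The virtual-potential mobility model can be illustrated in one dimension: each pair of nodes feels a spring-like force with equilibrium at the desired communication spacing, repelling when too close and attracting when a link is about to stretch out of range. A toy relaxation under that assumption; the linear force law and the numbers are illustrative, not the paper's actual potential function:

```python
def spacing_force(dist, d_eq, gain=1.0):
    """Scalar force along the inter-node axis: positive pushes the nodes
    apart (dist < d_eq), negative pulls them together (dist > d_eq)."""
    return gain * (d_eq - dist)

def relax_pair(x1, x2, d_eq, step=0.1, iters=100):
    """Move two collinear nodes under the pairwise potential until they
    settle near the equilibrium spacing d_eq."""
    for _ in range(iters):
        f = spacing_force(abs(x2 - x1), d_eq)
        direction = 1.0 if x2 >= x1 else -1.0
        x1 -= step * f * direction
        x2 += step * f * direction
    return x1, x2

x1, x2 = relax_pair(0.0, 1.0, d_eq=2.0)
print(round(x2 - x1, 3))  # → 2.0
```

Connectivity maintenance falls out of the attractive branch: a node drifting beyond `d_eq` is pulled back before the radio link breaks, which is the property the deployment algorithms exploit.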

  20. A Movement-Assisted Deployment of Collaborating Autonomous Sensors for Indoor and Outdoor Environment Monitoring.

    PubMed

    Niewiadomska-Szynkiewicz, Ewa; Sikora, Andrzej; Marks, Michał

    2016-09-14

    Using mobile robots or unmanned vehicles to assist optimal wireless sensors deployment in a working space can significantly enhance the capability to investigate unknown environments. This paper addresses the issues of the application of numerical optimization and computer simulation techniques to on-line calculation of a wireless sensor network topology for monitoring and tracking purposes. We focus on the design of a self-organizing and collaborative mobile network that enables a continuous data transmission to the data sink (base station) and automatically adapts its behavior to changes in the environment to achieve a common goal. The pre-defined and self-configuring approaches to the mobile-based deployment of sensors are compared and discussed. A family of novel algorithms for the optimal placement of mobile wireless devices for permanent monitoring of indoor and outdoor dynamic environments is described. They employ a network connectivity-maintaining mobility model utilizing the concept of the virtual potential function for calculating the motion trajectories of platforms carrying sensors. Their quality and utility have been justified through simulation experiments and are discussed in the final part of the paper.

  1. Advances in Robotic, Human, and Autonomous Systems for Missions of Space Exploration

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R.; Briggs, Geoffrey A.; Glass, Brian J.; Pedersen, Liam; Kortenkamp, David M.; Wettergreen, David S.; Nourbakhsh, I.; Clancy, Daniel J.; Zornetzer, Steven (Technical Monitor)

    2002-01-01

    Space exploration missions are evolving toward more complex architectures involving more capable robotic systems, new levels of human and robotic interaction, and increasingly autonomous systems. How this evolving mix of advanced capabilities will be utilized in the design of new missions is a subject of much current interest. Cost and risk constraints also play a key role in the development of new missions, resulting in a complex interplay of a broad range of factors in the mission development and planning of new missions. This paper will discuss how human, robotic, and autonomous systems could be used in advanced space exploration missions. In particular, a recently completed survey of the state of the art and the potential future of robotic systems, as well as new experiments utilizing human and robotic approaches will be described. Finally, there will be a discussion of how best to utilize these various approaches for meeting space exploration goals.

  2. Multi-robot terrain coverage and task allocation for autonomous detection of landmines

    NASA Astrophysics Data System (ADS)

    Dasgupta, Prithviraj; Muñoz-Meléndez, Angélica; Guruprasad, K. R.

    2012-06-01

Multi-robot systems comprising heterogeneous autonomous vehicles on land, air, and water are being increasingly used to assist or replace humans in different hazardous missions. Two crucial aspects in such multi-robot systems are: (a) to explore an initially unknown region of interest to discover tasks, and (b) to allocate and share the discovered tasks between the robots in a coordinated manner using a multi-robot task allocation (MRTA) algorithm. In this paper, we describe results from our research on multi-robot terrain coverage and MRTA algorithms within an autonomous landmine detection scenario, done as part of the COMRADES project. Each robot is equipped with a different type of landmine detection sensor, and different sensors, even of the same type, can have different degrees of accuracy. The landmine detection-related operations performed by each robot are abstracted as tasks, and multiple robots are required to complete a single task. First, we describe a distributed and robust terrain coverage algorithm that employs Voronoi partitions to divide the area of interest among the robots and then uses a single-robot coverage algorithm to explore each partition for potential landmines. Then, we describe MRTA algorithms that use the location information of discovered potential landmines and employ either a greedy strategy or an opportunistic strategy to allocate tasks among the robots while attempting to minimize the time (energy) expended by the robots to perform the tasks. We report experimental results of our algorithms using accurately simulated Corobot robots within the Webots simulator performing a multi-robot landmine detection operation.
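A greedy MRTA strategy of the kind described can be sketched as: for each discovered mine location, assign the nearest still-available robot(s) and remove them from the pool. This is an illustrative simplification; the positions are hypothetical, and COMRADES actually requires multiple robots per task, which the `robots_per_task` parameter only gestures at:

```python
import math

def greedy_allocate(robots, tasks, robots_per_task=1):
    """Greedy MRTA sketch: for each task (in discovery order), pick the
    closest available robots and mark them busy.  `robots` and `tasks`
    map names to (x, y) positions."""
    available = dict(robots)
    assignment = {}
    for task, tpos in tasks.items():
        ranked = sorted(available,
                        key=lambda r: math.dist(available[r], tpos))
        chosen = ranked[:robots_per_task]
        assignment[task] = chosen
        for r in chosen:
            del available[r]
    return assignment

robots = {"r1": (0.0, 0.0), "r2": (10.0, 0.0)}
mines = {"m1": (1.0, 0.0), "m2": (9.0, 0.0)}
print(greedy_allocate(robots, mines))  # → {'m1': ['r1'], 'm2': ['r2']}
```

An opportunistic strategy would differ by letting robots defer or swap assignments as new mines are discovered, rather than committing in discovery order.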

  3. Tier-scalable reconnaissance: the future in autonomous C4ISR systems has arrived: progress towards an outdoor testbed

    NASA Astrophysics Data System (ADS)

    Fink, Wolfgang; Brooks, Alexander J.-W.; Tarbell, Mark A.; Dohm, James M.

    2017-05-01

    Autonomous reconnaissance missions are called for in extreme environments, as well as in potentially hazardous (e.g., the theatre, disaster-stricken areas, etc.) or inaccessible operational areas (e.g., planetary surfaces, space). Such future missions will require increasing degrees of operational autonomy, especially when following up on transient events. Operational autonomy encompasses: (1) Automatic characterization of operational areas from different vantages (i.e., spaceborne, airborne, surface, subsurface); (2) automatic sensor deployment and data gathering; (3) automatic feature extraction including anomaly detection and region-of-interest identification; (4) automatic target prediction and prioritization; (5) and subsequent automatic (re-)deployment and navigation of robotic agents. This paper reports on progress towards several aspects of autonomous C4ISR systems, including: Caltech-patented and NASA award-winning multi-tiered mission paradigm, robotic platform development (air, ground, water-based), robotic behavior motifs as the building blocks for autonomous tele-commanding, and autonomous decision making based on a Caltech-patented framework comprising sensor-data-fusion (feature-vectors), anomaly detection (clustering and principal component analysis), and target prioritization (hypothetical probing).
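The anomaly-detection step (clustering and principal component analysis in the patented framework) can be caricatured with a much simpler per-feature z-score screen over sensor feature vectors. This is an illustrative stand-in for the idea of flagging outlying feature vectors, not the Caltech method itself; the data are hypothetical:

```python
def anomaly_scores(vectors):
    """Score each feature vector by its summed per-feature |z-score|;
    large scores flag candidate anomalies / regions of interest."""
    n = len(vectors)
    dims = len(vectors[0])
    means = [sum(v[d] for v in vectors) / n for d in range(dims)]
    stds = [max((sum((v[d] - means[d])**2 for v in vectors) / n) ** 0.5, 1e-12)
            for d in range(dims)]
    return [sum(abs(v[d] - means[d]) / stds[d] for d in range(dims))
            for v in vectors]

# Nine ordinary readings and one outlier (hypothetical sensor features).
data = [(1.0, 2.0)] * 9 + [(8.0, 9.0)]
scores = anomaly_scores(data)
print(scores.index(max(scores)))  # → 9 (the outlier)
```

In the tiered paradigm, high-scoring regions would then feed target prioritization and the redeployment of ground or air agents for closer inspection.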

  4. Remote wave measurements using autonomous mobile robotic systems

    NASA Astrophysics Data System (ADS)

    Kurkin, Andrey; Zeziulin, Denis; Makarov, Vladimir; Belyakov, Vladimir; Tyugin, Dmitry; Pelinovsky, Efim

    2016-04-01

The project covers the development of a technology for monitoring and forecasting the state of the coastal-zone environment using radar equipment transported by autonomous mobile robotic systems (AMRS). Sought-after areas of application are the eastern and northern coasts of Russia, where continuous collection of information on topographic changes of the coastal zone, and hydrodynamic measurements in environments inaccessible to humans, are needed. The intensity of the wave reflections received by the radar is directly related to the height of the waves. Mathematical models and algorithms have been developed for processing experimental data (signal selection, spectral analysis, wavelet analysis), for recalculating landwash from data on wave heights far from the shore, and for determining the threshold values of wave heights far from the shore. A program complex has been developed for the operation of the experimental AMRS prototype, comprising the following modules: data loading module, reporting module, georeferencing module, data analysis module, monitoring module, hardware control module, and graphical user interface. Further work will involve testing the manufactured experimental prototype on selected coastline routes of Sakhalin Island. Field tests will reveal shortcomings of the development and identify ways to optimize the structure and operating algorithms of the AMRS, as well as the operation of the measuring equipment. The presented results have been obtained at Nizhny Novgorod State Technical University n.a. R. Alekseev in the framework of the Federal Target Program «Research and development on priority directions of scientific-technological complex of Russia for 2014 - 2020 years» (agreement № 14.574.21.0089 (unique identifier of agreement - RFMEFI57414X0089)).
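The spectral-analysis module's core task, recovering the dominant wave frequency from a sampled surface-elevation record, can be sketched with a direct DFT power scan. This brute-force loop is a stand-in for the FFT/wavelet machinery the paper mentions; the synthetic record is hypothetical:

```python
import math

def dominant_frequency(samples, dt):
    """Return the frequency (Hz) of the strongest DFT bin (excluding DC)
    of a uniformly sampled record with spacing `dt` seconds."""
    n = len(samples)
    best_k, best_power = 1, -1.0
    for k in range(1, n // 2):
        re = sum(s * math.cos(2*math.pi*k*i/n) for i, s in enumerate(samples))
        im = sum(s * math.sin(2*math.pi*k*i/n) for i, s in enumerate(samples))
        power = re*re + im*im
        if power > best_power:
            best_k, best_power = k, power
    return best_k / (n * dt)

# Synthetic 0.2 Hz swell sampled at 2 Hz for 20 s (hypothetical record).
dt = 0.5
wave = [math.sin(2*math.pi*0.2*i*dt) for i in range(40)]
print(dominant_frequency(wave, dt))  # → 0.2
```

The recovered frequency (and hence wave period) is what the run-up recalculation downstream would consume, together with the radar-derived offshore wave height.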

  5. An integrated design and fabrication strategy for entirely soft, autonomous robots

    NASA Astrophysics Data System (ADS)

    Wehner, Michael; Truby, Ryan L.; Fitzgerald, Daniel J.; Mosadegh, Bobak; Whitesides, George M.; Lewis, Jennifer A.; Wood, Robert J.

    2016-08-01

    Soft robots possess many attributes that are difficult, if not impossible, to achieve with conventional robots composed of rigid materials. Yet, despite recent advances, soft robots must still be tethered to hard robotic control systems and power sources. New strategies for creating completely soft robots, including soft analogues of these crucial components, are needed to realize their full potential. Here we report the untethered operation of a robot composed solely of soft materials. The robot is controlled with microfluidic logic that autonomously regulates fluid flow and, hence, catalytic decomposition of an on-board monopropellant fuel supply. Gas generated from the fuel decomposition inflates fluidic networks downstream of the reaction sites, resulting in actuation. The body and microfluidic logic of the robot are fabricated using moulding and soft lithography, respectively, and the pneumatic actuator networks, on-board fuel reservoirs and catalytic reaction chambers needed for movement are patterned within the body via a multi-material, embedded 3D printing technique. The fluidic and elastomeric architectures required for function span several orders of magnitude from the microscale to the macroscale. Our integrated design and rapid fabrication approach enables the programmable assembly of multiple materials within this architecture, laying the foundation for completely soft, autonomous robots.

  6. An integrated design and fabrication strategy for entirely soft, autonomous robots.

    PubMed

    Wehner, Michael; Truby, Ryan L; Fitzgerald, Daniel J; Mosadegh, Bobak; Whitesides, George M; Lewis, Jennifer A; Wood, Robert J

    2016-08-25

    Soft robots possess many attributes that are difficult, if not impossible, to achieve with conventional robots composed of rigid materials. Yet, despite recent advances, soft robots must still be tethered to hard robotic control systems and power sources. New strategies for creating completely soft robots, including soft analogues of these crucial components, are needed to realize their full potential. Here we report the untethered operation of a robot composed solely of soft materials. The robot is controlled with microfluidic logic that autonomously regulates fluid flow and, hence, catalytic decomposition of an on-board monopropellant fuel supply. Gas generated from the fuel decomposition inflates fluidic networks downstream of the reaction sites, resulting in actuation. The body and microfluidic logic of the robot are fabricated using moulding and soft lithography, respectively, and the pneumatic actuator networks, on-board fuel reservoirs and catalytic reaction chambers needed for movement are patterned within the body via a multi-material, embedded 3D printing technique. The fluidic and elastomeric architectures required for function span several orders of magnitude from the microscale to the macroscale. Our integrated design and rapid fabrication approach enables the programmable assembly of multiple materials within this architecture, laying the foundation for completely soft, autonomous robots.

  7. Simple sensors for performing useful tasks autonomously in complex outdoor terrain

    NASA Astrophysics Data System (ADS)

    Gat, Erann; Behar, Albert; Desai, Rajiv; Ivlev, Robert V.; Loch, John L.; Miller, David P.

    1992-11-01

    This paper describes the control system for Rocky IV, a prototype microrover designed to demonstrate proof-of-concept for a low-cost scientific mission to Mars. Rocky IV uses a behavior-based control architecture which implements a large variety of functions displaying various degrees of autonomy, from completely autonomous long-duration conditional sequences of actions to very precisely described actions resembling classical AI operators. The control system integrates information from infrared proximity sensors, proprioceptive encoders which report on the state of the articulation of the rover's suspension system and other mechanics, a homing beacon, a magnetic compass, and contact sensors. In addition, significant functionality is implemented as 'virtual sensors', computed values which are presented to the system as if they were sensors values. The robot is able to perform a variety of useful tasks, including soil sample collection, removal of surface weathering layers from rocks, spectral imaging, instrument deployment, and sample return, under realistic mission- like conditions in Mars-like terrain.
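
    The "virtual sensor" idea above can be sketched as wrapping a computed value behind the same read interface a hardware sensor presents. This is an illustrative sketch, not Rocky IV's actual code; the class name, encoder readings, and threshold are all assumptions.

```python
# Illustrative sketch of the "virtual sensor" idea: a computed value is
# exposed through the same interface as a physical sensor. The class name,
# the encoder readings, and the 0.1 rad threshold are assumptions, not
# Rocky IV's actual parameters.

class VirtualSensor:
    """Wraps a computation so it can be polled like a hardware sensor."""

    def __init__(self, compute):
        self._compute = compute

    def read(self):
        return self._compute()

# Stand-ins for proprioceptive suspension-articulation encoder angles (rad).
suspension_angles = [0.02, -0.18, 0.15]

def rough_terrain():
    # Flag the terrain as rough when any joint deflects past 0.1 rad.
    return any(abs(a) > 0.1 for a in suspension_angles)

roughness = VirtualSensor(rough_terrain)
print(roughness.read())
```

    Behaviors can then poll the roughness value exactly as they would a contact sensor.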

  8. How to make an autonomous robot as a partner with humans: design approach versus emergent approach.

    PubMed

    Fujita, M

    2007-01-15

    In this paper, we discuss what factors are important to realize an autonomous robot as a partner with humans. We believe that it is important to interact with people without boring them, using verbal and non-verbal communication channels. We have already developed autonomous robots such as AIBO and QRIO, whose behaviours are manually programmed and designed. We realized, however, that this design approach has limitations; therefore we propose a new approach, intelligence dynamics, where interacting in a real-world environment using embodiment is considered very important. There are pioneering works related to this approach from brain science, cognitive science, robotics and artificial intelligence. We assert that it is important to study the emergence of entire sets of autonomous behaviours and present our approach towards this goal.

  9. The experimental humanoid robot H7: a research platform for autonomous behaviour.

    PubMed

    Nishiwaki, Koichi; Kuffner, James; Kagami, Satoshi; Inaba, Masayuki; Inoue, Hirochika

    2007-01-15

    This paper gives an overview of the humanoid robot 'H7', which was developed over several years as an experimental platform for walking, autonomous behaviour and human interaction research at the University of Tokyo. H7 was designed to be a human-sized robot capable of operating autonomously in indoor environments designed for humans. The hardware is relatively simple to operate and conduct research on, particularly with respect to the hierarchical design of its control architecture. We describe the overall design goals and methodology, along with a summary of its online walking capabilities, autonomous vision-based behaviours and automatic motion planning. We show experimental results obtained by implementations running within a simulation environment as well as on the actual robot hardware.

  10. Using custom-designed VLSI fuzzy inferencing chips for the autonomous navigation of a mobile robot

    SciTech Connect

    Pin, F.G.; Pattay, R.S.; Watanabe, Hiroyuki; Symon, J. (Dept. of Computer Science)

    1991-01-01

    Two types of computer boards including custom-designed VLSI fuzzy inferencing chips have been developed to add a qualitative reasoning capability to the real-time control of autonomous mobile robots. The design and operation of these boards are first described, and an example of their use for the autonomous navigation of a mobile robot is presented. The development of qualitative reasoning schemes emulating human-like navigation in a priori unknown environments is discussed. An approach using superposition of elemental sensor-based behaviors is shown to allow easy development and testing of the inferencing rule base, while providing for progressive addition of behaviors to resolve situations of increasing complexity. The efficiency of such schemes, which can consist of as few as a dozen qualitative rules, is illustrated in experiments involving an autonomous mobile robot navigating on the basis of very sparse and inaccurate sensor data. 17 refs., 6 figs.
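
    A rule base of a dozen or so qualitative rules, as described above, can be sketched in miniature as follows. This is not the ORNL chip's actual rule set; the membership functions, ranges, and rule outputs are invented for illustration.

```python
# Minimal sketch of a qualitative fuzzy navigation rule base: two sparse
# range readings in, one steering command out. All membership functions
# and rules are illustrative assumptions.

def near(d):
    # Degree to which a range reading d (metres) means "obstacle near".
    return max(0.0, min(1.0, 1.0 - d))

def far(d):
    # Degree to which d means "obstacle far".
    return max(0.0, min(1.0, d / 2.0))

def steer(left_range, right_range):
    # Rule 1: IF obstacle near on the left  THEN steer right (+1).
    # Rule 2: IF obstacle near on the right THEN steer left  (-1).
    # Rule 3: IF both sides far             THEN go straight  (0).
    rules = [
        (near(left_range), +1.0),
        (near(right_range), -1.0),
        (min(far(left_range), far(right_range)), 0.0),
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules) or 1.0
    return num / den  # weighted-average defuzzification

print(steer(0.3, 2.0))  # obstacle close on the left -> positive (steer right)
```

    New behaviors extend the `rules` list without touching existing ones, mirroring the progressive addition of behaviors described in the abstract.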

  11. Autonomous navigation of a mobile robot using custom-designed qualitative reasoning VLSI chips and boards

    SciTech Connect

    Pin, F.G.; Pattay, R.S.; Watanabe, H.; Symon, J. (Dept. of Computer Science)

    1991-01-01

    Two types of computer boards including custom-designed VLSI chips have been developed to add a qualitative reasoning capability to the real-time control of autonomous mobile robots. The design and operation of these boards are first described, and an example of their use for the autonomous navigation of a mobile robot is presented. The development of qualitative reasoning schemes emulating human-like navigation in a priori unknown environments is discussed. The efficiency of such schemes, which can consist of as few as a dozen qualitative rules, is illustrated in experiments involving an autonomous mobile robot navigating on the basis of very sparse and inaccurate sensor data. 17 refs., 6 figs.

  12. A testbed for a unified teleoperated-autonomous dual-arm robotic system

    NASA Technical Reports Server (NTRS)

    Hayati, S.; Lee, T.; Tso, K.; Backes, P.; Lloyd, J.

    1990-01-01

    This paper describes a complete robot control facility built at the Jet Propulsion Laboratory as part of a NASA telerobotics program to develop a state-of-the-art robot control environment for laboratory-based space-like experiments. This system, which is now fully operational, has the following features: separation of the computing facilities into local and remote sites, autonomous motion generation in joint or Cartesian coordinates, dual-arm force-reflecting teleoperation with voice interaction between the operator and the robots, shared control between the autonomously generated motions and operator-controlled teleoperation, and dual-arm coordinated trajectory generation. The system has been used to carry out realistic experiments such as the exchange of an Orbital Replacement Unit (ORU), bolt turning, and door opening, using a mixture of autonomous actions and teleoperation, with either a single arm or two cooperating arms.

  13. A testbed for a unified teleoperated-autonomous dual-arm robotic system

    NASA Technical Reports Server (NTRS)

    Hayati, S.; Lee, T.; Tso, K.; Backes, P.; Lloyd, J.

    1990-01-01

    This paper describes a complete robot control facility built at the Jet Propulsion Laboratory as part of a NASA telerobotics program to develop a state-of-the-art robot control environment for laboratory-based space-like experiments. This system, which is now fully operational, has the following features: separation of the computing facilities into local and remote sites, autonomous motion generation in joint or Cartesian coordinates, dual-arm force-reflecting teleoperation with voice interaction between the operator and the robots, shared control between the autonomously generated motions and operator-controlled teleoperation, and dual-arm coordinated trajectory generation. The system has been used to carry out realistic experiments such as the exchange of an Orbital Replacement Unit (ORU), bolt turning, and door opening, using a mixture of autonomous actions and teleoperation, with either a single arm or two cooperating arms.

  14. Autonomous undulatory serpentine locomotion utilizing body dynamics of a fluidic soft robot.

    PubMed

    Onal, Cagdas D; Rus, Daniela

    2013-06-01

    Soft robotics offers the unique promise of creating inherently safe and adaptive systems. These systems bring man-made machines closer to the natural capabilities of biological systems. An important requirement to enable self-contained soft mobile robots is an on-board power source. In this paper, we present an approach to create a bio-inspired soft robotic snake that can undulate in a similar way to its biological counterpart using pressure for actuation power, without human intervention. With this approach, we develop an autonomous soft snake robot with on-board actuation, power, computation and control capabilities. The robot consists of four bidirectional fluidic elastomer actuators in series to create a traveling curvature wave from head to tail along its body. Passive wheels between segments generate the necessary frictional anisotropy for forward locomotion. It takes 14 h to build the soft robotic snake, which can attain an average locomotion speed of 19 mm s(-1).
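
    The head-to-tail traveling curvature wave described above can be sketched as phase-shifted sinusoidal commands to the four segments. The amplitude, frequency, and phase lag below are assumed values for illustration, not the paper's measured parameters.

```python
# Sketch of a traveling curvature wave: each bidirectional segment is driven
# by a phase-shifted sinusoid so the curvature peak travels from head to
# tail. All numeric parameters are illustrative assumptions.
import math

N_SEGMENTS = 4
AMPLITUDE = 1.0                           # normalized curvature command
FREQ_HZ = 0.5                             # undulation frequency
PHASE_LAG = 2 * math.pi / N_SEGMENTS      # head-to-tail phase offset

def segment_commands(t):
    """Curvature command for each segment at time t (seconds)."""
    return [AMPLITUDE * math.sin(2 * math.pi * FREQ_HZ * t - i * PHASE_LAG)
            for i in range(N_SEGMENTS)]

print(segment_commands(0.0))
```

    With the passive wheels supplying frictional anisotropy, a wave of this form pushes the body forward as in the paper's serpentine gait.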

  15. 3D Ultrasound Guidance of Autonomous Robotic Breast Biopsy: Feasibility Study

    PubMed Central

    Liang, Kaicheng; Rogers, Albert J.; Light, Edward D.; von Allmen, Daniel; Smith, Stephen W.

    2009-01-01

    Feasibility studies of autonomous robot biopsies in tissue have been conducted using real-time 3D ultrasound combined with simple thresholding algorithms. The robot first autonomously processed 3D image volumes received from the ultrasound scanner to locate a metal rod target embedded in turkey breast tissue simulating a calcification, and in a separate experiment, the center of a water-filled void in the breast tissue simulating a cyst. In both experiments the robot then directed a needle to the desired target, with no user input required. Separate needle-touch experiments performed by the image-guided robot in a water tank yielded an rms error of 1.15 mm. PMID:19900753

  16. Three-dimensional ultrasound guidance of autonomous robotic breast biopsy: feasibility study.

    PubMed

    Liang, Kaicheng; Rogers, Albert J; Light, Edward D; von Allmen, Daniel; Smith, Stephen W

    2010-01-01

    Feasibility studies of autonomous robot biopsies in tissue have been conducted using real-time three-dimensional (3-D) ultrasound combined with simple thresholding algorithms. The robot first autonomously processed 3-D image volumes received from the ultrasound scanner to locate a metal rod target embedded in turkey breast tissue simulating a calcification, and in a separate experiment, the center of a water-filled void in the breast tissue simulating a cyst. In both experiments the robot then directed a needle to the desired target, with no user input required. Separate needle-touch experiments performed by the image-guided robot in a water tank yielded an rms error of 1.15 mm. (E-mail: kaicheng.liang@duke.edu).
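
    The "simple thresholding" localization described above can be sketched as: keep voxels whose echo intensity exceeds a threshold, then take their centroid as the needle target. The pure-Python volume and threshold below are illustrative stand-ins for the scanner's 3-D data.

```python
# Sketch of threshold-and-centroid target localization in a 3-D volume.
# The toy volume and the threshold value are illustrative assumptions.

def locate_target(volume, threshold):
    """volume: nested [z][y][x] list of intensities. Returns (z, y, x) centroid."""
    hits = [(z, y, x)
            for z, plane in enumerate(volume)
            for y, row in enumerate(plane)
            for x, v in enumerate(row)
            if v > threshold]
    n = len(hits)
    return tuple(sum(c[i] for c in hits) / n for i in range(3))

# 3x3x3 volume with a bright 'rod' at y=1, x=1 through all z slices.
vol = [[[10, 10, 10], [10, 200, 10], [10, 10, 10]] for _ in range(3)]
print(locate_target(vol, 100))  # -> (1.0, 1.0, 1.0)
```

    For a cyst, the same routine would be run with the comparison inverted, since a water-filled void appears dark rather than bright.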

  17. Autonomous discovery and learning by a mobile robot in unstructured environments

    SciTech Connect

    Pin, F.G.; de Saussure, G.; Spelt, P.F.; Barnett, D.L.; Killough, S.M.; Weisbin, C.R.

    1988-01-01

    This paper presents recent research activities at the Center for Engineering Systems Advanced Research (CESAR) in the area of autonomous discovery and learning of emergency and maintenance tasks in unstructured environments by a mobile robot. The methodologies for learning basic operating principles of control devices, and for using the acquired knowledge to solve new problems with conditions not encountered before are presented. The algorithms necessary for the robot to discover problem-solving sequences of actions, through experimentation with the environment, in the two cases of immediate feedback and delayed feedback are described. The inferencing schemes allowing the robot to classify the information acquired from a reduced set of examples and to generalize its knowledge to a much wider problem-solving domain are also provided. A demonstration of the successful implementation of the algorithms on our HERMIES-IIB autonomous robot is then presented. 8 refs., 2 figs.

  18. An architectural approach to create self organizing control systems for practical autonomous robots

    NASA Technical Reports Server (NTRS)

    Greiner, Helen

    1991-01-01

    For practical industrial applications, the development of trainable robots is an important and immediate objective. Therefore, the developing of flexible intelligence directly applicable to training is emphasized. It is generally agreed upon by the AI community that the fusion of expert systems, neural networks, and conventionally programmed modules (e.g., a trajectory generator) is promising in the quest for autonomous robotic intelligence. Autonomous robot development is hindered by integration and architectural problems. Some obstacles towards the construction of more general robot control systems are as follows: (1) Growth problem; (2) Software generation; (3) Interaction with environment; (4) Reliability; and (5) Resource limitation. Neural networks can be successfully applied to some of these problems. However, current implementations of neural networks are hampered by the resource limitation problem and must be trained extensively to produce computationally accurate output. A generalization of conventional neural nets is proposed, and an architecture is offered in an attempt to address the above problems.

  19. Research and development of Ro-boat: an autonomous river cleaning robot

    NASA Astrophysics Data System (ADS)

    Sinha, Aakash; Bhardwaj, Prashant; Vaibhav, Bipul; Mohommad, Noor

    2013-12-01

    Ro-Boat is an intelligent autonomous river-cleaning robot that combines mechanical design and computer vision algorithms to achieve autonomous river cleaning and support a sustainable environment. Ro-Boat is designed in a modular fashion, with design details covering mechanical structural design, hydrodynamic design, and vibrational analysis. It incorporates a stable mechanical system with air and water propulsion, robotic arms, and a solar energy source, and is made autonomous through computer vision. Both HSV color-space and SURF features are proposed as measurements for a Kalman filter, resulting in extremely robust pollutant tracking. The system has been tested with successful results in the Yamuna River in New Delhi. We foresee that a system of Ro-Boats working autonomously 24x7 could clean a major river in a city in about six months' time, which is unmatched by alternative methods of river cleaning.
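
    The fusion of two independent detections in a Kalman filter, as proposed above, can be sketched in one dimension. The constant-velocity model, prior, and noise values below are assumptions for illustration, not the Ro-Boat system's parameters.

```python
# Sketch of measurement fusion: a 1-D Kalman filter performs one predict
# step, then updates sequentially with two independent position measurements
# (stand-ins for the HSV-color-space and SURF detections). All noise values
# and the prior are illustrative assumptions.

def kalman_track(z_hsv, z_surf, x=0.0, v=0.0, p=1.0, q=0.01, r=0.5, dt=1.0):
    """One predict step plus two sequential measurement updates."""
    x = x + v * dt            # predict position from the velocity estimate
    p = p + q                 # process noise inflates the variance
    for z in (z_hsv, z_surf):
        k = p / (p + r)       # Kalman gain for this measurement
        x = x + k * (z - x)
        p = (1 - k) * p
    return x, p

x, p = kalman_track(z_hsv=4.9, z_surf=5.1)
print(round(x, 2), round(p, 3))
```

    Updating sequentially with each detector is equivalent to a single stacked update when the two measurement noises are independent, which is what makes the combined tracker robust when one detector momentarily fails.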

  20. Passive optically encoded transponder (POET) - An acquisition and alignment target for autonomous robotics

    NASA Astrophysics Data System (ADS)

    White, G. K.

    1987-01-01

    This paper shows that it is possible to produce a three-dimensional target from a two-dimensional transponder that can enhance the capabilities of an optical measurement or alignment system, and that the autonomous operation of such a system is possible. The attitude and position resolution that is possible using such a configuration would allow noncontact coordinate system transfer and tracking capability in a robotic system, enabling a robot to access the physical database of an acquired, known target item and inspect, attach to, or manipulate any external part of the item in a teleoperated or autonomous mode without sophisticated visual capabilities.

  1. Autonomous Mobile Platform for Research in Cooperative Robotics

    NASA Technical Reports Server (NTRS)

    Daemi, Ali; Pena, Edward; Ferguson, Paul

    1998-01-01

    This paper describes the design and development of a platform for research in cooperative mobile robotics. The structure and mechanics of the vehicles are based on R/C cars. The vehicle is rendered mobile by a DC motor and servo motor. The perception of the robot's environment is achieved using IR sensors and a central vision system. A laptop computer processes images from a CCD camera located above the testing area to determine the position of objects in sight. This information is sent to each robot via RF modem. Each robot is operated by a Motorola 68HC11E micro-controller, and all actions of the robots are realized through the connections of IR sensors, modem, and motors. The intelligent behavior of each robot is based on a hierarchical fuzzy-rule based approach.

  2. Autonomous Mobile Platform for Research in Cooperative Robotics

    NASA Technical Reports Server (NTRS)

    Daemi, Ali; Pena, Edward; Ferguson, Paul

    1998-01-01

    This paper describes the design and development of a platform for research in cooperative mobile robotics. The structure and mechanics of the vehicles are based on R/C cars. The vehicle is rendered mobile by a DC motor and servo motor. The perception of the robot's environment is achieved using IR sensors and a central vision system. A laptop computer processes images from a CCD camera located above the testing area to determine the position of objects in sight. This information is sent to each robot via RF modem. Each robot is operated by a Motorola 68HC11E micro-controller, and all actions of the robots are realized through the connections of IR sensors, modem, and motors. The intelligent behavior of each robot is based on a hierarchical fuzzy-rule based approach.

  3. LABRADOR: a learning autonomous behavior-based robot for adaptive detection and object retrieval

    NASA Astrophysics Data System (ADS)

    Yamauchi, Brian; Moseley, Mark; Brookshire, Jonathan

    2013-01-01

    As part of the TARDEC-funded CANINE (Cooperative Autonomous Navigation in a Networked Environment) Program, iRobot developed LABRADOR (Learning Autonomous Behavior-based Robot for Adaptive Detection and Object Retrieval). LABRADOR was based on the rugged, man-portable, iRobot PackBot unmanned ground vehicle (UGV) equipped with an explosive ordnance disposal (EOD) manipulator arm and a custom gripper. For LABRADOR, we developed a vision-based object learning and recognition system that combined a TLD (track-learn-detect) filter based on object shape features with a color-histogram-based object detector. Our vision system was able to learn in real-time to recognize objects presented to the robot. We also implemented a waypoint navigation system based on fused GPS, IMU (inertial measurement unit), and odometry data. We used this navigation capability to implement autonomous behaviors capable of searching a specified area using a variety of robust coverage strategies - including outward spiral, random bounce, random waypoint, and perimeter following behaviors. While the full system was not integrated in time to compete in the CANINE competition event, we developed useful perception, navigation, and behavior capabilities that may be applied to future autonomous robot systems.
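
    The outward-spiral coverage strategy mentioned above can be sketched as an Archimedean spiral sampled into waypoints for the navigator. The spacing and turn count below are assumptions, not LABRADOR's actual parameters.

```python
# Sketch of outward-spiral coverage: sample an Archimedean spiral into
# waypoints whose adjacent passes are one sensor footprint apart. All
# numeric parameters are illustrative assumptions.
import math

def spiral_waypoints(spacing=1.0, turns=3, points_per_turn=8):
    """Waypoints spiraling outward from the origin; the radius grows by
    `spacing` per full turn."""
    wps = []
    for i in range(turns * points_per_turn + 1):
        theta = 2 * math.pi * i / points_per_turn
        r = spacing * theta / (2 * math.pi)
        wps.append((r * math.cos(theta), r * math.sin(theta)))
    return wps

wps = spiral_waypoints()
print(len(wps), wps[0], wps[-1])
```

    Each waypoint would then be handed to the fused GPS/IMU/odometry navigator in turn; the other strategies (random bounce, random waypoint, perimeter following) differ only in how the waypoint list is generated.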

  4. Autonomous avoidance based on motion delay of master-slave surgical robot.

    PubMed

    Inoue, Shintaro; Toyoda, Kazutaka; Kobayashi, Yo; Fujie, Masakatsu G

    2009-01-01

    Safe use of master-slave robots for endoscopic surgery requires autonomous motions to avert contact with vital organs, blood vessels, and nerves. Here we describe an avoidance control algorithm with delay compensation that takes the dynamic characteristics of the robot into account. To determine the operating parameters, we measured frequency characteristics of each joint of the slave-manipulator. The results suggest this delay compensation program improves avoidance performance.
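
    The delay-compensation idea above can be sketched as predicting where the slave manipulator will be after its known lag and triggering avoidance on the predicted pose rather than the measured one. The lag, geometry, and threshold values below are illustrative assumptions, not the paper's measured characteristics.

```python
# Sketch of avoidance with delay compensation: extrapolate the tool tip by
# the known actuation delay and test the predicted pose against a forbidden
# zone. All numeric values are illustrative assumptions.

def should_avoid(position, velocity, forbidden_center, radius, delay=0.05):
    """Trigger avoidance if the delay-extrapolated tip would enter the zone."""
    predicted = [p + v * delay for p, v in zip(position, velocity)]
    dist2 = sum((p - c) ** 2 for p, c in zip(predicted, forbidden_center))
    return dist2 < radius ** 2

# Tip 6 mm from a vessel, closing fast: the current pose is safe,
# but the delay-predicted pose is not.
print(should_avoid((0.0, 0.0, 0.006), (0.0, 0.0, -0.2),
                   forbidden_center=(0.0, 0.0, 0.0), radius=0.005))
```

    Without the extrapolation, the avoidance command would arrive one delay period too late, which is exactly the failure mode the compensation addresses.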

  5. Autonomous Navigation, Dynamic Path and Work Flow Planning in Multi-Agent Robotic Swarms Project

    NASA Technical Reports Server (NTRS)

    Falker, John; Zeitlin, Nancy; Leucht, Kurt; Stolleis, Karl

    2015-01-01

    Kennedy Space Center has teamed up with the Biological Computation Lab at the University of New Mexico to create a swarm of small, low-cost, autonomous robots, called Swarmies, to be used as a ground-based research platform for in-situ resource utilization missions. The behavior of the robot swarm mimics the central-place foraging strategy of ants to find and collect resources in an unknown environment and return those resources to a central site.
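
    The ant-inspired central-place foraging loop described above can be sketched as a small state machine: search by random walk, pick up a found resource, and carry it straight back to the central site. The grid world, states, and step counts are illustrative assumptions, not the actual Swarmie codebase.

```python
# Sketch of central-place foraging: SEARCH (random walk), PICKUP, RETURN.
# The grid world and all parameters are illustrative assumptions.
import random

def forage(resources, home=(0, 0), steps=200, seed=42):
    """Random-walk search on a grid; resources is a set of cells.
    Returns the number of resources delivered to home."""
    rng = random.Random(seed)
    pos, carrying, delivered = home, False, 0
    for _ in range(steps):
        if carrying:                       # RETURN: step straight toward home
            pos = (pos[0] - (pos[0] > 0) + (pos[0] < 0),
                   pos[1] - (pos[1] > 0) + (pos[1] < 0))
            if pos == home:
                delivered, carrying = delivered + 1, False
        else:                              # SEARCH: unbiased random walk
            dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            pos = (pos[0] + dx, pos[1] + dy)
            if pos in resources:           # PICKUP
                resources.discard(pos)
                carrying = True
    return delivered

print(forage({(1, 0), (0, 1), (2, 2)}))
```

    A swarm runs many such loops in parallel; because each robot interacts only with the environment and the central site, no inter-robot coordination is needed for the basic behavior.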

  6. Demonstration of Waypoint Navigation for A Semi-Autonomous Prototype Surf-Zone Robot

    DTIC Science & Technology

    2006-06-01

    systems onboard which enable it to autonomously navigate and control the motors and servos. The 'brain' of the Agbot robot is the BL2000 rabbit micro...should one become damaged. However, because we want Agbot to be autonomous, the control systems must be tuned to mimic a radio controller in order...translation of about 13% of the wheel radius during each step [Ref. 10]. This up and down motion causes large-amplitude, low-frequency vibrations. Many times

  7. Autonomous Robotic Weapons: US Army Innovation for Ground Combat in the Twenty-First Century

    DTIC Science & Technology

    2015-05-21

    One of the underlying themes of the film is the cognitive barriers to autonomous robots, as the film illustrates an unarmed autonomous robot's...institutional norms, rooted in decades of battlefield dominance throughout the twentieth century, have formed a cognitive resistance to such innovative doctrinal development or to paradigm shifts that may be required to prepare the US

  8. Introduction to autonomous mobile robotics using Lego Mindstorms NXT

    NASA Astrophysics Data System (ADS)

    Akın, H. Levent; Meriçli, Çetin; Meriçli, Tekin

    2013-12-01

    Teaching the fundamentals of robotics to computer science undergraduates requires designing a well-balanced curriculum that is complemented with hands-on applications on a platform that allows rapid construction of complex robots, and implementation of sophisticated algorithms. This paper describes such an elective introductory course where the Lego Mindstorms NXT kits are used as the robot platform. The aims, scope and contents of the course are presented, and the design of the laboratory sessions as well as the term projects, which address several core problems of robotics and artificial intelligence simultaneously, are explained in detail.

  9. Robust performance of multiple tasks by an autonomous robot

    SciTech Connect

    Beckerman, M.; Barnett, D.L.; Einstein, R.; Jones, J.P.; Spelt, P.D.; Weisbin, C.R.

    1989-01-01

    There have been many successful mobile robot experiments, but very few papers have appeared that examine the range of applicability, or robustness, of a robot system. The purpose of this paper is to determine and quantify robustness of the Hermies-IIB experimental capabilities. 6 refs., 1 tab.

  10. Introduction to Autonomous Mobile Robotics Using "Lego Mindstorms" NXT

    ERIC Educational Resources Information Center

    Akin, H. Levent; Meriçli, Çetin; Meriçli, Tekin

    2013-01-01

    Teaching the fundamentals of robotics to computer science undergraduates requires designing a well-balanced curriculum that is complemented with hands-on applications on a platform that allows rapid construction of complex robots, and implementation of sophisticated algorithms. This paper describes such an elective introductory course where the…

  11. Introduction to Autonomous Mobile Robotics Using "Lego Mindstorms" NXT

    ERIC Educational Resources Information Center

    Akin, H. Levent; Meriçli, Çetin; Meriçli, Tekin

    2013-01-01

    Teaching the fundamentals of robotics to computer science undergraduates requires designing a well-balanced curriculum that is complemented with hands-on applications on a platform that allows rapid construction of complex robots, and implementation of sophisticated algorithms. This paper describes such an elective introductory course where the…

  12. Manifold traversing as a model for learning control of autonomous robots

    NASA Technical Reports Server (NTRS)

    Szakaly, Zoltan F.; Schenker, Paul S.

    1992-01-01

    This paper describes a recipe for the construction of control systems that support complex machines such as multi-limbed/multi-fingered robots. The robot has to execute a task under varying environmental conditions and it has to react reasonably when previously unknown conditions are encountered. Its behavior should be learned and/or trained as opposed to being programmed. The paper describes one possible method for organizing the data that the robot has learned by various means. This framework can accept useful operator input even if it does not fully specify what to do, and can combine knowledge from autonomous, operator assisted and programmed experiences.

  13. Manifold traversing as a model for learning control of autonomous robots

    NASA Technical Reports Server (NTRS)

    Szakaly, Zoltan F.; Schenker, Paul S.

    1992-01-01

    This paper describes a recipe for the construction of control systems that support complex machines such as multi-limbed/multi-fingered robots. The robot has to execute a task under varying environmental conditions and it has to react reasonably when previously unknown conditions are encountered. Its behavior should be learned and/or trained as opposed to being programmed. The paper describes one possible method for organizing the data that the robot has learned by various means. This framework can accept useful operator input even if it does not fully specify what to do, and can combine knowledge from autonomous, operator assisted and programmed experiences.

  14. Artificial immune-network based autonomous mobile robots navigation and coordination

    NASA Astrophysics Data System (ADS)

    Duan, Q. J.; Wang, R. X.

    2005-12-01

    Based on analogies between a multi-autonomous-robot system (MARS) and the immune system, a synthesized immune network is proposed and used to solve the navigation and coordination problem in MARS. Each individual robot was regarded as a small-scaled immune network (SN). Tasks were regarded as antigens, and behavior tactics as antibodies. A behavior tactic tied to a robot sensor was taken as a B cell. The navigation and coordination problem is thus transformed into the interaction mechanism among antibodies, antigens, and small-scaled immune networks. The pursuit problem was used to validate the approach. Simulation results suggest that the proposal is promising.
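
    The antigen-antibody selection mechanism described above can be sketched as concentration dynamics: each behavior tactic (antibody) is stimulated by its affinity to the current task (antigen) and by stimulation/suppression links to other antibodies, and the highest-concentration antibody wins. All affinities and link weights below are invented for illustration.

```python
# Sketch of immune-network behavior selection: antibody concentrations grow
# with antigen affinity plus inter-antibody stimulation/suppression, and the
# highest concentration is executed. All values are illustrative assumptions.

def select_behavior(antigen_affinity, interactions, rounds=10, rate=0.1):
    names = list(antigen_affinity)
    conc = {n: 1.0 for n in names}
    for _ in range(rounds):
        new = {}
        for n in names:
            stim = antigen_affinity[n]
            stim += sum(interactions.get((n, m), 0.0) * conc[m] for m in names)
            new[n] = conc[n] + rate * stim * conc[n]
        conc = new
    return max(conc, key=conc.get)

affinity = {"pursue": 0.9, "wander": 0.2, "avoid": 0.4}
links = {("avoid", "pursue"): -0.3}   # a strong pursuer suppresses avoidance
print(select_behavior(affinity, links))
```

    The suppression link is what distinguishes this from plain winner-take-all: as one antibody's concentration grows, it actively damps competing tactics.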

  15. Towards robotic heart surgery: introduction of autonomous procedures into an experimental surgical telemanipulator system.

    PubMed

    Bauernschmitt, R; Schirmbeck, E U; Knoll, A; Mayer, H; Nagy, I; Wessel, N; Wildhirt, S M; Lange, R

    2005-09-01

    The introduction of telemanipulator systems into cardiac surgery enabled the heart surgeon to perform minimally invasive procedures with high precision and stereoscopic view. For further improvement and especially for inclusion of autonomous action sequences, implementation of force-feedback is necessary. The aim of our study was to provide a robotic scenario giving the surgeon an impression very similar to open procedures (high immersion) and to enable autonomous surgical knot tying with delicate suture material. In this experimental set-up the feasibility of autonomous surgical knot tying is demonstrated for the first time using stereoscopic view and force feedback.

  16. Motor-response learning at a process control panel by an autonomous robot

    SciTech Connect

    Spelt, P.F.; de Saussure, G.; Lyness, E.; Pin, F.G.; Weisbin, C.R.

    1988-01-01

    The Center for Engineering Systems Advanced Research (CESAR) was founded at Oak Ridge National Laboratory (ORNL) by the Department of Energy's Office of Energy Research/Division of Engineering and Geoscience (DOE-OER/DEG) to conduct basic research in the area of intelligent machines. Therefore, researchers at the CESAR Laboratory are engaged in a variety of research activities in the field of machine learning. In this paper, we describe our approach to a class of machine learning which involves motor response acquisition using feedback from trial-and-error learning. Our formulation is being experimentally validated using an autonomous robot, learning tasks of control panel monitoring and manipulation to effect process control. The CLIPS Expert System and the associated knowledge base used by the robot in the learning process, which reside in a hypercube computer aboard the robot, are described in detail. Benchmark testing of the learning process on a robot/control panel simulation system consisting of two intercommunicating computers is presented, along with results of sample problems used to train and test the expert system. These data illustrate machine learning and the resulting performance improvement in the robot for problems similar to, but not identical with, those on which the robot was trained. Conclusions are drawn concerning the learning problems, and implications for future work on machine learning for autonomous robots are discussed. 16 refs., 4 figs., 1 tab.
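
    The trial-and-error loop with immediate feedback described above can be sketched as a simple action-value learner: the robot tries panel actions, receives success/failure feedback, and comes to prefer the action that works. The panel model, learning rate, and exploration rate are illustrative assumptions, not the CESAR/CLIPS system.

```python
# Sketch of trial-and-error motor-response learning: epsilon-greedy action
# selection with an incremental value update. All parameters and the toy
# panel model are illustrative assumptions.
import random

def learn_panel(correct_action, actions=("push", "pull", "turn"),
                trials=300, alpha=0.2, epsilon=0.2, seed=7):
    rng = random.Random(seed)
    q = {a: 0.0 for a in actions}
    for _ in range(trials):
        if rng.random() < epsilon:                 # explore
            a = rng.choice(actions)
        else:                                      # exploit best estimate
            a = max(q, key=q.get)
        reward = 1.0 if a == correct_action else 0.0
        q[a] += alpha * (reward - q[a])            # incremental update
    return max(q, key=q.get)

print(learn_panel("turn"))
```

    Delayed feedback, the harder case treated in the paper, would require crediting a whole sequence of actions rather than the single most recent one.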

  17. A Prototype Novel Sensor for Autonomous, Space Based Robots - Phase 2

    NASA Technical Reports Server (NTRS)

    Squillante, M. R.; Derochemont, L. P.; Cirignano, L.; Lieberman, P.; Soller, M. S.

    1990-01-01

    The goal of this program was to develop new sensing capabilities for autonomous robots operating in space. Information gained by the robot using these new capabilities would be combined with other information gained through more traditional capabilities, such as video, to help the robot characterize its environment as well as to identify known or unknown objects that it encounters. Several sensing capabilities using nuclear radiation detectors and backscatter technology were investigated. The result of this research has been the construction and delivery to NASA of a prototype system with three capabilities for use by autonomous robots. The primary capability was the use of beta particle backscatter measurements to determine the average atomic number (Z) of an object. This gives the robot a powerful tool to differentiate objects which may look the same, such as objects made out of different plastics or other lightweight materials. In addition, the same nuclear sensor used in the backscatter measurement can be used as a nuclear spectrometer to identify sources of nuclear radiation that may be encountered by the robot, such as nuclear powered satellites. A complete nuclear analysis system is included in the software and hardware of the prototype system built in phase 2 of this effort. Finally, a method to estimate the radiation dose in the environment of the robot has been included as a third capability. Again, the same nuclear sensor is used in a different operating mode and with different analysis software. Each of these capabilities is described.
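
    The average-atomic-number capability described above rests on the fact that beta backscatter intensity rises monotonically with Z, so an unknown material's Z can be read off a calibration curve built from known samples. The calibration points below are invented for illustration, not measured data from the prototype.

```python
# Sketch of Z estimation from beta backscatter: linearly interpolate on a
# calibration curve of (Z, normalized count) pairs. The calibration values
# are illustrative assumptions, not measured data.

# (Z, normalized backscatter count) pairs for known reference materials.
CALIBRATION = [(6, 0.10), (13, 0.22), (26, 0.40), (47, 0.62), (82, 0.85)]

def estimate_z(count):
    """Linearly interpolate average atomic number from a backscatter count."""
    pts = sorted(CALIBRATION, key=lambda p: p[1])
    for (z0, c0), (z1, c1) in zip(pts, pts[1:]):
        if c0 <= count <= c1:
            return z0 + (z1 - z0) * (count - c0) / (c1 - c0)
    raise ValueError("count outside calibrated range")

print(estimate_z(0.31))  # reading between the Z=13 and Z=26 references
```

    Because the curve is monotonic, the interpolation is unambiguous; two plastics with similar appearance but different average Z yield distinguishable counts.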

  18. Development of the Research Platform of Small Autonomous Blimp Robot

    NASA Astrophysics Data System (ADS)

    Takaya, Toshihiko; Kawamura, Hidenori; Yamamoto, Masahito; Ohuchi, Azuma

A blimp robot is attractive as a small flight robot: buoyancy lets it float in the air, crashes are comparatively safe, and it can operate for long periods on little energy compared with other flying robots. However, control of a blimp robot is difficult because of its nonlinear dynamics and its sensitivity to inertia and air flow. Applied research that makes the most of these characteristics has flourished in recent years. In this paper, we describe the development of a general-purpose research blimp robot, built by dividing the robot body into exchangeable units, to support both basic blimp research and application development. A general-purpose blimp research platform improves the research efficiency of many researchers, lowers the barrier to starting blimp research, and thus contributes to the development of the field. We performed the following experiments as proof. 1. We checked basic station-keeping performance and verified that various orbital maneuvers were possible; the ease of exchanging software units was confirmed by swapping the control layer from PID control to learning control and comparing the resulting behavior. 2. To confirm the ease of exchanging hardware units, the camera sensor was exchanged for a microphone and control behavior was checked. 3. To confirm the ease of adding units, a microphone for sound detection was added alongside the camera for image detection, and control behavior was verified. 4. To check the ease of adding functionality, units were exchanged and a topological-map generation experiment using an added ultrasonic sensor was conducted. The research blimp robot thus developed demonstrated the ease of unit exchange.
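The control-layer exchange described above (swapping PID control for learning control behind a fixed unit interface) can be sketched as follows; the class and parameter names are hypothetical, not taken from the paper:

```python
class Controller:
    """Common interface for exchangeable control-layer units."""
    def command(self, error: float, dt: float) -> float:
        raise NotImplementedError

class PIDController(Controller):
    """Classic PID unit: u = Kp*e + Ki*integral(e) + Kd*de/dt."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def command(self, error, dt):
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

class BlimpAltitudeLoop:
    """Holds a swappable controller unit; the rest of the stack is unchanged."""
    def __init__(self, controller, target_altitude):
        self.controller = controller
        self.target = target_altitude

    def swap_unit(self, controller):
        self.controller = controller   # exchange only the control layer

    def step(self, measured_altitude, dt=0.1):
        return self.controller.command(self.target - measured_altitude, dt)

loop = BlimpAltitudeLoop(PIDController(kp=0.8, ki=0.1, kd=0.2), target_altitude=2.0)
thrust = loop.step(measured_altitude=1.5)
```

A learning-control unit would implement the same `command` interface, so `swap_unit` exchanges the control layer without touching the rest of the loop.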

  19. Robot soccer anywhere: achieving persistent autonomous navigation, mapping, and object vision tracking in dynamic environments

    NASA Astrophysics Data System (ADS)

    Dragone, Mauro; O'Donoghue, Ruadhan; Leonard, John J.; O'Hare, Gregory; Duffy, Brian; Patrikalakis, Andrew; Leederkerken, Jacques

    2005-06-01

    The paper describes an ongoing effort to enable autonomous mobile robots to play soccer in unstructured, everyday environments. Unlike conventional robot soccer competitions that are usually held on purpose-built robot soccer "fields", in our work we seek to develop the capability for robots to demonstrate aspects of soccer-playing in more diverse environments, such as schools, hospitals, or shopping malls, with static obstacles (furniture) and dynamic natural obstacles (people). This problem of "Soccer Anywhere" presents numerous research challenges including: (1) Simultaneous Localization and Mapping (SLAM) in dynamic, unstructured environments, (2) software control architectures for decentralized, distributed control of mobile agents, (3) integration of vision-based object tracking with dynamic control, and (4) social interaction with human participants. In addition to the intrinsic research merit of these topics, we believe that this capability would prove useful for outreach activities, in demonstrating robotics technology to primary and secondary school students, to motivate them to pursue careers in science and engineering.

  20. Autonomous Soft Robotic Fish Capable of Escape Maneuvers Using Fluidic Elastomer Actuators.

    PubMed

    Marchese, Andrew D; Onal, Cagdas D; Rus, Daniela

    2014-03-01

    In this work we describe an autonomous soft-bodied robot that is both self-contained and capable of rapid, continuum-body motion. We detail the design, modeling, fabrication, and control of the soft fish, focusing on enabling the robot to perform rapid escape responses. The robot employs a compliant body with embedded actuators emulating the slender anatomical form of a fish. In addition, the robot has a novel fluidic actuation system that drives body motion and has all the subsystems of a traditional robot onboard: power, actuation, processing, and control. At the core of the fish's soft body is an array of fluidic elastomer actuators. We design the fish to emulate escape responses in addition to forward swimming because such maneuvers require rapid body accelerations and continuum-body motion. These maneuvers showcase the performance capabilities of this self-contained robot. The kinematics and controllability of the robot during simulated escape response maneuvers are analyzed and compared with studies on biological fish. We show that during escape responses, the soft-bodied robot has similar input-output relationships to those observed in biological fish. The major implication of this work is that we show soft robots can be both self-contained and capable of rapid body motion.

  2. A unified teleoperated-autonomous dual-arm robotic system

    NASA Technical Reports Server (NTRS)

    Hayati, Samad; Lee, Thomas S.; Tso, Kam Sing; Backes, Paul G.; Lloyd, John

    1991-01-01

A description is given of a complete robot control facility built as part of a NASA telerobotics program to develop a state-of-the-art robot control environment for performing experiments in the repair and assembly of spacelike hardware, to gain practical knowledge of such work, and to improve the associated technology. The basic architecture of the manipulator control subsystem is presented. The multiarm Robot Control C Library (RCCL), a key software component of the system, is described, along with its implementation on a Sun-4 computer. The system's simulation capability is also described, and the teleoperation and shared control features are explained.

  4. A simple, inexpensive, and effective implementation of a vision-guided autonomous robot

    NASA Astrophysics Data System (ADS)

    Tippetts, Beau; Lillywhite, Kirt; Fowers, Spencer; Dennis, Aaron; Lee, Dah-Jye; Archibald, James

    2006-10-01

This paper discusses a simple, inexpensive, and effective implementation of a vision-guided autonomous robot. This implementation is a second-year entry by Brigham Young University students in the Intelligent Ground Vehicle Competition. The objective of the robot was to navigate a course constructed of white boundary lines and orange obstacles for the autonomous competition. A used electric wheelchair, purchased from a local thrift store for $28, served as the robot base. The base was modified to include Kegresse tracks using a friction drum system; this modification allowed the robot to perform better on a variety of terrains, resolving issues with the previous year's design. To control the wheelchair while retaining its robust built-in motor controls, the joystick was simply removed and replaced with a printed circuit board that emulated joystick operation and was capable of receiving commands through a serial port connection. Three different algorithms were implemented and compared: a purely reactive approach, a potential fields approach, and a machine learning approach. Each of the algorithms used color segmentation methods to interpret data from a digital camera in order to identify the features of the course. This paper will be useful to those interested in implementing an inexpensive vision-based autonomous robot.
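As an illustration of the second of the three algorithms compared above, here is a minimal potential-fields steering step; the gain values and the repulsion falloff form are illustrative assumptions, not the team's implementation:

```python
import math

def potential_field_step(pos, goal, obstacles,
                         k_att=1.0, k_rep=100.0, influence=3.0):
    """One steering step: sum of an attractive force toward the goal and
    repulsive forces from nearby obstacles. Returns a unit heading vector."""
    # Attractive force pulls linearly toward the goal.
    fx = k_att * (goal[0] - pos[0])
    fy = k_att * (goal[1] - pos[1])
    # Each obstacle within `influence` pushes away, growing sharply as d -> 0.
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 0 < d < influence:
            mag = k_rep * (1.0 / d - 1.0 / influence) / (d * d)
            fx += mag * dx / d
            fy += mag * dy / d
    norm = math.hypot(fx, fy) or 1.0
    return fx / norm, fy / norm
```

With no obstacles the heading points straight at the goal; an obstacle near the path bends the heading away from it.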

  5. Adaptive artificial neural network for autonomous robot control

    NASA Technical Reports Server (NTRS)

    Arras, Michael K.; Protzel, Peter W.; Palumbo, Daniel L.

    1992-01-01

    The topics are presented in viewgraph form and include: neural network controller for robot arm positioning with visual feedback; initial training of the arm; automatic recovery from cumulative fault scenarios; and error reduction by iterative fine movements.

  7. Detection of Water Hazards for Autonomous Robotic Vehicles

    NASA Technical Reports Server (NTRS)

    Matthes, Larry; Belluta, Paolo; McHenry, Michael

    2006-01-01

    Four methods of detection of bodies of water are under development as means to enable autonomous robotic ground vehicles to avoid water hazards when traversing off-road terrain. The methods involve processing of digitized outputs of optoelectronic sensors aboard the vehicles. It is planned to implement these methods in hardware and software that would operate in conjunction with the hardware and software for navigation and for avoidance of solid terrain obstacles and hazards. The first method, intended for use during the day, is based on the observation that, under most off-road conditions, reflections of sky from water are easily discriminated from the adjacent terrain by their color and brightness, regardless of the weather and of the state of surface waves on the water. Accordingly, this method involves collection of color imagery by a video camera and processing of the image data by an algorithm that classifies each pixel as soil, water, or vegetation according to its color and brightness values (see figure). Among the issues that arise is the fact that in the presence of reflections of objects on the opposite shore, it is difficult to distinguish water by color and brightness alone. Another issue is that once a body of water has been identified by means of color and brightness, its boundary must be mapped for use in navigation. Techniques for addressing these issues are under investigation. The second method, which is not limited by time of day, is based on the observation that ladar returns from bodies of water are usually too weak to be detected. In this method, ladar scans of the terrain are analyzed for returns and the absence thereof. In appropriate regions, the presence of water can be inferred from the absence of returns. Under some conditions in which reflections from the bottom are detectable, ladar returns could, in principle, be used to determine depth. The third method involves the recognition of bodies of water as dark areas in short
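The first method's per-pixel classification by color and brightness can be sketched as follows; the thresholds are invented for illustration and stand in for whatever decision rule the actual algorithm tuned:

```python
def classify_pixel(r, g, b):
    """Toy per-pixel classifier in the spirit of the first method: label each
    pixel soil, water, or vegetation from its color and brightness.
    Thresholds are illustrative, not the paper's values."""
    brightness = (r + g + b) / 3.0
    # Sky reflections off water tend to be blue-dominant and relatively bright.
    if b > r and b > g and brightness > 100:
        return "water"
    # Vegetation: green-dominant.
    if g > r and g > b:
        return "vegetation"
    return "soil"

def water_mask(image):
    """Boolean mask of water pixels for an image given as rows of (r, g, b)."""
    return [[classify_pixel(*px) == "water" for px in row] for row in image]
```

The mask's boundary then feeds the mapping step the abstract mentions; shoreline reflections, which defeat color and brightness alone, are exactly the hard case this simple rule misses.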

  8. Experiences in Deploying Test Arenas for Autonomous Mobile Robots

    DTIC Science & Technology

    2001-09-01

wallpaper and other types of materials pose challenges to stereo vision algorithms. Compliant objects that may visually look like rigid obstacles ... (Figure 3: features from the Yellow arena; b.) curved wall; c.) soft materials, victim under bed) ... robots can find alternate routes to exit the arenas ... The Yellow arena is the easiest in terms of traversability. Researchers who may not have very agile robot platforms, yet want to test their ...

  9. Challenging of path planning algorithms for autonomous robot in known environment

    NASA Astrophysics Data System (ADS)

    Farah, R. N.; Irwan, N.; Zuraida, Raja Lailatul; Shaharum, Umairah; Hanafi@Omar, Hafiz Mohd

    2014-06-01

Most mobile robot path planning aims to reach a predetermined goal via the shortest path while avoiding obstacles. This paper surveys path planning algorithms across current research and existing Unmanned Ground Vehicle (UGV) systems, and the challenges they face in becoming intelligent autonomous robots. The focus is a set of short reviews of individual papers on UGVs in known environments. Methods and algorithms for autonomous robot path planning are discussed. From the reviews, we find that the proposed algorithms are each appropriate for particular cases, such as single or multiple obstacles, static or moving obstacles, and optimal shortest paths. The paper also describes the pros and cons of every reviewed paper, toward algorithm improvements in further work.
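As a concrete example of the kind of algorithm such surveys compare, here is a standard A* shortest-path planner on an occupancy grid for a known environment; it is a generic textbook planner, not one of the reviewed algorithms:

```python
import heapq

def astar(grid, start, goal):
    """A* shortest path on a 4-connected occupancy grid (0 free, 1 obstacle).
    Returns the list of cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])

    def h(p):  # Manhattan distance: admissible on a 4-connected grid
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    frontier = [(h(start), 0, start, None)]   # (f, g, cell, parent)
    came_from, cost = {}, {start: 0}
    while frontier:
        _, g, cur, parent = heapq.heappop(frontier)
        if cur in came_from:
            continue                          # already expanded via a cheaper path
        came_from[cur] = parent
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < cost.get((nr, nc), float("inf")):
                    cost[(nr, nc)] = ng
                    heapq.heappush(frontier, (ng + h((nr, nc)), ng, (nr, nc), cur))
    return None
```

On the grid `[[0,0,0],[1,1,0],[0,0,0]]` the planner routes around the wall in the middle row, returning the 7-cell detour from `(0,0)` to `(2,0)`.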

  10. Remote Sensing of Radiation Dose Rate by Customizing an Autonomous Robot

    NASA Astrophysics Data System (ADS)

    Kobayashi, T.; Nakahara, M.; Morisato, K.; Takashina, T.; Kanematsu, H.

    2012-03-01

The distribution of radiation dose was measured by customizing an autonomous cleaning robot, a "Roomba," to carry a scintillation counter. The robot was used as a vehicle carrying the scintillation survey meter and was additionally equipped with an H8 microcomputer to remote-control the vehicle and send the measured data. The data obtained were combined with position data, and a distribution map of the radiation dose rate was produced. Manual, programmed, and autonomous driving tests were conducted, and all performances were verified; that is, for each operational mode, measurements during both continuous and discrete motion were tried inside and outside a room. Consequently, it has been confirmed that remote sensing of radiation dose rate is possible by customizing a robot available on the market.
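The fusion of position data with survey-meter readings into a dose-rate map can be sketched as follows; the cell-based averaging scheme and cell size are assumptions, since the abstract does not specify the mapping procedure:

```python
def dose_rate_map(samples, cell_size=0.5):
    """Fuse (x, y, dose_rate) samples from a roving survey meter into a
    coarse grid map, averaging repeated visits to the same cell."""
    sums, counts = {}, {}
    for x, y, dose in samples:
        cell = (int(x // cell_size), int(y // cell_size))
        sums[cell] = sums.get(cell, 0.0) + dose
        counts[cell] = counts.get(cell, 0) + 1
    # Mean dose rate per visited cell.
    return {cell: sums[cell] / counts[cell] for cell in sums}
```

Each of the three driving modes would feed the same fusion step, differing only in how the (x, y) track is generated.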

  11. Autonomous robot for detecting subsurface voids and tunnels using microgravity

    NASA Astrophysics Data System (ADS)

    Wilson, Stacy S.; Crawford, Nicholas C.; Croft, Leigh Ann; Howard, Michael; Miller, Stephen; Rippy, Thomas

    2006-05-01

    Tunnels have been used to evade security of defensive positions both during times of war and peace for hundreds of years. Tunnels are presently being built under the Mexican Border by drug smugglers and possibly terrorists. Several have been discovered at the border crossing at Nogales near Tucson, Arizona, along with others at other border towns. During this war on terror, tunnels under the Mexican Border pose a significant threat for the security of the United States. It is also possible that terrorists will attempt to tunnel under strategic buildings and possibly discharge explosives. The Center for Cave and Karst Study (CCKS) at Western Kentucky University has a long and successful history of determining the location of caves and subsurface voids using microgravity technology. Currently, the CCKS is developing a remotely controlled robot which will be used to locate voids underground. The robot will be a remotely controlled vehicle that will use microgravity and GPS to accurately detect and measure voids below the surface. It is hoped that this robot will also be used in military applications to locate other types of voids underground such as tunnels and bunkers. It is anticipated that the robot will be able to function up to a mile from the operator. This paper will describe the construction of the robot and the use of microgravity technology to locate subsurface voids with the robot.

  12. Terrain coverage of an unknown room by an autonomous mobile robot

    SciTech Connect

    VanderHeide, J.R.

    1995-12-05

Terrain coverage problems are nearly as old as mankind: they were necessary early in our history for basic activities such as finding food and other necessities. As our societies and their associated machineries have grown more complex, we have not outgrown the need for this primitive skill. It is still used on a small scale for cleaning tasks and on a large scale for "search and report" missions of various kinds. The motivation for automating this process may not lie in the novelty of anything we might gain as an end product, but in freedom from something which we as humans find tedious, time-consuming and sometimes dangerous. Here we consider autonomous coverage of a terrain, typically indoor rooms, by a mobile robot that has no a priori model of the terrain. In evaluating its surroundings, the robot employs only inexpensive and commercially available ultrasonic and infrared sensors. The proposed solution is a basic step - a proof of principle - that can contribute to robots capable of autonomously performing tasks such as vacuum cleaning, mopping, radiation scanning, etc. The area of automatic terrain coverage and the closely related problem of terrain model acquisition have been studied both analytically and experimentally. Compared to the existing works, the following are three major distinguishing aspects of our study: (1) the theory is actually applied to an existing robot, (2) the robot has no a priori knowledge of the terrain, and (3) the robot can be realized relatively inexpensively.
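A minimal sketch of complete coverage of a free-space grid, in the spirit of the coverage problem above (though the actual robot works without an a priori model, discovering obstacles through its sonar and infrared sensors):

```python
def cover_room(grid, start):
    """Depth-first coverage sweep of a grid (0 free, 1 obstacle) from `start`,
    visiting every reachable free cell. Move to an adjacent unvisited cell
    when one exists; otherwise backtrack along the path taken.
    Returns the set of visited cells and the drive route."""
    rows, cols = len(grid), len(grid[0])
    visited, stack, route = {start}, [start], [start]
    while stack:
        r, c = stack[-1]
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in visited):
                visited.add((nr, nc))
                stack.append((nr, nc))
                route.append((nr, nc))
                break
        else:
            stack.pop()           # dead end: backtrack one cell
            if stack:
                route.append(stack[-1])
    return visited, route
```

The returned route only ever moves between adjacent cells, so it is directly drivable; an online version would mark cells as obstacles when the sensors report them instead of reading a prebuilt grid.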

  13. A traffic priority language for collision-free navigation of autonomous mobile robots in dynamic environments.

    PubMed

    Bourbakis, N G

    1997-01-01

This paper presents a generic traffic priority language, called KYKLOFORTA, used by autonomous robots for collision-free navigation in a dynamic, unknown or known navigation space. In previous work by X. Grossmman (1988), a set of traffic control rules was developed for the navigation of robots on the lines of a two-dimensional (2-D) grid, with a control center coordinating and synchronizing their movements. In this work, the robots are considered autonomous: they move anywhere and in any direction inside the free space, and no central control is needed to coordinate and synchronize them. The requirements for each robot are i) visual perception, ii) range sensors, and iii) the ability to detect other moving objects in the same free navigation space and to determine their perceived size, velocity, and direction. Under these assumptions, each robot needs a traffic priority language that enables it to make decisions during navigation and avoid possible collisions with other moving objects. The traffic priority language proposed here is based on a primitive traffic-priority alphabet and rules that compose patterns of corridors for the application of the traffic priority rules.

  14. Development of an Interactive Augmented Environment and Its Application to Autonomous Learning for Quadruped Robots

    NASA Astrophysics Data System (ADS)

    Kobayashi, Hayato; Osaki, Tsugutoyo; Okuyama, Tetsuro; Gramm, Joshua; Ishino, Akira; Shinohara, Ayumi

    This paper describes an interactive experimental environment for autonomous soccer robots, which is a soccer field augmented by utilizing camera input and projector output. This environment, in a sense, plays an intermediate role between simulated environments and real environments. We can simulate some parts of real environments, e.g., real objects such as robots or a ball, and reflect simulated data into the real environments, e.g., to visualize the positions on the field, so as to create a situation that allows easy debugging of robot programs. The significant point compared with analogous work is that virtual objects are touchable in this system owing to projectors. We also show the portable version of our system that does not require ceiling cameras. As an application in the augmented environment, we address the learning of goalie strategies on real quadruped robots in penalty kicks. We make our robots utilize virtual balls in order to perform only quadruped locomotion in real environments, which is quite difficult to simulate accurately. Our robots autonomously learn and acquire more beneficial strategies without human intervention in our augmented environment than those in a fully simulated environment.

  15. Bio-inspired motion planning algorithms for autonomous robots facilitating greater plasticity for security applications

    NASA Astrophysics Data System (ADS)

    Guo, Yi; Hohil, Myron; Desai, Sachi V.

    2007-10-01

Proposed are techniques for using collaborative robots as mobile sensor suites in infrastructure security applications. A vast number of critical facilities and technologies must be protected against unauthorized intruders, and employing a team of mobile robots working cooperatively can free up valuable human resources. Addressed are the technical challenges for multi-robot teams in security applications and the implementation of a multi-robot motion planning algorithm based on a patrolling and threat response scenario. A neural network based methodology is exploited to plan a patrolling path with complete coverage. Also described is a proof-of-principle experimental setup with a group of Pioneer 3-AT and Centibot robots. A block diagram of the system integration of sensing and planning illustrates the robot-to-robot interaction that lets the team operate as a collaborative unit. The singular goal of the proposed approach is to overcome the limits of previous approaches to robots in security applications, enabling systems to be deployed for autonomous operation in an unaltered environment with access to an all-encompassing sensor suite.

  16. Evaluation of a Home Biomonitoring Autonomous Mobile Robot

    PubMed Central

    Dorronzoro Zubiete, Enrique; Nakahata, Keigo; Imamoglu, Nevrez; Sekine, Masashi; Sun, Guanghao; Gomez, Isabel; Yu, Wenwei

    2016-01-01

Increasing population age demands more services in the healthcare domain. It has been shown that mobile robots could be a potential solution to home biomonitoring for the elderly. Through our previous studies, a mobile robot system that is able to track a subject and identify his daily living activities has been developed. However, the system had not been tested in any home living scenarios. In this study we performed a series of experiments to investigate the accuracy of activity recognition of the mobile robot in a home living scenario. The daily activities tested in the evaluation experiment include watching TV and sleeping. A dataset recorded by a distributed distance-measuring sensor network was used as a reference for the activity recognition results. It was shown that the accuracy is not consistent across activities; that is, the mobile robot could achieve a high success rate for some activities but a poor success rate for others. It was found that the observation position of the mobile robot and the subject's surroundings have a high impact on the accuracy of the activity recognition, due to the variability of home living daily activities and their transitional processes. The possibility of improving recognition accuracy is also shown. PMID:27212940

  18. An autonomous mobile robot to perform waste drum inspections

    SciTech Connect

    Peterson, K.D.; Ward, C.R.

    1994-03-01

A mobile robot is being developed by the Savannah River Technology Center (SRTC) Robotics Group of Westinghouse Savannah River Company (WSRC) to perform mandated inspections of waste drums stored in warehouse facilities. The system will reduce personnel exposure and create accurate, high-quality documentation to ensure regulatory compliance. Development work is being coordinated among several DOE, academic, and commercial entities in accordance with DOE's technology transfer initiative. The prototype system was demonstrated in November of 1993. A system is now being developed for field trials at the Fernald site.

  19. Male urinary and sexual function after robotic pelvic autonomic nerve-preserving surgery for rectal cancer.

    PubMed

    Wang, Gang; Wang, Zhiming; Jiang, Zhiwei; Liu, Jiang; Zhao, Jian; Li, Jieshou

    2017-03-01

Urinary and sexual dysfunction are potential complications of rectal cancer surgery. The aim of this study was to evaluate urinary and sexual function in male patients after robotic surgery for rectal cancer. This prospective study included 137 of the 336 male patients who underwent surgery for rectal cancer. Urinary and male sexual function were studied by means of a questionnaire based on the International Prostatic Symptom Score (IPSS) and the International Index of Erectile Function. All data were collected before surgery and 12 months after surgery. Patients who underwent robotic surgery had a significantly lower incidence of partial or complete erectile dysfunction and sexual dysfunction than patients who underwent laparoscopic surgery. The pre- and post-operative total IPSS scores in patients with robotic surgery were significantly lower than those with laparoscopic surgery. Robotic surgery shows distinct advantages in protecting the pelvic autonomic nerves and relieving post-operative sexual dysfunction. Copyright © 2016 John Wiley & Sons, Ltd.

  20. A Behavior-Based Strategy for Single and Multi-Robot Autonomous Exploration

    PubMed Central

    Cepeda, Jesus S.; Chaimowicz, Luiz; Soto, Rogelio; Gordillo, José L.; Alanís-Reyes, Edén A.; Carrillo-Arce, Luis C.

    2012-01-01

    In this paper, we consider the problem of autonomous exploration of unknown environments with single and multiple robots. This is a challenging task, with several potential applications. We propose a simple yet effective approach that combines a behavior-based navigation with an efficient data structure to store previously visited regions. This allows robots to safely navigate, disperse and efficiently explore the environment. A series of experiments performed using a realistic robotic simulator and a real testbed scenario demonstrate that our technique effectively distributes the robots over the environment and allows them to quickly accomplish their mission in large open spaces, narrow cluttered environments, dead-end corridors, as well as rooms with minimum exits.
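The fixed-priority arbitration typical of behavior-based navigation of this kind can be sketched as follows; the behavior names and thresholds are illustrative assumptions, not the paper's:

```python
def select_behavior(range_front, nearby_robots, frontier_visible,
                    safe_dist=0.5, crowd_dist=1.0):
    """Fixed-priority behavior arbitration: safety first, then dispersion
    away from teammates, then exploration of unvisited regions, else wander.
    `range_front` is the forward range reading (m); `nearby_robots` is a
    list of distances to teammates."""
    if range_front < safe_dist:
        return "avoid_obstacle"          # highest priority: don't collide
    if any(d < crowd_dist for d in nearby_robots):
        return "disperse"                # spread the team over the environment
    if frontier_visible:
        return "go_to_unvisited"         # driven by the visited-region store
    return "wander"
```

The "go_to_unvisited" behavior is where the paper's data structure of previously visited regions would plug in, supplying the nearest unexplored frontier.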

  1. Tegotae-based decentralised control scheme for autonomous gait transition of snake-like robots.

    PubMed

    Kano, Takeshi; Yoshizawa, Ryo; Ishiguro, Akio

    2017-08-04

    Snakes change their locomotion patterns in response to the environment. This ability is a motivation for developing snake-like robots with highly adaptive functionality. In this study, a decentralised control scheme of snake-like robots that exhibited autonomous gait transition (i.e. the transition between concertina locomotion in narrow aisles and scaffold-based locomotion on unstructured terrains) was developed. Additionally, the control scheme was validated via simulations. A key insight revealed is that these locomotion patterns were not preprogrammed but emerged by exploiting Tegotae, a concept that describes the extent to which a perceived reaction matches a generated action. Unlike local reflexive mechanisms proposed previously, the Tegotae-based feedback mechanism enabled the robot to 'selectively' exploit environments beneficial for propulsion, and generated reasonable locomotion patterns. It is expected that the results of this study can form the basis to design robots that can work under unpredictable and unstructured environments.
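A schematic reading of the Tegotae idea (rewarding consistency between the generated action and the perceived reaction) might look like the following oscillator-phase update; this is a hedged sketch of the concept only, not the paper's control law:

```python
import math

def tegotae(intended, perceived):
    """Tegotae measure: how well the perceived reaction matches the generated
    action; here simply their product (positive = consistent)."""
    return intended * perceived

def update_phase(phase, omega, intended, perceived, sigma=0.5, dt=0.01):
    """Phase update for one body-segment oscillator with Tegotae feedback:
    the intrinsic frequency `omega` is modulated by how consistent the
    sensed reaction is with the commanded action at the current phase."""
    feedback = sigma * tegotae(intended, perceived) * math.cos(phase)
    return phase + (omega + feedback) * dt
```

With zero Tegotae the oscillator free-runs at `omega`; consistent action/reaction pairs speed the cycle along, which is the "selectively exploit helpful contacts" flavor of the scheme.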

  2. Development of a semi-autonomous service robot with telerobotic capabilities

    NASA Technical Reports Server (NTRS)

    Jones, J. E.; White, D. R.

    1987-01-01

The importance to the United States of semi-autonomous systems for application to a large number of manufacturing and service processes is very clear. Two principal reasons emerge as the primary driving forces for the development of such systems: enhanced national productivity and operation in environments which are hazardous to humans. Completely autonomous systems may not currently be economically feasible. However, autonomous systems that operate in a limited operation domain or that are supervised by humans are within the technology capability of this decade and will likely provide a reasonable return on investment. The two research and development efforts of autonomy and telerobotics are distinctly different, yet interconnected. The first addresses the communication of an intelligent electronic system with a robot, while the second requires human communication and ergonomic consideration. Discussed here is work in robotic control, human/robot team implementation, expert system robot operation, and sensor development by the American Welding Institute, MTS Systems Corporation, and the Colorado School of Mines--Center for Welding Research.

  3. Processing real-time stereo video for an autonomous robot using disparity maps and sensor fusion

    NASA Astrophysics Data System (ADS)

    Rosselot, Donald W.; Hall, Ernest L.

    2004-10-01

The Bearcat "Cub" robot is an interactive, intelligent Autonomous Guided Vehicle (AGV) designed to serve in unstructured environments. Recent advances in computer stereo vision algorithms that produce quality disparity, together with the availability of low-cost, high-speed camera systems, have simplified many of the tasks associated with robot navigation and obstacle avoidance using stereo vision. Leveraging these benefits, this paper describes a novel method for autonomous navigation and obstacle avoidance currently being implemented on the UC Bearcat robot. The core of this approach is the synthesis of multiple sources of real-time data, including stereo image disparity maps, tilt sensor data, and LADAR data, with standard contour, edge, color, and line detection methods to provide robust and intelligent obstacle avoidance. An algorithm is presented with Matlab code to process the disparity maps to rapidly produce obstacle size and location information in a simple format, featuring noise cancellation and correction for pitch and roll. The vision and control computers are clustered with the Parallel Virtual Machine (PVM) software. The significance of this work is in presenting the methods needed for real-time navigation and obstacle avoidance for intelligent autonomous robots.
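The disparity-map step described above (thresholding near pixels, grouping them, rejecting small blobs as noise, and reporting obstacle size and location) can be sketched in Python rather than Matlab; the parameter values are illustrative:

```python
def find_obstacles(disparity, min_disp=30, min_pixels=4):
    """Extract obstacle regions from a disparity map (larger disparity =
    nearer object). Threshold near pixels, group them by 4-connected flood
    fill, and report each blob's bounding box and pixel count; tiny blobs
    are dropped as noise."""
    rows, cols = len(disparity), len(disparity[0])
    seen, obstacles = set(), []
    for r0 in range(rows):
        for c0 in range(cols):
            if disparity[r0][c0] >= min_disp and (r0, c0) not in seen:
                blob, stack = [], [(r0, c0)]
                seen.add((r0, c0))
                while stack:
                    r, c = stack.pop()
                    blob.append((r, c))
                    for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                        if (0 <= nr < rows and 0 <= nc < cols
                                and disparity[nr][nc] >= min_disp
                                and (nr, nc) not in seen):
                            seen.add((nr, nc))
                            stack.append((nr, nc))
                if len(blob) >= min_pixels:          # noise cancellation
                    rs = [p[0] for p in blob]
                    cs = [p[1] for p in blob]
                    obstacles.append({"box": (min(rs), min(cs), max(rs), max(cs)),
                                      "pixels": len(blob)})
    return obstacles
```

A production version would first de-rotate the map using the tilt-sensor pitch and roll, as the paper describes, before thresholding.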

  4. Semi-autonomous exploration of multi-floor buildings with a legged robot

    NASA Astrophysics Data System (ADS)

    Wenger, Garrett J.; Johnson, Aaron M.; Taylor, Camillo J.; Koditschek, Daniel E.

    2015-05-01

    This paper presents preliminary results of a semi-autonomous building exploration behavior using the hexapedal robot RHex. Stairwells are used in virtually all multi-floor buildings, and so in order for a mobile robot to effectively explore, map, clear, monitor, or patrol such buildings it must be able to ascend and descend stairwells. However most conventional mobile robots based on a wheeled platform are unable to traverse stairwells, motivating use of the more mobile legged machine. This semi-autonomous behavior uses a human driver to provide steering input to the robot, as would be the case in, e.g., a tele-operated building exploration mission. The gait selection and transitions between the walking and stair climbing gaits are entirely autonomous. This implementation uses an RGBD camera for stair acquisition, which offers several advantages over a previously documented detector based on a laser range finder, including significantly reduced acquisition time. The sensor package used here also allows for considerable expansion of this behavior. For example, complete automation of the building exploration task driven by a mapping algorithm and higher level planner is presently under development.

  5. Automatic tracking of laparoscopic instruments for autonomous control of a cameraman robot.

    PubMed

    Khoiy, Keyvan Amini; Mirbagheri, Alireza; Farahmand, Farzam

    2016-01-01

    An automated instrument tracking procedure was designed and developed for autonomous control of a cameraman robot during laparoscopic surgery. The procedure was based on an innovative marker-free segmentation algorithm for detecting the tip of the surgical instruments in laparoscopic images. A compound measure of the Saturation and Value components of HSV color space was incorporated, enhanced further using the Hue component and some essential characteristics of the instrument segment, e.g., crossing the image boundaries. The procedure was then integrated into the controlling system of the RoboLens cameraman robot, within a triple-thread parallel processing scheme, such that the tip is always kept at the center of the image. Assessment of the performance of the system on prerecorded real surgery movies revealed an accuracy rate of 97% for high quality images and about 80% for those suffering from poor lighting and/or noise from blood, water, and smoke. A reasonably satisfying performance was also observed when employing the system for autonomous control of the robot in a laparoscopic surgery phantom, with a mean time delay of 200 ms. It was concluded that with further developments, the proposed procedure can provide a practical solution for autonomous control of cameraman robots during laparoscopic surgery operations.
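    A minimal sketch of the HSV idea, assuming (since the paper's exact compound measure is not reproduced here) that metallic instrument pixels are low-saturation and high-value; the thresholds and toy image are hypothetical:

```python
import numpy as np

def instrument_mask(hsv, s_max=0.25, v_min=0.6):
    """Marker-free segmentation sketch: shiny laparoscopic tools tend to be
    low-saturation, high-value regions in HSV space.  This simple S/V
    conjunction stands in for the paper's compound S/V measure; the Hue
    refinement and boundary-crossing checks are omitted."""
    s, v = hsv[..., 1], hsv[..., 2]
    return (s < s_max) & (v > v_min)

# 2x2 toy HSV image (channels in [0,1]): one shiny gray pixel,
# three colored/dark tissue pixels
hsv = np.array([[[0.0, 0.1, 0.9], [0.0, 0.8, 0.8]],
                [[0.9, 0.7, 0.4], [0.5, 0.9, 0.9]]])
mask = instrument_mask(hsv)
```

In practice the centroid of the largest mask component would drive the camera robot so the tool tip stays centered in the image.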

  6. AltiVec performance increases for autonomous robotics for the MARSSCAPE architecture program

    NASA Astrophysics Data System (ADS)

    Gothard, Benny M.

    2002-02-01

    One of the main tall poles that must be overcome to develop a fully autonomous vehicle is the inability of the computer to understand its surrounding environment to the level required for the intended task. The military mission scenario requires a robot to interact in a complex, unstructured, dynamic environment (see "A High Fidelity Multi-Sensor Scene Understanding System for Autonomous Navigation"). The Mobile Autonomous Robot Software Self-Composing Adaptive Programming Environment (MarsScape) perception research addresses three aspects of the problem: sensor system design, processing architectures, and algorithm enhancements. A prototype perception system has been demonstrated on robotic High Mobility Multi-purpose Wheeled Vehicle and All Terrain Vehicle testbeds. This paper addresses the tall pole of processing requirements and the performance improvements based on the selected MarsScape processing architecture. The processor chosen is the Motorola AltiVec-G4 PowerPC (PPC) (1998 Motorola, Inc.), a highly parallelized commercial Single Instruction Multiple Data processor. Both derived perception benchmarks and actual perception subsystem code are benchmarked and compared against previous Demo II/Semi-autonomous Surrogate Vehicle processing architectures, along with desktop Personal Computers (PCs). Performance gains are highlighted with progress to date, and lessons learned and future directions are described.

  7. Agent-based Multimodal Interface for Dynamically Autonomous Mobile Robots

    DTIC Science & Technology

    2003-01-01

    The gesture recognition process utilizes a structured-light rangefinder which emits a horizontal plane of laser light. A camera... statements will be spoken by the robot through its on-board voice synthesizer, and also sent as a text string back to the desktop GUI.

  8. Semi-autonomous surgical tasks using a miniature in vivo surgical robot.

    PubMed

    Dumpert, Jason; Lehman, Amy C; Wood, Nathan A; Oleynikov, Dmitry; Farritor, Shane M

    2009-01-01

    Natural Orifice Translumenal Endoscopic Surgery (NOTES) is potentially the next step in minimally invasive surgery. This type of procedure could reduce patient trauma through eliminating external incisions, but poses many surgical challenges that are not sufficiently overcome with current flexible endoscopy tools. A robotic platform that attempts to emulate a laparoscopic interface for performing NOTES procedures is being developed to address these challenges. These robots are capable of entering the peritoneal cavity through the upper gastrointestinal tract, and once inserted are not constrained by incisions, allowing for visualization and manipulations throughout the cavity. In addition to using these miniature in vivo robots for NOTES procedures, these devices can also be used to perform semi-autonomous surgical tasks. Such tasks could be useful in situations where the patient is in a location far from a trained surgeon. A surgeon at a remote location could control the robot even if the communication link between surgeon and patient has low bandwidth or very high latency. This paper details work towards using the miniature robot to perform simple surgical tasks autonomously.

  9. An integrated movement capture and control platform applied towards autonomous movements of surgical robots.

    PubMed

    Daluja, Sachin; Golenberg, Lavie; Cao, Alex; Pandya, Abhilash K; Auner, Gregory W; Klein, Michael D

    2009-01-01

    Robotic surgery has gradually gained acceptance due to its numerous advantages such as tremor filtration, increased dexterity and motion scaling. There remains, however, a significant scope for improvement, especially in the areas of surgeon-robot interface and autonomous procedures. Previous studies have attempted to identify factors affecting a surgeon's performance in a master-slave robotic system by tracking hand movements. These studies relied on conventional optical or magnetic tracking systems, making their use impracticable in the operating room. This study concentrated on building an intrinsic movement capture platform using microcontroller based hardware wired to a surgical robot. Software was developed to enable tracking and analysis of hand movements while surgical tasks were performed. Movement capture was applied towards automated movements of the robotic instruments. By emulating control signals, recorded surgical movements were replayed by the robot's end-effectors. Though this work uses a surgical robot as the platform, the ideas and concepts put forward are applicable to telerobotic systems in general.

  10. R-MASTIF: robotic mobile autonomous system for threat interrogation and object fetch

    NASA Astrophysics Data System (ADS)

    Das, Aveek; Thakur, Dinesh; Keller, James; Kuthirummal, Sujit; Kira, Zsolt; Pivtoraiko, Mihail

    2013-01-01

    Autonomous robotic "fetch" operation, where a robot is shown a novel object and then asked to locate it in the field, retrieve it, and bring it back to the human operator, is a challenging problem that is of interest to the military. The CANINE competition presented a forum for several research teams to tackle this challenge using the state of the art in robotics technology. The SRI-UPenn team fielded a modified Segway RMP 200 robot with multiple cameras and lidars. We implemented a unique computer vision based approach for textureless colored object training and detection to robustly locate previously unseen objects out to 15 meters on moderately flat terrain. We integrated SRI's state of the art Visual Odometry for GPS-denied localization on our robot platform. We also designed a unique scooping mechanism which allowed retrieval of up to basketball sized objects with a reciprocating four-bar linkage mechanism. Further, all software, including a novel target localization and exploration algorithm, was developed using ROS (Robot Operating System), which is open source and well adopted by the robotics community. We present a description of the system, our key technical contributions, and experimental results.

  11. Analysis of mutual assured destruction-like scenario with swarms of non-recallable autonomous robots

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2015-05-01

    This paper considers the implications of the creation of an autonomous robotic fighting force without recallability, which could serve as a deterrent to a 'total war' magnitude attack. It discusses the technical considerations for this type of robotic system and the limited enhancements to current technologies (particularly UAVs) needed to create such a system. Particular consideration is paid to how the introduction of this type of technology by one actor could create a need for reciprocal development. Also considered is the prospective utilization of this type of technology by non-state actors and the impact of this on state actors.

  12. Application of concurrent engineering methods to the design of an autonomous aerial robot

    NASA Astrophysics Data System (ADS)

    Ingalls, Stephen A.

    1991-12-01

    This paper documents the year-long efforts of a multidisciplinary design team to design, build, and support an autonomous aerial robotics system. The system was developed to participate in the Association for Unmanned Vehicle Systems' (AUVS) First International Aerial Robotics Competition, held on the Georgia Tech campus in Atlanta, Georgia on July 29, 1991. As development time and budget were extremely limited, the team elected to attempt the design using concurrent engineering design methods. These methods were validated in an IDA study by Winner [1] in the late 1980s as being particularly adept at handling the difficulties that such limitations present to design.

  13. Behavior-Based Multi-Robot Collaboration for Autonomous Construction Tasks

    NASA Technical Reports Server (NTRS)

    Stroupe, Ashley; Huntsberger, Terry; Okon, Avi; Aghazarian, Hrand; Robinson, Matthew

    2005-01-01

    We present a heterogeneous multi-robot system for autonomous construction of a structure through assembly of long components. Placement of a component within an existing structure in a realistic environment is demonstrated on a two-robot team. The task requires component acquisition, cooperative transport, and cooperative precision manipulation. For adaptability, the system is designed as a behavior-based architecture. For applicability to space-related construction efforts, computation, power, communication, and sensing are minimized, though the techniques developed are also applicable to terrestrial construction tasks.

  14. Welding torch trajectory generation for hull joining using autonomous welding mobile robot

    NASA Astrophysics Data System (ADS)

    Hascoet, J. Y.; Hamilton, K.; Carabin, G.; Rauch, M.; Alonso, M.; Ares, E.

    2012-04-01

    Shipbuilding processes involve highly dangerous manual welding operations, and welding of ship hulls presents a hazardous environment for workers. This paper describes a new robotic system, developed by the SHIPWELD consortium, that moves autonomously on the hull and automatically executes the required welding processes. Specific focus is placed on the trajectory control of such a system, which forms the basis for the discussion in this paper. It includes a description of the robotic hardware design as well as the methodology used to establish the torch trajectory control.

  15. The concept and architecture of data communication in autonomous cleaning robots

    NASA Astrophysics Data System (ADS)

    Paczesny, Daniel; Nowak, Bartosz; Tarapata, Grzegorz; Marzecki, Michał

    2016-09-01

    The paper presents the concept of a hardware and software architecture that can be easily implemented in autonomous cleaning robots. The requirement for such a system is reliability while still allowing free and simple expansion and modification. The paper describes considerations of the control and communication system, the data frame configuration, and the software architecture. To evaluate the presented control and development system, a specialised measurement stand was also proposed and described. All performed tests passed successfully and, as a consequence, the system architecture was implemented on dedicated cleaning robots.

  17. Autonomous navigation vehicle system based on robot vision and multi-sensor fusion

    NASA Astrophysics Data System (ADS)

    Wu, Lihong; Chen, Yingsong; Cui, Zhouping

    2011-12-01

    The architecture of an autonomous navigation vehicle based on robot vision and multi-sensor fusion technology is described in this paper. In order to achieve greater intelligence and robustness, accurate real-time collection and processing of information are realized by using this technology. The method used to achieve robot vision and multi-sensor fusion is discussed in detail. Results simulated in several operating modes show that this intelligent vehicle performs well in obstacle identification and avoidance and in path planning, providing higher reliability during vehicle operation.

  18. Fuzzy Logic Based Control for Autonomous Mobile Robot Navigation.

    PubMed

    Omrane, Hajer; Masmoudi, Mohamed Slim; Masmoudi, Mohamed

    This paper describes the design and implementation of a trajectory tracking controller using fuzzy logic for a mobile robot navigating in indoor environments. Most previous works used two independent controllers for navigation and obstacle avoidance. The main contribution of this paper is that only one fuzzy controller is used for both navigation and obstacle avoidance. The mobile robot is equipped with a DC motor, nine infrared range (IR) sensors to measure the distance to obstacles, and two optical encoders to provide the actual position and speeds. To evaluate the performance of the intelligent navigation algorithms, different trajectories are used and simulated using MATLAB software and the SIMIAM navigation platform. Simulation results show the performance of the intelligent navigation algorithms in terms of simulation times and travelled path.
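    The single-controller idea can be sketched as one small fuzzy rule base that blends goal seeking with obstacle avoidance. Membership shapes, rule weights, and ranges below are illustrative assumptions, not the published design:

```python
import numpy as np

def fuzzy_steer(heading_err, obst_dist):
    """One fuzzy rule base handling both navigation and obstacle avoidance.
    heading_err: signed goal bearing error (normalized to [-1, 1]);
    obst_dist: nearest-obstacle range in metres.  Returns a crisp steering
    command in [-1, 1] via weighted-average defuzzification."""
    # membership degrees (simple shoulder/ramp functions)
    near = float(np.clip((0.6 - obst_dist) / 0.4, 0.0, 1.0))  # obstacle close
    far = 1.0 - near                                          # path clear
    left = float(np.clip(heading_err, 0.0, 1.0))              # goal to the left
    right = float(np.clip(-heading_err, 0.0, 1.0))            # goal to the right

    # (firing strength, crisp output) for each rule
    rules = [
        (min(far, left), +1.0),    # clear & goal left  -> steer left
        (min(far, right), -1.0),   # clear & goal right -> steer right
        (near, -0.8),              # obstacle near      -> swerve away
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules) + 1e-9
    return num / den

steer_clear = fuzzy_steer(0.5, 1.0)   # no obstacle: turn toward goal
steer_block = fuzzy_steer(0.5, 0.1)   # obstacle at 0.1 m: swerve away
```

Because obstacle-avoidance rules fire in the same inference step as navigation rules, no separate controller or switching logic is needed.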

  19. Fuzzy Logic Based Control for Autonomous Mobile Robot Navigation

    PubMed Central

    Masmoudi, Mohamed Slim; Masmoudi, Mohamed

    2016-01-01

    This paper describes the design and implementation of a trajectory tracking controller using fuzzy logic for a mobile robot navigating in indoor environments. Most previous works used two independent controllers for navigation and obstacle avoidance. The main contribution of this paper is that only one fuzzy controller is used for both navigation and obstacle avoidance. The mobile robot is equipped with a DC motor, nine infrared range (IR) sensors to measure the distance to obstacles, and two optical encoders to provide the actual position and speeds. To evaluate the performance of the intelligent navigation algorithms, different trajectories are used and simulated using MATLAB software and the SIMIAM navigation platform. Simulation results show the performance of the intelligent navigation algorithms in terms of simulation times and travelled path. PMID:27688748

  20. Positional estimation techniques for an autonomous mobile robot

    NASA Technical Reports Server (NTRS)

    Nandhakumar, N.; Aggarwal, J. K.

    1990-01-01

    Techniques for positional estimation of a mobile robot navigating in an indoor environment are described. A comprehensive review of the various positional estimation techniques studied in the literature is first presented. The techniques are divided into four different types, and each of them is discussed briefly. Two different kinds of environments are considered for positional estimation: mountainous natural terrain and an urban, man-made environment with polyhedral buildings. In both cases, the robot is assumed to be equipped with a single visual camera that can be panned and tilted, and a 3-D description (world model) of the environment is given. Such a description could be obtained from a stereo pair of aerial images or from the architectural plans of the buildings. Techniques for positional estimation using the camera input and the world model are presented.

  1. Multisensor robotic system for autonomous space maintenance and repair

    NASA Technical Reports Server (NTRS)

    Abidi, M. A.; Green, W. L.; Chandra, T.; Spears, J.

    1988-01-01

    The feasibility of realistic autonomous space manipulation tasks using multisensory information is demonstrated. The system is capable of acquiring, integrating, and interpreting multisensory data to locate, mate, and demate a Fluid Interchange System (FIS) and a Module Interchange System (MIS). In both cases, autonomous location of a guiding light target, mating, and demating of the system are performed. Implemented vision-driven techniques are used to determine the arbitrary two-dimensional position and orientation of the mating elements as well as the arbitrary three-dimensional position and orientation of the light targets. A force/torque sensor continuously monitors the six components of force and torque exerted on the end-effector. Both FIS and MIS experiments were successfully accomplished on mock-ups built for this purpose. The method is immune to variations in the ambient light, in particular because of the 90-minute day-night shift in space.

  3. Needle Path Planning for Autonomous Robotic Surgical Suturing

    PubMed Central

    Jackson, Russell C.; Çavuşoğlu, M. Cenk

    2013-01-01

    This paper develops a path plan for suture needles used with solid tissue volumes in endoscopic surgery. The path trajectory is based on the best practices that are used by surgeons. The path attempts to minimize the interaction forces between the tissue and the needle. Using surgical guides as a basis, two different techniques for driving a suture needle are developed. The two techniques are compared in hardware experiments by robotically driving the suture needle using both of the motion plans. PMID:24683500

  4. Needle Path Planning for Autonomous Robotic Surgical Suturing.

    PubMed

    Jackson, Russell C; Cavuşoğlu, M Cenk

    2013-12-31

    This paper develops a path plan for suture needles used with solid tissue volumes in endoscopic surgery. The path trajectory is based on the best practices that are used by surgeons. The path attempts to minimize the interaction forces between the tissue and the needle. Using surgical guides as a basis, two different techniques for driving a suture needle are developed. The two techniques are compared in hardware experiments by robotically driving the suture needle using both of the motion plans.

  5. Application-based control of an autonomous mobile robot

    SciTech Connect

    Fisher, J.J.

    1988-01-01

    Industry response to new technology is governed, almost without exception, by the systems available to meet real-world needs, not tools which prove the feasibility of the technology. To this end, SRL is developing robust control strategies and tools for potential autonomous vehicle applications on site. This document describes the work packages developed to perform remote tasks and an integrated control environment which allows rapid vehicle applications development and diagnostic capabilities. 5 refs., 7 figs.

  6. Performance evaluation of 3D vision-based semi-autonomous control method for assistive robotic manipulator.

    PubMed

    Ka, Hyun W; Chung, Cheng-Shiu; Ding, Dan; James, Khara; Cooper, Rory

    2017-03-22

    We developed a 3D vision-based semi-autonomous control interface for assistive robotic manipulators. It was implemented on one of the most popular commercially available assistive robotic manipulators, combined with a low-cost depth-sensing camera mounted on the robot base. To perform a manipulation task with the 3D vision-based semi-autonomous control interface, a user starts operating with a manual control method available to him/her. When detecting objects within a set range, the control interface automatically stops the robot and provides the user with possible manipulation options through audible text output, based on the detected object characteristics. Then, the system waits until the user states a voice command. Once the user command is given, the control interface drives the robot autonomously until the given command is completed. In the empirical evaluations conducted with human subjects from two different groups, it was shown that the semi-autonomous control can be used as an alternative control method, enabling individuals with impaired motor control to operate the robot arms more efficiently by facilitating their fine motion control. The advantage of semi-autonomous control was not so obvious for simple tasks, but for relatively complex real-life tasks, the 3D vision-based semi-autonomous control showed significantly faster performance. Implications for Rehabilitation: A 3D vision-based semi-autonomous control interface will improve clinical practice by providing an alternative control method that is less demanding both physically and cognitively. A 3D vision-based semi-autonomous control provides the user with task-specific intelligent semi-autonomous manipulation assistance. A 3D vision-based semi-autonomous control gives the user the feeling that he or she is still in control at any moment. A 3D vision-based semi-autonomous control is compatible with different types of new and existing manual control methods for ARMs.

  7. Autonomous and Remote-Controlled Airborne and Ground-Based Robotic Platforms for Adaptive Geophysical Surveying

    NASA Astrophysics Data System (ADS)

    Spritzer, J. M.; Phelps, G. A.

    2011-12-01

    Low-cost autonomous and remote-controlled robotic platforms have opened the door to precision-guided geophysical surveying. Over the past two years, the U.S. Geological Survey, Senseta, NASA Ames Research Center, and Carnegie Mellon University Silicon Valley have developed and deployed small autonomous and remotely controlled vehicles for geophysical investigations. The purpose of this line of investigation is to 1) increase the analytical capability, resolution, and repeatability, and 2) decrease the time, and potentially the cost and man-power, necessary to conduct near-surface geophysical surveys. Current technology has advanced to the point where vehicles can perform geophysical surveys autonomously, freeing the geoscientist to process and analyze the incoming data in near-real time. This has enabled geoscientists to monitor survey parameters; process, analyze, and interpret the incoming data; and test geophysical models in the same field session. This new approach, termed adaptive surveying, provides the geoscientist with choices of how the remainder of the survey should be conducted. Autonomous vehicles follow pre-programmed survey paths, which can be utilized to easily repeat surveys on the same path over large areas without the operator fatigue and error that plague man-powered surveys. While initial deployments of autonomous systems required a larger field crew than a man-powered survey, costs and man-power requirements will decrease over time with operational experience. Using a low-cost, commercially available chassis as the base for autonomous surveying robotic systems promises to provide higher precision and efficiency than human-powered techniques. An experimental survey successfully demonstrated the adaptive techniques described. A magnetic sensor was mounted on a small rover, which autonomously drove a prescribed course designed to provide an overview of the study area. Magnetic data was relayed to the base station periodically, processed and gridded. A

  8. Road network modeling in open source GIS to manage the navigation of autonomous robots

    NASA Astrophysics Data System (ADS)

    Mangiameli, Michele; Muscato, Giovanni; Mussumeci, Giuseppe

    2013-10-01

    The autonomous navigation of a robot can be accomplished through the assignment of a sequence of waypoints previously identified in the territory to be explored. In general, the starting point is a vector graph of the network of possible paths. The vector graph can be directly available, as in the case of actual road networks, or it can be modeled, e.g. on the basis of cartographic supports or, even better, of a digital terrain model (DTM). In this paper we present software procedures developed in the GRASS GIS, PostGIS, and QGIS environments to identify, model, and visualize a road graph and to extract and normalize sequences of waypoints which can be transferred to a robot for its autonomous navigation.
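    Once a road graph is available, extracting a waypoint sequence reduces to shortest-path search over the graph. A minimal Python sketch with Dijkstra's algorithm; the toy graph and edge lengths are hypothetical stand-ins for a network extracted from GIS layers:

```python
import heapq

def shortest_waypoints(graph, start, goal):
    """Dijkstra over a road graph {node: [(neighbor, length_m), ...]},
    returning the ordered waypoint sequence to hand to the robot."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue                      # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    # walk back from goal to start to recover the waypoint list
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1]

# hypothetical road graph with edge lengths in metres
roads = {"A": [("B", 100), ("C", 250)],
         "B": [("C", 100), ("D", 300)],
         "C": [("D", 120)]}
route = shortest_waypoints(roads, "A", "D")
```

In the workflow described above, the node coordinates would then be normalized (e.g. densified to a fixed spacing) before being sent to the robot.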

  9. The Implementation and Testing of a Robotic Arm on an Autonomous Vehicle

    DTIC Science & Technology

    2007-12-01

    An articulated arm with three degrees of freedom is implemented and tested on an... The arm consists of a rotatable base, shoulder, elbow, and a functional gripper, which are controlled by the arm controller. Subject terms: Robotic Arm, Autonomous, Kinematics.

  10. Mobbing Behavior and Deceit and its Role in Bioinspired Autonomous Robotic Agents

    DTIC Science & Technology

    2012-01-01

    Mobbing is an anti-predator behavior mainly displayed in cooperative birds, but it can also be found in animals such as meerkats [13] and squirrels [11]. Simulation results are presented, which portend the value of this behavior in military situations. (Technical Report GIT-MRL-12-02: Mobbing Behavior and Deceit and its Role in Bio-inspired Autonomous Robotic Agents, Justin Davis and Ronald C...)

  11. Intelligent operating systems for autonomous robots: Real-time capabilities on a hypercube super-computer

    SciTech Connect

    Einstein, J.R.; Barhen, J.; Jefferson, D.

    1986-01-01

    Autonomous robots which must perform time-critical tasks in hostile environments require computers which can perform many asynchronous tasks at extremely high speeds. Certain hypercube multiprocessors have many of the required attributes, but their operating systems must be provided with special functions to improve the capability of the system to respond rapidly to unpredictable events. A "virtual-time" shell with such capabilities, under design for addition to the Vertex operating system of the NCUBE hypercube computer, is described.

  12. Neuromodulated Neural Hardware and Its Implementation on an Autonomous Mobile Robot

    NASA Astrophysics Data System (ADS)

    Tokura, Seiji; Ishiguro, Akio; Okuma, Shigeru

    In order to construct truly autonomous mobile robots, the concept of packaging is indispensable: all parts such as controllers, power systems, and batteries should be embedded inside the body. Implementing the controller in hardware is therefore one of the most promising approaches, since it contributes to low power consumption, miniaturization, and so on. Another crucial requirement in the field of autonomous mobile robots is robustness: autonomous mobile robots have to cope with unpredictably changing environments in real time. To meet these requirements, this study proposes the concept of a Dynamically Rearrangeable Electrical Circuit (DREC). In addition, we implement the DREC on FPGAs as a physical electronic circuit by using the diffusion-reaction mechanism of neuromodulation, which is widely observed in biological nervous systems. We developed the DREC for a peg-pushing task as a practical example and confirmed that the physical DREC can successfully regulate behavior according to the situation by changing its properties in real time.

  13. Self-organization of spiking neural network that generates autonomous behavior in a real mobile robot.

    PubMed

    Alnajjar, Fady; Murase, Kazuyuki

    2006-08-01

    In this paper, we propose a self-organization algorithm for spiking neural networks (SNNs) applicable to autonomous robots for the generation of adaptive and goal-directed behavior. First, we formulated an SNN model whose inputs and outputs are analog and whose hidden units are interconnected with each other. Next, we implemented it in the miniature mobile robot Khepera. In order to see whether a solution for the given task exists with the SNN, the robot was evolved with a genetic algorithm in the environment. The robot acquired the obstacle avoidance and navigation tasks successfully, demonstrating the presence of a solution. After that, a self-organization algorithm based on use-dependent synaptic potentiation and depotentiation at the input-to-hidden and hidden-to-output synapses was formulated and implemented in the robot. In the environment, the robot incrementally organized the network and the given tasks were successfully performed. The time needed to acquire the desired adaptive and goal-directed behavior using the proposed self-organization method was much less than that with genetic evolution, approximately one fifth.
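    A use-dependent potentiation/depotentiation rule of the kind described can be sketched as follows; the learning and decay rates, the coincidence criterion, and the toy spike vectors are assumptions, not the paper's actual parameters:

```python
import numpy as np

def update_weights(w, pre_spikes, post_spikes, lr=0.1, decay=0.02):
    """Use-dependent plasticity sketch: a synapse is potentiated when its
    pre- and post-synaptic units spike in the same step, and weakly
    depotentiated (decays) otherwise.  Weights are kept in [0, 1]."""
    coincident = np.outer(post_spikes, pre_spikes)       # post x pre
    w = w + lr * coincident - decay * w * (1 - coincident)
    return np.clip(w, 0.0, 1.0)

# 2 hidden units x 3 input units, all weights starting at 0.5
w = np.full((2, 3), 0.5)
pre = np.array([1, 0, 1])        # input-layer spikes this step
post = np.array([1, 0])          # hidden-layer spikes this step
w = update_weights(w, pre, post)
```

Applied step by step while the robot behaves, such a rule strengthens pathways that are actually used, which is the sense in which the network "incrementally organizes" itself.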

  14. Automatic generation of modules of object categorization for autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Gorbenko, Anna

    2013-10-01

    Many robotic tasks require advanced systems of visual sensing. Robotic systems of visual sensing must be able to solve a number of different complex problems of visual data analysis. Object categorization is one of such problems. In this paper, we propose an approach to automatic generation of computationally effective modules of object categorization for autonomous mobile robots. This approach is based on the consideration of the stack cover problem. In particular, it is assumed that the robot is able to perform an initial inspection of the environment. After such inspection, the robot needs to solve the stack cover problem by using a supercomputer. A solution of the stack cover problem allows the robot to obtain a template for computationally effective scheduling of object categorization. Also, we consider an efficient approach to solve the stack cover problem. In particular, we consider an explicit reduction from the decision version of the stack cover problem to the satisfiability problem. For different satisfiability algorithms, the results of computational experiments are presented.

  15. Toward a mobile autonomous robotic system for Mars exploration

    NASA Astrophysics Data System (ADS)

    Arena, P.; Di Giamberardino, P.; Fortuna, L.; La Gala, F.; Monaco, S.; Muscato, G.; Rizzo, A.; Ronchini, R.

    2004-01-01

    The paper presents the results obtained to date in the design and realization of wheeled and legged mobile platforms for autonomous deployment in unknown and hostile environments, work developed in the framework of a project supported by the Italian Space Agency. The paper focuses on the hierarchical architecture adopted for the planning, supervision, and control of their mobility. Experimental results validate the proposed solutions, demonstrating the platforms' ability to explore environments in the presence of irregular ground shapes and obstacles of different dimensions.

  16. Adaptive Control for Autonomous Navigation of Mobile Robots Considering Time Delay and Uncertainty

    NASA Astrophysics Data System (ADS)

    Armah, Stephen Kofi

    Autonomous control of mobile robots has attracted considerable attention from researchers in the areas of robotics and autonomous systems during the past decades. One of the goals in the field of mobile robotics is the development of platforms that operate robustly in given, partially unknown, or unpredictable environments and offer desired services to humans. Autonomous mobile robots need to be equipped with effective, robust, and/or adaptive navigation control systems. In spite of the enormous body of reported work on autonomous navigation control systems for mobile robots, achieving the goal above is still an open problem, and the robustness and reliability of the controlled system can always be improved. The fundamental issues affecting the stability of the control systems include the undesired nonlinear effects introduced by actuator saturation, time delay in the controlled system, and uncertainty in the model. This research develops robustly stabilizing control systems by investigating and addressing such nonlinear effects through analysis, simulations, and experiments. The control systems are designed to meet specified transient and steady-state specifications. The systems used for this research are ground (Dr Robot X80SV) and aerial (Parrot AR.Drone 2.0) mobile robots. Firstly, an effective autonomous navigation control system is developed for the X80SV using logic control, combining 'go-to-goal', 'avoid-obstacle', and 'follow-wall' controllers. A MATLAB robot simulator is developed to implement this control algorithm, and experiments are conducted in a typical office environment. The next stage of the research develops autonomous position (x, y, and z) and attitude (roll, pitch, and yaw) controllers for a quadrotor, with PD feedback control used to achieve stabilization. The quadrotor's nonlinear dynamics and kinematics are implemented using a MATLAB S-function to generate the state output. Secondly, the white-box and black-box approaches are used to obtain a linearized

  17. Autonomous function of wheelchair-mounted robotic manipulators to perform daily activities.

    PubMed

    Chung, Cheng-Shiu; Wang, Hongwu; Cooper, Rory A

    2013-06-01

    Autonomous functions for wheelchair-mounted robotic manipulators (WMRMs) allow a user to focus on the outcome of the task (for example, eating or drinking) instead of moving robot joints through user interfaces. In this paper, we introduce a novel personal assistive robotic system based on a position-based visual servoing (PBVS) approach. The system was evaluated with a complete drinking task, which included recognizing the location of the drink, picking up the drink from a start location, conveying the drink to the proximity of the user's mouth without spilling, and placing the drink back on the table. For a drink located in front of the wheelchair, the success rate was nearly 100%. Overall, the total time to complete the drinking task was within 40 seconds.

  18. Incremental inverse kinematics based vision servo for autonomous robotic capture of non-cooperative space debris

    NASA Astrophysics Data System (ADS)

    Dong, Gangqi; Zhu, Z. H.

    2016-04-01

    This paper proposes a new incremental inverse-kinematics-based visual servo approach for robotic manipulators to capture a non-cooperative target autonomously. The target's pose and motion are estimated by a vision system using integrated photogrammetry and an EKF algorithm. Based on the estimated pose and motion of the target, the instantaneous desired position of the end-effector is predicted by inverse kinematics, and the robotic manipulator is moved incrementally from its current configuration subject to the joint speed limits. This approach effectively eliminates the multiple-solution ambiguity in the inverse kinematics and increases the robustness of the control algorithm. The proposed approach is validated by a hardware-in-the-loop simulation, in which the pose and motion of the non-cooperative target are estimated by a real vision system. The simulation results demonstrate the effectiveness and robustness of the proposed estimation approach for the target and the incremental control strategy for the robotic manipulator.
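
    The incremental scheme described above can be sketched for a planar two-link arm: each control cycle takes one damped inverse-kinematics step toward the currently estimated target position, with joint increments clipped to a speed limit so the solution stays near the current configuration rather than jumping between IK branches. The link lengths, gains, and limits below are illustrative assumptions, not values from the paper.

```python
import math

L1, L2 = 1.0, 1.0  # link lengths (assumed)

def fk(q):
    """Forward kinematics of a planar 2-link arm."""
    x = L1 * math.cos(q[0]) + L2 * math.cos(q[0] + q[1])
    y = L1 * math.sin(q[0]) + L2 * math.sin(q[0] + q[1])
    return x, y

def jacobian(q):
    s1, c1 = math.sin(q[0]), math.cos(q[0])
    s12, c12 = math.sin(q[0] + q[1]), math.cos(q[0] + q[1])
    return [[-L1 * s1 - L2 * s12, -L2 * s12],
            [ L1 * c1 + L2 * c12,  L2 * c12]]

def ik_step(q, target, gain=0.5, dq_max=0.1):
    """One damped incremental IK update, with joint increments clipped."""
    x, y = fk(q)
    ex, ey = target[0] - x, target[1] - y
    J = jacobian(q)
    lam = 1e-3  # damping for (J^T J + lam I)^-1 J^T e
    a = J[0][0]**2 + J[1][0]**2 + lam
    b = J[0][0]*J[0][1] + J[1][0]*J[1][1]
    d = J[0][1]**2 + J[1][1]**2 + lam
    g1 = J[0][0]*ex + J[1][0]*ey
    g2 = J[0][1]*ex + J[1][1]*ey
    det = a * d - b * b
    dq1 = gain * ( d * g1 - b * g2) / det
    dq2 = gain * (-b * g1 + a * g2) / det
    dq1 = max(-dq_max, min(dq_max, dq1))  # enforce joint speed limit
    dq2 = max(-dq_max, min(dq_max, dq2))
    return [q[0] + dq1, q[1] + dq2]

q = [0.3, 0.8]
target = (1.2, 0.9)
for _ in range(200):  # one step per control cycle
    q = ik_step(q, target)
x, y = fk(q)
print(round(x, 3), round(y, 3))  # should be close to the target (1.2, 0.9)
```

    Because each update starts from the current joint angles and is clipped, the arm tracks a slowly moving target estimate without re-solving the full IK problem each cycle.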

  19. Effectiveness of social behaviors for autonomous wheelchair robot to support elderly people in Japan.

    PubMed

    Shiomi, Masahiro; Iio, Takamasa; Kamei, Koji; Sharma, Chandraprakash; Hagita, Norihiro

    2015-01-01

    We developed a wheelchair robot to support the movement of elderly people and specifically implemented two functions to enhance their intention to use it: speaking behavior to convey place/location related information and speed adjustment based on individual preferences. Our study examines how the evaluations of our wheelchair robot differ when compared with human caregivers and a conventional autonomous wheelchair without the two proposed functions in a moving support context. Twenty-eight senior citizens participated in the experiment to evaluate the three conditions. Our measurements consisted of questionnaire items and the coding of free-style interview results. Our experimental results revealed that elderly people evaluated our wheelchair robot higher than the wheelchair without the two functions and the human caregivers for some items.

  20. Effectiveness of Social Behaviors for Autonomous Wheelchair Robot to Support Elderly People in Japan

    PubMed Central

    Shiomi, Masahiro; Iio, Takamasa; Kamei, Koji; Sharma, Chandraprakash; Hagita, Norihiro

    2015-01-01

    We developed a wheelchair robot to support the movement of elderly people and specifically implemented two functions to enhance their intention to use it: speaking behavior to convey place/location related information and speed adjustment based on individual preferences. Our study examines how the evaluations of our wheelchair robot differ when compared with human caregivers and a conventional autonomous wheelchair without the two proposed functions in a moving support context. Twenty-eight senior citizens participated in the experiment to evaluate the three conditions. Our measurements consisted of questionnaire items and the coding of free-style interview results. Our experimental results revealed that elderly people evaluated our wheelchair robot higher than the wheelchair without the two functions and the human caregivers for some items. PMID:25993038

  1. Autonomous navigation method for substation inspection robot based on travelling deviation

    NASA Astrophysics Data System (ADS)

    Yang, Guoqing; Xu, Wei; Li, Jian; Fu, Chongguang; Zhou, Hao; Zhang, Chuanyou; Shao, Guangting

    2017-06-01

    A new edge-detection method for the substation environment is proposed, which enables autonomous navigation of the substation inspection robot. First, the road image is obtained using an image acquisition device. Second, noise in a region of interest selected from the road image is removed with digital image processing, the road edges are extracted with the Canny operator, and the road boundaries are extracted with the Hough transform. Finally, the distances between the robot and the left and right boundaries are calculated, and the travel deviation is obtained. The robot's walking route is controlled according to the travel deviation and a preset threshold. Experimental results show that the proposed method detects the road area in real time, with high accuracy and stable performance.
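
    The deviation-based control rule in this abstract can be sketched as follows. The boundary distances would come from the Canny/Hough pipeline described; here they are given directly, and the threshold value is an illustrative assumption.

```python
def travel_deviation(dist_left: float, dist_right: float) -> float:
    """Signed deviation from the lane centre; positive means the robot
    has drifted toward the right boundary."""
    return (dist_left - dist_right) / 2.0

def steering_command(dist_left: float, dist_right: float, threshold: float = 0.05) -> str:
    """Issue a coarse correction once the deviation exceeds the preset threshold."""
    d = travel_deviation(dist_left, dist_right)
    if d > threshold:
        return "steer_left"
    if d < -threshold:
        return "steer_right"
    return "straight"

print(steering_command(0.6, 0.4))   # closer to the right boundary: steer left
print(steering_command(0.5, 0.52))  # within the dead band: go straight
```
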

  2. Command and Control Architectures for Autonomous Micro-Robotic Forces - FY-2000 Project Report

    SciTech Connect

    Dudenhoeffer, Donald Dean

    2001-04-01

    Advances in Artificial Intelligence (AI) and micro-technologies will soon give rise to the production of large-scale forces of autonomous micro-robots with systems of innate behaviors and with capabilities of self-organization and real-world tasking. Such organizations have been compared to schools of fish, flocks of birds, herds of animals, swarms of insects, and military squadrons. While these systems are envisioned as maintaining a high degree of autonomy, it is important to understand the relationship of humans with such machines. In moving from research studies to the practical deployment of large numbers of robots, one of the critical pieces that must be explored is the command and control architecture that allows humans to re-task the force and inject global knowledge, experience, and intuition into it. Tele-operation should not be the goal, but rather a level of adjustable autonomy and high-level control. If a herd of sheep is comparable to the collective of robots, then the human element is comparable to the shepherd pulling in strays and guiding the herd toward greener pastures. This report addresses the issues and development of command and control for large-scale numbers of autonomous robots deployed as a collective force.

  3. CREST Autonomous Robotic Scientist: Developing a Closed-Loop Science Exploration Capability for European Mars Missions

    NASA Astrophysics Data System (ADS)

    Woods, M.; Shaw, A.; Ward, R.; Barnes, D.; Pullan, D.; Long, D.

    2008-08-01

    In common with most Mars missions, the current communications baseline for Europe's ExoMars Rover mission exhibits constrained data links with Earth, making remote operations difficult. The time taken to transmit and react to planning data places a natural limit on the amount of science exploration that can be achieved in any given period. In order to increase the potential science return, autonomous science assessment and response is an attractive option and worthy of investigation. In this work, we have integrated technologies and techniques developed in previous studies and used the resulting test bed to demonstrate an autonomous, opportunistic science concept on a representative robotic platform. In addition to progressing the system design approach and individual autonomy components, we have introduced a methodology for autonomous science assessment based on terrestrial field science practice.

  4. Aladdin: a semi-autonomous door opening system for EOD-class robots

    NASA Astrophysics Data System (ADS)

    Craft, Jack; Wilson, Jack; Huang, Wesley H.; Claffee, Mark R.; Phillips, Emilie A.

    2011-05-01

    This paper describes our results to date on the Aladdin project, an ongoing effort to enable small UGVs to open doors semi-autonomously. Our system consists of a modular general-purpose gripper and software that provides semi-autonomous capabilities. The gripper features compliant elements that simplify operations such as turning a doorknob and opening a door; it can be retrofitted onto existing general-purpose robotic manipulators without extensive hardware modifications. The software provides semi-autonomous door-opening capability through an operator control unit (OCU); these capabilities focus on targeting and reaching for a doorknob, the subtask that our initial testing showed would provide the greatest improvement in door-opening operations. This paper describes our system and the results of our evaluations on the door-opening task. We continue to develop both the hardware and the software, with the ultimate goal of fully autonomous door opening.

  5. Monocular SLAM for Autonomous Robots with Enhanced Features Initialization

    PubMed Central

    Guerra, Edmundo; Munguia, Rodrigo; Grau, Antoni

    2014-01-01

    This work presents a variant approach to the monocular SLAM problem focused on exploiting the advantages of a human-robot interaction (HRI) framework. Based upon the delayed inverse-depth feature initialization SLAM (DI-D SLAM), a known monocular technique, several crucial modifications are introduced that take advantage of data from a secondary monocular sensor, assuming that this second camera is worn by a human. The human explores an unknown environment with the robot, and when their fields of view coincide, the cameras are treated as a pseudo-calibrated stereo rig to produce depth estimates through parallax. These depth estimates are used to address a known limitation of DI-D monocular SLAM, namely the requirement of a metric scale initialization through known artificial landmarks. The same process is used to improve the performance of the technique when introducing new landmarks into the map. The convenience of the approach taken to the stereo estimation, based on SURF feature matching, is discussed. Experimental validation is provided with real data, with results showing improvements in terms of more features correctly initialized with reduced uncertainty, thus reducing scale and orientation drift. Additional discussion is provided on how a real-time implementation could take advantage of this approach. PMID:24699284

  6. Monocular SLAM for autonomous robots with enhanced features initialization.

    PubMed

    Guerra, Edmundo; Munguia, Rodrigo; Grau, Antoni

    2014-04-02

    This work presents a variant approach to the monocular SLAM problem focused on exploiting the advantages of a human-robot interaction (HRI) framework. Based upon the delayed inverse-depth feature initialization SLAM (DI-D SLAM), a known monocular technique, several crucial modifications are introduced that take advantage of data from a secondary monocular sensor, assuming that this second camera is worn by a human. The human explores an unknown environment with the robot, and when their fields of view coincide, the cameras are treated as a pseudo-calibrated stereo rig to produce depth estimates through parallax. These depth estimates are used to address a known limitation of DI-D monocular SLAM, namely the requirement of a metric scale initialization through known artificial landmarks. The same process is used to improve the performance of the technique when introducing new landmarks into the map. The convenience of the approach taken to the stereo estimation, based on SURF feature matching, is discussed. Experimental validation is provided with real data, with results showing improvements in terms of more features correctly initialized with reduced uncertainty, thus reducing scale and orientation drift. Additional discussion is provided on how a real-time implementation could take advantage of this approach.
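
    For a matched feature pair, the pseudo-stereo initialization described above reduces (after rectification) to the classic pinhole-stereo relation. A minimal sketch, with made-up focal length, baseline, and pixel coordinates:

```python
def depth_from_parallax(focal_px: float, baseline_m: float,
                        x_left_px: float, x_right_px: float) -> float:
    """Classic rectified-stereo depth: Z = f * B / disparity."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("feature must have positive disparity")
    return focal_px * baseline_m / disparity

# Hypothetical matched SURF feature seen by the robot's and the human's cameras:
z = depth_from_parallax(focal_px=700.0, baseline_m=0.4,
                        x_left_px=420.0, x_right_px=385.0)
print(round(z, 2))  # → 8.0 (metres)
```

    Initializing a landmark with such a depth estimate, rather than a default inverse-depth prior, is what reduces the initial uncertainty and the resulting scale drift.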

  7. Emergence of Leadership in a Group of Autonomous Robots

    PubMed Central

    Pugliese, Francesco; Acerbi, Alberto; Marocco, Davide

    2015-01-01

    In this paper we examine the factors contributing to the emergence of leadership in a group, and we explore the relationship between the role of the leader and the behavioural capabilities of other individuals. We use a simulation technique where a group of foraging robots must coordinate to choose between two identical food zones in order to forage collectively. Behavioural and quantitative analysis indicate that a form of leadership emerges, and that groups with a leader are more effective than groups without. Moreover, we show that the most skilled individuals in a group tend to be the ones that assume a leadership role, supporting biological findings. Further analysis reveals the emergence of different “styles” of leadership (active and passive). PMID:26340449

  8. Emergence of Leadership in a Group of Autonomous Robots.

    PubMed

    Pugliese, Francesco; Acerbi, Alberto; Marocco, Davide

    2015-01-01

    In this paper we examine the factors contributing to the emergence of leadership in a group, and we explore the relationship between the role of the leader and the behavioural capabilities of other individuals. We use a simulation technique where a group of foraging robots must coordinate to choose between two identical food zones in order to forage collectively. Behavioural and quantitative analysis indicate that a form of leadership emerges, and that groups with a leader are more effective than groups without. Moreover, we show that the most skilled individuals in a group tend to be the ones that assume a leadership role, supporting biological findings. Further analysis reveals the emergence of different "styles" of leadership (active and passive).

  9. Lane identification and path planning for autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    McKeon, Robert T.; Paulik, Mark; Krishnan, Mohan

    2006-10-01

    This work was performed in conjunction with the University of Detroit Mercy (UDM) ECE Department's autonomous vehicle entry in the 2006 Intelligent Ground Vehicle Competition (www.igvc.org). The IGVC challenges engineering students to design autonomous vehicles and compete in a variety of unmanned mobility events. The course to be traversed consists of a lane demarcated by painted lines on grass, with the possibility of one of the two lines being deliberately left out over segments of the course. The course also contains challenging artifacts such as sandpits, ramps, potholes, colored tarps that alter the color composition of scenes, and obstacles set up using orange-and-white construction barrels. This paper describes a composite lane-edge detection approach that uses three algorithms implementing noise filters to improve noise removal prior to image thresholding. The first algorithm uses a row-adaptive statistical filter to establish an intensity floor, followed by a global threshold based on a reverse cumulative intensity histogram and a priori knowledge about lane thickness and separation. The second method first improves the contrast of the image through an arithmetic combination of the blue plane (RGB format) and a modified saturation plane (HSI format); a global threshold is then applied based on the mean of the intensity image and a user-defined offset. The third method applies the horizontal component of the Sobel mask to a modified grayscale of the image, followed by a thresholding step similar to that of the second method. The Hough transform is applied to each of the resulting binary images to select the most probable line candidates. Finally, a heuristics-based confidence interval is determined, and the results are sent on to a separate fuzzy polar-based navigation algorithm, which fuses the image data with that produced by a laser scanner (for obstacle detection).

  10. Minimalistic optic flow sensors applied to indoor and outdoor visual guidance and odometry on a car-like robot.

    PubMed

    Mafrica, Stefano; Servel, Alain; Ruffier, Franck

    2016-11-10

    Here we present a novel bio-inspired optic flow (OF) sensor and its application to visual guidance and odometry on a low-cost car-like robot called BioCarBot. The minimalistic OF sensor was robust to high-dynamic-range lighting conditions and to the various visual patterns encountered, thanks to its M²APIX auto-adaptive pixels and the new cross-correlation OF algorithm implemented. The low-cost car-like robot estimated its velocity and steering angle, and therefore its position and orientation, via an extended Kalman filter (EKF) using only two downward-facing OF sensors and the Ackermann steering model. Indoor and outdoor experiments were carried out in which the robot was driven in closed-loop mode based on the velocity and steering-angle estimates. The experimental results show that our novel OF sensor can deliver high-frequency measurements ([Formula: see text]) over a wide OF range (1.5-[Formula: see text]) and a 7-decade high-dynamic light-level range. The OF resolution was constant and could be adjusted as required (up to [Formula: see text]), and the OF precision obtained was relatively high (standard deviation of [Formula: see text] with an average OF of [Formula: see text], under the most demanding lighting conditions). An EKF-based algorithm gave the robot's position and orientation with relatively high accuracy (maximum errors outdoors at a very low light level: [Formula: see text] and [Formula: see text] over about [Formula: see text] and [Formula: see text]) despite the low-resolution control systems of the steering servo and the DC motor, as well as a simplified model identification and calibration. Finally, the minimalistic OF-based odometry results were compared with those obtained using an inertial measurement unit (IMU) and a motor speed sensor.
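
    The odometry described above rests on the kinematic bicycle (Ackermann) model. A minimal dead-reckoning sketch, assuming the EKF has already produced velocity and steering-angle estimates; the wheelbase and inputs below are illustrative, not the robot's actual parameters:

```python
import math

WHEELBASE = 0.25  # metres (assumed)

def predict(state, v, steer, dt):
    """Propagate (x, y, heading) under the kinematic bicycle model,
    given estimated speed v and steering angle steer."""
    x, y, th = state
    x += v * math.cos(th) * dt
    y += v * math.sin(th) * dt
    th += v / WHEELBASE * math.tan(steer) * dt
    return (x, y, th)

state = (0.0, 0.0, 0.0)
for _ in range(100):  # 1 s of straight driving at 10 ms steps
    state = predict(state, v=0.5, steer=0.0, dt=0.01)
print(tuple(round(s, 3) for s in state))  # → (0.5, 0.0, 0.0)
```

    In the full system this prediction step would be one half of the EKF, with the optic-flow measurements supplying the correction half.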

  11. Real-time map building and navigation for autonomous robots in unknown environments.

    PubMed

    Oriolo, G; Ulivi, G; Vendittelli, M

    1998-01-01

    An algorithmic method is presented for the problem of autonomous robot motion in completely unknown environments. Our approach is based on the alternating execution of two fundamental processes: map building and navigation. In the former, range measurements are collected through the robot's exteroceptive sensors and processed to build a local representation of the surrounding area. This representation is then integrated into the global map reconstructed so far by filtering out insufficient or conflicting information. In the navigation phase, an A*-based planner generates a local path from the current robot position to the goal. Such a path is safe inside the explored area and provides a direction for further exploration. The robot follows the path up to the boundary of the explored area, terminating its motion if unexpected obstacles are encountered. The most distinctive aspects of our method are the use of fuzzy logic for the efficient building and modification of the environment map, and the iterative application of A*, a complete planning algorithm that takes full advantage of local information. Experimental results with a NOMAD 200 mobile robot show the real-time performance of the proposed method in both static and moderately dynamic environments.
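
    The navigation phase can be illustrated with a minimal grid-based A* planner. The occupancy grid, unit step costs, and Manhattan heuristic are assumptions for the sketch, not the paper's fuzzy-map implementation:

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected occupancy grid; grid[r][c] == 1 means obstacle.
    Returns the list of cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # admissible heuristic
    open_set = [(h(start), 0, start, None)]
    came, gbest = {}, {start: 0}
    while open_set:
        _, g, cur, parent = heapq.heappop(open_set)
        if cur in came:
            continue  # already expanded with a better cost
        came[cur] = parent
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = came[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                ng = g + 1
                if ng < gbest.get(nxt, float("inf")):
                    gbest[nxt] = ng
                    heapq.heappush(open_set, (ng + h(nxt), ng, nxt, cur))
    return None

# A wall blocks row 1 except the rightmost column:
grid = [[0, 0, 0, 0],
        [1, 1, 1, 0],
        [0, 0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))
print(path)  # detours around the wall via column 3
```

    In the paper's scheme, the "grid" would be the explored portion of the fuzzy map, and the planner would be re-run each time the map is extended.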

  12. Autonomous Mobile Robot System for Monitoring and Control of Penetration during Fixed Pipes Welding

    NASA Astrophysics Data System (ADS)

    Muramatsu, Masahiro; Suga, Yasuo; Mori, Kazuhiro

    In order to obtain sound welded joints in the welding of horizontal fixed pipes, it is important to control the back-bead width in the first pass. However, it is difficult to obtain the optimum back-bead width, because the proper welding conditions change with welding position. In this paper, to fully automate the welding of fixed pipes, a new method is developed to control the back-bead width while monitoring the shape and dimensions of the molten pool from the reverse side with an autonomous mobile robot system. The robot has a spherical shape so that it can move along complex routes including curved pipes, elbow joints, and so on. It also has a camera to observe the inner surface of the pipe and recognize the route along which it moves. The robot moves to the welding point inside the pipe and monitors the reverse-side shape of the molten pool during welding. The host computer processes the images of the molten pool acquired by the robot's vision system and calculates the optimum welding conditions to realize adaptive control of the welding. Welding control experiments demonstrate the effectiveness of this system for penetration control in fixed pipes.

  13. 3-D world modeling for an autonomous robot

    SciTech Connect

    Goldstein, M.; Pin, F.G.; Weisbin, C.R.

    1987-08-01

    This paper presents a methodology for a concise representation of the 3-D world model for a mobile robot, using range data. The process starts with the segmentation of the scene into "objects" that are given a unique label, based on principles of range continuity. The external surface of each object is then partitioned into homogeneous surface patches. Contours of surface patches in 3-D space are identified by estimating the normal and curvature associated with each pixel. The resulting surface patches are classified as planar, convex, or concave. Since the world model uses a volumetric representation of the 3-D environment, planar surfaces are represented by thin volumetric polyhedra. Spherical and cylindrical surfaces are extracted and represented by appropriate volumetric primitives. All other surfaces are represented using the Boolean union of spherical volumes (as described in a separate paper by the same authors). The result is a general, concise representation of the external 3-D world, which allows for efficient and robust 3-D object recognition. 20 refs., 14 figs.
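
    The patch-classification step can be caricatured as a sign test on an estimated curvature. The flatness threshold and the sign convention (which depends on the chosen normal orientation) are assumptions for this sketch:

```python
def classify_patch(mean_curvature: float, eps: float = 1e-3) -> str:
    """Label a surface patch from its estimated mean curvature.
    Sign convention assumed: positive curvature = patch bulging toward
    the sensor (convex); near-zero = planar."""
    if abs(mean_curvature) < eps:
        return "planar"
    return "convex" if mean_curvature > 0 else "concave"

print([classify_patch(k) for k in (0.0, 0.2, -0.15)])
# → ['planar', 'convex', 'concave']
```
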

  14. Remotely manipulated and autonomous robotic welding fabrication in space

    NASA Technical Reports Server (NTRS)

    Agapakis, J. E.; Masubuchi, K.

    1985-01-01

    The results of a NASA sponsored study, performed in order to establish the feasibility of remotely manipulated or unmanned welding fabrication systems for space construction, are presented. Possible space welding fabrication tasks and operational modes are classified and the capabilities and limitations of human operators and machines are outlined. Human performance in remote welding tasks was experimentally tested under the sensing and actuation constraints imposed by remote manipulation in outer space environments. Proposals for the development of space welding technology are made and necessary future R&D efforts are identified. The development of improved visual sensing strategies and computer encoding of the human welding engineering expertise are identified as essential, both for human operator assistance and for autonomous operation in all phases of welding fabrication. Novel uses of machine vision for the determination of the weld joint and bead geometry are proposed, and a prototype of a rule-based expert system is described for the interpretation of the visually detected weld features and defects.

  15. Remotely Manipulated And Autonomous Robotic Welding Fabrication In Space

    NASA Astrophysics Data System (ADS)

    Agapakis, John E.; Masubuchi, Koichi

    1985-12-01

    The results of a National Aeronautics and Space Administration (NASA) sponsored study, performed in order to establish the feasibility of remotely manipulated or unmanned welding fabrication systems for space construction, are first presented in this paper. Possible space welding fabrication tasks and operational modes are classified and the capabilities and limitations of human operators and machines are outlined. The human performance in remote welding tasks is experimentally tested under the sensing and actuation constraints imposed by remote manipulation in outer space environments. Proposals for the development of space welding technology are made and necessary future research and development (R&D) efforts are identified. The development of improved visual sensing strategies and computer encoding of the human welding engineering expertise are identified as essential, both for human operator assistance and for autonomous operation in all phases of welding fabrication. Results of a related follow-up study are then briefly presented. Novel uses of machine vision for the determination of the weld joint and bead geometry are proposed and implemented, and a first prototype of a rule-based expert system is developed for the interpretation of the visually detected weld features and defects.

  16. Towards MRI-Based Autonomous Robotic US Acquisitions: A First Feasibility Study.

    PubMed

    Hennersperger, Christoph; Fuerst, Bernhard; Virga, Salvatore; Zettinig, Oliver; Frisch, Benjamin; Neff, Thomas; Navab, Nassir

    2016-10-24

    Robotic ultrasound has the potential to assist and guide physicians during interventions. In this work, we present a set of methods and a workflow to enable autonomous MRI-guided ultrasound acquisitions. Our approach uses a structured-light 3D scanner for patient-to-robot and image-to-patient calibration, which in turn is used to plan 3D ultrasound trajectories. These MRI-based trajectories are followed autonomously by the robot and are further refined online using automatic MRI/US registration. Despite the low spatial resolution of structured light scanners, the initial planned acquisition path can be followed with an accuracy of 2.46±0.96 mm. This leads to a good initialization of the MRI/US registration: the 3D-scan-based alignment for planning and acquisition shows an accuracy (distance between planned ultrasound and MRI) of 4.47 mm, and 0.97 mm after an online-update of the calibration based on a closed loop registration.

  17. Autonomous robot navigation based on the evolutionary multi-objective optimization of potential fields

    NASA Astrophysics Data System (ADS)

    Herrera Ortiz, Juan Arturo; Rodríguez-Vázquez, Katya; Padilla Castañeda, Miguel A.; Arámbula Cosío, Fernando

    2013-01-01

    This article presents the application of a new multi-objective evolutionary algorithm called RankMOEA to determine the optimal parameters of an artificial potential field for autonomous navigation of a mobile robot. Autonomous robot navigation is posed as a multi-objective optimization problem with three objectives: minimization of the distance to the goal, maximization of the distance between the robot and the nearest obstacle, and maximization of the distance travelled on each field configuration. Two decision makers were implemented, using objective reduction and discrimination in performance trade-offs. The performance of RankMOEA was compared with that of NSGA-II and SPEA2, including both decision makers. Simulation experiments using three different obstacle configurations and 10 different routes were performed with the proposed methodology; RankMOEA clearly outperformed NSGA-II and SPEA2. The robustness of the approach was evaluated by simulating different sensor masks and sensor noise. The scheme was also combined with the wavefront-propagation algorithm for global path planning.
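
    An artificial potential field of the kind being tuned typically combines an attractive term toward the goal with a repulsive term that is active inside an obstacle's influence radius. The gains and radius below are exactly the kind of parameters an evolutionary search would optimize; the values shown are illustrative, not the article's:

```python
import math

def step(pos, goal, obstacles, k_att=1.0, k_rep=0.3, rho0=1.0, lr=0.05):
    """One gradient step on the combined potential field."""
    fx = k_att * (goal[0] - pos[0])  # attractive force toward the goal
    fy = k_att * (goal[1] - pos[1])
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        rho = math.hypot(dx, dy)
        if 0 < rho < rho0:  # repulsion only inside the influence radius
            mag = k_rep * (1 / rho - 1 / rho0) / rho**2
            fx += mag * dx / rho
            fy += mag * dy / rho
    return (pos[0] + lr * fx, pos[1] + lr * fy)

pos, goal = (0.0, 0.0), (3.0, 0.0)
obstacles = [(1.5, 0.2)]  # slightly off the straight-line path
for _ in range(800):
    pos = step(pos, goal, obstacles)
print(round(pos[0], 2), round(pos[1], 2))  # skirts the obstacle, ends near the goal
```

    The three competing objectives in the article map directly onto these parameters: larger k_rep and rho0 favor obstacle clearance, while larger k_att favors goal progress and shorter paths.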

  18. An Intention-Driven Semi-autonomous Intelligent Robotic System for Drinking.

    PubMed

    Zhang, Zhijun; Huang, Yongqian; Chen, Siyuan; Qu, Jun; Pan, Xin; Yu, Tianyou; Li, Yuanqing

    2017-01-01

    In this study, an intention-driven semi-autonomous intelligent robotic (ID-SIR) system is designed and developed to assist severely disabled patients to live independently. The system mainly consists of a non-invasive brain-machine interface (BMI) subsystem, a robot manipulator, and a visual detection and localization subsystem. Unlike most existing systems, which are remotely controlled by joystick, head tracking, or eye tracking, the proposed ID-SIR system acquires the intention directly from the user's brain. Compared with state-of-the-art systems that only work for a specific object in a fixed place, the ID-SIR system can grasp any desired object in an arbitrary place chosen by the user and deliver it to his or her mouth automatically. As one of its main advantages, the patient is required to send only one intention command per drinking task, and the autonomous robot finishes the remaining control tasks, which greatly eases the burden on patients. Eight healthy subjects participated in our experiment, which contained 10 tasks for each subject. In each task, the ID-SIR system delivered the desired beverage container to the mouth of the subject and then put it back in its original position. The mean accuracy across the eight subjects was 97.5%, demonstrating the effectiveness of the ID-SIR system.

  19. Reliability of EUCLIDIAN: An autonomous robotic system for image-guided prostate brachytherapy

    PubMed Central

    Podder, Tarun K.; Buzurovic, Ivan; Huang, Ke; Showalter, Timothy; Dicker, Adam P.; Yu, Yan

    2011-01-01

    Purpose: Recently, several robotic systems have been developed to perform accurate and consistent image-guided brachytherapy. Before introducing a new device into clinical operations, it is important to assess the reliability and mean time before failure (MTBF) of the system. In this article, the authors present the preclinical evaluation and analysis of the reliability and MTBF of an autonomous robotic system developed for prostate seed implantation. Methods: The authors considered three steps that are important in reliability growth analysis: identification and isolation of failures, classification of failures, and trend analysis. For any one-of-a-kind product, reliability enhancement is accomplished through test-fix-test. The authors used failure mode and effect analysis for the collection and analysis of reliability data by identifying and categorizing the failure modes. Failures were classified according to severity. Failures that occurred during the operation of this robotic system were modeled as a nonhomogeneous Poisson process. The failure occurrence trend was analyzed using the Laplace test. For analyzing and predicting reliability growth, commonly used and widely accepted models, Duane’s model and the Army Material Systems Analysis Activity (i.e., Crow’s) model, were applied. The MTBF was used as an important measure for assessing the system’s reliability. Results: During preclinical testing, 3196 seeds (in 53 test cases) were deposited autonomously by the robot and 14 critical failures were encountered. The majority of the failures occurred during the first few cases. The distribution of failures followed Duane’s postulation as well as Crow’s postulation of reliability growth. The Laplace test index was −3.82 (<0), indicating a significant trend in the failure data, and the failure intervals lengthened gradually. The continuous increase in the failure occurrence interval suggested a trend toward improved reliability.
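The Laplace trend test reported in the Results (index −3.82) can be sketched directly from its textbook form for a failure process observed over a window [0, T]; the failure times below are hypothetical inputs, not the study's data:

```python
import math

def laplace_test(failure_times, T):
    """Laplace trend statistic for an NHPP failure process on [0, T].

    A significantly negative value indicates that inter-failure
    intervals are lengthening, i.e. reliability growth.
    """
    n = len(failure_times)
    mean_t = sum(failure_times) / n
    return (mean_t - T / 2.0) / (T * math.sqrt(1.0 / (12.0 * n)))
```

Failures clustered early in the observation window (as in the study, where most failures occurred in the first few cases) yield a strongly negative statistic; uniformly spread failures yield a value near zero.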

  20. Reliability of EUCLIDIAN: An autonomous robotic system for image-guided prostate brachytherapy

    SciTech Connect

    Podder, Tarun K.; Buzurovic, Ivan; Huang, Ke; Showalter, Timothy; Dicker, Adam P.; Yu, Yan

    2011-01-15

    Purpose: Recently, several robotic systems have been developed to perform accurate and consistent image-guided brachytherapy. Before introducing a new device into clinical operations, it is important to assess the reliability and mean time before failure (MTBF) of the system. In this article, the authors present the preclinical evaluation and analysis of the reliability and MTBF of an autonomous robotic system developed for prostate seed implantation. Methods: The authors considered three steps that are important in reliability growth analysis: identification and isolation of failures, classification of failures, and trend analysis. For any one-of-a-kind product, reliability enhancement is accomplished through test-fix-test. The authors used failure mode and effect analysis for the collection and analysis of reliability data by identifying and categorizing the failure modes. Failures were classified according to severity. Failures that occurred during the operation of this robotic system were modeled as a nonhomogeneous Poisson process. The failure occurrence trend was analyzed using the Laplace test. For analyzing and predicting reliability growth, commonly used and widely accepted models, Duane's model and the Army Material Systems Analysis Activity (i.e., Crow's) model, were applied. The MTBF was used as an important measure for assessing the system's reliability. Results: During preclinical testing, 3196 seeds (in 53 test cases) were deposited autonomously by the robot and 14 critical failures were encountered. The majority of the failures occurred during the first few cases. The distribution of failures followed Duane's postulation as well as Crow's postulation of reliability growth. The Laplace test index was -3.82 (<0), indicating a significant trend in the failure data, and the failure intervals lengthened gradually. The continuous increase in the failure occurrence interval suggested a trend toward improved reliability.

  1. A learning-based semi-autonomous controller for robotic exploration of unknown disaster scenes while searching for victims.

    PubMed

    Doroodgar, Barzin; Liu, Yugang; Nejat, Goldie

    2014-12-01

    Semi-autonomous control schemes can address the limitations of both teleoperation and fully autonomous control of rescue robots in disaster environments by allowing a human operator to cooperate with a rescue robot and share tasks such as navigation, exploration, and victim identification. In this paper, we present a unique hierarchical reinforcement learning (HRL)-based semi-autonomous control architecture for rescue robots operating in cluttered and unknown urban search and rescue (USAR) environments. The aim of the controller is to enable a rescue robot to continuously learn from its own experiences in an environment in order to improve its overall performance in exploration of unknown disaster scenes. A direction-based exploration technique is integrated into the controller to expand the search area of the robot via the classification of regions and of the rubble piles within these regions. Both simulations and physical experiments in USAR-like environments verify the robustness of the proposed HRL-based semi-autonomous controller to unknown cluttered scenes with different sizes and varying types of configurations.
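As a rough illustration of the learning rule such a reinforcement-learning controller builds on (the paper's hierarchical architecture, which layers learners over navigation and exploration subtasks, is considerably richer), here is a plain tabular Q-learning update:

```python
def q_update(Q, s, a, r, s_next, alpha=0.1, gamma=0.9):
    """One tabular Q-learning update: move Q[s][a] toward the reward
    plus the discounted best value of the next state. alpha (learning
    rate) and gamma (discount) are illustrative defaults."""
    best_next = max(Q[s_next].values()) if Q[s_next] else 0.0
    Q[s][a] += alpha * (r + gamma * best_next - Q[s][a])
```

Repeated over the robot's own exploration experiences, updates of this kind are what let the controller improve without an explicit model of the disaster scene.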

  2. Developing a Telescope Simulator Towards a Global Autonomous Robotic Telescope Network

    NASA Astrophysics Data System (ADS)

    Giakoumidis, N.; Ioannou, Z.; Dong, H.; Mavridis, N.

    2013-05-01

    A robotic telescope network is a system that integrates a number of telescopes to observe a variety of astronomical targets without being operated by a human. Such a system autonomously selects and observes targets according to an optimized schedule. It dynamically allocates telescope resources depending on the observation requests, the specifications of the telescopes, target visibility, meteorological conditions, daylight, location restrictions and availability, and many other factors. In this paper, we introduce a telescope simulator that can drive a telescope to a desired position in order to observe a specific object. The system includes a Client module, a Server module, and a Dynamic Scheduler module. We use and integrate a number of open-source software packages to simulate the movement of a robotic telescope, the telescope's characteristics, the observational data, and the weather conditions in order to test and optimize our system.
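One greedy pass of a dynamic scheduler of this kind might look as follows; the target fields and scoring weights are invented for illustration and are not the simulator's actual policy:

```python
def pick_target(targets):
    """Choose the observable target with the highest score. Each target
    is a dict with hypothetical fields 'priority', 'altitude_deg', and
    'cloud_cover' (0-1); the weighting is purely illustrative."""
    def score(t):
        # Hard constraints: minimum altitude and acceptable cloud cover.
        if t['altitude_deg'] < 20 or t['cloud_cover'] > 0.5:
            return float('-inf')
        # Soft preferences: request priority, elevation, clear sky.
        return t['priority'] + t['altitude_deg'] / 90.0 - t['cloud_cover']
    best = max(targets, key=score)
    return best if score(best) > float('-inf') else None
```

A real scheduler would re-run such a pass as weather, daylight, and the request queue change, which is exactly what the Dynamic Scheduler module mediates.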

  3. Development and training of a learning expert system in an autonomous mobile robot via simulation

    SciTech Connect

    Spelt, P.F.; Lyness, E.; DeSaussure, G. (Center for Engineering Systems Advanced Research)

    1989-11-01

    The Center for Engineering Systems Advanced Research (CESAR) conducts basic research in the area of intelligent machines. Recently at CESAR, a learning expert system was created to operate on board an autonomous robot working at a process control panel. The authors discuss the two-computer simulation system used to create, evaluate, and train this learning system. The simulation system provides a graphics display of the current status of the process being simulated, and the same program that does the simulating also drives the actual control panel. Simulation results were validated on the actual robot. The speed and safety benefits of using a computerized simulator to train a learning computer, and future uses of the simulation system, are discussed.

  4. Shuttlecock detection system for fully-autonomous badminton robot with two high-speed video cameras

    NASA Astrophysics Data System (ADS)

    Masunari, T.; Yamagami, K.; Mizuno, M.; Une, S.; Uotani, M.; Kanematsu, T.; Demachi, K.; Sano, S.; Nakamura, Y.; Suzuki, S.

    2017-02-01

    Two high-speed video cameras are successfully used to detect the motion of a flying badminton shuttlecock. The shuttlecock detection system is applied to badminton robots that play badminton fully autonomously. The detection system measures the three-dimensional position and velocity of a flying shuttlecock and predicts the position where the shuttlecock will fall to the ground. The badminton robot moves quickly to that position and hits the shuttlecock back into the opponent's side of the court. In a badminton game there is a large audience, and some spectators move behind the flying shuttlecock; such movement is a kind of background noise that makes it difficult to detect the motion of the shuttlecock. The present study demonstrates that such noise can be eliminated by stereo imaging with two high-speed cameras.
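The core depth computation behind two-camera tracking is plain stereo triangulation. A minimal sketch, assuming a rectified, parallel-axis camera pair with known focal length (in pixels) and baseline (the specific values in the usage note are hypothetical):

```python
def stereo_depth(x_left, x_right, focal_px, baseline_m):
    """Depth from horizontal disparity for a parallel-axis stereo pair.

    x_left / x_right are the shuttlecock's image x-coordinates in the
    two cameras; focal_px and baseline_m are assumed calibration values.
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    return focal_px * baseline_m / disparity
```

For example, a 10-pixel disparity with a 1000-pixel focal length and a 0.5 m baseline places the shuttlecock 50 m away; repeated fixes over consecutive frames give the velocity needed to predict the landing point.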

  5. Simulation of Autonomous Robotic Multiple-Core Biopsy by 3D Ultrasound Guidance

    PubMed Central

    Liang, Kaicheng; Rogers, Albert J.; Light, Edward D.; von Allmen, Daniel; Smith, Stephen W.

    2010-01-01

    An autonomous multiple-core biopsy system guided by real-time 3D ultrasound and operated by a robotic arm with 6+1 degrees of freedom has been developed. Using a specimen of turkey breast as a tissue phantom, our system was able to first autonomously locate the phantom in the image volume and then perform needle sticks in each of eight sectors in the phantom in a single session, with no human intervention required. Based on the fraction of eight sectors successfully sampled in an experiment of five trials, a success rate of 93% was recorded. This system could have relevance in clinical procedures that involve multiple needle-core sampling such as prostate or breast biopsy. PMID:20687279

  6. Non-equilibrium assembly of microtubules: from molecules to autonomous chemical robots.

    PubMed

    Hess, H; Ross, Jennifer L

    2017-03-22

    Biological systems have evolved to harness non-equilibrium processes from the molecular to the macro scale. It is currently a grand challenge of chemistry, materials science, and engineering to understand and mimic biological systems that have the ability to autonomously sense stimuli, process these inputs, and respond by performing mechanical work. New chemical systems are responding to the challenge and form the basis for future responsive, adaptive, and active materials. In this article, we describe a particular biochemical-biomechanical network based on the microtubule cytoskeletal filament - itself a non-equilibrium chemical system. We trace the non-equilibrium aspects of the system from molecules to networks and describe how the cell uses this system to perform active work in essential processes. Finally, we discuss how microtubule-based engineered systems can serve as testbeds for autonomous chemical robots composed of biological and synthetic components.

  7. Autonomous robotic capture of non-cooperative target by adaptive extended Kalman filter based visual servo

    NASA Astrophysics Data System (ADS)

    Dong, Gangqi; Zhu, Zheng H.

    2016-05-01

    This paper presents a real-time, vision-based algorithm for the pose and motion estimation of non-cooperative targets and its application to a visual-servo-controlled robotic manipulator performing autonomous capture. A hybrid approach combining an adaptive extended Kalman filter and photogrammetry is developed for the real-time pose and motion estimation of non-cooperative targets. Based on the pose and motion estimates, the desired pose and trajectory of the end-effector are defined, and the corresponding desired joint angles of the robotic manipulator are derived by inverse kinematics. A closed-loop visual servo control scheme is then developed for the robotic manipulator to track, approach, and capture the target. Validating experiments are designed and performed on a custom-built six-degrees-of-freedom robotic manipulator with an eye-in-hand configuration. The experimental results demonstrate the feasibility, effectiveness, and robustness of the proposed adaptive-extended-Kalman-filter-enabled pose and motion estimation and visual servo strategy.
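As a much-simplified stand-in for the adaptive EKF (one pose coordinate, a linear constant-velocity model, and fixed noise levels rather than ones adaptively estimated from the innovations), one predict/update cycle can be sketched as:

```python
def kf_step(x, v, P, z, dt, q=1e-3, r=0.05):
    """One predict/update cycle of a constant-velocity Kalman filter for
    a single pose coordinate of the target. q (process noise) and r
    (measurement noise) are assumed constants here; the paper's adaptive
    EKF tunes such quantities online."""
    # Predict with the constant-velocity model x' = x + v*dt.
    x_pred = x + v * dt
    P = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
          P[0][1] + dt * P[1][1]],
         [P[1][0] + dt * P[1][1],
          P[1][1] + q]]
    # Update with the photogrammetric position measurement z.
    S = P[0][0] + r                      # innovation covariance
    k0, k1 = P[0][0] / S, P[1][0] / S    # Kalman gains
    innov = z - x_pred
    x_new = x_pred + k0 * innov
    v_new = v + k1 * innov
    P_new = [[(1 - k0) * P[0][0], (1 - k0) * P[0][1]],
             [P[1][0] - k1 * P[0][0], P[1][1] - k1 * P[0][1]]]
    return x_new, v_new, P_new
```

Fed a stream of position fixes of a drifting target, the filter converges to both its position and its velocity, which is what lets the servo loop define a capture trajectory ahead of the target's motion.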

  8. Neural network representation of sensor graphs in autonomous robot path planning

    SciTech Connect

    Jorgensen, C.C.

    1987-01-01

    This paper discusses a continuous-valued associative neural network used for anticipatory robot navigation planning in partially learned environments. A navigation methodology is implemented in four steps. First, a room is represented as a lattice of connected voxels formed by dividing the navigation space into equal-sized volumetric cells. Each voxel is associated with a simulated neuron. The magnitude of a neuron's activation corresponds to a probability of voxel occupancy calculated from a series of sonar readings taken by an autonomous robot. Neurons are trained with a series of room patterns derived from varying robot sensor perspectives. At a later time, the robot is exposed to a single perspective of one of the rooms and utilizes the sensor return as a cue to prompt associative recall of a best guess of the complete interior of the room. A two-step path-planning operation is then invoked, which uses line-of-sight readings and anticipated global information to form a trial path plan. The planning process merges a nearest-neighbor grid-cell technique and a simulated-annealing gradient-descent method to optimize traversal movements. In the final step, the path is followed until a mismatch between the estimated room and the actual sensor returns indicates incorrect anticipation. Implementation of the method on a Hypercube computer is discussed along with memory-computation tradeoff requirements.
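The per-voxel occupancy probabilities that drive the neuron activations can be maintained by the standard log-odds fusion of repeated sonar readings. This is an illustration of the general technique, with assumed sensor-model probabilities; the paper's exact update rule may differ:

```python
import math

def logodds(p):
    return math.log(p / (1.0 - p))

def update_cell(p_prior, hit, p_hit=0.7, p_miss=0.4):
    """Fuse one sonar reading into a voxel's occupancy probability via
    the log-odds update. p_hit / p_miss are assumed sensor-model values:
    the probability assigned to the voxel after a return from it (hit)
    or a beam passing through it (miss)."""
    l = logodds(p_prior) + logodds(p_hit if hit else p_miss)
    return 1.0 / (1.0 + math.exp(-l))
```

Starting from an uninformative prior of 0.5, repeated hits push a voxel's probability toward 1 and repeated misses toward 0, giving exactly the graded activation values the lattice of neurons needs.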

  9. 3-D world modeling based on combinatorial geometry for autonomous robot navigation

    SciTech Connect

    Goldstein, M.; Pin, F.G.; de Saussure, G.; Weisbin, C.R.

    1987-01-01

    In applications of robotics to surveillance and mapping at nuclear facilities, the scene to be described is fundamentally three-dimensional. Usually, only partial information concerning the 3-D environment is known a priori. Using an autonomous robot, this information may be updated using range data to provide an accurate model of the environment. Range data quantify the distances from the sensor focal plane to the object surface; in other words, the 3-D coordinates of discrete points on the object surface are known. The approach proposed herein for 3-D world modeling is based on the Combinatorial Geometry (C.G.) method, which is widely used in Monte Carlo particle transport calculations. First, each measured point on the object surface is surrounded by a small solid sphere with a radius determined by the range to that point. Then, the 3-D shapes of the visible surfaces are obtained by taking the (Boolean) union of all the spheres. The result is a concise and unambiguous representation of the object's boundary surfaces. The distances from discrete points on the robot's boundary surface to various objects are calculated efficiently using the C.G. representation. This feature is particularly useful for navigation purposes. The efficiency of the proposed approach is illustrated by a simulation of a spherical robot navigating in a 3-D room with several static obstacles.
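The union-of-spheres representation makes the navigation-critical distance query a one-line minimization: the distance from a point to the union is the smallest of its signed distances to the individual spheres. A sketch (the sphere list and query points are illustrative):

```python
import math

def distance_to_model(point, spheres):
    """Shortest distance from a point to the union of spheres modeling
    the sensed surfaces (negative if the point lies inside some sphere).
    Each sphere is ((cx, cy, cz), radius)."""
    return min(
        math.dist(point, center) - radius
        for center, radius in spheres
    )
```

For a spherical robot of radius R, clearance checking reduces to testing `distance_to_model(center, spheres) > R`, which is why this representation suits navigation.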

  10. Localization of non-linearly modeled autonomous mobile robots using out-of-sequence measurements.

    PubMed

    Besada-Portas, Eva; Lopez-Orozco, Jose A; Lanillos, Pablo; de la Cruz, Jesus M

    2012-01-01

    This paper presents a state of the art of estimation algorithms dealing with Out-of-Sequence (OOS) measurements for non-linearly modeled systems. The survey includes a critical analysis of the algorithms' properties that takes into account the applicability of these techniques to autonomous mobile robot navigation based on the fusion of measurements, delayed and OOS, provided by multiple sensors. In addition, it shows a representative example of the use of one of the most computationally efficient approaches in the localization module of the control software of a real robot (which has non-linear dynamics, and linear and non-linear sensors) and compares its performance against other approaches. The simulated results obtained with the selected OOS algorithm show the computational requirements that each sensor of the robot imposes on it. The real experiments show how the inclusion of the selected OOS algorithm in the control software lets the robot navigate successfully in spite of receiving many OOS measurements. Finally, the comparison highlights that not only is the selected OOS algorithm among the best performing ones of the comparison, but it also has the lowest computational and memory cost.
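The simplest (and most expensive) way to handle an out-of-sequence measurement is to buffer a time-ordered window and re-run the estimator over it; this is the baseline that the surveyed algorithms improve on. In this sketch the "estimator" is just a running mean, standing in for a full filter:

```python
import bisect

class ReprocessingFuser:
    """Handles OOS measurements by keeping a time-sorted buffer and
    re-estimating over the whole ordered window after every arrival.
    A toy baseline: efficient OOS algorithms avoid this full rerun."""

    def __init__(self):
        self.buffer = []  # sorted list of (timestamp, value)

    def add(self, t, z):
        bisect.insort(self.buffer, (t, z))
        # Re-estimate over the correctly ordered window.
        return sum(v for _, v in self.buffer) / len(self.buffer)

f = ReprocessingFuser()
f.add(1, 10.0)
f.add(3, 30.0)
est = f.add(2, 20.0)  # delayed (out-of-sequence) measurement
```

Because the delayed sample is inserted in timestamp order before re-estimation, the result is identical to what in-sequence processing would have produced; the cost of the rerun is exactly what the more efficient OOS algorithms in the survey are designed to avoid.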

  11. Localization of Non-Linearly Modeled Autonomous Mobile Robots Using Out-of-Sequence Measurements

    PubMed Central

    Besada-Portas, Eva; Lopez-Orozco, Jose A.; Lanillos, Pablo; de la Cruz, Jesus M.

    2012-01-01

    This paper presents a state of the art of estimation algorithms dealing with Out-of-Sequence (OOS) measurements for non-linearly modeled systems. The survey includes a critical analysis of the algorithms' properties that takes into account the applicability of these techniques to autonomous mobile robot navigation based on the fusion of measurements, delayed and OOS, provided by multiple sensors. In addition, it shows a representative example of the use of one of the most computationally efficient approaches in the localization module of the control software of a real robot (which has non-linear dynamics, and linear and non-linear sensors) and compares its performance against other approaches. The simulated results obtained with the selected OOS algorithm show the computational requirements that each sensor of the robot imposes on it. The real experiments show how the inclusion of the selected OOS algorithm in the control software lets the robot navigate successfully in spite of receiving many OOS measurements. Finally, the comparison highlights that not only is the selected OOS algorithm among the best performing ones of the comparison, but it also has the lowest computational and memory cost. PMID:22736962

  12. Autonomous Navigation System for Mobile Robot Using Randomly Distributed Passive RFID Tags

    NASA Astrophysics Data System (ADS)

    Park, Sunhong; Hashimoto, Shuji

    This paper presents an autonomous navigation system for a mobile robot using randomly distributed passive RFID tags. With randomly distributed RFID tags, it is difficult to determine the precise location of the robot, especially in areas of sparse tag distribution. This, combined with the wide turning radius of the robot, can cause the robot to follow a zigzag exploration path and miss the goal. In RFID-based navigation, the key is to reduce both the number of RFID tags and the localization error for practical use in a large space. To cope with these issues, we utilized the read time, i.e., the time taken to read each RFID tag. With this, we could accurately estimate the robot's position and orientation without using any external sensors or increasing the number of RFID tags. Average estimation errors of 7.8 cm in position and 11 degrees in orientation were achieved with 102 RFID tags in an area of 4.2 m by 6.2 m. Our proposed method is verified by comparing the path trajectories produced during navigation with those of conventional approaches.
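Using read time as a proximity weight suggests a weighted-centroid position estimate over the tags currently detected. This sketch illustrates the idea only; the paper's estimator is more elaborate, and the tag positions and weights below are hypothetical:

```python
def weighted_centroid(tags):
    """Estimate the robot position as the read-time-weighted centroid of
    the known positions of the detected tags. `tags` is a list of
    ((x, y), read_time) pairs; a longer read time is taken to mean a
    longer dwell within that tag's field, hence greater proximity."""
    total = sum(w for _, w in tags)
    x = sum(p[0] * w for p, w in tags) / total
    y = sum(p[1] * w for p, w in tags) / total
    return x, y
```

A tag read three times longer than its neighbor pulls the estimate three times harder toward its position, which is how read time substitutes for an explicit range sensor.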

  13. Autonomous Multi-Robot Search for a Hazardous Source in a Turbulent Environment

    PubMed Central

    Ristic, Branko; Angley, Daniel; Moran, Bill; Palmer, Jennifer L.

    2017-01-01

    Finding the source of an accidental or deliberate release of a toxic substance into the atmosphere is of great importance for national security. The paper presents a search algorithm for turbulent environments which falls into the class of cognitive (infotaxi) algorithms. Bayesian estimation of the source parameter vector is carried out using the Rao–Blackwell dimension-reduction method, while the robots are controlled autonomously to move in a scalable formation. Estimation and control are carried out in a centralised replicated fusion architecture assuming all-to-all communication. The paper presents a comprehensive numerical analysis of the proposed algorithm, including the search-time and displacement statistics. PMID:28430120
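The Bayesian core of a cognitive search of this kind is a posterior over candidate source locations updated from binary detections. In this sketch the 1-D grid and the detection model `p_detect` are simple stand-ins for the turbulent-dispersion likelihood and Rao-Blackwellized estimator used in the paper:

```python
import math

def entropy(p):
    """Shannon entropy (nats) of a discrete distribution."""
    return -sum(q * math.log(q) for q in p if q > 0)

def bayes_update(prior, robot_cell, hit, p_detect):
    """Posterior over source cells after one binary sensor reading.
    p_detect(d) is an assumed probability of detection at distance d
    from the source."""
    post = []
    for cell, p in enumerate(prior):
        like = p_detect(abs(cell - robot_cell))
        post.append(p * (like if hit else 1.0 - like))
    s = sum(post)
    return [q / s for q in post]
```

An infotaxis-style controller would evaluate candidate robot moves by the expected drop in `entropy` of this posterior and steer the formation toward the most informative one.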

  14. Safety Verification of a Fault Tolerant Reconfigurable Autonomous Goal-Based Robotic Control System

    NASA Technical Reports Server (NTRS)

    Braman, Julia M. B.; Murray, Richard M; Wagner, David A.

    2007-01-01

    Fault tolerance and safety verification of control systems are essential for the success of autonomous robotic systems. A control architecture called Mission Data System (MDS), developed at the Jet Propulsion Laboratory, takes a goal-based control approach. In this paper, a method for converting goal network control programs into linear hybrid systems is developed. The linear hybrid system can then be verified for safety in the presence of failures using existing symbolic model checkers. An example task is simulated in MDS and successfully verified using HyTech, a symbolic model checking software for linear hybrid systems.

  15. Autonomous Multi-Robot Search for a Hazardous Source in a Turbulent Environment.

    PubMed

    Ristic, Branko; Angley, Daniel; Moran, Bill; Palmer, Jennifer L

    2017-04-21

    Finding the source of an accidental or deliberate release of a toxic substance into the atmosphere is of great importance for national security. The paper presents a search algorithm for turbulent environments which falls into the class of cognitive (infotaxi) algorithms. Bayesian estimation of the source parameter vector is carried out using the Rao-Blackwell dimension-reduction method, while the robots are controlled autonomously to move in a scalable formation. Estimation and control are carried out in a centralised replicated fusion architecture assuming all-to-all communication. The paper presents a comprehensive numerical analysis of the proposed algorithm, including the search-time and displacement statistics.

  16. Concept for practical exercises for studying autonomous flying robots in a university environment: part II

    NASA Astrophysics Data System (ADS)

    Gageik, Nils; Dilger, Erik; Montenegro, Sergio; Schön, Stefan; Wildenhein, Rico; Creutzburg, Reiner; Fischer, Arno

    2015-03-01

    The present paper demonstrates the application of quadcopters as educational material for students in aerospace computer science, as it is already in use today. Working with quadrotors teaches students theoretical and practical knowledge in the fields of robotics, control theory, aerospace and electrical engineering, as well as embedded programming and computer science. To this end, the material, concept, realization, and future development of such a course are discussed in this paper. In addition, the paper gives a brief overview of student research projects following the course, which are related to the research and development of fully autonomous quadrotors.

  17. Autonomous global sky monitoring with real-time robotic follow-up

    SciTech Connect

    Vestrand, W Thomas; Davis, H; Wren, J; Wozniak, P; Norman, B; White, R; Bloch, J; Fenimore, E; Hodge, Barry; Jah, Moriba; Rast, Richard

    2008-01-01

    We discuss the development of prototypes for a global grid of advanced 'thinking' sky sentinels and robotic follow-up telescopes that observe the full night sky, providing real-time monitoring by autonomously recognizing anomalous behavior, selecting targets for detailed investigation, and performing real-time anomaly detection to enable rapid recognition of, and a swift response to, transients as they emerge. This T3 global EO grid avoids the limitations imposed by geography and weather to provide persistent monitoring of the night sky.

  18. Performance Analysis and Odometry Improvement of an Omnidirectional Mobile Robot for Outdoor Terrain

    DTIC Science & Technology

    2011-09-01

    omnidirectional mobile robot can perform complex maneuvers (i.e. extremely sharp turning) that cannot be achieved by typical Ackermann steered wheeled vehicles...for the ASOC-driven omnidirectional mobile robot is described as follows: First, the wheel angular velocities of each ASOC [ωi,L , ωi,R] and the...terrain. This is due to wheel slippage that causes miscounts of wheel rotation. In particular, the ASOC-driven omnidirectional mobile robot experiences

  19. Implementation of obstacle-avoidance control for an autonomous omni-directional mobile robot based on extension theory.

    PubMed

    Pai, Neng-Sheng; Hsieh, Hung-Hui; Lai, Yi-Chung

    2012-10-16

    The paper demonstrates a following robot with omni-directional wheels that is able to take action to avoid obstacles. The robot design is based on both fuzzy theory and extension theory. Fuzzy theory was applied to tune the PWM signal driving the motor revolution and to correct path-deviation issues encountered when the robot is moving. Extension theory was used to build a robot obstacle-avoidance model. Various mobile models were developed to handle different types of obstacles. The ultrasonic distance sensors mounted on the robot were used to estimate the distance to obstacles. If an obstacle is encountered, the correlation function is evaluated and the robot avoids the obstacle autonomously using the most appropriate mode. The effectiveness of the proposed approach was verified through several tracking experiments, demonstrating the feasibility of the fuzzy path tracker as well as the extensible collision avoidance system.
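The correlation (dependent) function of extension theory, which the robot evaluates to pick an avoidance mode, has a standard closed form over a classical interval and a larger neighborhood interval; the interval values in the usage note below are illustrative, not the paper's:

```python
def rho(x, a, b):
    """Extension distance of x from the interval [a, b]: negative
    inside the interval, positive outside."""
    return abs(x - (a + b) / 2.0) - (b - a) / 2.0

def correlation(x, classical, neighborhood):
    """Extenics correlation function K(x): positive when x lies in the
    classical interval, negative outside it, approaching larger negative
    values beyond the neighborhood interval."""
    r0 = rho(x, *classical)
    r1 = rho(x, *neighborhood)
    if r1 == r0:
        return -r0  # conventional handling of the degenerate case
    return r0 / (r1 - r0)
```

With a classical interval of, say, (40, 60) cm of obstacle distance for one avoidance mode inside a (0, 100) cm neighborhood, a reading of 50 cm yields K > 0 (the mode applies) while 80 cm yields K < 0, and the robot selects the mode whose K is largest.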

  20. Implementation of Obstacle-Avoidance Control for an Autonomous Omni-Directional Mobile Robot Based on Extension Theory

    PubMed Central

    Pai, Neng-Sheng; Hsieh, Hung-Hui; Lai, Yi-Chung

    2012-01-01

    The paper demonstrates a following robot with omni-directional wheels that is able to take action to avoid obstacles. The robot design is based on both fuzzy theory and extension theory. Fuzzy theory was applied to tune the PWM signal driving the motor revolution and to correct path-deviation issues encountered when the robot is moving. Extension theory was used to build a robot obstacle-avoidance model. Various mobile models were developed to handle different types of obstacles. The ultrasonic distance sensors mounted on the robot were used to estimate the distance to obstacles. If an obstacle is encountered, the correlation function is evaluated and the robot avoids the obstacle autonomously using the most appropriate mode. The effectiveness of the proposed approach was verified through several tracking experiments, demonstrating the feasibility of the fuzzy path tracker as well as the extensible collision avoidance system. PMID:23202029

  1. Tank-automotive robotics

    NASA Astrophysics Data System (ADS)

    Lane, Gerald R.

    1999-07-01

    To provide an overview of tank-automotive robotics. The briefing will contain program overviews and inter-relationships and technology challenges of TARDEC-managed unmanned and robotic ground vehicle programs. Specific emphasis will be placed on technology developments and approaches to achieve semi-autonomous operation and inherent chassis mobility features. Programs to be discussed include: Demo III Experimental Unmanned Vehicle (XUV), Tactical Mobile Robotics (TMR), Intelligent Mobility, Commander's Driver Testbed, Collision Avoidance, and the International Ground Robotics Competition (IGRC). Specifically, the paper will discuss unique exterior/outdoor challenges facing the IGRC competing teams and the synergy created between the IGRC and ongoing DoD semi-autonomous Unmanned Ground Vehicle and DoT Intelligent Transportation System programs. Sensor and chassis approaches to meet the IGRC challenges and obstacles will be shown and discussed. Shortfalls in performance in meeting the IGRC challenges will be identified.

  2. Demonstration of a Spoken Dialogue Interface for Planning Activities of a Semi-autonomous Robot

    NASA Technical Reports Server (NTRS)

    Dowding, John; Frank, Jeremy; Hockey, Beth Ann; Jonsson, Ari; Aist, Gregory

    2002-01-01

    Planning and scheduling in the face of uncertainty and change pushes the capabilities of both planning and dialogue technologies by requiring complex negotiation to arrive at a workable plan. Planning for use of semi-autonomous robots involves negotiation among multiple participants with competing scientific and engineering goals to co-construct a complex plan. In NASA applications this plan construction is done under severe time pressure so having a dialogue interface to the plan construction tools can aid rapid completion of the process. But, this will put significant demands on spoken dialogue technology, particularly in the areas of dialogue management and generation. The dialogue interface will need to be able to handle the complex dialogue strategies that occur in negotiation dialogues, including hypotheticals and revisions, and the generation component will require an ability to summarize complex plans. This demonstration will describe a work in progress towards building a spoken dialogue interface to the EUROPA planner for the purposes of planning and scheduling the activities of a semi-autonomous robot. A prototype interface has been built for planning the schedule of the Personal Satellite Assistant (PSA), a mobile robot designed for micro-gravity environments that is intended for use on the Space Shuttle and International Space Station. The spoken dialogue interface gives the user the capability to ask for a description of the plan, ask specific questions about the plan, and update or modify the plan. We anticipate that a spoken dialogue interface to the planner will provide a natural augmentation or alternative to the visualization interface, in situations in which the user needs very targeted information about the plan, in situations where natural language can express complex ideas more concisely than GUI actions, or in situations in which a graphical user interface is not appropriate.

  3. Demonstration of a Spoken Dialogue Interface for Planning Activities of a Semi-autonomous Robot

    NASA Technical Reports Server (NTRS)

    Dowding, John; Frank, Jeremy; Hockey, Beth Ann; Jonsson, Ari; Aist, Gregory

    2002-01-01

    Planning and scheduling in the face of uncertainty and change pushes the capabilities of both planning and dialogue technologies by requiring complex negotiation to arrive at a workable plan. Planning for use of semi-autonomous robots involves negotiation among multiple participants with competing scientific and engineering goals to co-construct a complex plan. In NASA applications this plan construction is done under severe time pressure so having a dialogue interface to the plan construction tools can aid rapid completion of the process. But, this will put significant demands on spoken dialogue technology, particularly in the areas of dialogue management and generation. The dialogue interface will need to be able to handle the complex dialogue strategies that occur in negotiation dialogues, including hypotheticals and revisions, and the generation component will require an ability to summarize complex plans. This demonstration will describe a work in progress towards building a spoken dialogue interface to the EUROPA planner for the purposes of planning and scheduling the activities of a semi-autonomous robot. A prototype interface has been built for planning the schedule of the Personal Satellite Assistant (PSA), a mobile robot designed for micro-gravity environments that is intended for use on the Space Shuttle and International Space Station. The spoken dialogue interface gives the user the capability to ask for a description of the plan, ask specific questions about the plan, and update or modify the plan. We anticipate that a spoken dialogue interface to the planner will provide a natural augmentation or alternative to the visualization interface, in situations in which the user needs very targeted information about the plan, in situations where natural language can express complex ideas more concisely than GUI actions, or in situations in which a graphical user interface is not appropriate.

  4. Recognition of 3D objects for autonomous mobile robot's navigation in automated shipbuilding

    NASA Astrophysics Data System (ADS)

    Lee, Hyunki; Cho, Hyungsuck

    2007-10-01

Nowadays many parts of the shipbuilding process are automated, but the painting process is not, because of the difficulty of automated on-line painting quality measurement, the harsh painting environment, and the difficulty of robot navigation. Painting automation is nevertheless necessary, because it provides consistent painting film thickness. Furthermore, autonomous mobile robots are strongly required for flexible painting work. However, the main problem in autonomous mobile robot navigation is that many obstacles are not expressed in the CAD data. To overcome this problem, obstacle detection and recognition are necessary so that the robot can avoid obstacles and perform painting work effectively. Many object recognition algorithms have been studied to date; in particular, 2D object recognition methods using intensity images have been widely investigated. In our case, however, no environmental illumination exists, so these methods cannot be used. 3D range data must be used instead, but its drawbacks are high computational cost and long recognition times due to the huge database involved. In this paper, we propose a 3D object recognition algorithm based on PCA (Principal Component Analysis) and NN (Neural Network). The novelty of the algorithm is that the measured 3D range data are transformed into intensity information, after which the PCA and NN algorithms are applied to the transformed intensity information, reducing the processing time and making the data easy to handle, which are weaknesses of previous 3D object recognition research. A set of experimental results is shown to verify the effectiveness of the proposed algorithm.
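The recognition pipeline described in this record (range data rendered as intensity information, reduced with PCA, then classified) can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' implementation; in particular it substitutes a nearest-neighbour classifier for the paper's neural network, and the toy "intensity images" are synthetic.

```python
import numpy as np

def pca_basis(X, k):
    """Top-k principal components of row-vector samples X via SVD."""
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:k]            # rows of Vt are principal directions

def project(x, mean, comps):
    return comps @ (x - mean)      # coordinates in the reduced space

def classify(x, mean, comps, train_feats, train_labels):
    """Nearest-neighbour classification in the reduced PCA space
    (stand-in for the paper's neural network)."""
    f = project(x, mean, comps)
    d = np.linalg.norm(train_feats - f, axis=1)
    return train_labels[int(np.argmin(d))]

# Toy range-to-intensity images: two object classes with distinct levels
rng = np.random.default_rng(0)
class_a = rng.normal(0.2, 0.05, (20, 64))   # far, low-intensity surfaces
class_b = rng.normal(0.8, 0.05, (20, 64))   # near, high-intensity surfaces
X = np.vstack([class_a, class_b])
y = np.array([0] * 20 + [1] * 20)

mean, comps = pca_basis(X, k=5)
feats = np.array([project(x, mean, comps) for x in X])
print(classify(rng.normal(0.8, 0.05, 64), mean, comps, feats, y))  # → 1
```

The dimensionality reduction (64 raw values down to 5 PCA coefficients) is what buys the shorter processing time the abstract emphasizes.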

  5. Control of distributed autonomous robotic systems using principles of pattern formation in nature and pedestrian behavior.

    PubMed

    Molnar, P; Starke, J

    2001-01-01

Self-organized and error-resistant control of distributed autonomous robotic units in a manufacturing environment with obstacles, where the robotic units have to be assigned to manufacturing targets in a cost-effective way, is achieved by using two fundamental principles of nature. First, the mode-selection behavior that appears in pattern formation in physical, chemical, and biological systems is used. Coupled selection equations based on these pattern formation principles can serve as a dynamical-systems approach to assignment problems. These differential equations guarantee the feasibility of the obtained solutions, which is of great importance in industrial applications. Second, a model of behavioral forces is used, which has been successfully applied to describe self-organized crowd behavior of pedestrians. This novel approach includes collision avoidance as well as error resistance. In particular, in systems where failures are of concern, the suggested approach outperforms conventional methods in compensating for sudden external changes such as breakdowns of some robotic units. The capability of this system is demonstrated in computer simulations.
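Coupled selection equations of the kind referenced here can be sketched as a small simulation: each entry of a robot-to-target assignment matrix evolves so that exactly one "mode" per row and column survives, biased towards low cost. The equation form, parameter values, and cost matrix below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def coupled_selection(cost, beta=2.0, kappa=1.0, dt=0.01, steps=5000):
    """Euler integration of coupled selection equations for an
    assignment problem (Starke-style mode selection; parameters
    here are illustrative)."""
    n = cost.shape[0]
    # bias the initial state towards low-cost pairings, plus tiny noise
    xi = 1.0 / (1.0 + cost)
    xi += 0.01 * np.random.default_rng(1).random((n, n))
    for _ in range(steps):
        sq = xi ** 2
        row = sq.sum(axis=1, keepdims=True) - sq   # competitors in same row
        col = sq.sum(axis=0, keepdims=True) - sq   # competitors in same column
        xi += dt * kappa * xi * (1.0 - sq - beta * (row + col))
    return (xi ** 2 > 0.5).astype(int)             # winners approach |xi| = 1

# three robots, three targets; the cheapest feasible assignment is the diagonal
cost = np.array([[0.1, 5.0, 5.0],
                 [5.0, 0.1, 5.0],
                 [5.0, 5.0, 0.1]])
print(coupled_selection(cost))
```

Because the dynamics suppress all but one entry per row and column, the returned matrix is a feasible (permutation-like) assignment, which is the feasibility guarantee the abstract highlights.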

  6. Towards Autonomous Inspection of Space Systems Using Mobile Robotic Sensor Platforms

    NASA Technical Reports Server (NTRS)

    Wong, Edmond; Saad, Ashraf; Litt, Jonathan S.

    2007-01-01

    The space transportation systems required to support NASA's Exploration Initiative will demand a high degree of reliability to ensure mission success. This reliability can be realized through autonomous fault/damage detection and repair capabilities. It is crucial that such capabilities are incorporated into these systems since it will be impractical to rely upon Extra-Vehicular Activity (EVA), visual inspection or tele-operation due to the costly, labor-intensive and time-consuming nature of these methods. One approach to achieving this capability is through the use of an autonomous inspection system comprised of miniature mobile sensor platforms that will cooperatively perform high confidence inspection of space vehicles and habitats. This paper will discuss the efforts to develop a small scale demonstration test-bed to investigate the feasibility of using autonomous mobile sensor platforms to perform inspection operations. Progress will be discussed in technology areas including: the hardware implementation and demonstration of robotic sensor platforms, the implementation of a hardware test-bed facility, and the investigation of collaborative control algorithms.

  7. GNC architecture for autonomous robotic capture of a non-cooperative target: Preliminary concept design

    NASA Astrophysics Data System (ADS)

    Jankovic, Marko; Paul, Jan; Kirchner, Frank

    2016-04-01

Recent studies of the space debris population in low Earth orbit (LEO) have concluded that certain regions have already reached a critical density of objects. This will eventually lead to a cascading process called the Kessler syndrome. The time may have come to seriously consider active debris removal (ADR) missions as the only viable way of preserving the space environment for future generations. Among all objects in the current environment, the SL-8 (Kosmos 3M second stages) rocket bodies (R/Bs) are some of the most suitable targets for future robotic ADR missions. However, to date, autonomous relative navigation to and capture of a non-cooperative target has never been performed. There is therefore a need for more advanced, autonomous and modular systems that can cope with uncontrolled, tumbling objects, and the guidance, navigation and control (GNC) system is one of the most critical. The main objective of this paper is to present a preliminary concept of a modular GNC architecture that should enable a safe and fuel-efficient capture of a known but uncooperative target, such as a Kosmos 3M R/B. In particular, the concept was developed with the most critical part of an ADR mission in mind, i.e. close-range proximity operations, and with state-of-the-art algorithms in the field of autonomous rendezvous and docking. Finally, the hardware-in-the-loop (HIL) testing facility foreseen for the practical evaluation of the developed architecture is briefly described.

  8. Robust Planning for Autonomous Navigation of Mobile Robots in Unstructured, Dynamic Environments: An LDRD Final Report

    SciTech Connect

    EISLER, G. RICHARD

    2002-08-01

This report summarizes the analytical and experimental efforts for the Laboratory Directed Research and Development (LDRD) project entitled ''Robust Planning for Autonomous Navigation of Mobile Robots in Unstructured, Dynamic Environments (AutoNav)''. The project goal was to develop an algorithm-driven, multi-spectral approach to point-to-point navigation characterized by: segmented on-board trajectory planning, self-contained operation without human support for the mission duration, and the development of appropriate sensors and algorithms to navigate unattended. The project was partially successful in achieving gains in sensing, path planning, navigation, and guidance. One of three experimental platforms, the Minimalist Autonomous Testbed, used a repetitive sense-and-re-plan combination to demonstrate the majority of elements necessary for autonomous navigation. However, a critical goal for overall success in arbitrary terrain, that of developing a sensor able to distinguish true obstacles that need to be avoided as a function of vehicle scale, still needs substantial research to bring to fruition.

  9. THERAPIST: Towards an Autonomous Socially Interactive Robot for Motor and Neurorehabilitation Therapies for Children

    PubMed Central

    Calderita, Luis Vicente; Manso, Luis J; Bustos, Pablo; Fernández, Fernando; Bandera, Antonio

    2014-01-01

Background Neurorehabilitation therapies exploiting the use-dependent plasticity of our neuromuscular system are devised to help patients who suffer from injuries or diseases of this system. These therapies take advantage of the fact that motor activity alters the properties of our neurons and muscles, including the pattern of their connectivity, and thus their functionality. Hence, a sensor-motor treatment where patients make certain movements will help them (re)learn how to move the affected body parts. But these traditional rehabilitation processes are usually repetitive and lengthy, reducing motivation and adherence to the treatment, and thus limiting the benefits for the patients. Objective Our goal was to create innovative neurorehabilitation therapies based on THERAPIST, a socially assistive robot. THERAPIST is an autonomous robot that is able to find and execute plans and adapt them to new situations in real-time. The software architecture of THERAPIST monitors and determines the course of action, learns from previous experiences, and interacts with people using verbal and non-verbal channels. THERAPIST can increase the adherence of the patient to the sessions using serious games. Data are recorded and can be used to tailor patient sessions. Methods We hypothesized that pediatric patients would engage better in a therapeutic non-physical interaction with a robot, facilitating the design of new therapies to improve patient motivation. We propose RoboCog, a novel cognitive architecture. This architecture will enhance the effectiveness and time-of-response of complex multi-degree-of-freedom robots designed to collaborate with humans, combining two core elements: a deep and hybrid representation of the current state, both own and observed; and a set of task-dependent planners, working at different levels of abstraction but connected to this central representation through a common interface. Using RoboCog, THERAPIST engages the human partner in an active

  10. THERAPIST: Towards an Autonomous Socially Interactive Robot for Motor and Neurorehabilitation Therapies for Children.

    PubMed

    Calderita, Luis Vicente; Manso, Luis J; Bustos, Pablo; Suárez-Mejías, Cristina; Fernández, Fernando; Bandera, Antonio

    2014-10-07

Neurorehabilitation therapies exploiting the use-dependent plasticity of our neuromuscular system are devised to help patients who suffer from injuries or diseases of this system. These therapies take advantage of the fact that motor activity alters the properties of our neurons and muscles, including the pattern of their connectivity, and thus their functionality. Hence, a sensor-motor treatment where patients make certain movements will help them (re)learn how to move the affected body parts. But these traditional rehabilitation processes are usually repetitive and lengthy, reducing motivation and adherence to the treatment, and thus limiting the benefits for the patients. Our goal was to create innovative neurorehabilitation therapies based on THERAPIST, a socially assistive robot. THERAPIST is an autonomous robot that is able to find and execute plans and adapt them to new situations in real-time. The software architecture of THERAPIST monitors and determines the course of action, learns from previous experiences, and interacts with people using verbal and non-verbal channels. THERAPIST can increase the adherence of the patient to the sessions using serious games. Data are recorded and can be used to tailor patient sessions. We hypothesized that pediatric patients would engage better in a therapeutic non-physical interaction with a robot, facilitating the design of new therapies to improve patient motivation. We propose RoboCog, a novel cognitive architecture. This architecture will enhance the effectiveness and time-of-response of complex multi-degree-of-freedom robots designed to collaborate with humans, combining two core elements: a deep and hybrid representation of the current state, both own and observed; and a set of task-dependent planners, working at different levels of abstraction but connected to this central representation through a common interface. Using RoboCog, THERAPIST engages the human partner in an active interactive process. But Robo

  11. Autonomous charging to enable long-endurance missions for small aerial robots

    NASA Astrophysics Data System (ADS)

    Mulgaonkar, Yash; Kumar, Vijay

    2014-06-01

The past decade has seen an increased interest towards research involving Autonomous Micro Aerial Vehicles (MAVs). The predominant reason for this is their agility and ability to perform tasks too difficult or dangerous for their human counterparts and to navigate into places where ground robots cannot reach. Among MAVs, rotary wing aircraft such as quadrotors have the ability to operate in confined spaces, hover at a given point in space and perch or land on a flat surface. This makes the quadrotor a very attractive aerial platform giving rise to a myriad of research opportunities. The potential of these aerial platforms is severely limited by the constraints on the flight time due to limited battery capacity. This in turn arises from limits on the payload of these rotorcraft. By automating the battery recharging process, creating autonomous MAVs that can recharge their on-board batteries without any human intervention and by employing a team of such agents, the overall mission time can be greatly increased. This paper describes the development, testing, and implementation of a system of autonomous charging stations for a team of Micro Aerial Vehicles. This system was used to perform fully autonomous long-term multi-agent aerial surveillance experiments with persistent station keeping. The scalability of the algorithm used in the experiments described in this paper was also tested by simulating a persistent surveillance scenario for 10 MAVs and charging stations. Finally, this system was successfully implemented to perform a 9½ hour multi-agent persistent flight test. Preliminary implementation of this charging system in experiments involving construction of cubic structures with quadrotors showed a three-fold increase in effective mission time.

  12. Using Simulation to Evaluate Scientific Impact of Autonomous Robotic Capabilities for Mars

    NASA Astrophysics Data System (ADS)

Haldemann, A. F.; McHenry, M. C.; Castano, R. A.; Cameron, J. M.; Estlin, T. A.; Farr, T. G.; Jain, A.; Lee, M.; Leff, C. E.; Lim, C.; Nesnas, I. A.; Petras, R. D.; Pomerantz, M.; Powell, M.; Shu, I.; Wood, J.; Volpe, R.; Gaines, D. M.

    2006-12-01

    The Science Operations On Planetary Surfaces (SOOPS) task was created with the goal of evaluating, developing and validating methods for increasing the productivity of science operations on planetary surfaces. The highly integrated spacecraft-instrument payload systems of planetary surface missions create operational constraints (e.g. power, data volume, number of ground control interactions) that can reduce the effective science capabilities. Technological solutions have been proposed to mitigate the impact of those constraints on science return. For example, enhanced mobility autonomy, robotic arm autonomous deployment, and on- board image analysis have been implemented on the Mars Exploration Rovers. Next generation improvements involve on-board science driven decision-making and data collection. SOOPS takes a systems level approach to science operations and thus to evaluating and demonstrating the potential benefits of technologies that are already in development at the `component level'. A simulation environment---"Field Test in a Box" or SOOPS-FTB---has been developed with realistic terrains and scientifically pertinent information content. The terrain can be explored with a simulated spacecraft and instruments that are operated using an activity planning software interface which closely resembles that used for actual surface spacecraft missions. The simulation environment provides flexibility and control over experiments that help answer "what if" questions about the performance of proposed autonomous technologies. The experiments also help evaluate operator interaction with the autonomous system, and improve the designs of the control tools. We will report the recent results of SOOPS-FTB experiments with an on-board feature mapping capability, which is effectively an autonomous compression scheme. This example illustrates a demonstration of a new software scheme to operate within a known hardware configuration. 
It is also conceivable that SOOPS-FTB could be

  13. A field robot for autonomous laser-based N2O flux measurements

    NASA Astrophysics Data System (ADS)

    Molstad, Lars; Reent Köster, Jan; Bakken, Lars; Dörsch, Peter; Lien, Torgrim; Overskeid, Øyvind; Utstumo, Trygve; Løvås, Daniel; Brevik, Anders

    2014-05-01

    N2O measurements in multi-plot field trials are usually carried out by chamber-based manual gas sampling and subsequent laboratory-based gas chromatographic N2O determination. Spatial and temporal resolution of these measurements are commonly limited by available manpower. However, high spatial and temporal variability of N2O fluxes within individual field plots can add large uncertainties to time- and area-integrated flux estimates. Detailed mapping of this variability would improve these estimates, as well as help our understanding of the factors causing N2O emissions. An autonomous field robot was developed to increase the sampling frequency and to operate outside normal working hours. The base of this system was designed as an open platform able to carry versatile instrumentation. It consists of an electrically motorized platform powered by a lithium-ion battery pack, which is capable of autonomous navigation by means of a combined high precision real-time kinematic (RTK) GPS and an inertial measurement unit (IMU) system. On this platform an elevator is mounted, carrying a lateral boom with a static chamber on each side of the robot. Each chamber is equipped with a frame of plastic foam to seal the chamber when lowered onto the ground by the elevator. N2O flux from the soil covered by the two chambers is sequentially determined by circulating air between each chamber and a laser spectrometer (DLT-100, Los Gatos Research, Mountain View, CA, USA), which monitors the increase in N2O concentration. The target enclosure time is 1 - 2 minutes, but may be longer when emissions are low. CO2 concentrations are determined by a CO2/H2O gas analyzer (LI-840A, LI-COR Inc., Lincoln, NE, USA). Air temperature and air pressure inside both chambers are continuously monitored and logged. Wind speed and direction are monitored by a 3D sonic anemometer on top of the elevator boom. This autonomous field robot can operate during day and night time, and its working hours are only
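The flux calculation implied by this record (soil N2O flux estimated from the linear rise in chamber concentration during a 1-2 minute enclosure) can be sketched as below. The chamber geometry, conditions, and function names are hypothetical illustrations, not values from the paper; the physics is the standard static-chamber relation flux = (dC/dt)·V/A with the ideal gas law converting mole fraction to molar concentration.

```python
import numpy as np

# Hypothetical chamber geometry (not from the paper)
VOLUME_M3 = 0.05      # chamber headspace volume
AREA_M2 = 0.25        # soil surface covered by the chamber
R = 8.314             # gas constant, J mol-1 K-1

def chamber_flux(times_s, conc_ppb, temp_k, pressure_pa):
    """Estimate soil N2O flux (mol m-2 s-1) from the linear increase
    in chamber mole fraction over the enclosure period."""
    # slope of mole fraction vs time, in ppb per second
    slope_ppb_s = np.polyfit(times_s, conc_ppb, 1)[0]
    # ideal gas law: moles of air per cubic metre of headspace
    molar_air = pressure_pa / (R * temp_k)
    # mole-fraction rate -> molar concentration rate (mol m-3 s-1)
    dc_dt = slope_ppb_s * 1e-9 * molar_air
    return dc_dt * VOLUME_M3 / AREA_M2

t = np.arange(0, 120, 10.0)    # 2-minute enclosure, 10 s sampling
c = 330.0 + 0.5 * t            # synthetic 0.5 ppb/s rise over ambient
print(chamber_flux(t, c, temp_k=293.15, pressure_pa=101325.0))
```

Fitting a slope to the whole concentration time series, rather than differencing two points, is what makes short 1-2 minute enclosures workable even when emissions are low and the signal is noisy.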

  14. A real-time expert system for control of an autonomous mobile robot and for diagnosing unexpected occurrences

    SciTech Connect

    Kammer, D.W.; de Saussure, G.; Weisbin, C.R.

    1986-01-01

    The use of an expert system for the control of an autonomous robot presents several attractive features: the explicitness and homogeneity of the knowledge representation facilitates explanation, verification and modification of the rules which determine the robot's behavior, and the domain of competence can be incrementally extended. However, real-time operation poses a number of challenges due to the dynamic nature of the data and the time constraints of dealing with a large data base. An implementation is discussed where a large commercial real-time expert system originally designed for industrial process diagnostics was adapted to the control of an autonomous mobile robot for planning, monitoring, and diagnosis of unexpected occurrences during a navigation task. Control has been successfully implemented for goal directed navigation in the presence of moving obstacles.

  15. Real-time expert system for the control of autonomous robot navigation in the presence of moving obstacles

    SciTech Connect

    deSaussure, G.; Kammer, D.W.; Weisbin, C.R.

    1987-01-01

    The use of an expert system for the control of an autonomous robot presents several attractive features: the explicitness and homogeneity of the knowledge representation facilitates explanation, verification and modification of the rules which determine the robot's behavior, and the domain of competence can be incrementally extended. However, real-time operation poses a number of challenges due to the dynamic nature of the data and the time constraints of dealing with a large database. An implementation is discussed where a large commercial real-time expert system originally designed for industrial process diagnostics was adapted to the control of an autonomous mobile robot for planning, monitoring, and diagnosis of unexpected occurrences during a navigation task. Control has been successfully implemented for goal directed navigation in the presence of moving obstacles.

  16. Where neuroscience and dynamic system theory meet autonomous robotics: a contracting basal ganglia model for action selection.

    PubMed

    Girard, B; Tabareau, N; Pham, Q C; Berthoz, A; Slotine, J-J

    2008-05-01

    Action selection, the problem of choosing what to do next, is central to any autonomous agent architecture. We use here a multi-disciplinary approach at the convergence of neuroscience, dynamical system theory and autonomous robotics, in order to propose an efficient action selection mechanism based on a new model of the basal ganglia. We first describe new developments of contraction theory regarding locally projected dynamical systems. We exploit these results to design a stable computational model of the cortico-baso-thalamo-cortical loops. Based on recent anatomical data, we include usually neglected neural projections, which participate in performing accurate selection. Finally, the efficiency of this model as an autonomous robot action selection mechanism is assessed in a standard survival task. The model exhibits valuable dithering avoidance and energy-saving properties, when compared with a simple if-then-else decision rule.

  17. Self-localization for an autonomous mobile robot based on an omni-directional vision system

    NASA Astrophysics Data System (ADS)

    Chiang, Shu-Yin; Lin, Kuang-Yu; Chia, Tsorng-Lin

    2013-12-01

In this study, we designed an autonomous mobile robot based on the rules of the Federation of International Robot-soccer Association (FIRA) RoboSot category, integrating the techniques of computer vision, real-time image processing, dynamic target tracking, wireless communication, self-localization, motion control, path planning, and control strategy to achieve the contest goal. The self-localization scheme of the mobile robot is based on algorithms applied to the images from its omni-directional vision system. In previous works, we used the image colors of the field goals as reference points, combining either dual-circle or trilateration positioning of the reference points to achieve self-localization of the autonomous mobile robot. However, because the image of the game field is easily affected by ambient light, positioning systems based exclusively on color-model algorithms cause errors. To reduce environmental effects and achieve self-localization of the robot, the proposed algorithm assesses the corners of field lines using the omni-directional vision system. Particularly in the mid-size league of the RoboCup soccer competition, self-localization algorithms based on extracting white lines from the soccer field have become increasingly popular; moreover, white lines are less influenced by light than the color model of the goals. Therefore, we propose an algorithm that transforms the omni-directional image into an unwrapped transformed image, enhancing feature extraction. The process is as follows: First, radial scan-lines are used to process the omni-directional images, reducing the computational load and improving system efficiency. The lines are arranged radially around the center of the omni-directional camera image, resulting in a shorter computational time compared with a traditional Cartesian coordinate system. However, the omni-directional image is a distorted image, which makes it difficult to recognize the
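The radial scan-line unwrapping step this record describes can be sketched as a polar-to-panoramic resampling: each column of the output image is one radial scan-line of the omni-directional image. This is an illustrative nearest-neighbour reconstruction, not the authors' code, and the synthetic mirror image stands in for a real catadioptric frame.

```python
import numpy as np

def unwrap_omni(img, center, r_min, r_max, n_angles=360):
    """Unwrap an omni-directional image into a (radius x angle) panorama
    by sampling along radial scan-lines (nearest-neighbour)."""
    cy, cx = center
    radii = np.arange(r_min, r_max)
    thetas = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
    out = np.zeros((len(radii), n_angles), dtype=img.dtype)
    for j, th in enumerate(thetas):
        ys = np.round(cy + radii * np.sin(th)).astype(int)
        xs = np.round(cx + radii * np.cos(th)).astype(int)
        out[:, j] = img[ys.clip(0, img.shape[0] - 1),
                        xs.clip(0, img.shape[1] - 1)]
    return out

# Synthetic mirror image: a bright ring (like a white field line) at radius 30
img = np.zeros((101, 101), dtype=np.uint8)
yy, xx = np.mgrid[:101, :101]
img[np.abs(np.hypot(yy - 50, xx - 50) - 30) < 1.5] = 255
pano = unwrap_omni(img, center=(50, 50), r_min=10, r_max=50)
# the circular line unwraps to a horizontal stripe at row 30 - r_min = 20
print(pano[20].min())
```

Because only the pixels on the scan-lines are touched, the cost scales with `n_angles * (r_max - r_min)` rather than with the full image area, which is the efficiency gain the abstract attributes to the radial arrangement.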

  18. Autonomous learning based on cost assumptions: theoretical studies and experiments in robot control.

    PubMed

    Ribeiro, C H; Hemerly, E M

    1999-06-01

Autonomous learning techniques are based on experience acquisition. In most realistic applications, experience is time-consuming: it implies sensor reading, actuator control and algorithmic update, constrained by the learning system dynamics. The crudeness of the information upon which classical learning algorithms operate makes such problems too difficult and unrealistic. Nonetheless, additional information for facilitating the learning process should ideally be embedded in such a way that the structural, well-studied characteristics of these fundamental algorithms are maintained. We investigate in this article a more general formulation of the Q-learning method that allows for a spreading of information derived from single updates towards a neighbourhood of the instantly visited state and converges to optimality. We show how this new formulation can be used as a mechanism to safely embed prior knowledge about the structure of the state space, and demonstrate it in a modified implementation of a reinforcement learning algorithm in a real robot navigation task.
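The spreading idea this record describes can be sketched on a toy task: each temporal-difference update is also applied, attenuated, to neighbouring states, so that one costly real-world experience informs a whole region of the state space. The attenuation scheme, task, and parameter values below are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def spread_q_update(Q, s, a, r, s_next, neighbours,
                    alpha=0.5, gamma=0.9, spread=0.3):
    """One Q-learning step whose TD update is also applied, attenuated,
    to neighbouring states (illustrative spreading scheme)."""
    td = r + gamma * Q[s_next].max() - Q[s, a]
    Q[s, a] += alpha * td
    for n in neighbours(s):
        Q[n, a] += alpha * spread * td   # weaker update for nearby states
    return Q

# 1D corridor of 10 states; action 1 moves right, action 0 moves left;
# reward only on reaching the right end.
N = 10
Q = np.zeros((N, 2))
def neighbours(s):
    return [n for n in (s - 1, s + 1) if 0 <= n < N]

rng = np.random.default_rng(0)
for _ in range(200):
    s = 0
    while s < N - 1:
        a = 1 if rng.random() > 0.2 else 0                  # mostly move right
        s_next = min(s + 1, N - 1) if a == 1 else max(s - 1, 0)
        r = 1.0 if s_next == N - 1 else 0.0
        Q = spread_q_update(Q, s, a, r, s_next, neighbours)
        s = s_next
print(np.argmax(Q, axis=1)[:N - 1])   # greedy policy per non-terminal state
```

The neighbourhood function is where prior knowledge about state-space structure enters: states declared "neighbours" are assumed to have similar values, which is exactly the kind of safe embedding the abstract argues for.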

  19. Autonomous learning based on cost assumptions: theoretical studies and experiments in robot control.

    PubMed

    Ribeiro, C H; Hemerly, E M

    2000-02-01

Autonomous learning techniques are based on experience acquisition. In most realistic applications, experience is time-consuming: it implies sensor reading, actuator control and algorithmic update, constrained by the learning system dynamics. The crudeness of the information upon which classical learning algorithms operate makes such problems too difficult and unrealistic. Nonetheless, additional information for facilitating the learning process should ideally be embedded in such a way that the structural, well-studied characteristics of these fundamental algorithms are maintained. We investigate in this article a more general formulation of the Q-learning method that allows for a spreading of information derived from single updates towards a neighbourhood of the instantly visited state and converges to optimality. We show how this new formulation can be used as a mechanism to safely embed prior knowledge about the structure of the state space, and demonstrate it in a modified implementation of a reinforcement learning algorithm in a real robot navigation task.

  20. Portable robot for autonomous venipuncture using 3D near infrared image guidance.

    PubMed

    Chen, Alvin; Nikitczuk, Kevin; Nikitczuk, Jason; Maguire, Tim; Yarmush, Martin

    2013-09-01

Venipuncture is pivotal to a wide range of clinical interventions and is consequently the leading cause of medical injury in the U.S. Complications associated with venipuncture are exacerbated in difficult settings, where the rate of success depends heavily on the patient's physiology and the practitioner's experience. In this paper, we describe a device that improves the accuracy and safety of the procedure by autonomously establishing a peripheral line for blood draws and IVs. The device combines a near-infrared imaging system, computer vision software, and a robotically driven needle within a portable shell. The device operates by imaging and mapping, in real time, the 3D spatial coordinates of subcutaneous veins in order to direct the needle into a designated vein. We demonstrate proof of concept by assessing imaging performance in humans and cannulation accuracy on an advanced phlebotomy training model.

  1. A Survey on Terrain Assessment Techniques for Autonomous Operation of Planetary Robots

    NASA Astrophysics Data System (ADS)

    Sancho-Pradel, D. L.; Gao, Y.

    A key challenge in autonomous planetary surface exploration is the extraction of meaningful information from sensor data, which would allow a good interpretation of the nearby terrain, and a reasonable assessment of more distant areas. In the last decade, the desire to increase the autonomy of unmanned ground vehicles (UGVs), particularly in terms of off-road navigation, has significantly increased the interest in the field of automated terrain classification. Although the field is relatively new, its advances and goals are scattered across different robotic platforms and applications. The objective of this paper is to present a survey of the field from a planetary exploration perspective, bringing together the underlying techniques, existing approaches and relevant applications under a common framework. The aim is to provide a comprehensive overview to the newcomer in the field, and a structured reference for the practitioners.

  2. Portable robot for autonomous venipuncture using 3D near infrared image guidance

    PubMed Central

    Chen, Alvin; Nikitczuk, Kevin; Nikitczuk, Jason; Maguire, Tim; Yarmush, Martin

    2015-01-01

Venipuncture is pivotal to a wide range of clinical interventions and is consequently the leading cause of medical injury in the U.S. Complications associated with venipuncture are exacerbated in difficult settings, where the rate of success depends heavily on the patient's physiology and the practitioner's experience. In this paper, we describe a device that improves the accuracy and safety of the procedure by autonomously establishing a peripheral line for blood draws and IVs. The device combines a near-infrared imaging system, computer vision software, and a robotically driven needle within a portable shell. The device operates by imaging and mapping, in real time, the 3D spatial coordinates of subcutaneous veins in order to direct the needle into a designated vein. We demonstrate proof of concept by assessing imaging performance in humans and cannulation accuracy on an advanced phlebotomy training model. PMID:26120592

  3. Autonomous trajectory generation for mobile robots with non-holonomic and steering angle constraints

    SciTech Connect

    Pin, F.G.; Vasseur, H.A.

    1990-01-01

This paper presents an approach to the trajectory planning of mobile platforms characterized by non-holonomic constraints and constraints on the steering angle and steering angle rate. The approach is based on geometric reasoning and provides deterministic trajectories for all pairs of initial and final configurations (position x, y, and orientation θ) of the robot. Furthermore, the method generates trajectories taking into account the forward and reverse modes of motion of the vehicle, or a combination of these when complex maneuvering is involved or when the environment is obstructed with obstacles. The trajectory planning algorithm is described, and examples of trajectories generated for a variety of environmental conditions are presented. The generation of the trajectories takes only a few milliseconds of run time on a MicroVAX, making the approach quite attractive for use as a real-time motion planner for teleoperated or sensor-based autonomous vehicles in complex environments. 10 refs., 11 figs.
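The steering-angle and steering-rate constraints this record refers to follow from the standard kinematic bicycle model. As a hedged aside (this is the textbook relation, not the paper's planning algorithm), the bounds they impose on a geometric trajectory can be computed directly:

```python
import math

def min_turn_radius(wheelbase_m, max_steer_rad):
    """Minimum turning radius of a car-like (non-holonomic) platform
    under the kinematic bicycle model: R = L / tan(phi_max)."""
    return wheelbase_m / math.tan(max_steer_rad)

def max_curvature_rate(max_steer_rate, wheelbase_m, steer_rad):
    """Bound on how fast path curvature k = tan(phi) / L can change,
    obtained by the chain rule: dk/dt = phi_dot / (L * cos(phi)^2)."""
    return max_steer_rate / (wheelbase_m * math.cos(steer_rad) ** 2)

# Illustrative platform: 1 m wheelbase, 30-degree steering limit
R = min_turn_radius(wheelbase_m=1.0, max_steer_rad=math.radians(30))
print(round(R, 3))   # 1 / tan(30 deg) ≈ 1.732
```

A geometric planner of the kind described must compose its arcs and straight segments so that no arc has radius below `R` and curvature never changes faster than the steering-rate bound allows.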

  4. The Robotically Controlled Telescope (RCT): First Five Years of Fully Autonomous Operation

    NASA Astrophysics Data System (ADS)

    Gelderman, Richard; Carini, Michael T.; Davis, Donald R.; Engle, Scott G.; Guinan, Edward F.; McGruder, Charles H., III; Strolger, Louis-Gregory; Tedesco, Edward F.; Walter, Donald K.

    2011-03-01

    We review the status of the 1.3-meter Robotically Controlled Telescope (RCT), located at Kitt Peak National Observatory in Arizona. Through the efforts of a consortium of institutions, the RCT has been refurbished and automated to obtain optical images in support of a wide variety of astrophysical research investigations. The refurbished RCT came back on line in 2003, with observing undertaken via pre-scheduled scripts. Since 2007 the observatory has operated in a fully autonomous mode to acquire observations for the numerous and diverse research programs being pursued by the consortium membership and for guest observers. Many challenges and obstacles have been overcome throughout the refurbishment and automation, allowing this venerable telescope to continue its productive history.

  5. An algorithm for image clusters detection and identification based on color for an autonomous mobile robot

    SciTech Connect

    Uy, D.L.

    1996-02-01

    An algorithm for the detection and identification of image clusters, or "blobs," based on color information is developed for an autonomous mobile robot. The input image data are first processed using a crisp color fuzzifier, a binary smoothing filter, and a median filter. The processed image data are then input to the cluster detection and identification program. The program employs the concept of an "elastic rectangle" that stretches until the whole blob is finally enclosed in a rectangle. A C program was developed to test the algorithm. The algorithm was tested only on 8x8 image data containing different numbers of blobs, and it works very well in detecting and identifying the image clusters.
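    A minimal sketch of this style of blob detection (an illustrative reconstruction, not the author's C program): flood-fill each connected region of a binary image while stretching a bounding rectangle until it encloses every pixel of the blob.

    ```python
    def find_blobs(image):
        """Detect 4-connected blobs in a binary image and return the
        'elastic' bounding rectangle (rmin, cmin, rmax, cmax) of each."""
        rows, cols = len(image), len(image[0])
        seen = [[False] * cols for _ in range(rows)]
        blobs = []
        for r in range(rows):
            for c in range(cols):
                if image[r][c] and not seen[r][c]:
                    # Grow the rectangle by flood-filling the blob.
                    stack, rect = [(r, c)], [r, c, r, c]
                    seen[r][c] = True
                    while stack:
                        y, x = stack.pop()
                        rect = [min(rect[0], y), min(rect[1], x),
                                max(rect[2], y), max(rect[3], x)]
                        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                            ny, nx = y + dy, x + dx
                            if 0 <= ny < rows and 0 <= nx < cols \
                                    and image[ny][nx] and not seen[ny][nx]:
                                seen[ny][nx] = True
                                stack.append((ny, nx))
                    blobs.append(tuple(rect))
        return blobs
    ```

    On an 8x8 test image like the one the abstract mentions, each blob comes back as one rectangle regardless of its shape.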

  6. Multiresolutional schemata for unsupervised learning of autonomous robots for 3D space operation

    NASA Technical Reports Server (NTRS)

    Lacaze, Alberto; Meystel, Michael; Meystel, Alex

    1994-01-01

    This paper describes a novel approach to the development of a learning control system for an autonomous space robot (ASR) which presents the ASR as a 'baby', that is, a system with no a priori knowledge of the world in which it operates, but with behavior acquisition techniques that allow it to build this knowledge from the experiences of actions within a particular environment (we call it an Astro-baby). The learning techniques are rooted in a recursive algorithm for inductive generation of nested schemata modeled on processes of early cognitive development in humans. The algorithm extracts data from the environment and, by means of correlation and abduction, creates schemata that are used for control. The system is robust enough to deal with a constantly changing environment because such changes provoke the creation of new schemata by generalizing from experiences, while still maintaining minimal computational complexity thanks to the system's multiresolutional nature.

  7. Robotic-Controlled, Autonomous Friction Stir Welding Processes for In-Situ Fabrication, Maintenance, and Repair

    NASA Astrophysics Data System (ADS)

    Zhou, W.

    NASA's new vision of human and robotic missions to the Moon, Mars, and beyond will demand large and permanent infrastructures on the Moon and other planets, including power plants, communication towers, human and biomass habitats, launch and landing facilities, fabrication and repair workshops, and research facilities, so that material utilization and product development can be carried out and subsisted in-situ. The conventional approach of transporting pre-constructed, fabricated structures from Earth to the Moon and planets will no longer be feasible due to limited lifting capacity and the extremely high transportation costs associated with long-duration space travel. To minimize the transport of pre-made large structures between Earth and the Moon and planets, to minimize crew time for the fabrication and assembly of infrastructures on the Moon and planets, and to assure crew safety and maintain quality during the operation, there is a strong need for robotic capabilities for in-situ fabrication, maintenance, and repair. Clearly, development of innovative autonomous in-situ fabrication, maintenance, and repair technologies is crucial to the success of both NASA's unmanned preparation missions and manned exploration missions. In-space material joining is not new to NASA. Many lessons were learned from NASA's International Space Welding Experiment, which employed the electron beam welding process for space welding experiments. Significant safety concerns related to high-energy beams, arcing, spatter, electromagnetic fields, and molten particles were

  8. A Monocular Vision Sensor-Based Obstacle Detection Algorithm for Autonomous Robots.

    PubMed

    Lee, Tae-Jae; Yi, Dong-Hoon; Cho, Dong-Il Dan

    2016-03-01

    This paper presents a monocular vision sensor-based obstacle detection algorithm for autonomous robots. Each individual image pixel at the bottom region of interest is labeled as belonging either to an obstacle or the floor. While conventional methods depend on point tracking for geometric cues for obstacle detection, the proposed algorithm uses the inverse perspective mapping (IPM) method. This method is much more advantageous when the camera is not high off the floor, which makes point tracking near the floor difficult. Markov random field-based obstacle segmentation is then performed using the IPM results and a floor appearance model. Next, the shortest distance between the robot and the obstacle is calculated. The algorithm is tested by applying it to 70 datasets, 20 of which include nonobstacle images where considerable changes in floor appearance occur. The obstacle segmentation accuracies and the distance estimation error are quantitatively analyzed. For obstacle datasets, the segmentation precision and the average distance estimation error of the proposed method are 81.4% and 1.6 cm, respectively, whereas those for a conventional method are 57.5% and 9.9 cm, respectively. For nonobstacle datasets, the proposed method gives 0.0% false positive rates, while the conventional method gives 17.6%.
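    The core of inverse perspective mapping can be sketched as a ray-floor intersection for a pinhole camera whose height above the floor and downward pitch are known: pixels that back-project consistently onto the floor plane belong to the floor, while obstacle pixels violate that geometry. The function below is a generic sketch with assumed parameter names, not the paper's implementation.

    ```python
    import math

    def pixel_to_ground(u, v, f, cx, cy, height, pitch):
        """Inverse perspective mapping for a pinhole camera tilted down
        by `pitch` radians at `height` metres above a flat floor.
        Returns (forward, lateral) floor coordinates in metres, or None
        if the pixel lies on or above the horizon."""
        # Ray through the pixel in camera coordinates (x right, y down, z forward).
        dx, dy, dz = u - cx, v - cy, f
        # Rotate the ray by the camera pitch (about the x axis).
        dy_w = dy * math.cos(pitch) + dz * math.sin(pitch)
        dz_w = -dy * math.sin(pitch) + dz * math.cos(pitch)
        if dy_w <= 0:
            return None          # ray never meets the floor
        s = height / dy_w        # scale factor to reach the floor plane
        return (s * dz_w, s * dx)
    ```

    Once every region-of-interest pixel has a candidate floor coordinate, a segmentation stage (Markov random field in the paper) can label pixels whose appearance or mapping is inconsistent with the floor model as obstacle.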

  9. A Monocular Vision Sensor-Based Obstacle Detection Algorithm for Autonomous Robots

    PubMed Central

    Lee, Tae-Jae; Yi, Dong-Hoon; Cho, Dong-Il “Dan”

    2016-01-01

    This paper presents a monocular vision sensor-based obstacle detection algorithm for autonomous robots. Each individual image pixel at the bottom region of interest is labeled as belonging either to an obstacle or the floor. While conventional methods depend on point tracking for geometric cues for obstacle detection, the proposed algorithm uses the inverse perspective mapping (IPM) method. This method is much more advantageous when the camera is not high off the floor, which makes point tracking near the floor difficult. Markov random field-based obstacle segmentation is then performed using the IPM results and a floor appearance model. Next, the shortest distance between the robot and the obstacle is calculated. The algorithm is tested by applying it to 70 datasets, 20 of which include nonobstacle images where considerable changes in floor appearance occur. The obstacle segmentation accuracies and the distance estimation error are quantitatively analyzed. For obstacle datasets, the segmentation precision and the average distance estimation error of the proposed method are 81.4% and 1.6 cm, respectively, whereas those for a conventional method are 57.5% and 9.9 cm, respectively. For nonobstacle datasets, the proposed method gives 0.0% false positive rates, while the conventional method gives 17.6%. PMID:26938540

  10. Teaching and implementing autonomous robotic lab walkthroughs in a biotech laboratory through model-based visual tracking

    NASA Astrophysics Data System (ADS)

    Wojtczyk, Martin; Panin, Giorgio; Röder, Thorsten; Lenz, Claus; Nair, Suraj; Heidemann, Rüdiger; Goudar, Chetan; Knoll, Alois

    2010-01-01

    After more than 30 years of robots serving classic industrial automation applications, service robots form a constantly growing market, although the big breakthrough is still awaited. Our approach to service robots was driven by the idea of supporting lab personnel in a biotechnology laboratory. After initial development in Germany, a mobile robot platform, extended with an industrial manipulator and the necessary sensors for indoor localization and object manipulation, was shipped to Bayer HealthCare in Berkeley, CA, USA, a global player in the sector of biopharmaceutical products located in the San Francisco Bay Area. The stated goal of the mobile manipulator is to support the off-shift staff in carrying out completely autonomous or guided, remote-controlled lab walkthroughs, which we implement using a recent development of our computer vision group: OpenTL, an integrated framework for model-based visual tracking.

  11. Distributed, Collaborative Human-Robotic Networks for Outdoor Experiments in Search, Identify and Track

    DTIC Science & Technology

    2011-01-11

    design 3.3 Computers: Each robot is designed to mount two Mini-ITX form factor custom computers. Each computer is equipped with a Core 2 Duo Mobile ... curve built from the output of the A* algorithm. The planned paths are then fed into a modified vector polar histogram (VPH) controller, which ... provides motor actuation commands to the Segway platform. The VPH controller continuously aims for a look-ahead point on the path a set distance away

  12. Hedonic quality or reward? A study of basic pleasure in homeostasis and decision making of a motivated autonomous robot.

    PubMed

    Lewis, Matthew; Cañamero, Lola

    2016-10-01

    We present a robot architecture and experiments to investigate some of the roles that pleasure plays in the decision making (action selection) process of an autonomous robot that must survive in its environment. We have conducted three sets of experiments to assess the effect of different types of pleasure (related versus unrelated to the satisfaction of physiological needs) under different environmental circumstances. Our results indicate that pleasure, including pleasure unrelated to need satisfaction, has value for homeostatic management in terms of improved viability and increased flexibility in adaptive behavior.

  13. On the design of neuro-controllers for individual and social learning behaviour in autonomous robots: an evolutionary approach

    NASA Astrophysics Data System (ADS)

    Pini, Giovanni; Tuci, Elio

    2008-06-01

    In biology/psychology, the capability of natural organisms to learn from observation of, and interaction with, conspecifics is referred to as social learning. Roboticists have recently developed an interest in social learning, since it might represent an effective strategy to enhance the adaptivity of a team of autonomous robots. In this study, we show that a methodological approach based on artificial neural networks shaped by evolutionary computation techniques can be successfully employed to synthesise the individual and social learning mechanisms for robots required to learn a desired action (i.e. phototaxis or antiphototaxis).

  14. Visual identification and similarity measures used for on-line motion planning of autonomous robots in unknown environments

    NASA Astrophysics Data System (ADS)

    Martínez, Fredy; Martínez, Fernando; Jacinto, Edwar

    2017-02-01

    In this paper we propose an on-line motion planning strategy for autonomous robots in dynamic and locally observable environments. In this approach, we first visually identify geometric shapes in the environment by filtering images. An ART-2 network is then used to establish the similarity between patterns. The proposed algorithm allows a robot to establish its relative location in the environment and to define its navigation path based on images of the environment and their similarity to reference images. This is an efficient and minimalist method that uses the similarity of landmark view patterns to navigate to the desired destination. Laboratory tests on real prototypes demonstrate the performance of the algorithm.
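    A simplified stand-in for the pattern-matching stage (illustrative only, much simpler than an ART-2 network): compare the current view's feature vector to stored landmark views by cosine similarity, and accept the best match only if it clears a vigilance threshold; otherwise the view is treated as a new landmark.

    ```python
    import math

    def cosine_similarity(a, b):
        """Cosine similarity between two feature vectors."""
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb) if na and nb else 0.0

    def match_landmark(view, references, vigilance=0.9):
        """Return the index of the reference view most similar to the
        current view, or None if no reference clears the vigilance
        threshold (the robot then treats the view as a new landmark)."""
        best, best_sim = None, vigilance
        for i, ref in enumerate(references):
            sim = cosine_similarity(view, ref)
            if sim >= best_sim:
                best, best_sim = i, sim
        return best
    ```

    The vigilance parameter mirrors the role the resonance criterion plays in ART-style networks: it decides when a view is "close enough" to an existing category.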

  15. Hedonic quality or reward? A study of basic pleasure in homeostasis and decision making of a motivated autonomous robot

    PubMed Central

    Lewis, Matthew; Cañamero, Lola

    2016-01-01

    We present a robot architecture and experiments to investigate some of the roles that pleasure plays in the decision making (action selection) process of an autonomous robot that must survive in its environment. We have conducted three sets of experiments to assess the effect of different types of pleasure—related versus unrelated to the satisfaction of physiological needs—under different environmental circumstances. Our results indicate that pleasure, including pleasure unrelated to need satisfaction, has value for homeostatic management in terms of improved viability and increased flexibility in adaptive behavior. PMID:28018120

  16. Autonomous global sky surveillance with real-time robotic follow-up: Night Sky Awareness through Thinking Telescopes Technology

    NASA Astrophysics Data System (ADS)

    Vestrand, T.; Davis, H.; Wren, J.; Wozniak, P.; Norman, B.; White, R.; Bloch, J.; Fenimore, E.; Hogge, B.; Jah, M.; Rast, R.

    We discuss the development of prototypes for a global grid of advanced "thinking" sky sentinels and robotic follow-up telescopes that observe the full night sky to provide real-time monitoring of the night sky by autonomously recognizing anomalous behavior, selecting targets for detailed investigation, and making real-time, follow-up observations. The layered, fault-tolerant, network uses relatively inexpensive robotic EO sensors to provide persistent autonomous monitoring and real-time anomaly detection to enable rapid recognition and a swift response to transients as they emerge. This T3 global EO grid avoids the limitations imposed by geography and weather to provide persistent monitoring of the night sky.

  17. Intelligent behavior generator for autonomous mobile robots using planning-based AI decision making and supervisory control logic

    NASA Astrophysics Data System (ADS)

    Shah, Hitesh K.; Bahl, Vikas; Martin, Jason; Flann, Nicholas S.; Moore, Kevin L.

    2002-07-01

    In earlier research, the Center for Self-Organizing and Intelligent Systems (CSOIS) at Utah State University (USU) was funded by the US Army Tank-Automotive and Armaments Command's (TACOM) Intelligent Mobility Program to develop and demonstrate enhanced mobility concepts for unmanned ground vehicles (UGVs). One of the several outgrowths of this work has been the development of a grammar-based approach to intelligent behavior generation for commanding autonomous robotic vehicles. In this paper we describe the use of this grammar for enabling autonomous behaviors. A supervisory task controller (STC) sequences high-level action commands (taken from the grammar) to be executed by the robot. It takes as input a set of goals and a partial (static) map of the environment and produces, from the grammar, a flexible script (or sequence) of the high-level commands that are to be executed by the robot. The sequence is derived by a planning function that uses a graph-based heuristic search (the A* algorithm). Each action command has specific exit conditions that are evaluated by the STC following each task completion or interruption (in the case of disturbances or new operator requests). Depending on the system's state at task completion or interruption (including updated environmental and robot sensor information), the STC invokes a reactive response. This can include sequencing the pending tasks or initiating a re-planning event, if necessary. Though applicable to a wide variety of autonomous robots, an application of this approach is demonstrated via simulations of ODIS, an omni-directional inspection system developed for security applications.
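    The planning function's graph search can be sketched with a textbook A* over an occupancy grid (a generic illustration, not the STC code):

    ```python
    import heapq
    import itertools

    def a_star(grid, start, goal):
        """A* search over a 4-connected occupancy grid (0 = free, 1 = blocked).
        Returns the path as a list of (row, col) cells, or None if unreachable."""
        def h(cell):  # Manhattan-distance heuristic (admissible on a 4-grid)
            return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])
        tie = itertools.count()  # tie-breaker so the heap never compares cells
        open_set = [(h(start), next(tie), 0, start, None)]
        came_from = {}
        best_g = {start: 0}
        while open_set:
            _, _, g, cell, parent = heapq.heappop(open_set)
            if cell in came_from:
                continue                  # already expanded via a cheaper route
            came_from[cell] = parent
            if cell == goal:              # reconstruct by walking parents back
                path = []
                while cell is not None:
                    path.append(cell)
                    cell = came_from[cell]
                return path[::-1]
            r, c = cell
            for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                nr, nc = nxt
                if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) \
                        and grid[nr][nc] == 0:
                    ng = g + 1
                    if ng < best_g.get(nxt, float('inf')):
                        best_g[nxt] = ng
                        heapq.heappush(
                            open_set, (ng + h(nxt), next(tie), ng, nxt, cell))
        return None
    ```

    In the architecture described above, the nodes would be grammar states rather than grid cells, but the heuristic-search skeleton is the same.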

  18. Real-time expert system for control of an autonomous mobile robot including diagnosis of unexpected occurrences

    SciTech Connect

    Weisbin, C.R.; de Saussure, G.; Kammer, D.W.

    1986-01-01

    An autonomous mobile robot deals with the empirical world, which is never fully predictable, hence it must continually monitor its performance by comparing the actual responses of its sensors to their expected responses. Where a discrepancy occurs, its source must be diagnosed, and on-line corrective actions or replanning may be required. The use of a production system for the control of an autonomous robot presents several attractive features: the explicitness and homogeneity of the knowledge representation facilitate explaining, verifying, and modifying the rules that determine the robot's behavior, and also permit the incremental extension of the domain of competence. However, real-time operation poses a number of challenges due to the dynamic nature of the data and because the system must frequently deal with a large knowledge base in a limited time. An implementation of a control system is discussed in which a large commercial real-time expert system, originally designed for industrial process diagnostics, was adapted to the control of an autonomous mobile robot for planning, executing, and monitoring a set of navigational tasks. One of the essential components of the problem domain is the occurrence of an "unexpected" happening, e.g., when new obstacles are moved into the domain during the robot's traverse, or when an obstacle undetectable by the long-range sonar sensors is suddenly observed by a proximity sensor. In a recent demonstration of the system, the detection of a problem generated an interrupt alarm, a diagnostic procedure, and a new plan, which was successfully executed in real time.
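    The monitor-and-diagnose loop described above can be sketched as a tiny rule table keyed by sensor (hypothetical names throughout; the actual system used a commercial real-time expert system, not this structure):

    ```python
    def monitor(expected, actual, tolerance, rules):
        """Compare expected vs. actual sensor readings and fire the
        diagnostic rule registered for each discrepant sensor.
        Sensors with no registered rule fall back to 'replan'."""
        alarms = []
        for name, exp in expected.items():
            if abs(actual.get(name, exp) - exp) > tolerance:
                handler = rules.get(name, lambda e, a: 'replan')
                alarms.append((name, handler(exp, actual[name])))
        return alarms
    ```

    Each fired rule stands in for the interrupt-alarm/diagnosis/replanning chain the abstract describes.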

  19. Biomimetic evolutionary analysis: testing the adaptive value of vertebrate tail stiffness in autonomous swimming robots.

    PubMed

    Long, J H; Koob, T J; Irving, K; Combie, K; Engel, V; Livingston, N; Lammert, A; Schumacher, J

    2006-12-01

    For early vertebrates, a long-standing hypothesis is that vertebrae evolved as a locomotor adaptation, stiffening the body axis and enhancing swimming performance. While supported by biomechanical data, this hypothesis has not been tested using an evolutionary approach. We did so by extending biomimetic evolutionary analysis (BEA), which builds physical simulations of extinct systems, to include the use of autonomous robots as proxies of early vertebrates competing in a forage navigation task. Modeled after free-swimming larvae of sea squirts (Chordata, Urochordata), three robotic tadpoles ('Tadros'), each with a propulsive tail bearing a biomimetic notochord of variable spring stiffness, k (N m^-1), searched for, oriented to, and orbited in two dimensions around a light source. Within each of ten generations, we selected for increased swimming speed, U (m s^-1), and decreased time to the light source, t (s), average distance from the source, R (m), and wobble maneuvering, W (rad s^-2). In software simulation, we coded two quantitative trait loci (QTL) that determine k: bending modulus, E (N m^-2), and length, L (m). Both QTL were mutated during replication, independently assorted during meiosis and, as haploid gametes, entered into the gene pool in proportion to parental fitness. After random mating created three new diploid genotypes, we fabricated three new offspring tails. In the presence of both selection and chance events (mutation, genetic drift), the phenotypic means of this small population evolved. The classic hypothesis was supported in that k was positively correlated (r^2 = 0.40) with navigational prowess, NP, the dimensionless ratio of U to the product of R, t, and W. However, the plausible adaptive scenario, even in this simplified system, is more complex, since the remaining variance in NP was correlated with the residuals of R and U taken with respect to k, suggesting that changes in k alone are insufficient to explain the evolution of NP.
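    The selection-mutation-assortment loop over the two QTLs can be sketched as follows (a generic evolutionary loop with assumed parameters, not the authors' exact protocol; the stiffness formula is the standard cantilever-beam relation k = 3EI/L^3, with an assumed second moment of area I):

    ```python
    import random

    def evolve(fitness, pop, generations=10, sigma=0.05, seed=1):
        """Minimal evolutionary loop over two quantitative trait loci
        (bending modulus E and length L). Parents are drawn in proportion
        to fitness; the two loci assort independently and mutate."""
        rng = random.Random(seed)
        for _ in range(generations):
            weights = [fitness(E, L) for E, L in pop]
            parents = rng.choices(pop, weights=weights, k=2 * len(pop))
            next_pop = []
            for mom, dad in zip(parents[::2], parents[1::2]):
                # Independent assortment of the two loci, plus mutation.
                E = rng.choice([mom[0], dad[0]]) * (1 + rng.gauss(0, sigma))
                L = rng.choice([mom[1], dad[1]]) * (1 + rng.gauss(0, sigma))
                next_pop.append((E, L))
            pop = next_pop
        return pop

    def stiffness(E, L, I=1e-8):
        """Cantilever-beam spring stiffness k = 3 E I / L^3 (N/m)."""
        return 3 * E * I / L ** 3
    ```

    Using `stiffness` directly as the fitness function would select for stiffer tails; the study instead used a navigational-prowess fitness measured on physical robots.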

  20. Remote Sensing of Radiation Dose Rate by a Robot for Outdoor Usage

    NASA Astrophysics Data System (ADS)

    Kobayashi, T.; Doi, K.; Kanematsu, H.; Utsumi, Y.; Hashimoto, R.; Takashina, T.

    2013-04-01

    In the present paper, the design and prototyping of a telemetry system, in which a GPS receiver, a camera, and a scintillation counter were mounted on a crawler-type traveling vehicle, were carried out for outdoor use such as school playgrounds. The results were as follows: (1) It was confirmed that the crawler-type traveling vehicle can be operated smoothly on school grounds of brick and asphalt (running speed: 17 m/min). (2) It was confirmed that the location information captured by GPS is visible on a Google map, and that incorporated video information can also be played back. (3) A radiation dose rate of 0.09 μSv/h was obtained on the grounds; this value is less than 1/40 of the allowable radiation dose rate for children in Fukushima Prefecture (3.8 μSv/h). (4) As further work, modification for programmed traveling, measurement of the distribution of the radiation dose rate at a school in Fukushima Prefecture, and class delivery on radiation measurement will be carried out.

  1. Autonomous Scheduling of the 1.3-meter Robotically Controlled Telescope (RCT)

    NASA Astrophysics Data System (ADS)

    Strolger, Louis-Gregory; Gelderman, Richard; Carini, Michael T.; Davis, Donald R.; Engle, Scott G.; Guinan, Edward F.; McGruder, Charles H., III; Tedesco, Edward F.; Walter, Donald K.

    2011-03-01

    The 1.3-meter telescope at Kitt Peak operates as a fully robotic instrument for optical imaging. An autonomous scheduling algorithm is an essential component of this observatory, and has been designed to manage numerous requests in various imaging modes in a manner similar to how requests are managed at queue-scheduled observatories, but with greater efficiency. Built from the INSGEN list generator and process spawner originally developed for the Berkeley Automatic Imaging Telescope, the RCT scheduler manages and integrates multi-user observations in real time, according to target and exposure information and program-specific constraints (e.g., user-assigned priority, moon avoidance, airmass, or temporal constraints), while accounting for instrument limitations, meteorological conditions, and other technical constraints. The robust system supports time-critical requests, such as coordinated observations, while also providing short-term (hours) and long-term (days) monitoring capabilities, and one-off observations. We discuss the RCT scheduler, its current decision tree, and future prospects, including integration with active partner-share monitoring (which factors into future observation requests) to ensure fairness and parity of requests.
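    The scheduler's decision step can be sketched as constraint filtering followed by priority selection (an illustrative structure with hypothetical field names; the real scheduler is built on INSGEN and handles far more state):

    ```python
    def next_target(requests, now):
        """Pick the highest-priority request whose constraints (e.g. moon
        avoidance, airmass, temporal windows, modeled as predicates of the
        current time) are all satisfied at `now`; ties are broken by the
        earliest deadline. Returns None if nothing is observable."""
        ok = [r for r in requests if all(c(now) for c in r['constraints'])]
        if not ok:
            return None
        return min(ok, key=lambda r: (-r['priority'], r['deadline']))
    ```

    In a real queue scheduler each predicate would consult ephemerides and telescope state rather than a bare timestamp, but the filter-then-rank shape is the same.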

  2. Adjustably Autonomous Multi-agent Plan Execution with an Internal Spacecraft Free-Flying Robot Prototype

    NASA Technical Reports Server (NTRS)

    Dorais, Gregory A.; Nicewarner, Keith

    2006-01-01

    We present a multi-agent model-based autonomy architecture with monitoring, planning, diagnosis, and execution elements. We discuss an internal spacecraft free-flying robot prototype controlled by an implementation of this architecture and a ground test facility used for development. In addition, we discuss a simplified environmental control and life support system for the spacecraft domain, also controlled by an implementation of this architecture. We discuss adjustable autonomy and how it applies to this architecture. We describe an interface that provides the user situation awareness of both autonomous systems and enables the user to dynamically edit the plans prior to and during execution, as well as to control these agents at various levels of autonomy. This interface also permits the agents to query the user or request the user to perform tasks to help achieve the commanded goals. We conclude by describing a scenario where these two agents and a human interact to cooperatively detect, diagnose, and recover from a simulated spacecraft fault.

  3. A swarm of autonomous miniature underwater robot drifters for exploring submesoscale ocean dynamics.

    PubMed

    Jaffe, Jules S; Franks, Peter J S; Roberts, Paul L D; Mirza, Diba; Schurgers, Curt; Kastner, Ryan; Boch, Adrien

    2017-01-24

    Measuring the ever-changing 3-dimensional (3D) motions of the ocean requires simultaneous sampling at multiple locations. In particular, sampling the complex, nonlinear dynamics associated with submesoscales (<1-10 km) requires new technologies and approaches. Here we introduce the Mini-Autonomous Underwater Explorer (M-AUE), deployed as a swarm of 16 independent vehicles whose 3D trajectories are measured near-continuously, underwater. As the vehicles drift with the ambient flow or execute preprogrammed vertical behaviours, the simultaneous measurements at multiple, known locations resolve the details of the flow within the swarm. We describe the design, construction, control and underwater navigation of the M-AUE. A field programme in the coastal ocean using a swarm of these robots programmed with a depth-holding behaviour provides a unique test of a physical-biological interaction leading to plankton patch formation in internal waves. The performance of the M-AUE vehicles illustrates their novel capability for measuring submesoscale dynamics.
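    The depth-holding behaviour mentioned above can be sketched as a PD loop on depth and descent rate (a generic sketch with assumed gains and a crude drag model, not the M-AUE buoyancy controller):

    ```python
    def depth_hold_step(depth, rate, target, kp=0.4, kd=0.8):
        """One step of a PD depth-holding law: returns the commanded
        buoyancy-engine effort (positive = drive downward)."""
        return kp * (target - depth) - kd * rate

    def simulate(target=10.0, steps=200, dt=0.5):
        """Toy vertical dynamics: effort changes the descent rate, a crude
        drag factor damps it, and depth integrates the rate."""
        depth, rate = 0.0, 0.0
        for _ in range(steps):
            u = depth_hold_step(depth, rate, target)
            rate += u * dt          # effort accelerates the vehicle
            rate *= 0.9             # crude hydrodynamic drag
            depth += rate * dt
        return depth
    ```

    With the assumed gains the closed loop is stable, so the toy vehicle settles at the target depth; the real drifters additionally drift laterally with the ambient flow, which is exactly what the swarm measures.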

  4. A swarm of autonomous miniature underwater robot drifters for exploring submesoscale ocean dynamics

    PubMed Central

    Jaffe, Jules S.; Franks, Peter J. S.; Roberts, Paul L. D.; Mirza, Diba; Schurgers, Curt; Kastner, Ryan; Boch, Adrien

    2017-01-01

    Measuring the ever-changing 3-dimensional (3D) motions of the ocean requires simultaneous sampling at multiple locations. In particular, sampling the complex, nonlinear dynamics associated with submesoscales (<1–10 km) requires new technologies and approaches. Here we introduce the Mini-Autonomous Underwater Explorer (M-AUE), deployed as a swarm of 16 independent vehicles whose 3D trajectories are measured near-continuously, underwater. As the vehicles drift with the ambient flow or execute preprogrammed vertical behaviours, the simultaneous measurements at multiple, known locations resolve the details of the flow within the swarm. We describe the design, construction, control and underwater navigation of the M-AUE. A field programme in the coastal ocean using a swarm of these robots programmed with a depth-holding behaviour provides a unique test of a physical–biological interaction leading to plankton patch formation in internal waves. The performance of the M-AUE vehicles illustrates their novel capability for measuring submesoscale dynamics. PMID:28117837

  5. A swarm of autonomous miniature underwater robot drifters for exploring submesoscale ocean dynamics

    NASA Astrophysics Data System (ADS)

    Jaffe, Jules S.; Franks, Peter J. S.; Roberts, Paul L. D.; Mirza, Diba; Schurgers, Curt; Kastner, Ryan; Boch, Adrien

    2017-01-01

    Measuring the ever-changing 3-dimensional (3D) motions of the ocean requires simultaneous sampling at multiple locations. In particular, sampling the complex, nonlinear dynamics associated with submesoscales (<1-10 km) requires new technologies and approaches. Here we introduce the Mini-Autonomous Underwater Explorer (M-AUE), deployed as a swarm of 16 independent vehicles whose 3D trajectories are measured near-continuously, underwater. As the vehicles drift with the ambient flow or execute preprogrammed vertical behaviours, the simultaneous measurements at multiple, known locations resolve the details of the flow within the swarm. We describe the design, construction, control and underwater navigation of the M-AUE. A field programme in the coastal ocean using a swarm of these robots programmed with a depth-holding behaviour provides a unique test of a physical-biological interaction leading to plankton patch formation in internal waves. The performance of the M-AUE vehicles illustrates their novel capability for measuring submesoscale dynamics.

  6. Towards autonomous locomotion: CPG-based control of smooth 3D slithering gait transition of a snake-like robot.

    PubMed

    Bing, Zhenshan; Cheng, Long; Chen, Guang; Röhrbein, Florian; Huang, Kai; Knoll, Alois

    2017-04-04

    Snake-like robots with 3D locomotion ability have significant advantages in adaptive travel over diverse, complex terrain compared with traditional legged or wheeled mobile robots. Despite numerous developed gaits, these snake-like robots suffer from unsmooth gait transitions when changing locomotion speed, direction, and body shape, which can cause undesired movement and abnormal torque. Hence, there exists a knowledge gap for snake-like robots to achieve autonomous locomotion. To address this problem, this paper presents smooth slithering gait transition control based on a lightweight central pattern generator (CPG) model for snake-like robots. First, based on the convergence behavior of the gradient system, a lightweight CPG model with fast computing time was designed and compared with other widely adopted CPG models. Then, by reshaping the body into a more stable geometry, the slithering gait was modified and studied based on the proposed CPG model, including gait transitions of locomotion speed, moving direction, and body shape. In contrast to the sinusoid-based method, extensive simulations and prototype experiments demonstrated that smooth slithering gait transitions can be effectively achieved using the proposed CPG-based control method without generating undesired locomotion or abnormal torque.
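    Two ingredients of such a controller can be sketched generically (illustrative only, not the paper's CPG model): a chain of phase-lagged oscillators that produces the travelling body wave of slithering, and first-order convergence of each gait parameter toward its target, which is what makes transitions smooth instead of abrupt.

    ```python
    import math

    def cpg_joint_angles(t, n_joints=8, amp=0.5, freq=1.0,
                         phase_lag=math.pi / 4, offset=0.0):
        """Steady-state output of a chain of phase-coupled oscillators:
        joint i lags joint i-1 by a fixed phase, producing a travelling
        body wave; a nonzero `offset` steers the snake."""
        return [offset + amp * math.sin(2 * math.pi * freq * t - i * phase_lag)
                for i in range(n_joints)]

    def smooth_param(current, target, rate, dt):
        """First-order convergence of a gait parameter (amplitude, offset,
        frequency) toward its target, so that commanding a new gait never
        produces a step change in the joint trajectories."""
        return current + rate * (target - current) * dt
    ```

    When the operator commands a new speed or heading, only the targets change; `smooth_param` slews the oscillator parameters continuously, which is the convergence behaviour the abstract attributes to the gradient system.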

  7. Blending of brain-machine interface and vision-guided autonomous robotics improves neuroprosthetic arm performance during grasping.

    PubMed

    Downey, John E; Weiss, Jeffrey M; Muelling, Katharina; Venkatraman, Arun; Valois, Jean-Sebastien; Hebert, Martial; Bagnell, J Andrew; Schwartz, Andrew B; Collinger, Jennifer L

    2016-03-18

    Recent studies have shown that brain-machine interfaces (BMIs) offer great potential for restoring upper limb function. However, grasping objects is a complicated task and the signals extracted from the brain may not always be capable of driving these movements reliably. Vision-guided robotic assistance is one possible way to improve BMI performance. We describe a method of shared control where the user controls a prosthetic arm using a BMI and receives assistance with positioning the hand when it approaches an object. Two human subjects with tetraplegia used a robotic arm to complete object transport tasks with and without shared control. The shared control system was designed to provide a balance between BMI-derived intention and computer assistance. An autonomous robotic grasping system identified and tracked objects and defined stable grasp positions for these objects. The system identified when the user intended to interact with an object based on the BMI-controlled movements of the robotic arm. Using shared control, BMI-controlled movements and autonomous grasping commands were blended to ensure secure grasps. Both subjects were more successful on object transfer tasks when using shared control compared to BMI control alone. Movements made using shared control were more accurate, more efficient, and less difficult. One participant attempted a task with multiple objects and successfully lifted one of two closely spaced objects in 92% of trials, demonstrating the potential for users to accurately execute their intention while using shared control. Integration of BMI control with vision-guided robotic assistance led to improved performance on object transfer tasks. Providing assistance while maintaining generalizability will make BMI systems more attractive to potential users. Trial registrations: NCT01364480 and NCT01894802.
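    The shared-control arbitration can be sketched as a convex blend of the two command streams, with the blending weight raised as the hand nears a graspable object (an illustrative sketch, not the study's exact arbitration policy):

    ```python
    def blend_command(bmi_cmd, auto_cmd, arbitration):
        """Linearly blend a BMI-decoded velocity command with an autonomous
        grasp-controller command. `arbitration` in [0, 1] is the weight on
        the autonomous command, typically increased as the hand approaches
        an object the user appears to be targeting."""
        assert 0.0 <= arbitration <= 1.0
        return [(1 - arbitration) * b + arbitration * a
                for b, a in zip(bmi_cmd, auto_cmd)]
    ```

    At `arbitration = 0` the user is in full control; near an object the weight rises so the autonomous system can fine-position the hand for a secure grasp while the user's intention still dominates gross motion.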

  8. The VIPER project (Visualization Integration Platform for Exploration Research): a biologically inspired autonomous reconfigurable robotic platform for diverse unstructured environments

    NASA Astrophysics Data System (ADS)

    Schubert, Oliver J.; Tolle, Charles R.

    2004-09-01

    Over the last decade the world has seen numerous autonomous vehicle programs. Wheel and track designs are the basis for many of these vehicles, primarily for four main reasons: a vast preexisting knowledge base for these designs, the energy efficiency of power sources, the scalability of actuators, and the lack of control-system technologies for handling alternative, highly complex distributed systems. Though large efforts seek to improve the mobility of these vehicles, many limitations still exist within unstructured environments, e.g. limited mobility within industrial and nuclear accident sites where existing plant configurations have been extensively changed. These unstructured operational environments include missions for exploration, reconnaissance, and emergency recovery of objects within reconfigured or collapsed structures, e.g. bombed buildings. More importantly, these environments present a clear and present danger for direct human interactions during the initial phases of recovery operations. Clearly, the current classes of autonomous vehicles are incapable of performing in these environments. Thus the next generation of designs must include highly reconfigurable and flexible autonomous robotic platforms. This new breed of autonomous vehicles will be both highly flexible and environmentally adaptable. Presented in this paper is one of the most successful designs from nature, the snake-eel-worm (SEW). This design implements shape memory alloy (SMA) actuators, which allow robotic SEW designs to scale from the sub-micron level to heavy industrial implementations without the major conceptual redesigns required in traditional hydraulic, pneumatic, or motor-driven systems. Autonomous vehicles based on the SEW design possess the ability to move easily between air-based and fluid-based environments with limited or no reconfiguration. Under a SEW-designed vehicle, one not only achieves vastly improved maneuverability within a

  9. Passive Optically Encoded Transponder (POET) An Acquisition And Alignment Target For Autonomous Robotics

    NASA Astrophysics Data System (ADS)

    White, G. K.

    1987-02-01

Relative position information concerning an object that is to be acquired, attached to, or manipulated in some way by a robotic system is usually supplied by a known database or through vision information of some kind. Vision systems normally require some degree of intelligence to produce complete position information and are therefore relatively sophisticated, slow, or both. Simple "targets" require some amount of pattern recognition in autonomous operations and do not usually lend themselves to precision applications. This paper describes work on a discrete-optical-element prototype target which, when interrogated by a video camera system, will provide noncontact relative position information about all 6 degrees of freedom (DOF). This information is available within the active field of view (FOV) of the transponder and could be processed by microprocessor-based software algorithms with simple pattern recognition capabilities. The interrogation system (camera) is composed of a standard charge injection device (CID) array video camera, a controllable macrozoom lens, a liquid crystal shutter (LCS), and a point-source multispectral illuminator. This allows the transponder to be used where a standard video camera vision system is needed, or already implemented, and results in a relatively fast system (approximately 10 Hz). A passive optically encoded transponder (POET) implemented in a "stick-on" holographic optical element (HOE) is proposed as a next-generation target, to supply relative position information in all 6 DOF for acquisition and precision alignment. In applications requiring maximum bandwidth and resolution, the fact that no pattern recognition is required in the proposed system makes it possible to interrogate the transponder in real time with a dedicated nonvision interrogation system, resulting in a multi-order-of-magnitude increase in speed. The transponder (target) is configured to provide optimum information for the intended use. Being

  10. Control Algorithms and Simulated Environment Developed and Tested for Multiagent Robotics for Autonomous Inspection of Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Wong, Edmond

    2005-01-01

    The NASA Glenn Research Center and academic partners are developing advanced multiagent robotic control algorithms that will enable the autonomous inspection and repair of future propulsion systems. In this application, on-wing engine inspections will be performed autonomously by large groups of cooperative miniature robots that will traverse the surfaces of engine components to search for damage. The eventual goal is to replace manual engine inspections that require expensive and time-consuming full engine teardowns and allow the early detection of problems that would otherwise result in catastrophic component failures. As a preliminary step toward the long-term realization of a practical working system, researchers are developing the technology to implement a proof-of-concept testbed demonstration. In a multiagent system, the individual agents are generally programmed with relatively simple controllers that define a limited set of behaviors. However, these behaviors are designed in such a way that, through the localized interaction among individual agents and between the agents and the environment, they result in self-organized, emergent group behavior that can solve a given complex problem, such as cooperative inspection. One advantage to the multiagent approach is that it allows for robustness and fault tolerance through redundancy in task handling. In addition, the relatively simple agent controllers demand minimal computational capability, which in turn allows for greater miniaturization of the robotic agents.
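The core claim above, that trivially simple per-agent rules can still accomplish a group task with graceful degradation, can be caricatured in a few lines. The sketch below is an illustration only (not NASA Glenn's controllers): each agent performs a seeded random walk on a wrap-around grid standing in for an engine surface, and the team's visited-cell fraction stands in for inspection coverage.

```python
# Illustrative sketch of the multiagent idea above (not NASA Glenn's actual
# controllers): each agent runs the same trivial rule (a random walk), yet
# the group collectively "inspects" most of a simulated surface, and losing
# an agent only degrades coverage instead of causing outright failure.
import random

def coverage(n_agents, steps=3000, size=20, seed=1):
    """Fraction of a size x size wrap-around grid visited by the team."""
    rng = random.Random(seed)
    agents = [(rng.randrange(size), rng.randrange(size)) for _ in range(n_agents)]
    visited = set(agents)
    for _ in range(steps):
        for i, (x, y) in enumerate(agents):
            dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            x, y = (x + dx) % size, (y + dy) % size   # wrap around the surface
            agents[i] = (x, y)
            visited.add((x, y))
    return len(visited) / (size * size)

print(coverage(8))   # near-complete coverage with the full team
print(coverage(7))   # still high with one agent lost: redundancy, not failure
```

The redundancy argument in the abstract corresponds to the second call: removing an agent lowers coverage slightly rather than breaking the task.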

  11. Coastal zone environment measurements at Sakhalin Island using autonomous mobile robotic system

    NASA Astrophysics Data System (ADS)

    Tyugin, Dmitry; Kurkin, Andrey; Zaytsev, Andrey; Zeziulin, Denis; Makarov, Vladimir

    2017-04-01

To perform continuous complex measurements of environment characteristics in coastal zones, an autonomous mobile robotic system (AMRS) was built. The main advantage of such a system over manual measurements is the ability to quickly relocate the equipment and begin measuring. The AMRS can transport a set of sensors and an appropriate power source over long distances. The equipment installed on the AMRS includes: a modern high-tech ship's radar, «Micran», for sea-wave measurements; a multiparameter platform, WXT 520, for weather monitoring; a high-precision GPS/GLONASS receiver, OS-203, for georeferencing; a laser scanner platform based on two Sick LMS-511 scanners, which provides 3D distance measurements up to 80 meters along the AMRS route; and a ruggedized quad-core fanless computer, Matrix MXE-5400, for data collection and recording. The equipment is controlled by high-performance modular software developed specifically for the AMRS. During the summer of 2016 an experiment was conducted: measurements took place in the coastal zone of Sakhalin Island (Russia). The measuring system of the AMRS ran in automatic mode under software control. As a result, a large amount of data was collected and stored in a database, comprising continuous measurements of the coastal zone under different weather conditions. Of particular interest is a three-point storm detected on June 2, 2016. Further work will address processing of the measured environment characteristics and verification of numerical models against the collected data. The presented results were obtained with the support of the Russian president's scholarship for young scientists and graduate students № SP-193.2015.5.

  12. Performance of multiple tasks by an autonomous robot using visual and ultrasound sensing

    SciTech Connect

    Beckerman, M.; Barnett, D.L.; Dickens, M. ); Weisbin, C.R. )

    1990-01-01

    While there have been many successful mobile robot experiments, only a few papers have addressed issues pertaining to the range of applicability, or robustness, of robotic systems. The purpose of this paper is to report results of a series of benchmark experiments done to determine and quantify the robustness of an integrated hardware and software system of a mobile robot. 5 refs., 6 figs.

  13. Assessing the Impact of an Autonomous Robotics Competition for STEM Education

    ERIC Educational Resources Information Center

    Chung, C. J. ChanJin; Cartwright, Christopher; Cole, Matthew

    2014-01-01

    Robotics competitions for K-12 students are popular, but are students really learning and improving their STEM scores through robotics competitions? If not, why not? If they are, how much more effective is learning through competitions than traditional classes? Is there room for improvement? What is the best robotics competition model to maximize…

  14. Master's in Autonomous Systems: An Overview of the Robotics Curriculum and Outcomes at ISEP, Portugal

    ERIC Educational Resources Information Center

    Silva, E.; Almeida, J.; Martins, A.; Baptista, J. P.; Campos Neves, B.

    2013-01-01

    Robotics research in Portugal is increasing every year, but few students embrace it as one of their first choices for study. Until recently, job offers for engineers were plentiful, and those looking for a degree in science and technology would avoid areas considered to be demanding, like robotics. At the undergraduate level, robotics programs are…

  17. Application of chaotic dynamics in a recurrent neural network to control: hardware implementation into a novel autonomous roving robot.

    PubMed

    Li, Yongtao; Kurata, Shuhei; Morita, Shogo; Shimizu, So; Munetaka, Daigo; Nara, Shigetoshi

    2008-09-01

Originating from the viewpoint that complex/chaotic dynamics plays an important role in biological systems, including brains, chaotic dynamics introduced in a recurrent neural network was applied to control. The results of computer experiments were successfully implemented in a novel autonomous roving robot, which can catch only rough, uncertain target information through a few sensors. The robot was employed to solve practical two-dimensional mazes using adaptive neural dynamics generated by the recurrent neural network, in which four simple prototype motions are embedded. Adaptive switching of a system parameter in the neural network results in stationary motion or chaotic motion, depending on the dynamical situation. The results of hardware implementation and practical experiments show that, in the given two-dimensional mazes, the robot can successfully avoid obstacles and reach the target. Therefore, we believe that chaotic dynamics has novel potential capability in control and could be utilized in practical engineering applications.
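The parameter-switching idea in this abstract can be caricatured with any simple system that has both a stable and a chaotic regime. The sketch below uses a logistic map as a stand-in for the recurrent network (an assumption for illustration only; `step`, `trajectory`, and the parameter `r` are hypothetical names, not from the paper): one parameter value yields a repeatable "prototype motion", another yields wandering, exploratory motion.

```python
# Stand-in for the recurrent neural dynamics: the logistic map, whose
# parameter r (like the paper's switched system parameter) selects between
# a stationary regime and a chaotic, exploratory regime.

def step(x, r):
    """One update of the stand-in dynamics: x <- r * x * (1 - x)."""
    return r * x * (1.0 - x)

def trajectory(x0, r, n):
    xs = [x0]
    for _ in range(n):
        xs.append(step(xs[-1], r))
    return xs

# r = 2.8: the dynamics settle onto a fixed point -> repeatable motion.
settled = trajectory(0.3, 2.8, 200)[-1]

# r = 3.9: the dynamics are chaotic -> non-repeating, wandering motion.
wandering = trajectory(0.3, 3.9, 200)[-50:]

print(round(settled, 4))                 # near the fixed point 1 - 1/r
print(max(wandering) - min(wandering))   # large spread: no settling
```

Switching `r` on the fly, as the paper switches its system parameter, alternates the "robot" between exploiting an embedded motion and exploring chaotically.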

  18. Current challenges in autonomous vehicle development

    NASA Astrophysics Data System (ADS)

    Connelly, J.; Hong, W. S.; Mahoney, R. B., Jr.; Sparrow, D. A.

    2006-05-01

    The field of autonomous vehicles is a rapidly growing one, with significant interest from both government and industry sectors. Autonomous vehicles represent the intersection of artificial intelligence (AI) and robotics, combining decision-making with real-time control. Autonomous vehicles are desired for use in search and rescue, urban reconnaissance, mine detonation, supply convoys, and more. The general adage is to use robots for anything dull, dirty, dangerous or dumb. While a great deal of research has been done on autonomous systems, there are only a handful of fielded examples incorporating machine autonomy beyond the level of teleoperation, especially in outdoor/complex environments. In an attempt to assess and understand the current state of the art in autonomous vehicle development, a few areas where unsolved problems remain became clear. This paper outlines those areas and provides suggestions for the focus of science and technology research. The first step in evaluating the current state of autonomous vehicle development was to develop a definition of autonomy. A number of autonomy level classification systems were reviewed. The resulting working definitions and classification schemes used by the authors are summarized in the opening sections of the paper. The remainder of the report discusses current approaches and challenges in decision-making and real-time control for autonomous vehicles. Suggested research focus areas for near-, mid-, and long-term development are also presented.

  19. Perception for Outdoor Navigation

    DTIC Science & Technology

    1991-12-01

    driving in traffic. The fifth and final chapter, ’Combining artificial neural networks and symbolic processing for autonomous robot guidance’, shows how we combine neural nets with map data in a complete system.

  20. A New, Open and Modular Platform for Research in Autonomous Four-Legged Robots

    NASA Astrophysics Data System (ADS)

    Friedmann, Martin; Petters, Sebastian; Risler, Max; Sakamoto, Hajime; von Stryk, Oskar; Thomas, Dirk

In this paper the design goals for a new, open and modular four-legged robot platform are described, developed in reaction to the open call for a standard platform issued by the RoboCup Federation in 2006. The new robot should have motion and sensing capabilities similar to those of the previously used Sony AIBO, plus several new ones. The hardware and software should be open, modular and reconfigurable. The robot should be reasonably priced and allow annual upgrades.

  1. Autonomous Power: From War to Peace in the I-Robot Millennium

    DTIC Science & Technology

    2015-02-25

operationalization of autonomous power at the highest intergovernmental level. SUBJECT TERMS: Autonomy, National Power, Artificial Intelligence. ...strategy for applying ways and means outstrips desired ends, the resulting outcome is risk. Autonomous power, the fruitful combination of artificial ... artificial speciation and successfully created novel species, contributing innumerable benefits to mankind and the world. In the not too distant

  2. Operator-centered control of a semi-autonomous industrial robot

    SciTech Connect

    Spelt, P.F.; Jones, S.L.

    1994-12-31

This paper presents work done by Oak Ridge National Laboratory and Remotec, Inc., to develop a new operator-centered control system for Remotec's Andros telerobot. Andros robots are presently used by numerous electric utilities, the armed forces, and law enforcement agencies to perform tasks which are hazardous for human operators. This project has automated task components and enhanced the video graphics display of the robot's position in the environment to significantly reduce operator workload. Automating a telerobot requires the addition of computer power to the robot, along with a variety of sensors and encoders to provide information about the robot's performance in, and relationship to, its environment. The resulting vehicle serves as a platform for research on strategies to integrate automated tasks with those performed by a human operator. The addition of these capabilities will greatly enhance the safety and efficiency of performance in hazardous environments.

  3. Integrating a complex electronic system in a small-scale autonomous instrumented robot: the NanoWalker project

    NASA Astrophysics Data System (ADS)

    Martel, Sylvain M.; Doyle, Kevin; Martinez, Gerardo; Hunter, Ian W.; Lafontaine, Serge

    1999-08-01

The NanoWalker project is an attempt to explore a new approach in the development of various instruments. The idea is to build a small autonomous robot capable of nanometer-range motions that will provide a standard platform for new miniaturized embedded instruments. This modular approach will allow easy expansion of instrumentation capability through the use of an arbitrary number of NanoWalkers, which would perform similar or different measurements simultaneously on various samples. To do so, a fair amount of electronics must be embedded for infrared wireless communication, processing, support for the embedded instrument, and accurate control and drive capability for the piezo-actuated motion system. Miniaturization of the whole assembly is also a key characteristic, allowing more robots to operate simultaneously within smaller surface areas. As such, new assembly techniques applicable to small-volume production must be used to achieve the smallest possible implementation. The integration phase within the technological constraints is complicated by the fact that several factors, such as the weight and weight distribution of the electronic assembly, will have a direct impact on the very sensitive motion behavior of the robot. The NanoWalker is briefly described along with the integration phases and the requirements that must be met by the assembly process.

  4. Design, Development and Testing of the Miniature Autonomous Extravehicular Robotic Camera (Mini AERCam) Guidance, Navigation and Control System

    NASA Technical Reports Server (NTRS)

    Wagenknecht, J.; Fredrickson, S.; Manning, T.; Jones, B.

    2003-01-01

    Engineers at NASA Johnson Space Center have designed, developed, and tested a nanosatellite-class free-flyer intended for future external inspection and remote viewing of human spaceflight activities. The technology demonstration system, known as the Miniature Autonomous Extravehicular Robotic Camera (Mini AERCam), has been integrated into the approximate form and function of a flight system. The primary focus has been to develop a system capable of providing external views of the International Space Station. The Mini AERCam system is spherical-shaped and less than eight inches in diameter. It has a full suite of guidance, navigation, and control hardware and software, and is equipped with two digital video cameras and a high resolution still image camera. The vehicle is designed for either remotely piloted operations or supervised autonomous operations. Tests have been performed in both a six degree-of-freedom closed-loop orbital simulation and on an air-bearing table. The Mini AERCam system can also be used as a test platform for evaluating algorithms and relative navigation for autonomous proximity operations and docking around the Space Shuttle Orbiter or the ISS.

  5. Intuitive control of mobile robots: an architecture for autonomous adaptive dynamic behaviour integration.

    PubMed

    Melidis, Christos; Iizuka, Hiroyuki; Marocco, Davide

    2017-06-05

In this paper, we present a novel approach to human-robot control. Taking inspiration from behaviour-based robotics and self-organisation principles, we present an interfacing mechanism able to adapt both to the user and to the robotic morphology. The aim is a transparent mechanism connecting user and robot, allowing for a seamless integration of control signals and robot behaviours. Instead of the user adapting to the interface and control paradigm, the proposed architecture allows the user to shape the control motifs in their way of preference, moving away from the case where the user has to read and understand an operation manual or learn to operate a specific device. Starting from a tabula rasa basis, the architecture is able to identify control patterns (behaviours) for the given robotic morphology and successfully merge them with control signals from the user, regardless of the input device used. The structural components of the interface are presented and assessed both individually and as a whole. Inherent properties of the architecture are presented and explained, and emergent properties are presented and investigated. As a whole, this paradigm is found to highlight the potential for a change in the paradigm of robotic control and a new level in the taxonomy of human-in-the-loop systems.

  6. The Embudito Mission: A Case Study of the Systematics of Autonomous Ground Mobile Robots

    SciTech Connect

    EICKER,PATRICK J.

    2001-02-01

Ground mobile robots are much in the mind of defense planners at this time, being considered for a significant variety of missions with a diversity ranging from logistics supply to reconnaissance and surveillance. While there has been a very large amount of basic research funded in the last quarter century devoted to mobile robots and their supporting component technologies, little of this science base has been fully developed and deployed--notable exceptions being NASA's Mars rover and several terrestrial derivatives. The material in this paper was developed as a first exemplary step in the development of a more systematic approach to the R&D of ground mobile robots.

  7. Application of autonomous robotics to surveillance of waste storage containers for radioactive surface contamination

    SciTech Connect

    Sweeney, F.J.; Beckerman, M.; Butler, P.L.; Jones, J.P.; Reister, D.B.

    1991-01-01

This paper describes a proof-of-principle demonstration performed with the HERMIES-III mobile robot to automate the inspection of waste storage drums for radioactive surface contamination and thereby reduce the human burden of operating a robot and worker exposure to potentially hazardous environments. Software and hardware for the demonstration were developed by a team consisting of Oak Ridge National Laboratory and the Universities of Florida, Michigan, Tennessee, and Texas. Robot navigation, machine vision, manipulator control, parallel processing, and human-machine interface techniques developed by the team were demonstrated utilizing advanced computer architectures. The demonstration consisted of over 100,000 lines of computer code executing on nine computers.

  9. Robotics.

    ERIC Educational Resources Information Center

    Waddell, Steve; Doty, Keith L.

    1999-01-01

    "Why Teach Robotics?" (Waddell) suggests that the United States lags behind Europe and Japan in use of robotics in industry and teaching. "Creating a Course in Mobile Robotics" (Doty) outlines course elements of the Intelligent Machines Design Lab. (SK)

  11. Multi-sensor integration for autonomous robots in nuclear power plants

    SciTech Connect

    Mann, R.C.; Jones, J.P.; Beckerman, M.; Glover, C.W.; Farkas, L.; Bilbro, G.L.; Snyder, W.

    1989-01-01

As part of a concerted R&D program in advanced robotics for hazardous environments, scientists and engineers at the Oak Ridge National Laboratory (ORNL) are performing research in the areas of systems integration, range-sensor-based 3-D world modeling, and multi-sensor integration. This program features a unique teaming arrangement that involves the universities of Florida, Michigan, Tennessee, and Texas; Odetics Corporation; and ORNL. This paper summarizes work directed at integrating information extracted from data collected with range sensors and CCD cameras on-board a mobile robot, in order to produce reliable descriptions of the robot's environment. Specifically, the paper describes the integration of two-dimensional vision and sonar range information, and an approach to integrate registered luminance and laser range images. All operations are carried out on-board the mobile robot using a 16-processor hypercube computer. 14 refs., 4 figs.
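One common textbook way to combine two noisy range estimates, such as the sonar and vision data integrated above, is inverse-variance weighting; the sketch below illustrates that generic technique only and is not necessarily the ORNL team's method. The numbers are hypothetical readings.

```python
# Generic inverse-variance fusion of two independent range estimates
# (e.g. one from sonar, one from vision). The more certain sensor gets
# the larger weight, and the fused variance is smaller than either input.

def fuse(z1, var1, z2, var2):
    """Return the minimum-variance combination of two independent estimates."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    return fused, 1.0 / (w1 + w2)

# Hypothetical readings: vision says 2.10 m (noisy), sonar says 1.90 m (tighter).
est, var = fuse(2.10, 0.04, 1.90, 0.01)
print(round(est, 3), var)   # estimate pulled toward the more certain sensor
```

Here the fused estimate lands much closer to the low-variance sonar reading, and the fused variance is below both input variances.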

  12. Multi-Tier Multi-Agent Autonomous Robotic Planetary Surface/Subsurface Reconnaissance for Life

    NASA Astrophysics Data System (ADS)

    Fink, W.; Dohm, J. M.; Tarbell, M. A.; Hare, T. M.; Baker, V. R.; Schulze-Makuch, D.; Furfaro, R.; Fairén, A. G.; Ferré, T. P. A.; Miyamoto, H.; Komatsu, G.; Mahaney, W. C.

    2006-03-01

    Tier-scalable autonomous reconnaissance enables intelligent, unconstrained, and distributed science-driven exploration of prime locations on Venus, Mars, Io, Europa, Titan, and elsewhere, allowing for increased science return and the search for life.

  13. Magician Simulator. A Realistic Simulator for Heterogenous Teams of Autonomous Robots

    DTIC Science & Technology

    2011-01-18

Communication is an issue in that data from the robots is expected to be provided at least once each second across a built-up area that is up to 500m...0.5 meter, with updates expected every second. Other major issues included in the competition are as follows: • Power limitations—Robots need to...the simulator. A. The Environment The following issues are considered in the simulator: • The possibility of defining up to three phases, each

  14. Magician Simulator: A Realistic Simulator for Heterogenous Teams of Autonomous Robots. MAGIC 2010 Challenge

    DTIC Science & Technology

    2011-02-07

    communications, solar panels, and low power computer control. All components and peripherals were packaged as interchangeable modules, four per scout...Figure 1 for an example based on our simulation of the proposed Grand Challenge environment). Communication is an issue in that data from the robots...major issues included in the competition are as follows: • Power limitations—Robots need to return to base or enter a designated service zone (DSZ

  15. DC Motor Drive for Small Autonomous Robots with Educational and Research Purpose

    NASA Astrophysics Data System (ADS)

    Krklješ, Damir; Babković, Kalman; Nagy, László; Borovac, Branislav; Nikolić, Milan

Many student robot competitions have been established during the last decade. One of them, and the most popular in Europe, is the European competition EUROBOT. The basic aim of this competition is to promote robotics among young people, mostly students and high-school pupils. An additional outcome of the competition is the development of faculty curricula based on it. Such a curriculum has been developed at the Faculty of Technical Science in Novi Sad. The curriculum lasts two semesters. During the first semester the theoretical basis is presented to the students. During the second semester the students, divided into teams of three to five, develop the robots that will take part in the upcoming EUROBOT competition. Since the time for robot development is short, a basic electronics kit is provided for the students. Its basic parts are two DC motor drives dedicated to robot locomotion. The drives will also be used in research concerning a multi-segment robot foot. This paper presents the DC motor drive and its features. Experimental results concerning speed and position regulation, as well as current limiting, are also presented.
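The two drive features named in the abstract, speed regulation and current limiting, can be sketched together in a few lines. Everything below is an illustrative assumption (the toy motor model, the gains `kp`/`ki`, and the 5 A limit are invented for the example, not taken from the paper): a PI speed loop whose output current is clamped, with a simple anti-windup step so the integrator does not run away while saturated.

```python
# Hedged sketch of a PI speed regulator with output (current) limiting.
# Plant model and all parameters are illustrative assumptions.

def simulate(setpoint=100.0, steps=2000, dt=0.001,
             kp=0.2, ki=2.0, i_max=5.0):
    """PI speed loop whose output (motor current) is clamped to +/- i_max."""
    speed, integral, peak_current = 0.0, 0.0, 0.0
    for _ in range(steps):
        err = setpoint - speed
        integral += err * dt
        current = kp * err + ki * integral
        if abs(current) > i_max:                       # current limiting
            current = max(-i_max, min(i_max, current))
            integral -= err * dt                       # simple anti-windup
        peak_current = max(peak_current, abs(current))
        # Toy first-order motor: torque proportional to current, viscous drag.
        speed += (40.0 * current - 0.5 * speed) * dt
    return speed, peak_current

final_speed, peak = simulate()
print(round(final_speed, 1), peak)   # speed near the setpoint; current never exceeds the limit
```

The clamp-plus-anti-windup pattern is a standard way to respect a hardware current limit without the integrator overshoot that naive saturation causes.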

  16. Learning for Autonomous Navigation

    NASA Technical Reports Server (NTRS)

    Angelova, Anelia; Howard, Andrew; Matthies, Larry; Tang, Benyang; Turmon, Michael; Mjolsness, Eric

    2005-01-01

Robotic ground vehicles for outdoor applications have achieved some remarkable successes, notably in autonomous highway following (Dickmanns, 1987), planetary exploration (1), and off-road navigation on Earth (1). Nevertheless, major challenges remain to enable reliable, high-speed, autonomous navigation in a wide variety of complex, off-road terrain. 3-D perception of terrain geometry with imaging range sensors is the mainstay of off-road driving systems. However, the stopping distance at high speed exceeds the effective lookahead distance of existing range sensors. Prospects for extending the range of 3-D sensors are strongly limited by sensor physics, eye safety of lasers, and related issues. Range sensor limitations also allow vehicles to enter large cul-de-sacs even at low speed, leading to long detours. Moreover, sensing only terrain geometry fails to reveal mechanical properties of terrain that are critical to assessing its traversability, such as potential for slippage, sinkage, and the degree of compliance of potential obstacles. Rovers in the Mars Exploration Rover (MER) mission have become stuck in sand dunes and experienced significant downhill slippage in the vicinity of large rock hazards. Earth-based off-road robots today have very limited ability to discriminate traversable vegetation from non-traversable vegetation or rough ground. It is impossible today to preprogram a system with knowledge of these properties for all types of terrain and weather conditions that might be encountered.
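The lookahead problem described above comes down to simple kinematics: total stopping distance is reaction distance plus braking distance, and it grows quadratically with speed while sensor range stays fixed. The numbers below (reaction time, deceleration, a 40 m sensor range) are illustrative, not from the paper.

```python
# Worked example of the lookahead problem: at speed v, the vehicle needs
# reaction distance v * t_react plus braking distance v^2 / (2 * decel).
# All numbers are illustrative assumptions.

def stopping_distance(v, t_react=0.5, decel=3.0):
    """Total distance (m) needed to stop from speed v (m/s)."""
    return v * t_react + v * v / (2.0 * decel)

sensor_range = 40.0   # hypothetical effective lookahead of a range sensor (m)
for v in (5.0, 10.0, 15.0, 20.0):
    d = stopping_distance(v)
    print(v, round(d, 1), d < sensor_range)   # can we stop inside the lookahead?
```

With these assumptions the vehicle outruns its sensor somewhere between 10 and 15 m/s, which is exactly the speed ceiling the abstract attributes to range-sensor limitations.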

  18. Information-driven self-organization: the dynamical system approach to autonomous robot behavior.

    PubMed

    Ay, Nihat; Bernigau, Holger; Der, Ralf; Prokopenko, Mikhail

    2012-09-01

In recent years, information theory has come into the focus of researchers interested in the sensorimotor dynamics of both robots and living beings. One root of these approaches is the idea that living beings are information-processing systems and that the optimization of these processes should be an evolutionary advantage. Apart from these more fundamental questions, there has been much recent interest in how a robot can be equipped with an internal drive for innovation or curiosity that may serve as a drive for an open-ended, self-determined development of the robot. The success of these approaches depends essentially on the choice of a convenient measure of the information. This article studies in some detail the use of the predictive information (PI), also called excess entropy or effective measure complexity, of the sensorimotor process. The PI of a process quantifies the total information of past experience that can be used for predicting future events. However, the application of information-theoretic measures in robotics is mostly restricted to the case of a finite, discrete state-action space. This article aims at applying the PI in the dynamical-systems approach to robot control. We study linear systems as a first step and derive exact results for the PI together with explicit learning rules for the parameters of the controller. Interestingly, these learning rules are Hebbian in nature and local in the sense that the synaptic update is given by the product of activities available directly at the pertinent synaptic ports. The general findings are exemplified by a number of case studies. In particular, in a two-dimensional system designed to mimic embodied systems with latent oscillatory locomotion patterns, it is shown that maximizing the PI means recognizing and amplifying the latent modes of the robotic system.
This and many other examples show that the learning rules derived from the maximum PI principle are a versatile tool for the self
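
For the simplest nontrivial case, a stationary Gaussian AR(1) process, the predictive information has a closed form, which makes the quantity easy to check numerically. The sketch below is illustrative only (it is not the authors' controller derivation) and compares the analytic value with an estimate from simulated data:

```python
import numpy as np

def predictive_information_ar1(a):
    """Closed-form predictive information of a stationary Gaussian AR(1)
    process x[t+1] = a*x[t] + noise. The process is Markov, so the mutual
    information between the whole past and the whole future reduces to
    I(x[t]; x[t+1]) = -0.5 * ln(1 - a^2) nats."""
    return -0.5 * np.log(1.0 - a * a)

def estimate_pi_from_samples(x):
    """Estimate the same quantity from data via the lag-1 autocorrelation
    (valid under the Gaussian AR(1) assumption)."""
    rho = np.corrcoef(x[:-1], x[1:])[0, 1]
    return -0.5 * np.log(1.0 - rho * rho)

rng = np.random.default_rng(0)
a = 0.9
x = np.empty(100_000)
x[0] = 0.0
for i in range(1, len(x)):
    x[i] = a * x[i - 1] + rng.standard_normal()

print(predictive_information_ar1(a))    # analytic: about 0.83 nats
print(estimate_pi_from_samples(x))      # empirical estimate, close to analytic
```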

  19. Intelligent Mobile Autonomous System

    DTIC Science & Technology

    1987-01-01

jerk application. (c) Negative jerk application. Group (a). Application of positive jerk. Force is increased from initial value to force of resistance...fundamentals of the new emerging area of autonomous robotics. The goal of this research is to develop a theory of design and functioning of Intelligent...scientific research. This report contributes to a new rapidly developing area of autonomous robotics. Actual experience of dealing with autonomous robots (or

  20. Autonomous intelligent assembly systems LDRD 105746 final report.

    SciTech Connect

    Anderson, Robert J.

    2013-04-01

This report documents a three-year effort to develop technology that enables mobile robots to perform autonomous assembly tasks in unstructured outdoor environments. This is a multi-tier problem that requires the integration of a large number of different software technologies, including command and control, estimation and localization, distributed communications, object recognition, pose estimation, real-time scanning, and scene interpretation. Although ultimately unsuccessful in achieving the target brick-stacking task autonomously, the project nevertheless developed numerous important component technologies. These include a patent-pending polygon snake algorithm for robust feature tracking, a color grid algorithm for unique identification and calibration, a command and control framework for abstracting robot commands, a scanning capability that utilizes a compact, robot-portable scanner, and more. This report describes the project and these developed technologies.

  1. Generating Self-Reliant Teams of Autonomous Cooperating Robots: Desired Design Characteristics

    SciTech Connect

    Parker, L.E.

    1999-05-01

    The difficulties in designing a cooperative team are significant. Several of the key questions that must be resolved when designing a cooperative control architecture include: How do we formulate, describe, decompose, and allocate problems among a group of intelligent agents? How do we enable agents to communicate and interact? How do we ensure that agents act coherently in their actions? How do we allow agents to recognize and reconcile conflicts? However, in addition to these key issues, the software architecture must be designed to enable multi-robot teams to be robust, reliable, and flexible. Without these capabilities, the resulting robot team will not be able to successfully deal with the dynamic and uncertain nature of the real world. In this extended abstract, we first describe these desired capabilities. We then briefly describe the ALLIANCE software architecture that we have previously developed for multi-robot cooperation. We then briefly analyze the ALLIANCE architecture in terms of the desired design qualities identified.
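
The motivational mechanism behind ALLIANCE's fault tolerance (impatience that accumulates while a task goes unserved, and resets when a teammate handles it) can be caricatured in a few lines. The update rule, rates, and threshold below are illustrative simplifications, not Parker's exact formulation:

```python
# Schematic sketch of an ALLIANCE-style motivational behavior. A robot's
# motivation to take over a task grows at an "impatience" rate while no
# teammate is serving it, resets to zero when a teammate is seen working,
# and activates the behavior once it crosses a threshold.

THRESHOLD = 1.0

def motivation_step(m, impatience, teammate_active, task_applicable):
    """One update of this robot's motivation to take over a task."""
    if teammate_active or not task_applicable:
        return 0.0               # reset: a teammate has the task, or it is moot
    return m + impatience        # otherwise impatience accumulates

m = 0.0
history = []
for t in range(20):
    teammate_active = t < 5      # a teammate works the task for 5 steps, then fails silently
    m = motivation_step(m, impatience=0.125,
                        teammate_active=teammate_active, task_applicable=True)
    history.append(m >= THRESHOLD)

print(history.index(True))       # → 12: the step at which this robot takes over
```

The reset term is what gives the team its fault tolerance: as long as a teammate visibly makes progress, no one else wastes effort on the task, but a silent failure is eventually covered.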

  2. Ground Simulation of an Autonomous Satellite Rendezvous and Tracking System Using Dual Robotic Systems

    NASA Technical Reports Server (NTRS)

    Trube, Matthew J.; Hyslop, Andrew M.; Carignan, Craig R.; Easley, Joseph W.

    2012-01-01

    A hardware-in-the-loop ground system was developed for simulating a robotic servicer spacecraft tracking a target satellite at short range. A relative navigation sensor package, "Argon", is mounted on the end-effector of a Fanuc 430 manipulator, which functions as the base platform of the robotic spacecraft servicer. Machine vision algorithms estimate the pose of the target spacecraft, mounted on a Rotopod R-2000 platform, and relay the solution to a simulation of the servicer spacecraft running in "Freespace", which performs guidance, navigation and control functions, integrates dynamics, and issues motion commands to the Fanuc platform controller so that it tracks the simulated servicer spacecraft. Results are reviewed for several satellite motion scenarios at different ranges. Key words: robotics, satellite, servicing, guidance, navigation, tracking, control, docking.

  3. Behavior generation strategy of artificial behavioral system by self-learning paradigm for autonomous robot tasks

    NASA Astrophysics Data System (ADS)

    Dağlarli, Evren; Temeltaş, Hakan

    2008-04-01

    In this study, behavior generation and self-learning paradigms are investigated for real-time applications of multi-goal mobile robot tasks. The method is capable of generating new behaviors and combines them in order to achieve multi-goal tasks. The proposed method is composed of three layers: a Behavior Generating Module, a Coordination Level, and an Emotion-Motivation Level. The last two levels use Hidden Markov models to manage the dynamical structure of behaviors. The kinematic and dynamic models of the mobile robot with non-holonomic constraints are considered in the behavior-based control architecture. The proposed method is tested on a four-wheel-driven and four-wheel-steered mobile robot with constraints in a simulation environment, and results are obtained successfully.

  4. Creative Engineering Based Education with Autonomous Robots Considering Job Search Support

    NASA Astrophysics Data System (ADS)

    Takezawa, Satoshi; Nagamatsu, Masao; Takashima, Akihiko; Nakamura, Kaeko; Ohtake, Hideo; Yoshida, Kanou

    The Robotics Course in our Mechanical Systems Engineering Department offers “Robotics Exercise Lessons” as one of its Problem-Solution Based Specialized Subjects. This is intended to motivate students' learning and to help them acquire fundamental knowledge and skills in mechanical engineering and improve their understanding of basic robotics theory. Our current curriculum was established to accomplish this objective based on two pieces of research in 2005: an evaluation questionnaire on the education of our Mechanical Systems Engineering Department for graduates, and a survey on the kind of human resources which companies are seeking and their expectations for our department. This paper reports the academic results and reflections on job-search support in recent years as inherited and developed from the previous curriculum.

  5. A modular structure for the control of an autonomous mobile robot

    NASA Astrophysics Data System (ADS)

    Perebaskine, Victor-Olivier

    1992-02-01

    A hierarchically organized robot control structure is described. Four steps were achieved: characterization of the basic element types composing the functional layer, and identification of interaction modes between the different components of the control structure; specification and implementation of specific communication mechanisms to support these interaction modes in the control structure; study of the module structure, and its control; implementation of the system and validation by several experiments. One of the major aspects of this system is the possibility to program the robot's reactivity according to the mission requirements. Another important aspect is the ability to modify the relationships between modules during mission execution.

  6. Experimental Demonstration of Technologies for Autonomous On-Orbit Robotic Assembly

    NASA Technical Reports Server (NTRS)

    LeMaster, Edward A.; Schaechter, David B.; Carrington, Connie K.

    2006-01-01

    The Modular Reconfigurable High Energy (MRHE) program aimed to develop technologies for the automated assembly and deployment of large-scale space structures and aggregate spacecraft. Part of the project involved creation of a terrestrial robotic testbed for validation and demonstration of these technologies and for the support of future development activities. This testbed was completed in 2005, and was thereafter used to demonstrate automated rendezvous, docking, and self-assembly tasks between a group of three modular robotic spacecraft emulators. This paper discusses the rationale for the MRHE project, describes the testbed capabilities, and presents the MRHE assembly demonstration sequence.

  7. First observations of teleseismic P-waves with autonomous underwater robots: towards future global network of mobile seismometers

    NASA Astrophysics Data System (ADS)

    Sukhovich, Alexei; Nolet, Guust; Hello, Yann; Simons, Frederik; Bonnieux, Sébastien

    2013-04-01

    We report here the first successful observations of underwater acoustic signals generated by teleseismic P-waves recorded by autonomous MERMAID robots (short for Mobile Earthquake Recording in Marine Areas by Independent Divers). During 2011-2012 we conducted three test campaigns, for a total duration of about 8 weeks, in the Ligurian Sea, which allowed us to record nine teleseismic events (distances greater than 60 degrees) of magnitude higher than 6 and one closer event (distance 23 degrees) of magnitude 5.5. Our results indicate that no simple relation exists between the magnitude of the source event and the signal-to-noise ratio (SNR) of the corresponding acoustic signals. Other factors, such as fault orientation and meteorological conditions, play an important role in the detectability of the seismic events. We also show examples of the events recorded during these test runs and how their frequency characteristics allow them to be recognized automatically by an algorithm based on the wavelet transform. We shall also report on more recent results obtained during the first fully autonomous run (currently ongoing) of the final MERMAID design in the Mediterranean Sea.
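
The abstract does not spell out the wavelet-based recognition algorithm. As a rough illustration of the underlying idea, a Haar wavelet decomposition can separate a low-frequency teleseismic arrival from broadband ambient noise by the fraction of energy falling at coarse scales; all signal parameters and thresholds below are invented for the sketch:

```python
import numpy as np

def haar_level_energies(x, levels=8):
    """Haar wavelet decomposition: return detail-coefficient energy per scale.
    A low-frequency seismic arrival concentrates energy at coarse scales,
    while broadband ambient noise spreads it across all scales."""
    x = np.asarray(x, dtype=float)
    energies = []
    for _ in range(levels):
        approx = (x[0::2] + x[1::2]) / np.sqrt(2)
        detail = (x[0::2] - x[1::2]) / np.sqrt(2)
        energies.append(np.sum(detail ** 2))
        x = approx
    return np.array(energies)

def coarse_fraction(x, levels=8):
    """Fraction of detail energy in the coarser half of the scales."""
    e = haar_level_energies(x, levels)
    return e[levels // 2:].sum() / e.sum()

rng = np.random.default_rng(1)
t = np.arange(1024)
noise = rng.standard_normal(1024)
# Low-frequency wave packet standing in for a teleseismic P-wave conversion:
packet = 5 * np.sin(2 * np.pi * t / 128) * np.exp(-((t - 512) / 150) ** 2)
event = noise + packet

print(coarse_fraction(noise), coarse_fraction(event))   # event scores much higher
```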

  8. Real-time Needle Steering in Response to Rolling Vein Deformation by a 9-DOF Image-Guided Autonomous Venipuncture Robot.

    PubMed

    Chen, Alvin I; Balter, Max L; Maguire, Timothy J; Yarmush, Martin L

    2015-01-01

    Venipuncture is the most common invasive medical procedure performed in the United States and the number one cause of hospital injury. Failure rates are particularly high in pediatric and elderly patients, whose veins tend to deform, move, or roll as the needle is introduced. To improve venipuncture accuracy in challenging patient populations, we have developed a portable device that autonomously servos a needle into a suitable vein under image guidance. The device operates in real time, combining near-infrared and ultrasound imaging, computer vision software, and a 9 degrees-of-freedom robot that servos the needle. In this paper, we present the kinematic and mechanical design of the latest generation robot. We then investigate in silico and in vitro the mechanics of vessel rolling and deformation in response to needle insertions performed by the robot. Finally, we demonstrate how the robot can make real-time adjustments under ultrasound image guidance to compensate for subtle vessel motions during venipuncture.

  9. Real-time Needle Steering in Response to Rolling Vein Deformation by a 9-DOF Image-Guided Autonomous Venipuncture Robot

    PubMed Central

    Chen, Alvin I.; Balter, Max L.; Maguire, Timothy J.; Yarmush, Martin L.

    2015-01-01

    Venipuncture is the most common invasive medical procedure performed in the United States and the number one cause of hospital injury. Failure rates are particularly high in pediatric and elderly patients, whose veins tend to deform, move, or roll as the needle is introduced. To improve venipuncture accuracy in challenging patient populations, we have developed a portable device that autonomously servos a needle into a suitable vein under image guidance. The device operates in real time, combining near-infrared and ultrasound imaging, computer vision software, and a 9 degrees-of-freedom robot that servos the needle. In this paper, we present the kinematic and mechanical design of the latest generation robot. We then investigate in silico and in vitro the mechanics of vessel rolling and deformation in response to needle insertions performed by the robot. Finally, we demonstrate how the robot can make real-time adjustments under ultrasound image guidance to compensate for subtle vessel motions during venipuncture. PMID:26779381

  10. Approaching Complexity through Planful Play: Kindergarten Children's Strategies in Constructing an Autonomous Robot's Behavior

    ERIC Educational Resources Information Center

    Levy, S. T.; Mioduser, D.

    2010-01-01

    This study investigates how young children master, construct and understand intelligent rule-based robot behaviors, focusing on their strategies in gradually meeting the tasks' complexity. The wider aim is to provide a comprehensive map of the kinds of transitions and learning that take place in constructing simple emergent behaviors, particularly…

  11. 3-D Ultrasound Guidance of Autonomous Robot for Location of Ferrous Shrapnel

    PubMed Central

    Rogers, Albert J.; Light, Edward D.

    2010-01-01

    Vibrations can be induced in ferromagnetic shrapnel by a variable electromagnet. Real time 3-D color Doppler ultrasound located the induced motion in a needle fragment and determined its 3-D position in the scanner coordinates. This information was used to guide a robot which moved a probe to touch the shrapnel fragment. PMID:19574140

  13. Design of a low-cost high-performance autonomous robot for nuclear environments

    SciTech Connect

    Burhanpurkar, V.P.

    1994-12-31

    This paper presents two key aspects of a novel low-cost modular mobile robot architecture for nuclear environments. Key features of the system are (a) a novel ultrasonic sensor for scene analysis in unstructured environments and (b) an efficient ground-search algorithm for ground-level contamination mapping without a priori maps or preprogramming.

  14. 3-D ultrasound guidance of autonomous robot for location of ferrous shrapnel.

    PubMed

    Rogers, Albert J; Light, Edward D; Smith, Stephen W

    2009-07-01

    Vibrations can be induced in ferromagnetic shrapnel by a variable electromagnet. Real time 3-D color Doppler ultrasound located the induced motion in a needle fragment and determined its 3-D position in the scanner coordinates. This information was used to guide a robot which moved a probe to touch the shrapnel fragment.
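
Using the scanner-frame position to drive the robot requires a calibrated transform from ultrasound-scanner coordinates to the robot base frame. A minimal sketch with an invented calibration (the record gives no calibration numbers):

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a rotation matrix and translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Illustrative calibration: scanner frame rotated 90 deg about z and
# offset 0.20 m along x relative to the robot base.
c, s = np.cos(np.pi / 2), np.sin(np.pi / 2)
R = np.array([[c, -s, 0.0],
              [s,  c, 0.0],
              [0.0, 0.0, 1.0]])
T_robot_scanner = make_transform(R, np.array([0.20, 0.0, 0.0]))

p_scanner = np.array([0.05, 0.10, 0.03, 1.0])   # fragment position, homogeneous
p_robot = T_robot_scanner @ p_scanner           # target for the probe motion
print(p_robot[:3])
```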

  15. Navy Requirements for Controlling Multiple Off-Board Robots Using the Autonomous Unmanned Vehicle Workbench

    DTIC Science & Technology

    2007-06-01

    I18N...Java Look + Feel...User Interface; GWOT Global War on Terror; HSI Human-Systems Integration; HTML Hypertext Markup Language; I18N Internationalization; ID...without legal restrictions. 2. Internationalization (I18N). The United States is not the only country using AUVs. When robots from the US

  16. The Rise of Robots: The Military’s Use of Autonomous Lethal Force

    DTIC Science & Technology

    2015-02-17

    Michael Faraday stumbled upon a unique observation. Faraday discovered that electrical conduction increases with temperature in silver sulfide...integrated circuit, made of the silicates first discovered by Faraday, that brought the revolution in computer technology which has fundamentally...Michael Contratto aptly pointed out the potential for autonomous systems to circumvent the LOAC rules given our reliance on systems-of-systems and

  17. The research of autonomous obstacle avoidance of mobile robot based on multi-sensor integration

    NASA Astrophysics Data System (ADS)

    Zhao, Ming; Han, Baoling

    2016-11-01

    The object of this study is a bionic quadruped mobile robot. The study proposes a system design for mobile robot obstacle avoidance that integrates a binocular stereo vision sensor and a self-controlled 3D lidar with modified ant colony optimization path planning to reconstruct the environmental map. Because the working conditions of a mobile robot are complex, 3D reconstruction with a single binocular sensor is unreliable when feature points are few and lighting is poor. The system therefore integrates the Bumblebee2 stereo vision sensor and the lidar sensor to detect the 3D point cloud of environmental obstacles, and uses sensor information fusion to rebuild the environment map: obstacles are first detected separately from the lidar data and the visual data, and the two obstacle distributions are then fused to obtain a more complete and accurate distribution of obstacles in the scene. The paper then introduces the ant colony algorithm, analyzes the advantages and disadvantages of ant colony optimization and their causes, and improves the algorithm to increase its convergence rate and precision in robot path planning. These improvements overcome shortcomings of ant colony optimization such as easy entrapment in local optima, slow search speed, and poor search results. The experiments process images and drive the motors under Matlab and Visual Studio, establish a visual 2.5D grid map, and finally plan a global path for the mobile robot according to the ant colony algorithm. The feasibility and effectiveness of the system are confirmed on ROS and a Linux simulation platform.
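
For reference, the core pheromone feedback loop of plain ant colony optimization (not the paper's modified variant) can be shown on the classic two-bridge problem, where deposits inversely proportional to path length make the colony converge on the shorter route:

```python
import random

# Minimal ant colony optimization on a two-path choice. Each ant picks a path
# with probability proportional to its pheromone; ants that took the shorter
# path deposit more pheromone, and evaporation forgets old choices.

random.seed(42)
lengths = {"short": 1.0, "long": 2.0}
pheromone = {"short": 1.0, "long": 1.0}
evaporation = 0.1

for _ in range(50):                      # colony iterations
    deposits = {"short": 0.0, "long": 0.0}
    for _ in range(20):                  # ants per iteration
        total = pheromone["short"] + pheromone["long"]
        path = "short" if random.random() < pheromone["short"] / total else "long"
        deposits[path] += 1.0 / lengths[path]     # shorter path earns more pheromone
    for p in pheromone:
        pheromone[p] = (1.0 - evaporation) * pheromone[p] + deposits[p]

print(pheromone)   # pheromone concentrates on the short path
```

The positive feedback in this loop is also the source of the premature-convergence problem the paper's modifications target: once one path dominates, exploration of alternatives collapses.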

  18. Sustainable Cooperative Robotic Technologies for Human and Robotic Outpost Infrastructure Construction and Maintenance

    NASA Technical Reports Server (NTRS)

    Stroupe, Ashley W.; Okon, Avi; Robinson, Matthew; Huntsberger, Terry; Aghazarian, Hrand; Baumgartner, Eric

    2004-01-01

    Robotic Construction Crew (RCC) is a heterogeneous multi-robot system for autonomous acquisition, transport, and precision mating of components in construction tasks. RCC minimizes resources that are constrained in a space environment, such as computation, power, communication, and sensing. A behavior-based architecture provides adaptability and robustness despite low computational requirements. RCC successfully performs several construction-related tasks in an emulated outdoor environment despite high levels of uncertainty in motion and sensing. Quantitative results are provided for formation keeping in component transport, precision instrument placement, and construction tasks.

  20. A demonstration of autonomous navigation and machine vision using the HERMIES-IIB robot

    SciTech Connect

    Burks, B.L.; Barnett, D.L.; Jones, J.P.; Killough, S.M.

    1987-01-01

    In this paper, advances to our mobile robot series (currently HERMIES-IIB) are described, including 8 NCUBE processors on-board (computationally equivalent to 8 VAX 11/780s) operating in parallel, and augmentation of the sensor suite with cameras to facilitate on-board vision analysis and goal finding. The essential capabilities of the expert system described in earlier papers have been ported to the on-board HERMIES-IIB computers, thereby eliminating off-board computation. A successful experiment is described in which a robot is placed in an initial arbitrary location without prior specification of the room contents, successfully discovers and navigates around stationary and moving obstacles, picks up and moves small obstacles, searches for a control panel, and reads the meters found on the panel. 19 refs., 5 figs.

  1. Automated Cartography by an Autonomous Mobile Robot Using Ultrasonic Range Finders

    DTIC Science & Technology

    1993-09-01

    Macintosh Powerbook 145 notebook computer with an Articulate Systems Voice Navigator voice interface is provided for user communications with the...for corrosion and loose fittings. A voice interface is currently under development to provide a more intuitive robot/human interface. This system will enable...Please read the owner's manual prior to operating this computer. A Voice Navigator voice interface system is also available for voice recognition

  2. High-accuracy drilling with an image guided light weight robot: autonomous versus intuitive feed control.

    PubMed

    Tauscher, Sebastian; Fuchs, Alexander; Baier, Fabian; Kahrs, Lüder A; Ortmaier, Tobias

    2017-07-13

    Assistance of robotic systems in the operating room promises higher accuracy and, hence, makes demanding surgical interventions realisable (e.g. the direct cochlear access). Additionally, an intuitive user interface is crucial for the use of robots in surgery. Torque sensors in the joints can be employed for intuitive interaction concepts. Regarding accuracy, however, they lead to a lower structural stiffness and, thus, to an additional error source. The aim of this contribution is to examine whether an accuracy sufficient for demanding interventions can be achieved by such a system. Feasible accuracy of the robot-assisted process depends on each work-flow step. This work focuses on the determination of the tool coordinate frame. A method for drill-axis definition is implemented and analysed. Furthermore, a concept of admittance feed control is developed. This allows the user to control feeding along the planned path by applying a force to the robot's structure. The accuracy is investigated in drilling experiments with a PMMA phantom and artificial bone blocks. The described drill-axis estimation process results in a high angular repeatability ([Formula: see text]). In the first set of drilling results, an accuracy of [Formula: see text] at the entrance point and [Formula: see text] at the target point, excluding imaging, was achieved. With admittance feed control, an accuracy of [Formula: see text] at the target point was realised. In a third set, twelve holes were drilled in artificial temporal bone phantoms, including imaging; in this set-up, an error of [Formula: see text] and [Formula: see text] was achieved. The results of the conducted experiments show that accuracy requirements for demanding procedures such as the direct cochlear access can be fulfilled with compliant systems. Furthermore, it was shown that with the presented admittance feed control an accuracy of less than [Formula: see text] is achievable.
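
The admittance feed idea, mapping the force the surgeon applies to the robot structure into a feed velocity along the planned axis, can be sketched as follows; the damping and speed limit are illustrative values, not those of the paper:

```python
import numpy as np

def admittance_feed(force_vec, axis, damping=50.0, v_max=0.002):
    """Feed speed (m/s) along the unit drill axis from a measured force (N):
    v = F_along / damping, clamped so the tool can only advance, never
    retract, and never exceeds v_max."""
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    f_along = float(np.dot(force_vec, axis))    # force component along the path
    return float(np.clip(f_along / damping, 0.0, v_max))

print(admittance_feed(np.array([0.0, 0.0, 0.05]), [0, 0, 1]))   # gentle push: slow feed
print(admittance_feed(np.array([0.0, 0.0, 5.0]), [0, 0, 1]))    # hard push: clamped to v_max
print(admittance_feed(np.array([0.0, 0.0, -5.0]), [0, 0, 1]))   # pulling back: no retraction
```

Clamping to a forward-only, speed-limited command is what makes the interaction intuitive without letting the user drive the tool off the planned trajectory.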

  3. Robust Agent Control of an Autonomous Robot with Many Sensors and Actuators

    DTIC Science & Technology

    1993-05-01

    The neural controller was developed by Beer and was inspired by Pearson's flexor burst-generator model of cockroach locomotion. The goal was to...Workshop on Intelligent Robots and Systems', Ibaraki, Japan, pp. 383-388. Beer, R. & Chiel, H. (1993), Simulations of Cockroach Locomotion and Es...Previous work exploring fully distributed, insect-like locomotion controllers has only been addressed for flat terrain (Beer, Chiel, Quinn & Espenschied

  4. An autonomous robot inspired by insect neurophysiology pursues moving features in natural environments

    NASA Astrophysics Data System (ADS)

    Bagheri, Zahra M.; Cazzolato, Benjamin S.; Grainger, Steven; O'Carroll, David C.; Wiederman, Steven D.

    2017-08-01

    Objective. Many computer vision and robotic applications require the implementation of robust and efficient target-tracking algorithms on a moving platform. However, deployment of a real-time system is challenging, even with the computational power of modern hardware. Lightweight and low-powered flying insects, such as dragonflies, track prey or conspecifics within cluttered natural environments, illustrating an efficient biological solution to the target-tracking problem. Approach. We used our recent recordings from ‘small target motion detector’ neurons in the dragonfly brain to inspire the development of a closed-loop target detection and tracking algorithm. This model exploits facilitation, a slow build-up of response to targets which move along long, continuous trajectories, as seen in our electrophysiological data. To test performance in real-world conditions, we implemented this model on a robotic platform that uses active pursuit strategies based on insect behaviour. Main results. Our robot performs robustly in closed-loop pursuit of targets, despite a range of challenging conditions used in our experiments; low contrast targets, heavily cluttered environments and the presence of distracters. We show that the facilitation stage boosts responses to targets moving along continuous trajectories, improving contrast sensitivity and detection of small moving targets against textured backgrounds. Moreover, the temporal properties of facilitation play a useful role in handling vibration of the robotic platform. We also show that the adoption of feed-forward models which predict the sensory consequences of self-movement can significantly improve target detection during saccadic movements. Significance. Our results provide insight into the neuronal mechanisms that underlie biological target detection and selection (from a moving platform), as well as highlight the effectiveness of our bio-inspired algorithm in an artificial visual system.
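
The facilitation mechanism, a slow build-up of gain at locations along a continuing trajectory, can be caricatured with a leaky-integrator gain map; the grid size, decay rate, and trajectories below are invented for illustration and are not the published model:

```python
import numpy as np

def run(trajectory, size=32, decay=0.8, spread=2):
    """Return the detector response to the final sighting of a target that
    followed `trajectory` (a list of grid cells). A leaky-integrator gain map
    is raised around each sighting, so continuous motion accrues gain while
    a target jumping between distant cells does not."""
    gain = np.zeros((size, size))
    response = 0.0
    for x, y in trajectory:
        response = 1.0 + gain[x, y]            # raw response boosted by local gain
        gain *= decay                          # slow decay everywhere
        gain[max(0, x - spread):x + spread + 1,
             max(0, y - spread):y + spread + 1] += 1.0   # facilitate nearby cells
    return response

continuous = [(x, 16) for x in range(10, 20)]                 # smooth left-to-right pass
scattered = [(0, 0), (30, 30), (0, 30), (30, 0), (15, 0),
             (0, 15), (30, 15), (15, 30), (7, 7), (22, 22)]   # jumps, no overlap

print(run(continuous), run(scattered))   # the continuous target ends up facilitated
```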

  5. The Backseat Control Architecture for Autonomous Robotic Vehicles: A Case Study with the Iver2 AUV

    DTIC Science & Technology

    2010-06-01

    the vehicle to perform real-time adaptive environmental sampling. In addition to the oceanographic sensors, each vehicle is equipped with a 16...acoustic data to a set of 8-channel analog-to-digital (A/D) converter boards described in II-A.5, allowing the vehicle to perform real-time underwater target...for oceanographic sampling with the autonomous kayaks are described in [9]. III. THE BACKSEAT CONTROL ARCHITECTURE A. Overview The iOceanServerComms

  6. A bioinspired autonomous swimming robot as a tool for studying goal-directed locomotion.

    PubMed

    Manfredi, L; Assaf, T; Mintchev, S; Marrazza, S; Capantini, L; Orofino, S; Ascari, L; Grillner, S; Wallén, P; Ekeberg, O; Stefanini, C; Dario, P

    2013-10-01

    The bioinspired approach has been key in combining the disciplines of robotics with neuroscience in an effective and promising fashion. Indeed, certain aspects in the field of neuroscience, such as goal-directed locomotion and behaviour selection, can be validated through robotic artefacts. In particular, swimming is a functionally important behaviour where neuromuscular structures, neural control architecture and operation can be replicated artificially following models from biology and neuroscience. In this article, we present a biomimetic system inspired by the lamprey, an early vertebrate that locomotes using anguilliform swimming. The artefact possesses extra- and proprioceptive sensory receptors, muscle-like actuation, distributed embedded control and a vision system. Experiments on optimised swimming and on goal-directed locomotion are reported, as well as the assessment of the performance of the system, which shows high energy efficiency and adaptive behaviour. While the focus is on providing a robotic platform for testing biological models, the reported system can also be of major relevance for the development of engineering system applications.
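
A common abstraction of such anguilliform central pattern generators (not the specific controller of this artefact) is a chain of phase oscillators coupled to lock a fixed phase lag per segment, producing a head-to-tail travelling wave:

```python
import numpy as np

N = 10                       # body segments
omega = 2 * np.pi * 1.0      # 1 Hz intrinsic burst frequency
lag = 2 * np.pi / N          # desired phase lag per segment (one wavelength per body)
k = 5.0                      # coupling strength
dt = 0.001

rng = np.random.default_rng(0)
phi = -lag * np.arange(N) + 0.3 * rng.standard_normal(N)   # near-wave start, perturbed

for _ in range(20_000):      # 20 s of simulated time, explicit Euler
    dphi = np.full(N, omega)
    dphi[1:] += k * np.sin(phi[:-1] - phi[1:] - lag)   # lock to rostral neighbour minus lag
    dphi[:-1] += k * np.sin(phi[1:] - phi[:-1] + lag)  # and to caudal neighbour plus lag
    phi = phi + dt * dphi

diffs = np.angle(np.exp(1j * (phi[:-1] - phi[1:])))    # wrapped successive phase differences
print(diffs)                 # each entry settles near `lag`: a travelling wave
```

Driving each segment's muscle with sin(phi[i]) then yields the rostro-caudal undulation characteristic of lamprey swimming; wave speed and direction follow from the sign and size of the lag.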

  7. Novel Microbial Diversity Retrieved by Autonomous Robotic Exploration of the World's Deepest Vertical Phreatic Sinkhole

    NASA Astrophysics Data System (ADS)

    Sahl, Jason W.; Fairfield, Nathaniel; Harris, J. Kirk; Wettergreen, David; Stone, William C.; Spear, John R.

    2010-03-01

    The deep phreatic thermal explorer (DEPTHX) is an autonomous underwater vehicle designed to navigate an unexplored environment, generate high-resolution three-dimensional (3-D) maps, collect biological samples based on an autonomous sampling decision, and return to its origin. In the spring of 2007, DEPTHX was deployed in Zacatón, a deep (˜318 m), limestone, phreatic sinkhole (cenote) in northeastern Mexico. As DEPTHX descended, it generated a 3-D map based on the processing of range data from 54 onboard sonars. The vehicle collected water column samples and wall biomat samples throughout the depth profile of the cenote. Post-expedition sample analysis via comparative analysis of 16S rRNA gene sequences revealed a wealth of microbial diversity. Traditional Sanger gene sequencing combined with a barcoded-amplicon pyrosequencing approach revealed novel, phylum-level lineages from the domains Bacteria and Archaea; in addition, several novel subphylum lineages were also identified. Overall, DEPTHX successfully navigated and mapped Zacatón, and collected biological samples based on an autonomous decision, which revealed novel microbial diversity in a previously unexplored environment.

  8. Novel microbial diversity retrieved by autonomous robotic exploration of the world's deepest vertical phreatic sinkhole.

    PubMed

    Sahl, Jason W; Fairfield, Nathaniel; Harris, J Kirk; Wettergreen, David; Stone, William C; Spear, John R

    2010-03-01

    The deep phreatic thermal explorer (DEPTHX) is an autonomous underwater vehicle designed to navigate an unexplored environment, generate high-resolution three-dimensional (3-D) maps, collect biological samples based on an autonomous sampling decision, and return to its origin. In the spring of 2007, DEPTHX was deployed in Zacatón, a deep (approximately 318 m), limestone, phreatic sinkhole (cenote) in northeastern Mexico. As DEPTHX descended, it generated a 3-D map based on the processing of range data from 54 onboard sonars. The vehicle collected water column samples and wall biomat samples throughout the depth profile of the cenote. Post-expedition sample analysis via comparative analysis of 16S rRNA gene sequences revealed a wealth of microbial diversity. Traditional Sanger gene sequencing combined with a barcoded-amplicon pyrosequencing approach revealed novel, phylum-level lineages from the domains Bacteria and Archaea; in addition, several novel subphylum lineages were also identified. Overall, DEPTHX successfully navigated and mapped Zacatón, and collected biological samples based on an autonomous decision, which revealed novel microbial diversity in a previously unexplored environment.

  9. An Extremely Low Power Quantum Optical Communication Link for Autonomous Robotic Explorers

    NASA Technical Reports Server (NTRS)

    Lekki, John; Nguyen, Quang-Viet; Bizon, Tom; Nguyen, Binh; Kojima, Jun

    2007-01-01

    One concept for planetary exploration involves using many small robotic landers that can cover more ground than a single conventional lander. In addressing this vision, NASA has been challenged in the National Nanotechnology Initiative to research the development of miniature robots built from nano-sized components. These robots face very significant challenges, such as mobility and communication, given their small size and limited power-generation capability. The research presented here has focused on developing a communications system that has the potential to provide ultra-low-power communications for robots such as these. In this paper an optical communications technique that is based on transmitting recognizable sets of photons is presented. Previously, pairs of photons in an entangled quantum state have been shown to be recognizable in ambient light. The main drawback to utilizing entangled photons is that they can only be generated through a very energy-inefficient nonlinear process. In this paper a new technique that generates sets of photons from pulsed sources is described and an experimental system demonstrating this technique is presented. This technique of generating photon sets from pulsed sources has the distinct advantage that it is much more flexible and energy efficient, and it is well suited to take advantage of the very high energy efficiencies that are possible when using nano-scale sources. For these reasons the communication system presented in this paper is well suited for use in very small, low-power landers and rovers. This paper addresses a very low power optical communications system for miniature robots as small as 1 cu cm. The communication system is a variant of photon-counting communications: instead of counting individual photons, the system counts only the arrival of time-coincident sets of photons. Using sets of photons significantly decreases the bit error rate because they are highly identifiable in the presence of ambient light.
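
The set-counting idea can be sketched directly: register a bit only when several detections fall inside one narrow coincidence window, which random background photons almost never do. The rates, window, and slot times below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)
window = 1e-9                # 1 ns coincidence window
k_required = 3               # photons per transmitted set

# Background: ~1e6 counts/s of ambient photons over a 1 ms observation.
background = np.sort(rng.uniform(0.0, 1e-3, 1000))
# Signal: five sets of three near-simultaneous photons at known slot times.
slots = np.array([0.1, 0.3, 0.5, 0.7, 0.9]) * 1e-3
signal = np.concatenate([s + rng.uniform(0.0, window, k_required) for s in slots])
arrivals = np.sort(np.concatenate([background, signal]))

def coincidences(times, window, k):
    """Count bursts of at least k arrivals inside one coincidence window."""
    count, i, n = 0, 0, len(times)
    while i < n:
        j = i
        while j < n and times[j] - times[i] <= window:
            j += 1
        if j - i >= k:
            count += 1
            i = j                # consume the whole burst
        else:
            i += 1
    return count

print(coincidences(background, window, k_required))  # ambient light: essentially never
print(coincidences(arrivals, window, k_required))    # the transmitted sets stand out
```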

  10. Design and implementation of a mechanically heterogeneous robot group

    NASA Astrophysics Data System (ADS)

    Sukhatme, Gaurav S.; Montgomery, James F.; Mataric, Maja J.

    1999-08-01

    This paper describes the design and construction of a cooperative, heterogeneous robot group composed of one semi-autonomous aerial robot and two autonomous ground robots. The robots are designed to perform automated surveillance and reconnaissance of an urban outdoor area using onboard sensing. The ground vehicles have GPS, sonar for obstacle detection and avoidance, and a simple color-based vision system. Navigation is performed using an optimal mixture of odometry and GPS. The helicopter is equipped with a GPS/INS system, a camera, and a framegrabber. Each robot has an embedded 486 PC/104 processor running the QNX real-time operating system. Individual robot controllers are behavior-based and decentralized. We describe a control strategy and architecture that coordinates the robots with minimal top-down planning. The overall system is controlled at a high level by a single human operator using a specially designed control unit. The operator is able to task the group with a mission using a minimal amount of training. The group can re-task itself based on sensor inputs and can also be re-tasked by the operator. We describe a particular reconnaissance mission that the robots have been tested with, and lessons learned during the design and implementation. Our initial results with these experiments are encouraging given the challenging mechanics of the aerial robot. We conclude the paper with a discussion of ongoing and future work.
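
An "optimal mixture of odometry and GPS" can be sketched, for a single coordinate, as a scalar Kalman filter: odometry increments predict the position, and each GPS fix corrects it, weighted by the two noise variances. The variable names and variances below are illustrative assumptions, not values from the paper.

```python
def fuse_step(x, p, dx, q, z, r):
    """One predict/correct cycle of a scalar Kalman filter.
    x, p : position estimate and its variance
    dx, q: odometry increment and odometry noise variance
    z, r : GPS fix and GPS noise variance
    """
    x, p = x + dx, p + q        # predict: dead-reckon with odometry
    k = p / (p + r)             # Kalman gain: trust GPS more when p >> r
    return x + k * (z - x), (1.0 - k) * p

# One cycle: odometry says we moved 1.0 m, GPS reads 1.2 m.
x1, p1 = fuse_step(0.0, 1.0, dx=1.0, q=0.1, z=1.2, r=0.4)
# The fused estimate lands between the prediction (1.0) and the fix (1.2),
# and the variance shrinks after the correction.
```

The same predict/correct structure extends to 2D pose with a compass heading, which matches the closed-loop path-correction role described in the abstract.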

  11. Dissociated Emergent-Response System and Fine-Processing System in Human Neural Network and a Heuristic Neural Architecture for Autonomous Humanoid Robots

    PubMed Central

    Yan, Xiaodan

    2010-01-01

    The current study investigated the functional connectivity of the primary sensory system with resting state fMRI and applied such knowledge to the design of the neural architecture of autonomous humanoid robots. Correlation and Granger causality analyses were utilized to reveal the functional connectivity patterns. Dissociation was found within the primary sensory system, in that the olfactory cortex and the somatosensory cortex were strongly connected to the amygdala whereas the visual cortex and the auditory cortex were strongly connected with the frontal cortex. The posterior cingulate cortex (PCC) and the anterior cingulate cortex (ACC) were found to maintain constant communication with the primary sensory system, the frontal cortex, and the amygdala. Such neural architecture inspired the design of a dissociated emergent-response system and fine-processing system in autonomous humanoid robots, with separate processing units and a consolidation center to coordinate the two systems. Such a design can help autonomous robots detect and respond quickly to danger, so as to maintain their sustainability and independence. PMID:21331371

  13. Robotics

    SciTech Connect

    Scheide, A.W.

    1983-11-01

    This article reviews some of the technical areas and history associated with robotics, provides information relative to the formation of a Robotics Industry Committee within the Industry Applications Society (IAS), and describes how all activities relating to robotics will be coordinated within the IEEE. Industrial robots are being used for material handling, processes such as coating and arc welding, and some mechanical and electronics assembly. An industrial robot is defined as a programmable, multifunctional manipulator designed to move material, parts, tools, or specialized devices through variable programmed motions for a variety of tasks. The initial focus of the Robotics Industry Committee will be on the application of robotics systems to the various industries that are represented within the IAS.

  14. CYCLOPS: A mobile robotic platform for testing and validating image processing and autonomous navigation algorithms in support of artificial vision prostheses.

    PubMed

    Fink, Wolfgang; Tarbell, Mark A

    2009-12-01

    While artificial vision prostheses are quickly becoming a reality, actual testing time with visual prosthesis carriers is at a premium. Moreover, it is helpful to have a more realistic functional approximation of a blind subject. Instead of a normal subject with a healthy retina looking at a low-resolution (pixelated) image on a computer monitor or head-mounted display, a more realistic approximation is achieved by employing a subject-independent mobile robotic platform that uses a pixelated view as its sole visual input for navigation purposes. We introduce CYCLOPS: an all-wheel-drive (AWD), remotely controllable mobile robotic platform that serves as a testbed for real-time image processing and autonomous navigation systems for the purpose of enhancing the visual experience afforded by visual prosthesis carriers. Complete with wireless Internet connectivity and a fully articulated digital camera with wireless video link, CYCLOPS supports both interactive tele-commanding via joystick and autonomous self-commanding. Due to its onboard computing capabilities and extended battery life, CYCLOPS can perform complex and numerically intensive calculations, such as image processing and autonomous navigation algorithms, in addition to interfacing to additional sensors. Its Internet connectivity renders CYCLOPS a worldwide accessible testbed for researchers in the field of artificial vision systems. CYCLOPS enables subject-independent evaluation and validation of image processing and autonomous navigation systems with respect to the utility and efficiency of supporting and enhancing visual prostheses, while potentially reducing to a necessary minimum the need for valuable testing time with actual visual prosthesis carriers.

  15. Autonomous Marine Robotic Technology Reveals an Expansive Benthic Bacterial Community Relevant to Regional Nitrogen Biogeochemistry.

    PubMed

    Valentine, David L; Fisher, G Burch; Pizarro, Oscar; Kaiser, Carl L; Yoerger, Dana; Breier, John A; Tarn, Jonathan

    2016-10-06

    Benthic accumulations of filamentous, mat-forming bacteria occur throughout the oceans where bisulfide mingles with oxygen or nitrate, providing key but poorly quantified linkages between elemental cycles of carbon, nitrogen and sulfur. Here we used the autonomous underwater vehicle Sentry to conduct a contiguous, 12.5 km photoimaging survey of sea-floor colonies of filamentous bacteria between 80 and 579 m water depth, spanning the continental shelf to the deep suboxic waters of the Santa Barbara Basin (SBB). The survey provided >31,000 images and revealed contiguous, white-colored bacterial colonization coating >∼80% of the ocean floor and spanning over 1.6 km, between 487 and 523 m water depth. Based on their localization within the stratified waters of the SBB we hypothesize a dynamic and annular biogeochemical zonation by which the bacteria capitalize on periodic flushing events to accumulate and utilize nitrate. Oceanographic time series data bracket the imaging survey and indicate rapid and contemporaneous nitrate loss, while autonomous capture of microbial communities from the benthic boundary layer concurrent with imaging provides possible identities for the responsible bacteria. Based on these observations we explore the ecological context of such mats and their possible importance in the nitrogen cycle of the SBB.

  16. Compact 3D lidar based on optically coupled horizontal and vertical scanning mechanism for the autonomous navigation of robots

    NASA Astrophysics Data System (ADS)

    Lee, Min-Gu; Baeg, Seung-Ho; Lee, Ki-Min; Lee, Hae-Seok; Baeg, Moon-Hong; Park, Jong-Ok; Kim, Hong-Ki

    2011-06-01

    The purpose of this research is to develop a new 3D LIDAR sensor, named KIDAR-B25, for measuring 3D image information with high range accuracy, high speed and compact size. To measure the distance to a target object, we developed a range measurement unit implemented by the direct Time-Of-Flight (TOF) method using a TDC chip, a pulsed laser transmitter as the illumination source (pulse width: 10 ns, wavelength: 905 nm, repetition rate: 30 kHz, peak power: 20 W), and an Si APD receiver with high sensitivity and wide bandwidth. We also devised a horizontal and vertical scanning mechanism, climbing in a spiral and coupled with the laser optical path. In addition, control electronics such as the motor controller, the signal processing unit, and the power distributor were developed and integrated in a compact assembly. The key point of the 3D LIDAR design proposed in this paper is the compact scanning mechanism, which is coupled with the optical module both horizontally and vertically. The KIDAR-B25 uses the same beam-propagation axis for the emitted laser pulse and the received reflection, with no optical interference between the two. The scanning performance of the KIDAR-B25 has been proven with stable operation up to 20 Hz (vertical) and 40 Hz (horizontal), reaching maximum speed in about 1.7 s. The vertical scan covers a +/-10 degree field of view (FOV) with 0.25 degree angular resolution, and the full 360 degree horizontal plane is covered with 0.125 degree angular resolution. Since the KIDAR-B25 sensor was planned and developed for use in mobile robots for navigation, we conducted an outdoor test to evaluate its performance. The experimental results show that the captured 3D imaging data are useful for robot navigation, detecting and avoiding moving objects in real time.
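
The direct-TOF principle behind the range unit is simple: the one-way range is half the round-trip light travel time, and each range sample plus the two scan angles defines a 3D point. A minimal sketch of that conversion (illustrative only, not the KIDAR-B25 firmware):

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_to_range(t_round_trip):
    """Round-trip time-of-flight (s) to one-way range (m)."""
    return C * t_round_trip / 2.0

def scan_to_xyz(rng, azimuth_deg, elevation_deg):
    """Range plus horizontal (azimuth) and vertical (elevation)
    scan angles -> Cartesian point in the sensor frame."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    return (rng * math.cos(el) * math.cos(az),
            rng * math.cos(el) * math.sin(az),
            rng * math.sin(el))

# A 1 microsecond round trip corresponds to roughly 150 m of range.
r = tof_to_range(1e-6)
point = scan_to_xyz(r, azimuth_deg=90.0, elevation_deg=0.0)
```

The sub-nanosecond resolution of a TDC chip matters because 1 ns of timing error already corresponds to about 15 cm of range error under this formula.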

  17. Vertical stream curricula integration of problem-based learning using an autonomous vacuum robot in a mechatronics course

    NASA Astrophysics Data System (ADS)

    Chin, Cheng; Yue, Keng

    2011-10-01

    Difficulties in teaching a multi-disciplinary subject such as the mechatronics system design module in the Department of Mechatronics Engineering at Temasek Polytechnic arise from the gap in experience and skill among staff and students, who have different backgrounds in mechanical, computer and electrical engineering. The department piloted a new vertical stream curricula model (VSCAM) to enhance student learning in mechatronics system design through integration of educational activities from the first to the second year of the course. In this case study, a problem-based learning (PBL) method built around an autonomous vacuum robot in the mechatronics system design module was proposed to give the students hands-on experience in mechatronics system design. The PBL activities consisted of seminar sessions, weekly assignments, and a project presentation, providing holistic assessment of teamwork and individual contributions. At the end of VSCAM, an integrative evaluation was conducted using confidence logs, attitude surveys and questionnaires. It was found that the activities were well received by the participating staff and students. Hence, PBL can serve as an effective pedagogical framework for teaching multidisciplinary subjects in mechatronics engineering education when adequate guidance and support are given to staff and students.

  18. A novel autonomous, bioinspired swimming robot developed by neuroscientists and bioengineers.

    PubMed

    Stefanini, C; Orofino, S; Manfredi, L; Mintchev, S; Marrazza, S; Assaf, T; Capantini, L; Sinibaldi, E; Grillner, S; Wallén, P; Dario, P

    2012-06-01

    This paper describes the development of a new biorobotic platform inspired by the lamprey. Design, fabrication and implemented control are all based on biomechanical and neuroscientific findings on this eel-like fish. The lamprey model has been extensively studied and characterized in recent years because it possesses all basic functions and control mechanisms of higher vertebrates, while at the same time having fewer neurons and simplified neural structures. The untethered robot has a flexible body driven by compliant actuators with proprioceptive feedback. It also has binocular vision for vision-based navigation. The platform has been successfully and extensively tested in aquatic environments, has high energy efficiency, and is ready to be used as an investigation tool for high-level motor tasks.

  19. Design of a dynamic test platform for autonomous robot vision systems

    NASA Technical Reports Server (NTRS)

    Rich, G. C.

    1980-01-01

    The concept and design of a dynamic test platform for development and evaluation of a robot vision system is discussed. The platform is to serve as a diagnostic and developmental tool for future work with the RPI Mars Rover's multi laser/multi detector vision system. The platform allows testing of the vision system while its attitude is varied, statically or periodically. The vision system is mounted on the test platform. It can then be subjected to a wide variety of simulated Rover motions and thus be examined in a controlled, quantitative fashion. Defining and modeling Rover motions and designing the platform to emulate these motions are also discussed. Individual aspects of the design process are treated separately, including structure, driving linkages, and motors and transmissions.

  20. Semi-autonomous robots for reactor containments. Annual summary report, 1993--1994

    SciTech Connect

    Not Available

    1994-05-06

    During 1993, the activity at the University was split into two primary groups. One group provided direct support for the development and testing of the RVIR vehicle. This effort culminated in a demonstration of the vehicle at ORNL during December. The second group of researchers focused attention on pushing the technology forward in the areas of radiation imaging, navigation, and sensing modalities. A major effort in technology transfer took place during this year. All of these efforts are reflected in the periodic progress reports, which are attached. During 1994, our attention will change from the Nuclear Energy program to the Environmental Restoration and Waste Management office. The immediate needs of the Robotics Technology Development Program within the Office of Technology Development of EM drove this change in target applications. The University will be working closely with the national laboratories to further develop and transfer existing technologies to mobile platforms which are currently being designed and employed in seriously hazardous environments.

  1. The Summer Robotic Autonomy Course

    NASA Technical Reports Server (NTRS)

    Nourbakhsh, Illah R.

    2002-01-01

    We offered a first Robotic Autonomy course this summer, located at NASA/Ames' new NASA Research Park, for approximately 30 high school students. In this 7-week course, students worked in ten teams to build and then program advanced autonomous robots capable of visual processing and high-speed wireless communication. The course made use of challenge-based curricula, culminating each week with a Wednesday Challenge Day and a Friday Exhibition and Contest Day. Robotic Autonomy provided a comprehensive grounding in elementary robotics, including basic electronics, electronics evaluation, microprocessor programming, real-time control, and robot mechanics and kinematics. Our course then continued the educational process by introducing higher-level perception, action and autonomy topics, including teleoperation, visual servoing, intelligent scheduling and planning and cooperative problem-solving. We were able to deliver such a comprehensive, high-level education in robotic autonomy for two reasons. First, the content resulted from close collaboration between the CMU Robotics Institute and researchers in the Information Sciences and Technology Directorate and various education program/project managers at NASA/Ames. This collaboration produced not only educational content, but will also be central to the conduct of formative and summative evaluations of the course for further refinement. Second, CMU rapid prototyping skills as well as the PI's low-overhead perception and locomotion research projects enabled design and delivery of affordable robot kits with unprecedented sensory-locomotory capability. Each Trikebot robot was capable of both indoor locomotion and high-speed outdoor motion and was equipped with a high-speed vision system coupled to a low-cost pan/tilt head. As planned, following the completion of Robotic Autonomy, each student took home an autonomous, competent robot.
This robot is the student's to keep, as she explores robotics with an extremely capable tool in the

  3. Nasa's Ant-Inspired Swarmie Robots

    NASA Technical Reports Server (NTRS)

    Leucht, Kurt W.

    2016-01-01

    As humans push further beyond the grasp of Earth, robotic missions in advance of human missions will play an increasingly important role. These robotic systems will find and retrieve valuable resources as part of an in-situ resource utilization (ISRU) strategy. They will need to be highly autonomous while maintaining high task performance levels. NASA Kennedy Space Center has teamed up with the Biological Computation Lab at the University of New Mexico to create a swarm of small, low-cost, autonomous robots to be used as a ground-based research platform for ISRU missions. The behavior of the robot swarm mimics the central-place foraging strategy of ants to find and collect resources in a previously unmapped environment and return those resources to a central site. This talk will guide the audience through the Swarmie robot project from its conception by students in a New Mexico research lab to its robot trials in an outdoor parking lot at NASA. The software technologies and techniques used on the project will be discussed, as well as various challenges and solutions that were encountered by the development team along the way.

  4. Stations Outdoors

    ERIC Educational Resources Information Center

    Madison, John P.; And Others

    1976-01-01

    Described is a program of outdoor education utilizing activity-oriented learning stations. Described are 13 activities including: a pond study, orienteering, nature crafts, outdoor mathematics, linear distance measurement, and area measurement. (SL)

  5. Automatic learning rate adjustment for self-supervising autonomous robot control

    NASA Technical Reports Server (NTRS)

    Arras, Michael K.; Protzel, Peter W.; Palumbo, Daniel L.

    1992-01-01

    Described is an application in which an Artificial Neural Network (ANN) controls the positioning of a robot arm with five degrees of freedom by using visual feedback provided by two cameras. This application and the specific ANN model, local linear maps, are based on the work of Ritter, Martinetz, and Schulten. We extended their approach by generating a filtered, average positioning error from the continuous camera feedback and by coupling the learning rate to this error. When the network learns to position the arm, the positioning error decreases and so does the learning rate until the system stabilizes at a minimum error and learning rate. This eliminates the need for a predetermined cooling schedule. The automatic cooling procedure results in a closed-loop control with no distinction between a learning phase and a production phase. If the positioning error suddenly starts to increase due to an internal failure such as a broken joint, or an environmental change such as a camera moving, the learning rate increases accordingly. Thus, learning is automatically activated and the network adapts to the new condition, after which the error decreases again and learning is 'shut off'. The automatic cooling is therefore a prerequisite for the autonomy and the fault tolerance of the system.
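
The error-coupled learning rate can be reproduced in miniature with a scalar estimator standing in for the arm controller. The filter constant and gain below are illustrative assumptions; the point is that the rate decays as the filtered error decays and re-opens automatically when the environment changes, with no cooling schedule:

```python
target, w = 2.0, 0.0           # "camera feedback" target and learned value
err_avg = abs(target - w)      # low-pass-filtered positioning error
rates = []
for step in range(200):
    if step == 100:
        target = -1.0          # simulated failure / environmental change
    err = target - w
    err_avg = 0.9 * err_avg + 0.1 * abs(err)   # filtered average error
    eta = min(0.5, 0.5 * err_avg)              # learning rate tracks error
    w += eta * err                             # one learning update
    rates.append(eta)
```

By step 99 the rate has cooled to near zero (learning "shut off"); the jump in target at step 100 drives the filtered error, and hence the rate, back up until the estimator re-converges.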

  6. LandingNav: a precision autonomous landing sensor for robotic platforms on planetary bodies

    NASA Astrophysics Data System (ADS)

    Katake, Anup; Bruccoleri, Christian; Singla, Puneet; Junkins, John L.

    2010-01-01

    Increased interest in the exploration of extraterrestrial planetary bodies calls for an increase in the number of spacecraft landing on remote planetary surfaces. Currently, imaging and radar based surveys are used to determine regions of interest and a safe landing zone. The purpose of this paper is to introduce LandingNav, a sensor system solution for autonomous landing on planetary bodies that enables landing on unknown terrain. LandingNav is based on a novel multiple field of view imaging system that leverages the integration of different state-of-the-art technologies for feature detection, tracking, and 3D dense stereo map creation. In this paper we present the test flight results of the LandingNav system prototype. Sources of errors due to hardware limitations and processing algorithms were identified and will be discussed. This paper also shows that addressing the issues identified during the post-flight test data analysis will reduce the error to 1-2%, thus providing a high-precision 3D range map sensor system.

  7. Image processing for navigation on a mobile embedded platform: design of an autonomous mobile robot

    NASA Astrophysics Data System (ADS)

    Loose, Harald; Lemke, Christiane; Papazov, Chavdar

    2006-02-01

    This paper deals with intelligent mobile platforms connected to a camera controlled by a small hardware platform called RCUBE. This platform is able to provide features of a typical actuator-sensor board with various inputs and outputs as well as computing power and image recognition capabilities. Several intelligent autonomous RCUBE devices can be equipped and programmed to participate in the BOSPORUS network. These components form an intelligent network for gathering sensor and image data, sensor data fusion, navigation and control of mobile platforms. The RCUBE platform provides a standalone solution for image processing, which will be explained and presented. It plays a major role for several components in a reference implementation of the BOSPORUS system. On the one hand, intelligent cameras will be positioned in the environment, analyzing the events from a fixed point of view and sharing their perceptions with other components in the system. On the other hand, image processing results will contribute to a reliable navigation of a mobile system, which is crucially important. Fixed landmarks and other objects appropriate for determining the position of a mobile system can be recognized. For navigation, other methods are added, e.g. GPS calculations and odometry.

  8. Effects of robot-driven gait orthosis treadmill training on the autonomic response in rehabilitation-responsive stroke and cervical spondylotic myelopathy patients.

    PubMed

    Magagnin, Valentina; Bo, Ivano; Turiel, Maurizio; Fornari, Maurizio; Caiani, Enrico G; Porta, Alberto

    2010-06-01

    Body weight supported treadmill training (BWSTT) assisted by a robotic-driven gait orthosis is utilized in the rehabilitation of individuals with lost motor skills. A typical rehabilitation session included: sitting, standing, suspension, robotic-assisted walking at 1.5 and 2.5 km/h with 50% body weight support, and recovery. While the effects of robotic-assisted BWSTT on motor performance have been studied in depth, its influence on cardiovascular control is still unknown. The aim of the study was to evaluate in stroke (ST) and cervical spondylotic myelopathy (CSM) patients: (1) the autonomic response during a traditional robotic-assisted BWSTT session of motor rehabilitation; (2) the effects of 30 daily sessions of BWSTT on cardiovascular regulation. The autonomic response was assessed through symbolic analysis of short-term heart rate variability in 11 pathologic subjects (5 ST and 6 CSM patients) whose motor skills improved as a result of the rehabilitation therapy. Results showed variable individual responses to the rehabilitation session in ST patients at the beginning of the therapy. At the end of the rehabilitation process, the responses of ST patients were less variable and more similar to those previously observed in healthy subjects. CSM patients exhibited an exaggerated vagal response to the fastest walking phase during the first rehabilitative session. This abnormal response was limited after the last rehabilitative session. We conclude that robotic-assisted BWSTT is helpful in restoring cardiovascular control in rehabilitation-responsive ST patients and limiting vagal responses in rehabilitation-responsive CSM patients.
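
Symbolic analysis of short-term heart rate variability, as used here, typically quantizes the beat-to-beat (RR) interval series into a few uniform levels and classifies each three-beat pattern by its number of variations. The sketch below follows the common 0V/1V/2LV/2UV scheme with six levels; the study's exact implementation details are assumed, not quoted:

```python
def symbolic_hrv(rr, levels=6):
    """Quantize an RR-interval series into `levels` uniform bins and
    classify each 3-beat pattern: 0V no variation, 1V one variation,
    2LV two like variations, 2UV two unlike variations."""
    lo, hi = min(rr), max(rr)
    span = (hi - lo) or 1.0
    sym = [min(int((x - lo) / span * levels), levels - 1) for x in rr]
    counts = {"0V": 0, "1V": 0, "2LV": 0, "2UV": 0}
    for a, b, c in zip(sym, sym[1:], sym[2:]):
        d1, d2 = b - a, c - b
        if d1 == 0 and d2 == 0:
            counts["0V"] += 1      # flat pattern
        elif d1 == 0 or d2 == 0:
            counts["1V"] += 1      # one step up or down
        elif d1 * d2 > 0:
            counts["2LV"] += 1     # monotone ramp
        else:
            counts["2UV"] += 1     # peak or valley
    return counts
```

In this framework a higher fraction of 0V patterns is usually read as a sympathetic marker and a higher fraction of 2UV patterns as a vagal marker, which is how session-to-session shifts in autonomic response can be compared.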

  9. Conversion and control of an all-terrain vehicle for use as an autonomous mobile robot

    NASA Astrophysics Data System (ADS)

    Jacob, John S.; Gunderson, Robert W.; Fullmer, R. R.

    1998-08-01

    A systematic approach to ground vehicle automation is presented, combining low-level controls, trajectory generation and closed-loop path correction in an integrated system. Development of cooperative robotics for precision agriculture at Utah State University required the automation of a full-scale motorized vehicle. The Triton Predator 8-wheeled skid-steering all-terrain vehicle was selected for the project based on its ability to maneuver precisely and the simplicity of controlling the hydrostatic drivetrain. Low-level control was achieved by fitting an actuator on the engine throttle, actuators for the left and right drive controls, encoders on the left and right drive shafts to measure wheel speeds, and a signal pick-off on the alternator for measuring engine speed. Closed-loop control maintains a desired engine speed and tracks left and right wheel speed commands. A trajectory generator produces the wheel speed commands needed to steer the vehicle through a predetermined set of map coordinates. A planar trajectory through the points is computed by fitting a 2D cubic spline over each path segment while enforcing initial and final orientation constraints at segment endpoints. Acceleration and velocity profiles are computed for each trajectory segment, with the velocity over each segment dependent on turning radius. Left and right wheel speed setpoints are obtained by combining velocity and path curvature at each low-level timestep. The path correction algorithm uses GPS position and compass orientation information to adjust the wheel speed setpoints according to the 'crosstrack' and 'downtrack' errors and heading error. Nonlinear models of the engine and the skid-steering vehicle/ground interaction were developed for testing the integrated system in simulation. These tests led to several key design improvements which assisted final implementation on the vehicle.
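
The last step of the trajectory generator, combining segment velocity and path curvature into per-wheel setpoints, reduces to the standard differential-drive relation. The track width and sign convention below are assumptions for illustration, not the Predator's actual geometry:

```python
def wheel_setpoints(v, kappa, track_width):
    """Left/right wheel speed setpoints (m/s) for a skid-steer vehicle
    commanded to move at speed v (m/s) along a path of curvature
    kappa (1/m, positive = left turn)."""
    v_left = v * (1.0 - kappa * track_width / 2.0)
    v_right = v * (1.0 + kappa * track_width / 2.0)
    return v_left, v_right

straight = wheel_setpoints(2.0, 0.0, 1.2)         # equal speeds
left_turn = wheel_setpoints(2.0, 1.0 / 5.0, 1.2)  # 5 m radius: outer wheel faster
```

The mean of the two setpoints always equals the commanded speed, and their difference grows with curvature, which is why the velocity profile over a segment is made dependent on turning radius.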

  10. A multimodal interface for real-time soldier-robot teaming

    NASA Astrophysics Data System (ADS)

    Barber, Daniel J.; Howard, Thomas M.; Walter, Matthew R.

    2016-05-01

    Recent research and advances in robotics have led to the development of novel platforms leveraging new sensing capabilities for semantic navigation. As these systems become increasingly robust, they support highly complex commands beyond direct teleoperation and waypoint finding, facilitating a transition away from robots as tools toward robots as teammates. Supporting future Soldier-Robot teaming requires communication capabilities on par with human-human teams for successful integration of robots. Therefore, as robots increase in functionality, it is equally important that the interface between the Soldier and the robot advances as well. Multimodal communication (MMC) enables human-robot teaming through redundancy and levels of communication more robust than single-mode interaction. Commercial-off-the-shelf (COTS) technologies released in recent years for smart-phones and gaming provide tools for the creation of portable interfaces incorporating MMC through the use of speech, gestures, and visual displays. However, for multimodal interfaces to be successfully used in the military domain, they must be able to classify speech and gestures and process natural language in real time with high accuracy. For the present study, a prototype multimodal interface supporting real-time interactions with an autonomous robot was developed. This device integrated COTS Automated Speech Recognition (ASR), a custom gesture recognition glove, and natural language understanding on a tablet. This paper presents performance results (e.g. response times, accuracy) of the integrated device when commanding an autonomous robot to perform reconnaissance and surveillance activities in an unknown outdoor environment.

  11. Sample Return Robot Centennial Challenge

    NASA Image and Video Library

    2012-06-16

    A judge for the NASA-WPI Sample Return Robot Centennial Challenge follows a robot on the playing field during the challenge on Saturday, June 16, 2012 in Worcester, Mass. Teams were challenged to build autonomous robots that can identify, collect and return samples. NASA needs autonomous robotic capability for future planetary exploration. Photo Credit: (NASA/Bill Ingalls)

  12. Atypical autonomic dysreflexia during robotic-assisted body weight supported treadmill training in an individual with motor incomplete spinal cord injury.

    PubMed

    Geigle, Paula R; Frye, Sara Kate; Perreault, John; Scott, William H; Gorman, Peter H

    2013-03-01

    A 41-year-old man with a history of C6 American Spinal Injury Association (ASIA) Impairment Scale (AIS) C spinal cord injury (SCI), enrolled in an Institutional Review Board (IRB)-approved research protocol of robotic-assisted body weight-supported treadmill training (BWSTT) and aquatic exercise, developed asymptomatic autonomic dysreflexia (AD) during training. Little information is available regarding the relationship between robotic-assisted BWSTT and AD. After successfully completing 36 sessions of aquatic exercise, he reported exertional fatigue during his 10th Lokomat intervention and exhibited asymptomatic, or silent, AD during this and the three subsequent BWSTT sessions. Standard facilitators of AD were assessed and no obvious irritant was identified other than the physical exertion and positioning required during robotic-assisted BWSTT. Increased awareness of potential silent AD presenting during robotic-assisted BWSTT for individuals with motor incomplete SCI is required, as in this case the clinical signs of AD were not accompanied by symptoms. Frequent vital sign assessment before, during, and at the conclusion of each BWSTT session is strongly recommended.

  13. Outdoor allergens.

    PubMed Central

    Burge, H A; Rogers, C A

    2000-01-01

    Outdoor allergens are an important part of the exposures that lead to allergic disease. Understanding the role of outdoor allergens requires knowledge of the nature of outdoor allergen-bearing particles, the distribution of their sources, and the nature of the aerosols (particle types, sizes, and dynamics of concentrations). Primary sources for outdoor allergens include vascular plants (pollen, fern spores, soy dust) and fungi (spores, hyphae). Nonvascular plants, algae, and arthropods contribute small numbers of allergen-bearing particles. Particles are released from sources into the air by wind, rain, mechanical disturbance, or active discharge mechanisms. Once airborne, they follow the physical laws that apply to all airborne particles. Although some outdoor allergens penetrate indoor spaces, exposure occurs mostly outdoors. Even short-term peak outdoor exposures can be important in eliciting acute symptoms. Monitoring of airborne biological particles is usually performed by particle impaction and microscopic examination. Centrally located monitoring stations give regional-scale measurements of aeroallergen levels. Evidence for the role of outdoor allergens in allergic rhinitis is strong and is rapidly increasing for a role in asthma. Pollen and fungal spore exposures have both been implicated in acute exacerbations of asthma, and sensitivity to some fungal spores predicts the existence of asthma. Synergism and/or antagonism probably occurs with other outdoor air particles and gases. Control involves avoidance of exposure (staying indoors, preventing entry of outdoor aerosols) as well as immunotherapy, which is effective for pollen but of limited effect for spores. Outdoor allergens have been the subject of only limited studies with respect to the epidemiology of asthma. Much remains to be studied with respect to prevalence patterns, exposure-disease relationships, and control. PMID:10931783

  14. Performance of a scanning laser line striper in outdoor lighting

    NASA Astrophysics Data System (ADS)

    Mertz, Christoph

    2013-05-01

    For search and rescue robots and reconnaissance robots it is important to detect objects in their vicinity. We have developed a scanning laser line striper that can produce dense 3D images using active illumination. The scanner consists of a camera and a MEMS micro-mirror based projector. It can also detect the presence of optically difficult materials like glass and metal. The sensor can be used for autonomous operation, or it can help a human operator to better remotely control the robot. In this paper we evaluate the performance of the scanner under outdoor illumination, i.e., from operating in the shade to operating in full sunlight. We report the range, resolution, and accuracy of the sensor and its ability to reconstruct objects like grass, wooden blocks, wires, metal objects, electronic devices like cell phones, blank RPGs, and other inert explosive devices. Furthermore, we evaluate its ability to detect the presence of glass and polished metal objects. Lastly, we report on a user study that shows a significant improvement in a grasping task. The user is tasked with grasping a wire with the remotely controlled hand of a robot. We compare the time it takes to complete the task using the 3D scanner against using a traditional video camera.
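
    The geometry behind a camera-plus-projector line striper is classical laser triangulation: the camera ray through a lit pixel is intersected with the known plane of projected light. The sketch below illustrates that intersection under a simplified pinhole model; the coordinate conventions and parameter values are assumptions for illustration, not details from the paper.

    ```python
    import math

    def triangulate_depth(u_px, f_px, baseline, plane_angle):
        """Depth of a laser-line pixel via camera/projector triangulation.

        u_px:        horizontal pixel offset from the camera's optical center
        f_px:        camera focal length in pixels
        baseline:    camera-to-projector separation along x (meters)
        plane_angle: tilt of the projected light plane from the optical axis (rad)

        The camera ray satisfies x = z * u_px / f_px; the light plane satisfies
        x = baseline + z * tan(plane_angle). Solving for z gives the depth.
        """
        denom = u_px / f_px - math.tan(plane_angle)
        if abs(denom) < 1e-9:
            return float('inf')  # ray (nearly) parallel to the light plane
        return baseline / denom
    ```

    Because depth varies inversely with the pixel offset, range resolution degrades quadratically with distance, which is one reason such sensors are characterized carefully over their working range.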

  15. Outdoor Mathematics

    ERIC Educational Resources Information Center

    Kennard, Jackie

    2007-01-01

    One of the most interesting developments in teaching has been the growing importance of the outdoor environment. Whether it be playground, garden or field, the outdoors offers a range of challenging experiences, especially in the delivery of early mathematics. Oral feedback to parents, together with photographic displays, can show them that…

  16. OUTDOOR EDUCATION.

    ERIC Educational Resources Information Center

    SMITH, JULIAN W.; AND OTHERS

    AN INTERDISCIPLINARY APPROACH IS USED TO RELATE A VARIETY OF CURRICULAR AREAS TO OUTDOOR EDUCATION. THE ROLE OF FEDERAL, STATE, AND VOLUNTARY ORGANIZATIONS IN PROMOTING EDUCATIONAL AND RECREATIONAL USES OF PUBLIC LANDS AND FACILITIES IS DISCUSSED. SUGGESTIONS ARE OFFERED FOR THE TRAINING OF PERSONNEL TO LEAD THE OUTDOOR EDUCATION PROGRAMS OF OUR…

  18. Outdoor Classrooms

    ERIC Educational Resources Information Center

    Mayes, Valynda

    2010-01-01

    An outdoor classroom is the ideal vehicle for community involvement: Parents, native plant societies, 4-H, garden clubs, and master naturalists are all resources waiting to be tapped, as are local businesses offering support. If you enlist your community in the development and maintenance of your outdoor classroom, the entire community will…

  20. Development of dog-like retrieving capability in a ground robot

    NASA Astrophysics Data System (ADS)

    MacKenzie, Douglas C.; Ashok, Rahul; Rehg, James M.; Witus, Gary

    2013-01-01

    This paper presents the Mobile Intelligence Team's approach to the CANINE outdoor ground robot competition. The competition required developing a robot that provided retrieving capabilities similar to a dog's, while operating fully autonomously in unstructured environments. The vision team consisted of Mobile Intelligence, the Georgia Institute of Technology, and Wayne State University. Important computer vision aspects of the project were the ability to quickly learn the distinguishing characteristics of novel objects, searching images for the object as the robot drove a search pattern, identifying people near the robot for safe operation, correctly identifying the object among distractors, and localizing the object for retrieval. The classifier used to identify the objects is discussed, including an analysis of its performance, and an overview of the entire system architecture is presented. A discussion of the robot's performance in the competition demonstrates the system's successes in real-world testing.
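
    One common way to "quickly learn the distinguishing characteristics of a novel object" from a few example pixels is an appearance model such as a normalized color histogram, matched against candidate image regions by histogram intersection. This is a generic sketch of that family of techniques, not the competition team's actual classifier; the bin count and hue range (OpenCV-style 0-180) are assumptions.

    ```python
    import numpy as np

    def hue_histogram(pixels_hsv, bins=16):
        """Normalized hue histogram of an object's pixels (N x 3 HSV array,
        hue in [0, 180)). Serves as a quick-to-learn appearance model."""
        hist, _ = np.histogram(pixels_hsv[:, 0], bins=bins, range=(0, 180))
        total = hist.sum()
        return hist / total if total > 0 else hist.astype(float)

    def histogram_match(model_hist, candidate_hist):
        """Histogram intersection score in [0, 1]; higher means the
        candidate region's colors better match the learned model."""
        return float(np.minimum(model_hist, candidate_hist).sum())
    ```

    During the search pattern, each candidate region's histogram would be scored against the model, with distractors rejected by a match threshold; richer features would normally supplement color to disambiguate similar-colored distractors.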