Science.gov

Sample records for outdoor autonomous robots

  1. Vision-based semi-autonomous outdoor robot system to reduce soldier workload

    NASA Astrophysics Data System (ADS)

    Richardson, Al; Rodgers, Michael H.

    2001-09-01

    Sensors and computational capability have not reached the point of enabling small robots to navigate autonomously in unconstrained outdoor environments at tactically useful speeds. This problem is greatly reduced, however, if a soldier can lead the robot through terrain that he knows it can traverse. An application of this concept is a small pack-mule robot that follows a foot soldier over outdoor terrain. The soldier would be responsible for avoiding situations beyond the robot's limitations as they are encountered. Having learned the route, the robot could autonomously retrace the path carrying supplies and munitions. This would greatly reduce the soldier's workload under normal conditions. This paper presents a description of a developmental robot sensor system using low-cost commercial 3D vision and inertial sensors to address this application. The robot moves at fast walking speed and requires only short-range perception to accomplish its task. 3D-feature information is recorded on a composite route map that the robot uses to negotiate its local environment and retrace the path taught by the soldier leader.

  2. Autonomous robot using infrared thermal camera to discriminate objects in outdoor scene

    NASA Technical Reports Server (NTRS)

    Caillas, C.

    1990-01-01

    A complete autonomous legged robot is being designed at Carnegie Mellon University to perform planetary exploration without human supervision. This robot must traverse unknown and geographically diverse areas in order to collect samples of materials. This paper describes how thermal imaging can be used to identify materials in order to find good footfall positions and collection sites of material. First, a model developed for determining the temperature of materials in an outdoor scene is presented. By applying this model, it is shown that it is possible to determine a physical characteristic of the material: thermal inertia. Second, experimental results are described that consist of recording thermal images of an outdoor scene composed of sand and rock. Third, results and limitations of applying the model to experimental images are analyzed. Finally, the paper analyzes how basic segmentation algorithms can be combined with the thermal inertia segmentation in order to improve the discrimination of different kinds of materials.
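
    The record above discriminates materials by estimating thermal inertia from outdoor thermal imagery: materials with higher thermal inertia (e.g. rock) show a smaller diurnal surface-temperature swing than materials with lower thermal inertia (e.g. sand). The Python sketch below illustrates only that underlying intuition with an inverse-amplitude index and a made-up threshold; it is not the temperature model used in the paper.

        import numpy as np

        def relative_thermal_inertia(frames):
            """frames: array (n_times, h, w) of surface temperatures [K] over one day.
            Returns a per-pixel index that grows as the diurnal swing shrinks,
            i.e. larger values suggest higher thermal inertia (e.g. rock vs sand)."""
            amplitude = frames.max(axis=0) - frames.min(axis=0)   # diurnal swing per pixel
            return 1.0 / np.maximum(amplitude, 1e-3)              # inverse swing as a crude index

        def segment_by_inertia(index, threshold):
            """Binary segmentation: True where the index exceeds a user-chosen threshold."""
            return index > threshold

        if __name__ == "__main__":
            # Synthetic scene: left half 'sand' (large swing), right half 'rock' (small swing).
            t = np.linspace(0, 2 * np.pi, 24)
            sand = 300 + 20 * np.sin(t)[:, None, None] * np.ones((24, 32, 16))
            rock = 300 + 5 * np.sin(t)[:, None, None] * np.ones((24, 32, 16))
            frames = np.concatenate([sand, rock], axis=2)
            idx = relative_thermal_inertia(frames)
            mask = segment_by_inertia(idx, threshold=0.05)
            print("fraction flagged as high inertia:", mask.mean())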

  3. An adaptive localization system for outdoor/indoor navigation for autonomous robots

    NASA Astrophysics Data System (ADS)

    Pacis, E. B.; Sights, B.; Ahuja, G.; Kogut, G.; Everett, H. R.

    2006-05-01

    Many envisioned applications of mobile robotic systems require the robot to navigate in complex urban environments. This need is particularly critical if the robot is to perform as part of a synergistic team with human forces in military operations. Historically, the development of autonomous navigation for mobile robots has targeted either outdoor or indoor scenarios, but not both, which is not how humans operate. This paper describes efforts to fuse component technologies into a complete navigation system, allowing a robot to seamlessly transition between outdoor and indoor environments. Under the Joint Robotics Program's Technology Transfer project, empirical evaluations of various localization approaches were conducted to assess their maturity levels and performance metrics in different exterior/interior settings. The methodologies compared include Markov localization, global positioning system, Kalman filtering, and fuzzy logic. Characterization of these technologies highlighted their best features, which were then fused into an adaptive solution. The final integrated system is described, including its design, experimental results, and a formal demonstration to attendees of the Unmanned Systems Capabilities Conference II in San Diego in December 2005.
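
    One ingredient of the adaptive solution above is Kalman filtering fused with GPS, which is available outdoors but not indoors. The minimal 1-D sketch below illustrates that idea by gating the GPS correction on availability and otherwise coasting on odometry; the noise values and constant-velocity model are illustrative assumptions, not the system described in the paper.

        import numpy as np

        def kalman_step(x, P, u, z_gps, dt=0.1, q=0.05, r_gps=2.0):
            """One predict/update cycle for a 1-D position estimate.
            x, P  : state estimate and variance
            u     : odometry velocity [m/s] used for prediction
            z_gps : GPS position measurement, or None when indoors/unavailable."""
            # Predict with odometry (constant-velocity motion model).
            x = x + u * dt
            P = P + q
            # Correct with GPS only when a fix is available.
            if z_gps is not None:
                K = P / (P + r_gps)          # Kalman gain
                x = x + K * (z_gps - x)      # blend prediction and measurement
                P = (1.0 - K) * P
            return x, P

        if __name__ == "__main__":
            x, P = 0.0, 1.0
            rng = np.random.default_rng(0)
            true_pos = 0.0
            for step in range(100):
                true_pos += 1.0 * 0.1                       # robot drives at 1 m/s
                outdoors = step < 50                        # loses GPS halfway (goes indoors)
                z = true_pos + rng.normal(0, 1.0) if outdoors else None
                x, P = kalman_step(x, P, u=1.0, z_gps=z)
            print(f"estimate={x:.2f}, truth={true_pos:.2f}, variance={P:.2f}")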

  4. DEMONSTRATION OF AUTONOMOUS AIR MONITORING THROUGH ROBOTICS

    EPA Science Inventory

    This project included modifying an existing teleoperated robot to include autonomous navigation, large object avoidance, and air monitoring and demonstrating that prototype robot system in indoor and outdoor environments. An existing teleoperated "Surveyor" robot developed by ARD...

  5. Robotic Lander Completes Multiple Outdoor Flights

    NASA Video Gallery

    NASA’s Robotic Lander Development Project in Huntsville, Ala., has successfully completed seven autonomous outdoor flight tests of a lander prototype, dubbed Mighty Eagle. On Oct. 14, Mighty Eagl...

  6. Autonomous mobile robot teams

    NASA Technical Reports Server (NTRS)

    Agah, Arvin; Bekey, George A.

    1994-01-01

    This paper describes autonomous mobile robot teams performing tasks in unstructured environments. The behavior and the intelligence of the group are distributed, and the system does not include a central command base or leader. The novel concept of the Tropism-Based Cognitive Architecture is introduced, which is used by the robots to produce behavior that transforms their sensory information into appropriate action. The results of a number of simulation experiments are presented. These experiments include worlds where the robot teams must locate, decompose, and gather objects, and defend themselves against hostile predators, while navigating around stationary and mobile obstacles.

  7. Demonstration of autonomous air monitoring through robotics

    SciTech Connect

    Rancatore, R.

    1989-11-01

    The project included modifying an existing teleoperated robot to include autonomous navigation, large object avoidance, and air monitoring, and demonstrating that prototype robot system in indoor and outdoor environments. The robot was also modified to carry a HNU PI-101 Photoionization Detector air monitoring device. A sonar range finder, which already was an integral part of the Surveyor, was repositioned to the front of the robot chassis to detect large obstacles in the path of the robot. In addition, the software of the onboard computer was extensively modified to provide: navigation control, dynamic steering to smoothly follow the wire-course without hesitation, obstacle avoidance, autonomous shutdown and remote reporting of toxic substance detection.

  8. Autonomous mobile robots: Vehicles with cognitive control

    SciTech Connect

    Meystel, A.

    1987-01-01

    This book explores a new, rapidly developing area of robotics. It describes the state of the art in intelligent control and applied machine intelligence, and the research and initial stages of manufacturing of autonomous mobile robots. A complete account of the theoretical and experimental results obtained during the last two decades, together with some generalizations on Autonomous Mobile Systems, is included in this book. Contents: Introduction; Requirements and Specifications; State-of-the-art in Autonomous Mobile Robots Area; Structure of Intelligent Mobile Autonomous System; Planner; Navigator; Pilot; Cartographer; Actuation Control; Computer Simulation of Autonomous Operation; Testing the Autonomous Mobile Robot; Conclusions; Bibliography.

  9. Miniature Autonomous Robotic Vehicle (MARV)

    SciTech Connect

    Feddema, J.T.; Kwok, K.S.; Driessen, B.J.; Spletzer, B.L.; Weber, T.M.

    1996-12-31

    Sandia National Laboratories (SNL) has recently developed a 16 cm³ (1 in³) autonomous robotic vehicle which is capable of tracking a single conducting wire carrying a 96 kHz signal. This vehicle was developed to assess the limiting factors in using commercial technology to build miniature autonomous vehicles. Particular attention was paid to the design of the control system to search out the wire, track it, and recover if the wire was lost. This paper describes the test vehicle and the control analysis. Presented in the paper are the vehicle model, control laws, a stability analysis, simulation studies and experimental results.
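
    The MARV control system must acquire the buried wire, track it, and recover when the signal is lost. The sketch below illustrates one common wire-guidance scheme, not necessarily the control laws of the paper: the normalized difference of two pickup-coil amplitudes approximates lateral offset from the wire, a PD law steers to null it, and a fixed search turn is commanded when the signal drops out. All gains and thresholds are made up for illustration.

        def wire_tracking_steer(left_amp, right_amp, prev_offset, dt,
                                kp=1.5, kd=0.01, signal_floor=0.05):
            """Steering command from two pickup-coil amplitudes sensing the wire's field.
            Positive steer turns right. Returns (steer, offset, mode)."""
            total = left_amp + right_amp
            if total < signal_floor:                          # wire lost: slow search turn
                return 0.5, prev_offset, "search"
            offset = (right_amp - left_amp) / total           # > 0: wire lies to the vehicle's right
            d_offset = (offset - prev_offset) / dt
            steer = kp * offset + kd * d_offset               # PD law steers toward the wire
            return steer, offset, "track"

        if __name__ == "__main__":
            prev = 0.0
            # Two successive sensor readings as the vehicle converges back onto the wire.
            for left, right in [(0.30, 0.70), (0.42, 0.58)]:
                steer, prev, mode = wire_tracking_steer(left, right, prev, dt=0.02)
                print(mode, "steer =", round(steer, 2))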

  10. A power autonomous monopedal robot

    NASA Astrophysics Data System (ADS)

    Krupp, Benjamin T.; Pratt, Jerry E.

    2006-05-01

    We present the design and initial results of a power-autonomous planar monopedal robot. The robot is a gasoline powered, two degree of freedom robot that runs in a circle, constrained by a boom. The robot uses hydraulic Series Elastic Actuators, force-controllable actuators which provide high force fidelity, moderate bandwidth, and low impedance. The actuators are mounted in the body of the robot, with cable drives transmitting power to the hip and knee joints of the leg. A two-stroke, gasoline engine drives a constant displacement pump which pressurizes an accumulator. Absolute position and spring deflection of each of the Series Elastic Actuators are measured using linear encoders. The spring deflection is translated into force output and compared to desired force in a closed loop force-control algorithm implemented in software. The output signal of each force controller drives high performance servo valves which control flow to each of the pistons of the actuators. In designing the robot, we used a simulation-based iterative design approach. Preliminary estimates of the robot's physical parameters were based on past experience and used to create a physically realistic simulation model of the robot. Next, a control algorithm was implemented in simulation to produce planar hopping. Using the joint power requirements and range of motions from simulation, we worked backward specifying pulley diameter, piston diameter and stroke, hydraulic pressure and flow, servo valve flow and bandwidth, gear pump flow, and engine power requirements. Components that meet or exceed these specifications were chosen and integrated into the robot design. Using CAD software, we calculated the physical parameters of the robot design, replaced the original estimates with the CAD estimates, and produced new joint power requirements. We iterated on this process, resulting in a design which was prototyped and tested. The Monopod currently runs at approximately 1.2 m/s with the weight of all
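
    The monopod above measures the spring deflection of each Series Elastic Actuator with a linear encoder, converts it to force, and closes a force loop whose output drives the servo valves. A minimal sketch of such a closed-loop force controller is given below; the spring constant, gains, valve saturation, and toy plant dynamics are placeholders, not the values used on the robot.

        def sea_force_controller(desired_force, deflection, integral, dt,
                                 k_spring=20000.0, kp=0.002, ki=0.0005, valve_limit=1.0):
            """One cycle of force control for a Series Elastic Actuator.
            deflection : measured spring deflection [m] from the linear encoder
            Returns (valve_command, updated_integral)."""
            measured_force = k_spring * deflection          # Hooke's law: deflection -> force
            error = desired_force - measured_force
            integral += error * dt
            command = kp * error + ki * integral            # PI law on the force error
            command = max(-valve_limit, min(valve_limit, command))  # servo-valve saturation
            return command, integral

        if __name__ == "__main__":
            integral = 0.0
            deflection = 0.0
            for _ in range(200):                            # crude plant: deflection follows the valve
                cmd, integral = sea_force_controller(500.0, deflection, integral, dt=0.001)
                deflection += 0.01 * cmd                    # toy actuator dynamics
            print("steady-state force [N]:", round(20000.0 * deflection, 1))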

  11. Autonomous Robotic Inspection in Tunnels

    NASA Astrophysics Data System (ADS)

    Protopapadakis, E.; Stentoumis, C.; Doulamis, N.; Doulamis, A.; Loupos, K.; Makantasis, K.; Kopsiaftis, G.; Amditis, A.

    2016-06-01

    In this paper, an automatic robotic inspector for tunnel assessment is presented. The proposed platform is able to autonomously navigate within civil infrastructure, grab stereo images and process/analyse them in order to identify defect types. First, cracks are detected via deep learning approaches. Then, a detailed 3D model of the cracked area is created using photogrammetric methods. Finally, laser profiling of the tunnel's lining is performed for a narrow region close to the detected crack, allowing for the deduction of potential deformations. The robotic platform consists of an autonomous mobile vehicle and a crane arm, guided by the computer vision-based crack detector, carrying ultrasound sensors, the stereo cameras and the laser scanner. Visual inspection is based on convolutional neural networks, which support the creation of high-level discriminative features for complex non-linear pattern classification. Then, real-time 3D information is accurately calculated and the crack position and orientation are passed to the robotic platform. The entire system has been evaluated in railway and road tunnels, i.e. in the Egnatia Highway and London Underground infrastructure.

  12. Spatial abstraction for autonomous robot navigation.

    PubMed

    Epstein, Susan L; Aroor, Anoop; Evanusa, Matthew; Sklar, Elizabeth I; Parsons, Simon

    2015-09-01

    Optimal navigation for a simulated robot relies on a detailed map and explicit path planning, an approach problematic for real-world robots that are subject to noise and error. This paper reports on autonomous robots that rely on local spatial perception, learning, and commonsense rationales instead. Despite realistic actuator error, learned spatial abstractions form a model that supports effective travel. PMID:26227680

  13. Autonomous robot vision software design using Matlab toolboxes

    NASA Astrophysics Data System (ADS)

    Tedder, Maurice; Chung, Chan-Jin

    2004-10-01

    The purpose of this paper is to introduce a cost-effective way to design robot vision and control software using Matlab for an autonomous robot designed to compete in the 2004 Intelligent Ground Vehicle Competition (IGVC). The goal of the autonomous challenge event is for the robot to autonomously navigate an outdoor obstacle course bounded by solid and dashed lines on the ground. Visual input data is provided by a DV camcorder at 160 x 120 pixel resolution. The design of this system involved writing an image-processing algorithm using hue, saturation, and brightness (HSB) color filtering and Matlab image processing functions to extract the centroid, area, and orientation of the connected regions from the scene. These feature vectors are then mapped to linguistic variables that describe the objects in the world environment model. The linguistic variables act as inputs to a fuzzy logic controller designed using the Matlab fuzzy logic toolbox, which provides the knowledge and intelligence component necessary to achieve the desired goal. Java provides the central interface to the robot motion control and image acquisition components. Field test results indicate that the Matlab based solution allows for rapid software design, development and modification of our robot system.
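
    The feature-extraction step described above computes the centroid, area, and orientation of connected regions after HSB color filtering. The small sketch below shows that step with image moments on a binary mask; it uses NumPy rather than the Matlab toolboxes of the paper and assumes the color filtering has already produced the mask.

        import numpy as np

        def region_features(mask):
            """Centroid, area, and orientation of the 'on' pixels of a binary mask,
            computed from first and second image moments."""
            ys, xs = np.nonzero(mask)
            area = xs.size
            if area == 0:
                return None
            cx, cy = xs.mean(), ys.mean()                      # centroid
            mu20 = ((xs - cx) ** 2).mean()                     # central second moments
            mu02 = ((ys - cy) ** 2).mean()
            mu11 = ((xs - cx) * (ys - cy)).mean()
            orientation = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)   # major-axis angle [rad]
            return {"centroid": (cx, cy), "area": area, "orientation": orientation}

        if __name__ == "__main__":
            # Toy 160x120 frame with a tilted bright stripe standing in for a course line.
            mask = np.zeros((120, 160), dtype=bool)
            for x in range(40, 120):
                y = int(0.5 * x)                               # slope 0.5 line
                mask[y:y + 4, x] = True
            print(region_features(mask))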

  14. Tele-robotic/autonomous control using controlshell

    SciTech Connect

    Wilhelmsen, K.C.; Hurd, R.L.; Couture, S.

    1996-12-10

    A tele-robotic and autonomous controller architecture for waste handling and sorting has been developed which uses tele-robotics, autonomous grasping and image processing. As a starting point, prior work from LLNL and ORNL was restructured and ported to a special real-time development environment. Significant improvements in collision avoidance, force compliance, and shared control aspects were then developed. Several orders of magnitude improvement were made in some areas to meet the speed and robustness requirements of the application.

  15. Control algorithms for autonomous robot navigation

    SciTech Connect

    Jorgensen, C.C.

    1985-09-20

    This paper examines control algorithm requirements for autonomous robot navigation outside laboratory environments. Three aspects of navigation are considered: navigation control in explored terrain, environment interactions with robot sensors, and navigation control in unanticipated situations. Major navigation methods are presented and relevance of traditional human learning theory is discussed. A new navigation technique linking graph theory and incidental learning is introduced.

  16. Progress in outdoor navigation by the SAIL developmental robot

    NASA Astrophysics Data System (ADS)

    Zhang, Nan; Weng, John J.; Huang, Xiao

    2002-02-01

    A sensory mapping method, called Staggered Hierarchical Mapping (SHM), and its developmental algorithm are described in this paper. SHM is a model motivated by human early visual pathways including processing performed by the retina, Lateral Geniculate Nucleus (LGN) and the primary visual cortex. The work reported here concerns not only the design of such a series of processors but also their autonomous development. The primary goal is to address a long-standing open problem of visual information processing: processing elements dedicated to receptive fields of different retinal positions and different scales (sizes) must function concurrently, in robotic and other applications in unstructured environments. A new Incremental Principal Component Analysis (IPCA) method is used to automatically develop orientation sensitive and other needed filters. For fast convergence, the lateral inhibition of sensory neurons is modelled by what are called residual images. A set of staggered receptive fields models the pattern of positioning of processing cells. From sequentially sensed video frames, the proposed developmental algorithm develops a hierarchy of filters, whose outputs are uncorrelated within each layer, but with increasing scale of receptive fields from low to higher layers. To study the completeness of the representation generated by the SHM, we experimentally show that the response produced at any layer is sufficient to reconstruct the corresponding retinal image. As an application domain, we describe our preliminary experiments of autonomous navigation by the SAIL robot, and why a mapping like the SHM is needed in our next phase of work on vision-guided autonomous navigation in outdoor environments.
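
    The filters above are developed incrementally from streaming video using Incremental Principal Component Analysis. The sketch below shows the core of a candid covariance-free style IPCA update for the first principal component only, refining the running eigenvector estimate one sample at a time without storing the data; restricting to one component and the synthetic data are simplifications for illustration, not the SHM algorithm itself.

        import numpy as np

        def ipca_first_component(samples):
            """Incrementally estimate the first principal component of a zero-mean sample stream,
            in the spirit of candid covariance-free incremental PCA."""
            v = None
            for n, u in enumerate(samples, start=1):
                if v is None:
                    v = u.astype(float).copy()                # initialise with the first sample
                    continue
                w = (n - 1.0) / n
                v = w * v + (1.0 - w) * u * (u @ v) / np.linalg.norm(v)
            return v / np.linalg.norm(v)

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            direction = np.array([3.0, 1.0]) / np.sqrt(10.0)
            # Zero-mean samples stretched along 'direction', plus isotropic noise.
            data = rng.normal(size=(2000, 1)) * direction + rng.normal(scale=0.1, size=(2000, 2))
            v = ipca_first_component(data)
            print("estimated component (up to sign):", np.round(v, 2), " true:", np.round(direction, 2))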

  17. Tele/Autonomous Robot For Nuclear Facilities

    NASA Technical Reports Server (NTRS)

    Backes, Paul G.; Tso, Kam S.

    1994-01-01

    Fail-safe tele/autonomous robotic system makes it unnecessary for human technicians to enter nuclear-fuel-reprocessing facilities and other high-radiation or otherwise hazardous industrial environments. Used to carry out such experiments as exchanging equipment modules, turning bolts, cleaning surfaces, and grappling and turning objects by use of a mixture of autonomous actions and teleoperation with either a single arm or two cooperating arms. System capable of fully autonomous operation, teleoperation, or shared control.

  18. Automatic learning by an autonomous mobile robot

    SciTech Connect

    de Saussure, G.; Spelt, P.F.; Killough, S.M.; Pin, F.G.; Weisbin, C.R.

    1989-01-01

    This paper describes recent research in automatic learning by the autonomous mobile robot HERMIES-IIB at the Center for Engineering Systems Advanced Research (CESAR). By acting on the environment and observing the consequences during a set of training examples, the robot learns a sequence of successful manipulations on a simulated control panel. The robot learns to classify panel configurations in order to deal with new configurations that are not part of the original training set. 5 refs., 2 figs.

  19. Autonomous Student Experiences in Outdoor and Adventure Education

    ERIC Educational Resources Information Center

    Daniel, Brad; Bobilya, Andrew J.; Kalisch, Kenneth R.; McAvoy, Leo H.

    2014-01-01

    This article explores the current state of knowledge regarding the use of autonomous student experiences (ASE) in outdoor and adventure education (OAE) programs. ASE are defined as components (e.g., solo, final expedition) in which participants have a greater measure of choice and control over the planning, execution, and outcomes of their…

  20. Reference test courses for autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Jacoff, Adam; Messina, Elena; Evans, John

    2001-09-01

    One approach to measuring the performance of intelligent systems is to develop standardized or reproducible tests. These tests may be in a simulated environment or in a physical test course. The National Institute of Standards and Technology has developed a test course for evaluating the performance of mobile autonomous robots operating in an urban search and rescue mission. The test course is designed to simulate a collapsed building structure at various levels of fidelity. The course will be used in robotic competitions, such as the American Association for Artificial Intelligence (AAAI) Mobile Robot Competition and the RoboCup Rescue. Designed to be repeatable and highly reconfigurable, the test course challenges a robot's cognitive capabilities such as perception, knowledge representation, planning, autonomy and collaboration. The goal of the test course is to help define useful performance metrics for autonomous mobile robots which, if widely accepted, could accelerate development of advanced robotic capabilities by promoting the re-use of algorithms and system components. The course may also serve as a prototype for further development of performance testing environments which enable robot developers and purchasers to objectively evaluate robots for a particular application. In this paper we discuss performance metrics for autonomous mobile robots, the use of representative urban search and rescue scenarios as a challenge domain, and the design criteria for the test course.

  1. Mapping planetary caves with an autonomous, heterogeneous robot team

    NASA Astrophysics Data System (ADS)

    Husain, Ammar; Jones, Heather; Kannan, Balajee; Wong, Uland; Pimentel, Tiago; Tang, Sarah; Daftry, Shreyansh; Huber, Steven; Whittaker, William L.

    Caves on other planetary bodies offer sheltered habitat for future human explorers and numerous clues to a planet's past for scientists. While recent orbital imagery provides exciting new details about cave entrances on the Moon and Mars, the interiors of these caves are still unknown and not observable from orbit. Multi-robot teams offer unique solutions for exploration and modeling of subsurface voids during precursor missions. Robot teams that are diverse in terms of size, mobility, sensing, and capability can provide great advantages, but this diversity, coupled with inherently distinct low-level behavior architectures, makes coordination a challenge. This paper presents a framework that consists of an autonomous frontier and capability-based task generator, a distributed market-based strategy for coordinating and allocating tasks to the different team members, and a communication paradigm for seamless interaction between the different robots in the system. Robots have different sensors (in the representative robot team used for testing: 2D mapping sensors, 3D modeling sensors, or no exteroceptive sensors) and varying levels of mobility. Tasks are generated to explore, model, and take science samples. Based on an individual robot's capability and associated cost for executing a generated task, a robot is autonomously selected for task execution. The robots create coarse online maps and store collected data for high resolution offline modeling. The coordination approach has been field tested at a mock cave site with highly-unstructured natural terrain, as well as an outdoor patio area. Initial results are promising for applicability of the proposed multi-robot framework to exploration and modeling of planetary caves.
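
    The framework above allocates generated tasks with a market-based strategy: each task goes to the robot whose capabilities match it at the lowest cost. The compact single-auction sketch below illustrates capability-and-cost based allocation of that flavor; the robot names, capability labels, and straight-line travel cost are assumptions made up for illustration, not the paper's distributed protocol.

        import math

        ROBOTS = {
            "mapper2d":  {"pos": (0.0, 0.0), "capabilities": {"explore", "map2d"}},
            "modeler3d": {"pos": (5.0, 2.0), "capabilities": {"model3d"}},
            "sampler":   {"pos": (1.0, 8.0), "capabilities": {"sample", "explore"}},
        }

        def bid(robot, task):
            """A robot bids its travel cost, or infinity if it lacks the needed capability."""
            if task["needs"] not in robot["capabilities"]:
                return math.inf
            rx, ry = robot["pos"]
            tx, ty = task["pos"]
            return math.hypot(tx - rx, ty - ry)

        def allocate(tasks):
            """Greedy market: each task goes to the cheapest capable bidder."""
            assignment = {}
            for task in tasks:
                bids = {name: bid(r, task) for name, r in ROBOTS.items()}
                winner = min(bids, key=bids.get)
                assignment[task["id"]] = winner if bids[winner] < math.inf else None
            return assignment

        if __name__ == "__main__":
            tasks = [
                {"id": "frontier_A", "needs": "explore", "pos": (2.0, 1.0)},
                {"id": "void_model", "needs": "model3d", "pos": (6.0, 3.0)},
                {"id": "rock_sample", "needs": "sample", "pos": (1.5, 7.0)},
            ]
            print(allocate(tasks))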

  2. Development of autonomous grasping and navigating robot

    NASA Astrophysics Data System (ADS)

    Kudoh, Hiroyuki; Fujimoto, Keisuke; Nakayama, Yasuichi

    2015-01-01

    The ability to find and grasp target items in an unknown environment is important for working robots. We developed an autonomous navigating and grasping robot. The operations are locating a requested item, moving to where the item is placed, finding the item on a shelf or table, and picking the item up from the shelf or the table. To achieve these operations, we designed the robot with three functions: an autonomous navigating function that generates a map and a route in an unknown environment, an item position recognizing function, and a grasping function. We tested this robot in an unknown environment. It achieved a series of operations: moving to a destination, recognizing the positions of items on a shelf, picking up an item, placing it on a cart with its hand, and returning to the starting location. The results of this experiment show the applicability of reducing the workforce with robots.

  3. INL Autonomous Navigation System

    Energy Science and Technology Software Center (ESTSC)

    2005-03-30

    The INL Autonomous Navigation System provides instructions for autonomously navigating a robot. The system permits high-speed autonomous navigation including obstacle avoidance, waypoint navigation and path planning in both indoor and outdoor environments.

  4. Autonomous Navigation for Mobile Robots with Human-Robot Interaction

    NASA Astrophysics Data System (ADS)

    Ballantyne, James; Johns, Edward; Valibeik, Salman; Wong, Charence; Yang, Guang-Zhong

    Dynamic and complex indoor environments present a challenge for mobile robot navigation. The robot must be able to simultaneously map the environment, which often has repetitive features, whilst keeping track of its pose and location. This chapter introduces some of the key considerations for human guided navigation. Rather than letting the robot explore the environment fully autonomously, we consider the use of human guidance for progressively building up the environment map and establishing scene association, learning, as well as navigation and planning. After the guide has taken the robot through the environment and indicated the points of interest via hand gestures, the robot is then able to use the geometric map and scene descriptors captured during the tour to create a high-level plan for subsequent autonomous navigation within the environment. Issues related to gesture recognition, multi-cue integration, tracking, target pursuing, scene association and navigation planning are discussed.

  5. Autonomous Systems, Robotics, and Computing Systems Capability Roadmap: NRC Dialogue

    NASA Technical Reports Server (NTRS)

    Zornetzer, Steve; Gage, Douglas

    2005-01-01

    Contents include the following: Introduction. Process, Mission Drivers, Deliverables, and Interfaces. Autonomy. Crew-Centered and Remote Operations. Integrated Systems Health Management. Autonomous Vehicle Control. Autonomous Process Control. Robotics. Robotics for Solar System Exploration. Robotics for Lunar and Planetary Habitation. Robotics for In-Space Operations. Computing Systems. Conclusion.

  6. Diagnosing faults in autonomous robot plan execution

    NASA Technical Reports Server (NTRS)

    Lam, Raymond K.; Doshi, Rajkumar S.; Atkinson, David J.; Lawson, Denise M.

    1989-01-01

    A major requirement for an autonomous robot is the capability to diagnose faults during plan execution in an uncertain environment. Much diagnostic research concentrates only on hardware failures within an autonomous robot. Taking a different approach, the implementation of a Telerobot Diagnostic System that addresses, in addition to the hardware failures, failures caused by unexpected event changes in the environment or failures due to plan errors, is described. One feature of the system is the utilization of task-plan knowledge and context information to deduce fault symptoms. This forward deduction provides valuable information on past activities and the current expectations of a robotic event, both of which can guide the plan-execution inference process. The inference process adopts a model-based technique to recreate the plan-execution process and to confirm fault-source hypotheses. This technique allows the system to diagnose multiple faults due to either unexpected plan failures or hardware errors. This research initiates a major effort to investigate relationships between hardware faults and plan errors, relationships which were not addressed in the past. The results of this research will provide a clear understanding of how to generate a better task planner for an autonomous robot and how to recover the robot from faults in a critical environment.

  7. Diagnosing faults in autonomous robot plan execution

    NASA Technical Reports Server (NTRS)

    Lam, Raymond K.; Doshi, Rajkumar S.; Atkinson, David J.; Lawson, Denise M.

    1988-01-01

    A major requirement for an autonomous robot is the capability to diagnose faults during plan execution in an uncertain environment. Much diagnostic research concentrates only on hardware failures within an autonomous robot. Taking a different approach, the implementation of a Telerobot Diagnostic System that addresses, in addition to the hardware failures, failures caused by unexpected event changes in the environment or failures due to plan errors, is described. One feature of the system is the utilization of task-plan knowledge and context information to deduce fault symptoms. This forward deduction provides valuable information on past activities and the current expectations of a robotic event, both of which can guide the plan-execution inference process. The inference process adopts a model-based technique to recreate the plan-execution process and to confirm fault-source hypotheses. This technique allows the system to diagnose multiple faults due to either unexpected plan failures or hardware errors. This research initiates a major effort to investigate relationships between hardware faults and plan errors, relationships which were not addressed in the past. The results of this research will provide a clear understanding of how to generate a better task planner for an autonomous robot and how to recover the robot from faults in a critical environment.

  8. Supervised autonomous robotic soft tissue surgery.

    PubMed

    Shademan, Azad; Decker, Ryan S; Opfermann, Justin D; Leonard, Simon; Krieger, Axel; Kim, Peter C W

    2016-05-01

    The current paradigm of robot-assisted surgeries (RASs) depends entirely on an individual surgeon's manual capability. Autonomous robotic surgery, removing the surgeon's hands, promises enhanced efficacy, safety, and improved access to optimized surgical techniques. Surgeries involving soft tissue have not been performed autonomously because of technological limitations, including lack of vision systems that can distinguish and track the target tissues in dynamic surgical environments and lack of intelligent algorithms that can execute complex surgical tasks. We demonstrate in vivo supervised autonomous soft tissue surgery in an open surgical setting, enabled by a plenoptic three-dimensional and near-infrared fluorescent (NIRF) imaging system and an autonomous suturing algorithm. Inspired by the best human surgical practices, a computer program generates a plan to complete complex surgical tasks on deformable soft tissue, such as suturing and intestinal anastomosis. We compared metrics of anastomosis (including the consistency of suturing informed by the average suture spacing, the pressure at which the anastomosis leaked, the number of mistakes that required removing the needle from the tissue, completion time, and lumen reduction in intestinal anastomoses) between our supervised autonomous system, manual laparoscopic surgery, and clinically used RAS approaches. Despite dynamic scene changes and tissue movement during surgery, we demonstrate that the outcome of supervised autonomous procedures is superior to surgery performed by expert surgeons and RAS techniques in ex vivo porcine tissues and in living pigs. These results demonstrate the potential for autonomous robots to improve the efficacy, consistency, functional outcome, and accessibility of surgical techniques. PMID:27147588

  9. A mobile autonomous robot for radiological surveys

    SciTech Connect

    Dudar, A.M.; Wagner, D.G.; Teese, G.D.

    1992-01-01

    The Robotics Development Group at the Savannah River Site is developing an autonomous robot (SIMON) to perform radiological surveys of potentially contaminated floors. The robot scans floors at a speed of one inch per second and stops, sounds an alarm, and flashes lights when contamination in a certain area is detected. The contamination of interest here is primarily alpha and beta-gamma. The robot, a Cybermotion K2A base, is radio controlled, uses dead reckoning to determine vehicle position, and docks with a charging station to replenish its batteries and calibrate its position. It uses an ultrasonic ranging system for collision avoidance. In addition, two safety bumpers located in the front and the back of the robot will stop the robot's motion when they are depressed. Paths for the robot are preprogrammed and the robot's motion can be monitored on a remote screen which shows a graphical map of the environment. The radiation instrument being used is an Eberline RM22A monitor. This monitor is microcomputer based with a serial I/O interface for remote operation. Up to 30 detectors may be configured with the RM22A.

  10. A mobile autonomous robot for radiological surveys

    SciTech Connect

    Dudar, A.M.; Wagner, D.G.; Teese, G.D.

    1992-10-01

    The Robotics Development Group at the Savannah River Site is developing an autonomous robot (SIMON) to perform radiological surveys of potentially contaminated floors. The robot scans floors at a speed of one inch per second and stops, sounds an alarm, and flashes lights when contamination in a certain area is detected. The contamination of interest here is primarily alpha and beta-gamma. The robot, a Cybermotion K2A base, is radio controlled, uses dead reckoning to determine vehicle position, and docks with a charging station to replenish its batteries and calibrate its position. It uses an ultrasonic ranging system for collision avoidance. In addition, two safety bumpers located in the front and the back of the robot will stop the robot's motion when they are depressed. Paths for the robot are preprogrammed and the robot's motion can be monitored on a remote screen which shows a graphical map of the environment. The radiation instrument being used is an Eberline RM22A monitor. This monitor is microcomputer based with a serial I/O interface for remote operation. Up to 30 detectors may be configured with the RM22A.

  11. Autonomous mobile robot for radiologic surveys

    DOEpatents

    Dudar, Aed M.; Wagner, David G.; Teese, Gregory D.

    1994-01-01

    An apparatus for conducting radiologic surveys. The apparatus comprises in the main a robot capable of following a preprogrammed path through an area, a radiation monitor adapted to receive input from a radiation detector assembly, ultrasonic transducers for navigation and collision avoidance, and an on-board computer system including an integrator for interfacing the radiation monitor and the robot. Front and rear bumpers are attached to the robot by bumper mounts. The robot may be equipped with memory boards for the collection and storage of radiation survey information. The on-board computer system is connected to a remote host computer via a UHF radio link. The apparatus is powered by a rechargeable 24-volt DC battery, and is stored at a docking station when not in use and/or for recharging. A remote host computer contains a stored database defining paths between points in the area where the robot is to operate, including but not limited to the locations of walls, doors, stationary furniture and equipment, and sonic markers if used. When a program consisting of a series of paths is downloaded to the on-board computer system, the robot conducts a floor survey autonomously at any preselected rate. When the radiation monitor detects contamination, the robot resurveys the area at reduced speed and resumes its preprogrammed path if the contamination is not confirmed. If the contamination is confirmed, the robot stops and sounds an alarm.

  12. Autonomous mobile robot for radiologic surveys

    DOEpatents

    Dudar, A.M.; Wagner, D.G.; Teese, G.D.

    1994-06-28

    An apparatus is described for conducting radiologic surveys. The apparatus comprises in the main a robot capable of following a preprogrammed path through an area, a radiation monitor adapted to receive input from a radiation detector assembly, ultrasonic transducers for navigation and collision avoidance, and an on-board computer system including an integrator for interfacing the radiation monitor and the robot. Front and rear bumpers are attached to the robot by bumper mounts. The robot may be equipped with memory boards for the collection and storage of radiation survey information. The on-board computer system is connected to a remote host computer via a UHF radio link. The apparatus is powered by a rechargeable 24-volt DC battery, and is stored at a docking station when not in use and/or for recharging. A remote host computer contains a stored database defining paths between points in the area where the robot is to operate, including but not limited to the locations of walls, doors, stationary furniture and equipment, and sonic markers if used. When a program consisting of a series of paths is downloaded to the on-board computer system, the robot conducts a floor survey autonomously at any preselected rate. When the radiation monitor detects contamination, the robot resurveys the area at reduced speed and resumes its preprogrammed path if the contamination is not confirmed. If the contamination is confirmed, the robot stops and sounds an alarm. 5 figures.
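
    Both patent records above describe the same survey behavior: drive a preprogrammed path at a preselected rate, resurvey at reduced speed when the radiation monitor flags contamination, then either resume the path or stop and sound an alarm depending on whether the reading is confirmed. A small state-machine sketch of that logic follows; the state names, speeds, and single-reading confirmation test are assumptions for illustration only.

        NORMAL_SPEED = 1.0      # preselected survey rate (arbitrary units)
        RESURVEY_SPEED = 0.2    # reduced speed while re-checking a reading

        def survey_step(state, detector_reading, threshold=100.0):
            """Advance the survey state machine one control cycle.
            Returns (new_state, commanded_speed, alarm)."""
            if state == "SURVEY":
                if detector_reading > threshold:
                    return "RESURVEY", RESURVEY_SPEED, False     # slow down and re-check
                return "SURVEY", NORMAL_SPEED, False
            if state == "RESURVEY":
                if detector_reading > threshold:
                    return "ALARM", 0.0, True                    # confirmed: stop and sound alarm
                return "SURVEY", NORMAL_SPEED, False             # not confirmed: resume path
            return "ALARM", 0.0, True                            # latched until an operator resets

        if __name__ == "__main__":
            state = "SURVEY"
            for reading in [10, 150, 20, 30, 180, 220, 5]:
                state, speed, alarm = survey_step(state, reading)
                print(f"reading={reading:5.1f}  state={state:8s}  speed={speed}  alarm={alarm}")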

  13. Flocking algorithm for autonomous flying robots.

    PubMed

    Virágh, Csaba; Vásárhelyi, Gábor; Tarcai, Norbert; Szörényi, Tamás; Somorjai, Gergő; Nepusz, Tamás; Vicsek, Tamás

    2014-06-01

    Animal swarms displaying a variety of typical flocking patterns would not exist without the underlying safe, optimal and stable dynamics of the individuals. The emergence of these universal patterns can be efficiently reconstructed with agent-based models. If we want to reproduce these patterns with artificial systems, such as autonomous aerial robots, agent-based models can also be used in their control algorithms. However, finding the proper algorithms and thus understanding the essential characteristics of the emergent collective behaviour requires thorough and realistic modeling of the robot and also the environment. In this paper, we first present an abstract mathematical model of an autonomous flying robot. The model takes into account several realistic features, such as time delay and locality of communication, inaccuracy of the on-board sensors and inertial effects. We present two decentralized control algorithms. One is based on a simple self-propelled flocking model of animal collective motion, the other is a collective target tracking algorithm. Both algorithms contain a viscous friction-like term, which aligns the velocities of neighbouring agents parallel to each other. We show that this term can be essential for reducing the inherent instabilities of such a noisy and delayed realistic system. We discuss simulation results on the stability of the control algorithms, and perform real experiments to show the applicability of the algorithms on a group of autonomous quadcopters. In our case, bio-inspiration works in two ways. On the one hand, the whole idea of trying to build and control a swarm of robots comes from the observation that birds tend to flock to optimize their behaviour as a group. On the other hand, by using a realistic simulation framework and studying the group behaviour of autonomous robots we can learn about the major factors influencing the flight of bird flocks. PMID:24852272
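
    Both control algorithms in the record above include a viscous friction-like term that aligns each agent's velocity with its neighbours', which the authors report is essential for damping the instabilities of a noisy, delayed system. A minimal sketch of that alignment term for planar agents is shown below; the gain, neighbourhood radius, and the omission of delay, noise and the other flocking terms are simplifications, not the full model from the paper.

        import numpy as np

        def alignment_accel(positions, velocities, radius=5.0, c_frict=0.5):
            """Viscous friction-like alignment: each agent is accelerated toward the mean
            velocity of the neighbours found within 'radius', damping velocity differences."""
            n = len(positions)
            accel = np.zeros_like(velocities)
            for i in range(n):
                d = np.linalg.norm(positions - positions[i], axis=1)
                neighbours = (d < radius) & (d > 0)
                if neighbours.any():
                    accel[i] = c_frict * (velocities[neighbours].mean(axis=0) - velocities[i])
            return accel

        if __name__ == "__main__":
            rng = np.random.default_rng(2)
            pos = rng.uniform(0, 4, size=(10, 2))
            vel = rng.normal(0, 1, size=(10, 2))
            dt = 0.05
            for _ in range(300):                          # integrate the alignment dynamics
                vel += alignment_accel(pos, vel) * dt
                pos += vel * dt
            print("velocity spread after alignment:", np.round(vel.std(axis=0), 3))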

  14. Autonomous Mobile Robot That Can Read

    NASA Astrophysics Data System (ADS)

    Létourneau, Dominic; Michaud, François; Valin, Jean-Marc

    2004-12-01

    The ability to read would surely contribute to increased autonomy of mobile robots operating in the real world. The process seems fairly simple: the robot must be capable of acquiring an image of a message to read, extracting the characters, and recognizing them as symbols, characters, and words. Using an optical character recognition algorithm on a mobile robot, however, brings additional challenges: the robot has to control its position in the world and its pan-tilt-zoom camera to find textual messages to read, potentially having to compensate for its viewpoint of the message, and use the limited onboard processing capabilities to decode the message. The robot also has to deal with variations in lighting conditions. In this paper, we present our approach demonstrating that it is feasible for an autonomous mobile robot to read messages of specific colors and font in real-world conditions. We outline the constraints under which the approach works and present results obtained using a Pioneer 2 robot equipped with a Pentium 233 MHz and a Sony EVI-D30 pan-tilt-zoom camera.

  15. Embodied cognition for autonomous interactive robots.

    PubMed

    Hoffman, Guy

    2012-10-01

    In the past, notions of embodiment have been applied to robotics mainly in the realm of very simple robots, and supporting low-level mechanisms such as dynamics and navigation. In contrast, most human-like, interactive, and socially adept robotic systems turn away from embodiment and use amodal, symbolic, and modular approaches to cognition and interaction. At the same time, recent research in Embodied Cognition (EC) is spanning an increasing number of complex cognitive processes, including language, nonverbal communication, learning, and social behavior. This article suggests adopting a modern EC approach for autonomous robots interacting with humans. In particular, we present three core principles from EC that may be applicable to such robots: (a) modal perceptual representation, (b) action/perception and action/cognition integration, and (c) a simulation-based model of top-down perceptual biasing. We describe a computational framework based on these principles, and its implementation on two physical robots. This could provide a new paradigm for embodied human-robot interaction based on recent psychological and neurological findings. PMID:22893571

  16. Design of an autonomous exterior security robot

    NASA Technical Reports Server (NTRS)

    Myers, Scott D.

    1994-01-01

    This paper discusses the requirements and preliminary design of a robotic vehicle for performing autonomous exterior perimeter security patrols around warehouse areas, ammunition supply depots, and industrial parks for the U.S. Department of Defense. The preliminary design allows for the operation of up to eight vehicles in a six kilometer by six kilometer zone with autonomous navigation and obstacle avoidance. In addition to detection of crawling intruders at 100 meters, the system must perform real-time inventory checking and database comparisons using a microwave tags system.

  17. Autonomous mobile robot research using the HERMIES-III robot

    SciTech Connect

    Pin, F.G.; Beckerman, M.; Spelt, P.F.; Robinson, J.T.; Weisbin, C.R.

    1989-01-01

    This paper reports on the status and future directions in the research, development and experimental validation of intelligent control techniques for autonomous mobile robots using the HERMIES-III robot at the Center for Engineering Systems Advanced Research (CESAR) at Oak Ridge National Laboratory (ORNL). HERMIES-III is the fourth robot in a series of increasingly more sophisticated and capable experimental test beds developed at CESAR. HERMIES-III is comprised of a battery powered, omni-directional wheeled platform with a seven degree-of-freedom manipulator arm, video cameras, sonar range sensors, laser imaging scanner and a dual computer system containing up to 128 NCUBE nodes in hypercube configuration. All electronics, sensors, computers, and communication equipment required for autonomous operation of HERMIES-III are located on board along with sufficient battery power for three to four hours of operation. The paper first provides a more detailed description of the HERMIES-III characteristics, focusing on the new areas of research and demonstration now possible at CESAR with this new test-bed. The initial experimental program is then described with emphasis placed on autonomous performance of human-scale tasks (e.g., valve manipulation, use of tools), integration of a dexterous manipulator and platform motion in geometrically complex environments, and effective use of multiple cooperating robots (HERMIES-IIB and HERMIES-III). The paper concludes with a discussion of the integration problems and safety considerations necessarily arising from the set-up of an experimental program involving the performance of human-scale tasks by multiple autonomous mobile robots. 10 refs., 3 figs.

  18. An architecture for an autonomous learning robot

    NASA Technical Reports Server (NTRS)

    Tillotson, Brian

    1988-01-01

    An autonomous learning device must solve the example bounding problem, i.e., it must divide the continuous universe into discrete examples from which to learn. We describe an architecture which incorporates an example bounder for learning. The architecture is implemented in the GPAL program. An example run with a real mobile robot shows that the program learns and uses new causal, qualitative, and quantitative relationships.

  19. Development of Virtual Robot Based on Autonomous Behavior Acquisition

    NASA Astrophysics Data System (ADS)

    Yamamoto, Masahito; Iwadate, Kenji; Ooe, Ryosuke; Suzuki, Ikuo; Furukawa, Masashi

    In this paper, we demonstrate a design of autonomous virtual robots and develop a design tool for them. A virtual robot can behave autonomously by using its own sensors and controllers in a three-dimensional, physically modeled environment. An approximate fluid environment model based on drag-force modeling is presented. The developed tool can simulate a physical environment at any time during the modeling process. A combined use of neural network controllers and an optimization method (genetic algorithm or particle swarm optimization) enables us to create autonomous behaviors of virtual robots.

  20. Artificial consciousness, artificial emotions, and autonomous robots.

    PubMed

    Cardon, Alain

    2006-12-01

    Nowadays for robots, the notion of behavior is reduced to a simple factual concept at the level of movements. Consciousness, on the other hand, is a deeply cultural concept, founding the main property that human beings ascribe to themselves. We propose to develop a computable transposition of consciousness concepts into artificial brains, able to express emotions and consciousness facts. The production of such artificial brains allows intentional and genuinely adaptive behavior for autonomous robots. Such a system managing the robot's behavior is made of two parts: the first computes and generates, in a constructivist manner, a representation of the robot moving in its environment, using symbols and concepts. The other part achieves the representation of the previous one using morphologies in a dynamic geometrical way. The robot's body will be seen, for itself, as the morphologic apprehension of its material substrata. The model relies strictly on the notion of massive multi-agent organizations with a morphologic control. PMID:17016730

  1. An autonomous vision-based mobile robot

    NASA Astrophysics Data System (ADS)

    Baumgartner, Eric Thomas

    This dissertation describes estimation and control methods for use in the development of an autonomous mobile robot for structured environments. The navigation of the mobile robot is based on precise estimates of the position and orientation of the robot within its environment. The extended Kalman filter algorithm is used to combine information from the robot's drive wheels with periodic observations of small, wall-mounted, visual cues to produce the precise position and orientation estimates. The visual cues are reliably detected by at least one video camera mounted on the mobile robot. Typical position estimates are accurate to within one inch. A path tracking algorithm is also developed to follow desired reference paths which are taught by a human operator. Because of the time-independence of the tracking algorithm, the speed that the vehicle travels along the reference path is specified independent from the tracking algorithm. The estimation and control methods have been applied successfully to two experimental vehicle systems. Finally, an analysis of the linearized closed-loop control system is performed to study the behavior and the stability of the system as a function of various control parameters.
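
    The dissertation above fuses drive-wheel odometry with periodic observations of wall-mounted visual cues through an extended Kalman filter. A compact planar EKF sketch in that spirit is given below, predicting the pose (x, y, heading) from odometry and correcting it with a range-bearing observation of a cue at a known position; the noise covariances, the unicycle model, and the range-bearing measurement model are illustrative assumptions rather than the dissertation's formulation.

        import numpy as np

        def ekf_predict(x, P, v, w, dt, Q):
            """Propagate pose [x, y, theta] with unicycle odometry (v: speed, w: turn rate)."""
            px, py, th = x
            x_new = np.array([px + v * dt * np.cos(th),
                              py + v * dt * np.sin(th),
                              th + w * dt])
            F = np.array([[1, 0, -v * dt * np.sin(th)],
                          [0, 1,  v * dt * np.cos(th)],
                          [0, 0, 1]])
            return x_new, F @ P @ F.T + Q

        def ekf_update_cue(x, P, z, cue, R):
            """Correct the pose with a range/bearing measurement z of a wall-mounted cue."""
            dx, dy = cue[0] - x[0], cue[1] - x[1]
            r = np.hypot(dx, dy)
            h = np.array([r, np.arctan2(dy, dx) - x[2]])             # predicted measurement
            H = np.array([[-dx / r, -dy / r, 0],
                          [dy / r**2, -dx / r**2, -1]])
            y = z - h
            y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi              # wrap bearing residual
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            return x + K @ y, (np.eye(3) - K @ H) @ P

        if __name__ == "__main__":
            x, P = np.zeros(3), np.eye(3) * 0.1
            Q = np.diag([0.01, 0.01, 0.005])
            R = np.diag([0.05, 0.02])
            cue = np.array([5.0, 2.0])                                # known cue position
            for _ in range(20):                                       # drive forward on odometry alone
                x, P = ekf_predict(x, P, v=0.2, w=0.0, dt=0.5, Q=Q)
            z = np.array([np.hypot(3.0, 2.0), np.arctan2(2.0, 3.0)])  # ideal observation from (2, 0)
            x, P = ekf_update_cue(x, P, z, cue, R)
            print("corrected pose:", np.round(x, 2))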

  2. PRIMUS: autonomous driving robot for military applications

    NASA Astrophysics Data System (ADS)

    Schwartz, Ingo

    2000-07-01

    This article describes the government experimental program PRIMUS (PRogram of Intelligent Mobile Unmanned Systems) and the results of phase C, demonstrated in summer 1999 on a military proving ground. The program aims to demonstrate autonomous driving of an unmanned robot in open terrain. The highest degree of autonomy possible with today's technology shall be reached in order to obtain a platform for different missions. The goal is to relieve the soldier of highly dangerous tasks, to increase performance, and to achieve a reduction of personnel and costs with unmanned systems. In phase C of the program two small tracked vehicles (Digitized Wiesel 2, air-transportable by CH53) are used, one as a robot vehicle and the other as a command & control system. The Wiesel 2 is configured as a drive-by-wire system and is therefore well suited for the adaptation of control computers. The autonomous detection and avoidance of obstacles in an unknown, non-cooperative environment is the main task. For navigation and orientation a sensor package is integrated. To detect obstacles, the scene in the driving corridor of the robot is scanned 4 times per second by a 3D range image camera (LADAR). The measured 3D range image is converted into a 2D obstacle map and used as input for the calculation of an obstacle-free path. The combination of local navigation (obstacle avoidance) and global navigation leads to collision-free driving in open terrain to a predefined goal point at a velocity of up to 25 km/h. A contour tracker with a TV camera as sensor is also implemented, which allows the robot to follow contours (e.g. the edge of a meadow) or to drive on paved or unpaved roads at velocities of up to 50 km/h. In addition to these autonomous driving modes, the operator in the command & control station can drive the robot by remote control. All functions were successfully demonstrated in summer 1999 on a military proving ground. During a mission example the robot vehicle covered a distance of several
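
    A key step in the record above is converting each LADAR 3D range image into a 2D obstacle map that feeds the obstacle-free path calculation. The sketch below shows one simple way to do that projection, binning scanner returns into grid cells and marking a cell as an obstacle when its height span exceeds a threshold; the cell size, extent, and threshold are illustrative assumptions, not PRIMUS parameters.

        import numpy as np

        def range_image_to_obstacle_map(points, cell=0.5, extent=20.0, height_thresh=0.3):
            """points: (N, 3) array of x, y, z scanner returns in the vehicle frame [m].
            Returns a boolean grid where True marks cells whose height span
            (max z - min z) exceeds 'height_thresh', i.e. likely obstacles."""
            n = int(extent / cell)
            zmin = np.full((n, n), np.inf)
            zmax = np.full((n, n), -np.inf)
            ix = np.clip((points[:, 0] / cell).astype(int), 0, n - 1)
            iy = np.clip(((points[:, 1] + extent / 2) / cell).astype(int), 0, n - 1)
            for i, j, z in zip(ix, iy, points[:, 2]):
                zmin[i, j] = min(zmin[i, j], z)
                zmax[i, j] = max(zmax[i, j], z)
            occupied = (zmax - zmin) > height_thresh
            occupied[np.isinf(zmin)] = False          # cells with no returns stay free/unknown
            return occupied

        if __name__ == "__main__":
            rng = np.random.default_rng(3)
            ground = np.c_[rng.uniform(0, 20, 5000), rng.uniform(-10, 10, 5000), rng.normal(0, 0.03, 5000)]
            rock = np.c_[rng.uniform(8, 9, 300), rng.uniform(1, 2, 300), rng.uniform(0, 0.8, 300)]
            grid = range_image_to_obstacle_map(np.vstack([ground, rock]))
            print("obstacle cells:", int(grid.sum()))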

  3. Autonomous Navigation by a Mobile Robot

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terrance; Aghazarian, Hrand

    2005-01-01

    ROAMAN is a computer program for autonomous navigation of a mobile robot on a long (as much as hundreds of meters) traversal of terrain. Developed for use aboard a robotic vehicle (rover) exploring the surface of a remote planet, ROAMAN could also be adapted to similar use on terrestrial mobile robots. ROAMAN implements a combination of algorithms for (1) long-range path planning based on images acquired by mast-mounted, wide-baseline stereoscopic cameras, and (2) local path planning based on images acquired by body-mounted, narrow-baseline stereoscopic cameras. The long-range path-planning algorithm autonomously generates a series of waypoints that are passed to the local path-planning algorithm, which plans obstacle-avoiding legs between the waypoints. Both the long- and short-range algorithms use an occupancy-grid representation in computations to detect obstacles and plan paths. Maps that are maintained by the long- and short-range portions of the software are not shared because substantial localization errors can accumulate during any long traverse. ROAMAN is not guaranteed to generate an optimal shortest path, but does maintain the safety of the rover.
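
    ROAMAN plans obstacle-avoiding legs between waypoints on an occupancy-grid representation. The sketch below shows one common way to plan on such a grid, a breadth-first search over free cells; it illustrates grid planning in general, not NASA's ROAMAN code, and the example grid and start/goal cells are made up.

        from collections import deque

        def grid_path(grid, start, goal):
            """Shortest 4-connected path over free cells (0) of an occupancy grid via BFS.
            Returns the list of cells from start to goal, or None if the goal is unreachable."""
            rows, cols = len(grid), len(grid[0])
            parents = {start: None}
            queue = deque([start])
            while queue:
                cell = queue.popleft()
                if cell == goal:                       # reconstruct the path by walking parents back
                    path = []
                    while cell is not None:
                        path.append(cell)
                        cell = parents[cell]
                    return path[::-1]
                r, c = cell
                for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                    if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and (nr, nc) not in parents:
                        parents[(nr, nc)] = cell
                        queue.append((nr, nc))
            return None

        if __name__ == "__main__":
            occupancy = [[0, 0, 0, 0, 0],
                         [0, 1, 1, 1, 0],
                         [0, 0, 0, 1, 0],
                         [1, 1, 0, 1, 0],
                         [0, 0, 0, 0, 0]]
            print(grid_path(occupancy, start=(0, 0), goal=(4, 4)))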

  4. Task-level control for autonomous robots

    NASA Technical Reports Server (NTRS)

    Simmons, Reid

    1994-01-01

    Task-level control refers to the integration and coordination of planning, perception, and real-time control to achieve given high-level goals. Autonomous mobile robots need task-level control to effectively achieve complex tasks in uncertain, dynamic environments. This paper describes the Task Control Architecture (TCA), an implemented system that provides commonly needed constructs for task-level control. Facilities provided by TCA include distributed communication, task decomposition and sequencing, resource management, monitoring and exception handling. TCA supports a design methodology in which robot systems are developed incrementally, starting first with deliberative plans that work in nominal situations, and then layering them with reactive behaviors that monitor plan execution and handle exceptions. To further support this approach, design and analysis tools are under development to provide ways of graphically viewing the system and validating its behavior.

  5. Development of autonomous eating mechanism for biomimetic robots

    NASA Astrophysics Data System (ADS)

    Jeong, Kil-Woong; Cho, Ik-Jin; Lee, Yun-Jung

    2005-12-01

    Most recently developed robots are human-friendly robots that imitate animals or humans, such as entertainment robots, biomimetic robots and humanoid robots. Interest in these robots is increasing because the social trend is focused on health, welfare, and the graying of society. Autonomous eating functionality is the most unique and inherent behavior of pets and animals. Most entertainment robots and pet robots make use of an internal battery. Entertainment robots and pet robots with an internal battery are not able to operate while the battery is charging. Therefore, if a robot has an autonomous function for eating batteries as its feed, the robot is not only able to operate while recharging but also becomes more human friendly, like a pet. Here, a new autonomous eating mechanism is introduced for a biomimetic robot, called ELIRO-II (Eating LIzard RObot version 2). The ELIRO-II is able to find food (a small battery), eat and evacuate by itself. This work describes the sub-parts of the developed mechanism, such as the head-part, mouth-part, and stomach-part. In addition, the control system of the autonomous eating mechanism is described.

  6. Autonomous biomorphic robots as platforms for sensors

    SciTech Connect

    Tilden, M.; Hasslacher, B.; Mainieri, R.; Moses, J.

    1996-10-01

    The idea of building autonomous robots that can carry out complex and nonrepetitive tasks is an old one, so far unrealized in any meaningful hardware. Tilden has shown recently that there are simple, processor-free solutions to building autonomous mobile machines that continuously adapt to unknown and hostile environments, are designed primarily to survive, and are extremely resistant to damage. These devices use smart mechanics and simple (low component count) electronic neuron control structures having the functionality of biological organisms from simple invertebrates to sophisticated members of the insect and crab family. These devices are paradigms for the development of autonomous machines that can carry out directed goals. The machine then becomes a robust survivalist platform that can carry sensors or instruments. These autonomous roving machines, now in an early stage of development (several proof-of-concept prototype walkers have been built), can be developed so that they are inexpensive, robust, and versatile carriers for a variety of instrument packages. Applications are immediate and many, in areas as diverse as prosthetics, medicine, space, construction, nanoscience, defense, remote sensing, environmental cleanup, and biotechnology.

  7. Computer vision for autonomous robotics in space

    NASA Astrophysics Data System (ADS)

    Wong, Andrew K. C.

    1993-08-01

    This paper presents a computer vision system being developed at the Pattern Analysis and Machine Intelligence (PAMI) Lab of the University of Waterloo and at the Vision, Intelligence and Robotics Technologies Corporation (VIRTEK) in support of the Canadian Space Autonomous Robotics Project. This system was originally developed for flexible manufacturing and guidance of autonomous roving vehicles. In the last few years, it has been engineered to support the operations of the Mobile Service System (MSS) (or its equivalence) for the Space Station Project. In the near term, this vision system will provide vision capability for the recognition, location and tracking of payloads as well as for relating the spatial information to the manipulator for capturing, manipulating and berthing payloads. In the long term, it will serve in the role of inspection, surveillance and servicing of the Station. Its technologies will be continually expanded and upgraded to meet the demand as the needs of the Space Station evolve and grow. Its spin-off technologies will benefit the industrial sectors as well.

  8. Radio Frequency Mapping using an Autonomous Robot: Application to the 2.4 GHz Band

    NASA Astrophysics Data System (ADS)

    Lebreton, J. M.; Murad, N. M.; Lorion, R.

    2016-03-01

    Radio signal strength measurement systems are essential for building a Radio Frequency (RF) map in indoor and outdoor environments for different application scenarios. This paper presents an autonomous robot that constructs a radio signal map by collecting useful information about all access point devices, together with data inherent to the robot, and forwarding it to the base station. A real scenario is considered by measuring the RF field from our department network. The consistency of the RF signal map is shown by fitting the measurements to a radio signal strength model over a two-dimensional area, and a path-loss exponent of 2.3 is estimated for the open-corridor environment.
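
    A path-loss exponent like the one reported above is typically obtained by fitting a log-distance model, RSSI(d) = P0 - 10 n log10(d/d0), to the collected signal-strength samples. The sketch below illustrates such a least-squares fit; the function name and toy measurements are assumptions made for illustration, not code or data from the paper.

```python
# Hypothetical sketch: fitting a log-distance path-loss model to RSSI samples.
# The measurement values below are illustrative, not taken from the study.
import numpy as np

def fit_path_loss(distances_m, rssi_dbm, d0=1.0):
    """Least-squares fit of RSSI(d) = P0 - 10*n*log10(d/d0).

    Returns (P0, n): reference power at d0 and the path-loss exponent.
    """
    x = -10.0 * np.log10(np.asarray(distances_m, dtype=float) / d0)
    y = np.asarray(rssi_dbm, dtype=float)
    A = np.column_stack([np.ones_like(x), x])      # columns: [1, -10*log10(d/d0)]
    (p0, n), *_ = np.linalg.lstsq(A, y, rcond=None)
    return p0, n

if __name__ == "__main__":
    d = [1, 2, 5, 10, 20, 40]                       # metres
    rssi = [-40, -47, -56, -63, -70, -77]           # dBm (toy data)
    p0, n = fit_path_loss(d, rssi)
    print(f"reference power {p0:.1f} dBm, path-loss exponent {n:.2f}")
```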

  9. Robotic technologies for outdoor industrial vehicles

    NASA Astrophysics Data System (ADS)

    Stentz, Anthony

    2001-09-01

    The commercial industries of agriculture, mining, construction, and material handling employ a wide variety of mobile machines, including tractors, combines, Load-Haul-Dump vehicles, trucks, paving machines, fork trucks, and many more. Automation of these vehicles promises to improve productivity, reduce operational costs, and increase safety. Since the vehicles typically operate in difficult environments, under all weather conditions, and in the presence of people and other obstacles, reliable automation faces severe technical challenges. Furthermore, the viable technology solutions are constrained by cost considerations. Fortunately, due to the limited application domain, repetitive nature, and the utility of partial automation for most tasks, robotics technologies can have a profound impact on industrial vehicles. In this paper, we describe a technical approach developed at Carnegie Mellon University for automating mobile machines in several applications, including mass excavation, mining, and agriculture. The approach is introduced via case studies, and the results are presented.

  10. Autonomous robot behavior based on neural networks

    NASA Astrophysics Data System (ADS)

    Grolinger, Katarina; Jerbic, Bojan; Vranjes, Bozo

    1997-04-01

    The purpose of an autonomous robot is to solve various tasks while adapting its behavior to a variable environment; it is expected to navigate much as a human would, including handling uncertain and unexpected obstacles. To achieve this, the robot must be able to find solutions to unknown situations, to learn from experience (that is, action procedures together with the corresponding knowledge of the workspace structure), and to recognize its working environment. The planning of intelligent robot behavior presented in this paper implements reinforcement learning based on strategic and random attempts at finding a solution, together with a neural network approach for memorizing and recognizing the workspace structure (the structural assignment problem). Several well-known neural networks based on unsupervised learning are considered with regard to the structural assignment problem, and an adaptive fuzzy shadowed neural network is developed. It has an additional shadowed hidden layer, a specific learning rule, and an initialization phase. The developed network combines the advantages of networks based on Adaptive Resonance Theory, and its shadowed hidden layer provides the ability to recognize obstacles that are slightly translated or rotated in any direction.

  11. Autonomous robot software development using simple software components

    NASA Astrophysics Data System (ADS)

    Burke, Thomas M.; Chung, Chan-Jin

    2004-10-01

    Developing software to control a sophisticated lane-following, obstacle-avoiding, autonomous robot can be demanding and beyond the capabilities of novice programmers - but it doesn't have to be. A creative software design utilizing only basic image processing and a little algebra has been employed to control the LTU-AISSIG autonomous robot - a contestant in the 2004 Intelligent Ground Vehicle Competition (IGVC). This paper presents a software design equivalent to that used during the IGVC, but with much of the complexity removed. The result is an autonomous robot software design that is robust, reliable, and can be implemented by programmers with a limited understanding of image processing. This design provides a solid basis for further work in autonomous robot software, as well as an interesting and achievable robotics project for students.
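
    A minimal sketch of the kind of "basic image processing plus a little algebra" lane follower the abstract alludes to: threshold bright lane markings in the lower half of the frame, take the column centroid, and steer proportionally toward it. This is not the LTU-AISSIG code; the threshold and gain values are illustrative assumptions.

```python
# Minimal lane-following sketch (not the LTU-AISSIG implementation):
# threshold, centroid, proportional steering. Parameters are illustrative.
import numpy as np

def steering_from_frame(gray, threshold=200, gain=0.01):
    """Return a steering command in [-1, 1] from a grayscale image (H x W)."""
    h, w = gray.shape
    roi = gray[h // 2:, :]                   # look only at the lower half
    ys, xs = np.nonzero(roi >= threshold)    # candidate lane-marking pixels
    if xs.size == 0:
        return 0.0                           # no lane seen: go straight
    error_px = xs.mean() - (w / 2.0)         # centroid offset from image centre
    return float(np.clip(gain * error_px, -1.0, 1.0))

if __name__ == "__main__":
    frame = np.zeros((240, 320), dtype=np.uint8)
    frame[120:, 200:210] = 255               # synthetic lane stripe right of centre
    print("steer command:", steering_from_frame(frame))
```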

  12. Object guided autonomous exploration for mobile robots in indoor environments

    NASA Astrophysics Data System (ADS)

    Nieto-Granda, Carlos; Choudhary, Siddarth; Rogers, John G.; Twigg, Jeff; Murali, Varun; Christensen, Henrik I.

    2014-06-01

    Autonomous mobile robotic teams are increasingly used in the exploration of indoor environments. Accurately modeling the world around the robot and describing the robot's interaction with the world greatly increases its ability to act autonomously. This paper demonstrates the ability of autonomous robotic teams to find objects of interest. A novel feature of our approach is object discovery and its use to augment the mapping and navigation process. The generated map can then be decomposed into semantic regions while also considering the distance and line of sight to anchor points. The advantage of this approach is that the robot can return a dense map of the region around an object of interest. The robustness of this approach is demonstrated in indoor environments with multiple platforms, with the objective of discovering objects of interest.

  13. Reactive navigational controller for autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Hawkins, Scott

    1993-12-01

    Autonomous mobile robots must respond to external challenges and threats in real time. One way to satisfy this requirement is to use a fast low level intelligence to react to local environment changes. A fast reactive controller has been implemented which performs the task of real time local navigation by integrating primitive elements of perception, planning, and control. Competing achievement and constraint behaviors are used to allow abstract qualitative specification of navigation goals. An interface is provided to allow a higher level deliberative intelligence with a more global perspective to set local goals for the reactive controller. The reactive controller's simplistic strategies may not always succeed, so a means to monitor and redirect the reactive controller is provided.
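
    As a sketch of how competing achievement and constraint behaviors might be blended into a single motion command, the toy below combines a goal-seeking behavior with an obstacle-avoidance behavior via a weighted vector sum. The behaviors, weights, and thresholds are illustrative assumptions, not the controller described in the paper.

```python
# Hedged sketch of behavior blending: an achievement behavior (seek the goal)
# and a constraint behavior (avoid nearby obstacles) each propose a heading and
# a weight; the arbiter takes the weighted vector sum. All values are made up.
import math

def seek_goal(pose, goal):
    """Achievement behavior: heading toward the goal, fixed weight 1."""
    x, y, _ = pose
    return math.atan2(goal[1] - y, goal[0] - x), 1.0

def avoid_obstacle(pose, obstacle, safe_dist=1.5):
    """Constraint behavior: heading away from an obstacle, weight grows as it nears."""
    x, y, _ = pose
    d = math.hypot(obstacle[0] - x, obstacle[1] - y)
    if d >= safe_dist:
        return 0.0, 0.0                       # inactive when far away
    away = math.atan2(y - obstacle[1], x - obstacle[0])
    return away, (safe_dist - d) / safe_dist  # weight in (0, 1]

def arbitrate(behaviors):
    """Blend active behaviors as a weighted vector sum of unit headings."""
    vx = sum(w * math.cos(h) for h, w in behaviors)
    vy = sum(w * math.sin(h) for h, w in behaviors)
    return math.atan2(vy, vx)

if __name__ == "__main__":
    pose, goal, obstacle = (0.0, 0.0, 0.0), (5.0, 0.0), (1.0, 0.2)
    cmd = arbitrate([seek_goal(pose, goal), avoid_obstacle(pose, obstacle)])
    print(f"commanded heading: {math.degrees(cmd):.1f} deg")
```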

  14. Quantifying Emergent Behavior of Autonomous Robots

    NASA Astrophysics Data System (ADS)

    Martius, Georg; Olbrich, Eckehard

    2015-10-01

    Quantifying behaviors of robots that were generated autonomously from task-independent objective functions is an important prerequisite for objective comparisons of algorithms and of animal movements. The temporal sequence of such a behavior can be considered a time series, and hence complexity measures developed for time series are natural candidates for its quantification. The predictive information and the excess entropy are such complexity measures. They measure the amount of information the past contains about the future and thus quantify the nonrandom structure in the temporal sequence. However, when using these measures for systems with continuous states, one has to deal with the fact that their values will depend on the resolution with which the system's states are observed. For deterministic systems both measures diverge with increasing resolution. We therefore propose a new decomposition of the excess entropy into resolution-dependent and resolution-independent parts and discuss how they depend on the dimensionality of the dynamics, on correlations, and on the noise level. For practical estimation, we propose estimates based on the correlation integral rather than direct estimation of the mutual information with the nearest-neighbor algorithm of Kraskov et al. (2004), because the latter allows less control over the scale dependencies. Using our algorithm we are able to show how autonomous learning generates behavior of increasing complexity with increasing learning duration.
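
    For intuition, the one-step predictive information I(x_t; x_{t+1}) of a scalar time series can be estimated by discretizing the states and computing the mutual information between consecutive samples; varying the bin count illustrates the resolution dependence the abstract refers to. This is a simplified illustration under an assumed binning scheme, not the correlation-integral estimator the authors propose.

```python
# Simplified one-step predictive information estimate by state discretization.
# The data, bin counts, and estimator are illustrative assumptions.
import numpy as np

def predictive_information(x, bins=16):
    """Mutual information (bits) between consecutive samples of x."""
    x = np.asarray(x, dtype=float)
    edges = np.linspace(x.min(), x.max(), bins + 1)
    s = np.clip(np.digitize(x, edges) - 1, 0, bins - 1)   # discretized states
    joint, _, _ = np.histogram2d(s[:-1], s[1:], bins=bins,
                                 range=[[0, bins], [0, bins]])
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

if __name__ == "__main__":
    t = np.arange(5000)
    noisy_sine = np.sin(0.05 * t) + 0.1 * np.random.randn(t.size)
    for b in (8, 16, 32):                                  # resolution dependence
        print(f"bins={b:2d}  I(past;future) ~ {predictive_information(noisy_sine, b):.2f} bits")
```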

  15. Biomimetic smart sensors for autonomous robotic behavior I: acoustic processing

    NASA Astrophysics Data System (ADS)

    Deligeorges, Socrates; Xue, Shuwan; Soloway, Aaron; Lichtenstein, Lee; Gore, Tyler; Hubbard, Allyn

    2009-05-01

    Robots are rapidly becoming an integral tool on the battlefield and in homeland security, replacing humans in hazardous conditions. To enhance the effectiveness of robotic assets and their interaction with human operators, smart sensors are required to give more autonomous function to robotic platforms. Biologically inspired sensors are an essential part of this development of autonomous behavior and can increase both the capability and the performance of robotic systems. Smart, biologically inspired acoustic sensors have the potential to extend the autonomous capabilities of robotic platforms to include sniper detection, vehicle tracking, personnel detection, and general acoustic monitoring. The key to enabling these capabilities is biomimetic acoustic processing using a time-domain processing method based on the neural structures of the mammalian auditory system. These biologically inspired algorithms replicate the extremely adaptive processing of the auditory system, yielding high sensitivity over a broad dynamic range. The algorithms provide tremendous robustness in noisy and echoic spaces, properties necessary for autonomous function in real-world acoustic environments. These biomimetic acoustic algorithms also provide highly accurate localization of both persistent and transient sounds over a wide frequency range, using baselines on the order of only inches. A specialized smart sensor has been developed to interface with an iRobot Packbot® platform specifically to enhance its autonomous behaviors in response to personnel and gunfire. The low-power, highly parallel biomimetic processor, in conjunction with a biomimetic vestibular system (discussed in the companion paper), has shown the system's autonomous response to gunfire in complicated acoustic environments to be highly effective.

  16. Autonomous Realtime Threat-Hunting Robot (ARTHR)

    ScienceCinema

    INL

    2009-09-01

    Idaho National Laboratory researchers developed an intelligent plug-and-play robot payload that transforms commercial robots into effective first responders for deadly chemical, radiological and explosive threats.

  17. Autonomous Realtime Threat-Hunting Robot (ARTHR)

    SciTech Connect

    INL

    2008-05-29

    Idaho National Laboratory researchers developed an intelligent plug-and-play robot payload that transforms commercial robots into effective first responders for deadly chemical, radiological and explosive threats.

  18. Miniaturized Autonomous Extravehicular Robotic Camera (Mini AERCam)

    NASA Technical Reports Server (NTRS)

    Fredrickson, Steven E.

    2001-01-01

    The NASA Johnson Space Center (JSC) Engineering Directorate is developing the Autonomous Extravehicular Robotic Camera (AERCam), a low-volume, low-mass free-flying camera system. AERCam project team personnel recently initiated development of a miniaturized version of AERCam known as Mini AERCam. The Mini AERCam target design is a spherical "nanosatellite" free-flyer 7.5 inches in diameter and weighing 10 pounds. Mini AERCam is building on the success of the AERCam Sprint STS-87 flight experiment by adding new on-board sensing and processing capabilities while simultaneously reducing volume by 80%. Achieving enhanced capability in a smaller package depends on applying miniaturization technology across virtually all subsystems. Technology innovations being incorporated include micro electromechanical system (MEMS) gyros, "camera-on-a-chip" CMOS imagers, a rechargeable xenon gas propulsion system, a rechargeable lithium ion battery, custom avionics based on the PowerPC 740 microprocessor, GPS relative navigation, digital radio frequency communications and tracking, micropatch antennas, digital instrumentation, and dense mechanical packaging. The Mini AERCam free-flyer will initially be integrated into an approximate flight-like configuration for demonstration on an airbearing table. A pilot-in-the-loop and hardware-in-the-loop simulation of on-orbit navigation and dynamics will complement the airbearing table demonstration. The Mini AERCam lab demonstration is intended to form the basis for future development of an AERCam flight system that provides beneficial on-orbit views unobtainable from fixed cameras, cameras on robotic manipulators, or cameras carried by EVA crewmembers.

  19. An Autonomous Mobile Robot for Tsukuba Challenge: JW-Future

    NASA Astrophysics Data System (ADS)

    Fujimoto, Katsuharu; Kaji, Hirotaka; Negoro, Masanori; Yoshida, Makoto; Mizutani, Hiroyuki; Saitou, Tomoya; Nakamura, Katsu

    “Tsukuba Challenge” is the only event of its kind to require mobile robots to work autonomously and safely on public walkways. In this paper, we outline our robot “JW-Future”, developed for this trial and based on an electric wheelchair. Additionally, the significance of participating in such a technical trial is discussed from the viewpoint of industry.

  20. Navigation strategies for multiple autonomous mobile robots moving in formation

    NASA Technical Reports Server (NTRS)

    Wang, P. K. C.

    1991-01-01

    The problem of deriving navigation strategies for a fleet of autonomous mobile robots moving in formation is considered. Here, each robot is represented by a particle with a spherical effective spatial domain and a specified cone of visibility. The global motion of each robot in the world space is described by the equations of motion of the robot's center of mass. First, methods for formation generation are discussed. Then, simple navigation strategies for robots moving in formation are derived. A sufficient condition for the stability of a desired formation pattern for a fleet of robots each equipped with the navigation strategy based on nearest neighbor tracking is developed. The dynamic behavior of robot fleets consisting of three or more robots moving in formation in a plane is studied by means of computer simulation.
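
    A toy discrete-time illustration of nearest-neighbor tracking in a planar formation: each follower steers toward a desired offset from its nearest neighbor among the robots ahead of it in the ordering, while the leader moves at a constant velocity. The gains, offsets, and update rule are assumptions made for illustration and are not the paper's derivation or stability analysis.

```python
# Hedged sketch of nearest-neighbor formation tracking; values are illustrative.
import numpy as np

def formation_step(positions, offsets, leader_vel, k=0.8, dt=0.1):
    """One update of a planar fleet; positions[0] is the leader."""
    new = positions.copy()
    new[0] = positions[0] + leader_vel * dt
    for i in range(1, len(positions)):
        ahead = positions[:i]                               # robots earlier in the order
        nearest = ahead[np.argmin(np.linalg.norm(ahead - positions[i], axis=1))]
        target = nearest + offsets[i]                       # desired slot relative to neighbor
        new[i] = positions[i] + k * (target - positions[i]) * dt
    return new

if __name__ == "__main__":
    pos = np.array([[0.0, 0.0], [-1.0, 1.5], [-1.0, -1.5]])
    offs = np.array([[0.0, 0.0], [-1.0, 1.0], [-1.0, -1.0]])   # wedge-formation offsets
    for _ in range(300):
        pos = formation_step(pos, offs, leader_vel=np.array([1.0, 0.0]))
    print(np.round(pos, 2))                                  # fleet after 30 s of motion
```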

  1. Experimentation and concept formation by an autonomous mobile robot

    SciTech Connect

    Spelt, P.F.; deSaussure, G.; Oliver, G.; Silliman, M.

    1990-01-01

    The Center for Engineering Systems Advanced Research (CESAR) conducts basic research in the area of intelligent machines. In this paper, we describe our approach to a class of machine learning which involves autonomous concept formation using feedback from trial-and-error experimentation with the environment. Our formulation was experimentally validated on an autonomous mobile robot, which learned the task of control panel monitoring and manipulation for effective process control. Conclusions are drawn concerning the applicability of the system to a more general class of learning problems, and implications for the use of autonomous mobile robots in hostile and unknown environments are discussed. 11 refs., 7 figs.

  2. Tele-assistance for semi-autonomous robots

    NASA Technical Reports Server (NTRS)

    Rogers, Erika; Murphy, Robin R.

    1994-01-01

    This paper describes a new approach to semi-autonomous mobile robots. In this approach the robot has sufficient computerized intelligence to function autonomously under a certain set of conditions, while the local system is a cooperative decision-making unit that combines human and machine intelligence. Communication is then allowed to take place in a common mode and in a common language. A number of exception-handling scenarios, constructed as a result of experiments with actual sensor data collected from two mobile robots, are presented.

  3. Mamdani Fuzzy System for Indoor Autonomous Mobile Robot

    NASA Astrophysics Data System (ADS)

    Khan, M. K. A. Ahamed; Rashid, Razif; Elamvazuthi, I.

    2011-06-01

    Several control algorithms for autonomous mobile robot navigation have been proposed in the literature. Recently, the use of non-analytical computing methods such as fuzzy logic, evolutionary computation, and neural networks has demonstrated the utility and potential of these paradigms for intelligent control of mobile robot navigation. In this paper, a Mamdani fuzzy system for an autonomous mobile robot is developed. The paper begins with a discussion of the conventional controller and then describes the fuzzy logic controller in detail.
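
    A minimal Mamdani-style controller sketch, shown below for orientation: one input (obstacle distance) and one output (forward speed), triangular and ramp membership functions, min implication, max aggregation, and centroid defuzzification. The membership breakpoints and rule base are illustrative assumptions, not the paper's design.

```python
# Toy Mamdani fuzzy controller: obstacle distance -> forward speed.
# Membership functions and rules are illustrative assumptions.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def ramp_down(x, a, b):
    """Full membership below a, falling to zero at b."""
    return np.clip((b - x) / (b - a), 0.0, 1.0)

def ramp_up(x, a, b):
    """Zero membership below a, rising to full at b."""
    return np.clip((x - a) / (b - a), 0.0, 1.0)

def mamdani_speed(distance_m):
    y = np.linspace(0.0, 1.0, 201)                    # candidate speeds (m/s)
    near = ramp_down(distance_m, 0.5, 1.5)            # fuzzify the input
    medium = tri(distance_m, 0.5, 1.5, 2.5)
    far = ramp_up(distance_m, 1.5, 3.0)
    # Rules (min implication): near -> slow, medium -> moderate, far -> fast
    slow = np.minimum(near, tri(y, 0.0, 0.1, 0.3))
    moderate = np.minimum(medium, tri(y, 0.2, 0.5, 0.8))
    fast = np.minimum(far, tri(y, 0.7, 0.9, 1.0))
    agg = np.maximum.reduce([slow, moderate, fast])   # max aggregation
    return 0.0 if agg.sum() == 0.0 else float((y * agg).sum() / agg.sum())

if __name__ == "__main__":
    for d in (0.3, 1.0, 2.0, 3.5):
        print(f"obstacle at {d:.1f} m -> speed {mamdani_speed(d):.2f} m/s")
```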

  4. Towards Principled Experimental Study of Autonomous Mobile Robots

    NASA Technical Reports Server (NTRS)

    Gat, Erann

    1995-01-01

    We review the current state of research in autonomous mobile robots and conclude that there is an inadequate basis for predicting the reliability and behavior of robots operating in unengineered environments. We present a new approach to the study of autonomous mobile robot performance based on formal statistical analysis of independently reproducible experiments conducted on real robots. Simulators serve as models rather than experimental surrogates. We demonstrate three new results: 1) two commonly used performance metrics (time and distance) are not as well correlated as is often tacitly assumed; 2) the probability distributions of these performance metrics are exponential rather than normal; and 3) a modular, object-oriented simulation accurately predicts the behavior of the real robot in a statistically significant manner.

  5. ARK: Autonomous mobile robot in an industrial environment

    NASA Technical Reports Server (NTRS)

    Nickerson, S. B.; Jasiobedzki, P.; Jenkin, M.; Jepson, A.; Milios, E.; Down, B.; Service, J. R. R.; Terzopoulos, D.; Tsotsos, J.; Wilkes, D.

    1994-01-01

    This paper describes research on the ARK (Autonomous Mobile Robot in a Known Environment) project. The technical objective of the project is to build a robot that can navigate in a complex industrial environment using maps with permanent structures. The environment is not altered in any way, for example by adding easily identifiable beacons; the robot relies on naturally occurring objects as visual landmarks for navigation. The robot is equipped with various sensors that can detect unmapped obstacles, landmarks and objects. In this paper we describe the robot's industrial environment, its architecture, and a novel combined range and vision sensor, along with our recent results in controlling the robot, in the real-time detection of objects using their color, and in processing the robot's range and vision sensor data for navigation.

  6. Autonomous Evolution of Dynamic Gaits with Two Quadruped Robots

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.; Takamura, Seichi; Yamamoto, Takashi; Fujita, Masahiro

    2004-01-01

    A challenging task that must be accomplished for every legged robot is creating the walking and running behaviors needed for it to move. In this paper we describe our system for autonomously evolving dynamic gaits on two of Sony's quadruped robots. Our evolutionary algorithm runs on board the robot and uses the robot's sensors to compute the quality of a gait without assistance from the experimenter. First we show the evolution of a pace and a trot gait on the OPEN-R prototype robot. With the fastest gait, the robot moves at over 10 m/min, which is more than forty body-lengths per minute. While these first gaits are somewhat sensitive to the robot and environment in which they are evolved, we then show the evolution of robust dynamic gaits, one of which is used on the ERS-110, the first consumer version of AIBO.

  7. Autonomous Realtime Threat-Hunting Robot (ARTHR)

    ScienceCinema

    Idaho National Laboratory - David Bruemmer, Curtis Nielsen

    2010-01-08

    Idaho National Laboratory researchers developed an intelligent plug-and-play robot payload that transforms commercial robots into effective first responders for deadly chemical, radiological and explosive threats.

  8. Autonomous Realtime Threat-Hunting Robot (ARTHR)

    SciTech Connect

    Idaho National Laboratory - David Bruemmer, Curtis Nielsen

    2008-05-29

    Idaho National Laboratory researchers developed an intelligent plug-and-play robot payload that transforms commercial robots into effective first responders for deadly chemical, radiological and explosive threats.

  9. Vision Based Autonomous Robotic Control for Advanced Inspection and Repair

    NASA Technical Reports Server (NTRS)

    Wehner, Walter S.

    2014-01-01

    The advanced inspection system is an autonomous control and analysis system that improves inspection and remediation operations for ground and surface systems. It uses optical imaging technology with intelligent computer vision algorithms to analyze physical features of the real-world environment, make decisions, and learn from experience. The advanced inspection system is planned to control a robotic manipulator arm, an unmanned ground vehicle, and cameras remotely, automatically, and autonomously. Many computer vision, image processing, and machine learning techniques are available as open source for using vision as sensory feedback in decision-making and autonomous robotic movement. My responsibilities for the advanced inspection system are to create a software architecture that integrates and provides a framework for all the different subsystem components; to identify open-source algorithms and techniques; and to integrate the robot hardware.

  10. Autonomous assistance navigation for robotic wheelchairs in confined spaces.

    PubMed

    Cheein, Fernando Auat; Carelli, Ricardo; De la Cruz, Celso; Muller, Sandra; Bastos Filho, Teodiano F

    2010-01-01

    In this work, a visual interface for assisting the navigation of a robotic wheelchair is presented. The visual interface is developed for navigation in confined spaces such as narrow corridors or corridor ends. The interface supports two navigation modes: non-autonomous and autonomous. Non-autonomous driving of the robotic wheelchair is performed by means of a hand joystick, which directs the motion of the vehicle within the environment. Autonomous driving is performed when the user of the wheelchair has to turn (by 90, -90 or 180 degrees) within the environment. The turning strategy is implemented by a maneuverability algorithm compatible with the kinematics of the wheelchair and by a SLAM (Simultaneous Localization and Mapping) algorithm. The SLAM algorithm provides the interface with information concerning the layout of the environment and the pose (position and orientation) of the wheelchair within it. Experimental and statistical results for the interface are also shown in this work. PMID:21095654

  11. Remote radioactive waste drum inspection with an autonomous mobile robot

    SciTech Connect

    Heckendorn, F.M.; Ward, C.R.; Wagner, D.G.

    1992-11-01

    An autonomous mobile robot is being developed to perform remote surveillance and inspection tasks on large numbers of stored radioactive waste drums. The robot will be self-guided through narrow storage aisles and will record the visual image of each viewable drum for subsequent off-line analysis and archiving. The system will remove personnel from potential radiation exposure, perform the required inspections, and improve the ability to assess long-term trends in drum conditions.

  12. Remote radioactive waste drum inspection with an autonomous mobile robot

    SciTech Connect

    Heckendorn, F.M.; Ward, C.R.; Wagner, D.G.

    1992-01-01

    An autonomous mobile robot is being developed to perform remote surveillance and inspection tasks on large numbers of stored radioactive waste drums. The robot will be self-guided through narrow storage aisles and will record the visual image of each viewable drum for subsequent off-line analysis and archiving. The system will remove personnel from potential radiation exposure, perform the required inspections, and improve the ability to assess long-term trends in drum conditions.

  13. Brain, mind, body and society: autonomous system in robotics.

    PubMed

    Shimoda, Motomu

    2013-12-01

    In this paper I examine the issues related to robots with minds. Creating a robot with a mind aims to recreate neural function by engineering. A robot with a mind is expected not only to process external information with its built-in program and behave accordingly, but also to acquire conscious activity that responds to multiple conditions, and flexible, interactive communication skills for coping with unknown situations. That prospect is based on developments in artificial intelligence in which self-organizing and self-emergent functions have become available in recent years. To date, the controllable aspects in robotics have been restricted to data preparation and the programming of cognitive abilities, while conscious activity and communication skills have been regarded as uncontrollable due to their contingency and uncertainty. However, some researchers in robotics claim that every activity of the mind can be recreated by engineering and is therefore controllable. Based on the development of the cognitive abilities of children and the findings of neuroscience, researchers have attempted to produce the latest artificial intelligence with autonomous learning systems. I conclude that controllability is inconsistent with autonomy in the genuine sense, and that autonomous robots recreated by engineering cannot be autonomous partners of humans. PMID:24558734

  14. FPGA implementation of vision algorithms for small autonomous robots

    NASA Astrophysics Data System (ADS)

    Anderson, J. D.; Lee, D. J.; Archibald, J. K.

    2005-10-01

    The use of on-board vision with small autonomous robots has been made possible by the advances in the field of Field Programmable Gate Array (FPGA) technology. By connecting a CMOS camera to an FPGA board, on-board vision has been used to reduce the computation time inherent in vision algorithms. The FPGA board allows the user to create custom hardware in a faster, safer, and more easily verifiable manner that decreases the computation time and allows the vision to be done in real-time. Real-time vision tasks for small autonomous robots include object tracking, obstacle detection and avoidance, and path planning. Competitions were created to demonstrate that our algorithms work with our small autonomous vehicles in dealing with these problems. These competitions include Mouse-Trapped-in-a-Box, where the robot has to detect the edges of a box that it is trapped in and move towards them without touching them; Obstacle Avoidance, where an obstacle is placed at any arbitrary point in front of the robot and the robot has to navigate itself around the obstacle; Canyon Following, where the robot has to move to the center of a canyon and follow the canyon walls trying to stay in the center; the Grand Challenge, where the robot had to navigate a hallway and return to its original position in a given amount of time; and Stereo Vision, where a separate robot had to catch tennis balls launched from an air powered cannon. Teams competed on each of these competitions that were designed for a graduate-level robotic vision class, and each team had to develop their own algorithm and hardware components. This paper discusses one team's approach to each of these problems.

  15. Automatic detection and classification of obstacles with applications in autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Ponomaryov, Volodymyr I.; Rosas-Miranda, Dario I.

    2016-04-01

    A hardware implementation of the automatic detection and classification of objects that can represent obstacles for an autonomous mobile robot, using stereo vision algorithms, is presented. We propose and evaluate a new method to detect and classify objects for a mobile robot in outdoor conditions. The method is divided into two parts: the first is the object detection step, based on the distance from the objects to the camera and a BLOB analysis; the second is the classification step, based on visual primitives and an SVM classifier. The proposed method runs on a GPU in order to reduce processing time. This is achieved with hardware based on a multi-core processor and GPU platform, using an NVIDIA® GeForce® GT640 graphics card and Matlab on a PC running Windows 10.
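
    A simplified two-stage sketch in the spirit of the pipeline described above: range-thresholded BLOB detection followed by an SVM on simple shape features. It is not the authors' GPU implementation; the features, threshold, and training data below are illustrative assumptions.

```python
# Toy obstacle detection (BLOB analysis on a depth map) and classification (SVM).
# All numbers and labels are illustrative, not from the paper.
import numpy as np
from scipy import ndimage
from sklearn.svm import LinearSVC

def detect_blobs(depth_map, max_range_m=2.0, min_pixels=30):
    """Label connected regions closer than max_range_m; return per-blob features."""
    labels, n = ndimage.label(depth_map < max_range_m)
    feats = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        if ys.size < min_pixels:
            continue                                   # ignore speckle
        h = ys.max() - ys.min() + 1
        w = xs.max() - xs.min() + 1
        feats.append([h / w, ys.size / (h * w), depth_map[ys, xs].mean()])
    return np.array(feats)                             # [aspect, fill ratio, mean depth]

if __name__ == "__main__":
    depth = np.full((120, 160), 5.0)
    depth[40:100, 20:40] = 1.2                         # tall, narrow object (person-like)
    depth[70:90, 90:150] = 1.5                         # low, wide object (rock-like)
    # Toy training set: [aspect ratio, fill ratio, mean depth] -> class label
    X_train = np.array([[3.0, 0.9, 1.0], [2.5, 0.8, 1.5],
                        [0.4, 0.9, 1.2], [0.3, 0.8, 2.0]])
    y_train = np.array([1, 1, 0, 0])                   # 1 = person-like, 0 = rock-like
    clf = LinearSVC(dual=False).fit(X_train, y_train)
    print("predicted classes:", clf.predict(detect_blobs(depth)))
```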

  16. Autonomous navigation with teams of aerial robots

    NASA Astrophysics Data System (ADS)

    Michael, Nathan; Kumar, Vijay

    2011-06-01

    There are many examples in nature where large groups of individuals are able to maintain three-dimensional formations while navigating in complex environments. This paper addresses the development of a framework and robot controllers that enable a group of aerial robots to maintain a formation with partial state information while avoiding collisions. The central concept is to develop a low-dimensional abstraction of the large teams of robots, facilitate planning, command, and control in a low-dimensional space, and to realize commands or plans in the abstract space by synthesizing controllers for individual robots that respect the specified abstraction. The fundamental problem that is addressed in this paper relates to coordinated control of multiple UAVs in close proximity. We develop a representation for a team of robots based on the first and second statistical moments of the system and design kinematic, exponentially stabilizing controllers for point robots. The selection of representation permits a controller design that is invariant to the number of robots in the system, requires limited global state information, and reduces the complexity of the planning problem by generating an abstract planning and control space determined by the moment parameterization. We present experimental results with a team of quadrotors and discuss considerations such as aerodynamic interactions between robots.
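
    A hedged sketch of the abstraction the abstract describes: summarizing a team of point robots by the first and second statistical moments of its positions (centroid and covariance) and issuing a simple proportional command in that abstract space. The gain and the way commands are distributed to robots are illustrative assumptions, not the paper's controllers.

```python
# Toy moment-based team abstraction; gains and command distribution are made up.
import numpy as np

def team_moments(positions):
    """First and second moments of an N x 2 array of planar robot positions."""
    mu = positions.mean(axis=0)
    cov = np.cov(positions.T, bias=True)
    return mu, cov

def abstract_velocity_commands(positions, mu_des, k_mu=1.0):
    """Per-robot velocities that move the team centroid toward mu_des.

    Only the first-moment part of the control is kept here; shaping the second
    moment would add a term depending on the desired covariance.
    """
    mu, _ = team_moments(positions)
    return np.tile(k_mu * (mu_des - mu), (positions.shape[0], 1))

if __name__ == "__main__":
    pos = np.array([[0.0, 0.0], [1.0, 0.5], [0.5, 1.5], [1.5, 1.0]])
    mu, cov = team_moments(pos)
    print("centroid:", mu)
    print("covariance:\n", cov)
    print("commands:\n", abstract_velocity_commands(pos, mu_des=np.array([5.0, 5.0])))
```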

  17. ODYSSEUS autonomous walking robot: The leg/arm design

    NASA Technical Reports Server (NTRS)

    Bourbakis, N. G.; Maas, M.; Tascillo, A.; Vandewinckel, C.

    1994-01-01

    ODYSSEUS is an autonomous walking robot that makes use of three wheels and three legs for its movement in the free navigation space. More specifically, it uses its wheels to move around in environments where the surface is smooth and even. However, when there are low obstacles, stairs, or slight unevenness in the navigation environment, the robot uses both wheels and legs to travel efficiently. In this paper we present the detailed hardware design and the simulated behavior of the extended leg/arm part of the robot, since it plays a very significant role in the robot's actions (movements, selection of objects, etc.). In particular, the leg/arm consists of three major parts. The first part is a pipe attached to the robot base with a flexible 3-D joint; this pipe has a rotating bar as an extended part, which terminates in a 3-D flexible joint. The second part of the leg/arm is a pipe similar to the first, whose extended bar ends at a 2-D joint. The last part of the leg/arm is a clip-hand, used for picking up small, lightweight objects and, when in 'closed' mode, serving as a supporting part of the robot leg. The entire leg/arm is controlled and synchronized by a microcontroller (68HC11) attached to the robot base.

  18. Development of a Commercially Viable, Modular Autonomous Robotic Systems for Converting any Vehicle to Autonomous Control

    NASA Technical Reports Server (NTRS)

    Parish, David W.; Grabbe, Robert D.; Marzwell, Neville I.

    1994-01-01

    A Modular Autonomous Robotic System (MARS), consisting of a modular autonomous vehicle control system that can be retrofitted onto any vehicle to convert it to autonomous control and that supports a modular payload for multiple applications, is being developed. The MARS design is scalable, reconfigurable, and cost effective due to the use of modern open-system architecture design methodologies, including serial control bus technology to simplify system wiring and enhance scalability. The design is augmented with modular, object-oriented (C++) software implementing a hierarchy of five levels of control: teleoperated, continuous guidepath following, periodic guidepath following, absolute-position autonomous navigation, and relative-position autonomous navigation. The present effort is focused on producing a system that is commercially viable for routine autonomous patrolling of known, semistructured environments, such as environmental monitoring of chemical and petroleum refineries, exterior physical security and surveillance, perimeter patrolling, and intrafacility transport applications.

  19. The ARK (Autonomous Robot for a Known environment) project

    NASA Astrophysics Data System (ADS)

    Nickerson, S. B.; Camacho, F.; Mader, D. L.; Milios, E. E.; Jenkin, M. R. M.; Bains, N.; Braun, P.; Green, D.; Hung, S.; Korba, L.

    1991-05-01

    The main goal of the project is to build a mobile robot that can navigate in a known indoor environment using computer vision as its main sensor, with the aid of an internal geometric model of its environment. A second goal is to explore the technology in such a way as to best illustrate its usefulness and commercial potential. The research will focus on the development and testing of computer vision algorithms as aids for robot navigation. Two robots will be built: ARK-1 (autonomous robot for a known environment) and ARK-2. ARK-1 will be tethered and will be used to test the vision algorithms. ARK-2 will be untethered, will use other sensors in addition to vision, will have a real-time operating system, and will operate in an industrial environment. The platforms for both ARK-1 and ARK-2 will be the same as that of a robot being developed at NRC for industrial applications.

  20. GPS and odometer data fusion for outdoor robots continuous positioning

    NASA Astrophysics Data System (ADS)

    Pozo-Ruz, Ana; Garcia-Perez, Lia; Garcia-Alegre, Maria C.; Guinea, Domingo; Ribeiro, Angela; Sandoval, Francisco

    2002-02-01

    The present work describes an approach to obtaining the best estimate of the position of the outdoor robot ROJO, a low-cost lawnmower intended to perform unmanned precision agriculture tasks such as spraying pesticides in horticulture. For continuous localization of ROJO, two redundant sensors have been installed on board: a DGPS receiver with sub-metre precision and an odometric system. The DGPS allows absolute positioning of the vehicle in the field, but GPS failures in signal reception due to obstacles and electrical or meteorological disturbances led us to integrate the odometric system. Thus, a robust odometer based upon magnetic strip sensors has been designed and integrated in the vehicle. These sensors continuously deliver the position of the vehicle relative to its initial position, covering the DGPS blind periods. They give an approximate location of the vehicle in the field that can in turn be updated and corrected by the DGPS. To provide the best estimate, a fusion algorithm has been proposed and tested, wherein the best estimate is computed as the maximum of the joint probability function obtained from the position estimates of the two on-board sensors. Results are presented to show the performance of the proposed sensor fusion technique.
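
    A small sketch of the fusion idea described above: model the DGPS and odometry position estimates as independent Gaussians and take the point that maximizes their joint probability, which for Gaussians reduces to inverse-covariance (information) weighting. The covariances used here are illustrative, not values from the ROJO vehicle.

```python
# Toy product-of-Gaussians fusion of a DGPS fix and a dead-reckoned position.
# Covariances and positions are illustrative assumptions.
import numpy as np

def fuse_gaussian_estimates(x_gps, P_gps, x_odo, P_odo):
    """Return the maximizer of the joint Gaussian, and its covariance."""
    W_gps = np.linalg.inv(P_gps)             # information matrices
    W_odo = np.linalg.inv(P_odo)
    P_fused = np.linalg.inv(W_gps + W_odo)
    x_fused = P_fused @ (W_gps @ x_gps + W_odo @ x_odo)
    return x_fused, P_fused

if __name__ == "__main__":
    x_gps = np.array([10.3, 4.9])             # DGPS fix (m)
    P_gps = np.diag([0.5**2, 0.5**2])         # sub-metre accuracy
    x_odo = np.array([10.0, 5.2])             # dead-reckoned position (m)
    P_odo = np.diag([1.0**2, 1.0**2])         # drifted odometry, less certain
    x, P = fuse_gaussian_estimates(x_gps, P_gps, x_odo, P_odo)
    print("fused position:", np.round(x, 2))
    print("fused std-dev:", np.round(np.sqrt(np.diag(P)), 2))
```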

  1. Navigation and learning experiments by an autonomous robot

    SciTech Connect

    de Saussure, G.; Weisbin, C.R.; Spelt, P.F.

    1988-01-01

    Developing an autonomous mobile robot capable of navigation, surveillance and manipulation in complex and dynamic environments is a key research activity at CESAR, Oak Ridge National Laboratory's Center for Engineering Systems Advanced Research. The latest series of completed experiments was performed using the autonomous mobile robot HERMIES-IIB (Hostile Environment Robotic Machine Intelligence Experiment Series II-B). The next section describes HERMIES-IIB and some of its major components required for autonomous operation in unstructured, dynamic environments. Section 3 outlines some ongoing research in autonomous navigation. Section 4 discusses our newest research in machine learning concepts. Section 5 describes a successful experiment in which the robot is placed in an arbitrary initial location without any prior specification of the content of its environment, successively discovers and navigates around stationary or moving obstacles, picks up and moves small obstacles, searches for a control panel and performs a learned sequence of manipulations on the panel devices. The last section outlines some future directions of the program.

  2. Defining proprioceptive behaviors for autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Overholt, James L.; Hudas, Greg R.; Gerhart, Grant R.

    2002-07-01

    Proprioception is a sense of body position and movement that supports the control of many automatic motor functions such as posture and locomotion. This concept, normally relegated to the fields of neural physiology and kinesiology, is being utilized in the field of unmanned mobile robotics. This paper looks at developing proprioceptive behaviors for use in controlling an unmanned ground vehicle. First, we will discuss the field of behavioral control of mobile robots. Next, a discussion of proprioception and the development of proprioceptive sensors will be presented. We will then focus on the development of a unique neural-fuzzy architecture that will be used to incorporate the control behaviors coming directly from the proprioceptive sensors. Finally we will present a simulation experiment where a simple multi-sensor robot, utilizing both external and proprioceptive sensors, is presented with the task of navigating an unknown terrain to a known target position. Results of the mobile robot utilizing this unique fusion methodology will be discussed.

  3. Concurrent planning and execution for autonomous robots

    NASA Technical Reports Server (NTRS)

    Simmons, Reid G.

    1992-01-01

    The Task Control Architecture (TCA) provides communication and coordination facilities to construct distributed, concurrent robotic systems. The use of TCA in a system that walks a legged robot through rugged terrain is described. The walking system, as originally implemented, had a sequential sense-plan-act control cycle. Utilizing TCA features for task sequencing and monitoring, the system was modified to concurrently plan and execute steps. Walking speed improved by over 30 percent, with only a relatively modest conversion effort.

  4. Applications of concurrent neuromorphic algorithms for autonomous robots

    NASA Technical Reports Server (NTRS)

    Barhen, J.; Dress, W. B.; Jorgensen, C. C.

    1988-01-01

    This article provides an overview of studies at the Oak Ridge National Laboratory (ORNL) of neural networks running on parallel machines applied to the problems of autonomous robotics. The first section provides the motivation for our work in autonomous robotics and introduces the computational hardware in use. Section 2 presents two theorems concerning the storage capacity and stability of neural networks. Section 3 presents a novel load-balancing algorithm implemented with a neural network. Section 4 introduces the robotics test bed now in place. Section 5 concerns navigation issues in the test-bed system. Finally, Section 6 presents a frequency-coded network model and shows how Darwinian techniques are applied to issues of parameter optimization and on-line design.

  5. Experiments in teleoperator and autonomous control of space robotic vehicles

    NASA Technical Reports Server (NTRS)

    Alexander, Harold L.

    1990-01-01

    A research program and strategy are described which include fundamental teleoperation issues and autonomous-control issues of sensing and navigation for satellite robots. The program consists of developing interfaces for visual operation and studying the consequences of interface designs as well as developing navigation and control technologies based on visual interaction. A space-robot-vehicle simulator is under development for use in virtual-environment teleoperation experiments and neutral-buoyancy investigations. These technologies can be utilized in a study of visual interfaces to address tradeoffs between head-tracking and manual remote cameras, panel-mounted and helmet-mounted displays, and stereoscopic and monoscopic display systems. The present program can provide significant data for the development of control experiments for autonomously controlled satellite robots.

  6. Autonomous learning in humanoid robotics through mental imagery.

    PubMed

    Di Nuovo, Alessandro G; Marocco, Davide; Di Nuovo, Santo; Cangelosi, Angelo

    2013-05-01

    In this paper we focus on modeling autonomous learning to improve the performance of a humanoid robot through a modular artificial neural network architecture. A model of a neural controller is presented that allows the humanoid robot iCub to autonomously improve its sensorimotor skills. This is achieved by endowing the neural controller with a secondary neural system that, by exploiting the sensorimotor skills already acquired by the robot, is able to generate additional imaginary examples that the controller itself can use to improve its performance through simulated mental training. Results and analysis presented in the paper provide evidence of the viability of the proposed approach and help to clarify the rationale behind the chosen model and its implementation. PMID:23122490

  7. System safety analysis of an autonomous mobile robot

    SciTech Connect

    Bartos, R.J.

    1994-08-01

    Analysis of the safety of operating and maintaining the Stored Waste Autonomous Mobile Inspector (SWAMI) II in a hazardous environment at the Fernald Environmental Management Project (FEMP) was completed. The SWAMI II is a version of a commercial robot, the HelpMate™ robot produced by the Transitions Research Corporation, which is being updated to incorporate the systems required for inspecting mixed toxic chemical and radioactive waste drums at the FEMP. It also has modified obstacle detection and collision avoidance subsystems. The robot will autonomously travel down the aisles in storage warehouses to record images of containers and collect other data, which are transmitted to an inspector at a remote computer terminal. A previous study showed that the SWAMI II is economically feasible. The SWAMI II will locate radioactive contamination more accurately than human inspectors. This thesis includes a System Safety Hazard Analysis and a quantitative Fault Tree Analysis (FTA). The objectives of the analyses are to prevent potentially serious events and to derive a comprehensive set of safety requirements against which the safety of the SWAMI II and other autonomous mobile robots can be evaluated. The Computer-Aided Fault Tree Analysis (CAFTA©) software is utilized for the FTA. The FTA shows that more than 99% of the safety risk occurs during maintenance, and that when the derived safety requirements are implemented the rate of serious events is reduced to below one event per million operating hours. Training and procedures in SWAMI II operation and maintenance provide an added safety margin. This study will promote the safe use of the SWAMI II and other autonomous mobile robots in the emerging technology of mobile robotic inspection.

  8. Biomimetic smart sensors for autonomous robotic behavior II: vestibular processing

    NASA Astrophysics Data System (ADS)

    Xue, Shuwan; Deligeorges, Socrates; Soloway, Aaron; Lichtenstein, Lee; Gore, Tyler; Hubbard, Allyn

    2009-05-01

    Limited autonomous behaviors are fast becoming a critical capability in the field of robotics as robotic applications are used in more complicated and interactive environments. As additional sensory capabilities are added to robotic platforms, sensor fusion to enhance and facilitate autonomous behavior becomes increasingly important. Using biology as a model, the equivalent of a vestibular system needs to be created in order to orient the system within its environment and allow multi-modal sensor fusion. In mammals, the vestibular system plays a central role in physiological homeostasis and sensory information integration (Fuller et al, Neuroscience 129 (2004) 461-471). At the level of the Superior Colliculus in the brain, there is multimodal sensory integration across visual, auditory, somatosensory, and vestibular inputs (Wallace et al, J Neurophysiol 80 (1998) 1006-1010), with the vestibular component contributing a strong reference-frame gating input. Using a simple model for the deep layers of the Superior Colliculus, an off-the-shelf 3-axis solid-state gyroscope and accelerometer was used as the equivalent representation of the vestibular system. The acceleration and rotational measurements are used to determine the relationship between the local reference frame of a robotic platform (an iRobot Packbot®) and the inertial reference frame (the outside world), with the simulated vestibular input tightly coupled with the acoustic and optical inputs. Field testing of the robotic platform, using acoustics to cue optical sensors coupled through a biomimetic vestibular model for "slew to cue" gunfire detection, has shown great promise.
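
    As a toy illustration of the reference-frame role the vestibular sensor plays here, the sketch below integrates a gyro yaw rate to track heading and rotates a body-frame acoustic bearing into the world frame so an optical sensor could be slewed toward the cue. The rates, bearings, and single-axis simplification are assumptions for illustration, not the system described above.

```python
# Toy single-axis "vestibular" heading tracker and body-to-world bearing rotation.
# All numerical values are illustrative assumptions.
import math

def integrate_yaw(yaw_rad, yaw_rate_rad_s, dt):
    """Dead-reckon heading from a single-axis gyro measurement."""
    return (yaw_rad + yaw_rate_rad_s * dt) % (2 * math.pi)

def body_to_world_bearing(yaw_rad, bearing_body_rad):
    """Convert an acoustic bearing measured in the robot frame to the world frame."""
    return (yaw_rad + bearing_body_rad) % (2 * math.pi)

if __name__ == "__main__":
    yaw, dt = 0.0, 0.01
    for _ in range(100):                       # robot turns at 30 deg/s for 1 s
        yaw = integrate_yaw(yaw, math.radians(30.0), dt)
    cue_body = math.radians(45.0)              # acoustic cue 45 deg off the robot's nose
    cue_world = body_to_world_bearing(yaw, cue_body)
    print(f"heading {math.degrees(yaw):.1f} deg, cue at {math.degrees(cue_world):.1f} deg (world)")
```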

  9. Multiagent collaboration for experimental calibration of an autonomous mobile robot

    NASA Astrophysics Data System (ADS)

    Vachon, Bertrand; Berge-Cherfaoui, Veronique

    1991-03-01

    This paper presents an action in the SOCRATES mission whose aim is the development of a self-calibration method for an autonomous mobile robot. The robot has to determine the precise location of the coordinate system shared by its sensors. Knowledge of this system is a sine qua non condition for efficient multisensor fusion and autonomous navigation in an unknown environment. But, as perceptions and motions are not accurate, this knowledge can only be achieved by multisensor fusion. The application described highlights this kind of problem. Multisensor fusion is used here especially in its symbolic aspect. Useful knowledge includes both numerous data coming from various sensors and suitable ways to process these data. A blackboard architecture has been chosen to manage this information. Knowledge sources are called agents, and they implement physical sensors (perceptors or actuators) as well as logical sensors (high-level data processors). The problem to solve is self-calibration, which includes the determination of the coordinate system R of the robot and the transformations necessary to convert data from sensor references to R. The origin of R has been chosen to be O, the rotation center of the robot. As its true location may vary due to robot or ground characteristics, an experimental determination of O is attempted. A strategy for measuring distances in approximate positions is proposed. This strategy must take into account the fact that motions of the robot, as well as perceptions, may be inaccurate. Results obtained during experiments and future extensions of the system are discussed.

  10. Rice-obot 1: An intelligent autonomous mobile robot

    NASA Technical Reports Server (NTRS)

    Defigueiredo, R.; Ciscon, L.; Berberian, D.

    1989-01-01

    The Rice-obot I is the first in a series of Intelligent Autonomous Mobile Robots (IAMRs) being developed at Rice University's Cooperative Intelligent Mobile Robots (CIMR) lab. The Rice-obot I is mainly designed to be a testbed for various robotic and AI techniques, and a platform for developing intelligent control systems for exploratory robots. Researchers present the need for a generalized environment capable of combining all of the control, sensory and knowledge systems of an IAMR. They introduce Lisp-Nodes as such a system, and develop the basic concepts of nodes, messages and classes. Furthermore, they show how the control system of the Rice-obot I is implemented as sub-systems in Lisp-Nodes.

  11. Mobile autonomous robotic apparatus for radiologic characterization

    DOEpatents

    Dudar, Aed M.; Ward, Clyde R.; Jones, Joel D.; Mallet, William R.; Harpring, Larry J.; Collins, Montenius X.; Anderson, Erin K.

    1999-01-01

    A mobile robotic system that conducts radiological surveys to map alpha, beta, and gamma radiation on surfaces in relatively level open areas or areas containing obstacles such as stored containers, hallways, equipment, walls and support columns. The invention incorporates improved radiation monitoring methods using multiple scintillation detectors, laser scanners for maneuvering in open areas, ultrasound pulse generators and receptors for collision avoidance in limited-space areas or hallways, methods to trigger visible alarms when radiation is detected, and methods to transmit location data for real-time reporting and mapping of radiation locations on computer monitors at a host station. A multitude of high-performance scintillation detectors detect radiation while the on-board system controls the direction and speed of the robot according to pre-programmed paths. The operators may revise the preselected movements of the robotic system via Ethernet communications to re-monitor areas of radiation or to avoid walls, columns, equipment, or containers. The robotic system is capable of floor survey speeds from 1/2 inch per second up to about 30 inches per second, while the on-board processor collects, stores, and transmits information for real-time mapping of radiation intensity and the locations of the radiation for real-time display on computer monitors at a central command console.

  12. Mobile autonomous robotic apparatus for radiologic characterization

    DOEpatents

    Dudar, A.M.; Ward, C.R.; Jones, J.D.; Mallet, W.R.; Harpring, L.J.; Collins, M.X.; Anderson, E.K.

    1999-08-10

    A mobile robotic system is described that conducts radiological surveys to map alpha, beta, and gamma radiation on surfaces in relatively level open areas or areas containing obstacles such as stored containers, hallways, equipment, walls and support columns. The invention incorporates improved radiation monitoring methods using multiple scintillation detectors, laser scanners for maneuvering in open areas, ultrasound pulse generators and receptors for collision avoidance in limited-space areas or hallways, methods to trigger visible alarms when radiation is detected, and methods to transmit location data for real-time reporting and mapping of radiation locations on computer monitors at a host station. A multitude of high-performance scintillation detectors detect radiation while the on-board system controls the direction and speed of the robot according to pre-programmed paths. The operators may revise the preselected movements of the robotic system via Ethernet communications to re-monitor areas of radiation or to avoid walls, columns, equipment, or containers. The robotic system is capable of floor survey speeds from 1/2 inch per second up to about 30 inches per second, while the on-board processor collects, stores, and transmits information for real-time mapping of radiation intensity and the locations of the radiation for real-time display on computer monitors at a central command console. 4 figs.

  13. An Aerial–Ground Robotic System for Navigation and Obstacle Mapping in Large Outdoor Areas

    PubMed Central

    Garzón, Mario; Valente, João; Zapata, David; Barrientos, Antonio

    2013-01-01

    There are many outdoor robotic applications where a robot must reach a goal position or explore an area without previous knowledge of the environment around it. Additionally, other applications (like path planning) require the use of known maps or previous information about the environment. This work presents a system composed of a terrestrial and an aerial robot that cooperate and share sensor information in order to address those requirements. The ground robot is able to navigate in an unknown large environment aided by visual feedback from a camera on board the aerial robot. At the same time, the obstacles are mapped in real time by combining the information from the camera with the positioning system of the ground robot. A set of experiments was carried out to verify the applicability of the system. The experiments were performed in a simulation environment and outdoors with a medium-sized ground robot and a mini quad-rotor. The proposed robotic system shows outstanding results in simultaneous navigation and mapping applications in large outdoor environments. PMID:23337332

  14. An aerial–ground robotic system for navigation and obstacle mapping in large outdoor areas.

    PubMed

    Garzón, Mario; Valente, João; Zapata, David; Barrientos, Antonio

    2013-01-01

    There are many outdoor robotic applications where a robot must reach a goal position or explore an area without previous knowledge of the environment around it. Additionally, other applications (like path planning) require the use of known maps or previous information about the environment. This work presents a system composed of a terrestrial and an aerial robot that cooperate and share sensor information in order to address those requirements. The ground robot is able to navigate in an unknown large environment aided by visual feedback from a camera on board the aerial robot. At the same time, the obstacles are mapped in real time by combining the information from the camera with the positioning system of the ground robot. A set of experiments was carried out to verify the applicability of the system. The experiments were performed in a simulation environment and outdoors with a medium-sized ground robot and a mini quad-rotor. The proposed robotic system shows outstanding results in simultaneous navigation and mapping applications in large outdoor environments. PMID:23337332

  15. Towards Autonomous Operations of the Robonaut 2 Humanoid Robotic Testbed

    NASA Technical Reports Server (NTRS)

    Badger, Julia; Nguyen, Vienny; Mehling, Joshua; Hambuchen, Kimberly; Diftler, Myron; Luna, Ryan; Baker, William; Joyce, Charles

    2016-01-01

    The Robonaut project has been conducting research in robotics technology on board the International Space Station (ISS) since 2012. Recently, the original upper body humanoid robot was upgraded by the addition of two climbing manipulators ("legs"), more capable processors, and new sensors, as shown in Figure 1. While Robonaut 2 (R2) has been working through checkout exercises on orbit following the upgrade, technology development on the ground has continued to advance. Through the Active Reduced Gravity Offload System (ARGOS), the Robonaut team has been able to develop technologies that will enable full operation of the robotic testbed on orbit using similar robots located at the Johnson Space Center. Once these technologies have been vetted in this way, they will be implemented and tested on the R2 unit on board the ISS. The goal of this work is to create a fully-featured robotics research platform on board the ISS to increase the technology readiness level of technologies that will aid in future exploration missions. Technology development has thus far followed two main paths, autonomous climbing and efficient tool manipulation. Central to both technologies has been the incorporation of a human robotic interaction paradigm that involves the visualization of sensory and pre-planned command data with models of the robot and its environment. Figure 2 shows screenshots of these interactive tools, built in rviz, that are used to develop and implement these technologies on R2. Robonaut 2 is designed to move along the handrails and seat track around the US lab inside the ISS. This is difficult for many reasons, namely the environment is cluttered and constrained, the robot has many degrees of freedom (DOF) it can utilize for climbing, and remote commanding for precision tasks such as grasping handrails is time-consuming and difficult. Because of this, it is important to develop the technologies needed to allow the robot to reach operator-specified positions as

  16. Autonomous stair-climbing with miniature jumping robots.

    PubMed

    Stoeter, Sascha A; Papanikolopoulos, Nikolaos

    2005-04-01

    The problem of vision-guided control of miniature mobile robots is investigated. Untethered mobile robots with small physical dimensions of around 10 cm or less do not permit powerful onboard computers because of size and power constraints. These challenges have, in the past, reduced the functionality of such devices to that of a complex remote control vehicle with fancy sensors. With the help of a computationally more powerful entity such as a larger companion robot, the control loop can be closed. Using the miniature robot's video transmission or that of an observer to localize it in the world, control commands can be computed and relayed to the inept robot. The result is a system that exhibits autonomous capabilities. The framework presented here solves the problem of climbing stairs with the miniature Scout robot. The robot's unique locomotion mode, the jump, is employed to hop one step at a time. Methods for externally tracking the Scout are developed. A large number of real-world experiments are conducted and the results discussed. PMID:15828659

  17. Autonomous Robot System for Sensor Characterization

    SciTech Connect

    David Bruemmer; Douglas Few; Frank Carney; Miles Walton; Heather Hunting; Ron Lujan

    2004-03-01

    This paper discusses an innovative application of new Markov localization techniques that combat the problem of odometry drift, allowing a novel control architecture developed at the Idaho National Engineering and Environmental Laboratory (INEEL) to be utilized within a sensor characterization facility developed at the Remote Sensing Laboratory (RSL) in Nevada. The new robotic capability provided by the INEEL will allow RSL to test and evaluate a wide variety of sensors including radiation detection systems, machine vision systems, and sensors that can detect and track heat sources (e.g. human bodies, machines, chemical plumes). By accurately moving a target at varying speeds along designated paths, the robotic solution allows the detection abilities of a wide variety of sensors to be recorded and analyzed.
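
    Grid-based Markov localization maintains a probability distribution over discretized robot poses and repeatedly applies a motion (prediction) step and a sensing (correction) step, which is what lets it absorb odometry drift rather than letting it accumulate. The following is a minimal one-dimensional sketch of that cycle; the cell layout, motion noise, and sensor likelihood are illustrative assumptions, not details of the INEEL implementation.

    ```python
    import numpy as np

    def markov_localize(belief, motion, sense_likelihood):
        """One predict/correct cycle of grid-based Markov localization over a
        1-D ring of cells (a generic sketch, not the INEEL implementation)."""
        n = len(belief)
        predicted = np.zeros(n)
        # Motion update: shift the belief by the commanded motion, blurred to
        # model odometry error instead of letting it accumulate silently.
        for i in range(n):
            predicted[(i + motion) % n] += 0.8 * belief[i]      # moved as commanded
            predicted[(i + motion + 1) % n] += 0.1 * belief[i]  # overshoot
            predicted[(i + motion - 1) % n] += 0.1 * belief[i]  # undershoot
        # Measurement update: reweight by the likelihood of the sensor reading.
        posterior = predicted * sense_likelihood
        return posterior / posterior.sum()

    # Uniform prior over 10 cells; the sensor strongly favors cell 4.
    belief = np.full(10, 0.1)
    likelihood = np.array([0.1] * 4 + [0.9] + [0.1] * 5)
    print(markov_localize(belief, motion=1, sense_likelihood=likelihood))
    ```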

  18. A task control architecture for autonomous robots

    NASA Technical Reports Server (NTRS)

    Simmons, Reid; Mitchell, Tom

    1990-01-01

    An architecture is presented for controlling robots that have multiple tasks, operate in dynamic domains, and require a fair degree of autonomy. The architecture is built on several layers of functionality, including a distributed communication layer, a behavior layer for querying sensors, expanding goals, and executing commands, and a task level for managing the temporal aspects of planning and achieving goals, coordinating tasks, allocating resources, monitoring, and recovering from errors. Application to a legged planetary rover and an indoor mobile manipulator is described.

  19. AMiRESot - A New Robot Soccer League with Autonomous Miniature Robots

    NASA Astrophysics Data System (ADS)

    Witkowski, Ulf; Sitte, Joaquin; Herbrechtsmeier, Stefan; Rückert, Ulrich

    AMiRESot is a new robot soccer league that is played with small autonomous miniature robots. Team sizes are defined with one, two, and three robots per team. Special to the AMiRESot league are the fully autonomous behavior of the robots and their small size. For the matches, the rules mainly follow the FIFA laws with some modifications being useful for robot soccer. The new AMiRESot soccer robot is small in size (maximum 110 mm diameter) but a powerful vehicle, equipped with a differential drive system. For sensing, the robots in their basic configuration are equipped with active infrared sensors and a color image sensor. For information processing a powerful mobile processor and reconfigurable hardware resources (FPGA) are available. Due to the robot’s modular structure it can be easily extended by additional sensing and processing resources. This paper gives an overview of the AMiRESot rules and presents details of the new robot platform used for AMiRESot.

  20. Robotic reactions: delay-induced patterns in autonomous vehicle systems.

    PubMed

    Orosz, Gábor; Moehlis, Jeff; Bullo, Francesco

    2010-02-01

    Fundamental design principles are presented for vehicle systems governed by autonomous cruise control devices. By analyzing the corresponding delay differential equations, it is shown that for any car-following model short-wavelength oscillations can appear due to robotic reaction times, and that there are tradeoffs between the time delay and the control gains. The analytical findings are demonstrated on an optimal velocity model using numerical continuation and numerical simulation. PMID:20365620

  1. Robotic reactions: Delay-induced patterns in autonomous vehicle systems

    NASA Astrophysics Data System (ADS)

    Orosz, Gábor; Moehlis, Jeff; Bullo, Francesco

    2010-02-01

    Fundamental design principles are presented for vehicle systems governed by autonomous cruise control devices. By analyzing the corresponding delay differential equations, it is shown that for any car-following model short-wavelength oscillations can appear due to robotic reaction times, and that there are tradeoffs between the time delay and the control gains. The analytical findings are demonstrated on an optimal velocity model using numerical continuation and numerical simulation.

  2. A fuzzy logic controller for an autonomous mobile robot

    NASA Technical Reports Server (NTRS)

    Yen, John; Pfluger, Nathan

    1993-01-01

    The ability of a mobile robot system to plan and move intelligently in a dynamic system is needed if robots are to be useful in areas other than controlled environments. An example of a use for this system is to control an autonomous mobile robot in a space station, or other isolated area where it is hard or impossible for human life to exist for long periods of time (e.g., Mars). The system would allow the robot to be programmed to carry out the duties normally accomplished by a human being. Some of the duties that could be accomplished include operating instruments, transporting objects, and maintenance of the environment. The main focus of our early work has been on developing a fuzzy controller that takes a path and adapts it to a given environment. The robot only uses information gathered from the sensors, but retains the ability to avoid dynamically placed obstacles near and along the path. Our fuzzy logic controller is based on the following algorithm: (1) determine the desired direction of travel; (2) determine the allowed direction of travel; and (3) combine the desired and allowed directions in order to determine a direction that is both desired and allowed. The desired direction of travel is determined by projecting ahead to a point along the path that is closer to the goal. This gives a local direction of travel for the robot and helps to avoid obstacles.
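
    The three-step scheme above can be sketched as a fuzzy combination over candidate headings: desirability peaks at the heading toward the look-ahead point on the path, allowability drops in angular sectors blocked by obstacles, and a fuzzy AND of the two memberships selects a heading that is both desired and allowed. The sector layout and membership shapes below are illustrative assumptions rather than the controller described in the paper.

    ```python
    import numpy as np

    def fuzzy_heading(desired_deg, obstacle_sectors, spread=30.0):
        """Pick a heading that is both desired and allowed (a sketch of the
        three-step scheme above; membership shapes are assumptions)."""
        headings = np.arange(-180, 180, 5.0)          # candidate headings (deg)

        # Desirability: triangular membership peaked at the desired direction.
        desired = np.clip(1.0 - np.abs(headings - desired_deg) / spread, 0.0, 1.0)

        # Allowability: 1 everywhere, reduced in headings blocked by obstacles.
        allowed = np.ones_like(headings)
        for lo, hi in obstacle_sectors:               # blocked angular sectors (deg)
            allowed[(headings >= lo) & (headings <= hi)] = 0.1

        # Fuzzy AND (min), then choose the heading with maximum membership.
        both = np.minimum(desired, allowed)
        return headings[np.argmax(both)]

    # Example: the goal lies 20 deg to the left, an obstacle blocks -40..0 deg.
    print(fuzzy_heading(-20.0, [(-40.0, 0.0)]))
    ```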

  3. Omnivision-based autonomous mobile robotic platform

    NASA Astrophysics Data System (ADS)

    Cao, Zuoliang; Hu, Jun; Cao, Jin; Hall, Ernest L.

    2001-10-01

    As a laboratory demonstration platform, the TUT-I mobile robot provides various experimentation modules to demonstrate robotics technologies involved in remote control, computer programming, and teach-and-playback operation. The teach-and-playback operation, in particular, has proven to be an effective solution in structured environments. Path generation in the teach mode and real-time path correction using path-error detection in the playback mode are demonstrated. A vision-based image database is generated as the representation of the given path during the teaching procedure, and an online image-positioning algorithm is performed for path following. Advanced sensory capability is employed to provide environment perception. A unique omnidirectional vision (omni-vision) system is used for localization and navigation. The omnidirectional system uses an extremely wide-angle lens, and the dynamic omni-vision image is processed in real time to provide the widest view during movement. Beacon guidance is realized by observing the locations of points derived from overhead features, such as predefined light arrays in a building. The navigation approach is based upon these omni-vision characteristics. A group of ultrasonic sensors is employed for obstacle avoidance.

  4. Autonomous robotic operations for on-orbit satellite servicing

    NASA Astrophysics Data System (ADS)

    Ogilvie, Andrew; Allport, Justin; Hannah, Michael; Lymer, John

    2008-04-01

    The Orbital Express Demonstration System (OEDS) flight test successfully demonstrated technologies required to autonomously service satellites on-orbit. The mission's integrated robotics solution, the Orbital Express Demonstration Manipulator System (OEDMS) developed by MDA, performed critical flight test operations. The OEDMS comprised a six-jointed robotic manipulator arm and its avionics, non-proprietary servicing and ORU (Orbital Replacement Unit) interfaces, a vision and arm control system for autonomous satellite capture, and a suite of Ground Segment and Flight Segment software allowing script generation and execution under supervised or full autonomy. The arm was mounted on ASTRO, the servicer spacecraft developed by Boeing. The NextSat, developed by Ball Aerospace, served as the client satellite. The OEDMS demonstrated two key goals of the OEDS flight test: autonomous free-flyer capture and berthing of a client satellite, and autonomous transfer of ORUs from servicer to client and back. The paper provides a description of the OEDMS and the key operations it performed.

  5. Autonomous Robotic Refueling System (ARRS) for rapid aircraft turnaround

    NASA Astrophysics Data System (ADS)

    Williams, O. R.; Jackson, E.; Rueb, K.; Thompson, B.; Powell, K.

    An autonomous robotic refueling system is being developed to achieve rapid aircraft turnaround, notably during combat operations. The proposed system includes a gantry positioner with sufficient reach to position a robotic arm that performs the refueling tasks; a six degree of freedom manipulator equipped with a remote center of compliance, torque sensor, and a gripper that can handle standard tools; a computer vision system to locate and guide the refueling nozzle, inspect the nozzle, and avoid collisions; and an operator interface with video and graphics display. The control system software will include components designed for trajectory planning and generation, collision detection, sensor interfacing, sensory processing, and human interfacing. The robotic system will be designed so that upgrading to perform additional tasks will be relatively straightforward.

  6. Active object programming for military autonomous mobile robot software prototyping

    NASA Astrophysics Data System (ADS)

    Cozien, Roger F.

    2001-10-01

    While designing mobile robots, we think that the prototyping phase is really critical. Good and clever choices have to be made. Indeed, we may not easily upgrade such robots, and most of all, when the robot is on its own, any change in either the software or the physical body is going to be very difficult, if not impossible. Thus, a great effort has to be made when prototyping the robot. Furthermore, the kind of programming used is very important. If the programming model is not expressive enough, it becomes very difficult to add all the features needed to give the robot reactiveness and decision-making autonomy. Moreover, designing and prototyping the on-board software of a reactive robot brings other difficulties. Reactivity is not merely a matter of rapidity: a reactive system is a system able to respond to a huge range of situations for which it has no schedule. In other words, the robot does not know when a particular situation may occur, and above all, what it will be doing at that time and what its internal state will be. This kind of robot must be able to take a decision and to act even if it does not have all the contextual information. To do so, we use a computer language named oRis featuring object and active-object oriented programming, but also parallel and dynamic code (the code can be changed during its own execution). This last point has been made possible because oRis is fully interpreted; however, oRis may call fully compiled code, as well as Prolog and Java code. An oRis program may be distributed over several computers using TCP/IP network connections. The main issue in this paper is to show how active-object oriented programming, as a modern extension of object-oriented programming, may help us in designing autonomous mobile robots. Based on fully parallel software programming, active-object code allows us to give many features to a robot, and to easily solve

  7. Active objects programming for military autonomous mobile robots software prototyping

    NASA Astrophysics Data System (ADS)

    Cozien, Roger F.

    2001-09-01

    While designing mobile robots, we think that the prototyping phase is really critical. Good and clever choices have to be made. Indeed, we may not easily upgrade such robots, and most of all, when the robot is on its own, any change in either the software or the physical body is going to be very difficult, if not impossible. Thus, a great effort has to be made when prototyping the robot. Furthermore, the kind of programming used is very important. If the programming model is not expressive enough, it becomes very difficult to add all the features needed to give the robot reactiveness and decision-making autonomy. Moreover, designing and prototyping the on-board software of a reactive robot brings other difficulties. Reactivity is not merely a matter of rapidity: a reactive system is a system able to respond to a huge range of situations for which it has no schedule. In other words, the robot does not know when a particular situation may occur, and above all, what it will be doing at that time and what its internal state will be. This kind of robot must be able to take a decision and to act even if it does not have all the contextual information. To do so, we use a computer language named oRis featuring object and active-object oriented programming, but also parallel and dynamic code (the code can be changed during its own execution). This last point has been made possible because oRis is fully interpreted; however, oRis may call fully compiled code, as well as Prolog and Java code. An oRis program may be distributed over several computers using TCP/IP network connections. The main issue in this paper is to show how active-object oriented programming, as a modern extension of object-oriented programming, may help us in designing autonomous mobile robots. Based on fully parallel software programming, active-object code allows us to give many features to a robot, and to easily solve

  8. Acquisition of Autonomous Behaviors by Robotic Assistants

    NASA Technical Reports Server (NTRS)

    Peters, R. A., II; Sarkar, N.; Bodenheimer, R. E.; Brown, E.; Campbell, C.; Hambuchen, K.; Johnson, C.; Koku, A. B.; Nilas, P.; Peng, J.

    2005-01-01

    Our research achievements under the NASA-JSC grant contributed significantly in the following areas. Multi-agent-based robot control architecture, the Intelligent Machine Architecture (IMA): the Vanderbilt team received a Space Act Award for this research from NASA JSC in October 2004. Cognitive control and the Self Agent: cognitive control in humans is the ability to consciously manipulate thoughts and behaviors using attention to deal with conflicting goals and demands. We have been updating the IMA Self Agent towards this goal. If the opportunity arises, we would like to work with NASA to empower Robonaut to perform cognitive control. Applications: (1) SES for Robonaut; (2) Robonaut Fault Diagnostic System; (3) ISAC Behavior Generation and Learning; (4) Segway Research.

  9. Evolving self-assembly in autonomous homogeneous robots: experiments with two physical robots.

    PubMed

    Ampatzis, Christos; Tuci, Elio; Trianni, Vito; Christensen, Anders Lyhne; Dorigo, Marco

    2009-01-01

    This research work illustrates an approach to the design of controllers for self-assembling robots in which the self-assembly is initiated and regulated by perceptual cues that are brought forth by the physical robots through their dynamical interactions. More specifically, we present a homogeneous control system that can achieve assembly between two modules (two fully autonomous robots) of a mobile self-reconfigurable system without a priori introduced behavioral or morphological heterogeneities. The controllers are dynamic neural networks evolved in simulation that directly control all the actuators of the two robots. The neurocontrollers cause the dynamic specialization of the robots by allocating roles between them based solely on their interaction. We show that the best evolved controller proves to be successful when tested on a real hardware platform, the swarm-bot. The performance achieved is similar to the one achieved by existing modular or behavior-based approaches, also due to the effect of an emergent recovery mechanism that was neither explicitly rewarded by the fitness function, nor observed during the evolutionary simulation. Our results suggest that direct access to the orientations or intentions of the other agents is not a necessary condition for robot coordination: Our robots coordinate without direct or explicit communication, contrary to what is assumed by most research works in collective robotics. This work also contributes to strengthening the evidence that evolutionary robotics is a design methodology that can tackle real-world tasks demanding fine sensory-motor coordination. PMID:19463056

  10. Autonomous intelligent robotic manipulator for on-orbit servicing

    NASA Astrophysics Data System (ADS)

    Larouche, Benoit P.

    This doctoral research develops autonomous intelligent robotic manipulator technology for on-orbit servicing (OOS). More specifically, the research is focused on one of the most critical tasks in OOS: the capture of a non-cooperative object whilst minimizing impact forces and accelerations. The objectives of the research are the development of a vision-based control theory, and the implementation and testing of the developed theory by designing and constructing a custom non-redundant holonomic robotic manipulator. The research validated the newly developed control theory and its ability to (i) capture a moving target autonomously and (ii) minimize unfavourable contact dynamics during the most critical parts of the capture operation between the capture satellite and a non-cooperative/tumbling object. A custom robotic manipulator functional prototype has been designed, assembled, constructed, and programmed from concept to completion in order to provide full customizability and controllability in both the hardware and the software. Based on this test platform, a thorough experimental investigation has been conducted to validate the newly developed control methodologies that govern the behaviour of the robotic manipulator (RM) in an autonomous capture. The capture itself is effected on non-cooperative targets in a simulated zero-gravity environment. The RM employs a vision system, force sensors, and encoders in order to sense its environment. Control is effected through position and pseudo-torque inputs to three stepper motors and three servo motors. The controller is a modified hybrid force/neural-network impedance controller based on N. Hogan's original work. The experimental results demonstrate that the objectives of this thesis have been successfully achieved.

  11. Design and simulation of a motion controller for a wheeled mobile-robot autonomous navigation

    NASA Astrophysics Data System (ADS)

    Alhaj Ali, Souma M.; Hall, Ernest L.

    2005-10-01

    This paper describes the development of PD Computed-Torque (CT), PID Computed-Torque, and PD digital motion controllers for the autonomous navigation of a Wheeled Mobile Robot (WMR) in outdoor environments. The controllers select suitable control torques so that the WMR follows the desired path produced by a navigation algorithm described in a previous paper. The PD CT, PID CT, and PD digital controllers were developed using a linear system design procedure to select the feedback control signal that stabilizes the tracking error equation. The torques needed for the motors were computed using the inverse of the dynamic equation for the WMR. Simulation software was developed to evaluate the performance and efficiency of the controllers. Simulation results verified the effectiveness of the controllers under different motion trajectories; a comparison of the three controllers shows that the PD digital controller performed best, with a tracking error that did not exceed 0.05 using a 20 ms sample period. The significance of this work lies in the development of CT and digital controllers for WMR navigation, rather than for robot manipulators. These CT controllers will facilitate the use of WMRs in many applications, including defense, industrial, personal, and medical robots.
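
    The computed-torque structure shared by these controllers is an inner-loop feedback linearization of the robot dynamics with an outer linear PD (or PID) loop on the tracking error, roughly tau = M(q)(qdd_des + Kd*edot + Kp*e) + V(q, qd). Below is a minimal sketch of that law; the inertia matrix, nonlinear terms, and gains are placeholders, not the WMR model used in the paper.

    ```python
    import numpy as np

    def computed_torque_pd(q, qd, q_des, qd_des, qdd_des, M, V, Kp, Kd):
        """PD computed-torque law: tau = M(q)(qdd_des + Kd*edot + Kp*e) + V(q, qd).
        M and V are callables returning the inertia matrix and the combined
        Coriolis/gravity/friction terms (placeholders for the WMR dynamics)."""
        e = q_des - q                       # tracking error
        edot = qd_des - qd                  # tracking error rate
        v = qdd_des + Kd @ edot + Kp @ e    # outer-loop linear feedback
        return M(q) @ v + V(q, qd)          # inner-loop feedback linearization

    # Example with a constant diagonal inertia matrix and no nonlinear terms.
    M = lambda q: np.diag([2.0, 1.5])
    V = lambda q, qd: np.zeros(2)
    Kp, Kd = np.diag([25.0, 25.0]), np.diag([10.0, 10.0])
    tau = computed_torque_pd(np.zeros(2), np.zeros(2),
                             np.array([0.1, 0.2]), np.zeros(2), np.zeros(2),
                             M, V, Kp, Kd)
    print(tau)
    ```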

  12. A software architecture for autonomous orbital robotics

    NASA Astrophysics Data System (ADS)

    Henshaw, Carl G.; Akins, Keith; Creamer, N. Glenn; Faria, Matthew; Flagg, Cris; Hayden, Matthew; Healy, Liam; Hrolenok, Brian; Johnson, Jeffrey; Lyons, Kimberly; Pipitone, Frank; Tasker, Fred

    2006-05-01

    SUMO, the Spacecraft for the Universal Modification of Orbits, is a DARPA-sponsored spacecraft designed to provide orbital repositioning services to geosynchronous satellites. Such services may be needed to facilitate changing the geostationary slot of a satellite, to allow a satellite to be used until the propellant is expended instead of reserving propellant for a retirement burn, or to rescue a satellite stranded in geosynchronous transfer orbit due to a launch failure. Notably, SUMO is being designed to be compatible with the current geosynchronous satellite catalog, which implies that it does not require the customer spacecraft to have special docking fixtures, optical guides, or cooperative communications or pose sensors. In addition, the final approach and grapple will be performed autonomously. SUMO is being designed and built by the Naval Center for Space Technology, a division of the U.S. Naval Research Laboratory in Washington, DC. The nature of the SUMO concept mission leads to significant challenges in onboard spacecraft autonomy. Also, because research and development in machine vision, trajectory planning, and automation algorithms for SUMO is being pursued in parallel with flight software development, there are considerable challenges in prototyping and testing algorithms in situ and in transitioning these algorithms from laboratory form into software suitable for flight. This paper discusses these challenges, outlining the current SUMO design from the standpoint of flight algorithms and software. In particular, the design of the SUMO phase 1 laboratory demonstration software is described in detail. The proposed flight-like software architecture is also described.

  13. Knowledge/geometry-based Mobile Autonomous Robot Simulator (KMARS)

    NASA Technical Reports Server (NTRS)

    Cheng, Linfu; Mckendrick, John D.; Liu, Jeffrey

    1990-01-01

    Ongoing applied research is focused on developing guidance systems for robot vehicles. Problems facing the basic research needed to support this development (e.g., scene understanding and real-time vision processing) are major impediments to progress. Due to the complexity and the unpredictable nature of a vehicle's area of operation, more advanced vehicle control systems must be able to learn about obstacles within the range of their sensors. A better understanding of the basic exploration process is needed to provide critical support to developers of both sensor systems and intelligent control systems that can be used in a wide spectrum of autonomous vehicles. Elcee Computek, Inc. has been working under contract to the Flight Dynamics Laboratory, Wright Research and Development Center, Wright-Patterson AFB, Ohio to develop a Knowledge/Geometry-based Mobile Autonomous Robot Simulator (KMARS). KMARS has two parts: a geometry base and a knowledge base. The knowledge-base part of the system employs the expert-system shell CLIPS ('C' Language Integrated Production System) and the rules necessary to control both the vehicle's use of an obstacle-detecting sensor and the overall exploration process. The initial phase of the project has focused on the simulation of a point robot vehicle operating in a 2D environment.

  14. Low-cost semi-autonomous manipulation technique for explosive ordnance disposal robots

    NASA Astrophysics Data System (ADS)

    Czop, Andrew; Del Signore, Michael J.; Hacker, Kurt

    2008-04-01

    Robotic manipulators used on current EOD robotic platforms exhibit very few autonomous capabilities. This lack of autonomy forces the operator to completely control manipulator movements. With the increasing complexity of robotic manipulators, this can prove to be a very complex and tedious task. Autonomous capabilities for platform navigation are currently being extensively researched and applied to EOD robots. While autonomous manipulation has also been researched, this technology has yet to appear in fielded EOD robotic systems. As a result, there is a need for the exploration and development of manipulator automation within the scope of EOD robotics. In addition, due to the expendable nature of EOD robotic assets, this technology must add little to the overall cost of the robotic system. To directly address the need for a low-cost semi-autonomous manipulation capability for EOD robots, the Naval Explosive Ordnance Disposal Technology Division (NAVEODTECHDIV) proposes the Autonomous Robotic Manipulator (ARM). The ARM incorporates several semi-autonomous manipulation behaviors, including point-and-click movement, user-defined distance movement, user-defined angle positioning, memory locations to save and recall manipulator positions, and macros to memorize and repeat multi-position repetitive manipulator movements. These semi-autonomous behaviors will decrease an EOD operator's time on target by reducing the manipulation workload in a user-friendly fashion. This conference paper details the background of the project, the design of the prototype, algorithm development, implementation, results, and future work.

  15. Experiments in teleoperator and autonomous control of space robotic vehicles

    NASA Technical Reports Server (NTRS)

    Alexander, Harold L.

    1991-01-01

    A program of research embracing teleoperator and automatic navigational control of freely flying satellite robots is presented. Current research goals include: (1) developing visual operator interfaces for improved vehicle teleoperation; (2) determining the effects of different visual interface system designs on operator performance; and (3) achieving autonomous vision-based vehicle navigation and control. This research program combines virtual-environment teleoperation studies and neutral-buoyancy experiments using a space-robot simulator vehicle currently under development. Visual-interface design options under investigation include monoscopic versus stereoscopic displays and cameras, helmet-mounted versus panel-mounted display monitors, head-tracking versus fixed or manually steerable remote cameras, and the provision of vehicle-fixed visual cues, or markers, in the remote scene for improved sensing of vehicle position, orientation, and motion.

  16. Dynamic map building for an autonomous mobile robot

    SciTech Connect

    Leonard, J.J.; Durrant-Whyte, H.F. ); Cox, I.J. )

    1992-08-01

    This article presents an algorithm for autonomous map building and maintenance for a mobile robot. The authors believe that mobile robot navigation can be treated as a problem of tracking geometric features that occur naturally in the environment. They represent each feature in the map by a location estimate (the feature state vector) and two distinct measures of uncertainty: a covariance matrix to represent uncertainty in feature location, and a credibility measure to represent their belief in the validity of the feature. During each position update cycle, predicted measurements are generated for each geometric feature in the map and compared with actual sensor observations. Successful matches cause a feature's credibility to be increased. Unpredicted observations are used to initialize new geometric features, while unobserved predictions result in a geometric feature's credibility being decreased. They also describe experimental results obtained with the algorithm that demonstrate successful map building using real sonar data.
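
    The bookkeeping described above, a Kalman-style location update plus a scalar credibility that rises on matched predictions and falls on unobserved ones, can be sketched as follows. The gain and penalty values, the matching test, and the update callables are illustrative assumptions rather than the authors' parameters.

    ```python
    import numpy as np
    from dataclasses import dataclass

    @dataclass
    class Feature:
        x: np.ndarray            # feature location estimate (state vector)
        P: np.ndarray            # location uncertainty (covariance)
        credibility: float       # belief that the feature is valid

    def update_map(features, observations, match_fn, kalman_update, new_cov):
        """One map-maintenance cycle in the spirit of the description above.
        match_fn(feature, obs) decides whether a predicted measurement matches
        an observation; kalman_update(feature, obs) refines x and P."""
        used = [False] * len(observations)
        for f in features:
            hit = next((j for j, z in enumerate(observations)
                        if not used[j] and match_fn(f, z)), None)
            if hit is not None:
                used[hit] = True
                kalman_update(f, observations[hit])              # refine location
                f.credibility = min(1.0, f.credibility + 0.1)    # matched prediction
            else:
                f.credibility = max(0.0, f.credibility - 0.1)    # unobserved prediction
        # Unpredicted observations seed new candidate features.
        for j, z in enumerate(observations):
            if not used[j]:
                features.append(Feature(np.asarray(z, float), new_cov.copy(), 0.2))
        # Features whose credibility has decayed to zero are discarded.
        features[:] = [f for f in features if f.credibility > 0.0]
        return features
    ```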

  17. Emotion understanding from the perspective of autonomous robots research.

    PubMed

    Cañamero, Lola

    2005-05-01

    In this paper, I discuss some of the contributions that modeling emotions in autonomous robots can make towards understanding human emotions, 'as sited in the brain' and as used in our interactions with the environment, and emotions in general. Such contributions are linked, on the one hand, to the potential use of such robotic models as tools and 'virtual laboratories' to test and explore systematically theories and models of human emotions, and on the other hand to a modeling approach that fosters conceptual clarification and operationalization of the relevant aspects of theoretical notions and models. As illustrated by an overview of recent advances in the field, this area is still in its infancy. However, the work carried out already shows that we share many conceptual problems and interests with other disciplines in the affective sciences and that sound progress necessitates multidisciplinary efforts. PMID:15963689

  18. The WPI Autonomous Mobile Robot Project: A Progress Report

    NASA Astrophysics Data System (ADS)

    Green, Peter E.; Hall, Kyle S.

    1987-01-01

    This paper presents a report on the WPI autonomous mobile robot (WAMR). This robot is currently under development by the Intelligent Machines Project at WPI. Its purpose is to serve as a testbed for real-time artificial intelligence. WAMR is expected to find its way from one place in a building to another, avoiding people and obstacles en route. It is given no a priori knowledge of the building, but must learn about its environment by goal-directed exploration. Design concepts and descriptions of the major items completed thus far are presented. WAMR is a self-contained, wheeled robot that uses evidence-based techniques to reason about actions. The robot builds and continually updates a world model of its environment. This is done using a combination of ultrasonic and visual data. This world model is interpreted and movement plans are generated by a planner using real-time incremental evidence techniques. These movement plans are then carried out by a hierarchical evidence-based adaptive controller. Two interesting features of the robot are the line-imaging ultrasonic sensor and the video subsystem. The former uses frequency variation to form a line image of obstacles between one and twenty feet in front of the robot. The latter attempts to mimic the human eye using neural-network pattern recognition techniques. Several items have been completed thus far. The paper describes some of these, including the multiprocessor navigator and non-skid motion control system, the ultrasonic line imager, the concepts of the vision system, and the computer hardware and software environment.

  19. Autonomous, teleoperated, and shared control of robot systems

    SciTech Connect

    Anderson, R.J.

    1994-12-31

    This paper illustrates how different modes of operation such as bilateral teleoperation, autonomous control, and shared control can be described and implemented using combinations of modules in the SMART robot control architecture. Telerobotic modes are characterized by different "grids" of SMART icons, where each icon represents a portion of run-time code that implements a passive control law. By placing strict requirements on the module's input-output behavior and using scattering theory to develop a passive sampling technique, a flexible, expandable telerobot architecture is achieved. An automatic code generation tool for generating SMART systems is also described.

  20. On autonomous terrain model acquistion by a mobile robot

    NASA Technical Reports Server (NTRS)

    Rao, N. S. V.; Iyengar, S. S.; Weisbin, C. R.

    1987-01-01

    The following problem is considered: A point robot is placed in a terrain populated by an unknown number of polyhedral obstacles of varied sizes and locations in two/three dimensions. The robot is equipped with a sensor capable of detecting all the obstacle vertices and edges that are visible from the present location of the robot. The robot is required to autonomously navigate and build the complete terrain model using the sensor information. It is established that the necessary number of scanning operations for complete terrain model acquisition by any algorithm based on the scan-from-vertices strategy is $\sum_{i=1}^{n} N(O_i) - n$ in two-dimensional terrains and $\sum_{i=1}^{n} N(O_i) - 2n$ in three-dimensional terrains, where $O = \{O_1, O_2, \ldots, O_n\}$ is the set of obstacles in the terrain and $N(O_i)$ is the number of vertices of obstacle $O_i$.
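
    As a quick illustration of these bounds (a hypothetical example, not one from the paper), for three polyhedral obstacles with 4, 5, and 6 vertices the necessary number of scanning operations works out to:

    ```latex
    % Hypothetical example: n = 3 obstacles with N(O_1)=4, N(O_2)=5, N(O_3)=6.
    \begin{align*}
      \sum_{i=1}^{3} N(O_i) - n  &= (4 + 5 + 6) - 3 = 12 && \text{scans in a 2D terrain,} \\
      \sum_{i=1}^{3} N(O_i) - 2n &= (4 + 5 + 6) - 6 = 9  && \text{scans in a 3D terrain.}
    \end{align*}
    ```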

  1. Design of a Micro-Autonomous Robot for Use in Astronomical Instruments

    NASA Astrophysics Data System (ADS)

    Cochrane, W. A.; Luo, X.; Lim, T.; Taylor, W. D.; Schnetler, H.

    2012-07-01

    A Micro-Autonomous Positioning System (MAPS) has been developed using micro-autonomous robots for the deployment of small mirrors within multi-object astronomical instruments for use on the next generation ground-based telescopes. The micro-autonomous robot is a two-wheel differential drive robot with a footprint of approximately 20 × 20 mm. The robot uses two brushless DC Smoovy motors with 125:1 planetary gearheads for positioning the mirror. This article describes the various elements of the overall system and in more detail the various robot designs. Also described in this article is the build and test of the most promising design, proving that micro-autonomous robot technology can be used in precision controlled applications.

  2. Recognition of traversable areas for mobile robotic navigation in outdoor environments.

    SciTech Connect

    Hutchinson, Scott Alan; Davidson, James C.

    2003-06-01

    In this paper we consider the problem of automatically determining whether regions in an outdoor environment can be traversed by a mobile robot. We propose a two-level classifier that uses data from a single color image to make this determination. At the low level, we have implemented three classifiers based on color histograms, directional filters and local binary patterns. The outputs of these low level classifiers are combined using a voting scheme that weights the results of each classifier using an estimate of its error probability. We present results from a large number of trials using a database of representative images acquired in real outdoor environments.
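
    One way to realize such an error-weighted vote is to turn each low-level classifier's estimated error probability into a log-odds weight and sum signed votes; the weighting formula below is an assumption chosen for illustration, not necessarily the exact scheme used by the authors.

    ```python
    import numpy as np

    def weighted_vote(predictions, error_rates):
        """Combine binary traversable/non-traversable decisions from several
        low-level classifiers, weighting each vote by an estimate of its
        reliability (a sketch; the log-odds weighting is an assumption)."""
        votes = 0.0
        for label, err in zip(predictions, error_rates):
            w = np.log((1.0 - err) / max(err, 1e-6))   # more reliable -> larger weight
            votes += w if label == 1 else -w           # 1 = traversable, 0 = not
        return 1 if votes > 0 else 0

    # Example: color-histogram, directional-filter, and LBP classifiers disagree.
    print(weighted_vote([1, 0, 1], [0.10, 0.35, 0.20]))   # -> 1 (traversable)
    ```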

  3. Using insect electroantennogram sensors on autonomous robots for olfactory searches.

    PubMed

    Martinez, Dominique; Arhidi, Lotfi; Demondion, Elodie; Masson, Jean-Baptiste; Lucas, Philippe

    2014-01-01

    Robots designed to track chemical leaks in hazardous industrial facilities or explosive traces in landmine fields face the same problem as insects foraging for food or searching for mates: the olfactory search is constrained by the physics of turbulent transport. The concentration landscape of wind borne odors is discontinuous and consists of sporadically located patches. A pre-requisite to olfactory search is that intermittent odor patches are detected. Because of its high speed and sensitivity, the olfactory organ of insects provides a unique opportunity for detection. Insect antennae have been used in the past to detect not only sex pheromones but also chemicals that are relevant to humans, e.g., volatile compounds emanating from cancer cells or toxic and illicit substances. We describe here a protocol for using insect antennae on autonomous robots and present a proof of concept for tracking odor plumes to their source. The global response of olfactory neurons is recorded in situ in the form of electroantennograms (EAGs). Our experimental design, based on a whole insect preparation, allows stable recordings within a working day. In comparison, EAGs on excised antennae have a lifetime of 2 hr. A custom hardware/software interface was developed between the EAG electrodes and a robot. The measurement system resolves individual odor patches up to 10 Hz, which exceeds the time scale of artificial chemical sensors. The efficiency of EAG sensors for olfactory searches is further demonstrated in driving the robot toward a source of pheromone. By using identical olfactory stimuli and sensors as in real animals, our robotic platform provides a direct means for testing biological hypotheses about olfactory coding and search strategies. It may also prove beneficial for detecting other odorants of interest by combining EAGs from different insect species in a bioelectronic nose configuration or using nanostructured gas sensors that mimic insect antennae. PMID:25145980

  4. Using Insect Electroantennogram Sensors on Autonomous Robots for Olfactory Searches

    PubMed Central

    Martinez, Dominique; Arhidi, Lotfi; Demondion, Elodie; Masson, Jean-Baptiste; Lucas, Philippe

    2014-01-01

    Robots designed to track chemical leaks in hazardous industrial facilities or explosive traces in landmine fields face the same problem as insects foraging for food or searching for mates: the olfactory search is constrained by the physics of turbulent transport. The concentration landscape of wind borne odors is discontinuous and consists of sporadically located patches. A pre-requisite to olfactory search is that intermittent odor patches are detected. Because of its high speed and sensitivity, the olfactory organ of insects provides a unique opportunity for detection. Insect antennae have been used in the past to detect not only sex pheromones but also chemicals that are relevant to humans, e.g., volatile compounds emanating from cancer cells or toxic and illicit substances. We describe here a protocol for using insect antennae on autonomous robots and present a proof of concept for tracking odor plumes to their source. The global response of olfactory neurons is recorded in situ in the form of electroantennograms (EAGs). Our experimental design, based on a whole insect preparation, allows stable recordings within a working day. In comparison, EAGs on excised antennae have a lifetime of 2 hr. A custom hardware/software interface was developed between the EAG electrodes and a robot. The measurement system resolves individual odor patches up to 10 Hz, which exceeds the time scale of artificial chemical sensors. The efficiency of EAG sensors for olfactory searches is further demonstrated in driving the robot toward a source of pheromone. By using identical olfactory stimuli and sensors as in real animals, our robotic platform provides a direct means for testing biological hypotheses about olfactory coding and search strategies. It may also prove beneficial for detecting other odorants of interest by combining EAGs from different insect species in a bioelectronic nose configuration or using nanostructured gas sensors that mimic insect antennae

  5. Study on a human guidance method for autonomous cruise of indoor robot

    NASA Astrophysics Data System (ADS)

    Jia, Bao-Zhi; Zhu, Ming

    2011-12-01

    This paper describes a human guidance method for the autonomous cruise of an indoor robot. A low-cost robot follows a person through a room and records the path for later autonomous cruise using its monocular vision. Video-based object detection and tracking is used to detect the target in the video received from the robot's camera. The validity of the human guidance method is demonstrated experimentally.

  6. Laser-Based Pedestrian Tracking in Outdoor Environments by Multiple Mobile Robots

    PubMed Central

    Ozaki, Masataka; Kakimuma, Kei; Hashimoto, Masafumi; Takahashi, Kazuhiko

    2012-01-01

    This paper presents an outdoor laser-based pedestrian tracking system using a group of mobile robots located near each other. Each robot detects pedestrians from its own laser scan image using an occupancy-grid-based method, and tracks the detected pedestrians via Kalman filtering and global-nearest-neighbor (GNN)-based data association. The tracking data is broadcast to the other robots through intercommunication and combined using the covariance intersection (CI) method. For pedestrian tracking, each robot identifies its own posture using real-time-kinematic GPS (RTK-GPS) and laser scan matching. Using our cooperative tracking method, all the robots share the tracking data with each other; hence, individual robots can always recognize pedestrians that are invisible to any other robot. The simulation and experimental results show that cooperative tracking provides better tracking performance than conventional individual tracking. Our tracking system functions in a decentralized manner without any central server, and therefore provides a degree of scalability and robustness that cannot be achieved by conventional centralized architectures. PMID:23202171
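
    Covariance intersection fuses two estimates whose cross-correlation is unknown by forming a convex combination of their information (inverse-covariance) matrices. A minimal sketch follows, with the mixing weight chosen by a simple grid search over the fused covariance trace; the search granularity and the example numbers are illustrative assumptions.

    ```python
    import numpy as np

    def covariance_intersection(xa, Pa, xb, Pb, n_grid=50):
        """Fuse two track estimates with unknown cross-correlation using
        covariance intersection (omega found by grid search on the trace)."""
        Pa_inv, Pb_inv = np.linalg.inv(Pa), np.linalg.inv(Pb)
        best = None
        for w in np.linspace(0.01, 0.99, n_grid):
            P = np.linalg.inv(w * Pa_inv + (1.0 - w) * Pb_inv)
            if best is None or np.trace(P) < np.trace(best[1]):
                x = P @ (w * Pa_inv @ xa + (1.0 - w) * Pb_inv @ xb)
                best = (x, P)
        return best   # fused state and covariance

    # Example: two robots report the same pedestrian with different uncertainty.
    x, P = covariance_intersection(np.array([2.0, 1.0]), np.diag([0.5, 0.1]),
                                   np.array([2.3, 0.8]), np.diag([0.1, 0.4]))
    print(x, np.trace(P))
    ```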

  7. Laser-based pedestrian tracking in outdoor environments by multiple mobile robots.

    PubMed

    Ozaki, Masataka; Kakimuma, Kei; Hashimoto, Masafumi; Takahashi, Kazuhiko

    2012-01-01

    This paper presents an outdoor laser-based pedestrian tracking system using a group of mobile robots located near each other. Each robot detects pedestrians from its own laser scan image using an occupancy-grid-based method, and tracks the detected pedestrians via Kalman filtering and global-nearest-neighbor (GNN)-based data association. The tracking data is broadcast to the other robots through intercommunication and combined using the covariance intersection (CI) method. For pedestrian tracking, each robot identifies its own posture using real-time-kinematic GPS (RTK-GPS) and laser scan matching. Using our cooperative tracking method, all the robots share the tracking data with each other; hence, individual robots can always recognize pedestrians that are invisible to any other robot. The simulation and experimental results show that cooperative tracking provides better tracking performance than conventional individual tracking. Our tracking system functions in a decentralized manner without any central server, and therefore provides a degree of scalability and robustness that cannot be achieved by conventional centralized architectures. PMID:23202171

  8. Multi Sensor Fusion Framework for Indoor-Outdoor Localization of Limited Resource Mobile Robots

    PubMed Central

    Marín, Leonardo; Vallés, Marina; Soriano, Ángel; Valera, Ángel; Albertos, Pedro

    2013-01-01

    This paper presents a sensor fusion framework that improves the localization of mobile robots with limited computational resources. It employs an event-based Kalman Filter to combine the measurements of a global sensor and an inertial measurement unit (IMU) on an event-based schedule, using fewer resources (execution time and bandwidth) but with similar performance when compared to the traditional methods. The event is defined to reflect the necessity of the global information, when the estimation error covariance exceeds a predefined limit. The proposed experimental platforms are based on the LEGO Mindstorm NXT, and consist of a differential wheel mobile robot navigating indoors with a zenithal camera as global sensor, and an Ackermann steering mobile robot navigating outdoors with a SBG Systems GPS accessed through an IGEP board that also serves as a datalogger. The IMU in both robots is built using the NXT motor encoders along with one gyroscope, one compass and two accelerometers from Hitecnic, placed according to a particle-based dynamic model of the robots. The tests performed confirm the correct performance and low execution time of the proposed framework. Robustness and stability are observed during a long walk test in both indoor and outdoor environments. PMID:24152933
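
    The event-based schedule amounts to propagating the filter on cheap local measurements every cycle and requesting the expensive global sensor only when the position-error covariance crosses a limit. The sketch below illustrates that loop; the state layout, the threshold, and the callables are placeholders rather than the paper's implementation.

    ```python
    import numpy as np

    def event_based_fusion(steps, predict, update_global, read_global, P_limit):
        """Sketch of an event-based fusion schedule: propagate the filter with
        IMU/odometry every step, and request the global sensor only when the
        position-error covariance exceeds a predefined limit."""
        x = np.zeros(3)               # [x, y, heading]
        P = np.eye(3) * 0.01
        for _ in range(steps):
            x, P = predict(x, P)                       # cheap local prediction (IMU)
            if np.trace(P[:2, :2]) > P_limit:          # event: position too uncertain
                z = read_global()                      # costly global measurement
                x, P = update_global(x, P, z)          # standard Kalman update
        return x, P
    ```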

  9. Multi sensor fusion framework for indoor-outdoor localization of limited resource mobile robots.

    PubMed

    Marín, Leonardo; Vallés, Marina; Soriano, Ángel; Valera, Ángel; Albertos, Pedro

    2013-01-01

    This paper presents a sensor fusion framework that improves the localization of mobile robots with limited computational resources. It employs an event-based Kalman Filter to combine the measurements of a global sensor and an inertial measurement unit (IMU) on an event-based schedule, using fewer resources (execution time and bandwidth) but with similar performance when compared to the traditional methods. The event is defined to reflect the necessity of the global information, when the estimation error covariance exceeds a predefined limit. The proposed experimental platforms are based on the LEGO Mindstorm NXT, and consist of a differential wheel mobile robot navigating indoors with a zenithal camera as global sensor, and an Ackermann steering mobile robot navigating outdoors with a SBG Systems GPS accessed through an IGEP board that also serves as a datalogger. The IMU in both robots is built using the NXT motor encoders along with one gyroscope, one compass and two accelerometers from Hitecnic, placed according to a particle-based dynamic model of the robots. The tests performed confirm the correct performance and low execution time of the proposed framework. Robustness and stability are observed during a long walk test in both indoor and outdoor environments. PMID:24152933

  10. Semi-autonomous Simulated Brain Tumor Ablation with RavenII Surgical Robot using Behavior Tree

    PubMed Central

    Hu, Danying; Gong, Yuanzheng; Hannaford, Blake; Seibel, Eric J.

    2015-01-01

    Medical robots have been widely used to assist surgeons in carrying out dexterous surgical tasks in various ways. Most of these tasks require the surgeon's direct or indirect operation. A certain level of autonomy in robotic surgery could not only free the surgeon from some tedious repetitive tasks, but also exploit the advantages of the robot: high dexterity and accuracy. This paper presents a semi-autonomous neurosurgical procedure of brain tumor ablation using the RAVEN Surgical Robot and stereo visual feedback. By integrating with the behavior tree framework, the whole surgical task is modeled flexibly and intelligently as nodes and leaves of a behavior tree. This paper provides three main contributions: (1) describing brain tumor ablation as an ideal candidate for autonomous robotic surgery, (2) modeling and implementing the semi-autonomous surgical task using the behavior tree framework, and (3) designing an experimental simulated ablation task for feasibility study and robot performance analysis. PMID:26405563
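
    A behavior tree composes a task from small nodes whose tick() returns SUCCESS, FAILURE, or RUNNING; a Sequence node advances through its children only while they succeed, which is what makes the overall task easy to restructure and monitor. The toy layout below is a generic illustration of the pattern, not the actual RAVEN ablation tree.

    ```python
    class Node:
        def tick(self):                 # returns "SUCCESS", "FAILURE" or "RUNNING"
            raise NotImplementedError

    class Sequence(Node):
        """Runs children in order; stops as soon as one fails or keeps running."""
        def __init__(self, *children):
            self.children = children
        def tick(self):
            for child in self.children:
                status = child.tick()
                if status != "SUCCESS":
                    return status
            return "SUCCESS"

    class Action(Node):
        """Leaf wrapping a robot action (e.g., move tool, grasp, ablate)."""
        def __init__(self, fn):
            self.fn = fn
        def tick(self):
            return self.fn()

    # Hypothetical task layout; each lambda stands in for a real robot action.
    task = Sequence(Action(lambda: "SUCCESS"),      # locate target in stereo view
                    Action(lambda: "SUCCESS"),      # move tool above target
                    Action(lambda: "SUCCESS"))      # ablate and retract
    print(task.tick())                              # -> "SUCCESS"
    ```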

  11. Integrated robotic vehicle control system for outdoor container handling

    NASA Astrophysics Data System (ADS)

    Viitanen, Jouko O.; Haverinen, Janne; Mattila, Pentti; Maekelae, Hannu; von Numers, Thomas; Stanek, Zbigniev; Roening, Juha

    1997-09-01

    We describe an integrated system developed for use on board a moving work machine. The machine is targeted at applications such as automatic container handling at loading terminals. The main emphasis is on the various environment perception duties required by autonomous or semi-autonomous operation. These include obstacle detection, container position determination, localization needed for efficient navigation, and measurement of docking and grasping locations of containers. Practical experience is reported on the use of several different types of technology for these tasks. For close-distance measurement, such as container row following, ultrasonic measurement was used, with associated control software. For obstacle and docking position detection, 3D active vision techniques were developed with structured lighting, also utilizing motion estimation techniques. Depth-from-defocus methods were developed for passive 3D vision. For localization, data from several sources was fused. These included dead-reckoning data from odometry, an inertial unit, and several alternative external localization devices, i.e., real-time kinematic GPS, inductive transponders, and optical transponders. The system was integrated to run on a real-time operating system platform, using a high-level software specification tool that created the hierarchical control structure of the software.

  12. Assessment of a visually guided autonomous exploration robot

    NASA Astrophysics Data System (ADS)

    Harris, C.; Evans, R.; Tidey, E.

    2008-10-01

    A system has been developed to enable a robot vehicle to autonomously explore and map an indoor environment using only visual sensors. The vehicle is equipped with a single camera, whose output is wirelessly transmitted to an off-board standard PC for processing. Visual features within the camera imagery are extracted and tracked, and their 3D positions are calculated using a Structure from Motion algorithm. As the vehicle travels, obstacles in its surroundings are identified and a map of the explored region is generated. This paper discusses suitable criteria for assessing the performance of the system by computer-based simulation and practical experiments with a real vehicle. Performance measures identified include the positional accuracy of the 3D map and the vehicle's location, the efficiency and completeness of the exploration and the system reliability. Selected results are presented and the effect of key system parameters and algorithms on performance is assessed. This work was funded by the Systems Engineering for Autonomous Systems (SEAS) Defence Technology Centre established by the UK Ministry of Defence.

  13. Mechanical deployment system on aries an autonomous mobile robot

    SciTech Connect

    Rocheleau, D.N.

    1995-12-01

    ARIES (Autonomous Robotic Inspection Experimental System) is under development for the Department of Energy (DOE) to survey and inspect drums containing low-level radioactive waste stored in warehouses at DOE facilities. This paper focuses on the mechanical deployment system, referred to as the camera positioning system (CPS), used in the project. The CPS is used for positioning four identical but separate camera packages consisting of vision cameras and other required sensors such as bar-code readers and light stripe projectors. The CPS is attached to the top of a mobile robot and consists of two mechanisms. The first is a lift mechanism composed of five interlocking rail elements, which starts from a retracted position and extends upward to simultaneously position three separate camera packages to inspect the top three drums of a column of four drums. The second is a parallelogram (special-case Grashof) four-bar mechanism, which is used for positioning a camera package on drums on the floor. Both mechanisms are the subject of this paper, with the lift mechanism discussed in detail.

  14. Versatile 360-deg panoramic optical system for autonomous robots

    NASA Astrophysics Data System (ADS)

    Barton, George G.; Feldman, Sidney; Beckstead, Jeffrey A.; Nordhauser, Sidney R.

    1999-01-01

    Autonomous mobile robots require wide-angle vision for navigation and threat detection and analysis, best served with full panoramic vision. The panoramic optical element is a unique, inexpensive, first-surface reflective aspheric convex cone. This cone can be sized and configured for any vertical FOV desired. The cone acts as a negative optical element generating a panoramic virtual image. When this virtual image is viewed through a standard camera lens, it produces at the lens's focal plane a panoramic toroidal image with a translational linearity of > 99 percent. One of three image transducers can be used to convert the toroidal panoramic image to a video signal: raster-scanned CCDs, radially scanned Vidicons, and linear CCD arrays on a mechanically rotated stage each have their own particular advantage. Field object distances can be determined in two ways. If the robot is moving, the range can be calculated from the size change of a field object versus the distance traversed in a specific time interval. By vertically displacing the panoramic camera by several inches, a quasi-binocular system is created and the range determined by simple math. Ranging thus produces the third dimension.

  15. PRIMUS: realization aspects of an autonomous unmanned robot

    NASA Astrophysics Data System (ADS)

    Schwartz, Ingo

    1998-07-01

    The experimental program PRIMUS (PRogram of Intelligent Mobile Unmanned Systems) aims to demonstrate the autonomous driving of an unmanned robot in open terrain, with the goal of achieving the highest possible degree of autonomy. A small tracked vehicle (Wiesel 2) is used as the robot vehicle. This tank is configured as a drive-by-wire system and is therefore well suited for the adaptation of control computers. A sensor package is integrated for navigation and orientation in open terrain. To detect obstacles, the scene in the driving corridor of the robot is scanned four times per second by a 3D range-image camera (LADAR). The measured 3D range image is converted into a 2D obstacle map and used as input for the calculation of an obstacle-free path. The combination of local navigation (obstacle avoidance) and global navigation leads to collision-free driving in open terrain to a predefined goal point at velocities of up to 25 km/h. In addition, a contour tracker with a TV camera as sensor is implemented, which allows the robot to follow contours (e.g., the edge of a meadow) or to drive on paved and unpaved roads at velocities of up to 50 km/h. Driving in open terrain places high demands on the real-time implementation of all sub-functions in the system. Most of the described functions are coded in the programming language Ada. The software is embedded in a distributed VMEbus-based multicomputer/multiprocessor system. Up to 20 PowerPC 603 and some 68030/40 CPUs are used to build a high-performance computer system. The hardware is adapted to the environmental conditions of the tracked vehicle.
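
    Collapsing each 3D range scan into a 2D obstacle map typically means projecting the measured points onto the ground plane and marking grid cells whose height deviates too far from it. The sketch below illustrates that step; the grid size, height threshold, and coordinate conventions are illustrative assumptions, not the PRIMUS implementation.

    ```python
    import numpy as np

    def range_image_to_obstacle_map(ranges, angles, heights, cell=0.5, extent=20.0,
                                    height_thresh=0.3):
        """Collapse one 3-D range scan into a 2-D obstacle grid in front of the
        vehicle by marking cells whose measured height deviates too much from
        the ground plane (thresholds and grid layout are assumptions)."""
        n = int(2 * extent / cell)
        grid = np.zeros((n, n), dtype=bool)               # True = obstacle cell
        xs = ranges * np.cos(angles)                      # forward distance
        ys = ranges * np.sin(angles)                      # lateral offset
        for x, y, h in zip(xs, ys, heights):
            if abs(h) > height_thresh and 0.0 <= x < 2 * extent and abs(y) < extent:
                grid[int(x / cell), int((y + extent) / cell)] = True
        return grid

    # Example: three returns, one of them 0.8 m above ground at roughly (5 m, 1 m).
    grid = range_image_to_obstacle_map(np.array([5.1, 6.0, 7.0]),
                                       np.array([0.2, 0.0, -0.1]),
                                       np.array([0.8, 0.05, 0.0]))
    print(grid.sum())   # -> 1 obstacle cell marked
    ```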

  16. [Locomotion and control study on autonomous interventional diagnostic micro-robots].

    PubMed

    Gu, Da-qiang; Zhou, Yong

    2008-09-01

    This paper reviews in detail the locomotion control and research status of autonomous interventional diagnostic micro-robots, outlines existing technical problems and difficulties, and discusses trends in the development of locomotion control. PMID:19119659

  17. Interaction dynamics of multiple autonomous mobile robots in bounded spatial domains

    NASA Technical Reports Server (NTRS)

    Wang, P. K. C.

    1989-01-01

    A general navigation strategy for multiple autonomous robots in a bounded domain is developed analytically. Each robot is modeled as a spherical particle (i.e., an effective spatial domain about the center of mass); its interactions with other robots or with obstacles and domain boundaries are described in terms of the classical many-body problem; and a collision-avoidance strategy is derived and combined with homing, robot-robot, and robot-obstacle collision-avoidance strategies. Results from homing simulations involving (1) a single robot in a circular domain, (2) two robots in a circular domain, and (3) one robot in a domain with an obstacle are presented in graphs and briefly characterized.

  18. Human-robot interaction for field operation of an autonomous helicopter

    NASA Astrophysics Data System (ADS)

    Jones, Henry L.; Frew, Eric W.; Woodley, Bruce R.; Rock, Stephen M.

    1999-01-01

    The robustness of autonomous robotic systems to unanticipated circumstances is typically insufficient for use in the field. The many skills of a human user often fill this gap in robotic capability. To incorporate the human into the system, a useful interaction between man and machine must exist. This interaction should enable useful communication to be exchanged in a natural way between human and robot on a variety of levels. This paper describes the current human-robot interaction of the Stanford HUMMINGBIRD autonomous helicopter. In particular, the paper discusses the elements of the system that enable multiple levels of communication. An intelligent system agent manages the different inputs given to the helicopter. An advanced user interface gives the user and helicopter a method for exchanging useful information. Using this human-robot interaction, the HUMMINGBIRD has carried out various autonomous search, tracking, and retrieval missions.

  19. Concept for practical exercises for studying autonomous flying robots in a university environment: Part I

    NASA Astrophysics Data System (ADS)

    Band, Ricardo; Pleban, Johann; Schön, Stefan; Creutzburg, Reiner; Fischer, Arno

    2013-03-01

    The aim of this paper is to demonstrate the usefulness of a concept of practical exercises for studying autonomous flying robots, aimed at computer science students in a university environment. It shows how students can assemble, program, fly, network, and apply autonomous flying robots (e.g., drones, quadrocopters, hexacopters, octocopters, helicopters, helicams, and bugbots) in different exercises, improving their practical skills and their theoretical knowledge of the various aspects involved.

  20. An effective trace-guided wavefront navigation and map-building approach for autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Luo, Chaomin; Krishnan, Mohan; Paulik, Mark; Jan, Gene Eu

    2013-12-01

    This paper addresses a trace-guided real-time navigation and map-building approach for an autonomous mobile robot. A wavefront-based global path planner is developed to generate a global trajectory for the robot. A Modified Vector Field Histogram (M-VFH) local navigator, driven by LIDAR sensor information, guides the robot to traverse autonomously with obstacle avoidance by following the traces provided by the global path planner. A local map composed of square grids is created by the local navigator as the robot traverses with limited LIDAR sensory information; from the measured sensor data, a map of the robot's immediate surroundings is built dynamically for navigation. The real-time wavefront-based navigation and map-building methodology has been demonstrated successfully in a Player/Stage simulation environment. With the wavefront-based global path planner and the M-VFH local navigator, a safe, short, and reasonable trajectory is planned in the majority of situations without templates, without explicitly optimizing any global cost function, and without any learning procedure. The effectiveness, feasibility, efficiency, and simplicity of the proposed approach have been validated by simulation and comparison studies, which show that it plans more reasonable and shorter collision-free trajectories than the other path-planning approaches considered.
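
    A wavefront (brushfire) planner of the kind referenced above can be sketched as a breadth-first expansion of cost-to-go values from the goal, followed by descent over those values. The Python sketch below assumes a boolean occupancy grid and 4-connectivity; it is a generic illustration, not the authors' implementation.

```python
import numpy as np
from collections import deque

def wavefront(grid, goal):
    """Breadth-first wavefront expansion from the goal over a 2D occupancy grid
    (True = obstacle). Returns an integer cost-to-go map (-1 = unreachable)."""
    cost = np.full(grid.shape, -1, dtype=int)
    cost[goal] = 0
    queue = deque([goal])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < grid.shape[0] and 0 <= nc < grid.shape[1]
                    and not grid[nr, nc] and cost[nr, nc] < 0):
                cost[nr, nc] = cost[r, c] + 1
                queue.append((nr, nc))
    return cost

def extract_path(cost, start):
    """Follow strictly decreasing cost values from the start cell to the goal."""
    path = [start]
    while cost[path[-1]] > 0:
        r, c = path[-1]
        nbrs = [(r + dr, c + dc) for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if 0 <= r + dr < cost.shape[0] and 0 <= c + dc < cost.shape[1]
                and cost[r + dr, c + dc] >= 0]
        path.append(min(nbrs, key=lambda p: cost[p]))
    return path
```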

  1. Robot mapping in large-scale mixed indoor and outdoor environments

    NASA Astrophysics Data System (ADS)

    Rogers, John G.; Young, Stuart; Gregory, Jason; Nieto-Granda, Carlos; Christensen, Henrik I.

    2013-05-01

    Tactical situational awareness in unstructured and mixed indoor/outdoor scenarios is needed for urban combat as well as rescue operations. Two of the key functionalities needed by robot systems to function in an unknown environment are the ability to build a map of the environment and to determine their position within that map. In this paper, we present a strategy to build dense maps and to automatically close loops from 3D point clouds; this has been integrated into a mapping system dubbed OmniMapper. We present both the underlying system and experimental results from a variety of environments, including office buildings, military training facilities, and large-scale mixed indoor and outdoor environments.

  2. Distributed, collaborative human-robotic networks for outdoor experiments in search, identify and track

    NASA Astrophysics Data System (ADS)

    Lee, Daniel; McClelland, Mark; Schneider, Joseph; Yang, Tsung-Lin; Gallagher, Dan; Wang, John; Shah, Danelle; Ahmed, Nisar; Moran, Pete; Jones, Brandon; Leung, Tung-Sing; Nathan, Aaron; Kress-Gazit, Hadas; Campbell, Mark

    2010-10-01

    This paper presents an overview of a human-robotic system under development at Cornell which is capable of mapping an unknown environment, as well as discovering, tracking, and neutralizing several static and dynamic objects of interest. In addition, the robots can coordinate their individual tasks with one another without overly burdening a human operator. The testbed utilizes the Segway RMP platform, with lidar, vision, IMU and GPS sensors. The software draws from autonomous systems research, specifically in the areas of pose estimation, target detection and tracking, motion and behavioral planning, and human robot interaction. This paper also details experimental scenarios of mapping, tracking, and neutralization presented by way of pictures, data, and movies.

  3. Behavior-based multi-robot collaboration for autonomous construction tasks

    NASA Technical Reports Server (NTRS)

    Stroupe, Ashley; Huntsberger, Terry; Okon, Avi; Aghazarian, Hrand; Robinson, Matthew

    2005-01-01

    The Robot Construction Crew (RCC) is a heterogeneous multi-robot system for the autonomous construction of a structure through the assembly of long components. The two-robot team demonstrates component placement into an existing structure in a realistic environment. The task requires component acquisition, cooperative transport, and cooperative precision manipulation. A behavior-based architecture provides adaptability. The RCC approach minimizes computation, power, communication, and sensing for applicability to space-related construction efforts, but the techniques are also applicable to terrestrial construction tasks.

  4. Regolith Advanced Surface Systems Operations Robot (RASSOR) Phase 2 and Smart Autonomous Sand-Swimming Excavator

    NASA Technical Reports Server (NTRS)

    Sandy, Michael

    2015-01-01

    The Regolith Advanced Surface Systems Operations Robot (RASSOR) Phase 2 is an excavation robot for mining regolith on a planet such as Mars. The robot is programmed using the Robot Operating System (ROS) and is developed with the aid of a physical simulation program called Gazebo. This internship focused on various functions of the software in order to make the robot more capable and efficient. During the internship, another project, the Smart Autonomous Sand-Swimming Excavator, was also worked on. This is a robot designed to dig through sand and extract sample material. The intern worked on programming the Sand-Swimming robot and on designing the electrical system to power and control it.

  5. A hybrid microbial dielectric elastomer generator for autonomous robots

    NASA Astrophysics Data System (ADS)

    Anderson, Iain A.; Ieropoulos, Ioannis; McKay, Thomas; O'Brien, Benjamin; Melhuish, Chris

    2010-04-01

    We are developing a hybrid Dielectric Elastomer Generator (DEG)-Microbial Fuel Cell (MFC) energy harvester. The system is intended for EcoBot, an autonomous robot that currently uses its MFCs to extract electrical energy from biomass in the form of flies. MFCs, though reliable, are slow to store charge; EcoBot operations are therefore characterized by active periods followed by dormant periods while energy stores recover. Providing an alternative energy harvester such as a DEG, driven by wind or water, could increase active time and also provide high-voltage energy for direct use by on-board systems employing dielectric elastomer actuators (DEAs). Energy can be harvested from a DEG when work is done on its elastomer membrane. However, the DEG requires an initial charge, plus additional charge to compensate for leakage losses. The starting charge can be supplied by the EcoBot MFC capacitor. We have developed a self-priming circuit that uses some of the harvested charge to prime the membrane on each cycle. The low-voltage MFC priming charge was boosted using a voltage converter that was then electrically disconnected. The DEG membrane was cyclically stretched, producing charge that replenished leakage losses and energy that could potentially be stored. A further study demonstrated that the DEG with the self-priming circuit can boost voltage from very low values without the need for a voltage converter, reducing circuit complexity and improving efficiency.

  6. Biomimetic autonomous robot inspired by the Cyanea capillata (Cyro).

    PubMed

    Villanueva, Alex A; Marut, Kenneth J; Michael, Tyler; Priya, Shashank

    2013-12-01

    A biomimetic robot inspired by Cyanea capillata, termed 'Cyro', was developed to meet the functional demands of underwater surveillance in defense and civilian applications. The vehicle was designed to mimic the morphology and swimming mechanism of its natural counterpart. The body of the vehicle consists of a rigid support structure with linear DC motors that actuate eight mechanical arms. The mechanical arms, in conjunction with an artificial mesoglea, create the hydrodynamic force required for propulsion. The full vehicle measures 170 cm in diameter and has a total mass of 76 kg. An analytical model of the mechanical arm kinematics was developed, and the analytical and experimental bell kinematics were analyzed and compared with those of C. capillata. Cyro reached the water surface untethered and autonomously from a depth of 182 cm in five actuation cycles. It achieved an average velocity of 8.47 cm s(-1) while consuming an average power of 70 W. A two-axis thrust stand was developed to measure the thrust directly from a single bell segment, yielding an average thrust of 27.9 N for the whole vehicle. Steady-state velocity was not reached during Cyro's swimming test, but the measured performance during its last swim cycle corresponded to a cost of transport of 10.9 J (kg ⋅ m)(-1) and a total efficiency of 0.03. PMID:24166747

  7. Meeting the Complex 21 challenge: Autonomous mobile robotics

    SciTech Connect

    Holland, J.M.

    1993-12-31

    Complex 21 focuses attention on developing the technology to store, inventory, account for, protect, and maintain nuclear material into the 21st Century. The optimum nuclear storage facility would be one operated by a minimum number of on-site personnel. "As many people as necessary and as few as possible" would be a good rule of thumb for staffing a nuclear storage site. Human presence adds certain safety and security considerations to the technology equation. It is no small chore to fashion a technological solution that meets the combined challenges of nuclear material handling, physical protection, inventory management, fire watch, security, and personnel safety. What is needed is a multi-purpose technology with industrial, military, scientific, and security applications; a technology that can pull and carry; one that can autonomously patrol large facilities, monitor environmental conditions, detect smoke, gas, flame, heat, humidity, intrusion, chemical, and radiation hazards; one that can survey inventory and keep accurate, detailed data logs; one that can react to and interact with people, material, equipment, conditions, and events in its work environment. Cybermotion Inc., of Roanoke, VA, has been designing, manufacturing, selling and supporting mobile robotic systems since 1984. The company's systems are at work in research, industrial, military, nuclear, security, and hazardous environment applications around the world. This paper describes some of these applications and especially the type of instruments they carry to perform monitoring and security patrols.

  8. Concept formation and generalization based on experimentation by an autonomous mobile robot

    SciTech Connect

    Spelt, P.F.; deSaussure, G.; Lyness, E.; Oliver, G.; Silliman, M.

    1989-01-01

    The Center for Engineering Systems Advanced Research (CESAR) conducts basic research in the area of intelligent machines. In this paper, we describe our approach to a class of machine learning problems which involves autonomous concept formation using feedback from trial-and-error learning. Our formulation was experimentally validated on an autonomous mobile robot, which learned the task of control panel monitoring and manipulation for effective process control. Conclusions are drawn concerning the applicability of the system to a more general class of learning problems, and implications for the use of autonomous mobile robots in hostile and unknown environments are discussed. 9 refs., 5 figs.

  9. Application of autonomous robotized systems for the collection of nearshore topographic changing and hydrodynamic measurements

    NASA Astrophysics Data System (ADS)

    Belyakov, Vladimir; Makarov, Vladimir; Zezyulin, Denis; Kurkin, Andrey; Pelinovsky, Efim

    2015-04-01

    Hazardous phenomena in the coastal zone cause topographic changes that are difficult to survey by traditional methods, which is why autonomous robots are used for the collection of nearshore topographic and hydrodynamic measurements. The robot RTS-Hanna is well known (Wubbold, F., Hentschel, M., Vousdoukas, M., and Wagner, B. Application of an autonomous robot for the collection of nearshore topographic and hydrodynamic measurements. Coastal Engineering Proceedings, 2012, vol. 33, Paper 53). We describe here several mobile systems developed in the Laboratory of Transported Machines and Transported Complexes, Nizhny Novgorod State Technical University. They can be used in field surveys and in monitoring nearshore wave regimes.

  10. An intelligent hybrid behavior coordination system for an autonomous mobile robot

    NASA Astrophysics Data System (ADS)

    Luo, Chaomin; Krishnan, Mohan; Paulik, Mark; Fallouh, Samer

    2013-12-01

    In this paper, the development of a low-cost PID controller with an intelligent behavior coordination system for an autonomous mobile robot is described. The robot is equipped with IR sensors, ultrasonic sensors, a voltage regulator, and RC filters on a platform based on an HCS12 microcontroller and embedded systems. A novel hybrid PID controller and behavior coordination system is developed for wall-following navigation and obstacle avoidance. The adaptive control used on this robot is a hybrid PID algorithm combined with template and behavior coordination models. The software comprises motor control, the behavior coordination intelligent system, and sensor fusion; a module-based programming technique is adopted to improve the efficiency of integrating the hybrid PID, template, and behavior coordination algorithms. The motor control, obstacle avoidance, and wall-following navigation algorithms propel and steer the robot. Experiments validate how the PID controller and behavior coordination system direct the autonomous mobile robot to perform wall-following navigation with obstacle avoidance; the hardware configuration and module-based technique are also described. Experimental results demonstrate that the robot is successfully guided by the hybrid PID controller and behavior coordination system for wall-following navigation with obstacle avoidance.
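
    For readers unfamiliar with the controller family used here, a minimal discrete PID loop for wall following might look like the sketch below. The gains, sampling period, and set-point are placeholders; the paper's HCS12 implementation and its hybrid template/behavior logic are not reproduced.

```python
class PID:
    """Minimal discrete PID controller; gains are illustrative only."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def wall_following_step(pid, side_distance_m, desired_m=0.4):
    """Steering command (positive = turn toward the wall) that holds the
    measured side distance at desired_m."""
    return pid.update(desired_m - side_distance_m)

# Example: pid = PID(kp=2.0, ki=0.1, kd=0.5, dt=0.05), then call
# wall_following_step(pid, ultrasonic_reading) every 50 ms.
```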

  11. Advances in Robotic, Human, and Autonomous Systems for Missions of Space Exploration

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R.; Briggs, Geoffrey A.; Glass, Brian J.; Pedersen, Liam; Kortenkamp, David M.; Wettergreen, David S.; Nourbakhsh, I.; Clancy, Daniel J.; Zornetzer, Steven (Technical Monitor)

    2002-01-01

    Space exploration missions are evolving toward more complex architectures involving more capable robotic systems, new levels of human and robotic interaction, and increasingly autonomous systems. How this evolving mix of advanced capabilities will be utilized in the design of new missions is a subject of much current interest. Cost and risk constraints also play a key role, resulting in a complex interplay of a broad range of factors in mission development and planning. This paper will discuss how human, robotic, and autonomous systems could be used in advanced space exploration missions. In particular, a recently completed survey of the state of the art and the potential future of robotic systems, as well as new experiments utilizing human and robotic approaches, will be described. Finally, there will be a discussion of how best to utilize these various approaches for meeting space exploration goals.

  12. Multi-robot terrain coverage and task allocation for autonomous detection of landmines

    NASA Astrophysics Data System (ADS)

    Dasgupta, Prithviraj; Muñoz-Meléndez, Angélica; Guruprasad, K. R.

    2012-06-01

    Multi-robot systems comprising heterogeneous autonomous vehicles on land, in the air, and on water are increasingly used to assist or replace humans in hazardous missions. Two crucial aspects of such multi-robot systems are to: a) explore an initially unknown region of interest to discover tasks, and b) allocate and share the discovered tasks between the robots in a coordinated manner using a multi-robot task allocation (MRTA) algorithm. In this paper, we describe results from our research on multi-robot terrain coverage and MRTA algorithms within an autonomous landmine detection scenario, carried out as part of the COMRADES project. Each robot is equipped with a different type of landmine detection sensor, and different sensors, even of the same type, can have different degrees of accuracy. The landmine detection operations performed by each robot are abstracted as tasks, and multiple robots are required to complete a single task. First, we describe a distributed and robust terrain coverage algorithm that employs Voronoi partitions to divide the area of interest among the robots and then uses a single-robot coverage algorithm to explore each partition for potential landmines. Then, we describe MRTA algorithms that use the location information of discovered potential landmines and employ either a greedy or an opportunistic strategy to allocate tasks among the robots while attempting to minimize the time (energy) expended to perform them. We report experimental results of our algorithms using accurately simulated Corobot robots within the Webots simulator performing a multi-robot landmine detection operation.
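
    As a rough illustration of the greedy allocation strategy mentioned above, the sketch below repeatedly assigns the globally closest free robot-task pair. It is a deliberate simplification (one robot per task, Euclidean travel cost) and not the COMRADES MRTA algorithm itself.

```python
import math

def greedy_allocation(robot_positions, task_positions):
    """Greedy task allocation: repeatedly assign the closest free (robot, task)
    pair, approximating minimum travel time. Positions are (x, y) tuples."""
    assignments = {}
    free_robots = set(range(len(robot_positions)))
    free_tasks = set(range(len(task_positions)))
    while free_robots and free_tasks:
        r, t = min(((r, t) for r in free_robots for t in free_tasks),
                   key=lambda rt: math.dist(robot_positions[rt[0]],
                                            task_positions[rt[1]]))
        assignments[r] = t
        free_robots.remove(r)
        free_tasks.remove(t)
    return assignments

# Example: greedy_allocation([(0, 0), (5, 5)], [(1, 1), (4, 6), (9, 9)])
# assigns robot 0 to the task at (1, 1) and robot 1 to the task at (4, 6).
```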

  13. An Autonomous Mobile Robot Guided by a Chaotic True Random Bits Generator

    NASA Astrophysics Data System (ADS)

    Volos, Ch. K.; Kyprianidis, I. M.; Stouboulos, I. N.; Stavrinides, S. G.; Anagnostopoulos, A. N.

    In this work a robot controller that produces chaotic motion in an autonomous mobile robot is presented. This strategy, useful in many robotic missions, generates an unpredictable trajectory by using a chaotic path-planning generator. The proposed generator produces a trajectory that is the result of a sequence of planned target locations. In contrast with other similar works, this one is based on a new chaotic true random bits generator, whose key feature is the coexistence of two different synchronization phenomena between mutually coupled identical nonlinear circuits. Simulation tests confirm that the robot's entire workspace is covered in an unpredictable way within a very satisfactory time.
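
    The idea of driving a path planner from a random bit stream can be sketched as follows. Here Python's getrandbits stands in for the chaotic true random bits generator built from the coupled circuits, and the grid dimensions are arbitrary illustrative choices.

```python
import random

def random_bit_waypoints(bit_source, n_targets, grid_w, grid_h):
    """Turn a stream of (ideally truly random) bits into a sequence of target
    cells covering a grid_w x grid_h workspace, using rejection sampling so
    every cell is equally likely."""
    bits_x = max(1, (grid_w - 1).bit_length())
    bits_y = max(1, (grid_h - 1).bit_length())
    targets = []
    while len(targets) < n_targets:
        x = bit_source(bits_x)
        y = bit_source(bits_y)
        if x < grid_w and y < grid_h:
            targets.append((x, y))
    return targets

# Example: random_bit_waypoints(random.getrandbits, 20, 16, 16) yields 20
# target cells; a true random bit source would replace random.getrandbits.
```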

  14. Remote wave measurements using autonomous mobile robotic systems

    NASA Astrophysics Data System (ADS)

    Kurkin, Andrey; Zeziulin, Denis; Makarov, Vladimir; Belyakov, Vladimir; Tyugin, Dmitry; Pelinovsky, Efim

    2016-04-01

    The project covers the development of a technology for monitoring and forecasting the state of the coastal-zone environment using radar equipment transported by autonomous mobile robotic systems (AMRS). The target areas of application are the eastern and northern coasts of Russia, where continuous collection of information on topographic changes of the coastal zone and hydrodynamic measurements in environments inaccessible to humans are needed. The intensity of the wave reflections received by the radar is directly related to the wave height. Mathematical models and algorithms have been developed for processing the experimental data (signal selection, spectral analysis, wavelet analysis), for recalculating shoreline run-up from offshore wave heights, and for determining threshold values of offshore wave heights. A program complex for operating the experimental AMRS prototype has been developed, comprising the following modules: data loading, reporting, georeferencing, data analysis, monitoring, hardware control, and a graphical user interface. Further work will involve testing the manufactured experimental prototype along selected coastline routes on Sakhalin Island. The field tests will reveal shortcomings of the design and identify ways to optimize the structure and operating algorithms of the AMRS as well as of the measuring equipment. The presented results were obtained at Nizhny Novgorod State Technical University n.a. R. Alekseev in the framework of the Federal Target Program «Research and development on priority directions of scientific-technological complex of Russia for 2014 - 2020 years» (agreement № 14.574.21.0089, unique identifier RFMEFI57414X0089).

  15. An integrated design and fabrication strategy for entirely soft, autonomous robots.

    PubMed

    Wehner, Michael; Truby, Ryan L; Fitzgerald, Daniel J; Mosadegh, Bobak; Whitesides, George M; Lewis, Jennifer A; Wood, Robert J

    2016-08-25

    Soft robots possess many attributes that are difficult, if not impossible, to achieve with conventional robots composed of rigid materials. Yet, despite recent advances, soft robots must still be tethered to hard robotic control systems and power sources. New strategies for creating completely soft robots, including soft analogues of these crucial components, are needed to realize their full potential. Here we report the untethered operation of a robot composed solely of soft materials. The robot is controlled with microfluidic logic that autonomously regulates fluid flow and, hence, catalytic decomposition of an on-board monopropellant fuel supply. Gas generated from the fuel decomposition inflates fluidic networks downstream of the reaction sites, resulting in actuation. The body and microfluidic logic of the robot are fabricated using moulding and soft lithography, respectively, and the pneumatic actuator networks, on-board fuel reservoirs and catalytic reaction chambers needed for movement are patterned within the body via a multi-material, embedded 3D printing technique. The fluidic and elastomeric architectures required for function span several orders of magnitude from the microscale to the macroscale. Our integrated design and rapid fabrication approach enables the programmable assembly of multiple materials within this architecture, laying the foundation for completely soft, autonomous robots. PMID:27558065

  16. How to make an autonomous robot as a partner with humans: design approach versus emergent approach.

    PubMed

    Fujita, M

    2007-01-15

    In this paper, we discuss what factors are important to realize an autonomous robot as a partner with humans. We believe that it is important to interact with people without boring them, using verbal and non-verbal communication channels. We have already developed autonomous robots such as AIBO and QRIO, whose behaviours are manually programmed and designed. We realized, however, that this design approach has limitations; therefore we propose a new approach, intelligence dynamics, where interacting in a real-world environment using embodiment is considered very important. There are pioneering works related to this approach from brain science, cognitive science, robotics and artificial intelligence. We assert that it is important to study the emergence of entire sets of autonomous behaviours and present our approach towards this goal. PMID:17148048

  17. Autonomous navigation of a mobile robot using custom-designed qualitative reasoning VLSI chips and boards

    SciTech Connect

    Pin, F.G.; Pattay, R.S. ); Watanabe, H.; Symon, J. . Dept. of Computer Science)

    1991-01-01

    Two types of computer boards including custom-designed VLSI chips have been developed to add a qualitative reasoning capability to the real-time control of autonomous mobile robots. The design and operation of these boards are first described and an example of their use for the autonomous navigation of a mobile robot is presented. The development of qualitative reasoning schemes emulating human-like navigation in a priori unknown environments is discussed. The efficiency of such schemes, which can consist of as little as a dozen qualitative rules, is illustrated in experiments involving an autonomous mobile robot navigating on the basis of very sparse inaccurate sensor data. 17 refs., 6 figs.

  18. Using custom-designed VLSI fuzzy inferencing chips for the autonomous navigation of a mobile robot

    SciTech Connect

    Pin, F.G.; Pattay, R.S. ); Watanabe, Hiroyuki; Symon, J. . Dept. of Computer Science)

    1991-01-01

    Two types of computer boards including custom-designed VLSI fuzzy inferencing chips have been developed to add a qualitative reasoning capability to the real-time control of autonomous mobile robots. The design and operation of these boards are first described and an example of their use for the autonomous navigation of a mobile robot is presented. The development of qualitative reasoning schemes emulating human-like navigation in a priori unknown environments is discussed. An approach using superposition of elemental sensor-based behaviors is shown to allow easy development and testing of the inferencing rule base, while providing for progressive addition of behaviors to resolve situations of increasing complexity. The efficiency of such schemes, which can consist of as little as a dozen qualitative rules, is illustrated in experiments involving an autonomous mobile robot navigating on the basis of very sparse and inaccurate sensor data. 17 refs., 6 figs.

  19. A testbed for a unified teleoperated-autonomous dual-arm robotic system

    NASA Technical Reports Server (NTRS)

    Hayati, S.; Lee, T.; Tso, K.; Backes, P.; Lloyd, J.

    1990-01-01

    This paper describes a complete robot control facility built at the Jet Propulsion Laboratory as part of a NASA telerobotics program to develop a state-of-the-art robot control environment for laboratory-based space-like experiments. This system, which is now fully operational, has the following features: separation of the computing facilities into local and remote sites, autonomous motion generation in joint or Cartesian coordinates, dual-arm force-reflecting teleoperation with voice interaction between the operator and the robots, shared control between the autonomously generated motions and operator-controlled teleoperation, and dual-arm coordinated trajectory generation. The system has been used to carry out realistic experiments such as the exchange of an Orbital Replacement Unit (ORU), bolt turning, and door opening, using a mixture of autonomous actions and teleoperation, with either a single arm or two cooperating arms.

  20. The experimental humanoid robot H7: a research platform for autonomous behaviour.

    PubMed

    Nishiwaki, Koichi; Kuffner, James; Kagami, Satoshi; Inaba, Masayuki; Inoue, Hirochika

    2007-01-15

    This paper gives an overview of the humanoid robot 'H7', which was developed over several years as an experimental platform for walking, autonomous behaviour and human interaction research at the University of Tokyo. H7 was designed to be a human-sized robot capable of operating autonomously in indoor environments designed for humans. The hardware is relatively simple to operate and conduct research on, particularly with respect to the hierarchical design of its control architecture. We describe the overall design goals and methodology, along with a summary of its online walking capabilities, autonomous vision-based behaviours and automatic motion planning. We show experimental results obtained by implementations running within a simulation environment as well as on the actual robot hardware. PMID:17148051

  1. Research and development of Ro-boat: an autonomous river cleaning robot

    NASA Astrophysics Data System (ADS)

    Sinha, Aakash; Bhardwaj, Prashant; Vaibhav, Bipul; Mohommad, Noor

    2013-12-01

    Ro-Boat is an autonomous, intelligent river-cleaning robot that combines mechanical design and computer vision algorithms to achieve autonomous river cleaning and help provide a sustainable environment. Ro-Boat is designed in a modular fashion, with design details covering mechanical structural design, hydrodynamic design, and vibration analysis. It incorporates a stable mechanical system with air and water propulsion, robotic arms, and a solar energy source, and it is made autonomous by using computer vision. Both HSV colour-space features and SURF features are proposed as measurements for a Kalman filter, resulting in extremely robust pollutant tracking. The system has been tested with successful results in the Yamuna River in New Delhi. We foresee that a fleet of Ro-Boats working autonomously around the clock could clean a major urban river in about six months, which is unmatched by alternative methods of river cleaning.
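
    The pollutant-tracking step described above pairs a visual detector with a Kalman filter. A generic constant-velocity Kalman update on the detector's centroid is sketched below; the state layout and noise levels are illustrative assumptions, not Ro-Boat's tuning.

```python
import numpy as np

def kalman_track_step(x, P, centroid, dt=0.04, q=1.0, r=4.0):
    """One predict/update cycle of a constant-velocity Kalman filter tracking a
    detected blob. centroid is an (u, v) pixel measurement from the colour/SURF
    detector, or None if the detector missed this frame. State x = [u, v, du, dv]."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], float)
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], float)
    Q = q * np.eye(4)
    R = r * np.eye(2)
    x = F @ x                        # predict
    P = F @ P @ F.T + Q
    if centroid is not None:         # update only when the detector fires
        y = np.asarray(centroid, float) - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(4) - K @ H) @ P
    return x, P
```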

  2. Autonomous discovery and learning by a mobile robot in unstructured environments

    SciTech Connect

    Pin, F.G.; de Saussure, G.; Spelt, P.F.; Barnett, D.L.; Killough, S.M.; Weisbin, C.R.

    1988-01-01

    This paper presents recent research activities at the Center for Engineering Systems Advanced Research (CESAR) in the area of autonomous discovery and learning of emergency and maintenance tasks in unstructured environments by a mobile robot. The methodologies for learning basic operating principles of control devices, and for using the acquired knowledge to solve new problems with conditions not encountered before are presented. The algorithms necessary for the robot to discover problem-solving sequences of actions, through experimentation with the environment, in the two cases of immediate feedback and delayed feedback are described. The inferencing schemes allowing the robot to classify the information acquired from a reduced set of examples and to generalize its knowledge to a much wider problem-solving domain are also provided. A demonstration of the successful implementation of the algorithms on our HERMIES-IIB autonomous robot is then presented. 8 refs., 2 figs.

  3. An architectural approach to create self organizing control systems for practical autonomous robots

    NASA Technical Reports Server (NTRS)

    Greiner, Helen

    1991-01-01

    For practical industrial applications, the development of trainable robots is an important and immediate objective; the emphasis here is therefore on developing flexible intelligence directly applicable to training. It is generally agreed within the AI community that the fusion of expert systems, neural networks, and conventionally programmed modules (e.g., a trajectory generator) is a promising route toward autonomous robotic intelligence. Autonomous robot development is hindered by integration and architectural problems. Some obstacles to the construction of more general robot control systems are as follows: (1) the growth problem; (2) software generation; (3) interaction with the environment; (4) reliability; and (5) resource limitation. Neural networks can be applied successfully to some of these problems. However, current implementations of neural networks are hampered by the resource-limitation problem and must be trained extensively to produce computationally accurate output. A generalization of conventional neural networks is proposed, and an architecture is offered in an attempt to address the above problems.

  4. Autonomous Mobile Platform for Research in Cooperative Robotics

    NASA Technical Reports Server (NTRS)

    Daemi, Ali; Pena, Edward; Ferguson, Paul

    1998-01-01

    This paper describes the design and development of a platform for research in cooperative mobile robotics. The structure and mechanics of the vehicles are based on R/C cars. The vehicle is rendered mobile by a DC motor and servo motor. The perception of the robot's environment is achieved using IR sensors and a central vision system. A laptop computer processes images from a CCD camera located above the testing area to determine the position of objects in sight. This information is sent to each robot via RF modem. Each robot is operated by a Motorola 68HC11E micro-controller, and all actions of the robots are realized through the connections of IR sensors, modem, and motors. The intelligent behavior of each robot is based on a hierarchical fuzzy-rule based approach.

  5. LABRADOR: a learning autonomous behavior-based robot for adaptive detection and object retrieval

    NASA Astrophysics Data System (ADS)

    Yamauchi, Brian; Moseley, Mark; Brookshire, Jonathan

    2013-01-01

    As part of the TARDEC-funded CANINE (Cooperative Autonomous Navigation in a Networked Environment) Program, iRobot developed LABRADOR (Learning Autonomous Behavior-based Robot for Adaptive Detection and Object Retrieval). LABRADOR was based on the rugged, man-portable, iRobot PackBot unmanned ground vehicle (UGV) equipped with an explosives ordnance disposal (EOD) manipulator arm and a custom gripper. For LABRADOR, we developed a vision-based object learning and recognition system that combined a TLD (track-learn-detect) filter based on object shape features with a color-histogram-based object detector. Our vision system was able to learn in real-time to recognize objects presented to the robot. We also implemented a waypoint navigation system based on fused GPS, IMU (inertial measurement unit), and odometry data. We used this navigation capability to implement autonomous behaviors capable of searching a specified area using a variety of robust coverage strategies - including outward spiral, random bounce, random waypoint, and perimeter following behaviors. While the full system was not integrated in time to compete in the CANINE competition event, we developed useful perception, navigation, and behavior capabilities that may be applied to future autonomous robot systems.

  6. Integrated multi-sensor fusion for mapping and localization in outdoor environments for mobile robots

    NASA Astrophysics Data System (ADS)

    Emter, Thomas; Petereit, Janko

    2014-05-01

    An integrated multi-sensor fusion framework for localization and mapping for autonomous navigation in unstructured outdoor environments, based on extended Kalman filters (EKF), is presented. The localization sensors include an inertial measurement unit, GPS, a fiber-optic gyroscope, and wheel odometry. Additionally, a 3D LIDAR is used for simultaneous localization and mapping (SLAM). A 3D map is built while a localization within the 2D map established so far is concurrently estimated from the current LIDAR scan. Despite the longer run-time of the SLAM algorithm compared with the EKF update, a high update rate is still guaranteed by carefully joining and synchronizing the two parallel localization estimators.
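
    A minimal EKF of the kind underlying such a framework predicts the pose from wheel odometry and a gyro and corrects it with a GPS fix. The sketch below is a generic unicycle-model example, not the authors' filter, which fuses additional sensors and runs alongside LIDAR SLAM.

```python
import numpy as np

def ekf_predict(x, P, v, omega, dt, Q):
    """EKF prediction for pose x = [x, y, theta] using wheel-odometry speed v
    and gyro rate omega under a unicycle motion model."""
    px, py, th = x
    x_pred = np.array([px + v * dt * np.cos(th),
                       py + v * dt * np.sin(th),
                       th + omega * dt])
    F = np.array([[1, 0, -v * dt * np.sin(th)],
                  [0, 1,  v * dt * np.cos(th)],
                  [0, 0, 1]])
    return x_pred, F @ P @ F.T + Q

def ekf_update_gps(x, P, gps_xy, R):
    """EKF correction with a GPS position fix (linear measurement of x and y)."""
    H = np.array([[1, 0, 0],
                  [0, 1, 0]], float)
    y = np.asarray(gps_xy, float) - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ y, (np.eye(3) - K @ H) @ P
```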

  7. Autonomous avoidance based on motion delay of master-slave surgical robot.

    PubMed

    Inoue, Shintaro; Toyoda, Kazutaka; Kobayashi, Yo; Fujie, Masakatsu G

    2009-01-01

    Safe use of master-slave robots for endoscopic surgery requires autonomous motions to avert contact with vital organs, blood vessels, and nerves. Here we describe an avoidance control algorithm with delay compensation that takes the dynamic characteristics of the robot into account. To determine the operating parameters, we measured frequency characteristics of each joint of the slave-manipulator. The results suggest this delay compensation program improves avoidance performance. PMID:19964112

  8. Autonomous Navigation, Dynamic Path and Work Flow Planning in Multi-Agent Robotic Swarms Project

    NASA Technical Reports Server (NTRS)

    Falker, John; Zeitlin, Nancy; Leucht, Kurt; Stolleis, Karl

    2015-01-01

    Kennedy Space Center has teamed up with the Biological Computation Lab at the University of New Mexico to create a swarm of small, low-cost, autonomous robots, called Swarmies, to be used as a ground-based research platform for in-situ resource utilization missions. The behavior of the robot swarm mimics the central-place foraging strategy of ants to find and collect resources in an unknown environment and return those resources to a central site.

  9. Introduction to autonomous mobile robotics using Lego Mindstorms NXT

    NASA Astrophysics Data System (ADS)

    Akın, H. Levent; Meriçli, Çetin; Meriçli, Tekin

    2013-12-01

    Teaching the fundamentals of robotics to computer science undergraduates requires designing a well-balanced curriculum that is complemented with hands-on applications on a platform that allows rapid construction of complex robots, and implementation of sophisticated algorithms. This paper describes such an elective introductory course where the Lego Mindstorms NXT kits are used as the robot platform. The aims, scope and contents of the course are presented, and the design of the laboratory sessions as well as the term projects, which address several core problems of robotics and artificial intelligence simultaneously, are explained in detail.

  10. Autonomous Motion Learning for Intra-Vehicular Activity Space Robot

    NASA Astrophysics Data System (ADS)

    Watanabe, Yutaka; Yairi, Takehisa; Machida, Kazuo

    Space robots will be needed in future space missions. Many types of space robots have been developed so far, but Intra-Vehicular Activity (IVA) space robots that support human activities should in particular be developed to reduce risks to humans in space. In this paper, we study a motion-learning method for an IVA space robot with a multi-link mechanism. The distinctive feature is that this space robot moves using the reaction forces of the multi-link mechanism and contact forces from the walls, much like an astronaut space-walking, rather than using propulsion. The control approach is based on reinforcement learning with an actor-critic algorithm. We demonstrate the effectiveness of this approach using a 5-link space robot model in simulation. First, we simulate the robot learning motion control including a contact phase in the two-dimensional case; next, we simulate the robot learning motion control that changes the base attitude in the three-dimensional case.
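
    The actor-critic scheme referred to above can be illustrated with a toy tabular version: the critic learns state values by TD(0) and the actor's action preferences are nudged by the TD error. The continuous multi-link control studied in the paper is far richer; this sketch only shows the update rule.

```python
import numpy as np

def actor_critic_update(state, action, reward, next_state,
                        V, theta, alpha_v=0.1, alpha_p=0.01, gamma=0.99):
    """One actor-critic step on discrete states/actions. V is a value table,
    theta a table of action preferences; both are modified in place."""
    td_error = reward + gamma * V[next_state] - V[state]
    V[state] += alpha_v * td_error                 # critic update (TD(0))
    theta[state, action] += alpha_p * td_error     # actor update
    return td_error

def softmax_policy(theta, state, rng):
    """Sample an action from the actor's softmax preferences for this state."""
    prefs = theta[state] - theta[state].max()
    probs = np.exp(prefs) / np.exp(prefs).sum()
    return rng.choice(len(probs), p=probs)

# Example: V = np.zeros(n_states); theta = np.zeros((n_states, n_actions));
# rng = np.random.default_rng(); then alternate softmax_policy and
# actor_critic_update while stepping an environment.
```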

  11. Robust performance of multiple tasks by an autonomous robot

    SciTech Connect

    Beckerman, M.; Barnett, D.L.; Einstein, R.; Jones, J.P.; Spelt, P.D.; Weisbin, C.R.

    1989-01-01

    There have been many successful mobile robot experiments, but very few papers have appeared that examine the range of applicability, or robustness, of a robot system. The purpose of this paper is to determine and quantify robustness of the Hermies-IIB experimental capabilities. 6 refs., 1 tab.

  12. Introduction to Autonomous Mobile Robotics Using "Lego Mindstorms" NXT

    ERIC Educational Resources Information Center

    Akin, H. Levent; Meriçli, Çetin; Meriçli, Tekin

    2013-01-01

    Teaching the fundamentals of robotics to computer science undergraduates requires designing a well-balanced curriculum that is complemented with hands-on applications on a platform that allows rapid construction of complex robots, and implementation of sophisticated algorithms. This paper describes such an elective introductory course where the…

  13. Autonomous Mobile Robot Navigation Using Harmonic Potential Field

    NASA Astrophysics Data System (ADS)

    Panati, Subbash; Baasandorj, Bayanjargal; Chong, Kil To

    2015-05-01

    Mobile robot navigation is an area of robotics that has gained massive attention among researchers in the robotics community. Path planning and obstacle avoidance are the key aspects of mobile robot navigation. This paper presents a harmonic potential field based navigation algorithm for mobile robots. The harmonic potential field method overcomes the local-minima issue that is a major bottleneck of the artificial potential field method. The harmonic potential field is calculated using harmonic functions, with Dirichlet boundary conditions imposed at the obstacles, the goal, and the initial position. The simulation results show that the proposed method overcomes the local-minima issue and navigates successfully from the initial position to the goal without colliding with obstacles in a static environment.
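
    The essence of the harmonic-field method is relaxing Laplace's equation with obstacles and the goal clamped as Dirichlet boundary values, then descending the resulting potential, which has no interior local minima. A grid-based Jacobi-relaxation sketch is given below; the resolution and iteration count are arbitrary choices, and the start and goal are assumed to lie in the grid interior.

```python
import numpy as np

def harmonic_field(obstacles, goal, iters=5000):
    """Harmonic potential over a grid: obstacles and the outer boundary are
    clamped to 1, the goal to 0, and interior values are relaxed by Jacobi
    iteration toward a solution of Laplace's equation."""
    u = np.ones_like(obstacles, dtype=float)
    u[goal] = 0.0
    for _ in range(iters):
        avg = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                      np.roll(u, 1, 1) + np.roll(u, -1, 1))
        u = np.where(obstacles, 1.0, avg)
        u[goal] = 0.0
        u[0, :] = u[-1, :] = u[:, 0] = u[:, -1] = 1.0
    return u

def descend(u, start, max_steps=10000):
    """Follow the steepest-descent neighbour of the potential from start until
    the potential stops decreasing (i.e., the goal is reached)."""
    path = [start]
    for _ in range(max_steps):
        r, c = path[-1]
        nbrs = [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]
        nxt = min(nbrs, key=lambda p: u[p])
        if u[nxt] >= u[r, c]:
            break
        path.append(nxt)
    return path
```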

  14. Manifold traversing as a model for learning control of autonomous robots

    NASA Technical Reports Server (NTRS)

    Szakaly, Zoltan F.; Schenker, Paul S.

    1992-01-01

    This paper describes a recipe for the construction of control systems that support complex machines such as multi-limbed/multi-fingered robots. The robot has to execute a task under varying environmental conditions and it has to react reasonably when previously unknown conditions are encountered. Its behavior should be learned and/or trained as opposed to being programmed. The paper describes one possible method for organizing the data that the robot has learned by various means. This framework can accept useful operator input even if it does not fully specify what to do, and can combine knowledge from autonomous, operator assisted and programmed experiences.

  15. Development of the Research Platform of Small Autonomous Blimp Robot

    NASA Astrophysics Data System (ADS)

    Takaya, Toshihiko; Kawamura, Hidenori; Yamamoto, Masahito; Ohuchi, Azuma

    A blimp robot is attractive as a small flight robot: it floats by buoyancy, remains safe in a crash, and can move for a long time on low energy compared with other flight robots. However, control of a blimp robot is difficult because of the nonlinear characteristics induced by inertia and the influence of air flow. Applied research that makes maximum use of these features of blimp robots has nevertheless flourished in recent years. In this paper, we describe the development of a general-purpose blimp robot research platform in which the blimp robot body is divided into units that can be assembled and reconfigured for research and application development. By developing such a general-purpose platform, the research efficiency of many researchers can be improved, starting research on blimp robots becomes easier, and the development of the field is supported. We performed the following experiments as proof of concept. 1. We checked the basic station-keeping performance, confirmed that various orbital maneuvers were possible, and verified the ease of exchanging software units by swapping the control layer from PID control to learning control and comparing the resulting behaviour. 2. To check the ease of exchanging hardware units, the camera sensor was exchanged for a microphone and control of the resulting behaviour was checked. 3. For ease of unit addition, a microphone performing sound detection was added alongside the camera performing image detection, and the combined behaviour was verified. 4. To check the ease of adding functions, units were exchanged and a topological-map-generation experiment was conducted by adding an ultrasonic sensor. Developed blimp robot for research mounted the exchange ease

  16. A Prototype Novel Sensor for Autonomous, Space Based Robots - Phase 2

    NASA Technical Reports Server (NTRS)

    Squillante, M. R.; Derochemont, L. P.; Cirignano, L.; Lieberman, P.; Soller, M. S.

    1990-01-01

    The goal of this program was to develop new sensing capabilities for autonomous robots operating in space. Information gained by the robot using these new capabilities would be combined with other information gained through more traditional capabilities, such as video, to help the robot characterize its environment as well as to identify known or unknown objects that it encounters. Several sensing capabilities using nuclear radiation detectors and backscatter technology were investigated. The result of this research has been the construction and delivery to NASA of a prototype system with three capabilities for use by autonomous robots. The primary capability is the use of beta particle backscatter measurements to determine the average atomic number (Z) of an object. This gives the robot a powerful tool to differentiate objects which may look the same, such as objects made of different plastics or other lightweight materials. In addition, the same nuclear sensor used in the backscatter measurement can be used as a nuclear spectrometer to identify sources of nuclear radiation that may be encountered by the robot, such as nuclear-powered satellites. A complete nuclear analysis system is included in the software and hardware of the prototype system built in phase 2 of this effort. Finally, a method to estimate the radiation dose in the environment of the robot has been included as a third capability; again, the same nuclear sensor is used in a different operating mode and with different analysis software. Each of these capabilities is described.

  17. Motor-response learning at a process control panel by an autonomous robot

    SciTech Connect

    Spelt, P.F.; de Saussure, G.; Lyness, E.; Pin, F.G.; Weisbin, C.R.

    1988-01-01

    The Center for Engineering Systems Advanced Research (CESAR) was founded at Oak Ridge National Laboratory (ORNL) by the Department of Energy's Office of Energy Research/Division of Engineering and Geoscience (DOE-OER/DEG) to conduct basic research in the area of intelligent machines. Researchers at the CESAR Laboratory are therefore engaged in a variety of research activities in the field of machine learning. In this paper, we describe our approach to a class of machine learning which involves motor response acquisition using feedback from trial-and-error learning. Our formulation is being experimentally validated using an autonomous robot that learns tasks of control panel monitoring and manipulation for effective process control. The CLIPS Expert System and the associated knowledge base used by the robot in the learning process, which reside in a hypercube computer aboard the robot, are described in detail. Benchmark testing of the learning process on a robot/control panel simulation system consisting of two intercommunicating computers is presented, along with results of sample problems used to train and test the expert system. These data illustrate machine learning and the resulting performance improvement in the robot for problems similar to, but not identical with, those on which the robot was trained. Conclusions are drawn concerning the learning problems, and implications for future work on machine learning for autonomous robots are discussed. 16 refs., 4 figs., 1 tab.

  18. Autonomous intelligent military robots: Army ants, killer bees, and cybernetic soldiers

    NASA Astrophysics Data System (ADS)

    Finkelstein, Robert

    The rationale for developing autonomous intelligent robots in the military is to render conventional warfare systems ineffective and indefensible. The Desert Storm operation demonstrated the effectiveness of such systems as unmanned air and ground vehicles and indicated the future possibilities of robotic technology. Robotic military vehicles would have the advantages of expendability, low cost, lower complexity compared to manned systems, survivability, maneuverability, and a capability to share in instantaneous communication and distributed processing of combat information. Basic characteristics of intelligent systems and hierarchical control systems with sensor inputs are described. Genetic algorithms are seen as a means of achieving appropriate levels of intelligence in a robotic system. Potential impacts of robotic technology in the military are outlined.

  19. A unified teleoperated-autonomous dual-arm robotic system

    NASA Technical Reports Server (NTRS)

    Hayati, Samad; Lee, Thomas S.; Tso, Kam Sing; Backes, Paul G.; Lloyd, John

    1991-01-01

    A description is given of a complete robot control facility built as part of a NASA telerobotics program to develop a state-of-the-art robot control environment for performing experiments in the repair and assembly of spacelike hardware, to gain practical knowledge of such work, and to improve the associated technology. The basic architecture of the manipulator control subsystem is presented. The multiarm Robot Control C Library (RCCL), a key software component of the system, is described, along with its implementation on a Sun-4 computer. The system's simulation capability is also described, and the teleoperation and shared control features are explained.

  20. Autonomous Soft Robotic Fish Capable of Escape Maneuvers Using Fluidic Elastomer Actuators

    PubMed Central

    Onal, Cagdas D.; Rus, Daniela

    2014-01-01

    Abstract In this work we describe an autonomous soft-bodied robot that is both self-contained and capable of rapid, continuum-body motion. We detail the design, modeling, fabrication, and control of the soft fish, focusing on enabling the robot to perform rapid escape responses. The robot employs a compliant body with embedded actuators emulating the slender anatomical form of a fish. In addition, the robot has a novel fluidic actuation system that drives body motion and has all the subsystems of a traditional robot onboard: power, actuation, processing, and control. At the core of the fish's soft body is an array of fluidic elastomer actuators. We design the fish to emulate escape responses in addition to forward swimming because such maneuvers require rapid body accelerations and continuum-body motion. These maneuvers showcase the performance capabilities of this self-contained robot. The kinematics and controllability of the robot during simulated escape response maneuvers are analyzed and compared with studies on biological fish. We show that during escape responses, the soft-bodied robot has similar input–output relationships to those observed in biological fish. The major implication of this work is that we show soft robots can be both self-contained and capable of rapid body motion.

  1. Detection of Water Hazards for Autonomous Robotic Vehicles

    NASA Technical Reports Server (NTRS)

    Matthes, Larry; Belluta, Paolo; McHenry, Michael

    2006-01-01

    Four methods of detection of bodies of water are under development as means to enable autonomous robotic ground vehicles to avoid water hazards when traversing off-road terrain. The methods involve processing of digitized outputs of optoelectronic sensors aboard the vehicles. It is planned to implement these methods in hardware and software that would operate in conjunction with the hardware and software for navigation and for avoidance of solid terrain obstacles and hazards. The first method, intended for use during the day, is based on the observation that, under most off-road conditions, reflections of sky from water are easily discriminated from the adjacent terrain by their color and brightness, regardless of the weather and of the state of surface waves on the water. Accordingly, this method involves collection of color imagery by a video camera and processing of the image data by an algorithm that classifies each pixel as soil, water, or vegetation according to its color and brightness values (see figure). Among the issues that arise is the fact that in the presence of reflections of objects on the opposite shore, it is difficult to distinguish water by color and brightness alone. Another issue is that once a body of water has been identified by means of color and brightness, its boundary must be mapped for use in navigation. Techniques for addressing these issues are under investigation. The second method, which is not limited by time of day, is based on the observation that ladar returns from bodies of water are usually too weak to be detected. In this method, ladar scans of the terrain are analyzed for returns and the absence thereof. In appropriate regions, the presence of water can be inferred from the absence of returns. Under some conditions in which reflections from the bottom are detectable, ladar returns could, in principle, be used to determine depth. The third method involves the recognition of bodies of water as dark areas in short
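
    The first (daytime colour and brightness) method lends itself to a simple per-pixel rule. The sketch below labels pixels of an HSV image as water, vegetation, or soil using placeholder thresholds; the actual classifier and threshold values used in this work are not given in the abstract.

```python
import numpy as np

def classify_pixels(hsv_image,
                    sky_lo=(90, 20, 150), sky_hi=(140, 120, 255)):
    """Label each pixel of an HSV image (OpenCV-style 0-179 hue, 0-255 S and V)
    as soil (0), vegetation (1), or water (2) with a crude rule: bright,
    low-saturation, blue-ish pixels (sky reflections) become water, green hues
    become vegetation, and everything else becomes soil."""
    h, s, v = hsv_image[..., 0], hsv_image[..., 1], hsv_image[..., 2]
    water = ((h >= sky_lo[0]) & (h <= sky_hi[0]) &
             (s >= sky_lo[1]) & (s <= sky_hi[1]) &
             (v >= sky_lo[2]) & (v <= sky_hi[2]))
    vegetation = (~water) & (h >= 35) & (h <= 85) & (s > 60)
    labels = np.zeros(h.shape, dtype=np.uint8)   # 0 = soil
    labels[vegetation] = 1
    labels[water] = 2
    return labels
```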

  2. Adaptive artificial neural network for autonomous robot control

    NASA Technical Reports Server (NTRS)

    Arras, Michael K.; Protzel, Peter W.; Palumbo, Daniel L.

    1992-01-01

    The topics are presented in viewgraph form and include: neural network controller for robot arm positioning with visual feedback; initial training of the arm; automatic recovery from cumulative fault scenarios; and error reduction by iterative fine movements.

  3. Autonomous robot for detecting subsurface voids and tunnels using microgravity

    NASA Astrophysics Data System (ADS)

    Wilson, Stacy S.; Crawford, Nicholas C.; Croft, Leigh Ann; Howard, Michael; Miller, Stephen; Rippy, Thomas

    2006-05-01

    Tunnels have been used to evade security of defensive positions both during times of war and peace for hundreds of years. Tunnels are presently being built under the Mexican Border by drug smugglers and possibly terrorists. Several have been discovered at the border crossing at Nogales near Tucson, Arizona, along with others at other border towns. During this war on terror, tunnels under the Mexican Border pose a significant threat for the security of the United States. It is also possible that terrorists will attempt to tunnel under strategic buildings and possibly discharge explosives. The Center for Cave and Karst Study (CCKS) at Western Kentucky University has a long and successful history of determining the location of caves and subsurface voids using microgravity technology. Currently, the CCKS is developing a remotely controlled robot which will be used to locate voids underground. The robot will be a remotely controlled vehicle that will use microgravity and GPS to accurately detect and measure voids below the surface. It is hoped that this robot will also be used in military applications to locate other types of voids underground such as tunnels and bunkers. It is anticipated that the robot will be able to function up to a mile from the operator. This paper will describe the construction of the robot and the use of microgravity technology to locate subsurface voids with the robot.

  4. Challenging of path planning algorithms for autonomous robot in known environment

    NASA Astrophysics Data System (ADS)

    Farah, R. N.; Irwan, N.; Zuraida, Raja Lailatul; Shaharum, Umairah; Hanafi@Omar, Hafiz Mohd

    2014-06-01

    Most mobile robot path planning aims to reach a predetermined goal along the shortest path while avoiding obstacles. This paper surveys path-planning algorithms from current research and from existing Unmanned Ground Vehicle (UGV) systems, together with the challenges they face in becoming intelligent autonomous robots. The focus is on short reviews of individual papers on UGVs operating in known environments. Methods and algorithms for path planning of autonomous robots are discussed. From the reviews, we find that the proposed algorithms are appropriate for particular cases, such as single or multiple obstacles, static or moving obstacles, and optimal shortest paths. The paper also describes the pros and cons of each reviewed paper with a view to algorithm improvement in further work.

  5. Terrain coverage of an unknown room by an autonomous mobile robot

    SciTech Connect

    VanderHeide, J.R.

    1995-12-05

    Terrain coverage problems are nearly as old as mankind: they were necessary early in our history for basic activities such as finding food and other necessities. As our societies and their associated machineries have grown more complex, we have not outgrown the need for this primitive skill. It is still used on a small scale for cleaning tasks and on a large scale for "search and report" missions of various kinds. The motivation for automating this process may not lie in the novelty of anything we might gain as an end product, but in freedom from something which we as humans find tedious, time-consuming and sometimes dangerous. Here we consider autonomous coverage of a terrain, typically indoor rooms, by a mobile robot that has no a priori model of the terrain. In evaluating its surroundings, the robot employs only inexpensive and commercially available ultrasonic and infrared sensors. The proposed solution is a basic step - a proof of principle - that can contribute to robots capable of autonomously performing tasks such as vacuum cleaning, mopping, radiation scanning, etc. The area of automatic terrain coverage and the closely related problem of terrain model acquisition have been studied both analytically and experimentally. Compared to the existing works, the following are three major distinguishing aspects of our study: (1) the theory is actually applied to an existing robot, (2) the robot has no a priori knowledge of the terrain, and (3) the robot can be realized relatively inexpensively.

  6. Development of an Interactive Augmented Environment and Its Application to Autonomous Learning for Quadruped Robots

    NASA Astrophysics Data System (ADS)

    Kobayashi, Hayato; Osaki, Tsugutoyo; Okuyama, Tetsuro; Gramm, Joshua; Ishino, Akira; Shinohara, Ayumi

    This paper describes an interactive experimental environment for autonomous soccer robots, which is a soccer field augmented by utilizing camera input and projector output. This environment, in a sense, plays an intermediate role between simulated environments and real environments. We can simulate some parts of real environments, e.g., real objects such as robots or a ball, and reflect simulated data into the real environments, e.g., to visualize the positions on the field, so as to create a situation that allows easy debugging of robot programs. The significant point compared with analogous work is that virtual objects are touchable in this system owing to projectors. We also show the portable version of our system that does not require ceiling cameras. As an application in the augmented environment, we address the learning of goalie strategies on real quadruped robots in penalty kicks. We make our robots utilize virtual balls in order to perform only quadruped locomotion in real environments, which is quite difficult to simulate accurately. Our robots autonomously learn and acquire more beneficial strategies without human intervention in our augmented environment than those in a fully simulated environment.

  7. A traffic priority language for collision-free navigation of autonomous mobile robots in dynamic environments.

    PubMed

    Bourbakis, N G

    1997-01-01

    This paper presents a generic traffic priority language, called KYKLOFORTA, used by autonomous robots for collision-free navigation in a dynamic unknown or known navigation space. In a previous work by X. Grossmman (1988), a set of traffic control rules was developed for the navigation of robots on the lines of a two-dimensional (2-D) grid, and a control center coordinated and synchronized their movements. In this work, the robots are considered autonomous: they are moving anywhere and in any direction inside the free space, and there is no need for a central control to coordinate and synchronize them. The requirements for each robot are i) visual perception, ii) range sensors, and iii) the ability of each robot to detect other moving objects in the same free navigation space and to determine the other objects' perceived size, velocity, and direction. Based on these assumptions, a traffic priority language is needed for each robot, enabling it to make decisions during navigation and to avoid possible collisions with other moving objects. The traffic priority language proposed here is based on a primitive traffic priority alphabet and rules that compose patterns of corridors for the application of the traffic priority rules. PMID:18255898

  8. Evaluation of a Home Biomonitoring Autonomous Mobile Robot

    PubMed Central

    Dorronzoro Zubiete, Enrique; Nakahata, Keigo; Imamoglu, Nevrez; Sekine, Masashi; Sun, Guanghao; Gomez, Isabel; Yu, Wenwei

    2016-01-01

    Increasing population age demands more services in the healthcare domain. It has been shown that mobile robots could be a potential solution to home biomonitoring for the elderly. Through our previous studies, a mobile robot system that is able to track a subject and identify his daily living activities has been developed. However, the system has not been tested in any home living scenarios. In this study we did a series of experiments to investigate the accuracy of activity recognition of the mobile robot in a home living scenario. The daily activities tested in the evaluation experiment include watching TV and sleeping. A dataset recorded by a distributed distance-measuring sensor network was used as a reference for the activity recognition results. It was shown that the accuracy is not consistent for all the activities; that is, the mobile robot could achieve a high success rate in some activities but a poor success rate in others. It was found that the observation position of the mobile robot and the subject's surroundings have a high impact on the accuracy of the activity recognition, due to the variability of the home living daily activities and their transitional process. The possibility of improving the recognition accuracy was also shown. PMID:27212940

  9. Evaluation of a Home Biomonitoring Autonomous Mobile Robot.

    PubMed

    Dorronzoro Zubiete, Enrique; Nakahata, Keigo; Imamoglu, Nevrez; Sekine, Masashi; Sun, Guanghao; Gomez, Isabel; Yu, Wenwei

    2016-01-01

    Increasing population age demands more services in the healthcare domain. It has been shown that mobile robots could be a potential solution to home biomonitoring for the elderly. Through our previous studies, a mobile robot system that is able to track a subject and identify his daily living activities has been developed. However, the system has not been tested in any home living scenarios. In this study we did a series of experiments to investigate the accuracy of activity recognition of the mobile robot in a home living scenario. The daily activities tested in the evaluation experiment include watching TV and sleeping. A dataset recorded by a distributed distance-measuring sensor network was used as a reference for the activity recognition results. It was shown that the accuracy is not consistent for all the activities; that is, the mobile robot could achieve a high success rate in some activities but a poor success rate in others. It was found that the observation position of the mobile robot and the subject's surroundings have a high impact on the accuracy of the activity recognition, due to the variability of the home living daily activities and their transitional process. The possibility of improving the recognition accuracy was also shown. PMID:27212940

  10. An autonomous mobile robot to perform waste drum inspections

    SciTech Connect

    Peterson, K.D.; Ward, C.R.

    1994-03-01

    A mobile robot is being developed by the Savannah River Technology Center (SRTC) Robotics Group of Westinghouse Savannah River Company (WSRC) to perform mandated inspections of waste drums stored in warehouse facilities. The system will reduce personnel exposure and create accurate, high quality documentation to ensure regulatory compliance. Development work is being coordinated among several DOE, academic and commercial entities in accordance with DOE's technology transfer initiative. The prototype system was demonstrated in November of 1993. A system is now being developed for field trials at the Fernald site.

  11. A Behavior-Based Strategy for Single and Multi-Robot Autonomous Exploration

    PubMed Central

    Cepeda, Jesus S.; Chaimowicz, Luiz; Soto, Rogelio; Gordillo, José L.; Alanís-Reyes, Edén A.; Carrillo-Arce, Luis C.

    2012-01-01

    In this paper, we consider the problem of autonomous exploration of unknown environments with single and multiple robots. This is a challenging task, with several potential applications. We propose a simple yet effective approach that combines a behavior-based navigation with an efficient data structure to store previously visited regions. This allows robots to safely navigate, disperse and efficiently explore the environment. A series of experiments performed using a realistic robotic simulator and a real testbed scenario demonstrate that our technique effectively distributes the robots over the environment and allows them to quickly accomplish their mission in large open spaces, narrow cluttered environments, dead-end corridors, as well as rooms with minimum exits.
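
    The following minimal sketch illustrates the general idea of coupling a reactive heading choice with a store of previously visited regions; the grid resolution, the scoring rule, and the function names are assumptions for illustration, not the authors' implementation.

      import numpy as np

      CELL = 0.5   # assumed grid resolution in metres

      def to_cell(xy):
          return (int(round(xy[0] / CELL)), int(round(xy[1] / CELL)))

      def choose_heading(pose_xy, visited, candidate_headings, step=1.0):
          """Pick the candidate heading whose next cell has been visited least.

          visited maps (i, j) grid cells to visit counts; headings are in radians."""
          best, best_score = None, None
          for th in candidate_headings:
              nxt = (pose_xy[0] + step * np.cos(th), pose_xy[1] + step * np.sin(th))
              score = visited.get(to_cell(nxt), 0)
              if best_score is None or score < best_score:
                  best, best_score = th, score
          return best

      # after each motion step, mark the current cell as visited:
      # visited[to_cell(pose_xy)] = visited.get(to_cell(pose_xy), 0) + 1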

  12. AltiVec performance increases for autonomous robotics for the MARSSCAPE architecture program

    NASA Astrophysics Data System (ADS)

    Gothard, Benny M.

    2002-02-01

    One of the main tall poles that must be overcome to develop a fully autonomous vehicle is the inability of the computer to understand its surrounding environment to the level required for the intended task. The military mission scenario requires a robot to interact in a complex, unstructured, dynamic environment (reference: "A High Fidelity Multi-Sensor Scene Understanding System for Autonomous Navigation"). The Mobile Autonomous Robot Software Self Composing Adaptive Programming Environment (MarsScape) perception research addresses three aspects of the problem: sensor system design, processing architectures, and algorithm enhancements. A prototype perception system has been demonstrated on robotic High Mobility Multi-purpose Wheeled Vehicle and All Terrain Vehicle testbeds. This paper addresses the tall pole of processing requirements and the performance improvements based on the selected MarsScape Processing Architecture. The processor chosen is the Motorola AltiVec-G4 PowerPC (PPC) (1998 Motorola, Inc.), a highly parallelized commercial Single Instruction Multiple Data processor. Both derived perception benchmarks and actual perception subsystem code will be benchmarked and compared against previous Demo II Semi-autonomous Surrogate Vehicle processing architectures along with desktop Personal Computers (PCs). Performance gains are highlighted with progress to date, and lessons learned and future directions are described.

  13. Semi-autonomous exploration of multi-floor buildings with a legged robot

    NASA Astrophysics Data System (ADS)

    Wenger, Garrett J.; Johnson, Aaron M.; Taylor, Camillo J.; Koditschek, Daniel E.

    2015-05-01

    This paper presents preliminary results of a semi-autonomous building exploration behavior using the hexapedal robot RHex. Stairwells are used in virtually all multi-floor buildings, and so in order for a mobile robot to effectively explore, map, clear, monitor, or patrol such buildings it must be able to ascend and descend stairwells. However most conventional mobile robots based on a wheeled platform are unable to traverse stairwells, motivating use of the more mobile legged machine. This semi-autonomous behavior uses a human driver to provide steering input to the robot, as would be the case in, e.g., a tele-operated building exploration mission. The gait selection and transitions between the walking and stair climbing gaits are entirely autonomous. This implementation uses an RGBD camera for stair acquisition, which offers several advantages over a previously documented detector based on a laser range finder, including significantly reduced acquisition time. The sensor package used here also allows for considerable expansion of this behavior. For example, complete automation of the building exploration task driven by a mapping algorithm and higher level planner is presently under development.

  14. Automatic tracking of laparoscopic instruments for autonomous control of a cameraman robot.

    PubMed

    Amini Khoiy, Keyvan; Mirbagheri, Alireza; Farahmand, Farzam

    2016-06-01

    Background: An automated instrument tracking procedure was designed and developed for autonomous control of a cameraman robot during laparoscopic surgery. Material and methods: The procedure was based on an innovative marker-free segmentation algorithm for detecting the tip of the surgical instruments in laparoscopic images. A compound measure of Saturation and Value components of HSV color space was incorporated that was enhanced further using the Hue component and some essential characteristics of the instrument segment, e.g., crossing the image boundaries. The procedure was then integrated into the controlling system of the RoboLens cameraman robot, within a triple-thread parallel processing scheme, such that the tip is always kept at the center of the image. Results: Assessment of the performance of the system on prerecorded real surgery movies revealed an accuracy rate of 97% for high quality images and about 80% for those suffering from poor lighting and/or blood, water and smoke noises. A reasonably satisfying performance was also observed when employing the system for autonomous control of the robot in a laparoscopic surgery phantom, with a mean time delay of 200 ms. Conclusion: It was concluded that with further developments, the proposed procedure can provide a practical solution for autonomous control of cameraman robots during laparoscopic surgery operations. PMID:26872883
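
    A minimal sketch of the kind of compound Saturation/Value measure described in the methods is shown below, assuming OpenCV and NumPy; the threshold, the border-touching heuristic, and the function name are illustrative assumptions rather than the published algorithm.

      import cv2
      import numpy as np

      def segment_instrument_tip(bgr, sv_thresh=0.35):
          """Rough marker-free segmentation of a metallic instrument in HSV space.

          Bright but weakly saturated pixels score high; only segments touching
          the image border are kept, since the shaft enters from outside."""
          hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV).astype(np.float32)
          s, v = hsv[..., 1] / 255.0, hsv[..., 2] / 255.0
          mask = ((v * (1.0 - s)) > sv_thresh).astype(np.uint8) * 255
          n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
          keep = np.zeros_like(mask)
          for i in range(1, n):
              x, y, w, h, _ = stats[i]
              if x == 0 or y == 0 or x + w >= mask.shape[1] or y + h >= mask.shape[0]:
                  keep[labels == i] = 255
          return keep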

  15. Development of a semi-autonomous service robot with telerobotic capabilities

    NASA Technical Reports Server (NTRS)

    Jones, J. E.; White, D. R.

    1987-01-01

    The importance to the United States of semi-autonomous systems for application to a large number of manufacturing and service processes is very clear. Two principal reasons emerge as the primary driving forces for development of such systems: enhanced national productivity and operation in environments which are hazardous to humans. Completely autonomous systems may not currently be economically feasible. However, autonomous systems that operate in a limited operation domain or that are supervised by humans are within the technology capability of this decade and will likely provide reasonable return on investment. The two research and development efforts of autonomy and telerobotics are distinctly different, yet interconnected. The first addresses the communication of an intelligent electronic system with a robot while the second requires human communication and ergonomic consideration. Discussed here is work in robotic control, human/robot team implementation, expert system robot operation, and sensor development by the American Welding Institute, MTS Systems Corporation, and the Colorado School of Mines--Center for Welding Research.

  16. Semi-autonomous surgical tasks using a miniature in vivo surgical robot.

    PubMed

    Dumpert, Jason; Lehman, Amy C; Wood, Nathan A; Oleynikov, Dmitry; Farritor, Shane M

    2009-01-01

    Natural Orifice Translumenal Endoscopic Surgery (NOTES) is potentially the next step in minimally invasive surgery. This type of procedure could reduce patient trauma through eliminating external incisions, but poses many surgical challenges that are not sufficiently overcome with current flexible endoscopy tools. A robotic platform that attempts to emulate a laparoscopic interface for performing NOTES procedures is being developed to address these challenges. These robots are capable of entering the peritoneal cavity through the upper gastrointestinal tract, and once inserted are not constrained by incisions, allowing for visualization and manipulations throughout the cavity. In addition to using these miniature in vivo robots for NOTES procedures, these devices can also be used to perform semi-autonomous surgical tasks. Such tasks could be useful in situations where the patient is in a location far from a trained surgeon. A surgeon at a remote location could control the robot even if the communication link between surgeon and patient has low bandwidth or very high latency. This paper details work towards using the miniature robot to perform simple surgical tasks autonomously. PMID:19963710

  17. Exploration of Teisi Knoll by Autonomous Underwater Vehicle "R-One Robot"

    NASA Astrophysics Data System (ADS)

    Ura, Tamaki; Obara, Takashi; Nagahashi, Kenji; Nakane, Kenji; Sakai, Shoji; Oyabu, Yuji; Sakamaki, Takashi; Takagawa, Shinichi; Kawano, Hiroshi; Gamo, Toshitaka; Takano, Michiaki; Doi, Takashi

    This paper outlines the exploration of Teisi Knoll by the autonomous underwater vehicle the R-One Robot, as carried out October 19-22, 2000, and presents images taken by the sidescan SONAR fitted to the bottom of the vehicle. The R-One Robot was launched from the R/V Kaiyo, started diving near the support ship, followed predetermined tracklines which were defined by waypoints, and finally came back to the destination where it was recovered by the support vessel. In order to minimize positioning error, which is determined by the inertial navigation system and Doppler SONAR, the robot ascended to the surface several times to ascertain its precise position using the global positioning system, the antenna of which is fitted on the vertical fin. Taking advantage of this positioning system, the robot followed the predetermined tracklines with an error of less than 40 meters in 30 minutes of continuous submerging. Disturbance to the robot is small enough compared to towed vehicles that its movement can be regarded as stable. This stability resulted in clear side scanning images of the knoll and surrounding sea floor. The robot stopped at the center of the knoll, and descended vertically into the crater. When the vehicle was in the crater, anomalous manganese ion concentrations were detected by the in situ trace metal micro-analyzer GAMOS, which was loaded in the payload bay at the front of the robot.

  18. An integrated movement capture and control platform applied towards autonomous movements of surgical robots.

    PubMed

    Daluja, Sachin; Golenberg, Lavie; Cao, Alex; Pandya, Abhilash K; Auner, Gregory W; Klein, Michael D

    2009-01-01

    Robotic surgery has gradually gained acceptance due to its numerous advantages such as tremor filtration, increased dexterity and motion scaling. There remains, however, a significant scope for improvement, especially in the areas of surgeon-robot interface and autonomous procedures. Previous studies have attempted to identify factors affecting a surgeon's performance in a master-slave robotic system by tracking hand movements. These studies relied on conventional optical or magnetic tracking systems, making their use impracticable in the operating room. This study concentrated on building an intrinsic movement capture platform using microcontroller based hardware wired to a surgical robot. Software was developed to enable tracking and analysis of hand movements while surgical tasks were performed. Movement capture was applied towards automated movements of the robotic instruments. By emulating control signals, recorded surgical movements were replayed by the robot's end-effectors. Though this work uses a surgical robot as the platform, the ideas and concepts put forward are applicable to telerobotic systems in general. PMID:19377115

  19. Collaboration among a Group of Self-Autonomous Mobile Robots with Diversified Personalities

    NASA Astrophysics Data System (ADS)

    Tauchi, Makiko; Sagawa, Yuji; Tanaka, Toshimitsu; Sugie, Noboru

    Simulation studies were carried out on a group of self-autonomous mobile robots collaborating in collection and cleaning-up tasks. The robots are endowed with two kinds of human-like personality traits: positivity and tenderness. Depending on the rank of positivity, a decision is made on which of the nearby robots should avoid a collision and which of the robots heading for the same small piece of baggage should carry it. For large baggage, which can be carried only by two collaborating robots, tenderness plays an essential role. In the first series of simulations, the initial configuration of 4 robots, 4 small pieces of baggage, and 2 large pieces of baggage was fixed. The cleaning-up tasks were carried out for all combinations of personalities, 625 cases in total. In the second series, 8 robots performed the task; 5 cases were chosen and 100 simulations were carried out for each by changing the configuration of the baggage. The results of the simulation showed that the heterogeneous group performs the task more effectively than the homogeneous group, suggesting that diversity in personality is good for survival. In addition to the performance index of task execution time, a satisfaction index is introduced to evaluate the degree of satisfaction of the group.

  20. R-MASTIF: robotic mobile autonomous system for threat interrogation and object fetch

    NASA Astrophysics Data System (ADS)

    Das, Aveek; Thakur, Dinesh; Keller, James; Kuthirummal, Sujit; Kira, Zsolt; Pivtoraiko, Mihail

    2013-01-01

    Autonomous robotic "fetch" operation, where a robot is shown a novel object and then asked to locate it in the field, retrieve it and bring it back to the human operator, is a challenging problem that is of interest to the military. The CANINE competition presented a forum for several research teams to tackle this challenge using state of the art in robotics technology. The SRI-UPenn team fielded a modified Segway RMP 200 robot with multiple cameras and lidars. We implemented a unique computer vision based approach for textureless colored object training and detection to robustly locate previously unseen objects out to 15 meters on moderately flat terrain. We integrated SRI's state of the art Visual Odometry for GPS-denied localization on our robot platform. We also designed a unique scooping mechanism which allowed retrieval of up to basketball sized objects with a reciprocating four-bar linkage mechanism. Further, all software, including a novel target localization and exploration algorithm was developed using ROS (Robot Operating System) which is open source and well adopted by the robotics community. We present a description of the system, our key technical contributions and experimental results.

  1. Behavior-Based Multi-Robot Collaboration for Autonomous Construction Tasks

    NASA Technical Reports Server (NTRS)

    Stroupe, Ashley; Huntsberger, Terry; Okon, Avi; Aghazarian, Hrand; Robinson, Matthew

    2005-01-01

    We present a heterogeneous multi-robot system for autonomous construction of a structure through assembly of long components. Placement of a component within an existing structure in a realistic environment is demonstrated on a two-robot team. The task requires component acquisition, cooperative transport, and cooperative precision manipulation. For adaptability, the system is designed as a behavior-based architecture. For applicability to space-related construction efforts, computation, power, communication, and sensing are minimized, though the techniques developed are also applicable to terrestrial construction tasks.

  2. Analysis of mutual assured destruction-like scenario with swarms of non-recallable autonomous robots

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2015-05-01

    This paper considers the implications of the creation of an autonomous robotic fighting force without recall-ability which could serve as a deterrent to a 'total war' magnitude attack. It discusses the technical considerations for this type of robotic system and the limited enhancements required to current technologies (particularly UAVs) needed to create such a system. Particular consideration is paid to how the introduction of this type of technology by one actor could create a need for reciprocal development. Also considered is the prospective utilization of this type of technology by non-state actors and the impact of this on state actors.

  3. Application of concurrent engineering methods to the design of an autonomous aerial robot

    NASA Astrophysics Data System (ADS)

    Ingalls, Stephen A.

    1991-12-01

    This paper documents the year-long efforts of a multidisciplinary design team to design, build, and support an autonomous aerial robotics system. The system was developed to participate in the Association for Unmanned Vehicle Systems' (AUVS) First International Aerial Robotics Competition, which was held in Atlanta, Georgia on the Georgia Tech campus on July 29th, 1991. As development time and budget were extremely limited, the team elected to attempt the design using concurrent engineering design methods. These methods were shown in an IDA study by Winner in the late 1980s to be particularly adept at handling the design difficulties presented by these limitations.

  4. Welding torch trajectory generation for hull joining using autonomous welding mobile robot

    NASA Astrophysics Data System (ADS)

    Hascoet, J. Y.; Hamilton, K.; Carabin, G.; Rauch, M.; Alonso, M.; Ares, E.

    2012-04-01

    Shipbuilding processes involve highly dangerous manual welding operations. Welding of ship hulls presents a hazardous environment for workers. This paper describes a new robotic system, developed by the SHIPWELD consortium, that moves autonomously on the hull and automatically executes the required welding processes. Specific focus is placed on the trajectory control of such a system and forms the basis for the discussion in this paper. It includes a description of the robotic hardware design as well as some methodology used to establish the torch trajectory control.

  5. Autonomous navigation vehicle system based on robot vision and multi-sensor fusion

    NASA Astrophysics Data System (ADS)

    Wu, Lihong; Chen, Yingsong; Cui, Zhouping

    2011-12-01

    The architecture of an autonomous navigation vehicle based on robot vision and multi-sensor fusion technology is described in this paper. In order to achieve greater intelligence and robustness, accurate real-time collection and processing of information are realized using this technology. The method used to achieve robot vision and multi-sensor fusion is discussed in detail. Results simulated in several operating modes show that this intelligent vehicle performs well in obstacle identification and avoidance and in path planning, which provides higher reliability during vehicle operation.

  6. Multisensor robotic system for autonomous space maintenance and repair

    NASA Technical Reports Server (NTRS)

    Abidi, M. A.; Green, W. L.; Chandra, T.; Spears, J.

    1988-01-01

    The feasibility of realistic autonomous space manipulation tasks using multisensory information is demonstrated. The system is capable of acquiring, integrating, and interpreting multisensory data to locate, mate, and demate a Fluid Interchange System (FIS) and a Module Interchange System (MIS). In both cases, autonomous location of a guiding light target, mating, and demating of the system are performed. Implemented vision-driven techniques are used to determine the arbitrary two-dimensional position and orientation of the mating elements as well as the arbitrary three-dimensional position and orientation of the light targets. A force/torque sensor continuously monitors the six components of force and torque exerted on the end-effector. Both FIS and MIS experiments were successfully accomplished on mock-ups built for this purpose. The method is immune to variations in the ambient light, which is particularly important because of the 90-minute day-night shift in space.

  7. Positional estimation techniques for an autonomous mobile robot

    NASA Technical Reports Server (NTRS)

    Nandhakumar, N.; Aggarwal, J. K.

    1990-01-01

    Techniques for positional estimation of a mobile robot navigating in an outdoor environment are described. A comprehensive review of the various positional estimation techniques studied in the literature is first presented. The techniques are divided into four different types and each of them is discussed briefly. Two different kinds of environments are considered for positional estimation: mountainous natural terrain and an urban, man-made environment with polyhedral buildings. In both cases, the robot is assumed to be equipped with a single visual camera that can be panned and tilted, and a 3-D description (world model) of the environment is assumed to be given. Such a description could be obtained from a stereo pair of aerial images or from the architectural plans of the buildings. Techniques for positional estimation using the camera input and the world model are presented.

  8. Autonomous star field identification for robotic solar system exploration

    NASA Astrophysics Data System (ADS)

    Scholl, Marija S.

    A six-feature all-sky star field identification algorithm has been developed. The minimum identifiable star pattern element consists of an oriented star triplet defined by three stars, their celestial coordinates and visual magnitudes. This algorithm has been integrated with a CCD-based imaging camera. The autonomous intelligent camera identifies in real time any star field without a priori knowledge. Observatory tests on star fields with this intelligent camera are described.

  9. Autonomous dexterous end-effectors for space robotics

    NASA Technical Reports Server (NTRS)

    Bekey, George A.; Iberall, Thea; Liu, Huan

    1989-01-01

    The development of a knowledge-based controller is summarized for the Belgrade/USC robot hand, a five-fingered end effector, designed for maximum autonomy. The biological principles of the hand and its architecture are presented. The conceptual and software aspects of the grasp selection system are discussed, including both the effects of the geometry of the target object and the task to be performed. Some current research issues are presented.

  10. Autonomous and Remote-Controlled Airborne and Ground-Based Robotic Platforms for Adaptive Geophysical Surveying

    NASA Astrophysics Data System (ADS)

    Spritzer, J. M.; Phelps, G. A.

    2011-12-01

    Low-cost autonomous and remote-controlled robotic platforms have opened the door to precision-guided geophysical surveying. Over the past two years, the U.S. Geological Survey, Senseta, NASA Ames Research Center, and Carnegie Mellon University Silicon Valley have developed and deployed small autonomous and remotely controlled vehicles for geophysical investigations. The purpose of this line of investigation is to 1) increase the analytical capability, resolution, and repeatability, and 2) decrease the time, and potentially the cost and manpower, necessary to conduct near-surface geophysical surveys. Current technology has advanced to the point where vehicles can perform geophysical surveys autonomously, freeing the geoscientist to process and analyze the incoming data in near-real time. This has enabled geoscientists to monitor survey parameters; process, analyze and interpret the incoming data; and test geophysical models in the same field session. This new approach, termed adaptive surveying, provides the geoscientist with choices of how the remainder of the survey should be conducted. Autonomous vehicles follow pre-programmed survey paths, which can be utilized to easily repeat surveys on the same path over large areas without the operator fatigue and error that plague man-powered surveys. While initial deployments with autonomous systems required a larger field crew than a man-powered survey, over time, with operational experience, costs and manpower requirements will decrease. Using a low-cost, commercially available chassis as the base for autonomous surveying robotic systems promises to provide higher precision and efficiency than human-powered techniques. An experimental survey successfully demonstrated the adaptive techniques described. A magnetic sensor was mounted on a small rover, which autonomously drove a prescribed course designed to provide an overview of the study area. Magnetic data was relayed to the base station periodically, processed and gridded. A

  11. Road network modeling in open source GIS to manage the navigation of autonomous robots

    NASA Astrophysics Data System (ADS)

    Mangiameli, Michele; Muscato, Giovanni; Mussumeci, Giuseppe

    2013-10-01

    The autonomous navigation of a robot can be accomplished through the assignment of a sequence of waypoints previously identified in the territory to be explored. In general, the starting point is a vector graph of the network consisting of possible paths. The vector graph can be directly available in the case of actual road networks, or it can be modeled, i.e., on the basis of cartographic supports or, even better, of a digital terrain model (DTM). In this paper we present software procedures developed in Grass-GIS, PostGIS and QGIS environments to identify, model, and visualize a road graph and to extract and normalize a sequence of waypoints that can be transferred to a robot for its autonomous navigation.
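
    A hedged sketch of the final step, turning a road graph into a waypoint sequence for the robot, is given below using the networkx library; the graph layout, edge weights, and sampling step are illustrative assumptions, not the GRASS/PostGIS/QGIS procedures themselves.

      import networkx as nx

      def waypoints_from_graph(edges, start, goal, every_n=1):
          """Extract a waypoint sequence from a road graph.

          edges: iterable of (node_a, node_b, length) tuples, where each node is
          an (x, y) coordinate pair; every_n thins the path to every n-th node."""
          g = nx.Graph()
          g.add_weighted_edges_from(edges)
          path = nx.shortest_path(g, start, goal, weight="weight")
          waypoints = path[::every_n]
          if waypoints[-1] != path[-1]:
              waypoints.append(path[-1])     # always keep the goal as the last waypoint
          return waypoints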

  12. Autonomous undulatory serpentine locomotion utilizing body dynamics of a fluidic soft robot.

    PubMed

    Onal, Cagdas D; Rus, Daniela

    2013-06-01

    Soft robotics offers the unique promise of creating inherently safe and adaptive systems. These systems bring man-made machines closer to the natural capabilities of biological systems. An important requirement to enable self-contained soft mobile robots is an on-board power source. In this paper, we present an approach to create a bio-inspired soft robotic snake that can undulate in a similar way to its biological counterpart using pressure for actuation power, without human intervention. With this approach, we develop an autonomous soft snake robot with on-board actuation, power, computation and control capabilities. The robot consists of four bidirectional fluidic elastomer actuators in series to create a traveling curvature wave from head to tail along its body. Passive wheels between segments generate the necessary frictional anisotropy for forward locomotion. It takes 14 h to build the soft robotic snake, which can attain an average locomotion speed of 19 mm/s. PMID:23524383

  13. Automatic generation of modules of object categorization for autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Gorbenko, Anna

    2013-10-01

    Many robotic tasks require advanced systems of visual sensing. Robotic systems of visual sensing must be able to solve a number of different complex problems of visual data analysis. Object categorization is one of such problems. In this paper, we propose an approach to automatic generation of computationally effective modules of object categorization for autonomous mobile robots. This approach is based on the consideration of the stack cover problem. In particular, it is assumed that the robot is able to perform an initial inspection of the environment. After such inspection, the robot needs to solve the stack cover problem by using a supercomputer. A solution of the stack cover problem allows the robot to obtain a template for computationally effective scheduling of object categorization. Also, we consider an efficient approach to solve the stack cover problem. In particular, we consider an explicit reduction from the decision version of the stack cover problem to the satisfiability problem. For different satisfiability algorithms, the results of computational experiments are presented.

  14. Monocular SLAM for Autonomous Robots with Enhanced Features Initialization

    PubMed Central

    Guerra, Edmundo; Munguia, Rodrigo; Grau, Antoni

    2014-01-01

    This work presents a variant approach to the monocular SLAM problem focused on exploiting the advantages of a human-robot interaction (HRI) framework. Based upon the delayed inverse-depth feature initialization SLAM (DI-D SLAM), a known monocular technique, several crucial modifications are introduced, taking advantage of data from a secondary monocular sensor, assuming that this second camera is worn by a human. The human explores an unknown environment with the robot, and when their fields of view coincide, the cameras are considered a pseudo-calibrated stereo rig to produce estimations for depth through parallax. These depth estimations are used to solve a related problem with DI-D monocular SLAM, namely, the requirement of a metric scale initialization through known artificial landmarks. The same process is used to improve the performance of the technique when introducing new landmarks into the map. The convenience of the approach taken to the stereo estimation, based on SURF feature matching, is discussed. Experimental validation is provided through results from real data, showing improvements in terms of more features correctly initialized, with reduced uncertainty, thus reducing scale and orientation drift. Additional discussion in terms of how a real-time implementation could take advantage of this approach is provided. PMID:24699284

  15. Monocular SLAM for autonomous robots with enhanced features initialization.

    PubMed

    Guerra, Edmundo; Munguia, Rodrigo; Grau, Antoni

    2014-01-01

    This work presents a variant approach to the monocular SLAM problem focused on exploiting the advantages of a human-robot interaction (HRI) framework. Based upon the delayed inverse-depth feature initialization SLAM (DI-D SLAM), a known monocular technique, several crucial modifications are introduced, taking advantage of data from a secondary monocular sensor, assuming that this second camera is worn by a human. The human explores an unknown environment with the robot, and when their fields of view coincide, the cameras are considered a pseudo-calibrated stereo rig to produce estimations for depth through parallax. These depth estimations are used to solve a related problem with DI-D monocular SLAM, namely, the requirement of a metric scale initialization through known artificial landmarks. The same process is used to improve the performance of the technique when introducing new landmarks into the map. The convenience of the approach taken to the stereo estimation, based on SURF feature matching, is discussed. Experimental validation is provided through results from real data, showing improvements in terms of more features correctly initialized, with reduced uncertainty, thus reducing scale and orientation drift. Additional discussion in terms of how a real-time implementation could take advantage of this approach is provided. PMID:24699284

  16. Adaptive Control for Autonomous Navigation of Mobile Robots Considering Time Delay and Uncertainty

    NASA Astrophysics Data System (ADS)

    Armah, Stephen Kofi

    Autonomous control of mobile robots has attracted considerable attention from researchers in the areas of robotics and autonomous systems during the past decades. One of the goals in the field of mobile robotics is development of platforms that robustly operate in given, partially unknown, or unpredictable environments and offer desired services to humans. Autonomous mobile robots need to be equipped with effective, robust, and/or adaptive navigation control systems. In spite of enormous reported work on autonomous navigation control systems for mobile robots, achieving the goal above is still an open problem. Robustness and reliability of the controlled system can always be improved. The fundamental issues affecting the stability of the control systems include the undesired nonlinear effects introduced by actuator saturation, time delay in the controlled system, and uncertainty in the model. This research work develops robustly stabilizing control systems by investigating and addressing such nonlinear effects through analysis, simulations, and experiments. The control systems are designed to meet specified transient and steady-state specifications. The systems used for this research are ground (Dr Robot X80SV) and aerial (Parrot AR.Drone 2.0) mobile robots. Firstly, an effective autonomous navigation control system is developed for the X80SV using logic control by combining 'go-to-goal', 'avoid-obstacle', and 'follow-wall' controllers. A MATLAB robot simulator is developed to implement this control algorithm and experiments are conducted in a typical office environment. The next stage of the research develops autonomous position (x, y, and z) and attitude (roll, pitch, and yaw) controllers for a quadrotor, and PD-feedback control is used to achieve stabilization. The quadrotor's nonlinear dynamics and kinematics are implemented using a MATLAB S-function to generate the state output. Secondly, the white-box and black-box approaches are used to obtain a linearized
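
    As a minimal sketch of one of the behaviors combined in such a navigation scheme, the following proportional go-to-goal law for a unicycle-type robot is given; the gains, saturation limit, and function name are illustrative assumptions and not the values or code from the dissertation.

      import math

      def go_to_goal(x, y, theta, gx, gy, k_v=0.5, k_w=1.5, v_max=0.4):
          """Return (v, w) velocity commands steering the robot toward (gx, gy)."""
          dx, dy = gx - x, gy - y
          dist = math.hypot(dx, dy)
          heading_err = math.atan2(dy, dx) - theta
          heading_err = math.atan2(math.sin(heading_err), math.cos(heading_err))  # wrap to [-pi, pi]
          v = min(k_v * dist, v_max)     # slow down near the goal, saturate elsewhere
          w = k_w * heading_err
          return v, w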

  17. Command and Control Architectures for Autonomous Micro-Robotic Forces - FY-2000 Project Report

    SciTech Connect

    Dudenhoeffer, Donald Dean

    2001-04-01

    Advances in Artificial Intelligence (AI) and micro-technologies will soon give rise to production of large-scale forces of autonomous micro-robots with systems of innate behaviors and with capabilities of self-organization and real-world tasking. Such organizations have been compared to schools of fish, flocks of birds, herds of animals, swarms of insects, and military squadrons. While these systems are envisioned as maintaining a high degree of autonomy, it is important to understand the relationship of man with such machines. In moving from research studies to the practical deployment of large-scale numbers of robots, one of the critical pieces that must be explored is the command and control architecture for humans to re-task the force and also inject global knowledge, experience, and intuition into it. Tele-operation should not be the goal, but rather a level of adjustable autonomy and high-level control. If a herd of sheep is comparable to the collective of robots, then the human element is comparable to the shepherd pulling in strays and guiding the herd in the direction of greener pastures. This report addresses the issues and development of command and control for large-scale numbers of autonomous robots deployed as a collective force.

  18. A path planning algorithm for lane-following-based autonomous mobile robot navigation

    NASA Astrophysics Data System (ADS)

    Aljeroudi, Yazan; Paulik, Mark; Krishnan, Mohan; Luo, Chaomin

    2010-01-01

    In this paper we address the problem of autonomous robot navigation in a "roadway" type environment, where the robot has to drive forward on a defined path that could be impeded by the presence of obstacles. The specific context is the Autonomous Challenge of the Intelligent Ground Vehicle Competition (www.igvc.org). The task of the path planner is to ensure that the robot follows the path without turning back, as can happen in switchbacks, and/or leaving the course, as can happen in dashed or single lane line situations. A multi-behavior path planning algorithm is proposed. The first behavior determines a goal using a center of gravity (CoG) computation from the results of image processing techniques designed to extract lane lines. The second behavior is based on developing a sense of the current "general direction" of the contours of the course. This is gauged based on the immediate path history of the robot. An adaptive-weight-based fusion of the two behaviors is used to generate the best overall direction. This multi-behavior path planning strategy has been evaluated successfully in a Player/Stage simulation environment and subsequently implemented in the 2009 IGVC. The details of our experience will be presented at the conference.
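
    The adaptive-weight fusion of the two behaviors can be pictured with the hedged sketch below, which blends the lane center-of-gravity heading with the recent "general direction" on the unit circle; the confidence-based weighting rule is an assumption for illustration only.

      import math

      def fuse_headings(cog_heading, history_heading, cog_confidence):
          """Blend two candidate headings (radians); cog_confidence is in [0, 1],
          e.g. derived from how many lane-line pixels the vision stage found."""
          w = max(0.0, min(1.0, cog_confidence))
          # average on the unit circle so angle wrap-around is handled correctly
          x = w * math.cos(cog_heading) + (1.0 - w) * math.cos(history_heading)
          y = w * math.sin(cog_heading) + (1.0 - w) * math.sin(history_heading)
          return math.atan2(y, x)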

  19. Effectiveness of social behaviors for autonomous wheelchair robot to support elderly people in Japan.

    PubMed

    Shiomi, Masahiro; Iio, Takamasa; Kamei, Koji; Sharma, Chandraprakash; Hagita, Norihiro

    2015-01-01

    We developed a wheelchair robot to support the movement of elderly people and specifically implemented two functions to enhance their intention to use it: speaking behavior to convey place/location related information and speed adjustment based on individual preferences. Our study examines how the evaluations of our wheelchair robot differ when compared with human caregivers and a conventional autonomous wheelchair without the two proposed functions in a moving support context. 28 senior citizens participated in the experiment to evaluate three different conditions. Our measurements consisted of questionnaire items and the coding of free-style interview results. Our experimental results revealed that elderly people evaluated our wheelchair robot higher than the wheelchair without the two functions and the human caregivers for some items. PMID:25993038

  20. Effectiveness of Social Behaviors for Autonomous Wheelchair Robot to Support Elderly People in Japan

    PubMed Central

    Shiomi, Masahiro; Iio, Takamasa; Kamei, Koji; Sharma, Chandraprakash; Hagita, Norihiro

    2015-01-01

    We developed a wheelchair robot to support the movement of elderly people and specifically implemented two functions to enhance their intention to use it: speaking behavior to convey place/location related information and speed adjustment based on individual preferences. Our study examines how the evaluations of our wheelchair robot differ when compared with human caregivers and a conventional autonomous wheelchair without the two proposed functions in a moving support context. 28 senior citizens participated in the experiment to evaluate three different conditions. Our measurements consisted of questionnaire items and the coding of free-style interview results. Our experimental results revealed that elderly people evaluated our wheelchair robot higher than the wheelchair without the two functions and the human caregivers for some items. PMID:25993038

  1. Incremental inverse kinematics based vision servo for autonomous robotic capture of non-cooperative space debris

    NASA Astrophysics Data System (ADS)

    Dong, Gangqi; Zhu, Z. H.

    2016-04-01

    This paper proposes a new incremental inverse kinematics based vision servo approach for robotic manipulators to capture a non-cooperative target autonomously. The target's pose and motion are estimated by a vision system using an integrated photogrammetry and EKF algorithm. Based on the estimated pose and motion of the target, the instantaneous desired position of the end-effector is predicted by inverse kinematics and the robotic manipulator is moved incrementally from its current configuration subject to the joint speed limits. This approach effectively eliminates the multiple solutions in the inverse kinematics and increases the robustness of the control algorithm. The proposed approach is validated by a hardware-in-the-loop simulation, where the pose and motion of the non-cooperative target are estimated by a real vision system. The simulation results demonstrate the effectiveness and robustness of the proposed estimation approach for the target and the incremental control strategy for the robotic manipulator.
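
    A numerical sketch of one incremental step of this kind of scheme is shown below, using a damped least-squares pseudo-inverse of the Jacobian and per-joint speed limits; the forward-kinematics and Jacobian functions are assumed inputs, and the damping value is illustrative rather than taken from the paper.

      import numpy as np

      def ik_step(q, x_desired, fk, jac, dt, qd_max, damping=1e-2):
          """One incremental inverse-kinematics update toward x_desired.

          q: joint vector; fk(q) and jac(q): user-supplied forward kinematics and
          Jacobian; dt: control period; qd_max: per-joint speed limits."""
          err = x_desired - fk(q)
          J = jac(q)
          # damped least squares keeps the step well-posed near singularities
          dq = J.T @ np.linalg.solve(J @ J.T + damping * np.eye(J.shape[0]), err)
          qd = np.clip(dq / dt, -qd_max, qd_max)   # respect joint speed limits
          return q + qd * dt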

  2. Emergence of Leadership in a Group of Autonomous Robots

    PubMed Central

    Pugliese, Francesco; Acerbi, Alberto; Marocco, Davide

    2015-01-01

    In this paper we examine the factors contributing to the emergence of leadership in a group, and we explore the relationship between the role of the leader and the behavioural capabilities of other individuals. We use a simulation technique where a group of foraging robots must coordinate to choose between two identical food zones in order to forage collectively. Behavioural and quantitative analysis indicate that a form of leadership emerges, and that groups with a leader are more effective than groups without. Moreover, we show that the most skilled individuals in a group tend to be the ones that assume a leadership role, supporting biological findings. Further analysis reveals the emergence of different “styles” of leadership (active and passive). PMID:26340449

  3. Emergence of Leadership in a Group of Autonomous Robots.

    PubMed

    Pugliese, Francesco; Acerbi, Alberto; Marocco, Davide

    2015-01-01

    In this paper we examine the factors contributing to the emergence of leadership in a group, and we explore the relationship between the role of the leader and the behavioural capabilities of other individuals. We use a simulation technique where a group of foraging robots must coordinate to choose between two identical food zones in order to forage collectively. Behavioural and quantitative analysis indicate that a form of leadership emerges, and that groups with a leader are more effective than groups without. Moreover, we show that the most skilled individuals in a group tend to be the ones that assume a leadership role, supporting biological findings. Further analysis reveals the emergence of different "styles" of leadership (active and passive). PMID:26340449

  4. Lane identification and path planning for autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    McKeon, Robert T.; Paulik, Mark; Krishnan, Mohan

    2006-10-01

    This work has been performed in conjunction with the University of Detroit Mercy's (UDM) ECE Department autonomous vehicle entry in the 2006 Intelligent Ground Vehicle Competition (www.igvc.org). The IGVC challenges engineering students to design autonomous vehicles and compete in a variety of unmanned mobility competitions. The course to be traversed in the competition consists of a lane demarcated by painted lines on grass with the possibility of one of the two lines being deliberately left out over segments of the course. The course also consists of other challenging artifacts such as sandpits, ramps, potholes, and colored tarps that alter the color composition of scenes, and obstacles set up using orange and white construction barrels. This paper describes a composite lane edge detection approach that uses three algorithms to implement noise filters enabling increased removal of noise prior to the application of image thresholding. The first algorithm uses a row-adaptive statistical filter to establish an intensity floor followed by a global threshold based on a reverse cumulative intensity histogram and a priori knowledge about lane thickness and separation. The second method first improves the contrast of the image by implementing an arithmetic combination of the blue plane (RGB format) and a modified saturation plane (HSI format). A global threshold is then applied based on the mean of the intensity image and a user-defined offset. The third method applies the horizontal component of the Sobel mask to a modified gray scale of the image, followed by a thresholding method similar to the one used in the second method. The Hough transform is applied to each of the resulting binary images to select the most probable line candidates. Finally, a heuristics-based confidence interval is determined, and the results sent on to a separate fuzzy polar-based navigation algorithm, which fuses the image data with that produced by a laser scanner (for obstacle detection).
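
    For the third method, a hedged OpenCV sketch is given below: the horizontal Sobel component of a gray-scale image is thresholded at its mean plus an offset, and a probabilistic Hough transform picks line candidates. Kernel size, offset, and Hough parameters are illustrative assumptions, not the values used in the competition entry.

      import cv2
      import numpy as np

      def detect_lane_lines(bgr, offset=20):
          """Return candidate lane-line segments as (x1, y1, x2, y2) tuples."""
          gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
          sobel = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)     # horizontal gradient component
          mag = cv2.convertScaleAbs(sobel)
          _, binary = cv2.threshold(mag, float(mag.mean()) + offset, 255, cv2.THRESH_BINARY)
          lines = cv2.HoughLinesP(binary, 1, np.pi / 180, threshold=80,
                                  minLineLength=60, maxLineGap=10)
          return [] if lines is None else [tuple(l[0]) for l in lines]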

  5. Measure of the accuracy of navigational sensors for autonomous path tracking

    NASA Astrophysics Data System (ADS)

    Motazed, Ben

    1994-02-01

    Outdoor mobile robot path tracking for an extended period of time and distance is a formidable task. The difficulty lies in the ability of robot navigation systems to reliably and accurately report on the position and orientation of the vehicle. This paper addresses the accurate navigation of mobile robots in the context of non-line of sight autonomous convoying. Dead-reckoning, GPS and vision based autonomous road following navigational schemes are integrated through a Kalman filter formulation to derive mobile robot position and orientation. The accuracy of these navigational schemes and their sufficiency to achieve autonomous path tracking for long duration are examined.
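
    A minimal sketch of the kind of Kalman filter fusion described, combining a dead-reckoning prediction with a GPS position fix, is given below; the identity motion/measurement models and the noise covariances are illustrative assumptions, not the formulation used in the paper.

      import numpy as np

      def kf_update(x, P, u, z,
                    Q=np.diag([0.02, 0.02]),    # assumed process (odometry) noise
                    R=np.diag([4.0, 4.0])):     # assumed GPS measurement noise
          """One predict/correct cycle for a 2-D position estimate.

          x: current (x, y) estimate; P: 2x2 covariance; u: odometry displacement
          since the last step; z: GPS position measurement."""
          x_pred = x + u                # dead reckoning advances the state
          P_pred = P + Q                # and inflates the covariance
          K = P_pred @ np.linalg.inv(P_pred + R)
          x_new = x_pred + K @ (z - x_pred)
          P_new = (np.eye(2) - K) @ P_pred
          return x_new, P_new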

  6. Immune systems are not just for making you feel better: they are for controlling autonomous robots

    NASA Astrophysics Data System (ADS)

    Rosenblum, Mark

    2005-05-01

    The typical algorithm for robot autonomous navigation in off-road complex environments involves building a 3D map of the robot's surrounding environment using a 3D sensing modality such as stereo vision or active laser scanning, and generating an instantaneous plan to navigate around hazards. Although there has been steady progress using these methods, these systems suffer from several limitations that cannot be overcome with 3D sensing and planning alone. Geometric sensing alone has no ability to distinguish between compressible and non-compressible materials. As a result, these systems have difficulty in heavily vegetated environments and require sensitivity adjustments across different terrain types. On the planning side, these systems have no ability to learn from their mistakes and avoid problematic environmental situations on subsequent encounters. We have implemented an adaptive terrain classification system based on the Artificial Immune System (AIS) computational model, which is loosely based on the biological immune system, that combines various forms of imaging sensor inputs to produce a "feature labeled" image of the scene categorizing areas as benign or detrimental for autonomous robot navigation. Because of the qualities of the AIS computation model, the resulting system will be able to learn and adapt on its own through interaction with the environment by modifying its interpretation of the sensor data. The feature labeled results from the AIS analysis are inserted into a map and can then be used by a planner to generate a safe route to a goal point. The coupling of diverse visual cues with the malleable AIS computational model will lead to autonomous robotic ground vehicles that require less human intervention for deployment in novel environments and more robust operation as a result of the system's ability to improve its performance through interaction with the environment.

  7. Real-time map building and navigation for autonomous robots in unknown environments.

    PubMed

    Oriolo, G; Ulivi, G; Vendittelli, M

    1998-01-01

    An algorithmic solution method is presented for the problem of autonomous robot motion in completely unknown environments. Our approach is based on the alternate execution of two fundamental processes: map building and navigation. In the former, range measures are collected through the robot exteroceptive sensors and processed in order to build a local representation of the surrounding area. This representation is then integrated in the global map so far reconstructed by filtering out insufficient or conflicting information. In the navigation phase, an A*-based planner generates a local path from the current robot position to the goal. Such a path is safe inside the explored area and provides a direction for further exploration. The robot follows the path up to the boundary of the explored area, terminating its motion if unexpected obstacles are encountered. The most peculiar aspects of our method are the use of fuzzy logic for the efficient building and modification of the environment map, and the iterative application of A*, a complete planning algorithm which takes full advantage of local information. Experimental results for a NOMAD 200 mobile robot show the real-time performance of the proposed method, both in static and moderately dynamic environments. PMID:18255950
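
    A minimal A* sketch of the kind iterated in the navigation phase is shown below, planning on a 2-D occupancy grid with 4-connected moves and a Manhattan heuristic; the binary grid representation is an assumption, since the paper builds its map with fuzzy logic rather than a simple grid.

      import heapq

      def astar(grid, start, goal):
          """Plan on a 2-D occupancy grid (0 = free, 1 = occupied); start and goal
          are (row, col) tuples. Returns the path as a list of cells, or None."""
          h = lambda a: abs(a[0] - goal[0]) + abs(a[1] - goal[1])
          open_set = [(h(start), start)]
          came_from = {start: None}
          g_cost = {start: 0}
          while open_set:
              _, node = heapq.heappop(open_set)
              if node == goal:
                  path = []
                  while node is not None:
                      path.append(node)
                      node = came_from[node]
                  return path[::-1]
              r, c = node
              for nbr in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                  nr, nc = nbr
                  if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and grid[nr][nc] == 0:
                      ng = g_cost[node] + 1
                      if ng < g_cost.get(nbr, float("inf")):
                          g_cost[nbr] = ng
                          came_from[nbr] = node
                          heapq.heappush(open_set, (ng + h(nbr), nbr))
          return None          # goal not reachable inside the explored area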

  8. Remotely Manipulated And Autonomous Robotic Welding Fabrication In Space

    NASA Astrophysics Data System (ADS)

    Agapakis, John E.; Masubuchi, Koichi

    1985-12-01

    The results of a National Aeronautics and Space Administration (NASA) sponsored study, performed in order to establish the feasibility of remotely manipulated or unmanned welding fabrication systems for space construction, are first presented in this paper. Possible space welding fabrication tasks and operational modes are classified and the capabilities and limitations of human operators and machines are outlined. The human performance in remote welding tasks is experimentally tested under the sensing and actuation constraints imposed by remote manipulation in outer space environments. Proposals for the development of space welding technology are made and necessary future research and development (R&D) efforts are identified. The development of improved visual sensing strategies and computer encoding of the human welding engineering expertise are identified as essential, both for human operator assistance and for autonomous operation in all phases of welding fabrication. Results of a related follow-up study are then briefly presented. Novel uses of machine vision for the determination of the weld joint and bead geometry are proposed and implemented, and a first prototype of a rule-based expert system is developed for the interpretation of the visually detected weld features and defects.

  9. Reliability of EUCLIDIAN: An autonomous robotic system for image-guided prostate brachytherapy

    SciTech Connect

    Podder, Tarun K.; Buzurovic, Ivan; Huang Ke; Showalter, Timothy; Dicker, Adam P.; Yu, Yan

    2011-01-15

    Purpose: Recently, several robotic systems have been developed to perform accurate and consistent image-guided brachytherapy. Before introducing a new device into clinical operations, it is important to assess the reliability and mean time before failure (MTBF) of the system. In this article, the authors present the preclinical evaluation and analysis of the reliability and MTBF of an autonomous robotic system, which was developed for prostate seed implantation. Methods: The authors have considered three steps that are important in reliability growth analysis. These steps are: identification and isolation of failures, classification of failures, and trend analysis. For any one-of-a-kind product, the reliability enhancement is accomplished through test-fix-test. The authors have used failure mode and effect analysis for collection and analysis of reliability data by identifying and categorizing the failure modes. Failures were classified according to severity. Failures that occurred during the operation of this robotic system were modeled as a nonhomogeneous Poisson process. The failure occurrence trend was analyzed using the Laplace test. For analyzing and predicting reliability growth, commonly used and widely accepted models, Duane's model and the Army Materiel Systems Analysis Activity (AMSAA) model, i.e., Crow's model, were applied. The MTBF was used as an important measure for assessing the system's reliability. Results: During preclinical testing, 3196 seeds (in 53 test cases) were deposited autonomously by the robot and 14 critical failures were encountered. The majority of the failures occurred during the first few cases. The distribution of failures followed Duane's postulation as well as Crow's postulation of reliability growth. The Laplace test index was -3.82 (<0), indicating a significant trend in failure data, and the failure intervals lengthened gradually. The continuous increase in the failure occurrence interval suggested a trend toward improved reliability. The MTBF
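
    The trend test and growth model named above lend themselves to a compact numerical sketch; the failure times below are placeholders, not the EUCLIDIAN data, and the formulas are the standard Laplace statistic and Duane log-log fit of cumulative MTBF.

      # Reliability-growth sketch: Laplace trend test and Duane fit (placeholder data).
      import numpy as np

      def laplace_test(failure_times, observation_end):
          """failure_times: cumulative times of the n failures observed over (0, T];
          a value u < 0 suggests lengthening failure intervals, i.e. reliability growth."""
          t = np.asarray(failure_times, dtype=float)
          n = len(t)
          return (t.mean() - observation_end / 2.0) / (observation_end * np.sqrt(1.0 / (12.0 * n)))

      def duane_fit(failure_times):
          """Fit log(cumulative MTBF) = intercept + alpha * log(t); returns the growth
          slope alpha and the instantaneous MTBF at the last recorded failure."""
          t = np.asarray(failure_times, dtype=float)
          n = np.arange(1, len(t) + 1)
          cum_mtbf = t / n
          alpha, intercept = np.polyfit(np.log(t), np.log(cum_mtbf), 1)
          inst_mtbf = cum_mtbf[-1] / (1.0 - alpha)
          return alpha, inst_mtbf

      # example with made-up failure times (cumulative test-case index used as "time"):
      times = [2, 5, 9, 15, 23, 33, 45]
      print(laplace_test(times, observation_end=53), duane_fit(times))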

  10. Reliability of EUCLIDIAN: An autonomous robotic system for image-guided prostate brachytherapy

    PubMed Central

    Podder, Tarun K.; Buzurovic, Ivan; Huang, Ke; Showalter, Timothy; Dicker, Adam P.; Yu, Yan

    2011-01-01

    Purpose: Recently, several robotic systems have been developed to perform accurate and consistent image-guided brachytherapy. Before introducing a new device into clinical operations, it is important to assess the reliability and mean time before failure (MTBF) of the system. In this article, the authors present the preclinical evaluation and analysis of the reliability and MTBF of an autonomous robotic system, which was developed for prostate seed implantation. Methods: The authors have considered three steps that are important in reliability growth analysis. These steps are: identification and isolation of failures, classification of failures, and trend analysis. For any one-of-a-kind product, the reliability enhancement is accomplished through test-fix-test. The authors have used failure mode and effect analysis for collection and analysis of reliability data by identifying and categorizing the failure modes. Failures were classified according to severity. Failures that occurred during the operation of this robotic system were modeled as a nonhomogeneous Poisson process. The failure occurrence trend was analyzed using the Laplace test. For analyzing and predicting reliability growth, commonly used and widely accepted models, Duane’s model and the Army Materiel Systems Analysis Activity (AMSAA) model, i.e., Crow’s model, were applied. The MTBF was used as an important measure for assessing the system’s reliability. Results: During preclinical testing, 3196 seeds (in 53 test cases) were deposited autonomously by the robot and 14 critical failures were encountered. The majority of the failures occurred during the first few cases. The distribution of failures followed Duane’s postulation as well as Crow’s postulation of reliability growth. The Laplace test index was −3.82 (<0), indicating a significant trend in failure data, and the failure intervals lengthened gradually. The continuous increase in the failure occurrence interval suggested a trend toward improved

  11. A learning-based semi-autonomous controller for robotic exploration of unknown disaster scenes while searching for victims.

    PubMed

    Doroodgar, Barzin; Liu, Yugang; Nejat, Goldie

    2014-12-01

    Semi-autonomous control schemes can address the limitations of both teleoperation and fully autonomous robotic control of rescue robots in disaster environments by allowing a human operator to cooperate with a rescue robot and share tasks such as navigation, exploration, and victim identification. In this paper, we present a unique hierarchical reinforcement learning (HRL) based semi-autonomous control architecture for rescue robots operating in cluttered and unknown urban search and rescue (USAR) environments. The aim of the controller is to enable a rescue robot to continuously learn from its own experiences in an environment in order to improve its overall performance in exploration of unknown disaster scenes. A direction-based exploration technique is integrated into the controller to expand the search area of the robot via the classification of regions and the rubble piles within these regions. Both simulations and physical experiments in USAR-like environments verify the robustness of the proposed HRL-based semi-autonomous controller to unknown cluttered scenes with different sizes and varying types of configurations. PMID:24760949

  12. Genetic Fuzzy Control for Path-Tracking of an Autonomous Robotic Bicycle

    NASA Astrophysics Data System (ADS)

    Chen, Chih-Keng; Dao, Thanh-Son

    Due to its non-holonomic constraints and highly unstable nature, an autonomous bicycle is difficult to control for tracking a target path while maintaining its balance. As a result of the non-holonomic constraint conditions, the instantaneous velocity of the vehicle is limited to certain directions. Constraints of this kind occur under the no-slip condition. This study focuses on the optimization of fuzzy logic controllers (FLCs) for path-tracking of an autonomous robotic bicycle using a genetic algorithm (GA). In order to implement the path-tracking algorithm, strategies for balancing and for tracking a given roll angle are also addressed. The proposed strategy optimizes the FLCs by keeping the rule table fixed and tuning their membership functions through scaling factors (SFs) and deforming coefficients (DCs). Numerical simulations demonstrate the effectiveness of the proposed genetic fuzzy controller structure for the developed bicycle system.
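
    A minimal sketch of the kind of GA loop implied here is given below; the simulate_tracking_error function stands in for the bicycle-plus-FLC simulation, and the gene count, selection, crossover, and mutation settings are assumptions of the example rather than the paper's configuration.

      # Sketch of a GA that tunes FLC scaling factors against a tracking-error fitness.
      import random

      def simulate_tracking_error(scaling_factors):
          # placeholder: run the bicycle + fixed-rule-table FLC simulation and return
          # the accumulated path-tracking error for these scaling factors
          return sum((sf - 1.0) ** 2 for sf in scaling_factors)

      def evolve(pop_size=20, n_genes=4, generations=50, mut_rate=0.2):
          pop = [[random.uniform(0.1, 2.0) for _ in range(n_genes)] for _ in range(pop_size)]
          for _ in range(generations):
              scored = sorted(pop, key=simulate_tracking_error)
              parents = scored[: pop_size // 2]                 # truncation selection
              children = []
              while len(children) < pop_size - len(parents):
                  a, b = random.sample(parents, 2)
                  cut = random.randrange(1, n_genes)            # one-point crossover
                  child = a[:cut] + b[cut:]
                  if random.random() < mut_rate:                # Gaussian mutation
                      i = random.randrange(n_genes)
                      child[i] = max(0.01, child[i] + random.gauss(0.0, 0.1))
                  children.append(child)
              pop = parents + children
          return min(pop, key=simulate_tracking_error)

      best_scaling_factors = evolve()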

  13. Simulation of autonomous robotic multiple-core biopsy by 3D ultrasound guidance.

    PubMed

    Liang, Kaicheng; Rogers, Albert J; Light, Edward D; Von Allmen, Daniel; Smith, Stephen W

    2010-04-01

    An autonomous multiple-core biopsy system guided by real-time 3D ultrasound and operated by a robotic arm with 6+1 degrees of freedom has been developed. Using a specimen of turkey breast as a tissue phantom, our system was able to first autonomously locate the phantom in the image volume and then perform needle sticks in each of eight sectors in the phantom in a single session, with no human intervention required. Based on the fraction of eight sectors successfully sampled in an experiment of five trials, a success rate of 93% was recorded. This system could have relevance in clinical procedures that involve multiple needle-core sampling such as prostate or breast biopsy. PMID:20687279

  14. Development and training of a learning expert system in an autonomous mobile robot via simulation

    SciTech Connect

    Spelt, P.F.; Lyness, E.; DeSaussure, G. (Center for Engineering Systems Advanced Research)

    1989-11-01

    The Center for Engineering Systems Advanced Research (CESAR) conducts basic research in the area of intelligent machines. Recently at CESAR, a learning expert system was created to operate on board an autonomous robot working at a process control panel. The authors discuss the two-computer simulation system used to create, evaluate and train this learning system. The simulation system has a graphics display of the current status of the process being simulated, and the same program which does the simulating also drives the actual control panel. Simulation results were validated on the actual robot. The speed and safety advantages of using a computerized simulator to train a learning computer, and future uses of the simulation system, are discussed.

  15. Concept for practical exercises for studying autonomous flying robots in a university environment: part II

    NASA Astrophysics Data System (ADS)

    Gageik, Nils; Dilger, Erik; Montenegro, Sergio; Schön, Stefan; Wildenhein, Rico; Creutzburg, Reiner; Fischer, Arno

    2015-03-01

    The present paper demonstrates the application of quadcopters as educational material for students in aerospace computer science, as it is already in use today. The work with quadrotors teaches students theoretical and practical knowledge in the fields of robotics, control theory, aerospace and electrical engineering as well as embedded programming and computer science. To this end, the material, concept, realization and outlook of such a course are discussed in this paper. Besides that, the paper gives a brief overview of student research projects following the course, which are related to the research and development of fully autonomous quadrotors.

  16. Autonomous global sky monitoring with real-time robotic follow-up

    SciTech Connect

    Vestrand, W Thomas; Davis, H; Wren, J; Wozniak, P; Norman, B; White, R; Bloch, J; Fenimore, E; Hodge, Barry; Jah, Moriba; Rast, Richard

    2008-01-01

    We discuss the development of prototypes for a global grid of advanced 'thinking' sky sentinels and robotic follow-up telescopes that observe the full night sky, providing real-time monitoring by autonomously recognizing anomalous behavior, selecting targets for detailed investigation, and enabling rapid recognition of and swift response to transients as they emerge. This T3 global EO grid avoids the limitations imposed by geography and weather to provide persistent monitoring of the night sky.

  17. Safety Verification of a Fault Tolerant Reconfigurable Autonomous Goal-Based Robotic Control System

    NASA Technical Reports Server (NTRS)

    Braman, Julia M. B.; Murray, Richard M; Wagner, David A.

    2007-01-01

    Fault tolerance and safety verification of control systems are essential for the success of autonomous robotic systems. A control architecture called Mission Data System (MDS), developed at the Jet Propulsion Laboratory, takes a goal-based control approach. In this paper, a method for converting goal network control programs into linear hybrid systems is developed. The linear hybrid system can then be verified for safety in the presence of failures using existing symbolic model checkers. An example task is simulated in MDS and successfully verified using HyTech, a symbolic model checking software for linear hybrid systems.

  18. Localization of Non-Linearly Modeled Autonomous Mobile Robots Using Out-of-Sequence Measurements

    PubMed Central

    Besada-Portas, Eva; Lopez-Orozco, Jose A.; Lanillos, Pablo; de la Cruz, Jesus M.

    2012-01-01

    This paper presents the state of the art of estimation algorithms dealing with Out-of-Sequence (OOS) measurements for non-linearly modeled systems. The state of the art includes a critical analysis of the algorithm properties that takes into account the applicability of these techniques to autonomous mobile robot navigation based on the fusion of the measurements provided, delayed and OOS, by multiple sensors. In addition, it shows a representative example of the use of one of the most computationally efficient approaches in the localization module of the control software of a real robot (which has non-linear dynamics, and linear and non-linear sensors) and compares its performance against other approaches. The simulation results obtained with the selected OOS algorithm show the computational requirements that each sensor of the robot imposes on it. The real experiments show how the inclusion of the selected OOS algorithm in the control software lets the robot successfully navigate in spite of receiving many OOS measurements. Finally, the comparison highlights that not only is the selected OOS algorithm among the best performing ones of the comparison, but it also has the lowest computational and memory cost. PMID:22736962
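
    For orientation, the sketch below shows the simplest (and least efficient) way to handle out-of-sequence measurements: buffer everything and re-run a Kalman filter from the prior whenever a delayed measurement arrives. The cited work surveys far more efficient algorithms; the 1-D constant-velocity model and the noise values here are illustrative assumptions only.

      # Naive buffer-and-refilter handling of out-of-sequence measurements (sketch).
      import numpy as np

      class OOSKalman:
          def __init__(self):
              self.x0 = np.zeros(2)            # prior state: [position, velocity]
              self.P0 = np.eye(2)
              self.H = np.array([[1.0, 0.0]])  # position-only measurements
              self.R = np.array([[0.5]])
              self.Q = 0.01 * np.eye(2)
              self.meas = []                   # list of (timestamp, z)
              self.x, self.P, self.t = self.x0.copy(), self.P0.copy(), 0.0

          def _step(self, t_new, z):
              dt = t_new - self.t
              F = np.array([[1.0, dt], [0.0, 1.0]])
              self.x = F @ self.x
              self.P = F @ self.P @ F.T + self.Q
              y = np.atleast_1d(z) - self.H @ self.x
              S = self.H @ self.P @ self.H.T + self.R
              K = self.P @ self.H.T @ np.linalg.inv(S)
              self.x = self.x + K @ y
              self.P = (np.eye(2) - K @ self.H) @ self.P
              self.t = t_new

          def add_measurement(self, t_z, z):
              """Accepts in-sequence or delayed measurements; refilters in time order."""
              self.meas.append((t_z, float(z)))
              self.meas.sort(key=lambda m: m[0])
              self.x, self.P, self.t = self.x0.copy(), self.P0.copy(), 0.0
              for tm, zm in self.meas:
                  self._step(tm, zm)
              return self.x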

  19. Localization of non-linearly modeled autonomous mobile robots using out-of-sequence measurements.

    PubMed

    Besada-Portas, Eva; Lopez-Orozco, Jose A; Lanillos, Pablo; de la Cruz, Jesus M

    2012-01-01

    This paper presents the state of the art of estimation algorithms dealing with Out-of-Sequence (OOS) measurements for non-linearly modeled systems. The state of the art includes a critical analysis of the algorithm properties that takes into account the applicability of these techniques to autonomous mobile robot navigation based on the fusion of the measurements provided, delayed and OOS, by multiple sensors. In addition, it shows a representative example of the use of one of the most computationally efficient approaches in the localization module of the control software of a real robot (which has non-linear dynamics, and linear and non-linear sensors) and compares its performance against other approaches. The simulation results obtained with the selected OOS algorithm show the computational requirements that each sensor of the robot imposes on it. The real experiments show how the inclusion of the selected OOS algorithm in the control software lets the robot successfully navigate in spite of receiving many OOS measurements. Finally, the comparison highlights that not only is the selected OOS algorithm among the best performing ones of the comparison, but it also has the lowest computational and memory cost. PMID:22736962

  20. Autonomous robotic capture of non-cooperative target by adaptive extended Kalman filter based visual servo

    NASA Astrophysics Data System (ADS)

    Dong, Gangqi; Zhu, Zheng H.

    2016-05-01

    This paper presents a real-time, vision-based algorithm for the pose and motion estimation of non-cooperative targets and its application in a visual servo robotic manipulator to perform autonomous capture. A hybrid approach of adaptive extended Kalman filter and photogrammetry is developed for the real-time pose and motion estimation of non-cooperative targets. Based on the pose and motion estimates, the desired pose and trajectory of the end-effector are defined and the corresponding desired joint angles of the robotic manipulator are derived by inverse kinematics. A closed-loop visual servo control scheme is then developed for the robotic manipulator to track, approach and capture the target. Validating experiments are designed and performed on a custom-built six-degree-of-freedom robotic manipulator with an eye-in-hand configuration. The experimental results demonstrate the feasibility, effectiveness and robustness of the proposed adaptive extended Kalman filter enabled pose and motion estimation and visual servo strategy.
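
    As a drastically simplified, hedged illustration of the estimation idea, the sketch below runs a constant-velocity Kalman filter on a single pose coordinate and adapts its measurement noise from recent innovation statistics (one common "adaptive" rule); the paper's actual 6-DOF photogrammetric model and adaptation law are not reproduced here, and all noise values are assumptions.

      # Adaptive constant-velocity filter sketch (illustrative only).
      import numpy as np
      from collections import deque

      class AdaptiveCVFilter:
          def __init__(self, dt=0.05, window=20):
              self.x = np.zeros(2)                       # [position, velocity]
              self.P = np.eye(2)
              self.F = np.array([[1.0, dt], [0.0, 1.0]])
              self.Q = 1e-4 * np.eye(2)
              self.H = np.array([[1.0, 0.0]])
              self.R = np.array([[0.05]])
              self.innov = deque(maxlen=window)          # recent innovations

          def step(self, z):
              # predict
              self.x = self.F @ self.x
              self.P = self.F @ self.P @ self.F.T + self.Q
              # innovation and innovation-based adaptation of the measurement noise
              y = np.atleast_1d(z) - self.H @ self.x
              self.innov.append(float(y[0]))
              if len(self.innov) == self.innov.maxlen:
                  c = np.var(self.innov)                 # sample innovation variance
                  hph = (self.H @ self.P @ self.H.T).item()
                  self.R = np.array([[max(c - hph, 1e-6)]])
              # update
              S = self.H @ self.P @ self.H.T + self.R
              K = self.P @ self.H.T @ np.linalg.inv(S)
              self.x = self.x + K @ y
              self.P = (np.eye(2) - K @ self.H) @ self.P
              return self.x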

  1. Optical 3D laser measurement system for navigation of autonomous mobile robot

    NASA Astrophysics Data System (ADS)

    Básaca-Preciado, Luis C.; Sergiyenko, Oleg Yu.; Rodríguez-Quinonez, Julio C.; García, Xochitl; Tyrsa, Vera V.; Rivas-Lopez, Moises; Hernandez-Balbuena, Daniel; Mercorelli, Paolo; Podrygalo, Mikhail; Gurko, Alexander; Tabakova, Irina; Starostenko, Oleg

    2014-03-01

    In our current research, we are developing a practical autonomous mobile robot navigation system capable of performing obstacle avoidance in an unknown environment. In this paper, we therefore propose a robot navigation system that relies on a high-accuracy localization scheme based on dynamic triangulation. Our two main ideas are (1) the integration of two principal systems, a 3D laser scanning technical vision system (TVS) and a mobile robot (MR) navigation system, and (2) a novel MR navigation scheme that benefits from all the advantages of precise triangulation-based localization of obstacles, in contrast to most known camera-oriented vision systems. For practical use, mobile robots are required to continue their tasks safely and with high accuracy under temporary occlusion conditions. Prototype II of the TVS presented in this work is significantly improved over prototype I of our previous publications in terms of laser ray alignment, reduced parasitic torque, and reduced friction of moving parts. The kinematic model of the MR used in this work is designed for optimal data acquisition from the TVS, with the main goal of obtaining, in real time, the values required by the kinematic model of the MR while obstacles are being computed from the TVS data.
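
    The geometric core of such laser triangulation is planar two-angle triangulation over a known baseline; the sketch below shows this core calculation only, and the angle conventions and 3D geometry of the actual TVS prototype are more involved than this simplified example assumes.

      # Planar triangulation of a laser-illuminated point from two angles and a baseline.
      import math

      def triangulate_point(baseline, angle_emitter, angle_receiver):
          """baseline: distance between emitter and receiver along the x-axis (m);
          angles are measured from the baseline toward the target (rad).
          Returns (x, y) of the illuminated point in the emitter frame."""
          gamma = math.pi - angle_emitter - angle_receiver        # angle at the target
          if gamma <= 0:
              raise ValueError("rays do not intersect in front of the baseline")
          # law of sines: range from the emitter to the target
          r = baseline * math.sin(angle_receiver) / math.sin(gamma)
          return r * math.cos(angle_emitter), r * math.sin(angle_emitter)

      # example: 0.5 m baseline, 60 deg and 70 deg sighting angles
      print(triangulate_point(0.5, math.radians(60), math.radians(70)))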

  2. An enhanced dynamic Delaunay triangulation-based path planning algorithm for autonomous mobile robot navigation

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Luo, Chaomin; Krishnan, Mohan; Paulik, Mark; Tang, Yipeng

    2010-01-01

    An enhanced dynamic Delaunay Triangulation (DDT) based path planning approach is proposed for mobile robots to plan and navigate a path successfully in the context of the Autonomous Challenge of the Intelligent Ground Vehicle Competition (www.igvc.org). The Autonomous Challenge course requires the application of vision techniques since it involves path-based navigation in the presence of a tightly clustered obstacle field. Course artifacts such as switchbacks, ramps, dashed lane lines, traps, etc. are present, which could turn the robot around or cause it to exit the lane. The main contribution of this work is a navigation scheme based on DDT that is heuristically enhanced on the basis of a sense of general lane direction. The latter is computed through a "GPS (Global Positioning System) tail" vector obtained from the immediate path history of the robot. Using processed data from a LADAR, camera, compass and GPS unit, a composite local map containing both obstacles and lane line segments is built up and Delaunay Triangulation is continuously run to plan a path. This path is heuristically corrected, when necessary, by taking into account the "GPS tail". With the enhancement of the Delaunay Triangulation by using the "GPS tail", goal selection is successfully achieved in a majority of situations. The robot appears to follow a very stable path while navigating through switchbacks and dashed lane line situations. The proposed enhanced path planning and GPS tail technique has been successfully demonstrated in a Player/Stage simulation environment. In addition, tests on an actual course are very promising and reveal the potential for stable forward navigation.
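
    A hedged sketch of the core idea follows: triangulate the obstacle/lane point cloud and plan over the triangle adjacency graph (here with Dijkstra over centroid-to-centroid links). The "GPS tail" heuristic and lane-line handling from the paper are omitted, and the point set is assumed to be given in a local metric frame.

      # Planning over a Delaunay triangulation of obstacle points (sketch).
      import heapq
      import numpy as np
      from scipy.spatial import Delaunay

      def plan_over_triangulation(points, start_xy, goal_xy):
          points = np.asarray(points, dtype=float)
          tri = Delaunay(points)
          centroids = points[tri.simplices].mean(axis=1)            # one node per triangle
          start = int(tri.find_simplex(np.atleast_2d(start_xy))[0])
          goal = int(tri.find_simplex(np.atleast_2d(goal_xy))[0])
          if start == -1 or goal == -1:
              return None                                            # outside the triangulation
          dist, prev, heap = {start: 0.0}, {}, [(0.0, start)]
          while heap:
              d, node = heapq.heappop(heap)
              if node == goal:
                  break
              for nb in tri.neighbors[node]:
                  if nb == -1:
                      continue                                       # convex-hull boundary
                  nd = d + float(np.linalg.norm(centroids[node] - centroids[nb]))
                  if nd < dist.get(nb, float("inf")):
                      dist[nb], prev[nb] = nd, node
                      heapq.heappush(heap, (nd, nb))
          if goal not in dist:
              return None
          path, n = [goal], goal
          while n != start:
              n = prev[n]
              path.append(n)
          return [centroids[i] for i in reversed(path)]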

  3. Implementation of obstacle-avoidance control for an autonomous omni-directional mobile robot based on extension theory.

    PubMed

    Pai, Neng-Sheng; Hsieh, Hung-Hui; Lai, Yi-Chung

    2012-01-01

    The paper demonstrates a following robot with omni-directional wheels, which is able to take action to avoid obstacles. The robot design is based on both fuzzy and extension theory. Fuzzy theory was applied to tune the PWM signal controlling motor speed and to correct path deviations encountered while the robot is moving. Extension theory was used to build a robot obstacle-avoidance model. Various mobile models were developed to handle different types of obstacles. The ultrasonic distance sensors mounted on the robot were used to estimate the distance to obstacles. If an obstacle is encountered, the correlation function is evaluated and the robot avoids the obstacle autonomously using the most appropriate mode. The effectiveness of the proposed approach was verified through several tracking experiments, which demonstrates the feasibility of a fuzzy path tracker as well as the extensible collision avoidance system. PMID:23202029
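
    One common form of extension theory's elementary dependent (correlation) function, of the kind evaluated per avoidance mode above, is sketched below: a sensor reading is scored against a classical domain (the ideal range for a mode) and a larger neighborhood domain, and the mode with the highest correlation degree is selected. The domains and sensor values shown are illustrative assumptions, not the paper's matter-element model.

      # Elementary dependent (correlation) function sketch from extension theory.
      def extension_distance(x, lo, hi):
          # signed distance of x from the interval [lo, hi] (negative when inside)
          return abs(x - (lo + hi) / 2.0) - (hi - lo) / 2.0

      def correlation_degree(x, classical, neighborhood):
          a, b = classical
          c, d = neighborhood
          rho0 = extension_distance(x, a, b)
          rho1 = extension_distance(x, c, d)
          if rho1 == rho0:                      # degenerate case at a domain boundary
              return -rho0 - 1.0
          return rho0 / (rho1 - rho0)

      # e.g. score a front-sonar distance (m) against three hypothetical avoidance modes
      reading = 0.35
      modes = {"turn_left": ((0.2, 0.5), (0.0, 1.0)),
               "go_straight": ((0.8, 2.0), (0.5, 3.0)),
               "turn_right": ((0.0, 0.2), (0.0, 0.6))}
      best_mode = max(modes, key=lambda m: correlation_degree(reading, *modes[m]))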

  4. Implementation of Obstacle-Avoidance Control for an Autonomous Omni-Directional Mobile Robot Based on Extension Theory

    PubMed Central

    Pai, Neng-Sheng; Hsieh, Hung-Hui; Lai, Yi-Chung

    2012-01-01

    The paper demonstrates a following robot with omni-directional wheels, which is able to take action to avoid obstacles. The robot design is based on both fuzzy and extension theory. Fuzzy theory was applied to tune the PWM signal controlling motor speed and to correct path deviations encountered while the robot is moving. Extension theory was used to build a robot obstacle-avoidance model. Various mobile models were developed to handle different types of obstacles. The ultrasonic distance sensors mounted on the robot were used to estimate the distance to obstacles. If an obstacle is encountered, the correlation function is evaluated and the robot avoids the obstacle autonomously using the most appropriate mode. The effectiveness of the proposed approach was verified through several tracking experiments, which demonstrates the feasibility of a fuzzy path tracker as well as the extensible collision avoidance system. PMID:23202029

  5. Behaviorist-based control of an autonomous skid-steer robot using threshold fuzzy systems

    NASA Astrophysics Data System (ADS)

    Overholt, James L.; Cheok, K. C.; Smid, G. Edzko

    2001-09-01

    This paper describes a method of acquiring behaviorist-based reactive control strategies for an autonomous skid-steer robot operating in an unknown environment. First, a detailed interactive simulation of the robot (including simplified vehicle kinematics, sensors and a randomly generated environment) is developed with the capability of a human driver supplying all control actions. We then introduce a new modular, neural-fuzzy system called Threshold Fuzzy Systems (TFS). A TFS has two unique features that distinguish it from traditional fuzzy logic and neural network systems; (1) the rulebase of a TFS contains only single antecedent, single consequence rules, called a Behaviorist Fuzzy Rulebase (BFR) and (2) a highly structured adaptive node network, called a Rule Dominance Network (RDN), is added to the fuzzy logic inference engine. Each rule in the BFR is a direct mapping of an input sensor to a system output. Connection nodes in the RDN occur when rules in the BFR are conflicting. The nodes of the RDN contain functions that are used to suppress the output of other conflicting rules in the BFR. Supervised training, using error backpropagation, is used to find the optimal parameters of the dominance functions. The usefulness of the TFS approach becomes evident when examining an autonomous vehicle system (AVS). In this paper, a TFS controller is developed for a skid-steer AVS. Several hundred simulations are conducted and results for the AVS with a traditional fuzzy controller and with a TFS controller are compared.

  6. Tank-automotive robotics

    NASA Astrophysics Data System (ADS)

    Lane, Gerald R.

    1999-07-01

    This paper provides an overview of Tank-Automotive Robotics. The briefing contains program overviews, inter-relationships, and technology challenges of TARDEC-managed unmanned and robotic ground vehicle programs. Specific emphasis focuses on technology developments/approaches to achieve semi-autonomous operation and inherent chassis mobility features. Programs discussed include: Demo III Experimental Unmanned Vehicle (XUV), Tactical Mobile Robotics (TMR), Intelligent Mobility, Commanders Driver Testbed, Collision Avoidance, and the International Ground Robotics Competition (IGRC). Specifically, the paper discusses unique exterior/outdoor challenges facing the IGRC competing teams and the synergy created between the IGRC and ongoing DoD semi-autonomous Unmanned Ground Vehicle and DoT Intelligent Transportation System programs. Sensor and chassis approaches to meet the IGRC challenges and obstacles are shown and discussed. Shortfalls in performance to meet the IGRC challenges are identified.

  7. Demonstration of a Spoken Dialogue Interface for Planning Activities of a Semi-autonomous Robot

    NASA Technical Reports Server (NTRS)

    Dowding, John; Frank, Jeremy; Hockey, Beth Ann; Jonsson, Ari; Aist, Gregory

    2002-01-01

    Planning and scheduling in the face of uncertainty and change pushes the capabilities of both planning and dialogue technologies by requiring complex negotiation to arrive at a workable plan. Planning for use of semi-autonomous robots involves negotiation among multiple participants with competing scientific and engineering goals to co-construct a complex plan. In NASA applications, this plan construction is done under severe time pressure, so having a dialogue interface to the plan construction tools can aid rapid completion of the process. However, this will put significant demands on spoken dialogue technology, particularly in the areas of dialogue management and generation. The dialogue interface will need to be able to handle the complex dialogue strategies that occur in negotiation dialogues, including hypotheticals and revisions, and the generation component will require an ability to summarize complex plans. This demonstration describes work in progress toward building a spoken dialogue interface to the EUROPA planner for the purposes of planning and scheduling the activities of a semi-autonomous robot. A prototype interface has been built for planning the schedule of the Personal Satellite Assistant (PSA), a mobile robot designed for micro-gravity environments that is intended for use on the Space Shuttle and International Space Station. The spoken dialogue interface gives the user the capability to ask for a description of the plan, ask specific questions about the plan, and update or modify the plan. We anticipate that a spoken dialogue interface to the planner will provide a natural augmentation or alternative to the visualization interface, in situations in which the user needs very targeted information about the plan, in situations where natural language can express complex ideas more concisely than GUI actions, or in situations in which a graphical user interface is not appropriate.

  8. Robust Planning for Autonomous Navigation of Mobile Robots in Unstructured, Dynamic Environments: An LDRD Final Report

    SciTech Connect

    EISLER, G. RICHARD

    2002-08-01

    This report summarizes the analytical and experimental efforts for the Laboratory Directed Research and Development (LDRD) project entitled ''Robust Planning for Autonomous Navigation of Mobile Robots In Unstructured, Dynamic Environments (AutoNav)''. The project goal was to develop an algorithmic-driven, multi-spectral approach to point-to-point navigation characterized by: segmented on-board trajectory planning, self-contained operation without human support for mission duration, and the development of appropriate sensors and algorithms to navigate unattended. The project was partially successful in achieving gains in sensing, path planning, navigation, and guidance. One of three experimental platforms, the Minimalist Autonomous Testbed, used a repetitive sense-and-re-plan combination to demonstrate the majority of elements necessary for autonomous navigation. However, a critical goal for overall success in arbitrary terrain, that of developing a sensor that is able to distinguish true obstacles that need to be avoided as a function of vehicle scale, still needs substantial research to bring to fruition.

  9. Towards Autonomous Inspection of Space Systems Using Mobile Robotic Sensor Platforms

    NASA Technical Reports Server (NTRS)

    Wong, Edmond; Saad, Ashraf; Litt, Jonathan S.

    2007-01-01

    The space transportation systems required to support NASA's Exploration Initiative will demand a high degree of reliability to ensure mission success. This reliability can be realized through autonomous fault/damage detection and repair capabilities. It is crucial that such capabilities are incorporated into these systems since it will be impractical to rely upon Extra-Vehicular Activity (EVA), visual inspection or tele-operation due to the costly, labor-intensive and time-consuming nature of these methods. One approach to achieving this capability is through the use of an autonomous inspection system comprised of miniature mobile sensor platforms that will cooperatively perform high confidence inspection of space vehicles and habitats. This paper will discuss the efforts to develop a small scale demonstration test-bed to investigate the feasibility of using autonomous mobile sensor platforms to perform inspection operations. Progress will be discussed in technology areas including: the hardware implementation and demonstration of robotic sensor platforms, the implementation of a hardware test-bed facility, and the investigation of collaborative control algorithms.

  10. GNC architecture for autonomous robotic capture of a non-cooperative target: Preliminary concept design

    NASA Astrophysics Data System (ADS)

    Jankovic, Marko; Paul, Jan; Kirchner, Frank

    2016-04-01

    Recent studies of the space debris population in low Earth orbit (LEO) have concluded that certain regions have already reached a critical density of objects. This will eventually lead to a cascading process called the Kessler syndrome. The time may have come to seriously consider active debris removal (ADR) missions as the only viable way of preserving the space environment for future generations. Among all objects in the current environment, the SL-8 (Kosmos 3M second stages) rocket bodies (R/Bs) are some of the most suitable targets for future robotic ADR missions. However, to date, autonomous relative navigation to and capture of a non-cooperative target has never been performed. Therefore, there is a need for more advanced, autonomous and modular systems that can cope with uncontrolled, tumbling objects. The guidance, navigation and control (GNC) system is one of the most critical ones. The main objective of this paper is to present a preliminary concept of a modular GNC architecture that should enable a safe and fuel-efficient capture of a known but uncooperative target, such as Kosmos 3M R/B. In particular, the concept was developed with the most critical part of an ADR mission in mind, i.e., close-range proximity operations, and with state-of-the-art algorithms in the field of autonomous rendezvous and docking. Finally, a brief description is given of the hardware-in-the-loop (HIL) testing facility foreseen for the practical evaluation of the developed architecture.

  11. Autonomous charging to enable long-endurance missions for small aerial robots

    NASA Astrophysics Data System (ADS)

    Mulgaonkar, Yash; Kumar, Vijay

    2014-06-01

    The past decade has seen increased interest in research involving Autonomous Micro Aerial Vehicles (MAVs). The predominant reason for this is their agility and ability to perform tasks too difficult or dangerous for their human counterparts and to navigate into places where ground robots cannot reach. Among MAVs, rotary wing aircraft such as quadrotors have the ability to operate in confined spaces, hover at a given point in space, and perch or land on a flat surface. This makes the quadrotor a very attractive aerial platform, giving rise to a myriad of research opportunities. The potential of these aerial platforms is severely limited by the constraints on the flight time due to limited battery capacity. This in turn arises from limits on the payload of these rotorcraft. By automating the battery recharging process, creating autonomous MAVs that can recharge their on-board batteries without any human intervention and by employing a team of such agents, the overall mission time can be greatly increased. This paper describes the development, testing, and implementation of a system of autonomous charging stations for a team of Micro Aerial Vehicles. This system was used to perform fully autonomous long-term multi-agent aerial surveillance experiments with persistent station keeping. The scalability of the algorithm used in the experiments described in this paper was also tested by simulating a persistent surveillance scenario for 10 MAVs and charging stations. Finally, this system was successfully implemented to perform a 9½ hour multi-agent persistent flight test. Preliminary implementation of this charging system in experiments involving construction of cubic structures with quadrotors showed a three-fold increase in effective mission time.

  12. Using Simulation to Evaluate Scientific Impact of Autonomous Robotic Capabilities for Mars

    NASA Astrophysics Data System (ADS)

    Haldemann, A. F.; McHenry, M. C.; Castano, R. A.; Cameraon, J. M.; Estlin, T. A.; Farr, T. G.; Jain, A.; Lee, M.; Leff, C. E.; Lim, C.; Nesnas, I. A.; Petras, R. D.; Pomerantz, M.; Powell, M.; Shu, I.; Wood, J.; Volpe, R.; Gaines, D. M.

    2006-12-01

    The Science Operations On Planetary Surfaces (SOOPS) task was created with the goal of evaluating, developing and validating methods for increasing the productivity of science operations on planetary surfaces. The highly integrated spacecraft-instrument payload systems of planetary surface missions create operational constraints (e.g. power, data volume, number of ground control interactions) that can reduce the effective science capabilities. Technological solutions have been proposed to mitigate the impact of those constraints on science return. For example, enhanced mobility autonomy, robotic arm autonomous deployment, and on- board image analysis have been implemented on the Mars Exploration Rovers. Next generation improvements involve on-board science driven decision-making and data collection. SOOPS takes a systems level approach to science operations and thus to evaluating and demonstrating the potential benefits of technologies that are already in development at the `component level'. A simulation environment---"Field Test in a Box" or SOOPS-FTB---has been developed with realistic terrains and scientifically pertinent information content. The terrain can be explored with a simulated spacecraft and instruments that are operated using an activity planning software interface which closely resembles that used for actual surface spacecraft missions. The simulation environment provides flexibility and control over experiments that help answer "what if" questions about the performance of proposed autonomous technologies. The experiments also help evaluate operator interaction with the autonomous system, and improve the designs of the control tools. We will report the recent results of SOOPS-FTB experiments with an on-board feature mapping capability, which is effectively an autonomous compression scheme. This example illustrates a demonstration of a new software scheme to operate within a known hardware configuration. It is also conceivable that SOOPS-FTB could be

  13. A field robot for autonomous laser-based N2O flux measurements

    NASA Astrophysics Data System (ADS)

    Molstad, Lars; Reent Köster, Jan; Bakken, Lars; Dörsch, Peter; Lien, Torgrim; Overskeid, Øyvind; Utstumo, Trygve; Løvås, Daniel; Brevik, Anders

    2014-05-01

    N2O measurements in multi-plot field trials are usually carried out by chamber-based manual gas sampling and subsequent laboratory-based gas chromatographic N2O determination. Spatial and temporal resolution of these measurements are commonly limited by available manpower. However, high spatial and temporal variability of N2O fluxes within individual field plots can add large uncertainties to time- and area-integrated flux estimates. Detailed mapping of this variability would improve these estimates, as well as help our understanding of the factors causing N2O emissions. An autonomous field robot was developed to increase the sampling frequency and to operate outside normal working hours. The base of this system was designed as an open platform able to carry versatile instrumentation. It consists of an electrically motorized platform powered by a lithium-ion battery pack, which is capable of autonomous navigation by means of a combined high precision real-time kinematic (RTK) GPS and an inertial measurement unit (IMU) system. On this platform an elevator is mounted, carrying a lateral boom with a static chamber on each side of the robot. Each chamber is equipped with a frame of plastic foam to seal the chamber when lowered onto the ground by the elevator. N2O flux from the soil covered by the two chambers is sequentially determined by circulating air between each chamber and a laser spectrometer (DLT-100, Los Gatos Research, Mountain View, CA, USA), which monitors the increase in N2O concentration. The target enclosure time is 1 - 2 minutes, but may be longer when emissions are low. CO2 concentrations are determined by a CO2/H2O gas analyzer (LI-840A, LI-COR Inc., Lincoln, NE, USA). Air temperature and air pressure inside both chambers are continuously monitored and logged. Wind speed and direction are monitored by a 3D sonic anemometer on top of the elevator boom. This autonomous field robot can operate during day and night time, and its working hours are only
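
    The flux behind such chamber measurements is commonly derived from the initial linear rise of the gas mole fraction, scaled by chamber geometry and air density; the sketch below shows that standard static-chamber calculation with placeholder chamber dimensions and made-up data, not the instrument's actual parameters.

      # Static-chamber flux calculation sketch (placeholder geometry and data).
      import numpy as np

      R = 8.314  # J mol^-1 K^-1

      def chamber_flux(times_s, n2o_ppb, volume_m3, area_m2, temp_K, pressure_Pa):
          """Returns flux in nmol N2O m^-2 s^-1 from the enclosure time series."""
          slope_ppb_per_s = np.polyfit(times_s, n2o_ppb, 1)[0]       # linear fit of the rise
          air_molar_density = pressure_Pa / (R * temp_K)             # mol air per m^3 (ideal gas)
          # ppb/s -> nmol N2O per mol air per s, scaled by moles of air over the footprint
          return slope_ppb_per_s * air_molar_density * volume_m3 / area_m2

      # example: 80 s enclosure with the concentration rising about 0.5 ppb/s
      t = np.arange(0, 90, 10)
      c = 330 + 0.5 * t
      print(chamber_flux(t, c, volume_m3=0.015, area_m2=0.06, temp_K=288.0, pressure_Pa=101325.0))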

  14. Self-localization for an autonomous mobile robot based on an omni-directional vision system

    NASA Astrophysics Data System (ADS)

    Chiang, Shu-Yin; Lin, Kuang-Yu; Chia, Tsorng-Lin

    2013-12-01

    In this study, we designed an autonomous mobile robot based on the rules of the Federation of International Robotsoccer Association (FIRA) RoboSot category, integrating the techniques of computer vision, real-time image processing, dynamic target tracking, wireless communication, self-localization, motion control, path planning, and control strategy to achieve the contest goal. The self-localization scheme of the mobile robot is based on the algorithms featured in the images from its omni-directional vision system. In previous works, we used the image colors of the field goals as reference points, combining either dual-circle or trilateration positioning of the reference points to achieve self-localization of the autonomous mobile robot. However, because the image of the game field is easily affected by ambient light, positioning systems exclusively based on color model algorithms cause errors. To reduce environmental effects and achieve the self-localization of the robot, the proposed algorithm is applied in assessing the corners of field lines by using an omni-directional vision system. Particularly in the mid-size league of the RoboCup soccer competition, self-localization algorithms based on extracting white lines from the soccer field have become increasingly popular. Moreover, white lines are less influenced by light than is the color model of the goals. Therefore, we propose an algorithm that transforms the omni-directional image into an unwrapped transformed image, enhancing the extraction of features. The process is described as follows: First, radial scan lines were used to process omni-directional images, reducing the computational load and improving system efficiency. The lines were radially arranged around the center of the omni-directional camera image, resulting in a shorter computational time compared with the traditional Cartesian coordinate system. However, the omni-directional image is a distorted image, which makes it difficult to recognize the
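
    The unwrapping step described above can be sketched as sampling the omni-directional image along radial scan lines around the mirror center and stacking the samples into an angle-by-radius panorama in which field lines are easier to extract; the center, radius limits, and resolution below are assumed values for the example.

      # Omni-directional image unwrapping via radial sampling (sketch).
      import numpy as np

      def unwrap_omni(image, center, r_min, r_max, n_angles=720, n_radii=120):
          """image: (H, W) or (H, W, C) array; returns an (n_radii, n_angles) unwrapped image."""
          cy, cx = center
          angles = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
          radii = np.linspace(r_min, r_max, n_radii)
          rr, aa = np.meshgrid(radii, angles, indexing="ij")
          ys = np.clip((cy + rr * np.sin(aa)).astype(int), 0, image.shape[0] - 1)
          xs = np.clip((cx + rr * np.cos(aa)).astype(int), 0, image.shape[1] - 1)
          return image[ys, xs]           # nearest-neighbour sampling along the radial lines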

  15. An algorithm for image clusters detection and identification based on color for an autonomous mobile robot

    SciTech Connect

    Uy, D.L.

    1996-02-01

    An algorithm for detection and identification of image clusters or "blobs" based on color information for an autonomous mobile robot is developed. The input image data are first processed using a crisp color fuzzifier, a binary smoothing filter, and a median filter. The processed image data are then input to the image cluster detection and identification program. The program employs the concept of an "elastic rectangle" that stretches in such a way that the whole blob is finally enclosed in a rectangle. A C program was developed to test the algorithm. The algorithm was tested only on 8x8 image data with different numbers of blobs in them. The algorithm works very well in detecting and identifying image clusters.
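
    A hedged interpretation of the "elastic rectangle" idea is sketched below: starting from an unvisited foreground pixel, a rectangle repeatedly stretches any side that still touches foreground pixels just outside it until the blob is enclosed. The exact growth rule in the original program may differ from this interpretation.

      # Elastic-rectangle-style blob bounding (interpretive sketch).
      import numpy as np

      def elastic_rectangles(mask):
          """mask: (H, W) boolean image after color fuzzification and filtering.
          Returns a list of (top, bottom, left, right) bounds, one per detected blob."""
          mask = np.asarray(mask, dtype=bool)
          visited = np.zeros_like(mask)
          boxes = []
          for y, x in zip(*np.nonzero(mask)):
              if visited[y, x]:
                  continue
              top = bottom = y
              left = right = x
              grew = True
              while grew:
                  grew = False
                  if top > 0 and mask[top - 1, left:right + 1].any():
                      top -= 1; grew = True
                  if bottom < mask.shape[0] - 1 and mask[bottom + 1, left:right + 1].any():
                      bottom += 1; grew = True
                  if left > 0 and mask[top:bottom + 1, left - 1].any():
                      left -= 1; grew = True
                  if right < mask.shape[1] - 1 and mask[top:bottom + 1, right + 1].any():
                      right += 1; grew = True
              visited[top:bottom + 1, left:right + 1] = True
              boxes.append((top, bottom, left, right))
          return boxes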

  16. Autonomous trajectory generation for mobile robots with non-holonomic and steering angle constraints

    SciTech Connect

    Pin, F.G.; Vasseur, H.A.

    1990-01-01

    This paper presents an approach to the trajectory planning of mobile platforms characterized by non-holonomic constraints and constraints on the steering angle and steering angle rate. The approach is based on geometric reasoning and provides deterministic trajectories for all pairs of initial and final configurations (position x, y, and orientation θ) of the robot. Furthermore, the method generates trajectories taking into account the forward and reverse mode of motion of the vehicle, or combination of these when complex maneuvering is involved or when the environment is obstructed with obstacles. The trajectory planning algorithm is described, and examples of trajectories generated for a variety of environmental conditions are presented. The generation of the trajectories only takes a few milliseconds of run time on a MicroVAX, making the approach quite attractive for use as a real-time motion planner for teleoperated or sensor-based autonomous vehicles in complex environments. 10 refs., 11 figs.

  17. Multiresolutional schemata for unsupervised learning of autonomous robots for 3D space operation

    NASA Technical Reports Server (NTRS)

    Lacaze, Alberto; Meystel, Michael; Meystel, Alex

    1994-01-01

    This paper describes a novel approach to the development of a learning control system for autonomous space robot (ASR) which presents the ASR as a 'baby' -- that is, a system with no a priori knowledge of the world in which it operates, but with behavior acquisition techniques that allows it to build this knowledge from the experiences of actions within a particular environment (we will call it an Astro-baby). The learning techniques are rooted in the recursive algorithm for inductive generation of nested schemata molded from processes of early cognitive development in humans. The algorithm extracts data from the environment and by means of correlation and abduction, it creates schemata that are used for control. This system is robust enough to deal with a constantly changing environment because such changes provoke the creation of new schemata by generalizing from experiences, while still maintaining minimal computational complexity, thanks to the system's multiresolutional nature.

  18. Portable robot for autonomous venipuncture using 3D near infrared image guidance

    PubMed Central

    Chen, Alvin; Nikitczuk, Kevin; Nikitczuk, Jason; Maguire, Tim; Yarmush, Martin

    2015-01-01

    Venipuncture is pivotal to a wide range of clinical interventions and is consequently the leading cause of medical injury in the U.S. Complications associated with venipuncture are exacerbated in difficult settings, where the rate of success depends heavily on the patient's physiology and the practitioner's experience. In this paper, we describe a device that improves the accuracy and safety of the procedure by autonomously establishing a peripheral line for blood draws and IV's. The device combines a near-infrared imaging system, computer vision software, and a robotically driven needle within a portable shell. The device operates by imaging and mapping in real-time the 3D spatial coordinates of subcutaneous veins in order to direct the needle into a designated vein. We demonstrate proof of concept by assessing imaging performance in humans and cannulation accuracy on an advanced phlebotomy training model. PMID:26120592

  19. A Survey on Terrain Assessment Techniques for Autonomous Operation of Planetary Robots

    NASA Astrophysics Data System (ADS)

    Sancho-Pradel, D. L.; Gao, Y.

    A key challenge in autonomous planetary surface exploration is the extraction of meaningful information from sensor data, which would allow a good interpretation of the nearby terrain, and a reasonable assessment of more distant areas. In the last decade, the desire to increase the autonomy of unmanned ground vehicles (UGVs), particularly in terms of off-road navigation, has significantly increased the interest in the field of automated terrain classification. Although the field is relatively new, its advances and goals are scattered across different robotic platforms and applications. The objective of this paper is to present a survey of the field from a planetary exploration perspective, bringing together the underlying techniques, existing approaches and relevant applications under a common framework. The aim is to provide a comprehensive overview to the newcomer in the field, and a structured reference for the practitioners.

  20. A Monocular Vision Sensor-Based Obstacle Detection Algorithm for Autonomous Robots.

    PubMed

    Lee, Tae-Jae; Yi, Dong-Hoon; Cho, Dong-Il Dan

    2016-01-01

    This paper presents a monocular vision sensor-based obstacle detection algorithm for autonomous robots. Each individual image pixel at the bottom region of interest is labeled as belonging either to an obstacle or the floor. While conventional methods depend on point tracking for geometric cues for obstacle detection, the proposed algorithm uses the inverse perspective mapping (IPM) method. This method is much more advantageous when the camera is not high off the floor, which makes point tracking near the floor difficult. Markov random field-based obstacle segmentation is then performed using the IPM results and a floor appearance model. Next, the shortest distance between the robot and the obstacle is calculated. The algorithm is tested by applying it to 70 datasets, 20 of which include nonobstacle images where considerable changes in floor appearance occur. The obstacle segmentation accuracies and the distance estimation error are quantitatively analyzed. For obstacle datasets, the segmentation precision and the average distance estimation error of the proposed method are 81.4% and 1.6 cm, respectively, whereas those for a conventional method are 57.5% and 9.9 cm, respectively. For nonobstacle datasets, the proposed method gives 0.0% false positive rates, while the conventional method gives 17.6%. PMID:26938540
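
    A minimal illustration of the IPM step follows: with four image points whose ground-plane positions are known from calibration (the values below are placeholders), a homography maps the camera's view of the floor into a metric bird's-eye view, where objects that violate the ground-plane assumption appear distorted and can be segmented. The point correspondences, scale, and file name are hypothetical.

      # Inverse perspective mapping via a ground-plane homography (sketch).
      import cv2
      import numpy as np

      def inverse_perspective_map(image, image_pts, ground_pts_m, px_per_m=200, out_size=(400, 400)):
          """image_pts: 4x2 pixel coords of floor points; ground_pts_m: their (x, y)
          positions on the floor in metres relative to the robot."""
          dst = np.float32(ground_pts_m) * px_per_m
          dst[:, 0] += out_size[0] / 2.0                    # robot x = 0 at the image centre
          dst[:, 1] = out_size[1] - dst[:, 1]               # forward distance grows upward
          H = cv2.getPerspectiveTransform(np.float32(image_pts), dst)
          return cv2.warpPerspective(image, H, out_size)

      # hypothetical calibration: four floor markers seen at these pixel locations
      bird = inverse_perspective_map(
          cv2.imread("frame.png"),                          # placeholder input frame
          image_pts=[[220, 470], [420, 470], [380, 300], [260, 300]],
          ground_pts_m=[[-0.3, 0.5], [0.3, 0.5], [0.3, 1.5], [-0.3, 1.5]])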

  1. A Monocular Vision Sensor-Based Obstacle Detection Algorithm for Autonomous Robots

    PubMed Central

    Lee, Tae-Jae; Yi, Dong-Hoon; Cho, Dong-Il “Dan”

    2016-01-01

    This paper presents a monocular vision sensor-based obstacle detection algorithm for autonomous robots. Each individual image pixel at the bottom region of interest is labeled as belonging either to an obstacle or the floor. While conventional methods depend on point tracking for geometric cues for obstacle detection, the proposed algorithm uses the inverse perspective mapping (IPM) method. This method is much more advantageous when the camera is not high off the floor, which makes point tracking near the floor difficult. Markov random field-based obstacle segmentation is then performed using the IPM results and a floor appearance model. Next, the shortest distance between the robot and the obstacle is calculated. The algorithm is tested by applying it to 70 datasets, 20 of which include nonobstacle images where considerable changes in floor appearance occur. The obstacle segmentation accuracies and the distance estimation error are quantitatively analyzed. For obstacle datasets, the segmentation precision and the average distance estimation error of the proposed method are 81.4% and 1.6 cm, respectively, whereas those for a conventional method are 57.5% and 9.9 cm, respectively. For nonobstacle datasets, the proposed method gives 0.0% false positive rates, while the conventional method gives 17.6%. PMID:26938540

  2. Teaching and implementing autonomous robotic lab walkthroughs in a biotech laboratory through model-based visual tracking

    NASA Astrophysics Data System (ADS)

    Wojtczyk, Martin; Panin, Giorgio; Röder, Thorsten; Lenz, Claus; Nair, Suraj; Heidemann, Rüdiger; Goudar, Chetan; Knoll, Alois

    2010-01-01

    After more than 30 years of robot use in classic industrial automation applications, service robots form a constantly increasing market, although the big breakthrough is still awaited. Our approach to service robots was driven by the idea of supporting lab personnel in a biotechnology laboratory. After initial development in Germany, a mobile robot platform, extended with an industrial manipulator and the necessary sensors for indoor localization and object manipulation, was shipped to Bayer HealthCare in Berkeley, CA, USA, a global player in the sector of biopharmaceutical products located in the San Francisco Bay Area. The goal of the mobile manipulator is to support the off-shift staff by carrying out completely autonomous or guided, remote-controlled lab walkthroughs, which we implement utilizing a recent development of our computer vision group: OpenTL - an integrated framework for model-based visual tracking.

  3. On the design of neuro-controllers for individual and social learning behaviour in autonomous robots: an evolutionary approach

    NASA Astrophysics Data System (ADS)

    Pini, Giovanni; Tuci, Elio

    2008-06-01

    In biology/psychology, the capability of natural organisms to learn from observation of and interaction with conspecifics is referred to as social learning. Roboticists have recently developed an interest in social learning, since it might represent an effective strategy to enhance the adaptivity of a team of autonomous robots. In this study, we show that a methodological approach based on artificial neural networks shaped by evolutionary computation techniques can be successfully employed to synthesise the individual and social learning mechanisms for robots required to learn a desired action (i.e. phototaxis or antiphototaxis).

  4. Biomimetic evolutionary analysis: testing the adaptive value of vertebrate tail stiffness in autonomous swimming robots.

    PubMed

    Long, J H; Koob, T J; Irving, K; Combie, K; Engel, V; Livingston, N; Lammert, A; Schumacher, J

    2006-12-01

    For early vertebrates, a long-standing hypothesis is that vertebrae evolved as a locomotor adaptation, stiffening the body axis and enhancing swimming performance. While supported by biomechanical data, this hypothesis has not been tested using an evolutionary approach. We did so by extending biomimetic evolutionary analysis (BEA), which builds physical simulations of extinct systems, to include use of autonomous robots as proxies of early vertebrates competing in a forage navigation task. Modeled after free-swimming larvae of sea squirts (Chordata, Urochordata), three robotic tadpoles ('Tadros'), each with a propulsive tail bearing a biomimetic notochord of variable spring stiffness, k (N m^-1), searched for, oriented to, and orbited in two dimensions around a light source. Within each of ten generations, we selected for increased swimming speed, U (m s^-1), and decreased time to the light source, t (s), average distance from the source, R (m), and wobble maneuvering, W (rad s^-2). In software simulation, we coded two quantitative trait loci (QTL) that determine k: bending modulus, E (N m^-2), and length, L (m). Both QTL were mutated during replication, independently assorted during meiosis and, as haploid gametes, entered into the gene pool in proportion to parental fitness. After random mating created three new diploid genotypes, we fabricated three new offspring tails. In the presence of both selection and chance events (mutation, genetic drift), the phenotypic means of this small population evolved. The classic hypothesis was supported in that k was positively correlated (r^2 = 0.40) with navigational prowess, NP, the dimensionless ratio of U to the product of R, t and W. However, the plausible adaptive scenario, even in this simplified system, is more complex, since the remaining variance in NP was correlated with the residuals of R and U taken with respect to k, suggesting that changes in k alone are insufficient to explain the evolution of NP. PMID:17114406

  5. Intelligent behavior generator for autonomous mobile robots using planning-based AI decision making and supervisory control logic

    NASA Astrophysics Data System (ADS)

    Shah, Hitesh K.; Bahl, Vikas; Martin, Jason; Flann, Nicholas S.; Moore, Kevin L.

    2002-07-01

    In earlier research, the Center for Self-Organizing and Intelligent Systems (CSOIS) at Utah State University (USU) has been funded by the US Army Tank-Automotive and Armaments Command's (TACOM) Intelligent Mobility Program to develop and demonstrate enhanced mobility concepts for unmanned ground vehicles (UGVs). One of the several outgrowths of this work has been the development of a grammar-based approach to intelligent behavior generation for commanding autonomous robotic vehicles. In this paper we describe the use of this grammar for enabling autonomous behaviors. A supervisory task controller (STC) sequences high-level action commands (taken from the grammar) to be executed by the robot. It takes as input a set of goals and a partial (static) map of the environment and produces, from the grammar, a flexible script (or sequence) of the high-level commands that are to be executed by the robot. The sequence is derived by a planning function that uses a graph-based heuristic search (A* algorithm). Each action command has specific exit conditions that are evaluated by the STC following each task completion or interruption (in the case of disturbances or new operator requests). Depending on the system's state at task completion or interruption (including updated environmental and robot sensor information), the STC invokes a reactive response. This can include sequencing the pending tasks or initiating a re-planning event, if necessary. Though applicable to a wide variety of autonomous robots, an application of this approach is demonstrated via simulations of ODIS, an omni-directional inspection system developed for security applications.

  6. Remote Sensing of Radiation Dose Rate by a Robot for Outdoor Usage

    NASA Astrophysics Data System (ADS)

    Kobayashi, T.; Doi, K.; Kanematsu, H.; Utsumi, Y.; Hashimoto, R.; Takashina, T.

    2013-04-01

    In the present paper, the design and prototyping of a telemetry system, in which a GPS receiver, camera, and scintillation counter were mounted on a crawler-type traveling vehicle, were conducted for outdoor use such as school playgrounds. The results were as follows: (1) It was confirmed that the crawler-type traveling vehicle can be operated smoothly on school grounds of brick and asphalt (running speed: 17 [m/min]). (2) It was confirmed that the location information captured by GPS can be displayed on a Google map, and that the recorded video can also be played back. (3) A radiation dose rate of 0.09 [μSv/h] was obtained on the ground. This value is less than 1/40 of the allowable radiation dose rate for children in Fukushima Prefecture ([3.8 μSv/h]). (4) As further work, modification for programmed traveling, measurement of the distribution of the radiation dose rate at a school in Fukushima Prefecture, and class delivery on radiation measurement will be carried out.

  7. Adjustably Autonomous Multi-agent Plan Execution with an Internal Spacecraft Free-Flying Robot Prototype

    NASA Technical Reports Server (NTRS)

    Dorais, Gregory A.; Nicewarner, Keith

    2006-01-01

    We present a multi-agent model-based autonomy architecture with monitoring, planning, diagnosis, and execution elements. We discuss an internal spacecraft free-flying robot prototype controlled by an implementation of this architecture and a ground test facility used for development. In addition, we discuss a simplified environmental control and life support system for the spacecraft domain, also controlled by an implementation of this architecture. We discuss adjustable autonomy and how it applies to this architecture. We describe an interface that provides the user with situation awareness of both autonomous systems and enables the user to dynamically edit the plans prior to and during execution as well as control these agents at various levels of autonomy. This interface also permits the agents to query the user or request the user to perform tasks to help achieve the commanded goals. We conclude by describing a scenario where these two agents and a human interact to cooperatively detect, diagnose and recover from a simulated spacecraft fault.

  8. Autonomous Scheduling of the 1.3-meter Robotically Controlled Telescope (RCT)

    NASA Astrophysics Data System (ADS)

    Strolger, Louis-Gregory; Gelderman, Richard; Carini, Michael T.; Davis, Donald R.; Engle, Scott G.; Guinan, Edward F.; McGruder, Charles H., III; Tedesco, Edward F.; Walter, Donald K.

    2011-03-01

    The 1.3-meter telescope at Kitt Peak operates as a fully robotic instrument for optical imaging. An autonomous scheduling algorithm is an essential component of this observatory, and has been designed to manage numerous requests in various imaging modes in a manner similar to how requests are managed at queue-scheduled observatories, but with greater efficiency. Built from the INSGEN list generator and process spawner originally developed for the Berkeley Automatic Imaging Telescope, the RCT scheduler manages and integrates multi-user observations in real time, according to target and exposure information and program-specific constraints (e.g., user-assigned priority, moon avoidance, airmass, or temporal constraints), while accounting for instrument limitations, meteorological conditions, and other technical constraints. The robust system supports time-critical requests, such as coordinated observations, while also providing short-term (hours) and long-term (days) monitoring capabilities, and one-off observations. We discuss the RCT scheduler, its current decision tree, and future prospects, including integration with active partner-share monitoring (which factors into future observation requests) to ensure fairness and parity of requests.
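
    A hedged sketch of the kind of constraint filtering and priority scoring such a queue scheduler applies when choosing the next target follows; it is not the RCT/INSGEN code, and all field names, weights, and constraints are illustrative assumptions.

```python
# Hypothetical sketch of constraint filtering and priority scoring for a queue scheduler;
# not the actual RCT/INSGEN scheduler. Field names and weights are illustrative assumptions.
def select_next_target(requests, airmass_of, moon_sep_of, now):
    """Pick the observable request with the best (lowest) score, or None."""
    best, best_score = None, float("inf")
    for r in requests:
        if not (r["window_start"] <= now <= r["window_end"]):
            continue                                    # temporal constraint
        if airmass_of(r) > r["max_airmass"]:
            continue                                    # airmass constraint
        if moon_sep_of(r) < r["min_moon_sep_deg"]:
            continue                                    # moon-avoidance constraint
        score = r["user_priority"] + 0.5 * airmass_of(r)  # lower is better
        if score < best_score:
            best, best_score = r, score
    return best
```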

  9. CLIPS implementation of a knowledge-based distributed control of an autonomous mobile robot

    NASA Astrophysics Data System (ADS)

    Bou-Ghannam, Akram A.; Doty, Keith L.

    1991-03-01

    We implement an architecture for the planning and control of an intelligent autonomous mobile robot which consists of concurrently running modules forming a hierarchy of control, in which lower-level modules perform 'reflexive' tasks while higher-level modules perform tasks requiring greater processing of sensor data. A knowledge-based system performs the task planning and arbitration of lower-level behaviors. This system reasons about behavior selection (fusion) based on its current knowledge and the situation at hand, provided by monitoring the status of lower-level behaviors and the map builder. We implement this knowledge-based planning module in CLIPS (C Language Integrated Production System), a rule-based expert system shell. CLIPS is written in and fully integrated with the C language, providing high portability and ease of integration with external systems. We discuss implementation issues, including the implementation of the control strategy in CLIPS rules and interfacing to other modules through the use of CLIPS user-defined external functions.
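
    The arbitration logic the paper encodes as CLIPS production rules can be illustrated, very roughly, by ordered condition-action rules such as the sketch below. This is Python rather than CLIPS syntax and is not the authors' rule base; the facts and behavior names are assumptions.

```python
# Hedged sketch of behavior arbitration as simple production rules; the paper uses
# CLIPS rules, so this Python equivalent is illustrative only.
def arbitrate(status):
    """status: dict of facts reported by lower-level behaviors and the map builder."""
    rules = [
        (lambda s: s.get("obstacle_near"), "avoid_obstacle"),
        (lambda s: s.get("battery_low"),   "return_to_charger"),
        (lambda s: s.get("goal_visible"),  "move_to_goal"),
        (lambda s: True,                   "wander"),          # default behavior
    ]
    for condition, behavior in rules:
        if condition(status):
            return behavior

print(arbitrate({"obstacle_near": False, "goal_visible": True}))  # move_to_goal
```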

  10. The VIPER project (Visualization Integration Platform for Exploration Research): a biologically inspired autonomous reconfigurable robotic platform for diverse unstructured environments

    NASA Astrophysics Data System (ADS)

    Schubert, Oliver J.; Tolle, Charles R.

    2004-09-01

    Over the last decade the world has seen numerous autonomous vehicle programs. Wheel and track designs are the basis for many of these vehicles. This is primarily due to four main reasons: a vast preexisting knowledge base for these designs, energy efficiency of power sources, scalability of actuators, and the lack of control systems technologies for handling alternate, highly complex distributed systems. Though large efforts seek to improve the mobility of these vehicles, many limitations still exist for these systems within unstructured environments, e.g. limited mobility within industrial and nuclear accident sites where existing plant configurations have been extensively changed. These unstructured operational environments include missions for exploration, reconnaissance, and emergency recovery of objects within reconfigured or collapsed structures, e.g. bombed buildings. More importantly, these environments present a clear and present danger for direct human interactions during the initial phases of recovery operations. Clearly, the current classes of autonomous vehicles are incapable of performing in these environments. Thus the next generation of designs must include highly reconfigurable and flexible autonomous robotic platforms. This new breed of autonomous vehicles will be both highly flexible and environmentally adaptable. Presented in this paper is one of the most successful designs from nature, the snake-eel-worm (SEW). This design implements shape memory alloy (SMA) actuators, which allow for scaling of the robotic SEW designs from sub-micron scale to heavy industrial implementations without major conceptual redesigns as required in traditional hydraulic, pneumatic, or motor-driven systems. Autonomous vehicles based on the SEW design possess the ability to easily move between air-based environments and fluid-based environments with limited or no reconfiguration. Under a SEW designed vehicle, one not only achieves vastly improved maneuverability within a

  11. Control Algorithms and Simulated Environment Developed and Tested for Multiagent Robotics for Autonomous Inspection of Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Wong, Edmond

    2005-01-01

    The NASA Glenn Research Center and academic partners are developing advanced multiagent robotic control algorithms that will enable the autonomous inspection and repair of future propulsion systems. In this application, on-wing engine inspections will be performed autonomously by large groups of cooperative miniature robots that will traverse the surfaces of engine components to search for damage. The eventual goal is to replace manual engine inspections that require expensive and time-consuming full engine teardowns and allow the early detection of problems that would otherwise result in catastrophic component failures. As a preliminary step toward the long-term realization of a practical working system, researchers are developing the technology to implement a proof-of-concept testbed demonstration. In a multiagent system, the individual agents are generally programmed with relatively simple controllers that define a limited set of behaviors. However, these behaviors are designed in such a way that, through the localized interaction among individual agents and between the agents and the environment, they result in self-organized, emergent group behavior that can solve a given complex problem, such as cooperative inspection. One advantage to the multiagent approach is that it allows for robustness and fault tolerance through redundancy in task handling. In addition, the relatively simple agent controllers demand minimal computational capability, which in turn allows for greater miniaturization of the robotic agents.
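
    The emergent, cooperative coverage described above can be caricatured with very simple local rules, as in the toy below: each agent inspects its cell and prefers to move to a neighboring cell no agent has visited. The grid, agent count, and rules are assumptions, not the Glenn Research Center controllers.

```python
# Toy illustration of simple local rules producing cooperative coverage of a surface;
# not the actual multiagent controllers. Grid size, agents, and rules are assumptions.
import random

def step(agents, visited, width, height):
    for i, (x, y) in enumerate(agents):
        visited.add((x, y))                                   # "inspect" the current cell
        moves = [(x + dx, y + dy) for dx, dy in [(1, 0), (-1, 0), (0, 1), (0, -1)]
                 if 0 <= x + dx < width and 0 <= y + dy < height]
        unvisited = [m for m in moves if m not in visited]
        agents[i] = random.choice(unvisited or moves)         # prefer uninspected cells

agents, visited = [(0, 0), (9, 9), (0, 9)], set()
for _ in range(200):
    step(agents, visited, 10, 10)
print(f"coverage: {len(visited)} of 100 cells")
```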

  12. Assessing the Impact of an Autonomous Robotics Competition for STEM Education

    ERIC Educational Resources Information Center

    Chung, C. J. ChanJin; Cartwright, Christopher; Cole, Matthew

    2014-01-01

    Robotics competitions for K-12 students are popular, but are students really learning and improving their STEM scores through robotics competitions? If not, why not? If they are, how much more effective is learning through competitions than traditional classes? Is there room for improvement? What is the best robotics competition model to maximize…

  13. Current challenges in autonomous vehicle development

    NASA Astrophysics Data System (ADS)

    Connelly, J.; Hong, W. S.; Mahoney, R. B., Jr.; Sparrow, D. A.

    2006-05-01

    The field of autonomous vehicles is a rapidly growing one, with significant interest from both government and industry sectors. Autonomous vehicles represent the intersection of artificial intelligence (AI) and robotics, combining decision-making with real-time control. Autonomous vehicles are desired for use in search and rescue, urban reconnaissance, mine detonation, supply convoys, and more. The general adage is to use robots for anything dull, dirty, dangerous or dumb. While a great deal of research has been done on autonomous systems, there are only a handful of fielded examples incorporating machine autonomy beyond the level of teleoperation, especially in outdoor/complex environments. In an attempt to assess and understand the current state of the art in autonomous vehicle development, a few areas where unsolved problems remain became clear. This paper outlines those areas and provides suggestions for the focus of science and technology research. The first step in evaluating the current state of autonomous vehicle development was to develop a definition of autonomy. A number of autonomy level classification systems were reviewed. The resulting working definitions and classification schemes used by the authors are summarized in the opening sections of the paper. The remainder of the report discusses current approaches and challenges in decision-making and real-time control for autonomous vehicles. Suggested research focus areas for near-, mid-, and long-term development are also presented.

  14. Autonomous mobile robot exploration based on the generalized Voronoi graph in the presence of localization error

    NASA Astrophysics Data System (ADS)

    Nagatani, Keiji; Choset, Howie M.

    1999-01-01

    Sensor-based exploration is a task which enables a robot to explore and map an unknown environment using sensor information. The map used in this paper is the generalized Voronoi graph (GVG). The robot explores an unknown environment using an already developed incremental construction procedure to generate the GVG from sensor information. This paper presents some initial results which use the GVG for robot localization, while mitigating the need to update encoder values. Experimental results verify the described work.

  15. Operator-centered control of a semi-autonomous industrial robot

    SciTech Connect

    Spelt, P.F.; Jones, S.L.

    1994-12-31

    This paper presents work done by Oak Ridge National Laboratory and Remotec, Inc., to develop a new operator-centered control system for Remotec's Andros telerobot. Andros robots are presently used by numerous electric utilities, the armed forces, and law enforcement agencies to perform tasks which are hazardous for human operators. This project has automated task components and enhanced the video graphics display of the robot's position in the environment to significantly reduce operator workload. The procedure of automating a telerobot requires the addition of computer power to the robot, along with a variety of sensors and encoders to provide information about the robot's performance in, and relationship to, its environment. The resulting vehicle serves as a platform for research on strategies to integrate automated tasks with those performed by a human operator. The addition of these capabilities will greatly enhance the safety and efficiency of performance in hazardous environments.

  16. Design, Development and Testing of the Miniature Autonomous Extravehicular Robotic Camera (Mini AERCam) Guidance, Navigation and Control System

    NASA Technical Reports Server (NTRS)

    Wagenknecht, J.; Fredrickson, S.; Manning, T.; Jones, B.

    2003-01-01

    Engineers at NASA Johnson Space Center have designed, developed, and tested a nanosatellite-class free-flyer intended for future external inspection and remote viewing of human spaceflight activities. The technology demonstration system, known as the Miniature Autonomous Extravehicular Robotic Camera (Mini AERCam), has been integrated into the approximate form and function of a flight system. The primary focus has been to develop a system capable of providing external views of the International Space Station. The Mini AERCam system is spherical-shaped and less than eight inches in diameter. It has a full suite of guidance, navigation, and control hardware and software, and is equipped with two digital video cameras and a high resolution still image camera. The vehicle is designed for either remotely piloted operations or supervised autonomous operations. Tests have been performed in both a six degree-of-freedom closed-loop orbital simulation and on an air-bearing table. The Mini AERCam system can also be used as a test platform for evaluating algorithms and relative navigation for autonomous proximity operations and docking around the Space Shuttle Orbiter or the ISS.

  17. The Embudito Mission: A Case Study of the Systematics of Autonomous Ground Mobile Robots

    SciTech Connect

    EICKER,PATRICK J.

    2001-02-01

    Ground mobile robots are much in the mind of defense planners at this time, being considered for a significant variety of missions with a diversity ranging from logistics supply to reconnaissance and surveillance. While there has been a very large amount of basic research funded in the last quarter century devoted to mobile robots and their supporting component technologies, little of this science base has been fully developed and deployed--notable exceptions being NASA's Mars rover and several terrestrial derivatives. The material in this paper was developed as a first exemplary step in the development of a more systematic approach to the R&D of ground mobile robots.

  18. Multi-sensor integration for autonomous robots in nuclear power plants

    SciTech Connect

    Mann, R.C.; Jones, J.P.; Beckerman, M.; Glover, C.W.; Farkas, L.; Bilbro, G.L.; Snyder, W.

    1989-01-01

    As part of a concerted R&D program in advanced robotics for hazardous environments, scientists and engineers at the Oak Ridge National Laboratory (ORNL) are performing research in the areas of systems integration, range-sensor-based 3-D world modeling, and multi-sensor integration. This program features a unique teaming arrangement that involves the universities of Florida, Michigan, Tennessee, and Texas; Odetics Corporation; and ORNL. This paper summarizes work directed at integrating information extracted from data collected with range sensors and CCD cameras on-board a mobile robot, in order to produce reliable descriptions of the robot's environment. Specifically, the paper describes the integration of two-dimensional vision and sonar range information, and an approach to integrate registered luminance and laser range images. All operations are carried out on-board the mobile robot using a 16-processor hypercube computer. 14 refs., 4 figs.

  19. DC Motor Drive for Small Autonomous Robots with Educational and Research Purpose

    NASA Astrophysics Data System (ADS)

    Krklješ, Damir; Babković, Kalman; Nagy, László; Borovac, Branislav; Nikolić, Milan

    Many student robot competitions have been established during the last decade. One of them, and the most popular in Europe, is the European competition EUROBOT. The basic aim of this competition is to promote robotics among young people, mostly students and high school pupils. An additional outcome of the competition is the development of faculty curricula based on the competition. Such a curriculum has been developed at the Faculty of Technical Sciences in Novi Sad. The curriculum duration is two semesters. During the first semester the theoretical basis is presented to the students. During the second semester the students, divided into teams of three to five, develop the robots that will take part in the upcoming EUROBOT competition. Since the time for robot development is short, a basic electronic kit is provided for the students. The basic parts of the kit are two DC motor drives dedicated to the robot locomotion. The drives will also be used in research concerning the multi-segment robot foot. This paper presents the DC motor drive and its features. Experimental results concerning speed and position regulation, as well as current limiting, are also presented.
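
    As a rough illustration of the speed regulation and current limiting such a drive performs, the sketch below runs a PI speed loop whose output current command is clamped. The gains, limits, and crude first-order motor model are assumptions, not the kit's firmware.

```python
# Hedged sketch of a PI speed regulator with current limiting, of the kind such a
# drive might implement; gains, limits, and the plant model are illustrative assumptions.
def pi_speed_step(setpoint, measured, state, kp=0.8, ki=2.0, dt=0.001, i_max=5.0):
    """Return a current command limited to +/- i_max amps; state holds the integrator."""
    error = setpoint - measured
    state["integral"] += error * dt
    i_cmd = kp * error + ki * state["integral"]
    i_cmd = max(-i_max, min(i_max, i_cmd))        # current limiting
    if abs(i_cmd) >= i_max:                       # simple anti-windup clamp
        state["integral"] -= error * dt
    return i_cmd

state, speed = {"integral": 0.0}, 0.0
for _ in range(1000):                             # crude first-order motor model, 1 ms steps
    i_cmd = pi_speed_step(100.0, speed, state)    # target 100 rad/s
    speed += (2.0 * i_cmd - 0.05 * speed) * 0.001
print(round(speed, 1))                            # speed ramps up under the current limit
```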

  20. Robotics.

    ERIC Educational Resources Information Center

    Waddell, Steve; Doty, Keith L.

    1999-01-01

    "Why Teach Robotics?" (Waddell) suggests that the United States lags behind Europe and Japan in use of robotics in industry and teaching. "Creating a Course in Mobile Robotics" (Doty) outlines course elements of the Intelligent Machines Design Lab. (SK)

  1. Information-driven self-organization: the dynamical system approach to autonomous robot behavior.

    PubMed

    Ay, Nihat; Bernigau, Holger; Der, Ralf; Prokopenko, Mikhail

    2012-09-01

    In recent years, information theory has come into the focus of researchers interested in the sensorimotor dynamics of both robots and living beings. One root of these approaches is the idea that living beings are information processing systems and that the optimization of these processes should be an evolutionary advantage. Apart from these more fundamental questions, there has recently been much interest in the question of how a robot can be equipped with an internal drive for innovation or curiosity that may serve as a drive for an open-ended, self-determined development of the robot. The success of these approaches depends essentially on the choice of a convenient measure of the information. This article studies in some detail the use of the predictive information (PI), also called excess entropy or effective measure complexity, of the sensorimotor process. The PI of a process quantifies the total information of past experience that can be used for predicting future events. However, the application of information-theoretic measures in robotics is mostly restricted to the case of a finite, discrete state-action space. This article aims at applying the PI in the dynamical systems approach to robot control. We study linear systems as a first step and derive exact results for the PI together with explicit learning rules for the parameters of the controller. Interestingly, these learning rules are of Hebbian nature and local in the sense that the synaptic update is given by the product of activities available directly at the pertinent synaptic ports. The general findings are exemplified by a number of case studies. In particular, in a two-dimensional system, designed to mimic embodied systems with latent oscillatory locomotion patterns, it is shown that maximizing the PI means to recognize and amplify the latent modes of the robotic system. This and many other examples show that the learning rules derived from the maximum PI principle are a versatile tool for the self

  2. Learning for Autonomous Navigation

    NASA Technical Reports Server (NTRS)

    Angelova, Anelia; Howard, Andrew; Matthies, Larry; Tang, Benyang; Turmon, Michael; Mjolsness, Eric

    2005-01-01

    Robotic ground vehicles for outdoor applications have achieved some remarkable successes, notably in autonomous highway following (Dickmanns, 1987), planetary exploration (1), and off-road navigation on Earth (1). Nevertheless, major challenges remain to enable reliable, high-speed, autonomous navigation in a wide variety of complex, off-road terrain. 3-D perception of terrain geometry with imaging range sensors is the mainstay of off-road driving systems. However, the stopping distance at high speed exceeds the effective lookahead distance of existing range sensors. Prospects for extending the range of 3-D sensors are strongly limited by sensor physics, eye safety of lasers, and related issues. Range sensor limitations also allow vehicles to enter large cul-de-sacs even at low speed, leading to long detours. Moreover, sensing only terrain geometry fails to reveal mechanical properties of terrain that are critical to assessing its traversability, such as potential for slippage, sinkage, and the degree of compliance of potential obstacles. Rovers in the Mars Exploration Rover (MER) mission have become stuck in sand dunes and experienced significant downhill slippage in the vicinity of large rock hazards. Earth-based off-road robots today have very limited ability to discriminate traversable vegetation from non-traversable vegetation or rough ground. It is impossible today to preprogram a system with knowledge of these properties for all types of terrain and weather conditions that might be encountered.

  3. Experiments in autonomous navigation and control of multi-manipulator, free-flying space robots

    NASA Astrophysics Data System (ADS)

    Ullman, Marc Albert

    Although space presents an exciting frontier for science and manufacturing, it has proven to be a costly and dangerous place for humans. It is an ideal environment for sophisticated robots capable of performing tasks that currently require the active participation of astronauts. The Aerospace Robotics Laboratory, working with NASA, has developed an experimental model of a multimanipulator, free-flying space robot capable of capturing and manipulating free-floating objects without human assistance. The experimental robot model uses air-cushion technology to simulate, in two dimensions, the drag-free, zero-g characteristics of space. Fully self-contained, the vehicle/manipulator system is equipped with gas-jet thrusters, two two-link manipulators, an electrical power system, digital and analog I/O capabilities, high speed vision, and a multiprocessor real-time computer. These subsystems have been carefully integrated in a modular architecture that facilitates maintenance and ease of use. A sophisticated control system was designed and implemented to manage and coordinate the actions of the vehicle/manipulator system. A custom on-board vision system is used for closed-loop endpoint control and object tracking in the robot's local reference frame. A multicamera off-board vision system provides global positioning information to the robot via a wireless communication link. Successful rendezvous, tracking, and capture of free-flying, spinning objects are facilitated by simultaneously controlling the robot base position and manipulator motions. These actions are coordinated by a sophisticated event-driven finite-state machine. A graphical user interface enables a remotely situated operator to provide high-level task description commands to the robot and to monitor the robot's activities while it carries out these assignments. The user interface allows a task to be fully specified before any action takes place, thereby eliminating problems associated with communications

  4. Creative Engineering Based Education with Autonomous Robots Considering Job Search Support

    NASA Astrophysics Data System (ADS)

    Takezawa, Satoshi; Nagamatsu, Masao; Takashima, Akihiko; Nakamura, Kaeko; Ohtake, Hideo; Yoshida, Kanou

    The Robotics Course in our Mechanical Systems Engineering Department offers “Robotics Exercise Lessons” as one of its Problem-Solution Based Specialized Subjects. This is intended to motivate students' learning and to help them acquire fundamental knowledge and skills in mechanical engineering and improve their understanding of Robotics Basic Theory. Our current curriculum was established to accomplish this objective based on two pieces of research in 2005: an evaluation questionnaire on the education of our Mechanical Systems Engineering Department for graduates and a survey on the kind of human resources which companies are seeking and their expectations for our department. This paper reports the academic results and reflections of job search support in recent years as inherited and developed from the previous curriculum.

  5. Ground Simulation of an Autonomous Satellite Rendezvous and Tracking System Using Dual Robotic Systems

    NASA Technical Reports Server (NTRS)

    Trube, Matthew J.; Hyslop, Andrew M.; Carignan, Craig R.; Easley, Joseph W.

    2012-01-01

    A hardware-in-the-loop ground system was developed for simulating a robotic servicer spacecraft tracking a target satellite at short range. A relative navigation sensor package "Argon" is mounted on the end-effector of a Fanuc 430 manipulator, which functions as the base platform of the robotic spacecraft servicer. Machine vision algorithms estimate the pose of the target spacecraft, mounted on a Rotopod R-2000 platform, relay the solution to a simulation of the servicer spacecraft running in "Freespace", which performs guidance, navigation and control functions, integrates dynamics, and issues motion commands to a Fanuc platform controller so that it tracks the simulated servicer spacecraft. Results will be reviewed for several satellite motion scenarios at different ranges. Key words: robotics, satellite, servicing, guidance, navigation, tracking, control, docking.

  6. Behavior generation strategy of artificial behavioral system by self-learning paradigm for autonomous robot tasks

    NASA Astrophysics Data System (ADS)

    Dağlarli, Evren; Temeltaş, Hakan

    2008-04-01

    In this study, behavior generation and self-learning paradigms are investigated for the real-time applications of multi-goal mobile robot tasks. The method is capable of generating new behaviors and combining them in order to achieve multi-goal tasks. The proposed method is composed of three layers: a Behavior Generating Module, a Coordination Level, and an Emotion-Motivation Level. The last two levels use hidden Markov models to manage the dynamical structure of behaviors. The kinematic and dynamic models of the mobile robot with non-holonomic constraints are considered in the behavior-based control architecture. The proposed method is tested on a four-wheel-driven and four-wheel-steered mobile robot with constraints in a simulation environment, and results are obtained successfully.

  7. Generating Self-Reliant Teams of Autonomous Cooperating Robots: Desired design Characteristics

    SciTech Connect

    Parker, L.E.

    1999-05-01

    The difficulties in designing a cooperative team are significant. Several of the key questions that must be resolved when designing a cooperative control architecture include: How do we formulate, describe, decompose, and allocate problems among a group of intelligent agents? How do we enable agents to communicate and interact? How do we ensure that agents act coherently in their actions? How do we allow agents to recognize and reconcile conflicts? However, in addition to these key issues, the software architecture must be designed to enable multi-robot teams to be robust, reliable, and flexible. Without these capabilities, the resulting robot team will not be able to successfully deal with the dynamic and uncertain nature of the real world. In this extended abstract, we first describe these desired capabilities. We then briefly describe the ALLIANCE software architecture that we have previously developed for multi-robot cooperation. We then briefly analyze the ALLIANCE architecture in terms of the desired design qualities identified.

  8. Experimental Demonstration of Technologies for Autonomous On-Orbit Robotic Assembly

    NASA Technical Reports Server (NTRS)

    LeMaster, Edward A.; Schaechter, David B.; Carrington, Connie K.

    2006-01-01

    The Modular Reconfigurable High Energy (MRHE) program aimed to develop technologies for the automated assembly and deployment of large-scale space structures and aggregate spacecraft. Part of the project involved creation of a terrestrial robotic testbed for validation and demonstration of these technologies and for the support of future development activities. This testbed was completed in 2005, and was thereafter used to demonstrate automated rendezvous, docking, and self-assembly tasks between a group of three modular robotic spacecraft emulators. This paper discusses the rationale for the MRHE project, describes the testbed capabilities, and presents the MRHE assembly demonstration sequence.

  9. First observations of teleseismic P-waves with autonomous underwater robots: towards future global network of mobile seismometers

    NASA Astrophysics Data System (ADS)

    Sukhovich, Alexei; Nolet, Guust; Hello, Yann; Simons, Frederik; Bonnieux, Sébastien

    2013-04-01

    We report here the first successful observations of underwater acoustic signals generated by teleseismic P-waves recorded by the autonomous robots MERMAID (short for Mobile Earthquake Recording in Marine Areas by Independent Divers). During 2011-2012 we conducted three test campaigns, for a total duration of about 8 weeks, in the Ligurian Sea, which allowed us to record nine teleseismic events (distance more than 60 degrees) of magnitude higher than 6 and one closer event (distance 23 degrees) of magnitude 5.5. Our results indicate that no simple relation exists between the magnitude of the source event and the signal-to-noise ratio (SNR) of the corresponding acoustic signals. Other factors, such as fault orientation and meteorological conditions, play an important role in the detectability of the seismic events. We also show examples of the events recorded during these test runs and how their frequency characteristics allow them to be recognized automatically by an algorithm based on the wavelet transform. We shall also report on more recent results obtained during the first fully autonomous run (currently ongoing) of the final MERMAID design in the Mediterranean Sea.
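
    In the spirit of the wavelet-based recognition mentioned above, the sketch below flags a transient by thresholding wavelet-scale coefficients against a robust noise estimate. The actual MERMAID discrimination algorithm is more elaborate; the wavelet choice, scales, and threshold here are assumptions, and the test signal is synthetic.

```python
# Hedged sketch of transient detection via wavelet-coefficient thresholding;
# not the MERMAID algorithm. Scales, threshold, and test data are made up.
import numpy as np

def ricker(points, width):
    """Unit-energy Ricker ("Mexican hat") wavelet sampled on `points` samples."""
    t = np.arange(points) - (points - 1) / 2.0
    a = 2.0 / (np.sqrt(3 * width) * np.pi ** 0.25)
    return a * (1 - (t / width) ** 2) * np.exp(-(t ** 2) / (2 * width ** 2))

def detect_transient(signal, widths=(4, 8, 16), k=5.0):
    """Return True if any scale shows coefficients above k robust sigmas."""
    for w in widths:
        coeffs = np.convolve(signal, ricker(10 * w, w), mode="same")
        sigma = 1.4826 * np.median(np.abs(coeffs - np.median(coeffs)))
        if np.max(np.abs(coeffs)) > k * sigma:
            return True
    return False

rng = np.random.default_rng(0)
record = rng.normal(0, 1, 2000)
record[1000:1040] += 6 * np.sin(np.linspace(0, 6 * np.pi, 40))   # injected synthetic arrival
print(detect_transient(record))   # expected True for the injected arrival
```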

  10. Autonomous intelligent assembly systems LDRD 105746 final report.

    SciTech Connect

    Anderson, Robert J.

    2013-04-01

    This report documents a three-year effort to develop technology that enables mobile robots to perform autonomous assembly tasks in unstructured outdoor environments. This is a multi-tier problem that requires the integration of a large number of different software technologies, including command and control, estimation and localization, distributed communications, object recognition, pose estimation, real-time scanning, and scene interpretation. Although ultimately unsuccessful in achieving the target brick-stacking task autonomously, numerous important component technologies were nevertheless developed. Such technologies include: a patent-pending polygon snake algorithm for robust feature tracking, a color grid algorithm for unique identification and calibration, a command and control framework for abstracting robot commands, a scanning capability that utilizes a compact, robot-portable scanner, and more. This report describes this project and these developed technologies.

  11. Real-time Needle Steering in Response to Rolling Vein Deformation by a 9-DOF Image-Guided Autonomous Venipuncture Robot

    PubMed Central

    Chen, Alvin I.; Balter, Max L.; Maguire, Timothy J.; Yarmush, Martin L.

    2015-01-01

    Venipuncture is the most common invasive medical procedure performed in the United States and the number one cause of hospital injury. Failure rates are particularly high in pediatric and elderly patients, whose veins tend to deform, move, or roll as the needle is introduced. To improve venipuncture accuracy in challenging patient populations, we have developed a portable device that autonomously servos a needle into a suitable vein under image guidance. The device operates in real time, combining near-infrared and ultrasound imaging, computer vision software, and a 9 degrees-of-freedom robot that servos the needle. In this paper, we present the kinematic and mechanical design of the latest generation robot. We then investigate in silico and in vitro the mechanics of vessel rolling and deformation in response to needle insertions performed by the robot. Finally, we demonstrate how the robot can make real-time adjustments under ultrasound image guidance to compensate for subtle vessel motions during venipuncture. PMID:26779381

  12. Approaching Complexity through Planful Play: Kindergarten Children's Strategies in Constructing an Autonomous Robot's Behavior

    ERIC Educational Resources Information Center

    Levy, S. T.; Mioduser, D.

    2010-01-01

    This study investigates how young children master, construct and understand intelligent rule-based robot behaviors, focusing on their strategies in gradually meeting the tasks' complexity. The wider aim is to provide a comprehensive map of the kinds of transitions and learning that take place in constructing simple emergent behaviors, particularly…

  13. Novel Microbial Diversity Retrieved by Autonomous Robotic Exploration of the World's Deepest Vertical Phreatic Sinkhole

    NASA Astrophysics Data System (ADS)

    Sahl, Jason W.; Fairfield, Nathaniel; Harris, J. Kirk; Wettergreen, David; Stone, William C.; Spear, John R.

    2010-03-01

    The deep phreatic thermal explorer (DEPTHX) is an autonomous underwater vehicle designed to navigate an unexplored environment, generate high-resolution three-dimensional (3-D) maps, collect biological samples based on an autonomous sampling decision, and return to its origin. In the spring of 2007, DEPTHX was deployed in Zacatón, a deep (˜318 m), limestone, phreatic sinkhole (cenote) in northeastern Mexico. As DEPTHX descended, it generated a 3-D map based on the processing of range data from 54 onboard sonars. The vehicle collected water column samples and wall biomat samples throughout the depth profile of the cenote. Post-expedition sample analysis via comparative analysis of 16S rRNA gene sequences revealed a wealth of microbial diversity. Traditional Sanger gene sequencing combined with a barcoded-amplicon pyrosequencing approach revealed novel, phylum-level lineages from the domains Bacteria and Archaea; in addition, several novel subphylum lineages were also identified. Overall, DEPTHX successfully navigated and mapped Zacatón, and collected biological samples based on an autonomous decision, which revealed novel microbial diversity in a previously unexplored environment.

  14. Novel microbial diversity retrieved by autonomous robotic exploration of the world's deepest vertical phreatic sinkhole.

    PubMed

    Sahl, Jason W; Fairfield, Nathaniel; Harris, J Kirk; Wettergreen, David; Stone, William C; Spear, John R

    2010-03-01

    The deep phreatic thermal explorer (DEPTHX) is an autonomous underwater vehicle designed to navigate an unexplored environment, generate high-resolution three-dimensional (3-D) maps, collect biological samples based on an autonomous sampling decision, and return to its origin. In the spring of 2007, DEPTHX was deployed in Zacatón, a deep (approximately 318 m), limestone, phreatic sinkhole (cenote) in northeastern Mexico. As DEPTHX descended, it generated a 3-D map based on the processing of range data from 54 onboard sonars. The vehicle collected water column samples and wall biomat samples throughout the depth profile of the cenote. Post-expedition sample analysis via comparative analysis of 16S rRNA gene sequences revealed a wealth of microbial diversity. Traditional Sanger gene sequencing combined with a barcoded-amplicon pyrosequencing approach revealed novel, phylum-level lineages from the domains Bacteria and Archaea; in addition, several novel subphylum lineages were also identified. Overall, DEPTHX successfully navigated and mapped Zacatón, and collected biological samples based on an autonomous decision, which revealed novel microbial diversity in a previously unexplored environment. PMID:20298146

  15. A bioinspired autonomous swimming robot as a tool for studying goal-directed locomotion.

    PubMed

    Manfredi, L; Assaf, T; Mintchev, S; Marrazza, S; Capantini, L; Orofino, S; Ascari, L; Grillner, S; Wallén, P; Ekeberg, O; Stefanini, C; Dario, P

    2013-10-01

    The bioinspired approach has been key in combining the disciplines of robotics with neuroscience in an effective and promising fashion. Indeed, certain aspects in the field of neuroscience, such as goal-directed locomotion and behaviour selection, can be validated through robotic artefacts. In particular, swimming is a functionally important behaviour where neuromuscular structures, neural control architecture and operation can be replicated artificially following models from biology and neuroscience. In this article, we present a biomimetic system inspired by the lamprey, an early vertebrate that locomotes using anguilliform swimming. The artefact possesses extra- and proprioceptive sensory receptors, muscle-like actuation, distributed embedded control and a vision system. Experiments on optimised swimming and on goal-directed locomotion are reported, as well as the assessment of the performance of the system, which shows high energy efficiency and adaptive behaviour. While the focus is on providing a robotic platform for testing biological models, the reported system can also be of major relevance for the development of engineering system applications. PMID:24030051

  16. Sustainable Cooperative Robotic Technologies for Human and Robotic Outpost Infrastructure Construction and Maintenance

    NASA Technical Reports Server (NTRS)

    Stroupe, Ashley W.; Okon, Avi; Robinson, Matthew; Huntsberger, Terry; Aghazarian, Hrand; Baumgartner, Eric

    2004-01-01

    Robotic Construction Crew (RCC) is a heterogeneous multi-robot system for autonomous acquisition, transport, and precision mating of components in construction tasks. RCC minimizes use of resources that are constrained in a space environment, such as computation, power, communication, and sensing. A behavior-based architecture provides adaptability and robustness despite low computational requirements. RCC successfully performs several construction-related tasks in an emulated outdoor environment despite high levels of uncertainty in motion and sensing. Quantitative results are provided for formation keeping in component transport, precision instrument placement, and construction tasks.

  17. An Extremely Low Power Quantum Optical Communication Link for Autonomous Robotic Explorers

    NASA Technical Reports Server (NTRS)

    Lekki, John; Nguyen, Quang-Viet; Bizon, Tom; Nguyen, Binh; Kojima, Jun

    2007-01-01

    One concept for planetary exploration involves using many small robotic landers that can cover more ground than a single conventional lander. In addressing this vision, NASA has been challenged in the National Nanotechnology Initiative to research the development of miniature robots built from nano-sized components. These robots face very significant challenges, such as mobility and communication, given their small size and limited power generation capability. The research presented here has been focused on developing a communications system that has the potential for providing ultra-low power communications for robots such as these. In this paper an optical communications technique that is based on transmitting recognizable sets of photons is presented. Previously, pairs of photons that have an entangled quantum state have been shown to be recognizable in ambient light. The main drawback to utilizing entangled photons is that they can only be generated through a very energy-inefficient nonlinear process. In this paper a new technique that generates sets of photons from pulsed sources is described and an experimental system demonstrating this technique is presented. This technique of generating photon sets from pulsed sources has the distinct advantage that it is much more flexible and energy efficient, and is well suited to take advantage of the very high energy efficiencies that are possible when using nano-scale sources. For these reasons the communication system presented in this paper is well suited for use in very small, low power landers and rovers. In this paper a very low power optical communications system for miniature robots, as small as 1 cu cm, is addressed. The communication system is a variant of photon counting communications. Instead of counting individual photons, the system only counts the arrival of time-coincident sets of photons. Using sets of photons significantly decreases the bit error rate because they are highly identifiable in the

  18. Dissociated emergent-response system and fine-processing system in human neural network and a heuristic neural architecture for autonomous humanoid robots.

    PubMed

    Yan, Xiaodan

    2010-01-01

    The current study investigated the functional connectivity of the primary sensory system with resting state fMRI and applied such knowledge into the design of the neural architecture of autonomous humanoid robots. Correlation and Granger causality analyses were utilized to reveal the functional connectivity patterns. Dissociation was within the primary sensory system, in that the olfactory cortex and the somatosensory cortex were strongly connected to the amygdala whereas the visual cortex and the auditory cortex were strongly connected with the frontal cortex. The posterior cingulate cortex (PCC) and the anterior cingulate cortex (ACC) were found to maintain constant communication with the primary sensory system, the frontal cortex, and the amygdala. Such neural architecture inspired the design of dissociated emergent-response system and fine-processing system in autonomous humanoid robots, with separate processing units and another consolidation center to coordinate the two systems. Such design can help autonomous robots to detect and respond quickly to danger, so as to maintain their sustainability and independence. PMID:21331371

  19. Dissociated Emergent-Response System and Fine-Processing System in Human Neural Network and a Heuristic Neural Architecture for Autonomous Humanoid Robots

    PubMed Central

    Yan, Xiaodan

    2010-01-01

    The current study investigated the functional connectivity of the primary sensory system with resting state fMRI and applied such knowledge into the design of the neural architecture of autonomous humanoid robots. Correlation and Granger causality analyses were utilized to reveal the functional connectivity patterns. Dissociation was within the primary sensory system, in that the olfactory cortex and the somatosensory cortex were strongly connected to the amygdala whereas the visual cortex and the auditory cortex were strongly connected with the frontal cortex. The posterior cingulate cortex (PCC) and the anterior cingulate cortex (ACC) were found to maintain constant communication with the primary sensory system, the frontal cortex, and the amygdala. Such neural architecture inspired the design of dissociated emergent-response system and fine-processing system in autonomous humanoid robots, with separate processing units and another consolidation center to coordinate the two systems. Such design can help autonomous robots to detect and respond quickly to danger, so as to maintain their sustainability and independence. PMID:21331371

  20. Vector Field Driven Design for Lightweight Signal Processing and Control Schemes for Autonomous Robotic Navigation

    NASA Astrophysics Data System (ADS)

    Mathai, Nebu John; Zourntos, Takis; Kundur, Deepa

    2009-12-01

    We address the problem of realizing lightweight signal processing and control architectures for agents in multirobot systems. Motivated by the promising results of neuromorphic engineering which suggest the efficacy of analog as an implementation substrate for computation, we present the design of an analog-amenable signal processing scheme. We use control and dynamical systems theory both as a description language and as a synthesis toolset to rigorously develop our computational machinery; these mechanisms are mated with structural insights from behavior-based robotics to compose overall algorithmic architectures. Our perspective is that robotic behaviors consist of actions taken by an agent to cause its sensory perception of the environment to evolve in a desired manner. To provide an intuitive aid for designing these behavioral primitives we present a novel visual tool, inspired vector field design, that helps the designer to exploit the dynamics of the environment. We present simulation results and animation videos to demonstrate the signal processing and control architecture in action.
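
    A numeric toy of the vector-field idea described above: an attractive field toward the goal composed with a repulsive field around an obstacle, integrated as a velocity command. The paper's analog synthesis and behavioral primitives are not reproduced here; the gains, step size, and geometry are assumptions.

```python
# Hedged sketch of a composed vector field (goal attraction + obstacle repulsion);
# purely a numeric toy, not the analog architecture in the paper.
import math

def field(pos, goal, obstacles, repel_gain=1.0):
    gx, gy = goal[0] - pos[0], goal[1] - pos[1]
    norm = math.hypot(gx, gy) or 1.0
    vx, vy = gx / norm, gy / norm                      # unit vector toward the goal
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy) or 1e-6
        vx += repel_gain * dx / d ** 3                 # push away, decaying with distance
        vy += repel_gain * dy / d ** 3
    return vx, vy

pos, goal, obstacles = [0.0, 0.0], (10.0, 0.0), [(5.0, 0.3)]
for _ in range(300):                                   # integrate the field as a velocity command
    vx, vy = field(pos, goal, obstacles)
    pos[0] += 0.05 * vx
    pos[1] += 0.05 * vy
print([round(c, 2) for c in pos])                      # ends near the goal, having skirted the obstacle
```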

  1. CYCLOPS: A mobile robotic platform for testing and validating image processing and autonomous navigation algorithms in support of artificial vision prostheses.

    PubMed

    Fink, Wolfgang; Tarbell, Mark A

    2009-12-01

    While artificial vision prostheses are quickly becoming a reality, actual testing time with visual prosthesis carriers is at a premium. Moreover, it is helpful to have a more realistic functional approximation of a blind subject. Instead of a normal subject with a healthy retina looking at a low-resolution (pixelated) image on a computer monitor or head-mounted display, a more realistic approximation is achieved by employing a subject-independent mobile robotic platform that uses a pixelated view as its sole visual input for navigation purposes. We introduce CYCLOPS: an AWD, remote controllable, mobile robotic platform that serves as a testbed for real-time image processing and autonomous navigation systems for the purpose of enhancing the visual experience afforded by visual prosthesis carriers. Complete with wireless Internet connectivity and a fully articulated digital camera with wireless video link, CYCLOPS supports both interactive tele-commanding via joystick, and autonomous self-commanding. Due to its onboard computing capabilities and extended battery life, CYCLOPS can perform complex and numerically intensive calculations, such as image processing and autonomous navigation algorithms, in addition to interfacing to additional sensors. Its Internet connectivity renders CYCLOPS a worldwide accessible testbed for researchers in the field of artificial vision systems. CYCLOPS enables subject-independent evaluation and validation of image processing and autonomous navigation systems with respect to the utility and efficiency of supporting and enhancing visual prostheses, while potentially reducing to a necessary minimum the need for valuable testing time with actual visual prosthesis carriers. PMID:19651459

  2. Design and implementation of a mechanically heterogeneous robot group

    NASA Astrophysics Data System (ADS)

    Sukhatme, Gaurav S.; Montgomery, James F.; Mataric, Maja J.

    1999-08-01

    This paper describes the design and construction of a cooperative, heterogeneous robot group comprised of one semi-autonomous aerial robot and two autonomous ground robots. The robots are designed to perform automated surveillance and reconnaissance of an urban outdoor area using onboard sensing. The ground vehicles have GPS, sonar for obstacle detection and avoidance, and a simple color-based vision system. Navigation is performed using an optimal mixture of odometry and GPS. The helicopter is equipped with a GPS/INS system, a camera, and a framegrabber. Each robot has an embedded 486 PC/104 processor running the QNX real-time operating system. Individual robot controllers are behavior-based and decentralized. We describe a control strategy and architecture that coordinates the robots with minimal top-down planning. The overall system is controlled at a high level by a single human operator using a specially designed control unit. The operator is able to task the group with a mission using a minimal amount of training. The group can re-task itself based on sensor inputs and can also be re-tasked by the operator. We describe a particular reconnaissance mission that the robots have been tested with, and lessons learned during the design and implementation. Our initial results with these experiments are encouraging given the challenging mechanics of the aerial robot. We conclude the paper with a discussion of ongoing and future work.
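
    The paper describes the ground vehicles' navigation as an optimal mixture of odometry and GPS without giving details, so the sketch below shows only a generic one-dimensional blend: dead-reckon on odometry and pull the estimate toward occasional GPS fixes. The gain and numbers are assumptions, not the authors' filter.

```python
# Hedged 1-D sketch of blending odometry with GPS fixes; the blending gain is an assumption.
def fuse(position, odom_delta, gps_fix=None, gps_weight=0.2):
    """Dead-reckon with odometry, then pull the estimate toward the GPS fix when one arrives."""
    position += odom_delta                              # propagate with wheel odometry
    if gps_fix is not None:
        position += gps_weight * (gps_fix - position)   # correct toward the absolute fix
    return position

x = 0.0
odometry = [1.0] * 10                                   # nominal 1 m per step (drifts in practice)
gps_fixes = {4: 5.6, 9: 10.3}                           # occasional absolute fixes: step index -> meters
for k, d in enumerate(odometry):
    x = fuse(x, d, gps_fixes.get(k))
print(round(x, 2))
```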

  3. Vertical stream curricula integration of problem-based learning using an autonomous vacuum robot in a mechatronics course

    NASA Astrophysics Data System (ADS)

    Chin, Cheng; Yue, Keng

    2011-10-01

    Difficulties in teaching a multi-disciplinary subject such as the mechatronics system design module in the Department of Mechatronics Engineering at Temasek Polytechnic arise from the gap in experience and skill among staff and students who have different backgrounds in mechanical, computer and electrical engineering within the department. The department piloted a new vertical stream curricula model (VSCAM) to enhance student learning in mechatronics system design through integration of educational activities from the first to the second year of the course. In this case study, a problem-based learning (PBL) method based on an autonomous vacuum robot in the mechatronics system design module was proposed to allow the students to have hands-on experience in mechatronics system design. The proposed work included in PBL consists of seminar sessions, weekly assignments and a project presentation to provide holistic assessment of teamwork and individual contributions. At the end of VSCAM, an integrative evaluation was conducted using confidence logs, attitude surveys and questionnaires. It was found that the activities were well appreciated by the participating staff and students. Hence, PBL has served as an effective pedagogical framework for teaching multidisciplinary subjects in mechatronics engineering education if adequate guidance and support are given to staff and students.

  4. Compact 3D lidar based on optically coupled horizontal and vertical scanning mechanism for the autonomous navigation of robots

    NASA Astrophysics Data System (ADS)

    Lee, Min-Gu; Baeg, Seung-Ho; Lee, Ki-Min; Lee, Hae-Seok; Baeg, Moon-Hong; Park, Jong-Ok; Kim, Hong-Ki

    2011-06-01

    The purpose of this research is to develop a new 3D LIDAR sensor, named KIDAR-B25, for measuring 3D image information with high range accuracy, high speed, and compact size. To measure the distance to the target object, we developed a range measurement unit implemented by the direct time-of-flight (TOF) method using a TDC chip, a pulsed laser transmitter as the illumination source (pulse width: 10 ns, wavelength: 905 nm, repetition rate: 30 kHz, peak power: 20 W), and an Si APD receiver, which has high sensitivity and wide bandwidth. We also devised a horizontal and vertical scanning mechanism, climbing in a spiral and coupled with the laser optical path. In addition, control electronics such as the motor controller, the signal processing unit, the power distributor, and so on, were developed and integrated in a compact assembly. The key point of the 3D LIDAR design proposed in this paper is the use of a compact scanning mechanism coupled with the optical module both horizontally and vertically. The KIDAR-B25 uses the same beam propagation axis for emitting the pulsed laser and receiving the reflected one, with no mutual optical interference. The scanning performance of the KIDAR-B25 has been proven with stable operation up to 20 Hz (vertical) and 40 Hz (horizontal), and it takes about 1.7 s to reach the maximum speed. The vertical scan covers a +/-10 degree field of view (FOV) with a 0.25 degree angular resolution, and the whole horizontal plane (360 degrees) is covered with a 0.125 degree angular resolution. Since the KIDAR-B25 sensor has been planned and developed to be used in mobile robots for navigation, we conducted an outdoor test to evaluate its performance. The experimental results show that the captured 3D imaging data are usefully applicable to the navigation of the robot for detecting and avoiding moving objects in real time.
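
    The TDC-based measurement described above rests on the direct time-of-flight relation range = c·t/2; a minimal sketch follows, with illustrative numbers rather than KIDAR-B25 calibration values.

```python
# The direct TOF range relation behind a TDC-based measurement unit: range = c * t / 2.
# Numbers below are illustrative, not KIDAR-B25 calibration values.
C = 299_792_458.0          # speed of light, m/s

def tof_to_range(round_trip_ns):
    return C * (round_trip_ns * 1e-9) / 2.0

print(round(tof_to_range(66.7), 2))   # a ~66.7 ns round trip corresponds to ~10 m
```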

  5. Semi-autonomous robots for reactor containments. Annual summary report, [1993--1994

    SciTech Connect

    Not Available

    1994-05-06

    During 1993, the activity at the University was split into two primary groups. One group provided direct support for the development and testing of the RVIR vehicle. This effort culminated in a demonstration of the vehicle at ORNL during December. The second group of researchers focused attention on pushing the technology forward in the areas of radiation imaging, navigation, and sensing modalities. A major effort in technology transfer took place during this year. All of these efforts are reflected in the attached periodic progress reports. During 1994, our attention will change from the Nuclear Energy program to the Environmental Restoration and Waste Management office. The immediate needs of the Robotics Technology Development Program within the Office of Technology Development of EM drove this change in target applications. The University will be working closely with the national laboratories to further develop and transfer existing technologies to mobile platforms which are currently being designed and employed in seriously hazardous environments.

  6. Design of a dynamic test platform for autonomous robot vision systems

    NASA Technical Reports Server (NTRS)

    Rich, G. C.

    1980-01-01

    The concept and design of a dynamic test platform for the development and evaluation of a robot vision system is discussed. The platform is to serve as a diagnostic and developmental tool for future work with the RPI Mars Rover's multi-laser/multi-detector vision system. The platform allows testing of the vision system while its attitude is varied, statically or periodically. The vision system is mounted on the test platform. It can then be subjected to a wide variety of simulated motions and thus be examined in a controlled, quantitative fashion. Defining and modeling Rover motions and designing the platform to emulate these motions are also discussed. Individual aspects of the design process are treated separately, such as structure, driving linkages, and motors and transmissions.

  7. A novel autonomous, bioinspired swimming robot developed by neuroscientists and bioengineers.

    PubMed

    Stefanini, C; Orofino, S; Manfredi, L; Mintchev, S; Marrazza, S; Assaf, T; Capantini, L; Sinibaldi, E; Grillner, S; Wallén, P; Dario, P

    2012-06-01

    This paper describes the development of a new biorobotic platform inspired by the lamprey. Design, fabrication and implemented control are all based on biomechanical and neuroscientific findings on this eel-like fish. The lamprey model has been extensively studied and characterized in recent years because it possesses all basic functions and control mechanisms of higher vertebrates, while at the same time having fewer neurons and simplified neural structures. The untethered robot has a flexible body driven by compliant actuators with proprioceptive feedback. It also has binocular vision for vision-based navigation. The platform has been successfully and extensively experimentally tested in aquatic environments, has high energy efficiency and is ready to be used as investigation tool for high level motor tasks. PMID:22619181

  8. Image processing for navigation on a mobile embedded platform: design of an autonomous mobile robot

    NASA Astrophysics Data System (ADS)

    Loose, Harald; Lemke, Christiane; Papazov, Chavdar

    2006-02-01

    This paper deals with intelligent mobile platforms connected to a camera and controlled by a small hardware platform called RCUBE. This platform is able to provide the features of a typical actuator-sensor board, with various inputs and outputs as well as computing power and image recognition capabilities. Several intelligent autonomous RCUBE devices can be equipped and programmed to participate in the BOSPORUS network. These components form an intelligent network for gathering sensor and image data, sensor data fusion, navigation, and control of mobile platforms. The RCUBE platform provides a standalone solution for image processing, which will be explained and presented. It plays a major role for several components in a reference implementation of the BOSPORUS system. On the one hand, intelligent cameras will be positioned in the environment, analyzing events from a fixed point of view and sharing their perceptions with other components in the system. On the other hand, image processing results will contribute to a reliable navigation of the mobile system, which is crucially important. Fixed landmarks and other objects appropriate for determining the position of a mobile system can be recognized. For navigation, other methods are added, e.g., GPS calculations and odometry.

  9. Volumetric mapping of tubeworm colonies in Kagoshima Bay through autonomous robotic surveys

    NASA Astrophysics Data System (ADS)

    Maki, Toshihiro; Kume, Ayaka; Ura, Tamaki

    2011-07-01

    We developed and tested a comprehensive method for measuring the three-dimensional distribution of tubeworm colonies using an autonomous underwater vehicle (AUV). We derived volumetric measurements such as the volume, area, average height, and number of tubes for colonies of Lamellibrachia satsuma, the world's shallowest-dwelling vestimentiferan tubeworm discovered at a depth of 82 m, at the Haorimushi site in Kagoshima Bay, Japan, by processing geometric and visual data obtained through low-altitude surveys using the AUV Tri-Dog 1. According to the results, the tubeworm colonies cover an area of 151.9 m², accounting for 5.8% of the observed area (2600 m²). The total number of tubes was estimated to be 99,500. Morphological parameters such as area, volume, and average height were estimated for each colony. On the basis of average height, colonies could be clearly separated into two groups, short (0.1-0.3 m) and tall (0.6-0.7 m), independent of the area.

  10. Ultra-miniature omni-directional camera for an autonomous flying micro-robot

    NASA Astrophysics Data System (ADS)

    Ferrat, Pascal; Gimkiewicz, Christiane; Neukom, Simon; Zha, Yingyun; Brenzikofer, Alain; Baechler, Thomas

    2008-04-01

    CSEM presents a highly integrated ultra-miniature camera module with omni-directional view dedicated to autonomous micro flying devices. Very tight design and integration requirements (related to size, weight, and power consumption) for the optical, microelectronic and electronic components are fulfilled. The presented ultra-miniature camera platform is based on two major components: a catadioptric lens system and a dedicated image sensor. The optical system consists of a hyperbolic mirror and an imaging lens. The vertical field of view is +10° to -35°. The CMOS image sensor provides a polar pixel field with 128 (horizontal) by 64 (vertical) pixels. Since the number of pixels for each circle is constant, the unwrapped panoramic image achieves a constant resolution in polar direction for all image regions. The whole camera module, delivering 40 frames per second, contains optical image preprocessing for effortless re-mapping of the acquired image into undistorted cylindrical coordinates. The total weight of the complete camera is less than 5 g. The system's outer dimensions are 14.4 mm in height, with an 11.4 mm x 11.4 mm footprint. Thanks to the innovative PROGLOG™, a dynamic range of over 140 dB is achieved.
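    To make the polar-to-cylindrical remapping idea concrete, a generic unwrapping sketch is shown below. It assumes a conventional Cartesian sensor and nearest-neighbour sampling, whereas the described chip performs this remapping natively through its polar pixel layout, so the code is illustrative only:

```python
import math
import numpy as np

# Generic sketch (not the sensor's actual on-chip remapping): unwrap a
# catadioptric image captured on a Cartesian sensor into a cylindrical
# panorama by sampling along concentric circles around the mirror centre.
def unwrap(img, center, r_min, r_max, width=128, height=64):
    cx, cy = center
    pano = np.zeros((height, width), dtype=img.dtype)
    for v in range(height):
        r = r_min + (r_max - r_min) * v / (height - 1)
        for u in range(width):
            phi = 2.0 * math.pi * u / width
            x = int(round(cx + r * math.cos(phi)))
            y = int(round(cy + r * math.sin(phi)))
            if 0 <= x < img.shape[1] and 0 <= y < img.shape[0]:
                pano[v, u] = img[y, x]
    return pano

image = np.random.randint(0, 255, (200, 200), dtype=np.uint8)  # placeholder frame
panorama = unwrap(image, center=(100, 100), r_min=20, r_max=95)
print(panorama.shape)  # (64, 128)
```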

  11. Automatic learning rate adjustment for self-supervising autonomous robot control

    NASA Technical Reports Server (NTRS)

    Arras, Michael K.; Protzel, Peter W.; Palumbo, Daniel L.

    1992-01-01

    Described is an application in which an Artificial Neural Network (ANN) controls the positioning of a robot arm with five degrees of freedom by using visual feedback provided by two cameras. This application and the specific ANN model, local linear maps, are based on the work of Ritter, Martinetz, and Schulten. We extended their approach by generating a filtered, average positioning error from the continuous camera feedback and by coupling the learning rate to this error. When the network learns to position the arm, the positioning error decreases and so does the learning rate until the system stabilizes at a minimum error and learning rate. This abolishes the need for a predetermined cooling schedule. The automatic cooling procedure results in a closed loop control with no distinction between a learning phase and a production phase. If the positioning error suddenly starts to increase due to an internal failure such as a broken joint, or an environmental change such as a camera moving, the learning rate increases accordingly. Thus, learning is automatically activated and the network adapts to the new condition after which the error decreases again and learning is 'shut off'. The automatic cooling is therefore a prerequisite for the autonomy and the fault tolerance of the system.
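    A minimal sketch of the error-coupled learning-rate idea described above (the gain, filter coefficient and floor values are assumptions, not the parameters used by the authors):

```python
# Hypothetical sketch: the learning rate tracks a low-pass filtered
# positioning error, so learning "switches off" as the error shrinks and
# reactivates automatically if the error grows again (e.g. after a failure).
class AdaptiveRate:
    def __init__(self, gain=0.5, smoothing=0.9, floor=1e-4):
        self.gain = gain            # maps filtered error to learning rate (assumed)
        self.smoothing = smoothing  # exponential filter coefficient (assumed)
        self.floor = floor
        self.filtered_error = 0.0

    def update(self, positioning_error):
        self.filtered_error = (self.smoothing * self.filtered_error
                               + (1 - self.smoothing) * abs(positioning_error))
        return max(self.floor, self.gain * self.filtered_error)

rate = AdaptiveRate()
for err in [0.5, 0.4, 0.2, 0.1, 0.05, 0.4]:   # final error spike simulates a failure
    print(round(rate.update(err), 4))
```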

  12. Adaptive fuzzy approach to modeling of operational space for autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Musilek, Petr; Gupta, Madan M.

    1998-10-01

    Robots operating in an unstructured environment need high level of modeling of their operational space in order to plan a suitable path from an initial position to a desired goal. From this perspective, operational space modeling seems to be crucial to ensure a sufficient level of autonomy. In order to compile the information from various sources, we propose a fuzzy approach to evaluate each unit region on a grid map by a certain value of transition cost. This value expresses the cost of movement over the unit region: the higher the value, the more expensive the movement through the region in terms of energy, time, danger, etc. The approach for modeling, proposed in this paper, employs fuzzy granulation of information on various terrain features and their combination based on a fuzzy neural network. In order to adapt to the changing environmental conditions, and to improve the validity of constructed cost maps on-line, the system can be endowed with learning abilities. The learning subsystem would change parameters of the fuzzy neural network based decision system by reinforcements derived from comparisons of the actual cost of transition with the cost obtained from the model.
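    A toy sketch of turning terrain features into a crisp transition cost for one grid cell via fuzzy memberships (the membership breakpoints and weights are invented; the paper combines the granulated features with a fuzzy neural network rather than a fixed weighted sum):

```python
# Minimal sketch, under stated assumptions, of fuzzy granulation of terrain
# features (slope, roughness) and their aggregation into a transition cost.
def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def transition_cost(slope_deg, roughness):
    steep = tri(slope_deg, 10, 30, 50)     # membership in "steep" (assumed breakpoints)
    flat = tri(slope_deg, -10, 0, 15)      # membership in "flat"
    rough = min(1.0, roughness)            # roughness already normalised to [0, 1]
    # Weighted aggregation of fuzzy degrees into a crisp cost in [0, 1].
    cost = 0.6 * steep + 0.3 * rough + 0.1 * (1.0 - flat)
    return min(1.0, cost)

print(transition_cost(slope_deg=5, roughness=0.1))    # cheap cell
print(transition_cost(slope_deg=35, roughness=0.8))   # expensive cell
```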

  13. The Summer Robotic Autonomy Course

    NASA Technical Reports Server (NTRS)

    Nourbakhsh, Illah R.

    2002-01-01

    We offered a first Robotic Autonomy course this summer, located at NASA/Ames' new NASA Research Park, for approximately 30 high school students. In this 7-week course, students worked in ten teams to build and then program advanced autonomous robots capable of visual processing and high-speed wireless communication. The course made use of challenge-based curricula, culminating each week with a Wednesday Challenge Day and a Friday Exhibition and Contest Day. Robotic Autonomy provided a comprehensive grounding in elementary robotics, including basic electronics, electronics evaluation, microprocessor programming, real-time control, and robot mechanics and kinematics. Our course then continued the educational process by introducing higher-level perception, action and autonomy topics, including teleoperation, visual servoing, intelligent scheduling and planning and cooperative problem-solving. We were able to deliver such a comprehensive, high-level education in robotic autonomy for two reasons. First, the content resulted from close collaboration between the CMU Robotics Institute and researchers in the Information Sciences and Technology Directorate and various education program/project managers at NASA/Ames. This collaboration produced not only educational content, but will also be central to the conduct of formative and summative evaluations of the course for further refinement. Second, CMU rapid prototyping skills as well as the PI's low-overhead perception and locomotion research projects enabled design and delivery of affordable robot kits with unprecedented sensory-locomotory capability. Each Trikebot robot was capable of both indoor locomotion and high-speed outdoor motion and was equipped with a high-speed vision system coupled to a low-cost pan/tilt head. As planned, following the completion of Robotic Autonomy, each student took home an autonomous, competent robot. This robot is the student's to keep, as she explores robotics with an extremely capable tool in the

  14. Master's in Autonomous Systems: An Overview of the Robotics Curriculum and Outcomes at ISEP, Portugal

    ERIC Educational Resources Information Center

    Silva, E.; Almeida, J.; Martins, A.; Baptista, J. P.; Campos Neves, B.

    2013-01-01

    Robotics research in Portugal is increasing every year, but few students embrace it as one of their first choices for study. Until recently, job offers for engineers were plentiful, and those looking for a degree in science and technology would avoid areas considered to be demanding, like robotics. At the undergraduate level, robotics programs are…

  15. Small robot autonomy in an integrated environment

    NASA Astrophysics Data System (ADS)

    O'Brien, Barry J.; Young, Stuart H.

    2008-04-01

    The U.S. Army Research Laboratory's (ARL) Computational and Information Sciences Directorate (CISD) has long been involved in autonomous asset control, specifically as it relates to small robots. Over the past year, CISD has demonstrated the ability to control and view streaming video from an FCS-surrogate PackBot robotic system over multiple network types (Soldier Radio Waveform (SRW), 802.11), as well as tasking the robot to follow both manually (ARL DigitalInk) and autonomously planned (CERDEC C2ORE) GPS waypoint routes. These capabilities remove the "stand alone system" limitations of traditional small robot systems and allow any and all data produced by such platforms to be available to anyone on the network, while at the same time reducing the amount of operator intervention required to utilize a robot. However, assumptions were made about the paths the robot was to traverse, specifically that they would be free from major obstacles. To address these system limitations, CISD is implementing obstacle detection and avoidance (OD/OA) on the PackBot. The OD/OA utilizes COTS ranging sensors with indoor and/or outdoor capabilities, and leverages existing software algorithm components into the existing CISD robotic control architecture. These new capabilities are available in an integrated environment consisting of common command and control (C2) and network interfaces and on multiple platforms (ARL ATRV, LynchBot, PackBot, etc.) due to the modular and platform/network independent architecture that ARL employs. This paper will describe the current robotic control architecture employed by ARL and provide brief descriptions of existing capabilities. Further, the paper will discuss the small robot obstacle detection/avoidance integration effort performed by ARL, along with some preliminary results on its performance and benefits.

  16. Outdoor allergens.

    PubMed Central

    Burge, H A; Rogers, C A

    2000-01-01

    Outdoor allergens are an important part of the exposures that lead to allergic disease. Understanding the role of outdoor allergens requires a knowledge of the nature of outdoor allergen-bearing particles, the distributions of their source, and the nature of the aerosols (particle types, sizes, dynamics of concentrations). Primary sources for outdoor allergens include vascular plants (pollen, fern spores, soy dust), and fungi (spores, hyphae). Nonvascular plants, algae, and arthropods contribute small numbers of allergen-bearing particles. Particles are released from sources into the air by wind, rain, mechanical disturbance, or active discharge mechanisms. Once airborne, they follow the physical laws that apply to all airborne particles. Although some outdoor allergens penetrate indoor spaces, exposure occurs mostly outdoors. Even short-term peak outdoor exposures can be important in eliciting acute symptoms. Monitoring of airborne biological particles is usually by particle impaction and microscopic examination. Centrally located monitoring stations give regional-scale measurements for aeroallergen levels. Evidence for the role of outdoor allergens in allergic rhinitis is strong and is rapidly increasing for a role in asthma. Pollen and fungal spore exposures have both been implicated in acute exacerbations of asthma, and sensitivity to some fungal spores predicts the existence of asthma. Synergism and/or antagonism probably occurs with other outdoor air particles and gases. Control involves avoidance of exposure (staying indoors, preventing entry of outdoor aerosols) as well as immunotherapy, which is effective for pollen but of limited effect for spores. Outdoor allergens have been the subject of only limited studies with respect to the epidemiology of asthma. Much remains to be studied with respect to prevalence patterns, exposure and disease relationships, and control. PMID:10931783

  17. Performance of a scanning laser line striper in outdoor lighting

    NASA Astrophysics Data System (ADS)

    Mertz, Christoph

    2013-05-01

    For search and rescue robots and reconnaissance robots it is important to detect objects in their vicinity. We have developed a scanning laser line striper that can produce dense 3D images using active illumination. The scanner consists of a camera and a MEMS-micro mirror based projector. It can also detect the presence of optically difficult material like glass and metal. The sensor can be used for autonomous operation or it can help a human operator to better remotely control the robot. In this paper we will evaluate the performance of the scanner under outdoor illumination, i.e. from operating in the shade to operating in full sunlight. We report the range, resolution and accuracy of the sensor and its ability to reconstruct objects like grass, wooden blocks, wires, metal objects, electronic devices like cell phones, blank RPG, and other inert explosive devices. Furthermore we evaluate its ability to detect the presence of glass and polished metal objects. Lastly we report on a user study that shows a significant improvement in a grasping task. The user is tasked with grasping a wire with the remotely controlled hand of a robot. We compare the time it takes to complete the task using the 3D scanner with using a traditional video camera.

  18. Does It "Want" or "Was It Programmed to..."? Kindergarten Children's Explanations of an Autonomous Robot's Adaptive Functioning

    ERIC Educational Resources Information Center

    Levy, Sharona T.; Mioduser, David

    2008-01-01

    This study investigates young children's perspectives in explaining a self-regulating mobile robot, as they learn to program its behaviors from rules. We explore their descriptions of a robot in action to determine the nature of their explanatory frameworks: psychological or technological. We have also studied the role of an adult's intervention…

  19. Effects of automation and task load on task switching during human supervision of multiple semi-autonomous robots in a dynamic environment.

    PubMed

    Squire, P N; Parasuraman, R

    2010-08-01

    The present study assessed the impact of task load and level of automation (LOA) on task switching in participants supervising a team of four or eight semi-autonomous robots in a simulated 'capture the flag' game. Participants were faster to perform the same task than when they chose to switch between different task actions. They also took longer to switch between different tasks when supervising the robots at a high compared to a low LOA. Task load, as manipulated by the number of robots to be supervised, did not influence switch costs. The results suggest that the design of future unmanned vehicle (UV) systems should take into account not simply how many UVs an operator can supervise, but also the impact of LOA and task operations on task switching during supervision of multiple UVs. The findings of this study are relevant for the ergonomics practice of UV systems. This research extends the cognitive theory of task switching to inform the design of UV systems and results show that switching between UVs is an important factor to consider. PMID:20658389

  20. Development of dog-like retrieving capability in a ground robot

    NASA Astrophysics Data System (ADS)

    MacKenzie, Douglas C.; Ashok, Rahul; Rehg, James M.; Witus, Gary

    2013-01-01

    This paper presents the Mobile Intelligence Team's approach to addressing the CANINE outdoor ground robot competition. The competition required developing a robot that provided retrieving capabilities similar to a dog, while operating fully autonomously in unstructured environments. The vision team consisted of Mobile Intelligence, the Georgia Institute of Technology, and Wayne State University. Important computer vision aspects of the project were the ability to quickly learn the distinguishing characteristics of novel objects, searching images for the object as the robot drove a search pattern, identifying people near the robot for safe operations, correctly identifying the object among distractors, and localizing the object for retrieval. The classifier used to identify the objects will be discussed, including an analysis of its performance, and an overview of the entire system architecture is presented. A discussion of the robot's performance in the competition will demonstrate the system's successes in real-world testing.

  1. Proceedings of the 1989 CESAR/CEA (Center for Engineering Systems Advanced Research/Commissariat a l'Energie Atomique) workshop on autonomous mobile robots (May 30--June 1, 1989)

    SciTech Connect

    Harber, K.S.; Pin, F.G. (Center for Engineering Systems Advanced Research)

    1990-03-01

    The US DOE Center for Engineering Systems Advanced Research (CESAR) at the Oak Ridge National Laboratory (ORNL) and the Commissariat a l'Energie Atomique's (CEA) Office de Robotique et Productique within the Directorat a la Valorization are working toward a long-term cooperative agreement and relationship in the area of Intelligent Systems Research (ISR). This report presents the proceedings of the first CESAR/CEA Workshop on Autonomous Mobile Robots which took place at ORNL on May 30, 31 and June 1, 1989. The purpose of the workshop was to present and discuss methodologies and algorithms under development at the two facilities in the area of perception and navigation for autonomous mobile robots in unstructured environments. Experimental demonstration of the algorithms and comparison of some of their features were proposed to take place within the framework of a previously mutually agreed-upon demonstration scenario or "base-case". The base-case scenario, described in detail in Appendix A, involved autonomous navigation by the robot in an a priori unknown environment with dynamic obstacles, in order to reach a predetermined goal. From the intermediate goal location, the robot had to search for and locate a control panel, move toward it, and dock in front of the panel face. The CESAR demonstration was successfully accomplished using the HERMIES-IIB robot while subsets of the CEA demonstration performed using the ARES robot simulation and animation system were presented. The first session of the workshop focused on these experimental demonstrations and on the needs and considerations for establishing "benchmarks" for testing autonomous robot control algorithms.

  2. High reliability outdoor sonar prototype based on efficient signal coding.

    PubMed

    Alvarez, Fernando J; Ureña, Jesús; Mazo, Manuel; Hernández, Alvaro; García, Juan J; de Marziani, Carlos

    2006-10-01

    Many mobile robots and autonomous vehicles designed for outdoor operation have incorporated ultrasonic sensors in their navigation systems, whose function is mainly to avoid possible collisions with very close obstacles. The use of these systems in more precise tasks requires signal encoding and the incorporation of pulse compression techniques that have already been used with success in the design of high-performance indoor sonars. However, the transmission of ultrasonic encoded signals outdoors entails a new challenge because of the effects of atmospheric turbulence. This phenomenon causes random fluctuations in the phase and amplitude of traveling acoustic waves, a fact that can make the encoded signal completely unrecognizable by its matched receiver. Atmospheric turbulence is investigated in this work, with the aim of determining the conditions under which it is possible to assure the reliable outdoor operation of an ultrasonic pulse compression system. As a result of this analysis, a novel sonar prototype based on complementary sequences coding is developed and experimentally tested. This encoding scheme provides the system with very useful additional features, namely, high robustness to noise, multi-mode operation capability (simultaneous emissions with minimum cross talk interference), and the possibility of applying an efficient detection algorithm that notably decreases the hardware resource requirements. PMID:17036794
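    The property that makes complementary-sequence coding attractive for pulse compression can be shown in a few lines: for a Golay complementary pair, the summed autocorrelations cancel all sidelobes, leaving a single sharp peak. The construction below is a standard Golay recursion and is not claimed to match the prototype's actual emitter design:

```python
import numpy as np

# Standard Golay complementary pair construction (illustrative, not the
# paper's implementation): the sum of the two autocorrelations is 2N at zero
# lag and exactly zero everywhere else, which is what makes the matched
# filter output robust against noise and crosstalk.
def golay_pair(order):
    a, b = np.array([1.0]), np.array([1.0])
    for _ in range(order):
        a, b = np.concatenate([a, b]), np.concatenate([a, -b])
    return a, b

a, b = golay_pair(5)                       # two 32-chip sequences
corr = (np.correlate(a, a, "full") +
        np.correlate(b, b, "full"))        # sidelobes cancel exactly
print(corr.astype(int))                    # 2N at zero lag, 0 elsewhere
```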

  3. CASSY Robot

    NASA Astrophysics Data System (ADS)

    Pittman, Anna; Wright, Ann; Rice, Aaron; Shyaka, Claude

    2014-03-01

    The CASSY Robot project involved two square robots coded in RobotC. The goal was to code a robot to do a certain set of tasks autonomously. To begin with, our task was to code the robot so that it would roam a certain area, marked off by black tape. When the robot hit the black tape, it knew to back up and turn around. It was able to do this thanks to the light sensor that was attached to the bottom of the robot. Also, whenever the robot hit an obstacle, it knew to stop, back up, and turn around. This was primarily to prevent the robot from hurting itself if it hit an obstacle. This was accomplished by using touch sensors set up as bumpers. Once that was accomplished, we attached sonar sensors and created code so that one robot was able to find and track the other robot in a sort of intruder/police scenario. The overall goal of this project was to code the robot so that we can test it against a robot coded exactly the same, but using Layered Mode Selection Logic. Professor.

  4. vSLAM: vision-based SLAM for autonomous vehicle navigation

    NASA Astrophysics Data System (ADS)

    Goncalves, Luis; Karlsson, Niklas; Ostrowski, Jim; Di Bernardo, Enrico; Pirjanian, Paolo

    2004-09-01

    Among the numerous challenges of building autonomous/unmanned vehicles is that of reliable and autonomous localization in an unknown environment. In this paper we present a system that can efficiently and autonomously solve the robotics 'SLAM' problem, where a robot placed in an unknown environment, simultaneously must localize itself and make a map of the environment. The system is vision-based, and makes use of Evolution Robotics' powerful object recognition technology. As the robot explores the environment, it is continuously performing four tasks, using information from acquired images and the drive system odometry. The robot: (1) recognizes previously created 3-D visual landmarks; (2) builds new 3-D visual landmarks; (3) updates the current estimate of its location, using the map; (4) updates the landmark map. In indoor environments, the system can build a map of a 5m by 5m area in approximately 20 minutes, and can localize itself with an accuracy of approximately 15 cm in position and 3 degrees in orientation relative to the global reference frame of the landmark map. The same system can be adapted for outdoor, vehicular use.

  5. Autonomous mobile communication relays

    NASA Astrophysics Data System (ADS)

    Nguyen, Hoa G.; Everett, Hobart R.; Manouk, Narek; Verma, Ambrish

    2002-07-01

    Maintaining a solid radio communication link between a mobile robot entering a building and an external base station is a well-recognized problem. Modern digital radios, while affording high bandwidth and Internet-protocol-based automatic routing capabilities, tend to operate on line-of-sight links. The communication link degrades quickly as a robot penetrates deeper into the interior of a building. This project investigates the use of mobile autonomous communication relay nodes to extend the effective range of a mobile robot exploring a complex interior environment. Each relay node is a small mobile slave robot equipped with sonar, ladar, and 802.11b radio repeater. For demonstration purposes, four Pioneer 2-DX robots are used as autonomous mobile relays, with SSC-San Diego's ROBART III acting as the lead robot. The relay robots follow the lead robot into a building and are automatically deployed at various locations to maintain a networked communication link back to the remote operator. With their on-board external sensors, they also act as rearguards to secure areas already explored by the lead robot. As the lead robot advances and RF shortcuts are detected, relay nodes that become unnecessary will be reclaimed and reused, all transparent to the operator. This project takes advantage of recent research results from several DARPA-funded tasks at various institutions in the areas of robotic simulation, ad hoc wireless networking, route planning, and navigation. This paper describes the progress of the first six months of the project.

  6. Habituation: a non-associative learning rule design for spiking neurons and an autonomous mobile robots implementation.

    PubMed

    Cyr, André; Boukadoum, Mounir

    2013-03-01

    This paper presents a novel bio-inspired habituation function for robots under control by an artificial spiking neural network. This non-associative learning rule is modelled at the synaptic level and validated through robotic behaviours in reaction to different stimuli patterns in a dynamical virtual 3D world. Habituation is minimally represented to show an attenuated response after exposure to and perception of persistent external stimuli. Based on current neurosciences research, the originality of this rule includes modulated response to variable frequencies of the captured stimuli. Filtering out repetitive data from the natural habituation mechanism has been demonstrated to be a key factor in the attention phenomenon, and inserting such a rule operating at multiple temporal dimensions of stimuli increases a robot's adaptive behaviours by ignoring broader contextual irrelevant information. PMID:23385344

  7. Simulation of the outdoor energy efficiency of an autonomous solar kit based on meteorological data for a site in Central Europa

    NASA Astrophysics Data System (ADS)

    Bouzaki, Mohammed Moustafa; Chadel, Meriem; Benyoucef, Boumediene; Petit, Pierre; Aillerie, Michel

    2016-07-01

    This contribution analyzes the energy provided by a solar kit dedicated to autonomous usage and installed in Central Europe (longitude 6.10°, latitude 49.21°, altitude 160 m) using the simulation software PVSYST. We focused the analysis on the effect of temperature and solar irradiation on the I-V characteristic of a commercial PV panel. We also consider in this study the influence of charging and discharging the battery on the generator efficiency. Meteorological data are integrated into the simulation software. As expected, the energy provided by the solar kit varies over the year, with a minimum in December. In the proposed approach, we consider this minimum as the lowest acceptable energy level needed to satisfy the intended use. Thus, for the other months, part of the available renewable energy is lost if no storage system is included.
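    The "December as baseline" reasoning can be illustrated with simple arithmetic (the monthly figures below are made up for illustration; they are not the paper's simulated values):

```python
# Illustrative arithmetic only: if the worst-month yield is taken as the
# guaranteed level, any monthly production above it goes unharvested when no
# storage is present.
monthly_kwh = {"Jan": 14, "Feb": 20, "Mar": 33, "Apr": 42, "May": 50, "Jun": 53,
               "Jul": 52, "Aug": 46, "Sep": 36, "Oct": 25, "Nov": 15, "Dec": 12}

baseline = min(monthly_kwh.values())                       # worst month (December)
lost = sum(v - baseline for v in monthly_kwh.values())     # surplus lost without storage
print(f"usable baseline: {baseline} kWh/month, unharvested surplus: {lost} kWh/year")
```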

  8. Demonstration of a Semi-Autonomous Hybrid Brain-Machine Interface using Human Intracranial EEG, Eye Tracking, and Computer Vision to Control a Robotic Upper Limb Prosthetic

    PubMed Central

    McMullen, David P.; Hotson, Guy; Katyal, Kapil D.; Wester, Brock A.; Fifer, Matthew S.; McGee, Timothy G.; Harris, Andrew; Johannes, Matthew S.; Vogelstein, R. Jacob; Ravitz, Alan D.; Anderson, William S.; Thakor, Nitish V.; Crone, Nathan E.

    2014-01-01

    To increase the ability of brain-machine interfaces (BMIs) to control advanced prostheses such as the modular prosthetic limb (MPL), we are developing a novel system: the Hybrid Augmented Reality Multimodal Operation Neural Integration Environment (HARMONIE). This system utilizes hybrid input, supervisory control, and intelligent robotics to allow users to identify an object (via eye tracking and computer vision) and initiate (via brain-control) a semi-autonomous reach-grasp-and-drop of the object by the MPL. Sequential iterations of HARMONIE were tested in two pilot subjects implanted with electrocorticographic (ECoG) and depth electrodes within motor areas. The subjects performed the complex task in 71.4% (20/28) and 67.7% (21/31) of trials after minimal training. Balanced accuracy for detecting movements was 91.1% and 92.9%, significantly greater than chance accuracies (p < 0.05). After BMI-based initiation, the MPL completed the entire task 100% (one object) and 70% (three objects) of the time. The MPL took approximately 12.2 seconds for task completion after system improvements implemented for the second subject. Our hybrid-BMI design prevented all but one baseline false positive from initiating the system. The novel approach demonstrated in this proof-of-principle study, using hybrid input, supervisory control, and intelligent robotics, addresses limitations of current BMIs. PMID:24760914

  9. Demonstration of a semi-autonomous hybrid brain-machine interface using human intracranial EEG, eye tracking, and computer vision to control a robotic upper limb prosthetic.

    PubMed

    McMullen, David P; Hotson, Guy; Katyal, Kapil D; Wester, Brock A; Fifer, Matthew S; McGee, Timothy G; Harris, Andrew; Johannes, Matthew S; Vogelstein, R Jacob; Ravitz, Alan D; Anderson, William S; Thakor, Nitish V; Crone, Nathan E

    2014-07-01

    To increase the ability of brain-machine interfaces (BMIs) to control advanced prostheses such as the modular prosthetic limb (MPL), we are developing a novel system: the Hybrid Augmented Reality Multimodal Operation Neural Integration Environment (HARMONIE). This system utilizes hybrid input, supervisory control, and intelligent robotics to allow users to identify an object (via eye tracking and computer vision) and initiate (via brain-control) a semi-autonomous reach-grasp-and-drop of the object by the MPL. Sequential iterations of HARMONIE were tested in two pilot subjects implanted with electrocorticographic (ECoG) and depth electrodes within motor areas. The subjects performed the complex task in 71.4% (20/28) and 67.7% (21/31) of trials after minimal training. Balanced accuracy for detecting movements was 91.1% and 92.9%, significantly greater than chance accuracies (p < 0.05). After BMI-based initiation, the MPL completed the entire task 100% (one object) and 70% (three objects) of the time. The MPL took approximately 12.2 s for task completion after system improvements implemented for the second subject. Our hybrid-BMI design prevented all but one baseline false positive from initiating the system. The novel approach demonstrated in this proof-of-principle study, using hybrid input, supervisory control, and intelligent robotics, addresses limitations of current BMIs. PMID:24760914

  10. A Qualitative Approach to Mobile Robot Navigation Using RFID

    NASA Astrophysics Data System (ADS)

    Hossain, M.; Rashid, M. M.; Bhuiyan, M. M. I.; Ahmed, S.; Akhtaruzzaman, M.

    2013-12-01

    A Radio Frequency Identification (RFID) system allows automatic identification of items carrying RFID tags using radio waves. Because each RFID tag has a unique identification number, it is also possible to detect the specific region in which the tag lies. Recently, RFID has been widely used in mobile robot navigation, localization, and mapping in both indoor and outdoor environments. This paper presents a navigation strategy for an autonomous mobile robot using a passive RFID system. Conventional approaches, such as landmark-based navigation or dead-reckoning with an excessive number of sensors, are complex to set up for navigation and localization. The proposed method offers lower complexity in the navigation strategy and estimates not only the position but also the orientation of the autonomous robot. In this research, a polar coordinate system is adopted on the navigation surface, where RFID tags are placed in a grid with constant displacements. This paper also presents performance comparisons among various grid architectures through simulation to establish a better navigation solution. In addition, some stationary obstacles are introduced in the navigation environment to verify the viability of the navigation process of the autonomous mobile robot.
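    A highly simplified sketch of position and heading estimation from a grid of passive tags (the tag IDs, coordinates and the centroid heuristic are illustrative assumptions; the paper's method additionally exploits the polar grid layout):

```python
import math

# Hypothetical sketch: each passive tag ID maps to a known grid coordinate;
# the robot's position is taken as the centroid of the tags currently read,
# and its heading from the displacement between successive position estimates.
TAG_POSITIONS = {101: (0.0, 0.0), 102: (0.5, 0.0), 103: (0.0, 0.5), 104: (0.5, 0.5)}

def estimate_position(read_tags):
    pts = [TAG_POSITIONS[t] for t in read_tags if t in TAG_POSITIONS]
    if not pts:
        return None
    return (sum(p[0] for p in pts) / len(pts), sum(p[1] for p in pts) / len(pts))

def estimate_heading(prev_xy, curr_xy):
    return math.atan2(curr_xy[1] - prev_xy[1], curr_xy[0] - prev_xy[0])

p0 = estimate_position([101, 102])
p1 = estimate_position([102, 104])
print(p0, p1, math.degrees(estimate_heading(p0, p1)))
```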

  11. On approximate reasoning and minimal models for the development of robust outdoor vehicle navigation schemes

    SciTech Connect

    Pin, F.G.

    1993-11-01

    Outdoor sensor-based operation of autonomous robots has proved to be an extremely challenging problem, mainly because of the difficulties encountered when attempting to represent the many uncertainties which are always present in the real world. These uncertainties are primarily due to sensor imprecisions and unpredictability of the environment, i.e., lack of full knowledge of the environment characteristics and dynamics. Two basic principles, or philosophies, and their associated methodologies are proposed in an attempt to remedy some of these difficulties. The first principle is based on the concept of "minimal model" for accomplishing given tasks and proposes to utilize only the minimum level of information and precision necessary to accomplish elemental functions of complex tasks. This approach diverges completely from the direction taken by most artificial vision studies which conventionally call for crisp and detailed analysis of every available component in the perception data. The paper will first review the basic concepts of this approach and will discuss its pragmatic feasibility when embodied in a behaviorist framework. The second principle which is proposed deals with implicit representation of uncertainties using Fuzzy Set Theory-based approximations and approximate reasoning, rather than explicit (crisp) representation through calculation and conventional propagation techniques. A framework which merges these principles and approaches is presented, and its application to the problem of sensor-based outdoor navigation of a mobile robot is discussed. Results of navigation experiments with a real car in actual outdoor environments are also discussed to illustrate the feasibility of the overall concept.

  12. Robotics

    NASA Technical Reports Server (NTRS)

    Ambrose, Robert O.

    2007-01-01

    Lunar robotic functions include: 1. Transport of crew and payloads on the surface of the moon; 2. Offloading payloads from a lunar lander; 3. Handling the deployment of surface systems; with 4. Human commanding of these functions from inside a lunar vehicle, habitat, or extravehicular (space walk), with Earth-based supervision. The systems that will perform these functions may not look like robots from science fiction. In fact, robotic functions may be automated trucks, cranes and winches. Use of this equipment prior to the crew's arrival or in the potentially long periods without crews on the surface, will require that these systems be computer controlled machines. The public release of NASA's Exploration plans at the 2nd Space Exploration Conference (Houston, December 2006) included a lunar outpost with as many as four unique mobility chassis designs. The sequence of lander offloading tasks involved as many as ten payloads, each with a unique set of geometry, mass and interface requirements. This plan was refined during a second phase study concluded in August 2007. Among the many improvements to the exploration plan were a reduction in the number of unique mobility chassis designs and a reduction in unique payload specifications. As the lunar surface system payloads have matured, so have the mobility and offloading functional requirements. While the architecture work continues, the community can expect to see functional requirements in the areas of surface mobility, surface handling, and human-systems interaction as follows: Surface Mobility 1. Transport crew on the lunar surface, accelerating construction tasks, expanding the crew's sphere of influence for scientific exploration, and providing a rapid return to an ascent module in an emergency. The crew transport can be with an un-pressurized rover, a small pressurized rover, or a larger mobile habitat. 2. Transport Extra-Vehicular Activity (EVA) equipment and construction payloads. 3. Transport habitats and

  13. Outdoor Integration

    ERIC Educational Resources Information Center

    Tatarchuk, Shawna; Eick, Charles

    2011-01-01

    An outdoor classroom is an exciting way to connect the learning of science to nature and the environment. Many school grounds include gardens, grassy areas, courtyards, and wooded areas. Some even have nearby streams or creeks. These are built-in laboratories for inquiry! In the authors' third-grade classroom, they align and integrate…

  14. Outdoor Environments

    ERIC Educational Resources Information Center

    Tomascoff, Rocky

    2009-01-01

    In this article, the author describes an art project in which students create their own outdoor environments using a tri-wall frame--a triple-layered cardboard, which is very lightweight and strong. Then the students compose a few sentences describing the scene or place.

  15. Outdoor Activities.

    ERIC Educational Resources Information Center

    Minneapolis Independent School District 275, Minn.

    Twenty-four activities suitable for outdoor use by elementary school children are outlined. Activities designed to make children aware of their environment include soil painting, burr collecting, insect and pond water collecting, studies of insect galls and field mice, succession studies, and a model of natural selection using dyed toothpicks. A…

  16. Outdoor Living.

    ERIC Educational Resources Information Center

    Cotter, Kathy

    Course objectives and learning activities are contained in this curriculum guide for a 16-week home economics course which teaches cooking and sewing skills applicable to outdoor living. The course goals include increasing male enrollment in the home economics program, developing students' self-confidence and ability to work in groups, and…

  17. Fault diagnostic system for a mobile robot

    NASA Astrophysics Data System (ADS)

    Nikam, Umesh; Hall, Ernest L.

    1997-09-01

    This paper describes the development of a robot fault diagnosis system (RFDS). Though designed ostensibly for the University of Cincinnati's autonomous, unmanned, mobile robot for a national competition, it has the flexibility to be adapted for industrial applications as well. Using a top-down approach the robot is sub-divided into different functional units, such as the vision guidance system, the ultrasonic obstacle avoidance system, the steering mechanism, the speed control system, the braking system and the power unit. The techniques of potential failure mode and effects analysis (PFMEA) are used to analyze faults, their visible symptoms, and probable causes and remedies. The relationships obtained therefrom are mapped in a database framework. This is then coded in a user-friendly interactive Visual Basic™ program that guides the user to the likely cause(s) of failure through a question-answer format. A provision is made to ensure better accuracy of the system by incorporating historical data on failures as it becomes available. The RFDS thus provides a handy trouble-shooting tool that cuts down the time involved in diagnosing failures in the complex robot consisting of mechanical, electric, electronic and optical systems. This has been of great help in diagnosing failures and ensuring maximum performance from the robot during the contest in the face of pressure of the competition and the outdoor conditions.

  18. Comparison of optical modeling and neural networks for robot guidance

    NASA Astrophysics Data System (ADS)

    Parasnis, Sameer; Velidandla, Sasanka; Hall, Ernest L.; Anand, Sam

    1998-10-01

    A truly autonomous robot must sense its environment and react appropriately. These issues attain greater importance in an outdoor, variable environment. Previous mobile robot perception systems have relied on hand-coded algorithms for processing sensor information. Recent techniques involve the use of artificial neural networks to process sensor data for mobile robot guidance. A comparison of a fuzzy logic control for an AGV and a neural network perception is described in this work. A mobile robot test bed has been constructed using a golf cart base. The test bed has a fuzzy logic controller which uses both vision and obstacle information and provides the steering and speed controls to the robot. A feed-forward neural network is described to guide the robot using vision and range data. Suitable criteria for comparison will be formulated and the hand-coded system compared with a connectionist model. A comparison of the two systems, with performance, efficiency and reliability as the criteria, will be achieved. The significance of this work is that it provides comparative tradeoffs on two important robot guidance methods.

  19. Autonomous system for cross-country navigation

    NASA Astrophysics Data System (ADS)

    Stentz, Anthony; Brumitt, Barry L.; Coulter, R. C.; Kelly, Alonzo

    1993-05-01

    Autonomous cross-country navigation is essential for outdoor robots moving about in unstructured environments. Most existing systems use range sensors to determine the shape of the terrain, plan a trajectory that avoids obstacles, and then drive the trajectory. Performance has been limited by the range and accuracy of sensors, insufficient vehicle-terrain interaction models, and the availability of high-speed computers. As these elements improve, higher-speed navigation on rougher terrain becomes possible. We have developed a software system for autonomous navigation that provides for greater capability. The perception system supports a large braking distance by fusing multiple range images to build a map of the terrain in front of the vehicle. The system identifies range shadows and interpolates undersampled regions to account for rough terrain effects. The motion planner reduces computational complexity by investigating a minimum number of trajectories. Speeds along the trajectory are set to provide for dynamic stability. The entire system was tested in simulation, and a subset of the capability was demonstrated on a real vehicle. Results to date include a continuous 5.1 kilometer run across moderate terrain with obstacles. This paper begins with the applications, prior work, limitations, and current paradigms for autonomous cross-country navigation, and then describes our contribution to the area.

  20. Robotics

    NASA Technical Reports Server (NTRS)

    Rothschild, Lynn J.

    2012-01-01

    Earth's upper atmosphere is an extreme environment: dry, cold, and irradiated. It is unknown whether our aerobiosphere is limited to the transport of life, or whether there exist organisms that grow and reproduce while airborne (aerophiles); the microenvironments of suspended particles may harbor life at otherwise uninhabited altitudes[2]. The existence of aerophiles would significantly expand the range of planets considered candidates for life by, for example, including the cooler clouds of a hot Venus-like planet. The X project is an effort to engineer a robotic exploration and biosampling payload for a comprehensive survey of Earth's aerobiology. While many one-shot samples have been retrieved from above 15 km, their results are primarily qualitative; variations in method confound comparisons, leaving such major gaps in our knowledge of aerobiology as quantification of populations at different strata and relative species counts[1]. These challenges and X's preliminary solutions are explicated below. X's primary balloon payload is undergoing a series of calibrations before beginning flights in Spring 2012. A suborbital launch is currently planned for Summer 2012. A series of ground samples taken in Winter 2011 is being used to establish baseline counts and identify likely background contaminants.

  1. Walking control of small size humanoid robot: HAJIME ROBOT 18

    NASA Astrophysics Data System (ADS)

    Sakamoto, Hajime; Nakatsu, Ryohei

    2007-12-01

    HAJIME ROBOT 18 is a fully autonomous biped robot. It has been developed for RoboCup, which is a worldwide robot soccer competition. A robot needs high mobility to play soccer: high-speed walking and all-directional walking are important for approaching a ball and positioning in front of it. HAJIME ROBOT achieved both types of walking. This paper describes the walking control of the small-size humanoid robot 'HAJIME ROBOT 18' and shows the measurement results for the ZMP (Zero Moment Point). HAJIME ROBOT won the Robotics Society of Japan Award in RoboCup 2005 and in RoboCup 2006 Japan Open.

  2. Non-destructive inspection in industrial equipment using robotic mobile manipulation

    NASA Astrophysics Data System (ADS)

    Maurtua, Iñaki; Susperregi, Loreto; Ansuategui, Ander; Fernández, Ane; Ibarguren, Aitor; Molina, Jorge; Tubio, Carlos; Villasante, Cristobal; Felsch, Torsten; Pérez, Carmen; Rodriguez, Jorge R.; Ghrissi, Meftah

    2016-05-01

    The MAINBOT project has developed service-robot-based applications to autonomously execute inspection tasks in extensive industrial plants, on equipment that is arranged horizontally (using ground robots) or vertically (climbing robots). The industrial objective has been to provide a means of measuring several physical parameters at multiple points with autonomous robots able to navigate and climb structures while handling non-destructive testing sensors. MAINBOT has validated the solutions in two solar thermal plants (cylindrical-parabolic collectors and central tower), which are very demanding from a mobile manipulation point of view, mainly due to their extension (e.g. a 50 MW solar thermal plant covering 400 hectares, with 400,000 mirrors, 180 km of absorber tubes, and a 140 m high tower), the variability of conditions (outdoor, day-night), safety requirements, etc. Once the technology was validated in simulation, the system was deployed in real setups and different validation tests were carried out. In this paper, two of the achievements related to the ground mobile inspection system are presented: (1) autonomous navigation, localization and planning algorithms to manage navigation over large extensions, and (2) non-destructive inspection operations: thermography-based detection algorithms to provide automatic inspection abilities to the robots.

  3. Flip-chip electronic system assembly process and issues for the NanoWalker: a small wireless autonomous instrumented robot

    NASA Astrophysics Data System (ADS)

    Martel, Sylvain M.; Riley, George A.; Merchant, Monisha; Hunter, Ian W.; Lafontaine, Serge

    1999-08-01

    The integration of complex electronic systems onto small-scale robots requires advanced assembly methods. The NanoWalker is an example of such a robot where a large amount of electronics must be embedded in the smallest possible space. To make a space-efficient implementation, electronic chips are mounted using flip chip technology on a pre-bumped flexible printed circuit (FPC). A 3D structure is obtained by mounting the FPC vertically in a triangular fashion above a tripod built with three small piezo-actuated legs used for the walking and rotational motions. Advanced computer aided design systems are used for the design and to generate manufacturing files. Unlike other commercial products such as cellular phones, watches, pagers, cameras, and disk drives that use flip chip technology to achieve the smallest form factor, the assembly process of the NanoWalker is directly dependent on other characteristics of the system. Minimization of coupling noise through proper FPC layout, and die placement within temperature constraints due to the proximity of sensitive instruments, were critical factors. The effect of vibration caused by the piezo-actuators and the weight of each die were also important issues to consider in determining the final placement in order to maintain proper sub-atomic motion behavior.

  4. Cooperative robotics: bringing autonomy to explosive ordnance disposal robots

    NASA Astrophysics Data System (ADS)

    Del Signore, Michael J.; Czop, Andrew; Hacker, Kurt

    2008-04-01

    An ongoing effort within the US Naval EOD Technology Division (NAVEODTECHDIV) is exploring the integration of autonomous robotic technologies onto current and future Explosive Ordnance Disposal (EOD) robot platforms. The Cooperative Robotics program, through the support of the Joint Ground Robotics Enterprise (JGRE), has identified several autonomous robotic technologies useful to the EOD operator, and with the collaboration of academia and industry is in the process of bringing these technologies to EOD robot operators in the field. Initiated in January 2007, the Cooperative Robotics program includes the demonstration of various autonomous technologies to the EOD user community, and the optimization of these technologies for use on small EOD Unmanned Ground Vehicles (UGVs) in relevant environments. Through close interaction with actual EOD operators, these autonomous behaviors will be designed to work within the bounds of current EOD Tactics, Techniques, and Procedures (TTP). This paper will detail the ongoing and future efforts encompassing the Cooperative Robotics program including: technology demonstrations of autonomous robotic capabilities, development of autonomous capability requirements based on user focus groups, optimization of autonomous UGV behaviors to enable use in relevant environments based on current EOD TTP, and finally the transition of these technologies to current and future EOD robotic systems.

  5. Fuzzy Sets in Dynamic Adaptation of Parameters of a Bee Colony Optimization for Controlling the Trajectory of an Autonomous Mobile Robot.

    PubMed

    Amador-Angulo, Leticia; Mendoza, Olivia; Castro, Juan R; Rodríguez-Díaz, Antonio; Melin, Patricia; Castillo, Oscar

    2016-01-01

    A hybrid approach composed of different types of fuzzy systems, such as the Type-1 Fuzzy Logic System (T1FLS), Interval Type-2 Fuzzy Logic System (IT2FLS) and Generalized Type-2 Fuzzy Logic System (GT2FLS), for the dynamic adaptation of the alpha and beta parameters of a Bee Colony Optimization (BCO) algorithm is presented. The objective of the work is to focus on the BCO technique to find the optimal distribution of the membership functions in the design of fuzzy controllers. We use BCO specifically for tuning membership functions of the fuzzy controller for trajectory stability in an autonomous mobile robot. We add two types of perturbations in the model for the Generalized Type-2 Fuzzy Logic System to better analyze its behavior under uncertainty, and this shows better results when compared to the original BCO. We implemented various performance indices: ITAE, IAE, ISE, ITSE, RMSE and MSE to measure the performance of the controller. The experimental results show better performance using GT2FLS than IT2FLS and T1FLS in the dynamic adaptation of the parameters for the BCO algorithm. PMID:27618062
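    For reference, the performance indices named in the abstract can be computed from a sampled tracking-error signal as follows (a generic sketch; the sampling time and the test signal are assumptions, not the paper's experimental data):

```python
import numpy as np

# Standard control-performance indices computed from a sampled error e(t).
def indices(e, dt):
    t = np.arange(len(e)) * dt
    return {
        "IAE":  np.sum(np.abs(e)) * dt,          # integral of absolute error
        "ISE":  np.sum(e**2) * dt,               # integral of squared error
        "ITAE": np.sum(t * np.abs(e)) * dt,      # time-weighted absolute error
        "ITSE": np.sum(t * e**2) * dt,           # time-weighted squared error
        "MSE":  np.mean(e**2),
        "RMSE": np.sqrt(np.mean(e**2)),
    }

error = np.exp(-np.linspace(0, 5, 501)) * 0.2    # a decaying tracking error (assumed)
print(indices(error, dt=0.01))
```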

  6. Intelligent control in mobile robotics: the PANORAMA project

    NASA Astrophysics Data System (ADS)

    Greenway, Phil

    1994-03-01

    The European Community's strategic research initiative in information technology has been in place for seven years. A good example of the pan-European collaborative projects conducted under this initiative is PANORAMA: Perception and Navigation for Autonomous Mobile Robot Applications. This four-and-a-half-year project, completed in October 1993, aimed to prove the feasibility of an autonomous mobile robotic system replacing a human-operated vehicle working outdoors in a partially structured environment. The autonomous control of a mobile rock drilling machine was chosen as a challenging and representative test scenario. This paper presents an overview of intelligent mobile robot control architectures. Goals and objectives of the project are described, together with the makeup of the consortium and the roles of the members within it. The main technical achievements from PANORAMA are then presented, with emphasis given to the problems of realizing intelligent control. In particular, the planning and replanning of a mission, and the corresponding architectural choices and infrastructure required to support the chosen task oriented approach, are discussed. Specific attention is paid to the functional decomposition of the system, and how the requirements for `intelligent control' impact on the organization of the identified system components. Future work and outstanding problems are considered in some concluding remarks.

  7. Planning Flight Paths of Autonomous Aerobots

    NASA Technical Reports Server (NTRS)

    Kulczycki, Eric; Elfes, Alberto; Sharma, Shivanjli

    2009-01-01

    Algorithms for planning flight paths of autonomous aerobots (robotic blimps) to be deployed in scientific exploration of remote planets are undergoing development. These algorithms are also adaptable to terrestrial applications involving robotic submarines as well as aerobots and other autonomous aircraft used to acquire scientific data or to perform surveying or monitoring functions.

  8. Outdoor Education -- Edinburgh

    ERIC Educational Resources Information Center

    Parker, Terry

    1974-01-01

    In Scotland, outdoor education is seen as a combination of outdoor pursuits and environmental studies. The article describes various centres in the Edinburgh area, outdoor education expeditions, and programs, such as mountaineering, rock climbing, orienteering, and canoeing. (KM)

  9. The Outdoor Leader.

    ERIC Educational Resources Information Center

    Loewen, Lily

    1989-01-01

    Discusses some characteristics and skills of the effective outdoor leader: confidence, feeling "at home" in the outdoors, continuing awareness of and curiosity about the outdoors, good communication skills, creative thinking, and team-building abilities. (SV)

  10. Autonomous surveillance for biosecurity.

    PubMed

    Jurdak, Raja; Elfes, Alberto; Kusy, Branislav; Tews, Ashley; Hu, Wen; Hernandez, Emili; Kottege, Navinda; Sikka, Pavan

    2015-04-01

    The global movement of people and goods has increased the risk of biosecurity threats and their potential to incur large economic, social, and environmental costs. Conventional manual biosecurity surveillance methods are limited by their scalability in space and time. This article focuses on autonomous surveillance systems, comprising sensor networks, robots, and intelligent algorithms, and their applicability to biosecurity threats. We discuss the spatial and temporal attributes of autonomous surveillance technologies and map them to three broad categories of biosecurity threat: (i) vector-borne diseases; (ii) plant pests; and (iii) aquatic pests. Our discussion reveals a broad range of opportunities to serve biosecurity needs through autonomous surveillance. PMID:25744760

  11. A positional estimation technique for an autonomous land vehicle in an unstructured environment

    NASA Technical Reports Server (NTRS)

    Talluri, Raj; Aggarwal, J. K.

    1990-01-01

    This paper presents a solution to the positional estimation problem of an autonomous land vehicle navigating in an unstructured mountainous terrain. A Digital Elevation Map (DEM) of the area in which the robot is to navigate is assumed to be given. It is also assumed that the robot is equipped with a camera that can be panned and tilted, and a device to measure the elevation of the robot above the ground surface. No recognizable landmarks are assumed to be present in the environment in which the robot is to navigate. The solution presented makes use of the DEM information, and structures the problem as a heuristic search in the DEM for the possible robot location. The shape and position of the horizon line in the image plane and the known camera geometry of the perspective projection are used as parameters to search the DEM. Various heuristics drawn from the geometric constraints are used to prune the search space significantly. The algorithm is made robust to errors in the imaging process by accounting for the worst-case errors. The approach is tested using DEM data of areas in Colorado and Texas. The method is suitable for use in outdoor mobile robots and planetary rovers.
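    A much-simplified sketch of the underlying idea: render the horizon elevation profile that would be seen from candidate DEM cells and score each against the horizon observed in the image. The profile rendering, search grid and scoring below are illustrative assumptions, not the paper's heuristic search or its pruning rules:

```python
import numpy as np

# Illustrative sketch: ray-march outward from a candidate cell in several
# azimuth directions, keep the maximum elevation angle seen (the horizon),
# and score the candidate by squared difference from the observed profile.
def horizon_profile(dem, cell_size, x, y, z_eye, n_az=36, max_range=50):
    rows, cols = dem.shape
    profile = np.full(n_az, -np.pi / 2)
    for i, az in enumerate(np.linspace(0, 2 * np.pi, n_az, endpoint=False)):
        for r in range(1, max_range):
            cx, cy = int(x + r * np.cos(az)), int(y + r * np.sin(az))
            if not (0 <= cx < cols and 0 <= cy < rows):
                break
            elev = np.arctan2(dem[cy, cx] - z_eye, r * cell_size)
            profile[i] = max(profile[i], elev)
    return profile

def score(candidate_profile, observed_profile):
    return np.sum((candidate_profile - observed_profile) ** 2)

dem = np.random.rand(100, 100) * 50                     # placeholder terrain (metres)
observed = horizon_profile(dem, 30.0, 40, 60, dem[60, 40] + 2)
best = min(((score(horizon_profile(dem, 30.0, x, y, dem[y, x] + 2), observed), (x, y))
            for x in range(30, 50, 5) for y in range(50, 70, 5)))
print(best[1])   # candidate cell whose rendered horizon best matches the observation
```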

  12. Some Outdoor Educators' Experiences of Outdoor Education

    ERIC Educational Resources Information Center

    Gunn, Terry

    2006-01-01

    The phenomenological study presented in this paper attempts to determine, from outdoor educators, what it meant for them to be teaching outdoor education in Victorian secondary schools during 2004. In 1999, Lugg and Martin surveyed Victorian secondary schools to determine the types of outdoor education programs being run, the objectives of those…

  13. Robotic intelligence kernel

    SciTech Connect

    Bruemmer, David J.

    2009-11-17

    A robot platform includes perceptors, locomotors, and a system controller. The system controller executes a robot intelligence kernel (RIK) that includes a multi-level architecture and a dynamic autonomy structure. The multi-level architecture includes a robot behavior level for defining robot behaviors that incorporate robot attributes, and a cognitive level for defining conduct modules that blend an adaptive interaction between predefined decision functions and the robot behaviors. The dynamic autonomy structure is configured for modifying a transaction capacity between an operator intervention and a robot initiative and may include multiple levels with at least a teleoperation mode configured to maximize the operator intervention and minimize the robot initiative and an autonomous mode configured to minimize the operator intervention and maximize the robot initiative. Within the RIK at least the cognitive level includes the dynamic autonomy structure.
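    One way to picture the dynamic autonomy structure is as a discrete scale trading operator intervention against robot initiative. The sketch below is a hypothetical rendering: only the teleoperation and autonomous endpoints come from the description, and the intermediate mode names and the linear initiative mapping are assumptions.

```python
from enum import Enum

# Hypothetical rendering of a dynamic-autonomy scale (intermediate modes and
# the linear mapping are illustrative assumptions).
class AutonomyMode(Enum):
    TELEOPERATION = 0    # operator intervention maximised
    SAFE_TELEOP = 1
    SHARED = 2
    AUTONOMOUS = 3       # robot initiative maximised

def robot_initiative(mode: AutonomyMode) -> float:
    """Fraction of decisions the robot is allowed to take on its own."""
    return mode.value / (len(AutonomyMode) - 1)

for m in AutonomyMode:
    print(m.name, robot_initiative(m))
```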

  14. Learning for autonomous navigation

    NASA Technical Reports Server (NTRS)

    Angelova, Anelia; Howard, Andrew; Matthies, Larry; Tang, Benyang; Turmon, Michael; Mjolsness, Eric

    2005-01-01

    Autonomous off-road navigation of robotic ground vehicles has important applications on Earth and in space exploration. Progress in this domain has been retarded by the limited lookahead range of 3-D sensors and by the difficulty of preprogramming systems to understand the traversability of the wide variety of terrain they can encounter.

  15. Mobile Autonomous Humanoid Assistant

    NASA Technical Reports Server (NTRS)

    Diftler, M. A.; Ambrose, R. O.; Tyree, K. S.; Goza, S. M.; Huber, E. L.

    2004-01-01

    A mobile autonomous humanoid robot is assisting human co-workers at the Johnson Space Center with tool handling tasks. This robot combines the upper body of the National Aeronautics and Space Administration (NASA)/Defense Advanced Research Projects Agency (DARPA) Robonaut system with a Segway (TM) Robotic Mobility Platform yielding a dexterous, maneuverable humanoid perfect for aiding human co-workers in a range of environments. This system uses stereo vision to locate human teammates and tools and a navigation system that uses laser range and vision data to follow humans while avoiding obstacles. Tactile sensors provide information to grasping algorithms for efficient tool exchanges. The autonomous architecture utilizes these pre-programmed skills to form human assistant behaviors. The initial behavior demonstrates a robust capability to assist a human by acquiring a tool from a remotely located individual and then following the human in a cluttered environment with the tool for future use.

  16. Robotic transportation.

    PubMed

    Lob, W S

    1990-09-01

    Mobile robots perform fetch-and-carry tasks autonomously. An intelligent, sensor-equipped mobile robot does not require dedicated pathways or extensive facility modification. In the hospital, mobile robots can be used to carry specimens, pharmaceuticals, meals, etc. between supply centers, patient areas, and laboratories. The HelpMate (Transitions Research Corp.) mobile robot was developed specifically for hospital environments. To reach a desired destination, HelpMate navigates with an on-board computer that continuously polls a suite of sensors, matches the sensor data against a pre-programmed map of the environment, and issues drive commands and path corrections. A sender operates the robot with a user-friendly menu that prompts for payload insertion and desired destination(s). Upon arrival at its selected destination, the robot prompts the recipient for a security code or physical key and awaits acknowledgement of payload removal. In the future, the integration of HelpMate with robot manipulators, test equipment, and central institutional information systems will open new applications in more localized areas and should help overcome difficulties in filling transport staff positions. PMID:2208684

  17. Outdoors classes

    NASA Astrophysics Data System (ADS)

    Szymanska-Markowska, Barbara

    2016-04-01

    Why should students be trapped within the four walls of the classroom when there are so many ways to teach lessons differently? I am not a fan of holding lessons only at school, and for many students staying at school all the time is boring too. So I decided to organize workshops and trips to universities or outdoors. I created KMO (Discoverer's Club for Teenagers) at my school, where students gave me ideas and we started to make them real. I teach at a school where students don't like science, and I try hard to change their point of view about it. That's why I started to take part in different competitions with my students. Last year we measured noise everywhere using applications on a tablet, to convince them that noise is very harmful to the body. We found that the most harmful noise occurred during school breaks, near motorways, and in households. We also showed that the acoustic screens near the motorways didn't protect us from noise: 30 meters behind the screens the noise level was the same as at the motorway. We won the main prize for these measurements. We also received awards for calculating the costs of a car powered by a solar panel. We recorded all the measurements on a computer. This year we decided to write an essay about trees and weather. We went to the forest and looked for cut trees because we wanted to read the age of a tree from its stump. I hadn't known earlier that we could read the weather from a tree's grain. We examined a lot of trees and can say that trees are good carriers of information about weather and natural disasters. I have started studying safety education, and I have many ideas for getting my students interested in this subject, which touches on P.E., physics, and chemistry as well. I hope to use what I learned from the European Space Education Resource Office and the GIFT workshop. I plan to use satellite and space data to teach my students how they can check information about terrorism, floods or other

  18. Autonomous and Autonomic Swarms

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Truszkowski, Walter F.; Rouff, Christopher A.; Sterritt, Roy

    2005-01-01

    A watershed in systems engineering is represented by the advent of swarm-based systems that accomplish missions through cooperative action by a (large) group of autonomous individuals each having simple capabilities and no global knowledge of the group's objective. Such systems, with individuals capable of surviving in hostile environments, pose unprecedented challenges to system developers. Design and testing and verification at much higher levels will be required, together with the corresponding tools, to bring such systems to fruition. Concepts for possible future NASA space exploration missions include autonomous, autonomic swarms. Engineering swarm-based missions begins with understanding autonomy and autonomicity and how to design, test, and verify systems that have those properties and, simultaneously, the capability to accomplish prescribed mission goals. Formal methods-based technologies, both projected and in development, are described in terms of their potential utility to swarm-based system developers.

  19. Aerial Explorers and Robotic Ecosystems

    NASA Technical Reports Server (NTRS)

    Young, Larry A.; Pisanich, Greg

    2004-01-01

    A unique bio-inspired approach to autonomous aerial vehicles, a.k.a. aerial explorer technology, is discussed. The work is focused on defining and studying aerial explorer mission concepts, both as an individual robotic system and as a member of a small robotic "ecosystem." Members of this robotic ecosystem include the aerial explorer, air-deployed sensors and robotic symbiotes, and other assets such as rovers, landers, and orbiters.

  20. Autonomous Exploration for 3D Map Learning

    NASA Astrophysics Data System (ADS)

    Joho, Dominik; Stachniss, Cyrill; Pfaff, Patrick; Burgard, Wolfram

    Autonomous exploration is a frequently addressed problem in the robotics community. This paper presents an approach to mobile robot exploration that takes into account that the robot acts in the three-dimensional space. Our approach can build compact three-dimensional models autonomously and is able to deal with negative obstacles such as abysms. It applies a decision-theoretic framework which considers the uncertainty in the map to evaluate potential actions. Thereby, it trades off the cost of executing an action with the expected information gain taking into account possible sensor measurements. We present experimental results obtained with a real robot and in simulation.
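
    The decision-theoretic trade-off described here, expected information gain weighed against execution cost, can be sketched in a few lines. The snippet below is an illustrative toy (the cell probabilities, the entropy-based gain, and the weight alpha are assumptions, not the authors' formulation):

```python
import math

def expected_information_gain(p_occupied):
    """Total entropy (bits) of the map cells a candidate viewpoint would observe."""
    def h(p):
        return 0.0 if p in (0.0, 1.0) else -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
    return sum(h(p) for p in p_occupied)

def best_action(candidates, alpha=0.1):
    """Pick the candidate viewpoint trading off expected gain against travel cost."""
    return max(candidates, key=lambda a: expected_information_gain(a["cells"]) - alpha * a["cost"])

actions = [
    {"name": "near, mostly known",  "cells": [0.05, 0.90, 0.95], "cost": 2.0},
    {"name": "far, very uncertain", "cells": [0.50, 0.50, 0.45], "cost": 12.0},
]
print(best_action(actions)["name"])  # the uncertain region wins despite the longer drive
```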

  1. Outdoor Environments. Beginnings Workshop.

    ERIC Educational Resources Information Center

    Child Care Information Exchange, 2003

    2003-01-01

    Presents seven articles on outdoor play environments: "Are We Losing Ground?" (Greenman); "Designing and Creating Natural Play Environments for Young Children" (Keeler); "Adventure Playgrounds and Outdoor Safety Issues" (McGinnis); "Trust, the Earth and Children: Birth to Three" (Young); "Outdoor Magic for Family Child Care Providers" (Osborn); "A…

  2. A cognitive robotic system based on the Soar cognitive architecture for mobile robot navigation, search, and mapping missions

    NASA Astrophysics Data System (ADS)

    Hanford, Scott D.

    Most unmanned vehicles used for civilian and military applications are remotely operated or are designed for specific applications. As these vehicles are used to perform more difficult missions or a larger number of missions in remote environments, there will be a great need for these vehicles to behave intelligently and autonomously. Cognitive architectures, computer programs that define mechanisms that are important for modeling and generating domain-independent intelligent behavior, have the potential for generating intelligent and autonomous behavior in unmanned vehicles. The research described in this presentation explored the use of the Soar cognitive architecture for cognitive robotics. The Cognitive Robotic System (CRS) has been developed to integrate software systems for motor control and sensor processing with Soar for unmanned vehicle control. The CRS has been tested using two mobile robot missions: outdoor navigation and search in an indoor environment. The use of the CRS for the outdoor navigation mission demonstrated that a Soar agent could autonomously navigate to a specified location while avoiding obstacles, including cul-de-sacs, with only a minimal amount of knowledge about the environment. While most systems use information from maps or long-range perceptual capabilities to avoid cul-de-sacs, a Soar agent in the CRS was able to recognize when a simple approach to avoiding obstacles was unsuccessful and switch to a different strategy for avoiding complex obstacles. During the indoor search mission, the CRS autonomously and intelligently searches a building for an object of interest and common intersection types. While searching the building, the Soar agent builds a topological map of the environment using information about the intersections the CRS detects. The agent uses this topological model (along with Soar's reasoning, planning, and learning mechanisms) to make intelligent decisions about how to effectively search the building. Once the

  3. Autonomous Science Decisions for Mars Sample Return

    NASA Astrophysics Data System (ADS)

    Roush, T. L.; Gulick, V.; Morris, R.; Gazis, P.; Benedix, G.; Glymour, C.; Ramsey, J.; Pedersen, L.; Ruzon, M.; Buntine, W.; Oliver, J.

    1999-03-01

    Robotic explorers, e.g. rovers, need to make crucial science decisions autonomously that are distinct from control, health, and navigation issues. We have investigated potential tools that can be applied to imaging and near-infrared spectral data.

  4. An intelligent inspection and survey robot

    SciTech Connect

    Byrd, J.; Holland, J.M.

    1994-11-01

    ARIES (Autonomous Robotic Inspection Experimental System) is a semi-autonomous robotic system intended for use in the automatic inspection of stored containers of low-level nuclear waste. This article describes the technology and how it could be used. 3 refs., 3 figs.

  5. Semi autonomous mine detection system

    SciTech Connect

    Douglas Few; Roelof Versteeg; Herman Herman

    2010-04-01

    CMMAD is a risk reduction effort for the AMDS program. As part of CMMAD, multiple instances of semi autonomous robotic mine detection systems were created. Each instance consists of a robotic vehicle equipped with sensors required for navigation and marking, a countermine sensor, and a number of integrated software packages which provide for real time processing of the countermine sensor data as well as integrated control of the robotic vehicle, the sensor actuator and the sensor. These systems were used to investigate critical interest functions (CIF) related to countermine robotic systems. To address the autonomy CIF, the INL-developed RIK was extended to allow for interaction with a mine sensor processing code (MSPC). In limited field testing this system performed well in detecting, marking and avoiding both AT and AP mines. Based on the results of the CMMAD investigation we conclude that autonomous robotic mine detection is feasible. In addition, CMMAD contributed critical technical advances with regard to sensing, data processing and sensor manipulation, which will advance the performance of future fieldable systems. As a result, no substantial technical barriers exist which preclude, from an autonomous robotic perspective, the rapid development and deployment of fieldable systems.

  6. Architecture of autonomous systems

    NASA Technical Reports Server (NTRS)

    Dikshit, Piyush; Guimaraes, Katia; Ramamurthy, Maya; Agrawala, Ashok; Larsen, Ronald L.

    1989-01-01

    Automation of Space Station functions and activities, particularly those involving robotic capabilities with interactive or supervisory human control, is a complex, multi-disciplinary systems design problem. A wide variety of applications using autonomous control can be found in the literature, but none of them seem to address the problem in general. All of them are designed with a specific application in mind. In this report, an abstract model is described which unifies the key concepts underlying the design of automated systems such as those studied by the aerospace contractors. The model has been kept as general as possible. The attempt is to capture all the key components of autonomous systems. With a little effort, it should be possible to map the functions of any specific autonomous system application to the model presented here.

  7. Architecture of autonomous systems

    NASA Technical Reports Server (NTRS)

    Dikshit, Piyush; Guimaraes, Katia; Ramamurthy, Maya; Agrawala, Ashok; Larsen, Ronald L.

    1986-01-01

    Automation of Space Station functions and activities, particularly those involving robotic capabilities with interactive or supervisory human control, is a complex, multi-disciplinary systems design problem. A wide variety of applications using autonomous control can be found in the literature, but none of them seem to address the problem in general. All of them are designed with a specific application in mind. In this report, an abstract model is described which unifies the key concepts underlying the design of automated systems such as those studied by the aerospace contractors. The model has been kept as general as possible. The attempt is to capture all the key components of autonomous systems. With a little effort, it should be possible to map the functions of any specific autonomous system application to the model presented here.

  8. NASA's Robotic Lander Takes Flight

    NASA Video Gallery

    On Wednesday, June 8, the lander prototype managed by the Robotic Lunar Lander Development Project at NASA's Marshall Space Flight Center in Huntsville, Ala., hovered autonomously for 15 seconds at...

  9. Using fuzzy behaviors for the outdoor navigation of a car with low-resolution sensors

    SciTech Connect

    Pin, F.G.; Watanabe, Y.

    1993-12-31

    Vehicle control in a priori unknown, unpredictable, and dynamic environments requires many calculational and reasoning schemes to operate on the basis of very imprecise, incomplete, or unreliable data. For such systems, in which all the uncertainties can not be engineered away, approximate reasoning may provide an alternative to the complexity and computational requirements of conventional uncertainty analysis and propagation techniques. A proposed approach using superposition of elemental fuzzy behaviors to emulate human-like qualitative reasoning schemes is first discussed. We then describe how a previously developed navigation scheme implemented on custom-designed VLSI fuzzy inferencing boards for indoor navigation of a small laboratory-type robot was progressively enhanced to investigate two control modes for driving a car in a priori unknown environments on the basis of sparse and imprecise sensor data. In the first mode, the car navigates fully autonomously, while in the second mode, the system acts as a driver's aid providing the driver with linguistic (fuzzy) commands to turn left or right and speed up or slow down depending on the obstacles perceived by the sensors. Experiments with both modes of control are described in which the system uses only three acoustic range (sonar) sensor channels to perceive the environment. Simulation results as well as indoors and outdoors experiments are presented and discussed to illustrate the feasibility of outdoor navigation using fuzzy behaviors operating on possibly very inaccurate sensor data.

  10. Using fuzzy behaviors for the outdoor navigation of a car with low-resolution sensors

    SciTech Connect

    Pin, F.G.; Watanabe, Y.

    1993-01-01

    Vehicle control in a priori unknown, unpredictable, and dynamic environments requires many calculational and reasoning schemes to operate on the basis of very imprecise, incomplete, or unreliable data. For such systems, in which all the uncertainties can not be engineered away, approximate reasoning may provide an alternative to the complexity and computational requirements of conventional uncertainty analysis and propagation techniques. A proposed approach using superposition of elemental fuzzy behaviors to emulate human-like qualitative reasoning schemes is first discussed. We then describe how a previously developed navigation scheme implemented on custom-designed VLSI fuzzy inferencing boards for indoor navigation of a small laboratory-type robot was progressively enhanced to investigate two control modes for driving a car in a priori unknown environments on the basis of sparse and imprecise sensor data. In the first mode, the car navigates fully autonomously, while in the second mode, the system acts as a driver's aid providing the driver with linguistic (fuzzy) commands to turn left or right and speed up or slow down depending on the obstacles perceived by the sensors. Experiments with both modes of control are described in which the system uses only three acoustic range (sonar) sensor channels to perceive the environment. Simulation results as well as indoors and outdoors experiments are presented and discussed to illustrate the feasibility of outdoor navigation using fuzzy behaviors operating on possibly very inaccurate sensor data.
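
    With only three sonar channels, the elemental fuzzy behaviors described above reduce to a handful of simple rules. The sketch below is a loose illustration (the membership breakpoints, gains, and sign convention are assumptions, not the VLSI rule base of the paper):

```python
def near(d, d_near=0.5, d_far=2.0):
    """Degree to which a sonar range (m) is 'near': 1 at d_near, falling to 0 at d_far."""
    return max(0.0, min(1.0, (d_far - d) / (d_far - d_near)))

def fuzzy_drive(left, center, right):
    """Blend elemental behaviors: avoid-left, avoid-right, slow-for-front, cruise."""
    turn = 0.6 * near(left) - 0.6 * near(right)              # positive = steer right, away from the left obstacle
    speed = 1.0 * (1.0 - near(center)) + 0.2 * near(center)  # slow down when blocked ahead
    return turn, speed

print(fuzzy_drive(left=0.7, center=3.0, right=2.5))  # steers right at full speed
```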

  11. Teleautonomous guidance for mobile robots

    NASA Technical Reports Server (NTRS)

    Borenstein, J.; Koren, Y.

    1990-01-01

    Teleautonomous guidance (TG), a technique for the remote guidance of fast mobile robots, has been developed and implemented. With TG, the mobile robot follows the general direction prescribed by an operator. However, if the robot encounters an obstacle, it autonomously avoids collision with that obstacle while trying to match the prescribed direction as closely as possible. This type of shared control is completely transparent and transfers control between teleoperation and autonomous obstacle avoidance gradually. TG allows the operator to steer vehicles and robots at high speeds and in cluttered environments, even without visual contact. TG is based on the virtual force field (VFF) method, which was developed earlier for autonomous obstacle avoidance. The VFF method is especially suited to the accommodation of inaccurate sensor data (such as that produced by ultrasonic sensors) and sensor fusion, and allows the mobile robot to travel quickly without stopping for obstacles.
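
    The virtual force field idea named here is simple to caricature: the operator's prescribed direction acts as an attractive force while each range reading pushes back in proportion to its proximity. The snippet below is a hedged sketch of that blend (the gains and the inverse-square repulsion are assumptions, not the published VFF formulation):

```python
import math

def vff_heading(target_bearing, sonar, k_att=1.0, k_rep=0.5):
    """Commanded heading (rad) from an attractive target force and repulsive obstacle forces.

    `sonar` maps bearing (rad, robot frame) -> measured range (m).
    """
    fx = k_att * math.cos(target_bearing)
    fy = k_att * math.sin(target_bearing)
    for bearing, rng in sonar.items():
        if rng <= 0.0:
            continue
        mag = k_rep / (rng * rng)          # repulsion grows quickly as obstacles close in
        fx -= mag * math.cos(bearing)
        fy -= mag * math.sin(bearing)
    return math.atan2(fy, fx)

# Operator says "straight ahead"; an obstacle 0.8 m away, slightly to the right, bends the heading left.
print(vff_heading(0.0, {0.3: 0.8, -1.2: 3.5}))  # negative bearing, i.e. a swerve to the left
```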

  12. Robotic Surveying

    SciTech Connect

    Suzy Cantor-McKinney; Michael Kruzic

    2007-03-01

    ZAPATA ENGINEERING challenged our engineers and scientists, which included robotics expertise from Carnegie Mellon University, to design a solution to meet our client's requirements for rapid digital geophysical and radiological data collection of a munitions test range with no down-range personnel. A prime concern of the project was to minimize exposure of personnel to unexploded ordnance and radiation. The field season was limited by extreme heat, cold and snow. Geographical Information System (GIS) tools were used throughout this project to accurately define the limits of mapped areas, build a common mapping platform from various client products, track production progress, allocate resources and relate subsurface geophysical information to geographical features for use in rapidly reacquiring targets for investigation. We were hopeful that our platform could meet the proposed 35 acres per day, towing both a geophysical package and a radiological monitoring trailer. We held our breath and crossed our fingers as the autonomous Speedrower began to crawl across the playa lakebed. We met our proposed production rate, and we averaged just less than 50 acres per 12-hour day using the autonomous platform with a path tracking error of less than +/- 4 inches. Our project team mapped over 1,800 acres in an 8-week (4 days per week) timeframe. The expertise of our partner, Carnegie Mellon University, was recently demonstrated when their two autonomous vehicle entries finished second and third at the 2005 Defense Advanced Research Projects Agency (DARPA) Grand Challenge. 'The Grand Challenge program was established to help foster the development of autonomous vehicle technology that will some day help save the lives of Americans who are protecting our country on the battlefield', said DARPA Grand Challenge Program Manager, Ron Kurjanowicz. Our autonomous remote-controlled vehicle (ARCV) was a modified New Holland 2550 Speedrower retrofitted to allow the machine

  13. Autonomous navigation system and method

    SciTech Connect

    Bruemmer, David J; Few, Douglas A

    2009-09-08

    A robot platform includes perceptors, locomotors, and a system controller, which executes instructions for autonomously navigating a robot. The instructions repeat, on each iteration through an event timing loop, the acts of defining an event horizon based on the robot's current velocity, detecting a range to obstacles around the robot, testing for an event horizon intrusion by determining if any range to the obstacles is within the event horizon, and adjusting rotational and translational velocity of the robot accordingly. If the event horizon intrusion occurs, rotational velocity is modified by a proportion of the current rotational velocity reduced by a proportion of the range to the nearest obstacle and translational velocity is modified by a proportion of the range to the nearest obstacle. If no event horizon intrusion occurs, translational velocity is set as a ratio of a speed factor relative to a maximum speed.
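
    The claimed behavior is concrete enough to sketch directly: the event horizon scales with current speed, and an intrusion slows the robot and adds turn in proportion to how close the nearest obstacle is. The code below is only an illustrative reading of that description (the gains, the horizon formula, and the speed ramp are assumptions, not the patented method):

```python
def adjust_velocities(v_trans, v_rot, ranges, v_max, horizon_gain=1.5, dt=0.1):
    """One pass through an event-horizon style guard loop."""
    event_horizon = horizon_gain * v_trans * dt + 0.3         # lookahead distance (m) grows with speed
    nearest = min(ranges)
    if nearest < event_horizon:
        scale = nearest / event_horizon                       # 0 (touching) .. 1 (at the horizon)
        return v_trans * scale, v_rot + (1.0 - scale) * 0.5   # slow down and turn away harder when closer
    return min(v_max, v_trans + 0.05), v_rot                  # path clear: creep back toward maximum speed

print(adjust_velocities(0.6, 0.0, ranges=[0.25, 1.4, 2.0], v_max=1.0))  # (~0.38 m/s, ~0.18 rad/s)
```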

  14. Examples of design and achievement of vision systems for mobile robotics applications

    NASA Astrophysics Data System (ADS)

    Bonnin, Patrick J.; Cabaret, Laurent; Raulet, Ludovic; Hugel, Vincent; Blazevic, Pierre; M'Sirdi, Nacer K.; Coiffet, Philippe

    2000-10-01

    Our goal is to design and to achieve a multiple purpose vision system for various robotics applications: wheeled robots (like cars for autonomous driving), legged robots (six-legged, four-legged such as SONY's AIBO, and humanoid), and flying robots (to inspect bridges, for example), in various conditions: indoor or outdoor. Considering that the constraints depend on the application, we propose an edge segmentation implemented either in software, or in hardware using CPLDs (ASICs or FPGAs could be used too). After discussing the criteria of our choice, we propose a chain of image processing operators constituting an edge segmentation. Although this chain is quite simple and very fast to perform, results appear satisfactory. We proposed a software implementation of it. Its temporal optimization is based on: its implementation under the pixel data flow programming model, the gathering of local processing when it is possible, the simplification of computations, and the use of fast access data structures. Then, we describe a first dedicated hardware implementation of the first part, which requires 9 CPLDs in this low-cost version. It is technically possible, but more expensive, to implement these algorithms using only a single FPGA.
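
    A minimal software version of such an edge-segmentation chain, smoothing, gradient, magnitude, threshold, fits in a few NumPy lines. This sketch is only indicative of the kind of operator chain discussed (the kernels and threshold are assumptions, not the authors' pipeline):

```python
import numpy as np

def edge_map(img, thresh=0.25):
    """Tiny edge-segmentation chain: separable smoothing, central-difference gradient, threshold."""
    k = np.array([1.0, 2.0, 1.0]) / 4.0
    smoothed = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    smoothed = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, smoothed)
    gx = np.zeros_like(smoothed)
    gy = np.zeros_like(smoothed)
    gx[:, 1:-1] = smoothed[:, 2:] - smoothed[:, :-2]
    gy[1:-1, :] = smoothed[2:, :] - smoothed[:-2, :]
    mag = np.hypot(gx, gy)
    return mag > thresh * mag.max()

img = np.zeros((64, 64))
img[:, 32:] = 1.0                              # vertical step edge
print(edge_map(img).sum(), "edge pixels found")
```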

  15. Rotorcraft and Enabling Robotic Rescue

    NASA Technical Reports Server (NTRS)

    Young, Larry A.

    2010-01-01

    This paper examines some of the issues underlying potential robotic rescue devices (RRD) in the context where autonomous or manned rotorcraft deployment of such robotic systems is a crucial attribute for their success in supporting future disaster relief and emergency response (DRER) missions. As a part of this discussion, work related to proof-of-concept prototyping of two notional RRD systems is summarized.

  16. A fuzzy behaviorist approach to sensor-based robot control

    SciTech Connect

    Pin, F.G.

    1996-05-01

    Sensor-based operation of autonomous robots in unstructured and/or outdoor environments has proven to be an extremely challenging problem, mainly because of the difficulties encountered when attempting to represent the many uncertainties which are always present in the real world. These uncertainties are primarily due to sensor imprecisions and unpredictability of the environment, i.e., lack of full knowledge of the environment characteristics and dynamics. An approach, which we have named the "Fuzzy Behaviorist Approach" (FBA), is proposed in an attempt to remedy some of these difficulties. This approach is based on the representation of the system's uncertainties using Fuzzy Set Theory-based approximations and on the representation of the reasoning and control schemes as sets of elemental behaviors. Using the FBA, a formalism for rule base development and an automated generator of fuzzy rules have been developed. This automated system can automatically construct the set of membership functions corresponding to fuzzy behaviors once these have been expressed in qualitative terms by the user. The system also checks for completeness of the rule base and for non-redundancy of the rules (which has traditionally been a major hurdle in rule base development). Two major conceptual features, the suppression and inhibition mechanisms which allow a dominance between behaviors to be expressed, are discussed in detail. Some experimental results obtained with the automated fuzzy rule generator applied to the domain of sensor-based navigation in a priori unknown environments, using one of our autonomous test-bed robots as well as a real car in outdoor environments, are then reviewed and discussed to illustrate the feasibility of large-scale automatic fuzzy rule generation using the "Fuzzy Behaviorist" concepts.
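
    Two of the ideas above, automatic construction of membership functions from qualitative terms and a completeness check on the rule base, can be mimicked in a short sketch. The snippet is a loose analogy only (the triangular partition and the sampling-based check are assumptions, not the FBA generator):

```python
import numpy as np

def triangular_partition(lo, hi, labels):
    """Evenly spaced triangular membership functions spanning [lo, hi]."""
    centers = np.linspace(lo, hi, len(labels))
    half = centers[1] - centers[0]
    def mu(label, x):
        return max(0.0, 1.0 - abs(x - centers[labels.index(label)]) / half)
    return mu

def rule_base_complete(mu, antecedents, samples):
    """Completeness check: every sampled input must activate at least one rule antecedent."""
    return all(any(mu(a, x) > 0.0 for a in antecedents) for x in samples)

mu = triangular_partition(0.0, 3.0, ["near", "medium", "far"])
print(rule_base_complete(mu, ["near", "far"],                 # 'medium' rule missing -> gap
                         samples=np.linspace(0.0, 3.0, 31)))  # False: the rule base is incomplete
```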

  17. The Outdoor Classroom.

    ERIC Educational Resources Information Center

    Thomas, Dorothy E.

    An Outdoor Classroom to prepare pre-service and in-service teachers to utilize vital natural resources as an outdoor laboratory was established in 1974 by Elizabeth City State University. Because of its proximity to the Great Dismal Swamp and the Atlantic, the university's geographical location made it especially suitable for such a course of…

  18. Outdoor Education, Camp Casey.

    ERIC Educational Resources Information Center

    Stansberry, Steve; And Others

    A curriculum guide for the Camp Casey Outdoor Education program is contained in this document. Designed for fifth grade students and teachers in Northshore School District, Bothell, Washington, it emphasizes learning activities for use in the outdoors. General understandings relevant to the objectives of the program form the frame into which the…

  19. Outdoor Education Manual.

    ERIC Educational Resources Information Center

    Gooyers, Cobina; And Others

    Designed for teachers to provide students with an awareness of the world of nature which surrounds them, the manual presents the philosophy of outdoor education, goals and objectives of the school program, planning for outdoor education, the Wildwood Programs, sequential program planning for students, program booking and resource list. Content…

  20. Effective Thinking Outdoors.

    ERIC Educational Resources Information Center

    Hyde, Rod

    1997-01-01

    Effective Thinking Outdoors (ETO) is an organization that teaches thinking skills and strategies via significant outdoor experiences. Identifies the three elements of thinking as creativity, play, and persistence; presents a graphic depiction of the problem-solving process and aims; and describes an ETO exercise, determining old routes of travel…

  1. Maple Leaf Outdoor Centre.

    ERIC Educational Resources Information Center

    Maguire, Molly; Gunton, Ric

    2000-01-01

    Maple Leaf Outdoor Centre (Ontario) has added year-round outdoor education facilities and programs to help support its summer camp for disadvantaged children. Schools, youth centers, religious groups, and athletic teams conduct their own programs, collaborate with staff, or use staff-developed programs emphasizing adventure education and personal…

  2. Outdoor Classroom Coordinator

    ERIC Educational Resources Information Center

    Keeler, Rusty

    2010-01-01

    Everybody loves the idea of children playing outdoors. Outside, children get to experience the seasons, challenge their minds and bodies, connect with the natural world, and form a special relationship with the planet. But in order for children to get the most of their outdoor time it is important that the environment be prepared by caring adults…

  3. Outdoorsman: Outdoor Cooking.

    ERIC Educational Resources Information Center

    Alberta Dept. of Agriculture, Edmonton.

    This Outdoor Cookery manual provides information and instruction on the basic outdoor skills of building suitable cooking fires, handling fires safely, and storing food. The necessity of having the right kind of fire is stressed (high flames for boiling, low for stewing, and coals for frying and broiling). Tips on gauging temperature, what types…

  4. Outdoor Recreation Management

    ERIC Educational Resources Information Center

    Jubenville, Alan

    The complex problems facing the manager of an outdoor recreation area are outlined and discussed. Eighteen chapters cover the following primary concerns of the manager of such a facility: (1) an overview of the management process; (2) the basic outdoor recreation management model; (3) the problem-solving process; (4) involvement of the public in…

  5. Hunting and Outdoor Education.

    ERIC Educational Resources Information Center

    Matthews, Bruce E.

    1991-01-01

    This article addresses the controversy over including hunting as a part of outdoor education. Historically, figures such as Julian Smith, of the Outdoor Education Project of the 1950's, advocated hunting as a critical element of educating children and youth about care and protection of natural resources. Henry David Thoreau saw hunting experiences…

  6. Fundamentals of Outdoor Enjoyment.

    ERIC Educational Resources Information Center

    Mitchell, Jim; Fear, Gene

    The purpose of this preventive search and rescue teachers guide is to help high school aged youth understand the complexities and priorities necessary to manage a human body in outdoor environments and the value of planning ahead to have on hand the skills and equipment needed for outdoor survival, comfort, and enjoyment. Separate sections present…

  7. Queering Outdoor Education.

    ERIC Educational Resources Information Center

    Russell, Connie; Sarick, Tema; Kennelly, Jackie

    2003-01-01

    Queer pedagogy is rife with possibilities for outdoor educators to challenge the status quo of heterosexism and sexism. From recognizing and addressing the heteronormative assumptions that influence the outdoor classroom, to subverting oppressive gender norms, to noticing the cultural constructs through which we view nature, queer pedagogy can…

  8. Canadian space robotic activities

    NASA Astrophysics Data System (ADS)

    Sallaberger, Christian; Space Plan Task Force, Canadian Space Agency

    The Canadian Space Agency has chosen space robotics as one of its key niche areas, and is currently preparing to deliver the first flight elements for the main robotic system of the international space station. The Mobile Servicing System (MSS) is the Canadian contribution to the international space station. It consists of three main elements. The Space Station Remote Manipulator System (SSRMS) is a 7-metre, 7-dof, robotic arm. The Special Purpose Dextrous Manipulator (SPDM), a smaller 2-metre, 7-dof, robotic arm can be used independently, or attached to the end of the SSRMS. The Mobile Base System (MBS) will be used as a support platform and will also provide power and data links for both the SSRMS and the SPDM. A Space Vision System (SVS) has been tested on Shuttle flights, and is being further developed to enhance the autonomous capabilities of the MSS. The CSA also has a Strategic Technologies in Automation and Robotics Program which is developing new technologies to fulfill future robotic space mission needs. This program is currently developing in industry technological capabilities in the areas of automation of operations, autonomous robotics, vision systems, trajectory planning and object avoidance, tactile and proximity sensors, and ground control of space robots. Within the CSA, a robotic testbed and several research programs are also advancing technologies such as haptic devices, control via head-mounted displays, predictive and preview displays, and the dynamic characterization of robotic arms. Canada is also now developing its next Long Term Space Plan. In this context, a planetary exploration program is being considered, which would utilize Canadian space robotic technologies in this new arena.

  9. Open Issues in Evolutionary Robotics.

    PubMed

    Silva, Fernando; Duarte, Miguel; Correia, Luís; Oliveira, Sancho Moura; Christensen, Anders Lyhne

    2016-01-01

    One of the long-term goals in evolutionary robotics is to be able to automatically synthesize controllers for real autonomous robots based only on a task specification. While a number of studies have shown the applicability of evolutionary robotics techniques for the synthesis of behavioral control, researchers have consistently been faced with a number of issues preventing the widespread adoption of evolutionary robotics for engineering purposes. In this article, we review and discuss the open issues in evolutionary robotics. First, we analyze the benefits and challenges of simulation-based evolution and subsequent deployment of controllers versus evolution on real robotic hardware. Second, we discuss specific evolutionary computation issues that have plagued evolutionary robotics: (1) the bootstrap problem, (2) deception, and (3) the role of genomic encoding and genotype-phenotype mapping in the evolution of controllers for complex tasks. Finally, we address the absence of standard research practices in the field. We also discuss promising avenues of research. Our underlying motivation is the reduction of the current gap between evolutionary robotics and mainstream robotics, and the establishment of evolutionary robotics as a canonical approach for the engineering of autonomous robots. PMID:26581015

  10. ARIES: A mobile robot inspector

    SciTech Connect

    Byrd, J.S.

    1995-12-31

    ARIES (Autonomous Robotic Inspection Experimental System) is a mobile robot inspection system being developed for the Department of Energy (DOE) to survey and inspect drums containing mixed and low-level radioactive waste stored in warehouses at DOE facilities. The drums are typically stacked four high and arranged in rows with three-foot aisle widths. The robot will navigate through the aisles and perform an autonomous inspection operation, typically performed by a human operator. It will make real-time decisions about the condition of the drums, maintain a database of pertinent information about each drum, and generate reports.

  11. Architecture for robot intelligence

    NASA Technical Reports Server (NTRS)

    Peters, II, Richard Alan (Inventor)

    2004-01-01

    An architecture for robot intelligence enables a robot to learn new behaviors and create new behavior sequences autonomously and interact with a dynamically changing environment. Sensory information is mapped onto a Sensory Ego-Sphere (SES) that rapidly identifies important changes in the environment and functions much like short term memory. Behaviors are stored in a DBAM that creates an active map from the robot's current state to a goal state and functions much like long term memory. A dream state converts recent activities stored in the SES and creates or modifies behaviors in the DBAM.
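
    The Sensory Ego-Sphere can be pictured as a short-term store indexed by viewing direction. The toy class below captures just that indexing idea (the coarse azimuth/elevation binning and the time window are assumptions, not the patented SES or DBAM):

```python
import math
import time
from collections import defaultdict

class EgoSphere:
    """Toy short-term memory keyed by coarse viewing-direction bins."""

    def __init__(self, bins=12):
        self.step = 2 * math.pi / bins
        self.cells = defaultdict(list)

    def _key(self, az, el):
        return (int(az // self.step), int(el // self.step))

    def post(self, az, el, percept):
        self.cells[self._key(az, el)].append((time.time(), percept))

    def recent(self, az, el, window=1.0):
        now = time.time()
        return [p for t, p in self.cells[self._key(az, el)] if now - t <= window]

ses = EgoSphere()
ses.post(0.10, 0.0, "red object")
print(ses.recent(0.05, 0.02))   # same direction bin -> ['red object']
```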

  12. Asteroid Exploration with Autonomic Systems

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt; Rash, James; Rouff, Christopher; Hinchey, Mike

    2004-01-01

    NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. The prospective ANTS (Autonomous Nano Technology Swarm) mission comprises autonomous agents including worker agents (small spacecraft) designed to cooperate in asteroid exploration under the overall authority of at least one ruler agent (a larger spacecraft) whose goal is to cause science data to be returned to Earth. The ANTS team (ruler plus workers and messenger agents), but not necessarily any individual on the team, will exhibit behaviors that qualify it as an autonomic system, where an autonomic system is defined as a system that self-reconfigures, self-optimizes, self-heals, and self-protects. Autonomic system concepts lead naturally to realistic, scalable architectures rich in capabilities and behaviors. In-depth consideration of a major mission like ANTS in terms of autonomic systems brings new insights into alternative definitions of autonomic behavior. This paper gives an overview of the ANTS mission and discusses the autonomic properties of the mission.

  13. Exploratorium: Robots.

    ERIC Educational Resources Information Center

    Brand, Judith, Ed.

    2002-01-01

    This issue of Exploratorium Magazine focuses on the topic robotics. It explains how to make a vibrating robotic bug and features articles on robots. Contents include: (1) "Where Robot Mice and Robot Men Run Round in Robot Towns" (Ray Bradbury); (2) "Robots at Work" (Jake Widman); (3) "Make a Vibrating Robotic Bug" (Modesto Tamez); (4) "The Robot…

  14. Robotic surgery

    MedlinePlus

    Robot-assisted surgery; Robotic-assisted laparoscopic surgery; Laparoscopic surgery with robotic assistance ... computer station and directs the movements of a robot. Small surgical tools are attached to the robot's ...

  15. Environmental Outdoor Lighting

    NASA Astrophysics Data System (ADS)

    Nelson, D.

    2004-05-01

    Lighting for the outdoor environment presents challenges not usually found in interior lighting. Outdoors, the universal standard is the daytime sun, yet nighttime electric lighting falls far short of daylight. This presentation provides guidance in dealing with these shortcomings, allowing electric lighting systems to solve multiple needs while being responsive to the need for quality exterior lighting. Main aspects include visual issues, ordinances, sources, energy conservation, structure, softscape and hardscape, roadways and walkways, retail and parking lots, sports, and outdoor hospitality lighting. We will review many of the present recommendations for such lighting applications, ones designed to minimize any adverse effects.

  16. [Robots and intellectual property].

    PubMed

    Larrieu, Jacques

    2013-12-01

    This topic is part of the global issue concerning the necessity to adapt intellectual property law to constant changes in technology. The relationship between robots and IP is dual. On one hand, the robots may be regarded as objects of intellectual property. A robot, like any new machine, could qualify for protection by a patent. A copyright may protect its appearance if it is original. Its memory, like a database, could be covered by a sui generis right. On the other hand, the question of the protection of the outputs of the robot must be raised. The robots, as the physical embodiment of artificial intelligence, are becoming more and more autonomous. Robot-generated works include less and less human input. Are these objects created or invented by a robot copyrightable or patentable? To whom will the ownership of these IP rights be allocated? To the person who manufactured the machine? To the user of the robot? To the robot itself? All these questions are worth discussing. PMID:24558740

  17. A design strategy for autonomous systems

    NASA Technical Reports Server (NTRS)

    Forster, Pete

    1989-01-01

    Some solutions to crucial issues regarding the competent performance of an autonomously operating robot are identified; namely, that of handling multiple and variable data sources containing overlapping information and maintaining coherent operation while responding adequately to changes in the environment. Support for the ideas developed for the construction of such behavior is drawn from speculations in the study of cognitive psychology, an understanding of the behavior of controlled mechanisms, and the development of behavior-based robots in a few robot research laboratories. The validity of these ideas is supported by some simple simulation experiments in the field of mobile robot navigation and guidance.

  18. Optic flow and autonomous navigation.

    PubMed

    Campani, M; Giachetti, A; Torre, V

    1995-01-01

    Many animals, especially insects, compute and use optic flow to control their motion direction and to avoid obstacles. Recent advances in computer vision have shown that an adequate optic flow can be computed from image sequences. Therefore studying whether artificial systems, such as robots, can use optic flow for similar purposes is of particular interest. Experiments are reviewed that suggest the possible use of optic flow for the navigation of a robot moving in indoor and outdoor environments. The optic flow is used to detect and localise obstacles in indoor scenes, such as corridors, offices, and laboratories. These routines are based on the computation of a reduced optic flow. The robot is usually able to avoid large obstacles such as a chair or a person. The avoidance performances of the proposed algorithm critically depend on the optomotor reaction of the robot. The optic flow can be used to understand the ego-motion in outdoor scenes, that is, to obtain information on the absolute velocity of the moving vehicle and to detect the presence of other moving objects. A critical step is the correction of the optic flow for shocks and vibrations present during image acquisition. The results obtained suggest that optic flow can be successfully used by biological and artificial systems to control their navigation. Moreover, both systems require fast and accurate optomotor reactions and need to compensate for the instability of the viewed world. PMID:7617428
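
    As a rough illustration of using optic flow to flag nearby obstacles, the sketch below computes dense flow between two frames and thresholds its magnitude; under forward motion, close objects produce the largest flow. It uses OpenCV's Farneback estimator only as a convenient stand-in for the paper's reduced optic flow, and the synthetic frames and threshold are assumptions:

```python
import numpy as np
import cv2  # OpenCV; dense Farneback flow used here as an illustrative substitute

def looming_mask(prev_gray, next_gray, thresh=2.0):
    """Flag pixels whose flow magnitude exceeds a threshold (likely near obstacles)."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)
    return magnitude > thresh

# Synthetic pair: a bright square shifts 4 px between frames, as a "close" object would.
prev = np.zeros((120, 160), np.uint8)
prev[40:80, 40:80] = 255
nxt = np.zeros_like(prev)
nxt[40:80, 44:84] = 255
print(looming_mask(prev, nxt).sum(), "pixels flagged")
```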

  19. Toward autonomous spacecraft

    NASA Technical Reports Server (NTRS)

    Fogel, L. J.; Calabrese, P. G.; Walsh, M. J.; Owens, A. J.

    1982-01-01

    Ways in which autonomous behavior of spacecraft can be extended to treat situations wherein a closed loop control by a human may not be appropriate or even possible are explored. Predictive models that minimize mean least squared error and arbitrary cost functions are discussed. A methodology for extracting cyclic components for an arbitrary environment with respect to usual and arbitrary criteria is developed. An approach to prediction and control based on evolutionary programming is outlined. A computer program capable of predicting time series is presented. A design of a control system for a robotic device with partially unknown physical properties is presented.

  20. Autonomic neuropathies

    NASA Technical Reports Server (NTRS)

    Low, P. A.

    1998-01-01

    A limited autonomic neuropathy may underlie some unusual clinical syndromes, including the postural tachycardia syndrome, pseudo-obstruction syndrome, heat intolerance, and perhaps chronic fatigue syndrome. Antibodies to autonomic structures are common in diabetes, but their specificity is unknown. The presence of autonomic failure worsens prognosis in the diabetic state. Some autonomic neuropathies are treatable. Familial amyloid polyneuropathy may respond to liver transplantation. There are anecdotal reports of acute panautonomic neuropathy responding to intravenous gamma globulin. Orthostatic hypotension may respond to erythropoietin or midodrine.

  1. Take Your Class Outdoors.

    ERIC Educational Resources Information Center

    Shellenberger, Barbara R.

    1981-01-01

    Offers suggestions for designing outdoor activities to provide students with opportunities for exploring, observing, and discovering. Outlines several science activities for each of the following topics: trees, rocks, soil, insects, wild flowers, grasses, lichens, and clouds. (DS)

  2. Forward Deployed Robotic Unit

    NASA Astrophysics Data System (ADS)

    Brendle, Bruce E., Jr.; Bornstein, Jonathan A.

    2000-07-01

    Forward Deployed Robotic Unit (FDRU) is a core science and technology objective of the US Army, which will demonstrate the impact of autonomous systems on all phases of future land warfare. It will develop, integrate and demonstrate technology required to achieve robotic and fire control capabilities for future land combat vehicles, e.g., Future Combat Systems, using a system of systems approach that culminates in a field demonstration in 2005. It will also provide the required unmanned assets and conduct the demonstration, Battle Lab Warfighting Experiments, and data analysis required to understand the effects of unmanned assets on combat operations. The US Army Tank-Automotive & Armaments Command and the US Army Research Laboratory are teaming in an effort to leverage prior technology achievements in the areas of autonomous mobility, architecture, sensor and robotics system integration; advance the state-of-the-art in these areas; and to provide field demonstration/application of the technologies.

  3. Robot Would Reconfigure Modular Equipment

    NASA Technical Reports Server (NTRS)

    Purves, Lloyd R.

    1993-01-01

    Special-purpose sets of equipment, packaged in identical modules with identical interconnecting mechanisms, attached to or detached from each other by specially designed robot, according to proposal. Two-arm walking robot connects and disconnects modules, operating either autonomously or under remote supervision. Robot walks along row of connected modules by grasping successive attachment subassemblies in hand-over-hand motion. Intended application for facility or station in outer space; robot reconfiguration scheme makes it unnecessary for astronauts to venture outside spacecraft or space station. Concept proves useful on Earth in assembly, disassembly, or reconfiguration of equipment in such hostile environments as underwater, near active volcanoes, or in industrial process streams.

  4. Self-Reconfigurable Robots

    SciTech Connect

    HENSINGER, DAVID M.; JOHNSTON, GABRIEL A.; HINMAN-SWEENEY, ELAINE M.; FEDDEMA, JOHN T.; ESKRIDGE, STEVEN E.

    2002-10-01

    A distributed reconfigurable micro-robotic system is a collection of unlimited numbers of distributed small, homogeneous robots designed to autonomously organize and reorganize in order to achieve mission-specified geometric shapes and functions. This project investigated the design, control, and planning issues for self-configuring and self-organizing robots. In the 2D space a system consisting of two robots was prototyped and successfully displayed automatic docking/undocking to operate dependently or independently. Additional modules were constructed to display the usefulness of a self-configuring system in various situations. In 3D a self-reconfiguring robot system of 4 identical modules was built. Each module connects to its neighbors using rotating actuators. An individual component can move in three dimensions on its neighbors. We have also built a self-reconfiguring robot system consisting of 9-module Crystalline Robot. Each module in this robot is actuated by expansion/contraction. The system is fully distributed, has local communication (to neighbors) capabilities and it has global sensing capabilities.

  5. Mechanochemically Active Soft Robots.

    PubMed

    Gossweiler, Gregory R; Brown, Cameron L; Hewage, Gihan B; Sapiro-Gheiler, Eitan; Trautman, William J; Welshofer, Garrett W; Craig, Stephen L

    2015-10-14

    The functions of soft robotics are intimately tied to their form: channels and voids defined by an elastomeric superstructure that reversibly stores and releases mechanical energy to change shape, grip objects, and achieve complex motions. Here, we demonstrate that covalent polymer mechanochemistry provides a viable mechanism to convert the same mechanical potential energy used for actuation in soft robots into a mechanochromic, covalent chemical response. A bis-alkene functionalized spiropyran (SP) mechanophore is cured into a molded poly(dimethylsiloxane) (PDMS) soft robot walker and gripper. The stresses and strains necessary for SP activation are compatible with soft robot function. The color change associated with actuation suggests opportunities for not only new color changing or camouflaging strategies, but also the possibility for simultaneous activation of latent chemistry (e.g., release of small molecules, change in mechanical properties, activation of catalysts, etc.) in soft robots. In addition, mechanochromic stress mapping in a functional robotic device might provide a useful design and optimization tool, revealing spatial and temporal force evolution within the robot in a way that might be coupled to autonomous feedback loops that allow the robot to regulate its own activity. The demonstration motivates the simultaneous development of new combinations of mechanophores, materials, and soft, active devices for enhanced functionality. PMID:26390078

  6. Robotic systems in orthopaedic surgery.

    PubMed

    Lang, J E; Mannava, S; Floyd, A J; Goddard, M S; Smith, B P; Mofidi, A; Seyler, T M; Jinnah, R H

    2011-10-01

    Robots have been used in surgery since the late 1980s. Orthopaedic surgery began to incorporate robotic technology in 1992, with the introduction of ROBODOC, for the planning and performance of total hip replacement. The use of robotic systems has subsequently increased, with promising short-term radiological outcomes when compared with traditional orthopaedic procedures. Robotic systems can be classified into two categories: autonomous and haptic (or surgeon-guided). Passive surgery systems, which represent a third type of technology, have also been adopted recently by orthopaedic surgeons. While autonomous systems have fallen out of favour, tactile systems with technological improvements have become widely used. Specifically, the use of tactile and passive robotic systems in unicompartmental knee replacement (UKR) has addressed some of the historical mechanisms of failure of non-robotic UKR. These systems assist with increasing the accuracy of the alignment of the components and produce more consistent ligament balance. Short-term improvements in clinical and radiological outcomes have increased the popularity of robot-assisted UKR. Robot-assisted orthopaedic surgery has the potential for improving surgical outcomes. We discuss the different types of robotic systems available for use in orthopaedics and consider the indication, contraindications and limitations of these technologies. PMID:21969424

  7. Industrial robots and robotics

    SciTech Connect

    Kafrissen, S.; Stephens, M.

    1984-01-01

    This book discusses the study of robotics. It provides information on hardware, software, applications and economics. Eleven chapters examine the following: Minicomputers, Microcomputers, and Microprocessors; The Servo-Control System; The Activators; Robot Vision Systems; and Robot Workcell Environments. Twelve appendices supplement the data.

  8. Robotic control in knee joint replacement surgery.

    PubMed

    Davies, B L; Rodriguez y Baena, F M; Barrett, A R W; Gomes, M P S F; Harris, S J; Jakopec, M; Cobb, J P

    2007-01-01

    A brief history of robotic systems in knee arthroplasty is provided. The place of autonomous robots is then discussed and compared to more recent 'hands-on' robotic systems that can be more cost effective. The case is made for robotic systems to have a clear justification, with improved benefits compared to those from cheaper navigation systems. A number of more recent, smaller, robot systems for knee arthroplasty are also described. A specific example is given of an active constraint medical robot, the ACROBOT system, used in a prospective randomized controlled trial of unicondylar robotic knee arthroplasty in which the robot was compared to conventional surgery. The results of the trial are presented together with a discussion of the need for measures of accuracy to be introduced so that the efficacy of the robotic surgery can be immediately identified, rather than have to wait for a number of years before long-term clinical improvements can be demonstrated. PMID:17315770

  9. Types of verbal interaction with instructable robots

    NASA Technical Reports Server (NTRS)

    Crangle, C.; Suppes, P.; Michalowski, S.

    1987-01-01

    An instructable robot is one that accepts instruction in some natural language such as English and uses that instruction to extend its basic repertoire of actions. Such robots are quite different in conception from autonomously intelligent robots, which provide the impetus for much of the research on inference and planning in artificial intelligence. Examined here are the significant problem areas in the design of robots that learn from verbal instruction. Examples are drawn primarily from our earlier work on instructable robots and recent work on the Robotic Aid for the physically disabled. Natural-language understanding by machines is discussed, as are the possibilities and limits of verbal instruction. The core problem of verbal instruction, namely, how to achieve specific concrete action in the robot in response to commands that express general intentions, is considered, as are two major challenges to instructability: achieving appropriate real-time behavior in the robot, and extending the robot's language capabilities.

  10. Laser radar in robotics

    SciTech Connect

    Carmer, D.C.; Peterson, L.M.

    1996-02-01

    In this paper the authors describe the basic operating principles of laser radar sensors and the typical algorithms used to process laser radar imagery for robotic applications. The authors review 12 laser radar sensors to illustrate the variety of systems that have been applied to robotic applications wherein information extracted from the laser radar data is used to automatically control a mechanism or process. Next, they describe selected robotic applications in seven areas: autonomous vehicle navigation, walking machine foot placement, automated service vehicles, manufacturing and inspection, automotive, military, and agriculture. They conclude with a discussion of the status of laser radar technology and suggest trends seen in the application of laser radar sensors to robotics. Many new applications are expected as the maturity level progresses and system costs are reduced.

  11. Evolutionary strategy for achieving autonomous navigation

    NASA Astrophysics Data System (ADS)

    Gage, Douglas W.

    1999-01-01

    An approach is presented for the evolutionary development of supervised autonomous navigation capabilities for small 'backpackable' ground robots, in the context of a DARPA-sponsored program to provide robotic support to small units of dismounted warfighters. This development approach relies on the implementation of a baseline visual servoing navigation capability, including tools to support operator oversight and override, which is then enhanced with semantically referenced commands and a mission scripting structure. As current and future machine perception techniques are able to automatically designate visual servoing goal points, this approach should provide a natural evolutionary pathway to higher levels of autonomous operation and reduced requirements for operator intervention.

  12. Robotic surgery

    MedlinePlus

    Robot-assisted surgery; Robotic-assisted laparoscopic surgery; Laparoscopic surgery with robotic assistance ... Robotic surgery is similar to laparoscopic surgery. It can be performed through smaller cuts than open surgery. ...

  13. Benchmark on outdoor scenes

    NASA Astrophysics Data System (ADS)

    Zhang, Hairong; Wang, Cheng; Chen, Yiping; Jia, Fukai; Li, Jonathan

    2016-03-01

    Depth super-resolution is becoming popular in computer vision, and most test data are based on indoor data sets with ground-truth measurements such as Middlebury. However, indoor data sets are mainly acquired with structured-light techniques under ideal conditions, which cannot represent the real world under natural light. Unlike indoor scenes, the uncontrolled outdoor environment is much more complicated and is rich in both visual and depth texture. For that reason, we develop a more challenging and meaningful outdoor benchmark for depth super-resolution using a state-of-the-art active laser scanning system.

  14. Outdoor PV Degradation Comparison

    SciTech Connect

    Jordan, D. C.; Smith, R. M.; Osterwald, C. R.; Gelak, E.; Kurtz, S. R.

    2011-02-01

    As photovoltaic (PV) penetration of the power grid increases, it becomes vital to know how decreased power output may affect cost over time. In order to predict power delivery, the decline or degradation rates must be determined accurately. At the Performance and Energy Rating Testbed (PERT) at the Outdoor Test Facility (OTF) at the National Renewable Energy Laboratory (NREL), more than 40 modules from more than 10 different manufacturers were compared for their long-term outdoor stability. Because it can accommodate a large variety of modules in a limited footprint, the PERT system is ideally suited to compare modules side-by-side under the same conditions.

  15. Adaptive Behavior for Mobile Robots

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terrance

    2009-01-01

    The term "System for Mobility and Access to Rough Terrain" (SMART) denotes a theoretical framework, a control architecture, and an algorithm that implements the framework and architecture, for enabling a land-mobile robot to adapt to changing conditions. SMART is intended to enable the robot to recognize adverse terrain conditions beyond its optimal operational envelope, and, in response, to intelligently reconfigure itself (e.g., adjust suspension heights or baseline distances between suspension points) or adapt its driving techniques (e.g., engage in a crabbing motion as a switchback technique for ascending steep terrain). Conceived for original application aboard Mars rovers and similar autonomous or semi-autonomous mobile robots used in exploration of remote planets, SMART could also be applied to autonomous terrestrial vehicles to be used for search, rescue, and/or exploration on rough terrain.

  16. Autonomic hyperreflexia

    MedlinePlus

    The most common cause of autonomic hyperreflexia is spinal cord injury. The nervous system of people with this condition ... Flushed (red) skin above the level of the spinal cord injury High blood pressure Slow pulse or fast pulse ...

  18. Door breaching robotic manipulator

    NASA Astrophysics Data System (ADS)

    Schoenfeld, Erik; Parrington, Lawrence; von Muehlen, Stephan

    2008-04-01

    As unmanned systems become more commonplace in military, police, and other security forces, they are tasked to perform missions that the original hardware was not designed for. Current military robots are built for rough outdoor conditions and have strong inflexible manipulators designed to handle a wide range of operations. However, these manipulators are not well suited for some essential indoor tasks, including opening doors. This is a complicated kinematic task that places prohibitively difficult control challenges on the robot and the operator. Honeybee and iRobot have designed a modular door-breaching manipulator that mechanically simplifies the demands upon operator and robot. The manipulator connects to the existing robotic arm of the iRobot PackBot EOD. The gripper is optimized for grasping a variety of door knobs, levers, and car-door handles. It works in conjunction with a compliant wrist and magnetic lock-out mechanism that allows the wrist to remain rigid until the gripper has a firm grasp of the handle and then bend with its rotation and the swing of the door. Once the door is unlatched, the operator simply drives the robot through the doorway while the wrist compensates for the complex, multiple degree-of-freedom motion of the door. Once in the doorway the operator releases the handle, the wrist pops back into place, and the robot is ready for the next door. The new manipulator dramatically improves a robot's ability to non-destructively breach doors and perform an inspection of a room's content, a capability that was previously out of reach of unmanned systems.

  19. Outdoor Recreation, Outdoor Education and the Economy of Scotland.

    ERIC Educational Resources Information Center

    Higgins, Peter

    2000-01-01

    Interviews and a literature review found that outdoor recreation contributes significantly to Scotland's tourist income, particularly in rural areas; outdoor education centers are significant employers in certain rural areas; the provision of outdoor education by secondary schools has decreased in the last 20 years; and therapeutic outdoor…

  20. Interactive autonomy and robotic skills

    NASA Technical Reports Server (NTRS)

    Kellner, A.; Maediger, B.

    1994-01-01

    Current concepts of robot-supported operations for space laboratories (payload servicing, inspection, repair, and ORU exchange) are mainly based on the concept of 'interactive autonomy' which implies autonomous behavior of the robot according to predefined timelines, predefined sequences of elementary robot operations and within predefined world models supplying geometrical and other information for parameter instantiation on the one hand, and the ability to override and change the predefined course of activities by human intervention on the other hand. Although in principle a very powerful and useful concept, in practice the confinement of the robot to the abstract world models and predefined activities appears to reduce the robot's stability within real world uncertainties and its applicability to non-predefined parts of the world, calling for frequent corrective interaction by the operator, which in itself may be tedious and time-consuming. Methods are presented to improve this situation by incorporating 'robotic skills' into the concept of interactive autonomy.

  1. The Dirt on Outdoor Classrooms.

    ERIC Educational Resources Information Center

    Rich, Steve

    2000-01-01

    Explains the planning procedure for outdoor classrooms and introduces an integrated unit on monarch butterflies called the Monarch Watch program. Makes recommendations to solve financial problems of outdoor classrooms. (YDS)

  2. Toward cognitive robotics

    NASA Astrophysics Data System (ADS)

    Laird, John E.

    2009-05-01

    Our long-term goal is to develop autonomous robotic systems that have the cognitive abilities of humans, including communication, coordination, adapting to novel situations, and learning through experience. Our approach rests on the recent integration of the Soar cognitive architecture with both virtual and physical robotic systems. Soar has been used to develop a wide variety of knowledge-rich agents for complex virtual environments, including distributed training environments and interactive computer games. For development and testing in robotic virtual environments, Soar interfaces to a variety of robotic simulators and a simple mobile robot. We have recently made significant extensions to Soar that add new memories and new non-symbolic reasoning to Soar's original symbolic processing, which should significantly improve Soar's abilities for the control of robots. These extensions include episodic memory, semantic memory, reinforcement learning, and mental imagery. Episodic memory and semantic memory support the learning and recalling of prior events and situations as well as facts about the world. Reinforcement learning enables the system to tune its procedural knowledge - knowledge about how to do things. Mental imagery supports the use of diagrammatic and visual representations that are critical for spatial reasoning. We speculate on the future of unmanned systems and the need for cognitive robotics to support dynamic instruction and taskability.

  3. Journey to the Outdoors

    ERIC Educational Resources Information Center

    Boyd, Margaret

    2013-01-01

    A keen personal interest in natural history, involvement in environmental organisations, and experience, first as a secondary biology teacher and later as a field teacher, means that this author has spent many years working outdoors. Any part of the curriculum involving ecological concepts would lead her to open the door and go outside. She…

  4. Outdoor Education. Resource Catalogue.

    ERIC Educational Resources Information Center

    Manitoba Dept. of Education and Training, Winnipeg.

    The material in this catalog has been compiled to serve as a ready reference for teachers to assist them in locating outdoor education materials and obtaining environmental student project assistance available from government departments and private organizations within the province of Manitoba. Part 1 lists agencies that can provide speakers,…

  5. Women in the Outdoors.

    ERIC Educational Resources Information Center

    Johnson, Dale

    1990-01-01

    Women engaging in outdoor activities tend to be more supportive of each other and more willing to express their feelings and apprehensions about adventurous settings than are men. It is important for women to have strong female leaders as role models. Instructors should be aware that women's learning styles and learning curves differ from men's.…

  6. Vehicles for Outdoor Recreation.

    ERIC Educational Resources Information Center

    Exceptional Parent, 1983

    1983-01-01

    The Wheelchair Motorcycle Association tests various motorized vehicles that might help the physically disabled child get about outdoors. Vehicles found to be practical for older children and adolescents include three-wheeled motorcycles and customized go-carts. An address for obtaining more information on the association is provided. (SW)

  7. Your Brain Outdoors

    ERIC Educational Resources Information Center

    MacEachren, Zabe

    2012-01-01

    The way technology influences a person's cognition is seldom recognized, but is of increasing interest among brain researchers. Outdoor educators tend to pay attention to the way different activities offer different perceptions of an environment. When natural spaces can no longer be accessed, they adapt and simulate natural activities in available…

  8. Outdoor Ecology School

    ERIC Educational Resources Information Center

    Cole, Anna Gahl

    2004-01-01

    In this article, the author describes how her high school environmental science students led third graders on a dynamic learning adventure as part of their first annual Outdoor Ecology School. At the water-monitoring site in a nearby national forest, the elementary students conducted field research and scavenger hunts, discovered animal habitats,…

  9. Take Math Outdoors.

    ERIC Educational Resources Information Center

    Schall, William E.

    1984-01-01

    Scavenger hunts, collecting bottle caps, observing shadows, and other outdoor activities can be developed into a mathematics unit that motivates students to acquire basic mathematical skills. A variety of natural ways to collect data are offered to help foster learning. (DF)

  10. Mapping of Outdoor Classrooms.

    ERIC Educational Resources Information Center

    Horvath, Victor G.

    Mapping symbols adopted by the Michigan Department of Natural Resources are presented with their explanations. In an effort to provide standardization and familiarity, teachers and other school people involved in an outdoor education program are encouraged to utilize the same symbols in constructing maps. (DK)

  11. [Science in the Outdoors].

    ERIC Educational Resources Information Center

    Sarage, Joe; And Others

    Designed for instruction of emotionally handicapped children and youth, this resource guide presents science activities and concepts relative to rural and urban outdoor education. Included are 25 different articles, varying from broadly generalized to highly specific concept/activity suggestions which include film and book bibliographies and…

  12. Outdoor Education in Georgia.

    ERIC Educational Resources Information Center

    Georgia State Dept. of Education, Atlanta.

    Providing an overview of the Outdoor Education Workshop provided by the Georgia Migrant Education Program to give migrant students and staff an opportunity to learn new skills which they can then share with other migrant children upon their return to the regular school setting, the paper briefly discusses the administrative steps necessary when…

  13. Outdoor Education in Texas.

    ERIC Educational Resources Information Center

    Myers, Ray H.

    In Dallas in 1970, high school outdoor education began as a cocurricular woods and waters boys' club sponsored by a community sportsman. Within one year, it grew into a fully accredited, coeducational, academic course with a curriculum devoted to the study of wildlife in Texas, ecology, conservation, hunting, firearm safety, fishing, boating and…

  14. Outdoor Unified Studies.

    ERIC Educational Resources Information Center

    Liston, Louise

    Escalante (Utah) High School's outdoor unified studies field trip is a learning experience to be remembered. The four-day camping experience begins with pre-trip plans, pretests, and lecture/introductions to the Anasazi culture and to geologic formations to be visited. Horses (and equipment-carrying trucks) take the students into the desert to set…

  15. Children and the Outdoor Environment

    ERIC Educational Resources Information Center

    Niklasson, Laila; Sandberg, Anette

    2010-01-01

    In this article we will discuss the outdoor environment for younger children with the help of two different concepts. The first concept, affordance, is well known in the discussion about outdoor environments. What the affordance in the outdoor environment is perceived as can differ between actors. How the affordance is used can be another source…

  16. Contemporary Perspectives in Outdoor Education.

    ERIC Educational Resources Information Center

    Lewis, Charles A., Jr., Ed.; Carlson, Marcia K., Ed.

    Designed to provide the student of outdoor education with a synthesis of current literature in the field, this collection presents 26 articles which range from administrative to practical applications of outdoor education theory and philosophy. Articles include discussions of: (1) the philosophy of outdoor education; (2) a London school and its…

  17. Feminist Perspectives on Outdoor Leadership.

    ERIC Educational Resources Information Center

    Henderson, Karla

    Feminist perspectives provide a basis for examining the nature of participation in outdoor experiences, the goals of outdoor leadership, and the meanings associated with the outdoors. Feminism is concerned with the correction of both the invisibility and distortion of female experience in ways relevant to social change and removal of social…

  18. Vision + Community = Outdoor Learning Stations

    ERIC Educational Resources Information Center

    Eick, Charles; Tatarchuk, Shawna; Anderson, Amy

    2013-01-01

    Outdoor learning areas are becoming more popular as a means for community-based, cross-curricular learning where children study issues of local relevance (Sobel 2004). Outdoor learning areas, any place outside of the school building where children can observe and interact with the natural world around them, include outdoor structures for seating…

  19. Outdoor Programmes for Women Only?

    ERIC Educational Resources Information Center

    Nolan, Tammy Leigh; Priest, Simon

    1993-01-01

    Discusses the need for women-only outdoor programs as an alternative to mixed programs, socialization and stereotyping of gender roles and behavior in society, and barriers to outdoor participation for women. Describes some women-only outdoor programs and their benefits and disadvantages. Provides recommendations concerning program design and…

  20. Outdoor Education and Science Achievement

    ERIC Educational Resources Information Center

    Rios, José M.; Brewer, Jessica

    2014-01-01

    Elementary students have limited opportunities to learn science in an outdoor setting at school. Some suggest this is partially due to a lack of teacher efficacy teaching in an outdoor setting. Yet the research literature indicates that outdoor learning experiences develop positive environmental attitudes and can positively affect science…

  1. Outdoor Play and Play Equipment.

    ERIC Educational Resources Information Center

    Naylor, Heather

    1985-01-01

    Discusses aspects of the play environment and its effect on children's play behavior. Indoor and outdoor play spaces are considered along with factors affecting the use of outdoor environments for play. Children's preferences for different outdoor play environments and for various play structures are explored. Guides for choosing play equipment…

  2. JPL Robotics Technology Applicable to Agriculture

    NASA Technical Reports Server (NTRS)

    Udomkesmalee, Suraphol Gabriel; Kyte, L.

    2008-01-01

    This slide presentation describes several technologies developed for robotics that are applicable to agriculture. The technologies discussed are detection of humans to allow safe operations of autonomous vehicles, and vision-guided robotic techniques for shoot selection, separation, and transfer to growth media.

  3. Remote Control and Children's Understanding of Robots

    ERIC Educational Resources Information Center

    Somanader, Mark C.; Saylor, Megan M.; Levin, Daniel T.

    2011-01-01

    Children use goal-directed motion to classify agents as living things from early in infancy. In the current study, we asked whether preschoolers are flexible in their application of this criterion by introducing them to robots that engaged in goal-directed motion. In one case the robot appeared to move fully autonomously, and in the other case it…

  4. Robot and robot system

    NASA Technical Reports Server (NTRS)

    Behar, Alberto E. (Inventor); Marzwell, Neville I. (Inventor); Wall, Jonathan N. (Inventor); Poole, Michael D. (Inventor)

    2011-01-01

    A robot and robot system that are capable of functioning in a zero-gravity environment are provided. The robot can include a body having a longitudinal axis and having a control unit and a power source. The robot can include a first leg pair including a first leg and a second leg. Each leg of the first leg pair can be pivotally attached to the body and constrained to pivot in a first leg pair plane that is substantially perpendicular to the longitudinal axis of the body.

  5. Boudreaux the Robot (a.k.a. EVA Robotic Assistant)

    NASA Technical Reports Server (NTRS)

    Shillcutt, Kimberly; Burridge, Robert; Graham, Jeffrey

    2002-01-01

    The EVA Robotic Assistant is a prototype for an autonomous rover designed to assist human astronauts. The primary focus of the research is to explore the interaction between humans and robots, particularly in extreme environments, and to develop a software infrastructure that could be applied to any type of assistant robot, whether for planetary exploration or orbital missions. This paper describes the background and current status of the project, the types of scenarios addressed in field demonstrations, the hardware and software that comprise the current prototype, and future research plans.

  6. Integrated mobile robot control

    NASA Technical Reports Server (NTRS)

    Amidi, Omead; Thorpe, Charles

    1991-01-01

    This paper describes the structure, implementation, and operation of a real-time mobile robot controller which integrates capabilities such as: position estimation, path specification and tracking, human interfaces, fast communication, and multiple client support. The benefits of such high-level capabilities in a low-level controller was shown by its implementation for the Navlab autonomous vehicle. In addition, performance results from positioning and tracking systems are reported and analyzed.

  7. Science, technology and the future of small autonomous drones.

    PubMed

    Floreano, Dario; Wood, Robert J

    2015-05-28

    We are witnessing the advent of a new era of robots - drones - that can autonomously fly in natural and man-made environments. These robots, often associated with defence applications, could have a major impact on civilian tasks, including transportation, communication, agriculture, disaster mitigation and environment preservation. Autonomous flight in confined spaces presents great scientific and technical challenges owing to the energetic cost of staying airborne and to the perceptual intelligence required to negotiate complex environments. We identify scientific and technological advances that are expected to translate, within appropriate regulatory frameworks, into pervasive use of autonomous drones for civilian applications. PMID:26017445

  8. Science, technology and the future of small autonomous drones

    NASA Astrophysics Data System (ADS)

    Floreano, Dario; Wood, Robert J.

    2015-05-01

    We are witnessing the advent of a new era of robots -- drones -- that can autonomously fly in natural and man-made environments. These robots, often associated with defence applications, could have a major impact on civilian tasks, including transportation, communication, agriculture, disaster mitigation and environment preservation. Autonomous flight in confined spaces presents great scientific and technical challenges owing to the energetic cost of staying airborne and to the perceptual intelligence required to negotiate complex environments. We identify scientific and technological advances that are expected to translate, within appropriate regulatory frameworks, into pervasive use of autonomous drones for civilian applications.

  9. Autonomous Soaring

    NASA Technical Reports Server (NTRS)

    Lin, Victor P.

    2007-01-01

    This viewgraph presentation reviews the autonomous soaring flight of unmanned aerial vehicles (UAV). It reviews energy sources for UAVs, and two examples of UAV's that used alternative energy sources, and thermal currents for soaring. Examples of flight tests, plans, and results are given. Ultimately, the concept of a UAV harvesting energy from the atmosphere has been shown to be feasible with existing technology.

  10. Parallel-distributed mobile robot simulator

    NASA Astrophysics Data System (ADS)

    Okada, Hiroyuki; Sekiguchi, Minoru; Watanabe, Nobuo

    1996-06-01

    The aim of this project is to achieve an autonomous learning and growth function based on active interaction with the real world. It should also be able to autonomously acquire knowledge about the context in which jobs take place, and how the jobs are executed. This article describes a parallel distributed movable robot system simulator with an autonomous learning and growth function. The autonomous learning and growth function that we are proposing is characterized by its ability to learn and grow through interaction with the real world. When the movable robot interacts with the real world, the system compares the virtual environment simulation with the interaction result in the real world. The system then improves the virtual environment to match the real-world result more closely. In this way, the system learns and grows. It is very important that such a simulation is time-realistic. The parallel distributed movable robot simulator was developed to simulate the space of a movable robot system with an autonomous learning and growth function. The simulator constructs a virtual space faithful to the real world and also integrates the interfaces between the user, the actual movable robot and the virtual movable robot. Using an ultrafast CG (computer graphics) system (FUJITSU AG series), time-realistic 3D CG is displayed.
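
    The learning-and-growth loop described above (compare the simulated outcome with the real-world result, then improve the virtual environment) can be sketched as a simple parameter-correction step. The function and parameter names below are illustrative assumptions, not the simulator's actual interface.

```python
# Hedged sketch of the sim-to-real correction loop (illustrative names only):
# after each real interaction, nudge the simulator's parameters so its
# prediction moves toward the observed real-world outcome.
import numpy as np

def update_simulator(sim_params, simulate, real_outcome, action, lr=0.1):
    """simulate(params, action) -> predicted outcome vector (np.ndarray).
    Finite-difference gradient step that reduces the sim-to-real error."""
    error = simulate(sim_params, action) - real_outcome
    grad = np.zeros_like(sim_params)
    for i in range(len(sim_params)):
        eps = np.zeros_like(sim_params)
        eps[i] = 1e-4
        # Sensitivity of the prediction to parameter i (central difference).
        d = (simulate(sim_params + eps, action) - simulate(sim_params - eps, action)) / 2e-4
        grad[i] = 2 * np.dot(error, d)   # gradient of the squared error w.r.t. parameter i
    return sim_params - lr * grad
```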

  11. Robot Lies in Health Care: When Is Deception Morally Permissible?

    PubMed

    Matthias, Andreas

    2015-06-01

    Autonomous robots are increasingly interacting with users who have limited knowledge of robotics and are likely to have an erroneous mental model of the robot's workings, capabilities, and internal structure. The robot's real capabilities may diverge from this mental model to the extent that one might accuse the robot's manufacturer of deceiving the user, especially in cases where the user naturally tends to ascribe exaggerated capabilities to the machine (e.g. conversational systems in elder-care contexts, or toy robots in child care). This poses the question, whether misleading or even actively deceiving the user of an autonomous artifact about the capabilities of the machine is morally bad and why. By analyzing trust, autonomy, and the erosion of trust in communicative acts as consequences of deceptive robot behavior, we formulate four criteria that must be fulfilled in order for robot deception to be morally permissible, and in some cases even morally indicated. PMID:26144538

  12. Robot control with biological cells.

    PubMed

    Tsuda, Soichiro; Zauner, Klaus-Peter; Gunji, Yukio-Pegio

    2007-02-01

    At present there exists a large gap in size, performance, adaptability and robustness between natural and artificial information processors for performing coherent perception-action tasks under real-time constraints. Even the simplest organisms have an enviable capability of coping with an unknown dynamic environment. Robots, in contrast, are still clumsy if confronted with such complexity. This paper presents a bio-hybrid architecture developed for exploring an alternate approach to the control of autonomous robots. Circuits prepared from amoeboid plasmodia of the slime mold Physarum polycephalum are interfaced with an omnidirectional hexapod robot. Sensory signals from the macro-physical environment of the robot are transduced to cellular scale and processed using the unique micro-physical features of intracellular information processing. Conversely, the response from the cellular computation is amplified to yield a macroscopic output action in the environment mediated through the robot's actuators. PMID:17188804

  13. Robotic control and inspection verification

    NASA Technical Reports Server (NTRS)

    Davis, Virgil Leon

    1991-01-01

    Three areas of possible commercialization involving robots at the Kennedy Space Center (KSC) are discussed: a six degree-of-freedom target tracking system for remote umbilical operations; an intelligent torque-sensing end effector for operating hand valves in hazardous locations; and an automatic radiator inspection device, a 13 by 65 foot robotic mechanism involving completely redundant motors, drives, and controls. Aspects concerning the first two innovations can be integrated to enable robots or teleoperators to perform tasks involving orientation and panel actuation operations that can be done with existing technology rather than waiting for telerobots to incorporate artificial intelligence (AI) to perform 'smart' autonomous operations. The third robot involves the application of complete control hardware redundancy to enable performance of work over and near expensive Space Shuttle hardware. The consumer marketplace may wish to explore commercialization of similar component redundancy techniques for applications where a robot would not normally be used because of reliability concerns.

  14. An autonomous satellite architecture integrating deliberative reasoning and behavioural intelligence

    NASA Technical Reports Server (NTRS)

    Lindley, Craig A.

    1993-01-01

    This paper describes a method for the design of autonomous spacecraft, based upon behavioral approaches to intelligent robotics. First, a number of previous spacecraft automation projects are reviewed. A methodology for the design of autonomous spacecraft is then presented, drawing upon both the European Space Agency technological center (ESTEC) automation and robotics methodology and the subsumption architecture for autonomous robots. A layered competency model for autonomous orbital spacecraft is proposed. A simple example of low level competencies and their interaction is presented in order to illustrate the methodology. Finally, the general principles adopted for the control hardware design of the AUSTRALIS-1 spacecraft are described. This system will provide an orbital experimental platform for spacecraft autonomy studies, supporting the exploration of different logical control models, different computational metaphors within the behavioral control framework, and different mappings from the logical control model to its physical implementation.

  15. Active Monte Carlo Localization in Outdoor Terrains Using Multi-level Surface Maps

    NASA Astrophysics Data System (ADS)

    Kümmerle, Rainer; Pfaff, Patrick; Triebel, Rudolph; Burgard, Wolfram

    In this paper we consider the problem of active mobile robot localization with range sensors in outdoor environments. In contrast to passive approaches, our approach actively selects the orientation of the laser range finder to improve the localization results. It applies a particle filter to estimate the full six-dimensional state of the robot. To represent the environment we utilize multi-level surface maps, which allow the robot to represent vertical structures and multiple levels. To efficiently calculate the optimal orientation for the range scanner, we apply a clustering operation on the particles and only evaluate potential orientations based on these clusters. Experimental results obtained with a mobile robot in an outdoor environment indicate that the active control of the range sensor leads to more efficient localization results.
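
    The cluster-then-evaluate strategy summarized above can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the clustering library (scikit-learn) and the expected_info utility function are assumptions supplied by the caller.

```python
# Hedged sketch (not the authors' implementation): cluster particle-filter pose
# hypotheses and score a few candidate laser orientations against the cluster
# centers instead of against every particle. All names are illustrative.
import numpy as np
from sklearn.cluster import KMeans

def select_scan_orientation(particles, weights, candidate_yaws, expected_info):
    """particles: (N, 6) pose samples; weights: (N,) importance weights (np arrays).
    candidate_yaws: sequence of sensor orientations (rad) to evaluate.
    expected_info(cluster_center, yaw) -> scalar utility of scanning in that
    direction from that pose hypothesis (assumed given by the map/sensor model)."""
    # Reduce the particle cloud to a handful of representative clusters.
    k = min(5, len(particles))
    km = KMeans(n_clusters=k, n_init=5).fit(particles, sample_weight=weights)
    centers = km.cluster_centers_
    cluster_w = np.array([weights[km.labels_ == i].sum() for i in range(k)])
    # Pick the orientation with the highest weight-averaged utility.
    scores = [np.dot(cluster_w, [expected_info(c, yaw) for c in centers])
              for yaw in candidate_yaws]
    return candidate_yaws[int(np.argmax(scores))]
```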

  16. Robots Show Us How to Teach Them: Feedback from Robots Shapes Tutoring Behavior during Action Learning

    PubMed Central

    Vollmer, Anna-Lisa; Mühlig, Manuel; Steil, Jochen J.; Pitsch, Karola; Fritsch, Jannik; Rohlfing, Katharina J.; Wrede, Britta

    2014-01-01

    Robot learning by imitation requires the detection of a tutor's action demonstration and its relevant parts. Current approaches implicitly assume a unidirectional transfer of knowledge from tutor to learner. The presented work challenges this predominant assumption based on an extensive user study with an autonomously interacting robot. We show that by providing feedback, a robot learner influences the human tutor's movement demonstrations in the process of action learning. We argue that the robot's feedback strongly shapes how tutors signal what is relevant to an action and thus advocate a paradigm shift in robot action learning research toward truly interactive systems learning in and benefiting from interaction. PMID:24646510

  17. Sensor selection for outdoor air quality monitoring

    NASA Astrophysics Data System (ADS)

    Dorsey, K. L.; Herr, John R.; Pisano, A. P.

    2014-06-01

    Gas chemical monitoring for next-generation robotics applications such as fire fighting, explosive gas detection, ubiquitous urban monitoring, and mine safety requires high-performance, reliable sensors. In this work, we discuss the performance requirements of fixed-location, mobile vehicle, and personal sensor nodes for outdoor air quality sensing. We characterize and compare the performance of a miniature commercial electrochemical and a metal oxide gas sensor and discuss their suitability for environmental monitoring applications. Metal oxide sensors are highly cross-sensitive to factors that affect chemical adsorption (e.g., air speed, pressure) and require careful enclosure design or compensation methods. In contrast, electrochemical sensors are less susceptible to environmental variations, have very low power consumption, and are well matched for mobile air quality monitoring.

  18. Design of a walking robot

    NASA Technical Reports Server (NTRS)

    Whittaker, William; Dowling, Kevin

    1994-01-01

    Carnegie Mellon University's Autonomous Planetary Exploration Program (APEX) is currently building the Daedalus robot; a system capable of performing extended autonomous planetary exploration missions. Extended autonomy is an important capability because the continued exploration of the Moon, Mars and other solid bodies within the solar system will probably be carried out by autonomous robotic systems. There are a number of reasons for this - the most important of which are the high cost of placing a man in space, the high risk associated with human exploration and communication delays that make teleoperation infeasible. The Daedalus robot represents an evolutionary approach to robot mechanism design and software system architecture. Daedalus incorporates key features from a number of predecessor systems. Using previously proven technologies, the Apex project endeavors to encompass all of the capabilities necessary for robust planetary exploration. The Ambler, a six-legged walking machine was developed by CMU for demonstration of technologies required for planetary exploration. In its five years of life, the Ambler project brought major breakthroughs in various areas of robotic technology. Significant progress was made in: mechanism and control, by introducing a novel gait pattern (circulating gait) and use of orthogonal legs; perception, by developing sophisticated algorithms for map building; and planning, by developing and implementing the Task Control Architecture to coordinate tasks and control complex system functions. The APEX project is the successor of the Ambler project.

  19. Design of a walking robot

    NASA Astrophysics Data System (ADS)

    Whittaker, William; Dowling, Kevin

    1994-03-01

    Carnegie Mellon University's Autonomous Planetary Exploration Program (APEX) is currently building the Daedalus robot; a system capable of performing extended autonomous planetary exploration missions. Extended autonomy is an important capability because the continued exploration of the Moon, Mars and other solid bodies within the solar system will probably be carried out by autonomous robotic systems. There are a number of reasons for this - the most important of which are the high cost of placing a man in space, the high risk associated with human exploration and communication delays that make teleoperation infeasible. The Daedalus robot represents an evolutionary approach to robot mechanism design and software system architecture. Daedalus incorporates key features from a number of predecessor systems. Using previously proven technologies, the Apex project endeavors to encompass all of the capabilities necessary for robust planetary exploration. The Ambler, a six-legged walking machine was developed by CMU for demonstration of technologies required for planetary exploration. In its five years of life, the Ambler project brought major breakthroughs in various areas of robotic technology. Significant progress was made in: mechanism and control, by introducing a novel gait pattern (circulating gait) and use of orthogonal legs; perception, by developing sophisticated algorithms for map building; and planning, by developing and implementing the Task Control Architecture to coordinate tasks and control complex system functions. The APEX project is the successor of the Ambler project.

  20. Human Supervision of Robotic Site Surveys

    NASA Astrophysics Data System (ADS)

    Schreckenghost, Debra; Fong, Terrence; Milam, Tod

    2008-01-01

    Ground operators will interact remotely with robots on the lunar surface to support site preparation and survey. Astronauts will interact with robots to support outpost buildup and maintenance, as well as mission operations. One mode of interaction required for such operations is the ability to supervise robots performing routine autonomous tasks. Supervision of autonomous robotic activities requires monitoring the robot's performance of tasks with minimal human effort. This includes understanding its progress on tasks, awareness of when important milestones are achieved or problems impede tasks, and reconstructing situations after the fact by relating task events to recorded data. We are developing a software framework to support such interaction among distributed human teams and robots. We are evaluating our framework for human supervision of mobile robots performing routine site survey operations. We are prototyping a system that (1) monitors data from the K10 robot performing surveys to determine the depth of permafrost at the Haughton Crater on Devon Island, (2) computes performance measures about how well the survey is going, (3) builds summaries of these performance measures, and (4) notifies appropriate personnel when milestones are achieved or performance indicates a problem. We will evaluate our prototype using data collected during Operational Readiness Tests for the Haughton Crater field test to be conducted in July 2007. In this paper we describe our approach for human supervision of robotic activities and report the results of our evaluation with the K10 robot.
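
    The monitor, summarize, and notify pattern described in this abstract can be sketched as a small supervision loop. The telemetry field names, milestone fractions, and notify callback below are hypothetical and are not the project's actual framework.

```python
# Illustrative sketch of the monitor -> summarize -> notify pattern above
# (hypothetical field names and thresholds, not the actual software framework).
def supervise(telemetry_stream, coverage_goal_m2, notify):
    """telemetry_stream yields dicts such as {'covered_m2': float, 'fault': bool}.
    notify(msg) delivers a message to the appropriate personnel."""
    reported = set()
    covered = 0.0
    for sample in telemetry_stream:
        covered = max(covered, sample.get('covered_m2', covered))
        if sample.get('fault'):
            notify(f"Problem: robot fault reported at {covered:.0f} m2 surveyed")
        for frac in (0.25, 0.5, 0.75, 1.0):
            if covered >= frac * coverage_goal_m2 and frac not in reported:
                reported.add(frac)
                notify(f"Milestone: {int(frac * 100)}% of survey area covered")
    return covered  # simple end-of-run summary measure
```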

  1. Indoor and Outdoor Allergies.

    PubMed

    Singh, Madhavi; Hays, Amy

    2016-09-01

    In the last 30 to 40 years there has been a significant increase in the incidence of allergy. This increase cannot be explained by genetic factors alone. Increasing air pollution and its interaction with biological allergens, along with changing lifestyles, are contributing factors. Dust mites, molds, and animal allergens contribute to most of the sensitization in the indoor setting. Tree and grass pollens are the leading allergens in the outdoor setting. Worsening air pollution and increasing particulate matter worsen allergy symptoms and associated morbidity. Cross-sensitization of allergens is common. Treatment involves avoidance of allergens, modifying lifestyle, medical treatment, and immunotherapy. PMID:27545734

  2. Autonomic dysreflexia

    PubMed Central

    Milligan, James; Lee, Joseph; McMillan, Colleen; Klassen, Hilary

    2012-01-01

    Abstract Objective To raise family physicians’ awareness of autonomic dysreflexia (AD) in patients with spinal cord injury (SCI) and to provide some suggestions for intervention. Sources of information MEDLINE was searched from 1970 to July 2011 using the terms autonomic dysreflexia and spinal cord injury with family medicine or primary care. Other relevant guidelines and resources were reviewed and used. Main message Family physicians often lack confidence in treating patients with SCI, see them as complex and time-consuming, and feel undertrained to meet their needs. Family physicians provide a vital component of the health care of such patients, and understanding of the unique medical conditions related to SCI is important. Autonomic dysreflexia is an important, common, and potentially serious condition with which many family physicians are unfamiliar. This article will review the signs and symptoms of AD and offer some acute management options and preventive strategies for family physicians. Conclusion Family physicians should be aware of which patients with SCI are susceptible to AD and monitor those affected by it. Outlined is an approach to acute management. Family physicians play a pivotal role in prevention of AD through education (of the patient and other health care providers) and incorporation of strategies such as appropriate bladder, bowel, and skin care practices and warnings and management plans in the medical chart. PMID:22893332

  3. Autonomous vehicles

    SciTech Connect

    Meyrowitz, A.L.; Blidberg, D.R.; Michelson, R.C. |

    1996-08-01

    There are various kinds of autonomous vehicles (AVs) which can operate with varying levels of autonomy. This paper is concerned with underwater, ground, and aerial vehicles operating in a fully autonomous (nonteleoperated) mode. Further, this paper deals with AVs as a special kind of device, rather than full-scale manned vehicles operating unmanned. The distinction is one in which the AV is likely to be designed for autonomous operation rather than being adapted for it as would be the case for manned vehicles. The authors provide a survey of the technological progress that has been made in AVs, the current research issues and approaches that are continuing that progress, and the applications which motivate this work. It should be noted that issues of control are pervasive regardless of the kind of AV being considered, but that there are special considerations in the design and operation of AVs depending on whether the focus is on vehicles underwater, on the ground, or in the air. The authors have separated the discussion into sections treating each of these categories.

  4. Obstacle detection in range-image sequence for outdoor navigation

    NASA Astrophysics Data System (ADS)

    Garduno, M.; Vachon, Bertrand

    1994-08-01

    We deal with the design of a perception system whose goal is to assist a mobile robot teleoperator by providing pertinent information about possible obstacles appearing in the robot work space. This range-image-based perception system is to be embedded on a vehicle able to move at speeds up to 40 km/h in an outdoor environment. A method taking speed constraints into account is proposed. In the first step of this method, a segmentation algorithm is applied to the first range image scanned by the motionless robot to determine areas of interest. From these areas, distinctive attributes are computed and recorded as symbolic representations of each obstacle region. In the second and following steps, obstacles are localized in images scanned during robot motion. The difference between the actual object position in the range image and its predicted value is used by an extended Kalman filter to correct the estimated robot configuration. A dynamic image segmentation using emergency and security criteria is carried out, and new obstacles can then be detected from the range image and expressed in the robot coordinate system.
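
    The correction step described above, in which the difference between an obstacle's observed and predicted position updates the estimated robot configuration, is a standard extended Kalman filter update. A minimal sketch, with assumed notation rather than the paper's own, is:

```python
# Minimal EKF correction sketch (assumed notation, not the paper's code): the
# innovation is the difference between an obstacle's observed position in the
# range image and the position predicted from the current configuration estimate.
import numpy as np

def ekf_correct(x, P, z_obs, h, H_jac, R):
    """x: robot configuration estimate; P: its covariance.
    z_obs: observed obstacle position (sensor frame).
    h(x): predicts that position from the estimate; H_jac(x): Jacobian of h.
    R: measurement noise covariance."""
    z_pred = h(x)
    H = H_jac(x)
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x + K @ (z_obs - z_pred)  # correct the configuration with the residual
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new
```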

  5. Navigation of a care and welfare robot

    NASA Astrophysics Data System (ADS)

    Yukawa, Toshihiro; Hosoya, Osamu; Saito, Naoki; Okano, Hideharu

    2005-12-01

    In this paper, we propose the development of a robot that can perform nursing tasks in a hospital. In a narrow environment such as a sickroom or a hallway, the robot must be able to move freely in arbitrary directions. Therefore, the robot needs to have high controllability and the capability to make precise movements. Our robot can recognize a line by using cameras, and can be controlled in the reference directions by means of comparison with original cell map information; furthermore, it moves safely on the basis of an original center-line established permanently in the building. Correspondence between the robot and a centralized control center enables the robot's autonomous movement in the hospital. Through a navigation system using cell map information, the robot is able to perform nursing tasks smoothly by changing the camera angle.

  6. Fish-robot interactions in a free-swimming environment: Effects of speed and configuration of robots on live fish

    NASA Astrophysics Data System (ADS)

    Butail, Sachit; Polverino, Giovanni; Phamduy, Paul; Del Sette, Fausto; Porfiri, Maurizio

    2014-03-01

    We explore fish-robot interactions in a comprehensive set of experiments designed to highlight the effects of speed and configuration of bioinspired robots on live zebrafish. The robot design and movement are inspired by salient features of attraction in zebrafish and include enhanced coloration, the aspect ratio of a fertile female, and carangiform/subcarangiform locomotion. The robots are autonomously controlled to swim in circular trajectories in the presence of live fish. Our results indicate that robot configuration significantly affects both the fish distance to the robots and the time spent near them.

  7. Can Robots and Humans Get Along?

    SciTech Connect

    Scholtz, Jean

    2007-06-01

    Now that robots have moved into the mainstream—as vacuum cleaners, lawn mowers, autonomous vehicles, tour guides, and even pets—it is important to consider how everyday people will interact with them. A robot is really just a computer, but many researchers are beginning to understand that human-robot interactions are much different than human-computer interactions. So while the metrics used to evaluate the human-computer interaction (usability of the software interface in terms of time, accuracy, and user satisfaction) may also be appropriate for human-robot interactions, we need to determine whether there are additional metrics that should be considered.

  8. 9 CFR 3.27 - Facilities, outdoor.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Pigs and Hamsters Facilities and Operating Standards § 3.27 Facilities, outdoor. (a) Hamsters shall not be housed in outdoor facilities. (b) Guinea pigs shall not be housed in outdoor facilities...

  9. 9 CFR 3.27 - Facilities, outdoor.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Pigs and Hamsters Facilities and Operating Standards § 3.27 Facilities, outdoor. (a) Hamsters shall not be housed in outdoor facilities. (b) Guinea pigs shall not be housed in outdoor facilities...

  10. Rover: Autonomous concepts for Mars exploration

    NASA Astrophysics Data System (ADS)

    Baiget, A.; Castets, B.; Chochon, H.; Hayard, M.; Lamarre, H.; Lamothe, A.

    1993-01-01

    The development of a mobile, autonomous vehicle that will be launched towards an unknown planet is considered. The rover's significant constraints are: Ariane 5 compatibility, Earth/Mars transfer capability, 1000 km of autonomous travel in the Mars environment, onboard localization, and maximum science capability. Two different types of subsystem were considered: classical subsystems (mechanical and mechanisms, thermal, telecommunications, power, onboard data processing) and robotics subsystems (perception/navigation, autonomous displacement generation, autonomous localization). The needs of each subsystem were studied in terms of energy and data handling capability, in order to choose an onboard architecture that best uses the available capability by means of specialized parts. A compromise must always be made between subsystems in order to meet the real need with respect to the goal, for example between perception/navigation and the motion capability. A compromise must also be found between mechanical assembly and calibration needs, which is a real problem.

  11. Outdoor Education Student Log Book.

    ERIC Educational Resources Information Center

    Garbutt, Barbara; And Others.

    A student log book for outdoor education was developed to aid Oakland County (Michigan) teachers and supervisors of outdoor education in preparing student campers for their role and responsibilities in the total program. A sample letter to sixth graders explains the purpose of the booklet. General camp rules (10) are presented, followed by 6 woods…

  12. Preparing Effective Outdoor Pursuit Leaders.

    ERIC Educational Resources Information Center

    Priest, Simon

    Information related to selecting, training, and certifying outdoor leaders for high adventure pursuits, is provided by selected experts from five English-speaking nations (Great Britain, Australia, New Zealand, Canada and the United States). Patterns of differences and similarities among these nations regarding outdoor leadership components and…

  13. Cultural Adaptation in Outdoor Programming

    ERIC Educational Resources Information Center

    Fabrizio, Sheila M.; Neill, James

    2005-01-01

    Outdoor programs often intentionally provide a different culture and the challenge of working out how to adapt. Failure to adapt, however, can cause symptoms of culture shock, including homesickness, negative personal behavior, and interpersonal conflict. This article links cross-cultural and outdoor programming literature and provides case…

  14. Financing of Private Outdoor Recreation.

    ERIC Educational Resources Information Center

    Bureau of Outdoor Recreation (Dept. of Interior), Washington, DC.

    A survey of financial institutions was undertaken by the Bureau of Outdoor Recreation to evaluate the demand and availability of private credit for enterprises that provide outdoor recreation. The survey provided basic information for (1) evaluating legislative proposals for loan guarantee programs, (2) nationwide planning, and (3) assessing the…

  15. Outdoor Education Areas and Facilities.

    ERIC Educational Resources Information Center

    American Association for Health, Physical Education, and Recreation, Washington, DC.

    The facilities described for outdoor education and camping areas are designed to be an integral part of the large college or university campus, and to serve the educational and recreational programs of the educational institution and the total community. The establishment of an outdoor resident center is followed from the rationale for site…

  16. Cultural Diversity in Outdoor Education

    ERIC Educational Resources Information Center

    Thompson, Graham; Horvath, Erin

    2007-01-01

    At first glance Sioux Lookout is a typical northern Ontario town, situated within an intricate lake and river system, socially focused on year-round outdoor activities, and enveloped by kilometres and more kilometres of undomesticated Canadian Shield landscape. One might think this would be an ideal spot for outdoor education, just as these…

  17. Outdoor Education: Definition and Philosophy.

    ERIC Educational Resources Information Center

    Ford, Phyllis

    Because outdoor education programs occur in every geographic location, are sponsored by all levels of educational institutions, state and local government agencies, and private entrepreneurs, and have no nationally standardized curriculum or measures of competency or knowledge, outdoor education may best be defined as "education in, about, and for…

  18. OBIS: Outdoor Biology Instructional Strategies.

    ERIC Educational Resources Information Center

    Donovan, Edward P.; Richmond, Robert F.

    The Outdoor Biology Instructional Strategies (OBIS) project began in 1972 to enable non-school youth groups (aged 10-15) to gain firsthand experiences in outdoor environments. This descriptive paper explains the program including its purpose and historical background. Specific objectives are to: (1) stimulate curiosity about local environments;…

  19. Outdoor Education Manual. Part II.

    ERIC Educational Resources Information Center

    Colorado City Independent School District, Texas.

    Many of the articles included in this supplement to the original "Outdoor Education Manual" were submitted by classroom teachers in Ontario. Major areas covered are photography in outdoor education, enrichment of the curriculum through use of the school yard, canoe tripping, winter activities, and forestry. Suggested activities in each of these…

  20. Creating Outdoor Play & Learning Environments.

    ERIC Educational Resources Information Center

    White, Randy; Stoecklin, Vicki L.

    Why typical playgrounds are designed the way they are by adults is discussed, including what the ideal outdoor play/learning environment for children is and how the outdoor space should be considered as an extension of the classroom. The paper emphasizes the importance of nature to children, discusses the criteria playground designers should…

  1. Technology Works in the Outdoors

    ERIC Educational Resources Information Center

    Zita, Adam

    2008-01-01

    Technology is all around us and no matter how hard educators promote the value of outdoor and experiential education (OEE) to adults and children alike, they are pulled away by a different reality--one might say, a virtual reality. Even when one is engaged in the outdoors either through a night hike or a stream study, technology is lingering…

  2. Group Cooperation in Outdoor Education

    ERIC Educational Resources Information Center

    Matthews, Bruce E.

    1978-01-01

    Utilizing the Beatles' Yellow Submarine fantasy (e.g., the Blue Meanies), this outdoor education program is designed for sixth graders and special education students. Activities developed at the Cortland Resident Outdoor Education Camp include a series of group stress/challenge activities to be accomplished by everyone in the group, as a group.…

  3. Wilderness Survival and Outdoor Education.

    ERIC Educational Resources Information Center

    Ball, Matt

    Outdoor education is often delivered through games and activities such as nature hikes or observing an ecosystem within a 1-foot circle on the ground. Often, participants look closely at the earth only for that brief moment. Wilderness survival is another way to teach about the outdoors. It offers skills that encourage participants to become more…

  4. Personality Preferences of Outdoor Participants.

    ERIC Educational Resources Information Center

    Cashel, Christine; Montgomery, Diane; Lane, Suzie

    A study investigated the personality type preferences of people who voluntarily chose to participate in a structured, field-based, outdoor education program. The Myers-Briggs Type Indicator (MBTI) was administered to 87 participants prior to beginning a 10-day Wilderness Education Association outdoor leadership trip. Participants were 18-46 years…

  5. Autonomous exploration and mapping of unknown environments

    NASA Astrophysics Data System (ADS)

    Owens, Jason; Osteen, Phil; Fields, MaryAnne

    2012-06-01

    Autonomous exploration and mapping is a vital capability for future robotic systems expected to function in arbitrary complex environments. In this paper, we describe an end-to-end robotic solution for remotely mapping buildings. For a typical mapping system, an unmanned system is directed to enter an unknown building at a distance, sense the internal structure, and, barring additional tasks, while in situ, create a 2-D map of the building. This map provides a useful and intuitive representation of the environment for the remote operator. We have integrated a robust mapping and exploration system utilizing laser range scanners and RGB-D cameras, and we demonstrate an exploration and metacognition algorithm on a robotic platform. The algorithm allows the robot to safely navigate the building, explore the interior, report significant features to the operator, and generate a consistent map - all while maintaining localization.
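
    The abstract does not specify the exploration algorithm, so the sketch below shows a generic frontier-based exploration step on a 2-D occupancy grid as one common way such a system can choose where to go next; it is not the authors' metacognition algorithm.

```python
# Illustrative only: a generic frontier-based exploration step on a 2-D
# occupancy grid (free=0, occupied=1, unknown=-1), not the paper's algorithm.
import numpy as np

def find_frontiers(grid):
    """Return (row, col) free cells that border at least one unknown cell."""
    frontiers = []
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            if grid[r, c] != 0:               # only free cells can be frontiers
                continue
            neighbors = grid[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
            if (neighbors == -1).any():       # touches unexplored space
                frontiers.append((r, c))
    return frontiers

def next_goal(grid, robot_cell):
    """Pick the nearest frontier cell as the next exploration goal."""
    frontiers = find_frontiers(grid)
    if not frontiers:
        return None                           # exploration complete
    dists = [np.hypot(r - robot_cell[0], c - robot_cell[1]) for r, c in frontiers]
    return frontiers[int(np.argmin(dists))]
```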

  6. Knowledge acquisition for autonomous systems

    NASA Technical Reports Server (NTRS)

    Lum, Henry; Heer, Ewald

    1988-01-01

    Knowledge-based capabilities for autonomous aerospace systems, such as the NASA Space Station, must encompass conflict-resolution functions comparable to those of human operators, with all elements of the system working toward system goals in a concurrent, asynchronous-but-coordinated fashion. Knowledge extracted from a design database will support robotic systems by furnishing geometric, structural, and causal descriptions required for repair, disassembly, and assembly. The factual knowledge for these databases will be obtained from a master database through a technical management information system, and it will in many cases have to be augmented by domain-specific heuristic knowledge acquired from domain experts.

  7. Autonomous interplanetary constellation design

    NASA Astrophysics Data System (ADS)

    Chow, Cornelius Channing, II

    According to NASA's integrated space technology roadmaps, space-based infrastructures are envisioned as necessary ingredients to a sustained effort in continuing space exploration. Whether it be for extra-terrestrial habitats, roving/cargo vehicles, or space tourism, autonomous space networks will provide a vital communications lifeline for both future robotic and human missions alike. Projecting that the Moon will be a bustling hub of activity within a few decades, a near-term opportunity for in-situ infrastructure development is within reach. This dissertation addresses the anticipated need for in-space infrastructure by investigating a general design methodology for autonomous interplanetary constellations; to illustrate the theory, this manuscript presents results from an application to the Earth-Moon neighborhood. The constellation design methodology is formulated as an optimization problem, involving a trajectory design step followed by a spacecraft placement sequence. Modeling the dynamics as a restricted 3-body problem, the investigated design space consists of families of periodic orbits which play host to the constellations, punctuated by arrangements of spacecraft autonomously guided by a navigation strategy called LiAISON (Linked Autonomous Interplanetary Satellite Orbit Navigation). Instead of more traditional exhaustive search methods, a numerical continuation approach is implemented to map the admissible configuration space. In particular, Keller's pseudo-arclength technique is used to follow folding/bifurcating solution manifolds, which are otherwise inaccessible with other parameter continuation schemes. A succinct characterization of the underlying structure of the local, as well as global, extrema is thus achievable with little a priori intuition of the solution space. Furthermore, the proposed design methodology offers benefits in computation speed plus the ability to handle mildly stochastic systems. An application of the constellation design
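
    To make the continuation idea concrete, the following is a minimal sketch of Keller's pseudo-arclength method applied to a scalar toy problem f(x, p) = 0. It illustrates how the arclength constraint lets the tracer follow a solution branch through folds in the parameter p; it is not the dissertation's constellation-design code.

```python
# Illustrative sketch of Keller's pseudo-arclength continuation on a scalar
# toy problem f(x, p) = 0 (not the constellation-design implementation).
import numpy as np

def pseudo_arclength(f, dfdx, dfdp, x0, p0, ds=0.05, steps=100):
    """Trace a solution branch of f(x, p) = 0, following folds in p."""
    branch = [(x0, p0)]
    # Initial unit tangent from the implicit function theorem: dx/dp = -f_p / f_x.
    tx, tp = -dfdp(x0, p0) / dfdx(x0, p0), 1.0
    norm = np.hypot(tx, tp)
    tx, tp = tx / norm, tp / norm
    x, p = x0, p0
    for _ in range(steps):
        # Predictor: step along the tangent.
        xg, pg = x + ds * tx, p + ds * tp
        # Corrector: Newton on [f = 0, pseudo-arclength constraint = 0].
        for _ in range(20):
            F = np.array([f(xg, pg),
                          tx * (xg - x) + tp * (pg - p) - ds])
            J = np.array([[dfdx(xg, pg), dfdp(xg, pg)],
                          [tx,           tp]])
            dx, dp = np.linalg.solve(J, -F)
            xg, pg = xg + dx, pg + dp
            if np.hypot(dx, dp) < 1e-10:
                break
        # New tangent is the secant of the last step (keeps branch orientation).
        txn, tpn = xg - x, pg - p
        norm = np.hypot(txn, tpn)
        tx, tp = txn / norm, tpn / norm
        x, p = xg, pg
        branch.append((x, p))
    return branch
```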

  8. Autonomous navigation for structured exterior environments

    SciTech Connect

    Pletta, J B

    1993-12-01

    The Telemanaged Mobile Security Station (TMSS) was developed at Sandia National Laboratories to investigate the role of mobile robotics in exterior perimeter security systems. A major feature of the system is its capability to perform autonomous patrols of the security site's network of roads. Perimeter security sites are well-known, structured environments; the locations of the roads, buildings, and fences are relatively static. A security robot has the advantage of being able to learn its new environment prior to autonomous travel. The TMSS robot combines information from a microwave beacon system and on-board dead reckoning sensors to determine its location within the site. The operator is required to teleoperate the robot in a teach mode over all desired paths before autonomous operations can commence. During this teach phase, TMSS stores points from its position location system at two-meter intervals. This map database is used for planning paths and for reference during path following. Details of the position location and path following systems will be described along with system performance and recommendations for future enhancements.
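
    A minimal sketch of the teach-and-repeat idea described above, assuming a stream of (x, y) positions from the beacon/dead-reckoning localization during the teleoperated teach phase; the two-meter spacing matches the abstract, but the function names and the simple waypoint-advance rule for the repeat phase are illustrative assumptions.

        import math

        TEACH_SPACING_M = 2.0  # the abstract states points are stored at two-meter intervals

        def record_teach_path(pose_stream):
            """Store a waypoint whenever the robot has moved ~2 m since the last one."""
            path = []
            for x, y in pose_stream:
                if not path or math.hypot(x - path[-1][0], y - path[-1][1]) >= TEACH_SPACING_M:
                    path.append((x, y))
            return path

        def advance_waypoint(path, x, y, index, reach_tol_m=1.0):
            """During autonomous repeat, move on to the next stored waypoint once the
            current one has been reached to within reach_tol_m."""
            while index < len(path) - 1 and math.hypot(path[index][0] - x, path[index][1] - y) < reach_tol_m:
                index += 1
            return index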

  9. Autonomous control

    NASA Technical Reports Server (NTRS)

    Brown, Barbara

    1990-01-01

    KSC has been developing the Knowledge-Based Autonomous Test Engineer (KATE), which is a tool for performing automated monitoring, diagnosis, and control of electromechanical devices. KATE employs artificial intelligence computing techniques to perform these functions. The KATE system consists of a generic shell and a knowledge base. The KATE shell is the portion of the system which performs the monitoring, diagnosis, and control functions. It is generic in the sense that it is application independent. This means that the monitoring activity, for instance, will be performed with the same algorithms regardless of the particular physical device being used. The knowledge base is the portion of the system which contains specific functional and behavioral information about the physical device KATE is working with. Work is nearing completion on a project at KSC to interface a Texas Instruments Explorer running a LISP version of KATE with a Generic Checkout System (GCS) test-bed to control a physical simulation of a shuttle tanking system (humorously called the Red Wagon because of its color and mobility). The Autonomous Control System (ACS) project supplements and extends the KATE/GCS project by adding three other major activities: porting KATE from the Texas Instruments Explorer machine to an Intel 80386-based UNIX workstation in the LISP language; rewriting KATE as necessary to run on the same 80386 workstation but in the Ada language; and investigating software and techniques to translate ANSI Standard Common LISP to Mil Standard Ada. Primary goals of this task are as follows: (1) establish the advantages of using expert systems to provide intelligent autonomous software for Space Station Freedom applications; (2) determine the feasibility of using Ada as the run-time environment for model-based expert systems; (3) provide insight into the advantages and disadvantages of using LISP or Ada in the run-time environment for expert systems; and (4
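
    The shell/knowledge-base split described above can be illustrated with a small sketch (this is not KATE code; the function, names, and values are invented for illustration): the monitoring routine is device-independent, and everything specific to the physical device lives in a knowledge base handed to it.

        def monitor(knowledge_base, measurements, tolerance=0.05):
            """Generic monitoring: flag any measurement deviating from the knowledge
            base's expected value by more than `tolerance` (as a fraction)."""
            anomalies = {}
            for name, measured in measurements.items():
                expected = knowledge_base.get(name)
                if expected is None:
                    continue  # the knowledge base has no model for this measurement
                deviation = abs(measured - expected) / (abs(expected) or 1.0)
                if deviation > tolerance:
                    anomalies[name] = (expected, measured)
            return anomalies

        # Hypothetical device-specific knowledge base for a tanking-style setup.
        tanking_kb = {"tank_pressure_psi": 35.0, "fill_rate_kg_s": 2.5}
        print(monitor(tanking_kb, {"tank_pressure_psi": 41.0, "fill_rate_kg_s": 2.4}))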

  10. Autonomous Systems and Robotics: 2000-2004

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This custom bibliography from the NASA Scientific and Technical Information Program lists a sampling of records found in the NASA Aeronautics and Space Database. The scope of this topic includes technologies to monitor, maintain, and where possible, repair complex space systems. This area of focus is one of the enabling technologies as defined by NASA's Report of the President's Commission on Implementation of United States Space Exploration Policy, published in June 2004.

  11. A biologically-inspired autonomous robot

    NASA Astrophysics Data System (ADS)

    Beer, Randall D.

    1993-12-01

    A treadmill has been developed to support our cockroach locomotion studies: a small treadmill with a transparent belt for studying leg joint movements, along with EMGs, as the animal walks or runs at various speeds. This allows us to match the electrical activity in muscles with the kinematics of joint movement. Along with intracellular stimulation studies performed previously, the tools are now in place to make major advances in understanding how the insect's walking movements are actually accomplished.

  12. DEMONSTRATION OF AUTONOMOUS AIR MONITORING THROUGH ROBOTICS

    EPA Science Inventory

    Hazardous and/or tedious functions are often performed by on-site workers during investigation, mitigation and clean-up of hazardous substances. These functions include site surveys, sampling and analysis, excavation, and treatment and preparation of wastes for shipment to chemic...

  13. A feedback-trained autonomous control system for heterogeneous search and rescue applications

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2012-06-01

    Due to the environment in which operation occurs, search and rescue (SAR) applications present a challenge to autonomous systems. A control technique for a heterogeneous multi-robot group is discussed. The proposed methodology is not fully autonomous; however, human operators are freed from most control tasks and allowed to focus on perception tasks while robots execute a collaborative search and identification plan. Robotic control combines a centralized dispatch and learning system (which continuously refines the heuristics used for planning) with local autonomous task ordering (based on existing task priority, proximity, and local conditions). This technique was tested in an environment analogous to SAR from a control perspective.
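
    A minimal sketch of the local task-ordering rule described above, scoring each pending task by its assigned priority and its proximity to the robot; the weights and data layout are assumptions for illustration, not values from the paper.

        import math

        def order_tasks(robot_xy, tasks, w_priority=1.0, w_distance=0.2):
            """Return tasks ordered by a score that rewards high priority and
            penalizes distance from the robot's current position.

            tasks: list of dicts like {"id": 3, "xy": (x, y), "priority": 2.0}.
            """
            def score(task):
                dist = math.hypot(task["xy"][0] - robot_xy[0], task["xy"][1] - robot_xy[1])
                return w_priority * task["priority"] - w_distance * dist

            return sorted(tasks, key=score, reverse=True)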

  14. Robots for Astrobiology!

    NASA Technical Reports Server (NTRS)

    Boston, Penelope J.

    2016-01-01

    The search for life and its study is known as astrobiology. Conducting that search on other planets in our Solar System is a major goal of NASA and other space agencies, and a driving passion of the community of scientists and engineers around the world. We practice for that search in many ways, from exploring and studying extreme environments on Earth, to developing robots to go to other planets and help us look for any possible life that may be there or may have been there in the past. The unique challenges of space exploration make collaborations between robots and humans essential. The products of those collaborations will be novel and driven by the features of wholly new environments. For space and planetary environments that are intolerable for humans or where humans present an unacceptable risk to possible biologically sensitive sites, autonomous robots or telepresence offer excellent choices. The search for life signs on Mars fits within this category, especially in advance of human landed missions there, but also as assistants and tools once humans reach the Red Planet. For planetary destinations where we do not envision humans ever going in person, like bitterly cold icy moons, or ocean worlds with thick ice roofs that essentially make them planetary-sized ice caves, we will rely on robots alone to visit those environments for us and enable us to explore and understand any life that we may find there. Current generation robots are not quite ready for some of the tasks that we need them to do, so there are many opportunities for roboticists of the future to advance novel types of mobility, autonomy, and bio-inspired robotic designs to help us accomplish our astrobiological goals. We see an exciting partnership between robotics and astrobiology continually strengthening as we jointly pursue the quest to find extraterrestrial life.

  15. Virtual reality: an intuitive approach to robotics

    NASA Astrophysics Data System (ADS)

    Natonek, Emerico; Flueckiger, Lorenzo; Zimmerman, Thierry; Baur, Charles

    1995-12-01

    Task definition for manipulators or robotic systems (conventional or mobile) usually suffers from poor performance and is sometimes impossible to carry out. 'On-line' programming methods are often time-consuming or risky for the human operator or the robot itself; 'off-line' techniques, on the other hand, are tedious and complex. In a virtual reality robotics environment (VRRE), users are not asked to write down complicated functions to specify robotic tasks. However, a VRRE is only effective if all environment changes and object movements are fed back to the virtual manipulating system, so some kind of visual or multi-sensor feedback is needed. This paper describes a semi-autonomous robot system composed of an industrial 5-axis robot and its virtual equivalent. The user is immersed in a 3-D space built from models of the robot's environment and interacts directly with the virtual 'components' in an intuitive way, creating trajectories and tasks and dynamically optimizing them. A vision system recognizes the position and orientation of the objects in the real robot workspace and updates the VRRE through a bi-directional communication link. Once the tasks have been optimized in the VRRE, they are sent to the real robot, and a semi-autonomous process ensures their correct execution thanks to a camera mounted directly on the robot's end effector. Errors and drifts due to transmission delays can therefore be processed locally and successfully avoided, and the system can execute the tasks autonomously, independently of small environmental changes that occur during transmission delays. If the environmental changes are too significant, the robot stops, updates the VRRE with the new environmental configuration, and waits for task redesign.

  16. Outdoor heat exchanger section

    SciTech Connect

    Kessler, A.F.; Smiley, W.A. III; Wendt, M.E.

    1988-02-09

    An outdoor section for an air conditioning system is described comprising: a compressor; a heat exchanger; a cabinet having an upper cabinet section, a lower cabinet section and a louvered lower section top cover, the heat exchanger and the compressor being housed in the lower cabinet section and the upper cabinet section having a solid top which overlies the louvers in the lower section top cover; and a fan disposed in the lower cabinet section to draw air through the sides of the lower cabinet section and through the heat exchanger housed therein, the fan discharging air, after having been drawn through the heat exchanger, upward through the louvers in the lower cabinet section top cover and into the interior of the upper cabinet section.

  17. A Landing Platform with Robotic Self-Leveling Capability

    NASA Astrophysics Data System (ADS)

    Buchwald, R.

    2014-06-01

    A robotic concept for the autonomous touchdown, self-leveling and lowering of the landing platform to the ground for a simplified rover egress has been developed and tested using a terrestrial demonstrator of a full scale Mars landing platform.

  18. Control of free-flying space robot manipulator systems

    NASA Technical Reports Server (NTRS)

    Cannon, Robert H., Jr.

    1990-01-01

    New control techniques for self-contained, autonomous free-flying space robots were developed and tested experimentally. Free-flying robots are envisioned as a key element of any successful long-term presence in space. These robots must be capable of performing the assembly, maintenance, inspection, and repair tasks that currently require human extravehicular activity (EVA). A set of research projects was developed and carried out using lab models of satellite robots and a flexible manipulator. The second-generation space robot models use air cushion vehicle (ACV) technology to simulate in 2-D the drag-free, zero-g conditions of space. The current work is divided into five major projects: Global Navigation and Control of a Free Floating Robot, Cooperative Manipulation from a Free Flying Robot, Multiple Robot Cooperation, Thrusterless Robotic Locomotion, and Dynamic Payload Manipulation. These projects are examined in detail.

  19. Robots, systems, and methods for hazard evaluation and visualization

    DOEpatents

    Nielsen, Curtis W.; Bruemmer, David J.; Walton, Miles C.; Hartley, Robert S.; Gertman, David I.; Kinoshita, Robert A.; Whetten, Jonathan

    2013-01-15

    A robot includes a hazard sensor, a locomotor, and a system controller. The robot senses a hazard intensity at a location of the robot, moves to a new location in response to the hazard intensity, and autonomously repeats the sensing and moving to determine multiple hazard levels at multiple locations. The robot may also include a communicator to communicate the multiple hazard levels to a remote controller. The remote controller includes a communicator for sending user commands to the robot and receiving the hazard levels from the robot. A graphical user interface displays a map of the environment proximate the robot and a scale for indicating hazard intensity. A hazard indicator corresponds to a robot position in the environment map and graphically indicates the hazard intensity at the robot position relative to the scale.
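
    A minimal sketch of the sense-move-repeat behavior described in the abstract; the robot and planner interfaces here are illustrative assumptions, not the patented implementation.

        def map_hazard(robot, choose_next, steps=50):
            """Measure hazard intensity at the current location, pick the next
            location in response to the readings gathered so far, and repeat,
            accumulating a position -> intensity map that is reported to the
            remote controller. `robot` is assumed to expose position(),
            sense_hazard(), move_to(xy), and report(levels)."""
            levels = {}
            for _ in range(steps):
                levels[robot.position()] = robot.sense_hazard()
                robot.move_to(choose_next(levels))
            robot.report(levels)
            return levels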

  20. CANINE: a robotic mine dog

    NASA Astrophysics Data System (ADS)

    Stancil, Brian A.; Hyams, Jeffrey; Shelley, Jordan; Babu, Kartik; Badino, Hernán; Bansal, Aayush; Huber, Daniel; Batavia, Parag

    2013-01-01

    Neya Systems, LLC competed in the CANINE program sponsored by the U.S. Army Tank Automotive Research, Development and Engineering Center (TARDEC), which culminated in a competition held at Fort Benning as part of the 2012 Robotics Rodeo. As part of this program, we developed a robot with the capability to learn and recognize the appearance of target objects, conduct an area search amid distractor objects and obstacles, and relocate the target object, in the same way that mine dogs and sentry dogs are used within military contexts for exploration and threat detection. Neya teamed with the Robotics Institute at Carnegie Mellon University to develop vision-based solutions for probabilistic target learning and recognition. In addition, we used a Mission Planning and Management System (MPMS) to orchestrate complex search and retrieval tasks using a general set of modular autonomous services relating to robot mobility, perception, and grasping.