Science.gov

Sample records for outdoor autonomous robots

  1. Vision-based semi-autonomous outdoor robot system to reduce soldier workload

    NASA Astrophysics Data System (ADS)

    Richardson, Al; Rodgers, Michael H.

    2001-09-01

    Sensors and computational capability have not reached the point of enabling small robots to navigate autonomously in unconstrained outdoor environments at tactically useful speeds. This problem is greatly reduced, however, if a soldier can lead the robot through terrain that he knows it can traverse. An application of this concept is a small pack-mule robot that follows a foot soldier over outdoor terrain. The soldier would be responsible for steering the robot clear of situations beyond its limitations. Having learned the route, the robot could autonomously retrace the path carrying supplies and munitions. This would greatly reduce the soldier's workload under normal conditions. This paper presents a description of a developmental robot sensor system using low-cost commercial 3D vision and inertial sensors to address this application. The robot moves at fast walking speed and requires only short-range perception to accomplish its task. 3D-feature information is recorded on a composite route map that the robot uses to negotiate its local environment and retrace the path taught by the soldier leader.
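
    The teach-and-retrace idea described here can be illustrated with a minimal Python sketch (not from the paper; the waypoint spacing, goal tolerance, and steering gain are assumed values): the teach phase stores the leader's positions as sparse waypoints on a route map, and the playback phase steers toward each stored waypoint in turn.

        import math

        class RouteMap:
            """Record a taught route as sparse waypoints, then retrace it (illustrative sketch)."""
            def __init__(self, spacing=1.0, tolerance=0.5):
                self.spacing = spacing        # min distance between stored waypoints (m)
                self.tolerance = tolerance    # distance at which a waypoint counts as reached (m)
                self.waypoints = []

            def record(self, x, y):
                """Teach phase: store the leader's position if far enough from the last waypoint."""
                if not self.waypoints or math.dist(self.waypoints[-1], (x, y)) >= self.spacing:
                    self.waypoints.append((x, y))

            def steering_command(self, x, y, heading, index):
                """Playback phase: return (turn_rate, waypoint_index) steering toward waypoint `index`."""
                if index >= len(self.waypoints):
                    return 0.0, index                      # route finished
                wx, wy = self.waypoints[index]
                if math.dist((x, y), (wx, wy)) < self.tolerance:
                    return self.steering_command(x, y, heading, index + 1)
                desired = math.atan2(wy - y, wx - x)
                error = math.atan2(math.sin(desired - heading), math.cos(desired - heading))
                return 1.5 * error, index                  # proportional heading controller

        # Teach on a demo path, then query one playback steering command.
        route = RouteMap()
        for t in range(50):
            route.record(0.2 * t, 0.1 * t)
        turn, idx = route.steering_command(0.0, 0.0, 0.0, 0)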

  2. Autonomous robot using infrared thermal camera to discriminate objects in outdoor scene

    NASA Technical Reports Server (NTRS)

    Caillas, C.

    1990-01-01

    A complete autonomous legged robot is being designed at Carnegie Mellon University to perform planetary exploration without human supervision. This robot must traverse unknown and geographically diverse areas in order to collect samples of materials. This paper describes how thermal imaging can be used to identify materials in order to find good footfall positions and collection sites of material. First, a model developed for determining the temperature of materials in an outdoor scene is presented. By applying this model, it is shown that it is possible to determine a physical characteristic of the material: thermal inertia. Second, experimental results are described that consist of recording thermal images of an outdoor scene consisting of sand and rock. Third, results and limitations of applying the model to experimental images are analyzed. Finally, the paper analyzes how basic segmentation algorithms can be combined with the thermal inertia segmentation in order to improve the discrimination of different kinds of materials.
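
    A common shortcut for the thermal-inertia idea is the apparent thermal inertia ATI = (1 - albedo) / (T_day - T_night): high-inertia rock shows a smaller day-night temperature swing than low-inertia sand. The sketch below applies that approximation per pixel and thresholds it; the albedo and threshold values are assumptions, and the paper's physical model is more detailed.

        def apparent_thermal_inertia(day_temp, night_temp, albedo=0.3):
            """Approximate ATI = (1 - albedo) / (T_day - T_night) per pixel."""
            ati = []
            for t_day, t_night in zip(day_temp, night_temp):
                delta = max(t_day - t_night, 1e-6)        # avoid division by zero
                ati.append((1.0 - albedo) / delta)
            return ati

        def segment_by_inertia(ati, threshold=0.05):
            """Label each pixel: high inertia (e.g. rock) vs. low inertia (e.g. sand)."""
            return ["rock" if value > threshold else "sand" for value in ati]

        # Two co-registered thermal images flattened to 1-D lists of temperatures (K).
        day = [310.0, 315.0, 305.0, 312.0]
        night = [295.0, 290.0, 300.0, 288.0]
        labels = segment_by_inertia(apparent_thermal_inertia(day, night))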

  3. An adaptive localization system for outdoor/indoor navigation for autonomous robots

    NASA Astrophysics Data System (ADS)

    Pacis, E. B.; Sights, B.; Ahuja, G.; Kogut, G.; Everett, H. R.

    2006-05-01

    Many envisioned applications of mobile robotic systems require the robot to navigate in complex urban environments. This need is particularly critical if the robot is to perform as part of a synergistic team with human forces in military operations. Historically, the development of autonomous navigation for mobile robots has targeted either outdoor or indoor scenarios, but not both, which is not how humans operate. This paper describes efforts to fuse component technologies into a complete navigation system, allowing a robot to seamlessly transition between outdoor and indoor environments. Under the Joint Robotics Program's Technology Transfer project, empirical evaluations of various localization approaches were conducted to assess their maturity levels and performance metrics in different exterior/interior settings. The methodologies compared include Markov localization, global positioning system, Kalman filtering, and fuzzy logic. Characterization of these technologies highlighted their best features, which were then fused into an adaptive solution. The final integrated system is described, including its design, experimental results, and a formal demonstration to attendees of the Unmanned Systems Capabilities Conference II in San Diego in December 2005.
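
    Of the localization methods compared, Kalman filtering is the easiest to show in a few lines. The sketch below is a generic one-dimensional filter that dead-reckons on wheel odometry and folds in GPS fixes whenever they are available (e.g., outdoors); it is not the adaptive system described in the paper, and all noise parameters are invented.

        def kalman_step(x, p, odom_delta, gps=None, q=0.05, r=4.0):
            """One predict/update cycle for a 1-D position estimate.

            x, p       : current estimate and variance
            odom_delta : distance travelled since the last step (wheel odometry)
            gps        : absolute position fix, or None when indoors / no signal
            q, r       : process and measurement noise variances (assumed values)
            """
            # Predict: dead-reckon with odometry; uncertainty grows.
            x, p = x + odom_delta, p + q
            # Update: fold in the GPS fix when one is available.
            if gps is not None:
                k = p / (p + r)          # Kalman gain
                x, p = x + k * (gps - x), (1.0 - k) * p
            return x, p

        x, p = 0.0, 1.0
        for step, fix in enumerate([10.2, 20.1, None, None, 51.0]):  # GPS lost on steps 2-3
            x, p = kalman_step(x, p, odom_delta=10.0, gps=fix)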

  4. Robotic Lander Completes Multiple Outdoor Flights

    NASA Video Gallery

    NASA’s Robotic Lander Development Project in Huntsville, Ala., has successfully completed seven autonomous outdoor flight tests of a lander prototype, dubbed Mighty Eagle. On Oct. 14, Mighty Eagl...

  5. Autonomous mobile robot teams

    NASA Technical Reports Server (NTRS)

    Agah, Arvin; Bekey, George A.

    1994-01-01

    This paper describes autonomous mobile robot teams performing tasks in unstructured environments. The behavior and the intelligence of the group is distributed, and the system does not include a central command base or leader. The novel concept of the Tropism-Based Cognitive Architecture is introduced, which the robots use to transform their sensory information into appropriate actions. The results of a number of simulation experiments are presented. These experiments include worlds where the robot teams must locate, decompose, and gather objects, and defend themselves against hostile predators, while navigating around stationary and mobile obstacles.
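
    To give a flavour of tropism-based action selection, the toy sketch below weights candidate actions by tropism strengths attached to perceived entities and samples among them. The entities, actions, and strengths are placeholders; the published architecture is considerably richer.

        import random

        # Each tropism entry: (perceived entity, action, tropism strength).
        # The entities, actions, and strengths below are illustrative placeholders.
        TROPISMS = [
            ("object",   "gather",    0.8),
            ("object",   "decompose", 0.4),
            ("predator", "flee",      0.9),
            ("obstacle", "avoid",     0.7),
        ]

        def select_action(percepts, tropisms=TROPISMS):
            """Pick an action stochastically, weighted by the tropism strengths
            of the entries that match the currently perceived entities."""
            matches = [(a, w) for entity, a, w in tropisms if entity in percepts]
            if not matches:
                return "wander"                           # default exploratory behavior
            actions, weights = zip(*matches)
            return random.choices(actions, weights=weights, k=1)[0]

        print(select_action({"object", "obstacle"}))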

  6. Autonomous mobile robots: Vehicles with cognitive control

    SciTech Connect

    Meystel, A.

    1987-01-01

    This book explores a new, rapidly developing area of robotics. It describes the state of the art in intelligent control and applied machine intelligence, along with research on, and the initial stages of manufacturing of, autonomous mobile robots. A complete account of the theoretical and experimental results obtained during the last two decades, together with some generalizations on Autonomous Mobile Systems, is included in this book. Contents: Introduction; Requirements and Specifications; State-of-the-art in Autonomous Mobile Robots Area; Structure of Intelligent Mobile Autonomous System; Planner; Navigator; Pilot; Cartographer; Actuation Control; Computer Simulation of Autonomous Operation; Testing the Autonomous Mobile Robot; Conclusions; Bibliography.

  7. Autonomous caregiver following robotic wheelchair

    NASA Astrophysics Data System (ADS)

    Ratnam, E. Venkata; Sivaramalingam, Sethurajan; Vignesh, A. Sri; Vasanth, Elanthendral; Joans, S. Mary

    2011-12-01

    In the last decade, a variety of robotic/intelligent wheelchairs have been proposed to meet the needs of an aging society. Their main research topics are autonomous functions, such as moving toward goals while avoiding obstacles, and user-friendly interfaces. Although it is desirable for wheelchair users to go out alone, caregivers often accompany them. We therefore have to consider not only autonomous functions and user interfaces but also how to reduce caregivers' load and support their activities, including the communication aspect. From this point of view, we have proposed a robotic wheelchair that moves side by side with a caregiver, based on MATLAB processing. In this project we discuss a robotic wheelchair that follows a caregiver using a microcontroller, an ultrasonic sensor, a keypad, and motor drivers. Images are captured using a camera interfaced with the DM6437 (DaVinci processor). The captured images are processed using image-processing techniques, converted into voltage levels through a MAX232 level converter, and passed serially to the microcontroller unit, while the ultrasonic sensor detects obstacles in front of the robot. A mode-selection switch provides automatic and manual control of the robot: in automatic mode the ultrasonic sensor is used to find obstacles, and in manual mode the keypad is used to operate the wheelchair. The microcontroller unit runs predefined C code, according to which the connected robot is controlled. The robot's motors are activated by the motor drivers, which act as switches that turn the motors on or off according to the control signals from the microcontroller unit.

  8. Miniature Autonomous Robotic Vehicle (MARV)

    SciTech Connect

    Feddema, J.T.; Kwok, K.S.; Driessen, B.J.; Spletzer, B.L.; Weber, T.M.

    1996-12-31

    Sandia National Laboratories (SNL) has recently developed a 16 cm³ (1 in³) autonomous robotic vehicle which is capable of tracking a single conducting wire carrying a 96 kHz signal. This vehicle was developed to assess the limiting factors in using commercial technology to build miniature autonomous vehicles. Particular attention was paid to the design of the control system to search out the wire, track it, and recover if the wire was lost. This paper describes the test vehicle and the control analysis. Presented in the paper are the vehicle model, control laws, a stability analysis, simulation studies and experimental results.
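
    A wire-following controller of the general kind described can be sketched with two pickup-coil amplitudes steering the vehicle toward the stronger signal, plus a simple search behaviour when the signal is lost. The thresholds and gains below are illustrative assumptions, not the MARV control laws.

        def wire_tracker_command(left_amp, right_amp, lost_threshold=0.05, gain=2.0):
            """Differential steering from two induction-coil amplitudes sensing the
            96 kHz wire signal.  Returns (forward_speed, turn_rate).

            Generic sketch: the signal-lost recovery is reduced to a constant search turn.
            """
            total = left_amp + right_amp
            if total < lost_threshold:
                return 0.1, 1.0                          # wire lost: creep forward, turn to reacquire
            balance = (right_amp - left_amp) / total     # -1 .. +1, zero when centered over the wire
            return 0.5, gain * balance                   # steer toward the stronger side

        speed, turn = wire_tracker_command(left_amp=0.30, right_amp=0.42)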

  9. A power autonomous monopedal robot

    NASA Astrophysics Data System (ADS)

    Krupp, Benjamin T.; Pratt, Jerry E.

    2006-05-01

    We present the design and initial results of a power-autonomous planar monopedal robot. The robot is a gasoline powered, two degree of freedom robot that runs in a circle, constrained by a boom. The robot uses hydraulic Series Elastic Actuators, force-controllable actuators which provide high force fidelity, moderate bandwidth, and low impedance. The actuators are mounted in the body of the robot, with cable drives transmitting power to the hip and knee joints of the leg. A two-stroke, gasoline engine drives a constant displacement pump which pressurizes an accumulator. Absolute position and spring deflection of each of the Series Elastic Actuators are measured using linear encoders. The spring deflection is translated into force output and compared to desired force in a closed loop force-control algorithm implemented in software. The output signal of each force controller drives high performance servo valves which control flow to each of the pistons of the actuators. In designing the robot, we used a simulation-based iterative design approach. Preliminary estimates of the robot's physical parameters were based on past experience and used to create a physically realistic simulation model of the robot. Next, a control algorithm was implemented in simulation to produce planar hopping. Using the joint power requirements and range of motions from simulation, we worked backward specifying pulley diameter, piston diameter and stroke, hydraulic pressure and flow, servo valve flow and bandwidth, gear pump flow, and engine power requirements. Components that meet or exceed these specifications were chosen and integrated into the robot design. Using CAD software, we calculated the physical parameters of the robot design, replaced the original estimates with the CAD estimates, and produced new joint power requirements. We iterated on this process, resulting in a design which was prototyped and tested. The Monopod currently runs at approximately 1.2 m/s with the weight of all
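
    The force loop described here, spring deflection measured by a linear encoder, converted to force, and closed with a controller driving the servo valve, can be sketched as a simple PI loop. The stiffness and gains below are placeholder numbers, not those of the Monopod.

        class SeriesElasticForceController:
            """Closed-loop force control for a series elastic actuator (illustrative only).

            Force is inferred from spring deflection (F = k * x); a PI loop on the
            force error produces a servo-valve command.  Gains and stiffness are
            placeholder values, not those of the robot in the paper.
            """
            def __init__(self, stiffness=50e3, kp=0.002, ki=0.05, dt=0.001):
                self.k, self.kp, self.ki, self.dt = stiffness, kp, ki, dt
                self.integral = 0.0

            def update(self, desired_force, spring_deflection):
                measured_force = self.k * spring_deflection      # N, from encoder reading (m)
                error = desired_force - measured_force
                self.integral += error * self.dt
                valve_cmd = self.kp * error + self.ki * self.integral
                return max(-1.0, min(1.0, valve_cmd))            # saturate valve command

        ctrl = SeriesElasticForceController()
        cmd = ctrl.update(desired_force=400.0, spring_deflection=0.006)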

  10. [Mobile autonomous robots-Possibilities and limits].

    PubMed

    Maehle, E; Brockmann, W; Walthelm, A

    2002-02-01

    Besides industrial robots, which today are firmly established in production processes, service robots are becoming more and more important. They are intended to provide services for humans in different areas of their professional and everyday environments, including medicine. Most of these service robots are mobile, which requires intelligent autonomous behaviour. After characterising the different kinds of robots, the relevant paradigms of intelligent autonomous behaviour for mobile robots are critically discussed in this paper and illustrated by three concrete examples of robots realized in Lübeck. In addition, a short survey of current surgical robots as well as an outlook on future developments is given.

  11. Autonomous Robotic Inspection in Tunnels

    NASA Astrophysics Data System (ADS)

    Protopapadakis, E.; Stentoumis, C.; Doulamis, N.; Doulamis, A.; Loupos, K.; Makantasis, K.; Kopsiaftis, G.; Amditis, A.

    2016-06-01

    In this paper, an automatic robotic inspector for tunnel assessment is presented. The proposed platform is able to autonomously navigate within civil infrastructure, capture stereo images, and process/analyse them in order to identify defect types. First, cracks are detected via deep learning approaches. Then, a detailed 3D model of the cracked area is created using photogrammetric methods. Finally, laser profiling of the tunnel's lining is performed for a narrow region close to the detected crack, allowing potential deformations to be deduced. The robotic platform consists of an autonomous mobile vehicle and a crane arm, guided by the computer-vision-based crack detector, which carries the ultrasound sensors, the stereo cameras, and the laser scanner. Visual inspection is based on convolutional neural networks, which support the creation of high-level discriminative features for complex non-linear pattern classification. Real-time 3D information is then accurately calculated, and the crack position and orientation are passed to the robotic platform. The entire system has been evaluated in railway and road tunnels, i.e., in the Egnatia Highway and London Underground infrastructure.

  12. Spatial abstraction for autonomous robot navigation.

    PubMed

    Epstein, Susan L; Aroor, Anoop; Evanusa, Matthew; Sklar, Elizabeth I; Parsons, Simon

    2015-09-01

    Optimal navigation for a simulated robot relies on a detailed map and explicit path planning, an approach problematic for real-world robots that are subject to noise and error. This paper reports on autonomous robots that rely on local spatial perception, learning, and commonsense rationales instead. Despite realistic actuator error, learned spatial abstractions form a model that supports effective travel. PMID:26227680

  13. Spatial abstraction for autonomous robot navigation.

    PubMed

    Epstein, Susan L; Aroor, Anoop; Evanusa, Matthew; Sklar, Elizabeth I; Parsons, Simon

    2015-09-01

    Optimal navigation for a simulated robot relies on a detailed map and explicit path planning, an approach problematic for real-world robots that are subject to noise and error. This paper reports on autonomous robots that rely on local spatial perception, learning, and commonsense rationales instead. Despite realistic actuator error, learned spatial abstractions form a model that supports effective travel.

  14. Hierarchical loop detection for mobile outdoor robots

    NASA Astrophysics Data System (ADS)

    Lang, Dagmar; Winkens, Christian; Häselich, Marcel; Paulus, Dietrich

    2012-01-01

    Loop closing is a fundamental part of 3D simultaneous localization and mapping (SLAM) that can greatly enhance the quality of long-term mapping. It is essential for the creation of globally consistent maps. Conceptually, loop closing is divided into detection and optimization. Recent approaches depend on a single sensor to recognize previously visited places in the loop detection stage. In this study, we combine data from multiple sensors, such as GPS, vision, and laser range data, to enhance detection results in repetitively changing environments that are not sufficiently explained by a single sensor. We present a fast and robust hierarchical loop detection algorithm for outdoor robots to achieve a reliable environment representation even if one or more sensors fail.
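
    The hierarchical idea, cheap checks pruning candidates before expensive ones, with each stage skipped when its sensor has failed, can be sketched as below. The similarity functions are stubs standing in for real descriptor matching and scan matching, and the thresholds are assumptions; this is not the authors' algorithm.

        def detect_loop(current, past_places,
                        gps_radius=10.0, visual_min=0.7, scan_min=0.8):
            """Hierarchical loop-candidate filter (schematic).

            `current` and each entry of `past_places` are dicts that may hold
            'gps' (x, y), a 'visual' descriptor, and 'scan' data; a sensor that
            failed simply omits its key, so its stage is skipped.
            """
            candidates = past_places
            if "gps" in current:                               # stage 1: coarse position gate
                cx, cy = current["gps"]
                candidates = [p for p in candidates if "gps" in p
                              and (p["gps"][0] - cx) ** 2 + (p["gps"][1] - cy) ** 2 <= gps_radius ** 2]
            if "visual" in current:                            # stage 2: appearance similarity
                candidates = [p for p in candidates if "visual" in p
                              and visual_similarity(current["visual"], p["visual"]) >= visual_min]
            if "scan" in current:                              # stage 3: laser scan verification
                candidates = [p for p in candidates if "scan" in p
                              and scan_match_score(current["scan"], p["scan"]) >= scan_min]
            return candidates                                  # surviving loop-closure hypotheses

        def visual_similarity(a, b):   # stub: e.g. bag-of-words descriptor similarity
            return 1.0 if a == b else 0.0

        def scan_match_score(a, b):    # stub: e.g. ICP fitness between two laser scans
            return 1.0 if a == b else 0.0

        places = [{"gps": (1.0, 2.0), "visual": "w1", "scan": "s1"}]
        hits = detect_loop({"gps": (1.5, 2.0), "visual": "w1", "scan": "s1"}, places)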

  15. Tele-robotic/autonomous control using controlshell

    SciTech Connect

    Wilhelmsen, K.C.; Hurd, R.L.; Couture, S.

    1996-12-10

    A tele-robotic and autonomous controller architecture for waste handling and sorting has been developed which uses tele-robotics, autonomous grasping and image processing. As a starting point, prior work from LLNL and ORNL was restructured and ported to a special real-time development environment. Significant improvements in collision avoidance, force compliance, and shared control aspects were then developed. Several orders of magnitude improvement were made in some areas to meet the speed and robustness requirements of the application.

  16. Progress in outdoor navigation by the SAIL developmental robot

    NASA Astrophysics Data System (ADS)

    Zhang, Nan; Weng, John J.; Huang, Xiao

    2002-02-01

    A sensory mapping method, called Staggered Hierarchical Mapping (SHM), and its developmental algorithm are described in this paper. SHM is a model motivated by human early visual pathways, including processing performed by the retina, the Lateral Geniculate Nucleus (LGN) and the primary visual cortex. The work reported here concerns not only the design of such a series of processors but also their autonomous development. The primary goal is to address a long-standing open problem of visual information processing: processing elements that are dedicated to receptive fields of different retinal positions and different scales (sizes) must function concurrently in robotic and other applications in unstructured environments. A new Incremental Principal Component Analysis (IPCA) method is used to automatically develop orientation-sensitive and other needed filters. For fast convergence, the lateral inhibition of sensory neurons is modelled by what are called residual images. A set of staggered receptive fields models the pattern of positioning of processing cells. From sequentially sensed video frames, the proposed developmental algorithm develops a hierarchy of filters whose outputs are uncorrelated within each layer, but with increasing scale of receptive fields from lower to higher layers. To study the completeness of the representation generated by the SHM, we experimentally show that the response produced at any layer is sufficient to reconstruct the corresponding retinal image. As an application domain, we describe our preliminary experiments on autonomous navigation by the SAIL robot, and why a mapping like the SHM is needed in our next phase of work on vision-guided autonomous navigation in outdoor environments.
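
    The incremental PCA step can be sketched as follows. This is a generic update in the spirit of candid covariance-free IPCA (CCIPCA), written as an illustration rather than the SAIL code; it omits the amnesic averaging parameter, and the patch size and number of filters are arbitrary.

        import numpy as np

        def ccipca_update(components, x, n):
            """One incremental PCA update in the spirit of CCIPCA (sketch).

            components : list of current (unnormalised) principal-component estimates
            x          : new input vector, e.g. a flattened residual-image patch
            n          : number of samples seen so far (n >= 1)
            """
            u = np.asarray(x, dtype=float).copy()
            for i, v in enumerate(components):
                norm = np.linalg.norm(v)
                if norm < 1e-12:
                    components[i] = u.copy()           # bootstrap this component from the residual
                    u = np.zeros_like(u)               # nothing left for later components this step
                    continue
                v_hat = v / norm
                # Blend the old estimate with the new sample's projection onto it.
                components[i] = ((n - 1) / n) * v + (1 / n) * (u @ v_hat) * u
                new_norm = np.linalg.norm(components[i])
                if new_norm > 1e-12:                   # deflate before updating the next component
                    v_hat = components[i] / new_norm
                    u = u - (u @ v_hat) * v_hat
            return components

        rng = np.random.default_rng(0)
        filters = [np.zeros(16) for _ in range(3)]     # develop 3 filters over 16-D patches
        for n, patch in enumerate(rng.normal(size=(200, 16)), start=1):
            filters = ccipca_update(filters, patch, n)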

  17. Control algorithms for autonomous robot navigation

    SciTech Connect

    Jorgensen, C.C.

    1985-09-20

    This paper examines control algorithm requirements for autonomous robot navigation outside laboratory environments. Three aspects of navigation are considered: navigation control in explored terrain, environment interactions with robot sensors, and navigation control in unanticipated situations. Major navigation methods are presented and relevance of traditional human learning theory is discussed. A new navigation technique linking graph theory and incidental learning is introduced.

  18. Autonomous Student Experiences in Outdoor and Adventure Education

    ERIC Educational Resources Information Center

    Daniel, Brad; Bobilya, Andrew J.; Kalisch, Kenneth R.; McAvoy, Leo H.

    2014-01-01

    This article explores the current state of knowledge regarding the use of autonomous student experiences (ASE) in outdoor and adventure education (OAE) programs. ASE are defined as components (e.g., solo, final expedition) in which participants have a greater measure of choice and control over the planning, execution, and outcomes of their…

  19. Automatic learning by an autonomous mobile robot

    SciTech Connect

    de Saussure, G.; Spelt, P.F.; Killough, S.M.; Pin, F.G.; Weisbin, C.R.

    1989-01-01

    This paper describes recent research in automatic learning by the autonomous mobile robot HERMIES-IIB at the Center for Engineering Systems Advanced Research (CESAR). By acting on the environment and observing the consequences during a set of training examples, the robot learns a sequence of successful manipulations on a simulated control panel. The robot learns to classify panel configurations in order to deal with new configurations that are not part of the original training set. 5 refs., 2 figs.

  20. Tele/Autonomous Robot For Nuclear Facilities

    NASA Technical Reports Server (NTRS)

    Backes, Paul G.; Tso, Kam S.

    1994-01-01

    Fail-safe tele/autonomous robotic system makes it unnecessary for human technicians to enter nuclear-fuel-reprocessing facilities and other high-radiation or otherwise hazardous industrial environments. Used to carry out such experiments as exchanging equipment modules, turning bolts, cleaning surfaces, and grappling and turning objects by use of a mixture of autonomous actions and teleoperation with either a single arm or two cooperating arms. System capable of fully autonomous operation, teleoperation, or shared control.

  1. Mapping planetary caves with an autonomous, heterogeneous robot team

    NASA Astrophysics Data System (ADS)

    Husain, Ammar; Jones, Heather; Kannan, Balajee; Wong, Uland; Pimentel, Tiago; Tang, Sarah; Daftry, Shreyansh; Huber, Steven; Whittaker, William L.

    Caves on other planetary bodies offer sheltered habitat for future human explorers and numerous clues to a planet's past for scientists. While recent orbital imagery provides exciting new details about cave entrances on the Moon and Mars, the interiors of these caves are still unknown and not observable from orbit. Multi-robot teams offer unique solutions for exploring and modeling subsurface voids during precursor missions. Robot teams that are diverse in terms of size, mobility, sensing, and capability can provide great advantages, but this diversity, coupled with inherently distinct low-level behavior architectures, makes coordination a challenge. This paper presents a framework that consists of an autonomous frontier- and capability-based task generator, a distributed market-based strategy for coordinating and allocating tasks to the different team members, and a communication paradigm for seamless interaction between the different robots in the system. Robots have different sensors (in the representative robot team used for testing: 2D mapping sensors, 3D modeling sensors, or no exteroceptive sensors) and varying levels of mobility. Tasks are generated to explore, model, and take science samples. Based on an individual robot's capability and associated cost for executing a generated task, a robot is autonomously selected for task execution. The robots create coarse online maps and store collected data for high-resolution offline modeling. The coordination approach has been field tested at a mock cave site with highly unstructured natural terrain, as well as an outdoor patio area. Initial results are promising for the applicability of the proposed multi-robot framework to exploration and modeling of planetary caves.
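
    The market-based allocation idea can be illustrated with a toy single-round auction in Python: each robot with the required capability bids its estimated cost, and the cheapest bid wins. This is a centralised simplification with made-up robot names, capabilities, and costs, not the distributed protocol in the paper.

        def auction_tasks(robots, tasks):
            """Greedy single-round market-based allocation (simplified, centralised sketch).

            robots : list of dicts with 'name', 'capabilities' (set), and a 'cost' function
                     mapping a task to this robot's estimated execution cost
            tasks  : list of dicts with 'name' and 'requires' (capability string)
            Returns {task name: winning robot name or None}.
            """
            assignment = {}
            for task in tasks:
                bids = [(robot["cost"](task), robot["name"])
                        for robot in robots if task["requires"] in robot["capabilities"]]
                assignment[task["name"]] = min(bids)[1] if bids else None   # lowest bid wins
            return assignment

        # Hypothetical heterogeneous team: names, capabilities, and costs are made up.
        robots = [
            {"name": "mapper",  "capabilities": {"map2d"},   "cost": lambda t: 5.0},
            {"name": "modeler", "capabilities": {"model3d"}, "cost": lambda t: 8.0},
        ]
        tasks = [
            {"name": "explore_frontier_A", "requires": "map2d"},
            {"name": "model_pit_wall",     "requires": "model3d"},
            {"name": "take_sample",        "requires": "sampling"},   # no capable robot -> None
        ]
        print(auction_tasks(robots, tasks))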

  2. Development of autonomous grasping and navigating robot

    NASA Astrophysics Data System (ADS)

    Kudoh, Hiroyuki; Fujimoto, Keisuke; Nakayama, Yasuichi

    2015-01-01

    The ability to find and grasp target items in an unknown environment is important for working robots. We developed an autonomous navigating and grasping robot. The operations are locating a requested item, moving to where the item is placed, finding the item on a shelf or table, and picking the item up from the shelf or the table. To achieve these operations, we designed the robot with three functions: an autonomous navigating function that generates a map and a route in an unknown environment, an item position recognizing function, and a grasping function. We tested this robot in an unknown environment. It achieved a series of operations: moving to a destination, recognizing the positions of items on a shelf, picking up an item, placing it on a cart with its hand, and returning to the starting location. The results of this experiment show the applicability of reducing the workforce with robots.

  3. INL Autonomous Navigation System

    SciTech Connect

    2005-03-30

    The INL Autonomous Navigation System provides instructions for autonomously navigating a robot. The system permits high-speed autonomous navigation including obstacle avoidance, waypoint navigation and path planning in both indoor and outdoor environments.

  4. Autonomous Systems, Robotics, and Computing Systems Capability Roadmap: NRC Dialogue

    NASA Technical Reports Server (NTRS)

    Zornetzer, Steve; Gage, Douglas

    2005-01-01

    Contents include the following: Introduction. Process, Mission Drivers, Deliverables, and Interfaces. Autonomy. Crew-Centered and Remote Operations. Integrated Systems Health Management. Autonomous Vehicle Control. Autonomous Process Control. Robotics. Robotics for Solar System Exploration. Robotics for Lunar and Planetary Habitation. Robotics for In-Space Operations. Computing Systems. Conclusion.

  5. Diagnosing faults in autonomous robot plan execution

    NASA Technical Reports Server (NTRS)

    Lam, Raymond K.; Doshi, Rajkumar S.; Atkinson, David J.; Lawson, Denise M.

    1989-01-01

    A major requirement for an autonomous robot is the capability to diagnose faults during plan execution in an uncertain environment. Much diagnostic research concentrates only on hardware failures within an autonomous robot. Taking a different approach, the implementation of a Telerobot Diagnostic System that addresses, in addition to hardware failures, failures caused by unexpected event changes in the environment or failures due to plan errors, is described. One feature of the system is the utilization of task-plan knowledge and context information to deduce fault symptoms. This forward deduction provides valuable information on past activities and the current expectations of a robotic event, both of which can guide the plan-execution inference process. The inference process adopts a model-based technique to recreate the plan-execution process and to confirm fault-source hypotheses. This technique allows the system to diagnose multiple faults due to either unexpected plan failures or hardware errors. This research initiates a major effort to investigate relationships between hardware faults and plan errors, relationships which were not addressed in the past. The results of this research will provide a clear understanding of how to generate a better task planner for an autonomous robot and how to recover the robot from faults in a critical environment.

  6. Diagnosing faults in autonomous robot plan execution

    NASA Technical Reports Server (NTRS)

    Lam, Raymond K.; Doshi, Rajkumar S.; Atkinson, David J.; Lawson, Denise M.

    1988-01-01

    A major requirement for an autonomous robot is the capability to diagnose faults during plan execution in an uncertain environment. Much diagnostic research concentrates only on hardware failures within an autonomous robot. Taking a different approach, the implementation of a Telerobot Diagnostic System that addresses, in addition to hardware failures, failures caused by unexpected event changes in the environment or failures due to plan errors, is described. One feature of the system is the utilization of task-plan knowledge and context information to deduce fault symptoms. This forward deduction provides valuable information on past activities and the current expectations of a robotic event, both of which can guide the plan-execution inference process. The inference process adopts a model-based technique to recreate the plan-execution process and to confirm fault-source hypotheses. This technique allows the system to diagnose multiple faults due to either unexpected plan failures or hardware errors. This research initiates a major effort to investigate relationships between hardware faults and plan errors, relationships which were not addressed in the past. The results of this research will provide a clear understanding of how to generate a better task planner for an autonomous robot and how to recover the robot from faults in a critical environment.

  7. Autonomous mobile robot for radiologic surveys

    DOEpatents

    Dudar, A.M.; Wagner, D.G.; Teese, G.D.

    1994-06-28

    An apparatus is described for conducting radiologic surveys. The apparatus comprises in the main a robot capable of following a preprogrammed path through an area, a radiation monitor adapted to receive input from a radiation detector assembly, ultrasonic transducers for navigation and collision avoidance, and an on-board computer system including an integrator for interfacing the radiation monitor and the robot. Front and rear bumpers are attached to the robot by bumper mounts. The robot may be equipped with memory boards for the collection and storage of radiation survey information. The on-board computer system is connected to a remote host computer via a UHF radio link. The apparatus is powered by a rechargeable 24-volt DC battery, and is stored at a docking station when not in use and/or for recharging. A remote host computer contains a stored database defining paths between points in the area where the robot is to operate, including but not limited to the locations of walls, doors, stationary furniture and equipment, and sonic markers if used. When a program consisting of a series of paths is downloaded to the on-board computer system, the robot conducts a floor survey autonomously at any preselected rate. When the radiation monitor detects contamination, the robot resurveys the area at reduced speed and resumes its preprogrammed path if the contamination is not confirmed. If the contamination is confirmed, the robot stops and sounds an alarm. 5 figures.
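
    The survey/resurvey/alarm behaviour described in the patent can be summarised as a three-state machine. The sketch below is only a paraphrase of that logic; the speed values are placeholders.

        def survey_step(state, contamination_detected):
            """One decision step of the survey behaviour described above (illustrative).

            States: 'SURVEY'   (preprogrammed path, normal speed),
                    'RESURVEY' (same area, reduced speed),
                    'ALARM'    (stopped, alarm sounding).
            Returns (next_state, speed_command).
            """
            if state == "SURVEY":
                return ("RESURVEY", 0.3) if contamination_detected else ("SURVEY", 1.0)
            if state == "RESURVEY":
                # Confirmed on the slow pass -> stop and alarm; otherwise resume the path.
                return ("ALARM", 0.0) if contamination_detected else ("SURVEY", 1.0)
            return ("ALARM", 0.0)                       # stay stopped until an operator intervenes

        state = "SURVEY"
        for reading in [False, True, False, True, True]:
            state, speed = survey_step(state, reading)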

  8. Autonomous mobile robot for radiologic surveys

    DOEpatents

    Dudar, Aed M.; Wagner, David G.; Teese, Gregory D.

    1994-01-01

    An apparatus for conducting radiologic surveys. The apparatus comprises in the main a robot capable of following a preprogrammed path through an area, a radiation monitor adapted to receive input from a radiation detector assembly, ultrasonic transducers for navigation and collision avoidance, and an on-board computer system including an integrator for interfacing the radiation monitor and the robot. Front and rear bumpers are attached to the robot by bumper mounts. The robot may be equipped with memory boards for the collection and storage of radiation survey information. The on-board computer system is connected to a remote host computer via a UHF radio link. The apparatus is powered by a rechargeable 24-volt DC battery, and is stored at a docking station when not in use and/or for recharging. A remote host computer contains a stored database defining paths between points in the area where the robot is to operate, including but not limited to the locations of walls, doors, stationary furniture and equipment, and sonic markers if used. When a program consisting of a series of paths is downloaded to the on-board computer system, the robot conducts a floor survey autonomously at any preselected rate. When the radiation monitor detects contamination, the robot resurveys the area at reduced speed and resumes its preprogrammed path if the contamination is not confirmed. If the contamination is confirmed, the robot stops and sounds an alarm.

  9. A mobile autonomous robot for radiological surveys

    SciTech Connect

    Dudar, A.M.; Wagner, D.G.; Teese, G.D.

    1992-01-01

    The Robotics Development Group at the Savannah River Site is developing an autonomous robot (SIMON) to perform radiological surveys of potentially contaminated floors. The robot scans floors at a speed of one inch per second and stops, sounds an alarm, and flashes lights when contamination in a certain area is detected. The contamination of interest here is primarily alpha and beta-gamma. The robot, a Cybermotion K2A base, is radio controlled, uses dead reckoning to determine vehicle position, and docks with a charging station to replenish its batteries and calibrate its position. It uses an ultrasonic ranging system for collision avoidance. In addition, two safety bumpers located in the front and the back of the robot will stop the robot's motion when they are depressed. Paths for the robot are preprogrammed, and the robot's motion can be monitored on a remote screen which shows a graphical map of the environment. The radiation instrument being used is an Eberline RM22A monitor. This monitor is microcomputer based with a serial I/O interface for remote operation. Up to 30 detectors may be configured with the RM22A.

  10. A mobile autonomous robot for radiological surveys

    SciTech Connect

    Dudar, A.M.; Wagner, D.G.; Teese, G.D.

    1992-10-01

    The Robotics Development Group at the Savannah River Site is developing an autonomous robot (SIMON) to perform radiological surveys of potentially contaminated floors. The robot scans floors at a speed of one inch per second and stops, sounds an alarm, and flashes lights when contamination in a certain area is detected. The contamination of interest here is primarily alpha and beta-gamma. The robot, a Cybermotion K2A base, is radio controlled, uses dead reckoning to determine vehicle position, and docks with a charging station to replenish its batteries and calibrate its position. It uses an ultrasonic ranging system for collision avoidance. In addition, two safety bumpers located in the front and the back of the robot will stop the robot's motion when they are depressed. Paths for the robot are preprogrammed, and the robot's motion can be monitored on a remote screen which shows a graphical map of the environment. The radiation instrument being used is an Eberline RM22A monitor. This monitor is microcomputer based with a serial I/O interface for remote operation. Up to 30 detectors may be configured with the RM22A.

  11. Supervised autonomous robotic soft tissue surgery.

    PubMed

    Shademan, Azad; Decker, Ryan S; Opfermann, Justin D; Leonard, Simon; Krieger, Axel; Kim, Peter C W

    2016-05-01

    The current paradigm of robot-assisted surgeries (RASs) depends entirely on an individual surgeon's manual capability. Autonomous robotic surgery-removing the surgeon's hands-promises enhanced efficacy, safety, and improved access to optimized surgical techniques. Surgeries involving soft tissue have not been performed autonomously because of technological limitations, including lack of vision systems that can distinguish and track the target tissues in dynamic surgical environments and lack of intelligent algorithms that can execute complex surgical tasks. We demonstrate in vivo supervised autonomous soft tissue surgery in an open surgical setting, enabled by a plenoptic three-dimensional and near-infrared fluorescent (NIRF) imaging system and an autonomous suturing algorithm. Inspired by the best human surgical practices, a computer program generates a plan to complete complex surgical tasks on deformable soft tissue, such as suturing and intestinal anastomosis. We compared metrics of anastomosis-including the consistency of suturing informed by the average suture spacing, the pressure at which the anastomosis leaked, the number of mistakes that required removing the needle from the tissue, completion time, and lumen reduction in intestinal anastomoses-between our supervised autonomous system, manual laparoscopic surgery, and clinically used RAS approaches. Despite dynamic scene changes and tissue movement during surgery, we demonstrate that the outcome of supervised autonomous procedures is superior to surgery performed by expert surgeons and RAS techniques in ex vivo porcine tissues and in living pigs. These results demonstrate the potential for autonomous robots to improve the efficacy, consistency, functional outcome, and accessibility of surgical techniques. PMID:27147588

  12. Flocking algorithm for autonomous flying robots.

    PubMed

    Virágh, Csaba; Vásárhelyi, Gábor; Tarcai, Norbert; Szörényi, Tamás; Somorjai, Gergő; Nepusz, Tamás; Vicsek, Tamás

    2014-06-01

    Animal swarms displaying a variety of typical flocking patterns would not exist without the underlying safe, optimal and stable dynamics of the individuals. The emergence of these universal patterns can be efficiently reconstructed with agent-based models. If we want to reproduce these patterns with artificial systems, such as autonomous aerial robots, agent-based models can also be used in their control algorithms. However, finding the proper algorithms and thus understanding the essential characteristics of the emergent collective behaviour requires thorough and realistic modeling of the robot and also the environment. In this paper, we first present an abstract mathematical model of an autonomous flying robot. The model takes into account several realistic features, such as time delay and locality of communication, inaccuracy of the on-board sensors and inertial effects. We present two decentralized control algorithms. One is based on a simple self-propelled flocking model of animal collective motion, the other is a collective target tracking algorithm. Both algorithms contain a viscous friction-like term, which aligns the velocities of neighbouring agents parallel to each other. We show that this term can be essential for reducing the inherent instabilities of such a noisy and delayed realistic system. We discuss simulation results on the stability of the control algorithms, and perform real experiments to show the applicability of the algorithms on a group of autonomous quadcopters. In our case, bio-inspiration works in two ways. On the one hand, the whole idea of trying to build and control a swarm of robots comes from the observation that birds tend to flock to optimize their behaviour as a group. On the other hand, by using a realistic simulation framework and studying the group behaviour of autonomous robots we can learn about the major factors influencing the flight of bird flocks. PMID:24852272
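
    The viscous friction-like alignment term mentioned in the abstract can be written compactly: each robot is accelerated toward the mean velocity of its neighbours. The 2-D sketch below uses an arbitrary gain and ignores the delay, repulsion, and target-tracking terms of the full model.

        def alignment_acceleration(v_self, neighbour_velocities, c_frict=0.5):
            """Viscous-friction-like velocity alignment term (2-D, illustrative gain).

            Accelerates the agent toward the mean velocity of its neighbours,
            damping relative velocity differences within the flock.
            """
            if not neighbour_velocities:
                return (0.0, 0.0)
            mean_vx = sum(v[0] for v in neighbour_velocities) / len(neighbour_velocities)
            mean_vy = sum(v[1] for v in neighbour_velocities) / len(neighbour_velocities)
            return (c_frict * (mean_vx - v_self[0]), c_frict * (mean_vy - v_self[1]))

        ax, ay = alignment_acceleration((1.0, 0.0), [(0.8, 0.2), (1.2, -0.1)])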

  13. Flocking algorithm for autonomous flying robots.

    PubMed

    Virágh, Csaba; Vásárhelyi, Gábor; Tarcai, Norbert; Szörényi, Tamás; Somorjai, Gergő; Nepusz, Tamás; Vicsek, Tamás

    2014-06-01

    Animal swarms displaying a variety of typical flocking patterns would not exist without the underlying safe, optimal and stable dynamics of the individuals. The emergence of these universal patterns can be efficiently reconstructed with agent-based models. If we want to reproduce these patterns with artificial systems, such as autonomous aerial robots, agent-based models can also be used in their control algorithms. However, finding the proper algorithms and thus understanding the essential characteristics of the emergent collective behaviour requires thorough and realistic modeling of the robot and also the environment. In this paper, we first present an abstract mathematical model of an autonomous flying robot. The model takes into account several realistic features, such as time delay and locality of communication, inaccuracy of the on-board sensors and inertial effects. We present two decentralized control algorithms. One is based on a simple self-propelled flocking model of animal collective motion, the other is a collective target tracking algorithm. Both algorithms contain a viscous friction-like term, which aligns the velocities of neighbouring agents parallel to each other. We show that this term can be essential for reducing the inherent instabilities of such a noisy and delayed realistic system. We discuss simulation results on the stability of the control algorithms, and perform real experiments to show the applicability of the algorithms on a group of autonomous quadcopters. In our case, bio-inspiration works in two ways. On the one hand, the whole idea of trying to build and control a swarm of robots comes from the observation that birds tend to flock to optimize their behaviour as a group. On the other hand, by using a realistic simulation framework and studying the group behaviour of autonomous robots we can learn about the major factors influencing the flight of bird flocks.

  14. Embodied cognition for autonomous interactive robots.

    PubMed

    Hoffman, Guy

    2012-10-01

    In the past, notions of embodiment have been applied to robotics mainly in the realm of very simple robots, and supporting low-level mechanisms such as dynamics and navigation. In contrast, most human-like, interactive, and socially adept robotic systems turn away from embodiment and use amodal, symbolic, and modular approaches to cognition and interaction. At the same time, recent research in Embodied Cognition (EC) is spanning an increasing number of complex cognitive processes, including language, nonverbal communication, learning, and social behavior. This article suggests adopting a modern EC approach for autonomous robots interacting with humans. In particular, we present three core principles from EC that may be applicable to such robots: (a) modal perceptual representation, (b) action/perception and action/cognition integration, and (c) a simulation-based model of top-down perceptual biasing. We describe a computational framework based on these principles, and its implementation on two physical robots. This could provide a new paradigm for embodied human-robot interaction based on recent psychological and neurological findings. PMID:22893571

  15. Embodied cognition for autonomous interactive robots.

    PubMed

    Hoffman, Guy

    2012-10-01

    In the past, notions of embodiment have been applied to robotics mainly in the realm of very simple robots, and supporting low-level mechanisms such as dynamics and navigation. In contrast, most human-like, interactive, and socially adept robotic systems turn away from embodiment and use amodal, symbolic, and modular approaches to cognition and interaction. At the same time, recent research in Embodied Cognition (EC) is spanning an increasing number of complex cognitive processes, including language, nonverbal communication, learning, and social behavior. This article suggests adopting a modern EC approach for autonomous robots interacting with humans. In particular, we present three core principles from EC that may be applicable to such robots: (a) modal perceptual representation, (b) action/perception and action/cognition integration, and (c) a simulation-based model of top-down perceptual biasing. We describe a computational framework based on these principles, and its implementation on two physical robots. This could provide a new paradigm for embodied human-robot interaction based on recent psychological and neurological findings.

  16. Autonomous Mobile Robot That Can Read

    NASA Astrophysics Data System (ADS)

    Létourneau, Dominic; Michaud, François; Valin, Jean-Marc

    2004-12-01

    The ability to read would surely contribute to increased autonomy of mobile robots operating in the real world. The process seems fairly simple: the robot must be capable of acquiring an image of a message to read, extracting the characters, and recognizing them as symbols, characters, and words. Using an optical character recognition algorithm on a mobile robot, however, brings additional challenges: the robot has to control its position in the world and its pan-tilt-zoom camera to find textual messages to read, potentially having to compensate for its viewpoint of the message, and use the limited onboard processing capabilities to decode the message. The robot also has to deal with variations in lighting conditions. In this paper, we present our approach demonstrating that it is feasible for an autonomous mobile robot to read messages of specific colors and font in real-world conditions. We outline the constraints under which the approach works and present results obtained using a Pioneer 2 robot equipped with a Pentium 233 MHz and a Sony EVI-D30 pan-tilt-zoom camera.

  17. Design of an autonomous exterior security robot

    NASA Technical Reports Server (NTRS)

    Myers, Scott D.

    1994-01-01

    This paper discusses the requirements and preliminary design of a robotic vehicle for performing autonomous exterior perimeter security patrols around warehouse areas, ammunition supply depots, and industrial parks for the U.S. Department of Defense. The preliminary design allows for the operation of up to eight vehicles in a six kilometer by six kilometer zone with autonomous navigation and obstacle avoidance. In addition to detection of crawling intruders at 100 meters, the system must perform real-time inventory checking and database comparisons using a microwave tag system.

  18. Autonomous mobile robot research using the HERMIES-III robot

    SciTech Connect

    Pin, F.G.; Beckerman, M.; Spelt, P.F.; Robinson, J.T.; Weisbin, C.R.

    1989-01-01

    This paper reports on the status and future directions in the research, development and experimental validation of intelligent control techniques for autonomous mobile robots using the HERMIES-III robot at the Center for Engineering Systems Advanced Research (CESAR) at Oak Ridge National Laboratory (ORNL). HERMIES-III is the fourth robot in a series of increasingly more sophisticated and capable experimental test beds developed at CESAR. HERMIES-III comprises a battery-powered, omni-directional wheeled platform with a seven-degree-of-freedom manipulator arm, video cameras, sonar range sensors, a laser imaging scanner and a dual computer system containing up to 128 NCUBE nodes in hypercube configuration. All electronics, sensors, computers, and communication equipment required for autonomous operation of HERMIES-III are located on board, along with sufficient battery power for three to four hours of operation. The paper first provides a more detailed description of the HERMIES-III characteristics, focusing on the new areas of research and demonstration now possible at CESAR with this new test bed. The initial experimental program is then described, with emphasis placed on autonomous performance of human-scale tasks (e.g., valve manipulation, use of tools), integration of a dexterous manipulator and platform motion in geometrically complex environments, and effective use of multiple cooperating robots (HERMIES-IIB and HERMIES-III). The paper concludes with a discussion of the integration problems and safety considerations that necessarily arise when setting up an experimental program in which human-scale tasks are performed by multiple autonomous mobile robots. 10 refs., 3 figs.

  19. Autonomous vehicle platforms from modular robotic components

    NASA Astrophysics Data System (ADS)

    Schonlau, William J.

    2004-09-01

    A brief survey of current autonomous vehicle (AV) projects is presented with intent to find common infrastructure or subsystems that can be configured from commercially available modular robotic components, thereby providing developers with greatly reduced timelines and costs and encouraging focus on the selected problem domain. The Modular Manipulator System (MMS) robotic system, based on single degree of freedom rotary and linear modules, is introduced and some approaches to autonomous vehicle configuration and deployment are examined. The modules may be configured to provide articulated suspensions for very rugged terrain and fall recovery, articulated sensors and tooling plus a limited capacity for self repair and self reconfiguration. The MMS on-board visually programmed control software (Model Manager) supports experimentation with novel physical configurations and behavior algorithms via real-time 3D graphics for operations simulation and provides useful subsystems for vision, learning and planning to host intelligent behavior.

  20. An architecture for an autonomous learning robot

    NASA Technical Reports Server (NTRS)

    Tillotson, Brian

    1988-01-01

    An autonomous learning device must solve the example bounding problem, i.e., it must divide the continuous universe into discrete examples from which to learn. We describe an architecture which incorporates an example bounder for learning. The architecture is implemented in the GPAL program. An example run with a real mobile robot shows that the program learns and uses new causal, qualitative, and quantitative relationships.

  1. Evolutionary neurocontrollers for autonomous mobile robots.

    PubMed

    Floreano, D; Mondada, F

    1998-10-01

    In this article we describe a methodology for evolving neurocontrollers of autonomous mobile robots without human intervention. The presentation, which spans from technological and methodological issues to several experimental results on the evolution of physical mobile robots, covers both previous and recent work in an attempt to provide a unified picture within which the reader can compare the effects of systematic variations on the experimental settings. After describing some key principles for building mobile robots and tools suitable for experiments in adaptive robotics, we give an overview of different approaches to evolutionary robotics and present our methodology. We start by reviewing two basic experiments showing that different environments can shape very different behaviours and neural mechanisms under very similar selection criteria. We then address the issue of incremental evolution in two different experiments from the perspective of changing environments and robot morphologies. Finally, we investigate the possibility of evolving plastic neurocontrollers and analyse an evolved neurocontroller that relies on fast and continuously changing synapses characterized by dynamic stability. We conclude by reviewing the implications of this methodology for engineering, biology, cognitive science and artificial life, and point at future directions of research.
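
    To make the evolutionary setup concrete, the sketch below evolves the weights of a tiny perceptron controller with a generation-based genetic algorithm. The network size, the canned sensor situations, and the fitness stub are invented for illustration; the paper's networks, fitness functions, and evaluation on real robots are far more elaborate.

        import random

        def controller(weights, sensors):
            """Tiny perceptron controller: 2 proximity inputs -> 2 wheel speeds in [-1, 1]."""
            left  = weights[0] * sensors[0] + weights[1] * sensors[1] + weights[2]
            right = weights[3] * sensors[0] + weights[4] * sensors[1] + weights[5]
            clamp = lambda v: max(-1.0, min(1.0, v))
            return clamp(left), clamp(right)

        def fitness(weights):
            """Stub evaluation: reward forward motion, penalise turning near obstacles.
            A real run would execute the controller on the robot or in a simulator."""
            score = 0.0
            for sensors in [(0.0, 0.0), (0.9, 0.1), (0.1, 0.9)]:   # canned sensor situations
                l, r = controller(weights, sensors)
                score += (l + r) / 2 - abs(l - r) * max(sensors)
            return score

        def evolve(pop_size=30, generations=50, mutation=0.2):
            pop = [[random.uniform(-1, 1) for _ in range(6)] for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=fitness, reverse=True)
                elite = pop[: pop_size // 5]                        # truncation selection
                pop = elite + [[w + random.gauss(0, mutation) for w in random.choice(elite)]
                               for _ in range(pop_size - len(elite))]
            return max(pop, key=fitness)

        best = evolve()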

  2. A Proposal of Autonomous Robotic Systems Educative Environment

    NASA Astrophysics Data System (ADS)

    Ierache, Jorge; Garcia-Martinez, Ramón; de Giusti, Armando

    This work presents our experiences in the implementation of a laboratory of autonomous robotic systems applied to the training of beginner and advanced students in a Computer Engineering degree course, taking into account the specific technologies, robots, autonomous toys, and programming languages. These systems provide a strategic opportunity for human resource development by involving different aspects, ranging from specification elaboration and modeling to the software development, implementation and testing of an autonomous robotic system.

  3. Development of Virtual Robot Based on Autonomous Behavior Acquisition

    NASA Astrophysics Data System (ADS)

    Yamamoto, Masahito; Iwadate, Kenji; Ooe, Ryosuke; Suzuki, Ikuo; Furukawa, Masashi

    In this paper, we demonstrate a design for autonomous virtual robots and develop a design tool for them. A virtual robot can behave autonomously by using its own sensors and controllers in a three-dimensional, physically modeled environment. An approximate fluid environment model based on drag-force modeling is presented. The developed tool can simulate the physical environment at any time during the modeling process. A combined use of neural-network controllers and optimization methods (genetic algorithm or particle swarm optimization) enables us to create autonomous behaviors for virtual robots.

  4. An autonomous vision-based mobile robot

    NASA Astrophysics Data System (ADS)

    Baumgartner, Eric Thomas

    This dissertation describes estimation and control methods for use in the development of an autonomous mobile robot for structured environments. The navigation of the mobile robot is based on precise estimates of the position and orientation of the robot within its environment. The extended Kalman filter algorithm is used to combine information from the robot's drive wheels with periodic observations of small, wall-mounted, visual cues to produce the precise position and orientation estimates. The visual cues are reliably detected by at least one video camera mounted on the mobile robot. Typical position estimates are accurate to within one inch. A path tracking algorithm is also developed to follow desired reference paths which are taught by a human operator. Because of the time-independence of the tracking algorithm, the speed that the vehicle travels along the reference path is specified independent from the tracking algorithm. The estimation and control methods have been applied successfully to two experimental vehicle systems. Finally, an analysis of the linearized closed-loop control system is performed to study the behavior and the stability of the system as a function of various control parameters.
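
    The localization scheme described here, wheel odometry corrected by periodic observations of wall-mounted cues, follows the standard extended Kalman filter pattern. The sketch below is a textbook planar EKF, not the dissertation's implementation: it treats a recognised cue as an absolute (x, y) fix, and all noise covariances are assumed values.

        import numpy as np

        def ekf_predict(s, P, d, dtheta, Q):
            """Propagate pose [x, y, theta] with wheel-odometry increments (d, dtheta)."""
            x, y, th = s
            s_new = np.array([x + d * np.cos(th), y + d * np.sin(th), th + dtheta])
            F = np.array([[1.0, 0.0, -d * np.sin(th)],
                          [0.0, 1.0,  d * np.cos(th)],
                          [0.0, 0.0,  1.0]])
            return s_new, F @ P @ F.T + Q

        def ekf_update_cue(s, P, z, R):
            """Correct the pose with an absolute (x, y) fix derived from a visual cue."""
            H = np.array([[1.0, 0.0, 0.0],
                          [0.0, 1.0, 0.0]])
            innovation = z - H @ s
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            return s + K @ innovation, (np.eye(3) - K @ H) @ P

        s, P = np.zeros(3), np.eye(3) * 0.01
        Q, R = np.eye(3) * 1e-4, np.eye(2) * 1e-3
        for _ in range(20):                                  # drive forward, drifting slightly
            s, P = ekf_predict(s, P, d=0.1, dtheta=0.0, Q=Q)
        s, P = ekf_update_cue(s, P, z=np.array([2.05, 0.02]), R=R)   # periodic cue observation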

  5. PRIMUS: autonomous driving robot for military applications

    NASA Astrophysics Data System (ADS)

    Schwartz, Ingo

    2000-07-01

    This article describes the government experimental program PRIMUS (PRogram of Intelligent Mobile Unmanned Systems) and the results of phase C, demonstrated in summer 1999 on a military proving ground. The program demonstrates autonomous driving of an unmanned robot in open terrain. The highest possible degree of autonomy shall be reached with today's technology to provide a platform for different missions. The goal is to release the soldier from highly dangerous tasks, to increase performance, and to achieve a reduction of personnel and costs with unmanned systems. In phase C of the program, two small tracked vehicles (Digitized Wiesel 2, air-transportable by CH53) are used: one as a robot vehicle, the other as a command & control system. The Wiesel 2 is configured as a drive-by-wire system and is therefore well suited for the adaptation of control computers. The autonomous detection and avoidance of obstacles in unknown, non-cooperative environments is the main task. For navigation and orientation a sensor package is integrated. To detect obstacles, the scene in the driving corridor of the robot is scanned 4 times per second by a 3D range-image camera (LADAR). The measured 3D range image is converted into a 2D obstacle map and used as input for the calculation of an obstacle-free path. The combination of local navigation (obstacle avoidance) and global navigation leads to collision-free driving in open terrain to a predefined goal point at a velocity of up to 25 km/h. A contour tracker with a TV camera as sensor is also implemented, which allows the robot to follow contours (e.g. the edge of a meadow) or to drive on paved or unpaved roads at velocities up to 50 km/h. In addition to these autonomous driving modes, the operator in the command & control station can drive the robot by remote control. All of these functions were successfully demonstrated in summer 1999 on a military proving ground. During a mission example the robot vehicle covered a distance of several
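
    The 3D-range-image-to-2D-obstacle-map conversion mentioned above can be sketched as a simple height-threshold projection into a grid. The cell size, clearance threshold, and map extent below are illustrative assumptions, not PRIMUS parameters.

        def range_image_to_obstacle_map(points, cell_size=0.25, height_threshold=0.3,
                                        map_size=40):
            """Project 3-D range returns (x forward, y left, z up, vehicle frame, metres)
            into a 2-D obstacle grid (sketch; thresholds are assumptions)."""
            half = map_size // 2
            grid = [[0] * map_size for _ in range(map_size)]
            for x, y, z in points:
                if x < 0 or z < height_threshold:        # behind vehicle, ground, or low clutter
                    continue
                col = int(x / cell_size)
                row = int(y / cell_size) + half
                if 0 <= col < map_size and 0 <= row < map_size:
                    grid[row][col] = 1                   # mark cell as an obstacle
            return grid

        # A single simulated LADAR frame: one tall return 3 m ahead, slightly to the left.
        frame = [(3.0, 0.4, 0.9), (2.0, -0.5, 0.05), (5.0, 1.0, 0.1)]
        obstacle_map = range_image_to_obstacle_map(frame)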

  6. Autonomous Navigation by a Mobile Robot

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terrance; Aghazarian, Hrand

    2005-01-01

    ROAMAN is a computer program for autonomous navigation of a mobile robot on a long (as much as hundreds of meters) traversal of terrain. Developed for use aboard a robotic vehicle (rover) exploring the surface of a remote planet, ROAMAN could also be adapted to similar use on terrestrial mobile robots. ROAMAN implements a combination of algorithms for (1) long-range path planning based on images acquired by mast-mounted, wide-baseline stereoscopic cameras, and (2) local path planning based on images acquired by body-mounted, narrow-baseline stereoscopic cameras. The long-range path-planning algorithm autonomously generates a series of waypoints that are passed to the local path-planning algorithm, which plans obstacle-avoiding legs between the waypoints. Both the long- and short-range algorithms use an occupancy-grid representation in computations to detect obstacles and plan paths. Maps that are maintained by the long- and short-range portions of the software are not shared because substantial localization errors can accumulate during any long traverse. ROAMAN is not guaranteed to generate an optimal shortest path, but does maintain the safety of the rover.
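
    Both planners operate on occupancy grids, so a grid search between consecutive waypoints conveys the idea. The sketch below is a generic 4-connected A* over a small grid, not the ROAMAN planner itself.

        import heapq

        def astar(grid, start, goal):
            """4-connected A* over an occupancy grid (0 = free, 1 = obstacle).
            Returns the path as a list of (row, col) cells, or None if unreachable."""
            h = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])   # Manhattan heuristic
            open_set = [(h(start, goal), 0, start, None)]
            came_from, g_cost = {}, {start: 0}
            while open_set:
                _, g, cell, parent = heapq.heappop(open_set)
                if cell in came_from:
                    continue
                came_from[cell] = parent
                if cell == goal:                                   # reconstruct path
                    path = []
                    while cell is not None:
                        path.append(cell)
                        cell = came_from[cell]
                    return path[::-1]
                r, c = cell
                for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                    if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and grid[nr][nc] == 0:
                        ng = g + 1
                        if ng < g_cost.get((nr, nc), float("inf")):
                            g_cost[(nr, nc)] = ng
                            heapq.heappush(open_set, (ng + h((nr, nc), goal), ng, (nr, nc), cell))
            return None

        grid = [[0, 0, 0, 0],
                [1, 1, 0, 1],
                [0, 0, 0, 0]]
        print(astar(grid, (0, 0), (2, 0)))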

  7. Development of autonomous eating mechanism for biomimetic robots

    NASA Astrophysics Data System (ADS)

    Jeong, Kil-Woong; Cho, Ik-Jin; Lee, Yun-Jung

    2005-12-01

    Most recently developed robots are human-friendly robots which imitate animals or humans, such as entertainment robots, biomimetic robots and humanoid robots. Interest in these robots is increasing because social trends are focused on health, welfare, and an aging population. Autonomous eating functionality is a most unique and inherent behavior of pets and animals. Most entertainment robots and pet robots make use of an internal battery. Entertainment robots and pet robots with an internal battery are not able to operate while charging the battery. Therefore, if a robot has an autonomous function for eating batteries as its feed, the robot is not only able to operate while recharging energy but also becomes more human-friendly, like a pet. Here, a new autonomous eating mechanism is introduced for a biomimetic robot, called ELIRO-II (Eating LIzard RObot version 2). The ELIRO-II is able to find food (a small battery), eat, and evacuate by itself. This work describes the sub-parts of the developed mechanism, such as the head-part, mouth-part, and stomach-part. In addition, the control system of the autonomous eating mechanism is described.

  8. Task-level control for autonomous robots

    NASA Technical Reports Server (NTRS)

    Simmons, Reid

    1994-01-01

    Task-level control refers to the integration and coordination of planning, perception, and real-time control to achieve given high-level goals. Autonomous mobile robots need task-level control to effectively achieve complex tasks in uncertain, dynamic environments. This paper describes the Task Control Architecture (TCA), an implemented system that provides commonly needed constructs for task-level control. Facilities provided by TCA include distributed communication, task decomposition and sequencing, resource management, monitoring and exception handling. TCA supports a design methodology in which robot systems are developed incrementally, starting first with deliberative plans that work in nominal situations, and then layering them with reactive behaviors that monitor plan execution and handle exceptions. To further support this approach, design and analysis tools are under development to provide ways of graphically viewing the system and validating its behavior.
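
    The layering of deliberative plans with monitoring and exception handling can be sketched as follows. The task names, monitors, and handlers below are hypothetical stand-ins; they illustrate the general pattern rather than the actual TCA interfaces.

```python
import random

class TaskFailed(Exception):
    pass

def run_task(name, action, monitor, handler, retries=2):
    """Execute a subtask; if its monitor rejects the outcome, invoke the
    exception handler (a reactive behavior) and retry."""
    for attempt in range(retries + 1):
        result = action()
        if monitor(result):
            return result
        handler(result, attempt)
    raise TaskFailed(f"{name} failed after {retries + 1} attempts")

# --- illustrative stand-ins for real robot actions -------------------------
random.seed(0)                                   # reproducible demo run

def drive_to_goal():
    return {"at_goal": random.random() > 0.3}    # nominal navigation sometimes fails

def replan_path(result, attempt):
    print(f"navigation off-nominal (attempt {attempt}), replanning path")

def grasp():
    return {"holding": True}

def regrip(result, attempt):
    print("grasp slipped, re-approaching object")

# Top-level goal decomposed into a sequence of monitored subtasks.
plan = [
    ("navigate", drive_to_goal, lambda r: r["at_goal"], replan_path),
    ("grasp",    grasp,         lambda r: r["holding"], regrip),
]
for name, action, monitor, handler in plan:
    run_task(name, action, monitor, handler)
```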

  9. Towards Robot Scientists for autonomous scientific discovery

    PubMed Central

    2010-01-01

    We review the main components of autonomous scientific discovery, and how they lead to the concept of a Robot Scientist. This is a system which uses techniques from artificial intelligence to automate all aspects of the scientific discovery process: it generates hypotheses from a computer model of the domain, designs experiments to test these hypotheses, runs the physical experiments using robotic systems, analyses and interprets the resulting data, and repeats the cycle. We describe our two prototype Robot Scientists: Adam and Eve. Adam has recently proven the potential of such systems by identifying twelve genes responsible for catalysing specific reactions in the metabolic pathways of the yeast Saccharomyces cerevisiae. This work has been formally recorded in great detail using logic. We argue that the reporting of science needs to become fully formalised and that Robot Scientists can help achieve this. This will make scientific information more reproducible and reusable, and promote the integration of computers in scientific reasoning. We believe the greater automation of both the physical and intellectual aspects of scientific investigations to be essential to the future of science. Greater automation improves the accuracy and reliability of experiments, increases the pace of discovery and, in common with conventional laboratory automation, removes tedious and repetitive tasks from the human scientist. PMID:20119518

  10. Towards Robot Scientists for autonomous scientific discovery.

    PubMed

    Sparkes, Andrew; Aubrey, Wayne; Byrne, Emma; Clare, Amanda; Khan, Muhammed N; Liakata, Maria; Markham, Magdalena; Rowland, Jem; Soldatova, Larisa N; Whelan, Kenneth E; Young, Michael; King, Ross D

    2010-01-01

    We review the main components of autonomous scientific discovery, and how they lead to the concept of a Robot Scientist. This is a system which uses techniques from artificial intelligence to automate all aspects of the scientific discovery process: it generates hypotheses from a computer model of the domain, designs experiments to test these hypotheses, runs the physical experiments using robotic systems, analyses and interprets the resulting data, and repeats the cycle. We describe our two prototype Robot Scientists: Adam and Eve. Adam has recently proven the potential of such systems by identifying twelve genes responsible for catalysing specific reactions in the metabolic pathways of the yeast Saccharomyces cerevisiae. This work has been formally recorded in great detail using logic. We argue that the reporting of science needs to become fully formalised and that Robot Scientists can help achieve this. This will make scientific information more reproducible and reusable, and promote the integration of computers in scientific reasoning. We believe the greater automation of both the physical and intellectual aspects of scientific investigations to be essential to the future of science. Greater automation improves the accuracy and reliability of experiments, increases the pace of discovery and, in common with conventional laboratory automation, removes tedious and repetitive tasks from the human scientist. PMID:20119518

  11. Autonomous biomorphic robots as platforms for sensors

    SciTech Connect

    Tilden, M.; Hasslacher, B.; Mainieri, R.; Moses, J.

    1996-10-01

    The idea of building autonomous robots that can carry out complex and nonrepetitive tasks is an old one, so far unrealized in any meaningful hardware. Tilden has shown recently that there are simple, processor-free solutions to building autonomous mobile machines that continuously adapt to unknown and hostile environments, are designed primarily to survive, and are extremely resistant to damage. These devices use smart mechanics and simple (low component count) electronic neuron control structures having the functionality of biological organisms from simple invertebrates to sophisticated members of the insect and crab family. These devices are paradigms for the development of autonomous machines that can carry out directed goals. The machine then becomes a robust survivalist platform that can carry sensors or instruments. These autonomous roving machines, now in an early stage of development (several proof-of-concept prototype walkers have been built), can be developed so that they are inexpensive, robust, and versatile carriers for a variety of instrument packages. Applications are immediate and many, in areas as diverse as prosthetics, medicine, space, construction, nanoscience, defense, remote sensing, environmental cleanup, and biotechnology.

  12. Outdoor field experience with autonomous RPC based stations

    NASA Astrophysics Data System (ADS)

    Lopes, L.; Assis, P.; Blanco, A.; Carolino, N.; Cerda, M. A.; Conceição, R.; Cunha, O.; Ferreira, M.; Fonte, P.; Luz, R.; Mendes, L.; Pereira, A.; Pimenta, M.; Sarmento, R.; Tomé, B.

    2016-09-01

    In the last two decades Resistive Plate Chambers were employed in the cosmic-ray experiments COVER-PLASTEX and ARGO/YBJ. In both experiments the detectors were housed indoors, likely owing to gas distribution requirements and the need to control the environmental variables that directly affect RPC operational stability. But in experiments where Extensive Air Shower (EAS) sampling is necessary, large-area arrays composed of dispersed stations are deployed, rendering this kind of approach impossible. In this situation, it would be mandatory to have detectors that could be deployed in small standalone stations, with very rare opportunities for maintenance, and with good resilience to environmental conditions. Aiming to meet these requirements, some years ago we started the development of RPCs for Autonomous Stations. The results from indoor tests and measurements were very promising, both concerning performance and stability under a very low gas flow rate, which is the main requirement for Autonomous Stations. In this work we update the indoor results and present the first results concerning stable outdoor operation. In particular, a dynamic adjustment of the high voltage is applied to keep the gas gain constant.
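
    The dynamic high-voltage adjustment mentioned above can be sketched with the parameterization commonly used for RPCs, in which the effective voltage scales with ambient temperature and pressure as HV_eff = HV_app · (T/T_ref) · (P_ref/P). The reference values and example numbers below are illustrative assumptions, not those of the stations described in this work.

```python
def corrected_high_voltage(hv_ref, temperature_k, pressure_mbar,
                           t_ref_k=293.15, p_ref_mbar=1010.0):
    """Return the applied HV that keeps the effective voltage (and hence the
    gas gain) roughly constant as ambient temperature and pressure change.
    Inverts HV_eff = HV_app * (T / T_ref) * (P_ref / P); the reference values
    are illustrative, not those of the autonomous stations described above."""
    return hv_ref * (t_ref_k / temperature_k) * (pressure_mbar / p_ref_mbar)

# Example: a warm, low-pressure afternoon calls for lowering the applied HV.
print(round(corrected_high_voltage(5600.0, 303.15, 995.0), 1))
```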

  13. Robotic technologies for outdoor industrial vehicles

    NASA Astrophysics Data System (ADS)

    Stentz, Anthony

    2001-09-01

    The commercial industries of agriculture, mining, construction, and material handling employ a wide variety of mobile machines, including tractors, combines, Load-Haul-Dump vehicles, trucks, paving machines, fork trucks, and many more. Automation of these vehicles promises to improve productivity, reduce operational costs, and increase safety. Since the vehicles typically operate in difficult environments, under all weather conditions, and in the presence of people and other obstacles, reliable automation faces severe technical challenges. Furthermore, the viable technology solutions are constrained by cost considerations. Fortunately, due to the limited application domain, repetitive nature, and the utility of partial automation for most tasks, robotics technologies can have a profound impact on industrial vehicles. In this paper, we describe a technical approach developed at Carnegie Mellon University for automating mobile machines in several applications, including mass excavation, mining, and agriculture. The approach is introduced via case studies, and the results are presented.

  14. Autonomous robot behavior based on neural networks

    NASA Astrophysics Data System (ADS)

    Grolinger, Katarina; Jerbic, Bojan; Vranjes, Bozo

    1997-04-01

    The purpose of an autonomous robot is to solve various tasks while adapting its behavior to a variable environment; it is expected to navigate much like a human would, including handling uncertain and unexpected obstacles. To achieve this, the robot has to be able to find solutions to unknown situations, to learn the knowledge it gains through experience (that is, action procedures together with the corresponding knowledge of the workspace structure), and to recognize its working environment. The planning of intelligent robot behavior presented in this paper implements reinforcement learning based on strategic and random attempts for finding solutions, and a neural network approach for memorizing and recognizing the workspace structure (the structural assignment problem). Some well-known neural networks based on unsupervised learning are considered with regard to the structural assignment problem. An adaptive fuzzy shadowed neural network is developed. It has an additional shadowed hidden layer, a specific learning rule, and an initialization phase. The developed neural network combines the advantages of networks based on Adaptive Resonance Theory and, using the shadowed hidden layer, provides the ability to recognize slightly translated or rotated obstacles in any direction.

  15. Reactive navigational controller for autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Hawkins, Scott

    1993-12-01

    Autonomous mobile robots must respond to external challenges and threats in real time. One way to satisfy this requirement is to use a fast low level intelligence to react to local environment changes. A fast reactive controller has been implemented which performs the task of real time local navigation by integrating primitive elements of perception, planning, and control. Competing achievement and constraint behaviors are used to allow abstract qualitative specification of navigation goals. An interface is provided to allow a higher level deliberative intelligence with a more global perspective to set local goals for the reactive controller. The reactive controller's simplistic strategies may not always succeed, so a means to monitor and redirect the reactive controller is provided.
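
    A minimal sketch of how competing achievement and constraint behaviors can be blended into a single heading command is given below. The behavior set, weighting scheme, and thresholds are illustrative assumptions, not the controller described in the paper.

```python
import math

def goal_seek(pose, goal):
    """Achievement behavior: steer toward the local goal set by the planner."""
    dx, dy = goal[0] - pose[0], goal[1] - pose[1]
    return math.atan2(dy, dx), 1.0                 # (desired heading, weight)

def obstacle_avoid(pose, ranges):
    """Constraint behavior: steer away from the closest range return."""
    bearing, dist = min(ranges, key=lambda r: r[1])
    weight = 0.0 if dist > 1.5 else (1.5 - dist) * 2.0
    return bearing + math.pi, weight               # head opposite the obstacle

def arbitrate(pose, goal, ranges):
    """Blend the competing behaviors into a single heading command."""
    cmds = [goal_seek(pose, goal), obstacle_avoid(pose, ranges)]
    x = sum(w * math.cos(h) for h, w in cmds)
    y = sum(w * math.sin(h) for h, w in cmds)
    return math.atan2(y, x)

# One control cycle: robot at origin, goal ahead-right, obstacle dead ahead at 0.8 m.
heading = arbitrate((0.0, 0.0), (4.0, -1.0), [(0.0, 0.8), (1.2, 3.0)])
print(f"commanded heading: {math.degrees(heading):.1f} deg")
```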

  16. Development of Outdoor Service Robot to Collect Trash on Streets

    NASA Astrophysics Data System (ADS)

    Obata, Masayuki; Nishida, Takeshi; Miyagawa, Hidekazu; Kondo, Takashi; Ohkawa, Fujio

    The outdoor service robot, which we call OSR-01, has been developed for cleaning up urban areas by collecting discarded trash such as PET bottles, cans, plastic bags and so on. In this paper, we describe the architecture of OSR-01, consisting of hardware such as sensors, a manipulator, and driving wheels for searching for and picking up trash, and software such as fast pattern matching for identifying various kinds of trash and distance measurement for picking it up with the manipulator. After describing the vision system in detail, which is one of the most critical parts of the trash-collection task, we show the results of an open experiment in which OSR-01 collected PET bottles on a real shopping street in the special zone for robot research and development in Kitakyushu City.

  17. Multimedia modeling of autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Zada, Fatma; Guirguis, S.

    1997-09-01

    Modeling of autonomous mobile robots (AMRs) enables designers to investigate various aspects of a design before the actual implementation takes place. Simulation techniques undoubtedly enrich the design tools, allowing the designer to vary the design parameters as appropriate until some optimal performance point is achieved. Although they are general purpose, multimedia tools, especially authoring tools, can give the AMR designer some degree of assistance in completing simulation tasks quickly. Such rapid prototyping tools are cost effective and allow the designer to interactively manipulate a design in simple steps. In this paper, a multimedia environment has been constructed to enable designers to simulate AMRs in order to investigate aspects of their kinematics and dynamics. These design experiences can also be gathered and categorized in a tutoring system that could be used by practitioners and students enrolled in highly technical courses such as robotics. The rich multimedia environment can assist the learner in many ways by devising examples and suggesting solutions and design trade-offs that have been experienced before.

  18. Biomimetic smart sensors for autonomous robotic behavior I: acoustic processing

    NASA Astrophysics Data System (ADS)

    Deligeorges, Socrates; Xue, Shuwan; Soloway, Aaron; Lichtenstein, Lee; Gore, Tyler; Hubbard, Allyn

    2009-05-01

    Robots are rapidly becoming an integral tool on the battlefield and in homeland security, replacing humans in hazardous conditions. To enhance the effectiveness of robotic assets and their interaction with human operators, smart sensors are required to give more autonomous function to robotic platforms. Biologically inspired sensors are an essential part of this development of autonomous behavior and can increase both the capability and performance of robotic systems. Smart, biologically inspired acoustic sensors have the potential to extend the autonomous capabilities of robotic platforms to include sniper detection, vehicle tracking, personnel detection, and general acoustic monitoring. The key to enabling these capabilities is biomimetic acoustic processing using a time-domain processing method based on the neural structures of the mammalian auditory system. These biologically inspired algorithms replicate the extremely adaptive processing of the auditory system, yielding high sensitivity over a broad dynamic range. The algorithms provide tremendous robustness in noisy and echoic spaces, properties necessary for autonomous function in real-world acoustic environments. These biomimetic acoustic algorithms also provide highly accurate localization of both persistent and transient sounds over a wide frequency range, using baselines on the order of only inches. A specialized smart sensor has been developed to interface with an iRobot Packbot® platform specifically to enhance its autonomous behaviors in response to personnel and gunfire. The low-power, highly parallel biomimetic processor, in conjunction with a biomimetic vestibular system (discussed in the companion paper), has shown the system's autonomous response to gunfire in complicated acoustic environments to be highly effective.

  19. Autonomous Realtime Threat-Hunting Robot (ARTHR)

    ScienceCinema

    INL

    2016-07-12

    Idaho National Laboratory researchers developed an intelligent plug-and-play robot payload that transforms commercial robots into effective first responders for deadly chemical, radiological and explosive threats.

  20. Autonomous Realtime Threat-Hunting Robot (ARTHR)

    SciTech Connect

    INL

    2008-05-29

    Idaho National Laboratory researchers developed an intelligent plug-and-play robot payload that transforms commercial robots into effective first responders for deadly chemical, radiological and explosive threats.

  1. Miniaturized Autonomous Extravehicular Robotic Camera (Mini AERCam)

    NASA Technical Reports Server (NTRS)

    Fredrickson, Steven E.

    2001-01-01

    The NASA Johnson Space Center (JSC) Engineering Directorate is developing the Autonomous Extravehicular Robotic Camera (AERCam), a low-volume, low-mass free-flying camera system. AERCam project team personnel recently initiated development of a miniaturized version of AERCam known as Mini AERCam. The Mini AERCam target design is a spherical "nanosatellite" free-flyer 7.5 inches in diameter and weighing 10 pounds. Mini AERCam is building on the success of the AERCam Sprint STS-87 flight experiment by adding new on-board sensing and processing capabilities while simultaneously reducing volume by 80%. Achieving enhanced capability in a smaller package depends on applying miniaturization technology across virtually all subsystems. Technology innovations being incorporated include micro electromechanical system (MEMS) gyros, "camera-on-a-chip" CMOS imagers, rechargeable xenon gas propulsion system, rechargeable lithium ion battery, custom avionics based on the PowerPC 740 microprocessor, GPS relative navigation, digital radio frequency communications and tracking, micropatch antennas, digital instrumentation, and dense mechanical packaging. The Mini AERCam free-flyer will initially be integrated into an approximate flight-like configuration for demonstration on an airbearing table. A pilot-in-the-loop and hardware-in-the-loop simulation to simulate on-orbit navigation and dynamics will complement the airbearing table demonstration. The Mini AERCam lab demonstration is intended to form the basis for future development of an AERCam flight system that provides beneficial on-orbit views unobtainable from fixed cameras, cameras on robotic manipulators, or cameras carried by EVA crewmembers.

  2. Designing low cost autonomous robots in unknown environments

    NASA Astrophysics Data System (ADS)

    Alouani, Ali T.; Sri, Aravind M.

    2008-04-01

    This paper documents the design and development of a low-cost robot capable of autonomous navigation in unknown indoor environments. The proposed design uses only two complementary rotating sensors for navigation. The use of real-time mapping allows for detection and avoidance of obstacles. The fusion of the sensor data helped improve the accuracy of the online map of the robot's environment. The robot builds an online map of its environment and then automatically plans its navigation path. Feedback control keeps the robot moving along its planned path. The robot has been successfully tested in a cluttered environment in the Advanced Systems Lab. Preliminary tests have shown the success of the robot in navigating autonomously.

  3. An Autonomous Mobile Robot for Tsukuba Challenge: JW-Future

    NASA Astrophysics Data System (ADS)

    Fujimoto, Katsuharu; Kaji, Hirotaka; Negoro, Masanori; Yoshida, Makoto; Mizutani, Hiroyuki; Saitou, Tomoya; Nakamura, Katsu

    “Tsukuba Challenge” is the only trial of its kind that requires mobile robots to work autonomously and safely on public walkways. In this paper, we introduce the outline of our robot “JW-Future”, developed for this experiment on the basis of an electric wheelchair. Additionally, the significance of participating in such a technical trial is discussed from the viewpoint of industry.

  4. Navigation strategies for multiple autonomous mobile robots moving in formation

    NASA Technical Reports Server (NTRS)

    Wang, P. K. C.

    1991-01-01

    The problem of deriving navigation strategies for a fleet of autonomous mobile robots moving in formation is considered. Here, each robot is represented by a particle with a spherical effective spatial domain and a specified cone of visibility. The global motion of each robot in the world space is described by the equations of motion of the robot's center of mass. First, methods for formation generation are discussed. Then, simple navigation strategies for robots moving in formation are derived. A sufficient condition for the stability of a desired formation pattern for a fleet of robots each equipped with the navigation strategy based on nearest neighbor tracking is developed. The dynamic behavior of robot fleets consisting of three or more robots moving in formation in a plane is studied by means of computer simulation.
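
    The nearest-neighbor tracking strategy can be illustrated with a small planar simulation: robot 0 acts as the leader, and every other robot steers toward its nearest neighbor plus a fixed formation offset. The gains, time step, and offsets below are arbitrary choices for the sketch, not values from the paper.

```python
import math

def nearest_neighbor_step(positions, leader_velocity, offsets, dt=0.1, k=0.8):
    """One simulation step: the leader integrates its velocity, the followers
    track their nearest neighbour plus a formation offset (illustrative gains)."""
    new_positions = [(positions[0][0] + leader_velocity[0] * dt,
                      positions[0][1] + leader_velocity[1] * dt)]
    for i in range(1, len(positions)):
        # pick the nearest other robot as the tracked neighbour
        j = min((j for j in range(len(positions)) if j != i),
                key=lambda j: math.dist(positions[i], positions[j]))
        tx = positions[j][0] + offsets[i][0]       # desired slot relative to neighbour
        ty = positions[j][1] + offsets[i][1]
        new_positions.append((positions[i][0] + k * (tx - positions[i][0]) * dt,
                              positions[i][1] + k * (ty - positions[i][1]) * dt))
    return new_positions

# Three robots converging to a wedge formation behind a leader moving along +x.
pos = [(0.0, 0.0), (-2.0, 1.5), (-2.5, -1.0)]
offsets = [(0.0, 0.0), (-1.0, 1.0), (-1.0, -1.0)]
for _ in range(200):
    pos = nearest_neighbor_step(pos, (0.5, 0.0), offsets)
print([(round(x, 2), round(y, 2)) for x, y in pos])
```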

  5. Development of an autonomous satellite robot for retrieving a satellite

    NASA Astrophysics Data System (ADS)

    Komatsu, Tadashi; Uenohara, Michihiro; Iikura, Shoichi; Miura, Hirofumi; Shimoyama, Isao

    We developed a two-dimensional operation test-bed of an autonomous free-flying space robot for retrieving a satellite. This robot is floating on a planar base using air bearings and is able to fly around using thrusters for position control and a control moment gyro for attitude control. This robot also installs hardware systems such as vision systems, board computers, image processing units, and software systems such as algorithms for path plannings.

  6. Tele-assistance for semi-autonomous robots

    NASA Technical Reports Server (NTRS)

    Rogers, Erika; Murphy, Robin R.

    1994-01-01

    This paper describes a new approach in semi-autonomous mobile robots. In this approach the robot has sufficient computerized intelligence to function autonomously under a certain set of conditions, while the local system is a cooperative decision making unit that combines human and machine intelligence. Communication is then allowed to take place in a common mode and in a common language. A number of exception-handling scenarios that were constructed as a result of experiments with actual sensor data collected from two mobile robots were presented.

  7. Experimentation and concept formation by an autonomous mobile robot

    SciTech Connect

    Spelt, P.F.; deSaussure, G.; Oliver, G.; Silliman, M.

    1990-01-01

    The Center for Engineering Systems Advanced Research (CESAR) conducts basic research in the area of intelligent machines. In this paper, we describe our approach to a class of machine learning which involves autonomous concept formation using feedback from trial-and-error experimentation with the environment. Our formulation was experimentally validated on an autonomous mobile robot, which learned the task of control panel monitoring and manipulation for effective process control. Conclusions are drawn concerning the applicability of the system to a more general class of learning problems, and implications for the use of autonomous mobile robots in hostile and unknown environments are discussed. 11 refs., 7 figs.

  8. Towards Principled Experimental Study of Autonomous Mobile Robots

    NASA Technical Reports Server (NTRS)

    Gat, Erann

    1995-01-01

    We review the current state of research in autonomous mobile robots and conclude that there is an inadequate basis for predicting the reliability and behavior of robots operating in unengineered environments. We present a new approach to the study of autonomous mobile robot performance based on formal statistical analysis of independently reproducible experiments conducted on real robots. Simulators serve as models rather than experimental surrogates. We demonstrate three new results: 1) two commonly used performance metrics (time and distance) are not as well correlated as is often tacitly assumed; 2) the probability distributions of these performance metrics are exponential rather than normal; and 3) a modular, object-oriented simulation accurately predicts the behavior of the real robot in a statistically significant manner.
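
    The kind of analysis advocated here can be sketched as follows: check how strongly two performance metrics are correlated, and whether a metric's spread looks exponential (coefficient of variation near 1) rather than normal. The synthetic trial data below merely stand in for real experiment logs.

```python
import random, statistics

# Synthetic trial outcomes used only to exercise the analysis.
random.seed(1)
times = [random.expovariate(1 / 120.0) for _ in range(50)]   # run time, seconds
dists = [t * random.uniform(0.2, 0.6) for t in times]        # distance, metres

def pearson(xs, ys):
    """Sample Pearson correlation coefficient."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

print(f"time/distance correlation: {pearson(times, dists):.2f}")

# For an exponential distribution the coefficient of variation is close to 1;
# for a narrow normal distribution of run times it would be much smaller.
cv = statistics.stdev(times) / statistics.mean(times)
print(f"coefficient of variation of run times: {cv:.2f}")
```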

  9. ARK: Autonomous mobile robot in an industrial environment

    NASA Technical Reports Server (NTRS)

    Nickerson, S. B.; Jasiobedzki, P.; Jenkin, M.; Jepson, A.; Milios, E.; Down, B.; Service, J. R. R.; Terzopoulos, D.; Tsotsos, J.; Wilkes, D.

    1994-01-01

    This paper describes research on the ARK (Autonomous Mobile Robot in a Known Environment) project. The technical objective of the project is to build a robot that can navigate in a complex industrial environment using maps with permanent structures. The environment is not altered in any way by adding easily identifiable beacons, and the robot relies on naturally occurring objects to use as visual landmarks for navigation. The robot is equipped with various sensors that can detect unmapped obstacles, landmarks and objects. In this paper we describe the robot's industrial environment, its architecture, a novel combined range and vision sensor, and our recent results in controlling the robot in the real-time detection of objects using their color and in the processing of the robot's range and vision sensor data for navigation.

  10. Autonomous Evolution of Dynamic Gaits with Two Quadruped Robots

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.; Takamura, Seichi; Yamamoto, Takashi; Fujita, Masahiro

    2004-01-01

    A challenging task that must be accomplished for every legged robot is creating the walking and running behaviors needed for it to move. In this paper we describe our system for autonomously evolving dynamic gaits on two of Sony's quadruped robots. Our evolutionary algorithm runs on board the robot and uses the robot's sensors to compute the quality of a gait without assistance from the experimenter. First we show the evolution of a pace and trot gait on the OPEN-R prototype robot. With the fastest gait, the robot moves at over 10 m/min, which is more than forty body-lengths/min. While these first gaits are somewhat sensitive to the robot and environment in which they are evolved, we then show the evolution of robust dynamic gaits, one of which is used on the ERS-110, the first consumer version of AIBO.
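
    A much-simplified sketch of on-board gait evolution is shown below: a (1+1)-style loop mutates gait parameters and keeps a mutant only if the robot's own speed estimate improves. The parameter set, the fitness stand-in, and the hill-climbing variant are illustrative simplifications, not Sony's actual evolutionary algorithm.

```python
import random

def measured_speed(params):
    """Stand-in for the on-board fitness measurement: in the real system the
    robot walks with these gait parameters and its sensors estimate speed."""
    step_len, freq, duty = params
    return step_len * freq * (1.0 - abs(duty - 0.6)) + random.gauss(0, 0.01)

def evolve_gait(generations=100, sigma=0.05):
    """Minimal (1+1)-style loop: mutate the current gait, keep the mutant only
    if the measured speed improves."""
    best = [0.05, 1.0, 0.5]            # step length (m), frequency (Hz), duty factor
    best_speed = measured_speed(best)
    for _ in range(generations):
        cand = [max(0.01, p + random.gauss(0, sigma)) for p in best]
        speed = measured_speed(cand)
        if speed > best_speed:
            best, best_speed = cand, speed
    return best, best_speed

params, speed = evolve_gait()
print(f"best gait {params} -> {speed:.3f} m/s (simulated)")
```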

  11. Autonomous Realtime Threat-Hunting Robot (ARTHR)

    ScienceCinema

    Idaho National Laboratory - David Bruemmer, Curtis Nielsen

    2016-07-12

    Idaho National Laboratory researchers developed an intelligent plug-and-play robot payload that transforms commercial robots into effective first responders for deadly chemical, radiological and explosive threats. To learn more, visit

  12. Autonomous Realtime Threat-Hunting Robot (ARTHR)

    SciTech Connect

    Idaho National Laboratory - David Bruemmer, Curtis Nielsen

    2008-05-29

    Idaho National Laboratory researchers developed an intelligent plug-and-play robot payload that transforms commercial robots into effective first responders for deadly chemical, radiological and explosive threats. To learn more, visit

  13. Vision Based Autonomous Robotic Control for Advanced Inspection and Repair

    NASA Technical Reports Server (NTRS)

    Wehner, Walter S.

    2014-01-01

    The advanced inspection system is an autonomous control and analysis system that improves the inspection and remediation operations for ground and surface systems. It uses optical imaging technology with intelligent computer vision algorithms to analyze physical features of the real-world environment to make decisions and learn from experience. The advanced inspection system plans to control a robotic manipulator arm, an unmanned ground vehicle and cameras remotely, automatically and autonomously. There are many computer vision, image processing and machine learning techniques available as open source for using vision as a sensory feedback in decision-making and autonomous robotic movement. My responsibilities for the advanced inspection system are to create a software architecture that integrates and provides a framework for all the different subsystem components; identify open-source algorithms and techniques; and integrate robot hardware.

  14. Autonomous assistance navigation for robotic wheelchairs in confined spaces.

    PubMed

    Cheein, Fernando Auat; Carelli, Ricardo; De la Cruz, Celso; Muller, Sandra; Bastos Filho, Teodiano F

    2010-01-01

    In this work, a visual interface for assisting the navigation of a robotic wheelchair is presented. The visual interface is developed for navigation in confined spaces such as narrow corridors or corridor ends. The interface offers two navigation modes: non-autonomous and autonomous. Non-autonomous driving of the robotic wheelchair is performed by means of a hand joystick, which directs the motion of the vehicle within the environment. Autonomous driving is performed when the user of the wheelchair has to turn (90, 90 or 180 degrees) within the environment. The turning strategy is performed by a maneuverability algorithm compatible with the kinematics of the wheelchair and by a SLAM (Simultaneous Localization and Mapping) algorithm. The SLAM algorithm provides the interface with information concerning the layout of the environment and the pose (position and orientation) of the wheelchair within it. Experimental and statistical results for the interface are also shown in this work.

  15. Autonomous assistance navigation for robotic wheelchairs in confined spaces.

    PubMed

    Cheein, Fernando Auat; Carelli, Ricardo; De la Cruz, Celso; Muller, Sandra; Bastos Filho, Teodiano F

    2010-01-01

    In this work, a visual interface for assisting the navigation of a robotic wheelchair is presented. The visual interface is developed for navigation in confined spaces such as narrow corridors or corridor ends. The interface offers two navigation modes: non-autonomous and autonomous. Non-autonomous driving of the robotic wheelchair is performed by means of a hand joystick, which directs the motion of the vehicle within the environment. Autonomous driving is performed when the user of the wheelchair has to turn (90, 90 or 180 degrees) within the environment. The turning strategy is performed by a maneuverability algorithm compatible with the kinematics of the wheelchair and by a SLAM (Simultaneous Localization and Mapping) algorithm. The SLAM algorithm provides the interface with information concerning the layout of the environment and the pose (position and orientation) of the wheelchair within it. Experimental and statistical results for the interface are also shown in this work. PMID:21095654

  16. Remote radioactive waste drum inspection with an autonomous mobile robot

    SciTech Connect

    Heckendorn, F.M.; Ward, C.R.; Wagner, D.G.

    1992-11-01

    An autonomous mobile robot is being developed to perform remote surveillance and inspection tasks on large numbers of stored radioactive waste drums. The robot will be self-guided through narrow storage aisles and will record the visual image of each viewable drum for subsequent off-line analysis and archiving. The system will remove personnel from potential exposure to radiation, perform the required inspections, and improve the ability to assess long-term trends in drum conditions.

  17. Remote radioactive waste drum inspection with an autonomous mobile robot

    SciTech Connect

    Heckendorn, F.M.; Ward, C.R.; Wagner, D.G.

    1992-01-01

    An autonomous mobile robot is being developed to perform remote surveillance and inspection tasks on large numbers of stored radioactive waste drums. The robot will be self-guided through narrow storage aisles and will record the visual image of each viewable drum for subsequent off-line analysis and archiving. The system will remove personnel from potential exposure to radiation, perform the required inspections, and improve the ability to assess long-term trends in drum conditions.

  18. Brain, mind, body and society: autonomous system in robotics.

    PubMed

    Shimoda, Motomu

    2013-12-01

    In this paper I examine the issues related to the robot with mind. Creating a robot with mind aims to recreate neural function by engineering. The robot with mind is expected not only to process external information with its built-in program and behave accordingly, but also to gain consciousness activity responding to multiple conditions, together with flexible and interactive communication skills for coping with unknown situations. That prospect is based on the development of artificial intelligence, in which self-organizing and self-emergent functions have become available in recent years. To date, controllable aspects in robotics have been restricted to data making and the programming of cognitive abilities, while consciousness activities and communication skills have been regarded as uncontrollable aspects due to their contingency and uncertainty. However, some researchers in robotics claim that every activity of the mind can be recreated by engineering and is therefore controllable. Based on the development of the cognitive abilities of children and the findings of neuroscience, researchers have attempted to produce the latest artificial intelligence with autonomous learning systems. I conclude that controllability is inconsistent with autonomy in the genuine sense and that autonomous robots recreated by engineering cannot be autonomous partners of humans. PMID:24558734

  19. Brain, mind, body and society: autonomous system in robotics.

    PubMed

    Shimoda, Motomu

    2013-12-01

    In this paper I examine the issues related to the robot with mind. Creating a robot with mind aims to recreate neural function by engineering. The robot with mind is expected not only to process external information with its built-in program and behave accordingly, but also to gain consciousness activity responding to multiple conditions, together with flexible and interactive communication skills for coping with unknown situations. That prospect is based on the development of artificial intelligence, in which self-organizing and self-emergent functions have become available in recent years. To date, controllable aspects in robotics have been restricted to data making and the programming of cognitive abilities, while consciousness activities and communication skills have been regarded as uncontrollable aspects due to their contingency and uncertainty. However, some researchers in robotics claim that every activity of the mind can be recreated by engineering and is therefore controllable. Based on the development of the cognitive abilities of children and the findings of neuroscience, researchers have attempted to produce the latest artificial intelligence with autonomous learning systems. I conclude that controllability is inconsistent with autonomy in the genuine sense and that autonomous robots recreated by engineering cannot be autonomous partners of humans.

  20. FPGA implementation of vision algorithms for small autonomous robots

    NASA Astrophysics Data System (ADS)

    Anderson, J. D.; Lee, D. J.; Archibald, J. K.

    2005-10-01

    The use of on-board vision with small autonomous robots has been made possible by the advances in the field of Field Programmable Gate Array (FPGA) technology. By connecting a CMOS camera to an FPGA board, on-board vision has been used to reduce the computation time inherent in vision algorithms. The FPGA board allows the user to create custom hardware in a faster, safer, and more easily verifiable manner that decreases the computation time and allows the vision to be done in real-time. Real-time vision tasks for small autonomous robots include object tracking, obstacle detection and avoidance, and path planning. Competitions were created to demonstrate that our algorithms work with our small autonomous vehicles in dealing with these problems. These competitions include Mouse-Trapped-in-a-Box, where the robot has to detect the edges of a box that it is trapped in and move towards them without touching them; Obstacle Avoidance, where an obstacle is placed at any arbitrary point in front of the robot and the robot has to navigate itself around the obstacle; Canyon Following, where the robot has to move to the center of a canyon and follow the canyon walls trying to stay in the center; the Grand Challenge, where the robot had to navigate a hallway and return to its original position in a given amount of time; and Stereo Vision, where a separate robot had to catch tennis balls launched from an air powered cannon. Teams competed on each of these competitions that were designed for a graduate-level robotic vision class, and each team had to develop their own algorithm and hardware components. This paper discusses one team's approach to each of these problems.

  1. Automatic detection and classification of obstacles with applications in autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Ponomaryov, Volodymyr I.; Rosas-Miranda, Dario I.

    2016-04-01

    Hardware implementation of an automatic detection and classification of objects that can represent obstacles for an autonomous mobile robot using stereo vision algorithms is presented. We propose and evaluate a new method to detect and classify objects for a mobile robot in outdoor conditions. This method is divided into two parts: the first is an object detection step based on the distance from the objects to the camera and a BLOB analysis; the second is a classification step based on visual primitives and an SVM classifier. The proposed method runs on a GPU in order to reduce processing time. This is performed with hardware based on multi-core processors and a GPU platform, using an NVIDIA GeForce GT640 graphics card and Matlab on a PC running Windows 10.
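
    The two-stage pipeline (distance-based detection with BLOB analysis, then SVM classification of visual primitives) can be sketched as below, assuming OpenCV and scikit-learn are available; the thresholds and feature choices are illustrative, not those of the paper.

```python
import cv2
import numpy as np
from sklearn.svm import SVC

def detect_blobs(depth_m, max_range=5.0, min_area=200):
    """Stage 1: threshold the stereo depth map and keep sufficiently large blobs."""
    near = ((depth_m > 0) & (depth_m < max_range)).astype(np.uint8) * 255
    n, labels, stats, _ = cv2.connectedComponentsWithStats(near)
    return [stats[i, :4] for i in range(1, n)          # (x, y, w, h) per candidate
            if stats[i, cv2.CC_STAT_AREA] >= min_area]

def blob_features(gray, box):
    """Simple visual primitives for one blob: aspect ratio + intensity histogram."""
    x, y, w, h = box
    patch = cv2.resize(gray[y:y + h, x:x + w], (32, 32))
    hist = cv2.calcHist([patch], [0], None, [16], [0, 256]).flatten()
    return np.concatenate(([w / max(h, 1)], hist / hist.sum()))

# Stage 2 would be trained offline on labelled patches, e.g.:
#   clf = SVC(kernel="rbf").fit(feature_matrix, labels)
# and applied at run time to each detected blob:
#   for box in detect_blobs(depth_map):
#       label = clf.predict([blob_features(gray_image, box)])[0]
```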

  2. GPS and odometer data fusion for outdoor robots continuous positioning

    NASA Astrophysics Data System (ADS)

    Pozo-Ruz, Ana; Garcia-Perez, Lia; Garcia-Alegre, Maria C.; Guinea, Domingo; Ribeiro, Angela; Sandoval, Francisco

    2002-02-01

    This work describes an approach to obtaining the best estimate of the position of the outdoor robot ROJO, a low-cost lawnmower intended to perform unmanned precision-agriculture tasks such as spraying pesticides in horticulture. For continuous localization of ROJO, two redundant sensors have been installed on board: a DGPS receiver with submetric precision and an odometric system. The DGPS allows absolute positioning of the vehicle in the field, but GPS failures in signal reception due to obstacles and electrical and meteorological disturbances led us to integrate the odometric system. Thus, a robust odometer based upon magnetic strip sensors has been designed and integrated in the vehicle. These sensors continuously deliver the position of the vehicle relative to its initial position, covering the DGPS blindness periods. They give an approximate location of the vehicle in the field that can in turn be conveniently updated and corrected by the DGPS. To provide the best estimate, a fusion algorithm has been proposed and tested, wherein the best estimate is calculated as the maximum of the joint probability function obtained from both position estimates of the on-board sensors. Some results are presented to show the performance of the proposed sensor-fusion technique.
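
    For Gaussian position estimates, the maximum of the joint probability of two sensors is the precision-weighted mean, which gives a compact fusion rule. The covariances in the example below are illustrative, not the actual ROJO sensor models.

```python
import numpy as np

def fuse_positions(p_gps, cov_gps, p_odo, cov_odo):
    """Fuse two position estimates by maximizing their joint (product) Gaussian
    likelihood; for Gaussians this maximum is the precision-weighted mean."""
    w_gps = np.linalg.inv(cov_gps)
    w_odo = np.linalg.inv(cov_odo)
    cov_fused = np.linalg.inv(w_gps + w_odo)
    p_fused = cov_fused @ (w_gps @ p_gps + w_odo @ p_odo)
    return p_fused, cov_fused

gps = np.array([12.4, 7.9]);  cov_gps = np.diag([0.8 ** 2, 0.8 ** 2])   # submetric DGPS
odo = np.array([12.1, 8.3]);  cov_odo = np.diag([0.3 ** 2, 0.5 ** 2])   # drifting odometry
pos, cov = fuse_positions(gps, cov_gps, odo, cov_odo)
print(pos.round(2))
```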

  3. ODYSSEUS autonomous walking robot: The leg/arm design

    NASA Technical Reports Server (NTRS)

    Bourbakis, N. G.; Maas, M.; Tascillo, A.; Vandewinckel, C.

    1994-01-01

    ODYSSEUS is an autonomous walking robot which makes use of three wheels and three legs for its movement in the free navigation space. More specifically, it uses its wheels to move around in environments where the surface is smooth and even. However, when there are low obstacles, stairs, or slight unevenness in the navigation environment, the robot uses both wheels and legs to travel efficiently. In this paper we present the detailed hardware design and the simulated behavior of the extended leg/arm part of the robot, since it plays a very significant role in the robot's actions (movements, selection of objects, etc.). In particular, the leg/arm consists of three major parts. The first part is a pipe attached to the robot base with a flexible 3-D joint; this pipe has a rotating bar as an extended part, which terminates in a 3-D flexible joint. The second part of the leg/arm is a pipe similar to the first; its extended bar ends at a 2-D joint. The last part of the leg/arm is a clip-hand, which is used for picking up several small, lightweight objects and, when in 'closed' mode, serves as a supporting part of the robot leg. The entire leg/arm is controlled and synchronized by a microcontroller (68HC11) attached to the robot base.

  4. Development of a Commercially Viable, Modular Autonomous Robotic Systems for Converting any Vehicle to Autonomous Control

    NASA Technical Reports Server (NTRS)

    Parish, David W.; Grabbe, Robert D.; Marzwell, Neville I.

    1994-01-01

    A Modular Autonomous Robotic System (MARS) is being developed, consisting of a modular autonomous vehicle control system that can be retrofitted onto any vehicle to convert it to autonomous control, together with support for a modular payload for multiple applications. The MARS design is scalable, reconfigurable, and cost effective due to the use of modern open-system architecture design methodologies, including serial control bus technology to simplify system wiring and enhance scalability. The design is augmented with modular, object-oriented (C++) software implementing a hierarchy of five levels of control: teleoperated, continuous guidepath following, periodic guidepath following, absolute-position autonomous navigation, and relative-position autonomous navigation. The present effort is focused on producing a system that is commercially viable for routine autonomous patrolling of known, semi-structured environments, such as environmental monitoring of chemical and petroleum refineries, exterior physical security and surveillance, perimeter patrolling, and intrafacility transport applications.

  5. Navigation and learning experiments by an autonomous robot

    SciTech Connect

    de Saussure, G.; Weisbin, C.R.; Spelt, P.F.

    1988-01-01

    Developing an autonomous mobile robot capable of navigation, surveillance and manipulation in complex and dynamic environments is a key research activity at CESAR, Oak Ridge National Laboratory's Center for Engineering Systems Advanced Research. The latest series of completed experiments was performed using the autonomous mobile robot HERMIES-IIB (Hostile Environment Robotic Machine Intelligence Experiment Series II-B). The next section describes HERMIES-IIB and some of its major components required for autonomous operation in unstructured, dynamic environments. Section 3 outlines some ongoing research in autonomous navigation. Section 4 discusses our newest research in machine learning concepts. Section 5 describes a successful experiment in which the robot is placed in an arbitrary initial location without any prior specification of the content of its environment, successively discovers and navigates around stationary or moving obstacles, picks up and moves small obstacles, searches for a control panel and performs a learned sequence of manipulations on the panel devices. The last section outlines some future directions of the program.

  6. Defining proprioceptive behaviors for autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Overholt, James L.; Hudas, Greg R.; Gerhart, Grant R.

    2002-07-01

    Proprioception is a sense of body position and movement that supports the control of many automatic motor functions such as posture and locomotion. This concept, normally relegated to the fields of neural physiology and kinesiology, is being utilized in the field of unmanned mobile robotics. This paper looks at developing proprioceptive behaviors for use in controlling an unmanned ground vehicle. First, we will discuss the field of behavioral control of mobile robots. Next, a discussion of proprioception and the development of proprioceptive sensors will be presented. We will then focus on the development of a unique neural-fuzzy architecture that will be used to incorporate the control behaviors coming directly from the proprioceptive sensors. Finally we will present a simulation experiment where a simple multi-sensor robot, utilizing both external and proprioceptive sensors, is presented with the task of navigating an unknown terrain to a known target position. Results of the mobile robot utilizing this unique fusion methodology will be discussed.

  7. Autonomous learning in humanoid robotics through mental imagery.

    PubMed

    Di Nuovo, Alessandro G; Marocco, Davide; Di Nuovo, Santo; Cangelosi, Angelo

    2013-05-01

    In this paper we focus on modeling autonomous learning to improve the performance of a humanoid robot through a modular artificial neural network architecture. A model of a neural controller is presented which allows the humanoid robot iCub to autonomously improve its sensorimotor skills. This is achieved by endowing the neural controller with a secondary neural system that, by exploiting the sensorimotor skills already acquired by the robot, is able to generate additional imaginary examples that can be used by the controller itself to improve performance through simulated mental training. Results and analysis presented in the paper provide evidence of the viability of the proposed approach and help to clarify the rationale behind the chosen model and its implementation.

  8. Experiments in teleoperator and autonomous control of space robotic vehicles

    NASA Technical Reports Server (NTRS)

    Alexander, Harold L.

    1990-01-01

    A research program and strategy are described which include fundamental teleoperation issues and autonomous-control issues of sensing and navigation for satellite robots. The program consists of developing interfaces for visual operation and studying the consequences of interface designs as well as developing navigation and control technologies based on visual interaction. A space-robot-vehicle simulator is under development for use in virtual-environment teleoperation experiments and neutral-buoyancy investigations. These technologies can be utilized in a study of visual interfaces to address tradeoffs between head-tracking and manual remote cameras, panel-mounted and helmet-mounted displays, and stereoscopic and monoscopic display systems. The present program can provide significant data for the development of control experiments for autonomously controlled satellite robots.

  9. System safety analysis of an autonomous mobile robot

    SciTech Connect

    Bartos, R.J.

    1994-08-01

    Analysis of the safety of operating and maintaining the Stored Waste Autonomous Mobile Inspector (SWAMI) II in a hazardous environment at the Fernald Environmental Management Project (FEMP) was completed. The SWAMI II is a version of a commercial robot, the HelpMate™ robot produced by the Transitions Research Corporation, which is being updated to incorporate the systems required for inspecting mixed toxic chemical and radioactive waste drums at the FEMP. It also has modified obstacle detection and collision avoidance subsystems. The robot will autonomously travel down the aisles in storage warehouses to record images of containers and collect other data, which are transmitted to an inspector at a remote computer terminal. A previous study showed that the SWAMI II is economically feasible. The SWAMI II will locate radioactive contamination more accurately than human inspectors. This thesis includes a System Safety Hazard Analysis and a quantitative Fault Tree Analysis (FTA). The objectives of the analyses are to prevent potentially serious events and to derive a comprehensive set of safety requirements from which the safety of the SWAMI II and other autonomous mobile robots can be evaluated. The Computer-Aided Fault Tree Analysis (CAFTA©) software is utilized for the FTA. The FTA shows that more than 99% of the safety risk occurs during maintenance, and that when the derived safety requirements are implemented the rate of serious events is reduced to below one event per million operating hours. Training and procedures in SWAMI II operation and maintenance provide an added safety margin. This study will promote the safe use of the SWAMI II and other autonomous mobile robots in the emerging technology of mobile robotic inspection.

  10. Biomimetic smart sensors for autonomous robotic behavior II: vestibular processing

    NASA Astrophysics Data System (ADS)

    Xue, Shuwan; Deligeorges, Socrates; Soloway, Aaron; Lichtenstein, Lee; Gore, Tyler; Hubbard, Allyn

    2009-05-01

    Limited autonomous behaviors are fast becoming a critical capability in the field of robotics as robotic applications are used in more complicated and interactive environments. As additional sensory capabilities are added to robotic platforms, sensor fusion to enhance and facilitate autonomous behavior becomes increasingly important. Using biology as a model, the equivalent of a vestibular system needs to be created in order to orient the system within its environment and allow multi-modal sensor fusion. In mammals, the vestibular system plays a central role in physiological homeostasis and sensory information integration (Fuller et al., Neuroscience 129 (2004) 461-471). At the level of the Superior Colliculus in the brain, there is multimodal sensory integration across visual, auditory, somatosensory, and vestibular inputs (Wallace et al., J Neurophysiol 80 (1998) 1006-1010), with the vestibular component contributing a strong reference-frame gating input. Using a simple model of the deep layers of the Superior Colliculus, an off-the-shelf 3-axis solid-state gyroscope and accelerometer was used as the equivalent representation of the vestibular system. The acceleration and rotational measurements are used to determine the relationship between the local reference frame of a robotic platform (an iRobot Packbot®) and the inertial reference frame (the outside world), with the simulated vestibular input tightly coupled with the acoustic and optical inputs. Field testing of the robotic platform, using acoustics to cue optical sensors coupled through a biomimetic vestibular model for "slew to cue" gunfire detection, has shown great promise.

  11. Multiagent collaboration for experimental calibration of an autonomous mobile robot

    NASA Astrophysics Data System (ADS)

    Vachon, Bertrand; Berge-Cherfaoui, Veronique

    1991-03-01

    This paper presents an action in mission SOCRATES whose aim is the development of a self-calibration method for an autonomous mobile robot. The robot has to determine the precise location of the coordinate system shared by its sensors. Knowledge of this system is a sine qua non condition for efficient multisensor fusion and autonomous navigation in an unknown environment. But, as perceptions and motions are not accurate, this knowledge can only be achieved by multisensor fusion. The application described highlights this kind of problem. Multisensor fusion is used here especially in its symbolic aspect. Useful knowledge includes both numerous data coming from various sensors and suitable ways to process these data. A blackboard architecture has been chosen to manage the useful information. Knowledge sources are called agents, and they implement physical sensors (perceptors or actuators) as well as logical sensors (high-level data processors). The problem to solve is self-calibration, which includes the determination of the coordinate system R of the robot and the transformations necessary to convert data from sensor references to R. The origin of R has been chosen to be O, the rotation center of the robot. As its genuine location may vary due to robot or ground characteristics, an experimental determination of O is attempted. A strategy for measuring distances at approximate positions is proposed. This strategy must take into account the fact that motions of the robot as well as perceptions may be inaccurate. Results obtained during experiments and future extensions of the system are discussed.

  12. Navigation system for autonomous mapper robots

    NASA Astrophysics Data System (ADS)

    Halbach, Marc; Baudoin, Yvan

    1993-05-01

    This paper describes the conception and realization of a fast, robust, and general navigation system for a mobile (wheeled or legged) robot. A database representing a high-level map of the environment is generated and continuously updated. The first part describes the legged target vehicle and the hexapod robot being developed. The second section deals with spatial and temporal sensor fusion for dynamic environment modeling within an obstacle/free-space probabilistic classification grid. Ultrasonic sensors are used, others are expected to be integrated, and a priori knowledge is treated. The ultrasonic sensors are controlled by the path-planning module. The third part concerns path planning, and a simulation of a wheeled robot is also presented.
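
    The obstacle/free-space probabilistic classification grid can be sketched with a standard log-odds occupancy update from sonar readings. The inverse-sensor probabilities below are illustrative, not the values used in this system.

```python
import math

def update_cell(log_odds, hit, p_hit=0.7, p_miss=0.4):
    """Log-odds Bayesian update of one grid cell from a sonar reading:
    'hit' means the cell lies in the beam's range-return region, otherwise the
    beam passed through it. Sensor probabilities are illustrative."""
    p = p_hit if hit else p_miss
    return log_odds + math.log(p / (1.0 - p))

def probability(log_odds):
    """Convert accumulated log-odds back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(log_odds))

# A cell observed twice as occupied and once as free stays likely occupied.
l = 0.0
for hit in (True, True, False):
    l = update_cell(l, hit)
print(f"P(occupied) = {probability(l):.2f}")
```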

  13. Application of a Chaotic Oscillator in an Autonomous Mobile Robot

    NASA Astrophysics Data System (ADS)

    Tlelo-Cuautle, Esteban; Ramos-López, Hugo C.; Sánchez-Sánchez, Mauro; Pano-Azucena, Ana D.; Sánchez-Gaspariano, Luis A.; Núñez-Pérez, José C.; Camas-Anzueto, Jorge L.

    2014-05-01

    Terrain exploration robots can be of great usefulness in critical navigation circumstances. However, the challenge is how to guarantee a control strategy that covers the full terrain area. Thus, the application of a chaotic oscillator to control the wheels of an autonomous mobile robot is introduced herein. Basically, we describe the realization of a random number generator (RNG) based on a double-scroll chaotic oscillator, which is used to guide the robot to cover a full terrain area. The resolution of the terrain exploration area is determined by both the number of bits provided by the RNG and the characteristics of the step motors. Finally, the experimental results highlight the covered area by painting the trajectories that the robot explores.
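
    A sketch of the RNG-driven coverage idea is given below: a double-scroll (Chua-type) oscillator is integrated numerically, its state is thresholded into bits, and bit pairs select wheel commands. The parameters follow textbook values for Chua's circuit; the bit extraction and command mapping are illustrative simplifications (raw threshold bits are correlated, and a real design would post-process them), not the circuit realization reported here.

```python
def chua_step(x, y, z, dt=1e-3, alpha=15.6, beta=28.0, m0=-8/7, m1=-5/7):
    """One Euler step of Chua's double-scroll oscillator (textbook parameters)."""
    fx = m1 * x + 0.5 * (m0 - m1) * (abs(x + 1) - abs(x - 1))
    return (x + dt * alpha * (y - x - fx),
            y + dt * (x - y + z),
            z + dt * (-beta * y))

def chaotic_bits(n, steps_per_bit=500):
    """Generate n bits by thresholding the x state of the oscillator."""
    x, y, z = 0.7, 0.0, 0.0
    bits = []
    for _ in range(n):
        for _ in range(steps_per_bit):
            x, y, z = chua_step(x, y, z)
        bits.append(1 if x > 0 else 0)
    return bits

# Use bit pairs to pick one of four wheel commands so the trajectory wanders
# chaotically over the terrain (motor details omitted).
moves = {(0, 0): "forward", (0, 1): "turn-left", (1, 0): "turn-right", (1, 1): "reverse"}
b = chaotic_bits(20)
commands = [moves[(b[i], b[i + 1])] for i in range(0, len(b) - 1, 2)]
print(commands)
```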

  14. Rice-obot 1: An intelligent autonomous mobile robot

    NASA Technical Reports Server (NTRS)

    Defigueiredo, R.; Ciscon, L.; Berberian, D.

    1989-01-01

    The Rice-obot I is the first in a series of Intelligent Autonomous Mobile Robots (IAMRs) being developed at Rice University's Cooperative Intelligent Mobile Robots (CIMR) lab. The Rice-obot I is mainly designed to be a testbed for various robotic and AI techniques, and a platform for developing intelligent control systems for exploratory robots. Researchers present the need for a generalized environment capable of combining all of the control, sensory and knowledge systems of an IAMR. They introduce Lisp-Nodes as such a system, and develop the basic concepts of nodes, messages and classes. Furthermore, they show how the control system of the Rice-obot I is implemented as sub-systems in Lisp-Nodes.

  15. Autonomous kinematic calibration for robot with force sensor

    NASA Astrophysics Data System (ADS)

    Zhao, Dongbo; Xiong, Youlun

    1995-08-01

    This paper presents an autonomous calibration procedure for identifying robot geometric parameters using a wrist force sensor, which guides the robot end effector to track the section contour of an accurately cylindrical workpiece and to find its center. The information from the wrist sensor is needed to determine the motion direction of the end effector and to generate the control strategy (a hybrid position/force control law), while the force vector is required to correct for the deformation of the manipulator, which in turn maps into the joint differential vector. The system of constraint equations is in fact nonlinear and can be linearized on the constraint surface of the cylinder. Simulations have been performed for a PUMA 760 robot, and the results show that the robot positioning accuracy can be improved to the level of its repeatability by the proposed calibration method.
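
    Identification of the geometric parameters from the linearized constraint equations reduces to a least-squares problem of the form J·Δφ = r. The sketch below solves it with a damped pseudo-inverse on synthetic data; the dimensions and noise levels are illustrative assumptions, not the PUMA 760 setup.

```python
import numpy as np

def identify_parameters(jacobian_rows, residuals):
    """Solve the linearized calibration system J * d_phi = r in the
    least-squares sense, where each block of J relates small errors in the
    geometric parameters to the observed end-effector position error on the
    constraint surface. A small damping term keeps the solve well conditioned."""
    J = np.vstack(jacobian_rows)
    r = np.hstack(residuals)
    damping = 1e-6 * np.eye(J.shape[1])
    return np.linalg.solve(J.T @ J + damping, J.T @ r)

# Illustrative use: 30 contour-tracking poses, 4 unknown geometric parameters.
rng = np.random.default_rng(0)
true_dphi = np.array([0.002, -0.001, 0.0005, 0.003])
J_rows = [rng.normal(size=(3, 4)) for _ in range(30)]           # 3D residual per pose
res = [Ji @ true_dphi + rng.normal(scale=1e-4, size=3) for Ji in J_rows]
print(identify_parameters(J_rows, res).round(4))                # close to true_dphi
```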

  16. An Aerial–Ground Robotic System for Navigation and Obstacle Mapping in Large Outdoor Areas

    PubMed Central

    Garzón, Mario; Valente, João; Zapata, David; Barrientos, Antonio

    2013-01-01

    There are many outdoor robotic applications where a robot must reach a goal position or explore an area without previous knowledge of the environment around it. Additionally, other applications (like path planning) require the use of known maps or previous information about the environment. This work presents a system composed of a terrestrial and an aerial robot that cooperate and share sensor information in order to address those requirements. The ground robot is able to navigate in an unknown large environment aided by visual feedback from a camera on board the aerial robot. At the same time, the obstacles are mapped in real time by putting together the information from the camera and the positioning system of the ground robot. A set of experiments were carried out with the purpose of verifying the system's applicability. The experiments were performed in a simulation environment and outdoors with a medium-sized ground robot and a mini quad-rotor. The proposed robotic system shows outstanding results in simultaneous navigation and mapping applications in large outdoor environments. PMID:23337332

  17. An aerial–ground robotic system for navigation and obstacle mapping in large outdoor areas.

    PubMed

    Garzón, Mario; Valente, João; Zapata, David; Barrientos, Antonio

    2013-01-01

    There are many outdoor robotic applications where a robot must reach a goal position or explore an area without prior knowledge of its environment. Additionally, other applications (such as path planning) require known maps or prior information about the environment. This work presents a system composed of a terrestrial robot and an aerial robot that cooperate and share sensor information in order to address those requirements. The ground robot is able to navigate in an unknown large environment aided by visual feedback from a camera on board the aerial robot. At the same time, the obstacles are mapped in real time by combining the information from the camera with the positioning system of the ground robot. A set of experiments was carried out to verify the applicability of the system. The experiments were performed in a simulation environment and outdoors with a medium-sized ground robot and a mini quad-rotor. The proposed robotic system shows outstanding results in simultaneous navigation and mapping applications in large outdoor environments. PMID:23337332

  18. An aerial–ground robotic system for navigation and obstacle mapping in large outdoor areas.

    PubMed

    Garzón, Mario; Valente, João; Zapata, David; Barrientos, Antonio

    2013-01-21

    There are many outdoor robotic applications where a robot must reach a goal position or explore an area without prior knowledge of its environment. Additionally, other applications (such as path planning) require known maps or prior information about the environment. This work presents a system composed of a terrestrial robot and an aerial robot that cooperate and share sensor information in order to address those requirements. The ground robot is able to navigate in an unknown large environment aided by visual feedback from a camera on board the aerial robot. At the same time, the obstacles are mapped in real time by combining the information from the camera with the positioning system of the ground robot. A set of experiments was carried out to verify the applicability of the system. The experiments were performed in a simulation environment and outdoors with a medium-sized ground robot and a mini quad-rotor. The proposed robotic system shows outstanding results in simultaneous navigation and mapping applications in large outdoor environments.

  19. Mobile autonomous robotic apparatus for radiologic characterization

    DOEpatents

    Dudar, Aed M.; Ward, Clyde R.; Jones, Joel D.; Mallet, William R.; Harpring, Larry J.; Collins, Montenius X.; Anderson, Erin K.

    1999-01-01

    A mobile robotic system that conducts radiological surveys to map alpha, beta, and gamma radiation on surfaces in relatively level open areas or in areas containing obstacles such as stored containers, hallways, equipment, walls, and support columns. The invention incorporates improved radiation monitoring methods using multiple scintillation detectors, laser scanners for maneuvering in open areas, ultrasound pulse generators and receptors for collision avoidance in confined areas or hallways, methods to trigger visible alarms when radiation is detected, and methods to transmit location data for real-time reporting and mapping of radiation locations on computer monitors at a host station. A multitude of high-performance scintillation detectors detect radiation while the on-board system controls the direction and speed of the robot according to pre-programmed paths. The operators may revise the preselected movements of the robotic system over Ethernet communications to remonitor areas of radiation or to avoid walls, columns, equipment, or containers. The robotic system is capable of floor survey speeds from 1/2 inch per second up to about 30 inches per second, while the on-board processor collects, stores, and transmits information for real-time mapping of radiation intensity and radiation locations for display on computer monitors at a central command console.

  20. Mobile autonomous robotic apparatus for radiologic characterization

    DOEpatents

    Dudar, A.M.; Ward, C.R.; Jones, J.D.; Mallet, W.R.; Harpring, L.J.; Collins, M.X.; Anderson, E.K.

    1999-08-10

    A mobile robotic system is described that conducts radiological surveys to map alpha, beta, and gamma radiation on surfaces in relatively level open areas or in areas containing obstacles such as stored containers, hallways, equipment, walls, and support columns. The invention incorporates improved radiation monitoring methods using multiple scintillation detectors, laser scanners for maneuvering in open areas, ultrasound pulse generators and receptors for collision avoidance in confined areas or hallways, methods to trigger visible alarms when radiation is detected, and methods to transmit location data for real-time reporting and mapping of radiation locations on computer monitors at a host station. A multitude of high-performance scintillation detectors detect radiation while the on-board system controls the direction and speed of the robot according to pre-programmed paths. The operators may revise the preselected movements of the robotic system over Ethernet communications to remonitor areas of radiation or to avoid walls, columns, equipment, or containers. The robotic system is capable of floor survey speeds from 1/2 inch per second up to about 30 inches per second, while the on-board processor collects, stores, and transmits information for real-time mapping of radiation intensity and radiation locations for display on computer monitors at a central command console. 4 figs.

  1. Towards Autonomous Operations of the Robonaut 2 Humanoid Robotic Testbed

    NASA Technical Reports Server (NTRS)

    Badger, Julia; Nguyen, Vienny; Mehling, Joshua; Hambuchen, Kimberly; Diftler, Myron; Luna, Ryan; Baker, William; Joyce, Charles

    2016-01-01

    The Robonaut project has been conducting research in robotics technology on board the International Space Station (ISS) since 2012. Recently, the original upper body humanoid robot was upgraded by the addition of two climbing manipulators ("legs"), more capable processors, and new sensors, as shown in Figure 1. While Robonaut 2 (R2) has been working through checkout exercises on orbit following the upgrade, technology development on the ground has continued to advance. Through the Active Reduced Gravity Offload System (ARGOS), the Robonaut team has been able to develop technologies that will enable full operation of the robotic testbed on orbit using similar robots located at the Johnson Space Center. Once these technologies have been vetted in this way, they will be implemented and tested on the R2 unit on board the ISS. The goal of this work is to create a fully-featured robotics research platform on board the ISS to increase the technology readiness level of technologies that will aid in future exploration missions. Technology development has thus far followed two main paths, autonomous climbing and efficient tool manipulation. Central to both technologies has been the incorporation of a human robotic interaction paradigm that involves the visualization of sensory and pre-planned command data with models of the robot and its environment. Figure 2 shows screenshots of these interactive tools, built in rviz, that are used to develop and implement these technologies on R2. Robonaut 2 is designed to move along the handrails and seat track around the US lab inside the ISS. This is difficult for many reasons, namely the environment is cluttered and constrained, the robot has many degrees of freedom (DOF) it can utilize for climbing, and remote commanding for precision tasks such as grasping handrails is time-consuming and difficult. Because of this, it is important to develop the technologies needed to allow the robot to reach operator-specified positions as

  2. Autonomous stair-climbing with miniature jumping robots.

    PubMed

    Stoeter, Sascha A; Papanikolopoulos, Nikolaos

    2005-04-01

    The problem of vision-guided control of miniature mobile robots is investigated. Untethered mobile robots with small physical dimensions of around 10 cm or less do not permit powerful onboard computers because of size and power constraints. These challenges have, in the past, reduced the functionality of such devices to that of a complex remote control vehicle with fancy sensors. With the help of a computationally more powerful entity such as a larger companion robot, the control loop can be closed. Using the miniature robot's video transmission or that of an observer to localize it in the world, control commands can be computed and relayed to the inept robot. The result is a system that exhibits autonomous capabilities. The framework presented here solves the problem of climbing stairs with the miniature Scout robot. The robot's unique locomotion mode, the jump, is employed to hop one step at a time. Methods for externally tracking the Scout are developed. A large number of real-world experiments are conducted and the results discussed. PMID:15828659

  3. Autonomous stair-climbing with miniature jumping robots.

    PubMed

    Stoeter, Sascha A; Papanikolopoulos, Nikolaos

    2005-04-01

    The problem of vision-guided control of miniature mobile robots is investigated. Untethered mobile robots with small physical dimensions of around 10 cm or less do not permit powerful onboard computers because of size and power constraints. These challenges have, in the past, reduced the functionality of such devices to that of a complex remote control vehicle with fancy sensors. With the help of a computationally more powerful entity such as a larger companion robot, the control loop can be closed. Using the miniature robot's video transmission or that of an observer to localize it in the world, control commands can be computed and relayed to the inept robot. The result is a system that exhibits autonomous capabilities. The framework presented here solves the problem of climbing stairs with the miniature Scout robot. The robot's unique locomotion mode, the jump, is employed to hop one step at a time. Methods for externally tracking the Scout are developed. A large number of real-world experiments are conducted and the results discussed.

  4. Autonomous Robot System for Sensor Characterization

    SciTech Connect

    David Bruemmer; Douglas Few; Frank Carney; Miles Walton; Heather Hunting; Ron Lujan

    2004-03-01

    This paper discusses an innovative application of new Markov localization techniques that combat the problem of odometry drift, allowing a novel control architecture developed at the Idaho National Engineering and Environmental Laboratory (INEEL) to be utilized within a sensor characterization facility developed at the Remote Sensing Laboratory (RSL) in Nevada. The new robotic capability provided by the INEEL will allow RSL to test and evaluate a wide variety of sensors including radiation detection systems, machine vision systems, and sensors that can detect and track heat sources (e.g. human bodies, machines, chemical plumes). By accurately moving a target at varying speeds along designated paths, the robotic solution allows the detection abilities of a wide variety of sensors to be recorded and analyzed.
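
    The abstract does not give the INEEL implementation, but the general grid-based Markov localization cycle it refers to can be sketched as follows: a belief over discrete poses is pushed through a motion model and then reweighted by the sensor likelihood, which is what keeps odometry drift bounded. The corridor example and numbers are purely illustrative.

    ```python
    import numpy as np

    def markov_localization_step(belief, motion_kernel, likelihood):
        """One predict/update cycle of grid-based Markov localization.

        belief        : 1-D array, probability of each grid cell
        motion_kernel : matrix with motion_kernel[j, i] = P(cell j | cell i, action)
        likelihood    : 1-D array, P(observation | cell)
        """
        predicted = motion_kernel @ belief      # prediction through the motion model
        posterior = likelihood * predicted      # Bayes update with the sensor model
        return posterior / posterior.sum()      # normalize

    # Toy 5-cell corridor: the robot commands one step to the right (80% success, 20% slip).
    belief = np.array([1.0, 0.0, 0.0, 0.0, 0.0])
    motion = 0.8 * np.roll(np.eye(5), 1, axis=0) + 0.2 * np.eye(5)
    likelihood = np.array([0.1, 0.7, 0.1, 0.05, 0.05])   # e.g., a landmark is sensed near cell 1
    print(markov_localization_step(belief, motion, likelihood))
    ```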

  5. A task control architecture for autonomous robots

    NASA Technical Reports Server (NTRS)

    Simmons, Reid; Mitchell, Tom

    1990-01-01

    An architecture is presented for controlling robots that have multiple tasks, operate in dynamic domains, and require a fair degree of autonomy. The architecture is built on several layers of functionality, including a distributed communication layer, a behavior layer for querying sensors, expanding goals, and executing commands, and a task level for managing the temporal aspects of planning and achieving goals, coordinating tasks, allocating resources, monitoring, and recovering from errors. Application to a legged planetary rover and an indoor mobile manipulator is described.

  6. AMiRESot - A New Robot Soccer League with Autonomous Miniature Robots

    NASA Astrophysics Data System (ADS)

    Witkowski, Ulf; Sitte, Joaquin; Herbrechtsmeier, Stefan; Rückert, Ulrich

    AMiRESot is a new robot soccer league played with small autonomous miniature robots. Team sizes are defined as one, two, or three robots per team. Distinctive features of the AMiRESot league are the fully autonomous behavior of the robots and their small size. The matches mainly follow the FIFA laws, with some modifications useful for robot soccer. The new AMiRESot soccer robot is small (maximum 110 mm diameter) but powerful, equipped with a differential drive system. For sensing, the robots in their basic configuration are equipped with active infrared sensors and a color image sensor. For information processing, a powerful mobile processor and reconfigurable hardware resources (FPGA) are available. Due to the robot’s modular structure, it can easily be extended with additional sensing and processing resources. This paper gives an overview of the AMiRESot rules and presents details of the new robot platform used for AMiRESot.

  7. Robotic reactions: delay-induced patterns in autonomous vehicle systems.

    PubMed

    Orosz, Gábor; Moehlis, Jeff; Bullo, Francesco

    2010-02-01

    Fundamental design principles are presented for vehicle systems governed by autonomous cruise control devices. By analyzing the corresponding delay differential equations, it is shown that for any car-following model short-wavelength oscillations can appear due to robotic reaction times, and that there are tradeoffs between the time delay and the control gains. The analytical findings are demonstrated on an optimal velocity model using numerical continuation and numerical simulation.
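
    For reference, a commonly studied delayed car-following (optimal velocity) model of the kind analyzed in this line of work is sketched below; the exact variant, gains, and delay placement used by the authors may differ.

    ```latex
    % Optimal-velocity car-following model with reaction delay \tau (illustrative form):
    % h_i -- headway to the vehicle ahead, v_i -- speed, V(.) -- optimal-velocity function,
    % \alpha -- control gain of the autonomous cruise controller.
    \[
      \dot{v}_i(t) \;=\; \alpha\,\bigl[\,V\!\bigl(h_i(t-\tau)\bigr) - v_i(t-\tau)\,\bigr],
      \qquad
      \dot{h}_i(t) \;=\; v_{i+1}(t) - v_i(t).
    \]
    ```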

  8. Robotic reactions: Delay-induced patterns in autonomous vehicle systems

    NASA Astrophysics Data System (ADS)

    Orosz, Gábor; Moehlis, Jeff; Bullo, Francesco

    2010-02-01

    Fundamental design principles are presented for vehicle systems governed by autonomous cruise control devices. By analyzing the corresponding delay differential equations, it is shown that for any car-following model short-wavelength oscillations can appear due to robotic reaction times, and that there are tradeoffs between the time delay and the control gains. The analytical findings are demonstrated on an optimal velocity model using numerical continuation and numerical simulation.

  9. Robotic reactions: delay-induced patterns in autonomous vehicle systems.

    PubMed

    Orosz, Gábor; Moehlis, Jeff; Bullo, Francesco

    2010-02-01

    Fundamental design principles are presented for vehicle systems governed by autonomous cruise control devices. By analyzing the corresponding delay differential equations, it is shown that for any car-following model short-wavelength oscillations can appear due to robotic reaction times, and that there are tradeoffs between the time delay and the control gains. The analytical findings are demonstrated on an optimal velocity model using numerical continuation and numerical simulation. PMID:20365620

  10. Predictive information and explorative behavior of autonomous robots

    NASA Astrophysics Data System (ADS)

    Ay, N.; Bertschinger, N.; Der, R.; Güttler, F.; Olbrich, E.

    2008-06-01

    Measures of complexity are of immediate interest for the field of autonomous robots both as a means to classify the behavior and as an objective function for the autonomous development of robot behavior. In the present paper we consider predictive information in sensor space as a measure for the behavioral complexity of a two-wheel embodied robot moving in a rectangular arena with several obstacles. The mutual information (MI) between past and future sensor values is found empirically to have a maximum for a behavior which is both explorative and sensitive to the environment. This makes predictive information a prospective candidate as an objective function for the autonomous development of such behaviors. We derive theoretical expressions for the MI in order to obtain an explicit update rule for the gradient ascent dynamics. Interestingly, in the case of a linear or linearized model of the sensorimotor dynamics the structure of the learning rule derived depends only on the dynamical properties while the value of the MI influences only the learning rate. In this way the problem of the prohibitively large sampling times for information theoretic measures can be circumvented. This result can be generalized and may help to derive explicit learning rules from complexity theoretic measures.
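
    As a pointer to the quantity involved, the one-step predictive information in sensor space is the mutual information between consecutive sensor values; the generic definition is shown below, while the paper's estimator and the multi-step generalization are not reproduced here.

    ```latex
    % One-step predictive information: mutual information between successive sensor values.
    \[
      I(s_{t};\,s_{t+1})
      \;=\; \sum_{s_{t},\,s_{t+1}} p(s_{t},s_{t+1})\,
            \log\frac{p(s_{t},s_{t+1})}{p(s_{t})\,p(s_{t+1})}
      \;=\; H(s_{t+1}) - H\!\bigl(s_{t+1}\,\big|\,s_{t}\bigr).
    \]
    ```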

  11. A fuzzy logic controller for an autonomous mobile robot

    NASA Technical Reports Server (NTRS)

    Yen, John; Pfluger, Nathan

    1993-01-01

    The ability of a mobile robot system to plan and move intelligently in a dynamic environment is needed if robots are to be useful in areas other than controlled environments. An example use for such a system is controlling an autonomous mobile robot in a space station or another isolated area where it is hard or impossible for human life to exist for long periods of time (e.g., Mars). The system would allow the robot to be programmed to carry out the duties normally accomplished by a human being. Some of the duties that could be accomplished include operating instruments, transporting objects, and maintenance of the environment. The main focus of our early work has been on developing a fuzzy controller that takes a path and adapts it to a given environment. The robot uses only information gathered from the sensors, but retains the ability to avoid dynamically placed obstacles near and along the path. Our fuzzy logic controller is based on the following algorithm: (1) determine the desired direction of travel; (2) determine the allowed direction of travel; and (3) combine the desired and allowed directions in order to determine a direction that is both desired and allowed. The desired direction of travel is determined by projecting ahead to a point along the path that is closer to the goal. This gives a local direction of travel for the robot and helps it avoid obstacles.
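
    To make the three-step algorithm concrete, here is a minimal, hedged sketch (not the authors' rule base): desired and allowed directions are represented as fuzzy membership values over a discrete set of headings, combined with a fuzzy AND (minimum), and defuzzified by selecting the best heading. The membership shapes and widths are assumptions.

    ```python
    import numpy as np

    HEADINGS = np.linspace(-np.pi, np.pi, 73)   # candidate directions in 5-degree steps

    def angle_diff(a, b):
        """Smallest absolute difference between two angles."""
        return np.abs(np.angle(np.exp(1j * (a - b))))

    def desired_membership(goal_direction, width=np.pi / 3):
        """Triangular membership peaking at the direction toward the next path point."""
        return np.clip(1.0 - angle_diff(HEADINGS, goal_direction) / width, 0.0, 1.0)

    def allowed_membership(obstacle_directions, block_width=np.pi / 6):
        """Headings near sensed obstacles receive low 'allowed' membership."""
        allowed = np.ones_like(HEADINGS)
        for obs in obstacle_directions:
            allowed = np.minimum(allowed, np.clip(angle_diff(HEADINGS, obs) / block_width, 0.0, 1.0))
        return allowed

    def choose_direction(goal_direction, obstacle_directions):
        """Fuzzy AND of desired and allowed, then pick the heading with maximum membership."""
        combined = np.minimum(desired_membership(goal_direction),
                              allowed_membership(obstacle_directions))
        return HEADINGS[int(np.argmax(combined))]

    print(np.degrees(choose_direction(goal_direction=0.0,
                                      obstacle_directions=[np.radians(10.0)])))
    ```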

  12. Omnivision-based autonomous mobile robotic platform

    NASA Astrophysics Data System (ADS)

    Cao, Zuoliang; Hu, Jun; Cao, Jin; Hall, Ernest L.

    2001-10-01

    As a laboratory demonstration platform, the TUT-I mobile robot provides various experimentation modules to demonstrate the robotics technologies involved in remote control, computer programming, and teach-and-playback operation. The teach-and-playback operation, in particular, has proved to be an effective solution in structured environments. The path generated in the teach mode, and real-time path correction using path-error detection in the playback mode, are demonstrated. A vision-based image database is generated as the representation of the given path during the teaching procedure, and an online image-positioning algorithm is performed for path following. Advanced sensory capability is employed to provide environment perception. A unique omnidirectional vision (omni-vision) system is used for localization and navigation. The omni-vision system uses an extremely wide-angle lens, so that a dynamic omni-vision image can be processed in real time to provide the widest possible view during movement. Beacon guidance is realized by observing the locations of points derived from overhead features, such as predefined light arrays in a building. The navigation approach is based upon these omni-vision characteristics. A group of ultrasonic sensors is employed for obstacle avoidance.

  13. Active object programming for military autonomous mobile robot software prototyping

    NASA Astrophysics Data System (ADS)

    Cozien, Roger F.

    2001-10-01

    While designing mobile robots, we do think that the prototyping phase is really critical. Good and clever choices have to be made. Indeed, we may not easily upgrade such robots and, most of all, once the robot is on its own, any change to either the software or the physical body is going to be very difficult, if not impossible. Thus, a great effort has to be made when prototyping the robot. Furthermore, I think that the kind of programming is very important: if your programming model is not expressive enough, you may experience a great deal of difficulty adding all the features you want in order to give your robot reactiveness and decision-making autonomy. Moreover, designing and prototyping the on-board software of a reactive robot brings other difficulties. Reactivity is not a matter of raw speed: a reactive system is a system able to respond to a wide range of situations for which it has no schedule. In other words, the robot does not know when a particular situation may occur, what it will be doing at that time, or what its internal state will be. Such a robot must be able to take a decision and act even if it does not have all the contextual information. To do so, we use a computer language named oRis featuring object and active-object oriented programming, as well as parallel and dynamic code (the code can be changed during its own execution). This last point is possible because oRis is fully interpreted; however, oRis may also call fully compiled code, as well as Prolog and Java code. An oRis program may be distributed over several computers using TCP/IP network connections. The main issue in this paper is to show how active-object oriented programming, as a modern extension of object-oriented programming, may help us in designing autonomous mobile robots. Based on fully parallel software programming, an active-object code allows us to give many features to a robot, and to easily solve

  14. Active objects programming for military autonomous mobile robots software prototyping

    NASA Astrophysics Data System (ADS)

    Cozien, Roger F.

    2001-09-01

    While designing mobile robots, we do think that the prototyping phase is really critical. Good and clever choices have to be made. Indeed, we may not easily upgrade such robots and, most of all, once the robot is on its own, any change to either the software or the physical body is going to be very difficult, if not impossible. Thus, a great effort has to be made when prototyping the robot. Furthermore, I think that the kind of programming is very important: if your programming model is not expressive enough, you may experience a great deal of difficulty adding all the features you want in order to give your robot reactiveness and decision-making autonomy. Moreover, designing and prototyping the on-board software of a reactive robot brings other difficulties. Reactivity is not a matter of raw speed: a reactive system is a system able to respond to a wide range of situations for which it has no schedule. In other words, the robot does not know when a particular situation may occur, what it will be doing at that time, or what its internal state will be. Such a robot must be able to take a decision and act even if it does not have all the contextual information. To do so, we use a computer language named oRis featuring object and active-object oriented programming, as well as parallel and dynamic code (the code can be changed during its own execution). This last point is possible because oRis is fully interpreted; however, oRis may also call fully compiled code, as well as Prolog and Java code. An oRis program may be distributed over several computers using TCP/IP network connections. The main issue in this paper is to show how active-object oriented programming, as a modern extension of object-oriented programming, may help us in designing autonomous mobile robots. Based on fully parallel software programming, an active-object code allows us to give many features to a robot, and to easily solve

  15. Acquisition of Autonomous Behaviors by Robotic Assistants

    NASA Technical Reports Server (NTRS)

    Peters, R. A., II; Sarkar, N.; Bodenheimer, R. E.; Brown, E.; Campbell, C.; Hambuchen, K.; Johnson, C.; Koku, A. B.; Nilas, P.; Peng, J.

    2005-01-01

    Our research achievements under the NASA-JSC grant contributed significantly in the following areas. Multi-agent based robot control architecture, called the Intelligent Machine Architecture (IMA): the Vanderbilt team received a Space Act Award for this research from NASA JSC in October 2004. Cognitive control and the Self Agent: cognitive control in humans is the ability to consciously manipulate thoughts and behaviors using attention to deal with conflicting goals and demands. We have been updating the IMA Self Agent toward this goal, and if the opportunity arises we would like to work with NASA to enable cognitive control on Robonaut. Applications: (1) SES for Robonaut; (2) Robonaut Fault Diagnostic System; (3) ISAC Behavior Generation and Learning; (4) Segway Research.

  16. Evolving self-assembly in autonomous homogeneous robots: experiments with two physical robots.

    PubMed

    Ampatzis, Christos; Tuci, Elio; Trianni, Vito; Christensen, Anders Lyhne; Dorigo, Marco

    2009-01-01

    This research work illustrates an approach to the design of controllers for self-assembling robots in which the self-assembly is initiated and regulated by perceptual cues that are brought forth by the physical robots through their dynamical interactions. More specifically, we present a homogeneous control system that can achieve assembly between two modules (two fully autonomous robots) of a mobile self-reconfigurable system without a priori introduced behavioral or morphological heterogeneities. The controllers are dynamic neural networks evolved in simulation that directly control all the actuators of the two robots. The neurocontrollers cause the dynamic specialization of the robots by allocating roles between them based solely on their interaction. We show that the best evolved controller proves to be successful when tested on a real hardware platform, the swarm-bot. The performance achieved is similar to the one achieved by existing modular or behavior-based approaches, also due to the effect of an emergent recovery mechanism that was neither explicitly rewarded by the fitness function, nor observed during the evolutionary simulation. Our results suggest that direct access to the orientations or intentions of the other agents is not a necessary condition for robot coordination: Our robots coordinate without direct or explicit communication, contrary to what is assumed by most research works in collective robotics. This work also contributes to strengthening the evidence that evolutionary robotics is a design methodology that can tackle real-world tasks demanding fine sensory-motor coordination. PMID:19463056

  17. Autonomous intelligent robotic manipulator for on-orbit servicing

    NASA Astrophysics Data System (ADS)

    Larouche, Benoit P.

    This doctoral research develops an autonomous intelligent robotic manipulator technology for on-orbit servicing (OOS). More specifically, the research is focused on one of the most critical tasks in OOS: the capture of a non-cooperative object whilst minimizing impact forces and accelerations. The objectives of the research are the development of a vision-based control theory and the implementation and testing of the developed theory by designing and constructing a custom non-redundant holonomic robotic manipulator. The research validated the newly developed control theory and its ability to (i) capture a moving target autonomously and (ii) minimize unfavourable contact dynamics during the most critical parts of the capture operation between the capture satellite and a non-cooperative/tumbling object. A custom robotic manipulator functional prototype has been designed, assembled, constructed, and programmed from concept to completion in order to provide full customizability and controllability in both the hardware and the software. Based on this test platform, a thorough experimental investigation has been conducted to validate the newly developed control methodologies governing the behaviour of the robotic manipulator (RM) in an autonomous capture. The capture itself is performed on non-cooperative targets in a simulated zero-gravity environment. The RM employs a vision system, force sensors, and encoders in order to sense its environment. Control is effected through position and pseudo-torque inputs to three stepper motors and three servo motors. The controller is a modified hybrid force/neural-network impedance controller based on N. Hogan's original work. The experimental results demonstrate that the objectives of this thesis have been successfully achieved.
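
    For context, the classical target impedance due to Hogan, on which the thesis's modified hybrid force/neural-network impedance controller is reportedly built, prescribes a desired second-order relation between the end-effector motion error and the contact force; the form below is the textbook version, not the modified law developed in the thesis.

    ```latex
    % Target impedance between the motion error e = x - x_d and the external contact force:
    \[
      M_d\,\ddot{e} \;+\; B_d\,\dot{e} \;+\; K_d\,e \;=\; F_{\mathrm{ext}},
    \]
    % where M_d, B_d, and K_d are the desired inertia, damping, and stiffness matrices.
    ```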

  18. A software architecture for autonomous orbital robotics

    NASA Astrophysics Data System (ADS)

    Henshaw, Carl G.; Akins, Keith; Creamer, N. Glenn; Faria, Matthew; Flagg, Cris; Hayden, Matthew; Healy, Liam; Hrolenok, Brian; Johnson, Jeffrey; Lyons, Kimberly; Pipitone, Frank; Tasker, Fred

    2006-05-01

    SUMO, the Spacecraft for the Universal Modification of Orbits, is a DARPA-sponsored spacecraft designed to provide orbital repositioning services to geosynchronous satellites. Such services may be needed to facilitate changing the geostationary slot of a satellite, to allow a satellite to be used until the propellant is expended instead of reserving propellant for a retirement burn, or to rescue a satellite stranded in geosynchronous transfer orbit due to a launch failure. Notably, SUMO is being designed to be compatible with the current geosynchronous satellite catalog, which implies that it does not require the customer spacecraft to have special docking fixtures, optical guides, or cooperative communications or pose sensors. In addition, the final approach and grapple will be performed autonomously. SUMO is being designed and built by the Naval Center for Space Technology, a division of the U.S. Naval Research Laboratory in Washington, DC. The nature of the SUMO concept mission leads to significant challenges in onboard spacecraft autonomy. Also, because research and development in machine vision, trajectory planning, and automation algorithms for SUMO is being pursued in parallel with flight software development, there are considerable challenges in prototyping and testing algorithms in situ and in transitioning these algorithms from laboratory form into software suitable for flight. This paper discusses these challenges, outlining the current SUMO design from the standpoint of flight algorithms and software. In particular, the design of the SUMO phase 1 laboratory demonstration software is described in detail. The proposed flight-like software architecture is also described.

  19. Knowledge/geometry-based Mobile Autonomous Robot Simulator (KMARS)

    NASA Technical Reports Server (NTRS)

    Cheng, Linfu; Mckendrick, John D.; Liu, Jeffrey

    1990-01-01

    Ongoing applied research is focused on developing guidance systems for robot vehicles. Problems facing the basic research needed to support this development (e.g., scene understanding, real-time vision processing) are major impediments to progress. Due to the complexity and the unpredictable nature of a vehicle's area of operation, more advanced vehicle control systems must be able to learn about obstacles within the range of their sensors. A better understanding of the basic exploration process is needed to provide critical support to developers of both sensor systems and intelligent control systems that can be used in a wide spectrum of autonomous vehicles. Elcee Computek, Inc. has been working under contract to the Flight Dynamics Laboratory, Wright Research and Development Center, Wright-Patterson AFB, Ohio, to develop a Knowledge/Geometry-based Mobile Autonomous Robot Simulator (KMARS). KMARS has two parts: a geometry base and a knowledge base. The knowledge-base part of the system employs the expert-system shell CLIPS ('C' Language Integrated Production System) and the rules needed to control both the vehicle's use of an obstacle-detecting sensor and the overall exploration process. The initial project phase has focused on the simulation of a point robot vehicle operating in a 2D environment.

  20. Context recognition and situation assessment in autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Yavnai, Arie

    1993-05-01

    The capability to recognize the operating context and to assess the situation in real time is needed if a highly functional autonomous mobile robot is to react properly and effectively to continuously changing situations and events, either external or internal, while performing its assigned tasks. A new approach and architecture for a context recognition and situation assessment module (CORSA) is presented in this paper. CORSA is a multi-level information processing module consisting of adaptive decision and classification algorithms. It performs dynamic mapping from the data space to the context space and dynamically decides on the context class. A learning mechanism is employed to update the decision variables so as to minimize the probability of misclassification. CORSA is embedded within the Mission Manager module of the intelligent autonomous hyper-controller (IAHC) of the mobile robot. The information regarding operating context, events, and situation is then communicated to other modules of the IAHC, where it is used to: (a) select the appropriate action strategy; (b) support arbitration and conflict resolution between reflexive behaviors and reasoning-driven behaviors; (c) predict future events and situations; and (d) determine criteria and priorities for planning, replanning, and decision making.

  1. The WPI Autonomous Mobile Robot Project: A Progress Report

    NASA Astrophysics Data System (ADS)

    Green, Peter E.; Hall, Kyle S.

    1987-01-01

    This paper reports on the WPI autonomous mobile robot (WAMR), currently under development by the Intelligent Machines Project at WPI. Its purpose is to serve as a testbed for real-time artificial intelligence. WAMR is expected to find its way from one place in a building to another, avoiding people and obstacles en route. It is given no a priori knowledge of the building, but must learn about its environment by goal-directed exploration. Design concepts and descriptions of the major items completed thus far are presented. WAMR is a self-contained, wheeled robot that uses evidence-based techniques to reason about actions. The robot builds and continually updates a world model of its environment using a combination of ultrasonic and visual data. This world model is interpreted, and movement plans are generated by a planner that uses real-time incremental evidence techniques. These movement plans are then carried out by a hierarchical evidence-based adaptive controller. Two interesting features of the robot are the line-imaging ultrasonic sensor and the video subsystem. The former uses frequency variation to form a line image of obstacles between one and twenty feet in front of the robot. The latter attempts to mimic the human eye using neural-network pattern recognition techniques. Several items have been completed thus far. The paper describes some of these, including the multiprocessor navigator and non-skid motion control system, the ultrasonic line imager, the concepts of the vision system, and the computer hardware and software environment.

  2. Experiments in teleoperator and autonomous control of space robotic vehicles

    NASA Technical Reports Server (NTRS)

    Alexander, Harold L.

    1991-01-01

    A program of research embracing teleoperator and automatic navigational control of freely flying satellite robots is presented. Current research goals include: (1) developing visual operator interfaces for improved vehicle teleoperation; (2) determining the effects of different visual interface system designs on operator performance; and (3) achieving autonomous vision-based vehicle navigation and control. This research program combines virtual-environment teleoperation studies and neutral-buoyancy experiments using a space-robot simulator vehicle currently under development. Visual-interface design options under investigation include monoscopic versus stereoscopic displays and cameras, helmet-mounted versus panel-mounted display monitors, head-tracking versus fixed or manually steerable remote cameras, and the provision of vehicle-fixed visual cues, or markers, in the remote scene for improved sensing of vehicle position, orientation, and motion.

  3. Dynamic map building for an autonomous mobile robot

    SciTech Connect

    Leonard, J.J.; Durrant-Whyte, H.F.; Cox, I.J.

    1992-08-01

    This article presents an algorithm for autonomous map building and maintenance for a mobile robot. The authors believe that mobile robot navigation can be treated as a problem of tracking geometric features that occur naturally in the environment. They represent each feature in the map by a location estimate (the feature state vector) and two distinct measures of uncertainty: a covariance matrix to represent uncertainty in feature location, and a credibility measure to represent their belief in the validity of the feature. During each position update cycle, predicted measurements are generated for each geometric feature in the map and compared with actual sensor observations. Successful matches cause a feature's credibility to be increased. Unpredicted observations are used to initialize new geometric features, while unobserved predictions result in a geometric feature's credibility being decreased. They also describe experimental results obtained with the algorithm that demonstrate successful map building using real sonar data.
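
    A minimal sketch of the bookkeeping described above, under assumed data structures (the paper's gating test, credibility function, and sonar model are not reproduced): each feature keeps a location estimate and a credibility value that is raised when its predicted measurement is matched and lowered when it is not, while unmatched observations seed new candidate features.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Feature:
        location: tuple            # estimated feature location, e.g. (x, y)
        credibility: float = 0.5   # belief in the feature's validity, kept in [0, 1]

    def update_map(features, observations, match_fn, gain=0.1):
        """One map-maintenance cycle: match predictions to observations, adjust credibility."""
        matched = set()
        for feat in features:
            hit = match_fn(feat, observations)          # hypothetical matcher: index or None
            if hit is not None:
                feat.credibility = min(1.0, feat.credibility + gain)
                matched.add(hit)
            else:
                feat.credibility = max(0.0, feat.credibility - gain)
        for i, obs in enumerate(observations):          # unpredicted observations -> new features
            if i not in matched:
                features.append(Feature(location=obs))
        return [f for f in features if f.credibility > 0.0]   # prune discredited features
    ```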

  4. On autonomous terrain model acquisition by a mobile robot

    NASA Technical Reports Server (NTRS)

    Rao, N. S. V.; Iyengar, S. S.; Weisbin, C. R.

    1987-01-01

    The following problem is considered: a point robot is placed in a terrain populated by an unknown number of polyhedral obstacles of varied sizes and locations, in two or three dimensions. The robot is equipped with a sensor capable of detecting all the obstacle vertices and edges that are visible from its present location. The robot is required to autonomously navigate and build the complete terrain model using the sensor information. It is established that the number of scanning operations needed for complete terrain model acquisition by any algorithm based on a scan-from-vertices strategy is given by $\sum_{i=1}^{n} N(O_i) - n$ in two-dimensional terrains and $\sum_{i=1}^{n} N(O_i) - 2n$ in three-dimensional terrains, where $O = \{O_1, O_2, \ldots, O_n\}$ is the set of obstacles in the terrain and $N(O_i)$ is the number of vertices of obstacle $O_i$.

  5. Classifying and recovering from sensing failures in autonomous mobile robots

    SciTech Connect

    Murphy, R.R.; Hershberger, D.

    1996-12-31

    This paper presents a characterization of sensing failures in autonomous mobile robots, a methodology for classification and recovery, and a demonstration of this approach on a mobile robot performing landmark navigation. A sensing failure is any event leading to defective perception, including sensor malfunctions, software errors, environmental changes, and errant expectations. The approach demonstrated in this paper exploits the ability of the robot to interact with its environment to acquire additional information for classification (i.e., active perception). A Generate and Test strategy is used to generate hypotheses to explain the symptom resulting from the sensing failure. The recovery scheme replaces the affected sensing processes with an alternative logical sensor. The approach is implemented as the Sensor Fusion Effects Exception Handling (SFX-EH) architecture. The advantages of SFX-EH are that it requires only a partial causal model of sensing failure, the control scheme strives for a fast response, tests are constructed so as to prevent confounding from collaborating sensors which have also failed, and the logical sensor organization allows SFX-EH to be interfaced with the behavioral level of existing robot architectures.

  6. Design of a Micro-Autonomous Robot for Use in Astronomical Instruments

    NASA Astrophysics Data System (ADS)

    Cochrane, W. A.; Luo, X.; Lim, T.; Taylor, W. D.; Schnetler, H.

    2012-07-01

    A Micro-Autonomous Positioning System (MAPS) has been developed using micro-autonomous robots for the deployment of small mirrors within multi-object astronomical instruments for use on the next generation ground-based telescopes. The micro-autonomous robot is a two-wheel differential drive robot with a footprint of approximately 20 × 20 mm. The robot uses two brushless DC Smoovy motors with 125:1 planetary gearheads for positioning the mirror. This article describes the various elements of the overall system and in more detail the various robot designs. Also described in this article is the build and test of the most promising design, proving that micro-autonomous robot technology can be used in precision controlled applications.

  7. Using Insect Electroantennogram Sensors on Autonomous Robots for Olfactory Searches

    PubMed Central

    Martinez, Dominique; Arhidi, Lotfi; Demondion, Elodie; Masson, Jean-Baptiste; Lucas, Philippe

    2014-01-01

    Robots designed to track chemical leaks in hazardous industrial facilities or explosive traces in landmine fields face the same problem as insects foraging for food or searching for mates: the olfactory search is constrained by the physics of turbulent transport. The concentration landscape of wind borne odors is discontinuous and consists of sporadically located patches. A pre-requisite to olfactory search is that intermittent odor patches are detected. Because of its high speed and sensitivity, the olfactory organ of insects provides a unique opportunity for detection. Insect antennae have been used in the past to detect not only sex pheromones but also chemicals that are relevant to humans, e.g., volatile compounds emanating from cancer cells or toxic and illicit substances. We describe here a protocol for using insect antennae on autonomous robots and present a proof of concept for tracking odor plumes to their source. The global response of olfactory neurons is recorded in situ in the form of electroantennograms (EAGs). Our experimental design, based on a whole insect preparation, allows stable recordings within a working day. In comparison, EAGs on excised antennae have a lifetime of 2 hr. A custom hardware/software interface was developed between the EAG electrodes and a robot. The measurement system resolves individual odor patches up to 10 Hz, which exceeds the time scale of artificial chemical sensors. The efficiency of EAG sensors for olfactory searches is further demonstrated in driving the robot toward a source of pheromone. By using identical olfactory stimuli and sensors as in real animals, our robotic platform provides a direct means for testing biological hypotheses about olfactory coding and search strategies. It may also prove beneficial for detecting other odorants of interests by combining EAGs from different insect species in a bioelectronic nose configuration or using nanostructured gas sensors that mimic insect antennae.

  8. Using insect electroantennogram sensors on autonomous robots for olfactory searches.

    PubMed

    Martinez, Dominique; Arhidi, Lotfi; Demondion, Elodie; Masson, Jean-Baptiste; Lucas, Philippe

    2014-08-04

    Robots designed to track chemical leaks in hazardous industrial facilities or explosive traces in landmine fields face the same problem as insects foraging for food or searching for mates: the olfactory search is constrained by the physics of turbulent transport. The concentration landscape of wind borne odors is discontinuous and consists of sporadically located patches. A pre-requisite to olfactory search is that intermittent odor patches are detected. Because of its high speed and sensitivity, the olfactory organ of insects provides a unique opportunity for detection. Insect antennae have been used in the past to detect not only sex pheromones but also chemicals that are relevant to humans, e.g., volatile compounds emanating from cancer cells or toxic and illicit substances. We describe here a protocol for using insect antennae on autonomous robots and present a proof of concept for tracking odor plumes to their source. The global response of olfactory neurons is recorded in situ in the form of electroantennograms (EAGs). Our experimental design, based on a whole insect preparation, allows stable recordings within a working day. In comparison, EAGs on excised antennae have a lifetime of 2 hr. A custom hardware/software interface was developed between the EAG electrodes and a robot. The measurement system resolves individual odor patches up to 10 Hz, which exceeds the time scale of artificial chemical sensors. The efficiency of EAG sensors for olfactory searches is further demonstrated in driving the robot toward a source of pheromone. By using identical olfactory stimuli and sensors as in real animals, our robotic platform provides a direct means for testing biological hypotheses about olfactory coding and search strategies. It may also prove beneficial for detecting other odorants of interests by combining EAGs from different insect species in a bioelectronic nose configuration or using nanostructured gas sensors that mimic insect antennae.

  9. Using insect electroantennogram sensors on autonomous robots for olfactory searches.

    PubMed

    Martinez, Dominique; Arhidi, Lotfi; Demondion, Elodie; Masson, Jean-Baptiste; Lucas, Philippe

    2014-01-01

    Robots designed to track chemical leaks in hazardous industrial facilities or explosive traces in landmine fields face the same problem as insects foraging for food or searching for mates: the olfactory search is constrained by the physics of turbulent transport. The concentration landscape of wind borne odors is discontinuous and consists of sporadically located patches. A pre-requisite to olfactory search is that intermittent odor patches are detected. Because of its high speed and sensitivity, the olfactory organ of insects provides a unique opportunity for detection. Insect antennae have been used in the past to detect not only sex pheromones but also chemicals that are relevant to humans, e.g., volatile compounds emanating from cancer cells or toxic and illicit substances. We describe here a protocol for using insect antennae on autonomous robots and present a proof of concept for tracking odor plumes to their source. The global response of olfactory neurons is recorded in situ in the form of electroantennograms (EAGs). Our experimental design, based on a whole insect preparation, allows stable recordings within a working day. In comparison, EAGs on excised antennae have a lifetime of 2 hr. A custom hardware/software interface was developed between the EAG electrodes and a robot. The measurement system resolves individual odor patches up to 10 Hz, which exceeds the time scale of artificial chemical sensors. The efficiency of EAG sensors for olfactory searches is further demonstrated in driving the robot toward a source of pheromone. By using identical olfactory stimuli and sensors as in real animals, our robotic platform provides a direct means for testing biological hypotheses about olfactory coding and search strategies. It may also prove beneficial for detecting other odorants of interests by combining EAGs from different insect species in a bioelectronic nose configuration or using nanostructured gas sensors that mimic insect antennae. PMID:25145980

  10. Laser-based pedestrian tracking in outdoor environments by multiple mobile robots.

    PubMed

    Ozaki, Masataka; Kakimuma, Kei; Hashimoto, Masafumi; Takahashi, Kazuhiko

    2012-01-01

    This paper presents an outdoor laser-based pedestrian tracking system using a group of mobile robots located near each other. Each robot detects pedestrians from its own laser scan image using an occupancy-grid-based method, and the robot tracks the detected pedestrians via Kalman filtering and global-nearest-neighbor (GNN)-based data association. The tracking data are broadcast to multiple robots through intercommunication and combined using the covariance intersection (CI) method. For pedestrian tracking, each robot identifies its own posture using real-time kinematic GPS (RTK-GPS) and laser scan matching. Using our cooperative tracking method, all the robots share the tracking data with each other; hence, individual robots can always recognize pedestrians that are invisible to any other robot. The simulation and experimental results show that cooperative tracking provides better tracking performance than conventional individual tracking. Our tracking system functions in a decentralized manner without any central server, and it therefore provides a degree of scalability and robustness that cannot be achieved by conventional centralized architectures. PMID:23202171
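
    The covariance intersection (CI) fusion step mentioned above can be sketched as follows; this is the standard CI formula for two Gaussian estimates with unknown cross-correlation, with the mixing weight chosen here by a simple grid search minimizing the fused covariance trace (the paper's exact weighting strategy is not stated in the abstract).

    ```python
    import numpy as np

    def covariance_intersection(xa, Pa, xb, Pb, n_grid=101):
        """Fuse two consistent estimates (xa, Pa) and (xb, Pb) without knowing their correlation."""
        Pa_inv, Pb_inv = np.linalg.inv(Pa), np.linalg.inv(Pb)
        best = None
        for w in np.linspace(0.0, 1.0, n_grid):            # search the CI mixing weight
            P = np.linalg.inv(w * Pa_inv + (1.0 - w) * Pb_inv)
            if best is None or np.trace(P) < best[0]:
                best = (np.trace(P), w, P)
        _, w, P = best
        x = P @ (w * Pa_inv @ xa + (1.0 - w) * Pb_inv @ xb)
        return x, P

    # Example: two pedestrian position estimates reported by different robots.
    xa, Pa = np.array([2.0, 1.0]), np.diag([0.4, 0.1])
    xb, Pb = np.array([2.2, 0.8]), np.diag([0.1, 0.5])
    x_fused, P_fused = covariance_intersection(xa, Pa, xb, Pb)
    print(x_fused, np.trace(P_fused))
    ```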

  11. Laser-Based Pedestrian Tracking in Outdoor Environments by Multiple Mobile Robots

    PubMed Central

    Ozaki, Masataka; Kakimuma, Kei; Hashimoto, Masafumi; Takahashi, Kazuhiko

    2012-01-01

    This paper presents an outdoor laser-based pedestrian tracking system using a group of mobile robots located near each other. Each robot detects pedestrians from its own laser scan image using an occupancy-grid-based method, and the robot tracks the detected pedestrians via Kalman filtering and global-nearest-neighbor (GNN)-based data association. The tracking data are broadcast to multiple robots through intercommunication and combined using the covariance intersection (CI) method. For pedestrian tracking, each robot identifies its own posture using real-time kinematic GPS (RTK-GPS) and laser scan matching. Using our cooperative tracking method, all the robots share the tracking data with each other; hence, individual robots can always recognize pedestrians that are invisible to any other robot. The simulation and experimental results show that cooperative tracking provides better tracking performance than conventional individual tracking. Our tracking system functions in a decentralized manner without any central server, and it therefore provides a degree of scalability and robustness that cannot be achieved by conventional centralized architectures. PMID:23202171

  12. Study on a human guidance method for autonomous cruise of indoor robot

    NASA Astrophysics Data System (ADS)

    Jia, Bao-Zhi; Zhu, Ming

    2011-12-01

    This paper describes a human guidance method for the autonomous cruise of an indoor robot. A low-cost robot follows a person in a room and records the path for autonomous cruise using its monocular vision. A video-based object detection and tracking method is applied to the video received from the robot's camera to detect the target. The validity of the human guidance method is demonstrated experimentally.

  13. Multi sensor fusion framework for indoor-outdoor localization of limited resource mobile robots.

    PubMed

    Marín, Leonardo; Vallés, Marina; Soriano, Ángel; Valera, Ángel; Albertos, Pedro

    2013-10-21

    This paper presents a sensor fusion framework that improves the localization of mobile robots with limited computational resources. It employs an event-based Kalman Filter to combine the measurements of a global sensor and an inertial measurement unit (IMU) on an event-based schedule, using fewer resources (execution time and bandwidth) but with similar performance when compared to traditional methods. The event is defined to reflect the necessity of the global information: it fires when the estimation error covariance exceeds a predefined limit. The proposed experimental platforms are based on the LEGO Mindstorms NXT and consist of a differential-wheel mobile robot navigating indoors with a zenithal camera as the global sensor, and an Ackermann-steering mobile robot navigating outdoors with an SBG Systems GPS accessed through an IGEP board that also serves as a datalogger. The IMU in both robots is built using the NXT motor encoders along with one gyroscope, one compass, and two accelerometers from HiTechnic, placed according to a particle-based dynamic model of the robots. The tests performed confirm the correct performance and low execution time of the proposed framework. Robustness and stability are observed during a long walk test in both indoor and outdoor environments.
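
    A hedged sketch of the event trigger described above: the filter always runs the IMU-driven prediction, but the more expensive global-sensor correction is requested only when the trace of the estimation error covariance exceeds a limit. The matrices, models, and threshold below are placeholders, not the paper's tuning.

    ```python
    import numpy as np

    class EventBasedKF:
        """Kalman filter that requests a global-sensor update only when uncertainty is high."""

        def __init__(self, x0, P0, F, Q, H, R, cov_limit):
            self.x, self.P = x0, P0
            self.F, self.Q, self.H, self.R = F, Q, H, R
            self.cov_limit = cov_limit          # threshold on trace(P) defining the event

        def predict(self, B, u):
            """IMU/odometry-driven prediction, executed every cycle."""
            self.x = self.F @ self.x + B @ u
            self.P = self.F @ self.P @ self.F.T + self.Q

        def needs_global_update(self):
            """Event condition: has the estimation error covariance grown too large?"""
            return np.trace(self.P) > self.cov_limit

        def update(self, z):
            """Global-sensor (camera or GPS) correction, executed only on an event."""
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)
            self.x = self.x + K @ (z - self.H @ self.x)
            self.P = (np.eye(len(self.x)) - K @ self.H) @ self.P
    ```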

  14. Multi sensor fusion framework for indoor-outdoor localization of limited resource mobile robots.

    PubMed

    Marín, Leonardo; Vallés, Marina; Soriano, Ángel; Valera, Ángel; Albertos, Pedro

    2013-01-01

    This paper presents a sensor fusion framework that improves the localization of mobile robots with limited computational resources. It employs an event-based Kalman Filter to combine the measurements of a global sensor and an inertial measurement unit (IMU) on an event-based schedule, using fewer resources (execution time and bandwidth) but with similar performance when compared to traditional methods. The event is defined to reflect the necessity of the global information: it fires when the estimation error covariance exceeds a predefined limit. The proposed experimental platforms are based on the LEGO Mindstorms NXT and consist of a differential-wheel mobile robot navigating indoors with a zenithal camera as the global sensor, and an Ackermann-steering mobile robot navigating outdoors with an SBG Systems GPS accessed through an IGEP board that also serves as a datalogger. The IMU in both robots is built using the NXT motor encoders along with one gyroscope, one compass, and two accelerometers from HiTechnic, placed according to a particle-based dynamic model of the robots. The tests performed confirm the correct performance and low execution time of the proposed framework. Robustness and stability are observed during a long walk test in both indoor and outdoor environments. PMID:24152933

  15. Multi Sensor Fusion Framework for Indoor-Outdoor Localization of Limited Resource Mobile Robots

    PubMed Central

    Marín, Leonardo; Vallés, Marina; Soriano, Ángel; Valera, Ángel; Albertos, Pedro

    2013-01-01

    This paper presents a sensor fusion framework that improves the localization of mobile robots with limited computational resources. It employs an event-based Kalman Filter to combine the measurements of a global sensor and an inertial measurement unit (IMU) on an event-based schedule, using fewer resources (execution time and bandwidth) but with similar performance when compared to traditional methods. The event is defined to reflect the necessity of the global information: it fires when the estimation error covariance exceeds a predefined limit. The proposed experimental platforms are based on the LEGO Mindstorms NXT and consist of a differential-wheel mobile robot navigating indoors with a zenithal camera as the global sensor, and an Ackermann-steering mobile robot navigating outdoors with an SBG Systems GPS accessed through an IGEP board that also serves as a datalogger. The IMU in both robots is built using the NXT motor encoders along with one gyroscope, one compass, and two accelerometers from HiTechnic, placed according to a particle-based dynamic model of the robots. The tests performed confirm the correct performance and low execution time of the proposed framework. Robustness and stability are observed during a long walk test in both indoor and outdoor environments. PMID:24152933

  16. Multi-polarimetric textural distinctiveness for outdoor robotic saliency detection

    NASA Astrophysics Data System (ADS)

    Haider, S. A.; Scharfenberger, C.; Kazemzadeh, F.; Wong, A.; Clausi, D. A.

    2015-01-01

    Mobile robots that rely on vision for navigation and object detection use saliency approaches to identify a set of potential candidates to recognize. The state of the art in saliency detection for mobile robotics often relies upon visible-light imaging with conventional camera setups to distinguish an object against its surroundings based on factors such as feature compactness, heterogeneity, and/or homogeneity. We demonstrate a novel multi-polarimetric saliency detection approach which uses multiple measured polarization states of a scene. We leverage the light-material interaction known as Fresnel reflection to extract rotationally invariant multi-polarimetric textural representations, which are then used to train a high-dimensional sparse texture model. The multi-polarimetric textural distinctiveness is characterized using a conditional probability framework based on the sparse texture model, which is then used to determine the saliency at each pixel of the scene. We observed that, by including additional polarization states in the saliency analysis, we were able to compute noticeably improved saliency maps in scenes where objects are difficult to distinguish from their background due to color-intensity similarities between the object and its surroundings.

  17. Integrated robotic vehicle control system for outdoor container handling

    NASA Astrophysics Data System (ADS)

    Viitanen, Jouko O.; Haverinen, Janne; Mattila, Pentti; Maekelae, Hannu; von Numers, Thomas; Stanek, Zbigniev; Roening, Juha

    1997-09-01

    We describe an integrated system developed for use onboard a moving work machine. The machine is targeted at applications such as automatic container handling at loading terminals. The main emphasis is on the various environment perception duties required by autonomous or semi-autonomous operation. These include obstacle detection, container position determination, localization needed for efficient navigation, and measurement of docking and grasping locations of containers. Practical experience is reported on the use of several different types of technology for these tasks. For close-distance measurement, such as container row following, ultrasonic measurement was used, with associated control software. For obstacle and docking position detection, 3D active vision techniques were developed with structured lighting, also utilizing motion estimation techniques. Depth-from-defocus methods were developed for passive 3D vision. For localization, fusion of data from several sources was carried out. These included dead-reckoning data from odometry, an inertial unit, and several alternative external localization devices, i.e. real-time kinematic GPS, inductive and optical transponders. The system was integrated to run on a real-time operating system platform, using a high-level software specification tool that created the hierarchical control structure of the software.

  18. Semi-autonomous Simulated Brain Tumor Ablation with RavenII Surgical Robot using Behavior Tree

    PubMed Central

    Hu, Danying; Gong, Yuanzheng; Hannaford, Blake; Seibel, Eric J.

    2015-01-01

    Medical robots have been widely used to assist surgeons in carrying out dexterous surgical tasks in various ways. Most of these tasks require the surgeon's direct or indirect operation. A certain level of autonomy in robotic surgery could not only free the surgeon from some tedious repetitive tasks, but also exploit the advantages of the robot: high dexterity and accuracy. This paper presents a semi-autonomous neurosurgical procedure of brain tumor ablation using the RAVEN surgical robot and stereo visual feedback. By integrating with the behavior tree framework, the whole surgical task is modeled flexibly and intelligently as nodes and leaves of a behavior tree. This paper provides three main contributions: (1) describing brain tumor ablation as an ideal candidate for autonomous robotic surgery, (2) modeling and implementing the semi-autonomous surgical task using the behavior tree framework, and (3) designing an experimental simulated ablation task for a feasibility study and robot performance analysis. PMID:26405563
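
    Behavior trees compose a task from reusable nodes: composite nodes (sequence, selector) tick their children and propagate success or failure, while leaves wrap primitive actions or condition checks. The sketch below is a minimal, generic behavior-tree core in Python, not the RAVEN-specific implementation; the node and task names in the usage comment are hypothetical.

```python
# Minimal behavior-tree sketch: a Sequence succeeds only if all children
# succeed, a Selector succeeds as soon as one child succeeds.
SUCCESS, FAILURE = "SUCCESS", "FAILURE"

class Action:
    """Leaf node wrapping a primitive action or condition check."""
    def __init__(self, name, fn):
        self.name, self.fn = name, fn
    def tick(self):
        return SUCCESS if self.fn() else FAILURE

class Sequence:
    def __init__(self, children):
        self.children = children
    def tick(self):
        for child in self.children:
            if child.tick() == FAILURE:
                return FAILURE
        return SUCCESS

class Selector:
    def __init__(self, children):
        self.children = children
    def tick(self):
        for child in self.children:
            if child.tick() == SUCCESS:
                return SUCCESS
        return FAILURE

# Illustrative task structure (leaf functions would call the robot API):
# ablate = Sequence([Action("locate_tumor", locate_tumor),
#                    Action("move_to_target", move_to_target),
#                    Action("ablate_point", ablate_point)])
# while ablate.tick() != SUCCESS:
#     pass
```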

  19. Mechanical deployment system on ARIES, an autonomous mobile robot

    SciTech Connect

    Rocheleau, D.N.

    1995-12-01

    ARIES (Autonomous Robotic Inspection Experimental System) is under development for the Department of Energy (DOE) to survey and inspect drums containing low-level radioactive waste stored in warehouses at DOE facilities. This paper focuses on the mechanical deployment system, referred to as the camera positioning system (CPS), used in the project. The CPS is used for positioning four identical but separate camera packages consisting of vision cameras and other required sensors such as bar-code readers and light stripe projectors. The CPS is attached to the top of a mobile robot and consists of two mechanisms. The first is a lift mechanism composed of five interlocking rail elements, which starts from a retracted position and extends upward to simultaneously position three separate camera packages to inspect the top three drums of a column of four drums. The second is a parallelogram (special-case Grashof) four-bar mechanism, which is used for positioning a camera package on drums on the floor. Both mechanisms are the subject of this paper, and the lift mechanism is discussed in detail.

  20. Interaction dynamics of multiple autonomous mobile robots in bounded spatial domains

    NASA Technical Reports Server (NTRS)

    Wang, P. K. C.

    1989-01-01

    A general navigation strategy for multiple autonomous robots in a bounded domain is developed analytically. Each robot is modeled as a spherical particle (i.e., an effective spatial domain about the center of mass); its interactions with other robots or with obstacles and domain boundaries are described in terms of the classical many-body problem; and a collision-avoidance strategy is derived and combined with homing, robot-robot, and robot-obstacle collision-avoidance strategies. Results from homing simulations involving (1) a single robot in a circular domain, (2) two robots in a circular domain, and (3) one robot in a domain with an obstacle are presented in graphs and briefly characterized.

  1. Human-robot interaction for field operation of an autonomous helicopter

    NASA Astrophysics Data System (ADS)

    Jones, Henry L.; Frew, Eric W.; Woodley, Bruce R.; Rock, Stephen M.

    1999-01-01

    The robustness of autonomous robotic systems to unanticipated circumstances is typically insufficient for use in the field. The many skills of a human user often fill this gap in robotic capability. To incorporate the human into the system, a useful interaction between man and machine must exist. This interaction should enable useful communication to be exchanged in a natural way between human and robot on a variety of levels. This paper describes the current human-robot interaction of the Stanford HUMMINGBIRD autonomous helicopter. In particular, the paper discusses the elements of the system that enable multiple levels of communication. An intelligent system agent manages the different inputs given to the helicopter. An advanced user interface gives the user and helicopter a method for exchanging useful information. Using this human-robot interaction, the HUMMINGBIRD has carried out various autonomous search, tracking, and retrieval missions.

  2. Distributed, collaborative human-robotic networks for outdoor experiments in search, identify and track

    NASA Astrophysics Data System (ADS)

    Lee, Daniel; McClelland, Mark; Schneider, Joseph; Yang, Tsung-Lin; Gallagher, Dan; Wang, John; Shah, Danelle; Ahmed, Nisar; Moran, Pete; Jones, Brandon; Leung, Tung-Sing; Nathan, Aaron; Kress-Gazit, Hadas; Campbell, Mark

    2010-10-01

    This paper presents an overview of a human-robotic system under development at Cornell which is capable of mapping an unknown environment, as well as discovering, tracking, and neutralizing several static and dynamic objects of interest. In addition, the robots can coordinate their individual tasks with one another without overly burdening a human operator. The testbed utilizes the Segway RMP platform, with lidar, vision, IMU and GPS sensors. The software draws from autonomous systems research, specifically in the areas of pose estimation, target detection and tracking, motion and behavioral planning, and human robot interaction. This paper also details experimental scenarios of mapping, tracking, and neutralization presented by way of pictures, data, and movies.

  3. Regolith Advanced Surface Systems Operations Robot (RASSOR) Phase 2 and Smart Autonomous Sand-Swimming Excavator

    NASA Technical Reports Server (NTRS)

    Sandy, Michael

    2015-01-01

    The Regolith Advanced Surface Systems Operations Robot (RASSOR) Phase 2 is an excavation robot for mining regolith on a planet like Mars. The robot is programmed using the Robot Operating System (ROS), and it also uses a physical simulation program called Gazebo. This internship focused on improving various functions of the program in order to make the robot more professional and efficient. The internship also included work on another project, the Smart Autonomous Sand-Swimming Excavator, a robot designed to dig through sand and extract sample material. The intern worked on programming the sand-swimming robot and designing the electrical system to power and control it.
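
    Robots programmed with ROS typically expose their drive interface as a velocity topic that a node publishes commands to. The rospy sketch below is a hedged, generic illustration of such a node; the topic name, rate and speed are assumptions, not RASSOR's actual interface.

```python
#!/usr/bin/env python
import rospy
from geometry_msgs.msg import Twist

# Minimal rospy sketch: publish velocity commands on a (hypothetical) /cmd_vel
# topic at 10 Hz. Topic name and speed are illustrative, not RASSOR's API.

def drive_forward():
    rospy.init_node("drive_demo")
    pub = rospy.Publisher("/cmd_vel", Twist, queue_size=10)
    rate = rospy.Rate(10)
    cmd = Twist()
    cmd.linear.x = 0.2          # m/s forward
    while not rospy.is_shutdown():
        pub.publish(cmd)
        rate.sleep()

if __name__ == "__main__":
    drive_forward()
```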

  4. Planetary exploration by a mobile robot: mission teleprogramming and autonomous navigation.

    NASA Astrophysics Data System (ADS)

    Chatila, R.; Lacroix, S.; Simeon, T.; Herrb, M.

    Sending mobile robots to accomplish planetary exploration missions is scientifically promising and technologically challenging. The authors present a complete approach that encompasses the major aspects involved in the design of a robotic system for planetary exploration. It includes mission teleprogramming and supervision at a ground station, and autonomous mission execution by the remote mobile robot. They have partially implemented and validated these concepts. Experimental results illustrate the approach and the results.

  5. Behavior-based multi-robot collaboration for autonomous construction tasks

    NASA Technical Reports Server (NTRS)

    Stroupe, Ashley; Huntsberger, Terry; Okon, Avi; Aghazarian, Hrand; Robinson, Matthew

    2005-01-01

    The Robot Construction Crew (RCC) is a heterogeneous multi-robot system for autonomous construction of a structure through assembly of long components. The two-robot team demonstrates component placement into an existing structure in a realistic environment. The task requires component acquisition, cooperative transport, and cooperative precision manipulation. A behavior-based architecture provides adaptability. The RCC approach minimizes computation, power, communication, and sensing for applicability to space-related construction efforts, but the techniques are also applicable to terrestrial construction tasks.

  6. Biomimetic autonomous robot inspired by the Cyanea capillata (Cyro).

    PubMed

    Villanueva, Alex A; Marut, Kenneth J; Michael, Tyler; Priya, Shashank

    2013-12-01

    A biomimetic robot inspired by Cyanea capillata, termed 'Cyro', was developed to meet the functional demands of underwater surveillance in defense and civilian applications. The vehicle was designed to mimic the morphology and swimming mechanism of its natural counterpart. The body of the vehicle consists of a rigid support structure with linear DC motors that actuate eight mechanical arms. The mechanical arms, in conjunction with an artificial mesoglea, create the hydrodynamic force required for propulsion. The full vehicle measures 170 cm in diameter and has a total mass of 76 kg. An analytical model of the mechanical arm kinematics was developed. The analytical and experimental bell kinematics were analyzed and compared to those of C. capillata. Cyro was found to reach the water surface untethered and autonomously from a depth of 182 cm in five actuation cycles. It achieved an average velocity of 8.47 cm/s while consuming an average power of 70 W. A two-axis thrust stand was developed to calculate the thrust directly from a single bell segment, yielding an average thrust of 27.9 N for the whole vehicle. Steady-state velocity was not reached during Cyro's swimming test, but the measured performance during its last swim cycle resulted in a cost of transport of 10.9 J/(kg·m) and a total efficiency of 0.03. PMID:24166747
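
    The reported cost of transport can be checked directly from the quoted mass, average power and average velocity, assuming the common mass-specific definition COT = P / (m·v); the figures are consistent with the abstract.

```python
# Check of the reported cost of transport, assuming COT = P / (m * v).
P = 70.0          # average power, W
m = 76.0          # vehicle mass, kg
v = 0.0847        # average velocity, m/s (8.47 cm/s)

cot = P / (m * v)
print(f"cost of transport = {cot:.1f} J/(kg*m)")   # ~10.9, matching the abstract
```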

  7. Biomimetic autonomous robot inspired by the Cyanea capillata (Cyro).

    PubMed

    Villanueva, Alex A; Marut, Kenneth J; Michael, Tyler; Priya, Shashank

    2013-12-01

    A biomimetic robot inspired by Cyanea capillata, termed 'Cyro', was developed to meet the functional demands of underwater surveillance in defense and civilian applications. The vehicle was designed to mimic the morphology and swimming mechanism of its natural counterpart. The body of the vehicle consists of a rigid support structure with linear DC motors that actuate eight mechanical arms. The mechanical arms, in conjunction with an artificial mesoglea, create the hydrodynamic force required for propulsion. The full vehicle measures 170 cm in diameter and has a total mass of 76 kg. An analytical model of the mechanical arm kinematics was developed. The analytical and experimental bell kinematics were analyzed and compared to those of C. capillata. Cyro was found to reach the water surface untethered and autonomously from a depth of 182 cm in five actuation cycles. It achieved an average velocity of 8.47 cm/s while consuming an average power of 70 W. A two-axis thrust stand was developed to calculate the thrust directly from a single bell segment, yielding an average thrust of 27.9 N for the whole vehicle. Steady-state velocity was not reached during Cyro's swimming test, but the measured performance during its last swim cycle resulted in a cost of transport of 10.9 J/(kg·m) and a total efficiency of 0.03.

  8. A hybrid microbial dielectric elastomer generator for autonomous robots

    NASA Astrophysics Data System (ADS)

    Anderson, Iain A.; Ieropoulos, Ioannis; McKay, Thomas; O'Brien, Benjamin; Melhuish, Chris

    2010-04-01

    We are developing a hybrid Dielectric Elastomer Generator (DEG)-Microbial Fuel Cell (MFC) energy harvester. The system is for EcoBot, an Autonomous Robot (AR) that currently uses its MFCs to extract electrical energy from biomass, in the form of flies. MFCs, though reliable, are slow to store charge. Thus, EcoBot operations are characterized by active periods followed by dormant periods while energy stores recover. Providing an alternative energy harvester such as a DEG, driven by wind or water, could therefore increase active time and also provide high-voltage energy for direct use by on-board systems employing dielectric elastomer actuators (DEAs). Energy can be harvested from a DEG when work is done on its elastomer membrane. However, the DEG requires an initial charge and additional charge to compensate for losses due to leakage. The starting charge can be supplied by the EcoBot MFC capacitor. We have developed a self-primer circuit that uses some of the harvested charge to prime the membrane at each cycle. The low-voltage MFC initial priming charge was boosted using a voltage converter that was then electrically disconnected. The DEG membrane was cyclically stretched, producing charge that replenished leakage losses and energy that could potentially be stored. A further study demonstrated that the DEG with the self-primer circuit can boost voltage from very low values without the need for a voltage converter, thus reducing circuit complexity and improving efficiency.

  9. Concept formation and generalization based on experimentation by an autonomous mobile robot

    SciTech Connect

    Spelt, P.F.; deSaussure, G.; Lyness, E.; Oliver, G.; Silliman, M.

    1989-01-01

    The Center for Engineering Systems Advanced Research (CESAR) conducts basic research in the area of intelligent machines. In this paper, we describe our approach to a class of machine learning problems which involves autonomous concept formation using feedback from trial-and-error learning. Our formulation was experimentally validated on an autonomous mobile robot, which learned the task of control panel monitoring and manipulation for effective process control. Conclusions are drawn concerning the applicability of the system to a more general class of learning problems, and implications for the use of autonomous mobile robots in hostile and unknown environments are discussed. 9 refs., 5 figs.

  10. A Movement-Assisted Deployment of Collaborating Autonomous Sensors for Indoor and Outdoor Environment Monitoring

    PubMed Central

    Niewiadomska-Szynkiewicz, Ewa; Sikora, Andrzej; Marks, Michał

    2016-01-01

    Using mobile robots or unmanned vehicles to assist optimal wireless sensors deployment in a working space can significantly enhance the capability to investigate unknown environments. This paper addresses the issues of the application of numerical optimization and computer simulation techniques to on-line calculation of a wireless sensor network topology for monitoring and tracking purposes. We focus on the design of a self-organizing and collaborative mobile network that enables a continuous data transmission to the data sink (base station) and automatically adapts its behavior to changes in the environment to achieve a common goal. The pre-defined and self-configuring approaches to the mobile-based deployment of sensors are compared and discussed. A family of novel algorithms for the optimal placement of mobile wireless devices for permanent monitoring of indoor and outdoor dynamic environments is described. They employ a network connectivity-maintaining mobility model utilizing the concept of the virtual potential function for calculating the motion trajectories of platforms carrying sensors. Their quality and utility have been justified through simulation experiments and are discussed in the final part of the paper. PMID:27649186

  11. A Movement-Assisted Deployment of Collaborating Autonomous Sensors for Indoor and Outdoor Environment Monitoring.

    PubMed

    Niewiadomska-Szynkiewicz, Ewa; Sikora, Andrzej; Marks, Michał

    2016-09-14

    Using mobile robots or unmanned vehicles to assist optimal wireless sensors deployment in a working space can significantly enhance the capability to investigate unknown environments. This paper addresses the issues of the application of numerical optimization and computer simulation techniques to on-line calculation of a wireless sensor network topology for monitoring and tracking purposes. We focus on the design of a self-organizing and collaborative mobile network that enables a continuous data transmission to the data sink (base station) and automatically adapts its behavior to changes in the environment to achieve a common goal. The pre-defined and self-configuring approaches to the mobile-based deployment of sensors are compared and discussed. A family of novel algorithms for the optimal placement of mobile wireless devices for permanent monitoring of indoor and outdoor dynamic environments is described. They employ a network connectivity-maintaining mobility model utilizing the concept of the virtual potential function for calculating the motion trajectories of platforms carrying sensors. Their quality and utility have been justified through simulation experiments and are discussed in the final part of the paper.

  12. A Movement-Assisted Deployment of Collaborating Autonomous Sensors for Indoor and Outdoor Environment Monitoring.

    PubMed

    Niewiadomska-Szynkiewicz, Ewa; Sikora, Andrzej; Marks, Michał

    2016-01-01

    Using mobile robots or unmanned vehicles to assist optimal wireless sensors deployment in a working space can significantly enhance the capability to investigate unknown environments. This paper addresses the issues of the application of numerical optimization and computer simulation techniques to on-line calculation of a wireless sensor network topology for monitoring and tracking purposes. We focus on the design of a self-organizing and collaborative mobile network that enables a continuous data transmission to the data sink (base station) and automatically adapts its behavior to changes in the environment to achieve a common goal. The pre-defined and self-configuring approaches to the mobile-based deployment of sensors are compared and discussed. A family of novel algorithms for the optimal placement of mobile wireless devices for permanent monitoring of indoor and outdoor dynamic environments is described. They employ a network connectivity-maintaining mobility model utilizing the concept of the virtual potential function for calculating the motion trajectories of platforms carrying sensors. Their quality and utility have been justified through simulation experiments and are discussed in the final part of the paper. PMID:27649186

  13. An intelligent hybrid behavior coordination system for an autonomous mobile robot

    NASA Astrophysics Data System (ADS)

    Luo, Chaomin; Krishnan, Mohan; Paulik, Mark; Fallouh, Samer

    2013-12-01

    In this paper, the development of a low-cost PID controller with an intelligent behavior coordination system is described for an autonomous mobile robot equipped with IR sensors, ultrasonic sensors, a regulator, and RC filters on a robot platform based on an HCS12 microcontroller and embedded systems. A novel hybrid PID controller and behavior coordination system is developed for wall-following navigation and obstacle avoidance of an autonomous mobile robot. The adaptive control used in this robot is a hybrid PID algorithm associated with template and behavior coordination models. Software development covers motor control, the behavior coordination intelligent system, and sensor fusion. In addition, a module-based programming technique is adopted to improve the efficiency of integrating the hybrid PID, template, and behavior coordination algorithms. The hybrid model synthesizes PID control, the template technique, and behavior coordination for wall-following navigation with obstacle avoidance. The motor control, obstacle avoidance, and wall-following navigation algorithms are developed to propel and steer the autonomous mobile robot. Experiments validate how this PID controller and behavior coordination system directs an autonomous mobile robot to perform wall-following navigation with obstacle avoidance. The hardware configuration and module-based technique are described in this paper. Experimental results demonstrate that the robot can be successfully guided by the hybrid PID controller and behavior coordination system for wall-following navigation with obstacle avoidance.
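
    A wall-following PID loop of the kind described here typically regulates the lateral distance reported by a side-facing range sensor, steering so as to hold a setpoint. The sketch below is a generic, hypothetical illustration in Python (the gains, setpoint and sensor/actuator calls are assumptions), not the HCS12 firmware described in the record.

```python
# Generic PID sketch for wall following: keep the side range reading at a
# desired distance by adjusting the steering command. Gains are illustrative.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Usage sketch (read_side_distance / set_steering are hypothetical robot calls):
# pid = PID(kp=2.0, ki=0.1, kd=0.5, dt=0.02)
# target = 0.30                       # desired wall distance in metres
# while True:
#     error = target - read_side_distance()
#     set_steering(pid.step(error))
```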

  14. Advances in Robotic, Human, and Autonomous Systems for Missions of Space Exploration

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R.; Briggs, Geoffrey A.; Glass, Brian J.; Pedersen, Liam; Kortenkamp, David M.; Wettergreen, David S.; Nourbakhsh, I.; Clancy, Daniel J.; Zornetzer, Steven (Technical Monitor)

    2002-01-01

    Space exploration missions are evolving toward more complex architectures involving more capable robotic systems, new levels of human and robotic interaction, and increasingly autonomous systems. How this evolving mix of advanced capabilities will be utilized in the design of new missions is a subject of much current interest. Cost and risk constraints also play a key role, resulting in a complex interplay of a broad range of factors in mission development and planning. This paper will discuss how human, robotic, and autonomous systems could be used in advanced space exploration missions. In particular, a recently completed survey of the state of the art and the potential future of robotic systems, as well as new experiments utilizing human and robotic approaches, will be described. Finally, there will be a discussion of how best to utilize these various approaches for meeting space exploration goals.

  15. An Autonomous Mobile Robot Guided by a Chaotic True Random Bits Generator

    NASA Astrophysics Data System (ADS)

    Volos, Ch. K.; Kyprianidis, I. M.; Stouboulos, I. N.; Stavrinides, S. G.; Anagnostopoulos, A. N.

    In this work a robot controller that ensures chaotic motion of an autonomous mobile robot is presented. This new strategy, which is very useful in many robotic missions, generates an unpredictable trajectory by using a chaotic path planning generator. The proposed generator produces a trajectory that is the result of a sequence of planned target locations. In contrast to other similar works, this one is based on a new chaotic true random bits generator, whose basic feature is the coexistence of two different synchronization phenomena between mutually coupled identical nonlinear circuits. Simulation tests confirm that the robot's whole workspace is covered in an unpredictable way within a very satisfactory time.
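
    The essence of chaotic coverage is to feed a deterministic but unpredictable sequence into the waypoint planner. The original work derives its bits from coupled nonlinear circuits; the sketch below substitutes a simple logistic map as a stand-in chaotic source, so it only illustrates the planning side, not the authors' true random bits generator. Workspace size and seeds are assumptions.

```python
# Sketch of chaotic waypoint generation: two logistic maps (a stand-in for the
# chaotic true-random-bit circuit of the paper) drive target locations inside
# a rectangular workspace.

def logistic_map(x, r=3.99):
    return r * x * (1.0 - x)

def chaotic_waypoints(n, width, height, seed=(0.123, 0.567)):
    x, y = seed
    points = []
    for _ in range(n):
        x = logistic_map(x)              # chaotic x coordinate in (0, 1)
        y = logistic_map(y)              # chaotic y coordinate in (0, 1)
        points.append((x * width, y * height))
    return points

# Example: five unpredictable targets in a 10 m x 8 m workspace.
for wx, wy in chaotic_waypoints(5, width=10.0, height=8.0):
    print(f"go to ({wx:.2f}, {wy:.2f})")
```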

  16. Remote wave measurements using autonomous mobile robotic systems

    NASA Astrophysics Data System (ADS)

    Kurkin, Andrey; Zeziulin, Denis; Makarov, Vladimir; Belyakov, Vladimir; Tyugin, Dmitry; Pelinovsky, Efim

    2016-04-01

    The project covers the development of a technology for monitoring and forecasting the state of the coastal zone environment using radar equipment transported by autonomous mobile robotic systems (AMRS). Sought-after areas of application are the eastern and northern coasts of Russia, where continuous collection of information on topographic changes of the coastal zone and hydrodynamic measurements in environments inaccessible to humans are needed. The intensity of the wave reflections received by the surveillance radar is directly related to wave height. Mathematical models and algorithms have been developed for processing experimental data (signal selection, spectral analysis, wavelet analysis), for recalculating landwash from data on wave heights far from the shore, and for determining threshold values of wave heights far from the shore. A software complex has been developed for operating the experimental AMRS prototype, comprising the following modules: data loading module, reporting module, georeferencing module, data analysis module, monitoring module, hardware control module, and graphical user interface. Further work will involve testing the manufactured experimental prototype along selected coastline routes of Sakhalin Island. Field tests will reveal shortcomings of the development and identify ways to optimize the structure and operating algorithms of the AMRS, as well as the operation of the measuring equipment. The presented results have been obtained at Nizhny Novgorod State Technical University n.a. R. Alekseev in the framework of the Federal Target Program «Research and development on priority directions of scientific-technological complex of Russia for 2014 - 2020 years» (agreement № 14.574.21.0089 (unique identifier of agreement - RFMEFI57414X0089)).

  17. Simple sensors for performing useful tasks autonomously in complex outdoor terrain

    NASA Astrophysics Data System (ADS)

    Gat, Erann; Behar, Albert; Desai, Rajiv; Ivlev, Robert V.; Loch, John L.; Miller, David P.

    1992-11-01

    This paper describes the control system for Rocky IV, a prototype microrover designed to demonstrate proof of concept for a low-cost scientific mission to Mars. Rocky IV uses a behavior-based control architecture which implements a large variety of functions displaying various degrees of autonomy, from completely autonomous long-duration conditional sequences of actions to very precisely described actions resembling classical AI operators. The control system integrates information from infrared proximity sensors, proprioceptive encoders which report on the state of the articulation of the rover's suspension system and other mechanics, a homing beacon, a magnetic compass, and contact sensors. In addition, significant functionality is implemented as 'virtual sensors': computed values which are presented to the system as if they were sensor values. The robot is able to perform a variety of useful tasks, including soil sample collection, removal of surface weathering layers from rocks, spectral imaging, instrument deployment, and sample return, under realistic mission-like conditions in Mars-like terrain.

  18. An integrated design and fabrication strategy for entirely soft, autonomous robots.

    PubMed

    Wehner, Michael; Truby, Ryan L; Fitzgerald, Daniel J; Mosadegh, Bobak; Whitesides, George M; Lewis, Jennifer A; Wood, Robert J

    2016-08-24

    Soft robots possess many attributes that are difficult, if not impossible, to achieve with conventional robots composed of rigid materials. Yet, despite recent advances, soft robots must still be tethered to hard robotic control systems and power sources. New strategies for creating completely soft robots, including soft analogues of these crucial components, are needed to realize their full potential. Here we report the untethered operation of a robot composed solely of soft materials. The robot is controlled with microfluidic logic that autonomously regulates fluid flow and, hence, catalytic decomposition of an on-board monopropellant fuel supply. Gas generated from the fuel decomposition inflates fluidic networks downstream of the reaction sites, resulting in actuation. The body and microfluidic logic of the robot are fabricated using moulding and soft lithography, respectively, and the pneumatic actuator networks, on-board fuel reservoirs and catalytic reaction chambers needed for movement are patterned within the body via a multi-material, embedded 3D printing technique. The fluidic and elastomeric architectures required for function span several orders of magnitude from the microscale to the macroscale. Our integrated design and rapid fabrication approach enables the programmable assembly of multiple materials within this architecture, laying the foundation for completely soft, autonomous robots.

  19. An integrated design and fabrication strategy for entirely soft, autonomous robots

    NASA Astrophysics Data System (ADS)

    Wehner, Michael; Truby, Ryan L.; Fitzgerald, Daniel J.; Mosadegh, Bobak; Whitesides, George M.; Lewis, Jennifer A.; Wood, Robert J.

    2016-08-01

    Soft robots possess many attributes that are difficult, if not impossible, to achieve with conventional robots composed of rigid materials. Yet, despite recent advances, soft robots must still be tethered to hard robotic control systems and power sources. New strategies for creating completely soft robots, including soft analogues of these crucial components, are needed to realize their full potential. Here we report the untethered operation of a robot composed solely of soft materials. The robot is controlled with microfluidic logic that autonomously regulates fluid flow and, hence, catalytic decomposition of an on-board monopropellant fuel supply. Gas generated from the fuel decomposition inflates fluidic networks downstream of the reaction sites, resulting in actuation. The body and microfluidic logic of the robot are fabricated using moulding and soft lithography, respectively, and the pneumatic actuator networks, on-board fuel reservoirs and catalytic reaction chambers needed for movement are patterned within the body via a multi-material, embedded 3D printing technique. The fluidic and elastomeric architectures required for function span several orders of magnitude from the microscale to the macroscale. Our integrated design and rapid fabrication approach enables the programmable assembly of multiple materials within this architecture, laying the foundation for completely soft, autonomous robots.

  20. An integrated design and fabrication strategy for entirely soft, autonomous robots.

    PubMed

    Wehner, Michael; Truby, Ryan L; Fitzgerald, Daniel J; Mosadegh, Bobak; Whitesides, George M; Lewis, Jennifer A; Wood, Robert J

    2016-08-25

    Soft robots possess many attributes that are difficult, if not impossible, to achieve with conventional robots composed of rigid materials. Yet, despite recent advances, soft robots must still be tethered to hard robotic control systems and power sources. New strategies for creating completely soft robots, including soft analogues of these crucial components, are needed to realize their full potential. Here we report the untethered operation of a robot composed solely of soft materials. The robot is controlled with microfluidic logic that autonomously regulates fluid flow and, hence, catalytic decomposition of an on-board monopropellant fuel supply. Gas generated from the fuel decomposition inflates fluidic networks downstream of the reaction sites, resulting in actuation. The body and microfluidic logic of the robot are fabricated using moulding and soft lithography, respectively, and the pneumatic actuator networks, on-board fuel reservoirs and catalytic reaction chambers needed for movement are patterned within the body via a multi-material, embedded 3D printing technique. The fluidic and elastomeric architectures required for function span several orders of magnitude from the microscale to the macroscale. Our integrated design and rapid fabrication approach enables the programmable assembly of multiple materials within this architecture, laying the foundation for completely soft, autonomous robots. PMID:27558065

  1. The experimental humanoid robot H7: a research platform for autonomous behaviour.

    PubMed

    Nishiwaki, Koichi; Kuffner, James; Kagami, Satoshi; Inaba, Masayuki; Inoue, Hirochika

    2007-01-15

    This paper gives an overview of the humanoid robot 'H7', which was developed over several years as an experimental platform for walking, autonomous behaviour and human interaction research at the University of Tokyo. H7 was designed to be a human-sized robot capable of operating autonomously in indoor environments designed for humans. The hardware is relatively simple to operate and conduct research on, particularly with respect to the hierarchical design of its control architecture. We describe the overall design goals and methodology, along with a summary of its online walking capabilities, autonomous vision-based behaviours and automatic motion planning. We show experimental results obtained by implementations running within a simulation environment as well as on the actual robot hardware. PMID:17148051

  2. The experimental humanoid robot H7: a research platform for autonomous behaviour.

    PubMed

    Nishiwaki, Koichi; Kuffner, James; Kagami, Satoshi; Inaba, Masayuki; Inoue, Hirochika

    2007-01-15

    This paper gives an overview of the humanoid robot 'H7', which was developed over several years as an experimental platform for walking, autonomous behaviour and human interaction research at the University of Tokyo. H7 was designed to be a human-sized robot capable of operating autonomously in indoor environments designed for humans. The hardware is relatively simple to operate and conduct research on, particularly with respect to the hierarchical design of its control architecture. We describe the overall design goals and methodology, along with a summary of its online walking capabilities, autonomous vision-based behaviours and automatic motion planning. We show experimental results obtained by implementations running within a simulation environment as well as on the actual robot hardware.

  3. A testbed for a unified teleoperated-autonomous dual-arm robotic system

    NASA Technical Reports Server (NTRS)

    Hayati, S.; Lee, T.; Tso, K.; Backes, P.; Lloyd, J.

    1990-01-01

    This paper describes a complete robot control facility built at the Jet Propulsion Laboratory as part of a NASA telerobotics program to develop a state-of-the-art robot control environment for laboratory-based space-like experiments. This system, which is now fully operational, has the following features: separation of the computing facilities into local and remote sites, autonomous motion generation in joint or Cartesian coordinates, dual-arm force-reflecting teleoperation with voice interaction between the operator and the robots, shared control between autonomously generated motions and operator-controlled teleoperation, and dual-arm coordinated trajectory generation. The system has been used to carry out realistic experiments such as the exchange of an Orbital Replacement Unit (ORU), bolt turning, and door opening, using a mixture of autonomous actions and teleoperation, with either a single arm or two cooperating arms.

  4. Autonomous navigation of a mobile robot using custom-designed qualitative reasoning VLSI chips and boards

    SciTech Connect

    Pin, F.G.; Pattay, R.S. ); Watanabe, H.; Symon, J. . Dept. of Computer Science)

    1991-01-01

    Two types of computer boards including custom-designed VLSI chips have been developed to add a qualitative reasoning capability to the real-time control of autonomous mobile robots. The design and operation of these boards are first described, and an example of their use for the autonomous navigation of a mobile robot is presented. The development of qualitative reasoning schemes emulating human-like navigation in a priori unknown environments is discussed. The efficiency of such schemes, which can consist of as little as a dozen qualitative rules, is illustrated in experiments involving an autonomous mobile robot navigating on the basis of very sparse, inaccurate sensor data. 17 refs., 6 figs.

  5. Using custom-designed VLSI fuzzy inferencing chips for the autonomous navigation of a mobile robot

    SciTech Connect

    Pin, F.G.; Pattay, R.S. ); Watanabe, Hiroyuki; Symon, J. . Dept. of Computer Science)

    1991-01-01

    Two types of computer boards including custom-designed VLSI fuzzy inferencing chips have been developed to add a qualitative reasoning capability to the real-time control of autonomous mobile robots. The design and operation of these boards are first described, and an example of their use for the autonomous navigation of a mobile robot is presented. The development of qualitative reasoning schemes emulating human-like navigation in a priori unknown environments is discussed. An approach using superposition of elemental sensor-based behaviors is shown to allow easy development and testing of the inferencing rule base, while providing for progressive addition of behaviors to resolve situations of increasing complexity. The efficiency of such schemes, which can consist of as little as a dozen qualitative rules, is illustrated in experiments involving an autonomous mobile robot navigating on the basis of very sparse and inaccurate sensor data. 17 refs., 6 figs.
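
    A rule base of the size mentioned here can be sketched compactly: each rule maps fuzzified range readings (e.g. "obstacle near on the left") to a turn recommendation, and the recommendations are blended by weighted averaging (a simple Sugeno-style defuzzification). The Python sketch below is a hypothetical illustration of that scheme under assumed memberships, rules and distances; it is not the VLSI rule set of the paper.

```python
# Tiny fuzzy navigation sketch: fuzzify range readings into "near"/"far"
# memberships, fire a handful of rules, and blend their turn commands by
# weighted average. Memberships, rules and numbers are illustrative.

def near(d, d_min=0.2, d_max=1.5):
    """Membership of 'obstacle is near' for a range reading d (metres)."""
    if d <= d_min:
        return 1.0
    if d >= d_max:
        return 0.0
    return (d_max - d) / (d_max - d_min)

def fuzzy_steering(left, front, right):
    far = lambda d: 1.0 - near(d)
    # (rule strength, recommended turn rate in rad/s; positive = turn left)
    rules = [
        (near(front) * far(left),  +0.8),   # blocked ahead, left clear: turn left
        (near(front) * far(right), -0.8),   # blocked ahead, right clear: turn right
        (near(left),               -0.4),   # close on the left: drift right
        (near(right),              +0.4),   # close on the right: drift left
        (far(front),                0.0),   # clear ahead: go straight
    ]
    w = sum(s for s, _ in rules)
    return sum(s * turn for s, turn in rules) / w if w > 0 else 0.0

# Obstacle ahead and to the left: the blended command is negative (steer right).
print(fuzzy_steering(left=0.5, front=0.4, right=1.8))
```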

  6. Autonomous undulatory serpentine locomotion utilizing body dynamics of a fluidic soft robot.

    PubMed

    Onal, Cagdas D; Rus, Daniela

    2013-06-01

    Soft robotics offers the unique promise of creating inherently safe and adaptive systems. These systems bring man-made machines closer to the natural capabilities of biological systems. An important requirement to enable self-contained soft mobile robots is an on-board power source. In this paper, we present an approach to create a bio-inspired soft robotic snake that can undulate in a similar way to its biological counterpart using pressure for actuation power, without human intervention. With this approach, we develop an autonomous soft snake robot with on-board actuation, power, computation and control capabilities. The robot consists of four bidirectional fluidic elastomer actuators in series to create a traveling curvature wave from head to tail along its body. Passive wheels between segments generate the necessary frictional anisotropy for forward locomotion. It takes 14 h to build the soft robotic snake, which can attain an average locomotion speed of 19 mm/s.
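
    Serpentine locomotion of this kind is commonly commanded as a traveling curvature wave along the body, kappa(s, t) = A sin(2*pi*f*t - k*s): each segment bends according to its position along the body with a fixed phase lag toward the tail. The sketch below is a generic illustration of that command law under assumed amplitude, frequency and phase lag; it is not the fluidic controller of the paper.

```python
import math

# Traveling-wave sketch for a segmented snake robot: each of N segments is
# commanded a bend angle sampled from kappa(s, t) = A*sin(2*pi*F*t - k*s).
# Amplitude, frequency and phase lag are illustrative values.

N_SEGMENTS = 4
A = 0.6                  # bend amplitude (rad)
F = 0.5                  # wave frequency (Hz)
PHASE_LAG = math.pi / 2  # phase difference between adjacent segments

def segment_commands(t):
    return [A * math.sin(2 * math.pi * F * t - i * PHASE_LAG)
            for i in range(N_SEGMENTS)]

# Example: per-segment bend commands over one second at 10 Hz.
for step in range(10):
    t = step * 0.1
    print([f"{angle:+.2f}" for angle in segment_commands(t)])
```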

  7. Integrated multi-sensor fusion for mapping and localization in outdoor environments for mobile robots

    NASA Astrophysics Data System (ADS)

    Emter, Thomas; Petereit, Janko

    2014-05-01

    An integrated multi-sensor fusion framework for localization and mapping for autonomous navigation in unstructured outdoor environments, based on extended Kalman filters (EKF), is presented. The sensors for localization include an inertial measurement unit, a GPS, a fiber optic gyroscope, and wheel odometry. Additionally, a 3D LIDAR is used for simultaneous localization and mapping (SLAM). A 3D map is built while, concurrently, a localization in the 2D map established so far is estimated from the current LIDAR scan. Despite the longer run-time of the SLAM algorithm compared to the EKF update, a high update rate is still guaranteed by carefully joining and synchronizing two parallel localization estimators.
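
    The kind of fusion described here can be illustrated with a compact extended Kalman filter: wheel odometry and gyro heading rate drive a nonlinear unicycle prediction, and GPS position fixes provide the correction. The sketch below is a generic 2D example with illustrative noise values and interfaces; it is not the framework reported in the record.

```python
import numpy as np

# Minimal EKF sketch for outdoor localization: predict with a unicycle model
# driven by odometry (v, omega), correct with GPS position fixes.
# Noise values and interfaces are illustrative assumptions.

class OdomGpsEKF:
    def __init__(self):
        self.x = np.zeros(3)                    # state: [x, y, heading]
        self.P = np.eye(3)
        self.Q = np.diag([0.02, 0.02, 0.01])    # process noise (assumed)
        self.R = np.diag([1.0, 1.0])            # GPS noise (assumed, m^2)

    def predict(self, v, omega, dt):
        th = self.x[2]
        self.x += np.array([v * np.cos(th) * dt, v * np.sin(th) * dt, omega * dt])
        F = np.array([[1.0, 0.0, -v * np.sin(th) * dt],   # Jacobian of the motion model
                      [0.0, 1.0,  v * np.cos(th) * dt],
                      [0.0, 0.0, 1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def correct_gps(self, gps_xy):
        H = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])  # GPS observes x, y
        y = np.asarray(gps_xy) - H @ self.x
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(3) - K @ H) @ self.P
```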

  8. Autonomous discovery and learning by a mobile robot in unstructured environments

    SciTech Connect

    Pin, F.G.; de Saussure, G.; Spelt, P.F.; Barnett, D.L.; Killough, S.M.; Weisbin, C.R.

    1988-01-01

    This paper presents recent research activities at the Center for Engineering Systems Advanced Research (CESAR) in the area of autonomous discovery and learning of emergency and maintenance tasks in unstructured environments by a mobile robot. The methodologies for learning basic operating principles of control devices, and for using the acquired knowledge to solve new problems with conditions not encountered before are presented. The algorithms necessary for the robot to discover problem-solving sequences of actions, through experimentation with the environment, in the two cases of immediate feedback and delayed feedback are described. The inferencing schemes allowing the robot to classify the information acquired from a reduced set of examples and to generalize its knowledge to a much wider problem-solving domain are also provided. A demonstration of the successful implementation of the algorithms on our HERMIES-IIB autonomous robot is then presented. 8 refs., 2 figs.

  9. An architectural approach to create self organizing control systems for practical autonomous robots

    NASA Technical Reports Server (NTRS)

    Greiner, Helen

    1991-01-01

    For practical industrial applications, the development of trainable robots is an important and immediate objective. Therefore, the development of flexible intelligence directly applicable to training is emphasized. It is generally agreed upon by the AI community that the fusion of expert systems, neural networks, and conventionally programmed modules (e.g., a trajectory generator) is promising in the quest for autonomous robotic intelligence. Autonomous robot development is hindered by integration and architectural problems. Some obstacles to the construction of more general robot control systems are as follows: (1) the growth problem; (2) software generation; (3) interaction with the environment; (4) reliability; and (5) resource limitation. Neural networks can be successfully applied to some of these problems. However, current implementations of neural networks are hampered by the resource limitation problem and must be trained extensively to produce computationally accurate output. A generalization of conventional neural nets is proposed, and an architecture is offered in an attempt to address the above problems.

  10. Research and development of Ro-boat: an autonomous river cleaning robot

    NASA Astrophysics Data System (ADS)

    Sinha, Aakash; Bhardwaj, Prashant; Vaibhav, Bipul; Mohommad, Noor

    2013-12-01

    Ro-boat is an autonomous river-cleaning intelligent robot that combines mechanical design and computer vision algorithms to achieve autonomous river cleaning and provide a sustainable environment. Ro-boat is designed in a modular fashion, with design details such as mechanical structural design, hydrodynamic design and vibration analysis. It incorporates a stable mechanical system with air and water propulsion, robotic arms and a solar energy source, and it is made autonomous by using computer vision. Both HSV color space features and SURF features are proposed as measurements for a Kalman filter, resulting in extremely robust pollutant tracking. The system has been tested with successful results in the Yamuna River in New Delhi. We foresee that a system of Ro-boats working autonomously 24x7 could clean a major river in a city in about six months, which is unmatched by alternative methods of river cleaning.
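
    Color-based pollutant tracking of the kind outlined here typically thresholds each frame in HSV space and feeds the resulting blob centroid to the Kalman filter as a measurement. The OpenCV sketch below is a hedged illustration of that measurement stage; the HSV bounds and camera source are assumptions, and the SURF keypoint stage is omitted.

```python
import cv2
import numpy as np

# Sketch of the color-measurement stage: threshold the frame in HSV space,
# take the centroid of the detected blob, and use it as the Kalman filter
# measurement. HSV bounds and camera index are illustrative assumptions.

LOWER = np.array([20, 80, 80])      # hypothetical HSV lower bound
UPPER = np.array([35, 255, 255])    # hypothetical HSV upper bound

def pollutant_centroid(frame):
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None                             # nothing detected this frame
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

# cap = cv2.VideoCapture(0)
# ok, frame = cap.read()
# if ok:
#     z = pollutant_centroid(frame)   # feed z to the Kalman filter update step
```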

  11. Autonomous Mobile Platform for Research in Cooperative Robotics

    NASA Technical Reports Server (NTRS)

    Daemi, Ali; Pena, Edward; Ferguson, Paul

    1998-01-01

    This paper describes the design and development of a platform for research in cooperative mobile robotics. The structure and mechanics of the vehicles are based on R/C cars. The vehicle is rendered mobile by a DC motor and servo motor. The perception of the robot's environment is achieved using IR sensors and a central vision system. A laptop computer processes images from a CCD camera located above the testing area to determine the position of objects in sight. This information is sent to each robot via RF modem. Each robot is operated by a Motorola 68HC11E micro-controller, and all actions of the robots are realized through the connections of IR sensors, modem, and motors. The intelligent behavior of each robot is based on a hierarchical fuzzy-rule based approach.

  12. LABRADOR: a learning autonomous behavior-based robot for adaptive detection and object retrieval

    NASA Astrophysics Data System (ADS)

    Yamauchi, Brian; Moseley, Mark; Brookshire, Jonathan

    2013-01-01

    As part of the TARDEC-funded CANINE (Cooperative Autonomous Navigation in a Networked Environment) Program, iRobot developed LABRADOR (Learning Autonomous Behavior-based Robot for Adaptive Detection and Object Retrieval). LABRADOR was based on the rugged, man-portable, iRobot PackBot unmanned ground vehicle (UGV) equipped with an explosives ordnance disposal (EOD) manipulator arm and a custom gripper. For LABRADOR, we developed a vision-based object learning and recognition system that combined a TLD (track-learn-detect) filter based on object shape features with a color-histogram-based object detector. Our vision system was able to learn in real-time to recognize objects presented to the robot. We also implemented a waypoint navigation system based on fused GPS, IMU (inertial measurement unit), and odometry data. We used this navigation capability to implement autonomous behaviors capable of searching a specified area using a variety of robust coverage strategies - including outward spiral, random bounce, random waypoint, and perimeter following behaviors. While the full system was not integrated in time to compete in the CANINE competition event, we developed useful perception, navigation, and behavior capabilities that may be applied to future autonomous robot systems.

  13. Autonomous Navigation, Dynamic Path and Work Flow Planning in Multi-Agent Robotic Swarms Project

    NASA Technical Reports Server (NTRS)

    Falker, John; Zeitlin, Nancy; Leucht, Kurt; Stolleis, Karl

    2015-01-01

    Kennedy Space Center has teamed up with the Biological Computation Lab at the University of New Mexico to create a swarm of small, low-cost, autonomous robots, called Swarmies, to be used as a ground-based research platform for in-situ resource utilization missions. The behavior of the robot swarm mimics the central-place foraging strategy of ants to find and collect resources in an unknown environment and return those resources to a central site.
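
    Ant-inspired central-place foraging is often implemented as a small state machine per robot: search with a random or spiral walk, pick up a detected resource, return to the central nest, drop off, and repeat. The sketch below is a hypothetical illustration of that loop with placeholder robot calls; it is not the Swarmies code base.

```python
# Sketch of a central-place foraging state machine (search -> return -> repeat).
# The robot's sensing and motion helpers are hypothetical placeholders.

NEST = (0.0, 0.0)

def forage_step(state, robot):
    if state == "SEARCH":
        robot.drive_random()                 # e.g. a correlated random walk
        if robot.resource_detected():
            robot.pick_up()
            return "RETURN"
    elif state == "RETURN":
        robot.drive_towards(NEST)
        if robot.at(NEST):
            robot.drop_off()
            return "SEARCH"
    return state

# state = "SEARCH"
# while True:
#     state = forage_step(state, robot)      # 'robot' is a hypothetical interface
```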

  14. Introduction to autonomous mobile robotics using Lego Mindstorms NXT

    NASA Astrophysics Data System (ADS)

    Akın, H. Levent; Meriçli, Çetin; Meriçli, Tekin

    2013-12-01

    Teaching the fundamentals of robotics to computer science undergraduates requires designing a well-balanced curriculum that is complemented with hands-on applications on a platform that allows rapid construction of complex robots, and implementation of sophisticated algorithms. This paper describes such an elective introductory course where the Lego Mindstorms NXT kits are used as the robot platform. The aims, scope and contents of the course are presented, and the design of the laboratory sessions as well as the term projects, which address several core problems of robotics and artificial intelligence simultaneously, are explained in detail.

  15. Robust performance of multiple tasks by an autonomous robot

    SciTech Connect

    Beckerman, M.; Barnett, D.L.; Einstein, R.; Jones, J.P.; Spelt, P.D.; Weisbin, C.R.

    1989-01-01

    There have been many successful mobile robot experiments, but very few papers have appeared that examine the range of applicability, or robustness, of a robot system. The purpose of this paper is to determine and quantify robustness of the Hermies-IIB experimental capabilities. 6 refs., 1 tab.

  16. Autonomous Motion Learning for Intra-Vehicular Activity Space Robot

    NASA Astrophysics Data System (ADS)

    Watanabe, Yutaka; Yairi, Takehisa; Machida, Kazuo

    Space robots will be needed in future space missions. Many types of space robots have been developed so far, but Intra-Vehicular Activity (IVA) space robots that support human activities should, in particular, be developed to reduce human risks in space. In this paper, we study a motion learning method for an IVA space robot with a multi-link mechanism. The advantage is that this space robot moves using the reaction forces of the multi-link mechanism and contact forces from the wall, like an astronaut space walking, rather than using propulsion. The control approach is based on reinforcement learning with the actor-critic algorithm. We demonstrate the effectiveness of this approach using a 5-link space robot model in simulation. First, we simulate the space robot learning motion control, including a contact phase, in the two-dimensional case. Next, we simulate the space robot learning motion control while changing base attitude in the three-dimensional case.

  17. Introduction to Autonomous Mobile Robotics Using "Lego Mindstorms" NXT

    ERIC Educational Resources Information Center

    Akin, H. Levent; Meriçli, Çetin; Meriçli, Tekin

    2013-01-01

    Teaching the fundamentals of robotics to computer science undergraduates requires designing a well-balanced curriculum that is complemented with hands-on applications on a platform that allows rapid construction of complex robots, and implementation of sophisticated algorithms. This paper describes such an elective introductory course where the…

  18. Autonomous Mobile Robot Navigation Using Harmonic Potential Field

    NASA Astrophysics Data System (ADS)

    Panati, Subbash; Baasandorj, Bayanjargal; Chong, Kil To

    2015-05-01

    Mobile robot navigation has been an area of robotics which has gained massive attention among researchers in the robotics community. Path planning and obstacle avoidance are the key aspects of mobile robot navigation. This paper presents a harmonic potential field based navigation algorithm for mobile robots. The harmonic potential field method overcomes the issue of local minima, which was a major bottleneck of the artificial potential field method. The harmonic potential field is calculated using harmonic functions, and Dirichlet boundary conditions are used for the obstacles, goal and initial position. The simulation results show that the proposed method is able to overcome the local minima issue and navigate successfully from the initial position to the goal without colliding with obstacles in a static environment.
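
    A harmonic potential field is a solution of Laplace's equation over the free space with Dirichlet values fixed high on obstacles and low at the goal; because harmonic functions have no interior extrema, gradient descent on the field cannot get trapped in spurious local minima. The sketch below is a small grid-based illustration using Jacobi relaxation; the map, boundary values and iteration count are assumptions, not the paper's setup.

```python
import numpy as np

# Grid-based harmonic potential sketch: fix Dirichlet values on obstacles and
# the border (1.0) and at the goal (0.0), relax the interior with Jacobi
# iterations so it satisfies Laplace's equation, then descend to the goal.

def harmonic_field(obstacles, goal, shape=(20, 20), iters=2000):
    u = np.ones(shape)
    fixed = np.zeros(shape, dtype=bool)
    for (r, c) in obstacles:
        u[r, c], fixed[r, c] = 1.0, True            # obstacles: high potential
    u[goal], fixed[goal] = 0.0, True                # goal: low potential
    u[0, :] = u[-1, :] = u[:, 0] = u[:, -1] = 1.0   # domain border: high
    fixed[0, :] = fixed[-1, :] = fixed[:, 0] = fixed[:, -1] = True
    for _ in range(iters):                          # Jacobi relaxation
        avg = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                      np.roll(u, 1, 1) + np.roll(u, -1, 1))
        u = np.where(fixed, u, avg)
    return u

def descend(u, start):
    path, pos = [start], start
    for _ in range(400):
        r, c = pos
        nbrs = [(r + dr, c + dc) for dr, dc in [(1, 0), (-1, 0), (0, 1), (0, -1)]]
        pos = min(nbrs, key=lambda p: u[p])         # step to the lowest neighbour
        path.append(pos)
        if u[pos] == 0.0:
            break
    return path

u = harmonic_field(obstacles=[(10, j) for j in range(5, 15)], goal=(18, 18))
print(descend(u, start=(1, 1))[-1])                 # ends at (or next to) the goal
```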

  19. Development of the Research Platform of Small Autonomous Blimp Robot

    NASA Astrophysics Data System (ADS)

    Takaya, Toshihiko; Kawamura, Hidenori; Yamamoto, Masahito; Ohuchi, Azuma

    A blimp robot is attractive as a small flight robot: it floats by buoyancy, is safe in the event of a crash, consumes little energy, and can operate for a long time compared with other flight robots. However, controlling a blimp robot is difficult because of its nonlinear characteristics and its sensitivity to inertia and air flow. Applied research that makes the most of these features has therefore become active in recent years. In this paper, we describe the development of a general-purpose research blimp robot, built by dividing the body into exchangeable units, to support both blimp-robot research and application development. By developing such a general-purpose research platform, the research efficiency of many researchers can be improved, starting research on blimp robots becomes easier, and the development of the field is advanced. We performed the following experiments as proof: (1) we checked the basic position-keeping performance and confirmed that various orbital operations were possible, and the ease of exchanging a software unit was checked by an experiment that swapped the control layer from PID control to learning control and compared the resulting operation; (2) to check the ease of exchanging a hardware unit, the sensor was exchanged from a camera to a microphone and control operation was checked; (3) to check the ease of adding a unit, a microphone for sound detection was added alongside the camera for image detection and control operation was verified; (4) to check the ease of adding a function, a topological map generation experiment was conducted by adding an ultrasonic sensor. The developed research blimp robot thus provides this ease of unit exchange

  20. Motor-response learning at a process control panel by an autonomous robot

    SciTech Connect

    Spelt, P.F.; de Saussure, G.; Lyness, E.; Pin, F.G.; Weisbin, C.R.

    1988-01-01

    The Center for Engineering Systems Advanced Research (CESAR) was founded at Oak Ridge National Laboratory (ORNL) by the Department of Energy's Office of Energy Research/Division of Engineering and Geoscience (DOE-OER/DEG) to conduct basic research in the area of intelligent machines. Researchers at the CESAR Laboratory are therefore engaged in a variety of research activities in the field of machine learning. In this paper, we describe our approach to a class of machine learning which involves motor response acquisition using feedback from trial-and-error learning. Our formulation is being experimentally validated using an autonomous robot learning tasks of control panel monitoring and manipulation for effective process control. The CLIPS expert system and the associated knowledge base used by the robot in the learning process, which reside in a hypercube computer aboard the robot, are described in detail. Benchmark testing of the learning process on a robot/control panel simulation system consisting of two intercommunicating computers is presented, along with results of sample problems used to train and test the expert system. These data illustrate machine learning and the resulting performance improvement in the robot for problems similar to, but not identical with, those on which the robot was trained. Conclusions are drawn concerning the learning problems, and implications for future work on machine learning for autonomous robots are discussed. 16 refs., 4 figs., 1 tab.

  1. A Prototype Novel Sensor for Autonomous, Space Based Robots - Phase 2

    NASA Technical Reports Server (NTRS)

    Squillante, M. R.; Derochemont, L. P.; Cirignano, L.; Lieberman, P.; Soller, M. S.

    1990-01-01

    The goal of this program was to develop new sensing capabilities for autonomous robots operating in space. Information gained by the robot using these new capabilities would be combined with other information gained through more traditional capabilities, such as video, to help the robot characterize its environment as well as to identify known or unknown objects that it encounters. Several sensing capabilities using nuclear radiation detectors and backscatter technology were investigated. The result of this research has been the construction and delivery to NASA of a prototype system with three capabilities for use by autonomous robots. The primary capability is the use of beta particle backscatter measurements to determine the average atomic number (Z) of an object. This gives the robot a powerful tool to differentiate objects which may look the same, such as objects made of different plastics or other lightweight materials. In addition, the same nuclear sensor used in the backscatter measurement can be used as a nuclear spectrometer to identify sources of nuclear radiation that may be encountered by the robot, such as nuclear-powered satellites. A complete nuclear analysis system is included in the software and hardware of the prototype system built in Phase 2 of this effort. Finally, a method to estimate the radiation dose in the environment of the robot has been included as a third capability. Again, the same nuclear sensor is used in a different operating mode and with different analysis software. Each of these capabilities is described.

  2. Robot soccer anywhere: achieving persistent autonomous navigation, mapping, and object vision tracking in dynamic environments

    NASA Astrophysics Data System (ADS)

    Dragone, Mauro; O'Donoghue, Ruadhan; Leonard, John J.; O'Hare, Gregory; Duffy, Brian; Patrikalakis, Andrew; Leederkerken, Jacques

    2005-06-01

    The paper describes an ongoing effort to enable autonomous mobile robots to play soccer in unstructured, everyday environments. Unlike conventional robot soccer competitions that are usually held on purpose-built robot soccer "fields", in our work we seek to develop the capability for robots to demonstrate aspects of soccer-playing in more diverse environments, such as schools, hospitals, or shopping malls, with static obstacles (furniture) and dynamic natural obstacles (people). This problem of "Soccer Anywhere" presents numerous research challenges including: (1) Simultaneous Localization and Mapping (SLAM) in dynamic, unstructured environments, (2) software control architectures for decentralized, distributed control of mobile agents, (3) integration of vision-based object tracking with dynamic control, and (4) social interaction with human participants. In addition to the intrinsic research merit of these topics, we believe that this capability would prove useful for outreach activities, in demonstrating robotics technology to primary and secondary school students, to motivate them to pursue careers in science and engineering.

  3. Autonomous Soft Robotic Fish Capable of Escape Maneuvers Using Fluidic Elastomer Actuators

    PubMed Central

    Onal, Cagdas D.; Rus, Daniela

    2014-01-01

    In this work we describe an autonomous soft-bodied robot that is both self-contained and capable of rapid, continuum-body motion. We detail the design, modeling, fabrication, and control of the soft fish, focusing on enabling the robot to perform rapid escape responses. The robot employs a compliant body with embedded actuators emulating the slender anatomical form of a fish. In addition, the robot has a novel fluidic actuation system that drives body motion and has all the subsystems of a traditional robot onboard: power, actuation, processing, and control. At the core of the fish's soft body is an array of fluidic elastomer actuators. We design the fish to emulate escape responses in addition to forward swimming because such maneuvers require rapid body accelerations and continuum-body motion. These maneuvers showcase the performance capabilities of this self-contained robot. The kinematics and controllability of the robot during simulated escape response maneuvers are analyzed and compared with studies on biological fish. We show that during escape responses, the soft-bodied robot has similar input–output relationships to those observed in biological fish. The major implication of this work is that we show soft robots can be both self-contained and capable of rapid body motion. PMID:27625912

  4. Autonomous Soft Robotic Fish Capable of Escape Maneuvers Using Fluidic Elastomer Actuators

    PubMed Central

    Onal, Cagdas D.; Rus, Daniela

    2014-01-01

    In this work we describe an autonomous soft-bodied robot that is both self-contained and capable of rapid, continuum-body motion. We detail the design, modeling, fabrication, and control of the soft fish, focusing on enabling the robot to perform rapid escape responses. The robot employs a compliant body with embedded actuators emulating the slender anatomical form of a fish. In addition, the robot has a novel fluidic actuation system that drives body motion and has all the subsystems of a traditional robot onboard: power, actuation, processing, and control. At the core of the fish's soft body is an array of fluidic elastomer actuators. We design the fish to emulate escape responses in addition to forward swimming because such maneuvers require rapid body accelerations and continuum-body motion. These maneuvers showcase the performance capabilities of this self-contained robot. The kinematics and controllability of the robot during simulated escape response maneuvers are analyzed and compared with studies on biological fish. We show that during escape responses, the soft-bodied robot has similar input–output relationships to those observed in biological fish. The major implication of this work is that we show soft robots can be both self-contained and capable of rapid body motion.

  5. Engineering Education to Synthesized Ability by Designing and Producing Autonomous Robot

    NASA Astrophysics Data System (ADS)

    Sakabe, Toshiya; Fukuda, Kazuhiro; Amano, Yuji; Michishita, Takahiro

    This paper presents the results of a lesson on designing and producing an autonomous robot, intended to nurture design ability and teamwork. Results of a student questionnaire on self-achievement show that this school lesson is useful for cultivating students' creativity.

  6. Adaptive artificial neural network for autonomous robot control

    NASA Technical Reports Server (NTRS)

    Arras, Michael K.; Protzel, Peter W.; Palumbo, Daniel L.

    1992-01-01

    The topics are presented in viewgraph form and include: neural network controller for robot arm positioning with visual feedback; initial training of the arm; automatic recovery from cumulative fault scenarios; and error reduction by iterative fine movements.

  7. Detection of Water Hazards for Autonomous Robotic Vehicles

    NASA Technical Reports Server (NTRS)

    Matthes, Larry; Belluta, Paolo; McHenry, Michael

    2006-01-01

    Four methods of detection of bodies of water are under development as means to enable autonomous robotic ground vehicles to avoid water hazards when traversing off-road terrain. The methods involve processing of digitized outputs of optoelectronic sensors aboard the vehicles. It is planned to implement these methods in hardware and software that would operate in conjunction with the hardware and software for navigation and for avoidance of solid terrain obstacles and hazards. The first method, intended for use during the day, is based on the observation that, under most off-road conditions, reflections of sky from water are easily discriminated from the adjacent terrain by their color and brightness, regardless of the weather and of the state of surface waves on the water. Accordingly, this method involves collection of color imagery by a video camera and processing of the image data by an algorithm that classifies each pixel as soil, water, or vegetation according to its color and brightness values (see figure). Among the issues that arise is the fact that in the presence of reflections of objects on the opposite shore, it is difficult to distinguish water by color and brightness alone. Another issue is that once a body of water has been identified by means of color and brightness, its boundary must be mapped for use in navigation. Techniques for addressing these issues are under investigation. The second method, which is not limited by time of day, is based on the observation that ladar returns from bodies of water are usually too weak to be detected. In this method, ladar scans of the terrain are analyzed for returns and the absence thereof. In appropriate regions, the presence of water can be inferred from the absence of returns. Under some conditions in which reflections from the bottom are detectable, ladar returns could, in principle, be used to determine depth. The third method involves the recognition of bodies of water as dark areas in short
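
    A minimal sketch of the first method's per-pixel color/brightness classification is given below; it is illustrative only, with thresholds and class rules chosen arbitrarily rather than taken from the work described above.

        import numpy as np

        def classify_terrain(rgb):
            """Label each pixel of an RGB image (H x W x 3, floats in [0, 1]) as
            0 = soil, 1 = water, 2 = vegetation using crude color/brightness rules.
            Thresholds are illustrative placeholders only."""
            r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
            brightness = rgb.mean(axis=-1)
            labels = np.zeros(rgb.shape[:2], dtype=np.uint8)    # default: soil
            water = (b > r) & (b > g) & (brightness > 0.35)     # bright, blue-dominant sky reflection
            veg = (g > r) & (g > b) & ~water                    # green-dominant
            labels[water] = 1
            labels[veg] = 2
            return labels

        # Tiny synthetic frame: left half "sky reflected in water", right half "vegetation".
        img = np.zeros((2, 4, 3))
        img[:, :2] = [0.5, 0.6, 0.9]
        img[:, 2:] = [0.2, 0.5, 0.2]
        print(classify_terrain(img))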

  8. A simple, inexpensive, and effective implementation of a vision-guided autonomous robot

    NASA Astrophysics Data System (ADS)

    Tippetts, Beau; Lillywhite, Kirt; Fowers, Spencer; Dennis, Aaron; Lee, Dah-Jye; Archibald, James

    2006-10-01

    This paper discusses a simple, inexpensive, and effective implementation of a vision-guided autonomous robot. This implementation is Brigham Young University students' second-year entry in the Intelligent Ground Vehicle Competition. The objective of the robot was to navigate a course constructed of white boundary lines and orange obstacles for the autonomous competition. A second-hand electric wheelchair, purchased from a local thrift store for $28, served as the robot base. The base was modified to include Kegresse tracks using a friction drum system; this modification allowed the robot to perform better on a variety of terrains, resolving issues with last year's design. To control the wheelchair while retaining its robust motor controls, the joystick was simply removed and replaced with a printed circuit board that emulated joystick operation and could receive commands through a serial port connection. Three different algorithms were implemented and compared: a purely reactive approach, a potential fields approach, and a machine learning approach. Each algorithm used color segmentation to interpret data from a digital camera and identify the features of the course. This paper will be useful to those interested in implementing an inexpensive vision-based autonomous robot.
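
    The joystick-emulation idea lends itself to a short sketch. The serial frame format, neutral value, and port name below are assumptions for illustration (the actual board's protocol is not documented here); the example uses the pyserial package.

        import serial   # pyserial; port name and byte protocol below are assumptions

        def send_drive_command(ser, forward, turn):
            """Send one joystick-emulation frame to the controller board.
            forward and turn are in [-1.0, 1.0], scaled to bytes around an assumed
            neutral value of 127; the header/checksum layout is hypothetical."""
            fwd = max(0, min(255, int(127 + forward * 127)))
            trn = max(0, min(255, int(127 + turn * 127)))
            ser.write(bytes([0xAA, fwd, trn, (fwd + trn) & 0xFF]))

        if __name__ == "__main__":
            with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as ser:
                send_drive_command(ser, forward=0.5, turn=0.0)   # half speed, straight
                send_drive_command(ser, forward=0.0, turn=0.0)   # stop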

  9. Autonomous robot for detecting subsurface voids and tunnels using microgravity

    NASA Astrophysics Data System (ADS)

    Wilson, Stacy S.; Crawford, Nicholas C.; Croft, Leigh Ann; Howard, Michael; Miller, Stephen; Rippy, Thomas

    2006-05-01

    Tunnels have been used to evade the security of defensive positions, in times of both war and peace, for hundreds of years. Tunnels are presently being built under the Mexican border by drug smugglers and possibly terrorists. Several have been discovered at the border crossing at Nogales near Tucson, Arizona, along with others at other border towns. During this war on terror, tunnels under the Mexican border pose a significant threat to the security of the United States. It is also possible that terrorists will attempt to tunnel under strategic buildings and possibly discharge explosives. The Center for Cave and Karst Study (CCKS) at Western Kentucky University has a long and successful history of determining the location of caves and subsurface voids using microgravity technology. Currently, the CCKS is developing a remotely controlled robot which will be used to locate voids underground. The robot will be a remotely controlled vehicle that will use microgravity and GPS to accurately detect and measure voids below the surface. It is hoped that this robot will also be used in military applications to locate other types of underground voids such as tunnels and bunkers. It is anticipated that the robot will be able to function up to a mile from the operator. This paper describes the construction of the robot and the use of microgravity technology to locate subsurface voids with the robot.

  10. Challenging of path planning algorithms for autonomous robot in known environment

    NASA Astrophysics Data System (ADS)

    Farah, R. N.; Irwan, N.; Zuraida, Raja Lailatul; Shaharum, Umairah; Hanafi@Omar, Hafiz Mohd

    2014-06-01

    Most mobile robot path planning aims to reach a predetermined goal along the shortest path while avoiding obstacles. This paper surveys path planning algorithms from current research and existing Unmanned Ground Vehicle (UGV) systems, and the challenges they face in becoming intelligent autonomous robots. The focus is a set of short reviews of individual papers on UGVs in known environments. Methods and algorithms for autonomous robot path planning are discussed. From the reviews, we find that the proposed algorithms are appropriate for particular cases, such as single or multiple obstacles, static or moving obstacles, and optimal shortest paths. The paper also describes the pros and cons of each reviewed approach, pointing toward algorithm improvements in further work.
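
    As one concrete example of the kind of algorithm such surveys cover, the sketch below implements standard A* shortest-path planning on a known occupancy grid; the grid, costs, and heuristic are illustrative and are not taken from any of the reviewed papers.

        import heapq

        def astar(grid, start, goal):
            """A* shortest path on a 4-connected occupancy grid (0 = free, 1 = obstacle).
            Returns the list of cells from start to goal, or None if unreachable."""
            rows, cols = len(grid), len(grid[0])
            h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])   # Manhattan heuristic
            open_heap = [(h(start), 0, start)]
            came_from, g_cost = {start: None}, {start: 0}
            while open_heap:
                f, g, cur = heapq.heappop(open_heap)
                if cur == goal:                      # reconstruct the path
                    path = []
                    while cur is not None:
                        path.append(cur)
                        cur = came_from[cur]
                    return path[::-1]
                if g > g_cost[cur]:                  # stale heap entry
                    continue
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nxt = (cur[0] + dr, cur[1] + dc)
                    if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                            and grid[nxt[0]][nxt[1]] == 0 and g + 1 < g_cost.get(nxt, float("inf"))):
                        g_cost[nxt] = g + 1
                        came_from[nxt] = cur
                        heapq.heappush(open_heap, (g + 1 + h(nxt), g + 1, nxt))
            return None

        grid = [[0, 0, 0, 0],
                [1, 1, 0, 1],
                [0, 0, 0, 0]]
        print(astar(grid, (0, 0), (2, 0)))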

  11. Development of an Interactive Augmented Environment and Its Application to Autonomous Learning for Quadruped Robots

    NASA Astrophysics Data System (ADS)

    Kobayashi, Hayato; Osaki, Tsugutoyo; Okuyama, Tetsuro; Gramm, Joshua; Ishino, Akira; Shinohara, Ayumi

    This paper describes an interactive experimental environment for autonomous soccer robots, which is a soccer field augmented by utilizing camera input and projector output. This environment, in a sense, plays an intermediate role between simulated environments and real environments. We can simulate some parts of real environments, e.g., real objects such as robots or a ball, and reflect simulated data into the real environments, e.g., to visualize positions on the field, so as to create a situation that allows easy debugging of robot programs. The significant difference from analogous work is that, owing to the projectors, virtual objects are touchable in this system. We also show a portable version of our system that does not require ceiling cameras. As an application in the augmented environment, we address the learning of goalie strategies on real quadruped robots in penalty kicks. We make our robots utilize virtual balls in order to perform only quadruped locomotion in real environments, which is quite difficult to simulate accurately. In our augmented environment, our robots autonomously learn and acquire more beneficial strategies, without human intervention, than they do in a fully simulated environment.

  12. Terrain coverage of an unknown room by an autonomous mobile robot

    SciTech Connect

    VanderHeide, J.R.

    1995-12-05

    Terrain coverage problems are nearly as old as mankind: they were necessary early in our history for basic activities such as finding food and other necessities. As our societies and their associated machineries have grown more complex, we have not outgrown the need for this primitive skill. It is still used on a small scale for cleaning tasks and on a large scale for "search and report" missions of various kinds. The motivation for automating this process may not lie in the novelty of anything we might gain as an end product, but in freedom from something which we as humans find tedious, time-consuming and sometimes dangerous. Here we consider autonomous coverage of a terrain, typically indoor rooms, by a mobile robot that has no a priori model of the terrain. In evaluating its surroundings, the robot employs only inexpensive and commercially available ultrasonic and infrared sensors. The proposed solution is a basic step - a proof of principle - that can contribute to robots capable of autonomously performing tasks such as vacuum cleaning, mopping, radiation scanning, etc. The area of automatic terrain coverage and the closely related problem of terrain model acquisition have been studied both analytically and experimentally. Compared to the existing works, the following are three major distinguishing aspects of our study: (1) the theory is actually applied to an existing robot, (2) the robot has no a priori knowledge of the terrain, and (3) the robot can be realized relatively inexpensively.
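
    For illustration only, the sketch below performs depth-first complete coverage of the free cells of a known occupancy grid; the thesis itself addresses the harder case of unknown terrain sensed with ultrasonic and infrared sensors, so this is a simplified stand-in, not the reported method.

        def dfs_coverage(grid, start):
            """Depth-first complete coverage of the free cells of a *known* occupancy
            grid (0 = free, 1 = obstacle); returns the ordered route, including the
            cells re-entered while backtracking."""
            rows, cols = len(grid), len(grid[0])
            visited, route, stack = {start}, [start], [start]
            while stack:
                r, c = stack[-1]
                for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                    if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                            and grid[nxt[0]][nxt[1]] == 0 and nxt not in visited):
                        visited.add(nxt)
                        route.append(nxt)            # move forward into a new cell
                        stack.append(nxt)
                        break
                else:
                    stack.pop()                      # dead end: backtrack
                    if stack:
                        route.append(stack[-1])
            return route

        room = [[0, 0, 0],
                [0, 1, 0],
                [0, 0, 0]]
        print(dfs_coverage(room, (0, 0)))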

  13. Evaluation of a Home Biomonitoring Autonomous Mobile Robot.

    PubMed

    Dorronzoro Zubiete, Enrique; Nakahata, Keigo; Imamoglu, Nevrez; Sekine, Masashi; Sun, Guanghao; Gomez, Isabel; Yu, Wenwei

    2016-01-01

    Increasing population age demands more services in the healthcare domain. It has been shown that mobile robots could be a potential solution to home biomonitoring for the elderly. In our previous studies, a mobile robot system that is able to track a subject and identify his daily living activities was developed. However, the system had not been tested in any home living scenario. In this study we performed a series of experiments to investigate the accuracy of activity recognition of the mobile robot in a home living scenario. The daily activities tested in the evaluation experiment include watching TV and sleeping. A dataset recorded by a distributed distance-measuring sensor network was used as a reference for the activity recognition results. The accuracy was not consistent across activities; that is, the mobile robot could achieve a high success rate for some activities but a poor one for others. The observation position of the mobile robot and the subject's surroundings were found to have a high impact on the accuracy of activity recognition, due to the variability of home living daily activities and their transitional processes. The possibility of improving recognition accuracy is also shown. PMID:27212940

  14. Evaluation of a Home Biomonitoring Autonomous Mobile Robot.

    PubMed

    Dorronzoro Zubiete, Enrique; Nakahata, Keigo; Imamoglu, Nevrez; Sekine, Masashi; Sun, Guanghao; Gomez, Isabel; Yu, Wenwei

    2016-01-01

    Increasing population age demands more services in the healthcare domain. It has been shown that mobile robots could be a potential solution to home biomonitoring for the elderly. In our previous studies, a mobile robot system that is able to track a subject and identify his daily living activities was developed. However, the system had not been tested in any home living scenario. In this study we performed a series of experiments to investigate the accuracy of activity recognition of the mobile robot in a home living scenario. The daily activities tested in the evaluation experiment include watching TV and sleeping. A dataset recorded by a distributed distance-measuring sensor network was used as a reference for the activity recognition results. The accuracy was not consistent across activities; that is, the mobile robot could achieve a high success rate for some activities but a poor one for others. The observation position of the mobile robot and the subject's surroundings were found to have a high impact on the accuracy of activity recognition, due to the variability of home living daily activities and their transitional processes. The possibility of improving recognition accuracy is also shown.

  15. Evaluation of a Home Biomonitoring Autonomous Mobile Robot

    PubMed Central

    Dorronzoro Zubiete, Enrique; Nakahata, Keigo; Imamoglu, Nevrez; Sekine, Masashi; Sun, Guanghao; Gomez, Isabel; Yu, Wenwei

    2016-01-01

    Increasing population age demands more services in the healthcare domain. It has been shown that mobile robots could be a potential solution to home biomonitoring for the elderly. In our previous studies, a mobile robot system that is able to track a subject and identify his daily living activities was developed. However, the system had not been tested in any home living scenario. In this study we performed a series of experiments to investigate the accuracy of activity recognition of the mobile robot in a home living scenario. The daily activities tested in the evaluation experiment include watching TV and sleeping. A dataset recorded by a distributed distance-measuring sensor network was used as a reference for the activity recognition results. The accuracy was not consistent across activities; that is, the mobile robot could achieve a high success rate for some activities but a poor one for others. The observation position of the mobile robot and the subject's surroundings were found to have a high impact on the accuracy of activity recognition, due to the variability of home living daily activities and their transitional processes. The possibility of improving recognition accuracy is also shown. PMID:27212940

  16. An autonomous mobile robot to perform waste drum inspections

    SciTech Connect

    Peterson, K.D.; Ward, C.R.

    1994-03-01

    A mobile robot is being developed by the Savannah River Technology Center (SRTC) Robotics Group of Westinghouse Savannah River Company (WSRC) to perform mandated inspections of waste drums stored in warehouse facilities. The system will reduce personnel exposure and create accurate, high quality documentation to ensure regulatory compliance. Development work is being coordinated among several DOE, academic and commercial entities in accordance with DOE's technology transfer initiative. The prototype system was demonstrated in November of 1993. A system is now being developed for field trials at the Fernald site.

  17. Semi-autonomous exploration of multi-floor buildings with a legged robot

    NASA Astrophysics Data System (ADS)

    Wenger, Garrett J.; Johnson, Aaron M.; Taylor, Camillo J.; Koditschek, Daniel E.

    2015-05-01

    This paper presents preliminary results of a semi-autonomous building exploration behavior using the hexapedal robot RHex. Stairwells are used in virtually all multi-floor buildings, and so in order for a mobile robot to effectively explore, map, clear, monitor, or patrol such buildings it must be able to ascend and descend stairwells. However most conventional mobile robots based on a wheeled platform are unable to traverse stairwells, motivating use of the more mobile legged machine. This semi-autonomous behavior uses a human driver to provide steering input to the robot, as would be the case in, e.g., a tele-operated building exploration mission. The gait selection and transitions between the walking and stair climbing gaits are entirely autonomous. This implementation uses an RGBD camera for stair acquisition, which offers several advantages over a previously documented detector based on a laser range finder, including significantly reduced acquisition time. The sensor package used here also allows for considerable expansion of this behavior. For example, complete automation of the building exploration task driven by a mapping algorithm and higher level planner is presently under development.

  18. Processing real-time stereo video for an autonomous robot using disparity maps and sensor fusion

    NASA Astrophysics Data System (ADS)

    Rosselot, Donald W.; Hall, Ernest L.

    2004-10-01

    The Bearcat "Cub" Robot is an interactive, intelligent, Autonomous Guided Vehicle (AGV) designed to serve in unstructured environments. Recent advances in computer stereo vision algorithms that produce quality disparity maps and the availability of low-cost, high-speed camera systems have simplified many of the tasks associated with robot navigation and obstacle avoidance using stereo vision. Leveraging these benefits, this paper describes a novel method for autonomous navigation and obstacle avoidance currently being implemented on the UC Bearcat Robot. The core of this approach is the synthesis of multiple sources of real-time data including stereo image disparity maps, tilt sensor data, and LADAR data with standard contour, edge, color, and line detection methods to provide robust and intelligent obstacle avoidance. An algorithm is presented with Matlab code to process the disparity maps to rapidly produce obstacle size and location information in a simple format, and features cancellation of noise and correction for pitch and roll. The vision and control computers are clustered with the Parallel Virtual Machine (PVM) software. The significance of this work is in presenting the methods needed for real time navigation and obstacle avoidance for intelligent autonomous robots.
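
    A rough Python/NumPy stand-in for the disparity-map processing step (the paper presents its own Matlab code, which is not reproduced here) is sketched below; the focal length, baseline, range threshold, and blob test are placeholders, and the paper's noise cancellation and pitch/roll correction are only gestured at.

        import numpy as np

        FOCAL_PX, BASELINE_M = 400.0, 0.12     # placeholder stereo parameters

        def obstacles_from_disparity(disparity, max_range=3.0, min_blob=20):
            """Convert a disparity map (pixels) to depth via z = f * B / d and flag
            pixels closer than max_range metres as obstacle candidates. The crude
            'too few pixels = noise' test below merely gestures at the paper's noise
            cancellation and pitch/roll correction."""
            disp = np.where(disparity > 0, disparity, np.nan)   # zero disparity = no match
            depth = FOCAL_PX * BASELINE_M / disp
            mask = np.nan_to_num(depth, nan=np.inf) < max_range
            if mask.sum() < min_blob:
                mask[:] = False
            return depth, mask

        # Synthetic 4x4 disparity map with a near object (large disparity) in one corner.
        d = np.full((4, 4), 8.0)
        d[:2, :2] = 40.0
        depth, obstacle_mask = obstacles_from_disparity(d, min_blob=2)
        print(np.round(depth, 2))
        print(obstacle_mask)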

  19. Development of a semi-autonomous service robot with telerobotic capabilities

    NASA Technical Reports Server (NTRS)

    Jones, J. E.; White, D. R.

    1987-01-01

    The importance to the United States of semi-autonomous systems for application to a large number of manufacturing and service processes is very clear. Two principal reasons emerge as the primary driving forces for development of such systems: enhanced national productivity and operation in environments which are hazardous to humans. Completely autonomous systems may not currently be economically feasible. However, autonomous systems that operate in a limited operation domain or that are supervised by humans are within the technology capability of this decade and will likely provide reasonable return on investment. The two research and development efforts of autonomy and telerobotics are distinctly different, yet interconnected. The first addresses the communication of an intelligent electronic system with a robot while the second requires human communication and ergonomic consideration. Discussed here is work in robotic control, human/robot team implementation, expert system robot operation, and sensor development by the American Welding Institute, MTS Systems Corporation, and the Colorado School of Mines--Center for Welding Research.

  20. AltiVec performance increases for autonomous robotics for the MARSSCAPE architecture program

    NASA Astrophysics Data System (ADS)

    Gothard, Benny M.

    2002-02-01

    One of the main tall poles that must be overcome to develop a fully autonomous vehicle is the inability of the computer to understand its surrounding environment to the level required for the intended task. The military mission scenario requires a robot to interact in a complex, unstructured, dynamic environment (see the reference "A High Fidelity Multi-Sensor Scene Understanding System for Autonomous Navigation"). The Mobile Autonomous Robot Software Self Composing Adaptive Programming Environment (MarsScape) perception research addresses three aspects of the problem: sensor system design, processing architectures, and algorithm enhancements. A prototype perception system has been demonstrated on robotic High Mobility Multi-purpose Wheeled Vehicle and All Terrain Vehicle testbeds. This paper addresses the tall pole of processing requirements and the performance improvements based on the selected MarsScape processing architecture. The processor chosen is the Motorola AltiVec G4 PowerPC (PPC), a highly parallelized commercial Single Instruction Multiple Data processor. Both derived perception benchmarks and actual perception subsystem code are benchmarked and compared against previous Demo II Semi-autonomous Surrogate Vehicle processing architectures, along with desktop personal computers (PCs). Performance gains are highlighted with progress to date, and lessons learned and future directions are described.

  1. Collaboration among a Group of Self-Autonomous Mobile Robots with Diversified Personalities

    NASA Astrophysics Data System (ADS)

    Tauchi, Makiko; Sagawa, Yuji; Tanaka, Toshimitsu; Sugie, Noboru

    Simulation studies were carried out on a group of self-autonomous mobile robots collaborating on cleaning-up (collection) tasks. The robots are endowed with two kinds of human-like personality traits: positivity and tenderness. Depending on the rank of positivity, it is decided which of two nearby robots should move to avoid a collision, and which of the robots heading for the same small piece of baggage should carry it. For large baggage, which can be carried only by two collaborating robots, tenderness plays an essential role. In the first series of simulations, the initial configuration of 4 robots, 4 pieces of small baggage, and 2 pieces of large baggage was fixed, and the cleaning-up task was carried out for all combinations of personalities, 625 cases in total. In the second series, 8 robots performed the task; 5 cases were chosen and 100 simulations were run for each by changing the configuration of the baggage. The results show that a heterogeneous group performs the task more effectively than a homogeneous group; diversity in personality appears to be good for survival. In addition to the performance index of task execution time, a satisfaction index is introduced to evaluate the degree of satisfaction of the group.

  2. Exploration of Teisi Knoll by Autonomous Underwater Vehicle "R-One Robot"

    NASA Astrophysics Data System (ADS)

    Ura, Tamaki; Obara, Takashi; Nagahashi, Kenji; Nakane, Kenji; Sakai, Shoji; Oyabu, Yuji; Sakamaki, Takashi; Takagawa, Shinichi; Kawano, Hiroshi; Gamo, Toshitaka; Takano, Michiaki; Doi, Takashi

    This paper outlines the exploration of Teisi Knoll by the autonomous underwater vehicle the R-One Robot, as carried out October 19-22, 2000, and presents images taken by the sidescan SONAR fitted to the bottom of the vehicle. The R-One Robot was launched from the R/V Kaiyo, started diving near the support ship, followed predetermined tracklines which were defined by waypoints, and finally came back to the destination where it was recovered by the support vessel. In order to minimize positioning error, which is determined by the inertial navigation system and Doppler SONAR, the robot ascended to the surface several times to ascertain its precise position using the global positioning system, the antenna of which is fitted on the vertical fin. Taking advantage of this positioning system, the robot followed the predetermined tracklines with an error of less than 40 meters in 30 minutes of continuous submerging. Disturbance to the robot is small enough compared to towed vehicles that its movement can be regarded as stable. This stability resulted in clear side scanning images of the knoll and surrounding sea floor. The robot stopped at the center of the knoll, and descended vertically into the crater. When the vehicle was in the crater, anomalous manganese ion concentrations were detected by the in situ trace metal micro-analyzer GAMOS, which was loaded in the payload bay at the front of the robot.

  3. R-MASTIF: robotic mobile autonomous system for threat interrogation and object fetch

    NASA Astrophysics Data System (ADS)

    Das, Aveek; Thakur, Dinesh; Keller, James; Kuthirummal, Sujit; Kira, Zsolt; Pivtoraiko, Mihail

    2013-01-01

    Autonomous robotic "fetch" operation, where a robot is shown a novel object and then asked to locate it in the field, retrieve it and bring it back to the human operator, is a challenging problem that is of interest to the military. The CANINE competition presented a forum for several research teams to tackle this challenge using state of the art in robotics technology. The SRI-UPenn team fielded a modified Segway RMP 200 robot with multiple cameras and lidars. We implemented a unique computer vision based approach for textureless colored object training and detection to robustly locate previously unseen objects out to 15 meters on moderately flat terrain. We integrated SRI's state of the art Visual Odometry for GPS-denied localization on our robot platform. We also designed a unique scooping mechanism which allowed retrieval of up to basketball sized objects with a reciprocating four-bar linkage mechanism. Further, all software, including a novel target localization and exploration algorithm was developed using ROS (Robot Operating System) which is open source and well adopted by the robotics community. We present a description of the system, our key technical contributions and experimental results.

  4. Welding torch trajectory generation for hull joining using autonomous welding mobile robot

    NASA Astrophysics Data System (ADS)

    Hascoet, J. Y.; Hamilton, K.; Carabin, G.; Rauch, M.; Alonso, M.; Ares, E.

    2012-04-01

    Shipbuilding processes involve highly dangerous manual welding operations. Welding of ship hulls presents a hazardous environment for workers. This paper describes a new robotic system, developed by the SHIPWELD consortium, that moves autonomously on the hull and automatically executes the required welding processes. Specific focus is placed on the trajectory control of such a system and forms the basis for the discussion in this paper. It includes a description of the robotic hardware design as well as some methodology used to establish the torch trajectory control.

  5. Analysis of mutual assured destruction-like scenario with swarms of non-recallable autonomous robots

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2015-05-01

    This paper considers the implications of the creation of an autonomous robotic fighting force without recall-ability which could serve as a deterrent to a 'total war' magnitude attack. It discusses the technical considerations for this type of robotic system and the limited enhancements required to current technologies (particularly UAVs) needed to create such a system. Particular consideration is paid to how the introduction of this type of technology by one actor could create a need for reciprocal development. Also considered is the prospective utilization of this type of technology by non-state actors and the impact of this on state actors.

  6. Autonomous navigation vehicle system based on robot vision and multi-sensor fusion

    NASA Astrophysics Data System (ADS)

    Wu, Lihong; Chen, Yingsong; Cui, Zhouping

    2011-12-01

    The architecture of an autonomous navigation vehicle based on robot vision and multi-sensor fusion technology is described in this paper. To achieve greater intelligence and robustness, accurate real-time collection and processing of information are realized using this technology. The method used to achieve robot vision and multi-sensor fusion is discussed in detail. Results simulated in several operating modes show that this intelligent vehicle performs better in obstacle identification and avoidance and in path planning, which can provide higher reliability during vehicle operation.

  7. Behavior-Based Multi-Robot Collaboration for Autonomous Construction Tasks

    NASA Technical Reports Server (NTRS)

    Stroupe, Ashley; Huntsberger, Terry; Okon, Avi; Aghazarian, Hrand; Robinson, Matthew

    2005-01-01

    We present a heterogeneous multi-robot system for autonomous construction of a structure through assembly of long components. Placement of a component within an existing structure in a realistic environment is demonstrated on a two-robot team. The task requires component acquisition, cooperative transport, and cooperative precision manipulation. For adaptability, the system is designed as a behavior-based architecture. For applicability to space-related construction efforts, computation, power, communication, and sensing are minimized, though the techniques developed are also applicable to terrestrial construction tasks.

  8. Fuzzy Logic Based Control for Autonomous Mobile Robot Navigation

    PubMed Central

    Masmoudi, Mohamed Slim; Masmoudi, Mohamed

    2016-01-01

    This paper describes the design and implementation of a trajectory tracking controller using fuzzy logic for a mobile robot navigating in indoor environments. Most previous works used two independent controllers for navigation and obstacle avoidance. The main contribution of this paper is that we use only one fuzzy controller for both navigation and obstacle avoidance. The mobile robot used is equipped with DC motors, nine infrared range (IR) sensors to measure the distance to obstacles, and two optical encoders to provide the actual position and speeds. To evaluate the performance of the intelligent navigation algorithms, different trajectories are simulated using MATLAB software and the SIMIAM navigation platform. Simulation results show the performance of the intelligent navigation algorithms in terms of simulation times and travelled path.
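
    A toy version of a single fuzzy controller that blends goal seeking with obstacle avoidance is sketched below; the membership functions, rule base, and gains are illustrative placeholders, not the authors' design.

        def tri(x, a, b, c):
            """Triangular membership function peaking at b over the support [a, c]."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        def ramp_down(x, lo, hi):
            """1 below lo, 0 above hi, linear in between (e.g. 'obstacle is near')."""
            return max(0.0, min(1.0, (hi - x) / (hi - lo)))

        def fuzzy_steer(heading_error_deg, obstacle_dist_m):
            """Single toy fuzzy controller blending goal seeking and obstacle avoidance.
            Returns a steering command in degrees (positive = turn left)."""
            err_left = tri(heading_error_deg, 0, 45, 90)       # goal lies to the left
            err_right = tri(heading_error_deg, -90, -45, 0)    # goal lies to the right
            near = ramp_down(obstacle_dist_m, 0.3, 1.5)        # obstacle is near
            far = 1.0 - near
            rules = [(min(err_left, far), +30.0),              # clear and goal left: turn left
                     (min(err_right, far), -30.0),             # clear and goal right: turn right
                     (near, +60.0)]                            # blocked: swerve (always leftward here)
            total = sum(w for w, _ in rules)
            return sum(w * out for w, out in rules) / total if total else 0.0

        print(fuzzy_steer(heading_error_deg=20.0, obstacle_dist_m=1.5))   # seeks the goal
        print(fuzzy_steer(heading_error_deg=20.0, obstacle_dist_m=0.2))   # swerves away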

  9. Positional estimation techniques for an autonomous mobile robot

    NASA Technical Reports Server (NTRS)

    Nandhakumar, N.; Aggarwal, J. K.

    1990-01-01

    Techniques for positional estimation of a mobile robot navigating in an indoor environment are described. A comprehensive review of the various positional estimation techniques studied in the literature is first presented. The techniques are divided into four different types and each of them is discussed briefly. Two different kinds of environments are considered for positional estimation: mountainous natural terrain and an urban, man-made environment with polyhedral buildings. In both cases, the robot is assumed to be equipped with a single visual camera that can be panned and tilted, and a 3-D description (world model) of the environment is given. Such a description could be obtained from a stereo pair of aerial images or from the architectural plans of the buildings. Techniques for positional estimation using the camera input and the world model are presented.

  10. Fuzzy Logic Based Control for Autonomous Mobile Robot Navigation

    PubMed Central

    Masmoudi, Mohamed Slim; Masmoudi, Mohamed

    2016-01-01

    This paper describes the design and implementation of a trajectory tracking controller using fuzzy logic for a mobile robot navigating in indoor environments. Most previous works used two independent controllers for navigation and obstacle avoidance. The main contribution of this paper is that we use only one fuzzy controller for both navigation and obstacle avoidance. The mobile robot used is equipped with DC motors, nine infrared range (IR) sensors to measure the distance to obstacles, and two optical encoders to provide the actual position and speeds. To evaluate the performance of the intelligent navigation algorithms, different trajectories are simulated using MATLAB software and the SIMIAM navigation platform. Simulation results show the performance of the intelligent navigation algorithms in terms of simulation times and travelled path. PMID:27688748

  11. Autonomous dexterous end-effectors for space robotics

    NASA Technical Reports Server (NTRS)

    Bekey, George A.; Iberall, Thea; Liu, Huan

    1989-01-01

    The development of a knowledge-based controller is summarized for the Belgrade/USC robot hand, a five-fingered end effector, designed for maximum autonomy. The biological principles of the hand and its architecture are presented. The conceptual and software aspects of the grasp selection system are discussed, including both the effects of the geometry of the target object and the task to be performed. Some current research issues are presented.

  12. Application-based control of an autonomous mobile robot

    SciTech Connect

    Fisher, J.J.

    1988-01-01

    Industry response to new technology is governed, almost without exception, by the systems available to meet real-world needs, not by tools which merely prove the feasibility of the technology. To this end, SRL is developing robust control strategies and tools for potential autonomous vehicle applications on site. This document describes the work packages developed to perform remote tasks and an integrated control environment which allows rapid vehicle application development and diagnostic capabilities. 5 refs., 7 figs.

  13. Autonomous star field identification for robotic solar system exploration

    NASA Astrophysics Data System (ADS)

    Scholl, Marija S.

    A six-feature all-sky star field identification algorithm has been developed. The minimum identifiable star pattern element consists of an oriented star triplet defined by three stars, their celestial coordinates and visual magnitudes. This algorithm has been integrated with a CCD-based imaging camera. The autonomous intelligent camera identifies in real time any star field without a priori knowledge. Observatory tests on star fields with this intelligent camera are described.
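
    For a rough feel of triplet-based identification (a simplification of the six-feature algorithm described above), the sketch below matches an observed star triplet to a tiny catalog by comparing sorted pairwise angular separations; the catalog entries and tolerance are invented.

        import math

        def ang_sep(a, b):
            """Angular separation in degrees between two (RA, Dec) positions in degrees."""
            ra1, de1, ra2, de2 = map(math.radians, (*a, *b))
            cosd = (math.sin(de1) * math.sin(de2)
                    + math.cos(de1) * math.cos(de2) * math.cos(ra1 - ra2))
            return math.degrees(math.acos(max(-1.0, min(1.0, cosd))))

        def signature(p1, p2, p3):
            """Sorted pairwise separations: a rotation-invariant triplet signature."""
            return tuple(sorted((ang_sep(p1, p2), ang_sep(p2, p3), ang_sep(p1, p3))))

        # Invented mini-catalog of named triplets.
        CATALOG = {"triplet_A": signature((10.0, 5.0), (12.0, 6.5), (11.0, 4.0)),
                   "triplet_B": signature((50.0, -3.0), (52.5, -2.0), (51.5, -6.0))}

        def identify(observed, tol=0.05):
            """Match an observed triplet against the catalog within a tolerance (deg)."""
            sig = signature(*observed)
            for name, ref in CATALOG.items():
                if all(abs(s - r) < tol for s, r in zip(sig, ref)):
                    return name
            return None

        print(identify([(50.01, -3.0), (52.51, -2.01), (51.5, -5.99)]))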

  14. Navigation in an autonomous flying robot by using a biologically inspired visual odometer

    NASA Astrophysics Data System (ADS)

    Iida, Fumiya; Lambrinos, Dimitrios

    2000-10-01

    While mobile robots and walking insects can use proprioceptive information (specialized receptors in the insects' legs, or wheel encoders in robots) to estimate distance traveled, flying agents have to rely mainly on visual cues. Experiments with bees provide evidence that flying insects might be using optical flow induced by egomotion to estimate distance traveled. Recently some details of this odometer have been unraveled. In this study, we propose a biologically inspired model of the bee's visual odometer based on Elementary Motion Detectors (EMDs), and present results from goal-directed navigation experiments with an autonomous flying robot platform that we developed specifically for this purpose. The robot is equipped with a panoramic vision system, which is used to provide input to the EMDs of the left and right visual fields. The outputs of the EMDs are at a later stage spatially integrated by wide-field motion detectors, and their accumulated response is used directly as the odometer. In a set of initial experiments, the robot moves through a corridor on a fixed route, and the odometer outputs of the EMDs are recorded. The results show that the proposed model can be used to provide an estimate of the distance traveled, but the performance depends on the route the robot follows, which is biologically plausible since natural insects tend to adopt a fixed route during foraging. Given these results, we assumed that the optomotor response plays an important role in the context of goal-directed navigation, and we conducted experiments with an autonomous, freely flying robot. The experiments demonstrate that this computationally cheap mechanism can be successfully employed in natural indoor environments.
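
    A minimal discrete-time sketch of a Reichardt-style EMD array with spatial summation and accumulation, in the spirit of (but not identical to) the model described above, is given below; the stimulus and parameters are synthetic.

        import numpy as np

        def emd_response(frames, delay=1):
            """1-D Reichardt-style EMD array. frames has shape (T, N): T time steps of
            an N-pixel line image. Each detector correlates a pixel with the delayed
            signal of its neighbour in both directions and subtracts the two, giving a
            signed local motion estimate; the per-frame spatial sum is returned."""
            frames = np.asarray(frames, dtype=float)
            out = []
            for t in range(delay, frames.shape[0]):
                cur, old = frames[t], frames[t - delay]
                rightward = old[:-1] * cur[1:]     # delayed left pixel x current right pixel
                leftward = old[1:] * cur[:-1]      # delayed right pixel x current left pixel
                out.append(float(np.sum(rightward - leftward)))
            return np.array(out)

        # Synthetic stimulus: a bright bar drifting rightward across a 1-D "retina".
        T, N = 30, 40
        frames = np.zeros((T, N))
        for t in range(T):
            frames[t, t:t + 3] = 1.0

        resp = emd_response(frames)
        odometer = np.cumsum(resp)                 # accumulated wide-field response
        print(f"mean EMD response {resp.mean():+.2f}, odometer reading {odometer[-1]:.1f}")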

  15. Road network modeling in open source GIS to manage the navigation of autonomous robots

    NASA Astrophysics Data System (ADS)

    Mangiameli, Michele; Muscato, Giovanni; Mussumeci, Giuseppe

    2013-10-01

    The autonomous navigation of a robot can be accomplished through the assignment of a sequence of waypoints previously identified in the territory to be explored. In general, the starting point is a vector graph of the network of possible paths. The vector graph can be directly available in the case of actual road networks, or it can be modeled, e.g., on the basis of cartographic supports or, even better, of a digital terrain model (DTM). In this paper we present software procedures developed in GRASS GIS, PostGIS and QGIS environments to identify, model, and visualize a road graph and to extract and normalize sequences of waypoints which can be transferred to a robot for its autonomous navigation.

  16. Neuromodulated Neural Hardware and Its Implementation on an Autonomous Mobile Robot

    NASA Astrophysics Data System (ADS)

    Tokura, Seiji; Ishiguro, Akio; Okuma, Shigeru

    In order to construct truly autonomous mobile robots, the concept of packaging is indispensable: all parts such as controllers, power systems, and batteries should be embedded inside the body. Implementing the controller in hardware is therefore one of the most promising approaches, since it contributes to low power consumption, miniaturization, and so on. Another crucial requirement in the field of autonomous mobile robots is robustness; that is, autonomous mobile robots have to cope with unpredictably changing environments in real time. In this study, to meet these requirements, the concept of a Dynamically Rearrangeable Electrical Circuit (DREC) is proposed. In addition, we implement the DREC on FPGAs as physical electronic circuits by using the diffusion-reaction mechanism of neuromodulation, which is widely observed in biological nervous systems. We developed a DREC for a peg-pushing task as a practical example and confirmed that the physical DREC can successfully regulate the robot's behavior according to the situation by changing its properties in real time.

  17. Autonomous undulatory serpentine locomotion utilizing body dynamics of a fluidic soft robot.

    PubMed

    Onal, Cagdas D; Rus, Daniela

    2013-06-01

    Soft robotics offers the unique promise of creating inherently safe and adaptive systems. These systems bring man-made machines closer to the natural capabilities of biological systems. An important requirement to enable self-contained soft mobile robots is an on-board power source. In this paper, we present an approach to create a bio-inspired soft robotic snake that can undulate in a similar way to its biological counterpart using pressure for actuation power, without human intervention. With this approach, we develop an autonomous soft snake robot with on-board actuation, power, computation and control capabilities. The robot consists of four bidirectional fluidic elastomer actuators in series to create a traveling curvature wave from head to tail along its body. Passive wheels between segments generate the necessary frictional anisotropy for forward locomotion. It takes 14 h to build the soft robotic snake, which can attain an average locomotion speed of 19 mm/s. PMID:23524383
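
    The traveling curvature wave can be illustrated with a short sketch: each segment is driven sinusoidally with a fixed phase lag so the wave propagates from head to tail. The amplitude, frequency, and phase lag below are placeholders, not the paper's values.

        import math

        N_SEGMENTS = 4            # matching the four bidirectional actuators described above
        AMPLITUDE = 1.0           # peak curvature command (arbitrary units; placeholder)
        FREQ_HZ = 0.5             # undulation frequency (placeholder)
        PHASE_LAG = math.pi / 2   # phase offset between neighbouring segments (placeholder)

        def curvature_commands(t):
            """Serpenoid-style traveling wave: each segment is driven sinusoidally with
            a fixed phase lag so the curvature wave propagates from head to tail."""
            omega = 2.0 * math.pi * FREQ_HZ
            return [AMPLITUDE * math.sin(omega * t - i * PHASE_LAG) for i in range(N_SEGMENTS)]

        # One undulation period sampled at 10 Hz, showing the wave travel down the body.
        for step in range(20):
            t = step * 0.1
            print(f"t={t:4.1f}s  " + "  ".join(f"{c:+.2f}" for c in curvature_commands(t)))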

  18. Automatic generation of modules of object categorization for autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Gorbenko, Anna

    2013-10-01

    Many robotic tasks require advanced systems of visual sensing. Robotic systems of visual sensing must be able to solve a number of different complex problems of visual data analysis. Object categorization is one of such problems. In this paper, we propose an approach to automatic generation of computationally effective modules of object categorization for autonomous mobile robots. This approach is based on the consideration of the stack cover problem. In particular, it is assumed that the robot is able to perform an initial inspection of the environment. After such inspection, the robot needs to solve the stack cover problem by using a supercomputer. A solution of the stack cover problem allows the robot to obtain a template for computationally effective scheduling of object categorization. Also, we consider an efficient approach to solve the stack cover problem. In particular, we consider an explicit reduction from the decision version of the stack cover problem to the satisfiability problem. For different satisfiability algorithms, the results of computational experiments are presented.

  19. Monocular SLAM for Autonomous Robots with Enhanced Features Initialization

    PubMed Central

    Guerra, Edmundo; Munguia, Rodrigo; Grau, Antoni

    2014-01-01

    This work presents a variant approach to the monocular SLAM problem focused on exploiting the advantages of a human-robot interaction (HRI) framework. Based upon the delayed inverse-depth feature initialization SLAM (DI-D SLAM), a known monocular technique, several crucial modifications are introduced that take advantage of data from a secondary monocular sensor, assuming that this second camera is worn by a human. The human explores an unknown environment with the robot, and when their fields of view coincide, the cameras are treated as a pseudo-calibrated stereo rig to produce depth estimates through parallax. These depth estimates are used to solve a known limitation of DI-D monocular SLAM, namely, the requirement of a metric scale initialization through known artificial landmarks. The same process is used to improve the performance of the technique when introducing new landmarks into the map. The convenience of the approach taken to the stereo estimation, based on SURF feature matching, is discussed. Experimental validation is provided with real data, and the results show improvements in terms of more features correctly initialized with reduced uncertainty, thus reducing scale and orientation drift. Additional discussion of how a real-time implementation could take advantage of this approach is provided. PMID:24699284

  20. Monocular SLAM for autonomous robots with enhanced features initialization.

    PubMed

    Guerra, Edmundo; Munguia, Rodrigo; Grau, Antoni

    2014-04-02

    This work presents a variant approach to the monocular SLAM problem focused on exploiting the advantages of a human-robot interaction (HRI) framework. Based upon the delayed inverse-depth feature initialization SLAM (DI-D SLAM), a known monocular technique, several crucial modifications are introduced that take advantage of data from a secondary monocular sensor, assuming that this second camera is worn by a human. The human explores an unknown environment with the robot, and when their fields of view coincide, the cameras are treated as a pseudo-calibrated stereo rig to produce depth estimates through parallax. These depth estimates are used to solve a known limitation of DI-D monocular SLAM, namely, the requirement of a metric scale initialization through known artificial landmarks. The same process is used to improve the performance of the technique when introducing new landmarks into the map. The convenience of the approach taken to the stereo estimation, based on SURF feature matching, is discussed. Experimental validation is provided with real data, and the results show improvements in terms of more features correctly initialized with reduced uncertainty, thus reducing scale and orientation drift. Additional discussion of how a real-time implementation could take advantage of this approach is provided.

  1. Adaptive Control for Autonomous Navigation of Mobile Robots Considering Time Delay and Uncertainty

    NASA Astrophysics Data System (ADS)

    Armah, Stephen Kofi

    Autonomous control of mobile robots has attracted considerable attention from researchers in the areas of robotics and autonomous systems during the past decades. One of the goals in the field of mobile robotics is development of platforms that robustly operate in given, partially unknown, or unpredictable environments and offer desired services to humans. Autonomous mobile robots need to be equipped with effective, robust, and/or adaptive navigation control systems. In spite of enormous reported work on autonomous navigation control systems for mobile robots, achieving the goal above is still an open problem. Robustness and reliability of the controlled system can always be improved. The fundamental issues affecting the stability of the control systems include the undesired nonlinear effects introduced by actuator saturation, time delay in the controlled system, and uncertainty in the model. This research work develops robustly stabilizing control systems by investigating and addressing such nonlinear effects through analysis, simulations, and experiments. The control systems are designed to meet specified transient and steady-state specifications. The systems used for this research are ground (Dr Robot X80SV) and aerial (Parrot AR.Drone 2.0) mobile robots. Firstly, an effective autonomous navigation control system is developed for the X80SV using logic control by combining 'go-to-goal', 'avoid-obstacle', and 'follow-wall' controllers. A MATLAB robot simulator is developed to implement this control algorithm and experiments are conducted in a typical office environment. The next stage of the research develops autonomous position (x, y, and z) and attitude (roll, pitch, and yaw) controllers for a quadrotor, and PD feedback control is used to achieve stabilization. The quadrotor's nonlinear dynamics and kinematics are implemented using a MATLAB S-function to generate the state output. Secondly, the white-box and black-box approaches are used to obtain a linearized
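
    A minimal sketch of the 'go-to-goal' / 'avoid-obstacle' switching idea for a differential-drive robot is given below; the gains, the 0.5 m threshold, and the spin-in-place avoidance policy are illustrative assumptions, not the dissertation's controllers.

        import math

        def go_to_goal(pose, goal, k_v=0.5, k_w=1.5):
            """Unicycle go-to-goal law: drive toward the goal, turning in proportion to
            the heading error. pose = (x, y, theta); goal = (gx, gy); returns (v, w)."""
            x, y, theta = pose
            dx, dy = goal[0] - x, goal[1] - y
            err = math.atan2(dy, dx) - theta
            err = math.atan2(math.sin(err), math.cos(err))   # wrap to [-pi, pi]
            return k_v * math.hypot(dx, dy), k_w * err

        def navigate(pose, goal, obstacle_dist):
            """Crude behaviour switch: if anything is closer than 0.5 m, stop translating
            and rotate away (placeholder avoid-obstacle policy); otherwise go to goal."""
            if obstacle_dist < 0.5:
                return 0.0, 1.0
            return go_to_goal(pose, goal)

        print(navigate((0.0, 0.0, 0.0), (2.0, 1.0), obstacle_dist=3.0))
        print(navigate((0.0, 0.0, 0.0), (2.0, 1.0), obstacle_dist=0.2))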

  2. Effectiveness of Social Behaviors for Autonomous Wheelchair Robot to Support Elderly People in Japan

    PubMed Central

    Shiomi, Masahiro; Iio, Takamasa; Kamei, Koji; Sharma, Chandraprakash; Hagita, Norihiro

    2015-01-01

    We developed a wheelchair robot to support the movement of elderly people and specifically implemented two functions to enhance their intention to use it: speaking behavior to convey place/location related information and speed adjustment based on individual preferences. Our study examines how the evaluations of our wheelchair robot differ when compared with human caregivers and a conventional autonomous wheelchair without the two proposed functions in a moving support context. 28 senior citizens participated in the experiment to evaluate three different conditions. Our measurements consisted of questionnaire items and the coding of free-style interview results. Our experimental results revealed that elderly people evaluated our wheelchair robot higher than the wheelchair without the two functions and the human caregivers for some items. PMID:25993038

  3. Effectiveness of social behaviors for autonomous wheelchair robot to support elderly people in Japan.

    PubMed

    Shiomi, Masahiro; Iio, Takamasa; Kamei, Koji; Sharma, Chandraprakash; Hagita, Norihiro

    2015-01-01

    We developed a wheelchair robot to support the movement of elderly people and specifically implemented two functions to enhance their intention to use it: speaking behavior to convey place/location related information and speed adjustment based on individual preferences. Our study examines how the evaluations of our wheelchair robot differ when compared with human caregivers and a conventional autonomous wheelchair without the two proposed functions in a moving support context. 28 senior citizens participated in the experiment to evaluate three different conditions. Our measurements consisted of questionnaire items and the coding of free-style interview results. Our experimental results revealed that elderly people evaluated our wheelchair robot higher than the wheelchair without the two functions and the human caregivers for some items. PMID:25993038

  4. Effectiveness of social behaviors for autonomous wheelchair robot to support elderly people in Japan.

    PubMed

    Shiomi, Masahiro; Iio, Takamasa; Kamei, Koji; Sharma, Chandraprakash; Hagita, Norihiro

    2015-01-01

    We developed a wheelchair robot to support the movement of elderly people and specifically implemented two functions to enhance their intention to use it: speaking behavior to convey place/location related information and speed adjustment based on individual preferences. Our study examines how the evaluations of our wheelchair robot differ when compared with human caregivers and a conventional autonomous wheelchair without the two proposed functions in a moving support context. 28 senior citizens participated in the experiment to evaluate three different conditions. Our measurements consisted of questionnaire items and the coding of free-style interview results. Our experimental results revealed that elderly people evaluated our wheelchair robot higher than the wheelchair without the two functions and the human caregivers for some items.

  5. Incremental inverse kinematics based vision servo for autonomous robotic capture of non-cooperative space debris

    NASA Astrophysics Data System (ADS)

    Dong, Gangqi; Zhu, Z. H.

    2016-04-01

    This paper proposes a new incremental inverse kinematics based vision servo approach for robotic manipulators to capture a non-cooperative target autonomously. The target's pose and motion are estimated by a vision system using integrated photogrammetry and an EKF algorithm. Based on the estimated pose and motion of the target, the instantaneous desired position of the end-effector is predicted by inverse kinematics and the robotic manipulator is moved incrementally from its current configuration subject to the joint speed limits. This approach effectively eliminates the multiple solutions of the inverse kinematics and increases the robustness of the control algorithm. The proposed approach is validated by a hardware-in-the-loop simulation, where the pose and motion of the non-cooperative target are estimated by a real vision system. The simulation results demonstrate the effectiveness and robustness of the proposed estimation approach for the target and the incremental control strategy for the robotic manipulator.
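
    The incremental, speed-limited inverse-kinematics step can be illustrated on a toy two-link planar arm, as sketched below; the link lengths, control period, speed limit, and target are placeholders, and the pseudo-inverse Jacobian update is a generic stand-in for the paper's manipulator and estimator.

        import numpy as np

        L1, L2 = 0.5, 0.4            # link lengths of a toy planar arm (placeholders)
        DT, QDOT_MAX = 0.05, 0.8     # control period [s] and joint speed limit [rad/s]

        def fk(q):
            """Forward kinematics: end-effector (x, y) of the two-link planar arm."""
            return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                             L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

        def jacobian(q):
            s1, c1 = np.sin(q[0]), np.cos(q[0])
            s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
            return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                             [ L1 * c1 + L2 * c12,  L2 * c12]])

        def incremental_step(q, target_xy):
            """One incremental inverse-kinematics step: request a joint velocity from the
            Jacobian pseudo-inverse, clip it to the speed limit, and integrate for DT."""
            err = target_xy - fk(q)
            qdot = np.linalg.pinv(jacobian(q)) @ err / DT
            qdot = np.clip(qdot, -QDOT_MAX, QDOT_MAX)        # enforce joint speed limits
            return q + qdot * DT

        q = np.array([0.3, 0.5])
        target = np.array([0.6, 0.4])                        # e.g. a predicted grasp point
        for _ in range(200):
            q = incremental_step(q, target)
        print("final end-effector:", np.round(fk(q), 3), "target:", target)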

  6. Command and Control Architectures for Autonomous Micro-Robotic Forces - FY-2000 Project Report

    SciTech Connect

    Dudenhoeffer, Donald Dean

    2001-04-01

    Advances in Artificial Intelligence (AI) and micro-technologies will soon give rise to production of large-scale forces of autonomous micro-robots with systems of innate behaviors and with capabilities of self-organization and real-world tasking. Such organizations have been compared to schools of fish, flocks of birds, herds of animals, swarms of insects, and military squadrons. While these systems are envisioned as maintaining a high degree of autonomy, it is important to understand the relationship of humans with such machines. In moving from research studies to the practical deployment of large numbers of robots, one of the critical pieces that must be explored is the command and control architecture through which humans re-task the force and inject global knowledge, experience, and intuition into it. Tele-operation should not be the goal, but rather a level of adjustable autonomy and high-level control. If a herd of sheep is comparable to the collective of robots, then the human element is comparable to the shepherd pulling in strays and guiding the herd in the direction of greener pastures. This report addresses the issues and development of command and control for large-scale numbers of autonomous robots deployed as a collective force.

  7. Emergence of Leadership in a Group of Autonomous Robots

    PubMed Central

    Pugliese, Francesco; Acerbi, Alberto; Marocco, Davide

    2015-01-01

    In this paper we examine the factors contributing to the emergence of leadership in a group, and we explore the relationship between the role of the leader and the behavioural capabilities of other individuals. We use a simulation technique where a group of foraging robots must coordinate to choose between two identical food zones in order to forage collectively. Behavioural and quantitative analysis indicate that a form of leadership emerges, and that groups with a leader are more effective than groups without. Moreover, we show that the most skilled individuals in a group tend to be the ones that assume a leadership role, supporting biological findings. Further analysis reveals the emergence of different “styles” of leadership (active and passive). PMID:26340449

  8. Emergence of Leadership in a Group of Autonomous Robots.

    PubMed

    Pugliese, Francesco; Acerbi, Alberto; Marocco, Davide

    2015-01-01

    In this paper we examine the factors contributing to the emergence of leadership in a group, and we explore the relationship between the role of the leader and the behavioural capabilities of other individuals. We use a simulation technique where a group of foraging robots must coordinate to choose between two identical food zones in order to forage collectively. Behavioural and quantitative analysis indicate that a form of leadership emerges, and that groups with a leader are more effective than groups without. Moreover, we show that the most skilled individuals in a group tend to be the ones that assume a leadership role, supporting biological findings. Further analysis reveals the emergence of different "styles" of leadership (active and passive). PMID:26340449

  9. Emergence of Leadership in a Group of Autonomous Robots.

    PubMed

    Pugliese, Francesco; Acerbi, Alberto; Marocco, Davide

    2015-01-01

    In this paper we examine the factors contributing to the emergence of leadership in a group, and we explore the relationship between the role of the leader and the behavioural capabilities of other individuals. We use a simulation technique where a group of foraging robots must coordinate to choose between two identical food zones in order to forage collectively. Behavioural and quantitative analysis indicate that a form of leadership emerges, and that groups with a leader are more effective than groups without. Moreover, we show that the most skilled individuals in a group tend to be the ones that assume a leadership role, supporting biological findings. Further analysis reveals the emergence of different "styles" of leadership (active and passive).

  10. Microbial-powered artificial muscles for autonomous robots

    NASA Astrophysics Data System (ADS)

    Ieropoulos, Ioannis; Anderson, Iain A.; Gisby, Todd; Wang, Cheng-Hung; Rossiter, Jonathan

    2009-03-01

    We consider the embodiment of a microbial fuel cell using artificial muscle actuators. The microbial fuel cell digests organic matter and generates electricity. This energy is stored in a capacitor bank until it is discharged to power one of two complementary artificial muscle technologies: the dielectric elastomer actuator and the ionic-polymer metal composite. We study the ability of the fuel cell to generate useful actuation and consider appropriate configurations to maximally exploit both of these artificial muscle technologies. A prototype artificial sphincter is implemented using a dielectric elastomer actuator. Stirrer and cilia mechanisms motivate experimentation using ionic polymer metal composite actuators. The ability of the fuel cell to drive both of these technologies opens up new possibilities for truly biomimetic soft artificial robotic organisms.

  11. Lane identification and path planning for autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    McKeon, Robert T.; Paulik, Mark; Krishnan, Mohan

    2006-10-01

    This work has been performed in conjunction with the University of Detroit Mercy's (UDM) ECE Department autonomous vehicle entry in the 2006 Intelligent Ground Vehicle Competition (www.igvc.org). The IGVC challenges engineering students to design autonomous vehicles and compete in a variety of unmanned mobility competitions. The course to be traversed in the competition consists of a lane demarcated by painted lines on grass with the possibility of one of the two lines being deliberately left out over segments of the course. The course also consists of other challenging artifacts such as sandpits, ramps, potholes, and colored tarps that alter the color composition of scenes, and obstacles set up using orange and white construction barrels. This paper describes a composite lane edge detection approach that uses three algorithms to implement noise filters enabling increased removal of noise prior to the application of image thresholding. The first algorithm uses a row-adaptive statistical filter to establish an intensity floor followed by a global threshold based on a reverse cumulative intensity histogram and a priori knowledge about lane thickness and separation. The second method first improves the contrast of the image by implementing an arithmetic combination of the blue plane (RGB format) and a modified saturation plane (HSI format). A global threshold is then applied based on the mean of the intensity image and a user-defined offset. The third method applies the horizontal component of the Sobel mask to a modified gray scale of the image, followed by a thresholding method similar to the one used in the second method. The Hough transform is applied to each of the resulting binary images to select the most probable line candidates. Finally, a heuristics-based confidence interval is determined, and the results sent on to a separate fuzzy polar-based navigation algorithm, which fuses the image data with that produced by a laser scanner (for obstacle detection).
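
    To make the second thresholding stage concrete, the sketch below combines the blue plane with a saturation plane, thresholds at the image mean plus a user-defined offset, and extracts line candidates with a Hough transform. The exact arithmetic combination, the offset, and the Hough parameters are illustrative assumptions, not the authors' values.

        import cv2
        import numpy as np

        def detect_lane_lines(bgr, offset=30):
            blue = bgr[:, :, 0].astype(np.float32)
            sat = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)[:, :, 1].astype(np.float32)
            # Assumed arithmetic combination emphasizing white paint over green grass.
            enhanced = cv2.normalize(blue - 0.5 * sat, None, 0, 255, cv2.NORM_MINMAX)
            gray = enhanced.astype(np.uint8)
            # Global threshold at the image mean plus a user-defined offset.
            _, binary = cv2.threshold(gray, float(gray.mean()) + offset, 255, cv2.THRESH_BINARY)
            # Probabilistic Hough transform returns candidate lane-line segments.
            return cv2.HoughLinesP(binary, 1, np.pi / 180, threshold=80,
                                   minLineLength=40, maxLineGap=10)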

  12. Immune systems are not just for making you feel better: they are for controlling autonomous robots

    NASA Astrophysics Data System (ADS)

    Rosenblum, Mark

    2005-05-01

    The typical algorithm for robot autonomous navigation in off-road complex environments involves building a 3D map of the robot's surrounding environment using a 3D sensing modality such as stereo vision or active laser scanning, and generating an instantaneous plan to navigate around hazards. Although there has been steady progress using these methods, these systems suffer from several limitations that cannot be overcome with 3D sensing and planning alone. Geometric sensing alone has no ability to distinguish between compressible and non-compressible materials. As a result, these systems have difficulty in heavily vegetated environments and require sensitivity adjustments across different terrain types. On the planning side, these systems have no ability to learn from their mistakes and avoid problematic environmental situations on subsequent encounters. We have implemented an adaptive terrain classification system based on the Artificial Immune System (AIS) computational model, which is loosely based on the biological immune system, that combines various forms of imaging sensor inputs to produce a "feature labeled" image of the scene categorizing areas as benign or detrimental for autonomous robot navigation. Because of the qualities of the AIS computation model, the resulting system will be able to learn and adapt on its own through interaction with the environment by modifying its interpretation of the sensor data. The feature labeled results from the AIS analysis are inserted into a map and can then be used by a planner to generate a safe route to a goal point. The coupling of diverse visual cues with the malleable AIS computational model will lead to autonomous robotic ground vehicles that require less human intervention for deployment in novel environments and more robust operation as a result of the system's ability to improve its performance through interaction with the environment.

  13. Measure of the accuracy of navigational sensors for autonomous path tracking

    NASA Astrophysics Data System (ADS)

    Motazed, Ben

    1994-02-01

    Outdoor mobile robot path tracking for an extended period of time and distance is a formidable task. The difficulty lies in the ability of robot navigation systems to reliably and accurately report on the position and orientation of the vehicle. This paper addresses the accurate navigation of mobile robots in the context of non-line-of-sight autonomous convoying. Dead-reckoning, GPS, and vision-based autonomous road-following navigation schemes are integrated through a Kalman filter formulation to derive mobile robot position and orientation. The accuracy of these navigation schemes and their sufficiency to achieve autonomous path tracking over long durations are examined.
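
    As a minimal sketch of the kind of Kalman-filter fusion described above (not the paper's formulation), the following predicts the pose with dead reckoning and corrects it with a GPS-like position fix; the state layout and noise matrices are illustrative assumptions.

        import numpy as np

        def kf_predict(x, P, u, Q):
            """Dead-reckoning step: x = [px, py, heading], u = (distance, heading change)."""
            px, py, th = x
            d, dth = u
            x_pred = np.array([px + d * np.cos(th), py + d * np.sin(th), th + dth])
            F = np.array([[1.0, 0.0, -d * np.sin(th)],
                          [0.0, 1.0,  d * np.cos(th)],
                          [0.0, 0.0,  1.0]])
            return x_pred, F @ P @ F.T + Q

        def kf_update_position(x, P, z, R):
            """Correct the prediction with a GPS-like (px, py) position fix z."""
            H = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
            y = z - H @ x
            K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
            return x + K @ y, (np.eye(3) - K @ H) @ P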

  14. Autonomous robot navigation based on the evolutionary multi-objective optimization of potential fields

    NASA Astrophysics Data System (ADS)

    Herrera Ortiz, Juan Arturo; Rodríguez-Vázquez, Katya; Padilla Castañeda, Miguel A.; Arámbula Cosío, Fernando

    2013-01-01

    This article presents the application of a new multi-objective evolutionary algorithm called RankMOEA to determine the optimal parameters of an artificial potential field for autonomous navigation of a mobile robot. Autonomous robot navigation is posed as a multi-objective optimization problem with three objectives: minimization of the distance to the goal, maximization of the distance between the robot and the nearest obstacle, and maximization of the distance travelled on each field configuration. Two decision makers were implemented using objective reduction and discrimination in performance trade-off. The performance of RankMOEA is compared with NSGA-II and SPEA2, including both decision makers. Simulation experiments using three different obstacle configurations and 10 different routes were performed using the proposed methodology. RankMOEA clearly outperformed NSGA-II and SPEA2. The robustness of this approach was evaluated with the simulation of different sensor masks and sensor noise. The scheme reported was also combined with the wavefront-propagation algorithm for global path planning.
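
    For context, the sketch below shows the classic artificial potential field whose parameters (attractive gain, repulsive gain, influence distance) are the kind of quantities such an evolutionary algorithm would tune; it is a generic illustration under assumed names, not the parameterization used in the article.

        import numpy as np

        def potential_force(pos, goal, nearest_obstacle, k_att, k_rep, d0):
            f_att = k_att * (goal - pos)                      # pull toward the goal
            diff = pos - nearest_obstacle
            d = np.linalg.norm(diff)
            if 1e-9 < d < d0:
                # Classic repulsive term, active only inside the influence distance d0.
                f_rep = k_rep * (1.0 / d - 1.0 / d0) / d**2 * (diff / d)
            else:
                f_rep = np.zeros_like(pos)
            return f_att + f_rep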

  15. Vision-directed path planning, navigation, and control for an autonomous mobile robot

    NASA Astrophysics Data System (ADS)

    Wong, Andrew K. C.; Gao, Song

    1993-05-01

    This paper presents a model- and sensor-based path planning, navigation, and control system for an autonomous mobile robot (AMR) operating in a known laboratory environment. The goal of this research is to enable the AMR to use its on-board computer vision to: (a) locate its own position in the laboratory environment; (b) plan its path; (c) navigate itself along the planned path. To determine the position and the orientation of the AMR before and during navigation, the vision system has to first recognize and locate the known landmarks, such as doors and columns, in the laboratory. Once the AMR relates its own position and orientation with the world environment, it is able to plan a path to reach a certain prescribed destination. In order to achieve on-line visual feedback, an autonomous target (landmark) acquisition, recognition, and tracking scheme is used. The AMR system is designed and developed to support flexible manufacturing in general, surveillance and material transport in hazardous environments, and an autonomous space robotics project funded by MRCO and the Canadian Space Program related to the Freedom Space Station.

  16. Reliability of EUCLIDIAN: An autonomous robotic system for image-guided prostate brachytherapy

    SciTech Connect

    Podder, Tarun K.; Buzurovic, Ivan; Huang Ke; Showalter, Timothy; Dicker, Adam P.; Yu, Yan

    2011-01-15

    Purpose: Recently, several robotic systems have been developed to perform accurate and consistent image-guided brachytherapy. Before introducing a new device into clinical operations, it is important to assess the reliability and mean time before failure (MTBF) of the system. In this article, the authors present the preclinical evaluation and analysis of the reliability and MTBF of an autonomous robotic system, which was developed for prostate seed implantation. Methods: The authors have considered three steps that are important in reliability growth analysis. These steps are: identification and isolation of failures, classification of failures, and trend analysis. For any one-of-a-kind product, the reliability enhancement is accomplished through test-fix-test. The authors have used failure mode and effect analysis for collection and analysis of reliability data by identifying and categorizing the failure modes. Failures were classified according to severity. Failures that occurred during the operation of this robotic system were modeled as a nonhomogeneous Poisson process. The failure occurrence trend was analyzed using the Laplace test. For analyzing and predicting reliability growth, commonly used and widely accepted models, Duane's model and the Army Material Systems Analysis Activity, i.e., Crow's model, were applied. The MTBF was used as an important measure for assessing the system's reliability. Results: During preclinical testing, 3196 seeds (in 53 test cases) were deposited autonomously by the robot and 14 critical failures were encountered. The majority of the failures occurred during the first few cases. The distribution of failures followed Duane's postulation as well as Crow's postulation of reliability growth. The Laplace test index was -3.82 (<0), indicating a significant trend in failure data, and the failure intervals lengthened gradually. The continuous increase in the failure occurrence interval suggested a trend toward improved reliability. The MTBF
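
    To make the trend analysis concrete, the sketch below implements the standard time-truncated Laplace test statistic and a cumulative MTBF estimate from a list of failure times; these are textbook formulas offered for illustration, not the authors' analysis code.

        import math

        def laplace_test(failure_times, T):
            """Time-truncated Laplace statistic over observation window (0, T];
            negative values indicate lengthening failure intervals, i.e. reliability growth."""
            n = len(failure_times)
            return (sum(failure_times) / n - T / 2.0) / (T * math.sqrt(1.0 / (12.0 * n)))

        def cumulative_mtbf(failure_times, T):
            """Cumulative mean time between failures over the observation period T."""
            return T / len(failure_times)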

  17. A learning-based semi-autonomous controller for robotic exploration of unknown disaster scenes while searching for victims.

    PubMed

    Doroodgar, Barzin; Liu, Yugang; Nejat, Goldie

    2014-12-01

    Semi-autonomous control schemes can address the limitations of both teleoperation and fully autonomous robotic control of rescue robots in disaster environments by allowing a human operator to cooperate with a rescue robot and share such tasks as navigation, exploration, and victim identification. In this paper, we present a unique hierarchical reinforcement learning (HRL)-based semi-autonomous control architecture for rescue robots operating in cluttered and unknown urban search and rescue (USAR) environments. The aim of the controller is to enable a rescue robot to continuously learn from its own experiences in an environment in order to improve its overall performance in exploration of unknown disaster scenes. A direction-based exploration technique is integrated in the controller to expand the search area of the robot via the classification of regions and the rubble piles within these regions. Both simulations and physical experiments in USAR-like environments verify the robustness of the proposed HRL-based semi-autonomous controller to unknown cluttered scenes with different sizes and varying types of configurations.

  18. Development and training of a learning expert system in an autonomous mobile robot via simulation

    SciTech Connect

    Spelt, P.F.; Lyness, E.; DeSaussure, G. (Center for Engineering Systems Advanced Research)

    1989-11-01

    The Center for Engineering Systems Advanced Research (CESAR) conducts basic research in the area of intelligent machines. Recently at CESAR, a learning expert system was created to operate on board an autonomous robot working at a process control panel. The authors discuss the two-computer simulation system used to create, evaluate, and train this learning system. The simulation system has a graphics display of the current status of the process being simulated, and the same program which does the simulating also drives the actual control panel. Simulation results were validated on the actual robot. The speed and safety advantages of using a computerized simulator to train a learning computer, and future uses of the simulation system, are discussed.

  19. The Structure Of A Fuzzy Production System For Autonomous Robot Control

    NASA Astrophysics Data System (ADS)

    Isik, Can; Meystel, Alexander

    1986-03-01

    A knowledge-based controller for an autonomous mobile robot is realized as a hierarchy of production systems. The hierarchical structure is achieved following the information hierarchy of the system. A high level path planning is possible by utilizing the incomplete world description. More detailed linguistic information, obtained from sensors that cover the close surroundings, enables the lower level planning and control of the robot motion. A linguistic model is developed by describing the relationships among the entities of the world description. This model is then transformed into the form of rules of motion control. The inexactness of the world description is modeled using the tools of fuzzy set theory, leading to a production system with a fuzzy database and a redundant rule base.

  20. Localization of Non-Linearly Modeled Autonomous Mobile Robots Using Out-of-Sequence Measurements

    PubMed Central

    Besada-Portas, Eva; Lopez-Orozco, Jose A.; Lanillos, Pablo; de la Cruz, Jesus M.

    2012-01-01

    This paper presents a state of the art of the estimation algorithms dealing with Out-of-Sequence (OOS) measurements for non-linearly modeled systems. The state of the art includes a critical analysis of the algorithm properties that takes into account the applicability of these techniques to autonomous mobile robot navigation based on the fusion of the measurements provided, delayed and OOS, by multiple sensors. In addition, it shows a representative example of the use of one of the most computationally efficient approaches in the localization module of the control software of a real robot (which has non-linear dynamics, and linear and non-linear sensors) and compares its performance against other approaches. The simulation results obtained with the selected OOS algorithm show the computational requirements that each sensor of the robot imposes on it. The real experiments show how the inclusion of the selected OOS algorithm in the control software lets the robot successfully navigate in spite of receiving many OOS measurements. Finally, the comparison highlights that not only is the selected OOS algorithm among the best performing ones of the comparison, but it also has the lowest computational and memory cost. PMID:22736962
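
    To make the OOS problem concrete, the sketch below shows one simple (and deliberately naive) strategy: buffer filter snapshots and replay the filter from just before a delayed measurement. The surveyed algorithms are considerably more efficient; the filter interface (x, P, set_state, predict, update) is an assumed placeholder.

        from bisect import bisect_right

        class ReprocessingFilter:
            """Buffers (time, state, covariance, measurement) snapshots and replays the
            filter from just before an out-of-sequence measurement. Assumes `kf` exposes
            x, P, set_state(x, P), predict(dt) and update(z), and that no measurement is
            older than the initial snapshot at t0."""

            def __init__(self, kf, t0):
                self.kf = kf
                self.history = [(t0, kf.x.copy(), kf.P.copy(), None)]

            def add_measurement(self, t, z):
                idx = bisect_right([h[0] for h in self.history], t)
                self.history.insert(idx, (t, None, None, z))
                # Restore the snapshot taken just before time t, then replay forward.
                t_prev, x0, P0, _ = self.history[idx - 1]
                self.kf.set_state(x0, P0)
                for i in range(idx, len(self.history)):
                    ti, _, _, zi = self.history[i]
                    self.kf.predict(ti - t_prev)
                    if zi is not None:
                        self.kf.update(zi)
                    self.history[i] = (ti, self.kf.x.copy(), self.kf.P.copy(), zi)
                    t_prev = ti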

  1. Localization of non-linearly modeled autonomous mobile robots using out-of-sequence measurements.

    PubMed

    Besada-Portas, Eva; Lopez-Orozco, Jose A; Lanillos, Pablo; de la Cruz, Jesus M

    2012-01-01

    This paper presents a state of the art of the estimation algorithms dealing with Out-of-Sequence (OOS) measurements for non-linearly modeled systems. The state of the art includes a critical analysis of the algorithm properties that takes into account the applicability of these techniques to autonomous mobile robot navigation based on the fusion of the measurements provided, delayed and OOS, by multiple sensors. In addition, it shows a representative example of the use of one of the most computationally efficient approaches in the localization module of the control software of a real robot (which has non-linear dynamics, and linear and non-linear sensors) and compares its performance against other approaches. The simulation results obtained with the selected OOS algorithm show the computational requirements that each sensor of the robot imposes on it. The real experiments show how the inclusion of the selected OOS algorithm in the control software lets the robot successfully navigate in spite of receiving many OOS measurements. Finally, the comparison highlights that not only is the selected OOS algorithm among the best performing ones of the comparison, but it also has the lowest computational and memory cost.

  2. Autonomous robotic capture of non-cooperative target by adaptive extended Kalman filter based visual servo

    NASA Astrophysics Data System (ADS)

    Dong, Gangqi; Zhu, Zheng H.

    2016-05-01

    This paper presents a real-time, vision-based algorithm for the pose and motion estimation of non-cooperative targets and its application to a visual servo robotic manipulator performing autonomous capture. A hybrid approach of adaptive extended Kalman filter and photogrammetry is developed for the real-time pose and motion estimation of non-cooperative targets. Based on the pose and motion estimates, the desired pose and trajectory of the end-effector are defined and the corresponding desired joint angles of the robotic manipulator are derived by inverse kinematics. A closed-loop visual servo control scheme is then developed for the robotic manipulator to track, approach and capture the target. Validating experiments are designed and performed on a custom-built six degrees of freedom robotic manipulator with an eye-in-hand configuration. The experimental results demonstrate the feasibility, effectiveness and robustness of the proposed adaptive extended Kalman filter-enabled pose and motion estimation and visual servo strategy.

  3. 3-D world modeling based on combinatorial geometry for autonomous robot navigation

    SciTech Connect

    Goldstein, M.; Pin, F.G.; de Saussure, G.; Weisbin, C.R.

    1987-01-01

    In applications of robotics to surveillance and mapping at nuclear facilities, the scene to be described is fundamentally three-dimensional. Usually, only partial information concerning the 3-D environment is known a priori. Using an autonomous robot, this information may be updated using range data to provide an accurate model of the environment. Range data quantify the distances from the sensor focal plane to the object surface. In other words, the 3-D coordinates of discrete points on the object surface are known. The approach proposed herein for 3-D world modeling is based on the Combinatorial Geometry (C.G.) method, which is widely used in Monte Carlo particle transport calculations. First, each measured point on the object surface is surrounded by a small solid sphere with a radius determined by the range to that point. Then, the 3-D shapes of the visible surfaces are obtained by taking the (Boolean) union of all the spheres. The result is a concise and unambiguous representation of the object's boundary surfaces. The distances from discrete points on the robot's boundary surface to various objects are calculated efficiently using the C.G. type of representation. This feature is particularly useful for navigation purposes. The efficiency of the proposed approach is illustrated by a simulation of a spherical robot navigating in a 3-D room with several static obstacles.
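
    The distance query suggested by a union-of-spheres representation is simple enough to sketch directly: the distance from a query point to the modeled surface is the minimum over all spheres of (distance to the sphere center minus its radius). This is an illustrative reading of the representation, not the report's code.

        import numpy as np

        def distance_to_model(point, centers, radii):
            """point: (3,); centers: (N, 3); radii: (N,). A negative value means the
            query point lies inside one of the spheres."""
            return float(np.min(np.linalg.norm(centers - point, axis=1) - radii))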

  4. Localization of non-linearly modeled autonomous mobile robots using out-of-sequence measurements.

    PubMed

    Besada-Portas, Eva; Lopez-Orozco, Jose A; Lanillos, Pablo; de la Cruz, Jesus M

    2012-01-01

    This paper presents a state of the art of the estimation algorithms dealing with Out-of-Sequence (OOS) measurements for non-linearly modeled systems. The state of the art includes a critical analysis of the algorithm properties that takes into account the applicability of these techniques to autonomous mobile robot navigation based on the fusion of the measurements provided, delayed and OOS, by multiple sensors. In addition, it shows a representative example of the use of one of the most computationally efficient approaches in the localization module of the control software of a real robot (which has non-linear dynamics, and linear and non-linear sensors) and compares its performance against other approaches. The simulation results obtained with the selected OOS algorithm show the computational requirements that each sensor of the robot imposes on it. The real experiments show how the inclusion of the selected OOS algorithm in the control software lets the robot successfully navigate in spite of receiving many OOS measurements. Finally, the comparison highlights that not only is the selected OOS algorithm among the best performing ones of the comparison, but it also has the lowest computational and memory cost. PMID:22736962

  5. Optical 3D laser measurement system for navigation of autonomous mobile robot

    NASA Astrophysics Data System (ADS)

    Básaca-Preciado, Luis C.; Sergiyenko, Oleg Yu.; Rodríguez-Quinonez, Julio C.; García, Xochitl; Tyrsa, Vera V.; Rivas-Lopez, Moises; Hernandez-Balbuena, Daniel; Mercorelli, Paolo; Podrygalo, Mikhail; Gurko, Alexander; Tabakova, Irina; Starostenko, Oleg

    2014-03-01

    In our current research, we are developing a practical autonomous mobile robot navigation system capable of performing obstacle-avoidance tasks in an unknown environment. In this paper, we therefore propose a robot navigation system based on a high-accuracy localization scheme using dynamic triangulation. Our two main ideas are (1) the integration of two principal systems, a 3D laser scanning technical vision system (TVS) and a mobile robot (MR) navigation system, and (2) a novel MR navigation scheme that exploits the advantages of precise triangulation-based localization of obstacles over conventional camera-oriented vision systems. For practical use, mobile robots are required to continue their tasks safely and with high accuracy under temporary occlusion conditions. Prototype II of the TVS presented in this work is significantly improved over prototype I of our previous publications in the aspects of laser ray alignment, parasitic torque reduction, and friction reduction of moving parts. The kinematic model of the MR used in this work is designed for optimal data acquisition from the TVS, with the main goal of obtaining in real time the values required by the kinematic model of the MR immediately during the calculation of obstacles from the TVS data.
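
    For orientation, the sketch below shows the planar triangulation geometry underlying such a scheme: with a known baseline between the laser emitter and the photoreceiver and the two measured angles, the law of sines locates the illuminated point. The angle convention and frame are assumptions for illustration, not the authors' exact formulation.

        import math

        def triangulate(baseline, angle_emitter, angle_receiver):
            """Angles are measured from the baseline at each of its ends, in radians;
            the emitter sits at the origin with the baseline along the x axis."""
            gamma = math.pi - angle_emitter - angle_receiver            # angle at the scanned point
            r = baseline * math.sin(angle_receiver) / math.sin(gamma)   # emitter-to-point range
            return r * math.cos(angle_emitter), r * math.sin(angle_emitter)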

  6. Safety Verification of a Fault Tolerant Reconfigurable Autonomous Goal-Based Robotic Control System

    NASA Technical Reports Server (NTRS)

    Braman, Julia M. B.; Murray, Richard M; Wagner, David A.

    2007-01-01

    Fault tolerance and safety verification of control systems are essential for the success of autonomous robotic systems. A control architecture called Mission Data System (MDS), developed at the Jet Propulsion Laboratory, takes a goal-based control approach. In this paper, a method for converting goal network control programs into linear hybrid systems is developed. The linear hybrid system can then be verified for safety in the presence of failures using existing symbolic model checkers. An example task is simulated in MDS and successfully verified using HyTech, a symbolic model checking software for linear hybrid systems.

  7. Navigation of Autonomous Mobile Robot under Decision-making Strategy tuned by Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Wang, Fei; Kamano, Takuya; Yasuno, Takashi; Suzuki, Takayuki; Harada, Hironobu

    This paper describes a novel application of a genetic algorithm for the navigation of an autonomous mobile robot (AMR) in unknown environments. In the navigation system, the AMR is controlled by the decision-making block, which consists of a neural network. To achieve both successful navigation to the goal and suitable obstacle avoidance, the connection weights of the neural network and the speed gains for predefined actions are encoded as genotypes and are tuned simultaneously by the genetic algorithm so that the static and dynamic danger degrees, the energy consumption, and the distance and direction errors decrease during navigation. Experimental results demonstrate the validity of the proposed navigation system.
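
    A minimal sketch of the encoding idea, under assumed names: the network weights and speed gains are concatenated into one genotype and varied with generic mutation and crossover operators. The fitness terms listed above (danger degrees, energy, distance and direction errors) are not reproduced here.

        import numpy as np

        def make_genotype(weights, speed_gains):
            """Concatenate network weights and speed gains into one parameter vector."""
            return np.concatenate([np.ravel(weights), np.ravel(speed_gains)])

        def mutate(genotype, rate=0.05, scale=0.1, rng=np.random.default_rng()):
            mask = rng.random(genotype.shape) < rate
            return genotype + mask * rng.normal(0.0, scale, genotype.shape)

        def crossover(parent_a, parent_b, rng=np.random.default_rng()):
            point = int(rng.integers(1, len(parent_a)))
            return np.concatenate([parent_a[:point], parent_b[point:]])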

  8. Concept for practical exercises for studying autonomous flying robots in a university environment: part II

    NASA Astrophysics Data System (ADS)

    Gageik, Nils; Dilger, Erik; Montenegro, Sergio; Schön, Stefan; Wildenhein, Rico; Creutzburg, Reiner; Fischer, Arno

    2015-03-01

    The present paper demonstrates the application of quadcopters as educational material for students in aerospace computer science, as is already the case today. The work with quadrotors teaches students theoretical and practical knowledge in the fields of robotics, control theory, aerospace and electrical engineering, as well as embedded programming and computer science. To this end, the material, concept, realization, and outlook of such a course are discussed in this paper. In addition, the paper gives a brief overview of student research projects following the course, which are related to the research and development of fully autonomous quadrotors.

  9. Autonomous global sky monitoring with real-time robotic follow-up

    SciTech Connect

    Vestrand, W Thomas; Davis, H; Wren, J; Wozniak, P; Norman, B; White, R; Bloch, J; Fenimore, E; Hodge, Barry; Jah, Moriba; Rast, Richard

    2008-01-01

    We discuss the development of prototypes for a global grid of advanced 'thinking' sky sentinels and robotic follow-up telescopes that observe the full night sky, providing real-time monitoring that autonomously recognizes anomalous behavior, selects targets for detailed investigation, and enables rapid recognition of and a swift response to transients as they emerge. This T3 global EO grid avoids the limitations imposed by geography and weather to provide persistent monitoring of the night sky.

  10. Implementation of obstacle-avoidance control for an autonomous omni-directional mobile robot based on extension theory.

    PubMed

    Pai, Neng-Sheng; Hsieh, Hung-Hui; Lai, Yi-Chung

    2012-01-01

    The paper demonstrates a following robot with omni-directional wheels, which is able to take action to avoid obstacles. The robot design is based on both fuzzy and extension theory. Fuzzy theory was applied to tune the PWM signal of the motor revolution and to correct path deviation issues encountered while the robot is moving. Extension theory was used to build a robot obstacle-avoidance model. Various mobile models were developed to handle different types of obstacles. The ultrasonic distance sensors mounted on the robot were used to estimate the distance to obstacles. If an obstacle is encountered, the correlation function is evaluated and the robot avoids the obstacle autonomously using the most appropriate mode. The effectiveness of the proposed approach was verified through several tracking experiments, demonstrating the feasibility of a fuzzy path tracker as well as the extensible collision avoidance system. PMID:23202029
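
    For readers unfamiliar with extension theory, the sketch below shows one common simplified form of the correlation (dependent) function evaluated for a measured value against a classical interval and a wider neighborhood interval; the intervals and this particular variant of the formula are illustrative assumptions, not necessarily the form used in the paper.

        def rho(x, a, b):
            """Extension distance of the value x to the interval <a, b>."""
            return abs(x - (a + b) / 2.0) - (b - a) / 2.0

        def correlation(x, classical, neighborhood):
            """K > 0: inside the classical region; -1 < K < 0: between the classical
            and neighborhood regions; K <= -1: outside the neighborhood region."""
            r0 = rho(x, *classical)
            r1 = rho(x, *neighborhood)
            if r1 == r0:                 # degenerate case; avoid division by zero
                return -r0
            return r0 / (r1 - r0)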

  11. Implementation of obstacle-avoidance control for an autonomous omni-directional mobile robot based on extension theory.

    PubMed

    Pai, Neng-Sheng; Hsieh, Hung-Hui; Lai, Yi-Chung

    2012-10-16

    The paper demonstrates a following robot with omni-directional wheels, which is able to take action to avoid obstacles. The robot design is based on both fuzzy and extension theory. Fuzzy theory was applied to tune the PWM signal of the motor revolution and to correct path deviation issues encountered while the robot is moving. Extension theory was used to build a robot obstacle-avoidance model. Various mobile models were developed to handle different types of obstacles. The ultrasonic distance sensors mounted on the robot were used to estimate the distance to obstacles. If an obstacle is encountered, the correlation function is evaluated and the robot avoids the obstacle autonomously using the most appropriate mode. The effectiveness of the proposed approach was verified through several tracking experiments, demonstrating the feasibility of a fuzzy path tracker as well as the extensible collision avoidance system.

  12. Implementation of Obstacle-Avoidance Control for an Autonomous Omni-Directional Mobile Robot Based on Extension Theory

    PubMed Central

    Pai, Neng-Sheng; Hsieh, Hung-Hui; Lai, Yi-Chung

    2012-01-01

    The paper demonstrates a following robot with omni-directional wheels, which is able to take action to avoid obstacles. The robot design is based on both fuzzy and extension theory. Fuzzy theory was applied to tune the PWM signal of the motor revolution and to correct path deviation issues encountered while the robot is moving. Extension theory was used to build a robot obstacle-avoidance model. Various mobile models were developed to handle different types of obstacles. The ultrasonic distance sensors mounted on the robot were used to estimate the distance to obstacles. If an obstacle is encountered, the correlation function is evaluated and the robot avoids the obstacle autonomously using the most appropriate mode. The effectiveness of the proposed approach was verified through several tracking experiments, demonstrating the feasibility of a fuzzy path tracker as well as the extensible collision avoidance system. PMID:23202029

  13. Demonstration of a Spoken Dialogue Interface for Planning Activities of a Semi-autonomous Robot

    NASA Technical Reports Server (NTRS)

    Dowding, John; Frank, Jeremy; Hockey, Beth Ann; Jonsson, Ari; Aist, Gregory

    2002-01-01

    Planning and scheduling in the face of uncertainty and change pushes the capabilities of both planning and dialogue technologies by requiring complex negotiation to arrive at a workable plan. Planning for use of semi-autonomous robots involves negotiation among multiple participants with competing scientific and engineering goals to co-construct a complex plan. In NASA applications this plan construction is done under severe time pressure so having a dialogue interface to the plan construction tools can aid rapid completion of the process. But, this will put significant demands on spoken dialogue technology, particularly in the areas of dialogue management and generation. The dialogue interface will need to be able to handle the complex dialogue strategies that occur in negotiation dialogues, including hypotheticals and revisions, and the generation component will require an ability to summarize complex plans. This demonstration will describe a work in progress towards building a spoken dialogue interface to the EUROPA planner for the purposes of planning and scheduling the activities of a semi-autonomous robot. A prototype interface has been built for planning the schedule of the Personal Satellite Assistant (PSA), a mobile robot designed for micro-gravity environments that is intended for use on the Space Shuttle and International Space Station. The spoken dialogue interface gives the user the capability to ask for a description of the plan, ask specific questions about the plan, and update or modify the plan. We anticipate that a spoken dialogue interface to the planner will provide a natural augmentation or alternative to the visualization interface, in situations in which the user needs very targeted information about the plan, in situations where natural language can express complex ideas more concisely than GUI actions, or in situations in which a graphical user interface is not appropriate.

  14. Recognition of 3D objects for autonomous mobile robot's navigation in automated shipbuilding

    NASA Astrophysics Data System (ADS)

    Lee, Hyunki; Cho, Hyungsuck

    2007-10-01

    Nowadays many parts of the shipbuilding process are automated, but the painting process is not, because of the difficulty of automated on-line painting quality measurement, the harsh painting environment, and the difficulty of robot navigation. However, painting automation is necessary because it can provide consistent paint film thickness. Furthermore, autonomous mobile robots are strongly required for flexible painting work. The main problem for autonomous mobile robot navigation is that there are many obstacles which are not represented in the CAD data. To overcome this problem, obstacle detection and recognition are necessary to avoid obstacles and perform painting work effectively. Many object recognition algorithms have been studied, especially 2D object recognition methods using intensity images. However, in our case environmental illumination does not exist, so these methods cannot be used. Instead, 3D range data must be used, but the problems with 3D range data are its high computational cost and the long recognition time caused by the huge database. In this paper, we propose a 3D object recognition algorithm based on PCA (Principal Component Analysis) and NN (Neural Network). The novelty of the algorithm is that the measured 3D range data are transformed into intensity information, and the PCA and NN algorithms are then applied to this transformed information to reduce the processing time and make the data easier to handle, which are shortcomings of previous 3D object recognition research. A set of experimental results is shown to verify the effectiveness of the proposed algorithm.
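
    As a minimal sketch of the PCA stage described above (the neural-network classifier is omitted), the range-derived intensity images are flattened, centered, and projected onto a low-dimensional eigenspace; the function and variable names are assumptions.

        import numpy as np

        def fit_pca(samples, n_components):
            """samples: (N, D) flattened intensity images. Returns the mean image and
            the first n_components principal directions (rows of the basis)."""
            mean = samples.mean(axis=0)
            _, _, Vt = np.linalg.svd(samples - mean, full_matrices=False)
            return mean, Vt[:n_components]

        def project(sample, mean, basis):
            """Low-dimensional feature vector fed to the classifier."""
            return basis @ (sample - mean)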

  15. Robust Planning for Autonomous Navigation of Mobile Robots in Unstructured, Dynamic Environments: An LDRD Final Report

    SciTech Connect

    EISLER, G. RICHARD

    2002-08-01

    This report summarizes the analytical and experimental efforts for the Laboratory Directed Research and Development (LDRD) project entitled ''Robust Planning for Autonomous Navigation of Mobile Robots In Unstructured, Dynamic Environments (AutoNav)''. The project goal was to develop an algorithmic-driven, multi-spectral approach to point-to-point navigation characterized by: segmented on-board trajectory planning, self-contained operation without human support for mission duration, and the development of appropriate sensors and algorithms to navigate unattended. The project was partially successful in achieving gains in sensing, path planning, navigation, and guidance. One of three experimental platforms, the Minimalist Autonomous Testbed, used a repetitive sense-and-re-plan combination to demonstrate the majority of elements necessary for autonomous navigation. However, a critical goal for overall success in arbitrary terrain, that of developing a sensor that is able to distinguish true obstacles that need to be avoided as a function of vehicle scale, still needs substantial research to bring to fruition.

  16. Towards Autonomous Inspection of Space Systems Using Mobile Robotic Sensor Platforms

    NASA Technical Reports Server (NTRS)

    Wong, Edmond; Saad, Ashraf; Litt, Jonathan S.

    2007-01-01

    The space transportation systems required to support NASA's Exploration Initiative will demand a high degree of reliability to ensure mission success. This reliability can be realized through autonomous fault/damage detection and repair capabilities. It is crucial that such capabilities are incorporated into these systems since it will be impractical to rely upon Extra-Vehicular Activity (EVA), visual inspection or tele-operation due to the costly, labor-intensive and time-consuming nature of these methods. One approach to achieving this capability is through the use of an autonomous inspection system comprised of miniature mobile sensor platforms that will cooperatively perform high confidence inspection of space vehicles and habitats. This paper will discuss the efforts to develop a small scale demonstration test-bed to investigate the feasibility of using autonomous mobile sensor platforms to perform inspection operations. Progress will be discussed in technology areas including: the hardware implementation and demonstration of robotic sensor platforms, the implementation of a hardware test-bed facility, and the investigation of collaborative control algorithms.

  17. GNC architecture for autonomous robotic capture of a non-cooperative target: Preliminary concept design

    NASA Astrophysics Data System (ADS)

    Jankovic, Marko; Paul, Jan; Kirchner, Frank

    2016-04-01

    Recent studies of the space debris population in low Earth orbit (LEO) have concluded that certain regions have already reached a critical density of objects. This will eventually lead to a cascading process called the Kessler syndrome. The time may have come to seriously consider active debris removal (ADR) missions as the only viable way of preserving the space environment for future generations. Among all objects in the current environment, the SL-8 (Kosmos 3M second stages) rocket bodies (R/Bs) are some of the most suitable targets for future robotic ADR missions. However, to date, autonomous relative navigation to and capture of a non-cooperative target have never been performed. Therefore, there is a need for more advanced, autonomous and modular systems that can cope with uncontrolled, tumbling objects. The guidance, navigation and control (GNC) system is one of the most critical ones. The main objective of this paper is to present a preliminary concept of a modular GNC architecture that should enable a safe and fuel-efficient capture of a known but uncooperative target, such as a Kosmos 3M R/B. In particular, the concept was developed having in mind the most critical part of an ADR mission, i.e. close range proximity operations, and state of the art algorithms in the field of autonomous rendezvous and docking. Finally, a brief description is given of the hardware-in-the-loop (HIL) testing facility foreseen for the practical evaluation of the developed architecture.

  18. Autonomous charging to enable long-endurance missions for small aerial robots

    NASA Astrophysics Data System (ADS)

    Mulgaonkar, Yash; Kumar, Vijay

    2014-06-01

    The past decade has seen an increased interest towards research involving Autonomous Micro Aerial Vehicles (MAVs). The predominant reason for this is their agility and ability to perform tasks too difficult or dangerous for their human counterparts and to navigate into places where ground robots cannot reach. Among MAVs, rotary wing aircraft such as quadrotors have the ability to operate in confined spaces, hover at a given point in space and perch or land on a flat surface. This makes the quadrotor a very attractive aerial platform giving rise to a myriad of research opportunities. The potential of these aerial platforms is severely limited by the constraints on the flight time due to limited battery capacity. This in turn arises from limits on the payload of these rotorcraft. By automating the battery recharging process, creating autonomous MAVs that can recharge their on-board batteries without any human intervention and by employing a team of such agents, the overall mission time can be greatly increased. This paper describes the development, testing, and implementation of a system of autonomous charging stations for a team of Micro Aerial Vehicles. This system was used to perform fully autonomous long-term multi-agent aerial surveillance experiments with persistent station keeping. The scalability of the algorithm used in the experiments described in this paper was also tested by simulating a persistence surveillance scenario for 10 MAVs and charging stations. Finally, this system was successfully implemented to perform a 9½-hour multi-agent persistent flight test. Preliminary implementation of this charging system in experiments involving construction of cubic structures with quadrotors showed a three-fold increase in effective mission time.

  19. A field robot for autonomous laser-based N2O flux measurements

    NASA Astrophysics Data System (ADS)

    Molstad, Lars; Reent Köster, Jan; Bakken, Lars; Dörsch, Peter; Lien, Torgrim; Overskeid, Øyvind; Utstumo, Trygve; Løvås, Daniel; Brevik, Anders

    2014-05-01

    N2O measurements in multi-plot field trials are usually carried out by chamber-based manual gas sampling and subsequent laboratory-based gas chromatographic N2O determination. Spatial and temporal resolution of these measurements are commonly limited by available manpower. However, high spatial and temporal variability of N2O fluxes within individual field plots can add large uncertainties to time- and area-integrated flux estimates. Detailed mapping of this variability would improve these estimates, as well as help our understanding of the factors causing N2O emissions. An autonomous field robot was developed to increase the sampling frequency and to operate outside normal working hours. The base of this system was designed as an open platform able to carry versatile instrumentation. It consists of an electrically motorized platform powered by a lithium-ion battery pack, which is capable of autonomous navigation by means of a combined high precision real-time kinematic (RTK) GPS and an inertial measurement unit (IMU) system. On this platform an elevator is mounted, carrying a lateral boom with a static chamber on each side of the robot. Each chamber is equipped with a frame of plastic foam to seal the chamber when lowered onto the ground by the elevator. N2O flux from the soil covered by the two chambers is sequentially determined by circulating air between each chamber and a laser spectrometer (DLT-100, Los Gatos Research, Mountain View, CA, USA), which monitors the increase in N2O concentration. The target enclosure time is 1 - 2 minutes, but may be longer when emissions are low. CO2 concentrations are determined by a CO2/H2O gas analyzer (LI-840A, LI-COR Inc., Lincoln, NE, USA). Air temperature and air pressure inside both chambers are continuously monitored and logged. Wind speed and direction are monitored by a 3D sonic anemometer on top of the elevator boom. This autonomous field robot can operate during day and night time, and its working hours are only
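
    For context, the sketch below shows a standard static-chamber flux calculation of the kind such a system performs downstream of the spectrometer: fit the N2O mole-fraction rise over the enclosure time, then convert the slope to an areal flux with the ideal gas law. It is a generic illustration, not the authors' processing code.

        import numpy as np

        def chamber_flux(times_s, n2o_ppb, volume_m3, area_m2, temp_k, pressure_pa):
            """Returns the flux in micrograms of N2O-N per square metre per hour."""
            slope_ppb_per_s = np.polyfit(times_s, n2o_ppb, 1)[0]   # linear fit of the concentration rise
            mole_fraction_rate = slope_ppb_per_s * 1e-9
            air_molar_density = pressure_pa / (8.314 * temp_k)     # mol of air per m3 (ideal gas law)
            mol_n2o_per_m2_s = mole_fraction_rate * air_molar_density * volume_m3 / area_m2
            g_n_per_mol_n2o = 28.0                                 # two N atoms per N2O molecule
            return mol_n2o_per_m2_s * g_n_per_mol_n2o * 1e6 * 3600.0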

  20. Self-localization for an autonomous mobile robot based on an omni-directional vision system

    NASA Astrophysics Data System (ADS)

    Chiang, Shu-Yin; Lin, Kuang-Yu; Chia, Tsorng-Lin

    2013-12-01

    In this study, we designed an autonomous mobile robot based on the rules of the Federation of International Robot-soccer Association (FIRA) RoboSot category, integrating the techniques of computer vision, real-time image processing, dynamic target tracking, wireless communication, self-localization, motion control, path planning, and control strategy to achieve the contest goal. The self-localization scheme of the mobile robot is based on the algorithms featured in the images from its omni-directional vision system. In previous works, we used the image colors of the field goals as reference points, combining either dual-circle or trilateration positioning of the reference points to achieve self-localization of the autonomous mobile robot. However, because the image of the game field is easily affected by ambient light, positioning systems exclusively based on color model algorithms cause errors. To reduce environmental effects and achieve the self-localization of the robot, the proposed algorithm is applied in assessing the corners of field lines by using an omni-directional vision system. Particularly in the mid-size league of the RoboCup soccer competition, self-localization algorithms based on extracting white lines from the soccer field have become increasingly popular. Moreover, white lines are less influenced by light than is the color model of the goals. Therefore, we propose an algorithm that transforms the omni-directional image into an unwrapped transformed image, enhancing the extraction features. The process is described as follows: First, radial scan-lines were used to process omni-directional images, reducing the computational load and improving system efficiency. The lines were radially arranged around the center of the omni-directional camera image, resulting in a shorter computational time compared with the traditional Cartesian coordinate system. However, the omni-directional image is a distorted image, which makes it difficult to recognize the
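
    As a minimal sketch of the unwrapping step described above (not the authors' implementation), the omni-directional image is sampled along radial lines around the mirror center to form a panoramic angle-by-radius image; the center, radius range, and output width are assumed parameters.

        import cv2
        import numpy as np

        def unwrap_omni(img, cx, cy, r_min, r_max, out_w=720):
            """Sample the omni-directional image along radial lines around (cx, cy) to
            build a panoramic (angle x radius) image."""
            angles = np.linspace(0.0, 2.0 * np.pi, out_w, endpoint=False)
            radii = np.arange(r_min, r_max)
            theta, r = np.meshgrid(angles, radii)              # shape: (r_max - r_min, out_w)
            map_x = (cx + r * np.cos(theta)).astype(np.float32)
            map_y = (cy + r * np.sin(theta)).astype(np.float32)
            return cv2.remap(img, map_x, map_y, cv2.INTER_LINEAR)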

  1. An algorithm for image clusters detection and identification based on color for an autonomous mobile robot

    SciTech Connect

    Uy, D.L.

    1996-02-01

    An algorithm for the detection and identification of image clusters or "blobs" based on color information for an autonomous mobile robot is developed. The input image data are first processed using a crisp color fuzzifier, a binary smoothing filter, and a median filter. The processed image data are then input to the image cluster detection and identification program. The program employs the concept of an "elastic rectangle" that stretches in such a way that the whole blob is finally enclosed in a rectangle. A C program was developed to test the algorithm. The algorithm was tested only on 8x8 image data containing different numbers of blobs. The algorithm works very well in detecting and identifying image clusters.
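
    As a rough reconstruction of the "elastic rectangle" idea (an illustration, not the original C program), the sketch below grows a bounding box around a seed pixel while same-colored pixels still touch its border, so the whole blob ends up enclosed.

        import numpy as np

        def elastic_rectangle(mask, seed):
            """mask: 2D boolean array for one color class; seed: (row, col) inside a blob.
            The box stretches one step at a time while blob pixels touch its border."""
            top, bottom = seed[0], seed[0]
            left, right = seed[1], seed[1]
            grew = True
            while grew:
                grew = False
                if top > 0 and mask[top - 1, left:right + 1].any():
                    top -= 1
                    grew = True
                if bottom < mask.shape[0] - 1 and mask[bottom + 1, left:right + 1].any():
                    bottom += 1
                    grew = True
                if left > 0 and mask[top:bottom + 1, left - 1].any():
                    left -= 1
                    grew = True
                if right < mask.shape[1] - 1 and mask[top:bottom + 1, right + 1].any():
                    right += 1
                    grew = True
            return top, bottom, left, right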

  2. Autonomous trajectory generation for mobile robots with non-holonomic and steering angle constraints

    SciTech Connect

    Pin, F.G.; Vasseur, H.A.

    1990-01-01

    This paper presents an approach to the trajectory planning of mobile platforms characterized by non-holonomic constraints and constraints on the steering angle and steering angle rate. The approach is based on geometric reasoning and provides deterministic trajectories for all pairs of initial and final configurations (position x, y, and orientation {theta}) of the robot. Furthermore, the method generates trajectories taking into account the forward and reverse mode of motion of the vehicle, or combination of these when complex maneuvering is involved or when the environment is obstructed with obstacles. The trajectory planning algorithm is described, and examples of trajectories generated for a variety of environmental conditions are presented. The generation of the trajectories only takes a few milliseconds of run time on a micro Vax, making the approach quite attractive for use as a real-time motion planner for teleoperated or sensor-based autonomous vehicles in complex environments. 10 refs., 11 figs.

  3. Portable robot for autonomous venipuncture using 3D near infrared image guidance

    PubMed Central

    Chen, Alvin; Nikitczuk, Kevin; Nikitczuk, Jason; Maguire, Tim; Yarmush, Martin

    2015-01-01

    Venipuncture is pivotal to a wide range of clinical interventions and is consequently the leading cause of medical injury in the U.S. Complications associated with venipuncture are exacerbated in difficult settings, where the rate of success depends heavily on the patient's physiology and the practitioner's experience. In this paper, we describe a device that improves the accuracy and safety of the procedure by autonomously establishing a peripheral line for blood draws and IVs. The device combines a near-infrared imaging system, computer vision software, and a robotically driven needle within a portable shell. The device operates by imaging and mapping in real time the 3D spatial coordinates of subcutaneous veins in order to direct the needle into a designated vein. We demonstrate proof of concept by assessing imaging performance in humans and cannulation accuracy on an advanced phlebotomy training model. PMID:26120592

  4. A Survey on Terrain Assessment Techniques for Autonomous Operation of Planetary Robots

    NASA Astrophysics Data System (ADS)

    Sancho-Pradel, D. L.; Gao, Y.

    A key challenge in autonomous planetary surface exploration is the extraction of meaningful information from sensor data, which would allow a good interpretation of the nearby terrain, and a reasonable assessment of more distant areas. In the last decade, the desire to increase the autonomy of unmanned ground vehicles (UGVs), particularly in terms of off-road navigation, has significantly increased the interest in the field of automated terrain classification. Although the field is relatively new, its advances and goals are scattered across different robotic platforms and applications. The objective of this paper is to present a survey of the field from a planetary exploration perspective, bringing together the underlying techniques, existing approaches and relevant applications under a common framework. The aim is to provide a comprehensive overview to the newcomer in the field, and a structured reference for the practitioners.

  5. Multiresolutional schemata for unsupervised learning of autonomous robots for 3D space operation

    NASA Technical Reports Server (NTRS)

    Lacaze, Alberto; Meystel, Michael; Meystel, Alex

    1994-01-01

    This paper describes a novel approach to the development of a learning control system for an autonomous space robot (ASR), which presents the ASR as a 'baby' -- that is, a system with no a priori knowledge of the world in which it operates, but with behavior acquisition techniques that allow it to build this knowledge from the experiences of actions within a particular environment (we will call it an Astro-baby). The learning techniques are rooted in the recursive algorithm for inductive generation of nested schemata molded from processes of early cognitive development in humans. The algorithm extracts data from the environment and, by means of correlation and abduction, creates schemata that are used for control. This system is robust enough to deal with a constantly changing environment because such changes provoke the creation of new schemata by generalizing from experiences, while still maintaining minimal computational complexity, thanks to the system's multiresolutional nature.

  6. A Monocular Vision Sensor-Based Obstacle Detection Algorithm for Autonomous Robots.

    PubMed

    Lee, Tae-Jae; Yi, Dong-Hoon; Cho, Dong-Il Dan

    2016-03-01

    This paper presents a monocular vision sensor-based obstacle detection algorithm for autonomous robots. Each individual image pixel at the bottom region of interest is labeled as belonging either to an obstacle or the floor. While conventional methods depend on point tracking for geometric cues for obstacle detection, the proposed algorithm uses the inverse perspective mapping (IPM) method. This method is much more advantageous when the camera is not high off the floor, which makes point tracking near the floor difficult. Markov random field-based obstacle segmentation is then performed using the IPM results and a floor appearance model. Next, the shortest distance between the robot and the obstacle is calculated. The algorithm is tested by applying it to 70 datasets, 20 of which include nonobstacle images where considerable changes in floor appearance occur. The obstacle segmentation accuracies and the distance estimation error are quantitatively analyzed. For obstacle datasets, the segmentation precision and the average distance estimation error of the proposed method are 81.4% and 1.6 cm, respectively, whereas those for a conventional method are 57.5% and 9.9 cm, respectively. For nonobstacle datasets, the proposed method gives 0.0% false positive rates, while the conventional method gives 17.6%.
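
    For readers unfamiliar with inverse perspective mapping, the sketch below warps the camera image to a top-down view via a ground-plane homography estimated from four floor points; the point correspondences and output size are placeholders, and this is only an illustration of the general IPM idea, not the paper's pipeline.

        import cv2
        import numpy as np

        def inverse_perspective_map(img, image_pts, ground_pts, out_size=(400, 400)):
            """image_pts: four pixel coordinates of points known to lie on the floor;
            ground_pts: the same points in the top-down output frame (pixels). Pixels
            that violate the ground-plane assumption (obstacles) appear distorted in
            the warped view, which is the cue exploited by IPM-based detectors."""
            H = cv2.getPerspectiveTransform(np.float32(image_pts), np.float32(ground_pts))
            return cv2.warpPerspective(img, H, out_size)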

  7. A Monocular Vision Sensor-Based Obstacle Detection Algorithm for Autonomous Robots.

    PubMed

    Lee, Tae-Jae; Yi, Dong-Hoon; Cho, Dong-Il Dan

    2016-01-01

    This paper presents a monocular vision sensor-based obstacle detection algorithm for autonomous robots. Each individual image pixel at the bottom region of interest is labeled as belonging either to an obstacle or the floor. While conventional methods depend on point tracking for geometric cues for obstacle detection, the proposed algorithm uses the inverse perspective mapping (IPM) method. This method is much more advantageous when the camera is not high off the floor, which makes point tracking near the floor difficult. Markov random field-based obstacle segmentation is then performed using the IPM results and a floor appearance model. Next, the shortest distance between the robot and the obstacle is calculated. The algorithm is tested by applying it to 70 datasets, 20 of which include nonobstacle images where considerable changes in floor appearance occur. The obstacle segmentation accuracies and the distance estimation error are quantitatively analyzed. For obstacle datasets, the segmentation precision and the average distance estimation error of the proposed method are 81.4% and 1.6 cm, respectively, whereas those for a conventional method are 57.5% and 9.9 cm, respectively. For nonobstacle datasets, the proposed method gives 0.0% false positive rates, while the conventional method gives 17.6%. PMID:26938540

  8. Robotic-Controlled, Autonomous Friction Stir Welding Processes for In-Situ Fabrication, Maintenance, and Repair

    NASA Astrophysics Data System (ADS)

    Zhou, W.

    NASA's new vision of human and robotic missions to the Moon, Mars, and beyond will demand large and permanent infrastructures on the Moon and other planets, including power plants, communication towers, human and biomass habitats, launch and landing facilities, fabrication and repair workshops, and research facilities, so that material utilization and product development can be carried out and subsisted in-situ. The conventional approach of transporting pre-constructed and fabricated structures from Earth to the Moon and planets will no longer be feasible due to limited lifting capacity and the extremely high transportation costs associated with long-duration space travel. To minimize transport of pre-made large structures between Earth and the Moon and planets, to minimize crew time for the fabrication and assembly of infrastructures on the Moon and planets, and to assure crew safety and maintain quality during the operation, there is a strong need for robotic capabilities for in-situ fabrication, maintenance, and repair. Clearly, development of innovative autonomous in-situ fabrication, maintenance, and repair technologies is crucial to the success of both NASA's unmanned preparation missions and manned exploration missions. In-space material joining is not new to NASA. Many lessons were learned from NASA's International Space Welding Experiment, which employed the electron beam welding process for space welding experiments. Significant safety concerns related to high-energy beams, arcing, spatter, electromagnetic fields, and molten particles were

  9. A Monocular Vision Sensor-Based Obstacle Detection Algorithm for Autonomous Robots

    PubMed Central

    Lee, Tae-Jae; Yi, Dong-Hoon; Cho, Dong-Il “Dan”

    2016-01-01

    This paper presents a monocular vision sensor-based obstacle detection algorithm for autonomous robots. Each individual image pixel at the bottom region of interest is labeled as belonging either to an obstacle or the floor. While conventional methods depend on point tracking for geometric cues for obstacle detection, the proposed algorithm uses the inverse perspective mapping (IPM) method. This method is much more advantageous when the camera is not high off the floor, which makes point tracking near the floor difficult. Markov random field-based obstacle segmentation is then performed using the IPM results and a floor appearance model. Next, the shortest distance between the robot and the obstacle is calculated. The algorithm is tested by applying it to 70 datasets, 20 of which include nonobstacle images where considerable changes in floor appearance occur. The obstacle segmentation accuracies and the distance estimation error are quantitatively analyzed. For obstacle datasets, the segmentation precision and the average distance estimation error of the proposed method are 81.4% and 1.6 cm, respectively, whereas those for a conventional method are 57.5% and 9.9 cm, respectively. For nonobstacle datasets, the proposed method gives 0.0% false positive rates, while the conventional method gives 17.6%. PMID:26938540

  10. Teaching and implementing autonomous robotic lab walkthroughs in a biotech laboratory through model-based visual tracking

    NASA Astrophysics Data System (ADS)

    Wojtczyk, Martin; Panin, Giorgio; Röder, Thorsten; Lenz, Claus; Nair, Suraj; Heidemann, Rüdiger; Goudar, Chetan; Knoll, Alois

    2010-01-01

    After more than 30 years of robots being used for classic industrial automation applications, service robots form a constantly growing market, although the big breakthrough is still awaited. Our approach to service robots was driven by the idea of supporting lab personnel in a biotechnology laboratory. After initial development in Germany, a mobile robot platform, extended with an industrial manipulator and the necessary sensors for indoor localization and object manipulation, has been shipped to Bayer HealthCare in Berkeley, CA, USA, a global player in the sector of biopharmaceutical products located in the San Francisco Bay Area. The goal of the mobile manipulator is to support the off-shift staff in carrying out completely autonomous or guided, remote-controlled lab walkthroughs, which we implement using a recent development of our computer vision group: OpenTL -- an integrated framework for model-based visual tracking.

  11. On the design of neuro-controllers for individual and social learning behaviour in autonomous robots: an evolutionary approach

    NASA Astrophysics Data System (ADS)

    Pini, Giovanni; Tuci, Elio

    2008-06-01

    In biology/psychology, the capability of natural organisms to learn from observation of, and interaction with, conspecifics is referred to as social learning. Roboticists have recently developed an interest in social learning, since it might represent an effective strategy to enhance the adaptivity of a team of autonomous robots. In this study, we show that a methodological approach based on artificial neural networks shaped by evolutionary computation techniques can be successfully employed to synthesise the individual and social learning mechanisms for robots required to learn a desired action (i.e. phototaxis or antiphototaxis).

  12. Intelligent behavior generator for autonomous mobile robots using planning-based AI decision making and supervisory control logic

    NASA Astrophysics Data System (ADS)

    Shah, Hitesh K.; Bahl, Vikas; Martin, Jason; Flann, Nicholas S.; Moore, Kevin L.

    2002-07-01

    In earlier research, the Center for Self-Organizing and Intelligent Systems (CSOIS) at Utah State University (USU) was funded by the US Army Tank-Automotive and Armaments Command's (TACOM) Intelligent Mobility Program to develop and demonstrate enhanced mobility concepts for unmanned ground vehicles (UGVs). One of the several outgrowths of this work has been the development of a grammar-based approach to intelligent behavior generation for commanding autonomous robotic vehicles. In this paper we describe the use of this grammar for enabling autonomous behaviors. A supervisory task controller (STC) sequences high-level action commands (taken from the grammar) to be executed by the robot. It takes as input a set of goals and a partial (static) map of the environment and produces, from the grammar, a flexible script (or sequence) of the high-level commands that are to be executed by the robot. The sequence is derived by a planning function that uses a graph-based heuristic search (the A* algorithm). Each action command has specific exit conditions that are evaluated by the STC following each task completion or interruption (in the case of disturbances or new operator requests). Depending on the system's state at task completion or interruption (including updated environmental and robot sensor information), the STC invokes a reactive response. This can include re-sequencing the pending tasks or initiating a re-planning event, if necessary. Though applicable to a wide variety of autonomous robots, an application of this approach is demonstrated via simulations of ODIS, an omni-directional inspection system developed for security applications.
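
    The following sketch illustrates the general plan-execute-replan pattern described above: an A* search over a task graph produces a sequence of high-level commands, and the controller re-plans when a command's exit condition is not met. The graph, command names, and exit-condition stub are hypothetical and are not taken from the ODIS grammar.

    ```python
    # Minimal sketch (hypothetical task graph and commands, not the ODIS
    # grammar): a supervisory task controller that uses A* to order high-level
    # commands and re-plans when a command's exit condition reports failure.
    import heapq

    # Hypothetical world: nodes are named locations with 2-D coordinates.
    NODES = {"dock": (0, 0), "gate": (2, 0), "car_A": (4, 1), "car_B": (4, -1)}
    EDGES = {"dock": ["gate"], "gate": ["car_A", "car_B"], "car_A": ["car_B"], "car_B": []}

    def h(a, b):  # straight-line heuristic
        (ax, ay), (bx, by) = NODES[a], NODES[b]
        return ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5

    def a_star(start, goal):
        """Return a list of nodes from start to goal, or None."""
        frontier = [(h(start, goal), 0.0, start, [start])]
        seen = set()
        while frontier:
            _, cost, node, path = heapq.heappop(frontier)
            if node == goal:
                return path
            if node in seen:
                continue
            seen.add(node)
            for nxt in EDGES[node]:
                if nxt not in seen:
                    c = cost + h(node, nxt)
                    heapq.heappush(frontier, (c + h(nxt, goal), c, nxt, path + [nxt]))
        return None

    def execute(command):
        """Stand-in for a robot primitive; returns True if exit conditions are met."""
        print("executing:", command)
        return True  # a real STC would evaluate sensor-based exit conditions here

    def supervisory_controller(start, goal):
        plan = a_star(start, goal)
        while plan:
            node = plan.pop(0)
            if not execute(f"GOTO {node}"):
                plan = a_star(node, goal)  # re-plan from the interruption point
        print("mission complete")

    supervisory_controller("dock", "car_B")
    ```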

  13. Evaluation of the autonomic response in healthy subjects during treadmill training with assistance of a robot-driven gait orthosis.

    PubMed

    Magagnin, Valentina; Porta, Alberto; Fusini, Laura; Licari, Vittorio; Bo, Ivano; Turiel, Maurizio; Molteni, Franco; Cerutti, Sergio; Caiani, Enrico G

    2009-04-01

    Body-weight-supported treadmill training assisted by a robot-driven gait orthosis is an emerging clinical tool that helps restore gait in individuals with loss of motor skills. However, the autonomic response during this rehabilitation protocol is not known. The aim of the study was to evaluate the autonomic response during a routine protocol of motor rehabilitation through spectral and symbolic analyses of short-term heart rate variability in a group of 20 healthy subjects (11 men, mean age 25 +/- 3.8 years). The protocol included the following phases: (1) sitting position; (2) standing position; (3) suspension during subject instrumentation; (4 and 5) robot-assisted treadmill locomotion at 1.5 km/h and 2.5 km/h, respectively, with partial body weight support; (6) standing recovery after exercise. Results showed a significant tachycardia associated with a reduction in variance during the suspended phase of the protocol compared to the sitting position. Spectral analysis did not demonstrate any significant autonomic response during the entire protocol, while symbolic analysis detected an increase in sympathetic modulation during body suspension and an increase in vagal modulation during walking. These results could be used to improve understanding of the cardiovascular effects of rehabilitation in subjects undergoing robot-driven gait orthosis treadmill training.
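
    For readers unfamiliar with spectral indices of heart rate variability, the snippet below shows one conventional way to compute LF and HF band powers from an RR-interval series (resample the tachogram, Welch periodogram, integrate the 0.04-0.15 Hz and 0.15-0.4 Hz bands). It is a generic illustration with synthetic data, not the study's exact spectral or symbolic pipeline.

    ```python
    # Generic illustration (not the study's exact pipeline): compute LF and HF
    # spectral power of a short-term RR-interval series, the kind of indices
    # used to track sympathetic/vagal modulation across protocol phases.
    import numpy as np
    from scipy.signal import welch

    rng = np.random.default_rng(0)
    rr = 0.8 + 0.02 * rng.standard_normal(300)          # synthetic RR intervals [s]

    t = np.cumsum(rr)                                   # beat times
    fs = 4.0                                            # resample tachogram at 4 Hz
    t_even = np.arange(t[0], t[-1], 1.0 / fs)
    rr_even = np.interp(t_even, t, rr)                  # evenly sampled tachogram

    f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=256)
    lf_band = (f >= 0.04) & (f < 0.15)
    hf_band = (f >= 0.15) & (f < 0.40)
    lf = np.trapz(pxx[lf_band], f[lf_band])
    hf = np.trapz(pxx[hf_band], f[hf_band])
    print(f"LF={lf:.2e} s^2, HF={hf:.2e} s^2, LF/HF={lf / hf:.2f}")
    ```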

  14. Remote Sensing of Radiation Dose Rate by a Robot for Outdoor Usage

    NASA Astrophysics Data System (ADS)

    Kobayashi, T.; Doi, K.; Kanematsu, H.; Utsumi, Y.; Hashimoto, R.; Takashina, T.

    2013-04-01

    In the present paper, the design and prototyping of a telemetry system, in which a GPS receiver, a camera, and a scintillation counter were mounted on a crawler-type traveling vehicle, were conducted targeting outdoor usage such as a school playground. The results were as follows: (1) It was confirmed that the crawler-type traveling vehicle can be operated smoothly on school grounds of brick and asphalt (running speed: 17 m/min). (2) It was confirmed that the location information captured by GPS can be displayed on a Google map, and that the recorded video can also be played back. (3) A radiation dose rate of 0.09 μSv/h was measured on the ground; this value is less than 1/40 of the 3.8 μSv/h radiation dose rate allowed for children in Fukushima Prefecture. (4) As further work, modification for programmed traveling, measurement of the distribution of the radiation dose rate at a school in Fukushima Prefecture, and class delivery on radiation measurement will be carried out.

  15. Autonomous Scheduling of the 1.3-meter Robotically Controlled Telescope (RCT)

    NASA Astrophysics Data System (ADS)

    Strolger, Louis-Gregory; Gelderman, Richard; Carini, Michael T.; Davis, Donald R.; Engle, Scott G.; Guinan, Edward F.; McGruder, Charles H., III; Tedesco, Edward F.; Walter, Donald K.

    2011-03-01

    The 1.3-meter telescope at Kitt Peak operates as a fully robotic instrument for optical imaging. An autonomous scheduling algorithm is an essential component of this observatory and has been designed to manage numerous requests in various imaging modes in a manner similar to how requests are managed at queue-scheduled observatories, but with greater efficiency. Built from the INSGEN list generator and process spawner originally developed for the Berkeley Automatic Imaging Telescope, the RCT scheduler manages and integrates multi-user observations in real time, according to target and exposure information and program-specific constraints (e.g., user-assigned priority, moon avoidance, airmass, or temporal constraints), while accounting for instrument limitations, meteorological conditions, and other technical constraints. The robust system supports time-critical requests, such as coordinated observations, while also providing short-term (hours) and long-term (days) monitoring capabilities, and one-off observations. We discuss the RCT scheduler, its current decision tree, and future prospects, including integration with active partner-share monitoring (which factors into future observation requests) to ensure fairness and parity of requests.
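
    The sketch below shows, schematically, the kind of constraint filtering and ranking such a scheduler performs whenever the telescope becomes free: discard requests that violate airmass or moon-separation limits, then favour time-critical requests and higher user-assigned priorities. The request fields, limits, and tie-breaking order are invented for illustration and do not reproduce the INSGEN-based RCT decision tree.

    ```python
    # Hypothetical sketch of constraint filtering and ranking in a queue
    # scheduler; it is not the INSGEN-based RCT scheduler itself, and all
    # request fields and limits are invented.
    from dataclasses import dataclass

    @dataclass
    class Request:
        target: str
        priority: int        # higher = more important (user assigned)
        airmass: float       # current airmass of the target
        moon_sep_deg: float  # current angular distance from the Moon
        time_critical: bool  # coordinated observations jump the queue

    def observable(r, max_airmass=2.0, min_moon_sep=30.0):
        return r.airmass <= max_airmass and r.moon_sep_deg >= min_moon_sep

    def pick_next(requests):
        candidates = [r for r in requests if observable(r)]
        if not candidates:
            return None
        # Time-critical requests first, then priority, then best airmass.
        return min(candidates,
                   key=lambda r: (not r.time_critical, -r.priority, r.airmass))

    queue = [
        Request("blazar_3C273", priority=5, airmass=1.3, moon_sep_deg=80, time_critical=False),
        Request("GRB_followup", priority=3, airmass=1.8, moon_sep_deg=45, time_critical=True),
        Request("asteroid_2004XY", priority=8, airmass=2.4, moon_sep_deg=90, time_critical=False),
    ]
    print(pick_next(queue))   # GRB_followup: time-critical wins despite lower priority
    ```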

  16. Adjustably Autonomous Multi-agent Plan Execution with an Internal Spacecraft Free-Flying Robot Prototype

    NASA Technical Reports Server (NTRS)

    Dorais, Gregory A.; Nicewarner, Keith

    2006-01-01

    We present a multi-agent, model-based autonomy architecture with monitoring, planning, diagnosis, and execution elements. We discuss an internal spacecraft free-flying robot prototype controlled by an implementation of this architecture and a ground test facility used for development. In addition, we discuss a simplified environmental control and life support system for the spacecraft domain, also controlled by an implementation of this architecture. We discuss adjustable autonomy and how it applies to this architecture. We describe an interface that provides the user with situation awareness of both autonomous systems and enables the user to dynamically edit the plans prior to and during execution, as well as to control these agents at various levels of autonomy. This interface also permits the agents to query the user or request the user to perform tasks to help achieve the commanded goals. We conclude by describing a scenario where these two agents and a human interact to cooperatively detect, diagnose, and recover from a simulated spacecraft fault.

  17. The VIPER project (Visualization Integration Platform for Exploration Research): a biologically inspired autonomous reconfigurable robotic platform for diverse unstructured environments

    NASA Astrophysics Data System (ADS)

    Schubert, Oliver J.; Tolle, Charles R.

    2004-09-01

    Over the last decade the world has seen numerous autonomous vehicle programs. Wheel and track designs are the basis for many of these vehicles. This is primarily due to four main reasons: a vast preexisting knowledge base for these designs, energy efficiency of power sources, scalability of actuators, and the lack of control systems technologies for handling alternative, highly complex distributed systems. Though large efforts seek to improve the mobility of these vehicles, many limitations still exist for these systems within unstructured environments, e.g. limited mobility within industrial and nuclear accident sites where existing plant configurations have been extensively changed. These unstructured operational environments include missions for exploration, reconnaissance, and emergency recovery of objects within reconfigured or collapsed structures, e.g. bombed buildings. More importantly, these environments present a clear and present danger for direct human interactions during the initial phases of recovery operations. Clearly, the current classes of autonomous vehicles are incapable of performing in these environments. Thus the next generation of designs must include highly reconfigurable and flexible autonomous robotic platforms. This new breed of autonomous vehicles will be both highly flexible and environmentally adaptable. Presented in this paper is one of the most successful designs from nature, the snake-eel-worm (SEW). This design implements shape memory alloy (SMA) actuators, which allow for scaling of the robotic SEW designs from sub-micron scale to heavy industrial implementations without major conceptual redesigns as required in traditional hydraulic, pneumatic, or motor-driven systems. Autonomous vehicles based on the SEW design possess the ability to easily move between air-based environments and fluid-based environments with limited or no reconfiguration. Under a SEW-designed vehicle, one not only achieves vastly improved maneuverability within a

  18. Control Algorithms and Simulated Environment Developed and Tested for Multiagent Robotics for Autonomous Inspection of Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Wong, Edmond

    2005-01-01

    The NASA Glenn Research Center and academic partners are developing advanced multiagent robotic control algorithms that will enable the autonomous inspection and repair of future propulsion systems. In this application, on-wing engine inspections will be performed autonomously by large groups of cooperative miniature robots that will traverse the surfaces of engine components to search for damage. The eventual goal is to replace manual engine inspections that require expensive and time-consuming full engine teardowns and allow the early detection of problems that would otherwise result in catastrophic component failures. As a preliminary step toward the long-term realization of a practical working system, researchers are developing the technology to implement a proof-of-concept testbed demonstration. In a multiagent system, the individual agents are generally programmed with relatively simple controllers that define a limited set of behaviors. However, these behaviors are designed in such a way that, through the localized interaction among individual agents and between the agents and the environment, they result in self-organized, emergent group behavior that can solve a given complex problem, such as cooperative inspection. One advantage to the multiagent approach is that it allows for robustness and fault tolerance through redundancy in task handling. In addition, the relatively simple agent controllers demand minimal computational capability, which in turn allows for greater miniaturization of the robotic agents.
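
    As a toy illustration of how simple local rules can yield emergent cooperative coverage (the central idea above, though not Glenn's actual control algorithms), the following grid-world simulation has each agent move to a free neighbouring cell, preferring unvisited ones, and reports the fraction of the surface covered. All numbers are arbitrary.

    ```python
    # Toy illustration of emergent cooperative coverage from simple local
    # rules; this is not the Glenn multiagent controller, just a grid-world
    # sketch in which each agent moves to an unvisited free neighbouring cell.
    import random

    W, H, N_AGENTS, STEPS = 12, 12, 6, 400
    random.seed(1)

    agents = [(random.randrange(W), random.randrange(H)) for _ in range(N_AGENTS)]
    visited = set(agents)

    def neighbours(x, y):
        return [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if 0 <= x + dx < W and 0 <= y + dy < H]

    for _ in range(STEPS):
        occupied = set(agents)
        for i, (x, y) in enumerate(agents):
            options = [c for c in neighbours(x, y) if c not in occupied]
            if not options:
                continue  # blocked by other agents this step
            unvisited = [c for c in options if c not in visited]
            nxt = random.choice(unvisited or options)   # prefer unexplored cells
            occupied.discard((x, y)); occupied.add(nxt)
            agents[i] = nxt
            visited.add(nxt)

    print(f"coverage after {STEPS} steps: {100 * len(visited) / (W * H):.0f}% of cells")
    ```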

  19. Total mesorectal excision for rectal cancer with emphasis on pelvic autonomic nerve preservation: Expert technical tips for robotic surgery.

    PubMed

    Kim, Nam Kyu; Kim, Young Wan; Cho, Min Soo

    2015-09-01

    The primary goal of surgical intervention for rectal cancer is to achieve an oncologic cure while preserving function. Since the introduction of total mesorectal excision (TME), the oncologic outcome has improved greatly in terms of local recurrence and cancer-specific survival. However, there are still concerns regarding functional outcomes such as sexual and urinary dysfunction, even among experienced colorectal surgeons. Intraoperative nerve damage is the primary reason for sexual and urinary dysfunction and occurs due to a lack of anatomical knowledge and poor visualization of the pelvic autonomic nerves. The rectum is located concavely along the curved sacrum, and both the ischial tuberosity and the iliac wing limit the boundary of the pelvic cavity. Thus, pelvic autonomic nerve preservation during dissection in a narrow or deep pelvis, with adherence to the TME principles, is very challenging for colorectal surgeons. Recent developments in robotic technology make it possible to overcome these difficulties caused by complex pelvic anatomy. This system can facilitate better preservation of the pelvic autonomic nerves and thereby achieve favorable postoperative sexual and voiding function after rectal cancer surgery. The nerve-preserving TME technique includes identification and preservation of the superior hypogastric plexus nerve, bilateral hypogastric nerves, pelvic plexus, and neurovascular bundles. Standardized procedures should be performed sequentially as follows: posterior dissection, deep posterior dissection, anterior dissection, posterolateral dissection, and final circumferential pelvic dissection toward the pelvic floor. Looking ahead, a structured education program on nerve-preserving robotic TME should be incorporated into training for minimally invasive surgery.

  20. Autonomous mobile robot fast hybrid decision system DT-FAM based on laser system measurement LSM

    NASA Astrophysics Data System (ADS)

    Będkowski, Janusz; Jankowski, Stanisław

    2006-10-01

    In this paper, a new intelligent data processing system for a mobile robot is described. The robot's perception uses LSM -- Laser System Measurement. The innovative fast hybrid decision system is based on fuzzy ARTMAP supported by a decision tree. A virtual robotics laboratory was implemented to execute the experiments.

  1. Assessing the Impact of an Autonomous Robotics Competition for STEM Education

    ERIC Educational Resources Information Center

    Chung, C. J. ChanJin; Cartwright, Christopher; Cole, Matthew

    2014-01-01

    Robotics competitions for K-12 students are popular, but are students really learning and improving their STEM scores through robotics competitions? If not, why not? If they are, how much more effective is learning through competitions than traditional classes? Is there room for improvement? What is the best robotics competition model to maximize…

  2. Current challenges in autonomous vehicle development

    NASA Astrophysics Data System (ADS)

    Connelly, J.; Hong, W. S.; Mahoney, R. B., Jr.; Sparrow, D. A.

    2006-05-01

    The field of autonomous vehicles is a rapidly growing one, with significant interest from both government and industry sectors. Autonomous vehicles represent the intersection of artificial intelligence (AI) and robotics, combining decision-making with real-time control. Autonomous vehicles are desired for use in search and rescue, urban reconnaissance, mine detonation, supply convoys, and more. The general adage is to use robots for anything dull, dirty, dangerous or dumb. While a great deal of research has been done on autonomous systems, there are only a handful of fielded examples incorporating machine autonomy beyond the level of teleoperation, especially in outdoor/complex environments. In an attempt to assess and understand the current state of the art in autonomous vehicle development, a few areas where unsolved problems remain became clear. This paper outlines those areas and provides suggestions for the focus of science and technology research. The first step in evaluating the current state of autonomous vehicle development was to develop a definition of autonomy. A number of autonomy level classification systems were reviewed. The resulting working definitions and classification schemes used by the authors are summarized in the opening sections of the paper. The remainder of the report discusses current approaches and challenges in decision-making and real-time control for autonomous vehicles. Suggested research focus areas for near-, mid-, and long-term development are also presented.

  3. Operator-centered control of a semi-autonomous industrial robot

    SciTech Connect

    Spelt, P.F.; Jones, S.L.

    1994-12-31

    This paper presents work done by Oak Ridge National Laboratory and Remotec, Inc., to develop a new operator-centered control system for Remotec's Andros telerobot. Andros robots are presently used by numerous electric utilities, the armed forces, and law enforcement agencies to perform tasks which are hazardous for human operators. This project has automated task components and enhanced the video graphics display of the robot's position in the environment to significantly reduce operator workload. The procedure of automating a telerobot requires the addition of computer power to the robot, along with a variety of sensors and encoders to provide information about the robot's performance in, and relationship to, its environment. The resulting vehicle serves as a platform for research on strategies to integrate automated tasks with those performed by a human operator. The addition of these capabilities will greatly enhance the safety and efficiency of performance in hazardous environments.

  4. Design, Development and Testing of the Miniature Autonomous Extravehicular Robotic Camera (Mini AERCam) Guidance, Navigation and Control System

    NASA Technical Reports Server (NTRS)

    Wagenknecht, J.; Fredrickson, S.; Manning, T.; Jones, B.

    2003-01-01

    Engineers at NASA Johnson Space Center have designed, developed, and tested a nanosatellite-class free-flyer intended for future external inspection and remote viewing of human spaceflight activities. The technology demonstration system, known as the Miniature Autonomous Extravehicular Robotic Camera (Mini AERCam), has been integrated into the approximate form and function of a flight system. The primary focus has been to develop a system capable of providing external views of the International Space Station. The Mini AERCam system is spherical-shaped and less than eight inches in diameter. It has a full suite of guidance, navigation, and control hardware and software, and is equipped with two digital video cameras and a high resolution still image camera. The vehicle is designed for either remotely piloted operations or supervised autonomous operations. Tests have been performed in both a six degree-of-freedom closed-loop orbital simulation and on an air-bearing table. The Mini AERCam system can also be used as a test platform for evaluating algorithms and relative navigation for autonomous proximity operations and docking around the Space Shuttle Orbiter or the ISS.

  5. The Embudito Mission: A Case Study of the Systematics of Autonomous Ground Mobile Robots

    SciTech Connect

    EICKER,PATRICK J.

    2001-02-01

    Ground mobile robots are much in the mind of defense planners at this time, being considered for a significant variety of missions ranging from logistics supply to reconnaissance and surveillance. While there has been a very large amount of basic research funded in the last quarter century devoted to mobile robots and their supporting component technologies, little of this science base has been fully developed and deployed -- notable exceptions being NASA's Mars rover and several terrestrial derivatives. The material in this paper was developed as a first exemplary step in the development of a more systematic approach to the R and D of ground mobile robots.

  6. Robotics.

    ERIC Educational Resources Information Center

    Waddell, Steve; Doty, Keith L.

    1999-01-01

    "Why Teach Robotics?" (Waddell) suggests that the United States lags behind Europe and Japan in use of robotics in industry and teaching. "Creating a Course in Mobile Robotics" (Doty) outlines course elements of the Intelligent Machines Design Lab. (SK)

  7. Information-driven self-organization: the dynamical system approach to autonomous robot behavior.

    PubMed

    Ay, Nihat; Bernigau, Holger; Der, Ralf; Prokopenko, Mikhail

    2012-09-01

    In recent years, information theory has come into the focus of researchers interested in the sensorimotor dynamics of both robots and living beings. One root of these approaches is the idea that living beings are information processing systems and that the optimization of these processes should be an evolutionary advantage. Apart from these more fundamental questions, there has recently been much interest in the question of how a robot can be equipped with an internal drive for innovation or curiosity that may serve as a drive for an open-ended, self-determined development of the robot. The success of these approaches depends essentially on the choice of a convenient measure of the information. This article studies in some detail the use of the predictive information (PI), also called excess entropy or effective measure complexity, of the sensorimotor process. The PI of a process quantifies the total information of past experience that can be used for predicting future events. However, the application of information-theoretic measures in robotics is mostly restricted to the case of a finite, discrete state-action space. This article aims at applying the PI in the dynamical systems approach to robot control. We study linear systems as a first step and derive exact results for the PI together with explicit learning rules for the parameters of the controller. Interestingly, these learning rules are of Hebbian nature and local in the sense that the synaptic update is given by the product of activities available directly at the pertinent synaptic ports. The general findings are exemplified by a number of case studies. In particular, in a two-dimensional system designed to mimic embodied systems with latent oscillatory locomotion patterns, it is shown that maximizing the PI means to recognize and amplify the latent modes of the robotic system. This and many other examples show that the learning rules derived from the maximum PI principle are a versatile tool for the self
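
    A concrete, if simplified, way to see what predictive information measures is to compute it for a toy linear system. For a stationary Gaussian AR(1) process, the one-step PI has the closed form -0.5*ln(1 - a^2); the snippet below checks that value against an estimate from simulated data using the Gaussian mutual-information formula. This is only an illustration of the quantity, not the paper's derivation or its Hebbian learning rules.

    ```python
    # Illustrative check (not the paper's derivation or learning rules): for a
    # stationary Gaussian AR(1) process x[t+1] = a*x[t] + noise, the one-step
    # predictive information I(x[t]; x[t+1]) equals -0.5*ln(1 - a^2).
    import numpy as np

    a, n = 0.8, 200_000
    rng = np.random.default_rng(0)
    x = np.zeros(n)
    for t in range(n - 1):
        x[t + 1] = a * x[t] + rng.standard_normal()

    r = np.corrcoef(x[:-1], x[1:])[0, 1]           # empirical lag-1 correlation
    pi_est = -0.5 * np.log(1.0 - r ** 2)           # Gaussian mutual information
    pi_theory = -0.5 * np.log(1.0 - a ** 2)
    print(f"estimated PI = {pi_est:.3f} nats, closed form = {pi_theory:.3f} nats")
    ```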

  8. Learning for Autonomous Navigation

    NASA Technical Reports Server (NTRS)

    Angelova, Anelia; Howard, Andrew; Matthies, Larry; Tang, Benyang; Turmon, Michael; Mjolsness, Eric

    2005-01-01

    Robotic ground vehicles for outdoor applications have achieved some remarkable successes, notably in autonomous highway following (Dickmanns, 1987), planetary exploration, and off-road navigation on Earth. Nevertheless, major challenges remain to enable reliable, high-speed, autonomous navigation in a wide variety of complex, off-road terrain. 3-D perception of terrain geometry with imaging range sensors is the mainstay of off-road driving systems. However, the stopping distance at high speed exceeds the effective lookahead distance of existing range sensors. Prospects for extending the range of 3-D sensors are strongly limited by sensor physics, eye safety of lasers, and related issues. Range sensor limitations also allow vehicles to enter large cul-de-sacs even at low speed, leading to long detours. Moreover, sensing only terrain geometry fails to reveal mechanical properties of terrain that are critical to assessing its traversability, such as potential for slippage, sinkage, and the degree of compliance of potential obstacles. Rovers in the Mars Exploration Rover (MER) mission have become stuck in sand dunes and experienced significant downhill slippage in the vicinity of large rock hazards. Earth-based off-road robots today have very limited ability to discriminate traversable vegetation from non-traversable vegetation or rough ground. It is impossible today to preprogram a system with knowledge of these properties for all types of terrain and weather conditions that might be encountered.

  9. Experiments in autonomous navigation and control of multi-manipulator, free-flying space robots

    NASA Astrophysics Data System (ADS)

    Ullman, Marc Albert

    Although space presents an exciting frontier for science and manufacturing, it has proven to be a costly and dangerous place for humans. It is an ideal environment for sophisticated robots capable of performing tasks that currently require the active participation of astronauts. The Aerospace Robotics Laboratory, working with NASA, has developed an experimental model of a multimanipulator, free-flying space robot capable of capturing and manipulating free-floating objects without human assistance. The experimental robot model uses air-cushion technology to simulate, in two dimensions, the drag-free, zero-g characteristics of space. Fully self-contained, the vehicle/manipulator system is equipped with gas-jet thrusters, two two-link manipulators, an electrical power system, digital and analog I/O capabilities, high-speed vision, and a multiprocessor real-time computer. These subsystems have been carefully integrated in a modular architecture that facilitates maintenance and ease of use. A sophisticated control system was designed and implemented to manage and coordinate the actions of the vehicle/manipulator system. A custom on-board vision system is used for closed-loop endpoint control and object tracking in the robot's local reference frame. A multicamera off-board vision system provides global positioning information to the robot via a wireless communication link. Successful rendezvous, tracking, and capture of free-flying, spinning objects is facilitated by simultaneously controlling the robot base position and manipulator motions. These actions are coordinated by a sophisticated event-driven finite-state machine. A graphical user interface enables a remotely situated operator to provide high-level task description commands to the robot and to monitor the robot's activities while it carries out these assignments. The user interface allows a task to be fully specified before any action takes place, thereby eliminating problems associated with communications

  10. Generating Self-Reliant Teams of Autonomous Cooperating Robots: Desired design Characteristics

    SciTech Connect

    Parker, L.E.

    1999-05-01

    The difficulties in designing a cooperative team are significant. Several of the key questions that must be resolved when designing a cooperative control architecture include: How do we formulate, describe, decompose, and allocate problems among a group of intelligent agents? How do we enable agents to communicate and interact? How do we ensure that agents act coherently in their actions? How do we allow agents to recognize and reconcile conflicts? However, in addition to these key issues, the software architecture must be designed to enable multi-robot teams to be robust, reliable, and flexible. Without these capabilities, the resulting robot team will not be able to successfully deal with the dynamic and uncertain nature of the real world. In this extended abstract, we first describe these desired capabilities. We then briefly describe the ALLIANCE software architecture that we have previously developed for multi-robot cooperation. We then briefly analyze the ALLIANCE architecture in terms of the desired design qualities identified.

  11. Ground Simulation of an Autonomous Satellite Rendezvous and Tracking System Using Dual Robotic Systems

    NASA Technical Reports Server (NTRS)

    Trube, Matthew J.; Hyslop, Andrew M.; Carignan, Craig R.; Easley, Joseph W.

    2012-01-01

    A hardware-in-the-loop ground system was developed for simulating a robotic servicer spacecraft tracking a target satellite at short range. A relative navigation sensor package, "Argon", is mounted on the end-effector of a Fanuc 430 manipulator, which functions as the base platform of the robotic spacecraft servicer. Machine vision algorithms estimate the pose of the target spacecraft, mounted on a Rotopod R-2000 platform, and relay the solution to a simulation of the servicer spacecraft running in "Freespace", which performs guidance, navigation, and control functions, integrates dynamics, and issues motion commands to a Fanuc platform controller so that it tracks the simulated servicer spacecraft. Results will be reviewed for several satellite motion scenarios at different ranges. Key words: robotics, satellite, servicing, guidance, navigation, tracking, control, docking.

  12. Creative Engineering Based Education with Autonomous Robots Considering Job Search Support

    NASA Astrophysics Data System (ADS)

    Takezawa, Satoshi; Nagamatsu, Masao; Takashima, Akihiko; Nakamura, Kaeko; Ohtake, Hideo; Yoshida, Kanou

    The Robotics Course in our Mechanical Systems Engineering Department offers “Robotics Exercise Lessons” as one of its Problem-Solution Based Specialized Subjects. This is intended to motivate students' learning, to help them acquire fundamental knowledge and skills in mechanical engineering, and to improve their understanding of Robotics Basic Theory. Our current curriculum was established to accomplish this objective based on two pieces of research in 2005: an evaluation questionnaire on the education of our Mechanical Systems Engineering Department for graduates, and a survey on the kind of human resources which companies are seeking and their expectations for our department. This paper reports the academic results and reflections on job search support in recent years, as inherited and developed from the previous curriculum.

  13. Behavior generation strategy of artificial behavioral system by self-learning paradigm for autonomous robot tasks

    NASA Astrophysics Data System (ADS)

    Dağlarli, Evren; Temeltaş, Hakan

    2008-04-01

    In this study, behavior generation and self-learning paradigms are investigated for real-time applications involving multi-goal mobile robot tasks. The method is capable of generating new behaviors and combining them in order to achieve multi-goal tasks. The proposed method is composed of three layers: a Behavior Generating Module, a Coordination Level, and an Emotion-Motivation Level. The last two levels use Hidden Markov Models to manage the dynamical structure of behaviors. The kinematic and dynamic models of the mobile robot with non-holonomic constraints are considered in the behavior-based control architecture. The proposed method is tested on a four-wheel-driven and four-wheel-steered mobile robot with constraints in a simulation environment, and results are obtained successfully.
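
    The role of a Hidden Markov Model in such a coordination layer can be sketched as follows: a transition matrix over behaviors plus per-behavior observation likelihoods let the forward algorithm maintain a belief about which behavior should currently be active. The behaviors, probabilities, and sensor events below are invented and are not the paper's model.

    ```python
    # Hypothetical sketch of using a discrete HMM to manage switching among
    # behaviors (names and probabilities invented, not the paper's model):
    # the forward algorithm tracks a belief over which behavior is active
    # given a stream of discrete sensor events.
    import numpy as np

    behaviors = ["wander", "approach_goal", "avoid_obstacle"]
    A = np.array([[0.7, 0.2, 0.1],        # behavior transition probabilities
                  [0.1, 0.8, 0.1],
                  [0.3, 0.2, 0.5]])
    # Emission probabilities of events [nothing, goal_seen, obstacle_close]
    B = np.array([[0.7, 0.2, 0.1],
                  [0.2, 0.7, 0.1],
                  [0.1, 0.1, 0.8]])
    belief = np.array([1.0, 0.0, 0.0])    # start in "wander"

    for event in [0, 1, 1, 2, 2, 1]:      # observed sensor events
        belief = (A.T @ belief) * B[:, event]
        belief /= belief.sum()            # normalized forward variable
        print(behaviors[int(np.argmax(belief))], np.round(belief, 2))
    ```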

  14. Experimental Demonstration of Technologies for Autonomous On-Orbit Robotic Assembly

    NASA Technical Reports Server (NTRS)

    LeMaster, Edward A.; Schaechter, David B.; Carrington, Connie K.

    2006-01-01

    The Modular Reconfigurable High Energy (MRHE) program aimed to develop technologies for the automated assembly and deployment of large-scale space structures and aggregate spacecraft. Part of the project involved creation of a terrestrial robotic testbed for validation and demonstration of these technologies and for the support of future development activities. This testbed was completed in 2005, and was thereafter used to demonstrate automated rendezvous, docking, and self-assembly tasks between a group of three modular robotic spacecraft emulators. This paper discusses the rationale for the MRHE project, describes the testbed capabilities, and presents the MRHE assembly demonstration sequence.

  15. First observations of teleseismic P-waves with autonomous underwater robots: towards future global network of mobile seismometers

    NASA Astrophysics Data System (ADS)

    Sukhovich, Alexei; Nolet, Guust; Hello, Yann; Simons, Frederik; Bonnieux, Sébastien

    2013-04-01

    We report here the first successful observations of underwater acoustic signals generated by teleseismic P-waves recorded by the autonomous robots MERMAID (short for Mobile Earthquake Recording in Marine Areas by Independent Divers). During 2011-2012 we conducted three test campaigns, for a total duration of about 8 weeks, in the Ligurian Sea, which allowed us to record nine teleseismic events (distance more than 60 degrees) of magnitude higher than 6 and one closer event (distance 23 degrees) of magnitude 5.5. Our results indicate that no simple relation exists between the magnitude of the source event and the signal-to-noise ratio (SNR) of the corresponding acoustic signals. Other factors, such as fault orientation and meteorological conditions, play an important role in the detectability of the seismic events. We also show examples of the events recorded during these test runs and how their frequency characteristics allow them to be recognized automatically by an algorithm based on the wavelet transform. We shall also report on more recent results obtained during the first fully autonomous run (currently ongoing) of the final MERMAID design in the Mediterranean Sea.
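
    A minimal sketch of wavelet-based event picking of the kind alluded to above might look as follows: decompose the hydrophone record and trigger when the short-term energy of a mid-band set of detail coefficients rises well above the long-term background. It assumes the PyWavelets package (pywt) is available; the sample rate, wavelet choice, band, and thresholds are invented, and this is not the MERMAID discrimination algorithm.

    ```python
    # Generic sketch (not the MERMAID discrimination algorithm): flag a
    # candidate P-wave arrival when the energy of mid-band wavelet
    # coefficients in a short window jumps above the long-term background.
    import numpy as np
    import pywt   # PyWavelets

    fs = 40.0                                    # assumed hydrophone sample rate [Hz]
    rng = np.random.default_rng(3)
    signal = rng.standard_normal(int(120 * fs))  # 2 min of background noise
    signal[int(60 * fs):int(62 * fs)] += 6.0 * np.sin(
        2 * np.pi * 2.0 * np.arange(int(2 * fs)) / fs)   # injected 2 Hz "arrival"

    coeffs = pywt.wavedec(signal, "db4", level=4)
    band = coeffs[1]                             # detail coefficients of one scale
    energy = band ** 2
    short = np.convolve(energy, np.ones(16) / 16, mode="same")    # short-term avg
    long_ = np.convolve(energy, np.ones(256) / 256, mode="same")  # long-term avg
    ratio = short / (long_ + 1e-12)

    idx = int(np.argmax(ratio > 8.0))            # first index exceeding the threshold
    if ratio[idx] > 8.0:
        print(f"candidate arrival near t = {idx * len(signal) / len(band) / fs:.1f} s")
    else:
        print("no candidate arrival")
    ```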

  16. Autonomous intelligent assembly systems LDRD 105746 final report.

    SciTech Connect

    Anderson, Robert J.

    2013-04-01

    This report documents a three-year effort to develop technology that enables mobile robots to perform autonomous assembly tasks in unstructured outdoor environments. This is a multi-tier problem that requires an integration of a large number of different software technologies, including: command and control, estimation and localization, distributed communications, object recognition, pose estimation, real-time scanning, and scene interpretation. Although ultimately unsuccessful in achieving the target brick-stacking task autonomously, the project nevertheless developed numerous important component technologies. These include: a patent-pending polygon snake algorithm for robust feature tracking, a color grid algorithm for unique identification and calibration, a command and control framework for abstracting robot commands, a scanning capability that utilizes a compact robot-portable scanner, and more. This report describes the project and these developed technologies.

  17. Real-time Needle Steering in Response to Rolling Vein Deformation by a 9-DOF Image-Guided Autonomous Venipuncture Robot

    PubMed Central

    Chen, Alvin I.; Balter, Max L.; Maguire, Timothy J.; Yarmush, Martin L.

    2015-01-01

    Venipuncture is the most common invasive medical procedure performed in the United States and the number one cause of hospital injury. Failure rates are particularly high in pediatric and elderly patients, whose veins tend to deform, move, or roll as the needle is introduced. To improve venipuncture accuracy in challenging patient populations, we have developed a portable device that autonomously servos a needle into a suitable vein under image guidance. The device operates in real time, combining near-infrared and ultrasound imaging, computer vision software, and a 9 degrees-of-freedom robot that servos the needle. In this paper, we present the kinematic and mechanical design of the latest generation robot. We then investigate in silico and in vitro the mechanics of vessel rolling and deformation in response to needle insertions performed by the robot. Finally, we demonstrate how the robot can make real-time adjustments under ultrasound image guidance to compensate for subtle vessel motions during venipuncture. PMID:26779381

  18. Approaching Complexity through Planful Play: Kindergarten Children's Strategies in Constructing an Autonomous Robot's Behavior

    ERIC Educational Resources Information Center

    Levy, S. T.; Mioduser, D.

    2010-01-01

    This study investigates how young children master, construct and understand intelligent rule-based robot behaviors, focusing on their strategies in gradually meeting the tasks' complexity. The wider aim is to provide a comprehensive map of the kinds of transitions and learning that take place in constructing simple emergent behaviors, particularly…

  19. A bioinspired autonomous swimming robot as a tool for studying goal-directed locomotion.

    PubMed

    Manfredi, L; Assaf, T; Mintchev, S; Marrazza, S; Capantini, L; Orofino, S; Ascari, L; Grillner, S; Wallén, P; Ekeberg, O; Stefanini, C; Dario, P

    2013-10-01

    The bioinspired approach has been key in combining the disciplines of robotics with neuroscience in an effective and promising fashion. Indeed, certain aspects in the field of neuroscience, such as goal-directed locomotion and behaviour selection, can be validated through robotic artefacts. In particular, swimming is a functionally important behaviour where neuromuscular structures, neural control architecture and operation can be replicated artificially following models from biology and neuroscience. In this article, we present a biomimetic system inspired by the lamprey, an early vertebrate that locomotes using anguilliform swimming. The artefact possesses extra- and proprioceptive sensory receptors, muscle-like actuation, distributed embedded control and a vision system. Experiments on optimised swimming and on goal-directed locomotion are reported, as well as the assessment of the performance of the system, which shows high energy efficiency and adaptive behaviour. While the focus is on providing a robotic platform for testing biological models, the reported system can also be of major relevance for the development of engineering system applications. PMID:24030051

  1. Novel microbial diversity retrieved by autonomous robotic exploration of the world's deepest vertical phreatic sinkhole.

    PubMed

    Sahl, Jason W; Fairfield, Nathaniel; Harris, J Kirk; Wettergreen, David; Stone, William C; Spear, John R

    2010-03-01

    The deep phreatic thermal explorer (DEPTHX) is an autonomous underwater vehicle designed to navigate an unexplored environment, generate high-resolution three-dimensional (3-D) maps, collect biological samples based on an autonomous sampling decision, and return to its origin. In the spring of 2007, DEPTHX was deployed in Zacatón, a deep (approximately 318 m), limestone, phreatic sinkhole (cenote) in northeastern Mexico. As DEPTHX descended, it generated a 3-D map based on the processing of range data from 54 onboard sonars. The vehicle collected water column samples and wall biomat samples throughout the depth profile of the cenote. Post-expedition sample analysis via comparative analysis of 16S rRNA gene sequences revealed a wealth of microbial diversity. Traditional Sanger gene sequencing combined with a barcoded-amplicon pyrosequencing approach revealed novel, phylum-level lineages from the domains Bacteria and Archaea; in addition, several novel subphylum lineages were also identified. Overall, DEPTHX successfully navigated and mapped Zacatón, and collected biological samples based on an autonomous decision, which revealed novel microbial diversity in a previously unexplored environment.

  3. Sustainable Cooperative Robotic Technologies for Human and Robotic Outpost Infrastructure Construction and Maintenance

    NASA Technical Reports Server (NTRS)

    Stroupe, Ashley W.; Okon, Avi; Robinson, Matthew; Huntsberger, Terry; Aghazarian, Hrand; Baumgartner, Eric

    2004-01-01

    Robotic Construction Crew (RCC) is a heterogeneous multi-robot system for autonomous acquisition, transport, and precision mating of components in construction tasks. RCC minimizes the use of resources that are constrained in a space environment, such as computation, power, communication, and sensing. A behavior-based architecture provides adaptability and robustness despite low computational requirements. RCC successfully performs several construction-related tasks in an emulated outdoor environment despite high levels of uncertainty in motion and sensing. Quantitative results are provided for formation keeping during component transport, precision instrument placement, and construction tasks.

  4. An Extremely Low Power Quantum Optical Communication Link for Autonomous Robotic Explorers

    NASA Technical Reports Server (NTRS)

    Lekki, John; Nguyen, Quang-Viet; Bizon, Tom; Nguyen, Binh; Kojima, Jun

    2007-01-01

    One concept for planetary exploration involves using many small robotic landers that can cover more ground than a single conventional lander. In addressing this vision, NASA has been challenged in the National Nanotechnology Initiative to research the development of miniature robots built from nano-sized components. These robots face very significant challenges, such as mobility and communication, given their small size and limited power generation capability. The research presented here has focused on developing a communications system that has the potential to provide ultra-low-power communications for robots such as these. In this paper an optical communications technique that is based on transmitting recognizable sets of photons is presented. Previously, pairs of photons that have an entangled quantum state have been shown to be recognizable in ambient light. The main drawback to utilizing entangled photons is that they can only be generated through a very energy-inefficient nonlinear process. In this paper a new technique that generates sets of photons from pulsed sources is described, and an experimental system demonstrating this technique is presented. This technique of generating photon sets from pulsed sources has the distinct advantage that it is much more flexible and energy efficient, and is well suited to take advantage of the very high energy efficiencies that are possible when using nanoscale sources. For these reasons the communication system presented in this paper is well suited for use in very small, low-power landers and rovers. In this paper a very low power optical communications system for miniature robots as small as 1 cu cm is addressed. The communication system is a variant of photon-counting communications. Instead of counting individual photons, the system only counts the arrival of time-coincident sets of photons. Using sets of photons significantly decreases the bit error rate because they are highly identifiable in the
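
    The receiver-side idea of counting time-coincident photon sets rather than single photons can be sketched with two lists of detection timestamps and a coincidence window; correlated pairs survive the test while most uncorrelated background detections do not. The rates, window, and two-pointer matching below are invented for illustration and are approximate (at most one match per channel-1 event).

    ```python
    # Toy sketch of counting time-coincident photon arrivals instead of
    # single photons; window size and rates are invented. Detections on two
    # channels count as a "set" when they fall within the coincidence window,
    # which suppresses uncorrelated background light.
    import numpy as np

    rng = np.random.default_rng(7)
    window = 2e-9                                 # 2 ns coincidence window
    signal_times = np.sort(rng.uniform(0, 1e-3, 200))        # correlated pairs
    ch1 = np.sort(np.concatenate([signal_times, rng.uniform(0, 1e-3, 2000)]))
    ch2 = np.sort(np.concatenate([signal_times + rng.normal(0, 3e-10, 200),
                                  rng.uniform(0, 1e-3, 2000)]))

    def coincidences(a, b, win):
        """Approximate count of cross-channel pairs separated by less than win."""
        count, j = 0, 0
        for t in a:
            while j < len(b) and b[j] < t - win:
                j += 1
            if j < len(b) and abs(b[j] - t) <= win:
                count += 1
        return count

    print("coincident sets:", coincidences(ch1, ch2, window))
    ```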

  5. Design and implementation of a mechanically heterogeneous robot group

    NASA Astrophysics Data System (ADS)

    Sukhatme, Gaurav S.; Montgomery, James F.; Mataric, Maja J.

    1999-08-01

    This paper describes the design and construction of a cooperative, heterogeneous robot group comprised of one semi-autonomous aerial robot and two autonomous ground robots. The robots are designed to perform automated surveillance and reconnaissance of an urban outdoor area using onboard sensing. The ground vehicles have GPS, sonar for obstacle detection and avoidance, and a simple color-based vision system. Navigation is performed using an optimal mixture of odometry and GPS. The helicopter is equipped with a GPS/INS system, a camera, and a framegrabber. Each robot has an embedded 486 PC/104 processor running the QNX real-time operating system. Individual robot controllers are behavior-based and decentralized. We describe a control strategy and architecture that coordinates the robots with minimal top-down planning. The overall system is controlled at high level by a single human operator using a specially designed control unit. The operator is able to task the group with a mission using a minimal amount of training. The group can re-task itself based on sensor inputs and can also be re-tasked by the operator. We describe a particular reconnaissance mission that the robots have been tested with, and lessons learned during the design and implementation. Our initial results with these experiments are encouraging given the challenging mechanics of the aerial robot. We conclude the paper with a discussion of ongoing and future work.
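
    The "optimal mixture of odometry and GPS" mentioned above is commonly realised with a Kalman-style estimator; the 1-D sketch below predicts position from odometry every step and corrects it with an occasional noisy GPS fix. The noise values, motion, and update rate are invented and this is not the paper's actual navigation filter.

    ```python
    # Minimal 1-D Kalman-filter sketch of fusing dead-reckoned odometry with
    # intermittent GPS fixes; numbers are invented for illustration only.
    import numpy as np

    rng = np.random.default_rng(0)
    q, r = 0.05 ** 2, 2.0 ** 2     # odometry (process) and GPS (measurement) variances

    x_true, x_est, p_est = 0.0, 0.0, 1.0
    for step in range(1, 101):
        v = 0.5                                     # commanded forward motion [m/step]
        x_true += v + rng.normal(0, 0.05)           # true motion with slip

        # Predict with odometry.
        x_est += v
        p_est += q

        # Correct with a GPS fix every 10th step.
        if step % 10 == 0:
            z = x_true + rng.normal(0, 2.0)
            k = p_est / (p_est + r)                 # Kalman gain
            x_est += k * (z - x_est)
            p_est *= (1 - k)

    print(f"true={x_true:.1f} m, fused estimate={x_est:.1f} m, std={p_est ** 0.5:.2f} m")
    ```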

  6. Vector Field Driven Design for Lightweight Signal Processing and Control Schemes for Autonomous Robotic Navigation

    NASA Astrophysics Data System (ADS)

    Mathai, Nebu John; Zourntos, Takis; Kundur, Deepa

    2009-12-01

    We address the problem of realizing lightweight signal processing and control architectures for agents in multirobot systems. Motivated by the promising results of neuromorphic engineering which suggest the efficacy of analog as an implementation substrate for computation, we present the design of an analog-amenable signal processing scheme. We use control and dynamical systems theory both as a description language and as a synthesis toolset to rigorously develop our computational machinery; these mechanisms are mated with structural insights from behavior-based robotics to compose overall algorithmic architectures. Our perspective is that robotic behaviors consist of actions taken by an agent to cause its sensory perception of the environment to evolve in a desired manner. To provide an intuitive aid for designing these behavioral primitives we present a novel visual tool, inspired vector field design, that helps the designer to exploit the dynamics of the environment. We present simulation results and animation videos to demonstrate the signal processing and control architecture in action.
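
    As a simple, hand-written example of designing a vector field for a behavioral primitive, the sketch below combines a unit attraction toward a goal with a locally acting repulsion from an obstacle and integrates the resulting field. It is a generic potential-field style construction with invented positions and gains, not the analog controllers synthesised in the paper.

    ```python
    # Sketch of a hand-designed vector field for one behavioral primitive
    # (attract to a goal, repel from a nearby obstacle); positions and gains
    # are invented and this is not the paper's synthesised controller.
    import numpy as np

    goal = np.array([5.0, 5.0])
    obstacle = np.array([2.5, 2.8])

    def field(p, k_goal=1.0, k_obs=1.5, influence=1.5):
        """Desired velocity at position p."""
        v = k_goal * (goal - p) / (np.linalg.norm(goal - p) + 1e-9)
        d = np.linalg.norm(p - obstacle)
        if d < influence:                       # obstacle only acts locally
            v += k_obs * (1.0 / d - 1.0 / influence) * (p - obstacle) / (d + 1e-9)
        return v

    p = np.array([0.0, 0.0])
    for _ in range(200):                        # integrate the field in small steps
        p = p + 0.05 * field(p)
        if np.linalg.norm(goal - p) < 0.1:
            break
    print("final position:", np.round(p, 2))
    ```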

  7. CYCLOPS: A mobile robotic platform for testing and validating image processing and autonomous navigation algorithms in support of artificial vision prostheses.

    PubMed

    Fink, Wolfgang; Tarbell, Mark A

    2009-12-01

    While artificial vision prostheses are quickly becoming a reality, actual testing time with visual prosthesis carriers is at a premium. Moreover, it is helpful to have a more realistic functional approximation of a blind subject. Instead of a normal subject with a healthy retina looking at a low-resolution (pixelated) image on a computer monitor or head-mounted display, a more realistic approximation is achieved by employing a subject-independent mobile robotic platform that uses a pixelated view as its sole visual input for navigation purposes. We introduce CYCLOPS: an AWD, remote controllable, mobile robotic platform that serves as a testbed for real-time image processing and autonomous navigation systems for the purpose of enhancing the visual experience afforded by visual prosthesis carriers. Complete with wireless Internet connectivity and a fully articulated digital camera with wireless video link, CYCLOPS supports both interactive tele-commanding via joystick, and autonomous self-commanding. Due to its onboard computing capabilities and extended battery life, CYCLOPS can perform complex and numerically intensive calculations, such as image processing and autonomous navigation algorithms, in addition to interfacing to additional sensors. Its Internet connectivity renders CYCLOPS a worldwide accessible testbed for researchers in the field of artificial vision systems. CYCLOPS enables subject-independent evaluation and validation of image processing and autonomous navigation systems with respect to the utility and efficiency of supporting and enhancing visual prostheses, while potentially reducing to a necessary minimum the need for valuable testing time with actual visual prosthesis carriers. PMID:19651459
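
    The "pixelated view" used as the robot's sole visual input can be approximated very simply by block-averaging a camera frame onto a coarse grid and quantising the result to a few brightness levels, as sketched below. The grid size, level count, and function name are arbitrary choices for illustration and do not describe CYCLOPS's actual image processing.

    ```python
    # Illustrative sketch of producing a low-resolution "pixelated" view of
    # the kind a visual prosthesis simulator feeds to a navigation system;
    # not CYCLOPS's actual image pipeline, and all sizes are arbitrary.
    import numpy as np

    def pixelate(frame, grid=(16, 16), levels=8):
        """Reduce a grayscale frame to a coarse grid with few intensity levels."""
        h, w = frame.shape
        gh, gw = grid
        # Average over blocks (crop so the frame divides evenly into the grid).
        frame = frame[: h - h % gh, : w - w % gw]
        blocks = frame.reshape(gh, frame.shape[0] // gh, gw, frame.shape[1] // gw)
        coarse = blocks.mean(axis=(1, 3))
        # Quantise to a small number of brightness levels ("phosphenes").
        return np.round(coarse / 255.0 * (levels - 1)) / (levels - 1)

    frame = np.random.default_rng(0).integers(0, 256, (480, 640)).astype(float)
    view = pixelate(frame)
    print(view.shape, view.min(), view.max())
    ```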

  9. Vertical stream curricula integration of problem-based learning using an autonomous vacuum robot in a mechatronics course

    NASA Astrophysics Data System (ADS)

    Chin, Cheng; Yue, Keng

    2011-10-01

    Difficulties in teaching a multi-disciplinary subject such as the mechatronics system design module in the Department of Mechatronics Engineering at Temasek Polytechnic arise from the gap in experience and skill among staff and students who have different backgrounds in mechanical, computer and electrical engineering. The department piloted a new vertical stream curricula model (VSCAM) to enhance student learning in mechatronics system design through the integration of educational activities from the first to the second year of the course. In this case study, a problem-based learning (PBL) method built around an autonomous vacuum robot in the mechatronics system design module was proposed to give students hands-on experience in mechatronics system design. The PBL activities consist of seminar sessions, weekly assignments and a project presentation, providing holistic assessment of teamwork and individual contributions. At the end of VSCAM, an integrative evaluation was conducted using confidence logs, attitude surveys and questionnaires. The activities were well received by the participating staff and students. Hence, PBL can serve as an effective pedagogical framework for teaching multidisciplinary subjects in mechatronics engineering education, provided adequate guidance and support are given to staff and students.

  10. A novel autonomous, bioinspired swimming robot developed by neuroscientists and bioengineers.

    PubMed

    Stefanini, C; Orofino, S; Manfredi, L; Mintchev, S; Marrazza, S; Assaf, T; Capantini, L; Sinibaldi, E; Grillner, S; Wallén, P; Dario, P

    2012-06-01

    This paper describes the development of a new biorobotic platform inspired by the lamprey. Design, fabrication and implemented control are all based on biomechanical and neuroscientific findings on this eel-like fish. The lamprey model has been extensively studied and characterized in recent years because it possesses all the basic functions and control mechanisms of higher vertebrates, while at the same time having fewer neurons and simplified neural structures. The untethered robot has a flexible body driven by compliant actuators with proprioceptive feedback. It also has binocular vision for vision-based navigation. The platform has been extensively and successfully tested in aquatic environments, has high energy efficiency and is ready to be used as an investigation tool for high-level motor tasks. PMID:22619181
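
    The abstract does not spell out the controller, but lamprey-like swimmers are typically driven by a central pattern generator (CPG). The Python sketch below shows a chain of phase-coupled oscillators producing a head-to-tail travelling wave of joint set-points; the segment count, frequency, coupling gain and amplitude are illustrative assumptions rather than values from this platform.

```python
import math

def cpg_step(phases, dt=0.01, freq=1.0, coupling=4.0, lag=math.pi / 6):
    """Advance a chain of phase oscillators one time step (Kuramoto-style CPG).

    Each segment tries to stay `lag` radians behind its forward neighbour,
    which produces a head-to-tail travelling wave. All constants here are
    illustrative, not taken from the paper.
    """
    new = []
    for i, phi in enumerate(phases):
        dphi = 2.0 * math.pi * freq
        if i > 0:                                  # couple to the segment ahead
            dphi += coupling * math.sin(phases[i - 1] - phi - lag)
        new.append(phi + dphi * dt)
    return new

def joint_angles(phases, amplitude_deg=20.0):
    """Map oscillator phases to bending angles for the compliant actuators."""
    return [amplitude_deg * math.sin(p) for p in phases]

phases = [0.0] * 8                  # eight body segments (assumed)
for _ in range(1000):               # 10 s of simulated swimming at dt = 0.01 s
    phases = cpg_step(phases)
print([round(a, 1) for a in joint_angles(phases)])
```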

  12. Semi-autonomous robots for reactor containments. Annual summary report, 1993--1994

    SciTech Connect

    Not Available

    1994-05-06

    During 1993, the activity at the University was split into two primary groups. One group provided direct support for the development and testing of the RVIR vehicle. This effort culminated in a demonstration of the vehicle at ORNL during December. The second group of researchers focused attention on pushing the technology forward in the areas of radiation imaging, navigation, and sensing modalities. A major effort in technology transfer took place during this year. All of these efforts are reflected in the attached periodic progress reports. During 1994, our attention will shift from the Nuclear Energy program to the Environmental Restoration and Waste Management office. The immediate needs of the Robotics Technology Development Program within the Office of Technology Development of EM drove this change in target applications. The University will be working closely with the national laboratories to further develop and transfer existing technologies to mobile platforms which are currently being designed and employed in seriously hazardous environments.

  13. Design of a dynamic test platform for autonomous robot vision systems

    NASA Technical Reports Server (NTRS)

    Rich, G. C.

    1980-01-01

    The concept and design of a dynamic test platform for development and evaluation of a robot vision system is discussed. The platform is to serve as a diagnostic and developmental tool for future work with the RPI Mars Rover's multi laser/multi detector vision system. The platform allows testing of the vision system while its attitude is varied, statically or periodically. The vision system is mounted on the test platform, where it can be subjected to a wide variety of simulated motions and thus examined in a controlled, quantitative fashion. Defining and modeling Rover motions and designing the platform to emulate these motions are also discussed. Individual aspects of the design process, such as the structure, driving linkages, and motors and transmissions, are treated separately.

  14. The Summer Robotic Autonomy Course

    NASA Technical Reports Server (NTRS)

    Nourbakhsh, Illah R.

    2002-01-01

    We offered a first Robotic Autonomy course this summer, located at NASA/Ames' new NASA Research Park, for approximately 30 high school students. In this 7-week course, students worked in ten teams to build and then program advanced autonomous robots capable of visual processing and high-speed wireless communication. The course made use of challenge-based curricula, culminating each week with a Wednesday Challenge Day and a Friday Exhibition and Contest Day. Robotic Autonomy provided a comprehensive grounding in elementary robotics, including basic electronics, electronics evaluation, microprocessor programming, real-time control, and robot mechanics and kinematics. Our course then continued the educational process by introducing higher-level perception, action and autonomy topics, including teleoperation, visual servoing, intelligent scheduling and planning, and cooperative problem-solving. We were able to deliver such a comprehensive, high-level education in robotic autonomy for two reasons. First, the content resulted from close collaboration between the CMU Robotics Institute and researchers in the Information Sciences and Technology Directorate and various education program/project managers at NASA/Ames. This collaboration produced not only educational content, but will also be central to the conduct of formative and summative evaluations of the course for further refinement. Second, CMU rapid prototyping skills as well as the PI's low-overhead perception and locomotion research projects enabled design and delivery of affordable robot kits with unprecedented sensory-locomotory capability. Each Trikebot robot was capable of both indoor locomotion and high-speed outdoor motion and was equipped with a high-speed vision system coupled to a low-cost pan/tilt head. As planned, following the completion of Robotic Autonomy, each student took home an autonomous, competent robot to keep as an extremely capable tool for further exploration of robotics.

  15. Nasa's Ant-Inspired Swarmie Robots

    NASA Technical Reports Server (NTRS)

    Leucht, Kurt W.

    2016-01-01

    As humans push further beyond the grasp of Earth, robotic missions in advance of human missions will play an increasingly important role. These robotic systems will find and retrieve valuable resources as part of an in-situ resource utilization (ISRU) strategy. They will need to be highly autonomous while maintaining high task performance levels. NASA Kennedy Space Center has teamed up with the Biological Computation Lab at the University of New Mexico to create a swarm of small, low-cost, autonomous robots to be used as a ground-based research platform for ISRU missions. The behavior of the robot swarm mimics the central-place foraging strategy of ants to find and collect resources in a previously unmapped environment and return those resources to a central site. This talk will guide the audience through the Swarmie robot project from its conception by students in a New Mexico research lab to its robot trials in an outdoor parking lot at NASA. The software technologies and techniques used on the project will be discussed, as well as various challenges and solutions that were encountered by the development team along the way.
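
    The central-place foraging behaviour can be summarized as a small state machine. The sketch below, with a toy simulated robot, illustrates that control structure only; the state names, transition rules and the SimRobot stub are assumptions, not the Swarmie project's actual software.

```python
import random

random.seed(1)

class SimRobot:
    """Toy stand-in for a Swarmie-like rover (purely illustrative)."""
    def __init__(self):
        self.distance_home = 0.0
    def search(self):                 # drive a search pattern, maybe spot a resource
        self.distance_home += 1.0
        return random.random() < 0.1
    def grab(self):
        return True                   # assume the gripper always succeeds
    def drive_home(self):
        self.distance_home = max(0.0, self.distance_home - 1.0)
        return self.distance_home == 0.0
    def drop(self):
        pass

def forage(robot, steps=300):
    """Ant-style central-place foraging loop: search, grab, return, drop."""
    state, collected = "search", 0
    for _ in range(steps):
        if state == "search":
            state = "grab" if robot.search() else "search"
        elif state == "grab":
            state = "return" if robot.grab() else "search"
        elif state == "return":
            state = "drop" if robot.drive_home() else "return"
        elif state == "drop":
            robot.drop()
            collected += 1
            state = "search"
    return collected

print(forage(SimRobot()), "resources returned to the central site")
```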

  16. Adaptive fuzzy approach to modeling of operational space for autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Musilek, Petr; Gupta, Madan M.

    1998-10-01

    Robots operating in an unstructured environment need a high-level model of their operational space in order to plan a suitable path from an initial position to a desired goal. From this perspective, operational space modeling is crucial to ensure a sufficient level of autonomy. In order to compile information from various sources, we propose a fuzzy approach that assigns each unit region on a grid map a transition cost. This value expresses the cost of movement over the unit region: the higher the value, the more expensive the movement through the region in terms of energy, time, danger, etc. The modeling approach proposed in this paper employs fuzzy granulation of information on various terrain features and their combination in a fuzzy neural network. In order to adapt to changing environmental conditions and to improve the validity of the constructed cost maps on-line, the system can be endowed with learning abilities. The learning subsystem adjusts the parameters of the fuzzy-neural-network-based decision system using reinforcements derived from comparing the actual transition cost with the cost predicted by the model.
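
    As a concrete illustration of assigning each unit region a fuzzy transition cost, the sketch below combines two granulated terrain features with simple membership functions. The breakpoints and the aggregation rule are illustrative assumptions; the paper learns this mapping with a fuzzy neural network and adapts it through reinforcement, which is not reproduced here.

```python
import numpy as np

def ramp(x, lo, hi):
    """Membership that rises linearly from 0 at `lo` to 1 at `hi`."""
    return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

def transition_cost(slope_deg, roughness):
    """Fuzzy-combine two terrain features into a transition cost in [0, 1].

    The membership breakpoints and the rule aggregation below are
    illustrative assumptions, not the paper's learned model.
    """
    steep = ramp(slope_deg, 10.0, 35.0)        # degrees
    rough = ramp(roughness, 0.05, 0.30)        # e.g. local height variance in m
    hard_to_cross = np.maximum(steep, rough)   # fuzzy OR of the two features
    return 0.1 + 0.9 * hard_to_cross           # small baseline cost for any move

# Cost map for a 4x4 grid of unit regions with random terrain features.
rng = np.random.default_rng(0)
slope = rng.uniform(0.0, 45.0, (4, 4))
rough = rng.uniform(0.0, 0.40, (4, 4))
print(np.round(transition_cost(slope, rough), 2))
```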

  17. Automatic learning rate adjustment for self-supervising autonomous robot control

    NASA Technical Reports Server (NTRS)

    Arras, Michael K.; Protzel, Peter W.; Palumbo, Daniel L.

    1992-01-01

    Described is an application in which an Artificial Neural Network (ANN) controls the positioning of a robot arm with five degrees of freedom by using visual feedback provided by two cameras. This application and the specific ANN model, local linear maps, are based on the work of Ritter, Martinetz, and Schulten. We extended their approach by generating a filtered, average positioning error from the continuous camera feedback and by coupling the learning rate to this error. When the network learns to position the arm, the positioning error decreases and so does the learning rate until the system stabilizes at a minimum error and learning rate. This eliminates the need for a predetermined cooling schedule. The automatic cooling procedure results in a closed-loop control with no distinction between a learning phase and a production phase. If the positioning error suddenly starts to increase due to an internal failure such as a broken joint, or an environmental change such as a camera moving, the learning rate increases accordingly. Thus, learning is automatically activated and the network adapts to the new condition, after which the error decreases again and learning is 'shut off'. The automatic cooling is therefore a prerequisite for the autonomy and the fault tolerance of the system.
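
    The coupling of the learning rate to a filtered positioning error can be captured in a few lines. The Python sketch below is only a schematic of that idea; the smoothing constant, gain and rate limits are assumptions, not values from the paper.

```python
class AdaptiveRate:
    """Learning rate that tracks a low-pass-filtered positioning error.

    Learning 'cools' automatically as the arm improves and re-activates if
    the error grows again (e.g. after a broken joint or a moved camera).
    All constants are illustrative assumptions.
    """
    def __init__(self, gain=0.5, smoothing=0.95, rate_min=1e-4, rate_max=0.5):
        self.filtered_error = 0.0
        self.gain, self.smoothing = gain, smoothing
        self.rate_min, self.rate_max = rate_min, rate_max

    def update(self, positioning_error):
        """Return the learning rate for this step given the current error (m)."""
        self.filtered_error = (self.smoothing * self.filtered_error
                               + (1.0 - self.smoothing) * abs(positioning_error))
        rate = self.gain * self.filtered_error
        return min(self.rate_max, max(self.rate_min, rate))

sched = AdaptiveRate()
for err in [0.20, 0.15, 0.10, 0.05, 0.02, 0.02, 0.30]:   # last step: simulated fault
    print(round(sched.update(err), 4))
```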

  18. Volumetric mapping of tubeworm colonies in Kagoshima Bay through autonomous robotic surveys

    NASA Astrophysics Data System (ADS)

    Maki, Toshihiro; Kume, Ayaka; Ura, Tamaki

    2011-07-01

    We developed and tested a comprehensive method for measuring the three-dimensional distribution of tubeworm colonies using an autonomous underwater vehicle (AUV). We derived volumetric measurements such as the volume, area, average height, and number of tubes for colonies of Lamellibrachia satsuma, the world's shallowest-dwelling vestimentiferan tubeworm discovered at a depth of 82 m, at the Haorimushi site in Kagoshima Bay, Japan, by processing geometric and visual data obtained through low-altitude surveys using the AUV Tri-Dog 1. According to the results, the tubeworm colonies cover an area of 151.9 m², accounting for 5.8% of the observed area (2600 m²). The total number of tubes was estimated to be 99,500. Morphological parameters such as area, volume, and average height were estimated for each colony. On the basis of average height, colonies could be clearly separated into two groups, short (0.1-0.3 m) and tall (0.6-0.7 m), independent of the area.
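
    Once a colony has been segmented on a gridded elevation map, the morphological parameters reported above reduce to sums over the masked cells. The sketch below shows that calculation; the grid resolution and the prism-per-cell volume approximation are assumptions, not the paper's actual processing chain.

```python
import numpy as np

def colony_metrics(height_above_seafloor, colony_mask, cell_size=0.05):
    """Area, volume and mean height of one colony from a gridded elevation map.

    height_above_seafloor -- 2-D array of heights (m) above the local seafloor
    colony_mask           -- boolean array marking cells classified as colony
    cell_size             -- grid resolution in metres (assumed value)
    """
    cell_area = cell_size ** 2
    heights = height_above_seafloor[colony_mask]
    area = float(colony_mask.sum()) * cell_area             # m^2
    volume = float(heights.sum()) * cell_area                # m^3 (prism per cell)
    mean_height = float(heights.mean()) if heights.size else 0.0
    return area, volume, mean_height

# Toy example: a 1 m x 1 m patch with a 0.6 m tall mound in the middle.
z = np.zeros((20, 20))
z[8:12, 8:12] = 0.6
print(colony_metrics(z, z > 0.1))
```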

  19. Image processing for navigation on a mobile embedded platform: design of an autonomous mobile robot

    NASA Astrophysics Data System (ADS)

    Loose, Harald; Lemke, Christiane; Papazov, Chavdar

    2006-02-01

    This paper deals with intelligent mobile platforms connected to a camera controlled by a small hardware platform called RCUBE. This platform provides the features of a typical actuator-sensor board, with various inputs and outputs, as well as computing power and image recognition capabilities. Several intelligent autonomous RCUBE devices can be equipped and programmed to participate in the BOSPORUS network. These components form an intelligent network for gathering sensor and image data, sensor data fusion, navigation and control of mobile platforms. The RCUBE platform provides a standalone solution for image processing, which will be explained and presented. It plays a major role for several components in a reference implementation of the BOSPORUS system. On the one hand, intelligent cameras will be positioned in the environment, analyzing events from a fixed point of view and sharing their perceptions with other components in the system. On the other hand, image processing results will contribute to reliable navigation of a mobile system, which is crucially important. Fixed landmarks and other objects appropriate for determining the position of a mobile system can be recognized. For navigation, other methods are added, e.g., GPS measurements and odometry.

  20. Ultra-miniature omni-directional camera for an autonomous flying micro-robot

    NASA Astrophysics Data System (ADS)

    Ferrat, Pascal; Gimkiewicz, Christiane; Neukom, Simon; Zha, Yingyun; Brenzikofer, Alain; Baechler, Thomas

    2008-04-01

    CSEM presents a highly integrated ultra-miniature camera module with omni-directional view dedicated to autonomous micro flying devices. Very tight design and integration requirements (related to size, weight, and power consumption) for the optical, microelectronic and electronic components are fulfilled. The presented ultra-miniature camera platform is based on two major components: a catadioptric lens system and a dedicated image sensor. The optical system consists of a hyperbolic mirror and an imaging lens. The vertical field of view is +10° to -35°. The CMOS image sensor provides a polar pixel field with 128 (horizontal) by 64 (vertical) pixels. Since the number of pixels for each circle is constant, the unwrapped panoramic image achieves a constant resolution in the polar direction for all image regions. The whole camera module, delivering 40 frames per second, contains optical image preprocessing for effortless re-mapping of the acquired image into undistorted cylindrical coordinates. The total weight of the complete camera is less than 5 g. The system's outer dimensions are 14.4 mm in height, with an 11.4 mm x 11.4 mm footprint. Thanks to the innovative PROGLOG™, a dynamic range of over 140 dB is achieved.
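
    The sensor above performs the polar-to-cylindrical re-mapping on chip, but the geometry is easy to illustrate for a conventional catadioptric image. The Python/NumPy sketch below unwraps an annulus around the mirror centre into a panorama by nearest-neighbour lookup; the annulus radii, output size and synthetic test frame are assumptions, and a real system would use the calibrated hyperbolic-mirror model instead.

```python
import numpy as np

def unwrap_catadioptric(img, center, r_min, r_max, out_w=360, out_h=64):
    """Unwrap a donut-shaped catadioptric image into a cylindrical panorama.

    The mirror geometry is reduced to an annulus between r_min and r_max
    around `center`; all radii and output sizes here are illustrative.
    """
    cx, cy = center
    theta = 2.0 * np.pi * np.arange(out_w) / out_w                      # azimuth per column
    radius = r_min + (r_max - r_min) * np.arange(out_h) / (out_h - 1)   # radius per row
    # Source pixel for every (row, col) of the panorama (nearest neighbour).
    xs = (cx + radius[:, None] * np.cos(theta[None, :])).round().astype(int)
    ys = (cy + radius[:, None] * np.sin(theta[None, :])).round().astype(int)
    xs = np.clip(xs, 0, img.shape[1] - 1)
    ys = np.clip(ys, 0, img.shape[0] - 1)
    return img[ys, xs]

frame = np.random.randint(0, 256, (200, 200), dtype=np.uint8)           # synthetic frame
panorama = unwrap_catadioptric(frame, center=(100, 100), r_min=30, r_max=90)
print(panorama.shape)   # (64, 360)
```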

  1. Conversion and control of an all-terrain vehicle for use as an autonomous mobile robot

    NASA Astrophysics Data System (ADS)

    Jacob, John S.; Gunderson, Robert W.; Fullmer, R. R.

    1998-08-01

    A systematic approach to ground vehicle automation is presented, combining low-level controls, trajectory generation and closed-loop path correction in an integrated system. Development of cooperative robotics for precision agriculture at Utah State University required the automation of a full-scale motorized vehicle. The Triton Predator 8-wheeled skid-steering all-terrain vehicle was selected for the project based on its ability to maneuver precisely and the simplicity of controlling the hydrostatic drivetrain. Low-level control was achieved by fitting an actuator on the engine throttle, actuators for the left and right drive controls, encoders on the left and right drive shafts to measure wheel speeds, and a signal pick-off on the alternator for measuring engine speed. Closed-loop control maintains a desired engine speed and tracks left and right wheel speed commands. A trajectory generator produces the wheel speed commands needed to steer the vehicle through a predetermined set of map coordinates. A planar trajectory through the points is computed by fitting a 2D cubic spline over each path segment while enforcing initial and final orientation constraints at segment endpoints. Acceleration and velocity profiles are computed for each trajectory segment, with the velocity over each segment dependent on turning radius. Left and right wheel speed setpoints are obtained by combining velocity and path curvature for each low-level timestep. The path correction algorithm uses GPS position and compass orientation information to adjust the wheel speed setpoints according to the 'crosstrack' and 'downtrack' errors and the heading error. Nonlinear models of the engine and the skid-steering vehicle/ground interaction were developed for testing the integrated system in simulation. These tests led to several key design improvements which assisted the final implementation on the vehicle.
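
    Turning the commanded speed and path curvature into left and right wheel speed setpoints is the standard skid-steer kinematic relation sketched below. The track width and example numbers are assumptions rather than the Predator's actual parameters, and the crosstrack/downtrack correction terms are omitted.

```python
def wheel_speed_setpoints(v, curvature, track_width=1.2):
    """Left/right wheel speed setpoints (m/s) for a skid-steer vehicle.

    v           -- desired forward speed along the trajectory (m/s)
    curvature   -- 1 / turning radius at this point of the spline (1/m, signed)
    track_width -- effective distance between left and right wheel rows (m);
                   this value is an assumption, not the vehicle's specification
    """
    omega = v * curvature                    # yaw rate implied by the path
    v_left = v - omega * track_width / 2.0
    v_right = v + omega * track_width / 2.0
    return v_left, v_right

# Example: 2 m/s along a left-hand bend of 5 m radius.
print(wheel_speed_setpoints(2.0, 1.0 / 5.0))   # (1.76, 2.24)
```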

  2. Master's in Autonomous Systems: An Overview of the Robotics Curriculum and Outcomes at ISEP, Portugal

    ERIC Educational Resources Information Center

    Silva, E.; Almeida, J.; Martins, A.; Baptista, J. P.; Campos Neves, B.

    2013-01-01

    Robotics research in Portugal is increasing every year, but few students embrace it as one of their first choices for study. Until recently, job offers for engineers were plentiful, and those looking for a degree in science and technology would avoid areas considered to be demanding, like robotics. At the undergraduate level, robotics programs are…

  3. A multimodal interface for real-time soldier-robot teaming

    NASA Astrophysics Data System (ADS)

    Barber, Daniel J.; Howard, Thomas M.; Walter, Matthew R.

    2016-05-01

    Recent research and advances in robotics have led to the development of novel platforms leveraging new sensing capabilities for semantic navigation. As these systems become increasingly robust, they support highly complex commands beyond direct teleoperation and waypoint finding, facilitating a transition from robots as tools to robots as teammates. Supporting future Soldier-Robot teaming requires communication capabilities on par with those of human-human teams for successful integration of robots. Therefore, as robots increase in functionality, it is equally important that the interface between the Soldier and the robot advances as well. Multimodal communication (MMC) enables human-robot teaming through redundancy and levels of communication more robust than single-mode interaction. Commercial-off-the-shelf (COTS) technologies released in recent years for smart phones and gaming provide tools for the creation of portable interfaces incorporating MMC through the use of speech, gestures, and visual displays. However, for multimodal interfaces to be successfully used in the military domain, they must be able to classify speech and gestures and process natural language in real time with high accuracy. For the present study, a prototype multimodal interface supporting real-time interactions with an autonomous robot was developed. This device integrated COTS Automated Speech Recognition (ASR), a custom gesture recognition glove, and natural language understanding on a tablet. This paper presents performance results (e.g., response times, accuracy) of the integrated device when commanding an autonomous robot to perform reconnaissance and surveillance activities in an unknown outdoor environment.

  5. Outdoor allergens.

    PubMed Central

    Burge, H A; Rogers, C A

    2000-01-01

    Outdoor allergens are an important part of the exposures that lead to allergic disease. Understanding the role of outdoor allergens requires a knowledge of the nature of outdoor allergen-bearing particles, the distribution of their sources, and the nature of the aerosols (particle types, sizes, dynamics of concentrations). Primary sources for outdoor allergens include vascular plants (pollen, fern spores, soy dust) and fungi (spores, hyphae). Nonvascular plants, algae, and arthropods contribute small numbers of allergen-bearing particles. Particles are released from sources into the air by wind, rain, mechanical disturbance, or active discharge mechanisms. Once airborne, they follow the physical laws that apply to all airborne particles. Although some outdoor allergens penetrate indoor spaces, exposure occurs mostly outdoors. Even short-term peak outdoor exposures can be important in eliciting acute symptoms. Monitoring of airborne biological particles is usually by particle impaction and microscopic examination. Centrally located monitoring stations give regional-scale measurements of aeroallergen levels. Evidence for the role of outdoor allergens in allergic rhinitis is strong and is rapidly increasing for a role in asthma. Pollen and fungal spore exposures have both been implicated in acute exacerbations of asthma, and sensitivity to some fungal spores predicts the existence of asthma. Synergism and/or antagonism probably occurs with other outdoor air particles and gases. Control involves avoidance of exposure (staying indoors, preventing entry of outdoor aerosols) as well as immunotherapy, which is effective for pollen but of limited effect for spores. Outdoor allergens have been the subject of only limited studies with respect to the epidemiology of asthma. Much remains to be studied with respect to prevalence patterns, exposure-disease relationships, and control. PMID:10931783

  6. Outdoor Education.

    ERIC Educational Resources Information Center

    Staley, Frederick A., Ed.; And Others

    1983-01-01

    Outdoor education programs can enrich a school's overall curriculum, especially when updated approaches are used. A guide for evaluating outdoor programs and suggestions for building administrative, public, and financial support are provided, and sources of further information are described. A Phoenix, Arizona program illustrates the value of a…

  7. Outdoor Classrooms

    ERIC Educational Resources Information Center

    Mayes, Valynda

    2010-01-01

    An outdoor classroom is the ideal vehicle for community involvement: Parents, native plant societies, 4-H, garden clubs, and master naturalists are all resources waiting to be tapped, as are local businesses offering support. If you enlist your community in the development and maintenance of your outdoor classroom, the entire community will…

  8. Performance of a scanning laser line striper in outdoor lighting

    NASA Astrophysics Data System (ADS)

    Mertz, Christoph

    2013-05-01

    For search and rescue robots and reconnaissance robots it is important to detect objects in their vicinity. We have developed a scanning laser line striper that can produce dense 3D images using active illumination. The scanner consists of a camera and a MEMS micro-mirror based projector. It can also detect the presence of optically difficult materials like glass and metal. The sensor can be used for autonomous operation or it can help a human operator to better remotely control the robot. In this paper we evaluate the performance of the scanner under outdoor illumination, i.e. from operating in the shade to operating in full sunlight. We report the range, resolution and accuracy of the sensor and its ability to reconstruct objects like grass, wooden blocks, wires, metal objects, electronic devices like cell phones, a blank RPG, and other inert explosive devices. Furthermore we evaluate its ability to detect the presence of glass and polished metal objects. Lastly we report on a user study that shows a significant improvement in a grasping task. The user is tasked with grasping a wire with the remotely controlled hand of a robot. We compare the time it takes to complete the task using the 3D scanner with the time using a traditional video camera.
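
    The basic range computation behind a camera-plus-projector line striper is a camera-ray / laser-plane intersection, sketched below. The intrinsics, the plane calibration and the example pixel are illustrative assumptions, not the scanner's real calibration.

```python
import numpy as np

def stripe_point_3d(u, v, fx, fy, cx, cy, plane_n, plane_d):
    """3-D point of a laser-stripe pixel by camera-ray / laser-plane intersection.

    (u, v)            -- pixel where the stripe is detected in the camera image
    fx, fy, cx, cy    -- pinhole intrinsics of the camera
    plane_n, plane_d  -- laser light plane, n . x = d, in the camera frame
    All numeric values in the example below are illustrative assumptions.
    """
    ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])   # ray direction, camera frame
    t = plane_d / float(np.dot(plane_n, ray))             # scale along the ray
    return t * ray                                        # 3-D point in metres

# Example: projector 0.1 m to the camera's right, light plane tilted 20 degrees.
theta = np.radians(20.0)
n = np.array([np.cos(theta), 0.0, np.sin(theta)])
d = 0.1 * np.cos(theta)                    # plane passes through (0.1, 0, 0)
print(stripe_point_3d(400, 260, fx=600, fy=600, cx=320, cy=240, plane_n=n, plane_d=d))
```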

  9. Does It "Want" or "Was It Programmed to..."? Kindergarten Children's Explanations of an Autonomous Robot's Adaptive Functioning

    ERIC Educational Resources Information Center

    Levy, Sharona T.; Mioduser, David

    2008-01-01

    This study investigates young children's perspectives in explaining a self-regulating mobile robot, as they learn to program its behaviors from rules. We explore their descriptions of a robot in action to determine the nature of their explanatory frameworks: psychological or technological. We have also studied the role of an adult's intervention…

  10. Development of dog-like retrieving capability in a ground robot

    NASA Astrophysics Data System (ADS)

    MacKenzie, Douglas C.; Ashok, Rahul; Rehg, James M.; Witus, Gary

    2013-01-01

    This paper presents the Mobile Intelligence Team's approach to addressing the CANINE outdoor ground robot competition. The competition required developing a robot that provided retrieving capabilities similar to a dog, while operating fully autonomously in unstructured environments. The vision team consisted of Mobile Intelligence, the Georgia Institute of Technology, and Wayne State University. Important computer vision aspects of the project were the ability to quickly learn the distinguishing characteristics of novel objects, to search images for the object as the robot drove a search pattern, to identify people near the robot for safe operation, to correctly identify the object among distractors, and to localize the object for retrieval. The classifier used to identify the objects is discussed, including an analysis of its performance, and an overview of the entire system architecture is presented. A discussion of the robot's performance in the competition demonstrates the system's successes in real-world testing.

  11. Proceedings of the 1989 CESAR/CEA (Center for Engineering Systems Advanced Research/Commissariat a l'Energie Atomique) workshop on autonomous mobile robots (May 30--June 1, 1989)

    SciTech Connect

    Harber, K.S.; Pin, F.G. . Center for Engineering Systems Advanced Research)

    1990-03-01

    The US DOE Center for Engineering Systems Advanced Research (CESAR) at the Oak Ridge National Laboratory (ORNL) and the Commissariat a l'Energie Atomique's (CEA) Office de Robotique et Productique within the Directorat a la Valorization are working toward a long-term cooperative agreement and relationship in the area of Intelligent Systems Research (ISR). This report presents the proceedings of the first CESAR/CEA Workshop on Autonomous Mobile Robots, which took place at ORNL on May 30, 31 and June 1, 1989. The purpose of the workshop was to present and discuss methodologies and algorithms under development at the two facilities in the area of perception and navigation for autonomous mobile robots in unstructured environments. Experimental demonstration of the algorithms and comparison of some of their features were proposed to take place within the framework of a previously mutually agreed-upon demonstration scenario, or "base-case." The base-case scenario, described in detail in Appendix A, involved autonomous navigation by the robot in an a priori unknown environment with dynamic obstacles, in order to reach a predetermined goal. From the intermediate goal location, the robot had to search for and locate a control panel, move toward it, and dock in front of the panel face. The CESAR demonstration was successfully accomplished using the HERMIES-IIB robot, while subsets of the CEA demonstration performed using the ARES robot simulation and animation system were presented. The first session of the workshop focused on these experimental demonstrations and on the needs and considerations for establishing "benchmarks" for testing autonomous robot control algorithms.

  12. High reliability outdoor sonar prototype based on efficient signal coding.

    PubMed

    Alvarez, Fernando J; Ureña, Jesús; Mazo, Manuel; Hernández, Alvaro; García, Juan J; de Marziani, Carlos

    2006-10-01

    Many mobile robots and autonomous vehicles designed for outdoor operation have incorporated ultrasonic sensors in their navigation systems, whose function is mainly to avoid possible collisions with very close obstacles. The use of these systems in more precise tasks requires signal encoding and the incorporation of pulse compression techniques that have already been used with success in the design of high-performance indoor sonars. However, the transmission of ultrasonic encoded signals outdoors entails a new challenge because of the effects of atmospheric turbulence. This phenomenon causes random fluctuations in the phase and amplitude of traveling acoustic waves, a fact that can make the encoded signal completely unrecognizable by its matched receiver. Atmospheric turbulence is investigated in this work, with the aim of determining the conditions under which it is possible to assure the reliable outdoor operation of an ultrasonic pulse compression system. As a result of this analysis, a novel sonar prototype based on complementary sequences coding is developed and experimentally tested. This encoding scheme provides the system with very useful additional features, namely, high robustness to noise, multi-mode operation capability (simultaneous emissions with minimum cross talk interference), and the possibility of applying an efficient detection algorithm that notably decreases the hardware resource requirements.
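
    The zero-sidelobe property of complementary sequences, which underlies the robustness reported above, is easy to demonstrate with a Golay pair and a matched filter. The sketch below omits the ultrasonic carrier, the emission hardware and any turbulence model, so it is a deliberately simplified illustration rather than the prototype's signal chain.

```python
import numpy as np

def golay_pair(n_doublings):
    """Recursively build a Golay complementary pair of length 2**n_doublings."""
    a, b = np.array([1.0]), np.array([1.0])
    for _ in range(n_doublings):
        a, b = np.concatenate([a, b]), np.concatenate([a, -b])
    return a, b

def compressed_echo(echo_a, echo_b, a, b):
    """Matched-filter both receptions and sum them: the sidelobes cancel."""
    return (np.correlate(echo_a, a, mode="full")
            + np.correlate(echo_b, b, mode="full"))

a, b = golay_pair(6)                              # 64-sample complementary pair
delay, n = 40, 256
rng = np.random.default_rng(1)
echo_a = np.zeros(n); echo_a[delay:delay + a.size] = a
echo_b = np.zeros(n); echo_b[delay:delay + b.size] = b
echo_a += 0.5 * rng.standard_normal(n)            # additive noise on each reception
echo_b += 0.5 * rng.standard_normal(n)
out = compressed_echo(echo_a, echo_b, a, b)
print("estimated delay:", int(np.argmax(out)) - (a.size - 1))   # expected: 40
```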

  13. CASSY Robot

    NASA Astrophysics Data System (ADS)

    Pittman, Anna; Wright, Ann; Rice, Aaron; Shyaka, Claude

    2014-03-01

    The CASSY Robot project involved two square robots coded in RobotC. The goal was to code a robot to do a certain set of tasks autonomously. To begin with, our task was to code the robot so that it would roam a certain area, marked off by black tape. When the robot hit the black tape, it knew to back up and turn around. It was able to do this thanks to the light sensor attached to the bottom of the robot. Also, whenever the robot hit an obstacle, it knew to stop, back up, and turn around. This was primarily to prevent the robot from hurting itself if it hit an obstacle, and was accomplished by using touch sensors set up as bumpers. Once that was accomplished, we attached sonar sensors and created code so that one robot was able to find and track the other robot in a sort of intruder/police scenario. The overall goal of this project was to code the robot so that we could test it against a robot coded exactly the same way, but using Layered Mode Selection Logic.

  14. vSLAM: vision-based SLAM for autonomous vehicle navigation

    NASA Astrophysics Data System (ADS)

    Goncalves, Luis; Karlsson, Niklas; Ostrowski, Jim; Di Bernardo, Enrico; Pirjanian, Paolo

    2004-09-01

    Among the numerous challenges of building autonomous/unmanned vehicles is that of reliable and autonomous localization in an unknown environment. In this paper we present a system that can efficiently and autonomously solve the robotics 'SLAM' problem, in which a robot placed in an unknown environment must simultaneously localize itself and build a map of that environment. The system is vision-based, and makes use of Evolution Robotics' powerful object recognition technology. As the robot explores the environment, it continuously performs four tasks, using information from acquired images and the drive system odometry. The robot: (1) recognizes previously created 3-D visual landmarks; (2) builds new 3-D visual landmarks; (3) updates the current estimate of its location, using the map; (4) updates the landmark map. In indoor environments, the system can build a map of a 5 m by 5 m area in approximately 20 minutes, and can localize itself with an accuracy of approximately 15 cm in position and 3 degrees in orientation relative to the global reference frame of the landmark map. The same system can be adapted for outdoor, vehicular use.

  15. Autonomous mobile communication relays

    NASA Astrophysics Data System (ADS)

    Nguyen, Hoa G.; Everett, Hobart R.; Manouk, Narek; Verma, Ambrish

    2002-07-01

    Maintaining a solid radio communication link between a mobile robot entering a building and an external base station is a well-recognized problem. Modern digital radios, while affording high bandwidth and Internet-protocol-based automatic routing capabilities, tend to operate on line-of-sight links. The communication link degrades quickly as a robot penetrates deeper into the interior of a building. This project investigates the use of mobile autonomous communication relay nodes to extend the effective range of a mobile robot exploring a complex interior environment. Each relay node is a small mobile slave robot equipped with sonar, ladar, and an 802.11b radio repeater. For demonstration purposes, four Pioneer 2-DX robots are used as autonomous mobile relays, with SSC-San Diego's ROBART III acting as the lead robot. The relay robots follow the lead robot into a building and are automatically deployed at various locations to maintain a networked communication link back to the remote operator. With their on-board external sensors, they also act as rearguards to secure areas already explored by the lead robot. As the lead robot advances and RF shortcuts are detected, relay nodes that become unnecessary are reclaimed and reused, all transparently to the operator. This project takes advantage of recent research results from several DARPA-funded tasks at various institutions in the areas of robotic simulation, ad hoc wireless networking, route planning, and navigation. This paper describes the progress of the first six months of the project.

  16. Simulation of the outdoor energy efficiency of an autonomous solar kit based on meteorological data for a site in Central Europe

    NASA Astrophysics Data System (ADS)

    Bouzaki, Mohammed Moustafa; Chadel, Meriem; Benyoucef, Boumediene; Petit, Pierre; Aillerie, Michel

    2016-07-01

    This contribution analyzes the energy provided by a solar kit dedicated to autonomous usage and installed in Central Europe (longitude 6.10°, latitude 49.21°, altitude 160 m) by using the simulation software PVSYST. We focused the analysis on the effect of temperature and solar irradiation on the I-V characteristic of a commercial PV panel. We also consider in this study the influence of charging and discharging the battery on the generator efficiency. Meteorological data are integrated into the simulation software. As expected, the solar kit provides energy that varies over the year, with a minimum in December. In the proposed approach, we consider this minimum as the lowest acceptable energy level needed to satisfy the demand. Thus, for the other months, a loss of available renewable energy occurs if no storage system is included.
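
    To make the temperature and irradiance dependence concrete, here is a deliberately simplified Python sketch of a PV panel's maximum power as a function of both variables. The reference values and temperature coefficients are illustrative assumptions, not the simulated commercial panel's datasheet, and PVSYST itself uses a full single-diode model rather than this fill-factor shortcut.

```python
def panel_max_power(g, t_cell,
                    g_ref=1000.0, t_ref=25.0,
                    isc_ref=8.5, voc_ref=37.0, fill_factor=0.75,
                    alpha_isc=0.0005, beta_voc=-0.0032):
    """Very simplified PV model: maximum power vs irradiance and temperature.

    g, t_cell -- plane-of-array irradiance (W/m^2) and cell temperature (degC).
    All reference values and coefficients are illustrative assumptions.
    """
    isc = isc_ref * (g / g_ref) * (1.0 + alpha_isc * (t_cell - t_ref))
    voc = voc_ref * (1.0 + beta_voc * (t_cell - t_ref))
    return fill_factor * isc * voc          # watts

# Winter noon vs summer noon at the same site (illustrative operating points).
print(round(panel_max_power(g=300.0, t_cell=5.0), 1), "W in December")
print(round(panel_max_power(g=900.0, t_cell=45.0), 1), "W in July")
```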

  17. Demonstration of a semi-autonomous hybrid brain-machine interface using human intracranial EEG, eye tracking, and computer vision to control a robotic upper limb prosthetic.

    PubMed

    McMullen, David P; Hotson, Guy; Katyal, Kapil D; Wester, Brock A; Fifer, Matthew S; McGee, Timothy G; Harris, Andrew; Johannes, Matthew S; Vogelstein, R Jacob; Ravitz, Alan D; Anderson, William S; Thakor, Nitish V; Crone, Nathan E

    2014-07-01

    To increase the ability of brain-machine interfaces (BMIs) to control advanced prostheses such as the modular prosthetic limb (MPL), we are developing a novel system: the Hybrid Augmented Reality Multimodal Operation Neural Integration Environment (HARMONIE). This system utilizes hybrid input, supervisory control, and intelligent robotics to allow users to identify an object (via eye tracking and computer vision) and initiate (via brain-control) a semi-autonomous reach-grasp-and-drop of the object by the MPL. Sequential iterations of HARMONIE were tested in two pilot subjects implanted with electrocorticographic (ECoG) and depth electrodes within motor areas. The subjects performed the complex task in 71.4% (20/28) and 67.7% (21/31) of trials after minimal training. Balanced accuracy for detecting movements was 91.1% and 92.9%, significantly greater than chance accuracies (p < 0.05). After BMI-based initiation, the MPL completed the entire task 100% (one object) and 70% (three objects) of the time. The MPL took approximately 12.2 s for task completion after system improvements implemented for the second subject. Our hybrid-BMI design prevented all but one baseline false positive from initiating the system. The novel approach demonstrated in this proof-of-principle study, using hybrid input, supervisory control, and intelligent robotics, addresses limitations of current BMIs. PMID:24760914

  19. On approximate reasoning and minimal models for the development of robust outdoor vehicle navigation schemes

    SciTech Connect

    Pin, F.G.

    1993-11-01

    Outdoor sensor-based operation of autonomous robots has proved to be an extremely challenging problem, mainly because of the difficulties encountered when attempting to represent the many uncertainties which are always present in the real world. These uncertainties are primarily due to sensor imprecision and the unpredictability of the environment, i.e., lack of full knowledge of the environment characteristics and dynamics. Two basic principles, or philosophies, and their associated methodologies are proposed in an attempt to remedy some of these difficulties. The first principle is based on the concept of a "minimal model" for accomplishing given tasks and proposes to utilize only the minimum level of information and precision necessary to accomplish elemental functions of complex tasks. This approach diverges completely from the direction taken by most artificial vision studies, which conventionally call for crisp and detailed analysis of every available component in the perception data. The paper first reviews the basic concepts of this approach and discusses its pragmatic feasibility when embodied in a behaviorist framework. The second principle deals with implicit representation of uncertainties using Fuzzy Set Theory-based approximations and approximate reasoning, rather than explicit (crisp) representation through calculation and conventional propagation techniques. A framework which merges these principles and approaches is presented, and its application to the problem of sensor-based outdoor navigation of a mobile robot is discussed. Results of navigation experiments with a real car in actual outdoor environments are also discussed to illustrate the feasibility of the overall concept.
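
    In the spirit of the minimal-model, approximate-reasoning philosophy, the sketch below steers from only three coarse range readings using simple memberships and two rules. The breakpoints and rule weights are illustrative assumptions and do not reproduce the paper's actual fuzzy rule base.

```python
def fuzzy_steer(left_range, front_range, right_range, near=1.0, far=3.0):
    """Coarse fuzzy obstacle-avoidance steering in the 'minimal model' spirit.

    Ranges are in metres; the near/far breakpoints and rule weights are
    illustrative assumptions. Returns a command in [-1, 1] (negative = left).
    """
    def nearness(r):                    # 1 when very close, 0 when far away
        return max(0.0, min(1.0, (far - r) / (far - near)))

    nl, nf, nr = nearness(left_range), nearness(front_range), nearness(right_range)
    # Rule 1: obstacle left -> steer right (+), obstacle right -> steer left (-).
    steer = nl - nr
    # Rule 2: obstacle ahead -> commit to the more open side.
    if nf > 0.5:
        steer += 0.5 if nl <= nr else -0.5
    return max(-1.0, min(1.0, steer))

print(fuzzy_steer(0.8, 2.5, 3.5))   # wall on the left -> positive (turn right)
```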

  20. Outdoor Living.

    ERIC Educational Resources Information Center

    Cotter, Kathy

    Course objectives and learning activities are contained in this curriculum guide for a 16-week home economics course which teaches cooking and sewing skills applicable to outdoor living. The course goals include increasing male enrollment in the home economics program, developing students' self-confidence and ability to work in groups, and…

  1. Outdoor Integration

    ERIC Educational Resources Information Center

    Tatarchuk, Shawna; Eick, Charles

    2011-01-01

    An outdoor classroom is an exciting way to connect the learning of science to nature and the environment. Many school grounds include gardens, grassy areas, courtyards, and wooded areas. Some even have nearby streams or creeks. These are built-in laboratories for inquiry! In the authors' third-grade classroom, they align and integrate…

  2. Outdoor Activities.

    ERIC Educational Resources Information Center

    Minneapolis Independent School District 275, Minn.

    Twenty-four activities suitable for outdoor use by elementary school children are outlined. Activities designed to make children aware of their environment include soil painting, burr collecting, insect and pond water collecting, studies of insect galls and field mice, succession studies, and a model of natural selection using dyed toothpicks. A…

  3. Robotics

    NASA Technical Reports Server (NTRS)

    Ambrose, Robert O.

    2007-01-01

    Lunar robotic functions include: 1. Transport of crew and payloads on the surface of the moon; 2. Offloading payloads from a lunar lander; 3. Handling the deployment of surface systems; with 4. Human commanding of these functions from inside a lunar vehicle, habitat, or extravehicular (space walk), with Earth-based supervision. The systems that will perform these functions may not look like robots from science fiction. In fact, robotic functions may be automated trucks, cranes and winches. Use of this equipment prior to the crew's arrival or in the potentially long periods without crews on the surface, will require that these systems be computer controlled machines. The public release of NASA's Exploration plans at the 2nd Space Exploration Conference (Houston, December 2006) included a lunar outpost with as many as four unique mobility chassis designs. The sequence of lander offloading tasks involved as many as ten payloads, each with a unique set of geometry, mass and interface requirements. This plan was refined during a second phase study concluded in August 2007. Among the many improvements to the exploration plan were a reduction in the number of unique mobility chassis designs and a reduction in unique payload specifications. As the lunar surface system payloads have matured, so have the mobility and offloading functional requirements. While the architecture work continues, the community can expect to see functional requirements in the areas of surface mobility, surface handling, and human-systems interaction as follows: Surface Mobility 1. Transport crew on the lunar surface, accelerating construction tasks, expanding the crew's sphere of influence for scientific exploration, and providing a rapid return to an ascent module in an emergency. The crew transport can be with an un-pressurized rover, a small pressurized rover, or a larger mobile habitat. 2. Transport Extra-Vehicular Activity (EVA) equipment and construction payloads. 3. Transport habitats and

  4. Comparison of optical modeling and neural networks for robot guidance

    NASA Astrophysics Data System (ADS)

    Parasnis, Sameer; Velidandla, Sasanka; Hall, Ernest L.; Anand, Sam

    1998-10-01

    A truly autonomous robot must sense its environment and react appropriately. These issues attain greater importance in an outdoor, variable environment. Previous mobile robot perception systems have relied on hand-coded algorithms for processing sensor information. Recent techniques involve the use of artificial neural networks to process sensor data for mobile robot guidance. A comparison of a fuzzy logic controller for an AGV and a neural network perception system is described in this work. A mobile robot test bed has been constructed using a golf cart base. The test bed has a fuzzy logic controller which uses both vision and obstacle information and provides the steering and speed controls to the robot. A feed-forward neural network is described to guide the robot using vision and range data. Suitable criteria for comparison are formulated, and the hand-coded system is compared with the connectionist model on performance, efficiency and reliability. The significance of this work is that it provides comparative tradeoffs between two important robot guidance methods.

  5. Robotics

    NASA Technical Reports Server (NTRS)

    Rothschild, Lynn J.

    2012-01-01

    Earth's upper atmosphere is an extreme environment: dry, cold, and irradiated. It is unknown whether our aerobiosphere is limited to the transport of life, or there exist organisms that grow and reproduce while airborne (aerophiles); the microenvironments of suspended particles may harbor life at otherwise uninhabited altitudes[2]. The existence of aerophiles would significantly expand the range of planets considered candidates for life by, for example, including the cooler clouds of a hot Venus-like planet. The X project is an effort to engineer a robotic exploration and biosampling payload for a comprehensive survey of Earth's aerobiology. While many one-shot samples have been retrieved from above 15 km, their results are primarily qualitative; variations in method confound comparisons, leaving such major gaps in our knowledge of aerobiology as quantification of populations at different strata and relative species counts[1]. These challenges and X's preliminary solutions are explicated below. X's primary balloon payload is undergoing a series of calibrations before beginning flights in Spring 2012. A suborbital launch is currently planned for Summer 2012. A series of ground samples taken in Winter 2011 is being used to establish baseline counts and identify likely background contaminants.

  6. Autonomous system for cross-country navigation

    NASA Astrophysics Data System (ADS)

    Stentz, Anthony; Brumitt, Barry L.; Coulter, R. C.; Kelly, Alonzo

    1993-05-01

    Autonomous cross-country navigation is essential for outdoor robots moving about in unstructured environments. Most existing systems use range sensors to determine the shape of the terrain, plan a trajectory that avoids obstacles, and then drive the trajectory. Performance has been limited by the range and accuracy of sensors, insufficient vehicle-terrain interaction models, and the availability of high-speed computers. As these elements improve, higher-speed navigation on rougher terrain becomes possible. We have developed a software system for autonomous navigation that provides greater capability. The perception system supports a large braking distance by fusing multiple range images to build a map of the terrain in front of the vehicle. The system identifies range shadows and interpolates undersampled regions to account for rough terrain effects. The motion planner reduces computational complexity by investigating a minimum number of trajectories. Speeds along the trajectory are set to provide dynamic stability. The entire system was tested in simulation, and a subset of the capability was demonstrated on a real vehicle. Results to date include a continuous 5.1 kilometer run across moderate terrain with obstacles. This paper begins with the applications, prior work, limitations, and current paradigms for autonomous cross-country navigation, and then describes our contribution to the area.
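
    The planner's idea of scoring only a small set of candidate trajectories against a terrain map can be illustrated in a few lines. The Python sketch below rolls out three constant-curvature arcs over a toy cost grid and keeps the cheapest; the grid convention, the default cost for range-shadowed cells and all numbers are assumptions, not the system's actual planner.

```python
import math

def evaluate_arcs(cost_map, cell_size, horizon, curvatures):
    """Score constant-curvature arcs against a terrain cost map and keep the best.

    cost_map   -- dict {(ix, iy): cost}; the vehicle starts at (0, 0) heading +x,
                  and missing cells (range shadows) default to a high cost
    cell_size  -- metres per grid cell;  horizon -- arc length to roll out (m)
    curvatures -- small candidate set of arc curvatures (1/m), mirroring the
                  planner's reduced search space (all conventions assumed)
    """
    best = None
    for kappa in curvatures:
        x = y = heading = 0.0
        total, step = 0.0, cell_size
        for _ in range(int(horizon / step)):
            x += step * math.cos(heading)
            y += step * math.sin(heading)
            heading += step * kappa
            total += cost_map.get((round(x / cell_size), round(y / cell_size)), 5.0)
        if best is None or total < best[1]:
            best = (kappa, total)
    return best

# Toy terrain: cheap corridor along +x, expensive cells off to one side (j > 1).
cost_map = {(i, j): (4.0 if j > 1 else 0.2) for i in range(30) for j in range(-5, 6)}
print(evaluate_arcs(cost_map, cell_size=0.5, horizon=8.0, curvatures=[-0.2, 0.0, 0.2]))
```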

  7. Autonomic dysreflexia

    MedlinePlus

    Autonomic hyperreflexia; Spinal cord injury - autonomic dysreflexia; SCI - autonomic dysreflexia ... most common cause of autonomic dysreflexia (AD) is spinal cord injury. The nervous system of people with AD ...

  8. Non-destructive inspection in industrial equipment using robotic mobile manipulation

    NASA Astrophysics Data System (ADS)

    Maurtua, Iñaki; Susperregi, Loreto; Ansuategui, Ander; Fernández, Ane; Ibarguren, Aitor; Molina, Jorge; Tubio, Carlos; Villasante, Cristobal; Felsch, Torsten; Pérez, Carmen; Rodriguez, Jorge R.; Ghrissi, Meftah

    2016-05-01

    The MAINBOT project has developed service-robot-based applications to autonomously execute inspection tasks in extensive industrial plants, on equipment that is arranged horizontally (using ground robots) or vertically (using climbing robots). The industrial objective has been to provide a means of measuring several physical parameters at multiple points with autonomous robots able to navigate and climb structures while handling non-destructive testing sensors. MAINBOT has validated the solutions in two solar thermal plants (cylindrical-parabolic collectors and central tower), which are very demanding from a mobile manipulation point of view, mainly due to their extent (e.g., a 50 MW thermal solar plant with 400 hectares, 400,000 mirrors, 180 km of absorber tubes, and a 140 m tower), the variability of conditions (outdoor, day-night), safety requirements, etc. Once the technology was validated in simulation, the system was deployed in real setups and different validation tests were carried out. In this paper, two of the achievements related to the ground mobile inspection system are presented: (1) autonomous navigation, localization and planning algorithms to manage navigation over very large areas, and (2) non-destructive inspection operations: thermography-based detection algorithms that provide the robots with automatic inspection abilities.

  9. Fuzzy Sets in Dynamic Adaptation of Parameters of a Bee Colony Optimization for Controlling the Trajectory of an Autonomous Mobile Robot.

    PubMed

    Amador-Angulo, Leticia; Mendoza, Olivia; Castro, Juan R; Rodríguez-Díaz, Antonio; Melin, Patricia; Castillo, Oscar

    2016-09-09

    A hybrid approach composed of different types of fuzzy systems, namely the Type-1 Fuzzy Logic System (T1FLS), the Interval Type-2 Fuzzy Logic System (IT2FLS) and the Generalized Type-2 Fuzzy Logic System (GT2FLS), for the dynamic adaptation of the alpha and beta parameters of a Bee Colony Optimization (BCO) algorithm is presented. The objective of the work is to use the BCO technique to find the optimal distribution of the membership functions in the design of fuzzy controllers. We use BCO specifically for tuning the membership functions of the fuzzy controller for trajectory stability in an autonomous mobile robot. We add two types of perturbations to the model for the Generalized Type-2 Fuzzy Logic System to better analyze its behavior under uncertainty, and this shows better results when compared to the original BCO. We implemented various performance indices (ITAE, IAE, ISE, ITSE, RMSE and MSE) to measure the performance of the controller. The experimental results show better performance using GT2FLS than with IT2FLS and T1FLS in the dynamic adaptation of the parameters for the BCO algorithm.
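
    As a schematic of what dynamically adapting alpha and beta means in practice, the sketch below maps iteration progress and swarm diversity to the two BCO parameters with a crisp, type-1-style rule. This only illustrates the mechanism; the paper's interval and generalized type-2 fuzzy systems, membership functions and rule bases are not reproduced here.

```python
def adapt_alpha_beta(iteration, max_iterations, diversity):
    """Dynamically adapt BCO's alpha (exploitation) and beta (exploration).

    Inputs are the normalized iteration progress and a swarm diversity measure
    in [0, 1]; outputs are alpha and beta in [0, 1]. The shape of this mapping
    is an illustrative assumption, not the paper's type-2 fuzzy rule base.
    """
    progress = iteration / float(max_iterations)
    # Early in the run, or when the swarm has collapsed onto one region
    # (low diversity), favour exploration; late and diverse -> exploit.
    explore = max(1.0 - progress, 1.0 - diversity)
    alpha = 0.2 + 0.8 * (1.0 - explore)
    beta = 0.2 + 0.8 * explore
    return alpha, beta

for it in (1, 50, 100):
    print(it, [round(p, 2) for p in adapt_alpha_beta(it, 100, diversity=0.6)])
```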

  12. Intelligent control in mobile robotics: the PANORAMA project

    NASA Astrophysics Data System (ADS)

    Greenway, Phil

    1994-03-01

    The European Community's strategic research initiative in information technology has been in place for seven years. A good example of the pan-European collaborative projects conducted under this initiative is PANORAMA: Perception and Navigation for Autonomous Mobile Robot Applications. This four-and-a-half-year project, completed in October 1993, aimed to prove the feasibility of an autonomous mobile robotic system replacing a human-operated vehicle working outdoors in a partially structured environment. The autonomous control of a mobile rock drilling machine was chosen as a challenging and representative test scenario. This paper presents an overview of intelligent mobile robot control architectures. Goals and objectives of the project are described, together with the makeup of the consortium and the roles of the members within it. The main technical achievements from PANORAMA are then presented, with emphasis given to the problems of realizing intelligent control. In particular, the planning and replanning of a mission, and the corresponding architectural choices and infrastructure required to support the chosen task-oriented approach, are discussed. Specific attention is paid to the functional decomposition of the system, and how the requirements for 'intelligent control' impact on the organization of the identified system components. Future work and outstanding problems are considered in some concluding remarks.

  13. Planning Flight Paths of Autonomous Aerobots

    NASA Technical Reports Server (NTRS)

    Kulczycki, Eric; Elfes, Alberto; Sharma, Shivanjli

    2009-01-01

    Algorithms for planning flight paths of autonomous aerobots (robotic blimps) to be deployed in scientific exploration of remote planets are undergoing development. These algorithms are also adaptable to terrestrial applications involving robotic submarines as well as aerobots and other autonomous aircraft used to acquire scientific data or to perform surveying or monitoring functions.

  14. Autonomous surveillance for biosecurity.

    PubMed

    Jurdak, Raja; Elfes, Alberto; Kusy, Branislav; Tews, Ashley; Hu, Wen; Hernandez, Emili; Kottege, Navinda; Sikka, Pavan

    2015-04-01

    The global movement of people and goods has increased the risk of biosecurity threats and their potential to incur large economic, social, and environmental costs. Conventional manual biosecurity surveillance methods are limited by their scalability in space and time. This article focuses on autonomous surveillance systems, comprising sensor networks, robots, and intelligent algorithms, and their applicability to biosecurity threats. We discuss the spatial and temporal attributes of autonomous surveillance technologies and map them to three broad categories of biosecurity threat: (i) vector-borne diseases; (ii) plant pests; and (iii) aquatic pests. Our discussion reveals a broad range of opportunities to serve biosecurity needs through autonomous surveillance.

  15. Autonomous surveillance for biosecurity.

    PubMed

    Jurdak, Raja; Elfes, Alberto; Kusy, Branislav; Tews, Ashley; Hu, Wen; Hernandez, Emili; Kottege, Navinda; Sikka, Pavan

    2015-04-01

    The global movement of people and goods has increased the risk of biosecurity threats and their potential to incur large economic, social, and environmental costs. Conventional manual biosecurity surveillance methods are limited by their scalability in space and time. This article focuses on autonomous surveillance systems, comprising sensor networks, robots, and intelligent algorithms, and their applicability to biosecurity threats. We discuss the spatial and temporal attributes of autonomous surveillance technologies and map them to three broad categories of biosecurity threat: (i) vector-borne diseases; (ii) plant pests; and (iii) aquatic pests. Our discussion reveals a broad range of opportunities to serve biosecurity needs through autonomous surveillance. PMID:25744760

  16. A positional estimation technique for an autonomous land vehicle in an unstructured environment

    NASA Technical Reports Server (NTRS)

    Talluri, Raj; Aggarwal, J. K.

    1990-01-01

    This paper presents a solution to the positional estimation problem of an autonomous land vehicle navigating in an unstructured mountainous terrain. A Digital Elevation Map (DEM) of the area in which the robot is to navigate is assumed to be given. It is also assumed that the robot is equipped with a camera that can be panned and tilted, and a device to measure the elevation of the robot above the ground surface. No recognizable landmarks are assumed to be present in the environment in which the robot is to navigate. The solution presented makes use of the DEM information, and structures the problem as a heuristic search in the DEM for the possible robot location. The shape and position of the horizon line in the image plane and the known camera geometry of the perspective projection are used as parameters to search the DEM. Various heuristics drawn from the geometric constraints are used to prune the search space significantly. The algorithm is made robust to errors in the imaging process by accounting for the worst care errors. The approach is tested using DEM data of areas in Colorado and Texas. The method is suitable for use in outdoor mobile robots and planetary rovers.
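
    A minimal sketch of the horizon-matching idea follows. It assumes the DEM is a NumPy array of elevations in metres on a regular grid and that an observed horizon profile (elevation angle per azimuth) has already been extracted from the image; the paper's geometric heuristics, search pruning, and worst-case error handling are not reproduced.

      # Horizon-line matching over a DEM -- illustrative sketch only.
      import numpy as np

      def predicted_horizon(dem, cell_size, r, c, cam_height, azimuths, max_range=200):
          """Maximum elevation angle (radians) seen from cell (r, c) along each azimuth."""
          z0 = dem[r, c] + cam_height
          angles = np.full(len(azimuths), -np.pi / 2)
          for i, az in enumerate(azimuths):
              for step in range(1, max_range):
                  rr = int(round(r + step * np.sin(az)))
                  cc = int(round(c + step * np.cos(az)))
                  if not (0 <= rr < dem.shape[0] and 0 <= cc < dem.shape[1]):
                      break
                  ang = np.arctan2(dem[rr, cc] - z0, step * cell_size)
                  angles[i] = max(angles[i], ang)
          return angles

      def best_candidate(dem, cell_size, cam_height, observed, azimuths, candidates):
          """Return the candidate cell whose predicted horizon best matches the image horizon."""
          scores = [np.sum((predicted_horizon(dem, cell_size, r, c, cam_height, azimuths)
                            - observed) ** 2) for r, c in candidates]
          return candidates[int(np.argmin(scores))]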

  17. Some Outdoor Educators' Experiences of Outdoor Education

    ERIC Educational Resources Information Center

    Gunn, Terry

    2006-01-01

    The phenomenological study presented in this paper attempts to determine, from outdoor educators, what it meant for them to be teaching outdoor education in Victorian secondary schools during 2004. In 1999, Lugg and Martin surveyed Victorian secondary schools to determine the types of outdoor education programs being run, the objectives of those…

  18. Robotic intelligence kernel

    DOEpatents

    Bruemmer, David J.

    2009-11-17

    A robot platform includes perceptors, locomotors, and a system controller. The system controller executes a robot intelligence kernel (RIK) that includes a multi-level architecture and a dynamic autonomy structure. The multi-level architecture includes a robot behavior level for defining robot behaviors that incorporate robot attributes, and a cognitive level for defining conduct modules that blend an adaptive interaction between predefined decision functions and the robot behaviors. The dynamic autonomy structure is configured for modifying a transaction capacity between an operator intervention and a robot initiative and may include multiple levels, with at least a teleoperation mode configured to maximize the operator intervention and minimize the robot initiative, and an autonomous mode configured to minimize the operator intervention and maximize the robot initiative. Within the RIK, at least the cognitive level includes the dynamic autonomy structure.
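
    The sketch below illustrates the general notion of shifting control between operator intervention and robot initiative across autonomy levels. Only the "teleoperation" and "autonomous" mode names come from the patent text; the intermediate "shared" mode, the weights, and the linear blend are assumptions for illustration.

      # Dynamic-autonomy blending between operator intervention and robot initiative.
      from dataclasses import dataclass

      @dataclass
      class AutonomyMode:
          name: str
          operator_weight: float   # share of control retained by the operator
          robot_weight: float      # share of control taken by robot behaviors

      MODES = [
          AutonomyMode("teleoperation", 1.0, 0.0),   # maximize operator intervention
          AutonomyMode("shared",        0.5, 0.5),   # hypothetical intermediate level
          AutonomyMode("autonomous",    0.0, 1.0),   # maximize robot initiative
      ]

      def blend_command(mode, operator_cmd, behavior_cmd):
          """Blend operator and behavior velocity commands according to the mode."""
          return tuple(mode.operator_weight * o + mode.robot_weight * b
                       for o, b in zip(operator_cmd, behavior_cmd))

      # Example: in the assumed shared mode, the command is averaged with the behavior output.
      print(blend_command(MODES[1], operator_cmd=(0.8, 0.0), behavior_cmd=(0.4, 0.3)))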

  19. Learning for autonomous navigation

    NASA Technical Reports Server (NTRS)

    Angelova, Anelia; Howard, Andrew; Matthies, Larry; Tang, Benyang; Turmon, Michael; Mjolsness, Eric

    2005-01-01

    Autonomous off-road navigation of robotic ground vehicles has important applications on Earth and in space exploration. Progress in this domain has been retarded by the limited lookahead range of 3-D sensors and by the difficulty of preprogramming systems to understand the traversability of the wide variety of terrain they can encounter.

  20. The Outdoor Programming Handbook.

    ERIC Educational Resources Information Center

    Watters, Ron

    This manual provides guidelines for developing outdoor recreation programs. The manual was prepared for adult outdoor recreation programs, but could be useful for other age groups as well. The following topics are discussed: (1) the historical perspectives of outdoor recreation programming; (2) outdoor programming models, including the club model,…

  1. Outdoor Education Manual.

    ERIC Educational Resources Information Center

    Nashville - Davidson County Metropolitan Public Schools, TN.

    Creative ways to use the outdoors as a part of the regular school curriculum are outlined in this teacher's manual for the elementary grades. Presented for consideration are the general objectives of outdoor education, suggestions for evaluating outdoor education experiences, and techniques for teaching outdoor education. The purpose and functions…

  2. Robotic transportation.

    PubMed

    Lob, W S

    1990-09-01

    Mobile robots perform fetch-and-carry tasks autonomously. An intelligent, sensor-equipped mobile robot does not require dedicated pathways or extensive facility modification. In the hospital, mobile robots can be used to carry specimens, pharmaceuticals, meals, etc. between supply centers, patient areas, and laboratories. The HelpMate (Transitions Research Corp.) mobile robot was developed specifically for hospital environments. To reach a desired destination, Help-Mate navigates with an on-board computer that continuously polls a suite of sensors, matches the sensor data against a pre-programmed map of the environment, and issues drive commands and path corrections. A sender operates the robot with a user-friendly menu that prompts for payload insertion and desired destination(s). Upon arrival at its selected destination, the robot prompts the recipient for a security code or physical key and awaits acknowledgement of payload removal. In the future, the integration of HelpMate with robot manipulators, test equipment, and central institutional information systems will open new applications in more localized areas and should help overcome difficulties in filling transport staff positions.

  3. Robotic transportation.

    PubMed

    Lob, W S

    1990-09-01

    Mobile robots perform fetch-and-carry tasks autonomously. An intelligent, sensor-equipped mobile robot does not require dedicated pathways or extensive facility modification. In the hospital, mobile robots can be used to carry specimens, pharmaceuticals, meals, etc. between supply centers, patient areas, and laboratories. The HelpMate (Transitions Research Corp.) mobile robot was developed specifically for hospital environments. To reach a desired destination, Help-Mate navigates with an on-board computer that continuously polls a suite of sensors, matches the sensor data against a pre-programmed map of the environment, and issues drive commands and path corrections. A sender operates the robot with a user-friendly menu that prompts for payload insertion and desired destination(s). Upon arrival at its selected destination, the robot prompts the recipient for a security code or physical key and awaits acknowledgement of payload removal. In the future, the integration of HelpMate with robot manipulators, test equipment, and central institutional information systems will open new applications in more localized areas and should help overcome difficulties in filling transport staff positions. PMID:2208684

  4. Mobile Autonomous Humanoid Assistant

    NASA Technical Reports Server (NTRS)

    Diftler, M. A.; Ambrose, R. O.; Tyree, K. S.; Goza, S. M.; Huber, E. L.

    2004-01-01

    A mobile autonomous humanoid robot is assisting human co-workers at the Johnson Space Center with tool handling tasks. This robot combines the upper body of the National Aeronautics and Space Administration (NASA)/Defense Advanced Research Projects Agency (DARPA) Robonaut system with a Segway(TradeMark) Robotic Mobility Platform yielding a dexterous, maneuverable humanoid perfect for aiding human co-workers in a range of environments. This system uses stereo vision to locate human team mates and tools and a navigation system that uses laser range and vision data to follow humans while avoiding obstacles. Tactile sensors provide information to grasping algorithms for efficient tool exchanges. The autonomous architecture utilizes these pre-programmed skills to form human assistant behaviors. The initial behavior demonstrates a robust capability to assist a human by acquiring a tool from a remotely located individual and then following the human in a cluttered environment with the tool for future use.

  5. Outdoors classes

    NASA Astrophysics Data System (ADS)

    Szymanska-Markowska, Barbara

    2016-04-01

    Why should students be trapped within the four walls of the classroom when there are so many ways to run lessons differently? I am not a fan of holding every lesson at school, and for many students staying only at school is boring too. So I decided to organize workshops and trips to universities or outdoors. I created KMO (Discoverer's Club for Teenagers) at my school, where students gave me ideas and we started to make them real. I teach at a school where students don't like science, and I try hard to change their point of view about it. That is why I started to take part in different competitions with my students. Last year we measured noise everywhere using applications on a tablet, to convince them that noise is very harmful to our bodies. We found that the most harmful noise occurred during school breaks, near motorways and in households. We also showed that the acoustic screens beside the motorways did not protect us from noise: 30 meters from the screens the noise level was the same as at the motorway. We won the main prize for these measurements. We also received awards for calculating the costs of a car powered by a solar panel. We measured everything by computer. This year we decided to write an essay about trees and weather. We went to the forest and looked for cut trees because we wanted to read the age of a tree from its stump. I had not known earlier that we could read past weather from a tree's rings. We examined a lot of trees and can say that trees are good carriers of information about weather and natural disasters. I have started studying safety education and I have many ideas for getting my students interested in this subject, which is related to P.E., physics and chemistry as well. I hope that I will use what I gained from the European Space Education Resource Office and the GIFT workshop. I plan to use satellites and space to teach my students how they can check information about terrorism, floods or other

  6. Autonomous and Autonomic Swarms

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Truszkowski, Walter F.; Rouff, Christopher A.; Sterritt, Roy

    2005-01-01

    A watershed in systems engineering is represented by the advent of swarm-based systems that accomplish missions through cooperative action by a (large) group of autonomous individuals each having simple capabilities and no global knowledge of the group's objective. Such systems, with individuals capable of surviving in hostile environments, pose unprecedented challenges to system developers. Design and testing and verification at much higher levels will be required, together with the corresponding tools, to bring such systems to fruition. Concepts for possible future NASA space exploration missions include autonomous, autonomic swarms. Engineering swarm-based missions begins with understanding autonomy and autonomicity and how to design, test, and verify systems that have those properties and, simultaneously, the capability to accomplish prescribed mission goals. Formal methods-based technologies, both projected and in development, are described in terms of their potential utility to swarm-based system developers.

  7. Aerial Explorers and Robotic Ecosystems

    NASA Technical Reports Server (NTRS)

    Young, Larry A.; Pisanich, Greg

    2004-01-01

    A unique bio-inspired approach to autonomous aerial vehicle, a.k.a. aerial explorer technology is discussed. The work is focused on defining and studying aerial explorer mission concepts, both as an individual robotic system and as a member of a small robotic "ecosystem." Members of this robotic ecosystem include the aerial explorer, air-deployed sensors and robotic symbiotes, and other assets such as rovers, landers, and orbiters.

  8. Camping and Outdoor Education.

    ERIC Educational Resources Information Center

    Bucher, Charles A.

    Outdoor education has become an integral part of the curriculum in a number of schools across the nation. Outdoor education activities can be readily integrated into physical education, recreation, and adult education programs, as well as science, mathematics, and related fields. Camping and outdoor education should become a part of each child's…

  9. Education and Outdoor Recreation.

    ERIC Educational Resources Information Center

    Bureau of Outdoor Recreation (Dept. of Interior), Washington, DC.

    A special study was conducted to determine the needs and demands of the public for outdoor recreation. Increasing amounts of leisure time of the American people are being used for outdoor recreation activities. Ways in which education can help people realize optimum benefit from recreational use of the outdoor environment are discussed.…

  10. Education and Outdoor Recreation.

    ERIC Educational Resources Information Center

    Bureau of Outdoor Recreation (Dept. of Interior), Washington, DC.

    Responsibility for meeting the needs and demands of the public for outdoor recreation has led the Bureau of Outdoor Recreation to cooperate with educational institutions and others in order to assist in establishing education programs and activities and to encourage public use and benefits from outdoor recreation. To this end the Bureau conducts…

  11. Outdoor Education in Scotland.

    ERIC Educational Resources Information Center

    Higgins, Peter

    2002-01-01

    Presents an overview of the development of outdoor education in Scotland, including geophysical, historical, political, and social influences on attitudes toward outdoor recreation and education; Scottish theoretical perspectives on outdoor education; and the changing pattern of provision, from public provision in schools to greater involvement of…

  12. Outdoor Environments. Beginnings Workshop.

    ERIC Educational Resources Information Center

    Child Care Information Exchange, 2003

    2003-01-01

    Presents seven articles on outdoor play environments: "Are We Losing Ground?" (Greenman); "Designing and Creating Natural Play Environments for Young Children" (Keeler); "Adventure Playgrounds and Outdoor Safety Issues" (McGinnis); "Trust, the Earth and Children: Birth to Three" (Young); "Outdoor Magic for Family Child Care Providers" (Osborn); "A…

  13. Americans and the Outdoors.

    ERIC Educational Resources Information Center

    1987

    A presidential panel was established in January, 1985 to examine the status of outdoor recreation in the United States. This publication briefly summarizes the full report of the President's Commission on Americans Outdoors. It discusses the essential need for providing every citizen with opportunity for outdoor experiences and suggest how it can…

  14. Your Outdoor Classroom

    ERIC Educational Resources Information Center

    Hinman, Laurie

    2005-01-01

    Physical education is still taught in outdoor settings in many warmer climates of the United States. Even when indoor facilities are available, physical education may be moved outside because of other curricular needs or facility issues. How can physical educators make the outdoor setting seem more like an indoor classroom? Outdoor teaching…

  15. A cognitive robotic system based on the Soar cognitive architecture for mobile robot navigation, search, and mapping missions

    NASA Astrophysics Data System (ADS)

    Hanford, Scott D.

    Most unmanned vehicles used for civilian and military applications are remotely operated or are designed for specific applications. As these vehicles are used to perform more difficult missions or a larger number of missions in remote environments, there will be a great need for these vehicles to behave intelligently and autonomously. Cognitive architectures, computer programs that define mechanisms that are important for modeling and generating domain-independent intelligent behavior, have the potential for generating intelligent and autonomous behavior in unmanned vehicles. The research described in this presentation explored the use of the Soar cognitive architecture for cognitive robotics. The Cognitive Robotic System (CRS) has been developed to integrate software systems for motor control and sensor processing with Soar for unmanned vehicle control. The CRS has been tested using two mobile robot missions: outdoor navigation and search in an indoor environment. The use of the CRS for the outdoor navigation mission demonstrated that a Soar agent could autonomously navigate to a specified location while avoiding obstacles, including cul-de-sacs, with only a minimal amount of knowledge about the environment. While most systems use information from maps or long-range perceptual capabilities to avoid cul-de-sacs, a Soar agent in the CRS was able to recognize when a simple approach to avoiding obstacles was unsuccessful and switch to a different strategy for avoiding complex obstacles. During the indoor search mission, the CRS autonomously and intelligently searches a building for an object of interest and common intersection types. While searching the building, the Soar agent builds a topological map of the environment using information about the intersections the CRS detects. The agent uses this topological model (along with Soar's reasoning, planning, and learning mechanisms) to make intelligent decisions about how to effectively search the building. Once the
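
    The cul-de-sac behavior described above amounts to recognizing that a simple avoidance strategy is not making progress and switching to another one. The sketch below illustrates that supervisory test in plain Python; it is not Soar code, and the progress threshold, window length, and "wall following" fallback are assumptions.

      # Strategy switching for cul-de-sac escape -- an illustrative sketch, not Soar.
      import math

      class AvoidanceSupervisor:
          def __init__(self, window=50, min_progress=0.5):
              self.window = window              # control cycles over which progress is judged
              self.min_progress = min_progress  # metres of required progress toward the goal
              self.history = []
              self.strategy = "reactive_avoidance"

          def update(self, position, goal):
              dist = math.dist(position, goal)
              self.history.append(dist)
              if len(self.history) > self.window:
                  self.history.pop(0)
                  # If distance to goal has not improved over the window, assume a cul-de-sac.
                  if self.history[0] - dist < self.min_progress:
                      self.strategy = "wall_following"
              return self.strategy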

  16. Semi autonomous mine detection system

    SciTech Connect

    Douglas Few; Roelof Versteeg; Herman Herman

    2010-04-01

    CMMAD is a risk reduction effort for the AMDS program. As part of CMMAD, multiple instances of semi autonomous robotic mine detection systems were created. Each instance consists of a robotic vehicle equipped with sensors required for navigation and marking, countermine sensors, and a number of integrated software packages which provide for real time processing of the countermine sensor data as well as integrated control of the robotic vehicle, the sensor actuator and the sensor. These systems were used to investigate critical interest functions (CIF) related to countermine robotic systems. To address the autonomy CIF, the INL-developed RIK was extended to allow for interaction with a mine sensor processing code (MSPC). In limited field testing this system performed well in detecting, marking and avoiding both AT and AP mines. Based on the results of the CMMAD investigation we conclude that autonomous robotic mine detection is feasible. In addition, CMMAD contributed critical technical advances with regard to sensing, data processing and sensor manipulation, which will advance the performance of future fieldable systems. As a result, no substantial technical barriers exist which preclude – from an autonomous robotic perspective – the rapid development and deployment of fieldable systems.

  17. Semi autonomous mine detection system

    NASA Astrophysics Data System (ADS)

    Few, Doug; Versteeg, Roelof; Herman, Herman

    2010-04-01

    CMMAD is a risk reduction effort for the AMDS program. As part of CMMAD, multiple instances of semi autonomous robotic mine detection systems were created. Each instance consists of a robotic vehicle equipped with sensors required for navigation and marking, countermine sensors and a number of integrated software packages which provide for real time processing of the countermine sensor data as well as integrated control of the robotic vehicle, the sensor actuator and the sensor. These systems were used to investigate critical interest functions (CIF) related to countermine robotic systems. To address the autonomy CIF, the INL developed RIK was extended to allow for interaction with a mine sensor processing code (MSPC). In limited field testing this system performed well in detecting, marking and avoiding both AT and AP mines. Based on the results of the CMMAD investigation we conclude that autonomous robotic mine detection is feasible. In addition, CMMAD contributed critical technical advances with regard to sensing, data processing and sensor manipulation, which will advance the performance of future fieldable systems. As a result, no substantial technical barriers exist which preclude - from an autonomous robotic perspective - the rapid development and deployment of fieldable systems.

  18. NASA's Robotic Lander Takes Flight

    NASA Video Gallery

    On Wednesday, June 8, the lander prototype managed by the Robotic Lunar Lander Development Project at NASA's Marshall Space Flight Center in Huntsville, Ala., hovered autonomously for 15 seconds at...

  19. Architecture of autonomous systems

    NASA Technical Reports Server (NTRS)

    Dikshit, Piyush; Guimaraes, Katia; Ramamurthy, Maya; Agrawala, Ashok; Larsen, Ronald L.

    1989-01-01

    Automation of Space Station functions and activities, particularly those involving robotic capabilities with interactive or supervisory human control, is a complex, multi-disciplinary systems design problem. A wide variety of applications using autonomous control can be found in the literature, but none of them seem to address the problem in general. All of them are designed with a specific application in mind. In this report, an abstract model is described which unifies the key concepts underlying the design of automated systems such as those studied by the aerospace contractors. The model has been kept as general as possible. The attempt is to capture all the key components of autonomous systems. With a little effort, it should be possible to map the functions of any specific autonomous system application to the model presented here.

  20. Architecture of autonomous systems

    NASA Technical Reports Server (NTRS)

    Dikshit, Piyush; Guimaraes, Katia; Ramamurthy, Maya; Agrawala, Ashok; Larsen, Ronald L.

    1986-01-01

    Automation of Space Station functions and activities, particularly those involving robotic capabilities with interactive or supervisory human control, is a complex, multi-disciplinary systems design problem. A wide variety of applications using autonomous control can be found in the literature, but none of them seem to address the problem in general. All of them are designed with a specific application in mind. In this report, an abstract model is described which unifies the key concepts underlying the design of automated systems such as those studied by the aerospace contractors. The model has been kept as general as possible. The attempt is to capture all the key components of autonomous systems. With a little effort, it should be possible to map the functions of any specific autonomous system application to the model presented here.

  1. Using fuzzy behaviors for the outdoor navigation of a car with low-resolution sensors

    SciTech Connect

    Pin, F.G.; Watanabe, Y.

    1993-01-01

    Vehicle control in a priori unknown, unpredictable, and dynamic environments requires many calculational and reasoning schemes to operate on the basis of very imprecise, incomplete, or unreliable data. For such systems, in which all the uncertainties can not be engineered away, approximate reasoning may provide an alternative to the complexity and computational requirements of conventional uncertainty analysis and propagation techniques. A proposed approach using superposition of elemental fuzzy behaviors to emulate human-like qualitative reasoning schemes is first discussed. We then describe how a previously developed navigation scheme implemented on custom-designed VLSI fuzzy inferencing boards for indoor navigation of a small laboratory-type robot was progressively enhanced to investigate two control modes for driving a car in a priori unknown environments on the basis of sparse and imprecise sensor data. In the first mode, the car navigates fully autonomously, while in the second mode, the system acts as a driver's aid providing the driver with linguistic (fuzzy) commands to turn left or right and speed up or slow down depending on the obstacles perceived by the sensors. Experiments with both modes of control are described in which the system uses only three acoustic range (sonar) sensor channels to perceive the environment. Simulation results as well as indoors and outdoors experiments are presented and discussed to illustrate the feasibility of outdoor navigation using fuzzy behaviors operating on possibly very inaccurate sensor data.

  2. Using fuzzy behaviors for the outdoor navigation of a car with low-resolution sensors

    SciTech Connect

    Pin, F.G.; Watanabe, Y.

    1993-12-31

    Vehicle control in a priori unknown, unpredictable, and dynamic environments requires many calculational and reasoning schemes to operate on the basis of very imprecise, incomplete, or unreliable data. For such systems, in which all the uncertainties can not be engineered away, approximate reasoning may provide an alternative to the complexity and computational requirements of conventional uncertainty analysis and propagation techniques. A proposed approach using superposition of elemental fuzzy behaviors to emulate human-like qualitative reasoning schemes is first discussed. We then describe how a previously developed navigation scheme implemented on custom-designed VLSI fuzzy inferencing boards for indoor navigation of a small laboratory-type robot was progressively enhanced to investigate two control modes for driving a car in a priori unknown environments on the basis of sparse and imprecise sensor data. In the first mode, the car navigates fully autonomously, while in the second mode, the system acts as a driver's aid providing the driver with linguistic (fuzzy) commands to turn left or right and speed up or slow down depending on the obstacles perceived by the sensors. Experiments with both modes of control are described in which the system uses only three acoustic range (sonar) sensor channels to perceive the environment. Simulation results as well as indoors and outdoors experiments are presented and discussed to illustrate the feasibility of outdoor navigation using fuzzy behaviors operating on possibly very inaccurate sensor data.
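
    The two control modes above can be illustrated with a tiny fuzzy-behavior sketch driven by three sonar ranges: crisp steer/speed outputs for the autonomous mode, and linguistic commands for the driver's-aid mode. The membership function, rule weights, and thresholds below are assumptions, not the VLSI rule set described in the paper.

      # Fuzzy driving behaviors from three sonar channels -- illustrative sketch only.
      def near(d, full=0.5, zero=2.0):
          """Degree to which a range reading (metres) is 'near'."""
          return max(0.0, min(1.0, (zero - d) / (zero - full)))

      def fuzzy_drive(left, front, right):
          """Return (steer, speed) in [-1, 1] x [0, 1] from three sonar ranges."""
          n_l, n_f, n_r = near(left), near(front), near(right)
          # Behavior 1: steer away from the closer side.
          steer = n_l * (+0.8) + n_r * (-0.8)
          # Behavior 2: slow down as the front range closes.
          speed = 1.0 - n_f
          return max(-1.0, min(1.0, steer)), max(0.0, speed)

      def to_linguistic(steer, speed):
          """Driver's-aid mode: map crisp outputs back to linguistic commands."""
          turn = "turn left" if steer < -0.3 else "turn right" if steer > 0.3 else "keep heading"
          pace = "slow down" if speed < 0.4 else "speed up" if speed > 0.8 else "hold speed"
          return f"{turn}, {pace}"

      print(to_linguistic(*fuzzy_drive(left=1.8, front=0.9, right=0.6)))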

  3. Teleautonomous guidance for mobile robots

    NASA Technical Reports Server (NTRS)

    Borenstein, J.; Koren, Y.

    1990-01-01

    Teleautonomous guidance (TG), a technique for the remote guidance of fast mobile robots, has been developed and implemented. With TG, the mobile robot follows the general direction prescribed by an operator. However, if the robot encounters an obstacle, it autonomously avoids collision with that obstacle while trying to match the prescribed direction as closely as possible. This type of shared control is completely transparent and transfers control between teleoperation and autonomous obstacle avoidance gradually. TG allows the operator to steer vehicles and robots at high speeds and in cluttered environments, even without visual contact. TG is based on the virtual force field (VFF) method, which was developed earlier for autonomous obstacle avoidance. The VFF method is especially suited to the accommodation of inaccurate sensor data (such as that produced by ultrasonic sensors) and sensor fusion, and allows the mobile robot to travel quickly without stopping for obstacles.
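
    The virtual force field idea above can be sketched as follows: occupied cells of a certainty grid push the robot away while the operator's prescribed direction pulls it forward, and the resultant sets the steering direction. The grid layout, gains, window size, and fall-off law are assumptions, not the published VFF parameters.

      # Virtual force field (VFF) blending -- a minimal sketch.
      import numpy as np

      def vff_direction(grid, robot_rc, target_dir, window=7, k_rep=1.0, k_att=1.0):
          """Combine repulsion from occupied cells with the operator's target direction.

          grid       : 2D array of cell occupancy certainties in [0, 1]
          robot_rc   : (row, col) of the robot
          target_dir : unit vector (dy, dx) prescribed by the operator
          """
          r0, c0 = robot_rc
          force = np.array(target_dir, dtype=float) * k_att
          half = window // 2
          for r in range(max(0, r0 - half), min(grid.shape[0], r0 + half + 1)):
              for c in range(max(0, c0 - half), min(grid.shape[1], c0 + half + 1)):
                  if (r, c) == (r0, c0) or grid[r, c] == 0:
                      continue
                  d = np.array([r0 - r, c0 - c], dtype=float)
                  dist = np.linalg.norm(d)
                  force += k_rep * grid[r, c] * d / dist**3   # repulsion falls off with distance
          n = np.linalg.norm(force)
          return force / n if n > 0 else np.array(target_dir, dtype=float)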

  4. Robotic Surveying

    SciTech Connect

    Suzy Cantor-McKinney; Michael Kruzic

    2007-03-01

    ZAPATA ENGINEERING challenged our engineers and scientists, which included robotics expertise from Carnegie Mellon University, to design a solution to meet our client's requirements for rapid digital geophysical and radiological data collection of a munitions test range with no down-range personnel. A prime concern of the project was to minimize exposure of personnel to unexploded ordnance and radiation. The field season was limited by extreme heat, cold and snow. Geographical Information System (GIS) tools were used throughout this project to accurately define the limits of mapped areas, build a common mapping platform from various client products, track production progress, allocate resources and relate subsurface geophysical information to geographical features for use in rapidly reacquiring targets for investigation. We were hopeful that our platform could meet the proposed 35 acres per day, towing both a geophysical package and a radiological monitoring trailer. We held our breath and crossed our fingers as the autonomous Speedrower began to crawl across the playa lakebed. We met our proposed production rate, and we averaged just less than 50 acres per 12-hour day using the autonomous platform with a path tracking error of less than +/- 4 inches. Our project team mapped over 1,800 acres in an 8-week (4 days per week) timeframe. The expertise of our partner, Carnegie Mellon University, was recently demonstrated when their two autonomous vehicle entries finished second and third at the 2005 Defense Advanced Research Projects Agency (DARPA) Grand Challenge. 'The Grand Challenge program was established to help foster the development of autonomous vehicle technology that will some day help save the lives of Americans who are protecting our country on the battlefield', said DARPA Grand Challenge Program Manager, Ron Kurjanowicz. Our autonomous remote-controlled vehicle (ARCV) was a modified New Holland 2550 Speedrower retrofitted to allow the machine

  5. Autonomous navigation system and method

    DOEpatents

    Bruemmer, David J [Idaho Falls, ID; Few, Douglas A [Idaho Falls, ID

    2009-09-08

    A robot platform includes perceptors, locomotors, and a system controller, which executes instructions for autonomously navigating a robot. The instructions repeat, on each iteration through an event timing loop, the acts of defining an event horizon based on the robot's current velocity, detecting a range to obstacles around the robot, testing for an event horizon intrusion by determining if any range to the obstacles is within the event horizon, and adjusting rotational and translational velocity of the robot accordingly. If the event horizon intrusion occurs, rotational velocity is modified by a proportion of the current rotational velocity reduced by a proportion of the range to the nearest obstacle and translational velocity is modified by a proportion of the range to the nearest obstacle. If no event horizon intrusion occurs, translational velocity is set as a ratio of a speed factor relative to a maximum speed.
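
    One literal reading of the claim language above is sketched below as a single pass through the event-timing loop. The event-horizon formula and the proportionality constants are assumptions chosen only to make the logic concrete.

      # Event-horizon speed control -- a sketch of the claim logic, with assumed constants.
      def navigation_step(ranges, v_trans, v_rot, v_max, horizon_time=1.5,
                          k_rot=0.5, k_range=0.1, k_trans=0.5, speed_factor=0.8):
          """One iteration: ranges is a list of obstacle ranges (metres) around the robot."""
          event_horizon = horizon_time * v_trans            # grows with the current velocity
          nearest = min(ranges)
          if nearest < event_horizon:                       # event-horizon intrusion
              # Proportion of current rotational velocity, reduced by a proportion of the range.
              v_rot = k_rot * v_rot - k_range * nearest
              # Translational velocity set as a proportion of the range to the nearest obstacle.
              v_trans = k_trans * nearest
          else:
              # Clear path: run at a ratio of the speed factor relative to maximum speed.
              v_trans = speed_factor * v_max
          return v_trans, v_rot

      print(navigation_step(ranges=[0.8, 2.5, 3.0], v_trans=1.0, v_rot=0.4, v_max=1.5))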

  6. Design and control of a fully omnidirectional and holonomic wheeled platform for robotic vehicles

    SciTech Connect

    Killough, S.M.; Pin, F.G.

    1991-01-01

    For practical robotics application in outdoors or complex environments (e.g. firefighting, warehouse management, floor cleaning, plant surveillance, emergency handling, etc.), mobile robotic platforms are needed for the transport of the manipulative or perceptive robots to various work sites and to assist robot movements at the work site. An ideal platform would be one with full 3-degree-of-freedom (DOF) movement (x, y translation and rotation) to enable tighter navigation and docking, obstacle avoidance, and improved manipulator reach. Three DOFs would also simplify control of the platform and make it more general, thus providing more opportunities for autonomous control. Such a 3-DOF platform based on an orthogonal wheel assembly has been developed at the Oak Ridge National Laboratory and is being used for robotics and artificial intelligence research. This paper presents the mathematical details of the design and control of this type of platform and discusses the experiences and lessons learned with the first prototype. Improvements on the basic wheel assembly as well as optimum platform configurations that are considered for various applications are also discussed.
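
    For a feel of how a 3-DOF holonomic command maps to wheel speeds, the sketch below uses the generic three-omniwheel layout (wheels at 120 degrees). This is an assumption for illustration only; the ORNL orthogonal-wheel assembly has its own geometry, which is not reproduced here.

      # Generic inverse kinematics for a 3-DOF holonomic platform -- illustrative only.
      import numpy as np

      WHEEL_ANGLES = np.radians([0.0, 120.0, 240.0])   # assumed wheel drive directions
      R = 0.25                                          # assumed platform radius (m)

      def wheel_speeds(vx, vy, omega):
          """Map a body-frame twist (vx, vy, omega) to the three wheel surface speeds."""
          return np.array([-np.sin(a) * vx + np.cos(a) * vy + R * omega
                           for a in WHEEL_ANGLES])

      # Example: pure sideways translation plus a slow spin.
      print(wheel_speeds(vx=0.0, vy=0.3, omega=0.2))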

  7. Examples of design and achievement of vision systems for mobile robotics applications

    NASA Astrophysics Data System (ADS)

    Bonnin, Patrick J.; Cabaret, Laurent; Raulet, Ludovic; Hugel, Vincent; Blazevic, Pierre; M'Sirdi, Nacer K.; Coiffet, Philippe

    2000-10-01

    Our goal is to design and build a multiple-purpose vision system for various robotics applications: wheeled robots (such as cars for autonomous driving), legged robots (six-legged, four-legged such as SONY's AIBO, and humanoid), and flying robots (to inspect bridges, for example), in various conditions: indoor or outdoor. Considering that the constraints depend on the application, we propose an edge segmentation implemented either in software or in hardware using CPLDs (ASICs or FPGAs could be used too). After discussing the criteria behind our choice, we propose a chain of image processing operators constituting the edge segmentation. Although this chain is quite simple and very fast to perform, results appear satisfactory. We propose a software implementation of it. Its temporal optimization is based on: implementation under the pixel data-flow programming model, the gathering of local processing where possible, the simplification of computations, and the use of fast-access data structures. We then describe a first dedicated hardware implementation of the first part, which requires 9 CPLDs in this low-cost version. It is technically possible, but more expensive, to implement these algorithms using only a single FPGA.
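
    A tiny software version of such an edge-segmentation chain (gradient, magnitude, threshold) is sketched below. The exact operators of the paper's chain are not specified here, so Sobel-style kernels and a fixed threshold are used as assumptions.

      # A minimal edge-segmentation chain: gradient -> magnitude -> threshold.
      import numpy as np

      SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
      SOBEL_Y = SOBEL_X.T

      def convolve2d(img, k):
          """Naive valid-mode 2-D convolution (no external dependencies)."""
          h, w = img.shape
          kh, kw = k.shape
          out = np.zeros((h - kh + 1, w - kw + 1))
          for r in range(out.shape[0]):
              for c in range(out.shape[1]):
                  out[r, c] = np.sum(img[r:r + kh, c:c + kw] * k)
          return out

      def edge_map(img, threshold=80.0):
          """Binary edge image from gradient magnitude."""
          gx, gy = convolve2d(img, SOBEL_X), convolve2d(img, SOBEL_Y)
          return (np.hypot(gx, gy) > threshold).astype(np.uint8)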

  8. Test results of autonomous behaviors for urban environment exploration

    NASA Astrophysics Data System (ADS)

    Ahuja, G.; Fellars, D.; Kogut, G.; Pacis Rius, E.; Sights, B.; Everett, H. R.

    2009-05-01

    Under various collaborative efforts with other government labs, private industry, and academia, SPAWAR Systems Center Pacific (SSC Pacific) is developing and testing advanced autonomous behaviors for navigation, mapping, and exploration in various indoor and outdoor settings. As part of the Urban Environment Exploration project, SSC Pacific is maturing those technologies and sensor payload configurations that enable man-portable robots to effectively operate within the challenging conditions of urban environments. For example, additional means to augment GPS is needed when operating in and around urban structures. A MOUT site at Camp Pendleton was selected as the test bed because of its variety in building characteristics, paved/unpaved roads, and rough terrain. Metrics are collected based on the overall system's ability to explore different coverage areas, as well as the performance of the individual component behaviors such as localization and mapping. The behaviors have been developed to be portable and independent of one another, and have been integrated under a generic behavior architecture called the Autonomous Capability Suite. This paper describes the tested behaviors, sensors, and behavior architecture, the variables of the test environment, and the performance results collected so far.

  9. Rotorcraft and Enabling Robotic Rescue

    NASA Technical Reports Server (NTRS)

    Young, Larry A.

    2010-01-01

    This paper examines some of the issues underlying potential robotic rescue devices (RRD) in the context where autonomous or manned rotorcraft deployment of such robotic systems is a crucial attribute for their success in supporting future disaster relief and emergency response (DRER) missions. As a part of this discussion, work related to proof-of-concept prototyping of two notional RRD systems is summarized.

  10. Fundamentals of Outdoor Enjoyment.

    ERIC Educational Resources Information Center

    Mitchell, Jim; Fear, Gene

    The purpose of this preventive search and rescue teachers guide is to help high school aged youth understand the complexities and priorities necessary to manage a human body in outdoor environments and the value of planning ahead to have on hand the skills and equipment needed for outdoor survival, comfort, and enjoyment. Separate sections present…

  11. Economics of Outdoor Recreation.

    ERIC Educational Resources Information Center

    Clawson, Marion; Knetsch, Jack L.

    Written for the purposes of presenting an overview of outdoor recreation in the United States and defining the significant outdoor recreation policy issues of the next 10 to 20 years, this document also includes major sections on recreation resources and economic considerations. Projections to the year 2000 are made for a national time budget,…

  12. Outdoor Education Resource Guide.

    ERIC Educational Resources Information Center

    Prince George's County Board of Education, Upper Marlboro, MD.

    Developed primarily as a source of information for teachers planning outdoor education experiences, the material in this resource book can be used by any teacher in environmental studies. Subjects and activities most often taught as part of the outdoor education program are outlined both as resource (basic information) and teaching units. The…

  13. The Outdoor Classroom.

    ERIC Educational Resources Information Center

    Thomas, Dorothy E.

    An Outdoor Classroom to prepare pre-service and in-service teachers to utilize vital natural resources as an outdoor laboratory was established in 1974 by Elizabeth City State University. Because of its proximity to the Great Dismal Swamp and the Atlantic, the university's geographical location made it especially suitable for such a course of…

  14. Outdoor and Experiential Education.

    ERIC Educational Resources Information Center

    Farris, Dorothea

    1981-01-01

    Describes Aspen School District's (Colorado) outdoor educational program for all students in grades K-12, focusing on the middle school students' experiences in the Outdoor Education Program. Through these experiences, children (grades 5-8) grow in self-worth, abilities, and emotional attachments; they develop a trust level toward other children…

  15. Selected Outdoor Recreation Statistics.

    ERIC Educational Resources Information Center

    Bureau of Outdoor Recreation (Dept. of Interior), Washington, DC.

    In this recreational information report, 96 tables are compiled from Bureau of Outdoor Recreation programs and surveys, other governmental agencies, and private sources. Eight sections comprise the document: (1) The Bureau of Outdoor Recreation, (2) Federal Assistance to Recreation, (3) Recreation Surveys for Planning, (4) Selected Statistics of…

  16. Outdoor Classroom Coordinator

    ERIC Educational Resources Information Center

    Keeler, Rusty

    2010-01-01

    Everybody loves the idea of children playing outdoors. Outside, children get to experience the seasons, challenge their minds and bodies, connect with the natural world, and form a special relationship with the planet. But in order for children to get the most of their outdoor time it is important that the environment be prepared by caring adults…

  17. Effective Thinking Outdoors.

    ERIC Educational Resources Information Center

    Hyde, Rod

    1997-01-01

    Effective Thinking Outdoors (ETO) is an organization that teaches thinking skills and strategies via significant outdoor experiences. Identifies the three elements of thinking as creativity, play, and persistence; presents a graphic depiction of the problem-solving process and aims; and describes an ETO exercise, determining old routes of travel…

  18. Outdoorsman: Outdoor Cooking.

    ERIC Educational Resources Information Center

    Alberta Dept. of Agriculture, Edmonton.

    This Outdoor Cookery manual provides information and instruction on the basic outdoor skills of building suitable cooking fires, handling fires safely, and storing food. The necessity of having the right kind of fire is stressed (high flames for boiling, low for stewing, and coals for frying and broiling). Tips on gauging temperature, what types…

  19. Hunting and Outdoor Education.

    ERIC Educational Resources Information Center

    Matthews, Bruce E.

    1991-01-01

    This article addresses the controversy over including hunting as a part of outdoor education. Historically, figures such as Julian Smith, of the Outdoor Education Project of the 1950's, advocated hunting as a critical element of educating children and youth about care and protection of natural resources. Henry David Thoreau saw hunting experiences…

  20. Maple Leaf Outdoor Centre.

    ERIC Educational Resources Information Center

    Maguire, Molly; Gunton, Ric

    2000-01-01

    Maple Leaf Outdoor Centre (Ontario) has added year-round outdoor education facilities and programs to help support its summer camp for disadvantaged children. Schools, youth centers, religious groups, and athletic teams conduct their own programs, collaborate with staff, or use staff-developed programs emphasizing adventure education and personal…

  1. Enriching the Outdoor Environment.

    ERIC Educational Resources Information Center

    McGinnis, Janet R.

    2002-01-01

    Explains how to expand the range of outdoor learning experiences by providing: (1) prop boxes or play crates; (2) a transition area to make going outdoors easier; (3) opportunities to observe birds, insects, weather, other children, and plants; (4) semi-structures to provide protection from weather conditions; and (5) a digging area for plants.…

  2. A fuzzy behaviorist approach to sensor-based robot control

    SciTech Connect

    Pin, F.G.

    1996-05-01

    Sensor-based operation of autonomous robots in unstructured and/or outdoor environments has proven to be an extremely challenging problem, mainly because of the difficulties encountered when attempting to represent the many uncertainties which are always present in the real world. These uncertainties are primarily due to sensor imprecision and the unpredictability of the environment, i.e., lack of full knowledge of the environment's characteristics and dynamics. An approach, which we have named the "Fuzzy Behaviorist Approach" (FBA), is proposed in an attempt to remedy some of these difficulties. This approach is based on the representation of the system's uncertainties using Fuzzy Set Theory-based approximations and on the representation of the reasoning and control schemes as sets of elemental behaviors. Using the FBA, a formalism for rule base development and an automated generator of fuzzy rules have been developed. This automated system can automatically construct the set of membership functions corresponding to fuzzy behaviors, once these have been expressed in qualitative terms by the user. The system also checks for completeness of the rule base and for non-redundancy of the rules (which has traditionally been a major hurdle in rule base development). Two major conceptual features, the suppression and inhibition mechanisms, which allow a dominance between behaviors to be expressed, are discussed in detail. Some experimental results obtained with the automated fuzzy rule generator, applied to the domain of sensor-based navigation in a priori unknown environments using one of our autonomous test-bed robots as well as a real car in outdoor environments, are then reviewed and discussed to illustrate the feasibility of large-scale automatic fuzzy rule generation using the "Fuzzy Behaviorist" concepts.
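
    The suppression mechanism mentioned above (a dominant behavior scaling down the influence of others before their outputs are blended) is sketched below. The behavior names, activations, suppression rule, and weighted-average blend are assumptions meant only to convey the idea, not the FBA rule generator itself.

      # Suppression between elemental behaviors -- illustrative of the FBA idea only.
      def fuse_behaviors(behaviors, suppressions):
          """behaviors: {name: (activation, command)}; suppressions: {dominant: [suppressed, ...]}.

          A dominant behavior's activation inhibits the listed behaviors before the
          commands are blended by a weighted average (a simple defuzzification stand-in).
          """
          weights = {name: act for name, (act, _) in behaviors.items()}
          for dominant, suppressed in suppressions.items():
              for name in suppressed:
                  weights[name] *= (1.0 - behaviors[dominant][0])
          total = sum(weights.values()) or 1.0
          return sum(weights[n] * cmd for n, (_, cmd) in behaviors.items()) / total

      steer = fuse_behaviors(
          {"goal_seeking": (0.6, +0.4), "obstacle_avoidance": (0.9, -0.7)},
          suppressions={"obstacle_avoidance": ["goal_seeking"]},
      )
      print(round(steer, 3))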

  3. Open Issues in Evolutionary Robotics.

    PubMed

    Silva, Fernando; Duarte, Miguel; Correia, Luís; Oliveira, Sancho Moura; Christensen, Anders Lyhne

    2016-01-01

    One of the long-term goals in evolutionary robotics is to be able to automatically synthesize controllers for real autonomous robots based only on a task specification. While a number of studies have shown the applicability of evolutionary robotics techniques for the synthesis of behavioral control, researchers have consistently been faced with a number of issues preventing the widespread adoption of evolutionary robotics for engineering purposes. In this article, we review and discuss the open issues in evolutionary robotics. First, we analyze the benefits and challenges of simulation-based evolution and subsequent deployment of controllers versus evolution on real robotic hardware. Second, we discuss specific evolutionary computation issues that have plagued evolutionary robotics: (1) the bootstrap problem, (2) deception, and (3) the role of genomic encoding and genotype-phenotype mapping in the evolution of controllers for complex tasks. Finally, we address the absence of standard research practices in the field. We also discuss promising avenues of research. Our underlying motivation is the reduction of the current gap between evolutionary robotics and mainstream robotics, and the establishment of evolutionary robotics as a canonical approach for the engineering of autonomous robots.

  4. ARIES: A mobile robot inspector

    SciTech Connect

    Byrd, J.S.

    1995-12-31

    ARIES (Autonomous Robotic Inspection Experimental System) is a mobile robot inspection system being developed for the Department of Energy (DOE) to survey and inspect drums containing mixed and low-level radioactive waste stored in warehouses at DOE facilities. The drums are typically stacked four high and arranged in rows with three-foot aisle widths. The robot will navigate through the aisles and perform an autonomous inspection operation, typically performed by a human operator. It will make real-time decisions about the condition of the drums, maintain a database of pertinent information about each drum, and generate reports.

  5. Architecture for robot intelligence

    NASA Technical Reports Server (NTRS)

    Peters, II, Richard Alan (Inventor)

    2004-01-01

    An architecture for robot intelligence enables a robot to learn new behaviors and create new behavior sequences autonomously and interact with a dynamically changing environment. Sensory information is mapped onto a Sensory Ego-Sphere (SES) that rapidly identifies important changes in the environment and functions much like short term memory. Behaviors are stored in a DBAM that creates an active map from the robot's current state to a goal state and functions much like long term memory. A dream state converts recent activities stored in the SES and creates or modifies behaviors in the DBAM.
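
    The sketch below shows one way a Sensory Ego-Sphere-like short-term store could be laid out: percepts indexed by viewing direction, with a simple change test. The coarse azimuth/elevation binning and the change threshold are assumptions; the published SES uses a geodesic tessellation and richer change detection.

      # Direction-indexed short-term sensory store -- a rough, assumed sketch.
      import time

      class SensoryEgoSphere:
          def __init__(self, az_bins=12, el_bins=6):
              self.az_bins, self.el_bins = az_bins, el_bins
              self.cells = {}   # (az_bin, el_bin) -> (value, timestamp)

          def _bin(self, azimuth_deg, elevation_deg):
              return (int(azimuth_deg % 360 // (360 / self.az_bins)),
                      int((elevation_deg + 90) // (180 / self.el_bins)))

          def post(self, azimuth_deg, elevation_deg, value, change_eps=0.2):
              """Register a percept; return True if it differs notably from what was stored."""
              key = self._bin(azimuth_deg, elevation_deg)
              old = self.cells.get(key)
              self.cells[key] = (value, time.time())
              return old is None or abs(old[0] - value) > change_eps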

  6. Exploratorium: Robots.

    ERIC Educational Resources Information Center

    Brand, Judith, Ed.

    2002-01-01

    This issue of Exploratorium Magazine focuses on the topic robotics. It explains how to make a vibrating robotic bug and features articles on robots. Contents include: (1) "Where Robot Mice and Robot Men Run Round in Robot Towns" (Ray Bradbury); (2) "Robots at Work" (Jake Widman); (3) "Make a Vibrating Robotic Bug" (Modesto Tamez); (4) "The Robot…

  7. Asteroid Exploration with Autonomic Systems

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt; Rash, James; Rouff, Christopher; Hinchey, Mike

    2004-01-01

    NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. The prospective ANTS (Autonomous Nano Technology Swarm) mission comprises autonomous agents including worker agents (small spacecraft) designed to cooperate in asteroid exploration under the overall authority of at least one ruler agent (a larger spacecraft) whose goal is to cause science data to be returned to Earth. The ANTS team (ruler plus workers and messenger agents), but not necessarily any individual on the team, will exhibit behaviors that qualify it as an autonomic system, where an autonomic system is defined as a system that self-reconfigures, self-optimizes, self-heals, and self-protects. Autonomic system concepts lead naturally to realistic, scalable architectures rich in capabilities and behaviors. In-depth consideration of a major mission like ANTS in terms of autonomic systems brings new insights into alternative definitions of autonomic behavior. This paper gives an overview of the ANTS mission and discusses the autonomic properties of the mission.

  8. [Robots and intellectual property].

    PubMed

    Larrieu, Jacques

    2013-12-01

    This topic is part of the global issue concerning the necessity to adapt intellectual property law to constant changes in technology. The relationship between robots and IP is twofold. On one hand, robots may be regarded as objects of intellectual property. A robot, like any new machine, could qualify for protection by a patent. A copyright may protect its appearance if it is original. Its memory, like a database, could be covered by a sui generis right. On the other hand, the question of the protection of the outputs of the robot must be raised. Robots, as the physical embodiment of artificial intelligence, are becoming more and more autonomous. Robot-generated works include less and less human input. Are these objects created or invented by a robot copyrightable or patentable? To whom will the ownership of these IP rights be allocated? To the person who manufactured the machine? To the user of the robot? To the robot itself? All these questions are worth discussing.

  9. Outdoor Leadership "Down Under."

    ERIC Educational Resources Information Center

    Priest, Simon

    1985-01-01

    Examines, compares, and contrasts New Zealand and Australian model programs of outdoor leadership development based upon the British Mountain Leadership Certificate System. Offers ideas on risk management and the issue of certification. (NEC)

  10. Take Your Class Outdoors.

    ERIC Educational Resources Information Center

    Shellenberger, Barbara R.

    1981-01-01

    Offers suggestions for designing outdoor activities to provide students with opportunities for exploring, observing, and discovering. Outlines several science activities for each of the following topics: trees, rocks, soil, insects, wild flowers, grasses, lichens, and clouds. (DS)

  11. A design strategy for autonomous systems

    NASA Technical Reports Server (NTRS)

    Forster, Pete

    1989-01-01

    Some solutions to crucial issues regarding the competent performance of an autonomously operating robot are identified, namely handling multiple and variable data sources containing overlapping information and maintaining coherent operation while responding adequately to changes in the environment. Support for the ideas developed for the construction of such behavior is drawn from speculations in the study of cognitive psychology, an understanding of the behavior of controlled mechanisms, and the development of behavior-based robots in a few robot research laboratories. The validity of these ideas is supported by some simple simulation experiments in the field of mobile robot navigation and guidance.

  12. Optic flow and autonomous navigation.

    PubMed

    Campani, M; Giachetti, A; Torre, V

    1995-01-01

    Many animals, especially insects, compute and use optic flow to control their motion direction and to avoid obstacles. Recent advances in computer vision have shown that an adequate optic flow can be computed from image sequences. Therefore studying whether artificial systems, such as robots, can use optic flow for similar purposes is of particular interest. Experiments are reviewed that suggest the possible use of optic flow for the navigation of a robot moving in indoor and outdoor environments. The optic flow is used to detect and localise obstacles in indoor scenes, such as corridors, offices, and laboratories. These routines are based on the computation of a reduced optic flow. The robot is usually able to avoid large obstacles such as a chair or a person. The avoidance performances of the proposed algorithm critically depend on the optomotor reaction of the robot. The optic flow can be used to understand the ego-motion in outdoor scenes, that is, to obtain information on the absolute velocity of the moving vehicle and to detect the presence of other moving objects. A critical step is the correction of the optic flow for shocks and vibrations present during image acquisition. The results obtained suggest that optic flow can be successfully used by biological and artificial systems to control their navigation. Moreover, both systems require fast and accurate optomotor reactions and need to compensate for the instability of the viewed world. PMID:7617428

  13. Optic flow and autonomous navigation.

    PubMed

    Campani, M; Giachetti, A; Torre, V

    1995-01-01

    Many animals, especially insects, compute and use optic flow to control their motion direction and to avoid obstacles. Recent advances in computer vision have shown that an adequate optic flow can be computed from image sequences. Therefore studying whether artificial systems, such as robots, can use optic flow for similar purposes is of particular interest. Experiments are reviewed that suggest the possible use of optic flow for the navigation of a robot moving in indoor and outdoor environments. The optic flow is used to detect and localise obstacles in indoor scenes, such as corridors, offices, and laboratories. These routines are based on the computation of a reduced optic flow. The robot is usually able to avoid large obstacles such as a chair or a person. The avoidance performance of the proposed algorithm critically depends on the optomotor reaction of the robot. The optic flow can be used to understand ego-motion in outdoor scenes, that is, to obtain information on the absolute velocity of the moving vehicle and to detect the presence of other moving objects. A critical step is the correction of the optic flow for shocks and vibrations present during image acquisition. The results obtained suggest that optic flow can be successfully used by biological and artificial systems to control their navigation. Moreover, both systems require fast and accurate optomotor reactions and need to compensate for the instability of the viewed world.

  14. Toward autonomous spacecraft

    NASA Technical Reports Server (NTRS)

    Fogel, L. J.; Calabrese, P. G.; Walsh, M. J.; Owens, A. J.

    1982-01-01

    Ways in which autonomous behavior of spacecraft can be extended to treat situations wherein closed-loop control by a human may not be appropriate or even possible are explored. Predictive models that minimize mean least squared error and arbitrary cost functions are discussed. A methodology for extracting cyclic components for an arbitrary environment with respect to usual and arbitrary criteria is developed. An approach to prediction and control based on evolutionary programming is outlined. A computer program capable of predicting time series is presented. A design of a control system for a robotic device with partially unknown physical properties is presented.
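
    The predictive models mentioned above minimize mean squared error over a time series. As a generic illustration under assumed inputs (this is not the evolutionary-programming predictor described in the record), an order-p linear autoregressive model fitted by ordinary least squares yields a one-step-ahead forecast:

        # Minimal least-squares AR(p) predictor; generic sketch only.
        import numpy as np

        def fit_ar_predict(series, p=3):
            """Fit an order-p autoregressive model by least squares and
            return (coefficients, one-step-ahead prediction)."""
            x = np.asarray(series, dtype=float)
            # Each design row holds the p values preceding x[t], newest first.
            X = np.array([x[t - p:t][::-1] for t in range(p, len(x))])
            y = x[p:]
            coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # minimizes mean squared error
            return coef, float(x[-p:][::-1] @ coef)       # predicted next value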

  15. Autonomic neuropathies

    NASA Technical Reports Server (NTRS)

    Low, P. A.

    1998-01-01

    A limited autonomic neuropathy may underlie some unusual clinical syndromes, including the postural tachycardia syndrome, pseudo-obstruction syndrome, heat intolerance, and perhaps chronic fatigue syndrome. Antibodies to autonomic structures are common in diabetes, but their specificity is unknown. The presence of autonomic failure worsens prognosis in the diabetic state. Some autonomic neuropathies are treatable. Familial amyloid polyneuropathy may respond to liver transplantation. There are anecdotal reports of acute panautonomic neuropathy responding to intravenous gamma globulin. Orthostatic hypotension may respond to erythropoietin or midodrine.

  16. KC-135 materials handling robotics

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.

    1991-01-01

    Robot dynamics and control will become an important issue for implementing productive platforms in space. Robotic operations will become necessary for man-tended stations and for efficient performance of routine operations in a manned platform. The current constraints on the use of robotic devices in a microgravity environment appear to be due to an anticipated increase in acceleration levels caused by manipulator motion and to safety concerns. The objective of this study will be to provide baseline data to meet that need. Most texts and papers dealing with the kinematics and dynamics of robots assume that the manipulator is composed of joints separated by rigid links. However, in recent years several groups have begun to study the dynamics of flexible manipulators, primarily for applying robots in space and for improving the efficiency and precision of robotic systems. Robotic systems which are being planned for implementation in space have a number of constraints to overcome. Additional concepts which have to be worked out in any robotic implementation for a space platform include teleoperation and degree of autonomous control. Some significant results in developing a robotic workcell for performing robotics research on the KC-135 aircraft in preparation for space-based robotics applications in the future were generated. In addition, it was shown that TREETOPS can be used to simulate the dynamics of robot manipulators for both space and ground-based applications.

  17. KC-135 materials handling robotics

    NASA Astrophysics Data System (ADS)

    Workman, Gary L.

    1991-04-01

    Robot dynamics and control will become an important issue for implementing productive platforms in space. Robotic operations will become necessary for man-tended stations and for efficient performance of routine operations in a manned platform. The current constraints on the use of robotic devices in a microgravity environment appear to be due to an anticipated increase in acceleration levels caused by manipulator motion and to safety concerns. The objective of this study will be to provide baseline data to meet that need. Most texts and papers dealing with the kinematics and dynamics of robots assume that the manipulator is composed of joints separated by rigid links. However, in recent years several groups have begun to study the dynamics of flexible manipulators, primarily for applying robots in space and for improving the efficiency and precision of robotic systems. Robotic systems which are being planned for implementation in space have a number of constraints to overcome. Additional concepts which have to be worked out in any robotic implementation for a space platform include teleoperation and degree of autonomous control. Some significant results in developing a robotic workcell for performing robotics research on the KC-135 aircraft in preparation for space-based robotics applications in the future were generated. In addition, it was shown that TREETOPS can be used to simulate the dynamics of robot manipulators for both space and ground-based applications.

  18. Mechanochemically Active Soft Robots.

    PubMed

    Gossweiler, Gregory R; Brown, Cameron L; Hewage, Gihan B; Sapiro-Gheiler, Eitan; Trautman, William J; Welshofer, Garrett W; Craig, Stephen L

    2015-10-14

    The functions of soft robotics are intimately tied to their form: channels and voids defined by an elastomeric superstructure that reversibly stores and releases mechanical energy to change shape, grip objects, and achieve complex motions. Here, we demonstrate that covalent polymer mechanochemistry provides a viable mechanism to convert the same mechanical potential energy used for actuation in soft robots into a mechanochromic, covalent chemical response. A bis-alkene functionalized spiropyran (SP) mechanophore is cured into a molded poly(dimethylsiloxane) (PDMS) soft robot walker and gripper. The stresses and strains necessary for SP activation are compatible with soft robot function. The color change associated with actuation suggests opportunities for not only new color changing or camouflaging strategies, but also the possibility for simultaneous activation of latent chemistry (e.g., release of small molecules, change in mechanical properties, activation of catalysts, etc.) in soft robots. In addition, mechanochromic stress mapping in a functional robotic device might provide a useful design and optimization tool, revealing spatial and temporal force evolution within the robot in a way that might be coupled to autonomous feedback loops that allow the robot to regulate its own activity. The demonstration motivates the simultaneous development of new combinations of mechanophores, materials, and soft, active devices for enhanced functionality.

  19. Mechanochemically Active Soft Robots.

    PubMed

    Gossweiler, Gregory R; Brown, Cameron L; Hewage, Gihan B; Sapiro-Gheiler, Eitan; Trautman, William J; Welshofer, Garrett W; Craig, Stephen L

    2015-10-14

    The functions of soft robotics are intimately tied to their form: channels and voids defined by an elastomeric superstructure that reversibly stores and releases mechanical energy to change shape, grip objects, and achieve complex motions. Here, we demonstrate that covalent polymer mechanochemistry provides a viable mechanism to convert the same mechanical potential energy used for actuation in soft robots into a mechanochromic, covalent chemical response. A bis-alkene functionalized spiropyran (SP) mechanophore is cured into a molded poly(dimethylsiloxane) (PDMS) soft robot walker and gripper. The stresses and strains necessary for SP activation are compatible with soft robot function. The color change associated with actuation suggests opportunities for not only new color changing or camouflaging strategies, but also the possibility for simultaneous activation of latent chemistry (e.g., release of small molecules, change in mechanical properties, activation of catalysts, etc.) in soft robots. In addition, mechanochromic stress mapping in a functional robotic device might provide a useful design and optimization tool, revealing spatial and temporal force evolution within the robot in a way that might be coupled to autonomous feedback loops that allow the robot to regulate its own activity. The demonstration motivates the simultaneous development of new combinations of mechanophores, materials, and soft, active devices for enhanced functionality. PMID:26390078

  20. Robotic systems in orthopaedic surgery.

    PubMed

    Lang, J E; Mannava, S; Floyd, A J; Goddard, M S; Smith, B P; Mofidi, A; Seyler, T M; Jinnah, R H

    2011-10-01

    Robots have been used in surgery since the late 1980s. Orthopaedic surgery began to incorporate robotic technology in 1992, with the introduction of ROBODOC, for the planning and performance of total hip replacement. The use of robotic systems has subsequently increased, with promising short-term radiological outcomes when compared with traditional orthopaedic procedures. Robotic systems can be classified into two categories: autonomous and haptic (or surgeon-guided). Passive surgery systems, which represent a third type of technology, have also been adopted recently by orthopaedic surgeons. While autonomous systems have fallen out of favour, tactile systems with technological improvements have become widely used. Specifically, the use of tactile and passive robotic systems in unicompartmental knee replacement (UKR) has addressed some of the historical mechanisms of failure of non-robotic UKR. These systems assist with increasing the accuracy of the alignment of the components and produce more consistent ligament balance. Short-term improvements in clinical and radiological outcomes have increased the popularity of robot-assisted UKR. Robot-assisted orthopaedic surgery has the potential for improving surgical outcomes. We discuss the different types of robotic systems available for use in orthopaedics and consider the indications, contraindications and limitations of these technologies.

  1. Robotic systems in orthopaedic surgery.

    PubMed

    Lang, J E; Mannava, S; Floyd, A J; Goddard, M S; Smith, B P; Mofidi, A; Seyler, T M; Jinnah, R H

    2011-10-01

    Robots have been used in surgery since the late 1980s. Orthopaedic surgery began to incorporate robotic technology in 1992, with the introduction of ROBODOC, for the planning and performance of total hip replacement. The use of robotic systems has subsequently increased, with promising short-term radiological outcomes when compared with traditional orthopaedic procedures. Robotic systems can be classified into two categories: autonomous and haptic (or surgeon-guided). Passive surgery systems, which represent a third type of technology, have also been adopted recently by orthopaedic surgeons. While autonomous systems have fallen out of favour, tactile systems with technological improvements have become widely used. Specifically, the use of tactile and passive robotic systems in unicompartmental knee replacement (UKR) has addressed some of the historical mechanisms of failure of non-robotic UKR. These systems assist with increasing the accuracy of the alignment of the components and produce more consistent ligament balance. Short-term improvements in clinical and radiological outcomes have increased the popularity of robot-assisted UKR. Robot-assisted orthopaedic surgery has the potential for improving surgical outcomes. We discuss the different types of robotic systems available for use in orthopaedics and consider the indications, contraindications and limitations of these technologies. PMID:21969424

  2. Industrial robots and robotics

    SciTech Connect

    Kafrissen, S.; Stephens, M.

    1984-01-01

    This book discusses the study of robotics, providing information on hardware, software, applications, and economics. Eleven chapters examine the following: Minicomputers, Microcomputers, and Microprocessors; The Servo-Control System; The Activators; Robot Vision Systems; and Robot Workcell Environments. Twelve appendices supplement the data.

  3. Types of verbal interaction with instructable robots

    NASA Technical Reports Server (NTRS)

    Crangle, C.; Suppes, P.; Michalowski, S.

    1987-01-01

    An instructable robot is one that accepts instruction in some natural language such as English and uses that instruction to extend its basic repertoire of actions. Such robots are quite different in conception from autonomously intelligent robots, which provide the impetus for much of the research on inference and planning in artificial intelligence. Examined here are the significant problem areas in the design of robots that learn from verbal instruction. Examples are drawn primarily from our earlier work on instructable robots and recent work on the Robotic Aid for the physically disabled. Natural-language understanding by machines is discussed, as well as the possibilities and limits of verbal instruction. The core problem of verbal instruction, namely, how to achieve specific concrete action in the robot in response to commands that express general intentions, is considered, as are two major challenges to instructability: achieving appropriate real-time behavior in the robot, and extending the robot's language capabilities.

  4. Laser radar in robotics

    SciTech Connect

    Carmer, D.C.; Peterson, L.M.

    1996-02-01

    In this paper the authors describe the basic operating principles of laser radar sensors and the typical algorithms used to process laser radar imagery for robotic applications. The authors review 12 laser radar sensors to illustrate the variety of systems that have been applied to robotic applications wherein information extracted from the laser radar data is used to automatically control a mechanism or process. Next, they describe selected robotic applications in seven areas: autonomous vehicle navigation, walking machine foot placement, automated service vehicles, manufacturing and inspection, automotive, military, and agriculture. They conclude with a discussion of the status of laser radar technology and suggest trends seen in the application of laser radar sensors to robotics. Many new applications are expected as the maturity level progresses and system costs are reduced.

  5. Robotic surgery

    MedlinePlus

    Robot-assisted surgery; Robotic-assisted laparoscopic surgery; Laparoscopic surgery with robotic assistance ... Robotic surgery is similar to laparoscopic surgery. It can be performed through smaller cuts than open surgery. ...

  6. Benchmark on outdoor scenes

    NASA Astrophysics Data System (ADS)

    Zhang, Hairong; Wang, Cheng; Chen, Yiping; Jia, Fukai; Li, Jonathan

    2016-03-01

    Depth super-resolution is becoming popular in computer vision, and most test data are based on indoor data sets with ground-truth measurements, such as Middlebury. However, indoor data sets are mainly acquired with structured-light techniques under ideal conditions, which cannot represent the objective world under natural light. Unlike indoor scenes, the uncontrolled outdoor environment is much more complicated and is rich in both visual and depth texture. For that reason, we develop a more challenging and meaningful outdoor benchmark for depth super-resolution using a state-of-the-art active laser scanning system.

  7. Outdoor PV Degradation Comparison

    SciTech Connect

    Jordan, D. C.; Smith, R. M.; Osterwald, C. R.; Gelak, E.; Kurtz, S. R.

    2011-02-01

    As photovoltaic (PV) penetration of the power grid increases, it becomes vital to know how decreased power output may affect cost over time. In order to predict power delivery, the decline or degradation rates must be determined accurately. At the Performance and Energy Rating Testbed (PERT) at the Outdoor Test Facility (OTF) at the National Renewable Energy Laboratory (NREL), more than 40 modules from more than 10 different manufacturers were compared for their long-term outdoor stability. Because it can accommodate a large variety of modules in a limited footprint, the PERT system is ideally suited to compare modules side-by-side under the same conditions.
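
    Degradation rates of the kind compared above are commonly estimated by fitting a linear trend to normalized power over time. The sketch below is a generic illustration, not NREL's analysis; the data format (power already normalized to its initial value) is an assumption.

        # Hypothetical sketch: degradation rate in %/year from a linear fit.
        import numpy as np

        def degradation_rate_pct_per_year(years, normalized_power):
            """years: measurement times in years; normalized_power: Pmax / Pmax_initial."""
            slope, intercept = np.polyfit(np.asarray(years, float),
                                          np.asarray(normalized_power, float), 1)
            return 100.0 * slope / intercept   # percent change per year

        # Example: a module losing about 0.5% of its initial power each year.
        rate = degradation_rate_pct_per_year([0, 1, 2, 3, 4],
                                             [1.000, 0.995, 0.990, 0.985, 0.980])
        # rate is approximately -0.5 %/year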

  8. Outdoor Recreation, Outdoor Education and the Economy of Scotland.

    ERIC Educational Resources Information Center

    Higgins, Peter

    2000-01-01

    Interviews and a literature review found that outdoor recreation contributes significantly to Scotland's tourist income, particularly in rural areas; outdoor education centers are significant employers in certain rural areas; the provision of outdoor education by secondary schools has decreased in the last 20 years; and therapeutic outdoor…

  9. Evolutionary strategy for achieving autonomous navigation

    NASA Astrophysics Data System (ADS)

    Gage, Douglas W.

    1999-01-01

    An approach is presented for the evolutionary development of supervised autonomous navigation capabilities for small 'backpackable' ground robots, in the context of a DARPA-sponsored program to provide robotic support to small units of dismounted warfighters. This development approach relies on the implementation of a baseline visual servoing navigation capability, including tools to support operator oversight and override, which is then enhanced with semantically referenced commands and a mission scripting structure. As current and future machine perception techniques become able to automatically designate visual servoing goal points, this approach should provide a natural evolutionary pathway to higher levels of autonomous operation and reduced requirements for operator intervention.
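
    The capability described above servos the robot toward operator- or perception-designated goal points in the image. A minimal proportional image-based servoing law, with assumed gains, image width, and sign convention (none of which come from the program described), might look like this:

        # Minimal visual-servoing sketch: drive forward while turning so the
        # goal pixel drifts toward the image centre. Conventions are assumed.
        def servo_command(goal_px_x, image_width=640,
                          k_turn=0.005, forward_speed=0.5):
            """Return (linear m/s, angular rad/s) from the goal point's pixel column."""
            error_px = goal_px_x - image_width / 2.0   # positive if goal is right of centre
            return forward_speed, -k_turn * error_px   # turn toward the goal point

        v, w = servo_command(500)   # goal right of centre -> (0.5, -0.9): turn right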

  10. Door breaching robotic manipulator

    NASA Astrophysics Data System (ADS)

    Schoenfeld, Erik; Parrington, Lawrence; von Muehlen, Stephan

    2008-04-01

    As unmanned systems become more commonplace in military, police, and other security forces, they are tasked to perform missions that the original hardware was not designed for. Current military robots are built for rough outdoor conditions and have strong inflexible manipulators designed to handle a wide range of operations. However, these manipulators are not well suited for some essential indoor tasks, including opening doors. This is a complicated kinematic task that places prohibitively difficult control challenges on the robot and the operator. Honeybee and iRobot have designed a modular door-breaching manipulator that mechanically simplifies the demands upon operator and robot. The manipulator connects to the existing robotic arm of the iRobot PackBot EOD. The gripper is optimized for grasping a variety of door knobs, levers, and car-door handles. It works in conjunction with a compliant wrist and magnetic lock-out mechanism that allows the wrist to remain rigid until the gripper has a firm grasp of the handle and then bend with its rotation and the swing of the door. Once the door is unlatched, the operator simply drives the robot through the doorway while the wrist compensates for the complex, multiple degree-of-freedom motion of the door. Once in the doorway the operator releases the handle, the wrist pops back into place, and the robot is ready for the next door. The new manipulator dramatically improves a robot's ability to non-destructively breach doors and perform an inspection of a room's content, a capability that was previously out of reach of unmanned systems.

  11. Adaptive Behavior for Mobile Robots

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terrance

    2009-01-01

    The term "System for Mobility and Access to Rough Terrain" (SMART) denotes a theoretical framework, a control architecture, and an algorithm that implements the framework and architecture, for enabling a land-mobile robot to adapt to changing conditions. SMART is intended to enable the robot to recognize adverse terrain conditions beyond its optimal operational envelope, and, in response, to intelligently reconfigure itself (e.g., adjust suspension heights or baseline distances between suspension points) or adapt its driving techniques (e.g., engage in a crabbing motion as a switchback technique for ascending steep terrain). Conceived for original application aboard Mars rovers and similar autonomous or semi-autonomous mobile robots used in exploration of remote planets, SMART could also be applied to autonomous terrestrial vehicles to be used for search, rescue, and/or exploration on rough terrain.

  12. The Dirt on Outdoor Classrooms.

    ERIC Educational Resources Information Center

    Rich, Steve

    2000-01-01

    Explains the planning procedure for outdoor classrooms and introduces an integrated unit on monarch butterflies called the Monarch Watch program. Makes recommendations to solve financial problems of outdoor classrooms. (YDS)

  13. Outdoor Education. Resource Catalogue.

    ERIC Educational Resources Information Center

    Manitoba Dept. of Education and Training, Winnipeg.

    The material in this catalog has been compiled to serve as a ready reference for teachers to assist them in locating outdoor education materials and obtaining environmental student project assistance available from government departments and private organizations within the province of Manitoba. Part 1 lists agencies that can provide speakers,…

  14. Journey to the Outdoors

    ERIC Educational Resources Information Center

    Boyd, Margaret

    2013-01-01

    A keen personal interest in natural history, involvement in environmental organisations, and experience, first as a secondary biology teacher and later as a field teacher, means that this author has spent many years working outdoors. Any part of the curriculum involving ecological concepts would lead her to open the door and go outside. She…

  15. Outdoor Unified Studies.

    ERIC Educational Resources Information Center

    Liston, Louise

    Escalante (Utah) High School's outdoor unified studies field trip is a learning experience to be remembered. The four-day camping experience begins with pre-trip plans, pretests, and lecture/introductions to the Anasazi culture and to geologic formations to be visited. Horses (and equipment-carrying trucks) take the students into the desert to set…

  16. Your Brain Outdoors

    ERIC Educational Resources Information Center

    MacEachren, Zabe

    2012-01-01

    The way technology influences a person's cognition is seldom recognized, but is of increasing interest among brain researchers. Outdoor educators tend to pay attention to the way different activities offer different perceptions of an environment. When natural spaces can no longer be accessed, they adapt and simulate natural activities in available…

  17. Outdoor Education in Texas.

    ERIC Educational Resources Information Center

    Myers, Ray H.

    In Dallas in 1970, high school outdoor education began as a cocurricular woods and waters boys' club sponsored by a community sportsman. Within one year, it grew into a fully accredited, coeducational, academic course with a curriculum devoted to the study of wildlife in Texas, ecology, conservation, hunting, firearm safety, fishing, boating and…

  18. Outdoor Adventure Training

    ERIC Educational Resources Information Center

    Dickey, Howard L.

    1978-01-01

    Outdoor adventure training resulted in increased sensitivity, self-confidence, carry-over into intellectual activities, and pro-social change in a variety of university, juvenile, and penal institutional settings. Modifications for urban adventure training opportunities have also been developed but not yet evaluated. (MJB)

  19. Outdoors at Grassroots.

    ERIC Educational Resources Information Center

    Linck, David B.

    1981-01-01

    The Grassroots Project provides a one-year, college level program in the conservation occupations of agriculture, forestry, and wildlife management. Each year, 75 young men and women who wish to pursue outdoor careers are selected to study conservation in a small rural community in Vermont. (JN)

  20. Educating Multicultural Groups Outdoors.

    ERIC Educational Resources Information Center

    Bernardy, Marie

    Not only do we need to give students a strong educational foundation, we also must counteract cultural and psychosocial factors that turn minority students away from a curriculum. One of the most powerful aspects of an outdoor education program is that it can provide participants with unique opportunities to work together to solve problems, thus…

  1. Outdoor Ecology School

    ERIC Educational Resources Information Center

    Cole, Anna Gahl

    2004-01-01

    In this article, the author describes how her high school environmental science students led third graders on a dynamic learning adventure as part of their first annual Outdoor Ecology School. At the water-monitoring site in a nearby national forest, the elementary students conducted field research and scavenger hunts, discovered animal habitats,…

  2. [Science in the Outdoors].

    ERIC Educational Resources Information Center

    Sarage, Joe; And Others

    Designed for instruction of emotionally handicapped children and youth, this resource guide presents science activities and concepts relative to rural and urban outdoor education. Included are 25 different articles, varying from broadly generalized to highly specific concept/activity suggestions which include film and book bibliographies and…

  3. [Outdoor Ethics Information Packet.

    ERIC Educational Resources Information Center

    Izaak Walton League of America, Arlington, VA.

    This document contains information about outdoor ethics issues. The information was compiled by the Izaak Walton League of America, established in 1922 as a national nonprofit organization whose members educate the public about emerging natural resource threats and promote citizen involvement in environmental protection efforts. The league…

  4. Vehicles for Outdoor Recreation.

    ERIC Educational Resources Information Center

    Exceptional Parent, 1983

    1983-01-01

    The Wheelchair Motorcycle Association tests various motorized vehicles that might help the physically disabled child get about outdoors. Vehicles found to be practical for older children and adolescents include three-wheeled motorcycles and customized go-carts. An address for obtaining more information on the association is provided. (SW)

  5. Take Math Outdoors.

    ERIC Educational Resources Information Center

    Schall, William E.

    1984-01-01

    Scavenger hunts, collecting bottle caps, observing shadows, and other outdoor activities can be developed into a mathematics unit that motivates students to acquire basic mathematical skills. A variety of natural ways to collect data are offered to help foster learning. (DF)

  6. Children and the Outdoor Environment

    ERIC Educational Resources Information Center

    Niklasson, Laila; Sandberg, Anette

    2010-01-01

    In this article we will discuss the outdoor environment for younger children with the help of two different concepts. The first concept, affordance, is well known in the discussion about outdoor environments. What the affordance in the outdoor environment is perceived as can differ between actors. How the affordance is used can be another source…

  7. Outdoor Play and Play Equipment.

    ERIC Educational Resources Information Center

    Naylor, Heather

    1985-01-01

    Discusses aspects of the play environment and its effect on children's play behavior. Indoor and outdoor play spaces are considered along with factors affecting the use of outdoor environments for play. Children's preferences for different outdoor play environments and for various play structures are explored. Guides for choosing play equipment…

  8. The Fragmentation of Outdoor Leadership.

    ERIC Educational Resources Information Center

    Cockrell, David

    Although outdoor leadership does not appear to be coalescing into a unified profession, there are potential solutions to this fragmentation. Six robust approaches to the professional provision of outdoor leadership are: (1) outfitting and guiding; (2) organized camping; (3) adventure education, such as Outward Bound; (4) the outdoor school; (5)…

  9. Outdoor Education and Science Achievement

    ERIC Educational Resources Information Center

    Rios, José M.; Brewer, Jessica

    2014-01-01

    Elementary students have limited opportunities to learn science in an outdoor setting at school. Some suggest this is partially due to a lack of teacher efficacy teaching in an outdoor setting. Yet the research literature indicates that outdoor learning experiences develop positive environmental attitudes and can positively affect science…

  10. Outdoor Recreation Action. Report 25.

    ERIC Educational Resources Information Center

    Bureau of Outdoor Recreation (Dept. of Interior), Washington, DC.

    This report from the Department of Interior presents information concerning individual state actions and projects related to the broad topic of outdoor recreation. Included are data on the following topics: rights-of-way for recreation; federal financing of outdoor recreation; state and local financing of outdoor recreation; federal acquisition…

  11. Contemporary Perspectives in Outdoor Education.

    ERIC Educational Resources Information Center

    Lewis, Charles A., Jr., Ed.; Carlson, Marcia K., Ed.

    Designed to provide the student of outdoor education with a synthesis of current literature in the field, this collection presents 26 articles which range from administrative to practical applications of outdoor education theory and philosophy. Articles include discussions of: (1) the philosophy of outdoor education; (2) a London school and its…

  12. Vision + Community = Outdoor Learning Stations

    ERIC Educational Resources Information Center

    Eick, Charles; Tatarchuk, Shawna; Anderson, Amy

    2013-01-01

    Outdoor learning areas are becoming more popular as a means for community-based, cross-curricular learning where children study issues of local relevance (Sobel 2004). Outdoor learning areas, any place outside of the school building where children can observe and interact with the natural world around them, include outdoor structures for seating…

  13. Toward cognitive robotics

    NASA Astrophysics Data System (ADS)

    Laird, John E.

    2009-05-01

    Our long-term goal is to develop autonomous robotic systems that have the cognitive abilities of humans, including communication, coordination, adapting to novel situations, and learning through experience. Our approach rests on the recent integration of the Soar cognitive architecture with both virtual and physical robotic systems. Soar has been used to develop a wide variety of knowledge-rich agents for complex virtual environments, including distributed training environments and interactive computer games. For development and testing in robotic virtual environments, Soar interfaces to a variety of robotic simulators and a simple mobile robot. We have recently made significant extensions to Soar that add new memories and new non-symbolic reasoning to Soar's original symbolic processing, which should significantly improve Soar's abilities for control of robots. These extensions include episodic memory, semantic memory, reinforcement learning, and mental imagery. Episodic memory and semantic memory support the learning and recalling of prior events and situations as well as facts about the world. Reinforcement learning provides the system with the ability to tune its procedural knowledge, that is, knowledge about how to do things. Mental imagery supports the use of diagrammatic and visual representations that are critical to support spatial reasoning. We speculate on the future of unmanned systems and the need for cognitive robotics to support dynamic instruction and taskability.
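
    The reinforcement-learning extension mentioned above tunes procedural knowledge from reward. The snippet below shows only the generic temporal-difference update that most reinforcement-learning schemes share; it is not Soar-RL's internal mechanism, and the state/action encoding is hypothetical.

        # Generic Q-learning update, shown only to illustrate reinforcement learning.
        from collections import defaultdict

        q = defaultdict(float)   # q[(state, action)] -> estimated long-term value

        def q_update(state, action, reward, next_state, actions,
                     alpha=0.1, gamma=0.9):
            """Move Q(s, a) toward reward + gamma * max over a' of Q(s', a')."""
            best_next = max(q[(next_state, a)] for a in actions)
            q[(state, action)] += alpha * (reward + gamma * best_next
                                           - q[(state, action)])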

  14. Remote Control and Children's Understanding of Robots

    ERIC Educational Resources Information Center

    Somanader, Mark C.; Saylor, Megan M.; Levin, Daniel T.

    2011-01-01

    Children use goal-directed motion to classify agents as living things from early in infancy. In the current study, we asked whether preschoolers are flexible in their application of this criterion by introducing them to robots that engaged in goal-directed motion. In one case the robot appeared to move fully autonomously, and in the other case it…

  15. JPL Robotics Technology Applicable to Agriculture

    NASA Technical Reports Server (NTRS)

    Udomkesmalee, Suraphol Gabriel; Kyte, L.

    2008-01-01

    This slide presentation describes several technologies developed for robotics that are applicable to agriculture. The technologies discussed are detection of humans to allow safe operation of autonomous vehicles, and vision-guided robotic techniques for shoot selection, separation, and transfer to growth media.

  16. Robot and robot system

    NASA Technical Reports Server (NTRS)

    Behar, Alberto E. (Inventor); Marzwell, Neville I. (Inventor); Wall, Jonathan N. (Inventor); Poole, Michael D. (Inventor)

    2011-01-01

    A robot and robot system that are capable of functioning in a zero-gravity environment are provided. The robot can include a body having a longitudinal axis and having a control unit and a power source. The robot can include a first leg pair including a first leg and a second leg. Each leg of the first leg pair can be pivotally attached to the body and constrained to pivot in a first leg pair plane that is substantially perpendicular to the longitudinal axis of the body.

  17. Boudreaux the Robot (a.k.a. EVA Robotic Assistant)

    NASA Technical Reports Server (NTRS)

    Shillcutt, Kimberly; Burridge, Robert; Graham, Jeffrey

    2002-01-01

    The EVA Robotic Assistant is a prototype for an autonomous rover designed to assist human astronauts. The primary focus of the research is to explore the interaction between humans and robots, particularly in extreme environments, and to develop a software infrastructure that could be applied to any type of assistant robot, whether for planetary exploration or orbital missions. This paper describes the background and current status of the project, the types of scenarios addressed in field demonstrations, the hardware and software that comprise the current prototype, and future research plans.

  18. Integrated mobile robot control

    NASA Technical Reports Server (NTRS)

    Amidi, Omead; Thorpe, Charles

    1991-01-01

    This paper describes the structure, implementation, and operation of a real-time mobile robot controller which integrates capabilities such as: position estimation, path specification and tracking, human interfaces, fast communication, and multiple client support. The benefits of such high-level capabilities in a low-level controller were shown by its implementation for the Navlab autonomous vehicle. In addition, performance results from the positioning and tracking systems are reported and analyzed.
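
    Position estimation of the kind integrated into this controller is often built on wheel odometry. The sketch below is a standard differential-drive dead-reckoning update under assumed units (metres, radians); it is not the Navlab controller's estimator.

        # Differential-drive dead reckoning; illustrative only.
        import math

        def odometry_update(x, y, theta, d_left, d_right, wheel_base):
            """Advance pose (x, y, theta) given left/right wheel travel since the last update."""
            d_center = (d_left + d_right) / 2.0
            d_theta = (d_right - d_left) / wheel_base
            x += d_center * math.cos(theta + d_theta / 2.0)   # midpoint heading
            y += d_center * math.sin(theta + d_theta / 2.0)
            return x, y, theta + d_theta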

  19. Outdoor Lighting Ordinances

    NASA Astrophysics Data System (ADS)

    Davis, S.

    2004-05-01

    A principal means to prevent poor exterior lighting practices is a lighting control ordinance. It is an enforceable legal restriction on specific lighting practices that are deemed unacceptable by the government body having jurisdiction. Outdoor lighting codes have proven to be effective at reducing polluting and trespassing light. A well written exterior lighting code will permit all forms of necessary illumination at reasonable intensities, but will demand shielding and other measures to prevent trespass and light pollution. A good code will also apply to all forms of outdoor lighting, including streets, highways, and exterior signs, as well as the lighting on dwellings, commercial and industrial buildings and building sites. A good code can make exceptions for special uses, provided it complies with an effective standard. The IDA Model Lighting Ordinance is a response to these requests. It is intended as an aid to communities that are seeking to take control of their outdoor lighting, to "take back the night" that is being lost to careless and excessive use of night lighting.

  20. Science, technology and the future of small autonomous drones.

    PubMed

    Floreano, Dario; Wood, Robert J

    2015-05-28

    We are witnessing the advent of a new era of robots - drones - that can autonomously fly in natural and man-made environments. These robots, often associated with defence applications, could have a major impact on civilian tasks, including transportation, communication, agriculture, disaster mitigation and environment preservation. Autonomous flight in confined spaces presents great scientific and technical challenges owing to the energetic cost of staying airborne and to the perceptual intelligence required to negotiate complex environments. We identify scientific and technological advances that are expected to translate, within appropriate regulatory frameworks, into pervasive use of autonomous drones for civilian applications.

  1. Science, technology and the future of small autonomous drones

    NASA Astrophysics Data System (ADS)

    Floreano, Dario; Wood, Robert J.

    2015-05-01

    We are witnessing the advent of a new era of robots -- drones -- that can autonomously fly in natural and man-made environments. These robots, often associated with defence applications, could have a major impact on civilian tasks, including transportation, communication, agriculture, disaster mitigation and environment preservation. Autonomous flight in confined spaces presents great scientific and technical challenges owing to the energetic cost of staying airborne and to the perceptual intelligence required to negotiate complex environments. We identify scientific and technological advances that are expected to translate, within appropriate regulatory frameworks, into pervasive use of autonomous drones for civilian applications.

  2. Science, technology and the future of small autonomous drones.

    PubMed

    Floreano, Dario; Wood, Robert J

    2015-05-28

    We are witnessing the advent of a new era of robots - drones - that can autonomously fly in natural and man-made environments. These robots, often associated with defence applications, could have a major impact on civilian tasks, including transportation, communication, agriculture, disaster mitigation and environment preservation. Autonomous flight in confined spaces presents great scientific and technical challenges owing to the energetic cost of staying airborne and to the perceptual intelligence required to negotiate complex environments. We identify scientific and technological advances that are expected to translate, within appropriate regulatory frameworks, into pervasive use of autonomous drones for civilian applications. PMID:26017445

  3. Robot Lies in Health Care: When Is Deception Morally Permissible?

    PubMed

    Matthias, Andreas

    2015-06-01

    Autonomous robots are increasingly interacting with users who have limited knowledge of robotics and are likely to have an erroneous mental model of the robot's workings, capabilities, and internal structure. The robot's real capabilities may diverge from this mental model to the extent that one might accuse the robot's manufacturer of deceiving the user, especially in cases where the user naturally tends to ascribe exaggerated capabilities to the machine (e.g. conversational systems in elder-care contexts, or toy robots in child care). This poses the question of whether misleading or even actively deceiving the user of an autonomous artifact about the capabilities of the machine is morally bad, and why. By analyzing trust, autonomy, and the erosion of trust in communicative acts as consequences of deceptive robot behavior, we formulate four criteria that must be fulfilled in order for robot deception to be morally permissible, and in some cases even morally indicated.

  4. Robotic control and inspection verification

    NASA Technical Reports Server (NTRS)

    Davis, Virgil Leon

    1991-01-01

    Three areas of possible commercialization involving robots at the Kennedy Space Center (KSC) are discussed: a six degree-of-freedom target tracking system for remote umbilical operations; an intelligent torque sensing end effector for operating hand valves in hazardous locations; and an automatic radiator inspection device, a 13 by 65 foot robotic mechanism involving completely redundant motors, drives, and controls. Aspects concerning the first two innovations can be integrated to enable robots or teleoperators to perform tasks involving orientation and panel actuation operations that can be done with existing technology rather than waiting for telerobots to incorporate artificial intelligence (AI) to perform 'smart' autonomous operations. The third robot involves the application of complete control hardware redundancy to enable performance of work over and near expensive Space Shuttle hardware. The consumer marketplace may wish to explore commercialization of similar component redundancy techniques for applications where a robot would not normally be used because of reliability concerns.

  5. Autonomous Soaring

    NASA Technical Reports Server (NTRS)

    Lin, Victor P.

    2007-01-01

    This viewgraph presentation reviews the autonomous soaring flight of unmanned aerial vehicles (UAV). It reviews energy sources for UAVs, and two examples of UAV's that used alternative energy sources, and thermal currents for soaring. Examples of flight tests, plans, and results are given. Ultimately, the concept of a UAV harvesting energy from the atmosphere has been shown to be feasible with existing technology.

  6. An autonomous satellite architecture integrating deliberative reasoning and behavioural intelligence

    NASA Technical Reports Server (NTRS)

    Lindley, Craig A.

    1993-01-01

    This paper describes a method for the design of autonomous spacecraft, based upon behavioral approaches to intelligent robotics. First, a number of previous spacecraft automation projects are reviewed. A methodology for the design of autonomous spacecraft is then presented, drawing upon both the European Space Agency technological center (ESTEC) automation and robotics methodology and the subsumption architecture for autonomous robots. A layered competency model for autonomous orbital spacecraft is proposed. A simple example of low level competencies and their interaction is presented in order to illustrate the methodology. Finally, the general principles adopted for the control hardware design of the AUSTRALIS-1 spacecraft are described. This system will provide an orbital experimental platform for spacecraft autonomy studies, supporting the exploration of different logical control models, different computational metaphors within the behavioral control framework, and different mappings from the logical control model to its physical implementation.
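
    The subsumption-style, layered control referred to above can be pictured as a priority arbiter in which higher layers suppress lower ones. The behaviours and their outputs below are hypothetical and are not drawn from the AUSTRALIS-1 design.

        # Toy subsumption-style arbiter: the highest-priority behaviour that
        # wants control suppresses everything below it.
        def arbitrate(behaviors, sensors):
            """behaviors: ordered highest to lowest priority; each returns a
            command string or None when it has nothing to contribute."""
            for behavior in behaviors:
                command = behavior(sensors)
                if command is not None:
                    return command      # lower layers are suppressed
            return "idle"

        def avoid_sun_pointing(sensors):    # hypothetical safety layer
            return "slew_away" if sensors.get("sun_in_sensor") else None

        def maintain_attitude(sensors):     # hypothetical housekeeping layer
            return "hold_attitude"

        command = arbitrate([avoid_sun_pointing, maintain_attitude],
                            {"sun_in_sensor": False})   # -> "hold_attitude"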

  7. Sensor selection for outdoor air quality monitoring

    NASA Astrophysics Data System (ADS)

    Dorsey, K. L.; Herr, John R.; Pisano, A. P.

    2014-06-01

    Gas chemical monitoring for next-generation robotics applications such as fire fighting, explosive gas detection, ubiquitous urban monitoring, and mine safety requires high-performance, reliable sensors. In this work, we discuss the performance requirements of fixed-location, mobile vehicle, and personal sensor nodes for outdoor air quality sensing. We characterize and compare the performance of a miniature commercial electrochemical and a metal oxide gas sensor and discuss their suitability for environmental monitoring applications. Metal oxide sensors are highly cross-sensitive to factors that affect chemical adsorption (e.g., air speed, pressure) and require careful enclosure design or compensation methods. In contrast, electrochemical sensors are less susceptible to environmental variations, have very low power consumption, and are well matched for mobile air quality monitoring.

  8. Indoor and Outdoor Allergies.

    PubMed

    Singh, Madhavi; Hays, Amy

    2016-09-01

    In the last 30 to 40 years there has been a significant increase in the incidence of allergy. This increase cannot be explained by genetic factors alone. Increasing air pollution and its interaction with biological allergens, along with changing lifestyles, are contributing factors. Dust mites, molds, and animal allergens contribute to most of the sensitization in the indoor setting. Tree and grass pollens are the leading allergens in the outdoor setting. Worsening air pollution and increasing particulate matter worsen allergy symptoms and associated morbidity. Cross-sensitization of allergens is common. Treatment involves avoidance of allergens, modifying lifestyle, medical treatment, and immunotherapy. PMID:27545734

  9. Design of a walking robot

    NASA Technical Reports Server (NTRS)

    Whittaker, William; Dowling, Kevin

    1994-01-01

    Carnegie Mellon University's Autonomous Planetary Exploration Program (APEX) is currently building the Daedalus robot; a system capable of performing extended autonomous planetary exploration missions. Extended autonomy is an important capability because the continued exploration of the Moon, Mars and other solid bodies within the solar system will probably be carried out by autonomous robotic systems. There are a number of reasons for this - the most important of which are the high cost of placing a man in space, the high risk associated with human exploration and communication delays that make teleoperation infeasible. The Daedalus robot represents an evolutionary approach to robot mechanism design and software system architecture. Daedalus incorporates key features from a number of predecessor systems. Using previously proven technologies, the Apex project endeavors to encompass all of the capabilities necessary for robust planetary exploration. The Ambler, a six-legged walking machine was developed by CMU for demonstration of technologies required for planetary exploration. In its five years of life, the Ambler project brought major breakthroughs in various areas of robotic technology. Significant progress was made in: mechanism and control, by introducing a novel gait pattern (circulating gait) and use of orthogonal legs; perception, by developing sophisticated algorithms for map building; and planning, by developing and implementing the Task Control Architecture to coordinate tasks and control complex system functions. The APEX project is the successor of the Ambler project.

  10. Design of a walking robot

    NASA Astrophysics Data System (ADS)

    Whittaker, William; Dowling, Kevin

    1994-03-01

    Carnegie Mellon University's Autonomous Planetary Exploration Program (APEX) is currently building the Daedalus robot; a system capable of performing extended autonomous planetary exploration missions. Extended autonomy is an important capability because the continued exploration of the Moon, Mars and other solid bodies within the solar system will probably be carried out by autonomous robotic systems. There are a number of reasons for this - the most important of which are the high cost of placing a man in space, the high risk associated with human exploration and communication delays that make teleoperation infeasible. The Daedalus robot represents an evolutionary approach to robot mechanism design and software system architecture. Daedalus incorporates key features from a number of predecessor systems. Using previously proven technologies, the Apex project endeavors to encompass all of the capabilities necessary for robust planetary exploration. The Ambler, a six-legged walking machine was developed by CMU for demonstration of technologies required for planetary exploration. In its five years of life, the Ambler project brought major breakthroughs in various areas of robotic technology. Significant progress was made in: mechanism and control, by introducing a novel gait pattern (circulating gait) and use of orthogonal legs; perception, by developing sophisticated algorithms for map building; and planning, by developing and implementing the Task Control Architecture to coordinate tasks and control complex system functions. The APEX project is the successor of the Ambler project.

  11. Obstacle detection in range-image sequence for outdoor navigation

    NASA Astrophysics Data System (ADS)

    Garduno, M.; Vachon, Bertrand

    1994-08-01

    We deal with the design of a perception system whose goal is to assist a mobile robot teleoperator by providing pertinent information about potential obstacles appearing in the robot's workspace. This range-image-based perception system is to be embedded on a vehicle able to move at speeds up to 40 km/h in an outdoor environment. A method taking speed constraints into account is proposed. In the first step of this method, a segmentation algorithm is applied to the first range image scanned by the motionless robot to determine areas of interest. From these areas, distinctive attributes are computed and recorded as symbolic representations of each obstacle region. In the second and following steps, obstacles are localized in images scanned during robot motion. The difference between the actual object position in the range image and its predicted value is used by an extended Kalman filter to correct the estimated robot configuration. A dynamic image segmentation using emergency and security criteria is carried out, and new obstacles can then be detected from the range image and expressed in the robot coordinate system.
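
    The extended Kalman filter step described above corrects the estimated robot configuration from the difference between the predicted and observed obstacle positions. A generic EKF measurement update in textbook notation (not the paper's code) looks like this:

        # Generic EKF measurement-update step; matrix names are textbook notation.
        import numpy as np

        def ekf_update(x_pred, P_pred, z, h, H, R):
            """x_pred, P_pred: predicted state and covariance; z: observed obstacle
            position; h: measurement function; H: its Jacobian at x_pred; R: noise."""
            innovation = z - h(x_pred)              # observed minus predicted
            S = H @ P_pred @ H.T + R                # innovation covariance
            K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
            x_new = x_pred + K @ innovation         # corrected configuration
            P_new = (np.eye(len(x_pred)) - K @ H) @ P_pred
            return x_new, P_new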

  12. Mobile robot for hazardous environments

    SciTech Connect

    Bains, N.

    1995-12-31

    This paper describes the architecture and potential applications of the autonomous robot for a known environment (ARK). The ARK project has developed an autonomous mobile robot that can move around by itself in a complicated nuclear environment utilizing a number of sensors for navigation. The primary sensor system is computer vision. The ARK has the intelligence to determine its position utilizing "natural landmarks," such as ordinary building features at any point along its path. It is this feature that gives ARK its uniqueness to operate in an industrial type of environment. The prime motivation to develop ARK was the potential application of mobile robots in radioactive areas within nuclear generating stations and for nuclear waste sites. The project budget is $9 million over 4 yr and will be completed in October 1995.

  13. Autonomous vehicles

    SciTech Connect

    Meyrowitz, A.L.; Blidberg, D.R.; Michelson, R.C. |

    1996-08-01

    There are various kinds of autonomous vehicles (AVs) which can operate with varying levels of autonomy. This paper is concerned with underwater, ground, and aerial vehicles operating in a fully autonomous (nonteleoperated) mode. Further, this paper deals with AVs as a special kind of device, rather than full-scale manned vehicles operating unmanned. The distinction is one in which the AV is likely to be designed for autonomous operation rather than being adapted for it as would be the case for manned vehicles. The authors provide a survey of the technological progress that has been made in AVs, the current research issues and approaches that are continuing that progress, and the applications which motivate this work. It should be noted that issues of control are pervasive regardless of the kind of AV being considered, but that there are special considerations in the design and operation of AVs depending on whether the focus is on vehicles underwater, on the ground, or in the air. The authors have separated the discussion into sections treating each of these categories.

  14. 9 CFR 3.27 - Facilities, outdoor.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Pigs and Hamsters Facilities and Operating Standards § 3.27 Facilities, outdoor. (a) Hamsters shall not be housed in outdoor facilities. (b) Guinea pigs shall not be housed in outdoor facilities...

  15. 9 CFR 3.27 - Facilities, outdoor.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Pigs and Hamsters Facilities and Operating Standards § 3.27 Facilities, outdoor. (a) Hamsters shall not be housed in outdoor facilities. (b) Guinea pigs shall not be housed in outdoor facilities...

  16. Can Robots and Humans Get Along?

    SciTech Connect

    Scholtz, Jean

    2007-06-01

    Now that robots have moved into the mainstream—as vacuum cleaners, lawn mowers, autonomous vehicles, tour guides, and even pets—it is important to consider how everyday people will interact with them. A robot is really just a computer, but many researchers are beginning to understand that human-robot interactions are much different than human-computer interactions. So while the metrics used to evaluate the human-computer interaction (usability of the software interface in terms of time, accuracy, and user satisfaction) may also be appropriate for human-robot interactions, we need to determine whether there are additional metrics that should be considered.

  17. Outdoor and Risk Educational Practices.

    ERIC Educational Resources Information Center

    Goldenberg, Marni

    Outdoor adventure education is an experiential method of learning which takes place primarily through sensory involvement with the outdoors. Characteristics of adventure education include uncertain outcomes, risk, inescapable consequences, energetic action, and willingness to participate. Adventure education occurs in a diversity of venues,…

  18. Preparing Effective Outdoor Pursuit Leaders.

    ERIC Educational Resources Information Center

    Priest, Simon

    Information related to selecting, training, and certifying outdoor leaders for high adventure pursuits, is provided by selected experts from five English-speaking nations (Great Britain, Australia, New Zealand, Canada and the United States). Patterns of differences and similarities among these nations regarding outdoor leadership components and…

  19. Financing of Private Outdoor Recreation.

    ERIC Educational Resources Information Center

    Bureau of Outdoor Recreation (Dept. of Interior), Washington, DC.

    A survey of financial institutions was undertaken by the Bureau of Outdoor Recreation to evaluate the demand and availability of private credit for enterprises that provide outdoor recreation. The survey provided basic information for (1) evaluating legislative proposals for loan guarantee programs, (2) nationwide planning, and (3) assessing the…

  20. Outdoor Education: Definition and Philosophy.

    ERIC Educational Resources Information Center

    Ford, Phyllis

    Because outdoor education programs occur in every geographic location, are sponsored by all levels of educational institutions, state and local government agencies, and private entrepreneurs, and have no nationally standardized curriculum or measures of competency or knowledge, outdoor education may best be defined as "education in, about, and for…

  1. Group Cooperation in Outdoor Education

    ERIC Educational Resources Information Center

    Matthews, Bruce E.

    1978-01-01

    Utilizing the Beatles' Yellow Submarine fantasy (e.g., the Blue Meanies), this outdoor education program is designed for sixth graders and special education students. Activities developed at the Cortland Resident Outdoor Education Camp include a series of group stress/challenge activities to be accomplished by everyone in the group, as a group.…

  2. Outdoor Education for Elementary Schools.

    ERIC Educational Resources Information Center

    Chase, Craig C.; Rosenstein, Irwin

    As a planning guide for administrators and public school teachers of elementary school children, this document was developed to assist in planning and implementing outdoor education activities. The document contains objectives, an introduction, contributions of outdoor education to the curriculum, suggested instructional laboratory environments,…

  3. OBIS: Outdoor Biology Instructional Strategies.

    ERIC Educational Resources Information Center

    Donovan, Edward P.; Richmond, Robert F.

    The Outdoor Biology Instructional Strategies (OBIS) project began in 1972 to enable non-school youth groups (aged 10-15) to gain firsthand experiences in outdoor environments. This descriptive paper explains the program including its purpose and historical background. Specific objectives are to: (1) stimulate curiosity about local environments;…

  4. Wilderness Survival and Outdoor Education.

    ERIC Educational Resources Information Center

    Ball, Matt

    Outdoor education is often delivered through games and activities such as nature hikes or observing an ecosystem within a 1-foot circle on the ground. Often, participants look closely at the earth only for that brief moment. Wilderness survival is another way to teach about the outdoors. It offers skills that encourage participants to become more…

  5. Cultural Adaptation in Outdoor Programming

    ERIC Educational Resources Information Center

    Fabrizio, Sheila M.; Neill, James

    2005-01-01

    Outdoor programs often intentionally provide a different culture and the challenge of working out how to adapt. Failure to adapt, however, can cause symptoms of culture shock, including homesickness, negative personal behavior, and interpersonal conflict. This article links cross-cultural and outdoor programming literature and provides case…

  6. Technology Works in the Outdoors

    ERIC Educational Resources Information Center

    Zita, Adam

    2008-01-01

    Technology is all around us and no matter how hard educators promote the value of outdoor and experiential education (OEE) to adults and children alike, they are pulled away by a different reality--one might say, a virtual reality. Even when one is engaged in the outdoors either through a night hike or a stream study, technology is lingering…

  7. Cultural Diversity in Outdoor Education

    ERIC Educational Resources Information Center

    Thompson, Graham; Horvath, Erin

    2007-01-01

    At first glance Sioux Lookout is a typical northern Ontario town, situated within an intricate lake and river system, socially focused on year-round outdoor activities, and enveloped by kilometres and more kilometres of undomesticated Canadian Shield landscape. One might think this would be an ideal spot for outdoor education, just as these…

  8. Robotic surgery.

    PubMed

    Oleynikov, Dmitry

    2008-10-01

    This article discusses the developments that led up to robotic surgical systems as well as what is on the horizon for new robotic technology. Topics include how robotics is enabling new types of procedures, including natural orifice translumenal endoscopic surgery, which operates in areas that cannot be reached by hand under any circumstances, and how these developments will drive the next generation of robots. PMID:18790158

  9. Autonomous exploration and mapping of unknown environments

    NASA Astrophysics Data System (ADS)

    Owens, Jason; Osteen, Phil; Fields, MaryAnne

    2012-06-01

    Autonomous exploration and mapping is a vital capability for future robotic systems expected to function in arbitrary, complex environments. In this paper, we describe an end-to-end robotic solution for remotely mapping buildings. In a typical mapping task, an unmanned system is directed from a distance to enter an unknown building, sense the internal structure, and, barring additional tasks, create a 2-D map of the building while in situ. This map provides a useful and intuitive representation of the environment for the remote operator. We have integrated a robust mapping and exploration system utilizing laser range scanners and RGB-D cameras, and we demonstrate an exploration and metacognition algorithm on a robotic platform. The algorithm allows the robot to safely navigate the building, explore the interior, report significant features to the operator, and generate a consistent map - all while maintaining localization.
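
    The abstract does not spell out the exploration algorithm, so the sketch below shows only a minimal frontier-based exploration step, a common way to drive this kind of building exploration on an occupancy grid: free cells that border unknown space are candidate goals, and the nearest one is chosen. The grid labels and the nearest-frontier policy are illustrative assumptions; the authors' metacognition and feature-reporting components are not modeled.

    # Minimal frontier-based exploration sketch (illustrative, not the paper's algorithm).
    import numpy as np

    UNKNOWN, FREE, OCCUPIED = -1, 0, 1

    def frontier_cells(grid):
        """Return (row, col) of free cells that border at least one unknown cell."""
        rows, cols = grid.shape
        frontiers = []
        for r in range(rows):
            for c in range(cols):
                if grid[r, c] != FREE:
                    continue
                neighborhood = grid[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
                if (neighborhood == UNKNOWN).any():
                    frontiers.append((r, c))
        return frontiers

    def next_goal(grid, robot_rc):
        """Pick the nearest frontier cell as the next exploration goal."""
        frontiers = frontier_cells(grid)
        if not frontiers:
            return None  # no frontiers left: the reachable space is fully explored
        r0, c0 = robot_rc
        return min(frontiers, key=lambda rc: (rc[0] - r0) ** 2 + (rc[1] - c0) ** 2)

    if __name__ == "__main__":
        grid = np.full((8, 8), UNKNOWN)
        grid[3:6, 3:6] = FREE                     # a small explored pocket
        print("next exploration goal:", next_goal(grid, robot_rc=(4, 4)))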

  10. Rover: Autonomous concepts for Mars exploration

    NASA Astrophysics Data System (ADS)

    Baiget, A.; Castets, B.; Chochon, H.; Hayard, M.; Lamarre, H.; Lamothe, A.

    1993-01-01

    The development of a mobile, autonomous vehicle that will be launched towards an unknown planet is considered. The rover's significant constraints are: Ariane 5 compatibility, Earth/Mars transfer capability, 1000 km of autonomous travel in the Mars environment, on-board localization, and maximum science capability. Two different types of subsystem were considered: classical subsystems (mechanical and mechanisms, thermal, telecommunications, power, onboard data processing) and robotics subsystems (perception/navigation, autonomous displacement generation, autonomous localization). The needs of each subsystem were studied in terms of energy and data handling capability, in order to choose an on-board architecture that best uses the available capability by means of specialized parts. A compromise must always be made between subsystems in order to match actual needs to the mission goal, for example between perception/navigation and motion capability. A compromise must also be found between mechanical assembly and calibration needs, which is a real problem.

  11. Knowledge acquisition for autonomous systems

    NASA Technical Reports Server (NTRS)

    Lum, Henry; Heer, Ewald

    1988-01-01

    Knowledge-based capabilities for autonomous aerospace systems, such as the NASA Space Station, must encompass conflict-resolution functions comparable to those of human operators, with all elements of the system working toward system goals in a concurrent, asynchronous-but-coordinated fashion. Knowledge extracted from a design database will support robotic systems by furnishing geometric, structural, and causal descriptions required for repair, disassembly, and assembly. The factual knowledge for these databases will be obtained from a master database through a technical management information system, and it will in many cases have to be augmented by domain-specific heuristic knowledge acquired from domain experts.
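
    As a rough illustration of the geometric, structural, and causal descriptions such a design database might furnish for repair, disassembly, and assembly, the sketch below defines a hypothetical record type. Every field name and example value is an assumption made for illustration, not drawn from the paper.

    # Hypothetical record layout for design-database knowledge; all names and values
    # here are illustrative assumptions.
    from dataclasses import dataclass, field

    @dataclass
    class ComponentKnowledge:
        name: str
        geometry: dict                 # e.g. bounding box and mounting points
        structural_links: list         # components this part attaches to
        causal_effects: dict           # observed symptom -> likely cause
        heuristics: list = field(default_factory=list)  # expert-supplied rules

    truss_joint = ComponentKnowledge(
        name="truss_joint_A7",
        geometry={"bbox_m": (0.2, 0.2, 0.3), "mount_points": 4},
        structural_links=["strut_12", "strut_13"],
        causal_effects={"vibration": "loose fastener", "misalignment": "thermal warp"},
        heuristics=["inspect fasteners before disassembly"],
    )
    print(truss_joint.name, "->", truss_joint.causal_effects)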

  12. Autonomous navigation for structured exterior environments

    SciTech Connect

    Pletta, J B

    1993-12-01

    The Telemanaged Mobile Security Station (TMSS) was developed at Sandia National Laboratories to investigate the role of mobile robotics in exterior perimeter security systems. A major feature of the system is its capability to perform autonomous patrols of the security site's network of roads. Perimeter security sites are well-known, structured environments; the locations of the roads, buildings, and fences are relatively static. A security robot has the advantage of being able to learn its new environment prior to autonomous travel. The TMSS robot combines information from a microwave beacon system and on-board dead reckoning sensors to determine its location within the site. The operator is required to teleoperate the robot in a teach mode over all desired paths before autonomous operations can commence. During this teach phase, TMSS stores points from its position location system at two-meter intervals. This map database is used for planning paths and for reference during path following. Details of the position location and path following systems will be described along with system performance and recommendations for future enhancements.
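
    A minimal sketch of the teach-and-repeat idea described here: waypoints are stored every two meters during a teleoperated teach pass, then replayed by steering toward the next unreached waypoint. The two-meter spacing follows the abstract, but the steering rule and reached-radius are illustrative assumptions; the TMSS beacon/dead-reckoning fusion and its actual path follower are not reproduced.

    # Illustrative teach-and-repeat sketch (spacing from the abstract; the rest assumed).
    import math

    WAYPOINT_SPACING_M = 2.0

    def teach(pose_stream):
        """Record a waypoint whenever the robot has moved >= 2 m from the last one."""
        waypoints = []
        for x, y in pose_stream:
            if not waypoints or math.hypot(x - waypoints[-1][0], y - waypoints[-1][1]) >= WAYPOINT_SPACING_M:
                waypoints.append((x, y))
        return waypoints

    def steering_error(pose, heading, waypoints, reached_radius=0.5):
        """Return the heading error (rad) toward the first unreached waypoint."""
        x, y = pose
        while waypoints and math.hypot(waypoints[0][0] - x, waypoints[0][1] - y) < reached_radius:
            waypoints.pop(0)              # waypoint reached; advance along the taught path
        if not waypoints:
            return None                   # path complete
        gx, gy = waypoints[0]
        desired = math.atan2(gy - y, gx - x)
        return math.atan2(math.sin(desired - heading), math.cos(desired - heading))

    if __name__ == "__main__":
        taught = teach([(0.0, 0.0), (1.0, 0.1), (2.2, 0.1), (4.5, 0.2), (6.8, 0.3)])
        print("taught waypoints:", taught)
        print("steering error (rad):", steering_error((0.0, 0.0), 0.0, list(taught)))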

  13. Autonomous interplanetary constellation design

    NASA Astrophysics Data System (ADS)

    Chow, Cornelius Channing, II

    According to NASA's integrated space technology roadmaps, space-based infrastructures are envisioned as necessary ingredients to a sustained effort in continuing space exploration. Whether it be for extra-terrestrial habitats, roving/cargo vehicles, or space tourism, autonomous space networks will provide a vital communications lifeline for both future robotic and human missions alike. Projecting that the Moon will be a bustling hub of activity within a few decades, a near-term opportunity for in-situ infrastructure development is within reach. This dissertation addresses the anticipated need for in-space infrastructure by investigating a general design methodology for autonomous interplanetary constellations; to illustrate the theory, this manuscript presents results from an application to the Earth-Moon neighborhood. The constellation design methodology is formulated as an optimization problem, involving a trajectory design step followed by a spacecraft placement sequence. Modeling the dynamics as a restricted 3-body problem, the investigated design space consists of families of periodic orbits which play host to the constellations, punctuated by arrangements of spacecraft autonomously guided by a navigation strategy called LiAISON (Linked Autonomous Interplanetary Satellite Orbit Navigation). Instead of more traditional exhaustive search methods, a numerical continuation approach is implemented to map the admissible configuration space. In particular, Keller's pseudo-arclength technique is used to follow folding/bifurcating solution manifolds, which are otherwise inaccessible with other parameter continuation schemes. A succinct characterization of the underlying structure of the local, as well as global, extrema is thus achievable with little a priori intuition of the solution space. Furthermore, the proposed design methodology offers benefits in computation speed plus the ability to handle mildly stochastic systems. An application of the constellation design
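
    Keller's pseudo-arclength technique can be illustrated on a toy problem. The sketch below continues the solution branch of f(x, λ) = x² + λ = 0, which folds at the origin where continuation in λ alone would stall; augmenting the system with an arclength constraint lets the predictor-corrector step through the fold. The toy function, step size, and tolerances are assumptions for illustration only; the dissertation applies the technique to periodic-orbit families of the restricted 3-body problem.

    # Pseudo-arclength continuation on a toy fold, f(x, lam) = x**2 + lam = 0.
    # The branch lam = -x**2 folds at the origin; continuation in lam alone stalls
    # there, but the arclength constraint lets the stepper pass through the fold.
    import numpy as np

    def f(x, lam):
        return x ** 2 + lam

    def fx(x, lam):
        return 2.0 * x        # df/dx

    def flam(x, lam):
        return 1.0            # df/dlam

    def continue_branch(x0, lam0, ds=0.2, steps=20):
        u = np.array([x0, lam0], dtype=float)
        # initial tangent: null direction of the 1x2 Jacobian, oriented toward the fold
        t = np.array([flam(*u), -fx(*u)])
        t /= np.linalg.norm(t)
        branch = [u.copy()]
        for _ in range(steps):
            v = u + ds * t                               # predictor step along the tangent
            for _ in range(25):                          # Newton corrector on the augmented system
                F = np.array([f(*v), t @ (v - u) - ds])  # residual plus arclength constraint
                if np.linalg.norm(F) < 1e-12:
                    break
                J = np.array([[fx(*v), flam(*v)], t])
                v = v - np.linalg.solve(J, F)
            t = (v - u) / np.linalg.norm(v - u)          # secant tangent keeps the orientation
            u = v
            branch.append(u.copy())
        return np.array(branch)

    if __name__ == "__main__":
        path = continue_branch(-1.0, -1.0)               # f(-1, -1) = 0, so we start on the branch
        print(np.round(path[:, 1], 3))                   # lam rises to the fold at 0, then falls again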

  14. A biologically-inspired autonomous robot

    NASA Astrophysics Data System (ADS)

    Beer, Randall D.

    1993-12-01

    A treadmill has been developed to support our cockroach locomotion studies: a small unit with a transparent belt for studying leg joint movements, along with EMGs, as the animal walks or runs at various speeds. This allows us to match the electrical activity in muscles with the kinematics of joint movement. Along with intracellular stimulation studies performed previously, the tools are now in place to make major advances in understanding how the insect's walking movements are actually accomplished.

  15. DEMONSTRATION OF AUTONOMOUS AIR MONITORING THROUGH ROBOTICS

    EPA Science Inventory

    Hazardous and/or tedious functions are often performed by on-site workers during investigation, mitigation and clean-up of hazardous substances. These functions include site surveys, sampling and analysis, excavation, and treatment and preparation of wastes for shipment to chemic...

  16. Autonomous Systems and Robotics: 2000-2004

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This custom bibliography from the NASA Scientific and Technical Information Program lists a sampling of records found in the NASA Aeronautics and Space Database. The scope of this topic includes technologies to monitor, maintain, and where possible, repair complex space systems. This area of focus is one of the enabling technologies as defined by NASA's Report of the President's Commission on Implementation of United States Space Exploration Policy, published in June 2004.

  17. Robots for Astrobiology!

    NASA Technical Reports Server (NTRS)

    Boston, Penelope J.

    2016-01-01

    The search for life and its study is known as astrobiology. Conducting that search on other planets in our Solar System is a major goal of NASA and other space agencies, and a driving passion of the community of scientists and engineers around the world. We practice for that search in many ways, from exploring and studying extreme environments on Earth, to developing robots to go to other planets and help us look for any possible life that may be there or may have been there in the past. The unique challenges of space exploration make collaborations between robots and humans essential. The products of those collaborations will be novel and driven by the features of wholly new environments. For space and planetary environments that are intolerable for humans or where humans present an unacceptable risk to possible biologically sensitive sites, autonomous robots or telepresence offer excellent choices. The search for life signs on Mars fits within this category, especially in advance of human landed missions there, but also as assistants and tools once humans reach the Red Planet. For planetary destinations where we do not envision humans ever going in person, like bitterly cold icy moons, or ocean worlds with thick ice roofs that essentially make them planetary-sized ice caves, we will rely on robots alone to visit those environments for us and enable us to explore and understand any life that we may find there. Current generation robots are not quite ready for some of the tasks that we need them to do, so there are many opportunities for roboticists of the future to advance novel types of mobility, autonomy, and bio-inspired robotic designs to help us accomplish our astrobiological goals. We see an exciting partnership between robotics and astrobiology continually strengthening as we jointly pursue the quest to find extraterrestrial life.

  18. A feedback-trained autonomous control system for heterogeneous search and rescue applications

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2012-06-01

    Due to the environment in which operation occurs, search and rescue (SAR) applications present a challenge to autonomous systems. A control technique for a heterogeneous multi-robot group is discussed. The proposed methodology is not fully autonomous; however, human operators are freed from most control tasks and allowed to focus on perception tasks while robots execute a collaborative search and identification plan. Robotic control combines a centralized dispatch and learning system (which continuously refines heuristics used for planning) with local autonomous task ordering (based on existing task priority and proximity and local conditions). This technique was tested in an environment that is analogous to SAR from a control perspective.
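
    A minimal sketch of the local task-ordering component: each robot ranks its pending tasks by a weighted combination of priority and proximity. The scoring form and the weights are illustrative assumptions; the paper's centralized dispatch and heuristic-learning loop are not modeled.

    # Local task ordering by priority and proximity; weights and scoring are assumed.
    import math

    def order_tasks(robot_xy, tasks, w_priority=1.0, w_distance=0.2):
        """tasks: dicts with 'id', 'xy', and 'priority' (higher = more urgent)."""
        def score(task):
            return w_priority * task["priority"] - w_distance * math.dist(robot_xy, task["xy"])
        return sorted(tasks, key=score, reverse=True)

    if __name__ == "__main__":
        pending = [
            {"id": "search_room_3", "xy": (12.0, 4.0), "priority": 2},
            {"id": "confirm_victim", "xy": (3.0, 1.0), "priority": 5},
            {"id": "scan_corridor", "xy": (1.0, 0.5), "priority": 1},
        ]
        for task in order_tasks((0.0, 0.0), pending):
            print(task["id"])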

  19. Autonomous control

    NASA Technical Reports Server (NTRS)

    Brown, Barbara

    1990-01-01

    KSC has been developing the Knowledge-Based Autonomous Test Engineer (KATE), which is a tool for performing automated monitoring, diagnosis, and control of electromechanical devices. KATE employs artificial intelligence computing techniques to perform these functions. The KATE system consists of a generic shell and a knowledge base. The KATE shell is the portion of the system which performs the monitoring, diagnosis, and control functions. It is generic in the sense that it is application independent. This means that the monitoring activity, for instance, will be performed with the same algorithms regardless of the particular physical device being used. The knowledge base is the portion of the system which contains specific functional and behavioral information about the physical device KATE is working with. Work is nearing completion on a project at KSC to interface a Texas Instruments Explorer running a LISP version of KATE with a Generic Checkout System (GCS) test-bed to control a physical simulation of a shuttle tanking system (humorously called the Red Wagon because of its color and mobility). The Autonomous Control System (ACS) project supplements and extends the KATE/GCS project by adding three other major activities. The activities include: porting KATE from the Texas Instruments Explorer machine to an Intel 80386-based UNIX workstation in the LISP language; rewriting KATE as necessary to run on the same 80386 workstation but in the Ada language; and investigating software and techniques to translate ANSI Standard Common LISP to Mil Standard Ada. Primary goals of this task are as follows: (1) establish the advantages of using expert systems to provide intelligent autonomous software for Space Station Freedom applications; (2) determine the feasibility of using Ada as the run-time environment for model-based expert systems; (3) provide insight into the advantages and disadvantages of using LISP or Ada in the run-time environment for expert systems; and (4
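
    A minimal sketch of the shell/knowledge-base split described above: a generic, application-independent monitor compares measurements against expectations supplied by a device-specific knowledge base. The dictionary layout, sensor names, and tolerances are assumptions for illustration, not KATE's actual representation.

    # Generic monitoring shell plus a device-specific knowledge base (all names,
    # expected values, and diagnoses below are illustrative assumptions).
    def monitor(knowledge_base, measurements):
        """Application-independent check of measurements against model expectations."""
        anomalies = {}
        for sensor, value in measurements.items():
            expected, tolerance = knowledge_base["expected"][sensor]
            if abs(value - expected) > tolerance:
                anomalies[sensor] = knowledge_base["diagnoses"].get(sensor, "unknown fault")
        return anomalies

    # Toy knowledge base for a simulated tanking system.
    tanking_kb = {
        "expected": {"tank_pressure_kpa": (310.0, 15.0), "flow_lpm": (40.0, 5.0)},
        "diagnoses": {"tank_pressure_kpa": "vent valve stuck open", "flow_lpm": "pump degradation"},
    }

    print(monitor(tanking_kb, {"tank_pressure_kpa": 342.0, "flow_lpm": 41.0}))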

  20. Robots, systems, and methods for hazard evaluation and visualization

    DOEpatents

    Nielsen, Curtis W.; Bruemmer, David J.; Walton, Miles C.; Hartley, Robert S.; Gertman, David I.; Kinoshita, Robert A.; Whetten, Jonathan

    2013-01-15

    A robot includes a hazard sensor, a locomotor, and a system controller. The robot senses a hazard intensity at a location of the robot, moves to a new location in response to the hazard intensity, and autonomously repeats the sensing and moving to determine multiple hazard levels at multiple locations. The robot may also include a communicator to communicate the multiple hazard levels to a remote controller. The remote controller includes a communicator for sending user commands to the robot and receiving the hazard levels from the robot. A graphical user interface displays a map of the environment proximate the robot and a scale for indicating hazard intensity. A hazard indicator corresponds to a robot position in the environment map and graphically indicates the hazard intensity at the robot position relative to the scale.
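
    A minimal sketch of the sense-move-repeat idea in this claim: the robot records a hazard reading at its current cell, steps to a neighboring cell (preferring unsampled ones), and accumulates a hazard map that could be reported to the remote controller. The grid movement policy and the synthetic sensor are illustrative assumptions, not the patented method.

    # Sense-move-repeat hazard survey on a grid; the movement policy and sensor
    # model are assumed for illustration.
    import random

    def survey(read_hazard, start, steps=25, grid=(10, 10)):
        x, y = start
        hazard_map = {}                               # (x, y) -> measured intensity
        for _ in range(steps):
            hazard_map[(x, y)] = read_hazard(x, y)    # sense at the current location
            moves = [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                     if 0 <= x + dx < grid[0] and 0 <= y + dy < grid[1]]
            unvisited = [m for m in moves if m not in hazard_map]
            x, y = random.choice(unvisited or moves)  # prefer cells not yet sampled
        return hazard_map

    if __name__ == "__main__":
        random.seed(0)
        fake_sensor = lambda x, y: round(100.0 / (1 + (x - 7) ** 2 + (y - 2) ** 2), 1)
        levels = survey(fake_sensor, start=(0, 0))
        print("hottest sampled cell:", max(levels.items(), key=lambda kv: kv[1]))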